One day in mid-October at 11:24 a.m., an alert went off in Facebook’s election War Room. Political news in a Utah congressional district wasn’t coming from inside the U.S., a mismatch Facebook had tuned its software algorithms to detect.

A data scientist in the election-monitoring center at Facebook headquarters in Menlo Park, California, inspected the activity manually and discovered, at 11:47 a.m., that the source spreading the content was an ad farm in Bangladesh. Ten minutes later, an operations specialist removed all the suspect activity.

That’s a real example from the 2018 midterm elections, shared by Facebook executives in a recent slide presentation in Paris, meant to demonstrate that the company’s tools are effective when working correctly. The slides, viewed by Bloomberg News, show in detail how Facebook has improved its process for rooting out bad actors using tactics similar to those Russian operatives used in 2016. The message: Facebook will be more prepared to take on misinformation and meddling in this year’s elections in India, the Philippines, Ukraine, Thailand and other countries, as well as the U.S. presidential race in 2020.
Source: Ad Age – News