Facebook’s recurring nightmare: Helping muddy up elections

FILE - In this March 15, 2013, file photo, a Facebook employee walks past a sign at Facebook headquarters in Menlo Park, Calif. The San Jose Mercury News reports Saturday, March 17, 2018 that building permits compiled by Buildzoom show Facebook plans to erect the 465,000 square-foot (43,200 square-meter) building at its campus in Menlo Park, Calif. (AP Photo/Jeff Chiu, File)

By RYAN NAKASHIMA and ANICK JESDANUN
AP Technology Writers

MENLO PARK, Calif. (AP) — Facebook has a problem it just can’t kick: People keep exploiting it in ways that could sway elections, and in the worst cases even undermine democracy. News reports that Facebook let the Trump-affiliated data mining firm Cambridge Analytica abscond with data from tens of millions of users mark the third time in roughly a year the company appears to have been outfoxed by crafty outsiders in this way.

Before the Cambridge imbroglio, there were Russian agents running election-related propaganda campaigns through targeted ads and fake political events. And before the Russians took center stage, there were purveyors of fake news who spread false stories to rile up hyperpartisan audiences and profit from the resulting ad revenue. In the previous cases, Facebook initially downplayed the risks posed by these activities. It only seriously
grappled with fake news and Russian influence after sustained criticism from users, experts and politicians.

In the case of Cambridge, Facebook says the main problem involved the transfer of data to a
third party — not its collection in the first place. Each new issue has also raised the same enduring questions about Facebook’s conflicting priorities — to protect its users, but also to ensure that it can exploit their personal details to fuel its hugely lucrative, and precisely targeted, advertising business.

Facebook may say its business model is to connect the world, but it’s really “to collect psychosocial data on users and sell that to advertisers,” said Mike Caulfield, a faculty trainer at Washington State University who directs a multi-university effort focused on digital literacy. Late Friday, Facebook announced it was banning Cambridge, an outfit that helped Donald Trump win the White House, saying the company improperly obtained information from 270,000 people who downloaded a purported research app described as a personality test.

Facebook first learned of this breach of privacy more than two years ago, but hadn’t mentioned it publicly until now. And the company may still be playing down its scope. Christopher Wylie, a former Cambridge employee who served as a key source for detailed investigative reports published Saturday in The New York Times and The Guardian, said the firm was actually able to pull in data from roughly 50 million profiles by extending its tentacles to the unwitting friends of app users. (Facebook has since barred such second-hand data collection by apps.)

Wylie said he regrets the role he played in what he called “a full service propaganda machine.” Cambridge’s goal, he told the Guardian in a video interview, was to use the Facebook data to build detailed profiles that could be used to identify and then target individual voters with personalized political messages calculated to sway their opinions.
“It was a grossly unethical experiment,” Wylie said. “Because you are playing with an entire country. The psychology of an entire country without their consent or awareness.”

Cambridge has denied wrongdoing and calls Wylie a disgruntled former employee. It acknowledged obtaining user data in violation of Facebook policies, but blamed a middleman contractor for the problem. The company said it never used the data and deleted it all once it learned of the infraction — an assertion contradicted by Wylie and now under investigation by Facebook. Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, said Facebook badly needs to embrace the transparency it has essentially forced on its users by sharing their habits, likes and dislikes with advertisers.

Albright has previously noted cases in which Facebook deleted thousands of posts detailing Russian influence on its service and underreported the audience for Russian posts by failing to mention millions of followers on Instagram, which Facebook owns. Facebook is “withholding information to the point of negligence,” he said Saturday. “How many times can you keep doing that before it gets to the point where you’re not going to be able to wrangle your way out?”

The Cambridge imbroglio also revealed what appear to be loopholes in Facebook’s privacy assurances, particularly regarding third-party apps. Facebook appears to have no technical way to enforce privacy promises made by app developers, leaving users little choice but to simply trust them. In fact, the enforcement actions outlined in Facebook’s statement don’t address prevention at all — just ways to respond to violations after they’ve occurred.

On Saturday, Facebook continued to insist that the Cambridge data collection was not a “data breach” because “everyone involved gave their consent” to share their data. The purported research app followed Facebook’s existing privacy rules, no systems were surreptitiously infiltrated and no one stole passwords or sensitive information without permission. (To Facebook, the only real violation was the transfer of information collected for “research” to a third party such as Cambridge.)

Experts say that argument only makes sense if every user fully understands Facebook’s obscure privacy settings, which often default to maximal data sharing. “It’s a disgusting abuse of privacy,” said Larry Ponemon, founder of the privacy research firm Ponemon
Institute. “In general, most of these privacy settings are superficial,” he said. “Companies need to do more to make sure commitments are actually met.”