The EU has now officially taken up arms against Facebook.
Yesterday in Brussels, European Commissioner for Competition Margrethe Vestager gave the keynote address at the kickoff of a new pro-democracy initiative called ALL for Democracy. Although it’s hard to know how this new group will evolve, the agenda for the half-day inaugural session shows an effort to draw on the expertise of union and corporate representatives, political organizers and activists, journalists, government officials, and academics across Europe.
What is striking about this initiative is that it makes no bones about seeing Facebook as a power center whose ability to sway opinion makes it a threat to democracy, a danger compounded by its lack of accountability and its “code as law” reliance on algorithms.
The debate over the role of the media in shaping public opinion goes back to at least the 1920s, when the wildly effective World War I propaganda campaign run by the Creel Committee came to light. It was deeply disturbing for citizens to see how the government had gone to such concerted lengths to sway their views and how effective the program had been. Creel Committee members like Walter Lippmann and Eddie Bernays tried to make the case that in a complex society, average citizens lacked the time and in many cases the skills to make sense of increasingly complex information. They needed experts to pre-digest it for their consumption. They depicted this “manufacture of consent” as necessary and benign.
It was journalists in the main who, as the BBC’s Adam Curtis would put it, molded mass opinion via how they characterized the ongoing outflow of official propaganda (like government and business news releases), while preserving the veneer of independence by also acting in an investigative capacity (again reinforcing their status by enforcing at least some norms). And even though there was a period when a relatively small number of media outlets could sway popular views (for instance, CBS, ABC, NBC, Time and Newsweek in the 1960s), it was a time of high prosperity when many social assumptions were broadly shared, and perhaps most important, the elites were much less distant from the rest of society and had a much greater sense of noblesse oblige than now. Thus in a past period when the press had highly concentrated power, it in many respects comported itself well.
Increasing social fragmentation, aided and abetted by the rise of new media outlets which facilitated targeting narrow audiences, eroded the notion that there are or even could be broadly shared social beliefs. Fox fans inhabit a different reality than New York Times subscribers. But even with an increasingly splintered political landscape, there were enough power centers and channels for reaching citizens and consumers that most participants felt they had a shot in the war for share of mind. In other words, the new complex landscape seemed navigable and fair enough, at least if you were a pretty big dog. But Facebook as a choke point changes the equation in a way that threatens not just the press but also other power players.
Even though Vestager focused on how “social media” curating represented a threat to democracy, the gorilla in the room is clearly Facebook, which many believe has already become a dangerously influential arbiter of content. And her critique goes much further than the US fixation on “fake news”. From her prepared remarks:
The citizens of Athens didn’t go to the assembly just to vote. They went there to debate. To hear everyone’s views, so they could make the right decision for the city. And in a parliament or a public rally, in a newspaper or on TV, democracy still depends on free and open debate for those who choose to engage.
That’s where, if we’re not careful, social media could let us down.
Because despite all the connections that it allows us to make, social media can also lock us up in our own worlds. No one knows what I see on my social media timeline but me – and the social media company itself. And we can’t have an open debate from inside separate worlds.
In the US, nearly two thirds of adults get their news from social media. Here in Europe, more than a fifth say it’s their main source of news. News which is the basis of our democratic debates. And yet most of us aren’t really in control of the information that we see.
A social network like Facebook gets more than 50 million status updates a day. It makes sense to turn to algorithms to help us sift through that information.
The trouble is, it’s very hard to know how an algorithm has made its decision. And the things it chooses to hide might as well never have existed.
So even if an algorithm is just designed to show us things we’ve taken an interest in before, it can still limit our horizons without us even noticing. It can get in the way of seeing new ideas, or looking at old ones in different ways. It can build up our prejudices until they seem to be, not just opinions, but a natural part of how the world works.
That isn’t just about news stories that aren’t true. It’s about only seeing the facts that match the ideas we already have. About losing track of the fact that other views even exist. Because we can’t have a democratic debate if we only hear selected views.
Although she focused on the dangers of confirmation bias, note that she also pointed out how sticking within communities of the like-minded hampers citizens’ ability to work together when they hold diverging views.
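The feedback loop Vestager describes is easy to make concrete. Below is a minimal, purely illustrative sketch (all names and data are hypothetical; this is in no way Facebook’s actual ranking code) of how scoring stories by past engagement alone squeezes unfamiliar topics out of a feed:

```python
# Hypothetical sketch of engagement-based feed ranking.
# Stories on topics the user never engaged with score zero
# and fall out of the feed entirely.

from collections import Counter

def rank_feed(stories, engagement_history, top_n=2):
    """Order stories by the user's past engagement with each topic,
    keeping only the top_n highest-scoring ones."""
    topic_counts = Counter(engagement_history)
    ranked = sorted(stories,
                    key=lambda s: topic_counts[s["topic"]],
                    reverse=True)
    return ranked[:top_n]

stories = [
    {"title": "Local team wins",            "topic": "sports"},
    {"title": "New EU privacy rules",       "topic": "politics"},
    {"title": "Cat rescued from tree",      "topic": "pets"},
    {"title": "Election ad spending report","topic": "politics"},
]

# A user who has only ever clicked on sports and pet stories:
history = ["sports", "sports", "pets", "sports", "pets"]
feed = rank_feed(stories, history)
```

Here both politics stories are in the pool, yet the user never sees either one — precisely the “separate worlds” problem the speech points to, arising without any deliberate censorship.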
Vestager also saw Facebook’s ability to narrowcast as a way to take politics out of the public square entirely:
The information that social media companies collect about their users can transform the way you advertise. It can help you put your message in front of exactly the people who are likely to buy it.
But when you apply that to politics, it could undermine our democracy.
Because if political ads only appear on the timelines of certain voters, then how can we all debate the issues that they raise? How can other parties and the media do their job of challenging those claims? How can we even know what mandate an election has given, if the promises that voters relied on were made in private?
Politico reported in its daily European e-mail summary:
At the launch of both ALL for Democracy, and at the OECD Forum in Paris this week, participants expressed strong concerns that algorithms can distort everything from election results to what updates from friends a person can see.
In a bit of synchronicity, the New York Times, without making reference to the ALL for Democracy event, published Facebook’s Role in European Elections Under Scrutiny, which is actually mainly about the UK. An earlier Times story on this topic was awfully close to a PR placement, describing various tech approaches to fighting fake news. Needless to say, it did not consider that these algos are unaccountable, and that mainstream media stories, which these algos treat as factual, often don’t deserve such reverential treatment. And even worse, the Times only well into the piece acknowledged that Europe didn’t seem to be afflicted much with fake news:
So far, outright fake news stories have been relatively rare. Instead, false reports have more often come from Europeans on social media taking real news out of context, as well as from fake claims spread by state-backed groups like Sputnik, the Russian news organization.
Nevertheless, the tech promoters were sure it was coming.
Yesterday’s Times article alludes to the controversy over the data mining firm Cambridge Analytica’s claims that its Facebook targeting helped the Brexit Leave and Trump campaigns score their unexpected victories. Our Marina Bart debunked that idea. Even the Grey Lady published a skeptical piece.
However, the scare has produced some good results:
Lawrence Dodd lives in one of Britain’s most fiercely fought voting districts, and he has been peppered almost daily with ads from the country’s major political parties on Facebook. About a month ago, he tried to find out why….
Facebook provides little information on how political parties use ads to reach undecided voters on the site. And concern has been growing since the American presidential election about the company’s role in campaigns, including about how politically charged fake news is spread online.
Now, as voters head to the polls across Europe, groups in Britain, Germany and elsewhere are fighting back, creating new ways to track and monitor digital political ads and misinformation on the social network and on other digital services like Twitter and Google.
The political ads shown to Mr. Dodd are being tallied by WhoTargetsMe?, a nonpolitical group that designed a digital tool to monitor Facebook’s role ahead of the British election.
That lack of information has raised hackles about the activities of both Facebook and politicians in a country where campaigns are highly regulated and political financing is tightly capped.
Even the generally business-friendly Brits are increasingly arguing that Facebook needs to be constrained. Notice this academic isn’t saying whether or not to regulate Facebook; his question is how:
“It’s a fundamental conversation to have about how we regulate this,” said Nick Anstead, a media and communications expert at the London School of Economics. “Facebook has a responsibility to tell its users who is buying advertising that is targeting their votes.”
As we pointed out when the campaign to get Facebook to Do Something about fake news was hot, the California giant was clearly not keen about the assignment. It’s not clear whether its concerns were mercenary (censoring content can alienate advertisers and readers; it also imposes costs) or whether it thought there was no way it could take on this role without creating more controversy. Some French media outlets, including Le Monde, said Facebook made it hard for them to send in alerts on potential fake news. And Facebook’s refusal to share enough data to determine if fake news it circulated had any impact on the elections is giving critics more fuel for their demands to force Facebook to become more open:
Academics and others scrutinizing the vote also said the company’s failure to provide data on what Facebook users in France shared among themselves made it virtually impossible to determine if false reports spread on the network affected the overall result.
Given that the Europeans are much more willing to ride herd on businesses, particularly tech giants, than Americans, it makes perfect sense that they are in the forefront of the battle against Facebook’s excessive power. I wish them well.
I think it’s very interesting to compare the GM food issue in the 1990s with Facebook and democracy today. In the 1990s, opponents of GM food argued that the technology was unproven and had unknowable possible side effects, and that therefore we should invoke something previously never deployed in social policy, the ‘Precautionary Principle’, and ban the technology until we could be satisfied it was safe. This was done in Europe without even understanding whether ‘proving it was safe’ (or otherwise harmful) was even possible. GM food remains, as far as I can see, the only new technology to be treated by an advanced society in this way.
Now Facebook seems a much more tangible threat than GM food, in that the mechanisms by which harm can occur have been clearly identified, and the harms are potentially very bad. It is more tractable than a biological system in that the mechanisms by which Facebook acts can be clearly tracked by bespoke software, so it is comparatively easy for those with the right skills to investigate how it works. So should we not invoke the precautionary principle and ban Facebook until such time as we are convinced that it is safe for our society?
Lots of people boycotted GM foods (and still do), but 3 billion are still on Facebook, and boycotts on safety grounds are still exceptional.
Europeans are rightfully correct that FB should be taken to task a bit, given its US HQ and likely very cozy relationship with US intelligence. But I’m surprised that anybody is concerned over its fake news problem, which forever and always has been a problem. As readers of this site well know, the New York Times is the mouthpiece of the establishment through fawning pieces and anonymous sources. So-called established media can be counted on to beat the drums for war, after all. And prior to FB, we’ve had the Drudge Report, Rush Limbaugh, Fox News, and any number of tabloid sources that will spew half-truths over any media. Perhaps the concern is that FB is a funnel that further concentrates these things, but in my experience it tends to lead to family members having open fights over political disagreements. And that is interesting to me, because I see my uncle constantly prodding his sister and nephew over political arguments. I don’t know that they would get a counterpoint view anywhere else. FB seems to me an unpredictable beast in terms of what people might be exposed to because of their connections. I see FB as less of a problem because it’s just another arm of the establishment that continues to try to manufacture consent. Their bigger problem is that they can’t quite control it the way the front page of the NYT can be curated.
I vote to give algorithms the vote. They currently make most of the important decisions in America, literally billions of times a day, and they do it pretty much flawlessly. They buy your stocks, hail your rides and present you with the products you are likely to buy. Without them, the world as we know it would probably come to an end. Since we, as a species, have proven ourselves to be unreliable, inconsistent and, for the most part, wrong about almost everything, it seems perfectly logical to grant algorithms manumission and allow them to vote on our behalf. Our corporate computer overlords will be pleased. For the record, if they’re happy, I’m happy.
Wow! The second time in two weeks an obscure name that I had never heard before popped across the radar screen.
Thought question from my linguist friend, Max Philips re the power of propaganda.
Q. What single man was responsible for more female deaths in the 20th century than any other single individual?
Hint: Not Stalin, not Hitler, not Chairman Mao, not Pol Pot.
A. Eddie Bernays (nephew of Sigmund Freud), who, in 1929, convinced women around the world that it was cool to smoke cigarettes.
Millions of women’s deaths due to lung and other cancers, emphysema, and other deaths triggered by smoking ultimately must be laid at the feet of Bernays.
Fascinating.
This post is on-point. Over the past couple of years I’ve read several tech site postings suggesting FB’s ultimate goal is to become a news media monopoly, replacing NBC, ABC, CBS, NYT, LAT and other older news media/formats. There is still some public accountability with the older media outlets. I’m not sure there’s any public accountability with FB.
Thanks for this post.
Bill Gates shares this dream. Digital First Media appears to be the current iteration of the Gates Family Foundation’s funding of the destruction of newsrooms and press rooms coast to coast. I say “appears to be” because you have to do a lot of digging to find Gates’ fingerprints. If you look closely at who the creditors were in the Dean Singleton bankruptcy, the Gates Foundation went in as a major creditor and an opaque Private Equity ownership group with ties to Redmond WA emerged from the restructuring. There have since been several mergers and re-brandings; all opaque.
No surprise that Zuckerberg’s similar megalomania requires a propaganda arm…
No fan of Fakebook, but it’s ironic we are holding the EU up to become the guardian of the truth. Is this a joke?
That really is the pot calling the kettle black…
It’s American. Same as European Commission interest in Google.
As early as 3 years ago, maybe earlier, I noticed that my posts were not being seen even by my friends on Facebook. It apparently depended on how often they had read (clicked on?) my other posts. And also on FB algorithms that somehow measured each person’s personality and tastes and then decided if they would be interested or offended or irritated by this post or that post.
This made Facebook worthless.
Many of the friends I had acquired on FB were people I’d known in San Francisco some 20 years before. Many of my views had changed since then, as had theirs. Well, I would say that it seemed clear many of my views had changed a lot while theirs not very much at all.
I had learned so much about economics and finance, for one thing (among many other areas), traveled Western Europe, lived for several years in the MidWest and NYC, earned two degrees (my first ones).
I wanted to share with my old friends some of my new sources for learning. But this was not meant to be.
And this was even worse with new friends on FB, who could come from friends-of-friends sources, random suggestions, or an actual meeting and exchange of FB names. With those, it seemed that FB was only going to show me endless news and pics about my friends’ pets and kids.
I felt like I was in a “have a nice day” nightmare, so eloquently described by Barbara Ehrenreich in 2010’s Bright-Sided.
Trying to get through this Facebook censorship was pointless. Once, I made a post in three parts on the idiocy of snark, using the past, the present, and the future. I thought it was kind of funny, but to the point.
Friends responded with confusion. Why? Because some got one of the posts, some got two, some got a different two, so it didn’t make any sense at all. Probably most got none of the posts, because it wasn’t a picture of my cat.
I left Facebook. I started posting on Twitter, but I can tell that they, too, do shadow-banning and otherwise make it hard or impossible for even my followers to see most of the things I post.
I’m glad that you wrote this article and strongly agree with what Vestager is saying.
These social media companies should all be turned into public utilities, with hard-line free speech policies wired into them. This would not sit well with some (maybe all) countries’ governments, and definitely not with the social media companies themselves.
Maybe the real solution will be when the internet becomes truly de-centralized and encrypted, so that no one central authority, be it a Facebook or a China (or a UK; especially, a UK) can control the flow.
Well-written article, thanks.
Earlier this year, my eighth grade Social Studies teacher died at the age of 100. Although I was privileged to have several excellent primary school teachers, this man’s teachings have never left me—even 56 years later. He spent one half of the school year teaching us how to recognize propaganda. Imagine teaching 12- and 13-year olds in a lily-white, private suburban school how to recognize propaganda! It’s as well that my conservative parents never asked me what I was learning at school.
In memory of this man, the school magazine re-published his Methods of Propaganda and Methods Used by Newspapers. This article on Facebook overreach seems like an appropriate time to remind news aficionados of his principles:
Methods of Propaganda
1. Bandwagon (everybody’s doing it)
2. Cardstacking (emphasizing one side of an issue and suppressing another)
3. Glittering Generalities
4. Name Calling
5. Plain Folks (presenting oneself as just an ordinary person)
6. Scare
7. Slogan
8. Testimonial
9. Transfer (of praise or blame)
Methods Used by Newspapers
1. Slanting
2. Quoting
3. Editorial
4. Cartoon
5. Burying
6. Omission
7. Photograph
Critical thinking can be taught. With today’s instant information, being able to sift out bias seems more important than ever.
Great list! I intend to incorporate it into my media literacy seminars.
It’s fun to think, sometimes, about new ideas as applied to old technologies: what if the government had decided somehow, back in the days of rotary phones, that it needed to prevent us from spreading false information to each other, whether by accident or not?
If we had to read and sign 10-page legal documents before entering a retail store? (TOS/EULAs)
If you needed separate keys (passwords) not just to open your mailbox but to read a magazine or to buy anything? Not just one key to open your front door and another to start your car, but a separate key for everything?
Crazy world.
As a non-FB user I’ve never felt worried about it. I am increasingly concerned about Google, and the reason is not that it is a US company, as some suggest is the case for Europeans. The reason is that Google is increasingly an ad tool rather than a search engine. Also, the longer you use it, the more its algorithms bias your internet searches. I also dislike dependency and lack of competition. Natural monopolies should be public. There are also problems between public services and internet navigation programs. In Spain, official communication between companies or private individuals and government agencies can no longer use Google Chrome or Internet Explorer (only Firefox works), seemingly because these companies introduce bugs preventing the use of certain digital certificates. Why do they do it? I don’t know.