For anyone looking to get up to speed on what the Cambridge Analytica-Facebook story is all about, you couldn’t do better than this piece by Zeynep Tufekci, who has written other incisive pieces about the power of big tech in our society, including this excellent article about YouTube. She provides a concise rundown of what happened and why it matters to the public and to our country, describing a business model in which Facebook’s customers are not its social users, but the “advertisers, political actors and others” to which Facebook sells our attention and personal information. Making a persuasive case that users never meaningfully consented to the release of their information, Tufekci describes Cambridge Analytica’s exploitation of user data as something of an inevitable by-product of Facebook’s business model.
I don’t think it’s an exaggeration to say that any American who goes online without grasping the basics of how Facebook and other big tech companies make their money off the exploitation of user data is a babe in the woods, unaware of the privacy being given up and the fundamentally predatory attitude of these corporations. And with the many reports of how online social networks were used by Russians and others to influence the 2016 election, the consequences of our mass blindness and exploitation look darker by the day.
This is a complicated and troubling situation that invites the classic “no easy solutions” response. On the tech fix side, Tufekci argues that the Facebook business model is inherently flawed and will be abused, and that its lack of accountability needs to be reined in. Amen to that — and the start to such accountability is getting the word out that exploitation of its own users lies at the heart of its business. But beyond this, we’re at an inflection point in how we think about conducting politics in this country. Ads and other techniques based on micro-targeting of voters put provoking an emotional response over dialogue and understanding; they seem most effective at raising fear, not building hope. Their tendency is to manipulate and push people to extremes, narrowing information flows rather than broadening them or facilitating critical thought.
For these reasons, they are the latest, most technologically sophisticated expressions of a propaganda model of politics, which has long haunted our democracy, particularly in the age of mass media. Voters are viewed as targets to be activated, not citizens to be persuaded, or, crucially, listened to. This model has existed for so long that we all see it as part of the normal state of things. But with cutting-edge technologies putting manipulation at the center of their technique, its coexistence with democratic discourse is revealed to be more inappropriate than we might have thought. And its dangers are even more pronounced given the political-economic state of our union, which I would argue is badly in need of fresh, egalitarian, democratizing ideas that are less about reinforcing people’s existing views, and more about asking people to think anew and creatively about our common challenges.