Creative Good is a customer experience strategy consultancy founded in 1997. They help clients deliver better experiences to their customers and users, thereby building more successful products and services.
You’ve probably heard that Amazon has abandoned its plans to build its HQ2 in New York City. (If not: NYT.) This was prompted no doubt by pushback from the City Council – here’s an excerpt on my radio show – and various community groups.
New Yorkers said “no” to Amazon.
It’s good news. Remember that the deal was denounced both on the right (Wall Street Journal, National Review) and the left (Rep. Ocasio-Cortez): there’s simply no reason for New York to engage in, as the WSJ put it, “crony capitalism at its worst.” Our governor offering to change his name to “Amazon Cuomo” only added to the embarrassment.
Among many other problems, the deal fell through because Amazon didn’t include the “customer,” that is, the citizens who would be affected. The company has arrived at the stage, perhaps inevitable for a global monopoly, of blind arrogance toward the people it affects. Here is venture capitalist Albert Wenger’s take: this was “a failure to ‘read the room’ of epic proportions.” (For more, here’s another thread – which I strongly agree with – and a more Amazon-friendly take, with which I agree rather less.)
The root of the problem is concentration of power – especially in the case of Amazon, a company originally known for its “obsession” with the customer experience. Once Amazon achieved its monopoly, the commitment to customers fell away. (Does anyone really think the HQ2 deal showed concern – let alone “obsession” – for the people affected?)
Concentration of power. It’s not just a problem at Amazon. We’re also seeing it in Internet access in the US. I recently interviewed Susan Crawford, Harvard Law professor and author of the new book Fiber, which covers this very problem. Telecom and cable companies have cemented their monopoly and duopoly positions across the entire US. Guess what happens when customers say they want faster, more reliable, reasonably priced fiber access. Nothing. (Actually, in some cases, a lot happens – the companies actively discourage citizens from speaking up to change the system.)
Let’s take it further. What happens when a Big Tech company, which never was committed to users in the first place, enjoys ever-higher concentration of power? You get Facebook boasting about fraud (that’s my 2-minute segment on the topic), telling third-party developers how “friendly fraud” works to deceive children into buying virtual goods on their parents’ credit cards. We’re talking kids who are 5 years old. And an official Facebook memo, with “fraud” in the title, on how to trick them.
Yes, a public company has admitted to fraud, yet no action has been taken against it. Hello again, concentration of power.
All of this reveals two simple lessons:
• If you’re a company planning a launch – an app, a site, a redesign, an HQ2 – include the customer in your thinking. My book Customers Included is a good roadmap.
• The companies with the most concentration of power will not follow my advice. They won’t change on their own. That’s what antitrust is for.
But for everyone else, include the customer. It doesn’t take a lot of time or money, compared to the endless buildouts of extractive, exploitative software we see Big Tech companies engaging in. You can say “no” to Big Tech and delight your customers. Just treat people right. You might even build a company that citizens are happy to welcome to their city.
A bit of good news from my neighborhood, Manhattan’s Upper West Side, comes from a small independent bookstore. Facing rising rents and a decline in book-buying, the 35-year-old Westsider Books announced a few weeks ago that it would close shop. But then customers launched a crowdfunding campaign and raised over $50,000, the amount the owner had said the store needed to survive. The store won’t close after all. (Here’s a post and a video with details.)
If asked what saved Westsider Books, some people might point to the Internet. How else, without online crowdfunding platforms, would customers have so quickly been able to fundraise to keep the store open? But then the Internet – or rather the monopoly platforms that have taken it over – helped cause the problem in the first place. If you want to know what New Yorkers do these days instead of reading books as they used to, get on any subway train and see if anyone looks up from their handheld slot machines.
The Web was different in the early days. I’ve said this before, I know, but it bears repeating: we started with an open platform, with neutral protocols, an actual platform that people could create on. Not the locked-down prison, or slaughterhouse, that we have today.
Can we recover some of that founding spirit, that “creative good” that animated so many founders (myself included) to launch something new online? I’ve been searching for good news on my radio show and occasionally see a glimmer.
Douglas Rushkoff offers hope: in his new book Team Human, and on his podcast of the same name, he advocates for genuine human connection as a kind of resistance to today’s deadening, tech-soaked environment.
We need more of these glimmers of hope, as the monopolies are locking down every corner of the Net, the economy, and society. If enough of us embrace Team Human, or the “creative good,” or some other version of that founding idea, we might resuscitate the online world. It worked for Westsider Books.
The obit section this week noted that two people I’ve long admired – John Bogle, founder of Vanguard, and poet Mary Oliver – passed away within a day of each other. The news, though sad, showed symmetry. In very different fields, Bogle and Oliver each advocated the value of simplicity: Bogle offered financial customers a straightforward, non-exploitative investment service; and Oliver wrote hundreds of invitations for us to observe, appreciate, and live in the natural world.
We need more Bogles, and more Olivers, in today’s frothy moment. In the landscape I know best, where online tech has had an effect, things have veered recently toward the deceptive, the exploitative, and the contemptuous of customers and citizens. Despite that, I feel optimistic that 2019 will begin to show a turnaround.
First, let’s state a fact, unfashionable as it may be to say right now. You build a better business by treating your customers well, not by exploiting or deceiving them.
The twist, as explained by Peter Drucker – whose attitude was not unlike Bogle’s – is that this approach works in the long run. There’s no short-term pop, no instant infusion of wealth or status, from treating people well. It takes commitment and grit to stick to a strategy of empathizing with customers, and delivering benefits to them (and I mean actual benefits, not addictions, manipulations, or traps).
Much of Silicon Valley takes the opposite approach by exploiting users as a resource, rather than serving customers with a long-term commitment. This has the unfortunate effect of wiping out human-oriented, community-friendly organizations and institutions. (Look at Amazon killing Main Street retail; Facebook killing democracy and community; Google killing entrepreneurship and privacy, among other things; etc.)
This Big Tech approach will fail, 100% guaranteed. I’m not sure when – it could crater this year (the signs are there), or Big Tech might manage to monopolize the markets, capture the regulators, and anesthetize citizens sufficiently to last several more years.
But when the Silicon Valley mindset finally collapses – NOT IF, BUT WHEN – the teams outside Big Tech that have patiently treated customers well, all these years, will thrive all the more.
For those teams in 2019 that are fighting the good fight – trying to help, not exploit, their users – your task is to stay alive. Keep listening to customers, keep delivering good experiences and honest messaging, and stay hopeful that this winter will pass at some point. Indeed, there are green shoots already in the frozen ground (worthy of a Mary Oliver poem, I’d say).
This is why I predict that customer-inclusive strategy is coming back, and it’s happening in 2019. Many teams are tired of, or opposed to, the exploitative approach and are looking for alternatives.
The announcement of Amazon’s HQ2 plans for New York City has raised important questions about the origins of this deal. How could our leaders pledge $1.5 billion to Amazon at a time when libraries and schools lack funding, the subway is in dire need of improvement, and one out of seven New Yorkers suffers from food insecurity?
Skepticism about the deal has come from all corners. After Representative-elect Alexandria Ocasio-Cortez posted her “extreme concern” on Twitter, the Wall Street Journal and National Review both published pieces agreeing with her. It’s remarkable that HQ2 has brought about a moment of bipartisanship during this polarized season.
Still, I don’t think these questions address the full scope of what’s happening. New York isn’t being taken over by Amazon; it’s being taken over by Big Tech. Consider:
– Google has just announced its intent to secure another 1.3 million square feet of Manhattan office space, which, when added to the company’s real estate in Chelsea, gives the company more square footage in New York City than in its northern California headquarters;
– Alphabet subsidiary Sidewalk Labs has installed over 1,600 LinkNYC kiosks throughout New York City, each outfitted with three cameras and dozens of sensors, collecting unknown amounts of data on New Yorkers (listen to my thoughts on this);
– Facebook, according to a New York Times story on November 14, called on Senator Schumer to help deflect the Senate’s investigation of Russian meddling on the social network (Schumer was happy to help – also, his daughter works for Facebook);
– Facebook was also the target of a walkout last week by Brooklyn high school students, who protested the Facebook-designed curriculum that their school forces them to sit through.
These are just a few of the many indicators that something is changing in New York. When viewed alongside the corruption of the HQ2 deal, they clearly point to the direction that Big Tech wants to take our city.
I can say with confidence – having worked over two decades in the tech industry – that these companies are optimized for one thing: launching platforms to crush all opposition. For example, Google’s Gmail has become the dominant email service in the world, despite nagging privacy concerns. Amazon is steamrolling the entire retail industry. And the worldwide effects of Facebook – propaganda, rising authoritarianism, even genocide – are all made possible by its dominance as the leading social network.
This domination, across the Big Tech platforms, is largely driven by continuous extraction of our data, usually without our knowledge or consent, and quite often without any legal basis (and accompanied by hypocritical slogans like “connecting the world” or “don’t be evil”). This data allows the companies to manipulate markets, elections, and social groups to behave in ways that further benefit Big Tech. Hence Google’s interest, for example, in those LinkNYC kiosks drawing down as much data as possible from New Yorkers. The citizens of New York serve as valuable data sources for Google’s algorithms.
What all this points to is Big Tech’s interest in turning New York itself into one of its platforms. Imagine: a city where all the students are forced to study a Facebook-designed curriculum … where all transactions go through Amazon … where all the sidewalks are fully surveilled by Alphabet sensors … and above all, where our mayor, governor, and (at least one) senator do the bidding of their west-coast benefactors! Once Big Tech achieves its dominance of New York City, we’ll be nothing more than their satellite state, useful for our physical footprint and some residual financial talent remaining from our past days of Wall Street pre-eminence. Our infrastructure may crumble, and our schools may continue to fail, but at least we’ll officially be in the care of Big Tech as one of its favored platforms.
New Yorkers still have an opportunity to resist this outcome. First, we must gain better awareness, paying attention to Big Tech’s increasingly strong claim on our commerce, our streets, and our real estate. Most people I talk to, for instance, have no idea who’s behind the LinkNYC kiosks. And many New Yorkers can’t say who owns popular services like Instagram or Waze. (Facebook and Alphabet, respectively.) The more we’re informed about Big Tech’s influences on our day-to-day life, the less likely we’ll be caught off-guard by their next attempt to take over our lives – or our city.
Second, and most importantly, we must actively support the people who are working to expose and counteract Big Tech’s plans for New York. Here I refer to the journalists, researchers, and politicians who – like Alexandria Ocasio-Cortez – are willing to stand up and tell the truth about the corrupt HQ2 deal. With this sort of pressure, our elected representatives will be forced to demonstrate where their loyalty lies: with the tech billionaires, or with the citizens of New York City.
All the news about Big Tech’s missteps can make you wonder whether online tech is a lost cause, having turned into a morass of surveillance machines, soulless slot-machine apps, and dark patterns, all fueled by “the weaponization of UX,” as Evan Selinger put it.
But there’s hope. Small teams on the edges, most of them nonprofit, are beginning to create better tech that actually tries to benefit users, rather than exploit them. Some are more established, while some are just starting out, but they represent a new beginning, which we badly need.
A number of people recently have suggested that Facebook – by profiting from the social harm it causes, yet denying its culpability – is acting like the tobacco industry in the 1990s. While there undeniably are similarities between the two, I think there’s a better analogy.
Facebook isn’t cigarettes, it’s asbestos.
Asbestos, of course, is a dangerous carcinogen that was used as insulation in buildings for many years, before its harmful health effects became widely known.
Consider how asbestos sounds an awful lot like Facebook:
• Initially promising – and it was good at its stated purpose – it turned out to have toxic effects that far outweighed the advantages.
• Immediate harm to an individual is hard to detect, but the long-term effects are clear – and society-wide, they’re devastating.
• People in poorer countries tend to be stuck with it, while wealthy industrialized nations are waking up to the danger, and ripping it out.
• And you do have to rip it out. You can’t gradually peck away at the thing, since it’s installed down at an infrastructural level. It’s a pain to get rid of, and expensive, but it has to be done.
• Once a society is educated to its danger, it is socially unacceptable to let your loved ones – especially your kids – live with it.
Perhaps this will be a helpful metaphor as we discuss what to do about Facebook.
This month marks 20 years of writing Creative Good (after starting this blog in ’97). I’ve seen a lot along the way: boom & bust cycles, buzzwords, trends and fads, so many things that have risen up and faded away. But along the way, the internet has permanently changed the world. For better and, more recently, worse.
Throughout, I’ve tried to maintain the “creative good” ideal, the belief in products that do good, in platforms that actually treat users with basic human respect, in services that actually benefit people in the long run. I’ve tried to show how it actually does work, if companies create good experiences, rather than exploiting people for short-term gain.
Recently it’s been a tough idea to hold onto. The rise of Big Tech and its business model of surveillance, manipulation, and monopoly – undergirded by techno-chauvinist arrogance – have brought about some disappointing outcomes, to put it lightly. And until recently, it hasn’t shown signs of fading away.
I’ve been wondering: where does it go from here? Is there still a place for people who think systemically about creating good experiences? I don’t mean designing an interface to more easily, say, share a toxic post on social media. I mean designing products that people actually want, and can gain value from – see Customers Included – rather than tricking them into psychological addiction loops.
But things do seem to be changing, a bit. For example, this appeared on Twitter a few days ago:
Ahh, late-stage Blockbuster. I remember it well from the late 90s. In those days, everyone rented movies from Blockbuster not because they liked it, but because it was the only option (beyond the odd indie rental place, and most were odd, in a good way). The company was fueled by punitive late fees: you had to bring back your movie in two days, or else. The experience was disappointing, if not outright hostile, to customers.
Then Netflix launched. Those little red envelopes started arriving in mailboxes everywhere, with no late fees, and in that moment, Blockbuster ceased to exist as a viable proposition. The end of Blockbuster's dominance can be traced back to a single factor: the improvement in the experience, as offered by a competitor.
Consider the parallels between Facebook today and Blockbuster in the late 1990s. Just about everyone uses Facebook today – not because they like it, but because it's the only way they know how to keep up with friends and family.
And yet, most people can sense that something about Facebook is a bit "off." Every week brings new revelations that the company has deceived its users – or enabled others to do so – for financial gain. Meantime, Zuckerberg and Sandberg have repeatedly been called to Congress to apologize.
• Facebook's harmful effects around the globe are getting higher-profile media coverage. For example, read this BuzzFeed News article about Facebook's explicit and strong support of the Philippine dictator Rodrigo Duterte. Another recent piece covers the role of WhatsApp – a Facebook property – in a series of lynchings in an Indian town. And then there's Facebook news from Myanmar, Germany, and elsewhere.
• Zuckerberg himself is getting more scrutiny, as in this new profile in the New Yorker, which reveals Zuck's bizarre fascination with Caesar Augustus (he named his second daughter "August" after the Roman dictator). Meantime, other celebrities in the news are swearing off any connection to Zuck's apps – like Michael Stipe announcing he's done with Instagram – which is owned, of course, by Facebook.
• Calls for antitrust action, to break up Facebook into its component parts, are gaining momentum. Here's author and Columbia professor Tim Wu writing in The Verge: It's Time To Break Up Facebook.
I’m hopeful that with time, and what I suppose will be some strong antitrust action, Facebook will eventually fade out, or at least get regulated into some shape that is not so egregiously toxic. The larger question is, does good experience still win the day, online? Or are we stuck with Big Tech pushing for ever-more invasive and ethically compromised products, while citizens hope for antitrust regulators to save the day?
As I begin my third decade writing this newsletter, I certainly hope it’s the former. I’ll continue to advocate for products and teams that create good, that include customers, and that set their sights on long-term benefits. And if you’re on board, thanks for sticking with me. Onward!
Some people throw up their hands. “It’s too difficult. Someone else should do something. What difference can I make, anyway?” That’s the attitude of technofatalism, and we have to fight it.
As tech historian Marie Hicks writes, “Technofatalism isn’t logical; it’s a highly destructive failure of imagination and unwillingness to resist the status quo. It’s the sullen twin of technophilia. Both assume tech can and will determine society. It’s not that simple. We have a fighting chance. But we have to fight.”
Below, my recent material on doing something to fight Big Tech and get back to creating good with technology. Enjoy. And drop a line if I can help your organization. –mark
• Break Up Google, says the Boston Globe: “Never in the history of the world has a single company had so much control over what people know and think . . . [and] Google’s power is bound to grow still more. Last year, it spent more on federal lobbying than any other company. By tweaking the way information appears on search pages, Google can already promote its own websites and banish competitors to digital oblivion.”
• Venture capitalist Fred Wilson has unplugged his Amazon Echo. “If anyone in our house is uncomfortable with devices listening to our conversations, I don’t want to subject them to that. . . This raises a broader question about these voice devices which is whether the value they offer outweighs the creepiness they create in the home. For us, the answer has been a resounding no.”
• Amazon’s Facial Recognition Fans Big Brother Fears (WSJ, May 22): “The retail giant has been selling the technology as a means to help authorities identify suspects in surveillance footage . . . The ACLU and other civil-rights organizations sent a letter to Amazon Chief Executive Jeff Bezos expressing ‘profound concerns’ about the potential misuse of the technology, which Amazon calls Rekognition.”
• Alexa listened to a couple’s conversation and sent it to the husband’s employee without permission (BoingBoing, May 24): Complaint from a Portland woman that Alexa “listened in on a conversation and sent it to a random contact of theirs – one of her husband’s employees. . . When KIRO-7 questioned Amazon, they responded with this: ‘Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.'”
• How to make sure your Amazon Echo doesn’t send secret recordings (CNN, May 25) – almost unbelievably, CNN actually suggests that you “live like everyone’s watching”: “You can unplug them all until you are confident in the tech industries [sic] privacy protections, or you can go about your daily life avoiding doing or saying anything embarrassing (or illegal).” Thanks, CNN, we’ll try to be more compliant from now on!
Is it just me, or are things with tech these days going a little sideways?
Just a month ago people were shocked, shocked that Facebook would allow third-party vendors to make off with their private data. “Trust us,” Facebook responded, “from now on we’ll keep all your data secure [that our surveillance gained and will continue to collect].” Facebook’s share price has since recovered and people are using Facebook more than ever.
I wonder if most people are paying attention. Or want to.
I spoke with someone last week who wasn’t aware, until I told her, that Instagram is owned by Facebook. “You might like my radio show,” I said. “No,” my acquaintance said, “I might be too scared of what I’d learn.”
Who can blame people for wanting to ignore what’s happening? The reality seems too big to grasp, and the necessary change – breaking up Facebook and Google, for starters – seems out of reach.
But! There’s good news. Consider:
• Both left-leaning organizations and some right-leaning politicians are looking at breaking up Facebook, a rare glimmer of bipartisan agreement.
• I’ve featured several guests on my Techtonic radio show with good news about change in tech. Listen to the shows on WFMU or via podcast. David Sax (“The Revenge of Analog” author), Anya Kamenetz (“The Art of Screen Time” author and NPR reporter), Corey Pein (“Live Work Work Work Die” author), and Len Sherman (Columbia Business School professor) all suggest ways we can live better in – and make change to – a world dominated by Big Tech.
• Finally, with a nod to Bruce Schneier (“The primary business model of the internet is built on mass surveillance” – source), I present you with a song I wrote about surveillance, set to “Camptown Races.”
• Google’s New Voice Bot Sounds, Um, Maybe Too Real (NPR, May 9): “On the first day of Google’s annual conference for developers, the company showed off a robot with a voice so convincingly human that it was able to call a salon and book a haircut – never revealing that it wasn’t a real person making the call. CEO Sundar Pichai demonstrated the new AI technology on Tuesday at the Google I/O conference.”
• Google insiders claim that “the final version of Duplex, the stunning AI bot that sounded so real it fooled humans, may be purposefully made less scary.” (Business Insider)
• What Google is doing with your data, by John Rolfe in the Queensland Times (May 14): “Experts from Oracle claim Google is draining roughly one gigabyte of mobile data monthly from Android phone users’ accounts as it snoops in the background, collecting information to help advertisers. . . . The information fed back to Google includes barometric pressure readings so it can work out, for example, which level of a shopping mall you are on. By combining this with your coordinates Google knows which shops you have visited. . . . Only turning off a phone prevents monitoring.”
Don’t forget Facebook…
• Jaron Lanier speaking about Facebook and Google as “behavior modification empires” that “rely on behavior modification and spying” for their business models. It’s cheaper to pay them to ruin things than to make positive change. Perhaps, Lanier says, it’s time for subscription fees. (Hey, it worked for HBO and Netflix!) “I don’t believe our species can survive unless we fix this,” Lanier concludes. “In the meantime, if the companies won’t change, delete your accounts.”
• Interview with Jaron Lanier in New York Magazine: “A lot of the rhetoric of Silicon Valley that has the utopian ring about creating meaningful communities where everybody’s creative and people collaborate and all this stuff — I don’t wanna make too much of my own contribution, but I was kind of the first author of some of that rhetoric a long time ago. So it kind of stings for me to see it misused. Like, I used to talk about how virtual reality could be a tool for empathy, and then I see Mark Zuckerberg talking about how VR could be a tool for empathy while being profoundly nonempathic, using VR to tour Puerto Rico after the storm, after Maria. One has this feeling of having contributed to something that’s gone very wrong.”
• Where Countries Are Tinderboxes and Facebook Is a Match, by Amanda Taub and Max Fisher in the NYT, about Facebook’s effects in Sri Lanka. “Time and again, communal hatreds overrun the newsfeed — the primary portal for news and information for many users — unchecked as local media are displaced by Facebook and governments find themselves with little leverage over the company. Some users, energized by hate speech and misinformation, plot real-world attacks. … ‘You report to Facebook, they do nothing,’ one of the researchers, Amalini De Sayrah, said. ‘There’s incitements to violence against entire communities and Facebook says it doesn’t violate community standards.'”
• Searching for a Future Beyond Facebook, by Jacob Silverman: “Facebook has accomplished a neat trick in the last fourteen years, draping itself in humanitarian intent while establishing a globe-straddling monopoly. In the name of connecting people, it has built the world’s largest surveillance apparatus, rivaled only by Google.”
• Nonetheless, Facebook’s share price has recovered. “Facebook wiped out all of its losses following the Cambridge Analytica data scandal. Shares hit an intraday high of $185.99 on Thursday.”
• More on that consent decree: NPR, March 2018: “The Federal Trade Commission is looking into whether Facebook violated a consent decree by enabling third parties to access users’ information without their permission.” ($40,000 per violation.)
• Megan McArdle’s thread explaining “just how bad the economics of the media industry are” (see also the unrolled thread). “Subtitle: why you can’t have all the awesome free journalism you want and have come to expect.”
• There will be little privacy in the workplace of the future, from The Economist’s March 2018 issue. Excerpt: “‘Every aspect of business is becoming more data-driven. There’s no reason the people side of business shouldn’t be the same,’ says Ben Waber, Humanyze’s boss. The company’s staff are treated much the same way as its clients. Data from their employees’ badges are integrated with information from their e-mail and calendars to form a full picture of how they spend their time at work. Clients get to see only team-level statistics, but Humanyze’s employees can look at their own data, which include metrics such as time spent with people of the same sex, activity levels and the ratio of time spent speaking versus listening.”
• The Internet’s ‘Original Sin’ Endangers More Than Privacy, by Brendan Eich and Brian Brown (who run the privacy-minded Brave browser). “As much as half the data consumed on mobile plans goes to downloading ads and trackers, adding significantly to fixed mobile data plans. . . As much as 50% of mobile battery life is consumed by ads while browsing. . . The internet need not be characterized by predation and parasitism. It can once again be a place of infinite possibility. Innovation got us into this situation; it can get us out.”
Zuck was on the stand for about 10 hours across two days of Senate and House joint committee hearings. And I’ll admit, I was impressed with his composure. Here’s a 33-year-old guy who, unlike Bill Gates 20 years before him, didn’t give in to overt expressions of irritation or arrogance. Clearly he had been heavily coached beforehand – which helps explain the somewhat robotic mannerisms and the weird tic of beginning every response with “Senator” or “Congressman” or “Congresswoman” (see Slate’s supercut video).
He just didn’t tell the truth.
With 2 billion users and counting, Facebook is unavoidable, and it’s growing more influential by the day. It’s vital that we understand what Zuck is actually up to, especially since he didn’t reveal it in his testimony. A number of media sources have helpfully corrected his inaccurate claims. Thus I present you with…
Untruths, deceptions, and misdirections that Zuck fed to Congress:
• “You control your data.” Not true, says Washington Post’s Geoffrey Fowler (“No, Mark Zuckerberg, we’re not really in control of our data.”) Or as this EFF piece says, “Zuckerberg’s insistence that users have ‘complete control’ neatly overlooks all the ways that users unwittingly ‘share’ information with Facebook.”
• “You own your data.” That’s impossible, says Gizmodo, given the “shadow profiles” that Facebook keeps on people who never use Facebook. (How can non-users “own” or “control” the data that Facebook’s surveillance has compiled on them?) Even Facebook’s Privacy Operations team disagrees with Zuck, points out MIT’s Technology Review. See also this WSJ piece by Christopher Mims (“Facebook has a lot more data about us than it lets on”).
• “Facebook doesn’t sell your data.” That’s deceptive, says this Vice article: Zuck’s claim is “an expert-level demonstration of hair-splitting … [your data] is valuable, and by not allowing other entities access to it, Facebook can monetize that same data over and over again.”
• “Facebook is a community.” Not at all, says Ian Bogost…
• “We need to increase the security of data that users upload.” Misdirection, says Zeynep Tufekci. The real issue is “HOW WE KEEP FACEBOOK FROM COLLECTING DATA ON US.”
Justin Brookman describes the misdirection this way: “FB always tries to frame privacy as you vs the world, not you vs Facebook.” And as David Carroll put it here: “Zuck attempts to redefine the very definition of ‘privacy’ as what we share, not what he collects.”
• “Conspiracy theory” is how Zuck described users’ concern about Facebook snooping on the phone. Unfair, says LibrarianShipwreck: “Bear in mind that until pretty recently you were mocked as wearing a tinfoil hat if you talked about 99% of the things that Facebook is now in the hot seat for having done.” See also this video test of Facebook and Google by Mitch Hollow.
• “Facebook is here to serve you.” Hardly the case, says Ian Bogost: “The computer ceased to be a servant of human life and began to be the purpose for which that life is conducted. That’s the heart of problem with the technology industry today, and it’s a problem that data-privacy regulation alone has no hope of fixing.” See also his article in The Atlantic comparing the Facebook debacle to the dotcoms.
• “Facebook made mistakes because it was so idealistic, focusing on doing good things, and no one could have imagined bad things happening.” Delusional, says Wired’s Erin Griffith: “‘I’m sorry for being too focused on the good and not enough on the bad’ is about as sincere as saying your greatest weakness is you work too hard.” Or as Siva Vaidhyanathan put it, “The idealism is the problem. There is a fine line between pledging to do no evil and believing you can do no wrong.”
• As I posted: Don’t forget that what we’ve learned of Facebook and Google surveillance is only what journalists discovered. Be assured that (a) Facebook and Google are doing other, equally disturbing things that weren’t detected, and (b) they’re planning now for future launches to be harder to detect.