How Engineering’s Blind Spot Creates Runaway Monopolies


One of Big Tech’s biggest problems is that designers too often forget to put a soul in the machine.

In his new book World Without Mind: The Existential Threat of Big Tech, journalist Franklin Foer paints a deeply disturbing portrait of an age in which a handful of giant technology companies — Google, Apple, Facebook, and Amazon (aka GAFA) — exert extraordinary control over an information-based economy with their power to disrupt business models and decimate industries by siphoning off revenue.

Even more alarmingly, Foer writes, big tech increasingly influences not just what we know, but also how we think and what we do, by gathering vast quantities of our personal data and utilizing artificial intelligence to continually prod us to consume. In the process, Foer charges, they’ve created a world in which people are continually being observed and distracted. “The tech companies are destroying something precious, which is the possibility of contemplation,” he writes.

A former editor of The New Republic and now a staff writer for The Atlantic, Foer was a recent visiting speaker at Stanford Graduate School of Business’s Corporations and Society Program. We asked him to talk further about what’s wrong with the world created by big tech — and how to fix it.

You published your book 60 years after Vance Packard’s The Hidden Persuaders, which similarly exposed how advertisers and marketers were misusing behavioral psychology to manipulate consumers in the analog age. Are we simply seeing the same problem today but with better tools?

The crucial change is data. The ability to get inside our heads and therefore the ability to manipulate us is so much more invasive and intense than it was in the 1950s. It’s personalized and deeply exploitative of anxiety. And we’re with these technologies all the time. You might have turned off your TV, but your phone is always by your side.

In your book, you note that Silicon Valley always has been driven by a contradiction. It offers breakthroughs that promise to be liberating for individuals but end up serving monopolism.

The internet has the possibility of being as democratic as promised, and it comes with this dream of tying everybody together as one unit. But that impulse ultimately is what points it toward monopoly and conformism. Not inevitably, but it points in that direction.

You write that the tech giants’ concentration of power serves to “squash diversity of opinion and taste.” Will that ultimately stifle the sort of innovation that led to their rise?

No, because these companies spend so much on R&D and their machines are always teaching themselves new things. But I think it kind of squashes innovation in the economy as a whole. Capital flows to a bunch of well-established companies as opposed to being seeded throughout the economy, where it would disperse innovation to a whole bunch of new firms. It’s not healthy for an economy to have so much control concentrated in a few companies.


The problem is the way these giant tech companies exert control over markets, which can be bad for consumers and bad for the firms that depend upon their platforms. They have the ability to pick winners and losers. And as they continue to grow, their own products will be the winners on their platforms. The instinct for Google and Amazon to award their own products the highest placement is almost irresistible. Facebook, as it produces more and more original video, will give that video heavier weight in its algorithms. The monopolistic effects of these platforms will end up crushing the entities that depend on them. We see that already with journalism, which has come to depend so heavily upon Facebook and Google.

A lot of the negative effects of technological innovation described in your book seem to be unintentional side effects. Do we need a new approach to teaching business ethics, and how to use innovations more responsibly?

It’s a real shortcoming in the computer science curriculum and the engineering curriculum more generally. Engineers and programmers are taught how to build efficient systems, but they rarely understand the human component of those systems, the ethical and political dimensions of what they build. These companies inherently make a large number of consequential decisions, and if efficiency is the only criterion, they’re going to make bad human decisions.

But I’m loath to shift too much of an ethical burden onto these companies. I do want the leaders trained in ethics, but capitalism is what capitalism is. It’s hard to imagine sacrificing profit for the sake of the greater good. That’s where government policy comes in.

So regulation would enable these companies to keep doing what they do, but with less damage.

It’s gob-smacking that there is no comprehensive law in the country protecting data. That is step one: legislation creating a new regulatory body. My desire is to see a regulatory regime that reviews issues of surveillance and monopoly as intertwined and understands that surveillance is the mechanism by which monopolies protect their incumbency.

The problem is so big and pervasive that there’s no single silver bullet. Regulation is a necessity, but I think cultural change among consumers is also a necessity. There needs to be comprehensive social change.

You’ve advocated creation of a federal data-protection agency that would not only guard consumers’ privacy but also protect the free flow of information on the internet from undue corporate influence. How would that work?

Europeans have pointed the way with a lot of what they’ve done, though there are parts of their model that I dislike. I’m not a fan of the right to be forgotten, which I think is contrary to a lot of our First Amendment beliefs. But I do think individuals should be able to exert much greater control over the use of their own data. It’s not simple to set up, but we could figure it out.

Increasingly, we’ve seen the public losing faith in institutions and turning on them. Is such a backlash against big tech inevitable? What might trigger it?

The Russian hacking of the elections is a pretty seminal event. It has caused trust in Facebook to diminish. I think that’s just an early warning sign to these companies. Ultimately, big institutions fall out of favor and lose the public’s trust. That’s pretty much just part of the cycles of American history.

You note that 62 percent of Americans now get their news from social media and describe how that has wrecked the news media’s economic model, creating one in which misinformation can spread virally. What consequences does that have for business?

It’s ultimately going to become a threat to business itself, where you could see businesses spreading misinformation about rivals. When misinformation flows so easily, there’s no telling where it stops. Re-establishing some common basis for fact is a business necessity as well as a political one.

Can you see corporations becoming players in this effort to restore truth?

I don’t know how they can do that precisely. Nobody has a good handle on how we can restore a common basis for fact. But I do think that business is going to play an important role in shaping the regulatory environment. Every company that is not GAFA — be it Walmart or Microsoft or other massive players — will attempt to reassert some control over this economy. The wars over antitrust are going to be intense, and they’re going to cross-cut in very interesting ways. I’m especially interested in seeing how larger players who still have political muscle will maneuver. What steps does Walmart take in response to Amazon’s hegemony? It’s hard to imagine that they won’t be aggressive at some point.

What advice do you give to people who work in the tech sector?

I think tech workers need to feel that they’ve pushed their companies to behave in the most ethical manner. When people ask me whether they should go to work for one of the big companies, I’m always agnostic. The companies are so powerful that having smart, well-intentioned people within who think in well-rounded ways is important. But it’s also important that those people not gravitate toward the center and that they continue to start up their own companies. The world needs a pluralistic marketplace. What we don’t want is a concentration of brains as well as a concentration of power.

This article first appeared on www.gsb.stanford.edu


