The Dominant Ideology in Tech

Originally written for the Continuous Discovery newsletter, November 24, 2017

The entire tech world is gonna be gobsmacked when they finally realize the solution is to take more time and think about people more.

— Alan Cooper (@MrAlanCooper) November 2, 2017

I keep thinking about this rant from Alan Cooper a few weeks ago, poking fun at the “move fast and break things” attitude adopted throughout the tech community. Everybody’s been trying to figure out how to be Facebook, Amazon, Uber, Airbnb, or Netflix, and to do that they had to adopt various permutations of agile, lean startup, continuous delivery, DevOps, design thinking, etc. But I feel like the hype has peaked: the shimmer is starting to wear off, and people are disappointed by poor execution or, in some cases, by results that are merely good.

If the last 5-10+ years in business have largely been about disruptive innovation, “software eating the world” and thinking about “every organization as a tech company,” what kinds of narratives and themes will resonate when attitudes inevitably change (or even reverse)?

Summary:

  • Tech culture is dominated by a risky assumption that technology is inherently good — and by extension, developing new technologies faster is better, and anything that slows that process down is bad.
  • We need to develop better ways to insert critical reflection into our organizations and processes to make sure we’re building the right things (cont’d in the next edition).

Feel free to skim the headings and links below, or even skip to the end for what I’m reading and working on for the next edition.

Transformation Whiplash

Leaders of the big digital innovation push at GE (including CEO Jeff Immelt) were shown the door a few months ago by dissatisfied shareholders. This is interesting because GE was a prominent case study for digital transformation at large, traditional companies.

“For the past 16 years GE has been undergoing the most consequential makeover in its history. We were a classic conglomerate. Now people are calling us a 125-year-old start-up—we’re a digital industrial company that’s defining the future of the internet of things…”

“I met with tech leaders including Jeff Bezos, of Amazon; Paul Otellini, of Intel; Marc Benioff, of Salesforce; and Steve Ballmer (and later, Satya Nadella), of Microsoft, and had dinners with venture capitalists. I listened to them describe where they were going and how they went from strength to strength. I also read a lot. The two things that influenced me the most were Marc Andreessen’s 2011 Wall Street Journal article, “Why Software Is Eating the World,” and The Lean Startup…”

“How I Remade GE,” Jeff Immelt (HBR)

Of course GE might have found itself in even worse trouble if it hadn’t started that digital transformation. So I don’t want to make too much of this or let anyone argue that this is proof that the cool tech kids have been totally wrong. I hesitate to even say “backlash.” But it is definitely a sign that the period of exuberance for all things digital + disruptive is cooling off.

But what about in Silicon Valley, where the notion of “software eating the world” and the need to “move fast and break things” were born?

Toxic Culture

Part of this story should include how Uber’s cut-throat values and Ayn Rand-loving founder helped breed a toxic, dysfunctional culture that finally caught up with its success — and no doubt caused real harm to the careers, job satisfaction, and subjective well-being of many of its employees, especially women.

“Uber’s issues didn’t just pop out of nowhere. To those in the technology industry, the company has long been known for its “asshole culture”; and its frequent clashes with drivers, riders, law enforcement, and local government have made it either a disruptive hero or an arrogant villain, depending on who you talk to, and how badly they need a ride. Either way, there’s no question that Uber has been pushing the limits of acceptable corporate behavior for a while.”

“How Uber Got Here,” Madison Malone Kircher (NY Mag)

“The focus on pushing for the best result has also fueled what current and former Uber employees describe as a Hobbesian environment at the company, in which workers are sometimes pitted against one another and where a blind eye is turned to infractions from top performers.”

“Inside Uber’s Aggressive, Unrestrained Workplace Culture,” Mike Isaac (New York Times)

For more on “asshole culture,” not just at Uber but across Silicon Valley, see “Venture capital and the great big Silicon Valley asshole game” by Sarah Lacy (Pando). It’s from 2014, so some companies mentioned are now defunct and some people have been pushed out, but it still fits the theme.

And yeah, asshole culture certainly isn’t limited to Silicon Valley, cf. Harvey Weinstein, Donald Trump, Floyd Mayweather, etc., but I’m sticking to tech culture here.

The Limits of Rationality

[Image: “You’re not wrong, Walter, you’re just an asshole.” (The Big Lebowski)]

Another recent case study is James Damore’s infamous memo to fellow Google employees, arguing that a push for more inclusiveness and empathy is “unfair, divisive, and bad for business.”

He did make an honest attempt to use evidence and avoid generalization, but he betrayed some big blind spots, one of which was naivety about how people would feel and react, and how his intentions would be perceived (not to mention apparent unawareness that technical skill doesn’t translate into management/leadership ability, or promotability).

“Like many people in technology, and like technology itself, Damore explains a complex social world through seemingly logical systems, patterns and numbers. It can seem like a rational way of thinking but it can also lead to conclusions that lack subtlety or sophistication. The same cognitive patterns underlie the algorithms that power social media, where complicated issues around gender and psychology are reduced to simple shorthand.”

“‘I see things differently’: James Damore on his autism and the Google memo,” The Guardian

Ironically, Damore touched on the main theme of this newsletter: biases and blind spots in tech culture. “Google has several biases and honest discussion about these biases is being silenced by the dominant ideology.”

But what he calls the “dominant ideology” is more like “a specific initiative” (promoting diversity) which he seems to think is just a result of prevailing political views at Google — as if diversity programs must be part of a liberal political agenda, rather than pragmatic business decisions intended to expand the talent/recruiting pool, improve employee retention/satisfaction (and productivity), avoid negative publicity, and generally not let the assholes take over, à la Uber.

It’s true that people in tech lean left (like a lot of other knowledge-driven professionals: journalists, academics, and even lawyers), but the biggest biases and blind spots come from ideologies that go largely unspoken and are taken for granted.

Let’s turn to Facebook…

“In a largely automated platform like Facebook, what matters most is not the political beliefs of the employees but the structures, algorithms and incentives they set up, as well as what oversight, if any, they employ to guard against deception, misinformation and illegitimate meddling. And the unfortunate truth is that by design, business model and algorithm, Facebook has made it easy for it to be weaponized to spread misinformation and fraudulent content.”

“Zuckerberg’s Preposterous Defense of Facebook,” Zeynep Tufekci (New York Times) — also see “We’re building a dystopia, just to make people click on ads” (TED)

The “dominant ideology” that needs to be discussed and questioned and disrupted is the largely unspoken assumption that technology and progress are the same thing.

Panic!

If you believe that technology-driven change or disruption is inherently good, then you’ll see faster change as better — and anything that slows that process down is bad. And that’s where a lot of our current problems are coming from.

“The view at Facebook is that ‘we show people what they want to see and we do that based on what they tell us they want to see, and we judge that with data like time on the platform, how they click on links, what they like,’” a former senior employee told BuzzFeed News. “And they believe that to the extent that something flourishes or goes viral on Facebook — it’s not a reflection of the company’s role, but a reflection of what people want. And that deeply rational engineer’s view tends to absolve them of some of the responsibility, probably.”

“How People Inside Facebook Are Reacting To The Company’s Election Crisis,” Charlie Warzel (BuzzFeed)

“This reflects a larger problem caused by Facebook’s immense growth and increasing complexity; as Alexis Madrigal recently wrote in The Atlantic with regards to the role of fake news and targeted ads on Facebook in the most recent election, ‘no one knew everything that was going on on Facebook, not even Facebook.’”

“People at Facebook Don’t Know How Facebook Works,” Kashmir Hill (Gizmodo)

“‘Most of the early employees I know are totally overwhelmed by what this thing has become,’ an early ex-Facebook employee told me recently, referring to the size of the social network and the gargantuan impact it now has on the way people think and communicate… ‘I lay awake at night thinking about all the things we built in the early days and what we could have done to avoid the product being used this way,’ the early ex-employee told me.”

“‘Oh my God, what have I done,’ some early Facebook employees regret the monster they created,” Nick Bilton (Vanity Fair)

“This is a deeply dark time, in which the structures we have built to sustain ourselves are being used against us — all of us — in systematic and automated ways.”

“Something is wrong on the internet,” James Bridle (Medium)

What we’re seeing now is arguably an over-reaction, but sometimes a little over-reaction is part of the rebalancing process.

So Now What?

Some people have started to argue that it’s high time someone studied issues around technology and ethics — which “irritated and bemused” a lot of people who’ve been studying that for years, even decades.

“It’s not as easy as saying we should throw some ethics at our technology. One should immediately wonder whose ethics are in view? We should not forget that ours is an ethically diverse society and simply noting that technology is ethically fraught does not immediately resolve the question of whose ethical vision should guide the design, development, and deployment of new technology. Indeed, this is one of the reasons we are invested in the myth of technology’s neutrality in the first place: it promises an escape from the messiness of living with competing ethical frameworks and accounts of human flourishing.”

“One Does Not Simply Add Ethics To Technology,” LM Sacasas

Here’s an entire spreadsheet of technology ethics courses with links to their syllabi (via Casey Fiesler).

On a practical level, one idea I’ve been thinking about came from one of the talks I saw at the CanUX conference a few weeks ago. In “An Opposite Truth,” Dan Klyn suggested we flip the UX dictum that “you are not your user”; we should also think about the ways in which “you and your user are one.”

“Now seems like a good time for digital designers to find a way to face the mirror of self in the experiences we make…”

“An Opposite Truth,” Dan Klyn

Wouldn’t it be funny if all we needed was the Golden Rule? (Probably not, since we don’t all like to be treated the same way and I hope social conformity isn’t the answer, but it’s worth revisiting as an ethical rudiment.)

What’s Next?

The next edition should pick up where this left off (or rather, where I originally intended to take this one):

How can we insert more critical, long-term reflection into (or alongside) tech culture and “fail fast” practices like Lean Startup, some forms of Design Thinking, Agile, etc.?

If you want to join a completely imaginary Continuous Discovery book club, I just started reading Victor Papanek’s classic, Design for the Real World: Human Ecology and Social Change (1971). It takes a critical look at design and how designers think (or don’t) about the implications of what they/we create.

In other news, last week we did a workshop at ResIM on “Facilitation for Introverts,” because I think the world needs a little more listening and reflection in the processes we use to create and change things.

Nothing else is planned but I’ll try to share future workshop topics/adventures, either here or through Twitter and LinkedIn. Thoughts/questions welcome.

Till then.
Brian