
I received a lot of excited feedback from people who saw the 60 Minutes segment on the Media Lab. I also got a few less congratulatory messages questioning the "gee-whiz-isn't-this-all-great" depiction of the Lab and asking why we seemed so relentlessly upbeat at a time when so many of the negative consequences of technology are coming to light. Juxtaposed with the first segment in the program about Aleksandr Kogan, the academic who created the Cambridge Analytica app that mined Facebook, the Media Lab segment appeared, to some, blithely upbeat. And perhaps it reinforced the sometimes unfair image of the Media Lab as a techno-Utopian hype machine.

Of course, the piece clocked in at about 12 minutes and focused on a small handful of projects; it's to be expected that it didn't represent the full range of research or the full spectrum of ideas and questions that this community brings to its endeavors. In my interview, most of my comments focused on how we need more reflection on how far we have come in science and technology over the 30-plus years that the Media Lab has been around. I also stressed that at the Lab we're thinking a lot more about the impact technology is having on society, climate, and other systems. But in such a short piece--and one that was intended to showcase technological achievements, not to question the ethical rigor applied to those achievements--it's no surprise that not much of what I said made it into the final cut.

What was particularly interesting about the 60 Minutes segment was the producers' choice of "Future Factory" for the title. I got a letter from one Randall G. Nichols, of Missouri, pointing out that "No one in the segment seems to be studying the fact that technology is creating harmful conditions for the Earth, worse learning conditions for a substantial number of kids, decreasing judgment and attention in many of us, and so on." If we're manufacturing the future here, shouldn't we be at least a little concerned about the far-reaching and unforeseen impact of what we create? I think most of us agree that, yes, absolutely, we should be! And what I'd say to Randall is, we are.

In fact, the lack of critical reflection in science and technology has been on my mind--I wrote about it in Resisting Reduction. Much of our work at the Lab helps us better understand and intervene responsibly in societal issues, including Deb Roy's Depolarization by Design class and almost all of the work in the Center for Civic Media. There's Kevin Esvelt's work that involves communities in the deployment of the CRISPR gene drive, and Danielle Wood's work generally and, more specifically, her interest in science and racial issues. And Pattie Maes is making her students watch Black Mirror to imagine how the work we do in the Lab might unintentionally go wrong. I'm also teaching a class on the ethics and governance of AI with Jonathan Zittrain from Harvard Law School, which aims to ensure that the generation now rising is more thoughtful about the societal impact of AI as it is deployed. I could go on.

It's not that I'm apologetic about the institutional optimism that the 60 Minutes piece captured. Optimism is a necessary part of our work at the Lab. Passion and optimism drive us to push the boundaries of science and technology. It's healthy to have a mix of viewpoints--critical, contemplative, and optimistic--in our ecosystem. Not all aspects of that can necessarily be captured in 12 minutes, though. I'm sure that our balance of caution and optimism isn't satisfactory for quite a few critical social scientists, but I think that a quick look at some of the projects I mention will show a more balanced approach than would appear to be the case from the 60 Minutes segment.

Having said that, I believe that we need to continue to integrate social sciences and reflection even more deeply into our science and technology work. While I have a big voice at the Lab, the Lab operates on a "permissionless innovation" model where I don't tell researchers what to do (and neither do our funders). On the other hand, we have safety and other codes that we have to follow--is there an equivalent ethical or social code that we or other institutions should have? Harrison Eiteljorg II thinks so. He wrote, "I would like to encourage you to consider adding to your staff at least one scholar whose job is to examine projects for the ethical implications for the work and its potential final outcome." I wonder, what would such a process look like?

Socially integrated work in technology has continued to increase, both in the rest of society and at the Lab. One of my questions is whether the Lab is changing fast enough, and whether the somewhat emergent way that this work is infusing itself into the Lab is the appropriate one. Doing my own ethical and critical work and having conversations is the easiest way for me to contribute, but I wonder if there is more that we as a Lab should be doing.

One of the main arcs of the 60 Minutes piece was showing how technologies built in the Lab's early days--touch screens, voice command, things that were so far ahead of their time in the 80s and 90s as to seem magical--have gone out into the world and become part of the fabric of our everyday lives. The idea of highlighting the Lab as a "future factory" was to suggest that the loftiest and "craziest" ideas we're working on now might one day be just as commonplace. But I'd like to challenge myself, and everyone at the Media Lab, to demonstrate our evolution in thoughtful critique, as well.

6 Comments

Joi,

Following the 2002 Sarbanes-Oxley legislation, passed in the wake of financial scandals such as Enron and Tyco, many companies struggled to cope with the complexities of compliance. Goldman Sachs CEO Hank Paulson organised 20 ethics forums that the bank’s entire staff of managing directors was required to attend. Citigroup introduced ethics training for all its 300,000 employees. At Lubrizol, a specialty chemicals company, two people were employed to post ethics guidelines in seven languages, and to oversee 27 regional ethics leaders around the world. A firm called EthicsPoint was set up to provide ethics online: the firm built integrated web and telephony systems to provide “automated and accurate distribution”.

This delivery metaphor for ethics persists. The Online Ethics Center (http://www.onlineethics.org/) is chock full of 'resources' on the ethics of science and engineering. The mission of Entire, Europe's online Wiki, is also to make Research Ethics and Research Integrity 'accessible'.

The idea that ethics is a thing that, when missing, one obtains from a third-party supplier (or, as Mr Eiteljorg suggests, from an in-house expert) is a symptom of the problem, not its solution. Would it not be better to investigate the system conditions in which some communities do behave ethically--and then figure out how to recreate those conditions at the Lab?

I can see the concern about delegating or 'outsourcing' ethics, but is embedding it through a system really achievable or desirable? Should individual Google or Tesla product managers be given the task of balancing complex commercial and ethical judgements? Don't we want companies to do what companies do best - innovate for commercial reasons - and restrain them where needed through the kind of regulation you describe?

On the other hand, I find Joi Ito's dismissal of the independent ethics advisor troubling. The message seems to be: don't worry, leave it to us, we'll find a solution. Where have I heard that before?

The Media Lab no longer invents the future. Nobody does that anymore. It is an idea left over from the birth of the Media Lab. It has gone on to disappoint its founder. That past is over, as it is for the rest of the innovation crowd.

Totally agree.

Not all progress is good.

What you should be looking for at MIT are antibodies; some peaceful idea that binds human beings without the need for science and technology. Because we can all agree that this King of the Mountain approach in large organisations has got to go; a lucky few get to spend a bit of time on top, but everybody spends most of their existence getting crushed from a great height. So why don't you start solving this problem rather than placing ever more electronic, non-human media as a proxy for authentic relationships?

Joi,
I am a big fan of the Lab. To bring ethics into the mix, you need to bring personal transformation into the mix as well. You don't need to be religious to be bound by some ethical values. You could ask all students and staff what boundaries they will voluntarily honor; a sort of Hippocrates' oath for Geeks. At my school we have some activities that start off this process, but we are not there yet. We are creating the future :-)
Gerdt
