Joi Ito's conversation with the living web.

A few months ago, McKinsey & Co. asked me to write an article for their online magazine What Matters. The edited article, "Creative Commons: Enabling the next level of innovation," was just posted to their site.

Following is the unedited original version.


The explosion of innovation around the Internet is driven by an ecosystem of people who work in an open network defined by open standards. However, the technical ability to connect in an increasingly seamless way has begun to highlight friction and failure in the system caused by the complicated copyright system that was originally designed to "protect" innovation. Just as open network protocols created an interoperable and frictionless network, open metadata and legal standards can solve many of the issues caused by copyright and dramatically reduce the friction and cost that it currently represents.

Before Ethernet and RJ45 connectors became the standard, we connected computers using a variety of different network technologies and connectors. It was usually physically impossible to connect computers from different companies to each other. Many of us will remember having AppleTalk cables on our Macintoshes, which didn't connect to the network cables on our PCs. While Ethernet wasn't the "smartest" protocol around, because of its simplicity and the lack of proprietary patents encumbering its use, it became widely adopted as a standard way to connect computers together.

Before TCP/IP was developed, even when computers could be connected physically and electronically, they couldn't really talk to each other without proprietary networking software. Computer and operating system vendors offered their own networking protocols, such as AppleTalk and Microsoft's own protocol, and you could also buy networking equipment and software from vendors such as Banyan and Novell.

I remember very clearly when I first heard about TCP/IP. I downloaded the free implementations for both my Mac and my PC and, for the first time, was able to communicate between the two computers and, more importantly, with all of the computers in the rest of the world. TCP/IP enabled the creation of the Internet and ended an era of proprietary networks, both local networks and services such as The Source, CompuServe and AOL in their original forms.

Then Tim Berners-Lee and the World Wide Web came along. Again, I remember clearly many people arguing that we didn't need the World Wide Web, since we could already log into any computer on the Internet, download papers, find the citations and easily track down the references. Many people did not initially recognize the value of the interoperability and simplicity that the World Wide Web brought to creating and linking documents on the Internet.

As we know in hindsight, each of these open standards created an explosion of innovation. Ethernet enabled companies such as Cisco, 3Com and others to emerge and compete in an area that used to be dominated by huge vendors who built super-expensive networking systems designed by telephone companies to specifications hammered out over years in intergovernmental standards bodies.

Similarly, TCP/IP allowed independent companies, the first ISPs, to compete in providing network services to companies and individuals, breaking, often for the first time, the monopolies that governments had granted to the telephone companies. This introduced competition, driving down the cost of moving bits around, and also enabled a whole ecosystem of software components, many of them free and open source. Author David Weinberger would later describe this system as "small pieces loosely joined." This new network, created out of small objects developed by small teams using open standards and protocols, was a completely new model.

In the past, organizations under the UN such as the CCITT, which later became the ITU-T, worked together with governments, telephone companies and their huge research organizations to create enormously complicated standards, anticipating every possible problem and building in features for the various constituents represented at the meetings. After years of deliberation, these standards would be agreed upon, and the telephone companies would contract massive projects, taking years and costing millions of dollars, to huge vendors who would develop the systems. There was no room for small pieces, small players or participation by any person or organization that wasn't well trained, organized, funded and authorized.

The Internet changed all of that. The Internet Engineering Task Force (IETF) had the credo "rough consensus, running code." Anyone could participate in the discussion, and in fact much of the discussion occurred online, allowing just about anyone to contribute as long as what they were saying, or the code they were writing, made sense. This agile method of developing standards allowed very small teams and individuals to participate both in the standards process and in the development of useful tools and components of the network.

It took only a few years to go from the days when "unauthorized devices" couldn't be connected to the network to the point when just about everything important that we were using to talk to each other was written by small teams on top of lightweight standards and protocols, mainly HTML and HTTP, on top of TCP/IP.

The Web and the ability for users to "view source" and copy each other's code created an explosion of innovation, content and business models such as eBay, Amazon and Wikipedia.

If you try to imagine what it would have been like to create Google before we had this stack of open standards, you would probably have had to pay millions of dollars to create the software on a proprietary operating system. It would have required a huge team of people and taken many years. Since it was a "search engine," it most likely would have been given to the phone company to design and run. If we were using X.25, the CCITT equivalent of the Internet, we would have been charged for, and would have charged each person for, every packet of information sent and received, in a network where each network operator had a bilateral agreement with every other network operator.

This total project probably would have taken a decade and cost a billion dollars and would probably not even have worked properly.

In fact, the total cost of actually building and launching the first Google server was probably only thousands of dollars using standard PC components, mostly open source software as the base and connecting to the Stanford University network which immediately made the service available, at no additional cost, to everyone else on the Internet.

The open standards and the small pieces loosely joined created an ecosystem of components and networks that dramatically lowered the cost of development, collaboration and delivery. This allowed people to innovate, launch, fail, connect, mash up and remix so efficiently and at such low cost that the center of innovation moved from the research laboratories of the giant companies to the startup and venture capital scene in Silicon Valley.

Of course, there were startups and venture capitalists before the Internet, but the influence and scale of this new engine of innovation was unprecedented.

The Internet continues to disintermediate and disrupt sector after sector by lowering friction and enabling interoperability. We find businesses and whole industries having to change their models and compete with a whole new set of players, ranging from individuals to companies to non-profit organizations. In most cases, this has created lower prices, more access and more choice for users. The new industries exceed the businesses of the past in both size and global reach.

The Internet has enabled us to technically connect and collaborate. But just as network software engineers were required to open communications between online users, we now need lawyers to sort out the copyright and content regulations between us so that we - businesses and individuals - can share, collaborate and build legally.

Before the Internet, if two large companies wanted to collaborate on a project, or one company wanted to license a work from another company for its territory, the deal makers would often meet in a posh hotel in Cannes, sipping champagne to negotiate a price. After several rounds of golf and a few cigars, the executives would agree on the price and "my people will talk to your people" to nail down the details. Finally, the lawyers would be flown in to negotiate the contract. Often these were multi-million-dollar deals, with legal fees costing hundreds of thousands of dollars over the lifetime of the collaboration. However, the value and the cost of the actual transaction were so high that the legal fees were simply absorbed into the cost.

Today, the Internet enables a professor in Croatia to collaborate on courseware with a professor in Japan. However, if they are going to legally share data and copyrighted material, they need to clear the licensing systems of both universities, calling upon their respective legal departments. Most likely, they would need to bring in outside experts to translate the legal documents, and finally they would negotiate some sort of contract for the collaboration. The legal fees for these two professors would drastically exceed the technical cost, and probably the value, of the project, making such a transaction prohibitively expensive and dooming the collaboration to failure.

Imagine an amateur filmmaker creating content to upload to their website as they try to clear the rights of music that they've gathered from across the Internet. Or imagine someone who wants to give a television broadcaster the right to use, with attribution, a photograph that they had posted on their blog. In most cases, the legal fees would exceed the value of the transaction and the sharing would fail, either because the parties would ignore the law, or opt not to share because the legal cost of doing so was prohibitive.

Creative Commons, the non-profit organization of which I am the Chief Executive Officer, aims to be the "TCP/IP of the collaboration and content layer." It addresses these problems with a series of licenses, technical specifications and tools that allow creators to mark their works, free of charge, with the permissions they wish to grant. People using Creative Commons licenses decide whether they would like to allow commercial reuse or restrict reuse to non-commercial purposes. They decide whether they would like to allow derivative use and modification of their creation. And they decide whether these modified works must be shared back to the rest of the world under the same free license.
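The choices described above combine into the six Creative Commons licenses, and the mapping can be sketched as a tiny decision function. This is a toy illustration, not official Creative Commons tooling; the function name and the string arguments are my own.

```python
def cc_license(commercial_ok: bool, derivatives: str) -> str:
    """Map a creator's two choices to one of the six CC licenses.

    derivatives: "yes" (modifications allowed), "share-alike"
    (modifications allowed, but must carry the same license),
    or "no" (no modifications allowed).
    Every CC license requires attribution, hence the "BY" element.
    """
    parts = ["BY"]
    if not commercial_ok:
        parts.append("NC")  # NonCommercial: restrict reuse to non-commercial purposes
    if derivatives == "share-alike":
        parts.append("SA")  # ShareAlike: derivatives must use the same license
    elif derivatives == "no":
        parts.append("ND")  # NoDerivatives: no modified versions may be shared
    return "CC " + "-".join(parts)

print(cc_license(True, "share-alike"))  # → CC BY-SA
print(cc_license(False, "no"))          # → CC BY-NC-ND
```

Because ShareAlike only matters when derivatives are allowed, the two independent questions yield exactly six licenses, from the most permissive (CC BY) to the most restrictive (CC BY-NC-ND).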

Creative Commons also provides tools for users to dedicate their works to the public domain. For some scientific data or educational resources the public domain provides the maximum flexibility and value.

You can choose one of the Creative Commons licenses yourself or use the CC0 public domain dedication tool. Service providers like Google, Yahoo and Microsoft support Creative Commons, providing tools to mark your works with easily understood icons and standardized metadata. Standardized metadata means other users can easily find and reuse available creative works, making tasks such as attribution and citation easy and automatic.
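To illustrate what standardized metadata buys you: the Creative Commons license chooser embeds a rel="license" link in a page's HTML, so a few lines of code can discover a work's license automatically. The sketch below uses Python's standard-library HTML parser; the page fragment is a hypothetical example of the kind of markup the chooser produces.

```python
from html.parser import HTMLParser

class LicenseFinder(HTMLParser):
    """Collect the href of every rel="license" link in an HTML page."""

    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # rel="license" on an <a> or <link> tag is the standard marker
        # that tools (search engines, media platforms) look for.
        rel = attrs.get("rel") or ""
        if "license" in rel.split() and "href" in attrs:
            self.licenses.append(attrs["href"])

# Hypothetical page fragment of the kind the CC license chooser emits.
page = '''
<p>This work is licensed under a
<a rel="license" href="https://creativecommons.org/licenses/by-sa/4.0/">
Creative Commons Attribution-ShareAlike 4.0 License</a>.</p>
'''

finder = LicenseFinder()
finder.feed(page)
print(finder.licenses)
# → ['https://creativecommons.org/licenses/by-sa/4.0/']
```

Because the marker is a single agreed-upon convention rather than free-form text, any crawler or application can determine the permissions on a work without human (or legal) intervention.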

Users of Creative Commons licenses, such as The White House, MIT, Wikipedia, Flickr, Al-Jazeera and many others, have published over 250 million works under Creative Commons licenses. They do not need to hire a lawyer each time they want to share, because each of these works uses a standard license. People building on these works also do not need to ask permission each time they want to share and collaborate, because the necessary permissions have already been granted.

This lowering of friction and ability to interoperate creates an opportunity for completely new types of collaborations as well as the ability for previously excluded sectors of society to participate.

Projects such as OpenCourseWare and the open educational resources (OER) movement allow students and educators to share and build upon each other's works, dramatically increasing transparency and diversity while decreasing the overall cost of collaboration and delivery for online learning.

Scientists and researchers all over the world are increasingly sharing data outside of the traditional academic and corporate silos enabling more participants and collaboration at an unprecedented scale.

Previously, because of the technical difficulty and cost of sharing, many of these barriers were not visible, and in many cases they were necessary to support the business models that made the high-cost sharing of the pre-Internet era possible.

Now, many of the systems put in place to protect businesses sharing information are becoming barriers to more widespread sharing as the Internet technically enables a whole new layer of collaboration and innovation. Even copyright itself can be a barrier to collaboration.

TCP/IP and the Web are successful because they are open standards shepherded by non-profit organizations, which act as custodians of a bottom-up process that takes input from, and builds consensus among, a wide variety of stakeholders. Similarly, Creative Commons is a non-profit organization with thousands of volunteers in over 80 countries working to develop standards for content sharing and to help organizations adopt them.

Having 100 Internets or 100 World Wide Webs governed by incompatible "standards" would suffocate the network effects that we enjoy on our one interoperable Web. Having a single set of copyright licenses and a single metadata format is key to creating the network effect of interoperability at the collaboration/legal layer.

Just as some networks still use X.25 and some electronic publishing systems do not use the Web and HTML, there will always be cases where the standardized licenses that Creative Commons provides do not make sense. However, Creative Commons has become the de facto standard for the Internet and the ecosystem of sharing, and it is best viewed as much as a standards organization as anything else.

In the early days, those of us who were proponents of TCP/IP had to argue with regulators, lawyers and technologists who, for a variety of reasons, did not support it. Creative Commons still has critics who do not yet understand or feel the benefits of the network effects and collaboration that Creative Commons enables.

Just as we have seen with each new layer of the Internet stack, I believe that Creative Commons will soon become, in hindsight, an obvious thing, and that the yet-to-be-imagined innovations it enables will have a dramatically positive effect on business, society and the environment.


"Unauthorized devices" is an absolutely brilliant way to think of this issue.

By creating interoperability between content, Creative Commons multiplies the efficiency of the Net by a trillion. Just as AT&T crippled the phone system by barring a third party princess phone, traditional copyright actually hinders creation.

I would love to see the default setting for anything that's created be a CC license... in other words, you should have to opt in to copyright, not opt out.

Well said, Joi.

I agree with Seth: well said.
But that said, now the question arises: how can you convince governments? How can you win against lobbying by companies that thrive on copyright money? That's the biggest challenge you have, Joi. Good luck :)

Copyright is an appreciation of the work or creation of a person or institution. We need to appreciate the work of others. But because technology is developing quickly, a lot of plagiarism occurs, especially in the Internet world. Sometimes in the business world, plagiarism and piracy are carried out to maximize profit, and copyright institutions do not seem able to cope with this plagiarism and piracy. As educated people, let us begin to appreciate the work of others, whether in art, in technology or in things that seem trivial, because people who succeed in creating works can be called entrepreneurs.
