Recently in Network Technology Category

A few weeks ago, Stewart Brand emailed me and asked if I was still playing World of Warcraft and if I had read DAEMON. I was still playing World of Warcraft and hadn't read DAEMON. A few days later, thanks to Amazon, I was reading DAEMON.

Years ago, I remember thinking about Multi User Dungeons (MUDs) and how much they affected people in the real world. I knew people who were obsessed with MUDs, the first multi-user online role-playing games and the ancestors of today's MMORPGs. I was obsessed. (I think the first time I ever appeared in Wired was in 1993, when Howard Rheingold, with Kevin Kelly, wrote about MUDs and mentioned my obsession.) In MUDs, people got married, people got divorced, people lost their jobs, people shared ideas... The MUDs I played touched the real world through all of the people in the game.

Unlike World of Warcraft and more like Second Life, MUDs allowed players to create rooms, monsters and objects. When you entered a MUD, it was like entering the collective intelligence of all of the people who played the game. There were quests that were designed by people using their knowledge of Real Life™. Playing in their worlds was like walking through their brains. These worlds merged and collided as people from everywhere collaborated in creating MUDs of various themes with various objectives.

At some point in the evolution of MMORPGs, MUDs forked and we ended up with most of the people who liked creating objects and worlds in places like Second Life where, while you CAN make games, most of what happens is world creation. The "gamers" ended up in games like World of Warcraft where the game play aspect has been honed to a fine art, but the player content creation aspect has been completely lost. (Although most of the developers are former obsessive players.)

What I envisioned back when I was playing and hacking MUDs was that if you turned the world a bit inside out and imagined that YOU were the MUD, the people who played your game were like little pawns or interfaces for you in the real world. They created content and built worlds and taught you about the real world. They promoted you to their friends. They played obsessively, increasing their experience points and commitment to the game so that they would forever feed you and keep you alive. They would set up servers and pay for hosting just to feed their obsession and protect their investment. If you became extremely popular, a group of your players would spawn a new MUD with your DNA-code and there would be another one of you.

The hardcore players would hack your open source code and keep you evolving. The Wizards would educate and add character to each instance of your code. The players would be your footprint in Real Life™.

When most of the gamers moved to corporation-owned, closed-source games designed by teams of developers, I stopped having this dream. The games were no longer "alive" in the same way I had envisioned them evolving.

After reading DAEMON, this dream is back. Leinad Zeraus depicts a world where a colossal computer daemon designed by a genius MMO designer begins to take over the world after its designer's death. In many ways, the vision is similar to the one I had, but the author adds a macabre twist and many more orders of magnitude of scale to make this one of the most inspiring books I've read in a long time. The author is "an independent systems consultant to Fortune 100 companies. He has designed enterprise software for the defense, finance and entertainment industries." He uses his experience to make the book extremely believable and realistic and still mind-blowing.

It was super fun to read and is a book I'd recommend to anyone who loves the Net and gaming. I'd also recommend it to anyone who doesn't. It's a great book for learning about the importance of understanding all of this - before it's too late.

Yesterday, we started planning our veggie garden and started a compost bin. I'm trying to figure out what percentage of my total food intake I can grow at home. We have a relatively large yard by Japanese standards so most of this will be a matter of personal energy. I'm going to start small this year but try to increase my nutritional independence from commercial networks every year.

My goal is to be able to cover nearly all of our fertilizer needs through the composting of all of our biodegradable garbage this year.

Thinking through the various scenarios, I realized that I could significantly reduce inputs and outputs from our house by going this route. When I imagine walking over to the garden every morning, picking my veggies, then chucking the waste into the compost bin, I get a happy feeling inside. I realize this is pretty simple and not so significant, but "just add water and sunlight" is very appealing.

I think that I can also make a significant impact on my energy inputs through photovoltaics and maybe some day get off of the power grid. This requires a larger financial investment, but it's an area I've already done a bit of work in from my time at ECD.

In my lab/office/Tokyo pad we just finished setting up (thanks to the folks at WIDE) a dark fiber connection to the WIDE box at the Japanese Internet exchange. It is currently a 1G connection. WIDE is a research project and I'm only paying for the dark fiber. WIDE is routing for me. I am not going through a single licensed telecom provider for my Internet connectivity. Consequently, going from 1G to 10G is just a matter of buying more hardware and has no impact on the running cost. More bandwidth is just about more hardware. The way it SHOULD be.

It's exciting to think about making my footprint smaller and smaller in nutrition and energy and thinking about nutrition, energy and bandwidth more and more as assets that I operate rather than services from big companies.

I was going to Twitter this as I was sitting here drinking my morning tea, but it turned into a blog post. Thanks Twitter. ;-)


News on sina.com.cn, in which China Telecom, one of the biggest ISPs in China, released an official statement (with my rough translation following each paragraph):

中国电信称,据我国地震台网测定,北京时间12月26日20时26分和34分,在南海海域发生7.2、6.7级地震。受强烈地震影响,中美海缆、亚太1号、亚太2号海缆、FLAG海缆、亚欧海缆、FNAL海缆等多条国际海底通信光缆发生中断,中断点在台湾以南15公里的海域,造成附近国家和地区的国际和地区性通信受到严重影响。

China Telecom has confirmed that, according to China's earthquake monitoring network, earthquakes of magnitude 7.2 and 6.7 occurred in the South China Sea at 20:26 and 20:34 Beijing time on December 26. The earthquakes cut several international submarine communication cables, including the China-US cable, Asia-Pacific Cable 1, Asia-Pacific Cable 2, the FLAG cable, the Asia-Europe cable and the FNAL cable. The breaks are located in waters about 15 km south of Taiwan and have severely affected international and regional communications in nearby countries and regions.

据悉,中国大陆至台湾地区、美国、欧洲等方向国际港澳台通信线路受此影响亦大量中断,国际港澳台互联网访问质量受到严重影响,国际港澳台话音和专线业务也受到一定影响。

It was also reported that international circuits from mainland China to Taiwan, the US, Europe and elsewhere were largely interrupted as well. Internet access to destinations outside mainland China, including Hong Kong, Macau and Taiwan, has been severely degraded, and international voice and leased-line services have also been affected to some extent.

中国电信称,受余震影响,抢修工作遇到较大困难,加之海缆施工具有一定难度,预计影响还将持续一段时间。

China Telecom says that aftershocks are making repair work very difficult, and submarine cable work is hard at the best of times, so the disruption is expected to continue for some time.

This really throws the notion of "cyberspace" into the physical world. My sympathies to everyone affected. Hope they figure out how to fix those cables quickly.

UPDATE: Xeni just picked this up on Boing Boing and linked to the Wikinews article and to the image above.

AlterNet
Senator Ted Stevens: The Remix

Posted by Melissa McEwan at 6:57 AM on July 11, 2006.

Last month, Senator Ted Stevens (R-Alaska) gave a rather stunning speech on the issue of net neutrality, in which he made such clueless statements as: "I just the other day got, an internet was sent by my staff at 10 o’clock in the morning on Friday and I just got it yesterday," and "[T]he internet is not something you just dump something on. It’s not a truck. It’s a series of tubes."

Now, the good folks at Boldheaded have turned his "skillful fusion of political doublespeak and perplexing ignorance on how the Internet works" into the DJ Ted Stevens Techno Remix: "A Series of Tubes." [Stream or Download above]

All I can say is just go listen. And then laugh and laugh and laugh.

(ChezLark, Boldheaded)

I DID NOT KNOW that the Internet was a series of tubes.

Very funny. ;-)

You can download it from the Bold Headed Broadcast site.

via Scott via Deb

I just got a new Vodafone Japan phone to mess around with the network. In particular, I'm curious about how SMS evolves or fails to evolve in Japan.

So here's what I tested. I have a T-Mobile US SIM in a Nokia phone and was able to send and receive SMSs over both the Vodafone 3G network and the NTT DoCoMo 3G network. I was able to send an SMS to my Vodafone Japan phone, but not to my NTT DoCoMo phone. However, I was NOT able to reply to the SMS. As far as I can tell, Vodafone Japan and DoCoMo both disable sending SMSs to any network other than their own, but Vodafone Japan allows you to receive an SMS from outside the network. This is for people with accounts on those networks. Their networks DO allow people who roam on their networks to send and receive SMS freely.

I am going to Finland tomorrow so I will try to use my Vodafone Japan phone there and see if it still blocks my SMS. I have a feeling that since the blocking is probably done at the SMS server, roaming probably won't change anything.

The good news is that the 3G networks in Japan allow 3G phones and 3G subscribers from outside of Japan to roam on the Japanese networks. The bad news is that the Japanese networks are bringing their old-fashioned closed network philosophy and crippling connectivity between their networks. How stupid.


Just read the newly crafted elevator pitch for Benetech in a letter from Jim Fruchterman, the CEO, Chairman and Founder.

His pitch:

Benetech creates technology that serves humanity by blending social conscience with Silicon Valley expertise. We build innovative solutions that have lasting impact on critical needs around the world.
Webcams and other digital communication could give ordinary people feedback on the results achieved through their donations of money and time.

This would give them a power of oversight formerly reserved for wealthy philanthropists.

Does this hint at disruptive digital technology undermining the NGO world with individualized philanthropy that cuts out the middlemen?


In France bloggers have been investigated by police for inciting the riots.

Also, my audiocast on the riots for the New York Times website. (My first podcast-style effort)

Blogs and SMS messages were apparently used to coordinate violent action on a large scale.

What should authorities do?

Is there an alternative to censorship?

As the Web 2.0 bandwagon gets bigger and faster, more and more people seem to be blogging about it. I am increasingly confronted by people who ask me what it is. Just as I don't like "blogging" and "blogosphere", I don't like the word. However, I think it's going to end up sticking. I don't like it because it coincides with another bubbly swell in consumer Internet (the "web") and it sounds like "buzz 2.0". I think all of the cool things that are going on right now shouldn't be swept into some name that sounds like a new software version number for a re-written presentation by venture capitalists to their investors from the last bubble.

What's going on right now is about open standards, open source, free culture, small pieces loosely joined, innovation on the edges and all of the good things that WE FORGOT when we got greedy during the last bubble. These good Internet principles are easily corrupted when you bring back "the money". (As a VC, I realize I'm being a bit hypocritical here.) On the other hand, I think/hope Web 2.0 will be a bit better than Web 1.0. Both Tiger and GTalk use Jabber, an open standard, instead of the insanity of MSN Messenger, AOL IM and Yahoo IM using proprietary standards that didn't interoperate. At least Apple and Google are TRYING to look open and good.

I think blogging, web services, content syndication, AJAX, open source, wikis, and all of the cool new things that are going on shouldn't be clumped together into something that sounds like a Microsoft product name. On the other hand, I don't have a better solution. Web 2.0 is probably a pretty good name for a conference and probably an easy way to explain why we're so excited to someone who doesn't really care.

While we're at it, labeling the web x.0: Philip Torrone jokingly mentioned to me the other day (inside Second Life) that 3D was Web 3.0. I agree. 3D and VR have been around for a long time and there is a lot of great work going on, but I think we're finally getting to the phase where it's integrated with the web and widely used. I think the first step for me was to see World of Warcraft (WoW) with its 4M users and its extensible client. The only machine I have where I can turn on all of the video features is my dual-CPU G5. On my PowerBook I have to limit my video features and can't concurrently use other applications while playing. Clearly there is a hardware limit, which is a good sign since hardware getting faster is a development we can count on.

Second Life (SL) is sort of the next step in development. Instead of trying to control all real-money and real-world relationships with things in the game like Blizzard does with WoW, SL encourages them. SL is less about gaming and more about building and collaboration. However, SL is not open source and is a venture capital backed for-profit company that owns the platform. I love it, but I think there's one more step.

Croquet, which I've been waiting for for a long time, appears to be in the final phases of a real release. Croquet, if it takes off, should let you build things like SL but in a distributed and open source way. It is basically a 3D collaborative operating system. If it takes off, it should allow us to take our learning from WoW and SL and do to them what "Web 2.0" is doing to traditional consumer Internet services.

However, don't hold your breath. WoW blows away SL in terms of snappy graphics and response time and has a well designed addictive and highly-tuned gaming environment. Croquet is still in development and is still way behind SL in terms of being easy to use. It will take time for the more open platforms to catch up to the closed ones, but I think they're coming.

Web 3.0 is on its way! Actually, let's not call it Web 3.0.

I don't know how much deep thought was involved when George Bush called the Internet "the internets" but this reflects a real risk that we face today. If you look at the traffic of many large countries with non-English languages, you will find that the overwhelming majority of the traffic stays inside the country. In countries like China and Japan where there is sufficient content in the local language and most people can't or don't like to read English this is even more so. I would say that the average individual probably doesn't really notice the Internet outside of their country or really care about content not in their native language.

Physical mail inside these countries is delivered with addressing in the local language. It's not surprising that on the issue of Internationalized Domain Names (IDNs) there is a strong and emotional position inside these countries that people should be able to write URLs in their native scripts. Take my name, for example: the same Chinese characters can be transliterated into English as either Johichi Itoh or Joichi Ito. The problem is aggravated in languages such as Chinese, where there are more dialects and many more readings for the same set of characters. Why should these people be forced to learn some sort of roman transliteration in order to reach a company's page when they already know the official Chinese characters for its name?
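Under the hood, IDNs don't change the DNS at all: native-script labels are encoded into an ASCII-compatible form (Punycode) before they hit the wire. Here is a minimal sketch using Python's built-in idna codec; the domain name is purely illustrative.

# Minimal sketch: how a native-script domain maps to the ASCII form
# that actually travels through the DNS. The name below is made up.
native = "日本語.example.jp"

ascii_form = native.encode("idna")   # Punycode-encode each label
print(ascii_form)                    # e.g. b'xn--....example.jp'

print(ascii_form.decode("idna"))     # and back to the native script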

Similarly, there are people who don't like the policies of the Internet and either want to censor or otherwise manage differently THEIR internet. Others who don't like the way DNS works, have proposed alternative roots. This is possible and easy to do, but you end up with "the internets".

It is the fact that we have a single root and that we have global policies and protocols which allows the Internet to be a single network and allows anyone to reach anyone else in the world. Clearly, allowing anyone in the world to reach anyone else in the world with a single click introduces a variety of problems, but it creates a single global network which allows dialog and innovation to be shared worldwide without going through gateways or filters. This attribute of the Internet is a key to the future of a global democracy and I believe we need to fight to preserve this.

Since more and more people are using the Internet, there are more and more diverse views about the policies and control. This is clearly making consensus more difficult, and ICANN is one of the groups having to adapt to the increasing number of inputs in the consensus process. This is all the more reason to work harder to keep everything together. Please. Let's fight to keep the Internet and not let it turn into the internets... It is a difficult process with various flaws, but if we give up, it will be very difficult if not impossible for all of us to talk to each other again any time soon.

As Wendy says... Grokster...

EFF: MGM v. Grokster


Micah Sifry has written a nice piece about why wifi and cheap broadband are essential enablers and more important than direct aid for communities that need help. He references various examples and sources. I completely agree. I remember speaking to a UN diplomat who said that the Internet has changed the face of global policy making. He told us that the Anti-Personnel Landmine Treaty would not have happened if it weren't for email and the ability of NGOs to get information, organize and pressure governments and the UN using the Internet. I believe that at every level, it is essential to empower individuals and communities with a voice, and the Internet is in a position to enable people for the first time at a reasonable cost. It is about global voices.

I believe that running a basic WiFi, Internet and Voice over IP network is easy enough that in many cases municipal governments can do it themselves. I realize this hurts competition, and this is what Verizon argued when it tried to stop Philadelphia from setting up its own WiFi network, but I think it would be better than what we have now. In many places broadband is controlled by organizations that are effectively monopolies anyway. See, for example, the new ruling in the US that cable companies don't have to allow others to provide access through their networks. Would you rather have the network run by a monopoly that is controlled by a bunch of greedy shareholders, or by a local government that the people at least have some control over?

People will argue that allowing local governments to operate networks will stifle innovation because of lack of competition. I think that the benefit is worth the cost of providing cheaper and more universal access. The network is becoming less and less a "service" and more and more a "thing". You can buy a bunch of routers and hook them together and you have a pretty good network. You do need maintenance, but you don't need some huge company with a bunch of bell-heads running the thing. Simple access is more like a road than a full-service hotel. It just has to be cheap and work.

I agree that this isn't for all municipal governments, but I think the central governments of the world should try very hard not to give in to the pressure of the telco lobbies and stifle the attempts of municipal governments to provide network services including voice. I also believe that non-profits and NGOs can play a huge role in helping provide access in addition to municipal governments as well as helping municipal governments set up such networks.

At the Internet Association Japan meeting yesterday, the folks from Impress gave a summary of their 10th annual Internet survey.

Impress 2005 Internet White Paper
There are 32,244,000 broadband households which is 36.2%.

There are 70,072,000 Internet users.

72.5% of people have heard of blogs, up from 39% last year.

25% of women in their teens and 20's have blogs.

9.5% of Internet users use RSS Readers.

46.5% of Internet users have decreased spending in physical shops because of online shopping.

29.6% of offices have wifi up from 10.7% last year.

2.8% of companies have corporate blogs and over 50% express no intention of ever having corporate blogs.

5.5% of companies have corporate web pages for mobile phone users.

I took notes based on a verbal presentation so there could be some mistakes. If anyone notices any, please let me know.

UPDATE: PDF of press release summary of white paper. (includes charts / Japanese)

I'm sure everyone knows what BitTorrent is, but for those who don't: it is the most popular peer-to-peer file sharing protocol for sharing large files. Before, you had to have a tracker to create "torrents", which coordinated this sharing, but now you don't. This should make it even easier for people to make BitTorrent enclosures in blog entries and otherwise use BitTorrent to share files. Having said that, there are value-added trackers like Prodigem which I'm sure people will use to charge for and otherwise track their files.

BitTorrent
BitTorrent Goes Trackerless: Publishing with BitTorrent gets easier!

As part of our ongoing efforts to make publishing files on the Web painless and disruptively cheap, BitTorrent has released a 'trackerless' version of BitTorrent in a new release.

[...]

In prior versions of BitTorrent, publishing was a 3 step process. You would:

1. Create a ".torrent" file -- a summary of your file which you can put on your blog or website
2. Create a "tracker" for that file on your webserver so that your downloaders can find each other
3. Create a "seed" copy of your download so that your first downloader has a place to download from

Many of you have blogs and websites, but dont have the resources to set up a tracker. In the new version, we've created an optional 'trackerless' method of publication. Anyone with a website and an Internet connection can host a BitTorrent download!

[...]

Although still in Beta release, the trackerless version of BitTorrent, and the latest production version are available at http://www.bittorrent.com/

Mr Blog
Practical IPv6

We finally released a project we've been working on in EarthLink R&D for some time now. I was not the lead engineer on this project but it's perhaps one of the most exciting things we've done in R&D to date, if not the most exciting thing.

Basically it's a demonstration of a practical IPv6 migration strategy. There is a sandbox that allows users to obtain their own /64 IPv6 subnet of real routable addresses (Goodbye NAT -- YEAH!)

Here's how it works: Simply get an account at http://www.research.earthlink.net/ipv6/accounts.html to get your own personal block of 18,446,744,073,709,551,616 IPv6 addresses; install the firmware onto your standard Linksys WRT54G router, and blamo, you have IPv6. With this special code installed on your Linksys router, your IPv4 works as normal; you'll still have your NAT IPv4 LAN. But in addition to that, any IPv6 capable machine on the LAN will get a real, honest to goodness, routable IPv6 address too. It couldn't be easier. This works for Mac OS X, Linux/UNIX, as well as Windows XP. You don't have to do anything special on the machines on the LAN. They just work, as they say.

So with this code installed on the router and your IPv6 accounts setup, nothing breaks. You continue to use your LAN as before, but you suddenly also get real IPv6 addresses. Easy migration. No forklift required.

This may be a bit geeky for some people, but for anyone who's been worried about how we're going to get IPv6 everywhere, this should be good news. Congratulations Earthlink R&D! I'm going to get a WRT54G router and try this out right away...
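For a sense of what a single /64 means, here's a quick sketch using Python's standard ipaddress module; the prefix below is the IPv6 documentation prefix, standing in for whatever block the sandbox would actually assign you.

import ipaddress

# 2001:db8::/64 is the documentation prefix, used here only as a stand-in
# for the /64 the sandbox hands out.
subnet = ipaddress.ip_network("2001:db8::/64")

print(subnet.num_addresses)   # 18446744073709551616, i.e. 2**64
print(2 ** 64)                # the same number quoted above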


I had dinner with Steve Crocker last night. I met him before through David Isenberg, but since he is the Security and Stability Advisory Committee Liaison to the ICANN board, I am getting a chance to hang out with him more these days. Among other things, he's well known for being the author of RFC 1.

He explained the software that his company Shinkuro produced and I tried it today. It solves a BUNCH of needs that I had. It's basically a very cryptographically robust, cross-platform collaboration tool. It allows you to create groups and share folders of files, has a shared chat space (like IRC) and allows you to share your desktop screen with other members of the group (yes, across platforms). The shared files are transferred in the background and edits to files are sent as diffs which can be accepted into the original by the recipient. There is also standard IM with your buddy list. The great thing is that all of the traffic is encrypted: 256-bit AES and 2048-bit RSA keys. Each message is encrypted with a unique key, and the key is transmitted under the RSA key. This is very important since I know for a fact that people sniff IM and other traffic at many conferences and public places.
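That design, a fresh symmetric key for every message with the key itself wrapped under the recipient's RSA public key, is classic hybrid encryption. The sketch below shows the general pattern using the Python cryptography package; it is not Shinkuro's actual code, just an illustration of the scheme described above.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's long-term 2048-bit RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def encrypt_message(plaintext: bytes):
    # A unique 256-bit AES key for each message...
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    # ...transmitted wrapped under the recipient's RSA key.
    wrapped_key = public_key.encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return wrapped_key, nonce, ciphertext

def decrypt_message(wrapped_key, nonce, ciphertext):
    session_key = private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

wrapped, nonce, ct = encrypt_message(b"draft board papers")
print(decrypt_message(wrapped, nonce, ct))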

The folder in the groups is really nifty. You drop files into a folder and you can see who has received the files and see any changes that are waiting for you. This seems so much more organized than the tons of attachments and updates I receive before board meetings and conference calls.

It seems similar to Groove in some ways, but is more lightweight and most importantly cross-platform. (Mac, Windows, Linux.)

You can download it at www.shinkuro.com and for now it's free. If you register it, you will get all of the features. My id is jito!shinkuro.com if you want to invite me to be your friend or into a group. As I've said before, I think email is dead and I'm always looking for things like this that help me survive the post-email era.

I'm listening to Andrew Odlyzko giving a talk right now about why Quality of Service (QoS) and real-time streaming is stupid. He showed a slide showing that P2P and other traffic are generally transmitting files at faster speeds than their bit rates. Basically, if you cache and buffer, you can have outages in the downloads and you'll usually be fine. I agree. I can see why carriers would want to spread the rumor that QoS is some feature that we have to have, but it's strange that so many researchers seem to think we will need QoS supported video streaming. Maybe they need to stop watching cable TV.
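The arithmetic behind his point is simple: as long as the average download rate exceeds the playback bit rate, the buffer grows while you watch, and the accumulated buffer rides out outages. A back-of-the-envelope sketch with made-up numbers:

# Illustrative numbers only: a stream that plays at 1.5 Mbit/s
# delivered over a link averaging 2.5 Mbit/s.
playback_rate = 1.5   # Mbit/s needed for smooth playback
download_rate = 2.5   # Mbit/s actually delivered on average

surplus = download_rate - playback_rate        # Mbit/s flowing into the buffer
minutes_watched = 10
buffered_bits = surplus * minutes_watched * 60
outage_tolerance = buffered_bits / playback_rate   # seconds of playback banked

print(f"After {minutes_watched} minutes you can ride out a "
      f"{outage_tolerance:.0f}-second outage without a glitch")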

I'm getting an error that says "Your account is currently suspended" when I try to log into my joiitosk AIM account. Does anyone know what this error means and how I resolve it?

Update: Article in eWeek about this, but it doesn't say whether AOL is going to give us our accounts back. Thanks to Cours for the link.

David Beckemeyer
BT appears to be blocking third-party VoIP

I've been biting my tongue on this since I first ran across it several months back. But now I have to say something. If someone can prove me wrong on this, fine, I'll post a retraction, but now I'm going to say it: British Telecom appears to be explicitly blocking VoIP for their DSL subscribers.

I've worked with an associate to examine the situation and all signs point to an explicit blocking of VoIP. In Cisco ACL-speak, it appears there is a rule somewhere in the BT network being applied to inbound packets of the form:

deny udp any eq 5060 any
If this is true, this is VERY bad behavior. 5060 is the port that SIP uses. I can understand why a phone company wouldn't want "free phone calls over the Internet" running on their system, but this is exactly the kind of behavior that makes Internet folks dislike telephone company control.

Can anyone else corroborate this fact?

VoIP stands for "Voice over IP", and SIP is the open standard "Session Initiation Protocol" used to set up calls over the Internet.
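One rough way to corroborate it would be to send a SIP OPTIONS request over UDP to port 5060 on a server you control outside the DSL network and see whether anything comes back. The sketch below does that; silence by itself proves nothing (there may simply be no SIP listener, or a firewall somewhere else), but a reply from one connection and silence from the BT line would be suggestive. The target host is a placeholder.

import socket

TARGET = "sip.example.com"   # placeholder: a SIP host you control elsewhere
PORT = 5060                  # the port the ACL above would be blocking

# A minimal SIP OPTIONS request; most SIP servers answer something, even if
# only an error, and any answer means the packets got through both ways.
request = (
    f"OPTIONS sip:{TARGET} SIP/2.0\r\n"
    "Via: SIP/2.0/UDP 0.0.0.0:5060;branch=z9hG4bKprobe1\r\n"
    "Max-Forwards: 70\r\n"
    f"To: <sip:{TARGET}>\r\n"
    f"From: <sip:probe@{TARGET}>;tag=probe\r\n"
    "Call-ID: probe-udp-5060-check\r\n"
    "CSeq: 1 OPTIONS\r\n"
    "Content-Length: 0\r\n\r\n"
).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(request, (TARGET, PORT))
try:
    data, addr = sock.recvfrom(4096)
    print("SIP reply from", addr, "- UDP 5060 traffic is getting through")
except socket.timeout:
    print("No reply - could be blocking, or just nothing listening there")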

UPDATE: Looks like it is a customer router issue, but still may be BT driven. Update on Mr. Blog.

I'm sitting in the Italian Parliament (I think.) The panel I was on was dealing with the impact of digital/Internet on content creation and distribution. It started yesterday and continued today. I think it lasted about seven hours or so in total. I found myself in violent disagreement at the beginning because they kept talking about piracy. The interesting thing about this panel (probably more common in other cultures, but new for me) was that we had to come to a written consensus by the end of the session and present it in the Parliament building. It would then be distributed to politicians across Europe as a recommendation.

I found myself negotiating like some UN diplomat.

In the end, here is where we ended up on a few of my "hot buttons".

Organized, for-profit, commercial piracy was different from P2P file sharing by individuals. We could not agree on the impact of P2P file sharing, but we agreed that punishing file sharing was not the only/best way to deal with the issue. I pushed for a stronger stance, my position being that as Chris Anderson says in The Long Tail, it's a matter of price and convenience. People will pay if the experience is better. That was not included in the statement, but "education" was used instead. Blah. I just made a statement that I disagree with this and that there is not enough evidence that P2P filesharing of music is really bad for the music industry.

It appeared that people had a VERY bad image of Creative Commons. For some reason they thought that CC was trying to force people to share and was anti-copyright. I explained that CC was built upon copyright and was trying to help artists choose their copyright terms.

This part turned out quite well in the statement. They said that CC was a tool, not to steal from artists, but to give them the choice to share and to lower the parasitic (legal) costs of choosing a license. They concluded that CC was NOT a threat as they had originally envisioned, but complementary and a good thing. The tone was very pro-artist and less tolerant of distributors; the idea of giving more control to artists seemed to be quite attractive.

I'm about to have a chance to object to some of the issues I see in the statement and give an address about my thoughts. I'm going to talk about the value of the Long Tail and Creative Commons.

I've just been nominated to the board of ICANN (Internet Corporation For Assigned Names and Numbers) and will be officially joining already seated members at the conclusion of the ICANN Meeting in Cape Town, South Africa, December 1-5. ("Nominated" technically because I officially join in December, but the selection process is completed.)

This is the end of a two or so year process of people telling me I should get involved and others warning me against it. Some of my wisest advisors urged me not to join saying things like, "you will make 3 mistakes in your life... this is one of them..." or "friends don't let friends do ICANN."

ICANN has its share of problems and a negative image associated with it in many circles. I've even taken my fair share of cheap shots at ICANN.

I am joining ICANN for two reasons: ICANN is changing, and it's critical that ICANN succeed.

I've talked on the phone with and met a great number of people involved in ICANN in a variety of capacities. I realized that ICANN today is not what ICANN was a few years ago. Please reset your biases and pay attention to what they are doing. Yes, there are still problems, but they are being addressed by an extremely committed team of people who are doing amazing work. Also, take a look at the board. It's very geographically and professionally diverse. It's not some puppet of the US or special interests.

Why is ICANN important? If ICANN is not successful in proving that it can manage some of the critical elements of the Internet such as the name space and IP addresses, ICANN will be dissolved and the ITU will step in. Why would that be bad? I am generally in favor of multi-lateral approaches, but in the case of the ITU, I believe it is biased towards the telephone monopolies. The ITU was built by telcos to set technical standards for telcos. That suits the telephone system architecture, which is highly centralised and is structured as a patchwork of geographic monopolies. The Internet is decentralised, and there are many small companies and individuals working at the peripheries to develop new applications for the overall network. The governance process has to reflect the diversity and the needs of these companies, as well as the needs of the network providers.

I believe that many of the things that ICANN is doing are important, but the single biggest factor leading to my decision to try to participate in ICANN is to try to prove that the people of the Internet can govern themselves without direct involvement from nation-states and to try to help build an organization that can deliver that promise.

The official press release is on the ICANN site. For more information on the nomination process, please see the NomCom page.

People have been reporting that the FBI ordered a hosting provider, Rackspace, which has offices in the US and the UK, to seize at least two servers from Indymedia's UK datacenter. Indymedia is a well known, edgy alternative news site which was established to provide grassroots coverage of the WTO protests in Seattle. It has grown into a multinational resource for some hardcore journalism, including a lot of work on the Diebold and Patriot Act issues. The reports, as well as Indymedia's page on this story, say that the FBI has not provided Indymedia with a reason for the seizure. The statement from Rackspace says:

Rackspace
In the present matter regarding Indymedia, Rackspace Managed Hosting, a U.S. based company with offices in London, is acting in compliance with a court order pursuant to a Mutual Legal Assistance Treaty (MLAT), which establishes procedures for countries to assist each other in investigations such as international terrorism, kidnapping and money laundering. Rackspace responded to a Commissioner’s subpoena, duly issued under Title 28, United States Code, Section 1782 in an investigation that did not arise in the United States. Rackspace is acting as a good corporate citizen and is cooperating with international law enforcement authorities. The court prohibits Rackspace from commenting further on this matter.
In the past, Indymedia has done stuff like posting photos of undercover police officers. However, according to Indymedia, the "FBI asked for the Nantes post on swiss police to be removed, but admitted no laws were violated". This time the FBI has not told them what they've done wrong, and Rackspace is under a gag order so they can't even tell Indymedia exactly what hardware they removed.

This implies that some non-US entity had the FBI force an action in the UK under MLAT. This means that Indymedia is being suspected of engaging in international terrorism, kidnapping or money laundering. I've seen some extreme reporting on Indymedia, but terrorism, kidnapping or money laundering? I guess the definition of "terrorism" has been expanded to meet popular demand these days, but come on... really?

This reminds me of toywar. A group of Swiss artists established in 1994, who won a Golden Nica from my Ars Electronica jury in 1996, call themselves etoy. Later, eToys, founded in 1998, tried to take the etoy.com domain by force. They got a temporary injunction against the web site because a judge in LA agreed that it was confusing to customers of eToys. Network Solutions complied, and went beyond the call of duty and shut down etoy.com email as well for good measure. Swiss artists can be sued in a US court and have their email shut down by a US registrar.

My point is, be careful where your data lives...

UPDATE: nyc.indymedia.org is speculating that it is because Indymedia published the identities of the RNC delegates.

UPDATE2: It appears that maybe it wasn't the RNC, but the photos of the police officers according to Cryptome.

UPDATE3: imajes has written a letter to his MPs. Maybe others should do the same.

Dr. Mark Petrovic and David Beckemeyer at Earthlink R&D have developed a proof of concept P2P application using SIP called SIPshare. SIP stands for Session Initiation Protocol and is one of the key technologies for the open standards around Voice over IP (VoIP). This application is pure P2P use of SIP. It is completely decentralized. According to David Beckemeyer this project is quite important.

David Beckemeyer in email
This may not sound like that big of a deal, as file sharing has been done, but I think this is a really big event. It's not about file sharing. Nobody is really going to use the demo app Mark built. It's about demonstrating that pure P2P can be done over SIP and that SIP is about more than just voice and video.

In some sense, the SIP wars to me are about sneaking some aspects of the original "stupid network" back into the NAT hell we've created. If we can do what it takes to get NAT boxes to support SIP (be consistent in how they do NAT so the edges can use STUN et al), then we have reclaimed the ability to have individually addressable nodes, where we use SIP as the new IP network almost. This may be getting carried away, but anyway...

SIP has been waylaid by regulatory and execution problems in the past and many people have written it off as a non-starter. I'm seeing more and more companies who are actually using it for cool stuff and proving that it's ready for prime time now. If you've written off SIP and haven't taken a look at what people have been doing with it for the last six months, I suggest you take another look.

via David Beckemeyer

I want to start playing with BitTorrent and integrating it into blogging more. I think I need a BitTorrent tracker. Can anyone recommend a respectable public tracker, or does anyone have a machine they'd be willing to run a public tracker on? I want to experiment with a variety of legal uses of BitTorrent.

Today, an associate professor at Tokyo University, the most prestigious university in Japan, was arrested for developing a tool that enables piracy. The program is a P2P system called Winny. Previously, two of its users had been arrested. I got a call from Asahi Shimbun (a Japanese newspaper) today asking me for a comment for tomorrow morning's news. I hope they print it. I think it's an absolute disgrace to Japan. While in the US the fight is happening in Congress, with Hollywood pushing to ban P2P and Boucher et al. fighting for the DMCRA, the Japanese police go and arrest someone for developing P2P software on a VERY sketchy case. The thing is, it's quite likely he will be found guilty.

I once served as an expert witness on the FLMASK case. FLMASK was a program that could be used to allow password protected scrambling of areas of an image so that porn sites could post pictures that passed the Japanese censors, but allowed users to unscramble them. The police were so upset that they cracked down on the hardcore porn sites with the argument that even with FLMASK'ed "clean" images, they would be deemed hardcore. The problem was, this left the developer of FLMASK free from claims that his software enabled anything illegal. So they busted him for LINKING to these porn sites that got busted as users of his software. They deemed linking to a porn site as the same as actually running a porn site. I was the chairman of Infoseek Japan at the time so I obviously had a lot to say about that. The amazing thing is... after overwhelming evidence of the stupidity of the allegations, the guy was found guilty.

Anyway, Japan is yet again leading the world in stupid Internet policing.

more on slashdot

Here are some thoughts on where I think things are going in the mobile and content space.

I wrote this essay before reading Free Culture so I'm saying a lot of stuff that Larry says better...

Several crucial shifts in technology are emerging that will drastically affect the relationship between users and technology in the near future. Wireless Internet is becoming ubiquitous and economically viable. Internet capable devices are becoming smaller and more powerful.

Alongside technological shifts, new social trends are emerging. Users are shifting their attention from packaged content to social information about location, presence and community. Tools for identity, trust, relationship management and navigating social networks are becoming more popular. Mobile communication tools are shifting away from a 1-1 model, allowing for increased many-to-many interactions; such a shift is even being used to permit new forms of democracy and citizen participation in global dialog.

While new technological and social trends are occurring, it is not without resistance, often by the developers and distributors of technology and content. In order to empower the consumer as a community member and producer, communication carriers, hardware manufacturers and content providers must understand and build models that focus less on the content and more on the relationships.

Smaller faster

Computing started out with large mainframe computers, with software developers and companies “time sharing” for slices of computing time on the large machines. The minicomputer was cheaper and smaller, allowing companies and labs to own their own computers. The minicomputer allowed a much greater number of people to have access to computers and even use them in real time, and it led to a burst in software and networking technologies. In the early 80’s, the personal computer increased the number of computers by an order of magnitude and again led to an explosion in new software and technology while lowering the cost even more. Console gaming companies proved once again that unit costs could be decreased significantly by dramatically increasing the number of units sold. Today, we have over a billion cell phones in the market. There are tens of millions of camera phones. The incredible number of these devices has continued to lower the unit cost of computing as well as of the components embedded in these devices, such as small cameras. High end phones have the computing power of the personal computers of the 80’s and the game consoles of the 90’s.

History repeats with WiFi

There are parallels in the history of communications and computing. In the 1980’s the technology of packet switched networks became widely deployed. Two standards competed. X.25 was a packet switched network technology being promoted by CCITT (a large, formal international standards body) and the telephone companies. It involved a system run by telephone companies including metered tariffs and multiple bilateral agreements between carriers to hook up.

Concurrently, universities and research labs were promoting TCP/IP and the Internet: loosely organized standards meetings, flat rate tariffs and little or no agreements between the carriers. People just connected to the closest node and everyone agreed to freely carry traffic for others.

There were several “free Internet” services such as “The Little Garden” in San Francisco. Commercial service providers, particularly the telephone company operators such as SprintNet tried to shut down such free services by threatening not to carry this free traffic.

Eventually, large ISPs began providing high quality Internet connectivity, and finally the telephone companies realized that the Internet was the dominant standard and shut down or acquired the ISPs.

A similar trend is happening in wireless data services. GPRS is currently the dominant technology among mobile telephone carriers. GPRS allows users to transmit packets of data across the carrier network to the Internet. One can roam to other networks as long as the mobile operators have agreements with each other. Just like in the days of X.25, the system requires many bilateral agreements between the carriers; their goal is to track and bill for each packet of information.

Competing with this standard is WiFi. WiFi is just a simple wireless extension to the current Internet, and many hotspots provide people with free access to the Internet in cafes and other public areas. WiFi service providers have emerged, while telephone operators, such as T-Mobile and Vodafone, are capitalizing on paid WiFi services. Just as with the Internet, network operators are threatening to shut down free WiFi providers, citing a violation of terms of service.

Just as with X.25, the GPRS data network and the future data networks planned by the telephone carriers (e.g. 3G) are crippled with unwieldy standards bodies, bilateral agreements, and inherently complicated and expensive plant operations.

It is clear that the simplicity of WiFi and the Internet is more efficient than the networks planned by the telephone companies. That said, the availability of low cost phones is controlled by mobile telephone carriers, their distribution networks and their subsidies.

Content vs Context

Many of the mobile telephone carriers are hoping that users will purchase branded content manufactured in Hollywood and packaged and distributed by the telephone companies using sophisticated technology to thwart copying.

Broadband in the home will always be cheaper than mobile broadband. Therefore it will be cheaper for people to download content at home and use storage devices to carry it with them rather than downloading or viewing content over a mobile phone network. Most entertainment content is not so time sensitive that it requires real time network access.

The mobile carriers are making the same mistake that many of the network service providers made in the 80s. Consider Prodigy, the joint venture between IBM and Sears Roebuck. Prodigy assumed that branded content was going to be the main use of its system and designed the architecture of the network to provide users with such content. Instead, the users ended up primarily using email and communications, and the system failed to provide such services effectively because of the mis-design.

Similarly, it is clear that mobile computing is about communication. Not only are mobile phones being used for 1-1 communications, as expected through voice conversations; people are learning new forms of communication because of SMS, email and presence technologies. Often, the value of these communication processes is the transmission of “state” or “context” information; the content of the messages are less important.

Copyright and the Creative Commons

In addition to the constant flow of traffic keeping groups of people in touch with each other, significant changes are emerging in multimedia creation and sharing. The low cost of cameras and the nearly television studio quality capability of personal computers has caused an explosion in the number and quality of content being created by amateurs. Not only is this content easier to develop, people are using the power of weblogs and phones to distribute their creations to others.

The network providers and many of the hardware providers are trying to build systems that make it difficult for users to share and manipulate multimedia content. Such regulation drastically stifles the users’ ability to produce, share and communicate. This is particularly surprising given that such activities are considered the primary “killer application” for networks.

It may seem unintuitive to argue that packaged commercial content can co-exist alongside consumer content while concurrently stimulating content creation and sharing. In order to understand how this can work, it is crucial to understand how the current system of copyright is broken and can be fixed.

First of all, copyright in the multimedia digital age is inherently broken. Historically, copyright works because it is difficult to copy or edit works and because only a few people produce new works over a very long period of time. Today, technology allows us to find, sample, edit and share very quickly. The problem is that the current notion of copyright is not capable of addressing the complexity and the speed of what technology enables artists to create. Large copyright holders, notably Hollywood studios, have aggressively extended and strengthened their copyright protections to try to keep the ability to produce and distribute creative works in the realm of large corporations.

Hollywood asserts, “all rights reserved” on works that they own. Sampling music, having a TV show running in the background in a movie scene or quoting lyrics to a song in a book about the history of music all require payment to and a negotiation with the copyright holder. Even though the Internet makes available a wide palette of wonderful works based on content from all over the world, the current copyright practices forbid most of such creation.

However, most artists are happy to have their music sampled if they receive attribution. Most writers are happy to be quoted or have their books copied for non-commercial use. Most creators of content realize that all content builds on the past and the ability for people to build on what one has created is a natural and extremely important part of the creative process.

Creative Commons tries to give artists that choice. By providing a more flexible copyright than the standard “all rights reserved” copyright of commercial content providers, Creative Commons allows artists to set a variety of rights on their works. This includes the ability to allow reuse for commercial purposes, copying, sampling, requiring attribution, etc. Such an approach allows artists to decide how their work can be used, while providing people with the materials necessary for increased creation and sharing.

Creative Commons also provides for a way to make the copyright of pieces of content machine-readable. This means that a search engine or other tool to manipulate content is able to read the copyright. As such, an artist can search for songs, images and text to use while having the information to provide the necessary attribution.
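In practice this machine-readable layer often shows up as a rel="license" link in a page's HTML, which is what a search engine or authoring tool can scan for. Below is a minimal sketch of such a scanner using Python's standard html.parser; the sample markup is made up.

from html.parser import HTMLParser

# Made-up sample page carrying a machine-readable license link.
SAMPLE_PAGE = """
<p>A photo by some example artist.</p>
<a rel="license" href="http://creativecommons.org/licenses/by-nc/2.0/">
Some rights reserved</a>
"""

class LicenseFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("rel") == "license" and "href" in attrs:
            self.licenses.append(attrs["href"])

finder = LicenseFinder()
finder.feed(SAMPLE_PAGE)
print(finder.licenses)   # ['http://creativecommons.org/licenses/by-nc/2.0/']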

Creative Commons can co-exist with the stringent copyright regimes of the Hollywood studios while allowing professional and amateur artists to take more control of how much they want their works to be shared and integrated into the commons. Until copyright law itself is fundamentally changed, the Creative Commons will provide an essential tool to provide an alternative to the completely inflexible copyright of commercial content.

Content is not like some lump of gold to be hoarded and owned, diminishing in value each time it is shared. Content is a foundation upon which community and relationships are formed. Content is the foundation for culture. We must evolve beyond the current copyright regime, which was developed in a world where the creation and transmission of content was unwieldy and expensive, reserved for those privileged artists who were funded by commercial enterprises. This will provide the emerging wireless networks and mobile devices with the freedom necessary for them to become the community building and sharing tools that are their destiny.

David Isenberg blogs about the "Bellhead" background of Roy Neel, Howard Dean's new campaign manager.

Sony and Docomo have announced that they are working together to put contactless IC chips in phones. Sony's FeliCa (type C contactless IC chip) is slowly becoming a defacto standard in Japan. (The government is backing a different standard, type B.) Currently the Japan Railways, AM/PM and others are using it for payments. Many companies use it for company ID's. The problem is that you can't see how much is left in your card and it's a pain to "charge" the card with more money. Putting it on a phone lets you download money from your bank and see how much is left. I worry about the privacy and security issues, but connecting an RF payment system with a phone totally makes sense.

I have a theory that Docomo has to become an identity/payment company and dump the voice and other bit-pushing businesses, going flat rate or free on the network. Docomo should buy a credit card company and use the bit-pushing business as a stick when collecting money. There are some regulations on payment businesses that make this difficult, but I'm sure the government would waive them if there was enough of a social need. Right now, the transaction business that credit card companies do doesn't make money. This has driven credit card companies to become loan companies that lobby the government to allow them to charge crazy interest rates. These interest rates cause people to end up in debt hell and commit suicide. If Docomo replaced credit cards as the primary non-cash transaction and credit system, and could use network service termination to lower collection costs, I bet they could make enough money on the transaction business to cover the bit-pushing.

Docomo is Japan's biggest mobile carrier that does about $8B / yr in data revenues.


Dan Gillmor writes about how censorware blocks his site. It's blocking mine too.

Dan Gillmor
Simon Phipps alerts me that one of the big censorware outfits, SurfControl, is blocking this and other blogs as a default setting for some customers. He points to Jon Udell's report of a surrealistic conversation with a company salesdroid upon his own such discovery. Good grief.

SurfControl puts all blogs under Usenet, a fairly bizarre characterization of the genre, but par for the course for the censorware mavens. They tend to sweep big categories into their filter, and then let you try to find your own way to escape.

Speaking of false positives, I'm also against blacklists because they can also cause false positives that are difficult to correct. Smartmobs was blacklisted by Verio and it took Roland two months of hell to get it sorted out.

I know, I use a blacklist for my comment filtering. It's a stop-gap measure until someone figures out a better solution.

Lauren Weinstein has a great mp3 Fact Squad Radio rant on the VeriSign Site Finder issue.

Cory and the EFF have been leading the charge to stop the broadcast flag proposal. Lessig chimes in. The broadcast flag is a bad thing which is anti-end-to-end. Fight for the Stupid Network!

If this entry is cryptic to you, you need to learn more about the broadcast flag and why it is bad. Click on the links.


I was noodling around trying to organize "the space" in my head and put this picture together. The x axis is the "context": i.e., low context is stuff like CDs and books, which don't change, are worth approximately the same amount to most people and don't have much timing or personal context. The far right is very personal, very timing-sensitive, high-context information, such as information about your current "state". Then there is everything in between. The top layer is the type of content, sorted by how much context it involves. The next layer is how it is aggregated and syndicated. Below that are substrates that are currently segmented vertically but could be unified horizontally with open standards. Anyway, just a first pass. Thoughts and feedback appreciated.

UPDATE: Changed color to red and edited the examples to be brand agnostic.

The ultimate outrage: Rusty Lewis of VeriSign says this is a test for the Net, to see whether the infrastructure can be innovated. It's a threat: Let us do what we want or we won't invest in upgrading infrastructure, he implies.

In response to a question, he basically indicates that ICANN doesn't have the power to keep VeriSign from doing what it's done. The company will have a dialogue with whoever wants to talk, but it plans to "reintroduce" Site Finder.

I think VeriSign has already won the key part of this war. It has persuaded reporters to call Site Finder a "service" instead of what it truly is, a misuse of its monopoly.

This sounds really bad. How can a company that tries to sell trust act in such a blatantly untrustworthy way...

After ICANN's formal letter asking VeriSign to shut down Site Finder, VeriSign has temporarily shut the service down. They don't sound very happy about shutting down a "service [that] has been well received by millions of Internet users". Good job on this one, ICANN.

Via Lauren Weinstein's Blog

Andrew Fried
I have been following the various threads relating to Verisign and wanted to make one comment that I feel has been missing. Simply put, I would like to publicly express my appreciation to Mr. Vixie for taking the time to add the "root-delegation-only" patch for Bind. I'm fairly new to NANOG, but I'm sure that others beside myself also feel a thank you is appropriate.
Andrew Fried, Senior Special Agent for the US Treasury Department, posted this on the NANOG list regarding VeriSign and the SiteFinder thing. Very cool that someone "patched" BIND to fix the "bug". Also very cool that someone like Andrew is speaking in his own voice in a public forum about this issue.

Via Boing Boing Via This demands work

If I were Microsoft I would probably like micro-content and metadata. IE and the browser wars were the pits for them. They should hate html by now. Microsoft also hates Google. Google hates metadata. Google likes scraping html, mixing it with their secret sauce and creating the almighty page ranking. Anything that detracts from the value of this rocket science, or makes things complicated for Google or easy for other people, is probably a bad thing for Google.

If the Net started to look more and more like XML based syndication and subscriptions with lots of links in the feeds to metadata and other namespaces, it would be more and more difficult to create page ranking out of plain old html.

My guess is that Microsoft knows this and intends to be there when it happens, instead of totally missing it at the beginning like when the Internet got started. I have a feeling they will embrace a lot of the open standards that we are creating in the blog space now, but that they will add their usual garbage afterwards in the namespaces and metadata so that at the end of the day it all turns funky and Microsoft.

Just a thought...

Reuters
VeriSign Sued Over Controversial Web Service
Thu September 18, 2003 09:13 PM ET

SAN FRANCISCO (Reuters) - An Internet search company on Thursday filed a $100 million antitrust lawsuit against VeriSign Inc., accusing the Web address provider of hijacking misspelled and unassigned Web addresses with a service it launched this week.

I blogged earlier about SiteFinder and everyone agreed it was a "bad thing." VeriSign just got sued for it.

Thanks for the link Peggy!

dejah420@MetaFilter
Verisign modifies the infrastructure of the net to point back to themselves. Verisign has rigged all .com and .net mistyped domains to reroute to their branded search page. This makes them effectively the biggest cybersquatter on the net, and will make it impossible for most spam filters at the network level to operate as well as seriously complicating the lives of network administrators everywhere. posted by dejah420 at 8:07 PM PST
I wonder if someone at Verisign thought this was a clever hack. It's stupid stuff like this that makes it very clear to everyone that Verisign is in a position to abuse their power.

Two of my emails to ado got blocked by SpamAssassin today. According to the SpamAssassin message, my server was an open relay. I asked about this on #joiito and crysflame pointed to an article explaining that Osirusoft, which SpamAssassin uses to check for open relays, is broken. "Apparently, after having been DDOS'ed, the Osirusoft people have 'given up the ghost' and are now returning back every IP as a spam source when queried!"

So if you want to get mail from me, please reconfigure SpamAssassin as explained on the use Perl; site.

UPDATE: The Inquirer has an article about this.

Internet News
Report: ISPs Block 17 Percent of Legit E-mail By Brian Morrissey

Top Internet service providers blocked 17 percent of legitimate permission-based e-mail in the first half of the year, according to a report issued by Return Path.

via Scott Mace

I pronounce email officially broken. If 17 percent of legit email is being blocked by spam filters, it's officially not working. No wonder I'm using blogs, IRC and IM for my primary modes of connecting with important people these days.

I don't care what excuses people give. The people who made SMTP should have thought more about host authentication, and the people who made IPv4 should have made the IP addresses longer. My guess is that there were people voicing these concerns at the time who had more vision.

I have a feeling we are going to be kicking ourselves in the same way when we realize we "forgot" to put privacy into ID systems.

Mitch Kapor and Tim O'Reilly are among advisory board members of Nutch, a new open source search engine project which will try to:

  • fetch several billion pages per month
  • maintain an index of these pages
  • search that index up to 1000 times per second
  • provide very high quality search results
  • operate at minimal cost
Sounds good to me!
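
Just to make the moving parts concrete for myself, here is a toy sketch of the fetch/index/search loop: build an inverted index over fetched pages and query it. This is my own illustration, not Nutch code (Nutch itself is written in Java), and the sample pages are made up.

    # Toy fetch/index/search loop -- an illustration, not Nutch.
    from collections import defaultdict

    # Pretend these pages were fetched by a crawler.
    pages = {
        "http://example.com/a": "open source search engine project",
        "http://example.com/b": "search quality at minimal cost",
    }

    # Build an inverted index: word -> set of URLs containing it.
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)

    def search(query):
        """Return URLs containing every word in the query."""
        words = query.lower().split()
        if not words:
            return set()
        results = index.get(words[0], set()).copy()
        for word in words[1:]:
            results &= index.get(word, set())
        return results

    print(search("search cost"))  # {'http://example.com/b'}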

John Battelle at Business 2.0 says, "Watch Out, Google".

via Dave Winer at Scripting News.

Yesterday, I had dinner with Robert Kaye. He is the founder of Musicbrainz. Musicbrainz is a metadata project that is creating a database of album artist, title and track information, similar to what CDDB did back when it was not a corporation. Many people were upset by CDDB's move to use the commons created by the community for commercial purposes. Robert was so angry with this betrayal of the community that he started Musicbrainz. Musicbrainz will be set up as a non-profit, and Robert swears that he will never "sell out". In fact, we talked about using some sort of emergent democracy that would allow the users to force a shift in control in the event that something like this ever happened. We talked about the value of some kind of escrow agent holding perhaps the DNS and domain name, along with a tool to allow the users to discuss and trigger a shift in control. This could be a way to force projects like this to stick to their original principles and help build trust at the same time.

Robert seemed like an extremely dedicated, smart and visionary guy and I think his focus and commitment to deliver this service is extremely admirable.

His service is unique in many ways. He is using a sound-fingerprint method to identify the songs. (He got beat up a bit on Slashdot because he was using patented technology for this, but I think this is fine. He can always switch later if someone decides to make an open source version.) Basically, his client software scans all of your mp3s, looks them up in his database and fixes all of your bad tags. If you have data that isn't in his database, you can submit it. It is a much more automatic and viral approach than what CDDB does.
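
To picture the flow, here is my own rough sketch, not Robert's code: fingerprint each file, look it up, and rewrite the tags if the database knows better. I'm using a plain file hash as a stand-in where his software uses a real acoustic fingerprint, and the "database" here is just a local dict with a made-up entry.

    # Sketch of a Musicbrainz-style tag fixer -- not the real client.
    # The "fingerprint" here is just a file hash; the real system uses
    # an acoustic fingerprint so re-encoded copies still match.
    import hashlib

    # Stand-in for the server-side database: fingerprint -> correct tags.
    database = {
        "example-fingerprint": {"artist": "Example Artist",
                                "title": "Example Track"},
    }

    def fingerprint(path):
        with open(path, "rb") as f:
            return hashlib.md5(f.read()).hexdigest()

    def fix_tags(path, current_tags):
        key = fingerprint(path)
        known = database.get(key)
        if known is None:
            # Nothing in the database; offer to submit the user's tags.
            return current_tags, "submit"
        return known, "fixed"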

So far it is only available on Windows, but he's working on an OS X version now...

Ever since the Wired article came out, his server has been swamped so you may not be able to access it... But keep trying and donate some money so he can buy a new server. Thanks for the intro Lisa!


Panelists: Cory Doctorow, EFF; Sean Ryan, Listen.com; Morgan Guenther, Tivo; Media Venture Advisors

Cory is talking about the broadcast flag issue, which he has been quite active in resisting. He blogged about it on Boing Boing, but it is basically a flag that can be set in broadcast video to prevent redistribution of it on the Net. The idea is to get commodity hardware and software companies to implement this. The broadcast flag is part I of a three-part plan. Part II is to force all analog-to-digital converters to include technology that senses watermarks and disables the conversion of anything carrying a copyrighted watermark. Part III is to redesign the Internet so that every packet is examined for infringement and infringing packets are discarded.

Sean thinks that the media industry has been bashed so much recently that things are much better than the past. He thinks that there is a viable model that allows people to rip and discover music...

Morgan says that Tivo will be profitable next year... Customers are "happy as clams..." Morgan is talking to the advertising industry about how to use the "real estate" in the living room where families in the US spend 7 hours a day. Wrestling with lots of issues such as copying content between Tivo's. The idea of attacking this without support of the industry didn't make sense to Tivo.

[Photos: Brewster showing us the Bookmobile; Brewster instructing us on how to print and bind the books; the Connection Machine at the Internet Archive data center; a rack of PCs running Linux at the Internet Archive data center.]
This morning I went to see Brewster Kahle at his office in the Presidio. Neal Stephenson had been trying to get us together and it finally happened. I was very excited to see/hear the whole thing. We started by seeing the Bookmobile, which is this amazing thing that Brewster and his team did. They have 1,000,000 books from the public domain available in their database on the Internet. The Bookmobile cruises around and lets kids print and bind the books. It costs a dollar to print one of these books, so they can give them away. The Bookmobile has cruised around the US and was there during Larry Lessig's argument at the Supreme Court on Eldred v. Ashcroft. The Bookmobile is part of a much bigger project of Brewster's, which involves creating a library that archives EVERYTHING. Music, the Web, video, everything. This is called the Internet Archive Project.

This amazing project involves archiving everything using low-cost technology. The Connection Machine in the data center was originally running the archive, but now it all runs on PCs with UNIX. There are over 150 terabytes of data in the data center. There is room for a petabyte. Brewster is on the board of the Library of Congress and is also working with the Library of Alexandria in Egypt on this project. He is trying to recruit other libraries to swap content and mirror the archives. It is such a huge and important project that I couldn't HELP MYSELF... I'm involved. I'm going to try to figure out how to get Japan involved.

Brewster, for those of you who don't know him, was one of the founders of WAIS (a great pre-web tool for indexing and publishing information that I used A LOT on my Mac) and of Thinking Machines, which created the Connection Machine, a massively parallel processing computer. He's quite a legend and it was a great honor and a lot of fun to meet him.

We talked about spam filters earlier. I use TMDA, which is based on whitelisting. The Controllable Regex Mutilator is a technical filtering technology. These technologies keep getting smarter. It sort of reminds me of the convolutions we used to go through at Infoseek to get rid of spam sites from our indexes. I remember that some sites used to produce different pages for the Infoseek search bot by looking at the id... Anyway, this "CRM114" looks interesting.

CRM114 - the Controllable Regex Mutilator
CRM-114 is a system to examine incoming e-mail, system log streams, data files or other data streams, and to sort, filter, or alter the incoming files or data streams according to whatever the user desires. Criteria for categorization of data can be by satisfaction of regexes, by sparse spectra, or by other means. Accuracy of the sparse spectra function has been seen in excess of 99 per cent, for 1/4 megabyte of learning text. In other words, CRM114 learns, and it learns fast.
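
To give a feel for the "learning filter" idea, here is a toy sketch of my own, not CRM114's actual sparse-spectra algorithm: count the words seen in spam and in good mail, then score new messages against those counts. The training sentences are invented.

    # Toy learning filter in the spirit of CRM114 -- not its actual
    # sparse-spectra algorithm, just word counts and a crude score.
    from collections import Counter

    spam_counts, good_counts = Counter(), Counter()

    def learn(text, is_spam):
        counts = spam_counts if is_spam else good_counts
        counts.update(text.lower().split())

    def classify(text):
        """Return 'spam' if more words look spammy than good."""
        score = 0
        for word in text.lower().split():
            score += spam_counts[word] - good_counts[word]
        return "spam" if score > 0 else "good"

    learn("buy cheap pills now", is_spam=True)
    learn("meeting notes for the archive project", is_spam=False)
    print(classify("cheap pills"))  # spam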

This is totally amazing. An open source, P2P, email, IM, calendar... total personal information management system with "The Dream Team." Even Andy Hertzfeld is on the team. We've been talking about how cool something like this would be for years. Finally someone is doing this. Where do I sign up? This totally relates to blogs as well. Dan told me about it this weekend, but I waited until his article came out before I blogged it. The web site for the Open Source Applications Foundation has more information.

Dan Gillmor

Posted on Sun, Oct. 20, 2002
Software idea may be just crazy enough to work
By Dan Gillmor
Mercury News Technology Columnist

this is an excerpt from the middle

If the software lives up to the developers' plans, it will have wide appeal. It should be highly adaptable to personal tastes, with robust collaborative features. I'm especially hopeful about a feature to build in strong encryption in a way that lets users protect their privacy without having to think about it.

The Chandler architecture builds on other open-source projects. These include Python, a development language and environment that's gaining more and more fans among programmers, and Jabber, a communications infrastructure that started life as an instant-messaging alternative but has evolved into a robust platform of its own.

One of the Chandler developers, Andy Hertzfeld, is volunteering his services. Hertzfeld is well-known in the software community, partly for his key role in creating Apple's original Macintosh and Mac operating system. An open-source company he co-founded a few years ago, Eazel, died during the Internet bubble's immediate aftermath.

``I hope we make a great application that I love to use myself, and that eventually millions of people will enjoy using,'' he says. ``Hopefully, we'll be able to make e-mail a lot more secure, without encumbering the user with technical detail. We can make accessing and managing information of all kinds more convenient if we're lucky. And we'll be helping to pave the way for free software to displace proprietary operating systems at the center of the commercial software industry.''

Stewart Alsop (whom I met recently at the Fortune Brainstorm 2002) writes in his column in Fortune Magazine about GoodContacts.

When Barak was visiting a few weeks ago, he was raving about it as well. GoodContacts is basically a contact management package that talks to Outlook or Act! and spams your contacts with email asking them to update their info. The good thing about GoodContacts is that they don't keep your contact list; they just enable you to spam from your own computer. That's why I thought about using it, until I realized I would have to switch to Outlook. (And why I am still drooling.) It is viral, useful and cool. It triggered a "flashbulb moment" for Stewart.

Stewart Alsop

And that leads me to the flashbulb. Imagine that we all have one phone number and one e-mail address that knows where we are. Imagine that the network keeps track of our location and our personal data, and automatically updates anyone who might be interested. Imagine that we don't have to think about whether the right phone number or address is stored in the network or our PC or our PDA or our phone. Imagine that all these little details of personal life are just handled. Yeah, yeah, I'm dreaming. But if that stuff happens, it will start with dumb little programs like GoodContacts. That's enlightening.

boldface added by Joi for emphasis

I have great respect for Stewart and all this SOUNDS good, but the lightbulb that flashed for me was: OUTLOOK? PERSONAL DATA? Ack! I would like something with similar functionality. It would be great, but I still can't imagine using a Microsoft product for contact management considering all of the security and privacy problems they have. I also would HATE for all of this information to ever end up not being local. Be careful when you ask "the network" to do stuff for you. I envision something similar, but with a much different architecture.

Think IM buddy lists. Everyone should be able to have identities that are separate from their "entities". (See my paper for more thoughts about this.) You should be able to have multiple identities for your various roles. Each identity would be attached to different attributes such as memberships, age, corporate roles, or writing pseudonyms. Locally, you would be able to attach current information such as shipping address, home address, current phone, voicemail box, etc. to each of the identities, and manage which identity was "active" or capable of routing to you at any given time. At work you would want your personal phone calls screened and your business contacts on. At home, you could reverse them.
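
A rough sketch of the data structure I have in mind, with names and fields that are purely my own invention: each identity carries its own attributes and routing information, and only the identities you have switched "on" can route to you.

    # Sketch of identities separate from the underlying "entity" (me).
    # All names and fields here are made up for illustration.

    class Identity:
        def __init__(self, name, attributes, routes, active=False):
            self.name = name              # e.g. "work", "pseudonym"
            self.attributes = attributes  # memberships, roles, age...
            self.routes = routes          # phone, IM, shipping address...
            self.active = active          # can this identity reach me now?

    class Entity:
        def __init__(self, identities):
            self.identities = identities

        def route(self, identity_name):
            """Return contact routes only if that identity is active."""
            for ident in self.identities:
                if ident.name == identity_name and ident.active:
                    return ident.routes
            return None  # screened

    me = Entity([
        Identity("work", {"role": "CEO"}, {"phone": "office"}, active=True),
        Identity("personal", {"role": "friend"}, {"phone": "mobile"}),
    ])
    print(me.route("personal"))  # None during work hours -- screened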

Managing our identities and personal information in this age of privacy destruction will be essential. I truly believe that privacy underpins democracy and that "viral" solutions that give people like Microsoft, or their software, access to our contact info should be watched carefully. Peer-to-peer, multi-vendor, multi-ID, hash/digital-signature-based connectivity is much more interesting to me.

But maybe Stewart was going to get to the architecture next. I think it's a great idea, but the architecture discussion has to happen NOW.

I first saw this on Marc Canter's page, but he got it from Doc Searls Weblog who got it from Wesley Felter on Hack the Planet 2.0 who saw it on cnet.

Doc Searls

Jabber hits critical PR mass, interop finally hits IM
News.com: Out with AOL, in with Jabber. It had to happen eventually. Now it has. The non-interoperative closed doors on IM systems from AOL, MSN and Yahoo are now fated to open. The sooner those companies realize this is a Good Thing that their customers have always wanted, the better off they'll be. Apple should take the lead in opening up IM, as it has with so many other standards (USB, SCSI, FireWire, wi-fi and now Rendezvous).
The company's new iChat already makes some use of the Jabber IM protocol. I suspect the only reason iChat is closed (except to AIM) is due to some contractual agreement with AOL. But that also puts Apple in a unique position to tell AOL the gig is up.

Marc Canter
DUUUUUUUUDE! Apple's new iChat IS AIM. It's licensed technology. That's the only way Apple can link into the AIM universe. That's what AOL announced is their inter-op strategy -let others license THEIR engine. So - no - I don't think you'll see little Stevie taking any leadership steps here.........

And BTW - it should be noted that the ONLY way to get Rendezvous to work is to open it up. It wouldn't do much - if all it did was configure Apple products - right? Apple is using Open Source as a puppet to achieve their own ends. Whenever Apple does something good it is more by strategic manipulation than anything else.

So I heard from a VERY reliable source that ICQ does not really mind people plugging into their network. For instance there is a client called Trillian that lets you: "Communicate with Flexibility and Style. Trillian is everything you need for instant messaging. Connect to ICQ®, AOL Instant Messenger(SM), MSN Messenger, Yahoo! Messenger and IRC in a single, sleek and slim interface."

So I agree that Jabber seems cool and maybe the next big thing, but what do I do with all of my old buddy lists? Also, if you're going to make me switch again, I'd like IP telephony seamlessly built into IM so that I don't have to have a phone number any more. It's stupid that the government in Japan allocates phone numbers when all you really need is a buddy list and an IM account.

I have to figure out a cooler way of formatting quotes from various people... How's this?

I just got the beta of opencola. (Thanks Howard!) On the surface, it looks like a bookmarking, meta-searching relevance tracking front end. Very useful just for meta-searching various search engines and news sources and filing your information. You have various folders for different topics and you mark the relevance of various documents and you can continue to search for more stuff similar to what you like. The cool thing is that you can add peers that can look at your public folders and share recommendations with. It is similar to a company we invested in that unfortunately didn't end up making it past "beta" called FatBubble... Howard talks about opencola in Smartmobs. I think it was started by Cory Doctorow of BoingBoing. Anyway, so far it looks great. The only problem is I have no PEERS! If someone else can download the beta and post their id here as a comment or email me their id we can be peers. (I do have the choice of rating the relevance of peers. ;-0 ) Anyway, definitely worth a look.

Welcome To Opencola

[Image: Social Network Diagram for ITO JOICHI]
Found this strange site that has extracted data from conference attendance lists and created graphical maps of social networks. Pretty scary. I attended an Open Source Solutions conference organized by Robert Steele, a former CIA expert on open source intelligence. There were a bunch of CIA and KGB folks at the conference. Anyway, the list of attendees of this conference, among other lists, seems to have made it into this database...

Spam is an issue that has been discussed and discussed. Laws have even been passed about it. The reason I decided to write something about it now is that I've been using a spam filter for a while and I think it is working. Usually. I also think it represents the proper way of thinking about spam.

Sen and I have talked about spam a lot, and we often talk about how it is yet another basic mistake in the way the Internet was designed. If only SMTP let you authenticate the sender before you received mail, we could make much better mail filters. Alas, this cannot be changed now. (Or it would be very difficult.) So we have to come up with some solutions.

The best solution we have found so far is whitelisting. It is a way to make a filter that only lets mail from certain addresses through. Originally Sen had made a script for me where I had a separate mailbox for mail from people who were in my address book. Now we have moved over to TMDA, which lets you create whitelists, blacklists and a variety of other things. I have mine set up so that I can create and maintain my own list of addresses and domains that I want to receive mail from. We also have it set up so that if someone sends mail from an address not on the list, they get a message asking them to reply and confirm that they are a human being. Once we receive the confirmation, the mail comes through. To filter out humans I don't want to get mail from, or the occasional intelligent spam robot, I can make blacklists.
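
The decision logic is simple enough to sketch. This is my own toy version of the idea, not TMDA's code, and the addresses are made up.

    # Toy whitelist / blacklist / challenge flow -- the idea behind
    # TMDA, not TMDA itself.
    whitelist = {"sen@example.com", "friends.example.org"}
    blacklist = {"spammer@example.net"}
    pending = {}  # sender -> held message, awaiting confirmation

    def handle(sender, message):
        domain = sender.split("@")[-1]
        if sender in blacklist:
            return "drop"
        if sender in whitelist or domain in whitelist:
            return "deliver"
        # Unknown sender: hold the mail and send a confirmation request.
        pending[sender] = message
        return "challenge"

    def confirm(sender):
        """Called when the sender replies to the challenge."""
        if sender in pending:
            whitelist.add(sender)
            return ("deliver", pending.pop(sender))
        return ("ignore", None)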

There are still various problems with the system, but it works quite well. I still have it set up so that I have a mailbox for all of the mail that is rejected, and I go through this periodically to make sure I didn't miss something important.

In Japan, spam has become a huge issue because the recipient has to pay for the mail on i-mode phones. NTT Docomo is trying very hard to deal with this issue with filters of their own, but there are still major problems.

I do think that spam should be solved by certificates, authentication, keys, etc. on the user side and not by some huge central server...
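
Something in the spirit of what I mean, sketched with a shared secret and nothing like a real mail standard: the sender signs the message, and my side only accepts mail whose signature checks out against a key I already hold for that sender. A real system would use certificates and public keys; this just shows the shape of user-side verification. The addresses and key are invented.

    # Toy sender authentication with a shared key -- just to show the
    # shape of user-side verification, not a real mail standard.
    import hashlib
    import hmac

    # Keys I hold for senders I trust (made-up values).
    keys = {"sen@example.com": b"secret-shared-with-sen"}

    def sign(sender, body):
        return hmac.new(keys[sender], body.encode(), hashlib.sha1).hexdigest()

    def accept(sender, body, signature):
        key = keys.get(sender)
        if key is None:
            return False  # no key on file, treat like any unknown sender
        expected = hmac.new(key, body.encode(), hashlib.sha1).hexdigest()
        return hmac.compare_digest(expected, signature)

    sig = sign("sen@example.com", "lunch?")
    print(accept("sen@example.com", "lunch?", sig))  # True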
