The recent Wikileaks episode highlighted the immense control national governments and private companies have over what content can be hosted. Within days of being identified by the U.S. government as a problem, the private companies hosting Wikileaks and providing it banking services withdrew support, largely neutering the organization’s ability to raise funds and host content.
The successful attempts to cut off Internet access in Egypt and Libya raise questions of a similar nature.
So two questions follow. Should anything be done about it? And if so, what? The answer to the first is not clear-cut, but on balance, perhaps such absolute discretionary control over the fate of ‘hostile’ information or technology should not be allowed. As to the second, given that many of the hosting companies, payment processors, and other services essential to disseminating content are privately held and susceptible to both government and market pressures, the dissemination engine ought to be as independent of them as possible (bottlenecks remain: most of the pipes are owned by governments or corporations). Here are three ideas:
- Create an international server farm on which content can be hosted by anyone and removed only after due process, set internationally. (NGO-supported farms may work as well.)
- We already have ways to disseminate content without centralized hosting: P2P. But these systems lack a browser that collates torrents and builds a webpage in real time. A torrent-based browser of this kind could vastly improve the ability of P2P networks to host content (a rough sketch follows this list).
- For Libya/Egypt etc. the problem is of a different nature. We need applications like Twitter to continue functioning even if the artery to the central servers goes down. This can be handled by building applications so that they can run on edge servers against local data (see the second sketch below). I believe this kind of redundancy can also be useful for businesses.
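
To make the torrent-browser idea concrete, here is a minimal sketch of the page-assembly step. The “page manifest” format and the `fetch_torrent` helper are my assumptions for illustration, not an existing protocol or API; a real implementation would sit on top of a BitTorrent library such as libtorrent.

```python
# Hypothetical sketch of a torrent-backed page loader. The "page
# manifest" format and fetch_torrent() are assumptions for
# illustration, not an existing protocol or API.

import json
from concurrent.futures import ThreadPoolExecutor


def fetch_torrent(magnet_link: str) -> bytes:
    """Download the payload behind a magnet link from the swarm.

    Placeholder: a real implementation would sit on a BitTorrent
    library (e.g. libtorrent) and return the bytes once enough
    pieces have arrived and been hash-verified.
    """
    raise NotImplementedError


def load_page(manifest_magnet: str) -> dict:
    # Step 1: fetch the manifest itself from the swarm. It maps
    # asset names ("index.html", "style.css", ...) to magnet links.
    manifest = json.loads(fetch_torrent(manifest_magnet))

    # Step 2: fetch all assets in parallel from their own swarms.
    with ThreadPoolExecutor(max_workers=8) as pool:
        blobs = dict(zip(
            manifest["assets"],                           # asset names
            pool.map(fetch_torrent, manifest["assets"].values()),
        ))

    # Step 3: hand the entry-point HTML plus assets to the renderer.
    return {"html": blobs[manifest["entry"]], "assets": blobs}
```

The design point is that each asset is its own torrent, so a popular page loads faster as more people view and seed it, the opposite of what happens to a centrally hosted page under load.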
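
And for the edge-server idea, a minimal sketch of the write path, assuming a hypothetical central sync endpoint (`CENTRAL_URL` is made up). Writes land in a local store first and reads are served from it, so a severed uplink degrades the service to town-wide rather than killing it; a periodic sync forwards queued posts whenever the artery comes back.

```python
# Sketch of a local-first write path for an edge server.
# CENTRAL_URL and its /sync endpoint are hypothetical.

import json
import sqlite3
import urllib.request
from urllib.error import URLError

CENTRAL_URL = "https://central.example.com/sync"  # assumed endpoint

db = sqlite3.connect("edge.db")
db.execute("""CREATE TABLE IF NOT EXISTS posts
              (id INTEGER PRIMARY KEY, body TEXT, synced INTEGER DEFAULT 0)""")


def accept_post(body: str) -> None:
    # Writes land locally first, so the edge keeps accepting posts
    # even when the uplink to the central servers is cut.
    db.execute("INSERT INTO posts (body) VALUES (?)", (body,))
    db.commit()


def local_timeline(limit: int = 50):
    # Reads are served entirely from local data.
    return db.execute(
        "SELECT id, body FROM posts ORDER BY id DESC LIMIT ?", (limit,)
    ).fetchall()


def try_sync() -> None:
    # Called periodically; forwards queued posts when the artery is up.
    pending = db.execute(
        "SELECT id, body FROM posts WHERE synced = 0").fetchall()
    for post_id, body in pending:
        req = urllib.request.Request(
            CENTRAL_URL,
            data=json.dumps({"id": post_id, "body": body}).encode(),
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(req, timeout=5)
        except URLError:
            return  # uplink still down; keep the post queued
        db.execute("UPDATE posts SET synced = 1 WHERE id = ?", (post_id,))
        db.commit()
```

The same pattern doubles as a disaster-recovery story for businesses: the edge keeps taking writes during an outage and reconciles with the center afterwards.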