Resistant protocols: How decentralization evolves
Why do we build decentralized technologies? How do we predict what will be successful, what will get shut down, and what will end up as a science experiment that no one uses in practice?
As I’ve written before, blockchain isn’t the first decentralization craze and we can learn a lot from studying the history of p2p file sharing. By examining how decentralization for file sharing evolved, I think a few lessons stand out:
- Mainstream decentralization emerges in response to the law when a certain use of centralized technology is denied
- Decentralization is best used sparingly, applied only to the parts of the technology that could not possibly exist as a centralized system
- If you want to know if something should be decentralized, look for informal decentralization which reveals demand for a formal system
- Decentralization is part of a bigger playbook of legal tactics used to keep technologies alive despite the best efforts of a hostile government
- Decentralization doesn’t work in a vacuum; mainstream decentralized systems require a degree of activism to keep the system working
If we rewind to 1997, we can watch as the distribution of mp3s starts centralized and grows decentralized with time. The history helps answer some tough questions:
- When should we build decentralized technology?
- How do we know what to decentralize?
- How should we reason about how people will use and support the decentralized technology we build?
These questions are at the core of every decentralization project being built today. The answers are relevant to entrepreneurs, open source contributors, investors, and internet activists.
Pure centralization
People shared copyrighted files in the 1980s, but we’ll focus on the history of sharing mp3s. The mp3 could compress songs to 1/12th of their original size while retaining good sound quality, making it a catalyst for the start of the online music sharing revolution. With 1997 internet speeds, good compression made it much more pleasant to swap music online.
David Weekly shared mp3s in 1997 on his personal website, hosted by Stanford. People liked the free music on his website so much that Stanford noticed:
“Your computer is currently responsible for 80% of the outgoing traffic from this campus. We’re just curious, what are you doing?”
David took the website offline when the RIAA asked. Dozens of others did the same as the RIAA sent waves of takedowns to mp3 hosts. Although it wasn’t clear at the time, a decade-long war over sharing free music was building steam. The message from the first battle was clear: don’t host copyrighted mp3s or we’ll sue.
Centralized mp3 distribution defines the precondition of the decentralization movement to come. For cryptocurrency, equivalent parts of history might be the death of e-gold and Liberty Dollar. For Tor, the parallel is just the internet before Tor; what websites weren’t allowed to exist and which were censored? The common thread is the law restricting people from using the internet in the ways they want.
Hyperlinks as foreshadowing
People kept uploading copyrighted music to centralized servers, but the RIAA kept pace and became very efficient at issuing takedowns.
Mp3 fanatics found a new home: link-only mp3 websites. These websites provided continuity for end-users since the RIAA couldn’t take them down as easily as centralized hosts. Every mp3 was hosted by someone else’s server; if a third party host was taken down, the website administrator could update the link or remove it. Operators of link-only mp3 websites scoured the web for you, reducing the amount of work for everyone else. Informally, link-only mp3 websites centralized the indexing of mp3s and decentralized hosting.
MP3Board was a poster child for the link-only mp3 sharing phase of online piracy. In a legal battle with the RIAA, they argued:
“If this kind of automated hyperlinking is ruled illegal, the Internet is going to grind to a halt,” said Ira Rothken, legal counsel for MP3Board.com.
MP3Board’s use of hyperlinks as a legal defense is an early example of a theme that shows up for the rest of the p2p revolution: should a service be at fault for creating an automated system that helps others download copyrighted material from external sources?
MP3Board tested the letter of the law, but outside of court they weren’t exactly subtle about what was really going on. An RIAA spokesperson said it best:
“This is about the fact that the sources MP3Board.com are linking to are blatantly pirate sites which they are aware of. They link to sites that say ‘Super Pirated MP3s.’”
Can technology companies really just write software that pits the letter of the law against the spirit? The nakedness of the tactic doesn’t make it any easier to prosecute; websites like MP3Board were designed to have stronger legal defenses.
Link-only mp3 websites of the late 90s don’t fit the image we have today when we think of decentralized technologies: it was just hyperlinks and direct downloads. Still, the shift from central hosts to link-only websites tells the story of decentralization in miniature. When the law points to a piece of centralized software and demands modifications that users don’t want, technologists split that software into parts and obscure the objectionable features the legal system understands.
Link-only websites are interesting because they represent a relatively simple solution to the law stopping the easier centralized approach. For cryptocurrency, the parallel might be coin tumblers which helped Bitcoin users hide who they received their funds from. Privacy-oriented VPN providers that didn’t store logs signaled that people wanted a private way to browse the internet that couldn’t be traced back to them. Anonymously purchased domains and hosting providers demonstrated the same for operating a website.
The Napster craze
Link-only mp3 websites were helpful, but you wasted time dealing with broken links. Shawn Fanning, the founder of Napster, said that dead links were what inspired him to create it.
When you opened Napster, it shared your list of mp3s with Napster’s servers. When someone searched, the server only returned files belonging to users who were currently online. Your PC then connected directly to theirs to download the song.
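To make the split concrete, here’s a minimal sketch of the idea (not Napster’s real protocol; the class and names are invented): a central index that only knows about online users, with downloads happening directly between peers.

```python
# A minimal sketch of Napster's split: a central index of who is online and
# what they share, with the file transfer itself happening peer to peer.
# Names and structure are illustrative, not Napster's real protocol.

class CentralIndex:
    def __init__(self):
        self.shared = {}  # username -> set of filenames currently shared

    def connect(self, user, filenames):
        # When a client opens, it uploads its list of mp3s to the server.
        self.shared[user] = set(filenames)

    def disconnect(self, user):
        # Offline users drop out of the index, so searches never return dead links.
        self.shared.pop(user, None)

    def search(self, query):
        # Only files belonging to currently online users are returned.
        return [(user, name)
                for user, names in self.shared.items()
                for name in names
                if query.lower() in name.lower()]


index = CentralIndex()
index.connect("alice", ["Metallica - One.mp3", "Radiohead - Creep.mp3"])
print(index.search("creep"))  # [('alice', 'Radiohead - Creep.mp3')]
# The download itself then happens directly between the two peers;
# the central server never touches the file.
```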
Napster formalized the separation of concerns we saw with link-only mp3 websites: central knowledge of where files live and decentralized file distribution. Formalization improved the user experience by eliminating dead links and allowing Napster to maintain massive collections of mp3s, larger than any human-run website could hope for. Most Napster downloaders were also sharing music from their own machines, so the breadth and depth of Napster’s music collection grew with each user. Not only did Napster avoid broken links, but its seamless use of p2p technology meant Napster was more likely to have any given song available for download.
The music industry hauled Napster off to court, shut down the service, and bankrupted the company. Secondary liability laws meant that Napster was liable for copyright infringement on its platform. At the time, the test for secondary liability involved four core questions:
- Was Napster able to supervise and stop infringement?
- Did Napster stand to profit from infringement?
- Was Napster aware of infringement?
- If Napster was aware of infringement, did they knowingly facilitate it?
During the lawsuit, a judge told Napster to shut down its servers if it couldn’t stop the infringement on its platform. Napster added a filter to avoid a premature death, but this proved to the courts that Napster could have stopped infringement before the lawsuit and chose not to. Internal documents came back to haunt Napster, with one quote saying that user growth would drive future revenue. Separate internal emails made it clear Napster was aware of infringement:
“For example, a document authored by co-founder Sean Parker mentions the need to remain ignorant of users’ real names and IP addresses “since they are exchanging pirated music.” … The same document states that, in bargaining with the RIAA, [Napster] will benefit from the fact that “we are not just making pirated music available but also pushing demand.”
The court ruled that Napster facilitated infringement by building “proprietary software, search engine, servers, and means of establishing a connection between users’ computers.”
Each nail in the coffin is crucial to understand:
- Ability to filter searches
- A profit motive for producing file sharing software
- Knowledge of infringement by the executives at the company
- Facilitation of infringement
These are the criteria that guide decentralization and decide the eventual file sharing winner.
The way Napster succeeded link-only mp3 websites introduces a cycle that shows up frequently throughout the evolution of decentralization. When the legal system efficiently shuts down centralized solutions, informal workarounds emerge, and often the informal process is productized and packaged into a protocol.
If link-only mp3 websites foreshadowed Napster, coin tumbling seems to have foreshadowed the emergence of privacy coins like Monero and Zcash. Privacy-oriented VPNs demonstrated the demand for a generalized solution like Tor. Anonymously operated websites probably helped inform the creation of Tor hidden services. Mainstream decentralized technologies tend to be preceded by groups of people struggling to get the same features online before a strong solution emerges.
Trying to keep the party going: OpenNap and Napigator
Open source developers reverse engineered Napster’s protocol in 1999 and created “OpenNap.” With OpenNap, anyone could run their own Napster server for others to connect to. If link-only mp3 websites informally decentralized file distribution, OpenNap informally decentralized file search.
When the music industry killed Napster, many turned to both OpenNap and “Napigator.” Napigator ran on your machine and let you choose which OpenNap server you wanted to use; when you picked a server, it would alter your Napster client to use that server from then on. If you wanted to run your own OpenNap server, you could deploy the OpenNap software and email Napigator the server’s IP address. Napigator manually curated and hosted a centralized list of OpenNap servers.
OpenNap’s fame was short lived. Once the RIAA buried Napster, they started issuing takedowns to popular OpenNap servers and the number of users dropped by 80% in less than six months.
Decentralizing around the law: Kazaa
The next generation of file sharing protocols were built carefully with the letter of the law in mind. Napster was able to filter its central search, so Kazaa decentralized search.
Kazaa’s protocol (“FastTrack”) had a “hybrid architecture.” Basically, this means that some machines in the network are more important than others, behaving like servers. Users were automatically promoted to be “supernodes” if their machine had a fast internet connection and enough storage space. Multiple normal Kazaa peers would connect to a supernode which would keep track of the files each user was sharing, process their search requests, etc. A supernode also maintained connections to many other supernodes, so search requests could be quickly checked against many different supernodes.
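Here is a rough sketch of the supernode idea, with invented promotion thresholds rather than FastTrack’s real values: ordinary peers register their file lists with a supernode, and a search checks the local index before fanning out to neighboring supernodes.

```python
# A rough sketch of FastTrack's hybrid architecture: ordinary peers attach to
# a supernode, supernodes index their peers' files and forward searches to
# neighboring supernodes. The promotion thresholds below are invented.

FAST_CONNECTION_KBPS = 512
MIN_FREE_STORAGE_MB = 100

def eligible_for_supernode(bandwidth_kbps, free_storage_mb):
    # Clients with fast connections and spare resources get promoted automatically.
    return bandwidth_kbps >= FAST_CONNECTION_KBPS and free_storage_mb >= MIN_FREE_STORAGE_MB

class Supernode:
    def __init__(self):
        self.files_by_peer = {}  # peer_id -> set of shared filenames
        self.neighbors = []      # other Supernode instances we keep connections to

    def register_peer(self, peer_id, filenames):
        self.files_by_peer[peer_id] = set(filenames)

    def local_search(self, query):
        return [(peer, f)
                for peer, files in self.files_by_peer.items()
                for f in files if query.lower() in f.lower()]

    def search(self, query):
        # Check the peers attached to us, then fan out to neighboring supernodes.
        results = self.local_search(query)
        for neighbor in self.neighbors:
            results.extend(neighbor.local_search(query))
        return results
```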
OpenNap diffused the liability of centralized search by allowing anyone to operate a search server, just like link-only mp3 websites diffused the liability of hosting copyrighted mp3s. Just as Napster formalized the workarounds from link-only mp3 websites, Kazaa’s protocol formalized the workarounds we first saw with OpenNap and Napigator.
A lot of popular file sharing applications used hybrid architectures after Napster died, including Limewire, Kazaa, and eDonkey. This wave of decentralization was hard to prosecute and companies built on Kazaa’s protocol eventually set legal precedents when they wound up in the US Supreme Court. Kazaa itself died in an Australian court, with the court ruling that Kazaa could have installed client-side filters but chose not to:
“Even if there is no such central server, other measures were available to the respondents, but not put in place, that would have prevented (or at least limited) infringements of their clients’ copyrights by Kazaa users.”
United States courts handed down a similar ruling to Limewire, pointing to their choice to include a client-side copyright filter but disable it by default, requiring the user to turn it on in the preferences panel:
“In May 2006, [Limewire] implemented an optional, hash-based content filter. A hash-based filter can identify a digital file that contains copyrighted content, and block a user from downloading the file. The “default” setting of LimeWire’s hash-based filter was “off,” however, meaning that LimeWire users would have to affirmatively turn the filter “on” for it to have any effect on the transfer of digital recordings to or from their computers. [LimeWire] could have made the hash-based content filter mandatory for all LimeWire users, or made “on” the default setting, so that a user’s file-sharing activities would be subject to the filtering process unless he affirmatively deactivated the filter.”
To reach Kazaa’s generation, the Supreme Court ruled that a file sharing company can be held liable if it “induced” infringement. For example, if a company advertised to former Napster users by saying they could download copyrighted music using its software, that could be argued in court as proof that the company was trying to profit from copyright infringement.
The introduction of the “inducement” doctrine of secondary liability killed off a significant number of file sharing companies, including Limewire and eDonkey. The new rules were clear:
- If someone can search for copyrighted content with your software, Hollywood can demand you install a copyright filter
- If a company advertises their file sharing product as a tool for copyright infringement, they can be held liable for what happens on their decentralized protocol
Once again, these rules strongly shape decentralization moving forward. Before, we saw decentralized file sharing protocols built with Napster’s ruling in mind. By the time of Kazaa’s death, hundreds of file sharing companies were already on the market with different designs. Instead of new companies emerging, the next wave of decentralization was determined by which companies were left standing after the Supreme Court’s inducement ruling.
BitTorrent: the resistant strain
Unlike the rest of the file sharing space, the BitTorrent company and its founders condemned infringement.
[BitTorrent’s] creator, Bram Cohen, seems interested only in noninfringing uses, and has said all the right things about infringement — so consistently that one can only conclude he is sincere.
BitTorrent is a legitimate technological innovation. When a lot of people download a file at the same time, the download slows with centralized solutions but speeds up with BitTorrent. Twitter and Facebook use it to deploy code for this reason.
Unlike Kazaa and eDonkey, BitTorrent’s protocol was free and open source from the start. Hundreds of BitTorrent clients popped up all over the world from independent companies and open source communities.
Critically, the BitTorrent protocol had nothing to do with searching for content, which meant BitTorrent clients were just tools for fast downloads rather than a search box for piracy. The BitTorrent protocol outsourced file discovery to torrent search engines like The Pirate Bay, which hosted torrent files anyone could download and then open in a separate client.
Around 2005, BitTorrent took over Kazaa’s place as a top protocol for p2p file sharing. Torrent files started simple, specifying what files the torrent would download, how those files are broken into small pieces, and the single BitTorrent tracker the client should connect to in order to find other peers uploading and downloading the same data.
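Stripped to its essentials, and shown as a Python dict rather than the bencoded binary a real .torrent file uses, an early single-tracker torrent looks roughly like this (the tracker URL, names, and sizes are made up):

```python
# The essential shape of an early single-tracker .torrent file, shown as a
# Python dict for readability (real torrents are bencoded). The tracker URL,
# file names, and sizes are made up.
torrent = {
    "announce": "http://tracker.example.org/announce",  # the one tracker to ask for peers
    "info": {
        "name": "some-album",
        "piece length": 262144,  # the data is split into 256 KiB pieces
        "pieces": b"<20-byte SHA-1 hash of each piece, concatenated>",
        "files": [               # what the torrent will download
            {"path": ["01 - first track.mp3"], "length": 4821337},
            {"path": ["02 - second track.mp3"], "length": 5123004},
        ],
    },
}
# A client hashes the "info" dict to get the torrent's infohash, announces
# itself to the tracker, and gets back a list of peers sharing the same data.
```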
Hollywood never pursued the BitTorrent company the way it did Napster, Limewire, Kazaa, and many others. The BitTorrent company created the protocol, distributed its own client, and operated a content search engine. The client didn’t have search functionality, of course, and the official BitTorrent search engine respected copyright and removed infringing files. It would even be hard to argue that BitTorrent was created for the sake of making money, since Bram Cohen open sourced the protocol and gave away the client for free.
“I wanted to work on something rewarding,” [Bram] says. And once he was done, he was ready to move on to something new; his father had to twist his arm to build a company out of his work.
Torrential Reign from Fortune (2005)
After Kazaa fell, attention shifted to BitTorrent, and public policy experts saw that BitTorrent might win the war.
The litmus test is BitTorrent. … Its creator, Bram Cohen, seems interested only in noninfringing uses, and has said all the right things about infringement — so consistently that one can only conclude he is sincere … BitTorrent looks like a clear example of the kind of dual-use technology that ought to pass the Court’s active inducement test
Legal attention shifted from protocol creators to torrent search engines and trackers. Hundreds of torrent websites shut down when Hollywood threatened lawsuits. Just as legal action against the previous generations of file sharing protocols forced technological evolution that resulted in BitTorrent taking the lead, legal action against BitTorrent websites revealed which teams had the strongest conviction. Hundreds folded, but Demonoid, isoHunt, and The Pirate Bay managed to stay around for more than a decade. Somehow, The Pirate Bay is still operating to this day.
Lawsuits attacking parts of the BitTorrent ecosystem informed how the BitTorrent protocol evolved. Websites like The Pirate Bay also went through their own internal technological revolution, reworking how The Pirate Bay operated with the law in mind.
BitTorrent today
When huge BitTorrent websites were taken down, it hurt the community. In the earlier days, you downloaded a torrent file from The Pirate Bay and your client connected exclusively to The Pirate Bay’s torrent tracker. If their servers went offline, all of the metadata about the files (titles, hashes, file names) went with them, and torrent clients no longer knew how to find other peers sharing the same torrent.
Every choke point of the BitTorrent ecosystem was altered to make legal action unattractive. Full decentralization was rarely the goal; instead, you can view the technology as changing to make it a lot more work for law enforcement to damage the network.
BitTorrent files added “multitracker” support and The Pirate Bay stopped operating their own torrent tracker; today, if you download a torrent from The Pirate Bay you’ll notice trackers operated by third parties which seem to be privacy-oriented nonprofits.
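Concretely, multitracker support is just an extra “announce-list” of tracker tiers in the torrent metadata, roughly like this (the tracker URLs are placeholders):

```python
# Multitracker support in practice: alongside the single "announce" key,
# torrents gained an optional "announce-list" of tracker tiers. The URLs
# below are placeholders, not real endpoints.
torrent = {
    "announce": "udp://tracker-a.example.org:6969/announce",
    "announce-list": [
        ["udp://tracker-a.example.org:6969/announce"],
        ["udp://tracker-b.example.org:6969/announce"],
        ["udp://tracker-c.example.org:6969/announce"],
    ],
    "info": {"...": "file and piece metadata as before"},
}
# A client works down the list until a tracker responds, so every listed
# tracker has to be offline before tracker-based peer discovery fails.
```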
A lot of legal resilience is packed into this change. Even if these trackers didn’t require much effort to raid and take down individually, there are still five of them in every torrent. If law enforcement wants to stop infringement by taking down the torrent tracker, they now have to coordinate five takedowns at once to actually take torrents offline. These trackers aren’t easy to take down either! For example, Coppersurfer.tk is a Dutch nonprofit entity, which makes it a bit harder to prosecute since the owner is losing money operating the tracker out of goodwill:
The Coppersurfer operator … fails to see how a non-profit service that doesn’t even require a website, can be seen as online commerce
Top Torrent Tracker Knocked Offline Over “Infringing Hashes”
By using third party trackers, The Pirate Bay also improved their legal position since the court couldn’t argue that they were helping users actually find other peers to download from.
Later, The Pirate Bay switched from hosting torrent files to only serving up “magnet links.” A magnet link is basically a simple identifier which a torrent client then uses to discover the rest of the information: the title, the files, and anything else a torrent file would previously have held. This change decreased the value of taking down The Pirate Bay since torrent metadata was now totally decentralized.
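A magnet link carries little more than the infohash of the torrent’s metadata plus a few optional hints. Roughly (the hash and names below are made up):

```python
# Roughly what a magnet link contains; the infohash and names are made up.
magnet = (
    "magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567"  # infohash (the only required part)
    "&dn=some-album"                                                # display name, an optional hint
    "&tr=udp%3A%2F%2Ftracker-a.example.org%3A6969%2Fannounce"       # optional tracker hint
)
# Given just the infohash, a client asks the DHT for peers and then fetches
# the full metadata (file names, piece hashes) from those peers directly.
```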
Magnet links were a big step towards decentralization, using “distributed hash tables” (the system torrent clients use in order to resolve torrent information from a magnet identifier) and “peer exchange” which allows clients to discover other peers sharing the same torrents. This was a major milestone for The Pirate Bay’s founders:
DHT (combined with PEX) is highly effective in finding peers without the need for a centralized service. If you run uTorrent you might have noticed in the tracker tab of your torrents that the [Peer Exchange] (PEX) row is often reporting a lot more peers than the trackers you might have for that torrent. These peers all came to you without the use of a central tracker service! This is what we consider to be the future. Faster and more stability for the users because there is no central point to rely upon.
Worlds most resilient tracking from The Pirate Bay blog
Torrents from The Pirate Bay still include centralized trackers as well, but you can now view this as polish for the sake of making downloads a bit faster. BitTorrent trackers are no longer a centralized point of failure; peer discovery is decentralized, with a layer of convenient centralization on top.
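The “distributed hash table” piece is worth a quick sketch. In the Kademlia-style DHT that BitTorrent clients use, node IDs and infohashes live in one ID space, and a lookup repeatedly asks the closest nodes it knows about for nodes even closer to the target infohash. The code below is a simplified illustration of that idea, not a real DHT:

```python
# A simplified illustration of the Kademlia-style lookup behind the BitTorrent
# DHT: node IDs and infohashes share one ID space, "closeness" is XOR
# distance, and a lookup repeatedly asks the closest known nodes for nodes
# even closer to the target. Real DHTs do this over UDP with routing tables;
# here the network is just a dict.

def xor_distance(a: int, b: int) -> int:
    return a ^ b

def iterative_find(target: int, start_nodes, known_peers_of, rounds=10):
    """known_peers_of: node_id -> list of node_ids that node would tell us about."""
    known = set(start_nodes)
    for _ in range(rounds):
        closest = sorted(known, key=lambda n: xor_distance(n, target))[:3]
        before = len(known)
        for node in closest:
            known.update(known_peers_of.get(node, []))
        if len(known) == before:  # no new nodes learned: the lookup has converged
            break
    # The nodes closest to the infohash are the ones responsible for remembering
    # which peers are downloading that torrent -- no central tracker required.
    return sorted(known, key=lambda n: xor_distance(n, target))[:3]
```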
The Pirate Bay founders are crazy. They spent more than a decade using technical tricks and differences in international law to keep their search engine alive. In 2012, they shared details on how their servers were deployed:
The Pirate Bay is currently hosted … in two countries … “If one cloud-provider cuts us off … we can just buy new virtual servers from the next provider.” … The load balancer and transit-routers are still owned and operated by The Pirate Bay, which allows the site to hide the location of the cloud provider. … The hosting providers have no idea that they’re hosting The Pirate Bay … The worst case scenario is that The Pirate Bay loses both its transit router and its load balancer. … All the important data is backed up externally on VMs that can be re-installed at cloud hosting providers anywhere in the world. “They have to be quick about it too, if the servers have been out of communication with the load balancer for 8 hours they automatically shut down. When the servers are booted up, access is only granted to those who have the encryption password,” they add.
Pirate Bay Moves to The Cloud, Becomes Raid-Proof by TorrentFreak
Basically, to temporarily take down The Pirate Bay, law enforcement would need to legally raid two cloud computing companies in different countries. To actually seize servers with data on them, they would also have to conduct a legal raid in a third country, which they would only learn about after the first two raids. If any of the machines go offline in the first two raids, the servers in the other countries shut down and become useless if seized.
Separately, The Pirate Bay hosts archives of all of their torrents for anyone to download. If an extremely sophisticated raid was able to seize all of The Pirate Bay’s servers and deprive them of backups, the public would be able to recover all torrents up to the previous year.
The Pirate Bay has also worked around ISP-level blocks and issues with domains being revoked. If you google “pirate bay proxy” you can find many websites that list domains for accessing The Pirate Bay, which may work around an ISP block or thepiratebay.org being down.
The BitTorrent company avoided prosecution by passing the buck for liability to search engines like The Pirate Bay. Legal action against torrent search engines helped evolve both the BitTorrent protocol and The Pirate Bay’s private technology.
The path we followed from centralized mp3 distribution to the modern BitTorrent ecosystem is a case study of one of the biggest mainstream decentralization movements the world has seen. Napster had 80 million users at its peak. In 2007, BitTorrent was ~60% of global internet traffic. From 1997 to 2007, users flocked to the best user experience possible that was just decentralized enough to stay alive.
Shadowing the law
Napster wasn’t really built with a close eye on the law, but it seems like Kazaa and BitTorrent were. Both distanced themselves from the centralized content search that killed Napster. Kazaa and many others banked on being able to point to their decentralized search as proof that they had no ability to oversee and proactively censor content search and file transfer. BitTorrent was a bit more extreme, pushing search out of the protocol entirely so that anyone could run a file search. You can imagine the argument: “our file search is totally legal, The Pirate Bay is the illegal one!”
When the BitTorrent company gained some breathing room by passing the buck to torrent search engines, the community evolved the protocol. The introduction of multiple trackers per torrent, distributed hash tables, and peer exchange all seem like countermeasures to reduce the effectiveness of shutting down centralized search engines and trackers. If we follow the trail of lawsuits beyond BitTorrent and into The Pirate Bay’s history, we see further informal decentralization with websites like thepiratebayproxylist.net and The Pirate Bay’s private server architecture, which is designed to resist server raids.
The history of successful decentralization is defined by people saying “we’re not allowed to do X, but I bet we would get away with it if we did Y.” Rebecca Giblin describes this perfectly:
“there is a gap between those physical world assumptions and the realities of P2P software development … the physical world/software world divide … [By] failing to fully recognize the unique characteristics that distinguish software code and software development from their physical world predecessors, the law has been and continues to be vulnerable to exploitation by those who understood that those traditional or physical world assumptions do not always hold good in the software context.”
Code Wars: 10 Years of P2P Software Litigation by Rebecca Giblin
Copyright case law was built around cases where a single business owner could plausibly oversee and police copyright infringement on their property. The law wasn’t built to handle the idea that no one owned the property, which is what Kazaa exploited. The Supreme Court set precedents which assumed that it would require a significant amount of capital to create and sell a product which could infringe on copyright. Based on this, the courts also assumed that tools for large scale copyright infringement would only be created in order to make a profit.
Decades of copyright law were built around the assumption that someone like Bram Cohen didn’t exist. Bram built a protocol which was used for mass infringement, and the main costs were his time and energy. He didn’t even want to create a company and he gave away the protocol for free. Decentralized search in Kazaa’s protocol violated the assumption that someone could intervene and stop infringement, and companies that built on top of Kazaa were able to exploit this all the way to the Supreme Court.
If we look at other successful decentralized technologies, we see similar patterns. The US Government killed e-gold, a centralized digital currency, via various anti-money laundering demands and lawsuits. The US Mint and Department of Justice killed the Liberty Dollar, a privately operated and centrally issued physical currency, after it achieved a certain amount of popularity. Bitcoin mining decentralizes both the issuance of currency and the transaction process. The government can’t directly impose anti-money laundering controls at the point where transactions are processed, nor can it shut down the system itself.
Informal decentralization foreshadows protocols
In the earlier periods of the music sharing revolution, informal and manual processes emerged which foreshadowed automated technological innovations that would show up later on. When music fans knew they couldn’t host their own music, link-only websites emerged. When Napster’s central server was shut down, many others ran alternative central servers. Hyperlink music blogs and OpenNap/Napigator worked around the law in the same ways that Napster and Kazaa later would.
Each successful phase of decentralization we saw represented the minimum viable decentralization necessary to stay alive.
- “We’re not allowed to host mp3s? What if someone else does and we link to it.”
- “Hollywood shut down Napster’s server? What if anyone can run one?”
In a way, the informal phases seem like the laziest solutions anyone could have come up with. Instead of decentralizing hosting, which is hard, how about we just let someone else host it and update the link? Instead of decentralizing fuzzy search, which is an unsolved problem in distributed computing, how about we just let anyone run a server and have users choose one?
The fact that these lazy-seeming workarounds foreshadow later popular protocols seems to tell us something about decentralization. The progression of centralized hosting → Napster → Kazaa → BitTorrent seems to represent the minimum viable decentralization required to stay alive as defined by the law at the time. These lazy workarounds match because decentralization isn’t the product; it is just a means of staying alive.
Plenty of people went further with decentralization and anonymity, but it wasn’t necessary for staying alive and it only mattered to a privacy-focused minority of people. Beyond staying alive, decentralization is a weakness, not a strength. In many ways, 2005’s BitTorrent was more centralized than Kazaa, but decentralizing file transfer and outsourcing content discovery made it more resilient than Kazaa, which decentralized search at the protocol level.
Demotivating legal action
File sharing companies built decentralized technology in order to disown concepts like file hosting and content search, but strategically toying with legal assumptions is only part of technology’s role. Supporting multiple trackers in every torrent file is only “decentralization” in the weakest sense of the term. Instead of shoehorning it into the word, maybe we should look at it as a tactic for keeping torrents online. Even without DHTs, five torrent trackers is a lot of redundancy for keeping downloads from The Pirate Bay available. Rebecca Giblin thinks that the large number of open source BitTorrent clients is why Hollywood didn’t go after BitTorrent the way it went after Kazaa.
The Pirate Bay has done a lot to keep its BitTorrent search engine online, including embracing a lot of decentralized technologies like magnet links. How do we reconcile their choice to not run their own tracker and instead depend on several operated by nonprofits? What about all the work they have done to split their servers across multiple countries? These aren’t decentralization at all really, but they’re clever uses of technology which make The Pirate Bay more resilient against law enforcement.
Activism: the linchpin of decentralization
Bram Cohen gave away BitTorrent’s protocol for free. He didn’t have to do that; Kazaa was pretty successful in keeping its decentralized protocol proprietary. The founders of The Pirate Bay have gone through hell to keep their creation online. They even went to jail for it. Somehow, it is still operating today despite ISP-level blocks and a history that has made The Pirate Bay notoriously risky from a legal perspective.
Decentralization and other technological tricks help keep technologies online which wouldn’t last if they were centralized, but they don’t fully solve the problem. Instead, it seems like decentralized technologies depend on activists in order to fully realize the vision of the technology. Bram played this part by open sourcing his protocol, limiting his ability to profit from the system, and creating an environment where killing his client would do basically nothing to stop BitTorrent usage. The Pirate Bay is a more obvious example of activism, going hand in hand with Piratbyrån’s anti-copyright mission. Yes, there are private torrent trackers and public options besides The Pirate Bay, but no one has matched The Pirate Bay’s continuity and resilience in staying alive no matter the cost.
Decentralized technologies don’t take the legally impossible and make it unstoppable. Decentralization is a tactic for diffusing risk for many and lowering the risk for the activists that operate the most sensitive parts of the system. We see the same with Tor, where the risk of participating in the system is concentrated at the exit nodes which can attract undesirable legal attention. Without activism, we would have beautifully designed decentralized technologies which are impossible to use in practice.
The bigger picture
The hype around the blockchain has attracted a lot of people to a larger decentralization movement. Both the NSA and Facebook have helped make privacy online a hot button political topic, motivating people uninterested in blockchain to still work on complex cryptosystems and decentralized technologies that try to reclaim the rights we seem to have lost.
Looking at the history of file sharing, it seems clear that pure technical decentralization is just one tool in a larger picture. The real name of the game is creating technology which anyone can use relatively easily, even if your government doesn’t like it. Decentralization helps to the extent that it exploits the letter of the law or evades the enforcement of the law. The siblings of decentralization are:
- Techniques for exploiting differences in international law. The Pirate Bay lived in Sweden for a reason
- Corporate structuring and separation of technical responsibilities. Operating pure torrent trackers as nonprofits meant they were harder to kill given that the law focused on criminals trying to profit off of infringement
- Open source as a means of widely replicating and improving decentralized technology. The BitTorrent community proposed improvements and discussed them in public. Open source BitTorrent clients meant that shutting down BitTorrent the company wouldn’t really affect the adoption of BitTorrent technology
- Creative uses of technology which make legal action less effective. The Pirate Bay spread its servers across multiple countries to make raids less attractive
Whether you’re building a blockchain company or a decentralized protocol for the greater good, you can learn from the bigger playbook that decentralization was a major part of for file sharing technologies. Ask yourself: what can some people not do on centralized systems which they should be able to do? If you look closely, you might be able to spot informal strategies people are using today to get around the rules, and these could help inform what to formalize into a protocol. Decentralization can be a powerful weapon, but it is hard to deploy and can backfire if it isn’t paired with other legal strategies. Over-applying decentralization isn’t a strategy unless your goal is obscurity; the most popular file sharing applications were never the most decentralized. Decentralization on its own isn’t enough; activists willing to take a big risk seem essential in the long run.
If you liked this post, follow me on twitter. I have several other posts about p2p and blockchain and I’ll be writing more.
Special thanks to Rebecca Giblin, author of the excellent book “Code Wars: 10 Years of P2P Software Litigation,” which helped me learn the legal context around file sharing.