What is the PL Network?
In this talk by Molly Mackinlay, learn where Protocol Labs stands as an organization in 2022, where we are headed, our trajectory in an evolving landscape, and how we are accelerating the PL network. Originally given in February 2022.
IPFS was one of the original core OSS projects, from which other projects in Protocol Labs have grown and nucleated, creating a network of open source technologies, developer tooling, DApps, and more.
Filecoin is the other original core project; it supports the builders and storage providers that use Protocol Labs and Filecoin technologies. Its utility token (a cryptocurrency) supports and incentivizes the growth of our network of web3 technologies.
libp2p is a set of modular tools for networking and peer-to-peer communications
IPLD is the data model that is used for the content-addressable web
Multiformats is a project that makes it possible to create self-describing data, which allows protocols and projects to be interoperable and future-proof (see the sketch after this list for a concrete example).
Testground is a platform for testing, benchmarking, and simulating distributed and peer-to-peer systems at scale, with test runs of 2k-10k instances.
Drand is a revolutionary public, distributed, verifiable randomness generator that can be used in protocols and cryptocurrencies that require public randomness.
Filecoin Virtual Machine (FVM) is a layer 1 protocol that will enable developers to create and execute smart contracts on data stored in the Filecoin network
Lotus is the main Filecoin network implementation, supported by Filecoin, written in Go, and maintained by the PL team.
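To make the "self-describing data" idea behind Multiformats concrete, here is a minimal sketch in Go. It builds a multihash by hand using only the standard library (a real application would use the go-multihash package instead); the helper name sha256Multihash and the sample input are illustrative, but the 0x12/0x20 prefix is the standard function code and digest length for a SHA2-256 multihash.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// sha256Multihash (illustrative helper, not the go-multihash API) wraps a
// SHA2-256 digest in the multihash layout:
//   <varint hash function code><varint digest length><digest bytes>
// For SHA2-256 the function code is 0x12 and the digest length is 0x20
// (32 bytes), both of which fit in a single varint byte.
func sha256Multihash(data []byte) []byte {
	digest := sha256.Sum256(data)
	prefix := []byte{0x12, 0x20} // self-describing prefix: "a 32-byte SHA2-256 digest follows"
	return append(prefix, digest[:]...)
}

func main() {
	mh := sha256Multihash([]byte("hello, web3"))
	// A reader can inspect the first two bytes to learn which hash function
	// produced the digest and how long it is, with no out-of-band context.
	fmt.Println(hex.EncodeToString(mh))
}
```

Because the prefix travels with the digest, a protocol can adopt a new hash function later without breaking existing identifiers, which is what makes these formats interoperable and future-proof.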
At Protocol Labs, new projects and technologies are being invented every day. Keep in mind that this may not be a comprehensive list of all the important projects growing in our network.
This is an annotated version of a blog post.
The Protocol Labs Network is a community of teams, projects, and organizations focused on the research, development, and deployment of network protocols. We believe the internet is humanity's most important technology. Our mission is to improve the internet and computing generally. We are doing so by creating groundbreaking protocols and speeding up the pipeline from ideas on paper to users' hands.
The advent of computing radically transformed humanity, and so has the flourishing of our global nervous system: the Internet. This still-nascent medium of communication, of digital existence, will connect us and empower us in ways we've only begun to understand. In just a few short decades we've acquired tremendous, almost magical superpowers: we can speak to half the planet, at any time and from almost anywhere; we can explore and search the most complete compendium of human knowledge in seconds; we can reason about and solve tremendously difficult problems; we can work, play, and be together at a distance; we can conjure systems of digital and mechanical agents to do our bidding; we can change the world, we can save lives, we can wield all the powers of our species, for good and bad, with a sequence of keystrokes.
And yet, this is only the beginning of computing's impact on humanity. We're living during a radical phase transition in the history of life and intelligence on Earth, and computation is a breakthrough on the order of genetic evolution.
Long term, universal computation will transform us more than language has, which is indeed saying something. But it is important to realize we will see radical changes in the short term too: computing and the internet will grant us even greater superpowers, as we fuse ever tighter with our technology.
The horizon of computing is exciting and turbulent. We face optimistic progress, significant challenges, and a host of existential threats. We navigate this complex landscape by grounding ourselves in two important goals. I will speak more about these in future communications. (For a longer treatment of this background, see this talk.)
We should improve the internet and computing generally. Secure and robust access to information, to communication, and to computing has become a critical part of what it means to be a modern human. The internet has become the main vehicle for human interactions worldwide, and it will only become more so as our devices and interfaces improve. Brain-machine interfaces will cement this, and those are not so far away. Therefore, fixing problems with the internet and upgrading our computational fabric will have tremendous impact for humanity, now and in the future. This is the Why.
We should accelerate the ideas-to-superpowers pipeline. At the heart of computing progress lies a simple process: the research, development, and deployment cycle. Ideas are conceived and refined, encoded into mathematical rules, programmed into software, and deployed into computers, which grants superpowers to humans worldwide. The better and faster we are at sifting ideas through this pipeline, the better and faster our superpowers will come. This is the How.
Breakthroughs from labs that are exclusively or mostly focused on research tend to stay buried as papers. Sometimes the lag is natural: the research is far ahead of its broad applicability. But we find that there are hundreds of breakthroughs that are useful long before they reach users. It can take many years, even decades, before breakthroughs are realized in products that improve people's lives. This is massively inefficient! Research has to be coupled closely with development and deployment, so that we can iterate through the cycle quickly and build good products that actually solve problems for people.
Even though we are young, we have already created a large ecosystem of interrelated projects and products. These have all spawned from the first, IPFS, and naturally have much to do with decentralized data distribution. Our approach is to carefully modularize projects so that they can serve as many people as possible. Once it is clear a sub-project should be independent, we spin it into its own effort. I will keep the descriptions here short; click through to find out more about each.
In 2013, while studying Bitcoin, it became clear to me that cryptocurrencies solved many fundamental issues with protocol development:
A cryptocurrency can capture a portion of the value created by the protocol, which can then be allocated to fund protocol creators and maintainers.
This is an unprecedented way to capture a fraction of the value created and feed it back into the development of the IPFS protocol, its ecosystem, and the creation of more protocols. What's more, the value capture happens at the protocol layer, not solely in applications or services built above it. The attention and interests of the protocol developers remain aligned with successful outcomes for the protocol and the projects building around it.
The cryptocurrency can be used to incentivize individuals and other groups outside the original organization to improve and work on the protocol. It can also be used to fund the development effort itself, by pre-selling the currency, as Ethereum did in 2014.