Interplanetary File System or IPFS

Nick Shoup ・ 4 min read

This post was originally published on my blog Fundamentals Of Code

Interplanetary File System

You might be thinking Intergalactic Planetary, Planetary Intergalactic? Unfortunately, at the moment IPFS is relegated to planet Earth. IPFS, or the Interplanetary File System, is a peer-to-peer file system. It is inspired by and built upon distributed, peer-to-peer technologies such as git and BitTorrent. It uses content hashing and file system storage to distribute content across the web; essentially, it is a way to replicate and address files based on their content instead of their location. IPFS is not itself a blockchain; however, it has been tokenized: Filecoin exists to incentivize people to operate an IPFS node and contribute storage on their file system. Users willing to store IPFS data are rewarded with Filecoin.

What are some use cases, you might ask? Consider the concern that the internet has become more centralized, with increasing potential for censorship of content and ideas. We are already seeing governments limit allowed content within certain regions by shutting down websites or blocking them via ISPs. In a country like Turkey, which has blocked sites like Wikipedia, the site has been mirrored on IPFS, so it is still feasible to access Wikipedia's content via P2P networking (it is a replicated snapshot, not the live Wiki site). IPFS is a decentralized, distributed network for sharing files and content (similar to BitTorrent swarms), which makes control and censorship of content far harder.

Distributed Hash Table

The protocol uses DAGs to manage the namespace of hashed files. A DAG, or Directed Acyclic Graph, is a data structure that has been around for a while. If you've ever used git (a distributed version control system), then, realize it or not, you've been working with DAG data structures. A DAG is similar to a tree in that its edges are directed and it has a root node, but the difference is that a node can have multiple parents (multiple paths between nodes). This is a dramatically simplified description, but you can find out more here on good ol' Wikipedia.

When a new file is uploaded to IPFS, it is given a hash derived from its content. Any time the data is changed or modified, it is given a new hash. https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco is a link to the previously mentioned Wikipedia mirror (English version); you'll notice the hash at the end of the URL. You can go to another gateway node with the same hash and receive the same content: https://gateway.ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco. The hash in this link will change, since the mirror is updated weekly, so you'll need to find the new hash each time the site changes. But hopefully you get the idea.

This is a simplified highlight of how IPFS works, but if you're interested you can read the whitepaper here (notice the hash; we'll use it later). I am a kinesthetic learner, and as such I like doing things by example.

Downloading and installing IPFS

There are currently two ways to run IPFS on your system: an implementation written in Go, and another written in JavaScript that can be installed via the node package manager, npm. I went with the npm install, as I am not a Go programmer and wanted to interact with IPFS via JavaScript (more on that later). If you have node installed, you can run npm install ipfs --global, which lets you run IPFS from the command line, or, to install it into your project directly, run npm install ipfs --save as you normally would.

I'm still learning the actual implementation, but there is an API to upload files and host them on your local machine. When you run jsipfs daemon, you start an instance of an IPFS node locally. After that, you can fetch the whitepaper PDF hashed above through your local gateway instead of the hosted gateways. You could also try the wiki site, but that will require your running node to download the hashed files to your local machine from your swarm peers. To view your peers, open another terminal/console and run jsipfs swarm peers (feel like BitTorrent yet?). The IPFS program has a series of helpful commands, most of which I am still learning, but you can run jsipfs --help to learn more.
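The steps above, condensed into a terminal session (exact output and versions will vary):

```shell
# Install the JavaScript implementation globally (requires Node.js and npm)
npm install ipfs --global

# Start a local IPFS node; leave this running in its own terminal
jsipfs daemon

# In a second terminal: list the peers your node is connected to
jsipfs swarm peers

# Explore the rest of the CLI
jsipfs --help
```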

Upload your First File

Here is a quick demo using node and the above-mentioned npm package. I pulled the code below almost line for line from the IPFS GitHub tutorial.

const IPFS = require('ipfs')

const node = new IPFS()

node.on('ready', async () => {
    const version = await node.version()

    console.log('Version:', version.version)

    const filesAdded = await node.files.add({
      path: 'ipfs.txt',
      content: Buffer.from('Intergalactic Planetary!')
    })

    console.log('Added file:', filesAdded[0].path, filesAdded[0].hash) // https://ipfs.io/ipfs/QmXhQ3gPvhnMYL3KLkA8Y8TJiKAXDWw4JSwe8bsXSCU4Rp
})

Following that, you can go to https://ipfs.io/ipfs/<hash> (or the equivalent path on your local gateway), substituting the hash from your console output, and see for yourself that the content is on the IPFS network. Pretty sweet! You can also use your hash at the gateway URLs where we viewed the wiki site above; the same content will load.
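To close the loop, you can also read the content back through the same API. This is a sketch using node.files.cat from the same js-ipfs package; the hash is the one printed by the add example above (substitute your own), and fetching it requires a running node with peer connectivity:

```javascript
const IPFS = require('ipfs')

const node = new IPFS()

node.on('ready', async () => {
  // Hash printed by the earlier add example (an assumption; use your own output)
  const hash = 'QmXhQ3gPvhnMYL3KLkA8Y8TJiKAXDWw4JSwe8bsXSCU4Rp'

  // files.cat resolves a hash to its content, fetching from swarm peers if
  // your local node doesn't already have it
  const data = await node.files.cat(hash)
  console.log('Retrieved:', data.toString())
})
```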

This post is just meant to be an introduction and is by no means comprehensive. I will be doing some additional posts and updates in the future, so please leave comments and feedback. My initial opinion is that, for something like open-sourced content and open data, IPFS is an interesting idea and one definitely useful for combating censorship. I am going to continue exploring it.
