Sunday, December 05, 2010

 

Storing files on the publicly writable web.

Take a file and chunk it up, much like BitTorrent does. Except we're going to base64 the little chunks and upload them redundantly to a bunch of public places on the web: blog comment areas, StatusNet installs, Twitter, Facebook accounts, Gmail accounts, forums. Anywhere that lets you store data. The pieces get scattered across the wild wild web and indexed in a single text file.

But, you say, with so many small pieces scattered redundantly, the index will end up bigger than the original file. Interestingly, we can build in a layer of source-list storage. Every addressable space can be given a 128-bit address, which maps to a full wrapper/handler name and URL fragment. These addressable-space descriptors can themselves be stored on the web and resolved by anyone participating in the network's address resolver (kind of like how people participate in a DHT). So the index file ends up as just a smattering of 128-bit numbers, and the network resolves those to the locations where the actual information is stored. Or something like that. I think it's a really neat idea worth exploring if I had the patronage.
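Here is a minimal sketch of the chunk-and-encode step in Python. The SHA-256 content hash used as a chunk id and the 4 KB chunk size are my assumptions; the idea above doesn't pin either down.

    import base64
    import hashlib
    import os

    CHUNK_SIZE = 4096  # assumed chunk size; not specified above

    def chunk_and_encode(data, chunk_size=CHUNK_SIZE):
        """Split raw bytes into chunks and base64 each chunk so it can be
        pasted anywhere that accepts plain text (comments, status updates,
        forum posts, and so on). Returns (chunk_id, encoded_text) pairs."""
        pieces = []
        for offset in range(0, len(data), chunk_size):
            raw = data[offset:offset + chunk_size]
            # content-address the chunk so any public copy can be verified
            chunk_id = hashlib.sha256(raw).hexdigest()
            pieces.append((chunk_id, base64.b64encode(raw).decode("ascii")))
        return pieces

    # The local index is then just the ordered list of chunk ids; each id
    # can later be looked up to find the public places holding a copy.
    data = os.urandom(10000)  # stand-in for the file being stored
    index = [cid for cid, _ in chunk_and_encode(data)]
    print(len(index), "chunks")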

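And a rough sketch of the address-resolution layer, with a plain in-memory dict standing in for the participants who would collectively hold the mapping, DHT-style. The handler name and URL fragment here are made up purely for illustration.

    import os

    # Stand-in resolver table; in the real network this mapping would be
    # held by the participants, not by any single machine.
    RESOLVER = {}

    def register_space(handler, url_fragment):
        """Give an addressable storage space a random 128-bit address and
        record its descriptor: the wrapper/handler that knows how to read
        and write it, plus the URL fragment identifying the exact spot."""
        addr = int.from_bytes(os.urandom(16), "big")  # 128 bits
        RESOLVER[addr] = (handler, url_fragment)
        return addr

    def resolve(addr):
        """Turn a 128-bit address from an index file back into its
        (handler, url_fragment) descriptor, or None if unknown."""
        return RESOLVER.get(addr)

    # An index file then reduces to a smattering of 128-bit numbers:
    space = register_space("blogger-comments", "example.blogspot.com/2010/12/post.html")
    index_file = [space]
    print(resolve(index_file[0]))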