April 21, 2008 in Ideas | 2 comments
I just made a few changes to Understanding Infrastructure, which I trust will improve it. See what you think.
Mike Warot on April 22, 2008 at 3:17 am
The internet was first started as a way to allow access to programs that happened to be on computers scattered around the USA. It was not about sharing data, which could be done by shipping paper tape, cards, etc.; interacting with cool new software was the main motivator for the genesis of the internet.
We seem to have forgotten this point along the way. The desire to run code is what it’s all about. Now that we’ve got insecure endpoints, everyone is afraid to use the internet for its main purpose: sharing code.
We now all have a watered-down, Disneyland version of the original Arpanet, where code was king. The Java sandbox, Flash animations, and Ajax client-server stuff all pale in comparison to just being able to offer access to something that works, on the system it was tweaked to run on.
I hope I’ve made my point… that we’re missing something big because we don’t have a way of sharing code like we used to. There’s a big hole where the main generative nature of the internet used to be… it’s just not safe to run random stuff any more.
It doesn’t have to be this way; we can make sandboxes that work, in all circumstances, but it requires a heavy-duty paradigm shift that can best be summarized like this:
Never trust code
It worked in the beginning because the code came with the system it ran on. We can build systems that let the code run locally with our stuff as input, in a secure manner. All we have to do is teach our computers how not to trust.
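The “never trust code” stance can be sketched, very loosely, with OS-level resource limits on a child process. This is a hypothetical illustration, not a real sandbox (a serious one would add syscall filtering, namespaces, and more); the function name and limits below are made up, and it runs only on POSIX systems:

```python
import resource
import subprocess
import sys

def run_untrusted(code: str, timeout: float = 5.0) -> str:
    """Run a snippet of untrusted Python in a separate process,
    with hard CPU and memory caps applied before it starts."""
    def limit_resources():
        # Hypothetical policy: 2 seconds of CPU, 512 MB of address space.
        resource.setrlimit(resource.RLIMIT_CPU, (2, 2))
        cap = 512 * 1024 * 1024
        resource.setrlimit(resource.RLIMIT_AS, (cap, cap))

    proc = subprocess.run(
        [sys.executable, "-I", "-c", code],  # -I: isolated mode, no user paths
        capture_output=True,
        text=True,
        timeout=timeout,            # wall-clock backstop on top of the CPU cap
        preexec_fn=limit_resources, # POSIX only
    )
    return proc.stdout

print(run_untrusted("print(2 + 2)"))  # a harmless snippet still runs normally
```

The point of the sketch is the default posture: the host grants the code a bounded slice of resources up front, rather than trusting it and reacting after the fact.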
In this manner we can vastly increase the generative power of the ends of the Internet.
I hope that makes sense, and expands the conversation.
Time for bed now…
Keith Dick on April 24, 2008 at 4:01 am
I guess I’m not catching something you are saying here.
The web server at the far end of an http connection can run any program on the server it is set up to access. And it is often the case that web requests result in a lot of computation being invoked on the server. So, at least to a first approximation, it seems that the internet is still being used to provide access to remote code.
Of course you know how web servers work, so there must be something about that approach that doesn’t equate with giving access to remote code that you are talking about. Could you say a few more words about this to point out the difference I’m overlooking?
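The server-side model Keith describes can be sketched in a few lines: each incoming request triggers computation on the server, whose result is returned to the remote client. This is a minimal, hypothetical illustration using Python’s standard library; the handler and the computation inside it are stand-ins:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    """Each GET request invokes computation on the server and returns the result."""

    def do_GET(self):
        # Stand-in for real server-side work a web request might trigger.
        result = sum(n * n for n in range(1000))
        body = f"{result}\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep per-request logging quiet in this sketch

def serve(port: int = 8000) -> None:
    # Call serve() to expose the computation at http://127.0.0.1:<port>/
    HTTPServer(("127.0.0.1", port), ComputeHandler).serve_forever()
```

In this model the remote user gets the *results* of code they never receive or run themselves, which is exactly the distinction Keith is asking Mike to draw out.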