Slowly but surely consumers are being taught the value of open systems that the hackers intuitively understood 40 years ago.
Is this true? I hope so.
A VC posts about openness, and the hacker ethic of the 60s:
I've been reading Steven Levy's Hackers.
There is this great story where the hackers at the AI Lab at MIT are being forced to use a time sharing system on their beloved PDP-6 and they are in revolt.
So Ed Fredkin, who runs the lab, enlists Richard Greenblatt to create a new "hacker friendly" time sharing system. Richard enlists Stewart Nelson and the two of them hack together a new time sharing system in "weeks of hard core hacking". They call this system ITS, for "Incompatible Timesharing System".
The reason for mentioning this is that ITS was completely open. It had no passwords. It was completely extensible -- anyone could add features to the system. And it was designed specifically so that everyone could look at everyone else's work.
ITS was built in the late 1960s.
Almost 40 years later we are finally seeing the "hacker ethic" arrive in consumer software and services.
When we were explaining the difference between Flickr and Shutterfly/Ofoto/Snapfish to users, we often claimed that those services were "holding your information hostage" -- and they were. Photo sharing was "free" -- but it was really a loss leader for photo finishing services. Photo sharing was the wide mouth of the funnel that led you to print your photos -- and that meant you could not access the high-res originals that you yourself had uploaded. These were kept away from you (and your friends and family) so they could charge you for prints.
People should own their own data, and interestingly, openness goes both ways: when people own their own data, they are more willing to share it. The examples he gave in his post -- the blog post, the public photos tagged with "Vietnam" -- show exactly that.