January 2013


I came late to personal computing, which was born with the MITS Altair in 1975.

The first PC I ever met — and wanted desperately, in an instant — was an Apple II, in 1977. It sold in one of the first personal computer shops, in Durham, NC. Price: $2500. At the time I was driving one of a series of old GM cars I bought for nothing or under 1/10th what that computer cost. So I wasn’t in the market, and wouldn’t buy my first personal computer until I lived in California, more than a decade later.

By ’77, Apple already had competition, and ran ads voiced by Dick Cavett calling the Apple II “The most personal computer.”

After that I wanted, in order, an Osborne, a Sinclair and an IBM PC, which came out in ’81 and, fully configured, went for more than $2000. At least I got to play with a PC and an Apple II then, because my company did the advertising for a software company making a game for them. I also wrote an article about it for one of the first issues of PC Magazine. The game was Ken Uston’s Professional Blackjack.

Then, in 1984, we got one of the very first Macs sold in North Carolina. It cost about $2500 and sat in our conference room, next to a noisy little dot matrix printer that also cost too much. It was in use almost around the clock. I think the agency had about 10 people then, and we each booked our time on it.

As the agency grew, it acquired more Macs, and that’s all we used the whole time I was there.

So I got to see firsthand what Dave Winer is driving at in “MacWrite and MacPaint, a coral reef” and “What early software was influential?”

In a comment under the latter, I wrote this:

One thing I liked about MacWrite and MacPaint was their simplicity. They didn’t try to do everything. Same with MacDraw (the first object- or vector-based drawing tool). I still hunger for the simplicity of MacDraw. Also of WriteNow, which (as I recall) was written in machine language, or something like it, which made it very very fast. Also hard to update.

Same with MultiPlan, which became (or was replaced by) Excel. I loved the early Excel. It was so simple and easy to use. The current Excel is beyond daunting.

Not sure what Quicken begat, besides Quickbooks, but it was also amazingly fast for its time, and dead simple. Same with MacInTax. I actually loved doing my taxes with MacInTax.

And, of course, ThinkTank and MORE. I don’t know what the connection between MORE and the other presentation programs of the time was. Persuasion and PowerPoint both could make what MORE called “bullet charts” from outlines, but neither seemed to know what outlining was. Word, IMHO, trashed outlining by making it almost impossible to use, or to figure out. Still that way, too.

One thing to study is cruft. How is it that wanting software to do everything defeats the simple purpose of doing any one thing well? That’s a huge lesson, and one still un-learned, on the whole.

Think about what happened to Bump. Here was a nice simple way to exchange contact information. Worked like a charm. Then they crufted it up and people stopped using it. But was the lesson learned?

Remember the early Volkswagen ads, which were models of simplicity, like the car itself? They completely changed advertising “creative” for generations. Somewhere in there, somebody in the ad biz did a cartoon, multi-panel, showing how to “improve” those simple VW ads. Panel after panel, copy was added: benefits, sale prices, locations and numbers, call-outs… The end result was just another ugly ad, full of crap. Kind of like every commercial website today. Compare those with what TBL wrote HTML to do.

One current victim of cruftism is Apple, at least in software and services. iTunes is fubar. iCloud is beyond confusing, and is yet another domain namespace (it succeeds .mac and .me, which both still work, confusingly). And Apple hasn’t fixed namespace issues for users, or made it easy to search through prior purchases. Keynote is okay, but I still prefer PowerPoint, because — get this: it’s still relatively simple. Ugly, but simple.

Cruftism in Web services, as in personal software, shows up when creators of “solutions” start thinking your actual volition is a problem. They think they can know you better than you know yourself, and that they can “deliver” you an “experience” better than you can make for yourself. Imagine what it would be like to steer a car if it was always guessing at where you want to go instead of obeying your actual commands. Or if the steering wheel tugged you toward every McDonald’s you passed, because McDonald’s is an advertiser and the car’s algorithm-obeying driver thought it knew you were hungry and had a bias for fast food — whether you have it or not.

That’s the crufty “service” world we’re in now, and we’re in it because we’re just consumers of it, and not respected as producers.

The early tool-makers knew we were producers. That’s what they made those tools for. That’s been forgotten too.

I wrote that in an outliner, also by Dave.

Interesting to see how far we’ve come, and how far we still need to go.

Bonus link, on “old skool”.

Appreciating Mike Auldridge

I was at a friend’s house in Chapel Hill, one warm day in 1975, listening to WDBS, the Duke radio station where I worked at the time. As often happened with ‘DBS, a great tune came on: “Bottom Dollar,” sung by Mike Auldridge, with Linda Ronstadt singing high harmony. What blew us away, though, was not Mike’s honey baritone, but his dobro playing. It was beyond sublime. We learned the song came from the album “Blues and Bluegrass,” and promptly drove into town to buy it.

Later I gave the album to Ray Simone, to help him prep for doing Mike’s Eight String Swing album cover. It disappeared after that, and many years went by before I replaced it when a double-CD of Mike’s old albums came out. It wasn’t easy to get then. I had to send off to somewhere in Europe, as I recall. Now it’s at that last link on Amazon. Cheap too, considering.

I actually became acquainted with Mike earlier, when he played with The Seldom Scene. But I had no idea he was so damn good solo until I heard that song, and that album.

A few minutes ago, when I was searching for something else, I ran across this tribute site, which was created just a couple of months before he passed away on December 29, one day short of his 74th birthday. This was bad news. He was a treasure.

So was his music. Go listen.

Tomorrow evening, Tuesday, there will be a meetup in San Francisco that I wish I could attend. The subject is personal clouds.

We’re not talking about storage here, though that’s part of it, just like storage is part of your PC or your phone. We’re talking about your own personal space, which you control, on the Net, and not just on your devices. We’re talking about your own personal operating system: the platform for your enterprise of one. We’re talking about the place where you stand as you manage not just your own data, but your relationships with other people, various services, the Internet of Things, and your contacts—meaning your real social network (the one you define, your own way). It might be self-hosted, or physically elsewhere on the Net; doesn’t matter, long as it’s yours alone, and secure. That is, not contained in somebody else’s service. (Though you can engage one for that, if you like. On your terms.)

Personal clouds are a new concept, but central to what I (and many others) have been working on for years with ProjectVRM and related efforts. (Some of those will be there too.) It’s where personal computing, personal networking, personal storage and personal autonomy and control all meet — or should, once the tech gets built out.

It’s early in the history of wherever this thing is going to go, which is why going to this thing is a good idea.

Register here.

Apple rot

In The Lost Luster of the Juicy Apple Rumor, Steve Smith writes, “Most of the current rumors surrounding the fabled company involve Apple catching up to trends.” Ouch. In Samsung vs. Apple: Losing My Religion, which ran in AdAge last month, Barbara Lippert, a longtime member of the “Cult of Cupertino,” wrote, “The truth hurts.” That was in reference to Samsung ads that made fun of Apple, which she called “open for parody” — especially after the iPhone 5 turned out to be “a bit of a ‘meh.’” (I know: it’s not, but if that’s the perception…)

Look around the world today and you see a lot of Apple. If you’re making apps, you need a good reason not to make them for iPhones and iPads, just like you needed a good reason not to write for Windows late in the last millennium. There are just too damn many Apple thingies out there.

But we’re talking about high-turnover consumer electronics here. The life expectancy of a phone or a pad is 18 months. If that. Meanwhile, look at what Apple’s got:

  • The iPhone 5 is a stretched iPhone 4s, which is an iPhone 4 with sprinkles. The 4 came out almost 3 years ago. No Androids are as slick as the iPhone, but dozens of them have appealing features the iPhone lacks. And they come from lots of different companies, rather than just one.
  • The only things new about the iPad are the retina screen (amazing, but no longer unique) and the Mini, which should have come out years earlier and lacks a retina screen.
  • Apple’s computer line is a study in incrementalism. There is little new to the laptops or desktops other than looks — and subtracted features. (And models, such as the 17″ MacBook Pro.) That goes for the OS as well.
  • There is nothing exciting on the horizon other than the hazy mirage of a new Apple TV. And even if that arrives, nothing says “old” more than those two letters: TV.

Yes, there is a good chance Apple will have a big beautiful screen, someday. Maybe that screen will do for Apple what Trinitron did for Sony. But it will not be an innovation on the scale of the Mac, the iPod, the iPhone or the retail stores, all of which debuted in the Steve Age.

Steve built Apple on the model of a Hollywood studio — or, more specifically, Pixar. Apple’s products are like what Hollywood calls “projects.” And, like Pixar, Apple has very few of them. The business model — yea, the very nature of the company — requires each project to be a blockbuster: one after another, coming out a year or few apart. This model is suited to movie studios and the old computer industry. But it isn’t suited to consumer electronics, which is where Apple lives today.

There hasn’t been one Apple blockbuster since Steve died. Dare we consider the possibility that there won’t be another? It’s more than conceivable.

And let’s not forget how iOS 6 default-forces you to use Apple’s still-awful Maps app, which may be the biggest value-subtract in the history of computing. It still sees no subways in New York. (Stops, yes; but nothing more at any of them than links to the MTA website.) As fails go, it has few equals.

Apple’s job is to make trends, not to chase them. At that it is failing today.

This can change, of course. For the sake of Apple and its nervous shareholders I hope it does. But for now, Apple is getting ripe.

[Update on 18 January: A memorial service will be held tomorrow in the Great Hall at Cooper Union in New York. Many will speak, me included. Register at the first link. I've also added many more links to the stack below. I've also put together a too-short collection of photos I've taken of Aaron over the years. They are all Creative Commons licensed to encourage re-use. So take 'em away. I'll add more as I find them.]

Aaron Swartz’ funeral is today, and I can’t get him out of my mind. None of us who knew him ever will.

That’s not just because he was a great guy, which he was. It’s because Aaron stood for something.

That thing is freedom. It won’t die, and never will.

Look up “Aaron Swartz” +freedom. Bookmark it. Go back often. Watch what happens.

Nobody was more native to the Net than Aaron, or more determined to save it from those who would limit the freedom it embodies and supports.

The Net is free because it embodies virtues we call NEA:

  • Nobody owns it
  • Everybody can use it
  • Anybody can improve it

Like air, oceans, sunlight, gravity and the periodic table, the Net is free for us all. Both socially and economically, it has positive externalities beyond calculation.

Yet pieces of the Net’s physical infrastructure, and much of what flows over it, are either property outright, or subject to property claims. Aaron was good at drawing distinctions between the two, and — far more importantly — building tools and services that made it easier to understand those distinctions and do more within the boundaries they provide. Creative Commons, for example. Aaron’s fingerprints on that one were applied when he was just fourteen years old.

David Weinberger writes, “Aaron Swartz was not a hacker. He was a builder.” In that post, David highlights Aaron’s many contributions — a remarkable sum for a man on Earth for less than 27 years.

Aaron is gone, and that won’t change. But his influence, like the freedom he loved, will only grow, thanks to the good work he did when he was here.

As I did in my last post, I’m going to add recollections of Aaron here. Unlike that other list, all these will deal with Aaron’s life, rather than just his death:

Aaron Swartz died yesterday, a suicide at 26. I always felt a kinship with Aaron, in part because we were living demographic bookends. At many of the events we both attended, at least early on, he was the youngest person there, and I was the oldest. When I first met him, he was fourteen years old, and already a figure in the industry, in spite of his youth and diminutive stature at the time. Here he is with Dave Winer, I believe at an O’Reilly conference in San Jose:

It’s dated May 2002, when Aaron was fifteen. That was the same year I booked him for a panel at Comdex in Las Vegas. His mom dropped him off, and his computer was an old Mac laptop with a broken screen that was so dim that I couldn’t read it, but he could. He rationalized it as a security precaution. Here’s a photo, courtesy of Mary Wehmeier. Here’s another I love, from the same Berkman Center set that also contains the one above:

All those are permissively licensed for re-use via Creative Commons, which Aaron helped create before he could shave.

Aaron’s many other passions and accomplishments are well-described elsewhere, but the role he chose to play might be best described by Cory Doctorow in BoingBoing: “a full-time, uncompromising, reckless and delightful shit-disturber.” Cory also writes, “Aaron had an unbeatable combination of political insight, technical skill, and intelligence about people and issues. I think he could have revolutionized American (and worldwide) politics. His legacy may still yet do so.”

I hope that’s true. But it would have had a much better chance if he were still here doing what he did best. We haven’t just lost a good man, but the better world he was helping to make.

[Later...] Larry Lessig makes the case that Aaron was driven to end his life by the prospect of an expensive trial, due to start soon, and the prospect of prison and worse if he lost the case and its appeals. Writes Larry,

[Aaron] is gone today, driven to the edge by what a decent society would only call bullying. I get wrong. But I also get proportionality. And if you don’t get both, you don’t deserve to have the power of the United States government behind you.

For remember, we live in a world where the architects of the financial crisis regularly dine at the White House — and where even those brought to “justice” never even have to admit any wrongdoing, let alone be labeled “felons.”

In that world, the question this government needs to answer is why it was so necessary that Aaron Swartz be labeled a “felon.” For in the 18 months of negotiations, that was what he was not willing to accept, and so that was the reason he was facing a million dollar trial in April — his wealth bled dry, yet unable to appeal openly to us for the financial help he needed to fund his defense, at least without risking the ire of a district court judge.  And so as wrong and misguided and fucking sad as this is, I get how the prospect of this fight, defenseless, made it make sense to this brilliant but troubled boy to end it.

Fifty years in jail, charges our government. Somehow, we need to get beyond the “I’m right so I’m right to nuke you” ethics that dominates our time. That begins with one word: Shame.

One word, and endless tears.

[Later again, 13 January, Sunday morning...] Official Statement from the family and partner of Aaron Swartz is up at http://RememberAaronSw.tumblr.com. Here it is, entire:

Our beloved brother, son, friend, and partner Aaron Swartz hanged himself on Friday in his Brooklyn apartment. We are in shock, and have not yet come to terms with his passing.

Aaron’s insatiable curiosity, creativity, and brilliance; his reflexive empathy and capacity for selfless, boundless love; his refusal to accept injustice as inevitable—these gifts made the world, and our lives, far brighter. We’re grateful for our time with him, to those who loved him and stood with him, and to all of those who continue his work for a better world.

Aaron’s commitment to social justice was profound, and defined his life. He was instrumental to the defeat of an Internet censorship bill; he fought for a more democratic, open, and accountable political system; and he helped to create, build, and preserve a dizzying range of scholarly projects that extended the scope and accessibility of human knowledge. He used his prodigious skills as a programmer and technologist not to enrich himself but to make the Internet and the world a fairer, better place. His deeply humane writing touched minds and hearts across generations and continents. He earned the friendship of thousands and the respect and support of millions more.

Aaron’s death is not simply a personal tragedy. It is the product of a criminal justice system rife with intimidation and prosecutorial overreach. Decisions made by officials in the Massachusetts U.S. Attorney’s office and at MIT contributed to his death. The US Attorney’s office pursued an exceptionally harsh array of charges, carrying potentially over 30 years in prison, to punish an alleged crime that had no victims. Meanwhile, unlike JSTOR, MIT refused to stand up for Aaron and its own community’s most cherished principles.

Today, we grieve for the extraordinary and irreplaceable man that we have lost.

Funeral and other details follow at the bottom of that post, which concludes, “Remembrances of Aaron, as well as donations in his memory, can be submitted at http://rememberaaronsw.com.”

Also, via @JPBarlow: “Academics, please put your PDFs online in tribute to @aaronsw. Use #pdftribute.” Here’s the backstory.

A memorial tweet from Tim Berners-Lee (@TimBerners_Lee): Aaron dead. World wanderers, we have lost a wise elder. Hackers for right, we are one down. Parents all, we have lost a child. Let us weep.

Some links, which I’ll keep adding as I can:


A new window of the sole

monofocal intraocular lens

“I see,” we say, when we mean “I understand.” To make something “clear” is to make it vivid and unmistakable to the mind’s eye. There are no limits to the ways sight serves as metaphor for many good and necessary things in life. The importance of vision, even for the sightless (who still use language), is beyond full accounting. As creatures we are exceptionally dependent on vision. For us upright walkers sight is, literally and figuratively, our topmost sense.

It is also through our eyes that we express ourselves and make connections with each other. That eyes are windows of the soul is so well understood, and so often said, that no one author gets credit for it.

Yet some of us are more visual than others. Me, for example. One might think me an auditory or kinesthetic type, but in fact I am a highly visual learner. That’s one reason photography is so important to me. Of the many ways I study the world, vision is foremost, and always has been.

But my vision has been less than ideal for most of my adult life. When I was a kid it was exceptional. I liked to show off my ability to read license plates at great distances. But in college, when I finally developed strong study habits, I began getting nearsighted. By the time I graduated, I needed glasses. At 40 I was past minus-2 dioptres for both eyes, which is worse than 20/150. That was when I decided that myopia, at least in my case, was adaptive, and I stopped wearing glasses as much as possible. Gradually my vision improved. In 1999, when the title photo of this blog was taken, I was down to about 1.25 dioptres, or 20/70. A decade later I passed eye tests at the DMV and no longer required corrective lenses to drive. (Though I still wore them, with only a half-dioptre or so of correction, plus about the same for a slight astigmatism. The eye charts said I was then at about 20/25 in both eyes.)
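For a rough sense of how those dioptre figures line up with Snellen fractions, here is a small Python sketch that interpolates between the approximate pairs mentioned above. The pairing of dioptres to 20/x values is empirical and varies from person to person, and the function name and linear interpolation here are my own illustration, not a clinical formula:

```python
# Approximate (myopia in dioptres, Snellen denominator) pairs drawn
# from the text: ~0.5 D (plus slight astigmatism) ~ 20/25,
# 1.25 D ~ 20/70, and 2 D ~ 20/150.
POINTS = [(0.50, 25), (1.25, 70), (2.00, 150)]

def snellen_denominator(dioptres):
    """Linearly interpolate an approximate Snellen denominator
    for a given amount of myopia; clamp outside the known range."""
    d = abs(dioptres)  # sign convention: myopia is negative
    if d <= POINTS[0][0]:
        return POINTS[0][1]
    if d >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (d0, s0), (d1, s1) in zip(POINTS, POINTS[1:]):
        if d0 <= d <= d1:
            t = (d - d0) / (d1 - d0)
            return s0 + t * (s1 - s0)

print(f"-2.00 D ~ 20/{snellen_denominator(-2.00):.0f}")  # 20/150
print(f"-1.25 D ~ 20/{snellen_denominator(-1.25):.0f}")  # 20/70
```

The clamping at both ends just reflects that the text gives no data points beyond that range; real acuity keeps degrading past 2 dioptres.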

My various eye doctors over the years told me reversal of myopia was likely due to cataracts in my lenses. Whether or not that was the case, my cataracts gradually got worse, especially in my right eye, and something finally needed to be done.

So yesterday the lens in my right eye was replaced. That one was, in the words of the surgeon, “mature.” Meaning not much light was getting through it. The left eye is still quite functional, and the cataract remains, for now, mild.

Cataract surgery has become a routine outpatient procedure. The prep takes about an hour, but the work itself is over in fifteen minutes, if nothing goes wrong, which it usually doesn’t. But my case was slightly unusual, because I have a condition called pseudoexfoliation syndrome, or PEX, which presents some challenges to the surgery itself.

As I understand it, PEX is dandruff of the cornea, and the flakes do various uncool things, such as clog up the accordion-like pleats of the iris, so the eye sometimes doesn’t dilate quickly or well in response to changing light levels. But the bigger risk is that these flakes sometimes weaken zonules, which are what hold the lens in place. Should those fail, the lens may drop into the back of the eye, where a far more scary and complicated procedure is required to remove it, after which normal cataract surgery becomes impossible.

In the normal version, the surgeon makes a small incision at the edge of the cornea, and then destroys and removes the old lens through a process called phacoemulsification. He or she then inserts an intraocular lens, or IOL, like the one above. In most cases, it’s a monofocal lens. This means you no longer have the capacity to focus, so you need to choose the primary purpose you would like your new lens to support. Most choose looking at distant things, although some choose reading or using a computer screen. Some choose to set one eye for distance and the other for close work. Either way you’ll probably end up wearing glasses for some or all purposes. I chose distance, because I like to drive and fly and look at stars and movie screens and other stuff in the world that isn’t reading-distance away.

The doctor’s office measured the dimensions of my eye and found that I wouldn’t need any special corrections in the new lens, such as for astigmatism — that in fact, my eyes, except for the lens, are ideally shaped and quite normal. It was just the lenses that looked bad. They also found no evidence of glaucoma or other conditions that sometimes accompany PEX. Still, I worried about it, which turned out to be a waste, because the whole thing went perfectly. (It did take a while to get my iris to fully dilate, but that was the only hitch.)

What’s weird about the surgery is that you’re awake and staring straight forward while they do all this. They numb the eye with topical anesthetic, and finally apply a layer of jelly. (They actually call it that. “Okay, now layer on the jelly,” the doctor says.) Thanks to intravenous drugs, I gave a smaller shit than I normally would have, but I was fully conscious the whole time. More strangely, I had the clear sense of standing there on my retina, looking up at the action as if in the Pantheon, watching the hole in its dome. I could see and hear the old lens being emulsified and sucked away, and then saw the new lens arriving like a scroll in a tube, all curled up. As the doctor put it in place, I could see the lens unfurl, and studied one of the curved hair-like springs that holds it in place. Shortly after that, the doctor pronounced the thing done. Nurses cleaned me up, taped a clear shield over my eye, and I was ready to go.

By evening the vision through that eye became clearer than through my “good” left eye. By morning everything looked crystalline. In my follow-up visit, just 24 hours after the surgery, my vision was 20/20. Then, after the doctor relieved a bit of pressure that had built up inside the cornea, it was better than that — meaning the bottom line of the eye chart was perfectly clear.

Now it’s evening of Day 2, and I continue to be amazed at how well it’s going. My fixed eye is like a new toy. It’s not perfect yet, and may never be; but it’s so much clearer than what I saw before — and still see with my left eye — that I’m constantly looking at stuff, just to see the changes.

The only nit right now is the little rays around points of light, such as stars. But the surgeon says this is due to a bit of distortion in my cornea, and that it will vanish in a week or so.

The biggest difference I notice is color. It is now obvious that I haven’t seen pure white in years. When I compare my left and right eyes, everything through my left — the one with the remaining cataract — has a sepia tint. It’s like the difference between an old LCD screen and a new LED one. As with LED screens, whites and blues are especially vivid.

Amazingly, my computer and reading glasses work well enough, since the correction for my left eye is still accurate and the one for my right one isn’t too far off. For driving I removed the right lenses from my distance glasses, since only the left eye now needs correction.

But the experience of being inside my own eye, watching repairs happen in that space, sticks with me. All vision is in the brain, of course, and the world we see is largely a set of descriptions we project from the portfolio of things we already know. We can see how this works when we disconnect raw sensory perception from our descriptive engines. This is what happens with LSD. As I understand it (through study and not experience, alas), LSD disconnects the world we perceive from the nouns and verbs we use to describe it. So do other hallucinogens.

So did I actually see what I thought I saw? I believe so, but I don’t know. I had studied the surgical procedure before going into it, so I knew much of what was going on. Maybe I projected it. Either way, that’s over. Now I don’t see that new lens, but rather the world of light refracting through it. That world is more interesting than my own, by a wider margin than before yesterday. It’s a gift I’m enjoying to the fullest.

Two years ago I called Al Jazeera’s live coverage of the revolution in Egypt a “Sputnik moment” for cable in the U.S. Turns out it wasn’t. Not now that Al Jazeera has agreed to pay half a billion dollars, plus its live Internet stream, to sit at U.S. cable’s table. Losing Al Jazeera English reduces the number of live streams available on the Net from major video news channels to a single source: France24. It also caps Al Jazeera English’s history on the Net at 5.25 years.

It’s a huge victory for cable and an equally huge loss for the open Net. I dearly hope Al Jazeera feels that loss too. Because what Al Jazeera screws here is a very loyal audience. Just, apparently, not a lucrative one.

In Al Jazeera Embraces Cable TV, Loses Web, The Wall Street Journal explains,

…to keep cable operators happy, Al Jazeera may have to make a difficult bargain: Giving up on the Web.

The Qatar government-backed television news operation, which acquired Current TV for a few hundred million dollars from investors including Al Gore, said Thursday that it will at least temporarily stop streaming online Al Jazeera English, its global English-language news service, in about 90 days. That’s when it plans to replace Current TV’s programming with Al Jazeera English.

Al Jazeera plans later to launch an entirely new channel, Al Jazeera America, that will combine programming from the existing English-language service with new material. The new channel likely won’t be streamed online either, a spokesman said.

And it is unclear whether the original English service will reappear online: the spokesman said Thursday a decision about that was dependent on negotiations with cable operators.

The network’s decision to pull its service off the Web is at the behest of cable and satellite operators. It reflects a broader conflict between pay television and online streaming that other TV channels face. Because cable and satellite operators pay networks to carry their programming, the operators don’t want the programming appearing for free online. Aside from older series available through services like Netflix, most cable programming is available online only to people who subscribe to cable TV.

You won’t find better proof that television is a captive marketplace. You can only watch it in ways The Industry allows, and on devices it provides or approves. (While it’s possible to watch TV on computers, smartphones and tablets, you can only do that if you’re already a cable or satellite subscriber. You can’t get it direct. You can’t buy it à la carte, as would be the case if the marketplace were fully open.)

For what it’s worth, I would gladly pay for Al Jazeera English. So would a lot of other people, I’m sure. But the means for that are not in place, except through cable bundles, which everybody other than the cable industry hates.

In the cable industry they call the Net “OTT,” for “over the top.” That’s where Al Jazeera English thrived. But now, for non-cable subscribers, Al Jazeera English is dead and buried UTB — under the bottom.

Adverto in pacem, AJE. For loyal online viewers you were the future. Soon you’ll be the past.

Bonus links:

Nearly all smartphones today are optimized to do three things for you:

  1. Run apps
  2. Speak to other people
  3. Make you dependent on a phone company

The first two are features. The third is a bug. In time that bug will be exterminated. Meanwhile it helps to look forward to what will happen with #1 and #2 once they’re liberated from #3.

Both features are personal. That’s key. Our smartphones (or whatever we end up calling them) should be as personal as our clothing, wallets and purses. In other words, they should work as extensions of ourselves.

When this happens, they will have evolved into what Martin Kuppinger calls life management platforms, good for all these things —

— in addition to the stuff already made possible by the zillion apps already out there.

What kinds of smartphones are in the best position to evolve into Life Management Platforms? The short answer is: open ones. The longer answer is: open ones that are already evolving and have high levels of adoption.

Only one platform qualifies, and that’s Android. Here’s what Wikipedia says (as of today) about Android’s open-ended evolutionary position:

Historically, device manufacturers and mobile carriers have typically been unsupportive of third-party firmware development. Manufacturers express concern about improper functioning of devices running unofficial software and the support costs resulting from this.[81] Moreover, modified firmwares such as CyanogenMod sometimes offer features, such as tethering, for which carriers would otherwise charge a premium. As a result, technical obstacles including locked bootloaders and restricted access to root permissions are common in many devices. However, as community-developed software has grown more popular, and following a statement by the Librarian of Congress in the United States that permits the “jailbreaking” of mobile devices,[82] manufacturers and carriers have softened their position regarding third party development, with some, including HTC,[81] Motorola,[83] Samsung[84][85] and Sony Ericsson,[86] providing support and encouraging development. As a result of this, over time the need to circumvent hardware restrictions to install unofficial firmware has lessened as an increasing number of devices are shipped with unlocked or unlockable bootloaders, similar to the Nexus series of phones, although usually requiring that users waive their devices’ warranties to do so.[81] However, despite manufacturer acceptance, some carriers in the US still require that phones are locked down.[87]

The unlocking and “hackability” of smartphones and tablets remains a source of tension between the community and industry, with the community arguing that unofficial development is increasingly important given the failure of industry to provide timely updates and/or continued support to their devices.[87]

But the community doesn’t just argue. It moves ahead with implementations. For example, Ubuntu for Android and custom ROMs for Google’s Nexus 7.

The reason there is an aftermarket for Nexus hardware is that Google intended for Android to be open and generative from the start, pointedly saying that Nexus is “unlocked and contract free.” This is why, even though Google does lots of business with mobile phone operators, it is those operators’ friend only to the degree it helps lead them past current customer-entrapment business models and into a future thick with positive economic externalities. Amidst those externalities, phone companies will still enjoy huge built-out infrastructure and other first-mover advantages. They will wake up and smell the infinity.

While Apple deserves huge credit for modeling what a smartphone should do, and how it should work (Steve Jobs was right to see Android as something of a knock-off), the company’s walled garden remains a monument of feudality. For a window on how that fails, read Barbara Lippert’s Samsung vs. Apple: Losing My Religion in MediaPost. Barbara is an admitted member of the “cult of Cupertino,” and is — along with droves of other Apple serfs — exiting the castle.

Samsung, however, just happens to be (deservedly) the maker of today’s most popular Androids. The Androids that win in the long run will be true life management platforms. Count on it.

For a window on that future, here are the opening paragraphs of The Customer as a God, my essay in The Wall Street Journal last July:

It’s a Saturday morning in 2022, and you’re trying to decide what to wear to the dinner party you’re throwing that evening. All the clothes hanging in your closet are “smart”—that is, they can tell you when you last wore them, what else you wore them with, and where and when they were last cleaned. Some do this with microchips. Others have tiny printed tags that you can scan on your hand-held device.

As you prepare for your guests, you discover that your espresso machine isn’t working and you need another one. So you pull the same hand-held device from your pocket, scan the little square code on the back of the machine, and tell your hand-held, by voice, that this one is broken and you need another one, to rent or buy. An “intentcast” goes out to the marketplace, revealing only what’s required to attract offers. No personal information is revealed, except to vendors with whom you already have a trusted relationship.

Within a minute offers come in, displayed on your device. You compare the offers and pick an espresso machine to rent from a reputable vendor who also can fix your old one. When the replacement arrives, the delivery service scans and picks up the broken machine and transports it to the vendor, who has agreed to your service conditions by committing not to share any of your data with other parties and not to put you on a list for promotional messages. The agreement happened automatically when your intentcast went out and your terms matched up with the vendor’s.

Your hand-held is descended from what they used to call smartphones, and it connects to the rest of the world by whatever ambient connection happens to be available. Providers of commercial Internet connections still make money but not by locking customers into “plans,” which proved, years ago, to be more trouble than they were worth.

The hand-held itself is also uncomplicated. New technologies and devices are still designed by creative inventors, and there are still trade secrets. But prototyping products and refining them now usually involves actual users at every stage, especially in new versions. Manufacturers welcome good feedback and put it to use. New technology not only evolves rapidly, but appropriately. Ease of use is now the rule, not the exception.

OK, now back to the present.

Everything that I just described can be made possible only by the full empowerment of individuals—that is, by making them both independent of controlling organizations and better able to engage with them. Work toward these goals is going on today, inside a new field called VRM, for vendor relationship management. VRM works on the demand side of the marketplace: for you, the customer, rather than for sellers and third parties on the supply side.
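The intentcast flow in the scenario above (broadcast a minimal statement of demand, collect offers, accept only vendors who agree to your terms) can be sketched in a few lines. Everything here, from the field names to the matching rule, is a hypothetical illustration of the idea, not any actual VRM protocol:

```python
from dataclasses import dataclass

@dataclass
class Intentcast:
    # Only what's required to attract offers; no personal data travels with it.
    want: str
    terms: dict  # conditions the vendor must commit to

@dataclass
class Offer:
    vendor: str
    price_per_month: float
    accepts: dict  # terms this vendor commits to

def match(cast: Intentcast, offers: list) -> list:
    """Keep only offers whose vendor accepts every one of the customer's terms."""
    return [o for o in offers
            if all(o.accepts.get(k) == v for k, v in cast.terms.items())]

cast = Intentcast(want="espresso machine, rent or buy",
                  terms={"no_data_sharing": True, "no_promo_lists": True})
offers = [
    Offer("GoodVendor", 9.0, {"no_data_sharing": True, "no_promo_lists": True}),
    Offer("SpamCo",     5.0, {"no_data_sharing": False, "no_promo_lists": False}),
]
print([o.vendor for o in match(cast, offers)])  # ['GoodVendor']
```

The point of the sketch is the direction of the logic: the customer's terms filter the vendors, not the other way around.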

It helps that Android is already huge. It will help more when makers of Android devices and apps squash the phone company dependency bug. It will also help that the “little square code” mentioned above already exists. For a pioneering example, see SquareTag.com. For examples of how individuals can program logical connections between other entities in the world, see Kynetx and Ifttt. (Kynetx is for developers. Ifttt is for users.)
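Ifttt’s recipes are, at bottom, “if this event, then that action” rules. Here is a toy sketch of that pattern, purely illustrative and not Ifttt’s or Kynetx’s actual API:

```python
# Toy event-condition-action rules, in the spirit of Ifttt recipes.
rules = []

def when(condition):
    """Register an action to run whenever an event satisfies `condition`."""
    def register(action):
        rules.append((condition, action))
        return action
    return register

def fire(event):
    """Deliver an event to every rule whose condition it satisfies."""
    for condition, action in rules:
        if condition(event):
            action(event)

@when(lambda e: e.get("type") == "photo_posted")
def save_to_archive(e):
    print(f"archiving {e['url']}")

fire({"type": "photo_posted", "url": "http://example.com/pic.jpg"})
# prints: archiving http://example.com/pic.jpg
```

The user composes rules; the platform just routes events to them. That is the “logical connections between entities” idea in miniature.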

As for investors, startups and incumbent big companies, it will help to start looking at the world from the perspective of the individual that each of us happens to be. The future is about liberating us, and equipping us with means for managing our lives and our relationships with other entities in the open marketplace. Personal independence and empowerment are what the PC, the Internet and the smartphone have all provided from the start. Trying to rein in that independence and empowerment comes naturally to big companies, and even some startups. But the vector of progress has always run along the line of personal freedom and empowerment. Free customers will be more valuable than captive ones. Android’s success is already starting to prove that.

One day, back around 15,000 BCE, half a mountain in Southern California broke loose and slid out onto what’s now the Mojave desert. The resulting landform is called the Blackhawk Slide. Here it is:

It’s that ripple-covered lobe on the bottom right. According to Robert Sharp’s Geology Underfoot in Southern California, it didn’t just flow off the mountain, as would happen with a typical landslide. It actually slid intact, like a toboggan, four and a half miles, on a slope of only two to three degrees. It could not have traveled so far, and have remained so intact (with rock layers preserved, in order, top to bottom), if it had merely flowed.

Geologists can tell it slid because it didn’t just heap at the base of the mountain from which it detached. Instead it soared, at low altitude, four and a half miles, on the flat, on a cushion of air, out across the desert, before plopping down.

To get some perspective on this, here are two facts to consider. First, we’re talking about ten billion cubic feet of detached mountain face here. Second, in order to travel that far out onto the desert, shattered but essentially in one piece, it had to glide on a cushion of air, at speeds up to 270 miles per hour. Or so goes the theory.
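Those numbers invite a back-of-envelope check. Assuming, purely for illustration, that the slide decelerated roughly uniformly from its 270 mph peak (so its average speed was about half that), the whole trip took on the order of two minutes:

```python
# Back-of-envelope arithmetic on the Blackhawk Slide figures above.
# Assumption (mine, not the book's): roughly uniform deceleration from
# the 270 mph peak to rest, giving an average speed of about half the peak.

distance_mi = 4.5
peak_mph = 270
avg_mph = peak_mph / 2            # uniform-deceleration assumption

duration_min = distance_mi / avg_mph * 60
print(f"~{duration_min:.0f} minutes aloft")         # ~2 minutes aloft

# Ten billion cubic feet, expressed as a single cube for scale:
side_ft = 10e9 ** (1 / 3)
print(f"equivalent cube ~{side_ft:.0f} ft on a side")  # ~2154 ft on a side
```

Two minutes for half a mountain to toboggan four and a half miles, in a block that would make a cube over 2,000 feet on a side.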

One wonders if humans were there to see it happen. Ancestors of Native Americans were already on the continent by then, thanks to the land bridge exposed during the last glacial maximum, which still had several thousand more years to go. There may have been some ice on the mountains themselves, and perhaps that helped weaken the rock, which was already raised to the sky by pressures on the San Andreas Fault, which lies on the back side of the San Bernardino Mountains, a couple dozen miles from the slide.

I came along a bit late, but was glad to get my first chance to take a gander at the slide, the day after Thanksgiving, on a United flight from San Jose to Houston. I was shooting against the sun, and it was a bit hazy, but I was still able to get a good look, and this photo set too.

Additional links: