Gear


Fuse is more than a device and a smartphone app to go with it. The world is full of those already.

Fuse is the first product in the digital age that can blow up every one of the silos built to trap personal data and limit personal independence.

Fuse does that by putting you — literally — in the driver’s seat of your life.

Fuse is also the first product to show how your own “Internet of things” can be fully yours — and truly integrated in ways that work for you — without requiring that you become a serf in some company’s castle.

Fuse is an invention of Phil Windley and his team at Kynetx, who are committed to the freedom, independence and self-empowerment of individuals: to making you a driver of your own life and your own stuff, and not just a “user” of others’ products and services. And to letting you be “social” in your own ways, as you are in your everyday life outside the Web.

This is why Fuse is Net-native, not Web-native (though it uses the Web too). This matters because the Net was created as a decentralized World of Ends, where every node can be sovereign and independent, as well as zero functional distance from every other node. The Web could have been the same, but instead it grew on top of the Net, along lines defined by client-server architecture (aka calf-cow), which makes everything there centralized: you’re always a client, and always at the mercy of servers. This is why the browser, which started out as a vehicle on the Information Superhighway, turned into a shopping cart that gets re-skinned at every commercial site you visit, and carries tracking beacons so you can be a better target for advertising.

Fuse drives under and away from that model, which has become terribly corrupted, and toward what Bob Frankston (sitting next to me as I write this) calls the “boundaryless” and “permissionless” world.

If Fuse succeeds, it will be a critical first step toward building the fully independent vehicle for the fully independent human being on that same old Information Superhighway. And it will do that by starting with your own car.

There are only a few hours left for the Fuse Kickstarter campaign. The sum required is only $60,000, and contributions have passed $50,000 already. So help put it over the top. It could be the most leveraged investment you’ll ever make in the future of personal independence in the networked world.

More background in my first post on Fuse.

[Later, same day...] Goal reached:

294 backers
$63,202 pledged of $60,000 goal
Looking forward to seeing Fuse’s pudding prove the headline above. :-)

car radio

Radio’s 1.x era is coming to an end. Signs and portents abound. The rise and decline of AM radio just ran in the Pittsburgh Post-Gazette, hometown paper for KDKA, the granddaddy of AM radio in the U.S. In AM/FM Radio Is Already Over, And No One Will Miss It, Adam Singer writes,

Radio advertisements are an awful, intrusive experience and universally despised

Most passionate music fans have held disdain for radio since the advent of portable music. It’s not just a dated medium, it tries to prop up a legacy generation “winner take all” of the most banal / manufactured “hits” as opposed to the meatier middle and tail of music where the quality content is (and where artists take chances and push the envelope creatively).

AM / FM radio djs and personalities are really the only thing left, and they should abandon radio now because they would benefit greatly by setting up shop online. Whether their own blog / podcast, app, or even experimenting with video (which is still a chance to be a pioneer). Even if they aren’t totally ready to abandon it yet, they should start to funnel their audiences to a digital community of some sort where they can grow over time in a platform agnostic way. This way they’re prepared for a digital future.

The notion of terrestrial analog content via AM/FM is quaint in a digital society and has reached an inevitable end. The technology itself is done. The good news is the personalities and content can not just survive, but thrive in a much higher quality environment. Further, digital provides a better experience for  audiences and sheds legacy baggage / a model that pushes aside quality and creativity for profit. Advertisers and technology providers will benefit here too: the modern device landscape provides a much better experience from a measurement, content serving, customization, and brand perspective (and so much more).

No doubt in our lifetime AM/FM will completely go away, perhaps only existing as emergency frequency. But everyone: consumers, advertisers, artists and personalities win by embracing digital. You’re fighting the future to ignore this and that’s never a way to succeed.

Yet people still listen to streams of audio, which is all radio ever was. Most of that audio is now digital, and comes to us over the Internet, even if some of it also still streams out over analog airwaves. Naturally, it’s all merging together, with predictable combinations of hand-wringing and huzzahs.

In How Tesla Changes Radio, B. Eric Rhoads reports on both:

Most in our industry are responding like any industry that’s challenged: defending the status quo and finding all the reasons consumers won’t change. And it might even be true, in radio’s case. But how likely is that? The questions all radio broadcasters need to be asking themselves now is how they can develop listener loyalty and cement their brands so deeply that listeners will seek out their favorite stations even when they have a choice of 75,000 stations from all around the world. Though you’ll still be available on the local AM FM dial, you need to assume people embracing online radio may only seek out stations in an online environment.

And, speaking of the status quo, dig “Fixing” AM Radio Broadcasting, Parts I, II and III, by Old Curmudgeon of LBA Group. There you will find perhaps the only useful way to bring a 1920s-vintage transmission system into the next millennium. And it may well work, even though the result will still suffer from a bug that was once a feature. I explain what I mean by that in a comment under Part III:

Last year, after failing to find a useful radio at Radio Shack, my teenage son asked me a question that spoke straight to the obsolescence of radio as we know it: “What is the point of ‘range’?” In other words, why is losing a signal while driving away from town a feature and not a bug? When I explained some of the legacy technical and regulatory issues behind ‘range’, he asked, “What will it take to save radio?”

I like your answers.

In this series you frame the problems well and pose a good solution that I think will work by providing a technical and regulatory bridge from analog to digital and from 1925 to 2015. I hope regulators and broadcasters both take your proposals seriously.

Meanwhile, both the radio industry and the FCC are in denial of what’s actually happening with the “millennial” generation to which my son belongs. These people are Net-based. They assume connectivity, and zero functional distance between themselves and everyone and everything else in the networked world. They are also remarkably unconcerned with threats to the Net, and therefore to that model, from phone and cable companies and captive regulators.

Hollywood in particular has known since 1995 that all of broadcasting and content distribution is being absorbed by the Net. With phone and cable companies — with which Hollywood is increasingly integrated vertically — they are desperate to find ways to continue controlling that distribution — preferably on models just as old as AM radio. Billing especially is a key issue. Phone and cable companies are billing systems as well as communications ones. Terrestrial TV and radio are not, which is one reason they care little about saving them.

So, to me at least, the parallel challenge to saving AM (and FM) radio is keeping incumbent giants and their captive regulators from stuffing the Internet’s genie back in the bottles of Business as Usual.

In You Must Be HD to Compete in the Dash, RadioINK interviews Bob Struble (@rjstruble), CEO of iBiquity, the company behind HD Radio, which I love more for the way it cleans up beat-up AM and FM signals than for its other virtues. An excerpt:

…take my new Sequoia as an example. It has one screen layout that is the same for all audio services — Sirius, Pandora, iHeart, iPod, and analog or digital AM/FM. The screen has all my presets, from any source, on one side, and the content screen on the other side. Like all the digital services, HD Radio technology allows a station to fill that screen. There is an album cover or station logo in the middle of the screen, there are indicators that there is an HD2, HD3, or HD4 station available, there is song and artist info, there is an iTunes Tagging button to store song info for later purchase. Overall, it looks and feels like an audio service should in the digital age.

Hmm: “audio service.” I think that’s Radio 2.0, which here I call the “holy grail.”

All this will be front & center at the Dash Conference next week in Detroit. I’ll be there in spirit while my butt is at IIW in Silicon Valley (which I co-organize). This means I’ll be watching Twitter and blogs for reports on progress. In other words, I’ll stay tuned.

We’re not watching any less TV. In fact, we’re watching more of it, on more different kinds of screens. Does this mean that TV absorbs the Net, or vice versa? Or neither? That’s what I’m exploring here. By “explore” I mean I’m not close to finished, and never will be. I’m just vetting some ideas and perspectives, and looking for help improving them.

TV 1.0: The Antenna Age

In the beginning, 100% of TV went out over the air, radiated by contraptions atop towers or buildings, and picked up by rabbit ears on the backs of TV sets or by bird roosts on roofs. “Cable” was the wire that ran from the roof to the TV set. It helps to understand how this now-ancient system worked, because its main conceptual frame — the channel, or a collection of them — is still with us, even though the technologies used are almost entirely different. So here goes.

tv antenna

Empire State Building antennas

On the left is a typical urban rooftop TV antenna. The different lengths of the antenna elements correspond roughly to the wavelengths of the signals. For reception, this mattered a lot.

In New York City, for example, TV signals all came from the Empire State Building — and still do, at least until they move to the sleek new spire atop One World Trade Center, aka the Freedom Tower. (Many stations were on the North Tower of the old World Trade Center, and perished with the rest of the building on 9/11/2001. After that, they moved back to their original homes on the Empire State Building.)

“Old” in the right photo refers to analog, and “new” to digital. (An aside: FM is still analog. Old and New here are just different generations of transmitting antennas. The old FM master antenna is two rings of sixteen T-shaped things protruding above and below the observation deck on the 102nd floor. It’s still in use as an auxiliary antenna. Here’s a similar photo from several decades back, showing the contraptual arrangement at the height of the Antenna Age.)

Channels 2-6 were created by the FCC in the 1940s (along with FM radio, which is in a band just above TV channel 6). Those weren’t enough channels, so 7-13 came along next, on higher frequencies — and therefore shorter wavelengths. Since the shorter waves don’t bend as well around buildings and terrain, stations on channels 7-13 needed higher power. So, while the maximum power for channels 2-6 was 100,000 watts, the “equivalent” on channels 7-13 was 316,000 watts. All those channels were in VHF bands, for Very High Frequency. Channels 14-83, in the UHF (Ultra High Frequency) band, were added in the 1950s to make room for more stations in more places. Here the waves were much shorter, and the maximum transmitted power for “equivalent” coverage to VHF was 5,000,000 watts. (All were ERP, or effective radiated power, toward the horizon.)
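The connection between those channel numbers and the antenna-element lengths mentioned earlier is just the wavelength formula, λ = c/f. A quick sketch (the channel frequencies below are the standard U.S. assignments; antenna elements are typically cut near a half wavelength, so all figures are rough):

```python
# Sketch: why UHF antenna elements are so much shorter than VHF ones.
# Wavelength = speed of light / frequency; elements are cut near a half wave.

C = 299_792_458  # speed of light, m/s

channels_mhz = {
    "Channel 2 (low VHF)": 57,    # 54-60 MHz band
    "Channel 7 (high VHF)": 177,  # 174-180 MHz band
    "Channel 14 (UHF)": 473,      # 470-476 MHz band
}

for name, mhz in channels_mhz.items():
    wavelength_m = C / (mhz * 1e6)
    half_wave_cm = wavelength_m / 2 * 100
    print(f"{name}: ~{wavelength_m:.2f} m wavelength, "
          f"half-wave element ~{half_wave_cm:.0f} cm")
```

Channel 2 works out to about a 5-meter wave, channel 14 to well under a meter, which is why the longest rods on a rooftop antenna point at the low VHF channels.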

This was, and remains, a brute-force approach to what we now call “delivering content.” Equally brute approaches were required for reception as well. To watch TV, homes in outer suburban or rural areas needed rooftop antennas that looked like giant centipedes.

What they got — analog TV — didn’t have the resolution of today’s digital TV, but it was far more forgiving of bad reception conditions. You might get “ghosting” from reflected signals, or “snow” from a weak signal, but people put up with those problems just so they could see what was on.

More importantly, they got hooked.

TV 2.0: The Cable Age

It began with CATV, or Community Antenna Television. For TV junkies who couldn’t get a good signal, CATV was a godsend. In the early ’70s I lived in McAfee, New Jersey, deep in a valley, where a rabbit-ears antenna got nothing, and even the biggest rooftop antenna couldn’t do much better. (We got a snowy signal on Channel 2 and nothing else.) So when CATV came through, giving us twelve clear channels of TV from New York and Philadelphia, we were happy to pay for it. A bit later, when we moved down Highway 94 to a high spot south of Newton, my rooftop antenna got all those channels and more, so there was no need for CATV there. Then, after ’74, when we moved to North Carolina, we did without cable for a few years, because our rooftop antennas, which we could spin about with a rotator, could get everything from Roanoke, Virginia to Florence, South Carolina.

But then, in the early ’80s, we picked up on cable because it had Atlanta “superstation” WTCG (later WTBS and then just TBS) and HBO, which was great for watching old movies. WTCG, then still called Channel 17, also featured the great Bill Tush. (Sample here.) The transformation of WTCG into a satellite-distributed “superstation” meant that a TV station no longer needed to be local, or regional. For “super” stations on cable, “coverage” and “range” became bugs, not features.

Cable could also present viewers with more channels than they could ever get over the air. Technical improvements gradually raised the number of possible channels from dozens to hundreds. Satellite systems, which replicated cable in look and feel, could carry even more channels.

Today cable is post-peak. See here:

catv and cable tv

That’s because, in the ’90s, cable also turned out to be ideal for connecting homes to the Internet. We were still addicted to what cable gave us as “TV,” but we also had the option to watch a boundless variety of other stuff — and to produce our own. Today people are no less hooked on video than they were in 1955, but a declining percentage of their glowing-rectangle viewing is on cable-fed TV screens. The main thing still tying people to cable is the exclusive availability of high-quality and in-demand shows (including, especially, live sports) over cable and satellite alone.

This is why apps for CNN, ESPN, HBO and other cable channels require proof of a cable or satellite TV subscription. If cable content were à la carte, the industry would collapse. The industry knows this, of course, which makes it defensive.

That’s why Aereo freaks them out. Aereo is the new company that Fox and other broadcasters are now suing for giving people who can’t receive TV signals a way to do that over the Net. The potential served population is large, since the transition of U.S. television from analog to digital transmission (DTV) was, and remains, a great big fail.

Where the FCC estimated a 2% loss of analog viewers after the transition in June 2009, in fact 100% of the system changed, and post-transition digital coverage was not only a fraction of pre-transition analog coverage, but required an entirely new way to receive signals, as well as to view them. Here in New York, for example, I’m writing this in an apartment that could receive analog TV over rabbit ears in the old analog days. It looked bad, but at least it was there. With DTV there is nothing. For apartment dwellers without line-of-sight to the Empire State Building, the FCC’s reception maps are a fiction. Same goes for anybody out in the suburbs or in rural areas. If there isn’t a clear-enough path between the station’s transmitter and your TV’s antenna, you’re getting squat.
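The line-of-sight problem is easy to put numbers on. A common back-of-the-envelope formula for radio horizon (using the standard 4/3-earth approximation) is d ≈ 3.57 × (√h_tx + √h_rx) kilometers, with antenna heights in meters. The 443-meter transmitter height below is an assumption, roughly the top of the Empire State Building’s mast; the point is how fast the reachable distance shrinks with receiver height, before buildings and terrain block anything at all:

```python
import math

# Radio-horizon sketch (4/3-earth approximation):
#   d (km) ~= 3.57 * (sqrt(h_tx) + sqrt(h_rx)), heights in meters.
# This is an upper bound: any obstruction inside that distance,
# like a building between you and the mast, kills a UHF DTV signal.

def radio_horizon_km(h_tx_m: float, h_rx_m: float) -> float:
    """Approximate maximum unobstructed path between two antennas."""
    return 3.57 * (math.sqrt(h_tx_m) + math.sqrt(h_rx_m))

print(radio_horizon_km(443, 10))   # tall mast to a suburban rooftop antenna
print(radio_horizon_km(443, 1.5))  # tall mast to rabbit ears near the floor
```

Even the best case is well under 100 km, and the formula says nothing about the canyons of midtown, which is why a coverage map drawn from transmitter power alone overstates what apartment dwellers actually receive.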

TV stations actually don’t give much of a damn about over-the-air any more, because 90+% of viewers are watching cable. But TV stations still make money from cable systems, thanks to re-transmission fees and “must carry” rules. These rules require cable systems to carry all the signals receivable in the area they serve. And the coverage areas are mostly defined by the old analog signal footprints, rather than the new smaller digital footprints, which are also much larger on the FCC’s maps than in the realities where people actually live.

Aereo gets around all that by giving each customer an antenna of their own, somewhere out where the signals can be received, and delivering each received station’s video to customers over the Net. In other words, it avoids being defined as cable, or even CATV. It’s just giving you, the customer, your own little antenna.

This is a clever technical and legal hack, and strong enough for Aereo to win in court. After that victory, Fox threatened to take its stations off the air entirely, becoming cable- and satellite-only. This exposed the low regard that broadcasters hold for their over-the-air signals, and for broadcasting’s legacy “public service” purpose.

The rest of the Aereo story is inside baseball, and far from over. (If you want a good rundown of the story so far, dig Aereo: Reinventing the cable TV model, by Tristan Louis.)

Complicating this even more is the matter of “white spaces.” Those are parts of the TV bands where there are no broadcast signals, or where broadcast signals are going away. These spaces are valuable because there are countless other purposes to which signals in those spaces could be put, including wireless Internet connections. Naturally, TV station owners want to hold on to those spaces, whether they broadcast in them or not. And, just as naturally, the U.S. government would like to auction the spaces off. (To see where the spaces are, check out Google’s “spectrum browser.” And note how few of them there are in urban areas, where there are the most remaining TV signals.)

Still, TV 2.0 through 2.9 is all about cable, and what cable can do. What’s happening with over-the-air is mostly about what the wonks call policy. From Aereo to white spaces, it’s all a lot of jockeying for position — and making hay where the regulatory sun shines.

Meanwhile, broadcasters and cable operators still hate the Net, even though cable operators are in the business of providing access to it. Both also remain in denial about the Net’s benefits beyond serving as Cable 2.x. They call distribution of content over the Net (e.g. through Hulu and Netflix) “over the top” or OTT, even though it’s beyond obvious that OTT is the new bottom.

FCC regulations regarding TV today are in desperate need of normalizing to the plain fact that the Net is the new bottom — and incumbent broadcasters aren’t the only ones operating there. But then, the feds don’t understand the Net either. The FCC’s world is radio, TV and telephony. To them, the Net is just a “service” provided by phone and cable companies.

TV 3.0: The IPTV Age

IPTV is TV over the Internet Protocol — in other words, through the open Internet, rather than through cable’s own line-up of channels. One example is Netflix. By streaming movies over the Net, Netflix put a big dent in cable viewing. Adding insult to that injury, the vast majority of Netflix streamed movies are delivered over cable connections, and cable doesn’t get a piece of the action, because delivery is over OTT, via IPTV. And now, by producing its own high-quality shows, such as House of Cards, Netflix is competing with cable on the program front as well. To make the viewing experience as smooth as possible for its customers, Netflix also has its own equivalent of a TV transmitter. It’s called OpenConnect, and it’s one among a number of competing CDNs, or Content Delivery Networks. Basically they put up big server farms as close as possible to large volumes of demand, such as in cities.
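The core CDN idea is simple enough to sketch: serve each viewer from whichever edge location answers fastest. The edge names and latencies below are made up for illustration, not anything from OpenConnect itself:

```python
# Toy sketch of the CDN idea: route each request to the edge server
# with the lowest measured round-trip latency. All values hypothetical.

edges = {
    "nyc-edge": 12.0,  # round-trip ms from this viewer (made up)
    "chi-edge": 31.0,
    "lax-edge": 68.0,
}

def pick_edge(latencies_ms: dict) -> str:
    """Return the name of the edge with the lowest latency."""
    return min(latencies_ms, key=latencies_ms.get)

print(pick_edge(edges))  # -> nyc-edge
```

Real CDNs fold in load, cache contents and peering costs, but the shape of the decision is the same: move the server farm, not the viewer.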

So think of Netflix as a premium cable channel without the cable, or the channel, optimized for delivery over the Internet. It carries forward some of TV’s norms (such as showing old movies and new TV shows for a monthly subscription charge) while breaking new ground where cable and its sources either can’t or won’t go.

Bigger than Netflix, at least in terms of its catalog and global popularity, is Google’s YouTube. If you want your video to be seen by the world, with maximum leverage, YouTube is where you put it today. YouTube isn’t a monopoly for Google (the list of competitors is long), but it’s close. (According to Alexa, YouTube is accessed by a third of all Internet users worldwide. Its closest competitor, in the U.S. at least, is Vimeo, with a global reach of under 1%.) So, while Netflix looks a lot like cable, YouTube looks like the Web. It’s Net-native.

Bassem Youssef, “the Jon Stewart of Egypt,” got his start on YouTube, and then expanded into regular TV. He’s still on YouTube, even though his show on TV got canceled when he was hauled off to jail for offending the regime. Here he tells NBC’s Today show, “there’s always YouTube.” [Later... Dig this bonus link.]

But is there? YouTube is a grace of Google, not the Web. And Google is a big advertising business that has lately been putting more and more ads, TV-like, in front of videos. Nothing wrong with that, it’s a proven system. The question, as we move from TV 3.0 to 3.9, is whether the Net and the Web will survive the inclusion of TV’s legacy methods and values in its midst. In The TV in the Snake of Time, written in July 2010, I examined that question at some length:

Television is deeply embedded in pretty much all developed cultures by now. We — and I mean this in the worldwide sense — are not going to cease being couch potatoes. Nor will our suppliers cease couch potato farming, even as TV moves from airwaves to cable, satellite, and finally the Internet.

In the process we should expect the spirit (if not also the letter) of the Net’s protocols to be violated.

Follow the money. It’s not for nothing that Comcast wishes to be in the content business. In the old cable model there’s a cap on what Comcast can charge, and make, distributing content from others. That cap is its top cable subscription deals. Worse, they’re all delivered over old-fashioned set top boxes, all of which are — as Steve Jobs correctly puts it — lame. If you’re Comcast, here’s what ya do:

  1. Liberate the TV content distro system from the set top sphincter.
  2. Modify or re-build the plumbing to deliver content to Net-native (if not entirely -friendly) devices such as home flat screens, smartphones and iPads.
  3. Make it easy for users to pay for any or all of it on an à la carte (or at least an easy-to-pay) basis, and/or add a pile of new subscription deals.

Now you’ve got a much bigger marketplace, enlarged by many more devices and much less friction on the payment side. (Put all “content” and subscriptions on the shelves of “stores” like iTunes’ and there ya go.) Oh, and the Internet? … that World of Ends that techno-utopians (such as yours truly) liked to blab about? Oh, it’s there. You can download whatever you want on it, at higher speeds every day, overall. But it won’t be symmetrical. It will be biased for consumption. Our job as customers will be to consume — to persist, in the perfect words of Jerry Michalski, as “gullets with wallets and eyeballs.”

Future of the Internet

So, for current and future build-out, the Internet we techno-utopians know and love goes off the cliff while better rails get built for the next generations of TV — on the very same “system.” (For the bigger picture, Jonathan Zittrain’s latest is required reading.)

In other words, it will get worse before it gets better. A lot worse, in fact.

But it will get better, and I’m not saying that just because I’m still a utopian. I’m saying that because the new world really is the Net, and there’s a limit to how much of it you can pave with one-way streets. And how long the couch potato farming business will last.

More and more of us are bound to produce as well as consume, and we’ll need two things that a biased-for-TV Net can’t provide. One is speed in both directions: out as well as in. (“Upstream” calls Sisyphus to mind, so let’s drop that one.) The other is what Bob Frankston calls “ambient connectivity.” That is, connectivity we just assume.

When you go to a hotel, you don’t have to pay extra to get water from the “hydro service provider,” or electricity from the “power service provider.” It’s just there. It has a cost, but it’s just overhead.

That’s the end state. We’re still headed there. But in the meantime the Net’s going through a stage that will be The Last Days of TV. The optimistic view here is that they’ll also be the First Days of the Net.

Think of the original Net as the New World, circa 1491. Then think of TV as the Spanish invasion. Conquistadors! Then read this essay by Richard Rodriguez. My point is similar. TV won’t eat the Net. It can’t. It’s not big enough. Instead, the Net will swallow TV. Ten iPad generations from now, TV as we know it will be diffused into countless genres and sub-genres, with millions of non-Hollywood production centers. And the Net will be bigger than ever.

In the meantime, however, don’t hold your breath.

That meantime has now lasted nearly three years — or much longer if you go back to 1998, when I wrote a chapter of a book by Microsoft, right after they bought WebTV. An excerpt:

The Web is about dialog. The fact that it supports entertainment, and does a great job of it, does nothing to change that fact. What the Web brings to the entertainment business (and every business), for the first time, is dialog like nobody has ever seen before. Now everybody can get into the entertainment conversation. Or the conversations that comprise any other market you can name. Embracing that is the safest bet in the world. Betting on the old illusion machine, however popular it may be at the moment, is risky to say the least…

TV is just chewing gum for the eyes. — Fred Allen

This may look like a long shot, but I’m going to bet that the first fifty years of TV will be the only fifty years. We’ll look back on it the way we now look back on radio’s golden age. It was something communal and friendly that brought the family together. It was a way we could be silent together. Something of complete unimportance we could all talk about.

And, to be fair, TV has always had a very high quantity of Good Stuff. But it also had a much higher quantity of drugs. Fred Allen was being kind when he called it “chewing gum for the eyes.” It was much worse. It made us stupid. It started us on real drugs like cannabis and cocaine. It taught us that guns solve problems and that violence is ordinary. It disconnected us from our families and communities and plugged us into a system that treated us as a product to be fattened and led around blind, like cattle.

Convergence between the Web and TV is inevitable. But it will happen on the terms of the metaphors that make sense of it, such as publishing and retailing. There is plenty of room in these metaphors — especially retailing — for ordering and shipping entertainment freight. The Web is a perfect way to enable the direct-demand market for video goods that the television industry was never equipped to provide, because it could never embrace the concept. They were in the eyeballs-for-advertisers business. Their job was to give away entertainment, not to charge for it.

So what will we get? Gum on the computer screen, or choice on the tube?

It’ll be no contest, especially when the form starts funding itself.

Bet on Web/TV, not TV/Web.

I was recruited to write that chapter because I was the only guy Microsoft could find who thought the Web would eat TV rather than vice versa. And it does look like that’s finally happening, but only if you think Google is the Web. Or if you think Web sites are the new channels. In tech-speak, channels are silos.

When I wrote those pieces, I did not foresee the degree to which our use of the Net would be contained in silos that Bruce Schneier compares to feudal-age castles. Too much of the Web we know today is inside the walls governed by Lord Zuck, King Tim, Duke Jeff and the emperors Larry and Sergey. In some ways those rulers are kind and generous, but we are not free so long as we are native to their dominions rather than the boundless Networked world on which they sit.

The downside of depending on giants is that you can, and will, get screwed. Exhibit A (among too many for one alphabet) is Si Dawson’s goodbye post on Twitcleaner, a service countless people loved, to which he devoted his life, and which “was an engineering marvel built, as it were, atop a fail-whaling ship.” When Twitter “upgraded” its API, it sank Twitcleaner and many other services built on Twitter. Writes Si, “Through all this I’ve learned so, so much. Perhaps the key thing? Never play football when someone else owns the field. So obvious in hindsight.”

Now I’m having the same misgivings about Dropbox, which works as what Anil Dash calls a POPS: Privately Owned Public Space. It’s a great service, but it’s also a private one. And therefore risky like Twitter is risky.

What happened with all those companies was a morphing of mission from a way to the way:

  • Google was a way to search, and became the way to search
  • Facebook was a way to be social on the Web, and became the way to be social on the Web
  • Twitter was a way to microblog, and became the way to microblog

I could go on, but you get the idea.

What makes the Net and the Web open and free are not their physical systems, or any legal system. What makes them free are their protocols, which are nothing more than agreements: the machine equivalents of handshakes. Protocols do not by their nature presume a centralized system, like TV — or like giant Web sites and services. Protocols are also not corruptible, because they are each NEA: Nobody owns it, Everybody can use it and Anybody can improve it.
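To make the “handshake” point concrete, here is a minimal sketch of one such agreement, HTTP/1.0: a request is just agreed-upon bytes, and the response is parsed by the same agreement. No central party grants permission for either side to speak the protocol. (The canned response below stands in for a real server, to keep the sketch self-contained.)

```python
# A protocol is just an agreement about bytes. Any two endpoints that
# follow the agreement can interoperate -- no owner, no gatekeeper.

def build_request(host: str, path: str = "/") -> bytes:
    """Compose a minimal HTTP/1.0 GET request."""
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            "\r\n").encode("ascii")

def parse_status(response: bytes) -> int:
    """Pull the numeric status code from a response's first line."""
    status_line = response.split(b"\r\n", 1)[0]  # e.g. b"HTTP/1.0 200 OK"
    return int(status_line.split(b" ")[1])

req = build_request("example.com")
canned = b"HTTP/1.0 200 OK\r\nContent-Length: 0\r\n\r\n"
print(parse_status(canned))  # -> 200
```

Nothing above required asking anyone; that is the whole point of NEA.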

Back in 2003, David Weinberger and I wrote about protocols and NEA in a site called World of Ends: What the Internet Is and How to Stop Mistaking It For Something Else. In it we said the Net was defined by its protocols, not by the companies providing the wiring and the airwaves over which we access the Net.

Yet, a decade later, we are still mistaking the Net for TV. Why? One reason is that there is so much more TV on the Net than ever before. Another is that we get billed for the Net by cable and phone companies. For cable and phone companies providing home service, it’s “broadband” or “high speed Internet.” For mobile phone companies, it’s a “data plan.” By whatever name, it’s one great big channel: a silo open at both ends, through which “content” gets piped to “consumers.” To its distributors — the ones we pay for access — it’s just another kind of cable TV.

The biggest player in cable is not Comcast or Time Warner. It’s ESPN. That’s because the most popular kind of live TV is sports, and ESPN runs that show. Today, ESPN is moving aggressively to mobile. In other words, from cable to the Net. Says Bloomberg Businessweek,

ESPN has been unique among traditional media businesses in that it has flourished on the Web and in the mobile space, where the number of users per minute, which is ESPN’s internal metric, reached 102,000 in June, an increase of 48 percent so far this year. Mobile is now ESPN’s fastest-growing platform.

Now, in ESPN Eyes Subsidizing Wireless-Data Plans, the Wall Street Journal reports, “Under one potential scenario, the company would pay a carrier to guarantee that people viewing ESPN mobile content wouldn’t have that usage counted toward their monthly data caps.” If this happens, it would clearly violate the principle of network neutrality: that the network itself should not favor one kind of data, or data producer, over another. Such a deal would instantly turn every competing data producer into a net neutrality activist, so it’s not likely to happen.

Meanwhile John McCain, no friend of net neutrality, has introduced the TV Consumer Freedom Act, which is even less friendly to cable. As Business Insider puts it, McCain wants to blow the sucker up. Says McCain,

This legislation has three principal objectives: (1) encourage the wholesale and retail ‘unbundling’ of programming by distributors and programmers; (2) establish consequences if broadcasters choose to ‘downgrade’ their over-the-air service; and (3) eliminate the sports blackout rule for events held in publicly-financed stadiums.

For over 15 years I have supported giving consumers the ability to buy cable channels individually, also known as ‘a la carte’ – to provide consumers more control over viewing options in their home and, as a result, their monthly cable bill.

The video industry, principally cable companies and satellite companies and the programmers that sell channels, like NBC and Disney-ABC, continue to give consumers two options when buying TV programming: First, to purchase a package of channels whether you watch them all or not; or, second, not purchase any cable programming at all.

This is unfair and wrong – especially when you consider how the regulatory deck is stacked in favor of industry and against the American consumer.

Unbundle TV, make it à la carte, and you have nothing more than subscription video on the Net. And that is what TV will become. If McCain’s bill passes, we will still pay Time Warner and Comcast for connections to the Net; and they will continue to present a portfolio of à la carte and bundled subscription options. Many video sources will continue to be called “networks” and “channels.” But it won’t be TV 4.0 because TV 3.0 — TV over IP — will be the end of TV’s line.

Shows will live on. So will producers and artists and distributors. The old TV business will be as creative as ever, and will produce more good stuff than ever. Couch potatoes will live too, but there will be many more farmers, and the fertilizer will abound in variety.

What we’ll have won’t be TV because TV is channels, and channels are scarce. The Net has no channels, and isn’t about scarcity. It just has an endless number of ends, and no limit on the variety of sources pumping out “content” from those ends. Those sources include you, me, and everybody else who wants to produce and share video, whether for free or for pay.

The Net is an environment built for abundance. You can put all the scarcities you want on it, because an abundance-supporting environment allows that. An abundance system such as the Net gives business many more ways to bet than a scarcity system such as TV has been from the antenna age on through cable. As Jerry Michalski says (and tweets), “#abundance is pretty scary, isn’t it? Yet it’s the way forward.”

Abundance also frees all of us personally. How we organize what we watch should be up to us, not up to cable systems compiling their own guides that look like spreadsheets, with rows of channels and columns of times. We can, and should, do better than that. We should also do better than what YouTube gives us, based on what its machines think we might want.

The new box to think outside of is Google’s. So let’s re-start there. TV is what it’s always been: dumb and terminal.

 

artifacty HD

[Later (7 April)... The issue has been resolved, at least for now. We never did figure out what caused the poor video resolution in this case, but it looks better now. Still, it seems that compression artifacts are a mix of feature and bug for both cable and satellite television. One of these weeks or months I'll study it in more depth. My plan now is just to enjoy watching the national championship game tomorrow night, between Louisville and Michigan.]

What teams are playing here? Can you read the school names? Recognize any faces?  Is that a crowd in the stands or a vegetable garden? Is the floor made of wood or ice?

You should be able to tell at least some of those things on an HD picture from a broadcast network. But it ain’t easy. Not any more. At least not for me.

Used to be I could tell, at least on Dish Network, which is one reason I got it for our house in Santa Barbara. I compared Dish’s picture on HD channels with those of Cox, our cable company, and it was no contest. DirecTV was about its equal, but had a more complicated remote control and cost a bit more. So we went with Dish. Now I can’t imagine Cox — or anybody — delivering a worse HD picture.

The picture isn’t bad just on CBS, or just during games like this one. It sucks on pretty much all the HD channels. The quality varies, but generally speaking it has gone downhill since we first got our Sony Bravia 1080p “Full HD” screen in 2006. It was the top-of-the-line model then, and I suppose it still looks good, even though it’s hard to tell, since Dish is our only TV source.

Over-the-air (OTA) TV looks better when we can get it, though it’s hardly perfect. Here’s what the Rose Bowl looked like from KGTV in San Diego when I shot photos of it on New Year’s Day of 2007. Same screen. You can see some compression artifacts in this close-up here and this one here; but neither is as bad as what we see now. (Since I shot those, KGTV and the CBS affiliate in San Diego, KFMB, moved down from the UHF to the VHF band, so my UHF antenna no longer gets them. Other San Diego stations with UHF signals still come in sometimes and look much better than anything from Dish.)

So why does the picture look so bad? My assumption is that Dish, to compete with cable and DirectTV, maximizes the number of channels it carries by compressing away the image quality of each. But I could be wrong, so I invite readers (and Dish as well) to give me the real skinny on what’s up with this.
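That guess is easy to sketch with rough arithmetic. Assuming (illustrative round numbers, not Dish's actual figures) a satellite transponder carries on the order of 40 Mbps and an operator splits it among several HD channels, the bits available per pixel per channel fall off fast as channels are packed in:

```python
# Illustrative arithmetic only: the transponder capacity and frame
# numbers below are round assumptions, not measured Dish figures.
TRANSPONDER_MBPS = 40.0          # rough DVB-S-class transponder payload
PIXELS_PER_FRAME = 1920 * 1080   # one "Full HD" frame
FRAMES_PER_SEC = 30

def bits_per_pixel(channels):
    """Average compressed bits available per pixel for each channel
    when one transponder is split among `channels` HD channels."""
    per_channel_bps = TRANSPONDER_MBPS * 1e6 / channels
    return per_channel_bps / (PIXELS_PER_FRAME * FRAMES_PER_SEC)

for n in (2, 4, 6, 8):
    print(f"{n} HD channels/transponder: {bits_per_pixel(n):.3f} bits/pixel")
```

The curve is the point, not the exact numbers: halving the per-channel bitrate roughly halves the bit budget the encoder has for every pixel, and the artifacts show up first in fast motion, like a basketball game.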

And, because I’m guessing some of you will ask: No, this isn’t standard-def that I’m mistaking for high-def. This really is the HD stream from the station.

[Later...] I heard right away from @Dish_Answers. That was quick. We’ll see how it goes.

Nearly all smartphones today are optimized to do three things for you:

  1. Run apps
  2. Speak to other people
  3. Make you dependent on a phone company

The first two are features. The third is a bug. In time that bug will be exterminated. Meanwhile it helps to look forward to what will happen with #1 and #2 once they’re liberated from #3.

Both features are personal. That’s key. Our smartphones (or whatever we end up calling them) should be as personal as our clothing, wallets and purses. In other words, they should work as extensions of ourselves.

When this happens, they will have evolved into what Martin Kuppinger calls life management platforms, good for all these things —

— in addition to the stuff already made possible by the zillion apps already out there.

What kinds of smartphones are in the best position to evolve into Life Management Platforms? The short answer is: open ones. The longer answer is: open ones that are already evolving and have high levels of adoption.

Only one platform qualifies, and that’s Android. Here’s what Wikipedia says (as of today) about Android’s open-ended evolutionary position:

Historically, device manufacturers and mobile carriers have typically been unsupportive of third-party firmware development. Manufacturers express concern about improper functioning of devices running unofficial software and the support costs resulting from this.[81] Moreover, modified firmwares such as CyanogenMod sometimes offer features, such as tethering, for which carriers would otherwise charge a premium. As a result, technical obstacles including locked bootloaders and restricted access to root permissions are common in many devices. However, as community-developed software has grown more popular, and following a statement by the Librarian of Congress in the United States that permits the “jailbreaking” of mobile devices,[82] manufacturers and carriers have softened their position regarding third party development, with some, including HTC,[81] Motorola,[83] Samsung[84][85] and Sony Ericsson,[86] providing support and encouraging development. As a result of this, over time the need to circumvent hardware restrictions to install unofficial firmware has lessened as an increasing number of devices are shipped with unlocked or unlockable bootloaders, similar to the Nexus series of phones, although usually requiring that users waive their devices’ warranties to do so.[81] However, despite manufacturer acceptance, some carriers in the US still require that phones are locked down.[87]

The unlocking and “hackability” of smartphones and tablets remains a source of tension between the community and industry, with the community arguing that unofficial development is increasingly important given the failure of industry to provide timely updates and/or continued support to their devices.[87]

But the community doesn’t just argue. It moves ahead with implementations. For example, Ubuntu for Android and custom ROMs for Google’s Nexus 7.

The reason there is an aftermarket for Nexus hardware is that Google intended for Android to be open and generative from the start, pointedly saying that Nexus is “unlocked and contract free.” This is why, even though Google does lots of business with mobile phone company operators, it is those operators’ friend only to the degree it helps lead those operators past current customer-entrapment business models and into a future thick with positive economic externalities. Amidst those externalities, phone companies will still enjoy huge built-out infrastructure and other first-mover advantages. They will wake up and smell the infinity.

While Apple deserves huge credit for modeling what a smartphone should do, and how it should work (Steve Jobs was right to see Android as something of a knock-off), the company’s walled garden remains a monument of feudality. For a window on how that fails, read Barbara Lippert’s Samsung vs. Apple: Losing My Religion in MediaPost. Barbara is an admitted member of the “cult of Cupertino,” and is — along with droves of other Apple serfs — exiting the castle.

Samsung, however, just happens to be (deservedly) the maker of today’s most popular Androids. The Androids that win in the long run will be true life management platforms. Count on it.

For a window on that future, here are the opening paragraphs of  The Customer as a God, my essay in The Wall Street Journal last July:

It’s a Saturday morning in 2022, and you’re trying to decide what to wear to the dinner party you’re throwing that evening. All the clothes hanging in your closet are “smart”—that is, they can tell you when you last wore them, what else you wore them with, and where and when they were last cleaned. Some do this with microchips. Others have tiny printed tags that you can scan on your hand-held device.

As you prepare for your guests, you discover that your espresso machine isn’t working and you need another one. So you pull the same hand-held device from your pocket, scan the little square code on the back of the machine, and tell your hand-held, by voice, that this one is broken and you need another one, to rent or buy. An “intentcast” goes out to the marketplace, revealing only what’s required to attract offers. No personal information is revealed, except to vendors with whom you already have a trusted relationship.

Within a minute offers come in, displayed on your device. You compare the offers and pick an espresso machine to rent from a reputable vendor who also can fix your old one. When the replacement arrives, the delivery service scans and picks up the broken machine and transports it to the vendor, who has agreed to your service conditions by committing not to share any of your data with other parties and not to put you on a list for promotional messages. The agreement happened automatically when your intentcast went out and your terms matched up with the vendor’s.

Your hand-held is descended from what they used to call smartphones, and it connects to the rest of the world by whatever ambient connection happens to be available. Providers of commercial Internet connections still make money but not by locking customers into “plans,” which proved, years ago, to be more trouble than they were worth.

The hand-held itself is also uncomplicated. New technologies and devices are still designed by creative inventors, and there are still trade secrets. But prototyping products and refining them now usually involves actual users at every stage, especially in new versions. Manufacturers welcome good feedback and put it to use. New technology not only evolves rapidly, but appropriately. Ease of use is now the rule, not the exception.

OK, now back to the present.

Everything that I just described can be made possible only by the full empowerment of individuals—that is, by making them both independent of controlling organizations and better able to engage with them. Work toward these goals is going on today, inside a new field called VRM, for vendor relationship management. VRM works on the demand side of the marketplace: for you, the customer, rather than for sellers and third parties on the supply side.

It helps that Android is already huge. It will help more when makers of Android devices and apps squash the phone company dependency bug. It will also help that the “little square code” mentioned above already exists. For a pioneering example, see SquareTag.com. For examples of how individuals can program logical connections between other entities in the world, see Kynetx and Ifttt. (Kynetx is for developers. Ifttt is for users.)

As for investors, startups and incumbent big companies, it will help to start looking at the world from the perspective of the individual that each of us happens to be. The future is about liberating us, and equipping us with means for managing our lives and our relationships with other entities in the open marketplace. Personal independence and empowerment are what the PC, the Internet and the smartphone have all provided from the start. Trying to rein in that independence and empowerment comes naturally to big companies, and even some startups. But the vector of progress to the future has always been along the line of personal freedom and empowerment. Free customers will be more valuable than captive ones. Android’s success is already starting to prove that.

Mobile maps matter, and Apple now has the worst mapping you can get on a phone. The best, one would think (given the Apple vs. Google coverage), is Google’s. But maybe not, because Nokia has NAVTEQ, which rocks. Or so says Alexis Madrigal in The Atlantic, in a fascinating piece that visits just some of what NAVTEQ has been doing since 1985. For example, providing most of the maps you see on Garmin, Magellan and other legacy GPS companies.

This should be tempting for Apple. Here’s Alexis:

…if a certain tech giant with a massive interest in mobile content (Microsoft, Apple, Yahoo) were looking to catch up or stay even with Google, the company’s Location & Commerce unit might look like a nice acquisition they could get on the cheap (especially given that the segment lost 1.5 billion euros last year). Microsoft and Yahoo are already thick as thieves with Nokia’s mapping crew, but Apple is the company that needs the most help.

Tristan Louis makes the case as well:

So maps are now essential to one’s mobile strategy and Apple is behind. When you’re as far behind as they are, there are two ways you can get back to the table: you can either run like crazy and try to iterate your product at light speed or you can buy your way back at the table.

And what better company than the market leader if you are to make the investment? On top of it, Apple would get some interesting support for its AppleTV product.

Apple would get Nokia’s huge mobile tech patent portfolio, which includes a license to Qualcomm’s impressive collection. Tristan suggests that Nokia’s idle patents on mobile TV tech would also help Apple. No doubt it would. Let’s also remember that Google bought Motorola Mobility a short while back pretty much for the same reason: to get an edge in the “nuclear showdown” that patent-based tech wars tend to be. And mobile, alas, is a patent-based game.

The downside would be owning a struggling giant with lots of baggage Apple surely does not want. But Apple has to do something.

Nokia and Microsoft are deeply in bed, however, and both are unlikely to consider selling out to Apple, an enemy in the marketplace. (One can easily imagine Steve Ballmer going nuclear at the very thought of it.)

Eric Bleeker at Motley Fool responds to Tristan while laying out a number of possibilities. His conclusion: “The simple reality is that Apple will probably continue taking smaller bets on emerging technologies.”

Such as? In Yandex to Power Apple Maps, Alexander Vostrov of Russia Beyond the Headlines writes,

Russian software fans are glowing with pride, while analysts make the most improbable assumptions: the Russian IT giant Yandex has entered into a partnership with Apple and will have its Yandex Maps location service integrated with Apple’s new iOS 6 operating system.

This piece from June in The Verge also points to an attribution list at Apple. The page is copy-proof, so just go look at it. The list of data sources is long.

So how about OpenStreetMap? I don’t see them in the above list, but this OpenStreetMap Foundation blog post by Harry Wood on 2 October offers confirming evidence. Says Harry,

Apple’s new maps for iOS6 make use of OpenStreetMap in some parts of the world. We’re not sure how extensive this use is, but it’s fair to say they are mostly using other sources. Apple have used TomTom as a key supplier of data for example. This means that inaccuracies in apple maps are probably not the fault of OpenStreetMap (contrary to some commentary!) However OpenStreetMap is mentioned in apple’s credits, and we have spotted some areas where we think we can see our data in use.

This means your contributions to OpenStreetMap at least have a chance of helping Apple, along with everybody else. But, if you want to go direct to Apple, here’s the trick:

  1. Open Maps on your iOS device
  2. Go to a map view with a problem in it
  3. Lift the lower right (turned up) corner of the map
  4. Look for the very small gray-on-gray text above the Print button that says “Report a problem.” Click on that.
  5. Fill out the short form

I reported one of Apple’s absent subway stations, just to see how it works. (In fact, they’re all missing, and not just here in New York. I also saw none in London or Paris.)

Meanwhile, I continue to believe selling their own map apps on iOS would be good for Google, and Nokia as well.

[Later...] eWeek has what may be the best suggestion yet: get out of the maps business entirely. Let the maps companies give away or sell their own maps apps on the phone. If Nokia and Google decided not to, that would hurt Apple, but it would make them (especially Google) look like silo-building schmucks playing passive-aggressive games against a competitor.

Probably too late now. But maybe the open game is the only one for Apple to play now. Dunno though. Food for re-thought.

Having both iPhone and Android devices in the household, I’ve been struck for some time by the absence of two Google Maps features on the iPhone that appear on the Android. One is adaptive turn-by-turn directions (the “recalculating” thing that good GPSes, like those of Garmin, Magellan and TomTom, have always done) when you go off the original course. The other is vocalization of directions (which, again, single-purpose GPS devices do). Android devices have those. The iPhone doesn’t.

I had always thought that this difference was due to one of two things:

  1. Apple didn’t want those features
  2. Google didn’t want Apple devices to have those features, presumably to favor Android in user comparisons with iPhone

The second one makes more sense to me, especially since Apple dropped Google’s maps and added those missing features to its own maps.

But I don’t know. In fact, without an Android with me here in France I can’t compare the two. (Back in the U.S., where I’m headed today, I can.)

I’m not even sure I have the facts right on Android vs. Apple navigation.

What I am sure about is that coverage of the change so far is mostly missing possibilities one and two above. Anybody got the facts on that? Specifically, did Google intentionally cripple its maps on Apple devices to favor Androids? I haven’t seen that question asked yet. [Later... The answer, according to comments below, and also on Twitter, is no. Apparently #1 is the case.]

Meanwhile, Apple’s new maps are a fail for us here in Paris. I upgraded to iOS 6 and my wife didn’t, on our pair of iPhones. Her Google map shows Metro stops. My Apple map does not. Lacking those stops is a deal-killer for her, and she won’t be upgrading until it’s clear on my phone that the Apple maps have parity. I’ve got a feeling that will be a while.

Huge bonus link.

I want to drive on the Web, but instead I’m being driven. All of us are. And that’s a problem.

It’s not for lack of trying on the part of websites and services such as search engines. But they don’t make cars. They make stores and utilities that try to be personal, but aren’t, and never can be.

Take, for example, the matter of location. The Internet has no location, and that’s one of its graces. But sites and services want to serve, so many notice what IP address you appear to be arriving from. Then they customize their page for you, based on that location. While that might sound innocent enough, and well-intended, it also fails to know one’s true intentions, which matter far more to each of us than whatever a website guesses about us, especially if the guessing is wrong.

Last week I happened to be in New York when a friend in Toronto and I were both looking up the same thing on Google while we talked over Skype. We were unable to see the same thing, or anything close, on Google, because the engine insisted on giving us both localized results, which neither of us wanted. We could change our locations, but not to no location at all. In other words, it wouldn’t let us drive. We could only be driven.

Right now I’m in Paris, and cannot get Google to let me look at Google.com (presumably google.us), Google.uk or Google.anywhere other than France. At least not on its Web page. (If I use the location bar as a place to search, it gives me google.com results, for some non-obvious reason.)
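One partial workaround is to build the query URL yourself, passing Google's `gl` (country) and `hl` (interface language) search parameters instead of letting the geolocated front page choose for you. A minimal sketch (whether Google honors these hints in every case is another question):

```python
from urllib.parse import urlencode

def google_search_url(query, country="us", language="en"):
    """Build a Google search URL that asks for a specific country (gl)
    and interface language (hl) rather than the IP-guessed locale."""
    params = urlencode({"q": query, "gl": country, "hl": language})
    return f"https://www.google.com/search?{params}"

# e.g. search from Paris as if you were nowhere in particular but English-speaking:
print(google_search_url("prepaid wireless internet access"))
```

Two people on a Skype call could at least pass each other the same URL this way and compare like with like, instead of being driven to two different localized result pages.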

After reading Larry Magid’s latest in Huffpo, about the iPhone 5, which says this…

Gazelle.com is paying $240 for an iPhone 4s in good condition, which is $41 more than the cost of a subsidized iPhone 5. If you buy a new iPhone from Sprint they’ll buy back your iPhone 4s for $235. Trouble is, if you bought a 4s it’s probably still under contract. Sprint is paying $125 for an 8 GB iPhone 4 and Gazelle is paying $145 for a 16 GB iPhone 4, which means that if you can get the $199 upgrade price, your out of pocket could be as little as $54.

… I wondered what BestBuy might give me for my 16GB iPhone 4. But when I go to http://bestbuy.com, the company gives me a page in French. I guess that’s okay, but it’s still annoying. (So is seeing that I can’t get a trade-in price without visiting a store.)

Back in the search world, I’ve been looking for a prepaid wireless internet access strategy to get data at sane prices in the next few countries I visit. A search for “prepaid wireless internet access” on google.fr gets me lots of ads in French, some of which might be more interesting if I knew French as well as I know English, but I doubt it. The “I’m feeling lucky” result is a faux-useful SEO-elevated page with the same title as the search query. The rest of the first page results are useless as well. (I did eventually find a useful site for my UK visit the week after next, but I’ll save that for another post.)

To describe what the Web has become, two metaphors come to mind.

The first is a train system that mostly runs between commercial destinations. In a surreal way, you are transported from one destination to another near-instantly (or at the speed of a page loading ads and cookies along with whatever it was you went there for), and are trapped at every destination in a cabin with a view only of what the destination wants you to see. The cabin is co-occupied by dozens or hundreds of conductors at any given time, all competing for your attention and telling you something they hope will make you buy something or visit other sites. In the parlance of professionals on the supply side of this system, what you get here is an “experience” that they “deliver.” To an increasing degree this experience is personalized, and for every person it’s different. If you looked at pants a few sites back, you might see ads for pants, or whatever it is that the system thinks you might want to buy, whether you’re in a buying mood or not at the time. (And most of the time you’re not, but they don’t care about that.)

Google once aspired to give us access to “all the world’s information”, which suggests a library. But the library-building job is now up to Archive.org. Instead, Google now personalizes the living shit out of its search results. One reason, of course, is to give us better search results. But the other is to maximize the likelihood that we’ll click on an ad. But neither is served well by whatever it is that Google thinks it knows about us. Nor will it ever be, so long as we are driven, rather than driving.

I think what’s happened in recent years is that users searching for stuff have been stampeded by sellers searching for users. I know Googlers will bristle at that characterization, but that’s what it appears to have become, way too much of the time.

But that’s not the main problem. The main problem is that browsers are antique vehicles.

See, we need to drive, and browsers aren’t cars. They’re shopping carts that shape-shift with every site we visit. They are optimized for being inside websites, not for driving outside them, or between them. In fact, we can hardly imagine the Net or the Web as a space that’s larger than the sites in it. But we need to do that if we’re going to start designing means of self-transport that transcend the limitations of browsing and browsers.

Think about what it means to drive.  The cabin, steering wheel, pedals, controls, engine, tires and chassis of a car are all controlled by you. The world through which you move is outside, not inside. Even in malls, you park outside the stores. The stores do not intrude inside your personal space. Driving is no less personal and no less masterfully yours when you ride a bike or a motorcycle, or pilot a plane. Those are all personal vehicles too. A browser should have been like one of those, and that was kind of the idea back in the early days when we talked about “surfing” and the “information highway.” But it didn’t turn out that way. Instead browsers became shopping carts that get fresh skins at every website.

We need a new vehicle. One that’s ours.

The smartphone would be ideal if it wasn’t also a phone. But that’s what it is. With few exceptions, we rent smartphones from phone companies and equipment makers, which collude to sentence us to “plans” that last for two years at a run.

I had some hope for Android, but that hope is fading now. Although supporting general purpose hardware and software was one of Google’s basic ideas behind Android, that’s not how it’s turning out. Android in most cases is an embedded operating system on a special purpose device. In the most familiar U.S. cases (AT&T’s, Sprint’s, T-Mobile’s and Verizon’s) the most special purpose of that device is locking you to a plan and soaking you for some quantity of minutes, texts and GB of data, whether you use the full amounts or not, and then punishing you for going over. They play an asymmetrical knowledge game with you, where they can monitor your every move, and all your usage, while you can barely do the same, if at all.

So we have a long way to go before mobile phones become the equivalent of a car, a bicycle, a motorcycle or a small plane. I don’t think there is an evolutionary path to the Net’s equivalent of a car that starts with a smartphone. Unless it’s a computing and communication device first and a phone second.

The personal computing and communications revolution is thirty years old now, if we date it from the first IBM PC.  And right now we’re stuck, mostly because we think having the Web “personalized” is the same thing as having a personal vehicle. And because we think having a smartphone makes us independent. Neither is true. That’s why we won’t make progress past those problems until we start thinking and inventing outside their old boxes.

Apple TV (whatever it ends up being called) will kill cable. It will also give TV new life in a new form.

It won’t kill the cable companies, which will still carry data to your house, and which will still get a cut of the content action, somehow. But the division between cable content and other forms you pay for will be exposed for the arbitrary thing it is, in an interactive world defined by the protocols of the Internet, rather than by the protocols of television. It will also contain whatever deals Apple does for content distribution.

These deals will be motivated by a shared sense that Something Must Be Done, and by knowing that Apple will make TV look and work better than anybody else ever could. The carriers have seen this movie before, and they’d rather have a part in it than be outside of it. For a view of the latter, witness the fallen giants called Sony and Nokia. (A friend who worked with the latter called them “a tree laying on the ground,” adding “They put out leaves every year. But that doesn’t mean they’re standing up.”)

I don’t know anything about Apple’s plans. But I know a lot about Apple, as do most of us. Here are the operative facts as they now stand (or at least as I see them):

  1. Apple likes to blow up categories that are stuck. They did it with PCs, laptops, printers, mp3 players, smartphones, music distribution and retailing. To name a few.
  2. TV display today is stuck in 1993. That’s when the ATSC (which defined HDTV standards) settled on the 16:9 format, with 1080 pixels (then called “lines”) of vertical resolution, and with picture clarity and sound quality contained within the data carrying capacity of a TV channel 6MHz wide. This is why all “Full HD” screens remain stuck at 1080 pixels high, no matter how physically large those screens might be. It’s also why more and more stand-alone computer screens are now 1920 x 1080. They’re made for TV. Would Steve Jobs settle for that? No way.
  3. Want a window into the future where Apple makes a TV screen that’s prettier than all others sold? Look no farther than what Apple says about the new iPad‘s resolution:
  4. Cable, satellite and over-the-air channels are still stuck at 6MHz of bandwidth (in the original spectrum-based meaning of that word). They’re also stuck with a need to maximize the number of channels within a finite overall bandwidth. This has resulted in lowered image quality on most channels, even though the images are still, technically, “HD”. That’s another limitation that surely vexed Steve.
  5. The TV set makers (Sony, Vizio, Samsung, Panasonic, all of them) have made operating a simple thing woefully complicated, with controls (especially remotes) that defy comprehension. The set-top-box makers have all been nearly as bad for the duration. Same goes for the makers of VCR, DVD, PVR and other media players. Home audio-video system makers too. It’s a freaking mess, and has been since the ’80s.
  6. Steve at AllThingsD on 2 June 2010: “The only way that’s ever going to change is if you can really go back to square one and tear up the set-top-box and redesign it from scratch with a consistent UI, with all these different functions, and get it to the consumer in a way they are willing to pay for. We decided, what product do you want most? A better tv or a better phone? A better TV or a tablet? … The TV will lose until there is a viable go-to-market strategy. That’s the fundamental problem.” He also called Apple TV (as it then stood) a “hobby”, for that reason. But Apple is bigger now, and has far more market reach and clout. In some categories it’s nearly a monopoly already, with at least as much leverage as Microsoft ever had. And you know that Apple hasn’t been idle here.
  7. Steve Jobs was the largest stockholder in Disney. He’s gone, but the leverage isn’t. Disney owns ABC and ESPN.
  8. The main thing that keeps cable in charge of TV content is not the carriers, but ESPN, which represents up to 40% of your cable bill, whether you like sports or not. ESPN isn’t going to bypass cable — they’ve got that distribution system locked in, and vice versa. The whole pro sports system, right down to those overpaid athletes in baseball and the NBA, depends on TV revenues, which in turn rest on advertising to eyeballs over a system made to hold those eyeballs still in real time. “There are a lot of entrenched interests,” says Peter Kafka in this On the Media segment. The only thing that will de-entrench them is serious leverage from somebody who can make go-to-market, UI, quality, and money-flow work. Can Apple do that without Steve? Maybe not. But it’s still the way to bet.
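The arithmetic behind points 2 and 4 is worth a quick sketch. Here’s a back-of-envelope calculation in Python (the roughly 19.4 Mbps of usable payload in a 6 MHz ATSC channel is the standard figure; the frame rate and color depth are round-number assumptions):

```python
# Why "Full HD" pictures need heavy compression to fit a 6 MHz channel.
width, height = 1920, 1080   # the "Full HD" frame, fixed since 1993
fps = 30                     # round-number frame rate
bits_per_pixel = 24          # 8 bits each of red, green and blue

raw_bps = width * height * fps * bits_per_pixel  # uncompressed video
channel_bps = 19.4e6                             # usable ATSC payload

print(f"Uncompressed 1080/30: {raw_bps / 1e9:.2f} Gbps")
print(f"Compression required: {round(raw_bps / channel_bps)}:1")
```

That 77:1 squeeze is workable for one channel; pack several channels into the same finite payload, as operators do, and the ratio climbs further, which is the quality loss point 4 describes.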

Cable folks have a term for video distribution on the Net. They call it “over the top“. Of them, that is, and their old piped content system.

That’s actually what many — perhaps most — viewers would prefer: an à la carte choice of “content” (as we have now all come to say). Clearly the end state is one in which you’ll pay for some stuff while other stuff is free. Some of it will be live, and some of it recorded. That much won’t be different. The cable companies will also still make money for keeping you plugged in. That is, you’ll pay for data in any case. You’ll just pay more for some content. Much of that content will be what we now pay for on cable: HBO, ESPN and the rest. We’ll just do away with the whole bottom/top thing because there will be no need for a bottom other than a pipe to carry the content. We might still call some sources “channels”, and surfing through those might still have a TV-like UI. But only if Apple decides to stick with the convention. Which they won’t, if they come up with a better way to organize things, and make selections easy to make and pay for.

This is why the non-persuasiveness of Take My Money, HBO doesn’t matter. Not in the long run. The ghost of Steve is out there, waiting. You’ll be watching TV his way. Count on it.

We’ll still call it TV, because we’ll still have big screens by that name in our living rooms. But what we watch and listen to won’t be contained by standards set in 1993, or by carriers and other “stakeholders” who never could think outside the box.

Of course, I could be wrong. But no more wrong than the system we have now.

Bonus link.

Another.

The hard drive is crapping out on my main laptop. I’m backed up, so that much is cool. Installing a Seagate Momentus XT 750 GB drive later today. We’ll see how it goes.

[Later...] Lots of dependencies and such to clean up, but performance-wise, it’s like a new computer.

When our kid started using a computer in the seventh grade, I got him a copy of Mavis Beacon so he’d learn how to touch-type.

I didn’t see him using the program, but I did see him typing. So I asked him what was up with that. He said “I looked at it a couple of weeks ago. It was good.” I asked, “Did you learn to touch type from it?” “Sure,” he said. “It has tests. I used them. I did fine.”

So I asked him to show me. He did. First try: 30 words per minute. Second, 45 wpm.

I took typing in the seventh grade, which ran from September 1959 to June 1960. It was a year-long class, one period per day. My typewriter at school was an early-Fifties Underwood Rhythm-Touch like the one on the left. For practice at home my parents got me a WWII-era Underwood that looked exactly like the code machine.

I got an F in my first semester of typing class, because I made a lot of mistakes. I got a D in my second semester, for the same reason. For what it’s worth, I doubt anybody in that class has done more typing since then than I have. Or worn out more keyboards, such as the one on the right, which I’m using now.

My handwriting, long neglected, looks about as good.

Some old habits died hard. Here they are:

  • Returning the carriage when the bell rang, five spaces before the end of a line.
  • Wanting to set tabs the old-fashioned way, feeling the literal, physical insertion of a metal tab into the carriage path.
  • Double-spacing between sentences. Not doing this was my most common error, back in typing class.
  • Hyphenating long words at the ends of lines.
  • Indenting the first line of a paragraph, with a tab five spaces in.

For years I hated word processing without hyphens, and double-returns between paragraphs with no indents. But after a while I became accustomed to that new norm, and came to appreciate its benefits as well. (For example, when copying and pasting a bunch of text and not having to take out the hyphens and indents that only made sense in the old layout.) I also taught myself to restore my original proclivity to single spaces between sentences.

As for typing speed, I have no idea how fast I am now. What I love about not knowing is that it truly doesn’t matter.

Check the Arbitron radio listening ratings for Washington DC. You have to go waaaay down the list before you find a single AM station that isn’t also simulcast on FM. But then, if you go to the bottom of the list, you’ll also find a clump of Internet streams of local radio stations.

You’ll see the same pattern at other cities on this list from Radio-Info.com. FM on top, AM below, and streams at the bottom.

Together these paint an interesting picture. At the top, Innovators, at the bottom, Dilemma. (Some context, if the distinction isn’t obvious.)

Note that Pandora, Spotify, SiriusXM and other radio-like streaming services are not listed. Nor are podcasts or anything else one might listen to, including stuff on one’s smartphone, ‘pod or ‘pad. If they were, they’d be way up that list. According to Pandora CEO Joseph Kennedy (in this Radio INK piece),

…we have transitioned from being a small to medium sized radio station in every market in the U.S. to one of the largest radio stations in every market in the country. Based on the growth we continue to see, we anticipate that by the end of this year, we will be larger than the largest FM or AM radio station in most markets in the U.S. As a consequence, our relevance to buyers of traditional radio advertising is skyrocketing. We have already begun to see the early benefits of this dramatic change. Our audio advertising more than doubled to more than $100 million in fiscal 2012.

Back when I was in the biz, public radio was a similar form of dark matter in the ratings. If you added up all the stations’ shares, they came 10-13% short of 100%. If one went to Arbitron’s headquarters in Beltsville, Maryland (as many of us did) to look at the “diaries” of surveyed listeners, you’d find that most of the missing numbers were from noncommercial stations. Today those are listed, and the biggest are usually at or near the top of the ratings.

But today’s dark matter includes a variety of radio-like and non-radio listening choices, including podcasts, satellite radio, and what the industry calls “pure-play streamers” and “on-demand music services.” Together all of these are putting a huge squeeze on radio as we knew it. AM is still around, and will last longest in places where it’s still the best way to listen, especially in cars. In flat prairie states with high ground conductivity, an AM station’s signal can spread over enormous areas. For example, here is the daytime coverage map from Radio-Locator.com for 5000-watt WNAX/570am in Yankton, South Dakota:

WNAX Daytime coverage

And here’s the one for 50000-watt WBAP/820 in Dallas-Fort Worth:

WBAP coverage

No FM station can achieve the same range, and much of that flat rural territory isn’t covered by cellular systems, a primary distribution system for the data streams that comprise Internet radio.

True, satellite radio covers the whole country, but there are no local or regional radio stations on SiriusXM, the only company in the satellite radio business. To some degree rural places are also served by AM radio at night, when signals bounce off the ionosphere, and a few big stations — especially those on “clear” channels — can be heard reliably up to several thousand miles away. (Listen to good car radio at night in Hawaii and you’ll still hear many AM stations from North America.) But, starting in 1980, “clears” were only protected to 750 miles from their transmitters, and many new stations came on the air to fill in “holes” that really weren’t. As a result AM listening at night is a noisy mess on nearly every channel, once you move outside any local station’s immediate coverage area on the ground.

Even in Dallas-Fort Worth, where WBAP is the biggest signal in town (reaching from Kansas to the Gulf of Mexico, as you see above), WBAP is pretty far down in the ratings. (Copyright restrictions prevent direct quoting of ratings numbers, but at least we can link to them.) Same for KLIF and KRLD, two other AM powerhouses with coverage comparable to WBAP’s. News and sports, the last two staple offerings on the AM band, have also been migrating to FM. Many large AM news and sports stations in major metro areas now simulcast on FM, and some sound like they’re about to abandon their AM facilities entirely. WEEI in Boston no longer even mentions the fact that they’re on 850 on the AM dial. Their biggest competitor, WBZ-FM (“The Sports Hub”) is FM-only.

But while FM is finally beating AM, its ratings today look like AM’s back in the 1950s. FM wasn’t taken seriously by the radio industry then, even though it sounded much better, and also came in stereo. Today the over-the-air radio industry knows it is mightily threatened (as well as augmented, in some cases) by streaming and other listening choices. It also knows it’s not going to go away as long as over-the-air radio can be received in large areas where data streams cannot. It’s an open question, however, whether broadcasters will want to continue spending many thousands of dollars every month on transmitting signals that can no longer be justified financially.

One big question for radio is the same one that faces TV. That is, What will ESPN do?

ESPN is the Giant Kahuna that’s keeping millions of listeners on AM and FM radio, and viewers on cable and satellite, who would leave if the same content were streamed directly over the Net.

Right now ESPN appears to be fine with distributing its programming through cable and local radio. But at some point ESPN is likely to go direct and avoid the old distribution methods — especially if listeners and viewers would rather have it that way.

On cable ESPN’s problem will be that the distribution will still largely be through cable and phone companies that will wish to be paid for the carriage. That’s a two-sided model that applies now only to TV and satellite radio, but not to anything traveling over the Net, which the cable folks call “Over The Top,” or OTT. (I’m guessing that ESPN already pays for that, in a limited way, through Akamai, Level 3, Limelight and other Content Distribution Networks, or CDNs, which serve a role you might call, in broadcast terms, that of local transmitters. Some cable companies, I am sure, do the same. It’s a complicated situation.) If, say, Comcast and Verizon start offering mobile Internet services that are just Facebook, Google+, Twitter and ESPN, they will have kept ESPN from going OTT, and brought Facebook, Google+ and Twitter into the bottom. And, in the process, we will have moved a long way toward the “fully licensed world” I warned about, two posts back. (Interesting that ESPN and others want Arbitron to do “cross-platform measurement”, even as it continues to help make the case for AM and FM radio.)

Regardless of how that goes, AM and FM are stuck in a tunnel, facing the headlights of a content distribution train that they need to embrace before it’s too late.

@ChunkaMui just put up a great post in Forbes: Motorola + Sprint = Google’s AT&T, Verizon and Comcast Killer.

Easy to imagine. Now that Google has “gone hardware” and “gone vertical” with the Motorola deal, why not do the same in the mobile operator space? It makes sense.

According to Chunka, this new deal, and the apps on it,

…would destroy the fiction that internet, cellular and cable TV are separate, overlapping industries. In reality, they are now all just applications riding on top of the same platform. It is just that innovation has been slowed because two slices of those applications, phone and TV, are controlled by aging oligopolies.

AT&T and Verizon survive on the fiction that mobile text and voice are not just another form of data, and customers are charged separately (and exorbitantly) for them. They are also constraining mobile data bandwidth and usage, both to charge more and to manage the demand that their aging networks cannot handle.

Comcast, Time Warner Cable and other cable operators still profit from the fact that consumers have to purchase an entire programming package in order to get a few particular slices of content. This stems from the time when cable companies had a distribution oligopoly, and used that advantageous position to require expensive programming bundles. Computers, phones and tablets, of course, are now just alternative TV screens, and the Internet is an alternative distribution mechanism. It is just a matter of time before competitors unbundle content, and offer movies, sports, news and other forms of video entertainment to consumers.

The limiting factor to change has not been the technology but obsolete business models and the lack of competition.

Before Apple and Google came in, the mobile phone business was evolving at a geological pace. I remember sitting in a room, many years back, with Nokia honchos and a bunch of Internet entrepreneurs who had just vetted a bunch of out-there ideas. One of the top Nokia guys threw a wet blanket over the whole meeting when he explained that he knew exactly what new features would be rolled out on new phones going forward two and three years out, and that these had been worked out carefully between Nokia and its “partners” in the mobile operator business. It was like getting briefed on agreements between the Medici Bank and the Vatican in 1450.

Apple blasted through that old market like a volcano, building a big, vertical, open (just enough to invite half a billion apps) market silo that (together with app developers) completely re-defined what a smartphone — and any other handheld device — can do.

But Apple’s space was still a silo, and that was a problem Google wanted to solve as well. So Google went horizontal with Android, making it possible for any hardware maker to build anything on a whole new (mostly) open mobile operating system. As Cory Doctorow put it in this Guardian piece, Android could fail better, and in more ways, than Apple’s iOS.

But the result for Google was the same problem that Linux had with mobile before Android came along: the market plethorized. There were too many different Android hardware targets. While Android still attracted many developers, it also made them address many differences between phones by Samsung, Motorola, HTC and so on. As Henry Blodget put it here,

Android’s biggest weakness thus far has been its fragmentation: The combination of many different versions, plus many different customizations by different hardware providers, has rendered it a common platform in name only. To gain the full power of “ubiquity”–the strategy that Microsoft used to clobber Apple and everyone else in the PC era–Google needs to unify Android. And perhaps owning a hardware company is the only way to do that.

That’s in response to the question, “Is this an acknowledgment that, in smartphones, Apple’s integrated hardware-software solution is superior to the PC model of a common software platform crossing all hardware providers?” Even if it’s not (and I don’t think it is), Google is now in the integrated hardware-software mobile device business. And we can be sure that de-plethorizing Android is what Larry Page means when he talks about “supercharging” the Android ecosystem.

So let’s say the scenario that Chunka describes actually plays out — and then some. For example, what if Google buys, builds or rents fat pipes out to Sprint cell sites, and either buys or builds its way into the content delivery network (CDN) business, competing with while also supplying Akamai, Limelight and Level3? Suddenly what used to be TV finishes moving “over the top” of cable and onto the Net. And that’s just one of many other huge possible effects.

What room will be left for WISPs, which may be the last fully independent players out there?

I don’t know the answers. I do know that just the thought of Google buying Sprint will fire up the lawyers and lobbyists for AT&T, Comcast and Verizon.

 

Last week I flew back and forth from Boston to Reno by way of Phoenix. Both PHX-RNO legs took me past parts of Nevada I hadn’t had a good look at before. One item stood out: a dry lake that looked, literally, like a town had been built on it and blown up. In fact, this was the case. The lake was Frenchman Lake, on Frenchman Flat, a valley in a part of the desert known as the Nevada Test Site. The town was nicknamed “Doom Town,” and it was built to see what would happen to it in an atomic blast. Here’s a video that shows the results.

In fact more than a dozen blasts rocked the Doom Town area, starting with Able, in 1951 — the first at the Test Site.

This shot shows Yucca Lake and Yucca Flat, which has many dozens of subsidence craters where underground blasts have gone off. This Google Maps view shows the same from above. All the blasts look like rows of dimples in the desert. But some are hundreds of feet across. Before reading about underground nuclear testing, I had thought that all the tests were deep enough to avoid surface effects.

This shot looks across the Test Site to Area 51. Amazing place. Some of what they say about it may even be true. By the way, that shot was taken (I just checked) from almost 100 miles away. I used a Canon 5D and a zoom telephoto lens set to 200mm.

This graphic, of Apple’s revenues per quarter, broken down by products, tells several stories at once. One is that the iPhone remains huge. (I was amazed by how many I saw in the UK and France.) Another is that the iPod may be getting a bit stale. But the big one is the sudden size of the iPad business.

We have one, a 3G model that arrived when we were in Paris in June. It was nice-to-have but something short of its full promise until a friend in Paris got us a 2Gb SIM so the unit became useful outside of our apartment’s wi-fi zone. (Orange, Apple’s carrier partner in France, requires of Americans a French bank account — just one of many vexing problems with 3G outside anybody’s home country. It’s a freaking mess.) With that SIM, the difference became absolute. Now we could look at maps, shop, and read about topics of immediate local interest, live and on the spot, anywhere. (Even in the subways.) The iPad is much faster than the iPhone and much more convenient than a laptop or a netbook. Form-factor wise, it’s a whole new category.

The question is, can anybody else top it, or even compete with it? Certainly somebody should. Here’s what I’d recommend.

First, a second unit with a smaller form-factor: about half or two thirds the size of the iPad. There’s a need for something that’s bigger than a phone but smaller than the current iPad, which is a bit too large for most purses.

Second, freedom from anybody’s silo. Apple has done its vertical thing here. Now it’s time for the horizontal one. In product categories, the horizons are always wider than the skies are high.

Third, featuring the 3G or 4G model, rather than regarding it as a premium exception. This also means working energetically to expose and break down the national boundaries to mobile carrier data plans. We desperately need the phone system to become a data system that also does telephony, rather than the reverse. (More about those in another post.)

Fourth, better speaker(s). The iPad actually sounds quite good, for a speaker that talks out of the same flat hole that’s plugged by the power connector (just like the iPhone).

Fifth, two microphones, for binaural recording. This is hugely under-rated as a feature, and generally ignored by portable gear makers. With binaural recording, you get a you-are-there sound field when listening to the recording with headphones. Related idea: two cameras, for shooting in 3D. The latter would also be a cool peripheral.

Sixth, make the ‘pad a production and not just a consumption device. Shooting and/or editing video, and uploading it to a server on the spot, would be a way cool use for the thing.

Of course, consumer electronics makers are notorious copy-cats. But what they need to do is zig here where Apple zags. There’s infinite room.

There’s only one way to justify Internet data speeds as lopsided as the one to the left.

Television.

It’s an easy conclusion to draw here at our borrowed Parisian apartment, where the Ethernet cable serving the laptop comes from a TV set top box. As you see, the supplier is FreeSAS, or just http://free.fr.

I don’t know enough French to interpret that page, or the others in Free’s tree, but the pictures and pitches speak loudly enough. What Free cares about most is television. Same is true for its customers, no doubt.

Television is deeply embedded in pretty much all developed cultures by now. We — and I mean this in the worldwide sense — are not going to cease being couch potatoes. Nor will our suppliers cease couch potato farming, even as TV moves from airwaves to cable, satellite, and finally the Internet.

In the process we should expect the spirit (if not also the letter) of the Net’s protocols to be violated.

Follow the money. It’s not for nothing that Comcast wishes to be in the content business. In the old cable model there’s a cap on what Comcast can charge, and make, distributing content from others. That cap is its top cable subscription deals. Worse, they’re all delivered over old-fashioned set top boxes, all of which are — as Steve Jobs correctly puts it — lame. If you’re Comcast, here’s what ya do:

  1. Liberate the TV content distro system from the set top sphincter.
  2. Modify or re-build the plumbing to deliver content to Net-native (if not entirely -friendly) devices such as home flat screens, smartphones and iPads.
  3. Make it easy for users to pay for any or all of it on an à la carte (or at least an easy-to-pay) basis, and/or add a pile of new subscription deals.

Now you’ve got a much bigger marketplace, enlarged by many more devices and much less friction on the payment side. (Put all “content” and subscriptions on the shelves of “stores” like iTunes’ and there ya go.) Oh, and the Internet? … that World of Ends that techno-utopians (such as yours truly) liked to blab about? Oh, it’s there. You can download whatever you want on it, at higher speeds every day, overall. But it won’t be symmetrical. It will be biased for consumption. Our job as customers will be to consume — to persist, in the perfect words of Jerry Michalski, as “gullets with wallets and eyeballs.”

Future of the Internet

So, for current and future build-out, the Internet we techno-utopians know and love goes off the cliff while better rails get built for the next generations of TV — on the very same “system.” (For the bigger picture, Jonathan Zittrain’s latest is required reading.)

In other words, it will get worse before it gets better. A lot worse, in fact.

But it will get better, and I’m not saying that just because I’m still a utopian. I’m saying that because the new world really is the Net, and there’s a limit to how much of it you can pave with one-way streets. And how long the couch potato farming business will last.

More and more of us are bound to produce as well as consume, and we’ll need two things that a biased-for-TV Net can’t provide. One is speed in both directions: out as well as in. (“Upstream” calls Sisyphus to mind, so let’s drop that one.) The other is what Bob Frankston calls “ambient connectivity.” That is, connectivity we just assume.

When you go to a hotel, you don’t have to pay extra to get water from the “hydro service provider,” or electricity from the “power service provider.” It’s just there. It has a cost, but it’s just overhead.

That’s the end state. We’re still headed there. But in the meantime the Net’s going through a stage that will be The Last Days of TV. The optimistic view here is that they’ll also be the First Days of the Net.

Think of the original Net as the New World, circa 1491. Then think of TV as the Spanish invasion. Conquistadors! Then read this essay by Richard Rodriguez. My point is similar. TV won’t eat the Net. It can’t. It’s not big enough. Instead, the Net will swallow TV. Ten iPad generations from now, TV as we know it will be diffused into countless genres and sub-genres, with millions of non-Hollywood production centers. And the Net will be bigger than ever.

In the meantime, however, don’t hold your breath.

Tomorrow we fly to Paris, where I’ll be based for the next five weeks. To help myself prep, here are a few of my notes from conversations with friends and my own inadequate research…

Mobile phone SIM recommendations are especially welcome. We plan to cripple our U.S. iPhones for the obvious reasons AT&T details here. Our other phones include…

  • Android Nexus One (right out of the box)
  • Nokia E72 (it’s a Symbian phone)
  • Nokia N900 (a computing device that does have a SIM slot and can be used as a phone)
  • Nokia 6820b (an old Nokia candybar-shaped GSM phone that hasn’t been used in years, but works)

Ideally we would like to go to a mobile phone store that can help us equip some combination of these things, for the time we’re there. The iPad too, once it arrives. It will be a 3G model.

Au revoir…

[Later...] We’re here, still jet-lagged and settling in. Here are some other items we could use some advice on:

  • “Free” wi-fi. This is confusing. There seem to be lots of open wi-fi access points in Paris, but all require logins and passwords. Our French is still weak at best, so that’s a bit of a problem too. One of the services is called Free, which also happens to be the company that provides TV/Internet/Phone service in the apartment. Should this also give us leverage with the Free wi-fi out there? Not sure. (Internet speed is 16.7Mbps down and .78Mbps up. It’s good enough, but not encouraging for posting photos. I’m also worried about data usage caps. Guidance on that is welcome too.)
  • Our 200-watt heavy-duty 220/110 step-down power transformer crapped out within two hours after being plugged in. We want to get a new one that won’t fail. The dead one is a Tacima.
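For what it’s worth, the “not encouraging for posting photos” remark above reduces to simple arithmetic. A quick sketch (the 5 MB photo size is my own illustrative assumption):

```python
# How lopsided 16.7 Mbps down / 0.78 Mbps up feels when posting photos.
down_mbps, up_mbps = 16.7, 0.78
photo_mb = 5                   # assumed photo size, in megabytes
photo_megabits = photo_mb * 8  # 8 bits per byte

download_s = photo_megabits / down_mbps
upload_s = photo_megabits / up_mbps

print(f"Downloading one photo: {download_s:.1f} seconds")
print(f"Uploading one photo:   {upload_s:.1f} seconds")
print(f"Down is {round(down_mbps / up_mbps)}x faster than up")
```

A couple of seconds down, nearly a minute up, per photo; multiply by a vacation’s worth of pictures and the connection’s bias toward consumption is plain.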

Again, thanks for all your help.


David Siegel, author of the excellent new book Pull, shares with me an abiding frustration with all major camera makers — especially the Big Two: Canon and Nikon: they’re silos. They require lenses that work only on their cameras and nobody else’s. In Vendor Lock-in FAIL David runs down the particulars. An excerpt:

If you have a Canon body, you’re probably going to buy Canon lenses. Why? Not because they are the best, but because they are the only lenses Canon bodies can autofocus. Canon keeps this interface between body and lens proprietary, to keep Canon owners buying more Canon lenses and prevent them from using third-party lenses. A company called Zeiss makes better lenses than Canon does, but Canon won’t license the autofocus codes to Zeiss at any price, because Canon executives know that many of their customers would switch and buy Zeiss lenses and they would sell fewer Canon lenses. The same goes for Nikon. And it’s true – we would.

I didn’t know that Canon froze out Zeiss. Canon doesn’t freeze out Sigma and Tamron, both of which make compatible lenses for both Canon and Nikon (many of them, in fact). Zeiss makes three lenses for Sony cameras but none for Canon and Nikon. I had assumed that Zeiss had some kind of exclusive deal with Sony.

In any case, photographers have long taken camera maker lock-in for granted. And there is history here. Backwards compatibility has always been a hallmark of Nikon with the F-mount, which dates back to 1959. Would Nikon photographers want the company to abandon its mount for lens compatibility with Canon and others? I kinda think not, but I don’t know. I’ve been a Canon guy, like David, since 2005. I shoot a lot, but I don’t have a single lens that a serious photographer would consider good. For example, I own not one L-series lens. (Those are Canon’s best.) All my lenses I bought cheap and/or used (or, in one case, received as a gift). I was a Nikon guy back in the 70s and 80s, but my gear (actually, my company’s gear, but I treated it like my own) all got stolen. Later I was a Pentax guy, but all that stuff got stolen too. Then I was a Minolta guy, which I stayed until Minolta went out of business (basically getting absorbed into Sony, a company that could hardly be more proprietary and committed to incompatibility). I decided to dabble in digital in 2005, with a Nikon point-and-shoot (the CoolPix 5700, which had great color and an awful UI). I went with Canon for my first (and still only) SLR, an EOS 30D. (I also use a full-frame EOS 5D, but I won’t consider it mine until I’m done paying for it. Meanwhile none of my old lenses work right on it — they all have vignetting — another source of annoying incompatibility.)

Anyway, I do sympathize with David here:

While Nikon and Canon will both say they need to keep their proprietary interfaces to make sure the autofocus is world-class, they are both living in an old-world mentality. The future is open. Some day, you’ll be able to put a Canon lens right on a Nikon body and it will work fine. And you’ll be able to put a Zeiss lens on and it will work even better. But that day is far off. It will only come when the two companies finally realize the mistake they are making with their arms race now and start to talk openly about a better long-term solution.

Stephen Lewis (who is a serious photographer) and I have talked often about the same problem [later... he says I got this (and much else) wrong, in this comment], and we also look toward the future with some degree of hope. As for faith, I dunno. As companies that are set in their ways go, it’s hard to beat the camera makers.


I was just interviewed for a BBC television feature that will run around the same time the iPad is launched. I’ll be a talking head, basically. For what it’s worth, here’s what I provided as background for where I’d be coming from in the interview:

  1. The iPad will arrive in the market with an advantage no other completely new computing device for the mass market has ever enjoyed: the ability to run a 100,000-app portfolio that’s already developed, in this case for the iPhone. Unless the iPad is an outright lemon, this alone should assure its success.
  2. The iPad will launch a category within which it will be far from the only player. Apple’s feudal market-control methods (all developers and customers are trapped within its walled garden) will encourage competitors that lack the same limitations. We should expect other hardware companies to launch pads running on open source operating systems, especially Android and Symbian. (Disclosure: I consult for Symbian.) These can support much larger markets than Apple’s closed and private platforms alone will allow.
  3. The first versions of unique hardware designs tend to be imperfect and get old fast. Such was the case with the first iPods and iPhones, and will surely be the case with the first iPads as well. The ones being introduced next week will seem antique one year from now.
  4. Warning to competitors: copying Apple is always a bad idea. The company is an example only of itself. There is only one Steve Jobs, and nobody else can do what he does. Fortunately, he only does what he can control. The rest of the market will be out of his control, and it will be a lot bigger than what fits inside Apple’s beautiful garden.

I covered some of that, and added a few things, which I’ll enlarge with a quick brain dump:

  1. The iPad brings to market a whole new form factor that has a number of major use advantages over smartphones, laptops and netbooks, the largest of which is this: it fits in a purse or any small bag — where it doesn’t act just like any of those other devices. (Aside from running all those iPhone apps.) It’s easy and welcoming to use — and its uses are not subordinated, by form, to computing or telephony. It’s an accessory to your own intentions. This is an advantage that gets lost amidst all the talk about how it’s little more than a new display system for “content.”
  2. My own fantasy for tablets is interactivity with the everyday world. Take retailing for example. Let’s say you syndicate your shopping list, but only to trusted retailers, perhaps through a fourth party (one that works to carry out your intentions, rather than sellers’ — though it can help you engage with them). You go into Target and it gives you a map of the store, where the goods you want are, and what’s in stock, what’s not, and how to get what’s missing, if they’re in a position to help you with that. You can turn their promotions on or off, and you can choose, using your own personal terms of service, what data to share with them, what data not to, and conditions of that data’s use. Then you can go to Costco, the tire store, and the university library and do the same. I know it’s hard to imagine a world in which customers don’t have to belong to loyalty programs and submit to coercive and opaque terms of data use, but it will happen, and it has a much better chance of happening faster if customers are independent and have their own tools for engagement. Which are being built. Check out what Phil Windley says here about one approach.
  3. Apple works vertically. Android, Symbian, Linux and other open OSes, with the open hardware they support, work horizontally. There is a limit to how high Apple can build its walled garden, nice as it will surely be. There is no limit to how wide everybody else can make the rest of the marketplace. For help imagining this, see Dave Winer’s iPad as a Coral Reef.
  4. Content is not king, wrote Andrew Odlyzko in 2001. And he’s right. Naturally, big publishers (New York Times, Wall Street Journal, the New Yorker, Condé Nast, the Book People) disagree. Their fantasy is the iPad as a hand-held newsstand (where, as with real-world newsstands, you have to pay for the goods). Same goes for the TV and movie people, who see the iPad as a replacement for their old distribution systems (also for pay). No doubt these are Very Big Deals. But how the rest of us use iPads (and other tablets) is a much bigger deal. Have you thought about how you’ll blog, or whatever comes next, on an iPad? Or on any tablet? Does it only have to be in a browser? What about using a tablet as a production device, and not just an instrument of consumption? I don’t think Apple has put much thought into this, but others will, outside Apple’s walled garden. You should too. That’s because we’re at a juncture here. A fork in the road. Do we want the Internet to be broadcasting 2.0 — run by a few content companies and their allied distributors? Or do we want it to be the wide open marketplace it was meant to be in the first place, and is good for everybody? (This is where you should pause and read what Cory Doctorow and Dave Winer say about it.)
  5. We’re going to see a huge strain on the mobile data system as iPads and other tablets flood the world. Here too it will matter whether the mobile phone companies want to be a rising tide that lifts all boats, or just conduits for their broadcasting and content production partners. (Or worse, old fashioned phone companies, treating and billing data in the same awful ways they bill voice.) There’s more money in the former than the latter, but the latter are their easy pickings. It’ll be interesting to see where this goes.
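The syndicated shopping list in item 2 can be sketched in code. This is a hypothetical illustration of the idea, not any real VRM implementation; all of the names (`PersonalTerms`, `syndicate`, the field names) are made up for the sake of the sketch. The point is the direction of control: the customer decides what each retailer sees, not the other way around.

```python
# Hypothetical sketch of "personal terms of service": the customer,
# not the retailer, decides which data each store may receive.
from dataclasses import dataclass, field

@dataclass
class PersonalTerms:
    """Terms the customer attaches to data shared with one retailer."""
    share_fields: set = field(default_factory=set)  # profile fields the retailer may see
    allow_promotions: bool = False                  # customer toggles promotions on or off
    retention_days: int = 0                         # 0 = discard after the visit

def syndicate(shopping_list, profile, terms: PersonalTerms):
    """Return only what this retailer is allowed to receive."""
    shared = {k: v for k, v in profile.items() if k in terms.share_fields}
    return {
        "items": shopping_list,
        "customer_data": shared,
        "promotions_ok": terms.allow_promotions,
        "retention_days": terms.retention_days,
    }

profile = {"name": "Pat", "zip": "93103", "purchase_history": ["tires"]}
terms = PersonalTerms(share_fields={"zip"})  # this store gets the zip code, nothing else
payload = syndicate(["batteries", "tea"], profile, terms)
print(payload["customer_data"])  # only the zip code is shared
```

Different retailers would simply get different `PersonalTerms` objects — the same list, under conditions the customer sets per store.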

I also deal with all this in a longer post that will go up elsewhere. I’ll point to it here when it comes up. Meanwhile, dig this post by Dave Winer and this one by Jeff Jarvis.


Some encouraging words here about Verizon’s expected 4G data rates:

After testing in the Boston and Seattle areas, the provider estimates that a real connection on a populated network should average between 5Mbps to 12Mbps in download rates and between 2Mbps to 5Mbps for uploads. Actual, achievable peak speeds in these areas float between 40-50Mbps downstream and 20-25Mbps upstream. The speed is significantly less than the theoretical 100Mbps promised by Long Term Evolution (LTE), the chosen standard, but would still give Verizon one of the fastest cellular networks in North America.

No mention of metering or data caps, of course.

Remember, these are phone companies. They love to meter stuff. It’s what they know. They can hardly imagine anything else. They are billing machines with networks attached.

In addition to the metering problems Brett Glass details here, there is the simple question of whether carriers can meter data at all. Data ain’t minutes. And metering discourages both usage and countless businesses other than the phone companies’ own. I have long believed that phone and cable companies will see far more business for themselves if they open up their networks to possibilities other than those optimized for the relocation of television from air to pipes.

Data capping is problematic too. How can the customer tell how close they are to a cap? How much does fear of overage charges discourage legitimate uses? And what about the accounting? My own problems with Sprint on this topic don’t give me any confidence that the carriers know how to impose data usage caps gracefully.
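Some back-of-the-envelope arithmetic makes the cap problem concrete: at the speeds carriers like to advertise, a small monthly cap evaporates in hours of actual use. The numbers below are illustrative, borrowing the 5GB cap and the 5–12Mbps rates mentioned elsewhere in these posts.

```python
# How long does a data cap last at an advertised speed?
def hours_to_cap(cap_gb, rate_mbps):
    """Hours of sustained transfer before cap_gb gigabytes are consumed."""
    cap_megabits = cap_gb * 8 * 1000        # GB -> megabits (decimal units)
    seconds = cap_megabits / rate_mbps      # megabits / (Mb/s) = seconds
    return seconds / 3600

for rate in (5, 12, 50):
    print(f"At {rate} Mb/s, a 5 GB cap lasts {hours_to_cap(5, rate):.1f} hours")
```

At 5Mb/s a 5GB cap is gone in a bit over two hours of sustained transfer; at faster rates, under an hour. The faster the network, the sillier the cap.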

There’s a lot of wool in current advertising on these topics too. During the Academy Awards last night, Comcast had a great ad for Xfinity, its new high-speed service, promoted entirely as an entertainment pump. By which I mean that it was an impressive piece of promotion. But there was no mention of upstream speeds (downstream teaser: 100Mb/s). Or other limitations. Or how they might favor NBC (should they buy it) over other content sources. (Which, of course, they will.)

Sprint‘s CEO was in another ad, promoting the company’s “unlimited text, unlimited Web and unlimited calling…” Right. Says right here in a link-proof pop-up titled “Important 4G coverage and plan information” that 4G is unlimited, but 3G (what most customers, including me, still have) is limited to “5GB/300MB off-network roaming per month.” They do list “select cities” where 4G is available. Here’s Raleigh. I didn’t find New York, Los Angeles, Chicago or Boston on the list. I recall Amarillo. Can’t find it now, and the navigation irritates me too much to look.

Anyway, I worry that what we’ll get is phone and cable company sausage in Internet casing. And that, on the political side, the carriers will succeed in their campaign to clothe themselves as the “free market” fighting “government takeovers” while working the old regulatory capture game, to keep everybody else from playing.

So five, ten years from now, all the rest of the independent ISPs and WISPs will be gone. So will backbone players other than carriers and Google.  We’ll be gaga about our ability to watch pay-per-view on our fourth-generation iPads with 3-d glasses. And we won’t miss the countless new and improved businesses that never happened because they were essentially outlawed by regulators and their captors.


In response to Dave‘s Reading tea leaves in advance of Apple’s announcements, I added this comment:

Steve loves to uncork constipated categories with the world’s slickest laxative. So I’m guessing this new box will expand Apple’s retail shelf space to include newspapers, journals and books as well as sound recordings, movies and TV shows. It will be the best showcase “content” ever had, and will be a wholly owned proprietary channel. A year from now, half the people on planes will be watching these things.

It would be cool if it also helped any of us to become movie producers, and to share and mash up our own HD creations. But I think Steve is more interested in hacking Hollywood (entertainment) and New York (publishing).

I’ve thought for years that Apple’s real enemy is Sony. Or vice versa. But Sony got lame, becoming a Hollywood company with an equipment maker on the side. So think instead of the old Sony — the inventive one that owned the high-gloss/high-margin end of the entertainment gear business, the Sony of Walkmen and Trinitrons. That’s the vacuum Apple’s filling. Only, unlike Sony, Apple won’t have 50,000 SKUs to throw like spaghetti at the market’s wall. They’ll have the fewest SKUs possible. And will continue to invent or expand whole new categories with each.

And there will be something missing to piss people off too. Maybe it’ll be absent ports (like you said). Maybe it’s no multi-tasking, or skimpy memory, or bad battery life, or an unholy deal with some “partner.”

Whatever it is, the verities persist. Meaning items 1 through 6 from this 1997 document still apply:

http://www.scripting.com/davenet/stories/DocSea…

At that last link I wrote,

These things I can guarantee about whatever Apple makes from this point forward:

  1. It will be original.
  2. It will be innovative.
  3. It will be exclusive.
  4. It will be expensive.
  5. Its aesthetics will be impeccable.
  6. The influence of developers, even influential developers like you, will be minimal. The influence of customers and users will be held in even higher contempt.

So now the iPad has been announced, Steve has left the building, and the commentariat is weighing in.

The absence of multi-tasking might be the biggest bummer. (Makes me wonder if mono-tasking is a Jobsian “feature”, kinda like the one-button mouse.) Adam Frucci of Gizmodo lists mono-tasking among the “eight things that suck” about the iPad, including no cameras, no HDMI out, no Flash, 3×4 (rather than wide) screen and a “Big, Ugly Bezel”. (That last one is off base, methinks. You need the wide bezel so you can hold the device without your hot fingertips doing wrong things with the touchscreen.)

Elsewhere at Gizmodo, Joel Johnson says “PCs will be around as expert devices for the long haul, but it’s clear that Apple, coasting on the deserved success of the iPhone, sees simple, closed internet devices as the future of computing. (Or at the very least, portable computing.) And for the average consumer, it could be.”

The Engadgeteers mostly panned it. Unimaginative… underwhelming… one of Apple’s biggest misses.

MG Sigler at Techcrunch says, “The thing is beautiful and fast. Really fast. If you’ll excuse my hyperbole, it felt like I was holding the future. But is it a must-have?” Then answers,

Most people won’t yet, but as long as Apple has its base that will buy and use the iPad, they have plenty of time for either themselves or third-party developers to create the killer uses that make the iPad a must-have product for a broader range of people. We already saw that happen with the App Store and the iPhone/iPod touch. And at $499 (for the low-end version), there will be no shortage of people willing to splurge on the device just to see what all the fuss is about. They’ll get hooked too.

That’s getting close, but it’s not quite there.

First, the base Apple wants is consumers. Literally. We’re talking newspaper and magazine readers, buyers and users of cameras and camcorders, and (especially) TV and movie watchers. To some degree these people produce (mostly home video and photos), but to a greater degree they are still potatoes that metabolize “content”. This thing is priced like a television, with many improvements on the original. Call it Apple’s Trinitron. They are, like I said, after Sony’s abandoned position here, without the burden of a zillion SKUs.

Second, there will be a mountain of apps for this thing, and more than a few killer ones.

What depressed me, though I expected it, was the big pile of what are clearly verticalized Apple apps, which I am sure enjoy privileged positions in the iPad’s app portfolio, no matter how big that gets. It’s full of customer lock-in. I’m a photographer, and the only use for iPhoto I have is getting shots off the iPhone. Apple’s calendar on the iPhone and computer (iCal) is, while useful, still lame. Maybe it’ll be better on the iPad, but I doubt it. And the hugely sphinctered iTunes/Store system also remains irritating, though I understand why Apple does it.

What you have to appreciate, even admire, is how well Apple plays the vertical game. It’s really amazing.

What you also have to appreciate is how much we also need the horizontal one. The iPad needs an open alternative, soon. There should be for the iPad what Google’s Nexus One is for the iPhone.

I got a ride home tonight from Bob Frankston, who was guided by a Nexus One, serving as a better GPS than my dashboard’s Garmin. Earlier in the evening Bob used the Nexus One to do a bunch of other stuff the iPhone doesn’t do as well, if at all. More importantly, he didn’t need to get his apps only from Google’s (or anybody’s) “store”. And if somebody else wants to make a better Android phone than this one, they can. And Google, I’m sure, hopes they do. That’s because Google is playing a horizontal game here, broadening the new market that Apple pioneered with its highly vertical iPhone.

So a big lesson here is that the market’s ecosystem includes both the vertical silos and the horizontal landscapes on which those silos stand, and where all kinds of other things can grow. Joel may be right that “the average consumer” will have no trouble being locked inside Apple’s silo of “simple, closed Internet devices”. But there are plenty of other people who are neither average nor content with that prospect. There are also plenty of developers who prefer independence to dependence, and a free market to a captive one.

Captivity has its charms, and an argument can be made that tech categories are best pioneered by companies like Apple and Sony, which succeed both by inventing new stuff that primes the pump of demand, and by locking both developers and customers inside their silos. But truly free markets are not restricted to choices among silos, no matter how cushy the accommodations may be. Nor are they restricted to the non-choice of just one silo, as is currently the case with the iPad. Free markets are wide open spaces where anybody can make — and buy — anything.

There’s more to fear from heights than widths.

Bonus link: Dave weighs in. This is just a jumbo Oreo cookie.


A couple days ago I responded to a posting on an email list. What I wrote struck a few chords, so I thought I’d repeat it here, with just a few edits, and then add a few additional thoughts as well. Here goes.

Reading _____’s references to ancient electrical power science brings to mind my own technical background, most of which is now also antique. Yet that background still informs my understanding of the world, and my curiosities about What’s Going On Now, and What We Can Do Next. In fact I suspect that it is because I know so much about old technology that I am bullish about framing What We Can Do Next on both solid modern science and maximal liberation from technically obsolete legal and technical frameworks — even though I struggle as hard as the next geek to escape those.

(Autobiographical digression begins here. If you’re not into geeky stuff, skip.)

As a kid growing up in the 1950s and early ’60s I was obsessed with electricity and radio. I studied electronics and RF transmission and reception, was a ham radio operator, and put an inordinate amount of time into studying how antennas worked and electromagnetic waves propagated. From my home in New Jersey’s blue collar suburbs, I would ride my bike down to visit the transmitters of New York AM stations in the stinky tidewaters flanking the Turnpike, Routes 46 and 17, Paterson Plank Road and the Belleville Pike. (Nobody called them “Meadowlands” until many acres of them were paved in the ’70s to support a sports complex by that name.) I loved hanging with the old guys who manned those transmitters, and who were glad to take me out on the gangways to show how readings were made, how phasing worked (sinusoidal synchronization again), how a night transmitter had to address a dummy load before somebody manually switched from day to night power levels and directional arrays. After I learned to drive, my idea of a fun trip was to visit FM and TV transmitters on the tops of buildings and mountains. (Hell, I still do that.) Thus I came to understand skywaves and groundwaves, soil and salt water conductivity, ground systems, directional arrays and the inverse square law, all in the context of practical applications that required no shortage of engineering vernacular and black art.

I also obsessed on the reception end. In spite of living within sight of nearly every New York AM transmitter (WABC’s tower was so close that we could hear its audio in our kitchen toaster), I logged more than 800 AM stations on my 40s-vintage Hammarlund HQ-129x receiver, which is still in storage at my sister’s place. That’s about 8 stations per channel. I came to understand how two-hop skywave reflection off the E layer of the ionosphere favored flat land or open water midway between transmission and reception points. This, I figured, is why I got KSL from Salt Lake City so well, but WOAI from San Antonio hardly at all. (Both were “clear channel” stations in the literal sense — nothing else in North America was on their channels at night, when the ionosphere becomes reflective of signals on the AM band.) Midpoint for the latter lay within the topographical corrugations of the southern Appalachians. Many years later I found this theory supported by listening in Hawaii to AM stations from Western North America, on an ordinary car radio. I’m still not sure why I found those skywave signals fading and distorting (from multiple reflections in the very uneven ionosphere) far less than those over land. I am sure, however, that most of this hardly matters at all to current RF and digital communication science. After I moved to North Carolina, I used Sporadic E reflections to log more than 1200 FM stations, mostly from 800 to 1200 miles away, plus nearly every Channel 3 and 6 (locally, 2, 4 and 5 were occupied) in that same range. All those TV signals are now off the air. (Low-band VHF TV — channels 2 through 6 — is not used for digital signals in the U.S.) My knowledge of this old stuff is now mostly of nostalgia value; but seeking it has left me with a continuing curiosity about the physical world and our infrastructural additions to it. This is why much of what looks like photography is actually research. For example, this and this.
What you’re looking at there are pictures taken in service to geology and archaeology.

(End of autobiographical digression.)

Speaking of which, I am also busy lately studying the history of copyright, royalties and the music business — mostly so ProjectVRM can avoid banging into any of those. This research amounts to legal and regulatory archaeology. Three preliminary findings stand out, and I would like to share them.

First, regulatory capture is real, and nearly impossible to escape. The best you can do is keep it from spreading. Most regulations protect last week from yesterday, and are driven by the last century’s leading industries. Little if any regulatory lawmaking by established industries, especially ones that feel their revenue bases threatened, clears room for future development. Rather, it prevents future development, even for the threatened parties who might need it most. Thus the bulk of conversation and debate, even among the most progressive and original participants, takes place within the bounds of still-captive markets. This is why it is nearly impossible to talk about Net-supportive infrastructure development without employing the conceptual scaffolding of telecom and cablecom. We can rationalize this, for example, by saying that demand for telephone and cable (or satellite TV) services is real and persists, but the deeper and more important fact is that it is very difficult for any of us to exit the framing of those businesses and still make sense.

Second, infrastructure is plastic. The term “infrastructure” suggests physicality of the sturdiest kind, but in fact all of it is doomed to alteration, obsolescence and replacement. Some of it (Roman roads, for example) may last for centuries, but most of it is obsolete in a matter of decades, if not sooner. Consider over-the-air (OTA) TV. It is already a fossil. Numbered channels persist as station brands; but today very few of those stations transmit on their branded analog channels, and most of them are viewed over cable or satellite connections anyway. There are no reasons other than legacy regulatory ones to maintain the fiction that TV station locality is a matter of transmitter siting and signal range. Viewing of OTA TV signals is headed fast toward zero. It doesn’t help that digital signals play hard-to-get, and that the gear required for getting them sucks rocks. Nor does it help that cable and satellite providers have gone out of their way to exclude OTA receiving circuitry from their latest gear, mostly forcing subscription to channels that used to be free. As a result, ABC, NBC, CBS, Fox and PBS are now a premium pay TV package. (For an example of how screwed this is, see here.) Among the biggest fossils are thousands of TV towers, some more than 2000 feet high, maintained to continue reifying the concept of “coverage,” and to legitimize “must carry” rules for cable. After live audio stream playing on mobile devices becomes cheap and easy, watch AM and FM radio transmission fossilize in exactly the same ways. (By the way, if you want to do something green and good for the environment, lobby for taking down some of these towers, which are expensive to maintain and hazards to anything that flies. Start with this list here. Note the “UHF/VHF transmission” column. Nearly all these towers were built for analog transmission and many are already abandoned. This one, for example.)

Third, “infrastructure” is a relatively new term and vaguely understood outside arcane uses within various industries. It drifted from military to everyday use in the 1970s, and is still not a field in itself. Try looking for an authoritative reference book on the general subject of infrastructure. There isn’t one. Yet digital technology requires that we challenge the physical anchoring of infrastructure as a concept. Are bits infrastructural? How about the means for arranging and moving them? The Internet (the most widespread means for moving bits) is defined fundamentally by its suite of protocols, not by the physical media over which data travels, even though there are capacity and performance dependencies on the latter. Again, we are in captured territory here. Only in conceptual jails can we sensibly debate whether something is an “information service” or a “telecommunication service”. And yet most of us who care about the internet and infrastructure do exactly that.

That last one is big. Maybe too big. I’ve written often about how hard it is to frame our understanding of the Net. Now I’m beginning to think we should admit that the Internet itself, as concept, is too limiting, and not much less antique than telecom or “power grid”.

“The Internet” is not a thing. It’s a finger pointing in the direction of a thing that isn’t. It is the name we give to the sense of place we get when we go “on” a mesh of unseen connections to interact with other entities. Even the term “cloud“, labeling a utility data service, betrays the vagueness of our regard toward The Net.

I’ve been on the phone a lot lately with Erik Cecil, a veteran telecom attorney who has been thinking out loud about how networks are something other than the physical paths we reduce them to. He regards network mostly in its verb form: as what we do with our freedom — to enhance our intelligence, our wealth, our productivity, and the rest of what we do as contributors to civilization. To network we need technologies that enable what we do in maximal ways.  This, he says, requires that we re-think all our public utilities — energy, water, communications, transportation, military/security and law, to name a few — within the context of networking as something we do rather than something we have. (Think also of Jonathan Zittrain’s elevation of generativity as a supportive quality of open technology and standards. As verbs here, network and generate might not be too far apart.)

The social production side of this is well covered in Yochai Benkler‘s The Wealth of Networks, but the full challenge of what Erik talks about is to re-think all infrastructure outside all old boxes, including the one we call The Internet.

As we do that, it is essential that we look to employ the innovative capacities of businesses old and new. This is a hat tip in the general direction of ISPs, and to the concerns often expressed by Richard Bennett and Brett Glass: that new Internet regulation may already be antique and unnecessary, and that small ISPs (a WISP in Brett’s case) may be the best connection between high-minded thinkers like yours truly (and others named above) and the real world where the rubber meets the road.

There is a bigger picture here. We can’t have only some of us painting it.


One of the reasons I liked Dish Network (to the extent anybody can like a purely commercial entertainment utility) was that their satellite receivers included an over-the-air tuner. It nicely folded your over-the-air (OTA) stations in with others in the system’s channel guide. Here’s how it looked:

[Image: dish_guide1 — the Dish on-screen channel guide, with over-the-air channels folded in]

Well, the week before last I discovered that our Dish receiver was having trouble seeing and using its broadband connection — and, for that matter, the phone line as well. That receiver was this one here…

[Image: vip622-lrg — the back panel of the Dish ViP 622 receiver]

… a ViP 622. Vintage 2006. Top of Dish’s line at the time. Note the round jack on the far left of the back side. That’s where your outside (or inside) over-the-air antenna plugged in. We’ll be revisiting the subject shortly.

So Dish sent a guy out. He replaced the ViP 622 with Dish’s latest (or so he said): a ViP 722. I looked it up on the Web and ran across “DISH Network’s forthcoming DVRs get detailed: hints of Sling all over“, by Darren Murph, posted May 18th 2008. Among other things it said, “The forthcoming ViP 722 will be the first HD DVR from the outfit with loads of Sling technology built in — not too shocking considering the recent acquisition. Additionally, the box is said to feature an all new interface and the ability to browse to (select) websites, double as a SlingCatcher and even handle Clip & Sling duties.”

So here it was, July 2009, and I had a ViP 722 hooked up to my nice Sony flat screen, and … no hint of anything remotely suggestive of a Sling feature. When I asked the Dish guy about it, he didn’t have a clue. Sling? What’s that? Didn’t matter anyway, because the thing couldn’t use our broadband. The guy thought it might be my firewall, but I don’t have one of those. Just a straight Net connection, through a router and a switch in a wiring closet that works fine for every other Net-aware device hooked up to it. We tested the receiver’s connection with a laptop: 18Mb down, 4Mb up. No problems. The receiver gets an IP address from the router (and can display it), and lights blink by the ethernet jack. But… it doesn’t communicate. The Dish guy said the broadband was only used for pay-per-view, and since we don’t care about that, it doesn’t much matter. But we do care about customer support. Dish has buttons and menu choices for that, but—get this—has to dial out on a phone line to get the information you want. I had thought this was just a retro feature of the old ViP 622, but when I called Dish they said no, it’s still a feature of ALL Dish receivers.

It’s 2009, and these things are still dialing out. On a land line. Amazing.

So a couple days ago my wife called me from the house (I’m back in Boston) and said that the ViP 722 was dead. Tot. Mort. We tried re-setting it, unplugging and plugging it back in. Nothing. Then yesterday Dish came out to fix the thing, found it was indeed croaked, and put in a new one: a ViP 722k, Dish’s “advanced, state-of-the-art” receiver of the moment.

Well, it may be advanced in lots of ways, but it’s retarded in one that royally pisses me off: no over-the-air receiver. That jack in the back I pointed out above? Not there.  So, no longer can I plug in my roof antenna to watch over-the-air TV. To do that I’ll have to bypass the receiver and plug the antenna cable straight into the TV. (That has never worked either, because Sony makes the channel-tuning impossible to understand, much less operate. On that TV, switching between satellite and anything else, such as the DVD, is a freaking ordeal.) Oh, and I won’t be able to record over-the-air programs, either. Unless I get a second DVR that’s not Dish’s.

Okay, so I just did some looking around, and found through this video that the ViP 722K has an optional “MT2 OTA module” that gets you over-the-air TV on the ViP 722k. Here’s some more confusing shit about it. Here’s more from Dishuser.org. Here’s the product brochure (pdf). Digging in, I see it’s two ATSC (digital TV) tuners in one, with two antenna inputs, and it goes in a drawer in the back of the set. It costs $30. I don’t think the Dish installer even knew about it. He told me that the feature had been eliminated on the 722K, and that I was SOL.

Bonus bummer: The VIP 722k also features a much more complicated remote control. This reduces another long-standing advantage of Dish: remote controls so simple to use that you could operate them in the dark. Bye to that too.

So. Why did Dish subtract value like that? I can think of only two reasons. One is that approximately nobody still watches over-the-air TV. (This is true. I’m one of the very few exceptions. Color me retro.) The other is that Dish charges $5.99/month for local channels. They did that before, but now they can force the purchase. “Yes, we blew off your antenna, but now you can get the same channels over satellite for six bucks a month.” Except for us it’s not the same channels. We live in Santa Barbara, but can’t get the local over-the-air channels. Instead we watch San Diego’s. Dish doesn’t offer us those, at any price.

The final irony is that the ViP 722k can’t use our broadband or our phone line either. Nobody ever figured out that problem. That means this whole adventure was for worse than naught. We’d have been better off with our old ViP 622. There was nothing wrong with it that isn’t still wrong with its replacements.

Later my wife shared a conversation she had with a couple other people in town who had gone through similar craziness at their homes. “What happened to TV?” one of them said. “It’s gotten so freaking complicated. I just hate it.”

What’s happening is a dying industry milking its customers. That much is clear. The rest is all snow.


Major props to Cox for cranking up my speeds to 18Mb/s downstream and 4Mb/s upstream. That totally rocks.

I’m getting that speed now. Here’s what Cox’s local diagnostic tool says:

TCP/Web100 Network Diagnostic Tool v5.4.12
click START to begin
Connected to: speedtest.sbcox.net  –  Using IPv4 address
Checking for Middleboxes . . . . . . . . . . . . . . . . . .  Done
checking for firewalls . . . . . . . . . . . . . . . . . . .  Done
running 10s outbound test (client-to-server [C2S]) . . . . . 3.79Mb/s
running 10s inbound test (server-to-client [S2C]) . . . . . . 18.04Mb/s
The slowest link in the end-to-end path is a 10 Mbps Ethernet subnet
Information: Other network traffic is congesting the link

That won’t last. The connection will degrade again, or go down completely. Here we go:

Connected to: speedtest.sbcox.net  –  Using IPv4 address
Checking for Middleboxes . . . . . . . . . . . . . . . . . .  Done
checking for firewalls . . . . . . . . . . . . . . . . . . .  Done
running 10s outbound test (client-to-server [C2S]) . . . . . 738.0kb/s
running 10s inbound test (server-to-client [S2C]) . . . . . . 15.09Mb/s
Your Workstation is connected to a Cable/DSL modem
Information: Other network traffic is congesting the link
[C2S]: Packet queuing detected

Here’s a ping test to Google.com:

PING google.com (74.125.127.100): 56 data bytes
64 bytes from 74.125.127.100: icmp_seq=0 ttl=246 time=368.432 ms
64 bytes from 74.125.127.100: icmp_seq=1 ttl=246 time=77.353 ms
64 bytes from 74.125.127.100: icmp_seq=2 ttl=247 time=323.272 ms
64 bytes from 74.125.127.100: icmp_seq=3 ttl=246 time=343.178 ms
64 bytes from 74.125.127.100: icmp_seq=4 ttl=247 time=366.341 ms
64 bytes from 74.125.127.100: icmp_seq=5 ttl=246 time=385.083 ms
64 bytes from 74.125.127.100: icmp_seq=6 ttl=246 time=406.209 ms
64 bytes from 74.125.127.100: icmp_seq=7 ttl=246 time=434.731 ms
64 bytes from 74.125.127.100: icmp_seq=8 ttl=246 time=444.653 ms
64 bytes from 74.125.127.100: icmp_seq=9 ttl=247 time=474.976 ms
64 bytes from 74.125.127.100: icmp_seq=10 ttl=247 time=472.244 ms
64 bytes from 74.125.127.100: icmp_seq=11 ttl=246 time=488.023 ms

No packet loss on that one. Not so on the next, to UCSB, which is so close I can see it from here:

PING ucsb.edu (128.111.24.40): 56 data bytes
64 bytes from 128.111.24.40: icmp_seq=0 ttl=52 time=407.920 ms
64 bytes from 128.111.24.40: icmp_seq=1 ttl=52 time=427.506 ms
64 bytes from 128.111.24.40: icmp_seq=2 ttl=52 time=441.176 ms
64 bytes from 128.111.24.40: icmp_seq=3 ttl=52 time=456.073 ms
64 bytes from 128.111.24.40: icmp_seq=4 ttl=52 time=237.366 ms
64 bytes from 128.111.24.40: icmp_seq=5 ttl=52 time=262.868 ms
64 bytes from 128.111.24.40: icmp_seq=6 ttl=52 time=287.270 ms
64 bytes from 128.111.24.40: icmp_seq=7 ttl=52 time=307.931 ms
64 bytes from 128.111.24.40: icmp_seq=8 ttl=52 time=327.951 ms
64 bytes from 128.111.24.40: icmp_seq=9 ttl=52 time=352.974 ms
64 bytes from 128.111.24.40: icmp_seq=10 ttl=52 time=376.636 ms
64 bytes from 128.111.24.40: icmp_seq=11 ttl=52 time=395.893 ms
^C
— ucsb.edu ping statistics —
13 packets transmitted, 12 packets received, 7% packet loss
round-trip min/avg/max/stddev = 237.366/356.797/456.073/69.322 ms

That’s actually low for UCSB, by the way. I just checked again and got 9% and 25% packet loss. At one point (when the guy was here this afternoon), it hit 57%.
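For the record, the numbers in ping’s summary line can be reproduced from the raw replies. Here’s a minimal sketch that does that; the parsing regex assumes BSD-style ping output like the transcripts above, and the sample data is just the twelve UCSB round-trip times and packet counts copied from the test:

```python
import re
import statistics

def ping_stats(ping_output: str) -> dict:
    """Parse ping output and compute loss percentage and RTT stats."""
    rtts = [float(m) for m in re.findall(r"time=([\d.]+) ms", ping_output)]
    sent = re.search(r"(\d+) packets transmitted", ping_output)
    recv = re.search(r"(\d+) packets received", ping_output)
    sent = int(sent.group(1)) if sent else len(rtts)
    recv = int(recv.group(1)) if recv else len(rtts)
    return {
        "loss_pct": 100.0 * (sent - recv) / sent,
        "min": min(rtts),
        "avg": statistics.mean(rtts),
        "max": max(rtts),
    }

# The twelve RTT samples and the packet counts from the UCSB test above:
sample = "\n".join(
    [f"64 bytes: icmp_seq={i} time={t} ms" for i, t in enumerate([
        407.920, 427.506, 441.176, 456.073, 237.366, 262.868,
        287.270, 307.931, 327.951, 352.974, 376.636, 395.893,
    ])] + ["13 packets transmitted, 12 packets received"]
)
stats = ping_stats(sample)
print(f"{stats['loss_pct']:.1f}% loss, avg {stats['avg']:.3f} ms")
# prints: 7.7% loss, avg 356.797 ms
```

One lost packet out of thirteen is 7.7%, which ping truncates to the 7% shown in its summary; the min/avg/max of the samples match the 237.366/356.797/456.073 line exactly.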

Here’s a traceroute to UCSB:

traceroute to ucsb.edu (128.111.24.40), 64 hops max, 40 byte packets
1  192.168.1.1 (192.168.1.1)  0.687 ms  0.282 ms  0.250 ms
2  ip68-6-40-1.sb.sd.cox.net (68.6.40.1)  349.599 ms  379.786 ms  387.580 ms
3  68.6.13.121 (68.6.13.121)  387.466 ms  400.991 ms  404.500 ms
4  68.6.13.133 (68.6.13.133)  415.578 ms  153.695 ms  9.473 ms
5  paltbbrj01-ge600.0.r2.pt.cox.net (68.1.2.126)  16.965 ms  18.286 ms  15.639 ms
6  te4-1–4032.tr01-lsanca01.transitrail.ne… (137.164.129.15)  19.936 ms  24.520 ms  20.952 ms
7  calren46-cust.lsanca01.transitrail.net (137.164.131.246)  26.700 ms  24.166 ms  30.651 ms
8  dc-lax-core2–lax-peer1-ge.cenic.net (137.164.46.119)  44.268 ms  98.114 ms  200.339 ms
9  dc-lax-agg2–lax-core2-ge.cenic.net (137.164.46.112)  254.442 ms  277.958 ms  273.309 ms
10  dc-ucsb–dc-lax-dc2.cenic.net (137.164.23.3)  281.735 ms  313.441 ms  306.825 ms
11  r2–r1–1.commserv.ucsb.edu (128.111.252.169)  315.500 ms  327.080 ms  344.177 ms
12  128.111.4.234 (128.111.4.234)  346.396 ms  367.244 ms  357.468 ms
13  * * *
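Reading that trace: round trips are sub-millisecond to my own router (hop 1), then leap to roughly 350 ms at the very first Cox hop. Here’s a sketch of that reading, using each hop’s best (minimum) probe, since a single fast probe shows the path’s floor and filters out transient queuing; the hop numbers and timings are copied from the first four hops above:

```python
from typing import Optional

# Per-hop RTT probes in ms, copied from the traceroute above.
hops = {
    1: [0.687, 0.282, 0.250],        # my router
    2: [349.599, 379.786, 387.580],  # first Cox hop
    3: [387.466, 400.991, 404.500],
    4: [415.578, 153.695, 9.473],
}

def first_latency_jump(hops: dict, threshold_ms: float = 100.0) -> Optional[int]:
    """Return the first hop whose best (minimum) RTT exceeds the previous
    hop's best RTT by more than threshold_ms, or None if no hop does."""
    prev = None
    for hop in sorted(hops):
        best = min(hops[hop])
        if prev is not None and best - prev > threshold_ms:
            return hop
        prev = best
    return None

print(first_latency_jump(hops))  # prints: 2
```

Note, too, that hops 4 and 5 recover to best-case probes of about 9 to 17 ms, which points at intermittent queuing in the first Cox hops rather than a problem along the whole path.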

As for modem function, I see this for upstream:

Cable Modem Upstream
Upstream Lock : Locked
Upstream Channel ID : 11
Upstream Frequency : 23600000 Hz
Upstream Modulation : QAM16
Upstream Symbol Rate : 2560 Ksym/sec
Upstream transmit Power Level : 38.5 dBmV
Upstream Mini-Slot Size : 2

… and this for downstream:

Cable Modem Downstream
Downstream Lock : Locked
Downstream Channel Id : 1
Downstream Frequency : 651000000 Hz
Downstream Modulation : QAM256
Downstream Symbol Rate : 5360.537 Ksym/sec
Downstream Interleave Depth : taps32Increment4
Downstream Receive Power Level : 5.4 dBmV
Downstream SNR : 38.7 dB
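Those RF numbers all look healthy, for what it’s worth, by commonly cited DOCSIS rules of thumb. The thresholds in this sketch are rough field guidelines I’m assuming, not an official Cox or DOCSIS figure, but they’re one more reason to doubt the problem is in the modem or the house wiring:

```python
# Rough field guidelines for a healthy DOCSIS cable modem (commonly
# cited rules of thumb, not an official spec):
#   downstream receive power: -15 to +15 dBmV
#   downstream SNR for QAM256: at least 30 dB
#   upstream transmit power: roughly 35 to 52 dBmV
def signal_ok(down_power_dbmv: float, down_snr_db: float,
              up_power_dbmv: float) -> bool:
    """True if all three readings fall inside the rule-of-thumb ranges."""
    return (
        -15.0 <= down_power_dbmv <= 15.0
        and down_snr_db >= 30.0
        and 35.0 <= up_power_dbmv <= 52.0
    )

# The values reported by the modem above:
print(signal_ok(down_power_dbmv=5.4, down_snr_db=38.7, up_power_dbmv=38.5))
# prints: True
```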

The symptoms are what they were when I first blogged the problem on June 21, and again when I posted a follow-up on June 24. That was when the Cox service guy tightened everything up and all seemed well … until he left. When I called to report the problem not solved, Cox said they would send a “senior technician” on Friday. A guy came today. The problems were exactly as we see above. He said he would have to come back with a “senior technician” (or whatever they call them — I might be a bit off on the title), which this dude clearly wasn’t. He wanted the two of them to come a week from next Wednesday. We’re gone next week anyway, but I got him to commit to a week from Monday. That’s July 6, in the morning. The problem has been with us at least since the 18th, when I arrived here from Boston.

This evening we got a call from a Cox survey robot, following up on the failed service visit this afternoon. My wife took the call. After she indicated our dissatisfaction with the visit (by pressing the appropriate numbers in answer to a series of questions), the robot said we should hold to talk to a human. Then it wanted our ten-digit Cox account number. My wife didn’t know it, so the robot said the call couldn’t be completed. And that was that.

I doubt another visit from anybody will solve the problem, because I don’t think the problem is here. I think it’s in Cox’s system. I think that’s what the traceroute shows.

But I don’t know.

I do know that this is inexcusably bad customer service.

For Cox, in case they’re reading this…

  • I am connected directly to the cable modem. No routers, firewalls or other things between my laptop and the modem.
  • I have rebooted the modem about a hundred times. I have restarted my computers. In fact I have tested the link with three different laptops. Same results. Rebooting sometimes helps, sometimes not.
  • Please quit trying to fix this only at my end of the network. The network includes far more than me and my cable modem.
  • Please make it easier to reach technically knowledgeable human beings.
  • Make your chat system useful. At one point the chat person gave me Linksys’ number to call.
  • Thanks for your time and attention.


When we went looking for an apartment here a couple years ago, we had two primary considerations in addition to the usual ones: walking distance from a Red Line subway stop, and fiber-based Internet access. The latter is easy to spot if you know what to look for, starting with too many wires on the poles. After that you look for large loops among the wires. A loop means the wiring contains glass, which breaks if the loops are too small. The apartment we chose has other charms, but for me the best one is a choice among three high-speed Internet services: Comcast, Verizon FiOS and RCN. Although Comcast comes via coaxial cable, it’s an HFC (hybrid fiber-coax) system, and competes fairly well against fiber all the way to the home. That’s what Verizon FiOS and RCN provide.

fiber

We chose Verizon FiOS, which gives us 20Mb symmetrical service for about $60/month. The 25 feet between the Optical Network Terminal box and my router are, ironically, old Comcast cable-TV coax. (Hey, if Comcast wants my business, they can beat Verizon’s offering.)

My point is that we live where we do because there is competition among Internet service providers. While I think competition could be a lot better than it is, each of those three companies still offers far more than you’ll find in most of the U.S., where there is little or no competition at all.

The playing field in the skies above sidewalks is not pretty. Poles draped with six kinds of wiring (in our case electrical, phone, cable, cable, fiber, fiber — I just counted) are not attractive. Once the poles become ugly beyond endurance, I expect homeowners will pay to bury the services. By the grace of local regulators, all they’ll bury will be electrical service and bundles of conduit, mostly for fiber. And they won’t bury them deep, because fiber isn’t bothered by proximity to electrical currents. In the old days (which is still today in most fiber-less places), minimum separations are required between electrical, cable and phone wiring — the latter two being copper. In Santa Barbara (our perma-home), service trenching has to be the depth of a grave to maintain those separations. There’s no fiber yet offered in Santa Barbara. At our house there, the only carrier to provide “high” speed is the cable company, and it’s a fraction of what we get over fiber here near Boston.

All this comes to mind after reading D.C. Court Upholds Ban on MDU Contracts: FCC prevents new exclusive contracts and nullifies existing ones, by John Eggerton in Broadcasting & Cable.  It begins, “The U.S. Court of Appeals for the D.C. Circuit Monday upheld an FCC decision banning exclusive contracts between cable companies and the owners of apartments and other multiple-dwelling units (MDU).”

The rest of the piece is framed by the long-standing antipathy between cable and telephone companies (cable lost this one), each as providers of cable TV. For example,

Not surprisingly, Verizon praised the decision. It also saw it as a win for larger issues of access to programming:

“This ruling is a big win for millions of consumers living in apartments and condominiums who want nothing more than to enjoy the full benefits of video competition,” said Michael Glover, Verizon senior VP, deputy general counsel, in a statement. “In upholding the ban on new and existing exclusive access deals, the Court’s decision also confirms the FCC’s authority to address other barriers to more meaningful competitive choice and video competition, such as the cable companies’ refusal to provide competitors with access to regional sports programming.”

Which makes sense at a time in history when TV viewing still comprises a larger wad of demand than Internet use. This will change as more and more production, distribution and consumption moves to the Internet, and as demand increases for more Internet access by more different kinds of devices — especially mobile ones.

Already a growing percentage of my own Internet use, especially on the road, relies on cellular connectivity rather than wi-fi (thanks to high charges for crappy connectivity at most hotels). Sprint is my mobile Internet provider. They have my business because they do a better job of getting me what I want: an “air card” that works on Linux and Mac laptops, and not just on Windows ones. Verizon wanted to charge me for my air card (Sprint’s was free with the deal, which was also cheaper), and AT&T’s gear messed up my laptops and didn’t work very well anyway.

In both cases — home and road — there is competition.

While I can think of many reforms I’d like to see around Internet connectivity (among citizens, regulators and regulatees), anything that fosters competition in the meantime is a Good Thing.


WebTV was way ahead of its time and exactly backwards. The idea was to put the Web on TV. In the prevailing media framework of the time, this made complete sense. TV had been around since the Forties, and nearly everybody devoted many hours of their daily lives to it. The Web was brand new then. And, since the Web used a tube like TV did, it only made sense to make the Web work on TV, rather than vice versa.

Microsoft bought WebTV for $425 million in April 1997. It was the most Microsoft had ever spent on an acquisition, and a stunning sum to spend on what was clearly a speculative play. But Microsoft thought it was skating to where the puck was going.

Not long after that I heard from Dave Feinleib, an executive at Microsoft. Dave wanted to know if I would be interested in writing a chapter for a book he was putting together on the convergence of the Web and television. What brought him to my door was that I was the only writer he found who claimed the Web would eat TV, rather than vice versa. Everybody else was saying that history was going the other way — including Microsoft itself, with its enormous bet.

Dave was an outstanding editor, and did a great job pulling his book together. Originally he wanted it to be published by somebody other than Microsoft, but that didn’t work out. If I’m not mistaken (and Dave, if you’re out there somewhere, correct me), his choices of title also didn’t make it. The title finally chosen was a kiss of death: The Inside Story of Interactive TV and (in much larger type) WebTV for Windows. (Cool: You can still get it at Amazon, so death in this case is only slightly exaggerated.)

It was a good book, and an important historic document. At least for me. Much of what I later contributed to The Cluetrain Manifesto I prototyped in my chapter of Dave’s book. My title was “The Message Is Not the Medium.”

Amazingly, I just found a draft of the chapter, which I assumed had been long gone in an old disk crash or something. Begging the indulgence of Dave and Microsoft, I’ll quote from it wholesale. Remember that this was written in 1998, at the very height of the dot-com bubble.

About the conversational nature of markets:

So what we have here are two metaphors for a marketplace: 1) a battlefield; and 2) a conversation. Which is the better metaphor for the Web market? One is zero-sum and the other is positive-sum. One is physical and the other is virtual. One uses OR logic, and the other uses AND logic.

It’s no contest. The conversation metaphor describes a world exploding with positive new sums. The battlefield metaphor insults that world by denying those sums. It works fine when we’re talking about battles for shelf space in grocery stores; but when we’re talking about the Web, battlefield metaphors ignore the most important developments.

There are two other advantages to the conversation metaphor. First, it works as a synonym. Substitute the word “conversation” for  “market” and this fact becomes clear. The bookselling conversation and the bookselling market are the same. Second, conversations are the fundamental connections human beings make with each other. We may love or hate one another, but unless we’re in conversation, not much happens between us. Societies grow around conversations. That includes the business societies we call markets…

About the Web as a marketplace:

Today the Web remains an extraordinarily useful way to publish, archive, research and connect all kinds of information. No medium better serves curious or inventive minds.

While commerce may not have been the first priority of the Web’s prime movers, their medium has quickly proven to be the most commercial medium ever created. It invites every business in the Yellow Pages either to sell on the Web or to support their existing business by using the Web to publish useful information and invite dialog with customers and other involved parties. In fact, by serving as both an ultimate yellow page directory and an endless spread of real estate for stores and businesses, the Web demonstrates extreme synergy between the publishing and retailing metaphors, along with their underlying conceptual systems.

So, in simple terms, the Web efficiently serves two fundamental human needs:

1.    The need to know; and
2.    The need to buy.

While it also serves as a fine way to ship messages to eyeballs, we should pause to observe that the message market is a conversation that takes place entirely on the supply side of TV’s shipping system. In the advertising market, media sell space or time to companies that advertise. Not to consumers. The consumers get messages for free, whether they want them or not.

What happens when consumers can speak back — not just to the media, but to the companies who pay for the media? In the past we never faced that question. Now we do. And the Web will answer with a new division of labor between advertising and the rest of commerce. That division will further expose the limits of both the advertising and entertainment metaphors.

On Sales vs. Advertising, and how the Web does more for the former than the latter:

“Advertising is what you do when you can’t go see somebody. That’s all  it is.” — Fairfax Cone

Fairfax “Fax” Cone founded one of the world’s top advertising agencies, Foote, Cone & Belding, and ran it for forty years. A no-nonsense guy from Chicago, Cone knew exactly what advertising was and wasn’t about. With this simple definition — what you do when you can’t go see somebody — he drew a clear line between advertising and sales. Today, thirty years after he retired, we can draw the same line between TV and the Web, and divide the labors accordingly.

On one side we have television, the best medium ever created for advertising. On the other side we have the Web, the best medium ever created for sales.

The Web, like the telephone, is a much better tool for sales than for promotion. It’s what you do when you can go see somebody: a way to inform customers and for them to inform you. The range of benefits is incalculable. You can learn from each other, confer in groups, have visually informed phone conversations, or sell directly with no sales people at all.

In other words, you can do business. All kinds of business. As with the phone, it’s hard to imagine any business you can’t do, or can’t help do, with the Web.

So we have a choice. See or be seen: see with the Web, or be seen on TV. Talk with people or talk at them. Converse with them, or send them messages.

Once we divide these labors, advertising on the Web will make no more sense than advertising on the phone does today. It will be just as unwelcome, just as intrusive, just as rude and just as useless.

The Web will call forth — from both vendors and customers — a new kind of marketing: one that seeks to enlarge the conversations we call business, not to assault potential customers with messages they don’t want. This will expose Web advertising — and most other advertising — as the spam it is, and invite the development of something that serves supply without insulting demand, and establishes market conversations equally needed by both.

This new marketing conversation will embrace what Rob McDaniel  calls a “divine awful truth”  — a truth whose veracity is exceeded only by its deniability. When that truth becomes clear, we will recognize most advertising as an ugly art form  that only dumb funding can justify, and damn it for the sin of unwelcome supply in the absence of demand.

That truth is this: There is no demand for messages. And there never was.

In fact, most advertising has negative demand, especially on TV. It actually subtracts value. To get an idea just how negative TV advertising is, imagine what would happen if the mute buttons on remote controls delivered we-don’t-want-to-hear-this messages back to advertisers. When that feedback finally gets through, the $180+ billion/year advertising market will fall like a bad soufflé.

It will fall because the Web will bring two developments advertising has never seen before, and has always feared:  1) direct feedback; and 2) accountability. These will expose another divine awful truth: most advertising doesn’t work.

In the safety of absent alternatives, advertising people have always admitted as much. There’s an old expression in the business that goes, “I know half my advertising is wasted. I just don’t know which half.” (And let’s face it, “half” is exceedingly generous.)

With the Web, you can know. Add the Web to TV, and you can measure waste on the tube too.

Use the Web wisely, and you don’t have to settle for any waste at all.

About advertising’s fatal flaw:

Television is two businesses: 1) an entertainment delivery service; and 2) an advertising delivery service. They involve two very different conversations. The first is huge and includes everybody. The second is narrow and only includes advertisers and broadcasters.

TV’s entertainment producers are program sources such as production companies, network entertainment divisions, and the programming sides of TV stations. These are also the vendors of the programs they produce. Their customers and distributors are the networks and TV stations, who give away the product for free to their consumers, the viewers.

In TV’s advertising business, the advertising is produced by the advertisers themselves, or by their agencies. But in this market conversation, advertisers play the customer role. They buy time from the networks and the stations, which serve as both vendors and distributors. Again, viewers consume the product for free.

In the past, the difference between these conversations didn’t matter much, because consumers were not part of TV’s money-for-goods market conversation.  Instead, consumers were part of the conversation around the product TV gives away: programming.

In the economics of television, however, programming is just bait. It’s very attractive bait, of course; but it’s on the cost side of the balance sheet, not the revenue side. TV’s $45+ billion revenues come from advertising, not programming. And the sources of programming make most of their money from their customers: networks, syndicators and stations. Not from viewers.

Broadcasters, however, are accustomed to believing that their audience is deeply involved in their business, and often speak of demographics (e.g. men 25-54) as “markets.” But there is no market conversation here, because the relationship — such as it is — is restricted to terms set by what the supply side requires, which are ratings numbers and impersonal information such as demographic breakouts and lifestyle characterizations. This may be useful information, but it lacks the authenticity of real market demand, expressed in hard cash. In fact, very few viewers are engaged in conversations with the stations and networks they watch. It’s a one-way, one-to-many distribution system. TV’s consumers are important only in aggregate, not as individuals. They are many, not one. And, as Reese Jones told us earlier, there is no such thing as a many-to-one conversation. At best there is only a perception of one. Big difference.

So, without a cash voice, audience members can only consume. Their role is to take the bait. If the advertisements work, of course, they’ll take the hook as well. But the advertising business is still a conversation that does not include its consumers.

So we get supply without demand, which isn’t a bad definition of advertising.

Now let’s look at the Web.

Here, the customer and consumer are the same. He or she can buy the advertisers’ goods directly from the advertiser, and enjoy two-way one-to-one market conversations that don’t involve the intervention either of TV as a medium or of one-way messages intended as bait. He or she can also buy entertainment directly from program sources, which in this relationship vend as well as produce. The distribution role of TV stations and networks is unnecessary, or at least peripheral. In other words, the Web disintermediates TV, plus other media.

So the real threat to TV isn’t just that the Web makes advertising accountable. It’s that it makes business more efficient. In fact the Web serves as both a medium for business and as a necessary accessory to it, much like the telephone. No medium since the telephone does a better job of getting vendors and customers together, and of fostering the word-of-mouth that even advertisers admit is the best advertising.

The Web is an unprecedented clue-exchange system. And when companies get enough clues about how poorly their advertising actually works, they’ll drop it like a bad transmission, or change it so much we can’t call it advertising any more.

We may have a blood bath. Killing ad budgets is a snap. Advertising is protected by no government agencies, and encouraged by no tax incentives. It’s just an expense, a line item, overhead. You can waste it with a phone call and almost nobody will get fired, aside from a few marketing communications (“marcom”) types and their expensive ad agencies.

About TV’s fatal flaw:

Few would argue that TV is a good thing. Hand-wringing over TV’s awfulness is a huge nonbusiness. TV Free America counts four thousand studies of TV’s effects on children. The TVFA also says 49% of Americans think they watch too much TV, and 73% of American parents think they should limit their kid’s TV watching.

And, as the tobacco industry will tell you, smoking is an “adult custom” and “a simple matter of personal choice.”

Then let’s admit it: TV is a drug. So why do we take it when we clearly know it’s bad for our brains?

Six reasons: 1) because it’s free; 2) because it’s everywhere; 3) because it’s narcotic; 4) because we enjoy it; 5) because it’s the one thing we can all talk about without getting too personal; and 6) because it’s been with us for half a century.

Television isn’t just part of our culture; it is our culture. As Howard Beale tells his audience, “You dress like the tube, you eat like the tube, you raise your children like the tube.” And we do business like the tube, too. It’s standard.

Howard Beale had it right: television is a tube. Let’s look at it one more time, from our point of view.

What we see is a one-way freight forwarding system, from producers to consumers. Networks and stations “put out,” “send out” and “deliver” programs through “channels” on “signals” that an “audience” of “viewers” “receive,” or “get” through this “tube.” We “consume” those products by “watching” them, often intending to “vege out” in the process.

Note that this activity is bovine at best, vegetative at worst and narcotic in any case. To put it mildly, there is no room in this metaphor for interactivity. And let’s face it, when most people watch TV, the only thing they want to interact with is the refrigerator.

Metaphorically speaking, it doesn’t matter that TV contains plenty of engaging and stimulating content, any more than it matters that life in many ways isn’t a journey. TV is a tube. It goes from them to us. We just sit here and consume it like fish in a tank, staring at glass.

Of course we’re not really like that. We’re conscious when we watch TV.

Well, of course we are. So are lots of people. But that’s not how the concept works, and it’s not what the system values. TV’s delivery-system metaphors reduce viewing to an effect — a noise at the end of the trough. And they reduce programming to container cargo. “Content,” for example, is a tubular noun that comes straight out of the TV conversation. What retailer would demean its goods with such a value-subtracting label? Does Macy’s sell “content”? With TV, the label is accurate. The product is value-free, since consumers don’t pay a damn thing for it.

There is a positive side to the entertainment conversation, of course. Writers, producers, directors and stars all put out “shows” to entertain an “audience.” Here the underlying metaphor is theater. By this conceptual metaphor, TV is a stage.  But the negotiable market value of this conversation is provided entirely by its customers: the TV stations and networks. The audience, however, pays nothing for the product. Its customers use it as advertising bait. This isolates the show-biz conversation and its value. You might say that TV actually subtracts value from its own product, by giving it away.

And, the story of TV’s death foretold:

In the long run (which may not be very long), the Web conversation will win for the simple reason that it supports and nurtures direct conversations, and therefore grows business at a much faster rate. It also has conceptual metaphors that do a better job of supporting commerce.

Drugs have their uses. But it’s better to bet on the nurtured market than on the drugged one.

Trees don’t grow to the sky. TV’s $45 billion business may be the biggest redwood in the advertising forest, but in a few more years we’ll be counting its rings. “Propaganda ends where dialog begins,” Jacques Ellul says.

The Web is about dialog. The fact that it supports entertainment, and does a great job of it, does nothing to change that fact. What the Web brings to the entertainment business (and every business), for the first time, is dialog like nobody has ever seen before. Now everybody can get into the entertainment conversation. Or the conversations that comprise any other market you can name. Embracing that is the safest bet in the world. Betting on the old illusion machine, however popular it may be at the moment, is risky to say the least…

TV is just chewing gum for the eyes. — Fred Allen

This may look like a long shot, but I’m going to bet that the first fifty years of TV will be the only fifty years. We’ll look back on it the way we now look back on radio’s golden age. It was something communal and friendly that brought the family together. It was a way we could be silent together. Something of complete unimportance we could all talk about.

And, to be fair, TV has always had a very high quantity of Good Stuff. But it also had a much higher quantity of drugs. Fred Allen was being kind when he called it “chewing gum for the eyes.” It was much worse. It made us stupid. It started us on real drugs like cannabis and cocaine. It taught us that guns solve problems and that violence is ordinary. It disconnected us from our families and communities and plugged us into a system that treated us as a product to be fattened and led around blind, like cattle.

Convergence between the Web and TV is inevitable. But it will happen on the terms of the metaphors that make sense of it, such as publishing and retailing. There is plenty of room in these metaphors — especially retailing — for ordering and shipping entertainment freight. The Web is a perfect way to enable the direct-demand market for video goods that the television industry was never equipped to provide, because it could never embrace the concept. They were in the eyeballs-for-advertisers business. Their job was to give away entertainment, not to charge for it.

So what will we get? Gum on the computer screen, or choice on the tube?

It’ll be no contest, especially when the form starts funding itself.

Bet on Web/TV, not TV/Web.

Looking back on all that, I wince at how hyperbolic some of it was (like, there really is some demand for some messages), but I’m still pleased with what I got right, which is that the Web eats TV. Which brings me to the precipitating post, YouTube is Huge and About to Get Even Bigger, by Jennifer Van Grove in Mashable. Sez Jennifer,

According to YouTube, the hours of video uploaded to YouTube every minute has been growing astronomically since mid-2007, when it was just a measly six hours per minute. Then, in “January of this year, it became 15 hours of video uploaded every minute, the equivalent of Hollywood releasing over 86,000 new full-length movies into theaters each week.”

Now, just a few months later and we’ve hit the 20 hour per minute milestone, which means that for every second in time about 33 minutes of video make it to YouTube, and that for any given day 28,800 hours of video are uploaded in total…

Even though YouTube is seeing such massive upload numbers, and we think that speaks to the strength of their community, they still have monetization challenges that are only exacerbated by the rising bandwidth costs required to support such an enormous load. Bandwidth costs are already proving to be the bane of YouTube’s existence, possibly resulting in $470 million in losses for this year alone.

So while YouTube’s outwardly celebrating that we’re dumping 20 hours of video on their servers every minute, we think they should count their blessings with a little more realism since, based on previous patterns, this number, along with bandwidth costs, will only continue to rise.

“Rise” is too weak a verb. What we have here is something of an artesian flood, a continent of blooming volcanoes.
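A quick back-of-envelope check of the quoted figures is worth doing (a sketch, assuming the 20-hours-per-minute rate holds steady): 20 hours of video per real-time minute works out to 20 minutes of video per second, not the 33 quoted, while the daily total of 28,800 hours checks out.

```python
# Back-of-envelope check of the upload figures quoted above.
hours_per_minute = 20  # hours of video uploaded per real-time minute

# 20 hours = 1,200 minutes of video arriving every 60 seconds.
minutes_per_second = hours_per_minute * 60 / 60   # 20.0 minutes of video per second

# Scaled up to a full day.
hours_per_day = hours_per_minute * 60 * 24        # 28,800 hours per day

print(minutes_per_second, hours_per_day)  # 20.0 28800
```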

In the old top-down world of broadcasting, all we had were a few thousand big transmitters, each with limited reach, stretched and widened by cable and satellite TV. (Remember that what we call “cable” began as CATV: Community Antenna TeleVision.) It is over these legacy systems, plus the upgraded phone system, that most of us are connected to the Internet today.

In the legacy TV world, transmitters are obsolete to the verge of pointlessness. So are “channels.” So are the “networks” that are now just distributors for TV shows. All that matters is “content,” as they say. And that’s moving online, huge-time.

Tomorrow’s shows won’t be coming only from big-time program producers. We’ll be getting them from each other as well. We already see that with YouTube, but in relatively low-def resolutions. Still, it’s a start. At the end of the next growth stage we’ll be producing our own damn shows, and at resolutions higher than cable can bear. So will the incumbent producers, of course, but they won’t be taking the lead in pushing for wider bandwidth. That’s an easy call because they’re not taking the lead right now, and they should be. Instead they’ve left it up to us: the “viewers” who are now becoming producers and reproducers.

Already you can get a camcorder that will shoot 1080p video for well under a $grand. That’s more resolution than you’ll get from cable or satellite, with a few pay-per-view exceptions. Combine the sphinctered nature of cable and satellite TV bandwidth with the carriers’ need to compete by carrying more and more channels, and what you get is stuff that’s “HD” in name only. While the resolution might be 720p or 1080i, the amount of actual data carried on each channel is minimal or worse, resulting in skies that look plaid and skin that looks damaged. All of which means that the best thing you can see — today — on your new 1080p screen comes from your new 1080p camcorder. (Unless you pay bux deluxe for a Blu-Ray player, which not many of us are doing.) So: how long before ordinary folks are producing their own high-def movies, in large numbers? How long before that pounds out the walls of pipes all over the place?

Even if that takes a while, we have to face facts. We’re going to need the bandwidth. Storage and processing we’ve got covered, because that’s at the edges, where there’s not much standing in the way of growth and enterprise. In the middle we’ve got a worldwide bandwidth challenge.

The phone and cable companies can’t give it to us — at least not the way they’re currently set up. Even the best of the carrier breed — Verizon FiOS, which I’m using right now, and appreciating a great deal — is set up as a top-grade cable TV system that also delivers Internet. Not as a fat data pipe between any two points, which is what we’ll need.

Pause for a moment and recall this scene from the movie “Jaws”. “We’re gonna need a bigger boat,” Roy Scheider says.

TV on the Net is the shark in this story. The Quint role is being played by the carriers right now. They need to be smarter than what we’ve seen so far. So do the rest of us.


[image: jesusita_google_modis10]

Where most of my earlier shots in this series were of fire detection and spread across time, the one above (and in the larger linked shot, on Flickr) is of “fire radiative power”. If you look at the whole set, you can get an idea of both intensity and spread across time. Again, these are from MODIS, which is an instrument system on satellites passing more than 700km overhead. Still, it finds stuff, and dates it. That’s why this next shot is very encouraging:

[image: jesusita_google_modis11]

It will sure spread some more, but we can see the end coming. Here’s the whole photo set.

And here’s the latest update on exactly what burned (addresses and all) from Matt Kettmann, Sam Kornell, Chris Meagher, Ben Preston and Ethan Stewart of the Independent.

They also issue a caution:

The bad news is that the fire still threatens parts of Goleta to the west, the Painted Cave community to the north, and, to the east, parts of Santa Barbara and Montecito, where the evacuation order was just extended once again.

Those Indy folks did — and are still doing — an outstanding job, deserving of whatever rewards are coming their way. Great work by everybody else reporting on the fire as well. Kudos all around.

And great work, of course, by the firefighters. They saved the city. If you’ve ever seen a fire this big and threatening (for example, Oakland, which I did see, and which took out more than 3500 homes), you know how hard it is to stop. Around 80 homes were lost in this one. It could have been many more. If Cheltenham, or the Riviera, had gone up, and the sundowner winds kept blowing, it’s not hard to imagine losing the whole city, since the rain of flaming debris would have caused a true firestorm. From the same Indy report:

“The firefighters must have sat in every single backyard and held it off. The fire reached literally the backyards of every single one of them, but I didn’t see a single house burned up there.”

The mountains won’t be as pretty for a couple of years. But the city will also be safer. That’s the upside. 2:54pm Pacific

Here is a great map that shows all three fires in the last year, as well as good information about the ongoing Jesusita Fire.


Thanks to Keith McArthur for clueing me in on Cluetrainplus10, in which folks comment on each of Cluetrain’s 95 theses, on roughly the 10th anniversary of the day Cluetrain went up on the Web. (It was around this time in 1999.)

The only thesis I clearly remember writing was the first, “Markets are conversations.” That one was unpacked in a book chapter, and Chris Locke has taken that assignment for this exercise. Most of the other theses are also taken, so I chose one of the later ones, copied and pasted here:

71. Your tired notions of “the market” make our eyes glaze over. We don’t recognize ourselves in your projections—perhaps because we know we’re already elsewhere. Doc Searls @dsearls

Ten years later, that disconnect is still there. Back when we wrote Cluetrain, we dwelled on the distance between what David Weinberger called “Fort Business” and the human beings both inside and outside the company. Today there is much more conversation happening across those lines (in both literal and metaphorical senses of the word), and everybody seems to be getting “social” out the wazoo. But the same old Fort/Human split is there. Worse, it’s growing, as businesses get more silo’d than ever — even (and especially) on the Net.

For evidence, look no farther than two of the most annoying developments in the history of business: 1) loyalty cards; and 2) the outsourcing of customer service to customers themselves.

Never mind the inefficiencies and outright stupidities involved in loyalty programs (for example, giving you a coupon discounting the next purchase of the thing you just bought — now for too much). Just look at the conceits involved. Every one of these programs acts as if “belonging” to a vendor is a desirable state — that customers are actually okay with being “acquired”, “locked-in” and “owned” like slaves.

Meanwhile, “customer service” has been automated to a degree that is beyond moronic. If you ever reach a Tier One agent, you’ll engage in a conversation with a script in human form:

“Hello, my name is Scott. How are you today?”

“I’m fine. How are you?”

“Thank you for asking. I’m fine. How can I help you today?”

“My X is F’d.”

“I’m sorry you’re having that problem.”

Right. They always ask how you are, always thank you for asking how they are, and are always sorry you have a problem.

They even do that chant in chat sessions. Last week I had four chat sessions in a row with four agents of Charter Communications, the cable company that provides internet service at my brother-in-law’s house. This took place on a laptop in the crawl space under his house. All the chats were 99% unhelpful and in some ways were comically absurd. The real message that ran through the whole exchange was, You figure it out.

Last week in the New York Times, Steve Lohr wrote Customer Service? Ask a Volunteer. It tells the story of how customers, working as voluntary symbiotes in large vendor ecosystems, take up much of the support burden. If any of the good work of the volunteers finds its way into product improvement, it will provide good examples of what Eric von Hippel calls Democratizing Innovation. But most companies remain Fort Clueless on the matter. Sez one commenter on a Slashdot thread,

There’s a Linksys cable modem I know of that has a recent firmware, and by recent I mean last year or so. Linksys won’t release the firmware as they expect only the cable companies to do so. The cable companies only release it to people who bought their cable modems from them directly. So there are thousands of people putting up with bugs because they bought their modem retail and have no legitimate access to the updated firmware.

What if I pulled this firmware from a cable company owned modem and wrote these people a simple installer? Would the company sing my praises then?

The real issue here is that people frequent web boards for support because the paid phone support they get is beyond worthless. Level 1 people just read scripts and level 2 or 3 people can’t release firmware because of moronic policies. No wonder people are helping themselves. These companies should be ashamed of providing service on such a low level, not happy that someone has taken up the slack for them.

Both these annoyances — loyalty cards and customer support outsourced to customers — are exacerbated by the Net. Loyalty cards are modeled to some degree on one of the worst flaws of the Web: that you have to sign in to something before you make a purchase. This is a bug, not a feature. And the Web makes it almost too easy for companies to direct customers away from the front door. They can say  “Just go to our Website. Everything you need is there.” Could be, but where? Even in 2009, finding good information on most company websites is a discouraging prospect. And the last thing you’ll find is a phone number that gets you to a human being, even if you’re prepared to pay for the help.

So the “elsewhere” we talked about in Cluetrain’s 71st thesis is out-of-luck-ville. Because we’re still stuck in a threshold state: between a world where sellers make all the rules, and a world where customers are self-equipped to overcome or obsolete those rules — by providing new ones that work the same for many vendors, and provide benefits for both sides.

This whole issue is front-burner for me right now. One reason is that I’m finally getting down (after three years) to unpacking The Intention Economy into a whole book, subtitled “What happens when customers get real power” (or something close to that). The other is that this past week has been one in which my wife and I spent perhaps half of our waking lives on the phone or the Web, navigating labyrinthine call center mazes, yelling at useless websites, and talking with tech support personnel who were 99% useless.

A Tier 2 Verizon person actually gave my wife detailed instructions on how to circumvent certain call center problems in the future, including an unpublished number that is sure to change — and stressing the importance of knowing how to work the company’s insane “system”. And that’s just one system. Every vendor of anything that requires service has its own system. Or many of them.

These problems cannot be solved by the companies themselves. Companies make silos. It’s as simple as that. Left to their own devices, that’s what they do. Over and over and over again.

The Internet Protocol solved the multiple network problem. We’re all on one Net now. Email protocols solved the multiple email system problem. We don’t have to ask which company silo somebody belongs to before we send email to them. But we still have multiple IM systems. The IETF approved Jabber’s XMPP protocol years ago, but Jabber has been only partially adopted. If you want to IM with somebody, you need to know if they’re on Skype or AIM or Yahoo or MSN. Far as I know, only Google uses XMPP as its IM protocol.

Meanwhile, people text more every day than they IM. This is because texting’s SMS protocol is universally used, both by all phone systems and by Twitter.

The fact that Apple, Microsoft, Skype and Yahoo all retain proprietary IM systems says that they still prefer to silo network uses and users, even after all these decades. They are, in the immortal words of Walt Whitman, “demented with the mania of owning things.”

Sobriety can only come from the customer side. As first parties in their own relationships and transactions, they are in the best position to sort out the growing silo-ization problems of second and third parties (vendors and their assistants).

Once customers become equipped with ways of managing their interactions with multiple vendors, we’ll see business growing around buyers rather than sellers. These are what we’re starting to call fourth party services: ones that Joe Andrieu calls user driven services. Here are his posts so far on the topic:

  1. The Great Reconfiguration
  2. Introducing User Driven Services
  3. User Driven Services: Impulse from the User
  4. User Driven Services: 2. Control

(He has eight more on the way. Stay tuned.)

Once these are in place, marketers will face a reciprocal force rather than a subordinated one. Three reasons: 1) customer choices will far exceed the silo’d few provided by vendors acting like slave-owners; 2) customers will have help from a new and growing business category; and 3) customers are where the money comes from. Customers also know far more about how they want to spend their money than marketers do.

What follows will be a collapse of the guesswork economy that has comprised most of marketing and advertising for the duration. This is an economy that we were trying to blow up with Cluetrain ten years ago. It’s what I hope the next Cluetrain edition will help do, once it comes out this summer.

Meanwhile, work continues.


Oft-rode vehicles

Back in the summer of ’05, I put up a post that ran down a list of all the cars I’ve owned. Since then I’ve added one more car to that list. Since it’s giving me trouble lately I thought I’d copy over and update the original vehicular C.V. and add a few more words of woe. Here goes…

On my 58th birthday, I find myself thinking, for no reason other than sleeplessness (it’s 12:30am), about all the cars I’ve owned. In rough order, they are:

  1. Black 1963 Volkswagen ragtop beetle. Rolled it in the Summer of ’66, when I was turning 19. That one had a 1200cc engine. A friend had a new ’66 with a 1300cc engine, and we were out doing time trials to see the difference. Mine lost, of course, but I didn’t roll it while racing, or anything close. Instead it was when we were just driving around the North Carolina countryside. Right after I realized that I couldn’t keep up with my buddy’s car, I slowed down, closed the cloth (actually, vinyl) sunroof, and entered a curve that bent right where a dirt road came in from the left. Gravel had migrated onto the pavement, and when the car hit the curve, the rear end spun out. As Consumer Reports said of the car (as best I recall), “slight understeer changes abruptly and unexpectedly to unstable oversteer, to the limits of tire adhesion.” The pavement came up to my window and disappeared overhead three times before the car came to rest, right side up. I was a bit banged up, but okay. Oddly, both shoes were next to each other on the road, also right side up, also facing the forward direction, looking like I had just stepped out of them — about 80 feet behind where the car had come to rest.
  2. Black 1961 English Ford Consul II sedan. Piece of crap. Leaked oil from everywhere.
  3. Midnight blue 1958 Mercedes 220S sedan. Fast and solid. Had seats that reclined to make the whole interior a bed. Had a bizarre “Hydrack” transmission: four on the column, no clutch on the floor. Sold it after the Hydrack died.
  4. Blue 1963 Chevy Bel-Air 4-door sedan. 283 V8. Automatic. Great car. Sold it when the transmission began failing.
  5. Yellow 1966 Volvo 122s sedan. Straight 4. Stick. Solid car. Sold it because I needed a wagon.
  6. Dark green 1966 Peugeot 404 wagon. Stick. Would hold anything. Had screw-on hubcaps, among other design oddities. Rusted to death.
  7. Snot-green 1969 Chevy Biscayne sedan. 287 V8. Automatic. Looked like an unmarked cop car. Drove it into the ground. It was this Chevy, more than any other car I’ve owned, that made me a shadetree mechanic of GM V8 cars.
  8. White 1970 Austin America, with a black stripe down its middle. Belonged to my sister, then my father, then me, then my father. Brilliant design, front wheel drive, transverse 4-cylinder engine, manual-automatic transmission, quirky and way ahead of its time.
  9. White 1970 Pontiac Catalina sedan. 327 V8. 4 door. Automatic. Leaked water into the trunk. Failed often without reason. Real beast of a car.
  10. Dark red 1974 Datsun pickup. Straight 4. Stick. Father’s car. Had use of it for a year or so. Seat was so bouncy your head hit the roof. Had two sets of points in the distributor: a vintage Datsun “feature.”
  11. Sky blue 1974 Ford Pinto wagon. Straight 4 that was flat on one side and looked like half an engine. Stick. Piece of shit. Moved kind of crabwise, due to an earlier accident, before I got the car.
  12. Blue 1980 Chevy Citation fastback. V6. Automatic. Bought it from my aunt after her stroke. Like the Pinto, but more comfortable.
  13. Sky blue 1970-something Volkswagen squareback. Had to crawl under the back of it with a hammer to hit the starter. Parked on hills so I could start it by rolling a ways and then popping the clutch. Was found burned to the metal on a side road a few months after I sold it.
  14. Blue 1978 Honda Accord fastback. Straight four. One of the first “good” Hondas. Though this one wasn’t, it turned out. Bought it from a dishonest mechanic, which I didn’t find out until the engine failed after I sold it. The new owner came after me, however. I was then in California and they were in North Carolina. We settled, but both felt burned.
  15. Dark red 1985 Toyota Camry. Straight 4. Stick. First and only new car I ever bought. Also the best, by far. Towed everything I owned in a U-Haul to California in August ’85. All but failproof. Eventually gave it to my daughter, who finished driving it to past 300,000 miles, I think. Only car I ever had where the AC actually worked.
  16. Sand-colored 1992 Infiniti Q45a. Wife’s car. Got it almost new in 1994. Best-performing, most enjoyable car I’ve ever driven. More about it here.
  17. Dark red 1988 Subaru wagon. Transverse 4. Stick. Front wheel drive that goes to 4WD, which requires four tires of identical circumference, so it has never worked quite right. Bought it from Buck Krawczyk in ’94. Handy for hauling stuff. I beat the crap out of it, but it won’t die. If I need a nice car I rent one or drive my wife’s 1995 Infiniti Q45a, which is a good car but not the equal of her 1992 Q45a, which it replaced and I still miss.
  18. Black 2000 Volkswagen Passat wagon. 1.8 Turbo engine. Tiptronic automatic transmission. Comfortable. Outstanding handling. Great for hauling stuff around, too. Got this in 2006, I think. Bought it from a friend who was leaving the country. Cost me $5k. Had 111,000 miles on it, and needed a bit of work. I put about $3k into it before taking it across the country to Boston in September 2007. Since then it has had about another $10k of work.

Anyway, the Passat lately has not been turning off when I take the key out. The engine keeps running. Weird. For that I had the ignition switch replaced. That helped for less than a day. Meanwhile it often thinks I’m breaking into it when I’m not, going into honking no-start mode.

I’ll be leaving it with the mechanic while I head to Atlanta next week. Hope they can figure it out.

I don’t think I’ve ever had a car that was so completely well-made and trouble-prone. My old ’85 Camry was a thin-metal plastic-filled thing, and all but failproof. This Passat has great fit & finish, it’s tight mechanically, and drives like new. But man, it costs a pile to run.


Me too. Which brings up the subject of this post here.


Barack Obama wants to wait on the DTV shift currently scheduled for 17 February. On the grounds that it’ll be a mess, this is a good idea. But nothing can make it a better idea. It’s not that the train has left the station. It’s that the new OTA (over the air) Oz is mostly built-out and it’s going to fail. Not totally, but in enough ways to bring huge piles of opprobrium down on the FCC, which has been rationalizing this thing for years.

I explain why in What happens when TV’s mainframe era ends next February? Most VHF stations moving to UHF will have sharply reduced coverage. The converter shortage is just a red herring. The real problem is signals that won’t be there.

Most cable customers won’t be affected. But even cable offerings are based on over-the-air coverage assumptions. Those may stay the same, but the facts of coverage will not. In most cases coverage will shrink.

FCC maps (more here and here) paint an optimistic picture. But they are based on assumptions that are also overly optimistic, to say the least. Wilmington, NC was chosen as a demonstration market. Bad idea. One of the biggest stations there, WECT, suffers huge losses of coverage.

Anyway, it’s gonna be FUBAR in any case.


So now my dream app is ready on the iPhone. It’s just the beginning of What It Will Be, but it’s highly useful. If you have an iPhone, go there and check it out. It’s free.

As you see here, I’m involved, through the Berkman Center, which is collaborating with PRX, which is working under a grant from the Corporation for Public Broadcasting (CPB). Major props go to the PRX developers, who have been working very, very hard on this thing. Some of the most diligent heads-down programming I’ve seen.

An interesting thing. In the old days, when an app came out, in any form, on nearly any platform, there was this assumption that it was a Done Thing, and should be critiqued on those grounds. Not the case here. This is a work in progress, and the process is open. In the long run, we should see much more opened up as well.

Parenthetically, I think right now we’re looking at some cognitive dissonance between the Static Web and the Live Web, when the latter seems to look like the former. You have a website, or an app. These seem to be static things, even when they’re live. An app like the Public Radio Tuner is more of a live than a static thing. But it’s easy, as a user, to relate to it as a static thing. Because at any one time it does have more static qualities than live ones. Imagine a house you can remodel easily and often, and at low cost and inconvenience. That’s kind of what we have here. A cross between product and process — that looks like the former even when it’s doing the latter. Anyway…

Though this grant is for an iTunes app, work is sure to go on to other platforms as well — such as Android. So, rather than criticize this app for coming out first on the iPhone, please provide feedback and guidance for next steps beyond this first effort (and join me in giving the developers a high five for delivering a functional app in a remarkably short time). And in the reviews section at iTunes, provide honest and constructive reviews. At this stage I’m sure they’ll be good. (Some of the bad reviews were on the very first version released, which has since been replaced.)

To VRM followers and community members, VRM is very much on the agenda, and we’re thinking and working hard on what the VRM pieces of this will be, and how they’ll work. This may be the first piece of work where VRM components appear, and we want to do them right. Also bear in mind that this is the first step on a long, interesting and fruitful path. Or many paths. Interest and guidance is welcome there too.


Since I’m an aviation freak, I’m also a weather freak. I remember committing to getting my first color TV, back in the mid-70s, because I wanted to see color radar, which at that time was carried by only one TV station we could get from Chapel Hill: WFMY/Channel 2 in Greensboro. These days TV stations get their radar from elsewhere, and have mothballed their old radar facilities. (Here’s one mothballed TV radar tower, at the WLNE/Channel 6 transmitter, which is itself doomed to get mothballed after the nationwide February 17 switchover to digital TV — marking the end of TV’s Mainframe Era.)

Online I’ve been a devoted watcher of both Weather.com and Weather Underground. Both those last two links go to local (Cambridge, MA) maps. They’re good, but they don’t quite match Intellicast, source of the map above. Play around with the pan & zoom, the animation and the rest of it. It’s a nice distraction from weather as ugly as what we’re getting right now: sleet and then rain atop enough snow to cancel school today.


The more I fly, the more useful, or at least interesting, the NOAA‘s AviationWeather.gov service becomes. At any given moment it has dozens of different reports on weather at altitude, across North America. The one above is among the many that show potential or reported turbulence.

I also just discovered TurbulenceForecast.com, with the TurbulenceForecast Blog. There’s a lot of overlap with AviationWeather.gov, since it uses a lot of maps and data from there.

Here’s the FAA’s page on flight delays. Plus FlightAware, the best of a bad bunch — too much Flash and other stuff that doesn’t work on too many browsers, especially ones in handhelds. Speaking of which, I’ve lately been appreciating FlightTrack. The list could go on, but I need to move on. See ya in Boston. (At IAD now. The last two paragraphs were written at SFO, where connectivity was minimal.)

Oh, click on the map above and check out the current maximum turbulence potential between here (Washington) and Boston. So far there’s just one pilot report, of moderate turbulence, over Connecticut.


There’s a good chance that the best picture you can put on your HD screen doesn’t come from your cable or satellite TV company, but from your new HD camcorder. As time and markets march on, that chance will only get larger. That’s because there is a trade-off between the number of channels carried and the quality of each channel. The compression shows up as “artifacts” in the picture itself. Gradations of shading and color, such as in a blue or gray sky, turn to a mosaic of blocks. (In this shot, I show how grass on a football field has pimples.) Carriers compete more by the number of channels they carry than by the quality of each channel. (There are exceptions to this, but on the whole that’s what we’ve got.) Meanwhile your camcorder quality only goes up.
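The arithmetic behind that trade-off is simple, sketched below with typical (assumed, not measured) figures: one 6 MHz cable slot using 256-QAM carries roughly 38.8 Mbps of payload, carriers often multiplex two or three HD streams into that slot, and a consumer AVCHD camcorder can record at up to about 24 Mbps.

```python
# Illustrative bitrate budget: multiplexed cable HD stream vs. consumer camcorder.
# All figures are typical/assumed, not measurements of any particular carrier.
cable_slot_mbps = 38.8        # rough payload of one 6 MHz, 256-QAM cable slot
hd_streams_per_slot = 3       # carriers often squeeze 2-3 HD streams into a slot
per_stream_mbps = cable_slot_mbps / hd_streams_per_slot   # ~12.9 Mbps each

camcorder_mbps = 24           # e.g. a top AVCHD recording rate

print(f"cable HD stream: ~{per_stream_mbps:.1f} Mbps; camcorder: {camcorder_mbps} Mbps")
```

Under those assumptions, each broadcast HD stream gets barely half the bits your camcorder lays down, which is where the plaid skies come from.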

And as camcorder quality goes up, more of us will be producing rather than consuming our video. More importantly, we will be co-producing that video with other people. We will be producers as well as consumers. This is already the case, but the results that appear on YouTube are purposely compressed to a low quality compared to HDTV. In time the demand for better will prevail. When that happens we’ll need upstream as well as downstream capacity.

So here’s a piece in Broadband Reports that shows how carriers can be out of touch with the future, even as they increase the capacities of their offerings. An excerpt:

In upgraded markets, Comcast is not only upgrading existing speed tiers ($42.95 “Performance” 6Mbps/1Mbps and $52.95 “Performance Plus” 8Mbps/2Mbps tiers became 12Mbps/2Mbps and 16Mbps/2Mbps), but is adding two new tiers to the mix ($62.95 “Ultra” 22Mbps/5Mbps and the aforementioned $139.95 “Extreme 50″ 50Mbps/10Mbps).

One recurring theme we’ve seen in our forums is that the new speeds have many users downgrading. In both forum threads and polls, many customers on Comcast’s 16Mbps/2Mbps tier say they’re downgrading to their 12Mbps/2Mbps tier — apparently because they don’t think an additional 4Mbps downstream is worth $10. Customers used to be willing to pay the additional $10 for double the upstream speed, but there’s no longer an upstream difference between the tiers.

That last line is the kicker. Comcast apparently still thinks that downstream is all that really matters. It isn’t. For anybody producing a lot of photography or video, upstream not only matters more, but supports activities where the user can see the difference.

In fact there isn’t a lot of perceived difference between 12Mbps and 16Mbps on the downstream side. Either is fast enough for a YouTube video. But on the upstream side, you can see the difference. In my case, that difference appears in the progress bars for pictures I upload to Flickr.
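A rough sketch makes the asymmetry concrete (the 500 MB clip size is an assumption for illustration; protocol overhead is ignored): on the upstream side, every extra megabit per second directly shortens the wait.

```python
# Rough upload-time comparison (illustrative file size; ignores protocol overhead).
def upload_minutes(size_mb: float, upstream_mbps: float) -> float:
    """Minutes to push size_mb megabytes through an upstream_mbps link."""
    bits = size_mb * 8_000_000              # treating 1 MB as 8,000,000 bits
    return bits / (upstream_mbps * 1_000_000) / 60

video_mb = 500  # an assumed short HD clip
for up in (1, 2, 5, 10):
    print(f"{up:>2} Mbps upstream: {upload_minutes(video_mb, up):5.1f} minutes")
```

At 2 Mbps up, that clip takes over half an hour; at 10 Mbps, under seven minutes. Downstream differences of the same size change nothing you can feel.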

A few months ago I upgraded my Verizon FiOS service from 20/5Mbps to 20/20Mbps. The difference was obvious as soon as it went in. The difference will be a lot more obvious to a lot more people once those people start sharing, mashing up and co-producing higher-definition videos.

Just watch.


On departure from Zürich to Paris yesterday the ground was shrouded in gloom and haze, but above it the sky was clear and crystalline. I sat purposely on the left side of the plane to get a view, even though I knew I’d be photographing the scene against the sun, which would be low in the early afternoon on a day approaching the Winter Solstice. Worse, the window looked like it had been cleaned with fine-grit sandpaper. Still, I got some nice shots with my old Tamron zoom and the Canon Rebel XTi (borrowed from the excellent and generous Rebecca Tabasky, a colleague at the Berkman Center).

I’m guessing the plane was about a hundred miles from the shot above. Closer for some of the early ones, and much farther for some of the later ones, some of which feature Mont Blanc, the only peak I could easily identify. I’m hoping some of the rest of you can fill in the blanks.


Back in the 80s junkies were stealing radios from cars. Now it’s GPS units. At Logan Airport, bright signs greet you in the parking lot: REMOVE YOUR GPS UNITS, or words to that effect. I forget exactly. But the point is, they’re bait for thieves.

We have had two stolen in the last two months, both from our parked car in the driveway. The first was a Garmin 340c, and it was sitting on the dashboard. The second was a Garmin Nuvi 680, stolen along with a bunch of other stuff, even though it was hidden.

That was yesterday. I found out when a cop showed up at our front door asking if we’d had a GPS stolen. I said, “Yes, last month.” He said “How about last night?” I said I didn’t know. So we went to look at the car, and sure enough, it was gone, along with cables and chargers for various stuff, plus a mount for a Sirius satellite radio.

Turns out the cops caught some people in the act, though not at our place. But they found our GPS freshly stolen. They looked up “Home” on it and found our address. Handy.

So we went down to the station to retrieve it last night. Not all the pieces were there (it’s missing a mount piece), but it’s fine. The cops told us not to have any mounts on the dashboard or the windshield, or any exposed power cabling that suggests anything of value is hidden somewhere in the car. So now we’re charging the GPS indoors, and not connecting it to anything inside the car. We just lay it in a space between the front seats and let it work there.

Not exactly the way it was designed to be used, but safer anyway. Sad it’s come to that, though.

[A month later...] Now we have a new routine. The GPS and all cabling (including a splitter and charger cable for our iPhones) go in a dark bag that gets thrown among junk in front of the back seats. The GPS mount, a bean-bag affair, gets turned upside down (where it’s black and looks like nothing other than more junk) and stuffed under one of the front seats. It takes about 40 seconds to set up the GPS, but at least it charges in the car and works like it should. So far, no more thefts. It helps, however, to have a messy car.

Tags: , , ,

One reason I got the iPhone was that it’s GSM. Meaning it should work outside the U.S. I also thought I had a plan with AT&T that allowed that. Well, now I’m in Europe and my iPhone just says “Searching…”. Did it in Frankfurt, and does it in London.

Anybody have any clues for a fix on this?

[Later...] Fixed. See comments below, and thanks to everybody.

Tags: , , , , ,

G-Mobile

The G1 gets covered by the Guardian. The new phone was launched today by T-Mobile.

What happens after TV’s mainframe era ends next February? That’s the question I pose in a long essay by that title (and at that link) in Linux Journal.

It makes a case that runs counter to all the propaganda you’re hearing about the “digital switchover” scheduled for television next February 17.

TV as we know it will end then. It’s worse than it appears. For TV, at least. For those already liberated, a growing new world awaits. For those still hanging on the old transmitter-based teat, it’ll be an unpleasant weaning.

I’m not a car nut — I could never afford to be, lacking both the money and the time — but I do enjoy and appreciate them as works of art, science, culture and plain necessity. So, about a month ago the kid and I joined Britt Blaser at the Concours d’Elegance in Newport Harbor, looking at an amazing collection of antique cars and motorcycles, all restored or preserved to a level of perfection you hardly find in new cars off the production line.

We also got to hang with new friends from Iconic Motors, who are making a very hot little sports car designed and made entirely in the U.S., mostly by small manufacturers of obsessively perfected goods. Took a lot of pictures of both, which you’ll find by following the links under the photos.

This is mostly true:

This one is my fave.

There is no business I wish more that I had thought of than Despair.com. Just freaking brilliant. And humbling.

Clicking on the picture above will take you on a slideshow tour of the Grand Canyon, shot from the right side of an LAX-bound 757 that departed from Boston. I have no idea what movie was showing at the time; though I do know I refused, as I usually do, to close my windowshade to reduce ambient light on the ancient crappy ceiling-mounted TV screens. The scene outside upstaged the movie in any case, as it has been doing for the last several million years, as the Kaibab Plateau has pushed its dome upward and the Colorado has stayed roughly where it had been since the many millions of years before that, when it wandered lazily across a flat plain.

As ranking canyons go, the Grand Canyon is almost too grand. It’s freaking huge. From the air I find it far more dramatic to peer down into its narrower regions, such as the one above, which is early in the Colorado’s course through the canyon. The series follows the canyon from east to west, from not far below Glen Canyon dam and the Vermilion Cliffs area to Vulcan’s Throne and Lava Falls, where relatively recent flows have slopped their blackness down across the canyon’s iconic layer-cake strata.

What is most amazing to me about this corner of The West is that it was obviously placid through so many time stretches across the last almost two billion years. The West is painted with the colors of long periods of relative quiet, as sands and silts and gravel and cobbles were deposited by braided rivers and transgressing seas.

All of the Grand Canyon’s strata were laid down before the age of dinosaurs. Younger layers, such as those comprising the Vermilion Cliffs to the east, the Grand Staircase upstream in the Glen Canyon area, Canyonlands, Arches, and most of Utah’s most colorful layer-cake displays (Bryce, Zion, Capitol Reef, Cedar Breaks, San Rafael Reef and Swell), consist of rock eroded off the top of the Kaibab Plateau.

Some of the shots were taken with my Canon 30D, and others with my tiny PowerShot 850, which does a better job of shooting straight down through the window. Its smaller lens distorts less through the plane’s multiple layers of bad glass and plastic windows. And the display on the back lets me shoot without looking through an eyepiece. It’s not perfect, but not bad, either.

I still miss my Nikon Coolpix 5700, which took lots of great pictures out plane windows, and was frankly much better at that job than the Canon, mostly because the Coolpix’s objective lens was smaller (again, better for looking at angles through the terrible optics of plane windows), and partly because the camera’s flip-out viewer allowed me to hold the camera to the window at angles where I couldn’t put my face, but where I could still see and frame the view.

Eastern Greenland blows my mind every time I fly over it. This last trip was no exception. Imagine Alps, Rockies, Himalayas, buried up to their nostrils in snow and ice across an expanse of Saharan dimensions, all of it moving, less an ice cap than a great spreading mound of blue and white, all of it heavy as magma, hard as stone, abrading away at the mountains, leaving horns and scarps protruding above the whiteness. At its edges icebergs calve off constantly and in great profusion, suggesting a bovine maternal quality to the great mound itself.

Anyway, it’s past the equinox and gaining on the winter solstice, so the sun was quite low when we flew over Greenland en route to Denver from London last week. Even so, the subject was there. Amazing sight.

Reading through the comments to Loose Linkage, where I pointed to Jalopnik’s What’s the oldest car you’ve ever owned, I got to wondering if I could remember every car I ever owned, and what happened to it. Here’s a try:

  1. 1963 Volkswagen Beetle. Black. 1200cc engine. Belonged to my parents. Rolled it during summer school after my freshman year in college. In fact, it rolled over three times before coming to rest right-side up. I remember trying to hold onto the bottom of the seat, watching the pavement come up to the window and disappear overhead, over and over again. I was fine, but the bug was totaled. Still, it brought $425 at auction from a guy who cut it in two and attached the front end of it to the back of another one. New it was $1250 or so.
  2. 1960 English Ford Consul. Black. Leaked oil from everywhere. Bought it for $400, sold it for almost nothing, which is what it was worth. The low point came when it croaked in Hickory, NC, to which it limped after the alternator belt blew up on the Blue Ridge and where no replacement could be found, so we had to hitch back to Greensboro. In the rain. As I recall no belts could be found to fit around the alternator pulley, and for a while we used some nylon hose tied into a loop.
  3. 1958 Mercedes 220S. Midnight blue. Bought it for $250, needed new upholstery, which I put in. Had a “Hydrak” semi-automatic transmission. 4-on-the-column, no clutch. The couchlike seats reclined all the way, making the interior into a double bed. This made it a very romantic car. Alas, the transmission went bad, and I sold it for $75.
  4. 1963 Chevy Bel Air. 283 V8. Rochester carb. My parent’s old car, and the first new car they had ever bought. Drove it to 125,000 miles, when the transmission started to go. Sold it.
  5. 1966 Peugeot 404 wagon. Bought for $500. Had dents in all four doors, and lots of stupid “features” such as screw-on hubcaps and spark plugs hidden down inside the valve cover at the far ends of bakelite sleeves that would break. Got rid of it after driving it from New Jersey to North Carolina, in the middle of which a resonator can on the exhaust manifold blew off; and, in an unrelated matter, large hunks of the floor between the front seat and the pedals fell out, so I could see the pavement under my feet, hear the engine noise bypass the exhaust system, and breathe the exhaust, all at once — for another 400 miserable miles.
  6. 1966 Volvo 122S. Bought it from my parents, who bought it new in Belgium. Great car, very solid. Ran out of oil once, however, and damaged the engine. Sold it with 110K miles on it to a guy who replaced the engine.
  7. 1967 (?) Austin America. Belonged originally to my sister. On loan from my father, who later sold it for almost nothing, which is what it was worth. An early front-wheel drive, it had lots of good ideas but terrible construction. I think Pop sold it for $10.
  8. 1971 (?) Datsun pickup. My father’s, actually. But I drove it for a while. It had two sets of points in the distributor. Very confusing. Mastering those helped me later when I had a girlfriend with a Datsun 610 wagon.
  9. 1969 Chevy Biscayne. Snot green. Black vinyl seats. Looked like an unmarked cop car. Developed leaks in the roof. Turning on the heat would steam up the windows. Don’t remember how I got rid of it.
  10. 1978 Volkswagen Squareback. Bought it from a buddy for $200, sold it for $225. Something like that. My buddy and I fixed it more often than we would have, had not beers been involved in prior fixes. A few months after I sold it, cops showed up at my door to tell me I needed to get its corpse out of the woods, where somebody had set it on fire. Still had my plates on it. Fortunately, I had the paperwork for the sale. No idea what happened after that.
  11. 1969 Pontiac Catalina. “Big White.” Bought it from my uncle. The trunk would fill with water in the rain, making it useless for carrying stuff. Not sure what happened to that one, either.
  12. 1980 Chevy Citation. The famous “X car”, created to compete with Chrysler’s equally bad “K car”. It had front wheel drive, which was new in those days, and a roomy sloping hatchback. But it was crap and didn’t last long. Gave it up in a divorce, in trade for my ex’s old Pinto.
  13. 1974 Ford Pinto wagon. One of the worst cars ever made. This one had been in an accident at some point in the long prehistory before I came into possession of it, and the frame was bent, so it moved crabwise down the road. Every once in awhile it would start to veer wildly out of control, even on the straightaway. It did this once on the boulevard between Chapel Hill and Durham, hooking bumpers with another car, sending them both spinning. Fortunately, the Pinto’s bumper bent completely while the other hardly had a dent, which was both strange and amazing. The lady driving the other car wanted money anyway, and I paid. At some point the car just died, as best I recall.
  14. 1979 Honda Accord hatchback. Very nice, smooth-running car that went completely dead on a winding coastal road in the black of night, and then produced light in the form of a flame coming up from between my legs. I slowed to a stop as quickly as I could while feeling the shoulder of the road like I was reading braille through my right tires. When I fished a flashlight out of the glove box and got out of the car I found the car had come to rest exactly one foot from a parked car in front of it. A look under the dash revealed that a hot lead (running from the + side of the electrical system to everything) had been cut at some point in the past, spliced poorly and wrapped in gooey old black electric tape. As the splice came undone, electricity passed through an ever-narrower path until it turned into an incendiary thread, set fire to the tape and then fell apart. So it was easily fixed. But the car, in a very un-Honda-like way, was cursed with problems. I sold it to a young woman for whom it performed fine until the engine blew up. She contacted the mechanic who sold it to me in the first place, found that he had misrepresented the car (saying the engine was original, for example, when it wasn’t), and then sued me rather than him, because I had sold her the car. It was a small claims case in North Carolina. I was by then living in California. So I settled. By then, fortunately, I had bought my…
  15. 1985 Toyota Camry. Basic model with a stick. My first new car, and the first that had working air conditioning. Best car I ever had. Gave it to my daughter when I got the Subaru in the early 90s. I think it went way past 300,000 miles. It may still be working, somewhere in Santa Cruz, which is where she gave it away.
  16. 1986 (?) Subaru 4WD wagon. Tried to drive it into the ground but failed and gave it to a friend earlier this year. It’s still going.
  17. 2000 Volkswagen Passat wagon. Bought for $5k from a friend who was moving out of the country. Put another $3k into it, to bring it up to top shape. Wish it was a stick, but otherwise it’s a great little car. [Summer 2009 update: I have since put another $10k into it. I've never known a better-made yet more repair-intensive car.]

I’m sure I’ve forgotten a few, but that’s an outline for countless stories.

[Later...] Fun comments below. By far the most entertaining (or frightening, or both) pointage out goes to the Head Lemur’s list. Wow. Reminds me of Hot Rod Lincoln, one of the Great Gassed Insanity Songs. Those linked lyrics, by the way, are from the Commander Cody version. The Commander gives the definitive performance of the piece (I just went through the karaoke exercise supported by the audio at that last link, and The Kid said he was glad “nobody was here” to hear it), although full props go to George Wilson for writing (and living) the original.


Great cheap-outs

The best table radio I’ve heard in years is the Cambridge SoundWorks 705. It’s solid and friendly-looking, with an old-fashioned round dial and a nice soft feel to the reduction gears inside its knob. (All three of its knobs feel good, actually.) Sound on AM as well as FM is outstanding, especially considering its small size. It’s mono, but only through its excellent speaker. It’s stereo through the headphone jack, so you can hook up external stereo speakers if you like. The internal antenna works well, and it has a jack for an external one if you want to improve reception. I think it’s a bargain at $99.99 from HiFi.com (Cambridge SoundWorks’ website); but they have it for $20 less at the company’s warehouse store in Needham. Comes in three colors: black, white and gray. I like the white.

Also from Cambridge SoundWorks, the PCWorks speaker system has been selling for years at $39.95 or something. Right now it lists for $10 more than that. The warehouse in Needham has it for $36-something. I’ve bought maybe five of these over the years, usually for service as laptop sound systems. People are always astonished at how good they sound, especially for the money. There’s a shoebox-sized bass unit, tiny (2.5″ square) right and left speakers, and a volume control in the cord that runs from your source (typically a portable MP3 player or a laptop, but anything with a headphone jack). The speaker cables and audio source cable are all long, which makes it easy to spread the speakers far apart or to hide most of the gear somewhere. Comes in white and black.

Want cheap HDTV? Combine that PCWorks speaker system with a low-cost LCD monitor like one of these from Costco, which start at $199. Plug the two into your HDTV set-top box and you’ve got an HDTV for less than $250. That’s kinda what I did yesterday when I needed to test our new Verizon FiOS (fiber optic) installation. We don’t have a TV of any kind here, but we have a PCWorks speaker system and a ViewSonic 22″ 1680 x 1050 display that cost in the low $200s from Costco. It was a jury-rigged setup, but something of a revelation: together they look and sound fabulous.

Last but far from least, the You-Do-It Electronics Center. If you’re a hardware geek who’s lucky enough to live near Boston, this place is Shangri-La. They don’t have everything, but it sure seems that way. (Look Honey, they sell capacitors!) Last night I grazed there for half an hour (way too short a time) and picked up a cheap Y connector (RCA male to 1/8″ female) and a nice Uniden cordless phone with a headset jack. Works very nicely too. You can’t miss the neon signage peering over the northbound entrance ramp to 495 I-95/128 (see comments for the correction) at the Needham interchange. Finding the store is a lot harder. Clue: take the street next to the Hess station, and just look around the industrial district behind there.