infrastructure


Last month The Kid and I went to the top of the Empire State Building on the kind of day pilots describe as “severe clear.” I put some of the shots up here, and just added a bunch more here, to share with fellow broadcast engineering and infrastructure obsessives, some of whom might like to help identify some of the stuff I shot.

Most of these shots were made looking upward from the 86th floor deck, or outward from the 102nd floor. Most visitors only go to the 86th floor, where you can walk outside, and where the view is good enough. It costs an extra $15 per person to go up to the 102nd floor, which is small, but much less crowded. From there you can see but one item of broadcast interest, and it’s so close you could touch it if the windows opened. This is the old Alford master FM antenna system: 32 fat T-shaped things, sixteen above the windows and sixteen below, all angled at 45°.

From the 1960s to the 1980s (and maybe later, I’m not sure yet), these objects radiated the signals of nearly every FM station in New York. They’re still active, as backup antennas for quite a few stations. The new master antennas (there are three of them) occupy space in the tower above, which was vacated by VHF-TV antennas (channels 2-13) when TV stations gradually moved to the World Trade Center after it was completed in 1975.

When the twin towers went down on 9/11/2001, only Channel 2 (WCBS-TV) still had an auxiliary antenna on the Empire State Building. The top antenna on the ESB’s mast appears to be a Channel 2 antenna, still. In any case, it is no longer in use, or usable, since the FCC evicted VHF TV stations from their old frequencies as part of last year’s transition to digital transmission. Most of those stations now radiate on UHF channels. (All the stations continue to use their old channel numbers, even though few of them actually operate on those channels.) Two of those stations — WABC-TV and WPIX-TV — have construction permits to move back to their old channels (7 and 11, respectively).

That transition has resulted in a lot of new stuff coming onto the Empire State Building, a lot of old stuff going away, and a lot of relics still up there, waiting to come down or just left there because it’s too much trouble to bother right now. Or so I assume.

For some perspective, here is an archival photo of WQXR’s original transmitting antenna, atop the Chanin Building, with the Empire State Building in the background. The old antenna, not used in many years, is still up there. Meanwhile the Empire State Building’s crown has morphed from a clean knob to a spire bristling with antennae.

Calling the Fat Tail

I think I’ve figured out a lot of what’s up there, and have made notes on some of the photos. But I might be wrong about some, or many. In any case, a lot of mysteries remain. That’s why I’m appealing to what I call the “fat tail” for help.

The “fat tail” is the part of the long tail that likes to write and edit Wikipedia entries. These are dedicated obsessives of the sort who, for example, compile lists of the tallest structures in the world, plus the many other lists and sub-lists linked to from that last item.

Tower freaks, I’m talking about. I’m one of them, but just a small potato compared to the great , who reports on a different tower site every week. Among the many sites he has visited, the Empire State Building has been featured twice:  January 2001 and November 2003. Maybe this volunteer effort will help Scott and his readers keep up with progress at the ESB.

This Flickr set, by the way, is not at my home pile, but rather at a new one created for a group of folks studying infrastructure at Harvard’s Berkman Center, where I’m a fellow. I should add that I am also studying the same topic (specifically the overlap between Internet and infrastructure) as a fellow with the Center for Information Technology and Society at UCSB.

Infrastructure is more of a subject than a field. I unpack that distinction a bit here. My old pal and fellow student of the subject, , visits the topic here.

Getting back to the Empire State Building, what’s most interesting to me about the infrastructure of broadcasting, at least here in the U.S., is that it is being gradually absorbed into the mobile data system, which is still captive to the mobile phone system, but won’t be forever. For New York’s FM stations, the old-fashioned way to get range is to put antennas in the highest possible places, and radiate signals sucking thousands of watts off the grid. The new-fashioned way is to put a stream on the Net. Right now I can’t get any of these stations in Boston on an FM radio. In fact, it’s a struggle even to get them anywhere beyond the visible horizons of the pictures I took on the Empire State Building. But they come in just fine on my phone and my computer.

What “wins” in the long run? And what will we do with all these antennas atop the Empire State Building when it’s over? Turn the top into what King Kong climbed? Or what it was designed to be in the first place?

Infrastructure is plastic. It changes. It’s solid, yet replaceable. It needs to learn, to adapt. (Those are just a few of the lessons we’re picking up.)


I just posted this essay to IdeaScale at OpenInternet.gov, in advance of the Open Internet Workshop at MIT this afternoon. (You can vote it up or down there, along with other essays.)  I thought I’d put it here too. — Doc


The Internet is free and open infrastructure that provides almost unlimited support for free speech, free enterprise and free assembly. Nothing in human history, with the possible exception of movable type, has done more to encourage all those freedoms. We need to be very careful about how we regulate it, especially since it bears only superficial resemblances to the many well-regulated forms of infrastructure it alters or subsumes.

Take radio and TV, for example. Spectrum — the original “bandwidth” — is scarce. You need a license to broadcast, and can only do so over limited distances. There are also restrictions on what you can say. Title 18 of the United States Code, Section 1464, prohibits “any obscene, indecent or profane language by means of radio communication.” Courts have upheld the prohibition.

Yet, as broadcasters and the “content industry” embrace the Net as a “medium,” there is a natural temptation by Congress and the FCC to regulate it as one. In fact, this has been going on since the dawn of the browser. The Digital Performance Right in Sound Recordings Act (DPRSA) came along in 1995. The No Electronic Theft Act followed in 1997. And — most importantly — there was (and still is) the Digital Millennium Copyright Act of 1998.

Thanks to the DMCA, Internet radio got off to a long and very slow start, and is still severely restricted. Online stations face payment requirements to music copyright holders that are much higher than those for broadcasters — so high that making serious money by webcasting music is nearly impossible. There are also tight restrictions on what music can be played, when, and how often. Music on podcasts is essentially prohibited, because podcasters need to “clear rights” for every piece of copyrighted music they play. That’s why, except for “podsafe” music, podcasting today is almost all talk.

There is also a risk that we will regulate the Net as a form of telephony or television, because most of us are sold Internet service as gravy on top of our telephone or cable TV service — as the third act in a “triple play.” Needless to say, phone and cable companies would like to press whatever advantages they have with Congress, the FCC and other regulatory bodies.

It doesn’t help that most of us barely know what the Internet actually is. Look up “The Internet is” on Google and see what happens: http://www.google.com/search?hl=en&q… There is little consensus to be found. Worse, there are huge conflicts between different ways of conceiving the Net, and talking about it.

For example, when we say the Net consists of “sites,” with “domains” and “locations” that we “architect,” “design,” “build” and “visit,” we are saying the Internet is a place. (Where, presumably, you can have free speech, enterprise and assembly.)

But if we say the Net is a “medium” for the “distribution” of “content” to “consumers,” we’re talking about something more like broadcasting or the shipping industry, where those kinds of freedoms are more restricted.

These two ways of seeing the Net are both true, both real, and both commonly used, to the degree that we mix their metaphors constantly. They also suggest two very different regulatory approaches.

Right now most of us think about regulation in terms of the latter. That is, we want to regulate the Net as a shipping system for content. This makes sense because most of us still go on the Net through connections supplied by phone or cable companies. We also do lots of “downloading” and “uploading” — and both are shipping terms.

Yet voice and video are just two among countless applications that can run on the Net — and there are no limits on the number and variety of those applications. Nor should there be.

So, what’s the right approach?

We need to start by recognizing that the Net is infrastructure, in the sense that it is a real thing that we can build on, and depend on. It is also public in the sense that nobody owns it and everybody can use it. We need to recognize that the Net is defined mostly by a collection of protocols for moving data — and most of those protocols are open to improvement by anybody. These protocols may be limited in some ways by the wired or wireless connections over which they run, but they are not reducible to those connections. You can run Internet protocols over barbed wire if you like.
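That barbed-wire point is easy to demonstrate. In the sketch below, a local socket pair stands in for whatever wire you like, and the “protocol” is nothing more than an agreed-upon byte format that the pipe carries without knowing or caring:

```python
import socket

# A protocol is just an agreed-upon byte format. Any reliable byte pipe --
# here a local socketpair standing in for "barbed wire" -- can carry it.
left, right = socket.socketpair()

request = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"
left.sendall(request)        # one end speaks HTTP...
received = right.recv(4096)  # ...the other end just sees bytes

assert received == request   # the pipe moved the bytes; HTTP is convention
print(received.decode().split("\r\n")[0])  # → GET / HTTP/1.1
```

Swap the socket pair for a serial line, a radio link, or actual barbed wire, and nothing about the request bytes has to change. That independence from the physical layer is the point.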

This is a very different kind of infrastructure than anything civilization has ever seen before, or attempted to regulate. It’s not “hard” infrastructure, like we have with roads, bridges, water and waste treatment plants. Yet it’s solid. We can build on it.

In thinking about regulation, we need to maximize ways that the Net can be improved and minimize ways it can be throttled or shut down. This means we need to respect the good stuff every player brings to the table, and to keep narrow but powerful interests from controlling our common agenda. That agenda is to keep the Net free, open and supportive of everybody.

Specifically, we need to thank the cable and phone companies for doing the good work they’ve already done, and to encourage them to keep increasing data speeds while also not favoring their own “content” subsidiaries and partners. We also need to encourage them to stop working to shut down alternatives to their duopolies (which they have a long history of doing at both the state and federal levels).

We also need to thank and support the small operators — the ISPs and Wireless ISPs (WISPs) — who should be able to keep building out connections and offering services without needing to hire lawyers so they can fight monopolists (or duopolists) as well as state and federal regulators.

And we need to be able to build out our own Internet connections, in our homes and neighborhoods — especially if our local Internet service providers don’t provide what we need.

We can only do all this if we start by recognizing the Net as a place rather than just another medium — a place that nobody owns, everybody can use and anybody can improve.

Doc Searls
Fellow, Berkman Center for Internet & Society
Harvard University

[Later...] A bonus link from Tristan Louis, on how to file a comment with the FCC.


‘Smart’ Electric Utility Meters, Intended to Create Savings, Instead Prompt Revolt is a New York Times story that perhaps suggests a deeper truth: People don’t want their utilities to get smart on them. Except, occasionally, on request. Like, when a bill one month is strangely high.

These paragraphs encapsulate several problems at once:

At the urging of the state senator, Dean Florez, Democrat of Fresno and the chamber’s majority leader, and others, the California Public Utilities Commission is moving to bring in an outside auditor to determine whether the meters count usage properly.

In response to a wave of complaints from the Bakersfield area in the Central Valley, Pacific Gas & Electric has been placing full-page advertisements in newspapers in the area promising benefits from the new meters. It says customers will save money not only by paying rates based on hourly fluctuations in the wholesale market, but also eventually by displaying real-time rates.

To reduce their bills, customers could cut back at pricey peak times and shift some activities, like running a clothes dryer or a vacuum cleaner, to off-peak periods. Utilities will then have lower costs, the argument goes, because the grid will need fewer power plants as demand levels out.

Customers will become “structural winners,” said Andy Tang, senior director of the company’s Smart Energy Web program.
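The savings arithmetic behind that pitch is simple enough to sketch. The rates and the dryer figure below are invented for illustration; they are not PG&E’s actual tariff:

```python
# Hypothetical time-of-use rates -- invented for illustration,
# not any utility's actual tariff.
PEAK_RATE = 0.30       # $/kWh during pricey peak hours (assumed)
OFF_PEAK_RATE = 0.10   # $/kWh overnight (assumed)
DRYER_KWH = 3.0        # rough energy for one clothes-dryer load

peak_cost = DRYER_KWH * PEAK_RATE
off_peak_cost = DRYER_KWH * OFF_PEAK_RATE
print(f"run at peak:  ${peak_cost:.2f}")                   # → $0.90
print(f"run off-peak: ${off_peak_cost:.2f}")               # → $0.30
print(f"shifting saves ${peak_cost - off_peak_cost:.2f}")  # → $0.60
```

Sixty cents a load is the kind of number that explains both the utilities’ enthusiasm and the customers’ shrug: grid-level savings add up across millions of meters, but the per-household payoff asks for a lot of attention.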

The first problem is that some customers (enough to cause a stink, and cause newspaper stories) think their new “smart” meters are cheating them. Let’s say the meters are fine. (And I’m betting they are.) What’s this say?

The second problem is that the meters complicate usage. Who (besides people paid to care) is interested in wholesale energy market price fluctuations? And how many customers are ready to modulate usage based on fluctuating real-time demand?

The third problem is cultural, normative, and to some degree explains the first two: We’re not used to caring about this kind of stuff. Much less about being “structural winners,” whatever those are.

What’s being called for here is not just new gear that helps users use less electricity, water and gas. And what’s proposed is not just the need for all of us to “go green” and care about wasting resources and cooking the planet. What’s proposed is re-conceiving what a utility is.

Utilities, at least to the end user, the final customer, the one paying the bills, are simple things. They are dumb. Their availability is binary: it’s there or it’s not. When it is, you want to hold down costs, sure; but you expect it to be there full-time. There should be enough gas or oil to run a furnace, to boil an egg, to produce hot water. There should be enough electricity to light bulbs and keep appliances running. There should be enough water pressure for people to take showers and wash dishes. More than enough doesn’t get noticed. Less than enough is a problem. That or none requires a call to the utility company or the landlord.

“Smart” so far looks complicated. And most people don’t want complicated, especially from their utilities.

Now, what we’re talking about here is making all utilities digital. That is, computerized. Again, complicated. True, for a mostly good cause. But entirely good? I gotta wonder. When I see big companies like GE and IBM talking about making our power “smart,” I think they’re talking about making it smart their way. Which is not like other companies’ ways. They’re selling “solutions” to utility companies that are different from the next company’s “solutions,” and that lock customers into proprietary systems that can cause more annoyance than convenience down the road.

I haven’t studied any of this, so I don’t know. I’m just saying what I suspect. And I invite correction on the matter. If there are standard ways to smarten power, so that customers can swap out one company’s gear for another’s, that’s fine. But again, I dunno.

Meanwhile, let’s table that and look at the Internet. This is a place where we have a degree of intelligence in a utility. Customers in many places have choices about variables such as bandwidth, and “business” versus “home” levels of support.

But I think what we want out of the Internet is what we already have with water, gas and electricity: it’s just there. Nothing more complicated than that.

I hope that’s where we end up. But my fear is that old-fashioned utilities will get smart the way the phone and cable companies have made the Internet smart. And that would be dumb.


I just posted Rupert Murdoch vs. The Web, over at Linux Journal. In it I suggest that the Murdoch story (played mostly as Bing vs Google) is a red herring, and that the real challenge is to free the Web and ourselves from dependencies on giant companies I liken to volcanoes:

We’re Pompeians, Krakatoans, Montserratans, building cities and tilling farms on the slopes of active volcanoes. Always suckers for stories, we’d rather take sides in wars between competing volcanoes than build civilization on more flat and solid ground where there’s room enough for everybody.

Google and Bing are both volcanoes. Both grace the Web’s landscape with lots of fresh and fertile ground. They are good to have in many ways. But they are not the Earth below. They are not what gives us gravity.

I think one problem here is a disconnect between belief systems about markets, and the stories that arise from them.

One system believes a free market is Your Choice of Captor. In this camp I put both the make-it/take-it mentality (where “winners” are rewarded and “losers” punished) of the Wall Street Journal (which a few months ago looked upon the regulated duopolies for Internet access as the “free market” at work) and those who see business (or corporations, or capitalism, or all three) as a problem and look to government — another monopoly — for remedy from these evils in the marketplace. In other words, I lump both the left and the right in here, along with the conflicts between them.

The other system sees markets as settings for human activity: the locations, both real and virtual, where people and their organizations meet to do business, make culture, and build civilization. Here I put nearly everybody who contributed the structural agreements that made the Internet possible, and who truly understand what it is and how it works, even if they can’t all agree on what metaphors to use for it. I also include all who have contributed, and continue to contribute, to the free and open code bases with which we are building out our networked world. While political beliefs among members of this system may sort somewhere along the right-vs.-left axis, what they do to build the world is orthogonal to that axis. That’s one big reason why that work escapes notice.

The distinction I see here aligns well with Virginia Postrel‘s contrast between “stasists” and “dynamists”. The difference is that much of what gets done to make the networked world (and to support its dynamism) isn’t “dynamic” in the active and dramatic sense of the word — except in its second-order effects. For example, SMTP and IMAP are not dynamic. (Being mannerly technical agreements, protocols don’t do that.) But on those protocols (and related ones) email happened, and the world hasn’t been the same since.

With that distinction in mind, I suggest that too much oxygen is sucked up by “wars” between the stasists (some of whom are also into the superficially dynamistic attention-suck of vendor sports — here’s an oldie but goodie that still makes my point), and not enough goes to constructive work done by geeks and entrepreneurs who quietly build the original and useful stuff that serves as solid infrastructure on which countless public goods (including wealth creation beyond measure) can be generated.

We have the same problem in most net neutrality arguments. The right hates it, the left loves it. One looks to protect the “free market” of phone and cable companies (currently a Your-Choice-of-Captor system) while the other looks to government (meet your new captor) for relief. When in fact the whole thing has happened all along within what Bob Frankston calls The Regulatorium.

The primary dynamism of the Internet — what gave us the Net in the first place, and what holds the most promise in the long run — doesn’t just come from those parties, and can’t be found in the arguments they’re having. It comes from low-box-office geekery that supports enormous new business opportunities (along with many public benefits, with or without business).

It’ll take time to see this, I guess. Just hope we don’t drown in lava in the meantime.

Bonus red herring: A lot of news really isn’t.


Got these shots of St. Louis and the confluence of the Missouri and Mississippi Rivers while flying to Austin by way of Chicago two Fridays ago. You can see the Gateway Arch, right of center, Busch Stadium, the Edward Jones Dome, the City Museum, and lots of barge traffic on the river.

I actually didn’t see much of St. Louis. My window seat didn’t have well-placed windows, and I couldn’t see downward in any case. But my little Canon Powershot 850 could look for me. So I held it against one of the windows, angled it downward, and shot away, checking from time to time on the back of the camera to see if my shots were accurate. Didn’t do too poorly, considering.

What I want is a small camera like this one that can shoot RAW without taking forever to do it. (As was the case with my old and much missed Nikon Coolpix 5700, which also featured a flip-out viewer, making shots like this much easier.) The PS 850 has no RAW mode, and its processing is rather thick with artifacts. Still, fun to use.


Kathy Moran has a great line — “Blogging about productivity began to feel like drinking about alcoholism” — that somehow comes to mind as I point to The Free Beer Economy, which I just put up at Linux Journal, in advance of SXSW, where I’ll moderate a panel titled Rebuilding the World with Free Everything. The panel will happen next Tuesday, right after the keynote conversation between Guy Kawasaki and Chris Anderson, whose book Free: The Future of a Radical Price is due out this summer, and who will join our panel as well.

The gist:

So we have an ecosystem of abundant code and scarce imagination about how to make money on top of it. If that imagination were not scarce, we wouldn’t need Nicholas Carr to explain utilities in clouds with The Big Switch, or Jeff Jarvis to explain how big companies get clues, in What Would Google Do?

More to the point for us blogging folk, I’ll add Dave’s How I made over $2 million with this blog.

His point: He made money because of it. As I have with mine. Neither one of us, more than coincidentally, has advertising on our blogs. Neither one of us burdens our blogs with a “business model”. Nor do we feel a need to hire some outfit to do SEO for us. Good blogs are self-optimizing. That can go for their leverage on income as well, even without cost to one’s integrity.

As with so much on the Net, it’s still early. Much future is left to unfurl. The millipede has many more shoes to drop. So there is much fun left to be had, and much money to be made, even in a crap economy.

But hey, I’m an optimist. What else can I say?

Look forward to seeing many of y’all in Austin. I fly down tomorrow, back on Wednesday.

[Later...] I tweeted a pointer to the post earlier, and did something I’ve never done before, which was ask people to digg the piece. It’s kind of an experiment. Curious to see how it goes.

I’ve only had one post dugg to a high level before. It was fun for the few hours it lasted, but I’m not sure it did anything substantive (other than drive traffic to Linux Journal, which was more than agreeable). What I mean is, I’m not sure it drove a conversation about its subject. Hence, the next experiment. Applied heuristics, you might say.


There’s a good chance that the best picture you can put on your HD screen doesn’t come from your cable or satellite TV company, but from your new HD camcorder. As time and markets march on, that chance will only get larger. That’s because there is a trade-off between the number of channels carried and the quality of each channel. The compression required to carry more channels shows up as “artifacts” in the picture itself. Gradations of shading and color, such as in a blue or gray sky, turn to a mosaic of blocks. (In this shot, I show how grass on a football field has pimples.) Carriers compete more by the number of channels they carry than by the quality of each channel. (There are exceptions to this, but on the whole that’s what we’ve got.) Meanwhile your camcorder quality only goes up.
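To put rough numbers on that trade-off: a single 256-QAM cable channel carries on the order of 38.8 Mbps, and every HD stream squeezed into it has to share those bits. The stream counts below are illustrative, not any particular carrier’s line-up:

```python
# A fixed pipe split among more streams leaves fewer bits per stream.
# 38.8 Mbps is roughly the payload of one 256-QAM cable channel; the
# stream counts are illustrative, not any carrier's actual line-up.
PIPE_MBPS = 38.8

for hd_streams in (2, 3, 4):
    per_stream = PIPE_MBPS / hd_streams
    print(f"{hd_streams} HD streams: {per_stream:.1f} Mbps each")
```

At four streams per channel, each one gets under 10 Mbps, and that is where blue skies turn to mosaic blocks.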

And as camcorder quality goes up, more of us will be producing rather than consuming our video. More importantly, we will be co-producing that video with other people. We will be producers as well as consumers. This is already the case, but the results that appear on YouTube are purposely compressed to a low quality compared to HDTV. In time the demand for better will prevail. When that happens we’ll need upstream as well as downstream capacity.

So here’s a piece in Broadband Reports that shows how carriers can be out of touch with the future, even as they increase the capacities of their offerings. An excerpt:

In upgraded markets, Comcast is not only upgrading existing speed tiers ($42.95 “Performance” 6Mbps/1Mbps and $52.95 “Performance Plus” 8Mbps/2Mbps tiers became 12Mbps/2Mbps and 16Mbps/2Mbps), but is adding two new tiers to the mix ($62.95 “Ultra” 22Mbps/5Mbps and the aforementioned $139.95 “Extreme 50” 50Mbps/10Mbps).

One recurring theme we’ve seen in our forums is that the new speeds have many users downgrading. In both forum threads and polls, many customers on Comcast’s 16Mbps/2Mbps tier say they’re downgrading to their 12Mbps/2Mbps tier — apparently because they don’t think an additional 4Mbps downstream is worth $10. Customers used to be willing to pay the additional $10 for double the upstream speed, but there’s no longer an upstream difference between the tiers.

That last line is the kicker. Comcast apparently still thinks that downstream is all that really matters. It isn’t. For anybody producing a lot of photography or video, upstream not only matters more, but supports activities where the user can see the difference.

In fact there isn’t a lot of perceived difference between 12Mbps and 16Mbps on the downstream side. Either is fast enough for a YouTube video. But on the upstream side, you can see the difference. In my case, that difference appears in the progress bars for pictures I upload to Flickr.
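The upstream difference is easy to put numbers on. Here is a rough sketch of upload times for a hypothetical 2 GB HD video at various upstream speeds (file size invented for illustration, protocol overhead ignored):

```python
# Upload times for a hypothetical 2 GB HD video at the kinds of upstream
# speeds discussed above (protocol overhead ignored, so real uploads
# run somewhat longer).
FILE_BITS = 2 * 8 * 10**9   # 2 gigabytes expressed in bits

for mbps in (1, 2, 5, 10, 20):
    minutes = FILE_BITS / (mbps * 10**6) / 60
    print(f"{mbps:>2} Mbps up: about {minutes:.0f} minutes")
```

Going from a 1 Mbps to a 5 Mbps upstream turns a four-and-a-half-hour upload into under an hour, a difference no downstream bump can match.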

A few months ago I upgraded my Verizon FiOS service from 20/5Mbps to 20/20Mbps. The difference was obvious as soon as it went in. The difference will be a lot more obvious to a lot more people once those people start sharing, mashing up and co-producing higher-definition videos.

Just watch.
