States continue to worry about the Internet’s nasty bits. Myspace will restrict adults’ access to underage users’ profiles, and allow anyone (not just kids) to designate a profile as private. A U.S. Senate proposal would require labeling of sexually explicit material appearing on the Internet. (Is footage of Janet Jackson at the Super Bowl sexually explicit? What about “Desperate Housewives” episodes? The Sports Illustrated swimsuit photos? From what I read, half of Facebook.com would qualify.) Australia is considering legislation to ban adult material delivered to 3G phones over premium plans. This would seem strange here – “premium” means you must request such materials, and the cell phone billing service presumably enables age verification – but remember that different countries approach expression in different ways.
One way to think about content limitations on the ‘Net is to map them not only by subject area (porn, illegal drug sites, hate speech pages), but also along two further dimensions:
1. Is the worry about accidental exposure to dicey content, or deliberate decisions to view it?
2. Is the concern about minors or adults?
Thinking about a particular problem along these two dimensions helps formulate a thoughtful approach if regulation is needed. For example, labeling is useful for preventing accidental exposure (assuming truthful, widespread labeling – a huge assumption), but does nothing to stop deliberate viewing. Filtering software may beneficially block children from seeing certain things, but we should worry that it also censors material that’s appropriate or useful for adults.
Most regulatory moves seek to block accidental viewing by people of all ages, and also to prevent deliberate requests by minors for lots of sensitive stuff (porn being the thin edge of the wedge here). The question of overbreadth – or of pretextual moves – is whether these methods bleed into the fourth quadrant of our matrix: where adults knowingly seek out Web pages some find objectionable.