UK Police Attacking Journalists Who Uncovered Police/Journalism/Government Scandal

The British voicemail hacking scandal just took a hugely dangerous turn. Scotland Yard is making war on the journalists who broke the voicemail-hacking scandal that Scotland Yard refused — corruptly or ineptly — to fully investigate on its own.

The police had all but ignored most of the immoral and almost certainly illegal acts of News Corp.’s top-selling and now defunct UK newspaper, News of the World (and maybe others). Scotland Yard’s lack of interest in the case — putting the lid on the investigation after several early arrests — may have been simple incompetence, but the other possible explanation is a corrupt alliance with crooked journalists and governments. 

But the Guardian (for which I write a weekly opinion piece) did its job when other journalists didn’t. Almost singlehandedly, the Guardian kept the story alive until the public saw more clearly what had happened.

Now the police are using one of the UK’s most draconian laws, the Official Secrets Act, against the newspaper. This is a blatant effort to punish the one news organization that dared to stand up for the public’s right to know about a scandal that implicated the nation’s most powerful media company, governments run by both major parties and, as increasingly seems safe to assume, the police themselves. 

Scotland Yard, stung by honest journalism, is attempting to criminalize that journalism. What an outrageous move. 

Tom Watson, the member of parliament who’s been on the case more than any other, puts it well in the Guardian’s coverage:

“It is an outrageous abuse and completely unacceptable that, having failed to investigate serious wrongdoing at the News of the World for more than a decade, the police should now be trying to move against the Guardian. It was the Guardian who first exposed this scandal.”

The FCC’s weak new “open Internet” rules

This article was originally published on Salon.com on December 21, 2010.

A partisan vote on Tuesday displeases everyone. And everyone’s right

The neutering of the Internet is now the unofficial policy of the Federal Communications Commission. Contrary to the happy talk from FCC Chairman Julius Genachowski at a rule-making announcement today in Washington, the move is well underway to turn the Internet into a regulated playground for corporate giants.

Tuesday’s FCC vote on rules purportedly designed to ensure open and free networks was a 3-2 partisan charade, with Genachowski and the other two Democratic commissioners in favor and the two Republicans against. The rules do nothing of the sort. The short-term result will be confusion and jockeying for position. Genachowski’s claim that the rules bring “a level of certainty” to the landscape was laughable unless he was talking about lobbyists and lawyers; their futures are certainly looking prosperous. The longer-range result will be to solidify the power of the incumbent powerhouses — especially telecommunications providers and the entertainment industry — to take much more control over what we do online.


Now we have to jailbreak our Android phones, too

This article was originally published on Salon on August 23, 2010.

As Google gives carriers more sway over the operating system, customers need more options

When Google introduced the Nexus One smartphone early this year, we got a glimpse into what the future could be if device makers a) wrested control of the device from the mobile carriers and b) trusted users to decide what software they could run on the hardware they’d purchased. The heart of this notion was Google’s Android operating system running on a device that wasn’t subject to a mobile carrier’s deliberate limitations on how it could be used.

The experiment wasn’t a rousing success for several reasons, not least Google’s ineptness at running a store where it had to, gasp, deal with actual human customers needing technical support. Still, the Nexus One was the class of the not-Apple world, in my view, which is why I bought one and still use it with almost entirely happy results.

But Google has withdrawn from selling its own devices except to developers. And in the process, as ZDNet’s Jason Hiner persuasively explains, it’s ceded back to the carriers selling Android phones the control that users had expected for themselves with an open-source operating system.

Meanwhile, Google has made ominous common cause with Verizon in the policy arena, saying that it’s OK to toss out network neutrality — the idea that carriers shouldn’t discriminate on the basis of content — on mobile networks. Add it all up, and Google’s retreat is distressing.

The emboldened carriers have started loading all kinds of “crapware” — apps from partner companies that can’t be removed in standard configurations and that can slow down the devices. (For that matter, Google itself has done this with the Nexus One and Android, by putting unremovable apps into the operating system updates.)

Now, you can get an iPhone without this stuff. Unfortunately, you also get AT&T’s lackluster network and, much worse, Apple’s control freakery. Apple decides through its app-approval process what you’re allowed to use on the device, where Android phones (in almost all cases) don’t block you from installing what you want in addition to what they’ve already placed on the phones.

I wish Google — or more likely HTC or some other manufacturer willing to risk annoying the carriers — would sell an Android handset that I could use any way I want: an unlocked, un-crapwared device with lots of power and room to expand. Unfortunately there’s no sign of such a phone on the horizon, though I’ll keep watching.

The alternative is less attractive. It’s the rough equivalent of what the iPhone community calls “jailbreaking” — removing the artificial limitations in the operating system by fixing the software.

In the Android world this is typically called “rooting” the phone (again, not exactly the same thing as jailbreaking an iPhone), or giving the customer what’s also called “Superuser” access to all functions. There are risks in doing this, notably in security, but increasingly I’m inclined to believe they’re worth it.
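
For the curious, here’s a rough idea of what checking for that “Superuser” access can look like in practice. This is a minimal sketch, assuming the Android SDK’s adb tool is installed and a handset is connected with USB debugging turned on; it simply asks the device whether a su binary is present, which is only a crude indicator, and the details vary from phone to phone.

```python
# Minimal sketch: ask a connected Android device whether a "su" (superuser)
# binary is on its path, a rough sign that the phone has been rooted.
# Assumes the Android SDK's adb tool is installed and USB debugging is on;
# this is illustrative only, not a definitive test.
import subprocess

def device_has_su() -> bool:
    result = subprocess.run(
        ["adb", "shell", "which", "su"],
        capture_output=True, text=True
    )
    # Older adb versions always exit 0, so also check for non-empty output.
    return result.returncode == 0 and result.stdout.strip() != ""

if __name__ == "__main__":
    if device_has_su():
        print("su binary found: device appears to be rooted")
    else:
        print("no su binary found: device does not appear to be rooted")
```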

So as I look for a new Android phone, which I’m doing as the hardware gets better and better, I’m watching several online forums for information, especially the XDA Developers site. Notably, I want to know before I buy that I can root the phone and, in some cases, update to the latest Android operating system with full features.

Unfortunately, this process isn’t always easy or simple, which is just fine, I’m sure, with the mobile network companies. But until we move into a mobile world where at least one carrier and manufacturer allow their customers to actually own what they’ve purchased, we’ll be forced to overcome these barriers.

Google-Verizon plan: Why you should worry

This article was originally published on Salon on August 9, 2010.

Ominous references to the “public Internet” inescapably suggest something else entirely

So Google and Verizon had not, in fact, cooked up a special business deal for their mutual benefit. But what they have cooked up, as announced today, may be no great deal for you and me: the idea of a parallel network that could, in the long run, become the default network — at the very least for entertainment and truly advanced services.

At first glance, no one could argue with what the companies’ CEOs, Eric Schmidt and Ivan Seidenberg, announced on Monday: A “joint effort by two companies to offer a suggestion” to public policy makers, as Seidenberg put it. (Both CEOs denounced last week’s media hyperventilating, based on stories in the New York Times and Bloomberg News, as bogus journalism, though they didn’t use that precise phrase. “There is no business relationship,” Schmidt said on a conference call.)

Their suggestion? Essentially, it was to insist that the Internet should remain open and freely available, with enforcement teeth to ensure what most people would call network neutrality, the idea that carriers can’t discriminate against one content provider in favor of another.

But the proposal went further. It would promote the expansion of new services, not part of the Internet as we know it now, that would go beyond anything we have today. These new services, if Congress and regulators enacted the companies’ proposal, could not be designed to be end runs around net neutrality; they would have to be genuinely new.

But here’s the rub: You should not trust Verizon or other carriers, or Google for that matter, to follow through in ways that are truly in the interest of the kind of open networks the nation needs. Throughout the conference call, we kept hearing references to the “public Internet” — an expression that leads inescapably to something else.

If Schmidt was telling the truth when he said Google’s overwhelming focus will remain on the public Internet, including his promise that YouTube will remain there, that’s great. I have no reason to disbelieve him, and Google’s track record to date is strong on this issue. But plans change, managements change, and corporate goals change.

The other big news in today’s announcement was Google’s clear retreat on network neutrality when it comes to wireless networks. As Susan Crawford, professor at Cardozo Law School and an expert on all things Internet, explains: “That’s a huge hole, given the growing popularity of wireless services and the recent suggestion by the Commission that we may not have a competitive wireless marketplace.”

For Verizon’s part, the acceptance of what sounds like fairly serious neutrality rules on current wire-line networks was welcome. But I see the rest as a Trojan Horse for a modern age. Verizon and other carriers have every incentive, based on their legacies, to push network upgrade investments into the parallel Internet, not the public one.

With one exception, the carriers have all but abandoned their push to deliver the kind of wire-line bandwidth that other nations — Japan, South Korea, France and Sweden come immediately to mind — already provide. Verizon has all but stopped building out its fast fiber-optic network to homes, leaving Comcast as the provider that is most ardently boosting connection speeds via its cable lines. (Even Comcast’s fast speeds are nothing special next to what carriers in those other nations have provided.)

So when Seidenberg said, “We have to be flexible,” my immediate thought was, uh-oh.

I’ve been worried for years that the game was on to turn the carrier-controlled Internet into just another version of television. Maybe the carriers won’t get away with that.

The right way forward is to have sufficient bandwidth that we can do pretty much anything we choose using public networks — a true broadband infrastructure where packet-switched services (moving data around, at super-fast speeds, in little packages that are reassembled at the user’s device) are the basis for all communications.
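
To make that parenthetical a bit more concrete, here’s a toy sketch of the packet-switching idea: a message is chopped into small numbered packets, which may arrive out of order and are reassembled at the receiving end. Real protocols such as IP and TCP are vastly more involved; this is purely a concept illustration, not a description of how any carrier’s network is built.

```python
# Toy illustration of packet switching: split a message into small numbered
# packets, shuffle them to simulate out-of-order arrival, then reassemble
# them at the "receiver" using the sequence numbers.
import random

def to_packets(message: str, size: int = 8):
    """Break a message into (sequence_number, chunk) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Sort packets by sequence number and rejoin the chunks."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Packets can take different routes and still arrive intact.")
random.shuffle(packets)      # simulate packets arriving out of order
print(reassemble(packets))   # the receiver reconstructs the original message
```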

Instead, the game is on to create a parallel Internet. It’ll still be packet-switched. But they won’t call it the Internet anymore. That’s an end game we should not encourage.

Network neutrality’s corporate adversaries

This article was originally published on Salon on August 5, 2010.

Reports of backroom deals by big players that would inhibit Web freedom are unconfirmed, but should make us uneasy

UPDATED

The Internet is abuzz with worries that the Bloomberg news service and the New York Times may have gotten it right in reports that Google and Verizon are cooking up a deal that, in the worst case, would be a blow to our digital future.

The story, denied by both companies (Google tweet, Verizon blog post) in language that gives them plenty of wiggle room, essentially says that the corporate giants are on the verge of an agreement that would, as the Times says:

allow Verizon to speed some online content to Internet users more quickly if the content’s creators are willing to pay for the privilege.

The charges could be paid by companies, like YouTube, owned by Google, for example, to Verizon, one of the nation’s leading Internet service providers, to ensure that its content received priority as it made its way to consumers. The agreement could eventually lead to higher charges for Internet users.

As countless commentators have already agreed, such a deal would mock something that Google has in the past adamantly maintained: the need for content-neutral treatment of what flows on the Net. It’s usually called “network neutrality,” and it means that Internet service providers — typically your cable and phone company for home broadband connections — should not get to decide what bits of data get delivered in what order and at what speed, if at all, based on who’s providing those bits of information.

That is, Verizon should not have the right to decide that it will delay what you see from, say, Salon or my personal blog if Fox News pays Verizon to put its content in the queue first. Why not? Because Internet access is an oligopoly business, creating unprecedented (at least in modern times) choke points for information.

The Federal Communications Commission, which has been making noises about doing more to make net neutrality a requirement, has been holding semi-secret meetings with many of the top players in the online world. If you and I have been represented at that table, it’s news to me, but this is how so much gets decided in our world. (UPDATE: The FCC has reportedly suspended the closed-door meetings; good.)

Now we hear that Google — once the foremost proponent of net neutrality and a participant at those meetings — may be cooking up a deal with Verizon, which has also been at the FCC table. The scary thing is how plausible this is; Google is a huge company now with interests that may well be diverging from its previous commitments on the matter.

Google CEO Eric Schmidt, speaking yesterday at a conference, told reporters that the company isn’t backing away in the least from the core principle (thanks, pragma). Speaking about network neutrality, he said, according to CNET:

“I want to make sure that everybody understands what we mean about it. What we mean is that if you have one data type, like video, you don’t discriminate against one person’s video in favor of another. It’s OK to discriminate across different types … There is general agreement with Verizon and Google on this issue. The issues of wireless versus wireline get very messy … and that’s really an FCC issue not a Google issue.”

So is this about wireless or wired connections, or both? How does this square with the Times report saying that Google would pay Verizon to speed YouTube videos to users’ computers? The murkiness of the entire thing is disturbing, which is one reason it feels too early to declare the imminent death of the Internet as we know it, as some commentators have done.

While I’m not ready to hyperventilate about this alleged deal, I’m more than ready to say that the policy and corporate trajectories are going in the wrong direction. I don’t trust Google to do what’s best for me. I trust Google to do what’s best for its shareholders, and if throwing net neutrality under the bus is best for Google the company will do that in a heartbeat — if allowed.

Take it for granted that the telecom companies will keep pushing for the right to control the content in the data pipes. They don’t want you to remember that they built their networks in large part with the help of government-granted monopolies and other special help; that would make it harder to claim this absolute control.

The backroom dealing is a big part of the problem here. If the FCC wants to come up with the right policy, it should make transparency part of the method.

Schmidt is board chairman of the New America Foundation, a smart-thinking organization that has done a lot to elevate policy conversations in recent years. I hope he’ll read and heed what the foundation said today in a statement:

During the 19th century a handful of wealthy industrialists dominated steel, oil refining and railroads; striking agreements to receive favorable terms for the carriage of their goods, while subjecting farmers and competitors to unreasonable and excessive charges.

Now, over a century later, history is in danger of repeating itself. After weeks of closed-door meetings sanctioned by the Federal Communication Commission, two of the largest corporations in the communications industry have reportedly negotiated an agreement on network neutrality. Though details of the agreement are not available, its terms are immaterial. It should not be the policy of the FCC to allow the largest companies to write the regulations that will determine the future of the Internet.

UPDATE: Scott Rosenberg wonders if the entire leak is just a head-fake, or possibly reflects dissent inside Google. Meanwhile, Google has put out a more strenuous denial, telling the Guardian, among others: “The New York Times is quite simply wrong. We have not had any conversations with Verizon about paying for carriage of Google traffic. We remain as committed as we always have been to an open internet.”

Saving our digital heritage

This article was originally published on Salon on July 19, 2010.

The Library of Congress and other preservation-minded organizations ponder how we preserve what we’re creating

They’re trying to save the news. Among other things.

No, this isn’t yet another thumb-sucking cogitation about the future of journalism, at least not the kind we typically see these days. Rather, this is about a different issue: How do we save journalism (and other media) that’s already been created — including the all too ephemeral information that we’re creating online?

This week in Washington, DC, the Library of Congress is gathering its “Digital Preservation Partners” for a three-day session — one of a number of such meetings the library has been holding under a broad initiative called the “National Digital Information Infrastructure and Preservation Program.” Its multi-year mission is:

to develop a national strategy to collect, preserve and make available significant digital content, especially information that is created in digital form only, for current and future generations.

It’s what my technology friends call a non-trivial task, for all kinds of technical, social and legal reasons. But it’s about as important for our future as anything I can imagine. We are creating vast amounts of information, and a lot of it is not just worth preserving but downright essential to save.

My role this week, and at a workshop I joined last year, is to be thinking about the news. My mind almost explodes when I consider the issues.

Even when there were relatively few community information sources — mostly newspapers — we had preservation issues. I started my newspaper career at a small weekly that has long since closed down. While I’m sure someone, somewhere, has a printed copy of the issues, the journalism is nowhere to be found online. And what happens when a newspaper with some printed archives and some online shuts down? Sometimes those archives go dark, too.

Even newspaper archives that exist online tend to live behind paywalls that prevent most people from using them. This greedy policy, which I’ve discussed before, has helped ensure that newspapers are less relevant in their communities than they should be.

A newspaper company I worked for deleted years’ worth of my blogging, twice. Once was when it changed publishing platforms. The second time was after I left the company. With some technical help I recovered and republished most of it myself.

TV and radio broadcasters have tended to save tapes or digital archives, though huge gaps have emerged in the record. Remember, storage used to be expensive.

The rise of citizen media has complicated everything. Now we have vast new sources of information, some useful and some not. (Kind of like traditional media, no?) Who has the obligation, if there is one, to save this material?

Well, we have the wonderful Brewster Kahle and his team at the Internet Archive to thank that a bunch of it still exists (including my old blogging that we recovered, no thanks to the newspaper company that killed it). The reality, however, is that much of the Web — not to mention many if not most of the great BBS conversations of earlier times — is lost.

After last year’s digital preservation meeting I suggested that we needed better ways to do our own archiving of blogs and other social media. I still believe the Library of Congress, Internet Archive and other preservation-minded folks should help the rest of us with this task.

The social question arises about people who don’t want to save what they’ve done: Do they have a right to delete it? The Archive will take things down on request. But once you’ve put something up publicly, isn’t it public?

It’s not just a social question, but a legal one, now that judges are ordering newspapers to delete archived stories. It’s also a legal issue because copyright laws are constantly getting in the way of reasonable use of published material. The entertainment industry has taken us down a troubling path in this regard, and things are only getting worse.

And then there’s the entire question of material we create spontaneously, using databases that provide individualized experiences when we seek information. This isn’t just about search queries but about many kinds of community information sources; what you and I see when we visit Everyblock may well differ based on what we type into the text box. The only people archiving this stuff are the ones who own the databases; will the rest of us ever have a look? Privacy interests say that we should not reveal it, but historians in the next century and beyond would find this absolutely crucial to their understanding of our times.

Happily, smarter people than yours truly are working on all of this. I’ll be filing some reports from the Washington meetings, to let you know what they’re thinking.

Choke points leave us vulnerable

This article was originally published on Salon on July 6, 2010.

Technology’s points of failure — and control — remain a constant worry even in the Internet Age

The SEACOM fiber-optic cable is one of the major data conduits connecting Africa’s networks to the rest of the world. SEACOM has suffered a serious outage in the past 36 hours or so, and Internet users across a wide area are having problems as a result. Among those affected are the South African universities that use a SEACOM-reliant regional research networking service, TENET, which also means that I and about 800 or so other participants at a pair of international media conferences in South Africa are mostly disconnected from email and the other Internet services we normally find essential.

It’s easy to grumble in such situations, and we’re all doing that. And it’s essential that we work on ways to increase network reliability — and reduce our vulnerability to situations where a single point of failure can paralyze vital services.

Even as we realize how far we have to go before we can fully trust that our online tools will be available when we need them, we need to remember how far we’ve come. A decade ago, on my first trip to Africa, dial-up phone access was the main way I connected to the Internet in my travels — if I could connect at all. And just a decade before that, hardly anyone in America, much less Africa, had even heard of the Internet.

The problem we’re experiencing here will pass. Repairs to SEACOM are under way (though conferees may not be back online properly by the time we leave two days from now). And Africa’s telecommunications system, while far behind the rest of the world, is steadily improving in a general sense.

Meanwhile, I’m sending this column via the somewhat iffy WiFi connection at the B&B where we’re staying. The establishment subscribes to an Internet service provider (ISP) that either doesn’t depend, down the pipe, on SEACOM or, unlike the university system, has found a way around the affected cables via other fiber-optic lines.

Yet the outage here reminds us that our interconnected world — and the technology we’ve increasingly come to rely on — remains all too vulnerable to areas of control that amount to outright choke points. Accidents or deliberate actions can achieve the same result: Network services go down or are censored; market-dominating software and Internet companies have critical bugs or use their power to make unilateral decisions.

Sometimes our problems are of our own making. Big Internet customers typically have backup plans for network outages. Home users typically don’t. This is one reason why the epic consolidation of the ISP business in much of the world, including the United States, into duopolies or oligopolies is a dangerous trend. When Comcast or AT&T has a major outage, millions of people are stuck until service is restored. And the Internet “backbone” system — the lines that carry the longer-distance traffic — is becoming more and more a fiefdom of a few big players.

Choke points are built into other parts of our culture and economy. We encourage single companies to control access to life-saving drugs. We have an energy ecosystem, especially the electrical grid, that’s scarily vulnerable to disruption. We invite huge financial institutions to make wild bets with other people’s money, grow too big to fail without bringing down the global economy, and then let them steal the rest of us blind when their insane bets go bad. A few thunderstorms in the wrong places create absolute havoc in airline traffic, in part due to an aging and nearly archaic Air Traffic Control system. Even a mild pandemic of a slightly more lethal virus than last year’s H1N1 would overwhelm our already-stretched hospitals. And so on.

The best solutions are redundancy and competition, of course. But there seems little political will to make the kinds of decisions that would encourage more of either. A rare exception was last week’s Obama administration proposal to expand wireless bandwidth; we’ll see if the telecom industry’s lobbyists find a way, as they always try, to scuttle any initiative that expands competition.

There’s not much individuals can do about the mega choke points except tell politicians that they care, and vote. But we have more choices about the control points that affect our daily lives.

This is why you should keep in mind that data you post online — in places like Flickr, YouTube, Google Mail, Yahoo, Facebook and so many others — is much more in their control than yours. Can you get your information out as easily as you put it in? Rarely. And if you’re using a small service that someday goes out of business, you might be entirely out of luck.

That’s also why you should be wary of doing business with companies that sell you gadgets or other technology and then insist that they, not you, have ultimate control over the ways you can use them. Apple is most famous for this kind of control-freakery in its iPhone/iPad ecosystem, but is hardly alone. You may recall that Amazon (a company in which I own a small amount of stock) drew well-deserved fury when, for reasons it considered compelling, it removed books from customers’ Kindles remotely. Apologies don’t suffice; the way to avoid such gaffes is to be open and non-controlling.

And the way for the rest of us to avoid being on the receiving end is to have a Plan B for the essentials. Do you?

Google retreats in China

This article was originally published on Salon on June 29, 2010.

Search giant gives Beijing half a loaf, but will regime be satisfied?

Clever dodge or capitulation? Google’s latest move in its ongoing battles with China looks like a bit of both.

When Google closed its search operations in China last March rather than obey government censorship edicts, the search company tried some corporate judo: It redirected searches from Chinese customers to servers in Hong Kong, thereby providing more honest results than the ones it replaced.

Too clever by half, the Beijing regime decided — and it ordered Google to stop its tactic or lose its ability to do pretty much any business in the world’s most populous nation. Disappointingly but not surprisingly, Google has done just that.

But in what is clearly a concession to the latest Beijing must-censor edicts, Google offered half a loaf to Chinese search users and a half-raised middle finger to Beijing: They can still get the mostly uncensored Hong Kong results, but now they have to do so via hyperlinks rather than automatically. David Drummond, Google’s top lawyer, explained it this way on the company’s blog:

We have therefore been looking at possible alternatives, and instead of automatically redirecting all our users, we have started taking a small percentage of them to a landing page on Google.cn that links to Google.com.hk — where users can conduct web search or continue to use Google.cn services like music and text translate, which we can provide locally without filtering. This approach ensures we stay true to our commitment not to censor our results on Google.cn and gives users access to all of our services from one page.

But how long will this last? China is nothing if not persistent about its censorship, and it’s hard to imagine that the Beijing censors will sit still for what they’re likely to see as a further insult.

Rebecca MacKinnon, visiting Fellow at Princeton’s Center for Information Technology Policy (as well as friend and former colleague at the Harvard Berkman Center for Internet & Society), sees the situation playing out in four possible scenarios, each of which feels plausible. I’m not making a guess, because the Google-China tussle is so fluid.

Eventually, one guesses, Google will have to make the most serious decision of all: whether to shut down China operations entirely or keep making concessions. Will its fiduciary duty to shareholders outweigh moral concerns? Right, silly question.

FIFA to World Cup attendees: No replays for you

This article was originally published on Salon on June 28, 2010.

Soccer czars’ ban of in-stadium video replays only postpones reality

The czars of football (or soccer, as we call it in the U.S.), stung repeatedly by poor officiating in the World Cup tournament in South Africa, have come up with a way to mute protests inside the stadiums: pretend the mistakes aren’t happening by blocking any in-stadium video replays.

As the AP reports:

FIFA will censor World Cup match action being shown on giant screens inside the stadium after replays of Argentina’s disputed first goal against Mexico fueled arguments on the pitch.

The proximate event was a goal scored by a player who was offside before he took his shot. A spokesman for FIFA (the governing body of international football), Nicolas Maingot, regretted the “clear mistake.”

What mistake? Why, the video replay, not the actual officiating flub.

Yep, that’ll work. Actually it will, for now, if the majority of people inside the stadium are kept in the dark (assuming they missed the rules violation in the first place), which is the point of this exercise. Meanwhile, in living rooms and bars around the world, everyone else will have a close-up view of the officiating mistakes.

It’s not as if FIFA is alone in making this kind of decision. Salon colleague King Kaufman, who forgets more about professional sports every week than I’ve ever learned, tells me it’s fairly standard practice not to show big-screen replays of close calls in major U.S. professional and collegiate (OK, same thing) sports. No doubt Major League Baseball, which has its own prominent cases of staggeringly bad calls, is watching all this with interest.

But let’s consider where technology is heading to understand why the giant screens inside the stadiums aren’t going to remain the only issue.

Even today, it’s likely that some spectators could have seen the offside video almost immediately on video-equipped mobile phones. By the time the next World Cup rolls around, that will be most of the people inside the stadium. Will FIFA block mobile video access to ensure that official bumbles remain unseen by the people actually attending the events?

I’m sympathetic to FIFA in one respect. Soccer is a game of flow; it would change dramatically if officials repeatedly stopped the matches to review video footage — though correcting egregiously wrong calls on whether a goal had been scored would be an obvious place where it would make sense, as opposed to a missed offside, which happens all the time. But to pretend that the videos don’t exist isn’t going to work much longer.

Entertainment Cartel Moving Toward Outer Limits of Control

UPDATED

The FCC has decided that Hollywood’s copyright cartel will soon be able to control your television — and ultimately much more — remotely. This is not a joke.

The FCC order “will allow the big firms for the first time to take control of a consumer’s TV set or set-top box, blocking viewing of a TV program or motion picture,” Gigi Sohn, president of Washington-based Public Knowledge, said in a statement.

I’m old enough to remember the introduction to a long-ago TV show called The Outer Limits. Science fiction often comes true.

UPDATE: Cory Doctorow at BoingBoing, who has much more technical depth on these issues, explains clearly what’s going on — and how this is about much more than just your TV. It’s about all electronics you use. Thought you owned that new PC or Mac? Think again.