Data Point: Lots of Book Downloads

In the three days since Mediactive was published here in PDF format, about 1,500 visitors have downloaded the book, and many more have visited the Table of Contents, which connects to the HTML version. Far fewer have purchased the book, of course, but it’s selling — and I’ve barely begun the real marketing process, which will take place in the new year.

World Bank Cites Mediactive


The World Bank’s Johanna Martinsson, communication associate in the institution’s Communication for Governance & Accountability Program, has a long blog post today about the need for a mediactive culture. She calls the book

a great source for everyone, and it will make you think twice next time you read or write something (because Gillmor suggests that in a participatory culture, we ought to do both to become fully literate).  With the current state of media (and no doubt it’s going to continue to evolve and diversify), we can no longer afford to take a back-seat and be passive consumers of information.  If we do, we run the risk of being misguided and form opinions that are not based on fact.

BoingBoing Praises Book


BoingBoing’s Cory Doctorow has written a wonderful blog post about the book. He calls it a “master class in media literacy for the 21st century” and

an extraordinary text that disrupts the current poor-me narrative of failing journalistic business models and counters it with a set of sensible, entrepreneurial proposals for an Internet era news-media that invites broad participation without surrendering critical thinking and healthy skepticism.

Cory, a longtime friend, is one of the real heroes in the ongoing struggle to keep cyberspace open and innovative. He’s also a fabulous writer. I’m grateful for his support of this project.


Book is Now Available

I’m happy to say that Mediactive, the book, is now available in print (Amazon and Lulu) and in a Kindle edition.

And because this project lives under a Creative Commons Attribution-NonCommercial-ShareAlike license, I’ve also published it in full on this site. In addition, you can download it here as a PDF.

This is only the beginning. I’m working on an epub edition to use with other online services, and will be creating a variety of e-book editions that include other kinds of media. Beyond that, I intend Mediactive to be an ongoing, iterative process — with updates here on the website and, eventually, new versions of the book itself.

Lots of people have helped me get this far on the journey. I thank you all, and look forward to the next steps.

Archiving ourselves

This article was originally published on Salon on November 5, 2010.

Our cultural heritage isn’t just the books, magazines and newspapers we read, nor the movies and TV we watch or the radio we listen to. More and more of our culture takes the form of digital media — and more and more of that is what we create, not just what we consume.

Heritage is about preserving what we know (or at least what we think we know) for generations yet to be born. And in the age of democratized media, as we collectively create information that has news value for communities, small and large, the people who care most about saving what we’re creating are wondering how to do it.

No archive is as comprehensive as the one at the Library of Congress, where I’ve been a participant in a two-day meeting this week about the subset of “user generated” media we sometimes call citizen journalism. As usual, at sessions like this one — this is my third visit to the library to help out with its ambitious digital-preservation project, the National Digital Information Infrastructure and Preservation Program — there are more questions than answers.

The reason for libraries and archives like the Library of Congress is simple: We need a record of who we are and what we’ve said in the public sphere. We build on what we’ve learned; without understanding the past we can’t help but screw up our future.

It was easier for these archiving institutions when media consisted of a relatively small number of publications and, more recently, broadcasts. They’ve always had to make choices, but the volume of digital material is now so enormous, and expanding at a staggering rate, that it won’t be feasible, if it ever really was, for institutions like this to find, much less collect, all the relevant data.

Meanwhile, those of us creating our own media are wondering what will happen to it. We already know we can’t fully rely on technology companies to preserve our data when we create it on their sites. Just keeping backups of what we create can be difficult enough. Ensuring that it’ll remain in the public sphere — assuming we want it to remain there — is practically impossible.

Blogging pioneer Dave Winer, a participant in this week’s meeting, has some smart recommendations on creating what he calls “future-safe archives” — including “long-lived organizations to take part in a system we create to allow people to future-safe their content.” He lists universities, government and insurance companies as examples of such institutions. The Library of Congress knows it can’t store everything. Its archiving experts are working with a variety of partners, with a long-range goal of creating archives that are loosely connected but where researchers (and I hope regular folks) in the future will be able to easily find, retrieve and work with what’s being created today.

The technology industry isn’t an obvious candidate to provide the archiving institutions; as Dave notes, the tech companies are too likely to disappear or change in ways that make them unreliable. Even Google, for all its reach and power today, isn’t the place I want to store my work, in part because it’s a company that makes money by using our data to sell advertising. That’s not the relationship I want with my own archivist.

But the tech industry has a vital role to play in preserving the material we create ourselves, such as blogs, at the edges of the networks. It can work with the archiving institutions to ensure that we, the creators of media, can play a role in our own archiving.

What do I mean by this? Here’s an example. I use WordPress to create my personal website, and the website that accompanies my soon-to-arrive new book, “Mediactive.” I wish there were a plug-in for WordPress that would let me save my site to the wonderful Internet Archive, the nonprofit that is trying to archive as much online material (among other things) as possible. All blogging software vendors should have features like this, assuming the Internet Archive wants the material, which I’m fairly sure it does.
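To make the idea a bit more concrete, here is a minimal sketch (not a real plug-in, and in Python rather than WordPress’s PHP, purely for illustration) of what such a feature might do: read a site’s RSS feed and ask the Wayback Machine’s public save endpoint to capture each post. The feed URL is a placeholder, and whether and how the Archive accepts such requests is up to the Archive, not this script.

```python
# Illustrative sketch only: ask the Internet Archive's Wayback Machine to
# snapshot each post listed in a blog's RSS feed. The feed URL below is a
# placeholder, and the Archive's rate limits and acceptance policies apply.
import time

import feedparser  # pip install feedparser
import requests    # pip install requests

FEED_URL = "https://example-blog.org/feed/"      # hypothetical feed URL
SAVE_ENDPOINT = "https://web.archive.org/save/"  # Wayback "Save Page Now"


def archive_recent_posts(feed_url: str) -> None:
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        # Requesting SAVE_ENDPOINT + URL asks the Wayback Machine to capture it.
        response = requests.get(SAVE_ENDPOINT + entry.link, timeout=60)
        print(f"{entry.link}: HTTP {response.status_code}")
        time.sleep(5)  # be polite; space out the requests


if __name__ == "__main__":
    archive_recent_posts(FEED_URL)
```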

The value for future historians of what we do online comes from much more than blog posts. Among the sites that tell us most about our modern culture are such services as Craigslist and eBay. They are created entirely by their users, or at least the content is. How could they be persuaded to regularly archive what they do, for future reference?

I have little hope that Facebook would participate in such a system, because it’s Facebook’s obvious plan to itself be the repository for history. This is one reason that I don’t spend a lot of time posting things on Facebook, despite its usefulness; even though I can download what I do there, or at least some of it, no one but Facebook itself can get at the greater value of the service: the relationships among the users.

So when and if the Internet Archive (among others) makes a deal with WordPress and various content-creation platform providers, as I hope will happen someday, the information that goes into the archive needs to include more than just our blog posts. It should include the links I’ve made to other sites and reader comments, of course; but it should also include the inbound attention from people who’ve linked to what I’ve written, among the other relationships.

The complications go on and on. On my personal site I have RSS feeds from other sites where I’ve created some content, including such things as my Amazon and Yelp reviews and Twitter stream. I have no idea how to archive all of my public work in a compact way, or even if I should.
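For what it’s worth, here is one rough, hypothetical approach to the “compact” part: pull each public feed and write its entries into a single local JSON file. The feed URLs are placeholders; some services don’t expose usable feeds at all, and this captures only what the feeds themselves carry, not the surrounding relationships.

```python
# Rough sketch of a personal feed snapshot: fetch each feed and write the
# entries to one local JSON file. Feed URLs are placeholders; real services
# may require authentication or offer no feed at all.
import json

import feedparser  # pip install feedparser

FEEDS = [
    "https://example.org/blog/feed/",   # hypothetical blog feed
    "https://example.org/reviews.rss",  # hypothetical reviews feed
]


def snapshot_feeds(feed_urls, out_path="personal-archive.json"):
    records = []
    for url in feed_urls:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            records.append({
                "source": url,
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "published": entry.get("published", ""),
                "summary": entry.get("summary", ""),
            })
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)


if __name__ == "__main__":
    snapshot_feeds(FEEDS)
```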

I’m hoping, sometime in the next few months, to help organize a meeting that connects technology people with archiving people so we can talk about personal archiving of this kind. One of the ideas raised at the Washington gathering was a “public commons” — a federated collection of services, I’d hope — where we could all save our creations, and if enough of the right people got together on this they could make and connect the tools to make it all work.

We need this for our children and grandchildren. They need it, as do the researchers and creators of tomorrow, to make their own world a better place — or at least to understand more clearly how their world got the way it is.

Net neutrality: another election loser

This article was originally published on Salon on November 4, 2010.

There’s no way to sugar-coat this: Since Tuesday, network neutrality isn’t quite dead, but may well be in a coma. That’s the only rational way to look at the results of the 2010 elections, which saw some of net neutrality’s major backers go down to defeat.

Network neutrality is the idea that your broadband Internet provider — almost always a local cable or phone company — isn’t making decisions about what you can use on the Internet. That is, your ISP should not decide which bits of data get to your computer in what order or at what speed, much less whether they will ever get there at all.

Among the most damaging congressional losses will be the departure of Rep. Rick Boucher, a Virginia Democrat, who lost his reelection bid. No one in that chamber has a better grasp of technology issues, not even Silicon Valley’s representatives. Boucher wasn’t just a strong supporter of net neutrality; as chair of the House Communications, Technology and Internet subcommittee he used his authority over tech policy in generally progressive ways. Democrats weren’t fully in support of net neutrality to begin with, but Republicans, ever loyal to the big-money corporate interests, have decided that the duopoly is all the competition we need.

The robber barons who run our local telecom duopolies and the barely competitive mobile networks are surely thrilled with their good luck. They aren’t stupid enough to believe voters tossed out Boucher and other net-neutrality supporters on that issue alone, or that voters even gave it much thought, but they’ll definitely take advantage of the circumstances.

The Federal Communications Commission has been relatively timid on net neutrality, working mostly at the edges of the debate; witness its move to partially reclassify broadband service — a regulatory approach that would give the commission more authority to prevent carriers from discriminating against certain kinds of content. That move is useful but modest, and even this limited progress is under attack. And President Obama’s campaign promises to push hard for net neutrality seem hollow, at best.

Meanwhile, America falls further and further behind in the deployment of serious broadband. And the carriers are closer than ever to turning the Internet, which should be the most open of networks, into just another kind of cable television.

Science takes a hit on Election Day

This article was originally published on Salon on November 3, 2010.

The Democrats weren’t the only big losers in yesterday’s election. Science got clobbered, too.

Fueled by disdain for government interference with business and tanker loads of cash from the energy industry and its allies, the Republican Party has been moving steadily toward denial, or at least deep skepticism, on global climate change. And denial is practically an article of faith among the tea-party activist crowd. A recent survey from the Pew Research Center for the People & the Press showed a yawning gap between Democrats and Republicans over the issue, with just 38 percent of Republicans believing that the earth is getting warmer — a share that drops to 23 percent among tea party Republicans.

By every account, the Republican takeover of the House is likely to derail any possibility of serious action on climate change during at least the next two years, longer if President Obama is defeated for reelection in 2012.

And Republicans in the House have vowed to go to war against the Obama administration’s environmental policies, including its (too tepid) approach to climate change. Republicans have proclaimed their intention to use their new investigatory powers — the majority party controls congressional investigations — to go after climate scientists.

The Republican attack on science is nothing new. The Bush administration made an art form of it, not just on climate but also in supporting such anti-science initiatives as creationism. At one point during his presidency, George W. Bush said he thought intelligent design should be taught in class as the other side of the issue, implying two roughly equal sides to a question where essentially all the scientific evidence supports evolution and virtually none supports creationism.

The war on science has extended into the classrooms of America. Biologists are constantly warding off creationists’ efforts to put “intelligent design” (the standard code word for creationism) into the curriculum. Climate science will likely face even more hostility, especially given the moneyed interests fighting to curb the truth. The U.S. Chamber of Commerce, which spent so freely to elect the Republican House, has ginned up a “teaching guide,” in collaboration with a textbook publisher that should know better, to persuade kids that we can’t afford to save the planet.

There’s at least one major industry in this country that absolutely relies on workers who don’t deny reality, and who have learned math and science well. It’s the technology industry, the leaders of which are constantly wailing about the lousy quality of American schools.

Most of the tech leaders were silent on creationism, shamefully so. At least a few, including Google’s Eric Schmidt, have offered their opinion that global climate change is a serious issue we have to deal with sooner rather than later. Schmidt made that point rather forcefully last week during a Churchill Club conversation with film director and environmental activist James Cameron.

The tech industry as a whole has been loath to take on causes that don’t have a direct impact on its own immediate bottom line. But what better cause could there be than to defend science, the bedrock of everything that makes this industry work?

No group of leaders, speaking out loudly in defense of science and against propaganda, could have a greater impact on this critically important issue. Time is running out for them, and for all of us.

A Linux that works

This article was originally published on Salon on October 11, 2010.

Back in June I told you about my decision to make a serious change in my computing life: moving from the Macintosh operating system to Linux. As I’ll describe below, after a false start my migration is now proceeding well.

My decision to switch didn’t reflect any major unhappiness with the Mac OS, which I still consider the class of the desktop/laptop market. Rather, it reflected my problems with Apple.

Specifically, I was concerned because of the implications of the company’s huge success with the iOS family of products — the iPhone, iPad and iPod Touch — and its smothering control of the ecosystem around those products. First, even though Apple has relented in small ways on its control-freakery, the fundamental nature of the ecosystem remained: You essentially need Apple’s permission to be part of it in almost every serious way.

Second, the company’s focus centers on the iOS ecosystem. Steve Jobs and his colleagues see what they call “curated” — a more polite word for control — systems as the way of the future. That inevitably leaves uncurated systems — that is, ones where people don’t need permission to build on them — in a second-class status.

The Mac OS got me off of Windows more than a decade ago, even before it was better than Windows, in part because I’d had it up to here with Microsoft’s tactics during its desktop monopoly days. I’ve owned more Mac computers than I can remember, and while I’ve never been wild about the laptop hardware, the machines have been my steady companions at home, work and on the road since then.

During my Mac years I’ve also said any number of times that I wished Apple would license its OS to IBM, which made what I consider the class of all laptop/notebook hardware: the ThinkPad. I’ve owned ThinkPads since the mid-1990s, when I was using Windows. A ThinkPad (now made by Lenovo) with the Mac OS would be pretty close to the ultimate portable system for serious computing, but it was never to be.

In departing from the Mac OS at this juncture, I considered moving back to Windows. By any measure, Windows 7 is the best operating system from Microsoft in years — as big a leap above its most recent version as Windows 95 was over Windows 3.1. But Microsoft has its own control-freakish instincts and behavior, too, so that idea was a nonstarter.

Which, essentially, left Linux. Foolishly, however, I’d purchased a new ThinkPad X201 model, which combined portability, ruggedness and other features in ways that struck me as ideal. I was foolish because I’d bought it without knowing whether Linux — at least the version of Linux I wanted to use, Ubuntu — would run properly on it. It didn’t, because some of the hardware components were so new that the Ubuntu community hadn’t yet gotten around to writing the software to support them.

A word about Ubuntu: The open-source project aims to bring Linux to the masses, and it’s a remarkable collection of people who’ve coalesced around a team spearheaded by South African software entrepreneur Mark Shuttleworth, who’s put millions of dollars of his own money into the effort.

Ubuntu Linux, from the start, has been a free operating system, one of many so-called distributions of the GNU/Linux platform that emerged in the 1990s as the free-software movement decided that Microsoft’s monopoly and proprietary control were unacceptable. The open nature of Linux has led to many competing versions, with the best ones aimed at corporate use and financially supported through fees that enterprises gladly pay for technical support and consistent upgrades and updates.

But behind all versions of Linux is a global community of people who contribute in small and large ways to the overall system, and give back improvements to the wider community. It’s messy, and wonderful.

Ubuntu 10.04, released in the spring, was nicknamed “Lucid Lynx” and was a major upgrade. But it didn’t install correctly on my X201; all I got was a black screen when I started the computer. I tried some of the tricks people suggested online but had no luck.

In September, my friend Cory Doctorow, who has been one of the people encouraging this switch and knew about my troubles, connected me with an American programmer, Robert Douglass, who lives in Germany. Douglass had found a way to make his own X201 work, and he was happy to help me do the same.

We spent part of a Saturday morning (California time) on Skype, during which he helped me upgrade the ThinkPad’s BIOS — the low-level software that’s embedded in the hardware — and then try to install a beta, or prerelease, version of Ubuntu 10.10, nicknamed “Maverick Meerkat” in the offbeat naming system the Ubuntu-ites enjoy.

It worked immediately. After the installation and restart of the computer, I was greeted with a typical log-in screen. I was overjoyed, and Douglass was happy if somewhat bemused that all of his tinkering had been — as such things tend to go — rendered moot by the Ubuntu team. He was a classic member of the open-source software community: willing to help out a relative newbie because he believed so much in what he was helping to support.

But because I was now using beta software, I suspected something would still be problematic in a showstopping way. I suspected correctly: The Virgin Mobile USB modem I use for wireless broadband didn’t work right, and my hunting around for help online didn’t bring any useful tips.

Ubuntu 10.10 was released on (naturally) 10/10/10. The update could not have gone any easier. One of Ubuntu’s greatest achievements has been in the ease-of-use arena; installing and updating most software is now a breeze. And, as I’d expected from the comments I’d seen online about my modem, a fix had been added. I wrote this piece and uploaded it using my new setup.

This isn’t the first Linux I’ve used. I’ve installed various distributions on other hardware in the past, from various providers. But this is the first Linux that strikes me as truly ready for prime time for a large segment of the computing population. I don’t believe for a second that a large percentage will do what I’ve done, because the standard alternatives are compellingly easy to use and come pre-installed on the most widely purchased computers. Yet I’d bet serious money that any normal computer user could adapt quickly to Ubuntu, and that people who do only a few things with their computers — browsing, writing documents and the like — would be entirely satisfied.

I do all kinds of different things on my laptop machines, and migrating entirely to Linux is going to be a chore. I am 85 percent of the way, by my estimate, to a system that runs everything I need to get my work done. In coming months I’ll periodically let you know how the migration is going. So far, so very, very good.