Saturday, October 30, 2010

Uncle Sam Is So Anti-Social

Social networking sites have become powerful tools for connecting people. Whether it's a party, a political event, or a business deal, people are increasingly getting it done through Facebook, Twitter, and LinkedIn. Some workplace technologies, like SharePoint, have social tools to enhance collaboration. Status updates are replacing group emails, and it's starting to be difficult to know what's going on without logging on.

The federal government, however, is adopting social networking and Web 2.0 at a snail's pace. A few agencies, like the Centers for Disease Control and Prevention (CDC), are at the forefront of government use of social networking. Many other agencies, though, severely limit or even forbid their people from using these new tools in any official capacity. Instead, government workers are posting commentary and information unofficially, creating exactly the kinds of information conflicts the bans were imposed to avoid.

Certainly, the government has a duty to protect some information like internal operations, personally identifiable data, and classified material. I'm pretty sure they have laws, rules, and policies to cover this stuff already. Does the medium really matter? Isn't distributing classified material already illegal?

These bans are a lot like the bans some states enacted on using a cellphone while driving. Distracted driving laws are already on the books; we don't have to cover every single distracting act a driver can commit. In fact, the more laws that get passed against specific activities like text messaging, the weaker the general distracted driving law gets. It's a vicious circle: as the overarching law weakens, legislatures feel compelled to pass dozens of mini-laws to cover the gaps. By the same token, singling out a particular technology to ban implies that the others are somehow OK to abuse.

Web 2.0, including social networking, is simply a different way to communicate. It should be governed by the same information policies that regulate email, phone calls, and dinner party conversation. If the answer is not banning or policy changes, what is it? Training. How about allowing official use of Web 2.0 technology to communicate with the public after a review of established guidelines and practical training in their application? In other words, why not teach government workers about appropriate use of Web 2.0 instead of pushing it underground? Then discipline or dismiss the offenders without shutting out everyone else.

Saturday, October 23, 2010

I Will Not Make A Cloud Pun

The world is moving to cloud computing, where servers online act as organizational file shares and software is distributed as a service, accessed right through your web browser. Some describe cloud computing as a return to mainframes, and there are some similarities. If you don't know what a mainframe is, go ask your dad. One major difference is that internal networks and desktop PCs still have a role: storing more sensitive data, serving location-specific needs such as printing, and hosting beefier programs and very large files.

One of the organizational benefits is a reduced need for network servers within the organization, and with them fewer IT headaches. Users don't really know where their files are anyway, so switching from a network share to the cloud shouldn't be a problem. In the future, storage space, bandwidth, and application usage may be metered and billed the way electricity or phone service is now.

The federal government's CIO, Vivek Kundra, is looking to the cloud as a solution to the challenge of spiraling IT infrastructure costs. While acknowledging a transition period, he seems to believe that the cloud will result in greater efficiencies and huge cost savings. From what I've read, he bases his argument on his work with the Washington, DC government, where he introduced Google applications to over 30,000 government workers. Reportedly, over 4,000 (about 13%) are using the cloud on a regular basis. The federal plan is to roll out similar cloud services to 300 million worldwide government workers. With similar usage rates, there should be about 39 million regular federal cloud users within a year of the official rollout.

What I wonder is, can this cloud experience be replicated on such a massive scale? Whenever I am presented with numbers, I like to do some math. 300 million is 10,000 times 30,000. That's four orders of magnitude from DC government to the federal level. I don't think it's reasonable to expect the tech czar to recreate his success 10,000 times over, but let's look a little further before passing judgement.
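
For the curious, here's the back-of-the-envelope arithmetic behind those projections, using only the figures quoted above (a rough sketch, not an official estimate):

```python
# Scaling the DC pilot figures up to the quoted 300 million rollout target.
dc_workers = 30_000
adoption_rate = 0.13                 # "about 13%" -- 4,000 regular users out of 30,000
target_population = 300_000_000      # rollout figure quoted above

scale_factor = target_population / dc_workers          # 10,000x, i.e. four orders of magnitude
projected_users = target_population * adoption_rate    # about 39 million regular users

print(f"Scale factor: {scale_factor:,.0f}x")
print(f"Projected regular cloud users: {projected_users:,.0f}")
```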

To get another perspective, I googled "30,000." Here are some results from the first ten (out of 99,500,000):
  • a Washington Post article mentions that 30,000 jobs were created or saved due to the federal economic stimulus package
  • the New York Daily News reports that 30,000 poll workers may not get paid for election work in the NYC area this year
  • msnbc.com reports that 30,000 people a year wake up during surgery
  • a datacenterknowledge.com interview reveals that Facebook has 30,000 servers


The last two items seem particularly interesting. 30,000 people waking up during surgery sounds like a ghastly number, but out of 20 million surgeries in the US each year, maybe it's not so many. After all, 99.85% of patients don't wake up during surgery. Then again, if it were the fabled "5 nines" (99.999%, or down only about five minutes every year) of the IT server world, only 200 patients would wake up. If a server's uptime were 99.85%, it would be down for about 13 hours every year. OK, maybe that's an unfair comparison. Anesthesia has been in regular medical use for more than 150 years; servers have been around for... wait a minute.
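
Here's that wake-up-versus-uptime arithmetic worked out with the rounded figures quoted above, just to show where the percentages come from:

```python
# Comparing the surgery wake-up rate to IT-style uptime figures.
surgeries_per_year = 20_000_000     # rough US figure quoted above
wakeups_per_year = 30_000

stay_under_rate = 1 - wakeups_per_year / surgeries_per_year
print(f"Patients who stay under: {stay_under_rate:.2%}")                                     # 99.85%

five_nines = 0.99999
print(f"Wake-ups at five nines: {surgeries_per_year * (1 - five_nines):.0f}")                # 200

hours_per_year = 365 * 24
print(f"Downtime at five nines: {hours_per_year * (1 - five_nines) * 60:.1f} minutes/year")  # ~5.3
print(f"Downtime at 99.85% uptime: {hours_per_year * (1 - stay_under_rate):.1f} hours/year") # ~13.1
```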

That brings us to the item about Facebook. Notice how the same numbers pop up: 30,000 servers and 300 million users, or 10,000 users per server. So what? Well, Facebook was down this week for hours, at least for some people. Will the government launch its cloud initiative with 30,000 servers? Even if only 13% of government workers use the cloud, at Facebook's ratio we still need 3,900 servers to accommodate them, and for more demanding activity than Mob Wars and status updates. Last year, T-Mobile and Microsoft lost the data of Sidekick phone users, many of whom lost their contacts and other personal information. What assurance will the government have that the cloud will be properly maintained and backed up?
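
And the server math, assuming purely for the sake of argument that a government cloud matches Facebook's quoted ratio of users to servers:

```python
# Server estimate at Facebook's quoted ratio of 300 million users to 30,000 servers.
users_per_server = 300_000_000 // 30_000   # 10,000 users per server
federal_cloud_users = 39_000_000           # 13% of 300 million, from the projection above

servers_needed = federal_cloud_users // users_per_server
print(f"Servers needed at Facebook's ratio: {servers_needed:,}")   # 3,900
```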

The federal government has not had the best track record with massive IT upgrades. Search online for upgrade problems with NMCI or the FBI and you'll find a litany of lost money and productivity. The cloud transition will require many of the same improvements in government infrastructure that would be necessary even without the cloud: firewalls will need to be modified, browsers upgraded, bandwidth widened, and so on. All of this has to be done by internal IT staff, who are often contractors with a vested interest in the status quo. Even the government employees in IT face a reduction in force or at least a transfer.

Maybe a good first test is to push SharePoint into the cloud. Agencies have only been using it for a few years, so it shouldn't have the backlog of files and legacy applications that so many other servers do. Users don't know where the SharePoint server is, and it doesn't matter anyway, since it is designed for external as well as internal access. Development of most SharePoint applications takes place through the browser or the free SharePoint Designer; there's very little need to log in to the server directly. Since many government workers already use some form of Microsoft Office in their daily work, workflows can stay the same, and agencies can avoid supporting a melange of office productivity tools.

Though cloud computing is indeed what the future holds, it is just getting started in the corporate world. It's a good idea to start with technology that is well-suited to the cloud, but familiar to users at the same time. Pushing SharePoint to the cloud first will help ensure a smoother transition.

Friday, October 15, 2010

Full-Motion Panic

My industry reading is backlogged, so I didn’t see this item until recently. A press release from the Gartner Group outlines an article predicting that by 2013, more than 25 percent of the content workers see in a day will be pictures, video, or audio. The author claims, “enterprises that see such growth as irrelevant to their operations risk alienating themselves from customers who start to request video communication services.” Hold the videophone: this could be a sea change, perhaps a revolution. The article seems to predict workplace chaos: “Users … will not accept onerous restrictions of inflexible security, access controls or forced metadata schemes in the workplace.”

The Gartner Group is usually an excellent source of statistics, industry trends, and other hard-to-find research, but I’m not getting the urgency or danger here. Certainly, the use of video online has skyrocketed, partly because bandwidth is broader, compression algorithms are better, and hardware is faster. Mostly, though, video is more popular on the web because there’s something to watch. As we used to say before the .com collapse, “Content is king.” Employees are going to YouTube and Hulu to relax and enjoy themselves; they’re not checking out the CEO’s video blog in droves. Entertainment is different from office life, at least for most people. A tedious office video is equally boring in the conference room or on a desktop.

Video can be a great medium to demonstrate physical procedures, impart wisdom from an expert, or give a live view of an event or place. In the 60 or so years that video has been widely available, it has both entertained and enlightened us. Putting it on a computer isn’t something that organizations have to prepare for any differently than they had to prepare for voicemail, email, faxes, and the like. The most important question is not “What will customers demand,” or “What will our employees put up with,” but “What business need does video fill?” From that starting point, we can examine what to do about it and look back at the recently trod paths of websites and collaboration tools for some guidelines.

In order to put this technology in the hands of employees, we need to do a little scratchpad work on procedure. It’s a good idea to figure out what video is for in an organization before letting everyone loose with a camera and upload space. Every other method of business communication is governed by policy; an untested one like video should have some broad guidelines. I don’t mean to suggest locking the general counsel in a room until there’s a 400-page ironclad policy, but the same rules for trade secrets, appropriate language, etc. need to be in place for video. Next, let people try it. Employees can cover meetings, construction progress, depositions, or whatever they think makes sense. Let them record with cell phones, Nanos, video cameras, or anything else on hand. Give them space to upload within the intranet and take stock. Once people have gotten their feet wet with video, the organization should have some ideas on further policies and directions.

Once the organization knows where they’re going with video, it’s time to look for technology. Any sophisticated video-handling system licensed within the next twelve months will probably be replaced by something five times as good and a third as expensive by 2013. Organizations must determine how immediate the need for video communication is. If they can wait, they should do so. There are plenty of content management technologies out there that handle video to a degree, but we’re a long way from a SharePoint-style video intranet server, at least for most groups.

What’s most important is not to panic. Audio, video, and pictures have been around a long time, just not on the intranet. I know it seems like a lot: we’re just getting used to “wikis and blogs,” now there’s video. Start small and see how it feels before taking the plunge. Otherwise, your adventure will be all licensing and servers, but no video.