Showing posts with label training. Show all posts

Saturday, February 26, 2011

Failure Has Some Options

Everyone fails at one time or another. Every organization can point to a program or project that just didn't go well at all. One of the big steps is admitting the problem in the first place. Another big step is figuring out what to do about it. There are three main options: do nothing, add resources, or scrap the project. For many years, the federal government has only taken the first two tacks; now it's going to try the third. According to an article in Nextgov magazine, the White House will review and post the health of all government IT projects. You can find that link at
http://it.usaspending.gov/.

While some of us might enjoy the handy pie chart and deep red color of these stats, there is so much more to know. By what measure is a particular project failing? Will this project succeed given more time and other resources? Is it worth it? Are there particular people to blame? Yes, I know it's popular to say that no one is at fault, or everyone is, which means the same thing for accountability. Starting with the project requirements, it's useful to look at the history of the whole effort. It's entirely possible that the project in question was never going to meet the organizational goals in the first place, especially if there are no goals.

What's nice about adding accountability to these IT projects is that contractors who don't meet the timeline or requirements will no longer be rewarded with more money, at least the really bad ones won't. Those bad projects are a huge time and resource suck, and I for one can think of all sorts of valuable ways that money can be spent. Even if the project is set to continue, it can be done under another contractor. As you've read previously, I'm pretty big on accountability and consequences, and this IT health monitor is set to deliver plenty of both. I'll be watching to see what happens.

Saturday, November 20, 2010

4th Floor -- Results, Organizational Benefits, Objectives, Goals

I gave a seminar on e-learning at a federal department this week. It was full of training managers and other human capital experts. They all had the same comment: "This is great information, but it doesn't address what I really need. How do I convince upper management that training is valuable?" As we examined that topic from different angles, a few things became clear. First, many training managers aren't measuring their training beyond a course evaluation. Second, and more shocking, upper management isn't sharing program objectives or organizational goals with the training team.

Let's take the measures first. You're probably familiar with Donald Kirkpatrick's evaluation model. Four levels of training assessment are broken down into a learner's evaluation, a learning assessment, knowledge transfer, and organizational benefit. Each level is measured and builds on the preceding one. It's pretty simple in concept, but it can be difficult to put into practice.

The training managers had some problems with the level one assessments. One issue was that the questions didn't go far enough. It's good to ask a learner how he or she liked the course, but it's better to go a little deeper. Ask how the learner would do it differently or if any information seemed to be missing. Find out how the learner would have survived without the course, and what resources he or she might have used if the course didn't exist. Another big issue was making sure the online evaluations were anonymous in the LMS -- something any good programmer should be able to provide with little difficulty.
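For the programmers in the room, anonymity mostly comes down to what you store. A minimal sketch, assuming a hypothetical LMS table: key each evaluation by a random token instead of the learner's account ID, so responses can't be traced back.

```python
# Sketch of storing level-one evaluations anonymously.
# The table name and columns are hypothetical, not from any real LMS;
# the point is that nothing in the row links back to the learner.
import sqlite3
import uuid

def save_evaluation(db, course_id, rating, comments):
    """Store an evaluation keyed by a random token, not a user ID."""
    token = uuid.uuid4().hex  # random; not derived from the learner's identity
    db.execute(
        "INSERT INTO evaluations (token, course_id, rating, comments) "
        "VALUES (?, ?, ?, ?)",
        (token, course_id, rating, comments),
    )
    db.commit()
    return token

db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE evaluations "
    "(token TEXT, course_id TEXT, rating INTEGER, comments TEXT)"
)
save_evaluation(db, "SEC-101", 4, "Wanted more hands-on practice")
count = db.execute("SELECT COUNT(*) FROM evaluations").fetchone()[0]
```

The design choice is simply to never write the user ID into the evaluations table in the first place; anonymity you don't have to enforce later is anonymity that can't leak.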

At level two, managers seemed to be uncomfortable with the idea of a pretest evaluation of the learning population. They wanted to get on with the training. I sympathize, but it's really important to know how your training changed the learner's knowledge. If you don't know that, it's impossible to know what direct effect your training has had.
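The level-two measurement itself is trivial once the pretest exists: give the same quiz before and after, and the average gain is your effect estimate. A minimal sketch with made-up learner IDs and scores:

```python
# Level-two sketch: same quiz before and after training.
# Learner IDs and scores are invented for illustration.
pre  = {"learner1": 55, "learner2": 60, "learner3": 40}
post = {"learner1": 85, "learner2": 80, "learner3": 75}

# Per-learner gain, then the average gain attributable to the course.
gains = [post[k] - pre[k] for k in pre]
average_gain = sum(gains) / len(gains)
```

Without the `pre` numbers, the `post` numbers alone can't tell you whether the course taught anything or the learners already knew it.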

Level three, interestingly, was the easiest for training managers to handle. After all, they had numbers to justify that a particular skill was lacking in some way, and it was easy enough to do an additional evaluation to note the change in skill. Since the level two measures were lacking, however, managers could only guess that training was responsible for the change. Increased awareness, marketing, or other factors may have been responsible.

The biggest issue was level four, which is to be expected. Since training managers didn't have the first three levels of measurement on solid footing, and upper management wasn't sharing its objectives (or didn't have any), level four was an impossible climb. We've already looked at some issues with the first three levels, which are generally under the training manager's control. Since organizational goals and program objectives are the purview of higher powers, what can a training manager do?

I think the answer is simple: make something up. Training managers know enough about the organization to understand its mission. They know about the big programs and what they're designed to do. The trick is to tie the course to the program.

Though training managers may not get the news, marketing is all over this stuff. Every press release is peppered with "meeting the executive's goal of..." or "increase operational efficiency" or "reduce our risk exposure." This kind of language is all over an organization's website. Trainers should just pick a goal and march with it. Better yet, pick several. It's not a trick; any good trainer can identify organizational deficiencies instinctively. Trainers know what the organization needs; they just need a little help describing it to the folks with the checkbook.

Training must have a purpose, and that purpose should at least include an organizational benefit. As long as the first three Kirkpatrick levels are solid, it should be easy to show the way to level four.

Saturday, October 30, 2010

Uncle Sam Is So Anti-Social

Social networking sites have become powerful tools for connecting people. Whether it's a party, a political event, or a business deal, people are increasingly getting it done through Facebook, Twitter, and LinkedIn. Some workplace technologies, like SharePoint, have social tools to enhance collaboration. Status updates are replacing group emails, and it's starting to be difficult to know what's going on without logging on.

The federal government, however, is adopting social networking and Web 2.0 at a snail's pace. A few agencies, like the Centers for Disease Control and Prevention (CDC), are at the forefront of government use of social networking. Many other agencies, however, severely limit or even forbid their people from using these new tools in any official capacity. Instead, government workers are posting commentary and information unofficially, creating exactly the kinds of information conflicts those bans were meant to avoid.

Certainly, the government has a duty to protect some information like internal operations, personally identifiable data, and classified material. I'm pretty sure they have laws, rules, and policies to cover this stuff already. Does the medium really matter? Isn't distributing classified material already illegal?

These bans are a lot like the bans some states enacted on using a cellphone while driving. Distracted driving laws are already on the books; we don't have to cover every single distracting act a driver can commit. In fact, the more that laws against activities like text messaging get passed, the weaker the general distracted driving law gets. It's a vicious circle; as the overarching law weakens, legislatures feel compelled to pass dozens of mini-laws to cover it. By the same token, singling out a particular technology to ban implies that the others are somehow OK to abuse.

Web 2.0, including social networking, is simply a different way to communicate. It should be governed by the same information policies that regulate email, phone calls, and dinner party conversation. If the answer is not banning or policy changes, what is it? Training. How about allowing official use of Web 2.0 technology to communicate with the public after a review of established guidelines and practical training in their application? In other words, why not teach government workers about appropriate use of Web 2.0 instead of pushing it underground? Then discipline or dismiss the offenders without shutting out everyone else.