I gave a seminar on elearning at a federal department this week. It was full of training managers and other human capital experts. They all had the same comment: "This is great information, but it doesn't address what I really need. How do I convince upper management that training is valuable?" As we examined that topic from different angles, a few things became clear. First, many training managers aren't measuring their training beyond a course evaluation. Second, and more shocking, upper management isn't sharing program objectives or organizational goals with the training team.
Let's take the measures first. You're probably familiar with Donald Kirkpatrick's evaluation model: four levels of training assessment, broken down into a learner's evaluation, a learning assessment, knowledge transfer to the job, and organizational benefit. Each level is measured and builds on the preceding one. It's pretty simple in concept, but it can be difficult in practice.
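That "builds on the preceding one" rule is worth making concrete. Here's a minimal sketch (my own illustration, not part of Kirkpatrick's formulation) of the four levels as a measurement checklist, where a claim at level N is only defensible if every level below it was actually measured:

```python
# A toy model of Kirkpatrick's four levels as a measurement checklist.
# The level names follow the model; the data structure is illustrative only.
KIRKPATRICK_LEVELS = [
    "reaction",   # level 1: learner's evaluation of the course
    "learning",   # level 2: assessment of knowledge gained
    "behavior",   # level 3: transfer of skills to the job
    "results",    # level 4: organizational benefit
]

def highest_defensible_level(measured: set[str]) -> int:
    """Each level builds on the one before it, so stop at the first gap."""
    level = 0
    for name in KIRKPATRICK_LEVELS:
        if name not in measured:
            break
        level += 1
    return level
```

A manager with course evaluations and on-the-job skill numbers but no learning assessment, for example, can only defend level one: `highest_defensible_level({"reaction", "behavior"})` stops at the level-two gap.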
The training managers had some problems with the level one assessments. One issue was the questions didn't seem to go far enough. It's good to ask a learner how he or she liked the course, but it's better to go a little deeper. Ask how the learner would do it differently or if any information seemed to be missing. Find out how the learner would have survived without the course, and what resources he or she might have used if the course didn't exist. Another big issue was making sure the online evaluations were anonymous in the LMS -- something any good programmer should be able to provide with little difficulty.
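On the anonymity point, the programming really is simple. A sketch of the idea, with hypothetical field names (any real LMS will have its own schema): strip every identifying field before the evaluation ever reaches storage.

```python
# Illustrative sketch: anonymize a course evaluation before storing it.
# Field names here are hypothetical; the point is that identity never
# reaches storage, so anonymity doesn't depend on access controls later.
IDENTIFYING_FIELDS = {"user_id", "email", "name", "ip_address"}

def anonymize(response: dict) -> dict:
    """Return a copy of the evaluation with identifying fields removed."""
    return {k: v for k, v in response.items() if k not in IDENTIFYING_FIELDS}
```

So `anonymize({"user_id": 42, "course_id": "ELRN-101", "rating": 4})` keeps only the course and the answers.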
At level two, managers seemed to be uncomfortable with the idea of a pretest evaluation of the learning population. They wanted to get on with the training. I sympathize, but it's really important to know how your training changed the learner's knowledge. If you don't know that, it's impossible to know what direct effect your training has had.
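The level-two measure itself is nothing exotic. Here's a minimal sketch, with made-up scores, of the pretest/posttest comparison: pair each learner's scores and average the difference.

```python
# Sketch of a level-two measure: average gain from pretest to posttest.
# The scores are hypothetical; pair each learner's pretest with their posttest.
def average_gain(pre: list[float], post: list[float]) -> float:
    assert len(pre) == len(post), "need matched pre/post scores"
    return sum(b - a for a, b in zip(pre, post)) / len(pre)
```

For three learners scoring 55, 60, and 70 before training and 75, 80, and 85 after, the average gain is about 18 points. Without the pretest, that number simply doesn't exist.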
Level three, interestingly, was the easiest for training managers to handle. After all, they had numbers to justify that a particular skill was lacking in some way, and it was easy enough to do an additional evaluation to note the change in skill. Since the level two measures were lacking, however, managers could only guess that training was responsible for the change. Increased awareness, marketing, or other factors may have been responsible.
The biggest issue was level four, which is to be expected. Since training managers didn't have the first three levels of measurement on solid footing, and upper management wasn't sharing its objectives (or didn't have any), level four was an impossible climb. We've already looked at some issues with the first three levels, which are generally under the training manager's control. Since organizational goals and program objectives are the purview of higher powers, what can a training manager do?
I think the answer is simple: make something up. Training managers know enough about the organization to understand its mission. They know about the big programs and what they're designed to do. The trick is to tie the course to the program.
Though training managers may not get the news, marketing is all over this stuff. Every press release is peppered with "meeting the executive's goal of..." or "increase operational efficiency" or "reduce our risk exposure." This kind of language is all over an organization's website. Trainers should just pick a goal and march with it. Better yet, pick several. It's not a trick; any good trainer can identify organizational deficiencies instinctively. Trainers know what the organization needs, they just need a little help describing it to the folks with the checkbook.
Training must have a purpose, and that purpose should at least include an organizational benefit. As long as the first three Kirkpatrick levels are solid, it should be easy to show the way to level four.
Saturday, November 20, 2010
Saturday, November 6, 2010
Section 508 Blues
Technology is amazing. When it comes to online training, I am impressed with what a good programmer can do. I have both produced and taken online training in various forms, and I love it when I really get the subject matter. Programmers use lots of techniques and media to make training engaging: interactivity, video, audio, branching paths, cool graphics, and animations. True, some of it can be just eye candy, but a good instructional designer will employ multimedia to give learners further insight. Delivering such a rich learning experience is so valuable, I sometimes wonder why federal government trainers do anything else. Then I remember -- Section 508.
In case you don't know, Section 508 of the Rehabilitation Act "... requires that Federal agencies' electronic and information technology is accessible to people with disabilities," according to www.section508.gov. What does this mean? Nobody seems to really know, because its broad interpretation leaves a lot of leeway. The spirit of the law, however, is pretty easy to understand: make the experience of a disabled individual as close as possible to that of someone who isn't disabled.
With that in mind, it's time to make all that interactivity and media accessible. Sounds pretty simple in theory, doesn't it? In practice, unfortunately, development tools like Flash provide only the most rudimentary accessibility support. Sure, it's easy enough to provide tab navigation to people who use interface devices other than a mouse, and providing captioning for the deaf and hard of hearing isn't that hard, either, though it takes some time. Section 508 for the blind, though, is a real challenge.
Out-of-the-box navigation objects and media containers just aren't built to work well with screen reader programs like JAWS. Online, screen readers want to examine every piece of content, regardless of its relevance. They aren't bright enough to know that a background graphic isn't as important as a lesson title. We have to tell screen readers what's important with good programming, and that takes lots of time and skill.
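To make the idea concrete, here's a toy model (emphatically not a real Flash or JAWS API) of what that programming accomplishes: each on-screen element carries an accessible name, a reading order, and a flag for decorative content, and the "screen reader" announces only the meaningful items, in order.

```python
# Toy model of accessible markup -- not a real Flash or JAWS interface.
# Each element dict carries a hypothetical "accessible_name", a
# "reading_order", and an optional "decorative" flag.
def announce(elements: list[dict]) -> list[str]:
    """Return what a screen reader would speak: meaningful items, in order."""
    visible = [e for e in elements if not e.get("decorative", False)]
    ordered = sorted(visible, key=lambda e: e["reading_order"])
    return [e["accessible_name"] for e in ordered]
```

With the background graphic flagged decorative, the learner hears the lesson title first and the navigation after it, instead of a recitation of every asset on the stage. Getting a real course to behave this way is exactly the time-consuming part.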
So why bother? We have other options. Maybe we can just make a website with no graphics or interactivity at all. That's pretty easy to make 508 compliant. The only problem is we've just fallen off the training wagon into a big pile of "information." If we can't do a little simulation, coaching, or interaction, it's hard to know whether learners can apply what we're trying to teach them.
Maybe we can just provide a "text equivalent," which is a transcript of the online course. It's a technical out but doesn't conform to the spirit of the law in any way. People who use the transcript will only be told, not trained.
It looks like we're stuck with providing a rich, valuable learning experience for everyone, regardless of disabilities. That means we have to make those custom interface controls, provide narration, and describe buttons, graphics, and animation. We will use a screen reader like JAWS to test our work. We will try to get as close to the same experience as we can with the resources we have. We want to train everyone, don't we?
Labels: accessibility, flash, government, interactive, multimedia, section 508
Saturday, October 30, 2010
Uncle Sam Is So Anti-Social
Social networking sites have become powerful tools for connecting people. Whether it's a party, a political event, or a business deal, people are increasingly getting it done through Facebook, Twitter, and LinkedIn. Some workplace technologies, like SharePoint, have social tools to enhance collaboration. Status updates are replacing group emails, and it's starting to be difficult to know what's going on without logging on.
The federal government, however, is adopting social networking and Web 2.0 at a snail's pace. A few agencies, like the Centers for Disease Control and Prevention (CDC), are at the forefront of government use of social networking. Many other agencies, however, severely limit or even forbid their people from using these new tools in any official capacity. Instead, government workers are posting commentary and information unofficially and creating the same kinds of information conflicts the outright ban was levied to avoid.
Certainly, the government has a duty to protect some information like internal operations, personally identifiable data, and classified material. I'm pretty sure they have laws, rules, and policies to cover this stuff already. Does the medium really matter? Isn't distributing classified material already illegal?
These bans are a lot like the bans some states enacted on using a cellphone while driving. Distracted driving laws are already on the books; we don't have to cover every distracting act a driver can commit. In fact, the more laws against specific activities like text messaging get passed, the weaker the general distracted driving law becomes. It's a vicious circle: as the overarching law weakens, legislatures feel compelled to pass dozens of mini-laws to fill the gaps. By the same token, singling out a particular technology to ban implies that the others are somehow OK to abuse.
Web 2.0, including social networking, is simply a different way to communicate. It should be governed by the same information policies that regulate email, phone calls, and dinner party conversation. If the answer is not banning or policy changes, what is it? Training. How about allowing official use of Web 2.0 technology to communicate with the public after a review of established guidelines and practical training in their application? In other words, why not teach government workers about appropriate use of Web 2.0 instead of pushing it underground? Then discipline or dismiss the offenders without shutting out everyone else.
Labels: government, social networking, training, web 2.0
Saturday, October 23, 2010
I Will Not Make A Cloud Pun
The world is moving to cloud computing, where servers online act as organizational file shares and software is distributed as a service, accessed right through your web browser. Some describe cloud computing as a return to mainframes, and there are some similarities. If you don't know what a mainframe is, go ask your dad. One major difference is that internal networks and desktop PCs have a role in storing more sensitive data, serving location-specific needs such as printing, and hosting beefier programs and very large files.
One of the organizational benefits is reducing the need for network servers within an organization, and with them a good deal of IT headache. Users don't really know where their files are anyway, so switching from a network share to the cloud shouldn't be a problem. In the future, storage space, bandwidth, and application usage may be metered and billed the way electricity and phone service are now.
The federal government's CIO, Vivek Kundra, is looking to the cloud as a solution to the challenge of spiraling IT infrastructure costs. While acknowledging a transition period, he seems to believe that the cloud will result in greater efficiencies and huge cost savings. From what I've read, he bases his argument on his work with the Washington, DC government, where he introduced Google applications to over 30,000 government workers. Reportedly, over 4,000 (about 13%) are using the cloud on a regular basis. The federal plan is to roll out similar cloud services to 300 million worldwide government workers. With similar usage rates, there should be about 39 million regular federal cloud users within a year of the official rollout.
What I wonder is, can this cloud experience be replicated on such a massive scale? Whenever I am presented with numbers, I like to do some math. 300 million is 10,000 times 30,000. That's four orders of magnitude from DC government to the federal level. I don't think it's reasonable to expect the tech czar to recreate his success 10,000 times over, but let's look a little further before passing judgement.
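Here's that back-of-the-envelope math, using the figures from the paragraphs above:

```python
# Back-of-the-envelope check of the scale-up, using the article's figures.
dc_users = 30_000                          # DC government workers on Google apps
federal_users = 300_000_000                # the planned federal rollout
scale_factor = federal_users // dc_users   # 10,000x, i.e. four orders of magnitude

regular_rate = 0.13                        # ~13% became regular users in DC
projected_regular = federal_users * regular_rate  # about 39 million regular users
```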
To get another perspective, I googled "30,000." Here are some results from the first ten (out of 99,500,000):
- a Washington Post article mentions that 30,000 jobs were created or saved due to the federal economic stimulus package
- the New York Daily News reports that 30,000 poll workers may not get paid for election work in the NYC area this year.
- 30,000 people a year wake up during surgery, according to msnbc.com
- a datacenterknowledge.com interview reveals that Facebook has 30,000 servers
The last two items seem particularly interesting. 30,000 patients seems like a ghastly number of people waking during surgery, but out of 20 million surgeries in the US each year, maybe it's not so many. After all, 99.85% of patients don't wake up during surgery. Then again, if it were the fabled "5 nines" (99.999%, or down only about five minutes every year) of the IT server world, only 200 patients would wake up. If a server's uptime were 99.85%, it would be down for about 13 hours every year. OK, maybe that's an unfair comparison. Anesthesia has only been in regular medical use for about a hundred years; servers have been around for... wait a minute.
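The uptime arithmetic checks out:

```python
# The uptime arithmetic behind the anesthesia comparison.
minutes_per_year = 365 * 24 * 60                         # 525,600 minutes
five_nines_down = minutes_per_year * (1 - 0.99999)       # about 5 minutes a year
down_9985_hours = minutes_per_year * (1 - 0.9985) / 60   # about 13 hours a year

surgeries_per_year = 20_000_000
wakeups_at_five_nines = surgeries_per_year * (1 - 0.99999)  # about 200 patients
```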
That brings us to the item about Facebook. Notice how the same numbers pop up: 30,000 servers, 300 million users, so that's 10,000 users per server. So what? Well, Facebook was down this week for hours, at least for some people. Will the government launch its cloud initiative with 30,000 servers? Even if only 13% of government workers use the cloud, we still need 3,900 servers to accommodate them, and for more activity than Mob Wars and status updates. Last year, T-Mobile and Microsoft lost data for Sidekick phone users, who lost their contacts and other personal information. What assurance will the government have that the cloud will be properly maintained and backed up?
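The server count falls out of the same numbers:

```python
# Applying Facebook's user-to-server ratio to the projected federal cloud.
fb_users, fb_servers = 300_000_000, 30_000
users_per_server = fb_users // fb_servers    # 10,000 users per server

regular_cloud_users = 39_000_000             # 13% of 300 million, from above
servers_needed = regular_cloud_users // users_per_server  # 3,900 servers
```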
The federal government has not had the best track record with massive IT upgrades. Search online for upgrade problems with NMCI or the FBI and you'll find a litany of lost money and productivity. The cloud transition will require many of the infrastructure improvements that would be necessary without the cloud: firewalls modified, browsers upgraded, bandwidth widened, and so on. All this has to be done by internal IT folks who are often contractors with a vested interest in the status quo. Even the government employees in IT face a reduction in force, or at least a transfer.
Maybe a good first test is to push SharePoint into the cloud. Agencies have only been using it for a few years, so it shouldn't have the backlog of files and legacy applications that so many other servers do. Users don't know where the SharePoint server is, and it doesn't matter anyway, since it is designed for external as well as internal access. Development of most SharePoint applications takes place through the browser or the free SharePoint Designer; there's very little need to log in to the server directly. Since many government workers already use some form of Microsoft Office in their daily work, workflows can stay the same, and agencies can avoid supporting a mélange of office productivity tools.
Though cloud computing is indeed what the future holds, it is just getting started in the corporate world. It's a good idea to start with technology that is well-suited to the cloud, but familiar to users at the same time. Pushing SharePoint to the cloud first will help ensure a smoother transition.