Saturday, November 27, 2010

SharePoint Is a Waste of Money, But It Shouldn't Be

Microsoft SharePoint 2007 is an amazing tool. Imagine a project management site, library, supply chain system, and records center coupled with Web 2.0 and social networking. It can't miss, right? Right! The promise of SharePoint has made it wildly popular in business and government. The practicality of it, however, has made it a hair-pulling experience for many people who are just trying to get some work done. Where's the disconnect?

The biggest challenge seems to be implementation. Typically, SharePoint gets poor treatment by IT people, who view it as a glorified web server or network share. Either way, it's an unknown and therefore a pain. Often, IT folks will spend six months planning the hardware and bandwidth and six hours on the software. Once SharePoint is turned on, the team walks away. It's tough for most employees to create a website from scratch; it's almost impossible for a novice to use SharePoint to its potential. The hardware, licensing, and IT effort are all for naught.

The answer is pretty simple: IT needs to listen to the user base. They should find out -- from workers, not their managers -- what daily work actually looks like. Engineers can work with users to discover which SharePoint features will save them effort. Programmers can walk them through RSS feeds and alerts for automatic communication. Trainers can school them in the benefits and basic usage. Once they get their feet wet, users will come back to IT for time-saving features that will really demonstrate SharePoint's ROI. Otherwise, SharePoint is just another spike in the five-year budget.

Saturday, November 20, 2010

4th Floor -- Results, Organizational Benefits, Objectives, Goals

I gave a seminar on elearning at a federal department this week. It was full of training managers and other human capital experts. They all had the same comment: "This is great information, but it doesn't address what I really need. How do I convince upper management that training is valuable?" As we examined that topic from different angles, a few things became clear. First, many training managers aren't measuring their training beyond a course evaluation. Second, and more shocking, upper management isn't sharing program objectives or organizational goals with the training team.

Let's take the measures first. You're probably familiar with Donald Kirkpatrick's evaluation model. Four levels of training assessment are broken down into a learner's evaluation, a learning assessment, knowledge transfer, and organizational benefit. Each level is measured and builds on the preceding one. It's pretty simple in concept, but it can be difficult to put into practice.

The training managers had some problems with the level one assessments. One issue was that the questions didn't seem to go far enough. It's good to ask a learner how he or she liked the course, but it's better to go a little deeper. Ask how the learner would do it differently or if any information seemed to be missing. Find out how the learner would have survived without the course, and what resources he or she might have used if the course didn't exist. Another big issue was making sure the online evaluations were anonymous in the LMS -- something any good programmer should be able to provide with little difficulty.
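The anonymity piece really is as simple as it sounds: never store a learner identifier alongside the responses. Here's a minimal sketch of that idea -- the course IDs, questions, and storage format are all invented for illustration, not from any particular LMS:

```python
# Sketch: store level-one evaluations without any learner identifiers.
# Course IDs and questions are hypothetical examples.

evaluations = []  # in a real LMS this would be a database table

def record_evaluation(course_id, responses):
    """Save only the course and the answers -- no user ID, no login,
    nothing precise enough to re-identify the learner."""
    evaluations.append({"course": course_id, "responses": responses})

record_evaluation("SP-101", {
    "What would you do differently?": "More hands-on labs",
    "Was any information missing?": "Nothing on permissions",
})

print(len(evaluations))  # 1
```

The point is what's absent: as long as the record schema has no user field, anonymity is structural rather than a promise.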

At level two, managers seemed to be uncomfortable with the idea of a pretest evaluation of the learning population. They wanted to get on with the training. I sympathize, but it's really important to know how your training changed the learner's knowledge. If you don't know that, it's impossible to know what direct effect your training has had.
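The level two measurement is nothing more than a before-and-after comparison. A toy sketch, with invented scores, shows how little arithmetic it takes to turn a pretest and posttest into a defensible "the training changed knowledge by X points" claim:

```python
# Sketch: measure knowledge change with a pretest and posttest.
# Learner names and scores are invented for illustration.

pretest  = {"alice": 55, "bob": 62, "carol": 48}
posttest = {"alice": 80, "bob": 85, "carol": 74}

gains = {name: posttest[name] - pretest[name] for name in pretest}
average_gain = sum(gains.values()) / len(gains)

print(gains)         # {'alice': 25, 'bob': 23, 'carol': 26}
print(average_gain)  # 24.666...
```

Without the pretest row, the posttest numbers alone can't separate what the course taught from what learners already knew.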

Level three, interestingly, was the easiest for training managers to handle. After all, they had numbers to justify that a particular skill was lacking in some way, and it was easy enough to do an additional evaluation to note the change in skill. Since the level two measures were lacking, however, managers could only guess that training was responsible for the change. Increased awareness, marketing, or other factors may have been responsible.

The biggest issue was level four, which is to be expected. Since training managers didn't have the first three levels of measurement on solid footing, and upper management wasn't sharing its objectives (or didn't have any), level four was an impossible climb. We've already looked at some issues with the first three levels, which are generally under the training manager's control. Since organizational goals and program objectives are the purview of higher powers, what can a training manager do?

I think the answer is simple: make something up. Training managers know enough about the organization to understand its mission. They know about the big programs and what they're designed to do. The trick is to tie the course to the program.

Though training managers may not get the news, marketing is all over this stuff. Every press release is peppered with "meeting the executive's goal of..." or "increase operational efficiency" or "reduce our risk exposure." This kind of language is all over an organization's website. Trainers should just pick a goal and march with it. Better yet, pick several. It's not a trick; any good trainer can identify organizational deficiencies instinctively. Trainers know what the organization needs; they just need a little help describing it to the folks with the checkbook.

Training must have a purpose, and that purpose should at least include an organizational benefit. As long as the first three Kirkpatrick levels are solid, it should be easy to show the way to level four.

Saturday, November 13, 2010

Spend Your Training Budget, Train No One

Looking over some of my previous posts, I have discovered a common theme: scaling objectives. That is, achieving a goal by building a small but solid foundation and scaling up as needed to meet the goal. Unfortunately, some training program managers choose to spend lots of money on the one elearning component that doesn't directly train anyone: a learning management system. As soon as a training program is launched, some people go running for infrastructure. Don't get me wrong; I'm not against using an LMS. I just don't think that kind of major purchase is Step One.

So what is Step One? Figuring out what Step One is, of course. Look at all the people you serve. What are their duties? Where are the biggest areas for improvement? What kind of impact would training make on performance? In other words, make sure you know who you are training and what courses they should have. Then it's time to figure out how.

There are two main directions to go: synchronous and asynchronous. Those are just 50-cent terms for instructor-led and self-directed. Which way to go is mostly based on how many learners you have, how many courses you intend to provide, and the course content, not to mention budget. Synchronous training is LMS-independent; classroom training has existed for centuries. Asynchronous training needs an LMS of some kind, but it doesn't have to cost much.

The essence of an LMS for many organizations is automated reporting. Test scoring, classroom scheduling, and course creation can all be beneficial, but for most trainers they are expensive add-ons. The bad news is that a traditional LMS includes those expensive add-ons and contributes to a huge budget footprint. The good news is that you don't have to have a traditional LMS to do your reporting.

If you have access to an intranet similar to SharePoint, work with a good developer to devise a solution for your internal or OPM reporting needs. The course and reporting can be hosted on your intranet and reporting data can be pushed to an external source as needed. The development will cost a fraction of a full-blown LMS, and there are no recurring costs, since your intranet license is already covered.
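What does "home-grown reporting" actually look like? Often just a completion roster rendered in a format the external system accepts. This sketch uses CSV; the learner names, course titles, and fields are hypothetical, and a real solution would read from your intranet's lists rather than a literal:

```python
# Sketch: a home-grown completion report instead of a full LMS.
# Learners, courses, and fields are hypothetical examples.
import csv
import io

completions = [
    {"learner": "J. Smith", "course": "Records Mgmt 101",
     "score": 92, "completed": "2010-11-01"},
    {"learner": "A. Jones", "course": "Records Mgmt 101",
     "score": 88, "completed": "2010-11-03"},
]

def completion_report(rows):
    """Render completion data as CSV, ready to push to an external system."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["learner", "course", "score", "completed"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(completion_report(completions))
```

A developer can schedule something like this to run nightly and push the file wherever the reporting requirement lives; that scheduled export is most of what a small program buys from an LMS.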

So when do you need an LMS? The answer varies with the organization, but here are some rough guidelines.

  • Courses have expanded to a curriculum tied to a career path
  • The learning population is over 200
  • The number of courses exceeds 50
  • You really want all the add-on features of a traditional LMS

Otherwise, stick with a home-grown reporting tool and use the huge savings to get some courses ready.
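Those rough guidelines amount to a short checklist, which can be sketched as a toy function -- the thresholds come straight from the list above, and of course real decisions deserve more nuance:

```python
# Sketch: the rough "do you need an LMS yet?" guidelines as a checklist.
# Thresholds are the rough guidelines from the post, not hard rules.

def lms_reasons(learners, courses, career_path_curriculum, wants_addons):
    """Return the reasons (if any) to start shopping for a traditional LMS."""
    reasons = []
    if career_path_curriculum:
        reasons.append("curriculum tied to a career path")
    if learners > 200:
        reasons.append("learning population over 200")
    if courses > 50:
        reasons.append("more than 50 courses")
    if wants_addons:
        reasons.append("you really want the traditional add-ons")
    return reasons

print(lms_reasons(120, 15, False, False))  # [] -- stick with home-grown reporting
print(lms_reasons(350, 60, False, False))  # two reasons to consider an LMS
```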

Saturday, November 6, 2010

Section 508 Blues

Technology is amazing. When it comes to online training, I am impressed with what a good programmer can do. I have both produced and taken online training in various forms, and I love it when I really get the subject matter. Programmers use lots of techniques and media to make training engaging, like interactivity, video, audio, branching paths, cool graphics and animations. True, some of it can be just eye candy, but a good instructional designer will employ multimedia to give learners further insight. Giving a high-level learning experience is so valuable, I sometimes wonder why federal government trainers do anything else. Then I remember -- Section 508.

In case you don't know, Section 508 of the Rehabilitation Act "... requires that Federal agencies' electronic and information technology is accessible to people with disabilities," according to www.section508.gov. What does this mean? Nobody seems to really know, because its broad interpretation leaves a lot of leeway. The spirit of the law, however, is pretty easy to understand: make the experience of a disabled individual as close as possible to that of someone who isn't disabled.

With that in mind, it's time to make all that interactivity and media accessible. Sounds pretty simple in theory, doesn't it? In practice, unfortunately, development tools like Flash provide only the most rudimentary accessibility support. Sure, it's easy enough to provide tab navigation to people who use interface devices other than a mouse, and providing captioning for the deaf and hard of hearing isn't that hard, either, though it takes some time. Section 508 for the blind, though, is a real challenge.

Out-of-the-box navigation objects and media containers just aren't built to work well with screen reader programs like JAWS. Left to their own devices, screen readers want to examine every piece of content, regardless of its relevance. They aren't bright enough to know that a background graphic isn't as important as a lesson title. We have to tell screen readers what's important with good programming, and that takes lots of time and skill.
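"Telling the screen reader what's important" boils down to attaching metadata to every meaningful element and explicitly silencing the decorative ones. On the web that metadata is HTML attributes like `alt` and `aria-label`; here's a small Python sketch that builds such markup, offered purely as an illustration of the principle (not anything Flash provided):

```python
# Sketch: marking up content so a screen reader knows what matters.
# Builds illustrative HTML/ARIA strings; element names are hypothetical.

def decorative_image(src):
    # Empty alt text plus role="presentation" tells a screen reader
    # to skip a background graphic entirely.
    return f'<img src="{src}" alt="" role="presentation">'

def lesson_title(text):
    # A real heading element is announced and navigable by screen readers.
    return f"<h1>{text}</h1>"

def next_button():
    # A labeled, focusable control instead of an anonymous clickable graphic.
    return '<button aria-label="Go to next lesson page">Next</button>'

page = "\n".join([
    decorative_image("bg.png"),
    lesson_title("Lesson 1: Records Basics"),
    next_button(),
])
print(page)
```

The work is in making that judgment call -- important or decorative -- for every single object in the course, which is exactly why it takes time and skill.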

So why bother? We have other options. Maybe we can just make a website with no graphics or interactivity at all. That's pretty easy to make 508 compliant. The only problem is we've just fallen off the training wagon into a big pile of "information." If we can't do a little simulation, coaching, or interaction, it's hard to know whether learners can apply what we're trying to teach them.

Maybe we can just provide a "text equivalent," which is a transcript of the online course. It's a technical out but doesn't conform to the spirit of the law in any way. People who use the transcript will only be told, not trained.

It looks like we're stuck with providing a rich, valuable learning experience for everyone, regardless of disabilities. That means we have to make those custom interface controls, provide narration, and describe buttons, graphics, and animation. We will use a screen reader like JAWS to test our work. We will try to get as close to the same experience as we can with the resources we have. We want to train everyone, don't we?