Tracking is not Learning

SCORM and the LMS are the Achilles heel of training. Tracking data has become synonymous with measurement. This week I got an email from a vendor promoting the tracking capabilities of its product. It made me realize how often tracking data is used to misrepresent training success.

Many people think that metrics pulled out of an LMS indicate the success of training programs. They see tracking metrics as “performance measurement”. Tracking is not measurement and is no indication that learning took place, or that learning will transfer to job performance. Relying on tracking data shows our collective weakness in measuring training effectiveness.

Reliance on tracking data also means we aren’t asking the right questions about our training programs. How many times are you asked the following questions about training:

  • How many learners completed the eLearning course?
  • How many people attended training?

These are important stats, but they have little, if any, correlation to job performance. Just look at how effective the Secret Service ethics training was prior to the debacle in Cartagena, Colombia. Everyone completed training, but it was clearly not effective.

Reliance on tracking data also means instructional designers aren’t being honest about real measurement and effectiveness. It’s easy to hide behind tracking numbers, and often those numbers can give the impression that training programs are effective and valuable. It certainly sounds better to say that 90% of learners completed training than to say you have no idea whether or not they learned anything.

Yes, we can design training courses to measure learning through activities and assessments (not just quizzes, real assessment). In those cases, the tracking metrics do provide value. If the course has rigorous assessment, then your completion stat is an indicator of learning. But how many courses have you developed that truly assess learning? How many have you taken online or attended? Instructional designers are often between a rock and a hard place – there is an expectation from management (or customers) that people complete training, so we are under pressure to ensure they do. It’s our job as course developers to make sure that completion is not the most important metric.

And we won’t necessarily find the answer in Kirkpatrick’s Level 3 evaluation. That’s a great idea, but impractical in many organizations. If you can do Level 3 evaluations, then do them. We really need to look more seriously at the types of integrated assessments we build into training and how learners can measure their own success. We can’t be afraid to let people fail the assessment, and we shouldn’t punish those who do.

But instructional designers and course developers need to start at the beginning and ensure that the people asking for reports on training success understand what the data means. We also have to ask the right questions before we start developing training:

  • Why is this training important to the organization?
  • What criteria will management use to determine success?

If the answer to the second question is something like “everyone will complete the training,” then you’d better go back to the first question and dig deeper into the problem.

In the long run, reliance on tracking data and lack of learning assessment will come back to bite us. If you figure out quickly that your training isn’t effective, then you can make adjustments before it’s too late. Achilles was a mighty warrior, but in the end he was defeated by his one vulnerability. Don’t let assessment and measurement be yours.

Access = Learning

Learning is about access to information. The more information people have available to them, the more likely they are to learn. Sounds pretty obvious, right? After all, Google has turned into the greatest job support/learning tool ever created because it gives us instant access to the information we need, when we need it. Why should we treat learning in our organizations any differently?

Learning resources should be freely available to the people who need them without forcing them to jump through hoops. The learning landscape is moving beyond the concept of a course. Times are changing: people are growing up with Google, Digg, Facebook, RSS feeds, and Twitter. The way we provide information and training must match the way people consume it.

Remember the newspaper? It got delivered to your door every morning and you sat down to read it over breakfast. That’s how you learned what was going on in the world. How many people under the age of 40 still do that? How many ever did that? The idea that courses are the pinnacle of eLearning is as archaic as the idea of getting news only from the morning paper. Newspapers are scrambling to change their business model, and so should course developers.

Why not just let learners view the resources they need? Web server statistics will give us a lot of information about what resources people are using. We can build simple tracking or feedback mechanisms without a lot of overhead.
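As a sketch of how lightweight that tracking can be: a few lines of Python can tally resource views straight from a standard web server access log. (The log lines and the “/resources/” path prefix below are hypothetical, purely for illustration.)

```python
# Count which learning resources people actually view, from an
# access log in Common Log Format. No LMS or SCORM wrapper needed.
import re
from collections import Counter

# Pull the requested path out of the quoted request field.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP')

def top_resources(log_lines, prefix="/resources/", n=5):
    """Return the n most-requested paths under `prefix`."""
    hits = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match and match.group(1).startswith(prefix):
            hits[match.group(1)] += 1
    return hits.most_common(n)

# Hypothetical sample log lines:
sample = [
    '10.0.0.1 - - [01/May/2012:09:00:00] "GET /resources/scorm-tips.pdf HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/May/2012:09:01:00] "GET /resources/scorm-tips.pdf HTTP/1.1" 200 512',
    '10.0.0.3 - - [01/May/2012:09:02:00] "GET /resources/onboarding.html HTTP/1.1" 200 1024',
    '10.0.0.3 - - [01/May/2012:09:03:00] "GET /index.html HTTP/1.1" 200 1024',
]
print(top_resources(sample))
```

That kind of report won’t tell you whether anyone learned, but it answers the more useful question of which resources people reach for, with almost no overhead.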

If we want people to learn we need to be less concerned about how many pages they view or which quizzes they pass, and more concerned about providing the right information in the right way. Access to information does equate to learning. It’s often said that 80% of learning is informal. We should focus at least 80% of our effort in that direction.

Career Development Training – Too Little or Too Much?

Today while checking Google Reader I saw two articles about career development training that caught my eye.

The first article reports the results of a survey sponsored by SkillSoft that finds American companies need to offer more career development training in order to boost job satisfaction. As the article states:

“… a SkillSoft-commissioned survey indicated that eight out of ten employees would have higher job satisfaction levels if they received more on-the-job training.”

Career development is important in organizations not just because it helps with job satisfaction, but also because it enables people to do their jobs and be more productive. I think most training professionals understand that. The key is that the training has to be relevant to the job and to the employee. I think this particular survey needs to be taken with a large grain of salt considering SkillSoft sponsored it. The article does state that SkillSoft is a provider of online courses, but it reads like a SkillSoft press release.

Regardless of the source or credibility of the survey, providing career development opportunities for employees is important and should be taken seriously.

On the opposite end of the spectrum is this article about excessive on-the-job training in Korea. The article tells the story of workers who arrive at work early to complete mandatory online training so they don’t fall behind in meeting training requirements – up to 12 courses a month in one case. The amount of required training has become a stress point for employees. I hope someone is looking at the return on investment for such stringent training requirements.

Ideally, companies provide something in the middle – the right amount of training for a given job. The training offered to employees should be based on organizational and individual goals. Some level of mandatory training is often required, like sexual harassment or ethics training, but real employee development courses are needed too, especially in technical fields. If the only training offered is the “because our corporate lawyer told us we had to” type, then you can probably do more for your employees.

So, take a look at what you offer to employees, or if you’re an employee, look at what’s offered to you. Does it meet organizational goals? Does it meet personal goals? Does it help you keep up with new technologies? Does it help you be more productive? Hopefully the answer to all these questions is “Yes”.