Thursday, March 15, 2007

Why Training Is Useless

For a lot of my professional life, I got paid to do training. It usually went very well, in the sense that I got high ratings (evaluations), and people not only paid their bills but invited me back to do it again and again.

I now believe that the overwhelming majority of all business training, by me and by everyone else, is a complete waste of money and time, because only a microscopic fraction of any training is ever actually put into practice and yields the hoped-for benefits.

The main reason is that companies keep trying to bring about changes in behaviour by training their people in new things, and then sending them back to their operating groups subject to the same measures and management approaches as before. Not surprisingly, little, if any, of the training ever gets implemented.

What companies don't seem to understand is that training is a wonderful LAST step in bringing about changed behaviour, but a pathetically useless first step.

Take, for example, the many speeches and seminars I have been invited to give since 2000. Companies have paid me loads of money to come in and be (a) inspirational and (b) informative about the importance of, and mechanisms for, growing relationships with existing clients. However, in vain do I point out that in order to have the time to nurture these relationships, it follows that you need to be more selective about which new client proposal opportunities you pursue -- there are only so many hours in a day.

This doesn't always go down well. In fact, I got fired by one firm after the training seminar, for suggesting that it was OK to decline to pursue something. "In this firm", the National Managing Partner said, "we pursue all new business opportunities."

That's fine by me. He may be right and I may be dead wrong. But why did he spend all that time and money encouraging his people to do something other than what he believed in, and what they all knew he was going to evaluate them on?

Another example of wasted money is the calls I get to put on training programs to help people become better managers. I put my callers through a standard set of questions: Did you choose your managers because they were the kind of people who could get their fulfillment and satisfaction out of helping other people shine, rather than having the ego need to shine themselves? (No!) Did you select them because they had a prior history of being able to give a critique to someone in such a way that the other person says, "Wow, that was really helpful; I'm glad you helped me see all that"? (No!) Do you reward these people for how well their group has done, or do you reward them for their own personal accomplishments in generating business and serving clients? (Their personal numbers!)

So, let's summarise, I say. You've chosen people who don't want to do the job, who haven't demonstrated any prior aptitude for the job, and you are rewarding them for things other than doing the job? Thanks, but I'll pass on the wonderful privilege of training them!

The truth is that most firms go about training entirely the wrong way. They decide what they wish their people were good at, allocate a budget to a training director and ask that training director to come up with a good program. Note, of course, that a training director (or Chief Learning Officer, if you prefer) is the LEAST influential person in the firm in the sense of being positioned to bring about any real changes in how things happen in daily execution. Pathetic and useless. A waste of everybody's time.

The correct process would be to sit top management down, ask "What are people not doing that we want them to be doing?" and then figure out a complete sequence of actions to address that question: How do we actually get people to change their behaviour? What measurements need to change? What behaviours by top management need to change to convince people that the new behaviours are really required, not just encouraged? What has to happen before the training sessions to bring about the change? What has to be in place the very day they finish?

Only if done this way will firms and companies get a return on their training dollar, pound, euro or yen.

A good test is as follows. If the training were entirely optional and elective, and was only available in a remote village accessible only by mule, but people still came to the training because they were saying to themselves "I have got to learn this -- it's going to be critical for my future", then, and ONLY then, will you know you have timed your training well. Anything less than that, and you are putting on the training too soon.

13 comments:

Anonymous said...

Amen.

When I hear the word "training" in the corporate environment, I sometimes think of doggie treats and a rolled-up newspaper.

Your questions that management should ask before initiating training are excellent.

But does management itself exhibit or really support these new behaviors?

People can sniff that lack of alignment out in the first five minutes.

Anonymous said...

Good question.

I think that the training is offered and funded so that management can be seen to be doing something.

They really don't want to address the changes I suggest need to be examined.

Nor, and this is the key point, do they want to admit that the "fish rots from the head" -- it is they who created the current response from the organisation, and it is they who need to change first in order to elicit a different response.

That's a heck of a blow to the ego to absorb.

Accepting responsibility for what's going on?

Wow!

That's heavy, man.

It's a lot easier (and self-protective) to say "It ain't me, it's a training problem."

I always say that it is management's job to make people WANT to learn things, by managing the "WHY": why this is important, why it is exciting and fulfilling, why people should sacrifice their time and attention to get involved.

If you can be convincing on the WHY, the training itself is trivially easy: people will find the books, the videos, the online materials, the college courses.

In fact, when I do still do training-like sessions, that's all I focus on -- I try to get people excited about the topic, so they will leave actively seeking out the new learning.

Anonymous said...

I agree that most corporate training is a waste of time and money.

Businesses often use training as a surrogate for the hard work of true skill development.

Richard's research into corporate profits, published in Practice What You Preach (and generously available on this website, p. 14 of Richard's Presentation Handouts, Strategy), demonstrates that improving employee perceptions of several other factors -- fair compensation; enthusiasm, commitment and respect; employee satisfaction; etc. -- is a more effective driver of profits than training.

[Aside to faithful blog readers: Take the time to review ALL of the presentation handouts. They are terrific!]

Certainly it is preferable to employ training as a part of a much larger, multi-faceted improvement process.

Obtaining buy in from the top managers and agreeing on public implementation of the initiative (and corresponding culture change) is the best practice for consultants looking to effectuate change and demonstrate value for their services.

I commend Richard and others for eschewing the easy money to be made in training, instead committing to the more difficult process of creating lasting change and adding true value to the client's bottom line.

I work as a facilitator within General Motors' Standards for Excellence process, a voluntary, continuous improvement initiative for GM dealerships designed to help dealers increase sales, exceed customer expectations, and increase profitability.

The process has been very successful, and it includes some training elements for the dealership employees.

Based on my experience, attendees can benefit from training sessions (including those unwilling to donate a kidney to attend) even in the absence of management buy in or a significant cultural change.

Some training, such as Humax's Reciprocity Ring (http://www.humaxnetworks.com/), provides experiential lessons that have reached even the most reluctant attendee.

Training that shares best industry practices with those on the front line also is particularly beneficial for many.

Even training on basic skills, such as becoming a better manager or active listening, can help if done well.

The issue is not whether training is the best solution in most situations (it is not), but rather whether it should be attempted without prior management buy in and commitment to change.

I tend to be optimistic about the possibility of reaching/ helping those who are ready for the message.

The training sessions also have provided a baseline of knowledge/ self-awareness that provides a context for future coaching/ feedback.

Anonymous said...

I agree with the points expressed in the post and the comments.

I think that the much of what is wrong with most corporate training programs is succinctly illustrated in Harry Potter and the Trainer of Dire.

Anonymous said...

Having spent over a decade in the training/ learning business, I agree with Richard's comments.

Training is too often used as a (personally) inexpensive way to look like you're doing something if you're a manager.

As typically done, it requires little time and little personal change.

Training can help people learn how to do new things.

But that's only one of the many (and usually inappropriate) ways firms use training.

On a good day, they're trying to solve a performance problem.

We've found that performance problems (or behaviour change) have several potential roots (and I'm sure I've missed some):

-- People don't WANT to change

-- People don't know WHAT they should change

-- People don't know WHY they should change

-- People aren't incented to change or aren't MANAGED in such a way that they are motivated to change (as commented on by John and Richard above)

-- People CAN'T change (they just flat-out don't have the capability even on their best day)

Of course, the other big problem with traditional training is that it's divorced from "real work".

We give people tools and techniques days, weeks, months, even years before they'll need them and then hope they will somehow recall them (and perform them!) flawlessly when needed.

Wishful thinking at best.

While not a panacea, active coaching (by managers or competent outsiders) helps people really learn because the new information is immediately applied to a situation of great interest to the learner.

And nothing beats the feedback given at that moment by that competent coach for a learning experience.

Anonymous said...

Richard, if you're doing the training, where can I find a mule?

Anonymous said...

How many professional trainers does it take to change a company?

One, but they have to really want to change.

Anonymous said...

I made the majority of my income as a commercial trainer for almost 10 years, teaching software testing courses.

In parallel, I continued my work as a software development consultant and as an attorney who specialises in computer-related commercial law.

The training brought in far more income and more client appreciation than anything else that I did, and probably provided less long term benefit than anything else I did.

I'm going to share some of the finer details of my experience to give you a context for the more general suggestions that I'll lay out near the end of this note.

TRAINING IN SOFTWARE TESTING

Software testing is an interesting area.

Depending on how you count, 20% to 60% of the software development budget is spent on testing, and many software development groups have as many testers as programmers.

Despite the enormous cost, university education in software testing is weak.

It is getting slightly better now, but even the best schools (for testing) only scratch the surface.

Unlike programming, most testing theory and most testing skills will have to be learned on the job without university support.

The result for testing is that it is in a deep rut.

Theoretical development in the field has stagnated at about the level that an individual working in an applied position in a company can achieve over a few years of thoughtful work.

We got that far by 1983.

I know a bit about the state of the field -- In 1983, I headed the Testing Technology Team at WordStar, then one of the largest software companies in the world (bigger than Microsoft).

Figuring out better ways to train staff, evaluating new tools and processes, and working with our outsourced-service providers (external test labs, etc.) were in my scope.

Dissatisfied with the inefficiencies and rigid QA thinking common at the time, I started developing my own testing courses and writing my own books.

My first book, Testing Computer Software (1988), became the best-selling book in the field in its second (1993) edition.

Along with testing and managing test groups, over the years, I did/ managed every technical aspect of product development including programming, user interface design/ development (my first doctorate was in psychology), and technical writing (my staffs and I won awards for our writing).

Having established a few credentials, let me come back to the testing rut and the training problem.

Most commercial training in testing -- and the syllabi/ standards for certification of software testers -- are stuck at the 1983 level.

Let me explain the impact of this in practical terms.

When I was learning software engineering (in the 1970s), a large computer program was 10,000 lines of code.

Business code was often written in COBOL, a language designed to be readable even to business managers.

Most testing theory and test techniques were developed in that era.

Many assume that the tester has the time and ability to read all of the relevant parts of the program.

These days, even your cell phone probably has over a million lines of code (several have over 5 million).

It's great to lovingly handcraft every test, but in the face of programs this large, the tester gets (proportionally) so little done that the old way of working becomes irrelevant. Even if the technological risks were the same now as before (different risks call for different types of tests), the tester who does things the old way simply can't do enough in the time available to make a difference to the quality of this large a program.

We need new models that improve our productivity by several orders of magnitude.

Sadly, the most popular test automation tools perpetuate the old styles of design and test and improve productivity only marginally.

People spend big bucks for the small improvements (that are often not realized) in productivity and project control.

To break out of the rut, we need to think in new ways, to use the technology in new ways.

For that, we need advances in theory and practice that go beyond what testers can easily learn and reinvent on their own (or with a few friends) on the job.

Traditional commercial training provides no (0.00) help with this.

People need time to play with new ideas, challenge them, try them out, learn how to apply them, and for some of the brightest, learn how to go beyond them to the next ideas on the new path.

Universities can be excellent for this type of learning.

In 1999, I decided to break out of commercial consulting and training to see if I could develop university-quality instruction that could be exported back to the workplace.

Florida Institute of Technology was enthusiastic about this vision and hired me as Professor of Software Engineering.

Florida Tech let me chair the Curriculum Committee in its Computer Science program.

My committee designed a B.Sc. program in software engineering that became one of the first software engineering degree programs to be accredited in the United States.

We very carefully researched the idea of offering a software testing degree as well and abandoned it, largely because it carried too much risk of overspecialisation for our students (no matter how broad the courses the students actually took, too many hiring managers and faculty evaluating graduate student applications told us they would perceive such a degree-holder as highly specialised, which would limit their opportunities to stretch into new types of positions).

So, as I suspected, the task was to develop a curriculum that could export back to the workplace, rather than a university degree program in testing.

I had to accept a serious pay cut (commercial training is VERY lucrative) but 6 years later, I believe it was worth it.

I think I have a model that might work -- and not just for testing.

WHERE WE ARE IN TESTING EDUCATION

At http://www.testingeducation.org/BBST, I publish a collection of instructional support materials for software testing.

These include video lectures, slides, worked examples, multiple choice review questions, study guide questions for exams, suggested classroom activities and so on.

These are a foundation for my introductory course in software testing and for similar courses taught by faculty at several other universities.

Several test managers also use them.

The materials are free.

Anyone can use them.

I think there will be a place for commercial materials like these, but I think businesses supplying them will be viable only after a broader adoption of the concept of this style of training.

Until then (for several years) my goal will be to foster adoption and improve my materials to facilitate adoption, rather than to achieve licensing revenue.

At Florida Tech, here's how we use the course materials.

(a) Students watch the lecture BEFORE coming to class. (I often give a quiz at the start of class to check what they learned from the video.)

(b) In class, we do coached activities, usually in teams, or we have instructor-facilitated discussions.

(c) I also assign homework -- typically moderately complex assignments that students can do in teams. We run the course using an open source (free) course management system (http://www.moodle.org) that lets us provide various types of discussion forums for the students so they can work together online.

This is called a "hybrid" course.

It relies heavily on web access, but the coached meetings and exams are live, face-to-face events.

My impression is that my students are learning more in this type of course.

They are also working harder.

EXPORTING THIS TO THE WORKPLACE

Test managers at a few other companies are using these materials with their own staff.

Here's an example of how it can be done:

(a) The test group agrees to meet at lunch time every Tuesday.

(b) Over the weekend, or Monday night, the staff watch a video segment, typically about 20 minutes of video.

(c) On Tuesday, they start with a discussion of the video (the technique or theory presented, how the ideas fit in their company, etc.).

(d) Then they talk about how they could apply this to the project(s) they are working on right now. Over the next week, if they agree that the ideas are applicable (some are not and are dismissed by the group), they actually try them. The test manager (or other local trainer) works with individual staff over the week, seeing how the attempts to apply the idea are going.

(e) Next Tuesday, they review their progress.

(f) For the following Tuesday, they watch the next video.

This takes time -- a one-semester university course translates to about a year of in-house training, taken a little bit at a time.

But at the end of the year, the staff-students have actually learned a lot and have actually tried to apply what they are learning.

Changes are introduced into the organisation gradually, experimentally, and with supervision.

Some of them are likely to stick.

At the end of the year, students move to the next course.

(I don't have a "next course" online for testing education yet, but some of us are working on it.)

EXPORTING THIS BEYOND TESTING

The testing course illustrates an instructional approach.

Some other faculty and I have been looking at teaching discrete mathematics this way.

Or software requirements analysis.

Or software metrics.

It seems clear to us that the approach would work as well for any of these as for testing.

I taught my first courses in human experimental psychology, not computer science.

When I was in law school, I was a teaching assistant (gave lectures, graded papers) in real property law.

I also taught professional development courses on computer law for engineers at UC Berkeley Extension.

Based on those experiences as an instructor, I believe that this approach would generalise well enough to "soft skills" commercial training.

The training would still have to be in areas in which individuals are allowed to try the new things they are learning.

Teaching someone a new approach -- one that they won't be ALLOWED to use in their workplace -- is good only for encouraging them to find a job somewhere else.

But many courses are about improving what you do within the constraints of what is socially acceptable in your current context.

Within those constraints, I think this instructional style can be effective.

WHAT ARE SOME OF THE ESSENTIAL INGREDIENTS?

I am speculating here.

I am doing some research on the instructional significance of the things that my lab is doing, but most of my conclusions are educated guesses based on what I see, hear and feel as an instructor (a craftsman) rather than as an educational scientist.

(1) The pace of introduction of new ideas is kept slow. This allows time for discussion and attempts to apply every important new idea. If it's not worth discussing, and it's not worth trying to apply, it's not worth teaching.

(2) The teaching is led by the group manager (or someone else in authority). Encouragement is coming from a person who plays a role in rewarding good work (promotions, bonuses, raises, etc.). This is an important way of saying that the company takes this seriously.

(3) The ideas are applied to the actual current work being done by the staff. Rather than distracting from the staff's ongoing work, the training helps them improve the quality or efficiency of what they are doing today. Training pays for itself now, not later. (Well, later too. But not JUST later.)

(4) Work is collaborative. People work in pairs or small teams to apply a new idea or to develop a presentation. The value of this varies from person to person, but it is often the case that one person can help a second break out of a creative logjam.

(5) The training media (videos and written papers) are redundant across modalities (some people learn better from reading, some from watching a video) and are reinforced by discussion.

(6) The rate of introduction of new materials is slow enough to not interfere with the normal pace of work.

(7) The hours per week spent on training are few enough to not interfere with current work.

(8) A long-term training program that delivers genuine value to the staff becomes a retention tool.

-- Cem Kaner, J.D., Ph.D., kaner@kaner.com

Anonymous said...

talk about throwing the baby out with the bath water

why waste our time with this and not rather write an essay about how to deliver or procure effective training that both the employee and the organisation see tangible benefits from?

Anonymous said...

OK, David G, fair challenge.

Watch this space.

But if the spirit of your comment is that one shouldn't criticise without offering a constructive suggestion, shouldn't that apply to readers/ visitors too?

What are your ideas?

Anonymous said...

simple -- like anything that a business manages, training efficiency and effectiveness should be measured, trended and then improved based on those findings.

No manager should ever be given a training budget until they show which business metric that training will impact -- they should set a goal for that impact and review performance against that goal before more training budget is allocated to them

in some parts of an organisation, like operations, training is a necessity and it can be done extremely well or extremely poorly -- the difference is often whether or not training is measured.

That's the starting point.

Then, try different stuff, measure and repeat the stuff that works.

at-will employment has allowed US management to become extremely inept at the skill of human capital development -- at the expense of total productivity and innovative capability of its workforce -- this issue is part of a major problem facing US enterprise

now, your turn --

Anonymous said...

Wonderful, David.

I've continued it with a new blog post, called Saving the Training Baby.

Anonymous said...

You are correct.

Training assumes an outcome, and that is rarely, if ever, a correct assumption.

The impact of training should be able to be measured by its impact on the business.

The purpose of training should be to change behaviour.

Behaviour change starts with those things that elicit and maintain behaviour.

How can those things be impacted within the scope of "training"?