I’ve been in L&D a long time. Long enough to have seen fads come and go, the “future of learning” declared every couple of years, and endless talk of how we need to be more strategic, more business-aligned, more focused on impact.
But for most of that time, I’ve run into the same frustrating reality. As L&D people, we might say we’re here to improve performance, but we frequently behave as if we’re in the training delivery business.
So, we shouldn’t be too surprised when the wider organisation fails to take us seriously.
If that sounds a bit harsh, believe me, I’m not judging anyone. I’ve been that L&D person. I’ve taught hundreds of people how to write better objectives, design better courses, evaluate learning more effectively.
Rewind the clock a couple of decades and I genuinely believed that the real solution to L&D’s problems lay in more effective course design. If we could crack that code, better business results would inevitably follow.
Then, in the spring of 2011, I went to a Training Zone Live event in London. As part of that day, I was very fortunate to attend a session run by Jim Kirkpatrick (of Kirkpatrick Evaluation Model fame). I wrote about it at the time here.
Jim was a superb presenter/trainer. During the session, he introduced us all to the idea of ‘red pants syndrome’ – which was an entertaining and memorable way of highlighting the fact that in a workplace environment no-one wants to be the one doing something differently from everyone else.
So, for example, if you come back from some training full of good ideas about how to do something better (or differently), but none of your colleagues are interested in following your lead, it probably won’t be long before you step back in line and carry on as before.
Connected to this, he also introduced us to some jaw-dropping research from his colleague Rob Brinkerhoff, which can be summarised as follows:
In a traditional approach to training design, 90% of the time is spent on design and development of the training event and only 10% on pre- and post-training activity. With this approach, typically, the following happens to learners:
- 15% do not try the new skills
- 70% try to implement the learning but fail
- 15% achieve and sustain the new learning
In an alternative approach, where 50% of the time is devoted to design and development and 50% of the time is devoted to post-training follow-up, typically, the following happens to learners:
- 5% do not try the new skills
- 10% try to implement the learning but fail
- 85% achieve and sustain the new learning
Over the years, I’ve shared that data (and other research which supports Brinkerhoff’s findings) with L&D teams. It generally provokes one of two reactions: either dead silence and deep discomfort as the futility of traditional approaches sinks in, or cries of, “We want those results. How do we get them?”
Whatever the reaction, the honest answer to the “How do we do that?” question was, “With great difficulty”. In other words, doing ‘that’ would likely involve massive increases in budgets and resources that very few L&D teams could ever afford.
So, leaving aside some exceptionally lucky folk who work in an environment where that shift from 90/10 to 50/50 has been encouraged and funded, most L&D teams are stuck in the 90/10 paradigm, delivering courses that may be fantastic but simply don’t bring about the desired workplace performance improvements.
After attending Jim’s session, it took me a while to fully process the implications of Brinkerhoff’s data. At the time, I wrote: “This is one of the most compelling pieces of research-based evidence I have seen for a long time. It has made me realise that here at Pacific Blue we should make much greater efforts than we currently do to encourage, you, our clients, to engage in this kind of approach.”
That certainly acknowledged the problem; but it definitely didn’t answer the question of ‘how to do it’. Because, as already noted, there wasn’t a meaningful answer. Several years later, I shifted focus a little, encouraging people to build qualitative evaluation and impact studies into their design process. This, at least, helps tease out where the problems with workplace application lie, and it enables future iterations of courses to go some way toward addressing them.
But I’ve spent the last 15 years or so feeling deeply frustrated by the fact that there is a solution to the long-standing problem of L&D struggling to justify its existence, yet for the vast majority of L&D teams there has been no viable way of implementing it.
Stay tuned. In the next diary entry, I’ll fill you in on the next big shift in my thinking, which happened a little over 12 months ago. It’s what led me to where I am today, and it’s why I’m sharing this journey with you.