Before you send me an angry email, rest assured: what I want to say is that we are in a revolution. We are a lovely bunch, not at all revolting!
I have to say I have been a little sick and tired recently. I am not getting any younger, and it seems like forever that we have been talking about really making a difference in L&D. Yes, there have been changes, but as someone who works a lot with trainers, facilitators, L&D professionals and subject matter experts, we still seem to be missing the basics:
- Getting close to the organisation and understanding it
- Getting curious
- Using data to drive performance improvements
- Using tech appropriately – driven by needs not the tech itself
- Focussing first on a needs analysis to drive a good evaluation
- Being agile enough to keep up with the demands of the market
If you are working for an organisation that is doing ALL of these things (or even some of them), or would like to do ALL of them, we would love to hear from you. If you would like to discuss how to do this with a group of like-minded professionals, then come and join us in the L&D Revolution group on LinkedIn.
It is not for the following individuals:
- Those happy with the rate of change in L&D (evolution not revolution)
- Those happy just to keep delivering the same courses without much impact
- Those happy with level 1 evaluations and moving no further
- Those more interested in applying the latest tech rather than improving performance
Last week I had the absolute pleasure of recording another podcast with John Tomlinson of Trainer Tools. This time it was a little different, as I was sharing the mic with Kevin M. Yates, my newly discovered partner in crime. Kevin and I met via social media and immediately clicked over our shared interest in revolutionising L&D. A podcast seemed a great way to share our common interests and the thoughts we exchanged in our initial conversation.
The title of this blog is also the title of the podcast and it will be coming out in March.
Let’s start at the beginning of the conversation. I began my training career back in the late 80s as an IT trainer for IBM. My title was ‘VM Instructor’ and my job was to teach customers how to use the front end of the VM software. If I delivered the training effectively, then the participants should have been able to use the system at work to do their jobs. Our measures, though, were purely about how the training was received:
- Was the instructor knowledgeable? Approachable? Friendly?
- Were the notes useful?
- How was the lunch?
- Was the classroom comfortable?
To this day, I don’t know if anyone actually found out from the customers whether the training achieved what it had set out to do. The world of L&D is quite different now. For a start, more and more training functions have changed their names to ‘Learning and Development’. This was in response to the recognition that learning happens in places other than the training room. Learning had a broader remit, and with this came a change in the traditional trainer role.
Since then, L&D has been changing and morphing. Towards Maturity have outlined their thoughts on the ‘New Learning Organisation’ and I commented on this in my blog ‘We don’t do train-the-trainer’. In that blog I spoke about the new ‘Learning Leader’ required to support this new Learning Organisation. Below I have amended the original diagram to include explicit references to performance and to using data to help drive intelligent decisions.
This blog is mostly concerned with the first and sixth of these qualities:
- Clarity of purpose – performance focussed
- Helps people make intelligent decisions – using data
The ‘Clarity of purpose’ is closely tied in with the identity crisis Kevin and I spoke about in the podcast:
Clarity of purpose – performance focussed … means that:
- L&D is business-focused but also learner-centred, ensuring a focus on the aspects of individual performance that will improve how the organisation works.
- L&D is strategically focussed to deliver what the organisation needs: gathering data to understand where the organisation is, but also speaking to the right stakeholders to find out the direction of travel required.
- L&D is curious and analytical. If we gather data to investigate what is true and what is happening, we will not have to be as brave in asserting our identity or suggesting different solutions, because we will have the evidence to back it up. Some people will still expect L&D just to dispense training… (groan)
- L&D is able to engage stakeholders in order to leverage essential resources and achieve the results required. This requires building up the relationships that make the biggest difference and saying no to those stakeholders who neither have impact nor support L&D.
Helps people make intelligent decisions – using data … means that:
- L&D makes decisions informed by the organisation’s purpose, and it can only do that if it has done a lot of the things outlined in the paragraph above.
- L&D develops others’ capability in decision making by providing the appropriate tools and skills in data collection and analysis. This also means that L&D, as part of its new identity, needs to learn how to do this too!
- L&D helps people use data to track performance, and also the impact of learning on people’s performance. This drives the organisation towards efforts that have impact and stops it from putting energy into things which give no return.
During the podcast Kevin threw in 3 questions which are crucial for L&D to ask in order to become more data driven:
- What is happening in the organisation?
- What is the organisation’s goal?
- What performance requirements are needed to achieve your goal?
Once we have asked these questions we need to look at how else we can develop our skills in line with the new identity. Who is interested in discussing this further? We have a LinkedIn group called the L&D Revolution – take part in the conversation and join the group!
There are some lessons in life that I seem to learn and re-learn, no matter how many times I go through the cycle. I have always been a bit of a perfectionist, and this has sometimes led to stress, unnecessary work and frustration. I have rationalised that my striving to give 120% (when people don’t even notice if I drop to 80%) is unnecessary and that I need to “give myself a break”, but somehow this keeps popping up. It is no doubt in my DNA.
It has been such a lesson, and it is quite a funny story. My eldest son, Alex, has been on the other side of the world for nearly two years and was coming home for Christmas, bringing along his girlfriend. To say I have been excited would be an understatement (btw if my younger son Joe is reading this – we were looking forward to seeing you too!) I had been cleaning and tidying his room in readiness, buying bits and bobs, making the bed cosy (the cold will be a shock!). Yesterday I walked in and thought “Ooh, that lovely air freshener will smell nice in here” … a final finishing touch.
So I sprayed and stood back, anticipating how welcoming the room would look. To my horror, what appeared on the wall was a huge splatter of oily residue from the air freshener. No matter how hard I washed and scrubbed, it persisted. It was not in a position where it could go unnoticed or be hidden by furniture, so I was faced with having to paint the whole wall. Why oh why could I not have just left it be?
In my professional life, not recognising when good is good enough has also happened, and I wonder who else can relate to these:
- “Tweaking” and delaying a report until all the i’s are dotted and t’s crossed – rather than getting something out that will promote discussion and others getting involved
- Working on a design for much longer than anticipated, to get it “perfect” rather than relaxing before the delivery
- Adding more and more thoughts to a blog, when really there was not that much to say (gilding the lily?)
So with that last thought I will leave it here and encourage you all to give yourself a break and recognise when good is good enough. It is not a perfect world and sometimes you just need to be a little less perfect!
Yes, this was an actual question during a conversation about how in L&D we need to get back to basics.
This is not the first conversation I have had recently on this topic. I have had the absolute pleasure of making new connections recently (Kevin Yates and Amrit Sandhar, yes it’s you two!) and I believe it is not a coincidence. I believe there are lots of people thinking the same way. The topic coming up time after time is how in L&D we seem to have lost our way. Instead of focussing on the basics (we will look at what those are later), we seem distracted by the new and shiny. I am not averse to the new or shiny at all. I am a self-confessed geek, but the new and shiny has to fit the problem, not the other way around.
So today I was having a review with Marie Duncan, Head of L&D for Kibble Education. They are a fabulous organisation over 150 years old, dedicated to helping children at risk. At the beginning of November I ran the Learning Loop for a group of 12 trainers and subject matter experts. We were reviewing the impact of the programme and what it has done for them.
We caught up on what has been imbedded and on future work. We spoke about conferences and their value, but also how they can lead to a feeling of overwhelm. What do we spend the hard-fought budget on, and are we really getting value out of what we have? These are key questions on many L&D managers’ lips.
Then we took a similar path to previous conversations. Is L&D losing its way? What is it about? My opinion on what L&D should be about is:
- Understanding the organisation and its purpose
- Aligning L&D activity with the main goals of the organisation
- Conducting a needs analysis when appropriate to inform us of what really needs to be done rather than what presents itself
- Designing something appropriate using the right tools (not just applying the training ‘sticking plaster’ or the new shiny glittery thing)
- Delivering something that meets the objectives and improves something in the organisation (not just a warm glow from the glitter)
- Finding out whether what we did had the impact we said it would and working with the organisation to prove it (with business metrics)
- Enabling line managers to help imbed the learning
Much of what we hear about – new advances in AI, VR, micro-learning, mobile learning, social learning, digitalisation – is all fabulous stuff, but how many in L&D are measuring the impact of what they do? How many have their finger on the pulse of the organisation, to know what is really going on? Are we swayed too much by what the big kids have in the playground?
With all these new advances, can we stay focussed? Is it all a distraction? Not-for-profits, voluntary and public sector organisations are strapped for cash and quite often need to know, in Marie’s words, “how to imbed what we are doing well and doing it better”.
I love the Towards Maturity reports, giving us all a good idea of what we should be doing and benchmarking against others. Looking outwards can be helpful but so can looking inwards. Each organisation is unique and really understanding its purpose and how that can be fulfilled is crucial. Not all navel gazing is counterproductive! It is about balance.
So where did the tiara come in? When we were talking about all the latest fads and new and shiny things, one of our key concerns was how appropriate they are to the problem you are solving. Just because you have one, “would you wear a tiara to the gym?”
I would love to hear your thoughts about “getting back to basics” and ask two questions:
- Are you getting the L&D basics right?
- If not, what is stopping you?
By the way if this topic is of interest to you, Kevin and I will be recording a podcast in the new year on Trainer Tools going into more depth on this topic.
This title has been taken from the Towards Maturity report published in August of this year. I was immediately drawn to the title and anyone who knows what a maths geek I am will understand why.
I always loved numbers and even just playing with them, multiplying numbers by themselves repeatedly just for fun! Yes I know it’s not normal and I also appreciate that not everyone else has the same love of numbers, in fact quite the opposite. I have several friends who will admit that numbers are almost a phobia.
Reading this report, it is quite evident that we in L&D are not great at collecting and using data to its best advantage. Some of the figures that struck me were:
- Of those aspiring to use data to effect change, only 2/5 (40%) were able to say that it helped them demonstrate business impact
- Only 3/5 (60%) were successful in using data to help solve business problems
Bear in mind that this was from a sample size of 700+, and the two figures above came from the people who were really trying to use data effectively. In reality there will also be a number of people not even trying, so the 40% and 60% are likely to be very optimistic figures.
The most likely reasons cited were:
- Data and its analysis are complicated
- Lack of L&D skills in this area
Let me look at the second point first. Why are we not addressing this lack of skills? Is it this phobia of numbers? A fear of what to do once you start collecting? An expectation that things have to change once you start collecting data effectively? Maybe it’s a combination of all three? Or maybe a misconception around what it means to collect and analyse data?
For me it is quite simple (and this may address the first point). In L&D we need to get nosey. When someone asks us to deliver a leadership programme, we need to ask why, and how they will know it has been successful. If the first person you ask doesn’t know, then ask someone else. Is it a real need or a perceived need?
The perceived need may be something like employee engagement scores being low. What we really need to determine is what effect that is having on the performance of the business:
- High recruitment costs?
- Lack of agility in the marketplace, because a high attrition rate means staff are not as familiar with products as they should be?
- Poor customer service because the tools they use have had little investment?
So when you look at these examples, you can start to see it really is not about data analysis, but curiosity, perseverance and a healthy dose of skepticism. If you can pinpoint what the problem is and it is a real business need, then what you need to measure will be very obvious:
- Reduction in recruitment costs
- Reduction in time to market with new products
- Range of new products and uptake
- Attrition rate
- Customer satisfaction scores
These are not L&D statistics; they are business measures, and having highlighted the purpose of your L&D focus, the business will also want to measure them. That is not to say that you will never need to do some data collection and analysis, but I think we are over-complicating it and not getting to the nub of the problem.
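To show how simple these business measures can be, here is a minimal sketch of one of them: attrition rate as leavers divided by average headcount, a common convention. All the figures below are invented for illustration; real numbers would come from your HR system.

```python
# Invented figures for illustration only; not taken from any real organisation.
leavers = 18           # people who left during the year
headcount_start = 130  # headcount at the start of the period
headcount_end = 110    # headcount at the end of the period

# One common convention: attrition rate = leavers / average headcount
average_headcount = (headcount_start + headcount_end) / 2
attrition_rate = leavers / average_headcount

print(f"Attrition rate: {attrition_rate:.0%}")  # prints "Attrition rate: 15%"
```

Tracking this figure before and after an intervention is exactly the kind of business measure stakeholders will already care about.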
In my book (currently in first draft), “How not to Waste your Money on Training”, I will show people simply how to “find the story in the data”. Using a simple example of a scoring grid, I will show how you can, using a spreadsheet and playing with different graph types, discover little parts of the truth about what is going on. It takes a click and a small amount of curiosity. If you want to try it out, then just use this example, using Excel to play around with different types of charts:
- Bar chart
- Pie chart
- Stacked bar
- Spider diagram
For each format ask yourself “what do I see now?”. Using this approach of curiosity and play I discovered:
- A bar chart gives a good comparison of one person against another for each part of their role
- A spider diagram shows how well-rounded each team member is in their own right. Some are not rounded at all! Tracy seems the most well rounded.
- A stacked column shows the team’s strengths and weaknesses:
- Who is the strongest in sales skills?
- Who is the weakest in product knowledge and working independently (why might this be? Manager poor at delegating?)
So I would urge you, L&D, before spending a lot of money on data analytics experts, get nosey and do some detective work yourselves. Keep it simple and dig into what is going on beneath the surface. Don’t just take one person’s viewpoint or use just one method; mix it up and start finding the story in the data!
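The detective work above can also be sketched in a few lines of Python (a spreadsheet achieves the same with charts). The scoring grid below is invented for illustration: “well-rounded” is read here as a low spread of scores across skills, which is what a spider diagram shows visually, while team strengths and weaknesses come from per-skill totals, which is what a stacked column shows.

```python
import statistics

# Hypothetical scoring grid (ratings out of 5); names and numbers are
# invented for illustration, not taken from the book's example.
scores = {
    "Tracy": {"Sales skills": 4, "Product knowledge": 4, "Working independently": 4},
    "Sam":   {"Sales skills": 5, "Product knowledge": 2, "Working independently": 3},
    "Priya": {"Sales skills": 3, "Product knowledge": 5, "Working independently": 2},
}

# "Well-rounded" = low spread of scores across skills (the spider-diagram view)
spread = {name: statistics.pstdev(s.values()) for name, s in scores.items()}
most_rounded = min(spread, key=spread.get)

# Team strengths and weaknesses = totals per skill (the stacked-column view)
skill_totals = {}
for person_scores in scores.values():
    for skill, value in person_scores.items():
        skill_totals[skill] = skill_totals.get(skill, 0) + value
weakest_skill = min(skill_totals, key=skill_totals.get)

print(f"Most well-rounded: {most_rounded}; team's weakest area: {weakest_skill}")
```

A few minutes of this kind of play, asking “what do I see now?” after each view, is often all the data analysis a needs conversation requires.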
My conclusions from the report and my own anecdotal research suggest that:
- L&D does not have the skills required for data analysis (I had better get that book finished!)
- It is not as complicated as you think
- It is about asking the right questions and finding the story in the data
- We don’t always need data analytics experts to do this!
This blog is a follow-up to the webinar of the same name, delivered on the 25th of September 2018.
In January, as soon as tickets became available for this event, there was a flurry of bookings, and they have poured in throughout the year. This piqued my interest as to why this was such a popular event.
To find out more, we sent out a survey to attendees and, in parallel, a “Deep Dive” survey which took a wider look at the challenges in L&D. These were some of the findings:
These results prompted me to phone a number of people to dig deeper and here were some of the issues that were lurking behind the statistics:
In my head I am thinking that what is really going on is this: people find evaluation and demonstrating value difficult because they do not do the analysis part well. Some people believe that stakeholder conversations are all they need. I believe we should have a more analytical approach in L&D, getting closer to the business and finding out what is really going on. This is what I would recommend: The Learning Loop Approach.
As part of the consultancy approach, I referred back to another showcase event “The Consultancy Approach” where we had some great conversations about what that might look like.
Finally we discussed what it would look like to have a “learning ecosystem” (phrase stolen from Pedro Valido) and hopefully that discussion is still going on via a message board we created.
If this topic has interested you then please get in touch and/or complete our “Deep Dive” survey to let us know what keeps you awake at night. You can also register your interest for our first Deep Dive event on December 4th in London.
A little provocative, I know, but for years I have been hearing that face-to-face learning is dead, and the simple fact is that it’s doing well. It’s alive and kicking!
This is borne out by the Towards Maturity report “L&D where are we now 2017–2018”, where 4 out of 5 soft skills courses are still delivered face to face. In fact, what surprised me was that among the statistics around technology in learning, no figure exceeded 35% for technology having visible benefits. So why the provocative title? Am I biased towards face to face?
The simple answer is no, BUT I recognise the huge value it can bring and the buzz it can create. We are, after all, human, and there is really nothing that can recreate the excitement and buzz the right face-to-face intervention can create. What I hate to see is L&D’s rush to the latest fad, thinking that this will:
- Save money
- Improve performance
- Get the results we need
In reality the latest fad is just that, and what I favour is a more pragmatic approach rather than a huge investment in some technology that may not deliver on the hype and promises. That is not to say that we should not use the latest and shiniest. My concerns are often around investment versus return, especially for the voluntary, public and not-for-profit sectors. Can they really afford to invest in some of the latest when there is no guarantee it will deliver?
So, what am I suggesting? Here are a few things:
- Analyse the needs carefully so you get an accurate picture of what is required – this will put it into the knowledge, skills and attitudinal learning categories, as well as identifying the level of learning (see Bloom’s taxonomy) – will it need to be in the moment, hard-wired or semi-permanent?
- Look at how to build up the learning, not in a one-time only event (unless that is ok) but maybe overlapping and layering of the learning, interweaving skills and knowledge
- Choose from the 100 ways to learn and create a blend that will make the learning interesting and engaging. People can then choose one or all of the activities. The activities could, if appropriate, build up the learning. See the example below:
Learning about the 5 secrets of Accelerated Learning on the Learning Loop:
- Participants read an article in the pre-work a few weeks before in the LNA email, finding out needs and objectives of participants
- Participants get to watch a short video summarising the 5 secrets in the last-minute email
- In the workshop, there are posters and resources on the 5 secrets around the room
- During the game – there will always be a question to “Describe the 5 secrets of Accelerated Learning” – which is a higher level of learning than “List the 5 Secrets of Accelerated Learning”
- One group does a teach back to help everyone remember what the 5 Secrets are and how to apply them
Back to the title then… AI, eLearning and VR are not dead, and neither is face-to-face. They are simply some of the tools you keep in your toolbox so that you can choose the most appropriate method. Just as PowerPoint has been overdone these days, let’s not overdo the new ones. If we mix up the tools we use, we create a variety for the learners that will keep them engaged. What are your thoughts?
Should we in L&D be focussed on improving job performance, so that all learning is focussed on that, or should we be looking for people to be inspired to learn more and become more self-directed?
I spent some time in July writing my book, “How not to waste your money on training” and during the week I had a few philosophical moments. One was about the difference between education, training and learning. I often speak to L&D professionals about the difference between a training needs analysis and a learning needs analysis. How the former always leads to training whereas the latter leads to something broader than just training; it could be learning in many different forms.
In a similar way I was thinking about how my degree in chemical engineering and fuel technology was a good education. It prepared me for the world of work and also began a lifelong desire to learn more. When I moved from engineering to IT training with IBM we were called “instructors” and I worked in the IBM Education Centre in St. John’s Wood. Was what people received when they came to us there, an education? I am not so sure. I would hope the delegates were more prepared for their world of work and that they were inspired to learn more. But how broad was that inspiration? Did they become self-motivated learners keen to go beyond the traditional training course to further develop themselves?
This leads me to the present day; my title changed from instructor to trainer to L&D professional/facilitator. How do I define though whether I am educating, training or helping people to learn?
A few years after I gained my Certificate in Training Practice, I began working with trainers, delivering the CIPD Certificate in Training Practice and then the Certificate in Learning and Development Practice. Through my accelerated learning programmes, I worked further with L&D professionals to help them learn more about an approach that has been taking shape over many years – an approach that helps me focus on organisational needs as well as learner requirements. Programmes that once took 8 months of weekly 4-hour sessions were delivered in 8 one-day sessions when I moved to a well-known learning provider. Now I deliver a 6-week programme, which includes a 2-day workshop, and I cannot possibly ‘cover’ all I used to.
Leading by example and walking-the-talk have been driving forces in our organisation, “How to Accelerate Learning”. Facilitation is practised and runs like an invisible thread through the programmes. Inspiring resources and innovative ways of learning through gamification create a different feel to the programme, leaving many people “inspired” – their words, not ours. Even hardened trainers with years of experience under their belts talk of how different it feels.
This has not always been a deliberate intention but, on occasions, a happy and accidental one. One that we persist with because of the results we achieve and the feedback we get. We put effort into:
- Drilling deeper into needs to see if learning is the appropriate course of action
- Delivering learning via a blend of activities not just training
- Helping people to gain confidence in stepping out of their comfort zone – to experience new ways to help people learn
- Using unusual materials and resources to inspire a different approach
- Following up the learning so it is not just a one-time event
So, I don’t feel like it is training in the traditional sense; not like when I was a VM Instructor. Nor do I feel that it is just learning because of the feedback we get. So are we educating and is that now the remit of L&D?
I have had no issue with the Kirkpatrick model, as it has always made sense to me, but just recently I have been challenged to reconsider my position. Last week, running the Learning Loop, one of the participants said they had started to use Will Thalheimer’s Learning Transfer Evaluation Model, which made me curious about what that model had to offer that was different to Kirkpatrick’s.
This week I read a blog from Work Learning Research saying that levels 3 and 4 of Kirkpatrick are misunderstood by most L&D people, who use learner reactions as if they were valid level 3 and 4 measures. The blog got me a bit ruffled, particularly because the survey results:
1) use a sample size of TWO for vendors
2) show no visible link between the 250 surveyed and whether they are actually even measuring at levels 3 or 4 (if they are not, how could they know the answer to the question?)
This makes any conclusions, drawn from the research, in my opinion tenuous to say the least.
What made me curious enough to look at Will Thalheimer’s model was the suggestion that it had something new to offer. What I did like was the focus on learning transfer, but then I thought: shouldn’t the focus be on performance? Will has a lot to say about, and to criticise in, the Kirkpatrick model, but I am looking for a deeper and better way to do evaluation. What I would like to offer is a deeper explanation of Kirkpatrick’s model, and I hope you will make up your own mind about how you might apply it.
“Begin with the end in mind”, said Stephen Covey. I agree wholeheartedly. If you do a thorough analysis and engage with stakeholders and line managers about the outcomes (in performance) required, then evaluation should be straightforward. But please note: if the analysis part is skipped over or done at a superficial level, THE EVALUATION WILL BE DIFFICULT WHICHEVER MODEL YOU USE.
- Level 1 – learner reactions – still an important part of seeing if you have had sufficient engagement and struck the right chord with the objectives (business-focused and learner-centred, according to the 5 Secrets of Accelerated Learning).
- Level 2 – learning achieved – did they learn what they were supposed to and have they met those objectives? This is important to know for L&D and their line managers
- Level 3 – impact on performance – this is what is then observed on the job and, in my opinion, should not be L&D’s sole remit. If L&D have analysed the needs correctly and determined the correct outcomes, the line managers should be engaged enough to help imbed the learning as well as measure performance improvements.
- Level 4 – impact on the business – if the stakeholders have been engaged and the outcomes are focused on the business, the stakeholders will be interested in measuring the business impact.
Here are some ways in which you could evaluate at the different levels of Kirkpatrick. Let me know what you think – I will be curious to hear your reactions!
In the graphic below I have overlaid the Kirkpatrick model onto Will Thalheimer’s model.
When a stakeholder asks you to do something, can you distinguish between what they want and what they need? Does it make any difference?
I think it does (otherwise “What’s the point of this blog?” you might ask)
Let us illustrate this with an example. Someone says they would like a glass of orange juice, but is it really what they need? Digging deeper and understanding their situation, you discover they need their thirst quenched. Once we understand that this is the real need, it opens up the possible solutions:
- A cup of tea
- A glass of water
- A cool beer
- An apple juice
So in a business context I am sure you can see the parallels. If you drill down into people’s (and the organisation’s) needs, then not only do you bring a solution that is fit for purpose, you also open up the number of solutions available. You also avoid expensive mistakes where, by leaping into solution mode too quickly, you miss the real point of what is going on.
So how do you find out the needs rather than the desires or “wants”?
It really is not that complicated: get curious, ask questions, and don’t assume people actually know the answer, no matter how convincing they are.
Every month our loyal subscribers get a free resource and in the past they have received:
- A stakeholder analysis informational video
- A stakeholder analysis question sheet
Both of which would help greatly in determining the real needs. If you would like to receive resources like this every month then please subscribe. Depending on whether you are an L&D professional (includes consultants) or manager, you will get different and appropriate resources.
Would love to hear your thoughts on this topic!