There are some lessons in life that I seem to learn and re-learn, no matter how many times I go through the cycle. I have always been a bit of a perfectionist and this has sometimes led to stress, unnecessary work and frustration. I have rationalised that my striving to give 120% (when people don't even notice if I drop to 80%) is unnecessary and I need to "give myself a break", but somehow this keeps popping up. It is no doubt in my DNA.
It has been such a lesson and is quite a funny story. My eldest son, Alex, has been on the other side of the world for nearly two years and was coming home for Christmas, bringing along his girlfriend. To say I have been excited would be an understatement (btw, if my younger son Joe is reading this – we were looking forward to seeing you too!). I had been cleaning and tidying his room in readiness, buying bits and bobs, making the bed cosy (the cold will be a shock!). Yesterday I walked in and thought, "Ooh, that lovely air freshener will smell nice in here"… a final finishing touch.
So I sprayed and stood back, anticipating how welcoming the room was looking. To my horror, what appeared on the wall was a huge splatter of oily residue from the air freshener. No matter how hard I washed and scrubbed, it persisted. It was not in a position where it could go unnoticed or be hidden by furniture, so I was faced with having to paint the whole wall. Why, oh why, could I not have just left it be?
In my professional life, not recognising when good is good enough has also happened and I wonder who else can relate to it?
- “Tweaking” and delaying a report until all the i’s are dotted and t’s crossed – rather than getting something out that will promote discussion and others getting involved
- Working on a design for much longer than anticipated, to get it “perfect” rather than relaxing before the delivery
- Adding more and more thoughts to a blog, when really there was not that much to say (gilding the lily?)
So with that last thought I will leave it here and encourage you all to give yourself a break and recognise when good is good enough. It is not a perfect world and sometimes you just need to be a little less perfect!
Yes, this was an actual question during a conversation about how in L&D we need to get back to basics.
This is not the first conversation I have had recently on this topic. I have had the absolute pleasure of making new connections recently (Kevin Yates and Amrit Sandhar, yes it's you two!) and I believe it is not a coincidence. I believe there are lots of people thinking the same way… The topic coming up time after time is how we in L&D seem to have lost our way. Instead of focussing on the basics (we will look at what those are later), we seem distracted by the new and shiny. I am not averse to the new or shiny at all. I am a self-confessed geek, but the new and shiny has to fit the problem, not the other way around.
So today I was having a review with Marie Duncan, Head of L&D for Kibble Education. They are a fabulous organisation over 150 years old, dedicated to helping children at risk. At the beginning of November I ran the Learning Loop for a group of 12 trainers and subject matter experts. We were reviewing the impact of the programme and what it has done for them.
We caught up on what has been embedded and on future work. We spoke about conferences and their value, but also how they can lead to a feeling of overwhelm. What do we spend the hard-fought budget on, and are we really getting value out of what we have? These are key questions on many L&D managers' lips.
Then we took a similar path to previous conversations. Is L&D losing its way? What is it about? My opinion on what L&D should be about is:
- Understanding the organisation and its purpose
- Aligning L&D activity with the main goals of the organisation
- Conducting a needs analysis when appropriate to inform us of what really needs to be done rather than what presents itself
- Designing something appropriate using the right tools (not just applying the training ‘sticking plaster’ or the new shiny glittery thing)
- Delivering something that meets the objectives and improves something in the organisation (not just a warm glow from the glitter)
- Finding out whether what we did had the impact we said it would and working with the organisation to prove it (with business metrics)
- Enabling line managers to help embed the learning
Much of what we hear about, new advances in AI, VR, micro-learning, mobile learning, social learning and digitalisation, is fabulous stuff, but how many in L&D are measuring the impact of what they do? How many have their finger on the pulse of the organisation, to know what is really going on? Are we swayed too much by what the big kids have in the playground?
Listening to all the new advances, can we stay focussed? Is it all a distraction? Not-for-profits, voluntary and public sector organisations are strapped for cash and quite often need to know, in Marie's words, "how to embed what we are doing well and do it better".
I love the Towards Maturity reports, giving us all a good idea of what we should be doing and benchmarking against others. Looking outwards can be helpful but so can looking inwards. Each organisation is unique and really understanding its purpose and how that can be fulfilled is crucial. Not all navel gazing is counterproductive! It is about balance.
So where did the tiara come in? When we were talking about all the latest fads, new and shiny things, one of our key concerns was how appropriate they are to the problem you are solving and just because you have one, “would you wear a tiara to the gym?”
I would love to hear your thoughts about "getting back to basics" and ask two questions:
- Are you getting the L&D basics right?
- If not, what is stopping you?
My book "How Not To Waste Your Money On Training" addresses these issues and more… on how to get it right when analysing needs and determining solutions.
This title has been taken from the Towards Maturity report published in August of 2018. I was immediately drawn to the title and anyone who knows what a maths geek I am will understand why.
I have always loved numbers and even just playing with them, multiplying numbers by themselves repeatedly just for fun! Yes, I know it's not normal, and I also appreciate that not everyone has the same love of numbers; in fact, quite the opposite. I have several friends who will admit that numbers are almost a phobia.
Reading this report, it is quite evident that we in L&D are not great at collecting and using data to its best advantage. Some of the figures that struck me were:
- Of those aspiring to use data to effect change, only 2/5 (40%) were able to say that it helped them demonstrate business impact
- Only 3/5 (60%) were successful in using data to help solve business problems
Bear in mind that this was from a sample size of 700+, and the two figures above cover only those people who were really trying to use data effectively. This means that in reality there will also be a number of people not even trying, so the 40% and 60% are likely to be very optimistic figures.
The most likely reasons cited were:
- Data and its analysis are complicated
- Lack of L&D skills in this area
Let me take the second point first. Why are we not addressing this lack of skills? Is it the phobia of numbers? A fear of what to do once you start collecting? An expectation that things have to change once you start collecting data effectively? Maybe it's a combination of all three? Or maybe a misconception around what it means to collect and analyse data?
For me it is quite simple (and this may address the first point). In L&D we need to get nosey. When someone asks us to deliver a leadership programme, we need to ask why, and how they will know it has been successful. If the person who asked doesn't know, then ask someone else. Is it a real need or a perceived need?
The perceived need may be something like employee engagement scores being low. What we really need to determine is what effect that is having on the performance of the business:
- High recruitment costs?
- Lack of agility in the marketplace, because a high attrition rate leaves staff less familiar with the products than they should be?
- Poor customer service because the tools they use have had little investment?
So when you look at these examples, you can start to see it really is not about data analysis, but about curiosity, perseverance and a healthy dose of scepticism. If you can pinpoint the problem, and it is a real business need, then what you need to measure will be very obvious:
- Reduction in recruitment costs
- Reduction in time to market with new products
- Range of new products and uptake
- Attrition rate
- Customer satisfaction scores
These are not L&D statistics; these are business measures, and having highlighted the purpose of your L&D focus, the business will also want to measure them. That is not to say that you will never need to do some data collection and analysis, but I think we are overcomplicating it and not getting to the nub of the problem.
In my book "How Not To Waste Your Money On Training" I show people simply how to "find the story in the data". Using a simple example of a scoring grid, I show how you can, using a spreadsheet and playing with different chart types, discover little parts of the truth about what is going on. It takes a click and a small amount of curiosity. If you want to try it out, then just use this example, using Excel to play around with different types of charts:
- Bar chart
- Pie chart
- Stacked bar
- Spider diagram
For each format, ask yourself "What do I see now?" Using this approach of curiosity and play, I discovered:
- A bar chart gives a good comparison of one person against another for each part of their role
- A spider diagram shows how well-rounded each team member is in their own right. Some are not rounded at all! Tracy seems the most well-rounded.
- A stacked bar shows the team's strengths and weaknesses:
- Who is the strongest in sales skills?
- Who is the weakest in product knowledge and working independently? (Why might this be? Is the manager poor at delegating?)
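The same detective work can be done outside a spreadsheet, too. Here is a minimal sketch in Python of the idea above, with a made-up scoring grid (the names, skills and scores are hypothetical, not taken from the programme), showing how the "stacked bar" and "spider diagram" questions can be answered directly from the numbers:

```python
# A hypothetical scoring grid (scores out of 10) of the kind described above.
scores = {
    "Tracy": {"Sales skills": 8, "Product knowledge": 7, "Working independently": 8},
    "Ben":   {"Sales skills": 9, "Product knowledge": 4, "Working independently": 3},
    "Aisha": {"Sales skills": 5, "Product knowledge": 9, "Working independently": 4},
}

skills = list(next(iter(scores.values())).keys())

# "Stacked bar" view: the height of each column is the team's total per skill,
# showing overall strengths and weaknesses.
team_totals = {s: sum(person[s] for person in scores.values()) for s in skills}

# One question the stacked view prompts: who is strongest in each skill?
strongest = {s: max(scores, key=lambda name: scores[name][s]) for s in skills}

# "Spider diagram" view: how well-rounded is each person?
# A simple proxy: the gap between their best and worst score (smaller = rounder).
roundedness = {name: max(p.values()) - min(p.values()) for name, p in scores.items()}
most_rounded = min(roundedness, key=roundedness.get)

print(team_totals)    # team strengths and weaknesses per skill
print(strongest)      # strongest person per skill
print(most_rounded)   # the most well-rounded team member
```

Even this toy grid tells a story: the team is strong on sales skills but weak on working independently, and one person (here, "Tracy") is noticeably more rounded than the rest. Those are exactly the prompts for the follow-up questions above.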
So I would urge you, L&D, before spending a lot of money on data analytics experts: get nosey and do some detective work yourselves. Keep it simple and dig into what is going on beneath the surface. Don't just take one person's viewpoint or use just one method; mix it up and start finding the story in the data!
My conclusions from the report and my own anecdotal research suggest that:
- L&D does not have the skills required for data analysis (I had better get that book finished!)
- It is not as complicated as you think
- It is about asking the right questions and finding the story in the data
- We don’t always need data analytics experts to do this!
This blog is a follow-up to the webinar of the same name, delivered on the 25th of September 2018
In January, as soon as tickets became available for this event, there was a flurry of bookings, and they have poured in throughout the year. This piqued my interest as to why this was such a popular event.
To find out more, we sent out a survey to attendees and, in parallel, a "Deep Dive" survey which took a wider look at the challenges in L&D. These were some of the findings:
These results prompted me to phone a number of people to dig deeper and here were some of the issues that were lurking behind the statistics:
In my head I am thinking that what is really going on is that people find evaluation and demonstrating value difficult because they do not do the analysis part well. Some people believe that stakeholder conversations are all they need. I believe we should have a more analytical approach in L&D, getting closer to the business and finding out what is really going on. This is what I would recommend: the Learning Loop Approach.
As part of the consultancy approach, I referred back to another showcase event “The Consultancy Approach” where we had some great conversations about what that might look like.
Finally we discussed what it would look like to have a “learning ecosystem” (phrase stolen from Pedro Valido) and hopefully that discussion is still going on via a message board we created.
If this topic has interested you then please get in touch and/or complete our “Deep Dive” survey to let us know what keeps you awake at night. You can also register your interest for our first Deep Dive event on December 4th in London.
A little provocative, I know, but for years I have been hearing that face-to-face learning is dead, and the simple fact is that it's doing well. It's alive and kicking!
This is borne out by the Towards Maturity report "L&D where are we now 2017-2018", where 4/5 of soft skills courses are still delivered face to face. In fact, what surprised me was that among the statistics around technology in learning, no figure was over 35% for technology having visible benefits. So why the provocative title? Am I biased towards face to face?
The simple answer is no, BUT I recognise the huge value it can bring and the buzz it can create. We are, after all, human, and there is really nothing that can recreate the excitement and buzz the right face-to-face intervention can create. What I hate to see is L&D's rush to the latest fad, thinking that this will:
- Save money
- Improve performance
- Get the results we need
In reality, the latest fad is just that, and what I favour is a more pragmatic approach rather than a huge investment in some technology that may not deliver on the hype and promises. That is not to say that we should not use the latest and shiniest. My concerns are often around investment versus return, especially for the voluntary, public and not-for-profit sectors. Can they really afford to invest in some of the latest when there is no guarantee it will deliver?
So, what am I suggesting? Here are a few things:
- Analyse the needs carefully so you get an accurate picture of what is required – this will put it into the knowledge, skills and attitudinal learning categories as well as identify the level of learning (see Bloom's taxonomy) – will it need to be in the moment, hard-wired or semi-permanent?
- Look at how to build up the learning, not in a one-time only event (unless that is ok) but maybe overlapping and layering of the learning, interweaving skills and knowledge
- Choose from the 100 ways to learn and create a blend that will make the learning interesting and engaging. People can then choose one or all of the activities. The activities could, if appropriate, build up the learning. See the example below:
Learning about the 5 secrets of Accelerated Learning on the Learning Loop:
- Participants read an article in the pre-work a few weeks before in the LNA email, finding out needs and objectives of participants
- Participants get to watch a short video summarising the 5 secrets in the last-minute email
- At the workshop, there are posters and resources on the 5 secrets around the room
- During the game – there will always be a question to “Describe the 5 secrets of Accelerated Learning” – which is a higher level of learning than “List the 5 Secrets of Accelerated Learning”
- One group does a teach back to help everyone remember what the 5 Secrets are and how to apply them
Back to the title then… AI, eLearning and VR are not dead, and neither is face-to-face. They are all simply tools that you can keep in your toolbox so that you can choose the most appropriate method. Just as PowerPoint has been overdone these days, let's not overdo the new ones. If we mix up the tools we use, then we create a variety for the learners that will keep them engaged. What are your thoughts?
Should we in L&D be focussed on improving job performance, with all learning directed at that, or should we be looking for people to be inspired to learn more and become more self-directed?
I spent some time in July 2018 writing my book, "How Not To Waste Your Money On Training", and during the week I had a few philosophical moments. One was about the difference between education, training and learning. I often speak to L&D professionals about the difference between a training needs analysis and a learning needs analysis, and how the former always leads to training whereas the latter leads to something broader than just training; it could be learning in many different forms.
In a similar way I was thinking about how my degree in chemical engineering and fuel technology was a good education. It prepared me for the world of work and also began a lifelong desire to learn more. When I moved from engineering to IT training with IBM we were called “instructors” and I worked in the IBM Education Centre in St. John’s Wood. Was what people received when they came to us there, an education? I am not so sure. I would hope the delegates were more prepared for their world of work and that they were inspired to learn more. But how broad was that inspiration? Did they become self-motivated learners keen to go beyond the traditional training course to further develop themselves?
This leads me to the present day; my title changed from instructor to trainer to L&D professional/facilitator. How do I define though whether I am educating, training or helping people to learn?
A few years after I gained my Certificate in Training Practice, I began working with trainers, delivering the CIPD Certificate in Training Practice, then the Certificate in Learning and Development Practice. Through my accelerated learning programmes, I have worked with L&D professionals to help them learn more about an approach that has been taking shape over many years. An approach that helps me focus on organisational needs as well as learner requirements. Programmes that took 8 months of weekly 4-hour sessions were delivered in 8 one-day sessions after moving to a well-known learning provider. Now I deliver a 6-week programme, which includes a 2-day workshop, and I cannot possibly 'cover' all I used to.
Leading by example and walking the talk have been driving forces in our organisation, "How to Accelerate Learning". Facilitation is practised and runs like an invisible thread through the programmes. Inspiring resources and innovative ways of learning through gamification create a different feel to the programme, leaving many people "inspired" – their words, not ours. Even hardened trainers with years of experience under their belts talk of how different it feels.
This has not always been a deliberate intention but, on occasions, a happy and accidental one. One that we persist with because of the results we achieve and the feedback we get. We put effort into:
- Drilling deeper into needs to see if learning is the appropriate course of action
- Delivering learning via a blend of activities not just training
- Helping people to gain confidence in stepping out of their comfort zone – to experience new ways to help people learn
- Using unusual materials and resources to inspire a different approach
- Following up the learning so it is not just a one-time event
So, I don’t feel like it is training in the traditional sense; not like when I was a VM Instructor. Nor do I feel that it is just learning because of the feedback we get. So are we educating and is that now the remit of L&D?
I have had no issue with the Kirkpatrick model, as it has always made sense to me. Just recently, though, I have been challenged to reconsider my position. Last week, running the Learning Loop, one of the participants said they had started to use Will Thalheimer's Learning Transfer Evaluation Model, which made me curious about what that model had to offer that was different to Kirkpatrick's.
This week I read a blog from Work Learning Research saying that levels 3 and 4 of Kirkpatrick are misunderstood by most L&D people, who use learner reactions as if they were valid level 3 and 4 measures. The blog got me a bit ruffled, particularly because the survey results:
1) use a sample size of TWO for vendors
2) show no visible link between the 250 surveyed and whether they are actually even measuring at levels 3 or 4 (if they are not, how could they know the answer to the question?)
This makes any conclusions drawn from the research tenuous, in my opinion, to say the least.
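To put a rough number on just how little a sample of two can tell us, here is a quick sketch (the counts are purely illustrative, not taken from the survey) computing 95% Wilson score intervals for a sample proportion:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Say 1 of 2 vendors answered one way, versus 125 of 250 practitioners:
print(wilson_interval(1, 2))      # roughly (0.09, 0.91) -- almost no information
print(wilson_interval(125, 250))  # roughly (0.44, 0.56) -- a usable estimate
```

With n = 2, the plausible range for the true proportion spans almost the whole 0–100% scale, which is the statistical version of "tenuous to say the least"; with n = 250 the interval is narrow enough to reason from.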
What made me curious enough to look at Will Thalheimer's model was the suggestion that it had something new to offer. What I did like was the focus on learning transfer, but then I thought: shouldn't the focus be on performance? Will has a lot to say about, and criticise in, the Kirkpatrick model, but I am looking for a deeper and better way to do evaluation. What I would like to offer is a deeper explanation of Kirkpatrick's model, and I hope you will make up your own mind about how you might apply it.
"Begin with the end in mind," said Stephen Covey. I agree wholeheartedly. If you do a thorough analysis and engage with stakeholders and line managers about the outcomes (in performance) required, then evaluation should be straightforward. But please note: if the analysis part is skipped over or done at a superficial level, THE EVALUATION WILL BE DIFFICULT WHICHEVER MODEL YOU USE.
- Level 1 – learner reactions – still an important part of seeing if you have had sufficient engagement and struck the right chord with the objectives (business-focused and learner-centred, according to the 5 Secrets of Accelerated Learning)
- Level 2 – learning achieved – did they learn what they were supposed to and have they met those objectives? This is important to know for L&D and their line managers
- Level 3 – impact on performance – this is what is then observed on the job and, in my opinion, should not be L&D's sole remit. If they have analysed the needs correctly and determined the correct outcomes, the line managers should be engaged enough to help embed the learning as well as measure performance improvements.
- Level 4 – impact on the business- if the stakeholders have been engaged and the outcomes are focused on the business, the stakeholders will be interested in measuring the business impact.
Here are some ways in which you could evaluate at the different levels of Kirkpatrick. Let me know what you think – I will be curious to hear your reactions!
In the graphic below I have overlaid the Kirkpatrick model onto Will Thalheimer's model.
When a stakeholder asks you to do something, can you distinguish between what they want and what they need? Does it make any difference?
I think it does (otherwise “What’s the point of this blog?” you might ask)
Let us illustrate this with an example. Someone says they would like a glass of orange juice, but is it really what they need? Digging deeper and understanding their situation, you discover they need their thirst quenched. Once we understand that that is the real need, it opens up the possible solutions:
- A cup of tea
- A glass of water
- A cool beer
- An apple juice
So in a business context I am sure that you can see the parallels. If you drill down into people's (and the organisation's) needs then not only do you bring a solution that is fit for purpose, you also open up the number of solutions available. You also avoid expensive mistakes whereby, leaping into solution mode too quickly, you miss the real point of what is going on.
So how do you find out the needs rather than the desires or "wants"?
It really is not that complicated… get curious, ask questions, and don't assume they actually know the answer, no matter how convincing they are.
Every month our loyal subscribers get a free resource and in the past they have received:
- A stakeholder analysis informational video
- A stakeholder analysis question sheet
Both of which would help greatly in determining the real needs. If you would like to receive resources like this every month then please subscribe. Depending on whether you are an L&D professional (includes consultants) or manager, you will get different and appropriate resources.
Would love to hear your thoughts on this topic!
This blog is for anyone who has ever suffered from or is suffering from imposter syndrome, or thinks they are over it and yet at times something seeps through to imply otherwise.
I was prompted to draw this graphic after something I said last week and further inspired to write this post after the person I said it to (Perry Timms) wrote an excellent blog entitled "Enough".
We were in Warszawa (I have to write it this way, I am Polish after all and was accidentally born in the UK) and I was just about to open the Elearning Fusion Conference when I explained to Perry the reason I had brought some postcards as giveaways: I was not enough.
“Of course you are, you will be awesome” replied Perry (bless him!)
But would I be enough? Was I experienced enough? Was my message pertinent enough for the audience? Would there be enough interaction? Would there be enough content?
Let me tell you a little about myself and for those of you who know me, you will know that this is not me bragging (honest):
- I have a degree in Chemical Engineering and Fuel Technology
- I was trained to be a VM instructor in the late 80’s in IBM
- I have 15 years' experience in the soft skills learning arena (and a CIPD qualification)
- 9 years a business owner
- Published author
- Regular blogger
- Nearly 5000 followers on LinkedIn, 2500+ on Twitter and 2000+ subscribers to our monthly free resources
So the question I ask myself, knowing all of this and that I was invited to open the conference with a 45-minute workshop in front of a large audience, is "Why am I not enough?" The simple answer is that "I am", but let us unpack it further:
- I am enough because of the life I have lived and the experiences I have had
- I am enough because others see the gold in me that I see in others
- I am enough because I am imperfect and willing to learn
- I am enough because I am unique and my voice is not any other person's voice
- I am enough not because I have earned it, but because I exist in the world
- I am enough, just like each and every one of us is enough
And yet… having worked through my imposter syndrome on many occasions, it still creeps up. Does it keep me real? Stop me from getting too big for my boots? Serve any useful purpose? Maybe?
All I know is what I say to many people who suffer from the same syndrome from time to time: you are awesome… talented… unique… amazing… and you have something to contribute, regardless of your situation. I truly believe this deep in my core, and so if I can believe this for others, I have to believe it for me:
“I AM ENOUGH”
….. enough said!
For anyone wanting to know how the workshop went you can follow this link
Health Warning: This blog post may cause agitation as I will be mentioning Learning Styles
I do not think there is anyone left on the planet who does not know that we no longer use learning styles to design learning for specific individuals attending learning events. Just in case you have been away, here is a link you might want to follow before reading on.
Most days there is something posted about it, and there are many videos saying that people learning in their preferred learning style DO NOT learn any better than in any other style. The important thing is that the learning is delivered in an appropriate way, based on the subject matter and whether it is knowledge, skills or behaviours we are trying to change.
So the other day I had a little insight, and this is where it came from. At a meeting of Trainer Talk Local in Leeds we talked about learning styles and how they had fallen out of favour… I reminisced with the group about how I used to use them a lot in my early career and how they impacted my practice.
When I became a soft skills trainer I bought a trainer styles questionnaire booklet and completed it. Lo and behold, my preferred training delivery style was also my preferred learning style! This really made me think, as I was potentially missing out on some valuable learning. Was there enough theory in my design and delivery? Were learners given enough time to reflect on their learning and realise what they had learnt, or was I simply immersing them in "doing" with lots of practical hints and tips?
Just to let you know my preferred style was Pragmatist, closely followed by Activist and then low scores on Reflector and Theorist.
Filling out the learning styles questionnaire had alerted me to how I prefer to learn, not how I learn best. In discovering my bias in delivery, I thought about how to learn best according to David Kolb, and that was to go around the whole of the learning cycle:
- Have an experience
- Reflect on it
- Make sense of it (through theories and models)
- Experiment with it and apply it
If you are aware of your bias (buzzword of the year, btw) towards any part of the cycle, you can, as a learner, make more effort to experience the other parts of the cycle. If you are a trainer or facilitator, it will inform you, during your design and delivery, of which parts you are likely to skip over or spend less time on because they are not your preference.
So lovely L&D folks out there, don’t throw the baby out with the bath water! Being aware of your preferences or bias can help improve your practice and therefore help others to learn better.
Some things to do may be:
- Find out your learning style preference
- Make an action plan for yourself to broaden the ways you learn
- Always do a check in your design that there is a balance of activities, covering the whole learning cycle
So anyone else want to add anything?