Is L&D a ‘Tik-Toking’ timebomb?

‘ByteDance, the owner of TikTok, has made a strong statement about the ineffectiveness of its talent development team. In an internal memo, the company noted talent development had “limited practical value” and represented a “disconnect” from the company’s needs.’1

Can you, in L&D, be sure that you are connected to your organisation’s needs?

It is thought that close to 100 people from the talent development team were laid off, in spite of the company’s insistence that “Talent development is still very much a priority for us and for our employees.” That may indeed be true, as they are looking at different ways to develop their people without the burden of a large and inflexible Talent Development team.

How can you ensure your L&D team are relevant and in tune with the organisation?

“Because the team has already grown quite large, we have decided to no longer retain the Talent Development Center as many of its roles and functions are not in tune with our current development strategies,” the statement continued.

So is your L&D team running out of time? 

What do you do to demonstrate value? Is your learning strategy aligned with the organisational strategy? Do you engage with the right stakeholders in the right way to maximise impact?

This is from the CIPD’s L&D at Work survey of 2021:

  1. The desire to demonstrate impact is hampered by barriers to evaluation
  2. The majority of respondents do not use evidence to inform programme design

Either of these would be enough to cause concern in any organisation, and both are easily remedied.

When people speak about the “barriers to evaluation”, I believe this is really about having the data measures in place BEFORE beginning any design. This is echoed in the second point from the CIPD, about using evidence to inform programme design.

It seems straightforward to me with my engineering brain and my love of data to:

  • Assess where you are now
  • Determine where you (the organisation and the participants) want to be, with clear measurable outcomes
  • Look at your resources and constraints
  • Design something within the constraints you have that will achieve those outcomes

Another recent set of opinions comes from Donald Taylor’s Global Sentiment Survey 2022:

If you aggregate options 1, 5, 9, 10 and 11, they add up to a whopping 36.3%. I chose these because, again, they are indicative of L&D not aligning with the organisation and not using evidence to inform good decision making.


So what happens, then, in the world of L&D when we want to get closer to the organisation? Just the other day, I saw a social media post asking if anyone had a copy of a learning strategy they could use as an example, to copy. Here is what they should do instead of copying someone else’s strategy:

  • Look at the organisation’s mission, vision and goals for the next few years
  • Conduct a stakeholder analysis and determine the best stakeholders to work with
  • Speak to each of these stakeholders and ask them these questions:
    • What is the biggest challenge the organisation is facing?
    • What is the biggest challenge your team/division/section is facing?
    • How could L&D help you overcome these challenges? Can they be quantified in some way?
    • What should L&D start doing?
    • What should L&D stop doing?
    • What should L&D continue doing?
    • What should L&D do differently?

If all of this seems sensible but you and your colleagues need a hand with some of the details, you may be interested in a new online course that launched recently called “How Not To Waste Your Money On Training”.

  1. https://www.cnbc.com/2022/01/31/tiktok-owner-bytedance-laid-off-a-global-hr-team-in-december.html

 

In a Learning Needs Analysis, which 3 things are you looking for?

This is another one of the questions that may be asked when playing the Learning Loop™, a new and exciting replacement for the traditional ‘train-the-trainer’ course.

It is an interesting question because some people go straight to the obvious things they need to get from a Learning Needs Analysis:

  1. Knowledge – WHAT they will keep in their head.
  2. Skills – WHAT you will be able to see them do (or their outputs).
  3. Attitudes – HOW they do things.

 

These are three very good things to look out for, and specifically you would want to find the difference between where they are now and where you/the stakeholders would like them to be. You are essentially wanting to know the gaps that you need to fill. If you can gauge what level they are at when they start, defining what they need to know by the end will certainly be easier!
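The gap idea above can be sketched in a few lines of code. This is only an illustration, not part of any formal LNA method: the skill names and the 1–5 rating scale are invented for the example.

```python
# Minimal gap-analysis sketch; skills and ratings (1-5 scale) are invented.
current = {"presenting": 2, "coaching": 3, "data analysis": 1}
target  = {"presenting": 4, "coaching": 3, "data analysis": 3}

# The gap is simply where they should be minus where they are now.
gaps = {skill: target[skill] - current[skill] for skill in target}

# Prioritise the biggest gaps first; zero-gap skills need no intervention.
priorities = sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)
print(priorities)
```

Even a rough table like this makes the conversation with stakeholders concrete: you can agree the current and target levels with them before designing anything.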

What will help you greatly in this is something like Bloom’s Taxonomy. Watch the video below to find out more.

There are of course other things that you may be looking for in a Learning Needs Analysis:

  • A clear idea of the problem(s) you are trying to solve
  • Clear organisational outcomes and measures
  • Clear learning outcomes and levels of learning
  • Which stakeholders will be involved and supportive
  • How line managers may help embed the learning
  • The resources that are available 
  • What has and has not worked in the past for similar projects

And many more things… If you would like to chat to Krystyna about how you even make a start, then book a free 30-minute consultation to ask all the questions you would like!


Why bother with a needs analysis?

This is one of the first questions that may be asked when playing the Learning Loop™, a new and exciting replacement for the traditional ‘train-the-trainer’ course.

Notice that it does not say “Training Needs Analysis” or “Learning Needs Analysis”, simply “Needs Analysis”.

By implication, if you conduct a Training Needs Analysis (TNA), any of the solutions will be training courses. In the same way, if you conduct a Learning Needs Analysis (LNA), the solutions will be broader than for the TNA, but the assumption is that the outputs will involve learning of some sort.

So if you conduct a Needs Analysis (NA), you will look beyond training or learning requirements, and it may force you to identify the problem(s) more clearly and concisely. Now you may worry about conducting a Needs Analysis for fear that you might be required to solve problems beyond the scope of what L&D does. This is where working with the right stakeholders will really help. If you identify a problem outside of your remit, your stakeholders will be grateful but will not necessarily expect you to solve it. In this way you do not end up wasting your money on training or learning that is not required. Again, happy stakeholders!

Going back to the question raised in the title…

Reasons to conduct a needs analysis:

  • To ensure you have measurable outcomes
  • To rule out problems that have nothing to do with training or learning
  • To make the design easier (clear objectives)
  • To get on board with your stakeholders’ requirements
  • To get an idea of the gaps in knowledge, skills and behaviours so you will know how to fill them
  • To identify any issues that were not identified in the brief
  • To get line manager buy-in for follow up (the biggest reason training fails)

Can you think of any more?

If you would like to learn how to do an impactful needs analysis that will help you demonstrate value in all of your learning or training interventions, then take a look at this new online course, “How Not To Waste Your Money On Training”.


LTSF19 – Finding the Story in the Data

11 June 2019 was the date of the Learning Technologies Summer Forum at ExCeL London. I was honoured to run a session on “Finding the Story in the Data”, and here are some of my notes and thoughts about the session.

This session was a practical, nitty-gritty sort of event. I hope people forgave me if I was teaching my grandmother to suck eggs, but I do hear from a lot of L&D people who just don’t know where to start. Data is all over the place and you can easily get swamped. So the purpose of this session was to get people started and to build some confidence in looking at data in a practical way.

I started by asking a question: “Why bother collecting or analysing data?”.

Here are some of the reasons collected from the Learning and Skills Group webinar of the same name the week before.

The chart was put together by Laura Overton and reproduced with her permission.

The two main reasons, as you can see, are to improve the user experience and to understand the effects or benefits. Not surprising really, and in a 2018 report, Towards Maturity note that 91% of their Top Deck said their learning interventions were aligned to the business goals. In order to do that, you need to be measuring what you are doing.

Other reasons may be:

  1. Credibility
  2. To check if things are going to plan
  3. Demonstrate the value brought by L&D
  4. Transition from learning provider to performance enhancer
  5. Avoid the sheep dip approach
  6. It is expected
  7. Stakeholder buy-in

My engineering brain… In a former life I was a chemical engineer and fuel technologist, and if you think that it is all about data and analysis with no room for intuition, then let me share a little story:

As an engineer gathering data to site wind turbines, I became very skilled at finding appropriate sites just by looking at a map. This helped me to narrow down where to look from a myriad of places that might be suitable. I would look at the maps, gather data from a mast and correlate it with the nearest met station. It is no different in L&D. You can use your intuition to see where things might be going wrong, from the data that you are already collecting and from your stakeholders. This means you can collect limited and focussed data to confirm your suspicions, and begin to find the story in the data.
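That “correlate a small, focused sample against a reference” habit translates directly into code. Here is a minimal sketch of the idea; the numbers are invented for illustration (they are not real wind or L&D data), and the Pearson coefficient is computed by hand to keep it dependency-free.

```python
# Sketch: correlate a short run of local measurements with a reference series,
# the way mast wind data might be checked against the nearest met station.
# All figures are invented for illustration.
local     = [5.1, 6.3, 4.8, 7.0, 5.9, 6.5]   # e.g. monthly mast readings
reference = [4.9, 6.0, 4.5, 6.8, 5.7, 6.2]   # e.g. met-station readings

n = len(local)
mean_l = sum(local) / n
mean_r = sum(reference) / n

# Pearson correlation coefficient: covariance over the product of spreads.
cov   = sum((l - mean_l) * (r - mean_r) for l, r in zip(local, reference))
var_l = sum((l - mean_l) ** 2 for l in local)
var_r = sum((r - mean_r) ** 2 for r in reference)
corr  = cov / (var_l * var_r) ** 0.5

print(round(corr, 3))  # close to 1.0 here: the two series move together
```

A high correlation tells you the cheap, readily available reference series is a fair proxy for your own measurements, so you can stop collecting expensive data sooner.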

Understanding the link between data and performance is crucial, as per the diagram below.

Knowing when to collect quantitative or qualitative data is also important.

Working through a case study helped participants decide when it was appropriate to gather quantitative data and when qualitative. A crucial part of this was to think broader than the case study, which is a great piece of advice for anyone doing their own analysis. Look and see what is happening in your industry, just in case the sudden drop in sales is industry-wide and not just a blip in your own organisation. It could save you a lot of time!

I then challenged the participants to say what they saw in a number of different graphs, encouraging them to be playful to find the story in the data. Sometimes the graphs raised more questions than they answered, but it certainly gave everyone an insight into how easy it is to use Excel and simple charts to uncover that story.
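You do not even need Excel to start playing; a few lines of code can turn numbers into a rough chart you can interrogate. The monthly completion figures below are invented for illustration.

```python
# Invented monthly course-completion figures, charted as simple text bars.
completions = {"Jan": 40, "Feb": 42, "Mar": 15, "Apr": 44, "May": 41}

for month, count in completions.items():
    bar = "#" * (count // 5)          # one '#' per five completions
    print(f"{month} {bar:9} {count}")

# The dip in March leaps out at once; the chart raises the question
# ("what happened in March?") even if it cannot answer it.
```

That question-raising is the point: a simple chart does not give you the story on its own, but it tells you where to go digging.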

 

I just had to share this picture from LTSF19 – Rachael Orchard, my fabulous host for the session, kindly brought her stormtrooper so we could endlessly make Star Wars puns and then playful Don Taylor agreed to pose with us both!


Why do we (in L&D) spend so little in the analysis phase?

My thoughts are meandering today onto my book “How Not To Waste Your Money On Training” (a work in early progress), and after posing the question above on Twitter, there were some interesting answers. The whole storify for the tweet chat can be accessed here, but there were a few answers that made me ponder more:

  • Impatience from L&D and the client
  • Lack of accountability for L&D
  • Distracted by the new and shiny
  • Not realising it is not a static process

So my next question has to be, “How do we help others in L&D see that if they get the analysis part right, then this follows?”:

  • Respect and inclusion from the business
  • Flexibility for the organisation
  • Demonstrating value, so getting budgets is easier
  • Getting to play with the shiny stuff, to enhance the learning experience

My thought is that we in L&D shy away from gathering data, analysing it, interpreting it, challenging the norm and having the gall to ask “Do we really need this?” or “Is this really a learning gap?”, for fear we might be out of a job.

Thing is, if we don’t start asking these questions, we may be out of a job anyway…


Are L&D thinking digitally?

This is the third in a series of blogs inspired by David Hayden, who ran a short workshop at the CIPD NAP (Northern Area Partnerships) conference in June 2016. The title of his workshop was “Is L&D prepared for the Future of Learning?” and the basis of the discussion was around key statistics uncovered in the Towards Maturity report of April 2016, “Preparing for the Future of Learning”. The third question, not the statistics in the graphic, caused me to do some deep thinking!

So, let me tell you a little bit about my thinking in terms of learners, digital stuff and what my experience has been. I am an ex-engineer (if you can ever really leave that?) and a former IT trainer for IBM, so digitally I would say I am more comfortable than most, as keen as the majority, but not as convinced as the digital evangelists.

I have run webinars, created short learning videos, taken part in Twitter chats (LnDConnect), learned from my own professional learning network, blogged regularly, shared updates on LinkedIn, engaged in forums, created online polls, used online reflective apps like Brainscape, designed blended learning programmes and generally embraced new technology where it can accelerate and enhance the learning experience. Let me make it quite clear: I am fluent and practised in digital, and I use it as one ingredient in a rich blend of many other methods. It is not the first or only thing I think of when looking for a learning solution. So this question is what has caused me to think deeply: “Are L&D thinking digitally?”

If I am baking a cake, I use the right tools for the job and in L&D I think exactly the same. I consider carefully*:

  • Budget and resources
  • Location(s) of the learners
  • The topic
  • Timescales
  • Depth of the learning required (so I may layer different methods)
  • Commitment of the stakeholders
  • Size and culture of the organisation

*See also blog on LNA

The question “Are L&D thinking digitally?” implies that this is how we should be thinking. Digital is not the answer to every L&D problem; it is part of a toolkit available to L&D professionals to create a great blend of learning that will maximise the effectiveness of any planned learning interventions. It is very easy, with the latest, shiniest digital tools, to be thinking “Oh golly, where can I use this?” (in my giddiness, I have been there!), whereas we should be thinking about:

“What will work best in this situation, with these learners and to achieve the best organisational outcomes?”

So with this in mind, I would change the question to: “Are L&D thinking digitally, in an appropriate way?” Maybe it’s semantics… what do you think?
