L&D’s relationship with Data

This title has been taken from the Towards Maturity report published in August of this year. I was immediately drawn to the title and anyone who knows what a maths geek I am will understand why.

I have always loved numbers, even just playing with them: multiplying numbers by themselves repeatedly, just for fun! Yes, I know it's not normal, and I also appreciate that not everyone else has the same love of numbers; in fact, quite the opposite. I have several friends who will admit that numbers are almost a phobia for them.

Reading this report, it is quite evident that we in L&D are not great at collecting and using data to its best advantage. Some of the figures that struck me were:

  • Of those aspiring to use data to effect change, only two in five (40%) were able to say that it helped them demonstrate business impact
  • Only three in five (60%) were successful in using data to help solve business problems

Bear in mind that this was from a sample size of 700+, and the two figures above refer to people who were really trying to use data effectively. In reality there will also be a number of people not even trying, so the 40% and 60% are likely to be very optimistic figures.

The most likely reasons cited were:

  1. Data and its analysis are complicated
  2. Lack of L&D skills in this area

Let me look at the second point first: why are we not addressing this lack of skills? Is it this phobia of numbers? A fear of what to do once you start collecting? An expectation that things have to change once you start collecting data effectively? Maybe it is a combination of all three? Or maybe a misconception around what it means to collect and analyse data?

For me it is quite simple (and this may address the first point): in L&D we need to get nosey. When someone asks us to deliver a leadership programme, we need to ask why, and how they will know it has been successful. If the first person you ask doesn't know, then ask someone else. Is it a real need or a perceived need?

The perceived need may be something like employee engagement scores being low. What we really need to determine is what effect that is having on the performance of the business:

  • High recruitment costs?
  • Lack of agility in the marketplace, because there is a high attrition rate and staff are not as familiar as they should be with the products?
  • Poor customer service because the tools they use have had little investment?

So when you look at these examples, you can start to see it really is not about data analysis, but about curiosity, perseverance and a healthy dose of scepticism. If you can pinpoint what the problem is, and it is a real business need, then what you need to measure will be very obvious:

  • Reduction in recruitment costs
  • Reduction in time to market with new products
  • Range of new products and uptake
  • Attrition rate
  • Customer satisfaction scores

These are not L&D statistics; they are business measures, and having highlighted the purpose of your L&D focus, the business will also want to measure it. That is not to say that you will never need to do some data collection and analysis, but I think we are over-complicating it and not getting to the nub of the problem.

In my book (currently in first draft), "How not to Waste your Money on Training", I will show people simply how to "find the story in the data". Using a simple example of a scoring grid, I will show how, with a spreadsheet and by playing with different graph types, you can discover little parts of the truth about what is going on. It takes a click and a small amount of curiosity. If you want to try it out, just use this example and play around in Excel with different types of charts:

  • Bar chart
  • Pie chart
  • Stacked bar
  • Spider diagram

For each format, ask yourself "what do I see now?" Using this approach of curiosity and play, I discovered the following (there is also a small code sketch of the same exploration after these findings):

  • A bar chart gives a good comparison of one person against another for each part of their role
  • A spider diagram shows how well-rounded each team member is in their own right. Some are not rounded at all! Tracy seems the most well-rounded.
  • A stacked bar chart shows the team's strengths and weaknesses:
    • Who is the strongest in sales skills?
    • Who is the weakest in product knowledge and working independently? (Why might this be? Is the manager poor at delegating?)
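
If you prefer code to a spreadsheet, here is a minimal sketch of the same exploration in Python, using pandas and matplotlib rather than Excel. The team members, skill areas and scores below are entirely invented for illustration; substitute your own scoring grid and see what story emerges.

```python
# A minimal sketch of "finding the story in the data" in code rather than Excel.
# All names and scores below are invented purely for illustration.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical scoring grid: each person rated 1-5 against parts of their role
grid = pd.DataFrame(
    {
        "Sales skills": [5, 2, 4, 3],
        "Product knowledge": [4, 2, 3, 1],
        "Working independently": [4, 1, 3, 2],
        "Customer service": [5, 3, 4, 2],
    },
    index=["Tracy", "Sam", "Priya", "Jo"],
)

fig = plt.figure(figsize=(15, 4))

# 1. Grouped bar chart: compare one person against another, skill by skill
ax1 = fig.add_subplot(1, 3, 1)
grid.plot(kind="bar", ax=ax1, title="Person vs person, by skill")

# 2. Stacked bar chart: the team's overall strengths and weaknesses at a glance
ax2 = fig.add_subplot(1, 3, 2)
grid.plot(kind="bar", stacked=True, ax=ax2, title="Team strengths (stacked)")

# 3. Spider (radar) diagram: how well-rounded is each team member?
ax3 = fig.add_subplot(1, 3, 3, projection="polar")
angles = np.linspace(0, 2 * np.pi, len(grid.columns), endpoint=False)
angles = np.concatenate([angles, angles[:1]])  # close the loop
for person, scores in grid.iterrows():
    values = np.concatenate([scores.values, scores.values[:1]])
    ax3.plot(angles, values, label=person)
ax3.set_xticks(angles[:-1])
ax3.set_xticklabels(grid.columns, fontsize=8)
ax3.set_title("How well-rounded is each person?")
ax3.legend(loc="upper right", fontsize=7)

fig.tight_layout()
plt.show()
```

The point is not the tooling: whether it is Excel or a few lines of code, the value comes from flipping between views and noticing what each one reveals.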

So I would urge you, L&D: before spending a lot of money on data analytics experts, get nosey and do some detective work yourselves. Keep it simple and dig into what is going on beneath the surface. Don't just take one person's viewpoint or use just one method; mix it up and start finding the story in the data!

My conclusions from the report and my own anecdotal research suggest that:

  • L&D does not have the skills required for data analysis (I had better get that book finished!)
  • It is not as complicated as you think
  • It is about asking the right questions and finding the story in the data
  • We don’t always need data analytics experts to do this!


Measure at all costs?

Measuring stuff in L&D is good, and I am an advocate of using data to inform your decision-making as well as to demonstrate your worth. So is it a case of just measuring everything, because it is bound to be useful? For anyone who knows us and our approach, you know the answer to that already!

Here is the measurement I know already happens in many places:

  • Number of people completed training, either face to face or online
  • Test scores from online quizzes
  • Amount of time taken for elearning and “engagement” during learning
  • Number of “no-shows” on courses
  • Number of training hours per year company wide

My question is: "Does any of this help to improve performance?" The answer may be that sometimes training is not a performance issue but a compliance issue. The training has to be done, so we need to know how many hours we complete per year and do it in the most efficient way. Fair enough! If you have to do the training, make sure it is effective and done in the most efficient way, so as not to waste money.

What about measuring all these things when it’s related to performance? I can see value in this only if a good analysis has been done beforehand to:

  • Rule out issues that cannot be solved by training (poor systems, processes or lack of resources etc)
  • Identify the right stakeholders to work with, who will support you and put in place ways to measure effectiveness
  • Determine the organisational outcomes that need to be met, with the appropriate stakeholders providing resources and support
  • Define learning outcomes that are geared towards improving performance and are both observable and measurable
  • Put in place follow-up by line managers before the learning starts, so that performers know what they are going to get out of the learning before they attend

This sort of "joined-up" L&D makes learning everyone's responsibility, and it also means that measurement is not just L&D's responsibility. It means that "learners" are transformed into "performers". It means that those measures listed at the top of the page could be used to inform which methods have had the most engagement (not that that is always an indicator of success!). It would be wrong, I believe, to suggest that good engagement with one successful cohort will guarantee the same success in another cohort.

To my thinking, any suggestion that there could be some sort of permanent link between engagement and performance misses the point entirely. In a closed system like a chemical plant, putting in the same chemicals at the same rate with the same process can produce expected, achievable results. Learning, however, is not a closed system: the variables are always changing, as are the participants and the influences on their behaviour. The true measure, in my opinion, has to be what measurable difference the learning has made to the performance of an individual. Have customer complaints reduced? Income increased? Sales boosted? A one-time analysis is not the answer, because things change. So analyse, plan, implement and measure, then go round again to make sure you are capturing the "now" and not the "yesterday".

So am I in L&D Narnia, expecting the impossible? Measuring the unmeasurable? Quite simply, I believe that before investing in learning and merely measuring "engagement", you should dig, dig and dig deeper to find out what is missing and what people will need to learn and do to make changes. Keep asking "why?" until you have some sensible answer other than "why not?"

Why do we (in L&D) spend so little in the analysis phase?

My thoughts are meandering today onto my book, "How not to waste your money on Training" (a work in early progress). After I posed the question above on Twitter, there were some interesting answers. The whole Storify for the tweet chat can be accessed here, but there were a few answers that made me ponder more:

  • Impatience from L&D and the client
  • Lack of accountability for L&D
  • Distracted by the new and shiny
  • Not realising it is not a static process

So my next question has to be: how do we help others in L&D see that if they get the analysis part right, then the following will come?

  • Respect and inclusion from the business
  • Flexibility for the organisation
  • Demonstrating value, so getting budgets is easier
  • Getting to play with the shiny stuff, to enhance the learning experience

My thoughts are that there is a fear in L&D of gathering data, analysing, interpreting, challenging the norm and having the gall to ask "Do we really need this?" or "Is this really a learning gap?", in case we find ourselves out of a job.

The thing is, if we don't start asking these questions, we may be out of a job anyway…


Where on earth do you begin with evaluation, L&D?

A few weeks ago, we ran a Learning Loop showcase event called "Taking the Fear out of ROI". There was a great mix of people from many different organisations, and to say the discussions were lively would be an understatement!

Stephen Covey, a man of many wise words, said "Begin with the end in mind". The answer to the question in the title would, of course, echo his sentiments. If evaluation has been an afterthought to the process of delivering learning then, quite frankly, it will be a waste of time. It would be a little like starting to knit a jumper without a pattern or any thought to shape, size or colour, and then expecting it to fit your needs.

Here is the essence: for a good evaluation, you need:

  • A solid needs analysis, which identifies the impact you would like the learning to have on the organisation
  • Stakeholders engaged at the beginning, providing you with not only the resources to identify needs, but resources and support for the evaluation. (*for more on stakeholder management click here)
  • Clear organisational outcomes, which the stakeholders will monitor and measure
  • Learning outcomes that support the organisational ones
  • Time before the next new project to complete the evaluation analysis and reporting
  • Realistic expectations from the stakeholders about the expected outcomes

There are of course other factors, but this is a brief run-through of the key components. The last one is interesting, especially when there are multiple factors that may influence the outcomes. Let us take a simple example:

At the same time as a customer service learning programme is being rolled out, a new customer management system is also installed.

In this instance the relevant stakeholders may either:

  • Join forces and measure the overall impact of both
  • Agree separately what percentage of any impact will be attributed to each (for example, agreeing up front that 60% of any improvement might be attributed to the learning and 40% to the new system)

Whichever approach is used, there need to be realistic expectations from the stakeholders about some of the other factors that may determine whether the objectives are achieved:

  • Lack of line manager support for the learners (one of the biggest reasons for learning not embedding)
  • A long enough lead time between the learning and the measurement, to allow the learning to embed and for results to be observed
  • Time and space in the learners' roles for the learning to be put into action

Again, this is not an exhaustive list, but these are some of the key areas that may be investigated if the learning does not meet expectations.

At How to Accelerate Learning, we can help organisations to dig deeper into evaluation. Contact us to find out how we could help you.

Are L&D thinking digitally?

This is the third in a series of blogs inspired by a short workshop run by David Hayden at the CIPD NAP (Northern Area Partnerships) conference in June 2016. The title of his workshop was "Is L&D prepared for the Future of Learning?", and the basis of the discussion was around key statistics uncovered in the Towards Maturity report of April 2016, "Preparing for the Future of Learning". The third question, not the statistics in the graphic, caused me to do some deep thinking!

So, let me tell you a little bit about my thinking in terms of learners, digital stuff and also what my experience has been. I am an ex-engineer (if you can ever really leave that?) and a former IT trainer for IBM, so digitally, I would say I am maybe more comfortable than the minority, as keen as the majority, but not as convinced as the digital evangelists.

I have run webinars, created short learning videos, taken part in Twitter chats (LnDConnect) and learned from my own professional learning network; I blog regularly, share updates on LinkedIn and engage in forums; I have created online polls, used online reflective apps like Brainscape, designed blended learning programmes and generally embraced new technology where it can accelerate and enhance the learning experience. Let me make it quite clear: I am fluent and practised in digital, and I use it as an ingredient in a rich blend of many other methods. It is not the first or only thing I think of when looking for a learning solution. So this question is what has caused me to think deeply: "Are L&D thinking digitally?"

If I am baking a cake, I use the right tools for the job, and in L&D I think in exactly the same way. I consider carefully*:

  • Budget and resources
  • Location(s) of the learners
  • The topic
  • Timescales
  • Depth of the learning required (so I may layer different methods)
  • Commitment of the stakeholders
  • Size and culture of the organisation

*See also blog on LNA

The question "Are L&D thinking digitally?" implies that this is how we should be thinking. Digital is not the answer to every L&D problem; it is part of a toolkit available to L&D professionals to create a great blend of learning that will maximise the effectiveness of any planned learning interventions. It is very easy, with the latest, shiniest digital tools, to be thinking "Oh golly, where can I use this?" (in my giddiness, I have been there!), whereas we should be thinking about:

“What will work best in this situation, with these learners and to achieve the best organisational outcomes?”

So with this in mind, I would change the question to: "Are L&D thinking digitally, in an appropriate way?" Maybe it's semantics… what do you think?
