Blooming Marvellous!

When I first discovered Bloom's Taxonomy I was confused, then frustrated, and now I absolutely love it!

The first thing that frustrated me was the word TAXONOMY – it just means classification, so why use something that sounds so complicated?

The next thing was the names of the domains:

  • Cognitive (Knowledge)
  • Affective (Attitudes)
  • Psychomotor (Skills)

Again, why make it sound so complicated when it is so easy?

What I love about it is the way I use it to determine the gaps in people's knowledge, skills or attitudes, and then the level to which they need to improve. Having determined that a need is down to a gap in learning and not in resources, relationships, processes etc., I ask myself a question:

Is this a knowledge, skill or attitudinal gap?

I can determine the answer to this question (and whether it is a combination of two or three) by thinking:

Is it something that has to be in people's heads? A knowledge thing? Something you will only know they have got if they describe, explain, list or tell you about it?

Or is it a skill thing? Something that you will see them doing, or that has some visible output? It may be a physical skill (hence the 'hands').

Or is it the way they should be doing something? A heart thing? Their attitude?

Or is it a combination of all three?

Once it is clear in my mind which domain the learning falls into, I then give some thought to the level of the learning. A simple example would be GDPR (General Data Protection Regulation) mandatory training. This is both a knowledge thing and an attitudinal thing. It might even become a skill thing, depending on the level at which you operate in the organisation.

Mandatory training for all staff can be tedious, and if you make it generic it may not hit the mark with a lot of people. Let's examine what different groups of people might actually need:

For colleagues you might want them:

  • To be able to explain what their responsibilities are with regard to their role and GDPR
  • As a team, to be able to identify possible data security risks in their own team
  • To follow the GDPR policy and advocate its use to other team members

For line managers:

  • To be able to explain what their responsibilities are with regard to their role and GDPR
  • As a team, to be able to identify possible and actual data security risks in their own team
  • With other line managers, to outline a GDPR plan for their team to ensure that their approach is regularly reviewed
  • To follow the GDPR policy and advocate its use to other team members
  • To be a role model for GDPR

For the Data Protection Officer:

  • To be able to explain what their responsibilities are with regard to their role and GDPR
  • To be able to identify data security risks within their own team and the organisation
  • With other line managers, to outline a GDPR plan for each team to ensure that their approach is regularly reviewed
  • To put together and communicate a policy which safeguards the data within the organisation according to GDPR
  • To be a role model for GDPR
  • To inspire others to follow the GDPR policy

From the above you can see that some of the learning could be used for all levels, but for some groups you need to take them to the next level and maybe beyond. Looking at the picture at the top of the article, you can see that Bloom's Taxonomy can be used not only to determine the level of learning but also to map out a learning path for different groups of individuals. It is worth noting that you cannot just leap to the top level in any domain without spending some time at the lower levels.

If this is slowly starting to make sense, or needs more clarification, then watch this short video or chat to me.

L&D’s relationship with Data

This title has been taken from the Towards Maturity report published in August 2018. I was immediately drawn to the title, and anyone who knows what a maths geek I am will understand why.

I have always loved numbers, even just playing with them – multiplying numbers by themselves repeatedly just for fun! Yes, I know it's not normal, and I appreciate that not everyone shares the same love of numbers; in fact, quite the opposite. I have several friends who will admit that numbers are almost a phobia for them.

Reading this report, it is quite evident that we in L&D are not great at collecting and using data to its best advantage. Some of the figures that struck me were:

  • Of those aspiring to use data to effect change, only two in five (40%) could say that it helped them demonstrate business impact
  • Only three in five (60%) were successful in using data to help solve business problems

Bear in mind that this was from a sample size of 700+, and the two figures above relate to those people who were really trying to use data effectively. In reality there will also be a number of people not even trying, so the 40% and 60% are likely to be very optimistic figures: if, say, only half of all L&D teams are even attempting to use data, the true proportion demonstrating business impact could be nearer 20%.

The most likely reasons cited were:

  1. Data and its analysis are complicated
  2. Lack of L&D skills in this area

Let me look at the second point first. Why are we not addressing this lack of skills? Is it the phobia of numbers? A fear of what to do once you start collecting? An expectation that things have to change once you start collecting data effectively? Maybe it's a combination of all three? Or maybe a misconception around what it means to collect and analyse data?

For me it is quite simple (and this may address the first point): in L&D we need to get nosey. When someone asks us to deliver a leadership programme, we need to ask why, and how they will know it has been successful. If the person asking doesn't know, then ask someone else. Is it a real need or a perceived need?

The perceived need may be something like low employee engagement scores. What we really need to determine is what effect that is having on the performance of the business:

  • High recruitment costs?
  • Lack of agility in the marketplace, because a high attrition rate means staff are not as familiar with the products as they should be?
  • Poor customer service because the tools they use have had little investment?

So when you look at these examples, you can start to see it really is not about data analysis, but about curiosity, perseverance and a healthy dose of scepticism. If you can pinpoint what the problem is and it is a real business need, then what you need to measure will be very obvious:

  • Reduction in recruitment costs
  • Reduction in time to market with new products
  • Range of new products and uptake
  • Attrition rate
  • Customer satisfaction scores

These are not L&D statistics; these are business measures, and having highlighted the purpose of your L&D focus, the business will also want to measure them. That is not to say that you are never needed to do some data collection and analysis, but I think we are over-complicating it and not getting to the nub of the problem.
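
To make that concrete, here is a minimal sketch of how simply a couple of those measures could be calculated. Everything in it – the team size, the leaver numbers, the cost per hire – is hypothetical and purely for illustration; a spreadsheet would do exactly the same job.

# A hypothetical before/after comparison for a learning programme.
# All figures are made up for illustration only.

def attrition_rate(leavers: int, average_headcount: float) -> float:
    """Annual attrition as a percentage of average headcount."""
    return 100 * leavers / average_headcount

before = attrition_rate(leavers=30, average_headcount=200)  # 15.0%
after = attrition_rate(leavers=18, average_headcount=200)   # 9.0%
print(f"Attrition: {before:.1f}% -> {after:.1f}%")

# Fewer leavers means fewer replacement hires; assume a hypothetical cost per hire.
cost_per_hire = 4_000
saving = (30 - 18) * cost_per_hire
print(f"Recruitment cost saving: {saving:,} per year")

The point is not the code: once you know which business measure matters, the calculation itself is the easy part.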

In my book "How Not to Waste Your Money on Training" I show people simply how to "find the story in the data". Using a simple example of a scoring grid, I show how you can, using a spreadsheet and playing with different graph types, discover little parts of the truth about what is going on. It takes a click and a small amount of curiosity. If you want to try it out, just use this example in Excel – or the short code sketch after the list below – to play around with different types of charts:

  • Bar chart
  • Pie chart
  • Stacked bar
  • Spider diagram

For each format ask yourself "What do I see now?". Using this approach of curiosity and play, I discovered:

  • A bar chart gives a good comparison of one person against another for each part of their role
  • A spider diagram shows how well-rounded each team member is in their own right. Some are not rounded at all! Tracy seems the most well-rounded.
  • A stacked column shows the team's strengths and weaknesses:
    • Who is the strongest in sales skills?
    • Who is the weakest in product knowledge and working independently? (Why might this be – a manager poor at delegating?)
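
If you would rather play outside Excel, here is a minimal sketch in Python (using matplotlib) of the same idea. The team members, skills and scores below are entirely hypothetical – substitute your own scoring grid, generate each chart in turn, and ask "What do I see now?" of each one.

import numpy as np
import matplotlib.pyplot as plt

skills = ["Sales skills", "Product knowledge", "Working independently"]
scores = {  # hypothetical scores out of 10
    "Tracy": [8, 7, 8],
    "Sam":   [9, 4, 5],
    "Jo":    [5, 6, 3],
}

# 1. Grouped bar chart: compare people skill by skill
x = np.arange(len(skills))
width = 0.25
fig, ax = plt.subplots()
for i, (name, vals) in enumerate(scores.items()):
    ax.bar(x + i * width, vals, width, label=name)
ax.set_xticks(x + width)
ax.set_xticklabels(skills)
ax.legend()
ax.set_title("Who is strongest in each part of the role?")

# 2. Stacked column: the team's overall strengths and weaknesses
fig, ax = plt.subplots()
bottom = np.zeros(len(skills))
for name, vals in scores.items():
    ax.bar(skills, vals, bottom=bottom, label=name)
    bottom += np.array(vals)
ax.legend()
ax.set_title("Where is the team strong or weak overall?")

# 3. Spider (radar) diagram: how well-rounded is each person?
angles = np.linspace(0, 2 * np.pi, len(skills), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close the polygon
fig, ax = plt.subplots(subplot_kw={"polar": True})
for name, vals in scores.items():
    closed = vals + vals[:1]
    ax.plot(angles, closed, label=name)
    ax.fill(angles, closed, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(skills)
ax.legend()
ax.set_title("How well-rounded is each team member?")

plt.show()

Each chart answers a different question about the same numbers – which is exactly the point of playing with formats.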

So I would urge you, L&D: before spending a lot of money on data analytics experts, get nosey and do some detective work yourselves. Keep it simple and dig into what is going on beneath the surface. Don't just take one person's viewpoint or use just one method; mix it up and start finding the story in the data!

My conclusions from the report and my own anecdotal research suggest that:

  • L&D does not have the skills required for data analysis (I had better get that book finished!)
  • It is not as complicated as you think
  • It is about asking the right questions and finding the story in the data
  • We don't always need data analytics experts to do this!

Measure at all costs?

Measuring stuff in L&D is good, and I am an advocate of using data to inform your decision making as well as to demonstrate your worth. So is it a case of just measuring everything and it's bound to be useful? For anyone who knows us and our approach, you know the answer to that already!

Here is the measurement I know already happens in many places:

  • Number of people completed training, either face to face or online
  • Test scores from online quizzes
  • Amount of time taken for elearning and “engagement” during learning
  • Number of “no-shows” on courses
  • Number of training hours per year company wide

My question is "Does any of this help to improve performance?" The answer may be that sometimes training is not a performance issue but a compliance issue. The training has to be done, so we need to know how many hours we complete per year and do it in the most efficient way. Fair enough! If you have to do the training, make sure it's effective and efficient, so that money isn't wasted.

What about measuring all these things when it’s related to performance? I can see value in this only if a good analysis has been done beforehand to:

  • Rule out issues that cannot be solved by training (poor systems, processes or lack of resources etc)
  • Identify the right stakeholders to work with, who will support you and put measures in place to gauge effectiveness
  • Determine the organisational outcomes that need to be met, with the appropriate stakeholders providing resources and support
  • Define learning outcomes that are geared towards improving performance and are both observable and measurable
  • Put in place follow-up by line managers before the learning starts, so that performers know what they are going to get out of the learning before they attend

This sort of "joined-up" L&D makes learning everyone's responsibility, and it also means that measurement is not just L&D's responsibility. It means that "learners" are transformed into "performers". It means that the measures listed at the top of the page could be used to show which methods have had the most engagement (not that that is always an indicator of success!). It would be wrong, I believe, to suggest that good engagement with one successful cohort will guarantee the same success in another cohort.

To my thinking, any suggestion that there could be some sort of permanent link between engagement and performance misses the point entirely. In a closed system like a chemical plant, putting in the same chemicals at the same rate with the same process can produce expected, repeatable results. Learning, however, is not a closed system: the variables are always changing, as are the participants and the influences on their behaviour. The true measure, in my opinion, has to be what measurable difference the learning has made to the performance of an individual. How have customer complaints reduced? Has income increased? Have sales been boosted? A one-time analysis is not the answer, because things change. So analyse, plan, implement and measure, then go round again to make sure you are capturing the "now" and not the "yesterday".

So am I in L&D Narnia, expecting the impossible? Measuring the unmeasurable? Quite simply, I believe that before investing in learning and merely measuring "engagement", you should dig, dig and dig deeper to find out what is missing and what people will need to learn and do to make changes. Keep asking "why?" until you have a sensible answer other than "why not?".

Why do we (in L&D) spend so little time in the analysis phase?

My thoughts are meandering today onto my book "How Not to Waste Your Money on Training" (a work in early progress), and after posing the question above on Twitter, there were some interesting answers. The whole Storify for the tweet chat can be accessed here, but there were a few answers that made me ponder more:

  • Impatience from L&D and the client
  • Lack of accountability for L&D
  • Distracted by the new and shiny
  • Not realising it is not a static process

So my next question has to be: "How do we help others in L&D see that if they get the analysis part right, then the following comes with it?"

  • Respect and inclusion from the business
  • Flexibility for the organisation
  • Demonstrating value, so getting budgets is easier
  • Getting to play with the shiny stuff, to enhance the learning experience

My thoughts are that in L&D we are afraid of gathering data, analysing and interpreting it, challenging the norm and having the gall to ask "Do we really need this?" or "Is this really a learning gap?" – for fear we might be out of a job.

The thing is, if we don't start asking these questions, we may be out of a job anyway…

Where on earth do you begin with evaluation, L&D?

A few weeks ago, we ran a Learning Loop showcase event called "Taking the Fear out of ROI". There was a great mix of people from many different organisations, and to say the discussions were lively would be an understatement!

Stephen Covey, a man of many wise words, said "Begin with the end in mind". So the answer to the question in the title would, of course, echo his sentiments. If evaluation has been an afterthought to the process of delivering learning then, quite frankly, it will be a waste of time. It would be a little like starting to knit a jumper without a pattern or any thought to shape, size or colour, and then expecting it to fit your needs.

Here is the essence: for a good evaluation, you need:

  • A solid needs analysis, which identifies the impact you would like the learning to have on the organisation
  • Stakeholders engaged at the beginning, providing you with not only the resources to identify needs, but resources and support for the evaluation. (*for more on stakeholder management click here)
  • Clear organisational outcomes, which the stakeholders will monitor and measure
  • Learning outcomes that support the organisational ones
  • Time before the next new project to complete the evaluation analysis and reporting
  • Realistic expectations from the stakeholders about the expected outcomes

There are of course other factors, but this is a brief run-through of the key components. The last one is interesting, especially when there are multiple factors which may influence the outcomes. Let us take a simple example:

At the same time as a customer service learning programme is being rolled out, a new customer management system is also installed.

In this instance the relevant stakeholders may either:

  • Join forces and measure the overall impact of both
  • Agree percentages of the impact of the two separately (for example, attributing 60% of any improvement to the new system and 40% to the learning)

Whichever approach is used, there need to be realistic expectations from the stakeholders as to some of the other factors which may affect whether the objectives are achieved:

  • Lack of line manager support for the learners (one of the biggest reasons for learning not embedding)
  • A long enough lead time between the learning and the measurement, to allow the learning to embed and for results to be observed
  • Time and space in the learners' roles for the learning to be put into action

Again, this is not an exhaustive list, but these are some of the key areas that may be investigated if the learning does not meet expectations.

At How to Accelerate Learning, we can help organisations to dig deeper into evaluation. Contact us to find out how we could help you.
