Podcast | The Three L&D Elephants In The Room – Paul Matthews
Author:
Gary
PUBLISHED ON:
March 31, 2021
June 26, 2023
PUBLISHED IN:
Podcast
Do L&D professionals need to accept that L&D’s not always the answer? Yes! And that’s not even one of Paul Matthews’ three elephants in the room. Instead, we need to start with the desired behaviour and work backwards to the right format.
Paul Matthews is an L&D expert, speaker and consultant, but he’s also the author of three books on the three L&D elephants: Learning transfer, capability and informal learning. And he’ll talk you through how and why you should address them in this episode.
Hit play and learn how you can harness informal learning’s power, how to become a better performance consultant, the need for a credible L&D brand and plenty more. And you can tell Paul is a great public speaker, because he breaks it all down in relatable ways. In other words, he’s great fun to listen to!
Watch the episode
Listen to the episode
Running order
1:30 > The three L&D elephants
2:51 > Elephant One: Learning transfer
11:05 > Elephant Two: Capability
15:31 > How can L&D become better performance consultants?
19:57 > Elephant Three: Informal learning
22:57 > Harnessing the power of informal learning
29:27 > Credibility: The sneaky fourth elephant
36:10 > Quickfire questions
Summary: How to tackle three L&D elephants and the sneaky fourth one
Before we can even get to the elephants, we need to understand why these issues warrant comparison to the world's largest land mammal. It's because they're things we shouldn't ignore but often do – either because they've always been there in the background, ignoring them has become a habit, or people just don't know where to start.
Here’s a quick summary of the things Paul talked us through, but for the full tusk and trunk experience, you should really listen to the full episode.
Elephant One: Learning Transfer
When we talk about formal learning, it's not just a case of moving information from A to B; we have to do something with it once we get to B. When people go on a course, they need to actively apply what they learn to make sure it's not lost. Otherwise, how's it going to make a difference in the real world?
One problem might be the lack of a central point of accountability for training driving results. A trainer might think they've done their part by explaining the topic, and a manager might think their part is done when they send someone on the course – but where are we pulling everyone together to be accountable for results?
You need to consider the learner’s motivation and mindset. Are they ready to learn or can they even be bothered? Is there a roadmap or plan from your end to drive that behaviour change?
Elephant Two: Capability
When someone comes to you and asks for a particular training course, you need a better way to respond than simply approving it. Part of that comes from better performance diagnosis and improving the consultancy process: understanding why people aren't doing the things we want them to, what the barriers are, and working out how we can break those barriers down.
That doesn't always mean L&D either, so don't presume learning initiatives are the answer to everything. Don't start with the format in mind and work backwards; tackle the problems and barriers to that new behaviour with the right support. The elephant in the room is having that conversation when someone requests what they want, and giving them visibility of what they actually need.
Elephant Three: Informal Learning
Informal learning is really just anything that’s not formal. What do people do during the day when they want to learn? That comes back to how they observe others, the discussions they have, what they look up on YouTube – but it doesn’t really matter if it’s formal or informal, it’s helping them do their job.
The majority of learning happens on the job, in the flow of work, which L&D needs to pay attention to. Ask people where they go when they need information and smooth the path for them, so it becomes easier to find and fewer clicks away.
The Sneaky Fourth Elephant: Credibility In The Organisation
How well L&D can drive change comes down to how well it's perceived. In a lot of organisations, its brand and reputation aren't that great, which makes it harder to promote change – how can L&D focus on changing the culture without credibility?
Creating a strong learning brand can help. Start by setting out a value proposition for L&D and defining your audience(s). Understand your mission, vision and purpose for L&D and align them with the company line. That helps you understand who you are and what you can do for the organisation, so you can present that and deliver on the brand promise at every touchpoint.
[00:00:00] Nelson Sivalingam: Work is changing. The question is, are you? Welcome to 99 Problems But Work Ain't One, the new podcast series from HowNow that will help you prepare for the fast-changing world of work. I'm your host, Nelson Sivalingam, and I'll be talking to disruptive startups, contrarian thinkers, global leaders and real game changers, asking them the burning questions about the challenges we face at work, from scaling cultures and adopting technology
[00:00:28] to improving wellbeing and building fast-learning organisations that are prepared for the future of work. We get the insights, tactics and actionable nuggets of knowledge to put to work. Have you heard about the three L&D elephants? They're not the things we never forget, but the ideas we often overlook. Paul Matthews joins me this week to talk us through learning transfer, building capability and informal learning.
[00:00:53] Plus the sneaky fourth elephant. Paul's got so much L&D experience we could fill an episode with his backstory. But in a nutshell: he's an industry expert, he's written three L&D books, he's a regular speaker at events, a consultant and a strategist, and, more importantly, he breaks down L&D challenges in a relatable and applicable way, from how L&D can become better performance consultants to building the credibility needed to drive real change.
[00:01:21] This is a must-listen for any L&D and people leaders. Hi, Paul. Thanks for joining us.
[00:01:27] Paul Matthews: Hey guys. Great to be here.
[00:01:29] Nelson Sivalingam: And so, today's chat. I really wanted to focus around the three elephants in the room – the physical room, the virtual room – but the three elephants are what we're going to talk about. And this is what you've widely written about in your three books.
[00:01:45] And a lot of your work is centred around these three elephants. So why don't you start off by telling us what the three elephants are?
[00:01:54] Paul Matthews: Well, there's actually four of them, because there's a fourth sneaky one. So, I started writing my book on learning transfer, which is actually my third book. And I started using the term "the elephant in the room" for learning transfer, because it's something that a lot of learning and development people are aware of but almost studiously ignore
[00:02:16] when they're going about their learning and development initiatives. And that obviously got me thinking, you know, are there any other elephants lurking in the same room? And I realised there are a couple more, and strangely enough, they aligned very well with the first two books I wrote. So I ended up talking about the three elephants as the three books that I wrote.
[00:02:38] And then, as I've said, there's a fourth sneaky one, and I'm calling it sneaky because I don't want to write a book on it.
[00:02:47] Nelson Sivalingam: Well, not unless we can force you to write another one. Let's take it from, I guess, the order in which the elephants entered the room.
[00:02:56] Paul Matthews: It was the learning transfer elephant; even though that was my third book, it was the first one that I sort of noticed or started describing that way.
[00:03:05] By the way, in Brazil they have an expression, "the goat under the table", which means the same thing.
[00:03:14] Nelson Sivalingam: Let's pick up learning transfer as the first one, then. Firstly, why do you think it is an elephant in the room? Why is it being ignored and not talked about as much?
[00:03:27] Paul Matthews: Well, it's an elephant because it's got a massive impact on what anybody does in learning and development. And so it's one of those things that you really can't ignore, but people do, because it's typically always been in the background. People haven't ever paid much attention to it, so it's become habitual not to. And it's also because it can be a little difficult making a start.
[00:03:54] Well, where do I start? How do I deal with this? What do I do with this elephant? So even when someone says, "Look, look – elephant over there," they say, "Oh yeah, yeah, yeah. I mean, you know, we kind of know about that." And then they go back to talking about bums on seats, or rolling it out to the international audience, and they just disappear back into content delivery, which is mostly where they sit,
[00:04:17] with little thought of how that content is going to make a difference in the real world, out there in a practical world, in terms of how people do their job.
[00:04:29] Nelson Sivalingam: And just on that point, how would you start off with defining learning transfer?
[00:04:35] Paul Matthews: There's a lot of different definitions about. I've read a lot of different papers on it, but if you pull them all together, it's really:
[00:04:44] make sure that people do something after a training course with what they learned on the training course. So it's about operationalising what they've learned. And of course, if they don't make that learning active in some way and utilise it effectively in their work and their job, then the learning was kind of wasted,
[00:05:04] if it's not going to get used. So that's really what it is. It's not just about the transfer itself from the point of training, whether that's a physical or virtual classroom or e-learning or whatever – and we're talking about formal learning interventions here. So there's that point of training or learning or whatever it is.
[00:05:26] And then there's much more than just moving that from A to B. It's moving it from A to B and then doing something with it. And it's the doing something with it that most people fall down on. I think there are some systems out there which focus on making sure that people memorise stuff, but they're still kind of useless if they do nothing with what they've memorised, even if they've,
[00:05:46] you know, memorised it because they keep getting bombarded with text messages or this or that or the other. Well, who cares, if you're not going to use it? It still doesn't matter.
[00:05:57] Nelson Sivalingam: Do you think this is due to the challenges of implementing learning transfer, from the perspective of creating opportunities where learners can apply what they've learned in a relevant context?
[00:06:09] Or do you think this is more a case of the data infrastructure for measuring the impact of learning transfer not being there, and therefore they're not able to see what the benefits would be of driving learning transfer? Where do you think it sits?
[00:06:24] Paul Matthews: I think it's probably a bit of both.
[00:06:26] I think the biggest problem is there's no central point of accountability for training courses having results, right? Because the trainer says, "Well, I'm responsible for pouring this stuff into their heads" – and I'm being a bit flippant there, but that's kind of a jug-and-mug approach. That's what the trainer is doing.
[00:06:44] The manager says, "Well, I sent them on a training course. They should come back fixed and able to do stuff." The other stakeholders are equally at a distance. So there's no one place where all of those different responsibilities are collated together, and no one's held responsible for a training course having any concrete result in terms of what happens.
[00:07:08] Because the manager says, "Well, I sent them on a training course. If they still can't do the job, it's not my fault if they can't get trained." And the trainer says, "Well, I gave them what we were supposed to give them and they've gone back to their work. My job is done," you know? So I think that's one of the reasons.
[00:07:25] David Wilson calls it the conspiracy of convenience. Everybody says, "Well, I've done my bit, and that's all I'm really wanting to get involved with." So I think that's one reason: that central point of accountability is not there. The other thing is that people are unsure where to start. You mentioned giving them a chance to practise –
[00:07:44] well, that's fine, but there's also: what are all the motivational issues going on for the learner? Do they want to learn? Can they be bothered learning? Is there something in it for them? So there's the mindset of the learner. Are they ready to learn? Do they think they can? Do they think they've got enough support?
[00:08:00] Can they be bothered, and all the rest of it? And then there's all of the design of the programme, to set it up so that there are pieces in place and opportunities for learning. So there's a roadmap, there's a plan, there's a journey that people are going to go on, because by and large you're wanting behaviour change out of training.
[00:08:22] And of course, behaviour change doesn't happen in an event, unless you get some incredible road-to-Damascus conversion. You will need a sequence of things happening over time in order for behaviour to change, and certainly to change effectively. And then the other thing is all the other stakeholders around them – their manager is going to be involved, their peers; there's all sorts of other people whose input is necessary in order for learning transfer to go well.
[00:08:55] Nelson Sivalingam: Do you have examples where you've seen learning transfer done well? And what are the tangible benefits you've seen when it has been done well – when, like you said, the right kind of accountability is in place and it's been factored into the actual learning experience design?
[00:09:13] Any examples?
[00:09:15] Paul Matthews: I've not seen it done really well. I've seen people do bits of those things I've mentioned, and whenever they do, they always get better results. But there's more that can be done. So I don't think I've ever come across anybody who's doing it really, really well and pulling all the levers that need to be pulled.
[00:09:37] Nelson Sivalingam: So, for someone working in an L&D team at the moment, listening to this podcast, what's the first step they can take in the direction of effective knowledge transfer? What would you recommend they do first?
[00:09:53] Paul Matthews: I'll challenge your phrase "knowledge transfer" to start with, because that's not what we're talking about here.
[00:09:57] Knowledge transfer is much more of an educational thing: we're talking about transferring knowledge to someone who then maybe wants to pass an exam. What we're doing here is different. And even "learning transfer" is not an ideal term, but it's kind of what we're stuck with, given what the industry uses.
[00:10:14] So those aren't to be mixed up, I would hasten to add. But in terms of a first step – I suppose it's very self-serving, but read my book on learning transfer. It references three or four other books that are companion reading to it, which I would also recommend people go to. And there's a few videos on my website and stuff like that, but there are not that many people out there really focusing hard on learning transfer,
[00:10:44] which I find staggering. My background is engineering, and you kind of want results out the other end. If you designed a bit of machinery, you don't want it to break – that's a failure. And yet so many people go through formal training or formal learning and effectively it's a failure, because they do so little,
[00:11:03] if anything, with it afterwards.
[00:11:05] Nelson Sivalingam: Yeah. So let's move on to the next elephant in the room. Which one do you want to go to – informal learning or capability?
[00:11:16] Paul Matthews: Well, maybe the capability one, which is my second book – so we're kind of working backwards here. That's much more about the diagnostics process when someone does come and ask for training,
[00:11:29] which is a very common occurrence. The manager will rock up to the training and development or learning and development department and say, "I want two days of negotiation skills training for my call centre team," or whatever – and that's a real example I've actually come across. What they need is a way to respond to that without just taking the order and delivering that training, without any thought as to why it's this training and not some other training, or whether training is even required, and all of that.
[00:12:00] So I think there's two different steps, and I think a lot of people get them mixed up. The first step really is performance consultancy, or performance diagnostics, which is sitting down and saying: OK, these people are not doing things the way we would prefer them to be doing them. So why not? What's going on?
[00:12:18] What are the barriers? What are the causes? What's underlying this behaviour that we would prefer to be different? That's performance diagnostics, or performance consultancy. A lot of L&D people think they do that, but they don't, because they go into that conversation assuming there's going to be a learning initiative or output from it.
[00:12:42] And so they end up doing what I call learning consultancy, which is assuming there's going to be a learning solution to whatever problem gets presented to them. That first step of performance consultancy is required first, where you're not presupposing there's a learning solution. And you might end up doing that and finding they already know what they need to know – training
[00:13:03] isn't going to help. So in that case, there's no point in training them.
[00:13:10] Nelson Sivalingam: Yeah, and I can relate to this. It reminds me a lot of Steve Blank's customer development methodology for building products – the idea that you don't start with your solution, you start with the user and the problem. It's a methodology for effective user interviews, to actually understand what problem you're solving and then ideate around the solution.
[00:13:34] And I can definitely see, from my experience of the L&D world, that sometimes we start from the format first. From the get-go, you know it's going to be classroom training, and now, within that box, you're trying to solve the user's problems. Do you find the same?
[00:13:52] Is it the fact that we're stuck with the format first, trying to fit the solution within the format?
[00:13:57] Paul Matthews: It's very, very common. It doesn't necessarily mean starting with classroom training; what it means is starting with learning solutions. But it's the Abraham Maslow quote: if someone has a hammer, they assume everything looks like a nail.
[00:14:12] And that's why learning and development people are often quite poor at going into that performance consultancy conversation – because they go in with a very big hammer called learning and development initiatives. And they might end up coming out the other end with action learning or coaching or whatever.
[00:14:31] But it might have been a process improvement thing that needed to happen. That call centre I mentioned earlier, as an example – two days of negotiation skills training. After going through a proper consultancy process, they actually ended up changing the script, and that was everything they wanted to do.
[00:14:48] They didn't need negotiation skills training. Never did. That was the bee in the bonnet of the manager, so that was what he wanted to do. So what they want and what they need are two different things, very often, and what they come to L&D with is "Well, I want this" – but it's not what they need. So I think the elephant in the room, basically, is having that conversation with the person requesting
[00:15:12] what they want, and helping make sure that they have visibility of what they truly need. And then, in all likelihood, they will say, "Oh, that thing I used to want – I don't want that anymore. Now that I can see what I need, I want that thing that I need." And then you've got a chance of delivering something that will actually have some impact.
[00:15:32] Nelson Sivalingam: So what do you think L&D teams can do, from the perspective of their own learning and skills development, to become better performance consultants?
[00:15:42] Paul Matthews: Read my book. There are a number of books out there in this kind of area, but I think it's just important they realise they've got to go into that conversation not thinking about learning and development.
[00:16:04] They've got to go in with a very open mind, thinking: OK, we've got this bunch of people, they're not doing what we want them to do, so what do we want them to do instead? What's the behaviour gap? And then another powerful question there is: well, if we don't do anything about this for six months, what's it going to cost the organisation – 10,000, a hundred thousand, a million, 10 million?
[00:16:24] I don't know. And the chances are, A, they won't be able to tell you what behaviours they want instead of the current ones – they won't be able to define that very well at all – and, B, they won't be able to give you an idea of what it's going to cost. So then the question is: well, hang on, you're not ready to have this conversation yet,
[00:16:41] Mr Manager. You need to really think about these things, because I need to know what the value is in doing something for you, as opposed to some other manager who's got a bigger, more pressing problem that's going to be worth more or cost more to the company. And also, until you can tell me what behaviours you want out the other end,
[00:16:59] there's not much I can do. Once we've got some idea of that behaviour gap, then we can start the performance consultancy piece, saying: given we want these new behaviours, what's stopping them occurring right now? So that's the diagnostics process of why they aren't doing that behaviour right now.
[00:17:21] There's a photograph I've got of a donkey on a cart, and it's been lifted up in the traces of the cart, so it's actually lifted off the ground. Someone's overloaded the cart back-end heavy, so the donkey's got its feet off the ground. Now, what you might say is that donkey's not doing its job – we need to send it to donkey school.
[00:17:42] You know, it needs training on how to pull a cart. Well, actually, donkey school isn't going to help that donkey. There's a manager that's misbehaving there. We need to fix the manager and teach the manager how to load the cart, and then the donkey will be fine. So it's a matter of diagnostically digging into the barriers that are causing the lack of behaviour change, or causing the absence of the behaviours that you're seeking.
[00:18:09] So that's why you need to do what I call a behavioural needs analysis, or a task analysis, to actually start driving out: what's the set of behaviours that we want? What are the barriers to those behaviours? How can we trigger them? How can we promote them? And typically 70, 80% of the things that you can do have nothing to do with learning and development – they'll be things like process improvement, improving IT, tweaking a software system, making sure the right information is on the file server in the right place. All of those kinds of things
[00:18:40] are usually much more relevant to fixing those kinds of performance problems. Now, if you go through that process and determine that, yes, some kind of skills or knowledge is required, then you are potentially into a learning and development scenario. Then you can start doing learning consultancy, after you've done the performance consultancy,
[00:19:05] if you've proven that there is potential for learning to be part of the solution overall.
[00:19:13] Nelson Sivalingam: I think, from what you just said, I can see why the sneaky fourth elephant came in – I can see where the idea of the rebrand element comes from.
[00:19:30] Not just L&D changing its approach on purpose, but also the entire business changing its perception and understanding. When you're talking about the manager really being clear on the needs and the behaviour changes they want, we're almost re-evaluating what their relationship is with the L&D team.
[00:19:53] Anyway, I'm jumping ahead. We can come to the sneaky fourth elephant, but let's go to the third one, which is informal learning. From my experience of working with various organisations, I still find informal learning is an under-utilised approach or methodology – we still see a lot of top-down, formal, structured learning. But it'd be great to first get your definition of informal learning,
[00:20:25] and kind of take it from there.
[00:20:27] Paul Matthews: Well, the cop-out is saying informal is anything that's not formal. And I kind of stick with that, because formal learning is much easier to define: it's scheduled, it's planned in some way. You're saying this event is a learning event, or learning is a primary output from whatever, or we're doing some e-learning or whatever.
[00:20:48] So, anything else other than that. And the research – the numbers, all those various things – shows that a huge proportion of what people learn in order to do their job comes from outside of the formal learning track. In other words, informal or non-formal or whatever you want to call it. People bandy about various percentages and all sorts of things.
[00:21:09] I don't really care much. I just know that a huge proportion – I dunno, 70, 80, 90, sometimes a hundred per cent, depending on people's time and job and so on – comes from what they do during the day: how they observe others, the discussions they have, the things they learn, the bits they look up on YouTube on an ad hoc basis, all of that sort of stuff.
[00:21:33] And people get very angsty about, well, is this formal or is this informal? It doesn't matter. It's just learning. It's just helping people do their job. But what's really important is to recognise that the vast majority of learning that people do in order to do their job happens on the job, while they're doing their work, in the flow of work.
[00:21:53] And so, consequently, learning and development needs to be paying attention to that. I think that's the most important thing. And then there are ways to utilise it, to harness the power of it, because it's a very powerful form of learning. It's the way we've always learned – long before we had any kind of formal learning, informal was all there was.
[00:22:15] And unless you're talking about a family of cave people sitting around their fire and telling stories – you might consider that formal schooling back then – basically, the formal is very new to us. So as a structure, it's not what our neurology is built for, you know?
[00:22:35] Nelson Sivalingam: Yeah. And like you said, Paul, every organisation has a learning culture,
[00:22:39] right? Good or bad, progressive or non-progressive, every organisation has a learning culture. And like you said, people are already doing this. We're already sharing knowledge with each other on tools like Slack and Teams, and we're probably reading blogs, listening to podcasts and doing all of the informal learning.
[00:22:57] I guess the challenge for L&D teams is, like you said, how do you harness the power of informal learning to help you drive real impact, whether that's business performance or behavioural change? So how would you recommend L&D teams do that? How do they harness that power to drive business goals?
[00:23:21] Paul Matthews: [00:23:21] There’s a few ways to do that. One is to be aware that informal learning is happening to go out there and find out survey people and saying, well, If this happens, where do you go to find out that’s the question? You know, do you go to a colleague, do you do this to, you know, what are the resources and then to try and smooth that path for them to make sure that the information is more readily available.
[00:23:45] It’s fewer clicks away. It’s more readily accessible. The Intranet’s better. You know, and then publish other places. They can look for similar information and so on. So it’s about, so that’s one side of it. The other thing is to start [00:24:00] utilising informal learning in a purposeful way. So rather than grease the wheels of what’s already happening, you can say, well, actually we need people to learn about this.
[00:24:10] Uh, so here’s a specific output that we want from a learning perspective. So, what we can do is give people some tasks to go and do and send them away to do them. And over there, out of our sight, they are learning informally while they’re doing those tasks. And then we bring them back to debrief them on those tasks.
[00:24:30] So it’s kind of like tasking them from like a coach might do to a coachee. And what you’re starting to do is you’re bridging that gap between formal and informal by sort of a halfway house and saying. Here’s some tasks to do, go and do them and then come back. Um, the problem when a lot of people get involved with informal learning is they try and manage it too tightly.
[00:24:56] And in doing so they throttld it, it’s a bit like holding a butterfly in your [00:25:00] hands. If you hold it too tight, it’s not good for the butterfly. And if you hold it too loose, of course, it just flies away. So you’ve got that, that happy middle ground. Um, and, and I think that’s important with informal learning is you can’t hold it too tight.
[00:25:15] But what you can do is start directing little bits of it in specific areas of learning. And that’s what I mean by this tasking, send them a way to go and do something. And while they’re over there, they will be learning informally while they’re doing it. But you have no idea what they’re learning or how fast they’re doing it, or even learning anything until you bring them back.
[00:25:35] So it's a bit like: let go of control while they go and do it, and then reassert control when you bring them back to debrief it. So that's one way you can start harnessing informal learning in a formal way, if you want to put it that way.
[00:25:47] Nelson Sivalingam: [00:25:47] And that seems like it requires a culture shift, because quite often in the conversations we have, what we find is it's quite different to the way they might be doing learning now, which is predominantly [00:26:00] formal, quite top down and quite one-size-fits-all.
[00:26:03] And therefore, to move to an environment where you don't necessarily have that control over what someone is looking at, you are not tracking everything they read and watched and listened to, and you don't have that kind of granular data. And also you're bringing in knowledge sharing and learning from each other.
[00:26:24] There's this whole idea of: do we approve everything that's shared within our learning platform? So there's this idea of micromanaging and control that seems to require a bit of a culture shift. How do you think L&D teams, or organisations in general, can bring about that culture shift?
[00:26:41] Paul Matthews: [00:26:41] Um, well, I think a lot of that data you're talking about is valuable, but not that valuable, because just because people read something doesn't mean they're going to do anything with it. And they might've actually opened up that document, then gone and got a coffee, and never spent that much time with it at all.
[00:26:57] So I think it's really important [00:27:00] to start just getting an idea of what we're trying to do, and that comes back to that behavioural needs analysis again. What behaviours do we want? How do we trigger the change in people so they start doing those new behaviours? That's almost certainly going to be them experimenting and practising some skills in order to build them up into behaviours.
[00:27:21] Um, which inevitably means you're going to end up needing some kind of workflow solution. And I don't mean 'in the workflow'. I mean workflow in the true sense: an orchestrated sequence of tasks done sequentially to get people from where they are to where you need them to be. So it's a bit like the turn-by-turn instructions on your sat nav: once the blue line's on your screen, you know you're going to get to Edinburgh.
[00:27:43] If you put Edinburgh in as a destination, the turn-by-turn instructions have been designed, and if you follow them, you will get there. So if you have your behavioural needs analysis and you know the set of behaviours you want out the other end of some kind of learning [00:28:00] initiative, or even if it isn't a learning initiative, you know the sequence of things that are going to have to happen for people to be able to start exhibiting those behaviours, because you've removed the barriers.
[00:28:11] So what are all the step-by-step things that need to happen? Um, and so if you can design that step-by-step set of tasks that will get people behaving the way you want them to behave, and they follow that set of tasks, then you will get the result and behaviour change that you want. So that's where you've got to start from the behaviours and work backwards, and behaviour change requires a workflow solution, not a learning solution.
[00:28:37] Although learning might be part of one or two of the steps, there'll be far more other steps in terms of experimentation, practice, collaborative things and all sorts of other stuff. And informal learning will be part of that. So informal learning is a core part of learning transfer.
[00:28:55] It's a core part of people achieving behaviour change, because most of what people [00:29:00] learn in order to change their behaviour happens when they're doing stuff on their own or, you know, with colleagues. It's not when there's a trainer in the room. Um, so you're back to starting with that behaviour change and then working backwards.
[00:29:13] Whereas most L&D people start with, well, we'd better do some management training. So what they do is ask, what do managers need to know? Here's the content list. And then they develop a set of content. But it's completely the wrong approach. It's backwards.
[00:29:25] Nelson Sivalingam: [00:29:25] And so let's come back to the sneaky fourth elephant.
[00:29:37] Paul Matthews: [00:29:37] A lot of the stuff I've been talking about is a little bit different, and most people I talk to are saying, well, that makes sense, yeah, we should be doing that. But they don't. And one of the reasons they don't is they actually don't have a high level of credibility in the organisation.
[00:29:53] So they have difficulty having a conversation saying, we need to change the way we do things [00:30:00] now. And so it's the credibility of L&D that will have a big impact on how well they're able to serve the organisation effectively. And so the sneaky fourth elephant, to me, is the brand and reputation of L&D, which are slightly different things, but they're correlated.
[00:30:19] And in most organisations, it's not very good. It's not that it's bad necessarily, but it doesn't help them do what they need to do. And it certainly doesn't help them promote changes in the way learning is happening, or promote changes to the learning culture, because, as you said earlier, every organisation has one.
[00:30:40] So, you know, how can they start focusing on changing that culture? That requires change, and the people who can generate and trigger change best are those with good credibility. So that's something L&D needs.
[00:30:56] Nelson Sivalingam: [00:30:56] And so for someone listening to us now, [00:31:00] what are the immediate things they can do to essentially give L&D a rebrand, or to create a learning brand within their organisation?
[00:31:09] Paul Matthews: [00:31:09] Well, it's probably more than a brand; it's more than the logo. So there are a few steps that need to be taken, somewhat iteratively. One is to sit down and do a value proposition for L&D, just like a marketing department would do for its products: what's the value proposition for product X or product Y? L&D needs to sit down and say, who's our audience and what is our value proposition for that audience?
[00:31:34] And there might be several different audience segments. And that's just a standard business process; there are lots of different value proposition tools and models out there. So that's what they need to do to start with. And then once they've figured that out, they can say, okay, given that and our positioning in the organisation, what are our mission, vision and purpose in L&D, and how do [00:32:00] we link them to the greater vision, mission and purpose of the organisation?
[00:32:04] So in a sense, our job is to ensure people are competent to step up to the line and execute the corporate strategy, whatever that is. We've got to assume that's right; it's been handed down from on high. If it's wrong, we're all kind of in the wrong boat, but let's assume
[00:32:21] it's right. So how do we, as learning and development, ensure that we have people competent to step up in the right kind of timeframes to execute that corporate strategy? So that's kind of a roundabout answer, but first do the value proposition work. Then you've got to do your vision, mission, purpose and so on for L&D, and then work out a strategy for how you're going to deliver that.
[00:32:47] And obviously all of that's got to be aligned with, and plugged into, the greater things that are going on at a business level. So you've got to talk to the business and all that. And then you can start saying, [00:33:00] we now understand who we are; this is what we're going to present to the organisation. So your L&D strategy and the vision, mission, purpose kind of documents or statements might well have two different versions: one for internal use in L&D,
[00:33:13] And then one for publishing out to the organisation, saying: this is who we are, this is what we will do for you, and this is how we'll do it. And then you've got to make sure all the touch points you have with the organisation actually fulfil the promise in that brand, because a brand is effectively a promise.
[00:33:30] A brand says, we will do this for you. When you see a brand, you know what the promise is; you know what to expect from that brand, from that logo. So your brand is your promise, but your reputation is based on the touch points that you have. And of course, if the touch points don't fulfil the promise, then you've got a disconnect between your reputation and your brand, and then things start going down the toilet kind of fast.
[00:33:54] Nelson Sivalingam: [00:33:54] And do you find, as part of this rebranding exercise, that L&D is [00:34:00] restructuring? The reason I ask is that with some of the hyper-growth new tech unicorn companies we work with, we see they're starting to build performance enablement or performance support teams into each function.
[00:34:15] So in the same way you might have DevOps, you're now starting to get more of a performance function: how do we make sure our engineers have the skills, knowledge and behaviours they need to be able to deliver what the business needs? The same thing for sales: you've always had kind of a sales enablement function, but it's moving more towards
[00:34:33] looking after that kind of performance support and learning piece for the sales function. Do you see a restructuring happening as part of this rebranding?
[00:34:43] Paul Matthews: [00:34:43] It might, or it might not. Um, but what you want to do is not just put a veneer on something that's broken. Clearly, if it's broken, you know, it's the lipstick on a pig type of thing.
[00:34:53] That's not going to work terribly well. So you need to make sure a structure [00:35:00] is in place that is able to deliver on the strategy you have decided is the appropriate one going forward. And you may find that your current structure can do that, in which case that's fine. If it ain't broke, don't necessarily fix it.
[00:35:17] So, um, yeah, but often it does mean some kind of restructure, and typically in larger organisations it does. And often it's when there's an HR VP type of structure, because there you're getting HR business partners who aren't necessarily L&D trained, and they're not having the right conversations at the right point; they're not having those performance consultancy conversations, for example.
[00:35:45] So what you want around learning and development is almost like a filter, so that requests that come in go through that performance consultancy filter. They're not just automatically ingested [00:36:00] into the cell that is L&D, with some output created, because then you end up doing a lot of work that's completely wasted.
[00:36:08] Nelson Sivalingam: [00:36:08] Um, and now that we've spoken about the elephants in the room, we're going to move on to the final part of the show, which is my favourite part: the quickfire questions. Essentially, I get to throw a bunch of questions at you, and hopefully, you know, you can give a fast and short answer, but if you do want to elaborate on anything, feel free to do so.
[00:36:29] Uh, so are you ready for the quick fire round?
[00:36:31] Paul Matthews: [00:36:31] Um, no.
[00:36:34] Nelson Sivalingam: [00:36:34] Alright, let’s go for it. So one thing L&D teams can do today to improve learning and performance at their company.
[00:36:45] Paul Matthews: [00:36:45] Think about the elephants in the room, because each of those elephants, all four of them, should have lines and mentions in your L&D strategy.
[00:36:54] So revisit your strategy and make sure the elephants are catered for in your annual strategy.
[00:37:00] [00:36:59] Nelson Sivalingam: [00:36:59] And you're obviously an incredible public speaker. You know, we've watched your videos, I've spoken to people who've been to your talks, and it comes very naturally to you. So what is a top tip you would give to someone who wants to improve their public speaking and be more engaging as a public speaker?
[00:37:24] Paul Matthews: [00:37:24] Um, practice. Do it often. I mean, I didn't use to be. I used to hate it; I was scared of it, quite frankly. Well, not a phobia, but close to it. I really didn't like getting up, and it was just years and years of getting up in front of slightly bigger crowds, and I've done crowds of thousands now,
[00:37:40] and people say, aren't you nervous? And I say, well, no, that's all gone. Um, and sometimes I almost have to get myself fired up to go out there, to generate some energy. So I think really it's just practice and getting feedback. There's a heap of stuff online [00:38:00] to help, and if you really want to see great speakers, there are various speakers' bureaus, you know, groupings like that, where you can start watching other people do it.
[00:38:09] Nelson Sivalingam: [00:38:09] Well, uh, I'm going off schedule here, Paul, but I know you've spoken about the power of storytelling, and you obviously do that really well in your speaking events. That art of building the story: beyond practice, what's helped you develop that skill of storytelling?
[00:38:31] Paul Matthews: [00:38:31] Um, I do remember joining a little group of people.
[00:38:35] We got together once every couple of months and told each other stories, and effectively what you can do is practise a story until it's just kind of by rote, letter perfect. You know it off by heart and you can tweak it a little bit. And if you've got a dozen different stories that make different points, you can string them together in different ways.
[00:38:55] And so you get this kind of repertoire. Um, [00:39:00] so now, you know, I can just do it off that repertoire and speak for hours, literally. But that just comes with the practice. It's about doing it a lot, even if who you're doing it to is your guinea pig or your dog, you know. It's all about practice.
[00:39:16] Nelson Sivalingam: [00:39:16] Which is something you talk about a lot, actually, and which we didn't get a chance to speak about; maybe we do a part two on learning by doing, which is another area you've spoken about that we didn't touch on. But the next quickfire for you is: what's the one skill that you'd like to develop, and why?
[00:39:36] Paul Matthews: [00:39:36] Oh,
that's a really interesting one. I'm very intellectual; I probably need to get more in touch with my feelings. Okay, if you want a personal answer to that one.
[00:39:51] Nelson Sivalingam: [00:39:51] Yeah. Um, next one is: what's one wise thing you've learned from a colleague or someone you've worked with?
[00:40:01] Paul Matthews: [00:40:01] Um, I think, from my business perspective, someone told me a long time ago: every business is a marketing business; it's just that they happen to have a product or service.
Yeah, that was a little bit of wisdom that I've ignored too often and sometimes remembered. Um, but yeah.
[00:40:20] Nelson Sivalingam: [00:40:20] Um, what’s the one book that’s not written by you, that you would recommend to someone who’s working within the people function. So HR and L&D.
[00:40:31] Paul Matthews: [00:40:31] Um, I'd probably get hold of Dr. Ina Weinbauer-Heidel's book on learning transfer.
[00:40:37] Um, I mentioned it in my book. She's developed a 12 levers model of learning transfer as a result of doing a PhD on it. So she's read pretty much all the literature there is in the world on learning transfer, pulled it all together and collated it, and ended up developing this 12 levers model.
[00:40:56] Uh, so I’d highly recommend her and her [00:41:00] work, uh, on learning transfer.
[00:41:02] Nelson Sivalingam: [00:41:02] Yeah. Brilliant, brilliant book and brilliant model. And so what's the L&D trend that you think will, you know, pick up in the next year or two and really start to take off?
[00:41:15] Paul Matthews: [00:41:15] Huh. In a sense, I don’t care. A lot of people focus on the future.
[00:41:21] I think there's plenty we can do right now. It's the old 80/20 rule, and if you keep chasing the bright, shiny new objects, you end up not doing the basics. So I tend to avoid chasing them. I just want the really basic stuff. I'm a simple farm boy from New Zealand; I want to do stuff that works. And just because something is trending or new doesn't mean it matters. There are so many things that have trended.
[00:41:45] Remember things like Second Life? I mean, there's been so much different stuff that's come in, and people are not even doing the basics right. And I think that's important. I played basketball in college and university, and [00:42:00] a very high-level professional basketball coach from the US was visiting New Zealand.
[00:42:07] And he just happened to visit our college and said, you know, I'll come and do a training session for you. So we said, this is great. And so we were all there and he said, okay, what should we do? And several of my colleagues said, well, you know, we do this fancy bit, we want to do these plays.
[00:42:25] And he said, no, let's start with basic dribbling. And he took us through some really basic skills, and this is one of the top college basketball coaches in the US. I find the same thing in learning and development: just go back to the basics, because they're often just not there with enough strength.
[00:42:46] And so trying to chase the bright, shiny new fashion is just not the right thing to be doing.
[00:42:52] Nelson Sivalingam: [00:42:52] And so let me flip that question: do you see any kind of lasting [00:43:00] change as a result of COVID and the way we're now working? Obviously, organisations have been forced to
[00:43:10] reevaluate the way they were doing learning programs, whether that's in the classroom or going online; maybe they're reevaluating the way they're going to upskill and reskill their people. But do you see any of this being real lasting change, or do you see it as temporary, because of the way we've been forced to work now, and once we go back to any kind of normality, people will go back to old habits? How do you see that?
[00:43:34] Paul Matthews: [00:43:34] Well, I think there's a lasting change in terms of the remote nature of a lot of delivery. It's a bit like Kurt Lewin's model: you've got to unfreeze stuff before you can change it and then freeze it again. And what's happened is learning and development got unfrozen real fast around last March, when suddenly what was happening was no longer possible.
[00:43:56] And also that meant that the senior management [00:44:00] teams couldn't just say, put them in the classroom, get on with it and do it, because they now couldn't say that. And they didn't know what else to say. So they were actually in a position of relying on L&D to come up with alternatives, and that's meant that L&D inherited a whole lot of power as a result of COVID. Sadly, very few L&D departments have really taken up that power of suggestion.
[00:44:24] They've got the chance to suggest new ways of developing people; sadly, not many have picked up on that. Yes, they've gone virtual, but they've sort of done it under duress in many cases. What I certainly have seen is people converting what used to be face-to-face into online and saying, oh my God, this actually works better.
[00:44:43] We've got clients who have done that on our platform and are saying, well, actually, we should have done this years ago. Dammit, why didn't we? But of course that wasn't where their head was at. So I think the number of hours of training done [00:45:00] in the coming years will be less, a lot less, than it's been.
[00:45:03] I think it'll drop to below 50%. There'll be, um, there'll be a snapback; people will say, we can get back in the classroom, yay. I don't think that'll happen for a while, because, I mean, what are you doing with the people who are not vaccinated, or whatever? So it's going to be potentially years before we actually get to any kind of place where we can reliably go back to the classroom in the same way that we were.
[00:45:25] And in fact, talking with people in countries that have had minimal lockdown (I'm a Kiwi originally, so talking with people in New Zealand), they're actually saying that in some ways they're back doing training as normal, and they're saying, we're going to suffer long-term from that, because we're not developing the online skills that the rest of the world is.
[00:45:46] And that means the rest of the world can come hunting in our backyard with online courses, and we won't be able to compete. So there are some interesting other views around what's right and what's not. So I think there'll be a lot less training. [00:46:00] I think people are getting much more used to the idea of self-directed learning, of learning stuff for themselves when they need it; this is this whole informal learning thing.
[00:46:10] So that’s going to make a big change. Um, so yes, there’s a lot of different stuff happening and who knows where those changes will end up longer term. Right. You know, I dunno.
[00:46:21] Nelson Sivalingam: [00:46:21] Yeah. And my last one on the quickfire round, Paul: what's the one skill you think is important for L&D to start developing?
[00:46:35] Paul Matthews: [00:46:35] Consultancy skills.
[00:46:37] I think people often talk about data, but I think we have enough data skills to kind of skim the top of it and get a sense of what’s going on. But there’s so much more basic stuff that we can do by talking to the business in a way that the business can relate to and doing that performance consultancy and all of that upfront.
[00:47:00] [00:46:59] Uh, that will sidestep a huge amount of stuff that we're doing right now that we don't need to do and shouldn't be doing. When we get that right, then we can start getting into the measurement a lot more, because if you're just measuring rubbish, you end up saying, well, that's not working very well, but it still doesn't help you do anything better.
[00:47:19] It's not that I'm against data. I'm just thinking there are other things that L&D needs to focus on right now, given where it's at, that are lower-hanging fruit, let's say, than the data.
[00:47:32] Nelson Sivalingam: [00:47:32] Yeah. Cool. It's been an absolute pleasure to have you on the show. And like I said, I could carry on going, and I could probably book you in for a few more shows, but it's been absolutely amazing.
[00:47:43] There’s so much to unpack there, but thank you very much for your time. And thanks once again for coming on the show.
[00:47:49] Paul Matthews: [00:47:49] Oh, you're very welcome. No, it's been great having a chat.
[00:47:53] Nelson Sivalingam: [00:47:53] And there you have it. Our excellent elephant-based chat with Paul Matthews comes to an end. If you'd like to know more about [00:48:00] Paul, his books and anything else, check out the links in the show notes.
[00:48:03] And if you want to share any feedback and ask any questions about the episode, you can find the relevant social channels and links in the show notes too. If you’ve enjoyed this episode, please do think about subscribing, sharing and leaving us a review or telling a friend. It goes a long way in helping the show grow.
Overview: Based on reviews from learning platform users, Sana Learn is praised for its intuitive interface, easy adoption, engaging interactive content, and AI-powered tools that can speed up content creation and discovery. Customers consistently highlight smooth onboarding, responsive support, and useful integrations with email, calendar, and collaboration tools. However, recurring limitations emerge around content flexibility, AI accuracy, occasional technical glitches, UI quirks, and gaps in admin training, which can create friction as teams scale their learning programs. While Sana Learn works well for organisations seeking fast rollout and straightforward learner engagement, teams needing more customization, reliable AI, and robust integrations may want to explore alternative platforms and see how they compare in practice.
–
When you're evaluating learning platforms, everyone has an opinion. Vendors have feature pages. Review sites have listicles. And everyone claims to be the best AI-powered LMS on the market.
What nobody tells you is what it's actually like six months in.
Sana Learn (part of Sana Labs), an AI company founded in 2016 in Stockholm, will likely show up early in your research. It's well-funded, well-marketed and has built a genuine reputation in the AI learning space.
But reputation and reality don't always match. And the people best placed to tell you the difference aren't the sales team. They're the L&D leaders, admins and learners who use it every day.
To help you, we've analysed 50+ real customer reviews so you don't have to. Not to cherry-pick the bad bits, but to find the patterns that will help you make informed decisions. The things that come up again and again once the implementation is done and the day-to-day reality sets in.
Because when you're making a buying decision that affects your entire workforce, what matters isn't which platform has the best copy or demo. It's which one will help you engage your workforce and proactively build the skills your business needs to grow.
Where Sana Learn does well
One thing becomes clear when you read through the customer reviews: Sana Learn is easy to like.
Users consistently describe the platform as intuitive, clean and simple to pick up. There's very little friction in getting started, which, if you've ever tried rolling out a new learning platform to a sceptical workforce, you'll know is no small thing.
That ease extends to implementation. Several reviewers highlight how seamless the setup felt, with teams barely noticing the transition. For organisations without the time or appetite for a heavy rollout, that's a meaningful advantage.
AI is another area where Sana Learn gets genuine praise. Users point to how quickly they can generate content, surface answers and navigate learning materials with AI woven throughout the experience. When it works, it removes friction from the content creation process in a way that L&D teams with limited resources will appreciate.
The learning experience itself also lands well. Interactive modules, clickable elements and embedded content make it easier to engage with topics that would otherwise feel dry. Learners aren't just clicking through slides; they're actually interacting with material.
Put simply: Sana Learn is a platform that's genuinely easy to adopt and easy to engage with. For teams prioritising simplicity and fast time-to-value, that counts for a lot.
What are the limitations of Sana Learn?
Once you move past first impressions, the reviews become more nuanced; and more useful.
A recurring theme is that while Sana Learn is easy to use, it can feel limiting when you try to do more with it.
Several users point to a lack of flexibility in content creation. Editing options are described as restrictive, with one reviewer putting it plainly:
"Tables are a bit clunky and hard to edit… [there's not] much freedom when it comes to text & layout."
Others mention having to rely on external tools to get the output they actually need:
"Many features are unavailable and have to be done outside of the platform using third-party providers."
For L&D teams trying to scale content production or tailor learning experiences more precisely, that's where friction starts to add up.
There's also a subtle but telling critique around product direction. One reviewer notes that the platform sometimes prioritises:
"attention-grabbing features over more basic feature development."
That's the kind of comment that tends to surface when a platform is evolving quickly; but not always in the direction its users need most.
Is Sana Learn's AI reliable?
AI is one of Sana Learn's biggest selling points; but it's also one of its most inconsistent areas.
While some users are impressed by the speed and convenience, others highlight accuracy issues that slow them down rather than speed them up:
"There are times when the AI doesn't fully grasp what I'm asking for…"
"Sometimes the AI suggestions are not fully accurate, and it takes a bit of time to find the exact content I'm looking for."
That tension shows up across multiple reviews. The capability is there; but it's not always reliable enough to trust without sense-checking.
For L&D teams expecting AI to meaningfully reduce manual effort, that gap matters more than it might first appear.
What do Sana Learn users say about technical performance?
Another pattern across the reviews is the presence of ongoing, low-level technical friction. Not catastrophic failures; but enough to interrupt workflows when they matter most.
Users mention occasional platform freezing, performance lags when handling complex content and integration challenges, particularly around APIs. One reviewer sums it up plainly:
"The platform can be a bit glitchy at times…"
Others call out specific integration issues:
"Had some hiccups with [the] Bamboo integration API."
These aren't universal experiences; but they appear frequently enough to be worth factoring in, particularly for organisations running a broader HR and L&D tech stack where reliable integrations aren't optional.
What do Sana Learn users say about the interface?
Interestingly, even though usability is one of Sana Learn's most praised qualities, there are still consistent complaints about specific interface behaviours; particularly once users move beyond everyday tasks.
For example, one reviewer points out a frustrating content creation issue:
"When I'm creating a comment… and then pop over to another window, the comments I started typing disappear."
Others find the home screen experience overwhelming:
"The interface can appear a little overwhelming with all the videos visible when you enter the homescreen."
There are also mentions of difficulty navigating back to in-progress courses, and issues with live learning environments around audio and visual quality.
None of these are deal-breakers on their own. But together they create a sense of inconsistency; where the platform feels smooth in some moments and frustrating in others. For L&D teams managing large learner populations, those friction points tend to get amplified at scale.
What do Sana Learn users say about the learning experience?
Beyond the platform mechanics, some users point to limitations in how learning content is actually delivered.
Quiz functionality comes up more than once, particularly around rigid structures:
"When making a mistake… you have to click through the whole exam before being able to repeat."
Others mention repetitive questions and a lack of depth in supporting materials:
"Example videos are not very detailed enough."
There's also feedback around pacing; specifically that learners can move through content too quickly without meaningful controls in place to slow them down or check understanding along the way.
None of these are headline issues. But for L&D teams where learning effectiveness is the whole point, they're worth knowing about before you buy.
What do Sana Learn admins say about the platform experience?
While learners tend to find Sana Learn straightforward, the experience for admins and L&D teams is less consistently praised.
Some reviewers highlight a lack of guidance when it comes to more advanced features:
"Need more training on available features."
Others point to documentation that doesn't quite hit the mark:
"Videos are usually very short and articles can be text heavy."
This creates a meaningful disconnect. The platform feels simple on the surface, but getting the most out of it as an admin can require significantly more effort than the initial experience suggests. For L&D teams who need to move fast and can't afford a steep learning curve behind the scenes, that's worth factoring into your decision.
Should you be looking at Sana Learn alternatives?
That depends on what you need.
If your priority is fast rollout, strong initial engagement and a clean, intuitive interface, Sana Learn clearly delivers. For teams that need something up and running quickly with minimal friction, it's a strong option.
But if you're thinking longer term, about scaling learning, tailoring content more precisely and integrating deeply into your wider HR and L&D tech stack, the limitations that surface across these reviews start to matter a great deal more.
The question isn't whether Sana Learn is a good platform. For many organisations, it is. The question is whether it's the right platform for where your organisation is going, not just where it is today.
Is HowNow a good Sana Learn alternative?
HowNow tends to come up for teams that want more than a clean learning interface.
It's built around a different idea: that learning shouldn't sit in a separate platform, disconnected from the way people actually work. It should connect everything together: the content, the skills data, the performance context and the tools your teams already use every day.
In practice, that means bringing learning from multiple sources into one centralised place, linking development directly to skills gaps and business performance, and using AI in a way that supports real workflows rather than just speeding up content generation.
But perhaps most importantly, HowNow is designed to scale with you. Not just easy to start, but built to deliver more as your organisation grows, your needs get more complex and your expectations of what good learning looks like get higher.
Reviews give you a strong starting point, but they won't tell you how a platform fits your specific setup. If you're weighing up Sana Learn against alternatives, the most useful next step is to see them side by side. And if the patterns in these reviews resonate with challenges you're already facing, it might be worth seeing HowNow for yourself.
You're comparing features, pricing, integrations, and user experience. But there's one thing that often gets pushed down the list: security.
It shouldn't be.
Learning platforms sit on a goldmine of sensitive data: employee records, performance data, personal details. If that data is mishandled, the impact isn't just technical. It's reputational, legal, and operational.
So before you get dazzled by a slick demo, it's worth asking the more important questions:
Is this platform safe? And can I trust this vendor?
Why security matters when buying a learning platform
Security conversations are often left until the final stages of evaluation.
By then:
Data has already been shared
Internal stakeholders are invested
Walking away feels expensive
That’s how risky decisions get made.
Instead, bring security into the conversation early.
Loop in your InfoSec, IT and data protection teams from the start so they can review vendors alongside you, rather than playing catch-up at the end (which is what we often see).
It saves time, avoids friction, and builds confidence internally.
What security certifications should an LMS or LXP have?
There are plenty of badges vendors can display.
Not all of them mean the same thing.
When it comes to learning platform security, there are two certifications that actually matter:
ISO 27001:2022 — The Global Standard
ISO 27001 is a globally recognised information security standard.
It’s a risk-based framework that shows a vendor takes security seriously across their organisation (not just in isolated areas).
But this is where many buyers stop too early.
The certificate alone isn’t enough.
Ask for the Statement of Applicability (SoA).
This document shows:
which controls are implemented
how risks are managed
why specific decisions were made
When reviewing it, pay close attention to:
information classification
data leakage prevention
handling of personally identifiable information (PII)
Learning platforms process large volumes of employee data. If a vendor can’t clearly explain how that data is segmented and protected in their cloud environment, the certification doesn’t mean much.
What to double-check
Does the certification cover the whole organisation or just part of it?
Is it officially accredited?
Is it the vendor’s certification, or are they pointing to their hosting provider (AWS, Azure, etc.)?
If it's the latter, push back. Secure hosting infrastructure doesn't guarantee application security.
Cyber Essentials Plus — Essential for UK-based organisations
If you’re a UK-based company, Cyber Essentials Plus should be your baseline.
Unlike the standard Cyber Essentials (which is self-assessed), the Plus certification includes:
independent technical verification
hands-on testing
real validation of controls
For a learning platform handling sensitive employee data, this provides confidence that the basics are properly secured.
As with ISO 27001, don’t just take it at face value.
Learning Technologies is back and we could not be more excited.
L&D is changing faster than most organisations can keep up with. AI is reshaping how people learn, skills gaps are widening and the pressure on L&D teams to prove impact has never been higher. The conversations happening at this year's event are going to matter.
HowNow is already working with companies to build the talent of tomorrow: closing skills gaps, connecting learning to performance and giving L&D teams the data to prove it's working. We want to help you do the same.
Learning Technologies is a great place to start this journey.
You'll find us at stand E30. Come and find us.
Here's what's waiting for you.
1. Get a Free Learning Health Check
Most L&D teams we speak to already know something isn't quite working. Maybe engagement is low. Maybe learning is scattered across too many tools. Maybe the business is asking questions about impact that are hard to answer.
The Learning Health Check is a free 15-minute desk-side consultation with one of our experts at stand E30. No slides, no sales pitch: just a focused conversation about where your organisation is right now, what's getting in the way and where the biggest opportunities are.
You'll walk away with tips you can apply to your strategy straight away, whether you use HowNow or not. This is exclusive to Learning Technologies and designed to be relevant to you and your organisation.
2. Hear How to Prove Learning Is Actually Building Skills
Day one, 1:10pm, Bitesize Stage.
If you've ever sat in a leadership meeting struggling to demonstrate the impact of your learning programme, this one's for you.
Harvey Stead is taking the stage for a bitesize session on one of the biggest questions in L&D right now: how do you prove that learning is genuinely building skills? Join a group of 30+ L&D leaders for a practical, focused conversation designed to give you something you can actually take back to the business.
Arrive at 1pm to secure your seat. Spaces are extremely limited and assigned on a first-come, first-served basis.
3. Learn What It Means to Be a Self-Improving Company
Day one, 2:45pm in Theatre 2.
Every company wants the same thing: people continuously getting better at their jobs. But running that loop manually is nearly impossible. Who's struggling? When do you intervene? What actually helps? Did it work? By the time you've coordinated answers to those questions, months have passed and the moment is gone.
In this session, Nelson Sivalingam (CEO of HowNow, one of the fastest-growing AI learning companies, and author of the acclaimed book Learning at Speed) introduces a fundamentally different model: the self-improving company.
Nelson will show how AI agents are transforming organisational performance: monitoring work systems in real time, detecting struggles the moment they emerge, intervening with the right support at the right time, and measuring what actually improved in performance data, not surveys.
4. Hear Directly from Trainline on Linking Learning to Business Outcomes
Day two, 11:45am in Theatre 2.
Proving the business impact of learning is one of the hardest things L&D teams are asked to do. Most organisations know learning matters, but getting the data to prove it to the business is a different challenge entirely.
This is the session for anyone who's ever had to make that case internally.
Trainline will be on stage sharing exactly how they've connected learning to real business outcomes: the approach they took, the challenges they faced and what the results actually looked like. No theory, no vendor pitch, just a peer in the same shoes telling you what worked.
If you're trying to win more investment for L&D, build credibility with your leadership team or simply understand what good looks like in practice, bring a notebook.
5. Meet HowNow Customers at Our Happy Hour
Our customers will be joining us at Learning Technologies and we would love to introduce you.
Straight after Nelson's session, we're hosting a customer meet and greet at stand E30. Prosecco, canned cocktails, beers and the kind of conversations you actually come to events like this for.
Want to know what it's really like to use HowNow? Don't ask us. Ask them.
Look out for the special 'talk to me' badges: those are the HowNow customers with the real stories. They'll be in and around the stand all afternoon and they're easy to spot. Pull them aside, ask them anything and hear first-hand what's working for organisations just like yours.
6. Start Day Two with Breakfast on Us
Day two, stand E30.
Come and find us first thing on day two.
We'll have coffee, croissants, Danish pastries and muffins waiting, and it's a great chance to have a relaxed conversation with the team before the day gets going. No agenda, no pressure: just good food and good company.
The best conversations at events like this often happen before the programme even starts. And we will provide food for thought… literally.
And so many more reasons…
So, whether you want to catch a talk, grab a drink, or just have a proper conversation about your learning strategy, we'd love to see you. Learning Technologies is one of the best opportunities of the year to connect, learn and get inspired and we're making sure our stand is worth your time.
Onboarding is one of those things everyone agrees matters, and yet it's still one of the most inconsistently done processes in most organisations. Too often it's a chaotic first week of back-to-back meetings, a SharePoint folder nobody can find, and a laptop that arrives three days late.
Designing onboarding that actually scales is one of the biggest challenges HR and L&D teams face, and most organisations know theirs could be better.
Pauline Taylor, VP of People at HowNow, spoke with Ian Walker on the L&D Disrupt Podcast about what great onboarding really looks like and how to build it properly.
This blog walks you through what came out of that conversation and where to start.
Why Onboarding Matters More Than You Think
Let’s start with the business case, because it’s a strong one.
As Ian puts it:
“The value, of course, is that you are accelerating people’s sense of connection. And the statistic about that is that if people feel that they have been treated well in the onboarding process, their longevity is extended. So from a retention point of view, the evidence is pretty unequivocal.”
Connection drives retention. If a new hire spends their first few weeks feeling lost, anxious, or like an afterthought, you’re already on the back foot, regardless of how good the role is. Good onboarding accelerates that sense of belonging and gets people up to speed faster. Friction in those early weeks doesn’t just feel bad. It costs you time, productivity, and ultimately, people.
Should Employee Onboarding Be In-Person or Remote?
There’s no universal answer here, but there are some useful principles.
If you’re onboarding in person, you’re making a strategic investment in culture. Salesforce, for example, made in-person onboarding a priority specifically because they believed it was the best way to embed culture from day one. That’s not a logistical decision; it’s a values one.
If you’re onboarding remotely, the goal is to make the experience feel as close to in-real-life as possible. As Ian says:
“Similarly, if you’re doing it remotely, make sure that all of the experience is as far as possible close to the in real life experience.”
The principles are the same: connection, culture, and clarity. The delivery just looks different.
Nail the Employee Onboarding Fundamentals
This one sounds obvious, but it’s where so many onboarding programmes fall apart.
If you’re bringing someone in person, the infrastructure has to be invisible. Ian is direct on this:
“If you’re gonna do it in person, make sure that all of that is properly handled and does not come back onto the individual. Not only will that distract them, it’ll make them more nervous, it’ll make them feel less good about the whole experience. But it will detract from the efficiency of ramping them up quickly as well.”
That means flights and hotels booked correctly, a laptop ready on day one, security badges sorted in advance, and schedules organised. Get the admin right, and everything else has a chance to land.
What Should Actually Be In Your Onboarding Programme?
Your company culture is the most important element of any onboarding programme. Don’t just list your values on a slide and move on. Bring them to life.
Ian’s advice here is clear:
“Bring in managers, bring in people who are living the culture. So it’s not just someone listening to the same person, same voice all day. You’re getting different voices in there, but you’re getting people sharing their lived experience of why is this culture important to me?”
When people share their lived experience, it lands differently. It's personal, it's real, and it's far more memorable than a PowerPoint.
Networking opportunities
When you've got a cohort of new starters in a room (or on a call), that's a real opportunity. Ian puts it well:
“Use this opportunity to build your network as well. Understand what’s happening within the company because not only will you leverage those relationships, but you’ll learn about what are potential career paths that you can also follow?”
Build in time for people to actually connect with each other. Those relationships can shape how people collaborate and grow within the organisation long after onboarding ends.
Setting real performance expectations
Be upfront about what working there actually looks like. Ian recalls:
“I remember talking to a room full of newly hired employees and saying, you’re gonna be expected to work hard. And you could see these big eyes — and it’s like, yeah, it’s just a reality. You are gonna be held to account for what you do. So expectation setting early on, I think, is really key.”
Ideally, it starts in the interview process, but reinforcing it early avoids misalignment down the line.
The big picture
Help new starters understand how the company works from top to bottom. As Ian explains:
“If you can explain from a top level down, this is a corporate objective, this is what we try and accomplish, this is how it cascades down within each team and each department — how it all fits together and what role you play in it — people get the sense of the bigger picture they’re playing within the organisation as well.”
When people understand how their work connects to something larger, they’re more motivated and more effective.
The Triangle: Getting the Handoff Right
This is one of the most important (and most overlooked) parts of onboarding at scale.
Onboarding isn’t one team’s job. It’s a shared responsibility across three groups:
The onboarding team: responsible for culture, company-wide knowledge, and the rites of passage every new starter goes through
The enablement or L&D function: responsible for the functional knowledge someone needs to actually do their job
The manager: responsible for supporting the new hire and integrating that learning into day-to-day work
Ian is emphatic about how closely these three need to work together:
“The enablement organisation and the onboarding organisation need to be in a triangle. A really close triangle. So that the handover is happening effectively. The knowledge is being built upon. It’s not being duplicated. Nothing worse than when someone’s being invited to one call for onboarding and then they’ve been invited to an enablement call. You can’t allow that to happen. It has to be sequential and it has to be managed collectively.”
When this triangle breaks down, the new hire falls through the gaps. When it works, everything flows.
Onboarding is a Two-Way Street
Onboarding isn’t something that happens to a new hire. They have a role to play too. As Ian puts it:
“The fourth person is the learner themselves. They need to invest the time in order to onboard themselves effectively. So they need to read the materials, do the out of the room learning piece, as well as relationship building out of the room as well, which is so key to onboarding effectively.”
Setting that expectation early makes a real difference. People who take ownership of their own onboarding get up to speed faster and feel more settled sooner.
How Long Should Onboarding Last?
There’s no magic timeline that works for every role, every person, or every organisation. The length of onboarding depends on the complexity of the role, the individual’s prior experience, and how transferable their skills are.
What Ian suggests is a more interesting reframe altogether:
“You should always feel that you’re onboarding because you are always in your job. And particularly now, jobs are changing so quickly that if you have that beginner’s mindset, you are always onboarding yourself in a new direction. If you are always growing yourself.”
The most effective people don’t stop onboarding when week four ends. They carry that curiosity with them.
The Summary
Great onboarding isn’t about cramming as much information as possible into someone’s first week. It’s about connection, clarity, and getting the fundamentals right so people can do their best work sooner.
Get the logistics sorted. Bring culture to life. Build the triangle. Give new starters the space to take ownership. Resist the urge to put a fixed time limit on it.