Change Management - Knowledge Hygiene: make sure Lessons Learned don't stink!
Updated: May 29, 2021
Lessons Learned are only as good as your knowledge hygiene practices. What framework are you using to make sure your Lessons Learned don't stink?

If you want to avoid your Lessons Learned initiative turning into a stinker, then consider what framework you are using to assess knowledge hygiene: Relevance | Richness | Rigour | Review | Rating | Revisits | Revisions | Report (Reuse, Results, Return) | Rapidity

Disclaimer! I need to start by saying that, in my opinion, Lessons Learned have had their time. Lessons Learned are, ultimately, about moving knowledge to action. I am often left to wonder: if a lesson is significant enough to record, why is the KM team waiting for people to find, read and action it? Knowledge Management should perhaps focus on signal identification/anticipation and on learning cycle times matched to the likelihood and impact of incident recurrence, but that's for another article.
However, this article is about KM User Experience (UX) and Knowledge Hygiene. Lessons Learned have to deliver meaning to create impact and results for the user; it's that simple.
In a typical Lessons Learned knowledge base, 90% of searches present 'knowledge' that is already widely known and is therefore seen as irrelevant by the user. The other 10% of searches present 'knowledge' that is mostly incomprehensible because of how it is written: 90% of people cannot understand the content of the lesson presented to them. In other words, only around one search in a hundred delivers knowledge the user can actually apply.
Knowledge Hygiene & Lessons Learned - the case example
As part of an enquiry into cost reduction, I was contracted by a multinational company to benchmark the impact of their Lessons Learned Portal. To measure any KM initiative, you first need to establish meaning.
The purpose (meaning) of the Lessons Learned Portal was to sense process/system errors, improve quality, reduce expenditure, and protect against duplication of effort. The Lessons Learned Portal had been in place for two years and had collected over 1,300 lessons learned from engineering-based projects.
The meaning is clear, and it is now possible to explore access-to-action, impact and results.
Watch: these concepts are explored in David's Birthday Webinar on KM Impact: finding and reporting results & ROI.
Knowledge Hygiene: The 12 Rs Framework
My approach to Lessons Learned maturity benchmarking is to use the 12 Rs Knowledge Hygiene framework, which I initially designed for a KM tech company in California:
Relevance | Richness | Rigour | Review | Rating | Revisits | Revisions | Report (Reuse, Results, Return) | Rapidity
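
If you want to turn the framework into something you can score against, here is a minimal sketch in Python. The 0-3 maturity scale, the equal weighting across the Rs and the example scores are my own assumptions for illustration; the framework itself does not prescribe them.

    # The 12 Rs, with Report's sub-dimensions (Reuse, Results, Return) broken out.
    TWELVE_RS = [
        "Relevance", "Richness", "Rigour", "Review", "Rating", "Revisits",
        "Revisions", "Report", "Reuse", "Results", "Return", "Rapidity",
    ]

    def hygiene_score(scores):
        """Average the per-R maturity scores (0 = absent ... 3 = embedded)."""
        missing = [r for r in TWELVE_RS if r not in scores]
        if missing:
            raise ValueError("No score supplied for: " + ", ".join(missing))
        return sum(scores[r] for r in TWELVE_RS) / len(TWELVE_RS)

    # Example: a portal like the one in the case study below scores near zero.
    portal = dict.fromkeys(TWELVE_RS, 0)
    portal["Richness"] = 2   # lessons are long, but hard to understand
    portal["Review"] = 1     # peer review exists, but checks accuracy only
    print("Knowledge hygiene: {:.2f} / 3".format(hygiene_score(portal)))

A single low average hides which Rs are failing, so in practice you would report the per-R scores alongside the overall figure.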

Using the 12 Rs Knowledge Hygiene framework, I approached the impact benchmarking challenge in the following way, which you can adapt for your use case:
The User Experience Part 1:
I conducted a system test and demonstrated that engineers were uploading lessons learned, but knowledge seekers could not locate the knowledge objects. For example, I uncovered a seemingly valuable lesson that described a significant process change and asked ten engineers to locate it. In each case, the engineer, searching with keywords typical of the problem discussed in the lesson, was unable to surface the lesson I had previously located.
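
A minimal sketch of this findability test follows, assuming a hypothetical search(query) function that stands in for whatever search API your portal exposes and returns a ranked list of lesson IDs; the names are illustrative, not tied to any specific product.

    def findability_rate(search, target_lesson_id, typical_queries, top_k=10):
        """Fraction of typical queries that surface the target lesson in the top k results."""
        hits = sum(
            target_lesson_id in search(query)[:top_k]
            for query in typical_queries
        )
        return hits / len(typical_queries)

    # In the case above, ten engineers searching with problem-typical keywords
    # never surfaced the lesson: a findability rate of 0.0 for knowledge the
    # portal was specifically built to deliver.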
The User Experience Part 2:
I spoke directly with the Lessons Learned Portal management team about access-to-action: access rates, re-use (application) rates and impact/results from the lessons captured. The managers could not provide any evidence of knowledge being accessed and actioned in ways that positively impact STIQCE indicators (Safety, Time, Innovation, Quality, Cost, Experience (customer/employee)). There was zero evidence of access-to-action, which equated to zero measurable or reported value being created by a Lessons Learned initiative that had cost over $1.3 million over two years.
This included zero ROI on:
Lessons Learned Portal staffing costs
Investment and maintenance costs of the software platform
Costs associated with 1,300 lessons learned, each captured via a template that required, on average, 5 hours of input from 5 staff (a total of 32,500 person-hours; see the quick sanity check below)
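
The person-hours figure is worth sanity-checking, and it holds up; note that the article gives no hourly rate, so any monetary conversion of those hours is your own assumption.

    LESSONS = 1_300
    HOURS_PER_LESSON = 5
    STAFF_PER_LESSON = 5

    person_hours = LESSONS * HOURS_PER_LESSON * STAFF_PER_LESSON
    print(person_hours)  # 32500, matching the figure above

    # With zero evidence of access-to-action, ROI = (value - cost) / cost
    # collapses to -100%: every captured hour is unrecovered cost.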
The User Experience Part 3:
I surveyed the users to ascertain their feelings toward the Lessons Learned Portal. The feedback was extremely negative. Engineers made statements such as, "90% of what is in [the lessons learned] just doesn't make sense, and I have 30 years of experience" or "nobody uses it, you can't find anything useful, and it is just a tick box, something we have to do, and we just work to get it done."
The Lessons Learned team had peer-reviewed the content, but only for accuracy, not for quality; there was no consideration of whether engineers had captured the most valuable or meaningful knowledge.
Using the 12 Rs Knowledge Hygiene framework, I was able to demonstrate that the company had a low level of knowledge hygiene and, therefore, low knowledge capability.
The analysis resulted in the deployment of a new Lessons Learned template (based on adult learning principles) and the adoption of a value-based approach to the initiative: continuously monitoring the meaning (purpose) of the initiative, as well as user need and experience, with a focus on access-to-action and impact that produces positive results.
Watch the webinar above for more information on KM Informatics and why KM is not a good idea!
Have you been struck?
Check out the Good Life Work project 365.24.7.1 Performance Improvement Subscription.
If you have been struck by the content of this article and would like to collaborate or partner with us, contact david@k3cubed.com