Lessons Learned Effectiveness
In problem solving, one of the steps is to share lessons learned, either as a "Read Across" or Yokoten to similar processes or products.
We would also store the information for reference, to apply on new programs or processes.
As Managers, it is necessary to be able to measure the effectiveness of the process.
What measures, or KPIs, does your organization use to measure the effectiveness of the Lessons Learned system?

Michael Hirt

9 Replies
This is a very good question.  KPIs would incorporate a metric based on two or more measures, so what measures would need to be captured over a time period?

Lessons Learned is a broad term that could incorporate defects, incidents, nonconforming conditions, required corrective actions, and root causes.

One way to judge whether it is effective would be to compare the quantity of lessons learned to the improvements made to products and processes.
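One way to operationalize that comparison is to track, per period, what fraction of captured lessons actually led to a verified improvement. This is only a sketch; the field names (`captured_on`, `implemented`) and the record structure are assumptions for illustration, not anything defined in this thread:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Lesson:
    lesson_id: str
    captured_on: date
    implemented: bool  # did the lesson lead to a verified product/process change?

def implementation_rate(lessons, start, end):
    """A candidate KPI: lessons implemented / lessons captured in [start, end]."""
    in_period = [l for l in lessons if start <= l.captured_on <= end]
    if not in_period:
        return 0.0
    return sum(l.implemented for l in in_period) / len(in_period)

lessons = [
    Lesson("LL-001", date(2023, 1, 10), True),
    Lesson("LL-002", date(2023, 2, 5), False),
    Lesson("LL-003", date(2023, 3, 20), True),
]
print(implementation_rate(lessons, date(2023, 1, 1), date(2023, 3, 31)))  # 2 of 3 implemented
```

Trending this rate quarter over quarter would give one of the "two or more measures" a KPI needs; the other could be the raw count of lessons captured.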
Really a good topic. We are trying to implement this lessons-learned sharing concept in our org.
Do you have, or know where to find, a good template for this? Or what are you using today to collect and share lessons learned?

Thanks a lot!
Amir Oren
Lessons-learned template - a great issue to discuss!
I also hope to find a basic format.
We are developing the concept, and over time we added a layer above the standard details of the event and corrective action: we include the motivation we think a professional reader would have for an in-depth review of the case, along with suggested questions one should ask while reading it. It could be considered spoon-feeding, but on the other hand it opens one's mind to more options.
I suggest developing this template idea a little further...
Hi Amir, 

Perhaps we could develop a template together :)
There are many templates available online for project-related issues and lessons learned, but I'm looking more for a template for sharing a quality issue from production and its lessons learned: what the difficulties were in solving it, and what went well in the process.
I would like to do this lessons-learned summary in PowerPoint so it could be easily presented to company members, but I'm not sure how to structure it.
Good point Michael,

Should there be a defined time limit for when to check the effectiveness of the lessons learned (and how?), depending on the criticality of the problem or process?
Unfortunately, it's been my experience that a company will go through an extensive process to determine and document lessons learned and then never refer to them again.  The result of this is that the same lessons are "learned" over and over again.  Perhaps any evaluation of a lessons learned process should start with "did we review the prior list of lessons learned"?
Grace Duffy
This is a great topic. I will let others research templates from Six Sigma Stage Gate presentation instructions or documentation on writing an After Action Report. I am sure they are out there. 

What caught my attention is Aram's question about time limits for lessons learned. As a trained curriculum and instructional designer, I am used to a structured set of timings for feedback. We all know about the smile sheets after training (level 1 feedback). We also know about content quizzes, called "level 2" tests. Not all of us remember level 3, or behavioral impact measures. Level 4 measures whether application of the learning made a difference to the bottom line of the company. There may even be a level 5 out there now on long term effects to society. 

We can use the same progression for lessons learned. In fact, we probably already do.

Level 1: What did we learn about the assumptions we made in setting up the charter for the project initially? What changed, why, and how do we know whether it was a value add or not? 
Level 2: What did the leading indicators from our project tasks or process modifications tell us? What lessons did we learn about the technical activities conducted during the improvement? This is what shows up in Stage Gate reviews most often. Some behavioral lessons will emerge here in team dynamics or organizational culture impacts.
Level 3: More a lagging indicator: What did we learn from the immediate outputs of the project or process change? Did we accomplish the desired result of the project within 3 months of project delivery? (This time frame may vary depending on complexity of the project.) More cultural and climate impacts may show up here.
Level 4: Definitely a lagging indicator: What did we learn from the outcomes of the changes? What was the impact to the bottom line? What was the true ROI of this project or process change? This may take 6 months or a year to fully assess, especially if it is a cross functional or supply chain level project. Did we make a positive change to our culture or market position?

This is the first time I have thought about applying the levels of evaluation to project lessons learned. Someone may wish to take the thought further. Or, it may already have been done and I just have not read that book yet. 
Hello Emanuel,
I've attached a template that I had developed to use largely for quality/product development/design issues, but it can accommodate other functions as well (Lessons learned format-PV.xlsx). If you improve upon it, please do share the updated template here :) I used Excel to document and track the lessons.
@Miriam Boucher
Couldn't agree with you more! One element that helped me ensure that the lessons actually loop back into improving the system was to require the recommended actions to feed into appropriate documented procedures, such as design checklists and process qualification procedures. A lesson's status could not be set to closed until an implementation document ID was captured in the lessons-learned document. This helped in auditing the lessons-learned process too :)
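That closure rule can be expressed as a simple validation. This is just a sketch under assumed field names (the tracker described above is an Excel file, not code, and names like `implementation_doc_id` are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LessonRecord:
    lesson_id: str
    description: str
    recommended_action: str
    # ID of the controlled document (design checklist, qualification
    # procedure, etc.) the action was fed into -- required for closure.
    implementation_doc_id: Optional[str] = None
    status: str = "OPEN"

def close_lesson(record: LessonRecord) -> None:
    """Close a lesson only if its action is anchored in a documented procedure."""
    if not record.implementation_doc_id:
        raise ValueError(
            f"{record.lesson_id}: cannot close without an implementation document ID"
        )
    record.status = "CLOSED"

rec = LessonRecord("LL-007", "Connector misalignment", "Add check to design checklist")
try:
    close_lesson(rec)          # rejected: no implementation document yet
except ValueError:
    pass
rec.implementation_doc_id = "DC-104"
close_lesson(rec)              # now allowed
print(rec.status)              # CLOSED
```

In a spreadsheet, the equivalent is a data-validation rule or conditional format that flags any row marked CLOSED with an empty implementation-document column, which also gives auditors a one-glance check.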
When I was doing benchmarking at HP, Compaq, and Xerox in the period from 1985 to 1993, we developed a structured approach to organize external studies (e.g., strategic or operational benchmarking studies) and internal studies (operational benchmarking that also included lessons learned from improvement studies). I haven't been at Xerox for over 26 years, so I have no idea if it was kept up or changed over time.

What we did was start with a major process deployment map of the organization (this was linked to a generic set of process names so we could compare internal and external processes). For each process we identified input measures, in-process measures, and output measures that could be used in what LSS people refer to as a Y = f(X) equation that illustrates the process throughput. These measures were added to an organization-wide "swim-lanes" diagram (typically called a deployment map or diagram), with the indicators illustrated at the bottom as a time-series flow using what we call today a value stream map. This core structure was the architecture for the model.

We then developed a "tree diagram" of the model flow to support structuring the file storage system so records could be recorded by topic. These files were kept in R-Base 5000, a relational database that allowed us to cross-reference files based on a keyword search. All of these capabilities are available in modern IT systems.

In 1993 the American Productivity & Quality Center created a process classification guide with standardized process names, which enables clearer cross-company comparisons (APQC Process Classification Framework.pdf). This can help you build an internal process architecture that enables external benchmarking - it is a tool that enables strategic quality thinking!
I hope that this helps!
Best regards,
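The keyword cross-referencing described above (study files in a relational database, searchable by keyword) amounts to an inverted index, which any modern system can reproduce. A minimal sketch, with hypothetical file names and keywords:

```python
from collections import defaultdict

# Hypothetical study records: file name -> keywords tagged at filing time
studies = {
    "benchmark_order_fulfillment.txt": {"order-fulfillment", "cycle-time", "external"},
    "lessons_solder_defects.txt": {"soldering", "defects", "internal"},
    "benchmark_supplier_quality.txt": {"supplier", "defects", "external"},
}

# Build the inverted index: keyword -> set of files carrying that keyword
index = defaultdict(set)
for filename, keywords in studies.items():
    for kw in keywords:
        index[kw].add(filename)

def search(*keywords):
    """Return files tagged with ALL of the given keywords."""
    results = [index[kw] for kw in keywords]
    return set.intersection(*results) if results else set()

print(sorted(search("defects")))               # both defect-related studies
print(sorted(search("defects", "external")))   # only the supplier benchmark
```

Tying such keywords to the standardized APQC process names, as the post suggests, is what makes internal lessons comparable to external benchmarks.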