From benchmarking to best practice sharing

May 5th, 2015

by Alastair Ross and Martyn Luscombe

 

The term ‘benchmarking’ originates from the practice of cobblers, who measured people’s feet on a bench to make a pattern for a shoe. It was appropriated by business in the 1980s, when progressive companies began to formally ‘benchmark’ themselves against each other, the purpose being to drive business improvement by identifying superior practices in other organisations. Xerox was the first major business to popularise it, and it has since become a core tool in the business improvement toolkit. Whilst benchmarking can be very effective in catalysing and focusing improvement, it is also easy to misuse, with the result that the significant resources it requires fail to yield usable improvements.

Codexx has gained significant experience in benchmarking programmes since it was established in 2002, and our consultants have additional benchmarking experience gained from their work with previous employers such as IBM, Philips and Cranfield School of Management. Our experience includes the benchmarking of manufacturing practices and performance, supply chain management, R&D, general business innovation and law firm innovation. This experience has given us a good feel for both the strengths and weaknesses of benchmarking as a whole. In this short article we want to share some of the key lessons we have learned and our recommendations on how best to use benchmarking methods in your own improvement programmes.

To help in establishing a framework for assessing the use of benchmarking, it is important to recognise that in practice ‘benchmarking’ covers a wide range of approaches, each with its strengths and weaknesses. We have divided benchmarking into three distinct types and evaluated each on its merits, based on our experience in applying them:

  • Quantitative benchmarking
  • Quantitative practice assessments
  • Qualitative practice assessments

In practice, many benchmarking assessments will be a hybrid of two or more types to suit specific requirements. A key success factor in benchmarking is not to lose sight of the goal: to use the assessment findings in a practical way to catalyse, focus and effectively execute changes in business practices that will yield performance improvement. The benchmarking findings (the ‘score’) should never be seen as an end in themselves – although, in our experience, that can all too easily happen, with managers using the findings simply to validate a past or planned action rather than to drive improvement.

1. Quantitative benchmarking

This type of benchmarking can be considered ‘classic’ benchmarking, with its focus on assessing the selected business against a defined set of performance metrics and comparing the results, on an anonymous basis, with a database of other businesses.

Sometimes benchmarking exercises make the mistake of focusing solely on performance metrics, so only part of the story is revealed. By also comparing business practices in an objective way (e.g. through the use of maturity grids), a more valuable assessment can be performed, helping to answer the ‘why?’ as well as the ‘what?’ of the scoring. For example, if a manufacturing company is shown to have high inventories compared to the average in its sector, weak scoring in the use of lean practices and the use of large production batches would explain this poor performance and give management a list of areas for improvement.
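As a rough illustration of how a maturity grid can be turned into a score and a list of improvement candidates, here is a minimal Python sketch. The practice names, scores and the 1–5 scale below are invented for illustration only; they are not taken from PROBE or any Codexx tool.

```python
# Illustrative sketch of a practice maturity grid (hypothetical data).
# Each practice is scored on a 1-5 maturity scale, from weak (1) to
# world-class (5); the overall practice score is the simple average.

MATURITY_SCALE = (1, 5)  # weak .. world-class

def practice_score(scores: dict) -> float:
    """Average maturity across all assessed practices."""
    lo, hi = MATURITY_SCALE
    for name, s in scores.items():
        if not lo <= s <= hi:
            raise ValueError(f"{name}: score {s} outside {MATURITY_SCALE}")
    return sum(scores.values()) / len(scores)

# Hypothetical factory assessment: weak lean practices, large batches.
factory = {
    "pull_scheduling": 2,
    "batch_size_reduction": 1,
    "setup_time_reduction": 2,
    "preventive_maintenance": 4,
}

score = practice_score(factory)
print(f"Practice score: {score:.2f} / 5")

# A low practice score alongside poor performance metrics (e.g. high
# inventory) points management at concrete areas for improvement.
weak = sorted(p for p, s in factory.items() if s <= 2)
print("Improvement candidates:", ", ".join(weak))
```

The point of the sketch is simply that pairing a practice score with performance data answers the ‘why?’ as well as the ‘what?’: the weak-practice list is the actionable output.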

A good example of this type of benchmarking is the PROBE manufacturing benchmarking tool. This was developed by IBM Consulting and London Business School in 1992 as an objective way of assessing a manufacturing operation against world-class standards. It was based on a model of World Class Manufacturing that drew heavily on lean thinking. The tool provided numerical scoring based on practice maturity and measurable performance achievements, and grew to a database of over 2,000 manufacturing sites. Codexx has been a licensed user of PROBE since 2002 and has used it for multi-factory benchmarking on a regular basis.

An excellent example of a company that has successfully applied best-practice benchmarking and improvement is Grundfos, the leading global pump manufacturer, headquartered in Denmark with factories in Europe, America and Asia. Grundfos began best-practice benchmarking with PROBE in 1996 to assess the practices and performance of its factories. This identified gaps against best practices, including the need to adopt lean manufacturing approaches. Grundfos commenced implementation of lean manufacturing techniques across its factories from 1997 and continued regular PROBE benchmarking, supported by Codexx. In parallel it used the EFQM model to drive overall Business Excellence, and Grundfos Denmark won the EFQM award in 2006. In the 2008 PROBE benchmarking assessment, its Danish factories scored at a world-class practice level. The result has been clear improvement in key performance indicators such as on-time delivery, inventory turns and quality.


Figure 1: Example of practice v performance scatter for PROBE manufacturing benchmarking

 

Strengths

  • Uses a structured approach and provides quantified comparisons with other organisations on an anonymous basis.
  • The large database of businesses that have been benchmarked against the defined practices and performance provides an authoritative comparison.
  • This can be a cost-effective approach – the license or consulting fee required will be less than the total cost of a ‘do it yourself’ approach.
  • The benchmarking exercise can be performed without visiting other companies, so it requires relatively little time.

Weaknesses

  • The underlying model on which the benchmark is based is key to its relevance, and its makeup should be clear to participants. It is all too easy for the model to be complex and opaque, leaving the validity and relevance of the final scoring unclear.
  • Performance benchmarking alone does not provide easily actionable improvements. An effective Quantitative Benchmarking tool will cover both practice and performance.
  • The scope of the benchmark and the practice and performance areas covered are pre-defined by the benchmarking model and there is little or no room for flexibility.
  • Valid comparison rests on the assumption that all participating organisations define the compared metrics in the same way. It is up to the benchmarking assessors to ensure this is the case; if participating businesses self-assess, it cannot be guaranteed. The danger is that if the metric is ‘apples’, some organisations will actually be measuring ‘pears’ – and this happens easily. For example, one global engineering company found that each of its international subsidiaries measured ‘on-time delivery’ differently, so a benchmarking question asking for ‘on-time delivery performance’ could return widely varying percentages across the company even if actual performance was identical.
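To make the ‘apples and pears’ problem concrete, the sketch below shows how three plausible definitions of ‘on-time delivery’ produce three different percentages from identical shipment data. The dates and definitions are invented for illustration; they do not come from the engineering company mentioned above.

```python
from datetime import date

# Hypothetical illustration: the same four shipments, measured against
# three different definitions of "on-time delivery" (OTD).

shipments = [
    # (customer-requested date, factory-promised date, actual delivery)
    (date(2015, 3, 2),  date(2015, 3, 5),  date(2015, 3, 4)),
    (date(2015, 3, 9),  date(2015, 3, 9),  date(2015, 3, 11)),
    (date(2015, 3, 16), date(2015, 3, 20), date(2015, 3, 20)),
    (date(2015, 3, 23), date(2015, 3, 25), date(2015, 3, 22)),
]

def otd(shipments, target, grace_days=0):
    """% of shipments delivered no later than the target date plus grace.

    target: tuple index of the reference date
            (0 = customer-requested, 1 = factory-promised).
    """
    hits = sum(
        1 for s in shipments
        if (s[2] - s[target]).days <= grace_days
    )
    return 100.0 * hits / len(shipments)

print(f"vs promised date:           {otd(shipments, target=1):.0f}%")
print(f"vs promised date +2d grace: {otd(shipments, target=1, grace_days=2):.0f}%")
print(f"vs customer-requested date: {otd(shipments, target=0):.0f}%")
# Identical deliveries, three different 'OTD' figures (75%, 100%, 25%) -
# which is why assessors must pin down metric definitions up front.
```

Unless the benchmarking assessors pin down one definition, subsidiaries reporting any of these three figures would all believe, honestly, that they are reporting ‘on-time delivery’.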

2. Quantitative practice assessments

This benchmarking approach uses a defined framework of key practices with levels of maturity, i.e. from weak to strong practice. It is used when there is no large database of companies assessed against these practices, so quantitative benchmarking is not feasible – typically because the practice area is too narrow to have merited academic or consulting study and the development of a benchmarking programme. This approach suits niche assessments of specialist areas where a structured comparison against best practices is required. The lack of comparative data means that the process must involve a number of like-minded organisations, all seeking to learn how to improve performance in the area involved.

Codexx used this approach to help a client perform benchmarking of production maintenance practices as part of a re-engineering programme for maintenance across 8 factories. The objectives of the benchmarking were:

1. To help ‘open the eyes’ of the maintenance managers to better practices in other maintenance organisations – and thus encourage change.
2. To provide a quantifiable comparison of practices and some key performance metrics with organisations with measurably better maintenance operations to provide a set of improvement targets.
3. To provide examples of improved practices which could be transferable to their business.
4. To learn about each company’s ‘journey’ to their current maintenance practices and benefit from their learning.

We initially looked to use a commercially available benchmarking tool for this work, but found none broad enough to meet the client’s needs (most were focused on specific aspects of maintenance, such as cost). We therefore developed an assessment framework built on a skeleton provided by the eight quality management principles underpinning ISO 9001. To this skeleton we added maintenance-specific practices and performance measures, developing the assessment model collaboratively with the client’s re-engineering team of maintenance managers, with whom we were already working. This approach was used to benchmark five other European companies through one-day visits – attended by the majority of the maintenance managers – supported by prior data sharing and preparation. Each participating firm was provided with an assessment report and overall scoring, as it was important to ensure that every participating company gained value from the exercise. The assessment provided much insight to the maintenance managers and was used to develop improvement projects as part of the re-engineering programme. The scoring also provided a measurable set of practice and performance targets that were used over a three-year period to measure re-engineering progress, with self-assessments against the framework every 6-12 months.


Figure 2: Extract from F4i showing questions on innovation practice maturity.

 

Another example of this approach is the Foundations for Innovation (F4i) assessment tool, developed by Codexx in 2006. This was born out of our recognition that there was no tool for assessing an organisation’s innovation ‘health’ in a cohesive and objective way. F4i was developed with the support of Imperial College Business School to bring academic rigour to the assessment tool. Based on an academically validated model of innovation, it uses 60 key innovation practices and performance metrics to enable a quantitative assessment of innovation in a business (see figures 2 & 3). We have used F4i in innovation assessments of industrial and service organisations across Europe, the USA and China. F4i does not have a large database of companies – we focus primarily on how participating organisations score against the defined practices and, more importantly, on the underlying reasons for weak practices. We took a similar approach in developing an innovation assessment for the legal sector, building on our experience of working with major UK law firms on innovation since 2005. We used this to perform a study of innovation in 35 UK and German law firms, together with the business schools of Exeter and Leipzig universities.
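The ‘traffic light’ presentation used in figure 3 can be sketched very simply: each practice score is mapped onto a red/amber/green band. The thresholds and practice names below are invented for illustration; they are not F4i’s actual banding.

```python
# Hypothetical sketch of 'traffic light' scoring against a 1-5
# practice maturity scale. The band thresholds are invented, not F4i's.

def traffic_light(score: float,
                  amber_from: float = 2.5,
                  green_from: float = 4.0) -> str:
    """Map a 1-5 practice maturity score onto a red/amber/green band."""
    if not 1 <= score <= 5:
        raise ValueError("score must be in 1..5")
    if score >= green_from:
        return "green"
    if score >= amber_from:
        return "amber"
    return "red"

# Hypothetical innovation-practice scores for one organisation.
assessment = {
    "idea_capture": 1.8,
    "portfolio_management": 3.1,
    "innovation_leadership": 4.2,
}
for practice, score in assessment.items():
    print(f"{practice:22s} {score:.1f}  {traffic_light(score)}")
```

The value of the traffic-light view is purely communicative: it turns a page of numbers into an at-a-glance map of where the weak practices (the reds) sit against the overall innovation model.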

Strengths

  • The assessment framework can be tailored precisely to the specific needs of the organisation (or group of companies) involved.
  • The assessment approach can again be tailored as required.
  • Learning is gained from the visits to other organisations.

Weaknesses

  • It is typically a more expensive approach than Type 1, as all the development costs usually have to be borne by the lead partner(s).
  • There needs to be mutual value in the assessment for each participant, and the lead organisation needs to recruit the involvement of others and typically cover the costs of developing the framework, travelling to the other companies, and scoring and reporting.
  • The limited number of participants results in a limited data set, which restricts the ability to identify and validate relationships between practice and performance. Findings are thus more indicative than conclusive.
  • A custom assessment framework needs to be developed for the benchmarking exercise by one of the parties involved in the assessment or by a supporting consultant.


Figure 3: Example of ‘traffic light’ scoring against overall F4i innovation model.

 

3. Qualitative practice assessments

This final type of ‘benchmarking’ is comparatively unstructured and primarily produces a qualitative assessment. One could argue that this is what normally happens in ad hoc, unstructured visits to other companies. Such informal benchmarking is sometimes referred to as ‘industrial tourism’: the lack of any objective or structure to the visit results in limited value and few actionable outcomes. We are not advocating this approach. A structured qualitative practice assessment does have objectives and an assessment framework, but the framework is deliberately loose.

Codexx has facilitated a number of such assessments in a practice-sharing programme on manufacturing technology, between a major Scandinavian manufacturer and a major German manufacturer with a mutual interest in a specific area of production technology development; the companies did not compete directly in their customer markets, but were mutual users of each other’s products. Our approach was to develop an overall practice framework (again based around the eight ISO quality management principles) and to develop specific practices around this framework in collaboration with the two companies. An assessment visit programme was arranged and the framework used to scope and focus discussions. Key findings from the assessment were documented for each company.

Strengths

  • Suitable for use in highly specific business areas where an existing benchmarking tool is unlikely to be available.
  • Learning is gained from the visits to other organisations.
  • The flexible framework enables practice-sharing visits and discussions to accommodate thinking, methods and experience that emerge as valuable to participants but would not easily fit a tightly prescriptive assessment format. In essence, ‘best practices’ in this area are not well defined or understood, so the assessment approach needs to accommodate emergent practices that have proved valuable to at least one of the participating organisations.

Weaknesses

  • Requires each participant to have areas of practice which are perceived as progressive by the other partners, so that each can learn from the assessment – not just teach the other participants. This means that pre-work is typically required to engage participants and establish the potential for a mutually beneficial practice-sharing arrangement.
  • There is a danger that the assessment will be so flexible and loose that it becomes difficult to make objective comparisons to enable effective learning for participants.
  • Again, one party needs to take the lead in such a practice-sharing programme.

Final Words

Benchmarking is a proven way to compare an organisation’s effectiveness with others, in a way that provides objective findings which can be used to catalyse business improvement. However, traditional benchmarking can sometimes be expensive and too ‘one size fits all’, with scoring that is opaque to participants.

Less formal methods based around ‘practice sharing’ can be effective at achieving the required objective – catalysing improvement – without the ‘heaviness’ of typical benchmarking, while enabling a more personalised fit to the requirements of the participants.

Overall, it is key that, whatever benchmarking approach is used, participants do not lose sight of the ultimate aims of the activity: to catalyse and focus improvement activities.

Copyright © Codexx
All rights reserved