Guide to Optimal Maintenance & Reliability--Managing Performance

" You cannot manage something you cannot control, and you cannot control something you cannot measure."

  • 1 Introduction
  • 2 Terminology
  • 3 Identifying Performance Measures
  • 4 Data Collection and Data Quality
  • 5 Benchmarking and Benchmarks
  • 6 Summary
  • 7 QUIZ

Learning goals:

  • What to measure and why to measure performance
  • Differences between lagging and leading indicators
  • Key performance indicators
  • A balanced scorecard
  • The challenges of data collection
  • The importance of data quality and integrity
  • Benchmarks and benchmarking

1 Introduction

An organization must measure and analyze its performance if it’s to make the improvements needed to stay in business in a competitive marketplace. Performance measures must be derived from, and aligned with, the organization's goals and business strategies. They should be centered on the critical information and data related to key business processes and outputs, and should be focused on improving results.

Data needed for process improvement and performance measurement includes information about products and services, asset performance, and the cost of operations and maintenance. Data can be easy or difficult to collect, and emphasis must be placed on data quality. This data is analyzed to determine trends, cause-and-effect relationships, and the underlying reasons for results that may not be evident without analysis. Data also serves a variety of purposes, such as planning, projections, performance reviews, and operations improvements, and is used to compare an organization's performance with "best practices" benchmarks.

A key component of an improvement process involves the creation and use of performance indicators, also known as metrics. These metrics are measurable characteristics of products, services, and processes related to the business. They are used by an organization to track and improve its performance. Metrics should be selected to best represent the factors that lead to improved operations, including maintenance and customer satisfaction. A comprehensive set of measures or metrics tied to the business activities and customers should be based on long- and short-term goals of the organization. Metrics need to be constantly reviewed and aligned with the new or updated goals of the organization and become part of its strategic plan.

Metrics based on the priorities of the strategic plan make lasting improvements to the key business drivers of the organization. Processes are then designed to collect information relevant to these metrics and reduce them to numerical form for easy dissemination and analysis.

The value of metrics is in their ability to provide a factual basis in the following areas:

• Strategic feedback to show the present status of the organization from various perspectives

• Diagnostic feedback of various processes to guide improvements on a continuous basis

• Trends in performance over time as the metrics are tracked

• Feedback around the measurement methods themselves in order to track the correct metrics

In most businesses, success is easily measured by looking at the bottom line: the profit. But what's the bottom line for maintenance as a business function? To better understand how to evaluate maintenance business performance, it's helpful to examine how businesses generate profit. In simple terms, businesses generate profit by selling goods and services and by minimizing their costs. Obviously, revenues generated from sales must exceed the costs.

Customers generally demand value. Key components of value are timeliness, quality, price, and return on investment (ROI). Therefore, metrics for maintenance and reliability should reflect how an organization is providing value to its customers in terms of timely maintenance (availability of assets), quality of service (minimum rework), controlling costs, etc. Thus, maintenance as a business function must develop its internal metrics to evaluate its performance in terms of these parameters.

The Benefits of Performance Measurement

Accountability

Well-designed performance measures document progress toward achievement of goals and objectives, thereby motivating and catalyzing organizations to fulfill their obligations to their employees, stakeholders, and customers.

Resources / Budget Justification

Because it ties activities to results, performance measurement becomes a long-term planning tool to justify proper resource or budget allocation.

Ownership and Teamwork

By providing a clear direction for concentrating efforts in a particular functional area, performance measurement encourages greater employee participation in problem solving, goal setting, and process improvements. It helps set priorities and promotes collaboration among departments and business areas.

Communication-A Common Language

Tracking the achievement of goals through metrics can enhance employee understanding and support of management strategies and decisions. Metrics also give employees a common language to communicate, alert them to potential problem areas, and encourage them to share knowledge. Therefore, performance metrics, if properly designed and implemented, enhance productivity and reduce cost.

2 Terminology

Benchmark -- A standard measurement or reference that forms the basis for comparison; this performance level is recognized as the standard of excellence for a specific business process.

Benchmarking -- The American Productivity & Quality Center (APQC) defines benchmarking as the process of identifying, learning, and adapting outstanding practices and processes from any organization, anywhere in the world, to help an organization improve its performance. Benchmarking gathers tacit knowledge: the know-how, judgments, and enablers.

Benchmarking Gap -- The difference in performance between the benchmark for a particular activity and the level of other organizations. It’s the measured performance advantage of the benchmark organization over other organizations.

Best-in-Class -- Outstanding process performance within an industry; words used as synonyms are best practice and best-of-breed.

Best Practice -- A method or technique that has consistently achieved superior results compared with other means (e.g., current practices) while minimizing the use of an organization's resources. Such a practice becomes a benchmark.

Generic Benchmarking -- A benchmarking process that compares a particular business function or process with those of other organizations, independent of their industries.

Goal -- An observable and measurable end result, with objectives to be achieved within a more or less fixed time frame. Goals indicate the strategic direction of an organization.

Internal Benchmarking -- Benchmarking process that is performed within an organization by comparing similar business units or business processes.

Metric -- A metric is a standard measure to assess performance in a specific area. Metrics are at the heart of a good, customer-focused process management system and any program directed at continuous improvement.

Networking -- The practice of building informal relationships with people who share a common set of values and interests and who can help both parties professionally.

Objective -- The set of results to be achieved that will deploy a vision into reality.

Performance -- The results of activities of an organization over a given period of time.

Performance Measurement -- A quantifiable indicator used to assess how well an organization or business is achieving its desired objectives.

Vision -- The achievable dream of what an organization wants to do and where it wants to go.

World-Class -- Practices ranked by customers and industry experts to be among the best of the best; exemplary performance achieved independent of industry, function, or location.

3 Identifying Performance Measures

It’s often said that "what gets measured gets done." Getting things done, through people, is what management is all about. Measuring what gets done, and the results of that effort, is an essential part of successful management. But too much emphasis on measurements, or the wrong kinds of measurements, may not be in the best interest of the organization.

The few vital indicators that are most important for evaluating process performance are called Key Performance Indicators (KPIs). KPIs are an important management tool; they measure business performance, including maintenance. There are only a few "hard" measures of maintenance output, and the metrics commonly used are often easy to manipulate. Maintenance and operations KPIs must be integrated to make them effective and balanced. Three criteria should be considered when deciding which aspects of maintenance to measure:

1. The performance measures should encourage the right behavior.

2. They should be difficult to manipulate to "look good."

3. They should not require a lot of effort to measure.

Some metrics may encourage people to do things that we don’t want.

A common metric is "adherence to weekly work schedule" for maintenance work. It's easy to achieve a high adherence to schedule by scheduling less work, through overestimating work orders. However, what we really want is higher productivity, which can often be achieved by challenging people and scheduling more work, but with better work estimating. Thus the wrong measurement may work against us. Like "adherence to schedule," some other metrics, such as time spent on Preventive Maintenance (PM) work, percent rework, and percent emergency work are easy to manipulate.

Only the KPIs that are truly relevant and satisfy the criteria listed above should be considered for implementation. A good example comes from an organization trying to improve turnaround and shutdown planning, where a new target of completing all planning work two or three weeks in advance of a shutdown has been set and agreed upon. All of its shutdown work orders have a specific code. Therefore, a simple report from the CMMS listing all purchase requisitions against work orders for a specific shutdown that were originated less than two or three weeks in advance will provide a very useful measure of planning performance. This metric supports the right behavior, is unlikely to be manipulated, and is easy to measure. It will also provide information on where to take action and when to recognize good planning efforts.
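As an illustration, the following is a minimal sketch of how such a report might be produced once the CMMS data has been exported; the file names, column names (work_order_id, project_code, shutdown_start, created_date), and the 14-day threshold are assumptions for the example, not features of any particular CMMS.

```python
# Minimal sketch of a "late shutdown requisitions" report.
# File and column names are hypothetical; adjust to the actual CMMS export.
import pandas as pd

LEAD_TIME_DAYS = 14  # planning target: requisitions raised at least 2 weeks ahead

work_orders = pd.read_csv("work_orders.csv", parse_dates=["shutdown_start"])
requisitions = pd.read_csv("purchase_requisitions.csv", parse_dates=["created_date"])

# Keep only work orders carrying the shutdown code used by this organization.
shutdown_wos = work_orders[work_orders["project_code"] == "SD-2024-01"]

# Join requisitions to their work orders and flag those raised inside the lead time.
report = requisitions.merge(shutdown_wos, on="work_order_id")
report["late"] = (report["shutdown_start"] - report["created_date"]).dt.days < LEAD_TIME_DAYS

late_pct = 100.0 * report["late"].mean()
print(report.loc[report["late"], ["work_order_id", "requisition_id", "created_date"]])
print(f"Requisitions raised inside the {LEAD_TIME_DAYS}-day planning window: {late_pct:.1f}%")
```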

Metrics such as the one in this example are of immense value when measuring the success of efforts to implement better practices and to change the behavior. These metrics may in turn be discontinued when the new and improved practices become a habit.

Metrics Development Process

The first step in developing metrics is to involve the people who are responsible for the work to be measured. They are the most knowledgeable about the work. Once these people are identified and involved, it’s necessary to:

1. Identify critical work processes and customer requirements.

2. Identify critical results desired and align them to customer requirements.

3. Develop measurements for the critical work processes or critical results.

4. Establish performance goals, standards, or benchmarks.

A SMART test can be used to ensure the quality of a particular performance metric. The letters in SMART represent:

S = Specific. Be clear and focused to avoid misinterpretation. The metric should include measurement assumptions and definitions and should be easily interpreted.

M = Measurable. The metric can be quantified and compared to other data. It should allow for meaningful statistical analysis. Avoid "yes/no" measures except in limited cases such as start-up or systems-in-place situations.

A = Attainable. The metric is achievable, reasonable, and credible under conditions expected.

R = Realistic. It fits into the organization's constraints and is cost effective.

T = Timely. The metric is doable, and the data is available within the time needed.

FIG. 1 lists a few examples of key maintenance and reliability related metrics.

FIG. 1 List of Maintenance and Reliability Metrics.

Leading and Lagging Indicators

A simple way to determine if a metric is leading or lagging is to ask the question, "Does the metric allow us to look into the process, or are we outside of the process looking at the results?" Leading indicators are forward-looking and help manage the performance of an asset, system, or process, whereas lagging indicators tell how well we have managed.

Process measures are leading indicators. They offer an indication of task performance with enough lead time to manage for successful results. For example, a leading maintenance process indicator might measure how proactive the planning or scheduling function has been in preparing preventive and condition-based maintenance work packages, or monitor the percentage of PM/CBM inspections completed per schedule. If people are doing all the right things, then the expectation is that improved results will follow. Leading process indicators are typically more immediate than lagging measures of results. We must manage by the leading indicators. Some examples of M&R-related leading metrics are listed below (a calculation sketch follows the list):

• % Schedule compliance

• % Planned work

• % PM / CBM work compliance (completed on time)

• Work order cycle time

• % Rework

• Planner to craft workers ratio
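As a rough illustration, the sketch below computes a few of these leading indicators from exported work-order records; the file name, the column names (scheduled, planned, work_type, due_date, completed_date), and their use as simple flags and dates are assumptions for the example, not a standard CMMS schema.

```python
# Illustrative calculation of a few leading maintenance KPIs from work-order records.
# Column names are assumptions; "scheduled" and "planned" are assumed boolean flags.
import pandas as pd

wos = pd.read_csv("closed_work_orders.csv", parse_dates=["due_date", "completed_date"])

# % Schedule compliance: scheduled jobs finished by their scheduled date.
scheduled = wos[wos["scheduled"]]
schedule_compliance = 100.0 * (scheduled["completed_date"] <= scheduled["due_date"]).mean()

# % Planned work: work orders that had a planned job package before execution.
planned_pct = 100.0 * wos["planned"].mean()

# % PM/CBM compliance: preventive and condition-based tasks completed on time.
pm = wos[wos["work_type"].isin(["PM", "CBM"])]
pm_compliance = 100.0 * (pm["completed_date"] <= pm["due_date"]).mean()

print(f"Schedule compliance: {schedule_compliance:.1f}%")
print(f"Planned work:        {planned_pct:.1f}%")
print(f"PM/CBM compliance:   {pm_compliance:.1f}%")
```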

Lagging indicators are results that occur after the fact. They monitor the output of a process and measure the results of how well we have managed an asset, a process, or the overall maintenance business. Some examples of M&R-related lagging metrics are listed below (a calculation sketch follows the list):

• Maintenance cost as % of RAV

• Return on Net Assets (RONA)

• Asset Availability

• MTBF

• OEE

• Maintenance Training Man-hours or $
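As a companion to the leading-indicator sketch, the following shows how two of these lagging indicators, MTBF and OEE, are commonly calculated; the input values are invented for illustration, and the formulas (MTBF = operating time / number of failures; OEE = availability x performance x quality) follow the usual textbook definitions.

```python
# Illustrative calculation of two lagging indicators: MTBF and OEE.
# Input values are invented; formulas follow the common textbook definitions.

operating_hours = 4200.0   # total run time of the asset in the period
failure_count = 6          # functional failures recorded in the same period
mtbf_hours = operating_hours / failure_count

scheduled_time = 5000.0    # hours the asset was scheduled to run
availability = operating_hours / scheduled_time
performance = 0.92         # actual production rate vs. ideal rate
quality = 0.98             # good units vs. total units produced
oee = availability * performance * quality

print(f"MTBF: {mtbf_hours:.0f} operating hours between failures")
print(f"OEE:  {oee:.1%} (availability {availability:.1%} x performance 92.0% x quality 98.0%)")
```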

FIG. 2 on the next page shows a hierarchical model of lagging and leading indicators.

FIG. 2 Leading and Lagging KPI Model

On a cautionary note, an indicator could be either leading or lagging.

For example, PM/CBM work compliance is a lagging indicator (the result of how much PM/CBM work was completed) when viewed in the context of work execution. However, when viewed as an indicator of asset reliability, PM/CBM compliance is a leading indicator of the reliability process. Higher PM/CBM work compliance predicts, or very likely leads to, improved asset reliability. Similarly, improved asset reliability will lead to reduced maintenance costs, which is a lagging indicator of the maintenance process.

Whether leading or lagging, metrics should be used to provide information on where the process is working well and where it isn't. In doing so, these metrics help build on successes and lead to making process changes where unfavorable trends are developing.

FIG. 3 The Balanced Scorecard

Balanced Scorecard

Most of the time, we measure what's important from the financial and productivity perspective of a process or an organization. The balanced scorecard suggests that we view the process or organization from four perspectives. We should develop metrics, collect data, and analyze the data relative to each of these perspectives to balance out any bias.

The Balanced Scorecard is a strategic management approach developed in the early 1990s by Dr. Robert Kaplan of Harvard Business School, and Dr. David Norton. As the authors describe the approach:

"The balanced scorecard retains traditional financial measures. But financial measures tell the story of past events, an adequate story for industrial age companies for which investments in long-term capabilities and customer relationships were not critical for success. These financial measures are inadequate, however, for guiding and evaluating the journey that organizations must make to create future value through investment in customers, suppliers, employees, processes, technology, and innovation." The balanced scorecard ( FIG. 3) identifies four perspectives from which to view a process or an organization. These are:

• Learning and Growth Perspective

• Business Process Perspective

• Customer Perspective

• Financial Perspective
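As a simple structural illustration only, the sketch below groups a few of the M&R metrics discussed in this section under the four perspectives; the grouping and the metric names are assumptions for the example, not a prescribed scorecard.

```python
# Illustrative balanced scorecard for a maintenance organization.
# The perspective-to-metric grouping is an example, not a prescribed standard.
balanced_scorecard = {
    "Learning and Growth": ["Training hours per person", "% employees CMRP certified"],
    "Business Process":    ["% schedule compliance", "% rework", "PM backlog"],
    "Customer":            ["% availability", "% on-time delivery of assets"],
    "Financial":           ["Maintenance cost as % of RAV", "MRO inventory turns"],
}

for perspective, metrics in balanced_scorecard.items():
    print(f"{perspective}: {', '.join(metrics)}")
```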

The balanced scorecard is a strategic planning and management system that is used extensively in business and industry, government, and nonprofit organizations worldwide. It helps to align business activities to the vision and strategy of the organization, improve internal and external communications, and monitor organization performance against strategic goals. It provides a balanced view of organizational performance.

The balanced scorecard has evolved from its early use as a simple performance measurement framework into a full strategic planning and management system. The "new" balanced scorecard transforms an organization's strategic plan from an attractive but passive document into the marching orders for the organization on a daily basis. It provides a framework that not only yields performance measurements but also helps organizations identify what should be done and measured. It enables executives to truly execute their strategies.

The Learning and Growth Perspective

This perspective includes employee training and corporate cultural attitudes related to both individual and corporate self-improvement. In a knowledge-worker organization, people, the only repository of knowledge, are the main resource. In the current climate of rapid technological change, it’s becoming necessary for knowledge workers to be in a continuous learning mode. Government agencies often find themselves unable to hire new technical workers. At the same time, there is a decline in training of existing employees. This is a leading indicator of a "brain drain" that must be reversed. Metrics can be put into place to guide managers in focusing training resources where they can help the most.

Kaplan and Norton emphasize that "learning" is more than "training." It also includes things like mentors and tutors within the organization, as well as ease of communication among workers that allows them to readily get help on a problem when it’s needed. It also includes technological tools, what the Baldrige criteria call "high performance work systems." In the maintenance area, these tools include the use of new technologies, e.g., ultrasonics, infrared thermography, and motor current analysis, and applying RCM in new designs.

Maintenance & Reliability (M&R) related examples of this perspective are:

• Hours (or dollars) spent on training per person, e.g., 80 hours/person in a given year

• Percent of training hours per total, e.g., 5% in year 2009

• Number of technical papers presented or written/$M of M&R budget

• Percent of employees certified in Condition Based Maintenance (CBM) technologies or Certified Maintenance Reliability Professionals (CMRP)

• Percent of work orders created by CBM/Predictive Maintenance (PdM) technology

• Percent of CBM tasks in overall Preventive Maintenance (PM) program

• Percent of FMEA/RCM processes applied to new designs

The Business Process Perspective

This perspective refers to internal business processes. Metrics based on this perspective allow managers to know how well their business is running, and whether its products and services conform to customer requirements (the mission). These metrics have to be carefully designed by those who know the processes most intimately. Because the mission is unique to each organization, these metrics are usually developed by the organizations themselves without the help of outside consultants.

In addition to the strategic management process, two kinds of business processes may be identified: a) mission-oriented processes, and b) support-oriented processes. Many processes in government agencies such as DoD and NASA are mission-oriented and present many unique measurement problems. The support processes are more repetitive in nature; hence, they are easier to measure and benchmark using generic metrics.

Maintenance & Reliability (M&R) related examples of this perspective include:

• PM Backlog-Percent or Number of Tasks

• Percent Schedule Compliance

• Percent Rework

• Percent Reliability (or MTBF)-Asset / System

• Percent Material Delivered or Available on Time

The Customer Perspective

Recent management philosophy has shown an increasing realization of the importance of customer focus and customer satisfaction in any business. These are leading indicators: if customers are not satisfied, they will eventually find other suppliers that will meet their needs. Poor performance from this perspective is thus a leading indicator of future decline, even though the current financial picture may look good. For maintenance organizations, the customer is operations. If operations is not happy with the service, due to increasing failure rates and downtime, maintenance could be outsourced.

In developing metrics for satisfaction, customers should be analyzed in terms of the kinds of customers served and the processes by which the organization provides products or services to those groups. M&R-related examples of this perspective include:

• Percent downtime

• Percent availability

• Percent delivery on time (asset/system back to operation as promised)

• Customer satisfaction with the services maintenance provides, such as turn-around (no cost overruns, worked right the first time, asset operates at 100% performance, etc.)

The Financial Perspective

Kaplan and Norton don’t disregard the traditional need for financial data. Timely and accurate financial data will always be a priority, and managers will do whatever is necessary to provide it. In fact, there is often more than enough handling and processing of financial data. With the implementation of a corporate database, it’s hoped that more of the processing can be centralized and automated. But the point is that the current emphasis on financials leads to an unbalanced situation with regard to the other perspectives.

There are two general types of measures that affect the financial outcome of a business: effectiveness and efficiency. An organization may be effective in safely producing a good product on time, but can be outflanked by a more efficient competitor. The reverse is also true: producing efficiently but without meeting quality expectations.

Generally, the effective measures are mastered first, followed by efficiency measures. It’s the old struggle between quality and production.

There is perhaps a need to include additional financial-related data, such as risk assessment and cost-benefit data, in this category. M&R-related examples of this perspective are listed below (a calculation sketch follows the list):

• Maintenance cost per unit of product or service provided

• Maintenance cost as percent of Replacement Asset Value (RAV)

• Inventory turns (of MRO store)

• MRO store Inventory as a percent of RAV

• Maintenance cost / HP installed
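The sketch below shows how two of these financial indicators are typically computed; the figures are invented, and the formulas (annual maintenance cost divided by replacement asset value, and annual MRO issues divided by average inventory value) reflect common usage rather than a single authoritative definition.

```python
# Illustrative calculation of two financial M&R indicators.
# Values are invented; formulas reflect common usage.

annual_maintenance_cost = 3_600_000.0    # labor + materials + contracts for the year
replacement_asset_value = 150_000_000.0  # estimated cost to replace the plant's assets
maintenance_cost_pct_rav = 100.0 * annual_maintenance_cost / replacement_asset_value

annual_mro_issues = 1_200_000.0          # value of spares issued from the MRO store
average_inventory_value = 800_000.0      # average on-hand MRO inventory value
inventory_turns = annual_mro_issues / average_inventory_value

print(f"Maintenance cost as % of RAV: {maintenance_cost_pct_rav:.1f}%")
print(f"MRO inventory turns: {inventory_turns:.1f} per year")
```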

4 Data Collection and Data Quality

Another key challenge with a performance measurement system is data collection and availability of quality data on a timely basis. Data is the key ingredient in performance measurement. Major factors in establishing a performance measurement system are:

a. Cost of data collection

b. Data quality

c. Data completeness

d. Extrapolation from partial coverage

e. Matching performance measures to their purpose

f. Understanding extraneous influences in the data

g. Timeliness of data for measures

h. Use of measures in allocation of funding

i. Responsibility for measures, and limited control over the process

j. Benchmarking and targets

Evidently, an efficient and effective data collection system is needed to ensure availability of quality data. A data collection system should:

1. Identify what data needs to be collected and how much; the population from which the data will come; and the length of time over which to collect the data.

2. Identify the charts and graphs to be used, the frequency of charting, various types of comparisons to be made, and the methodology for data calculation.

3. Identify the characteristics of the data to be collected. (Attribute data are items that can be counted; variable data are items that can be measured.)

4. Identify whether existing data sources can be utilized or new data sources need to be created for new or updated measures of performance. All data sources need to be credible and cost effective.

How good are the metrics? The following questions may serve as a checklist to determine the quality of metrics and to develop a plan for improvement:

• Do the metrics make sense? Are they objectively measurable?

• Are they accepted by and meaningful to the customer?

• Have those who are responsible for the performance being measured been fully involved in the development of this metric?

• Does the metric focus on effectiveness and/or efficiency of the system being measured?

• Do they tell how well goals and objectives are being met?

• Are they simple, understandable, logical, and repeatable?

• Are the metrics challenging but at the same time attainable?

• Can the results be trended? Does the trend give useful management information?

• Can data be collected economically?

• Are they available in a timely manner?

• Are they sensitive? (Does any small change in the process get reflected in the metric?)

• How do they compare with existing metrics?

• Do they form a complete set, i.e., a balanced scorecard (adequately covering the areas of learning and growth, internal business process, financial, and customer satisfaction)?

• Do they reinforce the desired behavior-today and in the long haul?

• Are the metrics current (living) and changeable? (Do they change as the business changes?)

5 Benchmarking and Benchmarks

What Is Benchmarking?

Benchmarking is the process of identifying, sharing, and using knowledge and best practices. It focuses on how to improve any given business process by exploiting topnotch approaches rather than merely measuring the best performance. Finding, studying, and implementing best practices provide the greatest opportunity for gaining a strategic, operational, and financial advantage.

Informally, benchmarking could be defined as the practice of being modest enough to admit that others are better at something, and wise enough to try to learn how to match, and even surpass them.

Benchmarking is commonly misperceived as simply number crunching, site briefings and industrial tourism, copying, or spying. It should not be taken as a quick and easy process. Rather, benchmarking should be considered an ongoing process as a part of continuous improvement.

Benchmarking initiatives help blend continuous improvement initiatives and breakthrough improvements into a single change management system. Although benchmarking readily integrates with strategic initiatives such as continuous improvement, re-engineering, and total quality management, it’s also a discrete process that delivers value to the organization on its own.

Types of Benchmarking

Generally there are two types of benchmarking activities. They include:

1. Internal

2. External

a. Similar industry

b. Best Practice

Internal Benchmarking

Internal benchmarking typically involves different processes or departments within a plant or organization. This type of benchmarking has some advantages, such as ease of data collection and comparison; enablers such as employee skill levels and culture are generally similar. However, the major disadvantage of internal benchmarking is that it’s unlikely to result in any major breakthrough improvements.

External Benchmarking

External benchmarking is performed outside of an organization and compares similar business processes, or the best processes in any industry.

Similar-industry benchmarking uses external partners in a similar industry or with similar processes, sharing their practices and data. This process may be difficult in some industries, but many organizations are open to sharing non-proprietary information. This type of benchmarking initiative usually focuses on meeting a numerical standard rather than improving a specific business process; only small or incremental improvements are typically observed.

Best-practices benchmarking focuses on finding the best performer, or leader, in a specific process and partnering with that organization to compare practices and data.

Benchmarking Methodology

One of the essential elements of a successful benchmarking initiative is to follow a standardized process. Choosing an optimal benchmarking partner requires a deep understanding of the process being studied and of the benchmarking process itself. Such understanding is also needed to properly adapt best practices and implement changes to each organization's unique culture. Simply stated, the best practice needs to be tailored to meet an organization's culture if it’s to be implemented successfully.

This dynamic process often involves finding and collecting internal knowledge and best practices, sharing and understanding those practices so they can be used, and adapting and applying those best practices in new and existing situations to enhance performance levels.

The following steps are recommended for successfully implementing a benchmarking initiative:

1. Conduct internal analysis.

2. Compare data with available benchmarks.

3. Identify gaps in a specific area.

4. Set objectives and define scope.

5. Identify benchmarking partners.

6. Gather information.

a. Research and develop questionnaire.

b. Plan benchmarking visits.

7. Distill the learning-compile results.

8. Select practice to implement.

9. Develop plan and implement improvements-tailored practice.

10. Review progress and make changes if necessary.

Challenges in Benchmarking: The Code of Conduct

Benchmarking can create potential problems, ranging from simple misunderstandings to serious legal problems. To minimize the likelihood of these types of difficulties, it’s strongly recommended that benchmarking teams follow a simple Code of Conduct.

Legal

Don't enter into discussions or act in any way that could be construed as illegal, either for you or your partner. Potential illegal activities could include a simple act of discussing costs or prices, if that discussion could lead to allegations of price fixing or market rigging.

Be Open

Early in your discussion, it helps to fully disclose your level of expectations with regard to the exchange.

Confidentiality

Treat the information you receive from your partners with the same degree of care that you would give information that is proprietary to your organization. You may want to consider entering into a non-disclosure agreement with your benchmarking partner.

Use of Information

Don't use the benchmarking information you receive from a partner for any purpose other than what you have agreed to.

The Golden Rule of Benchmarking

Treat any benchmarking partners and their information the way you'd like them to treat you and your information.

Lack of Standardized Definitions

One of the challenges in the M&R benchmarking process is the absence of standardized definitions of M&R terms, including metrics. We have found from our own experience that during the benchmarking process, the benchmarking partners usually spend considerable time learning to understand each other's terms, including metrics, as well as what data goes into satisfying each specific metric. To overcome this challenge, the Society for Maintenance & Reliability Professionals (SMRP) has taken the initiative to define and standardize M&R terms. The SMRP team has undertaken a very rigorous and time-consuming development process to standardize maintenance and reliability-related terms, and to obtain feedback from the M&R community to ensure their validity. About two hundred terms have been defined and standardized by SMRP's best practices team. To fill the needs of the larger M&R community, a new document called "The Professional's Guide to Maintenance and Reliability Terminology" was also released. It’s a very comprehensive list of over 3000 definitions and acronyms in the M&R field, including project management and quality.

Society for Maintenance & Reliability Professionals (SMRP) Initiative

SMRP's effort is being carried out by its Best Practices committee.

The committee has been developing definitions for key M&R performance metrics. Through group consensus and an extensive review by subject matter experts, including the use of web-based surveys, these metrics are becoming industry standards. As such, they can be used in benchmarking processes and when searching for best practices. This helps to create a badly needed common language in the M&R field.

The development process used by the SMRP Best Practices committee has six steps:

1. Selection of key metrics

2. Preparation of the metric descriptions

3. Review and consensus by the committee

4. Review and feedback by subject matter experts

5. Final review and editing by the committee

6. Publication

A template was also developed by the best practices/metrics team to provide a consistent method of describing each metric. The basic elements of each metric are listed below (a sketch of the template follows the list):

• Title: The name of the metric

• Definition: A concise definition of the metric in easily understandable terms

• Objectives: What the metric is designed to measure or report

• Formula: Mathematical equation used to calculate the metrics

• Component Definitions: Clear definitions of each of the terms that are utilized in the metric formula

• Qualifications: Guidance on when to apply and not apply the metrics

• Sample Calculation: A sample calculation utilizing the formula with realistic values
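To make the template concrete, here is a minimal sketch of how its elements might be captured as a data structure, populated with a hypothetical schedule-compliance metric; the wording and figures are invented for illustration and are not taken from the SMRP publications.

```python
# Minimal sketch of the metric-description template as a data structure.
# The example metric and its wording are illustrative, not SMRP's published text.
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    title: str                    # name of the metric
    definition: str               # concise definition in plain terms
    objectives: str               # what the metric is designed to measure or report
    formula: str                  # mathematical equation used to calculate it
    component_definitions: dict   # definition of each term used in the formula
    qualifications: str           # guidance on when to apply (and not apply) it
    sample_calculation: str       # worked example with realistic values

schedule_compliance = MetricDefinition(
    title="Schedule Compliance (%)",
    definition="Share of scheduled work orders completed in the scheduled period.",
    objectives="Measure how well the weekly maintenance schedule is executed.",
    formula="100 * completed_scheduled_work_orders / total_scheduled_work_orders",
    component_definitions={
        "completed_scheduled_work_orders": "Scheduled work orders finished on time",
        "total_scheduled_work_orders": "All work orders on the weekly schedule",
    },
    qualifications="Apply to scheduled work only; exclude emergency work orders.",
    sample_calculation="100 * 85 / 100 = 85%",
)
```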

Visit SMRP's website at www.smrp.org to view its current list of the metrics and to obtain additional information regarding best practices and metrics. FIG. 4 is a list of metrics developed by the SMRP's Best Practices team.

FIG. 4 List of SMRP Metrics

FIG. 5 Maintenance and Reliability Best Practices Key Benchmarks

Benchmarks

A benchmark refers to a measure of best-practice performance, whereas benchmarking refers to the actual search for the best practices, with emphasis on how the process can be applied to achieve superior results. Thus, a benchmark is a standard, or a set of standards, used as a point of reference for evaluating performance or quality.

Benchmarks may be drawn from an organization's own experience, from the experience of others in the industry, or from regulatory requirements such as those from the Environmental Protection Agency (EPA) or the Occupational Safety and Health Administration (OSHA).

If we were to benchmark world conquest, what objective measure would we use to compare Julius Caesar to Alexander the Great, or Genghis Khan to Napoleon? Which of them was the epitome, and why? We do the same thing in business. Which organization has the best PdM program? Who provides the most responsive customer service department? Who is the best in planning/scheduling? What about the leanest manufacturing operation? And how do we quantify that standard? FIG. 5 lists some of the key maintenance and reliability best practices benchmarks.

6 Summary

Performance measurement is a means of assessing progress against stated goals and objectives in a way that is quantifiable and unbiased. It brings with it an emphasis on objectivity, consistency, fairness, and responsiveness. At the same time, it functions as a reliable indicator of an organization's health. Its impact on an organization can be both immediate and far-reaching.

Performance measurement asks the question, "What does success really mean?" It views accomplishment in terms of outputs and outcomes, and it requires us to examine how operational processes are linked to organizational goals. If implemented properly, performance measures (metrics) are evaluated not on the basis of the amount of money spent or the types of activities performed, but on whether the organization has produced real, tangible results.

The real objective of metrics should be to change the behavior so that people do the right things. The secondary objective is to determine the health of the process or assets being monitored. A metric is nothing more than a standard measure to assess performance in a particular area.

However, there is an imperative need to ensure that the right things are being measured.

Performance indicators can be leading or lagging. The purpose of using these indicators is to measure the performance of the process or asset and to help identify where the process is working well and where it’s not. Leading and lagging indicators provide information so that positive trends can be reinforced and unfavorable trends can be corrected through process changes. Leading indicators measure the process and predict changes and future trends. Lagging indicators measure results and confirm long-term trends. Whether an indicator is leading or lagging depends on where in the process the indicator is applied. A lagging indicator of one process component can be a leading indicator of another process component.

A benchmark is a measure of best practice performance.

Benchmarking refers to the search for the best practices that yield the benchmark performance, with emphasis on how we can implement those practices to achieve superior results.

Finally, it is essential to develop a performance measurement system that provides feedback relative to an organization's goals and supports it in achieving those goals efficiently and effectively. A successful performance measurement system should:

• Comprise a balanced set of a limited vital few measures

• Produce timely and useful reports at a reasonable cost, based on accurate, high-quality data

• Disseminate and display information that is easily shared, understood, and used by all in the organization

• Help to manage and improve processes and document achievements

• Support an organization's core values and its relationship with customers, suppliers, and stakeholders

7 QUIZ

__1 Why do we need a performance measurement system? What are the benefits of such a system?

__2 What are the benefits of benchmarking?

__3 Explain what is meant by a "World Class" benchmark.

__4 What are the key attributes of a metric?

__5 Explain leading and lagging metrics.

__6 What types of metrics show results?

__7 Explain the Balanced Scorecard model.

__8 Explain the different types of benchmarking. What are the benefits of external benchmarking?

__9 Discuss data collection and quality issues. How can we improve data quality?

__10 List five metrics that can be used to measure overall plant level performance of maintenance activities. Discuss the reason for your selection.
