From the “Productivity Blog.”
Over 90% of organizations now say they are tying salary increases and annual bonuses to specific performance measures, up from 78% in 2009, according to a new study conducted by the Institute for Corporate Productivity (i4cp). Despite these high numbers, the study also shows that many companies aren’t successfully executing their pay for performance strategy.
Oh yes. This has always been an issue, a need, and a goal for many organizations. My question: do you pay for past performance – or for current capability to perform? Do you pay the firemen for what they did – or for what they can do? What about other jobs and roles?
I’ve been there and done that – so to speak.
Our Prudhoe Bay client in 1987 – ARCO of Alaska – had attempted one of these efforts twice before bringing us in. We conducted a large CAD – Curriculum Architecture Design – effort and then built only the Performance Tests – no training – which is really the IAD methodology for those who follow my PACT Processes for T&D, Learning, and Knowledge Management. It was a Pay for Knowledge (and Skill – applied to the job tasks of each job) effort.
As workers – then and there, and everywhere today – do not always control enough of their circumstances (see Deming) to be paid based solely on what they did, some firms have shifted to paying for what workers can do – if given the chance. Like the firemen – who unfortunately do get to demonstrate their skills far too often. But I digress.
From my and Ray Svenson’s 2008 book: Employee Performance-based Qualification/Certification Systems…
Two Short Case Studies
The two cases are both from related projects in North America. There are two parts to the story. The first involves maintenance technician qualification for one of the country’s largest oil fields. The second involves operator and maintenance technician qualification for one of the country’s largest oil pipelines. The oilfield in the story feeds the pipeline, but they are operated by different companies.
These stories are connected by some of the people who were involved in both of them and by the similarity of the qualification systems implemented in them.
The Oil Field
In the mid-1980s, the Oil Field Company instituted a Pay Progression Program (PPP) for its maintenance technicians. This program tied progression through the technician pay grades to performance, time in grade, and demonstrated qualification to perform technician work. There were 15 different classifications of technicians, including electricians, mechanics, welders, instrumentation techs, computer systems techs, automotive techs, heavy equipment operators, and others.
When we were called in to help in 1986, the program was stalled because they had not found an acceptable way to qualify the technicians for the work as part of the program. Two previous attempts had failed, in part because they relied heavily on written tests which the technicians did not like and which did not really test their ability to do the work.
The technicians were responsible for maintaining virtually all systems and equipment on the oil field from the well heads to the oil-water-gas separation plants, gas and water re-injection plants, and all the pipelines to the head end of the Pipeline. They were also responsible for all the roads, vehicles, airport runways and navigational equipment as well as living quarters, potable water and sewage systems, and electricity generation and distribution.
The project was guided by a Steering Team made up of the Maintenance Managers, HR, and Training Department representatives who were responsible for administering the PPP program. We assembled a team of highly respected technicians for each of the classifications, e.g., electricians, to work with us to design the qualification testing system. Most of the teams had one or two engineers as well as the expert technicians. We all agreed right from the start that there would be no written tests unless absolutely necessary, and that all the tests would involve the performance of real tasks.
With each technician team we first identified all the systems and equipment on which they worked and then made a list of tasks for each system or equipment component. This was made easier by the use of a standard list of action verbs for the tasks, such as: inspect, clean, calibrate, repair, overhaul, troubleshoot. The task lists then became a matrix mapping the equipment and systems to the action verbs. Tasks were rated critical (worthy of testing) or not critical based on their impact on safety, operation of the oilfield, and protection of the equipment and the environment. The teams then identified people they considered expert performers per task for us to work with to develop the tests for critical tasks. In all, there were over two thousand tests to develop.
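The equipment-by-action-verb matrix and criticality rating described above can be sketched as a simple data structure. A minimal sketch, assuming illustrative equipment names, a 1–5 impact scale, and a rating threshold – none of which are from the actual project:

```python
# Hypothetical sketch of the task-list matrix: each piece of equipment
# is crossed with each standard action verb to form the task list, and
# each task is rated on its impact to decide whether it is "critical"
# (worthy of testing).

ACTION_VERBS = ["inspect", "clean", "calibrate", "repair", "overhaul", "troubleshoot"]

# Illustrative impact areas used to rate criticality (assumed, not from the book)
IMPACT_AREAS = ("safety", "operations", "equipment", "environment")

def build_task_matrix(equipment_list):
    """Cross every piece of equipment/system with every action verb."""
    return [(equip, verb) for equip in equipment_list for verb in ACTION_VERBS]

def is_critical(impact_ratings, threshold=3):
    """A task is critical if any impact area is rated at or above the
    threshold on an assumed 1-5 scale; unrated areas default to 0."""
    return any(impact_ratings.get(area, 0) >= threshold for area in IMPACT_AREAS)

# Hypothetical equipment and ratings, for illustration only
tasks = build_task_matrix(["wellhead valve", "separator pump"])
ratings = {("wellhead valve", "repair"): {"safety": 5, "environment": 4}}

critical_tasks = [t for t in tasks if is_critical(ratings.get(t, {}))]
```

With two equipment items and six verbs, the matrix yields twelve candidate tasks, of which only the one with a high-impact rating is flagged for test development.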
We developed a standard test form for an “evaluator” – an expert in the performance of the specific task being tested – to use while observing the candidate performing the task.
All the tests were to be performed on real work, on simulated work, or as a “talk-through troubleshooting” routine we developed for situations where an adequate simulation environment, such as a lab, was not available. In fact, many of the troubleshooting tasks were tested using the talk-through simulation method, since it is not possible to introduce real troubles into most operating systems. Because we were able to test actual performance of all the critical tasks this way, there was no need to test theory and hence no need for written tests. The actual tests were developed with technicians who were judged to be experts and were reviewed by their peers and the engineers. The technicians had great faith in the validity and value of the tests, since their representatives had determined what needed to be tested and had contributed all the technical content of the tests.
A change process was established so that anyone at any time could challenge the validity of a test, suggest changes to a test, or suggest a new test. These challenges and suggestions could be based on changes in the work or the work environment, or just the perception that something was not right. The suggestions were to be formally tracked and resolved by “Review Boards” for each technician group, made up of subject matter experts, master performers, and supervisors. In this way the system was kept evergreen and continuously validated.
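The track-and-resolve flow for test challenges can be illustrated with a minimal record structure. This is a sketch under assumed field names and statuses – the actual administrative system's records are not described in that detail here:

```python
# Hypothetical sketch of the test-challenge tracking described above:
# anyone can submit a challenge against a test, and the per-group
# Review Board logs it and records a resolution.
from dataclasses import dataclass

@dataclass
class TestChallenge:
    test_id: str
    raised_by: str
    reason: str            # e.g. a change in the work, or "something is not right"
    status: str = "open"   # assumed lifecycle: open -> resolved
    resolution: str = ""

class ReviewBoard:
    """Per-technician-group board that formally tracks and resolves challenges."""
    def __init__(self, group):
        self.group = group
        self.log = []      # every challenge is logged, keeping the system auditable

    def submit(self, challenge):
        self.log.append(challenge)

    def resolve(self, challenge, resolution):
        challenge.status = "resolved"
        challenge.resolution = resolution

# Illustrative usage with made-up identifiers
board = ReviewBoard("electricians")
c = TestChallenge("ELEC-042", "J. Smith", "new breaker model in service")
board.submit(c)
board.resolve(c, "test updated for new breaker model")
```

The point of the sketch is the audit trail: every challenge stays in the board's log with its final resolution, which is what keeps the test inventory "evergreen" and defensible.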
We created an administrative system to manage the qualification process, covering: selecting and qualifying evaluators, individual test planning, conducting and documenting testing, processing appeals by technicians who felt they had been unfairly failed on a test, record keeping, and continuous improvement of the tests and of the entire system and its subsystems.
The system was implemented with the full support of the technicians and their management, and it survived for many years – with modifications and improvements – until the company was acquired and the system was merged with those of the acquiring company.
The Pipeline
In the early ’90s the operation of the Pipeline was audited by the Bureau of Land Management, which has regulatory oversight responsibility. One of the findings of this audit was that the Pipeline Company that operated the pipeline did not have adequate records to prove that its operators and maintenance technicians were qualified to do their work. There were no incidents or other indicators traced to under-qualified workers, but the lack of verifiable qualification was considered a quality issue to be remedied, since personnel safety and environmental and equipment protection and integrity were at stake. The company set out to develop a qualification system and to test its existing operators and maintenance technicians to demonstrate their qualification.
The Pipeline training department was given the task of designing and implementing the qualification system. As it turned out, one of the training leaders had been the training manager who had brought us in on the successful Oil Field project. This resulted in an opportunity for us to help design a similar system for the Pipeline Company.
The operators and maintenance technicians were upset, and in some cases angry, that their skills were being called into question and that they would all have to re-qualify for their jobs using a qualification system with which they were not familiar. They self-organized an “Operator/Technician Advisory Board” to present their concerns to management. Fortunately, management seized the opportunity to communicate with the employees through this Board and used the Board to work with us as an advisory group to help design the qualification system and “sell it” to the operators and technicians. Once the Board became convinced that the company had no choice because of the regulatory environment, they recognized that their participation would help us create a qualification system that had the respect of the operator/technician community.
From that point on, the story followed the same path as the Oil Field story with teams for each specialty, the same types of performance-based qualification tests, expert performers as evaluators, an administrative system, etc. A sub-team of the Operator/Technician Advisory Board helped design the administrative system and then went around to the various pump stations, marine terminal, and control room and sold the system to their compatriots. The system was implemented successfully.
Common Threads From Both Cases
There were a number of common threads worth mentioning that emerged from both cases:
- The training departments took over administration of the qualification system including management of all the documentation
- “Selling” the system to the operators and maintenance technicians was a major effort which paid off with a high level of acceptance
- The performance-based testing of specific task performance worked well and sustained itself over time; even the talk-through simulations worked well
- Training of all the participants – candidates, evaluators, supervisors, and review boards – including video demonstrations of the testing process, helped implementation proceed with no major snags
- Managing to have enough qualified evaluators on hand during the first implementation wave was a challenge that was overcome but required careful planning
- In both cases the qualification program enabled the company to gain a total inventory of its maintenance work for the first time
Early reviews for “Employee Performance-based Qualification/Certification Systems” – 2007
Darlene Van Tiem:
Svenson and Wallace provide a definitive guidebook complete with sound advice and a wealth of examples, covering everything you need to establish and sustain a successful qualification/certification system!
This whole book is like a road map to unexplored territory. Some practitioners have been there before but left no maps to guide those who follow. You have mapped out a complex territory that has had little systematic attention but which is very important.
This book is a very useful contribution to the practice of performance development and improvement. Most of the professional literature focuses on elements of the system—test development, feedback, etc. and NOT on the design and management of a whole-company approach to qualification and certification. Most of the really difficult issues are not in the individual blades of grass, but are in the overall landscape which you describe so well.
This book should be required reading for anyone who is venturing out for the first time to create a qualification/assessment/certification system.
I like the questions approach used at the beginning and end of each chapter. I very much like the preface. It “sets” the book well regarding expectations. Emphasis on project plan criticality is GOOD! For some reason, establishing a strong agenda for meetings seems to be very difficult for most; these samples should be most helpful! The case studies are strong and I’m glad you incorporated those; most helpful. I really liked the work overall; it is thorough and well done.
Mark Graham Brown:
Thanks for sending me the book! You guys have done an amazing amount of work to document all this stuff and present it using beautiful pages. It looks very professional.
If the goal is to give someone step-by-step directions on how to design, develop, and maintain such a system, there is a lot of great detail here. Chapter 1 is interesting reading, addresses key questions a reader should have, and is clearly written. The book is clearly based on some valuable real-world experience. The Alaska examples are good case studies. The book is a great documentation of the process and lessons learned on these two projects.
In my opinion the first few chapters are written in a way that does interest people like myself. I think you guys have done a nice job in grabbing the audience early.
I like the 9 part cover diagram! Clear, simply written, easy to follow. The book format and layout look good – eye appeal! Excellent introductory chapters. Chapters 3-6 provide a good overview of the system. Chapters 7-10 provide more detail about the system. Excellent lists and tables. You’ve hit the target and are on the mark!
This is a manual for building a bullet-proof, performance-based qualification and certification system. As complex as a project of this magnitude could be, this book provides the fundamental “how to.”
Very well done! I like the conversational style. You’ve taken a relatively complex and detailed process but have handled describing it with plain business language. The one thing I really like about all the work you guys have done together is that you are always aware of the needs of the business at every point of the process.
The project plan for the TMC Stores case study is worth the price of admission. It provides a very good picture of how it all comes together. Nice addition! If I were charged with that responsibility, this book is where I’d start! Given the book as the operating guide, I think I could take the project plan and begin to do it!
# # #