If you have ever called a telephone customer service line for help with a product, most likely the person on the other end was having their performance measured by a series of metrics: how fast they resolved your problem or simply finished the call, how accurately they entered data related to your call, whether they were able to up-sell you to another product, and the response time after your call hit the queue. In many cases, your call would have been recorded, and a supervisor would later have listened to the recording with an evaluation form in hand, noting how many criteria were met.
Most customer-facing jobs include a set of metrics that allows management to measure performance, reduce overhead by minimizing repeat calls, evaluate customer satisfaction and minimize the percentage of customers who might jump ship to competitors. While many customers' experience is that they wait on endless hold, get a snarky person on the phone or end the call with no resolution, the fact remains that metrics are widely used in call centers.
Evaluating costs and performance against a predetermined set of criteria is a common practice, and it allows management to determine who might qualify for a pay raise, promotion or dismissal. It is also a way to stay within a budget and measure the effectiveness of current procedures. In the end, it's about getting better.
This notion of constantly improving, running a tight ship and providing a cost-effective product has not escaped the biomed world. Someone, somewhere is always evaluating how efficiently things are getting done and how well medical equipment is maintained and repaired, utilizing various criteria for measurement.
“CE/HTM departments mostly just do their jobs. The good departments know the importance of the hospital doing well financially, so they are very aware of the cost of running their department. These departments run their operation like a business,” says Frank R. Painter, MS, CCE, assistant professor and Clinical Engineering program director in the Biomedical Engineering Department in the School of Engineering at the University of Connecticut.
“This is what separates the mediocre operations from the very good operations,” he says. “So to run like a business you need to know how much it costs you to provide services to those you work for. This is a simple calculation, but few do it.”
Painter points out that this calculation consists of the total cost of running the CE department for a year – salaries, benefits, supplies, space, utilities, etcetera – divided by the number of hours of service the department is able to provide to its customers in a year.
“This will come out somewhere in the range of $75 to $120,” he continues. “If you know this, you can then calculate the cost of providing services to a particular department and compare yourself to other providers (GE, Aramark, Sodexo, etcetera). This is a valuable benchmarking metric but is also a useful number needed to run like a business.”
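Painter's arithmetic can be sketched in a few lines. All of the figures below are hypothetical, chosen only so the result lands inside the $75 to $120 range he cites; an actual department would substitute its own budget and staffing numbers.

```python
# Hypothetical annual costs of running a CE department (illustration only).
annual_costs = {
    "salaries": 600_000,
    "benefits": 180_000,
    "supplies": 50_000,
    "space_and_utilities": 40_000,
}
total_annual_cost = sum(annual_costs.values())  # $870,000

# Hypothetical productive hours: 6 technicians, ~1,500 service hours each.
service_hours_per_year = 6 * 1500  # 9,000 hours

# Painter's calculation: total cost divided by service hours delivered.
hourly_rate = total_annual_cost / service_hours_per_year
print(f"Cost per service hour: ${hourly_rate:.2f}")

# With the rate in hand, the cost of supporting one department is just
# that department's logged service hours times the rate.
hours_for_one_department = 800  # hypothetical
department_cost = hourly_rate * hours_for_one_department
print(f"Support cost for that department: ${department_cost:,.0f}")
```

The per-department figure is what Painter suggests comparing against quotes from outside providers such as GE, Aramark or Sodexo.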
“Historically, clinical engineering programs have tracked such metrics as PM completion rate; percentage of scheduled inspections completed on time; number of device failures; number of equipment problems found to be ‘use’ related (i.e., operator error); service response time; and cost-of-service ratio (COSR),” says Dave Dickey, CHE, CCE, corporate director of Clinical Engineering for McLaren Health Care in Flint, Michigan.
“Do any of these metrics have anything to do with measuring ‘efficiency or success?’ Perhaps, but most likely, they do not,” he says. “Efficiency of a CE program could perhaps be quantified by measuring the degree to which a desired outcome was achieved with the least effort.”
“I guess this was more relevant long ago, when the focus was on doing electrical safety tests as a way to prove that a PM was done. We got really high numbers, with little work effort involved beyond pushing that little ‘lift ground’ button. Cranked out 50 inspections in a day. Does this mean our program was efficient?” Dickey asks.
Dickey questions whether success can be quantified by the degree to which goals are obtained. Does doing 100 percent of scheduled inspections mean you have a successful program?
“Perhaps, but what if 10 percent of your inspections actually induced post-inspection failures of the equipment tested, due to improper maintenance (i.e., putting it back together incorrectly)?” he says.
“Hitting a financial target could also mean you were successful in meeting the goal, but what if this was achieved by spending far too much time on equipment repairs, say, taking three days to fix equipment using in-house staff or using parts that were DOA, as opposed to biting the bullet and calling in the vendor, who may have had the device repaired in half the time? You may have saved some cash, but at what expense?” Dickey adds.
Dickey also notes that some commonly used metrics may not apply evenly across different institutions and programs.
“Also, cost-of-service ratio, while I believe it is a great benchmark for tracking year-to-year trends, identifying overall financial cost of equipment ownership trends and supporting the annual budgeting exercise, is not a good benchmark to use from one hospital to another, since it is highly dependent on equipment type, what’s included in your program and purchase cost,” he says.
“However, I do suspect it is quite useful when comparing in-house versus outsourced program options, assuming you have an apples-to-apples comparison of program inclusions (for example, an in-house program may cover abuse-related repairs, whereas an outsourced program may not). Another thing to watch for, when attempting to benchmark cost-of-service ratio, relates to equipment types covered. For example, our program includes costs for surgical instrument sharpening, power equipment, rigid and flexible scopes, linear accelerators and some image management systems, whereas other HTM programs may not,” Dickey adds.
The variability in the cost-of-service metric is one reason that other approaches are utilized as well, according to Benjamin Lewis, MBA, CHTM, director of Clinical Engineering GA/FL for Novant Health Inc.
“Because of our close relationship with our supply chain, we are still able to calculate our cost-of-service ratio. However, this is a difficult metric to capture and can also be somewhat fluid,” Lewis says. “There are other ways to measure and track your performance. The depth and meaningfulness of these key performance indicators are only limited by the data you accurately track in your CMMS.”
Lewis says that some of these other key performance indicators (KPIs) include: “Parts Cost Savings: The difference in OEM and third-party parts costs. Success in this metric will be based on your organization’s savings versus spend trend over time.”
Another KPI he cites is: “Contractual Cost Savings: The reduction of contract spend by bringing contracts in-house or negotiation of a reduction in existing spend.”
He also lists Contractual Cost Avoidance, the cost savings from avoiding contracts through in-house maintenance, along with Actual Cost versus Budgeted Cost, the ratio of actual cost to budgeted cost.
“CE expenditures can vary due to maintenance needs like tubes and detectors, but the goal is to be at or below budget by five percent. The goal is not to pad your budget, but not to go over,” he adds.
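Two of the KPIs Lewis describes reduce to simple arithmetic. The figures below are hypothetical, used only to show the shape of the calculations:

```python
# Parts Cost Savings: the difference between OEM and third-party parts costs
# (hypothetical figures).
oem_parts_cost = 48_000
third_party_parts_cost = 31_000
parts_savings = oem_parts_cost - third_party_parts_cost
print(f"Parts cost savings: ${parts_savings:,}")

# Actual Cost versus Budgeted Cost: ratio of actual spend to budget
# (hypothetical figures).
budgeted_cost = 1_200_000
actual_cost = 1_150_000
variance = 1 - actual_cost / budgeted_cost  # positive means under budget
print(f"Under budget by {variance:.1%}")

# Lewis's stated goal: at or below budget, but by no more than five percent
# (i.e., under budget without having padded the budget).
meets_goal = 0 <= variance <= 0.05
print("Meets goal:", meets_goal)
```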
Tried and True
“Cost-of-service ratio is an excellent financial benchmark,” Painter says. “It tells your cost of doing business – running your HTM operation – but it is then ‘normalized’ so you can compare yourself to others of different size or capability.”
“The normalization factor is the purchase price of the inventory you support. So if you are relatively small, or very large, if you support imaging or you just do general biomedical equipment, the total cost of the inventory you support will be the factor which helps us all to compare ourselves to each other. The problem is that only in the last five years or so have many CE departments been aware that there is value in collecting acquisition cost,” Painter adds.
“Prior to that, most had little or no data about the cost of the inventory; but to properly calculate COSR we need the data. The mediocre departments, or the departments which have been squeezed by administration and are short staffed are not able to do the research to find or estimate these important numbers. So you can see that it is a mad cycle. If you don’t have the ability to collect the data, then you won’t get the metrics which will show how well you are doing,” Painter says.
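The normalization Painter describes can be sketched as follows. The numbers are hypothetical; the point is that dividing annual operating cost by the acquisition cost of the supported inventory yields a percentage that departments of very different sizes can compare.

```python
# Hypothetical figures for illustration only.
annual_department_cost = 870_000          # total cost of running the HTM operation
inventory_acquisition_cost = 15_000_000   # purchase price of the supported inventory

# Cost-of-service ratio (COSR): annual cost normalized by acquisition cost,
# typically expressed as a percentage.
cosr = annual_department_cost / inventory_acquisition_cost
print(f"COSR: {cosr:.1%}")
```

As Painter notes, the calculation is only as good as the acquisition-cost data behind it, which is exactly what many departments lack.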
Painter, Matt Baretich and Ted Cohen co-authored a 2015 guide to benchmarking published by the Association for the Advancement of Medical Instrumentation (AAMI), the “HTM Benchmarking Guide: Why Benchmarking Matters, and How You Can Do It.” The guide provides examples from AAMI’s web-based “Benchmarking Solutions” platform.
The guide helps in establishing some degree of standardization, as well as covering a case study and reasons for benchmarking.
“My experience as one of the subject matter experts for the AAMI Benchmarking program is that COSR is increasingly recognized as a fundamental performance metric for HTM,” says Matt Baretich, P.E., Ph.D., president of Baretich Engineering, Inc. in Fort Collins, Colorado. “That’s certainly the case among my consulting clients and the professional colleagues I communicate with.”
Baretich says that the challenge is that many HTM programs do not have data for the dollar value of the equipment they manage. Since the objective of performance monitoring and benchmarking is performance improvement, he recommends that one of the first performance improvement initiatives is to get the data.
“That means working with Finance to make sure that HTM routinely gets cost data on all new purchases. For existing equipment, I recommend making good estimates and focusing first on high-value equipment such as imaging devices and systems,” Baretich says.
Setting the Bar
Painter differentiates between completing regularly assigned tasks and making measurable incremental improvements.
“The metric ‘PM completed’ is not a metric anyone should consider to be a performance improvement benchmark. It is a basic requirement of the CE department to get these things done as required by TJC,” he points out.
“Following them is good, as it will help you keep your job, but it cannot be considered a performance improvement metric. You either get them done or you’re toast,” he adds.
“Related to performance improvement, there are two types of benchmarking. Benchmarking against yourself and trying to continuously improve your own performance,” Painter says. “This is the purest form of benchmarking and if done carefully and with eyes wide open, it can result in good forward progress in supporting your hospital’s healthcare technology.”
“The other method is to benchmark yourself against others. We do this to ‘keep up with the Joneses’ or as a self-defense measure to show your boss you are in the ballpark and are worth investing in,” he adds.
Quality work equals devices that run smoothly and with fewer disruptions. Tracking data accurately can provide additional benefits.
“In addition to quantitative metrics, qualitative metrics also allow us to increase cost savings through performing alternative equipment maintenance on some equipment that meets the parameters for being part of our AEM program, excluding imaging and laser equipment,” Lewis says.
“Through proper work order tracking and accurate data pulled from our CMMS, we can study metrics like time between failure and total corrective maintenance work orders for a specific model of equipment over a period of time which enables us to make informed decisions on maintenance intervals and procedures.”
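The kind of analysis Lewis describes, computing time between failures from CMMS work-order history, can be sketched like this. The dates and the model are hypothetical, standing in for a real CMMS export:

```python
from datetime import date
from statistics import mean

# Hypothetical corrective-maintenance work-order dates for one equipment
# model, as might be pulled from a CMMS.
failure_dates = [
    date(2015, 1, 10),
    date(2015, 4, 2),
    date(2015, 7, 19),
    date(2015, 11, 5),
]

# Days between consecutive failures, then the mean time between failures.
intervals = [(b - a).days for a, b in zip(failure_dates, failure_dates[1:])]
mtbf_days = mean(intervals)
print(f"Corrective work orders: {len(failure_dates)}, MTBF: {mtbf_days:.0f} days")

# A consistently long MTBF for a model might support lengthening its PM
# interval under an AEM program (excluding imaging and laser equipment,
# per Lewis).
```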
The Next Evolution
Finally, much in the way CMS holds a measuring stick up to a health care provider with its own set of metrics, the approach spills over into the HTM department.
“At McLaren Health Care, we are launching a new set of metrics which will allow us to track quality and effectiveness of our medical equipment management program, which I believe are more appropriate for demonstrating an HTM program’s value to the health care organization,” Dickey says.
“The focus here is to determine how our work affects (1) patient care outcomes and (2) patients’ length of stay. While focusing on these metrics is somewhat new, and a ‘work in progress,’ the reality is that, as long as the inspection agencies keep asking about PM completions, our industry will continue to measure, report, brag about and/or hide from what these numbers actually reflect, which is not much,” he concludes.
When a call comes into a call center, a well-prepared customer service agent can provide the right answer and send the customer happily on their way. When medical equipment is up and running, a patient, who came in through the ER, may be able to get back home before the sun goes down. A measure of success in achieving these things goes a long way.
© 2015, TechNation Magazine.