The ISO-PPC is a third-party standardized assessment that is performed approximately once a decade. Most departments accept this accreditation process as the only system of assessment. It is for this reason we set out to investigate its accuracy, and therefore its usefulness, as a way to measure our performance. After all, we know our strict adherence to centuries of tradition is not enough to meet the criteria of 21st-century evaluation practices, but conversely, do we know with certainty that the ISO meets modern-day standards? This is a particularly important question to ask at this moment because the ISO is in the process of overhauling the PPC system, and we as fire departments, citizens, and academics must be aware of, and involved in, that process.
In this essay, we present evidence that suggests the ISO as it is currently structured might not be the most accurate tool for measuring overall performance for two reasons. First, we found that half of fire chiefs in one rural North Carolina county believe the ISO rating is “unrealistic” and “needs overhaul.” Second, we compared department budgets to ISO ratings and found evidence of an association between them (the bigger a department’s budget, the better its rating). Consequently, it appears the ISO rating might be influenced more by budget size than by level of performance.
We want to stress that our research is exploratory and far from conclusive. More sophisticated survey research and statistical analysis are needed before we can make any conclusive claims about the value of the current rating system. Over the next year, we hope to conduct a statewide survey of fire chiefs and analyze more data on ISO ratings alongside departments' loss of life, rural or urban status, response times, and several other variables. Our goal is twofold. First, we want to begin an honest discussion throughout the profession about the ISO rating process. Second, we want to conduct sound social scientific research that will inform that discussion. We invite firefighters and academics to join us in this important research, which has real-life consequences.
Our Methods and Results
Our findings rely on two methods of analysis. First, we conducted a survey of fire chiefs in a rural North Carolina county that wishes to remain anonymous. We found a significant level of dissatisfaction among the chiefs when asked: “How do you feel about the ISO Rating Process?” In the pie chart, the green section represents the portion of those satisfied with the rating system, while the larger blue section represents the portion of those dissatisfied with the rating system. Those chiefs who did not answer the question on our questionnaire are represented by the grey portion. For our statewide survey, we will ask the chiefs to better explain their responses so we can have more information on why they are satisfied or dissatisfied. Additionally, we will follow up with those who chose not to answer the question to find out why they did not respond. This is important because there might be political or departmental pressures that prevent some chiefs from responding honestly to the question. Our promise of anonymity is very important in receiving valid responses.
Second, we analyzed data already collected by the University of North Carolina School of Government’s “North Carolina Local Government Performance Measure Project.” The data collected in this project is used to evaluate the efficiency of local governments. The study includes 17 North Carolina municipalities, and for our purposes it allows us to observe differences between departments with a varied range of ISO ratings, budgets, and size.
Based on our analysis of this data, it seems that budget could influence the rating more than performance does. One would expect some level of relationship between budget and ISO rating, as more money can mean more personnel and equipment, which can affect response capabilities. However, this relationship should not be the primary factor in an ISO-PPC, as there are other critical aspects that gauge the quality of fire protection, and different departments spend money differently. We want to be very cautious here because we looked at very few departments and the association is not statistically significant. In other words, we cannot make any claims about the effectiveness of the ISO rating; however, we are encouraged to take a closer look at the entire state. Our next step is to analyze all the data from the 17 municipalities using regression analysis and report our findings in a future paper.
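For readers curious about what this kind of association test looks like in practice, the sketch below computes a Pearson correlation and a one-variable least-squares regression line with Python's NumPy library. The budget and rating figures are hypothetical illustrations, not our survey data; ISO-PPC classes run from 1 (best) to 10 (worst), so a negative correlation means bigger budgets go with better classes.

```python
import numpy as np

# Hypothetical figures for illustration only -- NOT our actual data.
# Budgets in millions of dollars; ISO-PPC class where 1 = best, 10 = worst.
budgets = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.5])
ratings = np.array([7, 6, 5, 4, 3, 3])

# Pearson correlation coefficient: negative r means larger budgets
# are associated with numerically lower (better) ISO classes.
r = np.corrcoef(budgets, ratings)[0, 1]

# Simple least-squares regression line: rating = slope * budget + intercept.
slope, intercept = np.polyfit(budgets, ratings, 1)

print(f"r = {r:.2f}, slope = {slope:.2f}")
```

With so few observations, a strong-looking r can still fail a significance test, which is exactly the caution we raise above; the full 17-municipality analysis will need proper hypothesis testing before any claims are made.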
Why Our Study Is Important
We think our research and our call for more analysis and discussion by members of the firefighters’ fraternity and the academics who study it are important for two reasons.
First, the ISO rating was designed for insurance companies to assess risk of fire losses, not to assess the performance and safety of fire departments. However, over the years the ISO rating system has become a performance measure allowing chiefs to justify expenses.
For example, how many times have more rural departments justified another engine because their ISO rating would help reduce insurance premiums? Or more often, how many times has a city justified a multi-million-dollar ladder truck to lower a rating? This is not to say some of these expenses are not justified, but we must be honest and acknowledge that the ISO rating system has served departments as a tool to facilitate increases in budgets, not only as a performance assessment or decision aid. This unintended consequence of the ISO rating system tends to serve our needs, but what impact does it have on fire departments when it is used as a performance measure?
Second, the ISO is designed to aid departments in the evaluation and understanding of their decisions, but this does not mean it is a measure of all aspects of performance. This might sound like splitting hairs but there is a difference between recognizing performance shortcomings and evaluating substandard procedures for response. This is important because, due to the absence of an independent state evaluation scale, the ISO rating is the only game in town. This is problematic since agencies seeking accreditation through the Commission on Fire Accreditation International (CFAI) are few and far between. Even the CFAI process, while important, is not adequate to serve as a nuanced performance measure because it is only pass or fail. This leaves the ISO rating as the only uniform and widely-accepted performance measure in the state. For the sake of our profession, this third-party rating system requires closer examination.
In addition to directly contributing to the current process of overhauling the PPC system, there are several steps that chiefs who are dissatisfied with the ISO can take to improve departmental performance:
- Become more than a good operational fire chief, also become an effective administrator who is able to step back and critically look at all aspects of the operation.
- Pursue accreditation (although many departments do not have the resources to devote to such an involved and expensive process).
- Look at your department as a business and your citizens as customers.
- Involve all levels of your department in a regular 360-degree evaluation to identify strengths and expose weaknesses at all levels.
- Start a proactive committee to think through the department’s challenges and develop practical tools for evaluation as well as solutions for challenges in your department. The key is to customize this strategy for your own needs. What works in New York may or may not work for you. Do not be afraid to seek help or further education if needed to strengthen your statistical skills.
- Do not be afraid to seek the advice of someone who is outside your organization. Although they will not know everything about your department, for that very reason they will bring a fresh perspective. Sharing each other’s experiences will only make us stronger as a profession.
In conclusion, we in the fire services and those who study it need to continue to focus on quality enhancement. Let us, as a larger community, take a closer look at the current ISO-PPC rating system while also creating our own assessment tools. We owe it to ourselves and the people we protect.
Brian Barnes serves as the Risk Management Program Manager with NC Emergency Management. Barnes is also a firefighter/EMT with Buies Creek Fire Department in Harnett County. Barnes is currently pursuing a Master of Science degree in Safety, Security, and Emergency Management from Eastern Kentucky University. John C. Mero teaches public administration and emergency management preparedness at Campbell University. He earned his Ph.D. from Syracuse University’s Maxwell School of Public Affairs and Citizenship.