Public Fire Ed Report Card: Not All Programs Effective

Ever since “America Burning” gave credibility to the belief that public fire education was “the single activity with the greatest potential for reducing losses,” fire departments have given more thought, if not always true support, to this prevention effort.

Public fire education continues to mature. Its leaders have learned to develop cost-effective programs of proven results. But the well-meaning programs of some fire departments can only be deemed a waste of time, money and personnel.

The use of education to prevent human carelessness with fire existed before the report of the President’s Commission on Fire Prevention and Control was published in 1973, of course, but only afterward did public fire education’s own education formally begin. Unfortunately, not everyone has kept up with the rest of the class.

Hitting the target

Public fire education can be compared to a fire stream. If either one is applied directly and correctly, the respective fire problems can be controlled. The difficulty is that public fire education (and water) is not always applied effectively. Again, some programs are clearly outstanding. Some are not.

With fire department budgets coming under closer and closer scrutiny, it is more important than ever to recognize the difference.

Programs of public fire education are as varied as the fire problems they attack. To be proven “effective,” however, a program must include (by definition) documentation and evaluation. And in general, the results should be measurable loss reduction. “America Burning” tried to emphasize this with at least two separate statements:

  • “Parts of the program must be designed to provide feedback information on program effectiveness—information which is essential to achieving optimum benefit, yet is usually not collected.”
  • “It is safe to assume, given the sheer number of efforts, that some programs are less effective than others. What is needed is a mechanism for evaluating these programs so that weak efforts can be replaced by coordinated support of efforts of proven effectiveness.”

Documentation is the record of what was done in a program. It includes materials developed and the relevant statistics from the beginning and end of the time period. Evaluation is the measurement of those statistics and the effect of any behavior change.

Worth the time

Documentation and evaluation take more time, but they are worth it.

Good documentation will always assist the good program. Even the stingiest finance official would have trouble opposing the results of an early Missouri fire education program. The fire death rate there had been above the national average when the program began. Yet after three years of the program the rate had dropped 43 percent. For each dollar invested in the program, $20 were saved in anticipated property losses, medical expenses and earning losses.
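The Missouri claim boils down to a simple benefit-cost ratio. The figures below are invented for illustration only (the article gives the ratio, not the underlying dollar amounts); the sketch shows how a department might state its own case.

```python
# Hypothetical program figures, invented for illustration of a
# Missouri-style cost-benefit claim: dollars saved per dollar invested.
program_cost = 25_000  # total spent on the education program
avoided_losses = 500_000  # anticipated property losses, medical expenses, lost earnings

# A ratio of 20 means $20 saved for each $1 invested.
benefit_cost_ratio = avoided_losses / program_cost
print(benefit_cost_ratio)  # 20.0
```

A ratio like this, backed by before-and-after loss statistics, is exactly the kind of documentation a finance official finds hard to oppose.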

In a recent report published by the United States Fire Administration (USFA), however, an account of an otherwise promising program concluded with the admission that the fire department “has not systematically evaluated the impact… on fire loss in the city. That is, there has been no formal assessment to determine whether individuals attending (the program) subsequently have fewer fires than those not involved.”

Elsewhere in the publication, on another program: “Tracing the effects of (the program) on children’s attitudes and behaviors is nearly impossible.”

Finishing the job

These programs represented sincere hard work and they may indeed be effective. So why not prove it with a little extra effort to document and evaluate? As the cartoon says, “The job isn’t complete until the paperwork is finished.”

Record keeping in itself may not be a personally rewarding activity, but Pam Powell of the USFA Office of Planning and Education calls it the “foundation of a good program. It helps focus the program and then helps in proving results.”

The home fire safety survey program in Edmonds, Wash., had the figures to prove its success. Residential fire incidence dropped from 164 to 26 in the second year. And public fire education was responsible for reducing fire incidence involving juveniles in Los Angeles from 169 to 12 in the third year of a special program. The figures from both efforts were there for all to see.

Educational principles

But what about a school program which involves simply gathering all students in an auditorium for a general talk on fire safety or to see a film? Records could be kept on the number of students and the length of the session.

Nevertheless, what is missing is evidence of learning.

The problem is that such a program disregards the principles of education. Now, if a trained teacher without any knowledge of fire attempted to teach fire safety, then the fire service would have reason to complain. Is it less wrong for fire personnel to attempt to teach without fully understanding how learning takes place?

A display of fire fighting equipment and apparatus likewise does not automatically qualify as education. “Public relations is not public education … but good public education is good public relations for a department,” explains Powell. The only certain learning from such a display, she adds, is “How red is my fire truck.”

Not equal

There may be many justifiable reasons for a department to conduct public relations programs, but they should not be confused with education.

At a recent conference of the National Fire Incident Reporting System (NFIRS), a speaker described how the use of statistics can avoid unneeded programs. He said some departments committed resources to a campaign against student smoking in schools. It was assumed that since smoking materials frequently contributed to fires elsewhere, it should be a problem in schools as well. But the NFIRS figures showed that smoking was seldom a factor in school fires even when there was no prevention program.

A program giving priority attention to false alarms without evidence of that being a priority problem is similarly not cost effective.

“Shotgunning” is as wasteful as missing the true target problem. This happens when a department with limited resources (all departments!) tries to attack every fire problem simultaneously. Time, money and personnel are spread so thin that none of the fire problems shows much improvement.

Failure to target on actual local fire problems is a main characteristic of ineffective programs, says Powell. With a fireground comparison she asks: “In a block of houses with only one house burning, where do you put the water?” The message equally applies to public fire education.

Some programs have failed because the enthusiastic fire educator—with an otherwise organized plan—forgets the need to solicit the support of the rest of the department. Perhaps the authority for a program has been delegated by the chief to the educator. That doesn’t relieve the obligation of the educator to first “sell” the chief and the others on the program. They are not likely to support what they don’t understand.

Other than maintaining good records and avoiding the pitfalls already described, what distinguishes the successful, effective programs? According to the USFA Office of Planning and Education, which studied programs that worked well, it did not matter where the program was conducted, what problem was attacked nor how much money was spent. The main thing was a logical plan.

One of the most experienced public fire educators, Mt. Prospect, Ill., Fire Department’s Lonnie Jackson was involved in developing the Five Step Plan recommended by the USFA (see also the November 1980 issue of Fire Engineering). It is no longer a new concept, but some departments still do not use it. Jackson believes, though, that no program should start without the systematic guidance of the plan’s building blocks:

  1. Identify the local fire problems.
  2. Select one of the major problems.
  3. Design an appropriate message.
  4. Implement the program.
  5. Evaluate the results.

Data

A critical part of identifying local problems is having the figures that point to the causes of frequent fire losses. How else can it be known with any certainty which types of fire hazards should be attacked first with limited resources?

Jackson watches for newly developing problems (fires in wood stoves, for example) with a map and colored pins. Different types of fires get a color-coded pin, so that trends in types of fires or in certain locations become more obvious. Then recognition and response don’t have to wait for a year-end accumulation of figures.

Larger national trends can now be spotted more quickly, too, thanks to the NFIRS computers and the figures supplied by the states.
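Jackson’s color-coded pin map is, in effect, a running tally of incidents by cause. A minimal sketch of the same idea, using invented incident records for illustration:

```python
from collections import Counter

# Hypothetical incident records (cause, neighborhood), such as a department
# might pull from its own run reports. The data here is invented.
incidents = [
    ("wood stove", "north"), ("cooking", "south"), ("wood stove", "north"),
    ("smoking", "east"), ("cooking", "south"), ("wood stove", "west"),
]

# Tally fires by cause, much as color-coded pins make trends visible on a map.
by_cause = Counter(cause for cause, _ in incidents)

# The most frequent cause is the leading candidate for a targeted program.
top_cause, count = by_cause.most_common(1)[0]
print(top_cause, count)  # wood stove 3
```

Kept up to date run by run, a tally like this flags an emerging problem without waiting for the year-end accumulation of figures.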

In selecting from the major local fire problems, the consideration should be for where the most good can be accomplished with the available resources. These resources are not limited to fire department money and personnel.

The selection of a target hazard to attack should be a cost-effective one. Targeting should focus on the right audience as well as the right problem. Considering which age group suffers more burn injuries, should a program describing “stop, drop and roll” first go to the schools or to civic clubs?

An inventory of resources may affect a cost-effectiveness decision. For example, if a senior citizen group is willing to voluntarily make home inspections, then that may take priority over another project requiring thousands of dollars. But the complex problems cannot be ignored just because they are more difficult. “Select an achievable objective,” Powell suggests. If the number one but most complex local problem cannot be reduced by 100 percent, then focus on parts of that main problem. And show a lesser, though worthwhile, reduction.

Getting through

In designing the appropriate message, you decide what you are going to say and how you are going to say it. The message must be simple, yet it must hold an audience’s attention. The USFA Public Education Assistance Program and Powell can help here by showing what messages have already passed the evaluation test elsewhere. (See also the September 1980 issue of Fire Engineering.)

Implementation is easier after the preliminary work has been thoughtfully completed.

A rigorous evaluation should consider many factors that could affect the surface results. For example, a project boasting of 1000 home inspections may not necessarily be effective if there is no evidence that actual reported hazards were corrected. However, if the later fires are correlated with the homes that were inspected, and the inspected homes showed fewer fires, then that is indeed a strong indication of effectiveness.
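Correlating later fires with inspected homes amounts to comparing fire rates between the two groups. A minimal sketch, with evaluation figures invented purely for illustration:

```python
# Hypothetical follow-up figures, invented for illustration: how many
# inspected and non-inspected homes later reported a fire.
inspected_homes, inspected_fires = 1000, 8
other_homes, other_fires = 4000, 120

# The program is only credibly effective if the inspected group shows a
# noticeably lower fire rate than the rest of the community.
inspected_rate = inspected_fires / inspected_homes  # 0.8%
other_rate = other_fires / other_homes              # 3.0%

print(f"inspected: {inspected_rate:.1%}, others: {other_rate:.1%}")
```

A gap this wide is a strong indication of effectiveness; rates that are nearly equal would suggest the inspections found hazards that were never actually corrected.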

A reduction in fires from one year to the next may not be credited to an education effort if the population decreased greatly. Other external factors could be present, too, and fire educators should be able to answer any challenges.

When the number of fire incidents showed a major increase one year in Mt. Prospect, Jackson was asked what was wrong. His program had reduced fire incidence by over 60 percent and reduced fire deaths to zero since 1976. By having good information on the reported fires and by analyzing the results, Jackson traced the overall increase to an increase in automobile fires. Most of them were in pre-1975 gas guzzlers, and insurance fraud was more evident than a breakdown in fire education. These would have to be labeled “nonpreventable” by public fire education efforts.

So, analyze the results. And don’t expect every effective targeted program to result in overall loss reduction in a town. There may be camouflage.

Evaluation can’t wait until the end of a lengthy program. A preliminary evaluation is desirable and possible in a short time if monitoring is done on a day-to-day basis. Then a program can be modified as necessary.

“It may take three to five years,” reminds Nancy Dennis Trench of Oklahoma State University, “to document a significant loss reduction, depending on a program’s scope.” After that, of course, the reduction may be dramatic and on-going.

Although less certain than loss reduction evaluation figures, four other measurements are recognized by the public education unit at USFA. Though less precise, these measurements nevertheless indicate a climate for future provable loss reduction.

  • Institutional change. When a television station provides free public service announcements or other fire-related programming, it consumes almost no fire department money and reinforces other organized projects. It’s the same when a business prints brochures to promote smoke detectors. Any assistance is positive.
  • Educational gain. Whatever the subject being taught (grease fires, for example), it helps if an awareness test is given before and after a presentation. If scores significantly increase after the lesson, then those individuals can better be expected to handle a grease fire properly.
  • Behavior change. This takes educational gain one step more. Learning about smoke detectors is educational gain, but when a detector is installed and an escape plan discussed as a result of a fire education program, then behavior change can be noted. If enough people take the same action, loss reduction evidence will follow.
  • Anecdotes. Before loss reduction trends are evident, preliminary effectiveness is frequently shown through real-life experiences. The expectation is that the ones made public represent other similar happenings.
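The educational-gain measurement above reduces to a before-and-after comparison of test scores. A minimal sketch, with class scores invented for illustration:

```python
# Hypothetical awareness-test scores (percent correct) for one class,
# taken before and after a fire safety lesson. The data is invented.
pre_scores = [40, 55, 50, 60, 45]
post_scores = [75, 80, 70, 85, 65]

# Average per-student gain; a significant rise suggests real learning,
# not just attendance.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_gain = sum(gains) / len(gains)
print(avg_gain)  # 25.0
```

A program that can report an average gain like this has evidence of learning, which a head count in an auditorium never provides.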

They were ready

The Hartford Insurance Company Junior Fire Marshal program has long recognized youngsters for fire heroism. Looking at those accounts, it becomes obvious that specific fire education lessons allowed the spotlighted youngsters to take the proper action even under difficult circumstances.

“I knew what to do,” explained 11-year-old Frank Kenney of Collegeville, Pa., “because the firemen came to our school and told us what to do about people on fire.”

His mother’s clothes had caught fire from the stove. She became frightened and started running. Outside it was Frank who reacted and told his mother to stop and roll on the ground. He poured water on her smoldering back and went to call an ambulance.

Frank’s mother had third-degree burns on her back. But she was alive.

No one can seriously dispute the value of education to prevent fire loss. The challenge, however, is for fire departments to support and to conduct only effective fire education programs. Anything less is a disservice to the public.
