PUBLIC ADVISORY COMMUNICATIONS SYSTEMS
"There is no limit to the good you can do if you don't care who gets the credit" (General George Marshall)
Homeland security communications with the general public exist in three forms: (1) risk; (2) warning; and (3) crisis. According to Bullock et al. (2006), risk communication involves alerting and educating the public to the risks they face and how they can best prepare for and mitigate those risks in order to reduce the impact of future disaster events. Warning communication involves delivering a warning message in time for individuals and communities to take shelter or evacuate in advance of a disaster event. Crisis communication involves providing timely and accurate information to the public during the response and recovery phases of a disaster event. In this lecture, we are primarily concerned with what are arguably the two most important forms of communication: risk and warning. A good deal of scientific research has been carried out on forms of communication, and warning communication has clearly evolved since the Civil Defense siren days. Crisis communication is not covered here because it is well known that the effectiveness of crisis communication largely depends on the charisma of the leader (messenger), and although there may be issues with this, the whole topic of crisis communication is comfortably treated as a subfield of public affairs (Mullis 1998). Risk and warning, however, are the forms of communication most commonly involved with the more technical aspects of terrorism alerts. Unfortunately, there is still a long way to go before such threats and hazards are communicated effectively.
No communications system is perfect. This is especially so with natural disasters, which are quite unpredictable and difficult for the public to understand (at least where the prediction science is involved). At least people know that natural disasters are short-lived events; with terrorist events, the unpredictability factor is multiplied tenfold. In addition, a bipolar view of the world tends to develop afterwards, lumping both individuals and nation-states into the categories of being "with us" or "against us." It is safe to say that vague messages and poor rhetoric will result in greater public fear and confusion. Some warnings may also result in public indifference (i.e., the "Boy Who Cried Wolf" syndrome). With almost any socio-economic hazard, public interest will be heightened and political suspicions will be rampant. The public will want to know, and quickly, what is going to happen, who is to blame, what the government is doing, and what they should do next. One issue that makes the terrorism communications problem so difficult (and different) is that at least part of a government's initial response to terrorism must remain classified. Authorities do not want to tip their hand or compromise the investigation, and there is always the chance of a second, or follow-up, terrorist attack. Hence, a balance has to be struck between the twin needs of restricting and releasing information. Anyone who has noticed how often a police spokesperson remarks that they cannot comment on something knows which side of the balance the criminal justice system usually favors.
WHAT DOES IT MEAN TO BE "WARNED"?
A number of principles exist in the area of warning communication, such as predictability, controllability, mobilization, and resilience. Many have survived more by custom than science, and many are in need of revision and modification. Disasters, of course, can never be eliminated solely by better scientific prediction, just as politics will never completely eliminate terrorism. All that might be hoped for are better "information delivery systems" which provide sufficient early warning for the maximum number of people impacted and result in the maximum amount of mitigation possible. This represents, unfortunately, a utilitarian or social contract approach which sometimes yields results contrary to common sense. Disasters do not impact a society of utilitarians. Terrorists attack indiscriminately. When it comes to warning, no one group of people is more important than another. There is a need to perfect warning systems.
The word "alert" is the collective term for both alarms and warnings, and indicates when a predefined set of parameters has reached a warning or alarm state. The word "alarm" generally refers to a condition where human scientists or a mechanical signal device indicate that at least one parameter has reached a critical or abnormal state. The word "warning" technically refers to action carried out by public authorities, who translate alerts and alarms into recommendations for action. As Alexander (2002) points out, alerts and alarms are generally the responsibility of scientific and technical staff, while warnings are generally the responsibility of political authorities. The relationship between alert, alarm, and warning is an interesting case of symbiosis between scientists, who pursue prediction in terms of relatively ambiguous statistical probabilities, and politicians, who tend to pursue absolute certainties and crystal-clear messages of hope and promise. Prediction and warning are very different things. It is questionable whether prediction science will ever help improve warning politics.
Furthermore, alert and alarm procedures are different from warning systems, at least in terms of communications channels. Two communications systems usually exist side by side: (1) an alert and alarm system that connects all the monitoring, sensor, and signal devices (of which there are many), allowing the technicians and scientists to securely communicate with one another; and (2) a warning system for the general public and specialized groups, which needs to be connected, in some way, to the scientific system so that any misunderstandings about the meaning or interpretation of scientific findings can be cleared up. The phrase "specialized group" also refers to that part of the public which is most likely to be impacted, and in this sense may constitute a private, or third, system of communications. There are pros and cons to specialized communications systems in the field of disaster management, as sometimes it is more efficient to warn one sector of society, and sometimes there is a legal obligation regarding "failure to warn" the rest of society. The channel which connects technicians and scientists is sometimes called the "priority" channel, and the one which connects politicians is sometimes called the "civil authorities" channel.
The principle of "mobilization" deserves some comment; it refers to the need for rapid response to the prediction of danger. The word "rapid" should not be taken too literally, however, because an established principle of disaster warning is that an affected area should be prepared to be self-sufficient for a minimum of 48 hours. The concept of mobilization highlights the importance of time requirements (which is not exactly the same as timeliness). Time requirements involve, for example, the fact that survival rates fall exponentially if a warning is not given at least an hour before impact. The preferable amount of lead time, established by custom with the agreement of the scientific community, is eight hours (for most disasters). The concept of mobilization also highlights the way in which alert and alarm procedures (aimed mainly at emergency workers) are connected with warning systems (aimed mainly at the general public). Adapted from Alexander (2002), the following is the general model for a full-scale mobilization process:
Phase one -- the activation or call-up phase for emergency workers; either telling them to report for duty or to be aware that they may soon have to report
Phase two -- the stand-by phase, indicating that conditions have worsened and emergency workers need to be deployed into the field and non-essential activities need to be shut down
Phase three -- the readiness phase, where conditions have continued to worsen, and all emergency workers are deployed, and the emergency operations center is fully functional
Phase four -- the evacuation phase, where preventive measures are taken, and the public is normally informed of the fully worsened threat
Phase five -- the emergency phase, involving the onset of the disaster or the highest level of threat
Phase six -- the stand-down phase, which can be interjected at any of the above stages, and normally means a regression to an earlier level
Phase seven -- the all-clear phase, which normally follows the stand-down phase, and represents the announced end of the crisis
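The seven phases above behave like a small state machine: phases one through five escalate as conditions worsen, while stand-down can interject at any point and regress to an earlier level. The following is an illustrative sketch only; the `Phase` enum and the `escalate`/`stand_down` helper names are assumptions for illustration, not part of Alexander's model.

```python
from enum import Enum

# Hypothetical encoding of Alexander's (2002) mobilization phases.
class Phase(Enum):
    ACTIVATION = 1   # call-up of emergency workers
    STAND_BY = 2     # workers deployed, non-essential activity shut down
    READINESS = 3    # all workers deployed, EOC fully functional
    EVACUATION = 4   # preventive measures, public informed
    EMERGENCY = 5    # disaster onset or highest threat level
    STAND_DOWN = 6   # regression to an earlier level
    ALL_CLEAR = 7    # announced end of the crisis

# Phases one through five form the escalation ladder.
ESCALATION_ORDER = [Phase.ACTIVATION, Phase.STAND_BY, Phase.READINESS,
                    Phase.EVACUATION, Phase.EMERGENCY]

def escalate(current: Phase) -> Phase:
    """Advance one phase as conditions worsen, capped at the emergency phase."""
    i = ESCALATION_ORDER.index(current)
    return ESCALATION_ORDER[min(i + 1, len(ESCALATION_ORDER) - 1)]

def stand_down(current: Phase, fallback: Phase) -> Phase:
    """Stand-down can be interjected at any stage, but only ever regresses
    to an *earlier* level; it never jumps forward."""
    if ESCALATION_ORDER.index(fallback) < ESCALATION_ORDER.index(current):
        return fallback
    return current
```

The point of the sketch is the asymmetry: escalation moves one step at a time, while stand-down may drop back several levels at once, and the all-clear normally follows a stand-down rather than the emergency phase directly.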
According to Foster (1980), once a scientific prediction has been made, an emergency manager should have the authority to decide, at any phase, whether or not to warn the public. However, there are many issues and dilemmas involved. One dilemma is that failure to warn at an early enough phase may be grounds for negligence lawsuits, while premature warnings may be perceived as false alarms and reduce public confidence in the warning process. Ideally, Section 9.1.1 of the emergency plan should outline the obligations of all parties on the interpretation of scientific evidence for purposes of time requirements, but as a practical matter, the decision to warn the public usually takes other issues into account; e.g., how educated the public is; the sociological and psychological factors; the economic and political effects; and whether prior testing has shown the warning system will function as expected.
The public education factor is perhaps the most important, because the public needs to know exactly what to do and needs to believe that the warning is legitimate and authoritative. Public education about risk is also a key factor in obtaining that elusive concept known as resilience. Standards for accuracy must be high. It is helpful to remember that we are talking about sharing warning, not prediction, with the general public. Many in the public, especially the media, will want to know exactly what is expected to occur and what causes it, and that is prediction, not warning. Warning tells the public what to do or what action to take, but the public is not generally confident in a warning unless they are convinced the underlying prediction mechanisms are legitimate (have adequate foundations in law) and effective (valid and reliable). The media compound the confidence problem by often broadcasting opinions rather than facts (Walsh 1996). An informational dilemma therefore exists, which is usually resolved by issuing a public advisory containing a little prediction along with a little warning. A public advisory is the closest thing to scientific gobbledygook without sounding like gobbledygook. The following is a sample public advisory warning, the form of which should be familiar to most people:
A Sample Public Advisory Type of Warning
The National Weather Service has issued a winter storm warning for the counties of ABC, MNO, and XYZ. Snowfall in depths up to six feet may unexpectedly hit eastern portions of those areas within the next eight to twelve hours, accompanied by freezing rain and ice which may disrupt utilities and, along with the snow, make travel impossible. According to both Doppler and infrared radar, there is an eighty-five percent chance of this event happening, with a ten percent confidence interval for error. Residents are urged to make immediate preparations for shelter and plan for at least two days until public services and utilities are restored. Local police will deploy roadblocks to curtail traffic, and emergency shelters will be available for those who feel they do not have adequate personal shelter for the possible two-day blackout. Do not delay. This is a Particularly Dangerous Situation (PDS). Make preparations now for staying inside, and remain tuned to emergency frequencies on a battery-operated radio.
In the above example, one can easily see the public education problem. The public may question the underlying legitimacy of the prediction mechanism; e.g., Doppler and infrared radar with 85% plus-or-minus 10% accuracy. The public may also question the credibility of the warning source; e.g., the National Weather Service has been wrong before. There are also sociological and psychological factors that come into play, as many residents will rush out to grocery and supply stores to stock up on things like non-perishable food, water, and portable heating systems. How traffic is to be "curtailed" is questionable unless drills or prior tests have been done with police. Panic is likely to result from some of the wording in the advisory. The public also needs to know where the emergency shelters are located. This is not a perfect example, but then there are no perfect warnings -- only germane subtopics which can be communicated as carefully as possible.
FEAR AND FEAR MANAGEMENT
Human beings tend to perceive risk according to certain patterns that the field of risk perception (Slovic et al. 1981; Slovic 1986) has studied extensively. Fear is basically defined in the dictionary sense as an expectation or apprehension of danger, but "dread" is something different, and most relevant to the fear of terrorism. There are many ways to die -- from a heart attack to being eaten alive by sharks -- and both are fearful, but being eaten alive carries more dread. It is like that with terrorism, which not only gets high marks on the dread scale, but has the additional qualities of awareness (people are usually more afraid of things they are aware of than things they are unaware of), novelty (people are more afraid of a new threat than a familiar one), proximity (people are more afraid of things happening to "us" than to "them"), catastrophe (people are more afraid of sudden, acute events than chronic or enduring events), and uncertainty (people are more afraid of vague events than specific events). These are all factors that the field of risk perception says fill a person with dread. Risk and warning communication about terrorism makes use of all these factors. Indeed, fear of terrorism is substantially different.
Bullock et al. (2006) point out that fear is irrational only if people have enough information about a hazard to perform a personal risk analysis, find that the likelihood of the hazard is smaller than the risks they face on a daily basis, and are still afraid. When people do NOT have enough information, they quite rationally overestimate their personal vulnerability because of incomplete or incorrect information, including rumors and opinions. The over-reaction process in humans is thus quite rational; it is under-reaction that is irrational. Only information can combat fear, and the process of combating fear begins with what is called an initial "anchoring" point. Anchors are the first pieces of information a person receives about a risk; any information provided subsequently makes for what are called "adjustment" points. Emergency and fear management professionals have argued for many years that the anchor point should always contain some information which puts the current risk in context. This is in addition to the requirement that information be delivered by a trustworthy, single source with credibility and decision-making authority. Unfortunately, most initial anchors are things like "the largest manhunt in history began today..." or "an unprecedented event occurred today..." which feed on sensationalism rather than context. If the public comes to believe that authorities are powerless to deal with an "enormous" threat, the impression of dire seriousness and great uncertainty is given. In such cases, the best thing to do is give the public more specifics, since panic has most likely set in. It is also important not to keep repeating or broadcasting the threat. Because of something called the "availability heuristic," people will come to fear something more if they keep hearing the same thing about it. Alternatively, a type of irrational "news fatigue" may set in, which reduces vigilance and is just as dangerous.
THE IMPORTANCE OF COMMUNICATING WITH THE PUBLIC
The choice of medium to communicate with the public is an important matter, but there may be times when no means of communication is open, and that's where amateur (HAM) radio volunteers come into play because FEMA support for groups like RACES is strong. It's impossible to overstate the importance of having good, strong working relationships with whomever can get messages out to the public, whether it be traditional delivery systems -- television, radio, or print -- or whether it be one of the newer forms of media, like the Internet. Also, communicating with the public about terrorism hazards presents some unique challenges. Let's examine the three kinds of emergency messages in relation to the type of media involved.
The first of these, the hazard awareness (risk) message, is usually the most comprehensive and documentary. It essentially serves an educational or persuasive function (Mullis 1998). The goals are to teach people what they can do to protect themselves, protect others, and cooperate in the public interest. There have been critics of this kind of communication, on grounds that it causes fear (Furedi 1997; Glassner 1999), or that it is just government propaganda of the Walter Lippmann variety (see Lippmann 1922) because it is sometimes patently deceptive and does not tell people everything. However, there is much research on risk communication indicating that it is the best way to reduce both the likelihood and consequence components of risk (Ansell & Wharton 1992; Willis 1997; Morgan et al. 2002). Teaching people how to protect themselves is an ongoing process called preparedness, and the Ready.gov website (with all its "Be Informed" facets) attempts to be the primary DHS effort in this regard for most terrorism-related incidents. The effectiveness of a comprehensive website (as the choice of medium) is a matter of speculation, but social media seems to be the way to go. On the other hand, many classified channels of communication also exist. Risk communication often constitutes the bulk of information shared between enforcement agencies (although critics might call it rumor and/or innuendo). It is intentionally designed to be persuasive and convince the receiving authority to take some action. A good example is the way INTERPOL uses "notices" to share crime-related information among member countries. There are seven (7) kinds of notices that INTERPOL issues, as follows:
1. Red notice -- this seeks the arrest or provisional arrest of a wanted person, usually for purposes of extradition
2. Yellow notice -- this is a call for assistance in locating missing persons or identifying hard-to-identify people
3. Blue notice -- this is a call for additional information about activities relating to a crime
4. Black notice -- the seeking of information on unidentified bodies
5. Green notice -- a warning or intelligence on those likely to commit a crime
6. Orange notice -- a warning about threat vectors, such as how a dangerous material is disguised
7. Purple notice -- provides data about hiding places, tactics, and modus operandi
Risk communication also attempts to enhance resilience, which goes beyond preparedness. An example of a resilience-oriented warning system is the one used for WILDFIRES. According to wildfire-fighting authorities, five (5) risk levels are recognized and summarized as follows:
1. Preparedness Level I -- Optimum conditions for normal prescribed fire operations. Wildfire activity within the area is light, and large fires are of short duration. There is little or no commitment of area and/or national resources.
2. Preparedness Level II -- Zone and area resources are adequate to manage all wildfires and prescribed fires. Numerous class A, B, and C fires are occurring and a potential exists for escapes of larger fires for more than one burning period. Potential exists for frequent mobilization of additional resources from other zones.
3. Preparedness Level III -- There is a potential for two or more zones to experience incidents requiring a major commitment of area/national resources. High potential exists of fires becoming class D and larger. Zones may be requesting resource priorities from area command.
4. Preparedness Level IV -- Class D and larger fires are common and have the potential to exhaust area and national resources. Competition exists for area/national resources.
5. Preparedness Level V -- Several zones are experiencing major fires, and national resources are exhausted. Military resources have been committed within the area.
The research on risk communication tends to support some commonsense principles. First, the quality of information must be high. People will die if they do not get good, quality information. By "quality" is meant that the depth of information must be good enough that people can make intelligent, informed judgments. Second, it is known that people are more likely to follow safety suggestions if the risk involved is an old one they are familiar with, rather than a new risk they are unfamiliar with (Morgan et al. 2002). Third, it is important to avoid providing too much exhaustive, unrelated information, such as the kind found in in-depth investigative journalism. In fact, the journalism and broadcast industries tend to go into too much depth at times, and produce fear of risk rather than the appropriate response, awareness of danger. Risk, by itself, is not to be feared because it is simply a matter of odds, likelihoods, or probabilities; it is the danger or consequence of risk that should be feared, if fear must be involved at all. Trying to handle the news media's bloodthirsty taste for what is "newsworthy" without succumbing to needless, emotional fear appeals is a major challenge in emergency management, and there may also be times when the news media seems "biased" against the purposes of risk communication. Singer & Endreny (1993) as well as Mileti (1999) have outlined some of the key ingredients of good risk communication, as follows:
the mortality rate associated with the hazard (if known)
the expected spatial extent of impact (as best predicted)
the time frame associated with the hazard
what is being done to mitigate the hazard
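The four ingredients above can be captured in a simple data structure, which makes it easy to check whether a draft message is complete before release. This is a hypothetical sketch; the `RiskMessage` class and its field names are assumptions for illustration, not a standard schema from Singer & Endreny or Mileti.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative container for the key ingredients of a risk message.
@dataclass
class RiskMessage:
    hazard: str
    mortality_rate: Optional[float]  # deaths per 100,000, if known
    spatial_extent: str              # expected area of impact, as best predicted
    time_frame: str                  # when the hazard is expected
    mitigation: str                  # what is being done to mitigate it

    def is_complete(self) -> bool:
        """Mortality data may be unknown, but the other three ingredients
        (plus the hazard itself) should always be present."""
        return all([self.hazard, self.spatial_extent,
                    self.time_frame, self.mitigation])
```

A drafting tool built on such a structure could refuse to issue an advisory whose `is_complete()` check fails, enforcing the minimum ingredient list mechanically.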
The second kind, the warning message, is most commonly found in the form of an alert or public address system. It is best delivered to the public via some easily recognized and popular system which represents a consensus on alert states. A color-coded scheme, a thermometer, sound tones, or some other symbol of iconic stature works best. This kind of delivery system has traditionally worked well with natural disasters, but questions about utility and effectiveness have been raised regarding its applicability to terrorism (Ethiel 2002). The advantages and disadvantages of color-coding schemes for terrorism are discussed later. Suffice it to say that groups like the Partnership for Public Warning have been at the forefront of examining the issue. One of the enduring controversies regarding warning communication is whether a consensual terror alert ought to be announced nationwide, regionally, or location-specifically. There is some wisdom on all sides of this controversy. A nationwide warning certainly wastes precious resources by forcing governors and mayors everywhere to increase spending on security, but a location-specific warning might elevate fear and anxiety in one particular region (if done repeatedly) and play into the terrorists' hands.
Terror alerts need to become more specific, but obviously the government cannot play all the cards in its hand. Notably useless are those vague, general alerts which inform the whole society that an increase in terrorist "chatter" has been intercepted. Specific, or regionally focused, alerts are much better. Unfortunately, lower ranks often do not have access to the intelligence analysis of a terrorist threat, even when a specific geographic area is involved.
The third kind, the crisis message, is usually delivered in the form of news conferences, and is best thought of as a consumer-friendly attempt to reach target audiences and/or stakeholders. In the public relations business, these are known as press briefings, and sometimes they have to be given every hour or two, depending upon consumer demand. Their basic goal is to provide timely information during an ongoing crisis, while also exercising care not to reveal any information that might "taint" an ongoing criminal investigation. The D.C. Sniper Case of 2002 is a good case study here, because Chief Moose, at times, seemed to be using the media to "talk" to the snipers, and there were numerous "leaks" and other panic reactions which might have been avoided. Standard hostage negotiation doctrine does not favor the idea of using the media to communicate with criminals, but there has been some precedent in serial killer profiling and counterterrorism, so the idea may be worthy of further exploration.
There are controversies over the best way to deliver crisis communication. Best practice has it that even if an oral press briefing is held, a printed copy of remarks should be handed to the press afterwards (or before, if the script can be followed). However, in some cases, the demand for information is going to be so urgent that frequent updates to an Internet web page might be the best delivery system. It is also important to remember one's audiences, and crisis communication typically is the kind of thing that first responders and other "internal customers" tune in to.
The worst-case scenario is if PANIC sets in. The study of panic reactions is an area where more interdisciplinary research is needed. In criminal justice, there is a diverse literature on panics, scares, and crime waves, but little of it is organized for applicability to homeland security. Sociological perspectives also exist, but they are of dubious value, owing primarily to conceptual confusion over the differences between mobs and crowds, as well as many unanswered questions about the imitation process in general. Warr's (1987) line of research on the fear of crime tends to be somewhat on-target, since it explores at least some of the psychological syndromes that are of interest to emergency managers. Thomas (2003) says that the best protection from panic involves "the timely flow of information from experts to the public via the mass media." Slovic, Fischhoff & Lichtenstein (1981) have identified four "risk perception conclusions" that can be drawn from the research, paraphrased as follows:
people tend to deny risk and act overconfident the less they are cognitively informed
people are more afraid of things they can imagine than things that are real
people tend to discount statistical risk and dread the likelihood of an unlikely, fatal mishap
people who disagree about the dangers of a risk won't change their mind in the presence of evidence to the contrary
EMERGENCY ZONATION AND COLOR CODING
Reaching out to the public must involve careful calculation not only of time requirements but of space requirements as well. There are important spatial or geographic considerations in public safety, and these are generally addressed by a system of emergency zonation and microzonation. "Zonation" is a term from ecology which refers to an area of protection or reserve that has established boundaries and a buffer zone where transboundary activities can be coordinated. "Microzonation" is a term from the field of hazard management which attempts (through extensive GPS mapping) to isolate the safer areas within a disaster impact zone. Emergency zonation usually involves the application of a color-coding scheme for different places, and color coding is used as well for different levels of hazard. The use of color coding has long been regarded as the best way of imparting complex scientific information to an audience that is mainly interested in the overall pattern but may also be interested in the details (Bertoline et al. 1997).
A basic, generic scheme for color-coded zonation would make lighter colors, like white, the safest areas, followed by green zones, which are potentially dangerous but where no special precautions need be taken. The color yellow is usually reserved for places where monitoring, surveillance, or intense awareness is needed. It gets complicated after that, because some theories and systems use red as the next level, indicating that minor destruction is possible, while other theories advocate blue or purple as the next level, where purple usually indicates that major destruction is possible. Whatever color-coding scheme is used, it is important to apply it consistently and clearly, to make sure it is widely disseminated, and to enforce the boundaries firmly once they are set. Alexander (2002) recommends the following color-coded alert model for general hazard warning:
A Color Coded Alert Model for Hazard Warning
White (or Green) level -- Area subject to possible hazard, but none expected. Monitor for possible secondary hazards.
Yellow level -- Precursor signs of hazard; at least one anomaly or enhanced activity detected; open priority channels.
Orange level -- Repeated and accelerating signs of hazard; open civil authorities channel; issue bulletins or advisories; shut down hazardous and sensitive operations.
Red level -- Intense signs of hazard where minor destruction may have already started; real-time monitoring and specific warnings issued; general warnings advice-oriented; evacuation plan and worker deployments in effect.
Purple level -- Major impact of hazard in at least one area; evacuation should cease and quarantine begin; all emergency services working; forecasting of secondary impacts.
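Alexander's model maps naturally onto a lookup table, with escalation defined as a single step up the ladder. This is an illustrative sketch only; the `ACTIONS` strings paraphrase the table above, and the `next_level` helper is a hypothetical name, not part of the model.

```python
# Alexander's (2002) color-coded alert ladder, lowest to highest.
ALERT_LEVELS = ["white", "yellow", "orange", "red", "purple"]

# Abbreviated paraphrase of the recommended response at each level.
ACTIONS = {
    "white":  "monitor for possible secondary hazards",
    "yellow": "open priority channels",
    "orange": "open civil authorities channel; issue bulletins or advisories",
    "red":    "real-time monitoring; specific warnings; evacuation plan in effect",
    "purple": "all emergency services working; cease evacuation, begin quarantine",
}

def next_level(level: str) -> str:
    """Move one step up the alert ladder, capped at purple."""
    i = ALERT_LEVELS.index(level)
    return ALERT_LEVELS[min(i + 1, len(ALERT_LEVELS) - 1)]
```

Encoding the ladder this way makes the consistency requirement concrete: every part of the warning system draws its colors and actions from one shared table rather than improvising its own.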
Warning, whether specific or general, should be an iterative, or repetitive, process. The first message (most likely a bulletin) should be followed by further bulletins which update any changes in the status of the hazard as well as any additional forecasts or predictions of expected impact. Bulletins are usually disseminated via secure, or priority, channels, so there is little need to monitor the appropriateness of the response unless one is conducting a drill, exercise, or fictitious scenario. Once an advisory has been issued, on the other hand, it is important to monitor the appropriateness of the response to ensure that public panic does not result and that civil authorities are safe and secure. Advisories tend to become more and more advice-oriented up to and including the stages of evacuation and search and rescue. Bulletins are normally issued at fixed and regular times, like 6 minutes after every hour, while advisories are normally issued every 10 minutes. Every warning system should contain a way to measure feedback, which means that the recipients of warning should be able to confirm what they hear.
DISSEMINATION OF WARNING
Warning systems must be delivered by some logical means of dissemination, and there are important logistical and sociological matters at stake. Logistically, one may need to consider any special arrangements necessary for nighttime warnings, and sociologically, one may need to consider minority language groups, people with special needs, those who are deaf or blind, and those who live in remote areas. The general rule of thumb is to carry the warning on as many robust media as possible and to build redundancy into the overall system of dissemination. A robust medium is one that is likely to stay in operation through many levels of the emergency, and redundancy means that more than one medium is relied upon. No one channel of warning dissemination is likely to reach all the people who need to receive the warning message. Some media channels are notoriously unreliable under hazard conditions, and some channels are more vulnerable to hacking than others. Some basic choices, as outlined by Alexander (2002) with pros and cons, modified and extended upon, are as follows:
sirens -- can be used at night time, but some people do not hear them or misinterpret their meaning
police cars -- can broadcast via loudspeakers or go door-to-door, but this is a slow process
aircraft -- can drop leaflets or fly message banners, useful for areas without utilities or other services
radio -- easy to broadcast repetitively on, but must be on all frequencies, and depends on listeners
television -- audience tends to be large, but often same problems as radio
telephone -- volunteers can run telephone trees, and fax and dial-out (reverse 911) services exist
newspapers -- not much good in the short term, but can provide detail and often have Internet access
internet -- "push" systems can send pop-up messages or emails to computers, PDAs, & pagers
billboards -- electronic ones can relay messages to travelers
all media -- a general, all-out mass publicity campaign
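The value of redundancy in the list above can be sketched numerically. Assuming (purely for illustration) that channels fail independently, the chance that at least one of several channels reaches a given resident is one minus the product of each channel's miss rate:

```python
# Illustrative sketch only: the reliability figures are invented, and real
# channel failures are rarely independent (a power outage can take out
# several media at once). Under the independence assumption, the chance a
# warning gets through at least one channel is 1 minus the product of the
# individual miss rates.
def coverage(channel_reliabilities):
    miss = 1.0
    for p in channel_reliabilities:
        miss *= (1.0 - p)   # probability that every channel so far failed
    return 1.0 - miss       # probability that at least one got through

# e.g., a 60%-reliable siren, a 70%-reliable radio broadcast, and a
# 50%-reliable phone tree combine to roughly 94% coverage
combined = coverage([0.6, 0.7, 0.5])
```

Even under this idealized assumption, no single channel comes close to the combined figure, which is why the rule of thumb favors redundancy.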
Usually, a public information officer is the specialist relied upon to choose the best medium for dissemination, but the procedures for public information are also usually spelled out in the emergency plan. It helps to understand that whenever a warning is issued through the news media, that particular medium becomes an "intermediary" which may take it upon itself to decipher, translate, and even challenge or criticize the warning. The tradition of a free press tends to foster something of a mistrust of civil authorities that often impedes the close public-private partnership needed for a successful warning system. The Partnership for Public Warning's (pdf) document on this matter is an educational read.
Since 1994, all radio and television stations have been mandated by the FCC (Federal Communications Commission) to have EAS (Emergency Alert System) equipment. EAS used to be called the EBS (Emergency Broadcast System), which had been around since 1963. FCC rules require a station to interrupt programming whenever there is a national alert, and stations are free to choose whether they want to run local alerts. The national alert warnings are sent as a digital packet to the stations and cannot be longer than two minutes in runtime. The station must either display the warning full-screen or add it as a "crawl" along the margins. Because many weather advisories and other warnings are local, and because producers want to minimize interruptions, the public only sees a small percentage of all the warnings that are issued via EAS. Check out the FCC's EAS page for recent rule changes, if any. EAS equipment is tested weekly on all stations with short sound tones, and monthly with a two-tone test message which goes like this: "This is a test of the Emergency Alert System -- this is only a test...." EAS automatically translates into Spanish for areas where there is a significant Spanish-speaking population.
COMMON ALERTING PROTOCOL (CAP)
The Common Alerting Protocol (CAP) is a movement that got started in 2000 and is based historically on a document called "Effective Disaster Warnings (pdf)" released by the Working Group on Natural Disaster Information Systems under guidance from the President's National Science & Technology Council. That document suggested that a standard method be developed to relay all types of hazard warnings instantaneously and automatically at the local, regional, and national levels. The National Weather Service is the only agency with anything close to that capability at present, and since about 70% of EAS interruptions are weather-related, CAP initiatives have attempted to build on the NWS model, with FEMA guidance. CAP is also essentially a proposal for warning-system "interoperability" based on XML (Extensible Markup Language) standards. XML, like HTML (Hypertext Markup Language, what you're reading right now), can be understood by both humans and a variety of machines, especially those machines used to help the deaf and blind. A group called OASIS (Organization for the Advancement of Structured Information Standards) is involved in helping refine and showcase the latest CAP projects that software developers come up with. The most recent (November 2004) project showcased was a new way to handle alerts during incident management of a chemical plume cloud disaster [View Slides]. On a more significant level, the word "interoperability" frequently means "compatibility" and refers to the heavily grant-funded effort to get county-level emergency service departments on the same radio frequencies as city-level police departments. Most EMS departments use VHF frequencies, which run in the 30 to 300 megahertz range, while most police departments use 800-megahertz frequencies.
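As a rough illustration of why an XML-based format can be read by both humans and machines, the sketch below builds a CAP-style alert using only Python's standard library. The element names follow CAP 1.x conventions (alert, identifier, sender, status, msgType, scope, info, event, urgency, severity, certainty), but real CAP messages also carry XML namespaces, timestamps, geographic targeting, and often digital signatures, all omitted here; the identifier and sender values are placeholders.

```python
# Minimal sketch of a CAP-style alert payload, using only the standard
# library. This is NOT a conformant CAP message -- namespaces, <sent>
# timestamps, and area targeting are omitted for brevity.
import xml.etree.ElementTree as ET

def build_cap_alert(identifier, sender, event, headline,
                    urgency="Immediate", severity="Severe",
                    certainty="Observed"):
    alert = ET.Element("alert")
    ET.SubElement(alert, "identifier").text = identifier
    ET.SubElement(alert, "sender").text = sender
    ET.SubElement(alert, "status").text = "Actual"
    ET.SubElement(alert, "msgType").text = "Alert"
    ET.SubElement(alert, "scope").text = "Public"
    info = ET.SubElement(alert, "info")
    ET.SubElement(info, "event").text = event
    ET.SubElement(info, "urgency").text = urgency      # how soon to act
    ET.SubElement(info, "severity").text = severity    # how bad the impact
    ET.SubElement(info, "certainty").text = certainty  # sender's confidence
    ET.SubElement(info, "headline").text = headline
    return ET.tostring(alert, encoding="unicode")

# placeholder identifier and sender, invented for illustration
xml_text = build_cap_alert("EXAMPLE-001", "alerts@example.gov",
                           "Tornado Warning",
                           "Tornado warning for the listening area")
```

Because the same structured fields appear in every message, any receiving device (a TV crawl generator, a highway sign, a screen reader for the blind) can parse the urgency, severity, and certainty values and decide how to present the alert.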
MISCELLANEOUS WARNING SYSTEMS
In 2001, the National Center for Missing & Exploited Children launched the AMBER Alert System. The AMBER Plan was initially suggested in 1997 in honor of 9-year-old Amber Hagerman, a little girl who was kidnapped and murdered while riding her bicycle in Arlington, Texas. The way it works is that in serious child abduction cases, where law enforcement agencies think time is of the essence and the agency has met Justice Department requirements for entering data into NCIC (National Crime Information Center), descriptive information about the child is sent to radio and television stations via the EAS (Emergency Alert System). Some states also incorporate electronic highway billboards as part of AMBER alerts, the same highway billboards ordinarily used to disseminate traffic information to drivers. Some AMBER alerts also contain pertinent information about the abductor or what the abductor might look like. The system is operational throughout the United States and most parts of Canada. There are penalties for false reporting, and to date, AMBER Alerts have been responsible for the successful recovery of 176 children. The Center for Missing Kids also runs a successful Cyber Tipline for crimes against children. CodeAmber is where one can get real-time tickers to add to one's website.
Using Amber Alert for Homeland Security Purposes
Amber Alerts are based on what is known as the capillary model of information distribution: the alert starts out in a central conduit, like a main website or web portal, and then spreads or flows out to a variety of devices, such as electronic roadway signs, e-mail, text messages, and pop-up ads for people who opt in to alerts. The capability also exists to send the alert (and other digital signals) through public TV stations, cable companies, mobile phones, PDAs, satellite broadcasts, and other devices. NASCIO (National Association of State CIOs) and DHS are conducting pilots in early 2005 to see if Amber has the potential to be the nation's needed All Alert System.
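The capillary model can be pictured as a simple fan-out: one alert enters a central conduit and flows to every registered delivery channel. The sketch below is hypothetical; the conduit class and channel names are invented for illustration and are not part of any actual AMBER system:

```python
# Hypothetical sketch of the capillary fan-out model: one alert enters a
# central conduit and is pushed out over every registered channel.
class AlertConduit:
    def __init__(self):
        self.channels = []   # callables, each delivering the alert one way

    def register(self, channel):
        self.channels.append(channel)

    def publish(self, alert):
        # fan the single alert out to every channel; collect confirmations
        return [channel(alert) for channel in self.channels]

conduit = AlertConduit()
conduit.register(lambda a: f"EMAIL: {a}")          # opt-in e-mail list
conduit.register(lambda a: f"SMS: {a}")            # text-message gateway
conduit.register(lambda a: f"HIGHWAY SIGN: {a}")   # electronic roadway signs
messages = conduit.publish("AMBER Alert issued")
```

The design choice worth noting is that new devices (cable crawls, PDAs, satellite feeds) can be added by registering another channel, without touching the central conduit, which is what makes the model attractive for an all-alert system.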
Sigma Communications, Inc. developed the Reverse 911 System, a Microsoft Windows-based program, in 1993 to help public safety agencies quickly contact citizens in specific areas about emergency situations. Reverse 911 sends a recorded message from the police to telephone owners. The message can include a warning of the emergency and/or important information regarding it. The calls can be targeted to specific neighborhoods or can be city-wide. Residents receive a call that goes like this: "This is a Police Bulletin. DO NOT HANG UP. The following is important information! This is a reverse 911 call from the police department, calling to let you know of important information. If you are an unlisted number, you must call back at 555-555-5555 to ensure that you will receive future reverse 911 calls." Reverse 911, especially when combined with Enhanced 911, which allows GPS tracking of calls and callbacks, allows for powerful law enforcement and public safety applications. While the system is finding widespread application, it is most likely to be found in jurisdictions near prisons, where it is used to notify households in case of an escape from the facility.
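The geographic targeting behind a Reverse 911 call-out can be sketched as a simple filter over subscriber records. Everything below (the subscriber data, field names, and the queue_calls helper) is invented for illustration; a real system would draw on E911 address databases and dedicated telephony hardware:

```python
# Hypothetical sketch of Reverse 911-style targeting: select listed numbers
# whose address falls in the affected neighborhoods, then queue one recorded
# message per household. All records below are made up for illustration.
subscribers = [
    {"phone": "555-0101", "neighborhood": "Riverside"},
    {"phone": "555-0102", "neighborhood": "Hilltop"},
    {"phone": "555-0103", "neighborhood": "Riverside"},
]

def queue_calls(subscribers, target_areas, message):
    # Returns the call queue: one (number, message) pair per targeted home.
    return [(s["phone"], message)
            for s in subscribers if s["neighborhood"] in target_areas]

queue = queue_calls(subscribers, {"Riverside"},
                    "This is a reverse 911 call from the police department.")
```

A city-wide call-out is just the same filter with every neighborhood in the target set, which is why the same software handles both targeted and blanket notifications.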
Slightly less than half the states and a good number of the nation's major cities have a 211 System, which was originally created in Atlanta in 1997 as a kind of non-emergency backup for overburdened 911 systems. The 211 systems generally operate only during normal business hours, and they connect callers with agencies in a United Way database, other social service providers, charitable organizations, and emergency response information. Most 211 systems are localized or regionalized, but some state homeland security agencies (e.g., Connecticut, Hawaii, Idaho, Minnesota, and Arizona) have in the last couple of years been looking at utilizing the system as part of a statewide alert program. Connecticut, for example, used 211 in the aftermath of the Sept. 11 attacks to connect families with victims, refer people to mental health services, and get volunteers to donate blood. Some statewide 211 systems have VoIP (Voice-over-IP) technology, and a group known as the Alliance of Information & Referral Systems has been at the forefront of enhancing 211 capabilities. Research to date indicates the system has excellent promise in directing people's generosity and may be useful in the recovery stages of emergency management.
There are also some interesting and innovative purely web-based alert-like systems on the Internet. For example, Rollins College in Florida (see the Rollins emergency plan) created a system called the Student Storm Tracker, which allows students to check in and keep in touch from anywhere. Rather than tracking the storm itself, the system was a web-based database recording where students went and a contact phone number where they could be reached. Every parent also received e-mails about the location and safety of their children at school. In addition, the system could send post-hurricane status e-mails. During the 2004 hurricane season, Rollins got about an 85% usage rate with the system. It turned out to be exactly the kind of service people needed and wanted.
THE HOMELAND SECURITY WARNING SYSTEM
The Department of Homeland Security is mandated by law (House Bill H.R. 2250 pdf) to develop the READICall (Responsive Emergency Alert and Dissemination of Information Call System) emergency alert system, which will both alert and inform citizens about imminent or current acts of terrorism. The READICall system is designed to be activated only by order of the Secretary of Homeland Security, and it reaches roughly 104 million of the nation's 109 million households that have landline telephone service, nearly every business in the United States, and 141 million cellular phones. However, the READICall system has never yet been used, and questions exist about how catastrophic in scope an event would have to be before the government felt the need to contact each and every citizen. Far more familiar to most readers is the Homeland Security Advisory System (HSAS), which was mandated by Homeland Security Presidential Directive 3 on March 11, 2002 (below):
Homeland Security Advisory System
Severe Condition (Red) reflects a severe risk of terrorist attacks. Emergency managers increase personnel for emergency needs; mobilize specially trained teams; monitor or constrain transportation systems; and close public and government facilities.
High Condition (Orange) reflects a high risk of terrorist attacks. Agency-specific protective measures are implemented, and inter-government security efforts are coordinated; additional precautions at public events, including alternative venues or cancellations; critical workforce dispersed; and facility access to essential personnel only.
Elevated Condition (Yellow) reflects a significant risk of terrorist attacks. Increased surveillance of critical locations; emergency plans are coordinated with nearby jurisdictions; precise characteristics of threat are refined; and emergency response plans are implemented.
Guarded Condition (Blue) reflects a general risk of terrorist attacks. Communication systems and plans are checked; and the public is provided with any information that would strengthen its ability to act appropriately.
Low Condition (Green) reflects a low risk of terrorist attacks. Volunteers are trained; and all facilities and sectors are assessed for vulnerabilities to terrorist attacks, with reasonable measures taken to mitigate these vulnerabilities.
Theoretically, the purpose of HSAS is to cultivate an atmosphere of preparedness. The idea is to cultivate citizen vigilance so that perhaps someone will notice something suspicious and a terrorist plot will be foiled. However, as Bruce Schneier has commented, "professional security experts like me are not particularly impressed by systems that merely force the bad guys to make minor modifications in their tactics." An additional problem is lack of public confidence in the system, a factor that DHS itself is aware of, at least according to its Advisory Council Report of 2009 (pdf). In fact, as of late 2010, DHS was considering abandoning the system or replacing it with something else.
The DHS alert chart has been the subject of numerous parodies. Online comedy sites like AlmostaProverb, Discord, KivaNet, Onion, Sweetooth, and Whitehouse.org have all made fun of it, and perhaps its credibility is gone, but it is instructive to study how it works. Public ridicule appears to be driven by the fact that it should probably never have been called a warning system; perhaps "threat awareness system" would have been a better term.
A threat condition is assigned on the basis of risk, where risk is a combination of the probability of attack and its potential gravity. A qualitative, not quantitative, assessment of integrated intelligence is used to determine risk, with the qualitative factors including whether the intelligence is credible, corroborated, specific and/or imminent, and whether it points to grave consequences. There can be no guarantee that, at any given threat condition, a terrorist attack will not occur. Threat conditions are assigned by the Attorney General in consultation with the Assistant to the President for Homeland Security and the appropriate Homeland Security Principals (heads of directorates). Threat conditions may be assigned for the entire Nation, or they may be set for a particular geographic area or industrial sector. They are binding on the executive branch and suggested, although voluntary, for other levels of government and the private sector. Federal agencies must submit an annual report describing the steps they have taken in terms of Protective Measures for each Threat Condition.
Most of the practical problems that the Advisory System (HSAS) has experienced center on the transition from yellow to orange. Reportedly, DHS officials receive so many complaints every time they change the threat level from yellow to orange that they are simply not going to change it again unless they can do so for very specific cities or industry sectors. Nobody really seems to know what to do at Condition Orange anyway, and Condition Yellow seems to be what everyone is comfortable with. The CATO Institute has been a long-time critic of the HSAS; the Partnership for Public Warning's pdf document outlines many problems with the HSAS, as does a 2004 GAO Report (pdf); and a Univ. of Delaware Disaster Center paper (doc) argues that HSAS is not even a warning system. There have only been a few times when the Threat Condition has been raised to orange, and the threat level has never been raised to red (severe), nor has it ever been lowered to blue (guarded) or green (low).
The basic problem is that there are no objective or published criteria for when a HSAS transition ought to take place. One of the most befuddling things that authorities tell the public is that the government has picked up an increase in "chatter." The closest any government official ever got to explaining "chatter" was during a Sept. 10, 2007 alert which went as follows:
Government Explanation of "Chatter" as Basis of Terrorism Alert
The national threat level is being raised from Elevated, or Yellow, to High, or Orange, for all domestic and international flights. Only small amounts of liquids, aerosols and gels will be allowed in carry-on baggage. See the Transportation Security Administration (TSA) website for up-to-date information on items permitted and prohibited. While there continues to be no credible information at this time warning of an imminent threat to the homeland, the department's strategic threat perspective is that we are in a period of increased risk. A National Intelligence Estimate cited heightened activity overseas and we're mindful of some recent arrests in Europe. There has also been an upward trend in propaganda tapes and messages coming from al Qaeda and affiliated networks over the past year.
MILITARY, LAW ENFORCEMENT, AND SCIENTIFIC SYSTEMS
The military (Department of Defense) operates a system called Force Protection Condition (FPCON), formerly known as Threat Condition (THREATCON); the name was changed around 2001 to avoid confusion with the US State Department's system of threat assessment. Like HSAS, Fpcon describes the amount of security needed to respond to terrorist threats, particularly those which relate to military facilities and personnel, but also important vendors as well as critical infrastructures. The central agency which determines the level of Fpcon is the Antiterrorist Alert Center, or ATAC. This agency was one of the first antiterrorist watch groups in the US, originally operated by the Navy since 1983, but now involving the Coast Guard, Marine Corps, CIA, DIA, NSA, State Dept., and US Special Operations Command (although the Navy, or more specifically, the Naval Criminal Investigative Service, or NCIS, still maintains much of it at what they call the Multiple Threat Alert Center, or MTAC). ATAC operates 24 hours a day in twelve-hour watches and specializes in determining the credibility of a threat as well as its impact on society. If a threat is deemed "reasonable," ATAC can forward a report about it directly to the National Security Council. NSC staff can then work with the appropriate agencies on what recommendation ought to be brought to the President (considering mainly the public impact), and the President can then decide whether to raise or lower the Fpcon level.
Two strengths of military law enforcement alert systems are worldwide coverage and fusion. There is a fairly clear-cut distinction between enemies and allies in military systems, so communication generally works well here. However, many military units work with the threat assessments done by international, national, or regional police organizations. A fusion intelligence orientation has existed since 9/11, utilizing the seconding of specialists and special project monitoring teams. The "seconding" of personnel, or their temporary detailing back and forth between civilian and military agencies (often crossing various levels of rank), is a relatively minor activity in military law enforcement circles, and is often criticized as controversial and in violation of unity-of-command principles. However, "fused" (both military and civilian) communication tends to be the most down-to-earth, matter-of-fact form of communication out there. A good example is the set of Force Protection (Fpcon) levels that anyone working on a military installation comes to know and love. They are as follows:
FPcon Color-Coded Threat Levels
Fpcon Normal (green) -- no current terrorist activity and a low risk of terrorist attack
Fpcon Alpha (blue) -- guarded condition toward a general risk of terrorist attack which is broad and not predictable
Fpcon Bravo (yellow) -- elevated condition for a significant risk of terrorist attack based on reliable information
Fpcon Charlie (orange) -- high condition for when reports show a terrorist attack is imminent
Fpcon Delta (red) -- severe condition where a terrorist attack has just occurred or is taking place
While the military uses a different color-coded system, it is nonetheless integrated with civilian systems in many ways. Some of the extra protective measures and emergency plans, for example, are coordinated at various levels. Also, it is important to note that within a given geographical area, there might be one level of threat for civilians and another for the military. Most alert systems are territorial, which means that an advisory for New Jersey may not apply to another state or even to all cities within the same state. Fpcon Delta is the level with the most military-civilian impact because it triggers the most military activity, which may impose hardships on civilian activity. Fortunately, Fpcon Delta is usually implemented for only a couple of days; Fpcon Charlie, by comparison, is likely to be used for several weeks. It should also be noted that military alert systems can "jump" from one extreme level to another (i.e., from Normal to Delta), unlike civilian alert systems, which tend to move "progressively" from one level to the next. The reason is that military systems follow a "better safe than sorry" logic. Military systems can also assign a plus (+) sign to a level, indicating the need for an "extra" level of security beyond that established by the Fpcon level alone. Plus signs can help to synchronize alert levels between different alert systems.
The US Coast Guard operates a maritime security system designed to provide alerts for the nation's ports and waterways, and this is known as MARSEC (Maritime Security Threat System). Marsec and Fpcon are usually closely related, but most experts will tell you that, for many reasons, it's important to have a separate alert system for maritime security. This is because of the unique threat environments which exist within the maritime industry, and because many common security practices which work inland do not work in the maritime environment. Marsec uses a three-tiered system, as follows:
Marsec Security Alert System
Marsec Level 1 -- minimum security; appropriate security measures are monitored at all times (generally conforms to Fpcon Normal, Alpha, or Bravo)
Marsec Level 2 -- additional protective measures are needed because of an increased risk of a transportation security incident
Marsec Level 3 -- set when a transportation security incident is probable, imminent, or has occurred, even though it may not be possible to identify the target
Marsec levels are determined by the Commandant of the US Coast Guard with information gathered from Coast Guard intelligence units and certain other agencies. Level 2, the more commonly used elevated level, involves extra water and harbor patrols. Although it is possible to restrict unnecessary vessel traffic under Level 2, Level 3 is usually reserved for that, when ALL vessel traffic would be stopped. Other options exist, such as giving a facility a restrictive curfew, but curfews are generally most associated with Level 3. The Coast Guard's explanation of Marsec says that even though an attempt is made to be "commensurate" with the HSAS level and the two may "align closely," there is no expectation that Marsec will "directly correlate" with any other alert system.
It would be a mistake to think that warning systems are more refined when scientists are involved; perhaps because of the complexity of the phenomena they study, or because of scientific bickering, the truth is that such systems are usually quite crude. An example is the system used to measure solar flare activity:
Solar Terrestrial Activity Reporting
The three kinds of solar events that scientists monitor are coronal holes, coronal mass ejections, and class M and X solar flares. Coronal holes are what create the aurora borealis, and they produce geomagnetic storms that can knock out the Earth's power grid. Coronal mass ejections are massive bursts of high-speed plasma with deadly cosmic ray effects. Solar flares are broad-spectrum releases of radiation and energy that knock out communications and can cause biochemical damage. Sometimes a Faraday cage can protect electronics, and pre-1980 vehicles might still run, but it may take months to recover from a solar event, especially when there are only 3-8 hours of warning for a solar flare, 96 hours for a mass ejection, and 5 days for a coronal hole.
Amateur Radio Disaster Services
American Radio National Association
Carnegie Mellon Center for Risk Perception & Communication
Christian Emergency Network
Common Alerting Protocol (CAP)
CRS Report on EAS & READICALL (pdf)
DHS Advisory System Main Page
Disaster Warning Network
DoD ThreatCon Definitions
FCC Emergency Alert System (EAS)
Fear Factors in an Age of Terrorism
Fear of Crime in the U.S.: Avenues for Research & Policy (pdf)
FEMA Report on Effective Disaster Warnings
Military False Alerts of Nuclear Attacks
National Association of Deaf Emergency Preparedness
National Cyber Alert System (US-CERT)
National Hazards Center
National Weather Service, What is EAS?
Partnership for Public Warning
Radio Amateur Civil Emergency Service (RACES)
Red Cross Homeland Security Advisory Recommendations
Red Cross/Red Crescent Organizations (international)
Salvation Army Team Emergency Radio Network (SATERN 4-11)
USPS Mail Center Security Guide
Washington DC Area Emergency Alert System
Alexander, D. (2002). Principles of emergency planning and management. NY: Oxford Univ. Press.
Altheide, D. (2002). Creating fear: News and the construction of crisis. NY: Aldine de Gruyter.
Ansell, J. & Wharton, F. (1992). Risk analysis, assessment, and management. NY: Wiley.
Bedient, P., Holder, A., Benavides, J. & Vieux, B. (2003). "Radar based flood warning systems." Journal of Hydrologic Engineering 8(6), 308-318.
Bertoline, G., Wiebe, E., Miller, C. & Mohler, J. (1997) Technical graphics communications. NY: McGraw Hill.
Brzezinski, M. (2004). Fortress America: On the front lines of homeland security. NY: Bantam Dell.
Bullock, J., Haddow, G., Coppola, D., Ergin, E., Westerman, L. & Yeletaysi, S. (2006). Introduction to homeland security, 2e. Boston: Elsevier.
Burkhart, F. (1991). Media, emergency warnings, and citizen response. Boulder, CO: Westview.
Christen, H. & Maniscalco, P. (2002). Understanding terrorism and managing the consequences. Upper Saddle River, NJ: Prentice Hall.
Donley, M. & Pollard, N. (2002). “Homeland security: The difference between a vision and a wish.” Public Administration Review 62:138-144.
Dory, A. (2004). "American civil security." The Washington Quarterly 27(1): 37-52.
Ethiel, N. (Ed.) (2002). Terrorism: Informing the public. Chicago: McCormick Tribune Foundation.
Fischhoff, B., Gonzalez, R., Small, D. & Lerner, J. (2003). "Evaluating the success of terror risk communications." Biosecurity and Bioterrorism 1(4): 255-258.
Flynn, S. (2004). America the vulnerable. NY: HarperCollins.
Foster, H. (1980). Disaster planning. NY: Springer.
Glassner, B. (1999). The culture of fear. NY: Basic Books.
Haddow, G. & Bullock, J. (2003). Introduction to emergency management. Boston: Elsevier.
Lippman, W. (1922). Public opinion. Cambridge, MA: Harvard Univ. Press [online edition].
Mileti, D. (1999). Disasters by design. Washington DC: Joseph Henry Press.
Morgan, M., Fischhoff, B., Bostrom, A. & Atman, C. (2002). Risk communication. NY: Cambridge Univ. Press.
Mullis, J. (1998). "Persuasive communication issues in disaster management." Australian Journal of Emergency Management (Autumn): 51-58.
National Research Council. (1989). Improving risk communication. Washington DC: National Academy Press.
Partnership for Public Warning. (2003). A national strategy for integrated public warning policy and capability. McLean, Virginia: Partnership for Public Warning.
Perry, R. & Lindell, M. (2003). “Understanding citizen response to disasters with implications for terrorism.” Journal of Contingencies and Crisis Management 11(2): 49-60.
Schneier, B. (2003). Beyond fear. NY: Copernicus.
Schoch-Spana, M. (2002). "Educating, informing, and mobilizing the public." Pp. 118-135 in B. Levy & V. Sidel (eds.) Terrorism and public health. NY: Oxford Univ. Press.
Singer, E. & Endreny, P. (1993). Reporting on risk: How the mass media portray accidents, diseases, disasters, and other hazards. NY: Russell Sage Foundation.
Slovic, P., Fischhoff, B., & Lichtenstein, S. (1981). The assessment and perception of risk. London: The Royal Society.
Slovic, P. (1986). "Informing and educating the public about risk." Risk Analysis 6(4): 403-15.
Subcommittee on Natural Disaster Reduction. (2000). Effective disaster warnings. Washington, D.C.: National Science and Technology Council.
Thomas, P. (2003). "The anthrax attacks." NY: Century Foundation [available online]
Van Brunschot, E. & Kennedy, L. (2008). Risk, balance, and security. Los Angeles: Sage.
Walsh, J. (1996). True odds: How risk affects your everyday life. Santa Monica, CA: Merritt.
Warr, M. (1987). "Fear of victimization and sensitivity to risk." Journal of Quantitative Criminology (3):29-46.
Waugh, W. (1990). Terrorism and emergency management. NY: Marcel Dekker.
Waugh, W. (2000). Living with hazards. NY: M.E. Sharpe.
Willis, J. (1997). Reporting on risks: The practice and ethics of health & safety communication. Westport, CT: Praeger.
Zschau, J. & Kuppers, A. (2002). Early warning systems for natural disaster reduction. NY: Springer-Verlag.
Last updated: Mar. 7, 2013
Not an official webpage of APSU, copyright restrictions apply, see Megalinks in Criminal Justice
O'Connor, T. (2013). "Warning Systems," MegaLinks in Criminal Justice. Retrieved from http://www.drtomoconnor.com/3430/3430lect08.htm.