EMForum.org Presentation September 14, 2011
Just How Standard is the NIMS Standard?
Findings from Current Implementation Behavior Research
Jessica A. Jensen, Ph.D.
Assistant Professor, North Dakota State University
Associate Director, Center for Disaster Studies and Emergency Management
This transcript contains references to slides which can be downloaded from http://www.emforum.org/vforum/NIMS/counties.pdf
A video recording of the live session is available at http://www.emforum.org/pub/eiip/lm110914.wmv
An audio podcast is available at http://www.emforum.org/pub/eiip/lm110914.mp3
[Welcome / Introduction]
Amy Sebring: Good morning/afternoon everyone and welcome once again to EMForum.org. I am Amy Sebring and will serve as your Moderator today. We are very glad you could join us.
Today we will be discussing the National Incident Management System (NIMS) and how it has been implemented across the country, so we hope you will take this opportunity to express your thoughts and experience.
Our guest today has conducted five separate scientific research studies on this topic and will share her findings with us. Please note that you can find links to a number of the articles that will be mentioned on today's Background Page. The Journal of Homeland Security and Emergency Management articles in particular may be downloaded for free using a guest login.
We are making a Live Meeting recording, which should be available later this afternoon. We are also making audio only, MP3 versions of our programs for your convenience, which you can access directly from our Website, or by subscribing to a podcast now available from the iTunes Store. If you are not on our mailing list, you can Subscribe from our home page, and then you will get a notice when these are ready.
Now it is my pleasure to introduce today's guest: Dr. Jessica Jensen is an Assistant Professor in the Department of Emergency Management at North Dakota State University (NDSU). She is also the Associate Director of the Center for Disaster Studies and Emergency Management at NDSU. Her research has primarily focused on perceptions about and implementation of the National Incident Management System and the Incident Command System. Please see the Background Page for additional information and links.
Now if you were with us last time when we talked about IAEM's proposed framework for measuring return on preparedness investments, you may recall that Jessica also worked with IAEM on that document, but could not be with us for that program. So we are especially glad she could join us today, and we have asked her to share any additional thoughts she may have. Please note, IAEM has extended the deadline for comments until tomorrow, September 15th, so you still have time to participate.
Welcome Jessica, thank you very much for joining us today, and I now turn the floor over to you to start us off please.
Jessica Jensen: Thank you for having me. It is an honor to be with you and share some of the findings from the research I've been conducting over the past several years on NIMS. Before I get started, I did want to take the opportunity to mention once again the preparedness project that IAEM has been involved with, which Randy Duncan, a colleague of mine, spoke about on EM Forum recently.
I urge you all to take a look at that document. It is on the IAEM website and also you can take the opportunity to provide your comments and thoughts on whether or not you think this would be a reasonable approach to measuring return on investments for the emergency management grant program. IAEM and I are personally interested in getting any feedback we can.
I was saddened not to be able to participate in the EM Forum presentation last time. I was in Washington, D.C. with IAEM, taking meetings on the Hill and with the Department of Homeland Security and FEMA about the document. Just to share with you all, the initial feedback we have received has been very positive.
More information about the feedback that has been received both through the surveymonkey.com review site, and through our meetings throughout the nation will be shared with IAEM members through its website very soon.
Without further ado, I am going to begin with a little bit of background about North Dakota State University and its involvement with emergency management. North Dakota State University began offering a minor in emergency management in 2001 and, in 2003, began Bachelor's, Master of Science, and doctoral degree programs in emergency management.
It is a face-to-face program, so students from around the country must come to Fargo to participate, as opposed to studying online. We graduated our first Bachelor's students in 2004, our first Master of Science students in 2006, and our first Ph.D. holders in 2009.
Our Center for Disaster Studies and Emergency Management began in 2008. The focus of the Center is to conduct multidisciplinary research on topics related to disasters and emergency management as well as to explore linking practitioner effort and work in the field with academic work that is being done. That is actually how I became involved with IAEM.
I am here today with you to talk about the National Incident Management System. I am making a fairly dangerous assumption that most folks on this webinar have some level of familiarity, but just to provide a little overview, after September 11, a number of investigations were carried out into the response efforts that took place in Shanksville, Pennsylvania, at the World Trade Center in New York and the Pentagon in Washington, D.C.
There was great concern that we find a way as a nation to improve the way we respond to disasters. There was also an interest, expressed through these various commissions and their findings, in standardizing our effort so that we could predictably anticipate the way each responding organization would carry out its effort on scene and off scene, and so that the way we respond as a country would generally improve.
NIMS was first mandated through Homeland Security Presidential Directive 5 (HSPD-5), and it was cemented in law in the Homeland Security Act of 2002. Implementation of NIMS was voluntary in 2005, and counties and states had the option of beginning to comply with the NIMS mandate or not.
However, as of 2006, all jurisdictions were required to implement NIMS in exchange for preparedness funding. There was quite a carrot-and-stick structure of incentives and sanctions for participating, or not, in NIMS and its framework.
When NIMS was rolled out and mandated for use it was done with a number of assumptions in mind. First, the federal government assumed that NIMS would work as it was designed. Second, it was assumed that everyone would buy into the system and be committed to implementing it.
Third, it was assumed that everyone would use the system, and not just use it, but use it in a standardized way. When we say "everyone", we mean members of the private sector that participate in response, local level government, and all the entities within them related to response, including first responders and others, and non-profit organizations that are relevant to response.
A large group was included in this assumption that everyone would buy in and use the system. It was also assumed that the system was equally applicable to all places and all situations. That is okay; all of these assumptions are fine as long as they are true, and as long as empirical research can demonstrate that they hold.
Given the ambitious nature of this policy and what it tried to do, you would expect there is a lot of research on the topic. In fact, there is very little.
The pre-existing research is limited to a handful of studies. One of them (some folks may be familiar with it already, and if not, maybe after the call you could take a look at it) was published in "Learning from Catastrophe: Quick Response Research in the Wake of Hurricane Katrina", a book published in 2006 based on research that was conducted immediately following Hurricane Katrina.
Dr. David Neal and Dr. Gary Webb conducted research in the Louisiana area with local, state, and federal responding partners on the use and usefulness of the National Incident Management System and any structural barriers that existed to its full implementation.
In 2009 I published a study within the International Journal of Mass Emergencies and Disasters based on research I conducted in 2007. Basically what I did was travel throughout three states interviewing county emergency managers about their perceptions of NIMS use and how it was being used, and in so doing, discovered a variety of challenges reported by emergency managers in fully implementing the system.
In April of 2008, I traveled to a state I cannot name because of Institutional Review Board protocols, to the scene of a tornado. I was on the ground within 24 hours, as the response unfolded.
I was able to watch how NIMS was used and interview various responding organizations represented in the Emergency Operations Center and the Incident Command Post on how NIMS was used and how useful they thought the system was. That was published in August of 2008.
In April of 2008, a colleague at North Dakota State University, Dr. D.K. Yoon, and I conducted a mail survey asking volunteer fire department chiefs and volunteer firefighters how they viewed the Incident Command System and NIMS. The findings of this study are reported in the Journal of Homeland Security and Emergency Management and were published earlier this year, in the spring.
Rather than walk you through the details of each of these studies, I will summarize their key findings and how they informed the research we are here to talk about today. All the studies I just mentioned focused on the role of the local level in implementing the system and the degree to which local level participation and implementation influenced the ability of the overall response to function effectively and efficiently.
There was also a focus across these studies on behavior: looking at whether or not the way that counties were approaching implementation was conducive to the National Incident Management System fulfilling its goals. All of these studies looked at behavior, really examining whether counties were modifying the system, implementing it as designed, totally ignoring it, or implementing it only minimally.
All of the studies, while few in number, did find substantial variation in the way NIMS was being implemented, not just between one county and another, but also within counties by the various organizations that are supposed to be implementing the system. For example, a public health department, a fire department, or a law enforcement agency might each be implementing the program differently. At least that is what the data suggest across the country.
Within this variation, we found more variation. Basically not only were counties and individual organizations implementing the system differently from how the system was designed, but there were also substantial differences in whether or not they even wanted to implement the system.
There were differences in what I call "behavioral intent": do people want to do it? There were differences in the degree to which they were actually implementing it. Based on the few studies that had already been done, there was reason to suggest that the assumptions upon which NIMS is based may not hold true.
But the research that has been done has its limitations. For one thing, there are very few studies in existence. That is a limitation. Another is that these studies had relatively small samples. They didn't talk to everybody in the United States involved in responding; they were only talking to smaller units.
Next, there wasn't a causal model developed. The research that had been done wasn't looking at "if we do this, then it works, and if we do that, it doesn't work." That is a limitation, because it is all well and good to identify that there is variation, but to go to the next step, to limit the variation or get rid of it entirely, we have to know what is making it so different. The research was being conducted in such a way that we really couldn't draw those conclusions.
Finally, the studies done previous to the one I'm going to talk about today lacked the ability to be generalized. Because of the way they had gathered their data, they weren't able to speak for the United States.
The study I'm going to spend more time talking about today, "Current NIMS Implementation Behavior of United States Counties", tries to put the findings of the pre-existing research to the test and to overcome, at least partially, the limitations of that research.
By employing a national random sample of county emergency managers, it was able to speak to all counties in the United States, and by employing a quantitative survey tool, it allows us to distinguish and explain why we found what we found in the data.
It wasn't just the NIMS research that suggested that behavior was a critical thing to look at. Decades of policy and public administration research have also suggested that behavior, what local level entities do and how they act, has a strong influence on whether policies perform as intended, what the outcomes of those policies are, and whether they achieve their goals.
Based on the prior work on NIMS and the other literature out there, I engaged in this study. A mail survey and an internet survey were employed with this national random sample of county emergency managers, and as part of that survey, I focused on asking how counties were behaving, or rather how they wanted to behave, with respect to NIMS.
County emergency managers were asked to evaluate their counties overall, similar to what they are regularly asked to do with NIMSCAST (NIMS Compliance Assistance Support Tool). They were asked to rate on a scale the degree to which their county wants to implement NIMS. Points were specified on that scale, including "not at all", "minimally", "with modest modification", and "as designed", and county emergency managers could pick points in between.
So I looked at several aspects of NIMS. First, NIMS is supposed to be used on a daily basis, as designed. I asked county emergency managers to identify the degree to which their county wants to implement NIMS on a daily basis.
I asked them whether or not they want to implement it in small scale events, whether or not they want to implement the various elements of the preparedness component of NIMS, whether or not they want to implement the structures and processes associated with resource management in NIMS, and the same for communication and information management and command and management.
In addition to that, I collapsed all the data gathered along these various lines of intent into an index, or summary score, to look at counties' general behavioral intent overall. So I looked at the various little parts of NIMS as well as the overall impression. What I found is pretty simple.
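To make the index construction concrete, here is a minimal sketch of how per-component ratings might be collapsed into a summary score. The component names and the 0 to 4 coding of the scale are illustrative assumptions, not taken from the study's actual instrument.

```python
# Hypothetical sketch of collapsing per-component NIMS intent ratings
# into a single summary index. Component names and the 0-4 coding are
# illustrative assumptions, not the study's actual instrument.

COMPONENTS = [
    "daily_use",
    "small_scale_events",
    "preparedness",
    "resource_management",
    "communication_info_mgmt",
    "command_and_management",
]

# Assumed scale anchors: 0 = "not at all", 1 = "minimally",
# 2-3 = modification of varying degrees, 4 = "as designed".
def intent_index(ratings):
    """Average the per-component ratings into one behavioral-intent score."""
    return sum(ratings[c] for c in COMPONENTS) / len(COMPONENTS)

# Example: a county reporting modest modification on most components.
county = {c: 2.0 for c in COMPONENTS}
county["daily_use"] = 1.0  # only minimal daily use
print(intent_index(county))  # prints 1.8333333333333333
```

A mean is used here only to mirror the summary-score idea described above; the study could equally have reported the full distribution of per-component ratings.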
I found substantial variation in behavioral intent confirming the findings of the previous literature. Basically, what I found is a standard normal curve. The vast majority of counties in the United States are modifying NIMS. They want to modify the system.
Why do they want to modify the system? There are two partial explanations. The first is that they may feel the system isn't perfectly suited to the emergency management needs of their county and requires some tweaking to be suitable. On the other hand, it could be that they could not implement it as designed for some reason, even though they felt the system was perfectly suited to their needs.
Regardless of why there is this variation, a key finding here is that not everyone was buying into NIMS as it is designed. And if their intent fits with what they are actually doing, then they are not using the system in a standardized fashion.
Given that NIMS has been required for use since 2006 and this survey was conducted in 2010, as a researcher I would have expected that, at the very least, counties would intend to implement the system as designed, even if just to be compliant, because counties want the preparedness funding. But this survey research indicates that counties actually intend to implement NIMS at very different points along the continuum of behavior.
There are some that want to implement NIMS in a way that is conducive to the policy fulfilling its goals; they are trying to do it as it is designed. Then there are counties that don't even want to do it as designed; they want to modify it. And if you look at the standard normal curve, that also means there are counties that are not implementing the system at all, or only minimally.
We can conclude, before we even look at their actual implementation behavior, that there are a significant number of counties in the United States that don't even intend to implement the system the way it is designed, and because there are counties that intend to modify the system, the way they are conducting emergency management potentially varies from county to county.
It is possible that this can undermine the success of NIMS as an organizing system.
Turning to actual behavior, it was measured in a similar way. Here I asked county emergency managers to identify, again on a scale, the degree to which their counties are actually implementing NIMS. They were given options of "not at all", "minimally", "modestly modifying the system", and "as designed", and points in between.
They were asked again whether they were actually implementing the system on a daily basis and in small scale events and incidents; the various elements of preparedness within NIMS; resource management; communication and information management; and command and management. Once again, I wanted an overall impression of what counties are actually doing, so I summed up that data and made an index.
Again, rather than bog you down with numbers, I'll show you a simple picture which really conveys the impression. I found substantial variation in the actual implementation behavior of counties, with most counties actually modifying the system in implementation.
I found this same pattern of variation in terms of the degree to which they are implementing the system on a daily basis, within small scale events, within preparedness, resource management, communication and information management, and command and management, and overall. In every single line item, there was a pattern of variation that is very similar to a standard normal curve, with the vast majority modifying the system in each component.
These findings are problematic for a system that is intended to bring about standardization and foster predictability and coordination. The findings really led me to more questions that I could not answer with this one research study. What kind of problems would these modifications trigger for the system's use and usefulness in a really large scale event?
If we were to see a Hurricane Katrina again, with multiple states, multiple counties, and the federal government responding at the same time, and with so many entities responding, and we know there is substantial variation in the way they've implemented NIMS, I wonder whether we are going to see the improved response we are looking for as a result of implementing NIMS.
I was also led to question how counties are modifying the system. Given that they report modifying the system, are they modifying it the same way? Is one county's modification compatible with another's? Are they modifying the system in a way that still allows agencies, organizations, jurisdictions, and levels of government to merge effortlessly into a common structure to contend with hazard events?
That is what we are counting on the system to do. When we look at the national data across the country, we are seeing a significant pattern of variation that makes me wonder what we would see in a large scale event like Hurricane Katrina.
Given that we have seen this pattern of variation across the country, I also wanted to look at what might explain it. The article that I was asked to discuss, the current implementation behaviors article, didn't really get at that issue, but the study I conducted did.
In addition to asking counties about how they wanted to implement the system and to what degree they actually were, they were asked a series of other questions and to evaluate the extent to which their county agreed or disagreed with a wide variety of other statements.
An article entitled "Explaining the Current Implementation Behaviors of United States Counties" will be coming out soon, and I wanted to preview its findings for you. Basically, I looked at a series of emergency management specific variables and tested whether how counties stood on these variables statistically predicted their behaviors.
If a county is in a rural area, does that mean it is less likely to implement the system? If a county had responded to one presidentially declared disaster in the last ten years, did that predict it? If they had responded to zero, or to ten, was that a relevant factor? Was the amount of emergency management preparedness funding they had received from the federal government relevant?
Was it relevant how much experience the emergency manager had, or whether they were full or part time? A lot of these variables had been suggested as important in the earlier work on NIMS, and what was remarkable and incredibly surprising to me was that none of those emergency management specific variables bore out.
In other words, ruralness, the amount of money you have, part time or full time or quarter time status: none of those variables predicted how counties wanted to implement or were actually implementing the system. What explained their behavior was a set of variables introduced from the public administration world and the policy literature, variables that had been shown to explain how other policies are implemented and how local level implementers behave.
Basically what ended up explaining that behavior was whether or not there were clear goals for the policy, whether or not there was adequate technical support, realistic timelines for implementation, whether or not there were incentives and sanctions, whether or not adequate financial support had been provided for the implementation of the system, and other like variables.
In other words, those policy characteristics that are controlled to a large extent at the federal level, and whether or not counties thought they were present, predict whether counties want to implement the system and whether they are actually doing so.
The second strongest variable that explains how counties are behaving is implementer views. This is basically whether or not counties bought into the system. If counties bought in and had a good attitude and were motivated to implement the system, that explained how they would behave.
The third variable that predicted both intent and actual behavior, which in terms of the construct developed in this survey was a very tiny part of it and came as quite a surprise, was whether or not counties thought they had enough personnel to implement the system, as in having enough bodies, people, to implement the system. That was a teensy tiny little variable, but it ended up playing quite a big role in explaining how counties are implementing the system.
Basically, what I found in the data was that intent, how counties want to behave, and how they actually behave are explained by factors that are not specific to emergency management. These are things that can actually be concentrated on and adjusted to some extent. In other words, the federal government can do things about the characteristics of this policy, whether that is providing more realistic timelines, providing adequate technical support, or introducing guidelines from the beginning about how to implement each component of NIMS.
It could be more active in doing those things now and in providing support, and we could expect, based on the data, to see an increase in standardization in the ways that counties want to implement and are implementing the system. We could also see the federal government and local emergency managers engaging in more activities designed to bring about buy-in and commitment from the various parts of their counties, states, or the nation that are supposed to be implementing the system.
We can do more here if we want to see standardization increase. We can also correct the issue of local capacity and the perception that there aren't enough personnel to implement the system. This could be a training issue: a lack of understanding about how NIMS can be implemented and utilized. It is designed to be used in part or in whole, depending on the requirements of the incident being responded to. That could be resolved quickly, assuming that is the issue.
Here's the kicker, and something I found really fascinating in terms of NIMS ultimately being able to fulfill its goal in this country. The variables I just mentioned, we can do something about. There is work to be done there, and we can expect to see standardization improve. But when it comes down to counties implementing the system, there is one variable that matters, and it has nothing to do with whether or not they want to do it.
When it comes down to it, inter-organizational characteristics are the fourth variable that predicts whether counties actually are implementing the system, and how. What was I looking at with inter-organizational characteristics? Whether or not county organizations get along. Do they like each other? Do they have solid working relationships? Do they share goals and resources? Do they commonly acknowledge that a system like NIMS is needed?
Those inter-organizational characteristics are very much a local issue. There is very little, shy of providing increased funding for exercises, that the federal government can do to control this variable. There is also very little the state can do or a local emergency manager by themselves can do to control this variable.
This is a community wide or county wide issue, and the data strongly suggest it is related to whether counties are implementing the system and how. I also wanted to give you a look ahead to the next part of this study, the findings of which will be released shortly.
I wasn't done with this topic, and I don't think I ever will be. There is so much to explore. I took a small subsample of the national random sample that had participated in the study I have been talking about up to this point. I sent the same survey to a public health official, an elected official, a fire chief, the head of local law enforcement, and a school administrator.
I sent the survey to six different individuals within each county who are supposed to be seeing NIMS implemented within their organizations. It was a very small sample and was meant to be exploratory. What I wanted to find out was: were county emergency managers accurate in the way they were assessing their county's implementation? Did these county emergency managers really have a good feel for how their counties felt and the degree to which they wanted to implement the system?
Were they accurate in their assessments of how NIMS was being implemented? Please note, this was an exploratory, small sample study, so the findings are not generalizable, but the initial finding was that county emergency managers consistently overestimated their county's implementation intent and actual implementation.
Going back to what this graph looks like, that would mean the curve would move to the left, with most counties in the Minimal to With Modification area, if the findings of this initial study were accurate. The initial findings suggest that county emergency managers were overestimating the implementation of their counties.
In case you are curious, I will add that public health and fire departments had the highest perceptions of NIMS and ICS. They wanted to implement it the most, and they reported actually implementing it to the largest extent in line with what NIMS is intended to do. Elected officials and school administrators were in the middle.
The groups at the bottom were law enforcement and public works. They had the lowest perceptions of NIMS, didn't want to implement the system as much, and reported behaviors that did not align with the system's intent.
That summarizes where I am with my current research. I look forward to hearing your thoughts on the implications of these findings for our country, and I am also willing to take any questions you might have.
Amy Sebring: Thank you very much Jessica. The findings were just fascinating. Now, to proceed to our Q&A and our audience comments.
[Audience Questions & Answers]
Jordan Nelms: Does your research investigate the intersection of ICS and the Emergency Support Function (ESF) framework within EOCs?
Jessica Jensen: My research hasn't looked specifically at the intersection of ICS with ESFs in EOCs, but I can report that a study was conducted on that very topic by Mike Lindell, and its findings are available to review. I believe it was published in Disasters, and you can look into that study. [Lutz, L.D. & Lindell, M.K. (2008). The Incident Command System as a response model within emergency operation centers during Hurricane Rita. Journal of Contingencies and Crisis Management, 16, 122-134.]
I have observed in my work on scene in the response to the tornado that there were significant issues between the Incident Command Post and the EOC in part because of the different organizational structures that were employed within each of these organizing units. They had a lot of difficulty communicating with one another and they had challenges with using ICS versus using an EOC that was based on the Emergency Support Function.
I did observe that within that one study, even though I wasn't specifically studying the issue. In addition, in my conversations with emergency managers across the country, I have heard that this is still a significant issue. However, I have also heard there is not a strong interest in organizing EOCs in the ICS manner.
I've seen more interest and practice in using the Emergency Support Function structure in EOCs. What that means for the future is that we have some work to do in that area so these two units can talk to one another.
John Bowman: How confident are you that counties' baseline understanding of NIMS was the same? How do you test that?
Jessica Jensen: I can say from the beginning that tackling the issue of how to measure NIMS and NIMS implementation and perceptions of the system is just an enormous task. There were many discussions about how to go about doing this. Unfortunately there were very few examples in prior work done on these topics even in policy implementation literature of how to get at this issue.
The topic I was looking at, behavior, was often spoken of as critical, important, and highly influential in whether or not a policy achieves its purpose, but very few people had studied it, in part because it is so darn hard. In terms of going about measuring this issue, anyone who has taken NIMS training gets the sense that there is a design and a purpose.
There is a design, and a series of steps within each of the five components, that they are supposed to be implementing. County emergency managers are required to report their progress and their compliance with implementing NIMS as designed through NIMSCAST. So it was assumed that every single county emergency manager has a sense of what NIMS is supposed to be.
They should also have a sense, then, of whether they are doing it the way the NIMS document, the training, and NIMSCAST tell them it needs to be done, which is its design, or whether they are not doing it that way. There is no assurance that they have a shared understanding of what NIMS is, its purpose and goals, or whether their personal views align with that, but there is the assumption that they have taken the required, mandated training, which certainly makes clear what the intent and design of NIMS is.
Richard Vandame: About 40% of NIMS is the Incident Command System. Did the findings break out ICS implementation separately from the other NIMS concepts?
Jessica Jensen: In this particular study that I spent the most time speaking about today, I did not break out and look at ICS specifically. ICS is a subcomponent of one component of NIMS (it is one of three subcomponents of the command and management component), so it is just a small part of NIMS overall.
But I absolutely recognize that across this country, particularly outside of emergency management and outside of the fire service, there is a tendency to confuse ICS with NIMS. While this study didn't address this issue, I did address it in the study of volunteer fire department perceptions of ICS and NIMS.
I asked a series of questions of these volunteer fire department chiefs and firefighters about ICS and I asked them a series of questions about NIMS, and I was able to statistically test to find out that they do perceive these two systems separately. There is the ability to distinguish one from the other. Now that probably fits with most expectations since ICS originated out of the fire service, but it still gets at this issue.
Is there the ability of folks to tell the difference between the two and separate one from the other? At least now the suggestion would be yes, although I have yet to replicate that finding at the national level.
Boise State University: Could one of the problems be that the training focus has been predominately on ICS and not NIMS as a whole? Our local training has been taught by local responders who don't recognize (or realize) the importance of the roles of the EOC, Policy Group, and external agencies outside of Fire, LE, and EMS. The result is that our local responders are standardized at the ICS level, but other components of NIMS don't understand their roles as clearly.
Jessica Jensen: I absolutely concur. I think that gets at one of the variables that was found in this study to predict behavior and intent, and that is policy characteristics. Was there adequate training? Is there adequate guidance and technical assistance? The answer was resoundingly no.
Anecdotally, based on my personal experience observing these classes in various locations, I can also tell you that there is a tendency to focus overly on ICS, when in fact it is a small component of NIMS relative to the overall design of the policy.
So I agree. If we want to stick with NIMS, and we believe it is our ideal, that is definitely an issue we have to look at going forward: addressing the whole of NIMS instead of focusing on one subcomponent of one component of the policy.
Amy Sebring: This has been around for awhile. I would have expected that the original NIMS would go through a periodic evaluation and updating as needed. As far as I know, that process has not been occurring. What can you tell me about it?
Jessica Jensen: NIMS was redrafted and released in 2008, with a number of stakeholders around the country at all levels given the opportunity to comment. That revision did present the system differently on a number of fronts, both substantively (some of what was required changed) and in its tone or tenor, which made it more clear that implementation of this system is a collaborative exercise, with various levels of government and various departments playing a role and having choices along the way.
Both in substance and tone, it changed. As for going through more evaluations, the NIMSCAST gives the federal government and states the ability to evaluate the extent to which their various jurisdictions are making progress. From a research perspective, I can tell you that data is not readily available to look at.
The findings of any analysis that is being done are not shared, at least not with me and my peers in academia. One explanation for that is that the states own the NIMSCAST data they collect from their counties. The federal government does not officially own the county data. They just get summary reports from the states, and the states own the rest of the data.
I certainly anticipate that evaluations are done based on the NIMSCAST data; however, I am not sure to what degree, if at all, those findings are being shared.
Cynthia Chono: Sorry, I was late in joining, but I would like to know what agencies were evaluated. I'm surprised that Public Works agencies were on the low end of implementing ICS/NIMS. The American Public Works Association is very active in supporting NIMS and ICS training. The APWA accreditation program requires having an Emergency Management program and implementation of NIMS/ICS.
Jessica Jensen: I would reiterate my caveat in presenting those findings: this is a very small study and the findings are not generalizable, so I absolutely, in no way, shape, or form was trying to speak for public works directors at the county level across the United States, only a small subset of them.
My work, in terms of evaluating the data I collected, was to see the degree to which it aligned with the county's information and then to separate the various entities from one another. Public works, in this small study, was at the bottom.
As to whether or not I would anticipate finding the same pattern with public works directors on a national scale, I don't know. I just wouldn't hazard a guess. I realize there is a lot of support at the national level for public works to implement this system. There is support, too, across the United States through all kinds of organizations (the volunteer firefighters, the National Level Council, the National Fire Protection Association), yet we are still seeing these patterns of variation.
Specifically with public works, I don't know. I am confident that what we're seeing is variation; irrespective of national-level professional organizations' work or FEMA's work, we are still seeing that variation.
Frank Riedijk: Have you reviewed NIMS Compliance with hospitals, as they interact with Public Health, local EMS, and Law Enforcement?
Jessica Jensen: Here's the difference with hospitals, and this is the key thing to know. Hospitals are required to be compliant for accreditation. They will not be an accredited hospital if they do not implement the system. They are regularly evaluated in person. Inspectors come to hospitals in person to check what is being done and what is not being done.
Based on that report, hospitals do or don't receive accreditation. Talk about a powerful incentive and a powerful sanction to motivate compliance with NIMS! The very limited work that has been done on hospitals and their compliance with NIMS has focused only on ICS, and what has been found is that hospitals have been complying at a far higher rate than I have seen in my data on implementation of the system.
I would expect to find that because there is someone there checking and enforcing it and it matters to the hospitals and their existence whether or not they are being compliant.
Robert Hare: I believe that the issue of implementation in a strictly standardized manner vs. desire to implement in a modified manner needs to be further studied in detail and is a key question. While my region in New York has some minor modifications, it has had no problem with us operating under incident management structure in an event six hours (drive) away in a suburban/rural area. This may be because the fire service has a long history; this ability to operate did not vary between volunteer and career department personnel. In short, if I mix some vanilla Carvel and Breyers ice cream together, does it make it non-edible?
Jessica Jensen: Not only did I not get at it, and it needs further research, but looking at the complexity of what NIMS is, at least on its face, and what it tries to do, I don't think there has ever been a more sweeping, grand, ambitious policy mandate, certainly in the history of emergency management, and I would dare say in many respects outside of emergency management.
This policy tries to be everything and to structure how we do emergency management through its components. To study the degree to which these activities are actually being undertaken (these processes, and even the use of terminology), on-the-ground research is required. You have to be there to see it and to stand back objectively and watch whether or not it is being done.
In order to get some economy of scale, in order to report across the United States, I would argue that it is not feasible to do that kind of work. Yet that is the critical work that needs to be done. You are caught in a catch-22: do we just keep doing small case studies, like the one I did with the tornado, where we are on the ground observing, and never know the degree to which we can generalize? Or do we try to gather data in other ways that gets at these issues, not perfectly, but at least allowing us to assess where we are?
I agree with your comment. We need a lot more work on this topic. I hope others take up the challenge and I look forward to reading their findings.
Keith Holman: Is it possible that the implementation of NIMS is impacted by the perception that NIMS is fire and not all hazards?
Jessica Jensen: I would suggest that is not the case based on my findings. The findings looked at issues related to whether or not there was a perceived need for NIMS and whether or not counties agreed with the goals of NIMS. The study didn't ask them if they thought NIMS was a fire policy, but it certainly asked them if they thought it was relevant to their county.
I can report that counties believe NIMS is a good idea. On the whole policy design issue: is NIMS a good idea? Is NIMS needed? The answer across the United States is yes. So it is not a disagreement with the concept of NIMS where I found issues with implementation. It actually gets to those other aspects of the policy that I have already addressed.
I didn't ask that question specifically, whether it was considered a fire policy. But I did ask: do you think there is a need for it? Do you believe it is a good idea? Do you think the policy is needed? In all cases, the majority said yes.
Richard Vandame: How did you determine the impact of non-conformance with NIMS on the success of the response effort?
Jessica Jensen: I did not look at actual response efforts. I asked county emergency managers to assess their counties. We know from the findings of these studies that counties are implementing NIMS in different ways across the country. They are modifying it. I saw differences within counties and states in terms of data I received.
There is a lot of variation out there. What is going on with NIMS implementation is really complex. Would that matter in a day-to-day event, or even a Joplin event? I would suggest no. I don't think you are going to see the impacts of not implementing NIMS in a standardized way in those jurisdictions.
We saw a great coordinated and collaborative effort in the response to Joplin. Whether or not they were utilizing each of the various components of NIMS or ICS (and this is a new topic I've been looking into) doesn't seem to be relevant, because it was a small geographic area and there were a limited number of responders.
We won't really know the answer to that question until we have another event the size of 9/11 or Hurricane Katrina, or, looking back into the past, Hurricanes Andrew or Camille. When we have events like that crossing multiple jurisdictional boundaries, multiple states, and every level of government, that's when we really have to see standardization. That is when it is very clearly to our benefit, and it has to exist for things to run smoothly.
I would suggest, based on this data, that if a Hurricane Katrina were to occur tomorrow, it is unlikely we would see the response efforts unfold as efficiently and effectively as we would like, particularly given the investment we have made in NIMS.
Amy Sebring: That is a very good note to wrap up on. Thank you very much Jessica. We appreciate your taking the time to be with us today to share this information and we wish you success with your future research. We hope you will keep in touch. We look forward to learning more as time goes on.
Again, the video and audio recordings should be available later this afternoon. If you are not on our mailing list and would like to get notices of future sessions and availability of transcripts, just go to our home page to Subscribe.
Before you go, PLEASE take a moment to do the rating/review! Note: We are asking you to rate the relevance of the information, and this will assist us in our future programming.
Our next program will take place Wednesday, September 28th. Please watch for our announcement and plan to be with us then. Until next time, thanks to everyone for participating today, and please take time to provide your input on this document and the others. Have a great afternoon. We are adjourned.