As described in its presentation, this conference is placed under the sign of un-ness: the unexpected, uncertainty, the unknown, etc. This is a relevant and timely choice, because the concepts used in the field of risk and crisis management have been deeply revisited in recent decades, to the point that uncertainty now seems to be a defining marker of every field of research and action. It is therefore necessary to clarify this evolution.
Information is known to play a crucial role in risk management and in reducing uncertainty. It is a subject to which numerous studies have been devoted, mainly under the label of “Risk Communication”. Yet despite the importance of this research, traditional views of both risk and information must be questioned.
Risk studies have primarily focused on prevention and on crises or disasters. Too little attention has been paid to the relationship between these two moments, which in fact require different forms of action, involve different actors, and whose tipping point is crucial in crisis management. In reality, actors face a wide variety of situations that extend from one moment to the other, and even mix them, without any sharp division between the two temporalities. For example, prevention data are often essential at the heart of a crisis situation.
My talk will focus on the concepts of risk, uncertainty, disaster and degraded situation, and will attempt to clarify and define them in relation to each other. Each covers specific procedures, information needs and modes of action.
Having specified these elements, the second part will focus on the consequences for our understanding of information and propose a vision complementary to the usual definitions. We shall see that producing and disseminating information does not mean that the information is grasped, understood and usable. The production, circulation and reception of information will be analyzed through several case studies, including the work of the Operational Strategic Health Room at WHO, crisis management in Madagascar and the events following the January 2015 terrorist attacks in Paris.
This presentation will conclude that there is no generic information per se that would be ipso facto relevant for all situations. If one does not take into account the context in which information circulates, information itself becomes an additional factor of uncertainty.
Prof. Hans Jochen Scholl
Public administrations have been using advanced information and communication technologies (ICTs) in all areas of government business to increase agility and flexibility, redesign and streamline process flows, improve the quality and scope of service, and strengthen overall safety and security. As in other public-sector areas, these so-called “Electronic Government” practices, or, to use the more recent term, “Smart Government” practices, have also greatly increased governments' emergency and disaster response and recovery capabilities. In more general terms, ICTs are playing increasingly important roles in all phases of disaster and catastrophe management.
In this vein, higher agility and resiliency in disaster response and recovery are needed more than ever, since around the world the frequency, scale, and impacts of natural and man-made disasters have markedly increased over the past decades. When a catastrophe strikes, local responders and communities are regularly overwhelmed by the impact. In early incident response, accurate and reliable information is the scarcest resource, and the larger the disaster, the longer it can take until the full extent of the impact is identified. Understanding the scale and scope of the impact, gaining situational awareness, and forming a common operating picture are the foremost tasks of any response, so that action can be taken in the most informed and targeted fashion.
While advanced ICTs have greatly increased response capabilities, they have also introduced new vulnerabilities. In major catastrophes, an over-reliance on the smooth functioning of advanced ICT capabilities might prove costly and dangerous, since critical information infrastructures might be compromised or even completely destroyed.
Disasters of the magnitude of the 2011 Tohoku catastrophe in East Japan, for example, have demonstrated that responders can be drastically diminished in their capacity to communicate with each other and the public once critical infrastructures such as the electric power grid, computer networks, and wireless networks have been knocked out.
Building and maintaining resilient information infrastructures for all phases of disaster management and for managing all hazards appear to be among the most important undertakings for coping with the far greater frequency of catastrophes observed in recent decades. Responders need to maintain proficiency with a wide range of instruments, from low-tech, manual, cardboard-based systems requiring no electric power to highly sophisticated, networked information and communication systems that provide the most accurate, timely, and actionable information to incident managers and other decision makers.
The talk looks at selected recent cases, discusses the complexities of response efforts and the range and capabilities of technologies supporting response management, and summarizes some lessons learned.
Finally, the talk raises and discusses the question of the extent to which disasters and catastrophes are truly natural, and which might rather be man-made.
Lars Peter Nissen
Strengthening the humanitarian sector’s ability to assess the impact of crisis has been high on the humanitarian policy agenda for several years now. At first glance, it seems like progress has been made: the IASC has developed and endorsed new and stronger methodologies, such as the Multi-Cluster Initial Rapid Assessment (MIRA); a number of new programmes specialise in the gathering and analysis of data; and in several recent operations, innovative ways of assessing crisis have been tested. In parallel with the efforts of the traditional humanitarian agencies, a creative and vibrant community of new actors has emerged, including Crisismappers and the Digital Humanitarian Network (DHN).
However, all these efforts share an almost exclusive focus on the “supply” side of the equation: how to collect, analyse, and visualise data quicker and better. Only very limited attention has been given to the “demand” side: what do decision-makers need to make better decisions?
Very little is known about how decisions are made. It is not clear what role evidence plays, meaning it is an open question whether efforts to collect more data and produce more information are having the desired impact, and making a real difference to operations. This in itself is problematic, but it is even more disturbing that there does not seem to be much of an appetite for tackling this unknown. The humanitarian information management community seems to be satisfied with only an embryonic understanding of decision-making.
The lack of enthusiasm for exploring decision-making may stem from its political nature. Improving analysis is primarily a technical problem; examining decision-making forces us to recognise that decisions are political. It makes us ask what may be influencing decisions other than the needs on the ground. This is a hard question, but it is vital that we ask it if we are to improve our capacity.
The keynote will explore the consequences of our lack of understanding of decision-making from a practitioner’s point of view. Working from concrete operational examples, it will offer suggestions on how we can deepen our understanding of humanitarian decision-making, and from there, move forward and make it more accountable.