For more than a decade, the media has covered the idea that data is the next oil, and data has indeed become a vital factor in field after field. It is now essential at every level of marketing, logistics, finance, manufacturing, decision making, and more at most private companies (if that's wrong, I'd better polish my résumé and change jobs immediately).

Data could radically change the response to the disasters that afflict so many victims, yet by all accounts it has rarely been used in the emergency responses of the past decade. That may come as a surprise. Over the years, disaster response agencies and private organizations have broadened their data inputs and processed ever more data, but the results have been poor and far from useful.

With the spread of the Internet of Things (IoT), however, this situation is changing. Crisis managers working on the front lines of disasters are gaining access to the data they need to make better decisions throughout the preparedness, response, and recovery cycle. Front-line technologies such as aerial photography from drones, visualization of disaster scenarios, and AI-driven disaster simulation are still far from mature. This is just the beginning of a transformation in disaster response in the 2020s.

At long last, a wealth of disaster data

Emergency response is a battle against uncertainty and ever-advancing time. At the scene of a wildfire or hurricane, everything can change in seconds. A road deemed safe for transporting evacuees can suddenly become impassable as a wildfire spreads; rescue and evacuation teams can be forced to reorganize again and again as they stretch too thin; unexpected events can bring an operation to a standstill. In some cases, an operations center that had full command of the information suddenly loses all of its ground-truth data.

Unfortunately, even obtaining raw data before or during a disaster can be extremely difficult. Looking back at the data revolution in the business world, its early success owed much to the fact that companies had always depended heavily on data to operate. Then, as now, what mattered was digitization: moving work from paper to computers so that neglected raw data could be converted into a format computers can analyze. For business, the past decade has been, so to speak, an upgrade from version 1 to version 2.

In emergency management, many responding organizations have never upgraded from version 0. Take floods: how do we understand where floodwater originates and how it flows? Until very recently, there was not even comprehensive data on flood locations and water flows. For wildfires, datasets on tree locations and flammability were scattered around the world and unmanaged. Even infrastructure such as power lines and cell towers often had no presence in the digital world: equipment that responders cannot identify digitally might as well not exist, even when it is physically there.


Flood models are at the forefront of disaster planning and response.

Raw data is essential for models, simulations, forecasts, and analysis. Until recently, detailed data simply did not exist in the field of disaster response.

Now that the Internet of Things (IoT) has become pervasive, all kinds of things are online, with IoT sensors installed across the United States and around the world. Widely deployed sensors for temperature, pressure, water level, humidity, air pollution, power, and more routinely transmit data to warehouses for analysis.

Take wildfires in the western United States as an example. Not long ago, neither the federal government nor state fire departments could pinpoint where a fire had broken out. "Firefighting has a 100-year history, and that tradition has gone largely untouched by technological progress," says Tom Harbor, who served as a director at the US Forest Service for 10 years and is now chief fire officer at Cornea.

He is right. Firefighting is a gut-level activity: firefighters can see the flames and feel their heat on their skin. Yet data was of little use across the vast lands and scattered strips of towns of the western United States. Large fires can be detected by satellite, but a small fire smoldering in the brush is nearly impossible to spot in geospatial imagery. Even so, a small fire that goes undetected can blanket an entire region of California in smoke. So how are firefighters on the ground supposed to get such valuable information?

The promise of IoT sensors has been touted for the past 10 years, but many of the obstacles are finally being resolved. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who studies community resilience, explains that "very cheap and easy-to-use" air quality sensors are now installed everywhere because they provide detailed information about air pollution, an important early sign of wildfires. He cites Purple Air, which not only manufactures sensors but also produces a popular consumer air quality map, as an example of what these recent technologies make possible.

Maps are an essential tool for working with data in a disaster. Most disaster planning and response teams rely on geographic information systems (GIS), and the privately held Esri is the largest mapmaker in this space. Ryan Lanclos, the company's director of public safety solutions, says the growing number of water level sensors has dramatically changed the response to certain disasters. "Flood sensors are always running," he says, and the federal national flood forecasting model lets researchers use GIS analysis to predict the impact of floods on communities more accurately than ever before.


Digital maps and GIS systems are increasingly indispensable for disaster planning and response, but printed maps still have their place.

These sensors have also changed how workers manage infrastructure, according to Cory Davis, director of public safety strategy and crisis response at Verizon (Verizon Media is TechCrunch's parent company, making Verizon our ultimate owner). "Imagine a power company with sensors installed on its power lines. With sensors, crews can rush straight to the site of a failure, fix the problem, and restore service."

Sensors used in this field have made significant progress in recent years, he says, most notably in battery life. Thanks to steady improvements in ultra-low-power wireless chips, batteries, and energy management systems, sensors installed in remote wilderness can run for very long periods without maintenance. "Some devices have a battery life of 10 years," he says. That matters, because front-line sensors often cannot be connected to the power grid.

The same thinking applies at T-Mobile. On disaster preparedness, Jay Naillon, the carrier's senior director of national technology service operations strategy, says: "A type of data that keeps growing in value is storm surge data, which makes it easy to verify that our equipment is operating normally." Because storm surge data comes from flood sensors, alerts can be pushed to disaster planners across the United States in real time.
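The alerting pattern described here is conceptually simple: compare each sensor reading against a flood-stage threshold and notify planners when it is exceeded. The sketch below illustrates that idea; the sensor IDs, threshold, and reading format are invented for the example and are not any carrier's actual schema.

```python
# Hypothetical sketch of threshold-based flood alerting.
# All values and identifiers here are illustrative assumptions.

ALERT_THRESHOLD_CM = 120  # assumed flood-stage water level for these sites

def surge_alerts(readings, threshold_cm=ALERT_THRESHOLD_CM):
    """Return alert messages for readings at or above flood stage.

    `readings` is an iterable of (sensor_id, water_level_cm) tuples.
    """
    alerts = []
    for sensor_id, level_cm in readings:
        if level_cm >= threshold_cm:
            alerts.append(f"ALERT {sensor_id}: water level {level_cm} cm")
    return alerts

readings = [("gulf-07", 95), ("gulf-12", 134), ("gulf-15", 121)]
print(surge_alerts(readings))
```

In practice, a real system would also debounce noisy readings and track rate of rise, but the core decision is this kind of threshold check.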

Attracting the interest of telcos and businesses has been essential to the adoption of disaster-related sensors and other data streams. Governments are the end users of flood and wildfire data, but they are not the only ones who want this visibility. "Most of the time, it's the private sector that needs this information," says Jonathan Sury, project director at the National Center for Disaster Preparedness at Columbia University's Earth Institute. "New types of risks, such as climate change, are affecting companies' bottom lines," he says, pointing to growing business interest in sensor data in areas such as bond ratings and insurance underwriting.

Sensors cannot be installed everywhere, but they have given emergency managers visibility into murky field conditions they could never see before.

Finally, the mobile devices proliferating around the world generate huge datasets of their own. Facebook's Data for Good project, for example, offers a connectivity data layer: if a user who had been connecting from one location later connects from another, it can be inferred that they have moved. Using such data from Facebook and phone carriers, emergency planners can track the movement of people in real time.
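The inference described above can be sketched in a few lines: sort connection events by time, and whenever the same (anonymized) user appears in a new area, count it as a move between areas. The event format and area names below are invented for illustration, not the actual Data for Good schema.

```python
# Illustrative sketch of inferring population movement from connection
# events. Event tuples and area names are assumptions for the example.
from collections import Counter

def infer_moves(events):
    """Count moves between areas from (user_id, timestamp, area) events."""
    last_area = {}     # user_id -> most recent area seen
    moves = Counter()  # (from_area, to_area) -> number of observed moves
    for user_id, _ts, area in sorted(events, key=lambda e: e[1]):
        prev = last_area.get(user_id)
        if prev is not None and prev != area:
            moves[(prev, area)] += 1
        last_area[user_id] = area
    return moves

events = [
    ("u1", 1, "coastal"), ("u1", 5, "inland"),
    ("u2", 2, "coastal"), ("u2", 6, "inland"),
    ("u3", 3, "inland"),
]
print(infer_moves(events))
```

Aggregated over many users, counts like these give planners a real-time picture of evacuation flows, for example from a coastal zone to inland shelters.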

A flood of data and the potential of AI

Where data was once scarce there is now far more information, and the new challenge is coping with a flood of data, much like the literal floods hitting cities around the world. IT stacks built on data warehouses and business intelligence tools are amassing a glut of big data.

Ideally, disaster data would be easy to process, but the reality is not so simple. Because disaster-related data is held by a patchwork of private companies, public agencies, and nonprofits, interoperability is a major obstacle. Even when distributed data can be integrated to yield insights, getting those insights to front-line staff in a form that supports decisions in the field is difficult. As a result, it remains hard to sell AI for anything beyond disaster planning. Verizon's Davis puts it this way: "Many cities and government agencies are struggling with how to leverage the excess of data."

Unfortunately, standardization is a challenge at every level. Globally it is progressing only gradually, and interoperability between countries has barely been achieved. "There are big gaps between countries in both technology and standardization," says Amir Elichai, founder and CEO of the emergency call platform Carbyne, pointing out that a product built for one country often has to be remade into something completely different before it can be used in another.

Tom Cotter, director of emergency response and preparedness at the health-focused disaster relief organization Project HOPE, says even establishing communication among responders is difficult in an international setting. "Some countries allow certain platforms and others don't, and that is constantly changing. I basically have to be prepared with a separate technology communication platform for each country."

A senior official in federal emergency management acknowledged that data compatibility is an increasingly important factor in technology procurement contracts, and that the government recognizes the need to buy off-the-shelf products rather than build its own software. That message has reached companies like Esri. "Our core mission is to be open," says Esri's Lanclos. "Our approach is to make the data we create open to the public and share it, or to secure it based on open standards and share it."

The lack of interoperability has plenty of downsides, but ironically it can also be good for innovation. "Not being standardized is an advantage, because you don't have to meet legacy standards," Elichai points out. Without entrenched standards, it may be possible to build quality protocols designed around the latest data workflows.

Even with interoperability solved, the question of data vetting remains, and disaster-related data carries its own dangers. A sensor's data stream can be verified and cross-checked against other datasets, but the volume of information submitted by the general public is growing dramatically, and it must be scrutinized before being released to first responders and the public.


With smartphones in more ordinary hands than ever, emergency planners need to triage, verify, and operationalize the data that gets uploaded.

Bailey Farren, CEO and co-founder of the disaster communication platform Perimeter, says: "Sometimes it's the general public who has the most accurate and up-to-date information, before first responders even begin their work. We want citizens to be able to share that with government officials." The problem is sorting quality information from the useless and the malicious. Raj Kamachee, CIO of Team Rubicon, a nonprofit that fields teams of volunteer veterans to respond to natural disasters, says data validation is essential, and it has been an important element of the infrastructure the organization has built since he joined in 2017. "The more users we get, the more feedback data we get. The result is a very self-service, highly collaborative approach."

So should AI models be used once quantity and quality are assured? The answer is yes and no.

Columbia's Sury believes AI should not be oversold, as some have done. "It's important to note that machine learning and big data applications can't do everything. They can process a lot of disparate information, but AI won't hand you a concrete solution," he says. "First responders are already processing a huge amount of information" and don't necessarily need more guidance.

In the disaster field, AI is increasingly used for planning and recovery. Sury cites the recovery planning platform OneConcern as an example of combining data and AI in the preparedness planning process. He also mentions the CDC (Centers for Disease Control and Prevention) Social Vulnerability Index, which combines disparate data signals into a few scalar values to help emergency planners optimize crisis management plans, as well as FEMA (Federal Emergency Management Agency) risk tools.
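The general idea behind index tools of this kind can be sketched simply: normalize several per-community signals to a common scale and combine them into one scalar score. The signals, values, and equal weighting below are invented for illustration and are not the CDC's actual methodology.

```python
# Hedged sketch of combining data signals into a scalar vulnerability
# score. Signal names, values, and equal weights are assumptions.

def minmax(values):
    """Rescale a list of raw values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_scores(communities, signals):
    """communities: list of names; signals: dict of signal name -> list of
    raw values (one per community, higher = more vulnerable).
    Returns a dict of community name -> combined score in [0, 1]."""
    normalized = [minmax(vals) for vals in signals.values()]
    scores = {}
    for i, name in enumerate(communities):
        scores[name] = sum(col[i] for col in normalized) / len(normalized)
    return scores

communities = ["A", "B", "C"]
signals = {
    "poverty_rate":   [0.10, 0.30, 0.20],
    "pct_no_vehicle": [0.05, 0.25, 0.15],
}
print(vulnerability_scores(communities, signals))
```

Real indexes typically use percentile ranks over many census variables rather than a min-max average, but the compression of many signals into one comparable number is the same.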

That said, most of the experts I spoke with were skeptical about using AI. As briefly discussed in Part 1 of this series, on the disaster sales cycle, data tools must be dependable, especially when lives are at stake. When choosing tools, Team Rubicon's Kamachee says he focuses on each vendor's practicality rather than its flashiest capabilities. "We pursue high-tech capabilities, but we also keep things low-tech," he says, emphasizing that the key to disaster response is the agility to adapt to changing situations.

Carbyne's Elichai sees a similar pattern in his company's sales. "The market shows both a high awareness of new technologies and a caution that makes people hesitant to adopt them," he says, though he adds that "AI will definitely be beneficial once it reaches a certain level."

Similarly, T-Mobile's Naillon says that from a management perspective, "I don't think we can get the most out of AI" in the company's disaster planning. Rather than making AI the brain of the operation, T-Mobile simply uses data and predictive modeling to optimize equipment placement. No sophisticated generative adversarial networks required.

Beyond planning, AI is used in post-disaster recovery, especially damage assessment. After a disaster ends, infrastructure and private property must be assessed and insured so the community can move forward. Art delaCruz, COO and president of Team Rubicon, points out that advances in technology and AI have significantly reduced the work of assessing damage. Because Team Rubicon often helps rebuild communities during recovery, determining the severity of damage is essential to an effective response strategy.

Sunlight brightens the future, but light can also burn

AI, then, has some utility in recovery planning and disaster recovery, but little so far in emergency response itself. Still, the situations where it proves effective will grow throughout the disaster response cycle. Drones hold great promise, and their use in the field is increasing. In the long run, however, there are concerns that AI and data may create new problems rather than solve old ones.

The value of drones in disaster response seems clear. Even where rescue workers cannot enter, teams equipped with drones can obtain imagery and information from the air. During a mission in the Bahamas, with a major road closed, a ground team used a drone to find survivors, says Team Rubicon's Kamachee. Images from the drone were processed with AI to help identify and evacuate survivors. "It's just a great tool," he says of drones and their potential.


Aerial photography from drones dramatically improves the quality of real-time information available to disaster response teams, especially in areas that cannot be reached on the ground.

Project HOPE's Cotter also says that faster data processing enables a more precise response. "In disaster areas, it's speed, after all, that saves lives. More and more, we can manage our response remotely, so we don't have to send as many people to the site," he says. That matters greatly for response teams working in areas with limited staff.

"More and more emergency management agencies are using drone technology for search, rescue, and aerial assessment," says Verizon's Davis, adding: "AI performance is improving, making first responders more effective, more efficient, and safer."

The ability to quickly process and validate the huge volumes of data sent by sensors and drones will improve the quality of disaster response. With nature's whims producing ever more catastrophes, that capability may be what lets us cope. But one question remains: will AI algorithms themselves cause new problems down the road?

RAND is known for its analysis of alternatives, but RAND's Clark-Ginsberg warns that these solutions can cause problems of their own: "disasters caused by technology, and disasters exacerbated by technology." These systems can fail. They can make mistakes. And most chilling of all, they can be manipulated to amplify havoc and destruction.

I recently spoke with Bob Kerrey, chairman of the board of Risk & Return, a disaster response VC fund and philanthropic organization, former co-chair of the 9/11 Commission, and former governor and senator of Nebraska. He points out that cybersecurity is an increasingly uncertain factor in many response settings. "In 2004 [when the commission was at work], there was no concept of a zero-day, and of course there was no market for zero-days; now there is." Referring to the 9/11 attacks, he said: "Terrorists had to come to the United States and hijack planes. Now it's possible to damage the United States without hijacking anything. Hackers can attack while sitting with colleagues in Moscow, Tehran, China, or perhaps a basement at home."

Data is gaining attention in disaster response, but that attention can create secondary problems that did not exist before. What is given can be taken away. A well gushing oil today may suddenly run dry someday. Or the well may catch fire.
