In a recent keynote speech at Wharton’s Initiative for Global Environmental Leadership (IGEL), Innovyze CEO Colby Manwaring took the stage to address the current state of flood modeling techniques. The story? We can do better. Wet weather is becoming more unpredictable and the impacts are being felt by our communities. Municipalities, engineering consultants, and government agencies cannot continue to operate with outdated flood models that are plagued by inaccurate or incomplete data. If they do, residents and businesses are left vulnerable to flood risks and costs, compromising the integrity of local governments.
Where We Come From Determines Where We Are
The presentation began with a call to re-evaluate current flood modeling methods: “We need to consider our frame of reference if we are going to get to a solution; we need to consider our assumptions on where we are,” Colby stated. Once we realize where we come from, we can plot a course for the future.
Colby went on to explain that what we currently know about flood forecasting, mapping, and modeling originated in the 1970s. During this period, the foundation was laid for the data collection systems and computing that were later used to shape flood modeling approaches.
These methods were eventually codified and legislated in the developed world based on key data points extracted from weather patterns. The result was a reliance on risk assessments based on 100-year return intervals of flooding (probably because most people think, “I won’t be here in 100 years, so I’m safe”). In an elementary sense, risk assessment was framed as: how likely is an area to flood over a 100-year time frame: once? 15 times? 40 times? This framework is relatable to regulators and the public alike, but somewhere along the way we lost sight of the fact that a “100-year flood” simply means there is a 1% chance of such a flood occurring in any given year. This year’s probability is not affected by last year’s events, nor by future ones, so we can have “100-year floods” at any time.
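The 1%-per-year framing can be made concrete with a quick calculation: if a “100-year flood” has an independent 1% chance of occurring in any given year, the chance of seeing at least one over a longer horizon follows directly. A minimal sketch in Python (the function name and horizons are illustrative):

```python
def prob_at_least_one_flood(annual_prob: float, years: int) -> float:
    """Probability of at least one flood over `years` independent years,
    given a fixed per-year probability of flooding."""
    return 1.0 - (1.0 - annual_prob) ** years

# A "100-year flood" means a 1% chance in any given year.
p = 0.01
print(f"30-year horizon:  {prob_at_least_one_flood(p, 30):.1%}")   # ~26.0%
print(f"100-year horizon: {prob_at_least_one_flood(p, 100):.1%}")  # ~63.4%
```

In other words, over a typical 30-year mortgage there is roughly a one-in-four chance of at least one “100-year flood,” and even over a full century the odds are only about 63%, not a certainty.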
Flood mapping based on this framework was, inevitably, misunderstood and flawed. Weather and rainfall input data were sparse, disjointed, and often assumed, because rainfall is hard to predict and codify. These inaccuracies propagated into the model’s assumptions, which in turn forced tweaking of the output data.
Flood maps based on 100-year risk assessments emerged as a binary, or “single truth,” basis for flood insurance, infrastructure planning, and risk mitigation. Businesses and residents were either in or out of the floodplain, and insurance costs were calculated accordingly. In reality, flood emergencies do not occur in a binary, “in vs. out” manner. They are more fluid than that: graduated events that we try to quantify with probabilistic methods, and floods do not stop at some imaginary floodplain line.
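The graduated alternative to an in-or-out floodplain line is an exceedance curve: the probability that flooding at a site surpasses a given depth in a year. A toy sketch of this idea, assuming (purely for illustration, not calibrated to any real floodplain) that annual peak flood depth at a site is log-normally distributed:

```python
import math
from statistics import NormalDist

def exceedance_prob(depth_m: float, mu: float = 0.0, sigma: float = 0.5) -> float:
    """P(annual peak flood depth at a site exceeds depth_m), assuming
    ln(depth) ~ Normal(mu, sigma). The default parameters are
    illustrative only."""
    return 1.0 - NormalDist(mu, sigma).cdf(math.log(depth_m))

# Risk falls off gradually with depth; there is no single line
# beyond which it drops to zero.
for d in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(f"P(flood depth > {d:>4} m) = {exceedance_prob(d):.1%}")
```

A curve like this communicates risk as a gradient, which is how flood events actually behave, rather than as a binary zone designation.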
What Happens when the Map is Wrong?
In 2012, Hurricane Sandy ravaged the US east coast and the Caribbean. Hundreds of thousands of housing units were destroyed and billions of dollars were spent in reconstruction efforts for infrastructure, homes, and businesses.
In his first major address following the disastrous storm, former New York City Mayor Michael Bloomberg compared the FEMA 100-year flood maps to the actual flooding caused by the storm. According to the former mayor “two-thirds of all homes damaged by Sandy [were] outside of FEMA’s existing 100-year flood maps.”
Mr. Bloomberg also stated the city’s need to adapt to risks posed by climate change:
“We have to build smarter and stronger and more sustainably,” he said, noting that the city would “determine exactly what that means.” He later added: “No matter how much we do to make homes and businesses more resilient, the fact of the matter is, living next to the ocean comes with risks that we cannot eliminate.”
As weather patterns become more erratic and superstorms strike unpredictably, we do need to build stronger and more resilient cities. It is also true that we cannot eliminate all risks faced by municipalities in flood-prone areas. But just because we cannot eliminate the risk does not mean we cannot be better prepared in our flood modeling and mapping.
In the aftermath of Sandy, FEMA redrew the flood maps for the first time since 1983, in some cases relying on data that was 30 years old. The initial maps released by the agency included some unfortunate news for New York City residents: 35,000 more homes and businesses would be in flood zones, essentially doubling the previous number and likely raising insurance rates.
In a rare move, the city rejected FEMA’s proposed maps, and in 2016 current New York City Mayor Bill de Blasio announced a plan to revise FEMA maps with the intention of lowering flood insurance premiums for New Yorkers. However, with millions of dollars at stake in property value and insurance costs, there are likely political implications in the Mayor’s decision to challenge the agency’s proposed updates.
Impacts on Enterprise and Democracy
The focus of this event at IGEL was “The Consequences of Extreme Climatic Disruption for Business and Democracy”.
Colby stated that it all starts with acceptance: “We’ve got to accept that our methods and our plans need to be dynamic, not static single-source of truth answers. We need to use the best available methods, models and data now, not rely on prescriptive regulations from decades past.”
Concerning the private sector, outdated flood modeling and maps can lead to repetitive property loss, faulty business continuity planning, and potential job loss. For heavy industries, environmental externalities are of heightened importance considering that flood damage can lead to contamination and pollution.
So, what can be done? Some businesses may choose to conduct their own flood risk assessments rather than rely on FEMA. They can contract experts and consulting engineers to produce a more accurate assessment of mitigation strategies and define emergency flood plans. While this carries up-front costs, it can be a valuable investment considering the alternative.
The negative impact on democratic governments can be wide-ranging and profound. For example, repetitive disaster episodes can erode the rule of law as societal norms are shaken during times of disaster. As Colby indicated, government agencies lose public confidence when they are perceived as incompetent in flood disaster planning, or indifferent to its impacts.
Legislators need to modernize the regulatory approach; state, federal, and local governments can avoid this loss of confidence if they do. Compromises that allow building in flood-prone areas and politically motivated negotiations of flood zone designations must end if governments are to maintain good faith with the people they represent.
Can We Do Better?
The answer to the question posed is not only a resounding “YES,” but an acknowledgement that we must do better. The stakes are too high, and held by too many people, to be ignored.
As Colby put it, “We should not be analyzing the weather data we’ve got now using technology and using methods developed 50 years ago. We have better options!”
Stakeholders need to update their approach. Plans need to be dynamic, not static, and based on probability distribution models that accurately reflect floodplains and can incorporate new data as it emerges.
In Colby’s closing remarks, he stated: “Change is not optional…If we cannot change the climate for the better, we’ve got to change ourselves and our response.”
He’s right, and these changes first require an acceptance that flood modeling methods must be amended. And while updating these methods requires advanced technology, we have the technological capabilities at our disposal.
To watch Innovyze CEO Colby Manwaring’s presentation at IGEL follow the link below: