As Hurricane Ian was bearing down on Florida in September 2022, Floridians braced for a “potential historic catastrophe” in the Tampa Bay area. Instead, a convergence of circumstances, including warm water and a low-pressure system that disrupted the jet stream air current, meant that Ian strengthened more quickly and blew in faster than anticipated – making landfall 75 miles to the south.
While that distance made a drastic difference for the communities affected, the margin of error actually represented a huge improvement in accuracy over storm predictions of the past.
The accuracy of predicting where a hurricane will hit 72 hours before landfall has improved dramatically in recent years. In 2020-2021, for example, 72-hour landfall estimates were accurate to within 75 miles – compared to the 1990s, when there was an average distance of 282 miles between where forecasts expected hurricanes to make landfall and where they actually struck. To place this in context, the entire width of Florida, from Pensacola to Jacksonville, is just 360 miles.
That recent improvement represents decades of steady progress. Forecast accuracy tightened to 192 miles during 2000-2009, then to 100 miles during the next decade, and then to the current 75 miles.
Forecasters have also improved how accurately they can predict total hurricane activity each season, reflected in NOAA’s annual projection of a range of the expected number of hurricanes.
In 4 of the 10 years between 2003 and 2013, the actual number of hurricanes fell squarely within this range. But in the last decade, the actual number of storms fell within NOAA's predicted range in 7 out of 10 years, a significant improvement. Indeed, since 2014, the actual number of hurricanes each season was off by an average of only 0.5 storms per year compared with the predicted range.
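That "off by an average of 0.5 storms" metric can be made concrete: for each season, measure how far the actual hurricane count fell outside the predicted range (zero if it landed inside), then average across seasons. The sketch below illustrates the calculation with hypothetical season data, not the real NOAA forecasts or counts.

```python
def range_miss(actual: int, low: int, high: int) -> int:
    """Distance from the predicted range; 0 if the actual count falls within it."""
    if low <= actual <= high:
        return 0
    return low - actual if actual < low else actual - high

# Hypothetical seasons: (actual hurricanes, predicted low, predicted high).
# These numbers are illustrative only.
seasons = [(7, 5, 9), (10, 6, 10), (11, 5, 9), (8, 4, 8)]

misses = [range_miss(actual, lo, hi) for actual, lo, hi in seasons]
avg_miss = sum(misses) / len(misses)
print(avg_miss)  # 0.5 for this sample: one season missed by 2, three hit the range
```

Averaging the miss distance (rather than counting hits and misses) rewards forecasts that are nearly right even when they fall just outside the range.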
It’s unlikely that forecasters will be able to predict hurricanes with 100% accuracy anytime soon, but their dedication to improvement will continue to save lives in the years ahead.