More computing power leads to better hurricane forecasts

As we head into the late summer and fall months, hurricanes tend to be a little more top of mind. Considering how much goes into the public response to hurricane track and intensity forecasts, it’s extremely important to have the most accurate predictions we can. A great deal of money and manpower goes into boarding up, battening down, and evacuating ahead of a serious coastal storm. The more reliable the forecasts become, the more seriously the public will take evacuation warnings.

Last year’s hurricane track forecasts proved very reliable, but everyone agreed that we could do better. On Monday, the National Weather Service announced that it has recently more than doubled the computing power available for running weather prediction models. In the meteorology world, that announcement is a huge deal. One of the limitations scientists have faced when developing the models is how much data can be processed. More detailed data and improved algorithms, which are necessary for accurate forecasts, require more processing power.

According to the NWS, the new supercomputers are operating at 213 teraflops (TF), while the older ones ran at 90. Not being a computer geek, I can only tell you that a teraflop is defined as a unit of measurement equaling 1 trillion floating-point operations per second. It sounds impressive, right?
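To put those figures in perspective, here’s a quick back-of-envelope sketch (my own illustration, not anything the NWS actually runs) that converts the reported teraflop numbers into raw operations per second and computes the size of the jump:

```python
# Hypothetical illustration of the reported upgrade, not NWS code.
# A teraflop is 10**12 floating-point operations per second.

OLD_TF = 90    # reported capacity of the previous systems (teraflops)
NEW_TF = 213   # reported capacity of the new "Gyre" and "Tide" systems

OPS_PER_TERAFLOP = 10**12  # 1 trillion floating-point ops per second

old_ops = OLD_TF * OPS_PER_TERAFLOP
new_ops = NEW_TF * OPS_PER_TERAFLOP
speedup = NEW_TF / OLD_TF

print(f"Old capacity: {old_ops:.2e} ops/s")
print(f"New capacity: {new_ops:.2e} ops/s")
print(f"Speedup: {speedup:.2f}x")  # roughly 2.37x -- "more than doubled"
```

That ratio is where the “more than doubled” claim comes from: 213 divided by 90 is about 2.4.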

These two new supercomputers are called “Gyre” and “Tide,” and the upgrades are the first in a long time. It’s all part of NOAA’s “Weather-Ready Nation” initiative. By the summer of 2015, the goal is for the computers that run the weather prediction models to be operating at 1,950 TF.

Remember that the models are only as good as the initial data and the algorithms used in them, but more processing power leaves more room to improve those inputs, which means better output and better forecasts.