A LITTLE BIT ABOUT MODELS
Written by: Bob Metcalfe
You load up your models. You're looking for the next big storm. Heck, the skier in me is anxiously waiting for 12"+.
You scroll through the GFS hours. Nothing on the 12Z. I'll wait 'til the 0Z. 24 hours... nothin'. 72... yup, nothing. 184 hours... what's that I see? A low forming east of the Rockies, and it looks like an upper-level trough may join it... I know where this is going! Then BINGO: the 204-hour GFS data gives me (you) EXACTLY what we're looking for: a gorgeous Nor'easter, right on Christmas!
It's true. If I were to issue a forecast for Christmas Day based solely on yesterday's 0Z GFS, we'd have one HECK of a storm on our hands. It'd probably offer us a mixed bag of precip and cause one heck of a mess for Northeast travel, and of course locally.
Now I don't want to be the bearer of bad news, and even though I love a good snowstorm just as much as the next guy, I'm not excited. My Meteorology 101 professor called it "wish-casting". I had a kid who sat next to me all year who would jump for joy at the sight of the extended-range data the GFS pumped out around this time of year, and you know what? He was always disappointed 8 days later!
My point: take model data with a grain of salt. While forecast models are absolutely useful tools to meteorologists (and you, the weather nerd!), we need to know their strengths and weaknesses. The GFS is a great tool for locating synoptic (read: national) scale storms and timing them out, as well as identifying cold/warm patterns. But the further out you go, the larger the errors can become.
For local data, on, say, the state level, the NAM (formerly the ETA) is even more useful. It ingests and runs finer-scale data, so there are more grid points over a smaller area to work with. This is our 720p model, if you will.
Then there are the mesoscale models. These can see individual storms, lake-effect snow, and all sorts of cool stuff. This is Hi-Def 1080p with 240Hz technology.
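If you like numbers, here's a quick back-of-the-envelope sketch I cooked up purely for illustration (a few lines of Python; the grid spacings are ballpark figures, not the official specs of any particular model) showing why finer resolution means a lot more number-crunching over a given area:

    # Rough, illustrative grid spacings only -- not official model specs.
    models = {
        "global-scale (GFS-like)": 40.0,  # km between grid points
        "regional (NAM-like)":     12.0,
        "mesoscale (storm-scale)":  4.0,
    }

    domain_km = 1200.0  # pretend we care about a 1200 km x 1200 km box

    for name, spacing_km in models.items():
        points_per_side = domain_km / spacing_km
        total_points = points_per_side ** 2  # horizontal grid points only
        print(f"{name:26s} ~{spacing_km:4.0f} km grid -> {total_points:9,.0f} points")

Halve the grid spacing and you quadruple the horizontal points (and that's before you add vertical levels and shorter time steps), which is why the hi-res models can only afford to cover smaller areas.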
But the problem with these high-resolution models is that even one snippet of bad data at the start can be magnified hundreds of times and give us erroneous model output. Remember, a computer model is nothing more than hundreds of thousands of mathematical calculations of physics, cloud dynamics, heat transfer... the list goes on and on. Since we can't observe every spot at every point in time, there MUST be errors in the data we feed in, and those errors grow.
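To see that error growth in action, here's a toy example (plain Python, and emphatically NOT a real weather model): the logistic map, a famously chaotic little equation, run twice from starting values that differ by one part in a million. Within a couple dozen steps the two runs have nothing to do with each other:

    # Toy chaos demo: two runs of the logistic map, x_next = r*x*(1 - x),
    # started one part in a million apart. Watch the difference explode.
    r = 4.0                 # r = 4 puts the map in its fully chaotic regime
    x_truth = 0.600000      # the "real" atmosphere
    x_model = 0.600001      # our measurement of it, very slightly off

    for step in range(1, 31):
        x_truth = r * x_truth * (1 - x_truth)
        x_model = r * x_model * (1 - x_model)
        if step % 5 == 0:
            print(f"step {step:2d}: difference = {abs(x_truth - x_model):.6f}")

The real atmosphere is vastly more complicated than one little equation, but the principle is the same: tiny errors in the starting data double and double again until the forecast bears no resemblance to reality.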
So, keep that in mind when you look at the long-range stuff. I can honestly say I would NEVER go on air and mention a 384-hour forecast from the GFS. That's like me telling you right now what the weather is going to be like on January 2nd of 2010!
With that being said... where's the next snowstorm?!
Great article, Bob. Really puts into perspective just how complicated our climate and weather are - and then to have to forecast it. A meteorologist once told me how he watched steam swirling out of a dishwasher. He watched all of the swirls and turns and shifts that the steam seemingly randomly made as it traveled up through the air. He thought to himself: how can we possibly model that? I think that says it all.
By the way, our snow "drought" is officially over for now. Rochester has pushed its way to the lead in the NYS snowfall derby. We're ahead of all the big NYS Thruway cities, at over 18 inches. In fact, we are literally exactly at average as of today, right down to the exact tenth of an inch. Just a suggestion, this info might make a great blog topic for later today.
Check out the snow standings:
http://goldensnowball.com/