15 March 2009
Posted by Dan Satterfield
I had an email last week from a viewer who was upset that he had canceled his golf game three days in a row because I had forecast rain. He pointed out that it was Thursday, and the rain had finally started.
My first thought was that he had gotten my forecast mixed up with someone else’s. (It happens frequently. I’ve been blamed for missing a snow forecast when I was spot on!)
Turns out, I had forecast a 25% chance of rain, at most, early in the week. (I had indeed forecast a 95% chance of rain for Thursday.) This was apparently enough for this gentleman to call off his Monday and Tuesday golf games. Whose fault is it?
Some of the blame, and perhaps most, should go to me.
I would never cancel an outdoor activity on a 25% chance of rain. Nor should you. The problem here is that everyone interprets a weathercast in their own way. Some responsibility is on the viewer to apply a modicum of intelligence, but it’s my responsibility to produce a weathercast that is understandable to the vast majority of viewers.
Easier said than done.
The public has a very poor understanding of what a chance of rain means. When I forecast, I first decide whether we will see precipitation, and then what percentage of the viewing area will likely get wet. If it’s 50%, then that’s the rain chance.
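The two conventions can be sketched in a few lines (my own illustration of the arithmetic, not an official product from either shop; the function names are made up):

```python
def pop_nws(confidence: float, coverage: float) -> float:
    """NWS-style PoP: the forecaster's confidence (0-1) that precipitation
    occurs somewhere in the area, times the expected areal coverage (0-1)."""
    return confidence * coverage

def pop_coverage_only(expect_precip: bool, coverage: float) -> float:
    """Coverage-only PoP, as described above: once precipitation is
    expected, the rain chance is simply the areal coverage."""
    return coverage if expect_precip else 0.0

# 80% sure it rains somewhere, covering half the area:
print(round(pop_nws(0.8, 0.5), 2))    # 0.4 -> a "40% chance of rain"
print(pop_coverage_only(True, 0.5))   # 0.5 -> a "50% chance of rain"
```

You can see why the difference between the two methods is usually minor: whenever the forecaster is nearly certain precipitation will occur somewhere, the confidence factor is close to 1 and the two numbers converge.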
The NWS does it a little differently. My friend and excellent forecaster James Paul Dice (Chief Meteorologist for WBRC in Birmingham) has written in detail about how they do it. The difference between the two methods is usually minor.
As the forecast period increases, accuracy decreases. A forecast beyond 7 days has little accuracy, and your best bet is to look at the averages. There is some skill in temperature (above normal, below normal, or near average) out to two weeks. You would be surprised how often I get requests for a forecast 15 or more days into the future.
On my forecast pages, I have a line at the bottom that says “Forecast confidence _______” with high, low, etc. in it. There is a very good reason for that, and it’s really the main reason I am writing this post. The next sentence is the most important one in this post!
In science, any measurement or prediction is WORTHLESS if it’s not accompanied by a measure of the uncertainty in it!
This is true for weather forecasts, and it’s true for climate science and all of science. The IPCC projects a rise of 4.0°C by 2100 (business-as-usual scenario) with a range of 2.4 to 6.4°C. The scientific confidence in that range is 66%. (Based on some complex statistical analysis, not just a guess that it’s 66%!)
So, to be really correct when I put a forecast on TV, it should actually look something like this:
Sunny and warm tomorrow
High: 72 ± 3 °F
Wind: SE 6-14 mph, gusts to 20 ± 5 mph
Also, if this were in a scientific journal, it would be in degrees Celsius or kelvins, not Fahrenheit.
IF I did this on air, I’m not sure who would want to string me up first, the viewers or station management! Both would likely have cause!
That forecast confidence line is what I do instead. Not every forecast has the same degree of uncertainty. This is important information for the viewer. It also reminds them that there is a bit of uncertainty in any scientific prediction.
Most of the time in science papers, the range of uncertainty given is at the 95% confidence level. In other words, the high will be between 20 and 24°C with 95% confidence. By the way, a 24-hour forecast high is usually within 4 degrees Fahrenheit of what actually happens.
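Where does a number like that come from? A rough sketch, using made-up past forecast errors (hypothetical numbers for illustration, not real verification data) and the common assumption that errors are roughly normal, so a 95% interval is about ±1.96 standard deviations:

```python
import statistics

# Hypothetical past errors (forecast minus observed high), in deg F.
errors_f = [-3, -1, 0, 1, 2, -2, 1, 0, 3, -1]

sigma = statistics.pstdev(errors_f)   # spread of past errors
forecast_high = 72.0
half_width = 1.96 * sigma             # ~95% interval, if errors are normal

print(f"95% interval: {forecast_high - half_width:.1f} "
      f"to {forecast_high + half_width:.1f} F")
# 95% interval: 68.6 to 75.4 F
```

A real verification scheme would use far more cases and check the normality assumption, but the idea is the same: the uncertainty attached to a forecast can be measured, not guessed.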
You may think this is all rather kooky, but you do it too. Ever measured a room for carpet? Say you came up with 14.4 square yards needed. Did you order 14.4 or 17 yards, just in case you were a bit off?
Just so you know, a forecast of 70 degrees five days out is a good forecast in my book if the high ends up between 65 and 75. If you are demanding more than this, you are likely to be disappointed. And a 40% chance of rain means you are most likely going to stay dry. Don’t cancel the golf game!
Cancel golf, what gives, that is what umbrellas are for 🙂
I recently talked about some of the reasons for the uncertainty in a post I did on Chaos Theory, but another good read for forecaster and viewer alike is a book about both uncertainty and the challenges of presenting it effectively, called ‘Completing the Forecast’. The book can be read online at http://books.nap.edu/catalog.php?record_id=11699.
Have heard of this book! Thanks much. Glad it’s free because I have already blown my budget at Amazon for March.
Ok so I don’t do the golfing thing, but if the weather man (in this case you) says there’s a 25% chance it’s gonna rain I pretty much assume it’s probably not gonna rain and if it does it’s not gonna rain much. That was just silly of him to cancel his golf game…..and then blame it on you lol. Good post….gave me a giggle.
The POPS can be so confusing to some people that some weather forecasters don’t like using them. I like them when the % chance (using the NWS system) is between say 30-70% but if under 30% I would think “rain chances minimal” would cover it, if above 70% I’d think “Rain likely” would be good, if 100% I see no reason to offer a percentage – just say “It’s gonna’ rain!” Heheh . . . although I can only see your forecasts on my wifi connection I think you generally do a great job . . . I think more people pay attention to the graphics than the numbers though . . . but that’s a whole different subject . . . if the numbers are confusing perhaps the graphics could fill in the gaps since most people are visual learners . . . ?
Just throwing ideas around as always . . .