
Tuesday, April 14, 2015

Edmonton Air Quality

This morning, I read an Edmonton Journal article that claimed that Edmonton's air quality was worse than Toronto's, even though our population is roughly a fifth of Toronto's. The article's subtitle reads: "Particulate readings 25 per cent higher on some winter days."

I'll admit that my initial reaction to this was skepticism - the language used in the article seemed pretty wishy-washy and I wasn't sure what all the fuss was about. It's not unusual for some days in one city to be worse than some days in another. Also, if pollution levels are particularly low on a given day, being 25% higher than another city is easy to achieve and still reasonably healthy. So I decided to look into the numbers a little bit more.

The article continues, saying "pollution from particulate matter exceeded legal limits of 30 micrograms per cubic metre at two city monitoring stations on several winter days in 2010 through 2012." Ok, that sounds pretty bad, but what do these limits correspond to, and how bad is "several", really?

First of all, let's take a look at what makes air unhealthy. The Air Quality Health Index used by Environment Canada looks at three factors: Ozone at ground level, Particulate Matter (PM2.5/PM10), and Nitrogen Dioxide. Exposure to Ozone is linked to asthma, bronchitis, heart attack, and death (fun), nitrogen dioxide is pretty toxic, and particulate matter less than 2.5 microns is small enough to pass right through your lungs and play with some of your other organs. These aren't things you really want to be breathing a whole lot of. The AQHI for Edmonton today is a 3 out of 10, considered ideal for outdoor activities, but at a 10 out of 10 level people are pretty much encouraged to stay inside and play board games.

The report in the Journal article referenced PM2.5 only, which is particulate matter that's smaller than 2.5 microns. The maximum allowed levels for PM2.5 in Alberta are 80 micrograms per cubic meter (ug/m3) in a single hour, or 30 ug/m3 over a day. According to the Journal article, these levels were exceeded "several" times between 2010 and 2012. How many is several?

Data from the Clean Air Strategic Alliance

I don't know about you, but exceeding government safe levels for air quality on one day out of every eleven in 2010 is not what I'd call "several." Combined, air quality limits were broken on more than a month's worth of days in 2010 in central and east Edmonton.
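Tallying exceedance days from a daily series is straightforward. Here's a minimal sketch; the PM2.5 values below are invented for illustration, not real CASA station data:

```python
# Count days where daily-average PM2.5 exceeds Alberta's 30 ug/m3 limit.
# The readings here are made up; real data comes from the Clean Air
# Strategic Alliance (CASA) data warehouse.
daily_pm25 = [12.0, 34.5, 8.2, 31.0, 29.9, 45.1, 15.3]  # ug/m3, one per day

ALBERTA_DAILY_LIMIT = 30.0  # ug/m3, 24-hour average

exceedances = sum(1 for v in daily_pm25 if v > ALBERTA_DAILY_LIMIT)
rate = exceedances / len(daily_pm25)

print(exceedances)        # 3 days over the limit in this toy sample
print(round(rate, 3))
```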

I strongly disliked the phrase "25 percent higher on some winter days" due to its vagueness, but the idea of comparing Edmonton to Toronto seemed fun. Based on the CASA values for Edmonton, and the Air Quality Ontario values for Toronto, here's a comparison of the two cities from 2006-2012:


That's... not even close. Edmonton was 25% higher than Toronto for pretty much all of 2012, not "some winter days." This is enough to make me feel like perhaps the sources referenced in the Journal were using different data, or perhaps I'm mistaken, but the sources I used are all publicly available and I encourage you to check them out yourself.

But what about the other major air quality indicators? Turns out that, fortunately, exceeding their limits has proven to be much tougher. The maximum one-hour limit for nitrogen dioxide is 0.159 ppm, over 10 times the daily average for both Toronto and Edmonton recently:


Similarly, the one-hour limit for Ozone is 0.082 ppm, about four times the recent daily averages:


Again, these levels are much safer than the particulate levels were, and in general Edmonton is about the same or slightly better than Toronto for these indicators.

So all in all, I started out today thinking the article was being alarmist, if vague, and I've ended up thinking that it's well-meaning but presented oddly. Edmonton definitely does seem to have a problem with one of the major indicators of air quality, and if it takes a city-pride fight with Toronto to get that sorted out, so be it.

Monday, December 15, 2014

Winter Tires in Canada

Well there's snow on the ground and the temperature's pretty low, so we can pretty solidly declare that winter is upon us. And with wintry blizzards comes one of the great Canadian traditions: changing over your summer tires for winter tires.


If you're anything like me, you probably waited until just after the first major snowfall to remember to put them on. This often ends up with you driving around dangerously for a week waiting for your appointment, all the while dodging other summer-tire skidders. It's a fairly dangerous and unpredictable way to go about driving.

Recently I tried looking up recommendations for when to put on your tires and made an interesting discovery: almost every single source recommends putting them on once temperatures dip below 7 degrees Celsius. Everyone from the tire producers to the Tire and Rubber Association of Canada agrees with this fairly precise temperature recommendation.

Why? It turns out that summer tires are made of a rubber compound that stiffens below 7 degrees, reducing their friction (one comparison I saw: they approach the consistency of a hockey puck). Winter tires become more effective below 7C, even on clean, dry pavement.

Not to scale. Probably.
If you're looking to drive as safely as possible (which you should, seeing as road injuries are the 9th leading cause of death worldwide), it might not be enough to just wait until the forecast predicts a temperature below 7: it often takes time to book an appointment, and by that point it could be a bit late. Fortunately, Environment Canada has the daily temperatures for various cities over the past several decades neatly stored online.

So I decided to take a look. These are the daily mean temperatures for Edmonton, averaged over the 30-year span from 1981 to 2010:


Since temperatures on each day of the year show decent variation, it's also possible to estimate the probability that any given day will be below 7 degrees Celsius (using each day's average and standard deviation). That might get you something like this:



Once you have this, it's fairly straightforward to choose when to put on your winter tires. If you were willing to accept a 50% risk of being ill-equipped for the weather, you'd be looking to put them on sometime around the beginning of October, and take them off around the beginning of May. That's vastly longer than I typically have mine on for, and I suspect that's the same for many people. In total, an Edmontonian ought to have their winter tires on by October 1st, and leave them on for 210 days (at least seven months of the year!).

Of course, a 50% risk of having the wrong tires might seem a bit high for some people. If you were only willing to accept a 10% risk, you'd be looking at 261 days of winter tires starting September 4.
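The per-day probability can be sketched with a normal-distribution assumption (the post doesn't spell out its exact method, so this is a guess); `prob_below` and the numbers here are purely illustrative:

```python
# Probability that a day's mean temperature falls below 7 C, assuming
# the 30-year daily temperatures are roughly normally distributed with
# the given mean and standard deviation (an assumption on my part).
from math import erf, sqrt

def prob_below(threshold_c, mean_c, std_c):
    """P(T < threshold) for a normal distribution, via the error function."""
    z = (threshold_c - mean_c) / (std_c * sqrt(2))
    return 0.5 * (1 + erf(z))

# A day whose 30-year mean is exactly 7 C is a coin flip:
print(round(prob_below(7, 7, 3), 2))   # 0.5
# A warm day well above 7 C rarely qualifies:
print(round(prob_below(7, 15, 4), 3))
```

Scanning these probabilities across the calendar, the first fall day crossing your chosen risk threshold (50%, 10%, and so on) is your tire-change date.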

So that's all well and good for Edmonton, but how about the rest of the country? I decided to look at 30 stations' worth of data spanning 1981-2010 (~330,000 data points) to try to develop a map for winter tires in Canada. These stations included all major cities and a few select points to accurately represent the geographical differences. This is what I got:


Unsurprisingly, the northern territories tended to need winter tires more than the southern provinces (quite frankly, it's not worth taking winter tires off if you live in Iqaluit). What might be surprising to some is that even the warmest parts of the country, that hardly ever see snow, ought to have proper winter tires on for at least a third of the year.

Another way to represent the data is to show the probability of being below 7C on any given day like this:


Where green means 0% chance of being below 7C, and red means 100%.

The vast majority of Canadian cities have a high risk of being below 7C sometime in October, and it's important to know when exactly that will be in order to be sure you're driving with the best equipment available. In fact, the above graph can be summarized as follows:


One final thing to note: Quebec is the only province with a legal requirement for winter tires, though some British Columbia highways require them as well. Even these legal requirements fall well outside the 7 degree recommendation. It's all well and good to have laws for additional safety when operating motor vehicles, but if they fail to capture the designed temperature ranges of the actual tires, it seems like a bit of a missed opportunity.

Monday, December 8, 2014

2014-2015 ski season

In case you haven't noticed, it's snowed a bit recently in town. And any time it snows in Alberta, I get excited that it's likely been snowing up in the mountains. And that means skiing!

As of December 7, the website OnTheSnow shows that Marmot Basin (the closest ski hill to Edmonton) has a snow depth of 90 cm. That sounds rather decent, and certainly right at the end of November it got a massive dump - but how does that actually compare to normal? I decided to find out.

Here is the cumulative snowfall of Marmot Basin for every ski season since 2007-08:


Alright, so there's quite a bit of variation in there. Maybe a better way of looking at it is like this:


For these graphs, the grey zone represents the maximum and minimum values over the last seven seasons, the light grey line is the average, and the black line is this season so far.

So there's good news and bad news here. The good news is that there's actually quite a bit more snow this year so far than normal! In fact, there's about as much snow at Marmot right now as there typically is by about January first. All in all, maybe not a bad time to go!

The bad news is that, apart from two huge dumps, there really hasn't been much action at Marmot. It was well below any of the previous seasons measured until two weeks ago. Marmot looks like it's in a good position now, but if it hadn't gotten lucky at the end of November it would pretty much just be rocks. In fact, we can tell it *has* been lucky - Marmot Basin typically only gets two or three snowfalls exceeding 20 cm in a day per season (2.43 on average), and has already had two this year. Lucky for now, but hard to predict for the rest of the season.

Marmot Basin is also relatively easy to predict - on average by the end of the season, its total snowfall has a coefficient of variation of 37.1%. It also has a reasonably early season, with 100 cm of snow fallen on average by December 31.
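The coefficient of variation used throughout is just the standard deviation of end-of-season totals divided by their mean. A quick sketch, with invented totals rather than real Marmot Basin numbers:

```python
# Coefficient of variation (COV) of end-of-season snowfall totals:
# standard deviation divided by mean, expressed as a percentage.
# These totals are illustrative, not real hill data.
from statistics import mean, stdev

season_totals_cm = [310, 420, 275, 390, 350, 480, 305]

cov = stdev(season_totals_cm) / mean(season_totals_cm) * 100
print(round(cov, 1))  # percent; higher means less predictable seasons
```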

But how about other Alberta ski hills? Take Sunshine Village, for instance:


Sunshine has a similar situation to Marmot Basin. It's been lagging behind previous years until the end of November (though still within normal ranges), and is now pretty much back on track. Hard to say how that will hold up though. They typically reach 100 cm of snowfall a bit earlier than Marmot does (December 18 on average), and tend to be more predictable (coefficient of variation of 23.8%). They also get far more snow in total than Marmot Basin does...

Lake Louise enjoys a base of 100 cm on average by December 16, but is raucously tricky to predict (coefficient of variation at end of season of 44.3%). Lake Louise has had the same problem as Marmot Basin - it had far less snow than previous years up until a sudden burst rather recently, but it's been flat since. Hopefully that isn't terrible news for the season...

Nakiska's almost doing the best for this time of year out of any of the last 7 years! Good for it. They tend to have more variation at this time of year than other Alberta hills too, so it's actually a bit tougher to say if they'll have a good season or not. They tend not to get a 100 cm base until around January 23rd, and have very unpredictable seasons, with a coefficient of variation of total snowfall of 48.2%.

Norquay's a bit sad. They're well within previous years' ranges, but it's still not looking nice. They'll get their first 100 cm on average by February 10 (yikes), and have a variation in total snowfall around 37.6%. Some years they don't even get 100 cm of snow, though.

Castle Mountain's another sort of sad mountain with a later season (100 cm average by January 11th) and high variability between seasons (43.2%). Both Castle and Norquay seem to have missed the awesome snow dump that the rest of Alberta had, but are tracking a bit closer to where they'd be expected at this point in the season.

So overall for mountains in Alberta, it's looking like now is a great time to go to Marmot, Sunshine, Lake Louise, or Nakiska. They're certainly at least all doing much better than average for this time of year, and will likely continue to be above average for the rest of December.

Summary:

Earliest decent season: Lake Louise (December 16)
Highest average snowfall: Sunshine Village (486 cm by May)
Most predictable: Sunshine Village (23.8% variation by season)

The sad thing is... BC mountains do way better on almost all counts. Take for example Fernie:

(100 cm by Dec 22, average snowfall 705 cm, COV 25.8%)

Or Whistler:


(100 cm by November 24, average snowfall 796 cm, COV 27.7%).

Both mountains consistently and reliably get far more snow than anything in Alberta. While that may make them sound great on paper, they still haven't had the trend-bucking dump that Alberta mountains have had, and are currently lagging quite far behind their Alberta peers. So while I can't guarantee that they'd have particularly good December skiing this year, you certainly ought to be able to rely on them for quality skiing in the mid- to late-season!

Monday, June 24, 2013

Spring Weather


I've now been doing this weather analysis of mine for 365 days, and what a year it's been. That's 365 days of exciting weather, culminating with tornado watches a couple of weeks ago and the horrible flooding around Southern Alberta going on right now.

The end of the year also marks the end of my experiment with tracking weather stations, which I previously reported for the summer, fall, and winter. The scores for spring (out of 100) were:

  • Weather Network: 72.08
  • Weather Channel: 69.56
  • Global Weather: 68.52
  • TimeandDate.com: 68.47
  • Environment Canada: 65.96
  • CTV Weather: 64.31
Just in case you didn't have the previous values memorized, this is how they've changed over the last season:


A couple of things about this are really quite noteworthy. First of all, the forecasters are all a lot closer together than they were in the previous two seasons. Good for them? Also, TimeandDate.com and CTV had massive improvements to their scores, with TimeandDate.com jumping two ranks higher. For the fourth season in a row, the Weather Network and the Weather Channel hold the number 1 and number 2 spots.
As I mentioned before, one of the issues with this analysis is that CTV Weather only presents numerical values for POP up to four days in the future. If we only take these four days for all weather stations, the results change fairly drastically:

  • Weather Network: 74.47
  • CTV Weather: 71.09
  • Global Weather: 70.63
  • Weather Channel: 70.46
  • TimeandDate.com: 69.84
  • Environment Canada: 67.74
The overall score graphs for spring look like this:


As always, graphs like this highlight the fact that weather forecasting gets increasingly difficult the farther into the future you attempt to predict. For any given day into the future, the spread in scores between stations is only about 10 points, and all six forecasters show a nearly identical decline with time.

Since it's now been a year, I can also share the overall yearly scores for each forecaster. They are:
  • Weather Network: 76.39
  • Weather Channel: 73.92
  • Global Weather: 72.00
  • Environment Canada: 71.20
  • TimeandDate.com: 70.02
  • CTV Weather: 60.99*
*CTV Weather isn't a fair comparison as the analysis was started only in the fall, and the best season for all the other forecasters was summer. This probably messes up their average scores a wee bit...

Aside from the difference in numbers of days accounted for and length of analysis, one of the other major differences between stations is the presentation of POP values. For instance, Environment Canada has listed policies that state that if their calculated chance of precipitation is between 30 and 70% it will be listed (at increments of 10%), but 50% isn't allowed to be listed. This forces their values to be listed as either 0, 30, 40, 60, 70, or 100. CTV has a similar policy, but the other stations present values all the way between 0 and 100 at increments of 10.

The problem with this is that all forecasters could predict a 10% chance of rain, and if it does end up raining they'd receive a pretty bad score for that prediction. However, Environment Canada and CTV would receive a much worse score, because they would have published a value of 0%.

If we account for this, the new net yearly scores become:
  • Weather Network: 74.61 (-1.78)
  • Weather Channel: 71.94 (-1.98)
  • Environment Canada: 71.20 (No change)
  • Global Weather: 69.68 (-2.32)
  • TimeandDate.com: 68.14 (-1.87)
  • CTV: 60.99 (No change)
So yeah, changes in how values are presented definitely have an effect on the scores. This isn't quite enough to counteract the full difference between the Weather Network and everyone else, but it does tighten things up a little bit.

One last thing. When a weather station gives a POP value, what is the actual chance of any rain falling?


Ideally, the graph would be a straight line from bottom left to top right, but the bars on the graph are almost all higher than what would be expected.

Part of the reason for this is that the definition of precipitation that I used was similar to how Environment Canada defines POP - the "chance that measurable precipitation (0.2 mm of rain or 0.2 cm of snow) will fall on “any random point of the forecast region” during the forecast period." 0.2 mm isn't a ton of rain, really, so what if I bump the values up to 1 mm?


Well that's quite the improvement. In fact, it substantially helps the scores for all weather stations by about 4-7 points each. What's cool is how close the bars are for low values of POP, and how they tend to diverge at higher values.
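The bars in a reliability chart like this can be computed by grouping days by their forecast POP value and checking how often it actually rained in each group. A minimal sketch with made-up records:

```python
# Observed rain frequency for each forecast POP value: of all the days a
# station forecast, say, 30%, what fraction actually saw rain? Records
# below are invented for illustration.
from collections import defaultdict

records = [  # (forecast POP %, rained?)
    (0, False), (0, False), (0, True),
    (30, False), (30, True), (30, False),
    (70, True), (70, True), (70, False),
]

counts = defaultdict(lambda: [0, 0])  # pop -> [rainy days, total days]
for pop, rained in records:
    counts[pop][0] += rained
    counts[pop][1] += 1

observed = {pop: rainy / total for pop, (rainy, total) in sorted(counts.items())}
print(observed)  # a perfectly calibrated forecaster would match pop/100
```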

Anyway, this has been a pretty cool project for me for the last year. Hope you had fun too!

Thursday, March 21, 2013

Winter Weather

Oh hi! Didn't see you there.

It is now spring! And despite the massive continuous blizzard that appears to be going on outside, we're supposed to be getting warmer. Any day now...

You may have seen my analyses of Summer and Fall for Edmonton weather. Hopefully ever since then you've been on the edge of your seat awaiting the results for winter.

Wait no longer! The winner for winter is: The Weather Network. (three times in a row!)

Scores for winter (out of 100):
Noteworthy about these scores is that Environment Canada climbed from 5th place to 3rd place for the winter, and that everyone's scores (apart from Environment Canada's) continued to decrease from the fall. This is all shown in this graph:


Weather or not (PUN!) temperatures and precipitation are actually tougher to forecast in winter is a question better asked of the actual meteorologists. My suspicion is that at least part of the continued decrease in scores is that trace levels of snow are harder to measure as precipitation than rain, but that's mostly just a guess.


Some fun facts!

Best high temperature prediction: Weather Channel 1-day prediction: 71.84%
Best low temperature prediction: Weather Network 1-day prediction: 68.64%
Best precipitation prediction: Weather Network 1-day prediction: 76.67

Worst high temperature prediction: TimeandDate.com 6-day prediction: 36.30%
Worst low temperature prediction: TimeandDate.com 5-day prediction: 37.64%
Worst precipitation prediction: Environment Canada 6-day prediction: 54.70

Some graphs!

Again, CTV scores are only directly compared to the others for four days. I still find it cool how strong the downward trend is - on average, a forecast for a week in the future is 15% less accurate than a forecast for tomorrow.

For those of you who are still reading and like graphs, you can check out the breakdown of where the previous graph comes from:



Have a good spring!

Wednesday, January 2, 2013

When Life Gives you Weather Stats...



...make Statsonade!

So here's my dilemma. I have this 'weblog', and it's really cool when people read it. (Not quite as cool as before, because apparently somebody spam-clicked my ads and now I no longer get money. On the other hand - no ads!) By far the most popular posts are when I talk about weather stuff or SU stuff, and as there are no present SU elections to write about and I've been doing weather posts at the end of each season, I may have nothing good to offer for this week.

Instead, I'm going to make statsonade from the stats that life tossed me. Yum!

In my last post, I presented the weather forecast comparison I had for six weather stations in Edmonton during autumn. It was pretty fun, and only one of the six forecasters outright told me I was wrong.

One of the accuracy measures I used in that analysis was the percentage of time that a forecaster was within three degrees of the true high temperature. An alternative way of presenting these results would be to just outright plot the predicted vs. actual temperature results for each station. Maybe it would look something like this:


There are some wicked fun facts from these graphs. In all of them, the red line is perfection, where what you predict is exactly what you end up with. The data is for autumn only, and the stations are all quite close to perfect, as well as close to each other - the R-squared values range from 0.900 to 0.926. In general, it appears that most of the stations over-predict the temperature at the higher end of the range.

That's all well and good, but what if we wanted to take this a step further? Is there some combination of stations that gives a better prediction than any individual station? That would be like a weather model or something.

It turns out that you can actually get a marginally better prediction by using a weighted average of the stations. Consider the following:

T = 0.085TAD + 0.301EC + 0.148GB + 0.155WN + 0.483WC - 0.172CTV

After all that work, our R-squared value is a whopping 0.944. So aggregating forecasts this way does yield a minor improvement, but it's likely not worth the effort for everyday weather-checking.
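The fitting step is an ordinary least-squares regression of the actual high on the six stations' predicted highs. Here's a sketch on synthetic data (the post's real coefficients came from a season of actual forecasts):

```python
# Fit a weighted combination of six stations' predicted highs to the
# actual high via ordinary least squares. All data here is synthetic:
# each "station" is the truth plus its own random noise.
import numpy as np

rng = np.random.default_rng(0)
actual = rng.uniform(-10, 25, size=200)   # true daily highs
stations = np.column_stack(
    [actual + rng.normal(0, 2, 200) for _ in range(6)]
)

weights, *_ = np.linalg.lstsq(stations, actual, rcond=None)
combined = stations @ weights

# R-squared of the combined prediction:
ss_res = np.sum((actual - combined) ** 2)
ss_tot = np.sum((actual - actual.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))  # slightly better than any single noisy station
```

Averaging several independently noisy predictors cancels out some of the noise, which is why the combination edges out each individual station.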

A fun result of the regression suggests that it would be easier to just take a weighted average of Environment Canada and Weather Channel's predictions, as they make up the majority of the formula. What's really strange, though, is that the CTV predictions get factored in as a negative value. CTV itself has a completely respectable correlation between predicted and real temperatures, but for some reason subtracting a weighted version of their numbers improves the overall prediction (when using them and at least two of any other weather station). Mysterious...

Thursday, December 20, 2012

Fall Weather

Hey there!

I realize that technically tomorrow is the end of fall, but seeing as some crazy people think the world's going to end then, I figured I'd get this done now.

Last season's weather analysis seemed pretty popular, and I decided to continue it into the fall. There are two big changes from last time:

  • I added CTV Edmonton Weather as a sixth weather forecaster. Their system is a little bit different from the other five forecasters in the analysis so far; they only give probability of precipitation numbers up to four days in the future, but do a much longer range of temperature predictions. As a result their total score is only directly comparable to other stations for four out of six of the days predicted, as comparing a number to a rainy cloud icon isn't very fair statistically.
  • I changed the way POP scores are calculated. Previously I used a weird system that was more-or-less based on p-scores, but as soon as a station predicts 0% and it rains, or vice versa, their scores are shot. The new system is based on the Brier Score (a system that other people made up and actually use). In this case, a 0% prediction with rain still gives a score of 0, but it's averaged against other scores. 
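The Brier score itself is simple: the mean squared difference between forecast probability and outcome. A sketch follows; mapping it onto a 0-100 scale where 100 is perfect is my reading of the post, not a documented formula:

```python
# Brier score for probability-of-precipitation forecasts: mean squared
# difference between forecast probability and outcome (1 = rain, 0 = dry).
# Lower is better.
def brier(forecast_probs, outcomes):
    pairs = list(zip(forecast_probs, outcomes))
    return sum((p - o) ** 2 for p, o in pairs) / len(pairs)

# A confident 0% forecast on a rainy day takes the maximum penalty of 1:
print(brier([0.0], [1]))            # 1.0
# ...but averaged with good forecasts, the damage is diluted:
penalty = brier([0.0, 0.9, 0.1], [1, 1, 0])
score_out_of_100 = 100 * (1 - penalty)
print(round(score_out_of_100, 1))
```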
Anyway, the winner for the wonderfully balmy season of fall is: The Weather Network. (again!)

Scores for fall (out of 100):
Unfortunately for CTV, their score is artificially lowered compared to everyone else's due to the lack of precipitation forecasts on the last two days. It isn't that significant of a penalty, though, as the 5th and 6th day forecasts are weighted the least. If we ignore them and only consider four days, their weighted score would become 65.18, much closer to the others.

It turns out that, with the change in POP scoring system, the numbers for fall are significantly lower than the previous numbers for summer. Take a look at this graph:

Not only do all forecasters do worse during the fall, but they also become less consistent with each other.

Some fun facts!

Best high temperature prediction: Weather Network 1-day prediction: 70.60%
Best low temperature prediction: Weather Network 1-day prediction: 78.38%
Best precipitation prediction: Weather Network 1-day prediction: 76.38

Worst high temperature prediction: Weather Network 6-day prediction: 45.70%
Worst low temperature prediction: Environment Canada 5-day prediction: 49.20%
Worst precipitation prediction: Environment Canada 6-day prediction: 54.28

Some graphs!
Again, CTV scores are only directly compared to the others for four days. What's cool to see is that almost all of the stations consistently lose accuracy the farther into the future they try to predict - which, of course, makes intuitive sense. If you're interested in more of a breakdown of how these scores were developed, you can check out these other graphs.



See ya at the end of winter!

Friday, October 5, 2012

Summer Weather: Part 2

A quick update to my post from last time!

Last week I posted the summer numbers from my weather station analysis. At that time, the scores were (out of 100):
  • Weather Network: 66.92
  • Global Weather: 66.02
  • Weather Channel: 63.99
  • Environment Canada: 55.00
  • TimeandDate.com: 54.25
Based on the scoring system, a station could have gotten 100 if all of their temperature predictions were within three degrees of the actual weather, and the fraction of days with rain accurately matched the POP forecasts for every POP value (in increments of 10). A station could have gotten 0 only if their POP values were wildly inaccurate.

A better benchmark, though, is how well my system would have scored someone just guessing. That would potentially better demonstrate the effectiveness of weather forecasters.

Using historical data, I was able to create a "dummy" weather station that used previous years' averages to "forecast" the weather on a month-to-month basis. For example, every day in July was predicted to have a high of 23, a low of 12, and a POP of 60%.

The score obtained using this method? 38.12. In fact, the average temperature predictions were less accurate than every forecast in my model so far (just over 50% within three degrees), and the POP predictions were only better than half of the other stations' 5- or 6-day predictions*.
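A sketch of what such a climatological "dummy" forecaster might look like. Only the July numbers come from the post (high 23, low 12, POP 60%); the other months are invented:

```python
# A "dummy" forecaster that issues the same historical monthly averages
# for every day of a given month. July's values match the post's example;
# June and August are placeholders.
MONTHLY_NORMALS = {
    6: {"high": 21, "low": 10, "pop": 0.55},
    7: {"high": 23, "low": 12, "pop": 0.60},
    8: {"high": 22, "low": 11, "pop": 0.50},
}

def dummy_forecast(month):
    """Return the fixed climatological forecast for any day in `month`."""
    return MONTHLY_NORMALS[month]

print(dummy_forecast(7))  # identical forecast for July 1st and July 31st
```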

That's certainly encouraging! A weather station's forecasts even six days in the future are significantly better than the best educated guess you could make given historical data. So there you go - next time you criticize the meteorologist for being inaccurate, remember that actually, they're at least twice as good as you.

*: The method I use for scoring POP forecasts is perhaps objectively fair, but not very accommodating to different weather stations' reporting methods. Stations that give increments of 10% between 0 and 30 will necessarily do better than those who don't, even though a 10% POP forecast is more-or-less useless. I'm looking into ways to be a little bit more fair with this.

Monday, September 24, 2012

Summer Weather

Weather forecasting is insane.

As a career I couldn't even imagine how unrewarding it is - you could pour hours and hours into developing new algorithms that yield only tiny increases in accuracy, due simply to the massive complexity of the system you're trying to model. When you're right people take you for granted, and when you're wrong you take a lot of blame.

That being said, a while ago I noticed that sometimes different weather forecasters will predict radically different weather for the same day, given the same data. I also noticed that Monday's forecast for the coming weekend could be substantially different from the forecast given that Friday. These are all fair differences - tweaks to models could cause differences of opinion between meteorologists, and the closer a prediction is to when you make it, the more accurate we'd hope it would be.

I was curious how much of a change there would be, though, which is why I decided to keep track of it. Since the beginning of June I've kept track of the six-day forecasts for High temperature, Low temperature, and Probability of Precipitation for five different forecasting stations: timeanddate.com, Environment Canada, Global Weather, the Weather Network, and the Weather Channel. Environment Canada, Global, and the Weather Network were chosen because they're the sites my friends and I visit most frequently; the Weather Channel was chosen as it is the basis of Yahoo! weather, and subsequently the commonly-used Apple weather app; and timeanddate.com was chosen because it's a large multinational site. All stations were set to the Edmonton downtown location, not the international airport, and forecast data was collected between 11 a.m. and noon each day for consistency.

Now that summer's over, I have some preliminary results. And the winner (by a hair) is the Weather Network!

Score (out of 100):
  • Weather Network: 66.92
  • Global Weather: 66.02
  • Weather Channel: 63.99
  • Environment Canada: 55.00
  • TimeandDate.com: 54.25
The score is based on a weighted average that was more or less arbitrarily decided by me: each subsequent day in the future was weighted less (so that a prediction for tomorrow's weather is worth more than a prediction for next week's), and POP was worth more than the High prediction, which was in turn weighted more than the Low prediction.
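A minimal sketch of such a weighted average follows. These particular day and component weights are my own placeholders (the post says its weights were chosen more or less arbitrarily and doesn't publish them); all that matters is that nearer days and POP count for more:

```python
# Weighted average of per-day, per-component scores: later forecast days
# count less, and POP counts more than High, which counts more than Low.
# The specific weights below are illustrative guesses.
DAY_WEIGHTS = [6, 5, 4, 3, 2, 1]            # day 1 matters most
COMPONENT_WEIGHTS = {"pop": 3, "high": 2, "low": 1}

def overall_score(component_scores):
    """component_scores: {"pop": [6 day scores], "high": [...], "low": [...]}"""
    total, weight_sum = 0.0, 0.0
    for name, scores in component_scores.items():
        for day_w, s in zip(DAY_WEIGHTS, scores):
            w = day_w * COMPONENT_WEIGHTS[name]
            total += w * s
            weight_sum += w
    return total / weight_sum

# A station scoring a flat 70 on everything gets exactly 70 overall:
print(overall_score({k: [70] * 6 for k in ("pop", "high", "low")}))
```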

Some fun facts!

Best High temperature prediction: Weather Channel 1-day prediction (96.79% within 3 degrees)
Best Low temperature prediction: Environment Canada 2-day prediction (96.07% within 3 degrees)
Best POP: Global 4-day prediction (p-value 0.346)

Worst High temperature prediction: TimeandDate 6-day prediction (55.20% within 3 degrees)
Worst Low temperature prediction: Global 6-day prediction (68.57% within 3 degrees)
Worst POP: TimeandDate 3-day prediction (p-value 0.038)

Some graphs!

Temperature score was based on the percentage of predictions that were within 3 degrees of the actual temperature. In general there was a very strong downward trend for the high temperature predictions - almost all stations had better than 95% accuracy at predicting tomorrow's weather, and they were all about 70% accurate at the weather a week from now. There was less of a trend noted for the low predictions, however those are typically less useful apart from determining the likelihood of frost.
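Computing that within-3-degrees percentage is a one-liner. A sketch, with made-up forecast/actual pairs:

```python
# Fraction of temperature forecasts landing within 3 degrees of the
# observed temperature. The pairs below are invented for illustration.
def within_3(predicted, actual):
    pairs = list(zip(predicted, actual))
    hits = sum(1 for p, a in pairs if abs(p - a) <= 3)
    return hits / len(pairs)

predicted = [21, 18, 25, 14, 9]
actual    = [23, 13, 24, 15, 12]

print(within_3(predicted, actual))  # 0.8 -> four of five within 3 degrees
```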

The score for POP is based off the p-value for each category of prediction. In essence, I checked the number of days that a given station predicted a POP of 10%, and compared it to the fraction of days that it actually did rain for that prediction. This doesn't translate directly into an accuracy percentage, which is why I call them 'scores' instead (though if every category had precisely the incidence of rain as predicted, it would end up with a score of 100).


So there you go! Hopefully this helps you the next time you're planning a picnic (or whatever people check the weather for...).