The deeply flawed analyses of Fort McMurray's climate
As every Tom, Dick, and Harrietta weighs in on climate change and the massive wildfire that destroyed a good portion of the northern Alberta city of Fort McMurray, in the heart of the oil sands region, unscientific nonsense is being spewed all over the place.
Take Elizabeth Kolbert's article at The New Yorker. Apparently Kolbert won the "2015 Pulitzer Prize for general nonfiction."
Kolbert claims the following: "April was exceptionally mild."
Wrong. Wrong. And even more wrong.
Here are the mean daily maximum and mean daily temperatures for the month of April at Fort McMurray since records began in 1916, with the 2016 data highlighted in red.
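To be concrete about what the chart shows, here is a minimal sketch of how a mean daily maximum for April could be computed from a daily station record. The column names ("date", "t_max") and the three sample values are illustrative assumptions, not the actual Environment Canada file layout or data.

```python
# Sketch: computing April's mean daily maximum temperature per year
# from a daily record. The tiny DataFrame below is placeholder data,
# NOT actual Fort McMurray observations.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.to_datetime(["2016-04-01", "2016-04-02", "2016-04-03"]),
    "t_max": [10.5, 12.1, 8.9],  # daily maximum temperature, deg C
})

# Keep only April days, then average the daily maxima within each year.
april = daily[daily["date"].dt.month == 4]
april_mean_tmax = april.groupby(april["date"].dt.year)["t_max"].mean()
print(april_mean_tmax)
```

Repeating this for every year since 1916 yields exactly the kind of series plotted above, which is what any claim about an "exceptionally mild" April should be checked against.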
Not only was April 2016 not "exceptionally mild," but it was colder than April 2015 and quite a bit colder than many other Aprils in the past several decades.
What is happening instead is that Fort Mac's annual average temperature is changing rapidly over time.
That is just a fact, like it or not, whatever the causes may be.
Annual precipitation has been in a steep declining trend since about 1970, driven by trends in each of the individual seasons, but there is a clear cycle dating back to the early 1900s. Thus, over the entire century-long climate record, there is no significant reduction in precipitation for the region: the recent drying trend is offset by the wetting trend that took place from about 1940 to 1970.
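A "declining trend since about 1970" is just an ordinary least-squares slope fitted to the annual totals. The sketch below shows the computation on a synthetic series; the baseline, slope, and wiggle are invented for illustration and are not Fort McMurray's actual precipitation record.

```python
# Sketch: fitting a linear trend to an annual precipitation series,
# as one would do to quantify a post-1970 decline. The series here is
# synthetic placeholder data, NOT actual Environment Canada records.
import numpy as np

years = np.arange(1970, 2016)
# Hypothetical totals: ~455 mm/yr baseline, a built-in decline of
# 1.2 mm/yr, plus a small deterministic wiggle standing in for noise.
precip_mm = 455.0 - 1.2 * (years - 1970) + 25.0 * np.sin(0.9 * years)

# Least-squares linear fit; slope comes out in mm per year.
slope, intercept = np.polyfit(years, precip_mm, 1)
print(f"Trend: {slope:.2f} mm/yr over {years[0]}-{years[-1]}")
```

The same fit applied to the full record back to the early 1900s is what reveals the offsetting 1940–1970 wetting phase, which is why the century-long trend is not significant even though the post-1970 slope is steep.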
Kolbert goes on to claim that the region experienced "temperatures regularly in the seventies" during April. Well, at Fort McMurray, just four days during April reached into the 70s. That is "regularly"?
As well, Kolbert links to a CBC story on the disaster in which Mike Wotton, a research scientist with the Canadian Forest Service and professor at the University of Toronto, is quoted with the following:
You hate to use the cliche, but it really was kind of a perfect storm. There was a mild winter and not a lot of meltwater from the mountain snow pack.
Indeed, we would hate to use that cliché, particularly since I would like Wotton to explain how having "not a lot of meltwater from the mountain snow pack", in mountains hundreds of kilometers away, has any significant impact whatsoever on the fire at Fort McMurray itself. Levels in the Athabasca River, driven in large part by the spring melt from the distant Rocky Mountain snowpack, are effectively irrelevant when assessing fire risk in the countryside surrounding Fort McMurray. How, indeed, would modest river-level variations directly influence regional fire risk? I would like to hear that mechanistic explanation.
What is relevant is the amount of precipitation that fell around Fort McMurray itself – not in faraway mountains – in the months preceding the fire. And, as I've previously noted, it appears to have been a very dry winter at Fort McMurray, which led to a general drying of the countryside, greatly increased fire risk, and an increased extent of the damage once the fire started.
So, overall, since 1970, the Fort McMurray area has become much warmer and much drier. That will, all other things being equal, substantially increase fire risks.