A data-oriented retrospective

Tracking the conditions at ski touring centers was a way to imprint the specifics of the season in memory and, better yet, to have data to show for it. I figured I could use the recorded conditions to get a longitudinal perspective on the season. I also wondered if it would reveal anything unexpected.

There don’t seem to be any retrospective assessments of ski seasons, so if I’m the first, yay me! For more on what I found out, read on.

The rankings

This winter I’ve been tracking up to 45 ski areas, on average twice per week, and I’ve picked out a handful of particular interest to me (and to any other Nordic skiers in the metro NYC area*). For the sake of simplicity, the ski areas are listed with the length of their ski season and grouped by driving distance. I set the length of the season in the downstate Northeast at fourteen weeks (Christmas to April 1), and have expressed the length of each ski center’s season approximately by quartile.
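For reference, the quartile buckets fall out of dividing that fourteen-week season into four equal parts of 3.5 weeks, rounded to whole weeks in the labels. A minimal sketch in Python (my own illustration of the bucketing, not code from the tracking itself):

    def season_bucket(weeks: float) -> str:
        """Bucket a season length into quartiles of a 14-week season
        (14 / 4 = 3.5-week quartiles, rounded to whole weeks in the labels)."""
        if weeks < 3.5:
            return "less than 4 weeks"
        elif weeks < 7:
            return "4-6 weeks"
        elif weeks < 10.5:
            return "7-10 weeks"
        return "11+ weeks"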

The lists below are in a ‘ranked order’, but I’m purposely not showing a rank or metric. For more on why, read through the section titled ‘Nerdopolis’ below.

Touring centers within day-trip distance of NYC:

Name                                   Length of ski season 2018-19
Minnewaska                             4-6 weeks
Canterbury                             4-6 weeks
Mohonk                                 4-6 weeks
Weston                                 7-10 weeks
Maple Corner Farm                      4-6 weeks
High Point Cross Country Ski Center    less than 4 weeks
Fahnestock                             less than 4 weeks

There’s one surprise here: Canterbury Farms would have been a viable option for skiing, particularly in January when Minnewaska didn’t have any. Weston is an anomaly because they make snow, so their season was artificially lengthened.

Because the length of the ski season is highly dependent on the weather, the ranking is only useful in retrospect, not as a prediction. The rank is also biased toward proximity to NYC. If the data had been from the ski seasons of 2010-11 or 2013-14, Fahnestock might have popped to the top of the ranks thanks to its shorter drive time.

Best from ‘summer camp country’:

Name                     Length of ski season 2018-19
Lapland Lake             11+ weeks
Prospect Mountain        11+ weeks
Notchview Reservation    11+ weeks
Timber Creek             11+ weeks

If there’s a surprise in the list from ‘summer camp country’, it’s Notchview, which is clearly punching above its weight as an alternative to Prospect Mountain.

What’s not in the scope of the data is that lodging is sparse outside of Lapland Lake; Bennington has some motels and B&Bs, but not a great selection. I’ve never skied at Notchview, but there are places to stay off Route 8A or Route 7. Definitely seems worth a look next season, if not sooner.

The takeaway

The data from the season says: don’t overlook the Berkshires! There will be times when Canterbury and Notchview have skiable conditions and Minnewaska doesn’t, and Canterbury in particular is a shorter drive than Prospect Mountain.

Report from nerdopolis: predictability vs. inconvenience

In developing the ranking, I used length of ski season as a proxy for predictability (i.e., when you want to go skiing, there will be skiable trails); drive time was used as a proxy for inconvenience (the more carbon you have to burn to go skiing, the more it takes to justify the trip). Predictability also involved having some assurance of skiable conditions, so touring center statuses were used to factor in quality of opportunity. Arriving at the ranking took a fair amount of fudging in the arithmetic.

If you want to know more, read on. If taking a deeper dive into how I came up with the rankings sounds boring, skip down or jump away.

Inconvenience and quality of opportunity are human-qualitative values to begin with, so right away you know the rankings aren’t anything like ‘scientific’. In addition, any judgement of the quality of skiing is suspect. I read between the lines of the ski reports to set each ski area’s status for each report, not to mention factoring in the period of the season: ‘good skiing’ in early February means something different than it does in late March. With the disclaimers out of the way, let’s move on.

The length of the season was easy to determine: the first report of skiable conditions at an area started the clock, and the last posted update, or the declared closing date for the season, stopped it.
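As a sketch, that’s just a date difference; assuming each area’s reports carry dates, something like this (the function and inputs here are illustrative):

    from datetime import date

    def season_weeks(first_skiable: date, last_update: date) -> float:
        """Weeks from the first report of skiable conditions to the last
        posted update or the declared closing date."""
        return (last_update - first_skiable).days / 7

    # e.g., skiable from Christmas through March 10:
    print(season_weeks(date(2018, 12, 25), date(2019, 3, 10)))  # ~10.7 weeks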

Drive time is of course based on midtown NYC as the start point (see the ‘List of touring centers’ link above). To make it better resemble ‘inconvenience’, I made increases in drive time count exponentially: e.g., a 3-hour drive has almost twice as much ‘inconvenience factor’ as a 2-hour drive.
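A minimal sketch of that scaling, assuming an exponential curve with a base a little under 2, so each extra hour of driving nearly doubles the score (the base is my own fitted guess, not the actual constant used):

    def inconvenience(drive_hours: float, base: float = 1.9) -> float:
        """Exponential proxy for inconvenience: each additional hour of
        driving multiplies the score by ~1.9 (an assumed parameter)."""
        return base ** drive_hours

    print(inconvenience(3.0) / inconvenience(2.0))  # -> 1.9, 'almost twice'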

I felt that the longer the ski season, the more opportunity there was to go to a given touring center, but opportunity had to take conditions (quality) into account. The average of skiability statuses across each area’s ski season was the indicator of quality. **
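A sketch of that quality indicator, assuming the skiability statuses were coded on a simple numeric scale (the scale values here are illustrative, not the actual encoding used):

    STATUS_SCORE = {"closed": 0, "poor": 1, "fair": 2, "good": 3, "excellent": 4}

    def quality(report_statuses: list[str]) -> float:
        """Average skiability status across all of an area's reports."""
        scores = [STATUS_SCORE[s] for s in report_statuses]
        return sum(scores) / len(scores)

    print(quality(["fair", "good", "good", "excellent"]))  # -> 3.0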

To check the results, I paired up some areas to compare, and tried to translate what I felt was the reality of the past season into the calculations; in other words, I rationalized the math until I got answers that matched observation and experiential knowledge.

  • Minnewaska and Mohonk are a natural pair being so close to each other. Mohonk actually had a slightly longer season but didn’t groom as much or as well, so the ranking had to reflect better ‘quality of opportunity’ at Minnewaska.
  • Notchview and Northfield Mountain are somewhat close, but had very different seasons, and any rational ranking had to put Notchview on top.
  • Lapland Lake and Prospect Mountain each had long seasons, and their drive-times are the same. But Lapland Lake had an edge on trail conditions, so the rankings had to reflect that nuance.

Determining the quality of a given area’s ski season without direct observation was hard, and will always retain some uncertainty. Condition reports always need a ‘credibility filter’, which I tried to apply when setting skiability status. ***

The crux fudge factor involved adjusting the average skiability status using the length of each area’s ski season to arrive at ‘predictability’, such that the rankings came out the way I thought they should. In effect, the fudge factor means that having skiable trails more often counts a bit more than better conditions.
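One plausible shape for that adjustment: weight the average status by the fraction of the fourteen-week season the area was open, with an exponent that tips the balance toward season length (the exponent here is an assumed fudge factor, not the actual one):

    SEASON_WEEKS = 14  # Christmas to April 1

    def predictability(avg_status: float, weeks_open: float,
                       season_weight: float = 1.25) -> float:
        """Adjust average skiability by season length; season_weight > 1
        makes being open more often count a bit more than slightly
        better conditions (the exponent is an assumed fudge factor)."""
        open_fraction = min(weeks_open / SEASON_WEEKS, 1.0)
        return avg_status * open_fraction ** season_weight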

The result is that a significantly longer ski season at one area will win in ranking against another with slightly better conditions, all other things being equal. But a touring center close to NYC will win against one further away, even if it has a slightly shorter season. At the same time, for two ski areas with similar ski seasons and distance, the one with better conditions on average comes out on top.
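Continuing the sketches above, an overall score that behaves this way just puts predictability over inconvenience (again, the general shape, not the actual arithmetic):

    def rank_score(avg_status: float, weeks_open: float,
                   drive_hours: float) -> float:
        """Higher is better: predictability in the numerator,
        inconvenience in the denominator."""
        return predictability(avg_status, weeks_open) / inconvenience(drive_hours)

    # At equal drive time, a significantly longer season (B) beats
    # slightly better conditions (A):
    print(rank_score(3.0, 8, 2.5))   # area A: ~0.30
    print(rank_score(2.7, 11, 2.5))  # area B: ~0.40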

There is only one season of data to work with, so I’ll probably tweak or overhaul the calculations based on next year’s reports. This is indeed a certain kind of fun. Got a comment? Go ahead and leave a note; I’m listening.

Just one more thing

Another side effect of tracking the season the way I did was that it produced a bunch of pictures of snow depth that I’ve collected into a time-lapse view of the past ski season. You can see how much it matters to the March ski season whether a given location builds a snow pack during January and February.

Posts will get very sparse from here until next fall, but I hope you’ll check back every so often, and tune in for next season. Thanks for reading.

Footnotes

[*] Why NYC as the start point? I live in the metro NYC area, as do 20 million others, at least some of whom must be interested in cross-country skiing but may be unsure where they can go for a day or a weekend during the season.

[**] A change to statuses I’ll make for next year will be to use an explicit status for season start/end. It’d also be nice if more ski touring centers would update their conditions page to actually say “closed for the season” instead of just leaving a week-old grooming report up. Sometimes I couldn’t tell whether an area had closed up without saying anything, or had just decided not to post an update for a couple of days.

[***] Some touring centers have reports that feel like real advice: Minnewaska, Notchview, Osceola, and a couple of others. Generally speaking, that seems to go with the attention paid to grooming, and with who’s writing the reports. Areas whose raison d’être isn’t critically dependent on the cross-country ski crowd (e.g., spa-type resorts and Alpine ski areas) tend to have vague updates. And almost all shade honesty with PR-speak.
