A belated ‘welcome back’ to all you prediction and projection enthusiasts! BttP’s seventh annual review of preseason prognostications is finally here, a delay which can only be blamed on me. All of you who thought bad predictions were going to go unmentioned can breathe easy now.

A quick recap of how this works for the new reader. For each of the sets of predictions and projections that featured in our preseason analysis, the mean absolute error (MAE) and root mean squared error (RMSE) have been calculated. MAE is the average absolute difference between each predicted win total and the actual result, while RMSE is the square root of the average of the squared differences. RMSE gives greater weight to large errors because they are squared, so if you think bigger misses should be punished more heavily, this is the more relevant number.
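For anyone who wants the two metrics spelled out, here is a minimal sketch of the calculations described above. The team names and win totals are made up purely for illustration; the point is how one big miss inflates RMSE more than MAE.

```python
import math

def mae(predicted, actual):
    """Mean absolute error: the average size of the misses."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

def rmse(predicted, actual):
    """Root mean squared error: squaring first punishes big misses more."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Hypothetical three-team example: one 15-win miss dominates the RMSE.
predicted = [90, 85, 70]
actual = [88, 84, 55]
print(round(mae(predicted, actual), 3))   # (2 + 1 + 15) / 3 = 6.0
print(round(rmse(predicted, actual), 3))  # sqrt((4 + 1 + 225) / 3) ≈ 8.756
```

Two sets can have identical MAEs while the one with the single huge miss posts a noticeably worse RMSE, which is exactly the Davenport story below.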

Read the preseason piece for a full breakdown of where all the competitors stood in March, but if you want to get right to the results, here’s a quick reminder of who’s competing for the title:

The Contenders

PECOTA (PEC): The Baseball Prospectus projected win totals based on their in-house projection system.

FanGraphs (FG): The FanGraphs Depth Charts projected totals, which are a combination of the Steamer and ZiPS projection systems, with an additional playing time adjustment applied by FanGraphs staff.

Davenport (Dav): Totals based on Clay Davenport’s projection system, with Clay’s own playing time estimates.

FiveThirtyEight (538): Site projections from FiveThirtyEight.com, based on their Elo rating system.

Banished to the Pen writers (BttP): Predictions from each of our writers from our season preview series.

Effectively Wild guests (EW): Predictions from each of Effectively Wild‘s team preview podcast guests. All of these were very helpfully compiled by Paul McCord (@BravesStats on Twitter) on this spreadsheet, thus saving me the job.

Bat Flips & Nerds (BFN): Predictions from the annual roundtable podcast game that I carry out with the Bat Flips & Nerds crew.

Composite (Comp): The average of the four projection systems plus the BttP/EW predictions, with the latter sets adjusted down to add up to 2430 wins so they are not given extra weight.

Public (Pub): The average of all responses to a preseason poll in which I asked people to predict win totals for every team.
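The Composite adjustment described above can be sketched as follows. The article only says the prediction sets are "adjusted down to add up to 2430 wins" without specifying the method, so straight proportional scaling is an assumption here, and the two-team league is purely hypothetical.

```python
def normalize_to_league_total(predictions, league_total=2430):
    """Scale a set of team win predictions so they sum to the league total
    (2430 = 30 teams x 162 games / 2), removing any collective
    over- or under-optimism before averaging into the composite.
    Proportional scaling is an assumption, not the article's stated method."""
    factor = league_total / sum(predictions.values())
    return {team: wins * factor for team, wins in predictions.items()}

# Hypothetical two-team league totalling 162 wins, predicted at 170:
preds = {"Sharks": 90, "Jets": 80}
adjusted = normalize_to_league_total(preds, league_total=162)
print(adjusted)  # each prediction scaled down by a factor of 162/170
```

Without this step, a set that hands out too many total wins would drag the whole composite upward rather than just contributing its opinion on the relative order of teams.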

The Results

Set         MAE      MAE Rank   RMSE     RMSE Rank
BttP        8.733    2          11.118   1
FG          9.233    5          11.392   2
Composite   9.007    3          11.440   3
Pub         9.333    6          11.507   4
538         9.167    4          11.732   5
Dav         8.633    1          11.763   6
EW          9.733    8          11.827   7
BFN         9.633    7          11.929   8
PEC         10.667   9          12.931   9

An unusually large split between the MAE and RMSE outcomes at the top, where Davenport pipped the BttP writers to the MAE crown but slid all the way down to sixth in the RMSE reckoning. Davenport nailed the Astros’ win total and missed eight more teams by just a single win, but was undone relative to the competition in RMSE by huge misses on the Padres (+20), the Nationals (+21) and the Diamondbacks (+28). That left our writers here at BttP as the RMSE champions over FanGraphs, thanks in no small part to James Cardis and Jameson Weiss, for their optimism on the White Sox and Red Sox respectively, and to pessimists Nick Strangis and Ahaan Rungta, who didn’t like the Diamondbacks and Twins as much as everyone else (not that they predicted the disaster seasons, but those misses cost BttP less than any other set).

It was a solid year for the public predictions, which beat all the projection sets but FanGraphs in RMSE to finish fourth. As usual, we had some truly dreadful predictions within that, including several that featured an RMSE of more than 14 wins. Offsetting those was the outstanding work of Ash Taylor and Conor Kelly, who both beat even the BttP set by recording RMSE marks below 11, with Ash’s 10.965 leading the way. The secret appeared to be truly incredible pessimism about Arizona, as Ash crushed everyone else by predicting that the Diamondbacks would win a mere 54 games. Considering almost everyone else went at least 20 games overboard, it made a huge difference. Credit also to George Martin of @AstrosFansUK, who recorded an excellent MAE of 8.467 to lead the way there. The full results can be seen here.

At the bottom it was an unmitigated disaster for PECOTA, which followed up last year’s first-place finish with a no-doubt last place in both categories by a huge margin. While a lot of Atlanta fans got mad about their preseason 80-82 projection, that wasn’t nearly as responsible for the demise as some misses in the Central divisions, notably underselling the Cardinals and White Sox by 12 wins each when no other set was even into double digits. Everyone missed on the Giants by a record-breaking amount, but PECOTA was tied for the worst at 33 wins below San Francisco’s actual total. Perhaps most inexcusably, PECOTA was too optimistic on the Orioles, projecting them 16 wins higher than their eventual 52-110 record.

This year was just plain hard to predict overall – the hardest I have on record, in fact. The biggest errors I have seen doing this in previous years were back in 2018, when the winning set recorded a 9.3 RMSE and the worst was at 10.7 – a number that would have easily taken the overall win this year. The Giants played a big part in that, but so did NL West rivals San Diego and Arizona, along with Washington, Minnesota and Seattle, all teams that had an average miss of over 15 wins. The ‘easiest’ teams to predict this year were the Royals, Phillies and A’s, who had an average miss of just 2.05, 2.28 and 3.04 wins respectively. No one whiffed on Kansas City’s total by more than four wins.

Irrational credit, as always, to those spot-on predictions: PECOTA on Pittsburgh; FanGraphs on Atlanta; Davenport on Houston; 538 on Kansas City; and Effectively Wild guest Will Leitch on St. Louis. The Composite set, as always, performed quite well, finishing third in both categories and nailing Philadelphia, Atlanta and Kansas City, so I guess credit goes to me for bothering to average the projections.

Below is a table showing all of the predictions and projections relative to the actual win total, shaded in red according to the size of the error.

Ranks

One way we can mitigate those colossal misses is to look at how good the projections were at predicting the final standings order.

Set         MAE     MAE Rank   RMSE    RMSE Rank
BttP        5.767   1          7.530   1
Composite   6.133   4          7.983   2
FG          6.067   3          8.153   3
Dav         6.000   2          8.173   4
Pub         6.467   6          8.319   5
EW          6.800   8          8.450   6
538         6.400   5          8.454   7
BFN         6.633   7          8.487   8
PEC         7.333   9          9.055   9

It’s no surprise that BttP came out on top, but the size of the victory on both counts is impressive. The RMSE change is a little friendlier to Davenport than it was in the raw win total department. It also pushed the Composite set over FanGraphs for second place. It was less kind to 538 and it wasn’t a very good year generally for the podcast prognosticators, as both us idiots at Bat Flips and Nerds and the Effectively Wild guests were well off the pace. PECOTA was already too far behind for the rank analysis to do any more than close the gap a little.

This wasn’t as difficult a year to predict the rankings as 2017, when the winner still missed by an average of 8.67 places. Many of the expected hard-to-predict teams from the win totals show up again here, although the Diamondbacks’ terrible season is significantly mitigated by this way of evaluating the season. The predictions and projections were collectively the closest on the Orioles and Marlins, since everyone projected them to be extremely bad and slightly less bad respectively, and the Dodgers, since everyone predicted them to be in first place. The average error on all three was a nice, neat one place.

Here’s the same shaded table of differences for the ranks:

And that’s a wrap on 2021. Join me again when the 2022 season is about to start – hopefully for a full 162-game set and with no delays.

