The great John Madden was once quoted as saying, "Usually the team that scores the most points wins the game." Granted, Madden was referring to his beloved game of football, but amazingly the same is true of all sports including baseball.
While no one disputes that scoring the most runs in a game results in a win (although I'm curious as to what instances Madden has experienced to the contrary), many are curious to know whether the same holds true when estimating wins based on runs over the course of a season. More specifically, does early-season run differential help predict division champions in Major League Baseball?
The discussion surrounding run differential has picked up steam over the past year and some are buying into its predictive power, while others think it's pure stupidity. The novelty of the discussion isn't quite as intriguing as the concept surrounding sabermetrics and the impact of Moneyball, but it does deserve brief inspection to see if there is truth to the matter.
To find out I did a quick comparison spanning back five full seasons to 2007. I looked at MLB standings on July 1st of each season and compared that to the final standings at season's end. Teams with the highest run differential and teams in first place on July 1st of each season were compared to the eventual division winner -- the second place finisher was noted as well.
What I found was somewhat surprising considering all of the growing hype surrounding run differential. Is a team in third place, with a massive advantage in overall runs scored, really a victim of bad luck (see the 2012 St. Louis Cardinals)? Or is faith in the law of averages simply a case of misguided hope?
From 2007 to 2011, the teams with the highest run differential on July 1st finished as division champions fourteen out of thirty times (six divisions per season over five seasons) -- just 46.67% of the time. When expanding the finishing place to first or second at the end of the season, the outlook was a little more positive. Twenty-five out of thirty times, the team with the highest run differential on July 1st went on to finish first or second in the final divisional standings -- good for 83.33% of the time.
But how does that compare to teams that are in first place in the standings on July 1st -- ignoring run differential?
In this instance, seventeen out of thirty division leaders were still in the same position at the conclusion of the regular season -- a success rate of 56.67%. When expanding the finish to first or second, the rate improves as well to twenty-eight out of thirty times, or 93.33% of the time.
According to these numbers, the fad of utilizing run differential may be out faster than it came in. Simply looking at the division leader produces a predictive model that is ten percentage points better at identifying the eventual division champion or a top-two finish.
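For readers who want to check the math, the percentages above reduce to simple arithmetic over the thirty division races (six divisions times five seasons). Here is a minimal sketch that recomputes them from the counts in the article (the dictionary structure and labels are just for illustration):

```python
# Counts from the 2007-2011 comparison: 6 divisions x 5 seasons = 30 races.
RACES = 30

# How often each July 1st "predictor" matched the final standings.
predictors = {
    "run-differential leader": {"won division": 14, "finished top two": 25},
    "standings leader": {"won division": 17, "finished top two": 28},
}

for name, outcomes in predictors.items():
    for outcome, hits in outcomes.items():
        rate = 100 * hits / RACES  # convert the fraction to a percentage
        print(f"{name} -> {outcome}: {hits}/{RACES} = {rate:.2f}%")
```

Running this prints the same 46.67%, 83.33%, 56.67%, and 93.33% figures cited above, and makes the ten-percentage-point gap between the two predictors easy to see.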
So what does this mean for 2012? Granted, we aren't to July 1st quite yet, but if we were, here are the implications.
Based on this five-year pattern, division leaders Tampa Bay and New York (AL), Chicago (AL), Texas, Washington, Pittsburgh and Cincinnati, and Los Angeles (NL) would each have a 56.67% likelihood of winning their respective divisions -- and a 93.33% chance of finishing first or second at season's end.
Teams sitting out of first place but with the top run differential in the division include New York (AL) -- currently tied with the Rays in first -- and St. Louis. According to the pattern of the past five seasons, each team has a 46.67% chance of winning its division and an 83.33% chance of landing in the top two.
But numbers sometimes lie or flat-out confuse us.
After all, "If this team doesn't put points on the board, I don't see how they can win."
Which current MLB division leaders will be missing at season's end?
Justin Mikels is a staff writer for Operation Sports. You can follow him on Twitter @long_snapper.