In the 2014 baseball season, teams averaged only 4.07 runs per game. That represented the lowest run production since 1981, when the figure was 4.00, and the nadir of a steady decline since the 2000 season, when the figure had been more than a run higher (5.14).
Trends like this tend to distort our perception of player achievement. A slugger who batted in 102 runs in 2014 was as productive as one who batted in 129 in 2000, yet those two RBI figures look very different to our eyes. Lacking in-depth analysis, we perceive a player who bats in 100 runs to be a solid middle-of-the-lineup guy, a good player, but one possessed by virtually every team in the majors. On the other hand, our mental shortcuts deem a player who drives in 129 teammates to be a premier slugger, possibly the league leader. In fact, no player came within ten of that number in 2014.
Similarly, a pitcher with a 3.74 ERA in 2014 was an average major leaguer, a sturdy third starter, but in 2000, that would have been good enough for third place in the American League, behind only Pedro Martinez and Roger Clemens, and barely behind Clemens (3.70) at that.
The gap between 2000 and 2014 is by no means a radical outlier in baseball history. If you want to see a dramatic contrast, study 1908, 1930 and 1968, when the "runs per game" statistic went on a roller coaster ride from 3.38 up to 5.55 and back down to 3.42. But there is no need to study years which are multiple decades apart. The decline from 2000 to 2014 is steep and sudden for a span of merely fourteen years, yet nowhere near as steep as the drop in the fourteen years from 1894 to 1908, which looked like this:
The detailed reasons for that decline are outside the purview of this article, but the simplified explanation is that the pitching distance was changed from 50 feet to 60 feet in 1893, and that change resulted in an immediate upsurge in run production. In 1892 the runs per game figure looked like a modern number and was in fact lower than in the year 2000, but the modified pitching distance immediately gave batters a massive edge which they only gradually relinquished as the existing pitchers learned how to adapt, then new pitchers arrived who had developed their deliveries from the longer distance.
Given those sharp contrasts between the apogee and perigee of run production, it's amazing that the baseball itself has not changed appreciably in size and weight since 1893, which was the first season played with both overhand pitching and the modern pitching distance. Many other things have changed: the inside composition of the ball, playing conditions, ballparks, mitts, the height of the mound, the size of the strike zone, the strength and conditioning of the players, the hitting strategies, the use of relief pitchers, night games, PEDs, how often the balls are changed within a game, and various rules here and there. Whenever the game has tilted too far toward either offense or defense, the lords of the game have begun tinkering with counterstrategies to restore some kind of equilibrium satisfactory to players and fans.
In order to understand the value of various players throughout baseball history, we need detailed analysis to adjust our perceptions of the players' stats from year to year and era to era. As shown in the graph above, run production per game dropped from 7.38 to 3.38 in a relatively short period at the turn of the 20th century, so 70 RBI in 1908 were approximately equivalent to 150 in 1894. (Well, these would be hypothetical, retroactively calculated RBI, since that particular statistic was then unknown.) It is not easy for us to accept the fact that a 70 RBI man and a 150 RBI man are equivalent. It is equally difficult to understand that Pedro Martinez's 1.74 ERA in 2000, when the major league ERA was 4.77, was actually better than Bob Gibson's 1.12 in 1968, when the MLB average was 2.98. In fact, Gibson's season was only the sixth-best of the modern era, relative to the league's performance. Greg Maddux alone had two better seasons.
I suppose many of you are already aware of every point I have made thus far. I'm afraid that I have, as usual, given a verbose and marginally relevant introduction to an article about exactly how offenses have changed over the years, but for the benefit of those who do not pore over old record books, it's essential to establish first that those changes have actually happened and have often been significant. For the essence of this analysis, I am going to concentrate on only four years:
- 1894, when the game reached its offensive peak.
- 1930, which was the peak year of the offensive explosion begun by Babe Ruth, which essentially ended after the prime years of Mays and Mantle. (That era lasted approximately from 1920 until 1962.)
- 2000, which was the peak year of the offensive revival that occurred from 1993 to 2009.
- 2015, which was neither a peak nor a nadir of any trends, but just happens to be the most recent year in the books. (If I really wanted to establish a point about offensive decline, I'd use 2014, in which run production was the lowest since 1981. Inconveniently enough for my point, offenses actually made a slight comeback in 2015.)
The goal: to compare contemporary baseball to the top offensive seasons in history, in order to determine precisely what has changed.
The first thing we learn from that comparison is that on-base percentages have declined steadily.

- 1894: .379
- 1930: .356
- 2000: .345
- 2015: .317
We can then determine that walks have relatively little to do with the decline. The number of walks per 550 at bats has remained relatively constant since 1894.

- 1894: 56
- 1930: 49
- 2000: 60
- 2015: 47
And therefore, since walks are not really to blame for the decline in on-base percentage, we realize that batting averages have declined steadily, ultimately dropping 55 points since 1894.

- 1894: .309
- 1930: .296
- 2000: .270
- 2015: .254
We can completely eliminate extra base hits as the cause of that decline. Although triples have metamorphosed into homers over the years, the number of doubles has remained almost constant, and so has the total number of extra base hits per 550 AB:
- 1894: 44
- 1930: 48
- 2000: 51
- 2015: 46
We have therefore found our culprit in the case of the declining on-base percentage: the number of singles per 550 AB has steadily declined.

- 1894: 125
- 1930: 114
- 2000: 98
- 2015: 93
A decline from 125 singles per 550 AB to 93 represents a loss of .058 in batting average, and that explains the entire decline for the past 120 years. Knowing what has changed drives an investigation into why. On the surface this fact seems puzzling, because baseball has a little-known constant which is worded as follows: excluding extra base hits and strikeouts, batters achieve a single once out of every four times they put the ball into play. Here is that rate for our four seasons:
- 1894: .265
- 1930: .253
- 2000: .247
- 2015: .246
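That constant can be checked directly against the per-550-AB figures used throughout this article. A minimal sketch; because the counts are rounded to whole numbers, the computed rates only approximate the values above:

```python
# Singles per ball in play, where "balls in play" excludes both
# strikeouts and extra base hits (per-550-AB counts from the chart).
seasons = {
    1894: {"singles": 125, "ebh": 44, "k": 32},
    1930: {"singles": 114, "ebh": 48, "k": 50},
    2000: {"singles": 98,  "ebh": 51, "k": 103},
    2015: {"singles": 93,  "ebh": 46, "k": 124},
}

for year, s in sorted(seasons.items()):
    chances = 550 - s["k"] - s["ebh"]   # chances to put the ball in play for a single
    rate = s["singles"] / chances
    print(f"{year}: {s['singles']}/{chances} = {rate:.3f}")  # all near .25
```

Every season lands close to one single per four balls in play, which is the point of the constant.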
And that finally ends the quest to determine why on-base percentages have declined. Since batters have always had approximately the same success rate, about one in four, when putting the ball into play for a single, and since neither the number of extra base hits nor the number of walks has changed significantly, the simple answer to the decline in OBP is that batters are not putting the ball into play as often. Bingo! With the possible exception of the fact that modern players hit ever more homers and ever fewer triples, the greatest change in baseball offenses from 1894 to the present has been the number of strikeouts per 550 AB.

- 1894: 32
- 1930: 50
- 2000: 103
- 2015: 124
In a hypothetical season of 550 AB, the average player in 1894 would put the ball into play, with a chance at a single, about 474 times, since he would hit 44 extra base shots and whiff 32 times. By 1930 that number had declined to 452 tries, then to 396 in 2000, and finally all the way down to 380 in 2015. That decline represents an immense difference. For an average player, it means that 94 at-bats per year that used to result in fly balls, line drives or ground balls, thus presenting a one in four chance of a hit, are now terminated while the batter stands in the box.
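The arithmetic behind those counts is plain subtraction from the 550-AB baseline. A sketch using the per-550 figures from the chart:

```python
# Balls put into play with a chance at a single, per 550 AB:
# 550 minus extra base hits minus strikeouts.
per_550 = {1894: (44, 32), 1930: (48, 50), 2000: (51, 103), 2015: (46, 124)}

in_play = {year: 550 - ebh - k for year, (ebh, k) in per_550.items()}
print(in_play[1894], in_play[1930], in_play[2000], in_play[2015])  # 474 452 396 380
print(in_play[1894] - in_play[2015])  # 94 chances per season lost to strikeouts
```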
Why? It may be that today's pitchers are faster and have more pitches in their repertoires; it may be that today's pitching strategies force the batter to face a fresh arm far more often; it may be that today's batters more frequently swing for distance rather than contact; it may be that there has been a decline in the art of bunting for a base hit; it may be (and probably is) all of these things. The explanation may be complex and nuanced, but the conclusion is not: the one and only reason why onbase percentages have declined over the years is simply the steady decline in the frequency of batters putting the ball into play.
To address the question posed at the top of this page, "What Happened to the Offense?", the answer is obviously "strikeouts."
This trend is not at all in remission. Today, there are 7.7 strikeouts per team per game. At the beginning of the decade the figure was 7.1. At the beginning of the previous decade it was 6.5. In 1990, it was 5.7. In 1980 it was 4.8. There were some ups and downs in the 1950-1980 period, but in 1950 the number was 3.9. In 1940 it was 3.7. In 1930 it was 3.2. That number went up and down a bit in the 1893-1930 period, but in the first year of the modern pitching distance, it was 2.13.
Here is the chart that reflects all the data referenced above. All non-percentage numbers are expressed per 550 AB. I use that arbitrary number because it represents, to us, a typical season for a typical full-time player. We understand the difference between a 30-strikeout player and a 90-strikeout player, or the difference between a two-homer man and an 18-homer man, so it is easy to view the chart and immediately perceive what kind of player performance was typical in an era.
Year   R/G    1B   2B   3B   HR   EBH    K   BB    avg   avg ex K   avg ex K & HR   avg ex K & EBH
2015   4.25   93   27    3   16    46  124   47   .254       .329            .302             .246
2000   5.14   98   29    3   19    51  103   60   .270       .333            .304             .247
1930   5.65  114   30    8   10    48   50   49   .296       .326            .312             .253
1894   7.38  125   26   12    6    44   32   56   .309       .328            .320             .265
We know that batters in 2015 hit .254 overall and .329 when not striking out. Thanks to FanGraphs, we also know approximately how that breaks down by type of contact.
type of contact    percent of at bats    batting average
grounder                         34.1%               .245
fly ball                         24.0%               .245
line drive                       16.3%               .685
pop up                            3.1%               .015
strikeout                        22.5%               .000
all at bats                                          .254
non-strikeouts                                       .329
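As a sanity check, the overall and non-strikeout averages follow from that contact mix as a weighted average. A sketch using the rounded percentages above, so the last digit can differ slightly from the article's figures:

```python
# 2015 contact mix: (share of at bats, batting average on that contact).
mix = {
    "grounder":   (0.341, 0.245),
    "fly ball":   (0.240, 0.245),
    "line drive": (0.163, 0.685),
    "pop up":     (0.031, 0.015),
    "strikeout":  (0.225, 0.000),
}

overall = sum(share * avg for share, avg in mix.values())
non_k = overall / (1.0 - mix["strikeout"][0])  # renormalize to non-strikeout at bats

print(f"{overall:.3f}")  # 0.254
print(f"{non_k:.3f}")    # 0.328 (the article's figure is .329)
```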
How does this affect pitching statistics?
The vast and relatively recent increase in strikeouts affects pitching statistics and strategies perhaps even more dramatically than it affects batting, because it greatly increases the advantage of a strikeout pitcher over a pitcher who allows contact. In essence, strikeouts and WHIP have become ever more closely correlated. This point stems from another of baseball's constants: when not striking out, batters achieve a hit once every three times. The difference between this constant and the one above is that extra base hits are now considered in the same category as singles, simply so we can see the effect of strikeouts on batting average. In other words, when batters do not strike out, they hit about .333, and always have since the game started using the modern rules. The following shows the batting averages of all players when not striking out:
- 1894: .328
- 1930: .326
- 2000: .333
- 2015: .329
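The chart's own counts reproduce the constant; the 2000 line, for instance, works out exactly. A sketch using the per-550-AB figures:

```python
# Batting average when not striking out = hits / (AB - K).
singles, ebh, k, ab = 98, 51, 103, 550   # the 2000 line of the chart
hits = singles + ebh                     # 149 hits per 550 AB

non_k_avg = hits / (ab - k)              # 149 / 447
print(f"{non_k_avg:.3f}")                # 0.333
```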
It is astounding that with all the changes in baseball since 1894, the needle here has not moved.
That statistic stays relatively constant from team to team and pitcher to pitcher as well, although a single season may not be adequate for the stability to be apparent, because of the variations caused by relatively small sample sizes. While a modern major league season produces overall statistics based on some 180,000 plate appearances, and therefore reflects an accurate measurement of performance relatively unaffected by random variations, an individual starting pitcher will face only about a thousand batters or fewer in a season, so his statistics will be significantly more vulnerable to the vicissitudes of chance. But even there we can see the truth of the axiom. You know that batters hit for very low averages against Randy Johnson, and 2002 was his winningest year, when he won the Cy Young by going 24-5 and leading the league in wins, winning percentage, strikeouts and ERA. He allowed only 6.8 hits per nine innings, compared to a league average of 9.0. So what did batters hit against him when not striking out? They batted .321, about the same as they hit against any mere mortal. Pedro Martinez had a similar season in 1999, when he was 23-4 and led the league in all those same categories. Batters hit .343 against him when not striking out. Bob Gibson's winningest year was 1970, when he went 23-7. Batters hit .315 against him when not striking out.
What additional value derives from knowing this?
Quite a bit, actually. Since players succeed in one out of every three at-bats when not striking out, you can accurately predict a pitcher's "hits allowed" if you know how many strikeouts he had. To word it another way, "hits allowed" is not really an independent variable. Since the batting average in non-strikeout at bats is a constant, the number of hits is a derived statistic which hinges on the number of non-strikeout opportunities, as follows:
1. Take the number of innings pitched and derive the number of pitching outs. There are three outs per inning, but only 20 of every 21 outs are created by the pitcher; the rest come from baserunning. So the number of pitching outs = innings pitched times 20/7 (that is, 3 × 20/21).
2. Subtract his strikeouts to get the number of other pitching outs.
3. Divide the total by 2/3 to get the number of other at bats against him. (If hits represent 1/3 of those at bats, then outs must be the other 2/3.)
4. Multiply that number by 1/3 to get his hits allowed.
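Those four steps reduce to a short function. This is my own sketch; the 20/7 outs figure and the one-in-three constant are the article's, and the 200-inning, 180-strikeout season is a made-up example:

```python
def predicted_hits(ip: float, k: float) -> float:
    """Predict hits allowed from innings pitched and strikeouts."""
    pitching_outs = ip * 20 / 7                   # step 1: 3 outs/inning, 20 of 21 by the pitcher
    ball_in_play_outs = pitching_outs - k         # step 2: remove strikeouts
    non_k_at_bats = ball_in_play_outs / (2 / 3)   # step 3: outs are 2/3 of those at bats
    return non_k_at_bats / 3                      # step 4: hits are the other 1/3

# A hypothetical 200-inning, 180-strikeout season:
print(round(predicted_hits(200, 180), 1))  # 195.7
```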
You can combine those steps, of course. Steps three and four are easy to combine, because dividing a number by 2/3 and then taking a third of the result is the same as taking half of the original number. Therefore, half of (20/7 × IP − K) is 10/7 × IP − K/2. In English: multiply innings pitched by 10/7, then subtract half of the strikeouts.
If you are working with "per inning" numbers, hits per inning equals 1.43 minus half of strikeouts per inning, as summarized below:
k/inn    h/inn
  1.4      .73
  1.3      .78
  1.2      .83
  1.1      .88
  1.0      .93
   .9      .98
   .8     1.03
   .7     1.08
   .6     1.13
   .5     1.18
   .4     1.23
  .95      .95
Here is the same chart expressed per 9 innings rather than per inning, since some people prefer to read it that way:
 k/g     h/g
  13     6.4
  12     6.9
  11     7.4
  10     7.9
   9     8.4
   8     8.9
   7     9.4
   6     9.9
   5    10.4
   4    10.9
   3    11.4
8.57    8.57
The last row in each chart represents the equilibrium point, the place where strikeouts are equal to hits. For example, if a pitcher strikes out 8.57 batters per nine innings, you would also expect him to allow about the same number of hits.
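Both charts, and the equilibrium point, fall out of the combined formula; setting hits equal to strikeouts gives k = 20/21 per inning, or 180/21 = 8.57 per nine. A sketch:

```python
# Hits per nine innings = 9 * 10/7 - (strikeouts per nine) / 2,
# from the combined formula above.
def hits_per_nine(k_per_nine: float) -> float:
    return 9 * 10 / 7 - k_per_nine / 2

# Equilibrium: k = 10/7 - k/2 per inning, so k = 20/21 per inning.
equilibrium = 9 * 20 / 21

print(round(hits_per_nine(8), 1))            # 8.9, matching the chart row for k/g = 8
print(round(equilibrium, 2))                 # 8.57
print(round(hits_per_nine(equilibrium), 2))  # 8.57; hits equal strikeouts
```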
Does this knowledge have any practical value?
Maybe. These charts enable one to identify pitchers with special talents. If a pitcher strikes out only three batters per nine innings, for example, but consistently manages to hold the opposition to nine hits in those innings, then he has the ability to defy the .333 constant in some way, perhaps by inducing weak grounders, perhaps by getting many fly balls in a cavernous ballpark. Such a pitcher was Ned Garver, the only man ever to win twenty games for a team that lost a hundred. He went 20-12 for the hapless 1951 Browns, a feat so miraculous that it earned him the start in the All-Star Game and second place in the MVP balloting. In the course of his career he averaged only 3.2 strikeouts per game (defined as nine innings), but allowed only nine hits, as compared to a predicted eleven. He did this by defying the .333 constant. Over the course of his career, batters hit only .286 against him when not striking out.
Hall of Famer Robin Roberts also managed to defy the constant. Over the length of his career, batters hit only .292 against him when not striking out. At the peak of his career, 1952-1955, when he led the league in wins every year, he allowed a typical number of extra base hits, but demonstrated a remarkable ability to prevent singles, especially in his home park. If these sorts of pitchers can be identified, and if it is possible to determine the reasons for their success, they can be placed in situations suited to their talents.