What happened in the 1990s that made Major League Baseball obsessed with "protecting pitchers" through a strict pitch count?
In my opinion, the pressure is coming from the front office. General managers and owners will do anything to avoid putting a star pitcher at risk of injury, even if it means putting the team second. With so much money invested in a single player (C.C. Sabathia is getting over $700,000 per start over the next six years), the fear is understandable. But does pulling a starter after six innings or 100 pitches make sense for the team? Usually not.
Along the same lines, innings pitched per season has followed the same downward trend. No pitcher has thrown 300 innings in a single season since Steve Carlton in 1980. In fact, the leaders in innings pitched over the past three years have had totals in the 230s and 240s. Between 1962 (the year the schedule switched to 162 games) and 1980, the league leader in innings pitched exceeded 300 innings every single year, with totals regularly reaching the mid-300s.
Starters are arguably the most skilled pitchers in the game. With an arsenal that typically includes four or five pitches and the stamina to throw for hours, it's tough to argue that anyone other than your starter is the pitcher most likely to get the batter out. And that's the name of the game, right? Well, it used to be. Maybe there are just more skilled pitchers in the game than there were years ago? One thing we can be certain of is that there are more pitchers in Major League Baseball than there ever were.
Over each of the past three years, roughly 660 different pitchers have appeared in a Major League game. Compare that to 1962, when just 300 different pitchers were used. It wasn't a sudden jump, either; the growth rate has been steady for nearly 50 years. Here's the question, though: is the increase in the number of different pitchers, and the decrease in the number of innings each one pitches per game and per season, resulting in better overall pitching? Let's examine two basic pitching benchmarks: ERA (earned run average) and WHIP ((walks + hits) / innings pitched).
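For readers unfamiliar with the two benchmarks, here is a minimal sketch of how each is computed. The stat line used below (earned runs, walks, hits, innings) is hypothetical, chosen only to make the arithmetic come out evenly:

```python
def era(earned_runs, innings_pitched):
    """Earned run average: earned runs allowed, scaled to a nine-inning game."""
    return 9 * earned_runs / innings_pitched

def whip(walks, hits, innings_pitched):
    """Walks plus hits allowed per inning pitched."""
    return (walks + hits) / innings_pitched

# Hypothetical season line: 100 earned runs, 65 walks, 205 hits in 225 innings.
print(era(100, 225))      # 4.0  -- right at the modern league average
print(whip(65, 205, 225))  # 1.2 -- a figure the text says predates the 1970s
```

Lower is better for both: ERA measures run prevention, while WHIP measures how many baserunners a pitcher allows per inning.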
Since 1993, the average ERA has been over 4.00. Between 1980 and 1992, the average ERA was consistently between 3.70 and 4.00, going over 4.00 just once (1987). The years between 1962 and 1979 showed a wider range, with yearly averages as low as 2.98 and as high as 4.00. A similar trend can be found with WHIP: league averages below 1.3 have not been seen since the 1970s.
For most of the 20th century, pitchers like Cy Young, Christy Mathewson, Lefty Grove, Warren Spahn, Sandy Koufax, and Gaylord Perry had seasons of 30+ complete games in consecutive years. Did their arms fall off? No. Did they turn into Hall of Fame pitchers? Every one of them. So why are today's pitchers treated so differently, even with the advanced training, conditioning, and physical build of the 21st-century athlete?
Today's top pitchers are treated with more caution, and shown less confidence in their ability to complete the game they started, than below-average pitchers were shown half a century ago. Last summer, Austin Wood threw 13 scoreless innings in a 25-inning thriller for the University of Texas. Is it a coincidence that he isn't tied to a multi-million-dollar contract? I don't think so. Starters today have been groomed to throw on a limited pitch count from the moment they enter professional baseball.
If pitch counts were ignored, along with the trips to the mound where the pitching coach asks, "How do you feel?", I strongly believe we would see many more complete games, fewer blown saves, and fewer inferior middle relievers eating up innings in close games.
This is a guest post written by Chet Kresge.
[Photo Credits: CC Sabathia – blog.nj.com | Cy Young – PineCrest.edu]