How do errors today compare with the 1800s?

In the early days of professional baseball, fielders did not use gloves. Every play had to be made barehanded, easy fly balls and scorching line drives alike. No rule specifically forbade gloves, but they were considered “sissy” devices. Supposedly, the first man to don one was St. Louis’s Charlie Waitt in 1875. Waitt wore a street-wear leather glove on one hand to protect it while fielding, inspiring other players, and later Michael Jackson, to do the same. Before then, every ball put in play was an adventure: teams averaged over eight errors per game and killed their hands in the process. By 1880 that number had been cut nearly in half. Still, those early gloves offered only meager protection and no actual fielding help.

Over the years, the thin leather gloves slowly evolved into the tough, padded hide gloves we have today. In 1920, spitballer Bill Doak approached a baseball glove manufacturer about designing a custom model with a pre-made pocket (before that, gloves developed a pocket only through wear and tear) and webbing between the thumb and index finger. And so the modern glove was born.

The impact of Doak’s design was not immediately felt; errors per game were only slightly lower in 1930 than in 1920. Gradually, though, the defense improved. The 1940 Cincinnati Reds were the first team to post a fielding percentage of .980 or better. In 1947, for the first time ever, no team had more total miscues than games played. Today errors are at an all-time low. Teams average only 0.7 a game, and every year throughout the ’90s some teams totaled fewer than 100. Last season the Cleveland Indians, featuring a fantastic middle infield of Omar Vizquel and Roberto Alomar, led the majors with just 72, a measly 0.44 per contest. Compare that to the 1877 Louisville club, whose league-leading 4.5 a game was 10 times as many.