At one point in America’s history, time was a serious headache. In the late 18th and early 19th centuries, towns and cities across the country used local mean time (solar time) to set their clocks. A town’s main clock—usually on a church or city hall—would be set by the sun, and the townspeople would use that main clock as the standard time. Using local mean time meant, however, that every hamlet, every town, every city had its own standard of time. It is estimated that at one point there were 8,000 time zones in the United States alone—and rarely did two ever agree on what time it was. Still, since people rarely traveled, this method worked well.
Until the trains came.
Starting in the 1860s, train schedules became an issue throughout the States. Regional and national trains in the U.S. and Canada all ran on different timetables based on each town’s standard time, creating havoc in stations and frustration among passengers. By 1875, people had had enough. In 1879, American meteorologist Cleveland Abbe proposed a system of four standard time zones across the contiguous United States, based on Greenwich Mean Time (which had been in use in Great Britain since 1855). At the same time, Canadian engineer Sandford Fleming was proposing standard time to his government. The train companies saw the idea of standard time zones as the solution to all of their problems, and on October 11, 1883, they adopted the Standard Time System (STS) of four time zones for the continental U.S.
As soon as the train companies adopted the STS, states started adopting it as well, and at noon on November 18, 1883, the U.S. Naval Observatory officially changed its telegraphic signals to correspond with the STS. Still, as useful as it was, the Standard Time System didn’t become law until 1918, when the Standard Time Act was passed.