In 1934, when the first Masters was held—at that time, it was called the Augusta National Invitation Tournament—if you mentioned the “four majors” of golf to anyone, they would likely think you were referring to the four that Bobby Jones swept in 1930: the U.S. Open, the Open Championship, the British Amateur and the U.S. Amateur. Already by then, though, the amateur game was declining in importance, and the professional game, still nascent, was gaining in prestige. Eventually, of course, the pros would become predominant, but even in 1934, the definition of a “major” was blurring.
Fast forward 40 years, to the first iteration of the Players Championship. By then, the four modern majors—the Masters, PGA Championship, U.S. Open, and Open Championship—had been cemented in the public consciousness, to the point that even a new flagship event for the PGA Tour, the foremost professional tour in the world, could not aspire to anything more than mythical “fifth major” status. The majors club was officially closed to new members.
MORE: The Great Schism and life after Tiger Woods
What happened in those 40 years? There’s no governing body or person to declare what constitutes a “major,” so how did perception cohere into our current system? The answer involves the individual merit of the tournaments themselves, some bolstered by history and some, like the Masters, propelled by brilliant PR campaigns, as well as the accidental intervention of the greatest player and agent of their time.
On this week’s Local Knowledge podcast episode, we examine how these forces converged to create a system of four majors that became the standard by which we measure legacies. Listen below, or wherever you get your podcasts.
RELATED: Did you know that the original four majors included two amateur tournaments?
This article was originally published on golfdigest.com