Originally Posted by Spinx
I enjoy the NFL, and some other US sports, but I've got a question for the Americans who watch them:
Do you think it's pretty silly that they call the NFL champions 'The World Champions' when it's not actually a competition open to foreign teams? And the same goes for calling it 'The World Series' in MLB.
When a team wins the football (or soccer to you!) Premiership in the UK, we call them the Premiership Champions. The next year they play against European opposition (along with a few other teams that finished in the top places in the Premiership), clubs that have done very well in their respective countries' leagues, and the winner of that is called the European Champion. And that all makes sense.
Calling yourself world champion without playing anyone from another country just seems rather daft, and it really doesn't help America's somewhat dodgy image worldwide.
Not having a go, I've just never heard an American's take on it.