I live in America, and people are always confused when I call this sport "football". To them, football means tackling people. That's not football. The real football is played with your feet. I'm a big fan of the real football, and even though I live in America, I still call it football, not "soccer", a name that has always sounded silly to me. Saying "I watch soccer" makes it sound like some little kids' game, and a sport this great deserves the name of the true football.
I believe that association football deserves the title of football more than American football does. Why do I think this?
Let's be honest: association football is the only sport called "football" where players actually use their feet for the majority of the game.
Please note that I'm not disrespecting American football.