Originally posted by Unregistered
I have always believed that America is the greatest country in the world. Is that racist, sexist, Nazi, or elitist? I would assume that people from other countries feel the same way about their own countries, but maybe they don't.
So, back to your point: is America not a great country? Has it ever been one?
I believe it was and is, and I want it to remain so forever. While you obviously take issue with that, the rest of the world seems to agree with me: people are willing to risk their lives to come here, and I haven't heard of any mass exodus away from the US.