Should the U.S.A. have conquered the world?

Keeping in line with my controversial threads that solve nothing, I ask, retrospectively: should the U.S.A. have conquered the world after World War 2, when it was the only country with the bomb?

Conquered places such as Russia after the war, and kept the Islamic countries in check. I commonly hear "turn the sand into a crater of glass." Also, the U.S.A. could have expanded into Mexico. Or something.

I'm obviously going to say no, because I think it would be immoral, but would the world honestly be a better place?

This is all retrospective though.

Weird topic, I know. Just basically: would the world be a better place under U.S.A. control?