
War

War either makes or breaks a country. Win a war and it's good for the economy in the long run; lose one and you're fucked economically. What's most concerning about America at war, though, is that we've mostly become the world's army. We're always trying to get people to sign our peace treaties, and then whenever they go to war we get sucked into it. It seems like we have to fight in EVERY single war that comes along in ANY country now.

Our policy is: be nice to us and we'll beat up your enemies. That way every country that signs with us gets a nice little sign that says "Don't Fuck With Us". But it's bullshit.

We fight wars that have nothing to do with us, which earns us enemies and forces us into big mean fights we shouldn't get into. But we do, because if we stop being an army and go neutral, then we're going to have the entire world up our asses, trying to fuck with us, trying to dominate us. Which is lame.

It's all lame. But we have to dance this stupid political dance, and no one really likes it. It's just a fact of politics that happens to be extremely lame.

Cheers,
Lost