At what point do you think the USA changed from the non-interventionist, moral, pro-development Republic into the evil Empire we see today? Why do you think it changed? It certainly didn't benefit American citizens.