The premise of many of these books is that, whether we like it or not, America is an empire. Whether we like acknowledging it is a whole other thing.
When we think of empires, we tend to think of the ancient Romans with their gladiators and coliseums, or the British Empire and some guy in a funny-looking pith helmet tromping through the African jungle: "Dr. Livingstone, I presume?" The words "empire" and "imperialism" carry so much negative baggage; plus, they're downright anti-democratic. If we're controlling the destinies of other countries, how are we allowing them to be democratic and free nations?
Being an imperialist nation, apparently, is something we've been working at for well over a century now. Stephen Kinzer's book Overthrow outlines over a dozen instances where the United States took control of a country because our business or political interests were threatened, resources were slipping out of our control, or, during the Cold War, we felt the creep of socialism or communism was getting too close.
Questions to consider but not necessarily answer: Is the course we've taken toward building an empire worth the hatred of the world? Our safety doesn't count for much when terrorists want to kill us at home. Morally, are we doing the right thing by keeping other countries from determining their own destinies? Economically, is the tax money we're spending on our military well spent? Should the other countries of the world shoulder their own defense expenses? Why or why not?