Good question, Daisygirl. Just want to say right off that the French are known for not liking anyone other than the French. Maybe now they are a bit friendlier to Germans, but that's a recent development, stemming from their shared view on world politics.

Anti-Americanism in Germany? No, I've never personally experienced it. But when you watch TV, you get a lot of anti-Bush sentiment. And you do get to hear how perplexed the Germans are about American politics.

Before the Iraq war, I was often asked how I could stand living in a small German town, and how I could ever have left such a fantastic country as America. Those questions have stopped.

I think the Germans have developed a newfound patriotism, which oddly enough was triggered by the soccer World Cup. Germans are also proud of the fact that they stood their ground and didn't participate in the war. Because of the socialised system, slums and extreme poverty are practically non-existent here. Many can't understand how a world power like the U.S. can't or won't take better care of its own people, as in the New Orleans catastrophe.

To answer your question, Daisygirl: I've noticed that Germans don't gush anymore when they talk about America. But American tourists are welcomed here, and Germans still love Americans as individuals and appreciate their hospitality. That hasn't changed at all.