They're the villains of every Hollywood movie. I think 1933-1945 greatly damaged their reputation in the West, but that is a comparatively small stretch of time against centuries of history. Germany is well respected because German culture has been an essential pillar of Western civilization.