Just a few thoughts... I'm pretty sure we weren't at war with Germany when the Japanese attacked Pearl Harbor on Dec. 7, 1941. I believe Germany declared war on the USA first, shortly after Pearl Harbor, in support of the Japanese Empire. My historical memory comes and goes, so someone please check this out.