Once the U.S. entered the Second World War, it became necessary for Hollywood to address certain aspects of grim reality that post-Depression-era cinema had specialized in ignoring. After the war ended, reality seemed to become even grimmer, and filmmakers responded by showing audiences a version of the world in which Hollywood’s conservative, romantic […]