However, only the mobilization that followed America's entry into World War II finally brought an end to the Depression. Though the Allies and the Axis Powers had been at war since 1939, the United States remained neutral until the Japanese attacked Pearl Harbor on December 7, 1941. World War II solidified America's role as a global power. It also ushered in numerous social changes, including the movement of women into previously male-only jobs, and it established the reform agendas that would occupy the United States for the remainder of the 20th century. Yet while the United States was defending democracy against totalitarian aggression, it was denying the civil liberties of interned Japanese Americans and the civil rights of racial minorities. The country emerged from World War II a very different nation, with new enemies to confront abroad and new challenges to face at home.