The role of women changed drastically during the 20th century. In the early 1900s, female workers were employed mainly in factories or worked as servants. Over time, women gained access to education and began working as nurses and teachers, and eventually even as doctors and lawyers.