Gender Roles
As we grow, we learn how to behave from those around us. In this socialization process, children are introduced to certain roles that are typically linked to their biological sex. The term gender role refers to society's concept of how men and women are expected to act and behave. These roles are based on norms, or standards, created by society. In American culture, masculine roles are usually associated with strength, aggression, and dominance, while feminine roles are usually associated with passivity, nurturing, and subordination.