Definition - What does Gender Roles mean?
Gender roles are the social behaviors a society expects of people of a specific physical sex. These roles shape how individuals speak, walk, dress, think, and interact with the world around them. Gender roles are constantly reinforced through interactions with family, peers, and media; parents appear to have the greatest influence on a child's adherence to them.
Kinkly explains Gender Roles
From birth, children face the expectation that they fit certain gender roles based on the anatomy with which they were born. This can be problematic, as people do not always identify with the gender their anatomy indicates. Additionally, traditional gender roles draw from a binary male/female system that allows for only two genders, while many individuals identify as neither male nor female.