Rape Culture

Definition - What does Rape Culture mean?

Rape culture is an environment in which rape and sexual violence are normalized and excused.

The term rape culture is typically used to describe an environment where society blames the victims of sexual violence, rather than holding perpetrators accountable for their actions. It was originally used by feminists in the 1970s, but has become a common part of the vernacular in recent years as more people share their stories of sexual violence.

Kinkly explains Rape Culture

Physical rape is just one component of rape culture. It is also perpetuated by misogynistic language, the objectification of women, and the glamorization of sexual violence. Suggesting that wearing revealing clothing raises the risk of rape, or trivializing or dismissing sexual harassment and assault, can also be considered aspects of rape culture. Rape culture is spread by the members of the culture, its popular culture, and its media.

Rape culture has a significant negative impact on the people who live in environments where it exists. Even if they are not physically raped themselves, people (particularly women) may live in fear and feel physically and emotionally unsafe.

Combating rape culture has several key components, including avoiding degrading and objectifying language, supporting victims of sexual assault, promoting a culture based on consent and mutual respect, and respecting the boundaries of those around us.
