Rape Culture

Last Updated: March 4, 2019

Definition - What does Rape Culture mean?

Rape culture is an environment in which rape and sexual violence are normalized and excused.

The term rape culture is typically used to describe an environment where society blames the victims of sexual violence, rather than holding perpetrators accountable for their actions. It was originally used by feminists in the 1970s, but has become a common part of the vernacular in recent years as more people share their stories of sexual violence.

Kinkly explains Rape Culture

Physical rape is just one component of rape culture. It is also perpetuated by misogynistic language, the objectification of women, and the glamorization of sexual violence. Suggesting that wearing sexy clothes raises the risk of rape, or trivializing or dismissing sexual harassment or assault, can also be considered aspects of rape culture. Rape culture is spread by the members of the culture, its popular culture, and its media.

Rape culture has a significant negative impact on the people who live in environments where it exists. Even if they are not physically raped themselves, people (particularly women) may live in fear and feel physically and emotionally unsafe.

Combating rape culture involves several key components, including avoiding degrading and objectifying language, supporting victims of sexual assault, promoting a culture based on consent and mutual respect, and respecting the boundaries of those around us.
