Definition - What does Gynarchy mean?
Rule, governance, or leadership by women. In a gynarchy, women hold political, legal, sexual, economic, or social power over men.
Kinkly explains Gynarchy
Female dominance, whether political or social, has existed in many cultures, and a contemporary gynarchy may extend beyond governance to sexual, legal, or economic control.