I realize that until fairly recently women were viewed more as possessions than as human beings. If men were able to create life themselves, I'm sure women would still be indentured servants at best, and might even have gone the way of the passenger pigeon.
This makes sex the primary human activity, more important even than making money.
But now that women are no longer dying young in childbirth or being raped and pillaged during border disputes and range wars (this doesn't apply to all parts of the world), men have been forced to pay attention to women for reasons other than pleasure and procreation. Women have become an important economic and political force in the world, even though the angry old white men in the United States are dragging their feet in recognizing this fact. Many men cannot wrap their feeble little minds around the concept that women should be treated as equals.
Why is that?