Sex seems to be something our society finds normal and natural for men to engage in, but not for women. In fact, sex is a taboo subject for women…unless they want to be thought of as “easy.” As far as society is concerned, men are supposed to sleep around, then go on to satisfy themselves even further in the shower. Female masturbation, on the other hand, is considered dirty or shameful…something that is only supposed to happen in porn movies.
Here in the US there has always been a strong sense of gender roles. It’s still common to hear phrases like, “That’s a man’s job,” or “That isn’t a woman’s place.” You’d think that by the 21st century we’d be far past these kinds of primitive barriers. In this country, we’ve been fighting for over a century and a half to break down these ridiculous walls.
Some people I know blame religion, especially religions of Judeo-Christian origin, for this kind of separation between the genders. Perhaps they are right, but I think it goes much deeper than that. There are plenty of other countries founded on very similar religious ideals that have been able to shed this kind of oppression. Somehow, the US can’t do that. There is some mass need among the majority to hold power over the rest…to bully others…to hurt and shame other people so that they can remain in control, so they can stay the majority. They gratify themselves by hurting others.