This is not exactly in line with "women in the news," but I found it incredibly interesting after we discussed how gender roles can still affect us today. This article looks at a study in which participants, both male and female, were asked how assertive they were in sexual situations, whether they used protection, and who they thought should do what during sex.
Interestingly, the study found that the more likely someone was to agree with statements like:
"It's OK if some groups have more of a chance in life than others"
"The man should be the one who dictates what happens during sex"
the more likely they were to hold negative attitudes about women's rights and sexual harassment.
The study also found that men who thought men should be more dominant were less comfortable talking about sex and protection with their partners, and that women who thought men should dominate were less likely to use female condoms or to have confidence in their sexual abilities.
I interpret this study as providing evidence that the continuation of traditional gender roles is a public health issue as well as a hindrance to sexual satisfaction. You are less likely to enjoy and benefit from a sexual experience if you are worried about breaking social norms or lack the confidence to use or demand protection. At the risk of making everyone blush: theoretically, do you think society would benefit if everyone read this study, dropped traditional gender roles, and the sex got better?