Does Sex Sell?
- By: Samantha Adler
- Mar 10, 2017
- 2 min read

When one hears the word “sex”, the thought of “sin” usually follows shortly after. The media has b r a i n w a s h e d audiences into thinking that the word “sex”, even the thought of sex, is sinful and the work of the devil. But what if I told you that the media has been selling its audience sex subtly (and really not so subtly) all this time by blasting women’s bodies onto every platform known to the world?
There are soooo many options out there that I could dissect, but what I truly want to delve into is how the TV show Sex and the City depicted the bodies of women and how it suggests women should be viewed:
s e x o b j e c t s t h a t c a n s e l l a s h o w t o t h e g e n e r a l p u b l i c.
In the blog “Women in the Media,” the author illustrates that Sex and the City is “a great example of how women have contested the naturalized stereotype of them being portrayed as sexualized objects. The television show does promote sex, but at the same time it contests the sexual stereotype of women . . .” And what I find interesting is that the show was created by a man, written by men, and produced by men.
Every decision made for the show was run by a man before it ever hit the screens of insecure women everywhere. What the show did was simple: blast unrealistic body types into the viewers’ faces to push women into thinking that’s what every “American” body should look like, so that they (the viewers) would start developing body dysmorphia trying to emulate said body, all so they themselves could be viewed as sexy. A final note to end with: these expectations came from the viewpoints of men, NOT from women. So, to answer the question: does sex sell? My guess is that it truly does.