Women and the Media
With all the political discussion about women and their biological rights in the United States and what this means in general, I have begun thinking about a global understanding of women in the world. In Europe it is culturally understood that a woman has the right to choose to take birth control, have abortions, and expect her partner to use condoms--at least from what my students have told me, what I have read, and what I have experienced. But the portrayal of women in the media is vastly different.
Women, at least in Eastern Europe, are shown as sexual objects. I really don't understand why having a nearly-naked woman in your coffee ad will increase coffee sales. Now some may say, and have, "You are American, and your country is prudish about the female body." And that is absolutely true. We are prudish about the male body as well: showing a man's penis in a film nearly guarantees an X rating from the MPAA. I know that the United States isn't perfect in the way it portrays and treats women, but women there are shown as more than giggling sales and sex objects.
I was reading an article from the LA Times, and what it made me realize is that women do want a respectful view of themselves in the media. How do we do this? It's not easy--because sex sells. Also, we live in a world where most corporate control is in the hands of men. In the United States, we have television shows that portray women as lawyers, doctors, reporters, mothers, and even President of the United States. But these women are still beautiful, and still eye candy. When a woman speaks in front of Congress about her political opinion, she is vilified and called a slut. Some Congressional leaders still want to see women barefoot and pregnant, yet want to give themselves a bigger salary while cutting funding for domestic abuse programs. I wonder if the world really wants to see real women on TV. I do think, though, that showing women in roles other than watermelon and mastika sales girl would be helpful.