Movies
Why is the amount of sex and violence increasing in motion pictures? Is this a case of Hollywood giving society what it wants, or is it simply society's acceptance of what we are given? - As in most cases, I think it is a little of both. Obviously, if society didn't want sex and violence, Hollywood would not make any money selling it. So yes, Hollywood is giving society what it wants, but also, because of Hollywood, we are becoming more and more immune to sex and violence. They keep pushing the envelope, seeing what we will accept and what we will not.
- Example: the TV show Grey's Anatomy is extremely popular, especially among women. I don't watch it, but my mom was complaining a couple of months ago about a new relationship on the show involving two women. She asked me if I thought she was being irrational in being bothered by it. I told her yes, and that was it. Last week we were talking about it again, and my mom told me that apparently she wasn't the only one who cared, because they eliminated that relationship from the show and killed off one of the women involved. This shows that society is still moral enough to take stands and not accept everything that Hollywood throws at it.
If you were a movie producer, what would you do to make a box office hit in 2008? - I would make a movie where people could relate to the character: a Forrest Gump type of movie where the character is developed and realistic. I would make the movie about success, a positive story. In these hard times, I would make a story that gave people hope, like Life Is Beautiful. I think this is what people are striving for in this difficult world.