It’s time for Hollywood to listen rather than lecture.

Harvey Weinstein is a problem. But he's not THE problem. The problem goes far deeper: Weinstein is a symptom of a diseased culture.

I enjoy going to the movies, but I've noticed something disturbing in recent years. It has hit me more than once, mid-movie, that the people who made the film don't actually like people. You can see it in the way they portray suburban life as something merely to be mocked. You can see it in the way the movies depict businessmen and businesswomen as if they're always up to something nefarious. You can see it in the casual way pedestrians are crushed or killed by an explosion while the caped protagonist battles the supervillain. Sometimes those deaths are even played for laughs, as if death were retribution for being an extra, a normal person.

It hit me that Hollywood is angry. And I think I'm beginning to understand why.

The men in charge in Hollywood use and abuse women. The men view the women as objects, and the women see the men as oppressors and potential assailants. And to at least some extent, these are the people shaping our cultural conversation.

Is it any wonder that people in Hollywood believe all businesses require a drastic increase in oversight and regulation? Is it a shock that many in Hollywood talk about the #waronwomen or #rapeculture? Is it a surprise that they talk about the women's movement as if it were still the 1940s? Maybe that's because in Hollywood, nothing much has changed since then. They thought they were describing the country, but in reality it was a cry for help from Hollywood.

Maybe it’s time that Hollywood caught up to the rest of the country. Maybe these cultural elites should listen to us normals for a little while.

Please continue reading at The National Catholic Register>>>
