In all seriousness, many movies deal with moral dilemmas or crises, and they almost always take a stance on right and wrong. It's extremely difficult to make a movie that doesn't carry some sort of moral or ethical position. People often take those positions and assume that Hollywood is trying to make a political statement or promote a particular political agenda. I'll buy that certain aspects of a movie may be shaped by the political leanings of the producers, directors, and writers. However, there's a large jump from a film being influenced by those politics to a film expressly promoting a political viewpoint or pushing a political agenda.
Case in point - a movie includes aspects of being environmentally conscious and responsible in caring for the earth, and somehow it gets interpreted as "Liberal Hollywood is pushing their tree-hugging political philosophies down our throats!! Don't they know that global warming is a myth!"