A new survey conducted by HarrisX for The Faith & Entertainment Index claims that 92% of Americans say faith belongs in films and television series.