Whether it’s politics, sexual identity or women’s liberation, Hollywood has a well-earned reputation for slanting liberal. The conservative viewpoint, meanwhile, has struggled to gain a strong foothold in American pop culture, whether through television, movies or music. That trend seems to have shifted a little this year. Is it likely to continue? Joel Penney, associate professor at the School of Communication and Media at Montclair State University, joins The Excerpt to explore the political gains that can be won through entertainment media.
Why was this uploaded? This is the most boring, non-news that I've seen in a while…
???
No, it's always been more conservative. Only censorship makes it seem otherwise.
Not really becoming more conservative, becoming more of a disease. Tired of seeing Taylor Swift 🐴 everywhere in spanx and hearing all these "musicians" belting on the radio with their Walgreens music. Real music needs to come back.
No. Much of America and corporate America have gone leftist woke.