playmaker88
New member
Iheart~Cali;1346698 said:I'm gonna disagree on this one and say no. Hollywood is notoriously liberal. But they're also about the bottom line; money. They'll only go where the money is, and if viewers gravitate towards the "stereotypical" Black role, well that's what Hollywood is going to play up. It's business. I wouldn't call it racism.
If the feeding frenzy is on those negative aspects of blackness and money is their motivation, what would you call it..... Also it's hilarious how the same shit people criticize in black movies, they turn around and cheer for and support in white movies.... or the fact that certain movies won't get made at all. Hollywood has barely tested the waters with anything new. Look at the current shows.. shit is very lily white, across the board.. You have your spots, but there isn't one primarily black-focused drama on network TV..