Is Hollywood losing its influence over politics?

Dr. Gina and Leslie Marshall discussed how Hollywood allows only leftist thought and opinion within the industry.