Is Hollywood losing its influence over politics?

By Jason Browning

Dr. Gina and Leslie Marshall discussed how Hollywood allows only leftist thought and opinion within its industry.