
Hollywood pushes a liberal agenda to the rest of the country. And, whether we like it or not, Hollywood dictates the culture of the country.