Hollywood is often characterised as a stronghold of left-liberal ideals. This title shows that Hollywood is, in fact, deeply complicit in serving the interests of the most regressive US corporate and political forces.[...]