A Welcome Parasite: Changing the Face of White Hollywood

The Academy has long disproportionately praised the work of white men over that of minorities and women. Aside from occasional Best Picture nominations, non-English-language films usually succeed only within the confines of the Best International Film category, which itself has historically been poorly defined.