The only woman ever to win the Oscar for Best Director is Kathryn Bigelow, in 2010. The first Academy Awards ceremony was in 1929. So Hollywood is racist and sexist. Good to know.

I take a film class, and we're always learning about directors and their styles. The directors are always men. I. Frickin. Hate that! I'm so tired of men and their societal domination. I don't care about winning an Oscar the way I used to, but now I have to. I have to be that director that teachers talk about in film classes.

I want to change Hollywood, but I can't do it alone. I need some women to help me out. I need Black women, Hispanic/Latina women, Asian women, Native American women, and even white women. We have to change this (mostly white) male-dominated world. I don't know if any of this makes sense, but I'm really annoyed.