The lack of female representation behind the camera in Hollywood is palpable: directing remains a male-dominated profession. But in recent years, things have started to change, and women are increasingly breaking through.
In this gallery, we've highlighted some critically acclaimed films directed by women that are worth seeing if you haven't already. Click through to discover them.