A Conversation with “Ex Machina” director Alex Garland

What defines sentience? Is it possible to create artificial intelligence? If so, how far would you be willing to go to create such a thing?

“Ex Machina,” the latest science fiction film from Alex Garland (“Never Let Me Go,” “28 Days Later”), attempts to tackle these questions.

Caleb (Domhnall Gleeson), a computer programmer for the Google-like Bluebook, is invited by his CEO Nathan (Oscar Isaac) to partake in what he calls a post-Turing test with Ava (Alicia Vikander), an artificial intelligence Nathan has created. Ava, demonstrating a ballerina’s grace, piques Caleb’s interest, and the two subsequently get to know each other through a series of interviews overseen by Nathan. Every now and then, however, random power outages during their meetings disrupt Nathan’s surveillance, and it is in these moments that Caleb discovers through Ava that something more sinister is going on behind closed doors.

I had an opportunity to speak with director and screenwriter Alex Garland to find out what we should expect in this upcoming release. Straightforward, thankfully blunt and brutally honest, Garland had some unique answers for the questions of our generation.


Winston Eng: One thing I like is that it (“Ex Machina”) wasn’t necessarily about artificial intelligence and the inherent dangers of it. Would you like to talk about whether those elements were in your mind when you were writing the script?

Alex Garland: I tried to step away from it (“Ex Machina”) as a creation myth and a cautionary tale in that sense, because I saw it as more parental than a God-like act. You know, it’s the creation of a new consciousness, but that essentially is what parents do when they have children. And I also—I felt very clear in my own mind that I was allied with the machine.

And I think what some people read as manipulation in a kind of deceptive way, I saw in my mind, when I was writing it, as resourcefulness. So if I position myself with Ava, which is what I do, I see a sentient creature who is trapped in a glass box. And there are things within that prison that are designed to make her aware that there is an external world. Ava is given clips from magazines, and there’s a garden area, which she can see through glass but can’t access.

There are some other strange things in there, like a crack in the glass that she didn’t make, which implies that something else made it—something that was there before her, something that appeared to be trying to get out. And she has a jailer in the form of Nathan, who is predatory and intimidating and frightening. And there’s an implication that things may not end well for her.

And then there’s this jailer’s friend who turns up, who may or may not be trustworthy. But there’s a key conversation between them where she effectively tests the jailer’s friend, asking, “What will happen if I fail this test?” And the jailer’s friend sort of hedges his bets when he answers. And in some respects, I think that probably proves that he’s not necessarily entirely trustworthy, or at least leaves open the possibility that he’s not trustworthy.

In those terms, I’d slightly rephrase the terms of the question—just simply because I feel so allied to Ava from my point of view. What Ava does feels reasonable and understandable. But I do understand that that’s not how everybody takes the film. And some people even, slightly to my dismay, see Ava as being malevolent or cold or even evil.

But I think that’s in the subjective nature of responding to stories. And I understand that. But I feel pretty sure about where I stand, I guess.


WE: In some of your other interviews, you’ve talked about distinguishing between a real sentient AI, which, like Ava, sort of deserves to be treated as a person or a sentient entity, and the sort of non-sentient technology that operates at an AI level. Do you have any thoughts on whether such things have a capacity for good or evil?

AG: I know there’s a lot of alarm about AI, and I understand why there’s alarm. And some of it is perfectly reasonable.

You know, there’s always alarm with big changes in science and technology, because they create paradigm shifts, and that freaks people out. That’s completely understandable.

But for example, I live in the U.K., where we have a National Health Service, which is basically a complicated, unwieldy, difficult-to-run system of healthcare that is universal and paid for by taxes. And in many ways, humans often do a bad job of running it.

I don’t consider it impossible that an AI—not a self-aware AI, but just a very, very complex AI—could do a better job of running that health service. You know, it may respond more quickly to the needs of a population in terms of distribution of drugs and medicine in different parts of the country, whatever it happens to be, than its human equivalent.

So there are ways an AI could be fantastic and might genuinely improve our quality of life. I don’t feel alarmed about AI.

But I do think that we’ve got to be careful, because the flip side of that, just to have stated it, is that within the very foreseeable near future, you could have an AI controlling a drone, for example, on the battlefield. Now, in that circumstance, you have a machine that’s making a kill decision over a human being. And that is immensely ethically complicated and almost certainly dubious.

It kind of depends how it’s being used and what the application is.


WE: I know we’re here to spread the word on the film, but I think we both enjoy the cinema to some extent. What do you think we can do as a community to really spread the word on the independent film scene and subsequently help out these filmmakers who don’t necessarily have the backing of major studios?

AG: There’s a simple kind of capitalism at play. Some of these independent movies actually get a fuck of a lot of support from the community of journalists, the people that run websites and film fans. They go well out of their way to support movies like this. There’s nothing to complain about there, or more one could ask for.

The issue always comes down to something really basic, which is like, it’s a Friday night and there’s a couple of people or a group of friends, and they’re saying, “What are we going to see tonight?” And nine times out of 10 or 99 times out of 100, what they choose is the really, really big, visceral movie release.

And it actually ends up in the hands of cinemagoers, truthfully. For example, remember “Inside Llewyn Davis,” that Coen Brothers movie starring Oscar Isaac? So it’s subjective, right? Everyone has a right to their opinion. I thought that was a really beautifully made film, a really stunningly made film. And it got really supported. And a lot of people said it’s brilliant. And then nobody went to fucking see it.

And so I don’t know what you’re supposed to do, because I read a ton of stuff saying this film is really wonderful, and it’s incredibly beautifully shot, and it has wonderful performances, and it’s thoughtful and all that kind of stuff. But then it was like a wasteland when it came to box office receipts.

I find that kind of mysterious. I find it doubly mysterious, because the same people who often maybe don’t go to see a movie in the cinema—sometimes they love it on television. So very difficult adult drama with complex ethics, complex moral situations and complex characterization—like “Breaking Bad”—gets a terrific response on television.

Sometimes I feel like complex adult drama of the sort that used to exist in cinema, like “Taxi Driver,” is now on television in “Breaking Bad,” or “The Sopranos,” or “The Wire,” or whatever it happens to be.

All I’m really saying is that there’s something kind of mysterious underlying all of this for me. I feel like independent film actually gets quite a lot of support, but something goes wrong somewhere along the line. And I don’t get it.