Stanley Kubrick’s 2001: A Space Odyssey is, among many other things, a remarkable speculation on the future of artificial intelligence. If that movie is your jam (as it is mine), you’ll want to see how the brainiacs over at Interactive Architecture have taken this idea to an unexpected and deeply weird conclusion.
“Some 16 years past his prediction,” they write, “our project ‘Neural Kubrick’ examines the state of the art in Machine Learning, using the latest in ‘Deep Neural Network’ techniques to reinterpret and redirect Kubrick’s own films. Three machine learning algorithms take respective roles in our AI film crew; Art Director, Film Editor and Director of Photography.”
They frame it as an “artist-machine collaboration” in which each player makes up for the other’s weaknesses.
For our purposes, the experiment to watch is the AI Director of Photography, which uses photogrammetry software and an AI to essentially reshoot scenes. Here’s a slightly oversimplified explanation: the team ran Kubrick’s scenes through photogrammetry software to extract the camera positions, then fed those positions, along with the frames themselves, to an AI that reshot the scenes in virtual 3D space.
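If you want a feel for what that camera-position-extraction step involves, here’s a minimal sketch in Python using OpenCV. It estimates the relative camera motion between just two frames of a scene; this is a toy stand-in for full photogrammetry software, not the Neural Kubrick team’s actual pipeline, and the frame filenames and focal-length guess are placeholders.

```python
# Toy sketch of camera-pose recovery between two frames of a scene.
# Not the Neural Kubrick pipeline -- just the basic idea behind
# extracting camera motion from footage. Filenames and the focal-length
# guess are placeholders.
import cv2
import numpy as np

def relative_camera_pose(path_a, path_b, focal_px=1200.0):
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

    # Detect and describe keypoints in both frames.
    orb = cv2.ORB_create(nfeatures=4000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Match descriptors between the two frames.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Assume a simple pinhole camera with the principal point at the image center.
    h, w = img_a.shape
    K = np.array([[focal_px, 0, w / 2],
                  [0, focal_px, h / 2],
                  [0, 0, 1]])

    # Recover the essential matrix, then the camera rotation and translation
    # (from two views, translation is only known up to scale).
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    return R, t

if __name__ == "__main__":
    R, t = relative_camera_pose("frame_a.png", "frame_b.png")
    print("Rotation:\n", R)
    print("Translation direction:\n", t.ravel())
```

A real photogrammetry pipeline chains this kind of estimate across hundreds of frames and refines it with bundle adjustment, which is how you end up with a full virtual camera path you can hand to something else to “reshoot.”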
Here’s how the processing looks.
And here’s a look at snippets from the final videos.
To watch more, see neuralkubrick.com, where you can select scenes from one of three Kubrick films and view the results of the AI reinterpretation.
For more information about how they built an AI art director and film editor (and more in-depth explanations for the photogrammetry process I oversimplified above), check the project’s website.