Not long ago, we compiled a list of 100 Iconic Moments from the Best TV Sitcoms of All Time. As it turns out, several of these memorable moments between fictional characters could have helped a machine better understand real people.
As the Associated Press reports, the Massachusetts Institute of Technology is currently conducting research into building a better artificial intelligence system, a goal eerily familiar to fans of Fallout 4.
Using “predictive vision,” researchers Carl Vondrick, Antonio Torralba and Hamed Pirsiavash aimed to create a robot capable of correctly guessing what would happen between two humans when they met, based on their movements and behavior.
To do so, the trio needed a nearly endless source of mundane, everyday interactions. They decided to draw from the deep wellspring that is YouTube.
The researchers locked the poor program into a Clockwork Orange-style session of binge-watching 600 hours (25 days' worth) of these videos, converting the footage into data the program could sift through for patterns.
They then tested the program's grasp of human interaction with clips of shows like The Big Bang Theory, Desperate Housewives and the American version of The Office.
The robot attempted to guess whether characters would hug, kiss, high-five or shake hands. It reportedly guessed correctly 43 percent of the time, 7 percentage points better than similar existing algorithms.
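To put that 43 percent in perspective, here's a toy sketch (not the MIT team's code, and the guesses below are made up for illustration) of how accuracy on a four-way prediction task like this is scored. With four possible actions, blind guessing would land near 25 percent, so 43 percent is well above chance:

```python
# Hypothetical predictions vs. what actually happened in each clip.
# The four labels match the actions the MIT system chose between.
predictions = ["hug", "kiss", "high-five", "shake", "hug", "kiss", "shake"]
actual      = ["hug", "shake", "high-five", "shake", "kiss", "kiss", "hug"]

# Accuracy is simply the fraction of clips where the guess matched.
correct = sum(p == a for p, a in zip(predictions, actual))
accuracy = correct / len(actual)
print(f"{correct} of {len(actual)} correct ({accuracy:.0%})")
# Random guessing among four actions would hover around 25%;
# the MIT system reportedly managed 43%.
```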
A preview video below shows some of the robot’s progress. There’s small comfort in knowing that the lovable but socially stunted Michael Scott perplexes even a highly intelligent machine.
The results of this two-year project will be presented at the Conference on Computer Vision and Pattern Recognition at Caesars Palace in Las Vegas starting June 27.
“Humans are really good at predicting the immediate future,” Pirsiavash told the AP. “To have robots interact with humans seamlessly, the robot should be able to reason about the immediate future of our actions.”