Carnegie Mellon undergraduates Dan Eisenberg, Kevin Li and Ilya Brin have developed the EyeTable, described as "an artificially intelligent dinner table that reads physical gestures and speech patterns and lets the participants know how the date is going—in real time." The EyeTable consists of a centerpiece and two headpieces. Sensors on these components detect fluctuations in tone of voice, periods of silence, and the distance between the couple. If the EyeTable determines that the date is not going well, it tries to help the couple along by suggesting post-dinner activities or another bottle of wine. If it determines that the date is beyond help, it instead provides the numbers of local cab companies. While the developers concede that the headpieces would need to be redesigned before anyone would wear them on a real date, the concept is certainly intriguing.