Students in a two-quarter course worked in teams in CALIT2’s second-floor EVoKE lab to develop experimental theater projects using augmented and virtual reality technology. One team created a mixed-reality pre-theater experience for a musical titled “The Next Fairy Tale.” In a virtual reality game, users could walk through the enchanted forest and enter the cottage of the Magic Mirror, turn the pages of a spell book and practice casting a few magic charms, and put on the cloak and hat of the queen fairy godmother to see the world from her perspective. Another team designed a mixed-reality tool for set design using both smartphones and Windows Mixed Reality headsets. The third project was an interactive virtual reality experience set in the world of a musical in development, in which the player steps into the shoes of a character and helps her make choices. The course was built around real-world projects as a way to prepare students for careers.
In the CALIT2 Visualization Lab, also located on the second floor, research focuses on large-scale visualization, interactive rendering and virtual reality. Computer science professor Aditi Majumder has developed novel display technology that allows anyone to create personal augmented reality experiences using multiple off-the-shelf projectors. With her company Summit Technology Laboratories, Majumder was one of only a handful of female CEOs to occupy CALIT2’s startup incubator, TechPortal. “This visualization software allows us to harness the power of many projectors to create seamless displays on objects of any shape and size with a push of a button,” she explains. In addition, unlike with current projector display technologies, “viewers can interact with our system using laser pointers, tablets or just hand gestures.” This software-driven, high-resolution, scalable plug-and-play system could have applications in varied environments: education, trade shows, training simulations and entertainment.
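To illustrate the kind of problem such software solves, the sketch below shows, in simplified form, how two overlapping projectors can be blended into a seamless image: each projector’s contribution is attenuated by a ramp across the overlap region so the combined brightness stays uniform. This is only an illustrative sketch, not Majumder’s actual software; the resolution, overlap width and gamma value are assumptions.

```python
# Illustrative sketch only -- not the lab's actual software. It shows the basic
# idea behind seamless multi-projector displays: in the region where two
# projectors overlap, each projector's image is attenuated by a blend ramp so
# the summed brightness stays uniform. Resolution, overlap and gamma are assumed.
import numpy as np

WIDTH = 1920    # horizontal resolution of each projector (assumed)
OVERLAP = 240   # overlap width in pixels between the two projectors (assumed)
GAMMA = 2.2     # typical projector gamma; real systems calibrate this per device

def blend_ramps(width: int, overlap: int, gamma: float):
    """Return per-column attenuation masks for a left and a right projector."""
    left = np.ones(width)
    right = np.ones(width)
    # Linear ramp across the overlap zone, pre-compensated for the projector's
    # gamma so that the two displayed contributions sum to a constant luminance.
    t = np.linspace(0.0, 1.0, overlap)
    left[width - overlap:] = (1.0 - t) ** (1.0 / gamma)
    right[:overlap] = t ** (1.0 / gamma)
    return left, right

left_mask, right_mask = blend_ramps(WIDTH, OVERLAP, GAMMA)
# Each mask would be multiplied column-wise into its projector's framebuffer
# before display; camera-based calibration would also correct geometry and color.
```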
Researchers looking to understand the large-scale structure of human social relationships and interactions can find support at the Center for Networks and Relational Analysis, housed on the third floor of the CALIT2 Building. Led by Carter Butts, a UCI sociology and statistics professor, the center brings together researchers who examine networks from a variety of standpoints and domains. Public agencies, for example, increasingly use social media to impart vital information during emergencies, and when those messages go “viral,” they become an effective outreach tool. In an NSF-funded study, the center’s researchers, led by Butts and joined by colleagues at the University of Kentucky, reviewed tweets sent by emergency officials during a terrorist attack, a wildfire, a blizzard, a hurricane and a flash flood to determine how many times each message was retweeted. They then analyzed the factors related to the probability that a tweet would be retransmitted by recipients. Messages describing hazard impacts and emphasizing cohesion among users generated the most retweets. “Our findings support the intuition that critical information – like advisories or hazard impacts – makes a message more likely to get passed on,” says Butts. “But we also found that strong emotional appeals can sometimes enhance the retransmission rate. Content is important, but the most compelling content is not always the most pragmatic.” As a result of this work, emergency management officials have sought input from Butts and his colleagues on how best to use social media platforms such as Twitter and Facebook, where brevity is at a premium.
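The sketch below illustrates, with invented data, the general kind of analysis the study describes: fitting a regression that relates message features (hazard impacts, emotional appeals, cohesion language) to the chance a tweet is retransmitted. It is not the study’s data, model or code; the feature names and coefficients are made up for the example.

```python
# Hypothetical sketch of the kind of analysis described above, not the study's
# actual data or model. It fits a simple logistic regression relating message
# features to whether recipients retransmitted the message. All data invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Invented binary features for each official tweet.
hazard_impact = rng.integers(0, 2, n)      # describes hazard impacts?
emotional_appeal = rng.integers(0, 2, n)   # makes a strong emotional appeal?
cohesion_language = rng.integers(0, 2, n)  # emphasizes cohesion among users?

# Simulated outcome: was the tweet retweeted at least once?
logit = -1.0 + 1.2 * hazard_impact + 0.6 * emotional_appeal + 0.9 * cohesion_language
retweeted = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([hazard_impact, emotional_appeal, cohesion_language])
model = LogisticRegression().fit(X, retweeted)

for name, coef in zip(["hazard_impact", "emotional_appeal", "cohesion_language"],
                      model.coef_[0]):
    print(f"{name}: estimated effect on log-odds of retransmission = {coef:+.2f}")
```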
In 2018, music professor and violinist/composer Mari Kimura turned to CALIT2 for help in building the technology behind her innovative motion-sensor system, MUGIC. A team of undergraduate students in CALIT2’s MDP program took on the task under the direction of CALIT2 technical manager Michael Klopfer and Kimura. MUGIC is a wearable music controller that analyzes movement and gesture to extract human expression, enabling seamless motion-driven interfaces for real-time multimedia performance. It contains a gyroscope, an accelerometer and a magnetometer. In past performances, the fully wireless Wi-Fi device has been housed inside gloves worn by the performers. It is meant to be attached not only to the violin or other musical instruments, but also to any object used for expressive, communicative motion.
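As a rough illustration of the data path such a device depends on, the sketch below samples a nine-axis sensor (gyroscope, accelerometer, magnetometer) and streams the readings over Wi-Fi to a host computer running music software. It is not MUGIC’s firmware; the sensor stub, network address, port and packet format are assumptions for the example.

```python
# Illustrative sketch of the data path a wearable motion controller relies on:
# sample a 9-axis IMU and stream the readings over Wi-Fi to a computer running
# music software. Not MUGIC's firmware; read_imu(), HOST, PORT and the packet
# format are placeholders.
import socket
import struct
import time

HOST, PORT = "192.168.1.50", 8000   # assumed address of the performance laptop

def read_imu():
    """Stub standing in for a real 9-axis IMU driver; returns 9 floats:
    gyro x/y/z (deg/s), accel x/y/z (g), magnetometer x/y/z (uT)."""
    return (0.0,) * 9

def stream(rate_hz: float = 100.0):
    """Send one UDP packet per sample so gestures reach the host with low latency."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    while True:
        sample = read_imu()
        # Pack the nine floats into a fixed-size binary message.
        sock.sendto(struct.pack("<9f", *sample), (HOST, PORT))
        time.sleep(period)

if __name__ == "__main__":
    stream()
```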