Thursday, December 6, 2012

Research takes next generation augmented reality apps 'anywhere'

Public release date: 6-Dec-2012

Contact: Melissa Van De Werfhorst
melissa@engineering.ucsb.edu
805-893-4301
University of California - Santa Barbara

UC Santa Barbara computer scientists are changing the face of augmented reality by modeling user experience and adding dynamic crowdsourced data

Augmented reality applications for mobile devices could become smarter and more sophisticated, thanks to two recent grants awarded to UC Santa Barbara computer science professors Matthew Turk and Tobias Höllerer.

While many mainstream augmented reality (AR) applications rely on mobile device sensors and a static dataset layered over real-time visuals or GPS coordinates, Turk and Hllerer envision next-generation AR that is more stable, realistic, and dynamically updated by users.

"Our research employs real-time computer vision for more stable presentation of 3D computer graphics that appear as if they are truly part of the physical world," said Professor Hllerer. "Imagine applications, such as a landscape architect experimenting with design by placing virtual trees or walking within the grounds they plan to develop. A tourist at an archaeological site could explore the reconstruction of an ancient temple wh ere it once stood."

The UCSB team is conducting intensive research that couples mobile computer-vision capture with crowdsourced user data, so the system can immediately discern whether a virtual object still matches its real-world counterpart. They've termed the approach "anywhere" augmented reality.

There is a need to close the sensor loop to get more accurate overlays, Höllerer explained. "The camera feed from the mobile device can be matched to indexed image databases. But images of the real world change. So, the next time I'm standing in front of a restaurant using an AR app, and the façade has recently changed, I can update that information just by virtue of looking at it. The use of the app improves the experience of the next user."
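A minimal sketch of that matching step, using off-the-shelf ORB features in OpenCV rather than whatever matcher the UCSB system uses; the file names and the match threshold are illustrative assumptions. When the live frame no longer matches the indexed image well, the stored entry is presumably stale and the user's current view can be submitted as an update.

```python
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

def match_score(live_path, reference_path, ratio=0.75):
    """Count unambiguous ORB feature matches between two images."""
    live = cv2.imread(live_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    _, live_desc = orb.detectAndCompute(live, None)
    _, ref_desc = orb.detectAndCompute(ref, None)
    if live_desc is None or ref_desc is None:
        return 0
    pairs = matcher.knnMatch(live_desc, ref_desc, k=2)
    # Lowe's ratio test discards matches whose runner-up is nearly as close.
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)

# Illustrative file names and threshold: a low score suggests the indexed
# image is stale, so the user's current frame becomes the update.
score = match_score("camera_frame.jpg", "indexed_restaurant.jpg")
if score < 30:
    print("facade changed; submit the current frame to update the database")
```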

To achieve "anywhere augmentation," the researchers must first design an interface that potential developers can easily experiment with out of the box.

With a recent $300,000 grant from the federal Office of Naval Research, Turk and Höllerer's computer science research group, called the Four Eyes Lab, will be closely studying user experience to create that optimal user interface for AR. The project is a collaboration with Virginia Tech Professor Doug Bowman.

Höllerer and Bowman turned to the virtual reality world for testing and prototyping, using head-worn gear and tablet computers in UCSB's AlloSphere Research Facility, a unique 3-story immersive virtual reality laboratory housed in the California NanoSystems Institute at UCSB. The AlloSphere provides a seamless 180-degree, three-dimensional environment integrated with advanced computer equipment for complete data visualization immersion.

Their research group was also recently awarded $500,000 over three years from the National Science Foundation for a project that uses computer vision-based tracking and augmented reality to enhance remote collaboration (telecollaboration) in physical spaces.

"Live teleconferences would mimic in-person meetings where an engineer or doctor, for example, would be able to interact remotely with a three-dimensional space, such as an instrument or with a team of people," explained Hllerer. Their project allows users in different locations to see and assign data within a target scene, extending two-dimensional tracking to a real-time three dimensional scenario.

"You can point to and annotate an object in a target environment through your screen, and the annotation will 'stick' to the object even when the camera moves, and it will be visible to all users," said Turk. Prototype testing of their technology allowed users, instructed by a remote expert, to control a mock-up airplane cockpit using just a visual camera feed.

Next-generation AR could make it possible to explore any physical environment, known or unknown, live from a remote location, according to the researchers. Rapid advancements in mobile computing devices in recent years, primarily smartphones and tablets, provide a perfect springboard for the technology. "The applications for mobile, real-time augmented reality can have a major impact on health, education, entertainment, and many other areas," added Turk.

Advancements in augmented reality from the UCSB Four Eyes Lab have caught the attention of the global research community this past year. Their work on live tracking and mapping recently won the best paper award at the selective 2012 International Symposium on Mixed and Augmented Reality (ISMAR).

###

Research at the UCSB Four Eyes Lab focuses on the "four I's" of Imaging, Interaction, and Innovative Interfaces. The laboratory is directed by professors Matthew Turk and Tobias Höllerer and is affiliated with the UCSB Department of Computer Science and the Media Arts and Technology graduate program.

The Department of Computer Science at UC Santa Barbara is part of the College of Engineering, a recognized global leader among the top tier of engineering education and research programs.



AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert! system.




Source: http://www.eurekalert.org/pub_releases/2012-12/uoc--rtn120512.php

