Scientists build first-person view video database to provide visual data in line with human experience

To better understand the organization of the brain and human perceptual tendencies, a team of four scientists is recording video from four head-mounted cameras, along with eye tracking and head movement, and assembling a massive database of more than 240 hours of first-person video that can be used by researchers everywhere.

"The brain is adapted to the world around us, but we don't have good data on what the world actually looks like to human observers. There are no collections of videos that sample the world the way that humans do. Hollywood cinematographers don't zip the cameras around as fast as human eyes move, so movies don't really reflect the way we take in the world," said Mark Lescroart, assistant professor and neuroscientist in the psychology department at the University of Nevada, Reno.

The team of neuroscientists and social scientists is setting out to build a visual database that more accurately reflects human activity: a vast gallery of videos showing what people see as they go about their daily lives. Their Visual Experience Database is intended to support future research in fields that rely on the analysis and recognition of images, including neuroscience, vision science, cognitive science, artificial intelligence and possibly digital humanities and art.

To gather the videos, the scientists designed a combined headset and glasses device. While early versions look like a prototype for a Borg device, the team is streamlining the system to keep the weight down and improve wearability. It has two cameras facing forward to record the world and two cameras facing the eyes to track eye movement. There will be five headsets for each of the four labs participating in the research.

The subjects wearing the headsets will range in age from 5 to 70 years old. They'll visit a variety of spaces and engage in many activities: going to museums and libraries, shopping, commuting, riding bikes and walking. The research team will analyze cues for 3D space perception and how people commonly experience walls, corners, landmarks and other built structures in the world. A sample of what the cameras capture shows the eyes moving and the head turning to look at people and objects.

"We wanted to use mini-computers, but they weren't robust enough to handle our needs, so we ended up with a laptop in a backpack. It makes the headset a little more user-friendly, so our subjects won't be distracted by the tech," said Paul MacNeilage, assistant professor and neuroscientist in the College of Science at the University of Nevada, Reno. "We decided to go with a Pupil Labs product for the base and added devices to it. We didn't want it to be too distracting for others, either."

The system must be able to collect GPS data, run four cameras and the accompanying software, record three video streams at once, read an internal motion sensor and an accelerometer, and do all of this on a decent power supply. The technology is much more involved than the stationary devices typically used in the lab for eye-tracking studies, which rely on a chin rest and a display.

"This is definitely not the same type of studies on eye movement that are done in the lab, with a chin rest and a display showing pictures of the environment," MacNeilage said. "This is out in the real world with people interacting with their environment."
This is especially important to MacNeilage, who runs the Self Motion Lab at the University of Nevada, Reno, where graduate students are involved in the groundbreaking research. "The system measures head and body movement through space," he said. "This allows us to reconstruct visual input moment to moment and get insights on sensory-motor control. No existing database includes head motion."

MacNeilage, Lescroart and graduate student Christian Sinnott tried out the headset, taking it outdoors. They got a few weird looks from those they passed on the street.
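For readers who want a concrete picture of what such recordings involve, the following is a minimal, purely illustrative Python sketch of how a single synchronized sample from the headset's streams (GPS, the forward-facing and eye-facing cameras, the motion sensor and the accelerometer) might be organized. The project's actual data format is not described here, so every field and file name below is a hypothetical assumption, not the Visual Experience Database's real schema.

# Illustrative sketch only: all field names and values are hypothetical,
# not the actual format used by the Visual Experience Database.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class HeadsetSample:
    """One hypothetical synchronized sample from the head-mounted recording system."""
    timestamp_s: float                                    # seconds since the recording started
    gps_lat: float                                        # GPS position of the wearer
    gps_lon: float
    head_orientation: Tuple[float, float, float, float]   # quaternion from the internal motion sensor
    head_acceleration: Tuple[float, float, float]         # m/s^2 from the accelerometer
    gaze_direction: Tuple[float, float]                   # eye-tracker gaze angles (azimuth, elevation) in degrees
    world_frame_left: str                                 # frame references for the two forward-facing cameras
    world_frame_right: str
    eye_frame: str                                        # frame reference for the eye-facing cameras

# Example: a single sample taken while a subject walks through campus.
sample = HeadsetSample(
    timestamp_s=12.50,
    gps_lat=39.5442, gps_lon=-119.8166,
    head_orientation=(0.99, 0.01, 0.12, 0.0),
    head_acceleration=(0.1, 9.8, 0.3),
    gaze_direction=(-5.0, 2.5),
    world_frame_left="world_L_000312.jpg",
    world_frame_right="world_R_000312.jpg",
    eye_frame="eyes_000312.jpg",
)
print(sample.timestamp_s, sample.gaze_direction)

Storing head orientation and gaze alongside each video frame is what would allow researchers to reconstruct, moment to moment, where the wearer was looking as they moved through the world.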


