Technology has allowed us to immerse ourselves in a world of sights and sounds from the comfort of our homes, but something is missing: touch.
Tactile sensation is an extremely important part of how humans perceive their reality. Haptics, devices that produce extremely specific vibrations to mimic the sensation of touch, are one way to bring this third sense to life. When it comes to haptics, however, humans are incredibly attuned to whether something feels “right,” and virtual textures don’t always hit the mark.
Now, researchers at the USC Viterbi School of Engineering have developed a new way for computers to get that real texture — with the help of humans.
Called the preference-based model, the framework uses our ability to distinguish the fine details of textures as a tool to bring their virtual counterparts into focus.
The research has been published in IEEE Transactions on Haptics by three USC Viterbi Ph.D. computer science students, Shihan Lu, Mianlun Zheng, and Matthew Fontaine, as well as USC Viterbi Assistant Professor of Computer Science Stefanos Nikolaidis and USC Viterbi WiSE Gabilan Assistant Professor of Computer Science Heather Culbertson.
“We ask users to compare their feelings between the real texture and the virtual texture,” explained Lu, the first author. “The model then iteratively updates a virtual texture so that the virtual texture can match the real one in the end.”
According to Fontaine, the idea first emerged when they shared a course on haptic interfaces and virtual environments in fall 2019 taught by Culbertson. They were inspired by the art app Picbreeder, which can generate images based on a user’s preferences over and over again until they achieve the desired result.
“We thought, what if we could do this for textures?” Fontaine recalled.
Using this preference-based model, the user first feels a real texture, and the model randomly generates three virtual textures using dozens of variables, from which the user chooses the one that feels closest to the real thing. Over time, the model adjusts its distribution over these variables as it homes in on what the user prefers. According to Fontaine, this method has an advantage over recording and directly “playing back” textures, because there is always a gap between what the computer records and what we feel.
“You’re matching exactly what they’re feeling, rather than just mimicking what we can measure,” Fontaine said. “There is going to be an error in the way you recorded this texture, in the way you play it back.”
The only thing the user has to do is choose the texture that suits them best and adjust the amount of friction using a simple slider. Friction is essential to how we perceive textures, and it can vary from person to person. It’s “very easy,” Lu said.
Their work arrives just in time for the emerging market of specific and precise virtual textures. Everything from video games to fashion design incorporates haptic technology, and existing databases of virtual textures can be enhanced through this method of user preference.
“There is a growing popularity of haptic devices, which are becoming as popular as the laptop, in video games and fashion design and surgery simulation,” Lu said. “For example, with first-person video games, it will make players feel like they are really interacting with their environment.”
Lu has previously done other work on immersive technology, but with sound – specifically, making the virtual texture even more immersive by introducing corresponding sounds when the tool interacts with it.
“When we interact with the environment through a tool, tactile feedback is just one modality, one type of sensory feedback,” Lu said. “Audio is another type of sensory feedback, and both are very important.”
The texture search model also allows someone to pull a virtual texture from a database, such as the University of Pennsylvania’s Haptic Texture Toolkit, and refine it until they achieve the desired result.
“You can use the previous virtual textures others have created, and then based on those you can continue to tune them,” Lu said. “You don’t have to search from scratch every time.”
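This warm-start idea can be sketched as follows. Again, this is a hypothetical illustration under the same assumptions as before: a texture is a parameter vector, and the search is seeded with an entry from an existing database (such as the Penn Haptic Texture Toolkit) and confined to a small neighborhood around it, so the user only fine-tunes the difference. Function names and constants are illustrative.

```python
import random

def refine(initial_params, choose_fn, rounds=10, spread=0.2):
    """Fine-tune an existing virtual texture via user comparisons,
    starting from known parameters instead of a random texture."""
    mean = list(initial_params)
    for _ in range(rounds):
        # Three nearby variants of the current texture for the user to compare.
        candidates = [[random.gauss(m, spread) for m in mean] for _ in range(3)]
        best = choose_fn(candidates)        # stand-in for the human choice
        mean = [m + 0.5 * (b - m) for m, b in zip(mean, best)]
        spread *= 0.9                       # tighten around the preference
    return mean
```

Starting near a known-good texture with a small search spread is what lets the user skip the early, coarse rounds of the search.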
This is especially useful for virtual textures used in dental or surgical training, which need to be extremely precise, according to Lu.
“Surgical training is definitely a huge field that requires very realistic textures and tactile feedback,” Lu said. “Fashion design also requires a lot of precision in texture during development, before you go and make it.”
In the future, real textures might not even be needed for the model, Lu explained. The feel of some things in our lives is so intuitive that we can refine a texture to match that memory just by looking at a photo, without having the real reference texture in front of us.
“When we see a table, we can imagine how the table will feel once we touch it,” Lu said. “Using this prior knowledge we have of the surface, you can just provide visual feedback to users, and it allows them to choose what fits.”
Journal: IEEE Transactions on Haptics
Article title: “Preference-based texture modeling through interactive generation and search”
Article publication date: