Image credit: University of Chicago
The next big thing in VR may not be higher resolution or more immersive sound, but an experience enhanced by physical sensations or moving parts that trick your senses into mistaking virtual for reality. Researchers at SIGGRAPH, from Meta to international student groups, showcased their latest attempts to make VR and AR more compelling.
The conference on computer graphics and interactive techniques is happening this week in Los Angeles, and everyone from Meta to Epic to universities and movie studios is showing off their wares.
It’s the 50th SIGGRAPH, so a disproportionate amount of the event was dedicated to retrospectives and the like, even though the exhibit hall was full of the latest VFX, virtual production, and motion capture hardware and software.
In the “emerging technologies” hall (or cave, as the darkened, black-draped room felt), dozens of experimental projects at the frontiers of VR seemed to describe the state of the art: visually impressive, but with immersion almost entirely dependent on those visuals. What could be done to make the illusion more complete? For many, the answer lies not in the virtual world, with better sound or graphics, but in the physical one.
Meta’s varifocal VR headset literally changes your perspective
Meta was a big presence in the room, with its first demonstration of two experimental headsets, dubbed Butterscotch and Flamera. Flamera has an interesting approach to “passthrough” video, but it’s Butterscotch’s “varifocal” approach that really changes things in the virtual world.
VR headsets generally consist of a pair of tiny, high-resolution screens behind a stack of lenses that make them appear to fill the wearer’s field of vision. This works quite well, as anyone who has tried a recent headset can attest. But there is a flaw: bringing a virtual object closer doesn’t really let you see it better. It stays at the same resolution, and while you might be able to make out a bit more, it’s nothing like picking up an object and inspecting it closely in real life.
Meta’s Butterscotch prototype, which I tested and grilled the researchers about, replicates that experience by tracking your gaze inside the headset and, when your gaze falls on something nearby, physically sliding the screens closer to your eyes. The result is shocking to anyone accustomed to the poor approximation of “looking closely” at something in VR.
The screens only move over a span of about 14 millimeters, a researcher at the Meta booth told me, but that range is more than enough not just to create a clearer image of the nearby element (remarkably clear, I must say), but to let the eyes more naturally adjust their “accommodation” and “convergence”, the way they naturally focus on and track objects.
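The reason a few millimeters of travel matters so much can be sketched with the thin-lens equation: moving the screen slightly relative to the lens moves the virtual image from effectively infinity to arm’s length. The focal length and distances below are illustrative assumptions, not Meta’s actual optics.

```python
# Varifocal sketch via the Gaussian lens formula 1/f = 1/d_o + 1/d_i.
# For a virtual image at distance v (d_i = -v): 1/d_o = 1/f + 1/v.
# All numbers are illustrative assumptions, not real headset specs.

def display_distance(f_m, virtual_m):
    """Screen-to-lens distance that places the virtual image at virtual_m."""
    return (f_m * virtual_m) / (f_m + virtual_m)

f = 0.040  # hypothetical 40 mm lens focal length
far = display_distance(f, 2.00)    # object ~2 m away in the scene
near = display_distance(f, 0.25)   # object held up close, ~25 cm
travel_mm = (far - near) * 1000
print(f"screen travel: {travel_mm:.1f} mm")  # a few millimeters
```

Even with these made-up optics, the required travel comes out to a handful of millimeters, consistent in scale with the roughly 14 mm span the researcher quoted; a real headset would drive this from the eye tracker’s vergence estimate.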
While the process worked extremely well for me, it failed miserably for one attendee (who I suspect was someone senior in Sony’s VR department, though his reaction seemed genuine), who said the optical approach was at odds with his own vision impairment and that turning the feature on actually made everything look worse. It is, after all, an experiment, and others I spoke to found it more compelling. Unfortunately, the moving screens would likely be impractical in a consumer model, making it quite unlikely that the feature will come to the Quest in the near future.
Rumble (and tumble) packs
Elsewhere on the demo floor, other, far more outlandish physical methods of tricking your perception were being tested.
One Sony researcher takes the concept of a rumble pack to the extreme: a controller mounted on a kind of baton, inside which motors drive a weight up and down to shift the center of gravity or simulate movement.
As with the other haptic experiments I tried, it doesn’t feel like much outside the context of VR, but paired with a visual stimulus it’s very compelling. A set of rapid-fire demos first had me opening a virtual umbrella: obviously not a game you’d want to play for long, but a great way to show how a change in center of gravity can make an object seem real. The movement of the umbrella opening felt right, and the weight (at its farthest limit) made it feel like the mass had actually moved to the end of the handle.
Next, another baton was attached perpendicular to the first, forming a gun-like shape, and the demo had me blasting aliens with a shotgun and a pistol, each with a distinct “feel” owing to how the weights were programmed to move to simulate recoil and reloading. Finally, I took a virtual lightsaber to a nearby monster, which provided tactile feedback when the beam made contact. The researcher I spoke to said there are no plans to commercialize it, but the response has been very positive and they are working on improvements and new applications.
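The center-of-gravity trick behind the umbrella and gun demos is just a weighted average: sliding an internal mass along the handle moves the combined center of mass, which is what your hand actually feels. The masses and lengths below are made-up illustrative values, not the dimensions of Sony’s device.

```python
# Combined center of mass of a baton plus a movable internal weight:
# COM = (m_body*x_body + m_w*x_w) / (m_body + m_w).
# All masses and positions are hypothetical illustrative values.

def center_of_mass(m_body, x_body, m_w, x_w):
    return (m_body * x_body + m_w * x_w) / (m_body + m_w)

m_body, x_body = 0.30, 0.15   # 300 g baton, COM mid-handle (meters)
m_w = 0.20                    # 200 g movable weight

com_near = center_of_mass(m_body, x_body, m_w, 0.05)  # weight at the grip
com_far  = center_of_mass(m_body, x_body, m_w, 0.30)  # weight at the tip
print(f"COM shift: {(com_far - com_near)*100:.1f} cm")  # COM shift: 10.0 cm
```

Even a modest slider travel produces a centimeter-scale shift in the felt balance point, which is why an opening umbrella or a reloading shotgun can feel so distinct.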
An unusual and clever take on this idea of shifting weight was SomatoShift, exhibited by researchers from the University of Tokyo. There I was fitted with a powered armband carrying two spinning gyroscopes, opposed to each other but able to change orientation, producing a force that either resisted or accelerated the movement of my hand.
The mechanism is a little hard to grasp, but rotating weights like this essentially want to stay “upright”, and by changing their orientation relative to gravity or the object they are mounted on, that tendency can be harnessed to produce fairly precise force vectors. The technology has been used in satellites for decades, where such devices are known as “reaction wheels”, and the principle worked here too, resisting or assisting my hand as it moved between two buttons. The forces involved are small but noticeable, and one can imagine clever use of the gyros creating all sorts of subtle but convincing pushes and pulls.
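The size of those "small but noticeable" forces can be estimated from the gyroscopic relation: tilting a wheel of angular momentum L at gimbal rate ω produces a torque of magnitude L·ω. The flywheel dimensions and speeds below are illustrative guesses, not SomatoShift’s actual specifications.

```python
import math

# Torque from tilting a small spinning flywheel, the gyroscopic effect
# a wrist-worn device like SomatoShift exploits. Values are illustrative.

def gyro_torque(mass_kg, radius_m, spin_rad_s, gimbal_rad_s):
    """|tau| = L * omega_gimbal, with L = I * omega_spin for a solid
    disc (I = 1/2 m r^2) and the gimbal axis perpendicular to spin."""
    inertia = 0.5 * mass_kg * radius_m ** 2
    return inertia * spin_rad_s * gimbal_rad_s

# 50 g disc, 2 cm radius, spinning at ~10,000 rpm, tilted at 5 rad/s
tau = gyro_torque(0.05, 0.02, 10_000 * 2 * math.pi / 60, 5.0)
print(f"{tau * 1000:.1f} mN*m")  # tens of milli-newton-meters
```

Torques in the tens of milli-newton-meters are indeed small but perceptible at the wrist, matching the description of a gentle delay or assist rather than a hard shove.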
The concept was taken to an extreme a few meters away at the University of Chicago’s booth, where participants were strapped into a large backpack with a motorized weight that could slide up and down rapidly. This was used to create the illusion of a higher or lower jump: shift the weight at the right moment and you seem lighter, accelerated upward, or alternatively shoved downward if you make a mistake in the associated jumping game.
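The jump illusion works by Newton’s third law: accelerating the backpack’s mass downward momentarily pushes the wearer upward, and vice versa. The mass, acceleration, and wearer weight below are hypothetical numbers chosen to show the scale of the effect, not measurements of the Chicago rig.

```python
# Apparent weight change from accelerating a backpack mass
# (Newton's third law): reaction force on the wearer = m * a.
# Mass, acceleration, and body weight are hypothetical values.

def reaction_force(mass_kg, accel_m_s2):
    return mass_kg * accel_m_s2

# A 2 kg slider driven downward at 15 m/s^2 pushes the wearer up:
boost_n = reaction_force(2.0, 15.0)        # 30 N upward
percent = 100 * boost_n / (70 * 9.81)      # vs a 70 kg wearer's weight
print(f"{boost_n:.0f} N, ~{percent:.0f}% of body weight")
```

A brief few-percent change in apparent weight, timed to the takeoff or landing of a jump, is small in absolute terms but lands exactly when the body is paying attention to it, which is why the illusion reads as a higher or heavier jump.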
Our colleagues at Engadget wrote up the details of the technology ahead of its debut last week.
While the bulky mechanism and narrow use case mark it, like the others, as a proof of concept, it shows that the perception of bodily movement, not just of an object or an appendage, can be affected by judicious application of force.
When it comes to the feel of holding things, current VR controllers also fall short. While the motion-tracking capabilities of the latest Quest and PlayStation VR2 headsets are nothing short of amazing, you never feel like you’re really interacting with the objects in a virtual environment. The Tokyo Institute of Technology team created an ingenious – and hilariously difficult – method to simulate the sensation of touching or holding an object with the fingertips.
The user wears four small rings on each hand, one for each finger except the pinky. Each ring carries a tiny motor on top, and from each motor hangs a tiny wire loop fitted around the pad of the fingertip. The positions of the hands and fingers are tracked by a depth sensor attached to the headset.
In a VR simulation, a tabletop is covered with a variety of cubes and other shapes. When the tracker detects your virtual hand crossing the edge of a virtual block, the motor spins a bit and pulls the loop taut, which feels surprisingly like something touching your finger!
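The trigger logic is conceptually simple: test each tracked fingertip against the bounding volumes of the virtual shapes and fire the motor while there is contact. The `Box` type and `actuate` callback below are hypothetical stand-ins; the real system’s tracking and motor control are of course more involved.

```python
# Sketch of collision-triggered fingertip haptics: while a tracked
# fingertip is inside any virtual box, drive the motor that pulls the
# wire loop. Box and actuate() are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class Box:
    lo: tuple  # (x, y, z) minimum corner, meters
    hi: tuple  # (x, y, z) maximum corner, meters

def inside(point, box):
    """Axis-aligned containment test for one fingertip position."""
    return all(l <= p <= h for p, l, h in zip(point, box.lo, box.hi))

def update_finger(tip_pos, boxes, actuate):
    """Call actuate(True) while the fingertip is inside any box."""
    actuate(any(inside(tip_pos, b) for b in boxes))

# Toy usage: one 10 cm cube sitting on the virtual tabletop
cube = Box((0.0, 0.0, 0.0), (0.1, 0.1, 0.1))
states = []
update_finger((0.05, 0.05, 0.05), [cube], states.append)  # touching
update_finger((0.50, 0.05, 0.05), [cube], states.append)  # clear
print(states)  # [True, False]
```

Running this per finger per frame is cheap, which fits the article’s point that the hardware, not the software, is the inexpensive and clever part of the demo.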
It all sounds very janky, and it certainly was, but the basic idea and sensation were worth experiencing, and the setup is clearly not expensive. Haptic gloves that can simulate resistance are few and far between, and quite complicated to boot (in fact, another researcher present had worked on just such a device, a more complex version built on a similar principle). A refined version of this system could be made for under $100 and still provide a basic but transformative experience.
SIGGRAPH and especially this hall was full of these and more experiences that straddled the border between the physical and the digital. While VR has yet to take off into the mainstream, many have taken that to mean they should redouble efforts to improve and expand it rather than abandon it as a dead platform.
The conference also showed a large overlap between gaming, VFX, art, virtual production and several other domains. The minds behind these experiments and the more established products on the show floor clearly feel that the industry is converging as it diversifies, and a multi-modal, multi-medium, multi-sensory experience is the future.
But it’s not inevitable – someone has to make it. So they go to work.