
Owlchemy Labs' VR Hand Tracking Concept Demo Has Me Excited For The Potential Future Of VR MMOs

By Joseph Bradford | Category: Features

MMORPGs and Virtual Reality games have one major thing in common: they transport us to new worlds. VR does this by tricking your brain into believing you are physically in a new space, while MMORPGs do it through world-building, story, and the social relationships you build across the experience.

VR MMORPGs have been hard to come by - at least good ones - yet they aim to bring those two strengths together, transporting the player to a world they truly inhabit. VR, however, has a massive barrier to entry: cost.

Even once you get past the cost factor, the act of interacting with the world around you can frequently feel gimmicky or one-note, especially when your interactions are tied to a game controller that serves only a few in-game functions.

However, during GDC 2023 earlier this year, Google-owned Owlchemy Labs showed off a demo of the hand tracking tech they are working on, and the possibilities of how this could apply to MMOs kept firing off in my head as I walked away from our meeting.

The demo itself was built using Owlchemy's most notable game, Job Simulator - though it should be stressed that this doesn't mean we'll be seeing a new Job Sim game. Rather, it's a tech demo using the existing environment. That being said, the demo, running on an Oculus Quest 2, was transformative in practice.

"We really see hand tracking as the future of VR, right?" Owlchemy Labs' CEOwl Andrew Eiche told me in an interview at the Game Developer's Conference in March. Part of the reason why Eiche seemingly thinks this as it breaks down some of the barriers that keep players out of VR - the cost of the device and its accessories being one, as well as the simplicity of controlling the environment.

"The big thing here is if you want a billion people to use it, you're gonna have to get rid of the controllers."

Eiche likens the stage where VR currently finds itself to the BlackBerry era of cell phones: on the cusp of Apple swooping in and showing the world that a non-physical keyboard could work on phones, a move that would go on to shape the modern world forever.

"You'll hear a lot of people say, 'Oh, you can't get rid of the controllers, that's going to ruin this.' How many people did you hear when Apple was like, 'We're going to have this keyboard on the phone, on the screen,' [say] 'We'll never be able to do that. I'd want buttons.' And now there are next to no phones that have physical buttons. And we've just accepted that. And that's a lot of where we see this, right? The people who play video games, they understand these controllers inherently. But that's not a billion people. There are a lot of people who have no clue."

It gets worse for people who just aren't used to using controllers: you effectively blindfold them the minute the headset goes on their head. Hand tracking removes those barriers, in theory. When implemented well, it lets players simply interact with a VR scene much the way we would in real life.

This was perfectly on display in the Job Simulator tech demo I tried at GDC. Donning the Quest 2 headset, I found myself behind the counter of a cafe - a complicated cappuccino machine in front of me, a jukebox behind me, and cups and more surrounding me. Interacting with the world was as easy as raising my hands and grabbing something, much like I would in real life.


We've all seen those movies where characters interact with virtual screens and spaces using just their hands, right? Robert Downey Jr. in Iron Man comes to mind immediately, with Tony Stark interacting with Jarvis through a series of gestures, swipes, grips, and more while doing his special brand of science. Hand tracking in VR feels like the first step on the path to that reality, and while the fate of the world didn't hang on me discovering a new element like in Iron Man 2, I really did want to make the perfect cup of coffee in this VR setting.

First, though, I had to input my name for my nametag. While most VR apps would have you poke a virtual keyboard with virtual sausage fingers, or aim and pull a trigger to interact with it, here Owlchemy has you squeeze the letter itself.

This was weird at first, but rather satisfying as I went on, inputting my name for all to see. The major reason it felt odd at first was that I was so used to the sensation of interacting with something in a game world coming down to the vibration or haptics of a controller. With an empty hand, that feedback just isn't there. I felt this more keenly the first time I went to pick up a coffee cup or adjust a dial on the coffee maker in front of me: there were no game haptics to tell my brain I was interacting with something clearly in front of me.

Owlchemy Labs Job Sim Handtracking Demo

The gesture to use these items was much like a crab claw, and as I closed the claw around the cup or dial, my body became the haptics. It was weird at first, but it slowly started to click. The more I worked around my station and interacted with my surroundings, the less I felt the absence of a controller. Instead, it simply felt natural.

Eiche calls this "self haptics," and it's a design element the team clearly thought through when implementing the hand tracking for the demo.
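
Owlchemy didn't share the implementation details behind its grab detection, but the general idea is straightforward to sketch. Below is a minimal illustration, assuming a hand tracking runtime that reports 3D joint positions every frame; the joint names, threshold values, and helpers are all hypothetical, not Owlchemy's actual code:

```python
import math

# Assumed tuning values - a real system would calibrate these per hand and user.
CLAW_CLOSED = 0.035  # metres: thumb-to-fingertip gap that counts as a closed claw
REACH = 0.10         # metres: how far from the palm an object can be and still be grabbed

def dist(a, b):
    """Euclidean distance between two (x, y, z) joint positions."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def claw_closed(joints):
    """The 'crab claw' gesture: thumb tip closed against the index and middle tips."""
    return (dist(joints["thumb_tip"], joints["index_tip"]) < CLAW_CLOSED and
            dist(joints["thumb_tip"], joints["middle_tip"]) < CLAW_CLOSED)

def try_grab(joints, objects):
    """Return the nearest in-reach object when the claw is closed, else None."""
    if not claw_closed(joints):
        return None
    in_reach = [o for o in objects if dist(joints["palm"], o["position"]) <= REACH]
    return min(in_reach, key=lambda o: dist(joints["palm"], o["position"]), default=None)
```

The "haptic" half of the trick costs nothing extra: the very event that closes the claw is your own fingers meeting, so your body supplies the confirmation a controller would normally deliver as a rumble.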

"When we need discrete input we use what's called self haptics," Eiche explained. "That's by the keyboard was built like the little bubbles. Because in a keyboard, you need to know I've almost pressed the key [and] I've pressed the key. And so by pinching, you're touching yourself, and that gives a very strong indication of 'I am in the process of selecting,' and 'I've now selected.' And that's why that keyboard works so well. 

Eiche mentions that, at first, the team was convinced not everything would work well using this system of self haptics, especially interactions the controller medium already emulated so well. A trigger, whether it be on a gun or the humble squirt bottle, was one area where the team wasn't sure how well it would work.

"So when we really, really need that discrete input, we use a self haptic. The triggers are actually another great example of that. So the squirt bottle works so well because you feel your own finger touch against your knuckles. So you're like, 'Oh, this is great!' And that's one of those interactions that we were a hundred percent convinced would be bad. Because the triggers are great on controllers - that's what the controllers are built for. So triggers are going to feel bad in hand tracking because it feels so good on [a] controller. And so we built the squirt bottle and we were like, 'No way, this is awesome.'"

Listening to Eiche speak, it almost feels as though hand tracking has liberated the team to find new ways to create meaningful, realistic interactions in their game environments. And honestly, in my ten or so minutes in the VR demo, figuring out how the coffee maker worked (a simple turn of a dial could take it from brewing an outstanding cup of joe to motor oil), I realized I wasn't fighting with how to interact with the environment. Instead, I was trying to solve the puzzle of the game itself.

Everything felt so natural, so good. And it made me start to think about the VR experiences I crave the most: interactable, MMO-scale worlds. 

Games like Zenith: The Last City marked a turning point for VR MMOs in my mind. It wasn't perfect, and the tracking definitely left a lot to be desired when I played it around launch, but it gave me a vision of what I want in a VR game: an interactable world with my friends. 

I want to be transported to the world itself through its gameplay and the way I interact with objects and people in that game world. Combat in these games usually came down to swinging my controller around, never really feeling like I connected with anything, or firing a gun using those well-built Oculus triggers. But I always felt a bit removed from the experience - almost as if I were controlling someone else, as there just wasn't a good way to convince me that I was actually holding that sword.

I felt disconnected, so to speak, from my avatar, not least because the tracking never felt truly accurate either.

This is a problem Owlchemy told me they are trying to solve as well. While they didn't announce an MMO or even a multiplayer VR game in our interview, it's something they are looking at for the future. Eiche mentioned, too, that the "magical" part of multiplayer is proximity.

"We are building multiplayer, and I can talk about what's magical. So the most magical thing that we have found in multiplayer is proximity. And it's really, really difficult to do. But we are working on this problem, which is if you think about a lot of multiplayer and proximity, there's very little overlap, right? We'll put it in MMO terms. In World of Warcraft, you have your character, and things go out into the world. And that's it. In VR, we can have our hands in the same physical space. We can be planting seeds, we could be playing piano - we could be doing all these things where we're occupying this intersectional, close space and I can hand you something."

This is exactly how Zenith felt to me in those early days, yet without that natural interaction. One example Eiche gives is a moment during testing when a player handed someone a doughnut and the other player just bit it out of their hand. While that might not sound impressive, I can attest: organically interacting with another player in a meaningful way in VR space is peak VR.

I remember going to the Nvidia booth back at PAX West 2015 to try out the game that would eventually become Raw Data. Former associate editor Shank and I were at the booth playing the demo on a team together and, instinctively, the first thing we did when we got into VR space was wave to each other. This ended up sparking an inside joke between the two of us - in every game we played from that point forward, the first thing we would do is try to wave to each other - but the main reason the moment sticks out to me is just how natural it all felt.

You get that in Zenith as well - you can just raise your arm and wave at each other. But from there, interaction with other players and the world around you happens through pop-up menus, drop-downs, or just using your weapons on them. There are gesture controls in many VR applications, but so many of them feel unnatural, forced, or over-designed. With Owlchemy's implementation of hand tracking, while there is definitely a design aspect to its gestures and controls, they are rooted in how we naturally interact with our world.

The beauty, too, as Eiche puts it, is that hand tracking doesn't stop a game developer from also implementing controllers for those experiences that call for one. It adds another layer of how you can interact with the world around you. I'm imagining running around (yes, running) a world with friends, jumping into a dungeon together. Using hand tracking, we would interact with a giant puzzle on the wall, moving glyphs around by physically grasping them, moving them up and down the wall, turning them over in our hands, and more.

The puzzle completes, opening a door to a monster that needs to be dealt with. Instead of pulling out a controller and pretending it's a sword, I've got a tracked object in my hand that better emulates the feel of swinging a sword. My friends all have their own weapons, from maces to guns, using a mix of controllers and tracked objects to fight with.

This is what I kept thinking of as I walked out of the demo that day in San Francisco: that dream of physically feeling transported to a new world in a way that only VR truly can manage, and then playing the game using myself as the controller, interacting with the world around me in a meaningful, natural way. This was the dream that a simple Job Sim tech demo embedded in my brain - all because it felt great to virtually make a cup of coffee, or squirt a spray bottle, using nothing but my hands.

It's a gaming future that cannot get here soon enough.



Joseph Bradford

Joseph has been writing or podcasting about games in some form since about 2012. Having written for multiple major outlets, including IGN and Playboy, Joseph started writing for MMORPG in 2015. When he's not writing or talking about games, you can typically find him hanging out with his 15-year-old or playing Magic: The Gathering with his family. Also, don't get him started on why Balrogs *don't* have wings. You can find him on Twitter @LotrLore