One of the goals for the project was to implement a form of mixed reality. Mixed reality is the idea of combining the real world with the virtual world: walking around a VR space whose walls are in the same location as the real walls is one form of it. In our case we wanted to bring the airplane door into the virtual space so that trainees would actually have to hold on to a physical door handle, reinforcing the act of opening the door and building muscle memory; it is, after all, a training simulation.

Research started with figuring out how the HTC Vive controllers work and how they are tracked within Unity projects using the VRTK toolkit. The Vive website has an extensive article on the inner workings of the controllers (link), outlining the guidelines for developers. The SteamVR plugin handles all of the tracking and correctly displays the controllers in Unity. Therefore, getting the position and orientation of a controller is as easy as retrieving the Transform component from the controller object. No extra work.

I explored a few options for tracking other objects in virtual space. This could be done with the Vive controllers themselves or with third-party tracking sensors. However, the latter would require additional hardware and another toolkit next to the already in-use SteamVR and VRTK toolkits, which is unnecessary overhead. With the available resources I went with the first approach: using the Vive controllers. The Vive offers two tracking options: the Vive controllers and the Vive Trackers (as pictured above). The controllers are mainly used for the player's interaction in the virtual space, but the Trackers have the benefit that they can be mounted on any object, making that object trackable in VR. At the time of writing, the Trackers are sadly not yet available to consumers, so we had to make do with two Vive controllers: one for the player and one to track an object.
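Reading a controller's pose really is just reading its Transform. A minimal sketch of what that looks like, assuming a SteamVR-style setup where the tracked controller GameObject is assigned in the Inspector (the class and field names here are illustrative, not from the project):

```csharp
using UnityEngine;

// Minimal sketch: SteamVR keeps the controller GameObject's transform
// in sync with the physical controller every frame, so reading the pose
// is just reading that transform. No extra work is needed.
public class ControllerPose : MonoBehaviour
{
    // Assign the tracked controller object in the Inspector (illustrative name).
    public Transform controller;

    void Update()
    {
        Vector3 position = controller.position;    // world-space position
        Quaternion rotation = controller.rotation; // world-space orientation
        Debug.Log("Controller at " + position + ", facing " + (rotation * Vector3.forward));
    }
}
```

The same read works for a controller strapped to an object instead of held by the player, which is what makes the object-tracking approach below possible.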
To create the illusion that a real object is also present in VR, its position, shape and scale all have to match for it to feel real. For the example I used a bottle of turpentine that had a fairly generic shape, and recreated that object in VR, trying to match its dimensions. One problem arose: the player needs their hands free to pick up the bottle, and the only free hand is the one not being tracked. With the headset on, the player would not know where his or her hand is. So a new question took priority: how can we show the player's hand position in VR while keeping the hands free? During research into the Manus VR gloves, a picture of an early development build of the gloves came up in which Vive controllers were attached to the user's forearms. This gave us a direction to experiment with. Over the weekend I created a simple controller mount from scraps of foam and velcro, and developed a scene in Unity containing a table, with the same dimensions as the table in the VR room, and a bottle, to which the other controller is mounted. Now the position of the user's hand is approximately tracked while the hands stay free, and the object is tracked as well. That is all there is to it: since the controllers are already tracked, the virtual objects automatically end up in the same positions as their real counterparts. The picture above shows the resulting product. The user no longer has access to any of the buttons on the controller, but has full use of their hands. Given that the human brain is excellent at accepting some other appendage as your arm, simply rendering a sphere in the headset where your arm is turns out to be enough.
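The forearm mount means the controller's pose is near the hand but not exactly at it. One way to sketch the approximation, assuming a fixed offset from the mount point to the palm along the controller's forward axis (the offset value and all names here are illustrative guesses that would need per-user calibration, not measurements from the project):

```csharp
using UnityEngine;

// Sketch: approximate the hand position from a controller strapped to
// the forearm by projecting a fixed distance along the controller's
// forward axis. The sphere proxy stands in for the hand, relying on
// the brain's willingness to adopt it as the user's own arm.
public class ForearmHandProxy : MonoBehaviour
{
    public Transform forearmController; // controller mounted on the forearm
    public Transform handProxy;         // e.g. a simple sphere shown in the headset
    public float palmOffset = 0.25f;    // metres from mount to palm (illustrative)

    void Update()
    {
        // Place the proxy where the hand roughly is, in front of the mount.
        handProxy.position = forearmController.position
                           + forearmController.forward * palmOffset;
        handProxy.rotation = forearmController.rotation;
    }
}
```

The bottle needs no such script: its controller is mounted directly on it, so parenting the virtual bottle model under the tracked controller object is enough.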
I asked a couple of students to try out the project with the bottle, and all of them mentioned that it works better than anticipated. Even though the users don't see a proper hand in VR, the moment they see that their hand is close to the bottle, they instinctively open their hand to grab it. It felt natural. With this conclusion I started researching interaction with more complex objects for mixed reality purposes; in the case of the KLM project: a door handle.