This is one of the courses in Coursera’s Virtual Reality Specialization. This course covers how to interact with objects in a 3D world in VR. Topics such as movement and grabbing objects are covered in this course. Implementations of both of these are also shown in Unity.
Week 1: Interaction in Virtual Reality
This course starts off by discussing the different types of interaction people use with computers, software, and games, and how those interactions compare to how people interact with things in the real world. The lectures explain the differences between “active interaction” and “passive interaction.” This section also discusses what affordances are and gives examples of affordances in the real world. A handle on a door, for example, is an affordance showing a user how to interact with the door.
One really good point that the lecturers bring up is the idea of testing your software and games with people who are different from you. If you are a young person, perhaps you should test your application with older users, as they may have a different perspective than you do. Another consideration is having people with disabilities test your software so you can make sure your programs are accessible to all users.
The next part of this week covers the different ways to interface with computers, or, in the case of this course, a virtual reality world. Various types of controls are discussed, including head-mounted displays, motion controllers, and even full-body scans that could be done with hardware such as the Microsoft Kinect. Two specific types of controls caught my attention: microphone control and heart rate monitoring. Speech recognition is important in computer assisted language learning and could be used in conjunction with VR to create some immersive experiences. However, accurate speech recognition is very difficult. The lecturer also explains heart rate monitoring and its uses in therapy.
This week ends by showing some free template files that can be downloaded from the Unity Asset Store. These template files contain example games and example code for implementing VR in Unity.
The discussion of the different types of interfaces was the most helpful part of this week, as learning about them gave me many ideas for how VR could be used in non-gaming environments. Additionally, showing examples of VR templates from the Unity Asset Store was very helpful, because using premade scripts and prefabs from those projects can save me a lot of time when building my VR projects.
Week 2: Navigation in VR
Week two discusses navigation in VR. The most interesting topic covered in this section was determining body movement when using a device that only tracks head movement. The lectures give an example of using machine learning to learn how a person’s head moves when they walk. Using that data, a user can walk in place and the device will detect that the user is “walking.” The Unity Asset Store has a package that implements this in Unity. It allows a user to walk in place and sends that data to Unity to move forward in a Google Cardboard app. I thought this was the most interesting part of this section, as it gave me many ideas for future VR apps that use simple mobile devices and no controllers.
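The walk-in-place idea can be illustrated with a simple head-bob detector. This is only a minimal sketch of the general technique, not the Asset Store package’s actual implementation; the threshold and speed values are made up, and a real package would use smarter filtering (or the learned model the lectures describe).

```csharp
using UnityEngine;

// Hypothetical sketch of walk-in-place locomotion: when the headset's
// vertical position oscillates enough between frames, treat the user
// as "walking" and move the rig forward along the gaze direction.
public class WalkInPlaceMover : MonoBehaviour
{
    public Transform head;             // the tracked HMD/camera transform
    public float bobThreshold = 0.02f; // metres of vertical movement per frame (made-up value)
    public float speed = 1.5f;         // forward speed in m/s (made-up value)

    private float lastHeadY;

    void Start()
    {
        lastHeadY = head.position.y;
    }

    void Update()
    {
        // Vertical head movement since the last frame.
        float delta = Mathf.Abs(head.position.y - lastHeadY);
        lastHeadY = head.position.y;

        if (delta > bobThreshold)
        {
            // Move along the horizontal component of the gaze direction
            // so looking slightly down doesn't push the player into the floor.
            Vector3 forward = head.forward;
            forward.y = 0f;
            transform.position += forward.normalized * speed * Time.deltaTime;
        }
    }
}
```

A naive per-frame threshold like this is jittery in practice; that is exactly why a prebuilt, tuned package is worth the download.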
This section also covers teleporting in VR and shows how to implement teleporting in Unity just by looking at a specified teleport pad.
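The gaze-to-teleport mechanic described above can be sketched as a raycast from the camera plus a dwell timer. This is an assumed approach, not the course’s exact script; the tag name "TeleportPad" and the dwell time are made up.

```csharp
using UnityEngine;

// Sketch of gaze-based teleporting: raycast from the camera; if the
// user looks at an object tagged "TeleportPad" long enough, move the
// player rig to that point.
public class GazeTeleporter : MonoBehaviour
{
    public Transform player;     // the player rig root to move
    public float dwellTime = 2f; // seconds of staring before teleporting (made-up value)

    private float gazeTimer;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit) && hit.collider.CompareTag("TeleportPad"))
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellTime)
            {
                // Keep the player's current height; move to the pad.
                Vector3 target = hit.point;
                target.y = player.position.y;
                player.position = target;
                gazeTimer = 0f;
            }
        }
        else
        {
            // Reset the timer whenever the gaze leaves the pad.
            gazeTimer = 0f;
        }
    }
}
```

The dwell timer matters on controller-free devices like Google Cardboard, where staring is the only “click” available.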
Week 3: Interacting with Objects in VR
This week goes over how to interact with objects in VR. The lectures cover different ways to interact with objects, such as using VR glove controllers and the controllers found on devices such as the Oculus Rift. One thing that the lectures also mention is that when having virtual hands in VR, the developer should not make the hands look too masculine or too feminine, as it could break the immersion of the player. On the other hand, if the player is supposed to be playing a specific character in the virtual world and not themselves, then stylizing the hands a certain way may add to the immersion of the game or application.
This week also contains a section about how physics works in VR. The lectures cover what rigidbodies, physics materials, and colliders are. There are also videos on how to implement all of these things in Unity. The lecturers also provide a link to NewtonVR, a package on the Unity Asset Store that makes it easy to implement grabbing in VR.
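To give a feel for how rigidbodies and colliders combine into grabbing, here is a minimal, hypothetical sketch. NewtonVR’s actual implementation is far more sophisticated (velocity-based throwing, per-device input, and so on); "Fire1" below is just a stand-in for whatever grab button the controller exposes.

```csharp
using UnityEngine;

// Minimal grab sketch: the hand has a trigger collider; while the grab
// button is held, the last Rigidbody the hand touched is parented to
// the hand and made kinematic so the hand drives it directly.
public class SimpleGrabber : MonoBehaviour
{
    private Rigidbody held;
    private Rigidbody candidate;

    void OnTriggerEnter(Collider other)
    {
        // Remember the last grabbable rigidbody the hand touched.
        if (other.attachedRigidbody != null)
            candidate = other.attachedRigidbody;
    }

    void Update()
    {
        if (Input.GetButtonDown("Fire1") && candidate != null)
        {
            held = candidate;
            held.isKinematic = true;             // let the hand drive it
            held.transform.SetParent(transform); // follow the hand
        }
        else if (Input.GetButtonUp("Fire1") && held != null)
        {
            held.transform.SetParent(null);
            held.isKinematic = false;            // hand it back to physics
            held = null;
        }
    }
}
```

Parenting plus `isKinematic` is the simplest possible grab; it drops all momentum on release, which is one of the problems packages like NewtonVR exist to solve.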
Finally, the lectures cover animation controllers and animations in Unity. These are briefly discussed in previous lectures, but they are covered more in depth here. Animation triggers and how to trigger them using VR are covered in this section.
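Firing an animation trigger from code is a one-liner once the Animator Controller is set up. A small sketch (the trigger name "Open" is hypothetical and must match a trigger parameter defined in your Animator Controller; "Fire1" stands in for a VR controller button or gaze event):

```csharp
using UnityEngine;

// Sketch of driving an Animator trigger from input: pressing the
// button fires the "Open" trigger, causing the Animator Controller
// to transition into whatever state that trigger points at.
public class DoorTrigger : MonoBehaviour
{
    public Animator animator;

    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
            animator.SetTrigger("Open");
    }
}
```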
This week covered quite a lot of information and the topics covered may be a little bit difficult for new Unity users; however, the explanations and actual implementations in Unity were very helpful.
Week 4: Challenges in VR interaction and User Interfaces in VR
The final week of this course covers the challenges faced when designing interaction in VR. The lectures in this section discuss the differences between graphical interfaces in 2D and 3D. Gestures are also covered. The lecturers note that because we may be familiar with some gestures that we use on current technology, such as swiping or pinching and zooming, similar gestures can easily be implemented and learned in VR.
This section also includes an introduction to the Canvas system in Unity. One of the most helpful parts of this section was showing how to place Canvas elements in world space instead of rendering them as screen overlays.
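The world-space switch can be sketched in a few lines. A minimal example, assuming a standard Unity Canvas and a tracked camera transform (the distance and scale values are made up; world-space canvases are sized in world units, so they need to be scaled down dramatically):

```csharp
using UnityEngine;

// Sketch of converting a Canvas from a screen overlay into a
// world-space object positioned in front of the VR user, which is
// what VR needs because screen overlays do not exist in a headset.
public class WorldSpaceMenu : MonoBehaviour
{
    public Canvas canvas;
    public Transform head; // the VR camera

    void Start()
    {
        canvas.renderMode = RenderMode.WorldSpace;

        // Place the canvas two metres in front of the user, facing them.
        canvas.transform.position = head.position + head.forward * 2f;
        canvas.transform.rotation = Quaternion.LookRotation(head.forward);

        // Shrink it: a 1000-pixel-wide canvas at scale 0.002 is ~2 m wide.
        canvas.transform.localScale = Vector3.one * 0.002f;
    }
}
```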
The last section of the week covers prototyping. The lectures note that because interaction in VR is much different than in other mediums, prototyping in VR is very important.
Thoughts about this course
This course was really helpful for learning about all the different ways to interact with things in VR. The lectures included both theory and practical implementations of interactions in Unity, including teleport-based movement and grabbing objects. Interaction is something that most VR developers will need to implement at some point, and the things learned in this course will surely help when developing those interactions.