This new VR industry is awesome, right? It's exciting to see all the developments and evolutions as we, the early adopters, soak it all up and experience it first-hand. We engage and interact with all the new content in an entirely new way – a stark change from only a few short years ago. Everything needs to be reviewed from a fresh perspective so we can sharpen our understanding of the best way to proceed. We can draw from past conventions as we move on from the limited keyboard/mouse/gamepad input systems we have mastered industries with, but what we really need are standards and principles.
It will take years to know what the best user-experience strategy will be, with the likelihood of many varied and diverse solutions catering to specific niches. Rather than try to answer a lot of these questions now, I decided it was best to draw on the standards and principles we already have. With the understanding that UIs could take the form of modular 2D flat interfaces, I wanted to replicate a well-known standard with many communities supporting and contributing to it. That is why I chose Google's Material Design.
I now had everything I needed to begin building a UI kit that could work with VR, so I started to build Aframe Material Collection. As the name suggests, this UI kit was intended to target AframeVR and WebVR technologies. Now all I had to do was recreate the UI components designers are used to working with: radio selection, checkbox, text input, buttons, modals and toast messages, to name a few. So I got to work recreating the aesthetic and behavior of these components as closely as I could while maintaining the benefits of Material Design.
Material Ripple is a visual feedback technique that supplements things like haptic or audio feedback in UIs. It is a great way to offer a more accessible interface, not only to those with disabilities but also to those who prefer to ignore or disable the other feedback methods. This effect is usually achieved using 2D shapes – mostly a circle. Without technologies like CSS or SVG to create shapes, I decided to use 3D geometry to achieve the same effect. This approach has the caveat that the geometry must be rendered continuously on each frame. Later I would create a 2D renderer to solve this problem, but in the meantime I was eager to explore any optimizations I could make. I decided that rather than a circle geometry I could save on some vertices by using a hexagon shape instead. A hexagon seemed to be the lowest vertex count I could get away with that still had a striking aesthetic appeal. This shape would also feature later in other components for the same reasons.
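To see where the savings come from: a "circle" in 3D geometry is really an N-sided polygon rendered as a triangle fan, so a hexagon is just the 6-sided case. A minimal sketch of the vertex counts involved (function name is illustrative, not from Aframe Material Collection):

```javascript
// Sketch: a regular N-gon built as a triangle fan, illustrating why a
// hexagon (n = 6) is cheaper than a smooth circle approximation (n = 32+).
function ngonVertices(n, radius) {
  const verts = [[0, 0]]; // fan center
  for (let i = 0; i < n; i++) {
    const a = (i / n) * 2 * Math.PI;
    verts.push([radius * Math.cos(a), radius * Math.sin(a)]);
  }
  return verts; // n + 1 vertices, forming n triangles
}

const hexagon = ngonVertices(6, 1);  // 7 vertices total
const circle  = ngonVertices(32, 1); // 33 vertices for a smooth-looking circle
```

Since the ripple geometry had to be re-rendered every frame while animating, cutting the perimeter from 32 segments to 6 reduces the per-frame vertex work several-fold while keeping a deliberate, angular look.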
I created a series of buttons to match those usually found in material-based UI kits for the web. I built buttons with cursor interaction and animations that could easily be represented in several states – disabled, floating action and normal. Using standard techniques for capturing interaction in Aframe with raycasters, I was able to create buttons that work with traditional mouse input as well as cursor-based interaction with tracked motion controllers. Incorporating the ripple effect makes the buttons function pretty well.
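The state handling behind those buttons can be sketched as a simple mapping from state to material settings, applied when raycaster events fire. The state names, colors and hook shown here are assumptions for illustration, not the library's actual API:

```javascript
// Sketch: mapping button states to visual/material properties.
// State names and values are illustrative, not from the library.
const BUTTON_STYLES = {
  normal:   { color: '#2196F3', opacity: 1.0, interactive: true },
  fab:      { color: '#FF4081', opacity: 1.0, interactive: true },  // floating action
  disabled: { color: '#9E9E9E', opacity: 0.5, interactive: false }
};

function buttonStyle(state) {
  // Unknown states fall back to the normal style.
  return BUTTON_STYLES[state] || BUTTON_STYLES.normal;
}

// In A-Frame, a cursor/raycaster fires standard DOM-like events on the
// entity, so the style would be applied from handlers such as:
//   el.addEventListener('mousedown', () => {
//     if (buttonStyle(state).interactive) startRipple(el); // hypothetical helper
//   });
```

Keeping the state table separate from the event wiring is what lets the same button respond to mouse clicks and motion-controller cursor events identically.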
Form elements are important for capturing information from the user. These come in the form of radio (single selection), checkbox (multiple selection), drop-down menus with single/multi selection, and text input fields. A lot of these components achieve similar behavior, so I decided to only create radio, checkbox and switch inputs, leaving out drop-down menus and multi-select drop-downs as these usually require more clicks from the user.
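The behavioral overlap mentioned above comes down to one distinction: single selection is exclusive, multiple selection is independent. A minimal sketch of that core logic, separate from any rendering (function names are illustrative):

```javascript
// Sketch: the selection logic shared by these form components.
// Radios are exclusive; checkboxes (and switches) toggle independently.
function selectRadio(group, value) {
  // Picking one option deselects all the others.
  return group.map(opt => ({ ...opt, checked: opt.value === value }));
}

function toggleCheckbox(group, value) {
  // Each option flips on its own, leaving the rest untouched.
  return group.map(opt =>
    opt.value === value ? { ...opt, checked: !opt.checked } : opt
  );
}
```

A switch input reuses `toggleCheckbox` with a group of one, which is why those three components cover most form needs without drop-downs.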
Text input fields are important, and for this UI kit I had to recreate text input from scratch. Someone once told me that text input largely goes unnoticed if it's done properly. I wanted text input to behave exactly like the browser equivalent in terms of cursor selection/navigation and keyboard shortcuts for selecting/scrolling text.
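Recreating that "unnoticed" behavior means reimplementing rules browsers give you for free, like caret movement and word-jumping with modifier keys. A small sketch of the kind of logic involved, under the assumption of a simple space-delimited word rule (the real component would need fuller Unicode/punctuation handling):

```javascript
// Sketch: browser-like caret movement over a string. Ctrl/Alt+Arrow
// jumps by word; Home/End jump to the ends. Illustrative only.
function moveCaret(text, pos, key, ctrl) {
  if (key === 'ArrowLeft') {
    if (!ctrl) return Math.max(0, pos - 1);
    let p = pos;
    while (p > 0 && text[p - 1] === ' ') p--; // skip trailing spaces
    while (p > 0 && text[p - 1] !== ' ') p--; // skip to start of word
    return p;
  }
  if (key === 'ArrowRight') {
    if (!ctrl) return Math.min(text.length, pos + 1);
    let p = pos;
    while (p < text.length && text[p] !== ' ') p++; // skip current word
    while (p < text.length && text[p] === ' ') p++; // skip following spaces
    return p;
  }
  if (key === 'Home') return 0;
  if (key === 'End') return text.length;
  return pos; // unhandled keys leave the caret alone
}
```

Selection then falls out of the same functions: holding Shift keeps an anchor position fixed while the caret moves, and the selection is the span between the two.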
In order to optimize the UI rendering, I decided to take a slightly different approach to rendering the components. I realized that I could control when the objects rendered and eliminate unnecessary render calls while the interface is idle. I could also expose settings like FPS and resolution to optimize the rendered output even more – and to cater differently for mobile and desktop using these options. This also opened up opportunities to improve the UI, such as wrapping the rendered output around custom geometries; I used a curved plane in the examples.
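The idea of skipping idle frames and capping FPS can be sketched as a dirty-flag scheduler: interactions mark the UI dirty, and the render loop only draws when something changed and enough time has passed. Names here are illustrative, not the library's API:

```javascript
// Sketch: render-on-demand with an FPS cap. Interactions call
// invalidate(); the render loop consults shouldRender() each frame.
function makeScheduler(fps) {
  let dirty = true;       // render at least once on startup
  let last = -Infinity;   // timestamp of the last actual render (ms)
  return {
    invalidate() { dirty = true; },              // call on any UI change
    shouldRender(now) {
      if (!dirty) return false;                  // idle: skip the draw call
      if (now - last < 1000 / fps) return false; // respect the FPS cap
      last = now;
      dirty = false;
      return true;
    }
  };
}

// Usage sketch: a mobile profile might use makeScheduler(30) and a lower
// render resolution, while desktop uses makeScheduler(60) at full size.
```

Because the UI is rendered to a texture rather than drawn live each frame, that texture can then be mapped onto any geometry, which is what makes the curved-plane presentation essentially free.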
Yoga Layout Engine
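Yoga is Facebook's cross-platform flexbox layout engine, and the idea of driving the UI layout with it is to compute each component's box (position and size) flexbox-style before rendering. As a rough illustration of what such an engine does – this is a hand-rolled sketch of a single row with `flex-grow`, not the yoga-layout API:

```javascript
// Sketch: distributing width along a row the way a flexbox engine such
// as Yoga does with flex-grow. Illustrative only.
function layoutRow(containerWidth, children) {
  const fixed = children.reduce((s, c) => s + (c.width || 0), 0);
  const grow  = children.reduce((s, c) => s + (c.flexGrow || 0), 0);
  const free  = Math.max(0, containerWidth - fixed); // space left to share
  let x = 0;
  return children.map(c => {
    // Fixed-width children keep their size; flexible ones split the
    // free space in proportion to their flexGrow value.
    const w = (c.width || 0) + (grow ? ((c.flexGrow || 0) / grow) * free : 0);
    const box = { x, width: w };
    x += w;
    return box;
  });
}
```

A real engine like Yoga generalizes this to nested containers, both axes, wrapping, margins and padding, which is exactly the bookkeeping you don't want to hand-write for every UI panel.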
Passionate about VR and human-computer interaction. I previously worked as CTO for Sensum – an emotional research company focused on correlating biometric data from physiological responses with emotions. I have been very passionate about developing in VR ever since I presented with Sensum at CES in 2015 with a VR-based demo and one of the Oculus staff members came to try it out. I am eager to do my part to push this industry forward towards the next evolution of computers and how we use them.
Always thought of VR as one of the greatest things you can do with a computer. I have a great passion for technology and automation, and work in R&D as a technical manager. I also love making things come to life in the real world and in VR, and I spend a lot of time modelling and coding machines and animations. I love creating things and making things better, and I'm excited to help build the community and the tools to bring people together and share all the craziness they can think of.
VR is cool… I mean really cool! I began my journey with virtual reality almost 3 years ago at Sensum – an emotional research company (which is also when I met Shane) – by implementing their biometric data feeds into the Unity engine. I created a VR experience that would adapt its gameplay features and difficulty based on the player's biometric response. Since then I have worked on multiple VR projects and am currently waist-deep in all things AR and VR! I am truly excited to see how The Expanse adds to the social VR community and cannot wait to see what you all come up with in the editor.
I started getting into VR in 2016 thinking I was jumping into the next level of gaming. I didn't expect that a year later, I would be hosting and performing in events for users from around the world in social VR. VR has proven to be much more than a gaming platform and is bringing people together in ways we once thought possible only in science fiction. As a power user, I've been smack-dab in the middle of this new frontier for human interaction, experiencing the social dynamics between users in the metaverse first-hand. While I still play VR games, hanging with my friends, creating and hosting events, and performing improv comedy or music in VR for the community have become my main activities. I've had time to really grow in VR, learning the ins and outs of event production and even dabbling in asset creation for avatars to represent myself and venues to perform in. I'm excited to bring my experience and insight to the Expanse and can't wait to see how this new platform develops.