A funny thing has been happening for the last few days. There is a link farming or SEO or whatever site out there that is currently doing a deep clone of this blog (I’m obviously not going to provide a link here), but unlike all the other times that’s happened, these guys aren’t just straight-up copying, they’re running the text through what looks like a game of Google Translate Telephone (or a simple synonym replacer, but I’m going to pretend it’s the former). I’m guessing the reason is to obfuscate the source and throw off plagiarism detectors, but the result is unintentionally(?) hilarious.
For a fuller experience, here are two paragraphs from the first article I mentioned above:
“The quick reply is after all that this relies on your mannequin of the headset. However should you occur to have an HTC Vive, view the graphs in footage 1 and a couple of (the opposite headsets behave in the identical method, however the precise numbers differ). These figures present the display decision, in pixels / °, alongside two traces (horizontal and vertical, respectively) that undergo the middle of the suitable lens of my very own Vive. The purple, inexperienced and blue curves present the decision for the purple, inexperienced and blue main colours respectively, this time not on the idea of my very own measurements, however by analyzing the show calibration knowledge measured for every particular person headset on the manufacturing unit after which saved within the firmware.
At this level it’s possible you’ll surprise why these graphs look so unusual, however for that you’ll have to learn the lengthy reply. Earlier than I’m going into that, I need to throw away a single quantity: proper in the course of the suitable lens of my Vive (on pixel 492, 602) the decision for the inexperienced shade channel is 11.42 pixels / °, each horizontal and vertical instructions. When you needed to cite a single decision quantity for headphones, that will be the one I used to be going to, as a result of it’s what you get once you have a look at one thing instantly in entrance of you and much away. As figures 1 and a couple of clearly present, no quantity can inform the entire story.”
Now, having posted this article, I am waiting with bated breath for what kind of hash their cloning bot will make out of it.
Apparently, there were good sales numbers for VR equipment prior to the holiday season, and therefore a host of new VR users are coming in just about now. This meta-post collects a bunch of stuff I’ve written (or presented) in the past that might be of interest to some of those new users. These questions/answers are not hardware-specific, meaning they apply to any current-generation VR system (Oculus Rift, HTC Vive, all the Windows Mixed Reality headsets, PlayStation VR, …), and go beyond basic tech questions such as “how do I plug this in, install drivers, …”.
How does VR actually work? (presentation at VRLA ’16 Expo, 25 minutes) In other words, how does a collection of screens and lenses and tracking sensors etc. create the illusion of a virtual space that feels real?
I heard I need to dial in my IPD, or Inter-Pupillary Distance. Do I need to go to (and potentially pay) an optometrist to have it measured, or can I do that myself at home? (This post is also referenced in the presentation I linked above.)
What are the problems with artificial VR locomotion, i.e., moving through a virtual space via a joystick/touchpad/button or other means that does not involve actually walking? (presentation at a VR meet-up, 40 minutes) (This presentation is also referenced in the presentation I linked above.)
There is one other issue for which I do not have a full article, but it’s quite important for new users: VR sickness (aka motion sickness, simulator sickness, …). Today’s VR headsets, at least the ones doing full head tracking (that means Rift/Vive et al., and not Gear VR, Oculus Go, Google Cardboard, …) should not cause VR sickness per se. These days, it is primarily caused by artificial locomotion in games or applications, as I explain in the second presentation I linked above.
The important message is: do not attempt to fight through VR sickness! If you try to stomach it out, it will only get worse. Stop using VR the moment you feel the first symptoms, take a long break, and then try again if you want to continue with the application/game that made you sick. If you try to power through repeatedly, your body might learn to associate sickness with VR, and that might cause you to get sick even when merely thinking about VR, or smelling the headset, or similar triggers. Just don’t do it.
That’s about it; now go ahead and enjoy your shiny new VR systems!
Want to Know More?
Here are a couple of other, more hardware-specific, topics:
How exactly does Oculus’s Constellation tracking system work? A deep dive into the Oculus Rift DK2’s camera tracking system — which works basically the same as the commercial Rift’s, in three posts:
Good news, everyone! This here blog, Doc-Ok.org, hit 1,000,000 (that’s one million) total page views early in the morning of Christmas Day, 12/25/2018, just about six years and four months after I started it.
I have no idea whether that’s something to be embarrassed or proud of, given the narrow range of topics I’m covering and the niche audience I’m targeting. Either way, it’s a nice, round number.
It’s been more than two years since I last posted set-up instructions for Vrui and HTC Vive, and a lot has changed in the meantime. While Vrui-5.0 and its major changes are still not out of the oven, the current release of Vrui, Vrui-4.6-005, is stable and works very well with the Vive. The recent demise of our CAVE, and our move towards VR headsets until we figure out how to fix it, have driven a lot of progress in Vrui’s set-up and user experience. The rest of this article contains detailed installation and set-up instructions, starting from where my previous step-by-step guide, “An Illustrated Guide to Connecting an HTC Vive VR Headset to Linux Mint 19 (“Tara”),” left off.
If you use a Linux distribution that is not Ubuntu-based, such as my own favorite, Fedora, or another desktop environment such as Gnome Shell or Cinnamon, you will have to make some adjustments throughout the rest of this guide.
This guide also assumes that you have already set up your Vive virtual reality system, including its tracking base stations, and that your Vive headset is connected to your PC via HDMI and USB (I will publish a detailed illustrated guide on that part soon-ish).
The Vrui VR development toolkit is distributed as tarballs (gzip-compressed tar archives) containing source code, documentation, other resources, and build scripts. Installing Vrui (and most other software distributed in source form) typically requires four steps:
Install required libraries (the list of libraries required to build Vrui can be found in its “Quick Installation Guide”).
Download and unpack the Vrui tarball.
Configure and build the Vrui software (via make).
Install the Vrui software (via sudo make install).
To simplify these steps, Vrui offers installation scripts for Ubuntu- or Fedora-based Linux distributions on its download page. The fastest way to download and run one of those scripts is by entering exactly the following sequence of commands into a terminal window (the dollar signs ($) indicate the terminal’s input prompt and must not be typed in):
$ wget http://idav.ucdavis.edu/~okreylos/ResDev/Vrui/Build-Ubuntu.sh
$ bash Build-Ubuntu.sh
If you are using a Fedora-based Linux distribution, replace “Build-Ubuntu.sh” with “Build-Fedora.sh” in the commands above. The script run in the last command will ask you for your user’s password to install prerequisite libraries, and then build Vrui. That will take a little while and produce lots of output, and at the end, if everything worked, you’ll be greeted by a spinning globe in a window. Once you’re done admiring the globe, you can close the window.
The script executed the four steps listed above. Additionally, it created a directory called “src” underneath your home directory, and unpacked the current Vrui tarball into a directory called “Vrui-<major>.<minor>-<release>” (e.g., Vrui-4.6-005) inside that “src” directory.
Check for SteamVR and Vive Support
While Vrui is not itself based on SteamVR, it uses SteamVR’s low-level hardware drivers to talk to the Vive headset and receive 6-DOF tracking and event data from it and its controllers. As an aside, this means that Vrui needs SteamVR to be installed on the local computer, but it does not require actually running SteamVR. On the contrary, it is best not to run Steam and/or SteamVR when using the Vive with Vrui applications, as doing that could cause compatibility issues.
During configuration, Vrui’s build system attempts to detect whether the local computer has SteamVR installed, and builds in support for Vive headsets if it succeeds in finding a SteamVR installation. I have tested this detection code on several versions of Steam and SteamVR and multiple Linux distributions, but it might still fail in rare cases, for example if SteamVR was installed in a non-standard location.
To check whether Vrui found SteamVR, scroll back through the output from the Build-Ubuntu.sh script to where it prints:
---- Vrui installation configuration ----
Somewhat below that, look for lines that either state
OpenVR SDK and SteamVR run-time exist on host system; support for
HTC Vive enabled
or
OpenVR SDK or SteamVR run-time do not exist on host system; support
for HTC Vive disabled
If it printed the former, everything is fine, and you can skip ahead to section “Running the Vive Tracking Driver.” Otherwise, continue with the next section.
Manually Enable SteamVR Support
If Vrui’s build system did not find your SteamVR installation, you will have to find it yourself, and re-run the last two installation steps manually. First, you will need to look for a directory called “SteamVR”, inside a directory called “common”, in turn inside a directory called “steamapps” or similar. If you do not know where to look, you can run the following command in a terminal window:
$ find / -name SteamVR
This will search your entire file system for directories or files called “SteamVR”, which might take some time. Look through the output from “find,” and note down the line that ends with “/common/SteamVR”, such as the following from my own SteamVR installation:
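The example line from the author’s own installation did not survive in this copy; on a typical Steam install (a hypothetical path, yours will differ), the relevant find output looks like:

```
/home/user/.local/share/Steam/steamapps/common/SteamVR
```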
Next, open a terminal window and enter the Vrui source directory:
$ cd ~/src
$ ls
$ cd Vrui-<major>.<minor>-<release>
The second command will print the list of sub-directories inside the “src” directory. Identify the one containing Vrui, and replace “Vrui-<major>.<minor>-<release>” in the third command with the correct name. Then continue in the same terminal window:
$ rm bin/VRDeviceDaemon
$ make STEAMVRDIR=<location of SteamVR directory>
In the second command, replace <location of SteamVR directory> with the full directory name you noted down from the output of “find.” For example, I would enter:
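The author’s example command is missing from this copy; assuming a hypothetical SteamVR location of /home/user/.local/share/Steam/steamapps/common/SteamVR (an illustration only, use the path reported by find), the command would read:

```
$ make STEAMVRDIR=/home/user/.local/share/Steam/steamapps/common/SteamVR
```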
This command will show Vrui’s configuration options again. It should now print
OpenVR SDK and SteamVR run-time exist on host system; support for
HTC Vive enabled
and, below that line, three more lines with directory or file names.
The first line will echo back the same directory you passed to “make” as STEAMVRDIR, the second line will list two directories containing “/ubuntu12_32/steam-runtime/amd64/” somewhere inside and ending in “/x86_64-linux-gnu”, and the third line’s file name will end in “/linux64/driver_lighthouse.so”.
If those lines exist, you can now re-install Vrui with Vive support by running:
$ sudo make STEAMVRDIR=<location of SteamVR directory> install
where you pass the same directory name as STEAMVRDIR as before. You might have to enter your password again to make this administrator-level change.
Running the Vive Tracking Driver
Now that Vrui is installed with SteamVR/Vive support, it is time to test headset and controller tracking. Vrui uses a stand-alone tracking server called “VRDeviceDaemon.” It is typically started when you begin a VR session, around the same time you turn on your Vive’s display, and keeps running until you end your session and turn the Vive’s display off again. To start the server, open a new terminal window and run:
$ RunViveTracker.sh
RunViveTracker.sh is a script that gets installed as part of Vrui, to simplify running VRDeviceDaemon in Vive mode. It will print a lot of status messages to the terminal during start-up, and then keep printing messages as important events occur. For initial testing, look for messages similar to the following (all serial numbers refer to my devices, and will be different for yours):
OpenVRHost: Activating newly-added head-mounted display with serial number LHR-CCB9BBDB
OpenVRHost: Tracked device with serial number LHR-CCB9BBDB is now connected
OpenVRHost: Activating newly-added tracking base station with serial number LHB-7FB0B90B
OpenVRHost: Tracked device with serial number LHB-7FB0B90B is now connected
OpenVRHost: Activating newly-added tracking base station with serial number LHB-B5250EB9
OpenVRHost: Tracked device with serial number LHB-B5250EB9 is now connected
OpenVRHost: Tracked device with serial number LHR-CCB9BBDB regained tracking
OpenVRHost: Tracked device with serial number LHB-B5250EB9 regained tracking
OpenVRHost: Tracked device with serial number LHB-7FB0B90B regained tracking
The first two messages refer to the Vive headset itself. The next four messages refer to the two tracking base stations, which are detected by the tracking sensors on the headset. The final three messages indicate that the two base stations can see each other, that the headset can see one or both base stations, and that 6-DOF tracking has been established. In other words, everything appears to be working.
If you now turn on your controllers (by pressing the power buttons below the touchpad), you will get additional messages like the following (again, the serial numbers will be different):
OpenVRHost: Activating newly-added controller with serial number LHR-FFFC3F40
OpenVRHost: Tracked device with serial number LHR-FFFC3F40 is now connected
OpenVRHost: Tracked device with serial number LHR-FFFC3F40 regained tracking
OpenVRHost: Device LHR-FFFC3F40 is now discharging
OpenVRHost: Battery level on device LHR-FFFC3F40 is 90%
OpenVRHost: Activating newly-added controller with serial number LHR-FF6F3D43
OpenVRHost: Tracked device with serial number LHR-FF6F3D43 is now connected
OpenVRHost: Tracked device with serial number LHR-FF6F3D43 regained tracking
OpenVRHost: Device LHR-FF6F3D43 is now discharging
OpenVRHost: Battery level on device LHR-FF6F3D43 is 84%
This output shows that both controllers connected to the headset, that they started tracking their positions and orientations, and that they are running on battery power and slowly discharging. VRDeviceDaemon will keep updating the controllers’ battery levels as they drain, and will print messages if any of the devices temporarily lose and then regain 6-DOF tracking.
The tracking driver needs to be running for Vrui-based VR applications to use the headset and controllers. I recommend starting it in a new terminal window, and moving the window to a corner of the main display so you can monitor its status (and the battery levels of the controllers). To shut down the tracking driver when your VR session is finished, either press Ctrl+C inside its terminal window, or simply close the window. Shutting down the driver will also power off the controllers.
Creating an Icon for the Tracking Driver
It is convenient to create a desktop icon to start the tracking driver. Right-click anywhere on the desktop, and select “Create Launcher…” from the pop-up menu. In the “Create Launcher…” dialog that appears, set “Type” to “Application,” enter something like “Start Vive Tracker” into the “Name” field, and optionally enter a helpful comment into the “Comment” field.
Most importantly, enter a command to run the tracking driver into the “Command” field. You want to run the driver in its own terminal window, and the best way to achieve that is to use the following command:
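The command itself is missing from this copy; reconstructed from the description that follows, it would be something along these lines (the exact mate-terminal option spellings are my assumption):

```
mate-terminal --geometry=80x30 --title="Vive Tracker" -e RunViveTracker.sh
```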
This command opens a new terminal window (“mate-terminal”), sets the window’s size to 80 columns by 30 rows, which is a good size, and sets the window’s title to “Vive Tracker” (you can choose any title you like). Finally, the “-e RunViveTracker.sh” option tells the newly-opened terminal to run the tracking driver script.
Click the “OK” button to confirm and create the icon on your desktop. From now on, you can start the tracking driver in its own terminal window by double-clicking its icon.
At this point everything should be working, but you still might want to check actual tracking data for any of the devices, or you may want to do this later if you run into any tracking issues down the line. Vrui contains two utilities to check tracking: DeviceTest is a text-based utility, and TrackingTest is a graphical application.
The simplest way to do a quick tracking check is to run DeviceTest in a terminal:
$ DeviceTest -t <tracker index>
where you replace <tracker index> with the index of the device you want to check (0: headset, 1: first controller, 2: second controller). This will continuously print the selected device’s 3D position, and the device’s battery level if it is a controller. To stop DeviceTest and get back to the terminal prompt, press the Enter key.
DeviceTest has a lot of additional options. Add -listDevices to its command line to get details on all connected devices; add -listHMDs to get details on your headset; add -b to see the status of all buttons on all devices; add -v to see the status of all analog axes; and add -o to additionally see the orientation of the selected device, expressed as a rotation around a 3D axis.
Configuring the Environment
Before using any VR applications, you need to tell Vrui the physical layout of your environment, i.e., your VR workspace. Specifically, Vrui needs to know where the center of your workspace is, how high your floor is, which direction you’re generally facing when in your workspace, and the locations of walls, furniture, etc. around your workspace. All this is done using a graphical set-up utility called “RoomSetup.” In a terminal window, run
$ RoomSetup Vive
This will start the configuration utility and tell it to configure an environment for a Vive VR system, see Figure 1. Then follow the procedure outlined in this video:
Video: “Configuring a VR Environment in Vrui using RoomSetup” (YouTube)
Figure 1: Vrui’s RoomSetup utility.
In order of use, the setup steps are:
Controller: Check that “Controller Type” is set to “From Driver,” and that the “Probe Tip” fields are set to 0.0, 0.0, 0.0.
Floor Plane: Set your workspace’s center position and floor elevation. Click the “Reset” button, then lay one controller down onto the floor, in the center of your workspace, with the trigger button facing down, and press any button on that controller.
Forward Direction: Point one controller in the direction you will generally be facing, and press any button on that controller.
Boundary Polygon: Define the boundaries of your VR workspace. First press the “Reset” button, then touch each corner point of your workspace boundary in clockwise or counter-clockwise order with one controller, and press any button on that controller to add a vertex to the boundary polygon.
Surface Polygons: Define horizontal surfaces, for example safe areas to lay down your controllers while in VR. Sketch out a polygon in the same way as during boundary polygon set-up, and close each polygon by clicking the “Close Surface” button.
Save Configuration: Click the “Save Layout” button, and then close RoomSetup’s window.
RoomSetup is a Vrui application, run here in desktop mode. You can pan the workspace display by pressing and holding the “z” key and moving the mouse (without pressing any mouse buttons), and zoom in/out by rolling the mouse wheel. Should you get lost, you can press “Win+Home” to reset the view, or open the program’s main menu by pressing and holding the right mouse button, pointing at “Vrui System,” then at “View,” and finally at “Reset View.”
Running Vrui Applications in VR
At this point, Vrui is installed and configured, and now it’s time to start trying Vrui applications in VR mode. The build script you used to install Vrui also built Vrui’s example applications (including the spinning globe with which it greeted you), and you can try any of those, or download and install specific applications you want to use such as LiDAR Viewer or 3D Visualizer. As it is sort of Vrui’s unofficial “Hello World,” let’s try the Earth viewer first. In a terminal window, run:
$ cd ~/src/Vrui-<major>.<minor>-<release>/ExamplePrograms
$ OnVive.sh ./bin/ShowEarthModel -norotate
The first command enters the directory containing Vrui’s example programs, inside Vrui’s source directory. Adjust Vrui’s directory name according to the version you downloaded.
The second command starts the Earth viewer. By default, Vrui applications start in “desktop mode,” where they pretend to be regular 3D graphics applications, controlled via keyboard and mouse. The easiest way to start Vrui applications in VR mode is to use the OnVive.sh script that is installed as part of Vrui. It will detect the Vive’s display, alert you if the display is not connected or not turned on, do other important housekeeping tasks, and finally start the given Vrui application in VR mode.
By default, OnVive.sh will not mirror the VR application’s 3D view to the main display, so you won’t see anything until you put on the headset. But once you do, you should see a globe in glorious 3D, sitting in the middle of your configured VR workspace. As you step close to your workspace’s boundaries, or reach out with a controller, you should see a green grid indicating your boundaries. Do not step or reach through those boundaries, or you might get hurt or damage your equipment!
As a first-time setup step, adjust the Vive’s lenses to match the distance between your eyes (your inter-pupillary distance, IPD). There is a small knob on the lower right of the headset; twisting that knob will move the left and right lenses closer together or farther apart. While you are twisting the knob, Vrui will show you the current distance between the two lenses’ centers in millimeters. Make sure this distance matches your IPD for the best experience. You can measure your IPD yourself ahead of time, using a mirror and a ruler.
This is also a good time to check for visual quality. In general, the visual quality of Vrui applications inside a Vive, in terms of lag, jitter, judder, distortion, chromatic aberration, etc., should be on par with native SteamVR applications. If it is not, it’s a serious problem that needs to be investigated. Unfortunately, there’s no step-by-step guide to troubleshoot those issues, but they appear to occur rarely. If they do, take to the comments.
When you’re done, place the controllers back onto their home spots using the horizontal polygon(s) you added during environment configuration as a guide, and take off the headset. To stop an application from the outside, move the mouse off the main display to the right (where the Vive’s display is mapped), click, and press the Esc key. Alternatively, if you did not move the mouse or click anywhere on the desktop since you started the current Vrui application, its window will still have the input focus, and you can just press the Esc key immediately. See the next section on how to shut down a Vrui application while still in VR.
If you want to see a secondary view of your virtual environment on the main display, add “-mergeConfig ControlWindow” to the command line of any Vrui application, like this:
$ OnVive.sh ./bin/ShowEarthModel -norotate -mergeConfig ControlWindow
-mergeConfig tells Vrui to load an additional configuration file, in this case /usr/local/etc/Vrui-4.6/ControlWindow.cfg, into its configuration space. That configuration file instructs Vrui to create a second virtual camera, and to open a secondary window on the main display showing that camera’s view. You can edit that configuration file to fine-tune the window’s properties.
All Vrui applications share a common user interface, provided by the Vrui toolkit itself. The interface is a combination of direct actions mapped to controller buttons and/or analog axes, and indirect actions represented by and accessed through traditional-looking “2.5D” GUI widgets (menus, dialogs, etc.).
In Vrui parlance, “navigation” is the process of changing the mapping of an application’s internal 3D space (“navigational space”) into the real space of your VR workspace (“physical space”). For example, in its own navigational space, the Earth viewer represents the Earth at its actual size, meaning with a diameter of about 12,700 km, and with the Earth’s center at position (0, 0, 0). It only appears beachball-sized and in the middle of your VR workspace because Vrui transforms from navigational space to physical space by scaling such that 12,700 km fits into the workspace’s configured size, and translating such that (0, 0, 0) is mapped to the workspace’s configured center point.
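As a back-of-the-envelope illustration of that scaling (the 3 m workspace diameter is my assumption, not a number from the text), the scale factor Vrui applies works out to roughly 2.4 × 10⁻⁷:

```shell
# Scale factor that maps Earth's ~12,700 km diameter into a 3 m workspace:
awk 'BEGIN {
  earth_m     = 12700.0 * 1000.0   # Earth diameter in meters
  workspace_m = 3.0                # assumed workspace diameter in meters
  printf "scale factor: %.3e\n", workspace_m / earth_m
}'
# prints: scale factor: 2.362e-07
```

At that scale, one meter of physical space corresponds to over 4,000 km of navigational space, which is why a planet fits in a living room.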
Running Vrui-based applications in glorious VR on an HTC Vive head-mounted display requires some initial set-up before Vrui itself can be installed and configured. This step-by-step guide will build upon an already-installed Linux operating system with high-performance graphics card drivers, specifically upon the current (as of 12/17/2018) version 19, code-named “Tara,” of Linux Mint, one of the most popular and user-friendly Linux distributions. This guide picks up right where the previous one in this series, “An Illustrated Guide to Installing Linux Mint 19 (“Tara”),” left off.
If you did not follow that guide, this one assumes that you have a “VR ready” or “gaming” PC with a powerful Nvidia GeForce graphics card, an installation of the 64-bit version of Linux Mint 19 (“Tara”) with the MATE desktop environment, and the recommended proprietary Nvidia graphics card driver. And an HTC Vive VR headset, of course.
Graphics Card Driver Set-up
Using a Vive headset with Vrui requires a change to the Nvidia graphics card driver’s configuration. Nvidia’s driver scans connected display devices for known VR headsets, and hides detected headsets from the desktop environment. This does make sense, as headsets are not standard monitors, and it would be awkward if windows or dialogs were to show up on a headset’s display. That said, here’s one relatively large quibble: headset filtering should happen earlier during the boot sequence, not just when the graphics card driver is loaded. As it is, headsets are still enumerated during boot, meaning that boot screens, BIOS menus, boot menus, etc. often show up on the headset, causing real problems. Anyway, carrying on.
Unfortunately for Vrui, there is currently no way to activate a hidden headset from inside an OpenGL-based VR application. For the time being, this means headset filtering in the driver needs to be disabled. To do so, open a terminal window (click on the terminal icon in the panel along the bottom screen edge, or right-click anywhere on the desktop and select “Open in Terminal” from the pop-up menu), enter exactly the following command into it (also see Figure 1) and press the Enter key (the $ sign indicates the terminal’s input prompt; don’t type it):
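The command itself is missing from this copy; judging from the surrounding text, it opens a new Xorg configuration fragment as root in the xed text editor, for example (the file name is my assumption, not from the original guide):

```
$ sudo xed /etc/X11/xorg.conf.d/99-allow-hmd.conf
```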
Figure 1: Creating a configuration file fragment using the xed text editor.
The terminal window will then ask you for your password, as “sudo” (“do as superuser”) requests an administrator-level operation. Enter your password — which will not be displayed back to you, so type carefully — and press the Enter key to confirm. This will open a text editor window with a blank page. Into that blank page, enter exactly the following text, see Figure 2:
Figure 2: Creating a configuration file fragment using the xed text editor.
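The file contents are also missing from this copy. The Nvidia driver’s switch for this purpose is the “AllowHMD” device option; a minimal fragment would look like the following (the Identifier string is arbitrary, and this is a sketch rather than the guide’s exact text):

```
Section "Device"
    Identifier "Nvidia Card"
    Driver "nvidia"
    Option "AllowHMD" "yes"
EndSection
```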
This file will be read by the Nvidia driver and will instruct it to disable headset filtering. As a result, connected VR headsets will show up as regular displays and can be used by Vrui applications. To activate this change, re-start the graphics driver by logging out from the desktop and logging back in.
Connecting the Vive VR Headset
After logging back in, it is time to connect the Vive headset by plugging its HDMI video cable into one of the HDMI ports on the back of your graphics card. Depending on your graphics card model and the phases of the moon, your main display might turn momentarily black at the moment you plug in the headset’s cable, and the desktop panel along the bottom edge of the screen might disappear (that’s exactly the reason why headset filtering was added in the first place). Worry not, we will fix that next.
First, let’s check whether the Vive headset was detected by the graphics card driver, and shown to the desktop environment instead of being hidden. Open a terminal window, enter exactly the following command, and press the Enter key:
$ xrandr
The reply printed to the terminal window will look similar to Figure 3:
Figure 3: Checking for a connected Vive headset using xrandr.
xrandr is a program that displays detailed information about all non-hidden display devices connected to the computer, and lets users control those displays; we will use that capability later. For now, as is often the case, xrandr displays too much detail, so we will use the grep program to cut through the chaff (note that there is a space between the opening quote and the word “connected” in the following command):
$ xrandr | grep " connected"
The terminal will reply with something like the following (also see Figure 4):
HDMI-0 connected primary 2160x1200+2560+0 (normal left inverted right x axis y axis) 122mm x 68mm
DP-1 connected 2560x1440+0+0 (normal left inverted right x axis y axis) 597mm x 336mm
Figure 4: Getting to the meat of xrandr output with grep.
We can tell from the above output that there are two connected display devices. The first is plugged into the first HDMI port and has 2160×1200 pixels (this is the Vive headset), and the second one is plugged into the second DisplayPort port, and has 2560×1440 pixels (this is my main display). This is good news; it means the Vive was detected, and not hidden by the graphics card driver. The “primary” tag in the Vive’s line indicates that, in my case and for whatever reason, the desktop manager chose the Vive as the main display, meaning my desktop panel moved over to it and disappeared from the main display.
Your xrandr output will probably be different from mine. Identify your main display and your Vive headset (the Vive will show with a size of 2160×1200 pixels, and a refresh rate of 89.53 Hz in the un-grepped xrandr output, see Figure 3), and note down their outputs, in my case DP-1 and HDMI-0 for the main display and the Vive, respectively.
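If you want to pick the numbers out of a geometry string like “2160x1200+2560+0” programmatically, here is a small bash sketch (using the example values from above):

```shell
# Split an xrandr geometry string "WxH+X+Y" into its four numbers,
# using 'x' and '+' as field separators:
geom="2160x1200+2560+0"
IFS='x+' read -r width height xoff yoff <<< "$geom"
echo "width=$width height=$height xoff=$xoff yoff=$yoff"
# prints: width=2160 height=1200 xoff=2560 yoff=0
```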
We will now use xrandr to properly configure the set of displays. Into the terminal window, enter the following (replacing DP-1 and HDMI-0 with the outputs previously reported by your xrandr) and press the Enter key (also see Figure 5):
$ xrandr --output DP-1 --auto --primary --output HDMI-0 --auto --right-of DP-1
This command tells xrandr to enable the display connected to the second DisplayPort port (--output DP-1 --auto) and make it the primary display (--primary), and to enable the display connected to the first HDMI port (--output HDMI-0 --auto) and place it to the right of the display on DP-1 (--right-of DP-1). Of course, replace DP-1 and HDMI-0 with the output ports reported by your xrandr.
This rather long command will be very useful in the future, as it will turn on the Vive’s display and place it properly next to the main display. To make it simpler to invoke this command, we will create a script file. Into the terminal, enter the following sequence of four commands, confirming each with the Enter key (also see Figure 6):
$ cd
$ mkdir bin
$ cd bin
$ xed ViveOn
Figure 6: Creating a “bin” directory for scripts and creating a “ViveOn” script.
These commands will change into your home directory (cd, “change directory”), create a new directory called “bin” under your home directory (mkdir, “make directory”), change into that new directory (cd bin), and open a blank text editor window for a new file called “ViveOn.” Into that text editor window, enter the following text (also see Figure 7):
# Turn on the Vive's display and place it to the right of the main display:
xrandr --output DP-1 --auto --primary --output HDMI-0 --auto --right-of DP-1
As above, replace DP-1 and HDMI-0 with the correct connectors for your system.
Figure 7: Creating a script with the xed text editor.
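For reference, here is a minimal sketch of the complete file; the `#!/bin/bash` shebang line is an optional addition beyond the two lines above, so the script runs under bash no matter how it is invoked:

```shell
#!/bin/bash
# Turn on the Vive's display and place it to the right of the main display:
xrandr --output DP-1 --auto --primary --output HDMI-0 --auto --right-of DP-1
```

As always, DP-1 and HDMI-0 stand in for whatever connectors your own xrandr reported.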
Once done, save the file and close the text editor window. Then enter exactly the following into the terminal window (also see Figure 8):
$ chmod a+x ViveOn
Figure 8: Marking the “ViveOn” script as executable.
The chmod (“change mode”) command tells the terminal that the ViveOn file is a program that can be executed (+x) by any user (a).
Next, we create a companion script that turns the Vive’s display off again, by copying the ViveOn script and making changes with the text editor (also see Figure 9):
$ cp ViveOn ViveOff
$ xed ViveOff
Figure 9: Creating a companion “ViveOff” script.
The first command (cp, “copy”) copies the “ViveOn” script to “ViveOff” to use as a starting point, with the executable flag already set. Inside the text editor, change the script’s contents to the following, again replacing output names as appropriate (also see Figure 10):
# Turn off the Vive's display:
xrandr --output DP-1 --auto --primary --output HDMI-0 --off
Figure 10: Editing the “ViveOff” script.
Next, we need to tell the terminal to look for commands inside the new “bin” directory that we created. The terminal used by Linux Mint is run by a command interpreter called “bash” (“Bourne-again shell,” programmers like their puns) controlled through a configuration file called “.bashrc” (“bash resources”) inside the user’s home directory. We will edit that file by entering the following command into a terminal (see Figure 11):
$ xed ~/.bashrc
Figure 11: Opening the bash terminal configuration file with the xed text editor.
The tilde (“~”) at the beginning of the file name is a shortcut for the current user’s home directory. This command will open the existing .bashrc file in a text editor. The file is already quite long; scroll to the very bottom of it, and append the following two lines, see Figure 12:
# Set up executable search paths:
export PATH=$HOME/bin:$PATH
Figure 12: Appending the user’s bin directory to the terminal’s executable search path.
When looking for executable commands, bash searches a colon-separated list of directories stored in the PATH variable. The line we added takes the directory containing our new scripts, $HOME/bin (HOME is a variable containing the current user’s home directory, and the dollar sign instructs bash to replace the name of that variable with its value), and puts it in front of the already-existing directory list, $PATH, with a colon in-between. As a result, bash will be looking for executables in our bin directory first, before looking in the usual places.
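To check that the change took effect, you can re-read .bashrc in the current terminal and ask bash where it now finds the script; a quick sketch:

```shell
# Re-read .bashrc in the current terminal (a new terminal does this automatically):
source ~/.bashrc
# $HOME/bin should now appear at the front of the search path:
echo "$PATH"
# Ask bash which executable the name "ViveOn" resolves to:
command -v ViveOn
```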
After saving the .bashrc file and closing the text editor, opening a new terminal window and simply typing “ViveOn” followed by the Enter key will turn on the Vive’s display, and typing “ViveOff” followed by the Enter key will turn it off again. That’s already convenient, but we can make it even easier by creating two desktop icons, to turn the Vive’s display on and off, respectively.
Click the right mouse button anywhere on the desktop, and select “Create Launcher…” from the pop-up menu, see Figure 13.
Figure 13: Creating a new launcher through the desktop’s context menu.
This will open the “Create Launcher” dialog, see Figure 14. Fill in the name of the new launcher and a usage hint as shown in Figure 14, and then click the “Browse” button to select which executable to launch when the launcher is double-clicked. You could also enter the name directly into the “Command:” text field; it would be /home/<your user name>/bin/ViveOn.
Figure 14: The “Create Launcher” dialog, with name and comment filled in.
Clicking the “Browse” button instead will open an application chooser dialog, see Figure 15. In that dialog, first click on the “Home” icon in the left column, and then double-click on the “bin” directory in the right column.
Figure 15: Navigating to the user’s “bin” directory.
That will open the directory and show the only two files in there, ViveOn and ViveOff, see Figure 16. Double-click on “ViveOn” (or select “ViveOn” and click the “Open” button) to copy the script’s full name into the “Create Launcher” dialog’s “Command” field, see Figure 17.
Figure 16: Selecting the “ViveOn” script.
Figure 17: The “Create Launcher” dialog, with the “Command” field filled in.
To finish creating the launcher, click the “OK” button. This will create a new icon labelled “Vive On” on the desktop. Double-clicking that icon will turn on the Vive’s display.
Next, repeat the same steps to create a launcher to turn off the Vive’s display, using the ViveOff script. The result should be two new icons in total, see Figure 18.
Figure 18: A set of new icons to control the Vive’s display on the desktop.
Do not worry if the icons look different on your desktop; it does not matter. This completes the first set-up step, configuring the graphics card driver and creating convenient ways to control the Vive’s display.
Installing Steam and SteamVR
The Vive head-mounted display requires low-level driver software to operate. This software, called “SteamVR,” is distributed through the “Steam” content delivery system (an app store for, mostly, PC games). While both Steam and SteamVR can be downloaded and installed at no cost, it takes quite a few steps to do so (see Figures 19-37). It would be much more convenient if one could bypass Steam and directly download only those (small) parts of SteamVR that are essential to operating a Vive. Alas, that would be a copyright violation; “free of charge” is not the same thing as “free” after all.
The first step is installing Steam itself. In Linux Mint, this is easy as Steam is available through the Software Manager. To reach it, go back to the “Welcome” screen, select “First Steps” in the left column, scroll down in the right column, and click the “Launch” button underneath the “Software Manager” heading, see Figure 19.
Figure 19: Launching the Software Manager from the “Welcome” screen.
When the Software Manager starts, Steam might already be prominently featured on the “Editor’s Picks” panel, see Figure 20. If it is not, click on the search field in the top-right corner, see Figure 20, and enter “Steam,” see Figure 21.
Figure 20: The Software Manager. The search field is in the top-right corner.
Figure 21: The Software Manager showing a list of applications matching the keyword “Steam.” The actual “Steam” application should be the first in the list.
The correct application (its description should read “Valve’s Steam digital software delivery system”) should then appear at the top of the list. Click on it to get to Steam’s detail page, and click the “Install” button, see Figure 22.
Figure 22: Steam’s detail page.
Clicking “Install” will show a sequence of confirmation dialogs. The first will alert you that Steam requires additional changes, see Figure 23. Click the “Continue” button to see an authentication dialog where you will have to enter your password and click the “Authenticate” button, see Figure 24.
Figure 23: Confirming additional changes to the system.
Figure 24: Enter your password to authorize system-level changes.
After that the Software Manager will download Steam and install it, and then replace the “Install” button on Steam’s detail page with a pair of “Launch” and “Remove” buttons, see Figure 25. Click the “Launch” button to launch Steam — which will not actually launch Steam, but a second-stage installer, see Figure 26. Just wait for however long it takes to download and install the actual application.
Figure 25: After installation, click the “Launch” button to start Steam’s second-stage installer.
Figure 26: Steam’s second-stage installer downloading the actual application.
When the second-stage installer is finished, it will show a “welcome” dialog, see Figure 27. Steam requires a user account before one can download any software, even free-of-charge software. If you already have a Steam account, click on “Login to an Existing Account” and enter your account name and password into the login dialog that pops up, see Figure 29, and continue from there. If you do not have a Steam account yet, click on “Create New Account” to create one, which is also free of charge.
Figure 27: Steam’s log-in page.
To create an account, you need a valid email address, need to consent to Steam’s Subscriber Agreement and confirm that you are at least 13 years old, and pass a CAPTCHA challenge, see Figure 28. You must use a real working email address at this point, as Steam will send a confirmation email to that address before it will let you pass. Some of Steam’s CAPTCHAs are very tough; click the “Refresh” button if you can’t parse the one presented to you. There does not seem to be an upper limit on the number of challenges you can attempt. After you have filled in all the fields, click the “Continue” button.
Figure 28: The dialog to sign up for a no-cost Steam account. You must use a real email address here.
After you press the “Continue” button, and your entered values pass Steam’s check, the Steam servers will send an email to the address you entered. You need to click on the link in that email to activate your account, which will move you past the account creation dialog and back to the “Steam Login” window, see Figure 29, where you can enter the name and password you assigned to your new account. Click the “Login” button when you have entered your credentials. You could also check the “Remember my password” field so you do not have to re-enter your credentials every time you start Steam, but, as it turns out, you do not have to start Steam ever again if all you want to do is use the Vive with Vrui applications — it is, in fact, better to never start Steam again, as it might automatically update itself and break.
The first step towards installing any Vrui-based software, including the Augmented Reality Sandbox, is installing some version of the Linux operating system on a new computer, which might sound like a daunting proposition to those who have never done that kind of thing before, but is actually very straightforward. This guide will be using the current (as of 12/17/2018) version 19, code-named “Tara,” of Linux Mint, one of the most popular and user-friendly Linux distributions.
As this guide is geared towards installing and running Vrui-based 3D graphics applications, it assumes that the computer onto which Linux is to be installed is some type of “gaming” or “VR ready” PC, containing an Nvidia GeForce graphics card. The exact model of graphics card, as well as the exact model of CPU, amount of main memory, and hard drive size are not really important (that said, to run 3D graphics applications effectively, the PC should have a recent CPU, at least 4GB of main memory, and at least 60GB of hard drive space). While AMD/ATI graphics cards are otherwise perfectly serviceable, they have traditionally had inferior Linux driver support, and therefore Vrui and Vrui applications have not been tested on them in quite a while. In other words, we do not recommend them for these purposes.
Before installing Linux, one needs to download an installation image for one’s chosen Linux distribution and flavor, and copy it to an installation medium, like a CD/DVD or USB stick. For this guide, we will be using the 64-bit version of Linux Mint 19 (“Tara”), with the MATE desktop environment. The page in the preceding link offers a 1.9GB disk (“iso”) image via a wide selection of download sites all over the world. Click the link for the download site located closest to you, and wait for the download to finish.
If the installation medium created in the previous step is a USB drive, plug it into a USB port on the new computer before turning it on for the first time. If it is a CD/DVD, turn the computer on first, and then insert the medium as quickly as possible.
The next step is to tell the computer to boot from the installation medium instead of from its internal hard drive. There is usually a key that needs to be pressed shortly after powering on the computer; typically either the “Delete” key to enter the computer’s BIOS, or “F8” to enter a boot menu. If this does not work on the first try, and the computer “hangs” or boots into whatever operating system was previously installed on it, don’t wait until it finishes booting — just turn it right back off (or press the reset button if it has one) and try again. We are going to erase any previous operating system anyway, so there’s no harm. However, do wait for about 20-30 seconds between turning the computer off and on again to avoid any danger of electrical damage. If neither the “Delete” nor “F8” keys work, look in the computer’s manual or online for the correct key sequence. Amazingly, finding the correct key to boot from the installation medium is by far the most difficult step of installing Linux Mint.
Once a BIOS screen or boot menu shows up, select booting from the installation medium. From there, it will take a few seconds to boot into a “live” Linux Mint environment, see Figure 1.
Figure 1: Linux Mint’s “live” installation environment.
This “live” environment is a fully-functioning Linux system, and you can already try some of the installed applications. However, it’s best not to dawdle, and directly double-click the “Install Linux Mint” icon. That will open the installer, which will guide you through the rest of the installation (see Figures 2-11). Basically, each step requires making some choices (or sticking with the defaults), and then pressing a “Continue” button. The first steps, see Figures 2 and 3, are selecting the installation language and keyboard layout.
Figure 2: Selecting the installation language.
Figure 3: Selecting the keyboard layout.
In the next step, see Figure 4, the installer will ask whether you want to install third-party software such as graphics card drivers. That is generally what we want, but I recommend leaving this option unchecked, and dealing with alternative drivers after base system installation is complete. There are fewer chances for things to go wrong that way.
Figure 4: To install, or not to install, third-party drivers, that is the question. We’ll leave this unselected and do it later.
The next step, see Figure 5, is where it gets interesting. At this point, the installer will have checked the computer’s hard drive for already-installed operating systems. If it found one, it will offer to retain the existing system and install Linux Mint alongside it, or to completely remove the existing system. Both options are valid. If there is already another operating system on the computer, say some version of Microsoft Windows, and you want to be able to select which operating system to load at boot-time, then you should check the first option. This is supposed to work reliably, but I myself have not tested it. If going this route, you will be presented with a menu every time you turn on the computer, and will have to choose which operating system to use.
The other option is to completely wipe the hard drive and install Linux Mint as the only operating system (if there is no pre-existing operating system, this is the only option). If you want to set up an AR Sandbox in “kiosk mode,” where the computer has neither a keyboard nor mouse, and powering it on will automatically start the AR Sandbox, you must choose the second option, or it will not work reliably.
The rest of the installation instructions assumes that you chose the second option. If you chose the first option, follow any additional prompts from the installer.
Figure 5: Decide whether to install Linux Mint alongside an existing operating system, or wipe the hard drive.
In the next step, see Figure 6, you will select to which hard drive to install Linux Mint. If your computer has multiple hard drives, double-check to make sure that you are installing to the right one. There will be another confirmation dialog, see Figure 7, but this is where things are going to happen. Carefully read the description of what the installer is going to do, double-check the installation destination, and click the “Install Now” button when you are confident. Then double-check again and read the warning shown in Figure 7, and click the “Continue” button when you are certain you want to proceed.
Figure 6: Double-check the installation destination, read the fine print, and click “Install Now.”
Figure 7: Double-check again. If you click “Continue” in this dialog, the installer will start writing to the selected hard drive(s).
At this point, the installer will shrink, reformat, or erase the selected installation hard drive(s) and start installing the final Linux Mint system. While doing so, it will ask a couple of additional questions, see Figures 8-9.
First, the installer will let you choose your location and time zone, see Figure 8.
Figure 8: Select your location and time zone.
Next, the installer will ask for your full name, let you choose a login ID and password, and a name to identify the computer, see Figure 9. In home or equivalent set-ups you can choose any computer name you like, but if you want to embed the computer into an existing network, you should ask your local IT administrator.
In the general case, do not select the “Log in automatically” option, and do select the “Require my password to log in” option. Only do the opposite if you want to set up an AR Sandbox in kiosk mode; this will let you run the AR Sandbox application with neither a mouse nor a keyboard connected.
Figure 9: Selecting a user name/password, computer name, and deciding whether to allow automatic log-in.
This was it; from this point on, the installer will continue copying files to the selected hard drive and show a slide show of Linux Mint features, see Figure 10. Depending on the speed of your hard drive, this might take upwards of 5 minutes. When installation is complete, the installer will show a confirmation message and offer to reboot the computer to start the newly-installed operating system, see Figure 11. To do so, click the “Restart Now” button, wait for the message asking you to remove the installation medium, and then press “Enter” to reboot.
Figure 10: Waiting for installation to complete.
Figure 11: Installation is complete; time to reboot to continue with the next stage.
Graphics Card Driver Installation
If base system installation completed successfully, the computer will boot into the newly-installed operating system. If “Log in automatically” was not selected in the dialog in Figure 9, the computer will show the Linux Mint login screen, where you need to select your user name from the list and enter your password to enter the desktop. If “Log in automatically” was selected, it will drop you straight onto the desktop.
Linux Mint’s default desktop will look mostly identical to the “live” environment shown in Figure 1, minus the “Install Linux Mint” icon. Upon logging in for the first time, Mint will show a “Welcome” screen, see Figure 12.
Figure 12: Linux Mint’s “Welcome” screen, from where we can install third-party graphics card drivers.
To install third-party graphics card drivers for the Nvidia GeForce card, click on the “First Steps” page in the left column, and then the “Launch” button underneath the “Driver Manager” heading in the right column. This is an administrator-level change, and Mint will ask you for your password before proceeding, see Figure 13.
Figure 13: Enter your password to open the Driver Manager.
The driver manager will show which graphics card driver is currently installed, and which alternative drivers are available, see Figure 14. If you followed these instructions and left “Install third-party software…” unchecked in the dialog shown in Figure 4, the current driver should be the open-source “xserver-xorg-video-nouveau” driver. What we need, however, is the recommended (and proprietary and closed-source) Nvidia driver, in this case version “nvidia-driver-390.” The open-source driver does not work well enough to run demanding 3D graphics applications, especially VR applications or the AR Sandbox.
Figure 14: Check the currently-installed graphics card driver.
Select the recommended driver, see Figure 15, and then click the “Apply Changes” button.
Figure 15: Select the recommended driver, and click “Apply Changes.”
This will download the recommended driver software and install it, which may take a few minutes. When the installation is complete, the Driver Manager will give you the option to restart the computer to activate the new driver, see Figure 16. Click the “Restart…” button, follow any additional instructions, and log back in when the login screen re-appears.
Figure 16: Restart the computer a second time to activate the recommended graphics card driver. This will be the last restart.
Once logged back in after the computer restarts, it is time to check that the graphics card driver installation was successful. To do this, open a terminal window. You can do this either by clicking on the terminal icon in the panel at the bottom edge of the screen (it’s the black rectangle with a dollar sign (“$”) in it, see Figure 1), or by right-clicking anywhere on the desktop, and selecting “Open in Terminal” from the menu that appears.
When you have a terminal window, enter exactly the following command into it and press the Enter key (the $ sign indicates the terminal’s input prompt; don’t type it):
$ glxinfo | grep vendor
This command is called a “pipe,” and pipes are one of the ways in which the UNIX/Linux command line lets you do nifty things; this will come in handy later. In detail, glxinfo is a utility program that prints a ton of information about the installed graphics card (run glxinfo by itself to see all of it). Instead of showing you all that text, the vertical bar (also called “pipe”) directs it into the grep program, a filter that only shows lines of text matching a certain pattern. In this case, “grep vendor” only shows lines that contain the keyword “vendor,” which are exactly the ones you need to see at this point.
The terminal should reply with three lines of text, see also Figure 17:
server glx vendor string: NVIDIA Corporation
client glx vendor string: NVIDIA Corporation
OpenGL vendor string: NVIDIA Corporation
If any of those lines show a vendor other than Nvidia, something went horribly wrong.
Figure 17: Checking if the Nvidia graphics card driver was installed correctly.
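The same pipe pattern works with any filter; for example, to see not just the driver vendor but the specific GPU the driver is actually using, you can grep for the renderer line instead:

```shell
# Show only the line naming the renderer, i.e., the GPU in use:
glxinfo | grep "OpenGL renderer"
```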
If you made it to this point: congratulations! You now have a Linux Mint system with high-performance drivers for your graphics card, and are ready to install the Vrui VR toolkit and Vrui-based applications, including the Augmented Reality Sandbox.
At this point, you could also update your Linux Mint system from the software versions that were packaged onto the installation medium to whatever the most recent software versions in the on-line repositories are (to do so, launch the “Update Manager” from the welcome screen’s “First Steps” page, see Figure 12). This is recommended for a general-use system, but you do not have to do it for a computational appliance, such as an AR Sandbox in kiosk mode, which will never again be connected to the network.
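The same update can also be done from a terminal; a sketch using the standard Debian/Ubuntu package tools that Linux Mint is built on (the Update Manager GUI does essentially this, and remains the more comfortable path):

```shell
# Refresh the package lists from the on-line repositories:
sudo apt update
# Install all available package upgrades:
sudo apt upgrade
```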
Computer reviews aren’t my thing, but for this one I had to make an exception. My 3.5 year old laptop, the HP Spectre x360 I had scored as swag at the 2015 Microsoft Build conference, suddenly died a few months ago. I had taken a liking to that thing, so when I had to leave for a conference in early November, and realized I should probably bring a laptop with me, I decided to replace it with the current version of the same model. Fortunately they had one in stock at my neighborhood Best Buy (alas, only the silver one and not the pretty black and gold one), so I was able to pick it right up.
I then had the bad idea to search online for Linux support on the x360 after already having bought it, and was dismayed by what I found. A lot of people reported poor performance, too-hot-to-handle operating temperatures, and very poor battery life. Not having much of a choice at that point, I decided to go ahead anyways and install Fedora 28 on it, the then-current release of my go-to Linux distribution. Long story short: installation was a breeze, everything worked out-of-the-box, performance is great, the laptop runs barely warm, and battery life is awesome (so awesome, in fact, that I initially thought the readings were wrong). In order to provide a counter-narrative to those other reports, this is my experience of installing and running Linux on a 2018 HP Spectre x360.
Before the detailed list of things I did, here’s the exact laptop model I bought:
Intel Core i7-8550U CPU at 1.8GHz (up to 4.0GHz in turbo mode), 8MB L3 cache, Intel UHD Graphics 620
13.3″ 1920×1080 touchscreen display with active stylus
My first step was to download Fedora 28 workstation 64-bit with MATE desktop (I had Gnome on my previous laptop, and never fully warmed up to it) and burn it onto a 2GB USB stick. I then powered on the laptop and went straight to the BIOS setup screen by madly mashing the F10 key (the laptop came with Windows 10 pre-installed, and I didn’t want to get into that). Once in the BIOS, I checked that “Legacy Boot” was disabled (most Linux instructions recommend enabling it) and “Secure Boot” was enabled. I also set the “System Power Scheme” to “balanced,” as I was still expecting heat/battery issues at that point. I left all other settings at their defaults.
Next, I booted from my prepared USB stick into a live Fedora desktop, without problems, and clicked on “Install to Hard Disk.” I clicked through to the main installation panel, created an administrator user account for myself, and selected “custom partitioning” on the built-in SSD with the following layout:
300MB for /boot
260MB for /boot/efi
30GB for /
8GB for swap
40GB for /home
>160GB (whatever was left) for /work
The rest of the installation took less than 10 minutes, and then it was time to reboot. Booting into the new OS took only a few seconds, and I was up and running. Unlike my previous Spectre, this one did not require any kernel options or special treatment. Touchpad — working. Brightness controls — working. Keyboard backlight — working. Audio and audio control buttons — working. Microphone — working. Speakers — working. Headphones — working. Headset microphone — working. Wi-Fi — working, both in 2.4GHz and 5GHz. Bluetooth — working, including LE mode. Side volume buttons — working. Fingerprint reader — don’t know, don’t care. SD card slot — haven’t tried yet. Camera — working. Touchscreen — working. Active stylus (came in the box) — working. External video connector — working. Suspend and resume — working.
While the touchpad was working immediately after installation, I didn’t like its default settings. Fortunately, MATE’s “Mouse Preferences” control panel offers all the options I needed on the “Touchpad” tab, with one caveat. I disabled “Disable touchpad while typing” because Vrui can use keyboard keys as additional mouse buttons; I enabled “Enable mouse clicks with touchpad;” I set “Two-finger click emulation” to “Right button” and “Three-finger click emulation” to “Middle button;” finally I enabled “Vertical two-finger scrolling” and disabled “Natural scrolling” because I prefer it that way. The control panel displays a warning under the finger-click emulation settings: “Warning: multi-finger emulation may disable software buttons.” And indeed it does: clicking the touchpad would now always generate left-clicks, no matter where the touchpad was pressed. To re-enable that useful feature, I had to run the following in a terminal:
This made both methods work at once: multi-finger taps and software buttons, so why the control panel disabled the latter is a mystery to me. To make this adjustment permanent, I put the above command into a script and added that script to the session start-up programs.
The laptop wasn’t running hot during those first steps, but I installed powertop nonetheless, and switched all available tunables from “bad” to “good.” On my previous Spectre, I had to leave some tunables at “bad” to avoid stability issues, but I haven’t run into any problems on this one yet, after more than a month of intensive use. The results are very impressive, as shown in Figure 1: For typical use, power drain hovers around 4W, which, combined with the battery’s large 61.4Wh design capacity, predicts more than 15 hours of use time. I haven’t run that long a session yet, but had many days where I was using the laptop for 8 hours straight, and still had almost 50% battery left at the end. When the laptop is in screen saver mode, with the display turned off, it idles at around 2W. In suspend mode, it feels like it is using no power at all. Streaming a movie via Netflix brings it up to about 8W, and I was in fact able to watch six straight hours of Netflix yesterday (being sick as a dog in a hotel room in Washington, DC) without draining the battery, and running only comfortably warm.
Figure 1: Discharge rate of HP Spectre x360 during typical use.
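For anyone wanting to replicate this, powertop's tunables can be switched in one go from the command line instead of toggling each one interactively; a sketch (powertop needs root, and as noted, a misbehaving tunable may need to be reverted on some machines):

```shell
# Apply all of powertop's "good" tunable settings at once:
sudo powertop --auto-tune
```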
Even with the “balanced” system power scheme, the laptop is rather sporty. It completed my personal CPU performance benchmark, building the Vrui VR toolkit from scratch, in 1:15 minutes using 8 threads. That’s only about twice as long as my top-of-the-line main desktop PC.
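My “benchmark” is nothing fancier than timing a from-scratch parallel build; the general pattern, sketched here against a hypothetical source directory name, is:

```shell
# Time a from-scratch parallel build, one make job per CPU thread;
# "Vrui" stands in for wherever the Vrui source tree was unpacked:
cd Vrui
make clean
time make -j"$(nproc)"
```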
Graphics performance is another story. I didn’t expect this to be a graphics workstation, given that it does not have a discrete graphics card, but nonetheless, it runs most Vrui applications surprisingly well. It should even be able to run some less-complex applications in full VR mode on a connected HTC Vive — at least my previous Spectre x360 could do it — but for some reason, this one has noticeable latency that I reckon must be due to some changes, or changed default settings, in the Intel graphics driver. I haven’t been able to look into this in detail yet, as it’s not really an intended use case.
Touchscreen and Stylus
One of the main — and unexpected — reasons I ended up liking the Spectre x360 line is its touchscreen with active stylus. The Spectre x360 is advertised as a “2-in-1” laptop that turns into a sort-of tablet when its screen is rotated by 360° and locked to the back of its keyboard. I initially didn’t think I would be using that feature, or the touchscreen itself, but I fell in love with it for one reason: the ability to sketch or write or do math on an infinite slate, and being able to do algebra by directly copying and pasting or moving sub-expressions around. My previous Spectre didn’t come with a stylus in the box, but I found a compatible third-party one, and while it worked, it wasn’t exactly great. The new stylus, on the other hand, is great. I don’t know whether HP increased the touchscreen’s resolution, or upped the reporting rate, or HP’s original stylus is just better than the one I had previously, but drawings and hand-written text and formulas look a lot smoother now than they used to, and there is noticeably less lag even without using fancy smoothing and motion prediction.
I don’t know what exactly changed between the reports I read and my own experience, but installing Fedora 28 Linux (64-bit, with MATE desktop) on the 13.3″ HP Spectre x360, in non-legacy secure mode, worked without any special consideration or any hitch, and it is now running like a charm, with good performance and at least 12 hours of battery life under normal use. A+, would buy again. Now let’s hope that it survives longer than its predecessor.
I wrote an article earlier this year in which I looked closely at the physical display resolution of VR headsets, measured in pixels/degree, and how that resolution changes across the field of view of a headset due to non-linear effects from tangent-space rendering and lens distortion. Unfortunately, back then I only did the analysis for the HTC Vive. In the meantime I got access to an Oculus Rift, and was able to extract the necessary raw data from it — after spending some significant effort, more on that later.
With these new results, it is time for a follow-up post where I re-create the HTC Vive graphs from the previous article with a new method that makes them easier to read, and where I compare display properties between the two main PC-based headsets. Without further ado, here are the goods.
The first two figures, 1 and 2, show display resolution in pixels/°, on one horizontal and one vertical cross-section through the lens center of my Vive’s right display.
Figure 1: Display resolution in pixels/° along a horizontal line through the right display’s lens center of an HTC Vive.
Figure 2: Display resolution in pixels/° along a vertical line through the right display’s lens center of an HTC Vive.
What is obvious from these figures is that display resolution is not constant across the entire display, and differs for each of the primary colors. See the previous article for a detailed explanation of these observations. On the other hand, there is a large “flat area” around the lens center in which the resolution is more or less constant. Therefore, I propose the green-channel resolution at the exact center of the lens as a convenient single measure of the “true” resolution of a VR headset.
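To make the pixels/° numbers concrete: local display resolution can be estimated by asking how much view angle a one-pixel step covers at a given point. The sketch below uses a made-up lens model (a cubic radial term mapping display pixels to view angles; the coefficients are invented for illustration, not the Vive’s calibration data) to show the falloff away from the lens center:

```python
def view_angle(px):
    """Hypothetical lens model mapping a display pixel coordinate to a view
    angle in degrees. The cubic term mimics a lens magnifying the periphery;
    the coefficients are invented for illustration, not measured data."""
    p = (px - 540.0) / 540.0          # normalized coordinate, 0 at lens center
    return 40.0 * p + 15.0 * p ** 3

def resolution_ppd(px, h=0.5):
    """Estimate local display resolution in pixels/degree at pixel px by a
    central finite difference: one pixel divided by the view angle it spans."""
    return 2.0 * h / (view_angle(px + h) - view_angle(px - h))

center = resolution_ppd(540.0)        # highest resolution, at the lens center
edge = resolution_ppd(1000.0)         # lower resolution toward the display edge
```

With this toy model, resolution is highest at the lens center and drops toward the display edges, mirroring the overall shape of the curves in figures 1 and 2.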
It is important to note that the shown resolution of the red and blue color channels is nominal resolution, i.e., the number of pixels per degree as rendered. Due to the Vive’s (and Rift’s) PenTile sub-pixel layout, the effective sub-pixel resolution of those two channels as displayed is lower by a factor of sqrt(2)/2=0.7071, as the red and blue channels use a pixel grid that is rotated by 45° with respect to the green channel’s pixel grid.
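The sqrt(2)/2 factor follows directly from the rotated sub-pixel grid, and applying it is one line of arithmetic. A quick sketch, using the 11.42 pixels/° center value quoted in the previous article as the nominal resolution (note that the nominal red/blue curves differ slightly from the green one, so this is only approximate):

```python
import math

nominal_ppd = 11.42                       # nominal resolution at the lens center
# Red and blue sub-pixels sit on a grid rotated by 45 degrees with half the
# density of the green grid, so their effective linear resolution is lower
# by a factor of sqrt(2)/2:
pentile_factor = math.sqrt(2.0) / 2.0     # about 0.7071
effective_red_blue_ppd = nominal_ppd * pentile_factor
```

This puts the effective red/blue resolution at roughly 8.1 pixels/° at the lens center.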
The second set of figures, 3 and 4, show the size relationship between display pixels and pixels on the intermediate tangent-space render target using the default render target size (1512×1680 pixels). Factors smaller than 1 indicate that display pixels are smaller than intermediate pixels, and factors larger than 1 indicate that one display pixel covers multiple intermediate pixels. Again, refer to the previous article for an explanation of tangent-space rendering and post-rendering lens distortion correction.
Figure 3: Sampling factor between display and intermediate render target along a horizontal line through the right display’s lens center of an HTC Vive.
Figure 4: Sampling factor between display and intermediate render target along a vertical line through the right display’s lens center of an HTC Vive.
These sampling factors are an important consideration in designing a VR rendering pipeline. A high factor like the 4.0 at the upper and lower edges of the display (see Figure 4) means that the renderer has to draw four pixels (at the horizontal lens center), or up to 16 pixels at one of the image corners, to generate a single display pixel. Not only is this a lot of wasted effort, but it can also cause aliasing during the lens correction step. The standard bilinear filter used in texture mapping does not cope well with high sampling factors, and especially with highly anisotropic sampling factors such as 4:1, requiring a move to more complex and costly higher-order filters. Rendering tricks such as lens-matched shading are one approach to maintaining most of the standard 3D rendering pipeline without requiring high sampling factors.
Interestingly, at standard render target size, the sampling factor around the center of the lens is smaller than 1.0, meaning the render target is slightly lower resolution than the actual display. This was probably chosen to create a larger region of the field of view where display and intermediate pixels are roughly the same size.
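The sampling factor itself is just the local ratio between display pixel pitch and intermediate-target pixel pitch. The sketch below models this with an invented barrel-style display-to-tangent mapping; the target width matches the Vive’s default, but the tangent extent and distortion coefficient are made up, chosen only so that the center factor lands near the 0.875 mentioned above:

```python
TARGET_PIXELS = 1512                       # the Vive's default render target width
TANGENT_EXTENT = 3.2                       # assumed tangent-space width (made up)
pixels_per_tangent = TARGET_PIXELS / TANGENT_EXTENT

def display_to_tangent(px):
    """Hypothetical lens-correction mapping from a display pixel coordinate
    to a tangent-space position; a real pipeline would query the headset's
    calibration data instead."""
    p = (px - 540.0) / 540.0
    return p * (1.0 + 0.6 * p * p)         # barrel-style radial term (invented)

def sampling_factor(px, h=0.5):
    """How many intermediate-target pixels one display pixel spans, estimated
    by a central finite difference of the display-to-tangent mapping."""
    dt = display_to_tangent(px + h) - display_to_tangent(px - h)
    return pixels_per_tangent * dt / (2.0 * h)
```

In this toy model the factor is below 1.0 at the lens center and climbs well above 1.0 toward the display edge, reproducing the qualitative shape of figures 3 and 4.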
The last figure in this section, Figure 5, shows the lens distortion correction mesh and the spatial relationship between the actual display (left) and the intermediate render target (right). The black rectangle on the right denotes the tangent-space boundaries of the intermediate render target, i.e., the upper limit on the headset’s field of view.
Figure 5: Visualization of HTC Vive’s lens distortion correction. Left: Distortion correction mesh in display space. Right: Distortion correction mesh mapped to intermediate render target, drawn in tangent space. Only parts of the mesh intersecting the render target’s domain (black rectangle) are shown.
Note that in the Vive’s case, not only do some display pixels not show valid data, i.e., the parts of the distortion mesh that are mapped outside the black rectangle, but neither do all parts of the intermediate image contribute to the display. The render target overshoots the left edge of the display, leading to the lens-shaped “missing area” visible in the graph.
As above, the first two figures, 6 and 7, show display resolution in pixels/°, on one horizontal and one vertical cross-section through the lens center of one specific Oculus Rift’s right display.
Figure 6: Display resolution in pixels/° along a horizontal line through the right display’s lens center of an Oculus Rift.
Figure 7: Display resolution in pixels/° along a vertical line through the right display’s lens center of an Oculus Rift.
Unlike in the previous section, there is only a single resolution curve instead of one curve for each of the primary color channels. This is due to Oculus treating their lens distortion correction method as a trade secret. I had to pull some tricks to get even a single curve, as I am going to describe later. Unfortunately, I am not exactly sure to which primary color channel this single curve belongs. Based on how I reconstructed it, it would make sense for it to be the green channel, but based on the shape of the curve, it looks more like it is the red channel. Note the “double hump” around the lens center, where resolution goes up before it goes down. This is the same shape as the Vive’s red channel curve, but neither the Vive’s green nor blue channel curves show those humps. This will become a point in the comparison section below.
The next set of figures, 8 and 9, are again horizontal and vertical sampling factors, at the default render target size (1344×1600 in the Rift’s case).
Figure 8: Sampling factor between display and intermediate render target along a horizontal line through the right display’s lens center of an Oculus Rift.
Figure 9: Sampling factor between display and intermediate render target along a vertical line through the right display’s lens center of an Oculus Rift.
The final figure, Figure 10, is as before a visualization of the lens distortion correction step used at the end of the Rift’s rendering pipeline.
Figure 10: Visualization of Oculus Rift’s lens distortion correction. Left: Distortion correction mesh in display space. Right: Distortion correction mesh mapped to intermediate render target, drawn in tangent space. Only parts of the mesh intersecting the render target’s domain (black rectangle) are shown.
Based on the results from the two headsets, the most obvious difference between the optical properties of the HTC Vive and Oculus Rift is that Oculus traded reduced field of view for increased resolution. The “single number” resolution of the Oculus Rift, 13.852 pixels/°, is significantly higher than the HTC Vive’s. Under the assumption that the single curve I was able to extract represents the green channel, the Rift’s resolution is 21.2% higher than the Vive’s. Should the curve represent the red channel (as hinted at by the “double hump”), it would still be 20.6% higher (so no big deal).
The graphs in the previous sections, e.g., figures 1 and 6, show the change of resolution over the display’s pixel space, but that does not compare well between headsets due to their different fields of view. Drawing them in the same figure requires changing the horizontal axis from pixel space to, say, angle space, as in Figure 11.
Figure 11: Comparison of display resolution along horizontal axes through their respective lens centers between an HTC Vive and an Oculus Rift, drawn in angle space instead of pixel space.
Figure 11 also sheds light on the apparent difference in sampling factors between the two headsets. According to figures 3 and 4, the Vive’s rendering pipeline has to deal with a sampling factor dynamic range of roughly 4.6 across the display (from 0.875 to 4.0), whereas the Rift only faces a range of 2.9. This is a second effect of the Rift’s reduced field of view (FoV). The Vive uses more of the periphery of the lens, where sampling factors rise dramatically. By staying closer to the center of the lens, the Rift avoids those dangerous areas.
Resolution and Field of View
Given that both headsets have displays with the same pixel count (1080×1200 per eye), it is clear that one pays for higher resolution with a concomitant reduction in FoV. Due to the non-linearities of tangent-space rendering and lens distortion correction, there is, however, no simple relationship between resolution and FoV.
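One source of that non-linearity is tangent-space rendering itself: a flat image at constant pixel pitch does not have constant angular resolution. Since a planar projection maps angle θ to position x = f·tan θ, pixel density in pixels/° grows with sec²θ toward the edges. A small sketch (the pixel count and FoV here are example values, not either headset’s actual parameters):

```python
import math

def planar_ppd(num_pixels, fov_deg, theta_deg):
    """Angular resolution in pixels/degree of a flat, tangent-space image at
    angle theta from its center. With x = f*tan(theta), dx/dtheta =
    f*sec^2(theta), so angular resolution grows toward the image edges."""
    f = num_pixels / (2.0 * math.tan(math.radians(fov_deg / 2.0)))  # focal length in pixels
    return f / math.cos(math.radians(theta_deg)) ** 2 * math.pi / 180.0

center = planar_ppd(1344, 90.0, 0.0)    # pixels/degree looking straight ahead
edge = planar_ppd(1344, 90.0, 45.0)     # exactly twice as high at 45 degrees off-axis
```

Because of this sec² growth, widening the FoV reduces center resolution faster than a naive pixels-divided-by-degrees estimate would suggest.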
Fortunately, knowing the tangent-space projection parameters of the intermediate render target and the coefficients used during lens distortion correction, one can calculate the precise fields of view and compare them directly. The first approach is to compare figures 5 and 10, which are drawn to the same scale and are replicated here for convenience:
Figure 5: Visualization of HTC Vive’s lens distortion correction. Left: Distortion correction mesh in display space. Right: Distortion correction mesh mapped to intermediate render target, drawn in tangent space. Only parts of the mesh intersecting the render target’s domain (black rectangle) are shown.
Figure 10: Visualization of Oculus Rift’s lens distortion correction. Left: Distortion correction mesh in display space. Right: Distortion correction mesh mapped to intermediate render target, drawn in tangent space. Only parts of the mesh intersecting the render target’s domain (black rectangle) are shown.
However, while they are to scale, comparing these two diagrams is somewhat misleading (which is why I did not attempt to overlay them on top of each other). They are drawn in tangent space, and while tangent space is important for the VR rendering pipeline, there is no linear relationship between differences in tangent space size and the resulting differences in observed FoV.
There is no single way to draw a headset’s FoV in a 2D diagram that works for all purposes, but in this case a better comparison can be drawn by transforming the FoV extents to polar coordinates, where each point on the display is drawn along its correct lateral direction as seen from the central “forward” or 0° direction, at a distance from the diagram’s center that is proportional to that point’s angle away from the 0° direction.
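The conversion from tangent space to such a polar diagram is straightforward: keep each point’s lateral direction, but replace its tangent-space distance from the forward direction with the corresponding angle. A minimal sketch:

```python
import math

def tangent_to_polar(tx, ty):
    """Map a tangent-space point (tx, ty) to 2D coordinates in a polar FoV
    diagram: the plotted distance from the origin equals the point's angle
    (in degrees) away from the forward direction, while its lateral
    direction in the plot is preserved."""
    r_tan = math.hypot(tx, ty)
    if r_tan == 0.0:
        return (0.0, 0.0)
    angle_deg = math.degrees(math.atan(r_tan))  # angle off the forward direction
    scale = angle_deg / r_tan                   # rescale direction to new radius
    return (tx * scale, ty * scale)

# A point one tangent unit to the right of forward lies 45 degrees off-axis,
# so it maps to approximately (45.0, 0.0) in the diagram.
```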
Figures 12 and 13 show the Vive’s and Rift’s FoVs, respectively, in polar coordinates. In the Vive’s case, Figure 12 does not show the full FoV area implied by the intermediate render target (minus the “missing area”, see Figure 5), as the Vive’s rendering pipeline does not draw to the entire intermediate target, but only to the inside of a circle around the forward direction.
Figure 12: Combined field of view (both eyes) of the HTC Vive in polar coordinates. Each grey circle indicates 5° away from the forward direction.
Figure 13: Combined field of view (both eyes) of the Oculus Rift in polar coordinates. Each grey circle indicates 5° away from the forward direction.
In addition, I am showing the combined field of view of both eyes for each headset, to illustrate the different amounts of binocular overlap between the two headsets. Figure 12 shows that the Vive’s overall FoV is roughly circular, with the lens-shaped “missing area” on the inside of either eye. Aside from that area, the per-eye fields of view overlap perfectly.
Figure 13 shows that the Rift’s FoV is more rectangular, as it is using the entire rectangle of the intermediate render target, and has a distinct displacement between the two eyes, to increase total horizontal FoV at the cost of binocular overlap. Also of note is that the Rift’s FoV is asymmetric in the vertical direction. The screens are slightly shifted downward behind the lenses, to better place the somewhat smaller FoV in the user’s overall field of vision.
Unlike in tangent space, it does make sense to directly compare these polar-coordinate diagrams, and I am doing so in Figure 14:
Figure 14: The fields of view of the HTC Vive and Oculus Rift overlaid in polar coordinates. Each grey circle indicates 5° away from the forward direction.
Notably, the Rift’s entire binocular FoV, including the non-stereoscopic side regions, is exactly contained within the Vive’s binocular overlap area.
Extracting the Oculus Rift’s Lens Distortion Correction Coefficients
As I mentioned in the beginning, Oculus are treating the Rift’s lens distortion correction method as a trade secret. Unlike the Vive, which, through OpenVR, offers a function to calculate the lens-corrected tangent space position of any point on the display, Oculus’ VR run-time does all post-rendering processing under the hood.
So how did I generate the data in figures 6-10 and 13? The hard way, that’s how. I wrote a small libOVR-based VR application that ignored 3D and head tracking and all that, and simply rendered a checkerboard in tangent space, directly into the intermediate render target. Then I used a debugging mode in libOVR’s mirror window functionality to show a barrel-distorted view of that checkerboard on the computer’s main monitor, where I took a screenshot (see Figure 15).
Figure 15: Screen shot from libOVR mirror window, showing a barrel-distorted rendering of a checkerboard drawn directly in tangent space. Unfortunately, the image shows no sign of chromatic aberration correction.
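The tangent-space test pattern itself is trivial to generate; the Rift-specific parts, i.e., rendering it into libOVR’s swap chain and grabbing the mirror window, are omitted here. A sketch of the checkerboard (the cell size is an arbitrary choice, not necessarily the one I used):

```python
def checkerboard(width, height, cell):
    """Return a height x width grid of 0/1 checker values, suitable for
    uploading as a tangent-space test texture."""
    return [[((x // cell) + (y // cell)) & 1 for x in range(width)]
            for y in range(height)]

# For instance, at the Rift's default render target size:
# board = checkerboard(1344, 1600, 64)
```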
To my surprise, the barrel-distorted view did not show any signs of chromatic aberration correction. So it was not, as I had hoped, a 1:1 representation of the image painted to the Rift’s display (see Figure 16 for proof that the Rift’s lenses do suffer from chromatic aberration), but some artificial recreation thereof, potentially generated with a different set of lens distortion coefficients. Even in the best case, this meant I was not going to be able to create separate resolution curves for the three primary color channels.
I barreled ahead nonetheless, and since the results I got made sense, I assume there is at least some relationship between this debugging output and the real display image. Next, I fed the screen shot into my lens distortion correction utility, which extracted the interior corner points of the distorted grid, and then ran those corner points through a non-linear optimization procedure that, in the end, spat out a set of distortion correction coefficients that mapped the input image to the original drawing I had made in tangent space to about 0.5 pixels tolerance. With those correction coefficients in hand, plus the tangent-space projection parameters given to me by libOVR, I was able to generate the diagrams and figures I needed.
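The fitting step can be illustrated with a stripped-down version of the problem. My actual fit used several coefficients and a full non-linear optimization over the extracted corner points; the sketch below fits a single radial coefficient k in the common model r_u = r_d·(1 + k·r_d²), which happens to be linear in k and therefore solvable in closed form:

```python
def fit_radial_k(pairs):
    """Least-squares fit of k in r_u = r_d * (1 + k * r_d^2) from matched
    (distorted, undistorted) radii. Minimizing sum((r_u - r_d - k*r_d^3)^2)
    over k yields the closed-form solution below."""
    num = sum(rd ** 3 * (ru - rd) for rd, ru in pairs)
    den = sum(rd ** 6 for rd, _ in pairs)
    return num / den

# Synthetic sanity check: generate radii from a known coefficient, recover it.
true_k = 0.22
pairs = [(rd, rd * (1.0 + true_k * rd ** 2))
         for rd in (0.1 * i for i in range(1, 11))]
recovered = fit_radial_k(pairs)       # matches true_k up to rounding
```

Real extracted corner points are noisy, of course, which is why the fit in the article only converged to about 0.5 pixels tolerance rather than exactly.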
Figure 16: Through-the-lens photo of Oculus Rift’s display, showing a calibration grid. There are noticeable color fringes from chromatic aberration, meaning that Oculus’ rendering pipeline does correct for it, but does not reflect that in the debugging mirror window mode.