Animals Can Use VR??

Apparently, yes!

One of my tasks when I was working at the Angelaki Lab at NYU was to somehow fit a VR headset onto a monkey. Another was to build a physical VR environment for mice. The purpose of both of these new experimental setups was to provide a more immersive simulation environment for the animals, so that researchers could record better, more accurate results when conducting experiments.

MonkeyVR/Headset

I modified an HTC Vive Cosmos Elite headset to fit in front of the monkey's head. The monkeys weren't going to wear the headsets directly, since they'd be too heavy for their little heads; instead, they sat in little enclosures with their head and arms sticking out, and the headset was mounted on the enclosure.

In order for the monkey to be able to use the headset, I had to tear apart some of the headset structure, mainly around where the screens were situated. I also had to completely replace the lenses with smaller lenses of the same focal length. I'm not sure anymore how I found out what the lens focal length was, but I remember finding it as a small detail on some manual or technical specification sheet. Anyway, I 3D printed parts that fit into the socket where the old lens used to be, and made sure that there was a spot for the new lens, as well as openings for some infrared LEDs, which were going to be used for eye tracking (the eye tracker used was from Pupil Labs). Lastly, I also 3D printed the mount that would interface between the headset and the enclosure.

To ensure that what I built was actually going to work, the postdoc I was working with and I simply held the headset in front of our own faces and confirmed that we could still see a stereoscopic image.

MonkeyVR/Software

The software was developed almost entirely in Unity and C#; there were also a couple of custom peripherals built with Arduinos and programmed in C. I used the SteamVR and Pupil Labs third-party plugins to help program everything. Several different experiments were available, and many parameters could be set for each one. Eye-tracking calibration was also available.
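To give a rough idea of how per-experiment parameters might be exposed in Unity, here is a hypothetical sketch (not the lab's actual code; every class and field name here is invented for illustration) using a ScriptableObject, which lets an experimenter save and swap parameter sets as assets without touching code:

```csharp
using UnityEngine;

// Hypothetical sketch of a per-experiment parameter set. Creating these as
// assets lets an experimenter save and swap configurations between sessions.
// All names, fields, and default values are illustrative only.
[CreateAssetMenu(menuName = "Experiments/Trial Parameters")]
public class TrialParameters : ScriptableObject
{
    [Header("Stimulus")]
    public float trialDurationSeconds = 2.0f;
    public float stimulusSpeed = 0.5f;        // virtual m/s
    public float coherence = 1.0f;            // 0..1

    [Header("Reward")]
    public float rewardWindowDegrees = 5.0f;  // acceptable gaze/response error
    public float rewardDurationMs = 150f;     // solenoid open time

    [Header("Inter-trial")]
    public float interTrialIntervalSeconds = 1.0f;
}
```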

The modified headset had an interpupillary distance (IPD) of around 28 mm to 30 mm, far narrower than the human IPD the default rendering assumes, so some changes had to be made to camera rendering. Specifically, I had to modify the camera matrices so that the centers of the two stereoscopic eyes were closer together. This was probably the most complicated task, but we were able to confirm the correct camera matrix parameters both mathematically and empirically.
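In Unity terms, the idea looks roughly like the sketch below: override each eye's view matrix so the stereo baseline matches the smaller IPD. This is a minimal illustration assuming Unity's Camera.SetStereoViewMatrix API, not the code I actually used, and the component name and 30 mm default are just placeholders for the target baseline mentioned above.

```csharp
using UnityEngine;

// Minimal sketch (not the lab's actual code): shrink the stereo baseline to a
// monkey-sized IPD by overriding the per-eye view matrices each frame.
[RequireComponent(typeof(Camera))]
public class NarrowIpdRig : MonoBehaviour
{
    [SerializeField] private float ipdMeters = 0.030f; // ~30 mm baseline (assumed)

    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
    }

    void LateUpdate()
    {
        float half = ipdMeters * 0.5f;
        Matrix4x4 centerView = cam.worldToCameraMatrix;

        // The left eye sits half the IPD to the left of center, so world points
        // shift by +half along x in its camera space; the right eye is mirrored.
        Matrix4x4 left  = Matrix4x4.Translate(new Vector3( half, 0f, 0f)) * centerView;
        Matrix4x4 right = Matrix4x4.Translate(new Vector3(-half, 0f, 0f)) * centerView;

        cam.SetStereoViewMatrix(Camera.StereoscopicEye.Left, left);
        cam.SetStereoViewMatrix(Camera.StereoscopicEye.Right, right);
    }
}
```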

The experiments that were available were based on previous experiments run at the lab; their methods were well documented in papers published by the principal investigator (PI) and post-doctorates. I also modified these pre-existing experiments to collect new types of data and test new hypotheses. I wrote code to interface with a multitude of peripherals as well, some of them fairly uncommon hardware.
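For the Arduino-based peripherals, communication was typically over a serial link. The snippet below is a generic, hypothetical example of how such a device might be polled from C# (it assumes Unity's .NET 4.x API level for System.IO.Ports; the port name, baud rate, and line-based message format are assumptions, not the lab's actual protocol):

```csharp
using System.IO.Ports;
using UnityEngine;

// Hypothetical example of polling an Arduino-based peripheral over serial.
// Port name, baud rate, and message format are assumptions for illustration.
public class SerialPeripheral : MonoBehaviour
{
    [SerializeField] private string portName = "COM3";
    [SerializeField] private int baudRate = 115200;

    private SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 1 };
        port.Open();
    }

    void Update()
    {
        try
        {
            // One ASCII line per sample, e.g. a button state or sensor value.
            string line = port.ReadLine();
            Debug.Log($"Peripheral: {line}");
        }
        catch (System.TimeoutException)
        {
            // No data available this frame; carry on.
        }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```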

MouseVR/Hardware

MouseVR involved a lot more custom hardware. I built the rig on top of a hexapod robot from Symmetrie (a French robotics company), which was capable of moving with 6 degrees of freedom. The rig consisted of a four-sided frame constructed from 80/20 aluminum extrusion. One side, considered the front, had a door that swung open so that the interior was accessible. Each side had a monitor attached to it. In the middle of the rig were a large trackball and an interface for what we called a "mouse holder". As the name suggests, it held mice in place, but it still allowed them to rotate and walk freely on the trackball. At the top of the rig were two projectors: while the monitors displayed what was in front of, behind, and to either side of the mouse, the two projectors displayed the floor.

There was a control box that controlled the robot; it fell into a state of mild disrepair after only a few sessions of working with it, because it hadn't been used in so long. I spent many days talking with support, which was based in France. There was a lot of troubleshooting to find out which cables, motors, fuses, etc. were not working, but eventually it was fixed and didn't need any fixing after that.

MouseVR/Software

This software was also developed in Unity and C#, and many of the peripherals were coded in C. There was only one type of experiment available, but several variations were created by adjusting different parameters.

The main challenge in getting this to work was interfacing with the robot. The control box communicated via TCP/IP, and timing was everything: if I fell behind by even one frame, or there was any lag or other interruption, the robot would stop immediately. I had to make sure everything was optimized as much as possible to provide the most consistent frame rate possible, and the experiments themselves relied heavily on precise timing as well. Additionally, the previous engineer had written some algorithms in C++ for "motion cueing" (a term for simulating the acceleration one feels when moving), and there didn't seem to be a good way to interface with the control box from C#, so I had to write libraries in C++ and compile them as DLLs in order to call those functions from C#.
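Calling into those C++ DLLs from C# went through P/Invoke. The sketch below shows the general shape of such a binding; the DLL name, function name, and signature are invented for illustration and are not the actual motion-cueing interface:

```csharp
using System.Runtime.InteropServices;

// Hypothetical P/Invoke binding to a native motion-cueing library compiled as
// a DLL. The library name, entry point, and signature are illustrative only.
public static class MotionCueingNative
{
    // Conceptual C++ export:
    // extern "C" __declspec(dllexport)
    // int ComputeCue(const double* accel, int n, double* platformPose);
    [DllImport("MotionCueing", CallingConvention = CallingConvention.Cdecl)]
    public static extern int ComputeCue(double[] accel, int n, double[] platformPose);
}

// Example use: pass this frame's simulated acceleration and get back a
// 6-DOF pose (x, y, z, roll, pitch, yaw) to send to the hexapod.
public static class MotionCueingExample
{
    public static double[] Step(double[] accelXyz)
    {
        var pose = new double[6];
        MotionCueingNative.ComputeCue(accelXyz, accelXyz.Length, pose);
        return pose;
    }
}
```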

Legacy

I was supposed to write some papers based on experiments conducted with the setups I built, but because I started my Master's program in Japan, and later because of some unforeseen circumstances at the lab, we never wrote them. An old colleague has since inherited the lab, however, and actually asked me to help set it back up in Minnesota. I declined the offer, but they're still using my setups, or something based on what I built.
