CISC 499 projects in the BioMotion Lab

The BioMotion Lab at Queen's University studies how people see. To that end, we design experiments that are often reminiscent of computer games. We also use computer vision to test hypotheses about how people recognize objects, events, and particularly other people in our visual environment. Check us out online, or contact Dr. Niko Troje for more info.

Vision in virtual reality: pictorial vs visual spaces

Imagine looking at a picture of a person facing the camera. When the picture is mounted on the wall of your living room, the depicted person seems to be looking at you, with their head oriented towards you. That impression remains even if you change your position in front of the picture. Interestingly, the fact that the depicted person keeps facing you while you move past the image doesn't mean that you perceive motion. The person doesn't seem to rotate.

This observation demonstrates that our visual system treats pictures very differently from objects in the real world. If the real person depicted in the picture were always facing us, no matter where we stood, we would perceive motion: we would see the person rotating.

We are investigating this phenomenon in virtual reality. Here we can control all the attributes of the rendering that might inform our visual system whether it is dealing with a picture of an object or with an object in real space.

Your task is to develop an experimental environment using the Oculus Rift and Unity3D that allows us to independently manipulate the presence or absence of a picture frame, perspective foreshortening, stereoscopic depth, motion parallax, and shading. You would learn a lot about 3D computer graphics, but also about the visual perception of shape and depth, and about how to play with the cues our visual system uses in that respect.
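One natural way to organize these independent manipulations is as a full factorial design over binary cue flags. The sketch below (plain TypeScript; the interface and cue names are hypothetical illustrations, not part of any existing lab code) enumerates every combination of the five cues, which an experiment controller could then present in randomized order:

```typescript
// Hypothetical flags for the five cues named in the project description.
interface CueCondition {
  pictureFrame: boolean;   // show or hide a frame around the rendering
  foreshortening: boolean; // perspective vs. orthographic projection
  stereo: boolean;         // stereoscopic depth on or off
  motionParallax: boolean; // update the view with head position, or not
  shading: boolean;        // shaded vs. flat rendering
}

const CUES = [
  "pictureFrame", "foreshortening", "stereo", "motionParallax", "shading",
] as const;

// Enumerate the full factorial design: 2^5 = 32 conditions.
function allConditions(): CueCondition[] {
  const conditions: CueCondition[] = [];
  for (let mask = 0; mask < 1 << CUES.length; mask++) {
    const c = {} as CueCondition;
    CUES.forEach((cue, i) => {
      c[cue] = Boolean(mask & (1 << i)); // bit i of the mask drives cue i
    });
    conditions.push(c);
  }
  return conditions;
}
```

Enumerating conditions from a bitmask keeps the design balanced automatically: every cue appears switched on in exactly half of the 32 conditions.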

Online experiments on biological motion perception

My lab has developed a prototype of an experimental suite that allows the experimenter to generate stimuli and then combine them into experiments that probe our ability to recognize people from their motion. The participants' responses in these experiments are sent back to a MySQL database on our lab's server.
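The record sent back for each trial might look something like the sketch below (TypeScript; the field names and validation are hypothetical — the actual table columns on the lab server may differ). It shows one way to serialize a response for an HTTP POST to a server-side script that writes it into the database:

```typescript
// Hypothetical shape of one participant response per trial.
interface TrialResponse {
  experimentId: string;   // which experiment the trial belongs to
  participantId: string;  // anonymous participant code
  trialIndex: number;     // position of the trial within the session
  stimulusId: string;     // which motion stimulus was shown
  response: string;       // the participant's answer (e.g. "left"/"right")
  reactionTimeMs: number; // time from stimulus onset to response
}

// Serialize a response for transmission to the server.
function encodeResponse(r: TrialResponse): string {
  return JSON.stringify(r);
}

// Parse and minimally validate an incoming response payload before
// it is written to the database.
function decodeResponse(json: string): TrialResponse {
  const r = JSON.parse(json) as TrialResponse;
  if (typeof r.participantId !== "string" || typeof r.reactionTimeMs !== "number") {
    throw new Error("malformed trial response");
  }
  return r;
}
```

Validating the payload on the server side, rather than trusting the client, is the usual design choice when anonymous participants submit data over the web.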

The prototype is functional in many respects, but its user interface is awkward, it still lacks some important functionality, and it is written in Adobe Flash.

Your task will be to turn the project into an attractive web-based experimental system that can be used by course instructors, student experimenters, and established researchers. As Flash doesn't seem to have a future, we have to port the project to WebGL (using Adobe Animate), add an intuitive front end, write some documentation, and package the whole project into a good-looking, easy-to-use web page.

You will use your web programming skills (HTML5, WebGL, PHP, MySQL), and you will learn a lot about design principles in psychophysical experimentation. Once the final piece goes online, it will become your showpiece for impressing peers, parents, and potential employers.