I've been thinking about mixed reality a lot lately. Mixing footage of a person inside VR with rendered footage of the virtual world from the same vantage point is a really neat way of demonstrating virtual reality to folks who haven't tried it. This weekend, I started wondering about the inverse: mixing real world footage into VR itself.
Over the last few days I've put together a first proof of concept in Unity3D using a PS3 Eye Camera and my turntables and mixer. The PS3 Eye Camera is cheap ($10) and offers various high frame rate capture modes, which makes it ideal for this experiment. Frames from the camera are captured via a Unity native plugin I wrote and then projected onto accurately sized geometry in the scene. Audio output from the mixer is also fed into Unity for processing, enabling realtime 3D visualizations driven by spectrum data: pulsating speakers, emitted note particles, flashing lights and a nifty 3D analyzer.
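To give a rough idea of the camera side, here's a minimal sketch of pulling frames from a native capture plugin into a Unity texture. The exported function names (EyeStartCapture and friends) and the capture settings are assumptions for illustration; my actual plugin's interface differs, but the shape is the same: poll the plugin each frame, copy the bytes into a Texture2D, and let the material on the scaled geometry do the rest.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical bindings to a native PS3 Eye capture plugin --
// the exported function names here are assumptions, not the
// actual plugin API.
public class EyeCameraFeed : MonoBehaviour
{
    [DllImport("PS3EyePlugin")] static extern bool EyeStartCapture(int width, int height, int fps);
    [DllImport("PS3EyePlugin")] static extern bool EyeGetFrame(byte[] buffer, int bufferLength);
    [DllImport("PS3EyePlugin")] static extern void EyeStopCapture();

    const int Width = 640, Height = 480, FPS = 60;
    Texture2D _texture;
    byte[] _frameBuffer;

    void Start()
    {
        _texture = new Texture2D(Width, Height, TextureFormat.RGB24, false);
        _frameBuffer = new byte[Width * Height * 3];
        EyeStartCapture(Width, Height, FPS);

        // Project the feed onto geometry sized to match the real
        // turntables/mixer by assigning the texture to its material.
        GetComponent<Renderer>().material.mainTexture = _texture;
    }

    void Update()
    {
        // Copy the latest camera frame into the texture each frame.
        if (EyeGetFrame(_frameBuffer, _frameBuffer.Length))
        {
            _texture.LoadRawTextureData(_frameBuffer);
            _texture.Apply();
        }
    }

    void OnDestroy() { EyeStopCapture(); }
}
```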
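The audio side is simpler than it sounds, since Unity exposes FFT output directly via AudioSource.GetSpectrumData. Here's a stripped-down sketch of the pulsating-speaker idea, assuming the mixer's output is routed into an AudioSource on the speaker object; the band split and scaling constants are illustrative, not the values from my demo.

```csharp
using UnityEngine;

// A minimal sketch of spectrum-driven visualization, assuming the
// mixer's audio is routed into an AudioSource on this GameObject.
[RequireComponent(typeof(AudioSource))]
public class SpeakerPulse : MonoBehaviour
{
    const int SampleCount = 512;  // must be a power of two
    readonly float[] _spectrum = new float[SampleCount];
    Vector3 _baseScale;

    void Start() { _baseScale = transform.localScale; }

    void Update()
    {
        GetComponent<AudioSource>().GetSpectrumData(_spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the low bins to approximate bass energy, then pulse
        // the speaker mesh's scale with it. The bin count and gain
        // here are arbitrary tuning values.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += _spectrum[i];

        transform.localScale = _baseScale * (1f + bass * 4f);
    }
}
```

The same spectrum array drives the other effects too: emit particles when a band crosses a threshold, map bands to light intensities, or draw the whole array as bars for the 3D analyzer.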
For a few days' work, I think this turned out quite well, and there are lots of possibilities here. For example, using the Eye's 120fps mode with markers on the tone arm and platter, visual tracking could determine their orientation and update the virtual representations directly rather than relying on the video feed alone. Multiple cameras could also provide higher quality video of each turntable and the mixer.
I hope you enjoyed this experiment. Let me know your thoughts on Twitter @kode80!