Making the “virtual” come to life

Queen’s Human Media Lab set to unveil programmable matter

By Chris Moffatt Armes

November 5, 2015

From foldable phones to holographic video conferencing, the team at the Human Media Lab has long been ahead of the curve in creating innovative and groundbreaking technology. Their latest project is no exception, as they attempt to take the “virtual” out of virtual reality.

On Monday, Nov. 9, Queen’s professor Roel Vertegaal and his students will unveil the BitDrones system at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina. The BitDrones system uses nano-quadcopter drones to create complex 3D models in real-time.

[ShapeDrone]
“ShapeDrones”, one of three BitDrone forms unveiled by the Human Media Lab, are encased in a lightweight mesh over a 3D-printed frame, serving as building blocks for complex 3D models. (Image credit: Human Media Lab/Queen's University)

“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality,” says Dr. Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

Dr. Vertegaal and his team at the Human Media Lab created three types of BitDrones, each with different applications. “PixelDrones” are equipped with one LED and a small low-resolution display, and can be used to form simple shapes or display small amounts of information. “ShapeDrones” are encased in a lightweight mesh over a 3D-printed frame, serving as building blocks for complex 3D models. “DisplayDrones” are fitted with a curved, flexible high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. These drones are intended for use in virtual telepresence settings, allowing users to appear, communicate and even tour a remote location via Skype.

[DisplayDrone]
A remote user demonstrates the DisplayDrone's remote telepresence functionality. (Image credit: Human Media Lab/Queen's University)

All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time. Using motion capture technology like that used in film and video game production, the system can track the drones’ locations and allow users to move drones in space without physically touching them.

“We call this a ‘real reality’ interface rather than a virtual reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset,” says Dr. Vertegaal.

While the system currently supports only a dozen comparatively large drones, measuring 2.5 to 5 inches across, the team at the Human Media Lab is working to scale it up to support thousands. These future drones would measure no more than half an inch, allowing users to create more seamless, high-resolution objects.

More information on the BitDrones system, including photographs of the drone types and videos of them in use, can be found at http://www.hml.queensu.ca/bitdrones.
