The tap shoes, surrounded by tap dancing videos, are generated when the device camera detects the AR marker that anchors these 3D objects. For better detection results, point the device camera close to the designated AR marker before running the app. It is also best not to tap the screen before the 3D objects anchored to the AR marker are generated.

The project also uses Vuforia’s ground plane detection feature to generate footprints on a flat surface whenever the user taps the screen. Each footprint is given a slight variation in its angle to simulate vigorous dance patterns. You may also move around the models and generate footprints throughout the space.
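The angle variation can be thought of as a small random twist applied each time a footprint is placed at the tapped point on the detected ground plane. The sketch below is only an illustration of that idea in Python, not the project’s actual Unity/Vuforia script; the `Footprint` class, `MAX_VARIATION` value, and `place_footprint` helper are all hypothetical names chosen for the example.

```python
import random
from dataclasses import dataclass

# Hypothetical footprint record; the fields mirror what a placed
# footprint needs: a position on the ground plane and a facing angle.
@dataclass
class Footprint:
    x: float            # position on the detected ground plane (metres)
    z: float
    yaw_degrees: float  # rotation about the vertical axis

MAX_VARIATION = 25.0    # assumed spread, in degrees, for the "vigorous" look

def place_footprint(x: float, z: float, base_yaw: float = 0.0) -> Footprint:
    """Spawn a footprint at a tapped point with a small random twist in its angle."""
    yaw = base_yaw + random.uniform(-MAX_VARIATION, MAX_VARIATION)
    return Footprint(x=x, z=z, yaw_degrees=yaw)

# Example: three taps along a line produce footprints whose angles all differ slightly.
for i in range(3):
    print(place_footprint(x=0.0, z=0.3 * i))
```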

The idea is to eventually have the footprints lead people around the space, perhaps serving as simple instructions for a tap dance routine. As people move through the space, more black-and-white and colour videos will showcase the history of tap dancing through memorable performances. We would also play with the contrast of old vs. new and black-and-white vs. colour through the material and texture of the 3D tap shoe models, as well as a tapping animation and sound for the shoes.

Adiola Palmer, Erik Fernandez, Kwame Kyei-Boateng, Jiaoxuan Hou