The software was developed using Processing, a Java library for interactive
multimedia applications. This library allowed us to develop the software
quickly, because code can be run instantly inside the IDE (Integrated
Development Environment) without a long compilation process, shortening the
road from prototype to end product.
We also made use of the Leap Motion P5 library, an implementation of the
Leap Motion API (Application Programming Interface) for Processing. This
library exposes custom events from the Leap Motion, such as hand detection
and certain gestures, in a practical form that could be used during the
development of the software.
Because the software targets a general young audience, we aimed for
something visually colorful yet minimal, so that the visual aspects would
not overshadow the sound, which is what the user should focus on. Several
basic shapes with different colors were therefore generated using Processing
to create the main user interface. Additional animations were also added to
emphasize certain interactions, as a way to notify users that they have
performed a meaningful action.
The audio element is also handled inside Processing, using an audio library
named Minim. We initially used it to trigger a piano sample, but after
further research we decided that a basic sine wave would be better due to
its simplicity. One of Minim's audio generators is used to play a short
sine wave every time the user touches the rectangles on the sides. The
circle in the center also reacts to the audio, further enhancing the
interactive experience by letting the user know that a sound is being
played.
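The short tone can be sketched in plain Java as follows. This is not the Minim code itself but a stand-in for what its oscillator produces; `SAMPLE_RATE`, `generate`, and the fade-out envelope are our own illustrative names.

```java
// Minimal sketch of the short sine-wave tone triggered on touch.
// Plain-Java stand-in for Minim's oscillator; names are illustrative.
public class SineTone {
    static final float SAMPLE_RATE = 44100f;

    // Generate durationSec seconds of a sine wave at freqHz, with a
    // linear fade-out so the tone ends without an audible click.
    static float[] generate(float freqHz, float durationSec) {
        int n = (int) (SAMPLE_RATE * durationSec);
        float[] samples = new float[n];
        for (int i = 0; i < n; i++) {
            float t = i / SAMPLE_RATE;
            float fade = 1f - (float) i / n;   // simple decay envelope
            samples[i] = fade * (float) Math.sin(2 * Math.PI * freqHz * t);
        }
        return samples;
    }

    public static void main(String[] args) {
        float[] tone = generate(440f, 0.2f); // a short A4 tone
        System.out.println(tone.length);
    }
}
```

In the actual sketch, a buffer like this is handed to the audio output each time a touch is detected, so every contact produces one brief, self-contained tone.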
We encountered several difficulties while building the software; the two
most crucial were recognizing the hand-grasping event and differentiating
the left and right hand with the Leap Motion. The first problem was solved
by using the hand-closing event detector, but some custom code also had to
be added, because the event initially retriggered the sine wave several
times, creating an unwanted noisy delay effect. We also tried threading to
solve this, but it caused the exported application to crash when no hands
were recognized during startup.
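The retrigger fix amounts to a cooldown guard: repeated grasp events arriving within a short window are ignored so the tone fires only once per grasp. A minimal sketch, with our own class and parameter names (the Leap Motion and Minim calls are omitted):

```java
// Sketch of the retrigger guard: accept a grasp event only if enough
// time has passed since the last accepted one, so a single grasp does
// not stutter the sine wave into a noisy delay-like effect.
public class GraspDebouncer {
    private final long cooldownMs;
    private long lastTriggerMs = Long.MIN_VALUE / 2; // "long ago"

    public GraspDebouncer(long cooldownMs) {
        this.cooldownMs = cooldownMs;
    }

    // Returns true only when the cooldown has elapsed; the caller
    // triggers the sine wave on true and ignores the event otherwise.
    public boolean shouldTrigger(long nowMs) {
        if (nowMs - lastTriggerMs >= cooldownMs) {
            lastTriggerMs = nowMs;
            return true;
        }
        return false;
    }
}
```

A guard like this runs on the sketch's main loop, which is why the separate thread turned out to be unnecessary.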
In the end, the threading was removed and the gestures are still detected.
Differentiating the left and right hand was settled by comparing the
positions of both hands with respect to the Leap Motion's origin. This did
the job, although it can be tricky when both hands are in the same
position. Version 2 of the Leap Motion API, which came out after the
software had been developed, resolves this issue and will be considered in
further development.
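The position-based disambiguation can be sketched as a simple comparison of palm x coordinates relative to the device's origin; the hand with the smaller x is taken as the left hand. The class and method names here are our own illustration, not the Leap Motion API:

```java
// Sketch of the left/right workaround: with the Leap Motion's origin
// between the hands, the palm with the smaller x coordinate is assumed
// to be the left hand. Fails exactly when both x values coincide,
// which matches the tricky case noted in the text.
public class HandSorter {
    // Given the palm x positions of two detected hands, returns
    // { leftX, rightX } ordered by position along the device's x axis.
    static float[] sortHands(float xA, float xB) {
        return xA <= xB ? new float[] { xA, xB }
                        : new float[] { xB, xA };
    }
}
```

Version 2 of the Leap Motion API reports handedness directly, which is why it removes the need for this heuristic.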