
Released

Conference Paper

HandSonor: A Customizable Vision-based Control Interface for Musical Expression

MPS-Authors

Sridhar,  Srinath
Computer Graphics, MPI for Informatics, Max Planck Society

Citation

Sridhar, S. (2013). HandSonor: A Customizable Vision-based Control Interface for Musical Expression. In P. Baudisch, M. Beaudouin-Lafon, & W. E. Mackay (Eds.), CHI 2013 Extended Abstracts (pp. 2755-2760). New York, NY: ACM. doi:10.1145/2468356.2479505.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0015-38A2-D
Abstract
The availability of electronic audio synthesizers has led to the development of many novel control interfaces for music synthesis. The importance of the human hand as a communication channel makes it a natural candidate for such a control interface. In this paper I present HandSonor, a novel non-contact and fully customizable control interface that uses the motion of the hand for music synthesis. HandSonor uses images from multiple cameras to track the real-time, articulated 3D motion of the hand without using markers or gloves. I frame the problem of transforming hand motion into music as a parameter-mapping problem for a range of instruments. I have built a graphical user interface (GUI) that allows users to dynamically select instruments and map the corresponding parameters to the motion of the hand. I present results of hand-motion tracking, parameter mapping, and real-time audio synthesis, which show that users can play music using HandSonor.
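The parameter-mapping idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the hand-parameter names, instrument parameters, and value ranges here are all hypothetical, assuming tracked hand parameters normalized to [0, 1] and a user-selected linear mapping of the kind the GUI would configure.

```python
def linear_map(value, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly rescale value from [src_lo, src_hi] into [dst_lo, dst_hi]."""
    t = (value - src_lo) / (src_hi - src_lo)
    t = max(0.0, min(1.0, t))  # clamp to the source range
    return dst_lo + t * (dst_hi - dst_lo)

# Hypothetical user-defined mapping: which hand parameter drives which
# instrument parameter, and over what output range.
mapping = {
    "pitch_hz": ("fingertip_height", 220.0, 880.0),  # A3..A5
    "volume":   ("hand_openness",    0.0,   1.0),
}

def map_hand_to_instrument(hand_params, mapping):
    """Convert tracked hand parameters (normalized to [0, 1]) into
    instrument control parameters via the selected mapping."""
    return {
        instr: linear_map(hand_params[hand_key], 0.0, 1.0, lo, hi)
        for instr, (hand_key, lo, hi) in mapping.items()
    }

hand = {"fingertip_height": 0.5, "hand_openness": 0.8}
print(map_hand_to_instrument(hand, mapping))
# → {'pitch_hz': 550.0, 'volume': 0.8}
```

Because the mapping is just data (a dictionary), swapping instruments or reassigning which hand parameter drives which control amounts to replacing entries, which matches the abstract's description of dynamically selectable, user-customizable mappings.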