- Create a DMI (digital musical instrument) that allows its user to control the vast sound spaces afforded by digital synthesis algorithms whilst providing a sense of connection with those sounds and a sense of engagement for both performer and audience. The instrument should also present a learning curve: challenging enough that practice is rewarded with increased control and mastery, yet approachable for novice players.
- Controller should provide transducers chosen on the basis of research into which gestures are best suited to performing particular musical functions. Ideally the transducers will also be able to provide feedback, such as haptic feedback, to contribute to the sense of connection mentioned above.
- Controller could also aim to provide vibrotactile feedback and sound localisation via speaker(s) mounted in its body.
- Controller will not be any kind of sequencer.
- Controller likely to be an "instrument-inspired controller."
-> The concept is easy to grasp
-> Offers some users immediate familiarity
-> Suggests a suitable gesture vocabulary, potentially even for novices and non-musicians
-> Currently leaning towards guitar
- Sensors connected to the computer via an Arduino, with the synthesis algorithm run in Max/MSP
- Emphasis is on controller design, so an open-source synthesis algorithm could potentially be used in the final model
-> Will have to use simple test algorithms whilst designing
- Mapping strategies to be properly considered and grounded in the suggestions of the research
