Purpose

The purpose of this blog is to enable my university supervisors and me to easily share multimedia content regarding ideas for my Final Year Project, and to allow ideas and opinions to be discussed.

Friday, 28 May 2010

oddmusic.com

The grand tour section of this site runs through a large number of weird and unusual instruments, giving plenty of detail, sound examples, videos and external links.

Silent Construction 2 Composed and Performed by Jaime Oliver

A video of the performance can be found here.

This system uses computer vision technology to track the movement of the performer's hands in a designated performance space by following several points identified on the hands. This could be referred to as an "empty-handed" or "naked" gesture system. The boundaries of the performance space can be seen when the video shows the captured image of the hands.

The page does not provide a technical description of the synthesis or mapping system used, other than to say that samples are involved. A colleague and I theorized that it is likely some kind of granular synthesis algorithm performed on the samples, transforming them until they are unrecognizable compared with the originals. For me there are times when it is extremely difficult to determine the sonic results of some of the gestures; the very small and delicate gesture beginning at 05:23 is particularly ambiguous. There are also clearly several sets of samples used throughout the piece, from more ambient sounds to chime/bell-like sounds with a more definite sense of pitch. It is unclear, however, how the set of samples being controlled changes through the piece. It is possible that this is being handled by another individual off camera.
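Since the page gives no technical detail, here is a minimal sketch of the granular idea we guessed at: short windowed grains are cut from a source sample and scattered across an output buffer until the original becomes an unrecognizable texture. Everything here (grain length, grain count, the sine-wave stand-in for a real sample) is my own illustrative choice, not anything confirmed about Oliver's system.

```python
import math
import random

def granulate(sample, grain_len=441, n_grains=200, out_len=44100, seed=0):
    """Very rough granular resynthesis: scatter short, Hann-windowed
    grains of the source sample across an output buffer. With enough
    grains the source becomes an abstract texture, which is the kind
    of transformation we guessed was at work in the piece."""
    rng = random.Random(seed)
    # Hann window applied to each grain to avoid clicks at grain edges
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
              for i in range(grain_len)]
    out = [0.0] * out_len
    for _ in range(n_grains):
        src = rng.randrange(0, len(sample) - grain_len)  # random read position
        dst = rng.randrange(0, out_len - grain_len)      # random write position
        for i in range(grain_len):
            out[dst + i] += sample[src + i] * window[i]  # overlap-add
    return out

# a dummy "sample": one second of a 220 Hz sine at 44.1 kHz
sample = [math.sin(2 * math.pi * 220 * n / 44100) for n in range(44100)]
texture = granulate(sample)
```

In a real system the grain positions and densities would be driven by the tracked hand gestures rather than by a random generator.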

Tuesday, 25 May 2010

Meeting with Hugues Genevois

A friend of mine referred me to Hugues Genevois of LAM. Hugues specializes in areas related to DMI production and has created and studied DMIs himself. I contacted him via e-mail, outlining my project and asking for any advice or suggestions he had, and he suggested we meet to talk further. I spent a couple of hours today talking to Hugues at the LAM institute, and below is a rough summary of our discussion. Hugues seemed rather enthusiastic and shares some of my beliefs. He gave me plenty of advice and suggested that we meet again in a few weeks to see how the project is progressing.
  • It may be interesting to think about using the controller to trigger and manipulate samples. Samples of real-world sounds allow very (naturally) rich sounds and timbres to be used and manipulated in a reasonably simple way (compared to attempting to synthesize such richness). Similar results could be obtained by using physical modelling (PM) algorithms obtained from open sources. This could be a nice way for the project to progress in terms of the synthesis algorithm: start by testing with fairly simple sine tone generators, progress onto more complex results using samples, and then move onto having naturalistic control over the internals of the sound, such as timbre, via a PM.
  • Hugues suggested gesture acceleration as an important and useful variable. It allows for more degrees of freedom without necessarily having to include extra transducers; for example, position on a linear pot can be complemented by the rate of change of position on the same pot. Using acceleration also requires more physical energy, which improves control and feedback for performer and audience. This aspect is engaging for the audience because they can clearly see the performer changing aspects of the sound, as opposed to the difference in sound between one finger position and the next, which is perhaps not as obvious.
  • The above point also indicates that the dimensions of the object are important. Given a slider pot, for example, it is more difficult to accurately produce gestures on a smaller slider than on a larger one. Emphasis should also be placed on having a substantial object to feel, touch and manipulate - the whole essence of my project!! Much more engaging for performer and audience! A substantially sized instrument would also allow more control using touch rather than visual feedback. This could allow the user to engage more with other musicians, both technically and in terms of showing enjoyment, and also with the audience. It also implies a greater possible degree of mastery, since experienced players tend to use feedback channels other than visual for control. A small instrument would perhaps require a lot of visual concentration.
  • One could incorporate sensors of different sizes for various musical tasks: for example, a very long position sensor for long and accurate gestures (continuous bow-like control, say) and smaller FSRs for gestures requiring less long-term accuracy.
  • The instrument should make use of the user's energy input in a big way in order to really connect the user. The timbre of the resulting sounds should be directly affected by the user's energy, i.e. by changes in attack strength. This can be done fairly easily in many systems: a different sample (of a different attack strength) can be played depending on attack strength; attack strength can be linked to the plucking, blowing or bowing strength variable of a PM; or attack strength can change the filter or the number of partials in a simple additive synthesis system - for example, more force introduces more high-pitched partials. Changing timbre with energy is perhaps more important than changing loudness.
  • It is important to experiment with various synthesis algorithm set-ups in order to arrive at the best sound and the best control styles. Linking a controller with software is very much like a luthier fine-tuning an instrument's design: it is important to make small and careful changes, listen to the resulting changes in sound, and continue the process until the desired sound is arrived at.
  • The way in which the signals resulting from the synthesis algorithm are transformed into sound is very important. The sound should be a living thing which is a direct result of the instrument. Using built-in speakers is a very good way to provide sound localization and vibrotactile feedback; it also increases engagement for the performer and the audience, and is a very important step in providing a link to the sounds. Multiple loudspeakers would be better than one, since loudspeakers are directional, which is very unnatural for music and sound in general. One could also consider a speaker pointing towards the musician for direct audio feedback.
  • Non-linear mapping strategies will often be required for the kind of control desired.
  • For tapping gestures like the ones I was thinking of, a piezo contact mic may be suitable. Using this method, control data would be derived from the audio signal of the mic rather than from pressure or resistance. The mic could be placed directly onto the top plate of the instrument, and tapping and banging the instrument would then produce a signal. This could be used, for example, to control the onset of a note, which could then be caused to resonate (using Karplus-Strong, for example) with a decay time based on the intensity of the attack.
It is most important to listen to sounds and try to understand what is happening. If one can develop a good grasp of the sonic consequences a physical action has, then one can attempt to reproduce such consequences in a DMI. Another important point is that fairly simple algorithms used with complex mappings often produce very good results, and that making simple changes, such as moving a filter, can be convincing.
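To make the attack-strength point above concrete, here is a toy sketch (my own, not Hugues') of a simple additive synth in which attack strength controls the number of partials, so that a harder hit sounds brighter rather than merely louder. All the numbers are placeholders.

```python
import math

SR = 44100  # sample rate (Hz)

def additive_note(freq, attack_strength, dur=0.5, max_partials=12):
    """Toy additive synth where attack strength (0..1) controls
    brightness: a harder attack unmasks more high partials, instead
    of only scaling the amplitude."""
    n_partials = 1 + int(attack_strength * (max_partials - 1))
    n = int(SR * dur)
    out = []
    for i in range(n):
        t = i / SR
        # 1/k amplitude roll-off keeps the harmonic sum bounded
        s = sum(math.sin(2 * math.pi * freq * k * t) / k
                for k in range(1, n_partials + 1))
        out.append(s / n_partials)
    return out, n_partials

soft, soft_p = additive_note(220.0, 0.1)  # gentle tap: few partials, dull
hard, hard_p = additive_note(220.0, 1.0)  # hard hit: all partials, bright
```

In a real instrument, `attack_strength` would come from an FSR or a piezo peak detector rather than being passed in by hand.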

I'm going to attempt to do some further research in various areas, including piezo mics and loudspeaker arrays, and to create a fairly solid first draft design of my DMI to show Hugues in a few weeks' time.
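As a first step towards the piezo idea above, here is a rough sketch of the Karplus-Strong approach Hugues mentioned, with the string's decay factor mapped from attack intensity so a harder tap rings longer. The mapping constants are my own guesses, not anything measured.

```python
import random

SR = 44100  # sample rate (Hz)

def karplus_strong(freq, attack, dur=1.0, seed=0):
    """Plucked-string sketch: a noise burst (the 'tap') circulates in a
    delay line and is low-pass averaged on each pass. The loss factor is
    mapped from attack intensity (0..1), so a harder tap on the piezo
    would resonate longer - my own guessed mapping."""
    rng = random.Random(seed)
    N = int(SR / freq)                    # delay-line length sets the pitch
    decay = 0.95 + 0.049 * attack         # harder attack -> slower decay
    buf = [rng.uniform(-1, 1) * attack for _ in range(N)]
    out = []
    for i in range(int(SR * dur)):
        s = buf[i % N]
        out.append(s)
        # two-point averaging low-pass plus the decay factor
        buf[i % N] = decay * 0.5 * (s + buf[(i + 1) % N])
    return out

soft = karplus_strong(110.0, attack=0.2)
hard = karplus_strong(110.0, attack=1.0)
```

In practice the `attack` value would be estimated from the peak amplitude of the piezo's audio signal at onset time.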

Great Resource

Through the emfinstitute timeline link below I came across the SBass and its creator Curtis Bahn. His personal website has a few interesting things on it. A series of recorded examples of his performances with the SBass can be found here, and a description of the instrument here.

Perhaps most important is his resources page, which is full of what look like incredibly useful links. The best thing I've found so far is a link to SensorWiki.org. Here can be found, amongst other things, a very detailed list of sensor types, each with its own in-depth description. Also available is a run-down of sensor interfaces, as well as tutorials and other resources.

Bahn's pages should definitely be looked into further and I can see SensorWiki becoming a definitive reference for me when it comes to finally selecting and working with the sensors of my DMI.

BoSSA: The Deconstructed Violin Reconstructed

BoSSA: The Deconstructed Violin Reconstructed - Dan Trueman, Perry Cook

The scientific article behind the BoSSA DMI, giving a detailed overview of the sensors present in the BoSSA system and their operation. For me the most interesting thing about this instrument is the spherical speaker array used to emulate, via DSP, the dispersion of sound produced by an acoustic instrument's resonating body. Not only does this allow the digital sounds to be radiated, reflected and heard in a listening space in a naturalistic way, but it also gives the instrument a sense of presence and wholeness by localizing the source of the sound to the controller itself. This is very important for both performer and listener.

I think the most interesting part of this article is section 5.0, Presence, Intimacy, Physicality, and Expression. This section focuses less on the technical aspects of BoSSA and more on its artistic potential, emphasizing things such as "expression" and "deep communication" caused by a "stirring" in the body. The kinds of qualities described in this section should be the ultimate goal of any DMI, in my opinion.

Friday, 21 May 2010

Towards a Model for Instrumental Mapping in Expert Musical Interaction - A. Hunt

Towards a Model for Instrumental Mapping in Expert Musical Interaction - A. Hunt, M. Wanderley, R. Kirk

A fairly in-depth report into mapping strategies as regards DMIs. The conclusions of the report are backed up by the results of a large study carried out at the University of York by the authors. The article concludes that "Mapping strategies which are not one-to-one can be more engaging to users than one-to-one mappings."

The report also makes the case for multi-layered mapping strategies involving an abstraction layer between the controller outputs and the synthesis inputs.

I certainly think I should attempt to employ a one-to-many or complex (many-to-many) strategy where possible, and that I should avoid direct control of low-level parameters such as partial frequencies in additive synthesis.
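A toy sketch of what such a layered, not-one-to-one mapping might look like, with an abstract layer (energy, brightness) sitting between controller outputs and synthesis inputs. The parameter names and weights are my own illustrative choices, not taken from the paper.

```python
def controller_to_abstract(slider, pressure, velocity):
    """Layer 1: raw controller outputs (all 0..1) mapped to abstract,
    perceptually meaningful parameters. Note the many-to-one and
    one-to-many relationships."""
    return {
        "energy": 0.5 * pressure + 0.5 * velocity,    # two controls -> one param
        "brightness": 0.7 * slider + 0.3 * pressure,  # slider feeds two params
        "pitch_bend": slider,
    }

def abstract_to_synth(p):
    """Layer 2: abstract parameters fanned out to low-level synthesis
    inputs, so the player never touches partials or cutoffs directly."""
    return {
        "amplitude": p["energy"],
        "filter_cutoff_hz": 200.0 + 8000.0 * p["brightness"],
        "n_partials": 1 + int(10 * p["brightness"]),
        "detune_cents": 50.0 * p["pitch_bend"],
    }

synth_params = abstract_to_synth(
    controller_to_abstract(slider=0.5, pressure=0.8, velocity=0.2))
```

The nice property of the abstraction layer is that either side can be redesigned (swap the controller, or swap the synthesis engine) without rewriting the whole mapping.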

Project Path


An attempt to identify some key steps to take when designing the instrument. The image also shows the links between each step. This diagram does not currently include a suggestion towards including a vibrotactile feedback channel, but it may have to be updated to do so in the future, depending on the steps involved in that task. Updates may also have to be made to reflect the possible implementation of a double mapping layer. The diagram is in fact a revision of the original steps I followed in order to shape my ideas. The original diagram is shown below:
Although this form served me well for forming initial ideas, I have had to update it to better represent the kind of thinking I am now using after having researched the topic further. In particular, this diagram suggests starting with a synthesis algorithm and using the available parameters to craft the controller. In fact, for my project I think it would be better to start with both a controller design and a synthesis algorithm and find a good mapping between the two (perhaps including layers), thereby meeting in the middle, as it were.

Clearer Project Definition based on Research

I've written a rough attempt at a clearer description of the project goals along with one or two practical notes.

  • Create a DMI which attempts to allow its user to control the vast sound spaces afforded by digital synthesis algorithms, whilst trying to provide a sense of connection with those sounds and a sense of engagement for performer and audience. The instrument should also present a learning curve to users and be reasonably challenging, to ensure practice is rewarded with increased control and mastery, yet novice players should find the instrument approachable.
  • Controller should provide transducers based on research into which gestures are best suited to performing certain musical functions. Ideally the transducers will be able to provide feedback, such as haptic feedback, in order to contribute to the connection mentioned above.
  • Controller could also aim to give vibrotactile feedback and sound localization by using speaker(s) mounted in its body.
  • Controller will not be any kind of sequencer.
  • Controller likely to be an "instrument-inspired controller."
-> Ease of concept
-> Provides some users with familiarity of use
-> Provides a clue as to the kind of suitable gesture vocabulary potentially even for novices and non-musicians
-> Currently leaning towards guitar
  • Sensors connected to the computer via an Arduino, with the algorithm run in Max/MSP
  • Emphasis on controller design so could potentially use an open source algorithm for final model
-> Will have to use simple test algorithms whilst designing
  • Mapping strategies to be properly considered and based on suggestions of research

Thursday, 20 May 2010

Zendrum

Zendrum - "Redefining the Terms of Modern Drumming"

Looking into Futureman's Drumitar, I was referred to a commercially produced instrument which uses a similar design principle to create a pretty unique MIDI drum controller. The history of the company is a pretty interesting read and involves great drummers such as Manu Katché and Futureman himself.

SynthAxe

SynthAxe wikipedia
"This ain't no MIDI guitar" (inc labelled diagram)

The SynthAxe is a good example of an instrument-like DMI in that it attempts to recreate the features of a guitar in a MIDI controller. The controller consists of a guitar-like neck complete with frets and strings. The strings are not responsible for producing sound themselves but rather provide control data when fretting and bending. Interestingly, it is also possible to trigger the instrument by plucking strings, but these are in fact a completely separate set: much shorter, and mounted on the instrument's body. They use magnets and the Hall effect for velocity control. As well as this, there is a set of six drum-pad-like controllers which can be used to play the instrument. Each pad acts the same as the six trigger strings but also has after-touch sensitivity; in this way individual notes in a chord can be modified as the chord is being held. There are various other controllers on the body which presumably can be mapped as the player sees fit, as well as a whammy bar. Naturally the whammy bar need not exclusively be assigned to pitch bends, but can also be used to control things like filter cut-offs to produce interesting effects.

This is certainly an interesting instrument, especially I think for its use of real guitar strings. These naturally add a realistic level of haptic feedback for the player, in that they can feel the strings bending and vibrating with their fingers, giving an experience akin to the naturalness of acoustic instruments. I would imagine this greatly adds to the experience and engagement when playing the instrument. The pads are a nice touch, since they add a method of control not possible on acoustic guitars, which allows a different feel to the sounds being produced. One drawback is that although there is haptic feedback, I assume there is no vibrotactile feedback. Also, the sound source is located externally from the controller, which would be unlikely even to have the low audible noise output of an electric guitar. It seems a shame to have great haptic feedback but not to go the whole hog and give vibrotactile feedback, as well as possibly an internal sound source, to really create a complete system which feels like a "proper instrument" to the player. This is further compounded of course by the amount of extra equipment the instrument requires (rather than the pedals etc. being optional extras, as with an electric guitar). Another point is that I assume the method of excitation used to play the trigger strings has no effect on the sound. To my knowledge the waveform imposed on the string is not measured, so using different playing styles (finger vs plectrum picking for example) and changing one's position on the string will not have audible effects.

This instrument is cited as the inspiration for Futureman's Drumitar. Futureman is the percussionist with (predominantly) the group Béla Fleck and the Flecktones. He plays (amongst other things) a custom-made MIDI percussion instrument called the Drumitar, which uses a series of drum pads mounted on a guitar-shaped body to trigger samples. This is a great instrument and well worth looking at. It can also be said that Futureman is a virtuosic performer on his Drumitar, which proves its worth as an instrument. I can't confirm that the Drumitar is in fact a modified SynthAxe, but the shape and style of the controller are certainly similar. Nor can I confirm exactly where each pad is located on the instrument, since the decorations are weird and the layout seems frankly bewildering to look at.

An interview with Futureman gives a great demonstration of playing the Drumitar (though it sadly lacks a description of what it is or how it works, aside from mentioning samples). See part 1 and part 2.

Wednesday, 19 May 2010

Monome

Monome

A sequencer-type controller, but with lots of different functionality and ways of triggering different things over time. Designed purely as a hardware controller for which it is possible to write custom software, so the controller can be used in any way you like. Presumably some pre-written software is available for quick-start projects and practice. Very similar to the Yamaha Tenori-on, but I've been told by a colleague (Ianis Lallemand) that the Monome was in fact released first and as such is the original device. I'm not sure how the two devices differ exactly, if at all, except for some design concepts such as the Tenori-on model available with two screens, which allows the audience to see the sequencing being performed in an attempt to make the instrument more engaging - something I'm beginning to feel is incredibly important for a DMI, for live performance at least.

Digital Lutherie - Crafting Musical Computers for New Musics' Performance and Improvisation

Digital Lutherie - Crafting Musical Computers for New Musics' Performance and Improvisation. - Sergi Jordà Puig

Another useful and interesting PhD thesis, particularly since it concentrates in part on how we can define a "good" musical instrument and how we can use this definition in the creation of DMIs.

The Studio for Electro-instrumental Music

STEIM - The Studio for Electro-instrumental Music

From STEIM's website:

The studio for electro-instrumental music is the only independent live electronic music centre in the world that is exclusively dedicated to the performing arts.
The foundation's artistic and technical departments support an international community of performers and musicians, and a growing group of visual artists, to develop unique instruments for their work. STEIM invites these people for residencies and provides them with an artistic and technical environment in which concepts can be given concrete form. It catalyzes their ideas by providing critical feedback grounded in professional experience. These new creations are then exposed to a receptive responsive niche public at STEIM before being groomed for a larger audience.

Importantly:

STEIM promotes the idea that Touch is crucial in communicating with the new electronic performance art technologies. Too much the computer has been used, and designed, as an exclusive extension of the formalistic capabilities of humans. At STEIM the intelligence of the body, for example: the knowledge of the fingers or lips is considered musically as important as the 'brain-knowledge'. STEIM has stimulated the design of extremely physical interfaces and is widely considered as the pioneering place for the new live electronic concepts.

Seems like lots of cool things are going on here, so this will be worth a detailed look at some point. Following the Electronicmusicalinstrumentsexhibition link provides access to several video demonstrations of some of STEIM's alternative controllers, which can be accessed directly here.

Novel DMI From NIME

Example of a performance of a novel alternative DMI at a NIME conference.

Unfortunately there isn't a technical description to go with this video. The five poles seem to each be responsible for triggering a different loop of sound; some are melodies (taken from a Sigur Rós tune) and others are more ambient, pad-like sounds. Each pole seems to be equipped with sensors (of what sort I'm not sure) which I would guess are able to detect the speed of the plate's rotation, and probably other things, tilt in several axes perhaps. These controls are then mapped to elements of the sounds. It is certainly possible to hear the main melody slowing down and speeding up over time. The pads also have very definite "swirling" effects, probably caused by the modulation of a filter cut-off frequency/center frequency. Some of the spinning plates may themselves be control values, as opposed to actually triggering any audio.

Although this appears a novelty instrument there are actually a couple of quite interesting things about it.

One is how visual the instrument is. This works in two ways. First, the main feedback channel for the instrument is visual: the performer has to monitor each spinning plate to check for ones that may be beginning to slow down and fall off. This task becomes harder as the performer adds more plates, and thus loops; after adding a plate he must check that all the others are still OK. It is often cited that visual feedback is most important for amateur players, becoming less important for virtuosi, but it seems in this case the visual feedback will always be important. The other visual aspect of the instrument is from the audience's perspective. The audience in the video are clearly very engaged in the performance. They seem to be enjoying the visual spectacle of watching the performer attempting to keep all the plates spinning, and they get more excited as more plates are added. Many times the performer is seen leaping from one side of the stage to the other to try and save a plate, which provokes a very positive response. This audience engagement is aided by the fact that the instrument has a very clear link between performer actions and sonic consequences: when the performer adds a plate, a new loop starts; when a plate slows down, its loop slows down; when the performer speeds the plate back up, the loop speeds back up. This obvious and natural link means that the audience can easily follow the performance and recognize the performer's actions as being meaningful. This is interesting, since in this respect the spinning plates are much like an acoustic instrument, despite looking a world apart! The lack of this link is often given as a criticism of live digital music performances involving typical arrays of sliders, knobs and buttons, since it is usually not clear at all in what ways the performer is affecting the sound - something I've experienced first hand at electronic music concerts.

Another important aspect to do with engagement is on the part of the performer. To put it simply, he's loving it! This seems like a very entertaining instrument to play, probably because of the sheer physicality of the gestures involved. This physicality increases as the piece becomes more complex, and it seems to be a real challenge to keep much more than 4 plates active at the same time. In this respect the instrument provides a really entertaining challenge to the performer, sort of like a game, and in that way the performer has an enjoyable experience playing. There are also the factors mentioned above, such as the high levels of feedback and the obvious link to sonic results, that add to this engagement level.

I think another interesting thing about this performance is that the performer is clearly not really that good at playing the instrument, no offense. He is able to get 2-3 plates spinning OK but begins to have real trouble at 4-5. Throughout the performance there are times where plates slow down to a wobble, plates fall off, he has trouble getting the plates started, etc. This is very important, since it implies that the instrument is quite hard to play, which in turn implies that it is possible to improve one's skill level and possibly eventually master the instrument. Indeed, spinning plates is something that I can imagine being hard. This ability to improve and master an instrument is always cited as a characteristic of a "good" DMI (disregarding those designed for public installations, which often take a somewhat opposite approach). Interestingly, one could consider the skill levels of the instrument as existing on a continuum. The entry cost for an amateur should be low, since keeping one plate spinning and possibly modifying its speed to produce dynamic sounds should not be too much of a challenge. It could take a lot of practice, however, to improve one's skill and be able to manage many plates, so there is plenty of room for mastery, especially if the instrument was expanded to allow more plates. The entry fee for playing an instrument is often cited as an important consideration, since many researchers feel that DMIs should be appealing to novice users as well as to experienced musicians.

One possible drawback of the instrument is the level of expressivity it would allow. It's difficult to comment on this without knowing the exact inputs of the instrument, but one could argue that simply being able to start, stop, speed up and slow down sample loops gives access to fairly limited levels of expressivity. This is interesting, since there seems to be a separation here between levels of expressivity and virtuosity. For example, I would think one would have to practice quite a lot in order to keep a larger array of plates spinning nicely, but even so, with this set-up that level of competence would still leave a limit on expressivity. There is physical mastery of the instrument but perhaps limited musical mastery. That is not to say that the instrument could never be expressive: if, as mentioned above, some of the plates were control plates, filter plates, other effects plates etc., the expressivity could be vastly improved and perhaps brought in line with the amount of physical mastery available. The synthesis system will also obviously have a large impact on expressivity.

I think the fact that I've been able to write so much about some spinning plates is perhaps a testament to this instrument's originality and coolitude.

Tuesday, 18 May 2010

A few controllers taken from Sergi Jordà Puig's Thesis (above)

The Yamaha Miburi - Alternate Controller using empty-handed gestures captured by various sensors including an exo-skeleton vest and shoe sensors.

Wacom home page - Manufacturers of computer interfacing technologies, notably a series of tablets utilizing stylus controllers, buttons and touch-screen technology for control. The Wacom tablets have historically been a popular choice as alternative digital musical instruments since they offer many degrees of freedom.

Music Mouse - Home page of a DMI based on the humble computer mouse. Also viewable on the site is a nice (although not necessarily complete) timeline of electronic (and digital) musical instruments.

The Mouthesizer - A very strange DMI using computer vision technology to monitor the shape of the performer's mouth, providing the results as synthesis control values (wah-wah, for example).

New Interfaces for Musical Expression (NIME)

The NIME homepage.

"The International Conference on New Interfaces for Musical Expression is currently in its 10th year. Researchers and musicians from all over the world gather to share their knowledge and late-breaking work on new musical interface design. The conference started out as a workshop at the Conference on Human Factors in Computing Systems (CHI) in 2001. Since then, international conferences have been held around the world, hosted by groups dedicated to research on New Interfaces for Musical Expression."

Website includes links to archives of the proceedings of the NIME conferences including A LOT of pdf articles.

Monday, 17 May 2010

TOWARDS A NEW CONCEPTUAL FRAMEWORK FOR DIGITAL MUSICAL INSTRUMENTS

Towards a new conceptual framework for digital musical instruments - Joseph Malloch et al

A presentation of another method of defining the functioning of a DMI and the user's interaction with it. Some DMIs are presented in the context of having this model applied to them. DMIs presented - The Rules, The CelloBoard and The GyroTyre.

Gyrotyre: A dynamic handheld computer music controller based on a spinning wheel

Gyrotyre: A dynamic handheld computer music controller based on a spinning wheel - Elliot Sinyor.

An article giving an in-depth description of the workings of the DMI the Gyrotyre, written by its creator Elliot Sinyor. An alternate instrument created from the wheel of a child's bike (!), the Gyrotyre uses a veritable plethora of sensors attached to the wheel, and a custom-made handle, to extract various physical readings of the tyre's movement. The circular movement of the tyre is even used as a sequencer track, allowing the spinning to trigger samples, with each rotation of the tyre qualifying as a measure! It can even record a measure, loop it, and allow the user to create additional layers of beats!! This seems to be an extremely imaginative instrument and serves as a prime example that, although computer algorithms are very important in a DMI, the physical controller can have a massive effect on the physical interaction of the user and the resulting sound possibilities. Interestingly, this instrument does not involve any visual feedback from the computer and as such can be played without paying any attention to the computer. In this respect it is reported that when played it feels like a unique and "proper" instrument.
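The rotation-as-sequencer idea is simple enough to sketch. Assuming the wheel's angle is available from a sensor, one full rotation can be divided into a fixed number of sequencer steps - this is my own guess at how such a scheme could work, not Sinyor's actual implementation.

```python
def wheel_to_step(angle_deg, steps_per_measure=8):
    """Map a wheel angle in degrees (any value; wraps modulo 360) to a
    sequencer step index, so that one full rotation of the tyre plays
    one measure. Spinning faster then naturally plays the measure
    faster - the tempo IS the wheel speed."""
    return int((angle_deg % 360.0) / 360.0 * steps_per_measure)

step_start = wheel_to_step(0.0)    # top of the measure
step_end = wheel_to_step(359.9)    # last step before wrapping
step_wrap = wheel_to_step(405.0)   # past one rotation: back into step 1
```

A nice side effect of deriving tempo from angle rather than from a clock is that wobbles and slow-downs of the wheel are audible immediately, which matches the reported "proper instrument" feel.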

PCMAG.COM's top 10 digital musical instruments

Top 10 Digital Musical Instruments

A short slide show presenting some digital musical instruments. Most seem to be instrument-like controllers, i.e. they attempt to reproduce every feature of their acoustic counterparts. Some are more commercial, such as the Piano Wizard, designed to be a learning interface, but others seem more professional, such as the trumpet and violin. Interesting instruments include a roll-up piano as well as a digital bagpipe. No technical information, simply some examples.

A Study of Gesture-based Electronic Musical Instruments

A Study of Gesture-based Electronic Musical Instruments

This article presents a collection of alternative controllers that are primarily based on gestural control of music and that do not (in most cases) involve interaction with an explicit control interface. Rather, the instruments use techniques such as data gloves, capacitance sensing and LEDs as transducers. In places the report gives a more in-depth description of instruments which have been mentioned in previously posted works.

The problem with this report is that it seems to present a very casual definition of gesture when it comes to musical control, suggesting that a gesture is predominantly the kind of movement involved in the "free-air" controllers it presents. In that respect this article is best used simply to give some examples of existing alternative digital musical instruments, rather than as a more technical reference like some of the earlier articles.

Wednesday, 12 May 2010

Physical Interface Design for Digital Musical Instruments - Mark Marshall

Physical Interface Design for Digital Musical Instruments - Mark Marshall (pdf can be viewed using the site's own reader or a copy can be saved to computer).

This is a comprehensive PhD thesis on the subject of Digital Musical Instruments, whose research is predominantly aimed towards improving the "feel" of a digital instrument. Notably the thesis was supervised by Marcelo Wanderley, who has contributed many important papers on the subject.

This is a large work and so contains incredibly detailed and valuable information. It is largely driven towards presenting good DMI design techniques, chosen based on tests of usability.


Monday, 10 May 2010

Misa Digital Guitar

Another digital instrument which is something like the picture I have in my head of my own.


I haven't found an actual semi-technical description of what functions this thing has, but I can outline a few from the video. It's a MIDI controller designed to run on Linux and trigger anything you have on your computer. The control interface is very similar to a guitar in shape and style. There are no strings. It seems like each fret position on the neck (across all 6 strings) has been replaced by a button, so the chord shapes are the same (given the same tuning scheme; I'm not sure if this can be changed). The unique thing about this instrument is that notes are triggered using a touch screen interface. The touch screen has a visual feedback element in the form of lights at the points of touch. There seems to be one circle of light which is always present and can actually be dragged around the screen with a finger. I assume the x-y position of this circle can be linked to any control parameter. To trigger notes, the gesture seems to be tapping elsewhere on the screen. Interestingly the screen supports multi-touch, so the instrument can be played by drumming multiple fingers on the screen, somewhat like a series of drum pads. The x-y position of the tap is also used as a control input. Notes can be sustained by tapping and holding, and during the hold the x-y position can be changed dynamically. I can't be sure if the controller is pressure sensitive or if the screen can be split into different sections etc.
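As a thought experiment, the fret-plus-tap scheme I described could look something like this. Everything here is my own guess at how such a mapping might work (the function names, screen dimensions and 0-127 MIDI scaling are assumptions, not Misa's actual implementation): a fret button supplies the note number and the tap's x-y position is scaled to two controller values.

```python
# Hypothetical sketch of a fret-button-plus-touchscreen note trigger.
# The screen size and 0-127 controller scaling are my assumptions.

def tap_to_midi(fret_note, x, y, screen_w=480, screen_h=272):
    """Return (note, cc_x, cc_y) for a tap at pixel position (x, y)."""
    cc_x = round(127 * x / (screen_w - 1))
    cc_y = round(127 * y / (screen_h - 1))
    return fret_note, cc_x, cc_y

# Tap in the exact centre of the screen while holding fret note 64 (E4):
event = tap_to_midi(64, 239.5, 135.5)
```

Sustaining a note would simply mean re-sending the two controller values as the held finger moves, which is where the dynamic x-y control during a hold would come from.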

One idea for the excitation gesture for my controller is to use pressure pads to allow the user to tap the body of the instrument with their fingers to trigger sounds. The pad would (hopefully) also allow sustained notes, and obviously the pressure of the finger would be used to control some aspect of the synthesis. Using a tapping gesture like this along with pressure should provide a nice haptic feedback system, given an effective mapping strategy.
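A minimal sketch of this pressure-pad idea, under assumptions of my own (the threshold value and the linear pressure-to-amplitude curve are placeholders): the pad reading gates a note on and off around a threshold and continuously scales an amplitude parameter while the note is held.

```python
# Sketch: an FSR-style pad reading gates a note and scales amplitude.
# The threshold and linear response curve are placeholder assumptions.

def pad_state(pressure, threshold=0.05):
    """Map a normalised pressure reading (0.0-1.0) to (gate, amplitude)."""
    if pressure <= threshold:
        return False, 0.0
    # Rescale the region above the threshold to the full 0-1 range.
    amp = (pressure - threshold) / (1.0 - threshold)
    return True, amp

quiet = pad_state(0.02)   # light touch, below threshold: gate stays off
loud = pad_state(1.0)     # firm press: gate on at full amplitude
```

The same pressure value could of course be routed to any other synthesis parameter (brightness, modulation depth, etc.) instead of amplitude.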

Eigenharp

A digital musical instrument from eigenlabs that appears to be able to do everything ever.


I think the most important aspect of this instrument, as far as my project is concerned, is perhaps the actual physical design. The Eigenharp Alpha is somewhat reminiscent of a guitar or bass in the way that it is carried and played. Interestingly this seems to have had an effect on the player in this video. I would say that, based on the way he moves his left-hand fingers, it's likely he is a guitarist or a bassist, and this seems to have really helped him when approaching this new instrument for the first time. He already has a gestural vocabulary which fits this situation and has allowed him to apply his past experience to a new instrument. Despite this, it is clear he is still unfamiliar with many of the gestures the instrument requires, such as the way samples and loops are triggered and the way the sliders work. I think this makes a strong argument for instrument-inspired controllers. The controller is immediately accessible to many musicians, and it is arguable that most users would have some concept of the appropriate gestures. The controller still offers, however, a range of new gestures and functions to be mastered.

The combination of the 3-axis (left-right, up-down and pressure) keys and the breath controller seems to allow for an incredible amount of expression and musical subtlety, and would certainly allow scope for the instrument to be mastered to a high level of expressivity given a large amount of practice and training.
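To picture how these channels might combine, here's one plausible mapping of my own devising (definitely not Eigenlabs' actual scheme, and all parameter names and ranges are assumptions): breath drives loudness, side-to-side key movement gives pitch bend, and key pressure opens a filter.

```python
# Hypothetical combination of per-key axes and breath into synth params.
# The routing and numeric ranges are my assumptions, not Eigenlabs'.

def key_expression(breath, key_x, key_pressure, bend_range=2.0):
    """breath and key_pressure normalised to 0..1; key_x to -1..1."""
    return {
        "amplitude": breath,
        "pitch_bend_semitones": key_x * bend_range,
        "filter_cutoff_hz": 200.0 + key_pressure * 8000.0,
    }

# Half breath, key rolled fully left, light pressure:
params = key_expression(breath=0.5, key_x=-1.0, key_pressure=0.25)
```

Even this crude three-way routing shows how many independent expressive dimensions a single key press could carry.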

Another interesting feature is the visual display which is most noticeable when the instrument is used in its step sequencing mode. This seems to have been aimed towards providing a degree of audience involvement when it comes to performers who use a lot of loops and samples in an attempt to give the audience a sense of the musical functions the performer is carrying out.

The full range of products currently offered by Eigenlabs is available here. This seems like a really interesting company and, most importantly, it's British!

According to the website the keys are sensitive to finger position to within a micron - on the order of the wavelength of light! This is immensely impressive but seems like overkill, since I would say it is physically impossible to produce position variations to that level of precision. Still, it should surely mean that a vibrato effect, for example, could be applied (given the correct level of skill) just as subtly and expressively as on any acoustic instrument, e.g. the violin.
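A toy illustration of that vibrato idea: convert a sinusoidal rocking of the fingertip on the key (in mm) into a pitch offset in cents. The 50-cents-per-mm sensitivity, the depth and the 6 Hz rate are purely my assumptions for the sake of the sketch.

```python
# Toy model: fingertip position oscillation -> pitch offset in cents.
# Depth, rate and the cents-per-mm sensitivity are assumed values.
import math

def vibrato_cents(t, depth_mm=0.5, rate_hz=6.0, cents_per_mm=50.0):
    """Pitch offset (cents) for a sinusoidal finger rock at time t (s)."""
    return depth_mm * math.sin(2 * math.pi * rate_hz * t) * cents_per_mm

# At a quarter of the vibrato period the offset reaches its maximum:
peak = vibrato_cents(1.0 / (4 * 6.0))
```

With micron-level sensing, even a tiny 0.5 mm rock would be resolved into hundreds of distinct position values, which is what would make a violin-like vibrato plausible.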

The Specials - Ghost Town, performed by three eigenharpists (?). Brilliant. The best thing is the guy at the back who is using his Eigenharp to play and change between several drum loops as well as provide a bass line. Eigenlabs say the instruments have enormous expressive potential and a limitless range of sounds - and I'm starting to believe them.

Expressiveness and Digital Musical Instrument Design - Daniel Arfib et al

Expressiveness and Digital Musical Instrument Design - Daniel Arfib et al

This has proven to be a good read, since it goes some way towards providing a definition for expressivity when it comes to playing musical instruments. It also presents important aspects of expressivity that must be taken into account when designing digital musical instruments. The last sections of the report provide examples of how these notions of expressiveness were implemented in some existing digital musical instruments.

Friday, 7 May 2010

Mapping Performer Parameters to Synthesis Engines - A Hunt, M Wanderley


This article takes an in-depth look at mapping. It suggests appropriate mapping strategies for various control situations based on practical experiments. In particular it suggests that complex mapping strategies, i.e. divergent (one-to-many) and convergent (many-to-one) ones rather than simple one-to-one assignments, are more suited to complex control situations (though they lead to difficulties in simple situations) and in terms of music offer an increased potential for expressivity. Not only this, but it suggests that systems "which utilize a measure of the user's energy under the control of more than one limb (or body part)" are "more engaging to users" (with respect to simplistic one-to-one systems, such as the on-screen slider controller used in the study).
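The one-to-one versus one-to-many distinction is easy to illustrate. In the divergent sketch below a single "bow speed" input fans out to several synthesis parameters at once; the particular coupling curves are arbitrary examples of mine, not values from the study.

```python
# Illustration of one-to-one vs divergent (one-to-many) mapping.
# The coupling formulas are arbitrary examples, not from the paper.

def one_to_one(speed):
    """Simple mapping: one input drives exactly one parameter."""
    return {"amplitude": speed}

def one_to_many(speed):
    """Divergent mapping: one input shapes several parameters at once."""
    return {
        "amplitude": speed,
        "brightness": speed ** 2,          # faster bowing also brightens
        "noise_level": 0.1 * (1 - speed),  # and cleans up the attack
    }

out = one_to_many(0.5)
```

The divergent version behaves more like an acoustic instrument, where bowing harder never changes just one thing about the sound.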

Haken Continuum

Haken Continuum - a 3-dimensional fingerboard: left-right, up-down and pressure.

Musical Taste

Something has occurred to me. It's related to one of my original motivations for this project, which is the contrast I mentioned between my own musical taste and the general musical activities of Ircam. Some of the controllers that I have mentioned so far (see especially the BoSSA and BioMuse from the last post, as well as the T-Stick) have been used, and possibly designed, exclusively for use in "contemporary music".

Contemporary music is largely something I am not particularly interested in (despite trying). That is to say, at least, that I am selective and fussy with my contemporary styles. In any case, this raises the question of whether the DMI that I may create could fit into my own personal musical tastes.

I should look into the extent to which the synthesis algorithm behind a DMI influences the kinds of genres it is suited to. For example, it seems clear that the synthesis behind the BioMuse makes it somewhat unsuitable for use in a general rock context. Or perhaps it would work if used in the right manner. Certainly some examples of songs using electronic/digital elements spring to mind.

I think that again I need to look closely at the kinds of things I want my controller to be able to control, and also at mapping layers which would allow the controller to control anything, so long as some software calibration was involved...

Gesture Analysis of Bow Strokes Using an Augmented Violin - Nicolas Hainiandry Rasamimanana

Gesture Analysis of Bow Strokes Using an Augmented Violin - Nicolas Hainiandry Rasamimanana


The research this article presents has been aimed towards finding new ways of interacting with a computer in a musical context. The authors do this using a traditional violin and bow which have been augmented with various sensors to provide different kinds of data readings which can be used for synthesis.


In this respect the article presents a good example of an augmented instrument and goes into detail about the sensor systems it employs (particularly in chapter 2, "Ircam's Augmented Violin").


Chapter 1, "State of the Art", begins by presenting definitions for "new music interfaces" (corresponding to the instrument-like, instrument-inspired and alternative controllers from previous definitions) and "augmented instruments". It also briefly presents some pros and cons of each approach to DMI creation.


Importantly, Chapter 1 also presents examples of existing systems some of which I have been able to find resources for:


BoSSA - the Bowed-Sensor-Speaker-Array - an instrument which looks as weird as it does cool. The link provides a description and a lot of examples and resources, including a publication which I will review. For convenience a demo video can be found here. Rasamimanana's article would clearly consider this a new music interface. One could possibly argue that we could consider it an instrument-inspired controller, based on the violin. Indeed the instrument attempted to directly mimic some of the violin's physical performance interface, but there is a question of how much a new controller must resemble an existing one in order to be considered "instrument-inspired". Personally, I feel comfortable enough giving it this label. Every aspect of this instrument is interesting. In terms of interaction it has many more transducers than it immediately appears to have, including pressure sensors and several accelerometers on both the bow and fingerboard (which is also equipped with a linear position sensor for selecting notes). The most interesting aspect of this controller is the fact that it has attempted to reproduce the resonator of a violin using a spherical array of speakers which (I assume through some kind of DSP) can "reconstruct the radiative timbral qualities of violins in a traditional acoustic space". According to Rasamimanana, in the video in the above link the bow data was used to control a comb filter vibrato.
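The article doesn't detail that effect, but a comb filter is just a short delay line mixed back with the input, so presumably the bow data modulates the delay length over time to give the vibrato-like sweep. A minimal feedforward comb as a sketch (parameter choices are mine):

```python
# A plain feedforward comb filter: y[n] = x[n] + gain * x[n - delay].
# Modulating `delay` over time (e.g. from bow velocity) would give the
# "comb filter vibrato" effect described; values here are illustrative.

def comb_filter(samples, delay, gain=0.5):
    """Apply a feedforward comb filter with an integer sample delay."""
    out = []
    for n, x in enumerate(samples):
        delayed = samples[n - delay] if n >= delay else 0.0
        out.append(x + gain * delayed)
    return out

# An impulse shows the two taps of the comb:
response = comb_filter([1.0, 0.0, 0.0, 0.0], delay=2)
```

The two non-zero taps in the impulse response are what produce the characteristic comb-shaped frequency response the effect is named for.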


Note: This section refers to pressure sensors as FSRs (force sensing resistors)


Biomuse - The report cites the BioMuse created by Atau Tanaka. "Bioelectrical signals, and particularly electromyograms of his arm muscles, are digitized and mapped to sounds and images. Therefore, the movements of his body are directly interpreted to create music". The sound quality in the above video isn't great but it does show very clearly the BioMuse being used. Some brief research also returned the Biomuse Trio, where a similar instrument, again called the BioMuse, can be seen. This BioMuse, however, is being played by Ben Knapp, who is also cited as its creator. I'm not sure what the connection, if any, is between the two instruments.


Hypercello - See the video. An instrument created for Yo-Yo Ma by Tod Machover as part of his HyperInstruments research group. Again the cello has been augmented with various sensors, the signals from which are used as parameters for synthesis.


The rest of the report goes into detail about the analysis carried out on the signals received from this augmented instrument's sensors.


My aim from the beginning of this project was to look at ways in which some form of connection can be re-established with digitally synthesized music in order to allow it to be played with a greater degree of expressivity. This paper specifically states that often, in the case of new musical interfaces, although a connection can be established using the right kinds of transducers, the simplicity of the interface (compared with acoustic instruments) "often means a poorer expressive interface". The report also highlights again the importance of haptic (touch) feedback when playing, stating that this is something still under research. Perhaps the best way to overcome these things is to consider an augmented instrument approach?

Thursday, 6 May 2010

Gesture - Music by Claude Cadoz


An article which is very important for me in that it provides a greater depth of information into some of the terms used in other articles previously mentioned (which indeed use this article as a basis for their work).

The article starts with an in depth look at definitions and thinking behind the term "Gesture". It does not however present its own definition of gesture but merely compares and contrasts currently existing ones.

In the section titled "Phenomenological analysis", the paper outlines some physiological principles behind "gesture", pointing out
the importance of understanding the basic physiological behavior of the human body when modeling the interaction between man and machine
In particular this section presents definitions for the terms Isometric Force and Isotonic Force, used in Wanderley's "On the Choice of Transducer Technologies...", as follows:
  • Isometric - in order to produce a force
  • Isotonic - in order to produce a displacement
The section titled "Functional Analysis" presents a description of the possible functionalities of a gesture. In particular it outlines and defines three classifications, also used in Wanderley's report:

  • The ergotic function - material action, modification and transformation of the environment;
  • The epistemic function - perception of the environment;
  • The semiotic function - communication of information towards the environment.
The section then goes on to define Instrumental Gestures, proposing a classification of instrumental gesture based on its function:

  • Excitation gesture;
  • Modification gesture;
  • Selection gesture.
In the next section, "Case Studies", the author goes on to give real-life examples of this terminology by examining existing acoustic instruments.
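To make Cadoz's three instrumental gesture types concrete for my own controller idea, here's a tiny sketch tagging incoming control events with his classification. The event names are hypothetical examples for a guitar-like DMI, not taken from the article.

```python
# Hypothetical event names for a guitar-like DMI, tagged with Cadoz's
# three instrumental gesture types (excitation/modification/selection).

GESTURE_CLASS = {
    "pad_tap":       "excitation",    # injects energy / starts a sound
    "pressure_hold": "modification",  # shapes a sound already playing
    "fret_button":   "selection",     # chooses which note will sound
}

def classify(event):
    """Return the gesture class for a named control event."""
    return GESTURE_CLASS.get(event, "unknown")
```

Thinking of each transducer on the controller in these terms should help when deciding which sensors are actually needed and what each one is for.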

A Possible Choice of Material for the Final Controller Design

Ground breaking

Some Answers to Michaela's Questions

1)


It is possible to group digital musical instruments into three broad categories:

a - instrument-like - A control interface that tends to reproduce each feature of an existing instrument (e.g. an electric guitar, keyboard, sax etc)


b - instrument-inspired - A sub-group of instrument-like controllers. An interface that is largely inspired by an existing instrument but intended for a different and often more general purpose (e.g. the Digital Trumpet)

Buchla Marimba Lumina

Electronic Sax


c - augmented instrument - An existing instrument which has been fitted with additional sensors in order to provide extra elements of control/synthesis

Disklavier


d - alternative controller - A controller which does not follow the design of an established instrument

Reactable


Originally in this project I was leaning towards an "instrument-inspired" controller. There were two main reasons for this. The first is conceptual ease of design. I felt that because I don't have any background in product design and manufacture, it would help me when it came to actually designing the control interface if I already had some preexisting notion of the general shape, size and materials my controller could consist of. The second reason is to address questions of ease of use and playability. Reports suggest that when one first comes across a new musical control interface, the key aspect in terms of playing is ease of use. Once the basics have been learned, the emphasis shifts to learnability: the ability to spend time with an instrument in order to learn more subtle and masterful control. I felt that using an instrument-inspired controller should help in the first instance with ease of use, since many players would already have, to a greater or lesser extent, a notion of the kinds of gestures that are suitable for such an instrument. At the same time the choice of transducers could produce interesting challenges for even established players of the original instrument, since it could change, for example, the established attack gesture (say, a pressure pad used to trigger sounds on a guitar-like interface, changing the typical attack gesture from a plectrum attack to applying varying pressure or perhaps short percussive taps).


Recently, David Creasey highlighted the importance of physical feedback when it comes to musical control, so this is something I'd really like to think about. My above idea involving a pressure pad may be a good one, since applying pressure to the pad would naturally produce some kinesthetic feedback. David suggested considering an augmented instrument controller in order to have physical feedback "built-in". This is an interesting approach which I will look into, and I'll post some articles and examples.


I also remember an earlier post regarding a slightly ambitious idea of equipping an instrument-inspired controller with a resonator (somehow) and passing the synthesis signals through it via some form of transducer. Having this resonator could possibly add an extra element of physical feedback useful for control, especially if combined with other techniques.
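Before building anything physical, the resonator idea could be prototyped in software: pass the synthesis output through a resonant filter before it reaches a driver. Below is a standard two-pole digital resonator recipe as a sketch; the centre frequency and bandwidth values are placeholders of mine.

```python
# Standard two-pole digital resonator:
#   y[n] = x[n] + b1*y[n-1] + b2*y[n-2]
# with b1 = 2*r*cos(w0), b2 = -r^2. Frequency/bandwidth values are
# placeholder choices for illustration.
import math

def resonator(samples, freq_hz, bw_hz, sample_rate=44100.0):
    """Run samples through a two-pole resonator and return the output."""
    r = math.exp(-math.pi * bw_hz / sample_rate)
    b1 = 2 * r * math.cos(2 * math.pi * freq_hz / sample_rate)
    b2 = -r * r
    y1 = y2 = 0.0
    out = []
    for x in samples:
        y = x + b1 * y1 + b2 * y2
        out.append(y)
        y1, y2 = y, y1
    return out

# An impulse "rings" at the resonant frequency instead of dying instantly:
ring = resonator([1.0] + [0.0] * 99, freq_hz=440.0, bw_hz=50.0)
```

That ringing is exactly the body-like colouration a physical resonator would add, so a software version like this would at least let me hear the effect before committing to hardware.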


2)


one could say: "yes, we can build a lot of different kinds of new electronic musical instruments, but in the end it all comes down to doing the synthesis (in programmes like max/msp), so why bother, they can all sound the same..." what would your response be?


I think my response would be to talk about the expressivity possible with a given synthesis algorithm and controller combination. That is to say, not so much the core of the sound, which is the algorithm's responsibility, but rather how that sound is used to make music. Not the way it sounds but the way it is played. At the moment I could define two elements of expressivity. The first is the kind of physical feedback mentioned in (1).


The second element is related to the controller itself and its transducers. The choice of transducers for a controller can have a large impact on the possible expressivity of the resultant sound. To take an extreme example, one could compare a controller like the one from a past post to the digital trumpet. The first controller triggers sounds using a small pressure button. The second has a pressure sensor mounted behind a trumpet mouthpiece which detects even the smallest changes in the blowing style of the player. It is clear that, hypothetically, if they both triggered the same synthesis algorithm, the digital trumpet would allow sounds to be played with much greater control. Another question here, however, is ease of use and learnability. One could argue that the trumpet controller would be harder to use and almost impossible for most players to truly master (to possibly a greater level than the player in the video exhibits). In this respect, perhaps for a lot of users the simpler button controller would be more appropriate. Linked to ease of use, of course, is the style of the controller itself. Guitar players may find it much easier to use the first controller, whilst trumpet players would be more comfortable with the second. If an alternative controller were used, all players would likely start with approximately equal physical skill.


I think it is possible to see that the choice and design of a controller is an important one, since it affects the way in which a player can create the sounds produced by the synthesis. Another important point, however, is that of course the synthesis algorithm must be able to respond to the parameters made available by the controller. For example, if a very simple algorithm which triggers sounds via a simple on/off state were controlled by the trumpet controller, much of the controller's expressivity would be lost. It would no longer be possible, for example, to produce crescendos using increasing breath pressure.
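The crescendo example can be sketched directly. A synthesis stage that only understands on/off throws away the trumpet controller's continuous breath data, while one with an amplitude input can follow it; the breath curve and scaling below are illustrative assumptions.

```python
# Contrast: gate-only synthesis vs breath-following synthesis.
# Breath values and scaling are illustrative assumptions.

def on_off_synth(breath):
    """Gate-only: any breath above zero plays at full volume."""
    return 1.0 if breath > 0.0 else 0.0

def breath_synth(breath):
    """Breath-following: amplitude tracks the pressure directly."""
    return max(0.0, min(1.0, breath))

crescendo = [0.2, 0.4, 0.6, 0.8, 1.0]           # rising breath pressure
flat = [on_off_synth(b) for b in crescendo]     # gate-only output
rising = [breath_synth(b) for b in crescendo]   # breath-following output
```

The gate-only version flattens the whole crescendo to a constant loudness, which is exactly the loss of expressivity described above.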


It seems that neither the controller nor the algorithm should be underestimated in its contribution to the expressivity and playability of the complete system. I think this question is definitely something to be looked into further. I think it would also be worthwhile looking further into mapping techniques and the possibility of introducing mapping layers to allow controllers to be portable from one algorithm to the next...
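One way such a mapping layer could work, sketched under assumptions of my own (the table format, parameter names and scaling scheme are all hypothetical): the controller publishes named parameters, and a per-pairing translation table (the "calibration") converts them into whatever each synthesis algorithm expects.

```python
# Hypothetical mapping layer: a per-pairing table translates controller
# parameter names and ranges into synthesis parameter names and ranges.

def apply_mapping(controller_params, mapping):
    """Translate controller parameters to synth parameters.

    mapping: {controller_name: (synth_name, scale, offset)}
    """
    synth_params = {}
    for name, value in controller_params.items():
        if name in mapping:
            synth_name, scale, offset = mapping[name]
            synth_params[synth_name] = value * scale + offset
    return synth_params

# The same pad controller driving two different algorithms:
pad = {"pressure": 0.5, "tap_x": 0.25}
to_fm = apply_mapping(pad, {"pressure": ("mod_index", 10.0, 0.0)})
to_subtractive = apply_mapping(pad, {"pressure": ("cutoff", 8000.0, 200.0),
                                     "tap_x": ("resonance", 1.0, 0.0)})
```

Swapping algorithms then only means swapping the table, which is the portability the layer is supposed to provide.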


3)


I think for this project I'd like to focus on ways of playing, since we've done quite a lot of synthesis work over the past 2 years. The issue, however, is that, as I mentioned above, however complex I make the controller, I will have to design an algorithm sufficiently complex to respond to that controller's parameters. Maybe a better answer, then, would be that I'll have to focus on both. Actually, David Creasey advised me that the best way to approach this kind of project is to try to keep the advancement of each section at roughly the same level, so that the complete system has a chance of coming together as a whole. Or at least, if the end goal cannot be reached, the system can still function well together for a presentation (instead of, for example, having a very simple breadboard circuit of a controller and a complex synthesis program).


The fun thing for me would be actually playing the instrument and feeling at least somewhat involved in the sound-making process.

Monday, 3 May 2010

Yamaha Tenori-On

Product Description

A nice example of an alternative controller. It is an instrument in its own right, since it has many built-in sounds, but it can also be used as a MIDI controller. It has an interesting visual feedback element which does draw you into the music, whilst being slightly bewildering in my opinion. Nonetheless, an interesting way to get the player "involved" in the digital music.

Digital Trumpet

A nice video showcasing James Morrison's Digital Trumpet, a fairly new digital musical instrument. It's interesting to see that the instrument doesn't rely on its own software but rather on external, unrelated sound modules. James makes an interesting point at around 4:30 regarding how the expressivity of a digital keyboard can be rather limited compared to other types of controllers, in this case air-pressure based ones.

Initial Project Spider Diagram


After talking with Dr David Creasey regarding another potential final year project, he suggested that I create a spider diagram to sum up the key points of the project and also identify possible difficulties. Taking this advice I produced one for my DMI project proposal.

COLOUR KEY
Project description => no colour/black
Learning that must occur => orange
Available technology/algorithms => blue
Risks => red

I'll keep adding to this mind map (produced with www.mindomo.com) so if there's anything I may have missed then let me know and I can think about how to incorporate it.