No blog for a while – the last month or so has been really busy for me. I’ve been working a lot on bringing together the ideas from my exploratory project to create various sound games. The project has been really hard work in terms of coding and linking the interactive, visual and sound elements, but I think it has all come together really well : )
Here are some of the experiments that I worked on:
In an attempt to explore the ways in which sound can be used within interaction, I have developed several experiments which link various elements of interaction with sound, both in the form of music and sound effects. I began by selecting parameters of sound which I could modulate, such as volume, pan and tempo, and also effects which I could add to sounds in order to change their audible properties, for example reverb or delay. I then created different types of basic interaction within Flash, all of which are based on manipulating an object which then acts as a modulator for the sound attached to it.
In terms of interaction, I decided to use basic shapes such as squares, circles and triangles as visual symbols. I then applied a user interaction, such as a roll-over, click or drag, which would both change a visual aspect of the shape and modulate the sound in some way.
During the process of making these experiments, I found that developing one solution would lead me to come up with other ideas for interactions. I also found that looking at existing examples of interaction with sound gave me ideas for other experiments to try out.
In this initial experiment, I wanted to look at how several sounds could be used at once, to find out how easily they could be controlled and synchronised.
The control element within this experiment consisted of very conventional volume controls, each of which adjusted a separate sound clip. In this example, three sound loops begin playing, each of which can be turned up or down using the volume control below it.
In the second initial experiment, I wanted to get to grips with some of the possible sound modulation functions that can be used within Flash. I created three arrays of bars which, when rolled over, would each decrease or increase the value of a parameter in cumulative steps of ten. The strength of each bar is also indicated visually by its opacity and length: the lighter and shorter bars have a weaker effect on the sound playing, and the darker and longer bars have a stronger effect.
Within this example, I attached the volume adjustment of the sound clip to the set of bars placed horizontally at the top of the screen, and the left and right pan of the sound clip to the vertical bars at the left and right edges of the screen. By moving the mouse to the top, left or right of the screen, the user can determine the volume of the sample playing, as well as the direction from which the sound is coming.
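The bar mapping described above can be sketched in JavaScript (the original was ActionScript). The bar count, the 0–100 volume range and the -100..100 pan range are assumptions; the cumulative steps of ten come from the experiment itself:

```javascript
// Rolling over the n-th bar sets the parameter in steps of ten.
// Bar 0 is the shortest, lightest bar and has the weakest effect.
function volumeForBar(index) {
  return (index + 1) * 10; // ten bars would span 10..100
}

// The vertical bars at each edge pan the sound toward that side;
// deeper (longer, darker) bars pan harder.
function panForBar(index, side) {
  const amount = (index + 1) * 10;
  return side === "left" ? -amount : amount;
}
```

So sweeping the mouse up a column of bars walks the parameter through its range ten units at a time, which matches the cumulative behaviour of the rollovers.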
Both of these initial experiments provided me with the basic technical functions that could be employed to manipulate sound within future experiments.
This is one of the first experiments that I created, based on the idea of visual focus being a modulator for audible focus. Within this experiment, I first created a series of three shapes which appear out of focus; rolling over a shape with the mouse gradually brings it into focus.
After creating the visual element of the interaction, I then attached a short sound clip to each shape on the screen which automatically begins playing after the Flash movie has loaded. To link the visual interaction to the sound, I set each shape to fade out all other sounds but its own. The resulting Flash movie creates the impression that visual focus also triggers audible focus.
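The "visual focus equals audible focus" rule can be sketched as a simple mapping, here in JavaScript for illustration (the shape names and volume scale are assumptions, and the original Flash version fades gradually rather than jumping):

```javascript
// Given which shape the mouse is over, return the volume each shape's
// sound should fade toward: the focused shape plays at full volume,
// every other sound fades out.
function targetVolumes(shapes, focused) {
  const out = {};
  for (const s of shapes) {
    out[s] = (s === focused) ? 100 : 0;
  }
  return out;
}
```

In the actual experiment these targets would be approached over time as the shape sharpens, so the audible focus follows the visual focus.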
I tried this experiment with both individual sounds and short music loops in order to test some of the different ways that this control could be used.
Drag and Drop
In order to test some of the possible functions of the drag and drop interaction, I experimented with the idea of having a sound attached to a target which would be activated when a visual key is dropped on to it.
Within this experiment, I also tried to create a connection between the visual and the audible by simulating the effect of sound quality being changed by physical boundaries. In this example, three different sound samples can be activated by dragging and dropping a figure into a box. In addition to the drag and drop interactivity, the user also has the opportunity to open or close each box. Dropping the figure into an open box activates a clear, unaffected sound sample; dropping it into a closed box, however, causes the sample to reverberate.
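The drop logic amounts to a small lookup, sketched here in JavaScript (box identifiers and the returned object shape are illustrative, not the original ActionScript):

```javascript
// When the figure is dropped into a box, the sample that plays is the
// one attached to that box, and the box's lid decides the treatment:
// an open box gives the dry sample, a closed box the reverberant one.
function soundForDrop(boxId, isOpen) {
  return { sample: boxId, effect: isOpen ? "dry" : "reverb" };
}
```

The visual metaphor does the explanatory work: the closed lid is the "physical boundary" that the reverb simulates.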
In this experiment, I decided to look at how the variable volume control could be configured to drive both shape-scaling and sound-modulating functions, as a way of getting away from the conventional controls that are usually used for sound.
By reworking the interactivity of the volume control to double as a scaling function on a shape, the control of sound is made more interesting and could lead to the creation of a more original and exciting sound production interface.
I initially began to experiment with making shapes which could be resized using a draggable handle on the bottom right corner. I then added the volume variable to the same handle so that as the shape is resized by the user, the volume of the sound adjusts at the same time.
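The dual-purpose handle can be sketched as one function of the drag position, written in JavaScript for illustration. The 100 px base size, the uniform scaling and the linear volume mapping are all assumptions about how such a handle might be wired up:

```javascript
// Dragging the bottom-right handle sets both the shape's scale and the
// sound's volume from the same gesture.
function handleDrag(dragX, dragY, baseSize = 100) {
  // Use the larger drag axis so the shape scales uniformly.
  const size = Math.max(dragX, dragY);
  const scale = size / baseSize;            // 1.0 = original size
  const volume = Math.min(100, scale * 50); // bigger shape = louder, capped
  return { scale, volume };
}
```

The point of the sketch is that a single drag updates two outputs at once, which is what makes the control feel like one object rather than a shape plus a separate slider.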
Having developed controls for the volume and pan modulators of sound, I then looked into ways in which the tempo of a sound could be controlled. I started out by creating a button controlled tempo adjustment which set the intervals at which a sound was triggered.
In order to represent the sound visually, I created a short pulsing animation which fired a dot every time the sound was triggered, similar to the blip on a sonar display. This resulted in a relatively crude form of control whereby each button represented a timing.
In order to bring this control in line with the other experiments that I have produced, I incorporated the variable control which was used with the volume and pan experiments. I also made the control more visual by animating the dot to travel around a square: in this example, each time the dot hits a corner of the square, the sound attached to it is triggered.
I then developed this experiment further to incorporate different shapes, featuring different sounds, all of which could be placed on the screen at the same time. By resizing each of the shapes, it is possible to adjust the tempo.
The development of each of these experiments has allowed me to try out different techniques for modulating sound within interactive experiences.
In terms of further development of these particular experiments, I would like to try to refine each of the controls and also add more functionality to them. For example, it would be interesting to develop some form of music making game where the coordinates of the shape on screen, after it has been dropped, could be used to determine its pan and volume settings, or maybe the pitch of the note it plays. The user could then build up layers of sound by dropping more shapes on the stage in different positions with different sizes to construct rhythms.