Vlog 9 - Panels 2

A very simple demonstration of how the iPhone can be used in actual practice. It will be interesting to see how a dancer can actually use this. One thing I note here is that the system will need to be controllable with one hand whilst moving (dancing). So I will have a look at the size and placement of the buttons on the screen, just to make sure that the interface is simple enough to be used whilst dancing without having to look at the screen to locate where the buttons are. That would be distracting and could have a limiting impact on the dancer's intention.

Vlog 8 - Panels 2

In this video we look at the development of the panel system for the iPhone. Built with the Mira software solution, the panel system allows the interface on the iPhone to be customised, which has a number of potentially exciting applications. The concept behind it is flexibility: the user can customise the interface to suit their requirements and configure the sonic media from their device whilst in situ, rather than having to pre-configure their sampled material on the host computer before commencing work on the dance floor. This means the user can make adjustments to the sonic media in real time, which will benefit the creative process.

Vlog 7 - Panels 1.1

This video takes a look at the development of an interface for the iPhone that allows the user to interact directly with their sonic media. It looks at the Max programming that interprets the incoming data from the iPhone. Although it is very much in a "working" form, it is clear that this interface has the potential to function effectively with the iPhone.
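
As a rough illustration of this idea outside Max, the sketch below listens for OSC messages coming from the phone and routes them by address in Python. The /gyrosc/gyro address and the port number are assumptions on my part; the real values depend on the app sending the data.

```python
# A minimal sketch, in Python rather than Max, of receiving and routing
# the phone's OSC data. Requires the python-osc package.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_gyro(address, *values):
    # e.g. three floats: rotation rate around the x, y and z axes
    print(address, values)

def handle_unknown(address, *values):
    # anything we have not explicitly mapped ends up here
    print("unmapped:", address, values)

dispatcher = Dispatcher()
dispatcher.map("/gyrosc/gyro", handle_gyro)      # assumed address
dispatcher.set_default_handler(handle_unknown)

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)  # assumed port
server.serve_forever()
```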

Vlog 6 - Implementing MiraWeb

This short video demonstrates what the MiraWeb software actually is and how it can be applied to the development of the interface for the iPhone to control sonic media. This is an exciting development as it demonstrates the potential for a graphical user interface (GUI) that will give the iPhone even greater functionality in a system of choreography.

Vlog 5 - iPhone Proto 1.1

Here we can see the first real implementation of the iPhone as a controller. Linking it to the Max/MSP software via the gyroscope application has excellent potential. In order to make this system usable in a practical sense, I have to look at ways in which I can scale the data that is coming in from the iPhone. In its raw form, there is a tremendous amount of data flowing in from the iPhone, in a range of values that does not correlate with, for example, MIDI values (0-127). Additionally, the rate at which the data flows in (and changes) is problematic and will need to be controlled in a way that provides meaningful feedback that can be mapped to a synthesiser or sample playback. It would also be interesting to see what type of recording functionality can be built into the system, so that a dancer could play back a series of movements captured by the iPhone tracking them through the space. To do this I will look back to my previous work on recording data into a coll in Max.
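
As a rough sketch of these three ideas outside Max, the Python below scales a raw gyroscope value into the MIDI range 0-127, applies simple one-pole smoothing to tame the update rate, and records timestamped values into a list (a rough stand-in for a Max coll) that can be replayed with its original timing. The assumed raw input range is illustrative only; the real range depends on the sensor and the app.

```python
# A minimal sketch: scale, smooth and record a stream of raw gyro values.
import time

RAW_MIN, RAW_MAX = -10.0, 10.0   # assumed raw gyro range

def scale_to_midi(raw):
    """Clamp a raw value and map it onto 0-127."""
    clamped = max(RAW_MIN, min(RAW_MAX, raw))
    return round((clamped - RAW_MIN) / (RAW_MAX - RAW_MIN) * 127)

class SmoothedRecorder:
    """Exponentially smooth incoming values and record them with timestamps."""

    def __init__(self, smoothing=0.2):
        self.smoothing = smoothing
        self.value = 0.0
        self.events = []          # (elapsed_seconds, midi_value) pairs
        self.start = time.time()

    def feed(self, raw):
        midi = scale_to_midi(raw)
        # one-pole smoothing so the output does not jump with every message
        self.value += self.smoothing * (midi - self.value)
        self.events.append((time.time() - self.start, round(self.value)))
        return round(self.value)

    def playback(self):
        """Replay the recorded gesture with its original timing."""
        last_t = 0.0
        for t, midi in self.events:
            time.sleep(t - last_t)
            last_t = t
            yield midi
```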

Vlog 3 - iPhone Proto 1

The first example of an iPhone being used to control sounds on a computer. By utilising the iOS gyroscope application I am able to send data from the iPhone over OSC, which is received in Max. Max interprets the incoming data streams, then scales and routes them in a way that can be used musically. This is an exciting development and has great potential for the system. With so many iPhones already in the hands of students in schools, I believe this platform should be very attractive to both teachers and students. The real challenge in implementing the iPhone in the system will be controlling the way in which the streamed data is utilised in the playback of sonic media. I need to find a meaningful way to control sound, not just a means to switch sounds on and off.
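
As a side note, a simple way to test the receiving patch without the phone is to simulate the incoming stream. The Python sketch below sends a slowly varying stream of fake gyroscope values over OSC; the address, host and port are assumptions and would need to match whatever the patch expects.

```python
# A minimal sketch of the sending side, standing in for the phone while testing.
import math
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)   # assumed host and port

t = 0.0
while t < 10.0:                               # run for ten seconds
    # a smooth sine sweep standing in for rotation around one axis
    client.send_message("/gyrosc/gyro", [math.sin(t), 0.0, 0.0])
    time.sleep(0.02)                          # roughly 50 messages per second
    t += 0.02
```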

© MARK BROWN 2017