Juxtaproject is an interactive installation focusing on immersive portable technology fused into a projection-based installation. The piece is inspired by the strange, alternative and twisted realities of the human form depicted in Hieronymus Bosch's The Garden of Earthly Delights.
Each device within the installation projects surreal content with its own visual and tactile characteristics, including striking audio and visual imagery. The portability allows for creative interaction with audiences, who can move and rearrange the content to create new and reformed versions of the human form.
As part of my practice-based research I have been using my time during the MA to work on a variety of lighting, projection and digital projects. Working with Squidsoup on one of their new lighting projects at Kew Gardens allowed me to explore the realm of large-scale immersive environment creation.
Squidsoup are renowned for immersive lighting environments built around audience participation and interaction. With an extremely successful LED grid recognised for its elegant immersion and beautiful tactile experience, the group are looking to take these conceptual ideas into a new area of development and research. The grid is connected to practical research into exploded pixels, burst out of the screen and into the physical environment. It uses colour systems, particle physics and flocking patterns to display a natural flow of imagery and shapes in three dimensions, immersing audiences in a 3D space they can move through.
The new project, part of the Kew Gardens Christmas lights, is a conceptual rethinking that distributes the grid across a single plane while producing a similar effect to the grid's physical presence. The first iteration of the project was displayed in Cardiff port and has been developed a step further in the current iteration. The latest version adds far more features such as Wi-Fi, GPS data, sound control and accelerometer data. All of these data inputs can now be used to generate a range of new dynamic light designs distributed across a vast plane of 1000 LED lights.
The new project has many connections to the well-explored grid, as it works off similar ideas around the redistribution of pixels into a physical landscape. However, the flattening of the grid introduces a new consideration of perspective and point of view. This leads to discussions of pattern, collection and correlation between natural forms across a flattened, far less three-dimensional surface. This restructuring provides the Squidsoup researchers with an entirely new perspective on distribution and dissemination in the form of lit immersive spaces.
The LED system is based on a hi-tech wireless system controlling and distributing data to the 1000 individual devices. Power cords connect the devices, but the main control comes from OSC commands sent over the network. These commands specify which colours should be displayed or which sounds played from the small speaker. Arduino is used as the primary controller on each device, which is reached through a web server. Processing is then used to compose and send OSC messages to the devices in order to command them into specific patterns and alignments. The technology raises many issues when trying to sync a vast number of devices on a single network, including the huge amount of data that is returned.
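Since the control path described above runs over OSC, it may help to sketch what one of those network commands looks like on the wire. The address pattern `/led/colour`, the device-id argument and the RGB layout below are illustrative assumptions, not the project's actual protocol; the byte packing, though, follows the standard OSC 1.0 message format:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad to the 4-byte boundary OSC requires (always at least one NUL)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *ints: int) -> bytes:
    """Build a minimal OSC message carrying int32 arguments only."""
    msg = osc_pad(address.encode("ascii"))                    # address pattern
    msg += osc_pad(("," + "i" * len(ints)).encode("ascii"))   # type-tag string
    for v in ints:
        msg += struct.pack(">i", v)                           # big-endian int32
    return msg

# Hypothetical command: set device 42 to full red (R, G, B)
packet = osc_message("/led/colour", 42, 255, 0, 0)
# the packet would then be sent over UDP to the relevant device
```

In the real system a Processing sketch would be generating many such messages per frame, one reason the volume of network traffic across 1000 devices becomes a problem.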
Being given the chance to work on a project like this was a great pleasure and opened my eyes to the possibilities of immersive environments and the scale, production, time, complexity and planning that goes into developing such a project. It has also added many new ideas to my own practice, such as the element of distributing pixels across surfaces. This idea has been toyed with in relation to projection as a step beyond augmented reality, with a direct link to the New Aesthetic as well.
Description: A digital projection mapping exploring the complexity and changeability of the human form through ideas connected to body dysmorphia. The piece aims to portray the complex social and cultural issues within modern society connected to body image and mental health. Through the use of digital imagery and sculptural form, these ideas are presented to the audience in an immersive visual display.
In today's digitally advanced society it becomes impossible not to perceive that we are overwhelmed by technology, and that the boundaries between real and simulated worlds are blurred by technology and by what derives from it, acting as an agency upon us. At the same time, the notions of old and new within this context become indistinct as tech objects are continuously remediated, resurfacing and finding new uses, contexts and adaptations (Parikka 2015).
Technological obsolescence, as discussed by Slade (2006), has shaped a psychologically conditioned society that drives the development of technology, resulting in ever-increasing waste. The remediation of technology has become a very recent reaction to the increases in e-waste, as consumers react against branding, marketing and mass production. The physical waste within the installation is designed to illustrate the overconsumption and disposal of e-waste, and to create a new product for artistic purposes.
The installation allows the audience's senses to pass through reality via interaction with a medium constituted by water-immersed obsolete objects. This augmented reality was created through projection mapping onto the water surface. Through a public exhibition curated by Dr Anna Troisi, the immersive installation was used to present these ideas to the audience and gather responses reflecting their perceived understanding. The use of projection mapping within augmented reality is allowing designers to change the audience's perception within their own environment, distorting that perception and therefore blurring the boundaries between realities.
From the study it has been understood that, through the use of tactile interaction, different perceptions and emotions can be felt due to the audience's ability to participate socially through individual and group interaction. The key outcome of my study is the feedback from participants, which showed that this interactive installation started to break down the modern art gallery's perception of interaction and involvement with the art itself.
Create a piece of interactive information design for a shared public space, which is intended to elucidate/explain an idea or concept you perceive as key to our 21st-century media experience.
The goal of the installation is to present the idea of the ambience of an environment and the changes in it. The final installation is a combination of theories involving colour, sound, ambience, audiences and design principles. The idea of ambience was presented using a Processing sketch that took in sound through a microphone and presented it back to an audience on a display. The area for the installation and the hardware available had major effects upon the final outcome, as they dictated some design factors. These were resolved to produce an information graphic that used people's presence within the environment as a form of interaction.
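The core of a sketch like this is the mapping from microphone level to a visual property. A minimal sketch of that idea, assuming the level arrives normalised to 0.0–1.0 (the function name and range values are placeholders, not taken from the installation's actual code):

```python
def level_to_radius(level: float, min_r: float = 20.0, max_r: float = 300.0) -> float:
    """Map a 0.0-1.0 microphone amplitude onto a circle radius in pixels."""
    level = max(0.0, min(1.0, level))      # clamp out-of-range readings
    return min_r + (max_r - min_r) * level  # linear interpolation

# a quiet foyer gives a small element, a busy one a large element
quiet = level_to_radius(0.1)
busy = level_to_radius(0.8)
```

In Processing the same interpolation would be a `map()` call inside `draw()`, re-evaluated every frame as the mic level changes.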
The final piece transformed several times throughout its design iterations.
Initially it mainly focused on presenting music back to an audience using visual screen design.
The introduction of audience interaction using reacTIVision and Minim created an activity of participation around feelings and audience involvement, but didn't present a media concept specifically.
Lastly, ambience and the theory of passive participation allowed an audience to take part: knowingly or unknowingly, they were actively participating in media constantly.
In the foyer, testing showed the concept works well to show the change in activity within the area. It also adds some nice atmospheric sounds, describing the current environment's sound level in comparison to another place through the form of sound. I think it could work well as a tool for classrooms to control sound levels, or as a scale of sound measurement within busy areas.
One aspect of the final installation that didn't work was not being able to use the large screen high up. This created a problem in terms of the audience, as the piece was meant to be seen only by an active audience looking up towards the screen. The screen used in testing is in a direct line of view, meaning even a passive audience would see it, possibly defeating the object of one of the concepts.
As an infographic it works well to describe the changing environment as it becomes busy or quiet, with a design that doesn't require too much processing by an audience passing by. The design integrated well with the architecture of the environment, as its bright colour worked well with that of the foyer.
To improve, more consideration could be given to the scale measurements, as they might need some adjustment and a slightly more precise system of calculation to give a better reading. As an installation it could be improved with more user interaction, which could come from use of the camera in some aspect. The motion detection attempted to do this; however, it was hard for the audience to understand, and a more fun and inventive way of interacting with the ambience of movement was needed.
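One candidate for a more precise system of calculation would be converting the raw amplitude to decibels, so the scale labels line up with a familiar unit. This is a sketch of the idea under that assumption, not what the installation actually used:

```python
import math

def amplitude_to_db(amplitude: float, reference: float = 1.0) -> float:
    """Convert a linear amplitude to decibels relative to `reference`."""
    if amplitude <= 0:
        return float("-inf")  # silence has no finite dB value
    return 20.0 * math.log10(amplitude / reference)

# full-scale input reads 0 dB; a tenth of full scale reads -20 dB
loud = amplitude_to_db(1.0)
quiet = amplitude_to_db(0.1)
```

Labels such as "quiet classroom" or "busy foyer" could then be pinned to fixed dB bands rather than to raw, hardware-dependent amplitude values.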
Next time I would like to put it in a few different places to see what kinds of reactions audiences have. An example could be as an event backdrop, encouraging, in opposition, a louder and noisier ambient environment.
A quick re-test of the screen once again threw up some possible issues and problems to resolve. As discussed in the previous post, the decision was taken to combine the visualizer and ambient audio sketches. This combination was tested on the screen to see what the results were like in the environment.
As you can see from the video, there was quite a lot of erratic behaviour in some elements. This was due to the sensitivity of the microphone, but also the background music being played by Costa Coffee. A possible fix could be adding a variable to reduce some of that sensitivity, which could be adjusted for different locations.
This small implementation of a variable across all elements could allow an install team to adjust the sound level for different environments or places, cancelling out unwanted noise. This would mean it could be calibrated for different install locations if it were to be placed elsewhere. Even though the piece was designed to pick up all the noise, which most of the elements do, some require dampening down, as they will start to behave erratically if not.
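The dampening variable described here can be sketched as a per-location calibration step. `noise_floor` and `sensitivity` are assumed parameter names that an install team would tune on site; the sketch is illustrative, not the installation's actual code:

```python
def calibrate(level: float, noise_floor: float, sensitivity: float = 1.0) -> float:
    """Subtract the venue's background noise from a 0-1 mic level, then scale.

    The result is clamped back to the 0-1 range the visual elements expect,
    so constant background sound (e.g. in-store music) reads as silence.
    """
    adjusted = (level - noise_floor) * sensitivity
    return max(0.0, min(1.0, adjusted))

# if the coffee shop's background music sits around 0.3, it cancels out,
# while genuinely busy moments still register above zero
ambient = calibrate(0.3, noise_floor=0.3)
busy = calibrate(0.7, noise_floor=0.3)
```

Storing `noise_floor` and `sensitivity` per location would let the same sketch run calmly in a quiet gallery and a noisy foyer alike.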
Live testing within the installation area has given me some food for thought in regards to the final design of the installation. After some issues with hardware, design and motion detection, a synthesizer version of the ambient detection-and-representation sketch is presented below. As you can see, it's relatively simple in design, with two simple, refined moving features affected by incoming microphone audio.
The visualizer design, which was pushed out as a result of the change of screen, could be used as a way to introduce some more animated parts. Its design was compromised by the change in screen; however, it had some really cool features that could be utilized. The emotive time-based colour schemes and complex system of music analysis display changes across different audio frequencies. It would need some reconfiguration, but it could add more depth to the current layout, as there is lots of empty space that could be utilized.
Needs some redesign.
Motion ambience has been left out, based on the recognition system's and hardware's limited ability to detect the difference between motion at varying distances. The camera not being mounted in a high position meant there was too much interaction in close proximity; however, about 4 metres back very little was being detected, creating a representation that did not match its desired effect/purpose. Some exploration of the Kinect could have helped, but it was decided that this was not a fundamental part of the sketch and could be taken out.
Combining the visualizer and ambient audio can be seen in the video below. Some of the graphics of the ambient audio sketch were not implemented originally, as it was considered they might not be needed.
It was good to see all the elements working together, but the design still looked messy and, as before, the lack of a purpose for some of the elements left them undefined. Simply reapplying the infographic designs added purpose to elements in the design, as it gave them a descriptive value.
Testing the installation in the foyer brought about slightly different results to what I was expecting. In the previous post I discussed how the design had to be changed due to the large screen being out of use, which substantially affected the design. This also had an effect upon the setup of equipment. I had a setup in mind for how the microphone for the audio input and the camera for the motion detection would be situated. As you can see from the diagram below, the equipment was placed in high positions to gather data from a larger sample. The microphone was placed high up to gain better coverage of the whole environment; otherwise people closer to it would have a disproportionate effect.
In the area it was more difficult to get the microphone and camera into high positions, as we had limited equipment such as ladders and cables. The compromise was facing the microphone and camera towards the main walking area, which would capture the most action and best emulate the changing environment. This was not my original plan, but it was the only way I could ensure the changing ambience was similar to that of the environment.
First Day Testing
The videos below show the first simple visualizer working on the screen. At the time I was working on implementing the idea of motion ambience within the sketch. The first tests gave a good insight into how the colours and sizes of elements changed on a large display, meaning I could go away and adjust these. In terms of ambience, it occurred to me that the environment was a lot noisier than expected, and the ambient music that Costa Coffee was playing started to interfere. Lastly, it was suggested that some idea of measurement could be implemented, such as labels for each sound level, to give the graphics slightly more of a descriptive element. These problems are addressed in the next test situation.
Implementing the motion ambience was relatively simple, adding another dimension of interactivity to the installation. My thought was that the louder the environment got, the greater the amount of movement would be. Testing proved important at this point, as it revealed some unforeseen issues with the motion-detection aspect of OpenCV.
As you can see from the video below, the motion detection was controlling the white bar, which increased in size the more movement it detected. The video shows that the sound was increasing but the motion wasn't; as there were lots of people in the foyer, the two should theoretically have been at similar levels. From moving around in front of the sketch, it became apparent that people closer to the camera were having a greater effect upon the detected motion. This meant there was very little movement registered in contrast to the audio. This was unexpected and needs to be looked at in the next development.
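The distance bias observed here is inherent to naive frame differencing: motion is measured in changed pixels, and a person close to the camera covers far more pixels than the same person 4 metres back, regardless of how much they actually move. A toy pure-Python stand-in for the OpenCV measurement illustrates this:

```python
def motion_amount(prev_frame, curr_frame, threshold=30):
    """Count pixels whose grayscale value changed by more than `threshold`.

    Frames are flat lists of 0-255 grayscale values; this mimics the
    changed-pixel count a basic frame-difference detector reports.
    """
    return sum(1 for a, b in zip(prev_frame, curr_frame) if abs(a - b) > threshold)

# the same person, represented as a block of changed pixels:
near = motion_amount([0] * 100, [255] * 50 + [0] * 50)  # fills half the frame
far = motion_amount([0] * 100, [255] * 5 + [0] * 95)    # tiny at 4 metres
# `near` is ten times `far`, so the white bar tracks apparent size, not activity
```

Normalising the count by the detected blob's size, or using a depth camera such as the Kinect mentioned later, would be the usual ways to correct for this.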
In summary, from testing in the environment I have found:
The motion detection setup didn't work as expected; it was mainly influenced by the distance of the motion from the camera, not the amount of movement itself.
Implementing the scale in the design explained the purpose of the installation far better without making it too forceful and full of text.
Adding the sound as a representation of a similar noise level to the one being experienced was effective, as it grabbed people's attention while producing a good signification of their surroundings.