Investigates ways of constructing intelligent agents that work as independent spatial features or combine to assemble virtually infinite constructs. The ‘Angel’ project plays with architecture’s historically rigid nature, playfully exploring the possibilities of an architecture lighter than air, capable of sheltering us and even bringing communities together.
The initial concept developed out of a building proposal in which a conversation space could transform its spatial conditions, reacting to a set of protocols based on the inhabitants’ discourse. The constantly reconfiguring space was actuated by a series of agents that could descend, rise, approach and retreat from the people within the space, as well as articulate a range of behaviours. These “Gestures” attempted to act as catalysts for the generation of new conversation and interaction. This investigation led to the exploration of LTA (Lighter Than Air) vehicles capable of acting independently or in flocks, constructing dynamic spaces for people to meet.
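As a rough illustration of the kind of protocol described above, the sketch below (in Python, purely hypothetical — the project’s actual control logic is not documented here) maps a conversation-activity level to the gestures of descending, rising, approaching and retreating. The `activity` value and the altitude/distance thresholds are invented for illustration.

```python
# Hypothetical sketch of an Angel's gesture protocol: lively conversation
# draws the agent in and down; a lull makes it rise and give space.
from dataclasses import dataclass


@dataclass
class Angel:
    altitude: float = 3.0   # metres above the floor (assumed range)
    distance: float = 2.0   # metres from the nearest inhabitant (assumed)

    def gesture(self, activity: float) -> str:
        """Choose a gesture from a 0..1 conversation-activity level."""
        if activity > 0.7:                  # lively talk: descend and approach
            self.altitude = max(1.5, self.altitude - 0.2)
            self.distance = max(1.0, self.distance - 0.2)
            return "descend/approach"
        if activity < 0.3:                  # lull: rise and retreat
            self.altitude = min(4.0, self.altitude + 0.2)
            self.distance = min(3.0, self.distance + 0.2)
            return "rise/retreat"
        return "hover"                      # otherwise hold position


angel = Angel()
print(angel.gesture(0.9))  # → descend/approach
```

Even this toy version shows the design idea: the space reconfigures continuously, because each gesture nudges the agent’s position rather than jumping to a fixed state.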
Below are initial concept images of how these flying, transforming agents would interact and transform.
Our research examined how simple behaviours actuated by the first iteration of Angels affect the experience of a ‘conversational’ space. The following images show our test environment, in which we were able to measure the success of the LTA vehicles’ movement and interaction with inhabitants.
A number of observations and recordings were made over two days of flight testing. The next stage was to analyse these critically and focus on the individual behaviours that proved most successful. Part of our investigation was also to experiment with suitable forms of notation to express interaction in space. Initial drawings described the motion paths of the Angels and inhabitants; these were later followed by notation that correlated statistical data.
Using the Angels’ onboard vision system, transmitted wirelessly to a local computer, we processed real-time data from the conversation space using a piece of software we developed in MaxMSP Jitter, which generated formal representations to support our recording and notation of the interactions that occurred. These representations also provided an added layer of feedback when projected back into the conversation space. Below is a sequence of transformations over three seconds based on input data from the Angels’ onboard sensors, followed by a 60-second timeline exploring statistical representation as a tool for notation and analysis.
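The statistical reduction behind such a timeline can be sketched as follows — here in Python rather than the MaxMSP Jitter patch actually used, with an assumed 10 Hz sample rate and a synthetic sine wave standing in for real sensor readings.

```python
# Hypothetical sketch: reduce a flat stream of sensor samples into
# per-second averages, one bin per second of the 60-second timeline.
import math


def timeline(samples, rate_hz=10):
    """Average a sample stream into one mean value per second."""
    bins = []
    for start in range(0, len(samples), rate_hz):
        window = samples[start:start + rate_hz]
        bins.append(sum(window) / len(window))
    return bins


# 60 s of fake readings at 10 Hz: a slow sine standing in for motion data.
readings = [math.sin(t / 50.0) for t in range(600)]
summary = timeline(readings)
print(len(summary))  # → 60 one-second bins
```

Each bin could then drive one frame of the projected notation, so the graphic condenses the raw stream without losing its overall shape.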