16.5.09
Unballooning the smear
During the last crit, one point was not to represent the virtual smear as a balloon/slug shape, which reads as clichéd and hackneyed. Instead I am looking to think of the smear as an additive or subtractive response within the apartment. If the apartment is a sponge, how would the occupant's smear react to touch/smell/sound ... ? London-based Not To Scale offers one amazing way of expressing what we cannot see, superimposing it over an existing space.
Selective Animation
To create this hypnotic video for Dominik Eulberg’s “SANSULA (oder der letzte Grund)”, director Dirk Rauscher works with little more than lights, night-time, and a forest. But the visions he realizes with them, simply by projecting animations onto the trees, are so much more than the sum of their parts…
In the dark, what one can't see becomes visible...
SANSULA Dominik Eulberg musicvideo from dirk rauscher on Vimeo.
22.4.09
Invisible Magnetic Fields no longer Invisible
Invisible Magnetic Fields at NASA's Space Sciences Laboratories, UC Berkeley
Press Play
Fantastic movie done by Semiconductor
///
Semiconductor's Magnetic Movie: by Douglas Kahn
''In 1744 a simple experiment was conducted in Sweden to reproduce the underlying cause of the Aurora Borealis in a laboratory, what we would now think of as a room. A small hole in a shade "the size of a large pea" let through a ray of sunlight that then was refracted through a prism. The small patch of light broken into a spectrum of colours then traveled through a medium of turbulent air directly above a warmed glass of aquavit. The resulting image landed on a screen a few short feet away and looked like what was seen dancing in the sky on many long Swedish nights, nature's sublime entertainment in the real pre-history of cinema.
The experiment concluded that the aurora was caused by a refraction of light through volatile vapors. Straining a rainbow through drunken air may have not proved to be most scientifically accurate recreation of the Aurora Borealis, but it was the "very most beautiful thing that can be arranged in a dark room…flashing beams shoot suddenly up and then transform into colored veils, endlessly changing position between themselves, the one against the other." The shift in magnitude from the scale of the earth to a miniature in the laboratory was no doubt greased by the remaining aquavit left undedicated to the pursuit of science.
In Magnetic Movie, Semiconductor have taken the magnificent scientific visualisations of the sun and solar winds conducted at the Space Sciences Laboratory and Semiconducted them. Ruth Jarman and Joe Gerhardt of Semiconductor were artists-in-residence at SSL. Combining their in-house lab culture experience with formidable artistic instincts in sound, animation and programming, they have created a magnetic magnum opus in nuce, a tour de force of a massive invisible force brought down to human scale, and a "very most beautiful thing."
Just as the finicky sun in Sweden was let through a small hole in the shade in 1744, scientists at the SSL at University of California in Berkeley theoretically model, conduct experiments, and develop instruments to study the magnetic fields of the sun. They study them deep inside the sun's core, their effect on the looping of the corona flaring above its surface (the photosphere, that lights our days), and the solar winds of charged particles that interact with the earth's own magnetic field, creating the auroral displays at the poles. Magnetic Movie is the aquavit, something not precisely scientific but grants us an uncanny experience of geophysical and cosmological forces.
With Magnetic Movie, Semiconductor have tapped into a new and ancient aesthetic of turbulence. We can hear it in the sounds of natural radio (naturally occurring electromagnetic signals from the earth's ionosphere and magnetosphere) that course through Magnetic Movie, at times animating the animation, a quick nervous response condensed into static. The sound itself is the product of the combined turbulences of the earth's molten core, weather systems and electrical storms, ephemeral ionization in the upper atmosphere, and the solar winds. What we hear is underscored with complex and supple orders, in fact, too complex and supple to be ordered. We already have experience of them in the tangible turbulence of water and the crazy convection of fluids combining, tongues of fire and the thermal afterthought of smoke, the ribbons of clouds stiffly blown twisted up a hill. The flux championed by Heraclitus that has awed audiences since antiquity.''
20.4.09
Maps + Wii = Tokyo Jogging
Tokyo Jogging
Ryo Katsuma, who happens to be a software engineer, made a mashup of Wii technology and Google Maps that allows you to take a virtual run through the streets of Tokyo.
press play
18.4.09
Bukowski on the Project
there is always that space there
just before they get to us
that space
that fine relaxer
the breather
while say
flopping on a bed
thinking of nothing
or say
pouring a glass of water from the
spigot
while entranced by
nothing
that gentle pure space
it's worth
centuries of existence
say
just to scratch your neck
while looking out the window at
a bare branch
that space
there
before they get to us
ensures
that
when they get to us
when they do
they won't get it at all
ever.
31.3.09
The Kitchen + Placebo Project
///\\\
Dunne & Raby’s Placebo project is an experiment that negotiates between people’s experiences with, and attitudes to, electronic objects. The Placebo’s eight pieces of “furniture” are designed to be familiar and simple, in their ordinary capacity to elicit common responses from users. The focus here is not on the objects themselves but on the stories and narratives that tie the electromagnetic fields to the inhabitants of a space.
Compass table: EM fields given off by electronic devices placed on the table’s surface cause the compass needles to twitch and spin.
Electro-draught excluder: Strategic positioning of this device helps deflect stray electromagnetic fields.
Electricity drain: By sitting naked on a stool, accumulated electricity drains from the body into the chair then out of the house through the earth pin of a special plug.
GPS table: The table has a small display set in its surface which either shows the word "lost" or its co-ordinates. It should be positioned by a clean window with a clear view of the sky.
Nipple Chair: Nodules embedded in the back of the chair vibrate when radiation passes through the sitter's upper body, reminding them that electronic products extend beyond their visible limits.
19.3.09
MArch Report 08|09
MArch Report 08|09
When organizing a tea party, or in this case a report, it is best to consult the experts. During her adventures in Wonderland, Alice exclaimed, “If I had a world of my own, everything would be nonsense. Nothing would be what it is because everything would be what it isn't. And contrary-wise; what it is it wouldn't be, and what it wouldn't be, it would. You see?” [Alice, 1951] The dichotomy of alternates, what is and what isn’t, is fascinating. By creating a dialogue between virtual and actual, Mirror Mirror examines how space is perceived in a contemporary light. It accomplishes this by grasping traditional methods of viewing space and upgrading said methods with the aid of contemporary technology. Old-school meets New-skool with the utilization of Augmented Reality and Head Tracking.
Back in the Fall of ’08 the project began with three prescribed settings [cliff/forest/swamp]. Mythologies, by Roland Barthes, was on the nightstand at the time and the concept of semiotics was applied to the site analyses. The original three landscapes were never conceived with a narrative backbone. Instead, semiotics intervened by allowing the idea of each landscape to dictate its aesthetic; meaning, by deriving a hierarchical set for each [cliff, forest and swamp], signs and symbols pertaining to each site were gathered from around the world. After slicing, distorting, pasting and scaling, three imaginary landscapes were born. In order to further analyze each as a prospective site, I referenced the component images back onto their geographic origins. Instead of a site being one stationary location, a site is now redefined as a collective, fixed through the lens of a camera and uploaded into cyberspace.
The imaginary forest, cliff and swamp were ribbons of landscape taken before, during and after an agent of change. For the purpose of this report it is enough to state that this agent is no longer part of the project. Instead the project continued with the images themselves as a starting point for an investigation into the perception of space. Exploring the spatial quality of these landscapes became important; one way of doing so was to introduce new methods of visualization with the aid of technology such as Augmented Reality and Wii Remote [Wiimote] Head Tracking. These applications were selected because of their ability to add unforeseen factors to the project. By using Head Tracking and Augmented Reality the exchange between actual and virtual becomes immediate. To properly understand Augmented Reality a comparison with Virtual Reality is necessary. The user in Virtual Reality undergoes a complete immersion into a virtual environment; whereas Augmented Reality is the superimposition of information or virtual objects over the real world. Augmented Reality operates by recognizing prescribed symbols in a real-time video feed. Over these symbols a virtual object is then projected. Now imagine an object in a room with its own symbol. The Augmented Reality system reads the space behind the object and superimposes it over the object. No longer visible within the virtual world, the object is now camouflaged or transformed. It would seem plausible that if what was on the screen was fed back via a projector into the real room, the object would be camouflaged in reality. Since Augmented Reality works with a live video feed, an object in motion would be hidden as long as the camera could capture the space behind it.
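As an aside, the camouflage step can be sketched in a few lines: given a stored image of the space behind an object and the object's marker-detected bounding box, the system pastes the background pixels over the object in the output frame. The grid-of-greyscale-pixels representation and the marker-derived bounding box are illustrative assumptions, not a real AR toolkit's API.

```python
# Minimal sketch of the "camouflage" idea: paste the stored background
# over an object's marker-detected bounding box in the live frame.
# Frames here are plain 2D lists of greyscale values (an assumption
# for illustration; a real pipeline would use camera frames).

def camouflage(frame, background, bbox):
    """Return a copy of `frame` with the region `bbox` replaced by the
    corresponding region of `background`.

    bbox = (row0, col0, row1, col1), half-open on row1/col1.
    """
    out = [row[:] for row in frame]          # copy, don't mutate the input
    r0, c0, r1, c1 = bbox
    for r in range(r0, r1):
        for c in range(c0, c1):
            out[r][c] = background[r][c]     # object pixels -> hidden space
    return out

# A 4x4 "room": the background is what the camera saw before the object
# arrived (value 1 everywhere); the live frame has an object (value 9)
# occupying rows 1-2, cols 1-2.
background = [[1] * 4 for _ in range(4)]
frame = [[1] * 4 for _ in range(4)]
for r in (1, 2):
    for c in (1, 2):
        frame[r][c] = 9

print(camouflage(frame, background, (1, 1, 3, 3)))
```

Fed back through a projector, the patched region is what would make the object visually dissolve into the wall behind it.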
Parallel to the use of Augmented Reality is the Wiimote Head Tracking hack. The Wiimote is the primary controller for Nintendo's Wii console. What sets this hand-held device apart from its predecessors is its accelerometer and optical sensor technology, which allows advanced motion-sensing capability to manipulate and interact with a screen. Traditionally the player holds the controller and points it in the direction of the screen. Above the screen sits a sensor bar that emits infra-red light. The Wiimote picks up only infra-red light, thus linking the player with the game. The controller has generated many a hack, one of these being the Head Tracking device. Developed by Johnny Lee while at Carnegie Mellon, the Wii Remote is reversed in orientation and instead searches for the user’s position in space. The user now wears the infra-red emitter, which the controller locates in space. As the user moves, the Wiimote relays their location to the screen and the display moves in sync with the user. The hack uses the Fishbowl software to operate. Fishbowl’s extroverted system allows the display to become three-dimensional; by moving around the room, images in Fishbowl can slide past each other and space is perceived along all three axes.
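The geometry behind the hack is compact enough to sketch: the Wiimote camera reports the two infra-red dots worn on the head, their pixel separation gives distance, and their midpoint gives the head's lateral and vertical offset. The camera constants below are the commonly cited Wiimote values (1024x768 pixels, roughly 45 degrees horizontal field of view); the width of the worn IR bar is an assumption for illustration.

```python
import math

# Head position from the two IR dots seen by the Wiimote camera,
# after Johnny Lee's head-tracking demo. Constants are approximate.
CAM_W, CAM_H = 1024, 768          # Wiimote IR camera resolution (px)
FOV_X = math.radians(45)          # horizontal field of view (approx.)
RAD_PER_PX = FOV_X / CAM_W
DOT_SEPARATION_MM = 200.0         # width of the worn IR bar (assumed)

def head_position(dot_a, dot_b):
    """Return (x_mm, y_mm, z_mm) of the head relative to the camera,
    given the two IR dot positions in pixels."""
    (ax, ay), (bx, by) = dot_a, dot_b
    px_sep = math.hypot(bx - ax, by - ay)
    angle = px_sep * RAD_PER_PX                   # angle subtended by the bar
    z = (DOT_SEPARATION_MM / 2) / math.tan(angle / 2)
    # midpoint offset from the image centre -> lateral/vertical position
    mx = (ax + bx) / 2 - CAM_W / 2
    my = (ay + by) / 2 - CAM_H / 2
    return z * math.tan(mx * RAD_PER_PX), z * math.tan(my * RAD_PER_PX), z

# Head centred in the image, bar spanning 100 px: only distance varies.
print(head_position((462, 384), (562, 384)))
```

Feeding this position into the renderer each frame is what makes the display behave like a window rather than a picture.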
Working with Augmented Reality and Head Tracking has rooted the project in its dependency upon the image. Rather than accept the fact as a negative, the image is analyzed in representation as a positive through collection and hierarchy; Augmented Reality and Head Tracking allow a composite reading of space via multiple images. These technologies situate the project between actual and virtual reality. Without digressing into the particulars of why Mirror Mirror is not virtual or actual, it sits between the two. The project seeks to dialogue between the two perceived spaces creating a layer that blends the two together.
From the three landscapes emerged the need for a physical testing ground. Apartment 35b was selected as the next site. It sits on the very top floor of a building by the canal in Little Venice. Like any ordinary apartment, 35b holds within its visible space an entire network of hidden spaces. Under floors and between walls are spaces that do not exist in everyday consciousness. These inaccessible depths hold the building’s organs and seem mysterious only when it comes time to repair damage. By measuring light with composite images, a perception of depth and obstruction is formed. These sections, when combined with Head Tracking, gave an intro-scopic perception of space. Head Tracking places the collective sections within an environment where a viewer can look inside walls and under floors.
After the midterm review the entire Shaun Murray studio left for Plymouth to participate in a data visualization workshop. Architecture students met with other students from both the Oslo School of Architecture and Design, and Plymouth Digital Arts and Tech. Each group could make use of either an immersive theatre or an LED screen. Our group selected the immersive theatre. The task was to take data [e.g. room temperature, wind flow and student movement] from a campus building and translate it into an easy-to-read visualization; one that would dialogue between the building’s spatial composition and the data collected. After a few hours of brainstorming the data was reduced to a hierarchy established by actual space. The data visualization began to take shape literally from this hierarchy. The goal was to make the visualization interactive, so that any user looking at the website’s live data feed could comprehend it. With the extraordinary talents of our Plymouth DAT programmer the group devised a series of Russian dolls. First the building encompassed the three atriums, which in turn held the lecture rooms. Each doll or globe would have a skin that reflected the appropriate data. So with changes in temperature it would pulse or change color.
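The Russian-doll structure can be sketched as nested shells, each carrying its own sensor reading and a skin derived from it. The nested-dict layout and the temperature-to-warmth ramp below are illustrative assumptions, not the actual Plymouth code.

```python
# Sketch of the "Russian doll" hierarchy: the building holds atriums,
# atriums hold lecture rooms, and each shell carries a skin value
# driven by its own sensor data. Layout and mapping are assumptions.

def make_shell(name, temperature, children=()):
    return {"name": name, "temperature": temperature,
            "children": list(children)}

def skin_warmth(temperature, lo=15.0, hi=30.0):
    """Map a temperature reading to a 0..1 'warmth' used to tint the skin."""
    t = (temperature - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

building = make_shell("building", 18.0, [
    make_shell("atrium_1", 21.0, [make_shell("lecture_A", 24.0)]),
    make_shell("atrium_2", 19.5),
    make_shell("atrium_3", 22.0, [make_shell("lecture_B", 27.5)]),
])

def walk(shell, depth=0):
    """Yield (depth, name, warmth) for every nested shell."""
    yield depth, shell["name"], skin_warmth(shell["temperature"])
    for child in shell["children"]:
        yield from walk(child, depth + 1)

for depth, name, warmth in walk(building):
    print("  " * depth + f"{name}: warmth {warmth:.2f}")
```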
This workshop marked a turning point within the Masters studio. No longer was the project bound to be an interactive projected space. From the workshop came a synthesis of information, technology and spatial connection. What first began as a visualization of a building’s data led to an unexpected abstraction of physical space. New virtual relationships of space came from actual room layouts. Mirror Mirror could develop meaning through the collection and combination of data within the actual and the virtual. If these hidden spaces were conduits to spaces beyond any immediate space, a similar relationship could be developed. By using data such as sound, resonance, temperature or movement as input, space is perceived as constantly vibrating and fluctuating. In effect, physical space was reorganized to amplify spatial connections and/or data readings. Apartment 35b’s hidden spaces could be amplified in this same manner. While investigating these hidden spaces its tenant gave the following testimonial:
“For some reason, I can always hear when my neighbor downstairs, a middle-aged fellow, invites his guest to his bachelor pad. The first time I heard Barry White while brushing my teeth, my curiosity was piqued, and after poking about I realized that only when standing by the sink was I able to hear sounds emanating from below. I began to imagine how I could relate this back to the hidden spaces within my apartment, like pinching together visible spaces with hidden ones. Now I start to wonder, wherever I am in the apartment, what is happening to those spaces I cannot see but can hear. I press my ear to walls and floors, trying to connect groans and creaks with rooms beneath and beside me. The water pump in the attic roars to life whenever a faucet is opened. If I go to the bathroom at 4:00am I measure the time I wash my face so as not to have the beast upstairs wake the entire house. I know my apartment well enough, but when I hear my neighbors moving about the acoustics change. I can hear conversations coming through where there should be closets and stairs, not rooms. Have they discovered new rooms or are they living in the walls?”
Like Gormenghast in your living room floor, Alice through the kitchen sink or “Schrödinger’s Cat as retold by Rem Koolhaas: a potentially unreal maze of interconnected architectural spaces enshrouding you on all sides like a halo. Saint Crawlspace.”
BIBLIOGRAPHY
Alice in Wonderland. 1951. [film] Directed by Clyde Geronimi. USA: Walt Disney Pictures.
Baird, George. 1969. “‘La Dimension Amoureuse’ in Architecture”, from Charles Jencks and George Baird, Meaning in Architecture. In K. Michael Hays, ed. Architecture Theory since 1968. 3rd Edition. Massachusetts, USA: MIT Press in assoc. with Columbia Books of Architecture. p. 69.
Barthes, Roland. 1972. Mythologies. Éditions du Seuil, 1957. Great Britain: Vintage Classics.
Lee, Johnny Chung. 2008. Johnny Chung Lee: Projects. Carnegie Mellon University.
Kelly, Kevin. 1994. Out of Control: The New Biology of Machines. USA: Fourth Estate.
Macey, David. 2000. The Penguin Dictionary of Critical Theory. Great Britain: Penguin Books in assoc. with Clays Ltd.
Manaugh, Geoff. 2008. Sounding Rooms. California, USA: BLDGBLOG website.
Sotomayor, Camila E. The Tea Party.
9.3.09
The Pinching of Spaces
For some reason, I can always hear when my downstairs neighbor, a middle-aged modelizer, has a lady over at his bachelor pad. The first time I heard Barry White while brushing my teeth, my curiosity was piqued, and after poking about I realized that only standing by the sink could I best hear the floor below. This got me thinking about the next step for the hidden spaces/workshop combo. What first began as a visualization of a building’s data led to an unexpected abstraction of physical space. Meaning, we created new relationships between rooms that weren’t there physically [see previous post]; in effect we reorganized physical space to amplify spatial connections and/or data readings.
I began to imagine how I could relate this back to the hidden spaces within my apartment, like pinching together visible spaces with hidden ones. Now I start to wonder, wherever I am in the apartment, what is happening to those spaces I cannot see but can hear. I press my ear to walls and floors, trying to connect groans and creaks with rooms underneath and beside me. The water pump in the attic roars to life when I open a valve. So I go to the bathroom at 4am and have to time my washing up so as not to wake the beast upstairs. I know my apartment well enough, but when I hear my neighbors moving about the acoustics change. I can hear conversations coming through where there should be closets and stairs, not rooms. Have they discovered new rooms or are they living in the walls? Like Gormenghast in your living room floor, Alice through the kitchen sink or “Schrödinger's Cat as retold by Rem Koolhaas: a potentially unreal maze of interconnected architectural spaces enshrouding you on all sides like a halo. Saint Crawlspace.”
Gordon Matta-Clark (American, 1943-1978) [he cut whole sections of floors, ceilings or roofs to expose the hidden spaces in between]
AntiVJ & Crea Composite: Nuit Blanche Bruxelles from AntiVJ on Vimeo.
During the crit a suggestion was to introduce a program: allow the program to drive the project, and then use the AR and Wiimote media as the tools for exploration/explanation. If I begin from a simple point, the project thus far would be an exploration of space beyond the hidden.
This was sent to me by Aqueel. During the review Nic asked if it was so bad to have a projected final project. I am keeping the option open after seeing this.
Röyksopp, “Remind Me”
6.3.09
Bartlett + Plymouth + AHO = Workshop 02.09
The entire Murray group left for Plymouth a week ago to participate in a data visualization workshop. Architecture students mixed with other architecture students from the Oslo School of Architecture and Design and with Plymouth Digital Arts and Tech students. With two tools at our disposal, an immersive theatre and an LED screen, I chose the immersive theatre. Our task was to translate data taken from a campus building into an easy-to-read visual that would reflect back on the building's spatial relationships. An initial meeting set two tasks: a. work together to propose an initial idea, and b. keep each of our individual interests active in the project. After a few ideas were tossed around we decided to attack all the data by placing it into a spatial hierarchy.

This is my Masters project shown in the immersive dome.
The data visualization began to take shape literally from this hierarchy. We wanted to make it interactive, so any user looking at the website's live data feed could understand the visualization via an attached widget. With the extraordinary talents of our Plymouth DAT programmer we devised a series of "Russian dolls": first the building encompassed the three atriums, which in turn held the lecture rooms. Each "doll" or globe would have a skin that reflected the appropriate data. So with changes in temperature it would pulse or change color.
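One way the "pulse with temperature" skin behaviour could work is to let each globe's scale oscillate over time, with warmer readings pulsing faster. The mapping constants below are assumptions for illustration, not the workshop's actual code.

```python
import math

# Illustrative skin behaviour: a globe's scale factor oscillates
# around a base value, and warmer rooms pulse with a shorter period.
# All constants are assumed for the sketch.

def pulse_scale(t_seconds, temperature, base=1.0, amp=0.05):
    """Scale factor for a globe at time t: warmer rooms pulse faster."""
    period = max(0.5, 5.0 - 0.15 * temperature)   # seconds per pulse
    return base + amp * math.sin(2 * math.pi * t_seconds / period)

# A cool room (18 C) versus a warm one (28 C), sampled over one second.
for temp in (18.0, 28.0):
    print(temp, [round(pulse_scale(t, temp), 3) for t in (0.0, 0.5, 1.0)])
```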
Below are pictures of our first try at combining the data and visuals; to the right is the final product. The skins of each globe were also papered with jpgs taken from each space.
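The nested "russian dolls" mapping could be sketched in a few lines. The room names, temperature range and readings below are invented for illustration; in the workshop these came from the campus building's live feed:

```python
# The "russian dolls" hierarchy: building > atriums > lecture rooms.
# Names, ranges and readings are placeholder values, not workshop data.

def temp_to_color(temp_c, lo=15.0, hi=30.0):
    """Map a temperature onto a blue-to-red skin color (RGB 0-255)."""
    t = max(0.0, min(1.0, (temp_c - lo) / (hi - lo)))
    return (int(255 * t), 0, int(255 * (1 - t)))

atriums = {
    "atrium_a": ["lecture_1", "lecture_2"],
    "atrium_b": ["lecture_3"],
    "atrium_c": ["lecture_4"],
}
readings = {"lecture_1": 21.5, "lecture_2": 26.0, "lecture_3": 18.0, "lecture_4": 29.5}

# Each inner "doll" gets a skin from its own reading; each atrium averages
# its rooms, and the outermost globe (the building) averages everything.
room_skins = {room: temp_to_color(t) for room, t in readings.items()}
atrium_skins = {
    a: temp_to_color(sum(readings[r] for r in rooms) / len(rooms))
    for a, rooms in atriums.items()
}
building_skin = temp_to_color(sum(readings.values()) / len(readings))
print(building_skin)
```

The same skins would simply be re-evaluated on every tick of the feed, giving the pulsing effect described above.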
Labels: drawing, inspiration, photo.graph, plymouth workshop [02.09]
23.2.09
Understanding the Composite and Spatial
The original three landscapes were never conceived with a narrative backbone. Instead I relied on the concept of each landscape to dictate its aesthetic; meaning that, by deriving a hierarchical set for each of cliff/forest/swamp, images were found from around the world. After slicing, distorting, scaling and pasting, three imaginary landscapes were born. In order to further analyse each as a prospective site, I referenced the composing images back onto their geographic origins. Instead of a site being one stationary location, site is now redefined as a collective, fixed through the lens of a camera and uploaded into cyberspace.
Applying AR and HT to space
Working with Augmented Reality (AR) and Head Tracking (HT) has rooted the project in its dependency upon the image. Rather than accepting the image as a single superficial representation, AR + HT allow a composite reading of space via multiple images. Readings of inaccessible spaces are possible via HT: you control the depth of perceived space and the cone of vision, to peer around, up and above. Augmented Reality presents a different system by referencing the virtual back into the actual world. The same spatial applications can be applied, seeing inaccessible spaces simply through an overlay.
Inside floors and walls are spaces that do not exist in everyday consciousness. These inaccessible depths hold the building's organs and seem mysterious only when it comes time to repair damages. By measuring light with composite images a perception of depth and obstruction is formed. HT places the collective in an environment where we, the viewer, can look inside walls and under floors.
Under the Floorboards
15.2.09
The what why how
This Masters project centers on the agent of change within a given environment. Presently, the project has evolved into an investigation that uses Augmented Reality [AR] as a transmitter between systems in a given setting. In adopting Augmented Reality as a tool, the project must confront the issues of AR's spatial application and the ethics behind the projected image. This discussion of ethics is necessary to the goals of the project: although a projection is inherently an application into space, the environment is not affected physically at all. Within an installation involving projections, the factors are the people, sound or movement within a space, leaving no impact upon the environment. Rather than settle for a superficial result, the use of Augmented Reality here aims for a more invasive one. Two avenues of possible architectural application currently under study are: spatial camouflage, and either inducing or alleviating a synesthetic condition.
Camouflage is promising because of its tactical approach to space; the desire of individuals to connect with their surroundings could lead to a blending of the physical and virtual. The intention here is to venture beyond an end result that is merely an installation piece relying on projections. In order to do so, an understanding of the limitations of Augmented Reality is essential to exploring it as a tool. To properly understand Augmented Reality, a comparison with Virtual Reality [VR] is necessary. The user in Virtual Reality undergoes a complete immersion into a virtual environment, whereas Augmented Reality is the superimposition of information or virtual objects over the real world. AR operates by recognizing prescribed symbols in a real-time video feed; over these symbols a virtual object is then projected. Now imagine an object in a room with its own symbol: the Augmented Reality system reads the space behind the object and superimposes it over the object. No longer visible within the virtual world, the object is camouflaged or transformed. It would seem plausible that if what was on the screen were fed back via a projector into the real room, the object would be camouflaged in reality. Since AR works with a live video feed, an object in motion would remain hidden as long as the camera could capture the space behind it.

Parallel to the Augmented Reality concept is the Wii Remote Head Tracking1 hack. Developed by Johnny Lee at Carnegie Mellon University, the hack reverses the Wii Remote's orientation so that it instead searches for the user's position in space. This extroverted system then allows the display to become three-dimensional within its frame. Further investigation will combine this tool with the intention of spatial camouflage using Augmented Reality.
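A toy version of the camouflage step can be sketched with numpy. The marker region here is hard-coded rather than detected, and the frames are synthetic; the point is only to illustrate pasting the captured background over the tagged object:

```python
import numpy as np

# Minimal camouflage sketch: once the marker region is located in the live
# frame, paste the previously captured background patch over it.
# Marker detection itself (the ARToolKit's job) is faked with a fixed box.

H, W = 240, 320
background = np.zeros((H, W, 3), dtype=np.uint8)   # the empty room, captured first
frame = background.copy()
frame[100:140, 150:200] = (255, 0, 0)              # an object now sitting in the room

marker_box = (100, 140, 150, 200)                  # y0, y1, x0, x1 (assumed found)

def camouflage(frame, background, box):
    y0, y1, x0, x1 = box
    out = frame.copy()
    out[y0:y1, x0:x1] = background[y0:y1, x0:x1]   # object replaced by what's behind it
    return out

result = camouflage(frame, background, marker_box)
print(bool((result == background).all()))          # the object has vanished from view
```

Feeding `result` back through a projector aligned with the camera is the speculative second half of the idea described above.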
The alternative application is to use the overlay between the virtual and real time to induce or alleviate the condition of synesthesia. Synesthesia is the neurological condition in which one sensory element involuntarily crosses paths with another. In the case of induced synesthesia, the Augmented Reality tool can be used to create a sensory environment for the user that cross-pollinates sound with vision and possibly smell. Instead of the brain triggering automatic reactions in other senses, a computer is programmed to respond to sound as a series of colors or movements. Induced synesthesia can be applied on an individual basis to alleviate mental or physical strain. By employing Augmented Reality, the user creates his/her own environment without affecting others.
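What "a computer programmed to respond to sound as a series of colors" might look like in its simplest form is sketched below; the pitch-to-hue mapping is entirely invented for illustration, not a model of actual synesthetic perception:

```python
import colorsys

# Invented sound-to-color response: pitch picks the hue (low = red,
# high = violet), loudness sets the brightness.

def sound_to_color(pitch_hz, loudness):
    """pitch_hz in roughly 100-1000 Hz; loudness in 0.0-1.0."""
    hue = max(0.0, min(1.0, (pitch_hz - 100.0) / 900.0)) * 0.8
    value = max(0.0, min(1.0, loudness))
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return (int(255 * r), int(255 * g), int(255 * b))

print(sound_to_color(100.0, 1.0))   # a low, loud tone
```

In an installation, a live microphone feed would drive this function frame by frame, painting the color over the AR marker.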
Bibliography
Hisel, Dan. Camouflage: Or the Miscommunication of Space, http://temptationbyspace.blogspot.com/2009/01/camouflage-or-miscommunication-of.html, 2002.
Lee, Johnny Chung. Johnny Chung Lee: Projects, Carnegie Mellon University http://www.cs.cmu.edu/~johnny/projects/wii/, Redmond Washington, 2008.
The Synesthetic Experience, MIT, http://web.mit.edu/synesthesia/www/, Oakbog Studios, 1997.
1 “As of June 2008, Nintendo has sold nearly 30 million Wii game consoles. This significantly exceeds the number of Tablet PCs in use today according to even the most generous estimates of Tablet PC sales. This makes the Wii Remote one of the most common computer input devices in the world. It also happens to be one of the most sophisticated. It contains a 1024x768 infrared camera with built-in hardware blob tracking of up to 4 points at 100Hz. This significantly outperforms any PC "webcam" available today. It also contains a +/-3g 8-bit 3-axis accelerometer also operating at 100Hz and an expansion port for even more capability. These projects are an effort to explore and demonstrate applications that the millions of Wii Remotes in the world readily support.” Johnny Lee
13.2.09
Possibility of...
I came across the work of Enzo Mari, an Italian designer whose career spans some fifty-odd years. I find the two pictures below fascinating and far beyond the aesthetic of their time. The potential for playing with depth perception is fantastic and could really help structurally with my project. I have to work it in somehow.
http://www.designboom.com/weblog/cat/8/view/4127/enzo-mari-the-art-of-design-exhibition-at-the-gam-museum-turin.html
homage to fadat', lights, switches, plexiglass and steel, 1967
structure, anodized aluminium, brass and steel, 1964
1.2.09
For the Wiimote Augmented Reality to work, the viewer needs to wear infrared sensors, which allows one person at a time to see the projected image. This part of the project has been interesting because I am having to construct a circuit board for the LEDs to work. The Instructables website has been helpful for learning a lot about wiring a light system. The next step is to set up the relay between the Wiimote, the infrared lights, and the display.
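Sizing the current-limiting resistor for the LED circuit is a one-line Ohm's-law calculation, sketched below; the supply voltage, forward voltage and current are typical IR LED figures, not measurements from this build:

```python
# Ohm's law for a series resistor: R = (Vs - Vf) / If.
# The numbers below are typical IR LED values, not from the actual circuit.

def series_resistor(v_supply, v_forward, i_forward_a):
    """Resistance in ohms for supply volts, total LED forward volts, amps."""
    return (v_supply - v_forward) / i_forward_a

# e.g. two IR LEDs (~1.5 V forward drop each) in series off a 5 V supply at 100 mA:
r = series_resistor(5.0, 2 * 1.5, 0.100)
print(r)   # ohms
```

Instructables-style guides walk through the same arithmetic before picking the nearest standard resistor value.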
25.1.09
Success!
Last week I began testing the AR plug-in for Sketchup, and luckily the problem was in the way I was saving the jpg. Sketchup does not view jpgs with transparency, so after a few modifications I finally had a transparent image in the AR viewfinder. I staggered the split landscape and can now project the cliff into Augmented Reality through the webcam and onto my computer. So far the plug-in has a few limitations: as the pieces line up they need to be progressively enlarged to provide a faithful image, and it is limited in its three-dimensionality. As a feedback system this could be used with a more sophisticated setup to have an environment altered by certain amounts of data. Say, placed in a room, the projected environment could morph positively or negatively depending on sound level, the number of people, or their movement through space: the more people in the room, the more the landscape would flourish; if it were a dead zone, the landscape would decay.
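The flourish/decay feedback rule could be sketched like this; the rate and the way the factors combine are invented for illustration:

```python
# Toy feedback loop: the landscape's "health" (0.0 dead .. 1.0 flourishing)
# drifts up with occupancy and noise, and decays in a dead zone.
# The rate and the combined-activity factor are placeholder choices.

def step(health, people, sound_level, rate=0.1):
    activity = people + sound_level        # crude combined factor
    if activity > 0:
        return min(1.0, health + rate * activity)
    return max(0.0, health - rate)         # dead zone: the landscape decays

h = 0.5
for _ in range(10):                        # ten ticks of an empty, silent room
    h = step(h, people=0, sound_level=0)
print(h)
```

A real version would swap the hand-set factors for webcam and microphone readings, and drive the projected geometry from `h`.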
Setting the Table
After doing some further research into Augmented Reality I have been working with ARToolkit.
My plan is to separate the landscapes into components and then spatialize them using either the ARToolkit or the plugin for Sketchup. The plugin requires finagling, first in 3Dsm and then into Rhino.
.........................................
The Wii Remote arrived and I am in the process of constructing a sensor bar. Maplin has the IR LEDs, thankfully!
The ARToolKit is a collection of libraries, utilities, applications, documentation and sample code. The libraries provide the user with a means to capture images from video sources, process those images to optically track markers, composite computer-generated content with the real-world images, and display the result using OpenGL (Phillip Lamb, 2004). ARToolKit is designed to build on Windows, Linux, SGI Irix, and Macintosh OS X platforms. It seems that the ARToolkit runs on a .net programming architecture instead of plugging into another programme - like the AR plugin for Sketchup, a much more direct system. So far the parts for the toolkit are downloading and then it is a matter of connecting everything together.
19.1.09
The Tools
A few of the tools I am using to develop the landscapes in 3D are the ARToolkit, Augmented Reality and Johnny Lee's Wii hacks. It seems like every Wii Remote in the city of London has been bought, so I am waiting for one online.
The Description below is taken from Lee's website:
Head Tracking for Desktop VR Displays using the Wii Remote
Using the infrared camera in the Wii remote and a head mounted sensor bar (two IR LEDs), you can accurately track the location of your head and render view dependent images on the screen. This effectively transforms your display into a portal to a virtual environment. The display properly reacts to head and body movement as if it were a real window creating a realistic illusion of depth and space.
The program only needs to know your display size and the size of your sensor bar. The software is a custom C# DirectX program and is primarily provided as sample code for developers without support or additional documentation. You may need the most recent version of DirectX installed for this to work.
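The geometry behind that description can be roughed out as follows; the field of view and LED spacing are assumed figures, and the small-angle estimate is a simplification of what Lee's C# sample actually does:

```python
import math

# Back-of-envelope head tracking: the Wii Remote reports the two IR LEDs
# as pixel coordinates; their separation gives the head's distance, their
# midpoint gives its lateral position. FOV and bar width are assumptions.

CAM_W = 1024                         # Wii Remote IR camera width (px)
FOV_RAD = math.radians(45.0)         # assumed horizontal field of view
BAR_M = 0.20                         # assumed spacing between the two LEDs (m)

def head_position(p1, p2):
    """p1, p2: (x, y) pixel coords of the two IR dots; returns (lateral, distance) in m."""
    sep_px = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    angle_per_px = FOV_RAD / CAM_W
    distance = BAR_M / (sep_px * angle_per_px)     # small-angle estimate
    mid_x = (p1[0] + p2[0]) / 2 - CAM_W / 2        # offset from image centre
    lateral = distance * math.tan(mid_x * angle_per_px)
    return lateral, distance

print(head_position((480, 384), (544, 384)))
```

The renderer then skews its view frustum by `(lateral, distance)` each frame, which is what produces the window-like illusion of depth.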
This type of technology would really allow me to experiment with change over time within the landscape. I am curious as to how it could develop away from a flat screen - perhaps how one can occupy the same space as the 3-dimensional projection.
So far I have been testing the spatialization of the cliffs with the Augmented Reality plugin for Google Sketchup. The setup is so simple that I could immediately have a basic volume projected [see below]. However, when I spliced the cliff into different staggered layers, like a loaf of bread on a flat plane, the jpg images would not transfer with transparent backgrounds, prohibiting a compounded view of the cliff when projected in 3D. Since Sketchup is so basic I am planning to work with the ARToolkit to see if the results will be better and at a more sophisticated level.
The Reaction
After the Barbican exhibit [mentioned below] I began thinking about how a simple information relay could be applied to my project. Siting within the nuclear fallout idea [previous posts] was a tangent that began to take the ideas in a limited direction. As the three renders are landscapes within themselves, the issue of siting is addressed already. Furthermore, by the end of the project a smooth process needs to be apparent.
S suggested spatializing the landscapes in order to test out relationships within each [cliff, forest, swamp] and as a collective. The eventual change in each site is still hazy, but I am learning that not every step needs to be pre-determined. So, going back to Hemmer's exhibit, the idea of humans tuning the space/surrounding audio could easily be translated to the tuning of any surrounding landscape. Necessary elements would be: factors [people in the room, temperature, movement, shadow]; a warping device [a system that would read the factors and allocate a prescribed rule of change]; this change would then trigger the landscape to distort.
The previous post mentioned an interface called Reactable, a demo below. The factors that make up the interface are incredibly simple. A good model to follow when producing my own system.
Frequency and Volume at the Barbican_January 2009
This Saturday was the ending of Rafael Lozano-Hemmer's project Frequency and Volume. Not knowing exactly what to expect, it was pretty brilliant to have the installation explained via one's own exploration, meaning that if you truly wanted to understand the project you had to look for the clues indicating the mechanics of the installation. The premise was so simple: the human body moving through the long curve of the Barbican hall is a giant tuning device for several radio stations - news, popular music, maritime, aeronautical and even pirate stations. Volume was controlled by one's distance from a projector on the ground. The projector cast your shadow, and as you moved a webcam recorded your position in space. The position was then relayed back to the main radio control room, where equalizers, amplifiers and computer hard drives calculated the appropriate radio station and volume. This setup was repeated every ten steps; if a shadow moved even slightly in either direction the station would change.
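The relay could be caricatured in a few lines. The station list, the ten-step spacing and the distance-to-volume mapping below are a guess at the mechanics just described, not Lozano-Hemmer's actual system:

```python
# Toy version of the Frequency and Volume relay: position along the Curve
# picks the station (one per ten steps), distance from the floor projector
# sets the volume. All constants are illustrative.

STATIONS = ["news", "pop", "maritime", "aeronautical", "pirate"]

def tune(steps_along_curve, metres_from_projector, max_dist=5.0):
    station = STATIONS[(steps_along_curve // 10) % len(STATIONS)]
    volume = max(0.0, min(1.0, 1.0 - metres_from_projector / max_dist))
    return station, volume

print(tune(23, 1.0))   # 23 steps in, one metre from the projector
```

Even this caricature shows why a slight sideways shift of a shadow flips the station: the integer division changes value at every ten-step boundary.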
In Kevin Kelly's book Out of Control, hive mentality was explained through a similar experiment. In the nineties a relay system was devised so that a camera could read a room full of blue and red placards. Each person held a placard and the objective was to collectively land a virtual plane. After several tries, the group began moving instinctively instead of individually. Similar to Hemmer's project, one could easily imagine a room full of people tuning the DJ's set at a party. As we were exploring the exhibition at the Barbican, the docent gave us a mini tutorial and mentioned a similar system that Bjork used in her recent tour, called Reactable.
The above is video taken from the Curve space at the Barbican; judging from the previous exhibition spaces, the Curve couldn't be more appropriate, since the space one occupies to tune the radios has to be linear. The people at the exhibition seemed to be missing the point, more interested in the shadow projections than the radio stations. *Notice when the giant baby shadow is cast, the volume soars.
13.1.09
The Skeleton of the Skeleton: The site destroyed
The idea of the site as a nuclear fallout result came from the last three drawings [the final layer] and this mapplet from CarlosLabs, who devised a project called Ground Zero.
"Have you ever wondered what would happen if a nuclear bomb goes off in your city? With Google's Maps framework and a bit of Javascript, you can see the outcome.
And it does not look good."
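The concentric damage circles such a mapplet draws typically follow a cube-root scaling rule of thumb, sketched here as an assumption rather than anything taken from CarlosLabs' code:

```python
# Rule-of-thumb blast scaling: radius grows with the cube root of the yield,
# so an 8x larger bomb only doubles each damage circle. The constant k would
# differ per effect (blast, thermal, fallout); here it is left illustrative.

def blast_radius_km(yield_kt, k=1.0):
    """Approximate damage radius in km for a yield in kilotons."""
    return k * yield_kt ** (1.0 / 3.0)

print(blast_radius_km(8.0) / blast_radius_km(1.0))
```

Drawing the map overlay is then just a circle of that radius per effect, centred on the chosen point.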