In order to develop an augmented reality project I felt that I first had to investigate how AR works and feels myself. I therefore set about finding a source on the web with which I could test the capabilities of AR. The first I came across was the Smart Grid AR designed by Ecomagination (http://ge.ecomagination.com/smartgrid/#/augmented_reality). The following are a couple of videos of me looking at these AR projects designed by Ecomagination:
These AR visualisations required the user to hold the paper up vertically in front of the camera. After testing this method I feel that it is not the best way of dealing with AR, as it is inflexible and can be awkward. It also limits the user's view of the model: because the paper cannot be rotated a full 360 degrees on the most sensible axis, users only get roughly a 180 degree view from that angle. I therefore think an AR project should be something that can be placed on a table top or another flat surface, so that users get the easiest possible manipulation of the object.
As well as experimenting with these, I decided I should also look at the AR project I like most, the packaging AR shown in the previous post. Unfortunately I was unable to launch the software, so I could not test it. Even so, I feel that the way this visualisation uses AR is better, as it allows you to place the marker flat and so gives full 360 degree manipulation. The simulation is also clever in that it relates to real-world objects you may wish to package. It incorporates an opacity setting that lets users increase or decrease the opacity of the AR box, so they can work out whether their object will fit inside. This demonstrates a real practical use of AR and is the kind of simulation I would like to achieve in my project.
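To make the opacity idea concrete: the effect is essentially alpha blending of a rendered box over the live camera image. Below is a minimal Python sketch of that blending step using OpenCV; the flat grey block stands in for a real rendered box, and this is just my illustration of the principle, not how the USPS tool itself is implemented.

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)      # default webcam
    opacity = 0.5                  # 0.0 = box invisible, 1.0 = box solid

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # stand-in for the rendered AR box: a flat grey block in a fixed region
        roi = frame[100:380, 200:440]
        box = np.full_like(roi, 180)
        # alpha-blend the box over the live video; an opacity slider would
        # simply drive the 'opacity' value between 0 and 1
        frame[100:380, 200:440] = cv2.addWeighted(box, opacity, roi, 1.0 - opacity, 0)
        cv2.imshow('AR box', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()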
Monday, 25 January 2010
IDAT 204 - Further investigation into AR
For our brief we have been given two options:
- To investigate either (option 1) the ways in which the real world can be augmented with different forms of digital information, or (option 2) the ways in which an interactive sound environment can be designed, constructed and implemented.
I have therefore done some research into both options in order to decide which brief I would rather take on. First there is the AR brief, as previously explained. After my previous post I decided I needed to find out what AR has been used for previously, and so did some research. In general I found that the most popular uses of AR are in branding and commercial areas, where it is used to advertise products and engage potential customers. For example, AR has been used quite often in recent years to allow consumers to investigate a new model of car up close without the physical car needing to be there. This is demonstrated in the video below:
This is a very useful concept, as AR extends the information the user can obtain about the car's shape and look through its use of 3D and the ability to manipulate the model. AR also has quite a strong relationship with gaming, for example through AR trading cards and AR game components. It is often used to make games more interactive, as players must use and move the AR picture boards to play the game.
My favourite use of AR that I have found so far, though, is the USPS Priority Mail Simulator. This AR tool allows users to work out what size of box is required for any package they are trying to send: they can simply and easily compare the package with the size of the box, which makes this use of AR very functional. It is this functionality and purpose, along with the uniqueness of the concept, that makes me particularly interested in this tool. The video below is a demonstration:
As well as AR there is the interactive sound environment brief. This is a much less explored area than AR, but there are a number of interesting examples relating to this topic. The first that interested me was the Map1 sound installation by Garth Paine. This environment allows people to walk around a space in which the sound responds to them and their position within the space. Sensors capture values such as weight and direction, which are then converted into sound, and the location of the sound source is manipulated to match the user's location within the installation (http://www.activatedspace.com/Installations/Map1/map1.html). Similarly, the work of David Strang, who we will be working with on this project, is based around interactive sound installation. His work includes sonic representations such as Resonate Space Q121 and structural recording installations such as Building Systems (http://www.davidstrang.co.uk/buildingsystems.html).
Another version of interactive sound installation is similar to the work of Goudeseune and Kaczmarski. This work uses GPS to pin a sound to a particular location, where it can then be found using the right hardware and software. The sound can only be heard at the correct location, meaning the user can be given aural information very specific to where they are standing. Such experiments with sound also include sound mapping and sound walks, which use sound to represent the environment and the changes within it (http://zx81.isl.uiuc.edu/camilleg/icmc01.pdf).
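The core of such a system is just a distance test between the listener's GPS fix and the sound's pinned location. Here is a minimal Python sketch of that idea, with an invented 30 metre audible radius and made-up coordinates; the Goudeseune and Kaczmarski paper describes a far more sophisticated setup.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two GPS fixes, in metres."""
        r = 6371000.0  # mean Earth radius
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def volume_at(user, source, radius_m=30.0):
        """Full volume at the source, fading linearly to silence at radius_m."""
        d = haversine_m(user[0], user[1], source[0], source[1])
        return max(0.0, 1.0 - d / radius_m)

    # a sound pinned to a spot, with the listener a few metres away
    print(volume_at(user=(50.3705, -4.1427), source=(50.3706, -4.1425)))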
Both of these areas would be interesting to explore. The knowledge I currently have would be more applicable to creating an AR project; the sound installation is more exploratory, and with the help of David Strang it would be possible to expand my knowledge to suit that brief too. However, I feel I would enjoy working with augmented reality more, and so will follow this brief for my project.
Saturday, 23 January 2010
IDAT 204 - Augmented Reality
The final project for the IDAT204 module is based on Augmented Reality. Augmented Reality, or AR, refers to the combination of the real world with computer-generated objects or information. There are a number of different ways that AR can be achieved. The most common of these is via recent mobile phones, which use software and built-in cameras to show the user an augmented environment. This is shown in the image below:
There is also the use of Head Mounted Displays, which act like virtual reality glasses through which the user can see the augmentation. In either example, video of the current location is input via a camera into a computer, which then layers the augmentation on top of it. This composite image is then shown on the display.
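As a concrete sketch of that capture-detect-overlay loop, the Python snippet below uses the ArUco marker module from the opencv-contrib-python package. It only draws marker outlines rather than a 3D model, and it is a generic illustration of marker-based AR rather than the method behind any particular example above.

    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # 1. look for printed markers in the current video frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = detector.detectMarkers(gray)
        # 2. layer the augmentation onto the frame (here just outlines; a real
        #    project would draw a 3D model at each marker's estimated pose)
        if ids is not None:
            cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        # 3. show the composite image to the user
        cv2.imshow('augmented view', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()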
Augmented reality can be used for a variety of things. The following is a list of uses for Augmented Reality:
Entertainment - games, characters
E-Learning - books, pictures, 3D representations
Medical - pre-op planning, teaching
Consumer Design - clothes, hair, rooms
Aesthetic - objects, pictures
Information - location, height, time, memos, points of interest
Geographic - direction, location
The aim of any of these is to improve the user's perception and understanding of the world around them.
Thursday, 21 January 2010
Stonehouse - GPS Drawing
Following on from our initial investigations into the hertzian space of Stonehouse, this week we experimented with some GPS drawing similar to last year's work (see beckyvidat106.blogspot.com). This time, however, we were asked to use a set of rules to trace our drawing rather than planning the image beforehand. We used the centre of town rather than the campus for our location, as this provided more opportunity for experimentation. Our group initially began by trying to find a way of using light sources for our rules, so that we could relate our findings to our general project of light and communication. However we could not come up with any worthwhile rules that would have a good effect on the drawing, and so after some discussion we decided to use tall poles, such as those of street lights, along with surveillance cameras, as our way-points. We therefore produced the following GPS drawing with the results of these rules:
Although the rules we used did not link with our overall project, the concept of surveillance is quite interesting, as it makes you think about the spaces that are controlled and monitored, and how this links to the concept of a digitally enhanced area.
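For anyone wanting to reproduce the technique, a GPS drawing is really just the logged track plotted in order. Here is a toy Python sketch with matplotlib, using invented coordinates in place of our actual street-light and camera way-points:

    import matplotlib.pyplot as plt

    # (latitude, longitude) way-points in the order they were walked
    waypoints = [
        (50.3712, -4.1420),
        (50.3715, -4.1413),
        (50.3709, -4.1408),
        (50.3704, -4.1415),
    ]

    lats = [p[0] for p in waypoints]
    lons = [p[1] for p in waypoints]
    plt.plot(lons, lats, '-o')   # connect the points in walking order
    plt.axis('equal')            # preserve the drawing's proportions
    plt.title('GPS drawing from way-points')
    plt.show()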
Wednesday, 20 January 2010
Stonehouse - Hertzian Space
For the Transforming Stonehouse project we are now looking at how we can investigate the hertzian space in our area to inform our final project. This is an area we covered last year for IDAT106, and we are now reviving it for this project so that we can look at it in a more cultural and social sense. The use of Wifi and Bluetooth signals, among others, is ever increasing in the digital age, and this is likely to lead to a change in society and in the way we use the space around us. For a detailed overview of Hertzian Space go here:
The hertzian space and its uses also have an effect on us. We use elements of the hertzian space, such as wifi, mobile and Bluetooth frequencies, every day to reach out beyond our physical location. This extends our physical space and presence, making contact easier and freer. For example, when we turn on the Bluetooth on our phones we enlarge our personal space, in one sense, to the radius of the Bluetooth signal. Others can converse and connect with us from a distance, including people we may not know, as long as they are within the extended radius created by the hertzian signal.
To find out more about our location in relation to hertzian space, we went down to Union Street and logged the Bluetooth signals we received. The following is a visual mapping of this information with a list of the signals we found:
This map shows the hertzian space of the area in terms of the signals we found. However, this information is not particularly representative of the area in general, as the signals we found were mainly those of other people in our group who were also looking for Bluetooth signals. It is therefore not a good representation of the hertzian space that would usually exist there. It did occur to me during this task, however, that our location is quite well chosen, as it is right next to a bus stop, which increases the likelihood of receiving signals such as Bluetooth at our location.
One idea I had for implementing hertzian space in our project is to use the Bluetooth signals found for our projection. Like the long-exposure projection I suggested in a previous post, the words drawn on the floor could be the Bluetooth names in the area, rather than the word cloud concept. Alternatively, messages could be sent to any Bluetooth devices in the area asking, for example, for a single-word response to a question. These responses would then be drawn on the floor, demonstrating the feelings of the people in the area. In either case the hertzian space would provide the data for the projection, ensuring it was incorporated into the project.
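To show how little code the data-gathering side of this idea would need, here is a sketch of harvesting nearby device names, assuming the PyBluez library and a classic-Bluetooth adapter; the scan itself takes around eight seconds.

    import bluetooth  # PyBluez

    # scan for discoverable devices and resolve their friendly names
    devices = bluetooth.discover_devices(lookup_names=True)
    names = [name for addr, name in devices]

    # these strings would feed the projection / floor-drawing stage
    print(names)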
Sunday, 17 January 2010
IDAT 211 - Fluidic Forms
My final perception video is called Fluidic Forms. This is because it uses fluid to create forms that enable the viewer to discover an object before it is actually visible. The final video is shown below, but it is in the dome-corrected format and so will look slightly warped unless viewed using a fish-eye projector. The sound for this video is also in 5.1 surround sound; again, this will only be properly noticeable if played back on a 5.1 surround sound system.
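The warping comes from the fisheye mapping itself. As I understand it, a dome master is rendered with an angular (equidistant) fisheye, so a scene point at angle \(\theta\) from the view axis lands at an image radius proportional to \(\theta\):

\[
\frac{r}{R} = \frac{\theta}{\pi/2}, \qquad 0 \le \theta \le \frac{\pi}{2},
\]

where \(R\) is the radius of the circular image. An ordinary rectilinear camera instead maps \(r = f\tan\theta\), which is why straight lines in the dome-corrected frames look bent when viewed on a flat screen. (This is the standard dome-master convention, not necessarily the IVT's exact pipeline.)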
IDAT 211 - References of Resources
Although the references for the resources I have used to make the video are included in the video itself, I thought I should also put them here on the blog so that they are easy to access and view. The following is a list of all the websites I have obtained resources from, with details of what each provided:
http://e2-productions.com/repository/index.php - umbrella
http://www.diatonis.com/ - ambient background music
http://feeblemind.tuxfamily.org/dotclear/index.php/2004/12/24/4-blender-faire-pleuvoir---making-rain - rain particles
http://blenderartists.org/forum/showthread.php?t=137038 - rubber ducky
http://www.pacdv.com/sounds/index.html - rain sound, water running sound
http://www.partnersinrhyme.com/soundfx/watersounds.shtml - splash sound
http://www.blender.org/ - software + Python
In terms of software I also used Adobe After Effects for video composition and Adobe Premiere Pro for the 5.1 sound arrangement and final composition.
IDAT 211 - A change to the plot
Due to the length of the other scenes in my video, and the lack of success in creating an effective look for it, I have decided to remove the middle scene of my storyboard from the video. This includes the rising water section and the floating object section. I tried to create these scenes a number of times, but the results were not very effective. I did not like the aesthetic of the rising water scene, shown in the test renders below, due to the way it was created, and my attempts at the floating object scene did not work as I had hoped when using the dome camera to make the dome correction. The other scenes also took up more video time than first anticipated, so that they were more coherent and cohesive. I therefore removed the middle scene to ensure that the video works well and fits the specification.
This change should not have too much of an effect on the overall effectiveness of the video, and although it reduces the number of scenes intended to relate to perception, hopefully the two remaining scenes will get the concept across appropriately. The written storyboard for my video now looks like this:
- Title
- Raining scene
- Rubber duck scene
- Credits
Thursday, 14 January 2010
IDAT 211 - Test Renders
Below are some test renders of each of the scenes I have created for my video. These will be tested and changed if necessary before being rendered at 100% and put together into my final dome video. The videos shown below have been dome corrected, so they will look warped due to the fish-eye lens unless viewed through a fish-eye projector.
IDAT 211 - Concept overview
For this perception video project we were asked to create a video for the IVT that explores an aspect of perception and representation. The concept I have chosen for my video, as shown in my storyboarding below, relates to the idea that our perception is based on our experiences and knowledge, and that we apply these to what we see in order to find some kind of meaning. I therefore decided to explore this by giving the viewer of my video a chance to test how much information is needed to trigger their recognition of objects. It occurred to me that fluids such as water can mould around objects and give an idea of the shape underneath without the shape itself being visible. My concept therefore relies on a hidden-then-revealed narrative, where shapes are initially visible only in how the water reacts to them. This lets viewers see whether their perception can find meaning in the scene before the reveal. The object is then revealed, either confirming or contradicting their perception.
My video uses this as its main concept, focusing on water and how it can suggest a shape. The environments in which I set my scenes are also important for the viewer's perception. Although the main environment of each scene is basic, the events and objects in the scene link with each other to help the viewer recognise what they are seeing before the object itself is revealed. This works as follows:

- The invisible umbrella is located in a scene where it is raining. Umbrellas are closely linked with rain through their usage, and so the rain should narrow the viewer's perception to a limited set of possible objects that would give the scene meaning.
- The invisible rubber ducky is placed under an inflow like a tap. Rubber ducks are used in baths, and baths are filled from running taps, so again the type of water situation helps to narrow the viewer's perception and hopefully makes the object easier to recognise.