Thursday, 12 December 2013

Pre Crit Visualisation

This is the version of my app-in-use video that I will be submitting for crit:

 

The things I will be looking for feedback on are realism, atmosphere and adherence to the brand.

Filming for Final Tracking

For my final visualisation I filmed some footage of my hands gesturing and using my iPhone. As shooting on location in a pub would have been a problem, given the amount of equipment used and the time taken, I decided to recreate a pub in my kitchen:
For this set-up I used a couple of props to surround the phone and give it a more pub-like feel. Lighting was a simple three-point set-up: the slightly cold fluorescent bulb in the kitchen acted as a pub/neon key light, a warm incandescent bulb shone against a shoot-through umbrella for a diffused, glowing fill light, and a small candle served as a rim light, providing a subtle moving shimmer on the ale inside the bottle.

The camera rig is an old tripod head taped to a mic boom stand, with the camera levelled on the end. I used EOS Utility to control the camera over USB, which prevented any slight camera movement when pressing buttons and also gave me a larger, better screen on which to compose the lighting and prop positions. Below is a still from the footage I achieved:
The results weren't particularly amazing, but they were the best I could achieve with the equipment and time available. The main improvement would have been to light the whole scene much more brightly and shoot at a lower ISO, as my footage is a little noisier than I would like.

Evaluation

I found this project very enjoyable overall. The timescale was very tight, but I think learning to do things in the smallest amount of time will stand me in good stead in the future. Being able to conceptualise and visualise an app without worrying about the finer details of whether it was actually possible was very freeing, and I found it a lot easier to generate ideas. The only part I didn't enjoy was making the preview for both iPad and iPhone, because it felt like we were doing the same thing twice, but slightly differently each time.

Overall I feel that my final products fulfil their intended purpose and communicate the intention behind them clearly, although my iPad visualisation wasn't as good as the iPhone one, partly because I was unable to film an iPad and partly because my enthusiasm waned as I repeated the process.

Post Crit Changes

After the crit I received advice and opinions on my animations so far. For my video showing the app in use, no improvements were suggested, so it remained the same. My promo, however, was viewed much more critically, and I acted on all of the suggested changes.

The first change I made was to include an App Store icon to make it obvious I was promoting an app; this was an oversight on my part and should have been included pre-crit. The stroke outlining the words that scroll through was also removed, so that the final company logo and tagline stood out more clearly. The final change was correcting a simple spelling mistake in the word 'Shepherd', something I didn't notice until submission day and which no one picked up during the crit.

Here is how my promo looked once these changes were applied:

Pre Crit Promo

This is what I will be submitting for crit:



The main things I will be looking for feedback on are overall feel, pace and soundtrack.

Monday, 2 December 2013

First Keying and Tracking Experiment

I decided to experiment with the idea of using live footage as the framing for the dynamic visualisation of my app, with the purpose of showing it working in its intended environment. To achieve this I used a combination of mocha and After Effects. First I filmed my hands holding and manipulating my iPhone with the screen completely green, to make it possible to key out later. That clip was then trimmed to a few seconds and imported into mocha for After Effects so that I could motion track it. Once in mocha, I used the Create X-Spline tool to circle the iPhone and define it as the area to be tracked. The program was then left to track forwards and then backwards through the clip to define the area it would work with. The Planar Surface tool was then used to set up the 'flat' surface that the image would be paired with, with minor corrections made using keyframes to keep it on track. Below is what I ended up with at this point:
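Under the hood, what a planar tracker is effectively solving for each frame is the perspective transform (a homography) that maps the flat surface onto the tracked quad. A rough numpy-only sketch of that corner-pin maths, with made-up coordinates rather than real tracking data (this is an illustration of the idea, not mocha's actual implementation):

```python
import numpy as np

def corner_pin_homography(src_corners, dst_corners):
    """Solve for the 3x3 homography mapping one quad onto another
    using the direct linear transform (DLT).
    src_corners / dst_corners: 4x2 arrays of (x, y) points."""
    A = []
    for (x, y), (u, v) in zip(src_corners, dst_corners):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Corners of a flat 640x960 screen graphic...
src = np.array([[0, 0], [640, 0], [640, 960], [0, 960]], float)
# ...pinned to illustrative tracked iPhone-screen corners in one frame.
dst = np.array([[102, 58], [418, 75], [402, 690], [88, 668]], float)
H = corner_pin_homography(src, dst)
```

In practice this happens per frame, and mocha exports the four tracked corner positions as Corner Pin keyframe data for After Effects.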
Once all the tracking was completed, the corner-point data was copied from mocha and pasted into the timeline of the Bishops Finger layer as keyframes. After slight adjustment using the anchor point tool, the image was aligned and skewed to match the iPhone screen. The Keylight tool was then used to remove the green from the iPhone screen and let the Bishops Finger image show through from beneath my finger. Here is the clip before the process:
And after:
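The core idea of the keying step can be sketched as a very crude chroma key: wherever the footage is green-dominant, show the underlying graphic instead. This is a numpy-only stand-in for the concept, not Keylight's actual algorithm, and the threshold value is illustrative:

```python
import numpy as np

def green_screen_composite(frame, graphic, g_margin=40):
    """Crude chroma key: pixels where the green channel exceeds both
    red and blue by more than g_margin are treated as key colour and
    replaced with the graphic. frame/graphic: HxWx3 uint8, same shape."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    mask = (g - np.maximum(r, b)) > g_margin
    out = frame.copy()
    out[mask] = graphic[mask]
    return out

# Tiny illustrative frame: one pure-green pixel, one grey one (a finger, say).
frame = np.zeros((2, 2, 3), np.uint8)
frame[0, 0] = (0, 255, 0)      # green screen area
frame[1, 1] = (200, 200, 200)  # foreground that should survive the key
graphic = np.zeros((2, 2, 3), np.uint8)
graphic[..., 0] = 255          # solid red stand-in for the screen graphic
keyed = green_screen_composite(frame, graphic)
```

A real keyer also handles soft edges, spill suppression and semi-transparent areas, which is why Keylight is used rather than a hard threshold like this.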

Evaluation: Overall the test achieved what I set out to do, but the visuals were a bit lacking. The tracking looks a little skewed, but I didn't want to invest hours in something that wouldn't be used; when it comes to the final outcome I think I will be able to correct the tracking so that it looks much more realistic. It also isn't too obvious at the resolution of the YouTube video, but when played back in After Effects the extreme sharpness of the graphic on the screen didn't match the slight, natural blur of the video. This will be corrected using a simulated lens blur, as well as a bit of colour correction, to tie the graphic in visually with its surroundings and ambient light.
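The two planned fixes, a simulated lens blur and a colour shift towards the ambient light, can be sketched like this. It is a numpy-only illustration with made-up parameter values; After Effects' lens blur and colour tools are far more sophisticated than a box blur and a channel tint:

```python
import numpy as np

def box_blur(img, radius=1):
    """Cheap box blur (average of shifted copies) as a stand-in
    for a simulated lens blur softening an over-sharp graphic."""
    k = 2 * radius + 1
    h, w = img.shape[:2]
    padded = np.pad(img.astype(float),
                    ((radius, radius), (radius, radius), (0, 0)), mode='edge')
    acc = np.zeros(img.shape, float)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + h, dx:dx + w]
    return (acc / k ** 2).astype(img.dtype)

def warm_tint(img, amount=0.1):
    """Push the graphic towards warm ambient light: boost red, cut blue."""
    out = img.astype(float)
    out[..., 0] *= 1 + amount
    out[..., 2] *= 1 - amount
    return np.clip(out, 0, 255).astype(np.uint8)

# Illustrative use on a flat grey 'screen graphic'.
graphic = np.full((8, 8, 3), 100, np.uint8)
matched = warm_tint(box_blur(graphic, radius=2), amount=0.1)
```

Blurring first and tinting second mirrors the compositing order described above: soften the graphic to match the footage, then grade it into the ambient light.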