Once all the tracking was completed, the corner pin data was copied from mocha and pasted into the timeline of the Bishops Finger layer as keyframes. After some slight adjustment using the anchor point tool, the image was aligned and skewed to match the iPhone screen. The Keylight effect was then used to key out the green of the iPhone screen and let the Bishops Finger image show through from beneath my finger (a rough sketch of the same idea in code follows the clips below). Here is the clip before the process:
And after:
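For anyone curious what the corner pin and keying steps are actually doing, here is a minimal standalone sketch of the same idea in Python with OpenCV rather than After Effects. The file names, corner coordinates and green thresholds are made up for illustration; in the real test the corner data came straight from mocha and the keying was done with Keylight.

```python
import cv2
import numpy as np

# Hypothetical file names; the real project pastes mocha tracking data
# into After Effects rather than running a standalone script.
frame = cv2.imread("iphone_frame.png")       # video frame with the green screen on the phone
graphic = cv2.imread("bishops_finger.png")   # the Bishops Finger artwork

# Four tracked corner points of the phone screen for this frame
# (in the real workflow these come from mocha's corner pin export).
corners = np.float32([[612, 318], [948, 302], [960, 905], [598, 921]])  # TL, TR, BR, BL

h, w = graphic.shape[:2]
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

# "Corner pin" the graphic onto the tracked screen corners with a perspective warp.
M = cv2.getPerspectiveTransform(src, corners)
warped = cv2.warpPerspective(graphic, M, (frame.shape[1], frame.shape[0]))

# Crude green-screen key standing in for Keylight: mark saturated green pixels
# so the warped graphic shows through them, while my finger covering part of
# the screen stays on top because it isn't green.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
green = cv2.inRange(hsv, (40, 80, 80), (85, 255, 255))
green = cv2.morphologyEx(green, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))

mask = green.astype(bool)
composite = frame.copy()
composite[mask] = warped[mask]
cv2.imwrite("composite.png", composite)
```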
Evaluation: Overall the test achieved what I set out to do, but the visuals were a bit lacking. I think the tracking looks a little skewed, but I didn't want to invest hours in something that wouldn't be used; when it comes to the final outcome I think I will be able to correct the tracking so that it looks much more realistic. It also isn't too obvious at the YouTube video's resolution, but when played back in After Effects the extreme sharpness of the graphic on the screen didn't match the slight, natural blur of the video. This will be corrected with a simulated lens blur as well as a bit of colour correction to make it tie in visually with its surroundings and ambient light.
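As a rough illustration of that planned fix, the sketch below continues the hypothetical composite from earlier: it applies a light Gaussian blur inside the keyed screen area to simulate the lens softness, and nudges that region's average colour partway toward the surrounding footage. The file names, mask and blend amount are assumptions for the example, not the actual grade I will use in After Effects.

```python
import cv2
import numpy as np

# Hypothetical continuation of the earlier composite: soften the pasted-in
# graphic so its sharpness matches the footage, and shift its colour toward
# the ambient tone of the frame.
composite = cv2.imread("composite.png")
mask = cv2.imread("screen_mask.png", cv2.IMREAD_GRAYSCALE) > 0  # keyed screen area, saved earlier

# Simulated lens blur: a light Gaussian blur applied only inside the screen area.
blurred = cv2.GaussianBlur(composite, (0, 0), sigmaX=1.2)
matched = composite.copy()
matched[mask] = blurred[mask]

# Rough colour correction: move the screen region's mean colour 30% of the
# way toward the mean colour of the surrounding footage.
ambient = composite[~mask].mean(axis=0)
screen = matched[mask].astype(np.float32)
screen += 0.3 * (ambient - screen.mean(axis=0))
matched[mask] = np.clip(screen, 0, 255).astype(np.uint8)

cv2.imwrite("composite_graded.png", matched)
```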