Digital Storytelling 1
For this first preview presentation of the digital storytelling project, we had to come up with two short ideas, each around a minute long. We were given full creative control over the stories we wanted to present. I chose two different genres because I wanted to try out different techniques with video design.
For the first idea, I wanted to try sending a live feed from my iPhone camera into Resolume and layering effects over it, to create a sense of the actors' reality crossing over into the visual content on the screen. The concept was a person going to a party and taking videos on his phone, with the phone revealing that the party was actually haunted. The protagonist would take photos while the other actors posed, but the content on the screen would reveal that the partygoers posing for the photos were ghosts. The protagonist would then start realising that the photos he took did not match what he was seeing, but before he could do anything he would be eaten by the ghosts.
Pete gave us a very helpful lecture in our scheduled classes on how to key live video in Resolume with the chromakey plugin, and I used that to key out the actors' skin and make the ghosts' skin look translucent during the live camera section. However, the lighting was not optimal and the actors had different skin tones, so the keying was not very accurate and did not work out well.
I think that the main challenge for this piece was figuring out a way to send the video feed from the iPhone into Resolume. It had to be a wireless transfer, as the actor would be using the phone as a prop, and having a cable attached from the phone to the laptop running Resolume would be impractical. I started asking the 3rd year VDLP students if they had any idea how to send a video feed wirelessly, and they told me they had never tried it with an iPhone, but pointed me in the direction of Syphon, a Mac program that allows transfer of video feeds between different applications. I was using a Windows laptop, so I had to find an alternative that worked on Windows, which was another program called Spout. I tried to work out how to do it through forums and video tutorials, but there was not enough information for me to figure out how it worked.

Luckily, I saw a forum post about how someone had managed to send a feed from their iPhone over NDI, a protocol for sending realtime video across a network. On the Resolume website, I found out that Spout, Syphon and NDI were all supported by Resolume, so there was no need to download any external programs; however, the NDI app on the iPhone cost 20 pounds, and I was not keen to spend that money on a preview of a piece that was only a minute long. I managed to find a workaround by downloading a free app from the App Store called EpocCam, which converted the iPhone into a wireless webcam, and then, to send that feed into Resolume, I used the NDI Scan Converter application, which turned any open window on the desktop into an NDI capture region.
I also used some pre-filmed footage in the later part of the piece, with the actors wearing the same clothes as in the actual performance, to create the content for the scene where the photos do not match what the protagonist is seeing.
It was also a huge challenge to key out the background of the footage, as we did not have a proper green screen and just used a white wall in the gym as the background. I was still pretty happy with the result I got from the pre-filmed footage after adding some effects and playing around with the blending modes in After Effects.
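For context, keying out a plain white wall is closer to a luma key than a chroma key: instead of matching a hue, you fade out pixels above a brightness threshold. A minimal sketch of the idea in Python with NumPy (illustrative only — the function name, thresholds and soft edge here are my own simplification, not how the built-in keyers in Resolume or After Effects are implemented):

```python
import numpy as np

def luma_key(frame, threshold=0.85, softness=0.1):
    """Rough luma key: make near-white pixels transparent.

    frame: float RGB array, shape (H, W, 3), values in [0, 1].
    Returns an RGBA array where bright (white-wall) pixels fade out.
    Illustrative only -- real keyers also handle spill and edge detail.
    """
    # Rec. 709 luma from the RGB channels.
    luma = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
    # Alpha ramps from 1 (keep) down to 0 (transparent) across the soft band.
    alpha = np.clip((threshold + softness - luma) / softness, 0.0, 1.0)
    return np.dstack([frame, alpha])

# Example: a white background pixel next to a darker foreground pixel.
frame = np.array([[[1.0, 1.0, 1.0], [0.3, 0.2, 0.2]]])
keyed = luma_key(frame)
# keyed[0, 0, 3] -> 0.0 (wall removed), keyed[0, 1, 3] -> 1.0 (actor kept)
```

The soft band between keep and drop is what stops the keyed edge from looking jagged; uneven lighting like we had on the gym wall is exactly what pushes background pixels out of that band and makes the key break up.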
The second idea was actors performing to a pre-set animated piece and reacting to the movements made in the animation. The idea was a packet of crisps coming to life and escaping from a vending machine, with the actors playing bystanders who observe the packet of crisps come to life and react to it.
The challenge I faced in creating this idea was the huge amount of keyframing needed for the puppet animation I was attempting.
It was very tedious work to keyframe each movement of the animation by hand, so I used tools like the record function, which let me drag the pins of the puppet tool while the motions were recorded in real time. It still took a few tries to get it right with the record function. Another tool I used was the wiggle expression, to create the sense of the packet of crisps falling: it looked more realistic because the packet would wiggle as it fell from the top of the machine.
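Conceptually, wiggle layers smooth noise on top of a property's keyframed value, so the fall path and the wobble compose automatically. A rough Python sketch of that idea (my own simplified stand-in — After Effects generates fractal noise rather than a plain sine, and all the names and numbers here are hypothetical):

```python
import math

def keyframed_fall(t, duration=1.0, start_y=0.0, end_y=400.0):
    """Linear keyframe interpolation: the packet drops from start_y to end_y."""
    u = min(max(t / duration, 0.0), 1.0)
    return start_y + (end_y - start_y) * u

def wiggle(t, freq=3.0, amp=20.0):
    """Very rough stand-in for wiggle(freq, amp): a smooth oscillating offset.
    (A sine keeps the sketch simple; the real expression uses smooth noise.)"""
    return amp * math.sin(2.0 * math.pi * freq * t)

def position(t):
    # Final animated value = keyframed path + wiggle offset,
    # which is how the expression composes with keyframes.
    return keyframed_fall(t) + wiggle(t)
```

This is why the falling packet reads as physical: the keyframes carry it down the machine while the noise adds the small side-to-side flutter, without either having to be animated by hand.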
I think that this idea took more effort in the production stage of the process, as it was mainly a single video clip triggered in Resolume, and I did not have to do much during the actual performance. It was hard to animate the video to fit the actors, whom I did not have much access to, so their actions were pretty much dictated by the video. If I had more time I would probably have tweaked the video delivery so that the piece had more interactivity between the video content and the actors, instead of one side being totally dictated by the other.
I was happy with the final result for both pieces, as I felt I achieved what I set out to learn and practise with these ideas: incorporating a live video feed into a piece, and learning and using basic animation with the puppet tool.