Madefire Motion Comics


Madefire is an application that has been optimised for the iPad and iPhone, emulating the traditional graphic novel format with the addition of motion, interactivity and audio. What’s more, Madefire also offers free development tools that can be used by independent artists to create and publish their own comic books on the platform. With Moving Brands CEO Ben Wolstenholme and comic book legends Dave Gibbons and Liam Sharp involved in the creation of the app, we jumped at the chance to create the audio for the first three in-house story releases, namely “Treatment”, “Captain Stone Is Missing” and “Mono”. Not only were we creating the audio for the narratives, we were also constructing an SFX library for use within the development tool. It was therefore important that the audio enhanced each story while being effective for future titles.

A large part of the Madefire audio suite consisted of ambient beds. These would serve as a backdrop to each scene and aim to further immerse the reader in the story. We recognised early on it was important to enhance the reading experience rather than detract from it. We wanted to instil mood and atmosphere without interfering with the flow of events. One thing that helped us to achieve this was to ensure that the ambiences were dynamically flat and did not involve large spikes of distracting events.

In the early stages, each story was already scripted out but the art and text were still in production. This was an enjoyable challenge as we found ourselves ‘auralising’ each scene with reference to the written script, as opposed to a visual reference. What we had created was an audio storyboard that helped us allocate our requirements across several recording sessions and plan ahead accordingly.

An initial concern was that guns and battlefields featured quite heavily throughout the stories, yet a full-blown firearm recording session was not within our remit. After some scouting around, the Madefire guns and WWII battlefield ambience came courtesy of Surrey Sporting Club, who kindly let us record them for a day with our Sennheiser MKH 416 mics and a Roland R-26 recorder. With this setup we were easily mobile and could use independent XLR gain to adapt to different mic positions on the fly. The day-long session involved shadowing members as they cycled through a series of shooting positions at the edge of a wood, pointing towards a central common. Due to the nature of the schedule, guns were firing at all times at various distances, which allowed us not only to target near-field shooters but also to capture gunfire from the far field. Not all club members were aware of our intentions, however, and assumed we were monitoring decibel levels for the local council! After a few hasty explanations and manly handshakes, we were glad to hear back at the studio that we had captured not only the initial attack of the guns but also a lovely downrange peel-out. The true character of the gunfire only revealed itself after a certain amount of compression; we were able to isolate, design and process the close-range guns for ‘first person’ weapons and edit up the distant gunfire for what was predominantly used as the WWII ambience in “Mono”.

An additional and slightly more troublesome element of the Madefire weaponry was bullet ricochets. With a full-blown firearm recording session still off the table, we had to tackle this from another angle. Surprisingly, it turned out all we needed was a slingshot, a bag of 1p coins and a brick. And the 1p coins were no accident! We meticulously narrowed them down as the best British coin for producing that classic peeeoooww sound when glanced off a brick. The resulting recording was layered with a high-pitched, synth-based laser effect to enhance the attack and add a little more pace.

We took the age-old approach to creating blood and gore SFX: a trip down to the local grocer’s. By smashing, squishing and stabbing your 5 a day, we were able to produce a mixed salad of sounds for Madefire’s more unsavoury scenes. Leeks and celery were excellent for breaking bones, lettuce made for a convincing ripping of flesh, and tomatoes and watermelons provided the much sought-after headshot.

“Mono” portrays a fair amount of solitude and nostalgia in the opening scenes of Edward Heston’s house, with a grandfather clock as the perfect symbolic representation of time past. Pendulum of Mayfair, a London based antique clock dealer, very kindly took time out of their day to let us into their showrooms and record some of the grandfather clocks on display. The owners were very accommodating and gave us permission to get in and around the clocks to capture the character of each model. We were too busy being careful to find out more about what we were recording but they were definitely very old and sounded great. Each clock had its own brand of ‘tick’, so we were able to choose from quite a broad range of sounds to achieve the mood we were looking for.

Comic books being what they are, we had a major requirement for impacts. We approached this on several levels, most of which required hitting stuff with a sledgehammer. As luck would have it, our sound design assistant Dan’s new garden was laden with sonic treats. We spent the day laying into what was essentially a big pile of scrap wood and rubble, and when we added a few cheap glass picture frames we had ourselves an outdoor foley studio with no holds barred. The area was quiet and suburban, and with careful close miking we were able to capture clean and detailed recordings. Our second impact outing involved a visit to the polite and accommodating people at a Surrey-based scrapyard. This time we were unleashing our fury upon a broken-down van, and we definitely felt the part when we turned up with a bag full of hammers, crowbars and cricket bats. We miked up inside and outside the vehicle to try to capture all angles, but much like the gun recordings, the true character of the sounds was only revealed after a certain amount of compression.
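That “character through compression” effect is easy to demonstrate. Below is a minimal feed-forward compressor sketch in pure Python (no audio libraries); the threshold, ratio and time constants are purely illustrative, not the settings we actually used. Heavy gain reduction pulls the quiet tail of an impact or gunshot up relative to the initial transient, which is why the detail only appears after compression.

```python
import math

def compress(signal, threshold_db=-24.0, ratio=4.0, attack_ms=5.0,
             release_ms=120.0, sr=48000):
    """Minimal feed-forward compressor: a static gain curve plus a
    gain envelope smoothed with separate attack and release times."""
    a_att = math.exp(-1.0 / (sr * attack_ms / 1000.0))
    a_rel = math.exp(-1.0 / (sr * release_ms / 1000.0))
    g = 0.0  # smoothed gain reduction in dB (always <= 0)
    out = []
    for x in signal:
        level_db = 20 * math.log10(abs(x) + 1e-10)
        over = max(level_db - threshold_db, 0.0)
        target = -over * (1.0 - 1.0 / ratio)  # desired gain reduction
        coeff = a_att if target < g else a_rel
        g = coeff * g + (1 - coeff) * target
        out.append(x * 10 ** (g / 20.0))
    return out
```

Run on a loud transient followed by a quiet tail, the transient is squashed while the tail passes almost untouched, shrinking the crest factor and lifting the low-level detail in the mix.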

One of the scenes in the Treatment series of stories is a Day of the Dead carnival. Listening back through previous recordings on our Zoom H2, we found a recording of a food market from a trip to the 2011 Barcelona OFFF festival. Despite having had no real purpose for the recording at the time, the general walla was aptly blurred yet featured the correct lingual tone for a Mexican setting. When combined with various percussion recorded in the studio and a busker’s steel guitar from the same Barcelona trip, the ambience needed for this scene came to life. This exercise really inspired us to stay proactive with recording, to the point where the Tascam iM2 iPhone mic became a permanent fixture on our utility belts.

A similar example is the graveyard ambience in “Treatment”. The church bells were recorded at Ilfracombe, Devon, and required a fair amount of seagull removal, but proved perfect for the job. The crows were captured outside our studio window. These recordings were combined with the pitched-up jangling of keys against our speaker stands, for an effective flagpole-tapping sound that could easily be associated with desolate and isolated buildings. To glue these elements together we needed a more consistent base of exterior ambience, so we subtly introduced trees, distant walla and wind synthesised with NI Reaktor. The wind was a product of four or five noise generators that were filtered to different bands and given individual movement with an LFO module. A certain amount of randomisation was then added to the higher-frequency elements to give more of a natural flutter.
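For readers curious about the wind recipe, the same idea can be sketched outside Reaktor: several bands of filtered noise, each with its own slow amplitude LFO for the gusting. The example below is pure Python, and every parameter (band centres, Q, LFO rates) is an illustrative guess rather than our actual patch:

```python
import math, random

def wind_layer(n, sr=48000, center_hz=400.0, q=2.0, lfo_hz=0.15, seed=1):
    """One band of synthesized wind: white noise through a biquad
    band-pass, amplitude slowly modulated by a sine LFO."""
    rng = random.Random(seed)
    # Biquad band-pass coefficients (constant 0 dB peak-gain form).
    w0 = 2 * math.pi * center_hz / sr
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for i in range(n):
        x = rng.uniform(-1.0, 1.0)
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x2, x1 = x1, x
        y2, y1 = y1, y
        # Slow LFO sweeps the band's level between quiet and full.
        lfo = 0.5 + 0.5 * math.sin(2 * math.pi * lfo_hz * i / sr)
        out.append(y * lfo)
    return out

def wind_bed(n, sr=48000):
    """Sum several bands with different centres and (unrelated) LFO
    rates, so the gusts of each band drift in and out independently."""
    bands = [(150, 0.07), (400, 0.11), (900, 0.19), (2500, 0.31)]
    layers = [wind_layer(n, sr, c, lfo_hz=r, seed=k)
              for k, (c, r) in enumerate(bands)]
    return [sum(vals) / len(bands) for vals in zip(*layers)]
```

Deliberately unrelated LFO rates keep the bands from gusting in sync, which is what stops the result sounding like one filter sweep; faster, shallower modulation on the top band stands in for the high-frequency flutter.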

Another “Treatment”-themed ambience (one of our personal favourites) is the Future City. This was very much inspired by the cityscapes of Blade Runner and The Fifth Element, an influence that is hopefully evident to the listener. Our sound design assistants, the dynamic duo Dan and Dougie, did a great job of capturing the sounds of central London. One recording in particular really conveyed the ambience of the city without featuring too much in the way of close traffic or recognisable sounds. This, alongside the rumble of a passing train, served as the foundation that we built upon. The majority of the traffic flybys were created using noise generators and heavily LFO’d sine/saw waves from the Roland SH-101 and LennarDigital’s Sylenth1, which were then treated with a Doppler effect. With our influences in mind, we also created a foghorn by heavily processing a tuba sample, sirens from modulated sine waves and an unintelligible female announcement by EQ’ing, pitching and applying reverb to a recording of Dougie’s voice. We don’t know any girls 😉
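The Doppler treatment behind a flyby boils down to the classic moving-source formula f′ = f·c/(c − v), where v is the source’s velocity component toward the listener, plus distance attenuation. This hypothetical pure-Python sketch (all parameters illustrative, and a plain sine standing in for the synth tone) sweeps an “engine” pitch through a pass-by:

```python
import math

def doppler_flyby(freq=220.0, speed=30.0, pass_dist=5.0, dur=2.0,
                  sr=48000, c=343.0):
    """Give a sine 'engine' tone a fly-by Doppler shift by advancing
    its phase at the instantaneous shifted frequency."""
    out = []
    phase = 0.0
    for i in range(int(dur * sr)):
        t = i / sr - dur / 2              # source passes listener at t=0
        x = speed * t                      # source position along its path
        dist = math.sqrt(x * x + pass_dist * pass_dist)
        # Velocity component of the source toward the listener.
        v_radial = -(speed * x) / max(dist, 1e-9)
        shifted = freq * c / (c - v_radial)  # classic Doppler formula
        phase += 2 * math.pi * shifted / sr
        amp = 1.0 / (1.0 + dist)             # simple distance attenuation
        out.append(amp * math.sin(phase))
    return out
```

The tone sits sharp while the source approaches, sweeps down through the true pitch at the closest point, and settles flat as it recedes, with the level peaking at the pass.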

We also wanted to create a series of more abstract ambiences. These were intended to assist the more bizarre scenes of “Captain Stone”, but given the non-literal nature of these sounds, we anticipated that they would be of particular use within the development tool SFX library. These abstract ambiences were mostly dark in nature and were created once again using NI Reaktor – or more specifically, using the ‘Metaphysical Function’ ensemble. By using the randomisation function and a bit of tweaking to further develop the sounds, we found we could create no end of ambient layers, which could be combined, reversed and looped for a series of more non-diegetic soundscapes. This ensemble was particularly appropriate as it combines synthesised and sampled elements to create a more natural timbre overall.

Implementation of the audio was carried out using Madefire’s development tool. The software itself was hosted via an online portal, so we were able to populate projects in real time and let the Madefire team in San Francisco review our progress as it happened (allowing for the eight-hour time difference!). With the project aimed at portable devices, MP3 was the primary format used, so file sizes and upload times were not a problem. The tool is timeline-based in a non-traditional sense, in that it focuses on a series of marked events for images, text and audio. The events are triggered by the reader as they tap their way through the story, so we were able to fade audio elements in or out according to the reader’s progress. The only other available parameters (currently) are volume and loop count, and so any reverb, EQ etc. had to be baked in. This resulted in some trial and error, but considering the audio had been created ‘blind’, the majority of it fitted surprisingly well. We aimed for a target of about 30 seconds per ambience, as this seemed a good balance between variation and precious storage space.
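As a rough mental model of that tap-driven timeline, the sketch below is our own hypothetical rendering, not Madefire’s actual API: each tap fires the audio events marked on a page, and only volume and loop count are exposed per event (a loop count of 0 here stands in for a looping ambient bed).

```python
from dataclasses import dataclass, field

@dataclass
class AudioEvent:
    """One audio cue on the tap-driven timeline. Volume and loop count
    are the only runtime parameters; any reverb or EQ is baked in."""
    asset: str            # e.g. an MP3 in the story bundle
    action: str           # "start" or "stop"
    volume: float = 1.0
    loop_count: int = 1   # 0 = loop forever, like an ambient bed

@dataclass
class Page:
    events: list = field(default_factory=list)

class Story:
    """Tap-driven playback model: each tap fires that page's events."""
    def __init__(self, pages):
        self.pages = pages
        self.cursor = -1
        self.playing = {}  # asset -> volume of cues currently sounding

    def tap(self):
        self.cursor += 1
        for ev in self.pages[self.cursor].events:
            if ev.action == "start":
                self.playing[ev.asset] = ev.volume
            else:
                self.playing.pop(ev.asset, None)
        return dict(self.playing)
```

For example, an ambient bed started on page one keeps sounding underneath a one-shot on page two until a later page carries its stop event, which is exactly how the fades were sequenced against the reader’s progress.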

This was one of our most engaging projects to date, and we learned a great deal about organisation and resourcefulness along the way. I hope that by writing this we can pass some of that on and inspire others to find creative solutions, whatever the scale of the project.

