Petri Alanko [Game Composer]

In the wake of the success of Remedy Entertainment’s latest release, “Quantum Break,” I was able to chat with the game’s BAFTA-nominated Finnish composer, Petri Alanko. The 46-year-old has already tasted mainstream game music success, having scored “Alan Wake” a few years ago, and is receiving even greater recognition this time around. You can read below what he had to say about his history, his studio gear, and his career as not only a composer but also a trance artist.

Hi Petri. Can you tell me about how you became involved with scoring video games? Was it a deliberate pursuit of yours from early on?

Composing for picture had always been my career of choice since I was in kindergarten. Back then, however, there was no gaming industry whatsoever, so my dream was just a castle in the clouds. It took decades for even the slightest semblance of a market for that kind of music-making to appear. Finland wasn’t exactly the most obvious country for the gaming industry in the early 80s, or even the 90s.

I started out by getting quite heavily involved with pop music after my high school years. For a long time I survived thanks to pop productions and songwriting, but eventually my heart wasn’t in it anymore and I decided to move on to what I would actually enjoy. When the word got around, I ended up receiving offers for the projects I really wanted, and the less attractive ones disappeared. Little by little I got involved with the IT business and content providers, and when those people left their original companies and formed their own, they kept calling me, thankfully.

A friend of mine, after a few career moves, ended up at Futuremark, which was partially owned by Remedy Entertainment, and during one of their mutual summer parties, my friend was asked whether he knew anyone capable of writing modern orchestral music. He gave them my number, and my phone rang the next week. I had dreamt about making music for games, but couldn’t think of any Finnish company I’d love to work with back then, except Remedy. We met soon after, I was shown some material, and we decided I’d write a short demo for a little cinematic which was later used in “Alan Wake”.

So, solid dreams, hard work, some luck, and a lot of contacts led me to my current career.

So scoring “Alan Wake” is what led to you being offered the job on “Quantum Break”?

Yes. I’d composed the soundtrack for “Alan Wake”, all the DLC packages and “Alan Wake’s American Nightmare”. Each project was incredibly cool and well-steered, and it seemed that I fit nicely with what Remedy wanted. They know I don’t take this nonchalantly. I put in my full time and effort, and I never count hours.

After “Alan Wake’s American Nightmare” was done, and the game itself was waiting for a release, Saku Lehtinen, the Creative Director at Remedy, contacted me with some music requirements for a new presentation. The new idea looked cool, and pretty soon afterwards I received the first draft of the cinematic they had prepared for the concept. It looked really good, even as a mere prototype of a draft. I was sold immediately.

Most companies are a bit scared about when to bring the composer in. In my opinion, the longer the gig, the better the music you’ll get by bringing in a composer early on, unless you just want fillers and pointless war-elephant drum patterns.

Since “Alan Wake” is considered a sort of prequel to “Quantum Break”, were you prepared for scoring your latest game by working on the first one?

Well, both are in some way dealing with matters happening inside the characters’ heads, but in the case of “Alan Wake”, he pretty much was all alone against everything, as opposed to “Quantum Break”, where events are causing ripples in everyone’s lives. Further differences are that Alan Wake saved his wife, whilst Jack Joyce in “Quantum Break” saves everyone. Alan Wake sacrificed himself knowingly, whilst Jack Joyce acts against his own will. Nevertheless, both stories were great dramas.

What I loved in “Alan Wake” was the delicate, quiet notes. At some point, someone at Remedy said, “It would be nice to put something like that behind some of the busier scenes”. I used that feedback a lot. Also, some of the raw material for my instrument sounds came from Remedy’s audio team, and what I got from them was all top-notch stuff.

There’s too much library music around nowadays; far too many developers and even TV producers are relying on pre-composed music to accompany their characters from the beginning to the end, and that has almost destroyed some of the magic of games and TV – and even the movies. Why rely on meaningless pieces of looped music underneath a scene? Because it fits there? Well, in that case, why not just create the whole TV show with ready-made clip art, instead of actors or graphic artists? The music has to have meaning, and you can reach it only via custom compositions. I have to add that I have nothing against music libraries, but they don’t really raise any quality thresholds in daily TV, games or movies. They’re okay for ads and commercials, maybe with some meaningless little apps as well, but if the developer has any self-esteem at all, I’d adamantly maintain that they should use bespoke music.

Were there any stereotypes you wanted to avoid in creating the “Quantum Break” score, so as to not make another collection of orchestral soundscapes and pulsing synth sounds? Many of today’s scores for AAA games describe themselves as “epic” or “grand”, but the result is that things sound the same from game to game.

Well, “epic” should be defined by the nature of the music, and if the scene’s been directed and written properly, it doesn’t need 120 drummers to back it up, or an ogre choir. Forcing something “epic” usually makes a scene uncomfortable and bleak. I very much tried to avoid the clichés of sci-fi soundtrack music, like the typical “action drumming” sounds everywhere. Instead, since Jack had his goals, the movement could be subtle, minimally underlining and emphasizing the game’s events. Of course, there had to be fast-paced music or otherwise it would have been a bit distracting, but there’s a surprising amount of what appears to be low-key score music that caters to the motions and motivations behind the action itself.

In “Quantum Break’s” case, employing an orchestra would have been distracting and made the perspective more grandiose, so we decided early on to try to survive without one, to keep Jack closer to the player holding the controller. We wanted to keep in mind that he was just a guy in a cab at the beginning. Since his change wasn’t voluntary, I decided to use much longer and subtler musical arcs than most so-called superhero games would typically use.

Much of the workflow used in making electronic music has become standard across other genres, such as using digital synths, plugins, MIDI controllers, etc. Has your past work in the trance genre served as an advantage for your work as a game composer?

I would say it works both ways. There are still a surprising number of people who make the assumption, “You’re doing club stuff, so you’re a DJ who can’t play any instruments”, and I actually don’t mind that at all – the truth about what I can do is already out there. However, since the build-ups in trance music are lengthy, some over ten minutes, there’s a lot of tension and release involved, which has been a good exercise for me.

Considering the tools, I’ve had pretty much the same setup for ages. Computer processing speeds and the power of plugins have changed, but the core of my workflow has remained almost the same since 1986, with only a few changes. For example, after “Quantum Break”, I’ve been using a Roli Seaboard as my main controller – it forces me to think differently. And now I’m seriously considering switching over to Cubase.

I’ve heard that you like modular synths. Can you contrast them with subtractive synths?

For me, music is a set of symbols and colors in my head, and the analogy applies to modules in my rack too. A basic oscillator is colored as a pale grey-blue circle, whereas a QMMG filter is an orange-brown pentagon with bright red stripes. Although I tend to construct mostly subtractive signal paths with my modular stuff, the nature of each patch makes them slightly more alive than a subtractive synth, as the signal path isn’t that clean. Usually, with simple subtractive synths, like a Roland SH-101, you can learn how to predict their sounds by looking at the settings on the knobs. The same applies eventually to each modular system, if you use it daily and know the pieces thoroughly.
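The subtractive path Alanko describes with the SH-101 (a harmonically rich oscillator whose overtones are carved away by a filter) can be sketched in a few lines of Python. This is an illustrative toy, not his actual patch; the sample rate, pitch and cutoff values are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter

SR = 44100  # sample rate in Hz

def saw_osc(freq, seconds):
    """Naive sawtooth oscillator -- the raw, harmonically rich source."""
    t = np.arange(int(SR * seconds)) / SR
    return 2.0 * (t * freq - np.floor(0.5 + t * freq))

def lowpass(signal, cutoff_hz, order=2):
    """The subtractive step: remove high harmonics with a low-pass filter."""
    b, a = butter(order, cutoff_hz / (SR / 2), btype="low")
    return lfilter(b, a, signal)

# Oscillator -> filter: the classic SH-101-style signal path.
raw = saw_osc(110.0, 1.0)    # A2 sawtooth, one second
dark = lowpass(raw, 800.0)   # close the filter to darken the tone
```

As Alanko notes, the result is predictable: a given cutoff setting always removes the same portion of the spectrum, which is exactly why you can learn to read a simple subtractive synth from its knob positions.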

Modular synths appeal to me as puzzle pieces that can be used to construct something new; a bit like a collage. But I’m not criticizing the ready-built synths either, nor the digital or the hybrid machines, as I own some from each camp.

Concerning your work on “Quantum Break”, you said once “…a huge amount of custom sound libraries had to be created even before the main composing could commence”. Can you talk a bit about this process of creating your library? 

There were no suitable sounds for my work in 2011, so I decided to start recording a lot of metal and glass to try to get something tonal from there. Typical glass recordings tend to merely be the sound of glass breaking or the rubbing of one’s finger round the glass rim, but the object itself can provide much more than that. The same applies to metal, piano frames, string instruments and other resonating real-world objects. Sound libraries are usually made according to certain usability aspects, but back when I started to create my own library, none of them offered anything new. Also, libraries of guitar feedback sounds were merely heavy metal pastiches and comedy takes, or just too short and clichéd.

Once I decided to begin making my own stuff, a lot of raw sounds ended up being used as convolutions, like cellophane paper, tinfoil, plastic and glass shards. Sometimes I adjusted the start and end points of the recordings, and sometimes I just let the echo reflections come and go whenever they wanted. Usually these recordings take two or three months, processing them can take three or four months, and creating instruments from the raw samples with Kontakt, Reaktor, Iris 2 or Kyma takes another one or two months – that was my initial cycle during “Quantum Break”. However, since the game needed a lot of material, the preparation phase took even longer.
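The convolution technique he mentions — using a recording such as tinfoil or glass shards as an impulse response so the source takes on its resonant character — can be sketched roughly as follows in Python. The source and “texture” here are synthetic stand-ins for his real recordings, and SciPy’s `fftconvolve` stands in for whichever convolution tool he actually used.

```python
import numpy as np
from scipy.signal import fftconvolve

SR = 44100  # sample rate in Hz

def convolve_with_texture(source, texture):
    """Use a recorded texture (e.g. crumpling tinfoil) as an impulse
    response: the source inherits the texture's resonant character."""
    wet = fftconvolve(source, texture, mode="full")
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet  # normalise to avoid clipping

# Stand-ins for real recordings: a short sine "note" and a decaying
# noise burst shaped roughly like a crinkled-foil transient.
t = np.arange(SR // 2) / SR
note = np.sin(2 * np.pi * 220.0 * t)
env = np.exp(-np.linspace(0.0, 8.0, SR // 10))
texture = np.random.default_rng(0).standard_normal(SR // 10) * env

wet = convolve_with_texture(note, texture)
```

Adjusting the start and end points of the impulse recording, as he describes, changes which reflections land in the convolution tail, which is why the same raw recording can yield very different instruments.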

Also, some commercial library providers can be pricey when it comes to sample usage in commercial products, so I’ve always tried not to rely on something 2,000+ other people are using daily. I’m not criticizing the rights payments; it’s very much okay to be paid for your hard work, but some developers might encounter a few surprises when certain applications are released in the US App Store. I’ve heard of Spectrasonics charging $10,000 for a sound that was used as a warning signal in a piece of media. Whether that’s true or not, I don’t know.

What were some of your most-used tools for your latest score, in terms of both hardware and software? Any particular reason for these choices?

Well, a few things come to mind: the Roland V-Synth XT, the Prophet-6 and the Korg Arp Odyssey; but I used other things as well. For instance, quite a few tracks were done with the Roland System-1m and my analog modular synth, interfaced via Expert Sleepers’ Lightpipe and S/PDIF interfaces. I also used their Silent Way plugin.

Sometimes a track started out analog, but was transformed into a digital production towards the end. But I’d say about 80% of the music was analog, created using Oberheims, old Rolands, Korgs, Sequential Circuits, Moogs, etc. I used to have an original ARP Odyssey and a dear old Prophet-5, but both were sold and replaced with remakes (the Korg ARP Odyssey and the Prophet-6), and boy, do they sound great!

I’m very keen on using the Apollo interfaces by Universal Audio. I have three of those, as well as a lot of their plugins.

I’ve had an Oberheim Matrix-12 and an Xpander for ages, but I turned to Arturia’s Matrix V plugin when it came out. The sound was pretty much identical to my hardware Matrix-12, though rather different from my Xpander. I had too much trouble maintaining them, so I decided a more predictable solution had to be implemented.

I used a lot of stuff by Zynaptiq, DMGAudio and ValhallaDSP, but if I had to name a few key plugins crucial for the production, they would be Kontakt 5, Reaktor 5 (I hate Reaktor 6; it’s too slow), Iris 2, Alchemy, PitchMap, and Morph.

I usually choose a plugin based on its abusability (laughs). Some plugins output interesting stuff when used against the recommendations. For instance, Celemony Melodyne was used heavily for the end of “Alan Wake”, and I even transposed an entire orchestral cue with it. Towards the end of “Quantum Break”, Melodyne’s spectral edit capabilities were used, and that was all over the soundtrack.

What equipment can one expect to find in your studio? Anything out of the ordinary that an average producer might never have seen?

I have the Macbeth M5, for starters. I used to have a summing mixer made of Neve 8086 modules that had been scavenged from an old 70s desk, but they kept on breaking down, so I sold it.

My usage of the Neuron VS and Nuke controller has gotten a bit out of hand. I’m using it through Plogue’s Bidule, and it’s interesting: you put just about anything in, and out comes this monstrous sound. Then there’s the Haken Audio Continuum and Roli Seaboard controllers.

How has the availability of digital tools helped your work, and how have you balanced that out with the presence of analog gear?

It’s not about the tools, but rather the experience, knowledge, and ideas. Put ten people with the same gear into the same room and let them make their own albums, and out come ten different albums.

Recall is a clear plus of digital, and it’s greatly appreciated in my profession. Tools need to facilitate, not burden or slow you down. In my case, the whole rig of digital gear is permanently interfaced with all the inputs and outputs of my analog instruments, and I know them all by heart, so when the production cycle kicks in, I can react quickly. Making everything accessible leaves more time to hone my ideas. However, I do think the so-called “producers” are losing some of their edge nowadays. What separates a regular idea from a great idea is the “campfire test”: whatever you create should still work around the campfire with a guitar or an acoustic piano. You can “produce” a mediocre idea into something clever, but production flavors cannot mask the foul taste in your mouth if your core musicality is poor. This is why I’ve always kept each melody away from my DAWs for at least two weeks before writing it down. It feels crucial to let things mature first.

How are you preparing for your concert with the Helsinki Philharmonic Orchestra and Choir? Is translating trance music into an orchestral performance going well?

Yes, it is! We sold out the first concert in just under fifteen minutes, and as I’m talking to you here, I just got a text on my phone stating, “The second one sold out”. We quickly arranged yet another concert for the same night, and luckily that was okay with the solo guests, the orchestra and the choir. It’s amazing, and it tells you so much about subcultures, their fan-bases, and the eagerness of the performers. We’re all very thrilled and excited, and the versions of the tracks are going to be so cool, as it’s going to be very different from your average classical concert. There’s a possibility we might need an occasional 808 kick here or there, but I’m not sure yet. It’ll be interesting and nerve-wracking, and I must get my hands back into piano condition, so I’m honing my Hanons and Czerny-Germers until August the 26th like a maniac (laughs).