By Winifred Phillips | Contact | Follow

Hello there!  I’m video game composer Winifred Phillips.  At this year’s Game Developers Conference in San Francisco, I was pleased to give a presentation entitled How Music Enhances Virtual Presence (I’ve included the official description of my talk at the end of this article). The talk I delivered at GDC gave me the opportunity to pull a lot of ideas about virtual reality together and present a concentrated exploration of how music can increase a sensation of presence for VR gamers.  It occurred to me that such a discussion might be interesting to share in this forum as well. So, with that in mind, I’m excited to begin a four-part article series based on my GDC 2019 presentation!

Over the past couple of years, a lot of development studios have hired me to create music for virtual reality games. It’s fascinating work!  In my GDC talk, I discussed virtual presence in connection with seven of the virtual reality games and experiences that I’ve scored, which have either released within the past year or will be released within the coming months.  These include the PSVR version of the Scraper: First Strike VR shooter that was released just last week by Labrodex Inc, and numerous other VR titles including  Audioshield (Audiosurf LLC), Bebylon Battle Royale (Kite & Lightning), Fail Factory (Armature Studio), The Haunted Graveyard (Holospark), Life Hutch VR (Next Stop Willoughby), and Shattered State (Supermassive Games).

Gaming in virtual reality involves an engulfing 360-degree spherical environment and a sense of physical agency never before possible in traditional gaming. Unlike other forms of video gaming, VR gives players the best chance to feel bodily present inside the virtual world. This sense of Virtual Presence offers an awesome opportunity for expert development teams to create powerful gaming experiences.

During this four-part article series, we’ll explore what Virtual Presence is. We’ll take a look at some creative composition strategies we can use to induce Virtual Presence.  We’ll discuss some of the possible drawbacks of Virtual Presence, and what we video game composers can do to support players and keep the fun going during their adventures in VR. I hope you’ll find these creative strategies to be useful in your own VR projects. Keep in mind that, while all these ideas are primarily designed to make gaming more fun on the popular VR platforms, they are essentially meant to get players more viscerally involved in any kind of gameplay, which means they can be applied to traditional game projects too.

Let’s start with the basics.

What is Virtual Presence?

According to Professor Thomas B. Sheridan of the Massachusetts Institute of Technology, Virtual Presence is defined as the sensation of being “present in the environment generated by the computer.” In order to feel virtually present, we have to accept the in-game location as a tangible place. According to Professor Sheridan, there are lots of methods that VR designers can use to accentuate the reality of their virtual environments. They can shower players with sensory stimuli. They can encourage players to move about, changing their relative viewpoint and altering their binaural soundscape. They can place objects in the environment that can be manipulated and changed by players, leading to a great sense of material engagement.

Here’s an example – one of my most recent projects, The Haunted Graveyard, from developers Holospark. As a Halloween-inspired VR experience designed primarily to be accessible to a wide audience in top VR arcades, The Haunted Graveyard focuses on an involving, atmospheric landscape for players to explore. Virtual Presence is extremely important, so the game provides the necessary sensory stimuli, player movement and usable objects that Professor Sheridan described. However, the game also includes lots of music to keep players emotionally stimulated. Here’s a ten-minute gameplay video of The Haunted Graveyard:

The Haunted Graveyard - music composed by Winifred Phillips - YouTube

In order for true Virtual Presence to be attained, players have to let go of their natural incredulity and get emotionally involved. They have to forget about the fact that they’re in VR, and just live the adventure.

As you saw from the gameplay video, the music in The Haunted Graveyard is an important part of the game’s design.

Music can be inspiring – it can make you feel more committed to what you’re doing, and more invested in the world around you. The question is – how does music enable Virtual Presence? And what tools can video game music composers use to make that happen?

In this four-part article series, we’re going to explore three ways wherein music enables Virtual Presence:

  • Music empowers Flow.
  • Music promotes psychological attachment.
  • Music provides an avenue for mood attenuation.

In this article, we’ll start with the first (and most famous) mechanism on our list:

The Theory of Flow

First introduced by psychologist Mihaly Csikszentmihalyi in his book Flow: The Psychology of Optimal Experience, the Theory of Flow tries to explain what it means when we get so absorbed by some interesting task that we forget everything else – all distractions fall away and our minds sharpen to a laser-point of focus. Sometimes it’s called being “in the zone.” The feeling is an incredible rush. Moreover, it’s a useful tool when trying to instill the sensation of Virtual Presence in our players.

In a study published in the journal Computers in Human Behavior, researchers from Indiana University found that the Flow phenomenon, when experienced in a virtual world, can enable and enhance the sensation of Virtual Presence. In other words, when we’re feeling “in the zone” during VR, we also tend to feel more like we’re actually inside an alternate world.

With that in mind, how can we game music composers use this phenomenon to our advantage? Well, science has shown us that music has a direct correlation with the Flow state. In an experiment conducted at Brunel University, researchers studied athletes who listened to music during their exercise routines. The researchers found that music had a strong beneficial effect on the athletes’ flow state, helping them to get “into the zone.”

Here’s an example – one of my most recent projects was the main theme for Audioshield – a music rhythm game for VR. Using a shield in each hand, players block glowing orbs that fly towards them in synch with the music. The game analyzes the unique qualities of the music to determine the type and frequency of orbs coming towards the player.

Because Audioshield analyzes music and then constructs gameplay around it, composing music for Audioshield was a real trial-and-error process. All during music production, the game designer and I both repeatedly loaded the music into the game and played through it, looking for any orb activity that felt jarring, or choppy, or rough – anything that would interrupt Flow. I’d go back into the music studio, make changes – then we’d play through the track in Audioshield again. Were the orbs showing up in ways that were too surprising? Or not surprising enough?

We were looking for a sweet spot, when Flow would grab hold, when the player would be firmly in the zone. And when that happened, it felt like everything came together. The gameplay was fun, the music was driving, and you really felt like you were there. Everything seemed tangible and real. Flow and Virtual Presence kicked in together. Here’s a video showing how gameplay worked during the main theme of Audioshield:

Audioshield Theme - composed by Winifred Phillips - YouTube
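For readers curious about the technical side, here’s a purely illustrative sketch (in Python) of how a rhythm game could, in principle, turn a music track into timed gameplay events.  To be clear, this is not Audioshield’s actual analysis code: it assumes the open-source librosa audio library and a hypothetical file name, and it simply maps detected note onsets and local loudness to orb spawns.

    # Illustrative sketch only -- not Audioshield's actual analysis.
    # Detects note onsets in a track and uses local loudness to decide
    # what kind of "orb" a rhythm game might spawn at each moment.
    import librosa
    import numpy as np

    def build_orb_events(audio_path):
        y, sr = librosa.load(audio_path, sr=None)                       # samples + sample rate
        onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")   # note attacks, in seconds
        rms = librosa.feature.rms(y=y)[0]                               # frame-by-frame loudness
        rms_times = librosa.times_like(rms, sr=sr)

        events = []
        for t in onsets:
            loudness = rms[np.argmin(np.abs(rms_times - t))]            # loudness nearest this onset
            if loudness > np.percentile(rms, 85):                       # loudest hits: both shields
                orb = "both"
            else:                                                       # otherwise alternate sides
                orb = "left" if len(events) % 2 == 0 else "right"
            events.append({"time": round(float(t), 3), "orb": orb})
        return events

    if __name__ == "__main__":
        for event in build_orb_events("main_theme.wav")[:10]:           # "main_theme.wav" is hypothetical
            print(event)

In a real title the analysis would be far more sophisticated (and tuned through exactly the kind of trial-and-error playtesting described above), but even this toy version shows why the character of the music directly shapes the moment-to-moment gameplay.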

So we’ve now discussed the relationship between Flow Theory and the concept of Virtual Presence.  In our next article, we’ll examine the second mechanism by which music enables Virtual Presence – psychological attachment.  Thanks for reading!

How Music Enhances Virtual Presence

(Game Developers Conference Session Description)

Virtual Presence is defined as a state in which gamers fully accept the virtual world around them and their existence within it. This talk, “How Music Enhances Virtual Presence,” will explore how highly effective game music can enhance the sensation of Virtual Presence in VR gaming.

The talk will begin with an exploration of both the Flow Theory of Mihaly Csikszentmihalyi and the research of Dr. Paul Cairns on psychological engagement in video gaming. By understanding how the mental activity of players interacts with the way a game is designed, composers can create music intended to induce psychological states conducive to the formation of Virtual Presence.

The talk will include a discussion of techniques aimed at drawing attention to mission objectives,..


By Winifred Phillips | Contact | Follow

Hey everyone!  I’m video game music composer Winifred Phillips.  This past April, I gave a lecture on video game music composition techniques at the invitation of The Library of Congress in Washington DC. It was the first speech on game music composition given at The Library of Congress, and I was tremendously honored to be able to represent the field of video game music!  My presentation was entitled “The Interface Between Music Composition and Game Design,” and it drew a full house in the Whittall Pavilion of the Thomas Jefferson Building at the Library of Congress. In a previous article, I posted a partial transcript of the Q&A portion from my Library of Congress session, including some of the best questions from the Q&A.  Since then, The Library of Congress has included a video of my entire presentation as a part of their permanent archival collection for future generations.  I’m very pleased to be able to share the entire video with you!

The video of my Library of Congress lecture is embedded below.  The video includes my discussion of some of the top interactive music composition techniques executed by modern video game composers.  Along the way, I explored the history of interactive music, and I included some of the new challenges and considerations that composers face when composing music for Virtual Reality projects.  In addition, my lecture methodically explores the most effective processes for integrating both original and licensed music into video games.

The lecture video embedded below is now available for free in the Library of Congress’ permanent collection.  I hope you enjoy it!

The Interface Between Music Composition and Game Design - YouTube

About the Library of Congress

The Library of Congress is the largest and most popular library in the world, with millions of great books, recordings, photographs, newspapers, maps and manuscripts in its collections. The Library of Congress is the main research arm of the U.S. Congress and the home of the U.S. Copyright Office. As the world’s preeminent reservoir of knowledge, The Library of Congress is the steward of millions of recordings dating from the earliest Edison films to the present. In addition to the awesome size of its collection, The Library sponsors events, lectures and concerts that are free and open to the public. The Library of Congress hosts public events featuring famous authors, world leaders, entertainers, scholars, experts and sports legends. The Library has been recording Library events for decades and makes those recordings available in their expansive online collection.

Today’s Library of Congress is an unparalleled world resource. The collection of more than 168 million items includes more than 39 million cataloged books and other print materials in 470 languages; more than 72 million manuscripts; the largest rare book collection in North America; and the world’s largest collection of legal materials, films, maps, sheet music and sound recordings. The Library preserves and provides access to a rich, diverse and enduring source of knowledge to inform, inspire and engage both current and future generations in their intellectual and creative endeavors.

Popular music from composer Winifred Phillips’ award-winning Assassin’s Creed Liberation score will be performed live by a top 80-piece orchestra and choir as part of the Assassin’s Creed Symphony World Tour, which kicks off in June 2019. As an accomplished video game composer, Phillips is best known for composing music for games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims.  Phillips’ other notable projects include the triple-A first person shooter Homefront: The Revolution, and numerous virtual reality games, including Sports Scramble, Audioshield, Scraper: First Strike, Dragon Front, and many more.  She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Phillips is a sought-after public speaker, and she has been invited to speak about her work as a game composer at the Library of Congress, the Game Developers Conference, the Audio Engineering Society, the Society of Composers and Lyricists, and many more.  Follow her on Twitter @winphillips.


By Winifred Phillips | Contact | Follow

Glad you’re here!  I’m video game music composer Winifred Phillips, and I’m the author of the book A Composer’s Guide to Game Music.  Recently my publisher The MIT Press requested that I host a question and answer session on Reddit’s famous Ask Me Anything forum, to share my knowledge about game music and spread the word about my book on that topic.  I’d be answering questions from a community consisting of thousands of gamers, developers and aspiring composers.  It sounded like fun, so last Thursday and Friday I logged onto Reddit and answered as many questions as I possibly could.  It was an awesome experience!  Over the course of those two days, my Reddit AMA went viral.  It ascended to the Reddit front page, receiving 14.8 thousand upvotes and garnering Reddit’s gold and platinum awards.  My AMA has now become one of the most engaged and popular Reddit gaming AMAs ever hosted on the Ask-Me-Anything subreddit.  I’m so grateful to the Reddit community for their amazing support and enthusiasm!!  During the course of those two days, the community posed some wonderful questions, and I thought it would be great to gather together some of those questions and answers that might interest us here.  Below you’ll find a discussion focused on the art and craft of game music composition.  The discussion covered the gamut of subjects, from elementary to expert, and I’ve arranged the discussion below under topic headings for the sake of convenience.  I hope you enjoy this excerpted Q&A from my Reddit Ask-Me-Anything!  If you’d like to read the entire AMA (which also includes lots of discussion of my past video game music projects), you’ll find the whole Reddit AMA here.

Questions about workflow

Question: I am really curious to learn what the process of developing music for a game is. I know with film media, musical directors get footage to create along with, how does this process work for games?

Winifred Phillips: You’re right about the process with film media. It’s a spotting procedure, wherein the director and composer go through the film and look for good opportunities to place music in positions that will have maximum impact. I think that there’s a similar philosophy behind how music is placed in games. The game development team and the composer make decisions about where music is going to be most impactful. The difference is that we can’t just watch the game all the way through, the way we might watch a film. Instead, we can look at the design documents, look at the currently built levels at whatever stage of development they’re currently in, and make decisions based on that. The dev team usually has strong ideas about the role of music in their project, and how they want the music to interact with their game. Sometimes I’ll have more input regarding these choices, and other times the team will be fired up about their vision for the role that music will play, and I’ll need to execute that vision.

Question: When composing, what kind of details are important to ensure it fits the game? Do you play through a scene without music before working on the piece?

Winifred Phillips: Cool question! I always love receiving a build of the game while I’m working so that I can play it and get inspired by what the development team is doing. That’s not always possible, though. Sometimes the game is just too early in development for me to receive a playable version. In that case, I read all sorts of design documents, look at tons of concept art, have lots of great meetings with the developers to talk about what inspires them and what their vision is for the music of their game. I’ll also do a bunch of research before I begin work. The research sometimes focuses on musical style, genre, instrumentation, etc. Sometimes the research also includes topics related to the game narrative and history. I want to understand the world of the game, so that I can create music that’s appropriate for it.

Question: How is your typical workflow when writing music for a new scene/part of a game? Like; play the scene once, fiddle around with some melodies, record a quick piece and let it sit for a few days and so on.

Winifred Phillips: I like to watch gameplay video or play a game build (if there’s one available) before I start work. Every piece of gameplay has its own visual rhythm, and that has a profound influence on the pacing and momentum of the music I’ll create for it. Regarding how I schedule my work — usually I’m operating on some pretty tight deadlines, so I rarely have the luxury to set a piece aside and then come back to it. The music must be finished! The deadline gods will be satisfied! Honestly, there’s nothing more inspiring than that terrifying ticking of the clock counting down to a deadline. I amaze myself with how much I can get done.

Question: I’m interested in your work on God of War and Assassin’s Creed. Those are two, for the most part, period pieces. Do you try and incorporate instruments or styles of music from ancient times? If so, how do you choose where to have modern/period music?  As a second question: how much creative freedom do you have? Does it vary from gig to gig?

Winifred Phillips: Good questions!  I enjoy working on period pieces, because I get to do research and incorporate instruments and performance styles from another time. For Assassin’s Creed Liberation, I dug deep into baroque musical structures and instruments, and I learned a -lot- from the experience of creating the music for that game. For God of War I listened to a lot of world music from the locations where the game is based. Of course, we don’t really know what the ancient music of Sparta would have sounded like, although there are some guesses being made. Regarding creative freedom — you are absolutely right. It varies from gig to gig. I’ve worked on some gigs where the team just gave me their blessing and told me to do whatever I liked. I’ve also worked on projects where the development team closely supervised everything I did and had detailed instructions at every step of the project. It’s important to be able to adapt to whatever the circumstances may be.

Questions about working with development teams

Question: I would consider the music to be a huge part of the feel of any good game; considering what you just said about not having the ability to play a piece of the game before you compose, would you say that you have some creative license to sway the tone and feel of a game? I understand you want to help the game become what it’s supposed to become but sometimes a little creativity can help make the game into something more.

Winifred Phillips: You’re right about the creative license that composers sometimes have to define the sound of a game. When we’re brought in before the levels are finished, our music might actually have a big impact on the design of those levels. For instance, after I was hired to compose music for LittleBigPlanet 2, my first assignment was to create music for the Victoria’s Lab level, and I was given the description of the main character as a sort of mad scientist figure. She likes to build killer robots, and she’s a bit nuts. Always a delightful combination! So I composed a track with a lot of dark elements — gritty guitars, epic orchestral strings, etc. But since it’s also a LittleBigPlanet track, I made sure to infuse it with a lot of fun and wacky elements, like calliope, accordion, beat boxing, vocoder, and so on. Later, I found out that after I’d submitted the music, the level designers had gone back to the drawing board and revised the level pretty extensively. When I finally saw the level, Victoria was still a mad scientist, but now she was also a baker. The level was filled with cookies and cakes, and Sackboy could attack the killer robots by hurling giant cupcakes at them. The team at Media Molecule let me know that they’d changed the level because the music had inspired them. I can’t express how much that meant to me. The folks at Media Molecule are profoundly gifted and amazing, so I was so touched that my contribution helped to shape their creative process!  Here’s a vid of that music:

Little Big Planet 2 Soundtrack - Victoria's Lab (Winifred Phillips) - YouTube

Question: I am a 29 year old who literally loves little big planet! Is there ever a time where you would feel like the music wasn’t right for the game and you were forced to pick it because of the director or developer?

Winifred Phillips: Nothing wrong with being 29 and loving LittleBigPlanet! I love it too! Regarding your question — I’ve been asked sometimes to deliver music that seems like sort of an eccentric choice for the game in question. But in those moments, I think it’s important to remember that the dev team knows their game a lot better than I do. They know their audience. They also know the overall effect they’re trying to create with the music of the game. Creative collaboration doesn’t work unless everybody trusts everybody else. I just have to metaphorically close my eyes and do a ‘trust fall’ in those circumstances. The final result is almost always fantastic!

Question: In projects where the sound design team and music team are not really coordinating with each other, how do you ensure the music produced and evaluated stays on brand for a project?

Winifred Phillips: I know what you mean. Really, the sound design team and music team ought to be coordinating with each other! At the very least, there should be some kind of supervisor in the team that’s keeping an eye on those things and making sure that the music and sound design work together. In most projects I’ve worked on, that’s the case. However, I also like to ask the team to send me videos of gameplay that include the sound design. That way, I can drop the video with its sound design into my Pro Tools session and hear the sounds of the game while I’m creating the music. This helps me create music that isn’t going to clash with the other aural elements in the game.

Questions about dynamic / interactive music composition

Question: What would you consider the main difference between game music and normal music?

Winifred Phillips: Well, normal music has a beginning that proceeds to a middle and then concludes with an ending. Game music usually doesn’t have any of those things. At its simplest, game music is composed so that it can be repeated indefinitely, which means that it must be composed with a very different structure than traditional music. As game composers, we have to think about what qualities will help players to enjoy a piece of music that repeats, and what qualities will lessen that enjoyment. Beyond this simplest of considerations, as game music gets more interactive and responsive to the actions of the player, the whole situation grows exponentially more complex. The music starts getting fragmented into many different segments that can be juggled around according to the action of the game. It can be very challenging for a traditional composer to understand how fundamentally different game music composition is.

Question: Is making game music different from making regular music, is there a guideline to follow?

Winifred Phillips: Regarding making game music — it couldn’t be -more- different than making regular music! Game music is very distinct. The demands on the composer are very different than they would be for a film or television composer, or even for a symphonic composer. Game music has to be interactive. It has to react to the actions of the player. That’s actually really inspiring to me. I feel like I’m having a sort of musical conversation with players. They perform actions, and the music responds. Hopefully the music inspires players on their in-game journey. In terms of the technical aspects, game music has to be constructed in bits and pieces that can be jig-sawed together by the game engine according to what’s going on in the game. I go into a lot of detail about this in my book — it’s a fascinating way to think about music creation, and it’s really inspired me to stretch and grow as a composer.
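To make that “bits and pieces” idea a little more concrete, here’s a minimal hypothetical sketch of how a game engine might re-sequence short music segments as the player’s situation changes.  The segment names, game states, and transition stingers are all invented for illustration; this isn’t taken from any particular game or middleware.

    # Hypothetical sketch: short music segments looped and re-sequenced
    # by the game engine as the player's state changes. All names invented.
    import random

    SEGMENTS = {
        "explore": ["explore_a", "explore_b", "explore_c"],   # calm loops
        "tension": ["tension_a", "tension_b"],                # enemy nearby
        "combat":  ["combat_a", "combat_b", "combat_c"],      # full fight
    }
    TRANSITIONS = {  # stingers that bridge one state into another
        ("explore", "combat"): "stinger_into_combat",
        ("combat", "explore"): "stinger_all_clear",
    }

    def next_segment(current_state, new_state, last_segment=None):
        """Return the next music segment the engine should queue."""
        if new_state != current_state:
            # Use a transition stinger if one exists, otherwise jump straight in.
            return TRANSITIONS.get((current_state, new_state),
                                   random.choice(SEGMENTS[new_state]))
        # Same state: keep looping, but avoid repeating the exact same segment.
        choices = [s for s in SEGMENTS[new_state] if s != last_segment]
        return random.choice(choices)

    # Simulated play-through: the game reports its state at each musical boundary.
    state, segment = "explore", "explore_a"
    for new_state in ["explore", "explore", "tension", "combat", "combat", "explore"]:
        segment = next_segment(state, new_state, segment)
        print(f"{new_state:8s} -> queue {segment}")
        state = new_state

Even in this toy form, you can see why the music has to be written so that any segment can follow any other: the composer never knows in advance which path the player (and therefore the engine) will take.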

Question: How do you manage to get certain feelings (tension, etc) to change over time during certain songs? Such as going into combat or coming out of it.

Winifred Phillips: Good question! When we’re structuring music for a game, we’ll assign different tasks to different compositions. A track may be assigned the task..


By Winifred Phillips | Contact | Follow

Hi!  I’m video game music composer Winifred Phillips, and sometimes my game music shows up in places I never would have expected.  A little over a week ago, while I was eagerly watching an awesome trailer for the just-released blockbuster Avengers Endgame, I was suddenly stunned to hear my own music in it!  (I’ve embedded the Avengers Endgame trailer that features my music at the end of this article.)  What made this moment even more jaw-dropping for me was that I had originally composed this music for the video game Spore Hero (a game from Electronic Arts’ popular Spore franchise).  Just as a reference, here’s what the characters look like in Spore Hero:

The style of Spore Hero couldn’t be further away from that famous Avengers style, as expertly displayed in the Avengers Endgame trailer.  Yet the same music was used for both projects.

The Spore Hero music I was hearing in the Avengers Endgame trailer was my “Hero Theme,” which functions essentially as a leitmotif within the Spore Hero score – it’s the central recurring melody in the game.  By virtue of the theme-and-variation technique, the melody undergoes a gradual transformation from invitingly cute to heroically epic.

The Avengers Endgame trailer featured the most dramatic iteration of this theme.  When I recovered from the initial surprise, it occurred to me that a mini-postmortem of this particular melodic theme might be the best way to explore an interesting topic: how does a single theme transform itself from an amiable melody to an avenging one?

I’ve written about the “theme and variation” technique before, both in these articles and in my MIT Press book, A Composer’s Guide to Game Music. In my book, I described the technique as follows:

“Theme and variation is a key concept for us to fully understand. Like our film and television counterparts, we as video game composers will sometimes create melodies to signify important characters, special events, or particular locations. These melodic themes may then recur within the score, sometimes in an altered form called a variation. This practice does not particularly distinguish our work from that of composers for film and television. However, in video games we often use theme and variation to convey messages to the gamer. Certain musical themes may be used to inform the player that they are achieving successful results, or that they have arrived at a desired destination….The joy of devising variations on a theme is that there are always numerous possibilities, depending on the musical effect we wish to achieve. For the video game composer, the theme-and-variation technique allows us to continue to assert a sense of thematic unity throughout the work while avoiding repetition fatigue.”

The Hero Theme in Spore Hero makes its first appearance at the very top of the game, in the opening menu. This track was the first distinct melody that was composed for Spore Hero.  Prior to the creation of this track, music production had proceeded for a solid month, focusing solely on components of the dynamic music system that required a very non-melodic approach.  In contrast, the opening menu presented the first real opportunity to create an iconic, thematic statement for the game.  The composition of this Hero theme included a lot of ascending melodic lines designed to emphasize optimism and positivity.  The music was structured in a cheerful major mode, and the style favored world-music rhythms, whimsical instrumentation, and woodwinds.  Here’s that track (from the official soundtrack album of Spore Hero):

Spore Hero Main Theme - YouTube

Once this track was complete, work resumed on the dozens and dozens of short music chunks that enabled the dynamic music system to function. Creation of the next full-length melody-driven track for Spore Hero came about a month later, when I recorded and delivered the music for the Creature Editor.  Anyone who has played a Spore game will be familiar with the great Creature Editor mechanic.  It allows players to create their own unique monster characters using a deep library of body parts that move and function according to the way they’re connected together.  The music that would play during these creature editing sequences had to be contemplative and low-key.  Nevertheless, it was important to acknowledge the central melodic Hero theme of the game within the Creature Editor music.  Here, that melody appears intermittently as a simple ascending motif (i.e. a melody fragment).  Using a recognizable portion of the theme here helps to solidify its importance within the game’s score.  Here’s that music as it is included in the official soundtrack album:

Sporabilities - YouTube

We can hear that this melody fragment now starts to take on iconic overtones, with the upward movement resembling a clarion call.  However, the instrumental treatment is still gently playful and light.

The next several months were consumed with lots of music composition for combat and exploration, while still incorporating that short musical motif wherever possible. However, the Hero motif didn’t go through any significant variation until composition commenced on the Storytelling track.  This music appeared in the game when players received portions of the overall narrative and learned about the importance of their quest.  This seemed like a perfect opportunity to expand upon that main Hero theme and surround it with a more complex harmonic structure.  Clearing away the whimsy and bright-eyed optimism, the Storytelling music opted for a more serious approach instead.  The harmonic underpinnings underwent a radical reimagining.  Moving firmly away from the simple major mode cheerfulness it had previously displayed, the iconic Hero theme now incorporates elaborate chord progressions to infuse the music with a sense of development and maturation.  Here’s that Storytelling version of the Hero theme (titled Sporeward in the official Spore Hero soundtrack album):

Sporeward - YouTube

After finishing this storytelling music, composition work progressed through the music requirements for all the quests that the main character would undertake throughout the game.  Again, it was important to incorporate the short iconic motif wherever possible, lending a sense of unity to the overall score of the game.  Along the way, I composed another important melody for this project, so now I’d like to take a short detour and talk a little about that other recurring musical theme – the Nemesis theme.  Packed tightly with chord clusters and chromatic runs, the composition structure of the Nemesis theme emphasized a sense of oddness and eccentricity that allowed it to contrast sharply with the hero’s more traditionally harmonious theme.  These two melodies essentially ran a parallel course throughout the game as the main character faced off against his evil adversary time and time again.  The first appearance of the Nemesis theme favors quirky, whimsical instrumentation and techniques.  I’m including the Nemesis theme here because it becomes important as we head towards the final iteration of the Hero theme.  So now let’s listen to the Nemesis theme from the Spore Hero soundtrack album:

Nemesis - YouTube

After completing the Nemesis theme, music composition continued for several more months while work focused on tracks for quests, environments and combat. Finally, the music composition and production schedule reached the penultimate track that plays while the hero rushes towards the climactic end-of-game confrontation.  This is the definitive Hero theme of the Spore Hero video game.  It takes the optimistic, fanciful melody from the beginning of the game and transforms it into a high-stakes anthem.  Most of the overt quirkiness and eccentric instrumentation are gone now, replaced with traditional brass and strings.  The world rhythm has disappeared, and military percussion has taken its place.  The entire theme has shifted into the minor mode, darkening the texture of the melody and emphasizing a sense of sober importance.  In addition, the Nemesis theme breaks in at 1:53, showing the culmination of these two contrasting melodies as they are juxtaposed directly in a single track.  Finally, a full choir joins in at 2:35, pushing the intensity still higher.  The track has now transformed from a lighthearted and amiable melody to an avenging anthem.  First, let’s listen to the whole Hero theme as it appears in the Spore Hero soundtrack album:

Hero Theme - YouTube

Here, the theme has shed its fanciful qualities and become an action track suitable for superheroes.  With that in mind, let’s now see how the Hero Theme worked within the Avengers Endgame trailer:

Avengers Endgame - Know Before You Go Trailer - YouTube

I hope you’ve found this mini-postmortem of the Hero Theme from Spore Hero to be an interesting case study of how theme and variation can be useful to us as game composers!

 

Popular music from composer Winifred Phillips’ award-winning Assassin’s Creed Liberation score will be performed live by a top 80-piece orchestra and choir as part of the Assassin’s Creed Symphony World Tour, which kicks off in 2019 with its Los Angeles premiere at the famous Dolby Theatre. As an accomplished video game composer, Phillips is best known for composing music for games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims.  Phillips’ other notable projects include the triple-A first person shooter Homefront: The Revolution, and numerous virtual reality games, including Scraper: First Strike, Dragon Front, and many more.  She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Phillips is a sought-after public speaker, and she has been invited to speak about her work as a game composer at the Library of Congress, the Game Developers Conference, the Audio Engineering Society, the Society of Composers and Lyricists, and many more.  Follow her on Twitter @winphillips.


By Winifred Phillips | Contact | Follow

On April 6th I was honored to give a lecture at the Thomas Jefferson Building of the Library of Congress in Washington DC (pictured right).  As a video game composer, I’d been invited to speak by the Music Division of the Library of Congress.  I’d be delivering the concluding presentation during their premiere event celebrating popular video game music.  My lecture would be the very first video game music composition lecture ever given at the Library of Congress.  I was both honored and humbled to accept the invitation and have my lecture included in the 2018-2019 season of concerts and symposia from the Library of Congress.

In my presentation, I included many topics that I’ve written about in previous articles.  My lecture topics included horizontal resequencing, vertical layering, and interactive MIDI-based composition. I explored the various roles that music has played in famous games from the earliest days of game design (like Frogger and Ballblazer).  I also discussed how music has been implemented in some of the awesome games from the modern era (like one of my own projects, Assassin’s Creed Liberation).

My lecture drew a full house in the Whittall Pavilion at the Library of Congress. The audience gave me both a warm welcome and lots of great questions following the conclusion of my lecture.  Afterwards, the discussion continued during a book signing event that was kindly hosted by the Library of Congress shop.  During the book signing event, I was pleased to sign copies of my book A Composer’s Guide to Game Music. I also got to talk personally with quite a few audience members.  Such an engaging and insightful crowd!  It was a pleasure getting to know these lovely people.  I really enjoyed the lively conversation – I had the best time!!

The video of my full lecture will be posted on the Library of Congress web site within the coming months.  However, I thought I might offer a preview in the form of a partial transcript including some of the top questions from the Q&A session that followed my lecture.  So here are some of the questions that were posed – starting with a question about a topic of great importance to the Library of Congress – copyright protection for artists!

Question:  I know copyright is a big thing, so I was wondering how do you approach that?  So that you can avoid having the copyright issues?  At least from your experience?

Phillips: Copyright is a very interesting and important aspect of our work.  A lot of games are structured specifically around the idea of using licensed music. I’m sure you can think of a lot of Electronic Arts sports games in which new music is introduced to players for the first time.  You find a band that you really love by playing one of those sports games.  That’s been a really big avenue for young artists to make their start, and it’s been great for the music community at large.

No game developer wants to find out that they’ve used a piece of music and that they haven’t secured the rights appropriately.  Particularly if you’ve fallen in love with a track.  You’ve incorporated it into your game, and (oh God forbid) you’ve actually structured your gameplay around it.  Then you find out you can’t use it!

It’s great that the Library of Congress has served the artistic community for so long in making sure that artists are protected.

Question:  What challenges do you face, or conventions do you follow, when you are mixing down a dynamic piece of music for a retail CD release? Or for a promotional material?  How do you make that stay interesting to the listener, if it’s not as dynamic as it once was?

Phillips:  When you’re composing music for a game, you’re essentially composing a lot of different bits and pieces.  You know that during gameplay they’re going to be triggered by the progress of the player.  So it’s essentially a flexible, fluid story.  I try to think about the most impactful course that the player might have taken through that level…  through that piece of music. Then I will construct in my music production software an ideal course, an ideal way to go through it.  I’ll mix it so that it becomes a memory of the experience of playing that game.  A lot of the people who buy these soundtracks are people who have played the games.  They want to own the music because they want to relive the experience.  That’s what I’m thinking about when I’m pulling all of the interactive elements together.  I want to create a sort of ideal listening experience.

Question:  How often do you use virtual libraries, in comparison to a real orchestra?  Do you use virtual libraries more for mockups, or do you use it more for scoring a game?

Phillips:  There have been projects where I’ve used it just in the mockup stage at the beginning. Then the project has gone on to record with a live orchestra.  So that’s always fun. But then there are other projects where the budget is just not going to accommodate that.  One of the things that’s important to me as an artist is the option to work with both large studios and also indie teams that don’t have the same kind of budgets. It allows me to do a wide range of projects that are exciting and creative.

To make an orchestral sample library sound satisfying and realistic requires minute attention to detail.  Also, a really good understanding of how sample libraries work.  How live musicians play.  Then, you can approximate the sound in a way that’s going to feel satisfying for listeners.  On the other hand, when you’re dealing with a live orchestra, you really want to be able to take advantage of the strengths of that medium, and appeal to the expressiveness that a live orchestra or live soloists can bring.

Question:  As someone who plays musical instruments but who has never composed – if I were to try to make an indie game or something and I wanted to make my own music – what kind of recommendations would you have?

Phillips:  I’ve seen development teams in which the main developer also creates the music.  There have been some really interesting games that have been created that way.  If you are a musician creating a game, you can sense how the music fits into the mechanic of the game – since you’re creating both.  So that’s something that I’ve seen done.  But I do think that if you haven’t composed before, it might make sense to try to start doing some of that first, before you try to bring those two elements together.  They’re very different disciplines.  You want to have a basic skill set.  You want to be comfortable with that, before you start taking something that’s hard in and of itself (music composition), and then adding into it something else that’s hard in and of itself (game design). You don’t want to get overwhelmed.

If you are familiar with any student teams, or teams that are involved in game jams, you could get involved in that.  A game jam is one of these events in which all of these game developers come together to create games on the fly really quick.  It’s actually a lot of fun, because it becomes a way to be very creative and solve problems right on the spot.  It’s also a fantastic opportunity to jump into a team right away as the composer.  You get an opportunity to just think about that part of the game, to create the music within the structure of a team.  I think you’d learn a lot about what goes into music composition for games.  Do that first – before you put the thousand ton weight on yourself by doing both things at the same time.

Question:  Given the gaming consoles and other platforms have grown so much in terms of their capacity, what limits do you feel are placed now (or still) on your budget?  And what’s your response then, in terms of strategies that you bring to increase your expressiveness?  What resources are available, while still remaining within the resources allowed?

Phillips:  That definitely becomes an issue from project to project.  I can’t honestly think of a single project I’ve worked on where budget doesn’t become a factor, even when the budgets are very large. There’s always the opportunity for your ambitions to explode beyond the boundaries that any game budget can accommodate.  At that point you triage the situation.  You look at it and you say, ‘how can I maximize the potential of a smaller amount of music? How can it provide coverage for a game and not become too repetitive – too annoying?’  You don’t want your music to become a negative. You always want it to be something people love.

I’ve talked about vertical layering, the idea of music broken apart into its individual layers that can be used separately.  That makes one piece of music more flexible to cover a larger amount of time. It can morph and change and become more adaptive to what’s going on.

On the other hand, there’s also the quite valid consideration of when music should settle back into silence. That gives the player room to absorb the moment. You strategically place music in those positions where it’s going to have maximum impact, where it’s going to be meaningful. You can still have a satisfying musical experience in the game without needing an enormous budget to accommodate it.

So those are two approaches that can address the problem.  But it’s a continuing problem.  Every game development studio wrestles with it at one time or another. We’ve all got the hope and the yearning to do something really special with music.  A lot of the times we can!  Sometimes the restrictions can make us be creative in ways we couldn’t have predicted.  That is the way to grow as a development team or as an artist and composer. So these challenges can serve to make us grow and become better.
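To make the vertical layering idea mentioned above a bit more concrete, here’s a tiny hypothetical sketch of how a game might fade individual stems of a single composition in and out as an “intensity” value changes.  The layer names and the mapping are invented purely for illustration.

    # Hypothetical sketch of vertical layering: one piece of music split into
    # stems whose volumes follow a game "intensity" value, so a single
    # composition can cover both calm and frantic moments. Names are invented.
    LAYERS = ["percussion", "bass", "strings", "brass", "choir"]  # quiet -> epic

    def layer_gains(intensity):
        """Map an intensity in [0, 1] to a gain in [0, 1] for each layer."""
        gains = {}
        slice_width = 1.0 / len(LAYERS)
        for i, layer in enumerate(LAYERS):
            start = i * slice_width                          # where this layer begins fading in
            gains[layer] = max(0.0, min(1.0, (intensity - start) / slice_width))
        return gains

    for intensity in (0.1, 0.5, 0.9):
        print(intensity, {name: round(g, 2) for name, g in layer_gains(intensity).items()})

Because the stems all belong to the same composition, the mix can thicken or thin gradually in response to gameplay, which is exactly what lets a modest amount of music stretch across a much larger span of playtime.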


By Winifred Phillips | Contact | Follow

Delighted you’re here!  I’m very pleased to share that over the next two months I’ll be speaking at two fantastic events focusing on music in video games!  My two presentations will explore the unique structure and character of video game music, and how it helps to better envelop players in the worlds that game designers have created.  I thought that this article might be a good opportunity to delve into some of the ideas that form the basis of my two upcoming talks.  First, I’d like to share some details about the presentations I’ll be giving.

The Library of Congress has invited me to speak this April as a part of their “Augmented Realities” video game music festival. My presentation, “The Interface Between Music Composition and Game Design,” will take place at the Library of Congress in Washington DC. I’m very excited to participate in this event, which will be the first of its kind hosted by the “Concerts from the Library” series at the Library of Congress! The “Augmented Realities” video game music festival will also include panels on video game music history and preservation presented by distinguished curators and archivists at the Library of Congress, a special documentary screening that explores the ChipTunes movement, and a live “game creation lab.” My presentation will be the concluding lecture of the festival, and I’m honored to speak at such an illustrious event!  If you find yourself in the Washington DC area on April 6th 2019, you’re very welcome to come to my lecture at the Library of Congress!  Tickets are free (first come, first served), and they’re available now via EventBrite.

But before my lecture at the Library of Congress, I’ll be making a trip to San Francisco for the famous Game Developers Conference that takes place next month. For the past few years I’ve been excited and honored to be selected as a Game Developers Conference speaker in the Game Audio track, and I’m happy to share that I’ll be speaking again next month in San Francisco at GDC 2019! My talk this year is entitled “How Music Enhances Virtual Presence.”

As a composer of video game music, the concept of Virtual Presence is a really intriguing topic for me.  I find myself often thinking about how my work as a video game composer can help to more fully envelop players in the awesome virtual worlds that they’re exploring.  In my GDC 2019 talk, I’ll be discussing Virtual Presence in connection with seven of the Virtual Reality games and experiences that I’ve scored, which have either released within the past year or will be released within the coming months.

When I decided to pursue this idea as the basis of my GDC 2019 lecture, I had the pleasure to read some fascinating research papers and expert opinions on the subject.  While most of my research is included in my presentation, some interesting topics couldn’t be included within the time constraints of my GDC 2019 talk.

In light of these limitations, I thought that we could explore these extra ideas in this article – that way we’d have a basis of thought-provoking supplementary information in advance of my upcoming GDC lecture next month.  So, let’s get started!

What is Virtual Presence?

In studies conducted at the University of Haifa in Israel, a team of researchers attempted to develop the optimal Virtual Reality user profile, with the aim of determining how to achieve ideal Virtual Presence.  For the purposes of their experiments, the research team defined Virtual Presence as “the subjective experience in which the client/subject feels as if s/he is ‘in’ the situation even though it is not real. Presence is influenced by personality and technological factors, as well as by the interaction between the two.”

This is a useful definition of Virtual Presence, and offers some important prerequisites that must be achieved in order to attain it.  According to these researchers, in order to experience Virtual Presence, we need to fully accept the artificial environment as authentic, and experience our presence within that environment as convincing and emotionally engaging.  It’s no small task.  In considering how to accomplish this, I found the viewpoint of Harvard professor and researcher Chris Dede to be very helpful.  In an article about video games for the journal Science, Dede wrote, “The more a virtual immersive experience is based on design strategies that combine actional, symbolic, and sensory factors, the greater the participant’s suspension of disbelief that s/he is ‘inside’ a digitally enhanced setting.”

In other words, the more compelling stimuli we receive inside a game world, the more we’ll be willing to suspend our disbelief in favor of the virtual reality with which we’ve been presented… but it has to be believable, and engaging on both an intellectual and emotional level.

Within the body of research aimed at understanding how human beings achieve intellectual and emotional engagement with tasks, the theory of Flow stands out. In writing about Flow for Escapist Magazine, renowned RPG game designer Allen Varney described the sensation as an “intense focus, loss of self, distorted time sense, effortless action.” When we’re experiencing the Flow state, we’re engaging in an activity in which our top skill level and the greatest challenge of the task balance perfectly.  The task isn’t causing anxiety because it’s dauntingly difficult, nor is the task boring us because it’s so simplistic.  To quote Goldilocks, the task is “just right.” Let’s flesh out this idea by watching a popular video about Flow produced by journalist Evan Puschak, best known as The Nerdwriter:

Flow: Happiness in Super Focus - YouTube

Okay, so now that we’ve explored the importance of emotional and intellectual engagement, let’s expand that to connect with Virtual Presence.  Game designer Allen Varney describes “intense focus, loss of self, distorted time sense, effortless action” as factors that allow gamers to suspend their disbelief and become enveloped by a game.  With that in mind, can these intense mental conditions cause us to consider ourselves “in” the game situation even though it is not real, as described by the University of Haifa researchers?  In other words, can Flow lead to Virtual Presence in VR?  And if so, what can we (as video game composers) do to help make this happen?

Music can make you smarter

In 2016 I wrote a three-article series about how game music has the capability to temporarily enhance cognitive ability in players, increasing their skill level, which in turn makes the challenges of gaming more enjoyable.  While the articles were written specifically in connection with the rigors of strategy gaming, the concepts are applicable to other types of games as well (you can read Parts One, Two, and Three at these links).  The idea that ‘listening to music elevates our intelligence’ is not new.  Listening to music has been proven to increase cognitive ability, spatial ability, and even IQ. It’s definitely a temporary phenomenon, but it’s well documented. The effect depends on two factors.  The music has to be energetic and in a major key – that is, it has to be happy. Many of us will be familiar with this concept as the famous Mozart Effect, but subsequent research has shown that the effect depends more on our enjoyment of the music, rather than who the composer is.

The experience of Flow is all about successfully completing tasks that feel satisfyingly challenging.  Flow feels great!  But in order to experience the full effect, we have to have the prerequisite skills to succeed, and we have to have the necessary focus to respond capably to the demands of the game.  We’ve talked about how music can help us perform more skillfully, but can it also focus our minds to the tasks at hand?

According to a study conducted by Dr. Larry Morton of the University of Windsor and published in the Journal of Music Therapy, music can decrease our tendency to be distracted and enhance our focus and memory during tasks.  In Morton’s study, a group of subjects were asked to remember a sequence of numerical digits after having either listened to music for a while or sat in silence.  The results showed that exposure to music not only increased memory for the study subjects, but also enabled them to focus more effectively on the number sequence.

As Time Goes By

Since we’re exploring game designer Allen Varney’s prerequisites for an enveloping gaming experience, let’s take a look at what Allen Varney calls “distorted time sense.”  This entails the sensation that time has ceased behaving predictably and is now zipping along without our conscious awareness.  For instance, launching a game on Sunday afternoon, playing for what feels like two or three hours, and then finally looking up from the screen to discover with horror that it is now Monday morning at 3am – that is an excellent example of “distorted time sense.”  It’s often experienced during the Flow state.  Since we’re considering the possibility that this effect could support and enable Virtual Presence in VR, let’s now ask ourselves – how can game music composers distort the sensation of the passage of time?

There are many ways in which music can influence our perception of the passage of time.  First, let’s check out this 3-minute video from the BrainCraft series that explores how sensory input (including music) can cause time to become perceptually distorted:

Your Warped Perception of Time - YouTube

Now, let’s think in more specific terms about what music can do to alter our perception of time.  Dr. James Kellaris of the University of Cincinnati has performed a lot of research studies on this specific subject.  Dr. Kellaris is best known for his research into music and the mind, particularly in regards to what he calls the musical “earworm” that gets stuck in our heads and won’t let us be.  But Kellaris has also extensively investigated the relationship between music and time perception through several studies.


By Winifred Phillips | Contact | Follow

So happy you’ve joined us!  I’m videogame composer Winifred Phillips (pictured above working on my career breakthrough project, God of War). Today I’ll be discussing a hot topic that we’ve previously explored, but that definitely deserves to be revisited periodically.  This is one of the most popular subjects that I’ve addressed in my previous articles here: How does a newcomer get hired as a game composer?

I’m asked this question frequently, and while I offered quite a lot of advice on this topic in my book A Composer’s Guide to Game Music, I’m keenly aware of how urgent the need is for updated guidance on this issue for aspiring video game composers.  Game music newcomers often feel adrift and alone in the game industry, and some good advice can be a welcome lifeline.  In my book, I described the career path that led me into the game industry and allowed me to land my first gigs, but I’m well aware that my experience was pretty unique.  With that in mind, I’ve collated some recent research and insights from some top game industry professionals in this article, in the hopes that some of these expert observations might prove helpful.  There are lots of original and provocative viewpoints presented here, so we should feel free to pick and choose the strategies and tips that will work best for us.

Also, later in the article you’ll find my presentation for the Society of Composers and Lyricists seminar, in which I answered the question about how I personally got my start in the games industry (for those who might be curious).  Finally, at the end of the article I have included a full list of links for further reading and reference.

The Demo Reel

Here’s a new topic this year: the music demo reel, otherwise known as a professional portfolio.  To prepare an awesome demo reel, we game composers usually start by collecting our best music pieces and arranging them in a way that we hope will be impressive enough to arrest the attention of a potential client.  If we were in a sentimental frame of mind, we might think of our demo reel as a fragile flower in an outstretched palm, offered up in the hopes of finding an appreciative audience.  But there’s nothing about this process that’s sentimental.  The demo reel is a marketing tool, and it’s crucial that we think of it with a sense of strategic detachment.  We have a product we’re trying to sell, and our demo reel needs to act as our figurative foot in the door.  So, how do we make our reels stand out from the competition so that they attract the interest of clients who might want us for a famous game franchise or an indie masterpiece?

According to sound designer Nathan Madsen, when preparing our demo reel for review, we should put our newest work first.  “Having an up to date portfolio, or even relatively up to date, really helps you be ready for sudden job opportunities,” Madsen observes.  This opinion is echoed by Daniel Spreadbury (product marketing manager at Steinberg Media Technologies), who goes even further to suggest that “if you think that your talent as a composer is best displayed by the piece you’re currently working on, there is no reason that this can’t be submitted as well, as an example of a work-in-progress. While it will be incomplete, it can still be used to illustrate your creative process and your understanding of composition.”  That being said, Spreadbury urges that composers turn a critical eye to their work when selecting musical candidates for their demo reel.  “It’s often tempting to try and include as much of your own material as possible. In fact, it is usually far more useful to submit a carefully curated selection of your composing work, rather than everything that you’ve ever worked on. It is important to strike a delicate balance between submitting enough work that your talent and consistency are clearly demonstrated, but not so much that it overwhelms your intended audience.”

Are there any considerations regarding the format of our demo reel?  According to Matthew Marteinsson (sound designer at Klei Entertainment), a collection of music files may not be enough.  “Have a video reel,” Marteinsson urges.  “We are a medium that is the marriage of video and audio.”  Marteinsson observes that when receiving an audio-only reel, he is left uncertain of the capabilities of the composer he’s evaluating.  “I know you can make cool sounds, but I don’t know if you can marry them to the right visuals, so have a video reel.”

How much music should be included in our demo reels?  According to Kevin Regamey (Creative Director of Power Up Audio), less is more.  “What you’re making here is a teaser trailer,” he points out.  For composer demo reels, Regamey suggests that a two-minute time limit is a good rule of thumb.  “Most reels are usually too long,” he states.  “Keep in mind that you can always show longer things later on. Get them interested, and if they like your stuff they will listen to your things that are not in your main demo reel.”  Regamey goes on to point out that “you need to tailor your reel to its audience. Make sure you know who is getting this reel and what they want to see.”

Certainly it’s preferable to know what a prospective client is looking for, but how can we achieve this?  Ariel Gross (Founder of the Audio Mentoring Project), suggests that we decide what games or franchises we’d like to be working on, and then pursue that goal.  “Most of the time you don’t need to be an avid player of the specific game or franchise that you’re trying to work on, though it can be helpful,” Gross observes. “At the very least, you want to be well versed with the products. If you can’t play the game, go consume lots of gameplay videos on YouTube, read articles and reviews about the games, and look for interviews with the people that worked on it.”

Apart from how our music demo is constructed and targeted, timing may also be an essential consideration.  “You absolutely need to have a showreel or a demo. Also, you need to have it ready before you start contacting people,” advises Will Morton (Audio Director at Solid Audioworks).  “You don’t have to be carrying around a folder of CDs and DVDs with you all the time, you can have a Soundcloud account with MP3s ready to give people links to if you need to,” Morton suggests.  “It is easier than ever to have your work on-line and accessible from anywhere these days.”

Finally, Brian Schmidt (Founder and Executive Director of the GameSoundCon conference), suggests a novel approach.  “If you want to create a demo that will really create an impression, create a ‘MOD’ for a game. A MOD is a game where some component has been altered or changed. Taking a portion of a game and swapping in your own music or sound design is a great demo,” Schmidt observes.  “By creating an interactive demo (instead of just a bunch of mp3 files) you will stand out over 95 out of 100 other composers or sound designers.”

There seem to be numerous theories regarding what makes an effective demo reel, as well as many options for its format and content.  But if we’re unable to get it into the hands of a potential client, it won’t be of much use.  So let’s discuss how to build relationships that can open doors in the game development industry.

Effective Networking

Beyond having a strong music demo, we will also need to be able to make professional connections in the games industry, and that can be a bewildering task.  How to begin?  Perhaps the better question is not how, but when.  “Your time at college is the ideal time to start networking,” advises Will Morton of Solid Audioworks.  “Make friends and work with people who are doing sound design. Make friends and work with people studying composition. Make friends and work with people on game development courses. Make friends and work with people studying TV or film production. Make friends and work with people studying any kind of performing arts. This will again help you build a better body of work to use in your showreel, and arguably more importantly gives you a network of people who may at some point ask you to be involved with a project they are working on.”

This opinion is echoed by Jason W. Bay (Owner and Editor of GameIndustryCareerGuide.com).  “Attending an audio school will also help to kick start your career networking,” Bay says, “because as the people in your classes graduate and then start getting jobs all over the country or even the world, they’ll become your eyes and ears inside of those companies. And they can help you spot job openings and even help you get interviews whenever the opportunities arise.”  The benefits of cultivating a network of college friendships are invaluable, according to Bay.  “It helps you find out about job openings before they’re posted. It increases the chances of getting a job when people inside the company already know you and trust you, so it helps you get hired.”

However, not all of us attended colleges rich in networking opportunities, and some of us began pursuing a game audio career many years after graduating college.  What then?  According to Bobby Prince (Owner of Bobby Prince Music), it’s possible to build some of the same collegial relationships by attending educational conferences.  “Go to the GDC (Game Developer’s Conference),” Prince suggests. “You can hang out in the public section of the location for the GDC and watch for miracles. They happen every second. Another great place to hang out is the local after/during hours hangouts. Keep your eyes and ears open for an opportunity. You don’t have to force an opportunity — the best ones will come to you without effort from you.”

While such efforts may yield results, the chances are far better if we’re emotionally ready to sell ourselves to our fullest potential.  Confidence is key, according to Rocky Kev of Black Shell Media.  “Great game developers know they’re talented, and that confidence enhances their work. You can do the same with networking. Block out all the self doubt and act like you know what you’re doing. Fake it. You’ll be surprised how much it works.” Kev goes on to add that “powerful networkers approach conversations with curiosity, treating the speaker like they’re the most important person in the world.”

Our ability to connect with people and convey our confidence and enthusiasm can be an invaluable asset.  In my book, A Composer’s Guide to Game Music, I discussed how our excitement for our work can intersect with the need to exude self-discipline and restraint when developing our professional network.

“As composers, we feel passionately about our jobs.  This passion drives our daily workflow and inspires the creative decisions we make. Emotions such as this can be helpful to us when we’re meeting with possible clients, but only when well controlled through a disciplined and organized presentation.  Some of our prospective employers may appreciate raw enthusiasm, but this sort of eagerness also has the potential to scare some people off.  What we need is the ability to gracefully articulate our enthusiasm while at the same time impressing developers and publishers with our thorough professionalism.

“Unfortunately, this is a skill attained only through practice.  Those of us who are uncomfortable or fearful in such situations can look for opportunities to practice in a safe, consequence-free environment.  For instance, helpful friends may volunteer to be an audience for us, applauding our strengths and drawing our attention to areas needing improvement.”  A Composer’s Guide to Game Music, page 242.

Getting the first gig

Everyone’s ‘big break’ story is unique.  I’m frequently asked..


By Winifred Phillips | Contact | Follow

Welcome!  I’m videogame composer Winifred Phillips, and my projects lately have included music composition for a lot of great virtual reality games.  It’s been fascinating work!  Last year, when my VR work started to really pick up, I wrote an article with lots of resources to help video game music composers become more comfortable in the world of VR audio development.  Since this discipline progresses rapidly, I thought it would be best to post an update article now that adds additional resources to address new developments in the field.

VR development is continuously innovative and cutting-edge, and I’ve been fortunate to experience this first-hand.  As an example: one of my more recent virtual reality game projects was music for Audioshield Fitness, developed by the creator of the famous Audiosurf music-rhythm game. I was asked to compose the new official Audioshield Theme for release with the Audioshield Fitness game, which takes the core game mechanics of Audioshield and pumps up the challenge with obstacles that make players dodge and duck to the music.  The result is an intense workout that was named one of the top 5 VR Fitness Games of 2018 by PerfectBodyMate.com. To maximize the power of the Audioshield procedural system, my composition had to attune itself to the system’s powerful music analysis algorithm and deliver moments of both challenge and spectacle. I composed and mixed the music with specifically targeted EQ frequency ranges where I placed rhythmic elements and punchy crescendos.  The Audioshield music analysis system then reacted to this audio content and changed the pacing and content of gameplay to match these variables.  It was a fun challenge!  Here’s a video showing how that worked:

Audioshield Theme - for Audioshield Fitness - composed by Winifred Phillips - YouTube
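
To give a rough sense of what “reacting to targeted frequency ranges” can mean in practice, here’s a minimal Python sketch of the general idea: measuring short-time energy in one EQ band and flagging the frames where a rhythmic hit or crescendo pushes that energy upward.  This is only an illustration of the concept; it is not Audioshield’s actual analysis algorithm, and the band edges and threshold values are arbitrary choices made for the example.

```python
import numpy as np

def band_energy_peaks(samples, sr, f_lo=2000.0, f_hi=4000.0,
                      frame_len=2048, hop=512, threshold=1.5):
    """Flag frames where energy in the [f_lo, f_hi] Hz band jumps above the
    running average -- a crude stand-in for the kind of band-targeted
    analysis a music-reactive game might perform.  Illustrative only."""
    window = np.hanning(frame_len)
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
    band = (freqs >= f_lo) & (freqs <= f_hi)

    energies = []
    for start in range(0, len(samples) - frame_len, hop):
        frame = samples[start:start + frame_len] * window
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        energies.append(spectrum[band].sum())
    energies = np.array(energies)

    # A frame counts as a "hit" when its band energy exceeds the local
    # running mean by the chosen ratio.
    running_mean = np.convolve(energies, np.ones(43) / 43, mode="same")
    hits = energies > threshold * (running_mean + 1e-12)
    times = np.arange(len(energies)) * hop / sr
    return times[hits]

# Example: a quiet 440 Hz tone with short 3 kHz bursts once per second.
sr = 22050
t = np.linspace(0, 5, 5 * sr, endpoint=False)
audio = 0.2 * np.sin(2 * np.pi * 440 * t)
for beat in range(5):
    i = beat * sr
    audio[i:i + 2000] += 0.8 * np.sin(2 * np.pi * 3000 * t[:2000])

print(band_energy_peaks(audio, sr))
```

In a real music-reactive system the analysis would be far more sophisticated, but the basic trade works the same way: the composer places energy where the analyzer listens, and the game reacts to what it finds there.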

Composing for virtual reality is its own unique discipline, requiring a specialized set of skills and tools.  In this article, let’s collect some resources that explore the techniques, tools, and technologies associated with VR audio development.  Let’s also take a look at the professional community of VR developers that are there to help each other through the rough spots.  Ready?  Let’s go!

Technology and tools

First, let’s consider some of the tools and technologies that have been designed to help audio folks create awesome virtual reality sound, starting with innovations from the past year:

Google Releases Resonance Audio

Google has entered the VR business in a big way, first with Google Cardboard, then with the Google Daydream device.  Now, Google has released its own multiplatform software development kit for VR audio.  This article sums up some of the features.  Below, I’ve also included a video demonstration of a binaural and ambisonic demo for Resonance Audio:

Binaural + Ambisonic Demo for Resonance Audio in Unity - YouTube

Let’s Test: 3D Audio Spatialization Plugins

In this article from the Designing Sound site, three spatializer plugins are tested for their performance in localizing sound sources under an assortment of conditions: circular movement, acoustic shadowing, vertical position, occlusion, moving occlusion, modeled attenuation, and near-field effect.  Similarities and differences between the plugins are discussed, and multiple audio samples are provided for comparison. 

MixOnline: The dearVR 3D Audio Reality Engine

This system from the Dear Reality company combines a plugin suitable for integration with Digital Audio Workstations, and a virtual reality application that allows users to instantly test their DAW mixes within a VR environment.  The MixOnline article (linked above) argues that such a system is long overdue and necessary for effective VR audio production workflows.  Below I’ve also included the demo video for this technology, produced by Dear Reality.

dearVR PRO | Immersive audio VST, AAX, AU plugin - YouTube

Envelop for Ableton Live

This suite of spatial sound tools is designed to integrate into the Ableton Live Digital Audio Workstation.  For Ableton Live users, the Envelop suite of tools allows effective mixing for a virtual space.  The article (linked above) discusses many practical applications for this software, including virtual reality. 

Below, I’m including the links that were offered in last year’s article, since the information is still relevant:

The “Works” 3D Audio plugin for Pro Tools

This article explores the 3D rendering technology of the G’Audio Works plugin, which supports multi-channel, object-based and ambisonic spatialization within the Pro Tools application.

The Steam Audio Software Development Kit

This article focuses on the spatialized audio solution for VR developed by the famous Steam software distribution platform.  Available as a free download without any royalty requirements, Steam Audio is designed to assist both Unity and Unreal developers in creating and implementing spatialized audio in their projects.

Facebook 360 Spatial Workstation

The Two Big Ears audio company is known for its 3Dception software enabling audio folks to author spatialized audio for VR applications.  Now that Facebook has purchased the company, the software has been rebranded as the Facebook 360 Spatial Workstation with added compatibility for 360 videos hosted on the Facebook platform. The software is free for everyone to use in their projects. However, the previous plugin compatibility with Unity, Wwise and FMOD is no longer offered for new users.

Google’s Omnitone, the open source project for spatialized sound in VR

This article describes the Omnitone application, developed by Google to combine ambisonic decoding with binaural rendering.  Omnitone was designed to deliver spatialized audio for browser-based experiences and apps designed for Android and iOS.

NVIDIA VRWorks Audio

For Windows games and applications, NVIDIA now offers the VRWorks Audio Software Development Kit for implementing spatialized audio in VR for 64 bit Windows apps.  For developers working in Unreal Engine 4, the VRWorks Audio game engine plugin can be added directly to the UE4 engine, while future plugins are promised for other game and audio platforms.

Methods and techniques

Now, let’s check out some recent articles that offer methods and techniques for creating immersive virtual reality sound:

How to Build Audio for VR Games

In this article, the audio director of CCP Games delves into how human beings perceive sound, and how that extends to VR.  Included: the virtues of hard panning stereo in VR, and the importance of adaptive music in VR. 

Oculus’ Audio SDK: Localization and the Human Auditory System

While this article is primarily a component of the Oculus Audio SDK documentation for VR audio development, it includes a concise and thorough breakdown of the components of sound that enable human beings to localize sound sources.
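
As a small illustration of one of those localization cues, the interaural time difference (ITD) is often approximated with Woodworth’s spherical-head formula, ITD ≈ (r/c)(θ + sin θ), where r is the head radius, c is the speed of sound, and θ is the source azimuth.  The sketch below assumes a typical head radius of about 8.75 cm; it’s a textbook approximation used for illustration, not the model used by the Oculus Audio SDK or any other specific tool.

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (seconds) for a source at the
    given azimuth (0 deg = straight ahead, 90 deg = directly to one side),
    using Woodworth's spherical-head model: ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# The delay grows from zero straight ahead to roughly 0.65 ms at the side --
# one of the cues the auditory system uses to localize sound sources.
for az in (0, 15, 30, 45, 60, 90):
    print(f"{az:3d} deg -> {woodworth_itd(az) * 1e6:6.1f} microseconds")
```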

Immersive Audio for VR Workflow

This is a video recorded at the VR LA conference, in which a panel of game audio experts discusses a variety of topics connected to VR audio development.  Included: an exploration of the ways in which audio pros can lobby for the importance of audio content within a VR development team, and what features would be most welcome in audio production tools to make them more useful in VR development.  Watch here:

Immersive Audio for VR Workflow, Presented by DTS - YouTube

Audio Design for Interactive Narrative VR Experiences

In this audio postmortem of the VR project The Price of Freedom, the audio director explores how to prioritize audio implementation tasks during development, and how to effectively place audio elements within VR environments.  Mixing in VR is also discussed. 

9 Things You Should Know When Creating Sound for Virtual Reality

This article published by the GameSoundCon organization lays out several issues to consider when creating and implementing sound content in VR titles.  Among the topics addressed: equalization issues pertaining to HRTF processing, and natural human barriers to sound localization that can be addressed with careful spatialization choices.

As before, I’m including the links that were offered in last year’s article on this topic below, since the articles share a wealth of professional solutions to VR audio problems, and their insights can still be useful:

3D Audio formats for VR

3D Sound Labs takes us through the three most popular sound formats for spatial audio in VR: Multi-channel, Object-based, and Ambisonic.
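
To get a quick taste of how the ambisonic format differs from channel-based and object-based delivery, here’s a minimal sketch of first-order B-format encoding: a mono signal is spread across four channels (W, X, Y, Z) according to its direction, and that direction can later be rotated or decoded to speakers or to binaural further down the chain.  The sketch uses the traditional FuMa-style convention in which W carries the signal scaled by 1/√2; other conventions (such as AmbiX/SN3D) order and scale the channels differently.

```python
import numpy as np

def encode_first_order_bformat(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order B-format (W, X, Y, Z).

    Traditional (FuMa-style) weights:
        W = S / sqrt(2)
        X = S * cos(az) * cos(el)
        Y = S * sin(az) * cos(el)
        Z = S * sin(el)
    Azimuth is measured counterclockwise from straight ahead,
    elevation upward from the horizontal plane.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono / np.sqrt(2.0)
    x = mono * np.cos(az) * np.cos(el)
    y = mono * np.sin(az) * np.cos(el)
    z = mono * np.sin(el)
    return np.stack([w, x, y, z])

# Example: place a one-second 440 Hz tone 45 degrees to the left, slightly raised.
sr = 48000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
bformat = encode_first_order_bformat(tone, azimuth_deg=45, elevation_deg=10)
print(bformat.shape)  # (4, 48000): one channel each for W, X, Y, Z
```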

How 3D Spatialized Audio Bottlenecks Virtual Reality Video

VR audio can consume enormous computational resources, resulting in a resource war between audio and video content.  This article discusses how audio demands in VR consume memory bus bandwidth, and proposes a short-cut that manipulates the frequency response of the audio content to enhance spatialization while consuming fewer resources.

Adapting Your DAW for VR Audio

A VR sound editor discusses his methods and workflow for virtual reality in this article.  Topics include capturing audio in the ambisonic format, and customizing the Pro Tools environment for spatialized audio projects.

Simple Spatial Audio for Beginners

This article summarizes the available entry-level techniques and equipment that can help a newcomer jump into the world of spatial audio for virtual reality.

An audio post production house shares techniques for VR Audio

In this article, we learn about some of the top equipment and methods used by the VR-focused division of an experienced audio post production facility.

Communities & Organizations

Virtual reality audio development can be daunting.  Fortunately, there are lots of online communities and organizations that can provide advice and encouragement when needed.  I’ve assembled links to some of these below:


By Winifred Phillips | Contact | Follow

Hey everybody!  I’m videogame composer Winifred Phillips.  Every year, between working in my studio creating music for some awesome games, I like to take a little time to gather together some of the top online resources and guidance available for newbies in the field of video game music.  What follows in this article is an updated and expanded collection of links on a variety of topics pertinent to our profession.  We begin with the concert tours and events where we can get inspired by seeing game music performed live.  Then we’ll move on to a discussion of online communities that can help us out when we’re trying to solve a problem.  Next, we’ll see a collection of software tools that are commonplace in our field.  Finally, we’ll check out some conferences and academic organizations where we can absorb new ideas and skills.

Ready?  Let’s get started!

Concerts and Tours

Let’s check out some of the great concert events and tours that are circling the globe, offering famous video game music performed live to audiences ranging from sedate symphony halls to screaming mosh pits. There are tons of ways in which we game composers can find inspiration in these performances, and there’s a wealth of options from which to choose.  If our tastes lean towards the more classical side of things, we can check out the big orchestral concerts like the Video Games Live and Distant Worlds tours, or we can opt for the subtler pleasures of a chamber ensemble approach with the intimate music of A New World.  Then again, some of us would rather head for the mosh pits and get ourselves some head-banging good times.  These folks may want to opt for events like MagFest and Bit Gen Gamer Fest.  There’s something for everyone in the collection of links below.  I’ve also included video clips that show notable performances from past shows.

I’d like to start with a concert tour that was just announced last week, and that means a lot to me personally:

Assassin’s Creed Symphony World Tour

Kicking off its world tour in June 2019, the Assassin’s Creed Symphony will feature the most popular music selections from the entire Assassin’s Creed game franchise, including music from the score I composed for Assassin’s Creed Liberation.  I’m very excited that selections of my Assassin’s Creed Liberation music will be performed by an 80-piece orchestra and choir as a part of the world tour.  The concert tour will premiere in the famous Dolby Theatre in Los Angeles, best known for hosting the Oscars ceremonies each year.  This is a brand-new concert tour that hasn’t premiered yet.  Since there aren’t any videos from past shows, here is the official trailer from Assassin’s Creed Liberation, featuring three of the tracks from my Liberation score: “Stealth,” “In the Service of Humanity,” and “The Hunt.”

Assassin's Creed® Liberation HD [UK] - YouTube

A New World: Intimate Music from Final Fantasy

This concert tour of video game music from the Final Fantasy repertoire takes a unique approach.  Instead of opting for large-scale orchestral ensembles and choirs, A New World: Intimate Music from Final Fantasy uses small chamber ensembles and special arrangements designed to accommodate them.  The result is a complete reimagining of Final Fantasy music, allowing well-worn tracks to feel fresher and more personal.  Three concerts are currently set to take place in small venues during 2019, including performances in Los Angeles, Seattle and Atlanta.  Here is a performance of the “Chocobo Medley” from a 2017 show that took place in Vancouver.

Chocobo Medley (A New World: Intimate Music from Final Fantasy, Vancouver 2017) - YouTube

Bit Gen Gamer Fest

The Bit Gen Gamer Fest is an annual event celebrating game soundtracks during one jam-packed day of music and mayhem.  Feeling like a cross between a rock festival and a video game arcade, the July 2018 edition of Bit Gen Gamer Fest included 18 musical acts performing video game cover songs at the Ottobar in Baltimore.  Here’s an extended video of the X-Hunters full set during Bit Gen XIII.

Bit Gen XIII: The X-Hunters - YouTube

Distant Worlds: Music of Final Fantasy

Launching into its twelfth year of touring the world, the Distant Worlds: Music of Final Fantasy concert tour continues its quest to spread the music of Nobuo Uematsu to video game fans everywhere.  The performances include the large-scale Distant Worlds Philharmonic Orchestra and Chorus under the direction of Grammy Award-winning conductor Arnie Roth.  Here’s a video of their performance of the Final Fantasy VII Main Theme during a 2014 performance at the Royal Albert Hall in London.

Final Fantasy VII - Main Theme - YouTube

Game Music Festival

This is a brand-new game concert series, planned to be a yearly event.  The concert took place this past October at the National Forum of Music in Wrocław, Poland.  Sponsored by GameMusic.pl, the Game Music Festival concert featured musical selections from Heroes of Might and Magic, Grim Fandango, Ori and the Blind Forest, and three Blizzard properties: Diablo, World of Warcraft, and Starcraft.  Here’s a trailer produced by the concert series for their first annual event:

Game Music Festival in Poland 26-27 October 2018 - YouTube

Joystick with the Malmo Symphony Orchestra

Like the Game Music Festival in Poland, the Joystick concerts in Sweden are intended as an annual event.  Joystick is now entering its eleventh year of offering video game music as performed by the Malmo Symphony Orchestra.  The program for this year’s concert includes selections from The Witcher 3, Hitman 2, Horizon Zero Dawn and Final Fantasy VII, among others.  Here is a performance of “The Dragonborn Comes” track from Skyrim, as performed during the Joystick concert in 2013.

Malmö symfoniorkester - The Dragonborn Comes (Joystick 5.0, 2013). - YouTube

MAGFest

The “Music And Gaming Festival” known as MAGFest takes place over the course of four intense days each year in which massive gaming tournaments run 24 hours a day and banging video game music concerts play loud and long into the night.  In addition to the big yearly bash (coming to National Harbor, Maryland, in January 2019), the nonprofit MAGFest organization also sponsors a touring concert series called Game Over, and several smaller music/gaming events that take place around the country.  Here’s a cover version of the Mega Man 3 Intro music as performed by The Advantage during MAGFest VI:

The Advantage - Mega Man 3 Intro LIVE@MAGfest VI - YouTube

Metal Gear in Concert

The Metal Gear in Concert tour began with two performances in Japan before coming to Paris in October 2018.  This coming year, the Metal Gear in Concert tour will stage two concerts in the United States, including its stateside premiere in March 2019 at the United Palace in New York City, and a Los Angeles performance in April at the Wilshire Ebell Theatre.  The tour features a 70-piece orchestra and performances by vocalist Donna Burke, best known for singing the themes for both Peace Walker and The Phantom Pain.  Here is a video of Donna Burke performing the Snake Eater theme during the Metal Gear in Concert performance in Paris:

Paris Palais de Congres Metal Gear in Concert - Donna Burke Snake Eater October 28 2018 - YouTube

Video Games Live

Finally, we have the granddaddy of them all – the Video Games Live concert tour.  Since its debut in 2005 at the Hollywood Bowl in LA, the Video Games Live concert tour has pursued a rigorous schedule involving hundreds of performance dates around the world.  The Video Games Live series eschews its own orchestral ensemble in favor of recruiting local symphony orchestras and musicians in each of the touring cities and towns it visits. The result is a touch of local flavor influencing the character and size of every Video Games Live performance.  Here is a clip of Video Games Live performing music from the Overwatch game during a 2018 concert in Germany:

Video Games Live Performs Overwatch | Gamescom 2018 - YouTube

So now that we’ve looked at the concert tours that can get us inspired to make great game music, let’s look at other resources that can help us to stay energized and improve our skills.

Communities / Discussion Forums

Need help?  Expert advice?  A shoulder to lean on?  These are some of the most popular online communities where you just might find the answers you’re looking for.

The game audio community is tremendously friendly and approachable.  Some of the communities listed below are focused on specific topics (such as a software application), while others have a broader mandate to discuss any and all issues pertaining to game music composition and sound design.  Feel free to explore the links below and find a community that’s a good fit for you!

Software Tools

There are a wide variety of audio middleware solutions available for implementing audio and music into games, and I’ve listed some of the more high-profile software packages below.

Some of these middleware solutions are designed specifically with video game music composers in mind, to provide a user-friendly way for us to have the best control over the music implementation process.  These include Elias, FMOD, Nuendo, and Wwise.

The rest are more general-purpose audio implementation tools, with the exception of the Facebook 360 Spatial Workstation (designed with Virtual Reality and 360 video applications in mind) and PureData (designed specifically for generative music uses).

Game Music Academia & Conferences

When we’re in the mood to broaden our minds and think about our discipline in a new way, there are lots of scholarly organizations and conferences ready to offer us some inspiration and enlightenment!  First we’ll check out a list of academic and scholarly groups dedicated to studying the history and practice of music creation for video games.  After that, we’ll see a list of the yearly conferences that focus on audio and music creation.  Most of the list consists of conferences exclusively dedicated to the video game industry, but one of the conferences (Music & the Moving Image) offers a more general “music for media” event that includes video games in its offered content.

Academia

By Winifred Phillips | Contact | Follow

Hello there!  I’m video game music composer Winifred Phillips.  Lately, I’ve been very busy in my production studio composing music for a lot of awesome virtual reality games, including the upcoming Scraper: First Strike first-person VR shooter (pictured above) that’s coming out next Wednesday (November 21st) for the Oculus Rift, HTC Vive and Windows Mixed Reality devices, and will be released on December 18th for the PlayStation VR.  My work on this project has definitely stoked my interest in everything VR!  Since the game will be released very soon, here’s a trailer video released by the developers Labrodex Studios, featuring some of the music I composed for the game:

Scraper: First Strike is just one of a whole slew of VR games I’ve been working on over the past year.  Last year, when I was just starting to get really busy working with VR development teams, I wrote an article here that offered a bunch of informative resources connected to the field of VR audio.  The article I posted in 2017 took a general approach to the role that audio plays in Virtual Reality experiences.  Since we’re well into 2018, I thought we could benefit from expanding that topic to include the state-of-the-art in VR headset platforms.  Taking a look at the hardware platforms that are currently available should give us video game composers a better idea of the direction that VR audio is currently headed.

For one thing, VR is now broadly considered a part of a larger category that also includes AR (Augmented Reality) and MR (Mixed Reality) devices.  Those two categories are often considered synonymous, although that’s certainly debatable.  Since there’s no clear expert consensus at this point on what characteristics separate AR from MR, let’s just consider them as one category that we’ll call AR/MR for now.  In this article I’ll be focusing on resources that are specific to each of the competing platforms in VR and AR/MR.

Let’s get started!

Audio for VR and AR/MR devices

A wide variety of head-mounted devices now exist that can immerse us in imaginary worlds, or bring fantastic creatures to life in our living rooms.  While many of these devices share common underlying technologies in regards to audio creation and implementation, there are differing tools and techniques that apply to each of them.  I’ve included links in the discussion below that may be helpful in understanding how these technologies differ.

When virtual acoustics meets actual acoustics

The newly-released Magic Leap One is an AR/MR device.  This means that it allows the wearer to see the real world, while superimposing digital images that seem to exist in reality, and not just within the device.  For instance, an AR/MR device can make us think that a miniature toy dinosaur is toddling across our coffee table.  With this in mind, creating audio for AR/MR becomes a little tricky.

For instance, let’s say that we want our tiny dinosaur to emit a ferociously-adorable little roar as he climbs on top of our coffee table books.  That sound won’t be convincing if it doesn’t seem to be happening inside our actual living room, with its unique acoustical properties.  The real-life room has to be mapped, and acoustic calculations have to be factored in.  This isn’t an issue when developing sound for virtual reality, since the sound sources emit within an environment that exists completely within the virtual world.
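
To make “acoustic calculations” slightly more concrete, one of the simplest room-acoustics estimates is Sabine’s reverberation-time formula, RT60 = 0.161 · V / A, where V is the room volume in cubic meters and A is the total absorption (surface area multiplied by an absorption coefficient).  The sketch below is only a back-of-the-envelope illustration of the kind of quantity an AR/MR engine would need to derive from a mapped room; it is not a description of how Magic Leap’s Soundfield Audio actually works.

```python
def sabine_rt60(length_m, width_m, height_m, absorption_coeff=0.3):
    """Estimate reverberation time (seconds) for a rectangular room using
    Sabine's formula: RT60 = 0.161 * V / A, where V is volume (m^3) and
    A is total absorption (surface area in m^2 times absorption coefficient).
    A single average absorption coefficient is assumed for every surface."""
    volume = length_m * width_m * height_m
    surface_area = 2 * (length_m * width_m +
                        length_m * height_m +
                        width_m * height_m)
    absorption = surface_area * absorption_coeff
    return 0.161 * volume / absorption

# A soft, furnished living room versus a hard-surfaced one: the same virtual
# dinosaur roar would need noticeably different reverb tails to sound "real."
print(f"soft room: {sabine_rt60(5, 4, 2.5, absorption_coeff=0.4):.2f} s")
print(f"hard room: {sabine_rt60(5, 4, 2.5, absorption_coeff=0.1):.2f} s")
```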

It’s a fascinating problem, and one that the Magic Leap folks have considered seriously, using a system they’ve dubbed ‘Soundfield Audio’ to apply physics calculations that can produce appropriate acoustics based on the environment.  They’ve also patented a spatial audio technology that uses the wearer’s head movements to calculate the position of virtual sound sources.  Here’s a video that shows off a video game music visualization application for Magic Leap called Tónandi:

Sigur Rós in collaboration with Magic Leap Studios | Tónandi - YouTube

The Hololens is also an AR/MR device, and therefore faces a lot of the same issues as the Magic Leap One.  To address these, Hololens uses a spatial audio engine that calculates the position of sound-emitting sources combined with personalized Head Related Transfer Functions or HRTFs (a concept we discussed in an article from 2015).  These HRTFs help to localize all the aural components of the virtual soundscape.  In addition, the Hololens creates a room model to match the user’s location so that sounds seem to reflect from real-life walls and travel convincingly to the player’s ears.  We should expect this technology to improve when Microsoft releases their next generation of Hololens early next year.  Here’s a video produced by Engadget that goes into more detail about the audio experience delivered by Hololens:

3D audio is the secret to Hololens' convincing holograms - YouTube

Spatial sound for mixed reality

While we’re waiting for the next generation of Hololens to be released, Microsoft has been keeping busy in the traditional VR space with its Windows Mixed Reality platform, which allows third party equipment manufacturers to create VR headsets based on its existing VR reference designs and software.  While the Mixed Reality platform shares common elements with the Hololens, the VR devices under the Windows Mixed Reality banner offer standard VR experiences, without any AR/MR elements.  Both the Hololens and the Windows Mixed Reality devices use the Spatial Sound software development kit for the design and implementation of positional audio.  This allows audio developers to create soundscapes for a large number of devices using the same tools.  While the convenience factor is certainly attractive, Hololens and Windows Mixed Reality offer very different experiences, so audio developers will certainly need to keep that in mind.  Here’s a short video that reviews the capabilities of the Spatial Sound SDK:

Microsoft HoloLens: Spatial Sound - YouTube

Positional audio inside the virtual machine

Now let’s move on to discuss what’s happening with the current VR devices.  As we know, unlike an AR/MR headset, a VR device cuts us off completely from the outside world and plunges us into an environment existing entirely within the machine.  There is currently a healthy and varied crop of VR devices from which to choose.  The two most popular and famous VR headsets are the Oculus Rift and the HTC Vive.  Both devices rely on positional audio technologies to deliver great aural experiences, and each company has worked diligently to improve the technology over time.  In June 2018, HTC introduced a new Software Development Kit (SDK) for immersive audio on the Vive.  The new SDK allows for more sophisticated audio technologies like higher-order ambisonics, higher-resolution audio, more refined spatial acoustics (pictured right), and HRTFs based on refined real-world models to improve the accuracy of positional audio.

Oculus has upgraded its existing audio SDK to improve the positional accuracy of sounds emitting very close to the player (a technology they call Near-Field HRTF, or Near-Field Head-Related Transfer Function).  They have also provided the option of implementing sounds that originate from large sources (such as an ocean, for instance, or a forest fire).  Using the Volumetric Sound Sources technology, large sound-emitting objects can project their aural content across an assigned radius consistent with their scale.  Here’s a video from the Oculus Connect 4 conference, demonstrating the Near-Field HRTF and Volumetric Sound Sources capabilities of the Oculus audio SDK:

Oculus Connect 4 | Breakthroughs In Spatial Audio Technologies - YouTube
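
The idea behind a volumetric (large-radius) sound source can be illustrated with a very simple attenuation rule: inside the source’s assigned radius the listener is effectively enveloped by the sound, so gain stays at full level, while outside that radius the level rolls off with distance measured from the edge of the radius rather than from a single point.  The toy model below is a hedged sketch of that concept, not the actual Volumetric Sound Sources algorithm from the Oculus audio SDK.

```python
def volumetric_gain(distance_m, source_radius_m, rolloff_ref_m=1.0):
    """Toy gain model for a large sound source with an assigned radius.

    Inside the radius the listener is enveloped by the source, so gain is 1.
    Outside, a standard inverse-distance rolloff is applied, but measured
    from the edge of the radius instead of from a point at the center."""
    if distance_m <= source_radius_m:
        return 1.0
    effective_distance = distance_m - source_radius_m
    return rolloff_ref_m / max(rolloff_ref_m, effective_distance)

# An "ocean" with a 20 m radius stays at full level until the listener walks
# well past its edge; a point source at the center would already be far quieter.
for d in (5, 20, 25, 40, 120):
    print(f"{d:4d} m -> gain {volumetric_gain(d, source_radius_m=20.0):.3f}")
```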

The PlayStation VR, as the only console-specific VR device, does not share the same market with such devices as the Vive or the Rift and therefore is not faced with the same competitive pressures.  Nevertheless, improvements continue to be made to the PSVR’s technology.  The newest model of the PSVR (released late last year) is a revised version with small but valuable improvements.  Among the changes, Sony added built-in stereo headphones to the headset (pictured right), removing the need for players to hook up separate headphones in order to experience VR audio.

Standalone VR audio

Now let’s take a quick look at the standalone VR devices (i.e. those devices that don’t need to be hooked up to a computer or console, and don’t need a mobile phone installed in order to work).  These VR headsets offer untethered, cable-free virtual reality exploration, but they’re also usually a bit less powerful and full-featured.  The five best-known standalone headsets are the Oculus Go, the Oculus Quest, the Lenovo Mirage Solo, the HTC Vive Focus, and the Shadow VR.

The Oculus Go and Lenovo Mirage Solo both hit retail this May.  The HTC Vive Focus and the Shadow VR both became available for consumers just this month. The Oculus Quest was recently announced and is expected to hit retail in spring 2019.  All five use a Qualcomm smartphone processor chip from the Snapdragon line, so in that respect they’ve essentially adopted the internal mechanism of a high-end mobile phone and simply incorporated it into their on-board hardware.  In fact, the Qualcomm Snapdragon 835 (used in the Lenovo, Vive, Oculus Quest and Shadow VR devices) is also the same chip that’s at the heart of the Samsung Galaxy S8, the Google Pixel 2, and many other smartphone models.  Since all five untethered VR devices use the Snapdragon technology, developers can choose to avail themselves of the Qualcomm..
