ARTS 2963: Designing Musical Games

Arts Department, Rensselaer Polytechnic Institute

Monday/Thursday, 12:00-1:50pm, VAST Lab, SAGE

Instructor: Rob Hamilton, West Hall 114

e: hamilr4 [at] rpi [dot] edu


Office Hours: Wednesdays 12:00 - 2:00, West Hall 114





One of the most exciting areas of music technology development is happening in the realm of gaming and interactive virtual space. Music and sound design play crucial roles in shaping gaming environments, narratives, and flow. And as designers create ever more innovative game experiences featuring rich graphics, fast multiplayer networking, and next-generation controllers, new techniques for creating immersive music and sound to complement and showcase these advances become not only possible but necessary.

This studio class will explore cutting-edge techniques for building interactive sound and music systems for games and 2D/3D rendered environments. To better understand the link between virtual space and sound, students will learn the basics of designing sound and composing music for interactive game spaces by designing and implementing rich musical games within the Unity gaming engine. Coursework will require the ability and desire to code game logic and design game environments. Techniques for integrating sound and music within games will all be explored, including game-centric middleware tools like FMOD and Wwise, interactive sound synthesis, and computer networking using Open Sound Control.

Working in teams or on their own, students will design a music-rich game experience: composing music, designing sound material, and implementing their own playable musical games.


Students will explore the artistic role of music and sound in gaming by building their own interactive sound and music-rich games and 2D/3D rendered environments. Within the context of their own creative game projects, students will learn the basics of designing sound and composing music for interactive game spaces. Using workflow programming languages and software tools, students will program basic gaming interactions, link them to interactive audio software, and create a musical gaming experience.


    Students who successfully complete this course will demonstrate...
  1. an understanding and appreciation of game design and musical creation through an awareness of the many disciplines underlying the field, including software design and programming, interaction design, listening skills, music theory, musical acoustics, digital audio theory, and digital signal processing.
  2. basic technical facility in the areas of game development, audio recording, editing, sound synthesis, software development and post production.
  3. creativity and resourcefulness through the creation of musical game environments and the composition of your own sonic projects.


Evaluation is based on the following:


You will be required to present some of your assignments to the class, to show your work within the software environment you used to create it, and to engage the class in discussion of your work. When you are not presenting your own work, you need to be attentive to whoever is presenting, and to engage them in discussion of their work. Failure to participate in class will lower your grade.


You must attend class to succeed in this course.

  1. Since much of the class is focused on listening to and discussing work in class, attendance is mandatory.
  2. **More than two unexcused absences will affect your grade: each two additional unexcused absences will lower your final grade by one half letter grade.**
  3. Absences can only be excused by a letter from a medical doctor or from the Dean of Students' office.
  4. Late arrivals are very disruptive; continued late arrivals will affect your grade.
  5. It should go without saying, but use of mobile devices or personal computers during class time (except as required by the coursework itself) is not acceptable. Continued violations will be treated as unexcused absences.


Collaboration between students in this course is strongly encouraged. Likewise, students are encouraged—indeed, to some extent required—to exchange ideas, opinions and information. You are also encouraged to help each other in the lab and with performance, production, and presentation of composition projects.

Plagiarism of any kind is in direct violation of University policy on Academic Dishonesty as defined in the Rensselaer Handbook, and penalties for plagiarism can be severe. In this class you will be expected to attribute due credit to the originator of any ideas, words, sounds, or music which you incorporate substantially into your own work. This applies particularly to citation of sources for sonic "samples" included in your compositions.

Submission of any assignment that is in violation of this policy may result in a grade of F for the assignment in question. Violation of this policy will be reported, as defined in the Rensselaer Handbook.


Students requiring assistance are encouraged to contact Disability Services to discuss any special accommodations or needs for this course.


Office Hours for Fall 2019 will be: TBD


The proposed course topics and schedule will be as follows (take note of project due dates!). Based on class progress and interests, this schedule is subject to change. Special topics, guest lectures, supplemental reading, listening and additional assignments to be announced.

Week 1:
Thursday, 8/29
Introduction: Designing Musical Games :: Gaming Musical Design
- The role of sound and music in games; filmic influences; spectra of interactivity
Week 2:
Tuesday, 9/3

Judas Priest's "Painkiller" on Rocksmith: video 1, video 2

Sounds in Game Space: 1 (Overwatch), 2 (Limbo), 3 (Dark Castle)
Samples: N64 Samples (Reddit), Drum samples (ChucK)
Audacity: editing and processing sound files (download)

Thursday, 9/5
Introduction to Unity: tutorials
Roll-A-Ball demo project (tutorial, project)

Video: Digital Foley: Mortal Kombat Sound Design
Week 3:
Monday, 9/9
Audacity: normalization, editing

In Class/At Home: Sound Design/Editing Challenge
Your challenge, should you choose to accept it, is to creatively edit an entire game audio scene out of your recorded samples.
    NOTE: in case of emergency, select one sound file and find a single 30-second block of sound from that file
    Using Audacity, create as many of the following sounds as possible:
      Shooting sound
      Rolling ball
      Large object impact
      Small object impact
      Short looping musical sequence (pitched "pad" sounds)
      Low drum (kick)
      High-pitched drum (tom)
      Sharp attack drum (snare)
      Motor idling
      Motor revving
      Space-ship flying
      HUD interface sounds (button presses, toggles, sliders)
      Player Death
      Player Spawn
      Game Victory
      Game Failure
Thursday, 9/12
Basics of sound in Unity: AudioClip, AudioSource, and AudioListener documentation
Videos: Audio Listeners and Sources
Roll-a-ball with audio collision + pitch and volume randomness, tutorial
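As a sketch of the collision-plus-randomness idea above (an illustration, not code from the course tutorial — the class name and ranges are invented), a Unity C# script along these lines plays an impact clip on each collision, with pitch and volume varied slightly so repeated hits don't sound identical:

```csharp
using UnityEngine;

// Illustrative sketch: randomize pitch and volume on each collision.
// Attach to a GameObject that has an AudioSource component.
public class CollisionSound : MonoBehaviour
{
    public AudioClip impactClip;   // assign a clip in the Inspector

    void OnCollisionEnter(Collision collision)
    {
        AudioSource source = GetComponent<AudioSource>();
        source.pitch = Random.Range(0.9f, 1.1f);                   // +/- 10% pitch variation
        source.PlayOneShot(impactClip, Random.Range(0.7f, 1.0f));  // randomized volume scale
    }
}
```

The pitch range and volume scale here are arbitrary starting points; tune them by ear for your scene.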

Assignment #1: Sample-based game audio scene (Due Thursday, 9/26)

Using your favorite samples from the in-class recording challenge, populate a simple game scene with your sounds. You can make use of the scenes shown in class (Roll-a-Ball, Simple Shooter), Unity example scenes or a game scene of your own making.

The success of your scene will revolve around whether or not it feels coherent, i.e., whether all the sounds exist in the same space. Coherence can be helped by creative use of reverberation in game, by careful editing of sounds in Audacity, or simply through artistic choices you make in your role as sound designer.
In addition to the project itself, create a detailed "call-sheet" listing all the sounds you will use/are using in the project, including variations.
Please include examples of the following Unity audio processes in your scene:

    * Audio Source (multiple)
    * Mixer Groups
    * Basic C# scripting of audio playback events
    * Varying pitch, amplitude, etc.
    * 3D sound/distance-based attenuation
    * Sounds triggered through collision
    * A looping background "musical" track
    * EXTRA-CREDIT: include MIDI and/or Microphone input
Week 4:
Monday, 9/16
Week 5:
Monday, 9/23

Thursday, 9/26
Assignment #1 Due: In-class Project Presentations
Week 6:
Monday, 9/30
Pure Data: Introduction to PD (hands-on):

- basic operation, synthesis, subpatches, abstractions, externals, logic
Pure Data deep dive: osc~/cosine, phasor~/sawtooth wave, square wave, note envelopes (line, line~), note sequences, karplus-strong
VR Musical Instruments: Coretet

Thursday, 10/3
Week 7:
Monday, 10/7
Open Sound Control: specification, namespaces
OSCsharp C# classes, integration with Unity
PD + Unity: Roll-A-Ball OSC demo project (download)
Bi-directional OSC PD Patch: simple_pd_receiver.pd
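Under the hood, an OSC message is just a byte layout defined by the OSC 1.0 specification: a NUL-padded address pattern, a NUL-padded type-tag string, then big-endian arguments. A minimal plain-C# encoder (an illustration of the wire format only — in practice OSCsharp handles this for you, and the address /ball/pos is just an example) might look like:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Minimal sketch of the OSC 1.0 wire format for float-only messages.
public static class OscEncoder
{
    // Pad an ASCII string with NULs to a multiple of 4 bytes (at least one NUL).
    static byte[] PadString(string s)
    {
        byte[] raw = Encoding.ASCII.GetBytes(s);
        int padded = (raw.Length / 4 + 1) * 4;
        Array.Resize(ref raw, padded);
        return raw;
    }

    // Encode a float as 4 big-endian bytes, as OSC requires.
    static byte[] BigEndianFloat(float f)
    {
        byte[] b = BitConverter.GetBytes(f);
        if (BitConverter.IsLittleEndian) Array.Reverse(b);
        return b;
    }

    public static byte[] Encode(string address, params float[] args)
    {
        var bytes = new List<byte>();
        bytes.AddRange(PadString(address));                         // e.g. "/ball/pos" -> 12 bytes
        bytes.AddRange(PadString("," + new string('f', args.Length))); // type tags, e.g. ",ff"
        foreach (float f in args) bytes.AddRange(BigEndianFloat(f));
        return bytes.ToArray();
    }
}
```

The resulting byte array can be sent over UDP (e.g. with System.Net.Sockets.UdpClient) to a Pd patch listening via [netreceive -u -b] into [oscparse].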

Sound-only games: Soundvoyager, Papa Sangre
Listening games: dotstream, Enemy-Zero

Mobile PD Interfaces: MobMuPlat, PdParty, git
RJDJ: RJ Voyager scene

Installing externals: link
Pure Data externals: cyclone/mousestate (install via package manager)

Assignment #2: Audio-Only game (Due 10/24 @ 10am)
For this assignment, build an interactive music game where the primary player interaction and gameplay is conveyed via sound and music. For this project your audio engine should be built within Pure Data.

The game can have no visible user-interface or a _very_ minimal user-interface; the choice is yours. What is important is that the information you wish to convey to your players is conveyed primarily via sound and music.

Your game can be built entirely in Pure Data or you can use Pure Data as the audio backend to a _minimal_ front end built in Unity, MobMuPlat, or any other OSC-capable interface.
Week 8:
Monday, 10/14
Week 9:
Monday, 10/21
In-Class Hack-day.
Week 10:
Monday, 10/28
ChucK Basics: syntax, resources, example code.
Week 11:
Monday, 11/4
ChucK: Open Sound Control How-to
GDC Talk: Martin Stig Andersen, "Inside: A Game That Listens"; Audio-driven Gameplay, Example 2
Overwatch WWise Tour 2016: Vol. 1/7 Audio in Overwatch, Vol. 5/7 Gameplay Information
Inside soundtrack samples:

Beat Detection Unity example project (right/left arrow to load each different beat-tracking audio scene)
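The beat-tracking idea in the example project can be approximated in a few lines (a naive sketch, not the demo project's actual method — the class name and thresholds are invented): sample the playing audio's output buffer, compute its RMS loudness, and flag a beat whenever loudness jumps well above a running average:

```csharp
using UnityEngine;

// Naive onset detector (illustrative only). Attach next to a playing AudioSource.
public class NaiveBeatDetector : MonoBehaviour
{
    float[] samples = new float[1024];
    float average;   // smoothed recent loudness

    void Update()
    {
        // Grab the most recent output samples from channel 0.
        AudioSource source = GetComponent<AudioSource>();
        source.GetOutputData(samples, 0);

        // Root-mean-square loudness of the buffer.
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // A "beat" is a loudness spike well above the recent average.
        if (rms > average * 1.5f && rms > 0.01f)
            Debug.Log("beat");

        average = Mathf.Lerp(average, rms, 0.1f);   // smooth the running average
    }
}
```

Real beat trackers work in the frequency domain and handle tempo explicitly; this energy-spike version is just enough to drive simple audio-reactive gameplay.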
Assignment #3: Integrated ChucK+Unity game (Due Monday, 11/18)
For this assignment, build an interactive and integrated music game using ChucK and Unity. The two environments should communicate with one another in a bi-directional fashion, either using Open Sound Control or Chunity.

The game should (as always) be "audio-first" where audio and/or musical sound are primary components and drivers of gameplay. Make sure there is data from your audio engine directly contributing to the Unity engine's action or reaction to player events.
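One way to wire the ChucK-to-Unity direction over OSC (a sketch under assumptions: the /beat address, port 9000, and the single-float message shape are all invented for illustration) is a Unity script that listens on a UDP socket and maps each incoming float onto a visual parameter:

```csharp
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Illustrative sketch: receive a one-float OSC message (e.g. "/beat")
// sent from ChucK and pulse this object's scale in response.
public class BeatReceiver : MonoBehaviour
{
    UdpClient udp;
    volatile float beat;   // last beat strength received (written off the main thread)

    void Start()
    {
        udp = new UdpClient(9000);          // ChucK sends OSC to this port
        udp.BeginReceive(OnPacket, null);
    }

    void OnPacket(System.IAsyncResult result)
    {
        IPEndPoint from = null;
        byte[] data = udp.EndReceive(result, ref from);
        // For an address + ",f" message, the float argument is the last 4 bytes,
        // big-endian per the OSC spec; reverse for a little-endian host.
        byte[] arg = { data[data.Length - 1], data[data.Length - 2],
                       data[data.Length - 3], data[data.Length - 4] };
        beat = System.BitConverter.ToSingle(arg, 0);
        udp.BeginReceive(OnPacket, null);   // keep listening
    }

    void Update()
    {
        transform.localScale = Vector3.one * (1f + beat);  // jump on each beat
        beat *= 0.9f;                                      // decay back to normal size
    }

    void OnDestroy() { udp.Close(); }
}
```

A library such as OSCsharp (or Chunity's built-in bridge) replaces the hand-rolled byte parsing here; the point is only that audio-engine data arrives as plain UDP bytes that Unity can act on every frame.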

Week 12:
Monday, 11/11
q3apd by Julian Oliver: site
Rotating Brains Beating Heart - Avatar Orchestra Metaverse: video
Canned Bits Mechanics by Juan Pablo Caceres, Rob Hamilton: video
Apollonian Gasket by Ignacio Pecino: video, paper
Singularity by Ignacio Pecino: video, demo (Firefox, Safari)
Kilgore by Marko Ciciliani: video

New Atlantis videos: twitch, video 1, video 2
project: New Atlantis
Week 13:
Monday, 11/18
ChucK Project Due
Week 14:
Monday, 11/25
Final Project Proposals Due: Submit a written one-page description of your proposed final project for this class. Your project may encompass any or all of the three primary sound and music techniques and technologies we have investigated during the course of this class, including fixed-media composition with Audacity or Pro Tools, interactive workflow programming with Pure Data, and text-based programming with ChucK. Your proposal should address both the integration (technical) and the mapping (artistic) of your data and control systems from game engine to audio layer, and vice versa. Your proposal should be clear, concise, and well written (typed) and should include both creative ideas and descriptions alongside the technical tools and concepts.
Week 15:
Monday, 12/2
Final Project Work
Week 16:
Monday, 12/9