FamJam

Role: Lead Developer

When: Sep 2023 – Ongoing

FamJam is my passion project. I am developing a Digital Audio Workstation (DAW) plug-in for UE5. The goal is to create intuitive and grounded tools for musicians to make interactive music. These interactive tracks would then be grafted onto animations and mechanics.

Showcasing:

  • Unreal Engine 5

  • Clean and Intuitive Architecture

  • Iterative User Testing

  • Artist-Centered Development

  • Synced Audio Queuing

  • UE5 MetaSounds

Demo

Inspiration:

Over the past several years I’ve fallen in love with Toronto. There are so many communities filled with healthy, supportive people trying to make places where we can all gather. I’ve had so many good experiences, from jam sessions at venues to no-tech walks to soundbaths to renovating Bampot Tea House; Toronto has welcomed me. There is so much to take from this city, and I’m lucky enough to feel comfortable on stage or in the crowd. That got me thinking: I love making music. I’ve taken piano lessons since I was four, and I can have as much fun reading sheet music as just making something up. I get to make music. I’m lucky.

Videogame music mechanics fit pretty well into two categories: rhythm-accuracy systems (e.g. Guitar Hero, Beat Saber) and playable instruments (e.g. the voice in Wandersong, the flute in Animal Well, the guitar in The Last of Us 2). Rhythm-accuracy mechanics give you the experience of flowing through a fully fleshed-out track, but there is no player expression. On the other hand, putting a playable instrument into a game still means the player needs musical experience. So how do we resolve the dilemma between player expression and experience? With the same practices we should always rely on: limiting player choices to the ones they’ll likely enjoy, testing our assumptions, and iterating where we’re wrong.

As the idea for these interactive music mechanics took shape, I began to see that what I was developing was essentially a dialogue tree, but one whose assets would be made by musicians instead of writers. The two most influential sources I pulled from were Campo Santo’s tool post-mortems and Ahoy’s coverage of 16-bit-era trackers. The natural conversations of Campo Santo’s Firewatch are built out of the team’s prioritization of giving the player timed, organic choices with accompanying performances. The history of 16-bit-era trackers let me see which artistic roadblocks have gotten in the way of developing innovative software for musicians.

With these ideas, I started development.

Development:

The standard that I set for myself was this:

The architecture will be accessible to anyone. We will prioritize the intuitiveness and simplicity of concepts and data interfaces.

The main way I’ve embraced it is by talking to my target audience (musicians) and listening. I listen for insights, for grievances, and, most importantly, for confusion. Through these interviews I’ve begun to understand the scope of the project. On one hand, I’m making a DAW. On the other, I’m making a new tool to solve a problem my audience is already thinking about, a process that will likely lead to a new music production pipeline. So at this stage, it’s all about building flexible prototypes: something a musician can experiment with while someone with training operates the tool.

The project can queue sound assets into finite music phrases that are edited in the main data table. Players are assigned sound assets, which are held until the player presses an input on their controller in time with the greater song. These finite queued phrases can be played in any order and can be triggered globally.
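
To make this concrete, here is a minimal sketch (in UE5 C++) of how a phrase row and a quantized input queue could be laid out. The names FMusicPhraseRow and UFamJamPhraseQueue, and every field on them, are illustrative assumptions for this write-up, not the project’s actual classes.

    // Hypothetical header "FamJamPhraseQueue.h"; names and fields are assumptions.
    #pragma once

    #include "CoreMinimal.h"
    #include "Engine/DataTable.h"
    #include "Sound/SoundBase.h"
    #include "FamJamPhraseQueue.generated.h"

    // One row of the main data table: the ordered sound assets that make up
    // one finite music phrase.
    USTRUCT(BlueprintType)
    struct FMusicPhraseRow : public FTableRowBase
    {
        GENERATED_BODY()

        // Sound assets played back in order when this phrase is triggered.
        UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "FamJam")
        TArray<TObjectPtr<USoundBase>> PhraseSounds;

        // Phrase length in beats, used to quantize player input to the song.
        UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "FamJam")
        int32 LengthInBeats = 4;
    };

    // Holds a player's assigned sound until their input can land on the beat.
    UCLASS(BlueprintType)
    class UFamJamPhraseQueue : public UObject
    {
        GENERATED_BODY()

    public:
        // Called when the player presses their input; the sound is held
        // rather than played immediately.
        UFUNCTION(BlueprintCallable, Category = "FamJam")
        void QueueSound(USoundBase* Sound) { PendingSound = Sound; }

        // Called by a conductor on each beat; releases whatever was queued
        // so playback stays in time with the greater song.
        UFUNCTION(BlueprintCallable, Category = "FamJam")
        USoundBase* ConsumeOnBeat()
        {
            USoundBase* Released = PendingSound;
            PendingSound = nullptr;
            return Released;
        }

    private:
        UPROPERTY()
        TObjectPtr<USoundBase> PendingSound = nullptr;
    };

A conductor synced to the song’s tempo would call ConsumeOnBeat each beat, so a held input always lands on the grid no matter when the player actually pressed.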

Currently, I’m auditing audio sync through waveform analysis; the fidelity of sound execution is foundational for a rhythm game, after all. Beyond that, future debugging efforts will benefit from a waveform-analysis pipeline, which is why it’s my highest priority.
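
As a rough illustration of what that audit looks like (a sketch only, with hypothetical names, not the project’s actual tooling), a brute-force cross-correlation between a reference click and a captured render reports the sample offset at which playback actually landed:

    // Sketch: find the lag (in samples) at which the captured render best
    // matches the reference click. Positive values mean the capture is late.
    #include <cstddef>
    #include <limits>
    #include <vector>

    long BestLagSamples(const std::vector<float>& Reference,
                        const std::vector<float>& Capture,
                        long MaxLag)
    {
        long BestLag = 0;
        double BestScore = -std::numeric_limits<double>::infinity();
        for (long Lag = -MaxLag; Lag <= MaxLag; ++Lag)
        {
            // Correlate the reference against the capture shifted by Lag.
            double Score = 0.0;
            for (std::size_t i = 0; i < Reference.size(); ++i)
            {
                const long j = static_cast<long>(i) + Lag;
                if (j >= 0 && j < static_cast<long>(Capture.size()))
                {
                    Score += Reference[i] * Capture[j];
                }
            }
            if (Score > BestScore)
            {
                BestScore = Score;
                BestLag = Lag;
            }
        }
        return BestLag;
    }

At 48 kHz, a best lag of 96 samples would mean the sound fired roughly 2 ms late, which is exactly the kind of number this audit is meant to surface.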

In the future, I anticipate building a better 2D UI to reduce the latency between inspiration and execution. In tandem, I hope to work out how I’ll implement MIDI support. Both features will streamline getting this tool into artists’ hands, because this whole process is about feedback.

Explore The Project

  • Source Code

    Explore The Public Repository