Project
Live Band Karaoke Production Platform
I built an entire production platform from scratch to run live band karaoke shows. Click tracks, lyric videos, music videos, DMX stage lighting, audience signup, set list management — all synced, all from one app on my tablet.
The Problem
"We had a piece of paper."
About a month into joining Emo Night Live, it hit me: "Hey guys, if we ever actually play a show, how do we handle people signing up? And the karaoke lyrics? And all that?"
The bass player had done one show two years earlier, before the band broke up. Their system? Click tracks layered onto lyric videos, routed to in-ear monitors in some way nobody remembered, and his brother holding a piece of paper for signups. That was it.
I was like... dude. QR code signup. People scan a code, pick their song, it shows up on our end, I hit play, and we're going. That's where it started.
V1
Electron app, duct tape, one month.
Our first show was a month out and I had a full-time job, so speed was everything. I built an Electron app with all the songs listed. Click a song, it starts playing the lyric video and the click track.
Then the first venue had a projector. So I thought "man, it'd be cool to put the music video behind us." So the app got three buttons per song: one opens the lyric video window, one opens the music video window, one opens the click track window. I'd drag each window to a different screen — lyrics in front of the singer, music video behind the band — hit play, and they'd all start in sync.
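The sync trick was dead simple: one trigger, three windows. Here's a minimal sketch of the shape in Electron's main process (file names and the IPC channel are illustrative, not my actual code):

import { app, BrowserWindow, ipcMain } from 'electron';

// One window per output surface. Each page hosts a <video> or <audio>
// element and listens for a 'play' message from the main process.
function openPlaybackWindow(file: string): BrowserWindow {
  const win = new BrowserWindow();
  win.loadFile(file);
  return win;
}

app.whenReady().then(() => {
  const windows = [
    openPlaybackWindow('lyrics.html'),      // dragged in front of the singer
    openPlaybackWindow('music-video.html'), // dragged to the projector
    openPlaybackWindow('click.html'),       // audio routed to in-ears
  ];
  // One event fans out to all three renderers so they start together.
  ipcMain.on('play-all', () => {
    for (const win of windows) win.webContents.send('play');
  });
});

Each renderer just calls .play() on its media element when the message lands.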
For signups, I built our website with a QR code that went to the current show. People signed up online, I'd look at the queue, manually find the song in the Electron app, and play it. It worked. Barely.
V2
The ready check.
After that first show, a few things bugged me. I didn't want to look at the online signup list, then hunt for the song in the app and click it. That's annoying. I wanted to do everything from my tablet, right in front of me.
But the bigger problem was the thumbs up. See, we cover a ton of songs across like 8 different guitar tunings. Between songs there's a minute or two while people retune. I hated looking around at everyone trying to get a thumbs up that they were ready. It's dark, it's loud, you can't see anything.
So I built a Next.js app that ran on the same machine. If your device was on the same network as my laptop, you could hit my IP on port 3001 and log in. Each person's login showed: the next song, the singer's name, what tuning it's in, and one big-ass Ready button.
Tap it, and everyone else sees a green check next to your name. Don't tap it, red X. No more squinting across the stage for a thumbs up. The ready check became the source of truth.
On my login, I could also tap through the signup list, select a person which auto-selected their song, and hit play from my tablet. The whole show was now run from a tablet talking to a laptop over the local network.
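Under the hood, the ready check is tiny: one flag per person, broadcast to every connected device. A minimal sketch of the idea as a bare WebSocket server (the real V2 was a Next.js app; the message shape here is made up):

import { WebSocketServer, WebSocket } from 'ws';

const ready = new Map<string, boolean>(); // name -> ready for the next song?
const wss = new WebSocketServer({ port: 3001 });

wss.on('connection', (socket) => {
  // A new device gets the current state immediately.
  socket.send(JSON.stringify(Object.fromEntries(ready)));

  socket.on('message', (raw) => {
    // e.g. { "name": "Tony", "isReady": true } when someone taps the button
    const { name, isReady } = JSON.parse(raw.toString());
    ready.set(name, isReady);

    // Broadcast so everyone's green check / red X updates instantly.
    const state = JSON.stringify(Object.fromEntries(ready));
    for (const client of wss.clients) {
      if (client.readyState === WebSocket.OPEN) client.send(state);
    }
  });
});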
The Lighting Rabbit Hole
12 apps later, I just built my own.
Then I got interested in DMX stage lighting. I tried about 12 off-the-shelf lighting apps. They fell into two camps: timeline-based and macro-based. Timeline was like mapping out an entire song per fixture — perfect for us but insanely tedious. Macro was more for DJs who don't do the same songs every night.
I went with timeline. Spent 8 hours. Got 3 songs done. Then I did the math: full-time job, guitar practice, 4 kids, 80+ songs. This was impossible at that pace.
But now I understood DMX. It's just a big array of values. That's really easy to work with in software. And what I wanted the lights to do could be described pretty simply in a JSON object. So I built my own DMX controller in C — a USB-to-DMX box with a standard driver, a lighting configuration layer that knew which channels controlled what, and a timing engine that calculated changes based on BPM, measures, and time signatures.
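To show what I mean by a big array, here's a minimal sketch (the fixture addresses are made up, and the real controller is C, not TypeScript):

// One DMX universe: 512 channels, each a byte from 0 to 255.
const universe = new Uint8Array(512);

// A fixture is just a base address plus known channel offsets.
type Fixture = { base: number; red: number; green: number; blue: number };
const washLight: Fixture = { base: 10, red: 0, green: 1, blue: 2 };

function setColor(f: Fixture, r: number, g: number, b: number) {
  universe[f.base + f.red] = r;
  universe[f.base + f.green] = g;
  universe[f.base + f.blue] = b;
}

setColor(washLight, 255, 0, 128); // pink wash
// From here, the USB-to-DMX driver just streams `universe` out as frames.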
Each song gets a JSON file that describes the entire light show. Here's a snippet from "Cute Without the E" by Taking Back Sunday:
{
  "bpm": 196,
  "timeSignature": [4, 4],
  "delayMeasures": 8,
  "sections": [
    {
      "label": "verse",
      "bpm": 198,
      "measures": 16,
      "colorScheme": ["Purple", "Orange"],
      "colorShift": 4,
      "laserColor": "Off",
      "moving": {
        "colorScheme": ["Pink"],
        "shift": 1,
        "focalPoints": ["Tony", "Aaron"]
      }
    },
    {
      "label": "chorus",
      "bpm": 189,
      "measures": 16,
      "colorScheme": ["Red", "Orange"],
      "colorShift": 1,
      "laserColor": "Red",
      "moving": {
        "colorScheme": ["Yellow"],
        "shift": 2,
        "focalPoints": ["Singer", "Cross"]
      }
    }
  ]
}

The timing engine in C reads this and does all the math. colorShift is how many beats before the wash lights cycle to the next color: in the verse it's every 4 beats (once per measure, pretty chill), but the chorus drops to 1 (every single beat, intense). delayMeasures is how long the lights wait after the click track starts: 8 measures of count-in before anything fires.
The moving heads have named focal points — "Singer", "Tony", "Aaron", "Cross" — which are physical positions on stage. During the verse the movers track the guitarists. When the chorus hits, they snap to the singer and the center cross pattern. Lasers kick on for choruses and cut for everything else.
The whole song is just a list of these sections. The engine knows the BPM and time signature for each one, so it calculates exactly when every color change, every mover repositioning, every laser trigger happens in real time. All I had to do was trigger the light show at the same moment I pressed play on the click track and videos, and now we had fully synced lighting, video, and audio.
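Here's a sketch of the per-section arithmetic (again, the real engine is C; this TypeScript just shows the math):

type Section = {
  bpm: number;
  measures: number;
  colorScheme: string[];
  colorShift: number; // beats between wash color changes
};

function scheduleColorChanges(section: Section, beatsPerMeasure: number, startMs: number) {
  const msPerBeat = 60000 / section.bpm; // 198 BPM -> ~303 ms per beat
  const totalBeats = section.measures * beatsPerMeasure;
  const events: { atMs: number; color: string }[] = [];

  // Walk the section one colorShift at a time, cycling through the scheme.
  for (let beat = 0; beat < totalBeats; beat += section.colorShift) {
    const step = beat / section.colorShift;
    events.push({
      atMs: startMs + beat * msPerBeat,
      color: section.colorScheme[step % section.colorScheme.length],
    });
  }
  return events; // fired relative to the moment "play" is pressed
}

// delayMeasures is the same math: it just pushes the first section's startMs
// out by delayMeasures * beatsPerMeasure beats at the count-in BPM.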
V3 — Current
One app to run the whole show.
Everything got thrown into a single Electron app. It serves the tablet interface for ready checks, sends DMX signals for lighting, plays lyric videos and music videos and click tracks, and manages the signup queue.
The set list management got smarter too. We can curate on the fly — make sure we don't have the same singer twice in a row, give more people a chance to sing, and minimize tuning changes between songs so we can fit more music into the night.
One decision I made on day one: when a singer signs up, they can optionally subscribe to our email list. It's the only way to join it. Since we launched that, we've built a list of 300+ unique emails — all people who have actually performed with us. They know us. And it's been a huge driver for getting the word out when we have a show coming up. We've gotten bigger and bigger over the last few years.
See It Live
The system in action.
Underoath — A Boy Brushed Red Living in Black and White
No Doubt — Just a Girl
Stack
What's under the hood.
- Electron — main production app, video/audio playback, DMX control
- Next.js — tablet interface, ready check system, set list management
- C — custom DMX controller, USB-to-DMX driver integration
- JSON — lighting configuration per song, BPM/measure/time-signature aware
- QR code signup — audience-facing web app with optional email list capture
V4 — Under Development
What's next.
The JSON lighting system works great — but I can make it faster. And there's a bunch of features I've been dying to build.
Light Show GUI + Stage Visualizer
Right now I hand-write JSON for every song. It's not terrible, but a visual editor would make it way faster. More importantly, I want to preview light shows without plugging in physical fixtures. A graphical representation of our actual stage layout — moving heads, wash lights, lasers — so I can watch an animation of what the show looks like and tweak it from my couch.
Moving Head Patterns
Right now moving heads just snap to a focal point. I'm adding real patterns. Chase patterns — light 1 fires on beat 1, light 2 fires on beat 1.5, light 3 on beat 2, sequentially across the rig, all timed to the actual beat. Sweep patterns — slow movement from point A to point B, then stop. And variable speed for everything: maybe I want a mover to take a full measure to travel from the singer to center cross, or maybe I want it there instantly. Same for color animations — control how fast transitions happen, not just what changes.
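A rough sketch of the chase timing (fixture count and step size are just examples):

// Each fixture fires half a beat after the previous one, sweeping the rig.
function chaseOffsets(fixtureCount: number, bpm: number, stepBeats = 0.5) {
  const msPerBeat = 60000 / bpm;
  return Array.from({ length: fixtureCount }, (_, i) => ({
    fixture: i + 1,
    fireAtMs: i * stepBeats * msPerBeat, // fixture 1 on beat 1, fixture 2 on beat 1.5, ...
  }));
}

chaseOffsets(4, 196); // four movers chasing at 196 BPM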
Dynamic Lyric System
Our lyric videos are old school — actual video files. I want to replace them with JSON-driven, dynamically rendered lyrics served over a socket. If it's not a video but a live render, it can run on any device with a browser. As long as that device is listening to the socket for "play", it's in sync.
The grand vision: people who scan the QR code at a show can pull up what's playing right now and see the actual scrolling lyrics synced on their phone in real time. Any device, any screen, always in sync with the band.
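The core is just a shared start timestamp plus timed lines. A browser-side sketch of what I'm picturing (the URL and message shape are hypothetical):

type LyricLine = { atMs: number; text: string };

const socket = new WebSocket('ws://host:3001/lyrics'); // illustrative address
let startedAt = 0;
let lines: LyricLine[] = [];

socket.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'play') {
    startedAt = msg.startedAt; // the moment I hit play, per the server clock
    lines = msg.lines;         // the whole song's timed lyrics, up front
  }
};

// Crude stand-in for real rendering.
const render = (text: string) => { document.body.textContent = text; };

setInterval(() => {
  const elapsed = Date.now() - startedAt;
  const current = lines.filter((l) => l.atMs <= elapsed).pop();
  if (current) render(current.text);
}, 50);

A real version would also need to correct for clock skew between the phone and the laptop, but the idea holds: any device that knows when "play" happened can compute exactly where the song is.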
Smarter Set List Algorithm
The queue management needs to get way more autonomous. Right now I manually build the set list from signups roughly every 45 minutes when I'm not playing. People don't realize that — they think it's broken and sign up 20 times for the same song.
V4 gets a real algorithm: never let the same singer go twice in a row, pad repeat singers 6–10 songs apart, always prioritize new singers, alert people when a song is already taken so they pick something else, and give better real-time feedback on the signup page so people know their request went through and where they are in the queue.
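The core pick is a greedy filter over the queue. A sketch of what I have in mind (the signup shape is a placeholder):

type Signup = { singer: string; song: string; submittedAt: number };

const MIN_GAP = 6; // repeat singers wait at least this many songs

function nextSong(queue: Signup[], history: Signup[]): Signup | undefined {
  const lastSinger = history[history.length - 1]?.singer;
  const sungCount = (s: string) => history.filter((h) => h.singer === s).length;
  const lastSlot = (s: string) => history.map((h) => h.singer).lastIndexOf(s);

  const eligible = queue.filter(
    (s) =>
      s.singer !== lastSinger && // never twice in a row
      (sungCount(s.singer) === 0 || history.length - lastSlot(s.singer) >= MIN_GAP)
  );

  // New singers first, then first come, first served.
  eligible.sort(
    (a, b) => sungCount(a.singer) - sungCount(b.singer) || a.submittedAt - b.submittedAt
  );
  return eligible[0];
}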
Backing Tracks + Live DAW (Maybe V5)
We've started adopting songs that need synths or piano. Right now those are AI-extracted stems as MP3s — and they sound a bit garbled. I want to build proper backing tracks quickly and have the click and backing tracks live in a DAW rather than static files. Real-time effects based on the venue, volume normalization, always in sync. This might be its own version though — I'm not quite ready to build a DAW yet.
Want One?
I build custom live event production software.
If you run live karaoke shows, live band events, or any performance that needs synchronized lighting, video, audio, and audience interaction — I've already solved these problems. Custom karaoke systems, DMX lighting automation, real-time audience signup platforms, click track synchronization — this is what I do for fun. Imagine what I'd build for you.
Get in touch and let's talk about what you need.