
harmoniq

1 Overview

1.1 What is Harmoniq?

Harmoniq is a tool for aspiring pianists who want to improvise, compose, and familiarize themselves with music theory in an interactive, synergistic way. It's kind of like auto-complete, but for chords. Many composers, cover artists, and improvisers rely heavily on playing by ear. However, they hit a bottleneck when trying to:

  • Reproduce harmonic combinations they imagine
  • Understand or recall what they just played
  • Explore "better-sounding alternatives" grounded in music theory and in what the community of artists on YouTube has created. There is a huge, passionate community of composers and aspiring pianists on YouTube making amazing compositions, and they are the target users for Harmoniq.

People talk about hand-eye coordination; this tool helps with ear-music coordination. It is aimed at beginner-to-intermediate composers: people who have not yet trained that ear-chord muscle or are not super familiar with music theory names. Harmoniq helps you reproduce all the chords you are playing. Even more experienced users can get something out of it.

1.2 How to Use Harmoniq?

You begin your session by clicking START, which activates the microphone and starts listening for incoming music. In the meantime, run python live_chord_recognizer.py to start the chord recognizer and python live_chord_progression.py to start the chord progression analyzer.

A real-time music assistant (implemented in live_chord_detector.py) listens to your piano via audio, detects and visualizes chords as you play (a demo of this is shown in live_chord_progression.py), and figures out the chords you are playing in each time window (I have set it to every ~2 seconds).

When the session ends, a Session Report is generated that displays the chords you played over the session (it only includes chords it is more than ~80% confident in).
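The confidence filter can be sketched like this. This is only an illustration of the idea described above; the `detections` structure, the function name, and the repeat-collapsing step are my assumptions, not necessarily what live_chord_recognizer.py does.

```python
# Hypothetical sketch: build a Session Report by keeping only chord
# detections above the ~80% confidence threshold mentioned above.
CONFIDENCE_THRESHOLD = 0.8

def build_session_report(detections):
    """detections: list of (timestamp_sec, chord_name, confidence)."""
    report = []
    for t, chord, conf in detections:
        if conf >= CONFIDENCE_THRESHOLD:
            # Collapse consecutive repeats so the report reads as a progression.
            if not report or report[-1][1] != chord:
                report.append((t, chord))
    return report

detections = [
    (0.0, "Cmaj7", 0.93),
    (2.0, "Cmaj7", 0.88),  # repeat of the previous chord, collapsed
    (4.0, "Am7", 0.55),    # too uncertain, dropped
    (6.0, "Fmaj7", 0.91),
]
print(build_session_report(detections))  # [(0.0, 'Cmaj7'), (6.0, 'Fmaj7')]
```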

I have also implemented a key signature classifier in live_chord_progression.py. Because we have a best guess of the key signature based on what you played in the session, the harmonic progression can also be displayed as Roman numerals.
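Once a key has been guessed, the Roman-numeral conversion is mostly arithmetic on scale degrees. Here is a minimal sketch assuming a major key and plain major/minor chord names; the real live_chord_progression.py may handle more chord qualities and minor keys.

```python
# Hedged sketch of Roman-numeral conversion for a known major key.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets of the scale degrees
NUMERALS = ["I", "II", "III", "IV", "V", "VI", "VII"]

def to_roman(chord, key):
    """chord like 'Am' or 'F'; key like 'C' (major key assumed)."""
    root = chord[:2] if len(chord) > 1 and chord[1] == "#" else chord[0]
    quality = chord[len(root):]
    interval = (NOTES.index(root) - NOTES.index(key)) % 12
    if interval not in MAJOR_SCALE:
        return "?"                      # chromatic chord, outside the key
    numeral = NUMERALS[MAJOR_SCALE.index(interval)]
    # Lowercase numerals for minor chords, per the usual convention.
    return numeral.lower() if quality.startswith("m") else numeral

print([to_roman(c, "C") for c in ["C", "G", "Am", "F"]])  # ['I', 'V', 'vi', 'IV']
```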

Future implementations could also cover other dimensions of music, such as harmonic tendencies over time and voice arrangement (bass, tenor, alto). The algorithm is not perfect yet, so right now it is more useful as a display than as something 100% accurate.

A "favorites" list can be saved, where the musical dimensions of those songs are clearly displayed and recommended as you play. This can be stored in a simple database. For example, Die with a Smile by Bruno Mars follows Imaj7 -> IVmaj7 -> Imaj7 -> iii7.
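For the "simple database", the standard library's sqlite3 would be enough. The schema below is only one possible shape for a favorites table, using the example song above.

```python
# Illustrative favorites store using sqlite3 (standard library).
import sqlite3

conn = sqlite3.connect(":memory:")  # swap for a file path in the real app
conn.execute("""CREATE TABLE favorites (
    title TEXT, artist TEXT, progression TEXT)""")
conn.execute(
    "INSERT INTO favorites VALUES (?, ?, ?)",
    ("Die with a Smile", "Bruno Mars", "Imaj7 -> IVmaj7 -> Imaj7 -> iii7"),
)
row = conn.execute(
    "SELECT progression FROM favorites WHERE title = ?", ("Die with a Smile",)
).fetchone()
print(row[0])  # Imaj7 -> IVmaj7 -> Imaj7 -> iii7
```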

Because we have both the Session Report and the favorites list, a cool feature idea would be to cross-reference the two and display something like "Bruno Mars would often use the progression Imaj7 -> IVmaj7!", which would ground your compositions in your favorite artists.
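The cross-referencing idea could start as a simple subsequence match between the session's Roman-numeral progression and each favorite's progression. This sketch is my own guess at a first version; the data shapes and message format are assumptions.

```python
# Hypothetical sketch: flag favorites whose progression appears
# contiguously inside the session's progression.
def find_overlaps(session, favorites):
    """session: list of numerals; favorites: {artist: list of numerals}."""
    hits = []
    for artist, prog in favorites.items():
        n = len(prog)
        for i in range(len(session) - n + 1):
            if session[i:i + n] == prog:
                hits.append(f"{artist} would often use {' -> '.join(prog)}!")
                break
    return hits

session = ["Imaj7", "IVmaj7", "Imaj7", "iii7", "vi7"]
favorites = {"Bruno Mars": ["Imaj7", "IVmaj7"]}
print(find_overlaps(session, favorites))
# ['Bruno Mars would often use Imaj7 -> IVmaj7!']
```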

As for the front end, I envision a mic button labeled "START SESSION" or something like that. When the session starts, the information described above is what gets shown (adapted per UX principles so it displays well in the front end).

2 Application

Tech Stack

(Tech stack diagram)

Initially I wanted to create a web interface. But when testing during actual piano playing sessions, it is not ergonomic to bring my laptop on top of the piano every time I want to have a composition session. Because of this, I pivoted to a mobile app. MIDI was also part of the initial plan, but a microphone-based system is more powerful for actual performers, since having to lug a MIDI keyboard around everywhere isn't as accessible.

Front End

AI generated application homepage concepts:

Homepage mockup

Homepage mockup 2

I want the design to be cute. Any designers are more than welcome to contribute to this application.

After contemplating between writing this project in React Native or Flutter, I have decided to use Flutter: it is better suited for MVPs, it has growing support for low-latency real-time audio, and its widget system is perfect for custom, expressive UI such as chord graphs and timelines. It's also a strong cross-platform choice, as it can run on iOS, Android, desktop, and web.

If you want to run the current web version, go to the web folder and follow the instructions in the README there.

3 Algorithms

  • live_chord_recognizer.py - the base technology. It converts the music in a session, in real time, into chord predictions (one every second), each with a confidence score.
  • live_chord_progression.py - extends the chord recognizer to display the harmonic progression at the end of the session.

3.1 How does Chord Recognizer work?
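This section is still to be filled in, so here is a minimal sketch, under my own assumptions, of how chroma-template chord recognition with dot-product similarity (the approach summarized in the bullet list under 3.2) typically works. The real live_chord_recognizer.py gets its chroma vectors from sounddevice + librosa; the template construction and scoring below are illustrative.

```python
# Sketch: match a 12-bin chroma vector against binary triad templates.
import numpy as np

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def chord_templates():
    """12-dim normalized binary templates for major and minor triads."""
    templates = {}
    for i, root in enumerate(NOTES):
        for name, intervals in (("", [0, 4, 7]), ("m", [0, 3, 7])):
            v = np.zeros(12)
            v[[(i + k) % 12 for k in intervals]] = 1.0
            templates[root + name] = v / np.linalg.norm(v)
    return templates

def recognize(chroma):
    """chroma: 12-dim vector (e.g. mean of librosa.feature.chroma_cqt)."""
    chroma = chroma / (np.linalg.norm(chroma) + 1e-9)
    scores = {name: float(chroma @ t) for name, t in chord_templates().items()}
    best = max(scores, key=scores.get)
    return best, scores[best]  # chord name and a confidence-like score

# A clean C major chroma (energy on C, E, G) should match "C".
chroma = np.zeros(12)
chroma[[0, 4, 7]] = 1.0
print(recognize(chroma))
```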

3.2 How does Chord Progression work?

First, we need to know the base key signature, so an algorithm is written to figure it out. From there, we take the chords in the session that have higher than 80% confidence and include them in the final harmonic progression displayed in the application.

  • Real-time audio processing via sounddevice + librosa
  • Chroma vector extraction using librosa.feature.chroma_cqt()
  • Chord matching algorithm with dot product similarity scoring
  • Dynamic key detection from chord sequence analysis
  • Roman numeral conversion (I, ii, iii, IV, V, vi, vii°)
  • Pattern recognition for famous progressions
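The "dynamic key detection from chord sequence analysis" bullet can be sketched as a diatonic-fit vote: score each candidate major key by how many detected chord roots fall inside its scale, and pick the best. This is my guess at a first version, not necessarily the algorithm in live_chord_progression.py; it only looks at chord roots and ignores chord quality for simplicity.

```python
# Hedged sketch: guess a major key from a chord sequence by counting
# how many chord roots are diatonic to each candidate key.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_SCALE = {0, 2, 4, 5, 7, 9, 11}   # semitone offsets of a major scale

def diatonic(chord, key):
    root = chord[:2] if len(chord) > 1 and chord[1] == "#" else chord[0]
    return (NOTES.index(root) - NOTES.index(key)) % 12 in MAJOR_SCALE

def guess_key(chords):
    scores = {k: sum(diatonic(c, k) for c in chords) for k in NOTES}
    return max(scores, key=scores.get)

print(guess_key(["C", "G", "Am", "F", "Bdim"]))  # C
```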

Contact Information

I am happy to get feedback on this application! Send it to mtaruno@uw.edu with the subject line "[Harmoniq Feedback]" if you can!
