A small tool to render spectrograms (waterfall graphs) or waveforms from audio in your browser.
- Demo 1 (standard config using the microphone)
- Demo 2 (showing different config possibilities with audio tracks)
```shell
npm i @fjw/audiovisualizer
```
Initialise the AudioVisualizer object:

```javascript
new AudioVisualizer({options});
```
```javascript
new AudioVisualizer({ // no src, uses the microphone
    v: [
        {
            type: "spectrum",
            container: "#myspectrum"
        },
        {
            type: "waveform",
            container: "#mywaveform"
        }
    ]
});
```

- `v` – array of visualizations, each with individual options
- `v.type` – the type of the visualization; possible values are `waveform` and `spectrum`
- `v.container` – the CSS selector of the container (HTMLElement) the canvas gets rendered in; if the container is resized, the canvas is resized, too
- `src` – URL of an audio file/stream; skip to use the microphone as source
- `muted` – start muted or with audible audio (default: `true`)
- `analyser` – object with additional options for the analyser (see AnalyserNode in the Mozilla Docs); for example `analyser.fftSize: 4096` increases the resolution (the default is 2048)
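The general options above can be combined in one call; a minimal sketch, where the audio URL and container selector are placeholders, not defaults:

```javascript
new AudioVisualizer({
    src: "https://example.com/track.mp3", // hypothetical audio URL
    muted: true,                          // start silent (the default)
    analyser: {
        fftSize: 4096                     // higher resolution than the default 2048
    },
    v: [
        {
            type: "spectrum",
            container: "#myspectrum"      // assumes this element exists in the page
        }
    ]
});
```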
- `v.background` – background color
- `v.lineWidth` – width of the line
- `v.strokeStyle` – strokeStyle (color) of the line
- `v.rowsPerSec` – speed of the waterfall
- `v.colortheme` – array of colors for the gradients (see examples)
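A sketch of how the per-visualization options might be set; the concrete colors, the `rowsPerSec` value, and the selectors are illustrative, not library defaults:

```javascript
new AudioVisualizer({
    v: [
        {
            type: "waveform",
            container: "#mywaveform",
            background: "#000000",   // black background
            lineWidth: 2,            // thicker waveform line
            strokeStyle: "#00ff00"   // green line
        },
        {
            type: "spectrum",
            container: "#myspectrum",
            rowsPerSec: 30,          // illustrative waterfall speed
            colortheme: ["#000000", "#ff0000", "#ffff00"] // illustrative gradient colors
        }
    ]
});
```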
- `mute()` – mutes the audio
- `unmute()` – unmutes the audio
- `setSource(url)` – sets a new audio source (`false`/`null`/`undefined` = microphone)
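These methods can be wired to page controls; a sketch assuming the instance is kept in a variable and the referenced buttons exist in the page:

```javascript
const av = new AudioVisualizer({
    src: "https://example.com/track.mp3", // hypothetical URL
    v: [{ type: "spectrum", container: "#myspectrum" }]
});

// hypothetical buttons in the page
document.querySelector("#muteBtn").addEventListener("click", () => av.mute());
document.querySelector("#unmuteBtn").addEventListener("click", () => av.unmute());

// switch back to the microphone (false/null/undefined = microphone)
document.querySelector("#micBtn").addEventListener("click", () => av.setSource(null));
```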
