Fun with Web Audio API
I love music. I love math. I'm an engineer. Maybe that's why I love the Web Audio API.
The Web Audio API is powerful and flexible: you create audio nodes and chain them together however you like. The audio source can be an HTML5 <audio> element on the page, the microphone (after asking permission), or even an oscillator node that synthesises sound from scratch.
Imagine a recording studio with lots of equipment: mics, amplifiers, mixers, speakers and synths, with lots of wires connecting it all. The AudioContext is the studio where everything lives. From the AudioContext you can pick up some equipment and wire it together, say an OscillatorNode (synth) that you connect to a GainNode (amplifier) that you connect to the AudioDestinationNode (speakers).
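The studio analogy can be sketched in a few lines. wireUpStudio is a made-up helper name, not something from the post:

```javascript
// A sketch of the studio analogy, assuming a browser AudioContext.
// wireUpStudio is a hypothetical helper name.
function wireUpStudio(audioContext) {
  const synth = audioContext.createOscillator(); // OscillatorNode (synth)
  const amplifier = audioContext.createGain();   // GainNode (amplifier)
  synth.connect(amplifier);                      // synth -> amplifier
  amplifier.connect(audioContext.destination);   // amplifier -> speakers
  return { synth, amplifier };
}

// In a browser you would then do:
// const { synth } = wireUpStudio(new AudioContext());
// synth.start();
```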
Filter
The first Pen I want to show you is an experiment with audio filters. I use SoundCloud to stream Martin Garrix's remix of Project T. By putting the audio in an HTML5 <audio> element with the controls attribute present, we get all the playback controls for free (play/pause/skip/volume/time left).
This is how you create a filter:
const biquadFilter = audioContext.createBiquadFilter();
biquadFilter.type = "lowpass";
biquadFilter.frequency.value = 1000; // cutoff frequency in Hz
biquadFilter.Q.value = 10; // peak sharpness at the cutoff
biquadFilter.gain.value = 20; // boost in dB (only used by some filter types)
The filter type is selected with the drop-down. Frequency, Q and gain are controlled with the sliders. Some filter parameters are not applicable to certain filter types; in those cases the corresponding slider is disabled.
Then we need to connect the audio source node to the filter node, and the filter node to the output node: Source -> Filter -> Output.
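A minimal sketch of that chain, assuming an <audio> element already on the page; connectThroughFilter is a hypothetical helper name:

```javascript
// Source -> Filter -> Output, as described above.
// connectThroughFilter is a made-up helper; audioElement would be the
// <audio> element from the page.
function connectThroughFilter(audioContext, audioElement) {
  const source = audioContext.createMediaElementSource(audioElement);
  const biquadFilter = audioContext.createBiquadFilter();
  source.connect(biquadFilter);
  biquadFilter.connect(audioContext.destination);
  return biquadFilter; // keep a reference so the sliders can tweak it later
}
```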
To draw the frequency response resulting from the filter I use getFrequencyResponse(). The first parameter is an array containing all the frequencies we would like to get the response for. The second and third parameters are the output arrays (same length as the first). Then we can use those two output arrays to draw curves on the canvas.
biquadFilter.getFrequencyResponse(
myFrequencyArray,
magResponseOutput,
phaseResponseOutput);
- magResponseOutput is the white curve
- phaseResponseOutput is the red curve
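One way the three arrays might be set up before the call. The logarithmic spacing is my assumption (it suits a frequency axis); any frequencies of interest will do, and makeFrequencyArray is a made-up helper:

```javascript
// Build n frequencies, logarithmically spaced from minHz to maxHz.
// makeFrequencyArray is a hypothetical helper, not from the Pen.
function makeFrequencyArray(n, minHz, maxHz) {
  const freqs = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    freqs[i] = minHz * Math.pow(maxHz / minHz, i / (n - 1));
  }
  return freqs;
}

const myFrequencyArray = makeFrequencyArray(200, 20, 20000);
const magResponseOutput = new Float32Array(myFrequencyArray.length);
const phaseResponseOutput = new Float32Array(myFrequencyArray.length);

// In the browser you would then call:
// biquadFilter.getFrequencyResponse(
//   myFrequencyArray, magResponseOutput, phaseResponseOutput);
```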
When you use the Web Audio API to stream music you have to be aware of CORS restrictions. If you would like to load audio from an origin other than CodePen, you have to make sure that origin serves the proper CORS headers.
Try using the bandpass filter to isolate the melody.
Oscillator
The second Pen I'd like to show you is a simple tone generator. It uses an oscillator node to produce sound, and an analyser node plus a canvas element to display the resulting sound waves. (More info on the analyser later.)
Try the square wave and set the frequency rather low; then you should be able to see the Gibbs phenomenon.
Also try the custom wave shape. It uses audioContext.createPeriodicWave() and oscillator.setPeriodicWave() to create a wave with three combined frequencies. The base frequency is the one you set with the slider, the second is twice the base frequency and the third is three times the base frequency. The arrays passed to createPeriodicWave contain these first three Fourier coefficients. Each value specifies how much of that coefficient you can hear (0-1).
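A sketch of how those three Fourier coefficients could be packed into the arrays; makeWaveCoefficients is a made-up helper, and the amplitude values are arbitrary examples:

```javascript
// Pack three harmonic amplitudes into the arrays createPeriodicWave expects.
// Index 0 is the DC offset and must stay 0; indices 1-3 are the fundamental,
// the 2x and the 3x harmonics. makeWaveCoefficients is a hypothetical helper.
function makeWaveCoefficients(a1, a2, a3) {
  const real = new Float32Array([0, 0, 0, 0]);    // cosine terms (unused here)
  const imag = new Float32Array([0, a1, a2, a3]); // sine terms
  return { real, imag };
}

// In the browser:
// const { real, imag } = makeWaveCoefficients(1, 0.5, 0.25);
// const wave = audioContext.createPeriodicWave(real, imag);
// oscillator.setPeriodicWave(wave);
```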
This Pen shows that you don't need to connect the audio nodes like a train; you can branch off as you like. The gain node is used to toggle the sound on or off with the button, and the analyser node gives us real-time data about the sound playing.
Oscillator -- Gain +-- Out (speaker/phones)
                   |
                   +-- Analyser
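The branched graph in the diagram could be wired up like this; buildBranchedGraph is a hypothetical helper name:

```javascript
// The gain node feeds two branches: the destination (what you hear)
// and an analyser (what you see). buildBranchedGraph is a made-up helper.
function buildBranchedGraph(audioContext) {
  const oscillator = audioContext.createOscillator();
  const gain = audioContext.createGain();
  const analyser = audioContext.createAnalyser();
  oscillator.connect(gain);
  gain.connect(audioContext.destination); // branch 1: speakers/phones
  gain.connect(analyser);                 // branch 2: visualisation only
  return { oscillator, gain, analyser };
}
```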
Analyser
The third Pen features an analyser node to visualise music streamed from SoundCloud. There are two kinds of data you can get from the analyser node: frequency domain or time domain. With frequency data you can draw a frequency spectrum (the "Frequency bars" button), and with time domain data you can draw the sound wave (the "Oscilloscope" button).
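Reading the two kinds of data might look like this sketch; readFrequencyData is a made-up helper name:

```javascript
// Read one frame of frequency data from an analyser node.
// readFrequencyData is a hypothetical helper, not from the Pen.
function readFrequencyData(analyser) {
  // frequencyBinCount is half the analyser's fftSize
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data); // "Frequency bars"
  // For the "Oscilloscope" view you would call
  // analyser.getByteTimeDomainData(data) instead.
  return data;
}
```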
There is a single note playing repeatedly at the beginning of the mix. Click the "Frequency bars" button immediately and you will see the base frequency and the overtones.
Note that you can change the song at any time by pasting its SoundCloud URL into the text box and clicking Find.
- https://soundcloud.com/adi-cohen-8/my-heart-is-stone-adi-cee
- https://soundcloud.com/scitec/tec115-1-adjd-alexi-delano
- https://soundcloud.com/hardwell/hardwell-ww-the-dance-floor-is-yours-free-download
The Outsider is a friend of mine. The majority of his tracks are on Mixcloud: https://www.mixcloud.com/outsider_music/
More Pens
Here are all my Web Audio API Pens in a CodePen Collection.
I have another follow up post on the subject with other types of nodes like the convolver (very cool!): More Fun with Web Audio API.
So cool! :-) Would be great to get the analyzer to render in a sort of 3D soundscape :-)
Thanks :)
Yes! I've been thinking about that for a while. I would plot the level returned by the analyser node along the Z axis, the frequency along X and then time along Y.
@netsi1964
So I did it! A 3D soundscape. It's called Rasta Lines and is a Pen for the CodePen Creative Challenge #Greyscale_HintOfRainbow_CurvedLinesWeekend http://codepen.io/DonKarlssonSan/pen/mJWrME
This is a fantastic web audio demo. Very slick. I'm a web audio nerd myself. I really like what you've done with the custom waveform creator, very user friendly. I'm trying to do the same thing you're doing - that is, making web audio more accessible to people. And what you did here is just that. Such exited! Very inspiration! Thanks dude.
Thank you for your feedback! I'm happy to see others who are excited about the Web Audio API =)
Nice Web Audio Pens you've made! Hearted immediately =)
Click the last link, if you haven't already, to see all of my 16 Web Audio Pens. I've made several new ones since I wrote this blog post. My personal favorite is The Checkbox Drum Machine.
Hi, this is so awesome, it helped me a lot!! By the way, how do I display a waveform while recording using the Web Audio API? Like this Pen of yours, but with a waveform? https://codepen.io/DonKarlssonSan/pen/Mazevy
Awesome post my brother! This will serve as a reference for me. Question...
In your code for the Analyser, I noticed you have a timeline slider. I've been trying to implement that myself, allowing a user to scroll, say, half-way through the song and have the audio pick up from there. I've been having difficulty keeping my input range synced with the current time. In your example, it looks like you're doing it with a created <audio> element by changing audio.currentTime. I have been tying my slider's value to the AudioContext constructor: var context = new AudioContext(); and then later down the road I use the (read-only) context.currentTime value. I was getting some hiccups with that setup. Is there some internal connection between audio.currentTime and the AudioContext constructor that you're taking advantage of? Your example seems so simple and it works just as I need it to!

@qodesmith OK, my Pen has two things going on with the slider: the automatic move and the user move.
When I did the automatic move too frequently, it was hard to do the user move: the automatic move kept interrupting it. I solved this by performing the automatic move only every 4th second with setInterval(). This gives the user time to move the slider. It could still happen that the automatic move overrides the user's move, but it is less likely.
To be able to do the user move, I hooked up the change event of the slider to:
audio.currentTime = x;
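The two slider behaviours described above could be sketched like this. wireSlider is a made-up helper; the setIntervalFn parameter would normally just be window.setInterval:

```javascript
// Keep an <input type="range"> (0-100) in sync with an <audio> element.
// wireSlider is a hypothetical helper, not the Pen's actual code.
function wireSlider(audio, slider, setIntervalFn) {
  // Automatic move: update only every 4th second, so the user
  // has time to drag the slider without being interrupted.
  setIntervalFn(() => {
    slider.value = (audio.currentTime / audio.duration) * 100;
  }, 4000);
  // User move: seek by setting audio.currentTime from the change event.
  slider.addEventListener("change", () => {
    audio.currentTime = (slider.value / 100) * audio.duration;
  });
}

// In the browser:
// wireSlider(audioElement, sliderElement, window.setInterval);
```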
Sorry @seannhok, I must have missed your comment.
I forked the Pitch Shifter Pen and made: Pitch Shifter With Audio Analyser
Hi Johan,
Thank you for taking the time to put all of this together. Your post has become very useful to me for a personal project/experiment I am doing involving music and UX/UI. I am normally a designer, not a developer, so I have very limited JavaScript experience.
Regarding the analyser you built, I was wondering if there is any way to combine the canvas visual with a Soundcloud playlist that's written out in HTML? Right now, I am using the standard Soundcloud API, with the HTML portion as follows:
<div>
  <a href="https://soundcloud.com/ulysses-arthur/boots-darling-demo"><div></div></a>
</div>
<a href="https://soundcloud.com/ulysses-arthur/last-call-late-night-dub"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/sunrise-siem-reap"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/bangkok-watch"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/the-way-you"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/the-siren-interlude"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/hypnotize-bones-for-a-glimpse"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/a-different-wavelength"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/trouble-take-me-home-demo"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/hungry-for-some-heartbreak"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/the-next-one-a-familiar-feeling"><div></div></a>
<a href="https://soundcloud.com/ulysses-arthur/hannah-hunt-vampire-weekend-cover"><div></div></a>
I appreciate any help or guidance to where I can find some.
Thanks! Shawn
@shawnth
I'm happy that you think the blog post is useful.
Regarding a playlist, there are two options:
Both are possible to accomplish with plain JavaScript, but 1) would be easy with jQuery and 2) would be possible with Knockout. Nowadays Angular and React are more popular; in this particular case, however, both of them seem like overkill for a playlist.
I will throw together an example of 1) with plain JavaScript later.
@DonKarlssonSan
Thank you for taking the time to throw together an example. In the meantime, I will look into Angular and React.
@shawnth
I made a fork of the pen: http://codepen.io/DonKarlssonSan/pen/KNZqOO
Since the Pen does not use any framework like Angular or React, I continued in the same style.
Here is what I did:
I leave the rest to you 😀
Even though Angular and React are both overkill in this situation it can still be an interesting experiment just for learning purposes. 😀
@DonKarlssonSan
I am having trouble getting the next song in the playlist to play after the current one is done. Right now, the first song in the list is played, then it stops when it's finished.
As a matter of fact, I am not even familiar enough with JavaScript to make the playlist clickable, make previous/next buttons, etc. At least, I am not sure how to do this using the SoundCloud API.
I appreciate you helping out this far, and I don't expect you to take the time to build out a full-fledged audio player that works with your analyser. I am just stumped :(
@shawnth
If you want to learn, my advice is to keep working on your player, keep it as your motivation. Every now and then you will meet a stumbling block. Like now, you don't know how to make the playlist clickable. Make a new Pen from scratch where you make a list of something clickable. Isolate it, keep it simple, just console.log() which item in the list was clicked. Next step: make previous/next buttons. Next step: somehow highlight the currently "playing" song. But it isn't really playing.
And then when you've accomplished that, come back to your player and add click functionality.
As for automatically playing the next song when the current one ends, I would look into the ended event. I've never tried it myself, though.
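The ended-event idea might look like this sketch; autoAdvance and playNextSong are made-up names for the wiring and the playlist callback:

```javascript
// When the current song ends, advance to the next one.
// autoAdvance and playNextSong are hypothetical names; playNextSong
// would load the next playlist entry into the <audio> element.
function autoAdvance(audio, playNextSong) {
  audio.addEventListener("ended", playNextSong);
}

// In the browser:
// autoAdvance(document.querySelector("audio"), playNextSong);
```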
If you look at how I implemented the Find button, you can see how I go about changing song.
@DonKarlssonSan
Thank you, Johan. I was able to figure out a workaround. However, if I wanted a stroke for the waveform rather than a fill, how would I go about doing so? I would like the moving wave responding to music (currently with a black fill) to resemble the idle wave (black stroke).
@shawnth
I made changes to the fork I created earlier.
Is that what you are looking for?
Old code
Draws vertical black lines. Each line always starts at the bottom and stops at the current value of dataArray[i].
New code
Draws a line between all the points contained in dataArray.
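The stroked version described above might be sketched like this; drawWaveform is a made-up helper, and the 0-255 scaling assumes byte data from the analyser:

```javascript
// Draw one connected line through all the points in dataArray,
// instead of a filled bar per point. drawWaveform is a hypothetical
// helper; ctx is a canvas 2D context, dataArray holds byte samples (0-255).
function drawWaveform(ctx, dataArray, width, height) {
  ctx.beginPath();
  const sliceWidth = width / dataArray.length;
  for (let i = 0; i < dataArray.length; i++) {
    const x = i * sliceWidth;
    const y = (dataArray[i] / 255) * height;
    if (i === 0) {
      ctx.moveTo(x, y); // start the path at the first sample
    } else {
      ctx.lineTo(x, y); // connect each following sample
    }
  }
  ctx.strokeStyle = "black";
  ctx.stroke(); // stroke the path instead of filling bars
}
```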
@DonKarlssonSan
That's exactly what I was looking for! Thanks, again, for your help with all of this.
You are welcome. :)
Great post, Johan.
Do the examples work on iPad or Android phone for you? On iPad, it encountered an error on the first example and the rest didn't load. On a Samsung phone, there didn't appear to be any errors (each example appeared to load OK), but none of the examples actually worked (no sound, no visualizations).
Hmm...
The first and the last play music on my iPhone, but the visualisation doesn't work on the last one.
Will look into it later. Thanks for reporting.
Thanks so much for writing this. I'm trying to do some visualizations with three.js and you filled in some gaps in my noob-level understanding of Web Audio API.
@sjcobb You are welcome! I'm glad it helped you =)