The aim of this tutorial is to build a more complex visualization by combining several components of the waves.js library, and to show how they can be integrated with d3.
At the end of the tutorial, you should be able to build the following example:
Let's dig into the code step by step:
1. HTML setup
First, define the HTML tags that will host the visualization, and load the library:
the #zoomer div will host the d3 axis used to interact with the zoom helper;
the #timeline div will host the waves.js visualization;
finally, load the library at the end of the body tag.
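Putting the three points above together, the page skeleton might look like the following (the script file names and paths are assumptions; adapt them to your own setup):

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>waves.js timeline</title>
  </head>
  <body>
    <!-- hosts the d3 axis used to drive the zoom helper -->
    <div id="zoomer"></div>
    <!-- hosts the waves.js visualization -->
    <div id="timeline"></div>

    <!-- load d3 and the waves.js bundle at the end of the body
         (file names below are placeholders) -->
    <script src="js/d3.min.js"></script>
    <script src="js/waves.min.js"></script>
    <script src="js/main.js"></script>
  </body>
</html>
```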
2. JavaScript setup
We add some data to be visualized:
The time unit used inside a timeline, especially when visualizing audio data through a waveform, should be the second. The start and duration properties inside the data are therefore expressed in seconds.
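As a sketch, the metadata could look like the following; the entries and the `text` property are made up for illustration, but `start` and `duration` are in seconds as explained above:

```javascript
// Hypothetical metadata describing two labeled segments of the audio file.
// `start` and `duration` are expressed in seconds.
var data = [
  { start: 1.5, duration: 2.0,  text: 'first segment'  },
  { start: 5.5, duration: 1.25, text: 'second segment' }
];
```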
3. Load the audio file
The waves library provides an AudioBufferLoader to load an audio file and convert it to an AudioBuffer through a promise-based API. Let's use it:
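A minimal sketch of the loading step, assuming the library exposes the loader as `waves.loaders.AudioBufferLoader`; the file path and the `createTimeline` callback are placeholders for your own code:

```javascript
// Promise-based loading of an audio file into an AudioBuffer.
var loader = new waves.loaders.AudioBufferLoader();

loader.load('assets/sound.wav') // placeholder path to your audio file
  .then(function(audioBuffer) {
    // the decoded AudioBuffer is now available:
    // hand it to the code that builds the graph (hypothetical function)
    createTimeline(audioBuffer);
  })
  .catch(function(err) {
    console.error('Error while loading the audio file:', err);
  });
```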
4. Create the graph and its layers
The following code shows how to create a graph containing several components. It uses the waveform component to display the audio buffer, a segment and a label component bound to the same data (see the metadata in 2. JavaScript setup), and a marker layer to create an anchor that visualizes the center of the zoom.
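A sketch of this step, assuming the d3-style chained accessors of the waves.js UI components; the exact method names and options below are assumptions, so check them against the library's API documentation:

```javascript
// The graph: a timeline whose x domain covers the whole file, in seconds.
var graph = waves.ui.timeline()
  .xDomain([0, audioBuffer.duration])
  .width(800)
  .height(200);

// Waveform layer bound to the decoded audio data.
var waveformLayer = waves.ui.waveform()
  .data(audioBuffer.getChannelData(0))
  .sampleRate(audioBuffer.sampleRate);

// Segment and label layers bound to the same metadata array.
var segmentLayer = waves.ui.segment()
  .data(data)
  .opacity(0.4);

var labelLayer = waves.ui.label()
  .data(data);

// Marker layer used as the zoom anchor.
var anchor = waves.ui.marker()
  .color('red');

// Register the layers and render the graph inside #timeline.
graph.add(waveformLayer)
     .add(segmentLayer)
     .add(labelLayer)
     .add(anchor);

d3.select('#timeline').call(graph.draw);
```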
5. Add the zooming ability to the whole timeline
First, let’s create a d3 axis inside the #zoomer tag (the following code is pure d3):
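For instance, with the d3 v3 API that was current when waves.js was released (the 800-pixel width is an assumption matching the sketch above):

```javascript
// A linear scale mapping time (seconds) to pixels, and its axis.
var xScale = d3.scale.linear()
  .domain([0, audioBuffer.duration])
  .range([0, 800]);

var xAxis = d3.svg.axis()
  .scale(xScale)
  .orient('bottom');

// Render the axis inside the #zoomer div.
var svg = d3.select('#zoomer').append('svg')
  .attr('width', 800)
  .attr('height', 30);

svg.append('g')
  .attr('class', 'x axis')
  .call(xAxis);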
Then, bind the waves.ui.zoomer helper to this newly created axis, and configure it to interact with the graph we created earlier:
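A hedged sketch of the binding; the accessor and event names (`select`, `mousemove`, `mouseup`) and the graph update calls are assumptions to be checked against the zoomer helper's documentation:

```javascript
// Attach the zoom helper to the axis element and forward
// zoom events to the graph created earlier.
var zoomer = waves.ui.zoomer()
  .select('#zoomer')                 // element that captures mouse gestures (assumed accessor)
  .on('mousemove', function(e) {
    // while dragging: zoom the graph around the anchor (assumed methods)
    graph.xZoom(e);
    graph.update();
  })
  .on('mouseup', function(e) {
    // on release: commit the new zoom level (assumed method)
    graph.xZoomSet();
  });
```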
6. Add some style
To help the user, you can add some CSS to hint at the possible interactions:
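For example, a cursor change on the zoomer area signals that it is draggable (the exact rules are suggestions, not part of the library):

```css
/* signal that the axis area reacts to horizontal drag gestures */
#zoomer {
  cursor: col-resize;
}

/* keep the axis and the timeline visually aligned */
#zoomer,
#timeline {
  width: 800px;
  margin: 0 auto;
}
```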
Et voilà! You should now have a working visualization of your audio file and its metadata.