Sending Messages from the Application to the AudioWorkletProcessor

The Web Audio API is a powerful tool for creating interactive web audio applications. It provides a low-level interface for working with audio, including loading and decoding audio files, creating and connecting audio nodes, and processing audio data in real time. In this Web Audio API JavaScript programming tutorial, we will explore how to send messages from the application to the AudioWorkletProcessor using the postMessage() method.

AudioWorkletProcessor is a JavaScript class that allows developers to write custom audio processing algorithms that run on a separate thread, independent of the main JavaScript thread. AudioWorkletProcessor is a key component of the Web Audio API and is used to create AudioWorkletNodes. These nodes can be connected to other nodes in the audio graph and process audio data in real time. The process() method of the AudioWorkletProcessor is called by the Web Audio API with input and output arrays and a set of audio parameters; the developer writes their custom audio processing code in this method, and it executes on the separate audio rendering thread. AudioWorkletProcessor enables high-performance real-time audio processing, making it ideal for applications such as audio synthesis, audio effects, and music production.
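To make the role of process() concrete, here is a minimal sketch of the per-channel work it might do, written as a plain function so it can be read (and run) outside a worklet. The helper name passThrough is our own, not part of the Web Audio API:

```javascript
// A pass-through kernel: copy each input sample to the output unchanged.
// A real effect would transform the samples here instead.
function passThrough(inputChannel, outputChannel) {
  for (let i = 0; i < outputChannel.length; i++) {
    // Reading past the end of a Float32Array yields undefined, so fall back to silence.
    outputChannel[i] = inputChannel[i] ?? 0;
  }
}

// Inside an AudioWorkletProcessor, process() would call it once per channel:
//   process(inputs, outputs, parameters) {
//     const input = inputs[0] ?? [];
//     const output = outputs[0];
//     for (let ch = 0; ch < output.length; ch++) {
//       passThrough(input[ch] ?? new Float32Array(output[ch].length), output[ch]);
//     }
//     return true; // keep the processor alive
//   }
```

In a real worklet the channel arrays are typically 128 samples long (one render quantum), and process() is called once per quantum on the audio thread.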

The "main JavaScript thread" generally refers to the code running in your application scripts (here, app.js), along with any other JavaScript running in the same thread. The main thread is responsible for rendering the user interface and handling user interactions, as well as executing JavaScript code. AudioWorkletProcessor allows audio processing to be performed on a separate thread, freeing up the main thread for other tasks and improving overall performance.

Sending Messages from Application to AudioWorkletProcessor

In our example, we have an app.js file that loads an audio file and creates an AudioBufferSourceNode and an AudioWorkletNode. We also have a worklet.js file that defines an AudioWorkletProcessor.

To send a message from the application (app.js) to the AudioWorkletProcessor (worklet.js), we can use the postMessage() method of the AudioWorkletNode's port. In our example, we post a message to the AudioWorkletNode when the AudioBufferSourceNode's audio has ended.

Here is the code that sends the message from the AudioBufferSourceNode in app.js:

   
// When the AudioBufferSourceNode finishes playing, send a message to the AudioWorkletProcessor
source.addEventListener("ended", () => {
  workletNode.port.postMessage("audio-ended");
});

And here is the code that receives the message in the AudioWorkletProcessor in worklet.js:

  
this.port.onmessage = (event) => {
  if (event.data === "audio-ended") {
    console.log("Received message from main: audio ended");
  }
};

The postMessage() method sends a message to the AudioWorkletProcessor, and the onmessage event handler in the AudioWorkletProcessor receives the message. In this example, we check if the message is "audio-ended", and if so, we log a message to the console.
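Because postMessage() uses structured cloning, messages are not limited to bare strings; plain objects work just as well. As the message vocabulary grows, a small dispatcher keeps the onmessage handler tidy. The handleMessage helper and the set-gain message below are our own illustrative names, not part of the Web Audio API:

```javascript
// A small message dispatcher, as it might appear inside a worklet.
// Accepts both the bare-string form used in this tutorial and an object form
// like { type: "set-gain", value: 0.5 }.
function handleMessage(data) {
  const type = typeof data === "string" ? data : data?.type;
  switch (type) {
    case "audio-ended":
      return "stopping";
    case "set-gain":
      return `gain=${data.value}`;
    default:
      return "ignored";
  }
}
```

In worklet.js this would be wired up as this.port.onmessage = (event) => handleMessage(event.data);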

Complete Program Codes

The full code for index.html, app.js, and worklet.js is below:

index.html

<h1>Send message from app to worklet</h1>
<button id="start-button">Send Data to Worker</button>
<script src="app.js"></script>

The HTML file contains a button which, when clicked, starts the audio playback that drives the communication.

 app.js

 
// Create an AudioContext
const audioContext = new AudioContext();

// Get the start button
const startButton = document.querySelector("#start-button");

// Load an audio file using fetch and create an AudioBuffer
fetch("helloaudio.wav")
  .then((response) => response.arrayBuffer())
  .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer))
  .then((audioBuffer) => {
    // Create an AudioBufferSourceNode and connect it to the AudioContext destination
    const source = audioContext.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(audioContext.destination);

    // Create an AudioWorkletNode and connect it to the AudioContext destination
    audioContext.audioWorklet.addModule("worklet.js").then(() => {
      const workletNode = new AudioWorkletNode(audioContext, "my-worklet-processor");
      workletNode.connect(audioContext.destination);

      // When the AudioBufferSourceNode finishes playing, send a message to the AudioWorkletProcessor
      source.addEventListener("ended", () => {
        workletNode.port.postMessage("audio-ended");
      });

      // Start playing the audio when the start button is clicked
      startButton.addEventListener("click", () => {
        source.start();
      });
    });
  });
 

In this code, we first select the start button using document.querySelector("#start-button"). We then load the audio file with fetch() and decode it into an AudioBuffer using the AudioContext's decodeAudioData() method. Once we have the AudioBuffer, we create an AudioBufferSourceNode and connect it to the AudioContext destination. We also load worklet.js with audioWorklet.addModule(), create an AudioWorkletNode connected to the destination, and post a message to the worklet when the audio playback ends.

To start the audio playback, we add an event listener to the start button that calls the start() method of the AudioBufferSourceNode object when the button is clicked.

worklet.js

// Define an AudioWorkletProcessor
class MyWorkletProcessor extends AudioWorkletProcessor {
  constructor() {
    super();
    this.port.onmessage = (event) => {
      if (event.data === "audio-ended") {
        console.log("Received message from main: audio ended");
      }
    };
  }

  // Implement the process() method
  process(inputs, outputs, parameters) {
    return true;
  }
}

// Register the AudioWorkletProcessor
registerProcessor("my-worklet-processor", MyWorkletProcessor);

The worklet.js file defines an AudioWorkletProcessor subclass called MyWorkletProcessor. It implements the process() method and listens for messages from the main thread via the this.port.onmessage event handler.

Once the AudioWorkletProcessor is defined, it is registered using the registerProcessor() function.

When the AudioBufferSourceNode finishes playing, app.js sends a message to the AudioWorkletProcessor via the workletNode.port.postMessage() method. The MyWorkletProcessor class in worklet.js receives the message and logs a message to the console.
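Note that the port works in both directions: the processor can reply with this.port.postMessage(), and the app can listen on workletNode.port.onmessage. The stand-in port pair below is a deliberately simplified, synchronous sketch of that wiring (real MessagePort delivery is asynchronous), useful only to illustrate the round trip:

```javascript
// A hypothetical, synchronous stand-in for an entangled MessagePort pair.
// Posting on one port invokes the onmessage handler of the other.
function createPortPair() {
  const a = { onmessage: null, postMessage: (msg) => b.onmessage?.({ data: msg }) };
  const b = { onmessage: null, postMessage: (msg) => a.onmessage?.({ data: msg }) };
  return [a, b];
}

const [appPort, workletPort] = createPortPair();
const log = [];

// App side: react to replies coming back from the worklet.
appPort.onmessage = (event) => log.push(`app got: ${event.data}`);

// Worklet side: acknowledge each message it receives.
workletPort.onmessage = (event) => {
  if (event.data === "audio-ended") workletPort.postMessage("ack: audio-ended");
};

appPort.postMessage("audio-ended"); // round trip: app -> worklet -> app
```

In the real API the two ends are workletNode.port (in app.js) and this.port (in worklet.js); the sketch only shows how the postMessage/onmessage pairing fits together.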

Conclusion

 In this JavaScript programming tutorial on Web Audio API, we have demonstrated how to send messages from the application to the AudioWorkletProcessor using the postMessage() method. The Web Audio API provides a powerful and flexible way to work with audio in the browser, and the ability to send messages between the application and the AudioWorkletProcessor adds even more functionality and interactivity to web audio applications.

