
Binaural Sound with the Web Audio API

The simulation

Use headphones and click on any point around the person below to choose a direction for the incoming sound. The blue dots mark directions perpendicular to the listener. Try different head-related impulse responses (HRIRs); some will work better than others, depending on the individual. Note that the simulation has only been tested in Firefox and Chrome! Also, some people get errors when their web audio context has a different sample rate than the HRIRs*.

  You need headphones for the following simulation!

http://kaistale.com/blog/141226hrir/

The theory

Head-related transfer functions describe the cues that enable us to determine the direction a sound arrives from. We only have two ears, so to localize a sound in 3D, our brain has to use all the information it can get.

For example, sound will often arrive at one ear slightly later than at the other. There will also often be a difference in sound level between the ears (especially at high frequencies). But beyond that, there is a ton of information available for our brain to use. Our shoulders reflect sound. Sound reflects and diffracts around our external ears (pinnae). And since our features, such as the shape of our pinnae, are individual, so is the way each of us perceives sound in 3D.

Still, our heads are often similar enough that we can approximate 3D sound with ready-made head-related transfer functions. Once we have a description of how sound arrives at our ears from different angles, we can take any sound and play it back from a chosen direction in 3D.
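In Web Audio terms, this amounts to convolving the sound with a pair of head-related impulse responses. Here is a minimal sketch, assuming the HRIRs for the chosen direction are already loaded as Float32Arrays at the context’s sample rate (the function name and arguments are illustrative, not the simulation’s actual code):

```javascript
// Minimal sketch of HRIR-based playback with the Web Audio API.
function playFromDirection(ctx, soundBuffer, hrirLeft, hrirRight) {
  // Pack the two ear responses into one stereo impulse response buffer.
  const ir = ctx.createBuffer(2, hrirLeft.length, ctx.sampleRate);
  ir.copyToChannel(hrirLeft, 0);
  ir.copyToChannel(hrirRight, 1);

  // A mono input convolved with a stereo IR yields one output channel per ear.
  const convolver = ctx.createConvolver();
  convolver.normalize = false; // preserve the level cues encoded in the HRIRs
  convolver.buffer = ir;

  const source = ctx.createBufferSource();
  source.buffer = soundBuffer;
  source.connect(convolver);
  convolver.connect(ctx.destination);
  source.start();
}
```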

The simulation in this post uses head-related transfer functions from the CIPIC HRTF database. This paper provides some nice additional information about head-related transfer functions.

The source code

The source code is here: https://github.com/kai5z/hrtf-simulation

*) If your web audio context has a different sample rate than the HRIRs’ sample rate (44.1 kHz), the audio won’t work. Apparently the sample rate of the context isn’t definable (please correct me if I’m wrong!), so the HRIRs should be resampled for this to work.
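One possible workaround, sketched below under the assumption that the HRIRs are available as AudioBuffers: render them through an OfflineAudioContext running at the live context’s sample rate, which resamples them in the process.

```javascript
// Sketch: resample an AudioBuffer (e.g. a 44.1 kHz HRIR) to a target sample
// rate by rendering it through an OfflineAudioContext at that rate.
function resampleBuffer(buffer, targetRate) {
  const length = Math.ceil(buffer.duration * targetRate);
  const offline = new OfflineAudioContext(buffer.numberOfChannels, length, targetRate);
  const source = offline.createBufferSource();
  source.buffer = buffer;
  source.connect(offline.destination);
  source.start();
  return offline.startRendering(); // promise resolving to the resampled buffer
}
```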

Synthesizing thunder using JavaScript

Have you ever considered what actually causes the sound of thunder? The past summer brought with it a few thunderstorms, which got me thinking about a topic I find very fascinating and cool: simulating thunder!

The simulation

Note that the simulation can be quite heavy for more complex lightning strikes (some are even unrealistically complex) and for longer distances from the strike. Calculating the result can take quite a bit of time, depending on your computer. A sample rate of 22 kHz is used and the simulation is monaural. The lightning consists of a single discharge between the cloud and the ground.

http://www.kaistale.com/blog/140809thunder/index.html

If the simulation appears to freeze, please reload the page. Once again, I recommend Chrome for the simulation. Take a look at the source code here – I was somewhat lazy with commenting the code!

What is lightning?

Let’s use the definition of lightning given by Google:

The occurrence of a natural electrical discharge of very short duration and high voltage between a cloud and the ground or within a cloud, accompanied by a bright flash and typically also thunder.

I believe the geometry of the lightning channel becomes clearer when one considers the part of the following video starting at 1 min 15 s:

Note that the person in the video talks about the lightning consisting of “roughly 50 yard segments”. This zig-zag structure, referred to as the tortuosity of the lightning channel, is made up of segments that are usually between 5 and 70 meters long [Rakov & Uman, 2003].

In our case, let’s simplify the lightning channel into a chain of more or less straight lines with random lengths of 5 to 70 meters. The lines zig-zag constantly, with a random variation of about 16 degrees between consecutive lines. We’ll also add a small statistical bias in the vertical direction, as in the sketch below.
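A minimal sketch of this construction (the function name, the interpretation of the angular variation as roughly ±16°, and the exact downward bias are my illustrative choices, not the simulation’s actual code):

```javascript
// Sketch: build a tortuous 2D lightning channel from the cloud (y = height)
// down to the ground (y = 0) as a chain of straight segments.
function generateChannel(cloudHeight) {
  const points = [{ x: 0, y: cloudHeight }];
  let angle = -Math.PI / 2; // start heading straight down
  let p = points[0];
  while (p.y > 0) {
    const length = 5 + Math.random() * 65;                     // 5–70 m segments
    angle += (Math.random() - 0.5) * 2 * (16 * Math.PI / 180); // ~±16° zig-zag
    angle += 0.1 * (-Math.PI / 2 - angle);                     // small downward bias
    p = { x: p.x + length * Math.cos(angle), y: p.y + length * Math.sin(angle) };
    points.push(p);
  }
  return points;
}
```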

What causes thunder?

Ok, so what causes the sound of the lightning? Let’s consider what happens when the discharge occurs. We very quickly warm up a channel of air (the orange area in the cut plane image below). As the temperature in the channel rapidly rises to ~24,000 K [Orville, 1968], the pressure of the air in the channel rises enormously (to about 10^6 Pa). This pushes the air outwards at speeds exceeding the speed of sound, causing a shock wave that expands at roughly 3000 m/s [Few, 1986].

After this, the air in the channel quickly cools down. The pressure behind the rapidly expanding shock wave will momentarily drop below atmospheric pressure due to the inertia of the outwards-traveling air mass. The shock wave will travel some distance (the “relaxation radius”), after which it dissipates, leaving behind what is called the weak shock wave. The pressure of this weak shock wave can now be plotted as a function of distance.

Never mind the scales for now (the y-axis represents atmospheric pressure), but note this: the pressure wave will propagate towards you from the lightning so that the “sharp” part of it reaches you first.
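This sharp-rise, slow-fall shape is often idealized as an “N-wave”. A sketch of such a signature (the duration and amplitude here are free parameters, not values from the post):

```javascript
// Sketch: idealized N-wave pressure deviation (relative to atmospheric
// pressure) left behind by the weak shock: an instant rise to +amplitude,
// followed by a linear fall through ambient to -amplitude.
function nWave(durationSec, sampleRate, amplitude) {
  const n = Math.floor(durationSec * sampleRate);
  const wave = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    wave[i] = amplitude * (1 - (2 * i) / n); // +A at the front, -A at the tail
  }
  return wave;
}
```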

What is thunder?

Ok, so now we know that the extremely hot lightning channel causes a traveling pressure wave. We also know that changes in pressure equal sound. So what we’re hearing is the pressure waves caused by the rapidly heated air in the lightning channel.

But what causes the rumbling sound? Why does the thunder keep on rumbling for many seconds? What makes a close lightning strike sound (sort of) like a clap, while a distant strike can only be heard as a rumble?

Here are some of the reasons:

1. The size of the lightning is huge

Consider a lightning strike some distance away from you, as in the image above. If you measure the distance to each part of the lightning, you will see that it can vary by miles/kilometres! Considering that sound only travels at about 340 m/s (1,125 ft/s), several seconds can pass between the arrival of the sound from the nearest part of the lightning strike and the sound from the more distant parts.

The situation can also be thought of in terms of Huygens’ principle, which states that any source can be treated as a collection of spherical sources (kind of like in the image above). This is how the simulation presented at the beginning of this post works: the lightning is divided into multiple small segments, each modeled as a separate spherical sound source, as in the sketch below.
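A sketch of that idea (not the post’s exact code; segmentPressure is a hypothetical helper returning each segment’s pressure signature, e.g. an N-wave like the one sketched earlier):

```javascript
// Sketch: Huygens-style summation. Each channel segment contributes its
// pressure signature, delayed by distance/c and attenuated by 1/distance.
const SPEED_OF_SOUND = 340; // m/s
const SAMPLE_RATE = 22050;  // Hz

function renderThunder(segments, listener, durationSec, segmentPressure) {
  const out = new Float32Array(Math.floor(durationSec * SAMPLE_RATE));
  for (const seg of segments) {
    const dist = Math.hypot(seg.x - listener.x, seg.y - listener.y);
    const delay = Math.floor((dist / SPEED_OF_SOUND) * SAMPLE_RATE);
    const gain = 1 / Math.max(dist, 1); // spherical spreading loss
    const pulse = segmentPressure(seg); // Float32Array for this segment
    for (let i = 0; i < pulse.length && delay + i < out.length; i++) {
      out[delay + i] += gain * pulse[i];
    }
  }
  return out;
}
```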

2. Sound attenuates by distance

It is, perhaps, obvious that more distant sound sources are quieter. But it should also be noted that higher frequencies attenuate much faster! Thus, when sound travels a long distance, it gets “muffled” by the air. Wolfram Alpha is a great resource for this: it calculates the attenuation according to ISO 9613-1:1993. This is what causes distant sounds to “rumble” while the closest sounds are sharp and discernible; a crude way to mimic the effect is sketched below.
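The sketch uses a first-order low-pass filter whose cutoff falls with distance; the cutoff formula is an arbitrary stand-in for illustration, not the ISO 9613-1 model itself:

```javascript
// Sketch: mimic frequency-dependent air absorption with a one-pole low-pass
// whose cutoff drops as the sound travels further.
function applyAirAbsorption(samples, distanceMeters, sampleRate) {
  const cutoffHz = Math.max(200, 20000 / (1 + distanceMeters / 500));
  const alpha = 1 - Math.exp((-2 * Math.PI * cutoffHz) / sampleRate);
  const out = new Float32Array(samples.length);
  let y = 0;
  for (let i = 0; i < samples.length; i++) {
    y += alpha * (samples[i] - y); // one-pole low-pass step
    out[i] = y;
  }
  return out;
}
```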

3. Other stuff

There are loads of things at play in real life (for example atmospheric refraction, which is the reason thunder is sometimes completely inaudible even when the lightning strike is clearly visible). If you’re interested in learning more about the topic, I found the following books useful:

  • Vladimir Rakov & Martin Uman – Lightning: Physics and Effects
  • Geophysics Study Committee – Earth’s Electrical Environment
  • Hans Volland – Handbook of Atmospheric Electrodynamics, Volume 2

Future ideas

It would be really cool to make the simulation in stereo, so that the sounds from the lightning channel are panned to their respective positions (or even rendered using head-related transfer functions!). If someone else is up for the task and has some knowledge of acoustics and DSP, feel free to contact me! 🙂

Using beam tracing to calculate reflections in JavaScript

I have been researching beam tracing for a project of mine for a while now. Beam tracing is a method for calculating reflection paths. I won’t go into the details of how beam tracing works, as this is something you can find on Google in case you’re not familiar with it.

My first attempt at beam tracing was made in Python, about a year ago. I wrote an algorithm that worked in 3D, but it became exceedingly complicated as I worked on it (mostly due to occlusion and clipping). As it turned out, I was overcomplicating things. The paper “Accelerated beam tracing algorithm” by Laine et al. served as an excellent introduction to how basic beam tracing should be done: as simply as possible. To get a better understanding of the paper, I decided to implement some of the very basics in 2D using JavaScript.

The result

Open simulation

Click to place the source; the specular reflections are updated automatically as you move the mouse around. Note that reflections are only calculated up to a certain order (theoretically there is an infinite number of reflections).

Firstly, it should be noted that I only utilised the very basic ideas presented in the paper in my implementation (beam trees and accelerated ray tracing). I only wanted to spend a few days on this, which means that there are probably a lot of things that could (and should) be optimised. I spent most of the time hunting down bugs, which were at times very hard to find. I also pushed the MIT-licensed source code to Bitbucket, in the hope that someone might think of something useful to do with 2D beam tracing in JavaScript! The geometric core of the method is sketched below.
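To give a flavour of the geometry that beam trees build on, here is a sketch of the mirror-image construction used for specular reflections: reflecting the source across the line of a wall. This is my illustration, not code from the repository.

```javascript
// Sketch: reflect a 2D source point across the (infinite) line through a
// wall's endpoints a and b. The reflected point acts as the image source
// for specular reflections off that wall.
function mirrorAcrossWall(source, wall) {
  const dx = wall.b.x - wall.a.x;
  const dy = wall.b.y - wall.a.y;
  const lenSq = dx * dx + dy * dy;
  // Project (source - a) onto the wall direction to find the foot point.
  const t = ((source.x - wall.a.x) * dx + (source.y - wall.a.y) * dy) / lenSq;
  const foot = { x: wall.a.x + t * dx, y: wall.a.y + t * dy };
  // The mirror image lies an equal distance beyond the foot point.
  return { x: 2 * foot.x - source.x, y: 2 * foot.y - source.y };
}
```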