Edge diffraction with geometrical acoustics

In this post, I'm using geometrical acoustics to calculate the sound field around a wedge, including specular reflections, shadow zones and diffraction.

The simulation

You might encounter some problems with the simulation if you're not using recent hardware/software with full WebGL support.

The simulation: http://doca.kaistale.com/btm/

Note that when you move the source or change the frequency of the source, it will take a while for the diffraction calculations to update.

I recommend that you keep the source to the left of the wedge if you want to observe the results of the diffraction calculations, as the calculations are done starting to the right of the wedge (you'll see what I mean if you play around with the simulation).

I could make the code much more efficient by optimizing things (many times faster, probably), but I won't do any of that, as it's just a proof of concept.

The theory

The specular reflections are calculated with the image-source method: the source is mirrored across the reflecting face of the wedge. The shadowing is self-explanatory.
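The mirroring itself is just a reflection across a plane. A minimal sketch in Python/Numpy (not the code used in the demo; the reflecting face is assumed to be given by a point on the plane and its unit normal):

import numpy as np

def image_source(source, plane_point, plane_normal):
    """Mirror a source position across a reflecting plane (image-source method)."""
    s = np.asarray(source, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                  # make sure the normal is unit length
    d = np.dot(s - np.asarray(plane_point, dtype=float), n)  # signed distance to the plane
    return s - 2.0 * d * n                     # reflected (image) source position

# Example: source at (0.3, 0.4, 0.0), wedge face lying in the y = 0 plane
img = image_source([0.3, 0.4, 0.0], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0])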

The diffraction is calculated from the Biot-Tolstoy expressions, using the method presented by Svensson et al. (JASA 1999). The simulation spans 1 meter by 1 meter. The wedge extends 1 meter to both sides of the source. The simulation represents a cut plane perpendicular to the wedge.
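As a rough sketch of what the discretized edge integral looks like (written from memory of the formulation in the Svensson et al. paper, with my own variable names, an assumed finite edge extent and no handling of the singularities at the zone boundaries, so treat it as illustrative rather than a validated implementation):

import numpy as np

def edge_diffraction_ir(r_s, th_s, z_s, r_r, th_r, z_r,
                        theta_w, fs=44100, c=343.0,
                        z_edge=(-1.0, 1.0), n_elements=2000):
    """Rough sketch of the discretized edge-diffraction integral.

    Source and receiver are given in cylindrical coordinates about the edge
    (radial distance r, angle theta from one wedge face, position z along the
    edge); theta_w is the open wedge angle. See the paper for the exact
    conventions and for how the delta function should be distributed over
    samples; here each edge element is simply dumped into the nearest sample.
    """
    nu = np.pi / theta_w                           # wedge index
    z = np.linspace(z_edge[0], z_edge[1], n_elements)
    dz = z[1] - z[0]

    m = np.sqrt(r_s**2 + (z - z_s)**2)             # source -> edge element
    l = np.sqrt(r_r**2 + (z - z_r)**2)             # edge element -> receiver

    # Auxiliary function eta and the four beta terms
    cosh_eta = (m * l + (z - z_s) * (z - z_r)) / (r_s * r_r)
    eta = np.arccosh(np.maximum(cosh_eta, 1.0))    # guard against rounding below 1
    beta = np.zeros_like(z)
    for s1 in (+1, -1):
        for s2 in (+1, -1):
            phi = np.pi + s1 * th_s + s2 * th_r
            beta += np.sin(nu * phi) / (np.cosh(nu * eta) - np.cos(nu * phi))

    # Accumulate each element's contribution into the IR sample nearest its delay
    delay = (m + l) / c
    ir = np.zeros(int(np.ceil(delay.max() * fs)) + 1)
    amp = -nu / (4.0 * np.pi) * beta / (m * l) * dz
    np.add.at(ir, np.round(delay * fs).astype(int), amp)
    return ir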

The amplitude and phase of the diffracted signal are calculated from the impulse response at many points in space, using the Goertzel algorithm (for a single frequency). The algorithm is implemented server-side, using Python/Numpy.
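The Goertzel recursion itself is short. A minimal single-frequency sketch (not the actual server code; the result matches the DFT bin only up to a constant phase rotation that depends on the block length):

import numpy as np

def goertzel(x, freq, fs):
    """Single-frequency spectral value of x via the Goertzel recursion.

    The magnitude and angle of the returned complex number give the amplitude
    and phase of x (sampled at fs) at the frequency `freq`.
    """
    w = 2.0 * np.pi * freq / fs
    coeff = 2.0 * np.cos(w)
    s1 = s2 = 0.0
    for sample in x:
        s0 = sample + coeff * s1 - s2   # second-order resonator update
        s2, s1 = s1, s0
    # Combine the two final states into the complex result
    return complex(s1 - s2 * np.cos(w), s2 * np.sin(w))

# amplitude = abs(result), phase = np.angle(result)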

The calculations are done with some simplifications, to keep the server happy. I haven't validated the results in the simulation thoroughly; small errors could be seen for the few points I tested (most likely due to the simplifications). But the calculation model works: I compared my Python implementation more rigorously with the examples in the paper by Svensson.

The implementation

A lot of things are calculated in the shaders that definitely should not be calculated there. If you delve deeper into the code (especially the shaders), you will most definitely encounter quite a few ugly things. But the simulation runs happily on my computer(s), so I'm happy.

Each time you move the source or change the frequency, the contribution of the diffracted signal is calculated on a polar grid of 32 x 32 points (it extends quite a bit outside the visible view). The origin of the grid is at the edge of the wedge. The calculations are done server-side, and the results are fetched one data point at a time using jQuery/ajax/JSON. This makes the calculations really slow, but that's partly deliberate: I don't want to strain the server too much. I mainly wanted to test how these types of calculations can be done using Django; see the sketch below.
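A per-point endpoint in Django could look roughly like this (the URL parameters and the compute_diffraction() helper are placeholders for illustration, not the actual code behind the demo):

# views.py -- hypothetical per-point endpoint
from django.http import JsonResponse

def compute_diffraction(i, j, freq):
    """Placeholder for the Numpy-based impulse response + Goertzel calculation."""
    amplitude, phase = 0.0, 0.0
    return amplitude, phase

def diffraction_point(request):
    # Polar grid indices and the source frequency arrive as GET parameters
    i = int(request.GET["i"])            # radial index, 0..31
    j = int(request.GET["j"])            # angular index, 0..31
    freq = float(request.GET["freq"])    # source frequency in Hz

    amplitude, phase = compute_diffraction(i, j, freq)
    return JsonResponse({"i": i, "j": j, "amplitude": amplitude, "phase": phase})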

This diffraction data is passed to the shaders using a 2D texture, with two bytes ("red" and "green") representing the amplitude of the signal and one byte ("blue") representing the necessary phase data.
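The exact byte layout isn't that important, but a packing along these lines would work (my own guess at a workable encoding, not necessarily the one used here): a 16-bit amplitude split across red/green and the phase quantized into blue.

import numpy as np

def pack_rgba(amplitude, phase, amp_max):
    """Pack amplitude (16 bits over R,G) and phase (8 bits in B) into RGBA bytes.

    amplitude, phase: 2D float arrays on the 32 x 32 polar grid.
    """
    amp16 = (np.clip(amplitude / amp_max, 0.0, 1.0) * 65535.0).astype(np.uint16)
    r = (amp16 >> 8).astype(np.uint8)                    # high byte of amplitude
    g = (amp16 & 0xFF).astype(np.uint8)                  # low byte of amplitude
    b = (np.mod(phase, 2 * np.pi) / (2 * np.pi) * 255).astype(np.uint8)  # phase -> 0..255
    a = np.full_like(r, 255)                             # unused alpha channel
    return np.stack([r, g, b, a], axis=-1)               # (32, 32, 4) uint8 texture data

In the shader, the amplitude would then be reconstructed as (256 * red + green) / 65535 times the maximum amplitude, and the phase as blue scaled back to 0..2*pi.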

WebGL can handle linear interpolation for textures automatically, so the data is interpolated smoothly between the data points.
