
Hertzian Landscapes – The interactive space of a radio spectrum

Created by Richard Vijgen, ‘Hertzian Landscapes’ is a live visualization of the radio spectrum. It uses a digital receiver to scan large swaths of the radio spectrum in near real time and, with Three.js, visualises thousands of signals as a panoramic electromagnetic landscape.

Unlike visible light, waves in the radio spectrum cannot be perceived by us directly, yet this space is teeming with human activity. The installation invites users to zoom in on specific frequencies by positioning themselves in front of the panorama, as if controlling a radio tuner with their body, giving them a sense of walking through the spectrum.

From radio broadcasts to weather satellites and from medical implants to aeronautical navigation, the radio spectrum is divided into hundreds of designated slices, each tied to a specific application. Based on a localized frequency database that describes these slices, signals are annotated with information about their theoretical type and application.
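The article doesn’t detail the format of that database, but the idea can be sketched as a simple band-allocation lookup. The table entries and names below are illustrative, not taken from the project:

```javascript
// Hypothetical band-allocation table: each entry describes one slice of spectrum.
const ALLOCATIONS = [
  { lowHz: 87.5e6, highHz: 108e6, label: 'FM broadcasting' },
  { lowHz: 108e6,  highHz: 118e6, label: 'Aeronautical navigation (VOR/ILS)' },
  { lowHz: 137e6,  highHz: 138e6, label: 'Weather satellite downlinks' },
  { lowHz: 402e6,  highHz: 405e6, label: 'Medical implant communication' },
];

// Annotate a detected signal with the allocation its centre frequency falls in.
function annotate(signal) {
  const slice = ALLOCATIONS.find(
    (a) => signal.frequencyHz >= a.lowHz && signal.frequencyHz < a.highHz
  );
  return { ...signal, application: slice ? slice.label : 'Unallocated / unknown' };
}

console.log(annotate({ frequencyHz: 100.7e6, powerDb: -42 }));
// → { frequencyHz: 100700000, powerDb: -42, application: 'FM broadcasting' }
```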

The signals are picked up using a HackRF receiver running in so-called “sweep mode”, a setting that allows the receiver to scan large swaths of the radio spectrum in near real time. A host computer runs a Node.js script that takes the output stream from the HackRF and applies a Fast Fourier Transform to the signal before translating it into a spatial pattern, plotting frequency on the X axis, amplitude on the Z axis and time on the Y axis.
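As an illustrative sketch of this pipeline (not the project’s actual script): the hackrf_sweep tool already emits per-bin power values as CSV rows, so the example below leans on that built-in FFT output rather than transforming the raw stream itself, and only shows how such a stream could be mapped onto frequency/time/amplitude coordinates. The frequency range and bookkeeping are assumptions.

```javascript
const { spawn } = require('child_process');
const readline = require('readline');

// hackrf_sweep prints one CSV row per sweep segment:
// date, time, hz_low, hz_high, hz_bin_width, num_samples, dB, dB, ...
const sweep = spawn('hackrf_sweep', ['-f', '80:1000']); // 80–1000 MHz, illustrative range
const rl = readline.createInterface({ input: sweep.stdout });

let sweepIndex = 0; // advances along the Y (time) axis; in practice a full sweep spans several rows

rl.on('line', (line) => {
  const fields = line.split(',').map((f) => f.trim());
  if (fields.length < 7) return;

  const hzLow = Number(fields[2]);
  const binWidth = Number(fields[4]);
  const powersDb = fields.slice(6).map(Number);

  // Map each FFT bin to a point: frequency → X, time → Y, amplitude → Z.
  const points = powersDb.map((db, i) => ({
    x: hzLow + i * binWidth,
    y: sweepIndex,
    z: db,
  }));

  sweepIndex += 1;
  // `points` would be pushed to the Three.js front end here, e.g. over a WebSocket.
});
```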

↑ Screenshots from the custom Node.js tracking server, which has a Three.js front end for managing tracking settings and debugging.

A custom GLSL shader in Three.js takes the spatial pattern created by the Node.js script and applies it to a point cloud. The visualization emphasises how “defined” a signal is: very defined signals are white and often carry a graphic pattern of their own (transmission patterns), whereas less defined signals are darker coloured, blend into the background and can “fan out” over larger parts of the spectrum, sometimes interfering with stronger signals. The visualization reinforces this effect through depth and perspective.
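The project’s shader isn’t published; the sketch below shows one way such a point cloud could be built in Three.js, with a per-point amplitude attribute driving brightness and opacity so that well-defined signals read as white while weaker ones sink into the background. All names and constants are illustrative.

```javascript
import * as THREE from 'three';

// Illustrative only: each point carries (frequency, time, amplitude) baked into its
// position, plus a normalised amplitude attribute used for colouring.
function buildSpectrumCloud(points) {
  const positions = new Float32Array(points.length * 3);
  const amplitudes = new Float32Array(points.length);

  points.forEach((p, i) => {
    positions.set([p.x, p.y, p.z], i * 3);
    amplitudes[i] = p.amplitude; // 0 (noise floor) … 1 (strong signal)
  });

  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  geometry.setAttribute('amplitude', new THREE.BufferAttribute(amplitudes, 1));

  const material = new THREE.ShaderMaterial({
    transparent: true,
    depthWrite: false,
    vertexShader: /* glsl */ `
      attribute float amplitude;
      varying float vAmplitude;
      void main() {
        vAmplitude = amplitude;
        vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
        // Stronger signals get slightly larger points; size attenuates with depth.
        gl_PointSize = (1.0 + 3.0 * amplitude) * (300.0 / -mvPosition.z);
        gl_Position = projectionMatrix * mvPosition;
      }
    `,
    fragmentShader: /* glsl */ `
      varying float vAmplitude;
      void main() {
        // Well-defined signals approach white; weak ones fade towards the background.
        vec3 colour = mix(vec3(0.08, 0.09, 0.12), vec3(1.0), vAmplitude);
        gl_FragColor = vec4(colour, 0.2 + 0.8 * vAmplitude);
      }
    `,
  });

  return new THREE.Points(geometry, material);
}
```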

Three Intel RealSense sensors are connected to three Intel NUC Ubuntu computers running Nuitrack person-recognition software. The “skeleton data” from these three computers is sent to a central Node.js server that stitches the three skeleton streams into a single coordinate space and creates a “window”, defined by a centre frequency and width, for each user. When users stand close to one another and their windows overlap, the windows are merged into a single window.
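A minimal sketch of how such windows could be derived and merged on the server follows; the spectrum span, wall width and function names are assumptions, not the project’s code.

```javascript
const SPECTRUM = { startHz: 80e6, endHz: 1000e6 }; // span shown on the panorama (assumed)
const WALL_WIDTH_M = 10;                           // physical width of the installation (assumed)
const WINDOW_WIDTH_HZ = 5e6;                       // default per-user window (assumed)

// Map a user's lateral position along the wall to a tuning window.
function windowForUser(user) {
  const t = Math.min(Math.max(user.xMetres / WALL_WIDTH_M, 0), 1);
  const centerHz = SPECTRUM.startHz + t * (SPECTRUM.endHz - SPECTRUM.startHz);
  return { lowHz: centerHz - WINDOW_WIDTH_HZ / 2, highHz: centerHz + WINDOW_WIDTH_HZ / 2 };
}

// Merge overlapping windows so users standing close together share a single window.
function mergeWindows(windows) {
  const sorted = [...windows].sort((a, b) => a.lowHz - b.lowHz);
  return sorted.reduce((merged, w) => {
    const last = merged[merged.length - 1];
    if (last && w.lowHz <= last.highHz) {
      last.highHz = Math.max(last.highHz, w.highHz); // overlap → extend the previous window
    } else {
      merged.push({ ...w });
    }
    return merged;
  }, []);
}

const users = [{ xMetres: 2.1 }, { xMetres: 2.3 }, { xMetres: 7.5 }];
console.log(mergeWindows(users.map(windowForUser))); // two nearby users share one merged window
```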

Project Page | Richard Vijgen
