North Atlantic right whales (Eubalaena glacialis) are among the most endangered whales in the world: it is thought that fewer than 500 are alive today. Since the 1970s, an average of two North Atlantic right whales have been killed annually by ship strikes. But their own calls could help save them from collisions. Last month, whale researchers from the Woods Hole Oceanographic Institution (WHOI) in Massachusetts, US, tested the first-ever real-time whale call classification and reporting system. Two 1.8-meter-long ‘gliders’ (torpedo-shaped robots) were successfully deployed in the Gulf of Maine, where they detected nine North Atlantic right whales.
The sensor-laden WHOI robots identified whale calls and transmitted their findings to shore-based researchers within hours of hearing the whales. As part of the oceanographic research, Mark Baumgartner and Dave Fratantoni had equipped the gliders with digital acoustic monitoring (DMON) instruments and specialised acoustic software that allowed them to distinguish between four types of baleen whale — sei, fin, humpback and right whales.
Unfortunately, right whales are susceptible to collisions because of their docile nature and their feeding habits. They feed on microscopic zooplankton close to the water surface (which also leaves them prone to entanglement in fishing gear), and they often fail to react to the sounds of approaching boats. Their migratory routes also pass through some of the world’s busiest shipping lanes.
The innovative whale detection system provides conservation managers with a cost-effective alternative to ship- or plane-based observations. “The conventional way of monitoring marine mammals involves attaching a hydrophone to a buoy and recording audio for up to a year in the ocean, after which you retrieve the recorder and sift through the audio for whale calls. Once you find a call, you know there was a whale at that particular fixed location many months ago. However, this isn't particularly useful for management purposes, as you can’t tell ships to avoid that place because the whales have long since left the area. The new whale detection system sends whale detections in real-time, so managers have the opportunity to do something with that information immediately,” says Baumgartner.
Compared to a moored buoy network, the robot gliders used by the WHOI researchers have the added advantage of being able to move up, down, and laterally in the water. Using GPS technology, the gliders can modify their behaviour autonomously. Attached to each glider is a DMON that can detect a call and communicate data to the glider’s computer as NMEA text strings, a standard marine instrument protocol. The glider packages that information up, along with data from other instruments, and transmits it via an Iridium satellite to a shore-based computer. Baumgartner and his team are using the technology to investigate areas that are difficult to access during certain times of year. “We knew within a couple of days of deploying them that the gliders were hearing humpback, fin, and right whales, so we knew what species we could expect to encounter before we even arrived in the study area by ship,” says Baumgartner. This allowed them to quickly locate whales and carry out their main scientific objectives: mapping the distribution of whales in the area, and collecting environmental data, including water temperature, salinity levels, and the abundance of zooplankton.
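As a rough illustration of that glider-side data path, the sketch below parses a made-up detection sentence in NMEA style. The `$WHDET` talker name, field layout, and values are all invented for this example; only the general NMEA conventions (a `$`-prefixed, comma-delimited sentence followed by a two-hex-digit XOR checksum after `*`) reflect the real standard.

```python
def nmea_checksum(sentence: str) -> str:
    """XOR of all characters between '$' and '*', as two hex digits."""
    payload = sentence[sentence.index("$") + 1 : sentence.index("*")]
    cksum = 0
    for ch in payload:
        cksum ^= ord(ch)
    return f"{cksum:02X}"

def parse_detection(sentence: str) -> dict:
    """Parse a hypothetical $WHDET detection sentence into its fields."""
    body, _, given = sentence.partition("*")
    if nmea_checksum(sentence) != given.strip():
        raise ValueError("checksum mismatch")
    fields = body.lstrip("$").split(",")
    return {"talker": fields[0], "species": fields[1],
            "utc": fields[2], "score": float(fields[3])}

msg = "$WHDET,RIGHT,134502,0.92*32"  # invented example sentence
print(parse_detection(msg)["species"])  # RIGHT
```

The DMON’s actual sentence formats will differ; the point is simply that detections travel to shore as compact, checksummed text rather than raw audio.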
A spectral waterfall
Sound detection was carried out via each glider’s DMON equipment – a circuit board and battery about the size of an iPhone – which records and processes audio in real-time. The processed audio is rendered as a spectrogram, which lets marine researchers visualize sound in graphical form. People are used to listening to audio with their ears, but it is much easier to process the data once it is transferred into a visual format, Baumgartner explains. He describes it as similar to notation in sheet music. In 2011, Baumgartner developed a low-frequency detection and classification system (LFDCS) which "draws a line" through each call in a spectrogram. His software generates a pitch track derived from the spectrogram, which represents how the frequency (pitch) of a sound changes over time. “The pitch track that we create is exactly that — a visual representation of a sound. It tells you how loud a sound is, how long it is, and how the frequency or pitch of the sound changes over time. All this information is pertinent for distinguishing between the calls of different species,” says Baumgartner.
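To make the idea of a pitch track concrete, here is a minimal sketch (numpy only) that picks the loudest frequency in each short-time Fourier transform frame of a synthetic upsweep. This crude per-frame argmax is a stand-in for, not a reproduction of, the LFDCS pitch-tracking algorithm, and the 2 kHz sample rate and frame sizes are arbitrary choices for the example.

```python
import numpy as np

def pitch_track(audio, rate, frame=256, hop=128):
    """Return (times, freqs): the loudest frequency in each windowed
    FFT frame -- a crude sketch of pitch tracking."""
    window = np.hanning(frame)
    times, freqs = [], []
    for start in range(0, len(audio) - frame, hop):
        spectrum = np.abs(np.fft.rfft(audio[start:start + frame] * window))
        peak = int(np.argmax(spectrum))       # strongest frequency bin
        times.append(start / rate)
        freqs.append(peak * rate / frame)     # bin index -> Hz
    return np.array(times), np.array(freqs)

# Synthetic "upcall": a one-second tone whose instantaneous frequency
# rises from 100 Hz to 400 Hz (phase = 100*t + 150*t**2 cycles).
rate = 2000
t = np.arange(rate) / rate
audio = np.sin(2 * np.pi * (100 * t + 150 * t ** 2))
times, freqs = pitch_track(audio, rate)
```

Plotting `freqs` against `times` would show the rising line that the LFDCS “draws” through an upcall in the spectrogram.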
Differentiating between species starts with the instrument (DMON) identifying parts of the spectrum above a certain threshold (more than 10 decibels above background). Then the LFDCS algorithm tries to describe that sound with a pitch track. Once the characteristics of the sound (e.g., duration, average frequency, rate of change of frequency with time) have been identified, the researchers use a statistical procedure called discriminant function analysis to compare the sound to a library of calls. “The instrument has 15 call types in the library that it can recognize, and it compares the attributes of the pitch track to the attributes of all the call types in the call library. Each of the four species that the instrument recognizes makes relatively unique calls that we try to capture in the call library,” says Baumgartner. Right whales are known to groan, pop and belch, typically at frequencies of around 500 Hz.
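That classification step can be caricatured as: extract a few pitch-track attributes, then find the closest entry in a call library. The sketch below uses a simple nearest-entry distance in scaled feature space rather than true discriminant function analysis, and the library values are invented for illustration, not WHOI’s actual call library.

```python
import math

# Toy call library keyed by (duration s, mean frequency Hz, slope Hz/s).
# All values are illustrative placeholders.
CALL_LIBRARY = {
    "right_upcall":  (1.0, 150.0, 100.0),
    "fin_pulse":     (0.8,  20.0, -10.0),
    "humpback_moan": (2.0, 300.0,   0.0),
    "sei_downsweep": (1.4,  60.0, -40.0),
}

def classify(duration, mean_freq, slope):
    """Nearest library entry in scaled feature space -- a simplified
    stand-in for the LFDCS's discriminant function analysis."""
    scales = (1.0, 100.0, 50.0)  # rough per-feature scales so units compare
    best, best_dist = None, math.inf
    for name, proto in CALL_LIBRARY.items():
        dist = math.dist(
            [f / s for f, s in zip((duration, mean_freq, slope), scales)],
            [f / s for f, s in zip(proto, scales)],
        )
        if dist < best_dist:
            best, best_dist = name, dist
    return best

print(classify(1.1, 160.0, 90.0))  # right_upcall
```

The real system works the same way in spirit: a detected pitch track is scored against all 15 library call types, and the best-matching type determines the species reported.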
Because of limitations in the amount of data that can be transmitted via satellite, the DMON/LFDCS relays a maximum of 8 KB of pitch-track data every hour. “It would take an enormous amount of bandwidth to get all the audio data transmitted from the glider to a shore-side computer via the satellite,” says Baumgartner. “The satellite communication system is simply not built for that kind of data volume. That is why it’s so important to process the audio for whale sounds right on the glider and just send the detection information back to shore.”
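Some back-of-the-envelope arithmetic shows why on-board processing matters. Assuming, purely for illustration, a modest 2 kHz, 16-bit recording (the DMON’s actual settings are not given here) and reading 8 KB as 8,192 bytes:

```python
# Rough bandwidth comparison with illustrative, assumed numbers.
SAMPLE_RATE_HZ = 2000            # assumed low-frequency sampling rate
BYTES_PER_SAMPLE = 2             # 16-bit audio
raw_bytes_per_hour = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * 3600
pitch_track_bytes_per_hour = 8 * 1024   # the 8 KB/hour budget
ratio = raw_bytes_per_hour / pitch_track_bytes_per_hour  # ~1758x
```

Even at this low sample rate, an hour of raw audio is well over a thousand times larger than the hourly pitch-track budget, which is exactly the gap that on-glider detection closes.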