[OANNES Foro] Dolphin whistle instantly translated by computer

Mario Cabrejos casal at infotex.com.pe
Thu Apr 10 17:46:50 PDT 2014


Dolphin whistle instantly translated by computer

by Hal Hodson <http://www.newscientist.com/search?rbauthors=Hal+Hodson>

26 March 2014

http://www.newscientist.com/article/mg22129624.300

 

Software has performed the first real-time translation of a dolphin whistle
- and better data tools are giving fresh insights into primate communication
too

IT was late August 2013 and Denise Herzing was swimming in the Caribbean.
The dolphin pod she had been tracking for the past 25 years was playing
around her boat. Suddenly, she heard one of them say, "Sargassum".

"I was like whoa! We have a match. I was stunned," says Herzing, who is the
director of the Wild Dolphin Project. She was wearing a prototype dolphin
translator called
<http://www.newscientist.com/article/mg21028115.400-talk-with-a-dolphin-via-
underwater-translation-machine.html> Cetacean Hearing and Telemetry (CHAT)
and it had just translated a live dolphin whistle for the first time.

 

It detected a whistle for sargassum, or seaweed, which she and her team had
invented to use when playing with the dolphin pod. They hoped the dolphins
would adopt the whistles, which are easy to distinguish from their own
natural whistles - and they were not disappointed. When the computer picked
up the sargassum whistle, Herzing heard her own recorded voice saying the
word into her ear.

As well as boosting our understanding of animal behaviour, the moment hints
at the potential for using algorithms to analyse any activity where
information is transmitted - including our daily activities (see "
<http://www.newscientist.com/article/mg22129624.300-dolphin-whistle-instantl
y-translated-by-computer.html?full=true#bx296243B1> Scripts for life").

 

"It sounds like a fabulous observation, one you almost have to resist
speculating on. It's provocative," says Michael Coen, a biostatistician at
the University of Wisconsin-Madison.

Herzing is quick to acknowledge potential problems with the sargassum
whistle. It is just one instance and so far hasn't been repeated. Its audio
profile looks different from the whistle they taught the dolphins - it has
the same shape but came in at a higher frequency. Brenda McCowan of the
University of California, Davis, says her experience with dolphin
vocalisations matches that observation.

 <http://www.cc.gatech.edu/home/thad/> Thad Starner at the Georgia Institute
of Technology and technical lead on the wearable computer
<http://www.newscientist.com/article/mg21929364.500-google-glass-has-its-ele
ctronic-eye-on-health.html> Google Glass, built CHAT for Herzing with a team
of graduate students. Starner and Herzing are using pattern-discovery
algorithms, designed to analyse dolphin whistles and extract meaningful
features that a person might miss or not think to look for. As well as
listening out for invented whistles, the team hopes to start trying to
figure out what the dolphins' natural communication means, too.

 

McCowan says it's an exciting time for the whole field of animal
communication. With better information-processing tools, researchers can
analyse huge data sets of animal behaviour for patterns.

Coen is already doing something like this with white-cheeked gibbons. Using
similar machine-learning techniques to those used by Starner and McCowan, he
has found 27 different fundamental units in gibbon calls.

McCowan, meanwhile, has recently modelled the behaviour of rhesus macaques
at the  <http://www.primate.ucdavis.edu/> National Primate Research Center
in California. The idea is to predict when the macaques would descend into
the violent social unrest known as "cage war" that often leads to the death
of the alpha family.

 

Her team started collecting data, making 37,000 observations of key signs of
dominance, subordination and affiliation over three years. Among other
things, their analysis showed that cage stability improved if new young
adult males were introduced now and again, since they seemed to grow into
"policing" roles. "You had to look at the data," McCowan says. "It wasn't
something a human could see."

Terrence Deacon, an anthropologist and neuroscientist at the University of
California, Berkeley, explains that some pattern of repetition is a basic
requirement when information is transmitted. In other words, if Herzing's
dolphins or McCowan's macaques are exchanging information - if their
behaviour is not just random, meaningless noise - then there must be some
discoverable patterns. Information theory can reveal what those patterns
are, and which parts of a whistle are important, helping behaviourists
figure out what animals are communicating.
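Deacon's point can be made concrete with a toy calculation: a sequence with repeated structure carries less Shannon entropy per symbol than one that looks uniformly random over a larger alphabet. The sketch below is only an illustration - the symbol sequences are invented, not real whistle data:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits per symbol) of a discrete symbol sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A patterned sequence (pure repetition of two symbols) scores low;
# a sequence with no repeated symbols scores the maximum for its length.
patterned = list("ABABABABABAB")
varied = list("ABCDEFGHIJKL")

print(shannon_entropy(patterned))  # 1.0 bit/symbol
print(shannon_entropy(varied))     # ~3.58 bits/symbol
```

Real analyses work on far messier data, but the principle is the same: structure shows up as statistical redundancy that an algorithm can detect.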

The first results from Starner and Herzing's work on dolphin
communication-processing are due to be presented at the speech and signal
processing conference in Florence, Italy, in May. Last summer's work was cut
short because the team lost the dolphin pod, but they did make some
progress. Starner's algorithms discovered eight different components in a
sample of 73 whistles. It's still preliminary, but they were able to match
certain strings of those components with mother-calf interactions, for
instance. The work has let them plan for the coming summer when they want to
confirm two-way communication between humans and dolphins.
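One loose way to picture this kind of component discovery is clustering: describe each whistle by a few acoustic features and see how many distinct groups emerge. The toy k-means below is purely illustrative - the feature values are invented, and k-means itself is an assumption, not necessarily the team's actual algorithm:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Toy k-means: group feature vectors into k clusters by Euclidean
    distance - a simplified stand-in for pattern discovery on whistles."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep old
        # centroid if a cluster ends up empty).
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical whistle features: (peak frequency in kHz, duration in s).
whistles = [(8.1, 0.4), (8.3, 0.5), (14.9, 0.2),
            (15.2, 0.3), (8.0, 0.45), (15.0, 0.25)]
centroids, clusters = kmeans(whistles, k=2)
```

On this invented data the six whistles fall cleanly into a low-frequency and a high-frequency group; the hard part in practice is choosing features and the number of components from noisy recordings.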

Deacon is excited to see if such work can lead to a better understanding of
animal cultures. He suspects much animal communication will turn out to be
basic pointing or signposting rather than more complex language. But humans
often communicate on a basic level too. "I don't see a fundamental white
line that distinguishes us from other animals," he says.

This article appeared in print under the headline "Decoding dolphin"

 


Scripts for life


Thad Starner wants you to be the next guinea pigs for the algorithms he uses
to study animal communications (see main story). He thinks
pattern-recognition software can discover the signature of any activity,
from brushing your teeth to commuting to work. He wants to create wearable
computers that learn what the wearer is doing.

"Imagine having sensors on your wrist, and as you go through daily life, it
could figure out what paging through a book is, opening the car door. All
these things are unique gestures." Put together they are scripts, Starner
says. "Just by wearing the device it learns how to interact with the world."

 


