M.

Making Music at the Speed of Light

Everything that vibrates makes music. The music that is perceived by human beings is human music. To be able to perceive the music of atoms, stars, and animals, it has to be transformed. (Karlheinz Stockhausen, 1975)

Sonification is the use of sound to perceptualize data and information. It is an interesting complement, or even an alternative, to visualization techniques. Infographics have already become widespread and are considered cool or even sexy.

Just imagine the huge potential of infosounds in the future.

“ATLAS is a music box” (c) Toya Walker.

One impressive example of sonification involves the Large Hadron Collider (LHC) at CERN in Switzerland. The LHC was in the news recently when it broke its own energy world record on March 30, 2010. The high-energy collisions create an enormous quantity of data, which is not only a big challenge in the field of computing, but may also be one of the reasons why the LHC made its way into the world of music and sounds.

Here is a visualization of what happened inside the LHC when two opposing particle beams collided with an energy of 7 000 000 000 000 eV (7 TeV):

httpv://www.youtube.com/watch?v=EP0ouOgMuNY

Now a group of particle physicists, composers, software developers and artists in the UK has started a project called LHCsound, which turns real and simulated data from the ATLAS detector at the Large Hadron Collider into sounds.

The team members, Lily Asquith (particle physicist), Richard Dobson (composer and music software developer), Archer Endrich (composer), Toya Walker (illustrator and painter), Ed Chocolate (London-based producer and DJ), and Sir Eddie Real (percussionist), want to attract people to the results of the LHC experiments in a way that is novel, exciting and accessible.

Their “simplest” example of sonification is HiggsJetSimple. It maps properties of the particle jet onto properties of sound: energy becomes volume, distance defines the timing, and the deflection of the beam sets the pitch. In this example the sounds become much less dense towards the end, with isolated events separated by silences of several seconds:

httpv://www.youtube.com/watch?v=IrXqptn6qvo
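To get a feel for how such a mapping might work, here is a minimal sketch in Python. It is not LHCsound’s actual software, and the event values, frequency range and output file name are invented for illustration: each toy “event” becomes a sine tone whose onset delay, loudness and pitch follow the mapping described above, and the result is written to a WAV file using only the standard library.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second

# Toy "jet" data: each event has an energy, a distance to the previous event,
# and a deflection value. These numbers are made up for illustration only;
# they are not real ATLAS data.
events = [
    {"energy": 0.9, "distance": 0.1, "deflection": 0.2},
    {"energy": 0.5, "distance": 0.3, "deflection": 0.6},
    {"energy": 0.7, "distance": 0.2, "deflection": 0.4},
    {"energy": 0.2, "distance": 1.5, "deflection": 0.8},
    {"energy": 0.3, "distance": 2.0, "deflection": 0.1},
]

def tone(frequency, duration, volume):
    """Return raw samples for a plain sine tone."""
    n = int(SAMPLE_RATE * duration)
    return [volume * math.sin(2 * math.pi * frequency * t / SAMPLE_RATE)
            for t in range(n)]

samples = []
for event in events:
    # The mapping described above:
    #   distance   -> timing  (a pause before the event is heard)
    #   energy     -> volume  (more energetic events sound louder)
    #   deflection -> pitch   (here scaled between 220 Hz and 880 Hz)
    samples.extend([0.0] * int(SAMPLE_RATE * event["distance"]))
    frequency = 220 + event["deflection"] * 660
    samples.extend(tone(frequency, 0.25, event["energy"]))

# Write the result as a 16-bit mono WAV file.
with wave.open("sonification.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```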

By the way, Frank Zappa sonified the search for Higgs’ boson many years ago. You can find it on his album Trance-Fusion.
httpv://www.youtube.com/watch?v=2lHixowXnlU

In principle, any data can be sonified. NASA has sonified the sun and the planets. Seismic data from earthquakes and volcanoes has been sonified to great effect. You can even sonify a painting, photograph or moving image, which has enabled blind people to “see” with sound.

Sonification has great potential, and I am eager to see whether it can make its way into popular culture the way data visualization did with infographics.

(Thank you Toya Walker, CERN, Frank Zappa and LHCsound)

W.

With new technology, literacy evolves

It was Louis Braille, a student at the Royal Institute for Blind Youth in Paris, who modified the French Army’s “night writing” in 1821 and came up with what is known as Braille today. For the first time in history, blind people had access to a reliable method of written communication, resulting in a significant rise in social status. Louis Braille was embraced as a liberator.

Braille code where the word “premier” (French for “first”) can be read.

Nowadays, with more and more written words digitized, MP3 players, audio books and screen-reading software offer blind people a real alternative for accessing and communicating in written language, without even knowing Braille. A report by the National Federation of the Blind found that fewer than 10 percent of legally blind Americans learn to read and write Braille today; back in the 1950s it was roughly half of all blind children.

There has been a big debate about whether this affects cognitive development. Moving from written to spoken language may have cultural consequences rather than cognitive ones. It is about losing your own way of communication, and the discussion of that issue can be as passionate as the debate about cochlear implants and their impact on the use of sign language, or about the decline of language variety in general.

But I would rather link the developments of Braille and new technology to how reading and writing are taught in schools today. Although the transition from written and printed texts to digital representations has been more subtle for sighted people, it is still remarkable and has important consequences, too. It will probably affect our general view of literacy. With new technology, literacy has become harder to define.

That’s how text messaging looked in 1912. (c) Underwood & Underwood

Take penmanship, for example: While handwriting was still necessary in the last century for documents, reports, and the like, this is no longer the case today. The majority of formal documents are expected to be typed, and most people use handwriting, if at all, only for informal notes and reminders. One could question the relevance of learning penmanship at all. A few decades ago, experts even predicted that the electronic age would create a postliterate generation as new forms of media eclipsed the written word. Marshall McLuhan, the Canadian philosopher and scholar best known for the expression “global village”, claimed in Understanding Media: The Extensions of Man (1964) that Western culture would return to the “tribal and oral pattern.”

Do we really need to learn penmanship to be literate people today? The architecture of our brain is flexible. Blind people, for instance, consistently surpass sighted people on tests of verbal memory, according to a 2003 study in Nature Neuroscience. Instead of teaching handwriting, it would be more appropriate to teach digital literacy, not least because even standardized tests are adopting the new technologies. For example, in 2011 the writing test of the National Assessment of Educational Progress (NAEP) will require 8th and 12th graders to compose on computers, with 4th graders following in 2019.

Although one could argue that the question of teaching and testing is just another way of asking “Which came first, the chicken or the egg?”, I would argue that even without such testing, future education will be much more about digital literacy than calligraphy. And new technology will not only affect the way we teach reading and writing, but also the way we teach art, music, mathematics, science, foreign languages and literature. I am curious about the new approaches and very happy to live in a global village.