Telematic concerts, computer music, and UCI's mission to be extraordinary
Photos provided by Christopher Dobrian
Christopher Dobrian, a professor of Integrated Composition, Improvisation, and Technology (ICIT) in the Department of Music at UC Irvine, with a joint appointment in the Department of Informatics, composes instrumental and electronic music; teaches composition, theory, and computer music; and researches and develops interactive AI computer systems that cognize, compose, and improvise music.

However, Christopher wasn't always interested in computer music and technology. As an undergraduate studying guitar at UC Santa Cruz, he had the opportunity to work with composers and to use the electronic music studios, which exposed him to contemporary ideas. "This pushed me out of my comfortable zone of thinking that everything had to be tonal and pretty," the professor says.

Wondering where to continue his education, Dobrian was advised to apply to UC San Diego, whose primary focus was composition. "They also had a computer music program, [with an emphasis] on digital music technology," he explains, describing what sparked his initial interest in the field. After graduating with a PhD from UCSD, Christopher began teaching while continuing to explore what role computers could play in music.

When UCI hired him in 1996 to start the music technology program, the professor made it his main goal with ICIT to break down some of the categories that divided musicians and to concentrate on combining them in creative ways. "We look for applicants who are [either] already doing that kind of integration or have shown that they really want to pursue that integration of improvisation, composition, and implementing that with technology," Dobrian shares.
"It was really just some of us on the faculty asking [ourselves]: how can we be extraordinary? What can we do that will really be special both for us and the students?"
With courses offered in improvisation, computer music, and composition, musicians also learn how these concepts work together. For the dissertation, students must complete both a creative and a research project, giving them the chance to "develop your sense of what you want to accomplish as an artist and … educating yourself to be the best artist."

Part of the implementation process was providing the necessary equipment. Christopher currently directs the Realtime Experimental Audio Laboratory (REALab), the Gassmann Electronic Music Studio, and the Gassmann Electronic Music Series. "In the 1990s, the model of an electronic music studio was one composer with a whole bunch of equipment making strange sounds — I was interested not only in doing that but also bringing music technology on stage," he shares.

Originally created as a playground for experimentation, REALab is now a laboratory built for research and creative work in the use of computers in live performance, including realtime audio processing, sound spatialization, interactivity, and alternative computer-mediated instruments.

A significant area of research has been telematic performance, in which musicians located in different parts of the world play a joint event, with audio from each site transmitted in real time. This demands low latency and specialized software: with a very good wired Ethernet connection to the internet and an audio interface that gets sound into the computer with minimal delay, it is possible. "Within a couple of hundred miles it works really well," Dobrian tells me, recalling concerts between UCI and UCSD, as well as over greater distances such as New York and South Korea.
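Why distance matters can be seen with a back-of-envelope latency estimate. The sketch below is illustrative only (the figures for fiber speed and buffer size are assumptions, not details from the interview): total one-way delay is roughly propagation time through fiber plus the audio buffering on each end.

```python
# Rough one-way latency estimate for telematic performance (illustrative).
# Assumptions: signal travels ~200,000 km/s in optical fiber (about 2/3 the
# speed of light), and each end adds one audio buffer of delay.

SPEED_IN_FIBER_KM_S = 200_000

def propagation_ms(distance_km: float) -> float:
    """Milliseconds for the signal to traverse the fiber distance."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

def buffer_ms(frames: int = 128, sample_rate: int = 48_000) -> float:
    """Milliseconds of delay introduced by one audio buffer."""
    return frames / sample_rate * 1000

def one_way_latency_ms(distance_km: float,
                       frames: int = 128,
                       sample_rate: int = 48_000) -> float:
    """Idealized total: propagation plus capture and playback buffers."""
    return propagation_ms(distance_km) + 2 * buffer_ms(frames, sample_rate)
```

Under these assumptions, a ~150 km hop like UCI to UCSD adds under a millisecond of propagation delay, while a transcontinental or transpacific link adds tens of milliseconds, which is why nearby sites "work really well" and distant ones push the limits of what ensemble players can tolerate.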

That isn't the only advance in performance ICIT has made: the 2020–2021 Gassmann Electronic Music Series recently held two online concerts in 360° interactive-immersive video with the mixed quartet Hub New Music. According to Christopher, this was also a case of "what will we do that will be exceptional?" Because the Insta360 Pro camera actually consists of six lenses mounted on a spherical body, with the images from each stitched together digitally, the audience can navigate through the video and see how different placements affect the sound.
Photo of Christopher Dobrian
The professor conducts research on interactive AI computer systems that can interpret and improvise music, and has written software in which a computer interprets the gestures of a live performance and responds to them. The gestures in question, however, are not physical ones but metaphorical ones in sound. "I ask myself: what is it about music that gives us the feeling that there is movement in music? What evokes that sense of motion, and how do you make the computer understand that?" Christopher elucidates.

Although the machine can't empathize and feel when the music is rising or accelerating, what it can do is track change over time. Dobrian began tracking a myriad of parameters and graphing them as shapes, which allowed him to categorize how things change. "The computer tracks, categorizes, listens to what someone is playing, and responds to that," he continues. If it "hears" an upward passage, the interactive system identifies the upward motion and can choose to play back in an upward manner too, or perhaps do a parallel downward passage instead.
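The idea of tracking change over time and answering it can be sketched in a few lines. This is a minimal illustration, not Dobrian's actual software: it classifies the net direction of a phrase of MIDI pitches and generates either a parallel (same-contour) or contrary (mirrored) response; the function names and the transposition interval are invented for the example.

```python
# Minimal sketch (not the actual research software): classify the direction
# of a melodic phrase and generate a responding phrase.

def direction(pitches: list[int]) -> str:
    """Classify a phrase as 'up', 'down', or 'static' by net pitch change."""
    if len(pitches) < 2:
        return "static"
    delta = pitches[-1] - pitches[0]
    if delta > 0:
        return "up"
    if delta < 0:
        return "down"
    return "static"

def respond(pitches: list[int], mode: str = "parallel",
            transpose: int = 7) -> list[int]:
    """Echo the phrase: 'parallel' keeps its contour shifted by an interval,
    'contrary' mirrors the contour around the first note."""
    if mode == "parallel":
        return [p + transpose for p in pitches]
    first = pitches[0]
    return [first - (p - first) for p in pitches]
```

A real system would of course track many more parameters (dynamics, density, tempo) and over continuous time, but the core move is the same: reduce what was played to a categorized shape, then choose a shape to answer with.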
"Technology can change what we expect as possible, which then changes our expectations of what might be useful artistically. These things will continue to expand our idea of what music and art even can be."
Most of the professor's compositions are a combination of live performance and electronic music. Microepiphanies, which was written and performed by Dobrian and Douglas-Scott Goheen, is a digital opera where music, sound, lights, and projections are all controlled by a computer. "We were the only people involved, we had a Yamaha Disklavier, and everything seemed mysterious to the audience in terms of how it was working," Christopher recalls. "We made fun of digital technology, deliberately simulat[ing] computer crashes."

Another interesting project, JazzBot, began when someone made music robots that weren't musical at all; as the professor says, they just made sound (for instance, one moved a mallet and hit a woodblock). Deciding to see whether a computer could control these sonic sculptures, the team invited a jazz pianist to play the Yamaha Disklavier, and the notes he played were used as information sent to the "robots" so that they would play along with him.

Although ICIT places a heavy emphasis on music and technology, it is now paying great attention to promoting a diverse student body. "Our secret long-term goal is to make universities have more diverse faculty," the professor conveys. That meant recognizing the implicit biases that exist in the system and, from there, figuring out how to change them.

"There's this idea that if you ignore gender and race, and just look at the music, you will decide who the best people are and it will all work out fairly... [but] that's not really true," Christopher believes. "If we look for people who sound like us and do things like us, we're going to end up choosing people who look like us as well. We really try to find people who aren't like us but who show potential to be interesting artists."

For UC Irvine, finding potential in musicians means finding people who make connections between things that aren't normally connected and who think outside the box. "We're not just looking at how good somebody is now, but seeing that they have a creative spark that might lead them to be very good in the future," Dobrian shares. "You don't necessarily choose the person who is already at a professional level in their skill; maybe you'd choose somebody who is not so professional but is more exciting in terms of their ideas."