
Jan 27, 2022

Processing Thought: Collaborative Partner Carol Reiley
BY CORINNA DA FONSECA-WOLLHEIM

Carol Reiley showed up prepared for her first meeting at the San Francisco Symphony as a newly minted Collaborative Partner. Like her musician colleagues on the eight-member brain trust, assembled by Music Director Esa-Pekka Salonen to breathe new life into the 110-year-old organization, she took her task seriously. So she arrived armed with a pitch deck, a five-year plan, and a crisp list of priorities. Also, a question: “What are your pain points?”

In a video interview, Reiley recalled the consternation in the room. “They looked surprised and let out a laugh,” she said. “I guess they haven’t been asked that before in that way. It’s a very product person’s question.”

As the only non-musician on the artistic leadership council, Reiley brings a unique set of skills—and questions. A serial entrepreneur in the worlds of robotics and artificial intelligence (AI), she recently partnered with the star violinist Hilary Hahn to found deepmusic.ai, a company designed to infuse music making—and teaching—with the power of AI. One of the first compositions to come out of it is David Lang’s out of body for solo violin, in which the Pulitzer Prize-winning composer engages in a virtual game of sound-tennis, with AI writing an upbow answer to every downbow challenge laid down by Lang.

Reiley’s data wizardry supported the creation of LIGETI: PARADIGMS, an entrancing video project on SFSymphony+ designed by the media artist Refik Anadol, in which shoals of “data crystals” and slow-morphing clouds unfurl in hypnotic waves to the strains of some of György Ligeti’s most ethereal scores. When the Symphony commissioned Nico Muhly in 2021 to compose a work featuring each of the Collaborative Partners, Reiley contributed an AI-written passage based on musical lines Muhly had provided her. Her job, in a project like this, is not to compose with a particular piece of software, but to code the machine’s neural pathways so that it can assimilate the rules of a given musical language and compose by itself. In essence, she teaches machines to learn.

“At this point in time I would say it is a very bad student,” Reiley said of her creation with a wry laugh. So far, she said, much of the machine-composed music “might sound fine to the untrained ear, but to the composers who have to train the program, it can be a little bit painful.”

Muhly, diplomatically, said it was “really fascinating” being made to partner with a virtual collaborator. For Throughline he gave Reiley forty seconds’ worth of music—“something that was relatively Bach-y and also harmonically stable”—which she fed into her entity, coaxing it to compose another forty seconds in the same style. “I can obviously tell when it’s it and when it’s me,” Muhly said, “but it felt more like a game of telephone than a game of Computers Are Replacing Us.”
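
For the technically curious, the “seed and continue” idea Muhly describes can be sketched in a few lines of code. The toy Python below is only an illustration of the general approach (learn the statistical habits of a short seed melody, then sample a continuation of equal length), not Reiley’s actual system, which relies on neural networks trained on far more music; the note names here are invented.

import random
from collections import defaultdict

# A short "seed" melody, standing in for the forty seconds Muhly provided.
# These note names are invented for illustration.
seed = ["C4", "E4", "G4", "E4", "F4", "D4", "G4", "C4",
        "E4", "G4", "A4", "F4", "D4", "G4", "E4", "C4"]

# Learn which notes tend to follow which (a first-order Markov chain).
transitions = defaultdict(list)
for current, following in zip(seed, seed[1:]):
    transitions[current].append(following)

# Sample a continuation of the same length, starting from the seed's last note.
random.seed(2022)
note = seed[-1]
continuation = []
for _ in range(len(seed)):
    note = random.choice(transitions.get(note, seed))  # fall back to any seed note
    continuation.append(note)

print("Seed:        ", " ".join(seed))
print("Continuation:", " ".join(continuation))

A model this simple only echoes the local habits of its seed; a neural network trained on a large body of music can do far more, but the game-of-telephone quality Muhly describes is much the same.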

In any case, Reiley is not on a mission to replace human musicians with computers. Inspired by her background in telesurgery and health robotics, she sees AI as a superpower that can be harnessed for good. “When you apply it to surgeons, you create superhuman eyesight, you can get rid of hand imperfection to create superhuman precision,” she said. “So how can we use AI to empower artists, to make them super creative?”

From an early age, Reiley learned to investigate the inner workings of things. Her father, an engineer, encouraged her to take apart a computer. When her hamster escaped, he helped her jury-rig a mousetrap to a jar in order to lure back the pet unharmed. “That was a moment when I thought: engineering could be a really interesting way to solve problems,” she said.

But growing up she thought she might become a doctor. She volunteered at a hospital. One day a doctor spoke to her about pacemakers and a thought occurred to her. “I felt like a doctor could save one life at a time, but when I saw the pacemaker I thought, wow, I could actually build something and save millions of lives at once,” she said. “That impact is why I love being with one foot in tech and one foot in something else. It’s that sweet spot between technology and domain expertise.”

For fifteen years she worked in health robotics; she then joined a start-up, eventually acquired by Apple, that pioneered self-driving cars. The move felt like an extension of her work in healthcare. Considering the number of people hurt in car crashes caused by human error, she said, “it’s almost like preventative surgery.”

She never stopped tinkering. It bugged her that cutting-edge technology was out of reach of so many. When she was a graduate student at Johns Hopkins University, where she worked with million-dollar telesurgery equipment by day, she spent late nights hacking the Guitar Hero game so that it could be played by amputees, with an EKG cuff transmitting neurological signals. She made open-source tutorials available to the public and founded a company to empower DIY enthusiasts.

“Robotics is a hodgepodge of all the engineering majors where you piece things together and build this cool box of skills,” she said. “You have to do everything yourself. I’d pop on the mechanical engineer’s hat and cut PVC pipe. I took arc welding lessons. I learned to solder. For self-driving cars we bought a car and went to town slicing it up and taking out and breaking things.”

That hands-on aspect of Reiley’s work is one point of contact with the musicians she now works alongside. (She took years of piano lessons but insists she lacked talent.) She also sees parallels between musicians and surgeons, in the value both professions place on precision, the technicality of the work, and a general obsessiveness around perfection.

But bringing AI into the realm of creativity is a new challenge. “It isn’t like surgery, where there is an end goal and a way to grade it,” she said. “There is no gold standard. Creativity is so abstract, and the process is almost more important than the outcome itself.”

At the Symphony, she is interested in ways to use technology behind the scenes, in audience engagement, and in filling the gap in music education created by slashed school budgets—a concern that is dear to Salonen.

Reiley is also keen to capture the special quality of a single live performance, using AI to translate the variables from a given concert into tangible artistic form. “In grad school we had auditory input robots that would generate art, whether calligraphy or spray painting,” she said. “I’m interested in something that would be representative of a single performance and moment in time.”

As for composition, Reiley doesn’t see AI muscling its way into the concert hall anytime soon. She sees room for a tiered coexistence, with AI churning out music for social media content or other utilitarian needs, while creative musicians might use it to generate the stems of ideas, or as a virtual collaborator to improvise with whenever inspiration strikes. “It can be a thought partner,” she said.

But training AI to think musically means teaching it human-made value systems around sound. And that forces the question of whose values are to be encoded. “Are you taking the current biases of the most elite old-school musicians, or do you try to invent a new future by taking a diverse group and giving them new sounds?” Reiley asked. “And then, how do you judge it? I guess I’m fighting human biases that are so embedded in the history of music.”
 
CORINNA DA FONSECA-WOLLHEIM is a writer and the founder of Beginner’s Ear.
