Bio

Chris Chafe is a composer, improviser, and cellist, developing much of his music alongside computer-based research. He is Director of Stanford University's Center for Computer Research in Music and Acoustics (CCRMA). At IRCAM (Paris) and The Banff Centre (Alberta), he pursued methods for digital synthesis, music performance, and real-time internet collaboration. CCRMA's SoundWIRE project involves live concertizing with musicians the world over, and its online collaboration software, including JackTrip, and its research into latency factors continue to evolve. An active performer both on the net and in person, he reaches audiences in dozens of countries, sometimes at novel venues; a simultaneous five-country concert was hosted at the United Nations in 2009. Chafe's works are available from Centaur Records and various online media. His gallery and museum music installations are into their second decade, with "musifications" resulting from collaborations with artists, scientists, and MDs. Recent works include Tomato Quintet for the transLife:media Festival at the National Art Museum of China, Phasor for contrabass, and Sun Shot, played by the horns of large ships in the port of St. John's, Newfoundland. Chafe premiered DiPietro's concerto Finale, for electric cello and orchestra, in 2012.

Academic Appointments


  • Professor, Music
  • Member, Bio-X

Administrative Appointments


  • Director, Center for Computer Research in Music and Acoustics (1996 - Present)

Publications

Journal Articles


  • Sound synthesis for a brain stethoscope JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA Chafe, C., Caceres, J., Iorga, M. 2013; 134 (5): 4053-?

    Abstract

    Exploratory auscultation of brain signals has been prototyped in a project involving neurologists, real-time EEG and techniques for computer-based sound synthesis. In a manner similar to using a stethoscope, the listener can manipulate the location being listened to. The sounds heard are sonifications of electrode signals. We present a method for exploring signals from arrays of sensors as sounds which are useful for distinguishing brain states. The approach maps brain wave signals to modulations characteristic of the human voice. Computer-synthesized voices "sing" the dynamics of wakefulness, sleep, seizures, and other states. The goal of the project is to create a recognizable inventory of such vocal "performances" and allow the user to probe source locations in the sensor array in real time.

    View details for DOI 10.1121/1.4830793

    View details for PubMedID 24181199
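
    The published mapping is not reproduced here, but the idea lends itself to a minimal sketch: the hypothetical sonify_eeg() below upsamples a single electrode trace to audio rate and lets it steer both the pitch and one vowel-like formant of a crude pulse-train "voice". The base pitch, sweep ranges, and filter settings are illustrative assumptions only.

        import numpy as np

        SR = 44100  # audio sample rate (Hz)

        def sonify_eeg(eeg, eeg_rate=256):
            """Render one EEG channel as a voice-like tone (toy mapping)."""
            n = int(len(eeg) / eeg_rate * SR)
            # upsample the EEG trace to audio rate as a control signal
            ctl = np.interp(np.linspace(0, len(eeg) - 1, n),
                            np.arange(len(eeg)), eeg)
            ctl = (ctl - ctl.mean()) / (np.abs(ctl).max() + 1e-12)  # ~[-1, 1]
            # pitch: 110 Hz base, bent up to one octave either way
            f0 = 110.0 * 2.0 ** ctl
            phase = 2 * np.pi * np.cumsum(f0) / SR
            source = 0.5 * np.sign(np.sin(phase))  # crude glottal pulse train
            # one moving "vowel" formant: a resonant 2-pole filter swept by ctl
            out = np.zeros(n)
            y1 = y2 = 0.0
            for i in range(n):
                fc = 500.0 + 750.0 * (ctl[i] + 1.0)   # sweep 500-2000 Hz
                w = 2 * np.pi * fc / SR
                a1, a2 = -2 * 0.97 * np.cos(w), 0.97 ** 2
                y = source[i] - a1 * y1 - a2 * y2
                out[i], y2, y1 = y, y1, y
            return out / (np.abs(out).max() + 1e-12)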

  • Internet rooms from internet audio JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA Chafe, C., Granzow, J. 2013; 133 (5): 3347-?

    Abstract

    Music rehearsal and concert performance at a distance over long-haul optical fiber is a reality because of expanding network capacity to support low-latency, uncompressed audio streaming. Multichannel sound exchanged across the globe in real time creates "rooms" for synchronous performance. Nearby connections work well and musicians feel like they are playing together in the same room. Larger, continental-size distances remain a challenge because of transmission delay and seemingly subtle but perceptually important cues that conflict with the qualities expected of natural rooms. Establishing plausible, room-like reverberation between the endpoints helps mitigate these difficulties and expands the distance across which remotely located musicians can perform together comfortably. The paper presents a working implementation for distributed reverberation and qualitative evaluations of reverberated versus non-reverberated conditions over the same long-haul connection.

    View details for DOI 10.1121/1.4805672

    View details for PubMedID 23655010
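
    The working implementation from the paper is not shown here, but the idea of a shared synthetic "room" can be illustrated with a textbook Schroeder reverberator that each endpoint would apply to the audio it receives; the delay times, gains, and wet/dry mix below are arbitrary assumptions.

        import numpy as np

        def schroeder_reverb(x, sr=48000, g=0.85):
            """Apply a classic Schroeder reverb (4 combs + 2 allpasses)."""
            def comb(sig, d):                 # y[n] = x[n] + g*y[n-d]
                y = sig.astype(float)
                for i in range(d, len(y)):
                    y[i] += g * y[i - d]
                return y
            def allpass(sig, d, k=0.7):       # y[n] = -k*x[n] + x[n-d] + k*y[n-d]
                y = np.zeros(len(sig))
                for i in range(len(sig)):
                    xd = sig[i - d] if i >= d else 0.0
                    yd = y[i - d] if i >= d else 0.0
                    y[i] = -k * sig[i] + xd + k * yd
                return y
            wet = sum(comb(x, int(sr * t))    # parallel combs build the tail
                      for t in (0.0297, 0.0371, 0.0411, 0.0437))
            wet = allpass(allpass(wet, int(sr * 0.0050)), int(sr * 0.0017))
            return 0.8 * x + 0.05 * wet       # mix dry signal with the "room"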

  • JackTrip/SoundWIRE Meets Server Farm COMPUTER MUSIC JOURNAL Caceres, J., Chafe, C. 2010; 34 (3): 29-34
  • JackTrip: Under the Hood of an Engine for Network Audio JOURNAL OF NEW MUSIC RESEARCH Caceres, J., Chafe, C. 2010; 39 (3): 183-187
  • Effect of temporal separation on synchronization in rhythmic performance PERCEPTION Chafe, C., Caceres, J., Gurevich, M. 2010; 39 (7): 982-992

    Abstract

    A variety of short time delays inserted between pairs of subjects were found to affect their ability to synchronize a musical task. The subjects performed a clapping rhythm together from separate sound-isolated rooms via headphones and without visual contact. One-way time delays between pairs were manipulated electronically in the range of 3 to 78 ms. We are interested in quantifying the envelope of time delay within which two individuals produce synchronous performances. The results indicate that there are distinct regimes of mutually coupled behavior, and that 'natural time delay'--delay within the narrow range associated with travel times across spatial arrangements of groups and ensembles--supports the most stable performance. Conditions outside of this envelope, with time delays both below and above it, create characteristic interaction dynamics in the mutually coupled actions of the duo. Trials at extremely short delays (corresponding to unnaturally close proximity) had a tendency to accelerate from anticipation. Synchronization lagged at longer delays (larger than usual physical distances) and produced an increasingly severe deceleration and then deterioration of performed rhythms. The study has implications for music collaboration over the Internet and suggests that stable rhythmic performance can be achieved by 'wired ensembles' across distances of thousands of kilometers.

    View details for DOI 10.1068/p6465

    View details for Web of Science ID 000281270900011

    View details for PubMedID 20842974
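
    The reported regimes can be caricatured with a toy phase-correction model (an assumption of this sketch, not the analysis used in the study): each simulated clapper nudges its next clap toward the partner's clap as heard after the one-way delay, with a small built-in tendency to anticipate. Delays below the anticipation bias then pull the tempo up, and longer delays drag it down.

        def duet(delay_ms, beats=30, ioi=500.0, alpha=0.3, antic=12.0):
            """Mean inter-onset interval after `beats` claps of a
            delay-coupled duo; alpha (correction gain) and antic
            (anticipation bias, ms) are hypothetical parameters."""
            a_t = b_t = 0.0          # times of each player's latest clap (ms)
            a_ioi = b_ioi = ioi      # current inter-onset intervals (ms)
            for _ in range(beats):
                # each player hears the other delay_ms late and aims
                # slightly ahead of that heard onset
                a_next = a_t + a_ioi + alpha * ((b_t + delay_ms - antic) - a_t)
                b_next = b_t + b_ioi + alpha * ((a_t + delay_ms - antic) - b_t)
                a_ioi, b_ioi = a_next - a_t, b_next - b_t
                a_t, b_t = a_next, b_next
            return (a_ioi + b_ioi) / 2

        for d in (3, 12, 25, 50, 78):
            print(f"{d:2d} ms one-way delay -> mean IOI {duet(d):6.1f} ms")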

  • Tapping into the Internet as an Acoustical/Musical Medium CONTEMPORARY MUSIC REVIEW Chafe, C. 2009; 28 (4-5): 413-420
  • Analysis of Flute Control Parameters: A Comparison Between a Novice and an Experienced Flautist ACTA ACUSTICA UNITED WITH ACUSTICA de la Cuadra, P., Fabre, B., Montgermont, N., Chafe, C. 2008; 94 (5): 740-749

    View details for DOI 10.3813/AAA.918091

    View details for Web of Science ID 000260966500012

  • Neural dynamics of event segmentation in music: Converging evidence for dissociable ventral and dorsal networks NEURON Sridharan, D., Levitin, D. J., Chafe, C. H., Berger, J., Menon, V. 2007; 55 (3): 521-532

    Abstract

    The real world presents our sensory systems with a continuous stream of undifferentiated information. Segmentation of this stream at event boundaries is necessary for object identification and feature extraction. Here, we investigate the neural dynamics of event segmentation in entire musical symphonies under natural listening conditions. We isolated time-dependent sequences of brain responses in a 10 s window surrounding transitions between movements of symphonic works. A strikingly right-lateralized network of brain regions showed peak response during the movement transitions when, paradoxically, there was no physical stimulus. Model-dependent and model-free analysis techniques provided converging evidence for activity in two distinct functional networks at the movement transition: a ventral fronto-temporal network associated with detecting salient events, followed in time by a dorsal fronto-parietal network associated with maintaining attention and updating working memory. Our study provides direct experimental evidence for dissociable and causally linked ventral and dorsal networks during event segmentation of ecologically valid auditory stimuli.

    View details for DOI 10.1016/j.neuron.2007.07.003

    View details for Web of Science ID 000248711000017

    View details for PubMedID 17678862
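
    The windowing step described above reduces, in its simplest form, to an event-locked average; the sketch below assumes a regularly sampled response and onset times in seconds, and stands in for (rather than reproduces) the paper's model-based and model-free analyses.

        import numpy as np

        def event_locked_average(ts, onsets_s, tr=2.0, half_window_s=5.0):
            """Average a time series in a +/- half_window_s window
            around each event onset (e.g., a movement transition)."""
            half = int(round(half_window_s / tr))    # half-width in samples
            segments = []
            for t in onsets_s:
                i = int(round(t / tr))               # onset index
                if i - half >= 0 and i + half < len(ts):
                    segments.append(ts[i - half : i + half + 1])
            return np.mean(segments, axis=0)         # length 2*half + 1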

  • Cyberinstruments via physical modeling synthesis: Compositional applications LEONARDO MUSIC JOURNAL Kojs, J., Serafin, S., Chafe, C. 2007; 17: 61-66
  • Oxygen flute: A computer music instrument that grows JOURNAL OF NEW MUSIC RESEARCH Chafe, C. 2005; 34 (3): 219-226
  • Physical model synthesis with application to Internet acoustics 2002 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS I-IV, PROCEEDINGS Chafe, C., Wilson, S., Walling, D. 2002: 4056-4059
  • Dream Machine 1990 COMPUTER MUSIC JOURNAL Chafe, C. 1991; 15 (4): 62-64
  • Toward an Intelligent Editor of Digital Audio: Recognition of Musical Constructs COMPUTER MUSIC JOURNAL Chafe, C., Montreynaud, B., Rush, L. 1982; 6 (1): 30-41
