Analysis of the acoustics and playing strategies of turntable scratching
KTH, School of Computer Science and Communication (CSC), Media Technology and Interaction Design, MID. KTH, School of Computer Science and Communication (CSC), Speech, Music and Hearing, TMH, Music Acoustics (Sound and Music Computing). ORCID iD: 0000-0003-4259-484X
KTH, School of Computer Science and Communication (CSC), Speech, Music and Hearing, TMH, Music Acoustics.
KTH, School of Computer Science and Communication (CSC), Speech, Music and Hearing, TMH, Music Acoustics. ORCID iD: 0000-0002-3086-0322
2011 (English). In: Acta Acustica united with Acustica, ISSN 1610-1928, E-ISSN 1861-9959, Vol. 97, no. 2, pp. 303-314. Article in journal (Refereed). Published.
Abstract [en]

Scratching performed by a DJ (disk jockey) is a skillful style of playing the turntable with complex musical output. This study focuses on the description of some of the acoustical parameters and playing strategies of typical scratch improvisations, and on how these parameters are typically used for expressive performance. Three professional DJs were instructed to express different emotions through improvisations, and both audio and gestural data were recorded. Feature extraction and analysis of the recordings are based on a combination of audio and gestural data, instrument characteristics, and playing techniques. The acoustical and performance parameters extracted from the recordings give a first approximation of the functional ranges within which DJs normally play. Results from the analysis show that parameters which are important in other solo instrument performances, such as pitch, have less influence in scratching. Both differences and commonalities between the DJs' playing styles were found. The impact that the findings of this work may have on constructing models for scratch performances is discussed.

Place, publisher, year, edition, pages
S. Hirzel Verlag, 2011. Vol. 97, no. 2, pp. 303-314.
Keyword [en]
MUSIC PERFORMANCE; FREQUENCY; PERCEPTION; DURATION; PROSODY
National Category
Computer Science; Human Computer Interaction; Music; Psychology
Identifiers
URN: urn:nbn:se:kth:diva-11923
DOI: 10.3813/AAA.918410
ISI: 000288130700014
Scopus ID: 2-s2.0-79952119561
OAI: oai:DiVA.org:kth-11923
DiVA: diva2:290102
Note

QC20100729 (Updated from submitted to published 20110328). QC 20160115

Available from: 2010-01-26. Created: 2010-01-26. Last updated: 2017-12-12. Bibliographically approved.
In thesis
1. The acoustics and performance of DJ scratching: Analysis and modelling
2010 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This thesis focuses on the analysis and modeling of scratching, in other words, the DJ (disk jockey) practice of using the turntable as a musical instrument. There has been experimental use of turntables as musical instruments since their invention, but the use is now mainly ascribed to the musical genre hip-hop and the playing style known as scratching. Scratching has developed to become a skillful instrument-playing practice with complex musical output performed by DJs. The impact on popular music culture has been significant, and for many, the DJ set-up of turntables and a mixer is now a natural instrument choice for undertaking a creative music activity. Six papers are included in this thesis, where the first three approach the acoustics and performance of scratching, and the second three approach scratch modeling and the DJ interface. Additional studies included here expand on the scope of the papers.

For the acoustics and performance studies, DJs were recorded playing both demonstrations of standard performance techniques, and expressive performances on sensor-equipped instruments. Analysis of the data revealed that there are both differences and commonalities in playing strategies between musicians, and between expressive intentions. One characteristic feature of scratching is the range of standard playing techniques, but in performances it seems DJs vary the combination of playing techniques more than the rendering of these techniques. The third study describes some of the acoustic parameters of typical scratch improvisations and looks at which musical parameters are typically used for expressive performances. Extracted acoustic and performance parameters from the data show the functional ranges within which DJs normally play.

Unlike traditional musical instruments, the equipment used for scratching was not intended for creating music. The interface studies focus on traditional as well as new interfaces for DJs, where parameter mappings between input gestures and output signal are described. Standard performance techniques have been modeled in software called Skipproof, based on results from the first papers. Skipproof was used for testing controllers other than turntables, where complex DJ gestures could be manipulated using simplified control actions, enabling even non-experts to play expressively within the stylistic boundaries of DJ scratching. The last paper describes an experiment in using an existing hardware platform, the Reactable, to help design and prototype the interaction between different sound models and instrument interfaces, including scratching and Skipproof.

In addition to the included papers, studies were conducted of expressivity, description of the emotional contents of scratching, DJ playing activities, and the coupling between playing techniques and samples. The physical affordances of the turntable, mixer and samples, as well as genre conventions of hip-hop, are assumed to explain some of the findings that distinguish scratching from other instrumental sounds or practices.

Place, publisher, year, edition, pages
Stockholm: KTH, 2010. xii, 74 p.
Series
Trita-CSC-A, ISSN 1653-5723 ; 2010:01
National Category
Fluid Mechanics and Acoustics; Other Veterinary Science
Identifiers
URN: urn:nbn:se:kth:diva-11927
ISBN: 978-91-7415-541-9
Public defence
2010-02-12, Sal F2, Lindstedtsvägen 26, KTH, Stockholm, 10:00 (English)
Note
QC20100729. Available from: 2010-01-26. Created: 2010-01-26. Last updated: 2010-07-29. Bibliographically approved.
2. Interactive computer-aided expressive music performance: Analysis, control, modification and synthesis
2011 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This thesis describes the design and implementation process of two applications (PerMORFer and MoodifierLive) for the interactive manipulation of music performance. Such applications aim at closing the gap between the musicians, who play the music, and the listeners, who passively listen to it. The goal was to create computer programs that allow the user to actively control how the music is performed. This is achieved by modifying such parameters as tempo, dynamics, and articulation, much like a musician does when playing an instrument. An overview of similar systems and the problems related to their development is given in the first of the included papers.

Four requirements were defined for the applications: (1) to produce a natural, high-quality sound; (2) to allow for realistic modifications of the performance parameters; (3) to be easy to control, even for non-musicians; (4) to be portable. Although there are many similarities between PerMORFer and MoodifierLive, the two applications fulfill different requirements. The first two requirements were addressed in PerMORFer, with which the user can manipulate pre-recorded audio performances. The last two were addressed in MoodifierLive, a mobile phone application for gesture-based control of a MIDI score file. The tone-by-tone modifications in both applications are based on the KTH rule system for music performance. The included papers describe studies, methods, and algorithms used in the development of the two applications.

Audio recordings of real performances have been used in PerMORFer to achieve a natural sound. The tone-by-tone manipulations defined by the KTH rules first require an analysis of the original performance to separate the tones and estimate their parameters (IOI, duration, dynamics). Available methods were combined with novel solutions, such as an approach to the separation of two overlapping sinusoidal components. On the topic of performance analysis, ad-hoc algorithms were also developed to analyze DJ scratching recordings.

A particularly complex problem is the estimation of a tone's dynamic level. A study was conducted to identify the perceptual cues that listeners use to determine the dynamics of a tone. The results showed that timbre is as important as loudness. These findings were applied in a partly unsuccessful attempt to estimate dynamics from spectral features.

The manipulation of tempo is a relatively simple problem, as is that of articulation (i.e. legato-staccato), as long as the tones can be separated. The modification of dynamics, on the other hand, is more difficult, as was its estimation. Following the findings of the previously mentioned perceptual study, a method to modify both loudness and timbre using a database of spectral models was implemented.

MoodifierLive was used to experiment with performance control interfaces. In particular, the mobile phone's built-in accelerometer was used to track, analyze, and interpret the movements of the user. Expressive gestures were then mapped to corresponding expressive music performances. Evaluation showed that modes based on natural gestures were easier to use than those created with a top-down approach.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2011. 69 p.
Series
Trita-CSC-A, ISSN 1653-5723 ; 2011:12
National Category
Fluid Mechanics and Acoustics
Identifiers
URN: urn:nbn:se:kth:diva-34099
ISBN: 978-91-7501-031-1
Public defence
2011-06-15, F3, Lindstedtsvägen 26, KTH, Stockholm, 10:00 (English)
Note
QC 20110607. Available from: 2011-06-07. Created: 2011-05-25. Last updated: 2012-03-22. Bibliographically approved.

Open Access in DiVA

No full text

Search in DiVA

By author/editor: Hansen, Kjetil Falkenberg; Fabiani, Marco; Bresin, Roberto
By organisation: Media Technology and Interaction Design, MID; Music Acoustics
In the same journal: Acta Acustica united with Acustica