A performance of a piece of music is strongly characterized by the
musician's personal interpretation of the given score, especially in
terms of timing and note intensities. The automated analysis of such
artistic idiosyncrasies requires accurate annotations, which in
practice are often added manually to the audio recording. This manual
process, however, is prohibitively time-consuming for large data
collections.
We present a fully automatic approach for the extraction of temporal
information from a music recording. This information is given in the
form of a tempo curve that reveals the relative tempo difference over time
between a reference representation of the piece and an actual
performance. An evaluation of our approach shows that it not only
captures the overall tempo flow of a piece but also reveals finer
expressive tempo nuances for certain classes of music.
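
The abstract does not spell out how the tempo curve is computed. As a
rough illustration of the underlying idea, the following sketch derives
a tempo curve from the local slope of a warping path obtained by
dynamic time warping (DTW) between chroma representations of a
reference rendition and a performance. The use of librosa, the window
size, and all parameter values are assumptions for illustration, not
the authors' actual pipeline.

```python
import numpy as np
import librosa

def tempo_curve(ref_y, perf_y, sr=22050, hop=512, win=20):
    """Sketch: estimate the local relative tempo of a performance
    against a reference rendition of the same piece (assumed method,
    not necessarily the paper's)."""
    # Chroma features as a mid-level representation of both recordings
    ref = librosa.feature.chroma_stft(y=ref_y, sr=sr, hop_length=hop)
    perf = librosa.feature.chroma_stft(y=perf_y, sr=sr, hop_length=hop)

    # Align reference and performance with dynamic time warping
    _, wp = librosa.sequence.dtw(X=ref, Y=perf)
    wp = wp[::-1]  # librosa returns the path end-to-start; reverse it
    ref_f = wp[:, 0].astype(float)   # frame indices in the reference
    perf_f = wp[:, 1].astype(float)  # frame indices in the performance

    # Local slope of the warping path over a window of path steps:
    # values > 1 mean the performance is faster than the reference
    d_ref = ref_f[win:] - ref_f[:-win]
    d_perf = np.maximum(perf_f[win:] - perf_f[:-win], 1e-9)
    rel_tempo = d_ref / d_perf

    # Time axis in seconds of the performance, for plotting the curve
    t = perf_f[:-win] * hop / sr
    return t, rel_tempo
```

Under these assumptions, smoothing the path slope over a window trades
temporal resolution for robustness: a short window exposes fine
expressive nuances but amplifies alignment noise, while a long window
recovers only the overall tempo flow.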