

3.6 MIDI and MIDI Files

MIDI (Musical Instrument Digital Interface) was designed in the early 1980s as a means of controlling music synthesizers and other electronic keyboards. The original MIDI 1.0 specification (MIDI 1985 [1983]: 114-126) contained definitions for both a hardware interface and a data format. The data format consists of MIDI messages that are transmitted between instruments or other devices equipped with MIDI hardware interfaces. Later versions of the specification have included extensions to the data format and a file format for storing MIDI data.

Two central MIDI messages for controlling musical instruments are called Note On and Note Off. Both messages contain three parameters: a MIDI channel, a key number, and a key-stroke velocity value. The key number, an integer between 0 and 127, specifies a key on a chromatic piano keyboard; MIDI thus allows a keyboard of at most 128 keys. Key number 60 is specified as “middle C”. No means of explicit definition of pitch was given in the 1983 specification. In later revisions, a way of defining different tuning systems was included.
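Although the 1983 specification thus leaves the acoustical meaning of a key number open, common practice renders key numbers in twelve-tone equal temperament with key 69 (the A above middle C) tuned to 440 Hz. A minimal C sketch of that conventional mapping follows; the tuning is an assumption of common practice, not part of the original specification:

#include <math.h>
#include <stdio.h>

/* Conventional rendering of a MIDI key number as a frequency in hertz,
 * assuming twelve-tone equal temperament with key 69 = 440 Hz. */
double key_to_frequency(int key_number)
{
    return 440.0 * pow(2.0, (key_number - 69) / 12.0);
}

int main(void)
{
    printf("Key 60 (middle C): %.2f Hz\n", key_to_frequency(60)); /* ca. 261.63 */
    return 0;
}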

A MIDI Note On event consists of three bytes: one Status Byte and two Data Bytes. As an example, let us say a Note On event contains the decimal values 144, 60, and 64. There, the first value (144; i.e., the status byte) specifies the MIDI message type and, in the case of a Note On event, also the MIDI channel. Status byte values of 144 through 159 are reserved for Note On on channels 1 through 16, respectively; a MIDI channel, ranging from 1 to 16, is thus encoded in the status byte. In the example message, the data byte values 60 and 64 specify the key number and velocity, respectively.

An example Note Off message contains the decimal values 128, 60, and 64. There, 128 is a status byte specifying a Note Off for MIDI channel 1. (Status bytes 128 through 143 are reserved for Note Off on channels 1 through 16, respectively.) The values 60 and 64 specify, respectively, the key number and a Note Off velocity (the speed at which the key is released).
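As a sketch of how these byte layouts fit together, the following C fragment assembles the example messages above. The function name and calling convention are illustrative assumptions; only the byte values come from the specification:

#include <stdio.h>

/* Assemble a three-byte Note On or Note Off message. Status bytes are
 * 144-159 (Note On) or 128-143 (Note Off), with the channel (1-16)
 * encoded in the low four bits; data bytes carry 7 bits each. */
void make_note_message(unsigned char msg[3], int note_on,
                       int channel, int key, int velocity)
{
    msg[0] = (unsigned char)((note_on ? 144 : 128) + (channel - 1));
    msg[1] = (unsigned char)(key & 0x7F);
    msg[2] = (unsigned char)(velocity & 0x7F);
}

int main(void)
{
    unsigned char on[3], off[3];
    make_note_message(on, 1, 1, 60, 64);   /* yields 144, 60, 64 */
    make_note_message(off, 0, 1, 60, 64);  /* yields 128, 60, 64 */
    printf("Note On:  %d %d %d\n", on[0], on[1], on[2]);
    printf("Note Off: %d %d %d\n", off[0], off[1], off[2]);
    return 0;
}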

MIDI was originally a real-time system. That is, MIDI messages were intended to be transmitted and performed instantly. In later revisions of the specification, a file format, Standard MIDI Files, was included for attaching timing information to events. Still, the MIDI communication protocol itself remains mostly real-time.

Besides event timing, MIDI Files enable the storage of logical information that is not part of the MIDI protocol itself, such as time signature and lyrics. Still, as pointed out by Hewlett and Selfridge-Field (1997: 68) as well as Haken and Lippold (1993: 43), MIDI and MIDI Files offer insufficient detail to be used as a music notation representation. Extensions have been proposed for including notation information in MIDI (e.g., Nordli 1997; Cooper et al. 1997). These extensions have, however, not been included in the official MIDI specification.

MIDI is a terse representation. Terseness is an important requirement because events are transmitted between devices in real time, and a receiving instrument should react without noticeable delay to the messages sent by a MIDI keyboard or other controller.

In the MIDI protocol, all data is encoded in groups of 8-bit bytes, of which one bit indicates whether the byte is a status byte or a data byte. The remaining 7 bits are left for storing the actual data. This leads to difficulties when large data entities have to be folded into 7-bit chunks. MIDI does not even specify a generic way of encoding and decoding typical 16-bit, 32-bit, or 64-bit computer data types. Therefore, each application or extension of MIDI must specify its own way of solving this common problem.
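For illustration, one possible application-specific scheme for folding a 32-bit value into 7-bit chunks is sketched below in C. The chunk order (least significant first) and the five-byte layout are assumptions of this sketch; MIDI itself prescribes no such scheme:

#include <stdio.h>

/* Fold a 32-bit value into five 7-bit chunks, least significant first,
 * so that each chunk is a valid MIDI data byte (high bit clear). */
void pack_u32(unsigned long value, unsigned char chunks[5])
{
    for (int i = 0; i < 5; i++) {
        chunks[i] = (unsigned char)(value & 0x7F);
        value >>= 7;
    }
}

/* Reassemble the original value from the five chunks. */
unsigned long unpack_u32(const unsigned char chunks[5])
{
    unsigned long value = 0;
    for (int i = 4; i >= 0; i--)
        value = (value << 7) | chunks[i];
    return value;
}

Standard MIDI Files solve a related problem for delta times with variable-length quantities, but that encoding applies to the file format only, not to the wire protocol.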

3.7 MusicKit

MusicKit (Boynton & Jaffe 1991) is an object-oriented software system originally distributed as part of the system software of NeXT computer workstations. MusicKit contains an application programming interface (API) with classes for sound synthesis as well as for processing and storing musical events. MusicKit also included a text-based scoring language, which is a mixture of techniques used in MIDI and synthesis languages. In the NeXT system software, a separate class system, called SoundKit, was provided for audio signal storage and editing.

Syntactically, the MusicKit score language resembles a modern statement- and expression-based programming language, such as C or Pascal. This is a departure from the record/field-type syntax of Music V and many other synthesis languages. Semantically, MusicKit borrows features from both synthesis languages and MIDI.

In MusicKit, a Note is semantically equivalent to a sound event. Onset times and durations of sound events are determined by “noteOn” and “noteOff” events, or alternatively by “noteDuration” events, which are encoded in a timed event stream. Pitch may be specified either as a MIDI key number or as a fundamental frequency value. The score may be used to control both external MIDI instruments and the computer’s internal sound synthesis engine.

Below, an example of a MusicKit score is presented. The first line contains a comment, preceded by a “//” delimiter. The rest of the score consists of statements. A statement ends with a semicolon character (;) and may be written on one or more lines. The first statement in the sample score specifies a performance tempo in beats per minute. The next statement defines a named “part”, p1. The part is given an instrument in the next statement: the expression synthPatch:"midi" specifies that a MIDI instrument is used to perform the part.

The BEGIN statement marks the start of a stream of timed events, which ends with an END statement. The event stream contains a noteOn statement and a noteOff statement. The noteOn statement is preceded by a timing statement “t 0” that sets the time of the performance to 0 beats. This means that all succeeding events until the next timing statement are performed at beat 0. The noteOn event belongs to the previously defined part p1. In the parentheses following the noteOn expression is a numeric identifier for the note. Each note may be given a unique identifier so that the note may be referred to in other events. In the noteOn event, pitch is specified with the expression keyNum:60 as a MIDI note number. Velocity is specified as a MIDI value with the expression velocity:64.

With the timing statement “t +1”, time is advanced by one beat (i.e., one second, as defined by the preceding tempo setting of 60 beats per minute). Next, a noteOff event for the note ID 1 is given. Unlike in MIDI, a key number does not have to be specified, since the note already has a unique identifier.

// A sample MusicKit score
info tempo:60;

part p1;

p1 synthPatch:"midi";

BEGIN;

t 0;

p1 (noteOn 1) keyNum:60 velocity:64;

t +1;

p1 (noteOff 1);

END;

A MusicKit note is an extension of the MIDI Note concept. As one important extension, MusicKit supports uniquely identifiable notes. This enables one to change the parameters of a note between its noteOn and noteOff events; for this purpose, MusicKit includes a specialized event type called noteUpdate. MusicKit allows the specification of pitch or related information in alternative ways. One way, a key number, is used in the above example. Other alternatives are fundamental frequency and key name. A key name allows one to specify the name of a note (c, d, e, etc.) and an octave range. For example, the key name c4 is equivalent to MIDI key 60. Unlike MIDI key numbers, key names enable explicit definition of enharmonic variants. For example, the key name cs4 (c sharp, 4th octave) is logically different from df4 (d flat, 4th octave). When used to control a MIDI instrument, both are interpreted as key number 61, but if the representation is translated into music notation, the enharmonic distinction can be preserved.
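A hedged sketch, in C, of how a translator might map such key names to MIDI key numbers while preserving the spelled accidental for notation purposes; the parsing details and the struct are assumptions, and only the name format and the mapping c4 = 60, cs4 = df4 = 61 come from the text:

#include <stdio.h>

/* A parsed key name: the MIDI key number plus the accidental as spelled,
 * so that cs4 and df4 both map to key 61 yet remain distinguishable. */
typedef struct {
    int midi_key;    /* 0-127 */
    char accidental; /* 'n' (natural), 's' (sharp), or 'f' (flat) */
} SpelledPitch;

int parse_key_name(const char *name, SpelledPitch *out)
{
    static const int base[7] = { 9, 11, 0, 2, 4, 5, 7 }; /* a..g as semitones from c */
    const char *p = name;
    if (*p < 'a' || *p > 'g') return 0;
    int semitone = base[*p++ - 'a'];
    out->accidental = 'n';
    if (*p == 's') { semitone += 1; out->accidental = 's'; p++; }
    else if (*p == 'f') { semitone -= 1; out->accidental = 'f'; p++; }
    if (*p < '0' || *p > '9') return 0;
    out->midi_key = semitone + 12 * (*p - '0') + 12; /* so that c4 == 60 */
    return 1;
}

int main(void)
{
    SpelledPitch a, b;
    parse_key_name("cs4", &a);
    parse_key_name("df4", &b);
    printf("cs4 -> %d (%c), df4 -> %d (%c)\n",
           a.midi_key, a.accidental, b.midi_key, b.accidental);
    return 0;
}

When both names are sent to a MIDI instrument, only the shared key number 61 survives; the accidental field carries the information a notation translator would need.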

Still, a MusicKit score does not provide sufficient information to be used as a translation format or input language for music notation programs. Among the missing parameters are key and time signatures, lyrics, and stem direction.

MusicKit is more literal than MIDI and, on the other hand, less terse. MusicKit is also somewhat cryptic and idiomatic with respect to the synthesis algorithms available in the NeXT workstations. These features become apparent in scores that are more complex than the above example and that use the internal synthesis algorithms instead of, or in addition to, MIDI control of external instruments.