- In addition to recording audio, most DAWs offer the ability to record, edit, and play back MIDI, which is fundamentally different from digital audio and has its own unique strengths. MIDI stands for Musical Instrument Digital Interface, and it was created by a number of early synthesizer manufacturers in the 1980s. It's an electrical and digital protocol, a set of rules, for how different pieces of equipment can talk to each other. One part of the protocol defines the language, the vocabulary and grammar of communication, and another part defines the medium, the actual wires, plugs, and electrical signals that carry that language.
In its early days, MIDI was transmitted over cables with five-pin connectors, like this. Those cables are still around, but nowadays most MIDI is transmitted over USB or entirely inside the computer between pieces of software. Despite the change in medium, though, MIDI still uses the same digital language it always has, which was built to be powerful and flexible.
The MIDI language is made up of numbers that stand for messages like "note on", "note off", "pitch bend", and many other things. Let's hear what that sounds like. (tone) (tone slides up and down) Even though you heard sound while I was playing the keyboard, the MIDI itself didn't actually make any sound at all. Instead, MIDI is more like written sheet music. Sheet music doesn't make any sound, it contains instructions on how to make sound.
That's exactly what MIDI is, instructions on how to make sound. For example, the "note on" MIDI command tells a MIDI instrument when to play a note, which note it is, and how loud to play it. The "note off" command tells the instrument when to stop playing that note. Now, a page of sheet music can sound good or bad, depending both on what it says on the page and on who's playing it. In the same way, MIDI can sound good or bad, depending on both what the instructions say and on which synthesizer is following those instructions.
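To make that concrete, here's a small sketch (not from the course) of how "note on" and "note off" are actually laid out as bytes, following the MIDI 1.0 specification. Each message is three bytes: a status byte combining the command with a channel, a note number from 0 to 127, and a velocity (loudness) from 0 to 127.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a Note On message. 0x90 is the Note On status nibble."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel: int, note: int) -> bytes:
    """Build a Note Off message. 0x80 is the Note Off status nibble."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (note number 60) played fairly hard on channel 1 (index 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # -> "903c64"
```

Note that nothing here is audio: these three bytes only describe the gesture, and a synthesizer receiving them decides what it actually sounds like.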
A collection of MIDI data, like a song for example, is called a MIDI sequence. In other words, it's a list of numbers that stand for commands in the MIDI language saying what should happen, and timestamps saying when it should happen. The software or hardware that manages these lists is called a MIDI sequencer. Whenever you work with notes on a grid, like this, you're manipulating MIDI data. So where does the MIDI data come from? Some people use a mouse to click in the notes, sometimes programmers write software to compose MIDI, but the most well-known way to create MIDI data is for a human musician to play an electronic instrument that doesn't record sound but instead detects the performance itself.
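The "list of commands plus timestamps" idea can be sketched in a few lines. This is an illustrative model only; real sequencers store the data far more compactly, and the names here are hypothetical:

```python
# A MIDI sequence, conceptually: (timestamp, message) pairs sorted by time.
sequence = [
    (0.0, ("note_on",  60, 100)),  # at t=0s, start middle C at velocity 100
    (0.5, ("note_off", 60, 0)),    # half a second later, release it
    (0.5, ("note_on",  64, 90)),   # start the E above middle C
    (1.0, ("note_off", 64, 0)),
]

def events_between(seq, start, end):
    """Return the events a sequencer would dispatch in [start, end)."""
    return [(t, msg) for t, msg in seq if start <= t < end]

print(events_between(sequence, 0.0, 0.5))  # just the first note_on
```

A sequencer's job at playback time is essentially to walk such a list and send each message out at its timestamp.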
This instrument is called a MIDI controller. MIDI controllers come in different forms, but the most common is a piano-style keyboard. Some MIDI controller keyboards are also synthesizers, which produce actual sound, but a non-synthesizer MIDI controller, like this one, makes no sound by itself. All it does is detect which keys are played and how hard, and produce the corresponding MIDI language that describes those actions.
As you can see, whenever I play a note on this controller, the MIDI data is sent to the computer. Here's the "note on" command from the note I'm holding down. When I release the note, the keyboard sends a "note off" command. Some keyboards have other controls, like knobs, faders, or wheels. This one has a pitch bend wheel. As you can see, as I move the wheel, the controller sends the computer updates on where the wheel is.
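Those wheel updates are themselves ordinary three-byte messages. Here's a hedged sketch, following the MIDI 1.0 specification, of decoding one: pitch bend carries a 14-bit wheel position split across two 7-bit data bytes (least significant byte first), with 8192 meaning the wheel is at rest.

```python
def decode_pitch_bend(msg: bytes) -> int:
    """Return the 14-bit wheel position from a pitch-bend message."""
    status, lsb, msb = msg
    assert status & 0xF0 == 0xE0, "not a pitch-bend message"
    return (msb << 7) | lsb  # 0..16383, with 8192 = wheel centered

# Wheel at rest: LSB 0x00, MSB 0x40, so 0x40 << 7 = 8192
print(decode_pitch_bend(bytes([0xE0, 0x00, 0x40])))  # -> 8192
```

As you move the wheel, the controller streams a rapid series of these messages, each reporting the current position.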
These control messages can also be edited after they've been recorded. Other knobs and faders on MIDI controllers can be assigned to almost any parameter within your DAW. For example, you could assign a knob to a filter that changes how a synthesizer sounds, so you can add expression while recording a part. (fluctuating note) It's important to remember that none of those controls actually make sound. They only provide data about their positions in the MIDI language.
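The knob and fader data takes the form of Control Change (CC) messages. This sketch follows the MIDI 1.0 specification's byte layout; the filter-cutoff assignment is a common convention but, like everything here, is illustrative rather than from the course:

```python
# A Control Change message is three bytes: status 0xB0 | channel,
# a controller number (which knob), and a 0-127 value (its position).
FILTER_CUTOFF_CC = 74  # CC 74 is commonly mapped to filter brightness

def control_change(channel: int, controller: int, value: int) -> bytes:
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# Sweep the knob from closed to open, as if turning it while recording:
for value in (0, 32, 64, 96, 127):
    print(control_change(0, FILTER_CUTOFF_CC, value).hex())
```

Note that the message only says "controller 74 is now at value 96"; whether that opens a filter, dims a light, or moves a mixer fader is entirely up to the receiving device or DAW mapping.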
How this MIDI data is interpreted is up to whatever is doing the interpretation. Commonly, MIDI data is used to make sound via a synthesizer, sampler, or virtual instrument, but MIDI can also be used to control lighting systems, mixers, and other devices. While keyboards are the most prevalent, there are many different types of MIDI controllers, such as electronic drums that sense how hard rubber or mesh pads are hit with drumsticks, wind controllers that register flute or saxophone fingerings and even sense your breath, sending MIDI instructions as if the airflow were a knob or a fader, and guitar controllers that use a six-way split pickup to sense which string is played and what note it's playing.
With MIDI, you can play a part on a controller, then completely change the instrument sound later without rerecording it, maybe changing a grand piano part to a string pad at the press of a button. You can also edit the notes that the instruments play without worrying about those edits sounding processed or artificial, as they might if you made the same edits to a digital audio file. This separation between how the performance is captured and what instrument plays it back allows for incredible flexibility when writing music with MIDI.
The course starts with explanations of what sound really is and how we hear it, including discussions of frequency, amplitude, phase, and psychoacoustics. Matt explores the analog audio signal path, explaining connections, gain staging, and metering. Next, he brings the audio signal into the digital domain, discussing analog-to-digital conversion, digital gain staging, file formats and compression, and dither.
Then the course digs into digital audio workstations (DAWs), explaining the concepts and misconceptions involved in digital recording systems. Matt describes how memory, CPU speed, and storage affect your DAW's performance, as well as how to manage computer resources and understand the plethora of file formats associated with digital recording. He follows with an overview of MIDI: how to generate, store, process, and communicate MIDI data. He wraps up with the audio processors that are often used for mixing in a DAW—including EQ, compressors, reverb, delay, and many others.
- What is sound?
- The three domains of sound: acoustic, analog, and digital
- The analog vs. digital signal paths
- Converting analog audio to digital
- Digital formats and data compression
- Understanding the five types of DAWs
- Recording performances with MIDI
- Mixing and processing audio with EQ, compression, and other effects