Whether you're producing music, podcasts, game sounds, or film sound effects, Digital Audio Principles provides the tips and techniques that will make your project a success. Author Dave Schroeder explains the basics of digital audio production techniques and covers the essential hardware and software. He also discusses sound theory, frequency response, the range of human hearing, and dynamic range.
So what is sound? Sound is essentially vibration that travels through the air: it's the molecules around us moving. Something creates a disturbance in air pressure, that disturbance produces patterns of vibration in the air, and those patterns of shifting molecules are what we perceive as different sounds. They travel outward as sound waves from the initial sound source, where the disturbance first occurs. You can think of sound as having three stages in its life.
There is the production stage, where the sound is actually created for the first time: some sort of impact or action creates a vibration, or movement. You can think of this as hitting a snare drum head or a cowbell, or plucking a guitar string. This creates a vibration, and it starts to move the air pressure around it. The second stage is propagation, the point where the sound actually travels to us: the disturbance in the air pressure travels outward via sound waves. As these sound waves travel out, they run into things.
Sometimes they hit things like walls and bounce and are redirected, and other times they hit things that are sensitive to the vibrations, like our ears or microphones. This brings us to the third stage, which is when the sound waves reach something that's sensitive to these changes in air pressure, better known as a receiver. That receiver can take those changes in air pressure and convert them into something else. As humans, we pick up these changes in air pressure and perceive them as sound. A classic analogy for how sound travels is throwing a rock into a still pond: little ripples spread out in all directions and eventually hit the shore and everything else in their path.
You can think of that rock as the production stage, the ripples spreading out from the center as the propagation stage, and whatever those ripples actually hit as the third stage, the perception stage. In the case of hearing, the waves hit our ears; in the case of recording, they hit a microphone. Each disturbance is different and generates a unique pattern of variations in air pressure, and this is what gives a sound its character; it's how we tell a snare drum from a human voice.
They have different patterns. There are two characteristics of waveforms that affect how we hear a sound: amplitude and frequency. Let's take a look at amplitude. This is the relative strength of a sound, or how much of a change in air pressure occurs. In our visual we can see some waveforms with less amplitude and some with more amplitude, and we perceive these differences as quieter and louder sounds. Loudness is the term we use to describe how humans perceive amplitude.
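To make that concrete, here is a minimal sketch, assuming Python with NumPy (my own illustration, not part of the course): two sine waves at the same frequency but with different amplitudes. The wave with the larger scaling factor swings further from zero, which we would perceive as a louder sound.

```python
import numpy as np

sample_rate = 44100  # samples per second, the standard CD-quality rate
duration = 1.0       # seconds
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)

frequency = 440.0    # hertz (the pitch A4)

# Same frequency, different amplitudes: the 0.8 wave displaces the
# "air pressure" (sample values) four times as much as the 0.2 wave,
# so we would perceive it as louder.
quiet = 0.2 * np.sin(2 * np.pi * frequency * t)
loud = 0.8 * np.sin(2 * np.pi * frequency * t)

print("peak amplitude (quiet):", quiet.max())  # ~0.2
print("peak amplitude (loud): ", loud.max())   # ~0.8
```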
Another important unit of measurement when we are talking about loudness is the decibel, and it has two different applications. First, there is the decibel SPL, or Sound Pressure Level, which is what we use to measure the strength of a sound in the air. We have a little list here of some of the different SPL levels, and you can see how they correspond to different things, with 0 dB representing the threshold of human hearing, or how loud something has to be for us to hear it at all. 20 dB is whispering, 60 dB is conversation, et cetera. At 130 dB we reach the threshold of pain.
If you are at a concert and you have to leave because it hurts too much, it's probably around 130 dB. And if you get up to 194 dB, which is really, really loud, we are talking about shock waves; above that level you are into things like sonic booms, which you don't want to be too close to when they happen. It's important to note that decibels are logarithmic, which means the volume changes they represent aren't weighted on a one-to-one basis. Every time a sound's power doubles, its level rises by about 3 dB, but it takes an increase of roughly 10 dB before we perceive a sound as about twice as loud.
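To see that logarithmic scale in action, here is a minimal Python sketch (my own illustration, not from the course) that converts a sound pressure in pascals to dB SPL using the standard reference pressure of 20 micropascals, and shows why doubling power adds about 3 dB:

```python
import math

P_REF = 20e-6  # reference pressure in pascals: 0 dB SPL, the threshold of hearing

def db_spl(pressure_pa: float) -> float:
    """Convert a sound pressure in pascals to decibels SPL."""
    return 20 * math.log10(pressure_pa / P_REF)

print(db_spl(20e-6))  # 0.0  -> threshold of hearing
print(db_spl(0.02))   # 60.0 -> ordinary conversation
print(db_spl(63.2))   # ~130 -> threshold of pain

# The scale is logarithmic: doubling a sound's power adds about 3 dB,
# and doubling its pressure adds about 6 dB.
print(10 * math.log10(2))  # ~3.01 dB (power doubled)
print(20 * math.log10(2))  # ~6.02 dB (pressure doubled)
```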
The other application is in audio itself: we also use the decibel to measure the loudness of a signal. In that context it refers to the signal level, or volume, and you will find that when we are working with digital audio, we talk about turning a sound up a few dB or down a few dB.
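As a final sketch (again my own Python illustration, assuming NumPy), turning a sound up or down a few dB in digital audio amounts to multiplying its samples by a linear gain factor of 10^(dB/20):

```python
import numpy as np

def apply_gain_db(samples: np.ndarray, gain_db: float) -> np.ndarray:
    """Scale an audio signal up or down by the given number of decibels."""
    gain = 10 ** (gain_db / 20)  # convert decibels to a linear amplitude factor
    return samples * gain

# A 440 Hz test tone at half scale, one second at 44.1 kHz.
t = np.linspace(0, 1, 44100, endpoint=False)
signal = 0.5 * np.sin(2 * np.pi * 440 * t)

louder = apply_gain_db(signal, +6.0)   # +6 dB roughly doubles the amplitude
quieter = apply_gain_db(signal, -6.0)  # -6 dB roughly halves it

print(louder.max() / signal.max())   # ~2.0
print(quieter.max() / signal.max())  # ~0.5
```

In the next movie, we will take a look at hertz and frequency response.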