So the story goes that one fall day, two engineers at Bell Labs, George Smith and Willard Boyle, spent about an hour sketching out an idea for a new type of semiconductor that could be used as computer memory, you know, as one does on a nice fall day. Anyway, they thought that the semiconductor could also be used to create a video camera that didn't require vacuum tubes. In that hour, the two men created the plan for the charge-coupled device, or CCD chip.
Now, we tend to think of digital cameras as a fairly new technology, but that fall day that I am talking about was in October of 1969. Within a year, Bell Labs had created a video camera using Smith and Boyle's new semiconductor. Their idea was to create a very simple device that could be used in a video telephone, but they soon created a camera that was good enough for broadcast TV work. It wasn't until the late 1990s, though, that the quality from these image sensors got good enough for still photography work. A still photo requires far more pixels than a standard-definition video image.
Smith and Boyle's design exploited something called the photoelectric effect, which is a property that some metals have. If you put an electrical charge on these types of metals, they will release some of that charge when they're exposed to light. As more light hits the metal, more charge gets released. After the light exposure, you can measure the charge that's left on the metal and know how much light struck the surface during the exposure. This is what the sensor in your camera does. There is a little piece of metal for every pixel in the resulting image.
Each piece of metal is called a photo site and after you take your shot, the camera measures the voltage at each photo site to determine the overall light levels. Now these are very, very weak voltages that we are talking about, so before the image data from the chip can be interpreted and processed, it needs to be amplified. Once it's boosted up to a reasonable level, the data can be processed into a final color image. Now when you increase the ISO on your camera, all you're doing is turning up that level of amplification. And just as turning up the volume knob or amplification on your stereo lets you hear quieter sounds in the music that you are listening to, turning up the ISO or amplification on your image sensor lets you see the lower light levels.
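To make that amplification idea concrete, here is a minimal sketch (a toy model with made-up numbers, not how actual camera firmware works): the weak photosite voltages are scaled by a gain proportional to the ISO setting before any further processing.

```python
# Toy model of ISO as amplification. The voltages and the base ISO
# of 100 are illustrative assumptions, not real camera values.

def amplify(photosite_voltages, iso, base_iso=100):
    """Scale raw photosite voltages by a gain proportional to ISO."""
    gain = iso / base_iso
    return [v * gain for v in photosite_voltages]

# Dim-scene readings (arbitrary units) read off the sensor:
raw = [0.02, 0.05, 0.01, 0.04]

print(amplify(raw, 100))  # at base ISO the readings pass through unchanged
print(amplify(raw, 800))  # at ISO 800 every reading is boosted 8x
```

The point of the sketch is simply that ISO is applied after the exposure, as a uniform multiplier on whatever the sensor recorded.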
Now because the sensor becomes effectively more responsive, it doesn't require as long to gather a given amount of light. This means that as light levels drop, if you raise the ISO on your camera, you don't have to suffer slower shutter speeds. This can be critical for freezing motion or preventing handheld shake. Now there is a price to pay for this. As you turn up the volume on your stereo, you can hear more static and hiss in the music that you are listening to, because as you increase amplification, you are not just increasing the recorded music, you are also increasing all sorts of electrical noise that's caused by the circuitry in the hardware and cosmic rays flying by and other electrical fields in the area.
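The volume-knob analogy can be sketched numerically (again a toy model with assumed numbers, purely illustrative): each reading is the true light signal plus a little read noise from the electronics, and the ISO gain multiplies both, so the scatter grows right along with the signal.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Toy model with made-up numbers: each photosite reading is the true
# signal plus a small amount of read noise from the electronics.
# Raising ISO multiplies the whole thing, noise included.

def read_sensor(true_signal, iso, base_iso=100, read_noise=0.01):
    noise = random.gauss(0, read_noise)
    return (true_signal + noise) * (iso / base_iso)

dim_scene = 0.02  # weak true signal in a dark scene (arbitrary units)

low_iso = [read_sensor(dim_scene, 100) for _ in range(5)]
high_iso = [read_sensor(dim_scene, 1600) for _ in range(5)]

# The high-ISO readings are bright enough to use, but their scatter
# (the noise) has been boosted by the same 16x factor as the signal.
print(low_iso)
print(high_iso)
```

This is why raising ISO doesn't create a cleaner picture out of nothing: the signal-to-noise ratio was fixed at the moment of exposure, and the gain just makes everything, noise included, bigger.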
Same thing is true on your camera. As you increase ISO, you amplify the signal coming from the sensor, but you also amplify any noise that has found its way into the electronics. This noise appears in your image as speckly patterns. There are three kinds of noise that can appear in an image. There is luminance noise. This is simply changes in brightness from pixel to pixel, and this noise appears roughly akin to film grain. It can actually be attractive because it can lend texture and atmosphere to an image. There is chromatic noise, or chrominance noise.
This is a change in color from place to place in your image. Chromatic noise can appear as colored specks or even big colored splotches. It's a pretty ugly kind of noise and it looks more like a digital artifact than luminance noise does. Both of these types of noise will get worse as you increase ISO. Now there is a third type of noise that can develop as your shutter speeds get longer. With a longer shutter speed, the pixels on the sensor can get stuck on, and end up appearing as bright specks in your final image. This is referred to as stuck-pixel noise or long-exposure noise. As we'll see later, your camera might have built-in features for dealing with this.
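One common approach to stuck-pixel noise is dark-frame subtraction: take a second exposure of the same length with the shutter closed, so only the stuck pixels register, then subtract that frame from the shot. A minimal sketch (toy one-dimensional "images" with made-up values; real cameras that offer this work on the raw sensor data):

```python
# Sketch of dark-frame subtraction. The pixel values here are
# invented for illustration, not taken from any real camera.

def subtract_dark_frame(exposure, dark_frame):
    """Remove stuck/hot pixel values recorded in a shutter-closed exposure."""
    return [max(p - d, 0) for p, d in zip(exposure, dark_frame)]

# A long exposure of a dark scene: two pixels are "stuck on" (0.9, 0.8).
shot = [0.05, 0.9, 0.04, 0.8, 0.06]

# A second exposure of equal length with the shutter closed records
# mostly the same stuck pixels.
dark = [0.00, 0.88, 0.00, 0.79, 0.01]

print(subtract_dark_frame(shot, dark))  # the bright specks are mostly gone
```

The trade-off is time: the dark frame takes as long as the original exposure, which is why cameras often make this kind of feature optional.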
Noise is difficult to remove from a final image and when you do employ a noise-reduction process, you usually suffer a sharpness penalty. So as a low-light shooter--that is, as somebody who shoots at high ISOs with long shutter speeds--noise will be a major concern for you, and we'll be looking in detail at how to factor noise concerns into your process when you're working in low light.