
Why Camera Sensors Matter and How They Keep Improving


What is the most important aspect of a camera to consider when buying a new one? In this video, Engadget puts camera sensors in the spotlight, reviewing how they have improved and what role they play in today’s photographic equipment.

Camera brands regularly release new cameras, with each model improving on its predecessors. However, as Engadget video producer Chris Schodt points out in the company’s latest YouTube video, it may appear that camera sensors haven’t progressed as rapidly in recent years, even though resolution has increased. That is because cameras from well over a decade ago, such as the Canon EOS 5D released in 2005, were already able to produce high-quality images, and they continue to do so.

Camera sensors, in technical terms, can be described as a grid of photodiodes, which act as one-way valves for electrons. In CMOS sensors, the type widely used in today’s digital cameras, each pixel has additional circuitry built into it alongside the photodiode.

This on-pixel circuitry is what gives CMOS sensors their speed, since each pixel can be read out and reset quickly. In the past, it could also contribute to fixed-pattern noise, but improved manufacturing processes have largely eliminated this side effect in modern cameras.
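
To see why fixed-pattern noise behaves differently from ordinary random noise, consider a minimal simulation: the per-pixel offset stays constant from frame to frame, which is also what makes it correctable. All numbers below are illustrative assumptions rather than measurements from any real sensor, and the dark-frame subtraction at the end is one standard correction technique, not necessarily the one the video describes.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 4, 6  # a tiny hypothetical sensor, purely for illustration

# Fixed-pattern noise: a per-pixel offset baked into the readout
# circuitry, identical in every exposure.
fixed_pattern = rng.normal(0.0, 2.0, size=(H, W))

def read_frame(scene):
    """Simulate one exposure: scene signal + fixed offsets + random read noise."""
    read_noise = rng.normal(0.0, 2.0, size=(H, W))  # different every frame
    return scene + fixed_pattern + read_noise

scene = np.full((H, W), 100.0)  # a flat, evenly lit test scene
frames = np.stack([read_frame(scene) for _ in range(500)])

# Averaging many frames removes the random read noise but not the
# fixed pattern: the constant per-pixel offsets survive the average.
residual = frames.mean(axis=0) - scene
print("residual matches fixed pattern:", np.allclose(residual, fixed_pattern, atol=0.5))

# A dark frame (an exposure with no light) captures the same offsets,
# so subtracting it cancels the fixed-pattern component.
dark = np.stack([read_frame(np.zeros((H, W))) for _ in range(500)]).mean(axis=0)
print("max error after dark-frame subtraction:",
      np.abs(frames.mean(axis=0) - dark - scene).max())
```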

Schodt explains that noise control is crucial to a camera’s low-light performance and dynamic range, which is the span of light a sensor can capture between its maximum and minimum values. In a photograph, those extremes correspond to white, as when a pixel clips or is overexposed, and to black, respectively.

Clipped or overexposed pixels in an image
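
As a rough sketch of how dynamic range is quantified: it is often expressed in stops, the base-2 logarithm of the ratio between the largest signal a pixel can hold before clipping to white and the noise floor that swallows shadow detail. The figures below are made-up illustrative values, not the specifications of any actual camera.

```python
import math

# Hypothetical sensor figures, chosen only for illustration.
full_well_electrons = 51200.0  # largest charge a pixel holds before it clips to white
noise_floor_electrons = 3.2    # read-noise level below which shadow detail is lost

# One stop is a doubling of light, so dynamic range in stops is log2 of the ratio.
stops = math.log2(full_well_electrons / noise_floor_electrons)
print(f"dynamic range: {stops:.1f} stops")  # prints: dynamic range: 14.0 stops
```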

In an ideal scenario, a camera sensor would capture light, which arrives as photons, in a perfectly uniform way and reconstruct a clean image. In reality, photons hit the sensor at random, and that randomness shows up as noise.
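
This randomness, known as photon shot noise, follows Poisson statistics: the signal grows with the photon count while the noise grows only with its square root, so brighter exposures, or pixels that collect more light, come out cleaner. A minimal simulation, with arbitrary photon counts chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Photon arrival is a Poisson process: even under perfectly even
# illumination, each pixel collects a slightly different count.
for mean_photons in (10, 100, 10_000):
    counts = rng.poisson(mean_photons, size=100_000)  # one flat-lit "frame"
    snr = counts.mean() / counts.std()                # signal-to-noise ratio
    print(f"{mean_photons:>6} photons/pixel -> SNR ~ {snr:6.1f}"
          f"  (sqrt(N) = {np.sqrt(mean_photons):.1f})")
```

The signal-to-noise ratio climbs roughly as the square root of the photon count, which is why the larger-pixel approach described next helps.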

One way to deal with this is to build larger sensors with larger pixels, which collect more photons and so average out the randomness. That, however, comes with a large production cost and an equally large camera body, as with the Hasselblad H6D-100c digital back, which has a 100MP CMOS sensor and a $26,500 price.

Other solutions include backside-illuminated (BSI) sensors, which Sony first announced in 2015 and Nikon adopted in 2017. This type of sensor improves low-light performance and speed. Stacked CMOS sensors push readout speeds further still, as with the Micro Four Thirds sensor Sony announced earlier in 2021.

Smartphones, on the other hand, capture multiple images and average them together to improve noise and dynamic range, as with Google’s HDR+ with Bracketing technology, a direction that several modern video cameras have taken as well.
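
The statistics behind stacking are straightforward: averaging N independent frames cuts random noise by roughly the square root of N. The sketch below illustrates that principle in its simplest form; it is a generic toy example, not Google’s actual HDR+ pipeline, which also aligns frames and merges different exposures.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.linspace(0.0, 1.0, 256)  # a smooth gradient standing in for the true image

def noisy_capture():
    """One simulated exposure: the true scene plus random noise."""
    return scene + rng.normal(0.0, 0.05, scene.shape)

for n_frames in (1, 4, 16, 64):
    stacked = np.mean([noisy_capture() for _ in range(n_frames)], axis=0)
    rmse = np.sqrt(np.mean((stacked - scene) ** 2))
    print(f"{n_frames:>2} frames averaged -> remaining noise ~ {rmse:.4f}")
# Noise falls roughly as 1/sqrt(n_frames): ~0.050, ~0.025, ~0.0125, ~0.006
```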

Looking towards the future of sensor development, Schodt explains that silicon, the material currently used to make sensors, is likely to stay, although alternative materials such as gallium arsenide and graphene have been used. Another possible direction is curved sensors, although these would complicate things for users, since a curved sensor needs to be paired with lenses manufactured precisely for it. In practical terms, photographers would have to buy into a particular system with no option of using third-party lenses.

It’s likely that the future focus will be on computational photography. Faster sensors and more on-camera processing could bring smartphone-style image stacking to dedicated cameras, for example, alongside AI-driven image processing.

In the video above, Schodt explains in more detail how sensors are built and how their characteristics correlate with the resulting images. More Engadget educational videos can be found on the company’s YouTube page.


Image credits: Photos of camera sensors licensed via Depositphotos.




