Do larger sensors collect more light?
If we talk about the amount of light (number of photons): the larger the capture surface, the more photons arrive per unit of time.
Assuming that the illumination is the same in two sensors of different size and that the exposure time is the same: the larger sensor will collect more light (more photons) overall.
Each sensor cell will collect only the number of photons corresponding to its surface.
If we have two sensors of different sizes, but their photosensitive cells have the same capture surface: each cell will collect the same amount of light in the two sensors.
But overall, the larger sensor will have collected more light.
So the image from the larger sensor will appear brighter, clearer?
The apparent brightness of the image (whether it appears darker or lighter) is known as exposure (perceived exposure if we also take ISO into account) and does not depend on the total amount of light, but on the amount of light per unit area reaching the sensor.
If you want to see it in a more intuitive way:
- The larger sensor receives more light, but spreads it over a larger surface area
- The smaller sensor receives less light, but spreads it over a smaller surface area
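To put (invented) numbers on this idea, here is a quick sketch. The photon density is a made-up figure; the sensor areas are the usual approximate ones (36 × 24 mm for Full Frame, 13.2 × 8.8 mm for a 1-inch sensor):

```python
# Hypothetical photon density: same scene, same settings,
# so both sensors receive the same number of photons per mm^2.
photons_per_mm2 = 10_000   # made-up value, just for illustration

full_frame_area = 864      # mm^2 (36 x 24 mm)
one_inch_area = 116        # mm^2 (13.2 x 8.8 mm)

total_ff = photons_per_mm2 * full_frame_area   # total photons, Full Frame
total_1in = photons_per_mm2 * one_inch_area    # total photons, 1-inch

# The larger sensor collects roughly 7.4x more photons in total...
print(total_ff / total_1in)

# ...but the light per unit area (the exposure) is identical:
print(total_ff / full_frame_area == total_1in / one_inch_area)
```

Same exposure, very different total amounts of light: that is the whole point of this section.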
Exposure vs Amount of light
The exposure of an image depends on the intensity of light (illuminance really, but think of it as photons per second) that the sensor receives, multiplied by the time it is exposed to that light.
The exposure is not dependent on the size of the sensor (assuming no other factors that limit the amount of light received by the sensor).
In cameras, the actual exposure (total photons per unit area) depends on the aperture of the diaphragm (f-number) and the exposure time (shutter speed).
The apparent exposure (perceived exposure) of the final image also depends on the sensitivity of the film or sensor (the ISO value).
That is, by varying the ISO value we make the image appear lighter or darker, but this is somewhat ‘artificial’, since we start with the same number of photons.
Different cameras, exposure and sensor size
All digital cameras have their sensors calibrated in such a way that they match (more or less) the behavior of photographic film.
With the same settings (aperture, exposure time and ISO), all cameras will generate an image with an ‘identical’ exposure.
I put it in quotes because there are other factors that influence the result, for example:
- The aperture value of the lens does not take into account the light that is lost passing through the glass elements that form it (the lenses are not totally transparent)
- Nor does it take vignetting into account: the central area of the sensor receives more light than the areas farther from the center
- Each manufacturer calibrates its sensor (ISO values) with a certain tolerance margin
Sensor size, exposure and image quality
So if the exposure is not dependent on the size of the sensor… what advantage does a larger sensor have?
To understand it we have to talk about the signal-to-noise ratio (SNR).
The signal is the information, the actual scene that we are photographing.
Noise is something that is part of nature. The light itself includes a noise component: the number of photons reaching each point on the sensor fluctuates statistically (Poisson distribution).
There are other sources of noise that affect the electrical signal (once the sensor converts photons to electrons).
We will never have an image that is totally faithful to reality; it will always contain some noise.
Signal-to-noise ratio (SNR) and image quality
The signal-to-noise ratio (SNR) is used to evaluate image quality.
This ratio is usually measured on a logarithmic scale, in dB (it could also be measured in light stops, EV, which is another logarithmic scale).
When the amplitude of the signal is double that of the noise, its SNR would be 6dB.
An image is considered to have an acceptable quality starting at around 20dB signal-to-noise ratio.
An image with an SNR above 30dB is considered excellent and the noise is hardly noticeable.
Noise is seen in the image as grainy and colored dots that do not correspond to the actual color of the scene.
Photonic noise and thermal noise
We then return to noise.
Fluctuations of light at the photon level are known as photonic noise ( shot noise ). Photonic noise grows proportionally to the square root of the number of photons.
That is, if a sensor cell receives 100 photons on average, the fluctuations will have a typical level of 10 (the square root of 100). One adjacent cell may receive, for example, 93 photons, another around 105, etc.
If we receive 1000 photons on average, the fluctuations will have a typical level of about 32 (the square root of 1000).
In addition, we must add the other sources of noise, such as thermal noise, which do not depend on the amount of light. But the main contribution will be photonic noise in most cases.
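If you want to check the square-root behavior numerically, here is a tiny simulation using NumPy's Poisson generator (the photon counts are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

def shot_noise_std(mean_photons, n=1_000_000):
    """Standard deviation of simulated Poisson photon counts:
    for Poisson arrivals it should be close to sqrt(mean)."""
    counts = rng.poisson(mean_photons, size=n)
    return counts.std()

print(shot_noise_std(100))    # ≈ 10   (square root of 100)
print(shot_noise_std(1000))   # ≈ 31.6 (square root of 1000)
```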
As you can see, the noise level does not scale linearly with the amount of light (signal).
The more light the sensor captures, the more difference there will be between the signal level and the noise level.
The SNR increases with the amount of light. The more photons the sensor captures, the higher the SNR and the better the image quality.
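In the case where shot noise dominates, this is easy to quantify: with N photons the signal is N and the noise is √N, so the SNR is √N, or 10·log10(N) in dB. A minimal sketch (photon counts purely illustrative):

```python
import math

def shot_noise_limited_snr_db(photons):
    # Signal = N photons, noise = sqrt(N)  ->  SNR = sqrt(N)
    # In decibels: 20 * log10(sqrt(N)) = 10 * log10(N)
    return 20 * math.log10(photons / math.sqrt(photons))

print(shot_noise_limited_snr_db(100))    # ≈ 20 dB (acceptable quality)
print(shot_noise_limited_snr_db(1000))   # ≈ 30 dB (excellent quality)
```

Note how the quality thresholds mentioned earlier (around 20dB acceptable, above 30dB excellent) correspond, in this simplified model, to roughly 100 and 1000 photons per cell.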
SNR at cell level or sensor level?
This is the eternal question that you will find since the beginning of time in forums and in bar counter conversations of brothers-in-law and photographers (or photographers in-law) … 🙂
Let’s first think about the image as a whole.
Imagine that we have two cameras positioned in such a way that they see the same frame of a scene. One of the cameras has a full frame sensor.
The other is a mobile camera with a very small sensor. Let’s further assume that the two sensors have the same cell size.
In the two cameras we adjust their exposure parameters exactly the same: aperture (whatever the mobile has), exposure time and ISO.
The two resulting images will look very similar in terms of average brightness (exposure).
Which of the two images will have more ‘quality’?
If we think of the image as a whole, the large sensor image has been created from many more photons.
If the SNR grows with the number of photons, it seems clear that the image from the large sensor will have a higher signal-to-noise ratio, and therefore better image quality.
Are we going to notice it visually by comparing the images?
Imagine there is a lot of light in the scene and we expose correctly.
Let’s suppose that for the image of the small sensor we get 35dB of SNR and for the image of the large sensor an SNR of 55dB (I am making these values up, just as an example).
The two images will be of excellent quality and we will probably not notice any appreciable differences. Surely the optical part or any other external effect will have more influence.
Now let’s imagine that there is less light in the scene, or that we need a very short shutter time. We raise the ISO on both cameras to compensate.
Now the amount of light is going to be less. Fewer photons; remember that with the ISO we simply rescale or amplify the signal.
Suppose that now for the image of the small sensor we stay at 25dB of SNR and in the large one we go down to 45dB.
Although the image from the small sensor is still of acceptable quality, the noise will be noticeable. In the large sensor image we still have a very high SNR, the quality will be excellent.
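A toy model helps to see why raising the ISO costs us SNR: each ISO stop means half the photons reach the sensor, and in the shot-noise-limited case (a simplification that ignores read noise and other sources; the photon count is made up) that costs about 3 dB per stop:

```python
import math

def snr_db(photons):
    """Shot-noise-limited SNR in dB (signal N, noise sqrt(N))."""
    return 10 * math.log10(photons)

base_photons = 100_000   # made-up photon count at base ISO

# Each stop of ISO we raise means half the photons, so the
# shot-noise-limited SNR drops by ~3 dB per stop:
for stops in range(4):
    photons = base_photons / 2 ** stops
    print(stops, round(snr_db(photons), 1))
```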
But, but, but … we said that the cells are the same size! …
Shouldn’t the SNR be the same for the cells of the two sensors?
At the cell level, each one of them collects the same number of photons in the two sensors.
The signal-to-noise ratio at the cell level is identical (we will neglect secondary effects: we assume the same technology, etc.).
If we enlarge the images on the computer to 100% to see each individual pixel, we will see that their variability (noise) is very similar in the two images.
The difference is that one pixel in the small sensor image corresponds to a set of pixels in the large sensor image.
The large sensor has a much higher resolution (the condition that we had set is that the cells of the two sensors had the same size, therefore in the large sensor there are more cells).
So, at a very local level, in each small area of the image (pixel peeping), we find that the SNR is very similar in the two images. But the image from the large sensor contains more information about the scene: more points.
More information for the same local noise level: this already gives us a clue; it tells us that the image from the large sensor, even locally, looking at it with a magnifying glass, has higher quality, a better signal-to-noise ratio.
What if the smallest sensor has larger cells?
In all the combinations we can make, in the end the image from the large sensor will have a higher SNR (except in very, very specific cases).
At a local level, looking at each area of the image with a magnifying glass, it is possible that in certain areas, for example the darkest areas, the quality is a little better in the small sensor (due to the better signal-to-noise ratio of each cell).
But if the two images show the same framing, the large sensor will have built it from more photons overall: it will have more information and a better SNR than the image generated by the small sensor (if we cropped one image or the other, the comparison would not be valid; we would be cheating).
Scale to compare
Okay, that’s all kind of theoretical.
Let’s get down to business. If you want to compare two images from two different cameras, you can do it for example by printing on paper at the same size.
Or if you want to see it on the screen, you can scale the two images so that they have the same resolution.
In either case, the process of scaling an image to make it smaller has the effect of increasing the signal-to-noise ratio.
This is so because the scene information that appears in the image is normally spatially correlated, while noise is a random component that is neither spatially correlated nor related to the elements of the scene.
By rescaling down, let’s say that although we lose information (detail, resolution), we reinforce the information from the scene and reduce the noise.
Rescaling up to higher resolutions does not provide new information: the SNR remains constant or decreases (the noise will probably become a little more visible).
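A quick numerical sketch of the downscaling effect, with a flat synthetic ‘scene’ and Gaussian noise standing in for real sensor noise (all values invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat 'scene' (signal = 100) plus random noise of std 10.
image = 100.0 + rng.normal(0, 10, size=(1000, 1000))

# Downscale by averaging 2x2 blocks of pixels (a crude rescale).
small = image.reshape(500, 2, 500, 2).mean(axis=(1, 3))

# The signal is preserved, but averaging 4 independent noise
# samples halves the noise std -> the SNR roughly doubles (+6 dB).
print(image.std())   # ≈ 10
print(small.std())   # ≈ 5
```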
Big sensor vs small sensor … in the real world
Is sensor size that important?
No (for what we are discussing here regarding noise and SNR / image quality)
When we use the camera with good lighting in the scene, there is literally plenty of light.
And all cameras, including mobile phone cameras, offer excellent image quality (above those 30dB of SNR that we mentioned).
Imagine that we progressively lower the lighting of the scene.
There will come a light level at which the camera with the smallest sensor, for example that of a mobile phone, will no longer be able to generate images with that 30dB ratio, and the noise will begin to be visually noticeable.
Meanwhile, another camera, for example one with a 1-inch sensor, will continue to offer images with SNR > 30dB.
We keep lowering the lighting …
Images from the 1-inch sensor will already start to drop below 30dB
Then it will be the turn of the Micro 4/3 sensor
Then to the APS-C sensor
And finally to the Full Frame sensor
In other words, from this point of view, a larger sensor basically gives us a small additional margin for low-light situations: dusk, interiors, museums…
… or situations in which we need a high shutter speed and we have to raise ISO: sports photography (especially indoor), children, pets … moving objects … when the light conditions are not perfect.
What margin or real difference is there between sensors of different sizes?
- Between a mid/high-end mobile and a 1-inch sensor camera: about 2 stops (approx. 2 EV)
- Between a 1-inch sensor and a Micro 4/3 sensor: slightly more than 1 stop (approx. 1 EV)
- Between a Micro 4/3 sensor and an APS-C sensor: about 2/3 of a stop (approx. 0.7 EV)
- Between an APS-C sensor and a Full Frame sensor: about 1 1/3 stops (approx. 1.3 EV)
These differences are only approximate and with cameras / sensors of similar technology .
A current small sensor may perform better than a larger but much older sensor.
It’s just to get a mental idea …
… And to realize that cameras are not magical.
Note that, for example, a Micro 4/3 camera, an APS-C and a Full Frame are all within about 2 stops of each other.
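Putting the approximate figures above together (rough values, remember):

```python
# Approximate advantage in stops (EV) between adjacent sensor sizes,
# using the rough figures quoted in the text:
steps = {
    "mobile -> 1-inch": 2.0,
    "1-inch -> Micro 4/3": 1.0,
    "Micro 4/3 -> APS-C": 0.7,
    "APS-C -> Full Frame": 1.3,
}

total_ev = sum(steps.values())   # ≈ 5 stops from mobile to Full Frame
light_factor = 2 ** total_ev     # each stop doubles the light

print(total_ev)      # ≈ 5
print(light_factor)  # ≈ 32
```

Under these rough numbers, the gap between a mobile sensor and Full Frame is about 5 stops, a factor of roughly 32 in light.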
Case studies and examples
First, let's check with real cameras whether the differences we have discussed actually hold.
There are many ways to estimate the differences, for example from the DXOMARK data or from the graphs from photonstophotos.net.
For example, here is a comparative graph based on the photographic dynamic range based on the size of the sensor:
But we are going to make our own graph.
We are going to choose several more or less current cameras (at the time of writing this article) with different sensors:
- 1 inch : Canon G7 X, Sony RX100 VII
- Micro 4/3 : Panasonic GX800, Panasonic GH5, Olympus E-M1 mark II
- APS-C : Sony a6400, Nikon D3500
- Full Frame : Canon EOS RP, Sony a7S, Sony a7R 4
I have chosen the models basically at random: the first ones that came to mind and that I found in DXOMARK for each type of sensor.
For example, within Micro 4/3 there are professional cameras (E-M1 for photo, GH5 for video) and entry-level cameras (GX800). And in APS-C, the Sony a6400 would be upper mid-range, while the Nikon D3500 is entry-level.
For each camera we look up in DXOMARK its value for Sports (Low-Light ISO).
What is that value?
It is the maximum ISO that we can configure in the camera to achieve at least a signal-to-noise ratio of 30dB in the resulting image (excellent image quality).
Now we are going to try to analyze the data and draw conclusions.
The horizontal axis represents the sensor surface (in square millimeters)
The vertical axis represents the ISO value of the camera in linear scale.
The red points correspond to each of the cameras, they represent that ISO value for which we have SNR = 30dB
We see that the dots are vertically aligned at their corresponding standard sizes: 1 inch (116 mm²), Micro 4/3 (225 mm²), APS-C (350 mm²) and Full Frame (860 mm²).
Note that the jump between the ‘crop’ sensors and the Full Frame sensor in terms of capture area is huge.
For each sensor size there is a certain variability: it may be due to the sensor technology, its age …
I have included in orange the lines corresponding to typical ISO values (ISO 400, 800, 1600 and 3200) to give a clearer estimate of the differences between cameras in stops.
The cameras with 1-inch, Micro 4/3 and APS-C sensors that we have chosen in this example are all within a range of less than 2 stops.
APS-C has a slight advantage over Micro 4/3.
And Micro 4/3 has a slight advantage over 1-inch sensors (the GX800 probably has a somewhat old sensor, being an entry-level camera, while the G7 X and RX100 are mid/high-end compacts).
We see that Full Frame sensors have more than 1 stop of advantage over APS-C.
In any case, there is a fairly clear correlation between the size of the sensor and the signal-to-noise ratio that we can achieve.
Comparison in a real case
A situation where all cameras suffer:
Imagine that we want to take photos of our son or daughter playing some sport in an indoor sports hall.
In these halls, especially the small ones, the lighting is usually quite poor.
On the other hand, we need a high shutter speed to 1) freeze the movement of the players and 2) avoid camera shake if we shoot handheld.
Let’s assume an initial configuration that gives us a correct exposure:
- Lens at f/4
- Shutter speed: 1/250s
- ISO: 800
1-inch sensor cameras would be a bit on the edge but overall the resulting images would be fully usable.
The Micro 4/3 cameras, except for the GX800 which seems to be a bit below average, would give good image quality. The same goes for APS-C and, of course, Full Frame.
Now we want to raise the shutter speed a bit to freeze the action more. Outdoors, the recommended speeds would be above 1/500s, depending on the sport.
We go up to 1/500s and, to compensate, we have to go up one ISO stop, to ISO 1600.
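Just to check the compensation arithmetic:

```python
import math

# Halving the shutter time removes exactly one stop of light,
# which we recover by doubling the ISO:
light_lost_stops = math.log2((1 / 250) / (1 / 500))   # stops of light lost
iso_gain_stops = math.log2(1600 / 800)                # stops of ISO gained

print(light_lost_stops, iso_gain_stops)   # 1.0 1.0 -> same perceived exposure
```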
At ISO 1600 the APS-C cameras would be a bit at the limit, but would still give good quality images. So would some Micro 4/3 cameras, like the E-M1.
In cameras with a smaller sensor, you would already notice some graininess with the naked eye, especially in the darkest areas.
Cameras with a Full Frame sensor would still have extra margin: they could go up to ISO 3200 and 1/1000s and the image would still be excellent.
Summary and conclusions
Remember that in this context, when we talk about image quality we are referring only to the level of digital / grain noise.
Key ideas we’ve seen:
- The exposure does not depend on the size of the sensor.
It tells us the amount of light the sensor receives per unit area.
- The exposure we perceive in the image also depends on the ISO value.
Although the actual exposure measures the light received, in most cases we talk about the exposure triangle, including the aperture, the exposure time (shutter speed) and the ‘sensitivity’ (ISO).
- Any camera with the same parameters (aperture, shutter speed and ISO) will produce a similar image in terms of brightness / luminosity
- Larger sensors receive more light
The amount of total light (photons) depends on the exposure and the total surface of the sensor
- Image quality depends on signal to noise ratio
The higher the SNR, the better the image quality
- An image with SNR > 30dB is considered to have very good quality.
Ultimately, the perception of digital noise in the image is subjective; each photographer has their own criteria for what they consider an excellent, acceptable or bad (not usable) image.
- Larger sensors give images with better SNR (higher quality)
- Sensors of the same size but different resolution:
The sensor with lower resolution (larger cells) may perform better in terms of noise (less variability between points of similar tonality), but also offers less detail of the scene.
- To compare the quality of two images we should do it with their final copies: for example, paper prints or images at their actual publication size. Or, at least, we would have to rescale both to the same resolution before comparing.
- All cameras produce excellent images in good light.
In these situations the SNR is high in all cases and the noise level can be considered negligible. The differences would come more from the optical quality of the lenses or from other external factors.
- The difference between sensors will only be noticeable in certain situations (low light, when raising ISO, when we raise the shadows in development / editing, scenes with very high dynamic range …)
- The differences between sensors of different sizes are gradual.
For example, the difference between an APS-C sensor and a Micro 4/3 sensor is minimal.
A Full Frame sensor would give an advantage of just over 1 stop over an APS-C sensor.