Quad Bayer sensors: what they are and what they are not


Seemingly overnight, a great variety of phones with 48MP cameras appeared – the first ones launched in December 2018. And interestingly enough, many of them are mid-rangers while plenty of flagships have settled on a 12MP sensor. What’s going on here?

What we’re seeing is the emergence of the Quad Bayer filter – we covered it when the Huawei P20 Pro debuted such a sensor (with a 40MP resolution). And because marketing departments often fail to explain what this sensor is really about, we felt we needed to step in and clear up the confusion.

What’s a Bayer filter?

Let’s start from the beginning – a Bayer filter is a colorful mosaic of Red, Green and Blue filters that allows a digital sensor to capture color photos. Semiconductor pixels don’t “see” color, they only capture the amount of light that hits them, so without a filter you will get a Black & White photo. The Bayer filter makes sure that the light reaching each pixel is of one of the three primary colors.

The way it works out is that a 12MP sensor, for example, has 6 million pixels that see green, and 3 million pixels each for red and blue. Green gets more pixels because the human eye is the most sensitive to that color. An algorithm called demosaicing is used to interpolate a full 12MP resolution image.
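That split can be sketched in a few lines of NumPy – the 4000 × 3000 dimensions are just an illustrative 12MP layout, not any specific sensor:

```python
import numpy as np

# Hypothetical 12MP sensor: 4000 x 3000 = 12,000,000 pixels.
W, H = 4000, 3000

# Standard RGGB Bayer mosaic: each repeating 2x2 tile holds
# one red, two green and one blue filtered pixel.
pattern = np.array([["R", "G"],
                    ["G", "B"]])
mosaic = np.tile(pattern, (H // 2, W // 2))

print((mosaic == "G").sum())  # 6,000,000 green pixels
print((mosaic == "R").sum())  # 3,000,000 red pixels
print((mosaic == "B").sum())  # 3,000,000 blue pixels
```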

A Quad Bayer filter is a bit of a misnomer as it’s actually the same as a regular Bayer filter. What really changes is not the filter but the sensor behind it – these new sensors put four pixels behind each color square instead of just one.
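A toy comparison of the two repeating tiles makes the difference obvious (a sketch of the layout only, not any vendor's actual readout):

```python
import numpy as np

base = np.array([["R", "G"],
                 ["G", "B"]])

# Standard Bayer: the color changes with every pixel.
bayer = np.tile(base, (2, 2))

# Quad Bayer: the same RGGB layout, but each color square now
# covers a 2x2 group of pixels, so the repeating tile is 4x4.
quad = np.repeat(np.repeat(base, 2, axis=0), 2, axis=1)

print(bayer)  # colors alternate pixel by pixel
print(quad)   # colors come in 2x2 same-color groups
```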

So, really these 48MP Quad Bayer sensors can’t offer much more detail than a 12MP sensor. Sensor and phone makers alike will tell you that smarter demosaicing algorithms can capture more detail, but our experience is that the gain is small – if there’s a gain to be had at all.

Where can you find Quad Bayer filters?

The 48MP sensors are the most popular lately – they are already on several dozen phones. Yet it all started with Huawei’s 40MP sensors on the P20 Pro, Mate 20 Pro and Mate 20 X. The Chinese maker even came up with a second version of its sensor for the Huawei P30 and P30 Pro, which switches to a Red, Yellow, Yellow, Blue layout instead of RGGB, but the principle is the same.

There are also several phones with 32MP selfie cameras with a Quad Bayer filter (e.g. vivo V15 Pro and Samsung Galaxy A70). These have the same pixel size (0.8µm), but are physically smaller and so have a lower resolution.

Samsung recently announced a 64MP Quad Bayer sensor, which again keeps the same pixel size but changes the sensor’s dimension – it’s 33% larger than the current crop of 48MP sensors.

What a Quad Bayer sensor can do

As we said, the true strength of a Quad Bayer sensor lies elsewhere – it can treat the group of four pixels behind each color filter square as a single large pixel or as four separate pixels.

Right out of the gate, these are some of the largest sensors ever put in a phone. For example, the Sony IMX586 – the first 48MP sensor and one of the most popular – measures 8mm in diagonal. The IMX363 (used in the Pixel 3) and the Samsung S5K2L4 (used in the S10 phones) measure 7.06mm in diagonal. That’s roughly a 28% gain in surface area.
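The gain follows directly from the diagonals: sensors with the same aspect ratio scale in area with the square of the diagonal. A quick check:

```python
# Area scales with the square of the diagonal for same-aspect sensors:
# ~8 mm (IMX586) vs 7.06 mm (IMX363 / S5K2L4).
gain = (8.0 / 7.06) ** 2

print(f"{(gain - 1) * 100:.0f}% more surface area")  # roughly 28%
```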

The pixel sizes are vastly different, however: 0.8µm for the 48MP sensors and 1.4µm for the traditional sensors. All marketing material about the 48MP sensors boasts that they can use pixel binning to work like they have 1.6µm pixels.
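The 1.6µm figure is simple area equivalence: four binned 0.8µm pixels gather light over the same area as one square pixel with twice the side length. In code:

```python
import math

small = 0.8  # µm, pixel pitch of an individual Quad Bayer pixel

# Four pixels binned together collect light over four times the area;
# the equivalent single square pixel has side sqrt(4 * 0.8^2) = 1.6 µm.
binned_area = 4 * small ** 2
binned_side = math.sqrt(binned_area)

print(binned_side)  # ≈ 1.6 (µm)
```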

Binning creates a 12MP image. The “1.6µm” number is repeated often, but it shouldn’t get all the credit for improving the sensor's low-light performance. Noise is a random process and if the large pixel of a traditional sensor captures noise instead of signal, there’s little to be done (other than covering it up by interpolating data from neighboring pixels).

If one of the four pixels on a Quad Bayer sensor captures noise, however, that’s only 25% of the information lost – a 4x noise reduction that doesn’t diminish the sharpness of the image.
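A toy simulation shows the averaging effect, assuming independent Gaussian read noise on each small pixel (a simplification of real sensor noise):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate read noise on 1,000,000 groups of four small pixels.
noise = rng.normal(0.0, 1.0, size=(1_000_000, 4))

# Binning averages the four readings; averaging N independent samples
# cuts the noise variance by N (the standard deviation by sqrt(N)).
binned = noise.mean(axis=1)

print(noise.std())   # ~1.0
print(binned.std())  # ~0.5, i.e. variance reduced roughly 4x
```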

Alternatively, the sensor can be split up into two logical sensors – one that captures a short exposure and one a long exposure. This is used in daylight for real-time HDR capture.
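A minimal sketch of that merge, assuming a linear sensor that clips at a hypothetical full-well level (exposure values and scene data are illustrative, not from any real camera):

```python
import numpy as np

# Toy scene radiance (arbitrary linear units), including very bright areas.
scene = np.array([0.01, 0.1, 1.0, 10.0, 100.0])

FULL_WELL = 25.0  # hypothetical sensor saturation level


def expose(radiance, exposure):
    """Simulate a linear sensor that clips at FULL_WELL."""
    return np.minimum(radiance * exposure, FULL_WELL)


long_exp = expose(scene, 4.0)   # lifts the shadows, clips the highlights
short_exp = expose(scene, 0.1)  # keeps the highlights intact

# Merge: use the long exposure where it isn't clipped, otherwise fall
# back to the (rescaled) short exposure to recover the highlights.
merged = np.where(long_exp < FULL_WELL, long_exp / 4.0, short_exp / 0.1)

print(merged)  # recovers the full scene range
```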

You could do noise reduction and HDR with a single non-quad Bayer sensor by taking two (or more) photos one after another and combining them. That’s what the Pixel phones do and they are quite good at it.

But there’s a problem – moving objects change position between sequential exposures. A Quad Bayer filter takes two photos at the same time, so there’s no need to use AI to correct for artifacts caused by moving objects. Here’s what it looks like when the correction fails.



HDR done with sequential exposures • HDR shot with a Quad Bayer sensor

The simplest way to think about a Quad Bayer filter is that it allows the camera software to capture two photos at the same time. This enables the image processing (HDR and night mode) that is the real reason modern smartphones capture great quality images – the hardware isn’t all that different.

And what it can’t do

Unlike an LCD, which has R, G and B sub-pixels for each pixel, a classic image sensor has only one sub-pixel per pixel. But they can get away with claiming the resolution that they do because these pixels are close together and demosaicing mostly does a solid job of reconstructing the original image (though pixel peepers will know it’s not perfect).



Samsung Galaxy A80: 12MP • 48MP

In a Quad Bayer filter, pixels of different colors are further apart, so demosaicing is less effective (despite what makers claim). So you’re definitely not getting 4x the detail in 48MP mode that you get in 12MP. In fact, since HDR and other image processing modes are disabled at 48MP, the 12MP photos sometimes come out with better detail (and a much smaller file size – win-win).

Running the demosaicing algorithm on the raw 48MP data may produce a sharper image, but the results vary from phone to phone and from scene to scene. If detail is critical for a particular shot, we’ve found the best strategy is to shoot in both modes and pick whichever comes out better. Most of the time, however, you’re better off sticking to 12MP mode.



Oppo Reno 10x zoom: 12MP • 48MP

Not to mention that reading out the full 48MP image was beyond the capabilities of some early sensors and chipsets – they simply took the 12MP image and upscaled it, which is just a waste of storage.

One of the most frequently mentioned advantages of Quad Bayer sensors is superior zoom. While the Nokia 808 PureView did have impressive zoom, its enormous 41MP sensor had a classic Bayer filter. As discussed, Quad Bayer offers only a limited gain in sharpness (if that), so it’s really no different from doing digital zoom on a 12MP sensor.



Asus Zenfone 6: 12MP • 48MP

Marketing departments really want you to believe you’re getting a Hasselblad-like image sensor, but the reality is that the Quad Bayer filter is just a clever (and effective) way of getting better-quality 12MP shots.



Huawei 20 Pro: 12MP • 48MP

Note: you can use our image comparison tool to see if there’s any real benefit to shooting in 48MP mode.

Then there’s also the matter of optics. We won’t get into the details, but such high-resolution cameras are often diffraction limited – meaning the lens physically can’t focus light into a spot as small as a single pixel. You can read about Airy disks for more detail. Long story short, physics imposes a limit on the maximum resolution that small optics and sensors can achieve.



