Ever since the announcement of the new iPhone 14 lineup, the dominant topic on my feed and in mobile photography circles has been the jump in pixel count on the iPhone's main camera (every generation since the iPhone 6s has used a 12-megapixel sensor). While this is exciting for iPhone fans, people in the Android community (myself included) have been enjoying sensors that reach a staggering 108 megapixels, so when Apple made the announcement, I dove into research with one primary question: what does "megapixel" actually mean, and how does it affect images, especially in mobile photography?
A megapixel is a million active pixels, each capturing the light information that comes together to make up an image. Since the development of digital sensors, there has been a notion that the more megapixels a sensor has, the better the images it produces, but that is wrong. More megapixels can mean more detail in an image, but the advantage breaks down in low light: the same amount of light is divided across more, smaller pixels, so each pixel receives less of it and image noise appears faster at higher ISOs.

In digital photography (with regular-sized DSLR and mirrorless cameras), the bigger the sensor, the more light the camera can capture, so manufacturers can afford to cram in huge megapixel counts. When you move down to the smaller sensors found in mobile phones, a ridiculous megapixel count should be a liability because, as stated earlier, sensors with higher pixel counts fall apart faster in low light. Yet phones like the Samsung Galaxy S22 Ultra and Redmi Note 11 Pro boast 108-megapixel sensors and still offer good nighttime photography. All of this is possible because of something called pixel binning.
Pixel binning is a process that combines the data from a group of adjacent pixels (typically four) into one larger effective pixel. It is a helpful feature that improves the quality of images taken in low-light environments; however, binned pictures will not be as detailed as full-resolution images. Essentially, while these cameras have huge megapixel counts, the binning process that is active by default on mobile phones clusters those pixels together whenever you take a picture, and there is usually a toggle that deactivates binning and lets you use the full resolution of your smartphone's sensor.
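To make the 4-to-1 idea concrete, here is a minimal sketch of 2×2 binning in plain Python. The `bin_2x2` function and the toy grayscale values are illustrative assumptions on my part; a real sensor bins raw color-filtered photosites on-chip, but the arithmetic is the same idea of pooling four neighbors into one value.

```python
def bin_2x2(pixels):
    """Average each 2x2 block of pixel values into one value (4-to-1 binning).

    `pixels` is a toy grayscale image as a list of rows with even
    width and height; real sensors bin raw photosite data, but the
    pooling arithmetic is the same.
    """
    height, width = len(pixels), len(pixels[0])
    binned = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            # Sum the four neighboring pixels, then average them.
            total = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(total // 4)
        binned.append(row)
    return binned

# A 4x4 "sensor" becomes a 2x2 image: each output pixel pools the light
# of four neighbors, which is why binned low-light shots are cleaner
# but only a quarter of the full resolution.
sensor = [
    [10, 12, 40, 44],
    [14, 16, 48, 52],
    [ 8,  8, 20, 20],
    [ 8,  8, 20, 20],
]
print(bin_2x2(sensor))  # → [[13, 46], [8, 20]]
```

This is also why a 108-megapixel phone typically outputs 12-megapixel photos by default: nine-to-one binning divides the pixel count by nine, just as the sketch above divides it by four.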
Until now, the iPhone has done without binning because its sensors maxed out at 12 megapixels, but with the jump to 48 megapixels on the Pro models, the iPhone 14 Pro will use the binning process. Like other mobile phones with high-pixel-count sensors, a regular photo will be shot at 12 megapixels, while special modes like RAW can use the entire 48 megapixels. And in line with the header of this feature: yes, the iPhone 14 will create incredible images that should be significantly better than those from the iPhone 13, as its computational photography pipeline has been vastly improved with the introduction of the Photonic Engine.
Click below to watch the iPhone 14 Pro Max go up against the Galaxy S22 Ultra