The standard iPhone 14 series is starting to look like one of the best value high-end smartphones around. It starts at $799 but gets you some of the most important camera traits of the pricier iPhone 13 Pro Max from last year, plus some new stuff besides.

The iPhone 14 and iPhone 14 Plus – the new larger model – have two rear cameras. As in the iPhone 13, they are both 12MP sensors. While Apple has not confirmed this, it looks like the iPhone 14 gains the primary camera used in the iPhone 13 Pro Max. It has large 1.9 micron sensor pixels, a super-fast f/1.5 lens and sensor-based stabilization.

A little previous digging revealed this sensor to be the Sony IMX703, a 1/1.65-inch sensor. It’s a straight upgrade over the iPhone 13’s IMX603, and this upgrade path follows a familiar pattern: the iPhone 12 Pro Max used the IMX603 a year before the more affordable iPhone 13 got it.

The second camera is a 12MP ultra-wide with an f/2.4 lens. This is, unfortunately, the same spec as the iPhone 13’s secondary camera. However, this does not mean the iPhone 14 ultra-wide will take the same kind of images as the iPhone 13’s. Apple is relying on a different way to upgrade image quality.

Photonic Engine

[Image: A slide from the iPhone 14's launch showing how the Photonic Engine expands on the work done by Apple's existing Deep Fusion imaging technology.]

This is Apple’s name for an evolution of the Deep Fusion process it introduced in 2019. To understand the changes made, we have to pay close attention to the words Apple uses to describe it. "We’re applying Deep Fusion much earlier in the process, on uncompressed images," says Kaiann Drance, VP of iPhone Product Marketing.

This suggests the original version of Deep Fusion deals in compressed exposures, to allow the process to work more quickly and reduce strain on the processing pipeline. In this case we’re not talking about Deep Fusion working with RAW files versus JPGs, but more likely the bit depth of the images.

That thesis is supported by another thing Drance says: that it "enables rendering of more colors." If Deep Fusion images are 10-bit, their palette consists of 1.07 billion colors. At 12-bit, this extends to 68 billion colors. Of course, the final perception of an image’s actual color is going to rely far more on tuning than such a palette extension, as impressive as it sounds on paper.

Drance also claims the Photonic Engine has a massive impact on low-light photo quality, claiming a "2x" improvement for the wide and front cameras and a "2.5x" boost for the primary camera. Of course, we don’t know if that last figure also includes the improvements caused by a faster lens and larger sensor, or what metric is used for this comparison, but it promises to be an improvement regardless.

Apple’s visual representation of the Photonic Engine pipeline shows 12 stages, only four of which are the Deep Fusion part (although 2019 reporting on Deep Fusion suggested it actually involves 9 exposures). What are the rest? Apple did not go into that much detail, but the graphic clearly shows stages where the subject is identified and isolated. This allows for separate treatment of, say, a person and the background behind them.

The previous iPhone 13 already had sensor stabilization. This is an excellent piece of phone camera technology. Earlier iPhones already feature OIS (optical image stabilization), where a motor counters camera movements by tilting the lens slightly to compensate. Sensor-shift stabilization moves, you guessed it, the entire sensor rather than just the lens.

Common sense suggests the iPhone 14 relies on the same sensor-shift stabilization hardware as the iPhone 13 Pro Max. And tests show it’s particularly good at eliminating the kind of motion you get when, for example, shooting inside a moving car. It appears to have the same camera, after all.

However, it does have a new feature called Action mode, also found in the Pro models. This is intended for activities you might otherwise use a GoPro or gimbal for. Apple showed off some guy running during its launch to demonstrate and, as expected, the footage looked reassuringly smooth.

Action mode appears to use a combination of sensor-shift stabilization and a very significant crop. This crop is the fuel for software stabilization. When the camera "sees" more than is in the final video, the phone can move that cropped section around the full sensor view to negate actual motion in the final clip. This is why GoPro footage looks so smooth – a super-wide lens with plenty of spare image information to work with.

Autofocus selfies & that new flash

All four iPhone 14 models also get an upgraded front camera. It’s a 12MP sensor with an f/1.9 lens, one that’s faster than the f/2.2 lenses of the iPhone 13 family.
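The palette figures quoted in the Photonic Engine section follow directly from bit depth: an RGB image with b bits per channel can represent (2^b)^3 distinct colors. A quick sanity check of the article's numbers:

```python
# Distinct RGB colors at a given per-channel bit depth: each channel
# can take 2**bits values, and a pixel combines three channels.
def rgb_palette_size(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(rgb_palette_size(8))   # 16,777,216 – the familiar "16.7 million colors"
print(rgb_palette_size(10))  # 1,073,741,824 – the article's ~1.07 billion
print(rgb_palette_size(12))  # 68,719,476,736 – the article's ~68 billion
```

So the jump from 10-bit to 12-bit is a 64x larger palette, which matches the "1.07 billion to 68 billion" claim.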
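Apple has not published how Action mode is implemented, but the crop-based stabilization idea described above can be sketched in a few lines: the sensor captures a frame larger than the output video, and a crop window shifts in the opposite direction to the measured shake, clamped so it never leaves the sensor. This is a minimal illustration of the general technique, not Apple's actual pipeline; all names and dimensions here are made up for the example.

```python
# Illustrative sketch of crop-based (electronic) video stabilization:
# the output frame is a window cut from a larger sensor frame, and the
# window moves to cancel measured camera motion for that frame.

def stabilize_crop(frame_w, frame_h, crop_w, crop_h, shake_dx, shake_dy):
    """Return the top-left corner of the crop window for one frame.

    The window starts centered, then shifts opposite to the measured
    shake (shake_dx/shake_dy, in pixels), clamped to stay on-sensor.
    """
    center_x = (frame_w - crop_w) // 2
    center_y = (frame_h - crop_h) // 2
    x = min(max(center_x - shake_dx, 0), frame_w - crop_w)
    y = min(max(center_y - shake_dy, 0), frame_h - crop_h)
    return x, y

# Camera shakes 120 px to the right; the crop shifts 120 px left.
print(stabilize_crop(4032, 3024, 2560, 1440, 120, 0))  # (616, 792)
```

The clamping step is why a "very significant crop" matters: the bigger the margin between the sensor frame and the output window, the larger the shake that can be absorbed before the window hits the edge and the stabilization visibly runs out of room.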