Rita El Khoury / Android Authority
I travel a lot and I love taking panorama shots in scenic locations. They’re a great way to capture the immensity of the Alps, the peacefulness of the Slovenian seashore, or the Toronto skyline from the islands. But each time I take a panorama shot with my Pixel 7 Pro, I find the result disappointing. This is supposed to be one of the best camera phones on the market, but it can’t stitch a couple of images together to make a decent panorama — how come?
I had ignored this aspect of the Pixel’s photography experience until I went on a trip with a friend who carries an Apple iPhone 14. That’s when I (begrudgingly) noticed that their phone was taking much, much better panorama shots than my Pixel. So what gives? I picked up my old iPhone 12 Pro Max to investigate this often-forgotten aspect of mobile photography.
The iPhone captures panoramas with all lenses
A bit of digging revealed that there are two major differences in the panorama-shooting experience between Apple’s iPhone and Google’s Pixel, and the first one is obvious from the moment you launch the mode in the camera app.
Google doesn’t give me any control over the pano experience. I can start capturing using the phone’s main rear lens and stop — that’s it. Apple, on the other hand, lets me take panoramas with all of the phone’s lenses. Since I’m using my iPhone 12 Pro Max as an example, I can shoot panoramas with the main lens, the ultrawide, and the 2.5x telephoto lens. That’s already a major advantage. The versatility of zooming or expanding is very neat and lets me frame the shot exactly how I want.
The images below show the same river bank shot with those three lenses. The main lens does the job, but I can imagine the ultrawide being very handy for capturing more majestic landscapes and the zoom being perfect for city skylines.
Some of the images below have been cropped for better alignment, converted from HEIC to JPG, and/or compressed. To pixel-peep at the full-resolution original samples, check this Google Drive folder.
iPhone vs Pixel panoramas: An issue of resolution and detail
The second difference is in the final image. The Pixel 7 Pro shoots very, very low-resolution panoramas — or at least it compresses them immediately after snapping them. The biggest panos I’ve saved are around 1600 pixels in width, and no more than 5MP in resolution. Ouch. For a sensor that’s capable of 50MP and binning down to 12MP, 5MP is so, so meager. That’s late-’00s or early-’10s camera territory.
As a result, Pixel-shot panoramas are fine if I look at them on my phone’s screen, but don’t stand up to the pinching test. Details are nonexistent. The main reason to use the panorama mode, i.e. the ability to capture more in a single photo without the hassle of manual stitching, is completely lost, because I end up with less detail and information than if I snapped an ultrawide shot or got a few regular shots and combined them in Photoshop later.
By comparison, Apple captures and saves panoramas at — or near — each sensor’s maximum resolution. The resulting image is a high-resolution snap with plenty of detail; most of the panoramas I’ve shot are more than 20MP, and some even reach 60MP. The difference is really night and day, and I can pinch in to reveal a lot.
Here’s another crop from another riverbank snap. You can see that, unlike the Pixel, the iPhone manages to capture a lot of detail from the buildings and trees.
The Pixel 7 Pro panoramas get HDR and colors right
The Pixel still manages to score some points, though — at least compared to my older iPhone 12 Pro Max. Panoramas shot on my Pixel 7 Pro have a better dynamic range than those I took on my iPhone.
Apple doesn’t seem to be using any HDR corrections here: Some areas are overexposed, others underexposed, and panoramas don’t forgive that. You’re already pushing your phone’s camera by mixing many “mini” scenes of varying brightness and exposure, and the iPhone doesn’t do itself any favors by entirely eschewing HDR. However, I mention this with the caveat that I’m using an older model, so maybe the newer iPhone 13 and 14 have fixed that.
The Pixel also wins a few extra points for its unique 360 photo sphere mode and the freedom to capture panoramas in any direction (i.e. it can snap them right-to-left and bottom-to-top, unlike the iPhone which only does left-to-right and top-to-bottom). Plus, personally, I prefer Google’s color science even if it feels a touch over-saturated. I find Apple’s colors too cool. See for yourself.
Pixel 7 Pro vs iPhone panorama: A ‘clear’ winner
Despite the Pixel’s better HDR and colors, I have to give Apple the win here if only for the higher-resolution snaps. You can always edit and improve colors, but you can’t add detail.
When I capture panoramas, I do it to get more into the photo — more width or height, more detail, more interesting subjects — and to be able to recall that moment in time later as if I’m seeing it with my own eyes again. My eyes don’t see a blurry mess; they see detail, and the Pixel 7 Pro sadly fails to deliver that. The iPhone does. And the option to switch to the ultrawide or zoom lens to frame things differently is the cherry on top.