First Look: iOS 26 Public Beta

By Dan Moren

iOS 26! It feels like just last year we were here discussing iOS 18. How time flies.

After a year that saw the debut of Apple Intelligence and the subsequent controversy over the features that it didn’t manage to ship, Apple seems to have taken a different tack with iOS 26. In addition to the expansive new Liquid Glass design that spans all of its platforms, Apple has largely focused on smaller, “quality of life” improvements rather than marquee new features. That’s not a bad thing, either—these are often the types of things that Apple does best, and which actually make a meaningful impact on the lives of its customers: saving them time waiting on hold on the phone, helping them avoid dealing with spam, and improving the features they use while driving.

It’s also worth noting that, with very few exceptions, all of the iOS 26 features that Apple demoed during its WWDC keynote this year are available, right now, in the public beta. The exceptions include the digital ID feature in Wallet that uses info from your passport and the age rating/content restriction updates in the App Store. That’s it. Everything else has been there since the earliest beta builds.

I’ve spent the last few weeks running those initial developer betas of iOS 26 so you don’t have to. As the public beta arrives, you may be tempted to dive in, so allow me to run down the biggest changes to your phone. And, as per our usual reminder, this is the beta period, so everything is still subject to change and the final version, when it arrives this fall, might look or work differently from the way it does today.

With that disclaimer out of the way, let’s take a look at what might convince you to take the plunge.

Liquid Glass half-full

Apple’s new design language, dubbed Liquid Glass, applies across all of its platforms, but unsurprisingly, it feels most at home on the iPhone and iPad. That’s in part because of the touch interface; the literal hands-on nature makes the interface feel responsive and more like a physical thing you’re interacting with. For example, dragging the new magnifying loupe across the screen and watching the way it magnifies and distorts text and images as it passes over them—this interaction has always been unique to iOS for practical reasons, but the way it feels here doesn’t have a direct analogue on other platforms.

https://sixcolors.com/wp-content/uploads/2025/07/ios26-publicbeta-loupe2.mp4 Perhaps the truest “liquid glass” interaction, in that the loupe deforms like a water droplet as it moves back and forth.

Controls now overlay content rather than sitting in designated toolbars or areas of the screen reserved for those controls, and are rendered in transparent glass that refracts and distorts the colors of whatever passes behind it. That’s impressive but also, at times, distracting: sometimes you see a distortion of text from what you’re reading within the UI, which is odd. Or, when scrolling past content that goes abruptly from light to dark, the buttons might similarly flip appearance from, say, black icons to white icons in a way that can feel jarring.
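
For developers, adopting the look should mostly be a matter of opting into the new system material rather than drawing it themselves. Here’s a minimal sketch of what that might look like in SwiftUI, assuming the glassEffect modifier and glass button style Apple previewed at WWDC; the exact names, defaults, and behavior are assumptions and could still shift during the beta.

```swift
import SwiftUI

// A sketch of a floating control bar rendered on the new material, assuming the
// SwiftUI additions previewed at WWDC 2025 (glassEffect and the glass button
// style). Names and defaults are assumptions and may change during the beta.
struct FloatingControls: View {
    var body: some View {
        HStack(spacing: 16) {
            Button {
                // Navigate back.
            } label: {
                Image(systemName: "chevron.backward")
            }
            .buttonStyle(.glass) // assumed: renders the button on a glass capsule

            Text("Now Playing")
                .padding(.horizontal, 12)
                .padding(.vertical, 8)
                .glassEffect() // assumed: wraps the label in regular, capsule-shaped glass
        }
        .padding()
    }
}
```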

App icons are built in transparent layers; interestingly, if developers adopted the design changes to better support iOS 18’s “tinted” theme, their icons already get some of the benefits baked in. That tinted theme has been expanded with both light and dark options and there’s also a new clear theme that turns all your app icons ghostly, which is a great way of testing your muscle memory for where you put your apps. I’m not sure it’s for me—everything looks a bit too same-y—but it is definitely a look, and I’m equally certain there will be folks who love it.

The Liquid Glass design has undergone perhaps the most substantial tweaks during the beta period to date. But those changes have been kind of all over the place. At times, Apple has seemed to dial back the transparency in an effort to help the legibility of UI controls, but in the most recent build, the company seems to have ramped the transparency back up, once again at the expense of readability. The challenge of the design is making things that both look and work great, and the company seems to be continuing to wrestle with that balance during the beta period.

iOS 26’s new popover menu makes it easier to find the option you want, rather than having to endlessly scroll.

The redesign is more than skin deep, however. Apple has rethought the way some of its most fundamental interactions work. For example, the increasingly long horizontal popover menus that hid options behind an interminable scroll have morphed into a dual-stage design. Tapping and holding on the screen brings up a popover with a few common options, but it no longer makes you scroll; instead, there’s an arrow indicating more options. Tap that, and you’ll get a big pop-up panel of all the available commands in a format that’s much easier to read and use. As someone who frequently finds myself swiping through a very long list to find the one command I want (and somehow, it’s always the last one), I consider this a tangible improvement.

It would be nice if that first menu were more customizable, though. For example, I can imagine someone who’d want Translate or Speak higher up. And although the menu varies from app to app, some of the organizational choices are puzzling. In the Safari screenshot above, I’m not sure why Writing Tools is visible. After all, I’m looking at uneditable text on a web page. Am I rewriting the web now? This feels less like a feature focused on user needs and more like a reflexive promotion for Apple’s AI tools.

Other system-level features have been expanded as well. For example, while you used to be able to swipe from the left side of the screen to go back or up a level in a hierarchy, that gesture now works anywhere on the display, making it both more discoverable and easier to use.

https://sixcolors.com/wp-content/uploads/2025/07/ios26-publicbeta-lockscreen-glass2.mp4 You paid for the whole lock screen but you’ll only need the edge!

As with any change this sweeping, it’s always going to take some time to adjust. There are some who will decry it as change for change’s sake, but as undesirable as that might be, the countervailing argument is that you shouldn’t keep things the same just because it’s the way you’ve always done them. My experience with Liquid Glass has had its ups and downs, ranging from interactions that feel interesting and dynamic to others that are downright frustrating.

Lock step

After the last few years of Lock Screen customization options, this year’s additions are more muted, and mostly in step with updating the look for Liquid Glass.

The biggest addition—literally—is the new clock, which can expand to fit the empty space on your screen. If you have a rotating set of lock screen photos, it will dynamically adjust for each one, trying not to obscure people’s faces; while you can manually adjust the size of the clock in the lock screen customization screen, it seems as though it still alters dynamically, so I’m not entirely sure of the point of that exercise.

The clock on the lock screen now dynamically adjusts from very very large to what used to be normal.

I’m also happy to say that one of my favorite features of last year—the rainbow-hued clock available in some lock screen styles, like Live Photo—still exists—you just have to change the clock style to solid rather than glass in order to see it. There’s also an option to move the widgets that used to sit below the clock all the way to the bottom of the screen, right above the notification stack, so as to not block the subject of your photo. I kind of prefer this location—I find it easier to tap a widget and open the app if I want, and I find the data from them don’t get as lost. (I was, however, able to overlay the widgets on the clock, which feels like a bug.)

Your Lock Screen photos themselves can also be more dynamic now, with the addition of spatial scenes. That’s a feature imported from the Apple Vision Pro, where iOS will apply a three-dimensional effect to an image, animating depth as you move the phone around. How effective that is varies from photo to photo, and it feels less compelling here than viewing true spatial scenes on a Vision Pro; the animation of the spatial versions is sometimes a little jerky, and some people with motion sensitivity might find them off-putting. Apple’s attempting to identify what makes a “good” spatial scene, and whatever system is making that determination can be hit or miss.

Speaking of images that move, the lock screen also now has an animated artwork option for music apps—note that I said “apps” not “the Music app” since it’s an API available to developers of third-party apps. But it will need adoption from the producers of albums in order to take full effect. When it shows up, it takes over the entire lock screen rather than being constrained within a little album square. It’s an interesting approach, although one that you may not notice depending on how often you actually visit the lock screen while music is playing. So, while it’s a cool idea, I’m not sure it does much for me. Maybe it’s time to commission some animated artwork for the Six Colors podcast?

Point and shoot

I’d venture a guess that Camera is the most used app on the iPhone, though I’ve got no real numbers to back that up. But given the amount of time Apple has spent upgrading the camera hardware on the iPhone over the years, I feel pretty confident in my assessment.

As a result, redesigning the Camera app—hot on the heels of last year’s redesign of Photos—is a bold choice. But it’s not surprising that the company’s alterations here are focused on the minimal, reinforcing the way that most people already use the app. (And if anybody’s got the metrics to know how people use its apps, it’s obviously Apple.)

For example, controls for more advanced features like aperture, exposure, and photographic styles are now buried in a separate menu, available by tapping a button at the top of the screen or by swiping up from the bottom. Given that I’ve definitely ended up in these controls by accident over the years—and I suspect I’m not alone—that’s not a bad thing.

The simplified Camera interface makes it easier to point, shoot, and get out.

Likewise, what used to be an at times overwhelming carousel of modes—panorama, portrait, photo, video, slo-mo, time lapse, etc.—has now been visually reduced, by default, to just the two most popular: photo and video. The others are still there if you scroll left or right, but you’re less likely to accidentally find yourself shooting spatial video at a lower resolution when you don’t mean to. Similarly, those resolution and format choices are also now squirreled away behind a separate button, there if you need them without being omnipresent.

The redesign reflects the fact that most people want to get in, snap a picture or shoot a video, and get out. Not to mention that Apple has spent a lot of time designing its phones so that they take great photos without having to tweak those details. Those advanced features are still there—and, arguably, more accessible using something like the Camera Control button on the latest iPhones—and for those who long for more than Apple’s Camera app offers, there’s an assortment of popular and powerful third-party camera apps to fill in the gaps.

The counterargument, of course, is that by hiding those features away, they are less discoverable. This is the eternal battle in interfaces, especially in someplace as constrained as an iPhone screen. In other places, Apple has done its part to pop up hints about features you might not see at first glance, including here in the Camera app. Personally, I think this redesign walks a solid line—the new interface is not so different from the old that I had any trouble with it, and I appreciate that there are fewer distractions.

There are also a couple of AirPods-related features in Camera: first, if you’ve got the latest models with H2 chips, the microphones should be improved. Apple touts them as “studio quality”, a meaningless qualifier that could mean anything from “suitable for a recording studio” to “you can use these in your studio apartment,” but at least it doesn’t sound like you’re in a wind tunnel anymore. In one of my test calls, my wife was genuinely impressed when I asked, at the end, how I’d sounded. “I wouldn’t have known you were on your AirPods if you hadn’t told me.”

And you can now use the AirPods’ stem controls to take a picture or start recording a video: handy for people using a selfie stick or tripod, or even just a quick way to snap a group photo (as long as you don’t mind having an AirPod in your ear in said photo). Bear in mind, this is a feature you’ll have to turn on in Settings under your AirPods, though it does let you choose between a simple press or a press-and-hold.

Calling cards

An update to the Phone app? Are we sure iOS 26 doesn’t stand for 1926? People knock the Phone app, but, well, I still make phone calls. In addition to a couple of handy features, there are also some substantial design changes afoot.

The new filtering menu in Phone works hand in hand with call screening and spam filtering features.

A redesign strikes again! The new Unified view pins your favorites to the top, then shows you your recent calls, missed calls, and voicemails all in a single very long list on the Calls tab, with separate tabs for Contacts and the Keypad. Some might not care for this approach, but I find it kind of a no-brainer. It did encourage me to pare my Favorites list down a bit to the one line of people I actually call as well as finally update their contact pictures to the more recent poster format. I don’t mind having voicemails mixed in; I don’t get very many. But if you hate this new interface, don’t worry: Apple will let you switch back.

Unquestionably good is the new set of Filtering features available in the menu at the top right. By default, this includes options to view just Missed calls or Voicemails, but there’s also now, praise the heavens, a Spam section for calls that are recognized as such. Apple’s using a combination of carrier tagging (those calls that you’ve seen flagged as “Spam Risk”) and its own analysis. You can manually mark a call as spam by swiping left on it in your recents list and choosing Block and Report Spam.

The real challenge, as always, is the calls that fall in between your contacts and out-and-out spam. For this there’s the new Screen Unknown Callers feature. You might remember that Apple previously added a Silence Unknown Callers feature in iOS 13 that would mute calls from numbers that weren’t recognized—with the challenge that if you got a call from a doctor’s office, tradesperson, or even food delivery, you might not see it. That was followed by Live Voicemail in iOS 17, which helped mitigate the issue, but Screen Unknown Callers goes a step further: when activated, which you can do in the Phone app or in Settings > Phone, callers not in your contacts will be asked to provide more information before the call rings through. You can also choose to leave unknown calls totally silenced, or turn screening off entirely to have all calls ring your phone.

There’s a separate but connected feature in iOS 26 called Call Filtering. Once you turn this on, you’ll see an Unknown Callers category in the filter list in Phone, not dissimilar from the Messages filters that have existed for a few versions. From there, you can choose to mark the numbers as known, at which point they will ring through—without having to be added to your contact list, which is nice. However, I’m not sure how you move a number back to “unknown” if you accidentally mark it as known—you can delete it from the list or block it, but there doesn’t seem to be a way to simply move it back to the “Unknown Callers” section. You can also choose to have calls detected as spam by your carrier simply not ring at all, which seems like a real no-brainer.

Overall, I’ve got mixed feelings about the Screen Unknown Callers feature. On the one hand, it will undeniably help weed out potential spam calls. On the other, some part of my upbringing feels embarrassed about the idea that someone—especially a likely underpaid person in a service industry—is going to have to justify their call to a robot. I’ve gotten calls from AI assistants from my dentist’s office recently, and frankly…I just hang up. I’m not going to spend my time chatting with a computer, and I don’t blame anybody else for feeling the same. That said, I have turned it on, though I haven’t actually seen it in use yet.

Along similar lines, Apple’s also added a feature called Hold Assist that automates the oft-annoying task of waiting on hold. I did get a chance to try this out, and it worked fine except for one caveat. The idea behind the feature is that when you’re put on hold with some cheesy hold music or deafening silence, you can trigger this feature and be notified when somebody comes back on the line.

One problem I encountered, however, was that it registered the occasional recorded message played while I was on hold with the Massachusetts Department of Revenue—”Your call is important to us!” or “Did you know you can go to our website?”—as a human coming back, and notified me, leaving me to scramble for the phone only to find that I wasn’t talking to a live person after all. My understanding is that the feature should be able to distinguish between a regular recorded message and a human, but that was not my experience in one of the earlier betas—I haven’t yet had a chance to put the feature through its paces in the more recent builds.

Just browsing

Safari’s reduced interface hides its commands in a plethora of pop-up menus, which leads to some oddities like two Share buttons.

While Safari may not have gotten quite the expansive overhaul of some of Apple’s other built-in apps this year, it’s still worth mentioning, if only because, like Camera, it remains one of the most used apps on iOS.

Apple’s taken a variety of stabs at UI minimalism in Safari over the years, both on iOS and macOS. Often those first, more substantial changes get dialed back. In iOS 26, the changes aren’t quite as radical, but they’re more than just a coat of Liquid Glass. Gone is the longstanding toolbar with its back/forward arrows, Share icon, bookmarks, and tab menus beneath a location bar. In its place, by default, is a more reduced UI with a back button, location bar, and now seemingly ubiquitous “More” button denoted with three dots.

You’ll find many of the previous controls under that More button, including both bookmark and tab management, as well as Share. But some controls are still accessed by tapping on the icon in the location bar—including Reader mode, if available, translation, extension management, and so on—and others are instead squirreled away under a long press on the location bar, including closing tabs, tab groups, and…another Share button. The button so nice they included it twice!

As with the Phone app, you can, if you so wish, revert back to classic Safari—either with the location bar at the top or bottom. In a few weeks of usage, I’ve elected to stay with Apple’s new design, though I still struggle to remember whether the control I want is accessed via the location bar or the More button. Or…both? At least some common gestures, like swiping left and right on the location bar to switch tabs or flicking upwards on the URL to see your tab overview, have remained.

I never really felt like the old toolbar style was getting in the way of my content, so I’m not sure if this change is anything but an attempt to mix things up. I’ve largely gotten used to the look, though at times the effects of a non-uniform website background on Liquid Glass can lead to disparate effects like one pop-up menu being a light color while another is dark.

Beyond the design changes, most of Safari’s other updates are under the hood. Developers of web extensions don’t need to use Xcode or even a Mac anymore; they can just upload their extensions to the App Store in a ZIP file. Hopefully that’s another step closer to being able to bring some of the myriad of extensions out there to Safari. And any web page can be opened as a full web app from the home screen now, rather than just essentially being a bookmark.

Let’s get visual…visual

Apple Intelligence may have been the big news in iOS 18, but this year its new features are somewhat more muted. While the capabilities that didn’t end up shipping in 2025—Personal Context and a smarter Siri among them—are still expected to arrive in the future, with this release Apple has focused on some smaller capabilities, like integrating ChatGPT into Image Playground, the ability to combine two emoji in Genmoji, and summaries of voicemails. It’s also brought back summaries for notifications with more granular controls over what kinds of apps you want it to apply to—plus more stringent warnings about the fact that said summaries may be inaccurate, which certainly raises questions about whether they’re useful.

Perhaps the most significant of these Apple Intelligence-powered features in iOS 26, though, is an expansion of the Visual Intelligence feature launched last year. Instead of being confined to pictures taken with the camera, Visual Intelligence in iOS 26 now offers the same capabilities with screenshots. In fact, the feature is built right into the existing screenshot interface, so now whenever you squeeze the sleep/wake button and volume up button to take a picture of what’s on your screen, you’ll see two new controls at the bottom: Ask and Image Search.

Story checks out.

The former lets you ask ChatGPT questions about the image, while the latter brings up Google results. You can even highlight a portion of the image by running your finger over it if you only want to search on a subset of the picture. It’s a shot over the bow of Google’s longstanding Lens feature, with a dash of AI thrown in. I’ve barely used Visual Intelligence on my iPhone 16 Pro since its debut; I’m not sure if screenshot integration is enough to get me to change my ways, but it does open up some new possibilities for extracting information from your screen, in the same way that Live Text has done.

Speaking of Live Text, in case answers from two different tech giants aren’t enough, Apple is also using a bit of that same machine learning technology to pull out relevant details from the image, whether it be a URL or a calendar event, and present them in a little floating lozenge at the bottom of the screen. That can be handy, though it’s also at the whims of whatever information is captured in the screenshot.
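
For a rough sense of the kind of machinery involved, here’s an illustrative sketch that leans on the long-standing Vision text recognition and NSDataDetector APIs to read a screenshot and pull out links, dates, and phone numbers. It’s only an assumption that Apple’s own pipeline resembles this at all, and the function below is purely hypothetical.

```swift
import UIKit
import Vision

// Illustrative only: recognize text in a screenshot, then run a data detector
// over it to find links, dates, addresses, and phone numbers. This is not
// Apple's Visual Intelligence implementation.
func extractActionableDetails(from screenshot: UIImage,
                              completion: @escaping ([NSTextCheckingResult]) -> Void) {
    guard let cgImage = screenshot.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        // Stitch together the best candidate string from each recognized region.
        let text = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n") ?? ""

        // NSDataDetector finds structured details in plain text.
        let types: NSTextCheckingResult.CheckingType = [.link, .date, .address, .phoneNumber]
        let detector = try? NSDataDetector(types: types.rawValue)
        let matches = detector?.matches(in: text, range: NSRange(text.startIndex..., in: text)) ?? []
        completion(matches)
    }
    request.recognitionLevel = .accurate

    // Vision calls the completion handler above during perform(_:).
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```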

It is a little odd that Visual Intelligence is offered in two different places with two different interfaces, but given that there is a distinction between screenshots and taking photos, perhaps that’s not as jarring as it seems at first blush.

Bits and bobs

As with any major platform update, there’s simply too much to cover absolutely everything. Here, then, are a few other features that I’ve noticed in my time with the iOS 26 beta.

The Battery section of Settings has been redone, providing a quick glance at when you last charged or, if your phone is plugged in, how long until it’s charged. The main graph now compares your battery usage to your average daily usage—including in the app-by-app breakdown—rather than providing the somewhat less useful hour-over-hour view. There’s also a new Adaptive Power mode that supposedly helps prolong battery life if you’re using more than usual by toning down some things like display brightness.

As on the iPad, you can record your audio locally on the iPhone with the new Local Capture feature, whether it’s via the built-in mic, AirPods, or a USB-C microphone (not to mention a new audio input picker that lets you choose which mic you want to use). While it still needs controls for audio input volume—some mics, including my ATR-2100x, which I would be most likely to use with this feature, are distorted because they’re simply too loud—this does make it feasible to record a podcast on your iPhone. I honestly never thought I’d see the day, but it’s here.
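
For the curious, here’s a rough sketch of what local recording with a selectable input involves, built on the existing AVAudioSession and AVAudioRecorder APIs. It’s not Apple’s Local Capture implementation, the class name is made up, and a real app would also need to request microphone permission and handle errors properly.

```swift
import AVFoundation

// A minimal sketch (not Apple's Local Capture feature) of listing inputs,
// routing recording to the one the user picks, and capturing audio to disk.
final class LocalRecorder {
    private let session = AVAudioSession.sharedInstance()
    private var recorder: AVAudioRecorder?

    // The inputs iOS currently exposes: built-in mic, AirPods, USB-C mics, etc.
    var availableInputs: [AVAudioSessionPortDescription] {
        session.availableInputs ?? []
    }

    // Route recording to the chosen input and start writing an AAC file.
    func start(with input: AVAudioSessionPortDescription, to url: URL) throws {
        try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
        try session.setPreferredInput(input)
        try session.setActive(true)

        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 48_000,
            AVNumberOfChannelsKey: 1,
        ]
        recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder?.record()
    }

    func stop() {
        recorder?.stop()
        try? session.setActive(false)
    }
}
```

A real app would layer input metering and gain handling on top of this, which is exactly the kind of control the system feature still needs.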

Notes may not support writing in native Markdown, but it does now let you export a note into the format. That includes any images that you’ve embedded in the note, which is handy. Despite being a Markdown fan, I’m not sure I’m likely to use this feature…I like Markdown because I want to write in it for the web, not have to take the extra step to export. But it’s nice that there’s at least an easy and accessible way to get your data out of the Notes app.

The Passwords app adds a history of changes you’ve made to passwords (only, of course, for changes since installing iOS 26). That’s a nice feature because I have definitely ended up not realizing I’ve already got a password and then reset it. In fact, in one of my favorite moves, it will even tell you when it created a password for a site, even if that password may not have actually gotten submitted—something that’s happened to me more than a few times.

Remember how even iTunes had the ability to crossfade between songs maybe twenty years ago? Well, Music‘s new AutoMix feature takes that to eleven by trying to actually find the perfect DJ-style moment to segue into the next song. In my experience it does work, but it is definitely kind of trippy. You can also pin favorites to the top of your library, whether it’s a song, album, playlist, or artist.

Can’t remember the name of that café you stopped at on your most recent road trip? If you opt into the new Visited Places feature in Maps, you can search or scroll through places you visited—even by category or city. All the information is stored privately and securely, so nobody else can see it, not even Apple, and you can easily remove items from the list (or correct erroneous ones). It’s also a great way to retroactively build a guide of places for a trip you’ve taken. There’s also a Preferred Routes feature that supposedly learns about how you make regular trips, but as someone who works from home, I don’t expect to get too much use out of this.

I don’t generally use alarms, so an adjustable snooze time in the Clock app doesn’t really do much for me, but I know some people will be excited, so here you go. However, this does come with one interface criticism about the alarm screen, which now has equally sized Stop and Snooze buttons, leading to the possibility of sleep-addled people hitting the wrong button. Here’s hoping a future beta considers that and maybe makes some tweaks.

I do, however, order lots of stuff on the internet so I’m fascinated to see how Wallet‘s new Apple Intelligence-powered tracking of orders works. There have long been popular third-party apps that handle bringing all your orders and shipments into one spot, but if this really can do it all automatically, that’s worth the price of admission for me. You can also see this pop up in Mail, where a banner will tell you that it’s detected a shipment and prompt you to add it to Wallet.

Hallelujah, you can now select the text inside a bubble in Messages. I know it’s not the flashiest improvement, but it’s always seemed absurd that this was an all-or-nothing proposition. I mean, you can copy just some text out of an image these days, for heaven’s sake. A small but very meaningful improvement.

Last, but hardly least, CarPlay gets a handful of new features, including the new Liquid Glass design, a smaller call notification, tapbacks in Messages, and Widgets. I really want to like widgets but two things hold me back: first, my CarPlay screen is very small and can only show one widget at once, and second, I’ve struggled to figure out which widgets are actually useful while I’m driving. Most of the time I really do just want my map and whatever media is playing. Maybe on a bigger screen they’d be more compelling. I’m a little worried that tapbacks will encourage people to interact with the screens too much, but at least it’s a quick action and not, say, typing out a reply.

Even in what seems like a modest update, there’s way more in iOS 26 than I can go through here. And as the beta period progresses, it will be interesting to see how the thinking on major elements like the design continues to evolve and change. Apple’s already shown that it’s receptive to feedback, and while not every complaint or criticism that people have is likely to result in a change, you never know. There’s plenty to dig into over the next few months before this fall’s release, so if you want to give it a shot, the public beta is out there and waiting for you.

[Dan Moren is the East Coast Bureau Chief of Six Colors. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His latest novel, the sci-fi spy thriller The Armageddon Protocol, is out now.]

If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.
