Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

 

Contents

Welcome to 300

Join Us for Our Special Apple Event Coverage

Update on the Vision Australia CEO Controversy

Introduction to the iOS 18 Review

Supported Devices

VoiceOver: What’s New?

The Live Recognition Rotor

Voice Rotor, Customization, Equalization

Use Your Personal Voice with VoiceOver

A New Approach to Audio Ducking

VoiceOver Tutorial

Haptic Vibration at VoiceOver Start

Delay Before Selection

Separate Braille Input and Output Tables

Reconnect Braille Display

Item Chooser for Braille

Braille Screen Input Gets a Significant Revamp

Custom Vocabulary With Voice Control

Vocal Shortcuts

Music Haptics

New Background Sounds

More Powerful Control Center

Voice Memos

Notes

Calculations in Edit Fields

Changes to the Files App

Messages

Emergency Features

Locking and Hiding Apps

Apple Music

The Passwords App

Gaming Mode

Battery Care

Phone

Wallet

AirPods

TV

Home

Calculator

Calendar

Podcasts

Conclusion

Closing

Welcome to 300

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Hello!

It’s not far away now. Apple will soon release iOS 18. It’s packed with accessibility and general enhancements. I’ll give you a preview of some of the new iOS 18 features so you know what to expect when it’s finally released in just over a week.

We made it to episode 300. I mean just, to be fair, just. [laughs]

We have made it to episode 300, and what happens once you get to the 3s in the country code department is that the country codes actually go back to two digits. And, you know, hindsight’s 20-20 and all that malarkey. It’s kind of a shame that all the way back in the 30s, we weren’t doing the country code thing.

There is no US area code 300. But once we get past 300, there are plenty of US area codes to talk about that start with the number 3. They’ll keep us busy, I’m sure, until we’re done.

Join Us for Our Special Apple Event Coverage

Now, we’re here for the most part to talk about iOS 18, but a couple of other notes first. The first is Apple-related: Apple’s big event.

I was going to say the Glow Worm. See, it’s still stuck in my head after a week. “It’s Glow Time” is actually what they’re calling it, and it happens on the 9th of September.

Right after that, we’re going to have Heidi Taylor in the studio. We’re also going to have our other guests join us remotely. We are going to give you extensive, comprehensive analysis of the event from a blindness perspective.

A reminder that now that Living Blindfully is winding down and we have wound down the plus program, everybody will get episode 301 at the same time, which will be just after we record it on the 9th of September US time. The time of that Apple event will be 10 AM Pacific time. That equates to 1 PM US Eastern time. It’s going to be 5 in the morning here in New Zealand, and it’s going to be even worse for our friends in Australia, 3 AM there on the Tuesday morning, and it’ll be 6 PM on Monday in the UK.

Looking forward to bringing that to you.

Update on the Vision Australia CEO Controversy

The second thing I wanted to talk about is the Vision Australia situation that is percolating away, and it is great to see that petition continuing to gather signatures. There are over 700 signatures on it now. Congratulations to everyone who has signed it and spoken up against an injustice.

I can also tell you that there’s been a bit of media coverage of this, and that’s always good because it means that donors are becoming aware of what’s going on, and they should, in this enlightened age, be aware of the kind of things that are going on in an organization that they donate to.

Vision Australia, in an article for The Guardian, has fessed up to censoring the New Horizons program. We talked about this extensively with the host of New Horizons, Vaughn Bennison, back in episode 298. And the reason they gave in The Guardian for doing so was that it contained, and I quote, a shock jock style editorial.

Now, you can go back and listen to New Horizons in the archives, and I think any reasonable person will agree that while it may be described as a passionate editorial, no reasonable person would call it a shock jock style editorial. And even if it is (I don’t accept that it is for a moment), it is not Vision Australia’s role to treat blind people like little children and say you mustn’t say that about Vision Australia.

Vision Australia is in a position of trust and a position of privilege. It is a gross abuse of power for Vision Australia to be censoring a programme that is put together on behalf of a consumer organization. It is outrageous, and I hope appropriate regulatory action follows because this sets a frightening precedent, and it must not be allowed to stand.

And even for those of us who are not in Australia, this issue is symptomatic of a much wider problem. We’ve seen a couple more CEOs of significant blindness organizations appointed in the United States just in the last couple of weeks, both of them sighted, and one of them with no knowledge whatsoever of blindness.

This thing is a pandemic in our community, and it’s going to remain so, unless we demand change, and unless we decline to be intimidated by those who would intimidate us into silence. It ain’t going to happen anymore. I suggest there has to be a tipping point, and that we have probably reached it.

The Vision Australia example is a bridge too far, and it is, I think, inspiring others to take a look at what they have tolerated in silence for far too long.

To that end, you can now go to a website to keep up to date with what’s happening with this campaign. That website is UnitedBlindLeaders.org. Nice, simple URL, UnitedBlindLeaders.org.

There is an official event on this Thursday, the 12th of September, to launch the United Blind Leaders website. It’s going to be happening at 7 PM Melbourne time. I’ll be there, and I hope that you will be, too. You can find out more on the United Blind Leaders website at UnitedBlindLeaders.org.

I send words of encouragement and empowerment to everyone involved in this campaign in Australia. Stay strong because your cause is important, and your cause is right.

Advertisement: Our transcripts have become one of the most valued features of Living Blindfully, and they’re made possible, thanks to the generous sponsorship of Pneuma Solutions. Pneuma Solutions, among other things, are the RIM people.

If you haven’t used Remote Incident Manager yet, you really want to give it a try. It is a fully accessible, screen reader agnostic way to either get or provide remote assistance. And it works for PC and Mac completely transparently. So if somebody assisting you is using a Mac and you’re using a PC or vice versa, RIM will handle that no problem at all.

We all want to use accessible websites whenever possible, right? But there are some times where we just have to get something done on a website that’s not accessible.

I try not to do it too often. But every so often, I’ll get in touch with one of my adult children and ask them if they have a couple of minutes to get me past a difficult accessibility problem on a website, or even in a specific app. For this, we use RIM.

I like it because I don’t have to tab around looking for some sort of arbitrary code in a semi-accessible app. We can choose the keyword that is going to be used.

And now, of course, I can call Aira and have a professionally-trained visual interpreter assist me through RIM.

I’m pleased to be a RIM user because Remote Incident Manager was designed by blind people with blind people in mind, yet it has advantages over other remote access solutions that sighted people are used to, which are nowhere near as accessible. So if you have a family member sometimes assisting you through a murky web situation, I’m sure you won’t regret switching to Remote Incident Manager to get the job done.

To get the app for PC and Mac, you and the person assisting you can head over to GetRIM.app (https://pneumasolutions.com/products/rim). That’s G-E-T-R-I-M.app.

Introduction to the iOS 18 Review

The first official version of iOS 18 is about to be released. It’s the latest operating system from Apple that adds plenty of new smarts to your Apple mobile device.

Interestingly, the smarts that everyone was talking about after Apple’s Worldwide Developers Conference keynote are not here yet, specifically all the Apple Intelligence features. We’ll have to wait a bit longer for those to start appearing. Nevertheless, there’s plenty to cover.

This episode is a super deluxe long one, and it is indexed by chapter. So if there’s a particular feature of iOS 18 that I’m talking about that doesn’t interest you, if your podcast player supports chapters, you can easily skip to the next one. If you’re using Castro, you can bring up a list of all the chapters and deselect the features that don’t interest you. Then, Castro will just play the episode out with the features that you want to hear about.

Supported Devices

Let’s begin by checking if your device is capable of running the latest operating system.

The new iPhone 16 devices will obviously run the latest operating system. It’d be a bit of a scandal if they didn’t.

Also supporting iOS 18 are all iPhone 15 models, all iPhone 14 models, all iPhone 13 models including the mini, all iPhone 12 models, all iPhone 11 models, all iPhone XS models, the iPhone XR, and the iPhone SE second generation or later.

But if you want to make the most of the upcoming Apple intelligence features, you’ll need the iPhone 15 Pro, iPhone 15 Pro Max, or any of the iPhone 16 models. This is due in part to the phones requiring 8GB of RAM.

I’ll be focusing on iPhone with this review, but most of the features are also available on iPad with iPadOS 18. To run this on your iPad, you’ll need iPad Pro M4, iPad Pro 12.9 3rd generation and later, iPad Pro 11 1st generation and later, iPad Air M2, iPad Air 3rd generation and later, iPad 7th generation and later, and iPad Mini 5th generation and later.

VoiceOver: What’s New?

As I like to do, we’ll get to the nitty-gritty right off the bat and have a look at new VoiceOver features in iOS 18. There are some of significant substance this year.

You may remember that for some years, I wrote a series called iOS Without the Eye. When developer beta 1 of iOS came out every year, I would start trawling around the operating system, looking for new things and chronicling them from a VoiceOver user’s perspective. I was really doing Apple’s job for them, to some extent, because Apple did not provide documentation within iOS about what they’d changed, particularly from a VoiceOver user’s perspective. So the first thing I want to show you is that Apple has now changed that.

You can, of course, go into VoiceOver settings if you have a Bluetooth keyboard by using the shortcut key VO+F8. That’ll get you right into the screen that I’m at now.

And if you flick through the screen a little way, you’ll eventually find this.

VoiceOver: What’s new in VoiceOver.

Jonathan: If I double tap this, …

VoiceOver: Settings.

VoiceOver, heading.

What’s new with VoiceOver in iOS 18, heading.

Live recognition. Detect text, people, doors, furniture, and more by using the live recognition rotor. Live recognition can also be started with a 4-finger triple tap.

Jonathan: You can swipe your way through this and have a look at the new things that are in VoiceOver. This is a welcome addition.

The Live Recognition Rotor

Since Apple has highlighted this new live recognition rotor, let’s look at that first.

Over the years here on this podcast, I have chronicled whenever Apple has added a new feature relating to live recognition. We’ve seen door detection. I think the original one was people recognition, and the most recent one was point and speak.

Before, you’d have to do a 4-finger quadruple tap to get in and play with this live recognition stuff. Now, it’s on the rotor if you choose to add it to your rotor.

To do so, go into rotor settings, double tap the option that exposes all the rotor options that are there (and there’s quite a lot of them now), and make sure that live recognition is added to the rotor.

When you have this added, you can use the rotor gesture anywhere on your device (kind of like twisting a virtual dial), and you will eventually find…

VoiceOver: Live recognition.

Scenes, off.

Jonathan: If I swipe up and down, I’ll find all the different live recognition options that are available in the operating system.

VoiceOver: Point and speak, off.

Text, off.

People, off.

Doors, off.

Scenes, off.

Jonathan: I applaud Apple for adding this new rotor because now that it’s there, I find myself using these functions a lot more than I used to. For example, I often switch the text recognition on, and it seems comparable to Seeing AI. Because it’s right there on the rotor, there’s slightly less friction in getting it going.

Let’s just have a play with this quickly. We’re on scenes right now. So if I double tap, …

VoiceOver: On.

Settings. Back, button.

Daniel: Live recognition starting.

A group of cardboard boxes on a gray surface.

A group of cardboard boxes in a room.

Jonathan: I’m panning the camera around.

Daniel: A group of electronic devices on a white surface.

A person standing in front of a computer monitor.

A microphone on a stand next to a keyboard on a table.

A microphone on a stand next to a keyboard.

A pair of headphones hanging from a ceiling.

Jonathan: Okay, I’ll just turn that off.

So when you double tap, it turns it on, and you then hear your default voice speaking. I have that running at a slightly faster speed than the voice I’m using for this demo, which is the Nuance Vocalizer Tom voice.

You can have as many of these on at the same time as you want. So if I flick down, …

VoiceOver: Point and speak, off.

Jonathan: and flick down, …

VoiceOver: Text, off.

Jonathan: I have had very good luck with this. When I’ve turned on the text, I’ve been able to read packaging and any other text in the vicinity. It’s quite effective.

VoiceOver: People, off.

Jonathan: This was an incredibly useful feature during the era of the pandemic and social distancing. We’ve heard on this podcast about people who use the people feature to find a spare seat on a bus, for example.

VoiceOver: Doors, off.

Jonathan: You may remember when door detection came out, we wandered around the house and looked at the way it was detecting doors. This can be quite useful in an unfamiliar environment.

VoiceOver: Scenes, off.

Jonathan: And then, we’re back to scenes.

It’s important to emphasize that you can have as many of these on as you want, so you can just flick down through the rotor options and double tap each one you want. If you have a number of them turned on at the same time, the voice could get quite busy. But this is a far simpler, more intuitive way of making use of live recognition than the previous way.

Voice Rotor, Customization, Equalization

Next, let’s talk about the new voice rotor in iOS 18.

You now have much more flexibility in terms of the voices you add to VoiceOver.

Previously, you could choose a default voice, and then add one voice for each language. For example, you could have one British English voice, one US English voice, etc.

Now, all those constraints are gone. You seem to be able to add an unlimited number of voices in what’s now called the voice rotor.

To show you this, we need to go into…

VoiceOver: Speech, Button.

Jonathan: and double tap.

VoiceOver: Primary voice, Heading.

Daniel. English United Kingdom, button.

Jonathan: Let’s flick right.

VoiceOver: Additional rotor voices, heading.

Eddie. English United States, button.

Daniel. English United Kingdom, button.

Jonathan: The reason why Daniel is also appearing on the rotor is that my default voice is the compact Daniel voice, and the voice on the rotor here is the premium Daniel.

VoiceOver: Karen. English Australia, button.

VoiceOver: Siri voice 4. English United States, button.

Jonathan’s personal voice. English United States, button.

Jonathan: Yes. That’s an exciting breakthrough in iOS 18, and I’ll talk more about that a little later.

VoiceOver: Ava. English United States, button.

Tom. English United States, button.

Add a rotor voice, button.

Jonathan: If we double tap, …

VoiceOver: Add a rotor voice, button.

Voices added here can be selected in the voice rotor.

Pitch change. Switch button, on.

Detect languages. Switch button, off.

VoiceOver will always speak with the current voice.

Pronunciations, button.

Done, button.

Language, heading.

English, button.

Jonathan: We can go through all the languages for which VoiceOver voices exist. If I double tap the language that I speak, it’s English.

VoiceOver: English.

Jonathan: Now, I can navigate by heading.

VoiceOver: Personal voice, heading.

Jonathan: Again, I’ll come back to that.

VoiceOver: English Australia, heading.

Jonathan: Now, we’ve got all the Australian voices grouped together.

VoiceOver: Karen, button.

Download Karen, enhanced.

Karen, premium. Using 213.5 MB.

Download Lee. 2.8 MB.

Jonathan: All the Australian voices are there, and you can add as many as you want.

If I navigate to the next heading, …

VoiceOver: Siri button, heading. Expanded.

Jonathan: we have the Siri Australian voices available for me to select.

VoiceOver: English India, heading.

Download Isha, 5.2 MB, button.

[Actions available sound]

Jonathan: Now, you heard that sound, which is an optional feature that you can turn on in VoiceOver. Instead of it saying actions available all the time, it can just play that sound to tell you that actions are available.

So what are the actions? If I flick down, …

VoiceOver: Speak sample.

Jonathan: This is a quick way for you to preview the voice.

So if I double tap, …

Isha: Hello! My name is Isha. I am an Indian English voice.

Jonathan: I can flick right.

VoiceOver: Download Isha enhanced. 294.8 MB, button.

Jonathan: And we can hear the difference between the enhanced and the regular one by playing the enhanced speech.

VoiceOver: Speak sample.

Isha: Hello! My name is Isha. I am an Indian English voice.

Jonathan: A lot more clarity, a lot more inflection there.

You can have plenty of fun previewing these voices. If you want to install one after previewing it, you can flick back up, …

VoiceOver: Activate, default.

Jonathan: double tap, …

VoiceOver: Isha, Enhanced. 294.8 MB.

Download Isha Premium, 337.4 MB, button.

Rishi, Button.

Download Rishi.

Stop Downloading Isha, Enhanced. 46% downloaded, button.

Jonathan: So the download has now begun, and it won’t take too long for that download to complete.

VoiceOver: Stop Downloading Isha, Enhanced. 92% downloaded, button.

Jonathan: It should be there now, I would say.

VoiceOver: Isha, enhanced. Using 294.8 MB, button.

Jonathan: And if you need to free up some space, we can flick up, …

VoiceOver: delete.

Jonathan: and delete the voice.

So add voices to your heart’s content with a much more flexible voice rotor system.

Once you’ve added these voices, of course, you use the rotor.

VoiceOver: Live recognition.

Voices.

Jonathan: Now, I’ve found voices on my rotor. And if I flick down, …

VoiceOver: Daniel, primary voice.

Eddie.

Daniel.

Karen.

Siri voice 4.

Jonathan’s personal voice.

Ava.

Tom.

Jonathan: And now, we’re back to Tom which I’m using throughout this review.

What is super interesting to me is that you can add multiple versions of the same voice. You might want to do this because there’s a lot of configurability now in terms of the pitch, the speed, the timbre of the voice.

And you may, for some reason, want different versions of the same voice for different scenarios. This could be particularly compelling if you set up activities, so that when you go into a particular app where you like the voice to read to you in a certain way, you can invoke that particular version of the voice.

I’ve gone back into the Add Voice button, chosen English, and I’ve chosen the Tom voice a second time. Now that I’m in there, if I flick right, …

VoiceOver: Tom 2, button.

Jonathan: There’s a button there where I can rename this voice to something else. By default, it will have the voice name and the number 2 after it to indicate that this is the second installation of the same voice. But you can double tap this, an edit field will pop up, and you can call it anything you like.

So how much flexibility is there in terms of the way these voices can sound? Quite a bit. If I flick right, …

VoiceOver: Rate, heading.

Decrease speed, image. An illustration of a group of gray shapes. Slider. 50%. Adjustable.

Pitch, heading. Waveform badge.

Jonathan: So we’ve got rate and pitch.

VoiceOver: Audio effects, heading.

Equalizer, button.

Jonathan: This is new. You can now equalize each voice to your specifications. This is a huge deal for people with hearing impairments. People who are using made for iPhone hearing aids connected to their iPhones will really benefit from being able to have a voice that is optimized for their particular hearing loss.

Let’s have a look at how much power there is in here. There’s a lot of it.

I’ll double tap.

VoiceOver: Enabled. Switch button, off.

Jonathan: I’ll leave it disabled for now, but I’ll flick right and show you how much flexibility we have.

VoiceOver: Ultra lows, heading.

Jonathan: The first is ultra low.

VoiceOver: Ultra low, 0. Adjustable.

Jonathan: You can set this all the way up to 20, and all the way down to -20. The trick here is to just play with these settings until you get a voice that’s sounding good to you.

If I flick right, …

VoiceOver: Lows, heading.

Jonathan: We go from ultra lows to lows.

VoiceOver: Low, 0. Adjustable.

0.0 decibels.

Mids, heading.

Mid, 0. Adjustable.

Jonathan: Now, we’re adjusting the mids, and you’ll note that each part of the screen is handily navigable by heading. It’s very well done.

VoiceOver: 0.0 decibels.

Highs, heading.

Highs, 0. Adjustable. 0.0 decibels.

Jonathan: That’s what’s on the screen, and that’s what’s on all the screens for the different bands when you go in here. This is a very cool, comprehensive equalizer, and you can do this now for each voice.

There are also various styles that you can add to the Siri voices, and this could be a reason why you might want to add multiple versions of the same voice. So if we go into the Siri voice that I have set up which is the American Voice 4, …

VoiceOver: Siri Voice 4. English United States, button.

Jonathan: We’ll see how this works.

VoiceOver: Rate, heading.

Decrease speed image.

Jonathan: I’ll just navigate by heading at this point.

VoiceOver: Pitch, heading.

Volume, heading.

Voice presets, heading.

Jonathan: Here’s the presets.

If I flick right, …

VoiceOver: Default, button.

Even inflection, button.

Faster pace, button.

Even inflection and faster pace, button.

Selected. Narration, button.

Voice presets change the overall speaking style of a voice. They can be further customized with specific voice settings.

Jonathan: I have mine set to narration at the moment. Let’s invoke this voice.

VoiceOver: Live recognition, scene.

Characters.

Voices.

Jonathan: There we go.

VoiceOver: Ava.

Jonathan’s personal voice.

Siri voice 4.

Jonathan: Let’s get it to do some reading while the narration preset is active.

VoiceOver: Voice presets change the overall speaking style of a voice. They can be further customized with specific voice settings.

Jonathan: Let’s change the preset.

VoiceOver: Selected. Narration.

Even inflection and faster pace.

Faster pace.

Jonathan: Alright, I’ll double tap that.

VoiceOver: Selected. Faster pace.

Jonathan: And now, we’ll go back to that same sentence.

VoiceOver: Narration. Voice presets change the overall speaking style of a voice. They can be further customized with specific voice settings.

Jonathan: So you see that it’s sped up, but it’s also changed its inflection. It’s much less animated.

We’ll go back.

VoiceOver: Narration, button.

Even inflection and faster pace, button.

Selected. Faster pace.

Even inflection, button.

Jonathan: We’ll go to the even inflection preset.

VoiceOver: Selected. Even inflection.

Voice presets change the overall speaking style of a voice. They can be further customized with specific voice settings.

Narration, button.

Jonathan: I’ll double tap that again.

VoiceOver: Selected. Narration.

Jonathan: And now, you’ll hear the difference.

VoiceOver: Voice presets change the overall speaking style of a voice. They can be further customized with specific voice settings.

Jonathan: And you have this for all of the Siri voices. So you have plenty of flexibility, and you can have a lot of fun customizing these voices.

Use Your Personal Voice with VoiceOver

Now, we’ve heard as we’ve traversed all these new options that it’s now possible to use a personal voice in VoiceOver.

Personal Voice was introduced in iOS 17 as a way of preserving the voice of someone who may be losing it. It can be made to speak certain phrases that you type, or that are stored on your device.

But the capability has been rolled out more widely in iOS 18. It can now be used with VoiceOver and third-party apps. Voice Dream Reader, for example, can now read books using personal voice.
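
For developers curious how a third-party app gets at this, Apple exposes Personal Voice through the AVSpeechSynthesis APIs introduced in iOS 17. The short Swift sketch below is purely illustrative and not something from the podcast: the PersonalVoiceReader class and speakWithPersonalVoice method are hypothetical names, and whether a voice is returned depends on the user having built a Personal Voice and allowed the app to use it.

import AVFoundation

// Hypothetical helper, for illustration only.
final class PersonalVoiceReader {
    private let synthesizer = AVSpeechSynthesizer()

    func speakWithPersonalVoice(_ text: String) {
        // Ask the user for permission to use their Personal Voice (iOS 17 and later).
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }
            // Pick the first voice the user has created and shared as a Personal Voice.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice // falls back to the default voice if none is shared
            self?.synthesizer.speak(utterance)
        }
    }
}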

To set this up, you need to go into Settings, then Accessibility, navigate to the Speech heading, and choose Personal Voice. The phone will prompt you through the process. You’ll need to find a quiet environment and record 150 phrases.

Once you’ve recorded the phrases, the building process can take a while. You’ll need to keep your iPhone locked and connected to power while the personal voice is built. If you set it going overnight, you should find that it’s ready for you in the morning. Now, this all happens on your device.

About 24 years ago, I got contacted by a company that at that time was making text-to-speech engines. They wanted me to go into their studio, record a lot of phonemic noises, and I guess phrases and things, and they wanted to turn me into a text-to-speech engine. This sounded hideous to me, so I didn’t get it done.

But now, I can do this in the privacy of my own home. And I’ve got to say, it does sound a bit like me.

I’m going to find this voice on the voice rotor.

VoiceOver: Live recognition.

Actions.

Voices.

Ava.

Jonathan’s personal voice.

Jonathan: Oh boy! Here he is.

So if I just flick through here, …

VoiceOver: Voices added here can be selected in the voice rotor.

Pitch change. Switch button, on.

Jonathan: It’s quite freaky hearing yourself speaking back to you as a TTS.

If I go to the home screen, …

VoiceOver: Settings.

Smart home folder.

App store.

Sonos beta.

Aira Explorer.

Fantastical.

Castro.

Whatsapp.

Jonathan: It’s reasonably responsive. A little bit laggy, but nothing too serious to complain about.

So that is personal voice. If you take the trouble to set this up, you can now have yourself reading to yourself.

A New Approach to Audio Ducking

Let’s talk about audio ducking because there have been some changes there as well. There’s now considerably more flexibility when it comes to audio ducking.

This is the feature that turns down other media that’s playing when VoiceOver talks, so you can hear VoiceOver more clearly. You can add these features to the rotor if you want to. But I don’t want my rotor to be too cluttered, so I’ve added them to the quick settings feature on my iPhone.

The first thing we’ll notice is the volume for VoiceOver.

VoiceOver: Volume. 154%. Button, adjustable.

Jonathan: Why is the volume at 154%? That seems like a strange percentage for a volume control.

The reason for that is that once you get the volume past 100% now, the higher the volume, the more pronounced the ducking is going to be. So this is a matter of personal preference. Just how dramatic do you want the ducking to be? I have mine relatively high because as somebody with a hearing impairment, I can find background noise to be quite distracting.

The other feature pertaining to audio ducking is important.

VoiceOver: Audio ducking. When speaking, button. Adjustable.

Jonathan: Before iOS 18, we had 2 options for audio ducking: it was either on or off. When it’s off, VoiceOver doesn’t duck other audio at all. When it’s on, it does.

Now, we’ve got 3 options. The first is the equivalent of the old on setting: when VoiceOver is speaking, the volume of the other audio that’s playing will duck. But you can also turn it off like before, or have the audio always duck. This means that you can strike a permanent balance between VoiceOver’s volume and the volume of other media.

So have a play with these features. You’ll find, I think, a sweet spot for you, a set of parameters that allow you to work efficiently.

VoiceOver Tutorial

I don’t use a Mac anymore, but I did buy one in 2012 and had it for about 4 years. One of the things that really impressed me about setting it up was the comprehensive VoiceOver tutorial, and it’s always surprised me that Apple didn’t build one into iOS.

Well, now they have. If you go into Settings, choose Accessibility and then VoiceOver, and flick through, you will eventually find the VoiceOver Tutorial button.

If I double tap, …

VoiceOver: Select.

Collections page.

Learn how to use VoiceOver.

In this tutorial, you’ll learn how to use VoiceOver to navigate and use your iPhone. Some instructions may not apply if you’ve customized VoiceOver, and VoiceOver may be restricted throughout the tutorial., button.

Jonathan: I’ll flick right.

VoiceOver: Collections.

Continue from where you left off.

Pause voiceover speech, button.

Jonathan: I’ve been playing with this to see what it’s like. It seems very comprehensive and interactive. And even more impressively, when you go back in, you can pick up from where you’ve left off so you don’t have to do the tutorial in one go.

VoiceOver: Learn introductory VoiceOver interaction gestures, button.

Jonathan: We’ll just show you this, and you can explore this on your own.

VoiceOver: Select the next item.

VoiceOver automatically reads aloud the selected on-screen item like text, or a button. You can control which item is selected by moving the selection.

To select the next on-screen item, swipe from left to right with one finger. Swipe one finger from left to right now to move to the next page.

Jonathan: I’ll do that.

VoiceOver: Continue, button.

Select the previous item.

You just practiced the gesture to select the next item.

To select the previous on-screen item, swipe from right to left with one finger.

Continue, button.

Jonathan: I’ll do that.

VoiceOver: Swipe one finger from right to left now to move to the next page.

Activate an item.

Some items can be activated, which performs their default action. This is similar to clicking the mouse on the item, or pressing a button to submit a form.

To activate the selected item, tap the screen twice with one finger. Double tap with one finger to move to the next page.

Practice navigating.

Now, you’ll practice the commands you just learned by navigating.

Jonathan: I’ll just stop that there.

So the VoiceOver tutorial is very well done. A bit overdue, I have to say, Apple, but it’s here now, and that’s great.

Advertisement: Living Blindfully is brought to you in part by the Disability Disrupters Podcast.

Are you ready to break barriers and create a rich and fulfilling life despite the challenges we all face as disabled people? Check out the Disability Disrupters Podcast bringing you news, views, and interviews with disabled people around the world.

Disability Disrupters brings you insightful news. Stay updated on the latest happenings in the disability community from a disabled person’s perspective.

Diverse voices. Hear from inspiring individuals sharing their personal journeys, including struggles and successes.

Practical advice. Learn from others about navigating barriers to community participation.

Empowering conversations. Discover strategies to create a fulfilling life on your own terms.

Join the monthly discussion on the Disability Disrupters Podcast. It’s available on the Disability Responsiveness website at www.drnz.co.nz. That’s www.drnz.co.nz, or search for it in your favorite podcast app. Disrupters is spelt with an E-R.

The Disability Disrupters Podcast. Let’s disrupt the status quo, and show the world what we can achieve together.

Haptic Vibration at VoiceOver Start

It’s a little thing, but if you quit VoiceOver and then start it again, you’ll feel a small haptic vibration when VoiceOver starts. This could be useful if you’re having some other hardware issue and not getting any speech feedback: the haptic lets you confirm that VoiceOver is on. If you’d prefer it not to do this, you can switch it off. In VoiceOver’s settings screen, go to Audio, then the sounds and haptics settings, and there’s an option there to toggle the startup haptic vibration off.

Delay Before Selection

To discuss the next new VoiceOver feature, let me paint a picture for you.

You’re lying in bed, and you’re reading a news website. The articles aren’t long, so you’re not going to put the phone on a bedside table while it reads to you because you know you’re going to be tapping the phone and flicking through items very soon. Your finger’s hovering close to the screen, and you find, maybe as you get a little dozy, that you accidentally touch the screen. And the moment you touch the screen, VoiceOver’s going to respond. It is quite sensitive.

To deal with this, there’s a new feature towards the bottom of the main VoiceOver settings screen called Delay Before Selection. This is set at its default of 0 seconds, which means that the moment your finger touches the screen, VoiceOver is going to respond and tell you what’s under your finger. If you increase the setting, your finger’s going to have to be on the screen a bit longer before VoiceOver acknowledges that you’re touching the screen and responds.

This is quite a unique setting because it only affects when you touch the screen with a finger. Swiping (sometimes called flicking) is not affected at all. So if you flick left or flick right, it’s going to be as responsive as it always was. This exclusively impacts how long your finger must be on the screen, whether resting steadily or dragging around to explore by touch, before VoiceOver responds.

Some people may find it very helpful to increase this a little bit. It goes all the way up to 2 seconds. That’s probably overkill for many people. However, if you have an additional disability that means that your fingers can be a little unsteady on the screen, this could really help you use VoiceOver.

Separate Braille Input and Output Tables

Next, let’s move on to Braille features. There are some good ones in iOS 18.

First, you can make different choices for your Braille input table and output table. If you don’t want to do this, you can switch it off and I’ll show you that in just a moment.

Assuming that you have a Braille display connected, and that you’ve gone into rotor settings and ensured that the new Braille input and output options are selected, we can use the rotor gesture to locate the appropriate items.

I’ll move left. I think that’s the quickest way for me to get there.

VoiceOver: Live recognition, scenes.

Voices.

Describe images.

Screen recognition.

Input Braille table. English unified. Contracted. Liblouis.

Jonathan: I can flick down through my choices.

VoiceOver: English US. 8 dot.

English Unified. Uncontracted. System.

English Unified. Contracted. System

English 8 dot.

English Unified. Uncontracted.

Jonathan: Now, we’re back to the one that I typically use.

The advantage here of being able to separate input and output is that if you are still finding input in Braille a little bit idiosyncratic (and I think it has improved markedly over the years), you might elect, for example, to Braille in uncontracted Braille, but to read the output in contracted Braille.

So if I use the rotor gesture to go to the previous rotor item, …

VoiceOver: Output Braille table, English Unified. Contracted. Liblouis.

Jonathan: Now, we’ve got the output table.

If you find yourself changing Braille tables from time to time but you can’t envisage a time where you would choose a different input from your output, then you can consolidate this to the way it was.

I’m in Braille settings in VoiceOver now. And if I flick right, …

VoiceOver: Braille, heading. Match input and output tables. switch button, off.

Jonathan: I’ll double tap.

VoiceOver: On.

Jonathan: Now, let’s explore the rotor.

VoiceOver: Containers.

Speaking rate, 50.

Words.

Characters.

Edit.

Live recognition.

Voices.

Describe images.

Screen recognition.

Braille table. English unified. Contracted. Liblouis.

Jonathan: Now, it’s gone back to the pre-iOS 18 behavior where there’s just one rotor item for Braille table. And when you change it, it’ll affect both input and output.

If you never or seldom change your Braille table, you can remove this from the rotor altogether and make any changes in settings when necessary, keeping your rotor a bit less cluttered. And we know there are a lot of things that can end up being on your rotor that you seldom use.

Reconnect Braille Display

There’s now a reconnect Braille display option. You can do this in a couple of ways.

If you restart VoiceOver, it’ll reconnect your Braille display. So that’s a quick and dirty way of trying to re-establish the connection, if things aren’t going the way you’d hoped. We know that Bluetooth can be a bit finicky from time to time.

Or, you can assign a gesture on your touchscreen to the reconnect Braille display option, execute that gesture with VoiceOver running, and VoiceOver will attempt to reconnect the Braille display.

Item Chooser for Braille

An excellent feature that Apple added recently for Braille display users is that if you’re on the home screen and you want to launch an app, you can press enter, type in the name of the app, and press enter again. You’ll get a list of the apps that match your search criteria and you can choose from there. It’s very efficient.

There’s also a similar thing available for Braille screen input users. Of course, you’ve been able to do this kind of search for a long time with Spotlight Search, either with the virtual keyboard or, once they were supported in iOS 4, a physical keyboard. Well, in iOS 18, Apple has extended this concept, and they’ve called it the item chooser for Braille.

So I’m in the Braille settings screen at the moment. I’m going to go to the top of it, …

VoiceOver: VoiceOver. Back, button.

Jonathan: I’m going to press enter on my Braille display. I have my APH Mantis connected.

Now I have a blank display with just a cursor at the beginning; that’s all I see on the Braille display.

I’m going to type the word duration because I’m pretty familiar with the screen, and I think there is more than one occurrence of the word duration.

So I’ve typed that, and I’m going to press enter.

Now, what’s come up is auto advanced duration.

I’ll use the command to move to the next item, and I see ignore chord duration.

I think that’s all. I think the word duration only appears twice on the screen.

I’m going to use the flick right gesture on my Braille display to go to the next item. And sure enough, it has wrapped around again to auto advanced duration.

I’ll go to the next one, ignore chord duration.

And now, I’m going to press enter.

VoiceOver: Ignore chord duration, 0.3S, button.

Jonathan: And that has jumped me straight to that item.

This is a chance for me to give a bit of a plug for the good old item chooser in general, because there has been a speech-based item chooser in VoiceOver for iOS for a long time. It can be a huge time-saver if you’re on a busy screen and you know what you’re looking for, or can at least think of a keyword on the screen to search for.

If you’re not familiar with the item chooser, see what happens when you try incorporating it into your daily iOS workflow. You may find it a real time-saver.

To use the regular item chooser, you perform a 2-finger triple tap.

Braille Screen Input Gets a Significant Revamp

Next, I’m going to spend a lot of time talking about a significant update in iOS 18 to Braille screen input.

We talk a lot about Braille screen input on Living Blindfully for good reason. If you know Braille, this is a fantastic way to get content into your device. It’s more private and more accurate than dictation. It’s very fast when you get the hang of it, and you don’t need to carry along any other accessory just to input a lot of data into your device.

If you want a demonstration and an explanation of Braille screen input in terms of the basics, which I won’t repeat here, you can go back to episode 144 of this podcast. Audio and a transcript are available of that episode, and the venerable Judy Dixon shows us Braille screen input and how to use it.

There will be a bit of overlap between what Judy said in episode 144 and what I’ll show you now. But Braille screen input has been improved significantly in iOS 18, and I congratulate everybody involved at Apple for doing such a brilliant job of this.

Potentially, Braille screen input can now be used as an alternative user interface. If you are happy in Braille land, you can feasibly set your iPhone up so that it seldom comes out of Braille screen input, and you can use your iPhone very effectively as a Braille user. It is just brilliant.

One of the big changes in iOS 18 where Braille screen input is concerned is that you now have 2 modes. There’s the mode that we’ve always had which remains largely unchanged where you can input Braille into your device, but there’s also now a command mode. You toggle between these modes when Braille screen input is active with a 3-finger flick. It doesn’t matter whether you flick right or left. Both will work. I tend to do a 3-finger flick right for some reason, but that’s just habit.

Before we activate Braille screen input though, let’s configure it because there are quite a few new configuration options. To do this, we want to be on the main VoiceOver screen, and then go to…

VoiceOver: Braille. APH Mantis Q40.

Jonathan: It’s speaking the name of my Braille display here, which may be a bit misleading, because even if you don’t have a Braille display connected, you can still go in here and configure Braille screen input.

I’ll double tap.

VoiceOver: Match input and output tables. Switch button, on.

Jonathan: I’ll flick right.

VoiceOver: Input and output. English Unified. Contracted, button.

Jonathan: And then, we only need to do one more flick right to get to what we’re interested in.

VoiceOver: Braille screen input, button.

Jonathan: I’ll double tap.

VoiceOver: Type in Braille, launch apps, and control your device using Braille screen input.

Place one finger from each hand at the top and bottom edges of the screen, and double tap to start Braille screen input. To exit Braille screen input, slide two fingers in opposite directions.

Jonathan: All of that is new. First, the explanation from Apple is welcome. They’ve taken some time to provide some good documentation and hints for Braille screen input. And as you’ll hear, the way that you invoke Braille screen input is different by default. We’ll come back to that.

I’ll flick right.

VoiceOver: Learn more.

Jonathan: And if we double tap learn more, there is a lot of information here, which will hopefully be a good refresher for you in terms of all the commands that are available in Braille screen input.

I’ll flick right.

VoiceOver: Use activation gestures. Switch button, on.

Jonathan: As VoiceOver told us before, you can now invoke Braille screen input by default in a different way. Rather than going to the rotor, you can double tap. And it’s an interesting gesture. You put one finger at one end of the phone and another finger at the other end, and you double tap with both fingers, and that will invoke Braille screen input. The cool thing about this is that you can do it from anywhere. You don’t have to move through all your rotor options to get to Braille screen input.

The rotor choice for Braille screen input used to be available only where Braille input was possible. Now, though, because you can use Braille screen input to give commands to the device and navigate rather than just type, it’s available everywhere, which is great. But if you don’t like it, you can switch this off. VoiceOver explains this if we flick right.

VoiceOver: Without using the gestures, you can still start Braille screen input using the rotor. When using the rotor to exit Braille screen input, move both fingers.

Jonathan: I’ll flick right.

VoiceOver: Start automatically when editing text. Switch button, off.

Jonathan: I have this off, only because I do use my Mantis quite a bit, particularly in my office/studio here. But if Braille screen input is what you use all or the vast majority of the time, this is a cool time-saver. What this means, if you turn this on, is that whenever an edit field comes up, Braille screen input will be invoked, ready for you to type in Braille. No need to use the rotor, or even to invoke it with any special gesture.

VoiceOver: When a text control is activated, Braille screen input will start automatically in Braille entry mode.

Keep active until dismissed. Switch button, off.

Jonathan: VoiceOver explains this clearly if I flick right.

VoiceOver: After you launch an app or choose an item, Braille screen input will stay active in command mode.

Swipe left or right with 3 fingers to exit command mode.

Jonathan: So let’s say I enable this. I’m at my home screen, and I invoke Braille screen input and I type the partial name of an app like Mona, if I want to do something on Mastodon, and then I flick right with 2 fingers to launch it. At that point, if you have this setting enabled, Braille screen input will continue to be active and you can navigate your app using command mode in Braille screen input.

So again, let me emphasize. If you switch these things on, you essentially change the whole user interface of your phone and can use it pretty much permanently as a Braille user. But you can opt out of it at any time, of course.

VoiceOver: Reverse dot positions. Switch button, off.

Jonathan: That is not a new option.

VoiceOver: In 6 dot Braille, the positions of dots 1 and 3 and dots 4 and 6 will be swapped.

Visual text feedback. Switch button, off.

Each time a chord is entered, the text translation of the Braille will be visually shown on the screen.

Jonathan: That could be good for anyone who’s teaching someone Braille.

VoiceOver: Typing feedback, heading.

Sound. Switch button, on.

Jonathan: It makes some cool sounds now, and I quite like this when you’re typing away in Braille. It sounds like the kind of sound that is made when sighted people are typing away on their virtual keyboards.

VoiceOver: Haptic. Switch button, on.

Mode announcements, heading.

Jonathan: Under the mode announcements heading, you can choose the kind of feedback you get when you change modes.

VoiceOver: Selected. Speak and play sounds.

Speak.

Play sounds.

Jonathan: Those are your 3 options.

VoiceOver: Choose a Braille table, heading.

English Unified. Uncontracted. System.

Selected. English Unified. Contracted. System

Jonathan: That’s what I want.

VoiceOver: English Unified. Uncontracted. Liblouis.

English unified. Contracted. Liblouis.

Jonathan: Actually, I might change that to the Liblouis one.

I’ll double tap.

VoiceOver: Selected. English Unified. Contracted. Liblouis.

Jonathan: And those are the options that we have for Braille screen input.

I’ve got it configured the way that I want, and I’m just going to invoke Braille screen input right from the screen. Why wait? No time like the present. So to do this, I’m going to use the new method.

I like to use Braille screen input in tabletop mode because when it’s in tabletop mode, it feels like I’m Brailling on a virtual Perkins, and I’ve done that for most of my life so it’s very familiar. It’s second nature. I don’t have to think about it.

I’m going to execute this new gesture which, for me (with a Pro Max phone, so it’s a large phone), definitely requires 2 hands. But that’s okay, because I’m going to be Brailling with 2 hands anyway.

I’ve got the phone positioned the way I want. It’s in landscape mode. I’m going to use a finger on my left hand to tap the screen at the top end of the phone. And at the same time, I’m going to use a finger from my right hand to tap the screen at the bottom end of the phone. We’re going to do this twice, so a double tap.

VoiceOver: Braille screen input.

Jonathan: We’re in Braille input mode at the moment, rather than command mode. But what’s the point of that, you might be asking, since there’s no edit field here to Braille into?

What happens in this situation is that you effectively get a Braille screen input item chooser. So let’s say that I’m interested in the reverse dot positions option (which I’m not, but it must be there for a reason; some people must like it), and I want to find it on the screen.

The first thing I’ll do is calibrate the dots by executing a dot 4 5 6 very quickly, followed by a dot 1 2 3.

VoiceOver: Dot positions calibrated.

Jonathan: Now, I’m going to start typing reverse.

VoiceOver: R, 9 items.

E, 7 items.

V, 1 item.

Jonathan: Now, if I flick down, …

VoiceOver: Reverse dot positions, off.

Jonathan: And there it is.

If I perform a 2-finger flick right here, …

VoiceOver: Moving to reverse dot positions, off.

Stopping Braille screen input.

Jonathan: I’m now on reverse dot positions, and Braille screen input has stopped. That’s because of the way I have those Braille screen input settings configured, the ones we just looked at. If you want to stay in Braille screen input, you can enable that feature in this settings screen that we’re still on.

I’m going to go back into Braille screen input by performing that double tap gesture with 2 fingers at either end of the phone.

VoiceOver: Braille screen input.

Jonathan: Now I’m going to go into command mode, and I’ll do that by flicking right with 3 fingers.

VoiceOver: Command mode.

After you launch an app or choose an item, Braille screen input will stay active in command mode. Swipe left or right with 3 fingers to exit command mode.

Jonathan: I’m in the Braille screen input settings screen still, so that’s why we’re hearing that.

What does command mode actually mean? Well, it’s as if you had a Braille display in front of you, and you’ve got the spacebar held down. So all you have to do to navigate around your phone is become familiar with the default Braille commands for iOS, and use them without the spacebar.

For example, if I press the letter K, it would be like pressing K chord on a physical Braille display, and that gets me into keyboard help.

VoiceOver: Starting help. To stop help, perform a 4-finger double tap or 2-finger scrub, or press escape on the keyboard.

Jonathan: If you’re a seasoned Braille user, you will know that pressing dot 3 6 chord is like performing a double tap. It activates the item.

So now that I’m in help mode, if I press dot 3 6 on the virtual keyboard, …

VoiceOver: Activate.

Jonathan: It confirms that that will activate. So this is a good way to get familiar with the different commands.

If I flick down, …

VoiceOver: Next rotor.

Jonathan: and I flick up, …

VoiceOver: Previous rotor.

Jonathan: Now if I press the letter H, we know that H chord is the home command.

VoiceOver: Home.

Jonathan: That’s correct. So just by pressing H, I can go home and exit the application that I’m in.

What about L chord, dots 1 2 3?

VoiceOver: Move to first item.

Jonathan: And dots 4 5 6…

VoiceOver: Move to last item.

Jonathan: So this is a good way to get familiar.

If I press an O-W, …

VoiceOver: Scroll left.

Jonathan: And O…

VoiceOver: Scroll right.

Jonathan: If I press dots 4 6, …

VoiceOver: Notification center.

Jonathan: And you get the idea. If you’re not familiar with these commands because you haven’t had access to a Braille display before, going into help by doing a 3-finger flick right to get into command mode, and then pressing the letter K, which is equivalent to K chord, is a good strategy to become familiar. There’s documentation online as well.

And if you want to exit help, just do the K again.

VoiceOver: Start help.

Stopping help.

Jonathan: So it’s a brilliant brilliant implementation of this. I’m very very impressed.

We are still in command mode. I’m going to go home at this point.

VoiceOver: Messages.

Jonathan: I pressed H to do that.

Now if you want to get straight into command mode, you can do that. Rather than doing that 2-finger gesture that I described where you have a finger at either end of your phone, do the same kind of gesture, but triple tap. When you do that, a triple tap, then you will go straight into command mode rather than input mode.

Now, I’m going to perform a 3-finger flick right to get back into input mode.

VoiceOver: Braille entry.

Jonathan: Braille entry mode, that’s what they call it.

I’m going to type D R A.

VoiceOver: D, 122 apps.

R, 7 apps.

A, 2 apps.

Jonathan: And maybe an F.

VoiceOver: F, 1 app.

Jonathan: There we go. You can then flick down, …

VoiceOver: Drafts.

Jonathan: and you can flick down through all these choices. Of course, if you have a whole lot of choices, just flick down to find the app that you want.

Now, I’ll perform a 2-finger flick right to get into drafts.

VoiceOver: Stopping Braille screen input.

Opening drafts.

Drafts. Text field. Is editing. Insertion point at start.

[more actions available sound]

Jonathan: We’ve covered Drafts extensively in episode 238, if you’d like to learn about that app. I spend a lot of my life in Drafts these days. It is a fantastic app.

And because of the way that I have Braille screen input configured at the moment, I’ve been bounced out of it now that the app is active, so I’ll just invoke it again.

VoiceOver: Braille screen input.

Jonathan: Very easy to do.

And now, we’re in input mode, so I’m just going to calibrate my dots.

VoiceOver: Dot positions calibrated.

Jonathan: I’ll just Braille a few things so you can hear what it sounds like.

VoiceOver: This is Jonathan Brailling into Drafts for…

Jonathan: Now, this is new, too. Finally, you can write the for sign on an iPhone. You never used to be able to do this. You could do it on an iPad, but not on an iPhone. So I’ve just written the for sign. Let me complete the sentence.

VoiceOver: Living Blindfully.

Jonathan: So you can be very fast with Braille screen input. It’s fantastic, and I’ve written quite lengthy documents using it. I far prefer it to dictation.

You can, of course, navigate the screen when you’re in this mode if you want to do some editing or something like that. And the way to handle this is to put one finger anywhere on the screen, …

VoiceOver: In exploring mode.

Jonathan: I think there might be a bit of a bug there with it speaking that way.

With my other hand, I can now flick down.

VoiceOver: In exploring mode. 3.

Lines.

Words.

Characters.

Lines.

Jonathan: So we’re now selecting the unit that we’re moving by. I’m going to set it to words.

VoiceOver: Words.

Jonathan: So all this time, I’m keeping that one finger held down on the screen. That’s important. But with my other hand, I can flick left.

VoiceOver: Blindfully Living.

Jonathan: I’m working my way back through the document.

VoiceOver: For Drafts into Brailling Jonathan.

Jonathan: If I flick right, …

VoiceOver: Jonathan Brailling into Drafts for Living Blindfully.

Jonathan: Now, you can also select text in this mode by using 2 fingers instead of one, and you can select and unselect text. If you want to navigate the screen, remember, you can flick with 3 fingers to the right and you’ll get back in command mode.

If you want to exit Braille screen input, you can perform a rotor gesture. But there’s actually a much better way, and that is to perform a pinch gesture. So what I find works for me is to put 2 fingers on the center of the screen and just separate them. So one’s going to the left, and the other’s going to the right.

VoiceOver: Stopping Braille screen input.

Jonathan: The good thing about this method is that your rotor’s not affected, and you just popped right out of Braille screen input and you’re back into the standard user interface.

There’s a lot of power with Braille screen input, and you don’t have to have any accessory with you to be efficient in inputting data. With Braille screen input in iOS 18, there’s no doubt that Apple has totally hit it out of the park.

Custom Vocabulary With Voice Control

Now, let’s move on to some other accessibility features that may be of interest to this audience.

All the way back in episode 98, I started extolling the virtues of Apple’s voice control feature. I don’t know how many people in the blind community use it, and I guess we’re not the primary audience, but I find it incredibly helpful. When I’m walking around getting other things done, I can use commands I’ve created myself that are far more streamlined than the defaults (most often for reading posts on social media), and it really does save me a lot of time to be hands-free and controlling my phone in this way.

One downside of voice control in the past was that you couldn’t add your own vocabulary to it, and if you’re using this for serious work, if it’s your primary mode of input, it’s important to be able to do that. Of course, with apps like Dragon, you’ve been able to do that. Now, you can do it in iOS 18 with voice control.

It’s straightforward to do this. All you have to do is locate the voice control settings on the accessibility screen, and you’ll find a new vocabulary option. You can import a document containing words that you want voice control to be aware of. A significant enhancement for voice control users.

Vocal Shortcuts

While we’re talking about voice-related things, I want to show you a significant new feature that I think could add a lot of value for you. This is called vocal shortcuts.

I’m sure that the primary purpose of vocal shortcuts is to create shortcuts for people with a speech disability, to make things easier to say with your voice. But like a lot of accessibility features, there are others who can benefit as well. When you set up vocal shortcuts, you can give your phone commands without even having to invoke Siri, and you can do this when your phone is locked. Setting it up is pretty straightforward, and you can have an enormous amount of fun customizing vocal shortcuts to do things that you do regularly and really make your iPhone your own.

To find this, you go into Accessibility Settings under Speech. And then, if you flick through, you’ll find…

VoiceOver: Vocal Shortcuts. Off, button.

Jonathan: I’ll double tap. I’m going to set my first vocal shortcut up.

VoiceOver: Set up vocal shortcuts, button.

Jonathan: And as is pretty common with this sort of screen, if we flick right, we’ll get some additional context.

VoiceOver: Teach your iPhone to recognize a custom phrase that you can say to quickly perform an action.

Audio is processed on iPhone.

Jonathan: And if you were to flick right again, you’d get a learn more button that’ll tell you even more about this feature.

The significant thing here is that audio is processed on the iPhone so it is secure, but it’s also quite fast.

I’ll flick back left, …

VoiceOver: Set up vocal shortcuts, button.

Jonathan: and we’ll set this up.

VoiceOver: Cancel, button.

Set up vocal shortcuts, heading.

You’ll choose an action and record a phrase, in order to teach your iPhone to recognize your voice.

Continue, button.

Siri, heading.

Jonathan: Now, we’re on the screen where we begin setting up the shortcut, and the functions are divided by heading.

The first is Siri. What this means is that anything that you can say to Siri, you can say in a vocal shortcut. This is going to be much more beneficial with iOS 18 as Siri becomes more capable. But if you want a quick way of asking for the latest score from your favorite sports team or the weather outside, you can set this up here. Anything that Siri can say, you can assign to a vocal shortcut.

If I navigate to the next heading, …

VoiceOver: Shortcuts, heading.

Jonathan: Shortcuts are powerful. If you haven’t had a play with these, it’s amazing what you can string together to get your iPhone to perform a range of complex tasks with a single command of some kind.
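
A quick aside for any developers in the audience: the actions that third-party apps contribute to the Shortcuts app come from Apple’s App Intents framework, which is also what lets a shortcut (and, by extension, a vocal shortcut) reach inside an app. Here is a minimal, hypothetical sketch of what exposing one such action might look like; the intent name and behavior are invented for illustration and aren’t taken from any real app.

import AppIntents

// Hypothetical example of an app exposing an action to the Shortcuts app.
// The name and behavior here are illustrative assumptions only.
struct StartVoiceChatIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Voice Chat"

    // Bring the app to the foreground when a shortcut runs this action.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // A real app would switch its UI into voice conversation mode here.
        return .result()
    }
}

Once an app ships an intent like this, it appears as an action in the Shortcuts app, where it can be chained with other actions and then, as we’re about to see, bound to a spoken phrase.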

We’ll come back to shortcuts in a moment in the context of vocal shortcuts. But now, I’ll navigate to the next heading.

VoiceOver: System.

Jonathan: We have system functions.

And finally, …

VoiceOver: Accessibility, heading.

Jonathan: Let’s explore some of these. I’m going to start by navigating back by heading until I get to shortcuts.

VoiceOver: System, heading.

Siri, heading.

Jonathan: Oh, it’s bypassed shortcuts. I’ll go right.

VoiceOver: Shortcuts, heading.

Jonathan: There it is.

I’ve got a lot of shortcuts. I use shortcuts an awful lot.

One shortcut that I use, probably one of the simplest ones I’ve ever set up, is to invoke the ChatGPT app and immediately put it into voice mode so I can have a voice conversation with it. So at the moment, what I say is Siri. And then after that, I say Hey, GP, which is the name I’ve given to my ChatGPT shortcut. So there’s a bit of friction there.

If I don’t want to use the Siri invocation word, I can, of course, also hold the side button down. But wouldn’t it be groovy if I could invoke ChatGPT as easily as I invoke Siri, at least for now, until the ChatGPT integration with iOS is complete, and then I may re-evaluate? So I’m going to set this up with a vocal shortcut.

The first thing I need to do is find the shortcut that I’ve set up to invoke ChatGPT in voice mode.

VoiceOver: Hey GP, button.

Jonathan: There it is. It’s just hey GP. And by the time we’re finished, that’s all I’ll need to say to start a conversation with ChatGPT.

So I’ll double tap the shortcut to tell the vocal shortcuts feature that this is the shortcut I want to take an action on.

VoiceOver: Custom phrase, text field. Is editing. Enter command name. Insertion point at start.

Jonathan: Having selected the thing that we want to assign a vocal shortcut to, in this case, my Hey GP shortcut, I now want to tell iOS 18 what I want the command to be that will invoke this shortcut. I’m just going to type Hey GP.

Before I press enter, I want to explain what’s going to happen next. Because if I explain this next screen while the screen is active, all sorts of fun things might happen.

When we go to the next screen after typing in the phrase, we’re going to say it 3 times. There are 3 waveforms on the screen. And when you’ve said the phrase 3 times correctly, it will move on. You can check how many times iOS has heard your phrase by checking the waveforms. If they say selected, then it has recognized them.

So you’ve got 3 waveforms in a row. I’m just going to say the phrase 3 times, and then it will move on to the next part of the process.

So let’s give that a shot. Now, I’ll press enter.

VoiceOver: Say hey, GP, heading.

Teach iPhone to recognize this phrase by saying it 3 times.

Waveform, image.

Waveform, image.

Waveform, image.

Jonathan: Hey, GP. Hey, GP. Hey, GP.

VoiceOver: Ready, heading.

Jonathan: Let’s just review the screen.

VoiceOver: Action is ready, heading.

iPhone will listen for hey GP and perform action hey GP.

Continue, button.

Add another, button.

Jonathan: Let’s have a look at what else we can add. I’ll double tap add another.

VoiceOver: Hey GP, button.

Jonathan: Now, we’re back on the main screen. And because we’ve created a vocal shortcut, we now have the list of vocal shortcuts which currently just comprises one item. I can navigate through those choices we saw before.

VoiceOver: Siri, heading.

Shortcuts, heading.

System, heading.

Jonathan: I want to show you the system functions that you can assign to a vocal shortcut.

VoiceOver: Action button, button.

App switcher, button.

Camera, button.

Control center, button.

Flashlight, button.

Front camera, button.

Home, button.

Lock rotation, button.

Lock screen, button.

Mute, button.

Notification center, button.

Reachability, button.

Screenshot, button.

Scroll down, button.

Scroll up, button.

Shake, button.

Siri, button.

Jonathan: I’m going to pause there because this one is an interesting one. If you want to call Apple’s assistant something else, you can do this with a vocal shortcut. So if you find Siri difficult to say, or you just want to impress your friends by invoking Siri with a completely different name, just assign this Siri system function to a vocal shortcut, and you can call the assistant whatever you want. And when you say that phrase, it will invoke Siri as if you’d said the magic word. Kind of fun!

VoiceOver: Spotlight, button.

Volume down, button.

Volume up, button.

Jonathan: In my testing of this feature before clearing it all out and showing this to you, I found this incredibly handy. I just set volume down to turn it down and volume up to turn it up. I can be some distance away from my phone, listening to a podcast or something like that. I can just say turn it up or turn it down, and the vocal shortcut is executed. It’s very cool indeed.

VoiceOver: Accessibility, heading.

Jonathan: Now we’re on to the Accessibility heading, and I want to show you what functions you can control here with the vocal shortcut.

VoiceOver: Accessibility shortcut, button.

Apple Watch mirroring, button.

Assistive touch, button.

Background sounds, button.

Jonathan: You’ll see these are in alphabetical order.

VoiceOver: Classic invert, button.

Color filters, button.

Control nearby devices, button.

Dim flashing lights, button.

Full keyboard access, button.

Hover typing, button.

Live captions, button.

Live recognition, button.

Live speech, button.

Magnifier, button.

Music haptics, button.

Smart invert, button.

Speak screen, button.

Switch control, button.

Vehicle motion cues, button.

Voice control, button.

VoiceOver, button.

Jonathan: There is VoiceOver. So if you want to set up a quick vocal shortcut to toggle VoiceOver, perhaps because sometimes you hand your phone over to a sighted person and you just want a really quick way to toggle it off and on again, this is where you do it.

VoiceOver: Zoom, button.

Zoom controller, button.

Jonathan: Those are the accessibility functions.

Let’s go to the home screen. I’m going to try the shortcut that I set up.

Hey, GP.

VoiceOver: Hey, GP. Connect.

Speak now.

Jonathan: Welcome! You’re speaking on the Living Blindfully iOS 18 demo. Do you have any thoughts on iOS 18?

ChatGPT: Thanks for having me on Living Blindfully!

iOS 18 is shaping up to be quite an interesting update, especially for accessibility features. Apple continues to innovate in making their devices more inclusive for everyone, including the blind and low vision community. One of the things that caught my attention is the enhanced voice recognition capabilities, which could be particularly useful for those who rely on VoiceOver. There’s also talk about more customizable haptic feedback, giving users more control over how their devices communicate information through vibrations.

Another area that seems promising is the improvement in Apple’s AI-driven features. These could provide more contextually aware assistance, which might help with tasks like reading text or identifying objects in the environment.

Overall, Apple seems to be continuing their trend of refining and expanding accessibility, which is great news for our community.

What are your thoughts so far on iOS 18? Have you tried out any of the new features?

Jonathan: Well, there’s another misconception right there. It thinks that blind people benefit from voice control. I mean we may well do, but it’s not because of blindness that we benefit from voice control. But there you go. It’s a work in progress, I suppose. But now, I can just use that command to invoke the voice functions of ChatGPT.

You can do a lot of customization with this. Vocal shortcuts are a pretty cool feature, and I’m sure many people will enjoy playing with that.

Advertisement: Living Blindfully is brought to you in part by ListenLater.net.

Are you tired of missing out on your favorite articles because you’re too busy? ListenLater.net is the revolutionary service that turns your must-read articles into your very own personalized podcast. Imagine having high-quality, human-like narration of your favorite articles ready to listen to anytime, anywhere, on any device you’re already using for podcasts.

With ListenLater.net, simply email a link, newsletter, or even a PDF, and they’ll transform it into a beautifully narrated podcast.

ListenLater.net is a great companion whether you’re commuting, working out, or doing chores. It kept me informed and entertained on my recent long international flights to and from the NFB convention.

ListenLater.net is perfect for everyone, especially for the blind community using devices like the Victor Reader Stream.

No subscriptions, no hassle. Just pay for what you use.

Get started with free credits today, and experience the convenience of having your own personal podcast feed. And after your first credit top-up, use the contact form and mention Living Blindfully for an extra 20% beyond what you paid for.

Visit ListenLater.net now, and transform the way you consume content.

ListenLater.net. Your articles, your way.

Music Haptics

I’m very proud of our deaf-blind audience here at Living Blindfully. That was the primary reason we started transcripts.

So I do want to make mention of a feature called Music Haptics, which is available in accessibility settings now.

VoiceOver: Accessibility.

Back, button.

Music Haptics, heading.

Music Haptics. Switch button, on.

Jonathan: Now, it’s off by default. I have enabled it to kind of feel what it’s like.

VoiceOver: When available, music haptics provide haptic feedback for supported songs.

Learn more.

Jonathan: This only works on iPhone because it has the hardware to make this feature work.

It tries to represent music in a haptic way, so your phone will vibrate a lot and pulse a lot. And listening to some of my favorite songs with this enabled, it’s actually quite cool. It works on my phone, at least at the moment, only in Apple’s music apps, so Apple Music and their Classical app.

I do have hearing, so I’m not the primary audience for this. If you are deaf-blind and can’t enjoy music, I’d be very interested to know whether you think this feature has value.

New Background Sounds

While we’re on the subject of hearing, there is an accessibility feature that many people enjoy using whether they have a hearing impairment or not, and this is the ability to play a background sound. The accessibility purpose here is that it’s designed to mask tinnitus, that ringing in the ears that people get. But many people just like it because it’s a pleasant audio background.

You can get to this by adding the hearing option in Control Center, or by just going into Accessibility Settings and Hearing, and finding it there.

There are 2 new sounds in iOS 18. I’ll enable this.

VoiceOver: Background sounds. Switch button, off.

Jonathan: Double tap.

VoiceOver: On.

[cricket sounds]

Jonathan: Ah.

This is the new night sound that you can choose, if you wish.

Let’s just flick right.

VoiceOver: Plays background sounds to mask unwanted environmental noise. These sounds can minimize distractions and help you to focus, calm, or rest.

Sound. Night, button.

Jonathan: I’ll double tap, and we can have a look at what’s here.

VoiceOver: Balanced noise.

Bright noise.

Dark noise.

Ocean.

Rain.

Stream.

Selected. Night.

Jonathan: Night is the first of the new ones.

VoiceOver: Fire.

Jonathan: And fire is the second one.

VoiceOver: Selected. Fire.

[fire sound]

Jonathan: Ah.

Nothing like the sound of a good old crackling fire.

So those are the 2 new sounds available in iOS 18. I’ll switch that off.

VoiceOver: Background sound.

Audio.

Background sound. Switch button, on.

Off.

More Powerful Control Center

In iOS 18, the control center has undergone a significant revamp, and the way that you customize control center has changed.

In the past, if you wanted to customize control center, you would go into settings and control center, and you’d be able to add and remove certain widgets from Control Center as you see fit.

Not only are there many more Control Center widgets that you can add right away, but ultimately, third-party apps will also be able to introduce their own widgets to Control Center. So in iOS 18, you’ll find Control Center is much more powerful.
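
For the developers listening, the way third-party apps will join in is a new ControlWidget API in WidgetKit. The following is a minimal, hypothetical sketch of a one-button control; the kind string, intent, names, and icon are placeholders, not anything from a shipping app.

import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent the control will run; names and behavior are placeholders.
struct OpenMyFeatureIntent: AppIntent {
    static var title: LocalizedStringResource = "Open My Feature"
    static var openAppWhenRun: Bool = true // bring the app forward when triggered

    func perform() async throws -> some IntentResult {
        // A real app would deep-link to the relevant screen here.
        return .result()
    }
}

// A minimal Control Center control: a single button that runs the intent above.
struct MyFeatureControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.myapp.open-feature") {
            ControlWidgetButton(action: OpenMyFeatureIntent()) {
                Label("My Feature", systemImage: "sparkles")
            }
        }
    }
}

There is also a toggle variant, ControlWidgetToggle, for on/off style controls. But back to using Control Center rather than building for it.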

The way you invoke Control Center hasn’t changed. If you have an older phone (or even a newer one, actually, because it still works for legacy purposes), you can touch the status bar, and then swipe up with 3 fingers. If you have any phone other than an SE, which has Touch ID, give a short swipe down from the top of the screen with one finger.

If you go too far, you’ll hear a second tone and you’ll get into notification center. So you don’t want to swipe down too far.

I’ll do that now.

VoiceOver: Airplane mode. Switch button, off.

[more actions available sound]

Jonathan: I just did the slightest of swipe downs. Actually, it doesn’t take too much. So now, we are in control center.

I have already moved some widgets around because one of the features I use a lot in control center is focus modes. I frequently switch different focus modes on, depending on what it is that I’m doing. So I’ve tried to move focus a little bit further up the screen.

I have to say that I’m not sure whether this is a VoiceOver issue or some sort of conceptual thing that I haven’t quite grasped yet, but I do find dragging things around in control center a bit hit and miss. Sometimes, I can get things to move. Sometimes, I can’t.

When you’re on page 1 of Control Center, which is now a multi-page environment, you’re on your favorites. So on this screen, you should be able to drag the Control Center widgets that you use most often, for easy access.

To get to page 2, I flick up with three fingers.

VoiceOver: Page 2 of 3.

Now playing.

Media suggestions, The NPR Politics Podcast.

Jonathan: I’m on the now playing part of control center. And if I flick right, I will get suggestions about media that I have listened to recently, as well as options for AirPlay and general now playing related functions.

And if I flick up one more time, …

VoiceOver: Page 3 of 3, connectivity.

Airplane mode. Off, button.

Jonathan: Here are all the items relating to connectivity.

So how do we add new things to control center?

Let’s go to the top of the screen.

VoiceOver: Add controls, button.

Jonathan: And there’s the add controls button. What you might want to think about before you start adding controls is how you want to organize control center. For example, you can create as many pages as you need to, and you might want to create pages that have specific themes. For example, if you use a range of accessibility features, why not create a page on control center that pertains just to accessibility?

So let’s have a go at doing this. Before I do though, I’m just going to swipe right from the add control button.

VoiceOver: Power, button.

Jonathan: Because that’s a useful thing to know: there’s now a power button right in Control Center, and that can be useful if you want to shut down your device.

So I’ll flick left again, …

VoiceOver: Add controls, button.

Jonathan: and double tap.

VoiceOver: Add controls.

Connectivity. Is editing. 4 by 8.

Jonathan: And we’ve got a connectivity group. It’s large. It’s 4 by 8. And that’s why when we flicked into page 3, we got into connectivity.

If you want to, you can move this connectivity widget somewhere else. So even though iOS 18’s default layout is to put connectivity on page 3 and now playing on page 2, if you would prefer to reverse that because of how frequently you use these features, you can do that by moving these controls around.

You can also resize some controls that have multiple purposes.

If I double tap, we’ll activate, which is the default action. But if I flick down, …

VoiceOver: Edit controls.

Jonathan: So let’s have a look at editing the connectivity control.

VoiceOver: Connectivity. Is editing. 4 by 8.

Jonathan: Now, if I flick down, …

VoiceOver: Stop editing controls.

Delete.

Resize to 2 by 2.

Resize to 8 by 4.

Show controls gallery.

Drag connectivity.

Jonathan: And if we double tap on drag connectivity, we should be able to drop it to different places.

I have had some difficulty making this work quite the way I want. And as I say, I’m not sure if that’s a conceptual thing on my part, or whether there might be some issues still to be resolved relating to drag and drop with VoiceOver.

But these controls have some similarities to widgets, in the sense that some of them, which have multiple controls, can in fact be resized.

Now, I’ll go down, …

VoiceOver: Activate.

Stop editing controls.

Jonathan: and stop editing this control.

Now, I want to go to a blank page of Control Center. This works quite similarly to apps on your home screen. You will know that if you go into edit mode on your home screen and you scroll past all the apps that you currently have installed, you’ll always have a blank page there, kind of like a blank canvas to play with.

So we’ll go back, …

VoiceOver: Add controls.

Jonathan: into add controls.

VoiceOver: Add controls.

Connectivity. Is editing.

Jonathan: Now, I’m going to flick up with 3 fingers.

VoiceOver: Page 4 of 4. Is editing.

Jonathan: You’ll remember that we only had 3 pages before. This is a blank page. I can verify that by flicking around the screen.

VoiceOver: Add a control, button.

Double tap to open.

Add a control.

Jonathan: So we’ve got nothing on this page.

Now, I want to add a control.

VoiceOver: Add a control, button.

Jonathan: I’ll double tap.

VoiceOver: Sheet grabber, button.

Jonathan: Let’s see what controls are available.

VoiceOver: Search controls, search field.

Jonathan: There are a lot. You can search controls now to find something quickly that you’re looking for. And as we’ll see as we explore this massive list of controls, most of the controls are divided into logical groups delineated by heading. But for some reason, there’s a group of controls at the top that are not. And it’s possible that Apple is making a decision about what controls you are likely to want to use, and putting them up here. I haven’t been able to test on other people’s iPhones to verify that, or whether there is just a group of common controls at the top here.

VoiceOver: Translate, control.

Alarm, control.

Timer, control.

Text size, control.

Magnifier, control.

Jonathan: This is why I think that these controls may be personalized at the top here because I have been using magnifier a lot with text recognition, scene description, and some of the very good technology that Apple’s now providing. And I think it’s unlikely that if this was some sort of generic thing, magnifier would be up here.

Now, let’s say that we want to build a page of accessibility widgets, so we might want to add magnifier to this.

I’m going to double tap magnifier control.

VoiceOver: Magnifier. Is editing. 1 by 1. Word mode.

[more actions available sound]

Jonathan: Now, we can resize that control if we wish. You can hear there are actions available there. But if we want to keep building our accessibility page in control center, we’ll flick right, …

VoiceOver: Add a control, button.

Jonathan: and add another control.

VoiceOver: Sheet grabber, button.

Jonathan: The same screen pops back up. So if we’re just flicking around the screen like I’m doing (and I’ll do that quickly)…

VoiceOver: Dark mode.

Magnifier.

Voice memo, control.

Dark mode, control.

Flashlight, control.

Stopwatch, control.

Screen recording, control.

Quicknote, control.

Low power mode, control.

Scan code, control.

Home, control.

Screen mirroring, control.

Recognize music, control.

Capture, heading.

Jonathan: Now that I’m past that list of controls at the top, we have the controls separated by heading, thankfully, because there are so many here, which is great. I’m not critical of that at all. It’s awesome to have this much flexibility in Control Center now. So if I want to navigate to the next heading, I can perform the move to next heading command which, for most people, is on the rotor by default. But I use this so often, I’ve assigned it to a 2-finger flick right.

VoiceOver: Clock, heading.

Jonathan: I know that I don’t want to add anything related to the clock here because I’m trying to build an accessibility page, so I’ll navigate to the next heading.

VoiceOver: Connectivity, heading.

Jonathan: Likewise.

VoiceOver: Display and brightness, heading.

Focus, heading.

Home, heading.

Notes, heading.

Now playing, heading.

Remote, heading.

Shortcuts, heading.

Sounds, heading.

Jonathan: This is interesting, actually.

VoiceOver: Volume, control.

Silent mode, control.

Recognize music, control.

Translate, heading.

Jonathan: Back to navigating by heading.

VoiceOver: Utilities, heading.

Voice memos, heading.

Wallet, heading.

Watch, heading.

Accessibility, heading.

Jonathan: Here we go. Now, there are a lot of controls here. I mean, a lot. And once third parties start being able to add controls here, it’s going to get pretty busy.

But I think the trick is to have some logic to the way you structure your control center, and that’s why I’m thinking I’ll put all my accessibility-related things on one page.

VoiceOver: Accessibility shortcuts, control.

Assistive access, control.

Guided access, control.

Live speech, control.

Hearing accessibility, heading.

Sound, control.

Music haptics, control.

Left-right stereo balance, control.

Live captions, control.

Hearing, control.

Headphone levels, control.

Live listen, control.

Sound recognition, control.

Headphone accommodations, control.

Motor accessibility, heading.

Jonathan: We’ll keep going with exploring these accessibility options in a moment. But I just hope that at some point, Apple will add mono audio as an option here because I use this quite a bit.

In New Zealand, we have journalists who go out and live stream media conferences. Usually, political media conferences are the ones that I’m interested in.

I don’t know what technology a lot of these journalists are using. But I often find that the politician who’s giving the conference is on one channel, and then you get some sort of ambience or audience on the other channel. And as someone with a hearing impairment, I find that really distracting, and I want to turn it into mono.

At the moment, I have to go into accessibility settings, find hearing, and switch mono audio on, and then I can enjoy it the way I want it. But it’s a lot of steps, so I was hoping that mono audio would be an option here so I could just double tap it from my favorites in control center and have a lot less friction toggling this on and off.

VoiceOver: Motor accessibility, heading.

Switch control, control.

Voice control, control.

Full keyboard access, control.

Assistive touch, control.

Apple Watch mirroring, control.

Control nearby devices, control.

Eye tracking, control.

Vision accessibility, heading.

Classic invert, control.

Color filters, control.

Live recognition, control.

Increase contrast, control.

Vehicle motion cues, control.

Reduce motion, control.

Reduce transparency, control.

Reduce white point, control.

Smart invert, control.

VoiceOver, control.

Zoom, control.

Speak screen, control.

Dim flashing lights, control.

Hover text, control.

Hover typing, control.

Jonathan: That is a lot of accessibility choice. So if you wanted to, you could sit here and build an accessibility page customized for your specific accessibility needs.

You may be thinking well, what’s the point? Why don’t I just go into settings and do it from there?

I think the point is just getting to these controls more easily because you can invoke control center from within an app, make a change, and then pop right back into your app again. So as long as your control center doesn’t get too cluttered and illogical, it should be easier to quickly find things that you change regularly. And there are a lot of things here to explore. I have deliberately chosen not to flick through them all because that makes for a bit of a boring podcast, and you can do that yourself and just see the vast array of controls that Apple have added even before third-party developers start to take part in the fun.

Voice Memos

Next, let’s take a look at the voice memos app. This is a very handy, accessible way to quickly record something. You can even say, “Record a voice memo.” to Siri, and that will be sufficient to get recording.

But before we go into the voice memos app itself to see what’s new, there is a configuration option I’d like to show you. It’s long overdue, in my view. This is the ability to allow voice memos to record in stereo.

In the process of showing you this, I can also demonstrate the revamp of the settings app.

So I’m on the settings screen now. And you may remember that it was a very, very busy screen in earlier versions of iOS because you had all the different sections that Apple chose to put in the settings app, plus a section for every app that you have on your phone. And if you have many hundreds of apps like I do, that made for an exceptionally busy settings screen.

I’m on the main settings screen now.

VoiceOver: Settings, heading.

Jonathan: And now, what you can do is go to the bottom of this screen with a 4-finger single tap, …

VoiceOver: Apps, button.

Jonathan: There’s an apps button. When you double tap this, it will expand the apps section.

VoiceOver: Selected. Apps.

Aiko, button.

Jonathan: And it’s even easier to find the app you want. I’m going to go to the top of the screen, …

VoiceOver: Settings. Back, button.

Jonathan: and flick right.

VoiceOver: Apps, heading.

Search, Search field.

Jonathan: And there’s a handy-dandy search field, so I’m going to double tap.

VoiceOver: Search field. Is editing. Search. Character mode. Insertion point at start.

Jonathan: I have my APH Mantis connected, so I’m just going to type the word voice on its keyboard, and flick right.

VoiceOver: Clear text, button.

Cancel, button.

Voice Dream, button.

Voice in a can, button.

Voice memos, button.

Jonathan: I’ll double tap voice memos. That’s the one we’re interested in.

VoiceOver: Allow voice memos to access, heading.

Jonathan: There are the usual options here, but I’m going to navigate by heading to get to the ones that I’m wanting to show you.

VoiceOver: Voice memo settings, heading.

Jonathan: Let’s flick right.

VoiceOver: Clear deleted. Never, button.

Jonathan: This is not the default. So if you want to make sure that your voice memos are not erased after 30 days or earlier, go in here and set it to never, and then you are in charge of when your voice memos get deleted.

VoiceOver: Audio quality. Lossless, button.

Jonathan: That is not a new option. I like setting it to lossless because I’m a bit of an audio purist, and if I want to compress the audio later in a sound editor, then I can do that.

VoiceOver: Location-based naming. Switch button, on.

Stereo recording. Switch button, off.

Jonathan: It’s off by default. A stereo recording is going to take up more space, and not everybody wants stereo, but I think many of us will. So I’ll double tap to enable this.

VoiceOver: On.

Jonathan: And now, we have stereo recording in the voice memos app. Unfortunately, you can’t enable this on a recording by recording basis from within the Voice Memos app itself. So if you want to change this for any reason, you’ll need to go back into settings here and toggle it.

Now that I’ve done that, I’m going to open the app.

Open voice memos.

VoiceOver: Voice memos.

Jonathan: And I’ll flick right.

VoiceOver: All recordings, heading.

Jonathan: All recordings is the default folder, but you can set up multiple folders to keep your recordings organized. This could be very handy if you use the Voice Memos app the way that many of us used to use digital recorders – to take quick notes on different subjects. So if you want to create a voice memos folder for shopping lists or messages that you want to pass on to certain people, you can absolutely do that.

I’ll flick right.

VoiceOver: Edit, button.

Titles, transcripts. Search field.

Jonathan: You heard right. Voice memos now offers transcripts. This is one of the big new features in voice memos in iOS 18.

As I said, you can just tell Siri to start recording a voice memo. But from within the app, we could just go to the bottom of the screen with a 4-finger single tap.

VoiceOver: Record, button.

Jonathan: And there’s the record button. It’s always intrigued me that VoiceOver says the word record in this app in a very strange way. Rocord. [laughs] I’m not sure where it’s getting that from.

So I’m going to double tap to start recording, and I’ll just speak into my iPhone.

Hello! This is Jonathan from the Living Blindfully podcast, and I’m recording a voice memo in iOS 18. I’m moving my head around a little bit in the hope that when I play this recording back, you’ll be able to hear the stereo effect and we’ll be able to confirm that it is recording in stereo. I’m also interested to see how accurate the transcription feature is. This is another thing that is new in iOS 18.

Because I didn’t move focus when I double tapped again, I was on the stop button. And now, the voice memo has been recorded.

It’s also worth noting that when you’re in the voice memos app, you don’t have to hunt around for the record/stop button if you don’t want to. The magic tap is taken over by the voice memos app when it has focus. So all you need to do when you’re in voice memos is double tap with 2 fingers to start recording, and double tap with 2 fingers to stop it again. Let’s go to the top of the screen.

VoiceOver: Voice memos.

Back, button.

All recordings, heading.

Edit, button.

Titles, transcripts, …

Dictate, button.

Jonathan: And it’s got my address here as the name of the file because I’ve got the option where it names the file based on your location.

VoiceOver: Transcription available.

Jonathan: We’ll take a look at that in just a moment.

VoiceOver: More actions, button.

Track position: 0 seconds of 30 seconds.

Edit recording, button.

Rewind 15 seconds, button.

Play, button.

Jonathan: Let’s take a listen.

[voice memo]

Hello! This is Jonathan from the Living Blindfully podcast, and I’m recording a voice memo in iOS 18. I’m moving my head around a little bit in the hope that when I play this recording back, you’ll be able to hear the stereo effect and we’ll be able to confirm that it is recording in stereo. I’m also interested to see how accurate the transcription feature is. This is another thing that is new in iOS 18.

VoiceOver: Play, button.

Jonathan: Well, there’s no doubt that that’s in stereo. And I guess because of the way I’m holding the phone, there’s definitely a skew all the way over to the right of the spectrum there.

I’m going to double tap on…

VoiceOver: More actions, button.

Share, button.

Edit recording, button.

Options, button.

View transcript, button.

Jonathan: Let’s have a look at how good this is, or not.

VoiceOver: Resume, button.

Jonathan: That is saying resume there, but it’s pronouncing it résumé, like you’re applying for a job.

And if we look around the screen, we will eventually find this.

VoiceOver: Hello! This is Jonathan from the Living Blindfelly podcast, and I’m recording a voice memo in iOS 18. I’m moving my head around the little bit in the hope that when I play this recording back, you’ll be able to hear the stereo effect and we’ll be able to confirm that it is recording and stereo. I’m also interested to see how accurate the transcription feature is. This is another thing that is new in iOOS 18.

Jonathan: Let’s dissect that a little bit.

The first error (and I’m reading on the Braille display now to confirm this) is that it’s written blindfelly with an E-L-L-Y. So B-L-I-N-D-F-E-L-L-Y podcast. We’ll give it a pass there because blindfully is not a word you will find in the dictionary.

When I said I was moving my head around a little bit, it got the little bit.

And when I said recording in stereo, it got recording and stereo.

When I talked about this being a new feature in iOS 18, it got iOS 18 right the first time I said it. But the second time I said it, it got iOOS 18.

So I’d say a little over 95% accurate, and it’s certainly accurate enough that you would have no difficulty understanding what the voice memo contains. So that’s a handy little app there, the old Voice Memos, built right into your iPhone. It can be used as a little audio notetaker, if you want.
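
For developers in the audience who are wondering how on-device transcription like this can be done in an app of their own, here’s a minimal sketch using Apple’s Speech framework. To be clear, this is an illustration of the general technique under assumptions of my own (the locale and the function name), not a claim about how Voice Memos itself is implemented.

import Speech

// Minimal sketch: transcribe an audio file on-device with the Speech framework.
// Assumes speech recognition permission has already been granted via
// SFSpeechRecognizer.requestAuthorization, and that fileURL points to a real recording.
func transcribeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable,
          recognizer.supportsOnDeviceRecognition else {
        print("On-device speech recognition isn't available for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true // keep the audio on the device

    _ = recognizer.recognitionTask(with: request) { result, error in
        if let result, result.isFinal {
            // The best transcription of the whole file, once recognition finishes.
            print(result.bestTranscription.formattedString)
        } else if let error {
            print("Transcription failed: \(error.localizedDescription)")
        }
    }
}

The requiresOnDeviceRecognition flag is what keeps the audio from leaving the device, generally at the cost of supporting fewer languages than server-based recognition.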

Notes

But let’s transition to talking about the Notes app now, because this may be a better way to go if you want a notetaking type solution built into your iPhone. We’ve seen, as I’ve written my iOS without the Eye books over the years and done presentations here on the podcast in more recent years, that the Notes app always seems to get some love. And what we find now is, I don’t know, I think the Notes app may have outgrown the name Notes because it’s a powerful word processor and personal organizer. There is a lot of power tucked away in the iOS Notes app. So let’s open it.

Open Notes.

VoiceOver: Notes. Search, search field.

Jonathan: You can search your notes, of course, but I’m going to go to the bottom of the screen, …

VoiceOver: Toolbar.

New note, button.

Jonathan: and double tap.

VoiceOver: New note, dimmed. Note, text field. Is editing. Character mode. Insertion point at start.

Jonathan: I’m going to flick right.

VoiceOver: Format, button.

Checklist. Off, button.

Table, button.

Attachments, button.

Jonathan: And that’s what I’m going to double tap.

VoiceOver: Attach file, button.

Record audio, button.

Jonathan: You can record audio into the notes app, which will be transcribed. So I’ll do this.

VoiceOver: Record audio.

New recording, heading.

Today, 8:03 AM.

More, button.

Waveform, 0 seconds.

Go backward 15 seconds.

Play, dimmed.

Go forward 15 seconds, dimmed.

Show transcript and summary, button.

Record, button.

Jonathan: This user interface is very similar to the one in the voice memos app. One difference is at the bottom of the screen, you don’t have the record button. You have, …

VoiceOver: Done, button.

Jonathan: But it’s still pretty easy to go to the bottom of the screen.

VoiceOver: Done, button.

Jonathan: (That’s where the done button is.), flick left, …

VoiceOver: Record, button.

Jonathan: and you’re on the record button again. [laughs] It is using that really interesting pronunciation in here, but I will try not to fixate on this and instead record something.

Here we are in the studio, continuing our look at the many new features in iOS 18 from Apple. I’m now recording audio that’s going to be attached to the note. And if all works correctly, this note will also be transcribed. It will be interesting to evaluate how accurate the description is once again.

[laughs] And of course, that’s the trouble when you’re trying to talk like that. I said description instead of transcription.

Let’s pause this. And what have we got?

VoiceOver: Done, button.

Jonathan: We’ll double tap the done button.

VoiceOver: Note, text field. Is editing.

New recording, 8:07 AM. Audio attachment, 32 seconds. More content available. New line. Insertion point at end.

[more actions available sound]

Jonathan: It says more content is available. So if I flick down, …

VoiceOver: Add link. Activate, default.

Jonathan: And I’ll double tap, …

VoiceOver: New recording, heading.

Jonathan: and flick right.

VoiceOver: Today, 8:06 AM.

More, button.

Text line 2.

Summary. Expand, button.

Jonathan: This may not be available when you get your iOS 18 because at the moment, I’m doing this demo using iOS 18.1, which has Apple Intelligence features in it, and it’s got a summary button which is all part of Apple Intelligence. But since we found it, if I double tap the summary button, …

VoiceOver: Loading.

Jonathan: What summary do we get?

VoiceOver: Copy summary, button.

Share summary, button.

Summary, heading.

Back, button.

iOS 18 introduces audio recording and transcription features for notes. The accuracy of the transcription will be evaluated.

Jonathan: We could do without the passive voice there, Apple. We could do without the passive voice.

But anyway, yes, that is a fair summary of what I said. And as we’ve already seen, there are copy and share summary buttons.

But I’m going to go back.

VoiceOver: New recording, heading.

Today, 8:06 AM.

More, button.

Jonathan: I’ll double tap more.

VoiceOver: Rename, button.

Add transcript to note, button.

Copy transcript, button.

Find in transcript, button.

Save audio to Files, button.

Share audio, button.

Delete, button.

Dismiss context menu, button.

Jonathan: In this case, we want to add the transcript to the note, so I’ll go to the top of the screen, …

VoiceOver: Rename, button.

Jonathan: and flick right.

VoiceOver: Add transcript to note, button.

Jonathan: I’ll double tap.

VoiceOver: Note, text field. Is editing.

New recording. 8:07 AM. Audio attachment. 32 seconds. More content available. New line.

Jonathan: Now, this could be a bug with the version that I’m doing this demo for you on, but the transcript is not actually here in the note. But I think I can find it if I go right, …

VoiceOver: New recording. 8:07 AM. Audio attachment. 32 seconds, button.

Jonathan: And double tap.

VoiceOver: New recording, heading.

Jonathan: I’m going to go to the bottom of the screen.

VoiceOver: Done, button.

Record, button.

Hide transcript and summary, button.

Go forward 15.

Play, button.

Go backward 15.

0 seconds.

Let’s pause this, button.

Jonathan: Okay. So here’s the full transcript.

VoiceOver: Here we are in the studio, continuing our look at the many new features in iOS 18 from Apple, button. I’m now recording audio that’s going to be attached to the note. And if all works correctly, this note will also be transcribed, button. It will be interesting to evaluate how accurate the description is once again. And of course, that’s the trouble when you’re trying to talk like that, button. I said description instead of transcription, button. Let’s pause this, button.

Jonathan: That’s actually a very good transcription.

I’m going to back out to get to the main body of the note.

VoiceOver: Text field. is editing.

New recording.

Jonathan: I’ll go to the bottom of it.

VoiceOver: Insertion point at end.

Calculations in Edit Fields

Jonathan: I just want to show you something that works in a number of edit fields now in iOS 18, and that is that you can type a little calculation and immediately get the result.

So I’m going to do 24, and then an asterisk, 312+12, and then I’m going to write the = sign.

VoiceOver: Computed result, 7,500.

Jonathan: So you can use any calculator function from within an edit field. This works in text messages as well, and you can get the result simply by pressing the equals sign.
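
In case you’d like to check the arithmetic behind what VoiceOver announced, working through that expression step by step:

24 × 312 = 7,488
7,488 + 12 = 7,500

which matches the computed result of 7,500.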

Where I find this handy is that on Mushroom FM, the internet radio station that I’m a part of, we run a countdown every Saturday, and the length of that countdown varies a little bit. Some weeks, I need to insert a few extra program elements to get us up to 3 hours. On other weeks, if I included those program elements, the show would run over. So I calculate the length of each hour of this countdown every week.

I used to type it into the calculator, which is a bit annoying because if you mess up, you have to start from scratch again. It’s difficult to edit.

But I can now do it in the notes app or a number of other edit fields (it’s much easier), and just do the calculation. I get the running time of the show by using this method.

Advertisement: Living Blindfully is brought to you in part by Turtleback. They’ve been working with our community for decades now, and they understand our requirements when it comes to leather cases for the devices that we use. They need to be durable, protective, and most of all, functional.

You can be confident that when you buy a leather case or accessory from Turtleback, you’re buying quality.

Visit TurtleBackLV.com today to find out more, and be sure to use the coupon code LB12. That’s LB for Living Blindfully and the number 12 for 12% off your order at checkout.

Or you can phone their friendly team. The number is 855-915-0005. That’s 855-915-0005.

Make the most of your device by putting it in a Turtleback case.

TurtleBackLV.com. And don’t forget that coupon code – LB12.

Changes to the Files App

Next, I want to look at a few goodies in the Files app that have been added to iOS 18.

The big new feature for me is one that makes it more competitive with services such as Dropbox and OneDrive, and this is the ability to keep a folder on your device, whether or not you have an internet connection.

For many years (I think I started using Dropbox in 2009 or something like that. It was a long time ago.), I have had folders that are local on my hard drive. It doesn’t matter whether I have an internet connection or not. Those folders are still there. And when the internet is restored, those folders will sync up again. Now, the Files app has this.

So I’m going to flick right. (I’m in the Files app now.)

I have Family Sharing set up with my iOS-using adult children. One of my kids is an Android fan, but the rest of them are using iOS, and they’re data-using ninjas. Bonnie is using this as well. So we have the 2 terabyte plan at the moment with iCloud. And I mean, if you’ve got all that space, why not use it, I say?

So I’ve got a folder called Documents in iCloud Drive, …

VoiceOver: Documents folder, in iCloud.

Jonathan: and I’m going to triple tap. You can actually see this option on the rotor. But what I found, at least at the time that I’m recording this, is that you don’t seem to be able to tell when it’s on or when it’s not from the rotor. If you choose the option from the rotor, nothing seems to change as far as VoiceOver is concerned, so I prefer to triple tap for now. I’ll do that.

VoiceOver: Copy, button.

Move, button.

Download now, button.

Keep downloaded, button.

Jonathan: I’m going to double tap keep downloaded.

VoiceOver: Desktop folder, in iCloud. 8 items.

Jonathan: Now, I’m going to flick right.

VoiceOver: Documents.

Jonathan: We’ll triple tap again and we’ll see what happened.

VoiceOver: Copy, button.

Move, button.

Selected. Keep downloaded, button

Jonathan: Keep downloaded is selected. So we can get confirmation using this method that in fact, keep downloaded is on. That means that these files, which may be precious, are going to be on your device, so even if you don’t have an internet connection, you will continue to have access to them.

There is another new feature I won’t demonstrate (or rather, I can’t demonstrate) because I don’t have anything compatible with the USB-C port on my iPhone to show you this. But I understand that if you connect some sort of external drive (it could be a hard drive, or a thumb drive, anything that will plug into your iPhone) and you triple tap on the name of the drive and you choose erase, you will now have the option to reformat the drive. Currently, the format options apparently are APFS with options for case sensitivity or encryption, exFAT, and MS-DOS FAT. So the same options that are available in Disk Utility on the Mac.

Messages

Next, let’s take a look at some of the features that are new in the iOS 18 Messages app.

One of the big ones, especially if you have friends who use Android, is the introduction of RCS. We’ve talked about this on the podcast before, but I’ll recap and perhaps expand for the sake of this review.

To really convey the importance of RCS, I have to give you a bit of a history lesson.

SMS, which stands for short message service, is the most common form of text messaging, and it was first used all the way back in the 1990s. It allows for sending text messages up to just 160 characters long. That’s pretty negligible these days, and it’s super basic. SMS messages are limited to text with no enhanced features.

The big advantage of it, of course, is that it’s universal. SMS works on all mobile phones, no matter how smart or how basic, no matter what the brand is. All it takes for it to work is a cellular signal to send and receive messages.

Obviously, the world has changed a lot since the 1990s in terms of the way that we communicate. As a result, you have services that allow the transmission of pictures and much longer amounts of information. iMessage, for example, is Apple’s proprietary protocol unique to Apple devices. There’s also technology like WhatsApp, and many other technologies that are cross-platform. In other words, you can use them on iOS and you can use them on Android.

Now, RCS stands for Rich Communication Services, and it’s designed to drag universal messaging kicking and screaming into the 21st century. It’s a standard. It’s an upgrade to SMS with many more advanced features. It aims to provide a richer messaging experience, similar to apps like WhatsApp, iMessage, and Facebook Messenger. With RCS, you can send images, videos, and audio messages, and not just text. If you’re connected to a Wi-Fi network, you can send RCS messages across that Wi-Fi network and not have to use the cellular network, just like WhatsApp, Messenger, and iMessage.

RCS has read receipts and delivery confirmation, so you can see when a message has been delivered and you can see when it’s been read.

RCS has typing indicators as well, something you don’t get in standard SMS messages. You can see when someone’s typing a response to you.

There’s full support for group chats with more enhanced interactive features. If you’re sending an RCS message, you can share your location in the chat. There is no character limit whatsoever.

If all the people that you text are other iPhone users, this is no big deal. But it would be unusual not to know someone in your life who uses Android, and RCS is going to make texting just from within the native text messaging app that’s built into phones much more interactive and enjoyable. It is a universal standard for messages that can work across different platforms and carriers, much like SMS does now.

It’s worth noting that unlike iMessage and WhatsApp, RCS messages exchanged between iPhone and Android aren’t yet end-to-end encrypted as of iOS 18’s launch, so while RCS is far more capable than SMS, don’t assume it’s any more private.

Businesses can use RCS to send more engaging and interactive messages to their customers. For example, a company could send a boarding pass, or a receipt, or customer service updates with images and buttons for quick responses.

Apple resisted RCS with considerable vigor for a long time. Google embarked on a campaign to encourage Apple to adopt RCS. For the longest time, Apple was not listening. And the reason for that was that they perceived iMessage to be a significant competitive advantage over what Android offered. iMessage, in their view, was a compelling point of difference.

But it’s all changed, and Apple has gone all in on RCS now because of EU regulation and scrutiny. So good on the EU for bringing this about.

It’ll be interesting to see what happens to WhatsApp. WhatsApp have just reached 100 million users in the US.

In other markets like India, WhatsApp is ubiquitous and huge. It’s hard to imagine WhatsApp being knocked off its perch in India.

But in the United States where there are a lot of iPhones, I do wonder what will happen when iPhone and Android users can communicate much more effectively, direct from the messages app.

For us as end users, there’s nothing to enable here. This just happens. If you start texting a number, then the Messages app will tell you that it’s sending a text message, and it will tell you whether it’s SMS or RCS. If you hear that it’s RCS, rejoice! You can start attaching voice clips and images to your heart’s content, and it will be a beautiful interactive thing. So when you choose to message someone, listen to VoiceOver telling you whether it’s an iMessage, an SMS message, or an RCS message.

Another feature from the “I don’t know what took them so long, but we’ll celebrate the win” department is that if you’re sending someone an iMessage, you can now schedule that message up to 14 days ahead.

We’ll have a look at how this works. I will go into my iMessage conversation with Bonnie, …

VoiceOver: Bonnie Mosen. This is it.

Jonathan: and I will double tap.

VoiceOver: Messages. Your iMessage. iMessage, text field.

Jonathan: I’ll double tap.

VoiceOver: Message, text field. Is editing. iMessage. Insertion point at start.

Jonathan: and I will type, It is 12 PM! Time for a break and some well-earned lunch.

Now as I record this, it’s quite a bit earlier than 12 o’clock. So I’m going to flick left, …

VoiceOver: Apps, button.

Jonathan: and we have to double tap the apps button. I have my apps hidden by default, and I only expand them when I want to use them, so I’ll double tap.

VoiceOver: Message, text field. Is editing.

Jonathan: We’ll go to the top of the screen.

VoiceOver: Audio.

Camera.

Photos.

Stickers.

Apple Cash.

Location.

More.

Store.

Send later.

Jonathan: And here is send later. It’s just an app in your apps drawer.

I’ll double tap send later.

VoiceOver: Send later.

Audio.

[more actions available sound]

Messages.

Your iMessage. This is a test from the…

Jonathan: Focus has moved around. Let’s see if we can get it to where we need it to be.

VoiceOver: Today, 5 PM, button.

[more actions available sound]

Jonathan: There we go. Apple Intelligence, which is partially in the 18.1 build I’m running for this demo, missed a bit of a trick there because I said it is 12 PM, so you’d think that Apple Intelligence might be able to schedule it correctly, or suggest the right time. But that’s okay.

I’ll double tap.

VoiceOver: Today, 5 PM.

Jonathan: Now, it feels like nothing has changed. But in fact, what’s happened is that the picker has popped up at the bottom of the screen. So I’ll perform a 4-finger single tap on the bottom half of the screen.

VoiceOver: PM. Selected. Picker item. Adjustable.

00 minutes. Selected. Picker item.

5 o’clock. Selected Picker item.

Today. Picker item. Adjustable.

Jonathan: And we can go up to 14 days in advance.

I’m going to flick right.

VoiceOver: 5 o’clock. Selected. Picker item. Adjustable.

Jonathan: And now, flick down.

VoiceOver: 4 o’clock.

3 o’clock.

2 o’clock.

1 o’clock.

12 o’clock. 12 of 12. Selected.

00 minutes. Selected. Picker item. Adjustable. 1 of 60.

Jonathan: So we’ve set this to where we need it to be.

VoiceOver: 12 o’clock. Today.

Send, button.

Jonathan: I’ll double tap the send button.

VoiceOver: Send message.

Dictate, button.

Jonathan: And it made a cute little noise.

What I’m going to do is back out of this message conversation, …

VoiceOver: Edit, button.

Jonathan: and flick right.

VoiceOver: Compose.

Messages. Search.

Dictate.

Conversations.

Bonnie Mosen.

This is a test from the Meta Smart Glasses.

Jonathan: So that was when I was using the Meta Smart Glasses to send her a text message. We don’t see the one that is scheduled for sending.

I’ll double tap the conversation.

VoiceOver: Messages. Your iMessage.

It’s 12 PM! Time for a break and some well-earned lunch. 12 PM.

Jonathan: So focus was placed on the message. If I flick left now, …

VoiceOver: Today, 12 PM. Edit, button.

Jonathan: I can edit that.

VoiceOver: Send later, heading level 1.

Jonathan: There’s a heading for send later. All the send later messages, if I were to send multiple ones, would be grouped under that heading.

If I want, I can delete this message. I probably will, actually.

VoiceOver: Today, 12 PM. Edit, button.

Your iMessage.

Drag.

More.

Translate.

Copy.

Delete.

Jonathan: I can double tap to delete it.

VoiceOver: Your iMessage. This is a test… Delivered, heading.

Jonathan: And now, it has gone.

I wish I'd had this feature when my kids were a bit younger, because I would have been able to schedule messages, while I was thinking about it, that said: alright, it's time to come home now, you said you'd be home by whenever o'clock [laughs], or whatever it is.

It is a very handy feature. If you're an organized kind of person and you're speaking to someone who says, hey, can you text me about this in 48 hours?, then set the message up right after you get off the call, and you won't have to worry about it anymore. Or maybe you have family members with birthdays coming up, and you tend to get a bit busy during the week. You can have a look at when those birthdays are and schedule your messages accordingly. The possibilities are endless, and it's a nice, handy little feature that they've added there.

You can now react in different ways to messages that have come in.

Here’s an iMessage from Bonnie about dog food.

VoiceOver: Bonnie Mosen.

Yes, I think it’s an automatic shipment every 6 to 8 weeks now. 11:12 AM.

[more actions available sound]

Jonathan: If I flick down, …

VoiceOver: Tap back.

Jonathan: we’ve got tap back. That is the first option.

I must confess, I wasn’t familiar with this term tap back until iOS 18. But apparently, it’s what we’ve always called the little reactions that you can send to a message. Now, there are more of them.

So if I double tap on tap back, …

VoiceOver: Tap back picker.

Heart, button.

Jonathan: That’s probably the one I use most when I’m reacting to a message from Bonnie.

VoiceOver: Thumbs up, button.

Thumbs down, button.

Haha, button.

Exclamation mark, button.

Question mark, button.

Smiling face with hearts, button.

Head shaking vertically, button.

Face with tears of joy, button.

Face with open eyes and mouth with head exploding, button.

Thinking face, button.

Add custom emoji reaction.

Jonathan: And here’s the really powerful option. You can add a custom emoji reaction. So if I double tap this, …

VoiceOver: Emoji search hidden.

Jonathan: I’m just going to flick right, …

VoiceOver: Search emoji, text field.

Jonathan: and double tap.

VoiceOver: Insertion point at end.

Jonathan: Now, since I do so much podcasting and internet radio broadcasting, I thought I might add the microphone one. So I’ll do microphone, and flick right.

VoiceOver: Clear text, button.

Microphone.

Studio microphone.

Jonathan: I like studio microphone. I’ll double tap that.

VoiceOver: Emoji.

Back, button.

Jonathan: And it has just sent that. A studio microphone reaction. Interesting. [laughs]

So if I go to another message from here, …

VoiceOver: Bonnie Mosen.

Jonathan: And now, I’ll flick down to tap back. Let’s see if that studio microphone is now on my list.

VoiceOver: Tap back picker.

Heart, button.

Thumbs up, button.

Thumbs down, button.

Haha, button.

Exclamation mark, button.

Question mark, button.

Studio microphone, button.

Jonathan: There we are. Studio microphone has now been added to the tap back, which means that you can really let your personality shine in terms of the emoji that you choose to react with.

A couple of years ago, Apple added the ability to send emergency messages via satellite, and this has literally saved lives. The feature is being rolled out progressively around the world.

Now, this has been expanded, so if you find yourself without a Wi-Fi connection or a cellular connection, you can use satellite to send an iMessage, an RCS message, or an SMS message.

Emergency Features

And that segues us nicely on to some other emergency features, as we leave the Messages app.

Emergency SOS now supports live video. That means that once the SOS feature is activated, you're able to send video, audio, and other media to emergency services.

My understanding is that emergency services have to be equipped to receive that data, though. When that's been done, a compatible emergency service dispatcher can send you a request to share either live video or a video from your camera roll.

Apple says that should make it easier and faster for you to get help. The data is sent over an encrypted connection.

I would really encourage you to become familiar with how to use the Emergency SOS feature. On an iPhone, by default, if you press the side button 5 times, you will trigger this. There are also other ways to get the Emergency SOS feature working, and it is also available on the Apple Watch. When you find yourself in need of this service, you don't want to be racking your brain thinking, how do I use this? There won't be time. So do get familiar with it. Make sure it's second nature. It is a bit scary to invoke it, but you can invoke it and then stop it before it actually places the call to emergency services. So I would practice with it, and make sure you're really comfortable with getting it up and running in case you ever need it.

Locking and Hiding Apps

Next on our iOS 18 walkthrough, we're going to take a look at locking and hiding apps. This is a new feature in iOS 18 which allows you to protect any app behind Face ID, and optionally also to hide it. There are some apps which already require biometric authentication, be that Face ID on most phones, or Touch ID on the iPhone SE. But the difference here is that this belongs to the operating system, and therefore it's available for most apps.
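For the developers among you, here's a minimal sketch of how an individual app has traditionally gated its own content behind Face ID or Touch ID, using Apple's public LocalAuthentication framework. The function name is mine and this is purely illustrative; the point of the new iOS 18 feature is that the system now does this kind of gating for any app, so the app itself doesn't have to.

import LocalAuthentication
import Foundation

// A minimal sketch of the per-app approach: the app asks the system to verify
// the device owner with Face ID or Touch ID before revealing its content.
// The function name is illustrative, not from any Apple API.
func unlockSensitiveContent(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure biometrics can actually be evaluated on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your content") { success, _ in
        DispatchQueue.main.async {
            completion(success)   // true only if Face ID or Touch ID succeeded
        }
    }
}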

Let's take a look at how this works. I thought one example to show you would be the Facebook app. Perhaps you want to protect your social media so that if your phone falls into the wrong hands, you've got another layer of protection that has to be broken through before anybody can take a look at the app.

So I’m on my home screen, and I’m going to tap…

VoiceOver: Facebook, 10 new items.

[more actions available sound]

Jonathan: Now, instead of just double tapping to open the Facebook app, I’m going to triple tap.

VoiceOver: Remove app, button.

Jonathan: And we have a little context menu that has come up pertinent to this app. I’ll flick through it.

VoiceOver: Require face ID, button.

Jonathan: And there’s the option that’s new – require face ID. If I double tap this, …

VoiceOver: Alert. Require face ID for Facebook? This app will require Face ID or your passcode to open or show content in other apps. App content will not appear in notification previews or spotlight.

Jonathan: And then if I flick right, …

VoiceOver: Require Face ID, button.

Hide and require Face ID, button.

Cancel, button.

Jonathan: Let’s go back, …

VoiceOver: Hide and require Face ID.

Require Face ID, button.

Jonathan: and I’ll double tap. I need to be looking at my phone when I do this.

VoiceOver: Alert. Face ID authenticated.

Facebook.

Messages.

Jonathan: I’m back on my home screen, and I’m at the top of it, in fact, which happens to be messages. Let’s check if this works. Not that I’m a cynic at all. I am not a cynic at all.

I’m going to tap Facebook.

VoiceOver: Facebook, 10 new items.

Jonathan: Now, I’ll double tap.

VoiceOver: Alert. Face ID required.

User authentication.

Profile picture.

Passcode field. Secure. Enter iPhone passcode.

Jonathan: A lot happened there. Because I wasn’t looking at my phone (and I deliberately chose not to look at the phone to show you what happens), it times out quite quickly. And when it times out, you will be required to enter your iPhone passcode to get into an app that is protected this way.

I’ll go back to the home screen and tap Facebook again.

VoiceOver: Facebook, 10 new items.

Jonathan: This time, I’ll make sure I’m looking at the phone when I open the app.

VoiceOver: Face ID authenticated.

Blind users of Ray-ban Meta Smart glasses.

Jonathan: And I’m back in Facebook. And what do you know? There’s another new post to the very prolific Blind Users of Ray-Ban Meta Smart Glasses Facebook group.

You’ll notice there was another option that was new here as well, and we’ll explore that. I’ll go back to the home screen.

VoiceOver: Facebook, 10 new items.

Jonathan: I’ll triple tap.

VoiceOver: Remove app, button.

Don’t require face ID, button.

Jonathan: If we double tap this, we're reversing what we just did, and Face ID won't be required to get into the app anymore. In fact, that's what I need to do to show you this next option, because right now, that option isn't available.

I will double tap while looking at the phone.

VoiceOver: Face ID authenticated.

Jonathan: Now, I’m back on the home screen.

VoiceOver: Facebook, 10 new items.

Jonathan: I’ll triple tap again.

VoiceOver: Remove app, button.

Require Face ID, button.

Share app, button.

Jonathan: Now, that would appear to be a bug at the time that I'm recording this, because the other option hasn't come back. So hopefully, by release time, this one's solved. But these two options are related.

We've already had a look at the require Face ID option. There is another option, and we heard this when we first looked at the menu, which was hide and require Face ID. When you do this, your app will disappear from the home screen. There will be very few traces on your phone that this app exists on your system. If you know it's there, you will be able to launch it from the app library, if it's accessible. At the time of recording, which is a few weeks before you get iOS 18 (so this may be fixed, and I hope it is), I was not able to find a folder that gets created for all iOS 18 users, called Hidden. This folder exists in your app library whether you have any hidden apps or not.

That’s a very good decision on Apple’s part because it means that someone can’t take a quick peek at your app library, see there’s a hidden folder, and say aha! You’re hiding apps. What are you hiding? Everybody’s got a hidden folder. It may be empty, or it may not be. Anyone who’s not authorized to know these things can’t tell because whenever you go into the hidden folder, you need to authenticate with biometric authentication.

Now, that's all very nice. But at the moment, as I'm recording this, the hidden folder is not accessible. I couldn't locate it anywhere in the app library.

A reminder that to get to the app library, you swipe with 3 fingers to the left past the pages of your home screen. And when you’ve got past the home screen, you’ll find the app library there. Hopefully, this is one of those things that will be corrected.

So if you want to hide an app, that’s how to do it. Just be very sure that it is accessible before you do it because when I was experimenting, all for the good of science and for this review, I did hide an app. I couldn’t get it back because the hidden folder wasn’t accessible to me.

The only way I could get the app back was to uninstall it. I had to go into Settings, then General, then iPhone Storage, find the app in the list, and delete it that way. So it was quite a process to get the app back. But if you have apps that you would rather the world didn't see that you have, this is how to hide them in iOS 18.

Apple Music

A few wee nuggets to tell you about Apple Music.

It no longer requires everybody to have an Apple Music subscription if they want to use SharePlay to collaborate on the music queue. So you've got a bunch of people over, all AirPlaying into a speaker, and you're in party mode, and everybody's contributing groovy tunes. Everybody can do that now, regardless of whether they have an Apple Music subscription.

Advertisement: Thanks to Aira for sponsoring Living Blindfully.

Aira may be available on more devices than you realize. Trained agents are available on the devices that you use every day. For example, you can use the Explorer app on your Android or Apple smartphone.

The apps always supported the rear camera, of course. But the latest version of the apps also support the front-facing camera. And that means you can even get help to take your next great selfie for uploading to social media.

Regular Living Blindfully listeners will be very familiar with the Envision smart glasses. If you own a pair of those, you can use Aira hands-free to assist with a variety of tasks, including guiding you through unfamiliar locations. If you’re doing a bit of travel again, it’s a great alternative to waiting for meet and assist at an airport.

Aira is also on the BlindShell Classic, so you’ve got access to your phone at the touch of a physical button. And it’s on your PC and Mac as well, which makes it always within reach when you come across an inaccessible website, or you just want to speed up the process when you’re on a busy website and time is of the essence. Aira is there on your device, on your terms.

To learn more, you can visit Aira at their website at Aira.io. That’s A-I-R-A.I-O.

The Passwords App

Next, we're going to talk about a brand new app that's been added to iOS 18, and this is the Passwords app. It doesn't offer a lot of new functionality, although there is some. But the primary purpose of the Passwords app is to make iCloud Keychain more intuitive and easier to use. It's no longer hidden away in Settings.

If you don’t have a password manager up and running yet, then this might be the incentive you need to get one up and running. I cannot emphasize enough how important it is to take your cybersecurity seriously. You may think that nobody cares and that you are just a mere crinkle on the potato chip of life. That will be no consolation if somebody steals your identity, or does terrible things with your credit card because they’ve managed to break their way in.

So in order of preference, if the site that you are visiting supports the new passkey technology, which we’ll come back to in a bit, use that. It’s the most secure option.

Next, use a unique password for every website that you visit. And if possible where 2-factor authentication is available for that website, switch that on as well.

I don’t think that I will ever use this app myself. I mean, never say never, but I am a huge fan of 1Password.

It is true that when they switched to an Electron-based app, they did go through a very rough accessibility patch for a while. That's behind us now.

1Password is a fantastic, powerful app. I like the fact that it’s cross-platform, and that it’s fully accessible on all platforms.

Technical support works brilliantly.

It does support new things such as passkeys, as well as using it for 2-factor authentication if you want to simplify getting into a website. I love 1Password.

Now at the time that I’m recording this, a little bit before iOS 18 goes live, there isn’t the feature in the Passwords app that allows you to import from apps like 1Password. But I understand that it’s coming. So by the time iOS 18 comes out, there may well be an import option available to you.

1Password is very transparent. It allows you to export all your data. It is yours after all. That’s the sign of a well-behaved application, that they let you get data out of it so you can take it somewhere else. Even Apple doesn’t do that in all cases.

For example, you can’t export your Apple Podcasts as an OPML file so that you can import it into another app like Castro or Overcast, but they’re apparently offering an import option.

If you do this and you use a Windows computer, then you will want to get iCloud Keychain working with your Windows computer. Now, that iCloud app is not the most accessible in the world, but the way you do this is to go into the iCloud control panel, choose the Passwords option, and enable it. When you've done that, you'll want to install the browser extension for every browser that you use on your Windows computer. Apple does support a wide range of Windows browsers, so it should be easy enough to install the appropriate extension for your browser.

I cannot tell you any more about this. I do a lot of things in the name of Living Blindfully, but I'm not going to switch off my 1Password and install this thing, so I don't know how accessible it is or how well it works in practice.

If you're not using a password manager at the moment (and I can't stress enough how important that is), then yeah, sure, give this a go if you're interested. Let me know how you get on with it, how accessible it is, and how easy it is to use.

My ability to demonstrate too much of this app is somewhat limited, because any passwords that I have in iCloud Keychain are now very old. 1Password manages all my passwords on my Apple devices as well. But let’s take a look at this.

I’m going to open the Passwords app now.

Open passwords.

VoiceOver: Alert. Face ID. User authentication.

Face ID authenticated.

Toolbar. New password, button.

Jonathan: I needed to authenticate with face ID to get into the Passwords app. You have to do that. That’s an obviously sensible security precaution that Apple have implemented there.

When you go into this app, you'll land immediately on the New Password button. If I double tap, that will show you what's here.

VoiceOver: New password, text field. Is editing. Website, app, or label. Character mode. Insertion point at start.

Jonathan: If you’re doing this manually, this is an important step because Apple needs to know what website or app this password pertains to, so that when you’re on the website or the app, it knows that that’s the password you will want when there’s a password field.

We’ll flick right and see what other fields exist.

VoiceOver: Username. User, text field.

Password.

Jonathan: Apple has suggested a password. It’s a strong one. It’s got a good combination of upper and lowercase letters and other characters. So if you were setting this up from scratch, you’d probably want to leave this as is.

However, if you are entering data from an account that’s already established on a website, obviously, you’ll need to enter the password that already exists pertaining to that account.

VoiceOver: Notes, heading.

Add notes, text field.

Jonathan: So as you can hear, if you're a 1Password user, it's much more basic than 1Password, but it has the essentials here.

I’ll go back to the top.

VoiceOver: Cancel, button.

Jonathan: And we’ll double tap that.

VoiceOver: Passwords, heading.

Jonathan: If I go back to the bottom, …

VoiceOver: Toolbar.

New password, button.

Jonathan: and flick left, …

VoiceOver: New group, button.

Jonathan: You can also add a new group. Let’s see what’s in here.

VoiceOver: New group.

Add trusted contacts. Create a group to share passwords and passkeys across their devices.

Choose what to share. Each member of the group decides which passwords they share and can delete them at any time.

You’re in control. The person who creates the group can add or remove others.

Continue, button.

Text field. Is editing. Name. Character mode. Insertion point at start.

Jonathan: Now, we can fill in the details for this group.

Let’s just talk about why you might want to do this. Why would you want to share passwords with people?

There are certain things like grocery shopping websites, streaming media services, and any number of things that might be a bit more communal where the temptation is to enter a simple password that everybody remembers, or that someone’s written down somewhere in an insecure way. That is not a good idea in this age in which we live. So if you can get a good, strong password and then share it with everybody who needs to have it, that is a much better option.

This way, you can create multiple groups. Each group can have their own members, and you can assign passwords to those groups. This is very similar to the vaults in 1Password.

For example, I have my own 1Password vault. But Bonnie and I have a shared one. And we have yet another one, which is shared with our wider family with certain passwords in there.

Of course, the big difference is that with 1Password, you can share much more than passwords. You can share secure notes, important documents, a wide range of things. But this is a good start.

So this is the name of the group.

I’ll flick right.

VoiceOver: Jonathan Mosen. Owner, button.

Add people, button. To invite someone to the group, they must be in your contacts.

Jonathan: And that’s essentially it.

It's good that this is not just limited to Family Sharing. So you might be able to share passwords, say, with a group that you're a part of, some sort of voluntary organization, and those passwords will be available across everyone's devices.

Of course, the big caveat is that everybody would have to opt into using this app or iCloud Keychain in some form.

I’ll cancel this.

VoiceOver: Cancel, button.

Jonathan: At the top of the passwords app, which is, by the way, installed by default with all installations of iOS 18, there is a search field. So if you’re looking for a particular app or a website so you can view the password associated with it, it’s easy to type that into the search field at the top of the app.

You can use this new Passwords app to provide 2-factor authentication codes for sites that make additional security options available to you. If you want to set up 2-factor authentication (and as I say, I highly recommend doing this everywhere 2-factor authentication is available), you choose the Codes section and double tap the Add button. From there, the process is very similar to one that you might have used in a stand-alone authenticator app like Microsoft Authenticator or Google Authenticator.

You can either scan a QR code with your camera (That’s pretty common. I often go to a website on my PC to set up 2-factor authentication, and then I scan the code with my camera, so that works just fine in this app.), or you can type in a setup key or paste it in from a website. They tend to be quite lengthy.

Once you've got the code stored in your Passwords app, just go back to the Codes section, and you'll find that all the websites and apps that you've added will be listed there with current codes that you can enter. It's not quite as seamless as using 1Password, but the price is right, of course.
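If you're curious about what's actually inside those 6-digit codes, they're time-based one-time passwords (TOTP), a published standard that every authenticator follows. Here's a rough Swift sketch of the calculation, offered purely as an illustration of the algorithm rather than anything Apple ships, and it assumes the shared secret you scanned or pasted in has already been decoded from Base32 into raw bytes.

import CryptoKit
import Foundation

// Conceptual sketch of TOTP (RFC 6238): HMAC-SHA1 over a 30-second time counter,
// then dynamic truncation down to a 6-digit code. Assumes `secret` is the raw,
// already Base32-decoded shared secret from the website.
func totpCode(secret: Data, at date: Date = Date(), digits: Int = 6, period: TimeInterval = 30) -> String {
    // 8-byte big-endian counter: the number of 30-second periods since the Unix epoch.
    var counter = UInt64(date.timeIntervalSince1970 / period).bigEndian
    let counterData = Data(bytes: &counter, count: MemoryLayout<UInt64>.size)

    // HMAC-SHA1 of the counter, keyed with the shared secret.
    let mac = Array(HMAC<Insecure.SHA1>.authenticationCode(for: counterData, using: SymmetricKey(data: secret)))

    // Dynamic truncation: the low 4 bits of the last byte pick an offset into the MAC.
    let offset = Int(mac[mac.count - 1] & 0x0f)
    let truncated = (UInt32(mac[offset] & 0x7f) << 24)
        | (UInt32(mac[offset + 1]) << 16)
        | (UInt32(mac[offset + 2]) << 8)
        | UInt32(mac[offset + 3])

    // Reduce to the requested number of digits, padding with leading zeros.
    let code = truncated % UInt32(pow(10, Double(digits)))
    return String(format: "%0\(digits)d", code)
}

Because both sides derive the code from the same secret and the same clock, the code your authenticator shows matches the one the website expects.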

Now, passwords are a big problem. They're on the way out, and websites have started transitioning from passwords to passkeys. It's good to see that Apple's Passwords app also supports passkeys. These are much more secure than passwords, and they let you log into your account just by using Face ID or Touch ID, whatever biometric authentication you have on your Apple device.

Passkeys use a cryptographic key pair: a public key, which is stored on the website's server, and a private key, which is stored on your computer or other device. The private keys are never shared; they remain on your device only, and they can't be sent to anyone else. That protects your account from all sorts of mean and nasty things going on.
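To make that key-pair idea concrete, here's a tiny sketch using Apple's CryptoKit framework rather than the real WebAuthn plumbing the Passwords app uses. It only illustrates the principle: the server keeps the public key and sends a random challenge, and your device proves it holds the private key by signing that challenge, without the private key ever leaving the device.

import CryptoKit
import Foundation

// Conceptual sketch of the key-pair idea behind passkeys (not real WebAuthn code).

// Registration: the device generates a key pair and sends ONLY the public key to the server.
let privateKey = P256.Signing.PrivateKey()   // stays on the device
let publicKey = privateKey.publicKey         // stored by the website

// Sign-in: the server sends a random challenge...
let challenge = Data((0..<32).map { _ in UInt8.random(in: 0...255) })

// ...the device signs it with the private key (in reality, gated behind Face ID or Touch ID)...
let signature = try! privateKey.signature(for: challenge)   // force-try just to keep the sketch short

// ...and the server verifies the signature with the public key it stored at registration.
print(publicKey.isValidSignature(signature, for: challenge))   // true, and the private key never left the device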

You can see your stored passkeys in the passkeys section of the Passwords app.

So when you've got all this set up, how exactly does it work? Well, this depends on whether you've got AutoFill enabled on your device. If you have, then when you go to a web page or an app where credentials from iCloud Keychain are available, and you're using an old-fashioned username and password, you'll get a little pop-up asking whether you want to log in with those details. You can answer yes, and that's all there is to it. It doesn't matter how obscure or complicated your password is for the site. Hopefully, you don't ever have to remember any unique passwords for websites again, because they're all stored in your Passwords app.

Another really nice feature about this Passwords app is that it makes it very easy to take a look at all the Wi-Fi network credentials you have saved over the years. And I must admit, when I was looking at this in preparation for putting this little tutorial together, it really took me down memory lane because I had all sorts of passwords for places that I visited.

For example, when Bonnie, Nicola and I were on holiday in Europe a couple of years ago, I picked up credentials for all sorts of interesting places that I'd long since forgotten I'd visited. There's no real need to, but if you want to, you can clean these up and have fewer Wi-Fi credentials in there. So that's a nice touch.

If you’ve been practicing bad password hygiene for yonks, in other words, you’ve entered the same password on a lot of sites, then there is a dedicated security section in the Passwords app that you should take interest in. It’ll let you know if you have passwords that have been reused. In other words, used on multiple websites. Your objective here is to get that list to 0. You want nothing showing up in the reused section because best practice is to have a unique password for each website or app that requires you to log in.
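Conceptually, that reused-passwords check is nothing more than grouping your saved logins by their password and flagging any password that's attached to more than one site. Here's a small illustrative sketch of the idea; the Login type is hypothetical, and this is not how Apple implements it.

import Foundation

// Illustrative sketch of a "reused passwords" check: group saved logins by
// password and flag any password used for more than one site.
// The Login type is hypothetical, not part of any Apple API.
struct Login {
    let site: String
    let password: String
}

func reusedPasswords(in logins: [Login]) -> [[Login]] {
    let grouped = Dictionary(grouping: logins, by: { $0.password })
    return grouped.values.filter { $0.count > 1 }
}

let sample = [
    Login(site: "example.com", password: "hunter2"),
    Login(site: "shop.example.net", password: "hunter2"),
    Login(site: "bank.example.org", password: "x9!Lq#24vT")
]

for group in reusedPasswords(in: sample) {
    print("Reused on:", group.map(\.site).joined(separator: ", "))
}
// Prints: Reused on: example.com, shop.example.net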

You also have another section that will show you when passwords are too weak or, most important of all, when they've been compromised in a data leak. If you see any passwords listed there that have been compromised in a data leak, you need to treat that with the utmost seriousness. Somebody might have bought that data dump from the dark web somewhere, and could be working through it right now, trying to log in as you. And if you don't have any other protection on those accounts, such as 2-factor authentication, you really can have a mess on your hands. So check that security section carefully.

So that’s a brief look at the Passwords app in iOS 18. It’s also on all the other Apple devices.

And by the way, your passwords are synced. If you’re logged into an Apple device with your Apple ID, all the passwords will be available on all the devices that you have.

Gaming Mode

Now, let’s talk briefly about something called gaming mode. I know this interested people when it came up at the Worldwide Developers’ Conference because there was discussion about lower latency with AirPods.

People thought, that sounds great. How do we enable this?

Well unfortunately, the answer is you don’t. Game developers do.

In iOS 18, your iPhone can automatically enter a special mode called Game Mode whenever there's serious gaming going on.

It will do 3 things when it's enabled. First, it'll minimize your background activity. The objective here is to give you consistently high frame rates, even if you've been playing for hours and hours. You might be engrossed. Second, it'll massively reduce your latency when using a Bluetooth game controller. And third, it will similarly reduce latency when you're using AirPods for audio.

Now, there is unfortunately no way to turn this on yourself. There’s no special toggle or anything like that. It’s triggered by the game developer.

It'll be really interesting to see what, if any, effect this has on VoiceOver when some of the accessible games that qualify to use Game Mode enable it.

Battery Care

Now, we’re going to talk about more battery saving options. I believe in iOS 17, this may have been limited to the iPhone 15 Pro. I could be wrong about that. But one option that does exist for me as an iPhone 15 Pro Max user is that I can choose to have the battery charging stop hard at 80%.

Why would you want to do that? Why would you choose not to have your battery fully charged?

Because when you charge a battery to 100% and let it discharge all the way, you are actually shortening the life of your battery. The optimal thing to do is to keep it charged at around 80% and to not let it go any lower than 20%. That’s a surefire way to make sure that you’re looking after your battery and keeping it in good health.

So when I'm at home and I'm not anticipating going out for an extended period, I always have this on. It is my default state for the battery. If I know I'm going out to a meeting and I need as much juice as I can get, then I will set it to 100% and fully charge it. But I don't do that all the time.

Let’s take a look at some of the options that are now available in iOS 18. We’ll go back into settings.

Open Settings.

VoiceOver: Settings.

Jonathan: The first thing you might notice is that Battery has a much more prominent place in iOS 18 than it formerly did.

I’ll flick right from the top.

VoiceOver: Search.

Dictate, button.

Personal hotspot, off.

Battery, button.

Jonathan: And there’s Battery right there. I’ll double tap.

VoiceOver: Battery percentage. Switch button, off.

Jonathan: And the one we want, I think if I tap about here, …

VoiceOver: Charging, button.

Jonathan: There we go. We’ve got the charging button. Let’s have a look at what’s here.

I’ll flick right.

VoiceOver: iPhone learns from your charging and usage habits to help preserve battery life span over time.

Learn more.

Jonathan: I’ll flick right.

VoiceOver: Charge limit, heading.

80%.

85%.

90%.

95%.

100%.

Jonathan: And that says it all, really. You can set your charge limit anywhere between 80% and 100%.

If I flick right, we get a really nice picker.

VoiceOver: 80. Adjustable.

Jonathan: And all I have to do is flick up and down to adjust the value.

You’ll notice that when we have a charge limit other than 100%, …

VoiceOver: Optimized battery charging. Dimmed. Switch button, off.

Jonathan: the optimized battery option is dimmed because you’ve essentially taken control of this yourself. Apple’s not learning about how you use your device because you’ve imposed a charging limit.

VoiceOver: When the charge limit is set to 100%, allow iPhone to wait to finish charging past 80% until the time you need to use it.

Jonathan: There’s another option that’s new here, too.

VoiceOver: Clean energy charging. Switch button, on.

Jonathan: Let’s get the explanation.

VoiceOver: In your region, iPhone will try to reduce your carbon footprint by selectively charging when lower carbon emission electricity is available. iPhone learns from your daily charging routine, so it can reach full charge before you need to use it.

Learn more.

Jonathan: And another useful feature is that when you are in the battery section of settings, you get an indication if a slow charger is being used.

These days, the iPhone can charge quite fast. But if you’ve got an older charger that’s not putting out sufficient wattage, then you’ll get an alert about that in case you have a better accessory around.

That’s what’s new with battery in iOS 18.

Phone

Let’s talk about the phone app. Yes, I know it may come as a surprise to you, but the iPhone does make phone calls.

We're going to start with something really retro, dude. I was very surprised to see this popping up in iOS 18, but I'm actually quite delighted, and I find myself using this all the time when I don't have some sort of external device with me and I want to call somebody relatively quickly, and maybe Siri isn't an appropriate option. This is the ability to dial up a contact using T9, of all things.

We’ve probably got a whole generation of people who are saying, what on earth is T9? But if you’ve been using smartphones for a long time pre-iPhone, you’ll know all about T9. This is a technology that used to be in common usage on phones that had nothing but a number pad so that you could send text messages, email, and even write documents quite quickly.

I gotta say, I was a T9 ninja. I could write documents extremely quickly with T9. And when the first accessible iPhone came out, the only options were the built-in on-screen keyboard, with split tapping and double tapping. That's all we had. I lamented the demise of T9 because I was so much faster with T9 than either of those methods.

Well, you haven’t got T9 for everything. But as I say, you have now got it in the phone app for dialing up a contact.

Let’s see how this works. We’ll go back home.

VoiceOver: Home.

Messages.

Jonathan: And…

VoiceOver: Dock.

Phone, 7 new items.

Jonathan: There’s the phone in my dock. I’ll double tap it.

VoiceOver: Search, search field.

Jonathan: And what I want to do now is go to the keypad.

VoiceOver: Tab bar. Voicemail.

Keypad, tab.

Jonathan: I’ll double tap.

VoiceOver: Selected. Keypad, tab. 4 of 5.

Jonathan: Now to use T9 effectively, you have to be familiar with the concept of word numbers. If you’re in the United States, New Zealand, and certain other countries where word numbers quite often come up in ads, this will be second nature. ABC is the number 2, DEF is the number 3, and so on.

I want to call Bonnie. Now, in a quiet environment like this where I’m not disturbing anybody, it’s still much more efficient for me to tell Siri to call Bonnie. But as I say, there may be times where that’s not a good option. So to call Bonnie, I could just dial 2 6 6, which is the equivalent of B-O-N.

Now, I’m going to go to the top of the screen, …

VoiceOver: Phone number. 266, text field.

Jonathan: and flick right.

VoiceOver: Bonnie Mosen. +64…

Jonathan: And there she is. Bonnie’s right there, along with a whole bunch of other people who match the 266 string.

I can just double tap and call her.

Really cool that Apple’s actually put this in here.
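If you're wondering how B-O-N turned into 266, the mapping is just the letters printed on a traditional phone keypad. Here's a quick, purely illustrative Swift sketch of that letter-to-digit conversion; it shows the concept, not Apple's actual matching code.

// Conceptual sketch of the T9 keypad mapping: each letter maps to the digit
// that carries it on a phone keypad, so "Bon" becomes "266".
let t9Groups: [Character: String] = [
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"
]

func t9Digits(for name: String) -> String {
    return String(name.lowercased().compactMap { letter in
        t9Groups.first(where: { $0.value.contains(letter) })?.key
    })
}

print(t9Digits(for: "Bon"))     // "266"
print(t9Digits(for: "Mosen"))   // "66736"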

Call recording is coming as well, but that’s an iOS 18.1 feature.

Another one where we can say hooray and long overdue at the same time is a more searchable call history. To use this, …

VoiceOver: Recents. 7 items, tab. 2 of 5.

Jonathan: we’ll double tap the Recents tab.

VoiceOver: Selected. Recents. 7 items, tab.

Search your call history. Type a name, keyword, or date to find recent calls, voicemails, and contacts., button.

Jonathan: This is pretty self-explanatory. I can type the name of a person who’s called me, or a phone number, and find out when I last talked to them. Jolly good reason not to clear your call history now.

Wallet

Now, let's talk about a change to the Wallet app that sounds very nice. This pertains to Apple Cash. Apple Cash at the moment is only available in the United States. This new feature is called person-to-person cash transfer. It's also been referred to as Tap to Cash.

You know, for example, that if you want to perform a transfer using AirDrop, you can bring two devices close together and it just magically happens? Well now, you can do this to transfer money from one person to another. So if you go to a restaurant, for example, and you want to pay your part of the bill, you can use the Tap to Cash feature, and Apple Cash is transferred from your account to the person whose phone you tapped. It's all just seamless.

AirPods

There are some changes with AirPods to report in iOS 18.

First of all, there's Voice Isolation, which improves the voice quality for someone who's listening to you through your AirPods. It eliminates background noise, and also that awful blowing sound that you get when wind blows into a microphone, so that's a useful feature.

There are some new ways for AirPods users to interact with Siri as well, because it's not always convenient to talk to Siri. Now, you can nod your head yes or shake your head no to respond to Siri, and the AirPods will know what that's about.

TV

And moving on to a few more little odds and ends, let’s talk about the dedicated Apple TV app.

It supports Insights. This is a feature that provides more information for you about the music and actors in a particular TV show or movie that you’re watching.

You can get Insights when you’re using your iPhone as a remote for the Apple TV 4K, but it is limited to Apple TV Plus shows.

Another new change. You might have seen Enhanced Dialog popping up on your Apple TV, which just makes the dialog a bit crisper and clearer. It’s been on the Apple TV 4K for a while. And now, it’s been added to the iPhone when you’re playing something through the built-in speakers, or AirPods, or other speakers and headphones.

Enhanced Dialog uses machine learning, they tell us, and computational audio to boost vocal clarity over the noise of music and background sounds in a show or movie.

Home

Moving on to the home features.

For those of us who are living the smart home dream, the Home app supports new guest features. When people come to stay, you can give them temporary access to your smart locks, garage door openers, and alarm systems at select times. It could be useful not only for people coming to visit, but also if you have a contractor who always comes at a specific time, such as a cleaner, for example. You can set this on a recurring schedule. So if your cleaner comes every Wednesday at 9 AM, you can provide them with access just for that 2-hour period, say.

Calculator

Talking about the calculator, the Calculator app now stores history, so you're able to see all your prior calculations.

You've got the basic and scientific sections as we've always had, but Apple has now also added a Math Notes section to the Calculator app that ties in with the Notes app for solving equations in a note.

Calendar

Now, let’s talk about the calendar app.

This is one of those features where we say good on you, Apple, for getting it done, but it’s been available in third-party apps for a long time. I’m referring to the integration of reminders in the Calendar app.

The distinction between what you should make an appointment and what you should make a reminder is sometimes somewhat arbitrary. So while the Reminders app in iOS 18 is still exclusively for reminders, you can now view your appointments and your reminders together in the Calendar app. This is the way that apps like Fantastical, which we've reviewed here on Living Blindfully, have worked for yonks.
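For those curious about how third-party apps have done this merging, here's a rough sketch using Apple's public EventKit framework, which keeps events and reminders in separate stores. It pulls today's items from both and combines them into one chronological list, roughly the way a unified calendar view might. It assumes calendar and reminders access has already been granted, and it isn't how Apple's own Calendar app is implemented.

import EventKit

// Rough sketch: fetch today's events and reminders from their separate stores
// and merge them into one chronological list, the way a combined calendar view might.
// Assumes calendar and reminders access has already been granted.
let store = EKEventStore()
let startOfDay = Calendar.current.startOfDay(for: Date())
let endOfDay = Calendar.current.date(byAdding: .day, value: 1, to: startOfDay)!

// Events are fetched synchronously with a predicate...
let eventPredicate = store.predicateForEvents(withStart: startOfDay, end: endOfDay, calendars: nil)
var items: [(title: String, date: Date)] = []
for event in store.events(matching: eventPredicate) {
    items.append((title: event.title ?? "Untitled event", date: event.startDate))
}

// ...while reminders come back asynchronously via a completion handler.
let reminderPredicate = store.predicateForIncompleteReminders(withDueDateStarting: startOfDay,
                                                              ending: endOfDay,
                                                              calendars: nil)
store.fetchReminders(matching: reminderPredicate) { reminders in
    for reminder in reminders ?? [] {
        if let due = reminder.dueDateComponents?.date {
            items.append((title: reminder.title ?? "Untitled reminder", date: due))
        }
    }
    // One merged, chronological list of appointments and reminders.
    for item in items.sorted(by: { $0.date < $1.date }) {
        print(item.date, item.title)
    }
}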

Let’s go into the calendar app and have a look.

Open calendar.

VoiceOver: Calendar.

Jonathan: And on the top of the screen,…

VoiceOver: Search, button.

Jonathan: we’ve got the search button.

But if I flick right, …

VoiceOver: Add, button.

Jonathan: and I’ll double tap.

VoiceOver: Text field. Is editing. Title. Character mode. Insertion point at start.

Jonathan: Let’s go to the top of the screen.

VoiceOver: Cancel, button.

New, heading.

Add. Dimmed, button.

Selected. Event, button. 1 of 2.

Jonathan: This is the default behavior, what we’ve had for years. When you double tap the add button, you’re adding an appointment.

But if we flick right, …

VoiceOver: Reminder, button.

Jonathan: I’ll double tap that. And now, let’s have a look at what’s here.

VoiceOver: Title, text field. Is editing. Insertion point at start. Notes, text field.

Date. Wednesday, August 14, 2024.

On, collapsed. Time. 3 PM. On, collapsed.

Repeat. Never, button.

List reminders, list badge.

Details, button.

Quick bar.

Selected. Date and time, button.

Location, button.

Tag, button.

Flagged, button.

Photos, button.

Jonathan: I want to go into details, …

VoiceOver: Details, button.

Jonathan: because there’s a really cool feature here.

VoiceOver: Early reminder, none.

Tags, button.

Location. Off, collapsed.

When messaging. Off, collapsed.

Jonathan: There’s an explanation of what this does.

VoiceOver: Selecting this option will show the reminder notification when chatting with a person in messages.

Jonathan: So if I flick left, …

VoiceOver: When messaging. Off, collapsed.

Jonathan: and double tap, …

VoiceOver: When messaging, more options shown.

Choose person.

Jonathan: So if I double tap choose person, it will bring up my contacts and I can choose, for example, Bonnie. If we’ve got some sort of joint reminder that one of us can be responsible for, like buying some more milk or something mundane like that but important, then it will pop up next time we’re messaging. I really like this feature.

So that’s calendar in iOS 18.

Podcasts

If you are a user of Apple’s podcasts app, there are some changes to report there.

It shows chapter segments when you scrub through an episode, so it's easy to skip around.

Conclusion

And we’ve probably only just scratched the surface. I’m sure there’ll be people who’ve been testing iOS 18 who will say, you’ve left out my favorite feature.

As you can hear, it is quite a substantial release. We’ve been going for well over 2 hours.

And sure, [laughs] there still are a few things to say. But I hope that gives you some guidance as to some of the new features that you can expect in iOS 18.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions.com.

Closing

Whew! I think I needed a little lie down after all of that.

But we’ll be back soon enough for episode 301, and special coverage of the Apple event. I hope you’ll join us for that.

In the meantime, when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]