Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


Contents

Welcome to 270

Jonathan Mosen Speaks With Microsoft’s Chief Accessibility Officer, Jenny Lay-Flurrie

Accessibility is a Fundamental Human Right

You May Not Know Texts Are Being Truncated, and General Siri Thoughts

Impressed With My Chromebook

Questions About Sonos and Dolby Atmos

Lectrote Interactive Fiction Interpreter

Transporting and Caring for Kids

Blind People and Video Editing

The Good and Bad of the New Stuff Website

Everything You Ever Wanted to Know About Matrix

Make Windows 10 Free for Blind People

Where Is the NFB Newsline Android App?

What Laptop Keyboard Meets My Needs

Closing and Contact Info


Welcome to 270

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

On the show this week: a global leader who has significant influence on the technology many of us use daily; I’m joined by Microsoft’s Chief Accessibility Officer, Jenny Lay-Flurrie, remaining steadfast on accessibility being a fundamental human right, and welcome to The Matrix.

We’ve reached the 270th episode of this podcast. Whether you’ve been with us all along on this journey, or you’ve found us along the way, or this is your first time listening, well it’s great to have made it to episode 270.

And area code 270 in the North American numbering plan belongs to bits of Western Kentucky. This is the bit where I can go, …

[Jonathan sings with reverb effect]

“All the sun shines bright on my old Kentucky home.”

I think I can do that because I think it’s out of copyright by now, anyway.

Bowling Green, Kentucky is one of the cities covered by area code 270, and that is just such an epic name for a city. I mean last week, we had Kalamazoo. This week, we’ve got Bowling Green. Who knows what’ll come in episode 271?

But if you are in Western Kentucky and you’re listening to us from area code 270, a special welcome to you. Enjoy your day in the Bluegrass State with the warm spring Kentucky sun shining on you.

Now, country code 270 doesn’t actually exist because South Africa has got so many telephones that they’ve just got a 2-digit country code, and that is country code 27.

But since we weren’t going through numbering plans in episode 27 of this podcast, we will salute South Africa in country code 27.

We are rivals on the sporting field in both rugby and cricket, and probably other sports as well. Cricket is my thing, and so I’m delighted to just gently point out that we defeated South Africa 2-0 in the recent cricket test series here in New Zealand. That was an epic result from my perspective.

So if you’re listening in South Africa (and I know there are people listening in South Africa because we get some great contributions from time to time from that part of the world, and there are 61 million people there), well, a very warm welcome to you.

Advertisement: My sincere thanks to Pneuma Solutions for making transcripts of Living Blindfully possible.

RIM is free for all to use for up to 30 minutes every day. Need more time? Well, there’s a range of monthly price plans available, and you can buy a RIM pass to assist one person as much as you like for a 24-hour period.

And there’s yet another way to use RIM. The RIM Community Support Package is designed for those who occasionally need to provide or receive remote technical support, but don’t want a Pro plan. With the RIM Community Support Package, you get an additional 3 hours every single day. That’s on top of the 30 minutes RIM gives everyone free.

Check out the RIM Community Support Package. Use it on your own PC, or Mac, or with anyone else’s PC or Mac. getRIM.app. That’s G-E-T-R-I-M.app.

[music]

Jonathan Mosen Speaks With Microsoft’s Chief Accessibility Officer, Jenny Lay-Flurrie

For many of us, Microsoft plays a key part in our productivity at the office, our access to information, and our general ability to use today’s technology effectively.

Those of us who’ve been assistive technology users for a while will know that Microsoft’s approach to accessibility has evolved over the years. First, it took a supportive interest in the hackery necessary to give screen readers access to Microsoft’s operating systems. Eventually, Microsoft created official ways for third-party developers to obtain the access they needed. And while it continues to do all those things, today, Microsoft offers powerful accessibility tools of its own in Windows and accessibility is in the company’s DNA.

Championing Microsoft’s accessibility efforts is Jenny Lay-Flurrie, Microsoft’s Chief Accessibility Officer.

Now before we get started, I should mention that in the interests of harmony, I am avoiding the most controversial topic because Jenny for some reason thinks that British Marmite tastes better than New Zealand Marmite, and I just don’t know what we can do about that.

So welcome to Living Blindfully, Jenny. It’s delightful to have you here.

Jenny: [laughs] Oh Jonathan, you make me laugh right off the bat. You know, that is the most controversial topic that we should avoid. What do you think you call it? Sanitarium Marmite versus …

Jonathan: Cemeterium, yeah. [laughs]

Jenny: Versus the real thing, which is of course, marmite from the UK.

But it’s a pleasure to see you, and it’s a pleasure to be on the call with you.

Jonathan: See, now that we’ve dealt with the elephant in the room, it’s all downhill from here. [laughs]

So before we get to the technology, I am interested in learning more about Jenny the person. Can you tell me a bit about the journey that ultimately saw you become the Chief Accessibility Officer at Microsoft?

Jenny: Oh my gosh! Well, I hail from the middle of England. So I grew up with parents who were both teachers, went to a classic school, a public school, and ended up at music college. And my big goal in life was actually to be a classical clarinetist, before I realised I really wasn’t very good and that wasn’t going to come to pass, [laughs] although I loved studying it at college. It was some of the best years of my life.

But I ended up getting a job at the Daily Mirror after that, on the IT help desk. And that really was, I would say, a big pivot for me into IT, which is where I’d been ever since. And so I spent a good amount of time in the UK and Europe, early in my career. T-Mobile for a while, and then landed at Microsoft.

Oh gosh! I’m in my 20th year, Jonathan. I joined to work on Hotmail in London and across Europe with an amazing gang of folks, then worked in advertising for 7 years.

But really, my, the night job, as I called it back then, was where my passion lay, which was working on disability inclusion and accessibility.

And the untold story (or maybe it’s very much out there now) is that I grew up hard of hearing. My hearing has been declining ever since. I’ve since acquired other disabilities, as is the nature of the beast.

And I came to Microsoft not really sharing the true extent of my deafness, but joined the deaf community. Then, I joined the VIP community here (the visually impaired persons. I love that the acronym is VIP), and every other one, and ended up building the disability employee group here at Microsoft, which has been an incredible journey. Really powerful community here. And then ultimately, ending up as CAO in 2016.

I’m forever learning. It is a continual journey. And I think, the best part is the people that we work with. And of course, you being one of them. It does take a village. Accessibility takes a village always.

Jonathan: It certainly does.

It’s good now, isn’t it, that people feel more positive about disclosing. That’s not to say that we’ve reached nirvana yet, but I think people are now more willing to disclose than they’ve ever been, and say to an employer, “Look, this is what you can do to get the best out of me. These are the accommodations I require.”

Jenny: Yeah, for sure. But I still think there’s a long way to go.

I mean, I think my personal journey was one of very much hiding it for some time which, by the way, I needed to do. It was a survival tactic. Not many people wanted to hire a deaf musician.

But I don’t recommend that. I say that with the benefit of hindsight. I think if I’d been more forward in my identity, which is what I recommend now… I do think that it’s so empowering to ask for what you need to be successful, and to use your expertise and your talent as a disabled person.

And that’s really where we are, I would say, in our journey. Accessibility is a cultural discipline. We have 220,000 employees here globally. We, like many other companies, are growing and retracting all at the same time.

But we need people with disabilities whether you are in the gang, new to the gang. You know, most people acquire disability through their life. Come as you are. Do what you love, and ask for what you need to be successful.

And I think we all know in some way that if you bring that insight forward, you will, by nature, (it’s not easy, and I want to acknowledge that), but you will, by nature, make things around you more inclusive, and more accessible, and educate those around you by doing it.

But I say that acknowledging the lived tax of that, Jonathan. I think that’s something that’s very very important. It is advocacy fatigue, and that is a very real thing in our world, and I’m very conscious of that.

Jonathan: Your role seems incredibly broad because sometimes, you’re a powerful advocate, I imagine, internally as well as externally. Sometimes, you’ve got a hand in product design.

How would you describe the role of Chief Accessibility Officer? What do you do?

Jenny: [laughs] I meddle in lots of different areas. But I think the most important part of this role is actually empowering everyone else to do what they are great at. So my role is really to lead the company hub, and to motivate, drive, and strategically help lead the many, many accessibility folks that are working throughout the company.

And we do approach it as a cultural enterprise, which means that I’m looking at anything from people, you know, bringing in the right talent, empowering talent, partnering with our HR department, right the way through to policy, and making sure that we have the right policies internally, but also the right policies we’re advocating for externally. Subminimum wage is one of those, as an example, and we’re advocating for that to be eliminated.

And then, we look at clearly how we can help to drive partnerships, whether that’s with nonprofits, advocacy organizations, customers, really to help grow accessibility.

And at the core and fabric of everything is, of course, products and websites and tools. And we have a large inventory here.

The best way of thinking about it is that my job is to really have a good understanding of where we are, where we need to go, to invest in the right areas.

But I don’t write Windows code. I don’t write Xbox games. There are amazing people in each of the different spokes, the hub and spoke. They’re just fantastic talent all across the business who are the experts in each of those domains. They are responsible and accountable for delivering accessibility in their environment, whether that’s buildings, hiring process, marketing campaigns or technology.

But yes, I have a pulse on what’s going on, and it’s a complex system, and it takes a long time to get those kind of wheels spinning. But we’ve definitely learned a lot since 2016.

Jonathan: I’d like to focus on Narrator and screen reading over the next few minutes.

As I mentioned in the introduction, Microsoft’s now much more than a supporter of other people’s efforts.

I’d like to understand your vision for Narrator long term.

I think one of the advantages Windows has is that there are capable third-party alternatives to the screen reader that ships with the operating system. If your Apple screen reader develops a serious defect that affects your ability to work, you’ve got nowhere else to go on your Apple device. So that’s a strength.

Over time, do you see a point at which third-party screen readers need no longer exist because Narrator’s become increasingly capable? Or do you see third-party screen readers in the long term as complementary to, rather than competitive with, Narrator?

Jenny: Yeah, I would definitely go with the ecosystem and complementary, those 2 words. I think Windows as a platform, and Microsoft now as a whole (because that platform extends to Xbox and multiple other parts of what we do), is really crucial. At its core, how Microsoft historically has been built and focused is to empower everyone. And that ecosystem, and the choice between first and third party, are a core and crucial component of that.

And I would say Narrator has been on one heck of a journey. I remember the times when I walked in the room, I think with yourself and many others, who were like, “Narrator isn’t where it needs to be.” And I remember breathing that in, and then hiring one such person, Anne Taylor from the NFB, who was actually at the forefront of that and being righteously and appropriately bold in her opinions at the time, really pushing us to invest more in our first party. And we’ve definitely taken that kick to really do that, so that folks have options.

You need to have a screen reader that you can use to start your device, and get in and build it the way that you want it, and not wait until you have the ability to download a third-party and have to use sighted assistance. That’s just not what we’re looking for. We want to empower every person, every organization.

So I think it is a complementary mix. And I know the Narrator team and the Windows team really want to empower not just the first party and continue to invest there, whether that is screen reading, magnification, high contrast, leveraging the latest AI features as well, and also continue to empower third-party, of which there are many, and not just in blind, low vision, but also eye tracking and other disciplines as well for hard of hearing, with hearing aids, of course, on the radar. So the answer is complementary.

I think it’s always going to be a personal choice, what we all choose to use on a day-by-day basis. And my hope is that that helps folks to get what they need, whether it’s work, life, or play (again, Xbox is top of mind on that side of things, as well as Microsoft Word), and going in between those. All of those are in our portfolio. And so complementary is definitely the goal.

Jonathan: Okay. Credit where it’s due. By the way, I bought my wife a laptop over Christmas and set that up for her. And I was really impressed with just how seamless the setup process was, how accessible it was, no sighted assistance required at all.

So we’ve made a lot of progress. And thank you for that.

We get a lot of feedback here on the show. We air a lot of listener contributions.

And one observation that’s been made by listeners to the show is that the screen reading experience is becoming more generic, no matter which screen reader is being used because Microsoft is supplying ready-made information with a certain degree of verbosity thrown in. I know that you’ll be aware that verbosity is a very tricky, contentious issue for screen readers because one person’s essential information is another’s irrelevant distraction.

And people like Doug Geoffray, who works for Microsoft now and has been involved in this industry for years, have a lot of experience with this whole dilemma. And it’s quite an emotive topic.

How are those user experience decisions made? And do blind people have any opportunity to have input into those user experiences?

Jenny: I love that you mentioned Doug. I think Doug and I will say there’s a whole other list of folks that have been very very instrumental in that approach and will be, going forward.

The thing to take away here is we’re only as good as the feedback and insight that we get. And so we really are very consciously focused on making sure that we lean on experts, and we iterate a ton on that. And hopefully, you know, it sounds like that’s what you’re seeing here. So yeah, we have systems very much keenly focused on it.

Again, in all of the different spokes across the company. I think you’re landing on Doug and Anne Taylor as two folks that we’ve mentioned so far. They have been very instrumental in how we focus on Microsoft 365 (so think about that as the entirety of Office), working with advisory boards, which are external members from the blind and low vision community and other parts of the disabled community as well, and our internal employee resource group, that community of many thousands of folks here at Microsoft with disabilities. Those listening vehicles, those testing vehicles are incredibly important to make sure that we get that balance right.

Do we get it always right? No. But I think it’s one of those where we need to continue to iterate.

I will say the other big listening vehicle for us is the Disability Answer Desk. That’s something that’s been around now for 11 years, and it still takes about 13,000 calls a month.

And it’s a great way… you know, if you’re listening to this and like, “I’ve had a bug or an issue.”, or “I’m not quite sure how to do this in Word.”, or “I want to get to this feature in (my goodness!) Excel.” (We all still love Excel.), or “I want to use VS Code.”, that’s the team that you can call. It’s a free technical support line, staffed by people that work in accessibility. Most of them have disabilities. They get it. And yeah, we get a ton from that.

Our goal is to help people to get online. But we also get a lot of feedback through it that then funnels its way internally to every single product team including Narrator, including Office, and helps us to iterate.

So yes, those systems are grounded in the all-up strategy. You’ve got to have expertise to inform your journey. And I can never get enough of it, honestly.

I think there’s so much lived experience and need. The challenge that we do have is that it’s a diverse need, and everyone has a different… Like you said, verbosity is a very different experience for every person, and finding that right balance is not always easy.

Jonathan: Yes. I think configuration options when it comes to verbosity are always a good thing because people have such different expectations and preferences.

I’m really pleased to hear about how many disabled people are working in this area, and at Microsoft in general. I know that you’re passionate about that, and you’ve also taken time to educate others external to Microsoft about the benefits of disabled people being at all levels of an organization.

Many people find it disheartening that even though offices are now more paperless than they’ve ever been, and technology is more capable than it’s ever been, we still have a huge unemployment problem in the blind community and actually, among disabled people in general.

What else needs to be done to move that needle? Because largely it’s no longer a technological problem we’re dealing with. It seems to be an attitudinal one.

Jenny: Oh, goodness. How long did you say we had today?

Jonathan: [laughs]

Jenny: Yeah, there’s no one answer to that. I mean, I think firstly, yes, this has been a big passion. And not just for me, but the entirety of the company; again, built on the strategic need to have disabled talent in the core and the fabric of the company, in order to deliver on our goals to be more inclusive and accessible. And what that’s meant over time is that people have felt more safe to identify, to ask what they need to be successful.

In fact, our accommodations process works on a 48-hour SLA for turning around those accommodation requests, which I think is very important. You’re not waiting weeks to get a copy of JAWS or a Braille reader. They’re able to turn them around super fast. And that’s our goal: to get you up and running quickly, so that you can be impactful.

And I do love the progress we’ve made public on the percentage of people with disabilities that have self-identified at Microsoft. It’s just in the US right now (but we are tracking it globally), and 8.8% as of last June said, “Yes, I have a disability.” here in the US.

Is that number where we want it to be? No. I want it to be in the teens, if not in the 20s, which we know to be the true representation. But it’s progress. And I think that’s the important thing for us, and we want to continue to make progress here.

But yeah, there’s no denying there are systemic issues when it comes to employment and disabled talent, and the ableism that exists in society, which is that attitudinal low ceiling that is artificially put on people and what others feel them to be capable of doing. I don’t know that I know a single person that hasn’t felt that at some point, myself included.

And so I do think we’ve got to tackle it. There’s a lot of insight coming through that talks about how that has an impact in interview processes, in technology, in face-to-face meetings with new employers, so there are multiple prongs to it.

But know that we’re trying to really tackle some of those, and we take that responsibility very seriously.

We’ve learned a lot through some of our own hiring programs. We do positively and proactively go out and hire disabled talent. Our neurodiversity program is very long-standing; it’s coming up to its 10-year anniversary next year. That one started with hiring autistic talent. Now, it’s all of neurodiversity. And we do that in a specialist way because essentially, we’re ditching the traditional interview process to allow folks to share their skills in a different way. So instead of asking those traditional questions (“What are your strengths and weaknesses?”, which don’t necessarily allow an autistic individual to shine to the true extent of their talent), we go through a series of activities that allow us, through Minecraft of all things, a virtual Minecraft academy, to see their skills.

We also hire deaf, disabled, blind, and across the spectrum talent in other forms as well.

And I would recommend if you are interested in careers at Microsoft, don’t hesitate. Go and apply. Go check out the jobs on the careers website. Don’t hesitate to put in an application, and please ask for what you need to be successful. We have disability talent brokers that will help make sure that you have accommodations for an interview.

And so I think we’ve got to keep pushing it forward, but it’s going to take a lot, I think. There are some systemic issues here, for sure.

Jonathan: Yes. Obviously, the example that Microsoft is setting with all of those practices is very important.

But also, I think there’s a lot of public education that needs to be done, particularly where small businesses are concerned. Somebody may have mortgaged their home to set up that business, and it’s their baby, it’s their pride and joy. And those small business owners, in some instances at least, genuinely believe that disabled people represent productivity risks, that they represent health and safety risks. It’s not that they intend to discriminate. It’s just what they genuinely believe, and they are acting in what they perceive to be the best interests of their business.

And I’m not sure how much public education has really been done. I’m not aware of any systemic campaigns on this issue.

Jenny: Yeah, Jonathan. I wish I had a magic wand for that.

Jonathan: Me, too.

Jenny: But I also think that there’s things that we can all do there to lean in.

One of the things, and one of the first things, I did in this role was actually to work with our training teams here to make accessibility training mandatory for every employee. It starts with the basics of what disability is, etiquette, language, things that really matter, and how we view disability as a strength, as a talent pool, to purposefully tackle that. Now that training’s public. Anyone can check it out.

There’s plenty of other options available, but I do think that education’s a key part of it.

But I think there’s a lot more besides it. And I think that’s where, as we look at AI, one of the things that we really do need to do is to tackle ableism holistically. And so I think training’s a component. Technology is a component. Clearly, advocacy is a huge element of that. And I know that there are advocates on both sides of the globe that are really tackling that one on a day-by-day basis as well.

The reality is, I mean, you’re doing what you’re doing, I’m doing…, we are all disabled people doing amazing things. No one’s gonna put a ceiling on that for me. Many have tried, but we’re gonna bust through them, and I think we have to keep going.

Jonathan: As we record this, Microsoft’s Ability Summit recently concluded, and it gladdened my little activist heart to hear Microsoft boldly proclaiming that accessibility is a fundamental human right. And I think maybe some of our own community needs to hear that because sometimes, we even have some of our own believing it’s not realistic to expect equal access. A large corporation like Microsoft doesn’t make a statement like that without very careful thought.

What prompted Microsoft to put that stake in the ground at this particular time?

Jenny: Well bluntly, it’s the truth. Accessibility is a fundamental right, and in the team that I sit in here at Microsoft, I’m surrounded by people that are leading that in different ways. I have the privilege and honor, really, of leading our work on accessibility.

My peer Vicky, who was actually at the Ability Summit with me, leads connectivity, working in Africa, and Mexico, and many countries and regions around the world to build in connectivity. Connectivity is also a fundamental right.

I’ve got another peer working on human rights, on racial justice, on how we lean into democracy as a fundamental right.

So there are so many different efforts that I think are so crucially important right now, and I do think it’s important to say those words very loudly and vehemently.

And I think those words came out of both our CEO’s mouth, who opened the Ability Summit last week, and Brad Smith’s, our president. He had a big slide with it up just to make it especially clear. And I do believe it’s something that’s fundamental (she says, using the word twice) to how we think, and I think it is important for companies like ours to say those words and to live into it, knowing that we really do get the responsibility of saying that.

Somebody asked me the other day, “Now that we’ve done 1, 2, 3, are we done with accessibility?” And I couldn’t help but snicker. This is a never-done role. We have to continue to raise the bar. We are living in a world that has not been architected inclusively. I mean, the simple proof point of that is the existence of stairs, the most inaccessible vehicle there is. And so our job is to work tirelessly, candidly, to really make the world far more inclusive and accessible, particularly right now with this digital push that we are all living in, and the modernisation of every construct we have the privilege to be in.

So yes, let’s go. That Ability Summit was our 14th annual summit, Jonathan. I remember when it was just a teeny little thing with 20 people, and I was as deliriously excited then as I am now. I think we had just over 17,000 attendees last week, from 137 countries.

So I think that bluntly goes to show the level of interest there is in building this. There were folks joining us who were new to accessibility, who wanted to really learn the basics, and a lot who are in this world and wanted to know where the future is going with accessibility, and everything in between.

So the demand is there. The appetite is there. We have to keep going.

Jonathan: I wanted to talk about AI. That’s obviously the number 1 topic right now.

And you did announce some AI enhancements affecting Windows and Seeing AI at Ability Summit. Would you like to fill listeners in on those? I know that Seeing AI in particular is a much loved app in our community.

Jenny: Yeah, there’s this little thing called AI.

[laughter]

It’s definitely dominating, I think, the tech industry right now. It’s very exciting. It was a big topic for us at Ability Summit, really to dig into what does this mean for us. What does it mean for us as humans, and what does it mean to the industry, and where we’re going?

And so we did wallow in quite a few things that I think are kind of very very relevant to what we’re doing across the spectrum including, of course, blindness.

I think the first thing to wallow in a little bit is what is this new chapter of AI that’s come out since November 2022, which is generative AI. Yeah, AI is not new. It’s been around since the 1950s. But this new chapter is about generating new content, and doing that based on a much larger data library, essentially, which allows you to do things like…

Actually, we announced a project with the Rijksmuseum in Amsterdam last week, which is an art museum, and they are using generative AI to generate image descriptions for over a million pieces of art, along with an incredible team of folks from the blind and low vision community that is powering that work.

We also talked a lot about Be My Eyes, which is not a Microsoft app. It’s an incredible startup, again, based in Europe. Many of you will be aware of it.

If you’re not, do download it if you have a Fruity or Android device. It’s a really good app.

They’ve got 600,000 sighted volunteers. They added Gen AI as well to that. So you can take a picture of something and again, it will give you those image descriptions. I mean, paragraphs of them.

I don’t know Jonathan, if you’ve used it.

Jonathan: Oh, every day. [laughs] Yes.

Jenny: Yeah. I mean, it’s incredible.

And I will tell you from a Microsoft perspective, we’ve partnered with this company since 2017.

So if you have a technical issue and you’ve got an error message on your machine, you can take a picture, and it will now describe that and tell you what that error is or what you need to do, and give you a description.

And we’re finding that folks that are using Be My Eyes to call the Microsoft Disability Answer Desk are able to get their issues quickly resolved, and 40% of them no longer need to call us because the AI is doing the work. And that means you don’t have to hang on the phone to wait for a human to answer, so independence.

And then, of course, what does it mean for Microsoft? We talked about that in several ways.

Seeing AI, which we love, did a few things last week. They’ve added another 14 languages, so they’ve now got 33 languages in there. And they’ve also added richer descriptions and chat capability on photos and on documents.

And so if you haven’t had a chance to update your Seeing AI, do update it. There’s some really cool stuff coming through the pipe.

And we also previewed a fun one (which I think we’re going to hear a lot more about as this pilot continues) on audio descriptions, and whether AI can help to provide audio descriptions. As we all know, those take people to generate right now, and take a lot of time. When I try to get a video audio-described, it’s around a 3 to 5-day minimum turnaround, sometimes a lot longer. Could AI help with that? It could potentially mean that a lot more videos could be audio-described.

Lots of other channels.

Doug Geoffray actually did a great job of talking about what this means for Copilot and Office. And I think you and the community would love to know that Doug and many others have been involved in making sure that all of those new implementations are accessible as well.

So yes, it’s fun. It’s fun, it’s exciting.

And we also, again, know the responsibility. There are potential harms with AI and so how we approach it, I think is incredibly important.

Jonathan: Yes, a bit to unpack there.

I understand that Be My Eyes, or at least the AI component of it, is coming to Windows devices sometime this year. So that will be a significant development.

Jenny: Yeah. So this summer, it will be available in the Windows Store. And yes, that means it will be far more embedded into your desktop experience. One of many joyful things coming up, hopefully this summer.

Jonathan: Seeing AI is a great story, isn’t it? Because it came about as the result of a Microsoft hackathon, where people had license to be creative, and a blind developer saw a need and created this life-altering application.

Jenny: Yes. This goes back to (this is where I feel a little old) 2017. And hackathons matter to us. We’re a technical company. I have a lot of amazing nerds here who love to innovate and create. And for me, it’s a great way of motivating and driving education, so people who want to innovate can get going on accessibility, and of seeing talent and what they can do, as well as seeing some great pieces of technology, or whatever else they create – it’s not just technology that you can hack on.

But this was Saqib Shaikh who pulled together a great team of folks. He is a blind developer, another Brit who has since moved to Redmond.

If you don’t know Saqib, please, maybe we can pop some links out after this, Jonathan. He’s phenomenal, and has been with Seeing AI ever since and leading that effort.

It’s an app that started on the iPhone, and has since moved to Android as well. It does some incredible things these days, with multiple different channels. So you can use it to read out documents, to figure out what color a piece of clothing is if you’re picking that up, to tell you whether the light is on or off; it’s got some indoor navigation options, and it will clearly describe any pictures you take.

And so it’s well worth it. If you haven’t downloaded it, do. It’s free. It is done and designed and created by Saqib and his team. And yeah, it’s very very beloved by us all. And I think most importantly, very useful, which is what we want, right? We want the technology to empower people.

Jonathan: And speaking of empowerment, Copilot is a buzzword that’s used a lot within Microsoft these days. Just recently, I’ve started using Copilot for Microsoft Office, and it’s been intriguing to use it in the various Office applications.

Can you talk a little bit about Copilot in its various guises, and what that might mean for people listening to this show?

Jenny: Yeah. I mean, I like the branding Copilot. I think it aptly describes what this is: it’s your AI co-pilot. It is not intended to replace you, your function, your human. It’s there to help you to do tasks more easily and quickly. And so I’ve been using it as well, Jonathan.

In fact, we rolled it out to the disability employee community as one of the first pilot groups that went through it several months ago. And we’re finding that it does really help in certain scenarios.

For me, it’s invaluable. I have to read a lot of documents every day. Some of those are, let’s call them dense, and your Copilot can summarize those for me.

Jonathan: [laughs]

Jenny: I still do the reading. I want to be clear, I still read everything I get. But it can provide very quick summaries.

I love it in Microsoft Teams because it doesn’t just provide… AI has been providing captions for many years, and there’s a transcript. But now, AI can use that content to generate a Copilot-provided set of meeting notes. As someone who’s using an ASL interpreter (a sign language interpreter) and captioning, it’s very hard for me to multitask and take notes at the same time, so it’s just saving me a ton of my own personal battery power. So yeah, and in Word as well.

I’d be fascinated to know what you think. Its capability is growing every day. I’ve been using it to not just check my grammar, but to reword. It gives you suggestions on, you know, how to phrase your paragraph slightly differently. And sometimes I take them, and sometimes I don’t. And sometimes, I make them a bit more British and me before I send. But it’s doing a lot of the hard work for me. I’m not having to redline quite as much.

Jonathan: Yes, I completely agree with you about Microsoft Teams. It is remarkable in Teams. I think that’s where Copilot shines very strongly.

And also, it can be good in certain circumstances at describing graphical content in certain applications. Sometimes, I get complex spreadsheets. And I have no difficulty using Excel when the spreadsheet is accessibly designed. But again, sometimes, just for getting the big picture – the way a sighted person can glance at the screen and see trends – Copilot is helping blind people in that regard.

Jenny: Oh, I love that.

Yeah, I think it’s going to be really interesting. I would encourage people to give it a go, encourage folks to try it, and to really get comfortable with how to use it (they call it prompt engineering). Try it, test it, and play with the prompts that you give it, particularly in the app, which again is on both the iPhone and Android platforms. (You can get it on your browser as well.) The app is also super easy to use, and it’s generating me some really nice, quick and easy content. Like, “Generate me some social media content.” Not to say that my social media is generated by AI. It is not, but it is giving me a starter for some of that. So I’ve been playing with it, and I’m gonna keep doing that.

Jonathan: Yes, it’s kind of in an experimental phase for me as well.

You touched on this earlier, and I think it really is important that we talk about this.

Society as a whole is grappling with the potential and the pitfalls of AI at the moment. I’ve had different AI products express sorrow to me when I’ve told them that I’m blind. I’ve had some quite stereotypical answers from some AI products about disability issues.

Is this something that Microsoft can do something about? Because clearly, it concerns you as well.

Jenny: Yes, it does, of course. Responsible AI is accessible AI and vice versa, right? We have to make sure that this chapter of AI, and every one preceding it, is responsible.

And so there are some guiding principles there that traverse accessibility, include accessibility, but are also broader than that.

And if at all curious, do check out ours. We have 6 of them.

But just to linger on how we’re approaching it for accessibility, I think there are 3 things that we’re going after. One is to make sure that the implementation of AI is accessible, and that’s really important. I think when you have a fast innovation curve, which is what we’re in right now, it’s core and fundamental (again, that word keeps coming back) to make sure that every implementation we do is still accessible as we go forward. So just not forgetting the foundation – we have to make sure what we do is accessible.

I think the next one is really leaning into ableism and making sure that we do tackle the tropes that can happen.

It sounds like you’ve had the, you know, … I had it in a grocery store the other day when I was out shopping. “Oh, I’m so sorry that you’re deaf.”

Jonathan: [laughs]

Jenny: And I’m like, “I’m not. I’m quite happy, thank you very much.”

Please don’t apologize for, … I mean, good grief.

And so yes, that is definitely something that’s being tackled where we can within the data that, of course, we own, and drive, and we do tests for it, and we do red team for it, which is the real sort of strategic process for AI before we ship anything. And that’s all a part of our responsible and accessible AI principles.

The third is to innovate and see where we can paradigm change. And that’s why we’re pursuing things like our partnership with Be My Eyes into Disability Answer Desk, and continuing to advance Seeing AI, and looking at audio descriptions and some of the other many things that we shared last week at Ability Summit and will be coming out soon as well because we do believe that there are some step changes that could potentially happen here, as long as we have the foundation on data, and as long as we are accessible in how we think of work.

I think when you have an innovation curve like this, it is incredibly important to stay grounded, stay very close to your principles, and really tackle any issues that do come up. It’s going to take the whole industry, I think, to lean in on this one, on multiple areas, to deliver the exciting possibilities, and there are a lot of them.

Jonathan: Yes. It’s a complex question because if you’re collecting the sum total of human knowledge pretty much, you’re also collecting the sum total of human prejudice, and somebody’s got to edit that out in some way, and there are clearly some deep philosophical and practical questions about that.

Jenny: It’s true. As we said earlier, ableism exists in society. That means that it exists in the data. But there are ways and vehicles by means of which we can put the right guardrails in place.

So my hope is that if you’re popping something into Copilot, Jonathan, and you self-identify, you don’t get an apology. That’s one of those guardrails.

Jonathan: Microsoft is a key player, but it is not the only player in the AI space. And I am concerned that there are just not enough disabled people around the tables at these AI companies making the points that we’ve just been talking about.

Does that concern you as well, or are the companies more disability-confident than I give them credit for?

Jenny: Of course, I have that concern. I think we all do, on multiple elements, and I think we’ve touched on many of them.

I don’t think AI is different to some of the conversation we’ve had about, say, hiring. We always need more, always need more.

I will say I’m very conscious that I’ve had to stay very close to what we’re doing. And in many ways (I can say with the benefit of hindsight, but also with the realism that we have a lot more to do), the structures and the foundation of building with, and through people, and being that way for as long as we have, has helped us on our journey. And so I think in many ways, it’s helped to support the strategy that we put into place when we came into this, which was if you’re going to build great technology, you have to have disabled talent at the core.

That theory hasn’t changed with AI. In fact, it’s made it even more important.

And so yes, there’s always more needed. I’m glad that we have that foundation, but you know that we’re continually pushing, of course, for more of it.

Jonathan: It’s amazing how far we have come in a relatively short time since LLMs became a big thing about 18 months ago.

The thing that I’m looking forward to at some point is when this computing becomes so fast that we can get description in real time of the built environment as we move around. I think that will be quite the incredible sea change for the blind community.

Jenny: Oh, I couldn’t agree more. And I think if we think about where Seeing AI, Be My Eyes, audio description, and indoor navigation, let alone outdoor, where that’s come in the last 18 months, it’s incredibly exciting when you think about what could be possible in the next 18 months. And that is the paradigm change.

So again, folks, try it.

And Jonathan, keep dishing your feedback. I think it’s going to be really important to stay grounded and to have those listening vehicles, to really lean on that feedback as we go forward.

Jonathan: There are 2 things that are certainly true: one is that the sun will come up tomorrow, and the other is that I have an opinion, and I’ll express it, so you know.

Jenny: [laughs]

Jonathan: Jenny, I know our time is almost up, so I’d like to thank you for two things.

Firstly, for all that you’ve done for this industry and for people around the world who need somebody who can be in there on the inside. And we will never know the conversations that you’re having on the inside, but you’re there making a difference, and you’ve never forgotten where you’ve come from, and I’ve always admired that about you.

And secondly, just thank you for coming on the podcast and having this discussion. I know that the audience will really appreciate having heard your perspective.

Jenny: Jonathan, it’s an honor to be asked, and I’m forever grounded to the fact that I am one cog in the wheel. And so, thank you for forever keeping me grounded.

And to everyone listening, please know that we are forever listening to you as well, and I appreciate the partnership on the journey.

[music]

Voiceover: If you’re a member of Living Blindfully plus, thanks for helping to keep the podcast viable.

If you haven’t yet subscribed, why not do it today?

Get access to episodes 3 full days ahead of their release to the public. You’ll get advance notice of some of our interviews so you can have a say in what we ask, and you’ll help keep the podcast viable by helping to fund the team to do their work.

Our guarantee to you is that everyone who works on the podcast is blind or low vision, so we’re keeping it in the community.

Find out more. Visit LivingBlindfully.com/plus. That’s LivingBlindfully.com/P-L-U-S.

Pay what you can. It all helps. Thanks for your support of Living Blindfully plus.

Accessibility is a Fundamental Human Right

How great it is to hear Microsoft, a large influential corporation, come out strongly and say that accessibility is a fundamental human right. And I believe that with every fiber of my being. I like to hope that most blind people believe that, too.

But as this podcast proves, being blind doesn’t mean we all share the same opinions, even about something as foundational as accessibility and universal design. Email lists and social media often have little flare-ups. And if someone says something confrontational, or aggressive, or demeaning of someone else’s point of view, that can generate a lot more likes, shares, and replies.

So these days, I tend to take a little time for reflection and decide whether an issue is important enough before I chime in.

There’s also a huge difference in tone and quality between reacting to something and responding to something.

But having taken that time for reflection, I do want to respond to something that left me both shocked and dismayed of late. It relates to what I consider the fundamental truth Jenny Lay-Flurrie just stated – accessibility is a fundamental human right.

There has justifiably been a lot of positive talk of late about the new accessible digital recorders from Zoom.

For those of us who want to create quality content while mobile, this is a massive breakthrough. Actually, I would go so far as to say that navigating my first accessible Zoom recorder made me feel the same sense of joy and freedom that I did when I received my first accessible Nokia cellphone running Talks on Symbian.

There are some parts of the recorders that don’t talk. But if I had to put a percentage on it, I would estimate that about 97% of the user interfaces of these devices offer exceptional feedback. Impressively, Zoom estimates that about 25% of development time for the Essential series of recorders was devoted to accessibility.

My hope is that now that that work is done, it’ll be easier to integrate the feature into future recorders, and that it will just be something Zoom does, including it as a matter of course.

Time will tell, of course. But what we know is that since Zoom devoted this much time to making their recorders accessible, they care a lot, they know it’s the right thing to do, and they know we’re a market worth targeting.

You cap all this with some of the recent accessibility breakthroughs in software that drives audio interfaces from Focusrite and Audient, and even the increasing accessibility of digital audio workstations, and there’s never been a better time to be a blind audio content creator.

There is, however, one area where Zoom has fallen short – the user guides of all 3 products in the Essential series aren’t accessible. Specifically, it’s one of those user guides where every time a control is mentioned, it’s not mentioned by name. There’s a little graphic instead. These graphics don’t contain alt text.

So I’ve pointed this problem out to Zoom constructively.

I don’t think for a moment that they’ve made a conscious choice to produce inaccessible documentation. They’re probably just producing content the way they’ve always done it, and there’s a disconnect between the principle of universal design that says everyone should be able to create audio recordings, and what they do in their documentation department.

It doesn’t have to be fixed today, or even tomorrow. But given that they’re now showing such a commendable commitment to accessibility, it does need to be fixed sooner rather than later. It diminishes what is a fantastic accessibility effort.

So I’ve given constructive feedback. It’s been received politely. I hope others have done the same, simply pointing out the deficit.

What really did shock me (because I thought universal design and accessibility was one thing we could pretty much all agree on) was the opposition some people have expressed to some of us pointing this deficit out to Zoom.

Now, this is not my argument, and I would never seek to make it. But as I understand the argument, it is that since these recorders are mainstream products, we’re being whiny and entitled if we ask for accessible documentation. Never mind, it seems, that one of the big features being marketed by the company in all of its promotional material for the Essential series is that it’s accessible.

Others have said we’re being too finicky, too picky. Why are we asking for an accessible user guide when people can listen to my podcasts, or some of the YouTube videos that have been produced on these recorders? It’s said that it’s unreasonable to expect user manuals to be accessible and that if we want to use a product badly enough, we must do whatever it takes to figure it out, and we can and should find a way.

Let’s think about the ramifications here. What is this argument saying in practice?

Well, in practice, this argument is saying that even though we paid the same price for these recorders as everybody else, and everybody else gets a user guide that they can refer to quickly or read in its entirety if they see fit, blind users are unreasonable if we think we’re worthy of the same things. It says that blind people must have a higher degree of technology and research skills, if they are to be afforded the privilege of fully utilising the product that they have bought.

But aren’t these things also true for sighted people? Why can’t they listen to a podcast or watch a YouTube clip, therefore invalidating the need for a user guide at all?

Some, and perhaps I would guess most, do this very thing. You get lots of people who take pride in the fact that they never read user manuals, and there will be people who never look at it, never give it a thought, just look up on YouTube whatever it is that they want to know about.

But Zoom must see the value to their customers in producing documentation, or they’d save money. They’re a corporate entity. They wanna save money where they can. So they could fire their technical writers, offer no manual at all, and have everybody rely on YouTube clips and podcasts. By their actions, Zoom have acknowledged that user guides have their value, have their place. Those same user guides also have their value for blind people.

Documentation is no different from websites. Hopefully (although maybe there is someone somewhere who thinks this, because at this point, I’m beyond being surprised about this topic), no one thinks that the only websites that should be accessible are those exclusively targeting blind people. The concept of reasonable accommodation is now well-established, well understood, and accepted in many countries.

Would it be unreasonable to have Zoom change its practices so that in future, its documentation were accessible? Absolutely not. It’s not onerous, and the benefits to those affected are significant. And once they’ve got the templates set up, accessibility would be at the foundation of every document the company produces in future.

And as I said in my NFB address last year, the irony is that when those of us who see a better world advocate constructively for that better world, even those people who criticize the doers benefit from the doing.

Some of us have been asking Zoom to lift the bar and produce accessible recorders for many years now. Perhaps, some of these blind people who object to constructively pointing out the deficits in their documentation would have criticized us for asking for these accessibility features at all.

If they’re being consistent in their argument, shouldn’t they also have said, “This is not a mainstream product, so stop being so entitled and whiny, thinking that these recorders should talk. You can have a cheat sheet on your phone or your notetaker telling you what buttons to press.” And indeed, that’s exactly what blind content creators have had to do up until the last month or two. “Blind people who think these recorders should be accessible are just totally unreasonable,” they might argue. And yet, here we are, as a result of that advocacy.

By comparison, fixing the manual is trivial. And I want to be clear that the point of my raising this is not to criticize Zoom.

There are plenty of user guides for devices and software that have exactly this problem. But there is the occasional user guide for a mainstream product that gets it right, too. And that’s a joy to behold when you get lucky and you get documentation like that.

My hope is that having set such a great example of what accessibility can be like in the hardware, their documentation will stop letting the side down and live up to the same standards.

No, my reason for raising this is to challenge us all to push back against the tyranny of low expectations, particularly from those in our own community. If we buy a product, we are worthy of equal access to information about that product. You should not have to be a tech genius or Sherlock Holmes to make the most of what you own.

Beware of those who think that because it’s easy for them, it should be just as easy for you.

Thank you for saying it, Jenny. Accessibility is a fundamental human right. Do not ever surrender your fundamental human rights.

[music]

Voiceover: Has something on the show got you thinking?

Share those thoughts with the rest of the Living Blindfully community.

Send us an email. You can include an audio attachment recorded on your computer or smartphone so we can hear your voice, or you can write it down. The address is opinion@LivingBlindfully.com. That’s opinion@LivingBlindfully.com.

Or phone our listener line in the USA: 864-60-Mosen. That’s 864-606-6736.

Let your voice be heard.

You May Not Know Texts Are Being Truncated, and General Siri Thoughts

This next email comes from Reg. He says:

“I recently discovered something that I did not know about text messaging. I thought I knew it all.

To me, the following is more proof (if anyone ever needed any) that texting anything more than a ‘thank you’ or ‘see you real soon’ is the absolute worst possible way for humans to communicate.

However, often, when humans decide to shut down or become deeply upset and unresponsive, and either will not or cannot talk to each other at a given moment, texting becomes your only option.

The issue from my perspective is that when messages are directly copied or forwarded, there is often no way to tell if Siri or VoiceOver has only provided you with part of the message.

Another time when a person might need to do this is when trying to clarify a negotiation where a lot of discussion has happened over text, or if there is some type of emergency or legal problem. Maybe you are being harassed by someone, and you need to collect those messages to prove that it’s happening.

So many people at work, or in bands, or courses like to communicate over text in small groups when making arrangements. It’s quick and easy, but it can so quickly get out of control.

After a somewhat traumatic experience that I’m not going to go into, I recently sent the following feedback to accessibility@apple.com. I wanted to share it with the community, partly to determine if this bothers anyone else, or if I’m making a big deal out of nothing.

Here’s the comment:

’Even when reading long messages with Siri or VoiceOver, the small red arrows that indicate messages have been truncated are not spoken. The person receiving the message gets no indication that text has been hidden or left out, nor do they know how to expand the rest of the text in order to be able to read the entire message.

A related issue is that there is no indication when messages I sent have been automatically divided or truncated. Often, VoiceOver will read the entire message. But on the receiving end, they see only a part of it, and no one realizes that half the content is hidden or missing until it’s too late.

I am required to select a version of iMessage for this form. However, it happens in all versions of iMessage on the phone, iPad, and Mac.

Please escalate this, if possible, as it is an accessibility issue.’

Am I wasting my breath baying at the moon, barking up the wrong tree? Should I just give up and go to sleep?

Probably,” says Reg. “It was much worse in the old days before iMessage.

Even now, when you send a long text message to people with older phones, it gets divided up into multiple messages. Often, they’re put in the wrong order. It just takes one misplaced word to sabotage all the time and effort you put into clearly communicating your thoughts in a kind, considerate way.

I thought about sending my feedback as a feature request instead of a bug report, but then I knew it would instantly go to the bottom of the pile.

Even iPhone users with sight who are not of a technical kind often do not realize that the message must be expanded to show the entire content. These little red arrows need to be made larger, voiced, vibrated, labeled in Braille” (with an uppercase B), “with the words ‘Tap to expand’, and a small sound provided so that those who are low vision, deaf-blind, or blind will be made aware that something important is missing.

This is only one small scenario that points the way to a much broader problem. Many people who use Siri are unable to use their screens at all, so the same thing applies.

There needs to be some indication that this has happened, so that everyone will have equal access to the same information.

Once again, something that could be protected if society would only adopt the principle of ‘nothing about us without us’. Should we not be demanding the same level of access from our phones as our sighted peers?

Siri, with the accessibility settings, gives a person the ability to select the level of spoken feedback they desire. Any time it reads a weather forecast or says the address is here without actually speaking it, tries to force us to look at the screen, or leaves out vital information that others can access, we are being discriminated against. Not just blind people, but quadriplegics, audio learners, non-readers, and anyone who can’t conveniently see or navigate the screen. In this case, my wildest dream would be that when you tell Siri to speak everything, that’s what it does until you tell it to stop.

Another example of the problem is when you ask for the weather. Siri says something like, ‘Temperatures will hover around 45 degrees today,’ but it won’t tell you that the low will be 28, and the high will be 60. It does this randomly and intermittently, thereby cutting out important information that you might need to know to plan your day.

This might seem like a minor inconvenience, but inconveniences pile up and become major challenges to our freedom and independence. They are unnecessary, human-created barriers that need to be conquered.

If you ask your voice assistant what is the total predicted snow accumulation for the day, it won’t tell you. But a graphic may simultaneously appear on your screen with hour-by-hour precipitation amounts. How do you even know it’s there? You could be missing out on important information that could save your life.

Apple and all these voice assistants need to step up their game in this area and provide us with equal access, regardless of our abilities. Imagine a world where companies give everyone a universal, fully customisable user interface that gives a person what they need based on their preferences and abilities, not just what’s easiest to program for Apple.

The Siri that has literally changed the world for more than ten years and that we know so well and love to hate is likely on its way out and might soon be replaced with a much more robust and capable artificial intelligence-driven solution. I don’t have a crystal ball. However, it seems to be the trend. Whether they choose to call it Siri or Big Bob, it will be an entirely different animal.

I hope that they will keep these things in mind in the future, so that their products will continue to set the bar in the area of universal access for everyone. Maybe it’s not such a wild dream after all.”

Thanks for writing in, Reg!

Well, I agree with you about Siri. I think that it is the weakest of all of the commercially available voice assistants at the moment.

And you’re right. I am receiving intelligence that iOS 18 is going to be a blockbuster release.

Actually, there’s a lot in iOS 18, I understand, and a lot of it is responding to the hoopla about AI. So we will see something very different. The new iPhones are going to have more processing power, specifically targeted at working with large language models offline because this is consistent with Apple’s paradigm of trying to keep as much of this processing on device as possible.

So iOS 18 could be a rocky road for us because with so many significant changes come potential accessibility bugs, and who knows if they will be fixed or not.

But I think people will welcome something better than Siri. I mentioned this when we were celebrating, or commemorating, or commiserating or something over the 10th anniversary of Siri back in 2021, and I made the point that it showed so much promise in 2011, and really hasn’t lived up to the promise at all.

You’re right. It is frustrating when you ask a question and it’ll answer, “I found something on the web. Take a look.”

For me, ChatGPT is doing a much better job. And of course, Alexa too, the voice assistant from Amazon – the Soup Drinker, as I try to call it so that we don’t trigger everything. That, in my view, does a pretty good job as well, although that seems to be scaling back now. Amazon’s firing quite a few people who were involved in that division, and I’m not sure what the future of the Amazon Echo line will be.

That’s the easiest bit of your message to respond to.

I’ll come back to the first bit now relating to texting, and I’m a wee bit confused about what’s happening for you here.

I asked my daughter, Heidi, (who we’ve heard on the podcast many times over the years, and she partners with us on iPhone-related things) about these arrows that you’re talking about. And she says she’s never seen them. That when she opens up an iMessage conversation, she has never seen a message truncated. I’ve never seen one truncated either. So I’d be interested in understanding when this happens, what circumstances have to be true for it to happen?

I communicate with people via iMessage where there are very, very long messages. And you do have to double tap from the iMessage screen to actually open up the conversation, of course, because you’re only getting a preview on the main iMessage screen. But when I open an iMessage thread, I don’t ever recall seeing a message truncated.

That said, I have seen the thing you’re talking about where you are texting someone who is not on iMessage. Perhaps they have an iPhone but have disabled iMessage (I’m not sure why people would do that), but most likely, you’re texting somebody who uses Android. You’re reverting at that point to SMS, the Short Message Service, the original 160-character text message from way back when. And when, as you say, you write a long text message that way, it gets chopped up into 160-character chunks and sent along by your carrier. And sometimes, your carrier will deliver part 2 before it delivers part 1, and that can be really frustrating.

I haven’t seen this actually in New Zealand with my carrier, but I have seen it when I’ve been on carriers overseas.

The good thing is that those days are almost at an end, because Apple, due to regulatory pressure, has finally bowed to the inevitable, and they’re going to be rolling out Rich Communication Services, or RCS for short. And this is going to be like sending an iMessage or a WhatsApp message in a cross-platform way. So you’ll be able to use video, there’ll be support for group chats, and there won’t be that horrible 160-character limit. So that will take care of your jumbled-up order problem.

But I’m afraid I’ve just not seen the other thing you’re talking about where messages are truncated.

So perhaps someone can enlighten me, or you can enlighten me.

It’s interesting the way we are evolving in terms of the way that we communicate because you’re right, we do tend to text each other a lot these days. I’m not sure that’s necessarily a bad thing or a harmful thing. It just is what it is.

I regularly text Bonnie when she’s at work and I’m at work. We just quickly exchange something to do with what’s going on for dinner, or some little thing like that. I text my kids and other friends of mine that I keep in touch with. And it’s great for those times where you don’t want a deep and meaningful conversation.

I do hear about people being dumped via text message, and all sorts of things like that. That’s a bit drastic, isn’t it? I still think that a voice call, or even a face-to-face meeting (yes, that’s a radical concept, isn’t it?) is an appropriate way to do certain things. But perhaps, that’s just my age talking.

What I find quite interesting is that there are many young people these days (and I see this from my day job, actually), who are frightened of using the phone. Using a phone for a voice call makes some people incredibly anxious. I’m talking serious anxiety here – shortness of breath, panic. They just will not pick up the phone and call somebody (perhaps a business that they don’t know) to ask a question.

These days, there are so many chat bots, and you can engage with companies, particularly via WhatsApp here. I guess Apple’s got an equivalent product in iMessage, but that hasn’t taken off as much here as WhatsApp has. You can communicate with all kinds of businesses through WhatsApp.

And people will do this because they just don’t know how to use the phone for voice calls anymore when you’re calling somebody that you don’t know. So you’re picking up the phone and calling a business. I’m intrigued by this. I wonder what’s brought that about.

Advertisement: Living Blindfully is brought to you in part by Turtleback, the original manufacturer of leather cases for notetakers since 2003.

Now back then, I was managing blindness products at Pulse Data International, working with Russell Smith. And we never regretted choosing Turtleback because the name’s synonymous with quality manufacturing, quality service.

There’s a wide range of leather cases available for the products you use every day.

Check them out – TurtlebackLV.com. That’s all one word. TurtlebackLV.com, and be sure to use the coupon code LB12. That’s LB for Living Blindfully, and the number 12, and you’ll get 12% off at checkout.

If you prefer, give their friendly team a call at 855-915-0005. That’s 855-915-0005.

And if you’re attending CSUN, you can visit them at booth 1014, and feel the quality for yourself.

That coupon code again, LB12, for 12% off at checkout at TurtlebackLV.com.

Impressed With my Chromebook

Voice message: Hey, Jonathan! This is Chris Westbrook.

I recently got a Chromebook.

I know you reviewed Chromebooks a while ago.

I’m liking it. I got it mainly just to see what they were like.

I’m planning to help some blind students, hopefully this year. Because I know they’re used a lot in education, I wanted to see how they worked, and their commands and all that.

It didn’t take me very long to come up to speed with it. I’m really enjoying it. I can do pretty much everything I could do on Windows, except maybe coding, audio editing, and things like that. But you know, it’s a nice little laptop.

Gmail is accessible of course on the Chromebook, but it’s also accessible with NVDA and JAWS. So as far as that goes, I don’t think we need the basic HTML view anymore. I know some blind people probably like it. It’s familiar and whatever. But I really think you should learn how to use the regular view because I believe you can do a lot more with it. I love being able to search through my mail and stuff like that.

As far as flying experiences, I haven’t really had any bad experiences with TSA. I’m flying to Orlando this summer for the NFB convention, and I believe this is the first time I’ve flown with my cochlear implant, so that’ll be interesting because that has a magnet in it so I don’t know how that’s going to go through the metal detector, or if I’m going to have to take it off when I go through security. Hopefully not. They might just have to wand me down, or whatever.

But yeah. So if anybody’s going to NFB convention, I’ll be there. Hope to see some Living Blindfully listeners there. I’m sure there will be.

Questions About Sonos and Dolby Atmos

Joe Quinn is writing in about Dolby Atmos and Sonos. He says:

“I want to go whole hog into Dolby Atmos, and I know the Sonos ARC and 2 ERA 300s are the way to go.

My question is in terms of accessibility, because I know I’m going to need a new TV. What do you think would be the best TV to get for somebody who is blind that also supports eARC and Dolby Atmos? Also, does a TV supporting eARC automatically support Dolby Atmos?

I didn’t realize televisions were so expensive. I’m spoiled by the Fire TVs being such a good size, and being so cheap. Unfortunately, they neither have eARC nor support Dolby Atmos, so those are out.

What are your adventures with the Sonos ARC and ERA 300s (if you have them), Dolby Atmos, and for that matter, audio description, which I know you’ve spoken of on the podcast previously?

Also, when it comes to the eARC HDMI port, if I have more than one device that I want to connect to the TV, how can I do that? Or can I do that and have everything come out of the sound bar, even though I have switched to an input that isn’t necessarily the one that has the Sonos ARC connected to it?”

Joe, we have 3 Sonos ERA 300s at Mosen Towers. 2 of them are sitting in a place where they act as rear surrounds for the Dolby Atmos system, and they’re doing a great job of that because they add a little bit of extra depth compared to the previous Sonos speakers that we were using as rear surrounds. These are designed specifically for this purpose when they go into the correct mode, so it’s a great experience.

We also have another ERA 300 in the master bedroom. I’m probably going to expand that and turn that into a stereo pair at some point. But it is amazing. We have it on a dresser in just the right place so when you’re listening to Dolby Atmos content in the master bedroom, it actually sounds like sound is bouncing off the rear wall. It’s really cool indeed, so I highly recommend the ERA 300s. They’re great speakers.

It has been a few years since we bought the Sonos ARC, and had to go down the rabbit warren of picking the right TV. You may remember that we originally got a Sony TV because we had a Sony TV previously, so we upgraded to the new model that had a screen reader in it.

Sony TVs run Android TV, or at least their higher end ones do, or did at that time.

And what we found to our astonishment was that as soon as you turned on the screen reader, it broke the eARC function, which stands for Enhanced Audio Return Channel. So we couldn’t use it for Atmos. It was a bit of a disappointment.

So we ended up with a Samsung TV. I’m sure that Samsung TV has now long been superseded, but it has an okay screen reader. It supports Atmos, and it’s got the Enhanced Audio Return Channel that you want for lossless Dolby Atmos.

So we need to talk about this because the eARC may not be absolutely necessary in all cases.

I know, Joe, that you will be familiar (and some of the listeners will be familiar too) with the concept of making MP3 or even M4A files. So you take a lossless source (maybe it’s a CD in the old days, whatever it might be). You take this lossless source, and then you crunch it up to make it smaller. You do that by compressing things in a way that takes sound away, but it’s supposed to be sound that most people don’t notice or care about. And that’s how you get an MP3 file. And the smaller you scrunch it up, the more loss you can hear.

Well, Dolby Atmos over streaming services is like that, because there’s a lot of content to send over those streaming services. Not everybody has the whole multi-gigabit fiber thing going on, so they compress them. In that case, with compressed audio that is in Dolby Atmos, a standard ARC port would be just fine to plug your Sonos ARC into.

Whoever thought of calling it the Sonos ARC was a genius, an absolute genius, because there’s a port right on your TV with ARC written on it. I mean, what great marketing that is, to have a port there with the name of the product! So you can plug a Sonos ARC into an ARC port, an Audio Return Channel port on your TV that is not eARC.

But it will break down at least in one respect, and probably two.

The first use case where it breaks down is if you take a Blu-ray disc from somewhere, maybe you buy them. Not sure if anybody still rents them out anymore. But if you buy a Blu-ray disc and it’s got Dolby Atmos content on it, it’s lossless Dolby Atmos. I guess the closest equivalent would be you’re listening to a WAV file, as it were.

Your ARC channel is not going to have the bandwidth to play that content from your Blu-ray disc in Dolby Atmos. Now, it won’t refuse to play. It’ll just downsample it to stereo, or maybe 5.1 or something like that. But you won’t get the Atmos from the Blu-ray disc.

The second use case, and this is what prompted us to make sure we bought a TV with an eARC port, an Enhanced Audio Return Channel port, is the Apple TV. At least at the time, even if it was playing lossy audio from a streaming service, it sent it out over HDMI as lossless audio. It upsampled it, in other words. And unless you had an eARC port on your TV, you wouldn’t be able to get content from Apple TV in Dolby Atmos, which for many blind people is a very significant consideration.

Now, because I haven’t had to do any shopping for this for the last, I don’t know, 3 or 4 years, I can’t tell you whether the Apple TV still does this, or whether there’s an option now that allows you to use a standard ARC port and get Dolby Atmos. It would be worth checking Dr. Google, or asking Apple directly about that.

But I think if you’re going to do this properly, you may as well go the whole hog, as you say, and get a TV with an eARC port.

But if you can’t, and the TV that you buy receives Dolby Atmos and has a number of apps like Netflix, and maybe Apple TV, and Disney Plus that pass on that Dolby Atmos (and not all of them do), then you should be in good shape.

In terms of audio description, it’s been a while since I’ve watched too much other audio described content other than Apple TV Plus. I can’t remember whether The Crown, which I did finish recently, had audio description and Dolby Atmos working together or not. I don’t think it did, but I stand to be corrected.

But you do sadly find that when you enable audio description, things go down even to stereo at times. Somebody told me once that it even went down to mono for something that they watched with audio description.

Apple is consistently good in this regard, and they deserve a lot of praise. All of their content that I’m aware of that is encoded in Atmos has audio description that you can still play when you’re in Atmos mode.

So if you get the setup, I would highly recommend For All Mankind. I mean, it’s a great series anyway. That is so good. I’m on the edge of my seat waiting to see whether we get season five of For All Mankind or not. But it’s spacey. It’s got lots of rocket ships and going to the moon and onwards. So you get a lot of great Dolby Atmos content.

Actually, another really good one for Dolby Atmos was See, which I could never really get into. But that sounded pretty impressive. It was one of the first things that we watched in Dolby Atmos, and it was really immersive.

Lectrote Interactive Fiction Interpreter

Christopher Wright is back, and he says:

“Since the amazing Spatterlight for Mac was recently discussed, I thought I’d talk about Lectrote.” (That’s spelt L-E-C-T-R-O-T-E.)

“It’s an interactive fiction application that handles many different game formats.

From initial testing, it appears to be accessible, and NVDA automatically reads game output, so I think this is the best option for Windows users.

I’ve uninstalled several stand-alone interpreters that only worked well with an NVDA add-on, or SAPI 4 automatic speech output.

I’m not really sure how well it works on Linux with Orca, and Spatterlight is an excellent choice for the Mac. It’s one of the many things I miss about MacOS, if I’m being completely honest. The developer has gone above and beyond in terms of VoiceOver support.

Unfortunately or fortunately, depending on your view, Lectrote is coded in Electron, just like everything else these days.

We can argue all day whether Electron is effective for blind people, but it does have some interesting accessibility implications including the ability to read content like a webpage with your screen reader’s virtual cursor. This is super handy in Lectrote.

I’m not a fan of Electron, because it makes my head hurt trying to understand the hybrid web and application interface, but I’ve found disabling the virtual cursor and treating most Electron apps like Discord as traditional Windows software works much better. I can only imagine how much of a nightmare it is for less experienced screen reader users.

Sadly, it appears this is the way the world is going, even though these applications are extremely bloated and, while technically accessible, not efficient from a blindness perspective. I’ll be dragged into this new era kicking and screaming.

My list of accessible IF apps is Lectrote for Windows and possibly Linux, Spatterlight for Mac, Frotz for iOS, and Fabularium for Android. I’m still trying to find decent ways to play IF in Chrome OS, and it looks like the Linux Frotz command line program might be the best choice inside the Linux container, though I haven’t tried this yet with ChromeVox.

Sadly, Frotz only handles Z-code games, so if anyone has recommendations for other programs to handle other formats, please let us know.

Transporting and Caring for Kids

Let’s return to this interesting topic that Kelby Carlson raised on the show about transporting kids.

Patti Chang says:

“Just listened to episode 266, and I want to divert the conversation a bit.

When my kids were little, I had them on mass transit from age 3 or 4 weeks old.

My husband worked nights, and I worked days, but for the government, so I had more flexibility. Very easy to do.

As my kids got older, I trained them to answer “Mom”, employed bells on their shoes, and used a harness attached to them in the interim. We really had no issues as the kids were growing up.

I have 2 children, 6 years apart. Their dad is sighted, by the way.

The bigger problem has arisen with our granddaughter. She has 2 sighted parents, and has not been taught to answer, use bells, and will not tolerate a harness at all. I get it. Her parents tried, and the daycare said, “No bells.”

It is hard to teach a child to use non-visual communication when you do not do so, etc. I am frustrated though, because I cannot safely take her outside without someone sighted, because these ordinary modifications have not been taught.

I bring this up so we can talk about how to deal with caretaking blind when you are not the primary caretaker, but the secondary caretaker, and you have not had the opportunity to put in place the things that would make it possible to caretake safely, especially for young children.

We recently attended a wedding where the music was loud, and I declined to watch my granddaughter, who is now 4, because I cannot trust that she will answer.

Then my husband lost sight of her for a couple of minutes, and our kids were upset.

Complex, but worth a discussion. I do not have definitive answers, but I think grandparenting is often left off these conversations, and grandparenting, because you can’t control what kids are taught, is different.”

Patti, thank you so much for that thought-provoking message.

Like you, I used to zip around with the kids on public transport when they were little. They knew that they must never, ever fail to answer when they were called, all those good things.

But I’m just starting the grandparenting journey now. My grandchild is 1, and she lives in another city, so we haven’t been doing the caregiver thing at this stage.

But it’s a very interesting discussion, and I look forward to anybody’s contributions on it.

Blind People and Video Editing

Now I’m going to look directly at the microphone and put on my best smile, because we are talking about video editing.

Daniel Semro writes:

“The only editor I can safely recommend that is accessible is Apple’s iMovie app. However, there are a few caveats to keep in mind.

If you were going to add titles to your videos, as in titles that appear on the screen, the process to add them after selecting the ‘titles’ button with VoiceOver is not accessible, or wasn’t when I last tried it.

The other caveat is that iMovie is a big app and can take up a lot of space on the phone, so make sure you have plenty of space before installing it.

As I am a Mac user, my primary editor of choice is EyeClip.” (that’s E-Y-E Clip). “It is fully accessible, and is a free app available from the Mac App Store. It even said in the description of the app that it was made accessible, so even visually impaired creators can edit.

I am going to have to try Reaper”, says Daniel, “for video editing. I’ve used it for audio, but not for video.”

And Kelly Sapergia says:

“In episode 266, there was a question regarding editing videos.

I don’t have an answer for it, but I’ve been interested in possibly getting into doing videos on YouTube, either recorded or streamed.

I have a Logitech C270HD webcam that I use when attending meetings on Zoom and Google Meet, but don’t know what software to use to record a video file in Windows.

I’ve heard of services like TunesToTube.com, where you can upload both an MP3 and an image to create a YouTube video that way. But I’m thinking it would be better to record an actual video, then use Reaper to add my own music or edit the presentation that way.

I remember when Freedom Scientific had their Next Big JAWS feature contest last year, which required you to submit a video response. Assuming one knew how to do that, that was fine. But I was somewhat surprised that they didn’t have a training session on how blind people could do that, if they weren’t familiar with the process.

In any event, what software would you or other listeners recommend? Or is there a way of recording a video built into Windows?”

Thank you, Kelly!

There is indeed. It’s the Camera app.

If you go to the Start menu and type Camera and press Enter, the Camera app will come up. You can switch it into Video mode, verify that you’re using the correct camera and microphone, and record that way.

And there may be better ways of doing this, but this is exactly what I do. When I want to put a video together for staff, I record in the Windows Camera app. I make my video, I bring it into Reaper, I edit it, I enhance the audio a bit with a little bit of compression and sparkle and good things like that, and we’re good to go.

TunesToTube is a very good service for doing exactly what it says on the tin. You take an image, you take an audio file, it combines it into a video that you can then upload to YouTube.

And indeed, YouTube itself has now put together a similar thing for podcasts like this one.

So Living Blindfully is on YouTube. And until recently, we used a paid service called Repurpose.io, which would upload the audio to YouTube and create a video file that was kind of cool, I understand. But obviously, it wasn’t an actual video of me sitting here in front of my gear reading my Braille display. And I’m not really after the video market sufficiently to do any of that.

Now, YouTube has automated it. This is all part of their quest to get into the podcasting space. So they will now take an RSS feed for you, they’ll take your podcast logo, all the artwork from a particular episode, and upload it for you.

So now, we have saved a bit of money by taking the old Repurpose.io away to publish to YouTube, and YouTube is doing it itself. You can subscribe to Living Blindfully on YouTube.

[music]

Voiceover: Since you’re listening to this podcast, you already know that Living Blindfully has a substantial engaged global audience. We’re heard in over 110 countries and territories.

That’s an opportunity for you if you have a product, service or podcast you’d like to tell our audience about. Get in touch with us about advertising here on Living Blindfully. We’ll tailor an advertising campaign to suit your message and your budget.

Find out more and get in touch by visiting LivingBlindfully.com/advertise. That’s LivingBlindfully.com/advertise, and share your message with the Living Blindfully community.

The Good and Bad of the New Stuff Website

On Living Blindfully, we always talk about stuff. In this case, we’re talking about Stuff, the website in New Zealand. It’s a news website. You can get to it by going to stuff.co.nz.

David Harvey is talking about this, and he says:

“Hi, Jonathan,

Given that you have written articles for Stuff, are you able to leverage your contacts with them to inform them of the issues I am facing, please?

Some articles list an email address to contact the journalist. However, I can no longer find that on the articles.

These are my good and bad of the new experience.

Good, navigating articles by heading using the VO heading navigation gesture.

Bad: the removal of the technology section.

I cannot back out of an article using the scrub gesture.

The share buttons are no longer at the bottom of the screen.

Saving articles to Instapaper from the share button (which took me time to find), as well as on the Stuff website, is broken, because VoiceOver just says “on stuff.co.nz” instead of the article title in Instapaper.

When I navigate, it sounds like they’ve just taken the website and wrapped it up in a new app. For instance, I go to the navigation menu and scroll down. I hear what sounds like links to various sections.

When navigating the list of articles, some older ones are still showing, even though they were published several weeks ago.

I’ve been a Stuff reader since before I had an iPhone, and I enjoy their articles.”

Thanks very much, David.

We do not have an exemplary news app in this country. I suppose the RNZ app is not too bad, and I worked with them on that some years ago. I understand they’re going to be doing a new RNZ app soon, so I hope that they’re taking care of accessibility because Richard Hulse (who was at RNZ at the time) and I spent a lot of time working on the accessibility of that app for both iOS and Android.

The New Zealand Herald app is absolutely appalling. There are so many problems with that app, I don’t know where to start.

And really, the Stuff app’s not that much better unfortunately, but it is a bit better, a bit better than the New Zealand Herald app.

You know, and it’s so sad because you look at apps like the New York Times and The Guardian, several others overseas and they’re great.

I do wish that pressure could be brought to bear on New Zealand providers to make their apps more accessible. They are facing some existential threats at the moment. It is a really tough media market in New Zealand. And sadly, we’re just a low priority right now.

Everything You Ever Wanted to Know About Matrix

Voice message: Hi, Jonathan! Hi, Blindfully listeners! This is Marco from Germany, and this is in response to the episode 266 question about Matrix, the protocol, what Matrix is, and what clients there can be.

I worked with Matrix for quite a while when I was working at Mozilla. And we switched from IRC (the old Internet Relay Chat), to Matrix as a team collaboration system.

We actually evaluated several options. Among them were Slack, Mattermost, and I think the third one was Rocket Chat. And we settled on Matrix because it was the most accessible from the client side, and also the most responsive when it came to talking to the team about issues such as accessibility.

And it was open source. That was a big bonus as well.

So Matrix is, in essence, a collaboration tool. Like you can do voice chat, but you can also do text chat. You have rooms, and you can have kind of subgroups and you have private chats, and private chat groups as well.

And the thing about it is that it is federated. So much like Mastodon and the other ActivityPub systems are federated and can talk to each other, all the Matrix servers or those servers that implement the Matrix protocol can talk to each other.

So you can have the same rooms on several home servers. That’s what it’s called when you sign up to a server: it’s your home server. And they exchange messages regularly. So when you join a room, for example, you have access to all the backlog of chat that was in the room previously. So if you wanted to, you could not only search through all that, but also read everything consecutively.

In terms of accessibility, it’s a mixed bag. There is the official client called Element. Yeah, it’s really called Element. It’s available on the web, as a desktop app (a Chromium-based Electron app), and on iOS and Android, and it is accessible for the most part.

I would say the web version is quite okay. You have to get used to some conventions, and you have to know your screen reader to effectively get around. But that’s the case with most more complex things.

Again, I think you can’t use Slack, for example, without some training and some basic screen reader knowledge either. So that’s not a big difference.

And the client that is available for iOS is, or used to be okay-ish. I haven’t looked at it for a while, but it used to be okay-ish. And the team, when they got to it, they also fixed VoiceOver bugs when you reported them in GitHub.

There are other clients, and it’s a hit-or-miss game whether they are accessible or not. Some try to be. But as with anything that sighted people implement, when they don’t get it right, you may have to ask for corrections or adjustments to make it more efficient.

But there are others that are totally inaccessible. And fortunately for us, most of these clients are free.

You can log into your Matrix account with as many clients as you like. You have control over which clients remain in the list of clients that are allowed to access your account, and you can remove others.

So you can try them out. You can actually download several of them and log in to the same server, get the same messages and the same rooms that you’re joined in, and you can find out what suits you best.

That’s the big plus. With Slack, or Mattermost, or some other things that are more or less accessible, you are stuck with one client. So you only have the one app, and if it breaks, it breaks.

But with Matrix, as it’s an open standard with an open source server and all that, and most of the clients are also open source, you can try them. You can really dig in and find the client or clients that suit you, like with screen readers. Sometimes, you have to use multiple clients to get what you want. You can do that because they are all available, and most of them are open source and have public bug trackers where you can also suggest enhancements.

If you know how to program, you can do that yourself, or you can at least report issues and offer guidance if you so desire.

Good luck to whoever wants to try it. Since I no longer work (I’m retired nowadays), I don’t have as much use for Matrix anymore. But sometimes, I dabble a bit and, you know, take a look at what all my former colleagues at Mozilla are doing and so forth. But I haven’t looked at it in a while, as I said.

Okay, thanks for listening to my ramblings and talk to you all later. Bye bye!

Jonathan: Great to have you contributing to the podcast, Marco. And thank you very much for that informative explanation of what Matrix is. I feel much more clued up about the subject now, thanks to you.

And it sounds like it’s something worth investigating. Anything that gives us choice, that means that we’re not at the mercy of one particular developer, who may choose to pull the rug from under us at any time, is a very welcome development. And certainly in 2023, we learned that lesson well, both with Twitter and with Reddit.

Rich Caloggero has also been saying fairly similar things to what Marco has offered. But he also says:

“You can use a pre-built client, or write your own. There are software development kits which do much of the protocol-heavy lifting for you, letting you concentrate on the user interface side.”

And Rich recommends element.io as a somewhat accessible client for Matrix.

Thanks, Rich! Appreciate that, too.

And in a typically thorough email, Elijah Massey expands on Matrix. I won’t read the stuff that he says here that Marco’s already said, but there’s quite a bit more. Elijah says:

“Another benefit of the openness of the protocol is that there are bridges for other chat platforms that allow you to use them for Matrix. They include bridges for Discord, GroupMe, Google Chat, Instagram DMs, Facebook Messenger, Telegram, Signal, Skype, WhatsApp, Google Messages, and even iMessage now, if you have a Mac or have access to macOS periodically to get registrations.

To use the bridges, you can either host your own home server and then host the bridges yourself, or you can pay $5 a month for a service called Beeper that gives you a Matrix account with a lot of these bridges. Although Beeper is currently transitioning away from providing this service to connecting to these chat platforms directly from their app.

If you use a public server like Matrix.org, you can use some of these bridges with very limited functionality, using services like T2Bot.io.” (That’s T, the number 2, and then bot.io).

Elijah says:

“I host my own home server on a Raspberry Pi 5 running Linux, but you could also use a virtual server in the cloud such as Amazon EC2, Linode, or DigitalOcean, as I used to do for a while.

However, the minimum cost of doing this is probably around $20 a month, since your server needs to have enough resources to run Synapse, the official Matrix home server, and the one with the most features.

A Raspberry Pi 400 with an SD card was barely fast enough, and would often slow down significantly when a lot of messages came through the bridges, or when joining large Matrix rooms.

An NVMe SSD over USB helped a lot, though.”

Wow! What a lot of acronyms in one sentence.

“And the Raspberry Pi 5 is significantly faster, and I haven’t experienced issues with performance so far.

In addition to setting up Synapse and whatever bridges you need, you also need to set up a database server with PostgreSQL – a reverse proxy that can forward traffic from the internet to your server, and provide TLS encryption, and a domain name, which is needed for federation. So setting up your own home server could require a significant amount of work and some periodic maintenance.
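To make the reverse proxy piece concrete, here is an illustrative nginx fragment of the kind the Synapse documentation describes. It assumes Synapse is listening on localhost port 8008 (its default) and that TLS certificates come from something like Let's Encrypt; the domain name is a placeholder.

```nginx
# Illustrative only: reverse proxy for a Synapse home server.
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name matrix.example.org;  # placeholder domain

    ssl_certificate /etc/letsencrypt/live/matrix.example.org/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/matrix.example.org/privkey.pem;

    location ~ ^(/_matrix|/_synapse/client) {
        proxy_pass http://localhost:8008;  # Synapse's default client port
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Host $host;
        client_max_body_size 50M;  # allow larger media uploads
    }
}
```

The proxy terminates TLS and forwards Matrix traffic to Synapse, which is exactly the role Elijah describes.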

Also, if your network is behind a NAT that doesn’t support port forwarding, like my apartment network, you could check out Cloudflare Tunnel. If you don’t need the bridges, you can create a free account on matrix.org.
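For the Cloudflare Tunnel approach Elijah mentions, the configuration is roughly this shape. The tunnel UUID, credentials path, and hostname below are placeholders that `cloudflared tunnel create` generates for you; this is a sketch, not a complete setup guide.

```yaml
# Illustrative cloudflared config (typically ~/.cloudflared/config.yml).
tunnel: <tunnel-UUID>
credentials-file: /home/pi/.cloudflared/<tunnel-UUID>.json

ingress:
  # Forward Matrix traffic to Synapse on the local machine.
  - hostname: matrix.example.org
    service: http://localhost:8008
  # A catch-all rule is required at the end of the ingress list.
  - service: http_status:404
```

Because the tunnel makes an outbound connection to Cloudflare, no inbound port forwarding is needed, which is what makes it useful behind a NAT like an apartment network.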

Element is the official client that supports the most features, and it runs on iOS, Android, Windows, Mac, Linux, and the web. The mobile app is mostly accessible, although there are a few issues including that you can’t swipe left past a reply on iOS. However, I still think it’s easier to use than the Discord app.

There’s also a new mobile app for both platforms called ElementX that has a lot faster syncing and some other improvements. The accessibility issues in Element have been fixed in ElementX, and navigation between messages is faster, since each message is only one item, instead of the username and the time being separate items as in Element. Also, times for all messages are spoken in ElementX after the end of the message, instead of only a few, like in Element. There are even some VoiceOver actions.

However, there’s an issue on iOS, where swiping left from the bottom of the screen will not put you at the most recent message, and scrolling moves you in the opposite direction to what you would expect.

The issues I described are not present on Android, either in Element or in ElementX.

On desktop, Element is a web app, and about as easy to use as Discord. There are some keyboard commands including Ctrl+K or Command+K on Mac, which opens a search box where you can find rooms quickly. However, the rooms list does not seem to be accessible on Linux, although it works on Windows and Mac, and the search box works.

I couldn’t find any native desktop clients for Windows or macOS. Although for Linux, there’s one called Gotktrix”, (that’s an interesting name. It’s G-O-T-K-T-R-I-X), “that is pretty accessible with some workarounds.

However, there is a client that works extremely well on desktop called Ement.el,” (that’s E-M-E-N-T, dot, E-L), “and it runs inside Emacs, which is an extremely powerful text editor that is pretty much its own environment for applications, much as a web browser can be. There are many programs for it, including email clients, web browsers, chat programs, and enhancements to the text editor that make it a great coding environment.

Emacs has its own screen readers. The one I use is called speechd-el, and there’s another very popular one called Emacspeak. Emacs has completely different keyboard commands than any other platform, so it can have a steep learning curve, but Ement.el is the most accessible Matrix client I have used.

It is also extremely customisable, just like Emacs itself. For example, you can assign keyboard commands to open specific rooms, change any keyboard commands you want, change how messages are displayed, and even write your own scripts in Emacs Lisp. It will send notifications to your desktop as well, which you can also read from inside the client.

I run Emacs on Linux, but it supports Windows and macOS, too.

Even without customisation, Ement.el is significantly more efficient than using Element for desktop, and even ElementX Mobile.

In addition, there is a client for Apple Watch called Watch The Matrix, and it lets you read all the messages in a room and send messages, and it is entirely accessible and very easy to use.

I mostly use Matrix for the bridges, so I can use most of the chat platforms I use all from one app.

An additional benefit of doing this is that I can access these platforms from my Apple Watch, where I can both read and write messages, and from Emacs, both of which most chat platforms don’t have apps for.

I am in a few Matrix rooms as well, and I also have a few Matrix bots that I wrote for some things.”
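Elijah doesn't say what his bots do, but a hypothetical skeleton of a Matrix bot, again sketched around the matrix-nio SDK, looks something like this. The command name and reply are invented for illustration; the reply logic is kept as a plain function, with the SDK only needed for the callback wiring.

```python
from typing import Optional


def bot_reply(body: str) -> Optional[str]:
    """Return a reply for a hypothetical !ping command, or None to stay silent."""
    if body.strip().lower() == "!ping":
        return "pong"
    return None


async def on_message(room, event) -> None:
    """Callback for incoming text events. In a real bot, this would be
    registered on a logged-in nio.AsyncClient with add_event_callback,
    and the reply would be sent back with client.room_send()."""
    reply = bot_reply(event.body)
    if reply is not None:
        pass  # client.room_send(...) would go here
```

Keeping the decision logic separate from the network code makes the bot easy to test without a server, which matters when your bot runs unattended on a Raspberry Pi.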

Whew! So there you go.

A listener asks a question, we get 13 minutes of very knowledgeable answers from the Living Blindfully community. Isn’t that epicness?

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.

Make Windows 10 Free for Blind People

Caller: Hello, Jonathan and everyone else.

Well, I just want to first say that the Zoom handy recorder Essential sounds revolutionary, and it’s been something I’ve been dreaming about for probably 10, 15 years.

I used to use the Olympus players for a while. They were great.

And now, it’s time to move on. And now, Zoom seems to have taken the ball and picked up right where Olympus left off. So good job, Zoom, and I look forward to hearing demonstrations and maybe getting a couple of their units in the future.

First, I just wanted to ask a question about technology, and what your view is on this.

Microsoft Windows has been doing a lot of wonderful stuff with accessibility lately.

And I know that maybe some Apple users might say, “Oh, Apple’s better.”

But anyway, so here’s the deal. Windows has started to do something now where they’re changing their hardware and their system requirements are changing. And the system requirements changing could be bad because you could be blocked from installing Windows on your computer.

I’ll skip all the details. But there are ways you can get around that.

But I was wondering what would be involved, and whether you think it would be worth trying to talk to Microsoft and say, “Hey, for blind users, we can’t just go out to the store and buy a new machine every so often. Wouldn’t it be nice if you could allow screen reader users to continue running Windows 10 either free of charge, or get around the system requirements for Windows 11?” Because Windows 11 is great, but it’s blocked for many users.

Now, I did try to bring a resolution to one of the blindness organizations about trying to tell Microsoft to not charge visually impaired users for extra support of Windows 10. But the organization’s response was, “Well, that makes it look real bad. Makes blind people look like we need everything handed to us.”

Well, I don’t see it that way, because you are paying a lot of money for something that you shouldn’t have to pay for. The updates are there and we use them, but we can’t get a new machine as quickly as some can. Why not allow us to have access to those last few years of Windows 10, or allow Windows 11 to be used by everyone? Find a way to allow us to get past some of those heavy requirements. So I’m curious what you think about that.

Well, I wanted to also respond to 2 things that were discussed on other shows, too, that I thought were very, very important.

School for the Blind and Uber

Well, the first thing, about Uber. I really feel that if there was a way to do both and improve city transportation, like expanding paratransit in frequency and flexibility (better flexibility in terms of scheduling and where they go and everything, not having to be fixed to a specific bus route), then I think we could see a lot more blind people getting out, especially on transportation. And those like myself who don’t get out that much, who have mobility challenges, we could benefit from that.

Now, as far as Uber is concerned, I don’t use a guide dog or a cane. I am personally a sighted guide user.

But I still feel for those out there who are struggling with discrimination, with guide dogs and stuff.

And I love the opinion that that driver needs to be not only banned from Uber and any other platform, but he needs to be brought into the spotlight and shown what happens. What would happen if you went blind one day, and you decided to get a dog, and somebody discriminated against you, now that you’re blind?

So I don’t know, sometimes the fear of blindness can be used as a tool to try to get people to shape up, if you will. “You’re not in that position right now, but this could happen to you some other time.”

People have used the fear of reality with me when I was growing up. “How would you like it if somebody messed with your toy? You’re messing with theirs.”, so that taught me that kind of tough love.

But anyway, so I think that we can’t let it go. We have to find a way to deal with it, whether it be dealing with Uber, dealing with, if we find who that driver is, whatever. Something has to be done about it. And so both people had good points on that.

Now, the school for the blind thing.

To that person who was talking about how their mobility instructor wasn’t very friendly, I had a very similar incident.

I had a mobility instructor who would allow me to get lost, wouldn’t help me find the classroom. He’d let me wander around. He would use threats like, “Hey, how would you like to be the only person left on the campus at nighttime?”

You see, my instructor thought I was playing around. See, I have some orientation issues. So he thought I was playing around. So that was why he did what he did.

And the other thing about the School for the Blind that I don’t like is that they often treat us as very immature people.

So I had a music teacher who was not very interested in talking about the real world. She had no problem with us doing kids stuff. You know, we’re high school students and we’re doing kids stuff. So that wasn’t very good for being out in the real world. You know, we wanna be out in the real world and we’re singing kids songs. Doesn’t make us look good as students out in the real world.

My choir instructor and I had many arguments about this. We disagreed heavily on many things, but there you go.

And as far as people, I only know a couple people who have been successful at my school for the blind. I know there’s a few that have been successful at other schools, but at my particular school, I only know maybe one or two people who have been successful.

That’s Alex with that contribution. Thank you so much for it.

Let’s talk about Windows 10. What Alex is referring to is that in October 2025, Microsoft will stop providing free support for Windows 10. So if you’re running Windows 10 right now, you will see every so often that there are updates still being released for it. There aren’t any major new features. I think they might be rolling CoPilot back because they want so many people to use CoPilot.

But as a rule, Microsoft would really like you to upgrade to Windows 11 and beyond.

It’s expected that there will be a Windows 12 at some point.

So when October 2025 rolls around, you will stop getting updates for Windows 10 if you’re running a Windows 10 machine by then. But all indications are that at that point, if you want to keep going with your updates for Windows 10, you’ll be able to pay for them and Microsoft will keep providing them. I guess this is because there might be certain business installations where Windows 10 is just chugging along nicely, and businesses don’t want to upgrade all their machines to Windows 11.

As Alex says, there are some system requirements where quite capable Windows 10 machines cannot run Windows 11. Actually, I’m in this situation with a machine that powers Mushroom FM. It hums along, it’s got a great processor, solid state hard drive, it’s doing everything we need it to do, but it’s not capable of being upgraded to Windows 11.

So Alex is suggesting when you get to the point where those updates are not free anymore, why not have a policy at Microsoft that makes them available to blind people and presumably other disabled people as well, who may have some constraints in terms of upgrading to Windows 11 because of socio-economic factors?

We all know about the high unemployment rate in the blind community. That actually doesn’t seem unreasonable to me.

And you might remember that there was an upgrade program, I think it was the upgrade program from Windows 7 to Windows 10, where it was available for free for screen reader users.

I think there were quite a few loopholes in that program that some people did exploit, and that may make Microsoft reluctant to go there again. But it is worth asking the question, I think. And we still have around about 18 months to ask the question and see what might be possible.

Where is the NFB Newsline Android app?

An email from Andy Smith who says:

“NFB Newsline is a service that gives access to over 500 magazines, newspapers, job listings, TV listings and more. It’s available in the US. I’m not sure of its availability outside of the US.

Many years ago, they created and still maintain an iOS app. They’re also maintaining an access method where you can go to NFBNewsLineOnline.org to access the content. Or you can call into a phone number and use an automated, somewhat tedious, menu-driven system.

I’m interested to know why there’s no Android app yet. If you’re an Android user, your online access methods are the aforementioned website or the phone number.

A number of years ago, I could certainly understand this. After all, most blind people were using iOS. But I am seeing an increasing number of Android users in the blind community.

I sent an email to the director and to a few other people, and I’ve gotten absolutely no response.

It’s about time,” says Andy, “someone addressed this, so that Android users can have equal access like their iOS using counterparts.

The website can be clunky, especially for new Android users. Android users deserve an app.

I love the show.”

Thank you, Andy!

I also reached out to NFB Newsline and got no response, so I can’t help you there either.

I’ve actually kept this email of yours on hold for a couple of weeks, seeing if I got a response, but I did not.

So I don’t know what is happening, or if anything is happening with respect to an Android app for NFB Newsline. What you’re saying sounds perfectly reasonable to me.

What Laptop Keyboard Meets My Needs

Let’s talk laptop keyboards now.

Bart Simons says:

“I want to buy a new laptop.

There are plenty ways to filter the models, but it is hard to get correct information about their keyboard layout.

I want one with physical home, end, page up and page down buttons. In fact, as many physical buttons as possible.

I asked sighted people to check the pictures, but it turns out that they don’t always show clear pictures of the keyboard.

A local vendor told me that those pictures are not always accurate. Unfortunately, they have very few models in the shop to touch.

I gave up the hope to find one with a physical context menu button. As you say, Shift plus F10 is not the same. I remap some keys with SharpKeys, but I don’t assign a physical context menu button.

I use JAWS and a Braille display. In Keyboard Manager, I assign a key on my display to the script called Send Application Key.

I just thought to suggest this to those who may benefit from it.”

Thanks very much, Bart.

And Bart is in Belgium, and a Living Blindfully plus subscriber. So thank you to you, and indeed all the Living Blindfully plus subscribers for your support.

Well, I can tell you that the ThinkPad X1 Carbon fits all your requirements. There is a dedicated home key and end key. It’s at the top of the keyboard. I’m not sure I like the placement very much, but if you go to the right of F12, you will find home and end, and an insert key, and a delete key, so you do have a dedicated insert key. Further down, where you would expect to find it, there’s a page up and a page down key, and you have the best of both worlds, because you can use function with left, right, up and down arrows to perform home, end, page up and page down, so that’s very convenient.

On the right hand side of the keyboard, you’ve got a print screen key, and that’s the one that I remap to the application key. So I’m very happy with my ThinkPad in this regard because it’s got all the keys that I want, and the X1 Carbon is a fantastic machine.
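Under the hood, SharpKeys works by writing the Scancode Map value in the Windows registry, which the keyboard driver reads at boot. The fragment below is an illustrative sketch of the mapping Jonathan describes, turning Print Screen (scancode E0 37) into the Application key (E0 5D); in practice SharpKeys generates this for you, it requires a reboot to take effect, and you should back up your registry before experimenting.

```reg
Windows Registry Editor Version 5.00

; Illustrative only: make Print Screen (E0 37) act as the Application key (E0 5D).
; The value is: 8 header bytes of zero, an entry count (2 = 1 mapping plus
; terminator), one mapping (new scancode first, then the physical key), and
; a 4-byte null terminator.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,5d,e0,37,e0,00,00,00,00
```

Because the remapping happens in the driver, it applies system-wide and works with any screen reader, which is what makes it handy for restoring a missing context menu key.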

I also seem to recall that Bonnie’s HP Spectre from a few years ago, which is now retired, also had dedicated home, end, pageup and pagedown keys, but I can’t vouch for whether the current ones do.

I’m just so pleased to be back with ThinkPads. They’re workhorses, they’re good business machines, and they get the job done. Having used ThinkPads for a decade and then gone straight on to other things, I kind of feel like a wayward son who has returned to the fold, and it would take something pretty exceptional for me to move away from ThinkPads now.

That said, I am actually playing with Mac quite a bit at the moment. There’s been a bit of discussion on Mastodon about this very thing. I’m just having a wee tinker, you know, because I like to play with new things. I don’t think it’ll amount to anything, but I suppose you never know. Never say never.

Closing and Contact Info

But we do have to say goodbye for now, because we’re at the end of the show. I’ll be back next week, and I appreciate you listening very much.

And remember that when you’re out there with your guide dog, you’ve harnessed success, and with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: opinion@LivingBlindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]