Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


Contents

Welcome to 297

Vision Australia’s Board is not doing its job properly

Greg Stilson and Jason Martin from APH discuss the latest with the Monarch

Positive update on TP-Link accessibility

Peacock, ExpressVPN, and Pulseway

Overdrive Media Console is no more

The South African election

Sweet Dreams app for some continuous glucose monitoring systems

Experiences with Apple Vision Pro

Recycling experiences

CPAP machines, hearing aids, and the word “blind”

Migrating content from Voice Dream Reader

Ray-Ban Meta Smart Glasses

Luna RSS

Closing and contact info


Welcome to 297

[music] Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Jonathan: Hello! On the show this week: it’s official, Vision Australia is embarking on a quick, lackluster recruitment process for its CEO, and blind people will be the poorer for it. Greg Stilson and Jason Martin from APH discuss the latest with the Monarch, and more about the accessibility of recycling systems around the world.

Gerald Ford: My fellow Americans, our long national nightmare is over.

Jonathan: Yes, it is. That was President Gerald Ford who said that back in August 1974. Hard to believe we just passed the 50th anniversary of the Nixon resignation. And there wasn’t a lot of discussion about it. I was expecting all sorts of retrospectives and documentaries, but maybe I was looking in the wrong places. I remember so distinctly where I was, even though I was a little person when Richard Nixon resigned, and even though I was way over here in New Zealand.

And can I just say that Gerald Ford is the only US president I have ever spoken with? Not while he was in the White House, I hasten to add, because I was seven when he left the White House, but I did speak to him subsequently. He was very nice.

Anyway, the long national nightmare of which I speak is the complete and utter absence of any country codes or area codes in recent times. It’s been terrible, but we’re breaking the drought with episode 297.

And it’s funny that I should talk about breaking the drought actually, because you don’t get a lot of rain in Aruba. And if you call a number in country code 297, Aruba is where you will be calling. But you can do a lot of hiking and outdoor activities like that. There’s a lot of water around Aruba. It’s in the Caribbean. And I guess that’s about all I have to say about that. But thank you for breaking our area code drought, Aruba. And they do speak English there. It’s one of the official languages. So if you happen to be listening from Aruba, welcome. And in the words of President Biden at the Democratic Convention, which has now concluded, thank you, thank you, thank you. Thank you for breaking the country code drought. He didn’t actually say that bit, but he did say thank you, thank you, thank you.

Advertisement: My sincere thanks to Pneuma Solutions for making transcripts of Living Blindfully possible. RIM is free for all to use for up to 30 minutes every day. Need more time? Well, there’s a range of monthly price plans available, and you can buy a RIM Pass to assist one person as much as you like for a 24-hour period. And there’s yet another way to use RIM. The RIM Community Support Package is designed for those who occasionally need to provide or receive remote technical support, but don’t want a Pro Plan. With the RIM Community Support Package, you get an additional three hours every single day. That’s on top of the 30 minutes RIM gives everyone free. Check out the RIM Community Support Package. Use it on your own PC or Mac, or with anyone else’s PC or Mac. GetRim.app, that’s G-E-T-R-I-M.app.

Vision Australia’s Board is not doing its job properly

I want to return to the important matter we first discussed on the show last week in episode 296. It is critical that qualified blind people have the opportunity to apply to be the next Chief Executive of Vision Australia.

I’d like to hope that showing solidarity with blind Australians is in itself sufficient reason for devoting so much time to this issue. We must look out for each other. But just in case that’s not enough, you should be in no doubt that these agencies communicate with each other. If the Board of Vision Australia isn’t challenged about the highly limited, damaging, exclusive process it’s embarked upon, you will soon find other organizations in the blindness system around the world trying to do the same thing. That will cause real harm to blind people who are qualified to take on these roles at this level. And it’ll cause harm to all of us, because when someone is appointed who has lived experience of blindness and deep-rooted values of self-determination and consumerism, we all get better services.

Now, in last week’s episode, I spoke with Graeme Innes, a man so respected as a disability advocate that he’s been made a member of the Order of Australia. And I did something we don’t normally do on Living Blindfully. Normally, we like all the facts to be in before we embark on something. That’s important, I think, for this podcast’s credibility. But we had the discussion last week, despite it being only a rumor at the time, that Vision Australia didn’t intend to conduct an external process to fill the recently vacated role of Vision Australia’s CEO. Because I hoped that communicating some depth of feeling about this would help at least the four blind members of the Vision Australia board to understand the harm that they were potentially about to cause.

Living Blindfully was of course not alone. This has been a hot topic in Australia, justifiably so. And Fiona Woods, president of Blind Citizens Australia, also expressed concern.

But now we know for sure that the board of Vision Australia intends to proceed down this path, an act of breathtaking disrespect for capable potential external blind applicants, and a disregard for the application of good governance principles.

In a message obtained by Living Blindfully that was sent to Vision Australia staff, its board chair, Bill Jolley, confirmed that they will only seek internal expressions of interest in the role.

One of the reasons they have given for doing that is that their recently departed CEO Ron Hooton occupied the role for 11 and a half years and left a strong leadership team. They don’t want to jeopardize how well in their view the organization is performing. There are a couple of serious problems with this argument.

First, let’s consider the recent process conducted by CNIB, which is Canada’s Vision Australia equivalent. It is, by the way, larger in every respect. Now, their recently departed CEO, whose name is John Rafferty, had been in the role for longer than Ron Hooton had at Vision Australia. Ron joined Vision Australia 11 and a half years ago. John Rafferty joined CNIB 15 years ago. Now, as someone who’s chaired and been on several boards, some in the disability sector and some not, I am well aware that the most important job a board has is to appoint its Chief Executive Officer. A board has a duty to all stakeholders, including service recipients, donors and staff, to ensure that it’s appointing the best person. And that’s why CNIB conducted a comprehensive international search. That search was thorough, it was professional, and it was conducted in a way that encouraged blind people to apply. They bent over backwards to ensure that every single aspect of it was fully accessible to not just blind people, but people with a range of disabilities. It is true that in the end, they did make an internal appointment, and that person is not blind. We might debate the wisdom of that decision separately, but the fact remains that the board did not fail to do its duty. It took the time, it examined the options, and it made an informed choice based on a comprehensive international search. It can have confidence that it had all the facts at its disposal before it made the decision that it did.

Now, let’s contrast that with what Vision Australia is doing. After 11 and a half years, it now also has the opportunity to consider a range of leadership styles and potential directions for the way blindness services in Australia are led, and it’s taking the easy, lazy way out. The Vision Australia board is concerned, it says, about the time an external recruitment process might take, and that it might jeopardize existing projects. And yes, absolutely, a good process takes time. No, it need not jeopardize existing projects. I don’t think anyone would object to a competent existing leader holding the fort until a new CEO is appointed, and that could conceivably not happen until well into 2025. So they are right, doing this properly is a very time-consuming process. But it’s an investment, and it’s worth doing right.

Let’s say, for example, that the next occupant of the role sticks around for another 10 or 11 year stint. Doing this process right, being thorough, will be an investment in ensuring an effective organization into the long term, where attitudes towards disability are evolving and changing. We’ve still got a long way to go, but thankfully, there is more recognition than ever that disabled people must be involved in our organizations. That’s not just consumer organizations, but service providers too. And I’ll come back to that in a bit. If they can find a blind person who is qualified for the role, who’s an experienced, capable leader, and who understands the considerable unmet need and the cultural challenges currently confronting Vision Australia, the difference that can make for blind Australians could be meaningful and exceptional.

Because for all the board’s talk of how well the organization is doing right now, I can tell you this. You don’t have to scratch too deep below the surface to find some very unhappy blind people at Vision Australia or people who are so unhappy that they couldn’t take it anymore and they’re now no longer there. Some try to stick it out because they value what good they may be able to do on the inside. Others feel defeated and have concluded that the organization’s values need a serious recalibration. This board appears to have accepted the spin of a leadership team that has a very commercial culture and that is considered at least by some people to be disconnected from the people the organization exists to serve. At the very least, the Vision Australia board should be exposed to that perspective and give it consideration through a fair and thorough process that includes external applicants.

The board says it’s got to provide certainty to its stakeholders, and the best way to do that is to only seek internal applicants and get this thing done and dusted by the end of October. At best, this is an incredibly nebulous excuse for a board not to do its job properly. At worst, it is an insult to the intelligence of Vision Australia’s stakeholders. If the board took the time to do this the right way, I promise you services would continue to be delivered, funds would continue to be raised, life in the organization would go on. Many organizations, some of which are much bigger than Vision Australia, survive for extended periods just fine with an acting CEO, because they know the benefits of getting this process right are worth it.

So is it possible that a blind person may emerge as CEO from this internal process? That is a fair question, and I would dearly love for that to be the outcome. Indeed, Australia’s recently completed Disability Royal Commission is clear. In its final report, it said, and I quote, “leadership should reflect the diversity of the community it serves, including the lived experiences of people with disabilities.” But my understanding is that most blind staff at Vision Australia are in client-facing or middle management roles. That in itself is an indictment of the present leadership. So having one of those people step up, while not impossible, would be unlikely.

As blind people, we are used to being passed over, and to fighting for opportunities. But it stings a lot, and it feels particularly like an act of betrayal, when blind people do this sort of thing to other blind people. It’s not right. The way this process is being conducted is not just disgraceful. It’s a dereliction of the duty of good governance that every Vision Australia stakeholder has the right to expect. And yes, that does also include donors. This is sloppy governance, because the board owes it to all stakeholders to get the best person for the role. So how can it possibly be said that this duty has been executed when they’re considering applicants from such a small pool of talent, steeped in the way things are being done now, and depriving capable blind people who could transform Australia for blind people of the opportunity to apply?

If the board doesn’t want to do its job properly, it should resign or be replaced.

Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want. Join today by sending a blank email to announcements-subscribe at livingblindfully.com. Why not join now? That’s announcements-subscribe at livingblindfully.com and be in the know.

Greg Stilson and Jason Martin from APH discuss the latest with the Monarch

Jonathan: We’re at the 2024 National Federation of the Blind Convention, and we’re surrounded by a star-studded cast this time. Now, Greg Stilson from APH is with me, and Jason Martin as well. Welcome to you both.

Greg: Thank you so much.

Jason: Thanks for having us.

Jonathan: You’re an old hand at this, Greg.

Greg: I like to call myself experienced, Jonathan.

Jonathan: A veteran of the tech. Right. Yeah. And how about you, Jason? Tell us a bit about yourself.

Jason: I’m a teacher of the visually impaired by trade, heart and spirit, but now I’m a product manager for the American Printing House for the Blind.

Jonathan: And how does that feel?

Jason: It feels awesome, honestly.

Jonathan: It seems like APH is the place to be. I keep running into people and they say, I’m working for APH now. You’re attracting people there.

Greg: We’re getting, you know, Larry Skutchan, when he led the technology team, he was always a forward thinker. And since he retired, I think he kind of laid the groundwork for a really good foundation of technology. And that’s really where APH is putting our focus, right? And I think you’re right. People want to be part of this. I think they’re seeing exciting things coming from APH, and it’s a fun place to be.

Jonathan: How’s Larry doing?

Greg: Larry’s living his best life in retirement, man. He deserves to.

Jonathan: He does, absolutely. Yeah, that’s good to hear. So last year, we got together and we talked about Monarch, and we’ve had a couple of presentations on Monarch over the years on Living Blindfully. I was a bit confused because Microsoft started talking to me about Monarch a few months ago, and then I realized their Monarch is a different one. Do you realize that’s the internal code name that they’ve given to their new Outlook?

Greg: I did not. I got really excited when you said Microsoft was talking to you about Monarch. That made me really happy.

Jonathan: So whenever they talk about Monarch, they’re talking about the new Outlook. So you might want to be aware of that. It’s really confusing me. For those who haven’t kept up, either of you, would you like to give us a kind of an elevator pitch about what Monarch is? Why is this product so exciting? And it’s been a long time in the gestation, this thing.

Greg: I think anybody who’s blind and a Braille reader has always been thinking or dreaming of this concept of what we’ve regarded as the Holy Braille, this idea of a tool that can do Braille and tactile graphics on the same surface. And this is not the first attempt at this. Even at APH, we attempted to work with a product called the Graphiti to just do a tactile display, and realized that wasn’t mass-producible. So we at APH have tried; other companies, startups, have promised the world. I know that we’ve had many organizations say they finally cracked the code and they’ve solved this, and then the startup runs out of money and the dream falls apart, right? And so we knew from the beginning, this was not something that APH could do ourselves. We don’t have the hardware expertise to do that. So we partnered with a number of organizations to make this happen. We started in 2020, and we put out a request for information. We basically signed a bunch of NDAs with a bunch of different companies and said, come and show us the secret things you’ve been working on in your lab. And we saw a number of prototypes that were all really cool. But none of them, except for one, had the ability to do Braille and graphics on the same surface. And we had to make compromises, being the first device of this type. But we ended up partnering with HumanWare and Dot Incorporated out of South Korea, along with the National Federation of the Blind, to make sure that we really kept the blind person at the center of all of our decisions. So the Monarch, as it is today, is a 10-line by 32-cell display. So 320 Braille characters. But all the pins are equidistant. So we do some magic behind the scenes to form those Braille characters. And what that does is it allows us to represent both Braille and tactile graphics on the same surface. But it’s not just a display, it’s really sort of a laptop-style device. So it has its own internal applications. It has a Microsoft Word-style application.
It has a tactile viewing application, a book reading application, and a number of other ones. So our goal with this is that it’s a tool that the student will be able to use throughout their day in the classroom, whether they’re in a history class or a science or math class. This tool will be able to go with them throughout the entire day. It’s about four and a half pounds. So about the weight of a gaming laptop, if you will, and about the dimensions of a 15-inch gaming laptop.

Jonathan: Now, if you’re designing this for students, it’s got to be robust, because they’re going to put it in a backpack or whatever, knock it around and that kind of thing. How robust is it in your testing?

Greg: It’s pretty robust, man. I’m waiting to do the drop test. That’s my… It also terrifies me, because when you have four… Well, to make everything equidistant, we have to put 480 Braille cells in this device. And I’m really excited and terrified to throw it down the stairs, but that’s gonna make me really happy. But it is pretty robust. And I will say, I’ve been doing this for a long time, as you mentioned, and it always seems like at the time of product launch, we’re always trying to squeeze every second of battery out of these devices, right? We can never have enough battery. And what’s really interesting is that this is the first time in my career where we actually overshot the battery capacity on this device. So we were shooting for a full day, eight hours of battery. We’re getting about 24 hours of battery on this device right now. Which is fantastic, because we know that kiddos are going to forget to charge their devices overnight.

Jonathan: And where are you at in terms of the launch phase of this? Is it officially out?

Greg: Not yet. So it’ll be launching officially in September. So we are in final software stages. We’re finishing a couple apps. Right now, we’re working on the web browser application and the email application. But we’re in that final stage, which I always tell my product managers, the last 10% is the hardest part of a product launch, as I’m sure you remember, Jonathan. And so you’re running as fast as you can. You’re fixing everything last minute, but it’s really coming together. And I think that the part that I’m excited to show you, Jonathan, is that I don’t think you saw it last year when we had the graphing calculator application. Right. And that’s one that I’m really excited. We partnered with Desmos to bring a tactile graphing calculator to this device that is going to open up worlds that blind students just have never had access to.

Jonathan: Now, the big elephant in the room is the price of this thing. And I suppose if someone’s in the education system and you’ve done your advocacy, I know that APH has some pretty robust advocacy going on, then at least in the United States, funding may be available through the education system so that a child can or a student can thrive by having a device like this. And it sounds certainly like, with STEM subjects and things, this is a significant breakthrough. But then you think about other parts of the world or even other markets who want this and may not be able to get it. I hear, for example, from blind adults who say, look, this sounds like an amazing device. It could certainly help me in my daily life and my daily work, but I don’t see how I’m ever going to be able to get one, to afford one.

Greg: It’s a really good question. I’m glad you brought up the elephant in the room right away.

Jonathan: You can count on me.

Greg: Of course. Being APH, obviously, our core focus is K-12 education in the US. Having said that, one of the things that we’ve really been looking at is there is tens of millions of dollars of Voc Rehab funding that goes unspent every year. Jason, you used to work in Voc Rehab, right? These are funds that are meant to make a blind person employable and to give them the ability to work and contribute to society, right? We have advocacy efforts going across the board from education all the way to Voc Rehab because the reality is right now, as I mentioned, this is a tool that is really a stand-alone device, right? Like it has the ability to connect to Wi-Fi, we can connect in and download tactile graphics and books and all that kind of stuff. Where things really snowball and change is when the screen readers support this as a multi-line display. And we’ve already started conversations. I was in your neck of the woods last month, Jonathan. I remember we were in Brisbane, Australia, and we met with the guys from NVDA, and they are beyond excited. We met with the guys from Vispero and Apple. All of these folks are excited because the Monarch is something that they’ve never had access to from an innovation perspective when providing tactile output, right? Braille has always been a single line device, and you can only do so much from a screen reader innovation perspective on a single line display. Monarch completely changes that game because not only do we have multi-line for Braille and tactile graphics, but we have a touch surface. And I’ll tell you what, when we showed this to the guys at Apple and others, the idea is, I mean, you could see their mind spinning on what they could do with a touch screen device connected to an iOS device, and same with the guys from Vispero. What can you do? Can you divide a screen? Can you do split screen interaction? 
Can you do a number of different things that now, when you think of that in a job environment, being able to output Excel spreadsheets to this device or PowerPoints or any of that in a working environment, I think that’s when you’re going to see the adult market really start to thrive with a tool like this. And that’s where the Voc Rehab and the other dollars, I think, come into play. And it is expensive, but is it more expensive than getting a laptop and a Braille display and an embosser and six other things that this device could potentially do?

Jonathan: And the software knows what you’re touching, correct?

Greg: Correct, yep. Yeah, we actually have…

Jonathan: So it’s significantly different from your standard Braille display.

Greg: Exactly, and we actually have an output. We developed it for our software engineers just so that they could really hone in on the touchscreen. But we started showing it to teachers, and what it does is, it’s a visual output of where you’re touching. So the teachers, what they went crazy for is, they see a visual representation of the Braille that’s on the Monarch, right? So if you have it connected to a monitor, they’ll actually see the physical Braille dots that are on the display. But then they see these little, I don’t know what the colors are, Jason.

Jason: It’s kind of red and like a white circle that represents fingertips.

Greg: Yeah, and they can actually see in real time where the student is touching. And their minds started going to, like, could we do virtual Braille instruction? Are there ways that we could help more kids at a distance that they normally wouldn’t be able to help, right? And so, yes, to be able to touch and see where you’re touching is going to open up worlds that we’ve just never had access to, right?

Jonathan: And when I look at the split Braille feature that’s made its way into JAWS, you’d have to think, OK, this is some preparatory stuff that will be helpful for larger devices like this, where you could perhaps have more. It’s like having multiple monitors with a Monarch.

Greg: Yep, you got it. And that’s really what, when I’ve spoken to Ryan Jones at Vispero, I mean, that’s exactly what they’re thinking, right? Is when you have these conversations, I know sighted people that can’t stand working on one monitor. And I’m like, I don’t even understand what that’s like. It’s a world that I just don’t know, right?

Jonathan: Yeah.

Greg: But the reality is, are we going to see a world soon where blind people will sort of have that frustration if they don’t have split screen now on their tactile display?

Jonathan: So how much does the Monarch cost?

Greg: So it’s going to be just under $15,000 on federal quota. So it’ll be $14,900 on federal quota, and off federal quota, it will be $17,900 in the US.

Jonathan: Do you think that as manufacture ramps up, costs will go down?

Greg: Yeah. We know there is a declining price point, but we have to hit a certain number of cells, meaning we have to produce a large quantity of cells before we start to see cost savings. And that’s really where we’re working with HumanWare and organizations around the country, or around the world, I’m sorry, to be able to sell these devices internationally. We will have it localized, at least initially, in English, Spanish and French, right off the bat, and more languages to come, obviously. But really, it’s the number of cells. Once we produce a certain number of cells, then you’ll start to see the cost decrease. And we want to see more of these devices. And we think that we will be, you know, at CSUN, for example, we saw a number of multi-line attempts at Braille displays, right? And I think that’s, you know, in the same fashion that you’ve seen multiple Braille displays produced by multiple manufacturers, you’re going to start to see this is sort of the next phase of Braille being produced in this industry. And that’s a good thing. I always tell people, though, there has to be the first one, and the first one has to come out and be able to be purchased by people. And that’s really where we’ve never gotten to. We’ve seen so many prototypes and things like that. And that’s really where, when I came to APH back in 2020, this was our vision, was APH has the funding source, we have the partnership with the National Federation of the Blind. We have the ability with all these partnerships, and we haven’t even gotten into eBraille yet as a crucial piece of this. But if it wasn’t going to be APH, I don’t know who in our industry would be able to produce something that could be both manufacturable and have a funding source to purchase it, right? And that’s really where I’m excited to see it get out. Once it gets out and in the hands of kids, we’re going to be able to start to unlock all of the questions that people have of, all right, how much faster are kids reading?
Are they? Yeah, there’s tons. The literacy pieces, right? Right. And that’s the part that we just don’t have data on yet.

Jonathan: It would be interesting to go back and look at when the Perkins Brailler was released in 1951, when you take inflation into account, I wonder how much more expensive in real terms the Monarch is compared to that Perkins, because $17,000 sounds like a lot of money, but it is not a massive amount of money for assistive technology these days.

Greg: No, I mean, you remember the VersaBraille when that came out.

Jonathan: Yeah, I do, indeed.

Greg: What was that? It was $10,000?

Jonathan: Yeah, it was. I mean, and look at the original Kurzweil reading machine, for example, that was just hideously expensive.

Greg: Exactly. So I think when you look at it, right, any innovation in our field especially is going to be expensive. And I’m a blind person, you’re a blind person. Like, do I wish that it was under $1,000? Of course, absolutely. But this is the first device to ever do what this is going to do. And I’m 100% sure that we’re going to see cheaper devices coming down the road that will be able to do similar things. So, but it has to start somewhere, right? And that’s where I’m beyond excited to see this rollout, to see what it’s going to do. I mean, Jason led our field test. So one of the things that we have to do at the American Printing House for the Blind is before we put anything on the federal quota system, we have to do an official field test. And this was what, six weeks?

Jason: Six weeks. I think it was, what, maybe three weeks after I’d started with you.

Greg: Yeah, this is how you join my team. You join my team, and then I’m like, all right, buddy, it was either sink or swim. We’re going to do the field test of the most impactful product we ever released.

Jason: It was great, though, really, to see the energy from the students and the teachers just getting access to this. This is groundbreaking technology for them in the classroom. And so to get their response and to feel that energy firsthand, it was incredible.

Jonathan: What are they saying? I mean, what feedback did you receive from that field test?

Jason: Well, you had situations where kids were kind of lackluster Braille readers. This was the one that made me the happiest, I think. And where they really weren’t interested in reading Braille, kind of had some interest in technology, and the monarch comes into the classroom, and that’s what the kid wanted to do. They wanted to start touching Braille, start reading Braille daily, just to practice with the device, because it was so cool, new and futuristic.

Greg: I like the one kid that at the beginning, what did he say? He said, it felt like reading on wax paper.

Jason: Yeah, it’s my favorite. Well, it felt like wax paper. That’s how he put it.

Greg: Yeah, exactly. And because the monarch has a membrane over top of it, right? Now, it’s a different feeling. So what we did in the field test is we asked the same questions at the beginning of the field test, and we asked it at the end of the field test. And at the beginning of the field test, you had all these kids that were like, oh, the Braille feels different. I don’t know if I like the texture, those type of things, right? And you asked it at the end of the field test, and you never got any of those responses again. So it showed me that it was a learnable skill. It was something that people accepted and were happy with. We found that the point and click gesture to do the touch surface functions was a struggle at first, but at the end, they all got it. We heard numerous times that the kids were getting the touch gesture way faster than the teachers were.

Jason: One of the biggest things we learned from that, though, was searching in BRF files, the ability to find your cursor in a BRF. They were having difficulty with that, and it was software changes driven by their feedback that gave us a faster way of searching through BRFs now.

Greg: Yeah, one of the things that was eye-opening to me is, you know, on a single line refreshable display, you have a 1 in 32 or 1 in 40 chance of finding your cursor. When you have 320 characters on your display, finding your cursor…

Jonathan: ah, that’s a good point!

Greg: …can be a problem, right? I didn’t even think about it.

Jason: And they let us know it.

Greg: Yeah, so what it ended up doing was, in the latest build that we have on my device, there’s a feature with an internal name that I’m not going to tell you on this podcast, but it’s the Where the Heck is Our Cursor feature. And what it does is, it takes the line that your cursor is on and pushes it to the top of the display. So you always know that your cursor is going to be at the top of the display, on the first line. And it’s those types of bits of feedback, things that we just didn’t even think about until you actually have it in the wild. I think the other bit of feedback was just instant access to tactile graphics. The fact that this device connects into the APH image library with 2,500 plus graphics in it. And the reality is these kids just have never had access to this many graphics.
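For readers who like to see the idea in code, the cursor feature Greg describes can be sketched in a few lines. This is a hypothetical illustration, not APH’s actual firmware; the function name is mine, and the 10-line display size assumes the Monarch’s 320 cells are laid out as 10 lines of 32 cells, matching the figures mentioned earlier in the conversation.

```python
# Sketch of the "push the cursor's line to the top of the display" idea.
# Hypothetical illustration only; DISPLAY_LINES assumes 320 cells laid
# out as 10 lines of 32 cells.

DISPLAY_LINES = 10

def window_with_cursor_on_top(doc_lines, cursor_line):
    """Choose which document lines to render so that the line holding
    the cursor is always the first line shown on the display."""
    start = cursor_line
    end = min(start + DISPLAY_LINES, len(doc_lines))
    return doc_lines[start:end]
```

However the real firmware handles its edge cases, the invariant is the same: the reader only ever needs to check the first line of the display to find the cursor.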

Jason: Well, and what we learned from the field test, through assignments that we gave students, is that with limited training (and honestly, it wasn’t a long field test in terms of a student’s educational career), they were able to switch actively between bar charts and graphs and a Word document, and answer questions effectively. So looking at both forms of the document, the tactile graphic and then a worksheet to go with that tactile graphic or information, it worked pretty seamlessly. Most students had minimal problems doing that.

Jonathan: Is it keeping up with fast Braille readers in terms of refresh rate?

Greg: Yeah. Yep. And it’s because of the way the cell technology works. For those of you who have touched the Dot Watch back in the day and things like that, these are not the same Dot cells that you’ve experienced in the past. HumanWare and APH have worked with Dot for a number of years on really honing the cell to feel like a piezoelectric cell. There are a couple of things that are different. First off, the cell is electromagnetic. So what ends up happening is when the pin pops up, it actually locks in place with a locking mechanism. And the nice thing is that you can push your finger down on those cells and the pins do not go down, which is fantastic for kids or users with neuropathy who have trouble actually feeling the pins.

Jonathan: Or they just want to sort of try their hardest to see what they can do. I remember the old APH Handy Cassette recorder. Do you remember those?

Greg: Oh, yeah.

Jonathan: And I used to put my finger in the little, where the wheels spin around. Stop the wheel spinning.

Greg: Yeah. So you have those kids that would push the piezoelectric pins up and down throughout the day. So yeah, you can’t push these pins down. And the other downside to this, and like I said, we did have some compromises to make this happen, is that these pins do not refresh if you touch them while the display is refreshing. And I know that sounds shocking, because we know that kids, and users in general (actually, adult users were almost worse than the kids), are touching the display all the time, right? So we knew that this was going to be a problem from the start. And that’s really what created our need for the touch sensor. So with software, we built what we call the dynamic refresh. What it does is it looks at where your fingers are. And if you do happen to block a pin from refreshing, we know where your finger has been. And so when you move your finger away from that pin that was blocked, it will then refresh the cell correctly. We also do have a manual refresh feature if something doesn’t refresh correctly. But what we’ve noticed is that if you’re reading a book from top to bottom and you hit the page down button, your hands go up to the top of the display. And by the time your hands reach the top of the display, those pins have already refreshed, right? So you’re not interrupting anything at that point.
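As a toy model, the dynamic refresh idea Greg outlines, defer any pin change that a finger is covering and apply it once the finger moves away, might look like the sketch below. This is my own illustration, not HumanWare or APH code; every name in it is invented.

```python
# Toy sketch of "dynamic refresh": pins under a finger can't physically
# refresh, so deferred changes are remembered and applied once the
# finger moves away. All names are hypothetical.

class DynamicRefresh:
    def __init__(self):
        self.pending = {}   # pin -> target state, deferred because a finger blocked it
        self.display = {}   # pin -> current state (stands in for the hardware)

    def request(self, pin, state, fingers):
        """Try to raise or lower a pin; defer the change if a finger covers it."""
        if pin in fingers:
            self.pending[pin] = state
        else:
            self.display[pin] = state

    def on_finger_move(self, fingers):
        """Apply any deferred changes whose pins are no longer covered."""
        for pin in [p for p in self.pending if p not in fingers]:
            self.display[pin] = self.pending.pop(pin)
```

The real device presumably does far more (the manual refresh command, debouncing, tracking finger history), but the core bookkeeping is this simple: remember what each blocked pin should become, and catch up as soon as the finger leaves.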

Jonathan: So one use case where I can see this could be a bit problematic for me would be if I’m doing live internet broadcasting and I have to connect to a server at the top of the hour, I’m looking at the clock and the seconds are ticking over and I want my fingers on the Braille cell watching the seconds tick over so I know exactly when to press the connect button. Is that a use case that doesn’t really suit the Monarch?

Greg: It’s something that we’re looking at accommodating. We’ve also heard from a number of folks who want a blinking cursor, and a cursor can’t blink if your finger is on it, right? So we’re looking at ways of trying to accommodate that. It is a limitation of these pins, and it’s one of those things where there was no other technology out there capable of doing what these pins do. So it was one of the compromises that we had to make. And we think that the excitement and the beneficial features of having tactile graphics and Braille on the same surface outweigh those, but yes, those might be some limitations that we either have to overcome or that just don’t suit the Monarch, unfortunately.

Jason: And I would add to that what’s happened through the development process and developing some of the applications for the Monarch is we’ve gotten creative on how we do that. Like Greg said, that if it’s refreshing at the bottom of the page and you’re at the top, maybe not the use case of the clock. But for instance, in the Monarch startup app that we’ve developed, you can have scrollable Braille text at the top part of the screen. So there’s a whole paragraph that you can navigate down. But then there are tactile graphics beneath that go along with the startup. And I don’t want to give too much away, but they’re very festive, very enjoyable. But you can interact with basically tactile graphics at the bottom while Braille is loaded at the top or vice versa.

Jonathan: I think, Greg, last year we saw like a map application and various other things that you were testing at that stage where you could zoom in. And we may have talked about this too, but I’m interested in what further data you have about the degree to which tactile graphics can be taught to a blind person. And I don’t know whether this is a skill you can acquire or whether there might be certain types of congenital blindness that lend themselves less to this degree of spatial awareness. But I find it interesting that I see some blind people who can get to a tactile graphic and examine it and know exactly what it is. They just know what it is. And yet there are other people who examine the same graphic and it’s just not making sense. It’s not being translated, say, into whatever the picture is supposed to be. So I guess you would be looking at some sort of curriculum that tries to instill people with what they need to use the device?

Greg: You got it. So Jason and I were looking at the roadmap of apps that we’re going to be building down the road for the Monarch. And one of those apps is a curriculum that’s been around APH for years and years called Setting the Stage. It’s a curriculum to teach students how to interpret tactile graphics. Now, that’s always been on physical paper, and it has a whole process of going from 3D object to 2D object to tactile graphic, and that kind of stuff. We’re adding in another layer of that, which is how do you move from 3D to paper graphic to digital graphic? Because you add in an entirely new concept of panning and scrolling if the graphic is bigger than your Monarch’s display, right? And those are concepts that, we learned during the field test, are just entirely foreign to blind students, right? They’ve never done something like scrolling or zooming before.

Jason: Or point and click. It’s foreign.

Greg: Yep. So we’ve started the process of creating a curriculum. Our outreach team is working with us on that, because we know that it’s not something a student is going to be able to just pick up and do. To your question about the different subsets of people, we don’t have any data on it. I’ve heard from an academic perspective that there may be a link between specific vision loss conditions and having more or less ability to interpret spatial things. I don’t think there’s any data on that, at least for the time being. Having said that, I think we are a subset of the larger population, and if you look at it, there are sighted people who are awful at reading maps. So yeah, I think that’s a piece of it as well: you’re going to have some people who are just spatial learners, and you’re going to have people who are not. And our goal is to try to at least give everybody the tools to try to be a spatial learner. What we noticed specifically in our user testing over the past couple of years at NFB here (not the field tests), where we did a lot of testing with blind adults, was that there was a very small group of blind adults who actually had significant exposure to tactile graphics. Most of the adults that we surveyed were great Braille readers, and they had barely ever touched tactile graphics.

Jonathan: What does that mean for them? Is that a skill that can be taught in adulthood or is it like everything the older you get, the less neuroplastic your brain is?

Greg: I would love to say that they… well, here’s the thing. I think if you get exposure to it, it’s all about exposure, right?

Jason: Well, yeah, I would agree. A student, or an adult, might not necessarily need to read the latest calculus formulas or see that graphed out on a Cartesian graph. But one of my favorite tactile graphics that I show is Batman because…

Jonathan: Holy jumping jellyfish.

Jason: …Who knows? I mean, everyone has some type of familiarity with Batman, but have you seen Batman’s bat symbol? And I think that element of it is more playful. It’s the ability to see something maybe that you haven’t seen and that you wonder about. The Eiffel Tower, for instance, is one of the more popular graphics that people pull up just to check it out.

Jonathan: Yeah. Or even, for example, here we are at the NFB with probably close to 3,000 blind people. I’m sure those numbers will come in later at the convention. And there are certain quirks, idiosyncrasies about this hotel. Like if you don’t get on the correct elevator, you’re not going to get to the right floor because certain elevators serve different floors.

Greg: We did that coming up.

Jason: Coming to see you, that’s right.

Jonathan: Yeah. So if there’s a way to demonstrate getting around this hotel in a way that’s successful, you can sort of zoom in to parts of the hotel you’re interested in, and that’s a pretty compelling demo as well.

Greg: We have this dream of creating a, you query a location on a map, and you’d be able to get a tactile graphic of the streets around it, right? And be able to understand. So much of, when you think about it, a sighted person, when they’re using Google Maps, they’re getting the directions to the location, but they’re also seeing a visual map. And that’s just something that we as blind people don’t get. The ability to, before you got here, Jonathan, if you typed in the Rosen Center Hotel, and you got a layout of the streets around the hotel, and you could feel where you were in space, like that, that is my dream, to be able to give a blind person the ability to spatially understand where they are in the world, right? And I think that that’s really compelling in the idea of navigation and knowing where you are.

Jonathan: So Jason, let’s talk about the SDK. This is where we geek out a little bit. And I know how important SDKs are, the software development kit, because really this comes down to what can a third party do with this device and how much are they going to be able to get under the hood, as it were, and make things happen and really create an ecosystem. And I guess it makes sense for you guys to do this, because although there’s a lot of investment that may not appear to be yielding a return to begin with, it actually takes the pressure off you when it’s done, doesn’t it? Because it means that third party developers can then run with this thing and develop tech that you might not have even thought of at this point.

Jason: That’s the hope. Absolutely. Once we get to the stage of a possible hackathon, where you get people developing something for the Monarch in a short span of time, who knows what we would end up with? You really don’t know, once industry gets a hold of the thing. But I know, for what we’ve done in-house, it’s been a pretty amazing process seeing our very skilled development team work through the SDK and start learning its intricacies and what it can and can’t do. And we’ve got three apps that we really like to talk about right now, two coming out at launch and one a little later. Monarch Startup is one that we’ve worked on, and really it’s a tech demo. It’s introducing the person to the device, to the display, to this point and click ability that we’ve talked about. The approach I took with this was: if the device has the ability to show tactile graphics, then my goodness, when we do a startup wizard, we need to include tactile graphics as much as humanly possible, to get that exposure, to show people, in a way, what they’re missing. And what I love about this software is the learning: how do we show that there’s, for instance, more Braille on a scrollable screen? We’re inventing this UI in some ways as we go along. And it’s really cool to see something like that develop and to see the power that comes behind it.

Greg: And it’s gamified. I think that’s the part that I’m most excited about. We as blind people are used to these tutorials that are just dry and drab. This is an interactive, gamified startup tutorial where kids can see pictures of caterpillars and monarchs and things like that. There’s even a game to teach the point and click gesture, where you have to actually find the different monarchs on the screen and tap them with the point and click gesture. And it rewards you with some sound effects and things like that. But the kids love it, and they get really jazzed up about it.

Jason: It’s not just the kids, though, and that’s what I love about it. The adults who have never seen this type of interactivity are like, well, this is a caterpillar, it’s a butterfly, I get to click on butterflies? I don’t know, the response I’ve seen through our development efforts has been heartwarming, I would say. I knew kids were going to love it, but with adults, I didn’t quite see how they would take to it.

Greg: Well, and just to recap on how the software is being developed. Our partner, HumanWare, designed the hardware piece of this, and they do the software foundation. Many of you are probably familiar with the BrailleNote line of products and the Keysoft interface that is on those. We’ve modified the Keysoft interface pretty drastically, but it’s still Keysoft under the surface. So when you load into the Monarch, you land at a main menu, and they’ve designed the foundational apps that are on this device: your word processor, your tactile viewer, your book reader, et cetera. Most importantly, Keysoft does all the Braille input and output and the tactile graphics rendering under the hood, right? And when we look at the SDK, the software development kit, that’s what you gain access to: all of the intricacies of the Braille translation, the graphic rendering, the touch surface, the haptic feedback, all of the things that are in the device, but you’re not programming the core functionality of the device. It’s almost like you’re handed a black box with a set of keys that allow you to do certain things in it.

Jonathan: It’s amazing how that paradigm of Keysoft that Jonathan Sharp came up with years and years ago is still true. It’s been very portable and extensible, and it has survived changes to all sorts of operating systems. So under the hood, are you running Android, or what’s the operating system powering the Monarch?

Greg: There is Android under the hood, but you don’t see much of it. This is not a Google certified device. You’re not going to be downloading WhatsApp and things like that on it. This is a tool that is very purpose built. I’m sure people will find ways to install third party apps.

Jonathan: You can bank on that.

Greg: Yeah. But the software development kit: HumanWare had never built an SDK, and I jokingly say that I kind of dragged them kicking and screaming into it. I finished your job, Jonathan. You were barking up this tree 20 years ago. So they built this software development kit, and we are, I would say, sort of the beta testers of it. It’s being refined. We’re coming up with things that we need added, and things like that. What’s the second app that we built?

Jason: The second app is… well, they’re all my favorite, so I’m going to say they’re my favorite every time, but it’s Monarch Chess. What we’ve created is a chessboard that you don’t have to scroll, you don’t have to zoom. The entire game of chess is playable on one screen on the Monarch. And the impetus behind creating this was really Solitaire from Windows 95. Its sole purpose was drag and drop, to teach people how to use that new interface. Well, with chess, it was a way for us to reinforce this idea of point and click and the movement gestures, and really learn them in a game instead of just some tutorial or rote exercise. Chess also features a full playable game against artificial intelligence. You can play against a person in the room on the same Monarch. And it has a full tutorial, both for someone who’s not familiar with chess and for the way we’ve set up the UI to play chess, since no user would be familiar with that yet. My hope for this is that I start seeing groups of students that wouldn’t have been part of a chess club before start engaging with their peers and have that Queen’s Gambit moment, if you will.

Jonathan: I’ll have to send you a check, mate. When is the software development kit going to be available to developers? Are you aiming to have that at launch time for the Monarch?

Greg: So we’re going to continue to use it internally at APH. Our hope is by this time next year, we will have basically a call for proposals. We’re not going to make it completely public right away. What we’ll do is we’ll have strategic partnerships with folks who have ideas that they want to build, and we’ll confidentially hand them the SDK. As we get more developers using it, and we fix things that we know aren’t going to be fixed right away, then we could look at potentially making it public to anybody to build. But I think at this stage, we got to walk before we can run. So our hope is to have really the strategic partners by summer of next year.

Jonathan: You talked about a range of file formats, Word documents, PowerPoint presentations. One thing I’m curious about is how Excel spreadsheets lend themselves to a Monarch interface. In my previous role that I’ve just vacated, I had to deal with quite complex spreadsheets for a large organization with large contracts. Is that something you’ve looked at at this point?

Greg: Not yet. We’ve looked at how it handles tables and that type of stuff. We’ve been looking at things like the periodic table and how we can represent those types of things. It’s going to be just like we’ve done with every element of this device: creating a new UI from the ground up, right? It’s something that blind people have never had access to. I can’t tell you the amount of time that we spent just coming up with the focus indicator. One of the earliest problems that we looked to solve was this: on a single line display, whatever is on your display is what’s in focus, right? But when you have multiple items in a menu on a tactile screen with a multi-line display, how do you show what item is actually in focus? So we created this nine dot square that is always running down the left margin of the menu. You can run your finger quickly from top to bottom on that left margin and find exactly what item is in focus, so you know that if you press enter, or if you point and click on that item, that’s what’s going to get activated. All of these concepts we’ve had to build from the ground up, because there’s never been a multi-line UI created.
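The focus indicator Greg describes reduces to a simple rendering rule. Here is a hypothetical sketch of it; the marker character stands in for the nine dot cell, and all of the names are mine, not APH’s.

```python
# Sketch of the left-margin focus indicator: one marker glyph runs down
# the left edge next to whichever menu item currently has focus.
# Hypothetical illustration; FOCUS_MARK stands in for the nine dot
# "waffle" cell.

FOCUS_MARK = "#"

def render_menu(items, focused_index):
    """Return display rows: a marker column, then the item text."""
    rows = []
    for i, item in enumerate(items):
        mark = FOCUS_MARK if i == focused_index else " "
        rows.append(mark + " " + item)
    return rows
```

The point of the design is that the marker lives in a fixed column, so a reader can sweep one finger down the left margin and stop at the single row that carries the mark.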

Jason: Greg, according to the kids in field testing, that symbol is called the waffle indicator. Just thought you needed to know that. That’s the waffle.

Greg: The last app I want Jason to talk about is the one that I’m most excited about, because I know there are many times in the classroom when a teacher of the visually impaired doesn’t get a graphic ahead of time for the student, or doesn’t get something provided to them in time. And so, in real time, and I’m sure you remember these days, Jonathan, your teacher of the visually impaired is using whatever tools they have in their Mary Poppins bag to create a rendering of whatever that tactile graphic is, right? Whether it’s Wikki Stix or rubber bands or whatever, they’re coming up with these very creative ways to create a style of tactile graphic that you can at least learn from. The tool Jason’s working on now, I think, really changes that game entirely.

Jason: So it’s called Wingit. What Wingit does is it connects the Monarch to an iOS device. Effectively, the Monarch works as a receiving unit for the iOS application, also called Wingit. And what a user can do is draw in real time, and the minute they pick up their finger, it is on the Monarch. As an example, I was speaking at my alma mater, Auburn University, and in real time, and I’m saying three minutes, I was able to do a rough sketch of the campus using the tools from the Wingit app, just sending it over, buildings and all. And now there is Braille labeling built into it, so we can label those buildings as we go, and pan through the graphic, and move around what we’re creating in real time. So you could create a layout for the hotel right here in real time, and eventually we’ll be able to save that and export it.

Greg: So the idea here, and I always use a real simple example, is that if the class is doing the Pythagorean theorem and the right triangle is on the board, the teacher of the visually impaired could draw that on their iPad or iPhone. They could label, in Braille, each of the angles or the sides, and that would show up tactually in real time under the kid’s fingertips, and they’d be able to actually learn with the class in real time.

Jonathan: Wow, that’s pretty impressive. And I must say, it’s to Apple’s great credit that they really have embraced multi-line displays quite quickly. We started seeing things popping up in iOS some time ago now that lend themselves to these sorts of displays.

Greg: The only device that has been out there capable of receiving this type of stuff is the DotPad, Dot’s own multi-line display. And to their credit, they connected with Apple really early, and Apple has been creating these types of concepts from the ground up, using the DotPad as a tool to output to. The good thing for us is that we’re going to be able to take advantage of all the great efforts that Dot and Apple have made once the Terminal app is available. So our hope is that by this time next year, you’re going to see the Monarch supported by most screen readers. We control when they get the Terminal app; we don’t control what they do with it, unfortunately.

Jonathan: And this eBraille file format has been relabeled since we last talked to you, because it was called eBRF, and now it’s called eBraille.

Greg: Yeah, we realized eBRF is not used in every country, so we’ve had to make it more generic.

Jonathan: eBraille is kind of descriptive. And it’s going to be important, I think, for this industry, for the graphical Braille display industry, that there be some sort of standard, so that if you have a graphical file, it should be readable on a wide range of devices. Is that one of your hopes for this?

Greg: So eBraille is device agnostic, right? It’s going to benefit your multi-line displays. It’s going to benefit your single-line displays. It’ll even benefit your embossers, because with the eBraille format, you’ll be able to package Braille and tactile graphics into the same package, which is something that doesn’t happen today. As for the way that Braille textbooks, or books in general, are produced today, I learned more about the ground-up production of a textbook in this process than I ever wanted to know, because it is still very archaic. Things haven’t changed in 40 years. What ends up happening is you have these tactile graphics that are produced by tactile graphics artists, and they’re usually folded into the textbook at the end of the process, if they’re folded into the textbook at all. Sometimes they’re just thrown into a graphical volume of the textbook, which is, in my opinion, the worst case scenario, because now a blind student’s got Braille on one side and tactile graphics on the other. Having said that, eBraille allows you to take all of the benefits that you would get from ePub. So you have the ability to go with a marked up Braille file, essentially, and jump heading by heading, chapter by chapter, section by section. You can use all of the things that you’re familiar with from your screen reader, so you could jump to a table or a list or things like that. But most importantly, if a graphic exists in the textbook, it’s going to be marked. We will have a symbol that shows that there’s a graphic here, and if you point and click on that graphic, it’s going to open up in our tactile viewer, and you can look at that tactile graphic in real time. With eBraille, we are just about to release the first specification, the official spec. We’ve partnered with over 40 organizations around the world, along with the DAISY Consortium, to bring this file format into existence.
As APH, I had no interest in designing both the Monarch and a file format at the same time. So we partnered with DAISY, who have a ton of experience with bringing file formats into existence and shepherding them through the process. We also partnered with all of these different organizations around the world because we needed buy-in; if we’re going to do something like this, it has to be used worldwide. And so I’m happy to say that the first spec will be released, hopefully, by the end of July. What’s going to end up happening is that spec is going to go through revisions, as we expect. We’re going to get feedback from the field, and I fully expect that once exemplars are created, which are really the ideal eBraille files, we’ll do a revised spec, most likely in the first half of next year. That’s also when you’re going to start to see the Duxburys and the Braille Blasters and the other software pieces start to build in support for eBraille. Duxbury is a supporter of this, so you’ll be able to create your eBraille files through Duxbury and through Braille Blaster and things like that. Our goal is to start producing converted eBraille files. We’re not going to get to the point where we’re creating eBraille files from the ground up yet. Internally at APH, we’re building a converter that will take BRFs and convert them to eBraille files, and our hope is to start converting our most popular textbooks that we produce at APH, converting those BRFs into eBraille, starting in September.

Jonathan: Is there user benefit in that? I mean, what’s the advantage of converting the file?

Greg: Huge benefits. Right now, for example, if I take a BRF and throw it on the Monarch, even though the Monarch is multi-line, if I want to go to chapter five of that BRF, I have to type in C-H-A-P-T-E-R with a find command within the BRF reader, and I’m going to come up with every single instance of the word chapter, right? Whereas with a marked up eBraille file, when I convert a BRF, we’re using some algorithms along with a little bit of AI to basically detect what is a heading, what is a list, what is a table in that BRF, and we actually add the markup back in to produce an eBraille file. It also allows you to take the tactile graphics that would be part of that textbook and put them into the package, so that if your BRF has tactile graphics as part of it, you’re going to be able to create an eBraille file and look at your tactile graphics on the Monarch in real time with that textbook.
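Greg doesn’t describe APH’s actual detection logic, but one plausible heuristic for a fixed-width BRF, offered here purely as an illustration and not as the converter’s real algorithm, is that formatted Braille headings are usually centered: a short line with roughly balanced leading and trailing space is a heading candidate. The markup tag, names, and the 40-cell line width are all my assumptions.

```python
# Illustrative heading-detection heuristic for a fixed-width BRF.
# Not APH's converter; LINE_WIDTH assumes a common 40-cell Braille line.

LINE_WIDTH = 40

def looks_like_heading(line):
    """A centered line (padding on both sides, roughly balanced) is a
    reasonable heading candidate in formatted Braille."""
    text = line.rstrip()
    lead = len(text) - len(text.lstrip())
    body = len(text.strip())
    if body == 0 or lead == 0:
        return False
    trail = LINE_WIDTH - lead - body
    return abs(lead - trail) <= 2 and body < LINE_WIDTH - 4

def convert(lines):
    """Wrap candidate headings in minimal markup; pass other lines through."""
    out = []
    for line in lines:
        if looks_like_heading(line):
            out.append("<h2>" + line.strip() + "</h2>")
        else:
            out.append(line)
    return out
```

A real converter would need much more than this (lists, tables, page and line numbers, and the AI assistance Greg mentions), but it shows how structure that a BRF only encodes visually can be turned back into navigable markup.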

Jonathan: Cool! When can I have this in my Mantis, Greg?

Greg: That’s a good question. I would expect probably we will look at supporting this when we start rolling it out into the software packages, so I would see that as on next year’s roadmap.

Jonathan: The Mantis has been a very successful device, it seems to me. People have been waiting for a really good Braille display with a QWERTY keyboard, and I love this thing. It deals with all the weird Braille translation anomalies in certain devices, not naming Apple specifically, but it just works. So how’s that going? Do you think we might ever see any kind of revision? The one regret I have, I guess, about the Mantis is that it didn’t have a speaker, so some of the new functions in that software that are making their way to other HumanWare-type devices and the Chameleon are not in the Mantis.

Greg: Correct. We know that that was something the community was looking for, and yes, we’re always looking at what is the next thing down the road. The Mantis came out in 2020, so from a technology perspective, it’s ancient at this stage. So we are looking at what’s going to come down the road. I will say, you know, we mentioned Larry Skutchan. This was one that I entirely missed on, and I give Larry full credit. From my time at HumanWare, the BrailleNote QWERTY keyboard versions always had the lowest number of sales.

Jonathan: But you know why that is? It’s because Russell, in the early days, decided that he was going to make one case, and the one case would fit both the QWERTY and the Braille version. So the keyboard of the BrailleNote QT, or whatever it was called, was very cramped.

Greg: It was. It was really tiny, really small. And so for the longest time, I said, well, obviously the sales aren’t good, so nobody wants this. And Larry, to his credit, kept pushing and kept pushing. I remember he talked about this QWERTY keyboard Braille device probably since 2014, and it kept getting shot down. And finally, in whatever it was, 2018, Larry came back and said, let’s try this again. And he got somebody to bite on the idea. And God bless him.

Jonathan: That was kind of Larry’s parting gift to the industry, really.

Greg: It was.

Jonathan: It’s a fantastic device.

Greg: It has sold, I think, better than anybody had expected. And I think it’s also a matter of timing. We’re in a world now where Bluetooth keyboards are very, very popular. Blind kids are using a wide range of devices now. So a tool that is not specifically Braille input also develops a lot of transitional skills for these kids when they’re going into university and the workplace and things like that, because the world doesn’t always use a six-key entry keyboard, right? You have to be able to use a computer down the road. And I think it’s showing that it creates these skills earlier for these kids.

Jonathan: That’s a good point. And I just find inadvertent chording can be a problem. I’m a pretty fast braillist, and sometimes you’ve got to be careful not to accidentally press the space bar in combination with another key and trigger a command. So I like the Mantis a lot. It’s a great device. And the keyboard is very good on it.

Greg: Yeah. They really did well with that keyboard. I think it’s really been well received. And we are looking at ways, Jonathan, of could we route audio through Bluetooth headphones and things. So I think more to come on that. You will see an update to the Mantis here shortly. We did get a lot of requests from our, especially our Braille proofreaders. They wanted a lot of the BRF functions of being able to set your line and page length in the Braille editor, along with being able to query your location on a Braille page. So I didn’t realize how many folks actually use this for Braille proofreading.

Jonathan: Well, I don’t want to keep you from your adoring fans in a little over half an hour, but I appreciate you both giving us so much of your time. What’s the best way to keep up with what’s happening with The Monarch? Because it seems to be very fast moving.

Greg: It is. It’s probably our fastest moving project. I mean, when you think about it, right? This project started in 2020, summer of 2020, and in four years, we are about to release the product, which I can tell you in APH time is light speed here. You can go to https://aph.org/MeetTheMonarch. That’s the Monarch landing page. If you have any questions, you can also email DTD, Dynamic Tactile Device. We still haven’t gone to the other one, but it’s DTD@aph.org. That is a way to get in touch with folks regarding any questions you have. And I always tell people, you mentioned the funding and the costs, and the cost is the largest price tag I think we’ve seen in a long time. The only way that this gets funded, and I always like to finish with this, is if you, no matter what country you’re in, tell your story to the governing body, right? So here in the US, for example, we have had numerous conversations with Congress people to really talk about how this changes the game in regard to students. The student story is the easiest one to tell, I believe, because from a blind student perspective, having access to textbooks instantly, a Braille textbook can sometimes take 12 months to produce and can cost in the US upwards of $60,000 to produce an algebra textbook. So a device like this that can hold all of your textbooks, we always use the time-to-fingertips metric of getting the textbook under the kid’s fingertips faster. But from an economics perspective, it’s going to cost less to produce these textbooks in a digital fashion down the road. So I always tell people, tell your story to the governing bodies, that’s how this gets funded. APH and others can advocate, but it’s really your stories that are going to make this sort of move the needle here.

Jonathan: Fantastic. Thank you both. We’ll look forward to staying in touch.

Advertisement: I want to thank Aira for being a sponsor of Living Blindfully. Aira is happening right now. There are new Aira access locations being added, particularly all over the United States. I see they’ve brought another couple of airports online recently. They’ve got support for Remote Incident Manager. I’m using that so much these days. I can dial in and get assistance with computer related tasks so much easier than I was ever able to before with other solutions. And of course, there are all sorts of exciting innovations coming. If you’ve been following Aira’s social media channels, you’ll be aware of those, with pilots coming up involving the meta glasses and all sorts of other innovations. And of course, there’s Access AI, which is one of the coolest implementations of AI that I have ever seen, and I love to use that as well. So do check it out. To find out more, hit up the Aira website at aira.io. That’s aira.io.

Positive update on TPLink accessibility

Christopher Wright says, Hi Jonathan, this is a quick update to let everyone know TP-Link appears to take accessibility seriously, at least right now. After going back and forth for a while, and getting frustrated because it seemed like my concerns weren’t being taken seriously, I was first offered an older version of firmware. I declined, but indicated it might become necessary if things became even worse. I was given a beta version to download on July 11th, and I was absolutely amazed. While there are still unlabeled or strangely labelled controls, the experience is significantly better than it was, even before the update that made things really bad. This was achieved in a very short span of time. I made my gratitude known, as well as submitting additional improvements that can be made, so I’m waiting for either another beta version to be sent to me, or a public version with the changes. In the meantime, I’ve disabled automatic updates, and I have a local copy of the beta firmware if I need to go back to it. This is very encouraging. I didn’t expect TP-Link to step up to the plate, particularly when so many companies and the general public blatantly ignore us. I hope this new philosophy continues with future models of the Archer line. If it does, I’ll be a very happy TP-Link customer when it’s time for an upgrade. That is very good news, Christopher, you see? You got to keep the faith, you got to keep advocating, because you never know when that brick wall that you’re banging your head against might just move a tiny bit, and there’s an improvement to report. So that’s great stuff.

Peacock, ExpressVPN, and Pulseway

Rich: Hi, Jonathan and Living Blindfully listeners. This is Rich Yamamoto. You mentioned Peacock to stream the Olympics, and I don’t know what the deal is, but Peacock has a lot of audio described material. But for some interesting reason, at least on the iOS app and the Apple TV app, actually, you can’t get to the audio and subtitles button. I don’t know where it’s at. I don’t know why you can’t do that, but you can’t, the player is very weird to get to the thing you want to get to, which is a real shame, because I want to watch The Office with audio description, but I can’t watch The Office with audio description. Therefore, I’m not watching The Office right now, because I don’t know of another place to find The Office with audio description, and I really want to watch The Office. And it makes me sad that I can’t watch The Office.

Yes, ExpressVPN is awesome. I got it after you talked about it on the podcast a while ago. And I think it’s incredible and wonderful and all the things. And I’ve been able to use it to bridge, you know, and connect securely to the internet. I used it a lot in open networks. I’ve been to a lot of coffee shops to do some work, and so it’s been really nice to use my VPN through ExpressVPN to get that working.

And then my final comment is about Pulseway, actually. And I wanted to talk about that for a second because I remember when you mentioned it a while ago, I thought, man, that would be incredible for me to access on my phone, and I can control my work laptop at my house in Wichita because I work two jobs. And I can do basic maintenance, you know, restarting my computer remotely, applying Windows updates, things like that. And so I set up a free trial, and I’ve really been liking it. And what’s really cool is that I had somebody specifically tied to my account reach out to me this week, and just to check and see how it was going, and he told me about all the different services they have and things like that, and he wants to get back in touch with me this Thursday, August 1st, at the time of sending this out. And he wants to get back to me and check in again and see how I’m doing with it. And I’ve never had a company reach out like that. I’ve never had a company call multiple times in a week because I was busy doing family things and attending workshops and things while I was on vacation. And so I’ve never had a company reach out and just incessantly call me to make sure that I’m doing okay with their program and offering any help that they can. Especially when I told him I was a screen reader user, he said, well, how can we make this more accessible for you? And so that was awesome. I’ve never had that happen before, and that was really cool.

Jonathan: Look, Pulseway is an amazing piece of software, Rich, and their customer service, I agree with you, it’s extraordinary. The amount of trouble they went to to get me back up and running, when my luck finally ran out and I had to upgrade the product I’d had for years that they deprecated, was just super impressive, given that it was just one little installation and I’m not spending a huge amount of money with them. But Pulseway is a lifesaver. I’ve been in situations where I’ve been out, something’s happened to the Mushroom FM PC. I can log in from my phone, see what’s going on, restart it. Everything’s back up and running again. It’s a very cool tool indeed.

And just a reminder to people that if you’d like to try ExpressVPN and protect your traffic when you’re in public places or appear to be in another location, then go to livingblindfully.com/vpn. That’s livingblindfully.com/vpn. You get 30 days free by signing up through that link, so you can try it out and kick the tires. And we get 30 days free as well, so we appreciate that.

Overdrive Media Console is no more

Debee: Hey, Jonathan and listeners. This is Debee Armstrong. I wanted to share some information about Overdrive Media Console. If you don’t know what that is, then you’re not really missing anything because it no longer works. However, if you do know what it is, I’ll give you a little bit of an explanation about what happened.

So Overdrive Media Console was a program for the Mac and the PC that allowed you to download library books in audio from your local public library. And the MP3 files would be copied onto your PC or Mac. And then you could move them over to an MP3 player of your choice.

And I’ve listened to these books on a Movo, on a Zen Stone, on an Olympus digital recorder, on a PlexTalk Pocket. Thank you, Jonathan, for the great tutorial on the PlexTalk Pocket back in August of 2011. And I’ve also listened to them on all three generations of the Victor Reader Stream. And the books were not copy protected.

The way it worked is that you got a license to download the books. And when the license expired, the software would automatically delete the books off of your computer. And so what I always did was move them over to an external drive before they expired. So I had plenty of time to listen to them on the MP3 player of my choice.

Well, about a year and a half ago, the program disappeared from the internet. But luckily, I had saved the install file. So when I got a new computer, it was no big deal. I could put Overdrive Media Console on my new computer.

Last week, I downloaded six audiobooks from my public library. And Overdrive Media Console worked fine. And I transferred them to my stream and to my external drive so that I would have them if they expired. All was good.

This week, I went to the website for four different public libraries I belonged to. And guess what? That option has completely disappeared. So you can no longer download those audiobooks to the MP3 player of your choice via the software on your Mac or PC. That option has gone away, closing the last loophole that allowed you to bypass the copy protection. So, sadly, you now have to listen to the books on your phone or tablet using the Libby app, L-I-B-B-Y.

Now, Libby is still very, very accessible. In fact, it gets better and better. They keep adding more features. So if you like listening on an Android tablet, an iPad, an iPhone, a Kindle Fire, or I believe some of the Smart TVs also have the Libby app, and there is also the website, libbyapp.com. You have lots of ways to listen to the books. But of course, when they expire, they expire. So you can set your loan period for 7 days or 14 days or 21 days. But when they’re gone, they’re gone. Another person gets to borrow them. And the libraries buy licenses for only a certain number of copies. So that’s all fair. But it is a little frustrating for us print impaired folks if we want to keep the books around.

So just letting people know, it’s dead, but you can still listen on many, many devices.

The South African election

George: Hi Jonathan, this is George again. I was wanting to ask you if you had looked into the South African general election that we had in May.

It’s been 30 years since South Africa became a democratic country, and we for the first time have a coalition. Now, the African National Congress has been in power since April 27th, 1994. Many people have not really been particularly pleased. In fact, I would say many people have been very angry, myself included, with the way that they have run the country.

Now this isn’t just related to normal everyday services, but also services for the blind. I mean, we have no accessible traffic signals. I’ve only ever seen a couple in the country, and they were in a small little town.

Our rail network is destroyed. We don’t have a passenger rail network anymore. The post office doesn’t work. The state-owned electricity company doesn’t work, which means we don’t have a stable power supply. We do have a small inverter in our house, which uses batteries to convert to AC current, which helps us when the power goes off. And things just don’t work.

So I know that you are very well-versed in politics, and I would like to hear what you think of the current coalition. I don’t know how much you may or may not know, but if you were to look up the information that you would need, I would love to hear your thoughts.

Jonathan: Well, George, I certainly did follow the campaign and the coverage, and it was a momentous day, and perhaps a reflection of a maturing of a democracy, that this coalition now exists.

Given that one of our Mushroom FM fun guys is from South Africa, that’s Gary G., who does a show every weekday on the station, I’m well aware of the chaos that the power blackouts have been causing.

There’s probably a bit of a honeymoon and a bedding in period, and it’s probably too early to say whether real change is coming or not. But certainly after 30 years of rule by one party, it is a very significant, transformative time for South Africa. And it’ll be interesting to see what happens and whether blindness services mature any.

Sweet Dreams app for some continuous glucose monitoring systems

Justin: Hi, Jonathan. This is Justin Daubenmeier talking about continuous glucose monitoring, CGM systems.

I have a Dexcom G7, and I found an app that I wanted to share with everyone. It’s a really cool little nugget that I came across. So this app is called Sweet Dreams. You can get it on the App Store. Unfortunately, it is not available for Android, but it works with the Dexcom G6, G7, Libre system, and there’s one more. I can’t remember what it is. But this is an app that works alongside your CGM app that you install. And so Sweet Dreams allows you to turn on a feature to speak your blood glucose level.

So at first, I thought, well, it comes with a seven-day trial. So I just thought for fun, I’d download it and check it out. I really didn’t see the benefit to it because I used the Dexcom G7 app. I can read my sugars with it. So I downloaded it for a seven-day trial, and I turn on the speech feature, and then I just set my phone down on my desk, computer desk, and I start working. And eventually, it says, your sugar is 90 and stable. I thought, okay, so then I just keep working, and it’s like, your sugar is 100 plus 10. I was like, huh, this app just told me my sugar increased by 10 points, and it spoke it to me. I’m like, well, Dexcom does not show me how much I’m increasing numerically. So I was like, that’s kind of slick.

So then, you know, next time it speaks, I’m still working at my computer desk, next time it speaks, it’s, you know, your sugar is 120 plus 10. I’m like, wait a minute. Now, this is letting me know how much I’m increasing. And because I’m going up 10 at a time, I’m going to go take a little bit of insulin. That was kind of like the aha moment for me.

So this app not only tells you and speaks it to you how much you’re going up in your sugar, but it also will tell you how much you’re decreasing. So for example, it might say 130 stable, okay? Then next time it speaks 115 minus 15. Whoa, that’s a big drop. So I’m going to get some glucose and eat some so I don’t go too low. Whereas the Dexcom would just say 130, we go back to the Dexcom app, it says 115. Maybe we remembered 130, maybe we didn’t. But the benefit to this app, besides it’s speaking to you, is that it totally got rid of all the tinkering. Me asking Siri what my sugar is, because I can’t see it. Or having to grab my phone, go to the app switcher, tap on Dexcom, swipe down and hear my sugar. Or use the shortcut, like I said with Siri, grab my phone, hold the power button down, what’s my sugar? Or say, hey Siri, what’s my, right? So it gets rid of all of that tinkering and mental work, where I can focus on what I’m doing and every five minutes on a Dexcom, it reads my sugar and lets me know how much it’s increasing, or if it’s stable or how much it’s decreasing. To me, that was a big benefit.

And then the second benefit I found, aside from working is even when I’m cooking, I’ll have it on speak and then just sit it on the countertop so that I don’t have to go over here and there and check my sugar to see what I’m at. It just tells it to me. I put in earbuds, I’m talking to a friend, and it speaks my blood sugar right during a conversation. And it speaks it low enough to where it doesn’t overpower the person I’m listening to. And it allows me to monitor that sugar because there’s times I’m on a call and I’m talking for 20, 30 minutes and the whole time my sugar is dropping and it doesn’t alert me until it goes below 80, for example. And maybe it’s dropping pretty quickly, not rapidly, but quickly. Whereas on this app, when I’m talking on the phone, it’s going to say, you know, 150, then it will say, you know, 140 minus 10, 130 minus 10. So I know I’m dropping and I can catch it a lot quicker because it’s being spoken to me. So this was a pretty cool feature.

The other feature that’s kind of neat that I don’t use, but you can sync this Sweet Dreams app to your calendar on your phone, and then it will show your sugar on your MacBook, and it will show your sugar on CarPlay, which again doesn’t matter to me as a blind person, but if I’m out with sighted friends and family, I could literally put my sugar on their CarPlay and they could just see it right there on the dash as they’re driving. So that’s kind of neat.

The other thing it has is the concept of groups, where for example, you could have five to eight of your friends and family join a group, and they can watch your blood sugar, and they can chat back and forth, and they have an option where you can give a thumbs up to let everyone else know in the group, I got this. So for example, say, God forbid, your child or you yourself, you’re sleeping and you sleep through your alarm to wake up because your sugar is low, and you have 10 friends or five friends and family get notified. Someone gives a thumbs up and everyone in the group gets the thumbs up, so they know, hey, Sally’s got this. And so it’s kind of neat, plus they can chat back and forth in the group. So that’s kind of a neat feature.

I’ve spoken with the developer through email. He is an iOS developer outside of university. He just started iOS development, and he is very familiar with accessibility. The company he works for has to program accessibility legally, so he’s pretty fluent in it. So the Sweet Dreams app is fully accessible. And he’s very excited about the voice feature because right now it just simply speaks it. You can’t adjust the speed. You can’t adjust the volume. He’s excited to get those features in place because a lot of people use this speech feature. So if they’re jogging, they’ll have it on, or working out, they’ll have it on, stuff like that. So this has gotten a lot of traction. So he is planning on eventually adding a volume to it. You can turn it up, just like how you turn up Siri on the phone. It is using the Siri Voice to speak it. Right now, it only works with English, but he is getting in place where it will speak other languages as well. So this is a cool little feature. And then I mentioned to him all the voices that come with VoiceOver, and the ones you can even download. And so he’s looking into that. He may add different voices for you to choose from. So this isn’t something he’s probably going to do in the immediate, but he definitely is going to get some of this in place.

It does work with the Apple Watch as well, which apparently is a big feature for a lot of people. I have an Apple Watch, but I really don’t use it for checking blood sugar. So I can’t speak to that, but it is supported, fully supported.

It is $19 a year for the app. And so for me personally, just having that be able to speak to me, like even now as I’m recording this, it’s speaking to me what my sugar is. Now it’s using VoiceOver. So when the app has focus, it’ll speak using Siri. That’s what I recommend. So when I say it’s speaking to me, I have the app open with focus, the Sweet Dreams app, and then it’ll speak using the Siri voice. If not, you’ll get that bubble tip, like at the top of the screen, and VoiceOver will speak it. So it’ll also be on your lock screen. So apparently people like that.

It’s on the Dynamic Island. I don’t have an island. I have an iPhone 12 Mini, but it’s on the island. You can touch the island and kind of hear your sugar. So it’s just got different kinds of things you can play with. But it’s a really cool little app. It’s kind of like one of those nuggets that kind of, it’s like a white glove that fits on nice and snug and tight, and precise, and it just fills a niche. And it’s like what a lot of people are saying.

There’s a Facebook group you can join if you have questions. They’re very helpful and supportive. It’s a nice community. But people are like, they don’t understand why Dexcom or Libre hasn’t done anything like this. And so this guy’s very passionate about it. His younger brother has diabetes, so this just started as a family app. And then he put it out commercially, and he was shocked at how many people just grabbed it and started buying it. And it took off like wildfire. So I’d encourage you, if you use a Dexcom or a Libre, check it out. Or if you have family and friends that you know have a Dexcom or Libre. There was a third one that’s supported. I can’t remember their name. It just came out. My apologies for that, but I’m sure it’s in the app. You can swipe through everything, tap on everything. Everything’s spoken. There’s nothing in the app that’s not spoken. You know, so that’s really cool too.

Experiences with Apple Vision Pro

Jonathan: Let’s talk about Apple Vision Pro, which we haven’t done too much talking about on this show actually. Scott Greenblatt is writing in and said, I visited my local Apple store yesterday to get my iPhone 15 Pro fixed. And since I had over 1.5 hours to wait around without anything to amuse myself, I decided to ask them for a demonstration of the Vision Pro goggles.

I was not entirely surprised to learn that these goggles are not something that I would wear all the time due to their size, weight and battery life. But according to what the Genius Pro told me during the failed demonstration, the Vision Pro goggles have the capability to almost give back sight to the vision impaired and those who are completely blind.

I have to take what Andrew said at least with a grain of salt for the time being, because we didn’t get the Vision Pro demonstration to work properly once VoiceOver was turned on. But from what he told me, a blind lady who used to work at my local store was relocated to California to work directly with the team who were designing the Vision Pro goggles to work better with VoiceOver and make the goggles more user friendly for those of us with vision problems.

The demo did not work because the goggles kept picking up my hand movements as gestures it was supposed to pay attention to, and we couldn’t get the goggles to work through the entire setup with VoiceOver as a result.

The gentleman who helped me told me he would blindfold himself and work with the Vision Pro goggles as well as reach out to the blind lady who moved out to California and he would be ready to give me a proper demonstration of the Vision Pro goggles when I next come into the store.

According to the Genius Pro from Apple, these goggles will provide real time AI interpretation of everything you look at and it will even tell you when an object or barrier is approaching. It will also tell you to back off when you’re too close to an object. It seems like these goggles don’t need to be trained in terms of object identification and they either can or will be able to identify people you meet up with as long as those people are identified in your photo library.

Since I’m just trying to catch up and learn all I can about the available wearables for the Vision Impaired, I’m not sure if you’ve ever done a comprehensive review of the Vision Pro goggles. If I’m incorrect, would you please tell me which episode of Living Blindfully I need to review to get my questions answered? On the other hand, if you haven’t done a comprehensive review of the Vision Pro goggles, are you planning to focus on them in a future episode? Can you compare the Vision Pro goggles to the Envision glasses and Ray-Ban smart glasses to explain how each one benefits our population?

Since the price tag on the Vision Pro goggles is so high, around $3,500, I’d like to get all of my ducks in a row to determine what is the best wearable option for me.

Scott, we’ve not done any kind of demo of the Vision Pro. First of all, it’s not available in New Zealand at all. So unless Apple or someone would have sent me one, I’m not able to access it. I’m not aware of that many blind people who are using this. I understand it’s having a significant impact on some with low vision, and that some of the results being achieved there are very good.

Based on what I know about my iPhone Pro, it sounds like some of the things that the Genius Bar Rep was describing to you are available on the iPhone Pro, but obviously in a wearable form factor, they might be more convenient. So on an iPhone Pro, which is equipped with LiDAR, you can switch on people detection, door detection, text description, and scene description, and that all happens in real time. So if you walk around with an iPhone held out, or maybe strapped to yourself in one of those harnesses that are popular in the blind community, you can get a lot of this, and you can hear what’s going on, you can be told where doors are. It is pretty impressive. It’s low latency, it’s near real time, and obviously that lends itself very well to a wearable form factor.

The Vision Pro is, I understand, heavy, and it sounds like you’ve confirmed that. The battery life is pretty low. So it might not be quite what we’re looking for, but I’m happy to be contradicted by somebody who’s bought one and is using one in the real world. I’d love to hear some stories about that.

Mark Gurman is a journalist for Bloomberg who seems to have all sorts of inside scoops at Apple. I don’t know who’s leaking from Apple to Mark Gurman, but his sources seem incredibly reliable. He has said recently that Apple continues to work on a lower cost Vision Pro, and also, perhaps more significantly for the use case that you are talking about, glasses. And I think we will find that there’ll be a bit more competition in this glasses space in the very near future. And if Apple comes out with a pair of glasses with some of these iPhone Pro-type features in them, I think that will be a very significant development for our community.

But if we do have someone listening who’s got experience with the Vision Pro, or Apple, if you want to send me one, I will send it back, you know. livingblindfully.com/opinion is where you go to share any thoughts. livingblindfully.com/opinion.

iPhone 15 questions

Catherine: Hi, Jonathan, it’s Catherine Getchell from Pittsburgh. Hope you’re doing well. I have a question, two questions actually, for you and your listeners.

I finally, finally switched from my iPhone 12 to a 15 Pro, because I’m really excited about taking advantage of the Apple Intelligence features. I’m really excited about Siri not being dumb anymore.

But I was taking the opportunity to go through all my settings since I have a new phone, just make sure everything’s what I want it to be. And I decided to change the way that VoiceOver announces actions from the speaking actions are available, to the little b-b-b sound thing that you have turned on on yours, Jonathan. And so I went into VoiceOver and then Verbosity, and then under Actions, I changed it from speech to play sound. And I noticed that once I changed that setting, it seemed to be playing a sound after every swipe that I would make, like even swiping on the home screen, just swiping around after every single thing, it would make this noise. And that was really annoying. It doesn’t say actions are available after every swipe. So I went back and put it back to speech, because it’s just the little constant three blips of sound was annoying. So I was wondering if there’s any way to change the amount of times that it makes a sound or when it makes a sound. Is there a further way to customize that?

Not that I have found, Catherine. I hear the frustration. It can be quite annoying to hear the tone, but I find actions available even more annoying. So I put up with it. I’d also like a choice of noise, because I find that one a bit bassy, and it can kind of give you a headache after a while. But I still prefer it to actions available. So maybe we’ll see some more customization in future.

Second question. This has been going on ever since I think my iPhone 8. And I keep hoping that when I get a new phone or the OS updates, that it will not happen anymore. But it keeps happening. And what’s happening is when I am in the messages app, the text messaging app, and I’m responding to a message, if it’s a message to someone else with an Apple device, and I have Braille on-screen keyboard turned on on my rotor, so I just flip the rotor to on-screen keyboard for Braille and start typing. But sometimes, maybe about half the time, when I’m messaging someone else with an Apple device, it decides that that means that I want to audio record a message. I have no idea why I haven’t turned on audio recording or anything like that. It just somehow thinks that when I flipped the rotor to Braille on-screen keyboard, and specifically, I use the screen away mode, I notice it does not do it in tabletop mode. It does it only in screen away mode. So I was curious if anyone else who uses this method to type text messages has this problem and if there’s any way to turn it off.

I use Braille screen input an awful lot, but in tabletop mode exclusively. I’m a tabletop guy, so that could be why I’ve not seen this one. There may be help on the way, Catherine, because in iOS 18, there’s a completely different way that you can invoke Braille screen input. You can put it back on the rotor if you want to, but there’s a new way and we’ll explain that in the iOS 18 tutorial. So maybe that will fix it. I’ll be interested to know if it does.

Advertisement: Living Blindfully is brought to you in part by the Disability Disrupters Podcast. Are you ready to break barriers and create a rich and fulfilling life despite the challenges we all face as disabled people? Check out the Disability Disrupters Podcast, bringing you news, views and interviews with disabled people around the world. Insightful news: stay updated on the latest happenings in the disability community from a disabled person’s perspective. Diverse voices: hear from inspiring individuals sharing their personal journeys, including struggles and successes. Practical advice: learn from others about navigating barriers to community participation. Empowering conversations: discover strategies to create a fulfilling life on your own terms. Join the monthly discussion on the Disability Disrupters Podcast. It’s available on the Disability Responsiveness website at www.drnz.co.nz. That’s www.drnz.co.nz, or search for it in your favorite podcast app. Disrupters is spelt with an E-R, the Disability Disrupters Podcast. Let’s disrupt the status quo and show the world what we can achieve together.

Recycling experiences

Dave: Hello, Jonathan. This is Dave from Oregon. I heard the comment about recycling. In Oregon, we have had for several years what we call bottle drop. We collect our cans and bottles that have a 10 cent per container redemption rate that we pay for when we buy them, and we drop them into green bags on which we attach a barcode that we get printed out at a kiosk.

I don’t know if the kiosk is accessible. My wife usually goes and gets the labels. They come out when you press the button and put in our special account code, and we get a set of 10 of those barcodes. We purchase the bags. They’re 10 bags for $2. We affix a barcode to each bag, fill them up with 75 or more cans and bottles, and then we push them into a large bin that we open with that same account code, the four-digit PIN, and then we get money in our account. The account balance is accessible. We get an email.

When we go to the grocery store, there’s a kiosk where we can enter our four-digit pin and get a voucher for that amount plus 20 percent or that amount in full cash. The system worked pretty well.

Admittedly, I’m not using the entire system myself. As I said, I don’t go to the kiosk to order the barcodes, and I don’t go to the kiosk to redeem our balance. So perhaps this is a little bit misleading. But all in all, this is a great system. And if I wanted, I could go into the Bottle Drop Redemption Center myself with bags of bottles and cans and approach a machine. And okay, admittedly, there’d be somebody next to me, like a friend, who says, okay, let’s start shoving the cans in there, because they’ve probably pressed a touch screen. So okay, maybe it’s not all as accessible as that. But in any case, we’ve been doing this for years in Oregon, and it’s a very nice system. So if this is viable for putting on the podcast, so much the better. Enjoy listening to everything, and glad to be a plus subscriber. Keep up the good work. Bye from Oregon.

Jonathan: So you’ve had that in Oregon for some time, Dave. You’re a trailblazer, I tell you, a trailblazer. And thank you so much for your plus membership.

Vincent: Hello, Jonathan. Hello, fellow Living Blindfully listeners. This is Vincent in the Netherlands, and I would like to comment on the topic of recycling.

First of all, you need to know that recycling in the Netherlands is ingrained in our culture. We have been doing this for a long time. I was born in 1983, and since my early childhood, I remember going to the supermarket and putting a box of empty beer bottles on a special conveyor belt. They would be transported, and then you’d press a few buttons and a receipt would be printed, and you went to the cashier, and they would scan it for you, and the amount that you regained with the recycling would be credited against your final checkout bill from the supermarket.

Due to some EU regulations, that has been broadened, and our cans are now also recyclable. So you need to bring them to the supermarket, and they take up a lot of space in your household, by the way. I found it very irritating, to say the least, that they take up so much space, and if anyone has a good system to collect and keep their cans for recycling, let us know. So the same thing happens: you go to the supermarket, the same system is used, you put them in, and you get a receipt, and that will be credited at the end. A very inaccessible process.

As for how other blind people deal with it: a friend of mine only has his groceries delivered, and you can also hand those cans over to the delivery driver. They will take care of it for you, and they will compensate you.

In our household, my sighted girlfriend takes care of it, and otherwise, what I would do when I am alone going to the supermarket, I would take a big bag full of empty cans or bottles with me, and ask someone at the supermarket to assist me.

The topic was not new for me, but what I found very interesting was to observe it for myself: I am so used to this whole recycling thing and how inaccessible it is. I never thought about the fact that it’s totally inaccessible, totally not open to visually impaired people. Why don’t those terminals speak? That’s really annoying now, since I’ve heard Peter’s comments from Hungary.

Also, the recycling of cans, which is new, brings some very irritating things as well, even dangerous situations. Because there are still a lot of people who don’t take care of that, and I totally get it. When you’re on the way, when you’re traveling, you really don’t want to carry empty cans of Coke in your bag if you have your precious laptop stored there. So a lot of people still throw those cans in the trash. Homeless people, of whom unfortunately we see more and more these days, will go over to those trash bins and start to empty them, to search for empty cans, so they can bring them to the recycling and get a little money out of that. However, what you also see is that many of them don’t throw the stuff they took out of the bins back in. So what you get is a lot of garbage on the street, basically. And that’s very dangerous, you know. In the first place, it’s dangerous for the people who do this; they put their own health at risk. In the second place, it’s dangerous for us, because we don’t know what we’re stepping into. Maybe there is broken glass in it. I don’t want my guide dog to step into it. So it comes with its problems.

So what they’re doing now is putting special donation spots on those bins. I don’t know what they look like yet, but you can put your empty cans in those spots. So when the homeless people, or the people who need to make a little money, come by, they can just grab those cans or bottles and take them to the recycling for you, and make a little money. Hopefully that will make these new recycling regulations a little bit less dangerous for everybody.

CPAP machines, hearing aids, and the word “blind”

Jonathan: Walt Smith is writing in and says: To the gentleman who shared his concerns about a blind person using a CPAP machine, I’ve been using one of these devices for over 20 years without any significant problems. And I’m totally, as in two prostheses, blind.

The mask can be mildly uncomfortable when you’re first getting used to it. But after a short time, you just get used to it. And it doesn’t interrupt sleep.

I don’t personally use any water in my machine and never have. Having to keep distilled water on hand, measure it accurately, etc. just seemed like too much bother. This has never posed any kind of physical side effect or other issue for me.

Once the machine has been set up, normally by the organisation providing it, there’s virtually no need to change any settings. Depending on the specific machine, there may be some settings that relate to personal preferences, and these can be selected at the time of initial set up.

In short, at least give CPAP a try to see how you get along with it.

The new hearing aid technology from Phonak looks fantastic. The incorporation of AI technology and its apparent effect appears to be nothing short of miraculous. There’s a demonstration of a brief conversation with and without the new technology turned on, and unless it’s something that the Phonak PR folks have ginned up, I can’t wait to get my pair of these new aids. It may be a month or so before I can report on them, but once I have them in hand, or should I say in ear, I’ll report to Jonathan and the rest of the Living Blindfully community. I look forward to that.

Concerning use of the word “blind”, I have a funny story. Many years ago, I appeared on a radio station call-in show representing one of the consumer organizations, both of which, I need not point out, have the dirty word in their names.

The show’s host was going out of his way to avoid using the terrible word, and during a commercial break, I finally asked him why. His response was, “I’m very uncomfortable with that word.” When I pointed out that the word did in fact occur in the names of both consumer organizations, and in those of the largest organizations serving the blind (APH, AFB, and the National Library Service, among others), he just repeated his objection. I walked out of the studio mid-interview and left him in something of an embarrassing situation. My experience has been that people who aren’t comfortable with the word also aren’t comfortable with the condition, or with people who manifest it. So I tend to give such individuals a wide berth.

Migrating content from Voice Dream Reader

Marion: Good day everybody. This is Marion, otherwise known as Piano Marion, pretty much everywhere except Dice World, which is irrelevant.

I’m sending in this message because I just finished listening to an episode in which someone is looking for a way to import files from Voice Dream Reader to another application.

Now, I don’t know a whole lot about Speechify slash Reader, but I do know that Voice Dream started to have its issues. I looked into Speech Central, and I have Speech Central. The way that I imported all of my Voice Dream files to Speech Central was through the Files app. Specifically, where you have to go in the Files app is the section marked On My iPhone. Within the On My iPhone section, you go to a folder called Reader, and within that folder, you will find all of your Voice Dream files. With Speech Central, you just kind of move them: you select all the files, and the Share Sheet gives you an Open In option. You tap on that, and then I tapped on Speech Central, because that’s where I was going. I don’t know if you can do that last step with Speechify slash Reader or not, but that’s how I moved my files to Speech Central.

Jonathan: Well, that’s news you can use, isn’t it? That’s a very useful hint, and I appreciate you sending that in.

I must say, I don’t wish the Voice Dream Reader people any ill or anything like that. They did respond to consumer feedback, but I have just found the app has become so unreliable. And with every release, there seems to be something that creeps in that makes using it difficult.

So I’ve had another look at Easy Reader from Dolphin, and I must say, it has come a long way since I seriously investigated it last. I didn’t realize, for example, how good the Windows app is, and I’ve always wanted something that will allow me to synchronize my content between my iPhone and my Windows computer. Easy Reader is it.

There are some limitations in terms of syncing content across your devices, though. If you get a book or a magazine from a library that Easy Reader supports, then it will sync. So you download a bunch of Bookshare books or any other library. It just appears on all your devices. But if you upload content to a device, so you take a PDF file or a Word document that’s on your computer, and you put it on your Easy Reader on your Windows computer, it’s sadly not going to show up on your phone. I hope they might add that in a future version.

I paid for a premium subscription, which is quite a bit less than the Voice Dream one, and it seems to do a lot more that I want.

Now, there is one big deficit for me, and that is that Easy Reader does not handle audio files. So you can’t take an MP3 or an M4A file, or a collection of those files, and read them in Easy Reader. And for that purpose, I was recommended, on Mastodon by way of a toot (we have a lot of good discussions about tech on there), this app called Book Player, and it’s brilliant. It really is a very good app. You can support the app developer by paying a premium, but you can also just download it and use a lot of features for free. You can import audiobooks and other MP3 and M4A files from all over the place. It remembers your place. You can group them into archives and folders. And I just love this thing. It’s actually a lot more powerful than Voice Dream’s audiobook support was. So, win-win. I’m really enjoying this Book Player app.

Ray-Ban Meta Smart Glasses

Let’s talk more about the Ray-Ban Meta Smart Glasses. And David Szumowski says, Jonathan, I enjoy your Living Blindfully podcast. Thank you. Well, thank you, David. I appreciate it. I’m a little unclear, he says, on the Meta Glasses. I have OrCam and know what it does and want to compare them to Meta Glasses.

I don’t understand if Meta has a sync with an iPhone so that my contacts are recognized and available for calls or texting.

I’m just going to stop and work my way through these comments, David. Yes, it does. When you set up the Meta Smart Glasses, you do so through an app called MetaView, which is downloadable from the App Store. It’s an elegant setup experience, very well done. And at that point, you can choose to share your contacts with your Meta account if you want to, and then the glasses know about your contacts.

He continues, I don’t know if it requires WhatsApp or whether I can avoid that app.

You absolutely can if you want to, but you might not want to for too much longer because of some blindness services that might be using WhatsApp to make their services available. I can’t really say any more than that at the moment, but if you want to be without WhatsApp, you absolutely can. You could also use Facebook Messenger or be without that too if you want. And then you can set up the Meta glasses so that you can just text with your iPhone.

David says, I am also unclear about where a photo goes once Meta takes the picture, for example, to the iPhone photo gallery or somewhere else. Well, it initially goes in the Meta app in its own gallery, but you can import those into your photo library and then do with them what you will from there. You can set things up so that when you put the glasses in the Ray-Ban Meta Smart Glasses case, it will do a sync for you and upload your content that you’ve created since you last imported into the phone. So it can be automatic, but you can also do it manually if you would rather.

He says, could I take a picture and say, hey Meta, send the photo to someone in my contact list or to an email address? You can certainly do it via a messaging app. I believe that works okay. He says, I think it sounds easy, but maybe it is not. It is relatively easy, actually. It’s well implemented. It’s great for social media and sharing pictures. The camera on it is very good.

My major word of caution is that the AI is a little bit lackluster, in my view. Sometimes you can get some good results. Sometimes it hallucinates, and it feels like you’re getting good results, but it’s actually telling you nonsense. Sometimes you have to play 20 questions with it. So I don’t think what you get from Meta AI is anywhere near as good as what you can get from Claude or ChatGPT, or even Gemini. And there are a lot of hopeful people who’ve bought these glasses, hoping that it will improve. Well, I guess we’ll see, and time will tell. But undoubtedly, as a social media tool, as a pair of headphones that don’t cover your ears and which provide excellent microphones, there’s a lot to like. It really does take very nice pictures. And Meta AI is okay. You know, if it’s all you can afford, then it’s all right. And it does achieve some very nice results fairly often.

It was Bonnie’s birthday the other day, on the 21st of August, at which point she turned (bleep). What just happened there? I think Bonnie must be listening from upstairs with the big switch. Okay, I won’t tell you what age she turned, but anyway, she had a birthday and we went out to dinner. And I’ve had these experiences before. You have a lovely dinner. You’re feeling great. The food was good. The service was sublime. And then you go out there and you get refused by an Uber driver who doesn’t want to take the dog. Oh man, sounds familiar. Now, we do have a thing called Uber Assist in New Zealand. And in my experience, when you call an Uber Assist, this sort of malarkey doesn’t happen. The thing is, there aren’t that many Uber Assist drivers out there. So you might look at your Uber app, and you’ll see that an Uber Assist ride is 10 minutes away, and an UberX ride is one minute away. And you’re out there in the cold. I mean, it’s winter time here. And you think, I’m not waiting 10 minutes. I should be able to just take an UberX like everybody else.

So you call the UberX, and occasionally you’ll get a refusal. So I took the Meta smart glasses with me, actually mindful that this might be a thing. And when I got the “your driver is arriving” notification in the Uber app, I just started videoing. It was so easy. I was wearing the glasses. I double tapped the button. I took a video.

And what do you think happened? It was a very cheery, happy driver who pulled up and said, hello, come on in, and all that kind of stuff. If I hadn’t taken the Ray-Ban Meta glasses and got videoing, we would probably have had a very different experience. But it was a good bit of peace of mind to know that if we ran into that problem, I would have video evidence that I could provide if necessary. So that’s a very real, unfortunate, but real use case for these Ray-Ban Meta smart glasses.

Luna RSS

There’s been some talk of late about Luna RSS and Jose has some comments on this. He says, hi, Jonathan. Firstly, my congratulations for your excellent podcast, Living Blindfully. Thank you so much.

I’m writing to you relating to the questions in past episodes asking for a program to play podcasts in Windows, and I want to share an alternative that I have found very useful. I have combined the use of Luna RSS to listen to podcasts with the Jaws screen reader and the split Braille option. This way I can use my Braille display, which is the Mantis, as follows. Once I have selected the episode to listen to in the Luna RSS program, on the playback screen, I activate the Jaws cursor and move it to the playback percentage button. Then I activate the split Braille option in Jaws, Insert+Alt+V, and I move in the list to the Jaws cursor and press Enter.

The next thing I do is activate the PC cursor in Jaws, and I move to the list of chapters in Luna RSS. Then my Braille display is divided into two parts, one with the list of chapters and the other with the episode’s playback percentage. Then I press Control+P to play the episode (it also lets me pause), while I’m reading on my Mantis with 40 cells: half shows the chapter list, and the other half the percentage. I move through the list of chapters and can go directly to the one I want by pressing Enter.

I can also press Control+Right Arrow to go forward a few seconds in the episode, or Control+Left Arrow to go back a few seconds. So this alternative has been very efficient for me to listen to podcasts on Windows with my Braille display and the split Braille option in Jaws. I have also tried the alternative of splitting the display between the play button and the percentage, and you can instead use the volume or speed button, or whatever combination works best for each user. Although my preferred way to listen to podcasts is using the iPhone, this way of splitting the Braille display in two is not possible with VoiceOver or any other screen reader that I am aware of, only with Jaws. So it is an excellent alternative for blind and deafblind people who use Braille displays. I find the split Braille in Jaws to be very useful, very convenient, and functional.

Thank you so much for the message. Really appreciate that.

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies, on the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions.com.

I like that little sweep because I get a chance to boogie around the studio when it’s on and get my Apple Watch goals going a bit more, you see, but I’m out of here now.

Closing and contact info

Thank you very much for your company today. I appreciate it. We’ll see you next week. And remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, tell your friends. Spread the word on social media. And if you’d take the time to give us a 5-star review on Apple Podcasts, we’d appreciate it.

We love hearing from you. Be a part of the show by getting in touch via WhatsApp, email, or phone.

For all the ways to share your thoughts with us, visit LivingBlindfully.com/opinion. That’s LivingBlindfully.com/opinion.