Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


Contents

Welcome to 293

Hi From the Ray-Ban Meta Smart Glasses

The Final Report of the Royal Commission on Abuse in Care

I’m Worried About Blind Barbie

Watching US Olympics Coverage Outside the US

New Accessories for the Zoom H6 Essential Recorder

Sonos Says Sorry. It’s Not Enough

Be My Eyes Preparing to Share Data with AI Companies

Vicky Cardona Discusses the Pros and Cons of Various Blindness Wearable Devices

Surface Laptop 7 Review

Closing and Contact Info


Welcome to 293

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

New Zealand’s government has published the final report of the Royal Commission on Abuse in Care, and we look at it from a blindness perspective. Mattel has introduced a blind Barbie. Did they get it right? And Vicky Cardona joins me from the NFB convention to discuss the current state of the wearable market for blind users.

Welcome to episode 293, and there is no area code nor country code 293 to tell you about, which means that we can quickly move on.

Hi From the Ray-Ban Meta Smart Glasses

And you’ve probably heard enough at this point to say, “He sounds a bit different.” The reason he sounds a bit different is that I decided to give you a good dose of what the Ray-Ban Meta Smart Glasses sound like for outgoing audio, since there’s so much interest in these glasses at the moment.

So the beginning of this episode is actually being recorded with Just Press Record using the Ray-Ban Meta Smart Glasses. I’m just wearing them normally. What’s interesting is that Just Press Record sees the Ray-Ban Meta Smart Glasses just fine. When I go into Ferrite, I don’t get them there. So it may vary depending on what application you are using.

And I do want to talk a bit more about wearing these as a hearing aid wearer now because Mike Forzano has been in touch.

Carl Richardson has also been in touch, and I’ll read his email because essentially, they ask the same question. Carl says:

“First and foremost, I want to thank you for your invaluable information that you provide on your weekly podcast for people who have a dual sensory loss of hearing and vision. I cannot think of any other resource that makes myself, a person with Usher’s syndrome, feel as included in the discussion as your show does.

I recently listened to your podcast, where you mentioned that you and Bonnie purchased the new Ray-Ban Meta Smart Glasses. I was wondering how, as a hearing aid user yourself, you accessed the audio output. Did you Bluetooth it to your hearing aid, or was the speaker built into the glasses, and was the audio sufficient enough for you to hear it that way?

Could you potentially do a demo and show the DSL community” (that’s dual sensory loss) “how you were able to access the glasses and hear the audio? I can’t be the only person who is a hearing aid user wondering about this. It would be great if you could use your experience and explain this in great detail.”

I’m happy to try and help, Carl.

I don’t believe there’s a way to get the output of the Meta Smart Glasses to a set of hearing aids.

There are a couple of things that the Meta Smart Glasses will output. One is the audio from your iPhone.

Now, it may be possible to redirect it. This is what Mike Forzano was telling me. You may be able to go into the AirPlay options and get the audio back to your iPhone, where your hearing aids may be paired to your iPhone. And that’s fine for the output that’s coming from your iPhone.

What I don’t think that will help with is the audio that the Meta Smart Glasses themselves generate. So when you use the command, for example, to ask, “What am I looking at?” (I’m not going to say it now because I’m wearing the glasses and I’m not sure what will happen), the glasses themselves will speak back. And that’s built-in speech on the glasses themselves.

The good news is that as a behind-the-ear hearing aid wearer using the Phonak Lumity aids, I have no problem whatsoever hearing the speakers built into the smart glasses. These speakers are beautiful, and I’ve found that just by positioning my hearing aids carefully right by the speaker, I’m getting more than enough loudness to hear, even in noisy environments.

Now, these aids are behind-the-ear aids, which does make this easier. If you have in-the-ear hearing aids, your mileage may vary. If there are other hearing aid wearers who are working with these Ray-Ban Meta smart glasses, I’d be interested to hear how that’s working.

So it is a bit different, say, from the Envision Smart Glasses, which will let you pair with a Bluetooth device. If you have Phonak hearing aids, which essentially present themselves as standard Bluetooth headsets, then it may be that the Phonak hearing aids and the Envision smart glasses are a great combination.

But for me, just positioning them correctly is no problem at all. And I’m getting really good sound from these smart glasses.

Now, as you heard, that audio was a little bit glitchy from time to time. The reason for that is that it’s sending it via Bluetooth from the glasses to the phone, and Bluetooth doesn’t have a massive amount of bandwidth. That’s why you hear the glitching from time to time when I was recording that as a WAV file.

What I’m doing now, just to conclude this quick demo, is recording a video on the glasses themselves, and I’ve imported the audio from this video into Reaper. You hear that this is much, much better. It’s not glitchy. It’s just perfect, really.

But the downside of this is that you only have a maximum of 3 minutes, which is settable in the Meta View app, before your video stops recording. Recording video does consume quite a bit of battery life, and I guess they just want to minimize the length of these videos.

With all that in mind, let me quickly tell you that I had a great experience over the last week. I used two of my favorite products together: RIM and Aira. And as you know if you’re a regular listener, …

Advertisement: Transcripts of Living Blindfully are brought to you in part by Pneuma Solutions.

Now, I want to be clear about how this works. Even if you don’t have a RIM account, even if you’ve never paid RIM a cent in your life, you can still download RIM for free and use it with Aira. The reason for that is that it’s the person who is assisting you who pays, not the person who requires assistance.

So if you’ve never used RIM before, download it, call an Aira agent, say to them, “I’d like a RIM session, please.” They’ll know exactly what you’re after.

They’ll invite you to nominate a keyword for the session. That keyword can be anything you like. You’re up and running with accessible control.

I tell you what, it’s so much easier than any other solution that Aira previously had when you want remote assistance. So well done to Aira and RIM for this amazing marriage!

Find out more and download it for your PC or Mac at GetRIM.app. That’s G-E-T-R-I-M.app.

And isn’t this audio, when recorded directly on the Meta Smart Glasses, pretty sweet?

[music]

The Final Report of the Royal Commission on Abuse in Care

This section of the podcast covers the topic of abuse, specifically abuse in New Zealand. It may be difficult listening for some.

If the right decision for you is to skip past this section, you can do so using the skip to next chapter feature of your podcast player, if it supports this feature, or you can navigate to about 31 minutes and 30 seconds when the next section begins.

If you do listen to this section and you’d like to talk with someone, please reach out for support.

If you’re in New Zealand and you’re a survivor of abuse in state care, faith-based institutions, or organizations that were funded by the state such as the school formerly known as Homai College and its predecessor, you can reach out to the Survivor Experiences Service by visiting SurvivorExperiences.govt.nz, or by calling toll free in New Zealand 0800 456 090. That’s 0800 456 090. Or you can also text 8328 every weekday from 8:30 AM to 4:30 PM.

I should declare some interests here. I am a survivor of abuse at Homai College. I testified publicly at the Commission’s hearing on disability issues in 2022, and I was subsequently appointed by the previous government to the Board of the Survivor Experiences Service, a position I still hold.

Opinions I express in this section are mine alone. I speak for myself, and not on behalf of anybody else.

In episode 146 of this podcast, I spoke with Paul Gibson, who was one of the commissioners conducting the Royal Commission of Inquiry into Abuse in Care and Faith-Based Institutions. This inquiry is the largest royal commission ever to be conducted in this country, and it is widely considered the most comprehensive inquiry into abuse that has been conducted anywhere in the world.

The inquiry took over 6 years to complete. It investigated abuse perpetrated against an estimated 200,000 people over 7 decades. Even more people experienced neglect. That’s a sobering number, given that the population of the country during much of the inquiry’s period was around 3 million people.

Many of those abused were Maori, New Zealand’s indigenous people. Disabled people were also identified as an at-risk group who were subjected to abuse and neglect.

The Commission said, “It is a national disgrace that hundreds of thousands of children, young people, and adults were abused and neglected in the care of the state and faith-based institutions. These gross violations occurred at the same time as Aotearoa (New Zealand) was promoting itself internationally and domestically as a bastion of human rights and as a safe, fair country in which to grow up as a child in a loving family. If this injustice is not addressed, it will remain as a stain on our national character forever.”

At last, the Commission’s report was tabled in New Zealand’s Parliament and published last Wednesday. Should you wish to, you can read the report in a variety of accessible formats at AbuseInCare.org.nz. That is all one word. AbuseInCare.org.nz, under the reports section.

It’s 15 volumes in print. It weighs 14 kilograms. It took 3 people to bring the hard copy to Parliament for the formal tabling, for which a large number of survivors were present in the public gallery of our Parliament. And then, when there was no more room, filled an overflow venue. The report is called “Whanaketia – Through pain and trauma, from darkness to light”.

Speaking in New Zealand’s Parliament, Prime Minister Christopher Luxon:

To every person who took part, I say thank you for your exceptional strength, your incredible courage and your confronting honesty. Because of you, we know the truth about the abuse and the trauma that you endured. I cannot take away your pain, but I can tell you this: you are heard and you are believed.

Many of your stories are horrific and harrowing. They are painful to read, but not nearly as painful as they were to endure. The state was supposed to care for you. But instead, many of you were subjected to the most horrendous physical, emotional, mental and sexual abuse.

Jonathan: Specifically referring to deaf and disabled survivors, the Commission found that they experienced ableist, disablist, and audist abuse, including targeted abuse and derogatory verbal abuse. Deaf and disabled survivors and survivors who experienced mental distress were denied personhood and were often stripped of their dignity and autonomy. They were left unattended and ignored, with no stimuli to grow or develop their individual talents or interests. Many were segregated from society, deprived of individual attention and basic educational opportunities. Disabled adults were often treated as unable to make their own choices and decisions, with or without supported decision-making. Deaf survivors were denied sign language and deaf culture. Blind survivors were denied Braille.

For many deaf and disabled survivors and survivors experiencing mental distress, being segregated and experiencing restricted contact and separation from whānau (that’s the Māori term for extended family and community) has caused acute pain, and has had lifelong negative impacts. Many were institutionalized to the extent they struggled to live independently. They were denied their personhood and their culture, as well as the opportunity to practice life and community skills. Disabled communities lost generations of future leaders.

Elsewhere in the report, the Commission says this specifically about blind people. Blind survivors told the inquiry they were punished by staff at blind schools for behaviors such as using echolocation to navigate spaces. Blind people can use echolocation to help perceive their environment, which can include using sounds such as mouth clicks, finger snaps, whistling, and cane taps. Blind survivors described their blindness as being a part of their cultural identity, so being punished for behaviors associated with that identity represented psychological and emotional abuse and cultural neglect.

The inquiry was told of Homai staff inflicting psychological harm on residents through actions that disregarded children’s fears, including shutting a child who was terrified of dogs in a cage with one, throwing children who were scared of water in the pool, and chasing a child who was scared of vacuum cleaners with one.

More than half of the blind survivors who spoke to the inquiry about their experience in social welfare care settings were physically abused, 53%. The next most common types of abuse in social welfare settings were sexual, 44%, and emotional, 36%. In addition, 28% of blind survivors were neglected in these settings.

Of blind survivors who were in faith-based settings, the most common type of abuse was emotional, 58%. The next most common types were physical abuse and sexual abuse, which were in 53% of accounts.

I have been in touch with other survivors who’ve come forward to the Commission.

Coming forward to the Commission has, understandably, affected different people in different ways. For many of us though, finding others with similar stories has somehow been validating. I don’t think I knew at the time that I wasn’t the only one being abused in the swimming pool by Mary Buist, a particularly sadistic and vicious teacher.

The report is calling for many organizations to apologize publicly to survivors, including Blind Low Vision NZ. I cannot tell you how long I have waited for this organization to face up to the harm it has done, how much agony this issue has caused me.

As is mentioned in the report, in 2002, I became the Chairman of the Board of the Royal New Zealand Foundation for (later of) the Blind. It was a time of great optimism and hope, as the organization was transitioning from a paternalistic model of governance to a world-leading model based on self-determination.

Get a few blind people in a room, particularly if the alcohol is flowing, and it won’t be too long before you start hearing about the pain many still feel, the anger some still carry about the abuse to which they were subjected, for which the Royal New Zealand Foundation of the Blind is ultimately responsible.

I felt at the time that embarking on a new chapter for the organization, and therefore for blindness service provision, was the perfect time to confront the darker side of the organization’s past to acknowledge it, say sorry for it, and if necessary, compensate for it. The Royal New Zealand Foundation of the Blind is a very wealthy organization.

But I was met with considerable resistance from board members at that time, a good number of whom were blind themselves. They were worried more about the exposure to risk than justice and well-being for their fellow blind people. Some said it wasn’t appropriate to judge what happened in a different era by today’s standards. I could not get any traction on this issue at all, other than an agreement that I could make some pathetic, utterly inadequate, “things were done then that would not be acceptable now” kind of statement at the annual public meeting.

Should I have just got up there and said what I really wanted to say? Maybe. But in the end, it would have been a board chair gone rogue, and it would still not have carried the weight of an official apology on behalf of the organization.

Soon after, it was clear that my new job at Pulse Data (now Humanware) meant I’d be doing a lot of overseas travel, and I resigned from the board.

Not long after, there was a news report that made the papers about a young child who was abused in a very similar fashion to the way I was back in the 1970s. The problem had not gone away.

So for the first time, I went public with my own abuse in a blog post. I made sure journalists saw that blog post. I made the difficult decision, as a private citizen, that if the RNZFB were going to continue to deny survivors redress and an apology, I would speak my truth in the hope that others might also come forward. It has been a long journey from those difficult days when many fellow blind people tried to downplay the impact the abuse has had on me and other survivors like me.

A wide range of survivors told successive governments we need an inquiry, we need to be heard, and successive governments denied us that opportunity.

Finally, we have been heard, we have been believed, and Blind Low Vision NZ has been called upon to apologize to survivors after first consulting with survivors about the nature of that apology. On Friday, the chairman of the RNZFB board, Clive Lansing, issued a statement on behalf of the board. In part, it reads:

“The Royal Commission’s inquiry spans a period of time between 1950 and 1999. During much of this period, societal attitudes towards disability in Aotearoa (New Zealand) were appalling. New Zealanders were expected to fit in and conform to a narrow definition of what was considered ‘normal’.

As a result, disabled people disproportionately experienced abuse and neglect. Also, for much of this time, blind and low vision children were prevented, or at least discouraged, from attending their local schools, and Blind Low Vision NZ regarded itself, and was regarded, as the best option for their primary education.

A section of the Royal Commission’s report addresses this specific abuse and neglect and notes the representation of the blind and low vision community. The RNZFB board and Blind Low Vision NZ expresses our sincere appreciation to the Royal Commission of Inquiry and to those people who bravely came forward to recount what happened to them.

We have had a first look at the report, and will continue to take the time to carefully read and follow the recommendations of the Commission.

We hereby commit to acting in accordance with the recommendations contained in the report’s Part 9. These recommendations call on key leaders to make public acknowledgments and apologies based on strong engagement with affected communities. This is to be completed within 6 months.

In the meantime, we unreservedly apologize for any abuse inflicted on children and adults in our care. We are committed to, and have a moral responsibility to regularly review and discover whether the systems we have put in place to prevent further abuse are as robust as possible. These include our children’s policy and our complaints policy, available on our website’s Contact Us page. It also includes all appropriate verifications and checks we carry out, including police checks for staff and volunteers.

We encourage clients who may not have come forward yet to reach out to the Survivor Experiences Service.

This is a difficult subject, so please reach out to friends, family, whānau and your loved ones. Even just a simple ‘Are you okay?’ can make a big difference.”

That’s the statement from RNZFB Board Chair Clive Lansing.

The consumer organization of blind people in New Zealand (Blind Citizens New Zealand) published a newsletter in which it highlighted how to access the commission’s report. It also made the following statement, which I greatly appreciate:

“Blind Citizens NZ acknowledges all survivors, recognizing this includes people from the blind community. We honor the courage of survivors who came forward and shared their experiences and the courage of survivors who chose their own pathway.”

Finally, I want to talk about the future because Blind Low Vision NZ has been instrumental in trying to give us a poorer future than disabled New Zealanders deserve.

I don’t recall exactly when, but it must be around a decade ago now that Blind Low Vision NZ heard about accessibility legislation in the province of Ontario in Canada. The concept was taken to a federal level in Canada when the Accessible Canada Act was passed in 2019.

We’ve had people on this podcast before talking about the flaws in this model, which are well known and obvious. When you have multiple means of remedy, it creates confusion for those seeking remedial action. Is this an accessibility issue or not? Is it more of a human rights issue than an accessibility issue? That ambiguity makes it easy for decision-makers to kick the can down the road. You go to one entity, they tell you to go to another one, who then bounces it back to the first one, and nothing gets done.

This is absolutely foreseeable, and for Blind Low Vision NZ to have tried to impose this flawed model on New Zealand is outrageous. It has set progress in this country back by a decade. Successive governments are also to blame because there has never been any truly open, transparent consultation about a range of options and their pros and cons.

So what’s the answer? The answer is clear. New Zealand is a signatory to the United Nations Convention on the Rights of Persons with Disabilities. If we’re going to ensure disabled people take our rightful place in society, we need effective all-encompassing disability rights legislation that codifies into New Zealand law the UNCRPD obligations.

What’s the gold standard for disability rights law in the world? Which legislation has had a massive impact beyond its borders? Of course, it’s the Americans with Disabilities Act, the ADA.

What New Zealand needs is wide-ranging disability rights legislation, and that is what the Royal Commission has now recommended.

I cannot tell you how overjoyed I am, how vindicated I feel by the Commission taking this common sense yet courageous position.

Accessibility is a critical right, but it is but one of many. Accessibility can be incorporated into disability rights legislation. I’ve been saying this for years.

Blind Low Vision NZ and its allies pushing this ill-considered accessibility legislation have tried to ghost me, but they always had a problem. When I would get up in front of a group of disabled people and make my case, a lot of those disabled people would say, “You know, that’s right. That makes sense. How come we’ve not been given ADA-type legislation as an option? That’s what we need.”

And that is now what the Commission has said we should have. You can only go for so long, stifling common sense, before it catches up with you in the end.

So I’m calling for the immediate disbandment of the organization now called Access Matters. It has done harm.

In its place, let’s have a genuine grassroots organization driven by disabled people who will fight hard for the passage of this sweeping legislation. After robust consultation, we must draft this new act together as a disability community. To quote Shannon, a survivor who was mentioned in the report, “It’s not right for able-bodied people to dictate the lives of people with disabilities.” Good on you, Shannon.

The day that a comprehensive Disability Rights Act for New Zealand passes will be a great and transformative day for this country. We must all work hard to ensure this recommendation is not put on the back burner for long.

Clearly, redress for survivors who are not getting any younger must be the priority. But the future is important, too.

I will close with these words from the Commission’s report:

“Greetings to you who came to us with nothing in your hands, with sorrow clouding your hearts, the burden of guilt weighing heavily on your very being, spirits crushed by the shame that still pervades your minds, proffering a simple prayer, deliver us.

We listened to your anguished memories, heard the sobbing of your hearts, and have been tasked to give you solace from the harsh winds of time, resolution from the gravity of your abuse.

May the unseen hand of providence shine upon you, the wisdom of the ages guide you, and that in the truth and justice we have sought together, may you find respite.

Finally, let us pay homage to those who will not see the end of this long journey. Those of you who now rest beyond the collective realm of memories, into the deep recesses of the night, we bid you farewell. Sleep in eternal serenity in the embrace of your ancestors. While your voices now are silent, your words will endure. Rest in peace, and in that peace, we hope you find solace in this report.”

Voice message: Hi, Jonathan! Kylee Maloney here.

As a fellow survivor, I’m sure I don’t have to tell you about the impact of the tabling of the Royal Commission of Inquiry into Abuse in State and Faith-Based Care’s report, which was tabled here in Aotearoa a few days ago.

What I do want to do is express my gratitude for the work that you have been doing and continue to do. As a fellow survivor, your ability to articulate for many of us, those of us who came forward like me and those of us who couldn’t, for various reasons.

I think of that example, and I hope that it will encourage others both here and overseas who have suffered abuse to reach out for help to somebody, someone they love, someone they care about, someone they trust. That could be hard for some people, given the stuff I’ve just read on the report. For some, that will be no one at all, unfortunately, because society has created that scenario. But we can all be here through your podcast, through all the different initiatives that we have, just to meet each other globally and to know that we’re not alone.

So I want to thank you for that and encourage you to keep going, even though it’s difficult. And know that you’re not alone either, and that what you give out all reaches back to you through all the responses you get for various reasons.

But on this week of weeks, from people who have been to the dark places because of societal attitudes, and because of being children who were dispensable and perhaps not even seen as fully human. So thank you, and kia kaha, be strong, keep going.

Jonathan: Kylee, I want to thank you for your incredibly generous comments. I truly appreciate them.

I also want to thank you for your bravery in allowing some of your statements to be published at some length in the report. In the first version of this part of the podcast, I tried to read that, and I sincerely apologize that I just couldn’t get through it. I had several goes. But the statement is there, for those who want to go into the report and read it. So thank you also for the courage that you have shown. It does require a lot of vulnerability, a lot of honesty, and a lot of going into some pretty dark places. And I, for one, was profoundly moved by the statement that is contained from you in the report.

Advertisement: Living Blindfully is brought to you in part by ListenLater.net.

Are you tired of missing out on your favorite articles because you’re too busy? ListenLater.net is the revolutionary service that turns your must-read articles into your very own personalized podcast. Imagine having high-quality, human-like narration of your favorite articles ready to listen to anytime, anywhere, on any device you’re already using for podcasts.

With ListenLater.net, simply email a link, newsletter, or even a PDF, and they’ll transform it into a beautifully narrated podcast.

ListenLater.net is a great companion whether you’re commuting, working out, or doing chores. It kept me informed and entertained on my recent long international flights to and from the NFB convention.

ListenLater.net is perfect for everyone, especially for the blind community using devices like the Victor Reader Stream.

No subscriptions, no hassle. Just pay for what you use.

Get started with free credits today, and experience the convenience of having your own personal podcast feed. And after your first credit top-up, use the contact form and mention Living Blindfully for an extra 20% beyond what you paid for.

Visit ListenLater.net now, and transform the way you consume content.

ListenLater.net. Your articles, your way.

I’m Worried About Blind Barbie

Voice message: Hi, Jonathan! It’s Carolyn, recording here from Auckland.

Late yesterday, we got an email on one of the blindness email lists here regarding Mattel’s latest Barbie doll in the disability realm. This one is a blind Barbie doll, and I’m concerned about the way it’s been created and how it looks.

I’m not anti-toys that depict disability. I think it’s positive. But I get very concerned about how they try to cover all the bases when creating these toys.

As I understand it, Mattel created this doll in consultation with an American blindness organization. What concerns me is that they have chosen to depict as many variations of blindness as possible in this one doll, thereby actually depicting the stereotype that the public often perceives when you ask them what a blind person looks like.

I’ll give you an example of the description that we’ve been given. We’ve been told that this doll has a white cane with a roller tip on the end, a red and white cane. That’s fine. It’s pretty standard. Don’t mind that.

One of its eyes has been created in a particular way, reflecting that some people have issues with their eyes to do with glare. That one, okay, I’ll let them get away with that one.

But the thing that really bothers me is that they have placed dark glasses on this doll, and that gives what I would call the Stevie Wonder effect: the stereotype of a blind person with a white cane and dark glasses. And that is a reality. That is a perception that a lot of sighted people who’ve never met a blind person in their life have of blind people.

Years ago, when I was vision impaired, I noticed this in a major way.

I used to work in the CBD here in Auckland. My bus would stop at Midtown, and I would have to walk down Queen Street, the main street in Auckland, to downtown to where my work was located.

Now in those days, I was vision impaired, and I wore a pair of quite thick glasses because I didn’t have a lens in my eye at the time. So I had these glasses with bubbles on, so they were rather thick.

Nobody took any notice of me wearing my clear prescription glasses. As I walked down Queen Street with my cane, I would get the usual jostling that you get in a busy street. People walking into the cane, in some cases even kicking the cane which is not great behavior, but it’s the normal run-of-the-mill stuff that would happen in a busy main street.

However, I noticed very quickly that when I put on my prescription sunglasses (these were prescription glasses identical to my clear ones, only the lenses were dark bronze, dark brown in color), the moment I put those on and started walking with my cane up or down Queen Street, the public parted like the Red Sea to let me make my way through.

I didn’t have the same issues, and I made comments to sighted friends at the time. Clearly, this is a stereotype that the public has of blind people. The moment they saw me with the dark glasses, oh shoot! She’s blind. Let’s get out of her way and make it easy for her to get through.

While that was great, and I enjoyed that freedom, it concerned me that this is the perception the public has – here in New Zealand, at least. I don’t know if it’s the same in other countries, and I won’t make a judgment on that; others can. But it concerns me that that is the perception that the public has.

Not all blind people wear dark glasses. So it makes life very tricky when members of the public have these perceptions of how we should look and how we should behave.

And then, when you have an international company like Mattel creating a doll that fits all those perceptions, that’s not really going to make life much better for us, or change things. In particular, when we as a community have fought so hard to let people know that we are all different, we are not all the same, we have different skills, we have different talents. Is this pushing us back into reinforcing the stereotypes again?

Jonathan: Thank you for raising this, Carolyn!

I’d be interested to hear what others think about this.

Others have raised concerns with me, too. I see on social media, some people have said all the Velcro on the clothes that she’s wearing is really patronizing, because it implies that blind kids or blind people can’t use buttons.

Now, I asked Bonnie about this, who’s probably much more experienced and qualified to comment on these things than I am.

She said, “Jonathan, have you undressed a Barbie?”

I said, “Bonnie, I can’t say I ever have.”

She said, “Kids do it all the time. And you need something that’s straightforward for kids to undress and dress the Barbie. [laughs] Otherwise, the parents are driven mad having to do it for them. And let’s not forget, Jonathan,” Bonnie said to me, “there are a number of blind children, a good number of blind children these days especially, who have impairments additional to blindness.”

She also tells me you can take the glasses off if you want to. They’re an optional accessory that is available. But if you don’t want the glasses, you don’t have to have them on.

I can’t say I’m an expert in this in any way at all. So this is what Bonnie tells me.

Bonnie also tells me that there was discussion on CNN, of all places, about why is this blind Barbie wearing heels? “Surely,” they said, “putting a blind anybody in heels is dangerous.”

And we see this so often, don’t we? We get people who say, “Can you handle steps okay?” They try to stop you from walking up or down stairs, when you’re just getting ready to do your thing.

And I’ve heard stories from several women over the years, blind women obviously, who’ve said that various sighted people (who consider themselves experts in blindness because they’ve gone and done a university course or whatever, and that makes them an expert more so than the actual people with lived experience) have actively discouraged these blind women from wearing heels. They’ve been told it’s dangerous.

One blind woman I talked to went to a guide dog school, and the guide dog school said, “We are not going to let you do your guide dog training in heels. It’s dangerous.”

And this woman said, “But I wear heels all the time, professionally. I want to do my training in a way that mirrors the way I will be working regularly.”

And she said, “I choose to wear heels. It’s a choice I’m entitled to make as a woman. If people choose not to wear them, that’s absolutely fine, too. But blindness in and of itself should not disqualify me from making that choice.” And they eventually backed down.

The training was completed fine, complete with heels, as I understand it. And the world went on. So at least they got that bit right. Blind Barbie has heels.

I have seen some pretty heart-warming stories from parents who’ve said it’s impactful, now that blind kids can say Barbie’s got a cane like me.

So what do you think about the Blind Barbie? Is the concept right, but the execution poor? Have Mattel got it about right? I believe it was the American Foundation for the Blind and not an organization of the blind that Mattel collaborated with on this.

You’re welcome to share your thoughts by going to LivingBlindfully.com/opinion. All the ways that you can contact us are listed there. LivingBlindfully.com/opinion.

And Carolyn, it is just as well that you and I are in New Zealand, where we tend not to be quite as litigious as some places I might mention, because you have put that damn Barbie song, you know, the Aqua Barbie Girl song in my head, and it won’t go away. It’s most annoying, and I blame you.

Watching US Olympics Coverage Outside the US

Voice message: Hi, Jonathan! This is Amy Ruell, recording on WhatsApp. I am just trying this out.

I also did purchase a subscription to ExpressVPN. For some reason, the 30-day extension of my subscription did not come through automatically, so I contacted their tech support. They were very helpful, and I was able to configure the program and get my 30-day extension, thanks to Living Blindfully, without any further difficulty.

The only question I have (because I have only used WhatsApp to make phone calls and send text messages) is how you find channels, communities, and groups on the WhatsApp app. I searched for Living Blindfully as an example, and had no results shown. So any tips about that would be greatly appreciated.

Also, I’m going to be out of the country for some of the time that the Olympics are on, and would love to be able to watch it. So if anyone has any tips on how I can access the US broadcasts of the Olympics while I’m out of the US, I would greatly appreciate any suggestions.

Thanks so much for a terrific podcast, congratulations on your recent well-deserved awards, and thank you for the opportunity.

Jonathan: Oh, Amy, I remember interviewing you on a very early edition of Blind Line as well when we were talking about the work that you were doing with supporting blind parents, so it’s good to hear from you.

Regarding WhatsApp, you’ve got to have the link to a community or a group, or a group owner or community owner can invite you. So they tend not to be searchable.

Channels, on the other hand, are searchable, and you’ll find them on the Updates tab. When you go there, it’s the first tab on the WhatsApp app. You should be able to find a whole bunch of channels suggested, and the channels are searchable.

If that’s not happening for you, it could be because, I think, channels are still being rolled out. It’s a slow burn, so I understand that not everybody has channels yet. But if you have the Updates tab there, and you go there and search for Living Blindfully, you should be able to find the Living Blindfully channel.

Good that you’ve got ExpressVPN all sorted because that’s going to be critical, if you want to watch the US Olympics coverage. Of course, there’ll be Olympics coverage all around the world. I suspect the BBC will once again set the standard in terms of good quality radio coverage of the Olympics.

But if you specifically want the US coverage, then the minimal research I’ve been able to conduct suggests to me that Peacock is the app to get. They have thousands and thousands of hours of Olympics coverage scheduled over on Peacock. And possibly, if there’s an NBC app, there may be some coverage there. But it sounds like it’s Peacock, if you want everything. I have no idea whether audio description is available there or not.

New Accessories for the Zoom H6 Essential Recorder

A couple of items of news I wanted to draw your attention to this week. It’s been a busy week for news.

The first thing is that Zoom is making good on its promise to release some more accessories for the H Essential series.

In particular, it’s time for the H6 Essential to shine. Now, the accessory that you can plug into the mic capsule slot on the H6 Essential to give you 2 additional inputs is available. Not only do you get those 2 additional XLR and line inputs, there’s also a 3.5mm input jack if you want to use that as well. If the microphones you’re connecting require phantom power because they’re condenser microphones, there’s a USB-C port in this little accessory that attaches to the recorder, so you can plug it into a power source and your condenser mics will have phantom power.

If you want to search for this accessory, its model number is EXH-6E. That’s EXH-6E.

And there’s more, as they say. There’s a shotgun mic accessory that’s been released for the H6 Essential as well. You have full control over the width of the stereo image, whether it’s very wide or very narrow. You can even record in raw mode so that when you’re editing your recording afterward, you can widen or narrow the stereo image. Genius!

If you want to go searching for this shotgun mic, its model number is SSH-6E. That’s SSH-6E.

So we’re starting to see parity between the old H6 and the H6 Essential in terms of accessories that are available, and that is a good thing to see.

Sonos Says Sorry. It’s Not Enough

We haven’t talked about Sonos on Living Blindfully for a while.

There have been some progressive updates to the Sonos app. I think it is much more accessible.

But I don’t think it’s particularly optimal yet. They haven’t made any use of the actions rotor, for example. And that’s a problem, given that it’s one massive screen. So I don’t think it’s as good an experience as the one we had before.

Now, the reason why this Sonos thing is running and running is not because of accessibility issues. I was actually quite impressed by how much coverage we managed to get about the accessibility issues because normally, the media ignores us entirely. So that was good.

But the reason why this is running and running is that Sonos, back in May, chose to release a rewritten version of the app with masses of features missing. Why on earth would you do that? Sonos is not cheap. People who buy these things tend to be audiophiles. They care about their audio, and they care about the functionality. And then, these guys just come along and release a woefully inadequate app that makes their customers mad.

And you know what makes me mad? I mean, in addition to the functionality that we lost and the accessibility issues, the stock price of Sonos has really plummeted. That is not nice, Sonos. That is not nice. Your stocks are down because you’ve behaved so incompetently.

Now we have this groveling apology which, if you’re a Sonos user, you’ve probably seen, from Sonos CEO Patrick Spence. I’ll read you part of what it says.

“To our listeners,” he begins.

“We know that too many of you have experienced significant problems with our new app which rolled out on May 7th, and I want to begin by personally apologizing for disappointing you. There isn’t an employee at Sonos who isn’t pained by having let you down.

And I assure you that fixing the app for all of our customers and partners has been, and continues to be our number 1 priority.

We developed the new app to create a better experience with the ability to drive more innovation in the future, and with the knowledge that it would get better over time.

However, since launch, we have found a number of issues. Fixing these issues has delayed our prior plan to quickly incorporate missing features and functionality.

Since May the 7th, we have released new software updates approximately every 2 weeks, each making significant and meaningful improvements, adding features, and fixing bugs.

While these software updates have enabled the majority of our customers to have a robust experience using the Sonos app, there is more work to be done.” And then, he goes on to detail some features that they hope to add in the near future.

No word on accessibility in this list, though.

He says: “We deeply appreciate your patience.”

Well, there’s an assumption right there, Patrick. What makes you think we’re patient? We paid good money for this stuff. You sat around in a meeting somewhere and willfully released a brand new version of the app that had almost as many bugs as there were missing features, and that’s really saying something.

So anyway, he deeply appreciates our patience, which I seriously dispute we’re exhibiting.

“As we address these issues, we know we have work to do to earn back your trust, and are working hard to do just that.

I am always open to your feedback. You can find me via email at CEO@Sonos.com.”

Well, it’s all very well publishing that email, Patrick. But I’ve heard from several blind people who wrote about the accessibility issues, and nobody, nobody got a reply from CEO@Sonos.com.

In days of yore, Patrick would reply to emails. I wouldn’t bombard him with emails. Just when there was something particularly urgent, I would send him something if it pertained to accessibility.

And a number of us let him know that there was about to be an accessibility disaster unleashed on the world. They released it anyway.

And they also knew that there was going to be a serious PR disaster unleashed on the world because of how buggy and featureless this app was. This is a premium hardware product that Sonos is selling.

Now, I don’t often call for heads to roll, you know. But I think in this case, we need to understand who’s responsible. If the buck stops with the chief executive, then the chief executive, Patrick Spence, needs to resign over this issue. He ultimately is the person that signed off on this debacle. So if I were on the Sonos board, I would be calling for his ousting right now.

This is ridiculous! When so many of us have invested so much money, and this thing is just dragging on and on. Somebody is grossly incompetent at Sonos, and I don’t think customers should be satisfied until that person is identified and is no longer with the company.

Maybe then, my Sonos shares will start going up again. Well, I hope so.

Be My Eyes Preparing to Share Data with AI Companies

Next, over the convention season, I’ve no doubt you will have heard that Aira has got a program going where you can opt into getting some free Aira minutes. In exchange for those minutes, you agree to have your data shared with a third-party provider that Aira is not disclosing. But they are an AI provider, and they’re using this data to strengthen their large language model.

Now, we hear from Be My Eyes with their approach. They’ve been talking about this this week, and they say:

“We intend to provide video data to organizations to train AI models to be more inclusive and representative of the blind and low vision community.”

Be My Eyes continues:

“We are strengthening our data and privacy policy and making it more restrictive, to ensure you have even greater control over if and how your data is used. We are taking these steps,” they say, “to ensure our users and the blind and low vision community generally are not left out of the innovative power of AI, and that AI models are trained using data that reflects the real, relevant, and lived experiences of people who are blind or have low vision.”

What? Hey, isn’t that against their language guide? I didn’t think they were supposed to be doing that person-first malarkey.

But anyway, the idea here is that you can opt out of sharing your data with a large language model, if you want to. But there are advantages, in my view, of sharing your sessions and making these large language models more aware of the lived experience of blind people.

Now, Bryan Bashin is quoted in the statement which, if you’re a Be My Eyes subscriber, you’ve probably seen. But it is worth reading this because I think it’s an important example.

He says:

“Last year, 19,000 blind beta testers participated in a group that made OpenAI models materially better for the blind community. Their feedback and data convinced OpenAI to allow more robust and fuller facial descriptions of people to be generated from photos.

“Think about that,” Bryan says. “Blind people directly shaped the most utilized AI model in the world and bent it to their needs. We need more of this at scale, and we need it quickly.”

I completely agree with this, actually. And given that you can opt out if you want to, it seems perfectly reasonable to me that these partnerships be formed so that we can all have more capable LLM experiences.

Because if you’ve used AI in any way at all, you’ve probably seen the AI apologizing from time to time for the fact that you’re blind, giving very stereotypical, erroneous, assumptive responses about blindness. We can crack that with the knowledge that we have, to ensure that these models are better, and that they more fairly represent blindness.

I think this is a good thing. What do you think? I’ll gladly keep it switched on. LivingBlindfully.com/opinion. If you’d like to comment on this, there are a bunch of ways to contact us there. LivingBlindfully.com/opinion.

Advertisement: Living Blindfully is brought to you in part by Turtleback.

For over 2 decades, Turtleback have been making leather cases for the products we use every day.

If you own a BT Speak from Blazie Technologies, give it some additional protection and dress it up in style with a black leather fitted case with protective keyboard cover and strap from Turtleback. Because the case is fitted, all ports and buttons are fully accessible, but the device you purchased is protected.

There are plenty of other products at their store as well. So why not check them out? TurtlebackLV.com. That’s TurtlebackLV.com.

And when you’re there and you make a purchase, if you use the coupon code LB12 (that’s LB for Living Blindfully and the number 12), you get 12% off all your purchases at checkout.

If you prefer to make a phone call, they’re standing by to receive that call at 855-915-0005. That’s 855-915-0005.

And remember that coupon code LB12 on the website at TurtlebackLV.com.

Vicky Cardona Discusses the Pros and Cons of Various Blindness Wearable Devices

Now, we’re going to get into the first interview that I recorded at the NFB convention, which means this is the first time that the Zoom H6 Essential is putting in an appearance on Living Blindfully for something other than a demo of itself.

I want to explain how these are recorded. I bought myself 4 Sennheiser MD-46 microphones for Living Blindfully. I like these mics. They’re extremely directional.

It was very hot in Florida. And in the middle of a Florida afternoon, I couldn’t bear to turn the air conditioning off. These mics do a pretty nice job of filtering a lot of that out.

There is an interview coming up sometime down the line where I have 4 of these microphones working at once. The Zoom H6 Essential handles all of this without breaking a sweat. It’s done some brilliant quality recordings. So with that, here’s the first of them.

Just keep in mind that this was recorded before Bonnie and I purchased our own Meta Smart Glasses, so I didn’t know much about them at all at the time this was recorded.

Jonathan: We’re talking with Vicky Cardona. Vicky and I go way back. I think I first met you, Vicky, when we were both working for Freedom Scientific.

Vicky: Yes, and I believe that was, I was trying to remember what year that was when you started, was it maybe 2013?

Jonathan: No, you’re way out. I started on the 1st of September, 2006.

Vicky: Oh, that’s right because you did Pac Mate.

Jonathan: So when I got to meet you, I found a kindred spirit in that you’re a bit of a gadget fan like me. And in particular, at that time, there were 2 big players, and it’s interesting to contemplate the fact that both of those big players no longer exist in the mobile space. So there was Windows Mobile, and there was Symbian. And I was very much in the Symbian camp on my phone, even though the Pac Mate was a Windows Mobile device.

And you just seem to have this endless supply and endless knowledge of Windows Mobile devices.

Vicky: Yes, yes. That was right, because that was during the time when Symbian was kind of reaching its ending times. I mean, before that, I was all about Symbian devices.

And then, Windows mobile devices started coming onto the scene. And one of the main reasons I really was intrigued by them is because there was some accessibility.

At the time, not all platforms really had accessibility. Even if you wanted to, you couldn’t use a BlackBerry, for example. I didn’t know of a way to use a BlackBerry as a blind person.

So there were aspects of the Windows Mobile platform that I really liked. And one of the things I really, really enjoyed was that you could, I can’t remember what the term was that they used back then, but it was basically the equivalent of jailbreaking. So you could go in, and you could kind of get down into the nitty gritty and go into the registry, and install various different types of apps on the Windows Mobile devices. And that’s what I really enjoyed doing – kind of getting down and dirty in that respect, and figuring out different ways I could tweak my device to do the various things I wanted it to do.

Jonathan: Yeah, and there were Code Factory and Talks, of course.

Vicky: Yeah, Talks was for Symbian.

Jonathan: Yeah.

Vicky: Well, Code Factory also did Symbian for a while.

Jonathan: Yes, yes.

Vicky: And then, they got more into the Windows Mobile space.

But there was another company, and I can’t for the life of me remember.

I’m going to probably remember way after we’re finished here.

But Code Factory for me was the one I really preferred. But I feel like there was another company in that space, and they just weren’t as prominent. Maybe they didn’t have as much to offer.

Jonathan: I think Dolphin had one.

Vicky: Maybe Dolphin. Maybe it was Dolphin.

Jonathan: You mentioned BlackBerry earlier, and I remember how people were very upset about BlackBerry not being accessible. And finally, the BlackBerry people worked with Humanware on a screen reader. And then right after that, BlackBerry sank like a stone, didn’t it?

Vicky: Yeah. I didn’t know they had worked with Humanware. That’s interesting.

Jonathan: Yeah, towards the very end. It was called Orator. And that was a screen reader for BlackBerry.

Vicky: Oh.

Jonathan: But we used to geek out. I was on the Symbian side, and you were on the Windows Mobile side, and you’d tell me about all these amazing devices.

And the reason why I bring this up, apart from the fact that people seem to love reminiscing about ancient forgotten tech, is that you now specialize in a new product category.

Vicky: Mmm-hmm.

Jonathan: I’ve come to realize that you’re really into these wearable devices.

Vicky: Yes, yes. That is one of my newer passions. I mean, other than my iPhone, which is basically a part of my body. [laughs]

Jonathan: I dragged you kicking and screaming into the iPhone world. Do you remember that?

Vicky: Yes. Yeah. I had to be taken in kicking and screaming because I was very much anti-Apple at the time, because I didn’t like the fact that it was such a closed operating system.

Jonathan: Yes. You jailbroke your device.

Vicky: Yes. And then finally, I learned about Jailbreaking. And that’s really kind of what got me because back then, it was super simple. It was so easy. It was a matter of just going to JailBreakMe.com, and just double tapping on a couple of buttons and boom! Your device was jailbroken.

Now, not so much. Nowadays, it’s a completely different story.

Jonathan: Do you still do it?

Vicky: No. First of all, it’s not really worth it. Like, the apps that are available now for jailbroken devices are really not any apps that I would want to use. It’s a lot more strict.

And I think that a lot fewer people are working on the jailbreaks because now, they’re getting paid the big bucks by Apple to work for them and try to find loopholes and things in the software.

So jailbreaking is really not what it used to be, unless you’re maybe someone who likes to try different various wallpapers on their phone or whatever, or different kinds of games maybe. But it’s just not the same as it was back then.

Jonathan: So what is it then that’s taking your fancy about this wearable category?

Vicky: I have got to say, with all due respect, Jonathan, I have a lot of things that I agree with you on. And I have some things, however, that I don’t see in the same kind of way you do, pun intended and not intended.

I’m one of these kind of people where I feel like I’m blind, but it’s not by choice. But to take that one layer further, I don’t want to be one of these people that feels like I have to settle for certain stereotypical opinions or generalizations from people, and I don’t want to be limited by my blindness.

However, if I could choose to not be blind, I would absolutely do it. I mean, hands down, the only way I wouldn’t is, let’s say, for example, if someone were to say to you, “Would you like to be able to see?”, the thing I would really have to consider is, are there any side effects? Because let’s say, for example, the trade-off is I would be able to see, but I would lose my hearing. No, that would be an absolute no-go for me.

Jonathan: What about the cognitive overload? The fact that since you’ve never seen, you’d be bombarded with all this visual information that your brain would have no idea how to interpret.

Vicky: You know, I have thought of that. And I did ask a person. I won’t mention their name because I don’t know this person very well, and I don’t know how they would feel about it. But I had heard of a person some years ago, like maybe 15, 20 years ago that was able to get their sight back. And this person had seen before, but they were very young when they lost their sight.

Jonathan: We’re talking about Mike, right?

Vicky: Yes. Yes.

And one time, there was one opportunity, I don’t remember why, I got to speak to him. And I just asked him, I said, “you know, I hope you don’t mind, but I really would like to ask you some questions about this restoration of your vision that you had”. And I don’t know how well he took me asking the questions, [laughs] but he was a good sport about it.

But basically, what he told me ultimately was that he decided he didn’t want to continue on that route and that he would just ignore it, because it was too difficult for him to readjust.

Jonathan: Yes.

Vicky: And at the time, I thought God, I can’t understand. In my brain, I’m like, I can’t understand that. I understand it a little more now.

But to answer your question, Jonathan, that is a chance I would take.

Jonathan: Okay.

Vicky: That is absolutely a chance I would take. I mean, sure, it would probably be very overwhelming. And my brain may or may not be able to take it, but I would still give it a shot. I mean, if the worst that would happen would be that I couldn’t deal with it but I would still be able to retain my other senses. And then to me, it’s like nothing ventured, nothing gained, nothing lost. That’s how I would look at it.

But to go back to the whole wearable thing. I just feel like, let’s say, for example, I go to the store and I have my cane in one hand and I have the bag that I just got from the store, or 4 bags in my other hand. I have very small hands. So to try to manipulate that, maneuver that, plus a phone that I have to wave around to try to figure out what’s around me, it’s like you need 10 hands. [laughs]

Jonathan: Yes, yeah. That is the hassle.

I mean, there’s such a powerful camera in any modern smartphone these days, be it iOS or Android. But it’s the practicality of it. You’ve already lost a hand with either a white cane or a guide dog harness. So then, you’ve got one hand for other things. And that hand’s often used to help you orientate to where you are. You’re reaching out for walls, or door handles or whatever. And then, you’re expected to hold this phone, take pictures, and get information.

Some people use chest harnesses to strap their phones into. But then, you’ve also got this whole category of wearables, which I think is only just starting to mature.

Vicky: Exactly.

And even to add to what you just said about things that people have to do, sometimes, you need to also be able to manipulate or move around your phone screen if you’re trying to figure out where you’re going and you need to access GPS information, you know. So there’s just so much that you have to keep track of.

And so I think with the advent of these wearables, it’s making life a little bit easier. I mean, there’s still a ways to go, but it’s really an exciting world right now in that respect.

Because I remember going back now to 2022, I had heard that there were basically two main companies on the scene with wearables. One was OrCam, and the other one was Envision. I was so intrigued by both, I wanted to see both, and I wanted to try out both. So thankfully, I was able to find a distributor here in Florida, Florida Reading and Vision. And so one of the reps from there, we met up, and she showed me both devices.

At the time, it was a really tough choice. But I thought, you know, they both had their pros and cons. But based on the experience I had with the Envision, I decided to go with that option.

And for me, it was a complete life changer because I was able to put the glasses on and use what they call the instant text feature, which is similar to the type of thing that you can get with Seeing AI, except it’s on your face so you don’t have to move your phone around. And all I had to do was just look around me like anybody else, and these glasses would start reading what they saw.

And I really love perfumes, and body sprays and things. And so I really try to not…

Jonathan: That sounds expensive. You won’t be able to afford wearables if you keep buying.

Vicky: [laughs] Right. I know, exactly. So I try to stay away from these stores. But every now and then, I can’t help myself.

So I went to this shopping center that had, well, I had to go grab some things from Target. It has like a Target, and there’s a Bath and Body Works, which is a few stores down from there.

I remember the Uber driver drops me off in front of the store. And as I was walking toward the door, I heard the glasses say to me, “exit, exit”.

And I’m like, are these glasses reading this wrong? Why is it saying exit? And I’m like, could it be that the driver dropped me off in front of the exit?

Someone happened to be walking by, and I said, “Excuse me, is this the exit door?”

And they said, “Yes, the entrance is over here to the right.”

So the driver had indeed dropped me off in front of the exit, and the glasses had correctly indicated to me that I was in front of the exit door, which if I didn’t have the glasses, I would have had no idea. I would have just walked in that door and I would have gone in through the exit door. [laughs]

So then, I walk in and I look around. And then all of a sudden, I heard it say restroom. And I mean, I looked to my left and it said restroom. So I said, oh my gosh! No way. So I started walking to the restroom. And then, I saw where the main doors were for the restrooms.

The only thing that it did not read, that I wasn’t able to figure out, was whether it was the women’s or the men’s. And I was pretty sure I was standing in front of the women’s, but I hadn’t gotten confirmation from the glasses.

Jonathan: And then a lot of those signs are tactile though, aren’t they?

Vicky: Yeah. Some of them are, some of them aren’t. And this was after COVID, so I don’t like to put my hands all over places, you know.

Jonathan: Yeah. [laughs]

Vicky: I think someone said, “Do you need assistance?”

And I said, “Well, I just want to make sure that this is the ladies’ room.”

And she said, “Yup, it’s right there in front of you.”

So really, the only assistance I needed to ask for was just to confirm that I was standing in front of the ladies’ room. Other than that, thanks to the glasses, I was able to actually locate the restrooms on my own. So I was blown away.

And then, I walked out of the Target, bought what I needed, and I started walking toward Bath & Body Works, and these glasses just start reading to me. I knew that the Bath & Body Works was going to be on my left. And so I just start walking. And they just start rattling off the names of the different stores I was passing like Office Depot, Party City or whatever.

And I just stopped in my tracks because I’m very emotional. I just started crying. I said, “I can’t believe this.” Having been blind from birth, this is for me, like the next best thing to being able to see – having these glasses read me these things.

Because sure, you can use GPS, and you can use apps like BlindSquare that’ll say such and such on the left, and such and such on the right. But it’s not the same kind of experience that you get when you literally just look in the direction of where the place is and have something tell you what it is, and just all the information when you walk up to a store like Target, or Walmart or whatever, and it says, “Welcome!”

And then, it says all these different types of signage and things that even your most descriptive person will probably never think to read to you as a blind person, because to them, it’s not that big a deal. And I mean, if I were in their shoes, I wouldn’t think it was a big deal either.

But because I had no idea that these things existed, it’s like wow, this is such an eye-opener.

Jonathan: Yes. It’s wonderful when technology can give you a hey-wow moment like that.

And for me, my first Aira call at CSUN in 2018 was like that, and just the freedom of being able to navigate complex environments. So that was interesting that even back then, you were getting that sort of performance from that feature.

And over time, we’re getting into things like GPT-4o, which is expected soon. And that sounds like it’s going to be a huge breakthrough for blind people because of the near real-time navigation.

Vicky: Yes, exactly. I can’t wait. I mean, I couldn’t believe it when I saw that video from Be My Eyes. I still don’t really believe it. I still think it’s staged, but I keep hearing it wasn’t. So I want to experience it. [laughs] I would love to experience that. And I know it’s coming. I just hope it’s sooner rather than later.

Jonathan: So that was Envision 2 years ago. And how’s the industry moved on since then?

Vicky: So kind of moving forward, at least with Envision, …

I did eventually end up getting the OrCam, but that’s… I don’t want to…

Jonathan: I was going to say, you ended up with both anyway.

Vicky: I did end up with both anyway. Yeah, I did. [laughs]

Jonathan: And the sprays and the perfumes.

Vicky: Yeah. [laughs]

Jonathan: Yeah.

Vicky: Well, you know, the way I look at it is I feel like I’ve worked hard most of my life. And I feel like you owe it to yourself, if you work hard and you really want something, whatever it is, whether it be traveling or whatever. Where there’s a will, there’s a way. And I feel like people should really try to go after whatever it is that they want out of life.

I mean, you’re an example of that, Jonathan. I mean, look at you with your current life. You’ve got me beat at this point. [laughs]

Jonathan: [laughs] I don’t know whether it’s a race, but I agree with what you’re saying.

Vicky: I’m not saying it’s a race, but you know, yeah.

[laughter]

Jonathan: So this puts you in a good position to perhaps tell us about how you think all these players are stacking up.

Vicky: Well, yeah. Right.

So basically, now, I mean, what’s really amazing is, … And I got to experience this last year when I met the folks from Envision for the first time because I did work for them for about a year and a half. I’m no longer with Envision now, as of this juncture, officially anyway.

So I did get to meet the guys from Envision. Very cool people. And they were still testing the feature that they call Describe Scene. And it was amazing because I could put the glasses on, and I could have them just describe what was around me, which I was completely mind-blown because it was like having a conversation with somebody telling me what they saw.

And I tested it. Like I went into the bathroom and I looked at myself in the mirror and I said, describe the scene, or tell me what you’re looking at. I don’t remember how I said it exactly at the time, but it basically described me.

It said, “There’s a lady with dark brown hair.” And I mean, it was so vivid.

And these were like test glasses, because they were not available to the public. So only the developers had them.

I kept thinking to myself, how can I just get out of here as fast as possible, run as fast as possible with these glasses, and them not catch me? [laughs] I just wanted so badly to take it.

And I said, “Look, can you put that on my glasses?”

And they said no no no.

So as it happened, I waited probably, I don’t know, 4, 5, 6 months before it finally was available. It was a while. Maybe it wasn’t that long, but it was a long time for me.

But yeah. I feel like now, it’s just incredible how you can basically ask the glasses questions about your surroundings and about your environment. And the accuracy is pretty high.

Jonathan: One of the upsides of doing a podcast like this is you can go in-depth with something.

I did a very in-depth review of Envision a while ago now, back in the early 200s, so quite some time ago. We went through all of the features, and took it to the mall and that kind of thing. And there were a couple of things, 3 things, I think, that stood out for me as negatives.

One was that Bonnie and I would have to have a separate pair of glasses, and I think that’s unreasonable. I think there should be a way for Bonnie to claim the glasses when she wants to go out, and it then connects to her Envision app. And then when they come back, I can claim them for when I want to use them.

It’s kind of like having a family car, right? One of you drives it one day, and the other drives it the other day. So I was pretty concerned that that didn’t exist.

The other one was how long it took to process images.

And the third thing was, I find this with the Envision app at the moment, and it’s frustrating to me. If I give Seeing AI any piece of print, whether it’s upside down or on its side or whatever, it doesn’t even bother me with the minutiae of the fact that this is upside down or on its side. It just reads what I need to hear. And Envision, at least on the app, doesn’t seem to do that. So those were the 3 big things.

Vicky: Yeah. With respect to being able to share glasses, I mean, I would say for me, sharing glasses is not the same as sharing a car because glasses are a much more personal item. I mean, you put them on your face. I’m sure with your wife, you might have customizations and things that you might want to have set up your way on your layout, and she might want to have her own.

Jonathan: I see what you’re saying. At the actual hardware itself, right?

Vicky: Yeah. But by the same token, I mean, I know the equipment is expensive, and so it’s tough. But I feel like that might not be such an issue now. You might be able to work something out. I mean, I don’t know. I’m not speaking for them, but I feel like that might be a different situation now in that regard.

But yeah. I don’t know that I would want to share. I’m even kind of picky about who I show my glasses to, because I don’t want just anyone putting their hands all over my glasses. [laughs]

Jonathan: [laughs]

Vicky: But yeah. I mean, the processing of the images, the time it takes, that has become faster. But for me, I’d rather sacrifice speed for detail. In other words, if you’re going to give me a quick response but the answer is not going to be as accurate, then I’d rather wait a few minutes. But that said, I mean, I kind of would want a bit of a middle ground.

But you take, for example, now you’ve got your Meta glasses, which, that’s the rage now. And they’re really popular because of their affordability. And even though the Metas are not designed for people who are blind, they have features that blind and vision impaired individuals can use.

But there’s the trade off. For example, the Metas don’t have features like instant text. So you’re always having to interact with the glasses. You’re constantly having to say, “Hey, Meta. What do you see?”, and just continue to have dialog with it.

Whereas with the Envision glasses, especially with the Instant Text, if you’re just interested in reading what’s around you, you can just turn on the Instant Text and just continue to listen to what the glasses are saying, and you don’t have to talk to them.

Jonathan: Yeah. One really cool experience I had when I was reviewing them was sitting in a car and looking out the window, and just hearing about businesses that we were passing as my son was driving the vehicle. That’s pretty amazing.

Vicky: That is amazing. Yeah. And that was a great review that you did of the glasses.

Jonathan: Oh, thanks!

Vicky: So yeah, it was awesome.

The main reason I decided to try out the Metas is because I wanted to be able to take pictures. I’m a horrible picture-taker. And I mean, I might not be as bad as I think, but you’re always your own worst critic. At least, that’s how I feel about myself. I feel like I’m my own worst critic.

There are blind people I know who take incredible pictures with their phones, but I don’t happen to be one of those people. I’m not confident enough in myself, and I’m always very self-conscious that my pictures are not going to turn out that great.

And quite frankly, I don’t want to share a picture to either social media, or even to friends and family that is going to look like a blind person took it.

Jonathan: Yeah. And it’s hard to verify the quality of what you’ve taken, isn’t it?

Vicky: Yes, it is. Yes, it is.

So with the Metas, I mean, the camera is right there. So you have the camera there on your face, you can take the picture. And not only can you take the picture, save it and share it, but you can at that time, at that moment, have the glasses describe the picture you took.

Whereas if I take the picture on my phone, then I have to take the picture, then open up an app like Seeing AI or Be My Eyes, then have it described, and then go from there. If it’s a bad picture, then I have to start the process all over again, and that’s just painstakingly frustrating and tedious, you know.

Whereas with the glasses, you can either click the capture button, or just tell the glasses, “Hey, Meta. Take a picture.”, and it’ll take the picture. And then, you can have it describe the picture to you. That was the reason I bought the glasses.

And then, of course, you can also take videos with the glasses.

And what’s really cool is that sighted people, they love the glasses. They’ve really embraced these glasses. And so we are now using a product that other people are using. So it’s not like we’re going to look like sore thumbs sticking out, you know. [laughs]

Jonathan: Do you think that’s a risk of the Envision product?

Vicky: Well, you know, that’s interesting because my personal experience with Envision is, … I don’t know why it is exactly. I really can’t tell you why. But very often, when I wear my Envision glasses, even though they have the camera that kind of sticks out because it’s the Google Glass, you can tell that they are smart glasses, more so than with the Metas (because the Metas look like regular sunglasses).

But when I’ve worn these glasses, I’ve never been told, “Ooh, you look weird. What kind of glasses are those?”

In fact, for me, for whatever reason, it’s been the opposite. I have actually been able to tell a very distinct, … It’s a huge difference in the way that I’m treated because people think I have vision when I use the Envision glasses, even more so than with the Metas. I can’t put my finger on why because you would think oh, because of how the glasses look, that wouldn’t be the case. But for me personally, it’s really interesting to me. And I take it as a compliment, of course. But yeah, people often think that I can see, and I know this because of the way they speak to me and the kinds of questions that they ask me.

Jonathan: Just before I sat down to talk to you, I had lunch with Mike Calvo. Bonnie and I both had lunch with Mike.

And of course, Mike said, “Hey, have a look at my Meta Smart Glasses, bro.”

[laughter]

So I had a look at them, bro.

And Bonnie put them on and said, “Hey, Meta. What am I looking at?” The menu was in front of her, and it started to summarize the menu. I don’t know whether she could get it to read the whole thing top to bottom. Do you know if it can do that? Can you actually say to it, “Look, just read me the whole document.”?

Vicky: It’s a hit or miss. I haven’t personally tried it with the menu, but I can tell you I have been able to get it to read me short things. But entire documents, oftentimes it won’t read. I do know of people who have successfully been able to get the Metas to read them their menus.

I’m of the camp that if it summarizes it, that’s about the best I can do with it.

I’m very detail-oriented, and I often do like to have things read out to me verbatim from top to bottom, even if it takes longer. And that’s not something I can really accomplish very easily with the Metas.

Jonathan: But you could with Envision, right?

Vicky: Yes. I can with Envision, I can with OrCam.

Jonathan: Yeah.

Vicky: And the other thing too is, for example, pieces of mail. The Metas sometimes won’t read addresses and things like that because it’ll state that for privacy reasons, it won’t read it. But yeah, you can do it with Envision and with OrCam.

And one of the reasons I bought OrCam as well as Envision was because at the time I bought my Envision glasses, the offline scanning feature was not available at that time. It is now. But back then, it was not. And now, it’s a whole different situation because now, OrCam has online functionality. Back then, it was just all offline.

And so for me, that’s the reason I got them was because I knew that if I wanted to read mail or if it was something really really personal that I didn’t want to take any kind of chance of having it shared or hacked into, then I could use my OrCam because it wasn’t going anywhere.

The drawback to that though, is that I couldn’t export it to anything. So the only way that I could save what it was reading is to just record what it was reading with my phone, or something.

But there’s not a way to export the text on OrCam. I don’t believe there is now, either. I don’t use my OrCam glasses as much as I did back then, so I’m not as familiar with some of their new features. But they do have online functionality.

And of course, with Envision, regardless of whether you scan offline or online, you can still save what you scan to the app.

The main difference between online and offline scanning now is that with online, you can now ask it follow-up questions about what you scanned. So if you scanned a document, let’s say it’s a huge document with a lot of information on it, you can just have the glasses summarize what’s on the document, or just ask it specific questions. Like if it’s a menu, and you want to know just about what the seafood entrees are, then you can ask it to tell you what seafood entrees it sees on there.

Jonathan: When you and I have had long, waffly, geeky conversations about this product category, I think one of the things you told me about OrCam was it’s particularly good if you want to sit there and just read a book.

Vicky: Yes.

Jonathan: So that’s pretty cool that you can just take any print book and read it reliably.

Vicky: Yeah, I was very impressed with that when I got the OrCam.

I will say that my experience with it is different now, and it could be me. So I don’t want to say that it’s them per se, but I feel like the OCR… Back then, it was just amazing. And yes, I could take a book and read it.

Now, it’s kind of hit or miss. Reading a magazine for example with the OrCam, if it has like a really shiny paper on it, sometimes, it doesn’t read it so well.

But I know, at least back when I got my OrCam, they sent me this huge user guide. It’s like a huge paperback book. And the reason why it’s so big is because the book is written in, I don’t know, a bunch of different languages. But I would read the English section. And it would even describe some of the pictures back then of what was on there. So I thought that was really, really cool. And yeah, the OCR on it was fantastic back then.

And at the time, Envision was improving gradually. Now the OCR that you can get with Envision is great.

What I would say is that Envision provides more guidance. So if you’re totally blind and you don’t have much experience with wearables, it will give you guidance as to whether you have to move the document to the left or to the right. OrCam doesn’t really do that.

Jonathan: So would you say that the Envision glasses are kind of like the ultimate to-go-for right now, in terms of capability?

In other words, the Meta Smart Glasses are cool because they’re being mass-produced, and they have some features. And for example, you can get on WhatsApp and actually get real-time assistance, can’t you, with people?

Vicky: Yes, yes.

Jonathan: So that’s very nice in such a small form factor.

But if you wanted a lot of blindness-specific features, it sounds like Envision still has it significantly beat.

Vicky: Because of all of the different things that you can do with them, all of the various features that it has.

If you, for example, need to do a lot of reading and it’s privacy-based, then you can do that with Envision, but OrCam might be a better solution. It just kind of depends.

Another thing that OrCam does that Envision does not do is it does have a barcode reader, and you can teach it products.

Now, you can get around that with Envision by just using the Instant Text, or sometimes the Scan Text feature, and it’ll read you what the product is.

But let’s say you work at a grocery store as a stocker, and it’s just easier to read the barcodes, then you might want something like an OrCam because it does offer a barcode reader.

It’s not always easy to get the barcode reader to work because again, you don’t receive any type of guidance. So for flat packages, it’s pretty easy. But if you’re trying to scan a can or something round, it’s a little more tricky.

But yeah, the Envision, it’s like a jack of all trades because you can do so much with them. Anything from just standard OCR (it reads handwriting), all the way to being able to use Aira with them, or contact a regular friend or family member using the Call an Ally feature.

So those are features that you can do with the Envision glasses that you cannot do with the OrCam because with the OrCam, you can’t place any kind of outside calls.

Now with Meta, yes, I’ve used WhatsApp, and you can also do it with Messenger. I haven’t tried it on Messenger, but I have used WhatsApp to get assistance with the Metas, and that’s really cool. And of course, I’ve used the Call an Ally feature and Aira on the Envision glasses.

I’ve looked at other hardware that I don’t own, like Seleste and ARX, and most of them have their pros and cons. But I feel like overall, and given the experience I have, the Envision has the most to offer at this point.

Jonathan: So one of the challenges that we’ve got is Apple’s still very locked down, which is why you used to jailbreak it all those years ago.

Vicky: Right, yeah.

Jonathan: So at the moment, it’s necessary for all the processing of the image to happen on these wearable devices.

So if you take a picture with the Envision Smart Glasses, then it’s actually the computer built into the smart glasses that’s doing that recognizing, and it needs an internet connection from somewhere, and all that kind of stuff.

But if Apple would open up its camera API so that third-party wearables could connect to the camera API, you would then have a situation where the actual glasses could be much less smart because you could just have those glasses take a picture, and send it to your Apple device for processing.

And if Apple’s not careful, this is an area where as we take increasing interest in wearables, Android’s going to eat Apple’s lunch because Android can do this now, and Apple can’t.

Vicky: So I’m not sure how much of the processing is being done on the glasses, and how much is being done on the actual device. But that is a stark difference between Meta and, let’s say, Envision and OrCam is that you do need to have the device close by, because it does all of the work. It needs to have the connectivity to the device, which is done usually via Bluetooth. It has to work with its companion app, MetaView. So it uses the MetaView app for a lot of that processing.

Jonathan: When you take a picture with Meta, how long is it taking to come back with an answer on average?

Vicky: Very fast. I would say 30 seconds or less.

Jonathan: That’s still quite a long time. With Envision, you’re walking around, and you’re getting real-time information.

Vicky: Right, and that’s what I’m saying. You can’t necessarily get the same kind of real-time information. It has to be like one picture at a time with Meta.

But compared to the other options available right now, it’s very fast. I mean, I can show you here later, if you’d like. But it doesn’t take that long at all.

It really depends though on various factors, like how good the connectivity is where you’re at currently, and if there’s a lot of other Bluetooth devices around. So there’s always that, but it’s still pretty fast.

But it’s not the same as getting everything real time, which is why I’m so excited about this GPT-4o, because that would take those concerns out of the equation. But it’s getting there.

The other thing too about Meta is that the AI is still being worked on. It has improved, but there are times when I feel like I get better results than other times.

Jonathan: Yeah. I’ve been playing with Meta AI a bit on WhatsApp, you know, because it pops up there, and you can talk to it. And I have to say, I’m not overly impressed. It hallucinates a fair bit, and I’m not sure whether everything it gives back is accurate. So how much confidence should someone have in the answer to the question, what am I looking at?

Vicky: Right, exactly. I agree with you there. There are some days when it’s spot on. And then, there are times it’s not.

I’ll say to it, are you sure that such and such is what you said? And sometimes, it’ll actually say I apologize, but it’s such and such. And so, if I don’t know, …

Like let’s say, for example, I’m holding a bottle of Dr. Pepper. And then, I ask it what I’m holding.

Jonathan: Carbs, carbs, carbs.

Vicky: Yes. [laughs]

I ask it what I’m holding, and it says I’m holding a bottle of punch, or something way off.

I’ll say to it, are you sure about that? Can you double check?

And it’ll say, I apologize. You’re holding a bottle of Dr. Pepper. But what if I didn’t know what I was holding, and I went with what it first said?

Jonathan: Yes.

Vicky: And that’s what people really need to be careful of. I’m not saying that other solutions don’t hallucinate. It’s not like it’s not going to happen. But I feel like it would happen a lot less with Envision and OrCam, because part of the whole idea of having blind people use these devices is that the developers really check for that, and we also test these things out to ensure that it happens as infrequently as possible.

Jonathan: Yeah, because you’ve got a lot of tools in the toolbox with Envision. And if you want, you can just take AI completely out of the equation and just have it do OCR.

Vicky: Yeah.

Jonathan: So you’re not in any doubt. It might not see all that you want it to see. But if it sees anything at all and reads it back to you, you know it’s actually what’s written on the package, or whatever it is that you’re looking at.

Vicky: Exactly. And that’s something that can’t be done with Meta. Sometimes, I can say, “Hey, Meta. Read this to me.”, and it’ll read me whatever I’m holding in my hand, right?

And then, there are other times when I’ll say, “Hey, Meta. Read this to me.”, and it’ll say, “You have not received a text in the last minute.”

So it’s really really hit and miss, and it can be annoying for me at times.

Jonathan: What I want, what I really, really want, … I hope Posh Spice is listening.

Vicky: Yeah. [laughs]

Jonathan: is I want a wearable to integrate Seeing AI.

Vicky: I agree. Yes yes yes yes, absolutely. I would love that.

Jonathan: I mean, of all of these tools, I find Seeing AI consistently the least finicky and the least hassle, and it just gives me the answer I want.

And again, I come back to the fact that still after all this time, as far as I’m aware, if I give a piece of print to Envision and it’s the wrong way up, it does not flip it for me.

Seeing AI just does. And for me, if I’m going through a lot of groceries (we get a whole bunch of fresh meals delivered every week, and it’s my job to take them out of their box and put them into little piles, neatly organized in the fridge), with Seeing AI, it is so effortless to just put it into the instant text mode. It doesn’t matter whether I’m holding the package upside down or not. I just hear right away what it is, and can put it away.

Vicky: I have to admit, I don’t use the Envision AI app as much as I do Seeing AI. I started with Seeing AI, and I’m very comfortable with it. I rely on it. I trust it. I’m not saying that you shouldn’t trust Envision. It’s kind of what you’re used to is what I’m getting at here, and I really love Seeing AI.

And you’re right. I mean, there are times when it doesn’t get something right, but that’s going to be like that with anything.

The few times I have used Envision, I haven’t had that problem where it says something is upside down.

Jonathan: Hmm.

Vicky: Now, OrCam will sometimes tell you that. But sometimes, I want to know if something is upside down or right side up, you know.

Jonathan: Yeah, that’s fair enough. Yeah. I’m happy to have the information that it’s upside down. But then, I’d like it to read it to me anyway.

Vicky: Right. Yeah.

Now, there is a wearable that does work with Seeing AI. Did you know that?

Jonathan: No, what is that?

Vicky: ARX, you can use with the Seeing AI.

Actually, I believe Seleste does as well. I have worked briefly with both. When I went to CSUN, I got to see both of them.

The ARX was very responsive. But the reason, for me anyway, that ARX would not work is because currently, it’s only supported on Android. And so you would have to have an Android device, and it is tied to that Android device.

It’s not like you could just take the ARX with you and leave the phone in a different place. It’s a wired connection, so you have to have the device constantly connected to the charging port on the Android, which is another drawback because if the battery begins to die on the Android device, you won’t be able to charge it and use the ARX at the same time.

Jonathan: Sounds like the good old Horizon glasses from Aira.

Vicky: Yes, yes. That’s right. Yes, that is right.

And then, everything, like I say, has its pros and cons. But one thing with the Metas that makes them a little different from Envision and OrCam is that they serve as both smart glasses and an audio device. So if you use the Meta glasses, you don’t necessarily have to worry about using another Bluetooth headset, because all of your phone’s system sounds and VoiceOver and everything is going to come through the Meta glasses.

Jonathan: So that could be great for navigation. If you’re running a GPS app, you’ll hear those instructions through the Meta glasses.

Vicky: Yes. Yeah, it is great. The only drawback though is, for example, if you’re using something like Voice Vista that supports the head tracking with AirPods Pro, that doesn’t work with the Metas, currently. I’m going to go ahead and presume that it’s an Apple limitation, but I don’t know that.

I’ve asked the guys at Voice Vista to see if there’s any way they could look into enabling the head tracking.

Have you used Voice Vista before, Jonathan?

Jonathan: I have, yes. It’s a good app.

Vicky: So you’re familiar with what I mean by the head tracking, right?

Jonathan: Yes, yes.

Vicky: Well, maybe you might have listeners who aren’t. So it’s really cool because, … Sorry, I’m turning my head left and right. I hope that doesn’t mess up the microphone.

[laughter]

You can literally turn your head left and right, and the Voice Vista will give you information about what’s around you, based on which way your head is turned. So again, you don’t have to take your phone out of your pocket and move it around, or turn your body around. You just have to turn your head one direction or the other, and it tells you that.

I think that is an amazing feature, and I really hope that they can implement that with other headphones, even, for example, the AfterShokz. You can’t do that with AfterShokz either. But I’m told that it’s a limitation of AfterShokz. It doesn’t have the correct chip.

Jonathan: Now, yes. I think that’s probably because AirPods are pairing in a unique way, whereas what we’re talking about otherwise is just standard Bluetooth headset devices. And I don’t think that information is being communicated via the Bluetooth protocol. So I suspect that’s what’s going on.

Vicky: Yeah.

Jonathan: The other thing is that there are glasses now starting to emerge that integrate GPT-4o. And I must say, I’m a lot more interested in that.

I just have a little bit of a reticence about Meta, you know. They’ve got a bit of a history with people’s privacy and people’s data.

Vicky: Yeah.

Jonathan: And GPT-4 is a proven commodity.

If you could get GPT-4o on a wearable, that would be a very big deal. And those glasses are now starting to emerge.

Vicky: I’ve heard that, yeah, but I don’t think that they’re available to the public. Like I know that there are ones I think are called Ergo Solos, or something like that.

Jonathan: Right.

Vicky: But they’re not available yet for prime time. And then, I think they have like a certain type that you can get which is just for the audio. And then if you want to add the camera, then it’s an additional piece that you’ve got to put on there. So yeah, it’ll be interesting to see how that pans out.

Jonathan: So as we wrap, I’ve got a big tough question to finish.

Vicky: Uh-oh!

Jonathan: If you had to pick one to keep, which would it be and why?

Vicky: I would say Envision. And the reason is because, again, I can do more with them than I can with any of the others. Obviously, I can’t take pictures with them, but I feel like they would be the best tool to help me when I’m out and about and in my daily life, doing my thing. I would say those are the ones I would choose.

Jonathan: People are concerned about the fact that it’s using a product that’s no longer being developed by Google. Does that bother you?

Vicky: No, and I’m going to tell you why. Because just because a product is no longer, let’s say, being sold by a company, doesn’t make it a bad product. And it doesn’t make it an unusable product.

I think it’s more a matter of what the company chooses to do and make available with that product, because Google could come out with a new set of glasses right now with even fewer capabilities and fewer features. And just because it’s newer doesn’t make it better. So I would say no, I’m not worried about it because I know what the capabilities are and what I’m able to currently do with it. And I feel like there’s even more that Envision can do to make it even better still. And I feel like that’s the standpoint that I have on it.

You can take, for example, the Metas. One of the things that I don’t understand is why these companies don’t seem to be concerned about, or don’t place more of a priority on, the implementation of a flashlight. The only glasses that have a flashlight that I know of are the OrCams, and that’s not that great of a flashlight. The Metas don’t have a flashlight, and it doesn’t seem to be a big deal. And these are glasses that just came out in October, and we’re in July, so 8 to 10 months ago. It’s less than a year, and no flashlight.

So newer is not always necessarily better. I mean, sure, the camera. Is it a good camera? It’s a great camera, absolutely.

But having said that, I feel like you might have a better, wider angle on the Envision glasses’ camera than on the Meta. I haven’t actually been able to test that with a person to do the compare and contrast, with respect to the distance that can be seen on one versus the other. But I know that people have told me before that they can see pretty far away when using the Call an Ally feature on the Envision glasses.

Jonathan: And I know I said I was going to wrap it. You’ve prompted a question here about getting the Meta glasses because you said you got them because you wanted to take pictures. Can you not do that with the Envision glasses?

Vicky: You can take pictures, but they have not given us the ability to save them.

Jonathan: That’s quite interesting. I mean, that would seem like a 101 kind of feature.

Vicky: Yeah, you would think. I don’t know what all the reasons are, but I know that currently, the process to save a photo is, It’s doable with the regular Google Glass, but it’s not accessible at all. So you can try to take a picture, but because there’s no audio feedback, you don’t know if it’s in picture mode, or in video mode, or anything. So I’m not sure how easy or difficult it would be for Envision to make all of that accessible. But I’ve asked them multiple times. And so they have other features that they feel are higher priorities.

Jonathan: Well, this has been fun. And you know, the next time that we talk about geeking out over a specific product category, it’ll probably be about which service we prefer to upload our brains to the cloud so we can live eternally and continue doing Living Blindfully in perpetuity.

Vicky: [laughs] Yeah, yeah. That’s right. There you go.

Or living sightfully. Maybe by then, I’ll be living sightfully. You never know.

Jonathan: Yeah, yeah. [laughs]

Anyway, this has been fantastic. Thank you so much! It’s lovely to have you on the podcast.

Vicky: Well, thank you so much for having me. This has been a lot of fun. Thanks!

Advertisement: A reminder that Living Blindfully is brought to you in part by Aira.

And the big news of the week, unquestionably, is that Aira now integrates with RIM.

So I want to circle back and talk about this one more time. RIM (Remote Incident Manager) is a fully accessible way for anyone with your consent to provide you with remote assistance to log into your computer and help you with things. And who better to do that than an Aira-trained professional agent?

Download the RIM app for free from GetRIM.app, call your Aira agent, and tell them that you want to initiate a RIM session. You’ll be amazed at how simple this is to set up.

It costs you nothing to have someone assist you with RIM. And with Aira’s wonderful 5-minute free call offer, you can get professionally trained remote assistants to help you with computer-related tasks. Epic, I tell you. Epic!

Find out more by going to Aira.io. That’s A-I-R-A.I-O.

Surface Laptop 7 Review

I’ve got to tell you. I’ve got to tell you that Christopher Wright, he’s a lucky guy because he’s got the Surface Laptop 7. I’m hearing so many good things about these new ARM-based Surface devices.

And Christopher says:

“I’ve only had the laptop for a few days, but I really like it.

I have the base model with the 10-core Snapdragon X Plus, 16GB of RAM, and a 256GB solid-state drive. It’s very fast.

And the battery life is very good, though I expected a little better. I found it drains at a rate of about 10% per hour. I’ve discovered that always using Energy Saver mode cuts that down to about 4 or 5% per hour. I’m going to keep it in this mode as it doesn’t seem to impact general performance that much.

It would probably be better if Windows were a more efficient operating system. But as those of us who are tech savvy know, Windows is not efficient. The Prism translation layer that runs regular x86 and x86-64 software probably doesn’t help in the power consumption department either.

I tried connecting my Fit-Headless HDMI dongle to the laptop and turning off the built-in screen, but this doesn’t seem to save power. The HDMI plug tricks the computer into thinking it has a display connected when it really doesn’t.

Using the laptop is essentially identical to using a regular x86 Intel or AMD Windows machine. Most applications appear to run fine, even those that run in emulation like Audacity, VLC, and audio games. Mainstream accessible video games or more demanding software may not run well under emulation, or at all. But I’ve got a desktop with an i9-12900K and an NVIDIA GPU with 12GB of VRAM if I want to do things like that.

It’s very seamless, and unlike Apple’s Rosetta, I don’t expect this to stop working in a year or two.

Having said that, I hope more applications become native over time.

Remote Incident Manager is also running in emulation mode.

The only driver problem I have right now is the USB driver for my Focus 40. According to Freedom Scientific, I need to install JAWS, and I really don’t want to do that. Hopefully, they’ll update the stand-alone driver installer to support ARM64 in the near future.

The keyboard is okay. I like the ability to toggle the function key behavior. Press Fn, which is to the right of control, once to turn hardware features on, and press it again to toggle the keys to standard software features.

The placement of the arrow keys is irritating. The left and right arrows are very large, with up and down sandwiched between them as much smaller buttons. There’s no empty space around the keys either.

The laptop came with a lot of bloat like Office 365, Copilot, some kind of Dolby sound junk, etc. Fortunately, it’s Microsoft bloat, and it’s easy to uninstall using the Installed Apps section of Settings.

I tried really hard to clean install Windows, but no matter what I did, the bloat kept coming back. A complete cloud reset where I erase everything doesn’t work, and neither does restoring the system image from Microsoft’s website. I tried creating a bootable ARM USB drive, but the computer refused to boot it using the Advanced Startup Options menu. Sadly, it looks like I’ll have to be content with manually uninstalling the bloat. Hopefully, it doesn’t leave too much junk behind.

Overall, I’m very impressed by this laptop. It’s very fast, and seems to run all the programs I want without any problems.

I’ve disabled the touchpad in Settings and the touchscreen in Device Manager.

The hardware is very nice, and Microsoft gets bonus points for making it very easy to replace the SSD and battery. All you have to do is remove the rubber feet from the bottom, remove screws that are covered by said feet, and the cover lifts off.

Hopefully, Microsoft has learned their lesson with Windows on ARM this time around. The Windows RT experience 12 years ago catastrophically crashed and burned, but this actually has a chance of taking off. We’ll have to wait and find out how the platform matures over the next few years. I’m glad I got this instead of a MacBook Air.”

And would you like to hear what it sounds like when you record on it? Would you like to hear that?

Alright, then. Here’s Christopher with a wee sample for us.

Christopher: Hey, everyone! This is a quick demo of the microphones on the Microsoft Surface laptop from 2024.

I had to go into Windows Settings and turn off a bunch of things that they call sound enhancements on the microphone, though I wouldn’t classify them as enhancements. I think they sounded terrible.

This is presumably what the microphone sounds like when you turn all that stuff off, and it’s capturing things the way that they were meant to be captured. This is in stereo, and they appear to be placed at the top, I guess the top left and top right portions of the monitor.

If I tap the left side of the screen though, you’ll hear it on the right side. And if I tap the right side, you’ll hear it on the left. So I’m not sure why it’s backwards like that. But there’s your quick demo.

The built-in microphones on this laptop aren’t bad, not bad at all.

Jonathan: I’m very impressed with that audio, Christopher. That is a lovely bit of audio coming from built-in mics of a laptop. Every bit as good as the MacBook, in my view. That’s pretty impressive.

Now, we did get an addendum because Christopher’s nothing if not persistent. He says:

“I figured out how to remove the bloat. If you use the option in the Reset tool to keep your files, you can choose to remove pre-installed programs. It removed Office 365, OneNote and the other junk, and it placed an HTML file on the desktop containing download information, just in case I want to download the programs again.

I’m currently testing if a subsequent Remove Everything reset brings it back. It unfortunately does. (Insert booing or other sound of disappointment.)”

Okay, I can do that.

[Aww! sound effect]

“The solution appears to involve resetting the computer, keeping personal files, and removing the pre-installed software. This is as clean an experience as I can get, since booting from generic USB Windows install drives on ARM computers appears to be a big no-no.

This is also very useful for folks that aren’t comfortable making bootable drives, but still want to start with a clean Windows system that’s free from manufacturer crap.

I learned something new today, and I hope it helps others as well. It’s always a good day when you learn something new.”

Absolutely.

“Down with bloatware!”, says Christopher. “Time to celebrate.”

I’m sounding like Arlo Guthrie.

See, one man’s treasure is another man’s trash, right? I would not want to have a Windows computer without Office 365 on it, but I do agree that you should have the right to get rid of it if you don’t want it. So all’s well that ends well.

Now, there’s a range of these new Copilot+ PCs – some from Microsoft, and some from third-party vendors like HP.

If you’ve got one of these new Copilot+ PCs, how are they working out for you? Are you pleased with the performance, the battery life, the compatibility? Do let us know your experiences. That will be most helpful, I’m sure, to the community.

You can go to LivingBlindfully.com/opinion to find out all the ways that you can be in touch – email, WhatsApp, and phone, they’re all listed at LivingBlindfully.com/opinion.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions.com.

Closing and Contact Info

And that’s where we will wrap it up for another edition of Living Blindfully.

A reminder that if you are not a Living Blindfully plus subscriber yet, you can do that for as little as 1 New Zealand dollar a month, which is only about 60 US cents. You can certainly pay a lot more, if you think it’s worth it and you feel able to. We certainly appreciate that. It really does help a lot.

In exchange, you get the podcast 72 hours ahead of its general publication, so you’re in the know sooner.

I really want to thank everybody who subscribes through Living Blindfully plus for helping to make this podcast viable.

We will see you next week.

Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, tell your friends. Spread the word on social media. And if you’d take the time to give us a 5-star review on Apple Podcasts, we’d appreciate it.

We love hearing from you. Be a part of the show by getting in touch via WhatsApp, email, or phone.

For all the ways to share your thoughts with us, visit LivingBlindfully.com/opinion. That’s LivingBlindfully.com/opinion.