Eyeglasses aren't just for improving vision anymore. Smart glasses use built-in cameras to scan what's in front of you and speakers to describe what's there. Join us this week as we explore the most popular options on the market today, Meta and Envision. Listen in as the Hadley team shares their personal experience with using these glasses in daily life.
Hadley
Smart Glasses and Vision Loss
Presented by Ricky Enger
Ricky Enger: Glasses, they can help block out light or help us see things more clearly. Turns out, there are some glasses that do a whole lot more. In this episode, we'll discuss exciting wearable options including products from Envision AI and Meta. I'm Ricky Enger, and this is Hadley Presents. Welcome to the show, everyone. We've got Doug Walker here.
Douglas Walker: All right. Great to be here.
Ricky Enger: And we have Lisa Salinger.
Lisa Salinger: Hello.
Ricky Enger: So great to have you both. So today, we're actually talking about glasses. And oh my goodness, have glasses ever become a thing. We get so many calls about glasses, which seems very surprising because you think about glasses as just this simple little device that you put on your face, and it doesn't really do much other than its intended purpose.
But in the last couple of years of AI making its way into things with cameras getting smaller, batteries getting smaller, and allowing some glasses to do way more than they could before, we have some really exciting things to talk about. I'm so glad you could join us. Lisa, we've had conversations before, and you've made a really good point that will sort of set the stage here, and that is, what do we think of when we hear glasses? What's that first impression that pops into people's minds?
Lisa Salinger: Well, I think that what most people think of first with glasses is sight. I know of some people who've been really disappointed, because they got some kind of smart glasses, and they couldn't see any better.
But people want even the illusion of sight. And by putting them on your face, right where your eyes would look at an object, these glasses are certainly convenient. But it's important to remember that they will not add to or enhance any sight that you have. The good part of that, though, is that if you have very little sight or none at all, you can still use these. You are not excluded from using them.
Ricky Enger: Yeah. That's a really good point. So glasses, in some cases, do not equal vision. It's really just the form factor we're talking about, not how the device actually works. Now, I should say, there are some glasses out there that will help enhance your remaining vision. eSight has a pair called the eSight Go, Acesight has an option, and a company called Seva makes the SevaVision glasses. So there are definitely products that will help in that respect: you can zoom in on things, change your contrast, do some distance viewing, things like that. But we're not really going to talk so much about those today. We're going to focus on the options where you're getting your info through audio, right?
So the glasses are talking to you. You're talking back to the glasses. It's a really cool way to interface with these things. So, for any of these glasses, any of these products that we're talking about today, we are really just here to share opinions. We don't have partnerships with these companies. We don't get demo units to review or anything like that.
This is just us. We have decided to purchase the products and see how they work out in our daily lives. And we're just going to share those experiences and those things that we found in the process of using them. There are some really cool options in this kind of audio glasses space or area. I think more and more people are entering this market all the time, which is really awesome, because the more people who are developing these things, the more features we're likely to get.
So there are the Seleste glasses. And that is a, you get the hardware, you get the glasses, and then you pay a software subscription each month for the features that they're offering. There are the ARx vision glasses. There are a couple more out there, and of course, we will have links to all of these things in the show notes.
We do have a couple that we're going to take a deeper dive into today. It turns out that all three of us have this first set that we're going to talk about today, which are the glasses from Meta. The price for these begins at $299. Now, it's important to note that there are differently priced pairs of Meta glasses, but the difference is only cosmetic. Right?
So they all have the same features, but certain lenses, certain styles may cost a little more. I don't know if it's because they're rarer or it costs more to make them, or whatever. But I think it will be interesting to see what each of us decided on for our glasses. It's probably the least important part of it, but it's still fascinating to me.
So I have the Skyler glasses, which are cat's-eye-shaped frames, and I chose the shiny black with blue lenses. When people look at me wearing them indoors, the lenses look clear, but they're labeled as cerulean, so I think they must turn blue outside. I've never actually seen them do it.
Rather than spending a lot of time talking about these, I'm more curious to see what both of you think. Lisa, we'll start with you. Tell us which frames you have and then just some things you really appreciated about the glasses, and maybe something you thought they would do, but it turns out they really don't do that well.
Lisa Salinger: So I got the Wayfarer glasses, and some people seem to simplify things and say, "Wayfarers are for men and Skylers are for women." Not necessarily. It seems that people are choosing the Wayfarer, people of both genders, because they are kind of square glasses. My face tends to be a little bit round. And so, I thought that the square shape would better complement my face shape.
I got matte black with graphite, which is like a gray. Apparently, none of the colors out there are really strong colors; they're all pretty muted. I didn't know that, but I thought I didn't want something that might clash with what I was wearing. So instead of making a style statement, I chose to be a little boring and just blend in.
Goodness, there are so many things I like about the glasses. I really didn't think I would care about hands-free, because I could use a lot of the scanning apps with my phone pretty well. Well, it turns out I really like hands-free. It's super convenient. You don't have to juggle. It's just easier. And if you know you're going to be asking about your environment, you can put the glasses on, and then you're just kind of speaking. You don't have to fish out your phone or do this or do that. It's very simple.
Ricky Enger: So what you're doing is, just to describe things a little bit, there is a camera in your glasses, and you're able to ask questions that are then answered by audio. Right?
Lisa Salinger: Right. One of the best applications I found for this is identifying food. All my cabinets in my apartment are very high and I am short. I could leave my cabinet doors open and not even walk into them. So when I want cans, I've got to get on a step stool and get them down. I usually would take a bunch over to the table, read them with one app or another, and then label them or organize them and put them back.
But with the Meta glasses, I can just stand on the stool and pick one up, and say, "Tell me what this is. Tell me what this is." And it's just really fast. I appreciate the fact that it will describe your area. It will tell you, for example, if you're in a parking lot, what signs it can see. One thing that many people don't seem to like about the glasses is that the responses are very short. So if I ask it to describe my living room, it's going to say there's a chair, a love seat, an entertainment center, a desk.
But I really like that, because then I can drill down. I can say, for example, "Is the TV on?" It sounds like a silly thing to ask glasses, but some of the smart TVs can be in this suspended state where it's on, but you don't know it. So that can be very helpful. You can drill down. You could ask, "What color is the love seat? What colors are the pillows on the love seat?" That kind of thing.
The thing I don't like, which seems like a little bit of privacy overkill, is that it won't describe people. Also, it won't usually read bottles of medicine. One time I got it to identify a bottle of Tylenol. But usually, you can't get it to do that.
Ricky Enger: Yeah. I guess it's a little hesitant to possibly tell you something that has to do with something medical, because it's afraid, "Oh, the AI may not have gotten that right." And boy, would that be a bad time. I too am not a big fan of the fact that it can't describe people. One of the first things I did with the glasses, and I don't really know why, I think I just wanted a better description of them on my face. So I stood in front of the mirror and asked it to describe what it saw, and it wouldn't do it. So it couldn't describe the shape of my glasses and the color of them because it was on a human face, and it didn't want to give info about that.
Lisa Salinger: We probably should mention that if you do not have a smartphone, say you have a BlindShell or a flip phone of some kind rather than an iPhone or Android, these are not going to work, because you do need an app. I don't know if this will change in the future.
Ricky Enger: Yes. Great point. The Meta View app is required for interfacing with these glasses.
Lisa Salinger: One thing that made me very happy, though, is that Facebook is not required. Even though Meta is Facebook, you do not need a Facebook account.
Ricky Enger: Exactly. So, Doug, how about you? What frames did you get, and what lenses? I'm asking because I think you actually had a bit of a journey with this, didn't you?
Douglas Walker: I did. I went around and around with this. So I too have the black Wayfarer glasses, and I’m told that's just your traditional Ray-Ban sunglasses. And you're right. At first, I purchased the dark green sunglasses, but it didn't take me long to realize I made a mistake on that, because I do have some remaining vision, and I really wanted to be able to see through them and use them inside, and they were the sunglasses.
So these were just way too dark for me to use when I was indoors. So I sent those back and ended up getting the black Wayfarer glasses with the transition lenses instead. That way, I can use them both inside as well as outside when I want sunglasses on, and I love them. Absolutely, I love them. I think we all do. I use them a lot.
Of course, they take great pictures and videos, which is nice, and I love to listen to my music and my podcasts through those. I mean, they have surprisingly good speakers on them. I've been impressed with that. Like Lisa, I use them for recognizing objects and items around the house too, and I like the short descriptions it gives as well.
When I'm in a store, I'll use them to read price tags and of course, to describe the different products, and to even look at the sizes on clothing. I've used it a lot for that. Yeah. But probably my favorite thing that I use them for is for scanning a menu in a restaurant. To me, that's just the best. I love doing that. Saying, "What's the chicken on the menu?" and it jumps right to that. That's really nice.
I also love that they changed the prompt recently to have your glasses take a picture and then have it described to you. You used to have to say, "Hey, Meta," before you gave your prompt. Now, you can just say different phrases like, "What am I holding?" or "What am I looking at?" or "What's out the window?" or "What do you see on the TV?" It doesn't describe the background; it just narrows down to what you're holding if you ask that. So it's really nice that they've changed that prompt. It just makes it a lot more conversational.
Now, something I wish it would do better is read an entire document. I've tried all kinds of prompts, but it always seems to want to summarize that page instead of reading the whole page. So I really wish it would do a better job at that, but you never know. Maybe it'll get better at reading an entire page in a future update, because they are changing things. So yeah, there you go.
Ricky Enger: And they really are improving things all the time, and I think where we started is not where we'll be even in six months or so. This actually makes a really good transition for me. I won't talk too much about the Meta glasses, because I think you all have done a really great job covering that. But one of my favorite things is the ability to listen to music or listen to what's around me with my ears open, and still have that really good audio coming from the glasses. But one of the things I am not the biggest fan of is you can't get them to read something from top to bottom or read it verbatim.
The glasses that I want to talk about are the Envision glasses. These are about $2,500, and they were around well before Meta came out. The thing about these is that they are made specifically for blind and low-vision people. So certain features have really been thought about in a way that Meta just has not yet.
One of those features is, "How do I read what's in front of me?" So, while Meta may give you a summary of the text, Envision, if you use the Scan Text feature, will read that text exactly, because it is using a different technology than generative AI to figure it out. It's using optical character recognition, or OCR. So it's reading what is really there instead of guessing at what might be there.
It will also guide you in how to position your head. It will say, “You might want to move the document up so we can get a better view of it,” or move it to the right or whatever. That's very nice. There are features that Envision has that Meta doesn't have yet.
For example, the ability to call Aira in a straightforward way. Aira is the service where a paid, trained agent looks through your camera and tells you what's visually there, and it's a really useful service. It's nice to have it in a head-mounted device. Having said that, Meta is working on partnerships. They've announced a partnership with Be My Eyes, whose volunteers will do much the same thing: look through the camera and answer whatever questions you have. And Aira now has a beta that lets you do the same. It's not quite as straightforward, but it exists, and we can only assume things will get better over time.
The last thing I want to mention, and we haven't touched on this so much, is that as good as the AI is in the Meta glasses, it is still AI, which means that it can be prone to what we call hallucinations, where it is, again, guessing at things. So, it may tell you that, "Oh, yeah. There's absolutely chicken à la king on the menu," when in fact there isn't. So that's just one of those things about AI that we have to keep in mind and be aware of.
Lisa, you told a funny story. Why don't you talk about how Meta recognizes currency?
Lisa Salinger: Well, usually, it does pretty well, but it did tell a friend of mine that he was holding a $7 bill. It's not always clear if it's giving you information from what's in front of you or from the cloud or the great beyond. I had a box of Boca Burgers, which are like veggie burgers. And so, I was curious, and I pulled them out, and I asked the glasses what they were, and it said, "A box of burgers." And I said, "How much protein is in one?" And they said, "The average burger contains between 11 and 14 grams of protein."
I thought, "That's not usually how package directions work." So I flipped it over, and I asked it again what it was, and it said, "This is a box of Boca Burgers." So this time, it gave me the brand. And when I said, "How much protein is in one?" it said, "One Boca Burger contains 14 grams." So you really kind of have to keep your ear on the context and how you get from one question to another to know where your information is coming from.
Ricky Enger: Yeah. Great point. There is so much to talk about here, and we've covered a lot, but there's still more to cover. I'm wondering, though, is there any last thing that people should consider if they're thinking about getting one of these smart glasses? Anything people should know? Doug, we'll start with you.
Douglas Walker: Yeah. And I think, Ricky, you mentioned this earlier, but it's important to say again: we are still in the early days for wearables like these. We're at the very beginning stages. So even though they may not do everything you want them to do yet, they're only going to get better over time. For me, the future of wearables like this is super exciting. I love having my hands free, like we've talked about, and I can't wait to see how these continue to evolve for us.
Ricky Enger: I think you're right. I think it can only get better. And the more people who decide to jump into the market, well, the more options you have. Right? Lisa, how about you?
Lisa Salinger: One question I've gotten a lot is, "When should I take the plunge? It seems like they're always coming out with new stuff." Well, I think it's just Murphy's Law of new stuff: when you get something, something that seems better will come out the next day or the next week. Focus instead on, "Is this going to help me? Is this going to do what I need it to do?" If your answer is yes, then it might be time to take the plunge.
The other thing with all of these: we talked about the $7 bill, and I've heard lots of crazy things that AI says. I really think you have to consider this as sort of an adventure or a game and approach it with a sense of playful curiosity. "This response didn't work. How can I rephrase it to get the information I need?" Or maybe it's, "I always forget to turn on the lights. No wonder it's giving me such horrible results, because my room is almost pitch-black." So I think it's just important to have that spirit about it, where you're learning.
Ricky Enger: That is absolutely true. I think it can certainly cut down on a lot of frustration if maybe you expect bumps in the road, and you look at those as opportunities to exercise that curiosity or that creativity like when you were talking about the medicine bottle. You eventually figured your way around that and made it identify something for you. I think what I would say about all of these wearables is really the same thing you all have said.
We are just getting started with these, and the future is going to be really, really interesting. I know that Meta, for example, was never thinking about people who are blind or low vision using their product. They designed them for streaming video to Facebook or Instagram, which you certainly can do, by the way, or for recording really good video, or taking pictures. They never thought the AI in these would be used for anything other than a traditional smart assistant, or to help you come up with a creative caption for your Instagram post.
Now that they see people are using these in really interesting ways, I think they're open to more partnerships and to finding ways these can be useful beyond what they initially intended. And for the glasses that are being made specifically with blind and low-vision people in mind, more and more technology is becoming available, meaning the set of features these glasses can offer is only going to grow, and the hardware will keep getting lighter, faster, cheaper, and better. I think it's exciting times for sure.
Well, I want to thank you both so much for stopping by and chatting about glasses with us. We will have links to all of these in the show notes, and we'll also encourage you if you have questions like, "Can the glasses do this? Will the glasses do that?" We're happy to chat with you and answer any questions that you might have. So, contact us at Hadley, and we'll be glad to walk you through some things.
That’s going to do it for the conversation portion of today’s episode but if you’d like to hear a couple of these smart glasses in action, stick around for a short demo.
Ricky Enger: After all the great discussion about Meta and Envision glasses, let's take them for a spin in some real-world situations and see how they work.
I’ve come up with three scenarios that people are likely to try and use these glasses in, so let’s see how they perform with those tasks.
First up, we’ll read some text, and we’ll start with Meta. Hey, Meta. Read what’s in front of me.
Meta: It appears to be a coupon or advertisement for Valpack.com. Offering up to 0% off on certain products. The ad features an image of various photo frames and suggests that it’s easy to customize them with your own photos.
There’s also text encouraging you to visit the website Valpack.com for more savings opportunities.
Ricky Enger: Hmmm. Up to 0%. Something tells me Meta didn’t read that text exactly. So let’s see if we can get it to do that.
Hey Meta, read what’s in front of me verbatim.
Meta: Valpack more savings inside.
Ricky Enger: I feel like we’re still missing a lot of text. But Meta did give us enough info to tell what this piece of mail is and what we might want to do with it.
Alright, let’s give Envision a shot. We’ll use the scan text feature which is how Envision processes documents.
Envision: Scan Text. Move document left. Reader. Shutterfly. Don’t miss the magic. Make it extra easy to customize cards and gifts just in the nick of time. Up to 50%. See inside for details. Two free x10 prints.
Like or savings at Valpack. Call Valpack or Valpack.com or registered trademarks Valpack direct marketing systems. LLC, copyright 2024, direct stop.
Visual element detected. Alt text for the visual element is as following, a collection of personal items including a photo album, keychain, a mug, and greeting cards. Each item features photographs of people. End of visual element description.
Ricky Enger: So Envision gave us exact text as well as a description of the image in front of the envelope. All right. Next up, let's see how our glasses do with identifying products. Hey, Meta, look and tell me what product I'm holding.
Meta: You are holding a can of Campbell's roasted garlic mushroom soup.
Ricky Enger: Nice. That took no time at all. And this is actually one of the most challenging things to get right, because cans are round. And traditionally, they are difficult for text readers to make sense of. Let's see how Envision does. Since Envision doesn't have a specific product identification feature, we're going to use the short text feature to see if we can look at what's on the label.
Envision: Instant Text. P-B-L-L, Cream of Mushroom with Roasted Garlic Soup.
Ricky Enger: And what we see is that Envision, because it relies on more traditional text recognition, has trouble reading the rounded can, but it still gives us just enough to figure out what we're holding. And finally, let's take our glasses outside and see how they do with describing a scene. It's a lovely sunny autumn afternoon, and I'm really curious whether the leaves have begun to change color in my backyard. Hey, Meta, look and tell me what you see.
Meta: I see a wooden deck or porch surrounded by trees and foliage. There is an enclosed section with screens, possibly leading into the house.
Ricky Enger: Are the leaves changing colors from green to something else?
Meta: Yes. The leaves appear to be changing colors from green to yellow, and possibly other shades of autumnal hues.
Ricky Enger: And now let's find out if Envision says something different. It will give a very short description first, so we'll have to continue asking it questions.
Envision: Describe Scene. Screened porch overlooking a grassy backyard between a wooden fence.
Ricky Enger: Describe the trees in detail, including their colors.
Envision: The image shows trees with a variety of green hues. Some of the leaves are light green, indicating new growth, while others have a darker green shade. There are touches of yellow in some leaves, suggesting seasonal changes. The trees are lush and full, creating a dense foliage background.
Ricky Enger: Got something to say? Share your thoughts about this episode of Hadley Presents or make suggestions for future episodes. We'd love to hear from you. Send us an email at [email protected]. That's [email protected] or leave us a message at 847-784-2870. Thanks for listening.