Whether you're navigating a new environment, reading a menu, or shopping, an AI app or gadget may be just the ticket to help if you've lost some vision. But which one is the best one? In this episode, we share practical tips on which tools we turn to for different tasks. We'll share our experiences with Seeing AI, Be My Eyes, Aira, and Meta glasses in particular.
Resources mentioned in this podcast:
Be My Eyes
Seeing AI
Aira
Meta Glasses
Hadley
Artificial Intelligence (AI) and Vision Loss
Presented by Ricky Enger
Ricky Enger: When you're using technology to get things done, it can be hard to know which tool to pull out of the toolbox. In this episode, Steve Kelley and Eric Boklage join us to share tips on choosing the right tool for the task. I'm Ricky Enger and this is Hadley Presents. Welcome to the show.
Steve Kelley: Hey Ricky.
Eric Boklage: Hi Ricky. Hi Steve. How are you?
Ricky Enger: It is wonderful to have you both back in the same virtual room. We did a podcast a bit ago and talked about those new services and vision professionals, things that you should know about if you are new to vision loss. And wow, did we ever cover a lot in that one. It's certainly worth going back to listen to, and this time I'm happy to have you both back to talk about technology. It's a place that we all sort of love to be and could talk about for hours. So, we're really going to have to struggle not to do that. But before we get into that, let's just get a brief intro from each of you. So, Steve, we'll start with you.
Steve Kelley: Thanks. I'm one of the practical help specialists here at Hadley and I do several discussion groups.
Ricky Enger: And Eric, your intro is going to be a little different this time than it was in your previous podcast. Some things have changed for you. So, at that time you were on the way to becoming a vision loss professional, but perhaps I shouldn't steal your thunder. Go ahead Eric.
Eric Boklage: Thank you, Ricky, for those kind words. So yes, a lot has changed. I've recently acquired both my CVRT (Certified Vision Rehabilitation Therapist) and CATIS (Certified Assistive Technology Instructional Specialist) certifications. I now work here at Second Sense, which is a vision rehab agency here in metro Chicago. And I get to spend my day helping people learn how to use the technology that helps them thrive in their normal daily lives.
Ricky Enger: As we were talking about putting this podcast together, we were kind of going through the tools that either we use personally or that come up as we're talking to others, because I think we're all in that unique position of being on both sides of the desk, so to speak. So, we all work with people who have vision loss, and we are low vision or blind ourselves. We wanted to go through a lot of these tools that we either use ourselves or tell people about often and figure out which one to use at any given time. But I think what helps to get that conversation started is to actually give a brief description of what each of these tools actually is. So now we get to practice our elevator pitch and see just how concise and accurate we can be as we describe each of these things. Steve, you get to start off with Seeing AI, our first tool in the toolbox. What is that?
Steve Kelley: Seeing AI is an app made by Microsoft. It's available for both Android and the iPhone or iPad. Basically, it will read what it sees; not only printed text, but also labels, and it does some object identification based on artificial intelligence (AI). It's super handy.
Ricky Enger: And Eric, you have a tool and that is Be My Eyes. Off you go.
Eric Boklage: Thank you. Yes, Be My Eyes is also an app that's available on both the Android and Apple platforms. It uses the camera on the phone to allow you to interact with what's around you. Initially it was all about calling somebody for help. Since then, they've added an AI function too.
Ricky Enger: Yes, for sure. And we actually have a podcast on Be My Eyes, and we have workshops teaching you how to use Seeing AI. So, we've got those resources for you. And next up is my tool to talk about, and that is Aira. Aira is also an option that can use your camera, and it works a little similarly to Be My Eyes in that you are borrowing someone's eyes, but the difference is that these are trained professionals as opposed to volunteers who might be answering your call on a lunch break. These trained professionals are able to assist you with navigation or anything else you can think of. It also has an AI component to describe images that you send to it. The human portion of Aira is a paid service and the AI portion is free. Alright. And our next tool is Meta glasses. And my goodness, do we ever get a lot of calls about these. Steve, I'm going to throw this to you first because you've recently gotten a pair of these, and so you get to speak from a newbie perspective. What are these and what do they do?
Steve Kelley: Yeah, essentially they're Ray-Ban glasses, and you can have your prescription lenses put in, or in my case it's a nice pair of sunglasses. They have a camera on the right-hand side and some great speakers inside. They were designed by Meta for Facebook users to take pictures of what is going on around them to put on Facebook, and to listen to various things. They weren't really designed for folks with vision loss, although we of course discovered them and said, hey, the artificial intelligence on these can do things like describe a scene in front of us. So, they can be really handy in that regard. And in terms of price, they're far less expensive than some of the other smart glasses that we've used.
Ricky Enger: Right. And in addition to those features, I appreciate the fact that they are open ear, so you can listen to whatever you're listening to, whether it is navigation instructions or just music as you're strolling along, and still hear what's going on in your environment. I also appreciate that you can access both Be My Eyes and Aira from these glasses. So, it gives you a hands-free option to take advantage of those services as well. Eric, is there anything that we didn't cover about Meta glasses? I know you're fairly new to them as well, but maybe you've had a client or two mention something they really appreciate about them.
Eric Boklage: I think you hit on it when you said hands free. The other three tools we talked about are ones that are primarily used through a cell phone. And the fact that you can take the Meta glasses and utilize either Be My Eyes or Aira through the camera that's built into the glasses, instead of having to hold your phone, that sort of hands-free activity really helps. So, if you're trying to examine something, you don't have to hold the phone and hold whatever it is you're trying to examine all at the same time. It almost acts like a multiplier effect, if you will, for the Be My Eyes and the Aira apps.
Ricky Enger: Before we go into specific things, I'm wondering if there's a general rule of thumb that you tell people, and believe me, I'm listening carefully because I want to know this. How do people think in general about which tool to choose? Are there criteria that you can mention that might help in that decision-making process?
Eric Boklage: I think one of the key things is that it's user dependent. Each person's vision loss is their own, and how they need support and assistance is different from the next person. And that's not just a function of whether they still have vision or not, but all sorts of variations within that, right? So, what I recommend to my clients is that they use each of these in different environments so that they can see how well they work for them, and then use that experience of their own to help define which tool they're going to use in what scenario. Sometimes people find one interface easier to work with than another, or they may find that their day-to-day needs require something different. I have them explore each of these when they can and make a decision from there.
Ricky Enger: I really like how you put that, because we all have these wonderful conversations with those of you listening, and it tends to go something like "what is the best," followed by whatever question it is. And when the question comes up that way, it implies that there is one right answer, and there really isn't. There's one right answer for you. And so that's why I think it's helpful to explore this, both figuring out what these tools are and then doing exactly what Eric said, trying them in your own environment and seeing what you prefer. So now we are at the point where I've just put together a list of things that we all do every day, and I want to talk about where these tools might fit with each of those scenarios. So here is one of my least favorite jobs: sorting the mail. Just a brief tip from each of you, which tool might you use for this, and why did you choose it?
Steve Kelley: My go-to is a 5X Bausch and Lomb slide magnifier that's about the size of a silver dollar. I do not leave the house without it; it's always with me because I use it for everything. That said, however, there are a lot of times when I pick up a piece of mail and it just doesn't do the trick. The next thing is I take the Android phone, open the camera, and just pinch to make it a little bit larger and read it that way. The last thing I'll do is go to Seeing AI or Google Lookout and have it read it to me.
Ricky Enger: Eric, what about you?
Eric Boklage: I really like being able to use the Seeing AI read function. It will just read any kind of print that's in a short, concise form.
Ricky Enger: And that is my answer as well. As someone who has no vision, I'm going to reach for Seeing AI every time because it does exactly what I want it to do, which is read the text precisely as it appears on that envelope. Now if I'm in a hurry, I might say to the Meta glasses, you know, hey Meta, read what's in front of me, and it will give a summary as opposed to reading that precise bit of text. And the same thing is true for Be My Eyes. So, Seeing AI wins this one for me.
Eric Boklage: The thing that I would throw out there, though, is that it will also read script. So I know this is spring as opposed to the holiday season, but if it's the holiday season and people are getting holiday cards, as an example, or things of that nature that are handwritten or hand addressed, if the writing's neat enough it can actually be identified by Seeing AI.
Ricky Enger: And even if it's not: my mom has notoriously bad handwriting and Seeing AI still gets it. So okay, our next little scenario is reading an important document. Maybe I have to sign to claim that Publishers Clearing House prize, or whatever other scam, or it could be something legitimate. What are we going to use to read that important document? Does your answer change from just sorting the mail?
Eric Boklage: I will start off by saying that I think Seeing AI is actually the tool to do that, because it's got an OCR (optical character recognition) component built right into it. And it will also help guide you: you hold the camera or the phone over the document and it'll guide you on how far up you need to go in order to get all four corners in the picture and which way you might need to shift your hands. And then once you've got the full document, it's right there for you to read. And that includes multiple pages too.
Ricky Enger: And you can save it as well, which is really nice.
Eric Boklage: Exactly. It guides you through the process and helps you capture the entire thing with very good precision, because if it can only see the left half of the page, it's only going to read the left half of the page. So, it wants to guide you to get all four corners first.
Ricky Enger: That's right. Steve, what about you? Are you still going to be using your little handheld or are you going to use something else to read a larger page?
Steve Kelley: So, what I'll do is pull out the Bausch and Lomb, I'll look at the first word, second word, then I'll move over to different lighting and try to get the third word. And then finally I'll do exactly what Eric suggests and pull out Seeing AI. But I've got to go through the other steps first just to prove to myself that I really can't read this.
Ricky Enger: And that's going to be a very relatable statement to many, many people I think. Okay, so what about if you dropped something? There are techniques that you can use and in fact we have a workshop on how to find things that you've dropped. But let's say you've tried those things. You've kind of gotten down on the floor right in place and you're using that grid pattern to find where it is and if it bounced somewhere. What tool are you going to use? And I guess I should say, what tech tool are you going to use to find that thing?
Steve Kelley: You know, I do use the grid pattern, and I think at this point I'm accustomed to it, so I'll systematically look in place. The reality is, if it gets to that point, I will ask a human being. That said, however, I have used Be My Eyes on occasion and found that it was super helpful, and I'll probably use it a little bit more in the future just because it's really good at that: having somebody look through the camera and just give you some cardinal directions.
Ricky Enger: And that is my answer as well, getting a human involved, and Be My Eyes is the perfect way to do that. It's free, and I think, speaking for myself anyway and perhaps you as well, it can feel a little awkward. Like, I don't want someone to help me find this thing; I'm the one who lost it, I should figure it out, or whatever it is. What I will say is, having spoken to a number of volunteers at Be My Eyes, they've put the app on the phone knowing two things. One, I can answer that call when I have time; I'm not obligated to. And two, this is really going to help someone when I do pick up that phone and do this brief little thing, and it makes me feel good too. So, for those people who have volunteered, it's not out of a sense of obligation. They genuinely want to help, and it can sometimes be just that quick 30-second call: oh, here's the earring, now we move on with our day. It's efficient and everyone feels good at the end of it.
Eric Boklage: You know, hearing you both talk, you're talking about calling a volunteer or in the case of Aira calling one of their representatives so that a human being is using the camera on the phone. But also, on Seeing AI, for example, there is a Find My Things channel and that can be programmed to look for something that you find really important. If you've got a small pocketbook that holds all of your credit cards and your main information, you can train it to look for just that thing. You can then use the camera to pan around the room, you know it's in that room somewhere and it will look for that item too.
Ricky Enger: Yes, you can train that to look for it and it's a wonderful feature.
Steve Kelley: You know what's so interesting about this: I think that for those of us who are in relationships or families with people who have better vision, it can be really challenging as the person asking that family member repeatedly to find something or to assist, and it can get wearing on them as well.
Ricky Enger: Yeah. And we're all works in progress when it comes to that kind of thing.
Steve Kelley: You can say that again.
Ricky Enger: But I do think it helps to have these discussions, not only to bring it out into the open that maybe it is wearing on the people around you if you do it too often, but also to acknowledge, well, if that's the case, what other options do I have? And so that's why we want to explore these things. So, here's another one that is something I struggle with, and this is probably the one where I will ask family members more often than anything else: what is this mystery box in the pantry or the mystery bottle in the fridge? Is there a good way that the two of you have for doing this, using maybe one of these tech tools?
Eric Boklage: I do, and it's actually a two-part answer. A lot of those products have either barcodes or QR codes on them that you can scan with the product identifier in Seeing AI. It will read that, and if it can't, it'll tell you. At that point you can just switch channels and use the read function, and it will read whatever is on that particular container or box.
Ricky Enger: That's right. And Seeing AI wins again, because this is what I do when I feel like I have a little more than 10 seconds, because sometimes it is a challenge to find that barcode or to read the text on that round can. And this again goes back to efficiency. Do I opt to do this with AI or optical character recognition or what have you, or do I involve a human? Now, the one thing that I do use is the Meta glasses, and this is something they're surprisingly good at for a tool that was not built for this purpose. They will use a combination of looking: maybe they glimpse a part of the logo and a bit of the text, and they will put that together to make a pretty good guess at what this product is that you're holding. So that would be just saying, hey Meta, what's in front of me, or what product am I holding? And the fact that it's hands-free makes that a really nice option as well.
Steve Kelley: You know, over the years I've used a number of different code readers, and like you, Ricky, I think I found them challenging in the sense that you have to spin the item around a couple of times. You've got to have at least a general idea of where the code is. But they've gotten much faster, and that has made them particularly useful. And like Eric was saying, just with the read function you catch a couple of words and boom, you know it's a can of soup and it's tomato soup.
Ricky Enger: So, once you have figured out what this product is, now you have to figure out how to use it. So, product directions, and I'll take this one first. This one is definitely one where I want a human, because while Seeing AI can, theoretically anyway, read the product directions (if you have scanned that code, it has a more info area, and assuming that the manufacturer has included this information, you can read the product directions right there), if that's not available, I would much rather have the assistance of a human through Be My Eyes or Aira telling me, number one, where the directions are. Sometimes they're on the side, sometimes they're on the back, sometimes they're on the front. And also, they can be written in columns, or there's additional text that you pick up as you're trying to read the directions. So for efficiency's sake, or because I'm really hungry and I want to cook this thing now, I'm going to enlist the assistance of a human. How about the two of you?
Steve Kelley: You know, Ricky, that's so interesting because you bring up a really good point, especially with the column stuff. So, for example, when I'm looking at a product, I can generally get an idea of where I'm going to find either the cooking instructions or the ingredients, and of course what I'm looking for is the cooking instructions. It used to be that I would just use the camera to enlarge it, but that was so slow. This is where Seeing AI comes in, and Google Lookout for me because I'm an Android user. I've been using that primarily for product instructions for quite some time, because you snap a picture and it begins reading it back to you. There are times when it's like, oh gee, I did get the ingredients instead of the instructions, or vice versa. So yeah, moving forward, it might be helpful to use something like Be My Eyes as a first step.
Eric Boklage: I would tend to piggyback on your point about the way the materials are printed on the packaging affecting what you end up hearing read to you. I've seen a couple of scenarios, and columns are one, but let's face it, the packaging is designed by people in marketing. They're trying to get your attention, right? So certain words are going to be in a larger font than others, and you're trying to read a specific set of instructional pieces of information. And a lot of times these days those instructions will also include, I don't know if this is the right term, pictographs, where they try to convey as much information as possible through a little stencil type of picture. That can be very difficult and awkward for any tool that's AI driven to pick up and understand. I tend to agree with you that if it's something where you need to follow very specific instructions, you're better off getting some human advice to go along with whatever you might be able to see through the camera view of the AI.
Ricky Enger: Makes a lot of sense. Alright, so now we've decided I took too long figuring out how to cook this thing in my pantry. I just want to go out to eat instead. So now there's the issue of reading a menu and it is worth mentioning that a lot of menus can be viewed online or you can use apps like Uber Eats and DoorDash that do delivery as a part of that. They have the menus listed there in their apps. So, you can read it in whatever way is comfortable on your phone. But let's say that for whatever reason you can't do that, you're in a restaurant, there's a printed menu in front of you, what do you do?
Eric Boklage: I often recommend the magnifier app on the phone, because you can modify the amount of magnification you need and you can turn on the light on the back of the phone as a flashlight. A lot of restaurants are dimly lit, and that gives you additional lighting. And if you happen to be someone who needs white print on a black background for ease of use, you can apply those sorts of filters, or even color filters, using magnifier apps.
Ricky Enger: I want to talk a little bit about shopping. For grocery shopping we do offer a number of tips in our podcast about that, and we'll have that in the show notes. I'm wondering, if you're not using an app like Instacart or Shipt to have those things delivered, or to place your order and then get transportation to pick that up, maybe you're just going in for a couple of things. Is that where you would typically use your magnifier, or is there another tool that you think works if you kind of have a general idea of, hey, I know where this is, but I just need to make sure I grab the right thing? What are you going to use?
Steve Kelley: You know, generally speaking, I'm grocery shopping with somebody and if I'm not, if I'm just getting dropped off to pick up a few things, I always say, okay, well you know, give me some extra time. Because it's true, it's going to take a little extra time, especially if the store is unfamiliar. But I'm just remembering one time when I was there looking for a specific type of cheese for something that we were going to make. I was by myself and thank goodness the dental hygienist was walking down the aisle. She saw me in the store because I had, you know, both doors open and I've got my face pressed up there and I'm looking around and she's like, oh Steve, how are you? What are you looking for? I was like, I'm looking for the sliced cheddar, blah blah blah. And she goes, oh, it's right here. And I'm like bingo, that's what I needed. I'm out of here. But the magnifying glass or stuff like that, even the monocular, is not really helpful. What's helpful sometimes is when you're trying to compare something. Let's say you've got a couple cans of beans or you're checking ingredients, yeah, then Seeing AI is particularly good. But if the store is unfamiliar, there will be a point when I'll just need to go up to customer service and get sighted assistance if I'm not going with a friend just to get me pointed in the right direction. But I think that can be a really tricky sort of environment without having one of the store assistants help out.
Ricky Enger: Here's how the Meta glasses help me, and this is what I would use in a scenario like this. I would never go shopping for, say, a full grocery haul just walking around the store and doing all of that, because it isn't going to be an efficient use of my time. I'll be able to do that better with an app. But if I'm going to pick up a couple of things, I am going to use my Meta glasses, and it will either be grabbing that product off the shelf, kind of turning away a little bit and looking down the aisle but having it in front of me, and saying, hey Meta, what is this? Or what am I holding? Or I'm going to take advantage of that hands-free opportunity and again contact Aira or Be My Eyes, and for a grocery store trip I'm probably going to use Aira. The reason is that these are trained agents who are accustomed to giving very specific navigation directions, or to directing where the camera should point in order for them to get the right glimpse of the product. Whereas a volunteer may have the best of intentions, but they're not accustomed to describing, move the camera slightly upward, move it to the right a little; all they know is, yeah, I can't quite see it, maybe bring it closer, that kind of thing. So, for efficiency's sake, that's where the Meta glasses are going to play into this for me. But you do have to be mindful that they're going to catch a lot of info. I did have a person that I was chatting with, and he chose his cereal by standing in front of the aisle and asking Meta, "Tell me from left to right what the cereal boxes are on the shelf at my eye level." And with that very specific prompting, he got exactly what he was after.
Steve Kelley: Just out of curiosity, would you choose Aira or Be My Eyes or something like that over just going through with a shopper's assistant?
Ricky Enger: It's a really good question, and it comes down to the fact that store assistants are often not there exclusively to assist people who are blind or low vision to get around and shop. They're there to answer a question, and they may also be stocking the shelves or doing any number of other things. I genuinely do feel that if I haven't made an arrangement for them to help me, I'm taking time that they haven't budgeted for. Whereas if I'm calling Be My Eyes or Aira, it is a transaction that both parties know they have entered into willingly. So, identifying currency, identifying money, is another one that I have here. And I'm pretty sure that Seeing AI is going to win on this one because it does have a function specifically for this. I will say Meta glasses can do this; at least they've gotten much better at it than they once were. So I have just a couple of others here, and the one I want to touch on, well, two actually: describing things. Whether this is a garment or a picture that someone has sent you, whatever it is where your vision is just not giving you enough information, how are you going to get this object described? Whatever it is.
Steve Kelley: Yeah, I'll take it quickly because, again, that one's kind of an interesting thing, particularly for those of us with low vision, because for the most part I'm going to use magnification. It could be the camera on my phone; I might take a picture of an object and enlarge it or something like that. But I also recognize, at least for me at this stage, sometimes colors are not accurate. So, there are times when a human comes in handy. I will have to say to the human, is that yellow, or is this what I'm looking at? But for the time being, mostly it's magnification.
Eric Boklage: I think the AI tools that are in both Seeing AI and Be My Eyes are good. Both of them have issues with color, primarily due to lighting, because depending upon how bright or how dim the lighting is, brown can come across as black, for example, and things of that nature. But if you're looking for a robust description of a scene or a room or even of a person, I have found that the Be My Eyes description tends to be more detailed and more robust than the Seeing AI one. And both of them give you the option to ask follow-up questions, if you will. But when it comes to things like identifying a specific color or whether or not things match, AI can't really help you to the degree that a human being might be able to.
Steve Kelley: When you said Be My Eyes, were you talking about the AI version of it?
Eric Boklage: Yes. The Be My AI channel of the Be My Eyes app gives you very robust descriptions of something. If you take a picture of it, it will give you a very detailed description, and then you can ask follow-up questions that you can either dictate or type in. And of course, if they're connected to the Meta glasses, then you could probably do the same thing just with your voice. But I find the descriptions on the Be My AI channel of the Be My Eyes app to be a bit more robust and detailed than the describe option that's available in Seeing AI.
Ricky Enger: Yes, it tends to be a bit more brief with Seeing AI than it is with Be My AI. There are two things I love about this, and the first kind of goes back to that feeling of not wanting to impose on people to get some information. If I want to ask Be My AI 68 questions about whatever this thing is that I've got a picture of, I can do that. I can spend as much time as I want, and the AI is not going to get tired of describing it for me. And the other thing I love about this is sometimes you do need that verification. So maybe the AI has described something, and maybe it's an outfit that you're trying to match, and you think it matches, but you can't be absolutely certain that the AI has detected the difference between navy blue and black, for example. And whether it's Be My AI or Aira, which also has an AI component, both of these have the ability to verify with a human. And so, you can call that human right there from the app. It's like, do you want to verify what I just told you? Well here, connect with someone, and you can. Wow, we've really covered a ton of things, both from a practical perspective and, I think, just exploring some of those thought processes that we go through when we're trying to figure out which tool to choose. And that really brings us to: is there anything that we haven't covered? Are there things that you would tell someone if they're listening to this and they've never tried any of these, or maybe they've tried one or two and it just hasn't worked out that well, or whatever it is? If someone is feeling overwhelmed by all of these options, what do you do to kind of reduce the stress of finding the right thing? Is there a good place to start or a good way to approach this initially?
Eric Boklage: Well, I'm going to go back to my comment at the very beginning. It's all going to be person dependent, and I really do recommend to people that they try any and all of these and get a comparison and a feel for how they might work. And they need to do that at a point in time when they're not under a time crunch or under any kind of stress, just to play with it and figure out which one they seem to be most comfortable with. I mean, pick a can of soup out of your pantry and use two or three different tools to see which one gives you what you need to know to be able to make your selections. And then in the future, you will know which one you're more comfortable with.
Ricky Enger: That's a really good point to do this when you're not under a time crunch, if you go into it feeling pressure already, the likelihood that things are going to go downhill is pretty high, right? So, you know, just have that moment of curiosity and, hey, I'm just playing with this and if it doesn't work today, well it doesn't matter. There's nothing at stake. Steve, anything to add?
Steve Kelley: Yeah, I was just going to say, Eric, I think you're so right that it's person dependent, because there are a lot of tools out there and we're all different. So, these tools are going to work differently for different people, and the thing to do is figure out which one works best for you. The last thing I would add is, as I was looking over the list and thinking back through how I've done things in the past, I realized that, as somebody who had pretty decent vision for quite a while, there's a certain amount of this that really touched my self-esteem. I was really reluctant sometimes to try some of this stuff, the handheld magnifier or some of the smartphone stuff, out in public around people at a cash register or whatever. And now, just like that, I'll say, oh, you know, I don't see well, can you read that to me? Can you prompt me or give me some guidance or something like that? It's like, relax, take a breath, we all do these things in different ways and it's okay. Just figure out your own way and don't let it steal your self-esteem. That's the other important thing.
Ricky Enger: Yeah. So very important because no one wants to stand out, we don't want to look different.
Steve Kelley: Right. Different, right.
Ricky Enger: Yeah. And at the end of the day, it's figuring out how to do what you need to do efficiently and still be able to figuratively look at yourself in the mirror and be okay with it all. And I think the final thing I would say is that while Seeing AI did come out ahead in a lot of these things, Be My Eyes was a big winner in others, Meta glasses made their appearance, and magnifiers continue to be here. So, the thing I would say to sum up is that there isn't one right magic tool that, if you just use it, solves all of the practical things that you're struggling with. It's good to know about all of them so that you can pick the right one at any given point.
Eric Boklage: And they continue to evolve these tools and these devices; the AI that's inherent in all of them continues to improve and to build out. So don't let one experience with it cause you to walk away and not use them again. Be a little bit adventurous and check it out again. If it didn't work well before, that doesn't mean it won't work for you now.
Ricky Enger: Yeah. Things are constantly changing. Great point. Well, thank you both so much. This has been such an informative discussion, and I've had a wonderful time. I hope you all have as well. If you're listening and you have questions about any or all of these tools, either where to find information about them or how to get started or you want to talk some of it through, we hope that you will contact us here at Hadley and we're happy to help. Thank you all so much for listening.
Got something to say? Share your thoughts about this episode of Hadley Presents, or make suggestions for future episodes. We'd love to hear from you. Send us an email at [email protected]. That's P-O-D-C-A-S-T @hadleyhelps.org, or leave us a message at 847-784-2870. Thanks for listening.