Artificial Intelligence (AI) has a lot to offer people with vision loss. Whether it's reading menus, describing pictures, or even narrating scenery, AI can make a big difference. This week we chat with Steven Scott, host of the Double Tap podcast, about some of the best AI-powered tools out there… so far. Link to Double Tap on Apple Podcasts.
Hadley
Artificial Intelligence (AI) and Vision Loss: Tools You Should Know
Presented by Ricky Enger
Ricky Enger: Artificial intelligence is everywhere, but how exactly do we access it? And are there specific benefits for people with vision loss? On this episode, podcaster Steven Scott joins us to discuss all things AI. I'm Ricky Enger and this is Hadley Presents. Welcome to the show, Steven. Or welcome back, I should say, since we've had you on before.
Steven Scott: That's right. I must have done something right because you brought me back.
Ricky Enger: We must have done something right because we didn't scare you off the first time. So it's a good thing.
Steven Scott: Everything's worked out for the best.
Ricky Enger: Yes. So for people who may not know who you are, give just a brief intro, tell us who you are and a little bit about what you do when you're not on this particular podcast.
Steven Scott: Yeah, okay. My name's Steven Scott, and I host a show called Double Tap, which is a daily tech show. Ultimately, it's a show which each day picks up on the top tech stories and looks at them from the blindness perspective, but we also talk about the realities of life with blindness, right? We talk about our challenges, our daily lives, sometimes through the tech itself. We talk about all the cool tech that's out there, and we think, "Yeah, okay, that's great, but is it affordable?"
Ricky Enger: Yeah, yeah.
Steven Scott: Those are the kinds of questions a lot of people have in our community, and we've asked them ourselves many times. So we're trying to build a community of people who can come forward and share their ideas. And one thing we really encourage on the show is feedback and conversation. I want people to engage. I'm all for as much conversation as possible, and funnily enough, the one thing I get criticized for the most is encouraging people to have an honest conversation with each other. I think it's really important, especially in today's environment. I think it's more important now than ever.
Ricky Enger: Oh, definitely. So with that in mind, I think you're the perfect person to talk about today's topic, which is artificial intelligence, AI. It sounds so fancy and technical, and if you're not a techie person, it can sound a little bit intimidating just saying this phrase, artificial intelligence: "Oh, that means I have to know a lot about computers or whatever, and this is not for me."
But I think with our discussion today, we're going to prove that wrong. At least that's my hope. Maybe it would help to start out not by talking about what AI is and how you access it or any of that; we'll get into that in just a bit. Maybe it would help to start with what we're actually doing with AI that didn't seem likely with other tools, say, a year ago. Is there something that you're finding that you're using AI for every day?
Steven Scott: So on our show, we've talked a lot about this topic, and it's a topic which generates a lot of controversy, which our show likes to promote. I put out this idea about alt text. So for those who don't know what that is: if you post an image online, there is this feature called alt text for someone who is blind. Most platforms have the option to add alt text, which is short for alternative text. And ultimately, what it allows you to do is add a description of the image onto the image itself so that a blind person will know what it is.
So let's say you go to a coffee shop and you take a picture of a nice cup of coffee that you've just got, and it's maybe got a fancy design of a leaf on it, and you want to post that onto your Instagram or onto your TikTok or whatever. When you post it up there, you can add alt text to say, "This is a lovely cup of coffee sitting on a wood table with a beautiful leaf design on the coffee foam itself." It just includes the blind person. It just says to the blind person, "Hey, you're included, and this is what's going on in this image." What often happens when we as blind people are using our devices with, for example, a screen reader and we swipe past an image is that you get the dreaded "image," and that's all you hear, and you have no idea what it is.
I think it's great that we have alt text, and we certainly needed it for a long time, but artificial intelligence now allows the computer to almost actually see the image itself. What it can do is describe the image for us. So I've been saying on the show, "Wouldn't it be cool if we stopped pushing people all the time to put alt text everywhere and take time out of their day to do that, and instead we as blind people used the tools we have?" We can now actually make our own alt text, or just get the image described for ourselves. That is one area that I think is really interesting, and it's kind of growing, and I see more and more of us using it.
The reason I love it is because if someone takes a picture of a beautiful sunrise or something, I can get so much more out of that image, because the AI can describe it in much more detail than someone with sight ever would. On top of that, because it's artificial intelligence, think of it like a virtual person that you can ask questions. You could ask more questions and say, "Okay, tell me more about that tree you've just talked about. Do you know what kind of tree it is? Are there any animals in the image?" And you can query that and get information back. So suddenly that experience becomes much more 3-D than even sighted people get.
Ricky Enger: Yes. And the virtual person doesn't get tired of answering those questions either.
Steven Scott: No, that's right.
Ricky Enger: So one thing I've noticed, and I suspect many of our audience knew this already if you've had sight before, is the power of getting a picture without any preface, so someone's not saying, "Here comes this picture of a tree and here's why I find it impactful." You have that moment of experiencing it. It used to be with your eyes, and maybe you're not seeing that so well anymore. But I'm able, for example, to get pictures from my friends, and they would've described them previously and been happy to. The difference is that now I can suddenly get a picture from a friend, and she doesn't have to tell me it's a funny picture, because I already know that when the AI says it's a picture of a cat on top of a bookshelf with its front paws hanging over the edge. Now I know why the picture is funny without having that experience conveyed to me by the person who sent it along.
So, I'm having a lot of fun getting these pictures just randomly from friends or family and feeling like I can share in that moment of discovery, and then I can share with them the AI description that I just got of the picture. We end up having really good conversations about, "Oh, it saw something that I didn't notice," or "The AI description did miss one important aspect, which is this." But regardless, it does open up conversations that I wasn't having before.
Steven Scott: And there are so many examples of this. I had a listener get in touch who had an image sent to her in an email, of her as a child on her brother's knee. Her brother was no longer with her, and she had this image of herself. She had never seen it, she has no vision, never had any vision, and she wanted to know about the image. She had no idea, for example, that she was wearing a yellow dress in the picture, because she was very young at the time, so she didn't know anything about this. She was able to query that. She was able to find out what she was wearing, what the background was like, what the environment in the image was like, and it was just amazing the detail she got.
Now, I'm not against alt text. I'm not against people taking the time to write, "This is a cup of coffee with the design of whatever it is, the leaf, on the cup." But the point is we can get so much more information from tools available to us through artificial intelligence. That virtual person approach gives us the access and gives us, I think more importantly, autonomy in all this, some control in this. So it's not just a case of being fed information; we can actually control that information, we can ask questions, we can gauge the information we want: what kind of dress is it in that picture, what kind of hat was I wearing, what color are my eyes? There are probably loads of questions blind people have about themselves that they've never really thought to have answered, and they can get that information now.
As blind people, we can often feel like a burden. We can often feel like a burden on other people, and some of us just accept that and we just do it. Some of us shy away from a lot of social environments for that reason, and certainly from asking questions. When I go to a restaurant, I hate asking the waiter or waitress to read the menu. I can use artificial intelligence to scan that menu and actually query that image in the same way I can query the image of the cup of coffee. I can query a menu to say, "Tell me about the dishes on here that have chicken in them, or tell me about the desserts." That stuff matters.
Ricky Enger: Yes, this is such a great example, especially if you're new to this and you're already struggling with how to do the things you used to do without being a burden on your family or your friends. Then here we are at the restaurant. I had a family member who, every time we would go out to a restaurant and I would say, "Okay, well, what do they have?" would answer, "Oh, well, it's pretty standard fare." What does that even mean? I have nothing to go on, not even the mains or the categories being read. So I know people are dealing with this, and now there's that ability with artificial intelligence to say, "Hey, I want chocolate. Tell me about that."
Steven Scott: Yeah. And the image thing is interesting as well, because we often think about the image description angle from the perspective of sitting at home, when someone sends us an image, or we're on social media and we find an image and we want to investigate it. For example, recently it was a beautiful day and my wife and I had gone to the beach. We'd taken the dogs, and we were sitting at a little picnic table overlooking the beach, overlooking the water. It was absolutely gorgeous.
I just bring my phone out and I snap an image, and I throw it into Be My AI. Suddenly I have this rich description of my environment, and I'm getting so much more than I could even imagine. I've got a little bit of vision, but not enough to really gauge too much of what's going on in front of me. At the beach I'm able to know that there's an ice cream stand, there are loads more dogs than I thought there were, and this is amazing. I'm getting all this information, and I can query it. I can say, "Tell me more about the ice cream stand or the prices on it." And it actually told me the prices, and I'm thinking, "This is amazing. This is so cool."
This is stuff that enables us to be part of our conversations. If you're sitting, and a lot of you will know this, if you have a sighted partner, and it's hard to say this, but it's true, we do feel second best sometimes. We certainly feel second class. It's not the fault of anyone. It's not the fault of your partner. It's not intended by your partner to make you feel that way at all. But we just do because we rely on them to give us information. When you can provide that information, for example, "Hey, the ice cream stand has a sale on, you can get two ice creams." Suddenly you are in possession of information, and they go, "Oh wow, okay, cool, let's do it."
Suddenly you're part of the conversation. You're no longer just picking up the information as you go or listening intently, because the other part of being blind, as we know, is that it's incredibly overloading on our remaining senses, because everything is active. Even though people often say our hearing is better because we're blind, no, it's because we are listening harder. We're using that sense, and we're using our sense of touch and our sense of smell. And we use all of that combined to navigate, to get our way around. We are using three senses just to accommodate one. So it is overloading, and if we can use tools, if we can use technology, if artificial intelligence can just help a little bit amongst all that and actually let us enjoy our environments, what's not to love?
Ricky Enger: Absolutely. For me anyway, AI is giving me access to things I didn't know I didn't know. So maybe I thought before that I had a pretty good idea of the things around me, but just like your example with the ice cream stand, I didn't know there was an ice cream stand on the beach. I didn't even know that was a possibility. So I wouldn't have thought to ask. But with AI giving us those descriptions, suddenly it's amazing just how much more detailed the world is. Again, maybe you're listening and you've been sighted for a long time, and suddenly you're not seeing things as well as you used to and you're missing those details. This is a way to get those things back.
Before we talk about how we are accessing AI and just giving some tools that we're using, is there anything that you have been really surprised by in any way that you are using AI that is maybe not so obvious?
Steven Scott: We did a feature on the show about an app from Honda, the car company. And it was interesting because you would never really put a car company next to blind people unless you're talking about driverless cars, perhaps. But Honda had come up with this idea for an app that allowed people to use the latest in artificial intelligence, the same system that we're talking about when it comes to describing images. The app would take a series of images from your phone. So essentially, as you're driving along a road, you hold your phone up out of the window and the phone will take a number of pictures, snap, snap, snap. You're unaware of it. It's just doing that in the background, and it is then stitching those images together, building a picture, and then relaying that back to you in poetic language. So it's telling you what you're seeing as you're going along.
Suddenly you're in a position where, if something happens on the road or passes by or whatever it might be, you are aware of what's going on. The point is that this is the kind of stuff which I think is great for inclusion and bringing us together. I don't want to just sit in the car or sit on the coach or sit on the train and be unaware of my world. I want to be part of it, but also part of it in my own way, on my own terms.
Ricky Enger: Yeah, you can say, "I don't care about the signs, I care about the scenery or vice versa." You have that choice.
Steven Scott: That's right. And then it goes one step further with the next big iteration of what is called GPT. It all gets very technical at this point, and it is complicated stuff, right? I mean, it's not easy to get your head around, but ultimately, if we just stick with the image description approach, let's take it to the next step. The next thing that's coming is video, live video. So what will happen is the camera will be able to take live video, and the AI will then respond to that. And I've seen some amazing examples, and I think this is where things get really exciting.
So one example recently was from Be My Eyes, which is of course the volunteer-driven app that has an AI component to it as well. At some point in the future, they are going to release a feature where the app will be able to see through the camera and then respond to you in real time. And the examples given include ducks playing in a pond. You query it, you ask the question, "Hey, what are the ducks doing?" And it will give you an audio description of everything that's going on.
Ricky Enger: It's happening in real time, right?
Steven Scott: In real time. So if the duck puts its head under the water, it's going to tell you. If the duck brings its head back out of the water, it's going to tell you. And it's going to do it all with a beautiful voice, and it's going to tell it to you like a story.
My favorite thing from Be My Eyes, which they demoed in a video, was hailing a taxi. So the guy in the video, his name's Andy and he's in London, England. He puts his hand out for one of the traditional black London cabs, and he is asking the app, "Where is the next taxi?" Now, available taxis are evident to sighted people by their yellow lights, which of course we can't see. The app is able to say, "Oh, there's a taxi coming, put your hand out. And the taxi's approaching you, the door is ahead of you." And the guy, who had a guide dog, opens the door of the taxi and gets in, and the camera must have somehow been pointing at the ground, because his guide dog would've gone in front of the camera essentially. And you hear the voice saying, "Oh, beautiful dog."
Ricky Enger: Yes, I love that. And we'll have a link to that video in the show notes because it's such a great example of what is coming. And that's the act of hailing a taxi. What a great practical example of how AI can be useful.
Imagine, and I don't know how many of you have done this, you've gotten something new and you're trying to figure out how it works or how to put it together. So you go to YouTube, and you search for the video, and it starts playing this lovely music, and it keeps playing the lovely music, and there's more lovely music, and it says nothing at all. So it's music with no description. Imagine using AI to get that access and not have to feel that frustration of, "Well, I guess I'm not doing this right now. I guess I need someone else's assistance to figure out what is in this video." If it's 3:00 in the morning and no one else is awake and this is the thing that you want to do, and the only thing standing in your way is having knowledge of what is on this screen, AI will be able to give you that information, which is amazing.
Steven Scott: And I just want to say as well, I know that when we talk about artificial intelligence in a professional capacity or the workplace, we often talk about it in the sense of summarizing an email or perhaps even composing an email. So it can be really useful if you are someone who struggles to write down thoughts and compile them. I must admit, I'm bad at this. I'm really bad at trying to write emails, and being able to use AI to take what I've written, which is very rough notes, and say, "Can you turn this into something legible?"
Ricky Enger: Yes.
Steven Scott: Presentable, with good grammar and good spelling. That really matters, especially the spelling and the grammar. Because look, I have to be honest, since losing more vision, I really am nervous when I'm sending a professional email these days because I'm often thinking, "Is this right?" And I'll sometimes send it to a friend first and they'll say, "Yeah, your grammar was a bit off, or you missed a few capital letters." I'm thinking, "Really? I should have got all that right." And you spend so much time, so much time, going through things letter by letter and character by character using braille or using speech or whatever it might be. Being able to throw this to essentially the virtual person again and say, "Can you check this? Can you rewrite it if necessary, or can you correct my grammar and spelling?" Then it sends it back to you and says, "Here you go." It just gives you confidence in what you're doing. And again, we're doing this without being a burden on anyone else. It's great.
Ricky Enger: Yeah, it's a beautiful, beautiful thing. So not everything is perfect, and we are going to talk about some things to watch out for with AI. But before we jump into that, why don't you give two or three examples of tools that you are using. I have a couple of things in mind as well, but what AI tools are you using? Are you primarily doing this from your phone or a wearable or both? And what apps?
Steven Scott: So I guess it's three for me, which is Be My Eyes or Be My AI through Be My Eyes, which is an app I definitely use a lot. It's also available as a Windows desktop app as well now, which is really useful. It allows you to take screenshots and query and do all the things we've talked about.
In terms of wearables, I'm using my Meta Ray-Ban Glasses. Now I've had mixed responses with this because in the UK we actually don't have official access to the AI component of these glasses. So we have limited assistant functionality, but if you're in the states, you will get full access to it because it's open there. I believe in Canada as well. You can get what's called Meta AI. Now it's a little bit limited, it's not perfect, but it can do things like look and describe, so you can query what's in front of you.
I'll give you an example. Recently we were on a cruise, and we were standing outside just looking out at the water, and there were two cruise ships ahead of us, and I wondered where these ships were coming in from. So I asked my Meta Ray-Ban glasses, "Okay, can you describe the scene for me?" And it explained that here we have this dock and there are two cruise ships. And that's all it said. And I thought, "Okay." I was again able to query it and say, "Can you tell me where the cruise ships are from?" And it was able to tell me that one was from Norway, and one was a Holland America ship. So I was able to get that information just from the glasses, which is amazing.
I mean, that's on top of all the usual things you can do with these smart assistants, like ask the time or check the date and all the things we've kind of become used to. Making calls as well. I mean, it's more than just that. You can make calls, you can do WhatsApp calls, you can even connect to an Aira agent if you have an Aira account, which allows you access to visual interpreters. So there's lots of cool things that they do.
I think the other thing that surprised me a little bit, because weirdly I didn't expect much from it, was Microsoft Copilot. Now, Microsoft Copilot is part of Windows 11. If you have Windows 11 on a computer, you may have heard it as you've been navigating around, or maybe someone's been talking to you about it. And certainly if you've been following anything in the tech press recently, you'll have heard of Copilot. Ultimately, it's exactly what it says: the assistant built into Windows 11. In my day it was called Clippy. Today it's called Copilot. Essentially it is quite good because it's very easy to navigate with a computer screen reader. I can ask questions; I can submit images to it. I mean, again, a lot of the things we've talked about, and I often call it many doors to the same room. Ultimately, the tool you're essentially using is the same one, but we're just using different devices and different ways to access it and query it. But it's essentially the same place.
Ricky Enger: Right, exactly. Go figure, my tools are very similar to yours. I do have the Meta glasses, and I also had purchased the Envision glasses well before this, and those have AI on them as well. The difference between Meta and those is that Envision was designed specifically for people who are blind or have low vision. So there are things that it may do with a bit more thought toward that than the standard Meta AI, where the AI and the glasses were essentially meant to help you caption a photo to post to Instagram, or to let you live stream and then use the AI to sort of write about what you're streaming. It just happens to be useful for us as well. So the Envision glasses are one. Microsoft Seeing AI has an AI component as well, it is in the name, but they have recently added a bit more to it as far as describing things in a room, describing a scene, things like that.
There is an app called PiccyBot, P-I-C-C-Y bot. And this is kind of an interesting one. It offers a blend of AI personalities. They all have their own voices, and they use certain phrases to make the description a bit more poetic or a bit more bubbly or what have you. And it can describe a photo or a video.
So PiccyBot is one, and one thing I've been doing that we haven't mentioned just yet, still with Be My Eyes, is using it for some shopping. I'm able to share a product page from Amazon, for example, and get a description of the product, which is really helpful if you're trying to determine the color of something or get an idea of its design. So those are my tools, but sometimes, no matter how good the tool is, there's the possibility that it can get it wrong. Have you had any instances where the AI will very confidently tell you something which turns out to be absolutely not true?
Steven Scott: Yes, and it was very disappointing, Ricky. I'll tell you, this was a disappointing day for me because I had my heart set on something from a menu, which I had used AI to scan and then give me information about. I asked it to read through the desserts, and it told me about cheesecake. That's what I want. I want traditional cheesecake. I said, "Great." So when the waitress came over, I said, "I'd love the cheesecake, please." And she said, "We don't have cheesecake on the menu." And I'm like, "Well, according to this, you do." And she said, "No, definitely not." And it turned out what the menu actually had was a cheese board; the AI had made the cheesecake up.
Ricky Enger: It just made it up.
Steven Scott: I don't know, it must have seen cheese and said, "Hey, let's just say it's cheesecake." And that's part of the problem at the moment. And it is a very interesting dilemma because I've seen people say, "Well, look, if something is so inaccurate, why would you trust it at all? Because it could just be feeding you any old information."
But the truth is that this is an ongoing process that is being fixed. Bear in mind, some of the models that are being used to give us this information have only really been around for about a year or so. I mean, AI has been around for decades, but the kind of models we're using are fairly new and certainly fairly new to most people. So there's a lot of learning to be done. We're at the very early stages. And of course, this is the worst it will ever be.
There are examples, and some really bad examples, of things like when you mention you're blind in a conversation with an AI and it apologizes to you. And actually, that gets to the heart of the problem: the information inside AI. I mean, this is not some sci-fi contraption dreamt up in a lab. This is information that people have created and put on the internet being soaked up by these systems. These are what they call large language models, which are essentially huge brains that are constantly just reading every single page of the internet. So if there's lots of information online about blind people being poor and needing charity, and lots of opinions, negative opinions, ableist opinions, all that stuff, then that is what it's getting.
Ricky Enger: It inherits those biases. Yeah.
Steven Scott: Exactly. And that of course will apply to race, it will apply to gender, it will apply to sexuality, it'll apply to everything. And that is a challenge for the generations to come that are building this technology, the companies as well that are building this technology. How do you engineer all of that out of that system? There's no easy answer to that one.
Ricky Enger: And what you said was so spot on about why you would trust it if it's going to get things wrong some of the time. I think that's an important conversation to have because it can get things wrong. But having said that, it can get things right as well. And maybe the takeaway is that yes, AI can sometimes get things wrong, but if you have some other method of verifying the information that you've gotten when it's important, then this is a useful tool.
I was going through products in my cabinet and one of them said it was facial lotion and it very much was not. I could tell by the smell of it, this is actually a cleaning chemical, and I wouldn't want to put it on my face. But the point is that I got the AI to tell me something, and then I used some other things that I know to verify that before I put it on my face. So AI is not the answer to everything, at least I don't think so just yet. But it is an incredible tool to put in the toolbox alongside other things that we're already doing.
Steven Scott: I often tell my co-host, Sean, when we talk about these issues or we talk about these challenges, he will say, "Oh, we can't use this app because it's not perfect yet and we can't use this because it's not..." I'm often saying to him, "When an app comes out or a piece of hardware comes out, no one's suggesting that you empty your home of every other gadget and gizmo you have. Or if it's an app, delete every other app on your phone and only use this one." It is exactly what you've said. It is a tool in the toolbox.
Now, the great thing is, if you're looking to dip your toe into the world of artificial intelligence, and you're a little bit nervous about it and concerned about things like misinformation or getting things wrong, this is where tools like Be My Eyes and Aira Access AI are really good. Now, these are free services. Aira Access AI is free, as is Be My AI. The great thing is that you can upload an image through Be My AI and have it queried, but then if you're unsure, you can connect with a volunteer, and that volunteer will check that image for you and say, "Yes, that's correct. That is exactly what you think it is," or "Yes, it is the right color," or whatever it might be. And the same with Access AI, the difference being you're dealing with trained agents.
So I think it's really useful to know that not only are there tools out there that are great to get you into the world of artificial intelligence and get you using it, but you've got that backup in place as well. So, if you're unsure and you've no one else around to ask at 3:00 in the morning, you can get that support from a volunteer.
I do love Be My Eyes with the volunteer factor, and I love Aira as well. But I've got to say, I think it's because there's such a sense of happiness from the people who connect with us on Be My Eyes. They're so pleased to help and I love that. I love that. And they're so willing to help.
A story from before AI: the previous year, I'd gone into a store and bought a card for my wife for her birthday. When I got home, she said, "It's a lovely card. It just happens to say happy anniversary on it." And I said, "Okay, that's not great." And it was because I was being that kind of stubborn guy: "I'll do this myself. No one's going to help me. I don't need help."
The next year, when I'd gone back to get another birthday card, I brought Be My Eyes along, and they pointed me to this beautiful card, which was really nice. They spent ages with me going through the card. And honestly, I actually got emotional at the end of it because I thought, "I've never had that before, and I've never had it on my terms." Because a salesperson, yeah, okay, they'll help, but they don't quite connect in that way. This person knew what the role here was. So AI can add a lot to that and can do a lot with that. But let's not forget the human aspect as well. And it's obviously there for backup.
Ricky Enger: Wow. What a beautiful way to sum that up. I love it. We do, by the way, have a podcast episode with Mike Buckley of Be My Eyes. So if you're listening and thinking, "Hmm, sounds like a neat app, I should figure out what that is," we'll have that in the show notes along with all the other tools and videos and things that we've mentioned here, and that has been a lot. I knew that you and I would have a fantastic time, Steven. Keeping to just this length of time is difficult because I know that we could carry on. But thank you so much for joining us and sharing your thoughts about this. Very cool.
Steven Scott: Listen, anytime. Thank you for having me on. And I've got to say, Hadley is such an amazing, amazing organization, and the work you do to support blind people, people who are coming to sight loss, you do such amazing things, and I've learned a lot from this organization. So thank you for allowing me to be a small part of it.
Ricky Enger: Yes, we appreciate that. Thank you.
Got something to say? Share your thoughts about this episode of Hadley Presents or make suggestions for future episodes. We'd love to hear from you. Send us an email at [email protected]. That's P-O-D-C-A-S-T at HadleyHelps dot O-R-G. Or leave us a message at 847-784-2870. Thanks for listening.
Did you know that veterans are eligible for vision services and equipment through the Veterans Administration even if their vision loss developed many years later and was not as a result of service? Learn more as we chat with a representative of the Hines VA.
Voting can be tricky if you've lost some vision. In this episode, we discuss a variety of ways to cast your ballot, no matter your level of vision.
The Bright Focus Foundation funds research to find cures for macular degeneration and glaucoma, among other conditions. In this episode, we learn about their glaucoma and macular degeneration monthly chats. These sessions with scientists are open to the public and offer insights into the latest breakthroughs, treatments, and promising research on the horizon.
When the doctor says, "there's nothing more I can do for you," what next? Who can help you make the most of your remaining vision and learn how to live more comfortably with vision loss? We break it down for you.
This week we talk to Dave Epstein, the visually impaired creator of the All Terrain Cane. He shares about his life with a progressive eye disease and his love of hiking. These two parts of Dave's life led him to develop his unconventional cane.
When you have vision loss, scams can be even more challenging to avoid. Listen in as we get some tips and tricks from Veronica Lewis who runs a low vision assistive technology website.
Be My Eyes CEO, Mike Buckley, joins us to talk about how this free, smart phone app merges technology and human kindness and how it's now using AI to describe the world in front of you.
Recently retired, David Tatel served for decades on the US Court of Appeals for the D.C. Circuit. He also happens to be visually impaired. In this episode, he and Ricky talk about his recently written memoir, a book about his life as a judge, a husband, a father, a grandfather, and how all of these roles intersect with his experience with vision loss.
This week we talk smartphone tools and when you might want to use the different options. Jennifer Shimon from the Wisconsin Office for the Blind and Visually Impaired joins Ricky.
Sometimes, navigating life with vision loss goes a bit sideways. Things don't always turn out exactly as we've planned, and it can help to just laugh at these strange situations.
We've shared several episodes of listeners' stories, what we're calling vision loss bloopers. Today, Ricky Enger and Doug Walker share some more of these bloopers along with a few of their own.
Prevent Blindness' patient advocacy program empowers people facing vision impairment. Patients learn how to promote change with their physicians, their families, drug companies, and even policy makers.
Ever thought about getting a guide dog? Listen in as we chat with members Jeff Flodin and Ed McDaniel about their experiences with guide dogs and some common misperceptions.
The National Library Service has a free talking book program for anyone in the US with vision loss. Tonia Bickford, an advisor from Michigan's talking books program, joins us to discuss how to get the most out of this free service.
This week we learn more about visual hallucinations that sometimes accompany vision loss, a condition called Charles Bonnet syndrome.
Sometimes vision loss can make us feel less secure. This week we talk about personal safety with Hadley's Chief Program Officer, Ed Haines.
For many living with vision loss, watching TV is less enjoyable as they can't see what's happening on the screen very well anymore.
Audio description fills the void by narrating key visual elements. Listen in as Ricky chats with Hadley member and avid audio description user, Judy Davis.
Listen in to our conversation with Dr. Mondal, a low vision optometrist and professor at the University of Wisconsin. We chat about what to expect from a visit to a low vision specialist and the kind of help they can offer.
Have you listened to Hadley's community-generated audio podcast yet? In this episode, Ricky and Marc Arneson, Hadley's Director of Community, share a few stories from Insights & Sound Bites and discuss how to contribute your own story. Insights & Sound Bites | Hadley
Listen in as artist Chloe Duplessis explains how a degenerative eye disease changed, and didn't change, her life and love of art. "I thought art required sight. I was wrong."
Dr. Judy Box, a Hadley member living with macular degeneration, shares her tips for managing those important conversations with your eye doctor.
In this episode, the Hadley team talks all things gifts. Giving them, getting them, what's on their wish lists, and how vision loss may, or may not, impact these activities.
Friendships often change when one has vision loss. Whether it's adaptations to the activities you enjoy together, asking for help, or turning that help down … there are conversations to be had. Let's tune in as two Hadley members, Eugenia DeReu and Tara Perry, share their experiences with what's changed for them — and what's stayed the same.
Losing some vision can make for shopping challenges. Here are a few mishaps that Hadley members have run into. Have your own to share? Email us at [email protected]
This week we chat with the chief technology officer from Envision as he shares how their free mobile app or camera-enabled glasses can help those with vision loss. It speaks aloud written information, describes surroundings and objects, and even tells you who's nearby.
Lots of questions, concerns, and stereotypes connected to use of the white cane. In this episode, we address several of them from past discussions on the topic.
Listen in as Hadley's Director of Community, Marc Arneson, chats with Hadley members Bill Massey and Gregory Peterson about their participation in Hadley's new Peer-to-Peer program.
To learn if getting a peer connection is for you, call us at 1-800-323-4238.
Listen in as we chat with Ed Haines about getting the most out of our magnifiers.
Listen in as we chat with animal lovers Debbie Worman and Sheri Robinson about the joys and challenges of caring for a pet when you have vision loss.
Listen in as Hadley member, Wendy Spencer Davis, shares why she decided to learn some braille and how it's helping her in everyday life.
Ed McDaniel, a psychologist with low vision, joins us to talk about common emotional triggers people with vision loss face and how to recognize and manage them.
Jessica Grogan from the American Diabetes Association joins us to talk about managing your blood sugar with vision loss.
Tune into our chat with Sarah Clark, a visually impaired marriage and family therapist, as she offers her unique insight into some common family dynamics that often make adjustment more challenging and how to navigate through them successfully.
Join us as we chat with Hadley member, Kris, about her experience living with vision loss in a senior community.
Listen in as Hadley staff share their real-life bloopers—times when things didn’t quite go as planned.
Join us as we take a dive into the features of the BlindShell cell phone.
Listen in as Pastor Scott Himel shares his advice for participating in religious services no matter your level of vision.
Join us to learn about how ScripTalk technology translates medication labels into speech and where you can find a participating pharmacy.
Join occupational therapist from Duke Eye Center, Fay Tripp, in a conversation about bioptic glasses—what they are and who can benefit from them.
Listen in as we chat with birding expert Freya McGregor who shares her tips on how you can enjoy this hobby, no matter your level of vision.
Listen in as Hadley's Doug Walker and Ricky Enger chat about how they use GPS in their daily lives. From walking directions to finding items or assisting a driver by navigating a trip, GPS can be a very handy tool.
Listen in as we chat with Dave Steele about his life, poetry, and vision loss.
Listen in as we discuss some common situations that can make us feel unsafe and share ideas on how to address them. We're joined today by Christy Ray and Ricky Jones of STRIVE4You.Org
Unfortunately, it's not uncommon for feelings of shame to creep in when we've lost some vision. Join social worker Jeff Flodin and psychologist Ed McDaniel, both visually impaired themselves, as they explore where these feelings come from and how they have worked through these emotions in their own lives.
Listen in as Dorrie Rush of OE Magazine shares how she resisted using a white cane for years, the stigma she feared, and the confidence and security she found once it was in her hand.
Learn how CVS pharmacy customers throughout the US can access a free service that reads aloud prescription medication information.
Join us as we chat with author Hannah Fairbairn about the tips and tricks she has learned to take some of the stress out of holiday get-togethers, no matter your vision.
We're joined by the creator of The Blind Life YouTube channel, Sam Seavey. Sam shares his personal journey with vision loss and advice he has for people who are newer to vision loss.
Whether you like to read for enjoyment or need to check your mail, reading is an essential part of your day. We're sharing tips and tricks for how to continue reading, the best low-tech and high-tech gadgets, and the benefits of learning braille.
Chief Innovation Officer Doug Walker chats with us about the launch of Hadley's newest podcast, Insights & Sound Bites. This new podcast will offer short stories shared by listeners. By tapping into the power of our community, we hope to share ideas, discoveries, and moments of inspiration along the journey through vision loss.
Jim Hoxie and Joanna Jones join us to discuss their children’s book, "Grandpa's White Cane." Jim shares how vision loss shaped his life and how he and Joanna, a retired teacher, began instructing children about the importance of white cane awareness and the do's and don'ts for helping people with visual impairment.
Blogger and social worker Jeff Flodin talks about his personal journey with vision loss and how his passion for helping people led him to blog about his experiences.
Hadley has partnered with the National Eye Institute (NEI) to offer a Spanish-language version of our popular cooking workshop series. Devina Fan, director of the National Eye Health Education Program at NEI, joins the podcast to talk more about this new initiative, NEI’s expanding Spanish content, and the importance of connecting Hispanic and Latino communities to important vision resources.
A change in your vision may make some parts of your job more challenging. But with a bit of help and some new skills, you may be able to stay in your job. Hadley Chief Program Officer Ed Haines and Learning Expert Steve Kelley join the podcast to talk about our new Working with Vision Loss workshops and to share tips for where to find support and how to ask for what you need.
Certified accessible travel advocate Melvin Reynolds joins the podcast to share tips for getting the most out of traveling, no matter your level of vision. Melvin gives advice on what to research ahead of a trip, considerations for traveling with a guide dog, and how a certified accessible travel advocate can help.
Karen and Dan Leonetti share how vision loss has changed their relationship and the advice they have for other couples.
Rabbi Lenny Sarko joins us to talk about how his vision loss journey led him to create a first-of-its-kind braille Sefer Torah that people around the country can access.
Actor and artist Bruce Horak talks about his personal journey with vision loss, how he got interested in painting, and his role in the new television series Star Trek: Strange New Worlds.
CEO of Eschenbach Optik of America Ken Bradley joins the podcast to discuss how Eschenbach has adapted through the pandemic to help people with visual impairment access low vision devices remotely. Through their "Telelowvision" program, you can try out magnification devices from the comfort of your home to find what works best for you before you buy.
Scottish radio broadcaster and podcaster Steven Scott loves finding and talking about tech stuff. He's especially fond of apps and gadgets that make life easier for him and others with vision loss.
NYT Columnist Frank Bruni returns to the podcast to talk about his new book. Frank describes his personal experiences with vision loss and how, with time, his perspective has grown.
IT professional and stand-up comedian Todd Blenkhorn talks about his personal journey with vision loss and how his passion for stand-up helped him find and share the humor in daily interactions.
In this episode, we're sharing highlights from previous interviews with a glaucoma specialist, retina specialist, and a low vision doctor. Listen in to learn more about common eye conditions, treatments, and what to expect at these specialist appointments.
Master Gardener Sue Brasel and Hadley's Chief Program Officer and gardener Ed Haines join us for a chat about gardening, no matter your level of vision or gardening experience. They share tips for how to get started, common challenges, and the many benefits of gardening.
We're joined by Carol Mackey, an avid discussion group participant, and co-host Debbie Worman to chat about what Hadley groups are, how to join, and what you can get out of them. With 10 groups on a variety of topics, there's something for everyone. Listen in or chime in – it’s up to you.
Bold Blind Beauty blogger Stephanae McCoy joins us for a chat on beauty, style and confidence. Stephanae talks about how vision loss shaped her life, and then shares some of her favorite fashion and beauty tips.
Hadley staff share their favorite kitchen gadgets and tips. Whether you're an experienced home chef or a total novice, you're bound to pick up a few ideas that fit your vision needs and make your time in the kitchen more productive (and fun).
We sat down with Kim Walker, co-director of research and development at Hadley, and Mark Andrews, one of the Hadley advisors who reviewed our exciting new approach for adults with vision loss to learn braille. From labeling items in your home to identifying buttons on an elevator, braille can be a wonderful tool for everyday use.
New York Times Best-Selling Author, Gretchen Rubin, chats about her research on how tapping into different senses can enrich our lives and connect us to each other in surprising ways.
Twin sisters Jenelle and Joy join the podcast to share their personal experiences with vision loss and adjusting to it emotionally. While they look identical, their perspectives and journeys differ, highlighting their mission to show that "there is no right way to go blind."
Hadley learner Sharon Noseworthy shares tips and tricks for hosting get-togethers of any type or size, no matter your vision. Sharon has always loved the role of hostess and has learned to adjust her approach now that her own vision has declined.
We're joined by Teepa Snow, occupational therapist and founder of Positive Approach to Care, to learn more about the challenges of having both vision loss and dementia. Teepa addresses common misconceptions about dementia and shares practical tips for supporting someone with both conditions.
We sat down with several Hadley staff members and asked them about their favorite tech tips, apps, and gadgets. Whether you consider yourself a tech expert or novice, the group recommends a variety of high-tech and low-tech options that fit your comfort level and interests.
Judge David Tatel has served on the second most powerful court in the country since 1994. He also happens to be blind. Judge Tatel joins us to share his story on building a law career and family while dealing with changing vision, the technology and resources he's found useful, and what made him consider getting a guide dog in recent years.
In honor of White Cane Safety Day today, we're joined by Hadley learner Larry Carlson and Orientation and Mobility Specialist Elijah Haines for a conversation about this important tool. Larry shares what made him decide to use a white cane, and Elijah shares tips for what to consider and how to adjust to using a white cane.
Supriya Raman, manager of the Disability and Multicultural branches of the TSA, shares tips on traveling among shifting COVID restrictions. Supriya covers what to expect at the airport and what resources are available for people with visual impairment.
Photographer Michael Nye chats with us about his latest art exhibit, "My Heart is Not Blind," a collection of photos and audio interviews of people with visual impairment. Through these stories, Michael provides a look into what he calls "our shared humanity and shared fragility," as well as common misunderstandings about blindness.
Champion blind golfer Chad NeSmith talks about how vision loss shaped his life, and how he shares his passion for golf with others with vision loss.
Doug Walker, Hadley co-director of R&D, and Ed Haines, Hadley Chief Program Officer, chat about the making of Hadley's "Adjusting to Vision Loss" workshop series. The series guides people through the emotional aspects of vision loss. Doug serves as the series' personal storyteller and narrator.
In this episode we chat with ophthalmologist Dr. Angela Elam from the University of Michigan. Dr. Elam addresses common questions and concerns, and shares her advice for returning to the eye doctor among shifting COVID restrictions.
Dorrie Rush, OE's Chief Content Officer, joins us for a chat about this wonderful online resource chock full of tips for living well with vision loss. You'll find great articles on using tech tools, tips for health and well-being, stories from others living with vision loss, a terrific podcast, and more.
Learn about a new service that’s just launched in 2020 called Accessible Pharmacy. Accessible packaging and labeling and personalized customer support all free of charge to the end consumer, and specifically designed for those with vision impairment.
Audio Describe the World! That’s the mantra of UniDescription: a free smartphone app that provides audio descriptions and navigation tips for US National Parks and other public places.
In this episode, we chat with low vision optometrist Dr. Mark Wilkinson from the University of Iowa. Dr. Wilkinson answers common questions and shares his advice for getting the most out of low vision optometry appointments.
Jan and Elgie Dow share how vision loss has changed their relationship and the advice they have for other couples.
Join Hadley advisor Eddie Becerra as he shares about losing his sight from diabetic retinopathy, and how he gained a new perspective on life.
Classically trained chef Regina Mitchell shares how vision loss shaped her life. Regina worked her way back into the kitchen and is now helping others cook with confidence, no matter their vision.
In this episode we sit down with the director of Well Connected, an organization that offers free, call-in groups for adults over 60 on a wide variety of interest areas: games, music, meditation and more.
Support groups can be a great way to connect with others who "get it." Listen in as low vision support group leaders Lynndah Lahey and Judy Davis describe how their groups are run and what their members get out of them.
World-renowned artist John Bramblitt describes how vision loss has shaped his painting and his life.
In this episode, we chat with Dr. Tim Murray of the American Society of Retina Specialists. Dr. Murray treats eye diseases such as macular degeneration and diabetic retinopathy. He answers common questions and shares his insights into the future of treatments.
In this episode, we sit down with Dr. Jullia Rosdahl, a glaucoma specialist from the Duke Eye Center, and ask her some of the many questions we’ve heard about glaucoma, its risk factors, and how to treat the disease.
Hadley learning expert Jessica Smith shares her experience raising a puppy that may eventually become a guide dog. She covers what she’s learned and things to consider if you’d like to volunteer to help out a guide dog school.
October 15 is White Cane Safety Day, a day to recognize this important tool that empowers people with visual impairment to travel safely and independently. It also reminds the general public to be mindful of visually impaired neighbors, giving them additional consideration and right-of-way when needed. We sat down with Kellee Sanchez, an orientation and mobility specialist, to talk about the history of White Cane Safety Day, and how a white cane can help those with vision loss.
Be My Eyes is a free smartphone app that connects visually impaired users with sighted volunteers for help with visual tasks. We sat down with Will Butler from Be My Eyes to hear how the app started, tips for using it, and exciting new features that provide specialized assistance, including with Hadley.
Tracy Simon from Eye2Eye peer support program shares her story of vision loss, how her program works, and the benefits of connecting with and supporting each other.
Ophthalmologist Dr. Lori Provencher chats with us about how the coronavirus pandemic has changed doctor's visits. She shares tips for staying safe, questions to ask, and what to expect before, during and after your next office visit.
Mindfulness expert Tiffany Guske returns to the podcast to share tips and insights on how to cope with life's challenges, such as vision loss or an illness, building resilience and focusing on self-compassion instead of judgment.
Author of "When You Can't Believe Your Eyes," Hannah Fairbairn, chats with us about how to communicate in everyday situations when you can't rely on visual cues. Hear Hannah's own story about losing vision, her practical tips on adjusting to vision loss, and advice she has on regaining confidence in social situations.
In this episode, we continue the conversation on living during the COVID-19 pandemic with a visual impairment. Listen in as we share some experiences, tips, and strategies for coping during these difficult times.
The COVID-19 crisis has brought a wave of change and uncertainty to our everyday lives. Listen in as we share personal experiences, resources and some helpful tips...all from a blind or low vision perspective.
Assistive technology experts Ricky Enger and Steve Kelley review BlindShell, a mobile phone built for those with visual impairment. They discuss the basic features, how it differs from a traditional smartphone, and how to decide if it's right for you.
This week we sit down with Dan Roberts, author of "The First Year-Age-Related Macular Degeneration: An Essential Guide for the Newly Diagnosed" and founder of MDSupport website and support group. Hear Dan's own story about being diagnosed with macular degeneration and what prompted him to reach out to others facing similar circumstances.
Listen in as we explore the basics of using hand tools with a visual impairment. Gil Johnson, a visually impaired home repair expert, shares tips on everything from measuring, to leveling to hammering.
Elections are right around the corner. So we gathered a panel to talk about options for voting no matter your level of vision. Listen in as we explore everything you need to know, from registering to vote to the many ways you can cast your ballot.
Ricky sits down with Android Accessibility Product Manager Brian Kemler to discuss what is available on Android phones for those with visual impairment. From adjusting font size and color, or opting to listen with TalkBack instead, the commitment to making these powerful tools more useful to a wider audience is clear.
In this episode, we chat with Gil Johnson, an experienced home repair and woodworking enthusiast about things to consider when undertaking home repair with blindness or low vision.
Hadley's Debbie Good sits down to continue a conversation with author and visually impaired world traveler Dr. Wendy David. Together they explore a wide variety of helpful hints covering train, plane, and cruise travel as well as practical information on traveling internationally and navigating hotels.
In this episode, Ricky Enger chats with Joe Strechay, associate producer on the Apple TV+ series SEE. The show takes place in a future where, after a viral apocalypse, all humans are blind. Joe takes us behind the scenes of the show and his work to help build an inclusive set for the cast and crew, including those with low to no vision. From casting to costumes, scripting to scenery, hear how Joe helped create a science fiction world that strives to be authentic to life with vision loss.
Hadley's Debbie Good sits down with travel author Dr. Wendy David in this latest episode. In part one of this two-part interview, Debbie and Wendy discuss tips for traveling with confidence as a blind or low vision person, advice on picking destinations, considerations for traveling alone and in a group, and more!
Ricky Enger is joined by Hadley's Debbie Worman and mindfulness expert Tiffany Guske in this latest episode. Debbie and Tiffany talk about what mindfulness is and the specific benefits that mindfulness can offer for those living with vision loss. Tiffany then walks listeners through a short mindfulness exercise.
In this episode, Ricky Enger speaks with New York Times columnist Frank Bruni, who shares the story of his sudden vision loss from NAION. Bruni speaks candidly on his adjustment to the change, maintaining a realistic attitude towards his vision loss, and the failure of medical professionals to provide resources after diagnosis.
Listen in as we share practical tips on how to keep your handwriting readable. This resource-packed episode includes many useful techniques and solutions to common handwriting challenges. Hadley Learning Expert Jennifer Ottowitz chats with Sue Dalton, Certified Vision Rehabilitation Therapist.
In this episode, Hadley's Steve Kelley speaks with Kendra Farrow, from the National Research and Training Center on Blindness and Low Vision, located at Mississippi State. The episode serves as a guide for those new to vision rehabilitation, including determining who is eligible for services, key differences between the medical and social services models, and how to locate services in each state.
In this episode, Ricky Enger chats with Microsoft's Jeremy Curry, a Senior Program Manager with the Windows Accessibility team. New vision accessibility features are now available in Windows 10 for low vision and screen reader users.
In the inaugural episode of Hadley Presents, Ricky Enger and Jonathan Mosen of Aira chat about the ways in which a visual interpreter service, such as Aira, can be used to gain valuable visual information and enhance travel and leisure activities for blind and low vision users.