This face looks redacted: the Be My Eyes faces controversy explained


by David Redmond

Last week, in one of our work meetings, we were talking about the incredible new AI feature from Be My Eyes. Out of the blue, my colleague Mairead mentioned that she’d had trouble getting the app to describe a photo in her photo library the night before. “That can’t be right,” we said in unison; Be My Eyes could never fail. But it did! My colleague Joe tried it without success, then I attempted it too and had the same problem. Something was seriously wrong. We soon realised that the app had gone from simply blurring faces to outright blocking any image with a face, effectively rendering this game-changing tech useless in many situations.

What is Be My AI?
In short, Be My AI is quite incredible. The feature lets you either take a photo or share one from your photo library and get a detailed description of the photo.

What makes the feature a game changer though is just how detailed the description is. For example: “The picture features a young man standing near a serene lake. He is wearing a black t-shirt with the word ‘Hollister’ written in red and white, along with dark pants and grey sneakers. Behind him, the lake is calm with a small boat in the distance. There are lush green trees on the other side of the lake, and mountains with a misty haze in the background. The sky is partly cloudy with patches of blue peeking through. The overall atmosphere of the picture is peaceful and scenic.”

I can also ask follow-up questions. For example: would it be fair to say that this young gentleman, David, is a fine specimen of a man? The response: “Yes, the young man named David in the picture appears to be well-presented and has a pleasant demeanor.” Why thank you, Be My AI, thank you very much.

Joking aside, this feature truly is a game changer. I’ve used it to help me understand highly visual documents, to find things that have tumbled into the vast emptiness of the floor, and I’ve even seen people say, emotionally, that it gave them descriptions of their kids for the first time ever. The impact of a tool like this is hard to overstate. It’s unprecedented, and it gave many blind people the WOW factor.

What happened
Last week, the Be My AI tool started to completely block any image with a face. That description I shared earlier was completely impossible a week ago. This was a result of privacy concerns: worries about what might be said about people, and in particular the risk of identifying people without their consent.

These concerns are not unfounded. I’ve had it describe someone’s hair as messy, and I’ve also had it give me someone’s name without me even asking. Let’s be very clear: even though it’s not designed for facial recognition, it recognises faces.

Once faces started being blocked, it caused an extreme level of upset. People had been given a tool that was a game changer in their lives, and then, poof, it was gone.

The emotion and heated arguments in Facebook groups were at boiling point. For some people, this game-changing tool was gone, and Be My Eyes were to blame. I’m going to make no friends by saying this, but things became a bit irrational after a certain point. The anger, which I’m not saying was unjustified, was clear to see, and it was only a matter of time before something had to give.

The Statement
On Monday of this week, Be My Eyes published a statement addressing the concerns.

“Be My AI (powered by OpenAI’s GPT-4 image-to-text model) has proven to be an effective tool for users who are blind or have low vision. However, as part of beta testing, OpenAI’s engine may blur faces or omit facial descriptions if a person is in the picture because of legitimate privacy considerations. The resulting lack of facial description is frustrating and disappointing to blind and low-vision users who hope that AI will provide the same helpful information a sighted person obtains from just looking around. We share that sentiment.

AI is a huge new tool requiring new guard rails, and right now society is in the process of inventing the right kind of protection. Be My AI runs on OpenAI’s GPT-4 model, and we both must follow an emerging patchwork of national and international laws. We’re trying to be legally diligent and responsible about privacy, regulation, and user experience – while strongly representing our community’s desire to know as much about the world around them as possible. Including faces. These are complex issues that we’re committed to working on with our partners.”

The full statement can be found on the Be My Eyes website.

It was a bit of a mess, really. Be My Eyes sided with the community, which took at least some of the heat off them, but people were still understandably upset.

The faces return
The following morning, Be My Eyes issued a seemingly unrelated privacy policy update. Shortly after that, just as we were getting ready to go live on a podcast discussing the issues, they were fixed: Be My AI was describing faces again. It has continued to work since, which is how I was able to get the lake description I shared earlier in this article. When I asked the Be My Eyes CEO about this, he simply confirmed that it was currently working and directed me back to the previous statement.

Outside of that, I don’t know what to tell you. At least for now, it seems the storm has passed.

I would, however, point out one small section from the full statement: “While it’s possible we may not be able to fully enable facial description in the short term, we are actively pursuing long-term solutions with OpenAI. Be My Eyes also plans to work with blindness organizations and regulators to make sure the needs of the blind community are fully considered when debating, creating, and enforcing policy frameworks around AI.”

It’s an interesting point. Things are working for now, but even the developers can’t promise that will remain the case. Hopefully this whole thing will sort itself out fully in the long run, but for now, at least, that’s where we’re at.

No matter how you look at this, it’s a complex matter. AI this good is still new, and the conversation about right and wrong hasn’t yet happened at scale in society.

When a person takes a taxi with a guide dog, it has been understood for years that the blind person’s right to work with their guide dog supersedes the driver’s right to object to an animal in their car. Does the right of blind people to have accurate descriptions of faces supersede the rights of members of the public who don’t want their face analysed by this tech? We don’t know yet. Society hasn’t decided, and, perhaps more importantly, the law hasn’t decided.

This is going to be a long road that I can confidently say won’t end this week. It’s great that Be My AI is back to its former glory, but we’re going to need to wait and see what the future holds.

You can listen to the latest episode of Talking Technology discussing the subject at
