Feature request: AI filter for NSFW images

Google Chrome now has a service that scans websites with AI. Maybe it’s time for something similar in Brave: an option so that on every site you visit, the images are scanned by AI, and the user can choose whether to display images containing sexual content (not only explicit porn, but even underwear, etc.).

Only if it’s on-device scanning, preferably by a model the user can inspect. Or at least not without some very clear, opt-in controls. There should also be a mechanism for reporting misclassifications so the model can be tuned.
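
To make the idea concrete, here is a minimal sketch of how such a feature might hang together, assuming a hypothetical on-device classifier (`classifyImage` below is made up, not a real Brave or Chromium API): a content script blurs each image until the local model clears it, with the sensitivity threshold left entirely to the user, and nothing leaving the machine.

```ts
// Hypothetical sketch of on-device NSFW gating for a page's images.
// `classifyImage` stands in for a local model (e.g. a bundled
// WASM/TFLite classifier) -- it is an assumption, not a real API.

type Verdict = { nsfwScore: number }; // 0 = innocuous, 1 = explicit

declare function classifyImage(img: HTMLImageElement): Promise<Verdict>;

// User-chosen sensitivity, per the "kindergarten ... seen it all" spectrum.
const USER_THRESHOLD = 0.7;

async function gateImage(img: HTMLImageElement): Promise<void> {
  img.style.filter = "blur(24px)"; // hide until classified
  try {
    const { nsfwScore } = await classifyImage(img);
    if (nsfwScore < USER_THRESHOLD) {
      img.style.filter = ""; // cleared: reveal the image
    }
    // else: stay blurred; a click-to-reveal plus a
    // "report misclassification" affordance could attach here
  } catch {
    img.style.filter = ""; // fail open (or closed, per user setting)
  }
}

document.querySelectorAll<HTMLImageElement>("img").forEach(gateImage);
```

The point of the sketch is that everything, including the model, the threshold, and the misclassification reports, can stay local and user-controlled; only an explicit opt-in report would ever send anything upstream.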

If I enable “safe search” on some search engine, I’m implicitly granting it permission to make some decisions on my behalf about the relevance and offensiveness of the results… anywhere from “I’m still in kindergarten, please try very hard not to offend me” to “I’ve been on the internet for years and have already seen the worst things possible. I can deal.”

It seems to me that the sort of people who use Brave are the sort of people who don’t want companies scanning, tracking, logging, or profiling them. I mean, if you want to upload an image to Brave and have them decide whether it’s inappropriate in some way, that’s fine… do with your data as you please. Just don’t make it so that I have to upload my data too.