Dating app Bumble has launched a new feature that automatically blurs nude images, The Independent reports.
Earlier this year, the female-forward app, which encourages women to make the first move when they match with men, announced the development of its Private Detector Tool.
The tool uses artificial intelligence to scan images in real time, detecting with 98 percent accuracy whether a user has been sent inappropriate content.
The feature, which has now been rolled out across the globe and was announced on Bumble’s Instagram account, was championed by Whitney Wolfe Herd, founder and CEO of Bumble, and Andrey Andreev, founder of MagicLab, which owns dating apps Badoo, Bumble, Lumen, and Chappy.
When a Bumble user is sent an image or message that the Private Detector tool has deemed inappropriate, a message appears on their screen to alert them.
They are then offered the option to tap to reveal the picture or message in question.
According to Wolfe Herd,
“The digital world can be a very unsafe place overrun with lewd, hateful and inappropriate behaviour, there’s limited accountability, making it difficult to deter people from engaging in poor behaviour.”
She added that the safety of people who use dating apps is the company’s “number one priority”.
“The development of ‘Private Detector’ is another undeniable example of that commitment,” she stated.
Several Instagram users praised the rollout of the new feature.
“Someone tried sending me a nude and I was SO happy when this popped up,” one person wrote, in reference to the Private Detector tool. “Guys – send dog pictures, not d*** pictures.”