To highlight the insights shared at our recent conference in London, Global Dating Insights is proud to share a roundup of some of the event’s presentations.
In this article, we summarise the presentation of Alexandra Popken, VP of Trust & Safety at WebPurify.
No business survives in the long term unless its customers trust its output. So the revelation that more than 90 per cent of dating app users in the 30-40 age group feel that the use of AI negatively impacts their trust in the platform should concern every platform.
Alexandra Popken, VP of Trust and Safety at WebPurify, shared that scary statistic with delegates at the GDI London conference this week. Although the negative trust impact of AI rose above 90 per cent in that age group, it exceeded 70 per cent across all age groups. What’s more, 40 per cent of all users do not feel equipped to detect fake profiles or information, and 70 per cent feel it is up to the platforms to do more to protect them.
But while users view AI with suspicion, Popken argued that generative AI also offers an opportunity to make dating safer and more rewarding.
She explained how generative AI could help identify and block activity such as catfishing, scamming, grooming and malicious attacks. “All of these are examples of activity that build the opposite of trust in your platform,” she highlighted.
Combating them, she told delegates, involves using AI as a good actor and making your own business and your users more aware of, and alert to, the dangers.
“Regulators are looking at our space,” she said. “Compliance will become a big thing and you need to be ready.”
She urged delegates to invest in content moderation, educate and engage with their users, partner with their peers, and also set their own internal standards and principles.
Popken envisaged AI as a tool and a best friend for the industry, but the research shows that educating the user community about its benefits is an important step on that journey.

