Like seemingly every other tech company, Tinder is bringing artificial intelligence to its operations in an attempt to create a better user experience, its parent company Match Group said in its Q2 earnings letter on Wednesday. The company says it plans to build new features around the needs of Gen Z users.
Aside from Tinder, Match Group also owns the popular dating apps Hinge and OkCupid, which are introducing AI in an attempt to shift their algorithms toward how the younger generation approaches dating. The earnings letter says the apps will better target self-expression by incorporating quizzes, prompts, and conversation starters, and with that comes the company's plan to lean on AI capabilities.
As total revenue for Match Group's dating apps continues to grow, jumping 4% in the second quarter of 2023 compared to the same quarter last year, the company continues to search for ways to automate the dating process. By adding artificial intelligence to the mix, the company says users will be able to use an "AI-enabled photo selection" tool that automatically picks the five best photos of themselves for their dating profile. The company did not state which of its apps would receive the feature, or whether it would apply to all of them.
But what Match Group has failed to address in its report is what will happen if, as with so many other AI tools, the feature introduces more bias and discrimination on its platforms. If AI is selecting which photos are most appealing, what's to stop it from showing racial bias and favoring only photos in which a person appears light-skinned? Just such a problem surfaced when Rona Wang, an Asian grad student in Massachusetts, recently used AI to generate a LinkedIn photo of herself, which she posted on her Twitter account.
Wang told ABC News that the AI image generator created a picture of her with "pale skin, like blue eyes, someone with like European features." She added, "I think, certainly, there's a lot of bias in AI, and I think that's a huge problem."
Of all the problems that have been identified in generative AI systems, racial bias has been one of the most persistent and pervasive.
Match Group also says it plans to roll out an AI feature that proposes ideal matches, explaining why a given profile was surfaced for the user and why the two would supposedly make a good pair. By adding this feature, the company hopes users will feel less of a need to swipe left.
In one AI concept photo shared in the earnings letter, the AI feature showed an image of a possible match on OkCupid, with the caption: “Get ready for endless laughter with this witty charmer who shares your love for brunch, scary movies, and a laid-back approach to life!”
Yet how will Match Group control the way AI decides whether you're a match made in heaven? AI often misrepresents people of color, which could create a massive headache for dating apps. "If it's being used to make some really big decisions that can be really dangerous, right?" Wang told ABC. "If it is perpetuating these biases or especially racial biases."
Artificial intelligence tools pick up on trends and patterns in the datasets they were trained on, meaning they tend to carry forward the same racist attitudes humans express online. In practice, that could mean an AI tool hides an otherwise perfect match based on a name, skin tone, or other qualities tied to race and social interactions.
MIT student Joy Buolamwini said in a TED Talk that facial recognition algorithms often fail to work accurately, explaining that even in her own lab the software couldn't detect her Black face, and she had to put on a white mask to be recognized.
“If the training sets aren’t really that diverse,” she said, “any face that deviates from the established norm will be harder to detect.”
As with all technology, and dating apps in general, users should also be wary of AI tools' potential to increase the number of fake profiles. Match Group shares that concern, according to CEO Bernard Kim, who said during the earnings call, "We need to be really thoughtful about making sure that we're giving the right thought to authenticity and ethical and privacy concerns," TechCrunch reported.