
Calmara suggests detecting sexually transmitted diseases from photos of genitals – a dangerous idea

You went home with a Tinder date and things are escalating. You don't really know or trust this guy, and you don't want to get an STI, so… now what?

A startup called Calmara wants you to take a photo of the person's penis and then use AI to tell you whether your partner is “clear” or not.

Let's get something out of the way right away: you should not take a photo of anyone's genitals and scan it with an AI tool to decide whether or not you should have sex.

Calmara's premise has more red flags than a bad first date, and things only get worse from there, considering that most sexually transmitted infections are asymptomatic. So your partner could very well have an STI, but Calmara would tell you they're safe. That's why real STI testing uses blood and urine samples to detect infection, rather than a visual exam.

Other startups are approaching the need for accessible STI testing more responsibly.

“In laboratory diagnostics, sensitivity and specificity are two key metrics that help us understand a test's propensity for missed infections and false positive results,” Daphne Chen, founder of TBD Health, told TechCrunch. “Even with very rigorous testing, there is always some degree of fallibility, but test manufacturers like Roche are transparent about their validation rates for good reason, so that doctors can contextualize the results.”
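For readers unfamiliar with those terms, here is a minimal sketch in Python, with invented numbers (not Calmara's or Roche's data), of how sensitivity and specificity fall out of a test's confusion matrix, and why low sensitivity translates directly into missed infections:

```python
# Hypothetical screening results for 1,000 infected and 1,000 healthy people.
# These numbers are invented for illustration; they are not Calmara's figures.
true_positives = 650    # infected people the test correctly flags
false_negatives = 350   # infected people the test misses and calls "safe"
true_negatives = 900    # healthy people correctly cleared
false_positives = 100   # healthy people wrongly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"Sensitivity: {sensitivity:.0%}")  # 65%: 35% of real infections missed
print(f"Specificity: {specificity:.0%}")  # 90%: 10% of healthy users get false alarms
```

The takeaway: a test can post a respectable-looking specificity while still waving through a third of actual infections, which is exactly Chen's point about missed infections.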

In the fine print, Calmara warns that its results should not be used as a substitute for medical advice. But its marketing suggests otherwise. Before TechCrunch reached out to Calmara, the title of its website was “Calmara: Your Intimate Friend for Unprotected Sex” (it has since been updated to say “Safer Sex” instead). And in a promo video, it describes itself as “The PERFECT WEBSITE to ADOPT!”

Co-founder and CEO Mei-Ling Lu told TechCrunch that Calmara is not intended to be a serious medical tool. “Calmara is a lifestyle product, not a medical app. It does not include any medical conditions or discussions within its scope, and no physicians are involved in the current Calmara experience. It is a free information service.”

“We are updating our communications to better reflect our current intentions,” Lu added. “The clear idea is to initiate a conversation about STI status and testing.”

Calmara is part of HeHealth, which was founded in 2019. Calmara and HeHealth use the same AI, which is claimed to be 65–90% accurate. HeHealth is designed as a first step in assessing sexual health; the platform then helps users connect with partner clinics in their area to schedule an appointment for a real, comprehensive screening.
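To put that 65–90% range in perspective, here is a back-of-the-envelope sketch. It charitably reads the low-end figure as sensitivity, an assumption on our part, since the claim is not broken down that way:

```python
# Assumption for illustration only: treat the low end of the claimed
# 65-90% accuracy range as sensitivity. That breakdown is not published.
infected_users = 100
assumed_sensitivity = 0.65

missed = infected_users * (1 - assumed_sensitivity)
print(f"Infections waved through per 100 infected users: {missed:.0f}")  # ~35
```

Under the same charitable reading, even the top of the claimed range would wave through one in ten infections.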

HeHealth's approach is more reassuring than Calmara's, but that's a low bar, and even then, one huge red flag remains: privacy.

“It's good to see that they offer an anonymous mode, where you don't have to associate your photos with personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch. “However, this doesn't mean their service is de-identified or anonymized, as your photos could still be traced back to your email or IP address.”

HeHealth and Calmara also claim that they comply with HIPAA, a regulation designed to protect patient confidentiality, because they use Amazon Web Services. That sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with “service providers and partners that assist with service operations, including data hosting, analytics, marketing, payment processing and security.” The companies also don't specify whether the AI scans happen on your device or in the cloud, and if the latter, how long that data stays in the cloud and what it is used for. That's a bit too vague to reassure users that their intimate photos are safe.

These privacy issues are not only worrying for users; they are also dangerous for the company itself. What happens if a minor uses the site to check for STIs? Calmara would then find itself in possession of child sexual abuse material. Calmara's answer to this ethical and legal liability is to state in its terms of service that use by minors is prohibited, but that defense would carry no legal weight.

Calmara represents the danger of overhyped technology: it looks like a publicity stunt for HeHealth to capitalize on the buzz around AI, but in its actual implementation, it simply gives users a false sense of security about their sexual health. Those consequences are serious.

“Sexual health is a difficult space to innovate in, and I can see that their intentions are noble,” Chen said. “I just think they may be rushing to market with a solution that isn't mature enough yet.”
