Sex sells. Where there are high user numbers, traffic and sales, there are also higher risks for acquirers and payment service providers.
Everything changed overnight last December for one of the internet’s largest adult entertainment merchants. Following allegations of child abuse content, violence and non-consensual sexual behavior on the site, Mastercard, Visa and Discover pulled card acceptance.
The Canadian company that owns the site, along with a whole host of other successful adult websites, allows users to upload their own content. For the site’s visitors, this is part of the appeal – but it also contributes hugely to the risks. Hold that thought while we consider the first part: its extraordinary popularity.
Three hours’ worth of content uploaded every minute
The site in question regularly ranks within the top ten most visited sites on the internet. According to the merchant’s own statistics, it received over 42 billion website visits in 2019, or around 115 million visitors per day.
“That’s the equivalent of the population of Canada, Australia, Poland and the Netherlands all visiting in one day!” wrote the company in a December 2019 blog post.
The statistics for one internet minute on the site are equally mind-boggling. There are 80,000 visitors and nearly 220,000 videos viewed. Almost three hours’ worth of content is uploaded every minute. This equates to a day’s worth of video uploaded every nine minutes.
So, what’s the problem? Why did the card schemes pull their acceptance?
Spoiler: it’s not pornography — or more candidly, sex — it’s consent or the lack of it.
It seems as if the change was triggered by a high-profile New York Times investigation in December 2020. This claimed the website hosted videos of child rape as well as non-consensual sexual activity involving trafficked people, minors and unconscious women.
Among the hours of content uploaded every day were also videos posted without the knowledge or consent of those featured. For example, revenge porn and webcam footage from showers and changing rooms. That’s a big problem.
High brand-risk compliance programs
The card schemes don’t want to be associated with any illegal or brand-damaging activities. They have high brand-risk programs in place. Mastercard has the Business Risk Assessment and Mitigation program or BRAM. Visa has the Global Brand Protection Program or GBPP.
The programs stipulate that acquirers must register in advance and prove they have adequate controls in place before acquiring merchants in high brand-risk sectors. The shift in classification from high-risk to high brand-risk reflects a change in how card acceptance risk is viewed, particularly in the e-commerce environment.
For example, under the GBPP, Visa does not permit its brand to be used for the sale or purchase of material relating to child abuse, bestiality, rape or any other non-consensual sexual behavior, as well as non-consensual mutilation of a person or body part. Mastercard has very similar rules.
These programs have been in place for around a decade, so they should be old news for the industry. Nonetheless, let’s reiterate some of the steps which acquirers and their adult, webcam or online dating merchants can take to mitigate e-commerce acceptance risks.
Four ways to manage adult merchant risk
Firstly, ensure that merchants have robust content approval and monitoring policies in place. This is to prevent the uploading of content that is illegal or prohibited by the card schemes.
This could include limiting upload privileges to verified users only and preventing users from downloading content. While these measures reduce the volume of content uploaded, they significantly decrease the likelihood of non-compliant content being made available. Deactivating downloads helps to prevent content from circulating on different platforms.
Secondly, as soon as content is uploaded, additional content monitoring controls need to kick in. These checks should be automated, especially given the amount of user-generated content uploaded daily on those platforms. Image and video fingerprinting are very useful tools in this context.
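To illustrate how fingerprinting supports automated checks, here is a minimal average-hash sketch in Python. It assumes an image has already been decoded and downscaled to a small grayscale grid; real platforms use far more robust, dedicated video-fingerprinting systems, and all function names here are illustrative, not part of any scheme or vendor API.

```python
# Minimal sketch of perceptual (average-hash) image fingerprinting.
# Input: a small grayscale grid, e.g. 8x8 brightness values (0-255).

def average_hash(pixels):
    """Return a fingerprint: one bit per pixel, set when the pixel
    is brighter than the grid's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

def is_known_content(candidate, blocklist, threshold=5):
    """Flag an upload whose fingerprint is close to any blocklisted one.
    A small Hamming-distance tolerance catches re-encoded or resized
    copies, not just byte-identical files."""
    return any(hamming_distance(candidate, h) <= threshold for h in blocklist)
```

A newly uploaded frame whose hash lands within a few bits of a blocklisted hash can then be held back for human review instead of being published automatically.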
Thirdly, ensure merchants conduct a full identification of any performers or models before they are accepted on to the platform. Verifying age and identity via a video chat, plus reviewing this at least annually, reduces the risk of identity theft or account takeover.
Fourthly, pay particular attention to member-only areas. Sites that offer these, or that use third-party technology such as Skype, present increased content risks. Transactions may occur outside the controlled, monitored environment. Rigorous investigation as well as ongoing screening and monitoring are essential.
To wrap things up, let’s get back to the case we discussed earlier. How did the website react to the accusations and loss of card acceptance?
In a press release published in mid-December 2020, the company acknowledged the situation and announced a seven-point plan to safeguard its platform. The most notable change was that in future, it would only allow verified users to upload content.
This also meant the deletion of all content uploaded by non-verified users, which amounted to around 10 million videos. The company also banned video downloads and pledged to significantly expand moderation, in addition to launching a trusted flagger program in co-operation with non-profit organizations.
Merchants offering user-generated content on their websites pose an increased risk for acquirers. They require proper and in-depth due diligence and, of course, high-frequency monitoring.
At Web Shield, we have been helping acquirers manage their merchant portfolios since 2010 with both on-boarding and monitoring services. For more on this and other trending topics, check out Web Shield's regular video podcast.
Episode 1 teaser: Online adult entertainment and user-generated content.