Sex workers' rights and AI

Abeba Birhane, Vinay Prabhu and colleagues' excellent papers on large vision-language (VL) datasets highlight that an alarming amount of the content is pornographic [1, 2]. I share their concern that non-sexual terms such as "girl" and "abuela" are associated with sexual imagery. I am also concerned about the subjects of these images losing agency over their likeness, specifically sex workers, who may or may not be aware that these images are publicly available.

The issue of pornographic content is often talked about as a risk for AI trained on scraped internet data, and rightly so, because unwanted exposure to such content is upsetting! However, the rights of sex workers (or those creating adult content) are rarely considered. I think this stems in part from implicit whorephobia (discrimination against sex workers, manifesting as dehumanisation and apparent ignorance of their needs). I think it also stems from a tendency to define harmful content based on the developers' or paper authors' intuitions, so discussion rarely extends beyond "harmful content = bad (for me to see)". This superficiality applies not just to sex workers' rights but, of course, to other marginalised groups too, something compellingly presented by Su Lin Blodgett and colleagues in 2020 [3, 4].

Images of these performers (typically women) could be used to train models that produce custom pornography; a colleague has found several websites offering such services, where you can browse for your favourite OnlyFans performer. This contributes to greater income precarity for the performers, whilst those developing the models may benefit from the sale of the output or the model itself. The image subjects almost certainly receive no compensation. Thus sex workers face losing income whilst others profit. Individuals may also produce images of sex workers committing acts they would never consent to, which could be upsetting if viewed by the sex worker [5].

There has been extensive discussion of the risk of deepfakes, whereby an "innocent" individual has their face effectively superimposed onto a body engaged in sexual acts. I have seen no discussion of the rights of the person to whom that body belongs.

(I’d like to thank Juno Mac and Molly Smith whose book Revolting Prostitutes made me a better person and hopefully a better ally. Buy it here: https://www.versobooks.com/books/3039-revolting-prostitutes)

  1. Abeba Birhane, Vinay Uday Prabhu, and Emmanuel Kahembwe. 2021. Multimodal datasets: misogyny, pornography, and malignant stereotypes. arXiv preprint. 

  2. Vinay Uday Prabhu and Abeba Birhane. 2020. Large image datasets: A pyrrhic win for computer vision? arXiv:2006.16923 [cs, stat]. 

  3. Su Lin Blodgett, Solon Barocas, Hal Daumé III, and Hanna Wallach. 2020. Language (Technology) is Power: A Critical Survey of “Bias” in NLP. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5454–5476, Online. Association for Computational Linguistics. 

  4. Su Lin Blodgett, Gilsinia Lopez, Alexandra Olteanu, Robert Sim, and Hanna Wallach. 2021. Stereotyping Norwegian Salmon: An Inventory of Pitfalls in Fairness Benchmark Datasets. 

  5. Thanks to Charlotte Bird for this insight. 
