Posted by ccld201
4 March 2024

If not carefully regulated, the rapid development and proliferation of artificial intelligence (AI) is set to pose a significant threat to the dignity of women. The New York Post reported on 25 January that AI-generated pornographic images of Taylor Swift had begun circulating on X (formerly Twitter).[1] While this incident has intensified discussion and highlighted the urgency of dealing with this technology, the problem of deepfake pornography and nudification tools is not new. Indeed, a documentary released by the Guardian in 2023, 'My Blonde GF', details the experience of Helen Mort, whose likeness was used, without her participation or consent, to create a pornographic film. Mort was unaware of the film's existence until she was confronted with the finished product.
Deepfake porn has been described as technology which allows the user to superimpose a victim's face onto the body of a porn performer.[2] However, the term is rapidly moving towards a broader definition encompassing the use of AI and machine learning to create and/or alter images, videos and audio. Less discussion has taken place regarding nudification tools, a technology that arguably falls under the deepfake umbrella. These tools allow a user to upload a fully clothed image of a woman (and, more recently, of children, discussed below) and, with the click of a few buttons, generate an entirely nude image of her. The AI attempts to generate as realistic an image as possible. At the time of writing, these tools fail to produce convincingly accurate images. However, given Moore's Law, the observation that computing power grows at an exponential rate,[3] it likely will not be long before these tools are frighteningly accurate, and they may eventually progress to producing videos with an accurate nudified body.
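As a rough illustration of why exponential improvement matters here (a stylised model, not a measurement of any particular tool): if the underlying capability C doubles every two years from a baseline C_0, then

\[ C(t) = C_0 \cdot 2^{t/2}, \qquad \text{so} \qquad C(10) = 2^{5} C_0 = 32\, C_0. \]

On that assumption, a tool that produces crude results today could be more than thirty times as capable within a decade.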
Although the algorithms used in these tools can be altered to apply to anyone, not just women, a Consultation Paper published by the Law Commission in 2021 found that 100% of pornographic deepfakes depicted women.[4] Additionally, the Commission's Final Report notes that intimate image abuse in general is 'a gendered phenomenon'.[5] It notes that:
'It is often linked to misogyny; a sense of male entitlement to women's bodies is a prevalent motivation for the non-consensual taking and sharing [or in the case of nudification tools, creation] of intimate images and operates to reinforce female subordination and the objectification of women in society.'[6]
This is a clear violation of women's human dignity: a fully clothed image a woman chose to share of herself online can be taken and uploaded to a nudification website or mobile application to generate a fully nude image of her, without her consent or knowledge. Given that appearing in sexualised content can have real-world impacts on, for example, an individual's ability to access housing and employment,[7] and given the rapid development of these new technologies, the threat they pose, particularly to women and children, must be taken seriously. A further possible consequence of the evolution of this technology is the development of smartphone apps that could 'nudify' someone walking down the street in real time.
Nudification tools and similar technologies have recently been applied to children. Indeed, Fox News reported in June 2023 that AI tools are now being used to generate sexually explicit images of children.[8] Last year, South Korea jailed a man for the first time for using AI tools to create such images. Moreover, Sky News reported that AI tools were being used to 'de-age' celebrities and to generate sexually explicit images of the results. Those abusing these tools to harm children in this way do so by two methods. Some paedophiles, according to Fox News, are generating sexually explicit images of 'fake', entirely AI-generated children.[9] This creates an additional complexity for the law, given that no image of an actual child was used to generate the material. However, nudification tools can also be used to target real children in sexually abusive ways. Given that some AI porn-generating tools are producing over half a million images a day,[10] there is an urgent need for innovative legal solutions to address this issue, particularly to protect women and children.
In a society where women are often shamed for expressing their sexuality, care must be taken not to shame adult women for appearing in images and videos in little to no clothing. However, it must be acknowledged that these tools are currently being, and will continue to be, weaponised to shame and embarrass women and to violate their dignity and personal autonomy. Their use essentially eliminates the victim's right to self-determination and makes controlling one's own sexual privacy[11] nearly impossible. Similarly, they could significantly infringe upon the right to private and family life under Article 8 of the European Convention on Human Rights.[12] In Vavřička and Others v the Czech Republic,[13] the Court confirmed that the right to private life extends to 'aspects relating to personal identity, such as a person's name, photo or physical and moral integrity'.[14] Nudification tools allow any user to generate what will eventually be accurate naked images of any woman (or child or, though much less likely, man) with an online profile, not just without her consent, but without her knowledge. The crux of the issue, then, is not that women are appearing in pornographic images, but that these tools remove the need for women's participation in the creation of such images, stripping women of their agency in how they express their sexuality and of their autonomy over the use of their likeness.
Preventing the abuse of such tools will not be an easy task. As long ago as 2021, at least one Member of Parliament in the UK was calling for a ban on these harmful tools.[15] Since MP Maria Miller's call for a ban, the Online Safety Act has indeed received royal assent. This Act makes it an offence to intentionally share 'a photograph or film which shows, or appears to show, another person… in an intimate state' where that person does not consent and is not reasonably believed to have consented.[16] Additionally, the Protection of Children Act 1978 proscribes the creation of 'indecent photographs' of children.[17] It is unclear whether this extends to indecent images of AI-generated children. An updated definition of 'child' that includes those created by AI could help to clarify this particular law. Alternatively, Parliament could amend the legislation to specify that child sexual abuse material includes any AI-generated material featuring an apparent child.
Although a step in the right direction, the Online Safety Act does not go far enough to protect potential victims, particularly adult women, from this newer type of cyber sexual abuse. The trouble with the Act is that it prohibits the sharing of, and threats to share, such images,[18] but does nothing to proscribe their creation without explicit consent. In developing legal responses to this technology, therefore, governments must regulate against the use of AI for sexually abusive purposes. One option might be to require nudification apps and websites to implement consent verification systems, ensuring that the individual whose likeness is being used has agreed to their image being manipulated in this way (a simple sketch of what such a check might involve appears below). Another might be for governments to work with experts and large corporations in computer science and the internet to find ways to remove these websites and applications from the web or, at the absolute minimum, to suppress them from appearing in search results. Given that 96% of deepfakes are of a non-consensual pornographic nature,[19] prohibiting public access to nudifying technologies, or at least requiring some approval system for their use, would likely be proportionate.
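To make the consent-verification proposal more concrete, the following is a minimal, hypothetical sketch (in Python) of the kind of check such a service could run before processing an upload. Every name in it is illustrative; no existing service or statute specifies this design, and a real system would also need robust identity verification of the consenting subject.

import hashlib
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    image_hash: str   # SHA-256 of the exact image the subject consented to
    subject_id: str   # identifier for the verified consenting subject
    revoked: bool = False

# A dict stands in for what would, in practice, be a persistent registry.
CONSENT_REGISTRY: dict[str, ConsentRecord] = {}

def register_consent(image_bytes: bytes, subject_id: str) -> str:
    """Record that a verified subject consented to processing of this image."""
    image_hash = hashlib.sha256(image_bytes).hexdigest()
    CONSENT_REGISTRY[image_hash] = ConsentRecord(image_hash, subject_id)
    return image_hash

def may_process(image_bytes: bytes) -> bool:
    """Refuse manipulation unless an unrevoked consent record exists."""
    record = CONSENT_REGISTRY.get(hashlib.sha256(image_bytes).hexdigest())
    return record is not None and not record.revoked

# The service would call may_process() before any manipulation runs,
# rejecting (and ideally logging) any upload with no matching consent.

A hash-based check of this kind verifies consent only for the exact image uploaded and says nothing about other images of the same person, which is one reason technical measures would need to sit alongside, rather than replace, legal prohibition.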
[1] Kirsten Fleming, 'AI "porn" of Taylor Swift is a wake-up call: House must pass bill making this a federal crime' New York Post (25 January 2024) <https://nypost.com/2024/01/25/entertainment/ai-porn-of-taylor-swift-is-a-wake-up-call-for-us-government/> accessed 27 January 2024.
[2] AP Gieseke, '"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography' (2020) 73 Vanderbilt Law Review 1479.
[3] Erik Gregersen, 'Moore's Law', Encyclopaedia Britannica (2024) <https://www.britannica.com/technology/Moores-law> accessed 28 January 2024.
[4] Law Commission, Intimate Image Abuse: A Consultation Paper (Law Com No 253, 2021) para 2.47.
[5] Law Commission, Intimate Image Abuse: A Final Report (Law Com No 407, 2022) para 1.7.
[6] ibid.
[7] Free Speech Coalition, 'Financial Discrimination and the Adult Industry' (Free Speech Coalition, May 2023) <https://action.freespeechcoalition.com/wp-content/uploads/2023/05/Financial-Discrimination-and-the-Adult-Industry-Updated-May-2023.pdf> accessed 13 February 2024.
[8] Nikolas Lanum, 'AI now being used to generate child pornography, blackmail teenagers: Digital safety expert' Fox News (29 June 2023) <https://www.foxnews.com/media/ai-used-generate-child-pornography-blackmail-teenagers-digital-safety-expert> accessed 27 January 2024.
[9] ibid.
[10] Will Gendron and Beatrice Nolan, 'An AI image generator making NSFW content is pumping out 500,000 pics a day, CEO says â but AI porn is a murky business' Business Insider (28 July 2023) <https://www.businessinsider.com/ai-porn-generator-unstable-making-500k-images-day-ceo-says-2023-7?r=US&IR=T> accessed 27 January 2024.
[11] Danielle Keats Citron, 'Sexual Privacy' (2019) 128 Yale Law Journal 1870.
[12] Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights, as amended) (ECHR) art 8.
[13] App no 47621/13 (ECtHR, 8 April 2021).
[14] European Court of Human Rights, 'Guide on Article 8 of the European Convention on Human Rights: Right to respect for private and family life, home and correspondence' (European Court of Human Rights, 31 August 2022).
[15] Jane Wakefield, 'MP Maria Miller wants AI "nudifying" tool banned' BBC (4 August 2021) <https://www.bbc.co.uk/news/technology-57996910> accessed 27 January 2024.
[16] Online Safety Act 2023, s 188.
[17] Protection of Children Act 1978, s 1.
[18] Online Safety Act 2023 (n 16).
[19] Sensity, 'The State of Deepfakes: Landscape, Threats and Impact' (Sensity, 29 November 2019) <https://medium.com/sensity/mapping-the-deepfake-landscape-27cb809e98bc> accessed 13 February 2024.