{"id":1393,"date":"2024-03-04T17:05:49","date_gmt":"2024-03-04T17:05:49","guid":{"rendered":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/?p=1393"},"modified":"2024-03-04T17:05:51","modified_gmt":"2024-03-04T17:05:51","slug":"preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones","status":"publish","type":"post","link":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/","title":{"rendered":"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones"},"content":{"rendered":"\n<p>If not carefully regulated, the rapid development and proliferation of artificial intelligence (AI) is set to pose a significant threat to the dignity of women. The New York Post reported on 25 January that AI-generated pornographic images of Taylor Swift had begun circulating on X (Twitter).<a href=\"#_ftn1\" id=\"_ftnref1\">[1]<\/a> While this incident is likely to intensify discussion and underscore the urgency of addressing this technology, the issue of deepfake pornography and nudification tools is not new. Indeed, a documentary released by the Guardian in 2023, titled \u2018My Blonde GF\u2019, details the experience of Helen Mort, a woman whose likeness was used to create a pornographic film in which she had neither participated nor consented to appear. Mort was unaware of the film\u2019s existence until she was confronted with the finished product.<\/p>\n\n\n\n<p>Deepfake porn has been described as technology which allows the user to superimpose a victim\u2019s face onto the body of a porn performer.<a href=\"#_ftn2\" id=\"_ftnref2\">[2]<\/a> However, the term is rapidly moving towards a broader definition encompassing the use of AI and machine learning to create and\/or alter images, videos and audio. 
Less discussion has taken place regarding nudification tools, a technology arguably falling under the deepfake umbrella. These tools allow a user to upload a fully clothed image of a woman (and, more recently, children, discussed below) and, with the click of a few buttons, generate an entirely nude image of her. The AI attempts to generate as accurate an image as possible. At the time of writing, these tools do not produce especially accurate images. However, given Moore\u2019s Law \u2013 the observation that computing power can grow at an exponential rate<a href=\"#_ftn3\" id=\"_ftnref3\">[3]<\/a> \u2013 it likely will not be long before these tools are frighteningly accurate, and eventually progress to generating videos with accurately nudified bodies.<\/p>\n\n\n\n<p>Although the algorithms used in these tools can be altered to apply to anyone, not just women, a Consultation Paper published by the Law Commission in 2021 found that 100% of pornographic deepfakes were of women.<a href=\"#_ftn4\" id=\"_ftnref4\">[4]<\/a> Additionally, the Commission\u2019s Final Report notes that intimate image abuse in general is \u2018a gendered phenomenon\u2019.<a href=\"#_ftn5\" id=\"_ftnref5\">[5]<\/a> It observes that:<\/p>\n\n\n\n<p>\u2018It is often linked to misogyny; a sense of male entitlement to women\u2019s bodies is a prevalent motivation for the non-consensual taking and sharing [or in the case of nudification tools, creation] of intimate images and operates to reinforce female subordination and the objectification of women in society.\u2019<a href=\"#_ftn6\" id=\"_ftnref6\">[6]<\/a><\/p>\n\n\n\n<p>This represents a clear violation of the human dignity of women: a fully clothed image a woman chose to share of herself online can be taken and uploaded to a nudification website or mobile application to generate a fully nude image of her without her consent or knowledge. 
Given that appearance in sexualised content can have real-world impacts on, for example, an individual\u2019s ability to access housing and employment,<a href=\"#_ftn7\" id=\"_ftnref7\">[7]<\/a> and given the rapid development of these new technologies, the threat this poses, particularly to women and children, must be taken seriously. Another possible consequence of the evolution of this technology is the development of smartphone apps that could \u2018nudify\u2019 someone walking down the street in real time.<\/p>\n\n\n\n<p>Nudification tools and similar technologies have recently been applied to children. Indeed, Fox News reported in June 2023 that AI tools are now being used to generate sexually explicit images of children.<a href=\"#_ftn8\" id=\"_ftnref8\">[8]<\/a> In fact, last year South Korea jailed a man for the first time for using AI tools to create sexually explicit images of children. Moreover, Sky News reported that AI tools were being used to \u2018de-age\u2019 celebrities and to generate sexually explicit images of the results. There are two methods used by those abusing these tools to harm children in this way. Some paedophiles, according to Fox News, are generating sexually explicit images of \u2018fake\u2019 AI-generated children.<a href=\"#_ftn9\" id=\"_ftnref9\">[9]<\/a> This creates an additional complexity for the law to deal with, given that no image of an actual child was used to generate the material. However, nudification tools can also be used to target real children in sexually abusive ways. 
Given that some AI porn-generating tools are producing over half a million images a day,<a href=\"#_ftn10\" id=\"_ftnref10\">[10]<\/a> there is an urgent need for innovative legal solutions to address this issue, particularly to protect women and children.<\/p>\n\n\n\n<p>In a society where women are often shamed for expressing their sexuality, care must be taken not to shame adult women for appearing in images and videos in little to no clothing. However, it must be acknowledged that these tools are currently, and will continue to be, weaponised to shame and embarrass women and to violate their dignity and personal autonomy. The use of these tools essentially eliminates the victim\u2019s right to self-determination and makes controlling one\u2019s own sexual privacy<a href=\"#_ftn11\" id=\"_ftnref11\">[11]<\/a> nearly impossible. Similarly, they could significantly infringe upon the right to private and family life under Article 8 of the European Convention on Human Rights.<a href=\"#_ftn12\" id=\"_ftnref12\">[12]<\/a> In <em>Vav\u0159i\u010dka and Others v. the Czech Republic<\/em>,<a href=\"#_ftn13\" id=\"_ftnref13\">[13]<\/a> the Court confirmed that the right to private life consistently extends to \u2018aspects relating to personal identity, such as a person\u2019s name, photo or physical and moral integrity\u2019.<a href=\"#_ftn14\" id=\"_ftnref14\">[14]<\/a> The creation of such nudification tools allows any user to generate what will eventually be accurate naked images of any woman (or child or, though much less likely, man) with an online profile, not just without her consent, but without her knowledge. 
The crux of the issue, then, is not that women appear in pornographic images, but that these tools remove the need for women\u2019s participation in the creation of such images, stripping women of agency over how they express their sexuality and of autonomy over the use of their likeness.<\/p>\n\n\n\n<p>Preventing the abuse of such tools will not be an easy task. As long ago as 2021, at least one Member of Parliament in the UK was calling for a ban on these harmful tools.<a href=\"#_ftn15\" id=\"_ftnref15\">[15]<\/a> Since MP Maria Miller\u2019s call for a ban, the Online Safety Act has indeed received royal assent. This Act makes it an offence to intentionally share \u2018a photograph or film which shows, or appears to show, another person\u2026 in an intimate state\u2019 where that person does not consent or is not reasonably believed to have consented.<a href=\"#_ftn16\" id=\"_ftnref16\">[16]<\/a> Additionally, the Protection of Children Act 1978 proscribes the creation of \u2018indecent photographs\u2019 of children.<a href=\"#_ftn17\" id=\"_ftnref17\">[17]<\/a> It is unclear whether this would apply to indecent images of AI-generated children. An updated definition of \u2018child\u2019 to include those created by AI could help to clarify this particular law. Alternatively, Parliament could amend legislation to specify that child sexual abuse material includes anything AI-generated that features an apparent child.<\/p>\n\n\n\n<p>Although a step in the right direction, the Online Safety Act does not go far enough to protect potential victims, particularly adult women, from this newer type of cyber sexual abuse. The trouble with the UK\u2019s Online Safety Act is that it prohibits the <em>sharing<\/em> of such images, or the threat to share them,<a href=\"#_ftn18\" id=\"_ftnref18\">[18]<\/a> but does nothing to proscribe their <em>creation<\/em> without explicit consent. 
Therefore, in developing legal solutions to this technological advancement, governments must regulate against the use of AI technologies for sexually abusive purposes. One option might be to require nudification apps and websites to implement consent verification systems, ensuring that the individual whose likeness is being used has agreed to their image being manipulated in this way. Another might be for governments to work with computer-science experts and major internet corporations to find ways to target these websites and applications for removal from the web or, at an absolute minimum, suppression from search results. Given that 96% of deepfakes are of a non-consensual pornographic nature,<a href=\"#_ftn19\" id=\"_ftnref19\">[19]<\/a> prohibiting public access to nudifying technologies, or at least requiring some approval system for their use, would likely be proportionate.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p><a href=\"#_ftnref1\" id=\"_ftn1\">[1]<\/a> Kirsten Fleming, \u2018AI \u201cporn\u201d of Taylor Swift is a wake-up call: House must pass bill making this a federal crime\u2019 <em>New York Post<\/em> (25 January 2024) &lt;https:\/\/nypost.com\/2024\/01\/25\/entertainment\/ai-porn-of-taylor-swift-is-a-wake-up-call-for-us-government\/&gt; accessed 27 January 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref2\" id=\"_ftn2\">[2]<\/a> Anne Pechenik Gieseke, \u2018\u201cThe New Weapon of Choice\u201d: Law\u2019s Current Inability to Properly Address Deepfake Pornography\u2019 (2020) 73 Vanderbilt Law Review 1479.<\/p>\n\n\n\n<p><a href=\"#_ftnref3\" id=\"_ftn3\">[3]<\/a> Erik Gregersen, \u2018Moore\u2019s Law\u2019, <em>Encyclopaedia Britannica<\/em> (2024) &lt;https:\/\/www.britannica.com\/technology\/Moores-law&gt; accessed 28 January 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref4\" id=\"_ftn4\">[4]<\/a> Law Commission, <em>Intimate Image 
Abuse: A consultation paper<\/em> (Law Com No 253, 2021) para 2.47.<\/p>\n\n\n\n<p><a href=\"#_ftnref5\" id=\"_ftn5\">[5]<\/a> Law Commission, <em>Intimate image abuse: a final report<\/em> (Law Com No 407, 2022) para 1.7.<\/p>\n\n\n\n<p><a href=\"#_ftnref6\" id=\"_ftn6\">[6]<\/a> ibid.<\/p>\n\n\n\n<p><a href=\"#_ftnref7\" id=\"_ftn7\">[7]<\/a> Free Speech Coalition, \u2018Financial Discrimination and the Adult Industry\u2019 (<em>Free Speech Coalition<\/em>, May 2023) &lt;https:\/\/action.freespeechcoalition.com\/wp-content\/uploads\/2023\/05\/Financial-Discrimination-and-the-Adult-Industry-Updated-May-2023.pdf&gt; accessed 13 February 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref8\" id=\"_ftn8\">[8]<\/a> Nikolas Lanum, \u2018AI now being used to generate child pornography, blackmail teenagers: Digital safety expert\u2019 <em>Fox News<\/em> (29 June 2023) &lt;https:\/\/www.foxnews.com\/media\/ai-used-generate-child-pornography-blackmail-teenagers-digital-safety-expert&gt; accessed 27 January 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref9\" id=\"_ftn9\">[9]<\/a> ibid.<\/p>\n\n\n\n<p><a href=\"#_ftnref10\" id=\"_ftn10\">[10]<\/a> Will Gendron and Beatrice Nolan, \u2018An AI image generator making NSFW content is pumping out 500,000 pics a day, CEO says \u2013 but AI porn is a murky business\u2019 <em>Business Insider<\/em> (28 July 2023) &lt;https:\/\/www.businessinsider.com\/ai-porn-generator-unstable-making-500k-images-day-ceo-says-2023-7?r=US&amp;IR=T&gt; accessed 27 January 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref11\" id=\"_ftn11\">[11]<\/a> Danielle Keats Citron, \u2018Sexual Privacy\u2019 (2019) 128 Yale Law Journal 1870.<\/p>\n\n\n\n<p><a href=\"#_ftnref12\" id=\"_ftn12\">[12]<\/a> Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights, as amended) (ECHR) art 8.<\/p>\n\n\n\n<p><a href=\"#_ftnref13\" id=\"_ftn13\">[13]<\/a> App no 47621\/13 (ECtHR, 8 April 2021).<\/p>\n\n\n\n<p><a href=\"#_ftnref14\" 
id=\"_ftn14\">[14]<\/a> European Court of Human Rights, \u2018Guide on Article 8 of the European Convention on Human Rights: Right to respect for private and family life, home and correspondence\u2019 (European Court of Human Rights, 31 August 2022).<\/p>\n\n\n\n<p><a href=\"#_ftnref15\" id=\"_ftn15\">[15]<\/a> Jane Wakefield, \u2018MP Maria Miller wants AI \u201cnudifying\u201d tool banned\u2019 <em>BBC<\/em> (4 August 2021) &lt;<a href=\"https:\/\/www.bbc.co.uk\/news\/technology-57996910\">https:\/\/www.bbc.co.uk\/news\/technology-57996910<\/a>&gt; accessed 27 January 2024.<\/p>\n\n\n\n<p><a href=\"#_ftnref16\" id=\"_ftn16\">[16]<\/a> Online Safety Act 2023, s 188.<\/p>\n\n\n\n<p><a href=\"#_ftnref17\" id=\"_ftn17\">[17]<\/a> Protection of Children Act 1978, s 1.<\/p>\n\n\n\n<p><a href=\"#_ftnref18\" id=\"_ftn18\">[18]<\/a> Online Safety Act 2023 (n 16).<\/p>\n\n\n\n<p><a href=\"#_ftnref19\" id=\"_ftn19\">[19]<\/a> Sensity, \u2018The State of Deepfakes: Landscape, Threats and Impact\u2019 (<em>Sensity<\/em>, 29 November 2019) &lt;https:\/\/medium.com\/sensity\/mapping-the-deepfake-landscape-27cb809e98bc&gt; accessed 13 February 2024.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>If not carefully regulated, the rapid development and proliferation of artificial intelligence (AI) is set to pose a significant threat to the dignity of women. 
The New York Post reported on 25 January that AI-generated pornographic images of Taylor Swift had begun circulating on X (Twitter).[1] While this incident is likely to increase discussions and [&hellip;]<\/p>\n","protected":false},"author":973,"featured_media":1397,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v23.0 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; Democracy<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; Democracy\" \/>\n<meta property=\"og:description\" content=\"If not carefully regulated, the rapid development and proliferation of artificial intelligence (AI) is set to pose a significant threat to the dignity of women. 
The New York Post reported on 25 January that AI-generated pornographic images of Taylor Swift had begun circulating on X (Twitter).[1] While this incident is likely to increase discussions and [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/\" \/>\n<meta property=\"og:site_name\" content=\"Dignity &amp; Democracy\" \/>\n<meta property=\"article:published_time\" content=\"2024-03-04T17:05:49+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-03-04T17:05:51+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"601\" \/>\n\t<meta property=\"og:image:height\" content=\"432\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"ccld201\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"ccld201\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/\",\"url\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/\",\"name\":\"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; Democracy\",\"isPartOf\":{\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg\",\"datePublished\":\"2024-03-04T17:05:49+00:00\",\"dateModified\":\"2024-03-04T17:05:51+00:00\",\"author\":{\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/858d8c98ff31d036d3e30b2393ae98ab\"},\"breadcrumb\":{\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-
courtney-jones\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage\",\"url\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg\",\"contentUrl\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg\",\"width\":601,\"height\":432},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#website\",\"url\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/\",\"name\":\"Dignity &amp; Democracy\",\"description\":\"A HRDF Blog\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/?s={search_term_string}\"},\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/858d8c98ff31d036d3e30b2393ae98ab\",\"name\":\"ccld201\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/068ca04bb7c545e926b963b535a5293f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/068ca04bb7c545e926b963b535a5293f?s=96&d=mm&r=g\",\"caption\":\"ccld201\"},\"url\":\"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/author\/ccld201\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; Democracy","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/","og_locale":"en_US","og_type":"article","og_title":"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; Democracy","og_description":"If not carefully regulated, the rapid development and proliferation of artificial intelligence (AI) is set to pose a significant threat to the dignity of women. 
The New York Post reported on 25 January that AI-generated pornographic images of Taylor Swift had begun circulating on X (Twitter).[1] While this incident is likely to increase discussions and [&hellip;]","og_url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/","og_site_name":"Dignity &amp; Democracy","article_published_time":"2024-03-04T17:05:49+00:00","article_modified_time":"2024-03-04T17:05:51+00:00","og_image":[{"width":601,"height":432,"url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg","type":"image\/jpeg"}],"author":"ccld201","twitter_card":"summary_large_image","twitter_misc":{"Written by":"ccld201","Est. reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/","url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/","name":"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones - Dignity &amp; 
Democracy","isPartOf":{"@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage"},"image":{"@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage"},"thumbnailUrl":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg","datePublished":"2024-03-04T17:05:49+00:00","dateModified":"2024-03-04T17:05:51+00:00","author":{"@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/858d8c98ff31d036d3e30b2393ae98ab"},"breadcrumb":{"@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-tools-by-courtney-jones\/#primaryimage","url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg","contentUrl":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg","width":601,"height":432},{"@type":"BreadcrumbList","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/2024\/03\/04\/preventing-ai-sexual-abuse-by-suppressing-nudification-t
ools-by-courtney-jones\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/"},{"@type":"ListItem","position":2,"name":"Preventing AI Sexual Abuse by Suppressing Nudification Tools, by Courtney Jones"}]},{"@type":"WebSite","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#website","url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/","name":"Dignity &amp; Democracy","description":"A HRDF Blog","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/858d8c98ff31d036d3e30b2393ae98ab","name":"ccld201","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/068ca04bb7c545e926b963b535a5293f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/068ca04bb7c545e926b963b535a5293f?s=96&d=mm&r=g","caption":"ccld201"},"url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/author\/ccld201\/"}]}},"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-content\/uploads\/sites\/197\/2024\/03\/AI-creative-commons.jpg","_links":{"self":[{"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/posts\/1393"}],"collection":[{"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/s
ites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/users\/973"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/comments?post=1393"}],"version-history":[{"count":3,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/posts\/1393\/revisions"}],"predecessor-version":[{"id":1449,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/posts\/1393\/revisions\/1449"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/media\/1397"}],"wp:attachment":[{"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/media?parent=1393"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/categories?post=1393"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.exeter.ac.uk\/humanrightsanddemocracyforumblog\/wp-json\/wp\/v2\/tags?post=1393"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}