{"id":25148,"date":"2026-01-15T19:13:04","date_gmt":"2026-01-15T15:13:04","guid":{"rendered":"https:\/\/me-en.kaspersky.com\/blog\/?p=25148"},"modified":"2026-01-15T19:13:04","modified_gmt":"2026-01-15T15:13:04","slug":"ai-generated-sextortion-social-media","status":"publish","type":"post","link":"https:\/\/me-en.kaspersky.com\/blog\/ai-generated-sextortion-social-media\/25148\/","title":{"rendered":"AI and the new reality of sextortion"},"content":{"rendered":"<p>In 2025, cybersecurity researchers discovered several open databases belonging to various AI image-generation tools. This fact alone makes you wonder just how much AI startups care about the privacy and security of their users\u2019 data. But the nature of the content in these databases is far more alarming.<\/p>\n<p>A large number of generated pictures in these databases were images of women in lingerie or fully nude. Some were clearly created from children\u2019s photos, or intended to make adult women appear younger (and undressed). Finally, the most disturbing part: some pornographic images were generated from completely innocent photos of real people \u2014 likely taken from social media.<\/p>\n<p>In this post, we\u2019re talking about what sextortion is, and why AI tools mean anyone can become a victim. We detail the contents of these open databases, and give you advice on how to avoid becoming a victim of AI-era sextortion.<\/p>\n<h2>What is sextortion?<\/h2>\n<p>Online sexual extortion has become so common it\u2019s earned its own global name: <a href=\"https:\/\/en.wikipedia.org\/wiki\/Sextortion\" target=\"_blank\" rel=\"noopener nofollow\">sextortion<\/a> (a portmanteau of <em>sex<\/em> and <em>extortion<\/em>). We\u2019ve already detailed its various types in our post, <a href=\"https:\/\/www.kaspersky.com\/blog\/all-sextortion-schemes-2024\/52436\/\" target=\"_blank\" rel=\"noopener nofollow\"><strong>Fifty shades of sextortion<\/strong><\/a>. 
To recap, this form of blackmail involves threatening to publish intimate images or videos to coerce the victim into taking certain actions, or to extort money from them.<\/p>\n<p>Previously, victims of sextortion were typically adult industry workers, or individuals who\u2019d shared intimate content with an untrustworthy person.<\/p>\n<p>However, the rapid advancement of artificial intelligence, particularly text-to-image technology, has fundamentally changed the game. Now, literally anyone who\u2019s posted their most innocent photos publicly can become a victim of sextortion. This is because generative AI makes it possible to quickly, easily, and convincingly undress people in any digital image, or add a generated nude body to someone\u2019s head in a matter of seconds.<\/p>\n<p>Of course, this kind of fakery was possible before AI, but it required long hours of meticulous Photoshop work. Now, all you need is to describe the desired result in words.<\/p>\n<p>To make matters worse, many generative AI services don\u2019t bother much with protecting the content they\u2019ve been used to create. As mentioned earlier, last year saw researchers discover at least three publicly accessible databases belonging to these services. This means the generated nudes within them were available not just to the user who\u2019d created them, but to anyone on the internet.<\/p>\n<h2>How the AI image database leak was discovered<\/h2>\n<p>In October 2025, <a href=\"https:\/\/www.wired.com\/story\/huge-trove-of-nude-images-leaked-by-ai-image-generator-startups-exposed-database\/\" target=\"_blank\" rel=\"noopener nofollow\">cybersecurity researcher Jeremiah Fowler uncovered an open database<\/a> containing over a million AI-generated images and videos. According to the researcher, the overwhelming majority of this content was pornographic in nature. 
The database wasn\u2019t encrypted or password-protected\u00a0\u2014 meaning any internet user could access it.<\/p>\n<p>The database\u2019s name and watermarks on some images led Fowler to believe its source was the U.S.-based company SocialBook, which offers influencer and digital marketing services. The company\u2019s website also provides access to tools for generating images and content using AI.<\/p>\n<p>However, further analysis revealed that SocialBook itself wasn\u2019t directly generating this content. Links within the service\u2019s interface led to third-party products\u00a0\u2014 the AI services MagicEdit and DreamPal\u00a0\u2014 which were the tools used to create the images. These tools allowed users to generate pictures from text descriptions, edit uploaded photos, and perform various visual manipulations, including creating explicit content and face-swapping.<\/p>\n<p>The leak was linked to these specific tools, and the database contained the output of their work, including AI-generated and AI-edited images. A portion of the images led the researcher to suspect they\u2019d been uploaded to the AI as references for creating provocative imagery.<\/p>\n<p>Fowler states that roughly 10,000 photos were being added to the database every single day. SocialBook denies any connection to the database. After the researcher informed the company of the leak, several pages on the SocialBook website that had previously mentioned MagicEdit and DreamPal became inaccessible and began returning errors.<\/p>\n<h2>Which services were the source of the leak?<\/h2>\n<p>Both services\u00a0\u2014 MagicEdit and DreamPal\u00a0\u2014 were initially marketed as tools for interactive, user-driven visual experimentation with images and art characters. 
Unfortunately, a significant portion of these capabilities was directly linked to creating sexualized content.<\/p>\n<p>For example, MagicEdit offered a tool for AI-powered virtual clothing changes, as well as a set of styles that made images of women more revealing after processing\u00a0\u2014 such as replacing everyday clothes with swimwear or lingerie. Its promotional materials promised to turn an ordinary look into a sexy one in seconds.<\/p>\n<p>DreamPal, for its part, was initially positioned as an AI-powered role-playing chat, and was even more explicit about its adult-oriented focus. The site offered to create an ideal AI girlfriend, with certain pages directly referencing erotic content. The FAQ also noted that filters for explicit content in chats were disabled so as not to limit users\u2019 most intimate fantasies.<\/p>\n<p>Both services have suspended operations. At the time of writing, the DreamPal website was returning an error, while MagicEdit appeared to be available again. Their apps were removed from both the App Store and Google Play.<\/p>\n<p>Jeremiah Fowler says that earlier in 2025 he discovered two more open databases containing AI-generated images. One <a href=\"https:\/\/www.wired.com\/story\/genomis-ai-image-database-exposed\/\" target=\"_blank\" rel=\"noopener nofollow\">belonged to the South Korean site GenNomis<\/a>, and contained 95,000 entries \u2014 a substantial portion of which were images of \u201cundressed\u201d people. Among other things, the database included images of child versions of celebrities: American singers Ariana Grande and Beyonc\u00e9, and reality TV star Kim Kardashian.<\/p>\n<h2>How to avoid becoming a victim<\/h2>\n<p>In light of incidents like these, it\u2019s clear that the risks associated with sextortion are no longer confined to private messaging or the exchange of intimate content. 
In the era of generative AI, even ordinary photos, when posted publicly, can be used to create compromising content.<\/p>\n<p>This problem is especially relevant for women, but men shouldn\u2019t get too comfortable either: <a href=\"https:\/\/www.kaspersky.com\/blog\/all-sextortion-schemes-2024\/52436\/\" target=\"_blank\" rel=\"noopener nofollow\">the popular blackmail scheme<\/a> of \u201cI hacked your computer and used the webcam to make videos of you browsing adult sites\u201d could reach a whole new level of persuasiveness thanks to AI tools for generating photos and videos.<\/p>\n<p>Therefore, protecting your privacy on social media and controlling what data about you is publicly available become key measures for safeguarding both your reputation and peace of mind. To prevent your photos from being used to create questionable AI-generated content, we recommend making all your social media profiles as private as possible\u00a0\u2014 after all, they could be the source of images for AI-generated nudes.<\/p>\n<p>We\u2019ve already published multiple detailed guides on how to <a href=\"https:\/\/www.kaspersky.com\/blog\/minimizing-digital-footprints-2025\/53762\/\" target=\"_blank\" rel=\"noopener nofollow\">reduce your digital footprint online<\/a> or even <a href=\"https:\/\/www.kaspersky.com\/blog\/deleting-digital-footprints\/54591\/\" target=\"_blank\" rel=\"noopener nofollow\">remove your data from the internet<\/a>, <a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-remove-yourself-from-data-brokers-people-search-sites\/54209\/\" target=\"_blank\" rel=\"noopener nofollow\">how to stop data brokers from compiling dossiers on you<\/a>, and how to <a href=\"https:\/\/www.kaspersky.com\/blog\/the-naked-truth-iia\/51733\/\" target=\"_blank\" rel=\"noopener nofollow\">protect yourself from intimate image abuse<\/a>.<\/p>\n<p>Additionally, we have a dedicated service, <a 
href=\"https:\/\/privacy.kaspersky.com\/?utm_source=kdaily&amp;utm_medium=blog&amp;utm_campaign=gl_kd-banner_ap0072&amp;utm_content=banner&amp;utm_term=gl_kdaily_organic_hwzuab72aq5ynvk\" target=\"_blank\" rel=\"noopener\">Privacy Checker<\/a>\u00a0\u2014 perfect for anyone who wants a quick but systematic approach to privacy settings everywhere possible. It compiles step-by-step guides for securing accounts on social media and online services across all major platforms.<\/p>\n<p>And to ensure the safety and privacy of your child\u2019s data, <a href=\"https:\/\/me-en.kaspersky.com\/safe-kids?icid=me-en_kdailyplacehold_acq_ona_smm__onl_b2c_kasperskydaily_wpplaceholder____ksk___\" target=\"_blank\" rel=\"noopener\">Kaspersky Safe Kids<\/a>\u00a0can help: it allows parents to monitor which social media their child spends time on. From there, you can help them adjust privacy settings on their accounts so their posted photos aren\u2019t used to create inappropriate content. Explore our <a href=\"https:\/\/www.kaspersky.com\/blog\/young-adults-cybersecurity\/54265\/\" target=\"_blank\" rel=\"noopener nofollow\">guide to children\u2019s online safety<\/a> together, and if your child dreams of becoming a popular blogger, discuss our <a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-help-child-blogger-2\/54148\/\" target=\"_blank\" rel=\"noopener nofollow\">step-by-step cybersecurity guide for wannabe bloggers with them<\/a>.<\/p>\n<input type=\"hidden\" class=\"category_for_banner\" value=\"premium-generic\">\n","protected":false},"excerpt":{"rendered":"<p>Generative AI has taken sextortion techniques to a whole new level \u2014 now, any social media user can become a victim. 
How can you protect yourself and your loved ones?<\/p>\n","protected":false},"author":2726,"featured_media":25149,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1225],"tags":[2088,1481,1583,2349,2252,2040,2719,2274,89,76,1637,695,2774,768,240,321,521],"class_list":{"0":"post-25148","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-privacy","8":"tag-tips","9":"tag-ai","10":"tag-artificial-intelligence","11":"tag-blackmail","12":"tag-deepfakes","13":"tag-extortion","14":"tag-fakes","15":"tag-images","16":"tag-kids","17":"tag-phishing","18":"tag-porn","19":"tag-scam","20":"tag-sexting","21":"tag-sextortion","22":"tag-spam","23":"tag-technology","24":"tag-threats"},"hreflang":[{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/ai-generated-sextortion-social-media\/25148\/"},{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/ai-generated-sextortion-social-media\/30084\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/ai-generated-sextortion-social-media\/29964\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/ai-generated-sextortion-social-media\/28902\/"},{"hreflang":"es","url":"https:\/\/www.kaspersky.es\/blog\/ai-generated-sextortion-social-media\/31773\/"},{"hreflang":"it","url":"https:\/\/www.kaspersky.it\/blog\/ai-generated-sextortion-social-media\/30406\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/ai-generated-sextortion-social-media\/41165\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/ai-generated-sextortion-social-media\/14191\/"},{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/ai-generated-sextortion-social-media\/55137\/"},{"hreflang":"fr","url":"https:\/\/www.kaspersky.fr\/blog\/ai-generated-sextortion-social-media\/23521\/"},{"hreflang":"pt-br","url":"https:\/\/www.kaspersky.co
m.br\/blog\/ai-generated-sextortion-social-media\/24658\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/ai-generated-sextortion-social-media\/33106\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/ai-generated-sextortion-social-media\/30169\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/ai-generated-sextortion-social-media\/35849\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/ai-generated-sextortion-social-media\/35504\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/me-en.kaspersky.com\/blog\/tag\/sextortion\/","name":"sextortion"},"_links":{"self":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/25148","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/users\/2726"}],"replies":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/comments?post=25148"}],"version-history":[{"count":1,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/25148\/revisions"}],"predecessor-version":[{"id":25150,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/25148\/revisions\/25150"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media\/25149"}],"wp:attachment":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media?parent=25148"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/categories?post=25148"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/tags?post=25148"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}