SEXUALIZATION OF CHILDREN IN DEEPFAKES AND HENTAI

Author: Simone Eelmaa
  1. Introduction

    Over the last decade, reports regarding child sexual abuse material (CSAM) (1) have increased dramatically (Europol 2020). A steep escalation in demand for CSAM is one of the numerous challenges brought about by the COVID-19 pandemic: detected levels of sharing and re-sharing of CSAM rose by 106% during pandemic-related lockdowns (Europol 2020). This significant increase in demand underscores the pertinence of the discussion on the sexualization of minors (SOM) in virtual CSAM. Technological developments have further oriented and advanced emerging forms of public sexualized discourse, which now include the creation of sexually explicit video content using deepfake technology. (2)

    This technological advancement has also found use in creating such material of children (NetClean 2018). Given that deepfake technology essentially offers the user an endless variety of design options (Kikerpill 2020), producing computer-generated CSAM has never been easier. However, deepfake CSAM is only the most recent addition to the disturbing field of virtual CSAM. Hentai, known for its overtly sexual representations, often of fictional children, has been around for decades (Al-Alosi 2018), hence providing an essential point of comparison for capturing how people conceptualize the sexualization of minors in the context of virtual CSAM.

    The differentiation is important because the two forms of virtual CSAM, though antithetical to each other (deepfakes aim to create as realistic and genuine-looking an experience as possible, in contrast to the hyper-realistic, exaggerated representations intrinsic to hentai), both contribute to an already extensive list of concerns over image-based sexual abuse (Maddocks 2018). However, it is not just 'offenders' or people creating virtual CSAM who feed the problem. Public opinion, people's views, and attitudes are central vehicles of discourses that sexualize minors (APA 2007). Therefore, this study seeks answers to the following questions: (1) How do people view the sexualization of minors in hentai and deepfakes? (2) How do people's views and attitudes toward the SOM differ between hentai and deepfakes?

    1.1. Sexualization of minors

    Sex and sexuality, in general, have become a staple presence in the public sphere. The pornification of mainstream media (Paasonen et al. 2007) brought about a cultural shift known as the sexualization of culture. Mass media's hegemonic representations and misrepresentations of sexuality permeate every sphere and domain, carrying harmful, dehumanizing messages, prejudiced views, double standards, objectification, and normative expectations (Kelly 2017). These powerful narratives also include children, both as subjects of such portrayals and as recipients of these messages (APA 2007). Children are surrounded by messages that reduce their inherent value to looks and attractiveness. From early on, children are exposed to mediums and messages that aim to blur the lines between children and adults (Moore and Reynolds 2018). The premature sexualization of children and adolescents is a global issue that has only intensified over the preceding decade.

    The concept of sexualization itself encompasses some definitional complexity. Although this study does not measure or seek to define sexualization, the Luxembourg Guidelines (Greijer and Doek 2016) provide a helpful framing of the concept:

    Sexualisation is not always an objective criterion, and the crucial element in judging such a situation is the intent of a person to sexualise a child in an image or to make use of an image for sexual purposes (e.g. for sexual arousal or gratification).

    Hence, this study considers sexualization as something inappropriately imposed upon children in the context of virtual CSAM. Previous scholarship suggests that most discussions about the sexualization of childhood focus on girls (Moore and Reynolds 2018), meaning we are dealing with a strongly gendered discourse. Nevertheless, studies indicate that the sexualization of childhood in different spheres, particularly in media, is inherently damaging to all children, not just girls (APA 2007).

    Adverse effects on children include self-objectification, body dissatisfaction, negative mood, lower self-esteem, anxiety, eating disorders, depression, and the pressure to appear and behave 'sexually' at an early age (ibid). What is more, the sexualization of children has negative consequences for their ability to develop a healthy and complete sexual self-image (ibid). Children are enticed into this sexualized realm of public sexual discourse through commercial marketing, social media, television, video games, music videos, and other forms of media (ibid), but the impact goes beyond the developmental and contextual consequences to children. For instance, the metadiscursive messages of the SOM have fostered a cultural climate in which sexual depictions of children are becoming further normalized (Kelly 2007, Moore and Reynolds 2018) and may well cultivate further demand for such depictions.

    1.2. Virtual CSAM

    CSAM, also often referred to as child pornography, (3) is any representation of a child engaged in real or simulated explicit sexual activities, or any representation of the sexual parts of a child for primarily sexual purposes (Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography, Article 2(c)). Virtual CSAM is any material with graphical sexualized depictions of children or of child sexual abuse (CSA) that does not portray a real child, e.g., computer-generated imagery (CGI), cartoons, and morphed images (see, e.g., ECPAT International, SECO Manifestations); this includes both hentai and deepfakes.

    Although the possession, production, and dissemination of CSAM that exploits real children are criminalized in most countries (ICMEC 2018), the issue is more problematic with fantasy materials that depict fictional children. The fictional status of such materials raises questions regarding restrictions on freedom of speech and artistic expression, even when the materials are clearly explicit, as is usually the case with hentai (Al-Alosi 2018). Nevertheless, the harms of desensitization towards, and normalization of, acts that are illegal when involving real children ought to be similarly considered in discussions about fictional CSAM, particularly as viewing CSAM does not constitute a preventive safety valve (Russell and Purcell 2008: 61). In fact, research has shown that repeated exposure to CSAM may cause desensitization, whereby a person becomes habituated to a stimulus that once evoked strong reactions (Paul and Linz 2008; Al-Alosi 2018). Such materials may mislead viewers into believing that sexual activity with children is normal and thus increase the risk of committing the act in real life (O'Donnell and Milner 2007: 74). It is also noteworthy that 'hentai' and 'teen' are among the most frequently searched terms on Porn Hub (4) (Porn Hub 2019), indicating widespread interest in sexual depictions of minors. Furthermore, Taylor and Quayle (2002: 340) found that accessing images appeared to reinforce existing fantasies and was used to give permission to act on them.

    The prevalence of deepfakes is another strand of concern. According to the NetClean Report (2018), one in five police officers had come across deepfakes in CSA investigations. For investigative authorities, deepfakes both further the production of (virtual) CSAM and make identifying children more difficult (ibid). Deepfakes can be used to produce new CSAM from already existing material or to create CSAM of children who have not been subjected to actual [emphasis added] sexual abuse (ibid). The primary problems associated with any deepfake content are the ease of its production and dissemination, as well as the false beliefs deepfakes promote (Gosse and Burkell 2020). Given the degree of harm deepfakes may inflict, particularly regarding CSAM (Kirchengast 2020), there are calls for decisive regulation of such content (Delfino 2020). Furthermore, as it is possible to request deepfake porn featuring adults from openly accessible online forums (Kikerpill 2020) and the dark web has afforded the creation of similar forums for sexual predators (Chiang 2020), the potential combination of these two operational aspects poses a serious threat to the protection of children.

  2. Materials and methods

    The epistemological roots of this study are grounded in the social constructionism paradigm, which views reality as socially constructed through social interactions (see Berger and Luckmann 1966). To understand how people make sense of the SOM in virtual forms, a qualitative study was carried out (Krauss 2005). In early 2018, the popular online forum Reddit changed a rule regarding the SOM on its platform. The following is the original rule announcement:

    Reddit prohibits any sexual or suggestive content involving minors or someone who appears to be a minor. This includes child sexual abuse imagery, child pornography, and any other content, including fantasy content (e.g., stories, anime), that encourages or promotes pedophilia, child exploitation, or otherwise sexualizes minors. Depending on the context, this can in some cases include depictions of minors that are fully clothed and not engaged in overtly sexual acts (Reddit 2018).

    The reactions to this rule change are the subject of this study. User comments forming the initial sample (N = 13,293) were posted to a thread opened in Reddit's Announcements section in which the platform informed users of its sitewide rule change. The data set was obtained from the archives of Pushshift Reddit, a comprehensive search engine and real-time analytics tracker for the platform (Baumgartner et al. 2020). The dataset of user comments was organized by date from oldest to newest while...
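    For readers interested in the retrieval step, the sketch below illustrates one plausible way to collect every archived comment on a single submission via Pushshift's public comment-search endpoint, paging forward on the created_utc timestamp. It is a minimal illustration only, not the authors' exact procedure: the submission ID is a placeholder rather than the actual announcement thread, and the endpoint and parameters reflect Pushshift's documented API as it existed at the time.

        import time
        import requests

        PUSHSHIFT_URL = "https://api.pushshift.io/reddit/comment/search/"
        SUBMISSION_ID = "xxxxxx"  # placeholder id, not the real announcement thread

        def fetch_all_comments(link_id):
            """Collect all archived comments on one submission, oldest first."""
            comments, after = [], 0
            while True:
                resp = requests.get(PUSHSHIFT_URL, params={
                    "link_id": link_id,          # restrict results to one submission
                    "sort": "asc",               # oldest to newest, as in the study
                    "sort_type": "created_utc",  # page on the comment timestamp
                    "after": after,              # resume after the last timestamp seen
                    "size": 100,                 # page size accepted by the API
                }, timeout=30)
                resp.raise_for_status()
                batch = resp.json().get("data", [])
                if not batch:                    # empty page means we are done
                    return comments
                comments.extend(batch)
                after = batch[-1]["created_utc"] # advance the paging cursor
                time.sleep(1)                    # stay within the rate limit

        comments = fetch_all_comments(SUBMISSION_ID)
        print("Retrieved", len(comments), "comments")

    Sorting ascending on created_utc and resuming from the last seen timestamp is the standard Pushshift pagination pattern; it yields the comments already ordered from oldest to newest, matching the organization of the dataset described above.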
