Sextortion Scheme Targeting Colorado Students

Last week, multiple news outlets reported that police were investigating a sextortion scheme targeting Colorado students, dubbed a “social media nightmare.” Dozens of middle school and high school students reported that a suspect or suspects contacted them on Instagram and demanded payment to take down explicit images, some of which were generated via A.I. The suspect(s) also sent other youths unsolicited offers to pay to join a “Close Friends List” where sexually explicit images had been posted.

One emerging question is whether Colorado’s statutes cover A.I.-generated content. Colorado, like many other states, has been working to keep pace with rapidly evolving forms of image-based sexual abuse. In 2014, Colorado lawmakers passed a bill making it illegal to distribute intimate, identifiable images without the victim’s consent. That legislation was broadened several years later (for instance, by removing the requirement that the dissemination of the images cause victims to suffer serious emotional distress). In 2019, additional legislation gave victims of “revenge porn” the ability to file a civil lawsuit.

Colorado’s Image-Based Sexual Abuse Laws

Under Colorado’s current laws, posting a private image for harassment and posting a private image for pecuniary gain are crimes under C.R.S. 18-7-107 and C.R.S. 18-7-108, respectively. Each statute provides that an actor commits the offense if “he or she posts or distributes through social media or any website any photograph, video, or other image displaying the private intimate parts of an identified or identifiable person . . . .” “Private intimate parts” are defined as “external genitalia or the perineum or the anus or the pubes of any person or the breast of a female.”

Additionally, a victim of image-based sexual abuse can bring a civil action under C.R.S. 13-21-1403 if the victim was “identifiable” and “suffered harm from a person’s intentional disclosure or threatened disclosure of an intimate image that was private” and that was shared without the victim’s consent. An “intimate image” is defined in C.R.S. 13-21-1402 as “a photograph, film, video recording, or other similar medium that shows: (a) The uncovered genitals, pubic area, anus, or female postpubescent nipple of a depicted individual; or (b) The depicted individual engaging in or being subjected to sexual conduct.”

If a victim’s actual body is depicted, a bad actor can be held liable under these criminal and civil statutes. But when the explicit portion of an image is A.I.-generated, the law becomes murky. Colorado courts will need to determine whether an image of a real person (for instance, the person’s face) with A.I.-generated intimate content superimposed on the person’s body falls within the definitions of “private intimate parts” or an “intimate image.” If it does not, policymakers will need to craft new legislation to cover this type of situation.

Other laws might fill in the gaps in the meantime. For instance, C.R.S. 18-7-502 makes it illegal for a person to sell to a child “any picture, photograph, drawing, sculpture, motion picture film, or similar visual representation or image of a person or portion of the human body which depicts sexually explicit nudity, sexual conduct, or sadomasochistic abuse and which, taken as a whole, is harmful to children . . . .” Law enforcement is also considering charges for distribution of child pornography and for extortion.

Federal Image-Based Sexual Abuse Laws

Federal law also prohibits image-based sexual abuse. For instance, the Violence Against Women Act (VAWA) was expanded so that an “individual whose intimate visual depiction is disclosed . . . without the consent of the individual . . . may bring a civil action against that person . . . .” 15 U.S.C.S. § 6851. Notably, the definition of “visual depiction” is fairly expansive and includes “data stored on computer disk or by electronic means which is capable of conversion into a visual image . . . .” 18 U.S.C.S. § 2256(5). Additionally, federal criminal statutes bar child pornography, and Masha’s Law (18 U.S.C.S. § 2255) provides a civil remedy for victims of child sexual abuse material.

We Must Address These Rapidly Evolving Cyber Abuses

Cyber abuses aren’t going away; they are popping up around the country at alarming rates. According to the FBI, reports of sextortion targeting children increased 20 percent from September 2022 through March 2023, as compared to the same timeframe the prior year. Moreover, from October 2021 to March 2023, FBI investigators received over 13,000 reports of financial sextortion aimed at minors, primarily boys. That sextortion led to at least 20 suicides. A psychology professor at Metropolitan State University of Denver recently stated, “[i]t puts people at a tremendous risk for self harm and suicide, because of this horrible fear of being publicly humiliated and ashamed.”

To combat these abuses, we need to make sure our laws adequately address these evolving crimes. And, as the FBI has urged, “[p]arents, educators, and caregivers need to be aware of this increasingly urgent threat and empower victims to come forward.”