An 18-year-old student attends the Manuel Belgrano pre-university school in Córdoba. A few months ago, two 17-year-old girls, who are not in his class and had no contact with him, began receiving Instagram requests from men between 40 and 50 years old.
They found the insistence strange, and stranger still that it was happening to both of them. One of the girls chatted with one of those users and found out why they, like other students from that high school, were attracting so much attention on social networks.
Using artificial intelligence (AI), the 18-year-old student had created sexual photos by combining their faces with the bodies of other women, and had uploaded them to porn sites. On each image he put the girls' first and last names, which is why those men searched for their profiles.
The families of the two girls, who were "harassed" on Instagram and are now undergoing psychological treatment, filed a criminal complaint. In a parallel misdemeanor (contravention) proceeding, another 20 students say they have been victims of more fake photos.
Stories like this one, like a very similar case that became known this week in San Martín, Province of Buenos Aires, are emerging little by little. But the Córdoba case produced an unprecedented ruling in Argentina: the student was charged with serious injuries aggravated by gender violence.
While the case awaits trial, Clarín interviewed the Córdoba prosecutor handling this pivotal case, as well as experts in law and cybercrime, to understand how deep the legal vacuum is in Argentina when crimes have AI as an "accomplice."
An accusation like this needs details. And the prosecutor gives them all.
"The case is clearly one of gender violence, due to the position of the accused in relation to the victims and the sexual objectification to which they are subjected in the digital medium," says Pablo Cuenca Tagle, the prosecutor of the Family Violence jurisdiction who filed the charges and is leading the investigation.
Specifically, Law 26.485 on the comprehensive protection of women against violence, in its Article 6, Section I, provides for the category of "digital or telematic violence through conduct that affects the reputation of the victim," and expressly covers the "dissemination, without consent, of real or edited digital material, intimate or nude, that is attributed to women."
In short, says Cuenca Tagle, although "criminal law must necessarily be updated by creating new specific offenses involving AI," in this particular case "the conduct of the accused can be classified as psychological injuries, which do not require a specific means of commission."
Even though there is no specific criminal offense, he considers, "conduct of this type is not bullying or a joke, but can constitute a crime, due to the severe consequences it can have for the psyche of the person involved once these artificially created representations come to her knowledge."
At the time of writing, the 18-year-old remained at liberty, with a restriction barring him from approaching the complainants.
How are serious injuries aggravated by gender violence proven when there are no marks on the body? "One of the victims has already undergone a psychiatric examination and shows damage to her psyche. As for the other victim," the prosecutor clarifies, "the complaint was filed later and the expert report is not yet available, so in that case, for the moment, the complaint is for minor injuries."
The prosecutor ordered a search of the defendant's house, and the result was "positive": investigators found the computer and cell phone from which he fed the instructions (and the material) to an artificial intelligence program to create the sexual photos that he later uploaded to pornography sites. But before those incriminating devices were found, how did the girls know he was behind the fake content?
"First, because he had reportedly admitted to the facts in front of friends, and second, because the IP address identifying the author of the photos was traced."
The misdemeanor (contravention) charge against the accused falls under "harassment aggravated by gender violence." The difference from the criminal case is that a contravention is not a crime. "Not all victims react the same way or suffer the same consequences. So far, two have brought criminal actions for injuries, but the photos may not have caused such injury to the other victims," concludes the prosecutor, who could make history in AI jurisprudence.
"The faces are the faces of the girls; it does not matter that they have been given another body. If they are minors, the crime is the dissemination of child pornography. Unfortunately, disseminating pornographic images of adults is not a crime. This case forces us to consider urgent modifications to the code, especially because of AI," Daniela Dupuy, the criminal prosecutor for the City of Buenos Aires specializing in computer crimes, told Clarín.
AI in Justice: between the legal vacuum and the need to update the Code
In mid-August, the commission in charge of preparing the draft reform of the Penal Code met in the Comodoro Py courts and made progress on including crimes related to artificial intelligence.
The commission's vice president and judge of the Criminal Cassation Court, Mariano Borinsky, who led this reform project, explains to Clarín why, in the era of AI, it is necessary to "add text" to the law.
"It is possible that some cases of viralization of pornographic photos doctored with AI, as in this Córdoba case, can be analyzed under the figure of injuries aggravated by gender violence. It is not a traditional or direct framing, as with physical injuries, but it could be argued that this conduct causes psychological injuries, under a broader concept of 'injuries' understood as damage to mental health," says the expert in jurisprudence.
Indeed, this framing, he considers, "responds to the fact that today there is no specific rule that criminalizes this conduct (as there is in the proposed Penal Code draft). The law must be updated in this regard."
The draft specifically incorporates these offenses, with penalties ranging from 3 to 6 years in prison. Several aggravating factors are also contemplated, "one of these being when there is gender violence in the case or the victims are minors."
The provision in the new draft Penal Code reads:
"ARTICLE 123.- 1. A prison sentence of THREE (3) to SIX (6) years will be imposed on anyone who creates with artificial intelligence or by any means produces, finances, offers, trades, publishes, facilitates, divulges, distributes any representation of a person under EIGHTEEN (18) years of age engaged in explicit sexual activities or any representation of their genital parts for predominantly sexual purposes."
Some of the victims in the Córdoba case are under 18 years of age. What happens in these cases when AI is involved? Is it an aggravating factor?
When images of minors are doctored with AI, as the lawyer and computer crimes expert Daniel Monastersky explains, there are several elements to consider that could constitute a crime.
"The production of child sexual exploitation material: although the images are generated by AI, they are based on real photographs and represent them in explicit sexual situations; the simple possession of child sexual abuse and exploitation material; or the distribution and/or commercialization of child sexual exploitation material," he details.
"By creating false, sexualized versions of minors, it could be argued that there is a form of identity theft (a contravention in the City of Buenos Aires); including it as a crime in the national Penal Code should be considered urgent. Attempted extortion using the fake images could also be configured, and that would constitute an additional crime," Monastersky concludes.
The draft that brings AI into the classifications of the Penal Code will soon be submitted to the National Congress.