5/10/2024

TEEN STUDENTS TEAR: ALARMING A.I. ESSAY

OVER the last two months, deepfake nude incidents have spread in schools, including in Richmond, Ill., and Beverly Hills and Laguna Beach, Calif.

Yet few laws in the United States specifically protect people under 18 from exploitative A.I. apps.

That is because many current laws that prohibit child sexual abuse material or adult nonconsensual pornography - involving real photos or videos of real people - may not cover A.I.-generated explicit images that use people's faces, said U.S. Representative Joseph D. Morelle, a Democrat from New York.

Last year, he introduced a bill that would make it a crime to disclose A.I.-generated intimate images of identifiable adults or minors. It would also give deepfake victims, or their parents, the right to sue individual perpetrators for damages.

"We want to make this so painful for anyone to even contemplate doing, because this is a harm that you just can't simply undo," Mr. Morelle said.

"Even if it seems like a prank to a 15-year-old boy, this is deadly serious."

U.S. Representative Alexandria Ocasio-Cortez, another New York Democrat, recently introduced a similar bill to enable victims to bring civil cases against deepfake perpetrators.

But neither bill would explicitly give victims the right to sue the developers of A.I. nudification apps, a step that trial lawyers say would help disrupt the mass production of sexually explicit deepfakes.

"Legislation is needed to stop commercialization, which is the root of the problem," said Elizabeth Hanley, a lawyer in Washington who represents victims in sexual assault and harassment cases.

The U.S. legal code prohibits the distribution of computer-generated child sexual abuse material depicting identifiable minors engaged in sexually explicit conduct.

Last month, the Federal Bureau of Investigation issued an alert warning that such illegal material included realistic child sexual abuse images generated by A.I.

Yet fake A.I.-generated depictions of real teenage girls without clothes may not constitute "child sexual abuse material," experts say, unless prosecutors can prove the fake images meet legal standards for sexually explicit conduct or the lewd display of genitalia.

Some defense lawyers have tried to capitalize on the apparent legal ambiguity.

A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who created nude A.I. images of a female classmate, from viewing or sharing the pictures because they were neither harmful nor illegal.

Federal laws, the lawyer argued in a court filing, were not designed to apply "to computer-generated synthetic images that do not even include real human body parts." [The defendant ultimately agreed not to oppose a restraining order on the images.]

This Master Essay will continue on a regular basis. The World Students Society thanks Natasha Singer.
