
SACRAMENTO-

Several images of someone’s daughter, sister and classmate are all over Instagram. The pictures were snapped on Androids or iPhones, then shared and posted through the app.

In almost all the photos, it appears the teen girls snapped the pictures themselves, perhaps passing them on to someone they trusted. That trusted somebody then shared them with the person behind the Instagram account, likely without the girls’ permission.

The teen girls are naked and exploited, and it’s unknown who is running the account.

Although some details are unclear and difficult to track down, one thing is crystal clear: the person behind the account wanted pictures from local Sacramento County high schools, requesting photos of teens from Inderkum, Natomas High, and Grant.

“It’s scary. It’s scary because I don’t understand it,” said David Abarca, a Sacramento resident and grandfather.

FOX40 took the pictures to the Sacramento County Sheriff’s Department High-Tech Crimes Bureau to see if the pictures were child pornography.

“Showing a girl’s breast topless isn’t considered to be child porn,” said Detective James Williams with the Sacramento County Sheriff’s Department.

“It’s frustrating because most people wouldn’t want their child’s breast on the internet,” Williams said.

Law enforcement can’t step in because no genitals or sex acts are shown in the pictures.

“Legally we can’t investigate it, because it’s not a crime,” Detective Williams said.

The Sheriff’s Department had received no complaints from any of the local school districts; FOX40 tipped them off to the problem. The bureau showed us how they would investigate if it were a criminal offense.

“It will take a little bit, and I can use them, even though we don’t have anything I can use a search warrant on,” said Detective Williams.

An investigation could happen in the future. The accounts featuring the nude photos had thousands of followers. They were removed or deleted in the last 24 hours.

Instagram does have a ‘no nudity’ policy. The company’s spokesperson sent FOX40 a response to our story:

“Once content is reported to us, we work quickly to review and remove it if it violates our guidelines,” said Alison Schumer, a spokesperson in Instagram’s public relations department.

“Similar to Facebook and other services, Instagram uses a technology called PhotoDNA which helps in finding and removing images of child sexual exploitation. We report all instances of exploitative content to The National Center for Missing and Exploited Children (NCMEC),” said Schumer.