After blocking porn, firms now scan for child sex images

LONDON, Oct 19 — The first alarm came within a week. It meant an Ericsson AB employee had used a company computer to view images categorised by law enforcement as child sexual abuse.

“It was faster than we would have wanted,” says Nina Macpherson, Ericsson’s chief legal officer.

In a bid to ensure none of its 114,000 staff worldwide were using company equipment to view illegal content, the Swedish mobile-networks pioneer installed scanning software from Netclean Technologies AB in 2011. While many companies have since adopted similar measures, few have been willing to discuss their experience publicly.

Ericsson’s move may have made it the first big company to scan employees’ computers for indecent images of children rather than just blocking online pornography, according to Michael Moran, a director at Interpol’s child exploitation unit. The key difference is that child-abuse material depicts a crime being carried out, and notifying police helps them find the people making it and prosecute those viewing it.

“You can actually save a kid from the abuse they are experiencing by recognizing, reporting and removing it,” Moran says.

Netclean’s software scans web searches, e-mail, hard drives and memory sticks for specific images or videos already classified as child pornography. It uses image fingerprinting to ensure that it recognizes blacklisted photos even if they are moved around the internet, between computers, or modified. Netclean says its software finds illegal images on about one in 1,000 computers across the hundreds of clients it now serves.
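Netclean’s actual fingerprinting technology is proprietary, but the general approach it describes, matching a compact “fingerprint” of an image against a blacklist of known hashes even after minor edits, can be illustrated with a perceptual hash. The sketch below uses a simple difference hash (dHash) and a Hamming-distance threshold; the function names, the `BLACKLIST` values and the threshold are hypothetical and are not Netclean’s algorithm.

```python
# Illustrative sketch only: Netclean's fingerprinting is proprietary. This shows
# the general idea of a perceptual "difference hash" (dHash) compared against a
# blacklist of known fingerprints, tolerating small modifications such as
# resizing or re-compression via a Hamming-distance threshold.
from PIL import Image  # pip install Pillow


def dhash(path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash for the image at `path`."""
    # Grayscale and shrink to (hash_size + 1) x hash_size so each row yields
    # hash_size left-vs-right brightness comparisons.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical blacklist of fingerprints; in practice these would be supplied
# by law enforcement from images already classified as illegal.
BLACKLIST = {0x8F3C6A915E27D0B4, 0x13579BDF02468ACE}


def matches_blacklist(path: str, max_distance: int = 5) -> bool:
    """Flag the file if its hash is within `max_distance` bits of a known hash."""
    h = dhash(path)
    return any(hamming(h, known) <= max_distance for known in BLACKLIST)
```

Because the hash is derived from the image’s overall brightness structure rather than its exact bytes, a copy that has been resized or lightly edited still lands within a few bits of the original fingerprint, which is what allows matching without storing or displaying the images themselves.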

Since installing the system, Ericsson says it has been dealing with around one alarm each month — each one flagging an act that could lead to prosecution.

“Our aim is not to become a law enforcement agency, but we want to get this stuff out of our system. It’s an unacceptable use of our networks, illegal, and against our policies,” Macpherson adds.

‘No false positives’

The alerts — invisible to the person who triggers them — are sent via e-mail and text message to Ericsson’s group security adviser, Patrik Håkansson, a former detective chief inspector from Sweden’s National Police IT Crime Squad. He’s confident that the digital fingerprint system means the software only raises the alarm when it detects images already on an international child abuse blacklist.

“There are no false positives; the technology won’t show up any pictures of children on the beach,” says Håkansson.

His job is to confirm that the illegal pictures have indeed been handled on company equipment, and by whom. In the US the FBI must be called immediately. In other markets Ericsson can carry out some internal investigations before involving law enforcement.

The perpetrator is typically fired, unless digital forensic investigators can establish there has been a genuine mistake, Macpherson says. — Bloomberg