In an effort to fight online child pornography, a researcher from the Polytechnic Institute of New York University has developed software that allows authorities to sift through deleted photos in a computer’s trash to search for “potentially explicit images of children.”
The program scans for faces of children, nudity and other features to help flag images that could possibly be illegal contraband.
Using specialized techniques, the software can measure the distance between a person’s eyes and nose to estimate whether the subject is a child. However, photos must show a completely frontal view of the face, which many of these kinds of photos do not.
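The article does not disclose the actual algorithm, but the idea of classifying a face as a child’s from eye-and-nose geometry can be sketched roughly as below. The ratio, the landmark names, and the threshold value are all illustrative assumptions, not details from the researcher’s software.

```python
import math

def euclidean(p, q):
    """Straight-line distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_nose_ratio(left_eye, right_eye, nose_tip):
    """Ratio of interocular distance to eye-midpoint-to-nose distance.

    Children's faces tend to have proportionally wider-set eyes relative
    to the eye-to-nose span, so a larger ratio loosely suggests a younger
    subject. The geometry is standard; the interpretation is a heuristic.
    """
    interocular = euclidean(left_eye, right_eye)
    eye_midpoint = ((left_eye[0] + right_eye[0]) / 2,
                    (left_eye[1] + right_eye[1]) / 2)
    return interocular / euclidean(eye_midpoint, nose_tip)

def looks_like_child(left_eye, right_eye, nose_tip, threshold=1.45):
    """Flag a face as possibly a child's when the ratio exceeds threshold.

    threshold=1.45 is a made-up placeholder, not a value from the article.
    Assumes a frontal face, mirroring the limitation described above:
    landmarks from a turned or tilted face would distort the ratio.
    """
    return eye_nose_ratio(left_eye, right_eye, nose_tip) > threshold
```

A classifier like this would only be one weak signal among many (face detection, nudity detection, and so on), which helps explain the modest 70% accuracy the article cites.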
The software was designed to help law enforcement capture sex offenders at a time when this kind of illegal activity is on the rise. Although it has proven to be only 70% accurate, it aims to alleviate some of the difficulties authorities face in fighting this problem. One MAJOR hurdle is that in developing this kind of software, it is illegal not only for sex offenders to view child pornography but also for the people building the software to fight it.
While this takes some great strides in online regulation, is it completely ethical? Is digging through someone’s digital trash for evidence of child pornography, at only 70% accuracy, fair to those being accused? Should we hold off on implementing software like this until all the kinks are worked out? Or is this software, even with its kinks, an immediate necessity?