AI cameras discriminate against students of colour in NYC schools

Last year, cameras from a Canadian company were installed in New York schools to detect weapons and prevent shootings. However, it appears the company lied about the accuracy of its facial recognition cameras.


Indeed, documents provided by SN Technologies claimed that its algorithm, id3, had been vetted by the National Institute of Standards and Technology (NIST) and ranked 49th out of 139 in tests for racial bias. However, a NIST scientist denies having tested it.


This came to light after schools discovered that false positives were more likely to affect Black students, flagging them as suspected criminals when they were not. A report revealed the AI software was even worse at recognizing Black people than the company had claimed when it sold the cameras to the schools. The algorithm also misidentified objects such as broom handles as guns.


Parents have since sued the New York State Education Department (NYSED) over its approval of facial recognition AI in city schools.
