It's annoying that both articles are calling this an AI error. This was human error: the police did the wrong thing, and the people of Fargo will end up paying for this fuckup.
I would argue it was both. No doubt the company marketed it in a way that made it seem very reliable. And all of the procedural failures afterwards made the error so much more damaging.
But imo this is why local police departments should not have access to this kind of tool. It is too powerful, and the statistical interpretation is too complicated for random North Dakota cops to use responsibly. Neither the company nor the PD has an incentive to be careful.
It's not an AI error. The face recognition AI simply said it was a "potential match", which is correct. It's the humans' job to confirm that a potential match is in fact a match, especially when the suspect is 1,900 km away.
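To make the statistics point concrete: here's a back-of-the-envelope sketch of why a raw "potential match" against a large database is weak evidence on its own. Every number below (gallery size, error rates, enrollment odds) is made up for illustration; none come from the article.

```python
# Back-of-the-envelope sketch of the base-rate problem with face
# recognition searches. ALL numbers are assumptions, not from the article.

gallery_size = 1_000_000      # assumed: photos the system searches against
false_match_rate = 1e-4       # assumed: 0.01% chance of flagging a wrong face
true_match_rate = 0.99        # assumed: chance of flagging the right face, if enrolled
p_suspect_in_gallery = 0.5    # assumed: odds the perpetrator is even in the database

# Expected number of innocent people flagged per search: a tiny per-face
# error rate, multiplied across a huge gallery, still yields many hits.
expected_false_flags = gallery_size * false_match_rate

# Expected number of correct flags per search.
expected_true_flags = p_suspect_in_gallery * true_match_rate

# Fraction of "potential matches" that actually point at the perpetrator.
posterior = expected_true_flags / (expected_true_flags + expected_false_flags)

print(f"Expected false flags per search: {expected_false_flags:.0f}")
print(f"Chance a given 'potential match' is the real suspect: {posterior:.2%}")
# -> roughly 100 false flags and about a 0.5% chance under these assumptions,
#    which is why a raw match should trigger investigation, not an arrest.
```

Under these assumed numbers, a flagged match is overwhelmingly likely to be a false positive, which is exactly the interpretation step the humans skipped.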
https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.