“Research has shown that some emotional AIs disproportionately attribute negative emotions to the faces of black people, which would have clear and worrying implications if deployed in areas such as recruitment, performance evaluations, medical diagnostics or policing.”
this, plus it really fucks over autistic folks (some of whom are also Black, of course). what my face is doing at any specific moment does not necessarily correlate with my true emotions at that moment.
my only qualm is that it should read "...some emotional AIs are programmed to disproportionately attribute negative emotions..."
they aren't doing this of their own free will. it's the end result of how they were programmed to process input and generate output.
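that last point can be made concrete with a toy sketch (entirely made-up data and a deliberately dumb "model", not how any real system works): a classifier has no intent, it just maps inputs through whatever it learned from training data. if the training labels are skewed against one group, the skew comes straight back out of the model.

```python
# Toy illustration with invented data: the "model" simply memorizes the
# most common emotion label it saw per group during training. No free
# will anywhere -- biased labels in, biased predictions out.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (group, emotion_label) pairs."""
    seen = defaultdict(Counter)
    for group, label in examples:
        seen[group][label] += 1
    # the trained "model" is nothing but the majority label per group
    return {g: counts.most_common(1)[0][0] for g, counts in seen.items()}

# Hypothetical skewed annotations: group "B" was labeled "angry" more often
training_data = (
    [("A", "neutral")] * 8 + [("A", "angry")] * 2 +
    [("B", "neutral")] * 4 + [("B", "angry")] * 6
)

model = train(training_data)
print(model["A"])  # -> neutral
print(model["B"])  # -> angry: the annotation skew is reproduced exactly
```

real emotion-recognition systems are far more complex, but the causal chain is the same: the output is a deterministic consequence of training data and code, which is exactly why the responsibility sits with the people who build and deploy them.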