The problem is, when you can prove discrimination from human headhunters, you can sue them. When the company hides behind an algorithm, they simply say they went by what the algorithm recommended and they’re off the hook.
Even before the additional layer of algorithmic, now AI, screening, it's not like a disabled, out-of-work, likely financially fucked person can afford a lawyer.
Well, things certainly weren’t easy or rosy when humans did the discrimination. But at least there was the possibility of holding them accountable, either by disabled individuals or by associations. When companies let machines do the discrimination, they shield themselves from responsibility, making violations all the more egregious: without consequences, they won’t even try to exercise restraint.