HR - watch out for AI creep in staff assessment systems

We have seen plenty of publicity about how the social media giants can monitor us online and monetise what they find by selling advertising to those who want to reach just such a target market.

Plenty of outrage has been generated by this invasion of privacy. We can marvel or be alarmed at how algorithms can predict our behaviour better than we can ourselves. Digitally we are our own worst enemy.

However, it does not stop there. Algorithms are increasingly being applied to manage people by linking performance stats with personnel data and online usage.

Artificial Intelligence is being used to monitor performance targets and to measure individuals' progress towards meeting them. A number of firms already use these techniques without employees automatically having the right to access the data created about them.
(I can’t quite bring myself to name the companies I have read about who use these techniques, nor even to name any of the software suppliers involved. That perhaps indicates what I think about this.)

The Guardian is on the case, needless to say, to point out some of the pitfalls of this digital dissection, as is the TUC, which warns against the use of surveillance technology without the full cooperation of the workforce.

Companies and those who work for them should be asking a number of questions. If this is being done or is planned, is it completely open and understood by everyone involved? Do we know what is being analysed and, even more crucially, what is being ignored? Who is making the value judgements, and on what basis?

The worry is that we end up measuring only what can be machine-measured, which means the ideal employee ends up being a machine (at least in a machine's eyes!). It will be dangerous to use Artificial Intelligence unintelligently, unless you want to be entirely staffed by robots.

Published on Apr 10, 2019 by Neil Thomas