This week I am going to write about computer literacy within Human Resource Management (HR). HR is a broad support industry that every company relies on to some degree. While the exact functions can vary based on the industry or company being supported, there are common functions throughout. Overall, HR is about supporting the people aspect of a company: talent acquisition and retention, managing employee pay and benefits, and workforce monitoring and analysis (ADP, n.d.).
As degree seekers, we are likely to be dealing with HR recruitment systems in the future as we strive to put our earned knowledge to use. For years, HR departments have been using resume analysis programs to weed through applications in the hope of finding the best applicant without the interference of human bias (Dastin, 2018). However, reviews of the results revealed more of the same bias. As Martin (2018) puts it:
“This is one major drawback to AI, where whatever goes in is what goes out. This means that if there is already a bias in the hiring process of things like years of experience and certainly preferred degrees over skills or men over women in tech, the AI bot only knows what it is being told” (para. 4).
The user may think that the cold logic of the computer is infallible, when what they are actually getting is the repackaged logic of the people who came before. Dastin (2018) points out that the transfer of these biases is likely unintentional but still a natural result of the training data.
Imagine you have recorded and ranked all the socks you have ever purchased. Early in your life, your mom only got you SoccySocks. When you got out on your own, you purchased a few CallySocz because they were convenient, but never got around to trying MadSocks. You enter all these sock rankings into a computer and ask what kinds of socks you should buy next. SoccySocks has more scores than CallySocz, so it gets recommended higher. The program doesn’t extrapolate that CallySocz has a new cotton bamboo blend that you would like to try if you knew about it. The program has no basis to make any conclusions about MadSocks at all.
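To make the analogy concrete, here is a minimal sketch of such a recommender. The brand names and scores are invented for illustration; the point is that the program can only rank brands it has already seen, and the brand with the most recorded scores wins by sheer volume.

```python
from collections import defaultdict

# Your historical purchase ratings (1-5). SoccySocks dominates simply
# because it appears more often in the history, not because it is better.
ratings = [
    ("SoccySocks", 4), ("SoccySocks", 5), ("SoccySocks", 4),
    ("SoccySocks", 3), ("CallySocz", 4), ("CallySocz", 5),
]

def recommend(history):
    """Rank brands by total accumulated score across past purchases."""
    totals = defaultdict(int)
    for brand, score in history:
        totals[brand] += score
    # Highest total first: familiarity, not quality, drives the ranking.
    return sorted(totals, key=totals.get, reverse=True)

print(recommend(ratings))                 # ['SoccySocks', 'CallySocz']
print("MadSocks" in recommend(ratings))   # False: never purchased, so never recommended
```

Note that CallySocz actually has the higher average rating (4.5 vs. 4.0), yet it still loses; and MadSocks, having no history at all, can never appear in the output.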
This example may sound silly, but if you apply this idea to the colleges on applicants' resumes, you can see how a program would keep with known successes, regardless of the actual causes of those successes. In the real world, this manifests as the resume programs downgrading applicants from women’s colleges (Dastin, 2018).
This brings me to the computer literacy of HR professionals who would be using or implementing resume analysis programs. I do not expect HR professionals to be able to write or understand the line-by-line code of every program they are using, nor do I expect that of anyone, even programmers. What I do recommend is an understanding of logic flows: if this, then that. HR professionals should know the connection between the input and output of the software they use well enough (or have access to meaningful enough explanations for reference) that they could examine a resume or situation themselves and anticipate what the program will output.
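A hypothetical sketch of what an "if this, then that" logic flow might look like: the field names, thresholds, and rules here are invented, but a rule set written at this level is something an HR professional could apply by hand to a resume and then check against the program's output.

```python
def screen(resume):
    """Apply explicit, traceable screening rules to a resume (a dict).

    Returns a (decision, reason) pair so a human reviewer can see
    exactly which rule fired, not just the final verdict.
    """
    if resume["years_experience"] >= 3:
        return "advance", "meets experience threshold"
    if "Python" in resume["skills"]:
        return "advance", "has required skill"
    return "reject", "no rule matched"

print(screen({"years_experience": 5, "skills": []}))
# ('advance', 'meets experience threshold')
print(screen({"years_experience": 1, "skills": []}))
# ('reject', 'no rule matched')
```

A real resume-analysis system is far more opaque than this, which is exactly the problem: when the rules cannot be stated this plainly, the user needs some other meaningful explanation of how input maps to output.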
This may delve more into critical thinking skills than computer literacy itself, but an ability without the critical eye to apply it well is meaningless at best, and actively harmful at worst.
ADP. (n.d.). What is human resource management? ADP.com. https://www.adp.com/resources/articles-and-insights/articles/h/human-resource-management.aspx
Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters.com. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Martin, N. (2018, December 13). Are AI hiring programs eliminating bias or making it worse? Forbes.com. https://www.forbes.com/sites/nicolemartin1/2018/12/13/are-ai-hiring-programs-eliminating-bias-or-making-it-worse/?sh=4484d93a22b8