
Why AI Needs A Code of Ethics

September 12, 2019
 
iCIMS Staff
3 min read

Data breaches in 2019 have increased by 54% since 2018. Since January, 4.1 billion records have been exposed, and there is still a full quarter left in the year. As the headlines in your daily feed show, this is a rising concern for the public, but it should also be a top concern for businesses.

From a recruiting standpoint, it doesn't matter what industry you're in: if you employ people, you can potentially put your applicants at risk of data exposure.

Think about it. We're gearing up for seasonal hiring and most organizations are in the midst of 2020 strategic planning. That means you're ramping up headcount and likely collecting data on numerous candidates for every open requisition. It adds up quickly. One quarter of all data breaches last year were caused by human error, and the average cost of a breach in the same period was $3.92 million.

It doesn’t matter if you’re Tim Cook (who did have a small data breach last year). Your company can’t afford a data breach. It can be detrimental to your brand.

Why?

Candidates Trust You

Candidates submit personal information to employers for a very specific purpose: they want a job. The understanding between candidates and employers is that this personal information is necessary to make an informed hiring decision.

They expect employers to keep this data safe and confidential, regardless of what it entails or how it was collected. If your company were to suffer a data breach, you would lose the trust of potential candidates as well as the candidates already in your queue. Would you want to work for an organization that leaked all your personal information?

The Government Holds You Responsible

Recognizing the need to hold businesses accountable for the staggering number of data breaches to date, the government is stepping in. We've seen California lay the groundwork for an ethical code of conduct in the wake of GDPR through its California Consumer Privacy Act.

While the CCPA is still in its infancy, the U.S. Department of Defense is adding fuel to the fire. Last week, the Department of Defense released a new draft of cybersecurity standards. These standards focus on government contractors in order to help defend against hacks. While they target a narrow group of organizations, they are a clear first step.

We are likely going to see more laws and regulations enforced by local, state and federal agencies.

Ethics Goes Beyond Data Breaches

In addition to the threat of data exposure, there are other ethical components to consider when it comes to using AI and candidate data. As we continue to use AI in recruitment, we need to understand that it can compound and expose bias, and we must be prepared to address the fact that the algorithms behind it are not neutral.

Machines are not inherently ethical. They behave according to the data and training we give them, and despite our best efforts, we pass our own biases along when we teach them. Data scientists and programmers are working diligently to address this issue.
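To make that concrete, here is a minimal, hypothetical sketch of one common bias audit: the "four-fifths rule" (adverse impact ratio), which compares how often a screening model advances candidates from different groups. The group labels, numbers, and function below are illustrative assumptions only, not part of any specific recruiting product.

```python
# A minimal sketch of a "four-fifths rule" (adverse impact ratio) check.
# All data and names here are hypothetical illustration, not real candidates.
from collections import defaultdict

def adverse_impact_ratio(outcomes):
    """outcomes: list of (group, advanced) tuples, where advanced is True
    if the screening model moved the candidate forward."""
    totals, advanced = defaultdict(int), defaultdict(int)
    for group, moved_on in outcomes:
        totals[group] += 1
        if moved_on:
            advanced[group] += 1

    rates = {g: advanced[g] / totals[g] for g in totals}
    best = max(rates.values())
    # Ratio of each group's selection rate to the highest-selected group;
    # values below 0.8 are a conventional red flag worth investigating.
    return {g: rate / best for g, rate in rates.items()}

sample = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
       + [("group_b", True)] * 25 + [("group_b", False)] * 75
print(adverse_impact_ratio(sample))  # group_b ratio = 0.625 -> flag for review
```

A check like this doesn't make an algorithm ethical on its own, but it shows the kind of ongoing scrutiny that AI-assisted screening requires.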

To help protect your company, stay up to date on the latest legislation and trends in recruitment technology. Recruitment tech evolves every day, and while it helps automate and streamline your recruitment team's day-to-day work, it must be handled responsibly. AI needs a code of ethics to safeguard the data of all candidates.
