
Recruiting Scams Using AI-Generated Applicants and Credentials Provide New Reasons for Employers to Update Recruiting Processes

By: Mitchell J. Rhein

The President of the United States poses with a lightsaber flanked by the American flag and eagles. Taylor Swift offers free cookware sets due to a packaging error. Ukraine’s former Foreign Minister calls a United States Senator to gather political information. While no one was fooled into believing President Trump owns a lightsaber, scammers are using AI-generated photos, videos, or voices to trick people into providing confidential or sensitive information. These schemes now target employers seeking to hire for remote jobs.

In January, the FBI warned that North Korean operatives were posing as candidates for remote jobs to gain access to employers' confidential information and systems. Nearly every major company has received applications from, or even hired, North Koreans who used AI-generated interview answers or deepfake video to participate in interviews. According to the Financial Times, one quarter of all job applicants worldwide will be fake by 2028.

To avoid falling prey to these scams, hiring managers do not need a computer science degree from Stanford. Instead, employers must rely on good old-fashioned caution and due diligence when hiring for remote roles. Here are three quick tips to mitigate the risk of hiring an AI-generated applicant:

Pick up the phone and make calls.

Fabricated or exaggerated employment experience is not a new scheme. However, few employers actually pick up the phone and contact the former employers listed on an applicant's resume. No law prohibits an employer from contacting an applicant's former employers. While employers must be careful not to ask questions that may screen applicants based on a protected characteristic, the calls should confirm that the applicant's former employers actually exist, that the applicant worked there, and that the applicant held the positions listed on the resume. If these calls uncover inconsistencies, the employer should ask the applicant for more detailed information before making a hiring decision.

Use in-person interviews.

Schemes using AI-generated content to bamboozle employers rely on a remote interview process. While it may not be cost-effective to conduct in-person interviews for every role, employers should consider an in-person interview before deciding whether to hire an applicant. This is especially important if the applicant will have access to sensitive information or systems. When in-person interviews are impossible or impractical, employers should require applicants to appear on camera during remote interviews or to share their screen to review documents such as their resume.

Exercise caution with products that purport to screen applicants.

To combat scams aided by AI-generated content, vendors may offer products that purport to screen applicants for such content. Employers must exercise caution before buying or implementing such products. Any system that screens applicants could have a disparate impact on certain legally protected groups. Employers must investigate how any screening product was developed and whether there is a risk of disparate impact before implementing the product.

Spilman’s labor and employment attorneys are available to assist in evaluating hiring practices and mitigating the risk of AI-generated applicants.