Jobseekers, it might be a good time to polish your profile.
Microsoft’s LinkedIn is debuting a new AI hiring tool that can draft lists of job requirements, sift through profile pages and recommend users to recruiters as potential “top candidates” for new postings, the platform announced Tuesday.
The technology, called Hiring Assistant, can recommend LinkedIn users to recruiters based on location, skills and other desired qualifications, executives said. It can also help draft multiple messages reaching out to those users, and even take care of simple pre-screening questions.
In other words: If you’ve got the right skills, your profile could end up in front of a recruiter — without ever sending in an application.
“[The Hiring Assistant] is designed to take on recruiters’ most repetitive tasks, so they can spend more time on the most important parts of their job,” Hari Srinivasan, LinkedIn’s vice president of product, said in a virtual media briefing.
The new assistant builds on LinkedIn’s existing AI-powered recruiting technology, which hirers on the site could already use to sort through potential candidates.
But the hiring assistant has other capabilities: As recruiters use the technology, the AI tool learns about their preferences and improves its recommendations, LinkedIn executives said.
LinkedIn’s new feature is the latest entrant in an increasingly crowded field of AI-powered hiring tools that are supposed to make it easier for companies to match qualified applicants with open jobs — though results have been mixed.
Advice for job hunters: Spiff up your profile page
How can jobseekers prepare for more recruiters to start using the tool? Packing your LinkedIn profile with plenty of detail is your best bet.
“Just showing what you’ve done and being able to highlight those skills — I think that’s going to be a very important thing as Hiring Assistant gets more adoption,” Srinivasan said, adding that the technology can “pick up on all kinds of nuance” in the details of a user’s work experience.
In a statement to MarketWatch, vice president of product engineering Erran Berger said that LinkedIn’s AI assistant recommends candidates based on the job qualifications defined by the recruiter. That could include years of experience, location, skills or other requirements.
LinkedIn’s privacy policy states that the social network may use users’ personal data to develop and train AI models and to gain insights with the help of AI “so that our services can be more relevant and useful to you and others,” the policy reads.
The Hiring Assistant software is currently only available for a select number of clients that use LinkedIn’s recruiting platform, the company said — a list that includes major companies like Canva, Siemens and semiconductor company AMD.
The AI tool will be sold as an add-on for LinkedIn’s existing recruiter clients, but the company hasn’t yet announced a price.
Concerns about AI and the hiring process
Many companies already use algorithmic tools to filter and rank job applicants. Some have gone even further, employing AI to analyze an applicant’s performance in a video interview or design games that test their skills or personalities.
Recruiters have previously told MarketWatch that AI hiring tools have felt even more essential these days as hiring teams find themselves overwhelmed — both by shrinking recruiter ranks and a surge in applications as the job market cools.
But as the use of artificial intelligence in hiring becomes more common, critics have argued that there’s a lack of transparency surrounding the way the tools are used — and that the software could potentially introduce bias or discrimination into hiring decisions.
“Workers often have no idea that an AI tool is being used, let alone how it works or that it might be discriminating against them,” Olga Akselrod, a senior staff attorney at the American Civil Liberties Union, told MarketWatch earlier this fall.
“Algorithmic tools are trained to make decisions based on historical data — based on what has happened before,” she said. “That data is going to bake in any biased human decisions and biased systems.”
Critics have pointed to infamous examples of AI missing the mark, from a scrapped tool at Amazon that showed bias against women to a more recent paper showing that AI models used by companies to evaluate text were biased against people with disabilities.
Some lawmakers have already moved to regulate the use of algorithms in hiring. New York City passed a law requiring employers who use so-called “automated employment decision tools” to submit those platforms to a third-party bias audit once a year.
LinkedIn’s hiring tool is primarily designed to help recruiters build talent pipelines, not help with final hiring decisions. Executives say that human supervision is “critical” while using the Hiring Assistant.
Berger told MarketWatch that the company takes its commitment to responsible AI “very seriously.” LinkedIn doesn’t provide any search filters for recruiters to target specific demographics, he said, and doesn’t allow recruiters to use hiring prompts that are discriminatory.
“We continuously work to ensure our systems can detect and eliminate any unintentional biases that might come up in the hiring intake or evaluation processes,” Berger said. That includes any potential bias in the algorithm that could impact how people meet qualifications, he said.
Berger did not respond directly to a question asking whether the company would submit its AI Hiring Assistant for a third-party bias audit.
“If harmful biases are identified, we work to address them,” he said.