Jacky Hazan is the chief executive and co-founder of Tel Aviv-based Intervyo. Courtesy of manufacturer

Job seekers applying to Air Canada and other Canadian employers could encounter a new step in the interview process: artificial-intelligence software that analyzes their answers, facial gestures and voice tonality to issue a score indicating their suitability for the role.

Earlier this year, Air Canada highlighted a partnership with a little-known company called Intervyo to insert AI into its hiring practices. “Based on the pilot project, we are seeing better [fits] for the job, less turnover and realizing efficiencies by not having to host brick and mortar [job] fairs,” said Arielle Meloul-Wechsler, the airline’s senior vice-president of people, culture and communications, at the company’s investor day in February.

Proponents argue that machine-learning algorithms can reduce bias and save employers time and money by automatically screening video interviews. “It sounds like science fiction, and I’m glad it does,” said Jacky Hazan, chief executive and co-founder of Tel Aviv-based Intervyo.

But experts in various fields are concerned about the reliability and lack of transparency of the technology, and whether it could in fact exacerbate bias.

The technology has attracted scrutiny in the United States. The state of Illinois passed a bill earlier this year requiring employers to obtain consent from applicants if their video job interviews will be assessed by artificial intelligence. The largest vendor in the U.S. is a Utah-based company called HireVue, whose platform is used by Unilever and Hilton, among others. The company says it can determine someone’s “employability” by assessing conscientiousness, emotional awareness and other traits.

A number of Canadian companies use HireVue’s basic video-interviewing platform – AI is an added feature – and a spokesperson could not confirm whether any Canadian companies use the AI assessments. However, Unilever Canada said that it does, and PricewaterhouseCoopers in Canada said it is exploring the possibility.

A Toronto-based startup called Knockri sells its own AI video-assessment tool, with clients such as IBM and Education First, a global language-training company that operates in Canada. (IBM said Knockri is in “limited deployment” in the U.S. and Britain and declined to answer questions. Education First also declined to comment.)

The software is used during the first stage of the hiring process. Typically, job candidates record videos of themselves answering questions and the software converts the speech to text, which is then parsed by an algorithm. Vendors say this analysis is grounded in concepts from industrial-organizational psychology, a field concerning workplace behaviour.

Applicants could be asked questions that gauge their empathy levels or ability to collaborate, for example. The software also analyzes facial expressions during the video interview, such as whether people are smiling, and grades applicants on various criteria to determine the overall fit, at which point a human hiring manager takes over.
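None of the vendors publish their code, but the flow described above – transcribe the answer, score the transcript for traits, fold in facial metrics and hand the result to a recruiter – can be sketched in a few lines. Everything here is a stand-in: the keyword lists, the smile metric and the weights are hypothetical, not any vendor’s actual models.

```python
# A minimal sketch of the screening flow described above. The trait
# scorers are crude stand-ins (keyword counts), not any vendor's models.

from dataclasses import dataclass

# Hypothetical trait lexicons; real systems use trained classifiers.
TRAIT_KEYWORDS = {
    "empathy": {"listen", "understand", "support", "feel"},
    "collaboration": {"team", "together", "share", "help"},
}

@dataclass
class InterviewResult:
    transcript: str     # speech-to-text output of the recorded answer
    smile_ratio: float  # fraction of video frames classified as smiling

def score_trait(transcript: str, trait: str) -> float:
    """Crude proxy: fraction of a trait's keywords present in the answer."""
    words = set(transcript.lower().split())
    return len(words & TRAIT_KEYWORDS[trait]) / len(TRAIT_KEYWORDS[trait])

def overall_fit(result: InterviewResult) -> float:
    """Blend text-derived trait scores with a facial metric into one score."""
    text_score = sum(
        score_trait(result.transcript, t) for t in TRAIT_KEYWORDS
    ) / len(TRAIT_KEYWORDS)
    # The weighting is arbitrary here; vendors do not disclose theirs.
    return 0.8 * text_score + 0.2 * result.smile_ratio

candidate = InterviewResult(
    transcript="I listen to my team and we solve problems together",
    smile_ratio=0.6,
)
print(f"fit score: {overall_fit(candidate):.2f}")  # a recruiter takes over
```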

The process reduces unconscious bias, resulting in fairer assessments, proponents argue. “Creating inclusive technology is a core component of Knockri,” said Jahanzaib Ansari, CEO and co-founder of the Toronto startup. Mr. Ansari started the company after discovering, while job hunting a few years ago, that he got callbacks from employers only when he anglicized his name on his résumé.

Machine-learning algorithms are only as effective as the data used to train them. In 2017, Amazon.com Inc. shut down a résumé-sorting program because it showed bias against women. The algorithm was trained on existing applications to the company – which were predominantly from men – and learned to treat female applicants as less preferable.
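A toy model makes the failure mode concrete. The data below is fabricated for illustration – it is not Amazon’s – but it shows how a classifier fit to skewed historical decisions learns the skew itself:

```python
# Toy illustration of the failure mode: a model fit to skewed historical
# hiring decisions learns the skew. The data is fabricated, not Amazon's.

from sklearn.linear_model import LogisticRegression

# Features: [years_of_experience, proxy_feature_correlated_with_gender]
# Labels: 1 = historically hired, 0 = rejected. Experience is identical
# across groups; only the past decisions differ.
X = [[5, 0], [6, 0], [4, 0], [5, 1], [6, 1], [4, 1]]
y = [1, 1, 1, 0, 0, 0]

model = LogisticRegression().fit(X, y)

# The model now penalizes the proxy feature, reproducing the old bias:
# two equally experienced candidates get opposite predictions.
print(model.predict([[5, 0], [5, 1]]))  # expected: [1 0]
print(model.coef_)  # negative weight on the proxy feature
```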

“If you're trying to determine what your future perfect candidate looks like and you're using historical data, which is obviously what you have to do because you don't have future data, then you have to be extremely mindful of what that might reveal,” said Carole Piovesan, co-founder of Toronto law firm INQ Data Law.

Knockri said it amassed its training data by having a diverse set of recruiters at client companies manually score videos from job applicants. “They’ve evaluated tens of thousands of videos,” said Maaz Rana, a co-founder and chief operating officer. Nearly 430 people have been hired through Knockri’s software.

Mr. Rana said Knockri constantly monitors its algorithms for signs of bias. If the software encounters something new, such as an accent, it will flag the video for a recruiter to review. “We want to make sure nobody’s left out,” he said.

It can be difficult to detect whether an algorithm inadvertently discriminates, said Manish Raghavan, a PhD candidate in the computer-science department at Cornell University, in Ithaca, N.Y. One trained to favour applicants who smile a lot, for example, could penalize someone from a culture where smiling is not as common. “You may still be inadvertently disadvantaging people without meaning to,” Mr. Raghavan said.
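One common way to audit for the kind of inadvertent disadvantage Mr. Raghavan describes is to compare selection rates across groups. The sketch below applies the U.S. regulators’ “four-fifths” guideline; the group data is invented for the example:

```python
# One common audit for inadvertent disadvantage: compare how often each
# group passes the screen. The 0.8 cutoff is the U.S. EEOC "four-fifths"
# guideline; the pass/fail data below is invented.

def selection_rate(outcomes: list[bool]) -> float:
    """Fraction of candidates in a group who advanced."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(group_a: list[bool], group_b: list[bool]) -> float:
    """Ratio of the lower group's selection rate to the higher one's."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# Pass/fail decisions the screening software made for two groups.
group_a = [True, True, True, False]    # 75% advanced
group_b = [True, False, False, False]  # 25% advanced

ratio = adverse_impact_ratio(group_a, group_b)
print(f"adverse impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("below the four-fifths guideline - worth auditing the model")
```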

Intervyo founder Mr. Hazan said the source of his company’s training data is confidential. But he insisted that the technology was sophisticated.

“It’s way beyond trying to understand whether the person is smiling,” he said. “Rather, what does it mean if an individual is feeling positive or negative in a certain question? How does it relate to his or her competencies?”

Air Canada declined to answer questions about how it uses Intervyo, including whether applicants are aware that they are being assessed by AI, saying it was “premature.” At the investor day, Ms. Meloul-Wechsler said Intervyo was used by in-house recruiters, and the software “will read the facial reactions from the candidates” and “redirect the recruiter to probe further in key areas.”

Intervyo’s website, which bears the slogan “predict greatness,” touts “advanced microfacial-gesture analysis” to reveal “hidden emotional intentions.”

Facial analysis makes some researchers uneasy. Software can detect whether someone’s facial muscles are contorting into a smile or a frown, but it’s not possible to determine the emotional intent behind those movements with a high degree of reliability, said Lisa Feldman Barrett, a psychology professor at Northeastern University in Boston. “Anybody who claims otherwise is not using science,” she said.

Dr. Barrett and four senior scientists from around the country spent more than two years reviewing roughly 1,000 studies of facial movements and emotions. Published earlier this year in the journal Psychological Science in the Public Interest, their review concluded that there are no universal facial expressions for six of the most studied emotions, including anger, happiness and sadness.

How people move their faces varies not only across cultures and geographies, but among people of the same background. People might scowl when they’re angry – or when they’re concentrating. Assuming emotional intent behind a frown or a furrowed brow alone without considering the context can be misguided. “You run the risk of making serious judgment errors,” Dr. Barrett said.

Mr. Hazan challenged the findings of the review. “If you investigate the subject more comprehensively, you’d realize that this one ‘study’ cannot disregard a myriad of other studies done on the subject,” he wrote via e-mail, adding that analyzing faces is useful for recruiters. “Why wouldn’t we?” he asked. “Neither you or I would want to work next to someone grumpy or angry.”

HireVue, which monitors its technology for bias, said most of its assessments do not involve facial analysis. But for some roles, knowing how people present themselves can provide “additional value” to recruiters. “A key example would be that someone who smiles, in some jobs, may produce better customer reactions,” a spokesperson wrote in an e-mail.

Past media coverage of Knockri has mentioned that the software can detect grimacing when an applicant is discussing a former boss, or whether someone’s eyes are wandering. But Mr. Rana said the software only alerts hiring managers to incongruities – if a candidate’s answers scored highly for empathy, for example, but the candidate did not appear empathetic.
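In rough terms, the kind of flag Mr. Rana describes might work like this; the scores and threshold are hypothetical, not Knockri’s:

```python
# Sketch of an incongruence flag along the lines Mr. Rana describes:
# the tool alerts a human when the text-based trait score and the facial
# signal disagree sharply. Scores and threshold are hypothetical.

def flag_incongruence(text_score: float, facial_score: float,
                      threshold: float = 0.4) -> bool:
    """Flag the video for human review when the two signals diverge."""
    return abs(text_score - facial_score) > threshold

# An answer scored highly for empathy, but the delivery did not read so.
if flag_incongruence(text_score=0.9, facial_score=0.3):
    print("flagged: a recruiter should review this video")
```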

Machine-learning algorithms are complex and opaque, and it’s not always clear how or why they arrive at certain decisions. Even Mr. Rana cannot say which specific physical traits indicate empathy to Knockri’s algorithms. “Eventually, we will be getting to that point, but to say that we can do that right now is disingenuous,” he said.

For job seekers, figuring out how to impress a near-inscrutable algorithm presents a new challenge. “It adds another level of mystification to this process,” said Kira Lussier, a postdoctoral researcher at the Institute for Gender and the Economy in Toronto, who has studied the history of corporate personality testing.

Companies should be cautious if they do plan to integrate AI into the hiring process, said Nicolas Roulin, an associate professor of industrial-organizational psychology at Saint Mary’s University in Halifax. The software should first be tested against traditional methods over time. “If you do that with enough applicants, you can try to figure out if the AI is doing a decent job,” he said.
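A first pass at the validation Mr. Roulin suggests could be as simple as running the software alongside traditional interviews and checking whether the two sets of scores track each other. The numbers below are invented for the example:

```python
# A first-pass version of the validation Mr. Roulin suggests: run the AI
# alongside traditional interviews and check whether its scores track the
# human ratings. The numbers are invented for the example.

from statistics import correlation  # Python 3.10+

ai_scores    = [0.82, 0.45, 0.91, 0.30, 0.67, 0.55, 0.74, 0.38]
human_scores = [0.80, 0.50, 0.85, 0.40, 0.60, 0.65, 0.70, 0.35]

r = correlation(ai_scores, human_scores)
print(f"Pearson r between AI and human ratings: {r:.2f}")

# A high r over enough applicants suggests the AI at least agrees with
# existing practice; a low r is a reason to hold off.
```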

Mr. Roulin has mixed feelings about the technology, based on his experiences with vendors. “They are not always super transparent about what exactly they’ve done,” he said. “If I was a customer, I would not yet be convinced that this really works.”
