A recent study published in Computers in Human Behavior has revealed that financial and travel advisors often feel offended when their clients use AI to obtain a second opinion, and are less motivated to work with those clients down the track.
The research, conducted by Monash Business School in collaboration with the MUMA College of Business at the University of South Florida, examined how professionals perceive and react to clients who consult AI-enabled advisors.
Rapid advances in artificial intelligence have enabled the rise of AI-enabled advisory tools. While these tools benefit decision-makers, they also introduce new competitive pressures for human advisors whose expertise they may complement or replace.
First author Associate Professor Gerri Spassova, from the Department of Marketing, said that learning a client has sought AI advice decreases the advisor's motivation to work with that client.
Importantly, this effect persists even when clients use AI only for background information or as a complementary resource.
"Advisors view AI as substantially inferior to themselves; thus, being placed in the same category as an AI system feels insulting and signals disrespect, undermining advisors' willingness to engage," Associate Professor Spassova said.
An interesting question the research raises is whether this would change once AI becomes more widely accepted.
"One can only speculate," Associate Professor Spassova said. "My intuition is that the situation will not get much better. Firstly, because professional advisors' jobs are on the line.
"Also, as AI gets better, it may threaten our sense of worth and self-regard, and so when clients defer to AI, it would prompt advisors to question the value of their human contribution."
Key findings:
- Seeking a second opinion from AI erodes advisors' motivation to work with the client.
- The effect persists when AI is consulted for initial or complementary information.
- Feeling offended, due to being equated with AI, explains the effect.
- Clients who consult AI may be seen as less competent and less warm.
One way the researchers assessed this was by asking participants to imagine they worked for a travel agency and that, a few months earlier, they had spent an hour with a client who needed advice about a trip to North America, discussing travel packages and looking up flights and accommodation. Half of the participants were then told that the client had also seen another travel agent and ended up booking the trip through them; for the other half, the other travel agent was described as an AI travel agent.
Participants were then asked to imagine that a few months after the first encounter, they were asked by the same client to book a new trip.
Participants indicated they were less willing to spend time with the client if, on the previous occasion, the client had booked the trip with an AI agent.
"Ultimately, we believe that at present it is better for clients not to disclose to their advisors that they consulted AI. And this may be particularly true for new client-advisor relationships, where the client has no track record of doing business with the advisor and where trust may not have been established yet," Associate Professor Spassova said.
"The effect will probably be weaker if there is a long history of the client and the advisor working together (though even then the advisor may feel cheated)."