People tend to trust machines more than humans when it comes to sharing private information and giving access to their financial data, an Indian-origin researcher has stressed.

People who trusted machines were significantly more likely to hand over their credit card numbers to a computerised travel agent than to a human one, said S. Shyam Sundar, co-director of the Media Effects Research Laboratory and an affiliate of Penn State’s Institute for CyberScience (ICS).

“A bias that machines are more trustworthy and secure than people, or the machine heuristic, may be behind the effect,” Sundar added.

This faith in machines may be triggered because people believe that machines, unlike people, do not gossip or have designs on their private information.

However, Sundar said, while machines themselves might not have ulterior motives for collecting the information, the people developing and running those systems could prey on this gullibility to extract personal information from unsuspecting users, for example through phishing scams.

“This study should serve as a warning for people to be aware of how they interact online. People should be aware that they may have a blind belief in machine superiority. They should watch themselves when they engage online with robotic interfaces,” Sundar noted.

For the study, the researchers recruited 160 participants from Amazon Mechanical Turk, an online crowdsourcing website frequently used in studies.

The participants were asked to use either a human or a machine chat agent to find and purchase a plane ticket online.

After the agent returned the flight information, it prompted the participants for their credit card information.

The mere presence of a machine agent on the interface served as a cue that triggered the ingrained belief that machines are superior.

People with a high degree of trust in machines need only subtle design cues indicating that they are interacting with a machine.

“In all of this, one thing I would like to stress is that the designers have to be ethical. They should not unethically try to extract information from unsuspecting consumers,” Sundar stressed.

The findings were presented at the ACM CHI Conference on Human Factors in Computing Systems, held in Glasgow from May 4 to 9.
