With the help of advanced technology, the number of people stealing others’ hard-earned money is rising rapidly, and cases of cyber fraud continue to grow. For some time now, scam cases have been making headlines in which fraudsters steal money by cloning voices with tools powered by Artificial Intelligence (AI). The latest incident is of the same kind: a 59-year-old woman received a call imitating her nephew’s voice and was cheated of Rs 1.4 lakh.

According to TOI, a caller imitating the voice of her nephew, who lives in Canada, told a sad story and asked the woman for money in the name of helping him. The call came late at night, and the woman says the man’s voice was exactly like her nephew’s; even his manner of speaking matched.

In his story, the scammer claimed he had met with an accident and, owing to legal troubles, needed money as soon as possible. On hearing this, the woman immediately transferred Rs 1.4 lakh to the account he suggested.

AI-powered voice-changing tools have recently come into widespread use, and ongoing tensions in Western countries have further fuelled such incidents, with scammers preying on people under the guise of relatives stranded abroad. The report quotes Prasad Patibandla, director of the Centre for Research on Cyber Intelligence and Digital Forensics (CRCIDF) in Delhi: “AI voice-imitating tools can accurately simulate the voice of any person using data available in the public domain. These scams become even more effective when the caller pretends to be stuck in a troublesome situation in a foreign country.”

In August this year, scammers in Delhi posed as the daughter of a former Home Ministry official and cheated a retired government officer of Rs 48 lakh. The accused pretended to be the daughter of the officer’s batchmate and claimed that her mother was hospitalised in Bihar and needed financial help.

For the girl’s voice, the scammer used an app called Magic Call, which alters the voice through modulation. He also added background noise, which made it harder for the officer to recognise the voice and led him to believe the girl really was in a hospital.

A similar case occurred in Kerala, where a fraudster posing as a customs officer on a video call convinced the victim to transfer money.

In such situations, we would always advise you to assess the matter calmly before taking any action. In addition, avoid sharing financial details, OTPs, or similar information with any unknown person on a call.
