Trading is booming. In 2024, global market capitalisation reached a new peak of more than 124 trillion US dollars. Among the most popular trading products are securities from technology companies and ETFs. More and more companies are also building securities investments into their liquidity strategy, and alongside securities, cryptocurrencies are increasingly finding their way into portfolios. In doing so, companies often rely on external expertise and outsource their trading activities. With growing demand, however, downsides are emerging as well, from deepfake technologies to supposedly reputable trading partners who turn out to be anything but. This article shows how companies can avoid falling for such scams.
Recognising trading fraud – not so easy
Clever investing and making money with securities, ETFs, cryptocurrencies and the like – more and more companies are taking advantage of these opportunities to secure additional liquidity. However, the supposed trading platforms and experts are not always as reputable as they appear. Recently, for example, a fraud was uncovered in Rani, near Guwahati: a young man named Chinmoy Das is said to have defrauded investors of around 6 million US dollars.
The scam was as simple as it was lucrative: he promised enormous returns of at least 10%. But the investors never saw any of the promised profits. Instead of a windfall in their accounts, they were met with stalling tactics, and eventually the supposed trading professional could no longer be reached at all. He has since been arrested and the police have uncovered the scheme. Whether the victims will ever see their money again remains unclear.
It can happen to anyone
So can anyone recognise trading fraud? Is it really that simple? The case described above shows that even experienced investors are not immune to the industry's black sheep. But how do you identify a reputable trading partner? One of the most reliable measures is to check the LEI database: companies that want to offer legitimate financial services must register for a so-called LEI (Legal Entity Identifier) number.
True professionals also rely on specialist support here, for example from LEI number online – LEI.net. The company assists with applications and renewals and also offers access to the LEI database, helping users quickly identify reputable financial partners.
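Such a check can also be automated. The following is a minimal sketch that queries the public GLEIF API (api.gleif.org), which sits behind the LEI system; the LEI value shown is only a placeholder, and the simple status check at the end is an assumption about how one might flag a counterparty for further review, not an official procedure.

```python
import requests

GLEIF_API = "https://api.gleif.org/api/v1/lei-records"

def check_lei(lei: str) -> dict:
    """Look up an LEI record in the public GLEIF database and return
    the legal name plus entity and registration status."""
    response = requests.get(f"{GLEIF_API}/{lei}", timeout=10)
    response.raise_for_status()
    attributes = response.json()["data"]["attributes"]
    return {
        "legal_name": attributes["entity"]["legalName"]["name"],
        "entity_status": attributes["entity"]["status"],
        "registration_status": attributes["registration"]["status"],
    }

if __name__ == "__main__":
    # Placeholder LEI – replace with the identifier of the partner you want to verify.
    record = check_lei("5493001KJTIIGC8Y1R12")
    print(record)
    if record["registration_status"] != "ISSUED":
        print("Warning: LEI is not in ISSUED status – review the counterparty carefully.")
```

A lapsed or missing LEI record does not prove fraud, but it is a simple, objective warning sign that warrants closer scrutiny before any money changes hands.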
When bank employees become fraudsters themselves
The importance of control mechanisms in the financial industry was impressively demonstrated by an incident in 2024. Travis Klein, an employee of Macquarie Bank, had recorded more than 400 fictitious commodity trades in order to conceal his own trading losses from the company. Between June 2020 and February 2022 he repeatedly entered such orders, and unwinding the positions ultimately cost the bank 57.8 million US dollars.
When the UK Financial Conduct Authority (FCA) became aware of the case, it responded firmly: the trader responsible was banned from working in the financial services industry, and his employer, Macquarie Bank, was fined £13 million.
Deepfake: the new danger for investors
Artificial intelligence has developed rapidly in recent years, and a wide range of media content is now created with its help. But here, too, all that glitters is not gold. So-called deepfakes are keeping the financial sector on edge: in recent years, the number of deepfake-based fraud attempts at financial institutions has risen by over 2,000%.
The fraudsters typically follow a similar pattern. In fake video conferences, for example, executives are impersonated and employees are instructed to initiate transfers. Added to this are manipulated content on social networks and phishing attacks. In Hong Kong, fraudsters used this method to steal more than 25 million US dollars from a finance employee.
How can AI fraud be detected?
There is no absolute protection against such sophisticated fraud. However, close observation often reveals signs of a deepfake: slight delays between lip movements and sound, inaccurate facial features or unnatural expressions, and stilted or robotic-sounding voices can all point to an attempted fraud. Reverse image searches, AI detection software and voice verification systems are also helpful. They can reveal deepfake artefacts or use biometric analysis to confirm that a voice recording genuinely belongs to the person it claims to come from.
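To illustrate the voice verification point, here is a minimal sketch using the open-source SpeechBrain toolkit and its publicly available ECAPA speaker verification model; the file names are placeholders, and the workflow shown is only an assumption about how such a check might be wired into an approval process, not a production-ready defence.

```python
from speechbrain.pretrained import SpeakerRecognition

# Load a publicly available speaker verification model (ECAPA-TDNN, trained on VoxCeleb).
verifier = SpeakerRecognition.from_hparams(
    source="speechbrain/spkrec-ecapa-voxceleb",
    savedir="pretrained_models/spkrec-ecapa-voxceleb",
)

# Placeholder files: a known-genuine reference recording of the executive
# and the audio extracted from the suspicious call.
score, same_speaker = verifier.verify_files("reference_ceo.wav", "suspicious_call.wav")

print(f"Similarity score: {float(score):.3f}")
if same_speaker:
    print("Voice matches the reference recording - still confirm the request out of band.")
else:
    print("Voice does NOT match the reference - treat the instruction as a possible deepfake.")
```

Even when the voices match, a biometric check should only ever be one layer: a callback over a known phone number or a second sign-off on large transfers remains the most reliable safeguard.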