Financial Scams Archives - Thomson Reuters Institute https://blogs.thomsonreuters.com/en-us/topic/financial-scams/ Thomson Reuters Institute is a blog from Thomson Reuters, the intelligence, technology and human expertise you need to find trusted answers.

More SARs, not better ones: Why AI is about to flood the system /en-us/posts/corporates/ai-driven-sars/ Mon, 13 Apr 2026 08:06:52 +0000 https://blogs.thomsonreuters.com/en-us/?p=70285

Key insights:

      • SAR volume is significantly underreported – Continuing and amended filings add approximately 20% to the official count yet remain invisible in trend analyses.

      • Filing activity is highly concentrated – A few large financial institutions dominate SAR volume, meaning trends reflect their practices more than systemic changes.

      • Agentic AI will drive a surge in SARs – Agentic AI risks adding noise rather than actionable intelligence, without addressing the unresolved question of whether current filings yield meaningful law enforcement outcomes.


The Suspicious Activity Reports (SARs) that financial institutions file with the U.S. Treasury Department's Financial Crimes Enforcement Network (FinCEN) provide valuable insight, although they may not offer a comprehensive picture.

Before any meaningful discussion of the future of SARs can take place, the financial crime community needs to clarify what is being measured. In 2025, for example, financial institutions filed more than 4.1 million SARs, an almost 8% increase over the total number filed in 2024.

Every figure FinCEN has published reflects original SARs only. Continuing activity SARs, which represent roughly 15% of all filings, are submitted under the original Bank Secrecy Act (BSA) identification number and never appear as new filings. Corrected and amended SARs add another 5% on top of that. This makes the real volume of SAR activity approximately 20% higher than what is reported.
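The arithmetic behind that gap is simple enough to sketch. The snippet below is an illustrative back-of-envelope calculation, not FinCEN methodology; it just scales the published originals-only count by the rough 15% continuing-activity and 5% corrected/amended shares cited above.

```python
# Back-of-envelope sketch (illustrative, not FinCEN methodology): scale the
# published original-SAR count by the article's rough hidden-filing shares.

def adjusted_sar_volume(reported_originals: float,
                        continuing_share: float = 0.15,     # continuing-activity SARs
                        amended_share: float = 0.05) -> float:  # corrected/amended SARs
    """Estimate total filing activity from the published originals-only count."""
    return reported_originals * (1 + continuing_share + amended_share)

reported_2025 = 4_100_000   # FinCEN's published 2025 figure (originals only)
print(round(adjusted_sar_volume(reported_2025)))   # 4920000 -> ~4.9 million filings
```

On the 2025 figure of roughly 4.1 million originals, this puts true filing activity near 4.9 million.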


The average community bank files fewer than one SAR a week, while the largest institutions file more than 500 a day.


Recent FinCEN guidance giving financial institutions more flexibility around continuing activity SARs sounds significant on paper, but as former Wells Fargo BSA/AML chief Jim Richards points out: “It won’t change the reported numbers – because those filings were never counted to begin with.” Financial crime professionals need to keep that gap in mind every time a trend line gets cited.

2025 was steady, not spectacular

There were roughly 300,000 SARs filed every single month of 2025, and the most notable thing is that nothing notable happened. That is likely a first on the volume side and worth acknowledging, but beyond that milestone the year did not hand financial crime professionals anything noteworthy. In a space that has dealt with pandemic distortions, crypto chaos, and fraud spikes that seemed to come out of nowhere, steady volume and predictable patterns are a little surprising. A quiet data set, however, is not the same as a quiet landscape, and financial crime professionals who are reading stability as stagnation may find themselves flat-footed when the numbers start moving again.

For example, one of the most underleveraged insights in the SARs space is just how concentrated filing activity really is. The numbers are stark: The top four banks file more SARs in a single day than 80% of the rest of the banks file in 10 years, according to 2019 data.

The average community bank files fewer than one SAR a week, while the largest institutions file more than 500 a day. “50 a year versus 500 a day,” notes Richards, formerly of Wells Fargo, adding that such asymmetry has real implications for how the financial industry interprets trends. Meaningful movement in SARs data, up or down, is almost entirely dependent on what a handful of mega-institutions decide to do.
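To make that asymmetry concrete, here is a quick order-of-magnitude comparison built on the "50 a year versus 500 a day" contrast in the quote above (the figures are illustrative, not a census of filers):

```python
# Hypothetical order-of-magnitude comparison built on the quoted
# "50 a year versus 500 a day" contrast; illustrative, not a census.

community_bank_per_year = 50            # fewer than one SAR a week
mega_bank_per_day = 500
mega_bank_per_year = mega_bank_per_day * 365   # 182,500

ratio = mega_bank_per_year / community_bank_per_year
print(f"{mega_bank_per_year:,} vs {community_bank_per_year} per year -> {ratio:,.0f}x")
```

A single mega-institution's annual output is on the order of 3,650 times a community bank's, which is why a policy change at one large filer can move the national trend line on its own.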

Not surprisingly, money services businesses (MSBs) are the second largest filing category, and virtual currency exchanges are almost certainly driving recent growth there, even if outdated category definitions make that difficult to confirm directly. Credit unions round out the top three.

The filing philosophy hasn’t changed and shouldn’t

Regulatory noise occasionally suggests that institutions should be more selective about what they file. However, compliance and legal reality have not shifted. No institution has ever faced serious consequences for filing too many SARs, and the cases that result in enforcement actions, reputational damage, and regulatory scrutiny are consistently about missed filings or late ones.

鈥淵ou’re not going to get in trouble from filing too much,鈥 Richards says. 鈥淣obody ever has, and I doubt if anyone ever will.” For financial crime professionals, the calculus remains exactly what it has always been 鈥 when in doubt, file. That posture isn’t going to change, and frankly it shouldn’t.

Yet, here is where the SARs space gets genuinely interesting. Agentic AI use in SAR filings – systems in which multiple AI agents work through a case from screening to decision to documentation – is beginning to move from concept to deployment. The impact on filing volume likely will be significant.


The risk is a system flooded with AI-generated SARs of variable quality, creating more noise for law enforcement to sort through rather than sharper intelligence to act upon.


Whereas a small team today might work through a handful of cases a week, AI-assisted workflows could push that into the dozens. Multiply that across institutions already inclined to file rather than miss something, and the result is a coming surge in SAR volume that could play out over the next two to four years.
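As a stylized sketch of that multiplication – every number below is an assumption for illustration, not data from any institution – filing volume scales roughly linearly with case throughput if the share of cases that become SARs stays constant:

```python
# Stylized scenario: every number here is an assumption for illustration,
# not data from any institution. If the share of cases that become SARs is
# constant, filing volume scales linearly with case throughput.

def projected_weekly_sars(cases_per_week: float, sar_conversion_rate: float) -> float:
    """SARs filed per week = cases worked * share of cases that get filed."""
    return cases_per_week * sar_conversion_rate

baseline = projected_weekly_sars(cases_per_week=5, sar_conversion_rate=0.6)      # a "handful"
ai_assisted = projected_weekly_sars(cases_per_week=36, sar_conversion_rate=0.6)  # "dozens"
print(f"volume multiplier: {ai_assisted / baseline:.1f}x")   # 7.2x
```

Even a modest per-team multiplier of this kind, repeated across thousands of filers, is the mechanism behind the projected surge.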

“Agentic AI has the potential to be a game changer on how we do our work,” Richards explains. “But I believe it'll guarantee that there will be more SARs filed and not necessarily better and fewer SARs filed.” Indeed, the critical point for the financial crime community to internalize is exactly that.

The risk is a system flooded with AI-generated SARs of variable quality, creating more noise for law enforcement to sort through rather than sharper intelligence to act upon. Once the largest institutions adopt agentic AI as a best practice, others will follow quickly, and regulators will likely be several steps behind.

The value question can’t wait

The has been in place since 2014. Yet after 12 years of filings, the financial crime community still lacks a clear public accounting of whether that data has produced actionable law enforcement outcomes.

So, the question Richards is asking is one the entire industry should be asking: “Has anybody asked law enforcement?”

This question reflects a larger challenge that the industry needs to confront more aggressively, especially as AI technology is set to dramatically increase filing volume across the board. Increasing the volume without improving how the information is used does not represent progress. If SARs are not generating real investigative value, the solution is not to file more of them faster – instead, the pipeline should be fixed before it grows any bigger.


Please add your voice to Thomson Reuters' flagship survey, a global study exploring how the professional landscape continues to change.

The banks you don’t know you’re using: Risks of unregulated banking /en-us/posts/government/unregulated-banking-risk/ Wed, 01 Apr 2026 17:10:50 +0000 https://blogs.thomsonreuters.com/en-us/?p=70163

Key insights:

      • Convenience has outpaced consumer understanding – Many users treat apps, prepaid accounts, and rewards programs as simple payment tools, remaining unaware they are entrusting their money to entities with few safeguards.

      • Risk is no longer confined to traditional banks – Some of the most significant financial activities now occur within platforms and brands that do not resemble banks at all.

      • Opacity enables systemic vulnerability – The less transparent an institution's obligations, leverage, and oversight, the easier it is for financial fragility, misconduct, and systemic risk to grow unchecked.


When you think of where money is held, you generally think of a bank. However, as we look at the financial landscape today, money is being held at a wide range of institutions that often have varying levels of safety and oversight. Entities from Starbucks to Visa to Coinbase hold money for individuals, effectively serving as a bank, but often without the regulatory framework that comes with it.

Behind the scenes, Starbucks can look a lot like a bank. In its daily operation, it collects prepaid funds that resemble deposits, holds them as liabilities, and uses them internally – all without offering interest, cash withdrawals, or FDIC insurance. Starbucks' rewards program holds $1.8 billion in customer cash, and if it were a bank, that would make it bigger than 85% of chartered banks.
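To see why prepaid float behaves like deposits, consider a minimal toy model of a prepaid program (the class and names here are hypothetical, purely for illustration): customer loads sit on the books as a liability, and the aggregate float is simply the sum of unspent balances.

```python
# Toy model of a prepaid program (class and names are hypothetical):
# customer loads are booked as a liability -- the company owes goods,
# not cash -- and the float is the sum of unspent balances.

class PrepaidProgram:
    def __init__(self) -> None:
        self.balances: dict[str, float] = {}   # customer -> unspent funds

    def load(self, customer: str, amount: float) -> None:
        """A card load: cash comes in, a deposit-like liability is recorded."""
        self.balances[customer] = self.balances.get(customer, 0.0) + amount

    def spend(self, customer: str, amount: float) -> None:
        """Redemption extinguishes the liability; no cash ever leaves."""
        if amount > self.balances.get(customer, 0.0):
            raise ValueError("insufficient prepaid balance")
        self.balances[customer] -= amount

    @property
    def float_liability(self) -> float:
        """Total customer money held -- no interest, no withdrawals, no FDIC."""
        return sum(self.balances.values())

p = PrepaidProgram()
p.load("alice", 25.0)
p.load("bob", 40.0)
p.spend("alice", 5.0)
print(p.float_liability)   # 60.0
```

Scale the toy float up to $1.8 billion and the design choice is clear: the program's only "withdrawal" is a coffee.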

This dynamic extends well beyond coffee shops. “Popular digital payment apps are increasingly used as substitutes for a traditional bank or credit union account but lack the same protections to ensure that funds are safe,” warns the . If a nonbank payment app’s business fails, your money is likely lost or tied up in a long bankruptcy process.

Shadow banking

Think of a Starbucks gift card as a financial instrument. Technically it is one, but no one seriously worries about it being weaponized for large-scale financial crime. Most people's concern about a gift card is simply losing it. The real issue lies not in lost gift cards, however, but in the broader trend: nonbank institutions managing vast sums without commensurate oversight – and scale matters. A lost gift card is a personal inconvenience; an unregulated institution managing billions of consumer dollars in leveraged capital is a systemic one.

Shadow banking encompasses credit and lending activity by institutions that are not traditional banks and, crucially, do not have access to central bank funding or public sector credit guarantees. Because they are not subject to the same prudential regulations as depository banks, they do not need to hold reserves as high relative to their market exposure, allowing very high levels of leverage – which magnifies profits during booms and compounds losses during downturns.
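A toy calculation shows why leverage cuts both ways. Assuming a stylized balance sheet where assets equal equity times leverage (and ignoring funding costs – a deliberate simplification), the same 5% move in asset values produces very different equity outcomes at bank-like versus shadow-bank-like leverage:

```python
# Stylized leverage arithmetic: assets = equity * leverage, funding costs
# ignored. A move in asset values is magnified on equity by the leverage
# multiple -- in both directions.

def equity_return(asset_return: float, leverage: float) -> float:
    """Return on equity when asset values move by `asset_return`."""
    return asset_return * leverage

for move in (0.05, -0.05):
    print(f"asset move {move:+.0%}: "
          f"10x leverage -> {equity_return(move, 10):+.0%}, "
          f"30x leverage -> {equity_return(move, 30):+.0%}")
```

At 30x leverage, a 5% decline in asset values wipes out equity one and a half times over, which is why forced deleveraging in this sector can cascade so quickly.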

The shadow banking ecosystem is diverse, and each segment of it presents distinct risks:

    • Hedge funds and private equity firms – Firms like Blackstone, KKR, and Apollo manage vast capital pools using leveraged strategies under limited oversight. Their size and borrowing levels mean that market reversals can trigger rapid deleveraging, spilling risk into broader markets.
    • Family offices – These private companies or advisory firms manage the wealth of high-net-worth families and can operate with even less transparency, often outside direct regulatory scrutiny, enabling extreme leverage and posing risks of sudden collapse.
    • Nonbank mortgage lenders and FinTechs – This group faces lower capital requirements than traditional banks, leaving thinner buffers to absorb losses during downturns, which is especially concerning given the sector's rapid growth.
    • Crypto exchanges – Like much of the cryptocurrency ecosystem, these exchanges operate in jurisdictional gray zones, complicating enforcement and enabling illicit financial flows.
    • Money market funds – While generally perceived as safe, these can suffer runs if confidence in underlying assets erodes, forcing fire sales that destabilize related markets.
    • Special Purpose Vehicles (SPVs) and Structured Investment Vehicles (SIVs) – These investment instruments allow large institutions to move risk off their balance sheets, rendering such activity invisible to regulators.

Shadow banking may be the single greatest challenge facing financial regulation. These non-traditional institutions act like banks, but without the safeguards that make banks accountable. And where accountability is absent, opportunity often fills the void.

The same opacity that makes shadow banking difficult to regulate also makes it attractive to those with less legitimate intentions. Without mandatory reporting requirements, standardized oversight, or the threat of deposit insurance revocation, these institutions can become conduits for money laundering, fraud, terrorist financing, and sanctions evasion in ways that traditional banks simply cannot. The question is no longer whether these vulnerabilities exist, but how they continue to be exploited.

The challenge of regulation

The global financial system has always evolved faster than the rules designed to govern it. What began as a coffee loyalty program and a few alternative lending platforms has quietly morphed into a parallel financial universe, one that moves trillions of dollars with a fraction of the transparency that traditional banking requires. That gap between innovation and oversight is not just a regulatory inconvenience; it's an open door for illicit actors.

Closing that door will require more than periodic enforcement actions or piecemeal legislation. It will require regulators, lawmakers, and institutions to reckon honestly with how broadly the definition of a financial institution has expanded, and who bears the risk when things go wrong. Because historically, it has not been the institutions themselves; rather it has been the customers, the investors, and ultimately the public.

The first step, of course, is awareness. Recognizing that your money does not need to be in a bank to be at risk, and that the custodians of that money need not be offshore shell companies to operate in the shadows, can transform how we think about financial safety.

The line between a convenient app and an unaccountable financial intermediary is thinner than most realize. And in the world of financial crime, thin lines have a way of vanishing entirely.


You can learn more about the many challenges facing financial institutions today here

Financial crime implications of a US-Iran war: The emotional drivers of instability & illicit flows /en-us/posts/corporates/us-iran-war-financial-crime-implications/ Tue, 10 Mar 2026 16:26:26 +0000 https://blogs.thomsonreuters.com/en-us/?p=69898

Key insights:

      • Geopolitical crises fuel financial volatility and illicit activity – Conflicts have traditionally accelerated capital shifts and flows, creating cover for bad actors.

      • Predictable patterns emerge – Financial institutions should watch for sudden cross-border activity, unusual cash deposits, and transactions from border areas.

      • Conflict zones enable black market expansion – Financial institutions should adapt their compliance systems to detect the more sophisticated methods criminals use, tightening screening and enhancing staff training.


While business and international politics may appear cold and calculating, both are often driven by emotion, especially fear – and fear of instability often drives market volatility.

So it goes as the United States attacks one of the world's largest militaries and supporters of regional terror groups, deepening instability in a Middle East already beset by violence. It is certain that there is already a surge of money flowing in and out of the region for different reasons. Legitimate and illegitimate actors alike will seek both to flee the crisis and to profit from it. However, there are some anti-money laundering-specific considerations that financial institutions need to weigh during a time of global uncertainty.

The bottom line – lots of money is on the move. Funding will send aid groups toward the crisis; it will also send logistical supplies, war materiel, and other necessities. All of these cost money, and defense sectors in multiple countries will be pumping out munitions to refill stockpiles in any country related to or in the neighborhood of the conflict.

Not every large transaction is an unusual, reportable event, but financial institutions now need to look one or two layers below the surface. Connections that are not apparent on the surface can still be red flags. Look at the beneficial ownership of companies and vessels, and look at the owners' relationships, not just the Office of Foreign Assets Control (OFAC) screening results for those individuals themselves. The financial system will, and should, allow legitimate funds to flow. However, financial investigators must remain diligent to catch bad actors who take advantage of the surge in non-profit activity or the urgency with which legitimate businesses operate in a conflict zone.

Risk Factor 1: Capital flight from regime change

Just as the fall of the Assad regime in Syria caused family funds to flow out of the country as regime members fled, you will see the same with politically exposed persons (PEPs) fleeing regime change in Iran. A political crackdown will come. Whether the victors are on the side of the West or not remains to be seen, but some factions are going to flee the country and take family wealth with them.

Banks and other financial services firms should watch for people, and anyone connected to them, moving money through neighboring countries they may literally have hiked or driven into before depositing cash into a financial institution. There are stories of refugees leaving with gold bands on their arms, cash in false-bottomed purses, and diamonds sewn into the lining of sweaters. These assets will be converted to cash in neighboring countries and placed into financial systems less affected by the conflict. An influx of cash throughout the region, therefore, could indicate this type of capital flight.

Risk Factor 2: Illicit finance and black markets

Since the fall of the regime in Syria, we have also become aware of a captagon trade that helps fuel addiction and armed conflict. There are certainly other substances and drug trafficking networks about which we know very little on this side of the secrecy veil.

Therefore, this instability will be seen as a time of opportunity for criminal groups. Indeed, with Assad's security forces no longer controlling the Middle Eastern captagon and other narcotics trade, and various armed groups looking for funding sources, this is an illicit business opportunity.

Financial institutions can expect rapid movement of money between unrelated shell corporations, new corporations, and shadow vessels. They also should expect the black market to boom with drugs, contraband Iranian oil, and funds tied to narcotics networks not yet discovered. Illegal arms will also generate funding, so all of the methods, both formal and informal, used to transfer value will become active.

In fact, large portions of such funding will flow through financial institutions, and peer-to-peer payment providers, FinTechs, and money transmitters should be especially wary of funds moving rapidly through their platforms. A burst in conflict means a burst in activity from illicit sources; therefore, enhanced, targeted monitoring is a must.

How financial institutions' risk & compliance teams should respond

First, all financial institutions' risk & compliance departments need to assess their institutions' OFAC and sanctions screening search parameters. This is a good time to dial up fuzzy logic capability and reduce match percentage thresholds. In other words, risk tolerance should go down while the metaphorical dragnet gets wider. Surge the department's personnel capability to compensate if you have to, because that is better than a strict-liability OFAC fine. Remember, OFAC sanctions are closely tied to national security, especially when it comes to Iran. This is not an arena in which leniency can be expected. Compliance teams should look at monitoring systems and thresholds immediately, create geographical targeting models to cover the conflict zone, and consider a command center approach to deal with the fluidity of the situation until things settle.
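As a rough illustration of what "dialing up fuzzy logic" and lowering match thresholds means in practice, the sketch below uses Python's stdlib difflib as a stand-in for a commercial screening engine. The watchlist entries and names are invented, and real OFAC screening uses far more sophisticated matching (phonetics, transliteration tables, entity resolution); this only shows the threshold trade-off.

```python
# Illustrative sketch of widening the screening dragnet, with stdlib
# difflib standing in for a commercial fuzzy-matching engine. Watchlist
# entries and names are invented; real OFAC screening is far richer.
from difflib import SequenceMatcher

WATCHLIST = ["Mohammad Reza Khan", "Aria Shipping Co"]   # hypothetical entries

def screen(name, threshold):
    """Return (entry, score) pairs whose similarity meets the threshold."""
    hits = []
    for entry in WATCHLIST:
        score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

# A transliteration variant slips past a strict 0.95 threshold...
print(screen("Mohamed Reza Kahn", threshold=0.95))   # []
# ...but lowering the match threshold flags it for human review.
print(screen("Mohamed Reza Kahn", threshold=0.85))
```

The cost of the lower threshold is more alerts to clear, which is exactly why the article pairs it with surging personnel capacity.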

If your institution has not already taken the hint from regulators, this also is an opportunity to double down on Customer Due Diligence and identity verification. Front line staff and embedded business compliance personnel should receive updated training and job aids to increase awareness and hone internal reporting. Indeed, it is an advanced business skill to understand complex corporate beneficial ownership, much less to detect when it may be tied to illicit activity or corrupt regimes. Now is the time to increase that level of knowledge and thereby make the culture of compliance more robust.

In every crisis there is opportunity as well as risk: Managing the risk allows every company to take advantage of the opportunity, shore up its mission, and strengthen the institution.


You can find out more about the geopolitical and economic outlook for 2026 here

Crypto crime, caveats & clarity: How crypto forensics has evolved in 5 years /en-us/posts/corporates/crypto-crime-forensics-evolve/ Mon, 02 Mar 2026 17:21:24 +0000 https://blogs.thomsonreuters.com/en-us/?p=69690

Key insights:

      • Crypto crime is likely much bigger than it appears – Blockchain forensics firms only report what they can prove with 99%-plus accuracy, meaning the true scale of crypto crime is likely far larger than official reports suggest.

      • False negatives are still a problem – While achieving incredibly low false positive rates, these strict standards result in significant false negatives, with firms missing up to 75% of known criminal addresses in tested datasets.

      • This reporting gap reveals hidden losses – FBI data shows higher losses than forensic reports do, and when accounting for the 85% of fraud victims who never report crimes, actual losses could exceed $110 billion annually.


Law enforcement has known about crypto-related crime for more than 14 years now. Five years ago, I felt these industry reports left a lot to be desired. A lot has happened since then, however, and I have learned that clarity is becoming more important than caveats, because even my own caveats are being taken out of context by the cryptocurrency ATM industry.

The myth of “crypto crime”

Nick Furneaux points out that – spoiler – it's all just financial crime. Yet the blockchain forensics industry still has the annual tradition of issuing crypto crime reports that end up getting widely reviewed. However, my previous post showed how the prevailing reports appeared to prove Nick's point, stating that crypto crime represented just a tiny share of all crypto activity – effectively, a rounding error.

I wrote that these reports needed to be heavily caveated, as the figures identified were clearly smaller than the figures that may have been reasonably expected. In fairness to the industry, reports have since incorporated caveats on nearly all stated figures. However, this has still not stopped the industry from cherry-picking figures that support the argument that there is no such thing as crypto crime.

The ironically good news in this year's reports is that the official figures for illicit activity have increased as a share of all crypto activity. This increase is an indicator that the industry has gotten better at identifying criminal activity; while there is still room for improvement, we are moving in the right direction.

Art vs. science

The companies producing these reports continue to hold some of the largest datasets on crypto-crime and blockchain metadata in the world. They are ideally placed to speak to these trends in illicit activity in the crypto ecosystem. However, one of the early arguments in blockchain forensics was that it is not as effective as some people were claiming.

In the landmark case colloquially known as the Bitcoin Fog case, blockchain intelligence platform CipherTrace claimed that blockchain forensics was more of an art than a science. Based on evidence from Chainalysis, however, the court acknowledged that blockchain forensic evidence was admissible in criminal court based on the methods used.


Understanding the limits of these reports requires an understanding of the core audience for these forensic firms: Law enforcement, which has a high burden of proof to achieve before going to court with any evidence.


Chainalysis has been doing this for 12 years at this stage and has been one of the only services to undergo an independent academic review of its data, albeit on a tiny sample of its overall dataset. In the last five years, competitor TRM Labs has become an industry leader based on its focus on blockchain intelligence and law enforcement support.

The accuracy trap

Understanding the limits of these reports requires an understanding of the core audience for Chainalysis and TRM Labs: Law enforcement, which has a high burden of proof to achieve before going to court with any evidence. As such, the standard held by industry-leading companies is that a data model should achieve an accuracy level of 99%-plus. However, as with any machine learning algorithm, it is incredibly difficult to guarantee 100% accuracy. Still, 99% accuracy is higher than human-based systems are expected to achieve.

Despite this commitment to high standards, the blockchain forensics industry has come under fire for false negatives. In the academic research on Chainalysis' data, researchers found its false positive rate to be 0.01%, 0.15%, and 0.11%, respectively, across the three datasets – at least 99.85% accuracy for what was in its tool. Obviously, this is much more scalable and accurate, in a modern world in which criminals are using AI, than having humans unravel these datasets manually. However, this level of certainty does paradoxically result in a surprising level of false negatives.

Indeed, Alison Jimenez, of Dynamic Securities Analytics, pointed out that Chainalysis missed a significant percentage of all addresses in the three sample datasets. The study looked at coverage of three known illicit services: BestMixer, Hansa Market, and Wall Street Market.

Chainalysis was found to have been able to identify 25%, 79%, and 95% of the sampled addresses, respectively. While it may seem negligent for the company to suggest it can identify crime when it missed 75% of BestMixer addresses – a service designed to obfuscate the flow of funds – the reality is that identifying any of these services is difficult in the first place, especially in a world in which criminals are actively trying to escape surveillance. And remember, this is just the data that made it to production; forensics firms can still help law enforcement make informed decisions in their investigations based on a range of additional data that never gets surfaced in the tool or in reports.
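Those coverage figures can be restated in standard terms: the identified share is recall, and what is missed is the false-negative rate – a dimension the headline 99%-plus accuracy (which speaks to false positives) says nothing about. A minimal restatement:

```python
# Restating the coverage figures: the identified share is recall; what is
# missed is the false-negative rate. The 99%-plus accuracy claim speaks to
# false positives and says nothing about either number.

datasets = {                       # identified share per sampled service
    "BestMixer": 0.25,
    "Hansa Market": 0.79,
    "Wall Street Market": 0.95,
}

for service, recall in datasets.items():
    print(f"{service}: recall {recall:.0%}, missed {1 - recall:.0%}")
```

A tool can be 99.85% precise on what it labels while still leaving three-quarters of a mixer's addresses unlabeled; the two metrics measure different failure modes.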

The reporting gap

These forensic companies are unable to publish informed estimates of the total level of crime, but they are saying that they have identified at least $154 billion in illicit activity in 2025. These tools also assist law enforcement with investigations that the firms may not always have permission to include in their datasets. Yet investigators can still use the technology to carry out their investigations, safe in the knowledge that their evidence will be admissible in court. That means the $154 billion figure is effectively a floor, not a ceiling, for what blockchain forensics can identify.


The FBI counts what victims report, whereas forensic firms count what they can prove on-chain. When you consider that academic research suggests 85% of fraud victims never report their crimes to anyone, the scale of the problem becomes staggering.


The discrepancy between forensic reports and law enforcement data is where the caveats become most visible. The FBI's annual internet crime report for 2024 (released in late 2025) pegged crypto-related scam losses at $16.6 billion. This figure is 67% higher than Chainalysis's estimate and 55% higher than TRM Labs' for the same category.

Why the gap? Because the FBI counts what victims report, whereas forensic firms count what they can prove on-chain. When you consider that academic research suggests 85% of fraud victims never report their crimes to anyone, the scale of the problem becomes staggering. If we extrapolate the FBI's reported figures to account for this silent 85%, the potential loss to crypto scams could be as high as $110 billion. While not an academically rigorous calculation, this figure would not surprise many industry analysts.
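The extrapolation itself is one line of arithmetic: if the ~$16.6 billion in reported losses captures only the ~15% of victims who file a complaint, dividing by the reporting rate yields the ballpark above. As stated, this is explicitly not rigorous – it assumes non-reporting victims lose similar amounts to those who report.

```python
# One-line extrapolation behind the ~$110 billion ballpark: divide reported
# losses by the assumed reporting rate. Explicitly not rigorous -- it assumes
# non-reporting victims lose similar amounts to those who file complaints.

def extrapolate_losses(reported_billion: float, reporting_rate: float) -> float:
    return reported_billion / reporting_rate

estimate = extrapolate_losses(reported_billion=16.6, reporting_rate=0.15)
print(f"~${estimate:.0f} billion")   # ~$111 billion, in line with the ~$110B ballpark
```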

What will these reports look like in another 5 years?

The critique I have of these reports is that they underestimate the size of the problem in order to be able to accurately stand by their data. This isn't a bad thing; it just results in unfortunate outcomes. There may be a day when these reports are combined with academic research to make a more informed estimate of how big the crypto crime problem really is.

Thankfully, those in the blockchain forensics industry can't speak in theories or artistic interpretation. They have to be able to prove their statements and back them up with verifiable data. Right now, these reports are effectively looking at the tip of the iceberg and showing what they know about what they can see – the caveat now is that this is just the known knowns. The challenge continues to be identifying the known unknowns. Fortunately, we are getting better at identifying criminal activity every year.


You can find more of our coverage of the cryptocurrency industry here

The OCC鈥檚 2026 mission: Modernization & innovation in the financial sector /en-us/posts/government/occ-modernization-mission/ Fri, 27 Feb 2026 12:11:27 +0000 https://blogs.thomsonreuters.com/en-us/?p=69674

Key insights:

      • Pushing innovation in the financial sector – The OCC is actively enabling innovation among financial service institutions, not resisting it.

      • Regulation is being refocused, not removed – Priorities may change with each administration, but oversight remains, and crypto is increasingly central.

      • Compliance is a growth requirement – Regulations around the BSA, sanctions, and KYC still apply, so durable controls and experienced teams do matter, even with AI.


Shortly after being named Acting Comptroller of the Currency in early 2025, Rodney E. Hood signaled support for innovation in the financial sector. Hood spoke about improving bank-fintech partnerships and providing regulatory frameworks for digital asset activities.

As expected, the Hon. Jonathan V. Gould was sworn in as the 32nd Comptroller of the Currency on July 15, 2025. Under his leadership of the Office of the Comptroller of the Currency (OCC), the spigot of technology-enabled financial innovation is set to remain wide open, with blockchain-based products at the forefront.

In a recent speech, Comptroller Gould laid out a road map to a future that includes more de novo charters, with many of them coming from the ranks of blockchain and digital or virtual asset service providers (VASPs). He refuted notions that these things cannot be done under current rules and reaffirmed the agency's ability to regulate such institutions.


Register now for The 2026 Future of AI and Technology Forum, a cutting-edge conference that will explore the latest advancements in GenAI and their potential to revolutionize compliance, legal, and tax practices


Institutions that fail to embrace these emerging technologies as they arise risk falling behind, Gould said, describing how any legal framework that treats digital assets differently than existing electronic means is risking “a recipe for irrelevance.” Such an antiquated approach keeps companies, institutions, and indeed the nation's entire financial system mired in the past, he added.

Digi-mon go!

In word and deed, the current OCC continues to offer a green light to VASPs as well as to traditional financial institutions looking to dabble with blockchain, stablecoins, and the like. Regulatory action in the past year mostly served to end prior enforcement against traditional institutions while putting ancillary companies in check. For example, recent scrutiny of US/Mexican border casinos, crypto ATM-style terminals, and armored car companies demonstrates the regulatory shift that takes place after each change in administration.

Government rarely gives up its authority, but it does shift its focus. Border cash is out; crypto is in. Clear regulation for this sector is important and necessary, and it will continue to create an entirely new set of financial products and services.


Institutions that fail to embrace these emerging technologies as they arise risk falling behind… [and] any legal framework that treats digital assets differently than existing electronic means risks becoming 'a recipe for irrelevance.'


Normally I advocate more caution but, in this case, having any regulation is better than having no regulation. Blockchain is here to stay and having any kind of clarity around it is the right way to begin. Those who legislate have an opportunity to improve the regulatory framework over this technology as it evolves 鈥 as long as a framework exists. It’s sort of like the slippery slope argument in reverse: When we build a foundation on regulations that encourage innovation while protecting consumers, including the companies themselves, we create a healthier economy. These rules can always be improved and adjusted as we understand better what we have unleashed upon the world.

Compliance is on the "can't cut" list

Rumors are swirling of cuts to many corporate compliance budgets. Many compliance pros think this administration will let companies do as they please! Let a professional risk manager urge caution here instead. The power of the Bank Secrecy Act (BSA), the extraterritorial reach of sanctions, and the requirements to know your customers (KYC) are not going anywhere. Regulations are refocused, not removed. A proliferation of nouveau financial institutions will provide a target-rich environment for the regulators of today and tomorrow to find things they dislike and prosecute those offenses. A business that hopes to make it big should be built to withstand the winds of change and weather different regulatory conditions over time.

Therefore, smart compliance professionals will keep an eye on the horizon and keep their risk controls tight. Yes, it may be a good time to start a crypto company; but no, that does not mean you can process drug cash, ignore sanctions, or fail to collect basic personally identifying information.

With increasingly ubiquitous AI tools, your humans in the loop are more important than ever. As entry level jobs become automated, depth of experience becomes more valuable. Retain talent and institutional knowledge on your compliance teams because those individuals will train the AI as well as the investigators of tomorrow.

Indeed, no matter who is in charge of the government's regulations, enforcement will come when you let your guard down and ignore basic risk management principles.


You can find more about how government agencies are managing various risk, fraud, and compliance issues here

]]>
AI-powered fraud: 5 trends financial institutions need to understand in 2026 /en-us/posts/corporates/ai-powered-fraud-5-trends/ Tue, 17 Feb 2026 15:19:11 +0000 https://blogs.thomsonreuters.com/en-us/?p=69411

Key insights:

      • AI scales deception: Fraudsters automate convincing scams, create synthetic identities, and overwhelm legacy controls, making AI an essential part of financial institutions' anti-fraud solution.

      • "All-green" fraud is rising: The biggest losses often happen in correctly authenticated sessions, making them much harder to detect.

      • Behavior plus collaboration wins: Financial institutions need to shift from point-in-time checks to real-time, cross-channel behavioral signals and tighter inter-institution cooperation to spot coordinated campaigns and reduce friction without stalling growth.


How financial institutions face fraud in 2026 looks nothing like it did even two years ago. AI has industrialized deception, synthetic identities bypass traditional checks, and scams manipulate legitimate customers into moving their own money even as every security control shows green.

Today, financial institutions face a perfect storm, according to Michal Tresner, CEO of ThreatMark, and Sara Seguin, Director of Enterprise Banking at Alloy. Indeed, they're trying to manage attacks that scale automatically, identities that look real but aren't, and victims who authenticate correctly before being convinced to hand over funds.

5 trends financial institutions need to understand in 2026

Looking at each of these five key challenges individually can offer both perspective and possible solutions.

1. The AI threat multiplier

Generative AI (GenAI) and large language models (LLMs) have fundamentally changed the fraud landscape. "AI is now the biggest threat facing financial institutions in 2026," Tresner notes, adding that fraudsters are leveraging these technologies to create highly convincing content while automating attacks at unprecedented scale, a combination that overwhelms traditional security systems.

Seguin agrees and confirms the trend. "Financial institutions are seeing a measurable increase in AI-enabled financial crimes, while consumers increasingly expect banks to deploy AI-based security in response," she explains. The reality is stark: AI has become an essential tool for both fraudsters and those fighting against them.

2. The onboarding dilemma

The account opening process represents another critical vulnerability. Seguin points to rising first-party fraud and scams as particularly challenging because perpetrators often appear indistinguishable from legitimate customers going through the onboarding process. "A person may open an account with seemingly normal intentions (direct deposit or everyday banking) only to later engage in fraudulent activity," she explains.


Onboarding is where institutions have the least certainty about either the authenticity of the identity or the legitimacy of the intent.


Tresner identifies a related threat: Synthetic identities. “Rather than stealing real identities, fraudsters now generate convincing fake ones, complete with realistic identity documents and even AI-generated images or video,” he says, noting that these synthetic identity accounts are exploding and frequently serve as infrastructure for moving stolen funds.

The common thread is that onboarding is where institutions have the least certainty about either the authenticity of the identity or the legitimacy of the intent.

3. Authentication under siege

Similarly, even as financial institutions work to strengthen onboarding controls, account takeover remains a persistent threat. Fraudsters are now using AI to bypass authentication mechanisms at scale, making previously reliable security gates less trustworthy, Tresner explains. "Successful authentication can no longer serve as a definitive indicator of safety."

Indeed, a properly authenticated session may still be the entry point for fraud, whether committed by an intruder or through a legitimate customer who is being manipulated.

4. The “all green” problem

Which brings us to another fraud scenario financial institutions increasingly face, and one that Tresner says may be 2026's most operationally challenging: many scams simply don't trigger traditional fraud controls. When the legitimate account holder initiates a transaction from their usual device and location using correct credentials, every standard check appears normal. The difference is the persuasion happening on the other side: fraudsters convince victims they're interacting with trusted entities like banks, law enforcement, or romantic partners, then direct them to transfer money.

Seguin notes that detecting these scenarios requires new approaches, such as identifying subtle behavioral signals like hesitation immediately before a money transfer. “Traditional device and credential checks won’t help when the customer is genuinely authenticated but acting under manipulation,” she explains.
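Seguin's hesitation example can be made concrete with a deliberately simplified sketch. Everything here is a hypothetical illustration rather than any vendor's actual model: the names (`TransferEvent`, `hesitation_score`), the dwell-time feature, and the z-score approach are all assumptions. The idea is to compare a customer's pause on the transfer-confirmation screen against that same customer's own historical baseline.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class TransferEvent:
    """One attempted transfer with a simple behavioral timing feature."""
    seconds_on_confirm_screen: float  # dwell time before pressing "send"
    amount: float

def hesitation_score(event: TransferEvent, history: list[TransferEvent]) -> float:
    """Z-score of confirm-screen dwell time against the customer's own history.

    A large positive score means the customer paused far longer than usual.
    It is one possible input to a broader scam-risk model, never proof of
    manipulation on its own.
    """
    if len(history) < 5:
        return 0.0  # too little baseline; defer to other controls
    baseline = [e.seconds_on_confirm_screen for e in history]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return 0.0
    return (event.seconds_on_confirm_screen - mu) / sigma
```

A session scoring several standard deviations above the customer's norm would then be routed for extra scrutiny alongside other behavioral and cross-channel signals, rather than blocked outright.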

5. Fraud as an industrial operation

Tresner emphasizes that modern fraud is not a series of isolated events but a coordinated, multi-step operation. Campaigns typically begin with establishing or compromising mule accounts, then deploying automated phishing kits to harvest personal data.


Younger users represent a growing target due to their online activity and platform usage, and the emergence of human trafficking-linked fraud operations has worsened this problem.


Not surprisingly, younger users represent a growing target due to their online activity and platform usage, Seguin says, adding that the emergence of human trafficking-linked fraud operations, including sextortion and overseas scam compounds, has worsened this problem.

What works in 2026

Tresner’s core recommendation for fraud investigators in financial institutions is for them to shift their focus from static, point-in-time checks to behavior-based detection. “Behavior profiling and analytics across channels can identify sophisticated actors and manipulation patterns invisible in single transactions or logins,” he explains, stressing that real-time cooperation among financial institutions is critical because fraudsters collaborate, and isolated defenses are insufficient.
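Tresner's point about real-time cooperation can also be sketched in miniature. The snippet below is an illustrative assumption, not a real consortium protocol: institutions exchange one-way hashes of suspected mule-account identifiers so they can spot overlap without sharing raw account numbers. The function names and the shared-salt scheme are invented for this example; production schemes use keyed constructions and much stronger privacy guarantees.

```python
import hashlib

def share_token(account_identifier: str, salt: str) -> str:
    """One-way token so partner institutions can compare suspected mule
    accounts without exposing raw account numbers (illustrative only)."""
    return hashlib.sha256((salt + account_identifier).encode()).hexdigest()

def mutual_suspects(my_suspects: set[str], partner_tokens: set[str], salt: str) -> set[str]:
    """Accounts we suspect that a partner institution has also flagged."""
    return {acct for acct in my_suspects if share_token(acct, salt) in partner_tokens}
```

An account flagged by two or more institutions is far more likely to be part of a coordinated campaign than anything a single institution's isolated defenses could establish.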

Further, Seguin reframes fraud prevention as a growth enabler. “Effective risk controls allow institutions to launch products faster, set higher transaction limits with confidence, and avoid overly restrictive policies driven by fraud concerns,” she notes. Indeed, modern fraud defense isn’t just about reducing losses but about enabling safe expansion.

The 2026 fraud landscape presents compounding challenges: AI-driven scale and realism, onboarding uncertainty from synthetic identities and hidden intent, weakening authentication boundaries, scams that produce legitimate-looking transactions, and industrialized fraud operations that can span channels and institutions. Success in this area requires financial institutions to treat fraud as a behavioral, multi-channel, collaborative challenge because that’s exactly how their adversaries are operating.


You can learn more about the many challenges facing financial institutions today here

]]>
Scams aren't just fraud: they're engineered to exploit human nature /en-us/posts/corporates/scams-fraud-exploiting-human-nature/ Thu, 20 Nov 2025 19:02:27 +0000 https://blogs.thomsonreuters.com/en-us/?p=68515

Key insights:

      • Traditional fraud breaks systems; scams break people: Scams directed against individuals weaponize trust, urgency, and emotion and hit victims when they're stressed or distracted.

      • More than 1-in-5 adults have lost money to scams, and that number is climbing: Criminals now wield deepfakes, voice cloning, and AI to make their pitches eerily convincing, and the curve is still bending in their favor.

      • By the time someone reaches the payment screen, manipulation has already won: Real protection means flagging suspicious outreach early, verifying identities in real time, and building friction into high-risk transactions, all before emotions override logic.


One of the most fundamental distinctions in financial security is this: Every scam is a fraud, but not all fraud is a scam. It's worth pausing to note what makes scams different, and why that difference matters more than ever in 2025.

Traditional fraud typically exploits weak systems, such as stolen credentials, manipulated data, or technical vulnerabilities. Scams, on the other hand, exploit something far more powerful and harder to patch: human nature itself. Scams can weaponize trust, urgency, and emotion; and when those psychological levers are pulled at just the right moment, even savvy people can find themselves wiring money to someone they'll never see again.

The threat is only growing

The numbers tell a sobering story: more than 1-in-5 adults (22%) report losing money to scams. And Ayelet Biger-Levin, creator of ScamRanger, a technology designed to stop scams before they happen, doesn't mince words about the growing threat of scams: "From a numbers perspective, scams are on the rise," she says. "They're going to continue to rise because criminals are becoming more sophisticated, leveraging the latest technology advancements including large language models (LLMs) and AI agents to scale operations."

Indeed, her definition cuts straight to what makes scams unique. "A scam is social engineering to convince an individual to either disclose personal information or transfer money directly to a criminal," she explains, adding that it's not a system breach; rather, it's a conversation that goes wrong, often in ways the victim doesn't realize until it's too late.

And the trajectory isn't encouraging. Biger-Levin expects the number of adults being victimized to increase over the next 12 to 18 months. "In the US, I expect it to rise," she notes. "Criminals are rapidly leveraging tools that make scams more believable, such as deepfakes and voice cloning, which are used for impersonation to increase both scale and success."

And while we haven’t reached the tipping point yet, the curve isn’t bending in our favor.

Scams adapt to every new channel we create

Here's the uncomfortable truth: Scams aren't a glitch in the system; rather, they're a feature of human society that adapts with every new communication channel we build. Romance scams, investment lures, fake shopping sites, cryptocurrency schemes: these aren't amateur operations anymore. They're often run by organized networks, sometimes operating out of compounds in Southeast Asia, and they're supercharged by technology that makes deception easier and more convincing than ever.

Deepfakes can put your CEO's face on a video call. Voice cloning can mimic a family member in distress. Increasingly, agentic AI can personalize phishing at scale, crafting messages that feel eerily tailored to your life. Educating people about ways to keep from becoming victims helps, absolutely. However, when a persuasive story lands at exactly the wrong moment, when you're stressed, distracted, or emotionally vulnerable, logic often takes a back seat.

And if those fighting fraud are waiting until a victim reaches the payment screen to intervene, they’re already too late.

Meeting manipulation where it starts

To make real progress, we need to meet manipulation at first contact, the moment persuasion begins. That means pairing human-centered design with protective technology across the entire scam lifecycle.

What does that look like in practice? It means flagging risky outreach before it reaches an inbox; verifying websites and identities in real time, in context; and slowing down high-risk payments while prompting users with friction that feels helpful, not punitive. Critically, it also means sharing signals and liability across the ecosystem, among banks, telcos, social platforms, and regulators, so they can all work from the same playbook.
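As a rough illustration of friction that feels helpful rather than punitive, here is a hypothetical sketch of a tiered friction gate: the riskier the combination of signals (flagged outreach, a new payee, a large amount), the heavier the intervention. The function name, signal names, and thresholds are invented for this example and are not drawn from any real bank's policy.

```python
def payment_friction(amount: float, new_payee: bool, outreach_flagged: bool) -> str:
    """Pick a proportionate intervention for a payment (illustrative thresholds)."""
    risk = 0
    risk += 2 if outreach_flagged else 0  # suspicious inbound contact preceded this payment
    risk += 1 if new_payee else 0         # first transfer to this recipient
    risk += 1 if amount > 1000 else 0     # larger-than-typical amount
    if risk >= 3:
        return "hold_and_call"          # pause payment, contact customer out of band
    if risk == 2:
        return "confirm_with_context"   # plain-language warning tailored to the scenario
    if risk == 1:
        return "soft_prompt"            # lightweight nudge, minimal friction
    return "allow"
```

The design point is proportionality: routine payments flow untouched, while the rare high-risk combination gets enough friction and context for the customer's "this doesn't feel right" instinct to kick in before the money moves.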

The constant in all of this is human psychology. The variable is how well our systems anticipate it.

Biger-Levin says she is optimistic about enforcement improving over time. "I do predict that long-term, these scam compounds are going to be taken down," she says, adding that she's also realistic about what comes next. "Criminals are not going to stop there, and by using advanced technology will continue to attack individuals. The one common denominator, though, is human psychology, and that is something we can tackle and protect with the right consumer empowerment in place."

That's the core challenge. Regulators or financial services compliance agents can shut down a scam operation, but they can't patch human emotion. Technology solutions must be designed around how people actually think and behave under pressure, not how we wish they would. That means building systems that recognize when someone is being groomed, when urgency is being manufactured, and when trust is being weaponized.

The old advice still holds… because it reflects how we think

There's a reason the classic warnings never go out of style. The old saying, "If something seems too good to be true, it probably is," is not outdated wisdom; it's a reflection of how scams work: by promising outsized returns, instant solutions, or emotional rewards that bypass our rational filters.

Gut checks still matter, Biger-Levin reminds us, though that doesn't mean we can rely on individuals to shoulder the entire burden of vigilance, especially when criminals are using industrial-grade tools to manipulate them.

Scams will always evolve. So, the question isn’t whether they’ll disappear 鈥 they won’t. The question is whether we’re willing to build systems smart enough to protect the humans inside them.

That means reducing exposure at the source, disrupting grooming tactics before they gain momentum, and making the "this doesn't feel right" moment easier to spot, and safer to act on. It also means treating scam prevention not as a user education problem, but as a systems design problem.

We can bend the curve, but only if we stop treating scams as individual failures and start treating them as the systemic, technology-enabled threats they’ve become. The tools already exist; however, the challenge is coordination, accountability, and a willingness to bake protection into every layer of the digital experience.

Because human psychology remains the constant denominator, what we can change is how well our systems anticipate it, and how much harder we make it for criminals to exploit it.

Staying ahead of the scammers

To stay ahead of these scammers, organizations and consumers should take practical steps to prevent and minimize risks. For example, they should stay up to date on the latest scam tactics by keeping an eye on consumer protection updates. These can help you spot red flags, such as urgent demands or unusual payment requests, that may signal a scam.

Also, when you receive unsolicited calls or emails, take a moment to verify their authenticity. Instead of responding right away, contact the organization directly using official contact information. Legitimate companies typically won’t ask for sensitive information like passwords or account details out of the blue.

Finally, boost your digital security by using strong, unique passwords and enabling two-factor authentication. Be cautious when clicking links and avoid those that seem suspicious. Scammers often rely on high-pressure tactics to prompt rushed decisions; so by taking a step back and evaluating the situation carefully, you often can avoid falling prey to their schemes.


You can find out more about how businesses and individuals are navigating fraud schemes here

]]>