DeFi Is Optimizing for Gas, Not for Markets
DeFi's gas-optimized design cracks under volatility. Here's why fixed liquidation logic fails — and what João García says must change in 2026.

What to Know
- DeFi's core architecture was engineered around gas fee constraints, not financial resilience — and that trade-off is getting harder to ignore
- During MakerDAO's Black Thursday in March 2020, vaults were liquidated at effectively zero bids as auction mechanics collapsed under network congestion
- The 2023 Curve Finance exploit exposed how LP tokens treated as static collateral can radiate systemic risk across interconnected lending protocols
- João García, DevReal lead at Cartesi, argues that without richer on-chain computation, complex risk logic migrates off-chain — making DeFi less transparent, not more
DeFi's gas optimization problem is not a bug awaiting a fix — it's a foundational design decision that increasingly looks like a structural liability. The systems that power decentralized lending, trading, and derivatives were built to minimize computational cost, not to survive a volatility spike. And every time markets turn ugly, that compromise shows up exactly where you don't want it: in liquidation engines that can't adapt, collateral thresholds that move too slowly, and risk models that live off-chain where nobody can audit them.
Why DeFi Rebuilt Finance With One Hand Tied
The pitch was always that DeFi would out-compete Wall Street by being transparent, permissionless, and non-custodial. What nobody put in the brochure: these systems were also built under severe computational constraints that forced developers to strip financial logic down to its barest essentials. On Ethereum and similar chains, floating-point arithmetic doesn't exist natively, iterative simulations burn through gas fast, and recomputing cross-asset exposure in real time is expensive enough to be practically off the table.
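The missing floating point is concrete, not abstract. Because EVM-style chains only offer integer arithmetic, protocols encode values in 18-decimal fixed point (the "WAD" convention popularized by MakerDAO's ds-math). A minimal Python sketch of that pattern, using illustrative vault numbers, shows how a collateral ratio gets computed without floats:

```python
# Sketch of 18-decimal ("WAD") fixed-point math, the pattern EVM contracts
# use because native floating point is unavailable on-chain.
# All position numbers below are hypothetical.
WAD = 10**18  # 1.0 represented as 1e18

def wad_div(a: int, b: int) -> int:
    """Fixed-point division: (a / b), result scaled back to WAD precision."""
    return a * WAD // b

# Vault: 10 ETH of collateral at $2,000, against 12,000 DAI of debt.
collateral_value = 10 * 2000 * WAD   # USD value, WAD-scaled
debt_value = 12000 * WAD

ratio = wad_div(collateral_value, debt_value)  # collateralization ratio
LIQUIDATION_RATIO = 150 * WAD // 100           # static 150% threshold

print(ratio / WAD)                # roughly 1.67, i.e. 166% collateralized
print(ratio < LIQUIDATION_RATIO)  # False, safe at this price
```

Everything here is cheap: a multiply, a divide, a comparison. Anything iterative, by contrast, multiplies gas cost per loop step, which is why the richer models never made it on-chain.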
So DeFi converged on static ratios, fixed liquidation curves, and governance-driven parameter updates — not because anyone ran the numbers and concluded this was optimal, but because more sophisticated models simply cost too much to run. That compression worked fine when markets were calm. It tends to fall apart when they aren't.
João García, DevReal lead at Cartesi, puts it plainly: what looks like a design preference is often a concession to execution limits. The architecture performs adequately in stable conditions, but volatility has a way of testing its edges — and DeFi's edges are more rigid than most users realize.
Three Times the Architecture Cracked Under Pressure
The evidence isn't theoretical. MakerDAO's Black Thursday in March 2020 is the cleanest case study: vaults were liquidated at effectively zero bids when auction mechanics seized up under collapsing ETH prices and network congestion. The liquidation engine couldn't adapt because it wasn't built to. Fixed formulas met a dynamic crisis, and the mismatch was brutal.
Aave and Compound saw similar dynamics in later downturns — mass liquidations triggered by rigid collateral ratios rather than any kind of portfolio-level risk recalculation. If your threshold is static, a flash move to the threshold triggers the same response whether volatility is 15% or 80%. That's not a risk model. That's a tripwire.
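The tripwire point is easy to make concrete. In the hypothetical sketch below, a static 150% trigger returns the same answer for the same position in any market regime, while a volatility-scaled rule (one illustrative formula among many) widens the required buffer as volatility rises and flags the high-volatility case earlier:

```python
# Contrast between a static liquidation trigger and a volatility-scaled one.
# The scaling formula and all numbers are illustrative assumptions.

STATIC_THRESHOLD = 1.50  # liquidate below 150% collateralization

def static_trigger(ratio: float) -> bool:
    """Fires on the ratio alone, blind to market conditions."""
    return ratio < STATIC_THRESHOLD

def vol_aware_trigger(ratio: float, annualized_vol: float) -> bool:
    """Raises the required buffer as volatility climbs."""
    threshold = STATIC_THRESHOLD * (1 + 0.75 * annualized_vol)
    return ratio < threshold

ratio = 1.70  # same position, evaluated under two volatility regimes

print(static_trigger(ratio))             # False, "safe" regardless of regime
print(vol_aware_trigger(ratio, 0.15))    # False, calm market: buffer holds
print(vol_aware_trigger(ratio, 0.80))    # True, stressed market: flagged early
```

The static version is what a tripwire looks like in code: one comparison, no context. The second version is barely more expensive off-chain, but keeping a live volatility estimate on-chain is where the gas economics start to bite.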
Then came the 2023 Curve Finance exploit. After a smart contract vulnerability drained liquidity from Curve's pools, the damage radiated outward — specifically into lending protocols that had been treating Curve LP tokens as if they were stable, fixed-value collateral. They weren't. The contagion wasn't a one-in-a-million edge case. It was the predictable result of a system that models risk statically in a market that moves dynamically.
In each instance, García notes, decentralization itself wasn't the breaking point. The breaking point was rigid financial logic operating inside an execution layer that couldn't recompute risk as conditions deteriorated.
When risk cannot be modeled and recomputed transparently on-chain, it migrates off-chain into dashboards, analytics teams, discretionary parameter adjustments and emergency governance coordination.
Does DeFi's Risk Logic Actually Live On-Chain?
Here's the uncomfortable reality: for many protocols, the answer is increasingly no. The settlement layer is on-chain. The adaptive intelligence that keeps the system stable during stress — that part is migrating off-chain into analytics dashboards, parameter committees, and emergency governance calls. During volatility spikes, protocols depend on rapid human coordination to adjust parameters before the system breaks.
Oracles and large token holders end up with outsized influence over outcomes precisely because the protocol can't self-adjust. The blockchain remains the immutable ledger. The actual decision-making under pressure happens somewhere else — and that somewhere is far less auditable than a smart contract.
This is the gap García is pointing at. Traditional markets evolved in the opposite direction: banks and clearinghouses run thousands of stress scenarios continuously, with margin requirements that respond dynamically to shifting volatility regimes and correlation structures. That requires substantial computational infrastructure — which is exactly what public blockchains weren't designed to provide. The question is whether that design assumption still holds as DeFi scales into instruments that are genuinely complex and deeply interdependent.
What Has to Change — and What's Actually at Stake
García's argument isn't that DeFi needs to become more centralized. It's that computation needs to be treated as a first-class primitive. If verifiable execution environments can approximate general-purpose computing, the financial design space expands considerably: native floating-point support, iterative algorithms, access to established numerical libraries. That's what would let a lending protocol do scenario-based stress testing instead of relying on a fixed liquidation ratio.
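What "scenario-based stress testing" means in practice is iterative simulation, exactly the workload gas pricing rules out. As a hedged illustration (a toy Monte Carlo with a simple Gaussian-return walk, all parameters invented for the example), the same position that a static 150% ratio treats as safe can carry very different breach probabilities depending on the volatility regime:

```python
# Hypothetical sketch: scenario-based stress testing of a lending position.
# This kind of looped simulation is prohibitively gas-expensive in an
# EVM-style environment. All parameters are illustrative.
import random

def stress_test(collateral_eth, debt_usd, price, vol_daily,
                horizon_days=7, n_scenarios=10_000, seed=42):
    """Fraction of simulated price paths ending undercollateralized."""
    rng = random.Random(seed)
    breaches = 0
    for _ in range(n_scenarios):
        p = price
        for _ in range(horizon_days):
            p *= 1 + rng.gauss(0, vol_daily)  # simple Gaussian-return walk
        if collateral_eth * p < debt_usd:
            breaches += 1
    return breaches / n_scenarios

# Same position (about 154% collateralized), two volatility regimes:
print(stress_test(10, 13_000, 2_000, vol_daily=0.02))  # calm: near zero
print(stress_test(10, 13_000, 2_000, vol_daily=0.08))  # stressed: materially higher
```

A fixed liquidation ratio compresses that whole distribution into a single comparison; the simulation is what recovering the distribution costs.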
Margin requirements could respond to observed volatility rather than waiting on governance. Credit systems could recompute multivariable risk scores transparently — replacing binary heuristics with something closer to how risk actually works. The point isn't complexity for its own sake. It's keeping financial intelligence inside the protocol, where users can audit it, rather than externalizing it into operational layers they can't see.
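One standard shape for volatility-responsive margin, offered here purely as a sketch of the idea rather than any protocol's actual design, is an exponentially weighted (EWMA, RiskMetrics-style) volatility estimate feeding a margin floor. All constants below are assumptions:

```python
# Hypothetical sketch: a margin requirement that tracks observed volatility
# via an EWMA estimate instead of waiting on a governance vote.
import math

LAMBDA = 0.94         # EWMA decay, a RiskMetrics-style assumption
BASE_MARGIN = 0.10    # 10% floor, illustrative
VOL_MULTIPLIER = 3.0  # how aggressively margin scales with daily vol

def ewma_vol(returns, lam=LAMBDA):
    """Exponentially weighted volatility of a daily return series."""
    var = 0.0
    for r in returns:
        var = lam * var + (1 - lam) * r * r
    return math.sqrt(var)

def margin_requirement(returns):
    return max(BASE_MARGIN, VOL_MULTIPLIER * ewma_vol(returns))

calm = [0.002, -0.001, 0.003, -0.002] * 10  # quiet market
stressed = calm + [-0.12, 0.08, -0.15]      # volatility spike

print(margin_requirement(calm))      # sits at the 10% floor
print(margin_requirement(stressed))  # climbs with realized volatility
```

The loop over returns is trivial off-chain and is exactly the recurring recomputation that current gas economics discourage; in today's protocols, the equivalent adjustment arrives as a governance parameter change days later.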
The fork in the road looks like this: one path preserves gas-optimized minimalism at the base layer while increasingly sophisticated risk logic migrates off-chain. Clean smart contracts, opaque operations. The other path accepts more capable execution environments in exchange for systems that can adapt, recompute, and stress-test transparently on-chain. Neither is free — but only one of them is honest about where the complexity actually lives.
Right now, DeFi projects simplicity in its code while relying on discretion in practice. That gap will widen as leverage increases and instruments grow more interdependent. Markets won't slow down to accommodate virtual machine constraints — and at some point, pretending otherwise stops being a technical footnote and starts being a systemic risk.
Frequently Asked Questions
What is DeFi's gas optimization problem?
DeFi protocols were built to minimize on-chain computation costs (gas fees), which forced developers to use simplified financial models — static collateral ratios, fixed liquidation curves — instead of dynamic risk systems. This works in calm markets but breaks down under volatility because the logic can't adapt in real time.
What happened during MakerDAO's Black Thursday in 2020?
In March 2020, MakerDAO vaults were liquidated at effectively zero bids. As ETH prices collapsed and Ethereum network congestion spiked, the protocol's fixed auction mechanics failed to find clearing prices. The result was mass undercollateralized liquidations — a direct consequence of rigid on-chain logic meeting a dynamic crisis.
How did the 2023 Curve Finance exploit create systemic risk?
A smart contract vulnerability drained Curve's liquidity pools, but the wider damage came from lending protocols that had been accepting Curve LP tokens as collateral at static valuations. When pool values crashed, those protocols faced cascading losses — an example of how static collateral models fail to price dynamic risk.
What does Cartesi propose to fix DeFi's computational limits?
Cartesi's position, articulated by DevReal lead João García, is that DeFi needs verifiable execution environments that support general-purpose computation — including native floating-point arithmetic, iterative algorithms, and numerical libraries — so that complex risk models can run transparently on-chain rather than being pushed into opaque off-chain operations.
