Apr 10, 2023 - The burden of complexity

Complexity is present in any project. Some have more, some have less, but it’s always there. The manner in which a team handles complexity can pave the way for a project’s success or lead towards its technical demise.

In the context of software, complexity arises from a variety of factors, such as complicated requirements, technical dependencies, large codebases, integration challenges, architectural decisions, team dynamics, among others.

When talking to non-technical folks, especially those not acquainted with the concepts of software complexity and technical debt, it can be helpful to present the topic from a more managerial perspective.

So I propose the following qualitative diagram that regards complexity as an inherent property of a software project and, simultaneously, a responsibility that a software development team must constantly watch and manage in order to keep delivering value in the long run:

[Diagram: complexity burden and team capacity over time, divided into improvement and degradation zones]

From the diagram:

  • The Complexity Burden curve represents the theoretical amount of effort necessarily spent servicing complexity, as opposed to doing productive work. This is an inevitable aspect of software development and can manifest in various forms, including spending time understanding and working with complex code, encountering more intricate bugs and errors, updating dependencies, and struggling to onboard new team members due to excessively elaborate designs, among others.

  • The Team’s Capacity line is the maximum amount of effort the team is able to provide, which varies over time and can be influenced by factors such as changes in the product development process, team size, and efforts to eliminate toil [1]. Additionally, reductions in the complexity burden of a project can unlock productivity, influencing the team’s capacity as well.

  • The Complexity Threshold represents the point where the team’s capacity becomes equal to the complexity burden. In this theoretical situation, the team is only allocating capacity towards servicing complexity. Value delivery is compromised.

With these definitions in place, let’s review the two colored zones depicted in the diagram.

The Improvement Zone

Projects are typically in the improvement zone, which means the team has enough capacity to handle the complexity burden and still perform productive work. The lower the complexity burden, the more efficient and productive the team will be in delivering results. The team can choose to innovate, develop new features, optimize performance, and improve UX. It’s worth noting that doing so may result in added complexity. This is acceptable as long as there is sufficient capacity to deal with it in subsequent development cycles and the team remains continuously committed to addressing technical debt.

The Degradation Zone

A project enters the degradation zone when the team's capacity is insufficient for adequately servicing complexity, adding pressure to an already strained project. The team will constantly be putting out fires: new features will take longer to ship, bugs will be more likely to be introduced, developers may suggest rewriting the application, availability may be impaired, and customers may grow dissatisfied. The only viable ways out of this situation are to significantly reduce complexity or to increase capacity. Other efforts will be mostly fruitless.

Closing thoughts

The concept of complexity burden can be a valuable tool for enriching discussions around promoting long-term value delivery and preventing a project from becoming bogged down by complexity, leaving little room for new feature development. It’s important to make decisions with a clear understanding of the complexity burden and how it may be affected.

It’s worth pointing out that if the productive capacity of a team is narrow, i.e., if the proportion of the team’s capacity allocated to the complexity burden is already too high, the team will find itself in a situation where continuing to innovate may be too risky. The wise decision then is to prioritize paying off technical debt and investing in tasks that alleviate the complexity burden.

Even though they are related, it’s crucial to distinguish between the complexity burden and technical debt. The former materializes as the amount of (mostly) non-productive work a team is encumbered by, while the latter is a liability that arises from design or implementation choices that prioritize short-term gains over long-term sustainability [2]. A project can become highly complex even with low technical debt.

Finally, a project is a dynamic endeavor, and a team may find itself momentarily in the “degradation” zone in one cycle and in the “improvement” zone in the next. What matters most is to be aware of the technical context and plan next steps preemptively, aiming to keep the complexity burden at a healthy level.


References

[1] Google - Site Reliability Engineering. Chapter 5 - Eliminating Toil

[2] Wikipedia - Technical debt

Apr 15, 2022 - An inherent source of correlation in the crypto market

If you follow the crypto market, you may already be familiar with the strong direct correlation between cryptocurrency prices, native or tokenized. When BTC goes down, pretty much everything goes down with it, and when BTC is up, everything is most likely up too. This correlation isn’t exclusive to BTC: impacts on the price of many other cryptocurrencies (such as ETH) also reverberate across a wide range of crypto assets, with different degrees of strength among them.

[Chart: correlated price movements across crypto assets]

Have you ever asked yourself why this is so? One straightforward answer is that large events impacting the price of BTC (or another crypto) make investors reassess their exposure not only to BTC but to other crypto assets in a similar manner. This answer is intuitive, but mostly behavioral and hard to quantify. However, there are other, structural sources of correlation between cryptocurrencies that are often overlooked, and in this post I analyze one of them: decentralized exchange (DEX) pairs.

DEX pairs

A DEX pair, popularized by Uniswap, can be viewed as a component in the blockchain that provides liquidity to the crypto market and allows wallets to trade one asset for another in a decentralized way. For instance, the BTC/ETH pair allows traders to swap between these two currencies in either direction. Likewise, the BTC/USDC pair allows traders to exchange bitcoin for stablecoins, and vice versa.

And how do DEX pairs build up correlation between cryptocurrencies? To answer this we need to dive a bit into how DEX pairs work:

[Diagram: a DEX pair holding a supply of two tradable assets]

First, in order to provide liquidity, a DEX pair needs a reasonable supply of both of its tradable assets. It then implements a mathematical formula for calculating the exchange rate between these two assets that honors the supply/demand rule, i.e., the scarcer one of the assets becomes in the DEX pair’s supply, the more valuable it gets in relation to the other asset.

Second, the sensitivity of the exchange rate in a DEX pair will depend on the total value locked (TVL) in its supply. Each trade performed against the DEX pair changes the relation of its assets, thus changing the effective exchange rate for succeeding trades. The higher the TVL, the less sensitive the exchange rate will be with regard to the trade size.
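
To make this concrete, below is a minimal sketch of a fee-less constant-product pool (the x · y = k formula popularized by Uniswap V2), showing how the same trade moves the exchange rate much less in a deeper pool. The figures are illustrative:

```python
def amount_out(reserve_in: float, reserve_out: float, amount_in: float) -> float:
    """Fee-less constant-product swap (x * y = k): tokens received for amount_in."""
    k = reserve_in * reserve_out
    return reserve_out - k / (reserve_in + amount_in)

# The same US$ 1M buy against two BTC/USDC pools of different depth,
# both initially priced at 45,000 USDC per BTC:
for tvl in (300e6, 3e9):
    usdc, btc = tvl / 2, (tvl / 2) / 45_000
    btc_out = amount_out(usdc, btc, 1_000_000)
    # The deeper the pool, the closer the effective rate stays to the 45,000 spot price
    print(f"TVL US$ {tvl:>13,.0f} -> effective rate: {1_000_000 / btc_out:,.0f} USDC/BTC")
```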

Implications on correlation

Now we can start exploring the implications on correlation. You see, a DEX pair is basically a bag that locks the supplies of two cryptocurrencies together, creating a tight relationship between them. Even so, with only one or two DEX pairs to play with, you won’t achieve much with respect to correlating the assets’ prices against the dollar. But if we define a closed system with at least three DEX pairs, like the one shown below, interesting things start to happen:

[Diagram: closed system of three DEX pairs (BTC/USDC, ETH/USDC, BTC/ETH)]

In this system we have:

  • One BTC/USDC pair defining a price of US$ 45k per BTC
  • One ETH/USDC pair defining a price of US$ 3.2k per ETH
  • One BTC/ETH pair creating a relationship between these two assets
  • Pricing consistency between all pairs
  • US$ 300M TVL in each pair

Remember that each DEX pair defines an independent exchange rate between two assets. So if we buy a lot of BTC from the BTC/USDC pair with stablecoins, for instance in a US$ 1M trade, we’ll generate upward pressure on the price of BTC as defined by that pair:

[Diagram: a US$ 1M USDC-for-BTC trade against the BTC/USDC pair]

Trade details (reproduced in the sketch below):

  • 1M USDC input
  • 22.075 BTC output
  • Effective Ex. Rate of 45300
  • Resulting BTC/USDC pair Ex. Rate of 45602
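
These numbers can be reproduced with the fee-less constant-product sketch from above (a simplification: the live Uniswap V2 protocol also charges a 0.30% swap fee, which is ignored here):

```python
usdc, btc = 150e6, 150e6 / 45_000       # US$ 300M TVL, split evenly at 45,000 USDC/BTC
k = usdc * btc                          # constant-product invariant

amount_in = 1_000_000                   # 1M USDC input
btc_out = btc - k / (usdc + amount_in)
usdc, btc = usdc + amount_in, btc - btc_out

print(f"BTC output:         {btc_out:.3f}")               # 22.075
print(f"Effective ex. rate: {amount_in / btc_out:,.0f}")  # 45,300
print(f"New pair ex. rate:  {usdc / btc:,.0f}")           # 45,602
```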

This new price is unbalanced with respect to the other pairs, opening an arbitrage opportunity: a trader holding USDC can now buy ETH from the ETH/USDC pair, exchange the ETH for BTC in the BTC/ETH pair, and finally sell the BTC for stablecoins in the BTC/USDC pair, making a profit.

[Diagram: arbitrage route USDC → ETH → BTC → USDC across the three pairs]
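
A single round trip through this route, starting from the post-trade state, already yields a positive profit. Here’s a sketch (fee-less, with reserves rounded from the figures above):

```python
def amount_out(r_in, r_out, dx):        # fee-less constant-product swap
    return r_out - (r_in * r_out) / (r_in + dx)

# Pool reserves right after the US$ 1M BTC buy:
a_btc, a_usdc = 3_311.258, 151e6        # BTC/USDC (price pushed up)
b_eth, b_usdc = 46_875.0, 150e6         # ETH/USDC (untouched)
c_btc, c_eth = 3_333.333, 46_875.0      # BTC/ETH  (untouched)

# Round trip: USDC -> ETH -> BTC -> USDC
usdc_in = 100_000
eth_got = amount_out(b_usdc, b_eth, usdc_in)
btc_got = amount_out(c_eth, c_btc, eth_got)
usdc_out = amount_out(a_btc, a_usdc, btc_got)
print(f"Arbitrage profit: {usdc_out - usdc_in:,.0f} USDC")  # positive while prices disagree
```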

Now let’s consider that a trader took advantage of this arbitrage opportunity to the fullest, and analyze the resulting exchange rates once the system reaches equilibrium:

[Diagram: system state after the arbitrage is exhausted and equilibrium is reached]

So, compared to the initial state of the system, the US$ 1M trade to buy BTC had the effect of:

  • Raising the price of BTC by 0.89% (from US$ 45,000.00 to US$ 45,400.00)
  • Raising the price of ETH by 0.44% (from US$ 3,200.00 to US$ 3,214.20)
  • Inflating the TVL in the system by 0.44% (from US$ 900M to US$ 904M)

As you can see, the initial rise in the BTC price opened up an arbitrage opportunity that, once exploited to exhaustion, had the effect of raising the price of ETH as well. To put it simply, this closed system created an inherent correlation between the ETH and BTC prices.
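
The equilibrium itself can be found numerically. Without fees, the arbitrage profit along the USDC → ETH → BTC → USDC route is unimodal in the input size, and at its maximum the pairs’ exchange rates are mutually consistent again, so a simple ternary search recovers the figures above. A sketch:

```python
def amount_out(r_in, r_out, dx):        # fee-less constant-product swap
    return r_out - (r_in * r_out) / (r_in + dx)

A = [3_311.258278, 151e6]               # BTC/USDC reserves after the US$ 1M buy
B = [46_875.0, 150e6]                   # ETH/USDC reserves
C = [3_333.333333, 46_875.0]            # BTC/ETH reserves

def profit(u):                          # USDC -> ETH -> BTC -> USDC round trip
    e = amount_out(B[1], B[0], u)
    b = amount_out(C[1], C[0], e)
    return amount_out(A[0], A[1], b) - u

lo, hi = 0.0, 5e6                       # ternary-search the profit-maximizing input
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    lo, hi = (m1, hi) if profit(m1) < profit(m2) else (lo, m2)
u = (lo + hi) / 2

# Apply the optimal arbitrage trade to the pools:
e = amount_out(B[1], B[0], u); B[1] += u; B[0] -= e
b = amount_out(C[1], C[0], e); C[1] += e; C[0] -= b
o = amount_out(A[0], A[1], b); A[0] += b; A[1] -= o

print(f"BTC price: {A[1] / A[0]:,.2f}")  # ~45,400.00
print(f"ETH price: {B[1] / B[0]:,.2f}")  # ~3,214.20
```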

Conclusion

In this qualitative analysis we’ve seen how a system of DEX pairs builds up correlation between crypto assets as a result of arbitrage being exploited between these pairs. Even though the analysis was based on a simulated US$ 1M trade to buy BTC, similar and consistent results hold for selling BTC, as well as for buying or selling ETH, within this closed system.

As of this writing, Uniswap on Ethereum mainnet alone holds US$ 4.77b of TVL across hundreds of DEX pairs, creating an entangled web of relationships between crypto assets and contributing to the correlation among them.


Notes

  • The simulation whose results are presented in this post was based on Uniswap’s V2 protocol implementation. Similar results should hold for the more complex and recent V3 implementation, which adopts the concept of virtual supplies.

  • The complete source code for running this simulation is provided on GitHub. The routine used for generating the results presented in this post can be found in this code file.

Dec 4, 2021 - Is infinite token approval safe?

It has become common practice for DeFi protocols to use infinite token allowance approvals to improve the end-user experience. From the user’s perspective it’s indeed very convenient, and even appealing: once they grant a dapp (decentralized app) infinite allowance, they can interact with that dapp mostly through single transactions, instead of having to send a token-spending approval transaction before every interaction.

A few months ago, while researching the EIP-2612 proposal, I questioned a similar approach used by the DAI stablecoin. I replicate the question below:

Is DAI-style permit safe to use?

Differently from EIP-2612, which defines a “value” for the allowance, DAI’s approach appears to approve an unlimited allowance for the spender address.

Is it safe to permit a protocol to spend DAI on my behalf?

If not, which use cases is DAI-style permit targeted at?

(link to the full question)

Even though I didn’t get a full answer to my question at the time, one user provided some insight in a comment:

It depends on the protocol. If the protocol is only a smart contract and you see the source code and trust that the contract is bug-free and will only transfer the token based on defined logic and transparent actions/conditions then no harm in doing it (but you can see there is too many “AND”s). – Majd TL

So I concluded that there are too many “AND”s to trust a protocol with an unlimited token allowance approval. It’s more flexible, sure, but riskier than a limited approval if any bug is found in the protocol. Nonetheless, nobody seemed to care; most protocols were doing it by default, without notice.

Skeptical as I am, I carried on, never granting infinite allowance to the dapps I use and adopting a few strategies, which I’ll comment on later, for situations where I needed more flexibility.

But then, after a few months, something happened that reminded me of this matter: the Badger DAO protocol exploit…

US$ 120 million stolen

As reported by rekt, the Badger DAO protocol exploit took place on December 2nd, and a staggering US$ 120 million was stolen from it.

How did this happen? Unlike previous DeFi attacks, which took advantage of smart contract bugs and sophisticated strategies for manipulating protocols’ internal parameters, this one was simple enough that even those unfamiliar with DeFi can follow it easily:

A front-end attack. An unknown party inserted additional approvals to send users’ tokens to their own address. Starting from 00:00:23 UTC on 2.12.2021, the attacker used this stolen trust to fill their own wallet.

Simple as that. For several days, Badger users were accessing the hacked UI and inadvertently approving mostly unlimited allowances to the attacker’s address. The attacker silently watched hundreds of users approve their address, waited until the reward was large enough, then made their move and stole 120 million dollars.

Rumours have been circulating that the project’s Cloudflare account was compromised. Still, it’s a flagrant wake-up call, reminding us that even if a protocol’s smart contracts are audited, battle-tested, and considered reasonably safe, when you interact with that protocol through a dapp you can still fall for a hack if the front-end has been compromised.

Strategies for protecting yourself

There are three main strategies to protect your assets in situations like this when interacting with dapps that don’t support EIP-2612, which I detail below:

1) Always use limited approval: This is the trivial strategy: never grant unlimited allowance; always use two transactions, the first approving a limited allowance for the protocol and the second interacting with it. Some dapps allow you to disable the default setting of unlimited allowance in their UI. For the ones that don’t, you can edit the allowance approval value in your wallet (e.g., MetaMask) before sending the transaction through.
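
For the curious, what the wallet is editing in that case is just the uint256 amount inside the standard ERC-20 approve(address,uint256) calldata. Here’s a minimal sketch of the encoding (the spender address below is a hypothetical placeholder; 0x095ea7b3 is the real function selector):

```python
def approve_calldata(spender: str, amount: int) -> str:
    """ABI-encode an ERC-20 approve(address,uint256) call by hand."""
    selector = "095ea7b3"  # first 4 bytes of keccak256("approve(address,uint256)")
    return ("0x" + selector
            + spender.lower().removeprefix("0x").rjust(64, "0")
            + hex(amount)[2:].rjust(64, "0"))

SPENDER = "0x" + "ab" * 20  # hypothetical protocol contract address

# What many dapps request by default: an effectively infinite allowance...
infinite = approve_calldata(SPENDER, 2**256 - 1)
# ...versus what you can edit it down to: just what the next interaction needs
limited = approve_calldata(SPENDER, 100 * 10**18)  # e.g. 100 tokens with 18 decimals
print(infinite)
print(limited)
```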

2) Use a hot wallet: Another common strategy: in case you really need to grant unlimited allowance (for instance, to reduce transaction fee costs), use a hot wallet, i.e., a separate address that you fund on demand. All funds held by this address will be subject to higher risk, but it will contain a smaller portion of your holdings, so the risk is limited. By the way, avoid using the same hot wallet for multiple dapps, otherwise you’ll be increasing your risk profile.

3) Deploy a proxy contract: This is a more sophisticated strategy which requires you to code a smart contract that interacts with a protocol on your behalf, even bypassing the front-end altogether. I’ve been using this approach to interact with DEXes. I have a non-upgradable proxy smart contract in place to which I send transactions for swapping tokens, and I grant this proxy unlimited allowance from a hot wallet of mine. When I send a swap transaction to the proxy, it first approves a limited allowance in the destination DEX, then performs the swap, and finally transfers the tokens back to my hot wallet. This way I get the best of both worlds: I’m using single transactions to interact with dapps, and my hot wallet is shielded from “allowance exploits”. But be advised that writing smart contracts is inherently risky, so this strategy doesn’t come easy either.

An idea for improving front-end security

Before closing, I would like to discuss an idea for improving the security of front-end dapps. These apps are insecure because, unlike (most) smart contracts and blockchain transactions, their hosting is centralized. A few admins have control of the front-end app, and if one of the admin accounts is hacked, the front-end could be tampered with without anyone noticing.

So we need to make sure we are interacting with an untampered front-end in the first place. And the solution to this has been around for a long time: signed apps. If we define a method for bundling front-end apps and have a DAO-controlled address sign the bundle, we can greatly reduce the front-end attack surface. All users would then access this front-end app and have their wallets check the app’s signature. If the signature of the received front-end bundle doesn’t match the DAO’s signing address, a warning message would be shown advising the user not to interact with the app.
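
As a rough sketch of what the wallet-side check could look like, assuming the bundle is signed with an Ethereum-style key (the addresses and helper names here are hypothetical; the eth_account library does the signature recovery):

```python
import hashlib

from eth_account import Account
from eth_account.messages import encode_defunct

# The DAO's signing address, registered/bookmarked out-of-band by the user:
TRUSTED_DAO_SIGNER = "0x" + "cd" * 20  # hypothetical address

def bundle_is_authentic(bundle: bytes, signature: bytes) -> bool:
    """Recover the signer of the bundle's hash and compare it to the trusted address."""
    digest = hashlib.sha256(bundle).hexdigest()
    signer = Account.recover_message(encode_defunct(text=digest), signature=signature)
    return signer.lower() == TRUSTED_DAO_SIGNER.lower()

# The wallet would run this check before letting the user interact with the dapp:
# if not bundle_is_authentic(downloaded_bundle, published_signature):
#     warn_user_and_block_interaction()
```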

There’s one catch, though: for this idea to work, we would need to manually register/bookmark all the DAO signing addresses we trust. Let’s just hope we don’t get them from a hacked front-end then 😅