Latent Cultural Potential

NFTs have special powers, afforded by their technical details and the ledger on which they reside. These powers can be exercised across remote stretches of time.

I confess a love of non-fungible tokens (NFTs). I can recall my first purchase years ago. When you appreciate blockchain ledgers, you quickly appreciate the idea of non-fungible provenance, a sense of owning a little slice of a new digital reality — and the capacity to prove that ownership, engage with it, trade it, program over it, and more.

This post is unabashed boosterism for this type of data. An NFT is, after all, just a data type: a way of assigning some information to a cryptographically signable public key. Despite the controversy around NFTs as mere “pointers to URLs,” they can be much more than this.
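
That idea can be made concrete with a toy sketch. The following is not any real contract standard’s code, only an in-memory illustration of the core of ERC-721: a token ID mapped to an owner’s address, with transfers authorized only by the current owner (on a real chain, the sender is proven by a cryptographic signature).

```python
class TinyNFT:
    """Toy, in-memory model of non-fungible ownership (illustration only)."""

    def __init__(self):
        self._owner_of = {}  # token_id -> owner address

    def mint(self, token_id, to):
        if token_id in self._owner_of:
            raise ValueError("token already exists")
        self._owner_of[token_id] = to

    def owner_of(self, token_id):
        return self._owner_of[token_id]

    def transfer(self, sender, token_id, to):
        # On chain, `sender` would be established by a signature, not trust.
        if self._owner_of.get(token_id) != sender:
            raise PermissionError("only the owner may transfer")
        self._owner_of[token_id] = to


nft = TinyNFT()
nft.mint(1, "0xAlice")
nft.transfer("0xAlice", 1, "0xBob")
print(nft.owner_of(1))  # -> 0xBob
```

Everything interesting about NFTs — provenance, markets, programmability — builds on this small kernel of provable, exclusive assignment.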

The goal of this post is to argue that NFTs are a unique kind of asset. I revisit some ideas many have raised before, but I focus on one that’s always intrigued me: NFT projects have a latent potential. They can lie in wait on the chain, even if they are not noticed for a time. This can give NFTs special powers afforded by their technical details and the ledger on which they reside; this power can be exercised across remote stretches of time.

NFTs ≠ Beanie Babies

Some have remarked on the similarities between NFTs and Beanie Babies. One cannot deny some parallels: a niche interest growing into a mania, wild price swings, and more.

But NFTs are also very much unlike Beanie Babies. They differ in several important ways.

For one, the market for NFTs is global, constant, and instant. Trustless computation on a programmable blockchain yields automatic markets, too. You don’t have to trust your buyer at all, so long as you trust the mechanisms in place for computing the sale. And sales, in most cases, take immediate effect.

Second, some NFTs, especially those on chain, live forever. They do not degrade. They are ever in mint condition, recoverable, and programmable.

For example, the “Witness the Draft” project by Simon de la Rouviere is a chorus of eyes from an on-chain asset that marks its maker’s creative progress. Collectors can interact with the blockchain and open and close these eyes, altering their piece as much as they like forever.

Project “Witness the Draft” is stored on-chain. Etherscan’s token view, owned by the author

And because these on-chain NFTs live on ledger, their impacts are perceivable, measurable, and analyzable. When a collection drops into the blockchain, it reverberates, generating little disturbances to the ledger’s permanent data.

These properties make NFTs qualitatively unlike Beanie Babies. But the above reasons are mostly financial and technical. NFTs, on-chain NFTs especially, have more. They have latent cultural potential. Prominent artist and writer de la Rouviere refers to this as a kind of “fertile ground for emergent culture.”

In this brief post, I’ll use details from Etherscan and more to illustrate what I mean by that heady phrase. Let’s break it down into layers.

de la Rouviere’s project lives on chain; owners can call the contract and alter the asset

Layer 1: The Artifacts

As noted above, so-called “on-chain NFTs” are the best example of this latent potential. You can read introductions to on-chain NFTs here. The significance of this technical feature cannot be overstated. As long as Ethereum and its data persist, these entities do too.

These artifacts can be inspected in all their detail. For example, the ever-evolving On-Chain Checker is a tool that validates and visualizes on-chain NFTs (check out these tools too). Users can even display and tinker with the underlying code of these artifacts.

NFT Avastar #9707 checkable on chain here

The conceptual basis of some NFT projects can be analyzed with metadata too. For example, the early on-chain NFT project ChainFaces by Nate Alex has designated functions on the contract to completely recover details about the assets. So the asset itself, and its metadata details, are all on the chain forever.

Call ChainFaces metadata directly from contract here

These on-chain NFTs are enmeshed in the incentive structure of the entire blockchain. From wallets engaged in basic transfers to advanced DeFi — other stakeholders of the chain help to sustain the infrastructure. So the incentives and actions from others who are not even interested in NFTs help to preserve these assets. Even if forgotten, they lie in wait.

Layer 2: The Community

As long as Ethereum’s transaction data (sometimes called “receipts”) are preserved, much more can be gleaned about these assets. You can summon a contract’s functions to see who else owns tokens on that contract. The result is a mesh of co-collecting: individuals who have, for one reason or another, found themselves represented in the same contract’s memory. Etherscan displays summaries of this on-contract memory in its token holder details.

Who owns Autoglyphs (2019)?

(This would be like if Beanie Babies had by their intrinsic design a button in them that instantaneously revealed with certainty and currency the social network of all co-owners.)

We can visualize this interconnectivity (for fun, you can even reconstruct and visualize it on the chain itself). But owners can be projected on a wider reticulum. You can follow their trail into the ledger. What else do they own? How long have they been active on the chain? When did they go quiet? It poses all sorts of intrigue. And with tools such as Ethereum Name Service and more, it is possible to get a sense of the identity and engagement of these individuals.
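
A hedged sketch of how such a co-collection mesh could be computed from token-holder lists like those Etherscan displays (the collections are real project names, but the holder sets and addresses below are made up for illustration):

```python
from itertools import combinations
from collections import Counter


def co_collection_edges(holders_by_collection):
    """Count, for each pair of wallets, how many collections they share.

    `holders_by_collection` maps a collection name to the set of wallet
    addresses holding it, as one might scrape from a token holders page.
    """
    edges = Counter()
    for holders in holders_by_collection.values():
        for a, b in combinations(sorted(holders), 2):
            edges[(a, b)] += 1
    return edges


# Toy data; addresses are hypothetical.
holders = {
    "Autoglyphs": {"0xA", "0xB", "0xC"},
    "ChainFaces": {"0xB", "0xC"},
}
print(co_collection_edges(holders))
```

Each edge weight is the number of contracts two wallets jointly appear in; plotting wallets as dots and these edges as lines yields a co-collection graph like the one pictured below.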

A collection encodes a memory of its community.

Co-collection graph: dots are wallets, lines are collecting. See here.

Layer 3: Cultural Dynamics

The connectivity among owners does not exhaust the intriguing detail. If we can assume longevity for Ethereum’s transaction data, we can also unpack the rise and fall of communities. We can track a wave of incoming interest, or sudden departures.

These dynamics are obviously economic in nature. Mass acquisitions and sales of NFT projects are often the subject of discussion. But they are more than this. They are cultural dynamics by any reasonable definition of the term: “cultural dynamics research is about meaning over time,” an “investigation of how a culture thus defined is formed, maintained, and transformed over time” (Kashima, 2014).

In my visual series Reflective Recursion, I used data from Etherscan and more to visualize the history of KnownOrigin using KnownOrigin’s data itself. I sought to visualize the rise of cultural trends using the behavior of artists on chain. For example, Infection captures the sudden effect of the pandemic on artist creations, and a new wave of incoming artists during COVID.

“Infection,” lines are artists over time; dots mints; red = virus mention; see here for more details

A single NFT project, especially one stored directly on the blockchain, reveals only a segment of such dynamics: its own section of this history. But we can assess cultural dynamics across many such projects. It may even be possible to assess an NFT project as a sign of its times, an adaptation to a moment and its technical constraints.

In a prior post, I described this “adaptive” quality. For example, the relative cost of transacting constrains NFT projects. Avastars, a fully on-chain project, layers beautiful SVG components into alluring faces — a sign of more affordable gas (though CyberBrokers by Josie and team is a pioneering outlier to this principle, storing everything on chain for about $200,000).

When transaction fees rose with NFT popularity, projects adapted by computing a visual asset rather than storing it explicitly. Recently, using infrastructure like Art Blocks and scripty.sol and more (including on Bitcoin), a modularity is emerging through which NFTs are becoming more complex, coded from existing on-chain components.

Mandalas project generates a GIF on its contract; by whigawag

These are cultures and subcultures, dynamically changing, trackable on chain. An NFT project, even if forgotten, never loses such relevance. That relevance can be rediscovered. It lies in wait.

Conclusion: Cultural Raw Material

NFTs are undoubtedly different from Beanie Babies. The technical, financial and cultural features of NFTs seem to me qualitatively different and much expanded. This is especially true of on-chain assets and their associated data.

On-chain projects like this can be forgotten but never lost (though see here for important caveats). They lie in wait for rediscovery. A fun example, perhaps familiar to many readers, is the lost MoonCats. A very early NFT project, it did not immediately attract much attention. The contract was waiting for new collectors to mint and “rescue” the cats. An account of this episode describes the communal excitement of such rediscovery:

MoonCat Winter & Rediscovery. MoonCat​Rescue was developed and released in 2017 by a pair of Ethereum enthusiasts who wanted to explore the possibilities of the early Ethereum network. The project used an on-chain, proof-of-work mining system to allow people to “locate” and “rescue” MoonCats. … Though MoonCats gained a small and passionate following, interest waned. … MoonCats were rediscovered on March 12, 2021. In a bout of MoonCat mania, all of the remaining MoonCats were rescued in just a few hours. On that day — from the grassroots — the MoonCatCommunity was born!

MoonCat community site with tons of fun detail

Upon rediscovery, a community can cohere around these entities. The project can become a raw material for new cultural dynamics. This gives NFTs a latent potential. Their encoded data can still exert influence across remote stretches of time. Their assets, community and dynamics are cultural in nature, especially when they are stored on the chain itself, inspectable and appreciated years after their creation.


I sometimes own or create projects I mention. I was not paid for this post. I wrote it for fun. Thanks to Etherscan for letting me contribute. You can find me on Twitter with links to projects here:

Latent Cultural Potential was originally published in Etherscan Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Superchain Circuitry

Coinbase’s OnChain Summer is now complete. During August 2023, OnChain Summer celebrated the activation of Coinbase’s Base, a second layer (L2) on Ethereum (and built from Optimism’s OP Stack). Daily events across August included many NFT mints and other on-chain Base activities. Thousands of wallets participated. OnChain Summer was so active that you can see on-chain Base usage spike in an approximate daily cycle, probably timed to when a mint or other activity became available.

Data from adapting this Dune dashboard by @tk-research

Second layers like Base offer users cheaper and faster transactions. The OnChain Summer event was meant to illustrate to users the fun and ease of participating. As unfortunate evidence of these benefits, when Base was opened to transacting, there was some event log spam on its ledger. Some transactions logged 10,000 events and slowed some explorer interfaces when viewing these transactions (see here for an example on BaseScan; caveat: it may freeze some browsers!).

Example: BaseScan view of event log spamming on transaction

Initial hiccups aside, OnChain Summer seemed largely successful. It culminated in the Base Wars NFT project. Base Wars is part of the Finiliar family of NFT projects, in which “animal software” of cute little avatars respond to on-chain data. Thousands of distinct wallets minted a Base Wars NFT.

Base Wars for OnChain Summer

OnChain Summer provokes a broader question about the role of an L2 in Ethereum or any protocol: What are second layers for? One simple view is that an L2 is a kind of simulacrum of the hallowed mainnet. This simulacrum’s “reality” is assured by its relationship to the real thing (mainnet), but it permits the same use cases, now facilitated by cost and speed improvements.

This answer oversimplifies the concept of an L2 and the many relationships it may have with its L1. In a blog post last year, I summarized how users may engage an L2 for a variety of reasons: cheaper trading, faster and less expensive data-heavy on-chain applications, more affordable NFT experimentation and more.¹

This suggests an L2 has a more complex relationship to its L1.² One way to think about it is that L1 and L2 are part of a broader “circuit” in which users find distinct purposes for each. The L1 may be seen as ultimate “finality,” as its ledger is most likely to have the best security guarantees. An L2 greases the wheels on transactions, so users may visit it for distinct reasons, such as gas-heavy NFT participation. In this simplest case, the L2 participates with the L1 for a specific functionality; it does not simply copy a user’s overall engagement.

What are second layers for? Data source.

Example on Base

To quantify these ideas, we can use the APIs for Etherscan and BaseScan. I grabbed transactions from Base blocks 3344550–3344559, yielding 122 unique transactions. I then sampled the Ethereum mainnet (L1) and Base (L2) ledgers for the 87 unique “from” wallets in those transactions. All together, this process returned about 34,000 transactions on mainnet and 110,000 on Base across all 87 addresses (some are high-transacting, such as bridge addresses). I categorized each of these transactions as ERC-721 (“NFT” activity), ERC-20 (“DeFi” activity) or other (mainly other contract calls, regular transfers, etc.).³
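
A minimal sketch of this kind of labeling, assuming per-wallet transaction lists like those returned by Etherscan-style txlist, tokentx and tokennfttx endpoints. The precedence rule here — the NFT label wins when a hash appears in several lists — is my own simplifying assumption, not necessarily the one used for the figures below.

```python
def activity_profile(normal_txs, erc20_txs, erc721_txs):
    """Return (nft, defi, other) transaction counts for one wallet.

    Each argument is a list of dicts with a "hash" key, as returned by
    Etherscan-style txlist, tokentx and tokennfttx endpoints.
    Overlapping hashes are labeled NFT first, then DeFi (assumption).
    """
    nft = {t["hash"] for t in erc721_txs}
    defi = {t["hash"] for t in erc20_txs} - nft
    other = {t["hash"] for t in normal_txs} - nft - defi
    return len(nft), len(defi), len(other)


# Toy records (hashes made up):
normal = [{"hash": "0x1"}, {"hash": "0x2"}, {"hash": "0x3"}]
erc20 = [{"hash": "0x2"}]
erc721 = [{"hash": "0x3"}, {"hash": "0x4"}]
print(activity_profile(normal, erc20, erc721))  # -> (2, 1, 1)
```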

We can use a ternary plot to visualize the concentration of this activity, and how it may differ across Ethereum mainnet (L1) and Base (L2). A ternary plot is basically a triangle with one category at each corner. A point inside that triangle represents how prominent the different categories are. A point at the triangle’s center means the three categories are balanced — in our case, equal DeFi, NFT, and Other transaction counts. But if a wallet’s data point sits at a corner (say, Other), it engages only in Other transactions, with no ERC-20 or ERC-721 activity at all. If a point lies on one of the triangle’s edges, there is some balance between the two categories that edge connects (such as Other and DeFi) and no transactions of the third.
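
For the curious, here is one standard way to map three counts to a point in such a triangle (a sketch for intuition; plotting libraries handle this for you):

```python
import math


def ternary_xy(nft, defi, other):
    """Map three nonnegative counts to a point inside a unit-edge
    triangle with corners NFT=(0, 0), DeFi=(1, 0), Other=(0.5, sqrt(3)/2).
    The point is the weighted average of the corners."""
    total = nft + defi + other
    a, b, c = nft / total, defi / total, other / total
    return b + 0.5 * c, c * math.sqrt(3) / 2


# A wallet with perfectly balanced activity lands at the center.
print(ternary_xy(10, 10, 10))
# A wallet with only NFT activity lands exactly at the NFT corner.
print(ternary_xy(25, 0, 0))  # -> (0.0, 0.0)
```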

This is illustrated below for four addresses that were included in those Base blocks. The green dot shows the distribution of transactions over three types (NFT, DeFi, Other) for mainnet, and the blue for Base. The top row shows that some addresses have similar activity, because the two dots are close together in this space. Those on the bottom show a deviation of Base activity from the way they transact on Ethereum. (For example, the bottom left ternary plot shows a Base transaction distribution of Other, NFT, and DeFi as about 75%, 20% and 5%, respectively.)

Green = Ethereum L1 activity; blue = Base L2 activity

We can also look at addresses’ preference for L1 vs. L2 usage by taking the ratio of their transaction counts on each. This is shown below. We see a wide range: some addresses focus on one or the other, while many balance L1 and L2 usage. The concentration of activity differs widely across addresses.⁴

Focal Base addresses by relative L2/L1 usage (negative y-axis, more relative L2 use)

Superchain Circuitry and Explorers

OnChain Summer and Base offer a recent opportunity to see this wide range of distinct chain usage. Relationships among chains (L1, L2, etc.) and their users will probably be complicated. A fun and philosophical way of thinking about “layers” is that they are part and parcel of one larger circuit — a superchain, as Optimism’s OP Stack calls it. Layers are secured by the mainnet, but the flow of information inevitably includes bridging and transacting in a wide variety of ways. These chains should not be considered “apart” from one another. Instead, they are functionally coupled such that they should be considered one superchain or “meta-chain.”

Optimism OP Stack

This suggests new potential roles for blockchain explorers too. What I’ve shown above is just the simplest such analysis: distinguishing L1 from L2 activity. But in the future, we may be able to analyze and categorize users and applications in more complex ways. These future analyses may reveal the more complex relationships that the “superchain circuitry” enables, perhaps especially when EIP-4844 arrives, with blobs making L2 usage even cheaper.

In a similar way, designers of explorers like Etherscan could think of themselves as mapping one big superchain circuit. New tools might facilitate discovery of how users are engaging L1/L2, and how users could engage them in new ways. Here is some mild speculation.

Teleportation links. Labeling bridge contracts could let a bridge transaction link directly to the associated address on the chain on the other side. This would permit quick juxtaposition of an address’s L1/L2 activity.

Relative activity badges. The simplest possible way to mark a wallet on L1 or L2 is to specify by chain metrics whether it has relatively higher transactions or transaction volume on L1/L2.

Hypothetical relative activity bands (right) show chain usage; click, go to explorer for that address

Value lock-up engagement / time. A similar analysis may be how long bridged value has lasted on the L2. Next to a bridging transaction, a ratio of L2 transaction volume by value bridged could show how much usage is associated with that bridged value.

Usage fingerprinting / cross-chain entropy. A small marker consisting of a ternary plot or miniature bar plot could indicate the pattern of activity of a wallet on a given chain vs. other chains; it could, for example, be represented by a single metric such as “usage entropy” measuring whether the distribution of wallet activity on one chain deviates from mainnet or others.

Hypothetical “badge” showing engagement fingerprint
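
One hedged sketch of such a “usage entropy” metric: the Jensen-Shannon divergence between a wallet’s activity distributions on two chains. This is just one reasonable candidate, and the distributions below are hypothetical.

```python
import math


def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two activity
    distributions, e.g. (NFT, DeFi, Other) shares on two chains.
    0 means identical usage; 1 means completely disjoint usage."""
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)

    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)


mainnet = (0.2, 0.5, 0.3)  # hypothetical wallet: NFT, DeFi, Other shares
base = (0.7, 0.1, 0.2)     # same wallet's hypothetical shares on an L2
print(round(js_divergence(mainnet, base), 3))
```

A single number like this could back a small badge: near 0, the wallet uses the L2 just like mainnet; near 1, its L2 behavior is qualitatively different.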


Many explorers like Etherscan, Nansen and others have features like this, but they are almost always focused on one chain per interface or explorer. In the coming years, these could be adapted to describe this “superchain circuit,” connecting information across explorers and facilitating discovery. The result might very well be a UX that treats mainnet and its second layers as part of one big superchain: users would see one explorer interface, with icons or other markers that specify in which part of that circuit a transaction has settled.


  1. This is what Optimism means by its “superchain” — a development stack that can interconnect and advance new use cases and applications across chains by leveraging their individual strengths and lowering the barriers for them to mutually interact.
  2. In fact, the superchain concept noted above, and the many more chains that may interact, could challenge the “L1/L2” dichotomy altogether. The concept is worth mentioning but is outside the scope of this post.
  3. Obviously this leaves out ERC-1155. I do this for simplicity, though I may include it in a follow up analysis later.
  4. This analysis can’t tell whether some of these origin wallets on mainnet are just hot wallets, with other activity held at another main address. In many chain analyses, such confounds are inevitable, so it’s an important caveat.

About Takens

I was not paid for writing this post. I wrote it for fun. I do creative projects and other work in the crypto space. I am on Twitter, and welcome a hello on my main timeline!


Ethereum’s New Data Economy

Forthcoming mainnet upgrades suggest a future of incentivized data preservation and a shared responsibility to encode the past

Ethereum’s core devs are already approaching another major upgrade to mainnet. This upgrade will center on Ethereum Improvement Proposal #4844 (EIP-4844). They’ve designated a new portmanteau, “Dencun,” to refer to this upgrade (combining “Deneb” and “Cancun,” for updates to the consensus and execution layers, respectively).

EIP-4844 may bring down transaction costs on mainnet, but its focus is on reducing fees for Ethereum’s second layers (see my post here about L2s). To accomplish this, this EIP’s approach is all about data. The EIP will improve the way in which L2s encode data on mainnet. L2s currently devote much of their fees to writing to Ethereum mainnet for validating their ledgers (using transaction calldata). This also increases fees on mainnet. You can see this on Etherscan’s “gas guzzler” list here. 5%–10% of mainnet fees are often related to L2s, such as zkSync and Arbitrum.

Example gas guzzlers during Jul. 2nd, 2023 with zkSync and Arbitrum near the top

EIP-4844 is therefore significant. In this upgrade, users of Ethereum (such as L2s) will be able to encode so-called blobs of data. As part of a new transaction type, these blobs will be cheaper because the data will only persist for 30 days. There will be a second fee market on mainnet for the cost of committing blobs on the Beacon chain (the consensus layer). Blob fees will have a dynamic similar to how EIP-1559 governs supply and demand (see here for a great summary). All this complexity (including fascinating details about the blob data itself) is by design; it is meant to bring Ethereum closer to future scaling upgrades. And L2s can use these cheaper blobs to validate their ledgers.
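
To make that fee dynamic concrete, here is a sketch adapted from the EIP-4844 specification’s pseudocode as drafted at the time of writing; names and constants may change before the upgrade ships.

```python
# Constants from the EIP-4844 draft (subject to change):
MIN_BLOB_GASPRICE = 1
BLOB_GASPRICE_UPDATE_FRACTION = 3338477


def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e**(numerator / denominator),
    computed via the Taylor series of e^x in pure integer arithmetic."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (i * denominator)
        i += 1
    return output // denominator


def blob_base_fee(excess_blob_gas: int) -> int:
    # The fee rises exponentially with "excess" blob gas, echoing
    # EIP-1559's supply/demand adjustment in a separate fee market.
    return fake_exponential(MIN_BLOB_GASPRICE, excess_blob_gas,
                            BLOB_GASPRICE_UPDATE_FRACTION)


print(blob_base_fee(0))  # -> 1 (minimum price when supply meets demand)
```

When blocks carry more blobs than the target, excess blob gas accumulates and the price climbs exponentially; when demand drops, the excess (and the fee) decays back toward the minimum.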

vitalik.eth on Twitter: “Some proposals to add ‘blob-carrying transactions’ in a near-future hard fork, bringing higher scalability to rollups before full sharding is complete.”

But EIP-4844 introduces for the first time a big idea in Ethereum’s future updates: transient data.¹ This upgrade got me thinking about its implications. Other planned protocol changes also share this property of temporary on-chain data. A bird’s-eye view over the planned upgrades reveals that data is an important part of Ethereum’s future. Or, put differently, the absence of data is an important part of that future.

Let’s consider some other examples. I’ll focus on NFTs to illustrate what data temporariness means for the future. Despite concerns with transience, this series of upgrades represents a growing data economy for Ethereum.

Pruning Historical Data: EIP-4444

I’m especially curious about implications for applications that make use of on-chain data. In particular, there is a growing landscape of NFTs that use on-chain data storage. On-chain NFTs store their data on chain because the asset (artwork, PFP, etc.) is purportedly forever — you can always retrieve it on chain.

Hundreds of NFT projects are now fully on chain; see here for an authoritative list

But these upgrades and the temporariness of chain data raise important questions. There are legitimate concerns about how the data will be stored and made available.

Consider another major improvement proposal: EIP-4444. This EIP may be implemented in the coming year or two. The idea of this proposal is pretty simple: Ethereum nodes will no longer be required to hold onto historical records of transactions beyond one year. This will include block headers, calldata, and so on. This can impact applications that make use of historical data, such as market analysis or economic research. It can also impact some NFT projects. For example, some prominent NFT projects store their code or data in calldata. You can see this on Etherscan too. Here’s the C code to generate one of 0xDEAFBEEF’s archetypal projects, Synth Poems. It is in the calldata used for this transaction (its hash is recoverable from contract functions here):

This code would be needed to rebuild the hypnotizing audiovisual experiences of 0xDEAFBEEF’s pieces. EIP-4444 would prompt nodes to delete this calldata because it is from over 2 years ago. (And that means that even if you spun up a node yourself in the future, you’d not have access to this data.)

Still frame from a Synth Poem.

An important distinction here is between memory and storage. Because 0xDEAFBEEF’s code is in calldata, it is at risk in the EIP-4444 upgrade — it is not accessible in the EVM, and calldata is only in memory in the moment of the transaction. So calldata is a historical transaction record, accessible to a full node that syncs the chain (but not in the EVM itself). EIP-4444 would mean this is pruned after a year.
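
A toy model may clarify the distinction, under my reading of the proposal: pruning history drops old calldata, while contract storage, being part of current state, survives EIP-4444.

```python
# Toy node, for intuition only: history (with calldata) can be pruned,
# while contract storage persists as part of current state.

class ToyNode:
    def __init__(self):
        self.history = []   # list of (block_number, calldata)
        self.storage = {}   # contract state: slot -> value

    def add_tx(self, block, calldata, writes=None):
        self.history.append((block, calldata))
        self.storage.update(writes or {})

    def prune_history(self, cutoff_block):
        # Under EIP-4444, blocks older than roughly a year are dropped.
        self.history = [(b, d) for b, d in self.history if b >= cutoff_block]


node = ToyNode()
node.add_tx(100, calldata=b"<artwork source code>")  # calldata only
node.add_tx(101, calldata=b"", writes={"svg_layer_0": "<svg>...</svg>"})
node.prune_history(cutoff_block=200)
print(node.history)  # [] -- the calldata-borne code is gone
print(node.storage)  # the storage-borne art survives
```

In this toy, the code placed in calldata (as with the Synth Poems example) vanishes at pruning, while the SVG layers written to storage (as with Avastars) remain retrievable.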

By contrast, projects that use storage preserve data in their contract, accessible to the EVM. On-chain NFTs store data inside contract storage itself. These are part of Ethereum’s state, and so aren’t at risk by EIP-4444. This storage pattern is exemplified by Avastars and CyberBrokers. These NFT projects have a beautiful and complexly layered set of functions to assemble SVG artwork. These functions use contract storage (see my blog post here for detail). You can see the beautiful layers encoded on Avastars years ago by calling its contract storage on Etherscan.

Rendering Avastar #1 by calling its contract storage
Avastar #1: fully on-chain, in contract storage

Other planned upgrades imply that contract storage is not entirely safe either. It may succumb to a later upgrade of Ethereum that involves state expiry.

Purge of the State

At this point, you may ask why the absence of data is so important to Ethereum’s future. A compelling case is made in a Bankless episode with Vitalik. The interview is somewhat dated, but the content has aged extremely well, and remains a crystal clear discussion of many roadmap features.

At about 40:00 in this interview, Vitalik summarizes the challenges that data will pose for those who wish to participate in Ethereum’s security, such as by running a node. If Ethereum scales under the current data model, it would produce petabytes of data per year. This is far too prohibitive for most participants, because they would be expected to completely sync with this growing blockchain data.

Chain size is already considerable; see Etherscan’s charts

The concern about data also applies to the very state of the Ethereum blockchain itself — to storage, mentioned earlier. The possibility of state expiry is also encouraged by the fact that historical data has a simpler trust model (the past is “easier to prove”). So why not prune the state itself?

An early “proto-EIP” (which you can read here) proposes just this: after a period of time, nodes could prune states too. The effects of this are non-trivial. For example, such states store balances for all ERC-20 contracts. And this would impact all NFT projects. The state also stores URI pointers to every NFT asset, and for on-chain NFTs it is arguably worse: all the metadata and the piece itself are evanescent under state expiry. (That means that, after state expiry, if you spun up a node, states of your projects beyond particular time points may not be accessible either.)

The New Data Economy

The blobs of EIP-4844 are temporary. This bridge between L1 mainnet and L2s lasts for about a month, after which validators on the Beacon chain need not hold onto them. Where will blobs go? Will they be needed, in audits or analysis? In EIP-4444, historical data is pruned after a year, and state expiry will involve some similar timeline for state pruning. A future of “temporary data.”

To observers, this may seem concerning, especially if you’re into projects that make great use of historical data or contract storage (which is, arguably, everything; perhaps most starkly with on-chain NFTs).

But this transient data approach is a necessary one. The chain is getting too heavy. It is the “deadweight of history,” as Vitalik has described it. But this presents new challenges of data preservation, recovery, analysis and so on. And challenges present opportunities. With EIP-4844, we get a new fee market baked into the blob transaction type. EIP-4444 and state expiry present new opportunities for other markets, too. Here are a few ideas.

Centralized services

The obvious choice for maintaining both historical data and state data is centralized services. Vitalik mentions Etherscan and other approaches in his interview, too (including Beaconscan). There is incentive to maintain these data sources because they are monetized as a service. This will become more important for Ethereum beyond the so-called “Purge,” with EIP-4444 and state expiry. Tools like Etherscan are already routinely mentioned as critical infrastructure. In the future era of transient data, their importance will grow.

The Purge, as part of Vitalik’s roadmap diagram

Incentivizing distributed data preservation

Another approach to storing historical and state data is to create a distributed system (akin to IPFS) that is built on top of Ethereum. The Portal Network is aiming to create a peer-to-peer system that permits light clients that distribute the data load so that history is still accessible in a similar way to current APIs. The Graph is a prominent data infrastructure that many are hoping will approximate a fully decentralized preservation system that can be incentivized by participation in governance and paid data usage.

The Graph’s subgraph explorer; delicious mounds of chain data

State maintenance services

These next two present more interesting possibilities and pertain to state expiry. Under state expiry, it is possible to keep a storage slot active on your contract in order to maintain its presence in the chain. One could imagine new contract functionalities that routinely “ping” another contract in order to maintain certain states. A customer could register with a state-maintenance service, which uses an emerging standard to “ping” all contracts created by a given wallet. For a small fee, this could be “loaded” with a subscription that lasts decades into the future (akin to the ENS registry). It could also be decentralized, using a system of contracts, and customers could routinely check to ensure the system is working. If it is not, they could seek another service or set up a scheduled system themselves to call a “maintenance” contract.
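
A purely hypothetical sketch of such a service; every name and the flat per-ping fee model here are illustrative assumptions, not an emerging standard.

```python
PING_FEE = 1  # hypothetical flat cost per "ping" per epoch


class MaintenanceService:
    """Toy state-maintenance subscription: each epoch, 'ping' (touch)
    every registered contract whose prepaid balance covers the fee,
    keeping its storage slots active under state expiry."""

    def __init__(self):
        self.subscriptions = {}  # contract address -> prepaid balance

    def register(self, contract, prepaid):
        self.subscriptions[contract] = prepaid

    def run_epoch(self):
        pinged = []
        for contract, balance in self.subscriptions.items():
            if balance >= PING_FEE:
                self.subscriptions[contract] = balance - PING_FEE
                pinged.append(contract)  # would call the contract on chain
        return pinged


svc = MaintenanceService()
svc.register("0xAvastars", prepaid=2)  # hypothetical address, 2 epochs prepaid
print(svc.run_epoch())  # ['0xAvastars']
print(svc.run_epoch())  # ['0xAvastars']
print(svc.run_epoch())  # [] -- funds exhausted; the state may now expire
```

A decentralized variant would replace the service with a system of contracts, with customers periodically verifying that pings are actually landing.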

State maintenance monetizes the “state tree” more fully. Some may be concerned that it’s an additional fee for users, like the lamentable “Apple peripherals” that can proliferate into higher distributed cost. But the argument against this is that data preservation is expensive, especially if it is in some tension with securing the blockchain. For this reason, data maintenance services let users pay for the privilege of such data preservation, and let validators and other participants focus on consensus and security.

State recovery services

In that Bankless discussion, Vitalik emphasized that history is unlikely to be lost. With the services described above, we could expect multiple, more or less centralized, tools to redundantly preserve historical and state data. But even without these tools, assuming you have information about the storage in your contracts, you can still recover it. State recovery could be a service too. It could provide point-and-click tools and some standards and practices for preserving the history that matters to you. You could then bring that personally held data to a service, upload it, and establish a proof that recovers these states.

There can be fun and fulfillment in recovery, see MoonCats!

In a summary of this state expiry, Vitalik shares a wonderful thought experiment of Alice whose work with a smart contract is one of her passions (see “Epoch 13” section). She travels and has some other events in her life that keep her from the contract for some time. Its storage is pruned from the tree. Vitalik describes how she hunts for witnesses with sufficient information to facilitate recovery of her beloved contract.

Vitalik’s little thought experiment about state expiry


Ethereum has to maintain the security and efficiency of its consensus mechanism amidst what we hope will be a massive increase in future use. This goal is in tension with the wonderful yet plentiful data that the blockchain creates. Forthcoming upgrades will bring a new era of “temporary data,” but they will also introduce new and interesting economic possibilities for the maintenance, recovery and curation of blockchain data.

Here’s the rendering code for the Art Blocks project Symbol 1 by Emily Weil. A beautiful quinean project; the code is the work. That code sits in storage. But in the coming years, it may not. The future data economy may help to preserve and recover it.

Symbol 1 #96, owned by me
Symbol 1 script (in 2 parts) in Art Blocks storage


  1. Dencun will also likely include EIP-1153, which proposes new transient storage opcodes with very interesting computational implications: another transient data ingredient.

Further Materials

  1. Anthony Sassano just discussed EIP-4844 again on a Daily Gwei, including an update to devnet.
  2. Recently updated article on statelessness.
  3. Great recent summary of EIP-4844 by Christine Kim @ Galaxy, including interesting detail about the life of a blob.


I am on Twitter. I spend a lot of my time on creative data visualization projects, including several fully on-chain works like the_coin, one of the first NFT projects that lets owners update contract storage to modify the NFTs (hence an interest in state expiry).

Disclosures: I own and create NFTs and sometimes hold the projects that I mention. For example, I own some Avastars. I love them. I was not paid for this post. I wrote it for fun. I hope it was interesting.

Ethereum’s New Data Economy was originally published in Etherscan Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Beacon Withdrawals and the Inevitable 80/20 Distribution

The Beacon chain, the backbone of Ethereum’s move to proof of stake, went live in late 2020. Interested participants could deposit ETH into the Eth2 deposit contract, enabling them to participate as validators in the Beacon chain, which fused with mainnet in the celebrated Merge in late 2022.

A few weeks ago, the newest hard fork was activated: the Shapella upgrade now allows validators to withdraw their ETH back into regular mainnet circulation. Full withdrawals, through which validators exit the Beacon chain, are significantly rate-limited. Partial withdrawals (such as staking rewards) are also limited but have higher bandwidth: 16 per block. At the time of writing, there have been over 2,500,000 withdrawals from Beacon, amounting to over 2,000,000 ETH.

Since Shapella, there have also been plenty of fresh depositors. Last month saw the highest volume of Beacon deposits so far. So even with millions of withdrawals, staking on Ethereum remains an order of magnitude larger than current or pending withdrawals, at least for now (see a helpful survey of this in Coin Metrics State of the Network #203).

From Etherscan’s deposits dashboard

It was widely discussed that initial deposits into Beacon were highly concentrated, including after The Merge (see prior post here). A few parties or pools control the vast majority of the deposited ETH, raising concerns that the network is insufficiently decentralized: only a few parties could control the majority of block production.

We can measure this concentration with the Gini coefficient, a measure of how unequally a resource is distributed. A Gini of 0 means no concentration at all: a perfectly equal distribution. A Gini of 1 means a single entity holds all the resources. We can visualize this by ranking depositors by their share and plotting the cumulative distribution (from 0% to 100%, adding them up).
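The Gini computation is straightforward to implement. A minimal sketch using the closed-form rank-weighted formula over sorted holdings (the function name is mine; plug in per-depositor ETH totals):

```python
def gini(amounts: list[float]) -> float:
    """Gini coefficient of a distribution: 0 = perfectly equal,
    approaching 1 = a single holder has everything."""
    xs = sorted(amounts)           # ascending, as in a Lorenz curve
    n, total = len(xs), sum(xs)
    # Rank-weighted sum: later (larger) values get higher weights.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n

assert gini([1, 1, 1, 1]) == 0.0                     # equal split
assert abs(gini([0, 0, 0, 100]) - 0.75) < 1e-12      # one whale: (n - 1) / n
```

The Lorenz curve the post plots is just the running sum of `xs` divided by `total`, so the same sorted list feeds both the chart and the coefficient.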

As you can see below, this is not a flat distribution. The first 500 depositors (counting both regular and internal transactions) are responsible for over 80% of Beacon deposits.

Ranking wallets by their deposited ETH; Gini: 0.85

This distribution of depositors has a Gini coefficient greater than 0.85. The concentration is due in part to staking services such as Lido and Coinbase. As Kyle Waters of Coin Metrics and others have noted, these staking services involve many participants, so in theory they represent a more decentralized potential: behind each service are hundreds or thousands of depositors. So the debate is nuanced (including debate about metrics like the Gini itself), and it also involves other aspects of the proof-of-stake consensus framework.

How about withdrawals? Have they been concentrated?

Ranking addresses by withdrawn ETH received; Gini: 0.98

It appears they have been intensely concentrated, perhaps even more so, with a Gini coefficient of about 0.98. This is likely due in part to the exit of Kraken: just one of the withdrawing wallets, responsible for over 600,000 ETH, flows directly into a Kraken address. But it could also be due to the relative rewards accrued by these entities. Smaller validators may have to wait longer to justify a withdrawal; larger validators may exit with a very regular stream of withdrawals as they accrue more rewards. Indeed, wallet 0xB9D79 shows frequent, daily withdrawals. It alone has received withdrawals from Beacon over 1,000,000 times, averaging about 0.25 ETH each. It seems to belong to Lido.

Lido withdrawal manager

Concentration like this is very common across many networks, both in the digital world and in physical and biological systems. The tendency for resources to become concentrated has been argued to be an inevitable fact of reality — the vagaries of uneven distribution, preferential attachment, thermodynamics and more (hypotheses vary). Readers may recognize this as the famous “80/20” principle — 20% of the entities control 80% of the resources, work, etc. It’s a rough heuristic, a rule of thumb. But it expresses this concentration in a familiar way (though in the cases above, it is closer to “95/5”).

For Ethereum, this has long been under discussion. Vitalik addressed the concentration years ago, shortly after Beacon went live. He argued that focusing too much on the Gini coefficient and related measures of unequal distribution may oversimplify our understanding of important underlying relationships in a social or economic system. Such relationships may better express the nature of this “inequality,” its origins, its architectural implications, and its risks. This seems reasonable, but one could argue that (i) other measures of underlying relationships may still yield strong indications of concentration and (ii) there is simply obvious concentration at a glance. In any case, Ethereum is not the sole project subject to such discussion; many, perhaps most, are. Bitcoiners debate mining concentration too.

Bitcoin pool ranking

Designing protocols that decentralize responsibility, to ensure robustness against attack or manipulation, is an uphill battle if this distribution is indeed an inevitability of nature’s principles. But we can try to vary the slope of this concentration to make sure it doesn’t become too skewed.

There are movements afoot to promote this in Ethereum, such as facilitating a broader base of participants in block validation and block building. For example, Flashbots, despite fears of its dominance, has nobly taken on the challenge to open-source its tools and expand participation. Alongside these advances on Ethereum, developments such as EigenLayer allow depositors to restake their ETH into other services, such as to help secure an emerging project or protocol. This expands the potential utility of staked ETH, and could alter incentives around deposits and withdrawals in the future.

I wrote this for fun for Etherscan. I was not paid by anyone. I own various cryptocurrency things, sometimes ones that I mention in my writing. You can follow me on Twitter here.


Complexity of a Stablecoin “Run”

A few weeks ago, three crypto-serving banks faced major liquidity crises amidst a bank run. These banks were shuttered, and some rather scandalous hypotheses swirled around these events. Something else interesting happened on chain, too: this fiasco impacted the peg of both USDC and DAI. Because USDC was significantly banked at the now-defunct Silicon Valley Bank, and DAI was collateralized substantially with USDC, both experienced their own little “run.” As Coin Metrics reports in a great survey of this run, which happened around March 11th, 2023, users seemed to move their USDC and DAI into USDT and BUSD. (They may have perceived those as safer given the potential impact on USDC’s Circle, or they may have been seeking arbitrage, etc.)

Figure from Coin Metrics State of the Network #198

As you can see in the plot from Coin Metrics, this was a temporary effect. Unlike other notorious “stable”coins of recent history, both USDC and DAI returned to peg after a couple of days.

The term “run” conjures a mental image of individual customers rushing to the bank to withdraw their cash, implying a singular desire to escape a particular asset or custodial situation. While this simple description captures an underlying motivation, a run is actually enacted in many different ways: the shared desire among holders yields distinctive, sometimes complex patterns of decisions and actions.

Indeed, with Etherscan data and USDC alone, you can see this complexity on chain. In this brief post, I summarize some underlying on-chain patterns between March 9th and March 12th. The message here is simple: the visualization and analysis suggest that runs have complex on-chain dynamics. They involve a mix of on-chain activity, shifts in value-sent distributions, and more.

Below I illustrate this in two ways: first with direct transfers of USDC across various wallets, and second with individual token swaps between USDC and other assets.

(1) Direct USDC Sends

To explore patterns underneath this run, I extracted transaction data from Etherscan, starting with direct ERC-20 transfers of USDC. These were transactions that included only a single transfer event. Between March 9th and March 12th, I extracted about 200,000 of these.
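The filtering criterion (keep only transactions whose receipt contains exactly one ERC-20 transfer event) can be sketched like this. The record layout is an illustrative assumption, not the actual extraction pipeline:

```python
from collections import defaultdict

# Each transfer event: (tx_hash, sender, receiver, amount).
# A "direct send" is a transaction containing exactly one transfer event.

def direct_sends(transfer_events):
    by_tx = defaultdict(list)
    for ev in transfer_events:
        by_tx[ev[0]].append(ev)          # group events by transaction hash
    return [evs[0] for evs in by_tx.values() if len(evs) == 1]

events = [
    ("0xaaa", "alice", "bob", 100.0),    # plain transfer: kept
    ("0xbbb", "carol", "pool", 50.0),    # first leg of a swap
    ("0xbbb", "pool", "carol", 49.9),    # second leg of the same swap: both dropped
]
assert direct_sends(events) == [("0xaaa", "alice", "bob", 100.0)]
```

Grouping by transaction hash first is what separates plain wallet-to-wallet sends from swaps, mints, and other multi-transfer transactions.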

I built a “stackplot” of these 200,000 transactions. To understand what a stackplot is, consider this key I shared in an introductory article to this visualization:

Rows are addresses; lines are transfers; colors are exchanges/highlighted addresses

When we do this for all addresses across the 200,000 transactions, we get the following full stackplot:

Stackplot of over 200,000 direct USDC transactions

Here, each “stacked” row is an address, and each column (line) is a transfer. As addresses enter the data and transact on USDC’s contract, the number of rows rises quickly between 3/9 and 3/12. You can see a “glow” in the middle of the stackplot, indicating a collective, momentary “run”: USDC being sent across wallets. I highlighted some exchange addresses (Binance 14, Coinbase 10) with color. A rise in exchange wallet activity is visible even in this granular visualization.

You can also see very large transactions. The largest, in the top right quadrant (in white), is a $500,000,000 transaction between two Binance wallets. As Coin Metrics reported, many larger transactions took place during this time, with an unusual number of USDC transfers of $1,000,000 or more. This distribution seems visible in the stackplot above, with larger transfers densely clustered along the steep “sigmoid” rise of the wallet rows.

We can examine how these transactions cluster in blocks. On the x-axis we plot time (in block height), and on the y-axis the number of unique USDC senders and receivers within that block. By plotting each block this way, we can examine the local distribution of on-chain activity. Departures from this distribution represent an important pattern: they could be exchange consolidations taking place in one or a few blocks at a given time.
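Counting unique parties per block, as described, takes only a few lines. The tuple layout of the transfer records is again an assumption for illustration:

```python
from collections import defaultdict

def unique_parties_per_block(transfers):
    """transfers: iterable of (block_number, sender, receiver).
    Returns {block: number of unique addresses touching USDC in that block}."""
    parties = defaultdict(set)
    for block, sender, receiver in transfers:
        parties[block].update((sender, receiver))
    return {block: len(addrs) for block, addrs in parties.items()}

sample = [
    (16815288, "0xdeposit1", "0xbfcd8"),
    (16815288, "0xdeposit2", "0xbfcd8"),  # two sends consolidating into one wallet
    (16815289, "0xalice", "0xbob"),
]
counts = unique_parties_per_block(sample)
assert counts[16815288] == 3   # two senders, one shared receiver
assert counts[16815289] == 2
```

Blocks whose counts sit far above or below the trend are the candidates for consolidation events like the HitBTC and Bittrex cases below.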

The “run” on USDC (direct sends); distribution of wallets in blocks shows consolidated activity

Near the end of this date interval, curiously, HitBTC seems to show some consolidation in wallets that were once funded by its deposit wallet. Bittrex sends a large number of USDC transactions to separate wallets, far exceeding the statistical trend among unique receivers.

We can visualize these trends with network diagrams, illustrated below. Lines are ERC-20 transfers and dots are wallets. USDC flows into a wallet that seems to be associated with HitBTC here. Over a range of about 40 blocks, just 8 minutes, there are hundreds of these consolidations.

Direct USDC transfers from blocks 16815288 to 16815328; 0xbfcd8 = HitBTC affiliated?

(2) USDC Swaps

Let’s consider transactions that involve two ERC-20 transfers, one of which is USDC. These usually involve a swap (from or to USDC). In that date range, I extracted about 80,000 of these (amounting to 160,000 transfers, two per transaction). Interestingly, the distribution of senders and receivers by block seems more orderly in DEX swaps, which may indicate a more collective phenomenon: many individual wallets transacting to move USDC around.
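The swap-detection criterion (exactly two transfer events, at least one in USDC) can be sketched the same way as the direct sends. The token address is a placeholder and the record layout is illustrative:

```python
from collections import defaultdict

USDC = "0xUSDC"  # placeholder for the USDC token contract address

def usdc_swaps(transfer_events):
    """transfer_events: iterable of (tx_hash, token, sender, receiver, amount).
    A candidate swap = a transaction with exactly two ERC-20 transfers,
    at least one of them in USDC."""
    by_tx = defaultdict(list)
    for ev in transfer_events:
        by_tx[ev[0]].append(ev)
    return {
        tx: evs for tx, evs in by_tx.items()
        if len(evs) == 2 and any(ev[1] == USDC for ev in evs)
    }

events = [
    ("0x1", "0xUSDC", "alice", "pool", 100.0),   # USDC into the pool...
    ("0x1", "0xWETH", "pool", "alice", 0.06),    # ...WETH back out: a swap
    ("0x2", "0xUSDC", "bob", "carol", 5.0),      # single transfer: not a swap
]
swaps = usdc_swaps(events)
assert list(swaps) == ["0x1"]
```

The non-USDC leg of each matched pair tells you what holders swapped into, which feeds the USDT and WETH comparisons below.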

However, you can detect significant shifts in the distribution of USDC swaps. As Coin Metrics also reported, users seemed to swap into USDT or (W)ETH, and below you can see the rapid rise of USDT swaps near the run.

One way to summarize the swap patterns is to calculate an entropy score over the distribution of unique tokens in 100-block windows. Entropy is generally interpreted as “disorder,” but here higher entropy reflects a more complex mix of tokens in a period of time. Conversely, if entropy drops, DEX “behavior” with USDC is getting simpler, focusing on a smaller number of tokens. Indeed, entropy drops by about 25% or more at the start of the run. To put it in very playful terms: runs alter the thermodynamic structure of block space.
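Per window, this is the standard Shannon entropy over token frequencies. A minimal sketch; the token labels and window contents are made up for illustration:

```python
import math
from collections import Counter

def token_entropy(tokens: list[str]) -> float:
    """Shannon entropy (in bits) of the token mix within one window.
    A run that piles swaps into fewer tokens lowers this score."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

calm_window = ["USDT", "WETH", "DAI", "WBTC"]    # varied mix: maximal entropy
run_window  = ["USDT", "USDT", "USDT", "WETH"]   # swaps concentrate into USDT

assert token_entropy(calm_window) == 2.0
assert token_entropy(run_window) < 2.0
```

Sliding this over 100-block windows of the non-USDC legs of each swap yields the entropy series described in the text.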

Finally, USDT and WETH showed different patterns too. On March 11th, the distribution of swaps involving these tokens was distinct, with USDT having the higher average swap value but WETH having a handful of extremely large swaps (owing to an MEV bot).


I focused on USDC and peeled back hundreds of thousands of transactions to take a quick glimpse into the composition of a run.

The view of a “run” as being a collective clamor among individual wallets is partly right. There seems to be lots of that.

But there is also significant and easily identifiable heterogeneity of activity: (i) individual users send into exchanges, (ii) whales become more active, (iii) blocks exhibit statistical trends of coordinated consolidation, (iv) CEX and DEX behavior are distinct, and (v) swap distributions shift. In a helpful thread about these events, Delphi Digital referred to this as an “on-chain frenzy.” It had impacts on issuance too: Kyle Waters of Coin Metrics reported thousands of burnt ETH around this time, taking the cumulative burn to over 3 million ETH since The Merge!

This post dipped a bit more into this underlying complexity to share some other on-chain details. Understanding these details may be helpful for responding to future runs of this sort.

I’m a creator and writer and such and you can follow me on Twitter. I was not paid for this post. I sometimes own the assets I mention.
