
Corporate Crypto Accounting: Engineering Mark-to-Market Infrastructure

FASB's fair-value mandate has forced engineering teams to build real-time on-chain reporting infrastructure for institutional ETH. Here is what that actually looks like in practice.


TL;DR:

  • FASB ASC 350-60, effective for fiscal years beginning after December 15, 2024, now requires most qualifying digital assets to be carried at fair value through earnings, ending the era of cost-basis accounting for corporate ETH holdings
  • Companies holding significant ETH positions face quarterly reporting obligations that require real-time price data infrastructure, not spreadsheets or manual API calls
  • On-chain mark-to-market reporting requires at minimum three distinct engineering layers: a price feed aggregation layer, an on-chain state indexer, and a reconciliation engine
  • ETH's proof-of-stake mechanics introduce a second accounting dimension beyond spot price, requiring separate treatment of staking rewards under current FASB guidance
  • The tokenized equities market crossed $963 million in January 2026, a 2,878% year-over-year increase, signaling that on-chain financial infrastructure is moving from experiment to production requirement
  • Grayscale's 2026 Digital Asset Outlook identifies this year as the dawn of the institutional era, and the engineering teams building reporting infrastructure are the ones making that era operationally real
  • AI-assisted development environments are compressing the time to build compliant reporting pipelines from months to weeks

The result: On-chain mark-to-market reporting for institutional ETH is not a future problem; it is a present engineering requirement that most teams are underprepared to solve.

The Accounting Shift That Changed Everything

The Financial Accounting Standards Board's update to ASC 350-60, which took effect for fiscal years beginning after December 15, 2024, fundamentally changed how public companies must account for digital assets on their balance sheets. Before this update, companies holding ETH or Bitcoin were required to use an indefinite-lived intangible asset model, which meant they could only write down the value of their holdings when impairment occurred but could never write them back up when prices recovered. This created a deeply asymmetric accounting treatment that made corporate crypto holdings look worse on paper than they actually were, and it discouraged many CFOs from even considering digital assets as a serious treasury strategy.

The new standard requires companies to measure most qualifying digital assets at fair value, with changes in fair value recognized in net income each reporting period. This means that a company holding 10,000 ETH at a cost basis of $2,000 per token, with a current market price of $3,500, must now recognize that $15 million unrealized gain in its income statement. The flip side is equally true: a price decline flows directly through earnings, creating volatility that boards and audit committees need to understand and plan for. The accounting treatment is now closer to how trading securities are handled under ASC 320, and it demands the same quality of price data infrastructure that equity trading desks have built over decades.
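The remeasurement described above is simple arithmetic, but it is worth pinning down precisely because every downstream journal entry depends on it. A minimal sketch, using the example figures from the text (function and field names are illustrative, not a real accounting engine):

```python
def fair_value_adjustment(quantity: float, cost_basis: float,
                          market_price: float) -> dict:
    """Compute the unrealized gain/(loss) that flows through earnings
    under the fair-value model. Illustrative only."""
    carrying_value = quantity * cost_basis
    fair_value = quantity * market_price
    return {
        "carrying_value": carrying_value,
        "fair_value": fair_value,
        "adjustment": fair_value - carrying_value,  # positive = unrealized gain
    }

# The example from the text: 10,000 ETH at a $2,000 cost basis, marked at $3,500
entry = fair_value_adjustment(10_000, 2_000.0, 3_500.0)  # adjustment: $15M gain
```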

What makes this particularly interesting from an engineering perspective is that the standard does not simply require a daily closing price. It requires fair value measurement consistent with ASC 820, which means companies need to use observable market inputs, document their valuation methodology, and be prepared to defend their price sources to auditors. For ETH specifically, this means building a price feed infrastructure that can pull from multiple exchanges, apply appropriate weighting, handle gaps in trading data, and produce an auditable record of every valuation used in a financial statement. That is not a problem you solve with a spreadsheet or a manual API call once a quarter.

Why ETH Creates Unique Engineering Complexity

Bitcoin and Ethereum are often discussed in the same breath when it comes to institutional treasury strategy, but from an accounting and engineering perspective they are meaningfully different assets. Bitcoin is relatively straightforward: it is a non-yielding asset with a single price dimension. ETH, by contrast, has been a proof-of-stake asset since the Merge in September 2022, which means that any company holding ETH in a validator or through a liquid staking protocol is also earning staking rewards. Those rewards are not the same as unrealized price appreciation. They represent new tokens being received, and under current FASB guidance they need to be recognized as income at fair value on the date they are received.

This creates a two-dimensional accounting problem. The first dimension is the mark-to-market valuation of the existing ETH position, which changes continuously with market price. The second dimension is the ongoing recognition of staking income, which accrues in small increments with each epoch, roughly every 6.4 minutes on the Ethereum mainnet. A company running its own validators might receive staking rewards across dozens of validator keys, each with slightly different reward timing depending on attestation performance and block proposal luck. Aggregating that data into a clean accounting record requires an indexing infrastructure that can track validator state, reward events, and token receipt timestamps at a granularity that most traditional financial reporting systems were never designed to handle.

There is also the question of liquid staking tokens. A company that holds stETH through Lido, or rETH through Rocket Pool, does not receive discrete reward events in the same way a direct validator does. Instead, the exchange rate between the liquid staking token and ETH changes continuously as rewards accrue. Accounting for this requires tracking the exchange rate at the beginning and end of each reporting period, calculating the implied reward income, and then separately marking the entire position to market. The interaction between these two calculations requires careful engineering to avoid double-counting or misclassification, and it is one of the areas where teams that have not thought through the data model upfront tend to run into serious problems at audit time.
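The decomposition for a liquid staking position can be sketched as follows, assuming the exchange rate and USD price are sampled at the period boundaries. The split shown (rewards priced at period end, market movement applied to the opening ETH-equivalent balance) is one defensible policy choice, not the only one, and the names are hypothetical:

```python
def lst_period_split(tokens_held: float,
                     rate_start: float, rate_end: float,
                     price_start_usd: float, price_end_usd: float) -> dict:
    """Split a liquid-staking position's period change into (a) implied
    staking income from exchange-rate accrual and (b) market gain/loss
    on the ETH-equivalent position. Illustrative decomposition only."""
    eth_start = tokens_held * rate_start
    eth_end = tokens_held * rate_end
    # Reward income: new ETH-equivalent accrued, priced at period end
    # (a policy choice; daily accrual pricing is also defensible).
    reward_eth = eth_end - eth_start
    staking_income_usd = reward_eth * price_end_usd
    # Market movement on the opening ETH-equivalent balance.
    market_gain_usd = eth_start * (price_end_usd - price_start_usd)
    return {
        "reward_eth": reward_eth,
        "staking_income_usd": staking_income_usd,
        "market_gain_usd": market_gain_usd,
        "total_change_usd": eth_end * price_end_usd - eth_start * price_start_usd,
    }

# 1,000 stETH-like tokens, rate drifting 1.05 -> 1.06 over the quarter
r = lst_period_split(1_000.0, rate_start=1.05, rate_end=1.06,
                     price_start_usd=3_000.0, price_end_usd=3_200.0)
```

Note that the two components sum exactly to the total period change, which is the invariant the reconciliation engine should assert to guard against double-counting.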

Designing the Price Feed Layer

The foundation of any on-chain mark-to-market reporting system is the price feed layer. This is the component responsible for producing a defensible, auditable ETH/USD price at any given point in time. In practice, this means aggregating price data from multiple sources, applying a methodology that satisfies ASC 820's requirement for observable market inputs, and storing the results in a way that can be retrieved and verified months or years later during an audit.

Most production implementations pull from a combination of centralized exchange APIs, including Coinbase, Kraken, and Binance, and decentralized oracle networks like Chainlink or Pyth Network. Chainlink's ETH/USD price feed aggregates data from multiple premium data providers and updates on-chain with a heartbeat of roughly one hour and a deviation threshold of 0.5%, meaning a new price is pushed whenever the price moves more than half a percent from the last recorded value. Pyth Network operates on a pull-based model with sub-second latency, making it more suitable for applications that need intraday precision. For quarterly financial reporting, the choice between these systems matters less than the documentation of the methodology and the consistency of its application across periods.
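One simple aggregation methodology is a cross-source median with a deviation flag, sketched below. This borrows the deviation-threshold idea from push-based oracles but is an illustrative methodology of our own, not the Chainlink aggregation algorithm; source names and the threshold are assumptions:

```python
from statistics import median

def aggregate_price(quotes: dict[str, float],
                    max_deviation: float = 0.005) -> dict:
    """Combine spot quotes from several sources into a single reference
    price (the median), flagging sources that deviate from it by more
    than max_deviation (0.5% here, an illustrative choice)."""
    ref = median(quotes.values())
    outliers = {src: px for src, px in quotes.items()
                if abs(px - ref) / ref > max_deviation}
    return {"reference": ref, "outliers": outliers}

agg = aggregate_price({"coinbase": 3501.2, "kraken": 3499.8, "binance": 3530.0})
# binance sits ~0.8% above the median and gets flagged for review
```

Flagged sources would feed an alerting pipeline rather than being silently dropped, so the methodology document can describe exactly how divergence is handled.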

A robust price feed layer for institutional reporting typically implements an averaged price over a defined window, most often the volume-weighted average price (VWAP) across the last hour of trading on the last business day of the reporting period. This approach smooths out short-term volatility and is more defensible to auditors than a single point-in-time snapshot, particularly in periods of elevated market volatility where a closing price at 11:59 PM UTC might differ meaningfully from the price at 11:00 PM.

The storage architecture for price feed data deserves as much attention as the collection methodology. Audit requests often come 12 to 18 months after the reporting period in question, and the ability to reconstruct exactly which price was used for a specific valuation date is not optional. Most teams implement an append-only price log in a combination of a relational database for fast querying and a content-addressed storage system like IPFS or Arweave for long-term immutability. The hash of each price record can then be stored on-chain as a lightweight attestation, giving auditors a cryptographic proof that the price data has not been altered since it was recorded. This pattern adds minimal cost but dramatically reduces the friction of audit responses.
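The append-only property can be made verifiable by chaining each record's hash to the previous one, with the chain head anchored on-chain as the lightweight attestation described above. A sketch with illustrative field names:

```python
import hashlib
import json

def append_price_record(log: list[dict], record: dict) -> dict:
    """Append a price record to an append-only log, chaining each
    entry's SHA-256 hash to the previous entry. The latest hash can be
    anchored on-chain periodically. Field names are illustrative."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {**record, "prev_hash": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    entry = {**body, "hash": digest}
    log.append(entry)
    return entry

log: list[dict] = []
append_price_record(log, {"ts": "2026-03-31T23:00:00Z", "pair": "ETH/USD",
                          "price": 3500.0, "source": "vwap-1h"})
append_price_record(log, {"ts": "2026-06-30T23:00:00Z", "pair": "ETH/USD",
                          "price": 3720.0, "source": "vwap-1h"})
```

Any retroactive edit to an earlier record breaks every subsequent hash in the chain, which is exactly the tamper-evidence an auditor wants to see.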

Building the On-Chain State Indexer

The price feed layer tells you what ETH is worth at a given moment. The on-chain state indexer tells you how much ETH the company actually holds, across which addresses, and in what form. For a company with a simple treasury setup, this might mean tracking a single cold storage address. For a company with a more sophisticated strategy that includes validator nodes, liquid staking positions, DeFi protocol deposits, and multi-signature custody arrangements, the indexer becomes a genuinely complex piece of infrastructure.

The most common approach is to build on top of an existing indexing protocol rather than running a full archive node in-house. The Graph Protocol allows teams to define subgraphs that index specific contract events and entity states, making it possible to query the current and historical ETH balance of any address or set of addresses with a simple GraphQL call. For validator tracking, the Ethereum Beacon Chain exposes a REST API that returns validator balances, status, and reward history, and most production implementations poll this API on a per-epoch basis and store the results locally. The combination of an EVM state indexer for on-chain holdings and a Beacon Chain indexer for validator positions covers the majority of institutional ETH exposure patterns.
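The Beacon Chain side of the indexer reduces to parsing validator records out of API responses and storing them per epoch. The sketch below extracts balances from a payload in the shape returned by `GET /eth/v1/beacon/states/{state_id}/validators`; the sample payload is abbreviated and illustrative, as real responses carry many more fields:

```python
def validator_balances_gwei(api_response: dict) -> dict[str, int]:
    """Extract per-validator balances from a Beacon Chain API response
    shaped like GET /eth/v1/beacon/states/{state_id}/validators.
    Balances are reported in gwei (1 ETH = 1e9 gwei)."""
    return {v["index"]: int(v["balance"]) for v in api_response["data"]}

# Abbreviated sample payload (illustrative values)
sample = {
    "data": [
        {"index": "1001", "balance": "32012345678", "status": "active_ongoing"},
        {"index": "1002", "balance": "32000987654", "status": "active_ongoing"},
    ]
}
balances = validator_balances_gwei(sample)
```

In production this function would sit behind the per-epoch polling loop, with each snapshot written to the local store keyed by epoch number.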

One area that catches teams off guard is the treatment of ETH held in smart contracts that the company controls but does not directly own in the traditional sense. A multi-signature Safe wallet, for example, holds ETH in a contract address rather than an externally owned account. The accounting treatment is the same, but the indexing logic needs to correctly attribute that balance to the company rather than treating the contract as an independent entity. Similarly, ETH deposited into a DeFi protocol like Aave or Compound is represented as a receipt token, aToken or cToken respectively, and the indexer needs to resolve the current redemption value of those tokens back to an ETH equivalent before the price feed layer can apply the USD conversion. Getting this resolution logic right is one of the more technically demanding parts of the system, and it is where most first-pass implementations introduce subtle errors that only surface during reconciliation.
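The resolution step can be sketched as a per-position dispatch, simplified here to three cases: native ETH (including Safe-held balances), Aave-style rebasing aTokens treated as 1:1 with the underlying, and Compound-style cTokens resolved through an exchange rate scaled by 1e18. Token kinds and rates below are assumptions for illustration:

```python
def resolve_eth_equivalent(position: dict) -> float:
    """Resolve a holding back to an ETH-equivalent amount before USD
    conversion. Simplified rules: aTokens rebase 1:1 with the
    underlying; cTokens use a Compound-style exchange rate scaled by
    1e18. Illustrative only."""
    kind = position["kind"]
    if kind == "native":   # plain ETH in an EOA or a Safe contract
        return position["amount"]
    if kind == "atoken":   # Aave: rebasing receipt token, 1:1
        return position["amount"]
    if kind == "ctoken":   # Compound: amount * exchangeRate / 1e18
        return position["amount"] * position["exchange_rate"] / 1e18
    raise ValueError(f"unknown position kind: {kind}")

eth_total = sum(resolve_eth_equivalent(p) for p in [
    {"kind": "native", "amount": 100.0},
    {"kind": "atoken", "amount": 50.0},
    # 0.02 ETH per cToken, scaled by 1e18
    {"kind": "ctoken", "amount": 2_500.0, "exchange_rate": 2 * 10**16},
])
```

Centralizing this dispatch in one audited function, rather than scattering per-protocol conversions across the codebase, is what keeps the reconciliation discrepancies mentioned above from creeping in.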

The Reconciliation Engine

The reconciliation engine is the component that sits between the price feed layer and the on-chain state indexer on one side, and the general ledger on the other. Its job is to take the raw data from both upstream systems, apply the accounting rules, and produce journal entries that can be imported into the company's ERP system, whether that is NetSuite, SAP, or Oracle Financials. This sounds straightforward in principle, but the gap between on-chain data and GAAP-compliant journal entries is wider than most engineering teams expect when they first approach the problem.

The core challenge is that on-chain state is continuous and the general ledger is periodic. ETH prices change every block, roughly every 12 seconds on post-Merge Ethereum, but the accounting system only needs entries at specific points in time: the end of each reporting period, the date of any acquisition or disposal, and the date of any staking reward receipt. The reconciliation engine needs to know which of these events occurred during the period, pull the appropriate price for each event, calculate the gain or loss or income amount, and format the result as a properly coded journal entry. For a company that is actively accumulating ETH through dollar-cost averaging, this might mean processing hundreds of acquisition events per quarter, each with its own cost basis and fair value calculation.
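The output side of the engine is a stream of balanced journal entries. A minimal sketch of the period-end remeasurement entry pair, using placeholder account names rather than any real chart of accounts:

```python
from dataclasses import dataclass

@dataclass
class JournalEntry:
    date: str
    account: str
    debit: float = 0.0
    credit: float = 0.0
    memo: str = ""

def period_end_remeasurement(quantity: float, carrying_price: float,
                             period_end_price: float,
                             date: str) -> list[JournalEntry]:
    """Produce balanced fair-value remeasurement entries for an ETH
    position at period end. Account names are illustrative placeholders."""
    delta = quantity * (period_end_price - carrying_price)
    if delta >= 0:
        return [JournalEntry(date, "Digital assets - ETH",
                             debit=delta, memo="FV remeasurement"),
                JournalEntry(date, "Unrealized gain on digital assets",
                             credit=delta, memo="FV remeasurement")]
    return [JournalEntry(date, "Unrealized loss on digital assets",
                         debit=-delta, memo="FV remeasurement"),
            JournalEntry(date, "Digital assets - ETH",
                         credit=-delta, memo="FV remeasurement")]

entries = period_end_remeasurement(10_000, 3_200.0, 3_500.0, "2026-03-31")
```

The invariant worth asserting on every batch is that total debits equal total credits; an ERP import will reject anything else, and catching the imbalance upstream is far cheaper.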

The reconciliation engine also needs to handle the cost basis tracking that underlies gain and loss calculations on disposals. Under GAAP, when a company sells a portion of its ETH holdings, it needs to identify which specific units were sold and what their original cost was. The most common approach is specific identification, which requires the company to maintain a lot-level record of every ETH acquisition, including the date, quantity, and price paid. When a disposal occurs, the company selects which lots to apply against the sale, typically choosing the lots that minimize taxable gain or maximize a desired accounting outcome. The reconciliation engine needs to maintain this lot ledger, apply the selected identification method consistently, and produce the correct cost of goods sold entry alongside the fair value adjustment.
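Lot selection under specific identification can be sketched as follows, here using a highest-in-first-out (HIFO) ordering as one example of a gain-minimizing selection; the lot fields are a minimal subset of what a real ledger tracks, and whichever method is elected must be applied consistently:

```python
def select_lots_hifo(lots: list[dict], quantity: float) -> list[dict]:
    """Specific-identification sketch: apply highest-cost lots first
    (HIFO) against a disposal to minimize realized gain. Lots carry
    {'id', 'qty', 'cost'} with cost in USD per ETH. Illustrative only."""
    remaining = quantity
    applied = []
    for lot in sorted(lots, key=lambda l: l["cost"], reverse=True):
        if remaining <= 0:
            break
        take = min(lot["qty"], remaining)
        applied.append({"id": lot["id"], "qty": take, "cost": lot["cost"]})
        remaining -= take
    if remaining > 1e-12:
        raise ValueError("disposal exceeds available lots")
    return applied

lots = [{"id": "L1", "qty": 100.0, "cost": 1_800.0},
        {"id": "L2", "qty": 100.0, "cost": 2_600.0},
        {"id": "L3", "qty": 100.0, "cost": 2_200.0}]
picked = select_lots_hifo(lots, 150.0)          # consumes L2 fully, half of L3
realized_cost = sum(l["qty"] * l["cost"] for l in picked)
```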

Staking Rewards: The Accounting Edge Case Nobody Plans For

Staking rewards deserve their own section because they represent the accounting edge case that most teams underestimate until they are sitting across from an auditor who wants to understand why the staking income line in the income statement does not reconcile to the validator reward data. The core issue is that staking rewards on Ethereum are not paid in a single transaction at a predictable interval. They accrue continuously at the protocol level and are credited to the validator's withdrawal address in batches, with the timing and amount of each batch depending on the validator's performance and the network's reward rate.

Under current FASB guidance, staking rewards are recognized as income when the company has the right to receive them and the amount can be reasonably estimated. For validators that have enabled partial withdrawals, this means recognizing income as rewards accumulate in the validator's balance above the 32 ETH effective balance threshold. For validators that have not enabled withdrawals, the recognition timing is less clear and requires a documented accounting policy that is applied consistently. The fair value used for recognition is the ETH/USD price at the time of receipt, which means the reconciliation engine needs to match each reward event to a price from the price feed layer with timestamp precision.
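The timestamp-matching step can be sketched as a lookup of the most recent price tick at or before each reward event. This is a simplified illustration: a production engine would also enforce a maximum staleness on the matched tick, which we omit here:

```python
import bisect

def price_reward_events(events: list[tuple[int, float]],
                        price_ticks: list[tuple[int, float]]) -> list[dict]:
    """Price each staking-reward event at the most recent price tick at
    or before its receipt timestamp. events are (unix_ts, eth_amount);
    price_ticks are (unix_ts, usd_price), sorted ascending."""
    tick_ts = [t for t, _ in price_ticks]
    out = []
    for ts, eth in events:
        i = bisect.bisect_right(tick_ts, ts) - 1
        if i < 0:
            raise ValueError(f"no price tick at or before event ts={ts}")
        out.append({"ts": ts, "eth": eth, "price": price_ticks[i][1],
                    "income_usd": eth * price_ticks[i][1]})
    return out

priced = price_reward_events(
    events=[(1_000_060, 0.01), (1_000_500, 0.02)],
    price_ticks=[(1_000_000, 3_500.0), (1_000_400, 3_520.0)],
)
```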

The volume of these events at institutional scale is not trivial. A company running 100 validators, which represents a 3,200 ETH stake at current minimum requirements, might receive partial withdrawal credits dozens of times per day across its validator set. Over a 90-day quarter, that could mean thousands of individual reward events that each need to be priced, recorded, and aggregated into the staking income line. Automating this process reliably requires the Beacon Chain indexer to be running continuously, not just at period end, and it requires the reconciliation engine to process events in near real time rather than in a batch at quarter close. Teams that try to reconstruct this data retroactively at the end of a quarter typically find that the effort required is disproportionate to the result, and they end up with estimates rather than precise figures.

Audit Trail Architecture and Immutability

Every component of the reporting infrastructure described so far produces data that will eventually be reviewed by an external auditor. The audit trail architecture is the set of design decisions that determine how easy or difficult that review process will be. Getting this right from the start is significantly cheaper than retrofitting it after the first audit cycle reveals gaps.

The most important principle is that every data transformation should be logged with enough context to reproduce it independently. When the reconciliation engine calculates a fair value adjustment for a specific lot of ETH, the audit log should record the lot identifier, the quantity, the cost basis, the price feed value used, the timestamp of that price, the source of the price, and the resulting journal entry. This level of detail allows an auditor to pick any line item in the financial statements and trace it back through the system to the raw on-chain data in a matter of minutes rather than days. Companies that have built this traceability into their systems report dramatically shorter audit cycles and fewer audit adjustments.

The immutability layer adds a second dimension of assurance. By periodically hashing the state of the audit log and anchoring that hash to a public blockchain, the company creates a cryptographic proof that the log has not been altered after the fact. This is particularly valuable in the context of digital asset accounting, where auditors and regulators are still developing their understanding of the technology and may be skeptical of data that exists only in a company-controlled database. An on-chain attestation of the audit log hash, timestamped by the blockchain itself, provides a level of tamper-evidence that no traditional database can match. The cost of anchoring a hash to Ethereum mainnet is negligible relative to the assurance value it provides.
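One common way to reduce the whole log to a single anchorable value is a Merkle root over the serialized entries, which also enables inclusion proofs for individual records. The sketch below omits domain separation and other second-preimage hardening that a production design should add:

```python
import hashlib
import json

def merkle_root(entries: list[dict]) -> str:
    """Compute a SHA-256 Merkle root over serialized audit-log entries.
    Anchoring this one value on-chain gives tamper-evidence for the
    entire log. Sketch only: no domain separation or proof generation."""
    def h(b: bytes) -> bytes:
        return hashlib.sha256(b).digest()
    level = [h(json.dumps(e, sort_keys=True).encode()) for e in entries]
    if not level:
        return h(b"").hex()
    while len(level) > 1:
        if len(level) % 2:              # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0].hex()

root = merkle_root([{"entry": 1, "adj": 15_000_000.0},
                    {"entry": 2, "adj": -2_400_000.0}])
```

Writing this 32-byte root to an Ethereum contract once per day, or once per close cycle, is the negligible-cost attestation described above.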

The Regulatory Landscape Entering 2026

The engineering requirements described in this article do not exist in a vacuum. They are a direct response to a regulatory environment that has shifted more in the past 18 months than in the previous decade. The SEC's approval of spot Bitcoin ETFs in January 2024, followed by spot Ether ETFs in May 2024, marked the point at which digital assets became unambiguously part of the regulated financial system. Grayscale's 2026 Digital Asset Outlook characterizes this year as the dawn of the institutional era, and the infrastructure being built by engineering teams right now is what makes that characterization operationally meaningful rather than aspirational.

The FASB fair-value standard is the most direct driver of reporting infrastructure investment, but it is not the only one. The SEC's Staff Accounting Bulletin 121, which required banks and broker-dealers to record digital assets held in custody as liabilities on their balance sheets, created significant friction for institutional custody arrangements and is currently under review. The potential reversal or modification of SAB 121 would open the door for major custodians to offer institutional-grade ETH custody at scale, which would in turn increase the volume of corporate ETH holdings that need to be reported under the new fair-value standard. Engineering teams building reporting infrastructure today should design for this growth rather than for the current scale of their holdings.

International regulatory developments are also relevant for companies with cross-border operations. The International Accounting Standards Board has been working on its own digital asset accounting standard, and while the IASB's approach differs from FASB's in some respects, the general direction toward fair-value measurement is consistent. Companies that operate under both GAAP and IFRS need reporting infrastructure that can apply different accounting rules to the same underlying on-chain data, which adds a configuration layer to the reconciliation engine but does not fundamentally change the data architecture. Building this flexibility in from the start is considerably easier than adding it later.

What Production Scale Actually Looks Like

The theoretical architecture described in earlier sections becomes considerably more concrete when you look at what companies with significant ETH holdings are actually running. Bitmine Immersion Technologies, for example, has disclosed ETH holdings of 4.474 million tokens as part of a total crypto and cash position of $9.9 billion. At that scale, a 1% daily price move in ETH represents tens of millions of dollars in fair value change that needs to flow through the income statement. The reporting infrastructure supporting a position of that size needs to be reliable, auditable, and capable of producing results on a timeline that supports quarterly close processes.

At production scale, the price feed layer typically runs as a dedicated microservice with its own redundancy and failover logic. The on-chain state indexer runs continuously and writes to a time-series database optimized for range queries, with a separate read replica for reporting workloads. The reconciliation engine runs as a scheduled job at period end but also processes events in near real time to keep the lot ledger current. The entire system is monitored with alerting on data freshness, price feed divergence between sources, and reconciliation discrepancies above a defined threshold. This is not a side project maintained by the accounting team. It is a production engineering system that requires the same operational discipline as any other critical business infrastructure.

The teams building these systems are also discovering that the boundary between accounting infrastructure and trading infrastructure is blurrier than expected. The same price feed data that drives fair value reporting is also useful for treasury management decisions. The same on-chain state indexer that tracks holdings for accounting purposes can also power real-time portfolio dashboards for the CFO and treasury team. Building these systems with a clean API layer that serves multiple consumers, rather than as a tightly coupled accounting-only tool, tends to produce better outcomes and better return on the engineering investment.

AI-Assisted Development and the Speed of Compliance

Building the infrastructure described in this article from scratch is a substantial engineering undertaking. A complete implementation covering price feed aggregation, on-chain state indexing, staking reward tracking, reconciliation, and audit trail management represents several months of work for a senior engineering team that already understands both the blockchain data layer and the accounting requirements. Most companies entering this space do not have that combination of expertise in-house, and the learning curve for engineers who are strong on one side but not the other is steep.

This is where AI-assisted development environments are beginning to make a measurable difference. The ability to generate boilerplate for Ethereum RPC integrations, scaffold a subgraph definition for a specific set of contract events, or produce a first-pass reconciliation algorithm from a natural language description of the accounting rules compresses the early phases of development significantly. Engineers who might spend two weeks reading documentation and writing initial integrations can now get to a working prototype in two or three days, leaving more time for the harder problems of correctness, edge case handling, and audit trail design.

The gains are not just in initial development speed. AI-assisted code review can catch common mistakes in on-chain data handling, such as incorrect handling of integer overflow in token amount calculations, missing null checks on API responses from the Beacon Chain, or off-by-one errors in epoch boundary calculations. These are the kinds of bugs that are easy to miss in code review and expensive to discover during an audit. Having a development environment that surfaces them during the writing process, rather than after deployment, changes the economics of building compliant financial infrastructure in a meaningful way.

Building This With Cheetah AI

The infrastructure described throughout this article sits at the intersection of blockchain engineering, financial systems design, and regulatory compliance. It is one of the more technically demanding categories of software being built in the Web3 space right now, and the teams doing it well are the ones who can move quickly without sacrificing correctness. That combination is exactly what Cheetah AI is designed to support.

Cheetah AI is a crypto-native IDE built for engineers working at the intersection of on-chain systems and production software. It understands the context of Ethereum data structures, Solidity contract interfaces, and the specific patterns that show up in financial reporting pipelines, from Chainlink oracle integrations to Beacon Chain API clients to ERC-20 balance reconciliation logic. If your team is building or extending mark-to-market reporting infrastructure for institutional ETH holdings, Cheetah AI can help you move from architecture to working code faster, with fewer of the subtle errors that tend to surface at the worst possible time. The institutional era for digital assets is here, and the engineering work that makes it real deserves tooling that was built for it.


If you are earlier in the process and still working through the architecture, Cheetah AI is a useful thinking partner for that phase too. The questions that come up when designing this kind of system (how to handle lot identification across a multi-address treasury, how to model the exchange-rate accrual of liquid staking tokens, how to structure the audit log schema for maximum queryability) are exactly the kinds of problems where having a development environment that understands the Web3 context saves hours of searching through documentation and forum threads. The goal is not to replace the engineering judgment your team brings to these decisions, but to make sure that judgment is applied to the hard problems rather than the routine ones.
