Is AI Turning Traditional Procurement Cycles into a Liability?
Why the utility meter 15-year depreciation assumption is breaking down as edge AI redefines what meters are for; with implications for other industrial IoT use cases
02/27/2026
Key Highlights
- Utilities operating on traditional 15-to-20-year meter depreciation schedules may still be running systems that predate the edge AI era; leaving that hardware in place could lock in devices incapable of on-device intelligence for the next decade.
- For example, AMI 1.0 deployments built on RF mesh networks are already reaching functional end-of-life at 10-12 years, roughly half the expected lifespan, as communication standards evolve faster than capital recovery timelines.
- The edge AI for smart grid market, valued at $15.2 billion in 2024, is projected to grow at a 24.7% CAGR through 2034, signaling that the compute economics utilities need are arriving faster than their procurement cycles planned for.
- Our analysis suggests utilities that defer edge-capable meter upgrades will not save capital but will instead create a stranded-asset scenario, arriving at their next replacement cycle facing significantly more capable and more expensive systems with no interim data infrastructure to show for it.
- There is a broader lesson here for industrial IoT operators: procurement cadences designed for pre-AI business cases are poorly matched to infrastructure that is rapidly being redefined as AI data collection and edge control points.
The News
A convergence of two forces is reshaping the economics of utility infrastructure procurement in ways that most capital planning models have not yet absorbed. On one side, tens of millions of first-generation advanced metering infrastructure (AMI 1.0) smart meters, deployed between roughly 2010 and 2020 on the assumption of 15-to-20-year useful lives, are approaching functional obsolescence faster than their depreciation schedules anticipated, with RF mesh-based systems showing end-of-life characteristics at just 10-12 years according to IntelMarket. On the other side, a new generation of edge AI-capable metering hardware, exemplified by platforms like Utilidata's Karman system built on NVIDIA's Jetson architecture, is entering the market architected to function not merely as a billing endpoint but as a distributed grid intelligence node. The strategic collision between these two timelines, the slow cadence of utility capital recovery and the accelerating pace of edge compute capability, is creating a window that many utilities are at risk of misreading as a routine refresh decision. This note examines why that misreading may prove costly, and what lessons it holds for industrial IoT operators beyond the utility sector.
Analyst Take
Opening Perspective
From my perspective on both the buy- and sell-side, I have watched the utility sector (among others) navigate infrastructure decisions for a long time. The pattern is familiar: build conservatively, depreciate slowly, recover capital through the rate base, and repeat the cycle when the battery dies or the radio stops working. Technology advances were rarely part of the equation. A clerk looks up the vendor you bought the meter from in 2006, sees what they sell today, makes sure it's backwards compatible, and buys the cheapest option available. Rinse and repeat. That model works fine as long as the meter's job is essentially the same as that 2006 piece of kit's: count kilowatt-hours and transmit a reading once a month. But here, as with everything, the job is changing. Fast. If procurement can't keep pace - and it rarely could even pre-AI - the impact will be visible and could be catastrophic.
A quick contrarian observation I want to make: the industry conversation around AMI 2.0 is almost entirely framed as an upgrade decision, a question of whether to patch legacy hardware or replace it. That is the wrong frame. The real question is whether a utility is building billing infrastructure or grid intelligence infrastructure. Those are not the same thing, and they do not belong on the same procurement timeline.
The Strategic Reality: What Has Changed
The hardware constraints of AMI 1.0 are not a minor footnote. They are a fundamental architectural ceiling. Research published in Nature Communications in late 2024 documented that current-generation smart meters operate with as little as 192 kilobytes of static random-access memory, hardware designed to do a preset list of tasks as cheaply as possible. On-device AI inference requires orders of magnitude more headroom. Software enhancements, firmware updates, and edge platform overlays cannot overcome that constraint. In AMI 1.0, the silicon is the limit.
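To put the 192-kilobyte figure in context, a back-of-envelope sketch helps; the model sizes below are invented for illustration, not drawn from any specific product, and assume int8 quantization (one byte per parameter) with activations and firmware ignored entirely:

```python
# Rough comparison of AMI 1.0 SRAM (192 KB, per the Nature Communications
# figure) against illustrative int8-quantized model parameter counts.
SRAM_KB = 192

# Hypothetical model sizes, chosen only to show the scale gap.
models = {
    "tiny keyword/anomaly micro-model (~50k params)": 50_000,
    "small load-disaggregation CNN (~1M params)": 1_000_000,
    "compact transformer (~10M params)": 10_000_000,
}

for name, params in models.items():
    kb = params / 1024  # 1 byte per int8 weight
    verdict = "fits" if kb < SRAM_KB else "does not fit"
    print(f"{name}: ~{kb:,.0f} KB vs {SRAM_KB} KB SRAM ({verdict})")
```

Even this generous accounting, which charges nothing for activations, buffers, or the metering firmware itself, shows that anything beyond the tiniest models exceeds the budget by an order of magnitude or more.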
Meanwhile, the grid these meters are connected to is being asked to manage an entirely different level of complexity. Distributed energy resources (DERs), including rooftop solar, residential batteries, and EV chargers, are proliferating at a pace that drives local, real-time decision-making. A central control architecture communicating instructions to field devices cannot respond at the speed or granularity that a grid with millions of bidirectional power flows requires. The meter that was architected to count consumption now needs to be capable of sensing voltage fluctuations, detecting anomalous load curves, and coordinating localized responses to DER behavior, often before the central operations center knows there is an event to respond to.
The numbers behind this pressure are not abstract. U.S. electricity demand is forecast to surge between 35% and 50% between 2024 and 2040 according to industry projections, driven in large part by EV adoption and electrification of residential and commercial loads. That demand growth is not arriving at the substation in a clean, predictable curve. It is arriving as millions of behind-the-meter devices making independent charging, storage, and export decisions on timescales measured in seconds. A virtual power plant or distributed energy resource management system (DERMS) can attempt to coordinate those devices through command-and-control signaling from a central operations center, but the communication latency and computational overhead of that architecture become liabilities precisely when grid conditions are most dynamic. Edge AI on the meter is not a feature enhancement. It is the architectural requirement for managing a grid that no longer moves in one direction at predictable rates. Utilities that are still deploying metering hardware incapable of local inference are building a coordination gap into their grid, one truck roll and one firmware limitation at a time.
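The cumulative figure is worth converting to an annual rate, because it shows why the pressure comes from device behavior rather than aggregate growth; a simple compounding calculation over the 2024-2040 horizon:

```python
# Implied annual growth from the 35-50% cumulative demand forecast (2024-2040).
years = 2040 - 2024  # 16-year horizon

for cumulative in (0.35, 0.50):
    annual = (1 + cumulative) ** (1 / years) - 1
    print(f"{cumulative:.0%} cumulative -> {annual:.2%} per year")
```

The implied annual rate is only about 1.9% to 2.6%, which sounds manageable at the system level; the operational challenge is the second-by-second volatility of millions of independent devices underneath that smooth curve.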
The platform emerging to address this is architecturally distinct from its predecessor. Systems like the Utilidata Karman platform, which has attracted $126.5 million in venture funding with NVIDIA and Quanta Services among the backers, are designed to run full operating systems with managed application frameworks at the meter endpoint. The goal is to move the intelligence to where the data originates, rather than transmitting raw interval reads to centralized systems for processing. That is a different device category. It belongs on a different procurement timeline.
Market Analysis
The financial signals are pointing in one direction. According to Deloitte's analysis of investor-owned utility capital planning, electric and gas utility capex across 47 major utilities is projected to reach $212 billion in 2025, a 22% year-over-year increase, with cumulative spending expected to surpass $1 trillion over the 2025 to 2029 period. The money is there. The question is whether it is being allocated against the right strategic frame, or whether a clerk is simply cracking open the vendor list from the early 2000s.
The edge AI for smart grid market reached an estimated $15.2 billion in 2024 and is projected to grow at a 24.7% CAGR through 2034, according to market analysis published by market.us. That trajectory does not fit a niche technology segment. It fits compute economics arriving at utility scale faster than almost all capital recovery models anticipated.
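For scale, compounding the reported 2024 base at the reported CAGR gives an implied 2034 market size; this is simple arithmetic on the published figures, not an independent forecast:

```python
# Implied 2034 market size from the reported base and CAGR.
base_billion = 15.2   # 2024 market size, $B
cagr = 0.247          # reported 24.7% CAGR through 2034

value_2034 = base_billion * (1 + cagr) ** (2034 - 2024)
print(f"Implied 2034 market size: ~${value_2034:.0f}B")
```

That is roughly a ninefold expansion in a decade, which is the scale gap utility capital plans written against flat or modest meter-market growth have to absorb.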
The regulatory dimension of this shift is underappreciated and, we would argue, is where the real acceleration will originate. EY's AMI 2.0 analysis, published in early 2025, notes that utility regulatory commissions are already applying increased scrutiny to AMI business cases, specifically because traditional cost-benefit models built on billing accuracy and truck-roll reduction no longer capture what next-generation metering infrastructure is actually designed to deliver. The traditional 20-year depreciation schedule was written for a device with a stable, narrow function. That functional definition no longer holds. If regulators begin to accept that an edge-AI-capable metering endpoint belongs in a different asset class than a billing meter, with shorter depreciation timelines and a broader benefits framework that includes DER coordination, grid resilience, and deferred capital investment in distribution infrastructure, the capital math for accelerated deployment changes significantly. We fully expect this regulatory evolution to lag the technology by two to four years, which is precisely why utilities that wait for regulatory clarity before acting will find themselves making the investment decision at peak market pricing rather than ahead of it.
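A straight-line depreciation sketch shows why the asset-class question matters financially; the per-meter cost below is invented for illustration, and zero salvage value is assumed:

```python
# Straight-line depreciation: remaining (stranded) book value when a meter
# is replaced mid-life. Illustrative numbers only.

def remaining_book_value(cost, life_years, age_years):
    """Remaining book value under straight-line depreciation, zero salvage."""
    return max(cost * (1 - age_years / life_years), 0.0)

meter_cost = 120  # assumed installed cost per meter, $
age = 10          # replaced at year 10, the RF-mesh end-of-life point

for life in (20, 12):
    stranded = remaining_book_value(meter_cost, life, age)
    print(f"{life}-year schedule: ${stranded:.0f} undepreciated at year {age}")
```

Under a 20-year schedule, half the asset's book value is stranded at a year-10 replacement; under a 12-year schedule matched to observed functional life, only a sixth is. That difference is the financial substance of reclassifying edge-capable meters into a different asset class.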
This is where the procurement cycle trap closes. A utility that defers edge-capable meter deployment today, reasoning that its existing AMI 1.0 fleet has remaining depreciable life, will arrive at its next natural replacement window, perhaps in 2028 or 2032. At that point, thanks to rapid AI enablement, the deferring utility will face replacement hardware that is significantly more capable, but also significantly more expensive. Also, in the intervening years, the deferring utility will have no experience deploying intelligence at the edge, and will have gathered no meaningful edge AI data in the interim. The grid's DER management problem will be meaningfully worse by then. The expectation from regulators, customers, and grid operators will be meaningfully higher. And the utility will be starting from zero on edge intelligence infrastructure precisely when its competitors and peers (and vendors) have three to five years of operational learning built in.
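The deferral trap can be made concrete with a toy calculation; every number below is invented for illustration, and the real inputs (hardware pricing trajectories, the value of interim edge data) are utility-specific:

```python
# Toy model of the deferral trap: hypothetical numbers throughout.
deploy_now_unit_cost = 150   # assumed $ per edge-capable meter today
cost_inflation = 0.08        # assumed annual price growth for capable units
interim_value_per_year = 12  # assumed $/meter/year of edge-data and DER value
defer_years = 6              # wait until a ~2032 replacement window

deferred_unit_cost = deploy_now_unit_cost * (1 + cost_inflation) ** defer_years
forgone_value = interim_value_per_year * defer_years
premium = (deferred_unit_cost - deploy_now_unit_cost) + forgone_value

print(f"Deferred unit cost: ${deferred_unit_cost:.0f}")
print(f"Forgone interim value: ${forgone_value} per meter")
print(f"Effective deferral premium: ${premium:.0f} per meter")
```

Under these assumptions the deferring utility pays meaningfully more per unit and forfeits years of operational value; the structure of the result, though not the specific figures, is the stranded-opportunity argument above.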
McKinsey's research on AI-driven procurement transformation is relevant context here. Their analysis notes that companies with advanced operating models in procurement enjoy measurably superior financial outcomes, and that agentic AI systems are beginning to reshape how infrastructure decisions are evaluated, shifting from transactional cost minimization toward strategic value creation. Utilities that apply that lens to their AMI refresh cycle, evaluating the decision not as a billing hardware replacement but as an edge AI infrastructure build, will reach a different conclusion about optimal timing. Those that act may attract substantial investor and regulator interest in 2026; those that wait may find regulators in 2032 asking, 'what have you been doing all this time?'
The industrial IoT parallel deserves more than a footnote, because the utility sector is the canary in the coal mine. Manufacturing floor sensors, oil and gas pipeline monitoring endpoints, and water treatment instrumentation all operate on procurement assumptions that share the same structural flaw: they were sized for a data-collection-and-report function and are now being asked to support an AI-inference-and-respond function. The hardware ceiling is identical. The capital recovery logic is identical. The difference is timing. Utilities are being forced to confront this reckoning first because the grid's DER complexity is making the gap operationally acute on a compressed timeline. In oil and gas, the pipeline monitoring gap may not become acute for another three to five years. In water treatment, perhaps longer. But the underlying dynamic is the same: the edge device that was designed to sense and transmit is being redefined as a compute node, and no amount of software investment can overcome the absence of the silicon to run it. Industrial CIOs outside the utility sector would do well to treat the AMI 2.0 debate as a preview, not a curiosity. The procurement decisions being made in utility capital planning departments today are the decisions their own organizations will face within the next planning horizon.
The hardware cannot evolve through software alone. The procurement cycle needs to absorb that reality early, not only when replacement time arrives under decades-old policies.
Looking Ahead
The key trend we will be monitoring is whether forward-leaning utilities engage regulators to begin revising the depreciation frameworks underpinning AMI capital recovery timelines. The traditional 20-year schedule was designed for hardware with a stable functional definition, and smart meters no longer have a stable functional definition. They are AI endpoints now, which means the transition from billing endpoints to grid control nodes is accelerating faster than any rate case anticipated. If regulators begin accepting shorter depreciation periods for edge-capable metering infrastructure, the financial barrier to accelerated deployment decreases significantly, and the stranded-asset risk of continuing to defer shifts decisively toward inaction. We will also be tracking how hyperscalers and edge compute platform vendors position themselves relative to the utility procurement cycle. AWS, Google, and NVIDIA each have a financial interest in establishing the edge intelligence architecture at the meter before utilities lock in next-generation contracts with incumbent meter vendors. The race for the grid edge is not only a utility strategy question. It is becoming a platform land grab.
Stephen Sopko | Analyst-in-Residence – Semiconductors & Deep Tech
Stephen Sopko is an Analyst-in-Residence specializing in semiconductors and the deep technologies powering today’s innovation ecosystem. With decades of executive experience spanning Fortune 100, government, and startups, he provides actionable insights by connecting market trends and cutting-edge technologies to business outcomes.
Stephen’s expertise in analyzing the entire buyer’s journey, from technology acquisition to implementation, was refined during his tenure as co-founder and COO of Palisade Compliance, where he helped Fortune 500 clients optimize technology investments. His ability to identify opportunities at the intersection of semiconductors, emerging technologies, and enterprise needs makes him a sought-after advisor to stakeholders navigating complex decisions.