Research Notes

MinIO Embeds Delta Sharing into Object Storage: A Sign of Lakehouse and Storage Convergence?

This approach allows platforms such as Databricks to analyze enterprise data directly where it already resides.

03/04/2026

Key Highlights

  • MinIO announced AIStor Table Sharing, embedding the Delta Sharing protocol directly into the AIStor object storage platform.
  • The capability allows analytics environments such as Databricks to access on-premises datasets without replication.
  • By integrating sharing and governance into the storage layer, MinIO positions object storage as a more active participant in hybrid lakehouse architectures.
  • The development reflects broader industry dynamics as storage systems and lakehouse platforms evolve in response to AI-driven data demands.

The News

MinIO announced AIStor Table Sharing, a capability that embeds the Delta Sharing protocol directly into the AIStor object storage platform. The feature allows enterprises to securely share on-premises datasets with analytics environments such as Databricks without replicating the underlying data. The capability is generally available as part of the AIStor platform. For more information, see MinIO's press release.

Analyst Take

Storage platforms and data lakehouses are becoming more tightly connected as organizations extend analytics and AI across hybrid environments. Both layers are responding to the same pressures: the scale of enterprise data, the economics of infrastructure, and the operational complexity of distributed analytics. As a result, the boundary between the lakehouse and the storage layer is becoming less distinct as vendors push data access and governance capabilities closer to the storage substrate. This shift is occurring while many organizations are still adapting their data platforms for AI workloads. HyperFRAME Research’s State of the Enterprise AI Stack Lens found that only 14% of enterprises currently classify their data architecture as fully AI-ready.

MinIO develops high-performance, S3-compatible object storage widely used in large-scale analytics and AI data environments. Its announcement reflects this broader trend. In many hybrid lakehouse deployments today, Delta Sharing exists either as a capability governed within the analytics platform itself or as a separately deployed service tier connected to underlying storage. In both cases, storage holds the data while governance, sharing controls, and lifecycle management remain in a layer above it. By embedding Delta Sharing directly into the object store, MinIO proposes a different arrangement in which elements of sharing and governance move closer to the storage substrate.

Open table formats such as Apache Iceberg and Delta Lake are central to this evolution. They provide the structural layer that allows analytics engines, AI pipelines, and governance systems to operate consistently across distributed storage environments. MinIO's AIStor platform treats object storage as a repository for these tables and also as a participant in their lifecycle and sharing. Supporting both formats reflects a pragmatic recognition that enterprise environments will remain heterogeneous, with multiple analytics engines and frameworks operating against shared datasets. When open table formats combine with embedded sharing protocols, object storage begins to assume responsibilities that historically sat within the lakehouse control plane.

Today, multiple architectural approaches exist for enabling cross-platform data sharing in lakehouse environments. In many deployments, sharing is governed within the analytics platform itself, with systems such as Databricks managing Delta Sharing endpoints and coordinating access to data stored in object storage. This model centralizes governance and simplifies the analytics experience but concentrates operational responsibility in the compute layer. A second approach uses a separately deployed Delta Sharing server connected to storage. While flexible, this model introduces an additional service to deploy, secure, patch, monitor, and scale.
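To make the second model concrete: the open-source Delta Sharing reference server is typically configured with a YAML file that maps shares, schemas, and tables to locations in the underlying object store. The sketch below uses hypothetical share, table, and bucket names; none of these values come from the announcement, and the exact fields may vary by server version:

```yaml
# Sketch of a standalone Delta Sharing server configuration
# (delta-sharing-server.yaml); all names and paths are illustrative.
version: 1
shares:
- name: "sales_share"
  schemas:
  - name: "default"
    tables:
    - name: "orders"
      # Location of the Delta table in the underlying object store
      location: "s3a://analytics-bucket/tables/orders"
host: "localhost"
port: 8080
endpoint: "/delta-sharing"
authorization:
  bearerToken: "<token-issued-to-recipients>"
```

Under MinIO's embedded model, this server disappears as a separately operated component: the object store itself exposes the sharing endpoint for the tables it already holds.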

MinIO’s approach represents a third model in which the sharing capability is embedded directly into the object storage layer. The potential benefit is architectural simplification and reduced operational sprawl. The trade-off is that more responsibility moves into the storage platform itself, which may shift how organizations think about governance authority within the broader lakehouse architecture.

Efficiency is another important dimension. Hybrid AI environments increasingly generate high-value data in locations where moving it is impractical, costly, or restricted by regulation. In these situations, copying data into cloud analytics environments can introduce additional cost and operational friction. An embedded sharing model allows analytics platforms to access live datasets where they reside, extending compute toward the data rather than forcing data to move toward compute.

It is also important to note that the capability is new. While MinIO indicates that development was driven by customer demand, most organizations are still evaluating the feature rather than employing it broadly in production environments. For now, the announcement is best understood as an architectural marker about how lakehouse platforms and storage systems may continue to evolve together.

What Was Announced

MinIO introduced AIStor Table Sharing, a capability that embeds the Delta Sharing protocol directly into the AIStor object storage platform. Delta Sharing is an open protocol designed to enable secure data sharing across platforms, clouds, and organizations. In many deployments today, this capability is implemented through a standalone Delta Sharing server that sits between storage and analytics environments. MinIO’s approach integrates the sharing endpoint directly within the storage system, allowing datasets stored in AIStor to be shared and governed from the same platform where the data resides.
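From a consumer's perspective, access looks the same whether the endpoint is a standalone server or embedded in the storage layer: the recipient receives a small profile file and addresses tables through it. The following is a minimal sketch using the open-source `delta-sharing` Python client; the endpoint URL, token, share, schema, and table names are hypothetical placeholders, not values from MinIO's announcement:

```python
import json

# Hypothetical Delta Sharing profile. A real profile, including the
# endpoint URL and bearer token, is issued by the data provider.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://aistor.example.com/delta-sharing/",
    "bearerToken": "<recipient-token>",
}
with open("aistor.share", "w") as f:
    json.dump(profile, f)

# The protocol addresses a table as <profile-file>#<share>.<schema>.<table>.
table_url = "aistor.share#sales_share.default.orders"

# With the client installed (pip install delta-sharing), the table can be
# read in place, without copying data out of the object store:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(table_url)
```

Spark-based consumers such as Databricks read the same table address through the `deltaSharing` data source format rather than the pandas loader, but the addressing scheme and credentials are identical.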

AIStor Table Sharing builds on MinIO’s broader architecture, which combines S3-compatible object storage with integrated table services supporting open formats such as Apache Iceberg and Delta Lake. Through this integration, enterprises can manage both structured and unstructured data within the same storage environment while exposing selected datasets to analytics engines through standards-based sharing protocols. By embedding Delta Sharing within the storage layer, MinIO eliminates the need to deploy and manage separate sharing infrastructure and consolidates aspects of table lifecycle and share management within the object store itself.

The capability is designed for hybrid environments where large volumes of enterprise data remain on-premises. Analytics platforms such as Databricks can access datasets stored in AIStor using the Delta Sharing protocol while the underlying data remains in place.

AIStor Table Sharing is available as part of the MinIO AIStor platform and was developed in response to customer demand for hybrid architectures that allow analytics platforms to work with enterprise data where it resides while maintaining governance and lifecycle management within the storage system.

Looking Ahead

The significance of this announcement lies in the direction it suggests for enterprise data architectures. For much of the past decade, storage platforms and analytics systems evolved along relatively separate tracks, with storage acting primarily as a scalable repository while analytics platforms handled governance, sharing, and data management functions. As AI workloads expand and lakehouse architectures mature, that separation is becoming less rigid.

Organizations increasingly expect their data infrastructure to support hybrid operation, distributed analytics, and near-real-time access to operational datasets. These expectations place new demands on both storage systems and lakehouse platforms, encouraging architectures that reduce complexity and allow data to be analyzed where it resides.

In our view, MinIO’s decision to embed Delta Sharing within the storage layer illustrates one possible direction in this evolution. The architecture positions storage closer to the operational center of the data platform. If similar approaches emerge across the ecosystem, the boundary between storage systems and lakehouse platforms may continue to blur as organizations pursue simpler and more efficient hybrid data architectures.

Author Information

Don Gentile | Analyst-in-Residence - Storage & Data Resiliency

Don Gentile brings three decades of experience turning complex enterprise technologies into clear, differentiated narratives that drive competitive relevance and market leadership. He has helped shape iconic infrastructure platforms including IBM z16 and z17 mainframes, HPE ProLiant servers, and HPE GreenLake — guiding strategies that connect technology innovation with customer needs and fast-moving market dynamics. 

His current focus spans flash storage, storage area networking, hyperconverged infrastructure (HCI), software-defined storage (SDS), hybrid cloud storage, Ceph/open source, cyber resiliency, and emerging models for integrating AI workloads across storage and compute. By applying deep knowledge of infrastructure technologies with proven skills in positioning, content strategy, and thought leadership, Don helps vendors sharpen their story, differentiate their offerings, and achieve stronger competitive standing across business, media, and technical audiences.

Author Information

Stephanie Walter | Practice Leader - AI Stack

Stephanie Walter is a results-driven technology executive and analyst in residence with over 20 years leading innovation in Cloud, SaaS, Middleware, Data, and AI. She has guided product life cycles from concept to go-to-market in both senior roles at IBM and fractional executive capacities, blending engineering expertise with business strategy and market insights. From software engineering and architecture to executive product management, Stephanie has driven large-scale transformations, developed technical talent, and solved complex challenges across startup, growth-stage, and enterprise environments.