SAN MATEO, Calif.--(BUSINESS WIRE)--The enterprise AI adoption wave is driving forward momentum for Arcion, the only real-time, in-memory change data capture (CDC) solution. Arcion uses log-based change data capture to replicate enterprise databases in real time into cloud lakehouses and data warehouses, powering the next generation of AI applications.
News highlights include:
- New connectors to extract data in real time from cloud warehouses.
- Rapidly growing enterprise adoption of Arcion to serve real-time AI applications.
- New partnerships with Confluent and Redis to further democratize data mobility.
- New funding from Hewlett Packard Enterprise to fuel further development.
“The resulting forward momentum is incredible,” said Arcion CEO Gary Hagmueller. “These latest strides further position Arcion as the unrivaled leader in real-time, large-scale CDC data pipeline technology for the enterprise AI era and beyond, setting up the company for unprecedented growth and development well into the foreseeable future.”
New Product Capabilities
Arcion continues to innovate in its journey to power the next generation of AI and analytics applications with real-time enterprise data from online transaction processing (OLTP) databases such as Oracle, SQL Server, DB2, Sybase, MySQL, and Postgres. Arcion’s new connectors for Oracle and MySQL deliver 10x more throughput by fully utilizing the underlying scalable architecture.
Enterprises have started embracing Snowflake to store and manage operational data, and Arcion is proud to announce a first-of-its-kind native connector for Snowflake. Using Arcion’s new Snowflake connector, enterprises can seamlessly replicate data stored in Snowflake to any platform. This is especially exciting to firms that are beginning to use such data to feed the vector databases and large language models (LLMs) that power their customized AI initiatives. Arcion also added online schema change support for all its OLTP connectors, ensuring no pipeline downtime if database schemas are changed.
To deliver a real-time pipeline experience for big data, Arcion has made significant improvements to its Snowflake and Databricks target connectors. The enhancements take data pipeline performance to the next level, allowing customers to achieve extreme per-table throughput. Arcion users report that it now processes more than 30,000 operations per second with room to grow, with single-digit minutes of latency during peak database load.
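For readers unfamiliar with the pattern, log-based CDC reads change events from a database's transaction log and replays them against a target store, rather than repeatedly querying the source. The sketch below is a simplified, hypothetical illustration of that replay step only; it is not Arcion's implementation, and the event shape shown is an assumption for the example.

```python
# Hypothetical illustration of the log-based CDC replay pattern:
# change events (insert/update/delete) captured from a source
# database's transaction log are applied in order to a target,
# keeping it in sync without querying the source tables.

def apply_change(target: dict, event: dict) -> None:
    """Apply a single CDC event to the target keyed by primary key."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        target[key] = event["row"]
    elif op == "delete":
        target.pop(key, None)

# Replaying a small change log against an initially empty replica.
change_log = [
    {"op": "insert", "key": 1, "row": {"name": "Ada"}},
    {"op": "update", "key": 1, "row": {"name": "Ada Lovelace"}},
    {"op": "insert", "key": 2, "row": {"name": "Grace"}},
    {"op": "delete", "key": 2},
]
replica: dict = {}
for event in change_log:
    apply_change(replica, event)
print(replica)  # {1: {'name': 'Ada Lovelace'}}
```

Because events are replayed in log order, the replica converges on the source's current state: the update overwrites the insert for key 1, and the delete removes key 2 entirely.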
Expanded Customer Base
As a result of its impressive performance and industry-changing innovation, Arcion is expanding its customer base beyond its projections. For new customer Chegg, the efficiency and reliability Arcion offers for ingesting data from transactional sources into the company’s lakehouse is proving vital for its analytics, data science, and AI initiatives.
“Arcion’s in-memory log-based CDC combines low-latency transactional data with click-stream data, greatly benefiting our business and data customers,” said Chegg Director of Engineering Gopinath Rangappa and Senior Manager of Data Engineering Madhuri Prathikantam. “Arcion now plays an important part in our next-gen data platform to ingest near real-time relational data into the Databricks lakehouse and build streaming data analytics pipelines.”
Arcion customer Addepar has more than 100 billion rows of data in relational databases, with hundreds of millions of row changes daily. The company conducted a proof of concept with traditional binary log file replication technologies, but data ingestion was complicated to configure and row deduplication consumed excessive query time, preventing the company from powering a live lakehouse.
“Arcion’s solution integrated directly with Databricks’ Unity Catalog, had single-digit minutes of latency during peak database load, and was able to process more than 30,000 operations per second with room to grow,” said Addepar Senior Software Engineer Alex Solder. “Arcion was an excellent collaborator during the proof-of-concept process, introducing several new features for our specific use cases with only a few days’ turnaround.”
New Strategic Partnerships
Arcion’s clear ability to unlock valuable AI results is delivering substantial benefits across the ecosystem, and the company is building an ever-increasing number of strategic partnerships, with two major announcements this month. Arcion recently announced its participation in Connect with Confluent, providing enterprises with real-time data streaming through a single integration. This partnership now enables Confluent customers to accelerate growth and consumption while supercharging their go-to-market strategies.
“We’re very excited to see Arcion launching its embedded integrations with both Confluent Platform and Confluent Cloud,” said Confluent Global Head of Technology Alliances Rob Taylor. “With this partnership, Confluent users will have a simple solution for writing changes in real time from a variety of databases to the world’s most trusted data streaming platform available across clouds, on premises, and everywhere in between.”
In addition, Arcion is proud to be the technology partner for the Redis Data Integration tool.
“Redis Data Integration (RDI) helps users synchronize existing databases with Redis Enterprise, helping them accelerate and scale their data access without coding or integration projects,” said Redis Senior Principal Product Manager Yaron Parasol. “With Arcion, RDI can extend its source coverage and provide a performant change data capture streaming pipeline that captures the changes and transforms the data to the best model for the application in near real time.”
“Arcion has solved the problem of enterprise data availability in lakehouses to power next generation AI applications,” said Arcion Chief Technology Architect Rajkumar Sen. “Our usage in critical applications is skyrocketing, and I am so thrilled to be partnering with Confluent Inc. and Redis Labs.”
New Funding
Arcion’s innovation and development have been accelerated by a strong capital base, which has now grown thanks to additional funding from Hewlett Packard Pathfinder, the venture capital program of Hewlett Packard Enterprise (HPE). Pathfinder’s increased investment helps Arcion continue progress toward its vision of real-time data mobility.
“We see demand continuing to grow for real-time analytics workloads as enterprises seek ways to maximize their data across on-prem and hybrid platforms,” said Ali Wasti, Managing Director, Hewlett Packard Pathfinder. “HPE chose to invest and partner with Arcion to complement the data fabric and unified analytics capabilities within HPE Ezmeral Software with Arcion’s real-time change data replication solution, allowing customers critical access to the freshest possible data.”
“It’s truly heartening to see our vision of data mobility take shape. Our customers continue to expand their use of Arcion and are finding new and innovative ways to drive ever more value from their real-time data,” added Hagmueller. “As witnessed by the significant product advances we’ve released in the last few months, Arcion is committed to helping customers exceed their data project objectives.”
About Arcion
Arcion is the only modern and scalable way to replicate enterprise databases using change data capture (CDC), making them available in real time in lakehouses and cloud warehouses to power the next generation of AI applications. Enterprises use Arcion’s zero-impact, agentless CDC pipelines to replicate data with guaranteed low latency and consistency. Arcion features built-in high availability and out-of-the-box support for schema conversion and evolution, with zero code required. Fortune 500 companies rely on Arcion to break down data silos and drive faster, more agile analytics and AI by replicating mission-critical data among databases and cloud data warehouses. Learn more at www.arcion.io, and follow the company on LinkedIn, YouTube and @ArcionLabs.
All brand names and product names are trademarks or registered trademarks of their respective companies.
Tags: Arcion, Hewlett Packard Pathfinder, Hewlett Packard Enterprise, Confluent, Redis, Snowflake, Chegg, Addepar, change data capture, data infrastructure, data mobility, real-time data streaming, data migration, cloud, data lake, data lakehouse, cloud-native, zero-code, no-code, data replication, data pipeline, enterprise