Confluent’s Predictions for GenAI in 2026

According to a recent survey, 68% of IT leaders cited data silos as a major impediment to AI success. To adapt, the report says, companies will need to take these steps: 

  • Exposing agent-safe APIs 
  • Adopting tokenised payment protocols 
  • Making real-time product data available, to forestall transaction bottlenecks further down the pipeline and support an “always-synchronised commerce layer” (the first and third steps are sketched below) 
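
As a concrete illustration of the first and third steps, here is a minimal sketch of an agent-safe product endpoint, assuming FastAPI; the route, field names, and freshness header are hypothetical, not anything from the report:

```python
# Minimal sketch of an "agent-safe" product API (hypothetical names throughout).
from datetime import datetime, timezone

from fastapi import FastAPI, HTTPException, Response
from pydantic import BaseModel

app = FastAPI(title="Agent-safe product API")

class Product(BaseModel):
    sku: str
    price_cents: int      # integer prices avoid float rounding for machine buyers
    in_stock: bool
    updated_at: datetime  # lets an agent judge data freshness for itself

# Stand-in for a store kept current by a real-time pipeline.
CATALOG = {
    "sku-123": Product(sku="sku-123", price_cents=1999, in_stock=True,
                       updated_at=datetime.now(timezone.utc)),
}

@app.get("/products/{sku}", response_model=Product)
def get_product(sku: str, response: Response) -> Product:
    product = CATALOG.get(sku)
    if product is None:
        raise HTTPException(status_code=404, detail="unknown sku")
    # Explicit staleness signal: zero-patience machine customers can walk
    # away mid-transaction if the data is not fresh enough.
    age = (datetime.now(timezone.utc) - product.updated_at).total_seconds()
    response.headers["X-Data-Age-Seconds"] = f"{age:.0f}"
    return product
```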

Confluent’s 2026 forecast explained: 

“Companies will need to figure out how to optimize sales and marketing for the machines that will increasingly do the decision making, and in some cases, ultimately the buying. If you think human customers are fickle, machine customers can be ruthless: they have zero patience for latency, no brand loyalty, and can switch vendors mid-transaction whenever a better offer appears.” 

Despite security concerns about the open-source protocol being shared with participants who may not have been appropriately vetted, the report predicts that “the gravitational pull towards a single, easy open protocol that reduces friction and developer overhead will prove irresistible in 2026.” 

Although the report concedes that “other competing standards like Agent2Agent (A2A) and Agent Communication Protocol (ACP) continue to vie for relevance in agent-to-agent communication,” it argues that adopting the Model Context Protocol (MCP) keeps uptake of contributing LLMs flexible: you can verify that a model vendor is fit for purpose, and trade it in for an alternative if it is not. 
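
For a rough sense of how MCP decouples tooling from any single model vendor, here is a minimal server sketch using the Model Context Protocol Python SDK; the tool name and stub data are hypothetical:

```python
# A minimal MCP server (pip install "mcp[cli]"); the tool is an illustration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("product-data")

@mcp.tool()
def lookup_price(sku: str) -> int:
    """Return the current price in cents for a SKU (stub data here)."""
    return {"sku-123": 1999}.get(sku, -1)

if __name__ == "__main__":
    # Any MCP-compatible client or LLM can now call lookup_price, which is
    # what makes swapping model vendors comparatively painless.
    mcp.run()
```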

The report asserts that context engineering will go mainstream in 2026, just as 2025 saw the roll-out of agentic AI. Context engineering means deciding what the model sees: priority schemas sit at the forefront of the analytic process, while the LLM is not overburdened with material that slows it down or degrades its accuracy, all inside a continuous, context-guided evaluation loop. By building on pre-trained models, teams are able “to refine logic, and adjust live through prompts, rules, and context.” 
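
A minimal sketch of the context-assembly step might look like the following; the scoring and token-budget logic are simplified illustrations, not any particular library’s API:

```python
# Rank candidate context items by a priority schema and pack them under a
# token budget before each model call (all heuristics here are illustrative).

def rough_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude chars-per-token estimate

def build_context(items: list[dict], budget: int = 2000) -> str:
    """items: [{'text': ..., 'priority': ...}] -- higher priority packed first."""
    chosen, used = [], 0
    for item in sorted(items, key=lambda i: -i["priority"]):
        cost = rough_tokens(item["text"])
        if used + cost > budget:
            continue  # skip rather than overload the model's context window
        chosen.append(item["text"])
        used += cost
    return "\n\n".join(chosen)

prompt_context = build_context([
    {"text": "Orders table schema: order_id, sku, qty, ts", "priority": 10},
    {"text": "Long historical changelog ...", "priority": 1},
])
```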

More queries will need to be offloaded to caches or additional databases if systems are to cope with 2026 data-usage levels. “Now is the time to implement change data capture (CDC) pipelines and ensure data is flowing in near real time,” since agentic queries run at a far larger scale than those overseen by human actors, and day-to-day data hygiene tasks are increasingly automated. 
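
On the consuming end, a CDC pipeline might look something like this sketch, assuming the confluent-kafka client and a Debezium-style event shape; the topic name and payload fields are assumptions:

```python
# Read change events from Kafka and keep a local cache in near real time.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "product-cache",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["cdc.inventory.products"])  # hypothetical CDC topic

cache: dict[str, dict] = {}  # agents read this instead of the OLTP database

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        change = json.loads(msg.value())
        key = json.loads(msg.key())["sku"]
        row = change.get("after")        # Debezium convention: None = delete
        if row is None:
            cache.pop(key, None)
        else:
            cache[key] = row
finally:
    consumer.close()
```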

Confluent also stressed the vital role of cyber security in 2026, with many technical leaders citing it as a core priority. Current forecasts put global cyber crime losses at about $12 trillion, although the uptake of AI by threat actors could push the figure as high as $18 trillion a year. Attack volume may well double, making scenario modelling and penetration testing essential: known threats have to be contained, responses prioritised by business impact, and compromised segments isolated tier by tier. Cyber security managers need to be able to pivot around a threat, using segregated “high volume, low-latency data infrastructure for instant analysis and automated response.” 
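
A toy example of the kind of instant analysis the report describes is a sliding-window rate detector; the thresholds and quarantine hook below are illustrative assumptions, not a production control:

```python
# Flag a source when its request rate doubles over an assumed baseline.
import time
from collections import defaultdict, deque

WINDOW_S, BASELINE_RPS = 10, 50          # illustrative numbers
events: dict[str, deque] = defaultdict(deque)

def quarantine(source_ip: str) -> None:
    print(f"isolating segment for {source_ip}")  # stand-in for real isolation

def observe(source_ip: str, now: float | None = None) -> bool:
    """Record one request; return True if this source looks anomalous."""
    now = now or time.time()
    q = events[source_ip]
    q.append(now)
    while q and q[0] < now - WINDOW_S:   # evict events outside the window
        q.popleft()
    if len(q) / WINDOW_S > 2 * BASELINE_RPS:  # "attack volume may double"
        quarantine(source_ip)            # hypothetical automated response
        return True
    return False
```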

In the context of data governance, data protection officers need to be proactive if safeguarding standards are to be upheld and data trust, quality and lineage ensured. That means monitoring reuse of sensitive data and watching linked queries run against permitted tables: joins can combine overlapping fields from segregated tables, and bulk-processing requirements can override per-table usage rules. The report stated that 84% of technical leaders recently called data management and governance a top-tier technology priority. 
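
One way to picture such a control is a pre-execution check that blocks joins whose combined fields exceed the sensitivity of any single table; the field classifications below are hypothetical:

```python
# Block queries whose joined columns form a jointly sensitive combination,
# even when access to each table alone is permitted.
SENSITIVE_COMBOS = [
    {"postcode", "birth_date", "gender"},   # jointly re-identifying
    {"card_last4", "email"},
]

def join_allowed(tables: dict[str, set[str]]) -> bool:
    """tables: name -> set of fields the query touches in that table."""
    exposed = set().union(*tables.values())
    for combo in SENSITIVE_COMBOS:
        if combo <= exposed:
            return False   # combined fields exceed the usage policy
    return True

assert join_allowed({"orders": {"email", "sku"}})
assert not join_allowed({"users": {"email"}, "payments": {"card_last4"}})
```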

Confluent is championing Apache Iceberg, an open table format well suited to cold data, as a way of meeting requirements for data reuse. 

 “Iceberg is poised to lead this transformation, with continued maturity in its Puffin metadata format, advancements in data compaction and sorting, and emerging row-level lineage capabilities. Compared to other open table formats, like Databricks Delta Lake (which is optimized for “hot” read performance), Iceberg’s merge-on-read and partitioning design make it particularly well suited for cold data workloads.” 
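
For a sense of what a cold-data read looks like in practice, here is a sketch using PyIceberg; the catalog, table, and column names are assumptions:

```python
# Scan historical data from an Iceberg table (pip install pyiceberg).
from pyiceberg.catalog import load_catalog

catalog = load_catalog("default")            # configured via ~/.pyiceberg.yaml
table = catalog.load_table("warehouse.order_events")

# Partition pruning plus merge-on-read keeps scans over old data cheap.
cold = table.scan(
    row_filter="event_date < '2025-01-01'",
    selected_fields=("order_id", "sku", "event_date"),
).to_arrow()
print(cold.num_rows)
```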

Durable execution engines will be the smart investment choice, with event-driven handling of retries, timeouts and compensations. LangGraph and Pydantic AI have pioneered the adoption of fault-tolerant frameworks, which allow failures to be diagnosed and resolved live. This lets complex patterns, such as Sagas and CQRS, move out of hand-written microservice code and into workflow-as-code. 
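
A bare-bones sketch of workflow-as-code with saga-style compensations might look like this; the step functions are hypothetical, and a real durable execution engine would also persist this state across crashes:

```python
# Each saga step registers a compensation that runs in reverse order
# if a later step fails, undoing the work already committed.
class Saga:
    def __init__(self) -> None:
        self._compensations: list = []

    def step(self, action, compensation, *args):
        try:
            result = action(*args)
            self._compensations.append((compensation, args))
            return result
        except Exception:
            for comp, comp_args in reversed(self._compensations):
                comp(*comp_args)          # undo previously committed steps
            raise

def reserve_stock(sku): print(f"reserved {sku}")
def release_stock(sku): print(f"released {sku}")
def charge_card(sku): raise RuntimeError("payment declined")
def refund_card(sku): print(f"refunded {sku}")

saga = Saga()
saga.step(reserve_stock, release_stock, "sku-123")
try:
    saga.step(charge_card, refund_card, "sku-123")  # fails -> stock released
except RuntimeError:
    pass
```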

“At the same time, Kafka will remain the scalable event backbone and Flink a high-throughput, low-latency processing and real-time analytics engine.” 

The report concluded with a call to CIOs not to overwrite or override existing infrastructure, but to work with GenAI to adapt the system to changing usage needs. 

“Executives should begin to re-analyze the cost-risk for legacy system modernization with GenAI, leveraging specialized integrators for generative code translation and understanding. Developers should focus on migrating legacy messaging systems (like JMS) to modern event-driven architectures, using AI tools to accelerate the process and transform hard-to-maintain legacy code into new, valuable capabilities.” 
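
The Kafka-side replacement for a legacy JMS queue send could be as simple as the following sketch, assuming the confluent-kafka client; the topic and payload shape are illustrative:

```python
# The old sendToQueue("ORDERS", payload) becomes an event on a Kafka topic.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def publish_order(order: dict) -> None:
    # Keyed by order_id so all events for an order land in one partition,
    # preserving the per-entity ordering a JMS queue used to provide.
    producer.produce(
        "orders",
        key=order["order_id"],
        value=json.dumps(order),
    )
    producer.flush()

publish_order({"order_id": "o-42", "sku": "sku-123", "qty": 1})
```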
