For years, security operations have relied on monolithic architectures built around centralized collectors, rigid forwarding chains, and a single “system of record” where all data must land before action can be taken. On paper, that design promised simplicity and control. In practice, it delivered brittle systems, runaway ingest costs, and teams stuck maintaining plumbing instead of improving outcomes.
These challenges were front and center in a recent conversation I had with Allie Mellen, Forrester principal analyst for Security Operations, and Mark Ruiz, senior director of Cyber Risk and Resiliency at Becton Dickinson (BD). We discussed why the monolithic model is breaking down and how a pipeline-first architecture offers a more flexible and sustainable path forward for large enterprises.
The variety and velocity of modern telemetry—spanning endpoints, cloud services, SaaS, and IoT—have simply outgrown centralized designs. Enterprises now need an approach that restores flexibility, cost efficiency, and independence to the teams who operate security at scale. The answer is a pipeline-first architecture.
Traditional architectures suffer from three deeply rooted problems:

- **Tight coupling.** Every source is wired to a single collector and destination, so swapping or adding a downstream tool means rebuilding fragile forwarding chains.
- **Runaway costs.** Pricing tied to ingest volume means bills grow with telemetry, not with value; noisy, low-signal data is paid for at ingest and again in storage.
- **Operational drag.** Scarce engineering talent is consumed maintaining collectors and parsers instead of improving detections and response.
These constraints slow transformation and keep enterprises reactive. They also make it difficult to respond to business changes such as mergers, divestitures, or cloud migrations. In short, monolithic architectures can’t keep up with the dynamism of modern operations.
A pipeline-first approach redefines how telemetry moves through the enterprise. Rather than forcing all data through a single collector, organizations adopt a neutral control plane that sits between producers—endpoints, cloud apps, and IoT—and consumers such as SIEMs, XDRs, data lakes, or AI systems. Within this layer, teams can normalize, enrich, filter, and route telemetry before it reaches downstream tools.
The shift may sound simple, but its benefits are significant. Noise can be removed at the source, cutting ingestion and storage costs. Standardized schemas reduce parsing failures when upstream tools change. Data can be routed simultaneously to multiple destinations, making it easier to pilot new analytics platforms without disruption. Most importantly, routing decisions stay under practitioner control instead of being dictated by a single vendor’s ecosystem. The pipeline becomes the security data control plane, allowing teams to design architectures around outcomes rather than constraints.
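To make the idea concrete, here is a minimal sketch of what a pipeline stage like this might do: drop noise before it incurs ingest cost, normalize vendor-specific fields onto a shared schema, and fan each event out to one or more consumers. The field names, event types, and consumer names are illustrative assumptions, not any particular product's API.

```python
# Hypothetical event types we consider noise and drop at the source.
NOISY_EVENT_TYPES = {"heartbeat", "keepalive"}

def normalize(event: dict) -> dict:
    """Map vendor-specific fields onto a shared schema (assumed field names)."""
    return {
        "timestamp": event.get("ts") or event.get("@timestamp"),
        "source": event.get("src", "unknown"),
        "event_type": event.get("type", "unknown"),
        "raw": event,  # keep the original payload for downstream tools
    }

def route(event: dict, consumers: dict) -> list:
    """Decide destinations for one event; a single event may fan out to several."""
    if event["event_type"] in NOISY_EVENT_TYPES:
        return []                      # filtered before any ingest cost accrues
    destinations = ["data_lake"]       # everything lands in cheaper long-term storage
    if event["event_type"] in {"auth_failure", "malware_detected"}:
        destinations.append("siem")    # only high-value detections reach the SIEM
    for name in destinations:
        consumers[name](event)         # consumers are pluggable callables
    return destinations
```

Because producers only ever talk to `route`, adding a second analytics platform is a one-line change to the consumer map rather than a re-plumbing of every source.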
Artificial intelligence enhances the pipeline-first model by embedding intelligence directly into the data flow instead of layering it on top. When AI operates within the pipeline, it improves data quality and response speed without adding another console to manage.
The most effective uses involve context enrichment, where AI automatically assembles asset identity, business criticality, and threat context from multiple sources to give analysts a clear picture of what matters most. AI can also detect schema drift, monitor telemetry health, and apply fixes before a change disrupts detections. Finally, it can summarize complex event chains into concise, executive-level narratives or translate between query languages on demand. By shifting AI “left” into the pipeline, enterprises improve the fidelity of the data feeding every downstream system, raising confidence before an alert ever reaches an analyst.
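Schema-drift detection, at its simplest, is a comparison between the fields a feed used to emit and the fields it emits now. The sketch below assumes a baseline built from a known-good sample; real pipeline products apply far richer heuristics, so treat this as an illustration of the idea, not an implementation of any vendor's feature.

```python
def baseline_schema(events: list) -> set:
    """Record the union of field names seen in a known-good sample of events."""
    fields = set()
    for event in events:
        fields |= event.keys()
    return fields

def detect_drift(baseline: set, batch: list) -> dict:
    """Report fields that disappeared or newly appeared in the latest batch."""
    current = baseline_schema(batch)
    return {
        "missing": sorted(baseline - current),  # e.g. a renamed field that would break parsers
        "new": sorted(current - baseline),      # new fields worth mapping before they are lost
    }
```

Flagging a "missing" field here, before the batch reaches the SIEM, is what lets the pipeline fix or quarantine a feed change before it silently disables a detection.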
A pipeline-first model also changes how security teams allocate time and talent. Instead of repairing connectors or re-ingesting data, engineers can design for resilience from the start. They can define outcome-driven routes aligned with business purpose, privacy, and retention requirements, tag incoming feeds by sensitivity and persona, and create feedback loops that improve data health continuously. This approach reduces dependence on scarce data engineers and lets experts focus on higher-value work such as creating better detections, building automation, and accelerating response.
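One way to express such outcome-driven routes is a declarative table that tags each feed with sensitivity, retention, and destinations, so routing policy is reviewable data rather than connector code. The feed names and policies below are hypothetical examples, not recommendations.

```python
# Hypothetical routing table: each feed tagged by sensitivity, retention, and purpose.
ROUTES = {
    "okta_auth_logs": {
        "sensitivity": "high",
        "retention_days": 365,
        "destinations": ["siem", "data_lake"],
    },
    "cdn_access_logs": {
        "sensitivity": "low",
        "retention_days": 30,
        "destinations": ["data_lake"],
    },
}

def destinations_for(feed: str) -> list:
    """Look up where a feed flows; unknown feeds go to quarantine for human review."""
    return ROUTES.get(feed, {"destinations": ["quarantine"]})["destinations"]
```

Keeping unknown feeds out of production destinations by default is one example of the continuous feedback loop described above: new sources surface for review instead of silently inflating ingest bills.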
At BD, Mark Ruiz applied this model to unify more than a dozen SIEM instances within weeks. Telemetry is now routed by purpose: high-value detections flow to the SIEM, while broader event data streams to long-term storage for hunting and analytics. Because the pipeline separates producers from consumers, BD can onboard new analytics tools or integrate telemetry from acquisitions without losing visibility. The same backbone also supports IT and observability use cases, proving that consolidation does not have to mean lock-in. For organizations that live with constant change, this flexibility is transformative, providing a programmable way to evolve without downtime.
What the Market Is Signaling
Forrester’s Allie Mellen notes that pipeline capabilities are becoming a key evaluation criterion for security analytics platforms. Large enterprises increasingly prefer to keep basic native analytics features while using external pipelines for routing and transformation. This model provides strategic independence at a time of market consolidation, ensuring that a change in SIEM vendor or the addition of an AI-driven analytics layer does not require a full re-architecture. The ability to pivot tools without breaking visibility is fast becoming a hallmark of resilient security programs.
For CISOs and security leaders, adopting a pipeline-first strategy does not require a full rebuild. It simply shifts where control resides and how data decisions are made.
Monolithic architectures made sense when data sources were few and change was slow. Today’s world of SaaS sprawl, IoT expansion, and AI-accelerated operations demands flexibility. A pipeline-first architecture offers a modern default: consolidate without being cornered, scale without ballooning costs, and apply AI where it delivers real operational value. Most importantly, it restores control to the practitioners closest to the problem, the ones who know which data truly drives action when the clock is ticking.
The future of security operations is not about building bigger monoliths but about designing architectures that can adapt as quickly as the threats they face. That shift begins by making the pipeline the enterprise’s next control plane.