
Next Steps

Your basic DataStream pipeline is now processing data end-to-end. The sections below outline the main paths for expanding it.

Customize Your Pipeline

Content Hub templates provide production-ready baselines, but most deployments need adjustments: renaming fields to match a target schema, dropping unwanted event types, or enriching records with GeoIP data. You can edit any installed template directly and add or reorder processors to suit your requirements.

For complex workflows, break processing into child pipelines that handle specific log categories or enrichment tasks independently.
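As a mental model for the processor adjustments described above, the sketch below chains three stages in plain Python: renaming fields to a target schema, dropping unwanted event types, and enriching with a GeoIP lookup. The field names, event shapes, and the inline GeoIP table are illustrative assumptions, not DataStream's actual processor configuration.

```python
# Illustrative only: a processor chain modeled as plain Python functions.
# Field names and GEO_TABLE are hypothetical stand-ins, not DataStream APIs.

GEO_TABLE = {"203.0.113.7": "EU", "198.51.100.9": "US"}  # stand-in for a GeoIP database

def rename_fields(event, mapping):
    """Rename keys to match a target schema (e.g. src_ip -> source.ip)."""
    return {mapping.get(k, k): v for k, v in event.items()}

def drop_unwanted(events, unwanted_types):
    """Discard event types the downstream consumer does not need."""
    return [e for e in events if e.get("event.type") not in unwanted_types]

def enrich_geoip(event, table):
    """Attach a region code looked up from the source IP, if known."""
    ip = event.get("source.ip")
    if ip in table:
        event = {**event, "source.geo": table[ip]}
    return event

raw = [
    {"src_ip": "203.0.113.7", "event.type": "auth"},
    {"src_ip": "192.0.2.1", "event.type": "heartbeat"},
]
mapping = {"src_ip": "source.ip"}
renamed = [rename_fields(e, mapping) for e in raw]
kept = drop_unwanted(renamed, {"heartbeat"})
enriched = [enrich_geoip(e, GEO_TABLE) for e in kept]
# enriched -> [{"source.ip": "203.0.113.7", "event.type": "auth", "source.geo": "EU"}]
```

Ordering matters in a chain like this: renaming runs first so that later stages (and child pipelines) can rely on one canonical schema.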

Scale Your Data Collection

DataStream supports a wide range of input and output types. On the ingestion side, add devices for network equipment (syslog, SNMP), cloud services (Azure Monitor, AWS CloudTrail), Windows endpoints (Event Logs via Agents), or HTTP webhooks. On the output side, add targets for SIEMs, cloud storage, analytics platforms, or message queues.

Each device and target is configured independently, so you can expand coverage incrementally without redesigning existing routes.

Configure Advanced Routes

Basic Quick Routes connect one device group to one pipeline and one target. As requirements grow, Advanced Routes unlock additional capabilities:

  • Conditional routing -- filter events by content, severity, or source type so different data flows through different pipelines.
  • Parallel processing -- send the same data to multiple targets by creating separate routes that share the same device selection.
  • Multi-target delivery -- route processed output to several destinations (for example, a SIEM for real-time analysis and blob storage for long-term retention).
  • Load balancing -- distribute processing across clustered Directors for high-availability deployments.
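Conditional routing and multi-target delivery can be pictured as a routing table of predicates: each event is tested against every route, and each match fans the event out to that route's pipeline and targets. The predicates, pipeline names, and target names below are hypothetical, chosen only to mirror the bullet points above; they are not DataStream route syntax.

```python
# Illustrative sketch of conditional routing with multi-target fan-out.
# Routes, pipeline names, and target names are hypothetical examples.

routes = [
    # (predicate, pipeline, targets)
    (lambda e: e["severity"] >= 8, "security_pipeline", ["siem", "blob_archive"]),
    (lambda e: e["source"] == "syslog", "network_pipeline", ["blob_archive"]),
]

def dispatch(event):
    """Return every (pipeline, target) pair the event should flow through."""
    hits = []
    for predicate, pipeline, targets in routes:
        if predicate(event):  # conditional routing: content decides the path
            hits.extend((pipeline, target) for target in targets)
    return hits

# A high-severity syslog event matches both routes, so it is processed in
# parallel and delivered to multiple targets.
result = dispatch({"severity": 9, "source": "syslog"})
```

Note that routes are evaluated independently rather than first-match-wins, which is what makes parallel processing (the same data through separate routes) fall out of the same mechanism.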


Manage Your Organization

When multiple engineers share a DataStream deployment, set up proper governance through the Organization section. Assign roles (Owner, Admin, Contributor, User) to control who can modify pipelines, routes, and infrastructure. Enable audit logging to track configuration changes and user activity for compliance reporting.

Congratulations!

You've built your first DataStream pipeline and laid the foundation for more sophisticated data processing workflows. Your journey from raw logs to actionable insights is well underway.

What You've Accomplished:

  • ✅ Created your DataStream account and cloud presence
  • ✅ Deployed and connected a managed Director
  • ✅ Configured your first data source device
  • ✅ Set up a data output destination with proper formatting
  • ✅ Installed professional-grade processing templates
  • ✅ Connected components with functional data flow routes
  • ✅ Verified end-to-end data processing and transformation
  • ✅ Learned monitoring and troubleshooting techniques
  • ✅ Explored scaling and customization options

Your Next Adventure:

Choose your path forward based on your needs:

  • Security Focus: Integrate with Microsoft Sentinel, add threat intelligence enrichment
  • Operations Focus: Connect infrastructure monitoring, build operational dashboards
  • Scale Focus: Add more data sources, implement advanced routing, deploy clustered Directors
  • Compliance Focus: Implement audit trails, long-term retention, automated reporting

The powerful data processing infrastructure you need is now at your fingertips. Welcome to the DataStream community!