Cost Optimization Guide
LogTrim vs Datadog pipelines
Datadog pipelines are useful for processing logs after they have been ingested. Cost-focused suppression is most effective before ingestion, where billable volume is determined.
Why this problem exists
Teams often expect downstream processing to solve upstream ingestion economics. In practice, pipeline logic can improve search quality while leaving ingestion-heavy bills mostly unchanged, because the volume has already been billed by the time a pipeline runs.
Real cost and impact
When pricing is ingestion-driven, pre-ingestion suppression provides the clearest savings path.
Relying only on downstream pipelines can keep costs high in noisy workloads.
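A minimal sketch of the arithmetic, assuming a hypothetical flat per-GB rate and daily volume (these numbers are illustrative, not Datadog list prices):

```python
# Illustrative only: rate and volumes are hypothetical assumptions.
def monthly_ingest_cost(gb_per_day: float, rate_per_gb: float, days: int = 30) -> float:
    """Ingestion-driven pricing: every forwarded GB is billable."""
    return gb_per_day * days * rate_per_gb

# All logs forwarded vs. 70% suppressed before ingestion.
baseline = monthly_ingest_cost(gb_per_day=500, rate_per_gb=0.10)         # 1500.0
suppressed = monthly_ingest_cost(gb_per_day=500 * 0.3, rate_per_gb=0.10)  # 450.0
savings = baseline - suppressed                                           # 1050.0
```

Downstream pipeline rules applied after ingestion would leave `baseline` unchanged; only pre-ingestion suppression moves the `gb_per_day` input.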
Solutions (including alternatives)
- Use Datadog pipelines for enrichment, parsing, and field extraction on high-signal logs.
- Use pre-ingestion filtering for repetitive low-value logs that should never be forwarded.
- Route complete retained history to S3 when full-fidelity archival is required.
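The pre-ingestion filtering item above can be sketched as a simple predicate applied at the forwarder. The pattern list and function name are assumptions for illustration, not any real forwarder's API:

```python
import re

# Hypothetical drop list: repetitive low-value logs that should never be forwarded.
DROP_PATTERNS = [
    re.compile(r"GET /healthz 200"),      # load-balancer health checks
    re.compile(r"connection pool idle"),  # repetitive success-path noise
]

def should_forward(line: str) -> bool:
    """Return False for lines matching a known low-value pattern."""
    return not any(p.search(line) for p in DROP_PATTERNS)

logs = [
    "GET /healthz 200 2ms",
    "ERROR payment declined order=123",
    "connection pool idle workers=8",
]
forwarded = [line for line in logs if should_forward(line)]
# Only the error line is forwarded for enrichment; dropped lines can still
# be routed to S3 if full-fidelity archival is required.
```

The design choice is that the drop decision happens before any byte reaches the billable endpoint, while high-signal lines continue on to Datadog pipelines untouched.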
How LogTrim solves it
LogTrim enforces suppression, masking, and routing before Datadog ingestion.
Teams can still use Datadog pipelines on curated high-value logs.
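As one example of pre-ingestion masking, sensitive fields can be redacted before a line leaves the network. This is a minimal sketch of the concept, not LogTrim's actual configuration or API; the regex and replacement token are assumptions:

```python
import re

# Hypothetical masking rule: redact email addresses before forwarding.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask(line: str) -> str:
    """Replace email addresses with a fixed redaction token."""
    return EMAIL.sub("[REDACTED]", line)

masked = mask("user login ok user=alice@example.com")
```

Because masking runs upstream, Datadog pipelines only ever see the redacted form, so enrichment and parsing on high-value logs are unaffected.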
Example scenario
A platform team kept Datadog pipeline enrichment for error events and moved suppression of noisy success-path logs upstream. The result improved both cost and search relevance: billable volume dropped, and queries surfaced fewer repetitive success lines.
Comparison
| Dimension | Datadog Pipelines | LogTrim |
|---|---|---|
| Execution point | After ingestion | Before ingestion |
| Best use | Enrichment and parsing | Volume reduction, masking, and routing |
| Cost impact | Limited for ingestion-heavy workloads | Direct reduction of billable ingest volume |