Cost Optimization Guide

LogTrim vs Datadog pipelines

Datadog pipelines are useful for processing data that has already been ingested. Cost-focused suppression is most effective before ingestion, where it reduces billable volume directly.

Why this problem exists

Teams often expect downstream processing to solve upstream ingestion economics, but Datadog pipelines run after logs have already been ingested and billed.

Pipeline logic can improve search quality while leaving ingestion-heavy bills mostly unchanged.

Real cost and impact

When pricing is ingestion-driven, pre-ingestion suppression provides the clearest savings path.

Relying only on downstream pipelines can keep costs high in noisy workloads.
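The effect of pre-ingestion suppression on an ingestion-driven bill is simple arithmetic. The volumes and per-GB rate below are illustrative assumptions, not Datadog list pricing:

```python
# Illustrative arithmetic only: the volumes and per-GB rate are
# assumptions chosen for the example, not real pricing.
daily_ingest_gb = 500          # total log volume per day
low_value_fraction = 0.60      # share of repetitive, low-value logs
price_per_gb = 0.10            # assumed ingestion price per GB

baseline_cost = daily_ingest_gb * price_per_gb
with_suppression = daily_ingest_gb * (1 - low_value_fraction) * price_per_gb

print(f"baseline: ${baseline_cost:.2f}/day")
print(f"with pre-ingestion suppression: ${with_suppression:.2f}/day")
print(f"savings: ${baseline_cost - with_suppression:.2f}/day")
```

Downstream pipeline rules change none of these numbers, because every line above the suppression filter has already been billed at ingest.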

Solutions (including alternatives)

  • Use Datadog pipelines for enrichment, parsing, and field extraction on high-signal logs.
  • Use pre-ingestion filtering for repetitive low-value logs that should never be forwarded.
  • Route complete retained history to S3 when full-fidelity archival is required.
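The second bullet, pre-ingestion filtering, amounts to a drop-before-forward check in the log shipper. A minimal sketch, with example patterns that are assumptions rather than recommended rules:

```python
import re

# Hypothetical suppression rules: drop repetitive, low-value lines
# before they are forwarded to Datadog. These patterns are examples only.
SUPPRESS_PATTERNS = [
    re.compile(r"GET /healthz .* 200"),  # health-check noise
    re.compile(r"\bDEBUG\b"),            # debug chatter
]

def should_forward(line: str) -> bool:
    """Return True if the line should be forwarded (and billed at ingest)."""
    return not any(p.search(line) for p in SUPPRESS_PATTERNS)

logs = [
    "GET /healthz HTTP/1.1 200",
    "ERROR payment failed for order 123",
    "DEBUG cache miss key=abc",
]
forwarded = [line for line in logs if should_forward(line)]
```

Only the error line survives the filter; the health-check and debug lines never reach the ingest endpoint, so they never appear on the bill.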

How LogTrim solves it

LogTrim enforces suppression, masking, and routing before Datadog ingestion.

Teams can still use Datadog pipelines on curated high-value logs.

Example scenario

A platform team kept Datadog pipeline enrichment for error events and moved noisy success-path suppression upstream.

They improved both cost and search relevance.

Comparison

Side-by-side view of the trade-offs for this use case.
Dimension       | Datadog Pipelines                     | LogTrim
Execution point | After ingestion                       | Before ingestion
Best use        | Enrichment and parsing                | Volume reduction, masking, and routing
Cost impact     | Limited for ingestion-heavy workloads | Direct reduction of billable ingest volume

Reduce your costs with LogTrim

Start with high-noise categories, keep high-signal logs in Datadog, and archive full retention in S3.