Duplicate log storage occurs when multiple sinks capture the same log data — for example, organization-wide sinks exporting all logs to Cloud Storage and project-level sinks doing the same. This redundancy results in paying twice (or more) for identical data. It often arises from decentralized logging configurations, inherited policies, or unclear ownership between teams. The problem is compounded when logs are routed both to Cloud Logging and external observability platforms, creating parallel ingestion streams and double billing.
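One way to surface this overlap is to enumerate sinks at every scope and group them by destination. The sketch below uses the `google-cloud-logging` client library to do this; the organization ID and project IDs are placeholders for illustration, not values from this article, and the duplicate check is a simple heuristic rather than a complete audit.

```python
# A minimal sketch for spotting overlapping sinks across scopes.
# ORG_ID and PROJECT_IDS are hypothetical placeholders.
from collections import defaultdict

from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client

ORG_ID = "123456789012"                   # hypothetical organization ID
PROJECT_IDS = ["prod-app", "prod-data"]   # hypothetical project IDs

client = ConfigServiceV2Client()

# Collect every sink visible at the org level and in each project.
parents = [f"organizations/{ORG_ID}"] + [f"projects/{p}" for p in PROJECT_IDS]
by_destination = defaultdict(list)

for parent in parents:
    for sink in client.list_sinks(parent=parent):
        # Each sink routes entries matching `filter` to `destination`.
        by_destination[sink.destination].append((parent, sink.name, sink.filter))

# Sinks from different scopes writing to the same destination are
# candidates for duplicate routing and duplicate billing.
for destination, sinks in by_destination.items():
    if len(sinks) > 1:
        print(f"Possible duplicate routing to {destination}:")
        for parent, name, log_filter in sinks:
            print(f"  {parent}/sinks/{name}  filter={log_filter or '(all logs)'}")
```

Sinks with empty or broadly overlapping filters that feed destinations of the same type (even different buckets or datasets) deserve the same scrutiny; the grouping key above can be loosened to the destination type to catch those cases.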
Cloud Logging charges separately for data ingestion and for storage in each destination. When logs are routed to multiple sinks with overlapping filters, organizations pay duplicate ingestion and storage costs for the same log entries. These costs scale linearly with the number of redundant destinations.
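A quick back-of-the-envelope calculation makes the linear scaling concrete. The per-GiB rates and volumes below are hypothetical placeholders, not published Cloud Logging pricing; the point is only the multiplication by the number of redundant destinations.

```python
# Illustrative only: rates and volumes are assumptions, not Google pricing.
INGEST_RATE_PER_GIB = 0.50    # hypothetical ingestion rate, USD/GiB
STORAGE_RATE_PER_GIB = 0.02   # hypothetical monthly storage rate, USD/GiB

monthly_volume_gib = 5_000    # hypothetical volume captured by each overlapping sink
redundant_destinations = 3    # e.g. org sink + project sink + external platform

per_copy_cost = monthly_volume_gib * (INGEST_RATE_PER_GIB + STORAGE_RATE_PER_GIB)
duplicated_cost = redundant_destinations * per_copy_cost

print(f"single copy:     ${per_copy_cost:,.2f}/month")
print(f"with duplicates: ${duplicated_cost:,.2f}/month")
print(f"overspend:       ${duplicated_cost - per_copy_cost:,.2f}/month")
```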