JSON-in-log-line processor (NDJSON / structured log output).
Processes files where each line may contain an embedded JSON object
(e.g. structured logging output from slog, tracing-json, bunyan,
logrus, Datadog, etc.).
§Behaviour
Each line is processed individually:
- Scan for the first `{` on the line.
- If found, attempt to locate the matching `}` using brace-counting.
- Parse the extracted `{...}` span as JSON.
- If parsing succeeds, pass the JSON object through the full JSON processor with all field rules from the profile.
- Reconstruct the line as `line_prefix` + sanitised JSON + `line_suffix`.
- If parsing fails or no JSON span is found, the line is emitted unchanged; the outer double-pass streaming scan will still catch plain-text secrets on those lines.
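The scan-and-extract steps above can be sketched as a small stdlib-only helper. The function name `find_json_span` is hypothetical and this is a simplification of whatever the crate actually does; it does, however, skip braces inside JSON string literals, which any real brace-counter must handle:

```rust
/// Locate the first balanced `{...}` span on a line using brace counting,
/// ignoring braces that occur inside JSON string literals.
/// Returns byte offsets (start, end) such that `&line[start..end]` is the span.
fn find_json_span(line: &str) -> Option<(usize, usize)> {
    let start = line.find('{')?;
    let mut depth: usize = 0;
    let mut in_string = false;
    let mut escaped = false;
    for (i, c) in line[start..].char_indices() {
        if escaped {
            escaped = false;
            continue;
        }
        match c {
            '\\' if in_string => escaped = true,
            '"' => in_string = !in_string,
            '{' if !in_string => depth += 1,
            '}' if !in_string => {
                depth -= 1;
                if depth == 0 {
                    // end is exclusive: one byte past the matching `}`
                    return Some((start, start + i + c.len_utf8()));
                }
            }
            _ => {}
        }
    }
    None // unbalanced braces: treat the line as plain text
}
```

A caller would then rebuild the line as `&line[..start]` + sanitised JSON + `&line[end..]`, matching the reconstruction step above.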
§Format Detection
This processor is not auto-detected from the `.log` extension;
it must be requested explicitly with `--format log`.
This avoids misprocessing plain-text log files that happen to contain
individual `{` characters.
§Field Rules
Use `"*"` to sanitise every string field inside the JSON payloads,
or specific dot-separated paths (e.g. `"user.token"`) to be selective.
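As an illustration of these rule semantics only (a toy, stdlib-only value type rather than the crate's real JSON representation; all names here are hypothetical):

```rust
// Toy JSON value, just enough to demonstrate "*" vs. dot-path rules.
enum Val {
    Str(String),
    Obj(Vec<(String, Val)>),
}

/// Replace every string field whose dot-separated path matches a rule.
/// "*" matches all string fields; "user.token" matches only that field.
fn sanitise(v: &mut Val, rules: &[&str], path: String) {
    match v {
        Val::Str(s) => {
            if rules.iter().any(|r| *r == "*" || *r == path) {
                *s = "***".to_string();
            }
        }
        Val::Obj(fields) => {
            for (k, child) in fields.iter_mut() {
                let child_path = if path.is_empty() {
                    k.clone()
                } else {
                    format!("{path}.{k}")
                };
                sanitise(child, rules, child_path);
            }
        }
    }
}
```

With the rule list `["user.token"]`, only the `token` field under `user` is replaced; sibling fields such as `user.name` pass through untouched.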
§Structs
- `LogLineProcessor`: Structured processor for NDJSON / structured-log files.