# Parsing JSON Structured Logs
Parse JSON-formatted structured log entries from modern logging libraries such as Winston, Bunyan, Pino, and Logrus, with automatic field extraction.
## Detailed Explanation

### JSON Structured Logging
JSON structured logs are the modern standard for application logging. Instead of free-form text, each log entry is a JSON object with well-defined fields. This makes logs machine-readable, searchable, and easy to ingest into log management systems such as the ELK stack, Datadog, or Splunk.
### Common JSON Log Fields
Different logging libraries use slightly different field names, but the parser handles all common variations:
| Purpose | Field Names |
|---|---|
| Timestamp | `timestamp`, `time`, `ts`, `@timestamp`, `date` |
| Severity | `level`, `severity`, `loglevel`, `log_level` |
| Source | `logger`, `source`, `service`, `module`, `component` |
| Message | `message`, `msg`, `text`, `log` |
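The alias handling described in the table can be sketched roughly as follows. This is a minimal illustration, not the parser's actual implementation; `FIELD_ALIASES` and `normalize` are hypothetical names.

```python
import json

# Alias lists mirror the table above; hypothetical names, not the parser's real API.
FIELD_ALIASES = {
    "timestamp": ("timestamp", "time", "ts", "@timestamp", "date"),
    "level": ("level", "severity", "loglevel", "log_level"),
    "source": ("logger", "source", "service", "module", "component"),
    "message": ("message", "msg", "text", "log"),
}

def normalize(entry: dict) -> dict:
    """Map library-specific key names onto canonical fields (first matching alias wins)."""
    out = {}
    for canonical, aliases in FIELD_ALIASES.items():
        for alias in aliases:
            if alias in entry:
                out[canonical] = entry[alias]
                break
    return out

line = '{"level":"warning","msg":"Connection pool running low","time":"2024-01-15T10:30:00Z"}'
print(normalize(json.loads(line)))
```

Checking aliases in a fixed order makes the result deterministic when an entry happens to contain two names for the same role (e.g. both `message` and `msg`).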
### Example Log Entries
Winston (Node.js):
`{"level":"info","message":"Server started on port 3000","timestamp":"2024-01-15T10:30:00.000Z","service":"api-gateway"}`
Pino (Node.js):
`{"level":30,"time":1705314600000,"msg":"Request completed","pid":12345,"hostname":"web-01","req":{"method":"GET","url":"/api/health"},"res":{"statusCode":200},"responseTime":12}`
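Pino is the notable outlier here: it emits severity as a number and its `time` field as epoch milliseconds. The default numeric levels are part of Pino's documented defaults; `pino_level_name` and `pino_time` below are illustrative names for a decoding sketch.

```python
from datetime import datetime, timezone

# Pino's default numeric severity levels.
PINO_LEVELS = {10: "trace", 20: "debug", 30: "info", 40: "warn", 50: "error", 60: "fatal"}

def pino_level_name(level):
    """Translate Pino's numeric level to a name; pass string levels through unchanged."""
    return PINO_LEVELS.get(level, level) if isinstance(level, int) else level

def pino_time(epoch_ms: int) -> datetime:
    """Convert Pino's epoch-millisecond time field to an aware UTC datetime."""
    return datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
```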
Logrus (Go):
`{"level":"warning","msg":"Connection pool running low","time":"2024-01-15T10:30:00Z","available":2,"max":20,"component":"db-pool"}`
### Extra Fields
Any JSON keys that are not recognized as standard log fields (timestamp, level, source, message) are extracted as extra fields and displayed in a collapsible section. This captures context-specific data like request IDs, durations, user IDs, and error stack traces.
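Separating extras from standard fields amounts to set difference against the recognized key names. A minimal sketch, assuming the alias lists from the table above (`KNOWN_KEYS` and `extract_extras` are illustrative names):

```python
# Every recognized alias for the four standard roles (timestamp, level, source, message).
KNOWN_KEYS = {
    "timestamp", "time", "ts", "@timestamp", "date",
    "level", "severity", "loglevel", "log_level",
    "logger", "source", "service", "module", "component",
    "message", "msg", "text", "log",
}

def extract_extras(entry: dict) -> dict:
    """Return every key not recognized as a standard log field."""
    return {k: v for k, v in entry.items() if k not in KNOWN_KEYS}
```

Applied to the Logrus example above, this would keep `available` and `max` as extras while `level`, `msg`, `time`, and `component` map to standard fields.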
### Use Case
Analyzing application logs from microservices architectures, debugging distributed system issues using correlation IDs, monitoring service health metrics embedded in log entries, and preparing logs for ingestion into centralized log management platforms.
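The correlation-ID workflow can be sketched as a simple filter over a JSON log stream. The field name `request_id` is an assumption for illustration; substitute whatever correlation field your services actually emit.

```python
import json

def filter_by_correlation_id(lines, request_id):
    """Yield parsed entries whose correlation field matches request_id.

    "request_id" is an assumed field name; real services may use
    other names such as a trace or correlation ID header value.
    """
    for line in lines:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip non-JSON lines (e.g. multi-line stack traces)
        if entry.get("request_id") == request_id:
            yield entry

# Usage: reconstruct one request's path across services from a merged log stream.
# for entry in filter_by_correlation_id(open("merged.log"), "req-abc123"):
#     print(entry.get("service"), entry.get("msg"))
```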