The FileLog Receiver in the OpenTelemetry Collector is used to ingest logs from files.
It monitors specified files for new log entries and streams those logs into the Collector for further processing or exporting. It is useful for testing and development purposes.
For this part of the workshop, there is a script that will generate log lines in a file. The Filelog receiver will read these log lines and send them to the OpenTelemetry Collector.
Exercise
Move to the log-gen terminal window.
Navigate to the [WORKSHOP] directory and create a new subdirectory named 3-filelog (a command sketch for these steps follows below).
Next, copy all contents from the 2-gateway directory into 3-filelog.
After copying, remove any *.out and *.log files.
Change all terminal windows to the [WORKSHOP]/3-filelog directory.
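For macOS/Linux, these steps can also be run from the shell. The following is a minimal sketch that assumes your current directory is [WORKSHOP]; Windows users can translate the same steps to PowerShell:

mkdir 3-filelog                        # Create the new subdirectory
cp 2-gateway/* 3-filelog/              # Copy the contents of 2-gateway
rm -f 3-filelog/*.out 3-filelog/*.log  # Remove any leftover output and log files
cd 3-filelog                           # Then point every terminal window at this directory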
Create the log-gen script: In the 3-filelog directory, create the script log-gen.sh (macOS/Linux) or log-gen.ps1 (Windows) using the appropriate script below for your operating system:
log-gen.sh (macOS/Linux):

#!/bin/bash

# Define the log file
LOG_FILE="quotes.log"

# Define quotes
LOTR_QUOTES=(
    "One does not simply walk into Mordor."
    "Even the smallest person can change the course of the future."
    "All we have to decide is what to do with the time that is given us."
    "There is some good in this world, and it's worth fighting for."
)

STAR_WARS_QUOTES=(
    "Do or do not, there is no try."
    "The Force will be with you. Always."
    "I find your lack of faith disturbing."
    "In my experience, there is no such thing as luck."
)

# Function to get a random quote
get_random_quote() {
    if (( RANDOM % 2 == 0 )); then
        echo "${LOTR_QUOTES[RANDOM % ${#LOTR_QUOTES[@]}]}"
    else
        echo "${STAR_WARS_QUOTES[RANDOM % ${#STAR_WARS_QUOTES[@]}]}"
    fi
}

# Function to get a random log level
get_random_log_level() {
    LOG_LEVELS=("INFO" "WARN" "ERROR" "DEBUG")
    echo "${LOG_LEVELS[RANDOM % ${#LOG_LEVELS[@]}]}"
}

# Function to generate a log entry
generate_log_entry() {
    TIMESTAMP=$(date "+%Y-%m-%d %H:%M:%S")
    LEVEL=$(get_random_log_level)
    MESSAGE=$(get_random_quote)
    if [ "$JSON_OUTPUT" = true ]; then
        echo "{\"timestamp\": \"$TIMESTAMP\", \"level\": \"$LEVEL\", \"message\": \"$MESSAGE\"}"
    else
        echo "$TIMESTAMP [$LEVEL] - $MESSAGE"
    fi
}

# Parse command line arguments
JSON_OUTPUT=false
while [[ "$#" -gt 0 ]]; do
    case $1 in
        -json) JSON_OUTPUT=true ;;
    esac
    shift
done

# Main loop to write logs
echo "Writing logs to $LOG_FILE. Press Ctrl+C to stop."
while true; do
    generate_log_entry >> "$LOG_FILE"
    sleep 1  # Adjust this value for log frequency
done
log-gen.ps1 (Windows):

# Define the log file
$LOG_FILE = "quotes.log"

# Define quotes
$LOTR_QUOTES = @(
    "One does not simply walk into Mordor.",
    "Even the smallest person can change the course of the future.",
    "All we have to decide is what to do with the time that is given us.",
    "There is some good in this world, and it's worth fighting for."
)

$STAR_WARS_QUOTES = @(
    "Do or do not, there is no try.",
    "The Force will be with you. Always.",
    "I find your lack of faith disturbing.",
    "In my experience, there is no such thing as luck."
)

# Function to get a random quote
function Get-RandomQuote {
    if ((Get-Random -Minimum 0 -Maximum 2) -eq 0) {
        return $LOTR_QUOTES[(Get-Random -Minimum 0 -Maximum $LOTR_QUOTES.Length)]
    } else {
        return $STAR_WARS_QUOTES[(Get-Random -Minimum 0 -Maximum $STAR_WARS_QUOTES.Length)]
    }
}

# Function to get a random log level
function Get-RandomLogLevel {
    $LOG_LEVELS = @("INFO", "WARN", "ERROR", "DEBUG")
    return $LOG_LEVELS[(Get-Random -Minimum 0 -Maximum $LOG_LEVELS.Length)]
}

# Function to generate a log entry
function Generate-LogEntry {
    $TIMESTAMP = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $LEVEL = Get-RandomLogLevel
    $MESSAGE = Get-RandomQuote
    if ($JSON_OUTPUT) {
        $logEntry = @{ timestamp = $TIMESTAMP; level = $LEVEL; message = $MESSAGE } | ConvertTo-Json -Compress
    } else {
        $logEntry = "$TIMESTAMP [$LEVEL] - $MESSAGE"
    }
    return $logEntry
}

# Parse command line arguments
$JSON_OUTPUT = $false
if ($args -contains "-json") {
    $JSON_OUTPUT = $true
}

# Main loop to write logs
Write-Host "Writing logs to $LOG_FILE. Press Ctrl+C to stop."
while ($true) {
    $logEntry = Generate-LogEntry
    # Ensure UTF-8 encoding is used (without BOM) to avoid unwanted characters
    $logEntry | Out-File -Append -FilePath $LOG_FILE -Encoding utf8
    Start-Sleep -Seconds 1  # Adjust log frequency
}
Your updated directory structure will now look like this:

WORKSHOP
├── 1-agent
├── 2-gateway
├── 3-filelog
│ ├── agent.yaml # Agent Collector configuration file
│ ├── gateway.yaml # Gateway Collector configuration file
│ ├── log-gen.(sh or ps1) # Script to write a file with log lines
│ └── trace.json # Example trace file
└── otelcol # OpenTelemetry Collector binary
For macOS/Linux make sure the script is executable:
chmod +x log-gen.sh
3.2 Start Log-Gen
Exercise
Start the appropriate script for your system (./log-gen.sh on macOS/Linux or .\log-gen.ps1 on Windows). The script will begin writing lines to a file named quotes.log:
./log-gen.sh
Writing logs to quotes.log. Press Ctrl+C to stop.
Note
On Windows, you may encounter the following error:
.\log-gen.ps1 : File .\log-gen.ps1 cannot be loaded because running scripts is disabled on this system …
To resolve this run:
powershell -ExecutionPolicy Bypass -File log-gen.ps1
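Regardless of platform, you can confirm that the generator is actually writing entries by watching the file from a spare terminal window. A macOS/Linux sketch (quotes.log is created in the directory where the script runs):

tail -f quotes.log    # Shows new log lines as they are appended; Ctrl+C stops the tail, not the generator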
3.3 Filelog Configuration
Exercise
Move to the Agent terminal window and change into the [WORKSHOP]/3-filelog directory. Open the agent.yaml you copied across earlier in your editor and add the filelog receiver.
Create the filelog receiver and name it quotes: The FileLog receiver reads log data from a file and includes custom resource attributes in the log data:
filelog/quotes:                      # Receiver Type/Name
  include: ./quotes.log              # The file to read log data from
  include_file_path: true            # Include file path in the log data
  include_file_name: false           # Exclude file name from the log data
  resource:                          # Add custom resource attributes
    com.splunk.source: ./quotes.log  # Source of the log data
    com.splunk.sourcetype: quotes    # Source type of the log data
Add the filelog/quotes receiver: In the logs: pipeline, add the filelog/quotes receiver.
logs:
  receivers:
    - otlp                # OTLP Receiver
    - filelog/quotes      # Filelog Receiver reading quotes.log
  processors:
    - memory_limiter      # Memory Limiter Processor
    - resourcedetection   # Adds system attributes to the data
    - resource/add_mode   # Adds collector mode metadata
    - batch               # Batch Processor, groups data before send
  exporters:
    - debug               # Debug Exporter
    - otlphttp            # OTLP/HTTP Exporter
Validate the agent configuration using otelbin.io. For reference, the logs: section of your pipelines should match the receivers, processors, and exporters shown above.
Check the log-gen script is running: Find the log-gen terminal window and check that the script is still running and that its last line still reads as shown below. If it is not, restart it in the [WORKSHOP]/3-filelog directory:
Writing logs to quotes.log. Press Ctrl+C to stop.
Start the Gateway:
Find your Gateway terminal window.
Navigate to the [WORKSHOP]/3-filelog directory.
Start the Gateway.
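The exact command depends on your setup from the earlier parts of the workshop; assuming the otelcol binary sits in the parent [WORKSHOP] directory as shown in the directory tree above, it looks like this (use ..\otelcol.exe on Windows):

../otelcol --config=gateway.yaml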
Start the Agent:
Switch to your Agent terminal window.
Navigate to the [WORKSHOP]/3-filelog directory.
Start the Agent.
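Again, assuming the otelcol binary is in the parent [WORKSHOP] directory:

../otelcol --config=agent.yaml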
Ignore the initial CPU metrics in the debug output and wait until a continuous stream of log data from quotes.log appears. The debug output should look similar to the following (expand Check Full Debug Log below to see all the data):
<snip>
Body: Str(2025-02-05 18:05:16 [INFO] - All we have to decide is what to do with the time that is given us.)
Attributes:
-> log.file.path: Str(quotes.log)
</snip>
Check Full Debug Log
2025-02-05T18:05:17.050+0100 info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 1}
2025-02-05T18:05:17.050+0100 info ResourceLog #0
Resource SchemaURL: https://opentelemetry.io/schemas/1.6.1
Resource attributes:
-> com.splunk.source: Str(./quotes.log)
-> com.splunk.sourcetype: Str(quotes)
-> host.name: Str(PH-Windows-Box.hagen-ict.nl)
-> os.type: Str(windows)
-> otelcol.service.mode: Str(gateway)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope
LogRecord #0
ObservedTimestamp: 2025-02-05 17:05:16.6926816 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Str(2025-02-05 18:05:16 [INFO] - All we have to decide is what to do with the time that is given us.)
Attributes:
-> log.file.path: Str(quotes.log)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "debug"}
Verify the gateway has handled the logs:
Windows only: Stop the Agent and Gateway to flush the files.
Check if the Gateway has written a ./gateway-logs.out file.
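A quick shell check (macOS/Linux sketch; the file is written to the current 3-filelog directory):

ls -l gateway-logs.out      # Confirm the file exists and is not empty
tail -n 1 gateway-logs.out  # Show the most recently written batch of log records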
At this point, your directory structure will appear as follows:
WORKSHOP
├── 1-agent
├── 2-gateway
├── 3-filelog
│ ├── agent.yaml # Agent Collector configuration file
│ ├── gateway-logs.out # Output from the gateway logs pipeline
│ ├── gateway-metrics.out # Output from the gateway metrics pipeline
│ ├── gateway.yaml # Gateway Collector configuration file
│ ├── log-gen.(sh or ps1) # Script to write a file with log lines
│ ├── quotes.log # File containing random log lines
│ └── trace.json # Example trace file
└── otelcol # OpenTelemetry Collector binary
Examine a log line in gateway-logs.out: Compare a log line from your file with the snippet below, which shows a single log record; your actual output will contain many more:
{"resourceLogs":[{"resource":{"attributes":[{"key":"com.splunk.sourcetype","value":{"stringValue":"quotes"}},{"key":"com.splunk/source","value":{"stringValue":"./quotes.log"}},{"key":"host.name","value":{"stringValue":"[YOUR_HOST_NAME]"}},{"key":"os.type","value":{"stringValue":"[YOUR_OS]"}},{"key":"otelcol.service.mode","value":{"stringValue":"agent"}}]},"scopeLogs":[{"scope":{},"logRecords":[{"observedTimeUnixNano":"1737231901720160600","body":{"stringValue":"2025-01-18 21:25:01 [WARN] - Do or do not, there is no try."},"attributes":[{"key":"log.file.path","value":{"stringValue":"quotes.log"}}],"traceId":"","spanId":""}]}],"schemaUrl":"https://opentelemetry.io/schemas/1.6.1"}]}{"resourceLogs":[{"resource":{"attributes":[{"key":"com.splunk/source","value":{"stringValue":"./quotes.log"}},{"key":"com.splunk.sourcetype","value":{"stringValue":"quotes"}},{"key":"host.name","value":{"stringValue":"PH-Windows-Box.hagen-ict.nl"}},{"key":"os.type","value":{"stringValue":"windows"}},{"key":"otelcol.service.mode","value":{"stringValue":"agent"}}]},"scopeLogs":[{"scope":{},"logRecords":[{"observedTimeUnixNano":"1737231902719133000","body":{"stringValue":"2025-01-18 21:25:02 [DEBUG] - One does not simply walk into Mordor."},"attributes":[{"key":"log.file.path","value":{"stringValue":"quotes.log"}}],"traceId":"","spanId":""}]}],"schemaUrl":"https://opentelemetry.io/schemas/1.6.1"}]}
{"resourceLogs":[{"resource":{"attributes":[{"key":"com.splunk/source","value":{"stringValue":"./quotes.log"}},{"key":"com.splunk.sourcetype","value":{"stringValue":"quotes"}},{"key":"host.name","value":{"stringValue":"[YOUR_HOST_NAME]"}},{"key":"os.type","value":{"stringValue":"[YOUR_OS]"}},{"key":"otelcol.service.mode","value":{"stringValue":"agent"}}]},"scopeLogs":[{"scope":{},"logRecords":[{"observedTimeUnixNano":"1737231902719133000","body":{"stringValue":"2025-01-18 21:25:02 [DEBUG] - One does not simply walk into Mordor."},"attributes":[{"key":"log.file.path","value":{"stringValue":"quotes.log"}}],"traceId":"","spanId":""}]}],"schemaUrl":"https://opentelemetry.io/schemas/1.6.1"}]}
Examine the resourceLogs section: Verify that the files include the same attributes we observed in the traces and metrics sections.
You may also have noticed that every log record contains empty placeholders for "traceId":"" and "spanId":"". The Filelog receiver populates these fields only if trace context is already present in the log line.
For example, if the log line is generated by an application instrumented with an OpenTelemetry instrumentation library, these fields will already be included and will not be overwritten.
Stop the Agent, the Gateway, and the quotes-generating script using Ctrl-C.