Prerequisites
- Create an Axiom account.
- Create a dataset in Axiom where you send your data.
- Create an API token in Axiom with permissions to update the dataset you have created.
Configure Log4j
Log4j is a flexible and powerful logging framework for Java applications. To use Log4j in your project, add the necessary dependencies to your `pom.xml` file. The dependencies required for Log4j include `log4j-core`, `log4j-api`, and `log4j-slf4j2-impl` for logging capability, and `jackson-databind` for JSON support.
- Create a new Maven project.
- Open the `pom.xml` file and replace its contents with a POM that includes the necessary Log4j dependencies and configures the Maven Shade plugin to create an executable JAR file (see the dependency sketch after this list).
- Create a new file named `log4j2.xml` in your root directory (a sketch also follows this list). This configuration sets up two appenders:
  - A Socket appender that sends logs to Fluentd, running on `localhost:24224`. It uses JSON format for the log messages, which makes it easier to parse and analyze the logs later in Axiom.
  - A Console appender that prints logs to the standard output.
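Since the referenced file contents aren't reproduced above, here is a minimal sketch of the `pom.xml` dependencies. The versions are assumptions rather than recommendations; pin the current releases and add the Maven Shade plugin to your `<build>` section as described:

```xml
<!-- Dependency sketch only; versions are assumptions, not pinned recommendations -->
<dependencies>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.20.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.20.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j2-impl</artifactId>
    <version>2.20.0</version>
  </dependency>
  <!-- Required by Log4j's JsonLayout -->
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.15.2</version>
  </dependency>
</dependencies>
```

A `log4j2.xml` matching the description above might look like the following; the root log level and the console pattern are assumptions:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Sends one JSON object per line to Fluentd over TCP -->
    <Socket name="Fluentd" host="localhost" port="24224" protocol="TCP">
      <JsonLayout compact="true" eventEol="true" properties="true"/>
    </Socket>
    <!-- Prints events to standard output -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="debug">
      <AppenderRef ref="Fluentd"/>
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```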
Set log level
Log4j supports various log levels, allowing you to control the verbosity of your logs. The main log levels, in order of increasing severity, are the following:
- `TRACE`: Fine-grained information for debugging.
- `DEBUG`: General debugging information.
- `INFO`: Informational messages.
- `WARN`: Indications of potential problems.
- `ERROR`: Error events that might still allow the app to continue running.
- `FATAL`: Severe error events that are likely to cause the app to abort.
Create a new file named `App.java` in the `src/main/java/com/example` directory. The app should log messages at each of these levels and add contextual information to the log events using `ThreadContext`.
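A minimal sketch of such a file follows; the message text and the `ThreadContext` keys (`userId`, `requestId`) are illustrative placeholders:

```java
package com.example;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class App {
    private static final Logger logger = LogManager.getLogger(App.class);

    public static void main(String[] args) {
        // Context fields are attached to every event logged on this thread
        // (and included in the JSON sent to Fluentd when JsonLayout uses properties="true").
        ThreadContext.put("userId", "12345");
        ThreadContext.put("requestId", "abc-123");

        logger.trace("This is a TRACE message");
        logger.debug("This is a DEBUG message");
        logger.info("Application started");
        logger.warn("This is a WARN message");
        logger.error("This is an ERROR message");
        logger.fatal("This is a FATAL message");

        // Clear the per-thread context when the unit of work is done.
        ThreadContext.clearAll();
    }
}
```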
Forward log messages to Fluentd
Fluentd is a popular open-source data collector used to forward logs from Log4j to Axiom. The Log4j configuration is already set up to send logs to Fluentd using the Socket appender. Fluentd acts as a unified logging layer, allowing you to collect, process, and forward logs from various sources to different destinations.
Configure the fluentd.conf file
To configure Fluentd, create a new file named `fluentd.conf` in your project root directory.
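The exact configuration isn't shown here, so the following is one possible sketch. It uses Fluentd's built-in `tcp` input to parse the JSON lines emitted by the Socket appender (the list below refers to this as the forward input) and the built-in `http` output to post to Axiom's ingest API. The endpoint URL, tag, and buffer settings are assumptions to adapt to your setup:

```conf
# Receive newline-delimited JSON events from the Log4j Socket appender
<source>
  @type tcp
  port 24224
  bind 0.0.0.0
  tag java.log4j
  <parse>
    @type json
  </parse>
</source>

# Send everything tagged java.log4j to Axiom over HTTP
<match java.log4j>
  @type http
  endpoint https://api.axiom.co/v1/datasets/DATASET_NAME/ingest
  headers {"Authorization":"Bearer API_TOKEN"}
  json_array true
  <format>
    @type json
  </format>
  <buffer>
    flush_interval 5s
  </buffer>
</match>
```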
- Replace `API_TOKEN` with the Axiom API token you have generated. For added security, store the API token in an environment variable.
- Replace `DATASET_NAME` with the name of the Axiom dataset where you want to send data.
The configuration does the following:
- Set up a forward input plugin to receive logs from Log4j.
- Add a `java.log4j` tag to all logs.
- Forward the logs to Axiom using the HTTP output plugin.
Create the Dockerfile
To simplify the deployment of the Java app and Fluentd, use Docker. Create a new file named `Dockerfile` in your project root directory.
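One possible multi-stage sketch along these lines is shown below; the base images, the JAR name, and the Fluentd installation steps are assumptions, not the guide's original file:

```dockerfile
# Stage 1: build the app with Maven (image tags are assumptions; pin your own)
FROM maven:3.9-eclipse-temurin-17 AS build
WORKDIR /build
COPY pom.xml .
COPY src ./src
RUN mvn -q package

# Stage 2: runtime with Java and Fluentd
FROM eclipse-temurin:17-jre
# Install Ruby and Fluentd; a production setup may need extra packages or a dedicated Fluentd image
RUN apt-get update \
    && apt-get install -y --no-install-recommends ruby ruby-dev build-essential \
    && gem install fluentd --no-document \
    && apt-get clean && rm -rf /var/lib/apt/lists/*
WORKDIR /app
# Adjust the JAR name to match your artifactId and version
COPY --from=build /build/target/axiom-log4j-example.jar /app/app.jar
COPY fluentd.conf /app/fluentd.conf
COPY log4j2.xml /app/log4j2.xml

# Startup script: start Fluentd in the background, then run the Java app
RUN printf '#!/bin/sh\nfluentd -c /app/fluentd.conf &\nsleep 5\nexec java -Dlog4j.configurationFile=/app/log4j2.xml -jar /app/app.jar\n' > /app/start.sh \
    && chmod +x /app/start.sh
CMD ["/app/start.sh"]
```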
The Dockerfile does the following:
- Build the Java app.
- Set up a runtime environment with Java and Fluentd.
- Copy the necessary files and configurations.
- Create a startup script to run both Fluentd and the Java app.
Build and run the Dockerfile
- To build the Docker image, run the following command in your project root directory (both commands are sketched after this list).
- Run the container with the following command.
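For example, assuming the image is tagged `axiom-log4j` (the tag is arbitrary):

```sh
# Build the image from the project root
docker build -t axiom-log4j .

# Run the container. If your fluentd.conf reads the token from the environment,
# pass it with -e, for example: docker run --rm -e API_TOKEN=... axiom-log4j
docker run --rm axiom-log4j
```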
View logs in Axiom
Now that your app is running and sending logs to Axiom, you can view them in the Axiom dashboard. Log in to your Axiom account and go to the dataset you specified in the Fluentd configuration. Logs appear in real time, with the various log levels and context information added.
Best practices for logging in Log4j
- Use appropriate log levels: Reserve ERROR and FATAL for serious issues, use WARN for potential problems, and INFO for general app flow.
- Include context: Add relevant information to your logs using ThreadContext or by including important variables in your log messages.
- Use structured logging: Log in JSON format to make the logs easier to parse and, later, analyze using APL.
- Log actionable information: Include enough detail in your logs to understand and potentially reproduce issues.
- Use parameterized logging: Instead of string concatenation, use Log4j’s support for parameterized messages to improve performance (see the example after this list).
- Configure appenders appropriately: Use asynchronous appenders for better performance in high-throughput scenarios.
- Regularly review and maintain your logs: Periodically check your logging configuration and the logs themselves to ensure they’re providing value.
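As an illustration of the parameterized-logging point above (the logger and variable names are hypothetical):

```java
// Assumes a Log4j Logger obtained via LogManager.getLogger(...)
// Avoid: the message string is built even when INFO logging is disabled
logger.info("User " + userId + " logged in from " + ipAddress);

// Prefer: placeholders are substituted only if the event is actually logged
logger.info("User {} logged in from {}", userId, ipAddress);
```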