Red Hat

Custom Filters in WildFly

What is a log filter?

A log filter adds fine-grained control over which log messages get published. In the case of WildFly this is a java.util.logging.Filter. As of WildFly 18 you can use custom log filters.
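As a minimal illustration of the contract (plain java.util.logging, no WildFly involved, class and logger names are illustrative), a filter is a single-method interface that accepts or rejects each LogRecord:

```java
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class FilterDemo {
    public static void main(String[] args) {
        Logger logger = Logger.getLogger("demo");
        // Accept only records at WARNING or above
        logger.setFilter(record -> record.getLevel().intValue() >= Level.WARNING.intValue());
        System.out.println(logger.getFilter().isLoggable(new LogRecord(Level.INFO, "hi")));     // false
        System.out.println(logger.getFilter().isLoggable(new LogRecord(Level.SEVERE, "oops"))); // true
    }
}
```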

Creating a Filter

To create a filter you must implement the java.util.logging.Filter interface. The filter must be packaged in a module and can be attached to a logger or a handler via the filter-spec attribute. A custom filter can also be combined with a filter expression on the filter-spec attribute, for example any(match(".*WELD.*"), myCustomFilter).
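As a sketch of how that combined expression might be applied (the logger category and filter name here are illustrative), the filter-spec attribute can be written from the CLI:

```
/subsystem=logging/logger=org.example:write-attribute(name=filter-spec, value="any(match(\".*WELD.*\"), myCustomFilter)")
```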

The example below filters log messages based on the current thread’s context class loader. It takes advantage of WildFly’s use of JBoss Modules to get the module name from the class loader. The name is then checked to see whether it matches the pattern configured on the filter.

Example Filter

import java.util.logging.Filter;
import java.util.logging.LogRecord;
import java.util.regex.Pattern;

import org.jboss.logging.MDC;
import org.jboss.modules.ModuleClassLoader;

public class ClassLoaderFilter implements Filter {

    // Set by the logging subsystem from the filter resource's "pattern" property
    private volatile Pattern pattern;

    public void setPattern(final String pattern) {
        this.pattern = pattern == null ? null : Pattern.compile(pattern);
    }

    @Override
    public boolean isLoggable(final LogRecord record) {
        final ClassLoader cl = getClassLoader();
        final String value;
        if (cl instanceof ModuleClassLoader) {
            value = ((ModuleClassLoader) cl).getName();
        } else {
            value = String.valueOf(cl);
        }
        if (pattern == null || pattern.matcher(value).matches()) {
            MDC.put("moduleName", value);
            return true;
        }
        MDC.remove("moduleName");
        return false;
    }

    private static ClassLoader getClassLoader() {
        return Thread.currentThread().getContextClassLoader();
    }
}
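The pattern check the filter performs can be tried in isolation; a minimal sketch using the pattern configured later in this post (the module names here are illustrative):

```java
import java.util.regex.Pattern;

public class PatternCheck {
    public static void main(String[] args) {
        Pattern pattern = Pattern.compile(".*deployment\\.app.*");
        // A deployment's module name contains "deployment." plus the archive name
        System.out.println(pattern.matcher("deployment.app.war").matches()); // true
        // Messages from other modules are rejected
        System.out.println(pattern.matcher("org.jboss.logging").matches()); // false
    }
}
```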

Adding a Filter

A filter is added to WildFly by first creating a module from the library that contains the filter. Next we create a filter resource on the logging subsystem that references the new module and the filter class. Finally the filter can be assigned to a logger or handler resource.

Example CLI Commands

module add --name=org.jboss.example.filter --resources=/path/to/log-filter.jar --dependencies=org.jboss.modules,java.logging,org.jboss.logging
/subsystem=logging/json-formatter=json:add(exception-output-type=formatted, date-format="yyyy-MM-dd'T'HH:mm:ss.SSSZZZZZ")
/subsystem=logging/filter=clFilter:add(module=org.jboss.example.filter, class=org.jboss.example.filter.ClassLoaderFilter, properties={pattern=".*deployment\.app.*"})
/subsystem=logging/file-handler=DEPLOYMENT:add(file={relative-to=jboss.server.log.dir, path=deployment.log}, level=TRACE, append=false, autoflush=true, named-formatter=json, filter-spec=clFilter)
/subsystem=logging/root-logger=ROOT:add-handler(name=DEPLOYMENT)

In the example above we create a filter which uses the pattern .*deployment\.app.*. The filter matches this pattern against the module name taken from the current thread’s context class loader and accepts only messages whose module name matches. In our case this means only messages associated with our deployment are logged.

We then assign the filter to a new file handler that uses the JSON formatter. Finally we add the file handler to the root logger.

Example Project

An example project can be found at https://github.com/jamezp/wildfly-examples/tree/master/custom-log-filter. To use the example project, download the source and run mvn clean wildfly:run. Once the server has started and the application is deployed, you can access the example at http://localhost:8080/app.

You should initially see some log messages that were logged during the deployment process. You can then log a custom message or start a job which logs a message every configured number of seconds.

Centralized Logging for WildFly with the ELK Stack

The ELK stack (elasticsearch, logstash, and kibana) can be used for centralized logging. It’s not the intention of this post to be a tutorial on how to configure logstash. We will go through a basic logstash configuration, then configure WildFly to send log messages to logstash.

Download and Configure logstash

First we need to download logstash. Once the download is complete simply extract logstash from the archive.

Next we need to create a configuration file. In the logstash directory create a file called logstash-wildfly.conf and add the following content to it.

input {
  tcp {
    port => 8000
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    # Use the embedded elasticsearch for convenience
    embedded => true
    protocol => "http"
  }
}

Start logstash with the configuration file we just created: ./bin/logstash agent -f logstash-wildfly.conf. With the example configuration above, logstash listens on port 8000. Make note of the port you use, as we’ll need it later when configuring WildFly.

Configure WildFly

If you don’t have a local install of WildFly, you’ll want to download a recent version. In this example I’ll be using WildFly 9.0.1.Final; however, any other version should work the same.

We also need to download the jboss-logmanager-ext library so that we can install it as a module. This library includes the formatter and handler we’ll use for logging.

Start up WildFly in admin-only mode so we can configure logging: $JBOSS_HOME/bin/standalone.sh --admin-only. Once the server is running, start a CLI console with $JBOSS_HOME/bin/jboss-cli.sh -c to install the module and configure logging. The following commands can be entered manually or placed in a CLI script.

batch
# Add the module, replacing the path in the resources attribute with the path where you downloaded the jboss-logmanager-ext library
module add --name=org.jboss.logmanager.ext --dependencies=org.jboss.logmanager,javax.json.api,javax.xml.stream.api --resources=~/tmp/jboss-logmanager-ext-1.0.0.Alpha3.jar

# Add the logstash formatter
/subsystem=logging/custom-formatter=logstash:add(class=org.jboss.logmanager.ext.formatters.LogstashFormatter,module=org.jboss.logmanager.ext)

# Add a socket-handler using the logstash formatter. Replace the hostname and port to the values needed for your logstash install
/subsystem=logging/custom-handler=logstash-handler:add(class=org.jboss.logmanager.ext.handlers.SocketHandler,module=org.jboss.logmanager.ext,named-formatter=logstash,properties={hostname=localhost, port=8000})

# Add the new handler to the root-logger
/subsystem=logging/root-logger=ROOT:add-handler(name=logstash-handler)

# Reload the server which will boot the server into normal mode as well as write messages to logstash
:reload
run-batch

With these changes WildFly should be writing to logstash. You can view the log messages from logstash with kibana. With the defaults we used, you should be able to start kibana with bin/kibana and the default configuration. My dashboard looks like the following.

(Screenshot: kibana dashboard)

Conclusion

If you’re already using the ELK stack for centralized logging, adding WildFly to the aggregation is rather simple. If you’re just looking for a way to view and filter log messages, using the ELK stack with WildFly could be a good fit as well.

One thing to note: if you’re seeing performance issues, or you’re writing to a remote logstash server, you may want to use an async-handler.
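As a sketch, the socket handler configured above could be wrapped in an async-handler (the handler name and queue-length here are illustrative choices):

```
/subsystem=logging/async-handler=async-logstash:add(queue-length=512, subhandlers=[logstash-handler])
/subsystem=logging/root-logger=ROOT:remove-handler(name=logstash-handler)
/subsystem=logging/root-logger=ROOT:add-handler(name=async-logstash)
```

The async-handler queues records and writes them to its subhandlers on a separate thread, which keeps slow network writes off the application threads.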
