Elasticsearch-Logstash-Kibana (ELK) LoggerProvider for .NET Logging

Note: The Elasticsearch logger provider has been moved to the ECS DotNet project.

Find the latest version here: https://github.com/elastic/ecs-dotnet/blob/master/src/Elasticsearch.Extensions.Logging/ReadMe.md

The NuGet package is here: https://www.nuget.org/packages/Elasticsearch.Extensions.Logging/1.6.0-alpha1

To add the package to your project:
dotnet add package Elasticsearch.Extensions.Logging --version 1.6.0-alpha1

This ElasticsearchLoggerProvider, for Microsoft.Extensions.Logging, writes directly to Elasticsearch, using the Elasticsearch Common Schema (ECS), with full semantic logging of structured data from message and scope values.

To use, add the Essential.LoggerProvider.Elasticsearch package to your project:

PS> dotnet add package Essential.LoggerProvider.Elasticsearch

Then add the logger provider to your host builder, and the default configuration will write to a local Elasticsearch service:

using Essential.LoggerProvider;

// ...

    .ConfigureLogging((hostContext, loggingBuilder) =>
    {
        loggingBuilder.AddElasticsearch();
    })
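
For context, here is a minimal sketch of a complete worker-service entry point; the Worker class is an illustrative hosted service, not part of the package:

using Essential.LoggerProvider;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public static class Program
{
    public static void Main(string[] args)
    {
        Host.CreateDefaultBuilder(args)
            .ConfigureLogging((hostContext, loggingBuilder) =>
            {
                // The default configuration writes to a local Elasticsearch service
                loggingBuilder.AddElasticsearch();
            })
            // Worker is an illustrative BackgroundService that writes log events
            .ConfigureServices(services => services.AddHostedService<Worker>())
            .Build()
            .Run();
    }
}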

Once you have logged some events, open up Kibana (e.g. http://localhost:5601/) and define an index pattern for dotnet-* with the time filter @timestamp.

You can then discover the log events for the index. Some useful columns to add are log.level, log.logger, event.code, message, tags, and process.thread.id.

Structured message and scope values are logged as labels.* custom key/value pairs, e.g. labels.CustomerId.
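
For example, a minimal sketch of a log call that would produce such labels (the CustomerId value and event id are illustrative, not part of the provider):

using Microsoft.Extensions.Logging;

public class CustomerService
{
    private readonly ILogger<CustomerService> _logger;

    public CustomerService(ILogger<CustomerService> logger) => _logger = logger;

    public void Update(int customerId)
    {
        // Both the scope value and the message argument end up as labels.CustomerId
        using (_logger.BeginScope("{CustomerId}", customerId))
        {
            _logger.LogWarning(5000, "Could not update customer {CustomerId}.", customerId);
        }
    }
}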

Example: Elasticsearch via Kibana

Elasticsearch Common Schema (ECS)

There are several existing ways to get your logging into Elasticsearch (Serilog, log4net, NLog, etc.); however, they each have a different way of specifying fields, so you can't query across them if you have multiple applications (and why does Serilog default to the "logstash-*" index?). They are also not standalone logger providers, but each part of their own framework with its own quirks (e.g. log4net doesn't support event IDs, and Serilog treats them as fields).

So, what do you do when there are three different ways of doing something? Create a standard, of course, so you then have four different ways. Except in this case the standard already exists: the Elasticsearch Common Schema (ECS).

The ElasticsearchLoggerProvider is a standalone provider, with the only dependencies being .NET and the Elasticsearch.Net low-level client, and it follows the Elasticsearch Common Schema to structure the data sent to the server.

For example, the core details are logged in the following ECS fields:

@timestamp - DateTimeOffset when the message was logged, including local offset.
message - The formatted log message and arguments.
tags - Custom tags, e.g. [ "Staging", "Priority" ].
event.name - The name of the logged EventId, e.g. ErrorProcessingCustomer.
event.code - The numeric value of the EventId, e.g. 5000.
event.severity - The syslog severity corresponding to the log level (allows numerical comparison).
log.level - The log level: Critical, Error, Warning, Information, Debug, or Trace.
log.logger - The category name (namespace and class) of the logger, e.g. HelloElasticsearch.Worker.
error.* - Exception type, message, and stack trace.
labels.* - Custom key/value pairs.
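
As a hedged illustration of how a single log call maps onto these fields (the event id, name, and message are made up; 3 is the standard syslog severity for Error):

using System;
using Microsoft.Extensions.Logging;

public static class EcsFieldExample
{
    public static void LogFailure(ILogger logger, Exception exception, int customerId)
    {
        // -> event.code = 5000, event.name = "ErrorProcessingCustomer"
        // -> log.level = "Error", event.severity = 3 (syslog Error)
        // -> message = the formatted text, labels.CustomerId = the value
        // -> error.* = the exception type, message, and stack trace
        logger.LogError(new EventId(5000, "ErrorProcessingCustomer"), exception,
            "Error processing customer {CustomerId}.", customerId);
    }
}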

These fields may not be compatible with other .NET loggers (until they implement ECS), but will be compatible with other log sources, such as the Elastic Beats framework.

Version 1.0 (actually 1.1.1)

This is only the first release of the library, so it has a bunch of features but is by no means complete. It also doesn't have many performance optimisations yet: it does use an asynchronous queue for processing messages, but it doesn't batch them when sending to Elasticsearch. (It also creates an in-memory object that it then serializes, rather than just writing properties directly.)
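
For the curious, the general shape of that pattern is something like the following sketch (not the library's actual code), using System.Threading.Channels:

using System;
using System.Threading.Channels;
using System.Threading.Tasks;

// Producers enqueue entries without blocking; a single background
// consumer drains the queue and sends entries one at a time
// (batching would send several per request instead).
public class LogQueue<TEntry>
{
    private readonly Channel<TEntry> _channel = Channel.CreateUnbounded<TEntry>();

    public LogQueue(Func<TEntry, Task> send)
    {
        _ = Task.Run(async () =>
        {
            await foreach (var entry in _channel.Reader.ReadAllAsync())
            {
                await send(entry);
            }
        });
    }

    public void Enqueue(TEntry entry) => _channel.Writer.TryWrite(entry);
}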

Performance aside, probably the first improvement I'll need to add is credentials support for the connection to the Elasticsearch server. Currently it just uses a direct connection, which is probably not a configuration I would recommend in production.

BTW, the version is 1.1.1 because it's part of my Essential.Logging suite, in a single repository, using GitVersion.Tool for versioning, which means a single version across all libraries in the project. So version 1.0.0 was the release of the rolling file logger provider, 1.1.0 added Elasticsearch, and the extra .1 was a patch bump because I didn't add Elasticsearch into the build script 🙂

Try it out

I'd love people to try out the logger and give some feedback.

There is an Elasticsearch example that walks through getting everything up and running, including, if you have Docker installed, a Docker Compose file to easily start local Elasticsearch and Kibana services.
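
If you'd rather hand-roll the services, a compose file along these lines should work (a minimal sketch only; image versions are illustrative, and the actual file is in the example repository):

version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.6.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.6.0
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch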

Warning: I've been developing on Ubuntu lately, so I'm not sure how well the Docker setup will work on Windows; it will at least need to be in Linux containers mode (which requires Hyper-V).

Any feedback on running it on Windows will be especially welcome.
