All in - All SSL (Secure Graylog)

If possible, make your environment as secure as you can - even the internal applications. You never know.

The best approach: write down what you have and research how to make each piece as secure as possible. I want my Graylog to use only secured connections to every component; adding secure communication to the collectors can come later - one step after another. All SSL means a secure connection between Graylog and Elasticsearch and between Graylog and MongoDB, but also that Graylog uses a secure connection to itself and that the browser speaking to the API uses a secure connection.

How can I secure all this?

Using an already present CA is the first option, creating your own CA the second. Something like shadowCA can help you with that - but multiple ways are possible. If no CA is given, can you use something like Let's Encrypt?

I'll use that last option in my example - because it is possible.

When I want to use Let's Encrypt, my ACME client of choice is my best buddy, so that is what I use in this description. The following will not be a copy & paste guide - more something you can follow to achieve the same. Thinking is allowed.

In my internal domain, Graylog and MongoDB will share one server and Elasticsearch will get its own - just as a reference setup. The domain's DNS is reachable from anywhere, so Let's Encrypt is able to verify all needed certificates via DNS.


Elasticsearch

Since the release of Elasticsearch OSS this is my installation method - using the official Elasticsearch version that contains only code under the Apache 2.0 License. How to use it is nearly undocumented by Elastic. The installed package is called elasticsearch-oss and you need to choose a different repository. For CentOS this is needed:

rpm --import

and /etc/yum.repos.d/elasticsearch.repo must contain:

name=Elasticsearch repository for 6.x packages

Now you are able to install the package elasticsearch-oss without any issues with the Elastic License.
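For reference - the repo stanza above is missing a few lines. Going by Elastic's documentation for the 6.x OSS packages, a complete /etc/yum.repos.d/elasticsearch.repo looks roughly like this (verify the URLs against the current docs before use):

```ini
[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/oss-6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
```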

Search Guard

The Search Guard plugin can be used to secure your Elasticsearch with authentication and much more - I'll only use it for the transport security.

Install is straight from the documentation of search-guard:

/usr/share/elasticsearch/bin/elasticsearch-plugin install -b com.floragunn:search-guard-6:6.7.1-24.3

And some information added to the elasticsearch.yml:

# searchguard

searchguard.ssl_only: true
searchguard.ssl.transport.pemkey_filepath: /etc/elasticsearch/ssl/
searchguard.ssl.transport.pemcert_filepath: /etc/elasticsearch/ssl/
searchguard.ssl.transport.pemtrustedcas_filepath: /etc/elasticsearch/ssl/
searchguard.ssl.transport.enforce_hostname_verification: true

searchguard.ssl.http.enabled: true
searchguard.ssl.http.pemkey_filepath: /etc/elasticsearch/ssl/
searchguard.ssl.http.pemcert_filepath: /etc/elasticsearch/ssl/
searchguard.ssl.http.pemtrustedcas_filepath: /etc/elasticsearch/ssl/

Now the certificates need to be created and placed in the right location:

# create dir
mkdir /etc/elasticsearch/ssl/

# issue the certificate --issue --dns dns_nsupdate -d

# create the needed type of --toPkcs8 -d

# Copy the `pkcs8` key manually
cp /root/ /etc/elasticsearch/ssl/

# install cert --install-cert  -d --cert-file /etc/elasticsearch/ssl/ --key-file /etc/elasticsearch/ssl/ --fullchain-file /etc/elasticsearch/ssl/ --ca-file /etc/elasticsearch/ssl/  --reloadcmd "service elasticsearch restart"
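Search Guard expects the private key in PKCS#8 format, which is what the --toPkcs8 step above produces. If your client cannot do that, plain openssl can - a minimal sketch using a throwaway key and hypothetical file names (key.pem, key-pkcs8.pem):

```shell
# generate a throwaway RSA key just for demonstration
# (in the real setup this is your Let's Encrypt private key)
openssl genrsa -out key.pem 2048

# convert it into an unencrypted PKCS#8 key, which Search Guard can read
openssl pkcs8 -topk8 -nocrypt -in key.pem -out key-pkcs8.pem

# a PKCS#8 key starts with "-----BEGIN PRIVATE KEY-----"
head -n 1 key-pkcs8.pem
```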

After the above your Elasticsearch is only available via TLS/HTTPS - no user auth is configured, only the transport is secured. At a minimum I would use iptables to make Elasticsearch reachable only from the Graylog hosts.
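A minimal iptables sketch for that restriction - 192.0.2.10 is a placeholder for your Graylog host's address, and these rules are not persistent across reboots:

```shell
# allow the Graylog host to reach the Elasticsearch HTTP (9200) and transport (9300) ports
iptables -A INPUT -p tcp -s 192.0.2.10 --dport 9200 -j ACCEPT
iptables -A INPUT -p tcp -s 192.0.2.10 --dport 9300 -j ACCEPT
# drop everything else on those ports
iptables -A INPUT -p tcp --dport 9200 -j DROP
iptables -A INPUT -p tcp --dport 9300 -j DROP
```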


MongoDB

Following the MongoDB SSL guide is straightforward - if you know what type of certificate MongoDB needs.

First create the certificate on the Graylog server: --issue --dns dns_nsupdate -d

At first just take the certificates; installation and auto-renew/update will be configured later.

# mongodb PEM create
cat /root/ /root/ > /etc/ssl/mongo.pem
chmod 600 /etc/ssl/mongo.pem
chown mongodb:mongodb /etc/ssl/mongo.pem
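What MongoDB wants as its PEMKeyFile is the certificate followed by its private key in one file - exactly what the cat above builds. A self-contained demonstration with a throwaway self-signed certificate (demo.key, demo.crt and mongo-demo.pem are hypothetical names; in the real setup you concatenate the Let's Encrypt cert and key instead):

```shell
# create a throwaway key and self-signed certificate just for demonstration
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=demo.example.org" \
  -keyout demo.key -out demo.crt

# MongoDB's PEMKeyFile: certificate first, then the key, in a single file
cat demo.crt demo.key > mongo-demo.pem
chmod 600 mongo-demo.pem

# the combined file contains exactly one certificate and one key block
grep -c "BEGIN" mongo-demo.pem   # → 2
```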

The mongod.conf needs the following to allow only SSL connections.

# SSL settings
net:
  ssl:
    mode: requireSSL
    PEMKeyFile: /etc/ssl/mongo.pem

Now, after a restart, MongoDB only accepts secured connections.


Graylog

The last and final step is to make the certificates available to Graylog and to copy the certificates after they have been renewed. I'll use a simple script for that - including the restart of the services:

# Save as /usr/local/bin/
# this runs after the cert update and restarts the services
# no failure checks or anything else implemented.

# mongodb PEM create
cat /root/ /root/ > /etc/ssl/mongo.pem
chmod 600 /etc/ssl/mongo.pem
chown mongod:mongod /etc/ssl/mongo.pem
service mongod restart

# graylog pem / key
cp /root/ /etc/graylog/server/ssl/
cp /root/ /etc/graylog/server/ssl/
service graylog-server restart

Important for Graylog are the changed settings in server.conf: use secured transport to Elasticsearch and MongoDB, but also listen on HTTPS itself.

http_publish_uri =
http_enable_tls = true
http_tls_cert_file = /etc/graylog/server/ssl/
http_tls_key_file = /etc/graylog/server/ssl/
elasticsearch_hosts =
mongodb_uri = mongodb://
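With placeholder hostnames filled in (graylog.example.org and es.example.org - yours will differ), that section of server.conf would look roughly like the following; the certificate file names are assumptions, and note that Graylog expects the TLS key as an unencrypted PKCS#8 file:

```
http_publish_uri = https://graylog.example.org:9000/
http_enable_tls = true
http_tls_cert_file = /etc/graylog/server/ssl/fullchain.pem
http_tls_key_file = /etc/graylog/server/ssl/privkey-pkcs8.pem
elasticsearch_hosts = https://es.example.org:9200
mongodb_uri = mongodb://graylog.example.org:27017/graylog?ssl=true
```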

In my installation the hostname also resolves to the local address - so I can use the hostname to connect to localhost. This step is needed if you want to use Let's Encrypt while Graylog is not exposed on external interfaces.

Finally, install the certificates so they can be used in other places and so MongoDB and Graylog are notified on renewal. --install-cert -d \
	--key-file /etc/ssl/http/ \
        --fullchain-file /etc/ssl/http/ \
        --reloadcmd     "/usr/local/bin/"

All of the above only secures the transport between the three parts - you will want to think about authentication too, and adding secure transport to your inputs might be another topic.
