Local elasticsearch
Setup docker
sudo apt-get install docker.io docker-compose
Add your user to the docker group so you can use Docker without sudo:
sudo usermod -aG docker $USER
Log out and back in for the group change to take effect.
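To verify the setup, you can run Docker's standard smoke-test image; if it prints its greeting without permission errors, your user is correctly in the docker group:
docker run hello-world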
Elasticsearch
Single instance
We will reuse an existing Docker image, so no custom Dockerfile is needed.
Create a docker-compose.yml file with the following content:
version: '3'
services:
  elasticsearch:
    container_name: elasticsearch-6.2.2
    image: "docker.elastic.co/elasticsearch/elasticsearch:6.2.2"
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      - discovery.type=single-node
Note: This is a single node instance but we could extend this to simulate a real cluster.
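As a minimal sketch of that extension, a second node could be declared as another service (the name elasticsearch2 is an arbitrary example); the first node's discovery.type=single-node line would have to be removed so the two nodes can form a cluster through zen discovery:
  elasticsearch2:
    container_name: elasticsearch2-6.2.2
    image: "docker.elastic.co/elasticsearch/elasticsearch:6.2.2"
    environment:
      - discovery.zen.ping.unicast.hosts=elasticsearch
A real cluster setup would also pin cluster.name, heap sizes (ES_JAVA_OPTS), and ulimits on every node.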
build and run
Then build and run the instance:
docker-compose up --build
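If you prefer to keep your terminal free, docker-compose up -d runs the same thing detached.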
You should now have an instance running at http://localhost:9200.
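You can check it from the command line: the root endpoint returns a JSON banner with the node name and version, and _cluster/health reports the cluster status:
curl http://localhost:9200
curl 'http://localhost:9200/_cluster/health?pretty'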
Optional tool
cerebro
It's a web application that lets you inspect and administer your cluster's state.
Add a cerebro/Dockerfile:
FROM openjdk:8-jre
ENV CEREBRO_VERSION 0.7.2
# Fetch and unpack the cerebro release into /opt
RUN wget https://github.com/lmenezes/cerebro/releases/download/v${CEREBRO_VERSION}/cerebro-${CEREBRO_VERSION}.tgz
RUN tar xvf cerebro-${CEREBRO_VERSION}.tgz
RUN mkdir cerebro-${CEREBRO_VERSION}/logs
RUN mv cerebro-${CEREBRO_VERSION} /opt/
EXPOSE 1234
WORKDIR /opt/cerebro-${CEREBRO_VERSION}
CMD ["bin/cerebro", "-Dhttp.port=1234"]
Now edit the initial docker-compose.yml and add the following under services:
  cerebro:
    container_name: cerebro
    build: cerebro
    ports:
      - "1234:1234"
Now:
$ docker-compose up
should start both your elasticsearch instance and your cerebro instance.
Check your local cerebro instance running at http://localhost:1234/. You can now connect it to your elasticsearch 'cluster'; since cerebro runs on the same compose network, the node is reachable by its container name (http://elasticsearch:9200).
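So that cerebro has something to display, you can index a sample document (the index name test and the field message here are arbitrary examples):
curl -XPUT 'http://localhost:9200/test/_doc/1' \
  -H 'Content-Type: application/json' \
  -d '{"message": "hello"}'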
kibana
It's a free and open-source data visualization tool for Elasticsearch.
Edit the initial docker-compose.yml and add the following under services:
  kibana:
    image: "docker.elastic.co/kibana/kibana:6.2.2"
    ports:
      - "5601:5601"
Note: the Kibana version must match your Elasticsearch version, hence the pinned image.
Now:
$ docker-compose up
should start your elasticsearch, kibana, and cerebro instances.
Check your local kibana instance running at http://localhost:5601/.
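Kibana's Dev Tools console lets you run queries against the cluster; assuming the sample document indexed earlier, the equivalent check with curl is:
curl 'http://localhost:9200/test/_search?q=message:hello&pretty'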