Hello Elasticsearch Docker

A quick example of how to run Elasticsearch with Docker. For the demo I use the official Elasticsearch image from Docker Hub.

At the time of writing, the latest stable version is 2.4. Get it with docker pull:

tan@omega:~/Sources/improved-docker-elasticsearch$ sudo docker pull elasticsearch:2.4
[sudo] password for tan:
2.4: Pulling from library/elasticsearch
8ad8b3f87b37: Pull complete
751fe39c4d34: Pull complete
b165e84cccc1: Pull complete
acfcc7cbc59b: Pull complete
04b7a9efc4af: Pull complete
b16e55fe5285: Pull complete
8c5cbb866b55: Pull complete
53c3dd7fc70d: Pull complete
3de13756a8c8: Pull complete
64be422416b7: Pull complete
b808918635ce: Pull complete
5b3ceec8c156: Pull complete
561269d0b7cc: Pull complete
0c6cf9533753: Pull complete
Digest: sha256:0805f15d9dfa3ef5a32a0f43d3aad428adc0d6fa6576ffbcdd268f5ae40a4a2e
Status: Downloaded newer image for elasticsearch:2.4
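To confirm the pull succeeded, the local image list can be checked (a quick sketch; the exact image ID and size will differ on your host):

```shell
# List the locally available elasticsearch images; the 2.4 tag should appear
sudo docker images elasticsearch
```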

We run it :-). We need a local data directory to map onto the volume defined in the image's Dockerfile (VOLUME /usr/share/elasticsearch/data), so the index data survives container restarts. Note that running plain elasticsearch uses the :latest tag; to pin the version, append :2.4.

tan@omega:~/Sources/improved-docker-elasticsearch$ mkdir data
tan@omega:~/Sources/improved-docker-elasticsearch$ sudo docker run -it -v $(pwd)/data:/usr/share/elasticsearch/data elasticsearch
[2016-08-31 19:23:20,733][INFO ][node                     ] [Dark Beast] version[2.4.0], pid[1], build[ce9f0c7/2016-08-29T09:14:17Z]
[2016-08-31 19:23:20,734][INFO ][node                     ] [Dark Beast] initializing ...
[2016-08-31 19:23:21,184][INFO ][plugins                  ] [Dark Beast] modules [reindex, lang-expression, lang-groovy], plugins [], sites []
[2016-08-31 19:23:21,202][INFO ][env                      ] [Dark Beast] using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/sda3)]], net usable_space [681.6gb], net total_space [899.8gb], spins? [possibly], types [ext4]
[2016-08-31 19:23:21,202][INFO ][env                      ] [Dark Beast] heap size [990.7mb], compressed ordinary object pointers [true]
[2016-08-31 19:23:22,486][INFO ][node                     ] [Dark Beast] initialized
[2016-08-31 19:23:22,486][INFO ][node                     ] [Dark Beast] starting ...
[2016-08-31 19:23:22,544][INFO ][transport                ] [Dark Beast] publish_address {}, bound_addresses {[::]:9300}
[2016-08-31 19:23:22,549][INFO ][discovery                ] [Dark Beast] elasticsearch/aSMijsOZRkmOyrhwrcGcUQ
[2016-08-31 19:23:25,652][INFO ][cluster.service          ] [Dark Beast] new_master {Dark Beast}{aSMijsOZRkmOyrhwrcGcUQ}{}{}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-08-31 19:23:25,677][INFO ][http                     ] [Dark Beast] publish_address {}, bound_addresses {[::]:9200}
[2016-08-31 19:23:25,677][INFO ][node                     ] [Dark Beast] started
[2016-08-31 19:23:25,763][INFO ][gateway                  ] [Dark Beast] recovered [0] indices into cluster_state
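Alternatively (my own variant, not part of the transcript above), the container can be started in the background with a name and published ports, which makes the HTTP API reachable from the host on localhost:9200; the name es-demo is chosen here purely for illustration:

```shell
# Run detached, publish the HTTP (9200) and transport (9300) ports,
# and mount the same local data directory as before
sudo docker run -d --name es-demo \
  -p 9200:9200 -p 9300:9300 \
  -v $(pwd)/data:/usr/share/elasticsearch/data \
  elasticsearch:2.4

# Follow the logs to watch the node come up
sudo docker logs -f es-demo
```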

Next we create a document by POSTing JSON to the index API:

tan@omega:~/Sources/improved-docker-elasticsearch$ curl -v -XPOST <container-ip>:9200/test/logs -d '{ "message":"Hello Elasticsearch Docker"}'
*   Trying <container-ip>...
* Connected to <container-ip> (<container-ip>) port 9200 (#0)
> POST /test/logs HTTP/1.1
> Host: <container-ip>:9200
> User-Agent: curl/7.47.0
> Accept: */*
> Content-Length: 41
> Content-Type: application/x-www-form-urlencoded
* upload completely sent off: 41 out of 41 bytes
< HTTP/1.1 201 Created
< Content-Type: application/json; charset=UTF-8
< Content-Length: 137
* Connection #0 to host left intact
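To verify the document was stored, it can be fetched back with a search (a sketch, assuming the node is reachable on port 9200; substitute your container's IP, or localhost if you published the port):

```shell
# Search the test index for the document we just created
curl -XGET '<container-ip>:9200/test/logs/_search?pretty' \
  -d '{ "query": { "match": { "message": "Hello" } } }'
```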

Elasticsearch log output

[2016-08-31 19:23:45,149][INFO ][cluster.metadata         ] [Dark Beast] [test] creating index, cause [auto(index api)], templates [], shards [5]/[1], mappings []
[2016-08-31 19:23:45,724][INFO ][cluster.routing.allocation] [Dark Beast] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[test][2], [test][1], [test][3], [test][1], [test][2], [test][3], [test][4]] ...]).
[2016-08-31 19:23:46,022][INFO ][cluster.metadata         ] [Dark Beast] [test] create_mapping [logs]

Since this is a single-node cluster, the replica shards cannot be allocated anywhere, so the cluster health stays yellow. We stop it with CTRL+C:

^C[2016-08-31 19:27:21,282][INFO ][node                     ] [Dark Beast] stopping ...
[2016-08-31 19:27:21,496][INFO ][node                     ] [Dark Beast] stopped
[2016-08-31 19:27:21,496][INFO ][node                     ] [Dark Beast] closing ...
[2016-08-31 19:27:21,518][INFO ][node                     ] [Dark Beast] closed
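If the container runs detached instead of in the foreground, the same clean shutdown can be triggered from another shell (assuming the container was started with --name es-demo, a name used here only for illustration):

```shell
# Sends SIGTERM and waits for the node to shut down,
# equivalent to CTRL+C in the foreground case
sudo docker stop es-demo
```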

Clean up everything.
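One way to do that (a sketch; the container name es-demo is hypothetical, and removing the data directory deletes your indices, so only do this if you are done with the demo):

```shell
# Remove the stopped container (only if it was started with --name es-demo)
sudo docker rm es-demo

# Remove the image
sudo docker rmi elasticsearch:2.4

# Remove the local data directory created for the volume mount
sudo rm -rf data
```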