You'll often want to import large volumes of data rather than entering values by hand. This video shows how to use the Elasticsearch bulk API to import multiple rows of data at once.
- [Instructor] Now let's talk about bulk loading data into Elasticsearch. You're most likely not going to be hand typing data; you'll be ingesting it through something like Logstash or Beats. But for the remainder of this course, I want to pull in some data here that we can really work with and see the power of this tool. The endpoint for the bulk API is underscore bulk (_bulk). This is where you send your requests when you want to bulk load data. It expects newline-delimited JSON data, including a newline at the very end, which is very important; otherwise you'll get errors.
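As a sketch of that newline-delimited format: the request body alternates an action line with a document line, one JSON object per line, and ends with a trailing newline. The index name `products` and the document fields below are made-up examples, not from the course:

```
{ "index" : { "_index" : "products", "_id" : "1" } }
{ "name" : "widget", "price" : 9.99 }
{ "index" : { "_index" : "products", "_id" : "2" } }
{ "name" : "gadget", "price" : 19.99 }
```

Note that the final line must still be followed by a newline character, or Elasticsearch will reject the request.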
It allows you to index, create, delete, and update data using this API, and when you're doing it, you need to make sure that you're using the --data-binary flag with the curl command. So let's switch over to the terminal now and see how to bulk load some data. I'm here in a new terminal window, and the first thing I want to do is create a file that has some data in it. I'm going to use vi here, and I'm going to call it reqs.
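A minimal sketch of those steps from the shell, using a heredoc instead of vi to create the reqs file (the index name and documents are illustrative, not from the course):

```shell
# Build a small newline-delimited JSON (NDJSON) bulk file.
# "products" and the document fields are made-up example values.
cat > reqs <<'EOF'
{ "index" : { "_index" : "products", "_id" : "1" } }
{ "name" : "widget", "price" : 9.99 }
{ "index" : { "_index" : "products", "_id" : "2" } }
{ "name" : "gadget", "price" : 19.99 }
EOF

# The heredoc leaves a newline after the last line, which the
# bulk API requires.
wc -l < reqs

# Send the file with curl. --data-binary preserves the newlines;
# plain -d would strip them and the request would fail.
# Uncomment once Elasticsearch is running locally on the default port:
# curl -s -H "Content-Type: application/x-ndjson" \
#      -XPOST "localhost:9200/_bulk" --data-binary "@reqs"
```

The key detail is --data-binary: it sends the file byte-for-byte, keeping the newline delimiters intact, which is why the transcript calls it out specifically.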
In this course, join Ben Sullins as he dives into the inner workings of Elasticsearch combined with Kibana. Ben provides an overview of the architecture, then goes over the different deployment methods and how best to structure your data. From there, he demonstrates how to query data and how to work with Kibana to present your insights.
- Reviewing key Elasticsearch concepts
- Running Elasticsearch in the cloud and locally
- Bulk loading data
- Setting up mappings of data types
- Querying data
- Running basic aggregations
- Creating visualizations and dashboards in Kibana