This video discusses data analytics in the Elastic Stack.
- [Instructor] Chapter two: Making Sense of Your Data. In this video, we are going to take a look at the importance of analyzing logs, the different use cases of the Elastic Stack, and finally an architectural overview of the Elastic Stack in an enterprise. Importance of analyzing data. What is the big deal with data? Why do we care so much about it? The answer is that data is changing the way business decisions are made today. In short, data is knowledge. According to an article released by Forbes in September 2015, more data was created in the past two years than in the entire history of the human race.
It is also estimated that by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet. Companies that unlock the full power of data analytics could increase their operating margins by as much as 60 percent, particularly in the retail sector. Use cases of the Elastic Stack. The first use case of the Elastic Stack is search. Elasticsearch was initially built as a search engine to speed up search, but later it was extended to other use cases as well, and it is also used as a data store. Elasticsearch uses a structure called an inverted index, which is designed to allow very fast full-text searches, and this is the primary use case of Elasticsearch.
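To make the idea of an inverted index concrete, here is a minimal Python sketch. It assumes simple lowercase, whitespace tokenization; Elasticsearch's actual implementation (built on Apache Lucene) is far more sophisticated, with analyzers, relevance scoring, and on-disk segment files:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document IDs that contain it.

    Instead of scanning every document at query time, a search for a
    term becomes a single dictionary lookup -- which is why an
    inverted index makes full-text search fast.
    """
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

docs = {
    1: "Elasticsearch is a search engine",
    2: "Logstash ships logs to Elasticsearch",
    3: "Kibana visualizes data",
}
index = build_inverted_index(docs)
print(sorted(index["elasticsearch"]))  # → [1, 2]
```

Looking up "elasticsearch" returns documents 1 and 2 directly, without rescanning the texts.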
The next use case is suggestions. You may have noticed that on many websites, like Wikipedia, we receive suggestions when we type a letter or word into the search box. This autocomplete functionality is as fast as the user types, and the reason behind this is Elasticsearch. The final use case is log analysis, which will be explained in detail in the upcoming slides. Log analysis. In today's infrastructure, a lot of data is created from which meaningful insights can be derived.
Many organizations are not yet mature enough to get their logs analyzed, but this could be one of the key things for success in the future. The first and foremost use of log analysis is issue debugging. Suppose we have an issue being repeated again and again; in the current world, the process could be to correct it manually, but this can be detected and automated if we have proper log analysis in place. The Elastic Stack could be used for this purpose. Next comes security analysis. For example, we can take the following scenario: user A logs into an application from a system located in London, and the same user, after a few minutes, logs in from a system located in India. This clearly indicates that there is a security breach.
These kinds of security breaches can be detected using the Elastic Stack. Performance analysis. Let us consider a scenario in which we have built a new API and we need to monitor the performance of the API. We might be interested in the number of hits we get and the time taken for the API to respond. This type of performance analysis can be performed with the Elastic Stack as well. Finally, we will be able to perform predictive analysis as well. Let's consider an e-commerce site whose logs show that the number of visitors is high during a particular period of the day. With this insight, we can scale up the servers to provide users a better experience, or provide offers to the users to increase the conversion rate.
Next, we will see a sample architecture of how the Elastic Stack can be implemented in an enterprise. In this scenario, we have a number of servers in an organization, and the organization is facing difficulty in analyzing its logs. To add complexity to it, the logs across each server are in different formats and structures. We will see how implementing the Elastic Stack will improve the log analysis of the enterprise. First, we need to install Elasticsearch on a server, which will act as a data store.
Next, we need to have Logstash installed on the servers where the log data needs to be transformed to JSON, the only format in which documents can be stored in Elasticsearch. Beats can be installed on a server if the format of the log files is compatible to be stored in Elasticsearch directly. The log files from all servers can be loaded into the Elasticsearch server. Finally, we can have Kibana installed on the same server as Elasticsearch or on a different server, through which we can build graphs, pie charts, and dashboards to provide meaningful insights to the users.
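As an illustration of the transformation step Logstash performs, here is a minimal Python sketch that turns a hypothetical Apache-style access log line into a JSON document ready to be indexed. Real Logstash pipelines express this with grok patterns in a config file, so the regular expression and field names below are assumptions for this example only:

```python
import json
import re

# Hypothetical Apache-style access log format; the field names
# (client, timestamp, method, path, status) are chosen for this
# illustration, not taken from any standard Logstash pattern.
LOG_PATTERN = re.compile(
    r'(?P<client>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+)'
)

def log_line_to_json(line):
    """Transform one raw log line into a JSON document, or None if
    the line does not match the expected format."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None
    doc = match.groupdict()
    doc["status"] = int(doc["status"])  # index status as a number
    return json.dumps(doc)

line = '10.0.0.1 - - [12/Mar/2017:10:15:32 +0000] "GET /api/users HTTP/1.1" 200'
print(log_line_to_json(line))
```

The resulting JSON document is what would be sent to Elasticsearch for indexing; each named capture group becomes a searchable field.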
As an alternative architecture, we can have a centralized server for Logstash, in which the logs are transformed and passed to Elasticsearch, instead of installing Logstash on every server that requires transformation. Beats can be installed on those servers, and the log files from those servers can be passed on to the centralized server on which Logstash is installed. Logstash transforms the log files and transfers them to the Elasticsearch server. Then Kibana can be used to visualize the data in the form of graphs, pie charts, or dashboards.
In the next video we will see how to install and start Elasticsearch.
This course was created and produced by Packt Publishing. We are honored to host this training in our library.
- Elasticsearch concepts
- Working with Logstash and Kibana
- Elasticsearch Query DSL
- Aggregation and analyzers
- Scripting in Elasticsearch
- Using plugins and APIs
- Building an interface with dashboards
- Filtering and processing input
- Loading data to Elasticsearch