Basics of Elastic Search with KIBANA

1. What is Elastic Search?

Elasticsearch is a search engine based on Lucene. It provides a distributed, multitenant-capable full-text search engine with an HTTP web interface and schema-free JSON documents, and it is released as open source under the terms of the Apache License.

2. Why should it be used?

Suppose we need data analytics for our site; we can add them by using Elasticsearch. For example, if our site is deployed to various environments for different clients, we may have to extract useful data, such as the users who logged in on a certain day, or the volume of users in different locales, so that the clients can create and adjust their strategies accordingly.

Elastic Search is developed alongside a data-collection and log-parsing engine called Logstash, and an analytics and visualization platform called Kibana. The three products are designed for use as an integrated solution, referred to as the “Elastic Stack”.

3. How to install it:

a). Go to the link https://www.elastic.co/downloads

b). Then download Elasticsearch as shown in the picture below.

c). Click on Download Elasticsearch and you will get a screen like the one below.

d). Click on the MSI file and it will download automatically.

e). Then run the downloaded installer. Refer to the screens below to install Elasticsearch.

Follow the installer wizard screens in order (steps 1-5 in the screenshots). There is no need to add X-Pack; just click NEXT and complete the installation.

f). By default, Elasticsearch's configuration files are installed at "C:\ProgramData\Elastic\Elasticsearch\config".

g). Go to the path "C:\ProgramData\Elastic\Elasticsearch\config" and edit the "elasticsearch.yml" file if you want to set a custom port; otherwise Elasticsearch listens on the default port 9200.
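For example, a minimal sketch of the relevant line in elasticsearch.yml (the value 9201 is just an assumed sample, not something the installer sets):

# elasticsearch.yml: override the HTTP port (9200 is the default)
http.port: 9201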

h). After successfully installing Elasticsearch, you will need to install Kibana.

i). Just as in point 'b', click on Download Kibana and you will get a screen like the one below.

j). Click on "WINDOWS sha" and you will get a ZIP file. Extract that ZIP file.

k). In the extracted folder you will find a bin directory containing the "kibana.bat" file; just double-click it.

l). Your Kibana server will start.

m). Now point your browser at http://localhost:5601/; once the Kibana server has started you will see a screen like the one below.

4. How to upload files

a). To upload files, first add a reference to the "NEST" package in your code; it is used to connect to the Elasticsearch server, as shown in the screen below and in the short sketch that follows.
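A minimal connection sketch, assuming Elasticsearch runs locally on the default port 9200 and using the "testdata" index name from the code in section 7:

using System;
using Nest;

// Hypothetical setup code; adjust the URL and index name to your environment.
var node = new Uri("http://localhost:9200/");
var settings = new ConnectionSettings(node).DefaultIndex("testdata");
var client = new ElasticClient(settings);   // reused for indexing and searching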

b). Here are the code snippets for uploading data into Elasticsearch.

Images 1-7: screenshots of the upload code (the full listing appears in section 7 below).

These are the code snippets for uploading files into Elasticsearch. It stores millions of records in JSON format, creates the index automatically, and fetches data much faster than the usual process.
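As a rough illustration of that auto-indexing behaviour (the field values below are made-up samples, not data from the real CSV; ClientElastic and client come from section 7 and the sketch above), indexing a single document is enough for Elasticsearch to create the index and infer a mapping:

var doc = new ClientElastic { name = "Jan", surname = "Peeters" };   // sample values only
var indexResponse = client.Index(doc, i => i.Index("testdata"));     // creates "testdata" automatically if missing
if (!indexResponse.IsValid)
    Console.WriteLine(indexResponse.DebugInformation);               // inspect the error details on failure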

5. Search Data In KIBANA

The screenshots illustrate three kinds of queries: a single search, a multi-search, and multiple requests in one call; an illustrative sketch follows.
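A hedged sketch of those queries with NEST (the field, values, and search names below are assumptions, not taken from the screenshots):

// Single search: match documents whose "name" field contains "jan".
var single = client.Search<ClientElastic>(s => s
    .Index("testdata")
    .Query(q => q.Match(m => m.Field(f => f.name).Query("jan"))));

// Multi-search: several named searches sent to the server in one request.
var multi = client.MultiSearch(ms => ms
    .Search<ClientElastic>("byName", s => s.Index("testdata")
        .Query(q => q.Match(m => m.Field(f => f.name).Query("jan"))))
    .Search<ClientElastic>("all", s => s.Index("testdata")
        .Query(q => q.MatchAll())));

var byName = multi.GetResponse<ClientElastic>("byName");   // pull one response back out by its name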

6. Pros and Cons of Elasticsearch

Pros:-

a). Elasticsearch has several built-in advantages, such as scalability by sharding, full-text search, aggregations, and schema flexibility.

b). For logging you get scalability, high ingestion rates, full-text search, and fast aggregations.

c). Indexing. Elasticsearch can index thousands of documents per second.

d). Searching. Elasticsearch provides plenty of options for querying your data to get just the right information back.

e). Backup. Elasticsearch has built-in options for backing up your data. If you’re dealing with a large cluster, backing things up can get rather interesting from a storage perspective, but Elasticsearch has worked very well for us thus far.

Cons:-

a). Elasticsearch's query DSL is less common and less flexible than PostgreSQL's SQL.

b). Everything is indexed by default, which creates index overhead. You will need to understand terms such as tokenizers and analyzers to know how to properly query your data and how it is stored (a small example appears after this list), and you have less control over consistency (no transactions).

c). Elasticsearch can struggle if you’re trying to create too many new indexes at the same time.

d). If you want to store or retrieve data outside of searching, you may want to try a different solution since ElasticSearch’s capabilities are limited.

e). If you want to do large or complex computations with the data, Elastic Search isn’t good at that.
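A rough sketch of the analyzer point from item 'b' above (the index name "people" and analyzer name "my_lowercase" are made up for illustration; ClientElastic and client come from section 7):

var createResponse = client.CreateIndex("people", c => c
    .Settings(s => s.Analysis(a => a
        .Analyzers(an => an.Custom("my_lowercase", ca => ca
            .Tokenizer("standard")           // split the text into terms
            .Filters("lowercase")))))        // then lowercase every term
    .Mappings(m => m.Map<ClientElastic>(mm => mm
        .Properties(p => p.Text(t => t
            .Name(n => n.name)
            .Analyzer("my_lowercase"))))));  // analyze the "name" field with it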

7. Code

Code for uploading file data:-

// Required namespaces for the snippets below (assuming a Windows Forms project with the NEST package installed).
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Nest;

// Extension helper used below to normalise CSV header values before comparing them.
// Despite its name, it keeps digits, letters, and most printable/accented characters.
public static class StringExtensions
{
    public static string RemoveSpecialCharacters(this string str)
    {
        StringBuilder sb = new StringBuilder();
        foreach (char c in str)
        {
            if ((c >= '0' && c <= '9') ||
                (c >= 'A' && c <= 'Z') ||
                (c >= 'a' && c <= 'z') ||
                c == '.' ||
                c == '_' ||
                c == ' ' ||
                c == '\'' ||
                c == 'ç' ||
                (c >= 128 && c <= 255) ||
                c == 33 ||
                (c >= 35 && c <= 47) ||
                (c >= 58 && c <= 64) ||
                (c >= 91 && c <= 96) ||
                (c >= 123 && c <= 126))
            {
                sb.Append(c);
            }
        }
        return sb.ToString();
    }
}

// POCO mapped to the Elasticsearch type; one instance per CSV row.
[ElasticsearchType(Name = "clientelastic")]
public class ClientElastic
{
    //VOORNAAM  NAAM    RIJKSREGISTERNUMMER PKWKN  GROEP  FIRMANUMMER   CATEGORIE       WERKNEMERNUMMER
    public string subindex { get; set; }
    public string name { get; set; }
    public string surname { get; set; }
    public string insurancenumber { get; set; }
    public string pkwkn { get; set; }
    public string group { get; set; }
    public string firmanumber { get; set; }
    public string category { get; set; }
    public string employeenumber { get; set; }
}

private void btnUploadFile_Click(object sender, EventArgs e)
{
    List<string> headers = new List<string>();
    string fileFullPath = "E:\\GoogleSearch\\Kibana\\ElasticSearchFiles\\Dossiers20180612Cnew.csv";
    if (!String.IsNullOrWhiteSpace("http://localhost:9200/") && System.IO.File.Exists(fileFullPath))
    {
        bool isHeader = true;
        //List<Fichier> batch = new List<Fichier>();
        //int step = 0;
        long count = 0;
        //stopWatch.Start();
        List<string> list = new List<string>();
        int bigstep = 0;

        // Connect to the local Elasticsearch node with basic authentication.
        var node = new Uri("http://localhost:9200/");
        var settings = new ConnectionSettings(node);
        settings.BasicAuthentication("elastic", "s7xt8pW?ZXK^1@X7C3xU");
        var client = new ElasticClient(settings);
        List<ClientElastic> clientList = new List<ClientElastic>();
        var subindex = "";

        // // // // // // // // //  WARNING  // // // // // // // // //
        var deleteIndexResponse = client.DeleteIndex("testdata");    // deletes any existing "testdata" index
        // // // // // // // // //  WARNING  // // // // // // // // //

        // Recreate the index with a lowercase normalizer on the "name" keyword field.
        var existsIndex = client.IndexExists("testdata");
        if (!existsIndex.Exists)
        {
            var createIndexDescriptor = new CreateIndexDescriptor("testdata")
                .Settings(set => set.Analysis(an => an.Normalizers(nor => nor.Custom("myLowercase", r => r.Filters(new string[] { "lowercase" })))))
                .Mappings(ms => ms
                    .Map<ClientElastic>(m => m
                        .AutoMap()
                        .Properties(ps => ps
                            .Keyword(c => c.Name(n => n.name).Normalizer("myLowercase"))
                        )
                    )
                );
            var createIndexResponse = client.CreateIndex(createIndexDescriptor);
        }

        foreach (string line in System.IO.File.ReadLines(fileFullPath, Encoding.Default))
        {
            if (isHeader)
            {
                // Validate the CSV header row before reading any data rows.
                headers = line.Split(';').ToList();
                var head0 = headers[0].RemoveSpecialCharacters();
                var head1 = headers[1].RemoveSpecialCharacters();
                var head2 = headers[2].RemoveSpecialCharacters();
                var head3 = headers[3].RemoveSpecialCharacters();
                var head4 = headers[4].RemoveSpecialCharacters();
                var head5 = headers[5].RemoveSpecialCharacters();
                var head6 = headers[6].RemoveSpecialCharacters();
                var head7 = headers[7].RemoveSpecialCharacters();
                if (head0 == "Naam".RemoveSpecialCharacters() &&
                    head1 == "Voornaam".RemoveSpecialCharacters() &&
                    head2 == "Birthdate".RemoveSpecialCharacters() &&
                    head3 == "Geboortedatum".RemoveSpecialCharacters() &&
                    head4 == "Dossiernr".RemoveSpecialCharacters() &&
                    head5 == "CodeOld".RemoveSpecialCharacters() &&
                    head6 == "CodeOldOld".RemoveSpecialCharacters() &&
                    head7 == "Code_historiek".RemoveSpecialCharacters())
                {
                    isHeader = false;
                }
                else
                {
                    string headersOutput = "";
                    foreach (string header in headers)
                        headersOutput += header + ";";
                    //return "Failure: File headers doesn't match the proper format FileURL: " + fileFullPath + " Headers: " + headersOutput + "<br/>";
                }
            }
            else
            {
                // Map each data row to a ClientElastic document.
                string[] row = line.Split(';');
                try
                {
                    ClientElastic newClient = new ClientElastic
                    {
                        subindex = subindex,
                        name = row[0].Trim(),
                        surname = row[1].Trim(),
                        insurancenumber = row[2].Trim(),
                        pkwkn = row[3].Trim(),
                        group = row[4].Trim(),
                        firmanumber = row[5].Trim(),
                        category = row[6].Trim(),
                        employeenumber = row[7].Trim()
                    };
                    clientList.Add(newClient);
                }
                catch (Exception ex)
                {
                    //logger.Log(NLog.LogLevel.Error, ex.Message + ": Line: " + line);
                }
                bigstep++;
                // Flush to Elasticsearch in bulk batches of 10,000 documents.
                if (bigstep >= 10000)
                {
                    var desc = new BulkDescriptor();
                    desc.IndexMany<ClientElastic>(clientList, (bd, q) => bd.Index("testdata"));
                    clientList = new List<ClientElastic>();
                    bigstep = 0;
                    var bulkDescResponse = client.Bulk(desc);
                }
            }
            count++;
        }

        // Index whatever is left in the final partial batch.
        var descriptor = new BulkDescriptor();
        descriptor.IndexMany<ClientElastic>(clientList, (bd, q) => bd.Index("testdata"));
        clientList = new List<ClientElastic>();
        bigstep = 0;
        var bulkDescriptorResponse = client.Bulk(descriptor);
    }
}