Implement a Geo-distance search using .NET Aspire, Elasticsearch and ASP.NET Core

This article shows how to implement a geo-location search in an ASP.NET Core application using a Leaflet.js map. The selected location can be used to find the nearest location with an Elasticsearch Geo-distance query. The Elasticsearch container and the ASP.NET Core UI application are set up for development using .NET Aspire.

Code: https://github.com/damienbod/WebGeoElasticsearch

Setup

For local development, .NET Aspire is used to set up the two services and the HTTPS connections between them. The services are configured in the Aspire AppHost project.

The Elasticsearch client is set up as a singleton and requires the connection configuration. This can be changed if, for example, an API key is used instead. The connection URL is read from the configuration, as are the secrets.

using Elastic.Clients.Elasticsearch;
using Elastic.Transport;

namespace WebGeoElasticsearch.ElasticsearchApi;

public class ElasticClientProvider
{
    private readonly ElasticsearchClient? _client = null;

    public ElasticClientProvider(IConfiguration configuration)
    {
        if (_client == null)
        {
            var settings = new ElasticsearchClientSettings(new Uri(configuration["ElasticsearchUrl"]!))
                .Authentication(new BasicAuthentication(
                    configuration["ElasticsearchUserName"]!,
                    configuration["ElasticsearchPassword"]!));

            _client = new ElasticsearchClient(settings);
        }
    }

    public ElasticsearchClient GetClient()
    {
        if (_client != null)
        {
            return _client;
        }

        throw new Exception("Elasticsearch client not initialized");
    }
}
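The configuration keys read above can be provided via appsettings.json or user secrets. A minimal sketch (the values shown here are placeholders, not real settings from the repository):

```json
{
  "ElasticsearchUrl": "https://localhost:9200",
  "ElasticsearchUserName": "elastic",
  "ElasticsearchPassword": "--store-this-in-user-secrets--"
}
```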

Create Index with mapping

The index should not be created by simply adding a document, because the mapping would then be created incorrectly using the default settings. Instead, the mapping can be created for the defined index using the Mappings extension from the Elastic.Clients.Elasticsearch NuGet package, which was added to the client project via the Aspire.Elastic.Clients.Elasticsearch package. The mapping is kept simple and is probably not complete for a production index; some keyword optimizations would be required. The detailsCoordinates field is defined as a GeoPointProperty.

var mapping = await _client.Indices.CreateAsync<MapDetail>(IndexName, c => c
    .Mappings(map => map
        .Properties(
            new Properties<MapDetail>()
            {
                { "details", new TextProperty() },
                { "detailsCoordinates", new GeoPointProperty() },
                { "detailsType", new TextProperty() },
                { "id", new TextProperty() },
                { "information", new TextProperty() },
                { "name", new TextProperty() }
            }
        )
    )
);

The created mapping can be validated using the “IndexName”/_mapping HTTP GET request, which returns the definitions as a JSON response.

https://localhost:9200/mapdetails/_mapping

Documents can be added to the Elasticsearch index using the IndexAsync method.

 response = await _client.IndexAsync(dotNetGroup, IndexName, "1"); 

Search Query

A Geo-distance query is used to find the distance from the selected location to the different geo points in the index. The query uses latitude and longitude coordinates.

public async Task<List<MapDetail>> SearchForClosestAsync(
    uint maxDistanceInMeter,
    double centerLatitude,
    double centerLongitude)
{
    // Bern Lat 46.94792, Long 7.44461
    if (maxDistanceInMeter == 0)
    {
        maxDistanceInMeter = 1000000;
    }

    var searchRequest = new SearchRequest(IndexName)
    {
        Query = new GeoDistanceQuery
        {
            DistanceType = GeoDistanceType.Plane,
            Field = "detailsCoordinates",
            Distance = $"{maxDistanceInMeter}m",
            Location = GeoLocation.LatitudeLongitude(new LatLonGeoLocation
            {
                Lat = centerLatitude,
                Lon = centerLongitude
            })
        },
        Sort = BuildGeoDistanceSort(centerLatitude, centerLongitude)
    };

    searchRequest.ErrorTrace = true;

    _logger.LogInformation("SearchForClosestAsync: {SearchBody}", searchRequest);

    var searchResponse = await _client.SearchAsync<MapDetail>(searchRequest);

    return searchResponse.Documents.ToList();
}
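For reference, the request built above corresponds roughly to the following Elasticsearch query body (a sketch based on the Geo-distance query DSL; the coordinates are the example Bern values from the comment, and the distance is the 1000000m default):

```json
{
  "query": {
    "geo_distance": {
      "distance": "1000000m",
      "distance_type": "plane",
      "detailsCoordinates": {
        "lat": 46.94792,
        "lon": 7.44461
      }
    }
  },
  "sort": [
    {
      "_geo_distance": {
        "detailsCoordinates": { "lat": 46.94792, "lon": 7.44461 },
        "order": "asc",
        "unit": "m"
      }
    }
  ]
}
```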

The results are returned sorted using a Geo-distance sort, which puts the location with the smallest distance first. This ordering is used for the map display.

private static List<SortOptions> BuildGeoDistanceSort(
    double centerLatitude,
    double centerLongitude)
{
    var sorts = new List<SortOptions>();

    var sort = SortOptions.GeoDistance(new GeoDistanceSort
    {
        Field = new Field("detailsCoordinates"),
        Location = new List<GeoLocation>
        {
            GeoLocation.LatitudeLongitude(new LatLonGeoLocation
            {
                Lat = centerLatitude,
                Lon = centerLongitude
            })
        },
        Order = SortOrder.Asc,
        Unit = DistanceUnit.Meters
    });

    sorts.Add(sort);

    return sorts;
}
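For intuition, the distances that this sort orders by can be approximated with the haversine formula (Elasticsearch computes arc distance by default; the query above opts into the faster plane approximation). A small Python sketch, using the Bern coordinates from the comment above and Zurich as a hypothetical second point:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Bern -> Zurich is roughly 95 km as the crow flies
print(round(haversine_m(46.94792, 7.44461, 47.3769, 8.5417)))
```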

Display using Leaflet.js

The ASP.NET Core application displays the locations and the search results in a Leaflet.js map component. The location closest to the center is displayed differently. You can click around the map and test the different searches. The data for this display is provided by the Geo-distance query.

Testing

The applications can be started using the .NET Aspire host project. One is run as a container, the other as a project. The Docker container requires a Docker Desktop installation on the host operating system. When the applications are started, the containers need to boot up first. An optimization would be to remove this boot-up delay.

Notes

Using Elasticsearch, it is very simple to create fairly complex search requests for your web applications. With a bit of experience, complex reports and queries can be implemented as well. You can also use Elasticsearch aggregations to group and organize results for data analysis tools and reports. .NET Aspire makes it easy to develop locally and use HTTPS everywhere.

Links

https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-geo-distance-query.html

https://leafletjs.com/

https://www.elastic.co/guide/en/elasticsearch/reference/current/explicit-mapping.html

Using Elasticsearch with .NET Aspire

This post shows how to use Elasticsearch in .NET Aspire. Elasticsearch is set up to use HTTPS with the dotnet developer certificates, and a simple client can be implemented to query the data.

Code: https://github.com/damienbod/keycloak-backchannel

Setup

Two services are set up to run in .NET Aspire. The first service is the official Elasticsearch docker container, deployed using dotnet developer certificates. The second service is an ASP.NET Core application using the Elastic.Clients.Elasticsearch NuGet package. The App.Host project is used to set this up and to link the services together.

Elasticsearch development server

The Elasticsearch container is configured in the program class of the App.Host project. The container is run using HTTPS and takes the Aspire parameters for configuration of the default account.

var elasticsearch = builder.AddElasticsearch("elasticsearch", password: passwordElastic)
    .WithDataVolume()
    .RunElasticWithHttpsDevCertificate(port: 9200);

The developer certificate needs to be created and copied to a specific folder inside the Elasticsearch docker container. This is implemented using a shared folder, and the Elasticsearch xpack.security.http.ssl properties are set to match. The following three properties are used:

  • xpack.security.http.ssl.enabled
  • xpack.security.http.ssl.certificate
  • xpack.security.http.ssl.key
using System.Diagnostics;
using System.IO.Hashing;
using System.Text;

namespace Aspire.Hosting;

// original src: https://github.com/dotnet/aspire-samples/tree/damianedwards/keycloak-sample/samples/Keycloak
public static class HostingElasticExtensions
{
    public static IResourceBuilder<ElasticsearchResource> RunElasticWithHttpsDevCertificate(
        this IResourceBuilder<ElasticsearchResource> builder, int port = 9200, int targetPort = 9200)
    {
        if (builder.ApplicationBuilder.ExecutionContext.IsRunMode)
        {
            builder
                .RunElasticWithHttpsDevCertificate()
                .WithHttpsEndpoint(port: port, targetPort: targetPort)
                .WithEnvironment("QUARKUS_HTTP_HTTP2", "false");
        }

        return builder;
    }

    public static IResourceBuilder<TResource> RunElasticWithHttpsDevCertificate<TResource>(
        this IResourceBuilder<TResource> builder)
        where TResource : IResourceWithEnvironment
    {
        const string DEV_CERT_DIR = "/usr/share/elasticsearch/config/certificates";

        if (builder.ApplicationBuilder.ExecutionContext.IsRunMode)
        {
            // Export the ASP.NET Core HTTPS development certificate & private key to PEM files,
            // bind mount them into the container and configure it to use them via the specified
            // environment variables.
            var (certPath, _) = ExportElasticDevCertificate(builder.ApplicationBuilder);
            var bindSource = Path.GetDirectoryName(certPath) ?? throw new UnreachableException();

            if (builder.Resource is ContainerResource containerResource)
            {
                builder.ApplicationBuilder.CreateResourceBuilder(containerResource)
                    .WithBindMount(bindSource, DEV_CERT_DIR, isReadOnly: false);
            }

            builder
                .WithEnvironment("xpack.security.http.ssl.enabled", "true")
                .WithEnvironment("xpack.security.http.ssl.certificate", $"{DEV_CERT_DIR}/dev-cert.pem")
                .WithEnvironment("xpack.security.http.ssl.key", $"{DEV_CERT_DIR}/dev-cert.key");
        }

        return builder;
    }

    private static (string, string) ExportElasticDevCertificate(IDistributedApplicationBuilder builder)
    {
        var appNameHashBytes = XxHash64.Hash(Encoding.Unicode.GetBytes(builder.Environment.ApplicationName).AsSpan());
        var appNameHash = BitConverter.ToString(appNameHashBytes).Replace("-", "").ToLowerInvariant();
        var tempDir = Path.Combine(Path.GetTempPath(), $"aspire.{appNameHash}");
        var certExportPath = Path.Combine(tempDir, "dev-cert.pem");
        var certKeyExportPath = Path.Combine(tempDir, "dev-cert.key");

        if (File.Exists(certExportPath) && File.Exists(certKeyExportPath))
        {
            // Certificate already exported, return the path.
            return (certExportPath, certKeyExportPath);
        }
        else if (Directory.Exists(tempDir))
        {
            Directory.Delete(tempDir, recursive: true);
        }

        Directory.CreateDirectory(tempDir);

        var exportProcess = Process.Start("dotnet",
            $"dev-certs https --export-path \"{certExportPath}\" --format Pem --no-password");

        var exited = exportProcess.WaitForExit(TimeSpan.FromSeconds(5));
        if (exited && File.Exists(certExportPath) && File.Exists(certKeyExportPath))
        {
            return (certExportPath, certKeyExportPath);
        }
        else if (exportProcess.HasExited && exportProcess.ExitCode != 0)
        {
            throw new InvalidOperationException($"HTTPS dev certificate export failed with exit code {exportProcess.ExitCode}");
        }
        else if (!exportProcess.HasExited)
        {
            exportProcess.Kill(true);
            throw new InvalidOperationException("HTTPS dev certificate export timed out");
        }

        throw new InvalidOperationException("HTTPS dev certificate export failed for an unknown reason");
    }
}

When the App.Host project is started, the Elasticsearch container boots up and the server can be tested using the “_cat” HTTP GET requests; the default base URL returns server information about the Elasticsearch instance.

https://localhost:9200/_cat

Elasticsearch client

The Elasticsearch client was implemented using the Elastic.Clients.Elasticsearch NuGet package. The client project in .NET Aspire needs to reference the Elasticsearch server using the WithReference method.

builder.AddProject<Projects.ElasticsearchAuditTrail>("elasticsearchaudittrail")
    .WithExternalHttpEndpoints()
    .WithReference(elasticsearch);

Elasticsearch can be queried using a simple query search.

public async Task<IEnumerable<T>> QueryAuditLogs(string filter = "*",
    AuditTrailPaging auditTrailPaging = null)
{
    var from = 0;
    var size = 10;

    EnsureElasticClient(_indexName, _options.Value);
    await EnsureAlias();

    if (auditTrailPaging != null)
    {
        from = auditTrailPaging.Skip;
        size = auditTrailPaging.Size;
        if (size > 1000)
        {
            // max limit 1000 items
            size = 1000;
        }
    }

    var searchRequest = new SearchRequest<T>(Indices.Parse(_alias))
    {
        Size = size,
        From = from,
        Query = new SimpleQueryStringQuery
        {
            Query = filter
        },
        Sort = BuildSort()
    };

    var searchResponse = await _elasticsearchClient.SearchAsync<T>(searchRequest);

    return searchResponse.Documents;
}
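The request above translates roughly to the following Elasticsearch body (a sketch; the sort clause produced by BuildSort is omitted here, and "*" is the default filter):

```json
{
  "from": 0,
  "size": 10,
  "query": {
    "simple_query_string": {
      "query": "*"
    }
  }
}
```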

See the source code: https://github.com/damienbod/keycloak-backchannel/blob/main/AuditTrail/AuditTrailProvider.cs

Notes

With this setup, it is easy to develop using Elasticsearch as a container, and no service needs to be installed on the developer host PC. Setting up HTTPS is a little complicated and it would be nice to see this supported better. The development environment should be as close as possible to the deployed versions, so HTTPS should be used in development.

Links

https://learn.microsoft.com/en-us/dotnet/aspire/search/elasticsearch-integration

https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html

https://www.elastic.co/products/elasticsearch

https://github.com/elastic/elasticsearch-net

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-simple-query-string-query.html

Implementing an Audit Trail using ASP.NET Core and Elasticsearch

This article shows how an audit trail can be implemented in ASP.NET Core which saves the audit documents to Elasticsearch using the Elastic.Clients.Elasticsearch NuGet package.

Code: https://github.com/damienbod/AspNetCoreElasticsearchAuditTrail

History

  • 2024-09-11 Updated to .NET 8 and Elastic.Clients.Elasticsearch
  • 2020-01-12 Updated to .NET Core 3.1
  • 2019-02-15 Updated to .NET Core 2.2

Should I just use a logger?

That depends. If you just need to save requests, responses and application events, then a logger would be a better solution for this use case. I would use Serilog, as it provides everything you need, or could need, when working with ASP.NET Core.

If you only need to save business events/data of the application in the audit trail, then this solution could fit.

Using the Audit Trail

The audit trail is implemented so that it can be used easily. In the program file of the ASP.NET Core application, it is added to the application using the builder. The class library provides an extension method, AddAuditTrail, which can be configured as required. It takes five parameters: a bool which defines whether a new index is created per day or per month to save the audit trail documents, an int which defines how many of the previous indices are included in the alias used to select the audit trail items (if this is 0, all indices are included in the search), and the Elasticsearch user name, password and URL.

Because the audit trail documents are grouped into different indices per day or per month, the number of documents in each index can be controlled. Usually the application user requires only the last n days, or the last two months, of the audit trail, so the search does not need to cover all audit trail documents since the application began. This makes it possible to optimize the data as required, or even to remove or archive old, unused audit trail indices.

var indexPerMonth = false;
var amountOfPreviousIndicesUsedInAlias = 3;

builder.Services.AddAuditTrail<CustomAuditTrailLog>(options =>
    options.UseSettings(indexPerMonth, amountOfPreviousIndicesUsedInAlias,
        builder.Configuration["ElasticsearchUserName"],
        builder.Configuration["ElasearchPassword"],
        builder.Configuration["ElasearchUrl"])
);

builder.Services.AddControllersWithViews();

The AddAuditTrail extension method requires a model definition which is used to save and retrieve the documents in Elasticsearch. The model must implement the IAuditTrailLog interface. This interface just forces you to implement the Timestamp property, which is required for the audit logs.

The model can then be designed and defined as required. Use the keyword attribute if the text field should not be analyzed. If you must use enums, then save the string value and NOT the integer value to the persistence layer. If integer values are saved for the enums, they cannot be used without knowing what each integer value represents, making the data dependent on the code.
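The advice on enums can be illustrated with a small Python sketch (the AuditAction enum and its members are hypothetical, purely for illustration): persisting the name keeps the stored document readable without the code, while the integer value is meaningless on its own.

```python
from enum import Enum

class AuditAction(Enum):
    # hypothetical audit actions, for illustration only
    CREATED = 1
    UPDATED = 2
    DELETED = 3

# store the name, not the value, in the audit document
doc_good = {"action": AuditAction.UPDATED.name}   # readable without the code
doc_bad = {"action": AuditAction.UPDATED.value}   # just "2" -- meaningless later

print(doc_good)  # {'action': 'UPDATED'}
print(doc_bad)   # {'action': 2}
```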

public class CustomAuditTrailLog : IAuditTrailLog
{
    public CustomAuditTrailLog()
    {
        Timestamp = DateTime.UtcNow;
    }

    public DateTime Timestamp { get; set; }
    public string Action { get; set; } = string.Empty;
    public string Log { get; set; } = string.Empty;
    public string Origin { get; set; } = string.Empty;
    public string User { get; set; } = string.Empty;
    public string Extra { get; set; } = string.Empty;
}

The audit trail can then be used anywhere in the application. The IAuditTrailProvider can be injected in the constructor of the class, and an audit document can be created using the AddLog method.

public class HomeController(IAuditTrailProvider<CustomAuditTrailLog> auditTrailProvider) : Controller
{
    private readonly IAuditTrailProvider<CustomAuditTrailLog> _auditTrailProvider = auditTrailProvider;

    public IActionResult Index()
    {
        var auditTrailLog = new CustomAuditTrailLog()
        {
            User = GetUserName(),
            Origin = "HomeController:Index",
            Action = "Home GET",
            Log = "home page called doing something important enough to be added to the audit log.",
            Extra = "yep"
        };

        _auditTrailProvider.AddLog(auditTrailLog);
        return View();
    }

    private string GetUserName()
    {
        return User.Identity!.Name ?? "Anonymous";
    }
}

The audit trail documents can be viewed using QueryAuditLogs, which supports paging and uses a simple query search that accepts wildcards. The AuditTrailSearch method returns an MVC view with the audit trail items in the model.

public async Task<IActionResult> AuditTrailSearchAsync(string searchString, int skip, int amount)
{
    var auditTrailLog = new CustomAuditTrailLog()
    {
        User = GetUserName(),
        Origin = "HomeController:AuditTrailSearchAsync",
        Action = "AuditTrailSearchAsync GET",
        Log = $"user clicked the audit trail nav. {searchString}"
    };

    await _auditTrailProvider.AddLog(auditTrailLog);

    var auditTrailViewModel = new AuditTrailViewModel
    {
        Filter = searchString,
        Skip = skip,
        Size = amount
    };

    if (skip > 0 || amount > 0)
    {
        var paging = new AuditTrailPaging
        {
            Size = amount,
            Skip = skip
        };

        auditTrailViewModel.AuditTrailLogs = (await _auditTrailProvider
            .QueryAuditLogs(searchString, paging)).ToList();

        return View(auditTrailViewModel);
    }

    auditTrailViewModel.AuditTrailLogs = (await _auditTrailProvider
        .QueryAuditLogs(searchString)).ToList();

    return View(auditTrailViewModel);
}

How is the Audit Trail implemented?

The AuditTrailExtensions class implements the extension methods used to initialize the audit trail implementation. This class accepts the options and registers the interfaces and classes with the IoC container used by ASP.NET Core.

Generics are used so that any model class can be used to save the audit trail data; this always changes with each project or application. The type T must implement the IAuditTrailLog interface.

using AuditTrail;
using AuditTrail.Model;
using Microsoft.Extensions.DependencyInjection.Extensions;
using System;

namespace Microsoft.Extensions.DependencyInjection;

public static class AuditTrailExtensions
{
    public static IServiceCollection AddAuditTrail<T>(this IServiceCollection services)
        where T : class, IAuditTrailLog
    {
        ArgumentNullException.ThrowIfNull(services);
        return AddAuditTrail<T>(services, setupAction: null);
    }

    public static IServiceCollection AddAuditTrail<T>(
        this IServiceCollection services,
        Action<AuditTrailOptions> setupAction)
        where T : class, IAuditTrailLog
    {
        ArgumentNullException.ThrowIfNull(services);

        services.TryAdd(new ServiceDescriptor(
            typeof(IAuditTrailProvider<T>),
            typeof(AuditTrailProvider<T>),
            ServiceLifetime.Transient));

        if (setupAction != null)
        {
            services.Configure(setupAction);
        }

        return services;
    }
}

When a new audit trail log is added, it uses the index defined in the _indexName field.

public async Task AddLog(T auditTrailLog)
{
    EnsureElasticClient(_indexName, _options.Value);

    var indexRequest = new IndexRequest<T>(auditTrailLog);

    var response = await _elasticsearchClient.IndexAsync(indexRequest);
    if (!response.IsValidResponse)
    {
        throw new ArgumentException("Add auditlog disaster!");
    }
}

The _indexName field is defined using a date pattern, either days or months, depending on your options.

private const string _alias = "auditlog";
private readonly string _indexName = $"{_alias}-{DateTime.UtcNow:yyyy-MM-dd}";
private readonly IOptions<AuditTrailOptions> _options;

private readonly static Field TimestampField = new("timestamp");

private static ElasticsearchClient _elasticsearchClient;
private static string _actualIndex = string.Empty;

Index definition per month:

if (_options.Value.IndexPerMonth)
{
    _indexName = $"{_alias}-{DateTime.UtcNow:yyyy-MM}";
}
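The index-name pattern from the snippets above can be sketched in Python for illustration: the alias "auditlog" plus a daily or monthly date suffix.

```python
from datetime import datetime, timezone

ALIAS = "auditlog"

def index_name(now: datetime, index_per_month: bool) -> str:
    """Build the write index name the way the provider does: alias plus a date suffix."""
    pattern = "%Y-%m" if index_per_month else "%Y-%m-%d"
    return f"{ALIAS}-{now.strftime(pattern)}"

now = datetime(2024, 9, 11, tzinfo=timezone.utc)
print(index_name(now, index_per_month=False))  # auditlog-2024-09-11
print(index_name(now, index_per_month=True))   # auditlog-2024-09
```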

When querying the audit trail logs, a simple query string query is used to find and select the audit trail documents required for the view. This is used so that wildcards can be supported. The method accepts a query filter and paging options. If you search without any filter, all documents defined in the alias (the used indices) are returned. By using the simple query, the filter can accept operators like AND and OR in the search.

public async Task<IEnumerable<T>> QueryAuditLogs(string filter = "*",
    AuditTrailPaging auditTrailPaging = null)
{
    var from = 0;
    var size = 10;

    EnsureElasticClient(_indexName, _options.Value);
    await EnsureAlias();

    if (auditTrailPaging != null)
    {
        from = auditTrailPaging.Skip;
        size = auditTrailPaging.Size;
        if (size > 1000)
        {
            // max limit 1000 items
            size = 1000;
        }
    }

    var searchRequest = new SearchRequest<T>(Indices.Parse(_alias))
    {
        Size = size,
        From = from,
        Query = new SimpleQueryStringQuery
        {
            Query = filter
        },
        Sort = BuildSort()
    };

    var searchResponse = await _elasticsearchClient.SearchAsync<T>(searchRequest);

    return searchResponse.Documents;
}

The alias is also updated in the search query, if required. Depending on your configuration, the alias uses all the audit trail indices or just the last n days or n months. This check uses a static field. If the alias needs to be updated, the new alias is created, which also deletes the old one.

private async Task EnsureAlias()
{
    if (_options.Value.IndexPerMonth)
    {
        if (aliasUpdated.Date < DateTime.UtcNow.AddMonths(-1).Date)
        {
            aliasUpdated = DateTime.UtcNow;
            await CreateAlias();
        }
    }
    else if (aliasUpdated.Date < DateTime.UtcNow.AddDays(-1).Date)
    {
        aliasUpdated = DateTime.UtcNow;
        await CreateAlias();
    }
}
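The rollover check above can be sketched as a plain function in Python for illustration: the alias is refreshed when the last update is older than one day for daily indices, or one month for monthly indices (approximated here as 31 days, where the C# code uses AddMonths):

```python
from datetime import datetime, timedelta

def alias_needs_update(alias_updated: datetime, now: datetime, index_per_month: bool) -> bool:
    """Mirror of EnsureAlias: refresh when the last update is older than the index period."""
    if index_per_month:
        # approximates the C# AddMonths(-1) with a fixed 31 days for this sketch
        return alias_updated.date() < (now - timedelta(days=31)).date()
    return alias_updated.date() < (now - timedelta(days=1)).date()

now = datetime(2024, 9, 11, 12, 0)
print(alias_needs_update(now - timedelta(days=2), now, index_per_month=False))  # True
print(alias_needs_update(now, now, index_per_month=False))                      # False
```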

Here’s how the alias is created for the last n indices of the audit trail.

private async Task CreateAliasForLastNIndicesAsync(int amount)
{
    EnsureElasticClient(_indexName, _options.Value);

    var responseCatIndices = await _elasticsearchClient
        .Indices.GetAsync(new GetIndexRequest(Indices.Parse($"{_alias}-*")));
    var records = responseCatIndices.Indices.ToList();

    var indicesToAddToAlias = new List<string>();
    for (int i = amount; i > 0; i--)
    {
        if (_options.Value.IndexPerMonth)
        {
            var indexName = $"{_alias}-{DateTime.UtcNow.AddMonths(-i + 1):yyyy-MM}";
            if (records.Exists(t => t.Key == indexName))
            {
                indicesToAddToAlias.Add(indexName);
            }
        }
        else
        {
            var indexName = $"{_alias}-{DateTime.UtcNow.AddDays(-i + 1):yyyy-MM-dd}";
            if (records.Exists(t => t.Key == indexName))
            {
                indicesToAddToAlias.Add(indexName);
            }
        }
    }

    var response = await _elasticsearchClient.Indices
        .ExistsAliasAsync(new ExistsAliasRequest(new Names(new List<string> { _alias })));
    if (response.Exists)
    {
        await _elasticsearchClient.Indices
            .DeleteAliasAsync(new DeleteAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
    }

    Indices multipleIndicesFromStringArray = indicesToAddToAlias.ToArray();

    var responseCreateIndex = await _elasticsearchClient.Indices
        .PutAliasAsync(new PutAliasRequest(multipleIndicesFromStringArray, _alias));
    if (!responseCreateIndex.IsValidResponse)
    {
        var res = responseCreateIndex.TryGetOriginalException(out var ex);
        throw ex;
    }
}

The full AuditTrailProvider class which implements the audit trail can be found in the GitHub repository linked above.

Testing the audit log

The created audit trails can be checked using the following HTTP GET requests:

Counts all the audit trail entries in the alias.


https://localhost:9200/auditlog/_count

Shows all the audit trail indices. You can count all the documents from the indices used in the alias; this must match the count from the alias.


https://localhost:9200/_cat/indices/auditlog*

You can also start the application and the AuditTrail logs can be displayed in the Audit Trail logs MVC view.

This view is just a quick test; if implementing this properly, you would have to localize the timestamp display and add proper paging in the view.

Notes, improvements

If lots of audit trail documents are written at once, a bulk insert could be used to add the documents in batches, as most loggers do. You should also define a strategy for how old audit trail indices are cleaned up or archived. The creation of the alias could be optimized depending on your audit trail data and on how you clean up old audit trail indices.

Links:

https://www.elastic.co/guide/en/elasticsearch/reference/current/aliases.html

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-simple-query-string-query.html

https://docs.microsoft.com/en-us/aspnet/core/

https://www.elastic.co/products/elasticsearch

https://github.com/elastic/elasticsearch-net

Angular search with ASP.NET Core and Elasticsearch

This article shows how a website search could be implemented using Angular, ASP.NET Core and Elasticsearch. Most users expect autocomplete and a flexible search like that of well-known search websites. When the user enters a character in the search input field, an autocomplete request using a shingle token filter with a terms aggregation is used to suggest possible search terms. When a term is selected, a match query request is sent which uses an edge ngram indexed field to search for hits or matches. Server-side paging is then implemented to iterate through the results.

Code: https://github.com/damienbod/Angular2AutoCompleteAspNetCoreElasticsearch

2017.02.10: Updated to VS2017, Angular 2.4.6 and webpack 2.2.1
2017.01.07: Updated to csproj, webpack 2.2.0-rc.3, angular 2.4.1

ASP.NET Core server side search

The Elasticsearch index and queries were built using ideas from these two excellent blogs: bilyachat and qbox.io. ElasticsearchCRUD is used as the dotnet core client for Elasticsearch. To set up the index, a mapping needs to be defined, as well as the index itself with the required settings analysis with filters, analyzers and tokenizers. See the Elasticsearch documentation for detailed information.

In this example, two custom analyzers are defined: one for the autocomplete and one for the search. The autocomplete analyzer uses a custom shingle token filter called autocompletefilter, a stopwords token filter, a lowercase token filter and a stemmer token filter. The edge_ngram_search analyzer uses an edge ngram token filter and a lowercase filter.
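What these two filters produce can be sketched in Python: shingles are word-level n-grams (sizes 2–5, as configured below) and edge n-grams are leading substrings of a token (sizes 2–20). This is a simplified sketch; the real shingle filter also emits the original single tokens by default.

```python
def shingles(tokens, min_size=2, max_size=5):
    """Word-level n-grams, as the shingle token filter produces."""
    out = []
    for size in range(min_size, max_size + 1):
        for i in range(len(tokens) - size + 1):
            out.append(" ".join(tokens[i:i + size]))
    return out

def edge_ngrams(token, min_gram=2, max_gram=20):
    """Leading substrings of a token, as the edge ngram token filter produces."""
    return [token[:n] for n in range(min_gram, min(max_gram, len(token)) + 1)]

print(shingles(["new", "search", "engine"]))  # ['new search', 'search engine', 'new search engine']
print(edge_ngrams("paris"))                   # ['pa', 'par', 'pari', 'paris']
```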

private IndexDefinition CreateNewIndexDefinition()
{
    return new IndexDefinition
    {
        IndexSettings =
        {
            Analysis = new Analysis
            {
                Filters =
                {
                    CustomFilters = new List<AnalysisFilterBase>
                    {
                        new StemmerTokenFilter("stemmer"),
                        new ShingleTokenFilter("autocompletefilter")
                        {
                            MaxShingleSize = 5,
                            MinShingleSize = 2
                        },
                        new StopTokenFilter("stopwords"),
                        new EdgeNGramTokenFilter("edge_ngram_filter")
                        {
                            MaxGram = 20,
                            MinGram = 2
                        }
                    }
                },
                Analyzer =
                {
                    Analyzers = new List<AnalyzerBase>
                    {
                        new CustomAnalyzer("edge_ngram_search")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "edge_ngram_filter" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        },
                        new CustomAnalyzer("autocomplete")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "autocompletefilter", "stopwords", "stemmer" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        },
                        new CustomAnalyzer("default")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "stopwords", "stemmer" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        }
                    }
                }
            }
        },
    };
}

The PersonCity class is used to add and search for documents in Elasticsearch. The default index and type for this class using ElasticsearchCRUD are personcitys and personcity.

public class PersonCity
{
    public long Id { get; set; }
    public string Name { get; set; }
    public string FamilyName { get; set; }
    public string Info { get; set; }
    public string CityCountry { get; set; }
    public string Metadata { get; set; }
    public string Web { get; set; }
    public string Github { get; set; }
    public string Twitter { get; set; }
    public string Mvp { get; set; }
}

A PersonCityMapping class is defined so that the required mapping from the PersonCityMappingDto mapping class can be applied to the personcitys index and the personcity type. This class overrides ElasticsearchMapping to define the index and type.

using System;
using ElasticsearchCRUD;

namespace SearchComponent
{
    public class PersonCityMapping : ElasticsearchMapping
    {
        public override string GetIndexForType(Type type)
        {
            return "personcitys";
        }

        public override string GetDocumentType(Type type)
        {
            return "personcity";
        }
    }
}

The PersonCityMapping class is then registered so that the C# type PersonCityMappingDto maps to the default index of the PersonCity class.

public PersonCitySearchProvider()
{
    _elasticsearchMappingResolver.AddElasticSearchMappingForEntityType(
        typeof(PersonCityMappingDto), new PersonCityMapping());

    _context = new ElasticsearchContext(ConnectionString,
        new ElasticsearchSerializerConfiguration(_elasticsearchMappingResolver))
    {
        TraceProvider = new ConsoleTraceProvider()
    };
}

A specific mapping DTO class is used to define the mapping in Elasticsearch. This class is required if a non-default mapping is needed in Elasticsearch. The class uses the ElasticsearchString attribute to define a copy mapping: the field in Elasticsearch is copied to the autocomplete and searchfield fields when adding a new document. The searchfield and autocomplete fields use the two analyzers which were defined in the index when adding data. This class is only used to define the type mapping in Elasticsearch.

using ElasticsearchCRUD.ContextAddDeleteUpdate.CoreTypeAttributes;

namespace SearchComponent
{
    public class PersonCityMappingDto
    {
        public long Id { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Name { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string FamilyName { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Info { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string CityCountry { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Metadata { get; set; }

        public string Web { get; set; }
        public string Github { get; set; }
        public string Twitter { get; set; }
        public string Mvp { get; set; }

        [ElasticsearchString(Analyzer = "edge_ngram_search", SearchAnalyzer = "standard", TermVector = TermVector.yes)]
        public string searchfield { get; set; }

        [ElasticsearchString(Analyzer = "autocomplete")]
        public string autocomplete { get; set; }
    }
}

The CreateIndex method uses IndexCreate to create a new index and mapping in Elasticsearch.

public void CreateIndex()
{
    _context.IndexCreate<PersonCityMappingDto>(CreateNewIndexDefinition());
}

The Elasticsearch settings can be viewed using the HTTP GET:

http://localhost:9200/_settings

{
  "personcitys": {
    "settings": {
      "index": {
        "creation_date": "1477642409728",
        "analysis": {
          "filter": {
            "stemmer": {
              "type": "stemmer"
            },
            "autocompletefilter": {
              "max_shingle_size": "5",
              "min_shingle_size": "2",
              "type": "shingle"
            },
            "stopwords": {
              "type": "stop"
            },
            "edge_ngram_filter": {
              "type": "edgeNGram",
              "min_gram": "2",
              "max_gram": "20"
            }
          },
          "analyzer": {
            "edge_ngram_search": {
              "filter": ["lowercase", "edge_ngram_filter"],
              "char_filter": ["html_strip"],
              "type": "custom",
              "tokenizer": "standard"
            },
            "autocomplete": {
              "filter": ["lowercase", "autocompletefilter", "stopwords", "stemmer"],
              "char_filter": ["html_strip"],
              "type": "custom",
              "tokenizer": "standard"
            },
            "default": {
              "filter": ["lowercase", "stopwords", "stemmer"],
              "char_filter": ["html_strip"],
              "type": "custom",
              "tokenizer": "standard"
            }
          }
        },
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "TxS9hdy7SmGPr4FSSNaPiQ",
        "version": {
          "created": "2040199"
        }
      }
    }
  }
}

The Elasticsearch mapping can be viewed using the HTTP GET:

http://localhost:9200/_mapping

{
    "personcitys": {
        "mappings": {
            "personcity": {
                "properties": {
                    "autocomplete": {
                        "type": "string",
                        "analyzer": "autocomplete"
                    },
                    "citycountry": {
                        "type": "string",
                        "copy_to": ["autocomplete", "searchfield"]
                    },
                    "familyname": {
                        "type": "string",
                        "copy_to": ["autocomplete", "searchfield"]
                    },
                    "github": {
                        "type": "string"
                    },
                    "id": {
                        "type": "long"
                    },
                    "info": {
                        "type": "string",
                        "copy_to": ["autocomplete", "searchfield"]
                    },
                    "metadata": {
                        "type": "string",
                        "copy_to": ["autocomplete", "searchfield"]
                    },
                    "mvp": {
                        "type": "string"
                    },
                    "name": {
                        "type": "string",
                        "copy_to": ["autocomplete", "searchfield"]
                    },
                    "searchfield": {
                        "type": "string",
                        "term_vector": "yes",
                        "analyzer": "edge_ngram_search",
                        "search_analyzer": "standard"
                    },
                    "twitter": {
                        "type": "string"
                    },
                    "web": {
                        "type": "string"
                    }
                }
            }
        }
    }
}

Now documents can be added using the PersonCity class, which has no Elasticsearch-specific definitions.

Autocomplete search

A terms aggregation search is used for the autocomplete request. The terms aggregation uses the autocomplete field which only exists in Elasticsearch. A list of strings is returned to the user from this request.

public IEnumerable<string> AutocompleteSearch(string term)
{
    var search = new Search
    {
        Size = 0,
        Aggs = new List<IAggs>
        {
            new TermsBucketAggregation("autocomplete", "autocomplete")
            {
                Order = new OrderAgg("_count", OrderEnum.desc),
                Include = new IncludeExpression(term + ".*")
            }
        }
    };

    var items = _context.Search<PersonCity>(search);
    var aggResult = items.PayloadResult.Aggregations.GetComplexValue<TermsBucketAggregationsResult>("autocomplete");
    IEnumerable<string> results = aggResult.Buckets.Select(t => t.Key.ToString());
    return results;
}

The request is sent to Elasticsearch as follows:

POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 124
Host: localhost:9200

{
    "size": 0,
    "aggs": {
        "autocomplete": {
            "terms": {
                "field": "autocomplete",
                "order": {
                    "_count": "desc"
                },
                "include": {
                    "pattern": "as.*"
                }
            }
        }
    }
}
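On the client side, the bucket keys in the terms aggregation response are what becomes the suggestion list. This is a minimal sketch of extracting them, assuming the standard Elasticsearch terms aggregation response shape; the mock response and the function name are illustrative, not taken from the project.

```typescript
// Minimal shape of an Elasticsearch terms aggregation response.
interface TermsBucket {
    key: string;
    doc_count: number;
}

interface AutocompleteResponse {
    aggregations: {
        autocomplete: {
            buckets: TermsBucket[];
        };
    };
}

// Extract the suggestion strings from the aggregation buckets.
function extractSuggestions(response: AutocompleteResponse): string[] {
    return response.aggregations.autocomplete.buckets.map(b => b.key);
}

// Illustrative response, matching a request with the include pattern "as.*".
const mockResponse: AutocompleteResponse = {
    aggregations: {
        autocomplete: {
            buckets: [
                { key: "asp.net", doc_count: 10 },
                { key: "aspire", doc_count: 3 }
            ]
        }
    }
};
```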

Search Query

When an autocomplete string is selected, a search request is sent to Elasticsearch using a Match Query on the searchfield field, which returns 10 hits starting from document 0. When a paging request is sent, the from value is a multiple of 10, depending on the requested page.

public PersonCitySearchResult Search(string term, int from)
{
    var personCitySearchResult = new PersonCitySearchResult();
    var search = new Search
    {
        Size = 10,
        From = from,
        Query = new Query(new MatchQuery("searchfield", term))
    };

    var results = _context.Search<PersonCity>(search);
    personCitySearchResult.PersonCities = results.PayloadResult.Hits.HitsResult.Select(t => t.Source);
    personCitySearchResult.Hits = results.PayloadResult.Hits.Total;
    personCitySearchResult.Took = results.PayloadResult.Took;
    return personCitySearchResult;
}

The search query is sent as follows:

POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 74
Host: localhost:9200

{
    "from": 0,
    "size": 10,
    "query": {
        "match": {
            "searchfield": {
                "query": "asp.net"
            }
        }
    }
}
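The from/size paging convention above can be sketched as a small helper that builds the request body for a given zero-based page. The searchfield field name matches the mapping in this article; the helper function itself is illustrative, not part of the project code.

```typescript
// Build the Elasticsearch match-query body for a given zero-based page,
// using the same from/size convention as the search above (10 hits per page).
function buildPagedSearchBody(term: string, page: number, pageSize: number = 10) {
    return {
        from: page * pageSize,
        size: pageSize,
        query: {
            match: {
                searchfield: {
                    query: term
                }
            }
        }
    };
}
```

Page 0 produces from: 0 and page 2 produces from: 20, matching the multiples of 10 described above.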

Angular 2 client side search

The Angular 2 client uses an autocomplete input control and an ngFor to display all the search results. Bootstrap paging is used if more than 10 results are found for the search term.

<div class="panel-group">
    <personcitysearch *ngIf="IndexExists" (onTermSelectedEvent)="onTermSelectedEvent($event)" [disableAutocomplete]="!IndexExists">
    </personcitysearch>

    <em *ngIf="PersonCitySearchData.took > 0" style="font-size:smaller; color:lightgray;">
        <span>Hits: {{PersonCitySearchData.hits}}</span>
    </em><br />
    <br />

    <div *ngFor="let personCity of PersonCitySearchData.personCities">
        <b><span>{{personCity.name}} {{personCity.familyName}} </span></b>
        <a *ngIf="personCity.twitter" href="{{personCity.twitter}}">
            <img src="assets/socialTwitter.png" />
        </a>
        <a *ngIf="personCity.github" href="{{personCity.github}}">
            <img src="assets/github.png" />
        </a>
        <a *ngIf="personCity.mvp" href="{{personCity.mvp}}">
            <img src="assets/mvp.png" width="24" />
        </a><br />
        <em style="font-size:large"><a href="{{personCity.web}}">{{personCity.web}}</a></em><br />
        <em><span>{{personCity.metadata}}</span></em><br />
        <span>{{personCity.info}}</span><br />
        <br />
        <br />
    </div>

    <ul class="pagination" *ngIf="ShowPaging">
        <li><a (click)="PreviousPage()">&laquo;</a></li>
        <li><a *ngFor="let page of Pages" (click)="LoadDataForPage(page)">{{page}}</a></li>
        <li><a (click)="NextPage()">&raquo;</a></li>
    </ul>
</div>

The personcitysearch Angular 2 component implements the autocomplete functionality using the ng2-completer component. When a character is entered into the input, an HTTP request is sent to the server, which in turn sends a request to the Elasticsearch server.

import { Component, Inject, EventEmitter, Input, Output, OnInit, AfterViewInit, ElementRef } from '@angular/core';
import { Http, Response } from "@angular/http";
import { Subscription } from 'rxjs/Subscription';
import { Observable } from 'rxjs/Observable';
import { Router } from '@angular/router';
import { Configuration } from '../app.constants';
import { PersoncityautocompleteDataService } from './personcityautocompleteService';
import { PersonCity } from '../model/personCity';
import { CompleterService, CompleterItem } from 'ng2-completer';

import './personcityautocomplete.component.scss';

@Component({
    selector: 'personcityautocomplete',
    template: `
        <ng2-completer [dataService]="dataService" (selected)="onPersonCitySelected($event)" [minSearchLength]="0" [disableInput]="disableAutocomplete"></ng2-completer>
    `
})
export class PersoncityautocompleteComponent implements OnInit {

    constructor(private completerService: CompleterService, private http: Http, private _configuration: Configuration) {
        this.dataService = new PersoncityautocompleteDataService(http, _configuration);
        ////completerService.local("name, info, familyName", 'name');
    }

    @Output() bindModelPersonCityChange = new EventEmitter<PersonCity>();
    @Input() bindModelPersonCity: PersonCity;
    @Input() disableAutocomplete: boolean = false;

    private searchStr: string;
    private dataService: PersoncityautocompleteDataService;

    ngOnInit() {
        console.log("ngOnInit PersoncityautocompleteComponent");
    }

    public onPersonCitySelected(selected: CompleterItem) {
        console.log(selected);
        this.bindModelPersonCityChange.emit(selected.originalObject);
    }
}

And the data service for the CompleterService which is used by the ng2-completer component:

import { Http, Response } from "@angular/http";
import { Subject } from "rxjs/Subject";
import { CompleterData, CompleterItem } from 'ng2-completer';
import { Configuration } from '../app.constants';

export class PersoncityautocompleteDataService extends Subject<CompleterItem[]> implements CompleterData {

    constructor(private http: Http, private _configuration: Configuration) {
        super();
        this.actionUrl = _configuration.Server + 'api/personcity/querystringsearch/';
    }

    private actionUrl: string;

    public search(term: string): void {
        this.http.get(this.actionUrl + term)
            .map((res: Response) => {
                // Convert the result to CompleterItem[]
                let data = res.json();
                let matches: CompleterItem[] = data.map((personcity: any) => {
                    return {
                        title: personcity.name,
                        description: personcity.familyName + ", " + personcity.cityCountry,
                        originalObject: personcity
                    }
                });
                this.next(matches);
            })
            .subscribe();
    }

    public cancel() {
        // Handle cancel
    }
}

The HomeSearchComponent implements the paging for the search results and also displays the data. The SearchDataService implements the API calls to the ASP.NET Core MVC API service. The paging CSS uses Bootstrap to display the data.

import { Observable } from 'rxjs/Observable';
import { Component, OnInit } from '@angular/core';
import { Http } from '@angular/http';
import { SearchDataService } from '../services/searchDataService';
import { PersonCity } from '../model/personCity';
import { PersonCitySearchResult } from '../model/personCitySearchResult';
import { PersoncitysearchComponent } from '../personcitysearch/personcitysearch.component';

@Component({
    selector: 'homesearchcomponent',
    templateUrl: 'homesearch.component.html',
    providers: [SearchDataService]
})
export class HomeSearchComponent implements OnInit {

    public message: string;
    public PersonCitySearchData: PersonCitySearchResult;
    public SelectedTerm: string;
    public IndexExists: boolean = false;

    constructor(private _dataService: SearchDataService, private _personcitysearchComponent: PersoncitysearchComponent) {
        this.message = "Hello from HomeSearchComponent constructor";
        this.SelectedTerm = "none";
        this.PersonCitySearchData = new PersonCitySearchResult();
    }

    public onTermSelectedEvent(term: string) {
        this.SelectedTerm = term;
        this.findDataForSearchTerm(term, 0);
    }

    private findDataForSearchTerm(term: string, from: number) {
        console.log("findDataForSearchTerm:" + term);
        this._dataService.FindAllForTerm(term, from)
            .subscribe((data) => {
                console.log(data);
                this.PersonCitySearchData = data;
                this.configurePagingDisplay(this.PersonCitySearchData.hits);
            },
            error => console.log(error),
            () => {
                console.log('PersonCitySearch:findDataForSearchTerm completed');
            });
    }

    ngOnInit() {
        this._dataService
            .IndexExists()
            .subscribe(data => this.IndexExists = data,
                error => console.log(error),
                () => console.log('Get IndexExists complete'));
    }

    public ShowPaging: boolean = false;
    public CurrentPage: number = 0;
    public TotalHits: number = 0;
    public PagesCount: number = 0;
    public Pages: number[] = [];

    public LoadDataForPage(page: number) {
        var from = page * 10;
        this.findDataForSearchTerm(this.SelectedTerm, from);
        this.CurrentPage = page;
    }

    public NextPage() {
        var page = this.CurrentPage;
        console.log("TotalHits: " + this.TotalHits + " NextPage: " + ((this.CurrentPage + 1) * 10) + " CurrentPage: " + this.CurrentPage);
        if (this.TotalHits > ((this.CurrentPage + 1) * 10)) {
            page = this.CurrentPage + 1;
        }
        this.LoadDataForPage(page);
    }

    public PreviousPage() {
        var page = this.CurrentPage;
        if (this.CurrentPage > 0) {
            page = this.CurrentPage - 1;
        }
        this.LoadDataForPage(page);
    }

    private configurePagingDisplay(hits: number) {
        this.PagesCount = Math.floor(hits / 10);
        this.Pages = [];
        for (let i = 0; i <= this.PagesCount; i++) {
            this.Pages.push(i);
        }

        this.TotalHits = hits;

        if (this.PagesCount <= 1) {
            this.ShowPaging = false;
        } else {
            this.ShowPaging = true;
        }
    }
}
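The paging calculation in configurePagingDisplay can be isolated as a pure function, which makes the behaviour easier to test independently of the component. This sketch mirrors the logic above (10 hits per page, paging only shown for more than one page); the function and interface names are illustrative.

```typescript
interface PagingDisplay {
    pages: number[];
    showPaging: boolean;
}

// Mirror of configurePagingDisplay: floor(hits / 10) gives the number of
// extra pages; paging is only shown when there is more than one page.
function computePagingDisplay(hits: number): PagingDisplay {
    const pagesCount = Math.floor(hits / 10);
    const pages: number[] = [];
    for (let i = 0; i <= pagesCount; i++) {
        pages.push(i);
    }
    return { pages, showPaging: pagesCount > 1 };
}
```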

Now when characters are entered into the search input, records are searched for and returned together with the number of hits for the term.

searchaspnetcoreangular2_01

The paging can also be used to page through the results on the server side.

searchaspnetcoreangular2_02

The search functions like the web searches we have come to expect. If different results or searches are required, the server-side index creation and query types can be changed as needed. For example, the autocomplete suggestions could be replaced with a fuzzy search or a query string search.

Links:

https://github.com/oferh/ng2-completer

https://github.com/damienbod/Angular2WebpackVisualStudio

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html

https://www.elastic.co/products/elasticsearch

https://www.nuget.org/packages/ElasticsearchCRUD/

https://github.com/damienbod/ElasticsearchCRUD

http://www.bilyachat.com/2015/07/search-like-google-with-elasticsearch.html

http://stackoverflow.com/questions/29753971/elasticsearch-completion-suggest-search-with-multiple-word-inputs

http://rea.tech/implementing-autosuggest-in-elasticsearch/

https://qbox.io/blog/an-introduction-to-ngrams-in-elasticsearch

https://www.elastic.co/guide/en/elasticsearch/guide/current/_ngrams_for_partial_matching.html

Angular autocomplete with ASP.NET Core and Elasticsearch

This article shows how autocomplete could be implemented in Angular using ASP.NET Core MVC as a data service. The API uses Elasticsearch to query the data requests. ng2-completer is used to implement the Angular autocomplete functionality.

Code: https://github.com/damienbod/Angular2AutoCompleteAspNetCoreElasticsearch

2017.02.10: Updated to VS2017, Angular 2.4.6 and webpack 2.2.1
2017.01.07: Updated to csproj, webpack 2.2.0-rc.3, angular 2.4.1

To use autocomplete in the Angular 2 application, the ng2-completer package needs to be added to the dependencies in the npm package.json file.

 "ng2-completer": "1.0.0" 

This project uses Webpack to build the Angular application. All vendor packages are added to the vendor.ts file, which can then be used throughout the application. The ng2-completer package is added to the vendor.ts file, which is then built using Webpack.

import 'ng2-completer';

import 'jquery/src/jquery';
import 'bootstrap/dist/js/bootstrap';

import 'bootstrap/dist/css/bootstrap.css';
import 'bootstrap/dist/css/bootstrap-theme.css';

import '../favicon.ico';

PersonCity is used as the data model for the autocomplete. The server side of the application uses the PersonCity model to store and search for data.

export class PersonCity {
    public id: number;
    public name: string;
    public info: string;
    public familyName: string;
}

The ng2-completer autocomplete is used within the PersonCityAutocompleteSearchComponent. This component returns a PersonCity object to the consuming component. When a new search request has finished, the @Output bindModelPersonCityChange is updated. The @Output is chained to the onPersonCitySelected event from ng2-completer.

A custom CompleterService, PersoncityautocompleteDataService, is used to request the data from the server.

import { Component, Inject, EventEmitter, Input, Output, OnInit, AfterViewInit, ElementRef } from '@angular/core';
import { Http, Response } from "@angular/http";
import { Subscription } from 'rxjs/Subscription';
import { Observable } from 'rxjs/Observable';
import { Router } from '@angular/router';
import { Configuration } from '../app.constants';
import { PersoncityautocompleteDataService } from './personcityautocompleteService';
import { PersonCity } from '../model/personCity';
import { CompleterService, CompleterItem } from 'ng2-completer';

import './personcityautocomplete.component.scss';

@Component({
    selector: 'personcityautocomplete',
    template: `
        <ng2-completer [dataService]="dataService" (selected)="onPersonCitySelected($event)" [minSearchLength]="0" [disableInput]="disableAutocomplete"></ng2-completer>
    `
})
export class PersoncityautocompleteComponent implements OnInit {

    constructor(private completerService: CompleterService, private http: Http, private _configuration: Configuration) {
        this.dataService = new PersoncityautocompleteDataService(http, _configuration);
        ////completerService.local("name, info, familyName", 'name');
    }

    @Output() bindModelPersonCityChange = new EventEmitter<PersonCity>();
    @Input() bindModelPersonCity: PersonCity;
    @Input() disableAutocomplete: boolean = false;

    private searchStr: string;
    private dataService: PersoncityautocompleteDataService;

    ngOnInit() {
        console.log("ngOnInit PersoncityautocompleteComponent");
    }

    public onPersonCitySelected(selected: CompleterItem) {
        console.log(selected);
        this.bindModelPersonCityChange.emit(selected.originalObject);
    }
}

The PersoncityautocompleteDataService extends Subject<CompleterItem[]> and implements the CompleterData interface, as described in the ng2-completer documentation. When PersonCity items are returned from the service, the results are mapped to CompleterItem items as required. This could also be done on the server, in which case the default remote service could be used. By using the custom service, it can easily be extended to add the security headers required by the data service.
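The mapping from PersonCity JSON to CompleterItem entries can be expressed as a pure function, which makes the conversion testable independently of the HTTP plumbing. A minimal sketch, assuming only the fields used by the data service (the function and interface names are illustrative):

```typescript
// Shape ng2-completer expects for each suggestion entry.
interface CompleterItemLike {
    title: string;
    description: string;
    originalObject: any;
}

// Convert raw PersonCity JSON objects into completer items.
function toCompleterItems(data: any[]): CompleterItemLike[] {
    return data.map((personcity: any) => ({
        title: personcity.name,
        description: personcity.familyName + ", " + personcity.cityCountry,
        originalObject: personcity
    }));
}
```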

import { Http, Response } from "@angular/http";
import { Subject } from "rxjs/Subject";
import { CompleterData, CompleterItem } from 'ng2-completer';
import { Configuration } from '../app.constants';

export class PersoncityautocompleteDataService extends Subject<CompleterItem[]> implements CompleterData {

    constructor(private http: Http, private _configuration: Configuration) {
        super();
        this.actionUrl = _configuration.Server + 'api/personcity/querystringsearch/';
    }

    private actionUrl: string;

    public search(term: string): void {
        this.http.get(this.actionUrl + term)
            .map((res: Response) => {
                // Convert the result to CompleterItem[]
                let data = res.json();
                let matches: CompleterItem[] = data.map((personcity: any) => {
                    return {
                        title: personcity.name,
                        description: personcity.familyName + ", " + personcity.cityCountry,
                        originalObject: personcity
                    }
                });
                this.next(matches);
            })
            .subscribe();
    }

    public cancel() {
        // Handle cancel
    }
}

The PersonCityAutocompleteSearchComponent also implements its specific styles in the personcityautocomplete.component.scss file. The ng2-completer component comes with CSS classes which can be extended or overwritten.

.completer-input {
    width: 500px;
    display: block;
    height: 34px;
    padding: 6px 12px;
    font-size: 14px;
    line-height: 1.42857143;
    color: #555;
    background-color: #fff;
    background-image: none;
    border: 1px solid #ccc;
    border-radius: 4px;
    -webkit-box-shadow: inset 0 1px 1px rgba(0, 0, 0, .075);
    box-shadow: inset 0 1px 1px rgba(0, 0, 0, .075);
    -webkit-transition: border-color ease-in-out .15s, -webkit-box-shadow ease-in-out .15s;
    -o-transition: border-color ease-in-out .15s, box-shadow ease-in-out .15s;
    transition: border-color ease-in-out .15s, box-shadow ease-in-out .15s;
}

.completer-dropdown {
    width: 480px !important;
}

ASP.NET Core MVC API

The PersonCityController MVC Controller implements the service which is used by the Angular 2 application. This service implements the Search action method which uses the IPersonCitySearchProvider to search for the data. Helper methods to create and add some documents to Elasticsearch are also implemented so that the search service can be tested.

using Microsoft.AspNetCore.Mvc;

namespace Angular2AutoCompleteAspNetCoreElasticsearch.Controllers
{
    [Route("api/[controller]")]
    public class PersonCityController : Controller
    {
        private readonly IPersonCitySearchProvider _personCitySearchProvider;

        public PersonCityController(IPersonCitySearchProvider personCitySearchProvider)
        {
            _personCitySearchProvider = personCitySearchProvider;
        }

        [HttpGet("search/{searchtext}")]
        public IActionResult Search(string searchtext)
        {
            return Ok(_personCitySearchProvider.QueryString(searchtext));
        }

        [HttpGet("createindex")]
        public IActionResult CreateIndex()
        {
            _personCitySearchProvider.CreateIndex();
            return Created("http://localhost:5000/api/PersonCity/createindex/", "index created");
        }

        [HttpGet("createtestdata")]
        public IActionResult CreateTestData()
        {
            _personCitySearchProvider.CreateTestData();
            return Created("http://localhost:5000/api/PersonCity/createtestdata/", "test data created");
        }

        [HttpGet("indexexists")]
        public IActionResult GetElasticsearchStatus()
        {
            return Ok(_personCitySearchProvider.GetStatus());
        }
    }
}

The ElasticsearchCRUD NuGet package is used to access Elasticsearch. The PersonCitySearchProvider implements this logic. NEST could also be used; only the PersonCitySearchProvider implementation would need to be changed to support this.

 "ElasticsearchCRUD": "2.4.1.1" 

The PersonCitySearchProvider class implements the IPersonCitySearchProvider interface which is used in the MVC controller. The IPersonCitySearchProvider needs to be added to the services in the Startup class. The search uses a QueryStringQuery search with wildcards. Any other query or aggregation could be used here, depending on the search requirements.

using System.Collections.Generic;
using System.Linq;
using ElasticsearchCRUD;
using ElasticsearchCRUD.ContextAddDeleteUpdate.IndexModel.SettingsModel;
using ElasticsearchCRUD.Model.SearchModel;
using ElasticsearchCRUD.Model.SearchModel.Queries;
using ElasticsearchCRUD.Tracing;

namespace Angular2AutoCompleteAspNetCoreElasticsearch
{
    public class PersonCitySearchProvider : IPersonCitySearchProvider
    {
        private readonly IElasticsearchMappingResolver _elasticsearchMappingResolver = new ElasticsearchMappingResolver();
        private const string ConnectionString = "http://localhost:9200";
        private readonly ElasticsearchContext _context;

        public PersonCitySearchProvider()
        {
            _context = new ElasticsearchContext(ConnectionString, new ElasticsearchSerializerConfiguration(_elasticsearchMappingResolver))
            {
                TraceProvider = new ConsoleTraceProvider()
            };
        }

        public IEnumerable<PersonCity> QueryString(string term)
        {
            var results = _context.Search<PersonCity>(BuildQueryStringSearch(term));
            return results.PayloadResult.Hits.HitsResult.Select(t => t.Source);
        }

        /// <summary>
        /// TODO protect against injection!
        /// </summary>
        /// <param name="term"></param>
        /// <returns></returns>
        private Search BuildQueryStringSearch(string term)
        {
            var names = "";
            if (term != null)
            {
                names = term.Replace("+", " OR *");
            }

            var search = new Search
            {
                Query = new Query(new QueryStringQuery(names + "*"))
            };

            return search;
        }

        public bool GetStatus()
        {
            return _context.IndexExists<PersonCity>();
        }

        public void CreateIndex()
        {
            _context.IndexCreate<PersonCity>(new IndexDefinition());
        }

        public void CreateTestData()
        {
            PersonCityData.CreateTestData();

            foreach (var item in PersonCityData.Data)
            {
                _context.AddUpdateDocument(item, item.Id);
            }

            _context.SaveChanges();
        }
    }
}
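The TODO in BuildQueryStringSearch points at a real issue: the raw term ends up in a query_string query, and its reserved characters (+ - = && || > < ! ( ) { } [ ] ^ " ~ * ? : \ /) can change the query semantics or cause parse errors. One hedged sketch of neutralising them, based on the reserved character list in the Elasticsearch query_string documentation (the helper name is illustrative, not project code):

```typescript
// Escape characters that are reserved in the Elasticsearch query_string
// syntax. Per the documentation, < and > cannot be escaped at all, so they
// are removed; every other reserved character is prefixed with a backslash.
function escapeQueryStringTerm(term: string): string {
    return term
        .replace(/[<>]/g, "")
        .replace(/[+\-=&|!(){}\[\]^"~*?:\\\/]/g, "\\$&");
}
```

The same idea could be applied in C# before the term is passed to QueryStringQuery.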

When the application is started, the autocomplete is deactivated as no index exists.

angular2autocompleteaspnetcoreelasticsearch_01

Once the index exists, data can be added to the Elasticsearch index.
angular2autocompleteaspnetcoreelasticsearch_02

And the autocomplete can be used.

angular2autocompleteaspnetcoreelasticsearch_03

Links:

https://github.com/oferh/ng2-completer

https://github.com/damienbod/Angular2WebpackVisualStudio

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html

https://www.elastic.co/products/elasticsearch

https://www.nuget.org/packages/ElasticsearchCRUD/

https://github.com/damienbod/ElasticsearchCRUD