This article was originally published on Codica blog.
Today, we are going to provide you with the second part of the article How to Implement Elasticsearch When Developing a Rails Web App. Here we will discuss the process of Elasticsearch implementation, adding functionality, and how to make the search available via an API.
Let’s get started.
Step #3: Using Elasticsearch with Rails
To integrate Elasticsearch into the Rails app, add the following gems to the Gemfile:
gem 'elasticsearch-model'
gem 'elasticsearch-rails'
To install these gems, run bundle install.
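By default, both gems talk to an Elasticsearch server at http://localhost:9200. If your cluster runs elsewhere, you can point the client to it in an initializer, for example config/initializers/elasticsearch.rb. A small sketch (the ELASTICSEARCH_URL variable name is only an assumption):

# config/initializers/elasticsearch.rb
Elasticsearch::Model.client = Elasticsearch::Client.new(
  url: ENV.fetch('ELASTICSEARCH_URL', 'http://localhost:9200')
)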
Now it’s time to add the actual search functionality to the location model. We will use so-called concerns.
Create a new app/models/concerns/searchable.rb file and add the following code snippet:
module Searchable
  extend ActiveSupport::Concern

  included do
    include Elasticsearch::Model
    include Elasticsearch::Model::Callbacks
  end
end
Include the created module in the location model:
class Location < ApplicationRecord
  include Searchable
end
The steps we reproduce are as follows:
- With the Elasticsearch::Model module, we add Elasticsearch integration to the model.
- With Elasticsearch::Model::Callbacks, we add callbacks that keep the index in sync whenever a record changes (a quick illustration follows the list).
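Thanks to the callbacks module, changes to records are propagated to the index automatically. A quick illustration (name and level are the attributes used throughout this article):

location = Location.create!(name: 'san diego', level: 'city')
# the document is indexed right after the record is saved

location.update!(name: 'san jose')
# the document is re-indexed with the new name

location.destroy!
# the document is removed from the index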
Now, let’s index our model.
We open the Rails console with rails c and launch Location.import force: true.
The force: true option forces index creation: the index is built even if it does not exist yet, and an existing index is deleted and recreated. To make sure the index has been created, open the Kibana dev tools at http://localhost:5601/ and run GET _cat/indices?v.
So, we have built an index named locations.
The index was created automatically, which means the default configuration was applied to all fields.
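If you are curious what that default configuration looks like, you can inspect the generated mapping in the Kibana dev tools (the exact output depends on your Elasticsearch version):

GET locations/_mapping

Dynamically mapped string attributes usually end up as plain text fields with a keyword sub-field, which is not ideal for autocomplete, and that is why we will build a custom index in the next step.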
Now we are going to write a test query. You can find more information about the Elasticsearch Query DSL here.
Open the Kibana dev tools at http://localhost:5601 and insert the snippet:
GET locations/_search
{
  "query": {
    "match_all": {}
  }
}
Pay attention to the hits attribute of the response JSON and especially its _source attribute. Note that, at this point, all fields of the Location model have been serialized and indexed.
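For reference, the response is shaped roughly like this (heavily trimmed, values are illustrative):

{
  "hits": {
    "total": ...,
    "hits": [
      {
        "_id": "...",
        "_source": { "name": "san francisco", ... }
      }
    ]
  }
}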
To make a test query through the Rails app, open the rails c console and insert the following:
results = Location.search('san')
results.map(&:name) # => ["san francisco", "american samoa"]
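The value returned by search is not a plain array but an Elasticsearch::Model response wrapper, so you can also dig into the raw hits. A small sketch using methods from the elasticsearch-model gem:

response = Location.search('san')

response.results.first._source # the raw indexed document for the first hit
response.records.first         # the corresponding ActiveRecord Location object
response.response              # the full Elasticsearch response hash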
Step #4: Creating a custom index with autocomplete functionality
At this step, we have to delete the previous index and then create a new one. To do this, open the Rails console with rails c and run Location.__elasticsearch__.delete_index!.
Now it’s time to edit the app/models/concerns/searchable.rb file. It should look like this:
module Searchable
  extend ActiveSupport::Concern

  included do
    include Elasticsearch::Model
    include Elasticsearch::Model::Callbacks

    def as_indexed_json(_options = {})
      as_json(only: %i[name level])
    end

    settings settings_attributes do
      mappings dynamic: false do
        indexes :name, type: :text, analyzer: :autocomplete
        indexes :level, type: :keyword
      end
    end

    def self.search(query, filters)
      set_filters = lambda do |context_type, filter|
        @search_definition[:query][:bool][context_type] |= [filter]
      end

      @search_definition = {
        size: 5,
        query: {
          bool: {
            must: [],
            should: [],
            filter: []
          }
        }
      }

      if query.blank?
        set_filters.call(:must, match_all: {})
      else
        set_filters.call(
          :must,
          match: {
            name: {
              query: query,
              fuzziness: 1
            }
          }
        )
      end

      if filters[:level].present?
        set_filters.call(:filter, term: { level: filters[:level] })
      end

      __elasticsearch__.search(@search_definition)
    end
  end

  class_methods do
    def settings_attributes
      {
        index: {
          analysis: {
            analyzer: {
              autocomplete: {
                type: :custom,
                tokenizer: :standard,
                filter: %i[lowercase autocomplete]
              }
            },
            filter: {
              autocomplete: {
                type: :edge_ngram,
                min_gram: 2,
                max_gram: 25
              }
            }
          }
        }
      }
    end
  end
end
In this snippet, we serialize our model attributes to JSON with the as_indexed_json method.
In fact, we will work with only two fields, name and level:
def as_indexed_json(_options = {})
  as_json(only: %i[name level])
end
Let’s define the index configuration:
settings settings_attributes do
  mappings dynamic: false do
    # we use the custom autocomplete analyzer that we define in settings_attributes below
    indexes :name, type: :text, analyzer: :autocomplete
    indexes :level, type: :keyword
  end
end
def settings_attributes
  {
    index: {
      analysis: {
        analyzer: {
          # we define a custom analyzer with the name autocomplete
          autocomplete: {
            # type should be custom for custom analyzers
            type: :custom,
            # we use the standard tokenizer
            tokenizer: :standard,
            # we apply two token filters
            # autocomplete is the custom filter that we define below
            filter: %i[lowercase autocomplete]
          }
        },
        filter: {
          # we define a custom token filter with the name autocomplete
          autocomplete: {
            type: :edge_ngram,
            min_gram: 2,
            max_gram: 25
          }
        }
      }
    }
  }
end
Here we define an autocomplete custom analyzer with the standard tokenizer and two token filters: lowercase and our custom autocomplete filter.
The autocomplete filter is of the edge_ngram type. An edge_ngram filter splits the text into smaller parts (grams), starting from the beginning of each word. For instance, the word "ruby" will be divided into ["ru", "rub", "ruby"].
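If you want to see the generated grams yourself, the _analyze API is handy. A quick sketch using the low-level client exposed by elasticsearch-model (assuming the locations index has already been rebuilt with the settings above):

Location.__elasticsearch__.client.indices.analyze(
  index: 'locations',
  body: { analyzer: 'autocomplete', text: 'ruby' }
)
# the returned tokens should include "ru", "rub" and "ruby"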
Edge n-grams come in handy when implementing autocomplete functionality. Still, there is one more way to achieve it: the completion suggester approach.
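We do not switch to the suggester in this article, but for a sense of what it involves, a hypothetical mapping could look like this with the same DSL (not part of our implementation):

mappings dynamic: false do
  # the completion type powers the dedicated suggest API instead of regular queries
  indexes :name, type: :completion
end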
In the mappings, we index the name and level fields. The keyword data type is applied to the level field, while the name field uses the text data type together with our custom autocomplete analyzer.
And now, let’s take a closer look at the search method we apply:

def self.search(query, filters)
  # a lambda function adds conditions to a search definition
  set_filters = lambda do |context_type, filter|
    @search_definition[:query][:bool][context_type] |= [filter]
  end

  @search_definition = {
    # we indicate that there should be no more than 5 documents to return
    size: 5,
    # we define an empty query with the ability to
    # dynamically change the definition
    # Query DSL https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html
    query: {
      bool: {
        must: [],
        should: [],
        filter: []
      }
    }
  }

  # match all documents if there is no search query
  if query.blank?
    set_filters.call(:must, match_all: {})
  else
    set_filters.call(
      :must,
      match: {
        name: {
          query: query,
          # fuzziness means you can make one typo and still match your document
          fuzziness: 1
        }
      }
    )
  end

  # the system will return only those documents that pass this filter
  if filters[:level].present?
    set_filters.call(:filter, term: { level: filters[:level] })
  end

  __elasticsearch__.search(@search_definition)
end
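Keep in mind that the new settings and mappings only take effect on a freshly created index, so after deleting the old index, import the data again just as in step 3:

Location.import force: true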
Open the Rails console and check the request to make sure the project works properly:
rails c
results = Location.search('san francisco', {})
results.map(&:name) # => ["san francisco", "american samoa"]
Let’s also verify that the search works correctly when the request contains a few typos:
results = Location.search('Asan francus', {})
results.map(&:name) # => ["san francisco"]
We have one filter defined: it filters locations by level. The database contains two objects with the same name, new york, which have different levels: one is a state and the other is a city. Without the filter, the search returns both; with the level filter, only the city remains:

results = Location.search('new york', {})
results.map { |result| { name: result.name, level: result.level } }
# [{:name=>"new york", :level=>"state"}, {:name=>"new york", :level=>"city"}]

results = Location.search('new york', { level: 'city' })
results.map { |result| { name: result.name, level: result.level } }
# [{:name=>"new york", :level=>"city"}]
Step #5: Making the search request available via API
In the final stage, we are going to create a controller through which the search queries will pass:
rails generate controller Home search
For this purpose, open app/controllers/home_controller.rb and insert the following snippet in it:
class HomeController < ApplicationController
  def search
    results = Location.search(search_params[:q], search_params)

    locations = results.map do |r|
      r.merge(r.delete('_source')).merge('id': r.delete('_id'))
    end

    render json: { locations: locations }, status: :ok
  end

  private

  def search_params
    params.permit(:q, :level)
  end
end
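The controller generator above also adds a route for the action to config/routes.rb. If you set the controller up manually, a route like this is needed (a minimal sketch):

Rails.application.routes.draw do
  get 'home/search', to: 'home#search'
end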
To see it in action, run the Rails server with rails s and then navigate to http://localhost:3000/home/search?q=new&level=state.
This request asks for all locations whose name contains "new" and whose level equals state.
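With the sample data used earlier, the endpoint should respond with JSON shaped roughly like this (id and scoring metadata are trimmed here):

{
  "locations": [
    { "name": "new york", "level": "state", "id": "...", ... }
  ]
}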
Our test Rails web app is ready, with the basic search functionality implemented.
Conclusion
In this second part of the article, we covered Elasticsearch integration with a Rails web app, and we hope that our tutorial was helpful.
Stay tuned and read the full article version here: How to Implement Elasticsearch When Developing a Rails Web App.