A client asked me to rewrite a geospatial REST API from Node/Express to Elixir/Cowboy. The front end was written in React and exchanges JSON with the API. The geospatial information is stored in PostGIS, a spatial database extension for Postgres.

Since the front end and the back end communicate via JSON, there is no need to load or unload binary objects from the database. We only need the string representation of geometry objects: the classic WKT (Well-Known Text) representation.

Elixir's data-mapping library is Ecto, but working with it would have meant creating many schemas and, at times, working with binaries. The app has 5 modules, each with roughly 10 SQL queries, so rewriting almost 50 well-written and optimized queries was not on my mind.
Searching the web, I found an interesting article that became my main source of inspiration: Elixir-Without-Ecto
Table of Contents
- 1) Introduction
- 2) Working with geospatial data
- 3) Configure and Start
- 4) Postgrex Query Helpers
- 5) Named Parameters
1) Introduction
Query Helpers
The back end used pg-promise, so the plan was to create some helper functions to mimic its behaviour:
```elixir
# For queries that must return exactly 1 row;
# raises if there are 0 or more than 1 rows
row = DB.one!(sql, data_map)

# For queries that must return exactly 1 row;
# returns an error tuple if there are 0 or more than 1 rows
{status, row} = DB.one(sql, data_map)

# For queries that return 0 or multiple rows
rows = DB.any!(sql, data_map)

# Get the number of affected rows for UPDATE, INSERT, DELETE
affected = DB.affected!(sql, values)

# For UPDATE, INSERT, DELETE, stored procedures and functions;
# returns a map with operation status "fail" or "success"
status = DB.status(sql, values)
```
Working with named parameters
Another very useful pg-promise feature is working with named parameters stored in JSON.

With them we avoid using lists of values with positional references: $1, $2.

In Elixir, JSON is stored in maps, and named parameters let us write things like this:
```elixir
iex> sql = "INSERT INTO books(name, author, pages) VALUES($(name), $(author), $(pages));"
iex> values = %{
...>   "name" => "Lord of the Rings",
...>   "author" => "John Ronald Reuel Tolkien",
...>   "pages" => 500
...> }
iex> DB.status(sql, values)
%{status: "success"}
```
Obtain columns as maps
Postgrex returns every row as a list of values. If we run the following query:

```elixir
iex> sql = "SELECT 1 AS id, 'Jhon' AS name;"
iex> Postgrex.query!(:mydb, sql, [])
```

we get:

```elixir
%Postgrex.Result{
  columns: ["id", "name"],
  command: :select,
  connection_id: 5849,
  messages: [],
  num_rows: 1,
  rows: [[1, "Jhon"]]
}
```
But sometimes it is more useful to have a list of maps, where every column has an access key:

```elixir
iex> sql = "SELECT 1 AS id, 'Jhon' AS name;"
iex> DB.one!(sql, %{})
```

This gives:

```elixir
%{"id" => 1, "name" => "Jhon"}
```
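The underlying transformation is just a zip of the column names with each row's values. A minimal standalone sketch, using a plain map in place of a real `%Postgrex.Result{}`:

```elixir
# Turn a Postgrex-style result (list of column names plus a list of
# row value lists) into a list of maps keyed by column name.
result = %{columns: ["id", "name"], rows: [[1, "Jhon"], [2, "Ana"]]}

rows_as_maps =
  Enum.map(result.rows, fn row ->
    result.columns |> Enum.zip(row) |> Map.new()
  end)

# rows_as_maps == [%{"id" => 1, "name" => "Jhon"}, %{"id" => 2, "name" => "Ana"}]
```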
2) Working with geospatial data
Some examples with geometries:
Select:
```elixir
def select_field(id) do
  sql = """
  SELECT json_build_object(
    'id', id::text,
    'name', name,
    'centroid', st_AsGeoJSON(st_centroid(boundary))::json,
    'area', to_char(ST_Area(boundary::geography)/10000, 'FM999999999.00') || 'ha',
    'boundary', st_AsGeoJSON(boundary)::json
  ) AS field
  FROM fields
  WHERE id = $(id);
  """

  DB.one!(sql, %{"id" => id})
end
```
Since `DB.one!` returns a single row, the result is a map ready to encode and send:

```elixir
%{
  "field" => %{
    "id" => "14",
    "name" => "Field name here",
    "area" => "28.18ha",
    "boundary" => %{
      "type" => "Polygon",
      "coordinates" => [
        [
          [-63.69112612, -31.394299467],
          [-63.70054825, -31.393974797],
          [-63.69087755, -31.39156248],
          [-63.69094669, -31.39416452],
          [-63.69112612, -31.394299467]
        ]
      ]
    },
    "centroid" => %{
      "type" => "Point",
      "coordinates" => [-63.695616434, -31.392755962]
    }
  }
}
```
Insert a Polygon:
Load geometries to the database:
```elixir
def insert_field() do
  json_polygon = %{
    "type" => "Polygon",
    "coordinates" => [
      [
        [-63.694052, -31.389509],
        [-63.694352, -31.395883],
        [-63.691005, -31.396286],
        [-63.690061, -31.389839],
        [-63.694052, -31.389509]
      ]
    ]
  }

  values = %{"name" => "Soy Beans", "polygon" => json_polygon}

  sql = "INSERT INTO fields(name, boundary) VALUES($(name), ST_GeomFromGeoJSON($(polygon)));"
  DB.status(sql, values)
end
```
This returns:

```elixir
%{status: "success"}
```
Update some points:
```elixir
def update_locations() do
  values = %{"id" => [10, 11, 12], "lon" => -63.694052, "lat" => -31.389509, "srid" => 4326}

  sql = "UPDATE locations SET point = ST_SetSRID(ST_Point($(lon), $(lat)), $(srid)) WHERE id = ANY($(id));"
  IO.inspect(DB.affected!(sql, values), label: "Rows affected")
end
```
Running this code prints:

```
Rows affected: 3
```
3) Configure and Start
The first thing to do is configure access to Postgres and then start the process.
Create the file config/config.exs. The name key is really important: it will be used as the reference to the running process.
```elixir
# file: config/config.exs
import Config

config :postgrex,
  config: [
    name: :mydb,
    hostname: "localhost",
    username: "user",
    password: "pasw",
    database: "gis",
    port: 5432
  ]
```
Start the process
```elixir
defmodule AppGis.Application do
  use Application

  @impl true
  def start(_type, _args) do
    dbconfig = Application.fetch_env!(:postgrex, :config)

    children = [
      {Postgrex, dbconfig}
      # ...other children go here
    ]

    opts = [strategy: :one_for_one, name: AppGis.Supervisor]
    Supervisor.start_link(children, opts)
  end
end
```
4) Postgrex Query Helpers
```elixir
defmodule DB do
  defmodule DBError do
    defexception message: "", plug_status: 422
  end

  # For argument values inside a map
  def query!(sql, values_map) when is_map(values_map) do
    {sql, values} = replace(sql, values_map)
    Postgrex.query!(:mydb, sql, values)
  end

  # For argument values in a list
  def query!(sql, values) do
    Postgrex.query!(:mydb, sql, values)
  end

  # For queries that return 0 or multiple rows
  def any!(sql, values) do
    case query!(sql, values) do
      %{rows: []} ->
        nil

      %{rows: rows, columns: columns} ->
        Enum.map(rows, fn row -> mapify(columns, row) end)
    end
  end

  # For queries that must return exactly 1 row;
  # raises if there are 0 or more than 1 rows
  def one!(sql, values) do
    case query!(sql, values) do
      %{rows: []} -> raise DBError, message: "returned 0 rows"
      %{rows: [row], columns: columns} -> mapify(columns, row)
      _ -> raise DBError, message: "returned multiple rows"
    end
  end

  # For queries that must return exactly 1 row;
  # returns an error tuple if there are 0 or more than 1 rows
  def one(sql, values) do
    case query!(sql, values) do
      %{rows: [row], columns: columns} -> {:ok, mapify(columns, row)}
      _ -> {:error, nil}
    end
  end

  # For UPDATE, INSERT, DELETE: number of affected rows
  def affected!(sql, values), do: query!(sql, values).num_rows

  # For UPDATE, INSERT, DELETE: a map with the operation status
  def status(sql, values) do
    case query!(sql, values).num_rows do
      0 -> %{status: "fail"}
      _ -> %{status: "success"}
    end
  end

  # Transform column names and row values into %{"col" => value}
  defp mapify(columns, row) do
    columns |> Enum.zip(row) |> Map.new()
  end
end
```
5) Named Parameters
In order to work with named parameters, a small function is written.

Basically, it receives:

```elixir
values = %{"name" => "Jhon", "phone" => "(555) 456 789 522"}
sql = "INSERT INTO clients(name, phone) VALUES($(name), $(phone));"
```
and transforms it into:

```elixir
values = ["Jhon", "(555) 456 789 522"]
sql = "INSERT INTO clients(name, phone) VALUES($1, $2);"
```
One thing to keep in mind is that Postgrex must be fed the exact number of arguments: not one more, not one less.
If a value is a map, it is encoded to json.
```elixir
# These functions live inside the DB module
def replace(sql, mapvalues) do
  # Pattern to find keys: $(keyname)
  pattern = ~r/\$\(([^)]+)\)/

  # Get the unique keys, in order of first appearance
  keys = Enum.uniq(Enum.map(Regex.scan(pattern, sql), fn [_ | [tail]] -> tail end))

  # Build the values list in the same order
  values = Enum.map(keys, fn k -> encode(Map.get(mapvalues, k)) end)

  # Replace each key name with its associated index
  sql =
    String.replace(sql, pattern, fn match ->
      key = String.replace(match, ["$", "(", ")", " "], "")
      idx = Enum.find_index(keys, fn x -> x == key end) + 1
      "$#{idx}"
    end)

  {sql, values}
end

def encode(v) when is_map(v) do
  {:ok, str} = Jason.encode(v)
  str
end

def encode(v), do: v
```
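To see the substitution in isolation, here is a minimal, dependency-free sketch of the same idea (the JSON-encoding step is left out so it runs without Jason):

```elixir
# Keys like $(name) are replaced by positional $1, $2... and the values
# are collected in matching order.
replace = fn sql, mapvalues ->
  pattern = ~r/\$\(([^)]+)\)/
  keys = Enum.uniq(Enum.map(Regex.scan(pattern, sql), fn [_, key] -> key end))
  values = Enum.map(keys, &Map.get(mapvalues, &1))

  sql =
    String.replace(sql, pattern, fn match ->
      key = String.replace(match, ["$", "(", ")", " "], "")
      "$#{Enum.find_index(keys, &(&1 == key)) + 1}"
    end)

  {sql, values}
end

{sql, values} =
  replace.(
    "INSERT INTO clients(name, phone) VALUES($(name), $(phone));",
    %{"name" => "Jhon", "phone" => "(555) 456 789 522"}
  )

# sql    == "INSERT INTO clients(name, phone) VALUES($1, $2);"
# values == ["Jhon", "(555) 456 789 522"]
```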