(This post is part of the finding-odin series.)
Read the introduction: Finding my cat Odin using Python and Elasticsearch.
Before beginning development of any kind, I like to map out the components of the application and narrow down their function. To avoid introducing too much complexity too soon, I keep the scope of the overall application small. This practice is known as prototyping.
Provided below is a rough sketch of the architecture for the prototype:
I have divided the components of the architecture into two categories:
- The Flask app
- The data stores
Flask is a microframework for developing web applications with the Python programming language. Flask lets us separate major sections of our application into "blueprints". Each blue box in the sketch represents a blueprint that will exist in our application. Let's elaborate on the function of each blueprint.
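To make the blueprint idea concrete, here is a minimal sketch of how one blueprint might be defined and registered. The blueprint name, URL prefix, and route are placeholders for illustration, not the final design.

```python
from flask import Flask, Blueprint

# A hypothetical "api" blueprint; each blue box in the sketch
# would get its own blueprint module like this one.
api = Blueprint("api", __name__, url_prefix="/api")

@api.route("/ping")
def ping():
    # Flask serializes a returned dict to a JSON response.
    return {"status": "ok"}

def create_app():
    # The application factory registers each blueprint on the app.
    app = Flask(__name__)
    app.register_blueprint(api)
    return app
```

Keeping each major section in its own blueprint lets us develop and test the API, front end, and scraper independently while still serving them from a single Flask application.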
The API will facilitate communication between the front end and the data stores. We will attempt to keep the API as basic as possible and offload processing to the scrapers and data stores.
The front end will be the portion of the application visible to the end user. We will most likely use Vue.js as the front-end framework, though we may fall back to the simpler Backbone.js.
The scraper blueprint will be in charge of downloading and parsing the Austin Animal Center lost and found listings. Once parsing is complete, the parsed fields will be stored in Elasticsearch. We will use either Requests + PyQuery or the Scrapy scraping framework to accomplish this.
The data store section of our architecture consists of the software in charge of storing long-term, persistent data. This is sometimes called the "database layer" or the "persistence layer" of an application. The two data stores we will use for this application are SQLite and Elasticsearch. Let's discuss how each will be used.
Elasticsearch will be used to store the fields for the cats found in the Austin Animal Center database. The search mechanisms built into Elasticsearch will make it relatively easy to score searches based upon relevance. When an animal matching the criteria specified by a user is added to an Elasticsearch index, the application should trigger a notification.
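As a sketch of what a relevance-scored search might look like, here is a function that builds an Elasticsearch query body. The index and field names (`color`, `breed`, `found_location`, `animal_type`) are assumptions about what the scraper will extract, not a final schema.

```python
def build_cat_query(description, animal_type="Cat"):
    """Build an Elasticsearch query body: full-text match across
    the assumed listing fields, scored by relevance, filtered to
    a single animal type.
    """
    return {
        "query": {
            "bool": {
                # multi_match scores documents by how well they
                # match the user's free-text description.
                "must": {
                    "multi_match": {
                        "query": description,
                        "fields": ["color", "breed", "found_location"],
                    }
                },
                # A filter narrows results without affecting the score.
                "filter": {"term": {"animal_type": animal_type}},
            }
        }
    }
```

With the elasticsearch-py client, this body would be passed to something like `es.search(index="found-cats", body=build_cat_query("gray tabby"))`, where `found-cats` is a hypothetical index name.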
SQLite will be used to store user data from the front end. This includes information like account credentials and descriptions of the pets they are looking for.
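A minimal sketch of that schema, using Python's built-in `sqlite3` module, might look like the following. The table and column names are assumptions for illustration; the real schema will grow as the front end takes shape.

```python
import sqlite3

# Hypothetical schema: one table for accounts, one for the
# pet descriptions a user wants to be notified about.
SCHEMA = """
CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY,
    email TEXT UNIQUE NOT NULL,
    password_hash TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS pet_descriptions (
    id INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL REFERENCES users(id),
    description TEXT NOT NULL
);
"""

def init_db(path=":memory:"):
    """Open the SQLite database and create the tables if needed."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Note that we store a `password_hash`, never the password itself; the actual hashing would be handled in the Flask app before anything touches the database.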
Next we will create the structure of our Flask application and set up a "hello world" blueprint to get us started.