This application interfaces with an Elasticsearch backend and exposes a set of API endpoints to allow registered users to query our data.
The backend is populated via pipelines.
This repository contains Architecture Decision Records in the `docs/architecture-decisions` directory.
adr-tools makes it easy to create additional records from a standardized template.
- Please run `bundle exec annotate` when making changes to models to keep the internal documentation up to date.
- Don't commit your `.env` or `.env.development`, but do commit `.env.test` after confirming your test values are not actual secrets that need protecting.
Updating the data model is somewhat complicated: many files need to be edited across multiple repositories, and deployment steps must happen in a particular order so as not to break production services.
- Start by updating the data model in Mario. Instructions for that can be found in the Mario README. Then complete the following steps here in TIMDEX.
- Update `app/models/search.rb` to build/update/remove queries for the added/edited/deleted fields as appropriate. Make sure to update filters and aggregations if relevant to the changed fields.
- Update `app/views/api/[version]/search/_base_json.jbuilder` to add/update/remove changed fields, OR update `views/api/[version]/search/_extended_json.jbuilder` if the changed fields aren't/shouldn't be in the brief record result (a sketch of what such a change might look like follows this list).
- If changed fields should be aggregated, update `views/api/[version]/search/_aggregations_json.jbuilder` as appropriate.
- Update tests as necessary. Make sure to test with all current data source samples ingested into a local ES instance.
- Update `openapi.json` to make sure our spec matches any changes made (including bumping the version number).
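For the jbuilder step, here is a minimal, hypothetical sketch of adding a new field to the brief record partial. It assumes the partial receives each Elasticsearch hit's source as a `record` local; the actual locals, field names, and structure in this repo may differ.

```ruby
# app/views/api/[version]/search/_base_json.jbuilder (hypothetical sketch)
# Assumes `record` holds one Elasticsearch hit's _source hash.
json.title     record['title']      # existing field, shown for context only
json.new_field record['new_field']  # hypothetical field being added to the brief result
```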
We are using Swagger UI to automatically generate documentation from the `openapi.json` file in GitHub Pages. The HTML file is in `docs/index.html`, and the `openapi.json` file always pulls from the `main` branch.
- `EMAIL_FROM`: email address to send messages from, including the registration and forgot password messages.
- `EMAIL_URL_HOST`: base URL to use when sending emails that link back to the application. In development, often `localhost:3000`. On Heroku, often `yourapp.herokuapp.com`. However, if you use a custom domain in production, that should be the value you use in production.
- `JWT_SECRET_KEY`: generate with `rails secret`.
- `ELASTICSEARCH_INDEX`: Elasticsearch index or alias to query.
- `ELASTICSEARCH_URL`: defaults to `http://localhost:9200`.
- `AWS_ACCESS_KEY`
- `AWS_ELASTICSEARCH`: boolean. Set to `true` to enable AWS v4 signing.
- `AWS_SECRET_ACCESS_KEY`
- `AWS_REGION`
- `SMTP_ADDRESS`
- `SMTP_PASSWORD`
- `SMTP_PORT`
- `SMTP_USER`
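The mail-related variables above (`EMAIL_FROM`, `EMAIL_URL_HOST`, and the `SMTP_*` settings) are typical ActionMailer inputs. As a rough, hypothetical sketch of how they fit together (the actual configuration in this repo may differ):

```ruby
# config/environments/production.rb (hypothetical sketch -- check this repo's real config)
Rails.application.configure do
  # Links in outgoing mail point back at EMAIL_URL_HOST.
  config.action_mailer.default_url_options = { host: ENV['EMAIL_URL_HOST'] }

  config.action_mailer.smtp_settings = {
    address:   ENV['SMTP_ADDRESS'],
    port:      ENV['SMTP_PORT'],
    user_name: ENV['SMTP_USER'],
    password:  ENV['SMTP_PASSWORD']
  }
end

# In a mailer, EMAIL_FROM would typically be the default sender:
# default from: ENV['EMAIL_FROM']
```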
- `ELASTICSEARCH_LOG`: if `true`, verbosely logs Elasticsearch queries. NOTE: do not set this ENV at all if you want ES logging fully disabled. Setting it to `false` is still setting it, and you will be annoyed and confused.
- `ES_LOG_LEVEL`: set the Elasticsearch transport log level. Defaults to `INFO`. NOTE: `ELASTICSEARCH_LOG` must also be set for logging to function.
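To illustrate why the presence check matters, here is a hypothetical client setup (not necessarily this repo's actual initializer) that only asks whether `ELASTICSEARCH_LOG` exists, not what its value is:

```ruby
# Hypothetical Elasticsearch client setup -- the real configuration in this repo may differ.
require 'elasticsearch'
require 'logger'

# Any value for ELASTICSEARCH_LOG -- even "false" -- counts as "set" and enables logging.
es_logger =
  if ENV.key?('ELASTICSEARCH_LOG')
    Logger.new($stdout).tap { |l| l.level = Logger.const_get(ENV.fetch('ES_LOG_LEVEL', 'INFO').upcase) }
  end

client = Elasticsearch::Client.new(
  url: ENV.fetch('ELASTICSEARCH_URL', 'http://localhost:9200'),
  logger: es_logger
)
```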
- `PREFERRED_DOMAIN`: set this to the domain you would like to use. Any other requests that come to the app will redirect to the root of this domain. This is useful to prevent access to herokuapp.com domains.
- `REQUESTS_PER_PERIOD`: requests allowed before throttling. Default is 100.
- `REQUEST_PERIOD`: number of minutes for the period in `REQUESTS_PER_PERIOD`. Default is 1 (see the sketch after this list).
- `SENTRY_DSN`: client key for Sentry exception logging.
- `SENTRY_ENV`: Sentry environment for the application. Defaults to 'unknown' if unset.
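This repo's actual throttling middleware is not shown here; purely to illustrate how `REQUESTS_PER_PERIOD` and `REQUEST_PERIOD` combine, a Rack::Attack-style throttle (a common Rails choice, assumed only for this sketch) would consume them roughly like this:

```ruby
# config/initializers/rack_attack.rb (illustrative sketch only -- not necessarily this repo's implementation)
class Rack::Attack
  # REQUESTS_PER_PERIOD requests allowed per client IP within REQUEST_PERIOD minutes.
  throttle('requests by ip',
           limit: ENV.fetch('REQUESTS_PER_PERIOD', '100').to_i,
           period: ENV.fetch('REQUEST_PERIOD', '1').to_i * 60) do |request|
    request.ip
  end
end
```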
This section describes how to use the included docker compose files to spin up Elasticsearch and optionally use Mario to load sample data for testing. You may set `ELASTICSEARCH_URL` to `http://0.0.0.0:9200` to use this ES instance in development if you choose not to use the included Dockerfile.
```
make up
```

```
make down
```
After Elasticsearch is running from the `make up` command:

```
make sampledata
```

Note: if you run this and it fails, try again in a few seconds, as ES may still be loading.
You can also run arbitrary Mario commands using a syntax like this after first running `make up`:

```
docker run --network timdex_default mitlibraries/mario --url http://elasticsearch:9200 YOUR_MARIO_COMMAND_HERE [e.g. indexes]
```
Note: if you have no indexes loaded, many Mario commands will fail. Try `make sampledata` or load the data you need before proceeding.
```
curl 'http://0.0.0.0:3000/api/v1/ping'
curl 'http://0.0.0.0:3000/api/v1/search?q=archives'
```
You can also use the playground via your browser; see the index page of the running app for a link and example queries.
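The same queries can be issued from Ruby if you prefer; here is a minimal sketch using only the standard library (authentication, if required for your instance, is not shown):

```ruby
require 'net/http'
require 'json'

# Equivalent to the curl search example above.
uri = URI('http://0.0.0.0:3000/api/v1/search')
uri.query = URI.encode_www_form(q: 'archives')

response = Net::HTTP.get_response(uri)
puts JSON.parse(response.body)
```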