# Webapp for Tandem-dimple project
First, clone the repo and its submodule (Tandem-Dimple):

```bash
git clone <path_to_git>
git submodule update --init
```

Next, create the conda environment used to download the databases; it is required by Loci's downloading script:

```bash
conda env create -f environment_for_downloading_database.yml
conda activate tandem
```

Then, we can download the required databases for Loci's inference code:
```bash
# In ./inference/tandem/
bash scripts/download_pfam.sh data/pfamdb            # 1.5 GB, ~1.5 min
bash scripts/download_consurf_db.sh data/consurf/db  # 2.5 GB, ~2 min

# Please skip this database for now; we will download it later
bash scripts/download_uniref90.sh data/consurf       # 90 GB, ~127 min
```

Then, we can build and run the Docker containers:
```bash
# Build and run all the containers
docker compose up -d --build

# Stop all containers
docker compose down
```

We can then check the frontend at http://0.0.0.0:7860/.
The stack consists of the following services:

- **user_input**: each submitted input's status goes from `pending` -> `processing` -> `finished`.
- **gradio_app**: frontend for the website, exposed on port 7860. It polls the database and refreshes the displayed result automatically every 3 seconds.
- **mongodb**: the database.
  - Container platform: both `linux/amd64` and `linux/arm64` work and are very stable. However, do not use `linux/amd64` on Apple Silicon, since Rosetta does not support CPU AVX instructions.
- **worker**: automatically fetches user inputs whose status is `pending`, one by one.
  - Locks jobs atomically: whenever an input is fetched, its status changes from `pending` to `processing`.
  - Serializes the input as JSON and sends it to the inference container over an HTTP API.
  - Writes the input, together with the results, back to the database, changing the status from `processing` to `finished`.
- **inference**: performs feature processing and model inference.
  - The inference image builds the correct environment needed for Loci's inference code.
  - Container platform can only be `linux/amd64`; it cannot run on Apple Silicon, since TensorFlow needs CPU AVX support.
  - Serves an `/infer` API through Flask, on internal port 5000.
  - Uses a git submodule to link to Loci's inference git repo.
  - Uses `adapter.py` to import Loci's inference functions.
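The gradio_app's 3-second refresh described above can be sketched as a simple polling loop. The database lookup below is a fake stand-in so the sketch runs on its own; in the real frontend, gradio drives the periodic refresh against MongoDB, and all names here are illustrative assumptions:

```python
# Sketch of the frontend's polling loop: keep asking for the job's status
# until it reaches "finished". fetch_status is a fake stand-in for a
# MongoDB query on the job document.
import itertools
import time

FAKE_STATUSES = itertools.chain(["pending", "processing"], itertools.repeat("finished"))

def fetch_status(job_id):
    # Stand-in for looking up the job document's "status" field in MongoDB.
    return next(FAKE_STATUSES)

def wait_for_result(job_id, interval=3, poll=time.sleep):
    # Poll every `interval` seconds (3 in the real frontend) until finished.
    while True:
        status = fetch_status(job_id)
        if status == "finished":
            return status
        poll(interval)

# Pass a no-op poll function so the example finishes instantly.
final = wait_for_result("job-1", poll=lambda _: None)
```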
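The worker's job lifecycle (`pending` -> `processing` -> `finished`) can be sketched as follows. An in-memory list plus a lock stands in for the MongoDB collection so the sketch is runnable on its own; a real worker would use pymongo's `find_one_and_update` for the atomic claim, and every name here is an assumption, not the project's code:

```python
# Illustrative sketch of the worker's job lifecycle.
import threading

lock = threading.Lock()
jobs = [
    {"_id": 1, "user_input": "SEQ_A", "status": "pending"},
    {"_id": 2, "user_input": "SEQ_B", "status": "pending"},
]

def claim_next_job():
    # Atomically flip one pending job to "processing" so two workers can
    # never pick up the same input (pymongo equivalent: find_one_and_update).
    with lock:
        for job in jobs:
            if job["status"] == "pending":
                job["status"] = "processing"
                return job
    return None

def run_inference(payload):
    # Placeholder for the HTTP POST to the inference container's /infer API.
    return {"prediction": f"result-for-{payload}"}

def work_once():
    job = claim_next_job()
    if job is None:
        return False
    # Write the result back and mark the job finished.
    job["result"] = run_inference(job["user_input"])
    job["status"] = "finished"
    return True

# Drain the queue one job at a time, as the worker container does.
while work_once():
    pass
```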
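A minimal sketch of the inference container's `/infer` endpoint, assuming Flask is available; the model call is a stub standing in for the functions `adapter.py` imports from Loci's inference code. Only the route name and internal port 5000 come from this README, the rest is illustrative:

```python
# Minimal Flask app serving /infer, as the inference container does.
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_inference(user_input):
    # Stub standing in for adapter.py, which imports Loci's real
    # inference functions.
    return {"prediction": f"score-for-{user_input}"}

@app.post("/infer")
def infer():
    # The worker POSTs a JSON payload; we run inference and return JSON.
    payload = request.get_json()
    result = run_inference(payload["input"])
    return jsonify({"status": "ok", "result": result})

if __name__ == "__main__":
    # Internal port 5000, reachable by the worker over the compose network.
    app.run(host="0.0.0.0", port=5000)
```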