AI4OS Modules Template
To simplify the development of new modules, and make the integration of your model with the DEEPaaS API easier, we provide a standard template for modules.
There are different versions of this template:

- master: this is what 99% of users are probably looking for. A simple, minimal template with the minimum requirements to integrate your code in the AI4OS catalog.
- child-module: this is a fork of the master branch specifically tailored to users performing a retraining of an existing module. It only creates a Docker repo whose container is based on an existing module's Docker image.
- advanced: this is a more advanced template. It makes more assumptions on how to structure projects and adds more files than those strictly needed for integration. Unless you are looking for some specific feature, you are probably safer using master.
Create your project based on the template
Based on the version of the template you choose, you will be asked to answer a number of questions, which might include:

- git_base_url: Remote URL to host your new git repositories (e.g. https://github.com/deephdc).
- project_name: Project name, to be added after "git_base_url" (see above), aka <your_project> in the following.
- author_name: Author name(s) (and/or your organization/company/team). If many, separate by commas.
- author_email: E-mail(s) of the main author(s) (or contact person). If many, separate by commas.
- description: Short description of the project.
- app_version: Application version (expects X.Y.Z (Major.Minor.Patch)).
- open_source_license: Choose an open source license; the default is MIT. More info.
- docker_baseimage: Docker image your Dockerfile starts from (FROM <docker_baseimage>); don't provide the tag here (e.g. tensorflow/tensorflow).
- baseimage_cpu_tag: CPU tag for the base image, e.g. 2.9.1. Has to match python3!
- baseimage_gpu_tag: GPU tag for the base image, e.g. 2.9.1-gpu. Has to match python3! Sometimes baseimage_cpu_tag and baseimage_gpu_tag are the same (for example in PyTorch); in TensorFlow they are different.
- failure_notify: Whether you want to receive updates if your model fails to build.
Based on your answers, we will fill in the template and create two repositories (linked to your git_base_url):

- ~/your_project: this is where the code of your module goes.
- ~/DEEP-OC-your_project: this is where the Docker files to build your module go. It also holds the metadata of your module that will be shown in the Dashboard.
Each repository has two branches: master (to commit stable changes) and test (to test features without disrupting your users).
Via User Interface
Go to the Template creation webpage. You will need to authenticate to access this webpage.
Then select which version of the template you want and answer the questions. Click on Generate and you will be able to download a .zip file with both repositories.
Via Terminal
Useful video demos
You will need to install cookiecutter and then run it as follows:
$ pip install cookiecutter
$ cookiecutter https://github.com/deephdc/cookiecutter-deep --checkout master
# use --checkout child-module or --checkout advanced for the other template versions
For each parameter you are first shown an [Info] line describing it; on the next line you type its value. Once all questions are answered, the two repositories will be created.
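Instead of typing every answer at the prompt, cookiecutter can pre-fill prompt defaults from its standard user config file. This is a generic cookiecutter feature, not specific to this template; the parameter names match the questions above, while the values below are purely illustrative:

```yaml
# ~/.cookiecutterrc  (cookiecutter's standard user config file; example values)
default_context:
  git_base_url: "https://github.com/deephdc"
  author_name: "Jane Doe"
  author_email: "jane.doe@example.org"
  open_source_license: "MIT"
```

With this file in place, the listed values appear as defaults at the prompts, so you only need to press Enter to accept them.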
Project structure
Based on the branch you choose, the template will create different files, master being the most minimal option (see above). The content of these files is populated based on your answers to the questions.
Master branch
<your_project>
##############
├── LICENSE                      <- License file
│
├── README.md                    <- The top-level README for developers using this project.
│
├── requirements.txt             <- The requirements file for reproducing the analysis
│                                   environment (`pip freeze > requirements.txt`)
│
├── setup.py, setup.cfg          <- makes project pip installable (`pip install -e .`) so
│                                   {{cookiecutter.repo_name}} can be imported
│
├── {{cookiecutter.__repo_name}} <- Source code for use in this project.
│   │
│   ├── __init__.py              <- Makes {{cookiecutter.repo_name}} a Python module
│   │
│   ├── api.py                   <- Main script for the integration with DEEPaaS API
│   │
│   ├── misc.py                  <- Misc functions that were helpful across projects
│   │
│   └── tests                    <- Scripts to perform code testing
│
└── Jenkinsfile                  <- Describes basic Jenkins CI/CD pipeline
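Of these files, api.py is the one that connects your code to the DEEPaaS API. As a rough, hypothetical sketch of its shape (function names such as get_metadata() and predict() are typical DEEPaaS V2 entry points, but check the DEEPaaS documentation for the exact interface your version expects; all values below are placeholders):

```python
# Hypothetical sketch of an api.py entry point. DEEPaaS discovers a set of
# functions in your package; the names and return values below are
# illustrative placeholders, not the authoritative interface.

def get_metadata():
    """Return basic information about the module (shown by the API)."""
    return {
        "name": "your_project",
        "author": "Author Name",
        "description": "Short description of the project.",
    }


def predict(**kwargs):
    """Run inference; DEEPaaS passes the parsed request arguments."""
    # Replace this stub with a call into your model code.
    return {"prediction": None, "arguments": kwargs}
```

When the API server starts, functions like these are exposed as HTTP endpoints; the stubs above only illustrate the shape of the contract, not a working model.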
DEEP-OC-<your_project>
######################
├─ Dockerfile       <- Describes the main steps to integrate the DEEPaaS API and
│                      the <your_project> application in one Docker image
│
├─ Jenkinsfile      <- Describes basic Jenkins CI/CD pipeline
│
├─ LICENSE          <- License file
│
├─ README.md        <- README for developers and users.
│
└─ metadata.json    <- Defines information propagated to the AI4OS Dashboard
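The metadata.json file is what the AI4OS Dashboard reads to display your module. Its exact schema is fixed by the catalog, so treat the snippet below as a purely illustrative sketch (field names are assumptions on my part; rely on the file the template generates for the authoritative structure):

```json
{
  "title": "Your project",
  "summary": "Short description of the project",
  "license": "MIT",
  "sources": {
    "code": "https://github.com/deephdc/your_project",
    "dockerfile_repo": "https://github.com/deephdc/DEEP-OC-your_project"
  }
}
```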
Child-module branch
DEEP-OC-<your_project>
######################
├─ Dockerfile       <- Describes the main steps to integrate the DEEPaaS API and
│                      the <your_project> application in one Docker image
│
├─ Jenkinsfile      <- Describes basic Jenkins CI/CD pipeline
│
├─ LICENSE          <- License file
│
├─ README.md        <- README for developers and users.
│
└─ metadata.json    <- Defines information propagated to the AI4OS Dashboard
Advanced branch
<your_project>
##############
├── LICENSE
├── README.md                    <- The top-level README for developers using this project.
├── data
│   └── raw                      <- The original, immutable data dump.
│
├── docs                         <- A default Sphinx project; see sphinx-doc.org for details
│
├── models                       <- Trained and serialized models, model predictions, or model
│                                   summaries
│
├── notebooks                    <- Jupyter notebooks. Naming convention is a number
│                                   (for ordering), the creator's initials (if many
│                                   users are developing), and a short `_`-delimited
│                                   description,
│                                   e.g. `1.0-jqp-initial_data_exploration.ipynb`.
│
├── references                   <- Data dictionaries, manuals, and all other explanatory
│                                   materials.
│
├── reports                      <- Generated analysis as HTML, PDF, LaTeX, etc.
│   └── figures                  <- Generated graphics and figures to be used in reporting
│
├── requirements.txt             <- The requirements file for reproducing the analysis
│                                   environment (`pip freeze > requirements.txt`)
│
├── test-requirements.txt        <- The requirements file for the test environment
│
├── setup.py                     <- makes project pip installable (`pip install -e .`) so
│                                   {{cookiecutter.repo_name}} can be imported
│
├── {{cookiecutter.__repo_name}} <- Source code for use in this project.
│   │
│   ├── __init__.py              <- Makes {{cookiecutter.repo_name}} a Python module
│   │
│   ├── dataset                  <- Scripts to download or generate data
│   │   └── make_dataset.py
│   │
│   ├── features                 <- Scripts to turn raw data into features for modeling
│   │   └── build_features.py
│   │
│   ├── models                   <- Scripts to train models and make predictions
│   │   └── deep_api.py          <- Main script for the integration with DEEP API
│   │
│   ├── tests                    <- Scripts to perform code testing
│   │
│   └── visualization            <- Scripts to create exploratory and results-oriented
│       └── visualize.py            visualizations
│
└── tox.ini                      <- tox file with settings for running tox; see tox.testrun.org
DEEP-OC-<your_project>
######################
├─ Dockerfile          <- Describes the main steps to integrate the DEEPaaS API and
│                         the <your_project> application in one Docker image
│
├─ Jenkinsfile         <- Describes basic Jenkins CI/CD pipeline
│
├─ LICENSE             <- License file
│
├─ README.md           <- README for developers and users.
│
├─ docker-compose.yml  <- Allows running the application with various configurations
│                         via docker-compose
│
└─ metadata.json       <- Defines information propagated to the AI4OS Dashboard