NLP use-case template without Model Control Plane #1
@@ -0,0 +1,14 @@
#!/bin/sh -e
set -x

SRC=${1:-"template tests .scripts"}

export ZENML_DEBUG=1
export ZENML_ANALYTICS_OPT_IN=false

# autoflake replacement: removes unused imports and variables
ruff $SRC --select F401,F841 --fix --exclude "__init__.py" --isolated

# sorts imports
ruff $SRC --select I --fix --ignore D
black $SRC
@@ -1 +1,69 @@
# 💫 ZenML End-to-End Natural Language Processing Project Template
# 💫 ZenML End-to-End NLP Training and Deployment Project Template

This project template is designed to help you get started with training and deploying NLP models using the ZenML framework. It provides a comprehensive set of steps and pipelines to cover major use cases of NLP model development, including dataset loading, tokenization, model training, model registration, and deployment.

## 📃 Template Parameters

| Parameter | Description | Default |
|-----------|-------------|---------|
| Name | The name of the person/entity holding the copyright | ZenML GmbH |
| Email | The email of the person/entity holding the copyright | [email protected] |
| Project Name | Short name for your project | ZenML NLP project |
| Project Version | The version of your project | 0.0.1 |
| Project License | The license under which your project will be released | Apache Software License 2.0 |
| Technical product name | The technical name to prefix all tech assets (pipelines, models, etc.) | nlp_use_case |
| Target environment | The target environment for deployments/promotions | staging |
| Use metric-based promotion | Whether to compare the metric of interest when deciding on model version promotion | True |
| Notifications on failure | Whether to notify about pipeline failures | True |
| Notifications on success | Whether to notify about pipeline successes | False |
| ZenML Server URL | Optional URL of a remote ZenML server for support scripts | - |
## 🚀 Generate a ZenML Project

To generate a project from this template, make sure you have ZenML and its `templates` extras installed:

```bash
pip install zenml[templates]
```

Then, run the following command to generate the project:

```bash
zenml init --template nlp-template
```

You will be prompted to provide values for the template parameters. If you want to use the default values, you can add the `--template-with-defaults` flag to the command.
## 🧰 How this template is implemented

This template provides a set of pipelines and steps to cover the end-to-end process of training and deploying NLP models. Here is an overview of the main components:

### Dataset Loading

The template includes a step for loading the dataset from the HuggingFace Datasets library. You can choose from three available datasets: financial_news, airline_reviews, and imbd_reviews.
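
As a rough illustration only (the step name, signature, and the `"imdb"` hub ID below are assumptions, not the template's actual code), a dataset-loading step built on HuggingFace Datasets might look like this:

```python
# Minimal sketch of a dataset-loading step; "imdb" is a placeholder hub ID.
# The real template maps its three options (financial_news, airline_reviews,
# imbd_reviews) to concrete HuggingFace datasets.
from datasets import DatasetDict, load_dataset
from zenml import step  # assumes a ZenML version that exposes the `step` decorator


@step
def data_loader(dataset_name: str = "imdb") -> DatasetDict:
    """Load the raw text-classification dataset from HuggingFace Datasets."""
    return load_dataset(dataset_name)
```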

### Tokenization

The tokenization step preprocesses the dataset by tokenizing the text data using the tokenizer provided by the HuggingFace Models library. You can choose from three available models: bert-base-uncased, roberta-base, and distilbert-base-cased.
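
For orientation, a tokenization step wrapping a HuggingFace `AutoTokenizer` could look roughly like the following sketch (the step name, signature, and the `"text"` column name are illustrative, not the template's exact code):

```python
# Illustrative tokenization sketch; padding to max length keeps batching simple.
from datasets import DatasetDict
from transformers import AutoTokenizer
from zenml import step  # assumes a ZenML version that exposes the `step` decorator


@step
def tokenize_dataset(
    dataset: DatasetDict, model_name: str = "bert-base-uncased"
) -> DatasetDict:
    """Tokenize the text column with the tokenizer matching the chosen model."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)

    def tokenize(batch):
        return tokenizer(batch["text"], padding="max_length", truncation=True)

    # The column name "text" is an assumption; the bundled datasets may differ.
    return dataset.map(tokenize, batched=True)
```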

### Model Training

The training pipeline consists of several steps, including model architecture search, hyperparameter tuning, model training, and model evaluation. The best model architecture and hyperparameters are selected based on their performance on the validation set. The trained model is then evaluated on the holdout set to assess its performance.
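
The template's real training step also covers architecture search and hyperparameter tuning; as a bare-bones illustration of just the fine-tuning and evaluation part (the output directory, split names, and label count below are placeholders), the HuggingFace Trainer can be used like this:

```python
# Bare-bones fine-tuning/evaluation sketch with the HuggingFace Trainer.
# Assumptions: a tokenized DatasetDict with "train" and "test" splits and a
# binary label column; the template's actual training step is more involved.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)


def train_and_evaluate(tokenized_datasets, model_name: str = "bert-base-uncased"):
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="model_out", num_train_epochs=1),
        train_dataset=tokenized_datasets["train"],
        eval_dataset=tokenized_datasets["test"],
    )
    trainer.train()
    # Evaluate on the holdout split; pass compute_metrics for accuracy/F1 etc.
    metrics = trainer.evaluate()
    return model, metrics
```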

### Model Registration and Promotion

After training, the best model version is registered in the ZenML Model Registry. The template provides an option to promote the model version based on a specified metric of interest. If metric-based promotion is enabled, the template compares the metric value of the new model version with that of the current production model version and promotes the new version if it performs better.
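
The promotion decision itself boils down to a simple comparison; a plain-Python sketch of that logic (shown here independently of the ZenML Model Registry API) might be:

```python
# Sketch of the metric-based promotion decision. In the template this comparison
# is made against the current production version in the ZenML Model Registry.
from typing import Optional


def should_promote(
    new_metric: float,
    production_metric: Optional[float],
    higher_is_better: bool = True,
) -> bool:
    """Promote the new model version only if it beats the current production one."""
    if production_metric is None:
        # No production version yet: promote the new version by default.
        return True
    if higher_is_better:
        return new_metric > production_metric
    return new_metric < production_metric


# Example: new accuracy 0.91 vs. production accuracy 0.89 -> promote.
assert should_promote(0.91, 0.89)
```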

### Batch Inference

The template includes a batch inference pipeline that loads the inference dataset, preprocesses it using the same tokenizer as during training, and runs predictions using the deployed model version. The predictions are stored as an artifact for future use.
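
As a rough sketch of the prediction part only (the model path and output columns are placeholders), a `transformers` pipeline can be used like this:

```python
# Illustrative batch-inference sketch; "model_out" is a placeholder path to the
# trained model. The template stores the resulting predictions as a ZenML artifact.
import pandas as pd
from transformers import pipeline


def predict_batch(texts: list, model_path: str = "model_out") -> pd.DataFrame:
    """Run the text-classification model over a batch of texts."""
    classifier = pipeline("text-classification", model=model_path)
    predictions = classifier(texts, truncation=True)
    return pd.DataFrame(
        {
            "text": texts,
            "label": [p["label"] for p in predictions],
            "score": [p["score"] for p in predictions],
        }
    )
```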

### Deployment Options

The template provides options to deploy the trained model locally or to the HuggingFace Hub. You can choose where to deploy by setting the `deploy_locally` and `deploy_to_huggingface` parameters.
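
For the HuggingFace Hub path, the core of such a deployment could be as simple as the sketch below (the repository ID and local model directory are placeholders, and an authenticated HuggingFace token is assumed):

```python
# Minimal sketch of pushing the trained model to the HuggingFace Hub.
# "model_out" and the repo_id are placeholders; authentication via
# `huggingface-cli login` (or a token environment variable) is assumed.
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def push_model_to_hub(model_dir: str = "model_out", repo_id: str = "my-org/nlp-use-case-model"):
    """Upload the fine-tuned model and its tokenizer to the Hub."""
    model = AutoModelForSequenceClassification.from_pretrained(model_dir)
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model.push_to_hub(repo_id)
    tokenizer.push_to_hub(repo_id)
```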

## Next Steps

Once you have generated the project using this template, you can explore the generated code and customize it to fit your specific NLP use case. The README.md file in the generated project provides further instructions on how to set up and run the project.

Reviewer comment: Just realised that there's also this (

Happy coding with ZenML and NLP!
@@ -11,4 +11,15 @@ setup:
	pip install -r requirements.txt
	zenml integration install pytorch mlflow s3 gcp aws slack transformers -y

install-stack:
install-local-stack:
	@echo "Specify stack name [$(stack_name)]: " && read input && [ -n "$$input" ] && stack_name="$$input" || stack_name="$(stack_name)" && \
	zenml experiment-tracker register -f mlflow mlflow_local_$${stack_name} && \
	zenml model-registry register -f mlflow mlflow_local_$${stack_name} && \
	zenml model-deployer register -f mlflow mlflow_local_$${stack_name} && \
	zenml stack register -a default -o default -r mlflow_local_$${stack_name} \
		-d mlflow_local_$${stack_name} -e mlflow_local_$${stack_name} $${stack_name} && \
	zenml stack set $${stack_name} && \
	zenml stack up

install-remote-stack:

Reviewer comment: left in by accident?
This file was deleted.
@@ -0,0 +1,34 @@
# {% include 'template/license_header' %}

settings:
  docker:
{%- if accelerator == 'gpu' %}
    parent_image: 'huggingface/transformers-pytorch-gpu'
{%- endif %}
    required_integrations:
{%- if cloud_of_choice == 'aws' %}
      - aws
      - skypilot_aws
      - s3
{%- endif %}
{%- if cloud_of_choice == 'gcp' %}
      - gcp
      - skypilot_gcp
{%- endif %}
      - huggingface
      - pytorch
      - mlflow
      - discord
    requirements:
      - accelerate
      - zenml[server]

extra:
  mlflow_model_name: nlp_use_case_model
{%- if target_environment == 'production' %}
  target_env: Production
{%- else %}
  target_env: Staging
{%- endif %}
  notify_on_success: False
  notify_on_failure: True
@@ -0,0 +1,26 @@
# read the doc: https://huggingface.co/docs/hub/spaces-sdks-docker
# you will also find guides on how best to write your Dockerfile

FROM python:3.9

WORKDIR /code

COPY ./requirements.txt /code/requirements.txt

RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Set up a new user named "user" with user ID 1000
RUN useradd -m -u 1000 user
# Switch to the "user" user
USER user
# Set home to the user's home directory
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

# Set the working directory to the user's home directory
WORKDIR $HOME/app

# Copy the current directory contents into the container at $HOME/app setting the owner to the user
COPY --chown=user . $HOME/app

CMD ["python", "app.py", "--server.port=7860", "--server.address=0.0.0.0"]
@@ -0,0 +1 @@
# {% include 'template/license_header' %}

Reviewer comment: Do we use this anywhere?