
Commit

S3 fix
MathieuBsqt committed Dec 18, 2024
1 parent d6f5cd3 commit 9de3298
Showing 6 changed files with 39 additions and 31 deletions.
4 changes: 3 additions & 1 deletion apps/gradio/stable-diffusion/README.md
@@ -165,7 +165,7 @@ ovhai app logs <app_id> --follow

When your app is ready for use, you will be able to generate your first images using the default checkpoint.

For your information, the `--volume` parameter allows to use both Swift and S3 buckets. However, it's important to note that for S3 usage, a proper configuration is necessary. If S3 is not configured yet and you wish to use it, please read the [S3 compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
For your information, the `--volume` parameter allows you to use both Swift and S3* compatible Object Storage buckets. However, it's important to note that S3 compatible usage requires a proper configuration. If it is not configured yet and you wish to use it, please read the [S3 compatible compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
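As an illustration, here is a minimal, hedged sketch of mounting such a bucket at launch time; the bucket name, datastore alias, and image are placeholders, and the exact `--volume` syntax should be confirmed with `ovhai app run --help`:

```shell
# Hypothetical sketch: mount the "sd-models" bucket from the "GRA" datastore
# into the app with read-write (RW) access; names and image are placeholders
ovhai app run \
    --gpu 1 \
    --volume sd-models@GRA/:/workspace/models:RW \
    <your_gradio_image>
```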

### Step 3: Add Stable Diffusion checkpoints

@@ -268,3 +268,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)

**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.
4 changes: 3 additions & 1 deletion apps/streamlit/whisper/README.md
@@ -108,7 +108,7 @@ You can create your Object Storage bucket using either the UI (OVHcloud Control
>>
>> *`GRA` alias and `whisper-model` will be used in this tutorial.*
>>
>> For your information, the previous command is applicable to both Swift and S3 buckets. However, it's important to note that for S3 usage, a proper configuration is necessary. If S3 is not configured yet and you wish to use it, please read the [S3 compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
>> For your information, the previous command is applicable to both Swift and S3* compatible Object Storage buckets. However, it's important to note that S3 compatible usage requires a proper configuration. If it is not configured yet and you wish to use it, please read the [S3 compatible compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
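As a reminder, a hedged sketch of the bucket creation referenced above, using the tutorial's alias and bucket name (confirm the exact syntax with `ovhai bucket create --help`):

```shell
# Hypothetical sketch: create the "whisper-model" bucket on the GRA datastore
ovhai bucket create GRA whisper-model
```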
#### Download whisper in the created bucket
@@ -313,3 +313,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)

**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.
2 changes: 1 addition & 1 deletion jobs/neuralangelo/Makefile
@@ -7,7 +7,7 @@ OUTPUT_DIR := logs/$(GROUP)/$(MODEL)
DOWNSAMPLE_RATE ?= 2
SCENE_TYPE ?= object

# S3 config
# S3* compatible config
DATASTORE ?= NEURALANGELO
BUCKET_NAME := neuralangelo-$(shell whoami)-$(MODEL)
BUCKET := $(BUCKET_NAME)@$(DATASTORE)
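As a worked expansion of these defaults, assuming `whoami` returns `experiments` and `MODEL` is `lego` (hypothetical values):

```shell
# BUCKET_NAME expands to: neuralangelo-experiments-lego
# BUCKET expands to:      neuralangelo-experiments-lego@NEURALANGELO
make push-data MODEL=lego
```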
26 changes: 14 additions & 12 deletions jobs/neuralangelo/README.md
@@ -26,7 +26,7 @@ The processing will follow 3 main steps:
- 3D model extraction

Each step will be run using an AI-Training job and these jobs will share their data using an AI-Training volume synced
with a S3 bucket.
with an S3* compatible bucket.


### Makefile
@@ -92,27 +92,27 @@ Download the sample video:
gdown 1yWoZ4Hk3FgmV3pd34ZbW7jEqgqyJgzHy -O neuralangelo/input/
```

### Configure an S3 bucket for ovhai
### Configure an S3 compatible bucket for ovhai

To be able to share data between the AI Training jobs we will run as well as providing code and data to our workloads, we need to configure an AI datastore pointing to an S3 endpoint.
To be able to share data between the AI Training jobs we will run, as well as to provide code and data to our workloads, we need to configure an AI datastore pointing to an S3 compatible endpoint.

```shell
ovhai datastore add s3 NEURALANGELO <s3_endpoint_url> <s3_region> <s3_access_key> --store-credentials-locally
```

>
> Data store information (endpoint, region, access_key and secret key) can refer to an OVHcloud S3 bucket or any other provider.
> Datastore information (endpoint, region, access_key and secret key) can refer to an OVHcloud S3 compatible bucket or one from any other provider.
>
> Using `--store-credentials-locally` is needed here to be able to push/pull data from a bucket with the ovhai CLI in the next steps.
>
> See [this page](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) for help about S3 usage.
> See [this page](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) for help about S3 compatible usage.
>
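To confirm the datastore was registered, a quick check (a sketch; the output columns may vary across `ovhai` versions):

```shell
# The NEURALANGELO alias should appear in the list of configured datastores
ovhai datastore list
```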
### Prepare model input using COLMAP

Data preparation relies on the process described in [Neuralangelo documentation](https://github.com/NVlabs/neuralangelo/blob/main/DATA_PROCESSING.md).

#### Push the Neuralangelo project in the S3 bucket
#### Push the Neuralangelo project in the S3 compatible bucket

```shell
make push-data
@@ -125,7 +125,7 @@ make push-data
> ovhai bucket object upload neuralangelo-experiments-lego@NEURALANGELO .
> ```
>
> Note: As a bucket shall be unique in an S3 region, the Makefile uses the current username in the bucket name (`experiments` in this example).
> Note: As a bucket name must be unique within an S3 compatible region, the Makefile uses the current username in the bucket name (`experiments` in this example).
>
#### Extract pictures from the video
@@ -174,7 +174,7 @@ make prepare-status
make prepare-logs
```

Once the job is done, we get generated data from the S3 bucket:
Once the job is done, we get generated data from the S3 compatible bucket:

```shell
make pull-data
@@ -208,7 +208,7 @@ make adjust
Follow the process described [here](https://github.com/mli0603/BlenderNeuralangelo?tab=readme-ov-file#2-locating-the-control-panel) to adjust the bounding sphere.
Push the adjusted configuration in the S3 bucket:
Push the adjusted configuration in the S3 compatible bucket:
```shell
make push-data
@@ -313,7 +313,7 @@ make extract-status
make extract-logs
```

Once the job is done, we get generated data from the S3 bucket:
Once the job is done, we get generated data from the S3 compatible bucket:

```shell
make pull-data
@@ -340,14 +340,14 @@ in `neuralangelo/projects/neuralangelo/configs/base.yaml`.
It is possible to change it:

- either using `torchrun` command line parameters.
- or editing the file directly and sync it to the S3 bucket using `make data-push`.
- or editing the file directly and syncing it to the S3 compatible bucket using `make push-data`.

### Checkpoints rendering

If the process is configured with a large number of iterations, the processing can be long. As Neuralangelo creates
intermediate checkpoints, we are able to try extraction on any intermediate model.

To perform this, we need use `ovhai` to trigger a `data-push` on the running job to sync the S3 content and use
To perform this, we need to use `ovhai` to trigger a `data-push` on the running job to sync the S3 compatible content and use
the previously described `make extract` command.
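A hedged sketch of that trigger, assuming the `ovhai job push-data` subcommand of the CLI (confirm with `ovhai job --help`):

```shell
# Sync the running job's local data back to the S3 compatible bucket,
# then run `make extract` on the pushed checkpoint
ovhai job push-data <job_id>
```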

If you need training or technical assistance to implement our solutions, contact your sales representative or click on [this link](https://www.ovhcloud.com/en-gb/professional-services/) to get a quote and ask our Professional Services experts for a custom analysis of your project.
@@ -357,3 +357,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)

**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.
@@ -47,7 +47,7 @@
"\n",
"- create a Roboflow account\n",
"- click on `Download` in order to download the dataset\n",
"- select`YOLO v8 PyTorch` format\n",
"- select `YOLOv8` format\n",
"- choose the method `show download code`\n",
"\n",
"You will get a URL (`<dataset_url>`) that will allow you to download your dataset directly inside the notebook.\n",
32 changes: 17 additions & 15 deletions notebooks/getting-started/S3/use-s3-buckets-with-ai-tools.ipynb
@@ -5,28 +5,28 @@
"id": "ac60a3db-eb56-4fd4-b139-95114faaee64",
"metadata": {},
"source": [
"# Using objects from your S3 buckets in OVHcloud AI Tools\n",
"# Using objects from your S3* compatible buckets in OVHcloud AI Tools\n",
"\n",
"This tutorial provides help to manage and use S3 buckets with AI Tools in Python, using the `boto3` library. We will show you how you can interact with your S3 Buckets and files by creating buckets, downloading objects, listing objects and reading their content when working with AI Notebooks, AI Training and AI Deploy.\n",
"This tutorial provides help to manage and use S3* compatible buckets with AI Tools in Python, using the `boto3` library. We will show you how you can interact with your S3 compatible Buckets and files by creating buckets, downloading objects, listing objects and reading their content when working with AI Notebooks, AI Training and AI Deploy.\n",
"\n",
"## Requirements\n",
"\n",
"To be able to follow this tutorial, you will need to have followed the [Data - S3 compliance with AI Tools documentation](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) first, in particular the following steps:\n",
"To be able to follow this tutorial, you will need to have followed the [Data - Compliance between AI Tools and S3 compatible Object Storage](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) first, in particular the following steps:\n",
"\n",
"- Have created a S3 user\n",
"- Have created a S3 compatible user\n",
"- Checked that this user has ***ObjectStore operator*** and ***AI Training Operator*** rights\n",
"- Have created a datastore with this user\n",
"\n",
"## Code\n",
"\n",
"The different steps are as follow:\n",
"- Setup the environment\n",
"- Set your S3 datastore\n",
"- List all S3 buckets in your S3 datastore\n",
"- Set your S3 compatible datastore\n",
"- List all S3 compatible buckets in your S3 compatible datastore\n",
"- Create a new bucket\n",
"- List all objects of a specific bucket\n",
"- Read content from objects\n",
"- Download object from S3 bucket\n",
"- Download object from S3 compatible bucket\n",
"\n",
"### Setup the environment\n",
"\n",
@@ -72,15 +72,15 @@
"id": "8347366f-98ec-4f4b-87c8-f53aea2b20e7",
"metadata": {},
"source": [
"### Set your S3 datastore"
"### Set your S3 compatible datastore"
]
},
{
"cell_type": "markdown",
"id": "10bc5759-97ce-472c-8c96-cb7224387bcc",
"metadata": {},
"source": [
"To interact with an S3 bucket, we need to initialize a S3 client and configure it with our user credentials (`s3_access_key`, `s3_secret_key`, the `endpoint URL`, and the selected region).\n",
"To interact with an S3 compatible bucket, we need to initialize a S3 compatible client and configure it with our user credentials (`s3_access_key`, `s3_secret_key`, the `endpoint URL`, and the selected region).\n",
"\n",
"***Make sure to replace these credentials by yours.***"
]
@@ -114,9 +114,9 @@
"id": "84a88d2f-5a8a-4c82-a4b3-9fc7b5d24ec1",
"metadata": {},
"source": [
"Once the S3 client has been initialized, we are ready to communicate with the S3-compatible storage service. Many things can be done.\n",
"Once the S3 compatible client has been initialized, we are ready to communicate with the S3 compatible storage service. Many things can be done.\n",
"\n",
"### List all S3 buckets in your S3 datastore"
"### List all S3 compatible buckets in your S3 compatible datastore"
]
},
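The listing cell itself is elided from this diff; a minimal sketch, assuming the `s3_client` initialized above:

```python
# Print the name of every bucket visible to this S3 compatible user
response = s3_client.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"])
```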
{
@@ -431,9 +431,9 @@
"id": "12ea21bf-bde2-481e-b9ea-be327ec365f5",
"metadata": {},
"source": [
"### Download object from S3 bucket\n",
"### Download object from S3 compatible bucket\n",
"\n",
"You can download any object from your S3 bucket into your environment. Here is how to download the `requirements.txt` file under the name `local-object.txt`"
"You can download any object from your S3 compatible bucket into your environment. Here is how to download the `requirements.txt` file under the name `local-object.txt`"
]
},
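The corresponding code cell is elided here; a hedged sketch with `boto3`, where the bucket name is a placeholder and `s3_client` comes from the setup step:

```python
# Download "requirements.txt" from the bucket and save it as "local-object.txt"
s3_client.download_file("<bucket_name>", "requirements.txt", "local-object.txt")
```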
{
@@ -466,11 +466,13 @@
"source": [
"### Conclusion\n",
"\n",
"We hope this example has helped you to manipulate the objects in your S3 buckets directly from the OVHcloud AI Tools products. \n",
"We hope this example has helped you to manipulate the objects in your S3 compatible buckets directly from the OVHcloud AI Tools products. \n",
"\n",
"The operations presented here are not the only possible actions. Please consult the documentation for a full list of available commands.\n",
"\n",
"More commands here : https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html"
"More commands here : https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html\n",
"\n",
"**\\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc."
]
}
],

