From d7a3eb0e6f55b610c2d41cce53e4c15b8f7d0ea3 Mon Sep 17 00:00:00 2001
From: Pavel Iakubovskii
Date: Thu, 11 Jul 2024 11:58:04 +0100
Subject: [PATCH] Rename master->main (#891)

---
 .github/workflows/tests.yml |  4 ++--
 README.md                   | 12 ++++++------
 docs/quickstart.rst         |  4 ++--
 docs/save_load.rst          |  2 +-
 4 files changed, 11 insertions(+), 11 deletions(-)

diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml
index 89feef56..9cd1c1c7 100644
--- a/.github/workflows/tests.yml
+++ b/.github/workflows/tests.yml
@@ -6,9 +6,9 @@
 name: CI
 
 on:
   push:
-    branches: [ master ]
+    branches: [ main ]
   pull_request:
-    branches: [ master ]
+    branches: [ main ]
 
 jobs:
diff --git a/README.md b/README.md
index 4c6beb2f..5cfd76ee 100644
--- a/README.md
+++ b/README.md
@@ -4,8 +4,8 @@
 **Python library with Neural Networks for Image Segmentation based on [PyTorch](https://pytorch.org/).**
 
-[![Generic badge](https://img.shields.io/badge/License-MIT-.svg?style=for-the-badge)](https://github.com/qubvel/segmentation_models.pytorch/blob/master/LICENSE)
-[![GitHub Workflow Status (branch)](https://img.shields.io/github/actions/workflow/status/qubvel/segmentation_models.pytorch/tests.yml?branch=master&style=for-the-badge)](https://github.com/qubvel/segmentation_models.pytorch/actions/workflows/tests.yml)
+[![Generic badge](https://img.shields.io/badge/License-MIT-.svg?style=for-the-badge)](https://github.com/qubvel/segmentation_models.pytorch/blob/main/LICENSE)
+[![GitHub Workflow Status (branch)](https://img.shields.io/github/actions/workflow/status/qubvel/segmentation_models.pytorch/tests.yml?branch=main&style=for-the-badge)](https://github.com/qubvel/segmentation_models.pytorch/actions/workflows/tests.yml)
 [![Read the Docs](https://img.shields.io/readthedocs/smp?style=for-the-badge&logo=readthedocs&logoColor=white)](https://smp.readthedocs.io/en/latest/)
 [![PyPI](https://img.shields.io/pypi/v/segmentation-models-pytorch?color=blue&style=for-the-badge&logo=pypi&logoColor=white)](https://pypi.org/project/segmentation-models-pytorch/)
@@ -77,8 +77,8 @@ preprocess_input = get_preprocessing_fn('resnet18', pretrained='imagenet')
 
 Congratulations! You are done! Now you can train your model with your favorite framework!
 
 ### 💡 Examples
- - Training model for pets binary segmentation with Pytorch-Lightning [notebook](https://github.com/qubvel/segmentation_models.pytorch/blob/master/examples/binary_segmentation_intro.ipynb) and [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/master/examples/binary_segmentation_intro.ipynb)
- - Training model for cars segmentation on CamVid dataset [here](https://github.com/qubvel/segmentation_models.pytorch/blob/master/examples/cars%20segmentation%20(camvid).ipynb).
+ - Training model for pets binary segmentation with Pytorch-Lightning [notebook](https://github.com/qubvel/segmentation_models.pytorch/blob/main/examples/binary_segmentation_intro.ipynb) and [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/main/examples/binary_segmentation_intro.ipynb)
+ - Training model for cars segmentation on CamVid dataset [here](https://github.com/qubvel/segmentation_models.pytorch/blob/main/examples/cars%20segmentation%20(camvid).ipynb).
  - Training SMP model with [Catalyst](https://github.com/catalyst-team/catalyst) (high-level framework for PyTorch), [TTAch](https://github.com/qubvel/ttach) (TTA library for PyTorch) and [Albumentations](https://github.com/albu/albumentations) (fast image augmentation library) - [here](https://github.com/catalyst-team/catalyst/blob/v21.02rc0/examples/notebooks/segmentation-tutorial.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/catalyst-team/catalyst/blob/v21.02rc0/examples/notebooks/segmentation-tutorial.ipynb)
  - Training SMP model with [Pytorch-Lightning](https://pytorch-lightning.readthedocs.io) framework - [here](https://github.com/ternaus/cloths_segmentation) (clothes binary segmentation by [@ternaus](https://github.com/ternaus)).
@@ -465,7 +465,7 @@ $ pip install git+https://github.com/qubvel/segmentation_models.pytorch
 ### 🏆 Competitions won with the library
 
 `Segmentation Models` package is widely used in the image segmentation competitions.
-[Here](https://github.com/qubvel/segmentation_models.pytorch/blob/master/HALLOFFAME.md) you can find competitions, names of the winners and links to their solutions.
+[Here](https://github.com/qubvel/segmentation_models.pytorch/blob/main/HALLOFFAME.md) you can find competitions, names of the winners and links to their solutions.
 
 ### 🤝 Contributing
 
@@ -500,4 +500,4 @@ make table # generate table with encoders and print to stdout
 ```
 
 ### 🛡️ License
-Project is distributed under [MIT License](https://github.com/qubvel/segmentation_models.pytorch/blob/master/LICENSE)
+Project is distributed under [MIT License](https://github.com/qubvel/segmentation_models.pytorch/blob/main/LICENSE)
diff --git a/docs/quickstart.rst b/docs/quickstart.rst
index c5181f32..7fc04dd7 100644
--- a/docs/quickstart.rst
+++ b/docs/quickstart.rst
@@ -62,8 +62,8 @@ You are done! Now you can train your model with your favorite framework, or as s
 Check the following examples:
 
 .. |colab-badge| image:: https://colab.research.google.com/assets/colab-badge.svg
-    :target: https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/master/examples/binary_segmentation_intro.ipynb
+    :target: https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/main/examples/binary_segmentation_intro.ipynb
     :alt: Open In Colab
 
-- Finetuning notebook on Oxford Pet dataset with `PyTorch Lightning <https://github.com/qubvel/segmentation_models.pytorch/blob/master/examples/binary_segmentation_intro.ipynb>`_ |colab-badge|
+- Finetuning notebook on Oxford Pet dataset with `PyTorch Lightning <https://github.com/qubvel/segmentation_models.pytorch/blob/main/examples/binary_segmentation_intro.ipynb>`_ |colab-badge|
 - Finetuning script for cloth segmentation with `PyTorch Lightning <https://github.com/ternaus/cloths_segmentation>`_
diff --git a/docs/save_load.rst b/docs/save_load.rst
index 0aec7d50..e225c5d0 100644
--- a/docs/save_load.rst
+++ b/docs/save_load.rst
@@ -68,7 +68,7 @@ By following these steps, you can easily save, share, and load your models, faci
 |colab-badge|
 
 .. |colab-badge| image:: https://colab.research.google.com/assets/colab-badge.svg
-    :target: https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/master/examples/binary_segmentation_intro.ipynb
+    :target: https://colab.research.google.com/github/qubvel/segmentation_models.pytorch/blob/main/examples/binary_segmentation_intro.ipynb
     :alt: Open In Colab