PyTorch implementation of UniRestorer: Universal Image Restoration via Adaptively Estimating Image Degradation at Proper Granularity
Abstract: Recently, considerable progress has been made in all-in-one image restoration. Generally, existing methods are either degradation-agnostic or degradation-aware. However, the former are limited in leveraging degradation-specific restoration, and the latter suffer from the inevitable error in degradation estimation. Consequently, existing methods still show a large performance gap compared to specific single-task models. In this work, we take a step forward on this topic and present our UniRestorer with improved restoration performance. Specifically, we perform hierarchical clustering on the degradation space and train a multi-granularity mixture-of-experts (MoE) restoration model. UniRestorer then adopts both degradation and granularity estimation to adaptively select an appropriate expert for image restoration. In contrast to existing degradation-agnostic and -aware methods, UniRestorer can leverage degradation estimation to benefit degradation-specific restoration, and use granularity estimation to make the model robust to degradation estimation error. Experimental results show that our UniRestorer outperforms state-of-the-art all-in-one methods by a large margin, and is promising in closing the performance gap to specific single-task models.
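To illustrate the core idea of routing an input to one of several experts trained at different degradation granularities, here is a minimal, self-contained PyTorch sketch. The names (`ToyExpert`, `ToyUniRestorer`), expert counts, and the simple convolutional degradation encoder are placeholders for illustration only, not the released architecture:

```python
# Illustrative sketch (not the official code): a multi-granularity MoE where a
# router picks one expert per granularity level from estimated degradation features.
import torch
import torch.nn as nn

class ToyExpert(nn.Module):
    """Stand-in for a restoration backbone (e.g., a CNN/transformer restorer)."""
    def __init__(self, channels=48):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # residual restoration

class ToyUniRestorer(nn.Module):
    """Hypothetical wrapper: experts_per_level[g] experts at granularity level g."""
    def __init__(self, experts_per_level=(1, 4, 16)):
        super().__init__()
        self.levels = nn.ModuleList(
            nn.ModuleList(ToyExpert() for _ in range(n)) for n in experts_per_level
        )
        feat_dim = 64
        self.encoder = nn.Sequential(  # placeholder degradation-feature encoder
            nn.Conv2d(3, feat_dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # degradation router within each level + granularity router over levels
        self.deg_heads = nn.ModuleList(nn.Linear(feat_dim, n) for n in experts_per_level)
        self.gran_head = nn.Linear(feat_dim, len(experts_per_level))

    def forward(self, x):
        f = self.encoder(x)
        g = self.gran_head(f).argmax(dim=-1)[0].item()   # select granularity level
        e = self.deg_heads[g](f).argmax(dim=-1)[0].item()  # select expert in that level
        return self.levels[g][e](x)

if __name__ == "__main__":
    out = ToyUniRestorer()(torch.randn(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The coarsest level (a single expert) behaves like a degradation-agnostic model, while finer levels behave like degradation-specific ones; the granularity router decides how much to trust the degradation estimate.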
- 2024.12.31: Paper and supplementary files are released on arXiv.
- Release of inference code and pre-trained models.
- Release of datasets and training code.
Our method consists of three steps: (1) constructing the multi-granularity degradation set, (2) training the multi-granularity MoE restoration model, and (3) training the degradation- and granularity-estimation-based routing; a clustering sketch for step (1) is given below.
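As a rough illustration of step (1), the snippet below hierarchically clusters degradation descriptors with SciPy and cuts the tree at several depths to obtain coarse-to-fine degradation groups. The random descriptors, Ward linkage, and granularity list are assumptions for demonstration, not the exact procedure used in the paper:

```python
# Hypothetical sketch of step (1): build a multi-granularity degradation set by
# clustering per-image degradation descriptors and cutting the dendrogram at
# several granularities.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
deg_feats = rng.normal(size=(1000, 128))       # one descriptor per degraded image

Z = linkage(deg_feats, method="ward")          # agglomerative clustering tree
granularities = [1, 4, 16]                     # coarse -> fine expert counts
multi_granularity_labels = {
    k: fcluster(Z, t=k, criterion="maxclust")  # cluster id per image at level k
    for k in granularities
}
# Each level's clusters define the training data for that level's experts.
for k, labels in multi_granularity_labels.items():
    print(k, np.bincount(labels)[1:])          # cluster sizes at this granularity
```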
If you have any questions, please contact [email protected].
We thank the following image restoration works for their awesome backbones and code repositories:
@article{lin2024unirestorer,
title={UniRestorer: Universal Image Restoration via Adaptively Estimating Image Degradation at Proper Granularity},
author={Lin, Jingbo and Zhang, Zhilu and Li, Wenbo and Pei, Renjing and Xu, Hang and Zhang, Hongzhi and Zuo, Wangmeng},
journal={arXiv preprint arXiv:2412.20157},
year={2024}
}