
Multi-Scale Implicit Transformer with Re-parameterize for Arbitrary-Scale Super-Resolution

Author:
Jinchen Zhu, Mingjian Zhang, Ling Zheng, Shizhuang Weng
Keyword:
Computer Science, Computer Vision and Pattern Recognition (cs.CV)
journal:
--
date:
2024-03-11
Abstract
Recently, methods based on implicit neural representations have shown excellent capabilities for arbitrary-scale super-resolution (ASSR). Although these methods represent image features by generating latent codes, the latent codes are difficult to adapt to different super-resolution magnification factors, which seriously limits their performance. To address this, we design the Multi-Scale Implicit Transformer (MSIT), consisting of a Multi-Scale Neural Operator (MSNO) and Multi-Scale Self-Attention (MSSA). MSNO obtains multi-scale latent codes through feature enhancement, multi-scale characteristics extraction, and multi-scale characteristics merging. MSSA further enhances the multi-scale characteristics of the latent codes, resulting in better performance. Furthermore, to improve network performance, we propose the Re-Interaction Module (RIM), combined with a cumulative training strategy, to increase the diversity of the information learned by the network. We systematically introduce multi-scale characteristics into ASSR for the first time; extensive experiments validate the effectiveness of MSIT, and our method achieves state-of-the-art performance on arbitrary-scale super-resolution tasks.
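As a rough illustration of the MSNO idea described in the abstract (not the authors' implementation, whose details are in the PDF), the sketch below assumes a PyTorch-style module in which "feature enhancement" is a residual convolution block, "multi-scale characteristics extraction" is performed by parallel convolutions with different kernel sizes, and "multi-scale characteristics merging" is a concatenation followed by a 1x1 convolution. All class, parameter, and layer choices here are hypothetical.

```python
import torch
import torch.nn as nn


class MSNOSketch(nn.Module):
    """Hypothetical sketch of a multi-scale neural operator: enhance features,
    extract characteristics at several receptive fields, merge into latent codes."""

    def __init__(self, channels=64, kernel_sizes=(3, 5, 7)):
        super().__init__()
        # Feature enhancement (assumed: simple residual conv block)
        self.enhance = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.GELU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # Multi-scale characteristics extraction (assumed: parallel convs
        # whose kernel sizes give different receptive fields)
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        # Multi-scale characteristics merging (assumed: concat + 1x1 conv)
        self.merge = nn.Conv2d(channels * len(kernel_sizes), channels, 1)

    def forward(self, x):
        x = x + self.enhance(x)                      # feature enhancement
        feats = [branch(x) for branch in self.branches]
        return self.merge(torch.cat(feats, dim=1))   # multi-scale latent codes


# Usage: latent codes keep the input spatial size and channel count.
latent = MSNOSketch()(torch.randn(1, 64, 48, 48))
print(latent.shape)  # torch.Size([1, 64, 48, 48])
```

In the paper's pipeline these latent codes would then be refined by the multi-scale self-attention (MSSA) and queried at arbitrary coordinates to render the super-resolved output; that part is omitted here.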
PDF: Multi-Scale Implicit Transformer with Re-parameterize for Arbitrary-Scale Super-Resolution.pdf