OpenCodePapers

Deblurring on RealBlur-R (trained on GoPro)

Tasks: Blind Image Deblurring, Deblurring
Results over time
Leaderboard
| Paper | Code | PSNR (sRGB) | SSIM (sRGB) | Model | Release Date |
|---|---|---|---|---|---|
| AdaRevD: Adaptive Patch Exiting Reversible Decoder Pushes the Limit of Image Deblurring | ✓ | 36.53 | 0.957 | AdaRevD | 2024-06-13 |
| Learning Enriched Features via Selective State Spaces Model for Efficient Image Deblurring | ✓ | 36.35 | 0.961 | ALGNet | 2024-03-29 |
| Uformer: A General U-Shaped Transformer for Image Restoration | ✓ | 36.22 | 0.957 | Uformer-B | 2021-06-06 |
| Restormer: Efficient Transformer for High-Resolution Image Restoration | ✓ | 36.19 | 0.957 | Restormer | 2021-11-18 |
| Intriguing Findings of Frequency Selection for Image Deblurring | ✓ | 36.11 | 0.955 | DeepRFT | 2021-11-23 |
| DeblurDiNAT: A Compact Model with Exceptional Generalization and Visual Fidelity on Unseen Domains | ✓ | 36.09 | 0.955 | DeblurDiNAT-L | 2024-03-19 |
| Revisiting Image Deblurring with an Efficient ConvNet | ✓ | 36.08 | 0.955 | LaKDNet | 2023-02-04 |
| Multi-Stage Progressive Image Restoration | ✓ | 35.99 | 0.952 | MPRNet | 2021-02-04 |
| MSSNet: Multi-Scale-Stage Network for Single Image Deblurring | ✓ | 35.93 | 0.953 | MSSNet | 2022-02-19 |
| MAXIM: Multi-Axis MLP for Image Processing | ✓ | 35.78 | — | MAXIM | 2022-01-09 |
| Blind Image Deblurring Using Dark Channel Prior | — | 34.01 | 0.916 | Pan et al. | 2016-06-01 |
| Deblurring Low-light Images with Light Streaks | — | 33.67 | 0.916 | Hu et al. | 2014-06-01 |
| Deep Stacked Hierarchical Multi-patch Network for Image Deblurring | ✓ | — | 0.948 | DMPHN | 2019-04-06 |
| Scale-recurrent Network for Deep Image Deblurring | ✓ | — | 0.947 | SRN | 2018-02-06 |
| Dynamic Scene Deblurring Using Spatially Variant Recurrent Neural Networks | ✓ | — | 0.947 | Zhang et al. | 2018-06-01 |
| DeblurGAN-v2: Deblurring (Orders-of-Magnitude) Faster and Better | ✓ | — | 0.944 | DeblurGAN-v2 | 2019-08-10 |
| Unnatural L0 Sparse Representation for Natural Image Deblurring | — | — | 0.937 | Xu et al. | 2013-06-01 |
| DeblurGAN: Blind Motion Deblurring Using Conditional Adversarial Networks | ✓ | — | 0.903 | DeblurGAN | 2017-11-19 |
| Deep Multi-scale Convolutional Neural Network for Dynamic Scene Deblurring | ✓ | — | 0.841 | Nah et al. | 2016-12-07 |
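For reference, the ranking metric above is PSNR computed on sRGB images. The sketch below illustrates the standard PSNR definition only; it is not the official RealBlur evaluation script, which additionally performs geometric alignment between each blurry/sharp pair before scoring.

```python
import numpy as np

def psnr(reference: np.ndarray, restored: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two 8-bit sRGB images."""
    ref = reference.astype(np.float64)
    out = restored.astype(np.float64)
    mse = np.mean((ref - out) ** 2)  # mean squared error over all pixels/channels
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a constant offset of 2 gray levels gives MSE = 4,
# so PSNR = 10 * log10(255^2 / 4) ≈ 42.11 dB.
ref = np.full((64, 64, 3), 128, dtype=np.uint8)
out = np.full((64, 64, 3), 130, dtype=np.uint8)
print(round(psnr(ref, out), 2))  # → 42.11
```

SSIM (sRGB) is typically computed with a windowed implementation such as `skimage.metrics.structural_similarity` rather than a one-liner, since it depends on local means, variances, and covariances.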