Jimmy Guerrero for Voxel51

ECCV 2024 Redux: Robust Calibration of Large Vision-Language Adapters

We empirically demonstrate that popular CLIP adaptation approaches, such as Adapters, Prompt Learning, and Test-Time Adaptation, substantially degrade the calibration of the zero-shot baseline in the presence of distributional drift. We identify the increase in logit ranges as the underlying cause of miscalibration in CLIP adaptation methods, in contrast with previous work on calibrating fully supervised models. Motivated by these observations, we present a simple, model-agnostic solution that mitigates miscalibration by scaling the logit range of each sample to match that of its zero-shot prediction logits.
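The core idea above, scaling each sample's adapted logits so that their range matches the range of the zero-shot logits, can be sketched in a few lines. The function name and the exact per-sample scaling rule here are illustrative assumptions based on the abstract, not the paper's reference implementation:

```python
import numpy as np

def rescale_to_zeroshot_range(adapted_logits, zeroshot_logits):
    """Per-sample logit-range scaling (illustrative sketch).

    For each sample, multiplies the adapted logits by the ratio of the
    zero-shot logit range (max - min) to the adapted logit range, so the
    rescaled logits span the same range as the zero-shot prediction.
    The class ranking (argmax) is unchanged; only the confidence of the
    resulting softmax probabilities is affected.
    """
    adapted_logits = np.asarray(adapted_logits, dtype=float)
    zeroshot_logits = np.asarray(zeroshot_logits, dtype=float)

    a_range = adapted_logits.max(axis=-1, keepdims=True) - \
              adapted_logits.min(axis=-1, keepdims=True)
    z_range = zeroshot_logits.max(axis=-1, keepdims=True) - \
              zeroshot_logits.min(axis=-1, keepdims=True)

    return adapted_logits * (z_range / a_range)
```

Because the transformation is a positive per-sample scalar, it never changes which class is predicted; it only tempers overconfident logits, which is why it can be applied to any adapter or prompt-learning method post hoc.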

ECCV 2024 Paper: Robust Calibration of Large Vision-Language Adapters

About the Speaker: Balamurali Murugesan is currently pursuing his Ph.D., developing reliable deep learning models. Previously, he completed his master's thesis on accelerating MRI reconstruction. He has published more than 25 research articles in renowned venues.

Recorded on Nov 19, 2024
