
Omansh Sharma

When having a better display backfires: fixing 'blurry' VLC in Hyprland


I dual booted into Arch Linux a few months ago and started running Hyprland on it. The experience is out of this world. I just love how I can open any number of windows without bothering with layout, navigate between them without lifting a finger off my keyboard, and when it gets messy just open a new workspace. So naturally I have been trying to slowly shift most of my work from Windows to my Linux setup.

This weekend I decided to finally watch a movie I had been wanting to see for a long time. I thought, why not use Arch for that? I already had VLC installed, but I had never used it. When I opened VLC, I noticed it was all blurry. So I spent my afternoon solving this issue. The issue turned out to be pretty interesting, so I thought I'd write a short blog about it.

Background

To understand why a simple video player looks blurry, we need some technical context:

1. HiDPI (High Dots Per Inch) Displays

This is the physical reality of your screen. Historically, many UI toolkits assumed ~96 DPI as a reference value, so a button defined as 50 pixels wide came out at a comfortable physical size. Modern screens pack far more pixels into the same physical space. If Qt draws that same 50-pixel button on a HiDPI screen, it will appear tiny unless it knows it should rasterize a 100-pixel button instead (scale factor 2).

When developers design an interface, they define components using logical sizes, like '12pt font' or '100 logical pixels.' The compositor then provides a scale factor telling applications how large their buffers should be relative to the physical pixels.

If an application renders at scale 1 while the compositor later enlarges it to scale 2, the UI becomes blurry. If the application renders directly at scale 2, the UI stays sharp.

So the important question is: who performs the scaling — the application or the compositor?
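The arithmetic behind the scale factor is simple; here is an illustrative sketch (the 50-pixel button and scale values are made-up numbers, not from any real toolkit):

```shell
# A button defined as 50 logical pixels
logical=50

# At scale 1 (the classic ~96 DPI reference), the buffer matches the logical size
echo "scale 1: $((logical * 1))px"

# At scale 2 (HiDPI), the app must rasterize twice as many pixels per dimension
echo "scale 2: $((logical * 2))px"
```

A scale-2 buffer therefore holds 4x the pixels of a scale-1 buffer (2x in each dimension), which is why rendering at the right scale up front matters.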

2. Qt

Qt is the framework VLC uses to create its interface. Its primary job is Rasterization.

What is Rasterization?

Think of it as the difference between a blueprint and a photograph.

  • The Blueprint (Vector): The application code describes the UI mathematically: "Draw a perfect circle with a 50-unit radius." This mathematical description has infinite resolution—it's just an equation.
  • The Photograph (Raster): To show this on a screen, Qt must convert that equation into a specific grid of colored dots (pixels). This process of turning "math" into "dots" is rasterization.

Once Qt rasterizes the interface, it stores the result in a chunk of memory called a Graphical Buffer. Think of the buffer as a finished bitmap image (like a PNG) sitting in your RAM. Qt then hands this finished image to the Display Server to be placed on the screen.

Rasterization translates Qt's internal vector math into the physical dots on your screen.

3. The Display Server / Compositor

The Display Server is the piece of software that sits between the applications (clients) and the hardware (kernel/GPU). Its job is to take the buffers (images) from all your open apps and decide exactly where they appear on the physical screen.

4. The Protocols: X11 vs Wayland

These are the communication standards that define how an application talks to the display hardware.

X11

Designed in the 1980s as a network-transparent protocol, X11 mandates a strict separation of concerns. It requires three distinct components:

  1. The Client (App): Requests to draw a window.
  2. The X Server (Display Server): Owns the hardware and draws what clients request.
  3. The Window Manager: A separate program that decides where windows go.

This architecture relies on a Global Coordinate Space, where the Server manages a single, unified canvas across all screens. Because every app shares the same global grid, they all have to agree on the "scale" of that grid. If you have a HiDPI screen that needs a dense grid and a normal screen that needs a coarse grid, X11 struggles to stretch just one part of the canvas.

While X11 does have DPI awareness through settings like Xft.dpi, it's system-wide rather than per-monitor, and many legacy apps simply ignore it and default to rendering pixels for a low-res screen.
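For reference, that legacy knob is typically set in `~/.Xresources` and loaded with `xrdb` (the value 192 here is illustrative: 2x the 96 DPI reference):

```
! ~/.Xresources -- system-wide, applies to every X11 app that honors Xft settings
Xft.dpi: 192
```

Because this is one global value, a mixed setup with one HiDPI and one normal monitor cannot be handled cleanly.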

Wayland

Wayland was designed to eliminate this inefficiency. The protocol defines a direct relationship between the client and the entity controlling the screen. To implement the Wayland protocol, the "Display Server" and the "Window Manager" must be merged into a single piece of software called the Compositor (in our case, Hyprland).

  • Direct Control: Clients allocate their own buffers and hand them directly to the Compositor.
  • Isolation: Instead of a shared global grid, Wayland gives every app its own isolated Surface.

Wayland handles HiDPI seamlessly. The compositor advertises a scale factor and the application renders its buffer at that scale. The app observes the scale and redraws itself with high-resolution assets. Because the buffer is produced at the correct size from the start, the compositor can display it directly without stretching it.
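In Hyprland, for example, the scale factor the compositor advertises is the last field of the monitor line in the config (the monitor name, mode, and position below are placeholders for whatever your setup uses):

```
# hyprland.conf -- monitor = name, resolution, position, scale
monitor = eDP-1, 2880x1800@90, 0x0, 2
```

You can check what scale each monitor is currently using with `hyprctl monitors`.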

5. The Bridge: XWayland

X11 was the standard for over 30 years. Thousands of applications (like older games, Steam, and yes, VLC 3.0) were written to speak only X11. Since standard X11 applications cannot communicate via the Wayland protocol, XWayland acts as a translation layer. It is a fully functional X Server that runs inside your Wayland session.

  • To the App (VLC): XWayland looks like a standard X Server.
  • To the Compositor (Hyprland): XWayland looks like a single application window.

But this creates a problem. X11 apps render assuming a single unscaled coordinate space. In a scaled Wayland session, the compositor must enlarge that already rendered output to match the display scale. This leads to two possibilities:

  • Let them render at native 1x resolution (sharp but tiny)
  • Scale them up (readable size but blurry)

Hyprland chooses the second option by default so the UI remains usable. However this means the compositor stretches a low‑resolution buffer, causing blur.
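You can verify whether a window is going through this translation layer. In Hyprland, `hyprctl clients` lists every open window along with an `xwayland:` flag (a diagnostic sketch; the exact output format may vary between versions):

```shell
# List open windows and look for VLC's entry;
# "xwayland: 1" means it is running through the XWayland translation layer
hyprctl clients | grep -i -B 5 -A 10 vlc
```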

The issue and the fix

VLC 3.0 lacks native Wayland support, so it falls back to XWayland. Under XWayland it renders at 1x, and the Wayland compositor (Hyprland) then scales the finished image to fit the screen, resulting in a low-resolution display. But this can be fixed, because unlike legacy X11 apps, Qt can handle scaling internally!

To fix this, first stop Hyprland from upscaling XWayland windows by adding this to your Hyprland config file:

```
xwayland {
    force_zero_scaling = true
}
```

This makes VLC render sharp, but tiny. To get it back to the correct size, we tell Qt itself to render at 2x:

```
QT_SCALE_FACTOR=2 vlc
```

Qt now renders the UI at 2x internally, and since force scaling is disabled, Hyprland doesn't scale it again, giving you sharp, properly‑sized text.
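To avoid typing the variable every launch, one option (a sketch; the source path assumes the usual Arch layout, and the scale value matches my monitor) is to override VLC's desktop entry with the variable baked into its `Exec` lines:

```shell
# Copy the system desktop entry to the user level, where it shadows the original,
# then prefix its Exec lines so launchers start VLC with QT_SCALE_FACTOR set
cp /usr/share/applications/vlc.desktop ~/.local/share/applications/
sed -i 's|^Exec=|Exec=env QT_SCALE_FACTOR=2 |' ~/.local/share/applications/vlc.desktop
```

Per the Desktop Entry convention, entries in `~/.local/share/applications` take precedence over the system-wide ones, so launchers pick up the patched version automatically.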

You might ask: why does Hyprland's scaling cause blur while Qt's doesn't? Because Hyprland scales pixels after rasterization, stretching an already-finished bitmap, while Qt scales before rasterization, using the geometry of the interface to draw everything at the higher resolution from the start.

Closing Thoughts

This journey gave me an opportunity to get to know the Linux GUI ecosystem a bit better. I am still learning the intricacies of the Linux graphics stack (it’s a deep rabbit hole!). If I have oversimplified any architectural details or if you know a cleaner way to handle this in Hyprland, I would love to hear your feedback. Feel free to reach out or correct me!
