David Ortinau
My Top 5 UI Moments of 2025 (So Far)

We are now at the end of the month, and I've had the benefit of reading everyone's MAUI UI July 2025 posts. They have been fantastic!

As the product owner for .NET MAUI I get to talk to so many of you and peek behind the scenes at the amazing cross-platform experiences you're building with .NET MAUI. You are the real pros. You work in the trenches between demanding users and often more demanding stakeholders. I applaud you! Please continue sharing your wisdom with our amazing .NET community.

In this blog I'm NOT going to rehash Telepathy, the AI-infused task app that I showcased at Build 2025 and blogged about. If you explore the code, there are quite a few fun little UI design touches in there, like animation, SkiaSharp animation, gradient text, and gradient borders.

Now here are my top 5 UI moments from 2025 (so far).

One: Do I Need WYSIWYG?

Every quarter we send out a survey asking for feedback about developers' experiences building .NET MAUI apps. This July we saw steady annual improvement in our customer satisfaction score (+13%) and our net promoter score (+43%). It turns out, you like the quality focus we've been giving the product for the last several releases.

One of the things that regularly shows up as a dislike is the lack of a drag-and-drop visual designer. We don't have one, and we don't have plans to build one. What, then, is the best way to achieve your design goals without a WYSIWYG surface?

Let me feature two options I've found to be very useful.

Copilot Vision renderings of AirBnB, Spotify, and Camera app screens

GitHub Copilot Vision

Recently I was spinning up a series of UI examples to validate some .NET 10 safe area work the team had started, and I didn't have much time to build them. Part of our culture these days is to immediately ask, "How could I do this with AI?" I started prompting.

Create a new ContentPage called AirBnBPage.xaml and create a page to imitates the design of the detail view from the AirBnB app with a hero image that extends to the top of the screen. It should scroll and have sample content like the real app.

Use the new APIs for SafeArea described here https://github.com/dotnet/maui/issues/28986 and here https://github.com/dotnet/maui/pull/30337

Update the MainPage to be a navigation to sub pages like this new one. We will add more pages in the future

It did an okay job originally, but I suddenly had a foggy recollection of a feature I knew we were working on whereby I could provide GitHub Copilot an image and it could render that UI in XAML. I wasn't sure it had shipped, so I just gave it a go. I took a couple of screenshots of the AirBnB app on my phone to capture different states of the UI, AirDropped them to my MacBook, and dragged them into the chat box.

Update the AirBnB page to reflect this design. The 3 images are all the same page in different states of scrolling vertically.

The results were excellent! I just needed to provide some more clarity on behavior.

the buttons for back, share, and like should not scroll with the content. They should remain in the same position always.

All in, this probably took me 15 minutes, most of it spent patiently watching Copilot do the work.

By the way, if you're interested in how to use the new SafeAreaEdges API in .NET 10, here's the gist. This new API works on all platforms, not just iOS. And instead of being an attached property, it's now a property on specific containers.

  • Layout.SafeAreaEdges
  • ContentView.SafeAreaEdges
  • ContentPage.SafeAreaEdges
  • Border.SafeAreaEdges
  • ScrollView.SafeAreaEdges

When setting SafeAreaEdges you'll set flags for Left, Top, Right, and Bottom.

public enum SafeAreaRegions
{
    None = 0,          // Edge-to-edge content (no safe area padding)
    SoftInput = 1,     // Always pad for keyboard/soft input
    Container = 2,     // Flow under keyboard, stay out of bars/notch  
    Default = 4,       // Platform default behavior
    All = int.MaxValue // Obey all safe area insets
}

Here's an example of how I did this for the Spotify page. All I really care about is making sure some of the content starts and flows to the top edge. By default the rest will obey the safe area edges. As you can see, setting the value doesn't propagate down the visual tree.

<ContentPage>
  <Grid SafeAreaEdges="None">
    <ScrollView x:Name="MainScrollView" SafeAreaEdges="None">
      <VerticalStackLayout Spacing="0">
        <Grid x:Name="HeroSection">
          <Image Source="hero.png" />
          <!-- more code -->
        </Grid>
      </VerticalStackLayout>
    </ScrollView>

    <Grid x:Name="TopNavigation" SafeAreaEdges="All">
      <!-- more code -->
    </Grid>
  </Grid>
</ContentPage>

Did you know about the Vision capability of GitHub Copilot? If you didn't catch the announcement of this feature, read about it here. This is not only for design. Use it for anything you want to SHOW to Copilot.

You can also read more about it here and watch it in action here.

Figma to Code with MCP

Figma has several integration options, and this year is of course the year of the MCP (Model Context Protocol). At Build 2025 we included a brief demo of using an MCP to bring a Figma design into VS and render the design. For years now Figma has been my primary design tool, so this is very appealing to me.

Check out GLips/Figma-Context-MCP for details on how to set up and use this MCP in your preferred editor.
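To give a sense of the setup, the MCP server is registered in your editor's MCP configuration. At the time of writing, the repo's README suggests something along these lines (treat this as a sketch: the package name, flags, and config location come from my reading of the repo and may have changed, so check the README for the current shape):

```json
{
  "mcpServers": {
    "Framelink Figma MCP": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--figma-api-key=YOUR-KEY", "--stdio"]
    }
  }
}
```

You supply a Figma API token so the server can read your design files, and then your agent can pull layout and style data straight from a Figma URL.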

Two: I Relearned How to Draw

vocabulary matching screen

The main app that I keep iterating on is Sentence Studio, a playground for language learning. As you can see from the screenshots, it has a minimal design aesthetic. Last year I had it running on my e-ink Boox tablet, and any heavier design would have been a waste. I digress.

When I see some activity that could support my studies, I bring that idea into the app. Why not just use the various apps out there? I found it unproductive to constantly switch between different vocabulary contexts and leveling (what counts as beginner vs. intermediate). With my app I control the vocab and grammar skills, and pair them with different activities.

activities dashboard screen

Shadowing is a technique where you repeat a phrase exactly as you hear it. The goal is to mimic the intonation and pacing while also practicing fundamental pronunciation. I thought to myself, wouldn't it be great to clip some audio and be able to play it back with some visual cues to practice against?

Solving good audio

Ideally, I would love to have native speaker recordings to play back. I made a quick attempt to use the YouTube API to grab one of the audio streams by providing the URL and the timecodes I wanted, but that didn't come together quickly. I had already been using the OpenAI text-to-speech feature, so I just implemented that. The results for Korean (my target language) were laughable. Seriously, Rachel just lol'd at me when I sent it to her.

Then I found ElevenLabs, a company using AI to model native speaker voices. You can actually even use it to record and model your own voice (or model Gerald and have his voice narrate all your videos). The library of voice profiles was much better for my purposes, so I grabbed a selection of Korean models and implemented that instead.

For the visualization I wanted to produce a waveform of the audio. I had done something similar pre-Microsoft for a startup that was watermarking audio and video to prove provenance. So, somewhere in my brain I had the understanding of working with an audio byte stream, and somewhere on a NAS I have the code. But this story isn't about how to write the code, it's about how to get the result.

Despite having AI at my fingertips, the initial results were not great. In fact, it had the peaks inverted. I had what looked like a waveform, but it just didn't match what I expected. I compared the drawing it created with the actual wave in Adobe Audition, and that's when I realized it was inside out.
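For the curious, the core of a waveform drawing is just bucketing samples into per-pixel peaks. Here's a minimal sketch of the idea (not the app's actual code, and assuming 16-bit PCM samples already decoded into a short array):

```csharp
// Bucket 16-bit PCM samples into one normalized peak per horizontal pixel.
// The drawing code can then mirror each peak above and below the midline.
float[] BuildPeaks(short[] samples, int widthInPixels)
{
    var peaks = new float[widthInPixels];
    int samplesPerPixel = Math.Max(1, samples.Length / widthInPixels);

    for (int x = 0; x < widthInPixels; x++)
    {
        int max = 0;
        int start = x * samplesPerPixel;
        int end = Math.Min(start + samplesPerPixel, samples.Length);

        for (int i = start; i < end; i++)
        {
            // Cast to int before Abs to avoid overflow on short.MinValue
            int magnitude = Math.Abs((int)samples[i]);
            if (magnitude > max) max = magnitude;
        }

        peaks[x] = Math.Min(max / 32768f, 1f); // normalize to 0..1
    }
    return peaks;
}
```

The absolute-value step is the kind of detail that's easy to get wrong, and flipping it produces exactly the inside-out look I described.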

Through this exercise I (re)learned:

  • coding is a collaborative exercise with Copilot
  • use Copilot to refresh your understanding of fundamentals (e.g. audio encoding/decoding, file formats)
  • point Copilot to documentation resources and constantly add them to the copilot-instructions.md

This drawing uses the .NET MAUI GraphicsView and Microsoft.Maui.Graphics for drawing. MauiReactor made it really easy to wire up interactions on the GraphicsView.

var graphicsView = new MauiReactor.GraphicsView(graphicsViewRef => _graphicsViewRef = graphicsViewRef)
    .Drawable(_drawable)
    .HeightRequest(_height)
    .HStart()
    .VCenter()
    .OnStartInteraction(OnWaveformStartInteraction)
    .OnEndInteraction(OnWaveformEndInteraction)
    .OnDragInteraction(OnWaveformDragInteraction);

If you want to explore this deeper, head to davidortinau/SentenceStudio and click the Copilot button at the top of the page. Ask something like:

Explain to me the drawing implementation on ShadowingPage and how it renders the waveform

Three: Why So Slow?

One of the activities I added this month to Sentence Studio is a reading view. The idea is that I can take a transcript from any content I'm studying, generate natural audio for it, and then read along as it highlights sentence by sentence. In addition, I can see vocabulary highlighted and tap those words to get translations. Other words are also tappable and will retrieve translations from OpenAI.

So I prompted this feature, and the initial implementation was very close to exactly the experience I wanted. There was one exception: the UI was absurdly slow! The sentence highlighting lagged behind the audio, and any touch interaction started a cascade of freezes. (I wish I could find a video to show you.)

I tried release mode and it was certainly much better. Still, it wasn't acceptable. Why? There were actually several problems that I needed to work through before I got to the big win, and unfortunately, I couldn't YOLO a fix with Copilot this time.

  • the generated code was guessing at the timing
  • the generated audio wasn't including the transcript timecode metadata despite me providing the documentation ahead of time (I didn't explicitly tell the model which API to use)
  • the highlighting code was looping over loops over loops
  • the text was all Label and Span rendering thousands of UI elements

So, once I set it straight on the API to use and directed it to STOP guessing (work smarter, not harder), the results were much improved, but I still needed to solve the problem of thousands of UI elements.

If you've read any of the other articles this month, you've likely seen SkiaSharp mentioned already. While drawing and text aren't usually a combination I consider, I figured it was worth a shot. I asked Copilot first for an analysis of what the improvement might be along with a plan.

Alright! Why not, let's do it. My prompt was something like:

Do it!
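To make the idea concrete, here is a simplified sketch (not the app's actual code; the TranscriptCanvas name and the layout numbers are mine) of drawing the whole transcript on a single SkiaSharp canvas instead of thousands of Label and Span elements:

```csharp
using System;
using System.Collections.Generic;
using SkiaSharp;
using SkiaSharp.Views.Maui;
using SkiaSharp.Views.Maui.Controls;

// Draw every sentence onto one canvas and repaint only when the
// highlighted sentence changes, instead of juggling thousands of views.
public class TranscriptCanvas : SKCanvasView
{
    public IReadOnlyList<string> Sentences { get; set; } = Array.Empty<string>();

    int _currentIndex;
    public int CurrentIndex
    {
        get => _currentIndex;
        set { _currentIndex = value; InvalidateSurface(); } // trigger a repaint
    }

    protected override void OnPaintSurface(SKPaintSurfaceEventArgs e)
    {
        base.OnPaintSurface(e);
        var canvas = e.Surface.Canvas;
        canvas.Clear(SKColors.White);

        using var text = new SKPaint { Color = SKColors.Black, TextSize = 40, IsAntialias = true };
        using var highlight = new SKPaint { Color = SKColors.Yellow.WithAlpha(90) };

        float y = 60;
        for (int i = 0; i < Sentences.Count; i++)
        {
            if (i == CurrentIndex)
            {
                // Draw the highlight behind the current sentence
                float width = text.MeasureText(Sentences[i]);
                canvas.DrawRect(10, y - text.TextSize, width + 10, text.TextSize + 12, highlight);
            }
            canvas.DrawText(Sentences[i], 10, y, text);
            y += 60;
        }
    }
}
```

One canvas invalidation per highlight change is dramatically cheaper than relayout of a Label-and-Span tree.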

reading page before and after font fix

My first run was VERY fast, but the reading experience was lacking something (can you guess what it was from the image?). This led me to figuring out how best to surface fonts to the SkiaSharp context, since it's not the same as a MauiFont, which is loaded into memory.

using var fontStream = await FileSystem.OpenAppPackageFileAsync("fonts/bm_yeonsung.ttf");
typeface = SKTypeface.FromStream(fontStream);

However, to make sure the build process doesn't exclude the file as a duplicate, I needed to call out the font I wanted with a unique logical name. I received good pointers from this issue discussion, as well as by nagging Matthew Leibowitz directly.

<!-- Custom Fonts -->
<MauiFont Include="Resources\Fonts\*" />

<!-- Korean font also as MauiAsset for SkiaSharp direct access (different logical name) -->
<MauiAsset Include="Resources\Fonts\bm_yeonsung.ttf" LogicalName="fonts/bm_yeonsung.ttf" />
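With the csproj entry in place, the loaded SKTypeface just needs to be handed to the paint used for drawing. A small sketch (my own fragment, not the app's code):

```csharp
using SkiaSharp;

// Hand the loaded SKTypeface to the paint so Korean glyphs
// render with the custom font instead of a fallback.
void DrawKoreanText(SKCanvas canvas, SKTypeface typeface)
{
    using var paint = new SKPaint
    {
        Typeface = typeface,   // the SKTypeface loaded from the MauiAsset
        TextSize = 42,
        IsAntialias = true,
        Color = SKColors.Black
    };
    canvas.DrawText("안녕하세요", 20, 60, paint);
}
```

Without an explicit typeface, SkiaSharp falls back to a default that may not cover Hangul at all, which was exactly what my first run was missing.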

Four: I'm a game developer (once again)

If you've followed me for any duration, it's not long before I drop my favorite F-bomb. That's right. Flash. From around 1998 until shortly after the letter, I wrote a lot of ActionScript, and it was often for some kind of game. Those were good times!

Back in April an ad kept popping up in my Instagram feed for a kids game device called GiiKER Super Decode. It just looked really cool! If you have little kids you want to keep occupied maybe without a screen, then go snag one.

It inspired me to see if I could prompt my way to a decent little game. I took a bit of Wordle, some arcade cabinet button designs, and learned how Mastermind was actually played (I never understood that game as a kid). I think davidortinau/CodeBreakerGame turned out pretty darn good!

The thing I really loved about my experience making this game was just how easy it was to try different things! If I were coding it all myself by hand like a caveman, I would have thought twice about experimenting with wacky ideas. Now, though, all I had to do was type a few directives and walk away for a few minutes while Copilot churned it out.

The ROI on this kind of iteration is amazing! I can have an idea, try it out for a few minutes, and see how it feels. Oh, the game is too easy? Too hard? Too fast? The buttons are too small? What if the buttons were disabled when not in play? On and on I can poke at ideas, and in only an afternoon I have an app.

Five: Colors

Whenever I start a new app I copy over a theme file (ResourceDictionary) from another app. I have a variety of semantic styles I like to use, and what really needs to change is just a few colors.

When I was working on Telepathy, I wanted to set it apart visually from the default sample content template we ship with .NET MAUI (You know about that, right? dotnet new maui -n AwesomeToDo -sc). As you can see below, I went through quite a few color theme variations.

What I wanted to experiment with was creating an app that would generate an entire color palette and theme that I could copy/paste into a project.

I saw that Ado blogged about Reactor Theme earlier this month, so I'll look at incorporating that too.

set of screenshots for the Color Stylerator app
dark mode screenshots

I spent quite a bit of time with ChatGPT back in April going over the best approaches to creating such a theme that worked independently of Material Design, Fluent Design, etc. What I decided on was entering two colors, a primary and a secondary. From those, the app would generate everything I needed for a light and a dark mode, using sensible defaults for the light and dark background colors, which I could override if I so desired.
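As a flavor of what those "sensible defaults" can mean, a dark-mode variant of a brand color can be derived by nudging it toward white so it keeps contrast on a dark background. A sketch (my own illustration, not the app's actual algorithm), using Microsoft.Maui.Graphics.Color:

```csharp
using Microsoft.Maui.Graphics;

// Lerp a color toward white by `amount` (0..1) to get a lighter
// variant that stays readable on dark backgrounds.
Color Lighten(Color c, float amount) =>
    Color.FromRgb(
        c.Red + (1f - c.Red) * amount,
        c.Green + (1f - c.Green) * amount,
        c.Blue + (1f - c.Blue) * amount);

var primary = Color.FromArgb("#0078D4");
var primaryDark = Lighten(primary, 0.35f); // dark-mode variant of Primary
```

Real palette generators usually work in HSL or OKLCH space for better perceptual results, but the lerp captures the basic shape of the derivation.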

The app will output a ResourceDictionary or C# theme file in a variety of formats:

  • AppThemeColor
  • AppThemeBinding
  • DynamicResource
  • MauiReactor

AppThemeColor from the .NET MAUI Community Toolkit is one of my favorites because I only need to create one color with 2 variations.

<toolkit:AppThemeColor x:Key="Primary"
        Light="#0078D4"
        Dark="#219EFE" />

It's applied using a markup extension like this.

<Setter Property="BackgroundColor" Value="{toolkit:AppThemeResource Primary}" />

AppThemeBinding is the default way to do theme aware colors in .NET MAUI.

<Setter Property="Stroke" Value="{AppThemeBinding Light={StaticResource Gray200}, Dark={StaticResource Gray500}}" />

DynamicResource is an often overlooked option. It's suitable for supporting more than light and dark themes. Perhaps you have dozens of themes the user can choose from. Then you'd use DynamicResource so the user can apply them at runtime. I would typically keep colors in separate RDs, and then load the one I want to use. As the color Primary changes, the dynamic resources pick those changes up.

 <Setter Property="Background" Value="{DynamicResource Primary}" />
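Swapping palettes at runtime can then be as simple as replacing a merged ResourceDictionary; because the setters use DynamicResource, the UI picks up the new colors immediately. A sketch (my own, using a hypothetical "Primary" key that every theme dictionary defines):

```csharp
using System.Linq;
using Microsoft.Maui.Controls;

// Replace the currently merged theme dictionary with a new one.
// Every setter bound via DynamicResource updates at runtime.
void ApplyThemeDictionary(ResourceDictionary newTheme)
{
    var merged = Application.Current!.Resources.MergedDictionaries;

    // Find the previously applied theme by a key all themes define.
    var current = merged.FirstOrDefault(d => d.ContainsKey("Primary"));
    if (current is not null)
        merged.Remove(current);

    merged.Add(newTheme);
}
```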

In fact, DynamicResource is the approach that this app takes. In my case I'm generating the theme at runtime and applying the colors as resources:

public void ApplyThemeColors(string primary, string secondary,
                            string lightBackground, string darkBackground)
{
    // Check if the app is in dark mode
    var isDark = Application.Current?.RequestedTheme == AppTheme.Dark;

    // Generate the theme using the new overload that handles both light and dark backgrounds
    // and automatically generates appropriate dark variants of primary and secondary colors
    var theme = ColorThemeGenerator.Generate(
        primary: Color.FromArgb(primary),
        secondary: Color.FromArgb(secondary),
        lightBackground: Color.FromArgb(lightBackground),
        darkBackground: Color.FromArgb(darkBackground),
        isDarkTheme: isDark);

    // Apply the theme
    Application.Current?.Resources.ApplyTheme(theme, isDark);
}
public static void ApplyTheme(this ResourceDictionary resources, ColorTheme theme, bool isDarkMode)
{
    resources[nameof(theme.Primary)] = isDarkMode ? theme.PrimaryDark : theme.Primary;
    resources[nameof(theme.OnPrimary)] = isDarkMode ? theme.OnPrimaryDark : theme.OnPrimary;

    resources[nameof(theme.Secondary)] = isDarkMode ? theme.SecondaryDark : theme.Secondary;
    resources[nameof(theme.OnSecondary)] = isDarkMode ? theme.OnSecondaryDark : theme.OnSecondary;

    resources[nameof(theme.Background)] = isDarkMode ? theme.BackgroundDark : theme.Background;
    resources[nameof(theme.Surface0)] = isDarkMode ? theme.Surface0Dark : theme.Surface0;
    resources[nameof(theme.Surface1)] = isDarkMode ? theme.Surface1Dark : theme.Surface1;
    resources[nameof(theme.Surface2)] = isDarkMode ? theme.Surface2Dark : theme.Surface2;
    resources[nameof(theme.Surface3)] = isDarkMode ? theme.Surface3Dark : theme.Surface3;
    resources[nameof(theme.OnBackground)] = isDarkMode ? theme.OnBackgroundDark : theme.OnBackground;
    resources[nameof(theme.OnSurface)] = isDarkMode ? theme.OnSurfaceDark : theme.OnSurface;

    resources[nameof(theme.Error)] = isDarkMode ? theme.ErrorDark : theme.Error;
    resources[nameof(theme.Success)] = isDarkMode ? theme.SuccessDark : theme.Success;
    resources[nameof(theme.Info)] = isDarkMode ? theme.InfoDark : theme.Info;
}

And finally, my preferred way to write apps, MauiReactor.

IndicatorViewStyles.Default = _ => _
  .IndicatorColor(IsLightTheme ? Gray200 : Gray500)
  .SelectedIndicatorColor(IsLightTheme ? Gray950 : Gray100);

I'd say this app is definitely in an alpha state, so if you're interested, please use it and send some PRs! Or do a really good job on an issue description and I'll assign it to the GitHub Copilot agent. :)

In Conclusion

I've had a very productive year so far, both in terms of learning and producing. Our space is moving so quickly, and information is so widely available that I'm not sure if I'm ahead or behind! Are you also experiencing this wonderful boom of productivity and creativity? Let me know in the comments!
