Mike Young

Originally published at aimodels.fyi

First Cross-App Multimodal Search Dataset Shows How Users Navigate Mobile Apps

This is a Plain English Papers summary of a research paper called First Cross-App Multimodal Search Dataset Shows How Users Navigate Mobile Apps. If you like these kinds of analyses, you should join AImodels.fyi or follow us on Twitter.

Overview

  • Qilin dataset contains 8.4 million multimodal search sessions from real mobile app usage
  • Features interactions across 9 different mobile apps with diverse content types
  • Includes text, image, and hybrid search queries with corresponding search results
  • First dataset to track user behaviors across multiple apps in sequence
  • Contains 2.2 million unique images and 6.9 million text documents
  • Enables research on how users navigate between different mobile applications
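To make the structure described above concrete, here is a minimal sketch of what a cross-app multimodal session record might look like. The class and field names (`Session`, `Interaction`, `Query`, `apps_visited`) are illustrative assumptions, not the actual Qilin schema: a session groups interactions, each interaction belongs to one app, and a query can carry text, an image, or both (hybrid).

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record types illustrating the kind of structure a
# cross-app multimodal search session could have. Field names are
# assumptions for illustration, not the published Qilin schema.

@dataclass
class Query:
    text: Optional[str] = None      # text query, if any
    image_id: Optional[str] = None  # image query, if any (both set = hybrid)

@dataclass
class Interaction:
    app: str                        # which mobile app the action happened in
    query: Query
    result_ids: list[str] = field(default_factory=list)   # returned items
    clicked_ids: list[str] = field(default_factory=list)  # items the user opened

@dataclass
class Session:
    session_id: str
    interactions: list[Interaction] = field(default_factory=list)

    def apps_visited(self) -> list[str]:
        """Ordered, de-duplicated list of apps the user moved through."""
        seen: list[str] = []
        for it in self.interactions:
            if it.app not in seen:
                seen.append(it.app)
        return seen

# Example: a text search in a news app, then a switch to an
# e-commerce app with a hybrid (text + image) query.
session = Session("s1", [
    Interaction("news", Query(text="wireless earbuds review"), ["d1", "d2"], ["d1"]),
    Interaction("ecommerce", Query(text="wireless earbuds", image_id="img42"), ["p9"]),
])
print(session.apps_visited())  # ['news', 'ecommerce']
```

Tracking the app attached to each interaction, rather than flattening everything into one query log, is what makes the cross-app navigation analysis described above possible.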

Plain English Explanation

The Qilin dataset is like a digital diary of how people actually search for information on their phones. Instead of just tracking what people search for on a single app like Google, it follows users as they hop between different apps like news readers, social media, e-commerce ...

Click here to read the full summary of this paper
