ViksaaSkool

(Idea + Sensors + Compose + Lottie)*AI Agents = Beautiful UI

Idea

In the AI era of software development, the doing part has become significantly cheaper — especially in terms of time. Complex ideas are now just a prompt away from becoming an MVP.

This shift means that standing out is no longer about what you build, but how well you build it. Quality, polish, and thoughtful UX are what separate good products from forgettable ones.

In this post, I’ll walk through a step-by-step guide to implementing a widely used UI widget: a steps counter component that visualizes a user’s daily activity as progress toward a goal (typically 10,000 steps).

The component features an animated humanoid figure that mirrors the user’s activity — when the user walks, the figure walks; when the user is idle, the figure remains still. Color is used as a second signal: progress shifts from red (far from the goal) to green (close to completion).

The goal is to demonstrate how small details and animation can turn a standard metric into a more engaging, human-centered experience.

(Mockups: idea state 1 and idea state 2.)

Sensors

The first thing that needs to be done is to create a mechanism that tracks movement:

  • it needs to be able to tell whether the user is indoors or outdoors.
  • if the user is indoors, it should rely on device sensors: Sensor.TYPE_LINEAR_ACCELERATION and Sensor.TYPE_GYROSCOPE.
  • if the user is outdoors, it should rely on ActivityRecognitionResult, i.e. distinguish between DetectedActivity.WALKING, DetectedActivity.RUNNING and DetectedActivity.ON_FOOT in MotionTransitionReceiver.

For that I need to update the AndroidManifest.xml:


<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
    <uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>

    <uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />


and


<receiver
    android:name=".motion.transition.MotionTransitionReceiver"
    android:exported="false" />

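Both ActivityRecognition and the fused location provider come from Google Play services, so the module also needs the play-services-location dependency (the exact version below is an assumption):


// module-level build.gradle.kts -- the version number here is an assumption
dependencies {
    implementation("com.google.android.gms:play-services-location:21.3.0")
}
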

Then, the more important part:


// How updates are requested: the detection interval is passed in milliseconds
fun ComponentActivity.requestActivityUpdates() = ActivityRecognition.getClient(this)
    .requestActivityUpdates(
        1500, // detection interval in ms
        getMotionPendingIntent()
    )
    .addOnSuccessListener {
        Timber.d("Periodic updates registered")
    }

// A simple heuristic for deciding whether the user is indoors or outdoors:
// a good GPS fix (accuracy <= 20 m) plus altitude or bearing data suggests open sky
fun Location.isLikelyOutdoors(): Boolean {
    val goodAccuracy = accuracy <= 20f
    val hasAltitude = hasAltitude()
    val hasBearing = hasBearing()

    return goodAccuracy && (hasAltitude || hasBearing)
}

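requestActivityUpdates() relies on a getMotionPendingIntent() helper, and MotionDetector later calls a matching removeActivityUpdates(); neither is shown in the post, but a minimal sketch (request code and flags are assumptions) could look like this:


// Hypothetical sketch: a PendingIntent targeting MotionTransitionReceiver.
// FLAG_MUTABLE is needed so the system can attach the ActivityRecognitionResult.
fun Context.getMotionPendingIntent(): PendingIntent = PendingIntent.getBroadcast(
    this,
    0,
    Intent(this, MotionTransitionReceiver::class.java),
    PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE
)

// Hypothetical counterpart of requestActivityUpdates(), used when updates are no longer needed
fun ComponentActivity.removeActivityUpdates() = ActivityRecognition.getClient(this)
    .removeActivityUpdates(getMotionPendingIntent())
    .addOnSuccessListener {
        Timber.d("Periodic updates removed")
    }
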

Two listener interfaces (OutdoorStateListener and MotionBasedOnSensorListener) are defined so the implementations can be injected and passed around:

interface MotionBasedOnSensorListener {

    fun start()
    fun stop()
    val isMoving: StateFlow<Boolean>
}

interface OutdoorStateListener {

    fun start()
    fun stop()
    val isOutdoor: StateFlow<Boolean>
    val shouldRequestPermission: StateFlow<Boolean>
}

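Since StepsViewModel later injects these by interface, the concrete implementations need to be bound somewhere; a plausible Hilt module (module name and scoping are assumptions) could look like:


// Hypothetical Hilt module binding the concrete detectors to their listener interfaces
@Module
@InstallIn(SingletonComponent::class)
abstract class MotionModule {

    @Binds
    @Singleton
    abstract fun bindMotionBasedOnSensorListener(
        impl: MotionBasedOnSensorDetector
    ): MotionBasedOnSensorListener

    @Binds
    @Singleton
    abstract fun bindOutdoorStateListener(
        impl: OutdoorStateMonitor
    ): OutdoorStateListener
}
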

The main idea of MotionBasedOnSensorDetector is captured in onSensorChanged():


class MotionBasedOnSensorDetector @Inject constructor(
    @ApplicationContext val context: Context
) : SensorEventListener, MotionBasedOnSensorListener { 

    private val _isMoving = MutableStateFlow(false)
    override val isMoving: StateFlow<Boolean> = _isMoving

//...

    override fun onSensorChanged(event: SensorEvent) {
        val values = event.values
        val magnitude = sqrt(values[0] * values[0] + values[1] * values[1] + values[2] * values[2])

        val now = System.currentTimeMillis()

        val moving = when (event.sensor.type) {
            Sensor.TYPE_LINEAR_ACCELERATION -> magnitude > thresholdAccel
            Sensor.TYPE_GYROSCOPE -> magnitude > thresholdGyro
            else -> false
        }

        if (moving) {
            lastMovementTime = now
            _isMoving.value = true
        } else {
            if (now - lastMovementTime > 1200) {  // 1.2s without movement
                _isMoving.value = false
            }
        }
    }


}

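The elided start()/stop() methods essentially register and unregister the two sensors; a hedged sketch of what they might look like:


// Hypothetical sketch of the elided start()/stop(): register/unregister both sensors
private val sensorManager
    get() = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

override fun start() {
    sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION)?.let {
        sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
    }
    sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
        sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
    }
}

override fun stop() {
    sensorManager.unregisterListener(this)
    _isMoving.value = false
}
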

OutdoorStateMonitor is the implementation of OutdoorStateListener:


class OutdoorStateMonitor @Inject constructor(
    @ApplicationContext val context: Context
) : OutdoorStateListener {

//...
    private val callback = object : LocationCallback() {
        override fun onLocationResult(result: LocationResult) {
            val location = result.lastLocation ?: return

            val isOutdoor = location.isLikelyOutdoors()

            if (lastIsOutdoor != isOutdoor) {
                lastIsOutdoor = isOutdoor
                Timber.d("OutdoorStateMonitor | isOutdoor = $isOutdoor, acc=${location.accuracy}")
                context.debugToast("isOutdoor = $isOutdoor, acc=${location.accuracy}")
                _isOutdoor.value = isOutdoor
            }
        }
    }

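The elided start() would request periodic location updates and feed them into the callback above; a sketch, assuming the fused location provider and a 10-second interval:


// Hypothetical sketch of the elided start()/stop(): periodic fused-location updates feed the callback.
// Interval and priority values are assumptions; permission handling happens in MotionDetector.
@SuppressLint("MissingPermission")
override fun start() {
    val client = LocationServices.getFusedLocationProviderClient(context)
    val request = LocationRequest.Builder(Priority.PRIORITY_BALANCED_POWER_ACCURACY, 10_000L).build()
    client.requestLocationUpdates(request, callback, Looper.getMainLooper())
}

override fun stop() {
    LocationServices.getFusedLocationProviderClient(context).removeLocationUpdates(callback)
}
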

As declared in the AndroidManifest.xml, there's a MotionTransitionReceiver, defined as:


class MotionTransitionReceiver : BroadcastReceiver() {

    override fun onReceive(context: Context, intent: Intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            ActivityRecognitionResult.extractResult(intent)?.let {
                val activity = it.mostProbableActivity
                Timber.d("Periodic update = ${activity.type.toActivityName()} confidence=${activity.confidence}")
                context.debugToast("onReceive() | event = ${activity.type.toActivityName()}, confidence=${activity.confidence}")

                val moving = activity.type == DetectedActivity.WALKING ||
                        activity.type == DetectedActivity.RUNNING ||
                        activity.type == DetectedActivity.ON_FOOT

                MotionState.setMoving(moving)
            }
        }
    }
}
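MotionState.setMoving(moving) writes into a small process-wide holder that the ViewModel can observe; it isn't shown in the post, but a minimal sketch would be:


// Hypothetical sketch of MotionState: a process-wide holder the receiver writes into
// and the ViewModel collects from
object MotionState {
    private val _isMoving = MutableStateFlow(false)
    val isMoving: StateFlow<Boolean> = _isMoving

    fun setMoving(moving: Boolean) {
        _isMoving.value = moving
    }
}
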

This forms a simple heuristic for movement detection. It is managed by a ViewModel, which then communicates with the UI in UDF fashion:


@HiltViewModel
class StepsViewModel @Inject constructor(
    val outdoorStateListener: OutdoorStateListener,
    val motionBasedOnSensorListener: MotionBasedOnSensorListener
) : IStepsViewModel() {

    private val _shouldStartMotionTransition = MutableStateFlow(false)
    override val shouldStartMotionTransition = _shouldStartMotionTransition.asStateFlow()

    private val _isMoving = MutableStateFlow(false)
    override val isMoving = _isMoving.asStateFlow()

//...

init {
        collectShouldRequestPermission()
        outdoorStateListener.start()
        collectOutdoorStateChange()
    }

    private fun collectOutdoorStateChange() = viewModelScope.launch {
        outdoorStateListener.isOutdoor.collect { isOutdoor ->
            Timber.d("collectOutdoorStateChange() |  isOutdoor = $isOutdoor")
            if (isOutdoor) {
                _shouldStartMotionTransition.value = true
                motionBasedOnSensorListener.stop()
                collectMotionTransitionState()
            } else {
                _shouldStartMotionTransition.value = false
                motionBasedOnSensorListener.start()
                collectMotionBasedOnSensor()
            }
        }
    }

//...

}

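The two elided collectors mirror whichever source is currently active into _isMoving; a hedged sketch (the single motionJob is an assumption, used here to avoid stacking collectors):


// Hypothetical sketch of the elided collectors: only one source is mirrored at a time
private var motionJob: Job? = null

private fun collectMotionTransitionState() {
    motionJob?.cancel()
    motionJob = viewModelScope.launch {
        MotionState.isMoving.collect { _isMoving.value = it }
    }
}

private fun collectMotionBasedOnSensor() {
    motionJob?.cancel()
    motionJob = viewModelScope.launch {
        motionBasedOnSensorListener.isMoving.collect { _isMoving.value = it }
    }
}
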

Compose + Lottie

Next, a Compose component needs to be created: a circular progress indicator with a Lottie animation in the center.
The easy part is drawing the animation; the hard part is making the Lottie animation tintable.


@Composable
fun CircularFillWithLottie(
    modifier: Modifier = Modifier,
    progressPercentage: Int,
    fillColor: Color = Color.Green,
    backgroundColor: Color = Color.LightGray,
    strokeWidth: Dp = 8.dp,
    animationDuration: Int = 800,
    animationEasing: Easing = FastOutSlowInEasing,
    lottieRes: Int? = null,
    isMoving: Boolean = true
) {
//...
}

//and then the most important component: 

@Composable
fun TintableLottie(
    @RawRes lottieRes: Int,
    fillColor: Color,
    isPlaying: Boolean = true,
    iterations: Int = LottieConstants.IterateForever
) {
    val composition by rememberLottieComposition(LottieCompositionSpec.RawRes(lottieRes))

    val lottieAnimState = animateLottieCompositionAsState(
        composition = composition,
        isPlaying = isPlaying,
        iterations = iterations
    )

    // keyPath = "**" applies the dynamic property to every layer in the composition
    val fillProperty = rememberLottieDynamicProperty(
        property = LottieProperty.COLOR,
        value = fillColor.toArgb(),
        keyPath = arrayOf("**")
    )

    val strokeProperty = rememberLottieDynamicProperty(
        property = LottieProperty.STROKE_COLOR,
        value = fillColor.toArgb(),
        keyPath = arrayOf("**")
    )

    val filterProperty = rememberLottieDynamicProperty(
        property = LottieProperty.COLOR_FILTER,
        value = SimpleColorFilter(fillColor.toArgb()),
        keyPath = arrayOf("**")
    )
    val properties = rememberLottieDynamicProperties(
        fillProperty,
        strokeProperty,
        filterProperty
    )

    // re-key on fillColor so the tint is re-applied whenever the progress color changes
    val dynamicProps = remember(fillColor) {
        properties
    }

    LottieAnimation(
        composition = composition,
        progress = { lottieAnimState.progress },
        dynamicProperties = dynamicProps
    )
}

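The body of CircularFillWithLottie is elided above; one way to implement it (a sketch, not necessarily the post's exact drawing code) is to animate the sweep angle, draw a background ring and a progress arc with Canvas, and center the tintable Lottie inside:


// Hedged sketch of the elided body: animated progress arc with the Lottie centered inside
val animatedProgress by animateFloatAsState(
    targetValue = progressPercentage.coerceIn(0, 100) / 100f,
    animationSpec = tween(durationMillis = animationDuration, easing = animationEasing),
    label = "progress"
)

Box(modifier = modifier, contentAlignment = Alignment.Center) {
    Canvas(modifier = Modifier.fillMaxSize()) {
        val stroke = Stroke(width = strokeWidth.toPx(), cap = StrokeCap.Round)
        // background ring
        drawArc(color = backgroundColor, startAngle = 0f, sweepAngle = 360f, useCenter = false, style = stroke)
        // progress arc, starting at 12 o'clock
        drawArc(color = fillColor, startAngle = -90f, sweepAngle = 360f * animatedProgress, useCenter = false, style = stroke)
    }
    lottieRes?.let {
        TintableLottie(lottieRes = it, fillColor = fillColor, isPlaying = isMoving)
    }
}
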

Note: the Lottie animation was found on lottiefiles.com.

To tie everything together, a separate component, MotionDetector, is implemented with the sole purpose of fetching the motion data and passing it to the UI. That's done in this manner:


@Composable
fun MotionDetector(
    stepsViewModel: IStepsViewModel = hiltViewModel<StepsViewModel>(),
    onMovementChange: (Boolean) -> Unit
) {
    val activity = LocalActivity.current
    val shouldStart by stepsViewModel.shouldStartMotionTransition.collectAsState()
    val shouldRequestPermission by stepsViewModel.shouldRequestPermission.collectAsState()
    val isMoving by stepsViewModel.isMoving.collectAsState()
    onMovementChange(isMoving)
    if (activity is ComponentActivity) {
        val permissions = arrayOf(
            Manifest.permission.ACTIVITY_RECOGNITION,
            Manifest.permission.ACCESS_FINE_LOCATION,
        )

        val launcher = rememberLauncherForActivityResult(
            contract = ActivityResultContracts.RequestMultiplePermissions()
        ) { results ->
            val allGranted = results.values.all { it }

            if (allGranted) {
                activity.requestActivityUpdates()
                if (shouldRequestPermission) {
                    stepsViewModel.onPermissionGranted()
                }
            } else {
                Toast.makeText(activity, "Permissions denied", Toast.LENGTH_SHORT).show()
            }
        }

        val allGranted = permissions.all {
            ContextCompat.checkSelfPermission(activity, it) == PackageManager.PERMISSION_GRANTED
        }

        LaunchedEffect(shouldStart) {
            if (shouldStart) {
                if (allGranted) {
                    activity.requestActivityUpdates()
                } else {
                    launcher.launch(permissions)
                }
            } else {
                activity.removeActivityUpdates()
            }
        }

        LaunchedEffect(shouldRequestPermission) {
            if (shouldRequestPermission) {
                launcher.launch(permissions)
            }
        }

        DisposableEffect(Unit) {
            onDispose {
                activity.removeActivityUpdates()
            }
        }
    }

}


And then the final component:

@Composable
fun StepsContainer(stepsViewModel: IStepsViewModel = hiltViewModel<StepsViewModel>()) {

    var isMoving: Boolean by remember { mutableStateOf(false) }
    var progressPercentage: Int by remember { mutableIntStateOf((0..100).random()) } // random for the demo; this should come from a real health/steps API

    MotionDetector(stepsViewModel) {
        isMoving = it
    }

    Scaffold(modifier = Modifier.fillMaxSize()) { innerPadding ->
        Box(
            modifier = Modifier
                .fillMaxSize()
                .padding(innerPadding),
            contentAlignment = Alignment.Center
        ) {
            CircularFillWithLottie(
                progressPercentage = progressPercentage,
                modifier = Modifier.size(150.dp),
                fillColor = progressPercentage.toProgressColor(),
                backgroundColor = Color.LightGray,
                strokeWidth = 12.dp,
                lottieRes = R.raw.walker_man,
                isMoving = isMoving
            )
        }
    }
}

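toProgressColor() maps progress to the red-to-green signal described in the intro; it isn't shown in the post, but a plausible sketch using Compose's color lerp:


// Hypothetical sketch: red far from the goal, green when close, linearly interpolated
fun Int.toProgressColor(): Color =
    lerp(Color.Red, Color.Green, coerceIn(0, 100) / 100f)
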

AI Agents

Needless to say, you can achieve this using any of the most popular AI coding tools. My choice was ChatGPT, but I bet you'd get equally good results with any other.

Result


(Demo video of the final component in action.)

The whole code is in this repo.
