<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Noah Velasco</title>
    <description>The latest articles on DEV Community by Noah Velasco (@noahvelasco).</description>
    <link>https://dev.to/noahvelasco</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1087671%2Feeb8025b-1cc8-4b0f-817d-b91adbfddfd5.png</url>
      <title>DEV Community: Noah Velasco</title>
      <link>https://dev.to/noahvelasco</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/noahvelasco"/>
    <language>en</language>
    <item>
      <title>React Native Maps: Easy Bird's Eye View Animation</title>
      <dc:creator>Noah Velasco</dc:creator>
      <pubDate>Tue, 22 Aug 2023 11:44:23 +0000</pubDate>
      <link>https://dev.to/noahvelasco/react-native-maps-easy-birds-eye-view-animation-23dj</link>
      <guid>https://dev.to/noahvelasco/react-native-maps-easy-birds-eye-view-animation-23dj</guid>
      <description>&lt;ol&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;Setting Up Project&lt;/li&gt;
&lt;li&gt;Understanding the MapView Props&lt;/li&gt;
&lt;li&gt;Creating the Animation&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;li&gt;&amp;lt;/&amp;gt; &lt;a href="https://github.com/noahvelasco/React-Native-Maps-Animation-Tutorial" rel="noopener noreferrer"&gt;Full Code&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Hey there, fellow developers! Ever thought about giving your React Native maps a fresh twist? Today, we're diving into a bird's-eye view animation that offers users an engaging, elevated perspective. And the best part? Both the map and the animation are completely free to use - just a few lines of code and we've got ourselves a nice bird's eye view animation! Our goal is to create something like the following - &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rzlngque8t64a23nwrj.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rzlngque8t64a23nwrj.gif" alt="final prod" width="600" height="1301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Setting Up Project
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://docs.expo.dev/get-started/create-a-project/" rel="noopener noreferrer"&gt;Create an Expo Project&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://docs.expo.dev/versions/latest/sdk/map-view/" rel="noopener noreferrer"&gt;Add the MapView dependency&lt;/a&gt;&lt;br&gt;
&lt;code&gt;npx expo install react-native-maps&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a basic Map View&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import React from "react";
import { StyleSheet, View } from "react-native";
import MapView from "react-native-maps";

export default function App() {
  return (
    &amp;lt;View style={styles.container}&amp;gt;
      &amp;lt;MapView style={styles.map} /&amp;gt;
    &amp;lt;/View&amp;gt;
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
  },
  map: {
    width: "100%",
    height: "100%",
  },
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and you should have something like the following.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4kqw0crl0qfk5wismvnt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4kqw0crl0qfk5wismvnt.png" alt="mapview default" width="800" height="1733"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;Great, we're already halfway there!&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding the MapView Props
&lt;/h3&gt;

&lt;p&gt;Now for the fun part - understanding the relevant MapView props! Reading the full &lt;a href="https://github.com/react-native-maps/react-native-maps/blob/master/docs/mapview.md" rel="noopener noreferrer"&gt;MapView documentation&lt;/a&gt;, you'll see there are &lt;strong&gt;MANY&lt;/strong&gt; props, but we will only need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;mapType&lt;/strong&gt;: Controls which map layer type we are using. I use "satellite" since we only care about the imagery and not pins or pin labels, but feel free to use any other layer.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;camera&lt;/strong&gt;: Controls which region we are viewing on screen, the heading (direction faced), the zoom level, and the pitch (the camera's up-down tilt).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now let's test these props by viewing the pyramids from directly overhead.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;MapView
        style={styles.map}
        mapType="satellite"
        camera={{
          center: {
            latitude: 29.978,
            longitude: 31.131,
          },
          pitch: 0, // Change this value to set the desired pitch
          heading: 0, // Direction faced by the camera, in degrees clockwise from North.
          zoom: 15.5, // Closer values mean a higher zoom level.
        }}
      /&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and we get the following -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg177z3lddsv5afcjh5c0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg177z3lddsv5afcjh5c0.png" alt="mapview basic camera unconfigured" width="800" height="1733"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Relative to the original, setting the &lt;strong&gt;pitch&lt;/strong&gt; to 90 we get -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vttf09sargalqwybu38.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vttf09sargalqwybu38.png" alt="modified pitch" width="800" height="1733"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can see the camera is now tilted, viewing the pyramids at an angle of depression instead of from straight above.&lt;/p&gt;

&lt;p&gt;Next - relative to the original, setting the &lt;strong&gt;heading&lt;/strong&gt; to 180 we get -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fom22t5rfvox34bqgkrpe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fom22t5rfvox34bqgkrpe.png" alt="modified heading" width="800" height="1733"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can see we have ended up on the opposite side of an imaginary orbit centered on the pyramids.&lt;/p&gt;

&lt;p&gt;We now know that &lt;strong&gt;pitch&lt;/strong&gt; controls the camera's downward viewing angle over the specified region, while &lt;strong&gt;heading&lt;/strong&gt; controls where we sit on an imaginary orbit around that region. Now let's animate it!&lt;/p&gt;
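&lt;p&gt;As a mental model, every frame of the orbit is the same camera object with a different heading. Here's a small sketch of that idea (the helper name &lt;code&gt;orbitCamera&lt;/code&gt; is hypothetical, not part of react-native-maps) -&lt;/p&gt;

```javascript
// Build the camera object for a given orbit heading (degrees clockwise from North).
// center, pitch, and zoom stay fixed; only heading varies along the orbit.
function orbitCamera(heading) {
  return {
    center: { latitude: 29.978, longitude: 31.131 }, // the pyramids, as above
    pitch: 90,
    heading: ((heading % 360) + 360) % 360, // normalize to [0, 360)
    zoom: 15.5,
  };
}
```

&lt;p&gt;Normalizing the heading means any value - even negative or over 360 - maps back onto the circle.&lt;/p&gt;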

&lt;h3&gt;
  
  
  Creating the Animation
&lt;/h3&gt;

&lt;p&gt;To create the animation, we first need to understand how it will work: we want the camera to stay focused on the region while it orbits around it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13oxa2tb31j5d5skq1e7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13oxa2tb31j5d5skq1e7.png" alt="animation sketch" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The only thing we need to update is our position on that circle, so we only need to change the heading at a regular interval. The center and the pitch will remain the same for the animation's entirety. &lt;/p&gt;

&lt;p&gt;To update the heading every so often we need to store it in state, so we will use useState, and update that state on an interval inside useEffect to get the "bird's eye view" orbiting animation (remember to import both from React). Below is what we add before the return -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; //animation to circle map
  const [heading, setHeading] = useState(0);
  useEffect(() =&amp;gt; {
    const intervalId = setInterval(() =&amp;gt; {
      setHeading((prevHeading) =&amp;gt; (prevHeading + 0.1) % 360); // Increment heading, and reset to 0 after reaching 360
    }, 10);

    return () =&amp;gt; clearInterval(intervalId); // Clear the interval when component is unmounted
  }, []);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
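&lt;p&gt;The 0.1° step every 10 ms sets the orbit speed: 10° per second, or 36 seconds per full revolution. If you want to tune the speed, the relationship is simple to sketch -&lt;/p&gt;

```javascript
// Seconds for one full 360-degree orbit, given the per-tick heading increment
// (in degrees) and the interval between ticks (in milliseconds).
function orbitPeriodSeconds(stepDegrees, intervalMs) {
  const degreesPerSecond = stepDegrees * (1000 / intervalMs);
  return 360 / degreesPerSecond;
}
```

&lt;p&gt;With the values above, &lt;code&gt;orbitPeriodSeconds(0.1, 10)&lt;/code&gt; is 36 seconds per orbit; doubling the step halves the period.&lt;/p&gt;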



&lt;p&gt;and update the &lt;code&gt;heading&lt;/code&gt; property in the camera prop to use our state -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;heading: heading, // Direction faced by the camera in degrees clockwise from North.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We are done and should have something like this now -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rzlngque8t64a23nwrj.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rzlngque8t64a23nwrj.gif" alt="final prod" width="600" height="1301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;And that's a wrap! With the bird's-eye view effect in place, your React Native maps just got a whole lot cooler. It's these thoughtful touches that can make an app truly stand out. Keep exploring, keep innovating, and as always, happy coding!&lt;/p&gt;

&lt;h3&gt;
  
  
  Please like and follow me on GitHub @&lt;a href="https://github.com/noahvelasco" rel="noopener noreferrer"&gt;noahvelasco&lt;/a&gt;!
&lt;/h3&gt;

&lt;h2&gt;
  
  
  &amp;lt;/&amp;gt; Full Code
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/noahvelasco/React-Native-Maps-Animation-Tutorial" rel="noopener noreferrer"&gt;Full Code on Github&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import React, { useState, useEffect } from "react";
import { StyleSheet, View } from "react-native";
import MapView from "react-native-maps";

export default function App() {
  //animation to circle map
  const [heading, setHeading] = useState(0);
  useEffect(() =&amp;gt; {
    const intervalId = setInterval(() =&amp;gt; {
      setHeading((prevHeading) =&amp;gt; (prevHeading + 0.1) % 360); // Increment heading, and reset to 0 after reaching 360
    }, 10);

    return () =&amp;gt; clearInterval(intervalId); // Clear the interval when component is unmounted
  }, []);

  return (
    &amp;lt;View style={styles.container}&amp;gt;
      &amp;lt;MapView
        style={styles.map}
        mapType="satellite"
        camera={{
          center: {
            latitude: 29.978,
            longitude: 31.131,
          },
          pitch: 90, // Change this value to set the desired pitch
          heading: heading, // Direction faced by the camera, in degrees clockwise from North.
          zoom: 15.5, // Closer values mean a higher zoom level.
        }}
      /&amp;gt;
    &amp;lt;/View&amp;gt;
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
  },
  map: {
    width: "100%",
    height: "100%",
  },
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>reactnative</category>
      <category>javascript</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Flutter + A.I. Text-To-Speech: A Simple Guide</title>
      <dc:creator>Noah Velasco</dc:creator>
      <pubDate>Tue, 23 May 2023 10:36:13 +0000</pubDate>
      <link>https://dev.to/noahvelasco/amplify-your-flutter-apps-with-elevenlabs-tts-api-a-simple-guide-5147</link>
      <guid>https://dev.to/noahvelasco/amplify-your-flutter-apps-with-elevenlabs-tts-api-a-simple-guide-5147</guid>
      <description>&lt;h3&gt;
  
  
  Please like and follow me on GitHub @&lt;a href="https://github.com/noahvelasco" rel="noopener noreferrer"&gt;noahvelasco&lt;/a&gt;!
&lt;/h3&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;a href="https://www.youtube.com/shorts/x4pe_Fh5CIk?feature=share" rel="noopener noreferrer"&gt;
      youtube.com
    &lt;/a&gt;
&lt;/div&gt;


&lt;p&gt;&lt;a href="https://github.com/noahvelasco/tts_dev/blob/main/lib/main.dart" rel="noopener noreferrer"&gt;GitHub Code&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;Get ElevenLabs API Key&lt;/li&gt;
&lt;li&gt;Flutter Project Configuration&lt;/li&gt;
&lt;li&gt;Basic UI&lt;/li&gt;
&lt;li&gt;API Key Setup&lt;/li&gt;
&lt;li&gt;ElevenLabs API Code Call&lt;/li&gt;
&lt;li&gt;Full Code&lt;/li&gt;
&lt;li&gt;Possible Errors&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Hey there, fellow developers! We're going to dive into the exciting realm of text-to-speech (TTS) integration in Flutter. In today's fast-paced world, multimedia experiences are key to engaging users, and TTS APIs have become our secret weapon. In this tutorial, I'll walk you through harnessing the ElevenLabs API to bring text-to-speech functionality to your Flutter applications, using a simple demo app. &lt;/p&gt;

&lt;p&gt;Whether you're building an educational app, adding an accessibility feature, or simply enhancing your user experience, this guide will equip you with all the know-how to get started.&lt;/p&gt;




&lt;h2&gt;
  
  
  Get ElevenLabs API Key
&lt;/h2&gt;

&lt;p&gt;First things first! Get your &lt;a href="https://docs.elevenlabs.io/welcome/introduction" rel="noopener noreferrer"&gt;API key&lt;/a&gt; from your ElevenLabs profile and save it somewhere! Don't worry, it's &lt;strong&gt;free for 10,000 characters a month once you sign up&lt;/strong&gt;. After you're done with this tutorial you're gonna want to pay them - it's REALLY good. Anyways, save the key - we will need it later!&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5eetsgim8tumv3lxoao.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5eetsgim8tumv3lxoao.png" alt="EL API Key Dialogue Box" width="711" height="591"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  Flutter Project Configuration
&lt;/h2&gt;

&lt;p&gt;Create a new Flutter project and follow the steps below for your platform. Do not skip any of them - enabling certain rules and permissions is necessary to make TTS possible! &lt;/p&gt;
&lt;h3&gt;
  
  
  Android
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Enable multidex support in the &lt;code&gt;android/app/build.gradle&lt;/code&gt; file
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;defaultConfig {
   ...
   multiDexEnabled true
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Enable Internet Connection on Android in &lt;code&gt;android/app/src/main/AndroidManifest.xml&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;uses-permission android:name="android.permission.INTERNET"/&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;and update the application tag&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;application ... android:usesCleartextTraffic="true"&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  iOS
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Enable internet connection on iOS in &lt;code&gt;ios/Runner/Info.plist&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;dict&amp;gt;
....
&amp;lt;key&amp;gt;NSAppTransportSecurity&amp;lt;/key&amp;gt;
&amp;lt;dict&amp;gt;
    &amp;lt;key&amp;gt;NSAllowsArbitraryLoads&amp;lt;/key&amp;gt;
    &amp;lt;true/&amp;gt;
&amp;lt;/dict&amp;gt;
...
&amp;lt;/dict&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Basic UI
&lt;/h2&gt;

&lt;p&gt;Let's code up a simple text form field and a button. The button will call the ElevenLabs API and play the input text through the speaker once pressed. First, let's set up the front end before any API calls -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:flutter/material.dart';

void main() =&amp;gt; runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'TTS Demo',
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() =&amp;gt; _MyHomePageState();
}

class _MyHomePageState extends State&amp;lt;MyHomePage&amp;gt; {
  TextEditingController _textFieldController = TextEditingController();

  @override
  void dispose() {
    _textFieldController.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('EL TTS Demo'),
      ),
      body: Padding(
        padding: const EdgeInsets.all(16.0),
        child: Column(
          crossAxisAlignment: CrossAxisAlignment.stretch,
          children: &amp;lt;Widget&amp;gt;[
            TextField(
              controller: _textFieldController,
              decoration: const InputDecoration(
                labelText: 'Enter some text',
              ),
            ),
            const SizedBox(height: 16.0),
            ElevatedButton(
              onPressed: () {
                //Eleven Labs API Call Here
              },
              child: const Icon(Icons.volume_up),
            ),
          ],
        ),
      ),
    );
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe5xengmbpv2e6e53wo30.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe5xengmbpv2e6e53wo30.png" alt="Basic UI" width="800" height="1688"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  API Key Setup
&lt;/h2&gt;

&lt;p&gt;We'll use the &lt;code&gt;flutter_dotenv&lt;/code&gt; package: create a .env file, insert our API key into it, and modify the pubspec.yaml file to include the .env file as described in the package &lt;a href="https://pub.dev/packages/flutter_dotenv" rel="noopener noreferrer"&gt;instructions&lt;/a&gt;. Follow the below steps - &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add package to project &lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;$ flutter pub add flutter_dotenv&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Make the following changes
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Create a .env file in the root directory&lt;/li&gt;
&lt;li&gt;Add the ElevenLabs API key to the .env file (as a string)&lt;/li&gt;
&lt;li&gt;Add the .env file to the pubspec.yaml assets section&lt;/li&gt;
&lt;li&gt;Add import to code (as seen below)&lt;/li&gt;
&lt;li&gt;Add the .env variable as a global (as seen below)&lt;/li&gt;
&lt;li&gt;Update main method code (as seen below)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:flutter_dotenv/flutter_dotenv.dart';

String EL_API_KEY = dotenv.env['EL_API_KEY'] as String;

Future main() async {
  await dotenv.load(fileName: ".env");

  runApp(MyApp());
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstob4wzivprh9j2krho0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstob4wzivprh9j2krho0.png" alt="dot env setup" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  ElevenLabs API Code Call
&lt;/h2&gt;

&lt;p&gt;Now for the fun part! Since we are going to be turning the text into speech using a REST API, we need a couple more packages. Follow the below - &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add the packages to the project &lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;$ flutter pub add http&lt;/code&gt;&lt;br&gt;
 &lt;code&gt;$ flutter pub add just_audio&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;Add the following imports
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:just_audio/just_audio.dart';
import 'package:http/http.dart';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Create an AudioPlayer object that will be responsible for playing the audio
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;final player = AudioPlayer(); //audio player obj that will play audio
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;To play the audio from raw bytes, we subclass &lt;code&gt;StreamAudioSource&lt;/code&gt; from the &lt;code&gt;just_audio&lt;/code&gt; package. Place the following outside the main() -
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Feed your own stream of bytes into the player
class MyCustomSource extends StreamAudioSource {
  final List&amp;lt;int&amp;gt; bytes;
  MyCustomSource(this.bytes);

  @override
  Future&amp;lt;StreamAudioResponse&amp;gt; request([int? start, int? end]) async {
    start ??= 0;
    end ??= bytes.length;
    return StreamAudioResponse(
      sourceLength: bytes.length,
      contentLength: end - start,
      offset: start,
      stream: Stream.value(bytes.sublist(start, end)),
      contentType: 'audio/mpeg',
    );
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
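&lt;p&gt;The only logic in &lt;code&gt;request&lt;/code&gt; is defaulting a missing range to the whole buffer and serving the [start, end) slice. The same logic in plain JavaScript, just to make it concrete (a language-neutral sketch, not part of either library) -&lt;/p&gt;

```javascript
// Mirror of MyCustomSource.request: default a missing start to 0 and a missing
// end to the full length, then describe the [start, end) slice being served.
function sliceRange(bytes, start, end) {
  start = start ?? 0;
  end = end ?? bytes.length;
  return {
    sourceLength: bytes.length, // total size of the audio
    contentLength: end - start, // size of this response
    offset: start,
    chunk: bytes.slice(start, end),
  };
}
```

&lt;p&gt;Calling it with no range serves the whole buffer; a partial range serves just that window, which is what lets the player seek.&lt;/p&gt;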



&lt;ul&gt;
&lt;li&gt;Now we can add the REST API function 'playTextToSpeech' to the class _MyHomePageState. It sends our 'text' to ElevenLabs, which responds with audio 'bytes' that our helper class 'MyCustomSource' turns into playable sound.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  //For the Text To Speech
  Future&amp;lt;void&amp;gt; playTextToSpeech(String text) async {

    String voiceRachel =
        '21m00Tcm4TlvDq8ikWAM'; //Rachel voice - change if you know another Voice ID

    String url = 'https://api.elevenlabs.io/v1/text-to-speech/$voiceRachel';
    final response = await http.post(
      Uri.parse(url),
      headers: {
        'accept': 'audio/mpeg',
        'xi-api-key': EL_API_KEY,
        'Content-Type': 'application/json',
      },
      body: json.encode({
        "text": text,
        "model_id": "eleven_monolingual_v1",
        "voice_settings": {"stability": .15, "similarity_boost": .75}
      }),
    );

    if (response.statusCode == 200) {
      final bytes = response.bodyBytes; //get the bytes ElevenLabs sent back
      await player.setAudioSource(MyCustomSource(
          bytes)); //send the bytes to be read from the JustAudio library
      player.play(); //play the audio
    } else {
      // throw Exception('Failed to load audio');
      return;
    }
  } 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
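&lt;p&gt;Stripped of Dart specifics, the call is a single JSON POST. A language-neutral sketch in JavaScript of the request being built (same endpoint, headers, and body fields as the Dart code above; the helper name is hypothetical, and no network call happens here) -&lt;/p&gt;

```javascript
// Assemble the ElevenLabs text-to-speech request, mirroring the Dart call above.
// Returns the pieces you would hand to an HTTP client.
function buildTtsRequest(apiKey, voiceId, text) {
  return {
    url: "https://api.elevenlabs.io/v1/text-to-speech/" + voiceId,
    method: "POST",
    headers: {
      "accept": "audio/mpeg", // we want audio bytes back
      "xi-api-key": apiKey, // your ElevenLabs key
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      text: text,
      model_id: "eleven_monolingual_v1",
      voice_settings: { stability: 0.15, similarity_boost: 0.75 },
    }),
  };
}
```

&lt;p&gt;Seeing the request this way makes it clear which parts are knobs (voice ID, stability, similarity boost) and which are fixed plumbing.&lt;/p&gt;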



&lt;p&gt;If you want to tweak the way the voice sounds, you can modify the voice (in this case we are using the Rachel voice ID - '21m00Tcm4TlvDq8ikWAM'), the stability, and the similarity boost. You can view the &lt;a href="https://api.elevenlabs.io/docs#/" rel="noopener noreferrer"&gt;API docs&lt;/a&gt; to go more in depth. &lt;/p&gt;

&lt;p&gt;To make this more user friendly, we can add a linear progress indicator that shows while a request is in progress.&lt;/p&gt;

&lt;h2&gt;
  
  
  Full Code
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:flutter/material.dart';
import 'dart:convert';

import 'package:flutter_dotenv/flutter_dotenv.dart';
import 'package:just_audio/just_audio.dart';
import 'package:http/http.dart' as http;

String EL_API_KEY = dotenv.env['EL_API_KEY'] as String;

Future main() async {
  await dotenv.load(fileName: ".env");

  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'TTS Demo',
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() =&amp;gt; _MyHomePageState();
}

class _MyHomePageState extends State&amp;lt;MyHomePage&amp;gt; {
  TextEditingController _textFieldController = TextEditingController();
  final player = AudioPlayer(); //audio player obj that will play audio
  bool _isLoadingVoice = false; //for the progress indicator

  @override
  void dispose() {
    _textFieldController.dispose();
    player.dispose();
    super.dispose();
  }

  //For the Text To Speech
  Future&amp;lt;void&amp;gt; playTextToSpeech(String text) async {
    //display the loading icon while we wait for request
    setState(() {
      _isLoadingVoice = true; //progress indicator turn on now
    });

    String voiceRachel =
        '21m00Tcm4TlvDq8ikWAM'; //Rachel voice - change if you know another Voice ID

    String url = 'https://api.elevenlabs.io/v1/text-to-speech/$voiceRachel';
    final response = await http.post(
      Uri.parse(url),
      headers: {
        'accept': 'audio/mpeg',
        'xi-api-key': EL_API_KEY,
        'Content-Type': 'application/json',
      },
      body: json.encode({
        "text": text,
        "model_id": "eleven_monolingual_v1",
        "voice_settings": {"stability": .15, "similarity_boost": .75}
      }),
    );

    setState(() {
      _isLoadingVoice = false; //progress indicator turn off now
    });

    if (response.statusCode == 200) {
      final bytes = response.bodyBytes; //get the bytes ElevenLabs sent back
      await player.setAudioSource(MyCustomSource(
          bytes)); //send the bytes to be read from the JustAudio library
      player.play(); //play the audio
    } else {
      // throw Exception('Failed to load audio');
      return;
    }
  } //getResponse from Eleven Labs

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('EL TTS Demo'),
      ),
      body: Padding(
        padding: const EdgeInsets.all(16.0),
        child: Column(
          crossAxisAlignment: CrossAxisAlignment.stretch,
          children: &amp;lt;Widget&amp;gt;[
            TextField(
              controller: _textFieldController,
              decoration: const InputDecoration(
                labelText: 'Enter some text',
              ),
            ),
            const SizedBox(height: 16.0),
            ElevatedButton(
              onPressed: () {
                playTextToSpeech(_textFieldController.text);
              },
              child: _isLoadingVoice
                  ? const LinearProgressIndicator()
                  : const Icon(Icons.volume_up),
            ),
          ],
        ),
      ),
    );
  }
}

// Feed your own stream of bytes into the player
class MyCustomSource extends StreamAudioSource {
  final List&amp;lt;int&amp;gt; bytes;
  MyCustomSource(this.bytes);

  @override
  Future&amp;lt;StreamAudioResponse&amp;gt; request([int? start, int? end]) async {
    start ??= 0;
    end ??= bytes.length;
    return StreamAudioResponse(
      sourceLength: bytes.length,
      contentLength: end - start,
      offset: start,
      stream: Stream.value(bytes.sublist(start, end)),
      contentType: 'audio/mpeg',
    );
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Possible Errors
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Check the 'Flutter Project Configuration' section above&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Hey, you did it! You've now got the superpower to integrate text-to-speech into your Flutter apps like a pro. By adding this awesome feature, you're taking your users' experience to a whole new level, making your app more accessible and engaging. Don't forget to keep exploring the endless possibilities offered by your chosen API and have fun experimenting with different customization options.&lt;/p&gt;




&lt;h3&gt;
  
  
  Please like and follow me on GitHub @&lt;a href="https://github.com/noahvelasco" rel="noopener noreferrer"&gt;noahvelasco&lt;/a&gt;!
&lt;/h3&gt;

</description>
      <category>flutter</category>
      <category>api</category>
      <category>elevenlabs</category>
      <category>dart</category>
    </item>
  </channel>
</rss>
