<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Alan Voice AI Platform</title>
    <description>The latest articles on DEV Community by Alan Voice AI Platform (@alanvoiceai).</description>
    <link>https://dev.to/alanvoiceai</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F201069%2F2d59b587-fcc8-42cd-949f-153b4c0f16be.png</url>
      <title>DEV Community: Alan Voice AI Platform</title>
      <link>https://dev.to/alanvoiceai</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/alanvoiceai"/>
    <language>en</language>
    <item>
      <title>How to add a Voice Assistant to your existing Application</title>
      <dc:creator>Alan Voice AI Platform</dc:creator>
      <pubDate>Fri, 22 May 2020 18:25:48 +0000</pubDate>
      <link>https://dev.to/alanvoiceai/how-to-add-a-voice-assistant-to-your-existing-application-3m3p</link>
      <guid>https://dev.to/alanvoiceai/how-to-add-a-voice-assistant-to-your-existing-application-3m3p</guid>
      <description>&lt;p&gt;&lt;strong&gt;How exactly do you add a voice interface to your app?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you’ve been doing software development lately, you’ve probably (definitely) heard about the hot new fields of machine learning and artificial intelligence — helping us create smarter applications that can improve themselves. You may have even heard of natural language processing, a specific type of machine learning that deals with understanding human speech: making sense of what we tell each other.&lt;/p&gt;

&lt;p&gt;While natural language processing is still a very complex topic, we here at Alan are taking steps to solve it — and make it accessible to everyone. With Alan, anyone can integrate a complete voice interface into their applications without the complexity.&lt;/p&gt;

&lt;p&gt;Put broadly, Alan is a complete Conversational Voice AI Platform that lets you Build, Debug, Integrate, and Iterate on a voice assistant for your application.&lt;/p&gt;

&lt;p&gt;Previously, you would’ve had to work from the ground up: learning Python, creating your machine learning model, hosting it on the cloud, training Speech Recognition software, and tediously integrating it all into your app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IQHm5tLF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2adqohpdnm0t9lyf3hzq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IQHm5tLF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2adqohpdnm0t9lyf3hzq.png" alt="Alan Platform Diagram"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Alan Platform Diagram&lt;/em&gt;&lt;/p&gt;



&lt;p&gt;The Alan Platform automates this with its cloud-based infrastructure — incorporating a large number of advanced voice recognition and Spoken Language Understanding technologies. This enables Alan to support complete conversational voice experiences — defined by developers using Alan Studio scripts, written in JavaScript. Alan integrates voice AI into any application with easy-to-use SDKs.&lt;/p&gt;

&lt;p&gt;Voice interfaces in existing applications apply to a variety of use cases — from healthcare, to industrial maintenance, to emergency services. Alan has been used by leading industry professionals in all types of work.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cZRZWXco--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8cngy1ss2axq1iwk4x5u.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cZRZWXco--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8cngy1ss2axq1iwk4x5u.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Alan Use Cases&lt;/em&gt;&lt;/p&gt;



&lt;p&gt;Unlike Google Assistant and Alexa, Alan integrates with whatever application you’re developing, meaning that you won’t need to change your existing workflows or UI.&lt;/p&gt;

&lt;p&gt;In addition, Alan uses JavaScript and simple IDEs that are accessible to any developer — foregoing current template-driven approaches in favor of powerful and robust voice dialogues with both voice and visual interactions. With all the heavy lifting done for you, Alan enables developers to roll out conversational voice and hands-free usage in just a couple of days.&lt;/p&gt;

&lt;p&gt;To show you the power of the Alan Platform, we’ll start by building our own simple voice script to define the experience; then, we’ll add it to a mobile application. The mobile application we’ll be using here is the &lt;a href="https://github.com/alan-ai/alan-sdk-ios/tree/master/examples/SAP_SampleApplication"&gt;SAP Deliveries app&lt;/a&gt; available on the Alan Github page.&lt;/p&gt;

&lt;p&gt;To start, please download and install &lt;a href="https://git-lfs.github.com/"&gt;Git Large File Storage&lt;/a&gt; and make sure you have the latest version of Xcode and Swift.&lt;/p&gt;

&lt;p&gt;We can download the Alan iOS SDK here:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/alan-ai/alan-sdk-ios.git
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The application will be located in the &lt;em&gt;examples/&lt;/em&gt; directory and titled &lt;em&gt;SAP_SampleApplication/&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Now that we have our mobile app saved on our computer, we can start with the voice script. Remember where this is saved — we’ll need to come back for it later!&lt;/p&gt;

&lt;h2&gt;
  
  
  Building Your Alan Application
&lt;/h2&gt;

&lt;p&gt;First, &lt;a href="https://studio.alan.app/"&gt;sign up&lt;/a&gt; and create an Alan Studio account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PYr2fao9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/9l746crir4d9hdm8kvwg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PYr2fao9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/9l746crir4d9hdm8kvwg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, log in to Alan and you’ll see the project dashboard. Here, we’ll create a sample project for the open source SAP Deliveries Application for iOS (which we downloaded before).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EAd5ehJ2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/7gdvuhhyi0u2qcwo94gg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EAd5ehJ2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/7gdvuhhyi0u2qcwo94gg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Previous models would require thousands of lines of code to host, develop, train, and deploy — Alan automates all these tasks and develops your models for you.&lt;/p&gt;

&lt;p&gt;In the project, click the ‘Add Script’ button and select the &lt;em&gt;SAP_Deliveries_Data&lt;/em&gt; and the &lt;em&gt;SAP_Deliveries_Logic&lt;/em&gt; scripts.&lt;/p&gt;

&lt;p&gt;Make sure that the &lt;em&gt;SAP_Deliveries_Data&lt;/em&gt; script is listed first.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--74z0dTCJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/bnfyau6sd58hyyrxwwh4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--74z0dTCJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/bnfyau6sd58hyyrxwwh4.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To understand the Alan voice scripts, there are two essential features we need to know — intents and entities.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Intents — the phrases we want to recognize, like “what products are available” or “how much does the notebook cost?”&lt;/li&gt;
&lt;li&gt;Entities — the keywords within those intents. Product names or product suppliers, for example, are specific words relevant to the functioning of the app.&lt;/li&gt;
&lt;/ol&gt;
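&lt;p&gt;To make these two ideas concrete, here is a tiny, self-contained JavaScript sketch of how an intent template with an entity slot could be matched against an utterance. It is purely illustrative: the &lt;code&gt;$(NAME option1|option2)&lt;/code&gt; notation and the function names are our own, not Alan’s API.&lt;/p&gt;

```javascript
// Toy sketch of intents and entities; not the Alan API.
// An entity slot in a pattern is written as $(NAME option1|option2).
const normalize = (s) => s.toLowerCase().replace(/[?.!,]/g, "").trim();

function compileIntent(pattern) {
  const names = [];
  // Turn each entity slot into a regex capture group, remembering its name.
  const src = pattern.replace(/\$\((\w+) ([^)]+)\)/g, (_, name, options) => {
    names.push(name);
    return "(" + options + ")";
  });
  const regex = new RegExp("^" + normalize(src) + "$");
  return (utterance) => {
    const m = normalize(utterance).match(regex);
    if (!m) return null;
    const entities = {};
    names.forEach((n, i) => { entities[n] = m[i + 1]; });
    return entities;
  };
}

const howMuch = compileIntent("How much does the $(PRODUCT notebook|monitor) cost?");
console.log(howMuch("How much does the notebook cost?")); // { PRODUCT: 'notebook' }
console.log(howMuch("What time is it?"));                 // null
```

&lt;p&gt;Real Alan scripts go far beyond this toy; the point is simply that an intent is a phrase pattern, and an entity is a named value extracted from within it.&lt;/p&gt;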

&lt;p&gt;In these scripts, Alan supports advanced language definition tools that can be used to make intents and entities of any complexity. Entities like lists loaded from databases or fuzzy entities are critical in many different use cases and can be handled by Alan’s advanced dialog management system.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r2mBNCEj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/57z1bobhmnwjo5lo1ojg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r2mBNCEj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/57z1bobhmnwjo5lo1ojg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alan also supports UI navigation with voice commands — enabling users to navigate through the different screens in an application and creating a seamless user experience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Cozdc-sL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/g59w0zuhykdz6pm1ugms.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Cozdc-sL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/g59w0zuhykdz6pm1ugms.gif" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we understand the basic parts of our script, we can move to debugging our voice experience in the application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Debugging
&lt;/h2&gt;

&lt;p&gt;Alan provides a host of features that make debugging easy and efficient.&lt;/p&gt;

&lt;p&gt;First, we can test out this script in the Debug Chat by pressing the Alan button and asking “What products are available?”&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hL36hv2a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2k3g8fqzp7qpokolwlaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hL36hv2a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2k3g8fqzp7qpokolwlaj.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, we can see that Alan replies back to the user and sends a corresponding visual update recognizing the Product intent.&lt;/p&gt;

&lt;p&gt;Many applications have complex workflows and could have dozens or hundreds of intents. While debugging, Alan lets you see which intents are available in the current context and what has occurred in the current dialog flow — showing the intent that was used. That makes your script easy to debug even with the most complex intents and user flows.&lt;/p&gt;

&lt;p&gt;Finally, Alan provides a dedicated platform where we can test our application — Alan Playground. Available on Web, &lt;a href="https://apps.apple.com/ca/app/alan-playground/id1474478921"&gt;iOS&lt;/a&gt;, and &lt;a href="https://play.google.com/store/apps/details?id=app.alan.playground&amp;amp;hl=en_US"&gt;Android&lt;/a&gt;, Alan Playground is another option to test your application alongside its visual contexts.&lt;/p&gt;

&lt;p&gt;To debug on mobile, start by clicking the QR code button in the Alan Studio Debug Chat, and use Alan Playground on mobile to scan the code. This links your voice script to the Playground app, where you can then test it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f16gRta2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/vr44zxddoib7ztd1t5oz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f16gRta2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/vr44zxddoib7ztd1t5oz.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To test on Web, click the Alan Playground icon (Play Button) in the top right corner, and you can test your script on the next screen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ee64NFqs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/wl1g0gplg75nbsyxh78v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ee64NFqs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/wl1g0gplg75nbsyxh78v.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we’re done testing, we can create a new version of the SAP Deliveries Script for Production!&lt;/p&gt;

&lt;h2&gt;
  
  
  Versioning
&lt;/h2&gt;

&lt;p&gt;Alan supports versioning for development, testing, and production — helping you easily manage the process of adding the voice experience to your application. Publishing a new version is automated in Alan’s backend and will automatically link to all production devices, without requiring any manual deployment.&lt;/p&gt;

&lt;p&gt;Our script here is saved to &lt;em&gt;Development&lt;/em&gt; and &lt;em&gt;Last&lt;/em&gt; (the only editable version). After debugging, we’ll save our voice script and move it to Production. Let’s name this script “V1” and select “Run on Production”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--utpPN7LR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/mahg40vjotvr0dti4405.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--utpPN7LR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/mahg40vjotvr0dti4405.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To get our production key, navigate to the production section and select the “Embed Code” button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xfvPuham--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ocngb6pu1851023es76m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xfvPuham--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ocngb6pu1851023es76m.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the top, we see our Alan SDK Key, which we will save to integrate our script into the application. Now that we have our full script set up and tested, we’ll integrate our voice script into the SAP Deliveries application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Integration
&lt;/h2&gt;

&lt;p&gt;Remember the Github project we downloaded? The app is a sample application made using the SAP Developer stack.&lt;/p&gt;

&lt;p&gt;Opening the project in Xcode, we can see that the Alan library is already integrated for us. In the Alan folder, we can open the UIWindow+Alan.swift file to see the completed integration.&lt;/p&gt;

&lt;p&gt;We only need to change this line in func addAlan() to incorporate our Alan production key:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let config = AlanConfig(key: *“YOUR_KEY_FROM_ALAN_STUDIO_HERE"*, isButtonDraggable: false)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;In other projects, integrating the Alan libraries into any Xcode project is simple. We just need to paste the following snippet into our Swift view controller, adding the button setup inside the viewDidLoad function. Since this is already taken care of in our SAP application, we don’t need to follow these steps.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import AlanSDK

/**........**/

fileprivate var button: AlanButton!

/**.........**/

override func viewDidLoad() {
  let config = AlanConfig(key: "YOUR_KEY_FROM_ALAN_STUDIO_HERE")
  self.button = AlanButton(config: config)
  self.button.translatesAutoresizingMaskIntoConstraints = false
  self.view.addSubview(self.button)
  let b = NSLayoutConstraint(item: self.button, attribute: .bottom, relatedBy: .equal, toItem: self.view, attribute: .bottom, multiplier: 1, constant: -40)
  let r = NSLayoutConstraint(item: self.button, attribute: .right, relatedBy: .equal, toItem: self.view, attribute: .right, multiplier: 1, constant: -20)
  let w = NSLayoutConstraint(item: self.button, attribute: .width, relatedBy: .equal, toItem: nil, attribute: .notAnAttribute, multiplier: 1.0, constant: 64)
  let h = NSLayoutConstraint(item: self.button, attribute: .height, relatedBy: .equal, toItem: nil, attribute: .notAnAttribute, multiplier: 1.0, constant: 64)
  self.view.addConstraints([b, r, w, h])  
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;For more detailed integration or for other platforms like Android and Web, see our documentation &lt;a href="https://alan.app/docs/client-api/common-api"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After adding this code, you can run the SAP Deliveries app in an emulator or iOS device and start using Alan!&lt;/p&gt;

&lt;p&gt;After testing out a couple of features, go to the Home screen and ask &lt;em&gt;“How do I use this?”&lt;/em&gt; Alternatively, you can use an emulator to test the Alan button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8fHblOXc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xu7n8rv08lbwdfxuw8ul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8fHblOXc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xu7n8rv08lbwdfxuw8ul.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unfortunately, in our script, this intent will be unrecognized.&lt;/p&gt;

&lt;p&gt;To get a better understanding of fixing unrecognized intents, let’s go to Alan Analytics to see all our user interactions and how Alan has been engaged.&lt;/p&gt;

&lt;h2&gt;
  
  
  Iteration
&lt;/h2&gt;

&lt;p&gt;Alan provides logs and analytics so you can gauge how users interact with your script and prioritize improvements.&lt;/p&gt;

&lt;p&gt;Now that we’ve integrated and run the SAP Deliveries App, we can see some analytics data in Alan Studio. We can see the session time, amount of usage, most frequent intents, and more.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--g3NxpoXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/szu30g63use72j3dlssl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g3NxpoXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/szu30g63use72j3dlssl.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the logs, we see the unrecognized intents as well — phrases users asked that didn’t match any intent in Alan. By reviewing these, we can better understand what users want, then update the app to make it more responsive in all use cases.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--legxC0X0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/w51hcxnnhmawv9ilk1cc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--legxC0X0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/w51hcxnnhmawv9ilk1cc.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, we see that the user asked “How do I use this” on the Home screen, where it was unrecognized.&lt;/p&gt;

&lt;p&gt;Let’s update the code in Alan Studio to support this specific visual context.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;intent(vProducts, "How do I use this?", p =&amp;gt; { 
  p.play(`This is the Products screen. (Here,|) you can (use commands|ask questions) like What products are available?, ` 
         + `How much is the ${rand(project.prods.map(p =&amp;gt; p.Name))}?, ` 
         + `and What ${rand(project.cats.map(p =&amp;gt; p.CategoryName))} are available?`); 
});
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;We can add this code at line 190 of the &lt;em&gt;SAP_Deliveries_Logic&lt;/em&gt; script to fix this unrecognized intent.&lt;/p&gt;

&lt;p&gt;Now, let’s test this in the Alan Debug chat.&lt;/p&gt;

&lt;p&gt;First, press “Set visual state” and replace the hint with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    “screen”: “Product”
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
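&lt;p&gt;Conceptually, the visual state acts as a filter: only intents registered for the current screen are candidates for a match. The plain-JavaScript sketch below illustrates that gating idea (the screen names and the &lt;code&gt;handle&lt;/code&gt; function are hypothetical, not how Alan implements it):&lt;/p&gt;

```javascript
// Toy sketch: gate intents on the client's visual state; not Alan's implementation.
const intents = [
  { screen: "Product", phrase: "how do i use this",
    reply: "This is the Products screen. Try: What products are available?" },
  { screen: "Home", phrase: "show me open deliveries",
    reply: "Opening the deliveries list." },
];

function handle(visualState, utterance) {
  const text = utterance.toLowerCase().replace(/[?.!,]/g, "").trim();
  // Only intents registered for the current screen are candidates.
  const candidates = intents.filter((i) => i.screen === visualState.screen);
  const hit = candidates.find((i) => i.phrase === text);
  return hit ? hit.reply : "unrecognized intent";
}

console.log(handle({ screen: "Product" }, "How do I use this?"));
// "This is the Products screen. Try: What products are available?"
console.log(handle({ screen: "Home" }, "How do I use this?"));
// "unrecognized intent"
```

&lt;p&gt;The same utterance succeeds or fails depending purely on the visual state, which is exactly what we are about to test in the Debug Chat.&lt;/p&gt;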



&lt;p&gt;Then, ask “How do I use this?”&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tu0RpOg4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1wl19nlcwo3666mcnjrg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tu0RpOg4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1wl19nlcwo3666mcnjrg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It works! Now, we can add a new version of the script and push it to production. This should be available to the user immediately, and reopening the SAP Deliveries app on your mobile device should yield the correct answer to your question.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WO0Rbht1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/dvmlh0gtxh3mkmql9uz4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WO0Rbht1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/dvmlh0gtxh3mkmql9uz4.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Other voice-enabling software doesn’t offer this kind of feedback loop. With Alan, you can gauge user feedback and update the intents in your script to make your app more responsive and intelligent.&lt;/p&gt;

&lt;p&gt;To take advantage of this full iteration experience, let’s finish off by building some automated test cases for our scripts in the future.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bs1OIdKT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hrkpx52vngf4z7ld0ezx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bs1OIdKT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hrkpx52vngf4z7ld0ezx.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let’s add some automated tests. Here, we’ve populated our test cases with multiple sample phrases:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;“What products do you have?”&lt;/li&gt;
&lt;li&gt;“Show me notebooks.”&lt;/li&gt;
&lt;li&gt;“What can I do here?”&lt;/li&gt;
&lt;/ol&gt;
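&lt;p&gt;The same idea can be sketched as a small batch runner that checks whether every sample phrase resolves to some intent pattern. The patterns below are illustrative stand-ins for the real script’s intents:&lt;/p&gt;

```javascript
// Hypothetical batch test runner for sample phrases (illustrative only).
const intentPatterns = [
  /^what products (do you have|are available)$/,
  /^show me \w+$/,
  /^(what can i do here|how do i use this)$/,
];

function resolves(phrase) {
  // Normalize punctuation and case before matching.
  const text = phrase.toLowerCase().replace(/[?.!,]/g, "").trim();
  return intentPatterns.some((re) => re.test(text));
}

const phrases = ["What products do you have?", "Show me notebooks.", "What can I do here?"];
const failures = phrases.filter((p) => !resolves(p));
console.log(failures.length === 0 ? "all tests passed" : "failed: " + failures.join(", "));
// "all tests passed"
```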

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kjOs_a_5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/19au4ykknsieenle4c0r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kjOs_a_5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/19au4ykknsieenle4c0r.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By pressing the Play button, we can run all of these tests at once and confirm that our script is working correctly.&lt;/p&gt;

&lt;p&gt;Now that we’ve completely engaged with the Alan platform, let’s go over everything that we’ve learned.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Only the Alan Platform gives you the ability to create a voice assistant that enhances your application’s existing user experience and to make continuous improvements that precisely align with what your users want.&lt;/p&gt;

&lt;p&gt;With its simplified format, Alan is accessible to any developer, and does the heavy lifting of creating an accurate language model and managing the dialogues so that you can Build, Debug, and Integrate a voice assistant into your app in just a few days.&lt;/p&gt;

&lt;p&gt;Building with Alan is simple — the voice scripts are intuitive, scalable, and powerful. After developing your voice script, you can debug it and take full control of your development-deployment stack. Then, you can integrate Alan into your application without making any changes to your existing workflow or UI. Finally, you can build automated tests for future versions of your scripts and deploy them efficiently.&lt;/p&gt;

&lt;p&gt;With Alan, make your applications hands-free and bring your users the best conversational voice experience.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;More of a visual learner? Follow along with our overview video &lt;a href="https://www.youtube.com/watch?v=wZG1-rmb0TM&amp;amp;feature=youtu.be"&gt;here&lt;/a&gt;!&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;For reference, view sample Alan projects and SDKs here: &lt;a href="https://github.com/alan-ai"&gt;https://github.com/alan-ai&lt;/a&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Or look at the &lt;a href="http://alan.app/docs"&gt;Alan Documentation&lt;/a&gt; for additional &lt;a href="https://alan.app/docs/usage/tutorials/food-delivery-tutorial"&gt;projects&lt;/a&gt;.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>tutorial</category>
      <category>javascript</category>
      <category>ux</category>
    </item>
    <item>
      <title>What is a Voice User Interface? (VUI)</title>
      <dc:creator>Alan Voice AI Platform</dc:creator>
      <pubDate>Mon, 28 Oct 2019 18:23:24 +0000</pubDate>
      <link>https://dev.to/alanvoiceai/what-is-a-voice-user-interface-afm</link>
      <guid>https://dev.to/alanvoiceai/what-is-a-voice-user-interface-afm</guid>
      <description>&lt;p&gt;Table of Contents:&lt;br&gt;
What is a Voice User Interface(VUI)?&lt;br&gt;
The Creation of VUI&lt;br&gt;
Technology used to create a VUI&lt;br&gt;
VUI vs Voice AI&lt;br&gt;
VUI vs GUI&lt;br&gt;
Different VUI approaches&lt;br&gt;
Benefits of VUIs&lt;br&gt;
Conclusion&lt;/p&gt;

&lt;p&gt;What is a Voice User Interface?&lt;br&gt;
A Voice User Interface (VUI) enables users to interact with a device or application using spoken voice commands. VUIs give users hands-free control of technology, often without even having to look at the device. A combination of Artificial Intelligence (AI) technologies is used to build VUIs, including Automatic Speech Recognition, Named Entity Recognition, and Speech Synthesis, among others. VUIs can live either in devices or inside applications. The backend infrastructure, including the AI technologies that create the VUI’s speech components, is often hosted in a public or private cloud, where the user’s speech is processed. In the cloud, AI components determine the intent of the user and return a response to the device or application where the user is interacting with the VUI.&lt;/p&gt;

&lt;p&gt;Well-known VUIs include Amazon Alexa, Apple Siri, Google Assistant, Samsung Bixby, Yandex Alisa, and Microsoft Cortana. For the best user experience, VUIs have visuals and additional sound effects to accompany them so that users know when the VUI is active, listening, processing speech, or responding to the user. VUIs today handle this in different ways. Benefits of VUIs include hands-free accessibility, productivity, and a better customer experience.&lt;/p&gt;

&lt;p&gt;The Creation of VUI &lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--khMjTJfj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/bobyfb9ensqt1aulp963.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--khMjTJfj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/bobyfb9ensqt1aulp963.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Read the full article at &lt;a href="https://alan.app/blog/voiceuserinterface/"&gt;https://alan.app/blog/voiceuserinterface/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ux</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Add a Conversational Voice Experience to any iOS application</title>
      <dc:creator>Alan Voice AI Platform</dc:creator>
      <pubDate>Thu, 10 Oct 2019 19:22:55 +0000</pubDate>
      <link>https://dev.to/alanvoiceai/add-voice-to-ios-apps-quickly-535g</link>
      <guid>https://dev.to/alanvoiceai/add-voice-to-ios-apps-quickly-535g</guid>
      <description>&lt;p&gt;Want to learn how to a complete Conversational Voice Experience to your iOS application? &lt;br&gt;
Alan Voice AI Platform lets any developer add a complete Conversational Voice Experience to any application or device. In our weekly webinar series, we discuss and show you how to integrate Alan, and today we will cover iOS. Get started with the video below then head to our website to learn more!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/gLyEcmGS8XY"&gt;https://youtu.be/gLyEcmGS8XY&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://alan.app"&gt;https://alan.app&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ios</category>
      <category>machinelearning</category>
      <category>javascript</category>
      <category>webdev</category>
    </item>
    <item>
      <title>What is a Voice Assistant?</title>
      <dc:creator>Alan Voice AI Platform</dc:creator>
      <pubDate>Fri, 20 Sep 2019 19:22:59 +0000</pubDate>
      <link>https://dev.to/alanvoiceai/what-is-a-voice-assistant-492p</link>
      <guid>https://dev.to/alanvoiceai/what-is-a-voice-assistant-492p</guid>
      <description>&lt;p&gt;A voice assistant is a digital assistant that uses voice recognition, language processing algorithms, and voice synthesis to listen to specific voice commands and return relevant information or perform specific functions as requested by the user.&lt;/p&gt;

&lt;p&gt;Based on specific commands, sometimes called intents, spoken by the user, voice assistants can return relevant information by listening for specific keywords and filtering out the ambient noise.&lt;/p&gt;

&lt;p&gt;Using a mixture of artificial intelligence and machine learning, the software behind a voice-controlled digital assistant uses algorithms and previous examples to more accurately understand what the user is asking for.&lt;/p&gt;

&lt;p&gt;Many talking assistants today combine both a task-oriented and a knowledge-oriented workflow to carry out almost any task a user can throw at them. A task-oriented workflow might include filling out a form, while a knowledge-oriented workflow includes answering what the capital of a state is or giving the technical specifications of a product.&lt;/p&gt;

&lt;p&gt;Some examples of what a Voice Assistant can do include:&lt;/p&gt;

&lt;p&gt;Check the weather&lt;br&gt;
Turn on/off connected smart devices&lt;br&gt;
Search databases&lt;/p&gt;

&lt;p&gt;One of the main reasons for the growing popularity of Voice User Interfaces (VUIs) is the growing complexity of mobile software without a matching increase in screen size, which puts a GUI (Graphical User Interface) at a real disadvantage. As new generations of phones come out, screen sizes stay relatively the same, leading to cramped interfaces and frustrating user experiences; this is why more and more developers are switching to Voice User Interfaces.&lt;/p&gt;

&lt;p&gt;Some examples of popular smart assistants today are Alan, Amazon Alexa, Siri by Apple, and Google Voice Assistant.&lt;/p&gt;

&lt;p&gt;Read the full article at: &lt;br&gt;
&lt;a href="https://alan.app/blog/voiceassistant-2/"&gt;https://alan.app/blog/voiceassistant-2/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://alan.app"&gt;https://alan.app&lt;/a&gt;&lt;/p&gt;

</description>
      <category>voice</category>
      <category>explainlikeimfive</category>
      <category>ux</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
