<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rick Cheng</title>
    <description>The latest articles on DEV Community by Rick Cheng (@icywind).</description>
    <link>https://dev.to/icywind</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F359085%2F807ab837-c45a-412f-8369-5089a35235b8.png</url>
      <title>DEV Community: Rick Cheng</title>
      <link>https://dev.to/icywind</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/icywind"/>
    <language>en</language>
    <item>
      <title>Quickstart: Add Video Chat to Vision Pro with Agora</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Wed, 04 Sep 2024 18:07:14 +0000</pubDate>
      <link>https://dev.to/icywind/quickstart-add-video-chat-to-vision-pro-with-agora-496</link>
      <guid>https://dev.to/icywind/quickstart-add-video-chat-to-vision-pro-with-agora-496</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi01uujq8c8djktbrvz5.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi01uujq8c8djktbrvz5.jpg" alt="banner" width="800" height="393"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The future is here, and you're holding it – the Apple VisionPro device. It's a marvel of technology, but sometimes, you just need to see a friendly face across the digital divide. That's where Agora's ubiquitous Real-Time Communication (RTC) SDK comes in. With Agora, video chat on your VisionPro device transcends physical boundaries. You can connect with other VisionPro users, and Agora's expansive reach allows you to chat seamlessly with people on entirely different platforms. Imagine video calling your friends and family on their smartphones, tablets, or laptops – all through the power of Agora on your VisionPro. &lt;/p&gt;

&lt;p&gt;This quick start guide will equip you to unlock the potential of Agora's native RTC SDK and transform your VisionPro into a powerful video chat hub.  We will show you how to set up and run the Quickstart project using both the simulator and the actual device, walk through the code with an architecture overview, and provide remarks on important APIs.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you are a Unity developer, you may refer to &lt;a href="https://www.agora.io/en/blog/vision-pro-unity-quickstart-with-agora-sdk/" rel="noopener noreferrer"&gt;this blog&lt;/a&gt; for the corresponding quick start guide.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  Apple VisionPro
&lt;/li&gt;
&lt;li&gt;  Apple Silicon Mac (M1 or a newer generation)&lt;/li&gt;
&lt;li&gt;  Xcode 15.4 with visionOS support&lt;/li&gt;
&lt;li&gt;  Agora developer account&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Setup
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; Clone the GitHub project from &lt;a href="https://github.com/AgoraIO-Community/visionOS-Quickstart" rel="noopener noreferrer"&gt;this GitHub repo&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Open the project with Xcode. The project includes the Agora Video SDK as a dependency package; as soon as Xcode loads the project, the package download should start automatically and finish in a few minutes.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If the package did not load for some reason (e.g., networking issues), or you are setting up a brand-new project of your own, use one of the following options: the Swift package manager, the CDN download, or CocoaPods.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Swift package manager: to manually add the package to your Xcode project, set the specific branch of the Git repository as the dependency:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7oykx3kzkxs04343bz1h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7oykx3kzkxs04343bz1h.png" alt="xcode package" width="800" height="157"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;&lt;p&gt;CDN: download the SDK as a zip file from the &lt;a href="https://download.agora.io/null/Agora_Native_SDK_for_iOS_rel.v4.2.6.133_39484_FULL_20240126_0204_291749.zip" rel="noopener noreferrer"&gt;CDN download&lt;/a&gt; link.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CocoaPods: add the following pod to your Podfile:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight make"&gt;&lt;code&gt;   &lt;span class="err"&gt;pod&lt;/span&gt; &lt;span class="s1"&gt;'AgoraRtcEngine_Special_iOS'&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'4.2.6.133.VISION'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
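
&lt;p&gt;For CocoaPods, that pod line goes inside a target block in your Podfile. A minimal sketch (the platform version and target name are placeholders for your own project; CocoaPods 1.13 or later is needed for visionOS targets):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;# Podfile (sketch)
platform :visionos, '1.0'

target 'YourVisionProApp' do
  use_frameworks!
  # The visionOS-specific build of the Agora RTC engine, as shown above.
  pod 'AgoraRtcEngine_Special_iOS', '4.2.6.133.VISION'
end
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;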



&lt;h2&gt;
  
  
  Running the App
&lt;/h2&gt;

&lt;p&gt;Open VisionVideoCallView.swift and fill in your Agora App ID in the appID field.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This guide is made for Agora projects without security enabled.  Use a Test Mode App ID for testing this project.  Keep in mind that authentication security is crucial and must be implemented for your actual application.  Please refer to &lt;a href="https://docs.agora.io/en/video-calling/core-functionality/integrate-token-generation?platform=ios" rel="noopener noreferrer"&gt;this guide&lt;/a&gt; for token security implementation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f4xsl32kmm2bdljycow.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f4xsl32kmm2bdljycow.png" alt="appid" width="800" height="77"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Build and run for either the Simulator or your physical VisionPro device. At the same time, prepare a second user to join the chat as a remote user.&lt;/p&gt;

&lt;h3&gt;
  
  
  Remote User Setup
&lt;/h3&gt;

&lt;p&gt;Since we are building a video chat app, the VisionPro user acts as the local user, and we need another user to join as the remote user.  Both users should appear in the application.&lt;/p&gt;

&lt;p&gt;You have two options for the remote user's application.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use another physical device, including but not limited to another VisionPro.  If you are not a first-time Agora developer and have already built another Agora app with the same App ID on any platform, you are good to go.
&lt;/li&gt;
&lt;li&gt;Use the &lt;a href="https://webdemo-global.agora.io/example/basic/basicVideoCall/index.html" rel="noopener noreferrer"&gt;Agora Web demo app&lt;/a&gt;.  This is the quickest way to test any new applications.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this tutorial, we will use the web demo as our remote user.  The following screenshot shows a web user publishing a video feed using the same App ID and channel name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpnnf4xwm3xaiuorbow7e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpnnf4xwm3xaiuorbow7e.png" alt="remote user" width="800" height="570"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Execution Sequence
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;You can start either the VisionPro user or the web user first.
&lt;/li&gt;
&lt;li&gt;Upon launch, the VisionPro user should see a dialog waiting for input.  Enter the channel name (e.g. "visionpro") and hit the Join button. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbfdnz4n6pagamuxdv76.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbfdnz4n6pagamuxdv76.png" alt="join button" width="800" height="582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;The flow should be similar on other platforms.  In our case, the web user will need to enter the channel name and press the buttons for Steps 1 through 3, as shown in the screenshot above.&lt;/li&gt;
&lt;li&gt;The remote user should show up on the VisionPro demo's display panel automatically.&lt;/li&gt;
&lt;li&gt;For the web demo, the remote user ID should populate itself when the VisionPro user joins successfully.  Press the "Subscribe and Play" button to show the video stream from the VisionPro.&lt;/li&gt;
&lt;li&gt;When the chat session is finished on the VisionPro, press the "&amp;lt;" button to leave the channel.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7clvonorrrj9itej86r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7clvonorrrj9itej86r.png" alt="back button" width="800" height="599"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Environment-specific details follow.&lt;/p&gt;

&lt;h3&gt;
  
  
  Running on Simulator
&lt;/h3&gt;

&lt;p&gt;Since there is no camera simulation in the VisionPro simulator, your local video stream won’t be presented to the other users. However, you can still see other users who have enabled their cameras or custom video feeds on their devices. The following screenshot captures what you will see in the simulator.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flryucemuweoafm1yt20t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flryucemuweoafm1yt20t.png" alt="simulator" width="800" height="633"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Running on VisionPro Device
&lt;/h3&gt;

&lt;p&gt;Before building your project for the VisionPro device, add the following keys to your Info.plist file to request the necessary permissions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;key&amp;gt;&lt;/span&gt;NSCameraUsageDescription&lt;span class="nt"&gt;&amp;lt;/key&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;string&amp;gt;&lt;/span&gt;camera for self capture&lt;span class="nt"&gt;&amp;lt;/string&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;key&amp;gt;&lt;/span&gt;NSMicrophoneUsageDescription&lt;span class="nt"&gt;&amp;lt;/key&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;string&amp;gt;&lt;/span&gt;mic for my voice&lt;span class="nt"&gt;&amp;lt;/string&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuv9adgrdixfgz1pfvujr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuv9adgrdixfgz1pfvujr.png" alt="plist" width="800" height="199"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The permission request dialog should show up when you press the Join button.  Confirm the permissions to allow the app to use your VisionPro’s camera and microphone. Once you join the channel, you should see yourself in avatar form, and the remote user should show up on your screen!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2eztl1xzidhc6vgx8xw4.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2eztl1xzidhc6vgx8xw4.gif" alt="device view" width="600" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the other device or web browser, the remote user can also see you in your digital form, and you can start chatting together now!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27sp1qzhr7v0flz8468o.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27sp1qzhr7v0flz8468o.gif" alt="web user view" width="600" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Basic Code Walkthrough
&lt;/h2&gt;

&lt;p&gt;If you are wondering how the components work together in this project, please read on. The following section aims to provide a clear and concise explanation of the code structure, logic, and flow.  &lt;/p&gt;

&lt;h3&gt;
  
  
  Code Structure
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Class&lt;/th&gt;
&lt;th&gt;Purpose&lt;/th&gt;
&lt;th&gt;Notes&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Agora_VisionApp&lt;/td&gt;
&lt;td&gt;Entry point of application&lt;/td&gt;
&lt;td&gt;Standard entry point&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ContentView&lt;/td&gt;
&lt;td&gt;The initial UI view presented to the user&lt;/td&gt;
&lt;td&gt;Accepts the meeting’s channel name. Calls VisionVideoCallView&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;VisionVideoCallView&lt;/td&gt;
&lt;td&gt;Shows users’ video feeds in a grid format&lt;/td&gt;
&lt;td&gt;Input the appID here&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AgoraVideoCanvasView&lt;/td&gt;
&lt;td&gt;The video canvas view for rendering the video&lt;/td&gt;
&lt;td&gt;This is a reusable class that works on iOS/macOS as well.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AgoraManager&lt;/td&gt;
&lt;td&gt;The controller of the app’s business logic; implements the Agora RTC engine’s delegates.&lt;/td&gt;
&lt;td&gt;Responds to join-channel and leave-channel actions; handles callback events like &lt;em&gt;didJoinedOfUid&lt;/em&gt; and &lt;em&gt;didJoinChannel&lt;/em&gt;.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Logic Flow
&lt;/h3&gt;

&lt;p&gt;This quickstart demo follows a typical SwiftUI logic hierarchy.  The Agora_VisionApp class defines the entry point, which presents the ContentView.  The ContentView is a container that uses a NavigationStack to let the join-channel interface and the VisionVideoCallView interface swap places, according to the current session state.&lt;/p&gt;
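
&lt;p&gt;As a rough illustration of that container role, the ContentView might be shaped like this (a minimal sketch; the state property and the VisionVideoCallView initializer label are illustrative, not the repo's exact code):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;import SwiftUI

struct ContentView: View {
    @State private var channelName = ""

    var body: some View {
        NavigationStack {
            VStack {
                TextField("Channel name", text: $channelName)
                // Pushing the link constructs VisionVideoCallView,
                // whose AgoraManager then starts the call session.
                NavigationLink("Join") {
                    VisionVideoCallView(channelId: channelName)
                }
            }
            .padding()
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;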

&lt;p&gt;When the user hits the Join button, the VisionVideoCallView is constructed and pushed onto the view stack.  The video call session starts after AgoraManager's initialization.&lt;/p&gt;

&lt;p&gt;AgoraManager is an &lt;a href="https://developer.apple.com/documentation/swiftui/observedobject" rel="noopener noreferrer"&gt;observed object&lt;/a&gt; that implicitly causes updates to the view.  The &lt;em&gt;localUserId&lt;/em&gt; property and the &lt;em&gt;allUsers&lt;/em&gt; property of AgoraManager are observed.&lt;/p&gt;

&lt;p&gt;When the VisionPro user joins the channel, the &lt;em&gt;didJoinChannel&lt;/em&gt; event callback populates the &lt;em&gt;localUserId&lt;/em&gt;.  When a remote user joins the channel, the &lt;em&gt;didJoinedOfUid&lt;/em&gt; callback adds that user's uid to the &lt;em&gt;allUsers&lt;/em&gt; collection.  Conversely, when a remote user leaves the channel, the &lt;em&gt;didOfflineOfUid&lt;/em&gt; callback removes the user from the &lt;em&gt;allUsers&lt;/em&gt; collection.  Whenever the &lt;em&gt;allUsers&lt;/em&gt; value changes, VisionVideoCallView updates its canvas view grid automatically.&lt;/p&gt;
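
&lt;p&gt;A condensed sketch of how those callbacks can map onto the observed state (the real AgoraManager has more responsibilities; the delegate method signatures below come from the Agora iOS/visionOS SDK):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;import AgoraRtcKit

class AgoraManager: NSObject, ObservableObject, AgoraRtcEngineDelegate {
    // Observed by VisionVideoCallView; any change refreshes the canvas grid.
    @Published var localUserId: UInt = 0
    @Published var allUsers: Set&amp;lt;UInt&amp;gt; = []

    // Local user joined the channel: remember our uid.
    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String,
                   withUid uid: UInt, elapsed: Int) {
        localUserId = uid
        allUsers.insert(uid)
    }

    // A remote user joined: add it, and the grid updates automatically.
    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt,
                   elapsed: Int) {
        allUsers.insert(uid)
    }

    // A remote user left: remove it from the collection.
    func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt,
                   reason: AgoraUserOfflineReason) {
        allUsers.remove(uid)
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;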

&lt;p&gt;The AgoraVideoCanvasView class encapsulates the logic for populating a canvas with a video stream, using a uid as the input.  The code is well documented and aims to be reusable across iOS/macOS and visionOS projects.&lt;/p&gt;

&lt;p&gt;When the user hits the back button, NavigationStack pops the view stack, and the VisionVideoCallView gets destroyed.  At that point, AgoraManager invokes leaveChannel(), which cleans up the canvas views and disposes of the Agora engine.&lt;/p&gt;

&lt;h3&gt;
  
  
  View Update Sequence
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsyt2noxeua3s57l8ppkc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsyt2noxeua3s57l8ppkc.png" alt="sequence diagram" width="637" height="876"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Essential API Calls
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Create an Agora RTC engine
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="nv"&gt;config&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;AgoraRtcEngineConfig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kt"&gt;AgoraRtcEngineConfig&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;appId&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;appId&lt;/span&gt;
        &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;audioScenario&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;gameStreaming&lt;/span&gt;
        &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;channelProfile&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;liveBroadcasting&lt;/span&gt;      
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="nv"&gt;eng&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kt"&gt;AgoraRtcEngineKit&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sharedEngine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;with&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;config&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;delegate&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Join a channel
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;        &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;agoraEngine&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;joinChannel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;byToken&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;token&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;channelId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;info&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;info&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;uid&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;uid&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Leave a channel
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;        &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;agoraEngine&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;leaveChannel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;leaveChannelBlock&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Dispose of the engine
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;        &lt;span class="kt"&gt;AgoraRtcEngineKit&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;destroy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Local Camera Stream
&lt;/h3&gt;

&lt;p&gt;If you are an experienced Agora developer, you may notice that all the APIs are the same as those you've used when implementing an iOS app.  You may wonder: how do I create a stream to capture my avatar from the VisionPro? The answer is that you don't need to.  The beauty of visionOS is that your avatar's rendered frames are used as your local camera's view.  Therefore, without any change to the API calls, you get the same behavior when streaming your local camera view!&lt;/p&gt;
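
&lt;p&gt;In other words, the ordinary iOS-style local video setup is all that is needed; visionOS substitutes your avatar's rendered frames for the camera feed. A minimal sketch (the function and variable names here are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight swift"&gt;&lt;code&gt;import AgoraRtcKit
import UIKit

// The same calls you would make on iOS; on visionOS the "camera" frames
// are your rendered avatar, so nothing extra is required.
func startLocalVideo(engine: AgoraRtcEngineKit, view: UIView, uid: UInt) {
    engine.enableVideo()

    let canvas = AgoraRtcVideoCanvas()
    canvas.uid = uid          // 0 refers to the local user
    canvas.view = view        // the view that renders the avatar feed
    canvas.renderMode = .hidden
    engine.setupLocalVideo(canvas)
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;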

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As you've seen, integrating Agora's RTC SDK into your VisionPro app for video chat is surprisingly straightforward. The provided quick start project on GitHub serves as a fantastic springboard, offering a basic structure you can build upon. With just a few steps and a little code modification, you've unlocked the power of real-time communication for your VisionPro device.&lt;/p&gt;

&lt;p&gt;The true beauty lies in Agora's cross-platform capabilities. Forget limitations! You can now video chat with friends and family on any device, regardless of their operating system. This opens doors for unparalleled collaboration and connection, shattering the boundaries between platforms.&lt;/p&gt;

&lt;p&gt;So, dive into the world of Agora's RTC SDK and unleash the full potential of video chat on your VisionPro. With its ease of use and expansive reach, Agora empowers you to connect with the world on a whole new level.  &lt;/p&gt;

&lt;p&gt;For further reading, it is worthwhile to check out the following resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.agora.io/en/video-calling/get-started/get-started-sdk?platform=ios" rel="noopener noreferrer"&gt;iOS quickstart guide&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;
&lt;a href="https://api-ref.agora.io/en/voice-sdk/ios/4.x/documentation/agorartckit" rel="noopener noreferrer"&gt;API references&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>visionos</category>
      <category>ios</category>
      <category>xr</category>
      <category>videochat</category>
    </item>
    <item>
      <title>Cloud Recording for Unity Video Chat</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Fri, 30 Apr 2021 19:19:25 +0000</pubDate>
      <link>https://dev.to/icywind/cloud-recording-for-unity-video-chat-45p6</link>
      <guid>https://dev.to/icywind/cloud-recording-for-unity-video-chat-45p6</guid>
      <description>&lt;h1&gt;
  
  
  Cloud Recording for Unity Video Chat
&lt;/h1&gt;

&lt;p&gt;In this advanced real-time engagement topic, we show you how the Agora Cloud Recording API can be called to handle this task for Unity applications. You should be familiar with the basic setup of a simple video chat app in Unity. If not, you can follow the tutorial in &lt;a href="https://www.agora.io/en/blog/agora-video-sdk-for-unity-quick-start-programming-guide/"&gt;this blog&lt;/a&gt;, which will get you ready in 30 minutes. We further expand the logic to control the cloud recording with a custom server.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Prerequisites&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Clone &lt;a href="https://github.com/AgoraIO/Agora-Unity-Quickstart/tree/master/QuickStart-VideoChat"&gt;the quick start repo&lt;/a&gt; as the base project to work on.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Fulfill the dependencies of the quick start project, such as getting an Agora App ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Set up a working Amazon S3 Bucket. Be sure you verify that you can upload files to it by scripting.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Have basic server-side coding knowledge and environment.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; This guide does not implement token authentication, which is recommended for all RTE apps running in production environments. For more information about token-based authentication within the Agora platform, please refer to this guide: &lt;a href="https://bit.ly/3sNiFRs"&gt;https://bit.ly/3sNiFRs&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;We will create a simple client-server system to simulate common usage of the feature. Here are the parts, as depicted in figure 1:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Unity App client 1:&lt;/strong&gt; the commanding client, which controls the start and stop of the video recording&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Unity App client 2:&lt;/strong&gt; the normal client, which chats with client 1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;App Server&lt;/strong&gt;: a trusted server that stores secret keys, takes commands from client 1, and makes RESTful API calls to the SD-RTN back end&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;SD-RTN back end:&lt;/strong&gt; the Agora Software-Defined Real-Time Network, which processes the video and makes the magic happen&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon S3 Bucket:&lt;/strong&gt; the cloud storage with APIs for the video upload&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--amd_r4d7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AU2sqxjihlgGW-LFHuUN2LQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--amd_r4d7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AU2sqxjihlgGW-LFHuUN2LQ.png" alt="Figure 1: Project Architecture"&gt;&lt;/a&gt;&lt;em&gt;Figure 1: Project Architecture&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Theoretically, we could omit the app server from this architecture and put the secret keys and RESTful API calls directly in the client. But that wouldn’t be the best practice.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;This project has three major steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Environment&lt;/strong&gt;: This involves enabling the feature on your Agora account and setting up the client, the server, and an S3 account.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Server&lt;/strong&gt;: The code is written in PHP.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Client&lt;/strong&gt;: A Unity application built on the basic video chat project.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Environment Setup
&lt;/h2&gt;

&lt;p&gt;First, familiarize yourself with the background description from the &lt;a href="https://docs.agora.io/en/cloud-recording/product_cloud_recording?platform=RESTful"&gt;documentation page&lt;/a&gt;. Follow the steps in the QuickStart section on how to enable the service from your Agora developer account console. You’ll need a &lt;em&gt;customer ID&lt;/em&gt; and a &lt;em&gt;customer secret&lt;/em&gt; from &lt;a href="https://console.agora.io/restfulApi"&gt;the console&lt;/a&gt;. Together with the setup from the basic video chat project, you should have the following credentials so far:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;App ID:&lt;/strong&gt; Created with a new project. I chose not to use a certificate for this tutorial.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;App token:&lt;/strong&gt; Only if you enabled a certificate for your App ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Customer ID:&lt;/strong&gt; Press “Add a secret” to get one (figure 2).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Customer secret:&lt;/strong&gt; After getting the customer ID, the console generates a secret token for download.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--34cvXc8K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ah2VuHlqZXQIh0qdnj4qnSg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--34cvXc8K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ah2VuHlqZXQIh0qdnj4qnSg.png" alt="Figure 2: Agora RESTful Console"&gt;&lt;/a&gt;&lt;em&gt;Figure 2: Agora RESTful Console&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon S3 Credentials
&lt;/h3&gt;

&lt;p&gt;You will need the following information for the cloud recording configuration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Bucket name:&lt;/strong&gt; In my example from figure 3, the name is “agoracdn”.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Region:&lt;/strong&gt; Since my bucket is in “us-west-1”, it is mapped to “2” according to &lt;a href="https://docs.agora.io/en/cloud-recording/cloud_recording_api_rest?platform=RESTful#a-namestorageconfigacloud-storage-configuration"&gt;this table&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;User access key:&lt;/strong&gt; Get this from AWS IAM under &lt;em&gt;Users&lt;/em&gt; (figure 4).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;User secret key:&lt;/strong&gt; Get this from AWS IAM under &lt;em&gt;Users&lt;/em&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tI-Et5C5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2154/1%2AZmobhiGdx8ovB6nnMfN-Fg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tI-Et5C5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2154/1%2AZmobhiGdx8ovB6nnMfN-Fg.png" alt="Figure 3: S3 Management Console"&gt;&lt;/a&gt;&lt;em&gt;Figure 3: S3 Management Console&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9R2F9r9C--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ArSSuEfhOwDe6PEeZkBBheQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9R2F9r9C--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ArSSuEfhOwDe6PEeZkBBheQ.png" alt="Figure 4 : IAM Management Console"&gt;&lt;/a&gt;&lt;em&gt;Figure 4 : IAM Management Console&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Server Environment
&lt;/h3&gt;

&lt;p&gt;You should use the server framework that you are most familiar with. I run a quick PHP server on the localhost from my MacBook. I use this command to start the server from my PHP source code directory:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ php -S localhost:8000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;For more information, such as how to do the same on Windows machines, check the &lt;a href="https://www.php.net/manual/en/features.commandline.webserver.php"&gt;PHP manual page&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Server Implementation
&lt;/h2&gt;

&lt;p&gt;The main goal of our server is to keep the necessary credential keys away from the client and relay the RESTful API calls to the Agora SD-RTN back end. The &lt;a href="https://docs.agora.io/en/cloud-recording/cloud_recording_rest?platform=RESTful"&gt;Cloud Recording Quick Start documentation&lt;/a&gt; shows the following essential steps for its life cycle:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Acquire&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Start&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Query&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Stop&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These four steps are mapped to four PHP scripts running from our test server. They will share the configuration that we gather from the Environment Setup step above.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuration
&lt;/h3&gt;

&lt;p&gt;Here is the code of &lt;em&gt;config.php&lt;/em&gt;. You need to fill in your credential tokens in the empty spaces and set the region number for your Amazon S3 bucket. The script combines the &lt;em&gt;Customer Id&lt;/em&gt; and the &lt;em&gt;Customer Secret&lt;/em&gt; token into an Authorization Secret (&lt;em&gt;AuthSecret&lt;/em&gt;), which is used throughout the API calls. You should create an integer string for the &lt;em&gt;RecUID&lt;/em&gt; (Recorder User Id). This Id is different from the clients’ user Ids.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
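&lt;p&gt;To make the credential handling concrete, here is the same idea sketched in Python for illustration (the actual code is the PHP gist above; all names are placeholders): the &lt;em&gt;AuthSecret&lt;/em&gt; is just the Base64 encoding of “CustomerId:CustomerSecret”, sent as an HTTP Basic auth header on every call.&lt;/p&gt;

```python
import base64

def build_auth_secret(customer_id: str, customer_secret: str) -> str:
    # The Authorization Secret is the Base64 of "CustomerId:CustomerSecret",
    # used as an HTTP Basic auth header on every Cloud Recording call.
    raw = f"{customer_id}:{customer_secret}".encode("utf-8")
    return base64.b64encode(raw).decode("utf-8")

# Placeholder credentials -- substitute your own from the Agora Console.
auth_header = {"Authorization": "Basic " + build_auth_secret("myId", "mySecret")}
```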


&lt;h3&gt;
  
  
  Acquire
&lt;/h3&gt;

&lt;p&gt;The Acquire step initializes a Cloud Recording request to the Agora SD-RTN, which returns a &lt;em&gt;resourceId&lt;/em&gt;. The channel name is passed from the client in a POST body. The resourceId is then sent back to the client, which should keep it in memory for the next steps.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
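&lt;p&gt;As a sketch of the request this step relays (shown in Python for illustration rather than the gist’s PHP; endpoint and body follow Agora’s Cloud Recording RESTful API, and all values are placeholders):&lt;/p&gt;

```python
import json

def acquire_request(app_id: str, channel: str, rec_uid: str):
    # Build the Acquire call: the channel name comes from the client's
    # POST body; rec_uid is the recorder's own integer-string user Id.
    url = f"https://api.agora.io/v1/apps/{app_id}/cloud_recording/acquire"
    body = json.dumps({"cname": channel, "uid": rec_uid, "clientRequest": {}})
    return url, body

url, body = acquire_request("myAppId", "demoChannel", "527841")
# POST `body` to `url` with the Basic-auth header; the JSON response
# contains the resourceId that the server relays back to the client.
```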


&lt;h3&gt;
  
  
  Start
&lt;/h3&gt;

&lt;p&gt;After you obtain a resourceId, you can tell the SD-RTN when the recording starts. The SD-RTN sends chunks of video data in .ts format to your S3 bucket. The &lt;em&gt;resourceId&lt;/em&gt; and the &lt;em&gt;channel&lt;/em&gt; name are passed from the client in a POST body. If the call is successful, you should get back the same &lt;em&gt;resourceId&lt;/em&gt; and a &lt;em&gt;sid&lt;/em&gt; (Session Id).&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
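&lt;p&gt;A sketch of the Start request (Python for illustration; the gist above is the PHP used in the project). The &lt;em&gt;storageConfig&lt;/em&gt; carries the S3 settings gathered in the Environment Setup; vendor code 1 means Amazon S3 and the numeric region comes from Agora’s region table:&lt;/p&gt;

```python
import json

def start_request(app_id: str, resource_id: str, channel: str, rec_uid: str,
                  region: int, bucket: str, access_key: str, secret_key: str):
    # Start recording in composite ("mix") mode. The storageConfig tells
    # the SD-RTN where to upload the .ts chunks.
    url = (f"https://api.agora.io/v1/apps/{app_id}/cloud_recording/"
           f"resourceid/{resource_id}/mode/mix/start")
    body = json.dumps({
        "cname": channel,
        "uid": rec_uid,
        "clientRequest": {
            # Add a "token" entry here if your project has a certificate.
            "storageConfig": {
                "vendor": 1,        # Amazon S3
                "region": region,   # e.g. 2 for us-west-1
                "bucket": bucket,
                "accessKey": access_key,
                "secretKey": secret_key,
            }
        },
    })
    return url, body
```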


&lt;h3&gt;
  
  
  Query
&lt;/h3&gt;

&lt;p&gt;It is important to know whether a recording is in session or the Start failed, for example because of an AWS authentication issue. The Query command checks the current status with the stored &lt;em&gt;resourceId&lt;/em&gt; and &lt;em&gt;sid&lt;/em&gt;. The server responds with the current video name for the file list (m3u8) and other information.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
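&lt;p&gt;Unlike the other steps, Query is a GET with no body; only the URL changes per session (Python sketch for illustration, placeholder ids):&lt;/p&gt;

```python
def query_url(app_id: str, resource_id: str, sid: str) -> str:
    # Query reports the recording status and the current .m3u8 file-list
    # name for the stored resourceId/sid pair; send as an authorized GET.
    return (f"https://api.agora.io/v1/apps/{app_id}/cloud_recording/"
            f"resourceid/{resource_id}/sid/{sid}/mode/mix/query")
```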


&lt;h3&gt;
  
  
  Stop
&lt;/h3&gt;

&lt;p&gt;When you decide the recording is done, call Stop to complete the cloud recording session.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
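&lt;p&gt;The Stop request mirrors Acquire’s body but addresses the running session by its &lt;em&gt;resourceId&lt;/em&gt; and &lt;em&gt;sid&lt;/em&gt; (Python sketch for illustration; placeholder values):&lt;/p&gt;

```python
import json

def stop_request(app_id: str, resource_id: str, sid: str,
                 channel: str, rec_uid: str):
    # Stop ends the session; cname and uid must match the Start call.
    url = (f"https://api.agora.io/v1/apps/{app_id}/cloud_recording/"
           f"resourceid/{resource_id}/sid/{sid}/mode/mix/stop")
    body = json.dumps({"cname": channel, "uid": rec_uid, "clientRequest": {}})
    return url, body
```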


&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; If you set up your Agora project with the certificate enabled, you need to add a token field into the CURLOPT_POSTFIELDS in the Acquire, Start, Stop, and Query API calls. Since my project was set up without one, I removed the field from the list.&lt;/p&gt;

&lt;h3&gt;
  
  
  Checkpoint
&lt;/h3&gt;

&lt;p&gt;So far, the server code and configuration are set up. You can test the recording manually before the client code is created. To do so, simply assign the necessary values by hand instead of getting the values from the POST body.&lt;/p&gt;

&lt;p&gt;If you are interested in learning more about the RESTful calls, you can use the &lt;a href="https://github.com/AgoraIO/Agora-RESTful-Service"&gt;Agora-RESTful-Service&lt;/a&gt; project to test cloud recording in the Postman app. In fact, the scripts and JSON bodies that I used in the PHP code above are inspired by that Postman collection. You can quickly create the corresponding code for your chosen framework and language. See figures 5 and 6.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7vWZKB-C--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2642/1%2AFANXWeQ7vRbBwMrrkHinhw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7vWZKB-C--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2642/1%2AFANXWeQ7vRbBwMrrkHinhw.png" alt="Figure 5 : Get the Code"&gt;&lt;/a&gt;&lt;em&gt;Figure 5 : Get the Code&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Wgkj8vMT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2600/1%2AMOtk1HKtld18G3TzkjPuCQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Wgkj8vMT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2600/1%2AMOtk1HKtld18G3TzkjPuCQ.png" alt="Figure 6 : Use any language"&gt;&lt;/a&gt;&lt;em&gt;Figure 6 : Use any language&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Unity Client
&lt;/h2&gt;

&lt;p&gt;As discussed in the Architecture section, there are two clients. Client 1 is the commanding client that controls the start and stop of the recording. Client 2 is a normal chat client. The original VideoChat project applies to client 2. We update the project with new features to support the commanding client.&lt;/p&gt;

&lt;p&gt;The commanding feature is a good example application for the MVC design pattern. We will write the Unity C# code for the Model, the View and the Controller.&lt;/p&gt;

&lt;h3&gt;
  
  
  Model
&lt;/h3&gt;

&lt;p&gt;From observing the JSON returned by the SD-RTN, we find that the following sample structure covers the response bodies for Acquire, Start, Stop, and Query:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;{"resourceId":"xxxx", "sid":"yyy", "serverResponse":{"status":5, "fileList":"zzzz.m3u8", "fileListMode":"string", "sliceStartTime":1606357122528}}&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Turning that into C# classes, we have the following implementation:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
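&lt;p&gt;The project’s model classes are C# (the gist above); the same response shape can be sketched in Python to show which fields matter. Note that &lt;em&gt;serverResponse&lt;/em&gt; is absent on Acquire, so it should be treated as optional:&lt;/p&gt;

```python
import json

# The sample response body from the article.
SAMPLE = ('{"resourceId":"xxxx","sid":"yyy","serverResponse":'
          '{"status":5,"fileList":"zzzz.m3u8","fileListMode":"string",'
          '"sliceStartTime":1606357122528}}')

class ServerResponse:
    def __init__(self, d: dict):
        self.status = d.get("status")
        self.file_list = d.get("fileList")

class CloudRecordResponse:
    # Same shape the C# model captures: two top-level ids plus a
    # nested serverResponse object (missing on Acquire responses).
    def __init__(self, d: dict):
        self.resource_id = d.get("resourceId")
        self.sid = d.get("sid")
        sr = d.get("serverResponse")
        self.server_response = ServerResponse(sr) if sr else None

resp = CloudRecordResponse(json.loads(SAMPLE))
```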


&lt;h3&gt;
  
  
  View
&lt;/h3&gt;

&lt;p&gt;We will update the Video Chat demo scene to the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Add a Start/Stop button to the lower-left side of the view area. Name it “RecordButton”. Set the text to display “Start”. Later, we modify the text to display “Stop” in the controller logic.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add a Query button below the Start/Stop button. Make the RecordButton its parent in the hierarchy.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WtmV8ueP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Aj8KNzsvKam70j8TBqxMYqA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WtmV8ueP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Aj8KNzsvKam70j8TBqxMYqA.png" alt="Figure 7: The Cloud Record Commanding Client"&gt;&lt;/a&gt;&lt;em&gt;Figure 7: The Cloud Record Commanding Client&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Controller
&lt;/h3&gt;

&lt;p&gt;A &lt;em&gt;CloudRecordController&lt;/em&gt; class responds to the button events and encapsulates the networking logic to drive the cloud recording sequence. Here is a top-down outline of the class structure:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bu37t97N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AGmWOkL9SPpGxlipTmrsXHg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bu37t97N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AGmWOkL9SPpGxlipTmrsXHg.png" alt="Figure 8: Class Outline"&gt;&lt;/a&gt;&lt;em&gt;Figure 8: Class Outline&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Fields
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;ServerURL&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;recordButton&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;queryButton&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Properties
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ChannelName&lt;/strong&gt;: Set by the main controller (&lt;em&gt;VideoChat.cs&lt;/em&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ResourceId&lt;/strong&gt;: Obtained from the server response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;SID&lt;/strong&gt;: Obtained from the server response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;IsRecording&lt;/strong&gt;: Indicates if recording is in progress.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  UI Controls
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;HandleStartStop():&lt;/strong&gt; Responds to the &lt;em&gt;RecordButton&lt;/em&gt; click and toggles between the two states. The &lt;em&gt;RecordButton&lt;/em&gt; is visible only after the user joins a channel, and it first appears as a Start button. When the user taps &lt;em&gt;Start&lt;/em&gt;, the client sends an Acquire command to the server first. If the &lt;em&gt;Acquire&lt;/em&gt; is successful, then &lt;em&gt;Start&lt;/em&gt; begins automatically in the response handler to &lt;em&gt;Acquire&lt;/em&gt;, and the &lt;em&gt;Start&lt;/em&gt; button becomes a &lt;em&gt;Stop&lt;/em&gt; button. Tapping the &lt;em&gt;Stop&lt;/em&gt; button stops the recording.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;HandleQuery():&lt;/strong&gt; Responds to the &lt;em&gt;QueryButton&lt;/em&gt; click. The output prints to the console.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;RestoreRecordState():&lt;/strong&gt; Helps to maintain states between two Unity client sessions. If a recording is in progress but the user quits, the Start state is saved in the device’s persistent memory (&lt;em&gt;PlayerPrefs&lt;/em&gt;). This function gets called at the &lt;em&gt;Awake&lt;/em&gt; step of the hosting object.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SetRecordUI():&lt;/strong&gt; Changes the UI appearance for the Start state and the Stop state.&lt;/p&gt;

&lt;h3&gt;
  
  
  Server Interaction
&lt;/h3&gt;

&lt;p&gt;The client interacts with the server through the four API calls defined earlier. There are four API callers and four matching server response handlers for Acquire, Start, Stop, and Query.&lt;/p&gt;

&lt;p&gt;The four functions use the same logic flow for the caller and the handler. We use the &lt;em&gt;UnityWebRequest&lt;/em&gt; class to package and send the POST request and parse the result using the model we defined in &lt;em&gt;CloudRecordResponseModel&lt;/em&gt;. At the end, the callback to the handler is invoked to finish the processing. Figure 9 shows an example from the &lt;em&gt;_Start&lt;/em&gt; function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gPjtnOic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AsBV39q0bYXsU3PasvjSqMg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gPjtnOic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AsBV39q0bYXsU3PasvjSqMg.png" alt="Figure 9 : Start function"&gt;&lt;/a&gt;&lt;em&gt;Figure 9 : Start function&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;_Start()&lt;/em&gt; runs as a Unity coroutine, so waiting on the network calls won’t block the execution of the UI display.&lt;/p&gt;

&lt;p&gt;The complete code listing for the &lt;em&gt;CloudRecordController&lt;/em&gt; class:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h2&gt;
  
  
  Integration
&lt;/h2&gt;

&lt;p&gt;Now we have the MVC components all defined. Let’s integrate everything. We need a game object to host the &lt;em&gt;CloudRecordController&lt;/em&gt;. We simply add that as an extra component to the &lt;em&gt;RecordButton&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;First, go back to the Unity Editor and drag the &lt;em&gt;CloudRecordController&lt;/em&gt;.cs script to &lt;em&gt;RecordButton&lt;/em&gt;. Second, assign the &lt;em&gt;RecordButton&lt;/em&gt; and &lt;em&gt;QueryButton&lt;/em&gt; to their fields. Third, enter the following URL in the Server URL field:&lt;/p&gt;

&lt;p&gt;&lt;a href="http://localhost:8000"&gt;*http://localhost:8000&lt;/a&gt;*&lt;/p&gt;

&lt;p&gt;Your Inspector should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--N2GCj5xd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ajdl06q_PErhUlPEjr9kW0Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--N2GCj5xd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ajdl06q_PErhUlPEjr9kW0Q.png" alt="Figure 10: Record Button as a Controller"&gt;&lt;/a&gt;&lt;em&gt;Figure 10: Record Button as a Controller&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We update the main controller logic to link to the &lt;em&gt;CloudRecordController&lt;/em&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Update the &lt;em&gt;AgoraTest.cs&lt;/em&gt; script with a new serialized field for &lt;em&gt;CloudRecordingObject&lt;/em&gt; (see figure 11).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the &lt;em&gt;Start()&lt;/em&gt; method, hide the &lt;em&gt;CloudRecordingObject&lt;/em&gt; (see figure 11).&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hILOqIrF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ao-xJ6hdQjHu68fZcrfrR4g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hILOqIrF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Ao-xJ6hdQjHu68fZcrfrR4g.png" alt="Figure 11: New Code in AgoraTest Class"&gt;&lt;/a&gt;&lt;em&gt;Figure 11: New Code in AgoraTest Class&lt;/em&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;In the &lt;em&gt;OnJoinChannelSuccessHandler()&lt;/em&gt; method, update the code to pass the channel information to &lt;em&gt;CloudRecordController&lt;/em&gt; (see figure 12).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f1HCJtdZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ALGMsOkgHc81CcWPH6cT13A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f1HCJtdZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ALGMsOkgHc81CcWPH6cT13A.png" alt="Figure 12: Passing Channel Name"&gt;&lt;/a&gt;&lt;em&gt;Figure 12: Passing Channel Name&lt;/em&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;In the Unity Editor, drag and drop the &lt;em&gt;RecordButton&lt;/em&gt; into the &lt;em&gt;CloudRecordingObject&lt;/em&gt; field of AgoraTest inside &lt;em&gt;GameController&lt;/em&gt; (see figure 13).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pcKJIaIx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AZd6U-Ox63p5sjl-2ozLqQA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pcKJIaIx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AZd6U-Ox63p5sjl-2ozLqQA.png" alt="Figure 13: Linking Recording Object"&gt;&lt;/a&gt;&lt;em&gt;Figure 13: Linking Recording Object&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Test
&lt;/h2&gt;

&lt;p&gt;First, start the server by going to the Terminal and entering the PHP command, as shown in figure 14.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GcBBPPpD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ANV6ueNDfdiB79UBXKCVBCQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GcBBPPpD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ANV6ueNDfdiB79UBXKCVBCQ.png" alt="Figure 14: Start Server"&gt;&lt;/a&gt;&lt;em&gt;Figure 14: Start Server&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Run the project from the Unity Editor, and execute in the following order:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Join&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Start&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Query&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Stop&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leave&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;When you are done, head over to your AWS S3 account to check the files. You should find a list of .ts files uploaded. Here is the quick test I just did:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PEKOL_8S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A6PX0hONQzYe2HhxpoTC3xg.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PEKOL_8S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A6PX0hONQzYe2HhxpoTC3xg.gif" alt="Figure 15: Sample Run"&gt;&lt;/a&gt;&lt;em&gt;Figure 15: Sample Run&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This project is a bit complex, with several dependent configurations to set up. Thank you for following my post to the end. The completed project can be found in &lt;a href="https://github.com/AgoraIO-Community/UnityCloudRecording"&gt;our community GitHub repo&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Other Resources
&lt;/h2&gt;

&lt;p&gt;For more information about Agora.io applications, take a look at the &lt;a href="https://docs.agora.io/en/Video/start_call_web?platform=Web"&gt;Agora Video Call Quickstart Guide&lt;/a&gt; and &lt;a href="https://docs.agora.io/en/Video/API%20Reference/web/index.html"&gt;Agora API Reference&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I also invite you to &lt;a href="http://bit.ly/2IWexJQ"&gt;join the Agora.io Developer Slack community&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>videochat</category>
      <category>recording</category>
      <category>agora</category>
      <category>rtc</category>
    </item>
    <item>
      <title>How to Add Chat with Face Filters to Your Multiplayer Unity Game</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Wed, 19 Aug 2020 17:16:01 +0000</pubDate>
      <link>https://dev.to/icywind/how-to-add-chat-with-face-filters-to-your-multiplayer-unity-game-4mdk</link>
      <guid>https://dev.to/icywind/how-to-add-chat-with-face-filters-to-your-multiplayer-unity-game-4mdk</guid>
      <description>&lt;h1&gt;
  
  
  How to Add Chat with Face Filters to Your Multiplayer Unity Game
&lt;/h1&gt;

&lt;p&gt;Do you have any thoughts about real-time engagement in your game? Undoubtedly, having video chat during a co-op or multiplayer online game adds jokes and laughs to the experience. How about extending the fun with a face filter? It will definitely increase a player’s session time during this COVID-19 lockdown.&lt;/p&gt;

&lt;p&gt;In this tutorial, we will make use of an existing project, written by Joel Thomas, and apply a face filter feature to the player. To follow what has been done, please check out the tutorial &lt;a href="https://bit.ly/2CvfARA"&gt;here&lt;/a&gt;. Make sure you download &lt;a href="https://github.com/icywind/agora-partychat-demo"&gt;the full project from GitHub&lt;/a&gt; to start. For comparison with the original project, consider the following preview:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PkdrF1F6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AAZEF1ssKToJkxaEwNfVl8Q.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PkdrF1F6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AAZEF1ssKToJkxaEwNfVl8Q.gif" alt="" width="600" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.agora.io/en/blog/how-to-get-started-with-agora?utm_source=medium&amp;amp;utm_medium=blog&amp;amp;utm_campaign=unity_party_chat"&gt;Create an Agora.io developer account here&lt;/a&gt; to get your AppID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://dashboard.photonengine.com/en-US/Account/SignUp"&gt;Create a Photon developer account&lt;/a&gt; for their AppID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Clone &lt;a href="https://github.com/icywind/agora-partychat-demo"&gt;my GitHub project&lt;/a&gt; as the starter project, which contains Viking demo’s assets, Agora Video SDK and Photon SDK.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://assetstore.unity.com/packages/add-ons/machinelearning/face-ar-plugin-for-unity-153466"&gt;Banuba Face SDK for Unity&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Unity 2019.3 or up.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;You need to have a Banuba client token to run the project. Contact &lt;a href="mailto:info@banuba.com"&gt;info@banuba.com&lt;/a&gt; for a trial token and SDK package if you don’t want to pay up front at the Asset Store.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;First, we will use the Banuba Face SDK to start the AR face masking. Import the package that you downloaded from the Asset Store, or the unzipped SDK, into the Unity project directory. You should see the current project directory structure and find the &lt;strong&gt;BanubaClientToken&lt;/strong&gt; file in the Resources folder. Open the file in a text editor and paste your SDK token key there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zBwS5o3K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AiHW3fzhotYpZaVoWQLB32g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zBwS5o3K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AiHW3fzhotYpZaVoWQLB32g.png" alt="" width="631" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, enter your Photon project info in &lt;em&gt;Window &amp;gt; Photon Unity Networking &amp;gt; PUN Wizard &amp;gt; Setup Project.&lt;/em&gt; Paste your AppID or the email you used for your Photon account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jILt7jbv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AFAzyyyOD9YpUUFwW3K9ZHA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jILt7jbv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AFAzyyyOD9YpUUFwW3K9ZHA.png" alt="" width="700" height="248"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note that the GitHub project already contains a copy of the Agora Video SDK. You may choose to update to the latest version from the Asset Store. It is recommended that you delete the folder Assets/AgoraEngine/Plugins for a clean replacement.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Create BNB_AR Prefab
&lt;/h3&gt;

&lt;p&gt;Open the &lt;strong&gt;Sample&lt;/strong&gt; scene from BanabuFaceAR/Scenes, and do the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create an empty game object, name it “BNBARStuff”.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Drag and drop the items &lt;em&gt;Camera&lt;/em&gt;, &lt;em&gt;Faces&lt;/em&gt;, and &lt;em&gt;Surface&lt;/em&gt; into &lt;strong&gt;BNBARStuff&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change layers of &lt;strong&gt;BNBARStuff&lt;/strong&gt; to “default” recursively.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update the Camera component’s culling mask to “Default” only.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Drag the &lt;strong&gt;BNBARStuff&lt;/strong&gt; object into the Prefabs folder to make it a prefab.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the prefab, rename &lt;em&gt;Camera&lt;/em&gt; to “ARCamera”, and rename &lt;em&gt;Surface&lt;/em&gt; to “ARSurface”. Change &lt;strong&gt;ARCamera’s&lt;/strong&gt; culling mask to Default only. Uncheck &lt;em&gt;Default&lt;/em&gt; from MainCamera’s culling mask list.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KCCk9_K3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AxNaafDIxKDetYAQCb3cHdg.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KCCk9_K3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AxNaafDIxKDetYAQCb3cHdg.gif" alt="" width="640" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If your Banuba SDK has an FPS game object under &lt;em&gt;Surface&lt;/em&gt;, you can remove it from the prefab. Also update the code in CameraController to do a null check:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;**if (fpsTextObj != null)** fpsTextObj.text = “FPS: “ + fps;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Do NOT save the Sample scene. Open &lt;strong&gt;VikingsScene&lt;/strong&gt; from DemoVikings and set up the BNBARStuff object using the prefab:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Drag and drop the &lt;strong&gt;BNBARStuff&lt;/strong&gt; prefab to the &lt;em&gt;Hierarchy.&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable one of the filters under &lt;em&gt;Faces&lt;/em&gt; (e.g., &lt;em&gt;Afro&lt;/em&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Highlight the Camera under &lt;strong&gt;BNBARStuff&lt;/strong&gt; and play the app in Editor.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;√ You can observe yourself with the AR filter applied in the Camera preview.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Cx_IpSoR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AyJT5c6VZc01RdJTibEN-3w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Cx_IpSoR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AyJT5c6VZc01RdJTibEN-3w.png" alt="" width="534" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Save the &lt;strong&gt;VikingsScene&lt;/strong&gt; scene if everything looks good so far.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create Render Texture
&lt;/h3&gt;

&lt;p&gt;Create a render texture to host the &lt;em&gt;ARCamera&lt;/em&gt; input:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a RenderTexture; name it “ARRenderTexture”.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign &lt;strong&gt;ARRenderTexture&lt;/strong&gt; into &lt;em&gt;ARCamera’s&lt;/em&gt; Target Texture field.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9uApQzlQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A4m5RTQXSZGuwfd2txnnjlg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9uApQzlQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A4m5RTQXSZGuwfd2txnnjlg.png" alt="" width="512" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Create Local View Prefab
&lt;/h3&gt;

&lt;p&gt;In the &lt;em&gt;Viking&lt;/em&gt; scene, create a UI hierarchy as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a new Image, change its width to 140 and height to 140; name it “ViewContainerWithMask”. Add a Mask component to this object.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add a new game object under &lt;strong&gt;ViewContainerWithMask&lt;/strong&gt;, name it “AvatarRenderTarget”. Make sure its scale is set to 1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add a RawImage component to &lt;strong&gt;AvatarRenderTarget&lt;/strong&gt;. Drag and drop the &lt;strong&gt;ARRenderTexture&lt;/strong&gt; into the Texture field of the &lt;strong&gt;RawImage&lt;/strong&gt; component.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add a button that has no text and no image as a child object of &lt;strong&gt;AvatarRenderTarget&lt;/strong&gt;, set Normal Color and Pressed Color’s alpha to 0 (transparent). Set a low alpha (e.g. 35) for the other states. This button will invoke the appearance of the &lt;strong&gt;FilterDropdown&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Attach this “AvatarViewController.cs” script to &lt;strong&gt;AvatarRenderTarget&lt;/strong&gt; object.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ol start="6"&gt;
&lt;li&gt;Assign &lt;strong&gt;ViewContainerWithMask&lt;/strong&gt; object into the “Mask RectTan” field of the &lt;strong&gt;AvatarViewController&lt;/strong&gt; component on the &lt;strong&gt;AvatarRenderTarget&lt;/strong&gt; object.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Verify your setup as described in Steps 1–4. Drag the game object ViewContainerWithMask to the Prefab folder. Delete it from the &lt;em&gt;Vikings&lt;/em&gt; scene. Save.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PZrb9w9q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A4w_c8kxjKpKWXnZm6ExePg.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PZrb9w9q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A4w_c8kxjKpKWXnZm6ExePg.gif" alt="" width="540" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Set up dropdown list for the masks
&lt;/h3&gt;

&lt;p&gt;In the &lt;strong&gt;VikingsScene&lt;/strong&gt;, create a UI Dropdown and place it in the upper-right corner. Make sure you set the Canvas scaler to “Scale with Screen”. Assign the dropdown’s layer to “UI” and name it “FilterDropdown”. Hide it from the scene for now, since it will be activated later only when the user taps the self-video display.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1zj84QNw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AIgjLULRTPnuey1ZgTDoCQw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1zj84QNw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AIgjLULRTPnuey1ZgTDoCQw.png" alt="" width="800" height="554"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Code Modification
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Spawn with Banuba FaceFilter enabled Prefab
&lt;/h3&gt;

&lt;p&gt;In AgoraVideoChat.cs, update the &lt;em&gt;CreateUserVideoSurface&lt;/em&gt; method with the following code snippet:&lt;/p&gt;
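&lt;p&gt;A minimal sketch of the change follows. The idea is to spawn the Banuba-enabled &lt;em&gt;ViewContainerWithMask&lt;/em&gt; prefab for the local user and the plain video surface for remote users. Field names such as AvatarVideoPrefab, userVideoPrefab, and spawnPoint are assumptions here (see the repo for the exact code):&lt;/p&gt;

```csharp
// Sketch only: spawn the Banuba-enabled prefab for the local user.
// AvatarVideoPrefab, userVideoPrefab, and spawnPoint are assumed fields.
private GameObject CreateUserVideoSurface(uint uid, bool isLocalUser)
{
    GameObject prefab = isLocalUser ? AvatarVideoPrefab : userVideoPrefab;
    GameObject go = Instantiate(prefab, spawnPoint, Quaternion.identity);

    if (!isLocalUser)
    {
        // Bind the remote uid so the SDK renders that user's stream here.
        VideoSurface surface = go.GetComponentInChildren(typeof(VideoSurface)) as VideoSurface;
        surface.SetForUser(uid);
        surface.SetEnable(true);
    }
    return go;
}
```

&lt;p&gt;In this sketch, only remote views need the uid binding; the local user’s view is driven by the AvatarViewController on the prefab itself via the ARRenderTexture set up earlier.&lt;/p&gt;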

&lt;h3&gt;
  
  
  Update Prefab Field Assignments
&lt;/h3&gt;

&lt;p&gt;Open Assets/DemoVikings/Resources/Charprefab and update the following fields:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Assign the prefab &lt;em&gt;ViewContainerWithMask&lt;/em&gt; to the “AvatarVideoPrefab” field (see line 2 in the above code snippet).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update the AppID and Channel ID.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bG0L3AEH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2060/1%2App5aJG8ccif5_pRR9T_gLg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bG0L3AEH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2060/1%2App5aJG8ccif5_pRR9T_gLg.png" alt="" width="880" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Send the Frames as Custom Source Video
&lt;/h3&gt;

&lt;p&gt;Add the following code snippet to the end of the &lt;strong&gt;AgoraVideoChat.cs&lt;/strong&gt; script. The code runs every frame, captures what the &lt;strong&gt;ARCamera&lt;/strong&gt; is rendering, and sends it into the channel as a video frame.&lt;/p&gt;
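&lt;p&gt;A sketch of that snippet is below. It assumes the engine was set up for a custom video source (SetExternalVideoSource called before joining), and that _arRenderTexture and _frameTexture (TextureFormat.RGBA32, same size) are fields; the API names follow the Agora Unity SDK and should be verified against your SDK version:&lt;/p&gt;

```csharp
// Sketch only: capture the ARCamera's RenderTexture every frame and push it
// into the channel as a custom video frame.
void Update()
{
    StartCoroutine(CaptureAndPushFrame());
}

IEnumerator CaptureAndPushFrame()
{
    yield return new WaitForEndOfFrame();

    // Copy the render texture into a readable Texture2D.
    RenderTexture.active = _arRenderTexture;
    _frameTexture.ReadPixels(new Rect(0, 0, _frameTexture.width, _frameTexture.height), 0, 0);
    _frameTexture.Apply();
    RenderTexture.active = null;

    // Wrap the raw RGBA bytes and push them into the channel.
    ExternalVideoFrame frame = new ExternalVideoFrame();
    frame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
    frame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
    frame.buffer = _frameTexture.GetRawTextureData();
    frame.stride = _frameTexture.width;
    frame.height = _frameTexture.height;
    frame.timestamp = 0; // 0 lets the SDK assign a timestamp
    mRtcEngine.PushVideoFrame(frame);
}
```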

&lt;h3&gt;
  
  
  Last Check
&lt;/h3&gt;

&lt;p&gt;Since we created a larger rectangle for the avatar image view (140 pt), we should also update the RectTransform of Asset/Prefabs/UserVideo to 140x140. Also, the &lt;strong&gt;AvatarViewController&lt;/strong&gt; provides a zoom-level option; try a 1.5x zoom for a better view of yourself in that window. Here is how it looks once everything is connected:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AG3-xy3F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AHev2HVPGc4BN64zKV2xEjA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AG3-xy3F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AHev2HVPGc4BN64zKV2xEjA.png" alt="" width="831" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That’s it! With a face mask on, you look much cooler than before! Have fun playing with friends and talking to them at the same time with your cool new look!&lt;/p&gt;

&lt;h2&gt;
  
  
  In Summary
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;We connected to Agora’s network to display our video chat channel.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We enabled other users to join our party, see their faces, and talk with them in real time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We took it one step further and let the user pick a face mask provided by the Banuba ARFace Filter SDK.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you have any questions or hit a snag in the course of building your own networked group video chat, please feel free to reach out directly or via the Agora Slack Channel!&lt;/p&gt;

&lt;p&gt;Check out the full project (it is the &lt;em&gt;CompletedProject&lt;/em&gt; branch of the same repo): &lt;a href="https://github.com/icywind/UnityVikingGameChatDemo/tree/CompletedProject"&gt;UnityVikingGameChatDemo&lt;/a&gt;&lt;/p&gt;

</description>
      <category>unity3d</category>
      <category>facefilter</category>
      <category>banuba</category>
    </item>
    <item>
      <title>Adding Video Communication to A Multiplayer Unity Game</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Thu, 04 Jun 2020 02:15:22 +0000</pubDate>
      <link>https://dev.to/icywind/adding-video-communication-to-a-multiplayer-unity-game-2mo4</link>
      <guid>https://dev.to/icywind/adding-video-communication-to-a-multiplayer-unity-game-2mo4</guid>
      <description>&lt;p&gt;Do you ever imagine when you play against your friends on a multiplayer game on your lovely mobile device, you want to see each other’s facial expression or tease each other with jokes and funny faces? You’ve found a solution here without leaving the game itself to another chat App. In this tutorial we are going to take Unity’s popular Tanks game to the next level and make it into a game with live video chats!&lt;/p&gt;

&lt;p&gt;Before we get started, there are a few prerequisites for anyone reading this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://store.unity.com/" rel="noopener noreferrer"&gt;Unity&lt;/a&gt; (2018) and a &lt;a href="https://developer.cloud.unity3d.com/" rel="noopener noreferrer"&gt;Unity Developer account&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Knowledge of how to build your Unity project for &lt;a href="https://unity3d.com/learn/tutorials/topics/mobile-touch/building-your-unity-game-ios-device-testing" rel="noopener noreferrer"&gt;iOS&lt;/a&gt; and &lt;a href="https://unity3d.com/learn/tutorials/topics/mobile-touch/building-your-unity-game-android-device-testing" rel="noopener noreferrer"&gt;Android&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A cross-platform mobile multiplayer Unity game (I chose to use &lt;a href="https://assetstore.unity.com/packages/essentials/tutorial-projects/tanks-reference-project-80165" rel="noopener noreferrer"&gt;Tanks!!!&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;An understanding of C# and &lt;a href="https://unity3d.com/learn/tutorials/topics/scripting/coding-unity-absolute-beginner" rel="noopener noreferrer"&gt;scripting within Unity&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;An &lt;a href="https://dashboard.agora.io/" rel="noopener noreferrer"&gt;Agora.io developer account&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At least two mobile devices (&lt;em&gt;one iOS &amp;amp; one Android is ideal&lt;/em&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Project Setup
&lt;/h2&gt;

&lt;p&gt;If you plan to use your own existing Unity project, go ahead and open it now and skip down to “&lt;a href="https://medium.com/p/73d8f8e3b2f9#2d94" rel="noopener noreferrer"&gt;Integrating Group Video Chat&lt;/a&gt;”.&lt;/p&gt;

&lt;p&gt;For those readers that don’t have an existing project, keep reading; &lt;em&gt;the next few sections are for you&lt;/em&gt;. (Note: this is the same Project Setup used in Hermes’ “Adding Voice Chat to a Multiplayer Cross-Platform Unity game” tutorial, so you may have already done most of it there.)&lt;/p&gt;

&lt;h2&gt;
  
  
  New Unity Project
&lt;/h2&gt;

&lt;p&gt;Please bear with me, as the basic setup has a few steps and I’ll do my best to cover it swiftly with lots of images. Let’s start by opening Unity and creating a blank project. I recommend starting this project with the latest Unity 2018 LTS version. Note that the UNet networking module was deprecated in 2019, so 2018 LTS is essentially the best option here.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2490%2F1%2AaprQAZ6TxUUbBwZ5Q1MBOQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2490%2F1%2AaprQAZ6TxUUbBwZ5Q1MBOQ.png" alt="Create a new project from Unity Hub"&gt;&lt;/a&gt;&lt;em&gt;Create a new project from Unity Hub&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://assetstore.unity.com/packages/essentials/tutorial-projects/tanks-reference-project-80165" rel="noopener noreferrer"&gt;Download and import the “Tanks!!! Reference Project” from the Unity Store&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2Ay89AICjom4yxtr4t5BMqAQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2Ay89AICjom4yxtr4t5BMqAQ.png" alt="Searched by “Tanks Reference” and download this asset"&gt;&lt;/a&gt;&lt;em&gt;Searched by “Tanks Reference” and download this asset&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;When Unity asks whether you want to overwrite the existing project with the new asset, click Yes. Also accept the API update prompt that comes up next.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2A8NmzhRsXlv7sc4IvV17idg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2A8NmzhRsXlv7sc4IvV17idg.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are a couple more steps to get the Tanks!!! reference project ready for building on mobile. First, we need to enable Unity Live Mode for the project through the Unity dashboard (&lt;em&gt;select project → Multiplayer → Unet Config&lt;/em&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AyvKSiawHUWa29aks.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AyvKSiawHUWa29aks.png" alt="Set max players to 6 even though Tanks!!! limits the game to 4 players and click save"&gt;&lt;/a&gt;&lt;em&gt;Set max players to 6 even though Tanks!!! limits the game to 4 players and click save&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2A2D9iBiu9Kvu7b9_5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2A2D9iBiu9Kvu7b9_5.png" alt="Once Unity Live Mode is enabled"&gt;&lt;/a&gt;&lt;em&gt;Once Unity Live Mode is enabled&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Building for iOS
&lt;/h2&gt;

&lt;p&gt;Now that we have Unity’s multiplayer enabled, we are ready to build the iOS version. Let’s start by opening our Build Settings, switching our platform to iOS, and building the project for testing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2APsK-_l3BsAbFGFru0tKV9g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2APsK-_l3BsAbFGFru0tKV9g.png" alt="Update the Bundle id and Usage Descriptions"&gt;&lt;/a&gt;&lt;em&gt;Update the Bundle id and Usage Descriptions&lt;/em&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Please note: you need to have Xcode installed and setup before attempting to build the project for iOS.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AE4Bevgp6Vdm6cUiF.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AE4Bevgp6Vdm6cUiF.png" alt="When building for the first time, create a new folder “Builds” and save the build as iOS"&gt;&lt;/a&gt;&lt;em&gt;When building for the first time, create a new folder “Builds” and save the build as iOS&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3068%2F0%2AZpTF_SY7jYtvfQF9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3068%2F0%2AZpTF_SY7jYtvfQF9.png" alt="After the project has successfully built for iOS, we will see the project in the **Builds** folder"&gt;&lt;/a&gt;&lt;em&gt;After the project has successfully built for iOS, we will see the project in the **Builds&lt;/em&gt;* folder*&lt;/p&gt;

&lt;p&gt;Let’s open Unity-iPhone.xcodeproj, sign, and build / run on our test device.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AQ-xj06zaHzfrRA8-.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AQ-xj06zaHzfrRA8-.png" alt="Enable automatic signing to simplify the signing process. Remove In-App-Purchase if it shows up."&gt;&lt;/a&gt;&lt;em&gt;Enable automatic signing to simplify the signing process. Remove In-App-Purchase if it shows up.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Don’t start celebrating just yet. Now that we have a working iOS build we still need to get the Android build running.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building for Android
&lt;/h2&gt;

&lt;p&gt;Android is a bit simpler than iOS since Unity can build, sign, and deploy to Android without the need to open Android Studio. For this section I’m going to assume everyone reading this has already linked Unity with their Android SDK folder. Let’s start by opening our Build Settings and switching our platform to Android.&lt;/p&gt;

&lt;p&gt;Before we try to “Build and Run” the project on Android, we need to make a couple of adjustments to the code. Don’t worry, this part is really simple: we only need to comment out a few lines of code, add a simple return statement, and replace one file.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Some background:&lt;/strong&gt; the Tanks!!! Android build contains the &lt;a href="https://everyplay.com/" rel="noopener noreferrer"&gt;Everyplay&lt;/a&gt; plugin for screen recording and sharing your game session. Unfortunately, &lt;a href="https://everyplay.com/shutdown-notice.html" rel="noopener noreferrer"&gt;Everyplay shut down&lt;/a&gt; in October 2018, and the plugin contains some issues that, if not addressed, will cause the project to fail to compile and to quit unexpectedly once it does compile.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The first change we need to make is to correct a syntax mistake in the Everyplay plugin’s build.gradle file. Start by navigating to our project’s Plugins folder, click into the Android folder, then go into the everyplay folder and open the build.gradle file in your favorite code editor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2A_TJDAAREk7U6HpOg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2A_TJDAAREk7U6HpOg.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have the Gradle file open, select all and replace it with the code below. The team that built Tanks!!! updated the code on GitHub but for some reason it didn’t make its way into the Unity Store plugin.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;The last change we need to make is to disable EveryPlay. Why would we want to disable EveryPlay, you may ask. Because when the plugin tries to initialize itself, it causes the Android app to crash. The fastest fix I found was to update a couple of lines within EveryPlaySettings.cs (&lt;em&gt;Assets → Plugins → EveryPlay → Scripts&lt;/em&gt;) so that whenever EveryPlay attempts to check whether it’s supported or enabled, we return false.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
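&lt;p&gt;The change amounts to forcing those checks to return false. A sketch (the member names here are illustrative; apply the same idea to the actual checks in EveryPlaySettings.cs):&lt;/p&gt;

```csharp
// Sketch of the EveryPlaySettings.cs change; member names are illustrative.
public class EveryPlaySettings
{
    public bool IsEnabled
    {
        get { return false; } // always report EveryPlay as disabled
    }

    public bool IsSupported
    {
        get { return false; } // prevents the plugin from initializing (and crashing)
    }
}
```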


&lt;p&gt;Now we are finally ready to build the project for Android! Within Unity open the &lt;strong&gt;&lt;em&gt;Build Settings&lt;/em&gt;&lt;/strong&gt; (&lt;em&gt;File &amp;gt; Build Settings&lt;/em&gt;), select &lt;strong&gt;&lt;em&gt;Android&lt;/em&gt;&lt;/strong&gt; from the &lt;em&gt;Platform&lt;/em&gt; list and click &lt;strong&gt;&lt;em&gt;Switch Platform&lt;/em&gt;&lt;/strong&gt;. Once Unity finishes its setup process, open the &lt;strong&gt;&lt;em&gt;Player Settings&lt;/em&gt;&lt;/strong&gt;. We need to make sure our Android app also has a unique &lt;em&gt;Package Name&lt;/em&gt;; I chose com.agora.tanks.videodemo.&lt;/p&gt;

&lt;p&gt;You may also need to create a key store for the Android app. See this section of the PlayerSettings in Unity Editor:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AaaDpzcD4UJ9Fa9YEyYDEZQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AaaDpzcD4UJ9Fa9YEyYDEZQ.png" alt="Android KeyStore setting"&gt;&lt;/a&gt;&lt;em&gt;Android KeyStore setting&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrating Video Chat
&lt;/h2&gt;

&lt;p&gt;For this project, the Agora.io Video SDK for Unity was chosen because it makes implementation in our cross-platform mobile project really simple.&lt;/p&gt;

&lt;p&gt;Let’s open up the Unity Store and search for “Agora Video SDK”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AUYMr47aXj2DMQgfS.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3200%2F0%2AUYMr47aXj2DMQgfS.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AyXsvPxEq1v4Up6S76O18Ww.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AyXsvPxEq1v4Up6S76O18Ww.png" alt="You only download the asset once, and then you can import it to different projects."&gt;&lt;/a&gt;&lt;em&gt;You only download the asset once, and then you can import it to different projects.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Once the plugin page has loaded, go ahead and click &lt;strong&gt;Download&lt;/strong&gt;. Once the download is complete, click &lt;strong&gt;Import&lt;/strong&gt; to bring the assets into your project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AtbySbQz0_bYqH7HuTW7VmQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AtbySbQz0_bYqH7HuTW7VmQ.png" alt="Uncheck the last four items before import"&gt;&lt;/a&gt;&lt;em&gt;Uncheck the last four items before import&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;You should then open the Lobby as your main scene. The following also shows how the service page would look for the multiplayer settings:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2628%2F1%2AyEqSW8yUnHIrzwW6EWeJPA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2628%2F1%2AyEqSW8yUnHIrzwW6EWeJPA.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Discussion&lt;/strong&gt;: in the following sections we will go through how the project is updated with new code and prefab changes. For those who just want to try everything out quickly, here is a &lt;a href="https://apprtcio-my.sharepoint.com/:u:/g/personal/rick_agora_io/ETAvJX_kPoxBj1ZsFFpng_wBI4PU4nqmM3TFt9vdRUym-g?e=AKy8hg" rel="noopener noreferrer"&gt;plugin file&lt;/a&gt; to import all the changes. After importing, you will just need to enter the AppId on the GameSettings object as described below.&lt;/p&gt;

&lt;h2&gt;
  
  
  Modify the Tank Prefab
&lt;/h2&gt;

&lt;p&gt;Let’s add a plane on top of the tank to render the video display. Find the CompleteTank prefab in the project and add a 3D Plane object to the prefab. Make sure the following values are updated for the best result:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Position Y = 8; scale 0.7; rotation -45 degrees on X and 45 degrees on Y.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Do not cast shadow&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Disable the Mesh Collider script&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AXt6wCF1nFl7ukB-2XXxTHQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AXt6wCF1nFl7ukB-2XXxTHQ.png" alt="Plane Prefab values"&gt;&lt;/a&gt;&lt;em&gt;Plane Prefab values&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Attach the VideoSurface.cs script from the Agora SDK to the Plane game object.&lt;/p&gt;

&lt;p&gt;Save the change, and test the prefab in the game by going to Training to see the outcome. You should see a tank similar to the following screen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AArFBPJupw34SwaiJSegZQA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AArFBPJupw34SwaiJSegZQA.png" alt="Tank with Plane attached"&gt;&lt;/a&gt;&lt;em&gt;Tank with Plane attached&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create UI for Mic/Camera Controls
&lt;/h2&gt;

&lt;p&gt;Next, open the GameManager prefab, create a container game object, and add three toggles under it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Mic On&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Cam On&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;FrontCam&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F4628%2F1%2AJ0jZZ4x7vV_ydxpmiFXiiA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F4628%2F1%2AJ0jZZ4x7vV_ydxpmiFXiiA.png" alt="GameManager UI Canvas"&gt;&lt;/a&gt;&lt;em&gt;GameManager UI Canvas&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;That’s basically all the UI changes we need for this project. The controller script will be added to the prefab later in the following sections.&lt;/p&gt;

&lt;h2&gt;
  
  
  Controller Scripts
&lt;/h2&gt;

&lt;p&gt;Next we will go over the scripts that make video chat work in this game. Before adding new scripts, we will modify an existing script to allow input of the Agora AppId.&lt;/p&gt;

&lt;h3&gt;
  
  
  GameSettings
&lt;/h3&gt;

&lt;p&gt;Two updates to this script make the game work with the Agora SDK.&lt;/p&gt;

&lt;p&gt;(1) Add a SerializedField here for the AppId.&lt;/p&gt;

&lt;p&gt;Go to your Agora developer account and get the AppId (you may need to follow the instructions to create a project first):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F5488%2F1%2ATFdma4NjCzmMx1R6lPK-WA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F5488%2F1%2ATFdma4NjCzmMx1R6lPK-WA.png" alt="Agora AppId"&gt;&lt;/a&gt;&lt;em&gt;Agora AppId&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In the Lobby scene of the Unity Editor, paste the value of the App ID there and save:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F4332%2F1%2AG5p1IRgv0eirsjdAuYO_gg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F4332%2F1%2AG5p1IRgv0eirsjdAuYO_gg.png" alt="Set the App ID for Agora API"&gt;&lt;/a&gt;&lt;em&gt;Set the App ID for Agora API&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;(2) Add support for Android devices by asking for Microphone and Camera permissions.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
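&lt;p&gt;For reference, a sketch of those permission checks using Unity’s UnityEngine.Android API (available since Unity 2018.3); the helper class name and where it is called from (e.g. GameSettings’ Awake) are assumptions:&lt;/p&gt;

```csharp
// Sketch: ask for microphone and camera permissions on Android.
#if PLATFORM_ANDROID
using UnityEngine.Android;
#endif

public class PermissionHelper
{
    public static void CheckPermissions()
    {
#if PLATFORM_ANDROID
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
        }
#endif
    }
}
```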


&lt;h3&gt;
  
  
  Agora Controller Scripts
&lt;/h3&gt;

&lt;p&gt;Before jumping right into the code, let’s understand what capabilities are needed. They are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;An interface to the Agora Video SDK to join a channel, show video, mute the microphone, flip the camera, and so on.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Actual implementation for the Agora SDK event callbacks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A mapping from the Unity Multiplayer player id to the Agora user id.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A manager to respond to the UI Toggle actions that we created earlier.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following capture shows the corresponding script hierarchy. A discussion of the four classes follows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3296%2F1%2ArtYiKK9fFjka9qfPtvzBOg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3296%2F1%2ArtYiKK9fFjka9qfPtvzBOg.png" alt="Controller Script Hierarchy"&gt;&lt;/a&gt;&lt;em&gt;Controller Script Hierarchy&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AgoraApiHandlerImpl&lt;/strong&gt;: this class implements most of the Agora Video SDK event callbacks. Many of them are placeholders. To support the minimum capability in this game, the following handlers are of most interest:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;JoinChannelSuccessHandler&lt;/strong&gt; — called when the local user joins a channel; this corresponds to creating a game and starting a server. The server name is the same as the Agora channel name.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;UserJoinedHandler&lt;/strong&gt; — when a remote user joins the game.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;UserOfflineHandler&lt;/strong&gt; — when a remote user leaves the game.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;SDKWarningHandler is commented out to reduce noise in the debug log, but enabling it is recommended for an actual project.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;AgoraVideoController&lt;/strong&gt;: this singleton class is the main entry point for the Tanks project to interact with the Agora SDK. It creates the &lt;strong&gt;AgoraApiHandlerImpl&lt;/strong&gt; instance and handles interface calls for joining a channel, the mute functions, and so on. The code also checks for Camera and Microphone access permissions on Android devices.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
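&lt;p&gt;The singleton shape of the class can be sketched as follows (the real class also creates the handler instance and wraps the engine calls, which are omitted here):&lt;/p&gt;

```csharp
// Sketch of the singleton shape used by AgoraVideoController.
public class AgoraVideoController
{
    private static AgoraVideoController _instance;

    public static AgoraVideoController Instance
    {
        get
        {
            if (_instance == null)
            {
                _instance = new AgoraVideoController();
            }
            return _instance;
        }
    }

    private AgoraVideoController()
    {
        // Real implementation: create the RtcEngine with the AppId,
        // hook up AgoraApiHandlerImpl, and check Android permissions.
    }
}
```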


&lt;p&gt;&lt;strong&gt;AgoraPlayerController&lt;/strong&gt;: while the Unity UNet library maintains the network players’ profiles, the Agora user ids are created asynchronously. We maintain a list of network players and a list of Agora user ids. When the game scene actually starts, we bind the two lists together into a dictionary so the Agora id can be looked up using a NetworkPlayer’s profile. (We wouldn’t need this binding mechanism if the user id were known in advance; in an actual production project, it is recommended to let the game server provide the user ids to pass to the JoinChannel() call.)&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
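&lt;p&gt;The binding described above can be sketched as follows; class and member names are assumptions (the real implementation is in the project source), and string keys stand in for the NetworkPlayer profiles:&lt;/p&gt;

```csharp
using System.Collections;

// Sketch of the list-to-dictionary binding AgoraPlayerController performs.
public class AgoraPlayerController
{
    private readonly ArrayList _playerNames = new ArrayList(); // network player profiles
    private readonly ArrayList _agoraIds = new ArrayList();    // uids from SDK callbacks
    private readonly Hashtable _binding = new Hashtable();

    public void AddNetworkPlayer(string playerName) { _playerNames.Add(playerName); }

    public void AddAgoraId(uint uid) { _agoraIds.Add(uid); }

    // Called when the game scene starts: zip the two lists together.
    public void Bind()
    {
        int count = _playerNames.Count;
        if (_agoraIds.Count != count) { return; } // lists out of sync; skip (assumed policy)
        for (int i = 0; i != count; i++)
        {
            _binding[_playerNames[i]] = _agoraIds[i];
        }
    }

    // 0 means "unknown" (the SDK treats 0 as the local user).
    public uint GetAgoraId(string playerName)
    {
        return _binding.ContainsKey(playerName) ? (uint)_binding[playerName] : 0u;
    }
}
```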


&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AgoraUIManager&lt;/strong&gt;: positions the container game object at the top-right of the game screen. It provides three toggle functions:&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Mic On&lt;/em&gt; : mute the audio input.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Cam On&lt;/em&gt;: mute the local camera stream and turn off the local player’s display.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;CamSwitch&lt;/em&gt;: switch between the front camera or the back camera on the mobile device.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
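&lt;p&gt;The three toggle handlers can be sketched as follows; mRtcEngine and localVideoSurface are assumed references set up elsewhere, and the engine calls (EnableLocalAudio, MuteLocalVideoStream, SwitchCamera) follow the Agora Unity SDK:&lt;/p&gt;

```csharp
// Sketch of the AgoraUIManager toggle handlers.
public void OnMicToggle(bool isOn)
{
    mRtcEngine.EnableLocalAudio(isOn); // stop/start the audio input
}

public void OnCamToggle(bool isOn)
{
    mRtcEngine.MuteLocalVideoStream(!isOn); // stop sending local video
    localVideoSurface.SetEnable(isOn);      // and hide the local display
}

public void OnCamSwitch()
{
    mRtcEngine.SwitchCamera(); // front/back camera on mobile
}
```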


&lt;h2&gt;
  
  
  Tanks Code Modifications
&lt;/h2&gt;

&lt;p&gt;We will integrate the above controller code into the existing project by updating the Tanks code in the following classes:&lt;/p&gt;

&lt;h3&gt;
  
  
  TankManager
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Add a field to bring in the VideoSurface instance that we added to the Plane and drag the Plane game object from the children to the field.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fcikn9zmenmn1tvc928ek.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fcikn9zmenmn1tvc928ek.png" alt="Setting up project setting"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Add a constant to name the video-surface.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public const string **LocalTankVideoName **= "Video-Local";
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change the code near the end of the &lt;em&gt;initialize&lt;/em&gt;() method, which previously looked like this:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AQ892mta4lDDOsvj0f9VtWQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AQ892mta4lDDOsvj0f9VtWQ.png" alt="old code"&gt;&lt;/a&gt;&lt;em&gt;old code&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The new code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Discussion: this is the code that associates the plane display we created earlier with the rendered video feed. The VideoSurface script handles this work. The only thing it needs is the Agora ID. For the local player, the Agora ID defaults to 0, and the SDK automatically renders the device’s camera video onto the hosting plane. For a remote player, the non-zero Agora ID is required to get the stream to render.&lt;/p&gt;
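&lt;p&gt;A minimal sketch of that association, assuming a VideoSurface reference taken from the Plane (the wrapper method is illustrative; SetForUser and SetEnable are the SDK’s VideoSurface methods):&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Sketch: bind a player's stream to the plane's VideoSurface.
void SetupVideoSurface(VideoSurface videoSurface, uint agoraUid, bool isLocal)
{
    // uid 0 tells the SDK to render the local camera feed;
    // remote players need their non-zero Agora uid.
    videoSurface.SetForUser(isLocal ? 0u : agoraUid);
    videoSurface.SetEnable(true);
}
&lt;/code&gt;&lt;/pre&gt;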

&lt;h3&gt;
  
  
  Calling JoinChannel()
&lt;/h3&gt;

&lt;p&gt;The &lt;em&gt;JoinChannel&lt;/em&gt;() calls in the &lt;strong&gt;AgoraVideoController&lt;/strong&gt; class establish the local player’s status and start the channel. There are three places where the call is initiated.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CreateGame.cs&lt;/strong&gt;: add a line to the &lt;em&gt;StartMatchmakingGame&lt;/em&gt;() function inside the callback. It will look like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;StartMatchmakingGame&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;GameSettings&lt;/span&gt; &lt;span class="n"&gt;settings&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;GameSettings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;s_Instance&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
       &lt;span class="n"&gt;settings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;SetMapIndex&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m_MapSelect&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;currentIndex&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
       &lt;span class="n"&gt;settings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;SetModeIndex&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m_ModeSelect&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;currentIndex&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

       &lt;span class="n"&gt;m_MenuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowConnectingModal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;false&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

       &lt;span class="n"&gt;Debug&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;GetGameName&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
       &lt;span class="n"&gt;m_NetManager&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StartMatchmakingGame&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;GetGameName&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;success&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;matchInfo&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
             &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(!&lt;/span&gt;&lt;span class="n"&gt;success&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
             &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;m_MenuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowInfoPopup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Failed to create game."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="p"&gt;}&lt;/span&gt;
             &lt;span class="k"&gt;else&lt;/span&gt;
             &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;m_MenuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;HideInfoPopup&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
                &lt;span class="n"&gt;m_MenuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowLobbyPanel&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
                &lt;span class="n"&gt;AgoraVideoController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;instance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;JoinChannel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m_MatchNameInput&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;LevelSelect.cs&lt;/strong&gt;: add the call in &lt;em&gt;OnStartClick&lt;/em&gt;(). The function will then look like this:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;OnStartClick&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;SinglePlayerMapDetails&lt;/span&gt; &lt;span class="n"&gt;details&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;m_MapList&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;m_CurrentIndex&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
       &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;details&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;medalCountRequired&lt;/span&gt; &lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;m_TotalMedalCount&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
       &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
       &lt;span class="p"&gt;}&lt;/span&gt;

       &lt;span class="n"&gt;GameSettings&lt;/span&gt; &lt;span class="n"&gt;settings&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;GameSettings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;s_Instance&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
       &lt;span class="n"&gt;settings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;SetupSinglePlayer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m_CurrentIndex&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;ModeDetails&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;details&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;details&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;details&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;rulesProcessor&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;

       &lt;span class="n"&gt;m_NetManager&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ProgressToGameScene&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="n"&gt;AgoraVideoController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;instance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;JoinChannel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;details&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;LobbyServerEntry.cs&lt;/strong&gt;: add the call in &lt;em&gt;JoinMatch&lt;/em&gt;(), and modify the function signature to take a string channelName. The function will then look like this:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;JoinMatch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;NetworkID&lt;/span&gt; &lt;span class="n"&gt;networkId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;channelName&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;MainMenuUI&lt;/span&gt; &lt;span class="n"&gt;menuUi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;MainMenuUI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;s_Instance&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

       &lt;span class="n"&gt;menuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowConnectingModal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

       &lt;span class="n"&gt;m_NetManager&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;JoinMatchmakingGame&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;networkId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;success&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;matchInfo&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
             &lt;span class="c1"&gt;//Failure flow&lt;/span&gt;
             &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(!&lt;/span&gt;&lt;span class="n"&gt;success&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
             &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;menuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowInfoPopup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Failed to join game."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="p"&gt;}&lt;/span&gt;
             &lt;span class="c1"&gt;//Success flow&lt;/span&gt;
             &lt;span class="k"&gt;else&lt;/span&gt;
             &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;menuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;HideInfoPopup&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
                &lt;span class="n"&gt;menuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ShowInfoPopup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Entering lobby..."&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                &lt;span class="n"&gt;m_NetManager&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;gameModeUpdated&lt;/span&gt; &lt;span class="p"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;menuUi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ShowLobbyPanelForConnection&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

                &lt;span class="n"&gt;AgoraVideoController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;instance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;JoinChannel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channelName&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

   &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;Populate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;MatchInfoSnapshot&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Color&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="n"&gt;split&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="kt"&gt;char&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="m"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="sc"&gt;'|'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="n"&gt;StringSplitOptions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RemoveEmptyEntries&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
       &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;channel_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;split&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="m"&gt;1&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;Replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Empty&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
       &lt;span class="n"&gt;m_ServerInfoText&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;channel_name&lt;/span&gt;&lt;span class="p"&gt;**;&lt;/span&gt;

       &lt;span class="n"&gt;m_ModeText&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;split&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="m"&gt;0&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;

       &lt;span class="n"&gt;m_SlotInfo&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"{0}/{1}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;currentSize&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;maxSize&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

       &lt;span class="n"&gt;NetworkID&lt;/span&gt; &lt;span class="n"&gt;networkId&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;networkId&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

       &lt;span class="n"&gt;m_JoinButton&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;RemoveAllListeners&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
       &lt;span class="n"&gt;m_JoinButton&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddListener&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;JoinMatch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;networkId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;**&lt;/span&gt;&lt;span class="n"&gt;channel_name&lt;/span&gt;&lt;span class="p"&gt;**));&lt;/span&gt;

       &lt;span class="n"&gt;m_JoinButton&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;interactable&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;currentSize&lt;/span&gt; &lt;span class="p"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;maxSize&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
   &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  NetworkManager.cs
&lt;/h3&gt;

&lt;p&gt;Insert code for the player to leave the channel in Disconnect():&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;Disconnect&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="k"&gt;switch&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;gameType&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
       &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;NetworkGameType&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Direct&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
             &lt;span class="nf"&gt;StopDirectMultiplayerGame&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
             &lt;span class="k"&gt;break&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
          &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;NetworkGameType&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Matchmaking&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
             &lt;span class="nf"&gt;StopMatchmakingGame&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
             &lt;span class="k"&gt;break&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
          &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;NetworkGameType&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Singleplayer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
             &lt;span class="nf"&gt;StopSingleplayerGame&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
             &lt;span class="k"&gt;break&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
       &lt;span class="p"&gt;}&lt;/span&gt;
       &lt;span class="n"&gt;AgoraVideoController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;instance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;LeaveChannel&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;That’s basically all the code changes we need to get video streaming working for the local and remote players! But wait, there is a catch we missed: the plane rotates along with the tank as it moves! See one of the tilted positions:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AN7enCXKwxSp786SMFNqkNg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AN7enCXKwxSp786SMFNqkNg.png" alt="Tank Moved with Plane"&gt;&lt;/a&gt;&lt;em&gt;Tank Moved with Plane&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We will need another script to fix the rotation:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
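&lt;p&gt;A minimal version of such a rotation fix might look like the following sketch (assumptions: the script is attached to the Plane and simply locks its world rotation every frame; the class name is illustrative):&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using UnityEngine;

// Sketch: keep the video plane's world rotation fixed so it does
// not tilt with the tank. Attach to the Plane game object.
public class FixedRotation : MonoBehaviour
{
    private Quaternion initialRotation;

    void Start()
    {
        // Remember the plane's starting world rotation.
        initialRotation = transform.rotation;
    }

    void LateUpdate()
    {
        // Re-apply it after the tank has moved this frame.
        transform.rotation = initialRotation;
    }
}
&lt;/code&gt;&lt;/pre&gt;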



&lt;p&gt;Build the project, deploy the game to iOS or Android devices, and start playing with a friend! You will see the other person’s face (and yours) on top of the tanks, and you can yell at each other now!&lt;/p&gt;

&lt;h3&gt;
  
  
  So, we are done building a fun project!
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AjKNriRlh4Ly22MMzAgKQAA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AjKNriRlh4Ly22MMzAgKQAA.png" alt="Bear vs Duck in a Tank Battle!"&gt;&lt;/a&gt;&lt;em&gt;Bear vs Duck in a Tank Battle!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The current, complete code is hosted on &lt;a href="https://github.com/icywind/AgoraTanks" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Other Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The complete API documentation is available in the Document Center.&lt;/li&gt;
&lt;li&gt;For technical support, submit a ticket using the Agora Dashboard, or reach out directly to our Developer Relations team at &lt;a href="mailto:devrel@agora.io"&gt;devrel@agora.io&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Come join the Slack community: &lt;a href="https://agoraiodev.slack.com/messages/unity-help-me" rel="noopener noreferrer"&gt;https://agoraiodev.slack.com/messages/unity-help-me&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>unity3d</category>
      <category>agora</category>
      <category>game</category>
      <category>videochat</category>
    </item>
    <item>
      <title>Video Chat with Unity3D and AR Foundation — Chapter 3: Remote Assistant App</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Tue, 07 Apr 2020 05:35:00 +0000</pubDate>
      <link>https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-3-remote-assistant-app-53lm</link>
      <guid>https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-3-remote-assistant-app-53lm</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gaUHl4G7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/an7elid2258j1eg7hs4m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gaUHl4G7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/an7elid2258j1eg7hs4m.png" alt="ar-blog-cover"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Video Chat with Unity3D and AR Foundation — Chapter 3: Remote Assistant App
&lt;/h1&gt;

&lt;p&gt;Previously, we showed two relatively simple sample applications demonstrating how to do video chat with AR Foundation on Unity. Here are the links to the blogs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="%5Bhttps://dev.to/icywind/video-chat-with-unity3d-the-arfoundation-version-51b8%5D(https://dev.to/icywind/video-chat-with-unity3d-the-arfoundation-version-51b8)"&gt;Chapter 1: Video Chat with Unity3D, the AR Foundation Version&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="%5Bhttps://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-2-screensharing-m5i%5D(https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-2-screensharing-m5i)"&gt;Chapter 2: Video Chat with Unity3D and AR Foundation — ScreenSharing&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Questions continue to come up:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;“How do I share AR objects from the sender’s AR camera?”&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;“Also, I don’t want to share my HUD UIs, how do I do that with screen sharing?”&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To address these technical challenges, we will build a more sophisticated application on the basis of what we already know about using Agora’s Video SDK. Some of you may have already seen &lt;a href="https://www.agora.io/en/blog/how-to-build-an-augmented-reality-remote-assistance-app"&gt;this blog about a Remote Assistant App on iOS&lt;/a&gt;. We will build a similar app in Unity3D with AR Foundation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://store.unity.com/"&gt;Unity Editor&lt;/a&gt; (Version 2019.2 or above)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;2 devices to test on (one to broadcast, one to view)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The broadcast device must be a mobile device that can run the AR scene: an Apple device with iOS 11 or above, or an Android device with Android 7 or above.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The viewing device does not need AR capability; pretty much any device running Windows, macOS, Android, or iOS will work.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A &lt;a href="https://sso.agora.io/en/signup"&gt;developer account&lt;/a&gt; with Agora.io&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;To start, we need to integrate the Agora Video SDK for Unity3D into our project, either by searching for it in the Unity Asset Store or by clicking this &lt;a href="https://assetstore.unity.com/packages/tools/video/agora-video-sdk-for-unity-134502"&gt;link&lt;/a&gt; to begin the download.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--x3Uuxou7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lkom1iae9ogukisigo4s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--x3Uuxou7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lkom1iae9ogukisigo4s.png" alt="Screen Shot 2021-04-30 at 11.26.17 AM"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Figure 0. Video SDK on Asset Store&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This project should be compatible with the latest SDK. In case this tutorial falls out of date, you may fall back to the SDK archive &lt;a href="https://apprtcio-my.sharepoint.com/:u:/g/personal/rick_agora_io/EfSQ7gv4JMtBhLPrfHBYsDgBxPM4gwy3iYQbj7ORX61utQ?e=xrqosx"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After you finish downloading and importing the SDK into your project, you should see the README files for the different platforms the SDK supports. You should already be familiar with the demo setup from following the previous tutorials.&lt;/p&gt;

&lt;h2&gt;
  
  
  Unity AR Packages
&lt;/h2&gt;

&lt;p&gt;In the Unity Editor (2019), open the Package Manager from the Window tab and install the following packages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;AR Foundation 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARCore XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARKit XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you use another version of the Unity Editor, please check the README file on GitHub for the verified setups. You may also want to experiment with different versions, since AR Foundation changes significantly between releases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Set Up
&lt;/h2&gt;

&lt;p&gt;Please follow this &lt;a href="https://github.com/icywind/RemoteAssistantAR"&gt;link to download the completed project “RemoteAssistantAR” on Github&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Open the project in the Unity Editor. Go to the Unity Asset Store page to import the Agora Video SDK. When importing, deselect “demo” from the list, since those file names were modified in the GitHub repo. Switch to the iOS or Android platform and follow the README instructions on setting up the build environment for AR Foundation.&lt;/p&gt;

&lt;p&gt;Open up the Main scene, and your project should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2bvYwOur--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/3362/1%2Aai1cBEJZxFt7PxA9Ov2P_w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2bvYwOur--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/3362/1%2Aai1cBEJZxFt7PxA9Ov2P_w.png" alt="Figure 1. Main Scene"&gt;&lt;/a&gt;&lt;em&gt;Figure 1. Main Scene&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Fill your App ID into the field on the GameController object on this screen. You can then build the project for iOS or Android to run on a device. Just as in the previous chapter, you will need a mobile device to run as the broadcaster and another mobile or desktop device (including the Unity Editor) to run as the audience.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Architecture
&lt;/h2&gt;

&lt;p&gt;The RemoteAssistantAR project consists of three scenes. Here are their descriptions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Main&lt;/strong&gt; — the entry screen with buttons to open the two screens for different purposes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AudPlay&lt;/strong&gt; — the audience client, in which the user sees the remote client’s video stream on the main screen; the user can choose a color and draw on the screen.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;CastAR&lt;/strong&gt; — the AR camera client, in which the user sees the real world on the back camera, mixed with AR objects; the user casts what he/she sees to the audience client.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This picture shows the relationship among the scenes:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4P3iunZ9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A5Gz_a5uKktJFIGIYhWqbGA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4P3iunZ9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A5Gz_a5uKktJFIGIYhWqbGA.png" alt="Figure 2. Scenes"&gt;&lt;/a&gt;&lt;em&gt;Figure 2. Scenes&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Shared Interfaces
&lt;/h3&gt;

&lt;p&gt;The AudPlay client and the CastAR client share the programming interface &lt;em&gt;IVideoChatClient&lt;/em&gt;. As you can see from Figure 2, the two clients also share a common user interface in the prefab &lt;em&gt;ChatCanvasUI&lt;/em&gt;, as depicted below:&lt;/p&gt;
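&lt;p&gt;Conceptually, the shared contract looks something like the following sketch; the member names are assumptions for illustration, and the repo’s actual IVideoChatClient may declare different members:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Illustrative sketch of a shared video-chat client contract.
// The actual IVideoChatClient in the repo may differ.
public interface IVideoChatClient
{
    void Join(string channel);  // join an Agora channel
    void Leave();               // leave the channel and clean up
}
&lt;/code&gt;&lt;/pre&gt;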

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3e5Rqwdg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A7-ibHlU01HbGC1hy-UmzBQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3e5Rqwdg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A7-ibHlU01HbGC1hy-UmzBQ.png" alt="Figure 3. ChatCanvasUI Prefab"&gt;&lt;/a&gt;&lt;em&gt;Figure 3. ChatCanvasUI Prefab&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Color Controller
&lt;/h3&gt;

&lt;p&gt;The color controller is an additional UI control that only the AudPlay client contains. The user may tap the ink droplet to get a list of colors to pick from. The following picture shows the structure of this prefab:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bPRBLboa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2490/1%2A14nfErUVM1kb9g4t-hZLYA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bPRBLboa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2490/1%2A14nfErUVM1kb9g4t-hZLYA.png" alt="Figure 4.1 Color Controller"&gt;&lt;/a&gt;&lt;em&gt;Figure 4.1 Color Controller&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  BroadCast View Controller
&lt;/h3&gt;

&lt;p&gt;The BroadCast View Controller (or “BroadcastVC”) is, for the most part, the equivalent of Chapter 2’s “ARClient”. You may go back to read about how to use an external video source to share video frames with the receiver. However, in this controller we capture the AR camera image in a different way: we get the raw image data from a &lt;em&gt;RenderTexture&lt;/em&gt;, in this method:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
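&lt;p&gt;In outline, the idea is to read the RenderTexture contents back into a &lt;em&gt;Texture2D&lt;/em&gt; on the CPU and extract the raw bytes. A hedged sketch, with assumed names such as &lt;em&gt;casterRenderTexture&lt;/em&gt; and &lt;em&gt;bufferTexture&lt;/em&gt; (not necessarily the project’s actual identifiers):&lt;/p&gt;

```csharp
// Sketch: read raw RGBA bytes back from a RenderTexture on the CPU.
// "casterRenderTexture" and "bufferTexture" are assumed names.
Texture2D bufferTexture;

byte[] GetRenderTextureBytes(RenderTexture casterRenderTexture)
{
    if (bufferTexture == null)
    {
        bufferTexture = new Texture2D(casterRenderTexture.width,
            casterRenderTexture.height, TextureFormat.RGBA32, false);
    }

    // Temporarily make the RenderTexture the active target, copy its pixels
    RenderTexture prev = RenderTexture.active;
    RenderTexture.active = casterRenderTexture;
    bufferTexture.ReadPixels(
        new Rect(0, 0, casterRenderTexture.width, casterRenderTexture.height), 0, 0);
    bufferTexture.Apply();
    RenderTexture.active = prev;

    // Raw bytes that can be fed to the external video source
    return bufferTexture.GetRawTextureData();
}
```

&lt;p&gt;The copy through &lt;em&gt;ReadPixels&lt;/em&gt;() happens on the CPU, which is what makes the bytes available to the SDK’s external video source.&lt;/p&gt;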


&lt;p&gt;&lt;strong&gt;RenderTexture&lt;/strong&gt; is the key to answering the questions posed at the beginning of this tutorial.&lt;/p&gt;

&lt;h2&gt;
  
  
  An Approach to Share AR Screen
&lt;/h2&gt;

&lt;p&gt;So far, we’ve discovered two ways to share screens. The first way is to share the entire screen, as if you were continuously capturing your device’s screen and sending it. This is shown in the blog &lt;a href="https://www.agora.io/en/blog/how-to-broadcast-your-screen-with-unity3d-and-agora"&gt;“How to Broadcast Your Screen with Unity3D and Agora.io”&lt;/a&gt;. The second way was discussed in my previous tutorial: the view from the AR Camera was shared without AR objects. What we want for this application is in between — we want the AR camera and the AR objects (call it the “AR Screen”) but not the HUD/UI elements. There could be different ways to achieve this. One way is the use of &lt;strong&gt;RenderTexture&lt;/strong&gt;. In summary, it follows these steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create multiple cameras that use different culling masks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign everything under the Canvas UI to the “UI” layer.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign the AR objects, in code, to another layer, e.g., “ARLayer” (here I picked the built-in “&lt;em&gt;Ignore Raycast&lt;/em&gt;” layer for convenience; see Figure 4.2 and the DrawDot function in Figure 7.3).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign the “Target Texture” field of the AR Camera to a Render Texture.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign the same Render Texture to another camera (“Render Camera”) that will render the AR objects.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Material to use the Render Texture.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Quad to use the Material from the last step.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a camera (“View Camera”) to render the quad texture.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
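&lt;p&gt;The camera wiring in the steps above can be sketched in code. This is a hypothetical setup script (the real project configures most of this in the Inspector), with illustrative names like &lt;em&gt;casterRenderTexture&lt;/em&gt;:&lt;/p&gt;

```csharp
// Hypothetical setup sketch; the project assigns these in the Inspector.
public class ARScreenSetup : MonoBehaviour
{
    public Camera arCamera;       // physical back-camera feed
    public Camera renderCamera;   // draws the AR objects
    public RenderTexture casterRenderTexture;

    void Start()
    {
        // AR Camera and Render Camera both output to the shared texture,
        // so their combined image ends up in casterRenderTexture.
        arCamera.targetTexture = casterRenderTexture;
        renderCamera.targetTexture = casterRenderTexture;

        // Render Camera draws only the AR-object layer ("Ignore Raycast"
        // in this project), never the UI layer.
        renderCamera.cullingMask = LayerMask.GetMask("Ignore Raycast");

        // The View Camera is left rendering to the screen; it looks at
        // the quad whose material samples casterRenderTexture.
    }
}
```

&lt;p&gt;Because the culling mask excludes the “UI” layer, nothing under the Canvas ends up in the shared texture.&lt;/p&gt;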

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yXK_mJhC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A-ixo4yuDA1FlMy08Yjkaeg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yXK_mJhC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A-ixo4yuDA1FlMy08Yjkaeg.png" alt="Figure 4.2 Layer"&gt;&lt;/a&gt;&lt;em&gt;Figure 4.2 Layer&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Cameras
&lt;/h3&gt;

&lt;p&gt;Three cameras are used in the CastAR scene. They are laid out in the following wireframe:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Qf3r0ByJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2APAr9Cf93nhmVAteGPb9H9A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Qf3r0ByJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2APAr9Cf93nhmVAteGPb9H9A.png" alt="Figure 5. Wireframe of the Cameras"&gt;&lt;/a&gt;&lt;em&gt;Figure 5. Wireframe of the Cameras&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AR Camera&lt;/strong&gt;: Takes the device’s physical back camera as input and outputs each frame to a RenderTexture, e.g., &lt;em&gt;CasterRenderTexture&lt;/em&gt;. Figure 6.1 illustrates the important parameters for this camera:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CMWZquXL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AXFEV4BmFllt2Thx9iURZpw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CMWZquXL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AXFEV4BmFllt2Thx9iURZpw.png" alt="Figure 6.1 — AR Camera"&gt;&lt;/a&gt;&lt;em&gt;Figure 6.1 — AR Camera&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Render Camera&lt;/strong&gt;: Renders the 3D AR objects. One may ask: why can’t the AR Camera be used for 3D object rendering? The reason is that the AR Camera is a physical camera; as we saw in the last tutorial, the 3D objects are not captured by this camera. Our Render Camera is placed at the same world position as the AR Camera, and its output is also sent to the same RenderTexture, &lt;em&gt;CasterRenderTexture&lt;/em&gt;. See Figure 6.2:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6qlVAtEG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Agg9l-7v0bIR5OfvEvChYVw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6qlVAtEG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2Agg9l-7v0bIR5OfvEvChYVw.png" alt="Figure 6.2 — Render Camera"&gt;&lt;/a&gt;&lt;em&gt;Figure 6.2 — Render Camera&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View Camera&lt;/strong&gt;: Looks at the quad, which uses a material that “reads” from the shared RenderTexture &lt;em&gt;CasterRenderTexture&lt;/em&gt;. This camera provides the actual “view” for the user of this device. Note that since both &lt;em&gt;AR Camera&lt;/em&gt; and &lt;em&gt;Render Camera&lt;/em&gt; render to the RenderTexture, the image is not rendered to the screen itself. That’s why the &lt;em&gt;View Camera&lt;/em&gt; comes into play. Moreover, the positions of &lt;em&gt;View Camera&lt;/em&gt; and &lt;em&gt;Quad&lt;/em&gt; are kept separate from &lt;em&gt;AR Camera&lt;/em&gt; and &lt;em&gt;Render Camera&lt;/em&gt;, so that the &lt;em&gt;Quad&lt;/em&gt; object does not fall into the &lt;em&gt;Render Camera&lt;/em&gt;’s view.&lt;/p&gt;

&lt;p&gt;From the Hierarchy view in Figure 6.2, we can also see that &lt;em&gt;View Camera&lt;/em&gt; parents &lt;em&gt;Quad&lt;/em&gt;, so &lt;em&gt;Quad&lt;/em&gt; turns and changes position with the camera’s movement. The same parent-child relationship applies between &lt;em&gt;AR Camera&lt;/em&gt; and both &lt;em&gt;View Camera&lt;/em&gt; and &lt;em&gt;Render Camera&lt;/em&gt;. Their view angles and positions are kept in sync this way.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qTZArsMt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AS7vovQmKxuxXZ6tUqbjSvg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qTZArsMt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AS7vovQmKxuxXZ6tUqbjSvg.png" alt="Figure 6.3 — View Camera"&gt;&lt;/a&gt;&lt;em&gt;Figure 6.3 — View Camera&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The following diagram illustrates the relationships between the cameras and the client views:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yuW6dK81--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AwKI8Bvndx4opcfgLL7iqkQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yuW6dK81--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AwKI8Bvndx4opcfgLL7iqkQ.png" alt="Figure 6.4 — Relationship Diagram"&gt;&lt;/a&gt;&lt;em&gt;Figure 6.4 — Relationship Diagram&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Next, we will discuss the details of the Remote Assistant part.&lt;/p&gt;

&lt;h2&gt;
  
  
  Remote Assistant
&lt;/h2&gt;

&lt;p&gt;The idea behind the remote assistant app is that a field operator may be out in an area working on something, but needs guidance from a helper elsewhere on what to look at. The &lt;em&gt;CastAR&lt;/em&gt; client is used by the field operator in this scenario, and the helper uses the &lt;em&gt;AudPlay&lt;/em&gt; client to draw an outline around the important part on the screen. Technically, we will do this in the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Capture the finger motion on &lt;em&gt;AudPlay&lt;/em&gt; client’s screen and record the touch points.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Send the touch points to &lt;em&gt;CastAR&lt;/em&gt; client.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;CastAR&lt;/em&gt; client receives the data points, and renders the outline in its 3D space.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Data model
&lt;/h3&gt;

&lt;p&gt;We will use a small sphere to represent a dot, and the user’s dragging motion will create a series of small colored spheres. We will establish the model for our remote drawing data:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
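&lt;p&gt;A minimal sketch of such a model, with assumed field names (the project’s actual class may differ):&lt;/p&gt;

```csharp
using UnityEngine;

// Assumed data model for a batch of remote-drawing points.
// JsonUtility requires the [Serializable] attribute and public fields.
[System.Serializable]
public class DrawmarkModel
{
    public float[] color;    // RGBA components of the chosen ink color
    public Vector3[] points; // touch points, converted to viewport space
}
```

&lt;p&gt;Unity’s &lt;em&gt;JsonUtility.ToJson&lt;/em&gt;() turns an instance into the JSON string sent over the data stream, and &lt;em&gt;JsonUtility.FromJson&lt;/em&gt;() recovers it on the receiving end.&lt;/p&gt;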


&lt;p&gt;Unlike the iOS version of this app, we don’t need to worry about the offset of screen points for different devices. We will convert the points from &lt;em&gt;Screen Space&lt;/em&gt; to &lt;em&gt;Viewport Space&lt;/em&gt; for transport. &lt;a href="https://answers.unity.com/questions/168156/screen-vs-viewport-what-is-the-difference.html"&gt;This page&lt;/a&gt; will help you refresh your memory about the different coordinate systems in Unity. A point from a screen touch is “normalized” in this line:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Vector3 vp = Camera.main.ScreenToViewportPoint(screenPos)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;We use a buffer list to accumulate the touch points, which limits the number of data stream calls; each call adds a bit of overhead, so batching helps performance.&lt;/p&gt;
&lt;h3&gt;
  
  
  Data Stream
&lt;/h3&gt;

&lt;p&gt;On the &lt;em&gt;AudPlay&lt;/em&gt; client side, one of the initialization steps in &lt;em&gt;AudienceVC&lt;/em&gt; involves the creation of the data stream, in the following line:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dataStreamId = rtcEngine.CreateDataStream(reliable: true, ordered: true);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;The &lt;em&gt;dataStreamId&lt;/em&gt; is later used in the send stream message method as follows:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
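&lt;p&gt;In essence, the call serializes the accumulated points to JSON and passes them along with the stream ID. A minimal sketch, with assumed model and variable names:&lt;/p&gt;

```csharp
// Sketch: send a batch of accumulated drawing points as JSON over the
// Agora data stream. "DrawmarkModel" and "SendDrawmark" are assumed names.
void SendDrawmark(DrawmarkModel drawmark)
{
    string json = JsonUtility.ToJson(drawmark);
    rtcEngine.SendStreamMessage(dataStreamId, json);
}
```

&lt;p&gt;Because the stream was created with &lt;em&gt;reliable&lt;/em&gt; and &lt;em&gt;ordered&lt;/em&gt; set to true, the points arrive in the order they were drawn.&lt;/p&gt;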



&lt;p&gt;On the &lt;em&gt;CastAR&lt;/em&gt; client side, we dedicate a game object, “DrawListener”, to listening to the data stream and processing the requests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ngCjbW7z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ACUTyNhcrXzVCcwBJ1q-8WQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ngCjbW7z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ACUTyNhcrXzVCcwBJ1q-8WQ.png" alt="Figure 7.1 — DrawListener"&gt;&lt;/a&gt;&lt;em&gt;Figure 7.1 — DrawListener&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The &lt;em&gt;DrawListener’s&lt;/em&gt; controller script &lt;em&gt;RemoteDrawer&lt;/em&gt; registers a callback to handle the data stream event:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rtcEngine.OnStreamMessage += HandleStreamMessage;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;The data passed between the two ends is a JSON-formatted string. We invoke this method to draw the dots in the &lt;em&gt;CastAR&lt;/em&gt; client’s 3D world:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
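&lt;p&gt;A hedged sketch of the receiving side, assuming a serializable payload with &lt;em&gt;float[] color&lt;/em&gt; and &lt;em&gt;Vector3[] points&lt;/em&gt; fields (the callback signature varies by SDK version, and names like &lt;em&gt;drawDepth&lt;/em&gt; are illustrative):&lt;/p&gt;

```csharp
// Assumed payload shape for the remote-drawing message
[System.Serializable]
public class DrawmarkModel { public float[] color; public Vector3[] points; }

void HandleStreamMessage(uint userId, int streamId, string data)
{
    // The payload is a JSON-formatted string
    DrawmarkModel dm = (DrawmarkModel)JsonUtility.FromJson(data, typeof(DrawmarkModel));
    Color color = new Color(dm.color[0], dm.color[1], dm.color[2], dm.color[3]);
    foreach (Vector3 vp in dm.points)
    {
        DrawDot(vp, color);
    }
}

void DrawDot(Vector3 viewportPos, Color color)
{
    // Clone the reference sphere under the same parent container
    GameObject dot = Instantiate(referenceObject, referenceObject.transform.parent);
    Renderer rend = dot.GetComponent(typeof(Renderer)) as Renderer;
    rend.material.color = color;
    dot.transform.position = DeNormalizedPosition(viewportPos);
    dot.SetActive(true);
}

Vector3 DeNormalizedPosition(Vector3 vp)
{
    // Use the Render Camera: it is the one rendering the AR 3D objects
    Vector3 screenPos = renderCamera.ViewportToScreenPoint(vp);
    screenPos.z = drawDepth; // assumed hardcoded z distance (a known limitation)
    return renderCamera.ScreenToWorldPoint(screenPos);
}
```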



&lt;p&gt;Here the &lt;em&gt;referenceObject&lt;/em&gt; is the sphere. It shares the same parent as the container of the dots to be drawn. The &lt;em&gt;DeNormalizedPosition&lt;/em&gt;() function does the opposite of what we did previously in &lt;em&gt;Normalize&lt;/em&gt;():&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;camera.ScreenToWorldPoint(pos);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The important thing here is to use the correct camera for the viewport-space conversion. Since the Render Camera is responsible for rendering the AR 3D objects, it is the camera used for the conversion.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Linking all the information above should give you a better understanding of the design and the technical implementation of the RemoteAssistantAR project in Unity.&lt;/p&gt;

&lt;p&gt;Some known issues here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;We skipped the “Undo” feature that exists in the iOS sample project; instead, a “Clear” button clears all the drawing at once.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The drawn outline may not “stay” on the object if the camera keeps moving around. This is due to the hardcoded z-depth for the drawing objects. A possible fix is to use a raycast to determine the Z position.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;There is a lag after both devices join a channel before the video streaming starts.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Video recording of the app (through iOS controls or desktop Quicktime Player) causes longer lags.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There is definitely plenty of room for improvement here, and the multi-camera solution for AR screen sharing is not the only valid approach. Please let me know if you have any suggestions, or even better, open a pull request with your modifications on GitHub!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Other Resources&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The complete API documentation is available in the &lt;a href="https://docs.agora.io/en/"&gt;Document Center&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For technical support, submit a ticket using the &lt;a href="https://dashboard-v1.agora.io/show-ticket-submission"&gt;Agora Dashboard&lt;/a&gt; or reach out directly to our Developer Relations team at &lt;a href="mailto:devrel@agora.io"&gt;devrel@agora.io&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Come join the Slack community: &lt;a href="https://agoraiodev.slack.com/messages/unity-help-me"&gt;https://agoraiodev.slack.com/messages/unity-help-me&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ar</category>
      <category>unity3d</category>
      <category>agora</category>
      <category>videochat</category>
    </item>
    <item>
      <title>Video Chat with Unity3D and AR Foundation — Chapter 2: ScreenSharing</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Fri, 03 Apr 2020 23:03:18 +0000</pubDate>
      <link>https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-2-screensharing-m5i</link>
      <guid>https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-2-screensharing-m5i</guid>
      <description>&lt;h1&gt;
  
  
  Video Chat with Unity3D and AR Foundation — Chapter 2: ScreenSharing
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TKyUSphT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/1%2AMzcYxJNHlzZ20z3u4_iP4w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TKyUSphT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/1%2AMzcYxJNHlzZ20z3u4_iP4w.png" alt="background"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Many of you may have read my previous blog about the simple modification to the demo app to incorporate AR Foundation. If not, you may go back and &lt;a href="https://dev.to/icywind/video-chat-with-unity3d-the-arfoundation-version-51b8"&gt;read the story&lt;/a&gt;, and use it as the setup for this new project. The most frequently asked question from the previous tutorial may have been:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;“How do I share my AR camera screen?”&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In this tutorial, we will focus on how to use an external video source to stream the AR camera view to a remote user. To get the best from AR Foundation, we will use the latest stable version of the Unity Editor.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://store.unity.com/"&gt;Unity Editor&lt;/a&gt; (Version 2019.2 or above)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;2 devices to test on (one to broadcast, one to view)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Broadcast device will be a mobile device to run AR scene: Apple device of iOS 11 or above; Android device with API level Android 7 or above.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Viewing device does not need AR capability: pretty much any device with Windows, Mac, Android, or iOS operating systems will work&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A &lt;a href="https://sso.agora.io/en/signup"&gt;developer account&lt;/a&gt; with Agora.io&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;To start, we will need to integrate the Agora Video SDK for Unity3D into our project by searching for it in the Unity Asset Store or clicking this &lt;a href="https://assetstore.unity.com/packages/tools/video/agora-video-sdk-for-unity-134502"&gt;link&lt;/a&gt; to begin the download. Note that the current SDK version is archived &lt;a href="https://apprtcio-my.sharepoint.com/:u:/g/personal/rick_agora_io/EfSQ7gv4JMtBhLPrfHBYsDgBxPM4gwy3iYQbj7ORX61utQ?e=xrqosx"&gt;here&lt;/a&gt;, in case a future SDK release has a different layout.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yi9T-XeA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2464/1%2ACzedOTuVyM0yUpIRYphNwA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yi9T-XeA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2464/1%2ACzedOTuVyM0yUpIRYphNwA.png" alt="Video SDK on Asset Store"&gt;&lt;/a&gt;&lt;em&gt;Video SDK on Asset Store&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;After you finish downloading and importing the SDK into your project, you should see the README files for the different platforms the SDK supports. You should already be familiar with the demo setup from following the previous tutorials.&lt;/p&gt;

&lt;h2&gt;
  
  
  Unity AR Packages
&lt;/h2&gt;

&lt;p&gt;On UnityEditor (2019), open Package Manager from the Window tab. Install the following packages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;AR Foundation 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARCore XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARKit XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Project Set Up
&lt;/h2&gt;

&lt;p&gt;Please follow this &lt;a href="https://github.com/icywind/ARUnityClient"&gt;link to download the completed project “ARUnityClient” on Github&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Open the project in the Unity Editor. Go to the Asset Store page to import the Agora Video SDK. When importing it, deselect “demo” from the list, since those file names are modified in the Github repo. Switch to the iOS or Android platform and follow the README instructions on how to set up the build environment for AR Foundation.&lt;/p&gt;

&lt;p&gt;Open up the &lt;em&gt;SceneHome&lt;/em&gt; scene, your project should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--STljYLqY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2718/1%2A0suiApgz2Lk-2X5u0uw1uw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--STljYLqY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2718/1%2A0suiApgz2Lk-2X5u0uw1uw.png" alt="Home Scene"&gt;&lt;/a&gt;&lt;em&gt;Home Scene&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Again, the project is based on the original demo that comes with the SDK. Scenes and scripts are modified to run the screen sharing. You should fill in your App ID in the field of &lt;em&gt;GameController&lt;/em&gt; before running the demo.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Architecture
&lt;/h2&gt;

&lt;p&gt;The ARUnityClient consists of three scenes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Home&lt;/strong&gt; — the entry screen with buttons to open the two screens for different purposes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;HelloVideo&lt;/strong&gt; — the 1-to-1 chat client, in which the remote user is shown on the big screen while the local user is in the small image box.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ARClient&lt;/strong&gt; — the big screen shows the real-world environment with AR objects: a sphere for anchoring, and a cube that hosts the remote user’s video.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This picture shows their relationship:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xBHNHvDB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AUL5padN4nnHHpsSLK42u4g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xBHNHvDB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AUL5padN4nnHHpsSLK42u4g.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The application will require two devices to run. There are two scenarios:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Both devices use “1-to-1 Chat”.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;One device uses “1-to-1 Chat”, while the other uses “AR Camera”.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;=&amp;gt; You may not want to run “AR Camera” on both devices.&lt;/p&gt;

&lt;p&gt;We will focus on Scenario 2.&lt;/p&gt;

&lt;h2&gt;
  
  
  Shared Interface
&lt;/h2&gt;

&lt;p&gt;The AR Camera client and the 1-to-1 client both implement the &lt;em&gt;IVideoChatClient&lt;/em&gt; interface.&lt;/p&gt;
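&lt;p&gt;As a rough idea, such an interface could look like the sketch below. The method names here are assumptions for illustration, not necessarily the project’s exact signatures:&lt;/p&gt;

```csharp
// Hypothetical sketch of a shared video-chat client interface.
// Method names are assumed; consult the project source for the real ones.
public interface IVideoChatClient
{
    void LoadEngine(string appId);   // create the RtcEngine instance
    void Join(string channel);       // join an Agora channel
    void Leave();                    // leave the channel
    void UnloadEngine();             // destroy the engine on exit
}
```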

&lt;p&gt;&lt;em&gt;TestUnityARClient.cs&lt;/em&gt; and &lt;em&gt;TestHelloVideo.cs&lt;/em&gt; implement their versions of the client on this common interface. The user’s choice on the Home screen creates an instance of the corresponding client and loads the respective scene.&lt;/p&gt;

&lt;h2&gt;
  
  
  AR Camera Client
&lt;/h2&gt;

&lt;p&gt;AR Camera client’s &lt;em&gt;join(channel)&lt;/em&gt; method is implemented this way:&lt;/p&gt;
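&lt;p&gt;Roughly, and with assumed names (handler wiring omitted), the method combines the capture configuration discussed below with the usual Agora join flow:&lt;/p&gt;

```csharp
// Hedged sketch of the AR client's Join method (names assumed).
public void Join(string channel)
{
    // Configure capture to use the rear camera with automatic quality,
    // and declare that frames will come from an external video source.
    CameraCapturerConfiguration config = new CameraCapturerConfiguration();
    config.preference = CAPTURER_OUTPUT_PREFERENCE.CAPTURER_OUTPUT_PREFERENCE_AUTO;
    config.cameraDirection = CAMERA_DIRECTION.CAMERA_REAR;
    mRtcEngine.SetCameraCapturerConfiguration(config);
    mRtcEngine.SetExternalVideoSource(true, false);

    // Standard join steps
    mRtcEngine.EnableVideo();
    mRtcEngine.EnableVideoObserver();
    mRtcEngine.JoinChannel(channel, null, 0);
}
```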

&lt;p&gt;Notice that the following lines are different from what we wrote in the previous tutorial:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;         //mRtcEngine.EnableLocalVideo(false);
         CameraCapturerConfiguration config = new CameraCapturerConfiguration();
         config.preference =      CAPTURER_OUTPUT_PREFERENCE.CAPTURER_OUTPUT_PREFERENCE_AUTO;
         config.cameraDirection = CAMERA_DIRECTION.CAMERA_REAR;
         mRtcEngine.SetCameraCapturerConfiguration(config);
         mRtcEngine.SetExternalVideoSource(true, false);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The code snippet above does the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Sets the automatic video quality preference.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Uses the rear camera for capturing video.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sends the video from an external video source. This means the stream won’t be taken directly off the camera; instead, we will supply the raw video data manually.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So for the video stream on the AR Camera client, two steps of work are done continuously:&lt;/p&gt;

&lt;p&gt;A. Capture the image from AR camera&lt;/p&gt;

&lt;p&gt;B. Send the image raw data to the external video source API&lt;/p&gt;

&lt;p&gt;The source code is listed in &lt;em&gt;TestUnityARClient.cs&lt;/em&gt;. We will discuss the important parts of the code that address these steps.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step A — AR Foundation Camera Image capture&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The Unity documentation provides very insightful help on &lt;a href="https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@3.1/manual/cpu-camera-image.html"&gt;how to retrieve the data from AR Camera on CPU.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first thing you will need is access to the &lt;strong&gt;ARCameraManager&lt;/strong&gt;. Make sure the component is added to the AR Camera instance in the scene; in our example, this is the &lt;em&gt;ARClient&lt;/em&gt; scene. We call the instance of the ARCameraManager “cameraManager” in code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--aA2O-OnC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AbBNWqkccqmmT6kYoIGSnvA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--aA2O-OnC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AbBNWqkccqmmT6kYoIGSnvA.png" alt="ARCameraManager"&gt;&lt;/a&gt;&lt;em&gt;ARCameraManager&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Next, you will need to register a delegate so that a function is invoked to collect the raw data from the image on each camera frame.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cameraManager.frameReceived += OnCameraFrameReceived;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;On each frame received from the AR camera, we use &lt;em&gt;cameraManager&lt;/em&gt; to get the latest image, extract the raw binary data into a byte array, and lastly send it to the external video source. The complete function snippet follows:&lt;/p&gt;
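&lt;p&gt;For reference, here is a hedged sketch of such a frame handler, using the AR Foundation 3.x CPU-image API (type and method names vary between AR Foundation versions, and helper names like &lt;em&gt;PushFrame&lt;/em&gt; are assumptions):&lt;/p&gt;

```csharp
// Hedged sketch of the per-frame capture (AR Foundation 3.x names).
unsafe void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
{
    XRCameraImage image;
    if (!cameraManager.TryGetLatestImage(out image))
        return;

    var conversionParams = new CameraImageConversionParams
    {
        inputRect = new RectInt(0, 0, image.width, image.height),
        outputDimensions = new Vector2Int(image.width, image.height),
        outputFormat = TextureFormat.BGRA32,
        transformation = CameraImageTransformation.MirrorY
    };

    int width = image.width;
    int height = image.height;
    int size = image.GetConvertedDataSize(conversionParams);
    byte[] bytes = new byte[size];

    // Copying through a raw memory pointer is what makes this "unsafe"
    fixed (byte* ptr = bytes)
    {
        image.Convert(conversionParams, new System.IntPtr(ptr), size);
    }
    image.Dispose(); // always release the camera image

    // Hand the raw bytes to the external video source as a coroutine
    StartCoroutine(PushFrame(bytes, width, height));
}
```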

&lt;p&gt;Note that the line that converts the raw data into binary bytes is actually accessing a memory pointer for the copy operation. Therefore it is an “unsafe” operation in this context:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image.Convert(conversionParams, new System.IntPtr(buffer.GetUnsafePtr()), buffer.Length);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;For this reason, the &lt;em&gt;CaptureARBuffer&lt;/em&gt;() function requires the “unsafe” modifier. It also requires “Allow unsafe code” to be enabled in the project settings. See this picture:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rTKhi0R1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AF_iNG_b-ouqE1eXB_GsLvQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rTKhi0R1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AF_iNG_b-ouqE1eXB_GsLvQ.png" alt="Allow unsafe code"&gt;&lt;/a&gt; &lt;em&gt;Allow unsafe code&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The last line in this function runs &lt;em&gt;PushFrame&lt;/em&gt;() as a coroutine. The PushFrame coroutine takes the raw image bytes and sends them out as the external video source. Note that in the callback delegate we dispose of the image and the buffer to clean up the memory.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step B — External Video Source
&lt;/h3&gt;

&lt;p&gt;Using an external video source is a concept you may also find in another Agora demo app on screen sharing — &lt;a href="https://www.agora.io/en/blog/how-to-broadcast-your-screen-with-unity3d-and-agora"&gt;How to Broadcast Your Screen with Unity3D and Agora.io.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The code is similar for the most part. Here is the snippet from TestUnityARClient.cs:&lt;/p&gt;

&lt;p&gt;In summary, this function creates and populates an instance of the &lt;em&gt;ExternalVideoFrame&lt;/em&gt; data structure, then sends the data using the &lt;em&gt;PushVideoFrame&lt;/em&gt;() API.&lt;/p&gt;
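&lt;p&gt;A hedged sketch of such a PushFrame coroutine, based on the external-video API of the Agora Unity SDK of that era (field, enum, and rotation values may differ between SDK versions and capture formats):&lt;/p&gt;

```csharp
// Hedged sketch: wrap raw bytes in an ExternalVideoFrame and push it.
IEnumerator PushFrame(byte[] bytes, int width, int height)
{
    yield return new WaitForEndOfFrame();

    var externalVideoFrame = new ExternalVideoFrame();
    externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
    // The pixel format must match the conversion done at capture time
    externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
    externalVideoFrame.buffer = bytes;
    externalVideoFrame.stride = width;
    externalVideoFrame.height = height;
    externalVideoFrame.rotation = 90; // assumed portrait-orientation adjustment
    externalVideoFrame.timestamp = 0;

    IRtcEngine rtc = IRtcEngine.QueryEngine();
    if (rtc != null)
    {
        rtc.PushVideoFrame(externalVideoFrame);
    }
}
```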

&lt;h3&gt;
  
  
  Testing
&lt;/h3&gt;

&lt;p&gt;Please refer to the previous tutorial or the SDK Readme file for the other project settings for AR applications. You should be able to build the project and run it on your devices without any code changes.&lt;/p&gt;

&lt;p&gt;The following picture illustrates the two screens for the test.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---B9VCGEL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ABaLo3LMk00DWpfEfvzYtVw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---B9VCGEL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2ABaLo3LMk00DWpfEfvzYtVw.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  All Done!
&lt;/h2&gt;

&lt;p&gt;Thank you for following along. It is very straightforward, isn’t it? If you have any questions, please feel free to leave a comment, DM me, or join our Slack channel &lt;a href="https://agoraiodev.slack.com/messages/unity-help-me"&gt;agoraiodev.slack.com/messages/unity-help-me&lt;/a&gt;. Another chapter is coming, in which we will build an even more complex application on AR Foundation. See you next time!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Other Resources&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://dev.to/icywind/video-chat-with-unity3d-and-ar-foundation-chapter-3-remote-assistant-app-53lm"&gt;Next AR Foundation Chapter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The complete API documentation is available in the &lt;a href="https://docs.agora.io/en/"&gt;Document Center&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;For technical support, submit a ticket using the &lt;a href="https://dashboard-v1.agora.io/show-ticket-submission"&gt;Agora Dashboard&lt;/a&gt; or reach out directly to our Developer Relations team at &lt;a href="mailto:devrel@agora.io"&gt;devrel@agora.io&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>unity3d</category>
      <category>ar</category>
      <category>screensharing</category>
      <category>agora</category>
    </item>
    <item>
      <title>Video Chat with Unity3D, the ARFoundation Version</title>
      <dc:creator>Rick Cheng</dc:creator>
      <pubDate>Thu, 02 Apr 2020 00:53:47 +0000</pubDate>
      <link>https://dev.to/icywind/video-chat-with-unity3d-the-arfoundation-version-51b8</link>
      <guid>https://dev.to/icywind/video-chat-with-unity3d-the-arfoundation-version-51b8</guid>
      <description>&lt;h1&gt;
  
  
  Video Chat with Unity3D, the ARFoundation Version
&lt;/h1&gt;

&lt;p&gt;Many of you took the first steps in creating your very own video chat app using our &lt;a href="https://medium.com/agora-io/how-to-create-a-video-chat-app-in-unity-26780b479a78"&gt;How To: Create a Video Chat App in Unity&lt;/a&gt;. Now that you’ve got that covered, let’s take your app up a couple of notches by adding an immersive experience. In this Augmented Reality (AR) tutorial, you’ll learn how to communicate and chat with a friend in AR. This quick-start guide is very similar to the previous tutorial; we just need to make a few changes, and it will work in less than an hour.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://store.unity.com/"&gt;Unity Editor&lt;/a&gt; (Version 2018 or above)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;2 devices to test on (one to broadcast, one to view)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The broadcast device will be a mobile device that runs the AR scene: an Apple device with iOS 11 or above, or an Android device with Android 7 or above&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The viewing device will run the standard video chat demo app; pretty much any Windows, Mac, Android, or iOS device will work&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A &lt;a href="https://sso.agora.io/en/signup"&gt;developer account&lt;/a&gt; with Agora.io&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;To start, we will integrate the Agora Video SDK for Unity3D into our project by searching for it in the Unity Asset Store, or by clicking this &lt;a href="https://assetstore.unity.com/packages/tools/video/agora-video-sdk-for-unity-134502"&gt;link&lt;/a&gt; to begin the download. Note that the current SDK version is archived &lt;a href="https://apprtcio-my.sharepoint.com/:u:/g/personal/rick_agora_io/EfSQ7gv4JMtBhLPrfHBYsDgBxPM4gwy3iYQbj7ORX61utQ?e=xrqosx"&gt;here&lt;/a&gt;, in case a future SDK release has a different layout.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DQ5ntKOa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AZtV2gDGGDwvTPbf7khIbKw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DQ5ntKOa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2AZtV2gDGGDwvTPbf7khIbKw.png" alt="Video SDK on Asset Store" width="800" height="478"&gt;&lt;/a&gt;&lt;em&gt;Video SDK on Asset Store&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;After you finish downloading and importing the SDK into your project, you should be able to see the README.md files for the different platforms the SDK supports. For your convenience, you can also access the quick start tutorials for each platform below.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Android&lt;/strong&gt;- &lt;a href="https://medium.com/agora-io/run-video-chat-within-your-unity-application-android-add6949f6078"&gt;Here&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;iOS&lt;/strong&gt;- &lt;a href="https://medium.com/agora-io/run-video-chat-within-your-unity-application-ios-425db335a325"&gt;Here&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Unity AR Packages
&lt;/h2&gt;

&lt;p&gt;In the Unity Editor, open the Package Manager from the Window menu and install the following packages:&lt;/p&gt;

&lt;p&gt;For Unity 2018:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;AR Foundation 1.0.0 — preview.22 (the latest for 1.0.0)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARCore XR Plugin 1.0.0 — preview.24 (the latest for 1.0.0)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARKit XR Plugin 1.0.0-preview.27 (the latest for 1.0.0)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For Unity 2019:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;AR Foundation 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARCore XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ARKit XR Plugin 3.0.1&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
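
&lt;p&gt;Alternatively, for Unity 2019 the same packages can be declared directly in the project’s Packages/manifest.json file. Here is a sketch using the versions listed above (other entries in your manifest are omitted):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "dependencies": {
    "com.unity.xr.arfoundation": "3.0.1",
    "com.unity.xr.arcore": "3.0.1",
    "com.unity.xr.arkit": "3.0.1"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The Package Manager will resolve and download these the next time the project is opened.&lt;/p&gt;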

&lt;h2&gt;
  
  
  Modify the Existing Project
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Modify the play scene
&lt;/h3&gt;

&lt;p&gt;Open up the TestSceneHelloVideo scene. Take out the Cube and Cylinder. Delete the Main Camera since we will use an AR Camera later.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hCCZH5Pa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2394/1%2AkeM29t-Tux4kv0iiKaUTig.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hCCZH5Pa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2394/1%2AkeM29t-Tux4kv0iiKaUTig.png" alt="Test Scene — before" width="880" height="702"&gt;&lt;/a&gt;&lt;em&gt;Test Scene — before&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;On the Hierarchy panel, create an “AR Session” and an “AR Session Origin”. Click on the AR Camera and change its tag to “Main Camera”, then create a Sphere 3D object. Modify its transform position to (0, 0, 5.67) so that it is visible in the Editor’s Game view, and save the scene.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gbsbhIlv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2392/1%2AHf_Hrx3Wg7usbcUHC8_LuQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gbsbhIlv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/2392/1%2AHf_Hrx3Wg7usbcUHC8_LuQ.png" alt="Test Scene — after" width="880" height="710"&gt;&lt;/a&gt;&lt;em&gt;Test Scene — after&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Unlike the Cube or Cylinder, this sphere exists purely for positional reference. You will find it in your AR view when running on a mobile device, and we will add the video view objects relative to the sphere’s position in code.&lt;/p&gt;

&lt;h3&gt;
  
  
  Modify Test Scene Script
&lt;/h3&gt;

&lt;p&gt;Open TestHelloUnityVideo.cs and change the onUserJoined() method to generate a cube instead of a plane. We will also add a function that provides a new position for each remote user joining the chat.&lt;/p&gt;
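
&lt;p&gt;As a rough sketch (the exact callback signature may vary with your SDK version, and GetNextPosition() is a hypothetical helper), the modified handler could look like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Called when a remote user joins the channel
private void onUserJoined(uint uid, int elapsed)
{
    // Create a cube instead of a plane to host the remote video
    GameObject go = GameObject.CreatePrimitive(PrimitiveType.Cube);
    go.name = uid.ToString();
    go.transform.position = GetNextPosition();

    // Bind this remote user's video stream to the cube
    VideoSurface videoSurface = go.AddComponent(typeof(VideoSurface)) as VideoSurface;
    videoSurface.SetForUser(uid);
    videoSurface.SetEnable(true);
}

private int mUserCount = 0;

// Hypothetical helper: space each new user's cube out beside
// the reference sphere placed at (0, 0, 5.67)
private Vector3 GetNextPosition()
{
    mUserCount++;
    return new Vector3(1.2f * mUserCount, 0f, 5.67f);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;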

&lt;p&gt;In the Join() method, add the following line in the “enable video” section:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mRtcEngine.EnableLocalVideo(false);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This call disables the front camera so that it does not conflict with the back camera, which is used by the AR camera on the device.&lt;/p&gt;

&lt;p&gt;Last but not least, fill in your APP ID for the variable declared at the beginning of the TestHelloUnityVideo class.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you haven’t already, go to &lt;a href="https://www.agora.io/en/"&gt;Agora.io&lt;/a&gt;, log in, and get an APP ID (it’s free). This will give you connectivity to the global Agora Real-time Communication network and allow you to broadcast across your living room or around the world. Your first 10,000 minutes are free every month.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Build Project
&lt;/h3&gt;

&lt;p&gt;The configuration for building an ARFoundation enabled project is slightly different from the standard demo project.&lt;/p&gt;

&lt;p&gt;Here is a quick checklist of things to set:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;iOS:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Rendering Color Space = Linear&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Graphics API = Metal&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Architecture = ARM64&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Target Minimum iOS Version = 11.0&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A unique bundle id&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Android&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Graphics API = GLES3&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Multithreaded Rendering = off&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Minimum API Level = Android 7.0 (API level 24)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a new key store in the Publishing Settings&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
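
&lt;p&gt;If you prefer to apply these settings from a script rather than clicking through Player Settings, here is a hedged editor-menu sketch (PlayerSettings API names as of Unity 2018/2019; they may differ in later versions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

public static class ARBuildSettings
{
    [MenuItem("Tools/Apply AR Build Settings")]
    public static void Apply()
    {
        // Shared: linear rendering color space
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // iOS: Metal, ARM64, minimum iOS 11
        PlayerSettings.SetGraphicsAPIs(BuildTarget.iOS,
            new GraphicsDeviceType[] { GraphicsDeviceType.Metal });
        PlayerSettings.SetArchitecture(BuildTargetGroup.iOS, 1); // 1 = ARM64
        PlayerSettings.iOS.targetOSVersionString = "11.0";

        // Android: GLES3, multithreaded rendering off, API level 24
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new GraphicsDeviceType[] { GraphicsDeviceType.OpenGLES3 });
        PlayerSettings.SetMobileMTRendering(BuildTargetGroup.Android, false);
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel24;
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The bundle ID and Android keystore still need to be set by hand, since they are unique to your project.&lt;/p&gt;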

&lt;p&gt;Now build the application for either iOS or Android. For the remote users, run the standard demo application on any of the four platforms we discussed at the beginning of this tutorial. To test your demo, stand up and use the device to look around; you should find the sphere. A joining remote user’s video will be placed on a cube next to the sphere.&lt;/p&gt;

&lt;p&gt;Great job! You’ve built a simple AR world of video chatters!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WzJcl6Kk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A-ZzMOVC2xh9tvEP4PCzbqA.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WzJcl6Kk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/2000/1%2A-ZzMOVC2xh9tvEP4PCzbqA.gif" alt="The ARFoundation Demo" width="600" height="338"&gt;&lt;/a&gt;&lt;em&gt;The ARFoundation Demo&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  All Done!
&lt;/h2&gt;

&lt;p&gt;Thank you for following along. If you have any questions, please feel free to leave a comment, DM me, or join our Slack channel &lt;a href="https://agoraiodev.slack.com/messages/unity-help-me"&gt;agoraiodev.slack.com/messages/unity-help-me&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Note that there will be another two chapters of the AR Foundation Video Chat tutorial with deeper content. Stay in touch!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Other Resources&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The complete API documentation is available in the &lt;a href="https://docs.agora.io/en/"&gt;Document Center&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For technical support, submit a ticket using the &lt;a href="https://dashboard-v1.agora.io/show-ticket-submission"&gt;Agora Dashboard&lt;/a&gt; or reach out directly to our Developer Relations team at &lt;a href="mailto:devrel@agora.io"&gt;devrel@agora.io&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>unity3d</category>
      <category>agora</category>
      <category>videochat</category>
      <category>rtc</category>
    </item>
  </channel>
</rss>
