Reality 2.0. That's what I've been calling Augmented Reality for the last half a year. Why is that?
I honestly believe that Augmented Reality is already moving out of the "gimmicky" tech stage and into a stage where businesses seriously consider adding AR experiences for consumers. Location-based AR, casual games, retail, real estate. You name it. AR is not yet ready in smart glasses or "Black Mirror"-style contact lenses, but we have it on our phones with the help of ARKit and ARCore.
How it works
Now, if you are not familiar with the latter, ARKit and ARCore are frameworks by Apple and Google respectively that bring AR tech to any modern phone. In a nutshell, they provide basic environment understanding and motion tracking that enable developers not only to render 3D content in the real world but to "anchor" it to recognized surfaces. For instance, pointing your phone camera at your office desk, you can recognize the desk as a surface and render a 3D model on it. The cool part is that, thanks to motion tracking, when you walk around the model it appears to stay in the same spot in the physical world where you rendered it. This lets you create all kinds of experiences that bring real-world context into your app, or vice versa.
React Native is the future of mobile apps
React Native has revolutionized mobile app development in recent years, and naturally there is demand for ARKit and ARCore solutions among developers specializing in React Native. The standard way to create AR solutions is to use the Unity3D game engine, which offers a pretty straightforward workflow: developers get an excellent visual editor where they can work with 3D content, and the code they write is cross-platform, compatible with both iOS and Android.
But what if you need to embed AR content in your existing React Native app? Or what if AR is just one part of a larger app written in React Native? There are several solutions. In this blog post we won't talk about embedding Unity content into a React Native app (which is also possible), but about developing simple AR experiences using React components. We will do so with a framework called ViroReact.
Creating AR content in Viro
ViroReact gives you a rendering engine powered by ARKit and ARCore, and supplies a bunch of React components and events that you can use to lay out your AR app. In addition, it has all the features described in the image below.
Getting started
To get started, go to the Viro website and sign up to get an API key. Follow the quick start guide there to get up and running with a simple Hello World app.
In a nutshell, it is as simple as running:
npm install -g react-viro-cli
react-viro init MyAwesomeApp
When you do that, you will have a basic app that you can run using the Viro testbed app. You will need to add your API key to the App.js file, though.
Alternatively, if you want to run it natively on your device, you can run ./setup-ide.sh
This script will set up your development environment for iOS, Android, or both. You can read more about it here for Xcode, or here for Android.
Building blocks
The first and most essential building block of any AR scene is the <ViroARSceneNavigator> component. This is the component you pass your apiKey and initialScene props to. Take a look at this example:
<ViroARSceneNavigator
  apiKey="1839C275-6929-45AF-B638-EF2DEE44C1D9"
  numberOfTrackedImages={this.props.numberOfTrackedImages || 1}
  initialScene={{
    scene: this.props.navigation.state.params.screenType === "portal" ? ARPortalsScene : MarkerScreen,
    passProps: this.props.navigation.state.params
  }}
/>
Here we specify the AR scene to be either the ARPortalsScene scene or the MarkerScreen scene. Basically, we will be creating this experience:
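The scene-selection ternary can be isolated into a tiny helper; note that `pickInitialScene` is purely illustrative (it is not part of the sample app), with plain objects standing in for the real scene components:

```javascript
// Illustrative helper mirroring the initialScene ternary above.
// These plain objects stand in for the real ARPortalsScene / MarkerScreen components.
const ARPortalsScene = { name: 'ARPortalsScene' };
const MarkerScreen = { name: 'MarkerScreen' };

function pickInitialScene(screenType) {
  // "portal" routes to the portal experience; anything else shows markers
  return screenType === 'portal' ? ARPortalsScene : MarkerScreen;
}
```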
The marker screen is the screen where we recognize a marker and show text data for every marker, with a slight animation. The portal screen is self-descriptive: it's the one where you've seen a doorway into a VR world, from which you can still view the real world.
The portal
Let's start with the portal scene. Before the portal loads, you've seen a gray quad rendered on the ground. This quad is created with the ViroARPlaneSelector component, which gives us the ability to select a recognized surface; while a surface is selected, the component renders its children on that surface.
So the code for rendering the portal will look like this:
<ViroARPlaneSelector dragType="FixedToWorld">
  <ViroPortalScene passable={true} dragType="FixedDistance" onDrag={() => {}}>
    <ViroPortal scale={[.5, .5, .5]}>
      <Viro3DObject
        onLoadStart={() => { alert("Loading portal") }}
        source={require('../res/portal_wood_frame.vrx')}
        resources={[
          require('../res/portal_wood_frame_diffuse.png'),
          require('../res/portal_wood_frame_normal.png'),
          require('../res/portal_wood_frame_specular.png')
        ]}
        type="VRX"
      />
    </ViroPortal>
    <Viro360Image source={{ uri: data.portals_by_pk.portalMedia360 }} />
  </ViroPortalScene>
</ViroARPlaneSelector>
As you can see, we use helper components provided by Viro to load both a .fbx model (.vrx is an FBX format extension passed through the Viro compression script) and a 360° image (it can also be a video).
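The texture files above follow a diffuse/normal/specular naming convention. A throwaway helper like the following shows the pattern; it is purely illustrative, because in the real app the require() calls must stay static (the Metro bundler cannot resolve dynamic paths):

```javascript
// Illustrative only: builds the texture filenames that accompany a .vrx model.
// The real app must hard-code static require() calls instead.
function textureFilesFor(modelName) {
  return ['diffuse', 'normal', 'specular'].map(
    kind => `${modelName}_${kind}.png`
  );
}
```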
The marker
The marker uses the ViroARImageMarker helper component to render specific data for every marker. The code is as follows.
return (
  <ViroARImageMarker
    target={"businessCard"}
    onAnchorFound={() => this.setState({ runAnimation: true })}
  >
    <ViroNode key="card" onTouch={() => alert("twitter")}>
      <ViroNode
        opacity={0}
        position={[0, -0.02, 0]}
        animation={{
          name: 'animateImage',
          run: this.state.runAnimation
        }}
      >
        <ViroFlexView
          rotation={[-90, 0, 0]}
          height={0.03}
          width={0.05}
          style={styles.card}
        >
          <ViroFlexView style={styles.cardWrapper}>
            <ViroImage
              height={0.015}
              width={0.015}
              style={styles.image}
              source={{ uri: markerData.user.avatarUrl }}
            />
            <ViroText
              textClipMode="None"
              text={`${markerData.user.name} ${markerData.user.lastname}`}
              scale={[.015, .015, .015]}
              style={styles.textStyle}
            />
          </ViroFlexView>
          <ViroFlexView style={styles.subText}>
            <ViroText
              width={0.01}
              height={0.01}
              textAlign="left"
              textClipMode="None"
              text={`@${markerData.user.twitterProfile}`}
              scale={[.01, .01, .01]}
              style={styles.textStyle}
            />
            <ViroAnimatedImage
              height={0.01}
              width={0.01}
              loop={true}
              source={require('../res/tweet.gif')}
            />
          </ViroFlexView>
        </ViroFlexView>
      </ViroNode>
      <ViroNode
        opacity={0}
        position={[0, 0, 0]}
        animation={{
          name: 'animateViro',
          run: this.state.runAnimation
        }}
      >
        {markerData.bottomBar && (
          <ViroText
            text={markerData.bottomBar}
            rotation={[-90, 0, 0]}
            scale={[.01, .01, .01]}
            style={styles.textStyle}
          />
        )}
      </ViroNode>
    </ViroNode>
  </ViroARImageMarker>
)
Here you can see we are using a bunch of other components, such as ViroNode for grouping content and ViroFlexView, which lets us lay out text in the AR world using flexbox. We also have sliding animations, and so on.
What about the data
You've probably already noticed that we don't have hardcoded values; in fact, the actual data comes from a GraphQL API that was auto-generated with Hasura.
Hasura GraphQL
If this is the first time you've heard about Hasura, I strongly suggest checking out this blog post about it here or here.
In a nutshell, Hasura can auto-generate a GraphQL API from your Postgres DB, either a new or an existing one. It gives you an impressive management console for your data, GraphQL API, auth, access control, business logic, and more. It provides various options for adding your business logic, whether you prefer a more synchronous approach or are ready to step forward and use the 3factor.app architecture for your modern apps. You can read more about adding your business logic here:
What's crucial for us is to get real-time data into our AR world as well. Let's start by creating our backend.
Creating backend with Hasura
Head over to hasura.io and use whichever method of deploying the engine you prefer; I will use Heroku. After deploying the engine, you will be presented with an empty Hasura console, where we will start adding our data.
Head to the Data tab and create the following tables:
- model_resources - holds resources (textures and such) for 3D models
- models - holds 3D model .obj URLs, scaling, and lighting models, and has a relationship to the model_resources table
- markers - holds data for our markers and has relationships to the users and models tables
- portals - holds data for our portals
- users - we will have a dummy login in this app, but you can implement proper authentication by checking out the following blog post:
After you set all of this up in the newly created engine, Hasura will auto-generate a GraphQL API for you, including really performant subscriptions, which we will be using in our AR app.
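As a rough sketch of the nested shape those relationships give you, the function below mimics how a markers row resolves its related user into one object, the way the generated API returns it. The field names (`userId`, `bottomBar`) are assumptions for illustration, not the exact schema:

```javascript
// Illustrative sketch of how the markers -> users relationship resolves:
// given a marker row and a user lookup table, produce the nested object
// shape that Hasura's generated GraphQL API would return.
function resolveMarker(markerRow, usersById) {
  return {
    id: markerRow.id,
    bottomBar: markerRow.bottomBar,
    // the relationship nests the related user row, or null if absent
    user: usersById[markerRow.userId] || null,
  };
}
```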
Setup your client
Now it's time to head to our Viro app, remove all the example content, and start building a basic React Native app with GraphQL. If you are new to React Native and GraphQL, I strongly suggest checking out the learn.hasura.io tutorial on React Native.
Our AppScreen code in App.js turns from the plain ViroSample into something like this:
const mkWsLink = (uri) => {
  const splitUri = uri.split('//');
  const subClient = new SubscriptionClient(
    'wss://' + splitUri[1],
    { reconnect: true }
  );
  return new WebSocketLink(subClient);
}

const wsLink = mkWsLink(GRAPHQL_ENDPOINT)
const httpLink = new HttpLink({ uri: GRAPHQL_ENDPOINT });

const link = split(
  // split based on operation type
  ({ query }) => {
    const { kind, operation } = getMainDefinition(query);
    return kind === 'OperationDefinition' && operation === 'subscription';
  },
  wsLink,
  httpLink
);

// Creating a client instance
const client = new ApolloClient({
  link,
  cache: new InMemoryCache({
    addTypename: false
  })
});

export default class AppScreen extends React.Component {
  render() {
    return (
      <ApolloProvider client={client}>
        <RootNavigator
          client={client}
          session={this.props.sessionInfo}
        />
      </ApolloProvider>
    )
  }
}
As you can see, we have ApolloProvider, which sets our Apollo client on the React context so that it can be accessed through the Subscription component later on.
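The split predicate only returns true for subscription operations, routing them over the WebSocket link and everything else over HTTP. Stripped of Apollo, the decision looks like this sketch, which assumes a pre-parsed document shaped like graphql-js output:

```javascript
// Sketch of the transport-split decision used in the Apollo link above:
// subscriptions go over WebSocket, queries and mutations over HTTP.
// Assumes a parsed AST with graphql-js style definitions.
function usesWebSocket(documentAst) {
  const def = documentAst.definitions.find(
    d => d.kind === 'OperationDefinition'
  );
  return !!def && def.operation === 'subscription';
}
```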
Navigation
Let's take a look at our RootNavigator:
const AppStack = createBottomTabNavigator({
  Markers: createStackNavigator({
    Home: Home,
    ARScreen: ARScreen
  }),
  Portals: createStackNavigator({
    Portals: PortalsList,
    ARPortalScreen: ARScreen
  })
});

const AuthStack = createStackNavigator({ SignIn: SignIn });

export default createAppContainer(createSwitchNavigator(
  {
    AuthLoading: AuthLoadingScreen,
    App: AppStack,
    Auth: AuthStack,
    Upload
  },
  {
    initialRouteName: 'AuthLoading',
  }
));
As you can see, our initial screen will be the AuthLoading screen, but both the Markers tab and the Portals tab have a stack containing a purely React Native screen with a list of markers/portals, plus the ARScreen.
We've already seen the ARScreen; it looks like this:
getARView() {
  return this.state.arEnabled ? (
    <ViroARSceneNavigator
      apiKey={API_KEY}
      numberOfTrackedImages={this.props.numberOfTrackedImages || 1}
      initialScene={{
        scene: this.props.navigation.state.params.screenType === "portal" ? ARPortalsScene : MarkerScreen,
        passProps: this.props.navigation.state.params
      }}
    />
  ) : null
}
render() {
  return (
    <SafeAreaView style={styles.container}>
      {this.getARView()}
      <View style={styles.bottomTab}>
        <Button text="Go Back" onPress={this._goBack} />
      </View>
    </SafeAreaView>
  )
}
Subscriptions work in both AR and mobile world
Now the only thing left to do is head to MarkerScreen.js and wrap our marker with a Subscription. The MarkerScreen render function looks like this:
render() {
  return (
    <ViroARScene onTrackingUpdated={this._onInitialized}>
      <ViroDirectionalLight
        color="#777777"
        direction={[0, -1, -2]}
        shadowOrthographicPosition={[0, 8, -5]}
        shadowOrthographicSize={10}
        shadowNearZ={2}
        shadowFarZ={9}
        lightInfluenceBitMask={2}
        castsShadow={true}
      />
      {this.state.isTracking ? this.getARScene() : this.getNoTrackingUI()}
    </ViroARScene>
  );
}
Basically, we wrap our scene in the ViroARScene component to bring AR capabilities to React Native, leveraging ViroReact and the ARKit/ARCore frameworks.
We've seen previously what the marker code looks like, so now the only thing left to do is wrap it with a Subscription:
const MARKER_DETAILS = gql`
  subscription fetchMarker($markerId: uuid!) {
    markers_by_pk(id: $markerId) {
      user {
        name
        lastname
        twitterProfile
        avatarUrl
      }
      bottomBar
    }
  }
`;
getARScene() {
  return (
    <ViroNode>
      <Subscription subscription={MARKER_DETAILS} variables={{ markerId: this.props.id }}>
        {({ loading, error, data }) => {
          if (error) {
            alert("Error", "Could not fetch marker");
            return null;
          }
          if (loading) {
            return (
              <LoadingComponent text="Loading Marker" />
            )
          }
          const markerData = data.markers_by_pk
          if (this.props.is3d) {
            return this.render3d()
          }
          return (
            <ViroARImageMarker
              target={"businessCard"}
              onAnchorFound={() => this.setState({ runAnimation: true })}
            >
              {/* rest of marker code */}
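The subscription payload maps straight into the template literals rendered by the marker. Isolated from the JSX, that mapping looks like the sketch below; `markerDisplay` is a hypothetical helper for illustration, not code from the app:

```javascript
// Hypothetical helper mirroring the template literals in the marker JSX:
// turns the subscription payload (markers_by_pk) into the rendered strings.
function markerDisplay(markerData) {
  const { user } = markerData;
  return {
    fullName: `${user.name} ${user.lastname}`,
    handle: `@${user.twitterProfile}`,
  };
}
```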
The result will look as follows:
Summary
As you have seen in this blog post, adding AR content to your React Native app is not rocket science and can be done pretty easily with a basic understanding of 3D environments. This AR content can be connected to a GraphQL API just like a regular React Native app, and Hasura makes this step really easy and straightforward.
I haven't covered all the code from this sample app, but it's available here for you to check out.
In the last few months I've also given several talks on this topic, featuring this demo app and explaining more about the AR parts, so check out this talk from DevDays Europe:
Slides are available here
If you want to learn more about combining AR and GraphQL, I will be teaching an AR and GraphQL workshop at the Chain React 2019 conference, so if you are interested in a more guided, hands-on experience, you are welcome to get tickets.
On a final note before concluding this blog post: every week, I or someone from the team at Hasura.io streams something cool on twitch.tv/hasurahq and uploads it to our YouTube channel, so make sure you are aware of our next events. For instance, yesterday I streamed a detailed code overview of the same AR app we've just seen.
If you have any questions, feel free to reach out to me via the contact form on my website, vnovick.com, or by sending me a DM on Twitter: @VladimirNovick.