
Jackson for HMS Core

AR Engine and Scene Kit Deliver Virtual Try-On of Glasses

Background

The ubiquity of the Internet and smart devices has made e-commerce the preferred choice of countless consumers. However, many longtime users have grown weary of the unchanging shopping model, so enhancing the user experience is critical to stimulating further growth in e-commerce and attracting a broader user base. HMS Core offers intelligent graphics processing capabilities that identify a user's facial and physical features. Combined with a new display paradigm, these capabilities let users try on products virtually through their mobile phones, for a groundbreaking digital shopping experience.

Scenarios

AR Engine and Scene Kit allow users to virtually try on products found on shopping apps and shopping list sharing apps, which in turn will lead to greater customer satisfaction and fewer returns and replacements.

Effects

A user opens a shopping app and taps a product's picture to view the 3D model of the product, which they can rotate, enlarge, and shrink for interactive viewing.


Getting Started

Configuring the Maven Repository Address for the HMS Core SDK
Open the project-level build.gradle file in your Android Studio project. Go to buildscript > repositories and allprojects > repositories to configure the Maven repository address for the HMS Core SDK.

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Adding Build Dependencies for the HMS Core SDK
Open the app-level build.gradle file of your project. Add the Scene Kit Full-SDK and the AR Engine SDK as build dependencies in the dependencies block.

dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
    implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
}

For details about the preceding steps, please refer to the development guide for Scene Kit on HUAWEI Developers.

Adding Permissions in the AndroidManifest.xml File
Open the AndroidManifest.xml file in the main directory and add the camera permission above the <application> element.

<!--Camera permission-->
<uses-permission android:name="android.permission.CAMERA" />

Development Procedure

Configuring MainActivity

Add two buttons to the layout configuration file of MainActivity. Set the background of the onBtnShowProduct button to the preview image of the product, and add the text "Try it on!" to the onBtnTryProductOn button to guide the user to the feature.

<Button
    android:layout_width="260dp"
    android:layout_height="160dp"
    android:background="@drawable/sunglasses"
    android:onClick="onBtnShowProduct" />

<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Try it on!"
    android:textAllCaps="false"
    android:textSize="24sp"
    android:onClick="onBtnTryProductOn" />

When the user taps the onBtnShowProduct button, the 3D model of the product is loaded. Tapping the onBtnTryProductOn button takes the user to the AR fitting screen.

Configuring the 3D Model Display for a Product​

1. Create a SceneSampleView class that extends SceneView.

public class SceneSampleView extends SceneView {
    public SceneSampleView(Context context) {
        super(context);
    }
    public SceneSampleView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}

Override the surfaceCreated method to create and initialize SceneView. Then call loadScene to load the materials, which should be in the glTF or GLB format, to have them rendered and displayed. Call loadSkyBox to load skybox materials,
loadSpecularEnvTexture to load specular maps, and loadDiffuseEnvTexture to load diffuse maps. These files should be in the DDS (cubemap) format.

All loaded materials are stored in the src > main > assets > SceneView folder.

@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    // Load the materials to be rendered.
    loadScene("SceneView/sunglasses.glb");
    // Call loadSkyBox to load skybox texture materials.
    loadSkyBox("SceneView/skyboxTexture.dds");
    // Call loadSpecularEnvTexture to load specular texture materials.
    loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");
    // Call loadDiffuseEnvTexture to load diffuse texture materials.
    loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");
}
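SceneView also provides built-in touch interaction, which is what lets the user rotate, enlarge, and shrink the model. If you override onTouchEvent in SceneSampleView, delegate to the superclass so this default gesture handling keeps working; a minimal sketch:

```java
@Override
public boolean onTouchEvent(MotionEvent motionEvent) {
    // Delegate to SceneView so its default gesture handling
    // (rotating and scaling the loaded model) still runs.
    return super.onTouchEvent(motionEvent);
}
```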

2. Create a SceneViewActivity that extends Activity.

In the onCreate method, call setContentView and pass it the layout file in which the SceneSampleView you created is declared using an XML tag.

public class SceneViewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sample);
    }
}
<com.huawei.scene.demo.sceneview.SceneSampleView
    android:layout_width="match_parent"
    android:layout_height="match_parent"/>
3. Create an onBtnShowProduct method in MainActivity.

When the user taps the onBtnShowProduct button, SceneViewActivity is called to load, render, and finally display the 3D model of the product.

public void onBtnShowProduct(View view) {
    startActivity(new Intent(this, SceneViewActivity.class));
}

Configuring AR Fitting for a Product
Product virtual try-on is easily accessible, thanks to the facial recognition, graphics rendering, and AR display capabilities offered by HMS Core.
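FaceView relies on the HUAWEI AR Engine service (a separate APK) being present on the device. Before launching the try-on screen, it can be worth checking for it; the sketch below assumes the AREnginesApk helper from the AR Engine SDK (com.huawei.hiar):

```java
// Check whether the HUAWEI AR Engine service APK is installed
// before opening the AR try-on screen.
if (AREnginesApk.isAREngineApkReady(this)) {
    startActivity(new Intent(this, FaceViewActivity.class));
} else {
    // Prompt the user to install AR Engine (available in AppGallery).
    Toast.makeText(this, "Please install HUAWEI AR Engine first.", Toast.LENGTH_LONG).show();
}
```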

1. Create a FaceViewActivity that extends Activity, and create the corresponding layout file.

Create face_view in the layout file to display the try-on effect.

<com.huawei.hms.scene.sdk.FaceView
    android:id="@+id/face_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:sdk_type="AR_ENGINE"></com.huawei.hms.scene.sdk.FaceView>

Create a switch. When the user taps it, they can check the difference between the appearances with and without the virtual glasses.

<Switch
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:id="@+id/switch_view"
    android:layout_alignParentTop="true"
    android:layout_marginTop="15dp"
    android:layout_alignParentEnd="true"
    android:layout_marginEnd="15dp"
    android:text="Try it on"
    android:theme="@style/AppTheme"
    tools:ignore="RelativeOverlap" />
2. Override the onCreate method in FaceViewActivity to obtain the FaceView.
public class FaceViewActivity extends Activity {
    private FaceView mFaceView;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_face_view);
        mFaceView = findViewById(R.id.face_view);
    }
}
3. Create a listener method for the switch. When the switch is turned on, the loadAsset method is called to load the 3D model of the product. Set the anchor position for facial recognition using LandmarkType.
mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        mFaceView.clearResource();
        if (isChecked) {
            // Load materials.
            int index = mFaceView.loadAsset("FaceView/sunglasses.glb", LandmarkType.TIP_OF_NOSE);
        }
    }
});

Use setInitialPose to adjust the size and position of the model. Create the position, rotation, and scale arrays and pass values to them.

final float[] position = { 0.0f, 0.0f, -0.15f };
final float[] rotation = { 0.0f, 0.0f, 0.0f, 0.0f };
final float[] scale = { 2.0f, 2.0f, 0.3f };

Put the following code below the loadAsset line:

mFaceView.setInitialPose(index, position, scale, rotation);
4. Create an onBtnTryProductOn method in MainActivity. When the user taps the onBtnTryProductOn button, FaceViewActivity is launched, enabling the user to view the try-on effect.
// FACE_VIEW_REQUEST_CODE is an app-defined request code for the permission request.
public void onBtnTryProductOn(View view) {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                this, new String[]{ Manifest.permission.CAMERA }, FACE_VIEW_REQUEST_CODE);
    } else {
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
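The snippet above requests the camera permission but does not show what happens after the user responds. A minimal sketch of handling the result in MainActivity (FACE_VIEW_REQUEST_CODE is the same app-defined constant used in the request):

```java
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == FACE_VIEW_REQUEST_CODE
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Camera permission granted: open the AR try-on screen.
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
```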

References

For more details, you can go to:
- AR Engine official website and Scene Kit official website
- Reddit, to join our developer discussion
- GitHub, to download sample code
- Stack Overflow, to solve any integration problems
