Jackson for HMS Core

Precise and Immersive AR for Interior Design

Augmented reality (AR) technologies are increasingly widespread, notably in the field of interior design, because they allow users to visualize real spaces and apply virtual furnishings to them with remarkable ease. HMS Core AR Engine is a must-have for developers creating AR-based interior design apps: it is easy to use, covers all the basics, and considerably streamlines the development process. AR Engine bridges the virtual and real worlds to deliver a brand-new, visually interactive user experience. Its motion tracking capability allows your app to output the real-time 3D coordinates of interior spaces, convert these coordinates between the real and virtual worlds, and use this information to determine the correct position of furniture. With AR Engine integrated, your app can provide users with AR-based interior design features that are easy to use.

As a key component of AR Engine, the motion tracking capability bridges real and virtual worlds, by facilitating the construction of a virtual framework, tracking how the position and pose of user devices change in relation to their surroundings, and outputting the 3D coordinates of the surroundings.

About This Feature

The motion tracking capability provides a geometric link between real and virtual worlds, by tracking the changes of the device's position and pose in relation to its surroundings, and determining the conversion of coordinate systems between the real and virtual worlds. This allows virtual furnishings to be rendered from the perspective of the device user, and overlaid on images captured by the camera.
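
For illustration, the following is a minimal sketch (using assumed names, not taken from the original sample) of how this coordinate conversion is typically used: a 2D tap on the screen is hit-tested against the tracked environment, and the resulting 3D pose in the real world is used to anchor a virtual furnishing.

// A sketch only: convert a 2D screen tap into a 3D position in the real world.
// "arFrame" is the current ARFrame and "tapEvent" is the MotionEvent of the tap.
List<ARHitResult> hitResults = arFrame.hitTest(tapEvent);
for (ARHitResult hit : hitResults) {
    ARTrackable trackable = hit.getTrackable();
    if (trackable instanceof ARPlane
        && ((ARPlane) trackable).isPoseInPolygon(hit.getHitPose())) {
        // The hit lies on a detected plane: anchor a virtual furnishing here, so that
        // AR Engine keeps its real-world position up to date as the device moves.
        ARAnchor anchor = hit.createAnchor();
        break;
    }
}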

For example, in an AR-based car exhibition, virtual cars can be placed precisely in the target position, creating a virtual space that's seamlessly in sync with the real world.

The basic condition for implementing real-virtual interaction is tracking the motion of the device in real time, and updating the status of virtual objects in real time based on the motion tracking results. This means that the precision and quality of motion tracking directly affect the AR effects available on your app. Any delay or error can cause a virtual object to jitter or drift, which undermines the sense of reality and immersion offered to users by AR.

Advantages

Simultaneous localization and mapping (SLAM) 3.0 released in AR Engine 3.0 enhances the motion tracking performance in the following ways:

  • With the 6DoF motion tracking mode, users are able to observe virtual objects in an immersive manner from different distances, directions, and angles.
  • Stability of virtual objects is ensured, thanks to monocular absolute trajectory error (ATE) as low as 1.6 cm.
  • Plane detection takes no longer than one second, facilitating plane recognition and expansion.

Integration Procedure

Logging In to HUAWEI Developers and Creating an App

Integrating the AR Engine SDK

1) Open the project-level build.gradle file in Android Studio, and add the Maven repository (Gradle versions earlier than 7.0 are used as an example).
Go to buildscript > repositories and configure the Maven repository address for the SDK.
Go to allprojects > repositories and configure the Maven repository address for the SDK.

buildscript {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven { url "https://developer.huawei.com/repo/" }
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven { url "https://developer.huawei.com/repo/" }
    }
}
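
If your project uses Gradle 7.0 or later and declares repositories centrally in settings.gradle, the same repository address can be added there instead (a sketch; adjust it to your project's repository mode):

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        mavenCentral()
        // Configure the Maven repository address for the HMS Core SDK.
        maven { url "https://developer.huawei.com/repo/" }
    }
}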

2) Open the app-level build.gradle file in your project, and add a build dependency on the AR Engine SDK.

dependencies {
    implementation 'com.huawei.hms:arenginesdk:3.1.0.1'
}

Code Development

1) Check whether AR Engine has been installed on the current device. If yes, your app can run properly. If not, your app should automatically redirect the user to AppGallery to install AR Engine.

private boolean arEngineAbilityCheck() {
    boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
    if (!isInstallArEngineApk && isRemindInstall) {
        // The user has already been prompted to install AR Engine but has not done so. Exit.
        Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
        finish();
    }
    LogUtil.debug(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
    if (!isInstallArEngineApk) {
        // Redirect the user to AppGallery to install AR Engine.
        startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
        isRemindInstall = true;
    }
    return AREnginesApk.isAREngineApkReady(this);
}
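
A possible call site (a sketch, not from the original sample) is the activity's onResume callback, so that the check runs every time the app returns to the foreground and before the ARSession is created or resumed:

@Override
protected void onResume() {
    super.onResume();
    if (!arEngineAbilityCheck()) {
        // The user has been redirected to AppGallery to install AR Engine.
        return;
    }
    // AR Engine is ready; it is now safe to create or resume the ARSession.
}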

2) Check permissions before running.
Configure the camera permission in the AndroidManifest.xml file.

<uses-permission android:name="android.permission.CAMERA" />

private static final int REQUEST_CODE_ASK_PERMISSIONS = 1;
private static final int MAX_ARRAYS = 10;
private static final String[] PERMISSIONS_ARRAYS = new String[]{Manifest.permission.CAMERA};
List<String> permissionsList = new ArrayList<>(MAX_ARRAYS);
boolean isHasPermission = true;

for (String permission : PERMISSIONS_ARRAYS) {
    if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
        isHasPermission = false;
        break;
    }
}
if (!isHasPermission) {
    for (String permission : PERMISSIONS_ARRAYS) {
        if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
            permissionsList.add(permission);
        }
    }
    ActivityCompat.requestPermissions(activity,
        permissionsList.toArray(new String[permissionsList.size()]), REQUEST_CODE_ASK_PERMISSIONS);
}
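
The user's response can then be handled with the standard onRequestPermissionsResult callback. A minimal sketch, assuming the request code and permission array defined above:

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
    @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == REQUEST_CODE_ASK_PERMISSIONS) {
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            // Camera permission granted; the AR session can be started.
        } else {
            // AR Engine cannot run without the camera permission; exit gracefully.
            finish();
        }
    }
}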

3) Create an ARSession object for motion tracking, and use ARWorldTrackingConfig to configure it.

private ARSession mArSession;
private ARWorldTrackingConfig mConfig;

mArSession = new ARSession(this);
mConfig = new ARWorldTrackingConfig(mArSession);
mConfig.setCameraLensFacing(ARConfigBase.CameraLensFacing.FRONT);   // Set scene parameters by calling mConfig.setXXX.
mConfig.setPowerMode(ARConfigBase.PowerMode.ULTRA_POWER_SAVING);
mArSession.configure(mConfig);
mArSession.resume();
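// Usage note (not in the original snippet): pause the session in onPause() with
// mArSession.pause(), and release it in onDestroy() with mArSession.stop().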


mSession.setCameraTextureName(mTextureDisplay.getExternalTextureId());
ARFrame arFrame = mSession.update();  // Obtain a frame of data from ARSession.

// Set the environment texture probe and mode after the camera is initialized.
setEnvTextureData();
ARCamera arCamera = arFrame.getCamera();  // Obtain ARCamera from ARFrame. ARCamera can then be used for obtaining the camera's projection matrix to render the window.

// The size of the projection matrix is 4 x 4.
float[] projectionMatrix = new float[16];
arCamera.getProjectionMatrix(projectionMatrix, PROJ_MATRIX_OFFSET, PROJ_MATRIX_NEAR, PROJ_MATRIX_FAR);
mTextureDisplay.onDrawFrame(arFrame);
StringBuilder sb = new StringBuilder();
updateMessageData(arFrame, sb);
mTextDisplay.onDrawFrame(sb);

// The size of ViewMatrix is 4 x 4.
float[] viewMatrix = new float[16];
arCamera.getViewMatrix(viewMatrix, 0);
for (ARPlane plane : mSession.getAllTrackables(ARPlane.class)) {    // Obtain all trackable planes from ARSession.

    if (plane.getType() != ARPlane.PlaneType.UNKNOWN_FACING
        && plane.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
        hideLoadingMessage();
        break;
    }
}
drawTarget(mSession.getAllTrackables(ARTarget.class), arCamera, viewMatrix, projectionMatrix);
mLabelDisplay.onDrawFrame(mSession.getAllTrackables(ARPlane.class), arCamera.getDisplayOrientedPose(),
    projectionMatrix);
handleGestureEvent(arFrame, arCamera, projectionMatrix, viewMatrix);
ARLightEstimate lightEstimate = arFrame.getLightEstimate();
ARPointCloud arPointCloud = arFrame.acquirePointCloud();
getEnvironmentTexture(lightEstimate);
drawAllObjects(projectionMatrix, viewMatrix,  getPixelIntensity(lightEstimate));
mPointCloud.onDrawFrame(arPointCloud, viewMatrix, projectionMatrix);

// "event" is the tap gesture event captured by the app's gesture detector.
ARHitResult hitResult = hitTest4Result(arFrame, arCamera, event.getEventSecond());
if (hitResult != null) {
    mSelectedObj.setAnchor(hitResult.createAnchor());  // Create an anchor at the hit position to enable AR Engine to continuously track the position.

}

4) Draw the required virtual object based on the anchor position. The listener below toggles the environment texture mode, so that rendered objects are lit to match the real environment.

mEnvTextureBtn.setOnCheckedChangeListener((compoundButton, b) -> {
    mEnvTextureBtn.setEnabled(false);
    handler.sendEmptyMessageDelayed(MSG_ENV_TEXTURE_BUTTON_CLICK_ENABLE,
            BUTTON_REPEAT_CLICK_INTERVAL_TIME);
    mEnvTextureModeOpen = !mEnvTextureModeOpen;
    if (mEnvTextureModeOpen) {
        mEnvTextureLayout.setVisibility(View.VISIBLE);
    } else {
        mEnvTextureLayout.setVisibility(View.GONE);
    }
    int lightingMode = refreshLightMode(mEnvTextureModeOpen, ARConfigBase.LIGHT_MODE_ENVIRONMENT_TEXTURE);
    refreshConfig(lightingMode);
});
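
To close the loop on this step, here is a minimal sketch (not part of the original sample) of drawing a virtual furnishing at the anchor created earlier. It assumes "anchor" is the ARAnchor attached to the selected object in step 3, "viewMatrix" and "projectionMatrix" are the matrices obtained from ARCamera, and drawFurniture is a hypothetical helper that renders a model with a model-view-projection matrix:

if (anchor.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
    // Convert the anchor's real-world pose into a model matrix.
    float[] modelMatrix = new float[16];
    anchor.getPose().toMatrix(modelMatrix, 0);

    // Combine model, view, and projection matrices (android.opengl.Matrix).
    float[] modelViewMatrix = new float[16];
    float[] mvpMatrix = new float[16];
    Matrix.multiplyMM(modelViewMatrix, 0, viewMatrix, 0, modelMatrix, 0);
    Matrix.multiplyMM(mvpMatrix, 0, projectionMatrix, 0, modelViewMatrix, 0);

    // Hypothetical rendering call: overlays the furnishing on the camera image.
    drawFurniture(mvpMatrix);
}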
