How to Develop an AR-Based Home Decoration App
Seeing how furniture will look in your home via AR technology is changing the way we decorate. Check out how to develop such an app.
Background
Around half a year ago, I decided to start decorating my new house. Before getting started, I did lots of research on a variety of different topics relating to interior decoration, such as how to choose a consistent color scheme, which measurements to make and how to make them, and how to choose the right furniture. However, my preparations made me realize that no matter how well-prepared you are, you're always going to run into many unexpected challenges.
Before rushing to the furniture store, I listed all the different pieces of furniture that I wanted to place in my living room, including a sofa, tea table, potted plants, dining table, and carpet, and determined the expected dimensions, colors, and styles of these various items of furniture.
However, when I finally got to the furniture store, the dizzying variety of choices left me confused, and I found it very difficult to imagine how the different pieces of furniture would actually look in my own living room. At that moment, a thought came to my mind: wouldn't it be great if there were an app that let users upload images of their home and then freely select different furniture products to see how they'd look in it? Such an app would save users who want to decorate their homes a great deal of time and unnecessary trouble, and reduce the risk of their being dissatisfied with the final result.
That's when the idea of developing such an app myself came to mind. My initial idea was to design an app that would help people quickly visualize what furniture would look like in their homes. The basic workflow is as follows: users first upload one or more images of the room they want to decorate and set a reference parameter, such as the distance between the floor and the ceiling. Armed with this information, the app automatically calculates the dimensions of other areas in the room. Users then add images of furniture they like to a virtual shopping cart, specifying the dimensions of each piece when uploading. From the editing screen, users can drag and drop furniture from the shopping cart onto the image of the room to preview the effect. But then a problem arose: furniture dragged and dropped into the room looked pasted on and did not blend naturally with its surroundings.
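To make that initial idea concrete, here is a quick back-of-the-envelope sketch of the reference-based scaling it relied on: one known real-world measurement (the floor-to-ceiling distance) yields a meters-per-pixel factor for the uploaded photo, which is then applied to everything else. All numbers and names are illustrative; this predates the AR-based approach described next.

    // Illustrative only: estimate real-world sizes in a room photo from one reference measurement.
    double realCeilingHeightMeters = 2.6;   // reference distance entered by the user
    double ceilingHeightPixels = 1040.0;    // the same distance measured in the uploaded photo
    double metersPerPixel = realCeilingHeightMeters / ceilingHeightPixels;  // 0.0025 m per pixel

    // Any other measurement in the photo can now be converted to real-world units.
    double sofaWidthPixels = 750.0;
    double sofaWidthMeters = sofaWidthPixels * metersPerPixel;  // about 1.88 m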
By a stroke of luck, I happened to discover HMS Core AR Engine while looking for a solution to this problem. This development kit provides the ability to integrate virtual objects realistically into the real world, which is exactly what my app needs. With its plane detection capability, my app can detect the real planes in a home and let users place virtual furniture on them; and with its hit test capability, users can interact with virtual furniture to change its position and orientation in a natural manner.
AR Engine tracks illumination, planes, images, objects, surfaces, and other environmental information, allowing apps to integrate virtual objects into the physical world so that they look and behave as they would if they were real. Its plane detection capability identifies groups of feature points on horizontal and vertical planes, as well as the boundaries of those planes, so that your app can place virtual objects on them.
In addition, the kit continuously tracks the location and orientation of the device relative to its surroundings and establishes a unified geometric space between the virtual world and the physical world. Its hit test capability maps the point a user taps on the screen to a point of interest in the real environment: a ray is cast from the device camera through the tapped location, and the intersection between that ray and a detected plane is returned. In this way, users can interact with any virtual object on their device screen.
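As a minimal sketch of that tap-to-plane mapping, the following snippet shows how a screen tap could be resolved to an anchor on a detected plane. It assumes an ARFrame returned by ARSession.update() and the tap's MotionEvent, and it assumes ARHitResult#createAnchor behaves like the usual AR hit-result API; the full version used in this app appears in the Developing Hit Test section below.

    // Sketch: map a screen tap to a point on a real-world plane and anchor it there.
    private ARAnchor anchorFromTap(ARFrame arFrame, MotionEvent tapEvent) {
        for (ARHitResult hit : arFrame.hitTest(tapEvent)) {
            // Prefer hits that land on a detected plane rather than on a sparse point.
            if (hit.getTrackable() instanceof ARPlane) {
                // Create an anchor at the intersection of the tap ray and the plane;
                // a virtual furniture model can then be attached to this anchor.
                return hit.createAnchor();
            }
        }
        return null;  // No plane was hit at this screen position.
    }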
Functions and Features
- Plane detection: Both horizontal and vertical planes are supported.
- Accuracy: The margin of error is around 2.5 cm when the target plane is 1 m away from the camera.
- Texture recognition delay: < 1s
- Supports polygon fitting and plane merging.
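As a small illustration of how these capabilities surface in code, the sketch below separates the planes AR Engine is currently tracking into floor-like and wall-like groups, so furniture can be placed on the former and wall decorations on the latter. It builds on mSession.getAllTrackables(ARPlane.class), which also appears in the render manager below; the PlaneType and TrackingState value names are assumptions that should be checked against the AR Engine reference.

    // Sketch: group tracked planes by orientation (enum value names are assumptions).
    List<ARPlane> floorLikePlanes = new ArrayList<>();
    List<ARPlane> wallLikePlanes = new ArrayList<>();
    for (ARPlane plane : mSession.getAllTrackables(ARPlane.class)) {
        if (plane.getTrackingState() != ARTrackable.TrackingState.TRACKING) {
            continue;  // Skip planes that AR Engine is not currently tracking.
        }
        if (plane.getType() == ARPlane.PlaneType.HORIZONTAL_UPWARD_FACING) {
            floorLikePlanes.add(plane);   // e.g. the floor, for placing a sofa model
        } else if (plane.getType() == ARPlane.PlaneType.VERTICAL_FACING) {
            wallLikePlanes.add(plane);    // e.g. a wall, for hanging a picture frame
        }
    }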
Developing Plane Detection
1. Create a WorldActivity object. This example demonstrates how to use the world AR scenario of AR Engine.
public class WorldActivity extends BaseActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Initialize DisplayRotationManager.
        mDisplayRotationManager = new DisplayRotationManager(this);
        // Initialize WorldRenderManager.
        mWorldRenderManager = new WorldRenderManager(this, this);
    }

    // Create a gesture processor.
    private void initGestureDetector() {
        mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
            ...
        });
        // Forward touch events on the surface view to the gesture detector.
        mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                return mGestureDetector.onTouchEvent(event);
            }
        });
    }

    // Create ARWorldTrackingConfig in the onResume lifecycle.
    @Override
    protected void onResume() {
        super.onResume();
        mArSession = new ARSession(this.getApplicationContext());
        mConfig = new ARWorldTrackingConfig(mArSession);
        ...
    }

    // Refresh the session configuration.
    private void refreshConfig(int lightingMode) {
        // Set the focus mode.
        mConfig.setFocusMode(ARConfigBase.FocusMode.AUTO_FOCUS);
        mArSession.configure(mConfig);
    }
}
2. Initialize the WorldRenderManager class, which manages rendering related to world scenarios, including label rendering and virtual object rendering.
public class WorldRenderManager implements GLSurfaceView.Renderer {
    // Draw each frame.
    @Override
    public void onDrawFrame(GL10 unused) {
        // Set the OpenGL texture ID used to store the camera preview stream data.
        mSession.setCameraTextureName(mTextureDisplay.getExternalTextureId());
        // Update the calculation result of AR Engine. You are advised to call this API when your app needs to obtain the latest data.
        ARFrame arFrame = mSession.update();
        // Obtain the camera specifications of the current frame.
        ARCamera arCamera = arFrame.getCamera();
        // Obtain the projection matrix, which is used for the transformation from the camera coordinate system to the clip coordinate system.
        arCamera.getProjectionMatrix(projectionMatrix, PROJ_MATRIX_OFFSET, PROJ_MATRIX_NEAR, PROJ_MATRIX_FAR);
        // Obtain all planes tracked in the current session.
        mSession.getAllTrackables(ARPlane.class);
        ...
    }
}
3. Initialize the VirtualObject class, which provides properties of the virtual object and the necessary methods for rendering the virtual object.
public class VirtualObject {
    ...
}
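The class body is omitted here. As a rough idea of what it might hold, the sketch below stores the anchor returned by the hit test plus the per-object transform and selection state; the field and method names are illustrative assumptions rather than the kit's required structure.

    // Illustrative sketch of a VirtualObject implementation (names are assumptions).
    public class VirtualObject {
        // Anchor from the hit test; ties the furniture model to a real-world plane.
        private ARAnchor mArAnchor;
        // User-applied rotation (degrees) and uniform scale for the model.
        private float mRotation = 0.0f;
        private float mScale = 1.0f;
        // Whether the user currently has this object selected for dragging.
        private boolean mIsSelected = false;

        public VirtualObject(ARAnchor arAnchor) {
            mArAnchor = arAnchor;
        }

        public ARAnchor getAnchor() {
            return mArAnchor;
        }

        public void setSelected(boolean isSelected) {
            mIsSelected = isSelected;
        }

        public boolean isSelected() {
            return mIsSelected;
        }
    }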
4. Initialize the ObjectDisplay class to draw virtual objects based on specified parameters.
public class ObjectDisplay {
    ...
}
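This class body is likewise omitted. The skeleton below only suggests the responsibilities such a renderer typically has, namely one-off shader and buffer setup on the GL thread and a per-frame draw call driven by the camera matrices; the method names and parameters are illustrative assumptions.

    // Illustrative skeleton of an ObjectDisplay renderer (names are assumptions).
    public class ObjectDisplay {
        // Compile shaders and create GL buffers for the furniture model.
        // Call once on the GL thread, for example from onSurfaceCreated().
        public void init(Context context) {
            // ... load model and texture assets, link the shader program ...
        }

        // Draw one virtual object using the view and projection matrices obtained from
        // ARCamera, so it stays registered with the real plane it is anchored to.
        public void onDrawFrame(VirtualObject obj, float[] viewMatrix, float[] projectionMatrix) {
            // ... build the model-view-projection matrix from the anchor pose and issue the draw call ...
        }
    }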
Developing Hit Test
1. Initialize the WorldRenderManager class, which manages rendering related to world scenarios, including label rendering and virtual object rendering.
public class WorldRenderManager implements GLSurfaceView.Renderer {
    // Pass the context.
    public WorldRenderManager(Activity activity, Context context) {
        mActivity = activity;
        mContext = context;
        ...
    }

    // Set ARSession, which updates and obtains the latest data in onDrawFrame.
    public void setArSession(ARSession arSession) {
        if (arSession == null) {
            LogUtil.error(TAG, "setSession error, arSession is null!");
            return;
        }
        mSession = arSession;
    }

    // Set ARWorldTrackingConfig to obtain the configuration mode.
    public void setArWorldTrackingConfig(ARWorldTrackingConfig arConfig) {
        if (arConfig == null) {
            LogUtil.error(TAG, "setArWorldTrackingConfig error, arConfig is null!");
            return;
        }
        mArWorldTrackingConfig = arConfig;
    }

    // Implement the onDrawFrame() method.
    @Override
    public void onDrawFrame(GL10 unused) {
        mSession.setCameraTextureName(mTextureDisplay.getExternalTextureId());
        ARFrame arFrame = mSession.update();
        ARCamera arCamera = arFrame.getCamera();
        ...
    }
    // Output the hit result.
    private ARHitResult hitTest4Result(ARFrame frame, ARCamera camera, MotionEvent event) {
        ARHitResult hitResult = null;
        List<ARHitResult> hitTestResults = frame.hitTest(event);
        for (int i = 0; i < hitTestResults.size(); i++) {
            ARHitResult hitResultTemp = hitTestResults.get(i);
            if (hitResultTemp == null) {
                continue;
            }
            ARTrackable trackable = hitResultTemp.getTrackable();
            // Determine whether the hit point is within the plane polygon.
            boolean isPlanHitJudge = trackable instanceof ARPlane
                && ((ARPlane) trackable).isPoseInPolygon(hitResultTemp.getHitPose());
            // Determine whether the point cloud is tapped and whether the point faces the camera.
            boolean isPointHitJudge = trackable instanceof ARPoint
                && ((ARPoint) trackable).getOrientationMode() == ARPoint.OrientationMode.ESTIMATED_SURFACE_NORMAL;
            // Select points on the plane preferentially.
            if (isPlanHitJudge || isPointHitJudge) {
                hitResult = hitResultTemp;
                if (trackable instanceof ARPlane) {
                    break;
                }
            }
        }
        return hitResult;
    }
}
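For context on how this render manager receives taps: WorldActivity (step 2 below) queues single taps from the gesture detector via setQueuedSingleTaps(). A plausible way to consume that queue inside onDrawFrame() and feed it into hitTest4Result() is sketched here; the queue type, the tracking-state check, and placeVirtualObject() are assumptions for illustration.

    // Sketch: consume queued taps on the GL thread and turn them into anchored objects.
    // mQueuedSingleTaps is assumed to be an ArrayBlockingQueue<MotionEvent> filled by the
    // gesture detector in WorldActivity; placeVirtualObject() is a hypothetical helper.
    private void handleGestureEvent(ARFrame arFrame, ARCamera arCamera) {
        MotionEvent event = mQueuedSingleTaps.poll();
        if (event == null || arCamera.getTrackingState() != ARTrackable.TrackingState.TRACKING) {
            return;  // No pending tap, or the camera pose is not yet reliable.
        }
        ARHitResult hitResult = hitTest4Result(arFrame, arCamera, event);
        if (hitResult != null) {
            // Anchor a piece of virtual furniture at the hit position on the plane.
            placeVirtualObject(hitResult);
        }
    }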
2. Create a WorldActivity object. This example demonstrates how to use the world AR scenario of AR Engine.
public class WorldActivity extends BaseActivity {
    private ARSession mArSession;
    private GLSurfaceView mSurfaceView;
    private ARWorldTrackingConfig mConfig;
    private WorldRenderManager mWorldRenderManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        LogUtil.info(TAG, "onCreate");
        super.onCreate(savedInstanceState);
        setContentView(R.layout.world_java_activity_main);
        ...
        mWorldRenderManager = new WorldRenderManager(this, this);
        mWorldRenderManager.setDisplayRotationManage(mDisplayRotationManager);
        mWorldRenderManager.setQueuedSingleTaps(mQueuedSingleTaps);
    }
    @Override
    protected void onResume() {
        super.onResume();
        if (!PermissionManager.hasPermission(this)) {
            this.finish();
        }
        errorMessage = null;
        if (mArSession == null) {
            try {
                // Check whether AR Engine is installed before creating the session.
                if (!arEngineAbilityCheck()) {
                    finish();
                    return;
                }
                mArSession = new ARSession(this.getApplicationContext());
                mConfig = new ARWorldTrackingConfig(mArSession);
                refreshConfig(ARConfigBase.LIGHT_MODE_ENVIRONMENT_LIGHTING | ARConfigBase.LIGHT_MODE_ENVIRONMENT_TEXTURE);
            } catch (Exception capturedException) {
                setMessageWhenError(capturedException);
            }
            if (errorMessage != null) {
                stopArSession();
                return;
            }
        }
        ...
    }
    @Override
    protected void onPause() {
        LogUtil.info(TAG, "onPause start.");
        super.onPause();
        if (mArSession != null) {
            mDisplayRotationManager.unregisterDisplayListener();
            mSurfaceView.onPause();
            mArSession.pause();
        }
        LogUtil.info(TAG, "onPause end.");
    }

    @Override
    protected void onDestroy() {
        LogUtil.info(TAG, "onDestroy start.");
        if (mArSession != null) {
            mArSession.stop();
            mArSession = null;
        }
        if (mWorldRenderManager != null) {
            mWorldRenderManager.releaseARAnchor();
        }
        super.onDestroy();
        LogUtil.info(TAG, "onDestroy end.");
    }
...
}
Summary
If you've ever done any interior decorating, I'm sure you've wished you could see what furniture would look like in your home without having to purchase it first. After all, most furniture isn't cheap, and delivery and assembly can be quite a hassle. That's why apps that let users place and view virtual furniture in their real homes are real game changers. HMS Core AR Engine can greatly streamline the development of such apps. With its plane detection and hit test capabilities, the kit enables your app to accurately detect planes in the real world and blend virtual objects naturally into it. Beyond virtual home decoration, the kit has a broad range of other applications: for example, you can leverage its capabilities to develop an AR video game, an AR-based teaching app that lets students view historical artifacts in 3D, or an e-commerce app with a virtual try-on feature. Try AR Engine now and explore the possibilities it provides.