by Ayusch Jain
How to build an Augmented Reality Android App with ARCore and Android Studio
This article was originally posted here
In the previous post, I explained what ARCore is and how it helps developers build awesome augmented reality apps without the need to understand OpenGL or Matrix maths.
If you haven’t checked it out yet, I highly recommend doing so before moving ahead with this article and diving into ARCore app development.
Overview
According to Wikipedia, ARCore is a software development kit developed by Google that allows for augmented reality applications to be built.
ARCore uses three key technologies to integrate virtual content with the real environment:
Motion Tracking: It allows the phone to understand its position relative to the world.
Environmental Understanding: It allows the phone to detect the size and location of all types of surfaces: vertical, horizontal, and angled.
Light Estimation: It allows the phone to estimate the environment's current lighting conditions.
Getting Started
To get started with ARCore app development, you first need to enable ARCore in your project. This is simple, as we will be using Android Studio and the Sceneform SDK. There are two major operations Sceneform performs automatically:
Checking for availability of ARCore
Asking for camera permission
You don’t need to bother with these two steps when creating an ARCore app using Sceneform SDK. But you do need to include Sceneform SDK in your project.
Create a new Android Studio project and select an empty activity.
Add the following dependency to your project-level build.gradle file:
dependencies {
    classpath 'com.google.ar.sceneform:plugin:1.5.0'
}
Add the following to your app-level build.gradle file:
implementation "com.google.ar.sceneform.ux:sceneform-ux:1.5.0"
Now sync the project with the Gradle files and wait for the build to finish. This will install the Sceneform SDK into the project and the Sceneform plugin into Android Studio. The plugin lets you view .sfb files, which are the 3D models rendered in your camera view, and it also helps you import, view, and build 3D assets.
Building your first ARCore app
Now with our Android Studio setup complete and Sceneform SDK installed, we can get started with writing our very first ARCore app.
First, we need to add the Sceneform fragment to our layout file. This will be the Scene where we place all our 3D models. It takes care of the camera initialization and permission handling.
Head over to your main layout file (in my case, activity_main.xml) and add the Sceneform fragment:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <fragment
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:id="@+id/ux_fragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
I’ve set the width and height to match_parent, as this will cover my entire activity. You can choose the dimensions according to your requirements.
Compatibility Check
This is all that you need to do in the layout file. Now head over to your Java file (in my case, MainActivity.java) and add the method below to your class:
public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Log.e(TAG, "Sceneform requires Android N or later");
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    String openGlVersionString =
            ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                    .getDeviceConfigurationInfo()
                    .getGlEsVersion();
    if (Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {
        Log.e(TAG, "Sceneform requires OpenGL ES 3.0 or later");
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
                .show();
        activity.finish();
        return false;
    }
    return true;
}
This method checks whether your device can support the Sceneform SDK. The SDK requires Android N (API level 24) or newer, which is what the Build.VERSION_CODES.N check enforces, and OpenGL ES version 3.0 or newer. If a device does not support these, the scene will not be rendered and your application will show a blank screen.
You can, however, still deliver all the other features of your app that don’t require the Sceneform SDK.
Now with the device compatibility check complete, we shall build our 3D model and attach it to the scene.
Adding the assets
You will need to add the 3D models which will be rendered on your screen. Now you can build these models yourself if you are familiar with 3D model creation. Or, you can visit Poly.
There you’ll find a huge repository of 3D assets to choose from. They are free to download. Just credit the creator and you are good to go.
In Android Studio, expand the app folder in the project pane on the left-hand side. You’ll notice a “sampledata” folder. This folder will hold all of your 3D model assets. Create a folder for your model inside the sampledata folder.
When you download the zip file from Poly, you will most probably find three files:
.mtl file
.obj file
.png file
The most important of these three is the .obj file. It is your actual model. Place all three files inside sampledata -> “your model’s folder”.
Now right-click on the .obj file. The first option will be to Import Sceneform Asset. Click on it, leave the default settings unchanged, and just click Finish on the next window. Gradle will sync to include the asset in the assets folder. Once the Gradle build finishes, you are good to go.
You’ve finished importing a 3D asset used by Sceneform in your project. Next, let’s build the asset from our code and include it in the scene.
Building the Model
Add the following code to your MainActivity.java file (or whatever yours is called). Don’t worry, I’ll explain all the code line by line:
private static final String TAG = MainActivity.class.getSimpleName();
private static final double MIN_OPENGL_VERSION = 3.0;

ArFragment arFragment;
ModelRenderable lampPostRenderable;

@Override
@SuppressWarnings({"AndroidApiChecker", "FutureReturnValueIgnored"})
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    if (!checkIsSupportedDeviceOrFinish(this)) {
        return;
    }
    setContentView(R.layout.activity_main);
    arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

    ModelRenderable.builder()
            .setSource(this, Uri.parse("LampPost.sfb"))
            .build()
            .thenAccept(renderable -> lampPostRenderable = renderable)
            .exceptionally(throwable -> {
                Toast toast = Toast.makeText(this, "Unable to load lamp post renderable", Toast.LENGTH_LONG);
                toast.setGravity(Gravity.CENTER, 0, 0);
                toast.show();
                return null;
            });
}
First, we find the arFragment that we included in the layout file. This fragment is responsible for hosting the scene. You can think of it as the container of our scene.
Next, we use the ModelRenderable class to build our model. With the help of the setSource method, we load our model from the .sfb file, which was generated when we imported the asset. The thenAccept method receives the model once it is built, and we store the loaded model in lampPostRenderable.
For error handling, we have the .exceptionally method, which is called in case an exception is thrown.
All this happens asynchronously, so you don’t need to worry about multi-threading or dealing with handlers XD
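The reason this chaining style works is that ModelRenderable.builder().build() hands back a Java 8 CompletableFuture. A minimal plain-Java sketch of the same pattern, with a stand-in string loader instead of the real model builder (the class name and the "renderable:" prefix are my own, purely illustrative):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncLoadSketch {
    static String loadedModel; // plays the role of lampPostRenderable

    public static String run() {
        CompletableFuture<Void> done = CompletableFuture
                .supplyAsync(() -> "LampPost.sfb")                  // stands in for reading the .sfb asset
                .thenApply(name -> "renderable:" + name)            // "build" the renderable off the main thread
                .thenAccept(renderable -> loadedModel = renderable) // success path, like thenAccept above
                .exceptionally(throwable -> {                       // error path, like .exceptionally above
                    System.out.println("Unable to load renderable");
                    return null;
                });
        done.join(); // demo only; in Sceneform the callback simply fires when loading finishes
        return loadedModel;
    }

    public static void main(String[] args) {
        System.out.println(run()); // renderable:LampPost.sfb
    }
}
```

In the real app you never call join(); the callbacks run when loading completes, which is exactly why no handler code is needed.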
With the model loaded and stored in the lampPostRenderable variable, we’ll now add it to our scene.
Adding the Model to the Scene
The arFragment hosts our scene and will receive the tap events. So we need to set an onTap listener on our fragment to register the tap and place an object accordingly. Add the following code to the onCreate method:
arFragment.setOnTapArPlaneListener(
        (HitResult hitresult, Plane plane, MotionEvent motionevent) -> {
            if (lampPostRenderable == null) {
                return;
            }

            Anchor anchor = hitresult.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
            lamp.setParent(anchorNode);
            lamp.setRenderable(lampPostRenderable);
            lamp.select();
        });
We set the onTapArPlaneListener on our AR fragment. What you see next is Java 8 lambda syntax; in case you are not familiar with it, I would recommend checking out this guide.
First, we create our anchor from the HitResult using hitresult.createAnchor() and store it in an Anchor object.
Next, we create a node out of this anchor, called an AnchorNode. It is attached to the scene by calling the setParent method on it and passing in the scene from the fragment.
Now we create a TransformableNode, which will be our lamp post, and attach it to the anchor node. The node still doesn’t have any information about the object it has to render. We pass that object using the lamp.setRenderable method, which takes a renderable as its parameter. Finally, we call lamp.select();
Phew!! Too much terminology there, but don’t worry, I’ll explain it all.
Scene: This is the place where all your 3D objects are rendered. The scene is hosted by the AR fragment that we included in the layout. An anchor node is attached to this scene; it acts as the root of the tree, and all the other objects are rendered as its children.
HitResult: This represents a ray cast from the point you tapped on the screen into the real world, and it gives the point of intersection of that ray with a detected real-world surface.
Anchor: An anchor is a fixed location and orientation in the real world. It can be understood as the x, y, z coordinates in 3D space. You can get an anchor’s pose information from it. A pose is the position and orientation of the object in the scene, and it is used to transform the object’s local coordinate space into real-world coordinate space.
AnchorNode: This is the node that automatically positions itself in the world. It is the first node that gets set when a plane is detected.
TransformableNode: This is a node that can be interacted with. It can be moved around, scaled, rotated, and much more. In this example, we can scale and rotate the lamp. Hence the name Transformable.
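To make the local-to-world idea behind a pose concrete, here is a deliberately simplified sketch that handles translation only (a real ARCore Pose also carries a rotation). The Pose3 class and its field names are my own for illustration, not the ARCore API:

```java
public class PoseSketch {
    // Simplified pose: a world-space translation only.
    // A real ARCore Pose additionally stores a rotation quaternion.
    static class Pose3 {
        final float tx, ty, tz;

        Pose3(float tx, float ty, float tz) {
            this.tx = tx;
            this.ty = ty;
            this.tz = tz;
        }

        // Transform a point from the object's local space into world space.
        float[] transformPoint(float[] local) {
            return new float[]{ local[0] + tx, local[1] + ty, local[2] + tz };
        }
    }

    public static void main(String[] args) {
        Pose3 anchorPose = new Pose3(1.0f, 0.0f, -2.0f); // anchor 1 m to the right, 2 m ahead
        float[] lampTop = { 0.0f, 1.5f, 0.0f };          // top of the lamp in the model's local space
        float[] world = anchorPose.transformPoint(lampTop);
        System.out.println(world[0] + ", " + world[1] + ", " + world[2]); // 1.0, 1.5, -2.0
    }
}
```

The anchor’s pose plays exactly this role in the listener above: the model is authored around its own origin, and the pose shifts it to the tapped spot in the world.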
There is no rocket science here. It’s really simple. The entire scene can be viewed as a graph with the Scene as the parent, the AnchorNode as its child, and the different nodes/objects to be rendered on the screen branching out from there.
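The graph idea can be sketched with a tiny Node class of our own: each node stores a position relative to its parent, and a node's world position is accumulated by walking up the parent chain to the root, much as the scene graph composes the AnchorNode's pose with each child's transform. This is an illustration of the concept, not the Sceneform API:

```java
public class SceneGraphSketch {
    static class Node {
        Node parent;
        final float x, y, z; // position relative to the parent node

        Node(float x, float y, float z) {
            this.x = x;
            this.y = y;
            this.z = z;
        }

        void setParent(Node parent) {
            this.parent = parent;
        }

        // World position = this node's offset plus every ancestor's offset up to the root.
        float[] worldPosition() {
            float wx = x, wy = y, wz = z;
            for (Node p = parent; p != null; p = p.parent) {
                wx += p.x;
                wy += p.y;
                wz += p.z;
            }
            return new float[]{ wx, wy, wz };
        }
    }

    public static void main(String[] args) {
        Node scene = new Node(0f, 0f, 0f);        // root, like the Scene
        Node anchorNode = new Node(1f, 0f, -2f);  // placed where the plane was tapped
        anchorNode.setParent(scene);
        Node lamp = new Node(0f, 0.5f, 0f);       // child renderable, offset above its anchor
        lamp.setParent(anchorNode);

        float[] w = lamp.worldPosition();
        System.out.println(w[0] + ", " + w[1] + ", " + w[2]); // 1.0, 0.5, -2.0
    }
}
```

This is why calling setParent is enough to place the lamp: its final position falls out of the chain of parents, not from any absolute coordinate you set on it.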
Your final MainActivity.java should look something like this:
package com.ayusch.arcorefirst;
import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Gravity;
import android.view.MotionEvent;
import android.widget.Toast;

import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;
public class MainActivity extends AppCompatActivity {
    private static final String TAG = MainActivity.class.getSimpleName();
    private static final double MIN_OPENGL_VERSION = 3.0;

    ArFragment arFragment;
    ModelRenderable lampPostRenderable;

    @Override
    @SuppressWarnings({"AndroidApiChecker", "FutureReturnValueIgnored"})
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (!checkIsSupportedDeviceOrFinish(this)) {
            return;
        }
        setContentView(R.layout.activity_main);
        arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

        ModelRenderable.builder()
                .setSource(this, Uri.parse("LampPost.sfb"))
                .build()
                .thenAccept(renderable -> lampPostRenderable = renderable)
                .exceptionally(throwable -> {
                    Toast toast = Toast.makeText(this, "Unable to load lamp post renderable", Toast.LENGTH_LONG);
                    toast.setGravity(Gravity.CENTER, 0, 0);
                    toast.show();
                    return null;
                });

        arFragment.setOnTapArPlaneListener(
                (HitResult hitresult, Plane plane, MotionEvent motionevent) -> {
                    if (lampPostRenderable == null) {
                        return;
                    }

                    Anchor anchor = hitresult.createAnchor();
                    AnchorNode anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(arFragment.getArSceneView().getScene());

                    TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
                    lamp.setParent(anchorNode);
                    lamp.setRenderable(lampPostRenderable);
                    lamp.select();
                });
    }

    public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            Log.e(TAG, "Sceneform requires Android N or later");
            Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        String openGlVersionString =
                ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                        .getDeviceConfigurationInfo()
                        .getGlEsVersion();
        if (Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {
            Log.e(TAG, "Sceneform requires OpenGL ES 3.0 or later");
            Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
                    .show();
            activity.finish();
            return false;
        }
        return true;
    }
}
Congratulations!! You’ve just completed your first ARCore app. Start adding objects and see them come alive in the real world!
This was your first look at how to create a simple ARCore app from scratch with Android Studio. In the next tutorial, I will go deeper into ARCore and add more functionality to the app.
If you have any suggestions or a topic you would like a tutorial on, just mention it in the comments section and I’ll be happy to oblige.
Like what you read? Don’t forget to share this post on Facebook, Whatsapp and LinkedIn.
You can follow me on LinkedIn, Quora, Twitter and Instagram where I answer questions related to Mobile Development, especially Android and Flutter.
Originally published at https://www.freecodecamp.org/news/how-to-build-an-augmented-reality-android-app-with-arcore-and-android-studio-43e4676cb36f/