An introduction to ARKit technology
by Neil Mathew
Haptics for mobile AR: how to enhance ARKit apps with a sense of “touch”
I’m really excited about the future of haptics for AR and VR. It feels like the missing link between my HTC Vive and jumping into the OASIS with Parzival and Art3mis. So it’s not surprising that haptics is perhaps the most hotly anticipated tech in the XR community right now. Several companies like Microsoft and HTC, as well as startups like SenseGlove and HaptX, have shown demos of increasingly promising iterations of haptic gloves that I’m itching to try out.
Unfortunately, like most AR developers today, our work at Placenote is focused almost entirely on mobile AR platforms like ARKit and ARCore. Naturally, this got us thinking, “Could haptics do anything for mobile AR?”
Haptics have been an awesome addition to touch screens, from simulating tactile button clicks to silent notifications. But after some frantic googling, we realized that there’s actually been no real discussion about haptics for mobile AR apps so far… CHALLENGE ACCEPTED.
The challenge of mobile AR
We decided to dig into why haptics hasn’t made its way into mobile AR, and it wasn’t hard to see why. Mobile AR is by far the least immersive AR medium. The consensus in the community is that it’s just a stopgap on the way to the ultimate AR platform — smart glasses.
But mindset isn’t the only barrier here. We found that the mobile form-factor presents some unique challenges to the AR experience designer:
- unlike headsets, the phone screen is the display as well as the controller
- it’s impossible to bring your hands into the experience since you’re holding the phone.
- we still rely on touch screen interactions that are ambiguous in dimensionality — 2D or 3D touch?
Nevertheless, the reality is that, for the next few years and perhaps more, mobile AR is here to stay. There are a billion mobile devices in consumers’ pockets right now and only a handful of AR headsets on their heads. As a developer, distribution for your apps trumps most other factors. In fact, in applications like indoor navigation and gaming, mobile has already proven itself as a viable medium for deploying AR experiences.
This brings us to the topic of haptics for mobile AR. At first, it might seem like there’s no real hope for haptics to enhance mobile AR experiences, but recent studies have actually shown otherwise.
In haptics, less is more
A myriad of methods has been conceived to achieve haptic feedback. In general, they fall into two broad categories — kinesthetic haptics (force feedback) and cutaneous haptics (skin sensations).
Kinesthetic haptics has widely been considered to be the more realistic haptic technology. It involves physical actuators, either grounded or ungrounded. These push and pull our fingers and other appendages in response to interactions with virtual objects. Intuitively, realistic force-feedback should perform vastly better than plain old vibrations. But a study published in Science Robotics this year titled “The Uncanny Valley of Haptics” has challenged these assumptions.
The researchers found that increasing the realism of haptic sensation doesn’t necessarily increase the quality of the AR experience. It often has a negative impact due to the uncanny valley of realism in simulations. They found that cutaneous haptics, which is essentially a combination of light touches and vibrations, was far better at pulling the brain deeper into the illusion. Strange results, but they basically realized that we’ve underestimated how good our brain is at filling the gaps in our sensation of reality.
The situations where our brain steps in to fill the gaps is what I find most interesting about our perception of the sensation of touch. — Justin Barad, CEO of Osso VR
Bringing haptics to mobile AR
Given these findings, why not test what cutaneous haptics can do for mobile AR? After all, haptics on mobile isn’t just about vibrating ringtones anymore.
Micro-Electro-Mechanical Systems (MEMS) on mobile devices have gotten a lot more sophisticated and capable of some pretty nuanced behaviors. Since the iPhone 7, Apple has upgraded the old basic rumble vibrations to what they now call the Taptic Engine. This is a lot more subtle and consists of seven different types of haptic feedback with varying patterns and strengths.
The haptic feedback modes available are:
- Selection Change
- Impact Light
- Impact Medium
- Impact Heavy
- Notification Success
- Notification Warning
- Notification Failure
To learn more about the iOS feedback generator, check out this Apple documentation. At the end of this article, I will share some code you can use to quickly add these feedback types to your ARKit apps.
We decided to experiment with a number of these haptic feedback modes in our AR apps, and I’m really excited to say that the results were a pleasant surprise to our team. The following are some examples of haptic implementations in our mobile AR apps.
Usage examples of haptics in mobile AR
In our experiments so far, we’ve found that haptic feedback for mobile AR works well in five distinct scenarios. Here’s a description of each.
1. Magnetic pointers (i.e. snap to grid)
A pointer locked along a planar surface is a commonly used feature in many ARKit apps, especially in measurement tools like Air Measure and Magic Plan. Since your phone behaves as a controller in mobile AR, the standard UX in measurement apps involves dragging a pointer along a surface to draw lines or polygons to measure things in the real world. Of course, when it comes to line drawing, magnetic pointers that snap to the end points and edges of lines are seen everywhere — from PowerPoint to Photoshop.
We found that subtle haptic feedback indicating a “snap” in pointer position is a great enhancement. It almost feels like your phone (i.e. your controller) is physically moving to snap into place.
I was really happy to see that Apple’s new app “Measure” actually uses haptic feedback in their UX. It’s an amazingly subtle implementation and you can see a GIF of it in action below. An “Impact Medium” is fired when the pointer snaps to the edge of the plane.
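A minimal version of this snap feedback in Unity might look like the sketch below. It is an illustration, not Apple’s implementation: it assumes the same `iOSHapticFeedback` plugin call pattern used in the grazing snippet later in this article (the `Trigger` method and `Impact_Medium` identifier come from that snippet and may differ by plugin version), and `snapThreshold` is an arbitrary illustrative value. The key idea is to pulse only on the transition into the snapped state, so the user feels one distinct bump per snap rather than a continuous buzz.

```csharp
using UnityEngine;

public class MagneticPointer : MonoBehaviour
{
    public float snapThreshold = 0.05f; // snap radius in metres (illustrative)
    private bool wasSnapped = false;

    // Call every frame with the raw pointer position and the nearest snap point.
    // Returns the position to actually render the pointer at.
    public Vector3 UpdatePointer(Vector3 rawPosition, Vector3 nearestSnapPoint)
    {
        bool isSnapped = Vector3.Distance(rawPosition, nearestSnapPoint) < snapThreshold;

        // Fire the haptic only on the transition into the snapped state,
        // not on every frame while snapped.
        if (isSnapped && !wasSnapped)
        {
            iOSHapticFeedback.Instance.Trigger(Impact_Medium);
        }
        wasSnapped = isSnapped;

        return isSnapped ? nearestSnapPoint : rawPosition;
    }
}
```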
2. Hit testing (feeling real world surfaces)
Another common feature in ARKit apps is the hit test. This is implemented as a ray cast from a point on the screen — either a touch point or the center — to a surface in the real world. It is generally used to add a 3D object at the point of contact. A slight haptic sensation can help the user understand that a surface was “hit”. We found two methods that work well here:
Pinning: In this example, a marker is added to the scene at the hit point. An “Impact Light” helps users sense the “pinning” of the marker in 3D space. Of course, the downside to this is you can’t quite sense the “depth” of the hit point — in other words, how far the pin is from the user.
Grazing: An alternative to pinning is the grazing method of hit testing. In this case, a constantly updating marker previews where a marker might be added to the scene. We found that a series of haptic impulses, scaled by the magnitude of the preview marker’s displacement at each frame, gives the sensation of scraping a pointer along a 3D surface and lets you “feel” it.
Here’s a code example of grazing in Unity:
if (distanceChange >= 0.1f && distanceChange < 0.2f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Light);
}
else if (distanceChange >= 0.2f && distanceChange < 0.4f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Medium);
}
else if (distanceChange >= 0.4f)
{
    iOSHapticFeedback.Instance.Trigger(Impact_Heavy);
}
3. FPS gun recoil or explosions
This is by far the most fun example of haptic feedback. When building a first-person shooter in AR, your phone is the display as well as the weapon. A great way to simulate a gunshot is a simple “Impact Heavy”, which produces a single bump, or a “Notification Failure”, which creates a double bump that feels a lot like gun recoil. Of course, the example below is a laser weapon, but hey, this isn’t meant to be too realistic, remember?
4. Collision with controller tip
In VR apps like Oculus Medium or Tilt Brush, one of the handheld controllers serves as a brush tip that the user moves around to draw in 3D space. I’ve spent hours painting in Tilt Brush and so naturally I have tried really hard to mimic this experience with ARKit.
The trouble is that creating an accurate drawing experience on mobile becomes really difficult. You lose the sense of depth when your phone is both the display and the controller. One of the hardest things in 3D drawing apps on mobile is knowing where your brush tip is relative to the other 3D objects in the scene.
And, again, haptics was the answer. We found that one way to give users a sense of depth is to imagine the brush is actually a cane you can use to hit 3D objects that are already in the scene. Providing haptic feedback to let users know whether the brush tip is in contact with any existing objects in the scene lets users accurately pinpoint their brush in 3D space.
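One way to sketch this “cane” behaviour in Unity is an overlap test around the brush tip each frame, pulsing only when the tip enters contact. This is an illustrative sketch, not our production code: it assumes scene objects have colliders, reuses the `iOSHapticFeedback.Instance.Trigger(...)` call pattern from the grazing snippet above (enum names may differ by plugin version), and the tip radius is an arbitrary value.

```csharp
using UnityEngine;

public class BrushTipContact : MonoBehaviour
{
    public float tipRadius = 0.01f; // 1 cm contact sphere around the brush tip (illustrative)
    private bool wasTouching = false;

    void Update()
    {
        // True if any collider in the scene overlaps the sphere at the brush tip.
        bool isTouching = Physics.CheckSphere(transform.position, tipRadius);

        if (isTouching && !wasTouching)
        {
            // Entering contact: a light tap tells the user the tip has
            // reached an existing object, standing in for the missing depth cue.
            iOSHapticFeedback.Instance.Trigger(Impact_Light);
        }
        wasTouching = isTouching;
    }
}
```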
5. Re-localization snap in Persistent AR apps
At Placenote, we primarily build Persistent AR, or AR Cloud, apps. The core functionality of these apps is the ability to save AR content permanently in a physical place. Users can load it up in the same location every time.
This behaviour is called the relocalization of a scene.
In order to relocalize an AR scene, a user must first point their phone’s camera to the real world, and then wait until the camera detects its location.
With Placenote, relocalization happens almost instantaneously, but it all happens internally. Hence, we needed to design a way to notify the user of a successful relocalization. The visual cues might be enough, as seen in the GIF above. But a more subtle indication is a haptic “Impact Light” to suggest that you have snapped into place in the real world.
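In code, this is just a one-shot pulse fired from whatever callback your mapping SDK raises when the camera recognizes its position. The sketch below is hypothetical: `OnRelocalized` is not necessarily Placenote’s actual callback name, and the haptic call reuses the plugin pattern from the grazing snippet earlier in the article.

```csharp
using UnityEngine;

public class RelocalizationFeedback : MonoBehaviour
{
    private bool hasNotified = false;

    // Hypothetical callback, wired to whatever event your mapping SDK
    // raises once the saved map has been recognized.
    public void OnRelocalized()
    {
        if (hasNotified) return; // pulse once per session, not on every pose update

        // A single light bump suggests the content has "snapped" into place.
        iOSHapticFeedback.Instance.Trigger(Impact_Light);
        hasNotified = true;
    }
}
```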
How to add haptics to your ARKit project
If you’re working with Swift for Native iOS ARKit development, check out this tutorial on implementing haptic feedback in Native apps.
If you’re working with Unity, my favorite package so far is the iOS Haptic Feedback Package on the Unity Asset Store. It’s $5 but well worth it, because Unity’s built-in function Handheld.Vibrate() doesn’t actually expose the new iOS Taptic Engine functions!
The iOS Haptic Feedback Package provides a simple Prefab and Scripts to add all seven types of haptic feedback to your app. You can get it from the Unity Asset Store.
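Once the package’s prefab is in the scene, triggering any of the seven modes is a one-liner. The identifiers below are assumed to mirror the mode list earlier in the article, following the call pattern from the grazing snippet; the exact enum names may differ across plugin versions, so treat this as a sketch rather than the package’s documented API.

```csharp
// Illustrative calls, one per feedback mode:
iOSHapticFeedback.Instance.Trigger(Selection_Change);     // subtle tick, e.g. picker-style UI
iOSHapticFeedback.Instance.Trigger(Impact_Light);         // light bump
iOSHapticFeedback.Instance.Trigger(Impact_Medium);        // medium bump, e.g. pointer snap
iOSHapticFeedback.Instance.Trigger(Impact_Heavy);         // heavy bump, e.g. gun shot
iOSHapticFeedback.Instance.Trigger(Notification_Success); // double-tap "done"
iOSHapticFeedback.Instance.Trigger(Notification_Warning); // attention-getting pattern
iOSHapticFeedback.Instance.Trigger(Notification_Failure); // strong double bump, e.g. recoil
```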
Things to watch out for
As with any design tool, here are a few things to watch out for when incorporating haptics into your mobile AR app.
Using haptics too much can mess up ARKit tracking
Test the impact of haptics on your AR session. Since ARKit relies on inertial sensing to track the phone’s motion, adding too many vibrations during an ARKit session can throw off tracking slightly.
Using haptics too much can overheat the device
Haptics is, after all, physical movement of your mobile device, and it naturally uses more energy. Use it sparingly to ensure your phone doesn’t overheat or run out of battery too fast.
Too much haptic feedback might confuse and desensitize your user
This is true for any haptic mechanism. Don’t overdo it. Specifically, don’t use it without a clear understanding of why haptic feedback is necessary for the action your user is performing. The danger of overuse is that your users get confused by it and become desensitized to your feedback.
And that’s it! I hope this article has given you a helpful dose of design ideas and convinced you to venture into the world of mobile AR haptics. We’ve really enjoyed exploring the different ways we could simulate touch sensations in mobile AR and if you have any more ideas we would love to talk to you about it. If you’re interested in trying out any of our code examples for mobile AR haptics, send me an email at neil [at] placenote.com.
If you’re interested in Persistent AR apps or what we do at Placenote, message us on twitter, or check out Placenote.com
Translated from: https://www.freecodecamp.org/news/haptics-for-mobile-ar-how-to-enhance-arkit-apps-with-a-sense-of-touch-151d9e9c9950/