Over 11,000 five-star assets

Rated by 85,000+ customers

Supported by 100,000+ forum members

Every asset moderated by Unity

⌛ Time is money! Iterate faster on your AR projects without leaving the Unity Editor. Save time and sanity developing AR apps.
Render pipeline compatibility
The Built-in Render Pipeline is Unity’s default render pipeline. It is a general-purpose render pipeline that has limited options for customization. The Universal Render Pipeline (URP) is a Scriptable Render Pipeline that is quick and easy to customize, and lets you create optimized graphics across a wide range of platforms. The High Definition Render Pipeline (HDRP) is a Scriptable Render Pipeline that lets you create cutting-edge, high-fidelity graphics on high-end platforms.
Unity Version    Built-in      URP           HDRP
6000.0.17f1      Compatible    Compatible    Compatible
2019.4.0f1       Compatible    Compatible    Compatible
Description

AR Foundation Remote 2.0 is a major update to AR Foundation Remote, the most popular debugging tool for AR apps. With new exclusive features, you'll be able to iterate even faster and deliver high-quality AR apps to market with greater confidence.


In simple terms: AR Foundation Remote 2.0 = Unity Remote + AR Foundation + Input System (New) + So Much More.



💡 Current workflow with AR Foundation 💡


1. Make a change to your AR project.

2. Build the project and run it on a real AR device.

3. Wait for the build to complete.

4. Wait a little bit more.

5. Test your app on a real device using only Debug.Log().



🔥  Improved workflow with AR Foundation Remote 2.0 🔥


1. Set up the AR Companion app once. The setup takes only a few minutes.

2. Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!



💡 This plugin is licensed on a per-seat basis, meaning one license is required for each developer on your team. More Info.



⚡ Features ⚡


• Precisely replicates the behavior of a real AR device in Editor.

• Extensively tested with both ARKit and ARCore.

• Plug-and-play: no additional scene setup is needed, just run your AR scene in the Editor with the AR Companion running (a minor code change may be needed).

• Streams video from Editor to real AR device so you can see how your app looks on it without making a build (see Limitations).

• Multi-touch input remoting: stream multi-touch input from an AR device or simulate touches with the mouse in the Editor (see Limitations).

• Test Location Services (GPS), Gyroscope, and Compass right in the Editor.

• Written in pure C# with no third-party libraries. Full source code is available.

• Connect any AR Device to Windows PC or macOS via Wi-Fi: iOS + Windows PC, Android + macOS... any variation you can imagine!

• Compatible with Wikitude SDK Expert Edition.

• Compatible with VisionLib SDK.
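To illustrate the plug-and-play claim, here is an ordinary AR Foundation tap-to-place script, the kind of code the plugin runs unmodified in Play Mode. This is a generic sketch, not part of the plugin: the class name `TapToPlace` and serialized fields are illustrative, and it uses the classic Input Manager touch API.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Ordinary AR Foundation code: nothing plugin-specific is required here.
// In Play Mode, touches are streamed from the AR Companion app
// (or simulated with the mouse) instead of coming from a device build.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject prefab;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes and place the prefab on the hit pose.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
            Instantiate(prefab, hits[0].pose.position, hits[0].pose.rotation);
    }
}
```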



🎥 Session Recording and Playback 🎥


The Session Recording and Playback feature lets you record AR sessions to a file and play them back in a reproducible environment (see Limitations).

• Record and playback all supported features: face tracking, image tracking, plane tracking, touch input, you name it!

• Fix bugs that occur only under specific conditions. Replaying a previously recorded AR session in a reproducible environment helps you track down and fix bugs even faster!

• Record testing scenarios for your AR app. Your testers don't have to fight over testing devices ever again: record a testing scenario once, then play it back as many times as you want without an AR device.



⚓️ ARCore Cloud Anchors ⚓️


Testing ARCore Cloud Anchors doesn't have to be hard. With a custom fork adapted to work with AR Foundation Remote 2.0, you can run AR projects that use ARCore Cloud Anchors right in the Unity Editor.

• Host Cloud Anchors.

• Resolve Cloud Anchors.

• Record an AR session with ARCore Cloud Anchors and play it back in a reproducible environment.
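For reference, hosting and resolving with the ARCore Extensions API typically looks like the sketch below. This assumes the classic (pre-async) ARCore Extensions extension methods on ARAnchorManager; the description says a custom fork of the ARCore package is used, so exact names may differ. The class name `CloudAnchorExample` is illustrative.

```csharp
using Google.XR.ARCoreExtensions;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CloudAnchorExample : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;
    ARCloudAnchor cloudAnchor;

    // Host a previously created local anchor so other devices can resolve it.
    public void Host(ARAnchor localAnchor)
    {
        cloudAnchor = anchorManager.HostCloudAnchor(localAnchor);
    }

    // Resolve an anchor hosted earlier, given its cloud anchor id.
    public void Resolve(string cloudAnchorId)
    {
        cloudAnchor = anchorManager.ResolveCloudAnchorId(cloudAnchorId);
    }

    void Update()
    {
        // Hosting/resolving is asynchronous; poll the state until it settles.
        if (cloudAnchor != null &&
            cloudAnchor.cloudAnchorState == CloudAnchorState.Success)
            Debug.Log($"Cloud anchor ready: {cloudAnchor.cloudAnchorId}");
    }
}
```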



🕹 Input System (New) support 🕹


Version 2.0 brings Input System (New) support, with all the benefits of input events and enhanced touch functionality.

• Input Remoting transmits all input events from your AR device to the Editor. Test Input System multi-touch input right in the Editor!

• Test Input Actions right in the Editor without making builds.

• Record all Input System events to a file and play them back in a reproducible environment. Again, all supported features can be recorded and played back!
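A minimal sketch of the Input System's EnhancedTouch API, the kind of code that receives the remoted touches in Play Mode (the class name `TouchLogger` is illustrative; this is standard Unity Input System usage, not plugin API):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Disambiguate from the legacy UnityEngine.Touch type.
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class TouchLogger : MonoBehaviour
{
    // EnhancedTouch must be enabled explicitly before Touch.activeTouches works.
    void OnEnable()  => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        // activeTouches contains every current touch, whether streamed
        // from the AR device or simulated with the mouse in the Editor.
        foreach (var touch in Touch.activeTouches)
            Debug.Log($"Touch {touch.touchId} at {touch.screenPosition}, phase {touch.phase}");
    }
}
```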



⚡ Supported AR subsystems ⚡


• ARCore Cloud Anchors: host and resolve ARCore Cloud Anchors.

• Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.

• Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).

• Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.

• Body Tracking: ARKit 2D/3D body tracking, scale estimation.

• Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.

• Image Tracking: supports mutable image libraries and replacement of the image library at runtime.

• Depth Tracking (ARPointCloudManager): feature points, raycast support.

• Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.

• CPU images: camera and occlusion CPU image support (see Limitations).

• Anchors (ARAnchorManager): add/remove anchors, attach anchors to detected planes.

• Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.

• Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.

• Raycast subsystem: perform world-based raycasts against detected planes, point clouds, and the depth map.

• Object Tracking: ARKit object detection after scanning with the scanning app (see Limitations).

• ARKit World Map: full ARWorldMap support. Serialize the current world map, deserialize a saved world map, and apply it to the current session.
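All of these subsystems are consumed through standard AR Foundation managers and their events. As one representative example, here is a plain plane-tracking subscription that would receive plane data in Play Mode just as it does on device (the class name `PlaneLogger` is illustrative; `planesChanged` is the AR Foundation 3.x/4.x event name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    // Subscribe to plane lifecycle events from the plane subsystem.
    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes .updated and .removed for the full lifecycle.
        foreach (var plane in args.added)
            Debug.Log($"Plane {plane.trackableId} detected, size {plane.size}");
    }
}
```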



FAQ

Forum

Support

Technical details

💡 Requirements 💡


• Stable version of Unity 2019.4 or newer.

• AR Device (iPhone with ARKit support, Android with ARCore support, etc.).

• The AR device and the Unity Editor should be on the same Wi-Fi network (a wired connection is supported with additional setup).

• Verified version of AR Foundation 3.0.1 or newer.



👉 Limitations 👈


• Please check that your AR device supports the AR feature you want to test in Editor. For example, to test Meshing in Editor, your AR device should support Meshing.


• Video streaming and occlusion textures:

- Are supported with these Editor Graphics APIs: Direct3D11, Metal, and OpenGLCore.

- The framerate is around 15-20 FPS on high-end mobile devices. You can increase the framerate by decreasing the video resolution.

- Default video resolution scale is 0.33. You can increase the resolution in the plugin's Settings, but this will result in higher latency and lower framerate.


• Touch input remoting and simulation:

- Input Manager (Old): UI can respond to touch simulation and remoting only if the Game View window is focused.

- Input System (New): UI can respond to touch simulation and remoting only if the Unity Editor window is focused.


• ARKit Object Tracking:

- Adding a new object reference library requires a new build of the AR Companion app.


• CPU images:

- Only one XRCpuImage can be acquired at a time for each CPU image type.

- Only one XRCpuImage.ConvertAsync() conversion is supported at a time.
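The practical consequence of the one-image-at-a-time rule is that each acquired image should be disposed promptly. A minimal sketch using the AR Foundation 4.x API names (`TryAcquireLatestCpuImage`; in AR Foundation 3.x the equivalent call is named differently; the class name `CpuImageReader` is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Update()
    {
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        // Dispose promptly: only one XRCpuImage can be held at a time
        // for each CPU image type, so holding it would block later frames.
        using (image)
        {
            Debug.Log($"CPU image {image.width}x{image.height}, format {image.format}");
        }
    }
}
```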


• Session Recording and Playback:

- Your app should be deterministic to record and play back AR sessions reliably. That is, your AR experience shouldn't rely on framerate- or time-dependent events (animations, timers, a ScrollRect moving with inertia, physics simulation, server connections, etc.). Instead, make sure your AR app responds only to AR Foundation and input events to trigger AR functionality while recording or playing back a session.

- Input Manager (Old): Game View should be focused to play back recorded input.

- Input System (New): Unity Editor window should be focused to play back recorded input.


License type
File size: 5.0 MB
Latest version: 2.0.39-release.0
Latest release date: Aug 31, 2024
Original Unity version: 2019.4.0 or higher