Building Your First AR Experience: From Zero to Pokemon Go Vibes

Let me be honest with you: the first time I tried creating an AR app, I thought I’d need to understand quantum physics, sacrifice a small animal to the dev gods, and probably learn three new programming languages. Turns out, I was only right about one of those things, and it wasn’t the animal part. Fast forward to today, and building AR apps in Unity is actually enjoyable. The best part? You don’t need to maintain separate codebases for iOS and Android. That’s where AR Foundation swoops in like your friendly neighborhood Spider-Man—making everyone’s life easier. In this article, I’ll walk you through creating your first augmented reality application that works seamlessly on both iOS and Android. We’re talking about the kind of app that lets you place virtual objects in the real world, track images, and impress your friends (or at least confuse them).

Understanding AR Foundation: The Bridge Between Your Dreams and Reality

Before we get our hands dirty with code, let’s understand what we’re actually dealing with. AR Foundation is Unity’s cross-platform framework that allows you to write your AR experience once and deploy it to either iOS or Android without changing your Scene’s scripts or components. Think of it as the diplomatic translator between your Unity dreams and the platform-specific realities of iOS and Android. Under the hood, AR Foundation acts as a unifying layer. When you target iOS, it uses ARKit (Apple’s framework), and when you target Android, it uses ARCore (Google’s framework). The magic happens because Unity abstracts these differences away, meaning you—the developer—don’t have to be fluent in each platform’s specific quirks.

Why This Matters (Beyond Just Being Cool)

Before AR Foundation existed, developers had to build separate scenes and maintain different code branches for each platform. That’s like owning two cars and having to perform every repair twice. Now? You literally “build once, deploy everywhere.”

The Technical Foundation: What You’ll Work With

AR Foundation supports quite an impressive array of features:

  • Multi-platform deployment for iOS and Android
  • Plane detection (vertical and horizontal)
  • Feature point detection to understand the environment
  • Light estimation for realistic virtual object lighting
  • Hit testing to determine where users are tapping
  • AR Anchors for positioning objects in space
  • Image Tracking to detect and track flat images like posters and book covers
  • 3D object tracking for more complex scene understanding
  • Face Tracking (yes, you can make AR face filters if you want)
  • Cloud Anchors for multi-user experiences (provided by the separate ARCore Extensions package)
  • AR Remote for easier testing during development
Here’s how those pieces stack up:

graph TD
    A["AR Foundation"] -->|iOS| B["ARKit"]
    A -->|Android| C["ARCore"]
    B --> D["AR Experience"]
    C --> D
    E["Your Unity Scripts"] --> A
    style A fill:#61dafb
    style D fill:#4CAF50
    style E fill:#FF6B6B
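
To see the abstraction from the script side, here’s a minimal sketch (using AR Foundation’s standard ARSession API) that asks the device whether AR is supported at all. The same C# runs unchanged against ARKit on iOS and ARCore on Android:

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
public class ARSupportCheck : MonoBehaviour
{
    private IEnumerator Start()
    {
        // Ask the underlying platform (ARKit or ARCore) whether AR is available.
        yield return ARSession.CheckAvailability();
        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else if (ARSession.state == ARSessionState.NeedsInstall)
        {
            // On Android, ARCore itself may need to be installed or updated.
            yield return ARSession.Install();
        }
    }
}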

Setting Up: The Checklist That Actually Works

Let’s get down to brass tacks. Here’s what you need installed on your machine before you even think about creating a project:

  • Unity 2019.4 LTS or later (I’m using a more recent version, but 2019.4 is the baseline that works)
  • Visual Studio Code or Visual Studio for scripting
  • An iOS device or Android device for testing (editor simulation can help, but real AR tracking only happens on actual hardware)
  • Developer accounts (optional, but recommended for actual deployment)

Step 1: Creating Your Project Foundation

Fire up Unity and create a new 3D project. You’ll want at least version 2019.4 LTS, but honestly, use the latest stable version—future you will thank present you for it. Once the project opens and you’re staring at that beautiful blank scene, resist the urge to jump into creating things immediately. First, we need to install the necessary packages.

Step 2: Installing AR Foundation Packages

Here’s where the real fun begins. You need to navigate to the Package Manager. In Unity’s menu, go to Window → Package Manager. When the Package Manager window opens, search for “AR Foundation” and click Install. But wait—there’s more. You can’t just install AR Foundation and call it a day. You also need platform-specific plugins:

  • If targeting iOS: Install ARKit XR Plugin
  • If targeting Android: Install ARCore XR Plugin
  • If targeting both (which is the whole point): Install all three packages

Here’s what that looks like in a Unity project targeting both platforms:
// This is conceptual - you're not actually writing this code,
// but this represents what's happening behind the scenes:
// - AR Foundation (abstract layer)
// - ARKit XR Plugin (iOS implementation)
// - ARCore XR Plugin (Android implementation)
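
If you’d rather verify the result by hand, the installed packages show up in your project’s Packages/manifest.json. A sketch of the relevant entries (the version numbers here are illustrative; yours will depend on your Unity version):

{
  "dependencies": {
    "com.unity.xr.arfoundation": "4.1.7",
    "com.unity.xr.arkit": "4.1.7",
    "com.unity.xr.arcore": "4.1.7"
  }
}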

Step 3: Scene Setup - Delete, Create, Repeat

This next part catches a lot of beginners off guard. When you create a new 3D scene in Unity, it automatically includes a Main Camera GameObject. You need to delete this. I know, it feels wrong, like deleting your first child. But trust me—AR requires a special AR Camera, and the default camera will just get in the way. With the default camera gone, you now need to add two critical GameObjects to your scene. Think of these as the skeleton of your AR experience:

First: The AR Session GameObject. Navigate to Create → XR → AR Session. This GameObject should have an ARSession component automatically attached. The AR Session controls the entire lifecycle of your AR experience. Without it, your app won’t be able to track features in the environment. You only need one AR Session per scene, and it can exist on any GameObject (though typically we keep it on the AR Session GameObject itself).

Second: The AR Session Origin GameObject. Navigate to Create → XR → AR Session Origin. This is the unsung hero of AR Foundation. While the AR Session manages the lifecycle, the AR Session Origin handles the math. Here’s what I mean: AR devices provide data in “session space”—essentially a coordinate system relative to where the user started. That’s great and all, but Unity needs everything in “Unity space.” The AR Session Origin transforms tracking data (planes, feature points, detected images) into poses (position and orientation) that make sense in your scene.

Think of it like this: AR Session Origin is the translator that converts “the phone thinks this plane is 2 meters away at a 45-degree angle from launch point” into “place this virtual cube right there in the 3D world.”
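
You rarely do this math yourself; the ARSessionOrigin component exposes it. As a quick sketch of the idea (this uses ARSessionOrigin’s standard MakeContentAppearAt method; the class and field names are mine):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
public class ContentPlacer : MonoBehaviour
{
    public ARSessionOrigin sessionOrigin; // the AR Session Origin in your scene
    public Transform content;             // the virtual object to position
    public void PlaceAt(Pose pose)
    {
        // Rather than moving the content, this shifts the session origin so the
        // content appears at the given real-world position and orientation.
        sessionOrigin.MakeContentAppearAt(content, pose.position, pose.rotation);
    }
}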

Step 4: Platform Configuration

Now we’re getting to the part where we tell Unity which platform we’re targeting. Go to File → Build Settings… In the Build Settings window, you’ll see a list of platforms. Select either iOS or Android depending on your target device. You can toggle between them later to build for different platforms—that’s the beauty of AR Foundation.
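
If you toggle platforms often, you can even automate the switch with a small Editor script. A sketch using Unity’s standard Editor API (drop it in a folder named Editor; the menu paths are my own invention):

using UnityEditor;
public static class PlatformSwitcher
{
    [MenuItem("Tools/Switch to Android")]
    public static void SwitchToAndroid()
    {
        // Equivalent to selecting Android in Build Settings and clicking Switch Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.Android, BuildTarget.Android);
    }
    [MenuItem("Tools/Switch to iOS")]
    public static void SwitchToiOS()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.iOS, BuildTarget.iOS);
    }
}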

The First Real Test: A Minimal AR Scene

Let’s create something practical. We’re going to build a simple scene that detects horizontal planes (like a table or floor) and lets users place virtual cubes.

Creating the AR Manager Script

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
public class ARPlacementManager : MonoBehaviour
{
    private ARRaycastManager raycastManager;
    private ARPlaneManager planeManager;
    public GameObject placementPrefab;
    // Reused across frames so we don't allocate a new list on every raycast.
    private static List<ARRaycastHit> hits = new List<ARRaycastHit>();
    private void Start()
    {
        raycastManager = GetComponent<ARRaycastManager>();
        planeManager = GetComponent<ARPlaneManager>();
    }
    private void Update()
    {
        // Only react to the first frame of a new touch.
        if (Input.touchCount == 0)
            return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;
        // Raycast from the touch point against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            foreach (ARRaycastHit hit in hits)
            {
                Instantiate(placementPrefab, hit.pose.position, hit.pose.rotation);
            }
        }
    }
}

This script does a few important things:

  1. Gets references to the ARRaycastManager and ARPlaneManager components
  2. Listens for touch input on the device
  3. Performs a raycast from the camera through the touch point
  4. Places objects where the raycast intersects with detected planes

The magic happens in that Raycast call. It shoots an invisible ray from the camera through your touch point and checks if it hits a tracked plane.

Setting Up the Placement Prefab

You’ll need a simple prefab to instantiate. Here’s what I’d recommend for testing:

  1. Create a new Cube (Create → 3D Object → Cube)
  2. Scale it down to something reasonable (like 0.1, 0.1, 0.1)
  3. Add a simple material with a color so you can see it
  4. Drag it into your Assets folder to make it a Prefab
  5. Delete it from the scene

Now, on your AR Session Origin GameObject, add the ARRaycastManager and ARPlaneManager components (the script fetches both with GetComponent, so they need to live on the same GameObject), then add the ARPlacementManager script and assign your cube prefab to the placementPrefab field.

Image Tracking: Making AR Recognize Real-World Objects

Let’s level up and create something cooler. Image tracking allows your AR app to detect and track flat images in the real world—like a poster, a book cover, or a trading card. Here’s how to set this up:

Prepare Your Reference Images

  1. Open your project’s Assets folder
  2. Create a new folder called ARResources
  3. Add images you want to track as PNG or JPG files
  4. Select each image in the Project view
  5. In the Inspector, set the Texture Type to Editor GUI and Legacy GUI (a temporary setting that keeps the texture readable while the library is built)
  6. Apply the changes

Create an AR Reference Image Library

In your Assets folder, right-click and navigate to Create → XR → Reference Image Library. This creates a database of images your app will recognize.

  1. Open your newly created Reference Image Library
  2. Add entries by increasing the Size value
  3. For each entry, assign one of your images
  4. Set the Width to the real-world size of the image in meters (a standard poker card, for example, is about 0.064 meters wide)
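
As a side note: if your images aren’t known at build time, newer AR Foundation versions can add them at runtime through a mutable library. A hedged sketch (this assumes the platform supports MutableRuntimeReferenceImageLibrary, which not every device does; the class and field names are mine):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
public class RuntimeImageAdder : MonoBehaviour
{
    public ARTrackedImageManager trackedImageManager;
    public Texture2D newImage; // must be a readable texture
    public void AddImage()
    {
        if (trackedImageManager.referenceLibrary is MutableRuntimeReferenceImageLibrary mutableLibrary)
        {
            // The third argument is the physical width of the image in meters.
            mutableLibrary.ScheduleAddImageWithValidationJob(newImage, "my-image", 0.064f);
        }
    }
}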

The Image Tracking Script

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using System.Collections.Generic;
public class ImageTrackingManager : MonoBehaviour
{
    private ARTrackedImageManager trackedImageManager;
    public GameObject[] arPrefabs;
    private Dictionary<int, GameObject> instantiatedPrefabs = new Dictionary<int, GameObject>();
    private void OnEnable()
    {
        trackedImageManager = GetComponent<ARTrackedImageManager>();
        trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }
    private void OnDisable()
    {
        trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }
    private void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        foreach (ARTrackedImage trackedImage in eventArgs.added)
        {
            UpdateImage(trackedImage);
        }
        foreach (ARTrackedImage trackedImage in eventArgs.updated)
        {
            UpdateImage(trackedImage);
        }
        foreach (ARTrackedImage trackedImage in eventArgs.removed)
        {
            if (instantiatedPrefabs.ContainsKey(trackedImage.GetInstanceID()))
            {
                Destroy(instantiatedPrefabs[trackedImage.GetInstanceID()]);
                instantiatedPrefabs.Remove(trackedImage.GetInstanceID());
            }
        }
    }
    private void UpdateImage(ARTrackedImage trackedImage)
    {
        int key = trackedImage.GetInstanceID();
        if (!instantiatedPrefabs.ContainsKey(key))
        {
            // Pair the detected image with a prefab by name: each prefab in
            // arPrefabs should be named after an entry in the reference library.
            foreach (GameObject prefab in arPrefabs)
            {
                if (prefab.name == trackedImage.referenceImage.name)
                {
                    GameObject instance = Instantiate(prefab, trackedImage.transform.position, trackedImage.transform.rotation);
                    instantiatedPrefabs[key] = instance;
                    break;
                }
            }
        }
        else
        {
            // Keep the spawned object glued to the tracked image's pose.
            GameObject instance = instantiatedPrefabs[key];
            instance.transform.position = trackedImage.transform.position;
            instance.transform.rotation = trackedImage.transform.rotation;
        }
    }
}

Here’s what this script does:

  • Subscribes to image tracking events when enabled
  • Creates AR objects when an image is first detected
  • Updates positions as the image moves or rotates
  • Destroys objects when the image leaves the view
  • Maintains a dictionary to track which objects correspond to which images

Scene Setup for Image Tracking

  1. In your AR Session Origin, add the ARTrackedImageManager component
  2. Assign your Reference Image Library to it
  3. Create the AR objects you want to appear (cubes, models, animations)
  4. Make them into Prefabs
  5. Add the ImageTrackingManager script to your AR Session Origin
  6. In the Inspector, assign your AR Prefabs to the arPrefabs array, naming each prefab after its entry in the Reference Image Library so the script can pair them

Advanced Architecture: Understanding the Complete Flow

Let me give you a visual representation of how everything connects:

sequenceDiagram
    participant User as User Input
    participant Camera as AR Camera
    participant Native as ARKit / ARCore
    participant ARFoundation as AR Foundation
    participant Scripts as Your Scripts
    participant Scene as Unity Scene
    User->>Camera: Point camera at surface
    Camera->>Native: Capture frame & sensor data
    Native->>ARFoundation: Detected planes/images/features
    ARFoundation->>Scripts: Trigger callbacks
    Scripts->>Scene: Update GameObject positions
    Scene-->>Camera: Render updated view

This flow repeats every frame, typically 30 to 60 times per second. AR Foundation is constantly receiving tracking data from ARKit or ARCore, forwarding it to your scripts, and letting you respond by updating your virtual scene.
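
If you want to watch that lifecycle happen, you can subscribe to AR Foundation’s session state event. A minimal sketch using the standard ARSession.stateChanged API:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
public class SessionStateLogger : MonoBehaviour
{
    private void OnEnable()
    {
        ARSession.stateChanged += OnStateChanged;
    }
    private void OnDisable()
    {
        ARSession.stateChanged -= OnStateChanged;
    }
    private void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        // Logs transitions like CheckingAvailability -> Ready -> SessionTracking.
        Debug.Log("AR session state: " + args.state);
    }
}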

Building to Device: The Moment of Truth

Here’s where all your hard work pays off. Let’s actually test this on a real device.

For Android

  1. Connect your Android device via USB cable
  2. Enable Developer Options on your phone (tap Settings → About Phone → Build Number seven times—no, I’m not joking)
  3. Enable USB Debugging in Developer Options
  4. Go to File → Build Settings
  5. Select Android as the platform
  6. Make sure your scene is added to the build (it should show up in “Scenes in Build”)
  7. Click Build and Run
  8. Unity will compile, and your app will install and launch on your device

For iOS

  1. Connect your iPhone via USB cable
  2. Go to File → Build Settings
  3. Select iOS as the platform
  4. Configure the following:
    • Bundle Identifier: Something like com.yourname.arapp
    • Minimum iOS Version: iOS 12.0 or later
  5. Click Build
  6. When prompted, create a folder for your Xcode project
  7. Open the generated Xcode project
  8. In Xcode, select your signing team and your device, then click the Play button

The first time you run on an actual device, there might be a moment of suspense. Will it work? Will it compile? Will your app just crash immediately? Most of the time, it works beautifully. Sometimes you’ll get permission requests asking if the app can use the camera. Always approve those (unless you’re running your AR app for the CIA, in which case, different rulebook).

Performance Optimization: Because Battery Life Matters

AR apps can be battery-hungry little monsters if you’re not careful. Here are some practical tips:

1. Disable Plane Detection When You Don’t Need It

planeManager.enabled = false; // Disable when not needed
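
Note that disabling the manager only stops new detection; planes that were already found stay visible. Here’s a small sketch that toggles both, using ARPlaneManager’s trackables collection (the component and method names are mine):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
public class PlaneToggler : MonoBehaviour
{
    public ARPlaneManager planeManager;
    public void SetPlaneDetection(bool on)
    {
        planeManager.enabled = on; // stop or resume detecting new planes
        // Also hide (or re-show) the planes that were already detected.
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(on);
    }
}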

2. Limit the Update Frequency

Application.targetFrameRate = 30; // 30 FPS instead of 60

3. Use Object Pooling for Frequently Created Objects

Instead of instantiating and destroying objects constantly, reuse them:

using System.Collections.Generic;
using UnityEngine;
public class ObjectPool : MonoBehaviour
{
    // Pre-instantiated, deactivated objects waiting to be reused.
    private Queue<GameObject> pool = new Queue<GameObject>();
    public GameObject prefab;
    private int poolSize = 10;
    private void Start()
    {
        for (int i = 0; i < poolSize; i++)
        {
            GameObject obj = Instantiate(prefab);
            obj.SetActive(false);
            pool.Enqueue(obj);
        }
    }
    public GameObject GetObject()
    {
        if (pool.Count > 0)
        {
            GameObject obj = pool.Dequeue();
            obj.SetActive(true);
            return obj;
        }
        return Instantiate(prefab);
    }
    public void ReturnObject(GameObject obj)
    {
        obj.SetActive(false);
        pool.Enqueue(obj);
    }
}
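
To connect it back to the placement example, the ARPlacementManager from earlier could swap its Instantiate call for the pool (a sketch that assumes an ObjectPool component sits on the same GameObject):

// Inside ARPlacementManager's Update, instead of Instantiate:
GameObject obj = GetComponent<ObjectPool>().GetObject();
obj.transform.SetPositionAndRotation(hit.pose.position, hit.pose.rotation);

Call ReturnObject when a cube is no longer needed, and you skip the cost of constant allocation and garbage collection.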

Common Pitfalls and How to Avoid Them

“My app compiles but crashes immediately.” Check that you have both the AR Session and AR Session Origin in your scene. I’ve done this more times than I’d like to admit.

“The virtual objects jitter or look unstable.” This usually means your device is struggling with plane detection. Try disabling other tracking features or reducing object complexity.

“Image tracking doesn’t detect my image.” Ensure your reference images have enough visual features (not just plain colors) and that they’re properly sized in the Reference Image Library.

“Everything works in the editor but breaks on device.” The editor is forgiving. Real devices aren’t. Make sure you have proper null checks and error handling throughout your code.

Next Steps: Going Beyond the Basics

You’ve now got the foundation (see what I did there?) for building AR experiences. From here, you can:

  • Implement multiplayer AR with ARCore Cloud Anchors (Firebase is commonly used to share anchor IDs between users)
  • Add 3D models from your favorite modeling software
  • Create UI overlays that track with AR objects
  • Build face tracking filters similar to Snapchat
  • Integrate physics engines for realistic object interactions
  • Create persistent AR that remembers where objects were placed

The possibilities are genuinely wild. You’re no longer limited by the flat screen—you’ve got the entire physical world as your canvas.

The Journey Ahead

Creating AR applications used to feel like black magic reserved for wizards with PhDs in computer vision. AR Foundation has democratized this to the point where anyone willing to learn can build compelling experiences. Yes, there will be debugging sessions that make you want to throw your laptop out a window. Yes, you’ll probably spend an hour trying to figure out why something isn’t working only to realize you forgot to enable a checkbox. That’s part of the game. But then one day, you’ll point your phone at a surface, tap the screen, and watch a virtual object materialize in the real world exactly where you intended. That moment? That’s pure magic. That’s when all the frustration evaporates, and you realize why so many developers are obsessed with AR. Now go build something awesome. The AR world is waiting for you.