Development Tips · 10 min read

AR/VR App Development 2026: Complete Mobile Guide

A complete guide to AR and VR mobile app development: learn ARKit, ARCore, and Unity, and how to build immersive augmented reality applications.

Hevcode Team
January 17, 2026

Augmented Reality (AR) and Virtual Reality (VR) are transforming mobile experiences. This guide covers frameworks, implementation patterns, and practical examples for building immersive mobile applications.

AR vs VR: Understanding the Difference

Augmented Reality (AR):
├── Overlays digital content on real world
├── Uses device camera
├── User sees real environment
└── Examples: Pokémon GO, IKEA Place, Snapchat filters

Virtual Reality (VR):
├── Fully immersive digital environment
├── Uses headset (Quest, Vision Pro)
├── User isolated from real world
└── Examples: Beat Saber, VRChat, training simulations

Mixed Reality (MR):
├── Digital objects interact with real world
├── Uses advanced sensors
├── Most immersive on devices like Vision Pro
└── Examples: Industrial training, surgical planning

Mobile AR Frameworks

ARKit (iOS)

Apple's AR framework for iPhone and iPad.

import ARKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure AR session
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        configuration.environmentTexturing = .automatic

        // Enable features
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }

        arView.session.run(configuration)

        // Add tap gesture for placing objects
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        arView.addGestureRecognizer(tapGesture)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: arView)

        // Raycast to find surface
        let results = arView.raycast(from: location, allowing: .estimatedPlane, alignment: .horizontal)

        if let firstResult = results.first {
            // Place 3D object
            placeObject(at: firstResult)
        }
    }

    func placeObject(at raycastResult: ARRaycastResult) {
        // Create anchor
        let anchor = AnchorEntity(world: raycastResult.worldTransform)

        // Load 3D model (use try? so a missing asset doesn't crash the app)
        guard let modelEntity = try? ModelEntity.loadModel(named: "robot") else { return }
        modelEntity.scale = SIMD3<Float>(repeating: 0.1)

        // Add to scene
        anchor.addChild(modelEntity)
        arView.scene.addAnchor(anchor)
    }
}

ARCore (Android)

Google's AR framework for Android devices. The snippets below use the Sceneform library (now community-maintained) on top of ARCore for scene rendering.

import android.net.Uri
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.core.*
import com.google.ar.sceneform.*
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

class ARActivity : AppCompatActivity() {
    private lateinit var arFragment: ArFragment
    private var session: Session? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_ar)

        arFragment = supportFragmentManager.findFragmentById(R.id.ar_fragment) as ArFragment

        // Configure AR session
        arFragment.arSceneView.scene.addOnUpdateListener { frameTime ->
            onUpdate(frameTime)
        }

        // Handle tap for placing objects
        arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
            placeObject(hitResult)
        }
    }

    private fun onUpdate(frameTime: FrameTime) {
        val frame = arFragment.arSceneView.arFrame ?: return

        // Get detected planes
        val planes = frame.getUpdatedTrackables(Plane::class.java)
        for (plane in planes) {
            if (plane.trackingState == TrackingState.TRACKING) {
                // Plane detected - show indicator
            }
        }
    }

    private fun placeObject(hitResult: HitResult) {
        // Create anchor
        val anchor = hitResult.createAnchor()
        val anchorNode = AnchorNode(anchor)
        anchorNode.setParent(arFragment.arSceneView.scene)

        // Load 3D model
        ModelRenderable.builder()
            .setSource(this, Uri.parse("robot.glb"))
            .build()
            .thenAccept { renderable ->
                val node = TransformableNode(arFragment.transformationSystem)
                node.renderable = renderable
                node.setParent(anchorNode)
                node.select()
            }
    }
}

Cross-Platform: Unity AR Foundation

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARPlacement : MonoBehaviour
{
    public GameObject objectToPlace;
    public ARRaycastManager raycastManager;
    public ARPlaneManager planeManager;

    private List<ARRaycastHit> hits = new List<ARRaycastHit>();
    private GameObject placedObject;

    void Update()
    {
        // Check for touch input
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            if (touch.phase == TouchPhase.Began)
            {
                // Raycast against detected planes
                if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
                {
                    Pose hitPose = hits[0].pose;
                    PlaceObject(hitPose);
                }
            }
        }
    }

    void PlaceObject(Pose pose)
    {
        if (placedObject == null)
        {
            placedObject = Instantiate(objectToPlace, pose.position, pose.rotation);
        }
        else
        {
            placedObject.transform.position = pose.position;
            placedObject.transform.rotation = pose.rotation;
        }
    }
}

Common AR Features

1. Plane Detection

// iOS ARKit Plane Detection
class PlaneDetectionViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]

        arView.session.delegate = self
        arView.session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let planeAnchor = anchor as? ARPlaneAnchor {
                // Plane detected
                let planeType = planeAnchor.alignment == .horizontal ? "Floor/Table" : "Wall"
                print("Detected \(planeType) plane")

                // Add visual indicator
                addPlaneVisualization(for: planeAnchor)
            }
        }
    }

    func addPlaneVisualization(for anchor: ARPlaneAnchor) {
        let planeEntity = ModelEntity(
            mesh: .generatePlane(width: anchor.extent.x, depth: anchor.extent.z),
            materials: [SimpleMaterial(color: .blue.withAlphaComponent(0.3), isMetallic: false)]
        )

        let anchorEntity = AnchorEntity(anchor: anchor)
        anchorEntity.addChild(planeEntity)
        arView.scene.addAnchor(anchorEntity)
    }
}

2. Image Recognition

// ARKit Image Recognition
class ImageRecognitionVC: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources",
            bundle: nil
        ) else { return }

        let config = ARWorldTrackingConfiguration()
        config.detectionImages = referenceImages
        config.maximumNumberOfTrackedImages = 4

        arView.session.delegate = self
        arView.session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let imageAnchor = anchor as? ARImageAnchor {
                let imageName = imageAnchor.referenceImage.name ?? "Unknown"
                print("Detected image: \(imageName)")

                // Place content on detected image
                placeContentOnImage(imageAnchor)
            }
        }
    }

    func placeContentOnImage(_ imageAnchor: ARImageAnchor) {
        let imageSize = imageAnchor.referenceImage.physicalSize

        // Play a video on top of the detected image (requires AVFoundation)
        guard let videoURL = Bundle.main.url(forResource: "video", withExtension: "mp4") else { return }
        let player = AVPlayer(playerItem: AVPlayerItem(url: videoURL))

        let videoMaterial = VideoMaterial(avPlayer: player)
        let videoPlane = ModelEntity(
            mesh: .generatePlane(width: Float(imageSize.width), depth: Float(imageSize.height)),
            materials: [videoMaterial]
        )

        let anchorEntity = AnchorEntity(anchor: imageAnchor)
        anchorEntity.addChild(videoPlane)
        arView.scene.addAnchor(anchorEntity)

        player.play()
    }
}

3. Face Tracking

// ARKit Face Tracking (TrueDepth camera required)
class FaceTrackingVC: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking not supported")
            return
        }

        let config = ARFaceTrackingConfiguration()
        config.isLightEstimationEnabled = true

        arView.session.delegate = self
        arView.session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let faceAnchor = anchor as? ARFaceAnchor else { continue }

            // Get facial expressions
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let eyeBrowUp = faceAnchor.blendShapes[.browInnerUp]?.floatValue ?? 0
            let eyesClosed = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0

            // Apply to 3D character
            updateCharacterFace(smile: smile, eyeBrow: eyeBrowUp, eyesClosed: eyesClosed)

            // Get head position/rotation
            let headTransform = faceAnchor.transform
            let headPosition = SIMD3<Float>(headTransform.columns.3.x,
                                            headTransform.columns.3.y,
                                            headTransform.columns.3.z)
        }
    }

    func updateCharacterFace(smile: Float, eyeBrow: Float, eyesClosed: Float) {
        // Update 3D character blend shapes
    }
}

4. Object Occlusion

// People Occlusion (ARKit 3+)
class OcclusionVC: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal

        // Enable people occlusion
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        // Enable scene mesh occlusion (LiDAR)
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh

            // Virtual objects appear behind real objects
            arView.environment.sceneUnderstanding.options.insert(.occlusion)
        }

        arView.session.run(config)
    }
}

VR Development

Vision Pro (visionOS)

import SwiftUI
import RealityKit
import UIKit // UIColor is used for the random material color in the tap handler

@main
struct MyVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
        }
    }
}

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Create immersive environment
            let floor = ModelEntity(
                mesh: .generatePlane(width: 50, depth: 50),
                materials: [SimpleMaterial(color: .gray, isMetallic: false)]
            )

            // Add 3D content
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.5),
                materials: [SimpleMaterial(color: .blue, isMetallic: true)]
            )
            sphere.position = [0, 1.5, -2]

            // Enable hand tracking interaction
            sphere.components.set(InputTargetComponent())
            sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.5)]))

            content.add(floor)
            content.add(sphere)
        }
        .gesture(TapGesture().targetedToAnyEntity().onEnded { value in
            // Handle tap on entity: recolor it with a random hue
            let entity = value.entity
            let randomColor = UIColor(hue: .random(in: 0...1), saturation: 0.8, brightness: 0.9, alpha: 1)
            entity.components[ModelComponent.self]?.materials = [
                SimpleMaterial(color: randomColor, isMetallic: true)
            ]
        })
    }
}

Quest VR with Unity

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Interaction.Toolkit;

public class VRController : MonoBehaviour
{
    public XRController leftController;
    public XRController rightController;
    public GameObject teleportRay;

    private InputDevice leftHand;
    private InputDevice rightHand;

    void Start()
    {
        // Get hand devices
        var leftHandDevices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.LeftHand, leftHandDevices);
        if (leftHandDevices.Count > 0) leftHand = leftHandDevices[0];

        var rightHandDevices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.RightHand, rightHandDevices);
        if (rightHandDevices.Count > 0) rightHand = rightHandDevices[0];
    }

    void Update()
    {
        // Check trigger button
        if (rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool triggerPressed))
        {
            if (triggerPressed)
            {
                OnTriggerPressed();
            }
        }

        // Check grip button
        if (rightHand.TryGetFeatureValue(CommonUsages.gripButton, out bool gripPressed))
        {
            if (gripPressed)
            {
                OnGripPressed();
            }
        }

        // Teleportation with thumbstick
        if (leftHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 thumbstick))
        {
            if (thumbstick.y > 0.5f)
            {
                ShowTeleportRay();
            }
        }
    }

    void OnTriggerPressed()
    {
        // Shoot or interact
    }

    void OnGripPressed()
    {
        // Grab object
    }

    void ShowTeleportRay()
    {
        teleportRay.SetActive(true);
    }
}

AR Use Cases

E-Commerce: Product Visualization

// React Native AR Product Viewer (Viro community fork)
import React, { useState } from 'react';
import { View, Button } from 'react-native';
import { ViroARScene, ViroARSceneNavigator, Viro3DObject, ViroNode } from '@viro-community/react-viro';

function ProductARScene({ productModel }) {
  const [scale, setScale] = useState([0.1, 0.1, 0.1]);
  const [rotation, setRotation] = useState([0, 0, 0]);

  return (
    <ViroARScene>
      <ViroNode
        position={[0, 0, -1]}
        dragType="FixedToWorld"
        onDrag={() => {}}
      >
        <Viro3DObject
          source={{ uri: productModel.modelUrl }}
          resources={[
            { uri: productModel.textureUrl },
          ]}
          scale={scale}
          rotation={rotation}
          type="GLB"
          onPinch={(pinchState, scaleFactor) => {
            if (pinchState === 3) {
              setScale(prev => prev.map(s => s * scaleFactor));
            }
          }}
          onRotate={(rotateState, factor) => {
            if (rotateState === 3) {
              setRotation(prev => [prev[0], prev[1] + factor, prev[2]]);
            }
          }}
        />
      </ViroNode>
    </ViroARScene>
  );
}

function ARProductViewer({ product }) {
  return (
    <View style={{ flex: 1 }}>
      <ViroARSceneNavigator
        initialScene={{
          scene: () => <ProductARScene productModel={product.arModel} />
        }}
      />

      {/* styles, captureScreenshot and shareAR are defined elsewhere in the screen */}
      <View style={styles.controls}>
        <Button title="Take Photo" onPress={captureScreenshot} />
        <Button title="Share" onPress={shareAR} />
      </View>
    </View>
  );
}

Education: Interactive Learning

// ARKit Educational App (Combine is used for the async model load)
class AnatomyARViewController: UIViewController {
    @IBOutlet var arView: ARView!
    var currentOrgan: Entity?
    private var cancellables = Set<AnyCancellable>()

    func loadHeartModel() {
        let anchor = AnchorEntity(plane: .horizontal)

        // Load the anatomical model asynchronously (the publisher can fail, so handle completion)
        Entity.loadModelAsync(named: "heart")
            .sink(receiveCompletion: { completion in
                if case .failure(let error) = completion {
                    print("Failed to load heart model: \(error)")
                }
            }, receiveValue: { [weak self] entity in
                self?.currentOrgan = entity

                // Add labels
                self?.addLabel(to: entity, text: "Left Ventricle", position: [-0.02, 0.05, 0])
                self?.addLabel(to: entity, text: "Right Atrium", position: [0.03, 0.08, 0])

                // Add tap interaction
                entity.generateCollisionShapes(recursive: true)

                anchor.addChild(entity)
                self?.arView.scene.addAnchor(anchor)
            })
            .store(in: &cancellables)
    }

    func addLabel(to entity: Entity, text: String, position: SIMD3<Float>) {
        let textMesh = MeshResource.generateText(
            text,
            extrusionDepth: 0.001,
            font: .systemFont(ofSize: 0.01)
        )

        let textEntity = ModelEntity(mesh: textMesh)
        textEntity.position = position

        // Keep the label facing the camera (BillboardComponent requires a recent RealityKit version)
        textEntity.components.set(BillboardComponent())

        entity.addChild(textEntity)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: arView)

        if let entity = arView.entity(at: location) {
            // Show detail popup for tapped organ part
            showOrganDetail(entity.name)
        }
    }
}

Industrial: Maintenance Assistance

// ARCore Remote Assistance (Sceneform + Cloud Anchors)
enum class AnnotationType { ARROW, WARNING, INFO }

class MaintenanceARActivity : AppCompatActivity() {
    private lateinit var arFragment: ArFragment
    private val annotations = mutableListOf<AnchorNode>()

    fun addAnnotation(hitResult: HitResult, type: AnnotationType) {
        val anchor = hitResult.createAnchor()
        val anchorNode = AnchorNode(anchor)
        anchorNode.setParent(arFragment.arSceneView.scene)

        when (type) {
            AnnotationType.ARROW -> addArrowAnnotation(anchorNode)
            AnnotationType.WARNING -> addWarningAnnotation(anchorNode)
            AnnotationType.INFO -> addInfoAnnotation(anchorNode)
        }

        annotations.add(anchorNode)

        // Sync annotation to cloud for remote viewing
        syncAnnotationToCloud(anchor, type)
    }

    private fun addArrowAnnotation(parent: AnchorNode) {
        ModelRenderable.builder()
            .setSource(this, Uri.parse("arrow.glb"))
            .build()
            .thenAccept { renderable ->
                val node = Node()
                node.renderable = renderable
                node.setParent(parent)

                // Add animation
                val animator = node.renderableInstance?.animate(true)
                animator?.apply {
                    repeatCount = ValueAnimator.INFINITE
                    start()
                }
            }
    }

    private fun syncAnnotationToCloud(anchor: Anchor, type: AnnotationType) {
        // Use Cloud Anchors for sharing; hosting is asynchronous, so poll
        // cloudAnchor.cloudAnchorState until hosting succeeds before reading the ID
        arFragment.arSceneView.session?.hostCloudAnchor(anchor)?.let { cloudAnchor ->
            // Once hosting succeeds, send cloudAnchor.cloudAnchorId to the remote user
        }
    }

    // Receive remote annotations
    fun resolveRemoteAnnotation(cloudAnchorId: String) {
        arFragment.arSceneView.session?.resolveCloudAnchor(cloudAnchorId)?.let { anchor ->
            // Display received annotation
        }
    }
}

Performance Optimization

AR Performance Tips:
├── 3D Models
│   ├── Use LOD (Level of Detail)
│   ├── Optimize polygon count (<100K for mobile)
│   ├── Compress textures
│   └── Use glTF/USDZ formats
├── Rendering
│   ├── Limit draw calls
│   ├── Use occlusion culling
│   ├── Batch static objects
│   └── Reduce shader complexity
├── Tracking
│   ├── Limit tracked images to 4-5
│   ├── Use appropriate plane detection
│   └── Disable unused tracking features
└── General
    ├── Target 60fps
    ├── Monitor thermal throttling
    ├── Reduce physics complexity
    └── Use async loading (see the sketch below)
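
The last tip, async loading, is worth a concrete sketch. The snippet below (RealityKit, matching the earlier iOS examples) loads a model with the same Combine publisher used in the education example so a large asset doesn't stall the UI; the AsyncModelLoader type and the "robot" asset name are placeholders for illustration.

import RealityKit
import Combine

// Minimal sketch: load a model asynchronously and attach it only once loading
// completes, so large USDZ/glTF assets don't block the main thread.
// "robot" (the default asset name) is a placeholder.
final class AsyncModelLoader {
    private var cancellables = Set<AnyCancellable>()

    func loadAndPlace(named assetName: String = "robot",
                      on anchor: AnchorEntity,
                      in arView: ARView) {
        Entity.loadModelAsync(named: assetName)
            .receive(on: RunLoop.main)
            .sink(
                receiveCompletion: { completion in
                    if case .failure(let error) = completion {
                        print("Model load failed: \(error)")
                    }
                },
                receiveValue: { model in
                    // Attach the entity only after the asset has fully loaded
                    anchor.addChild(model)
                    arView.scene.addAnchor(anchor)
                }
            )
            .store(in: &cancellables)
    }
}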

Conclusion

AR/VR development is becoming mainstream:

  1. AR is accessible: Most modern smartphones support ARKit or ARCore (see the capability check after this list)
  2. Tools are mature: Unity, RealityKit make development easier
  3. Use cases are proven: Retail, education, industrial applications work
  4. VR is expanding: Vision Pro brings spatial computing mainstream
  5. Start simple: Begin with basic plane detection, add complexity
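
Point 1 can be verified at runtime. As a minimal sketch (ARKit, consistent with the configuration code earlier in this guide), check device capability before enabling AR features and fall back gracefully on unsupported hardware:

import ARKit

// Minimal sketch: gate AR features on device capability before running a session
func makeARConfiguration() -> ARWorldTrackingConfiguration? {
    // World tracking is available on most modern iPhones and iPads, but not all
    guard ARWorldTrackingConfiguration.isSupported else {
        return nil // fall back to a non-AR experience
    }

    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]

    // Optional capabilities (e.g. LiDAR scene reconstruction) vary per device
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    return config
}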

The technology is ready - focus on solving real user problems.

Need help building an AR/VR application? Contact Hevcode for expert guidance on immersive app development.

Tags: AR, VR, ARKit, ARCore, Augmented Reality, Mobile Development
