Exploring Apple ARKit with sample application


Apple's ARKit is an augmented reality platform for devices running iOS. ARKit allows developers to build detailed AR experiences for their users by capturing the environment and placing characters, objects, and 3D virtual text in that environment.

ARKit was released by Apple in June 2017. It opened a wide range of possibilities for iOS developers to explore a whole new class of experiences.

ARKit uses the iOS device's camera, accelerometer, gyroscope, and context awareness to map the environment as the device is moved.

ARKit combines camera images with data from the inertial sensors to pick out visual features in the environment, such as planes, and to track motion. The camera feed is also used to estimate light sources.


What is Apple ARKit?

ARKit is a development framework that enables iOS app developers to build AR experiences into their applications quickly and easily.

ARKit uses Visual Inertial Odometry to track the world around the iPhone. It not only tracks the room's layout but also tracks tables and other objects.

Recently, Apple introduced ARKit 3, RealityKit, and Reality Composer – a collection of tools that make AR development easier.

ARKit 3 offers major improvements which include –

  • Detection of up to 100 images at a time
  • Better 3D object detection
  • Estimation of the physical size of an image
  • Simultaneous use of the front and back cameras
  • Multiple face tracking of up to three faces at once

Jumping into the world of AR can be a little daunting for people who are not familiar with the technology, but I will try to make it easy for you through this blog.


Getting started with Apple ARKit

We will be developing a small project using ARKit 2. ARKit 2 introduced several new features and improvements to the API. To get started, you need –

  • Xcode 10 or later
  • iOS 12 or later
  • A physical Apple device with an A9 chip or later (ARKit does not run on older hardware)

We can't use the simulator to debug an AR application, as the simulator doesn't provide the camera and other sensors required to configure the AR world.


How Does Apple ARKit Work?

Apple ARKit uses a process called Visual Inertial Odometry (VIO) to combine the device's motion sensor data with visual information from the camera to track the real world. The framework then does the math to map this tracking information from the real 3D world onto your 2D screen.

ARKit also measures the amount of available light using the camera and applies the same lighting to the AR objects within the scene.

The camera is also used to detect all the possible objects in the frame. As you move the camera, ARKit constantly refines the information.


Setting up a project in Xcode

To set up a project in Xcode, follow these steps –

  • Create a new project in Xcode
  • Select the Augmented Reality App template
  • Enter a name for the project
  • Choose a location to save the project.

Congratulations, you have created your first AR project. Any doubts? Just run the app and see the results.

When you run the app, you will be asked for camera permission. Allow it, and you will see a spaceship hanging in the air. Isn't it cool! You can look at the spaceship from any distance and at any orientation angle. It won't move, as it is anchored in the 3D world relative to your phone's starting position.

Looking at the code, you will see that there is a scene view outlet connected in your ViewController –

@IBOutlet var sceneView: ARSCNView!

A scene containing the spaceship, located in the art.scnassets folder, is added in viewDidLoad.

Run the sceneView session with ARWorldTrackingConfiguration so that the iPhone tracks the real world around it.
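The default Xcode template already does this in viewWillAppear and pauses the session in viewWillDisappear; a minimal version looks like this –

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a world-tracking configuration and start the AR session.
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the session when the view is no longer visible.
    sceneView.session.pause()
}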


Creating our own node or object

First, remove the scene created by default, i.e. delete these lines of code from viewDidLoad –

  • let scene = SCNScene(named: "art.scnassets/ship.scn")!
  • sceneView.scene = scene

Let’s set up the camera 

  • Create a camera setup function (sketched below) and call it from viewDidLoad first.
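The function body isn't included in the post; a minimal sketch, assuming we simply tune the scene's camera for HDR light estimation (these are standard SCNCamera properties; the exposure values are assumptions) –

func setupCamera() {
    // Grab the camera node that ARSCNView manages for us.
    guard let camera = sceneView.pointOfView?.camera else { return }

    // Enable HDR so environmental light estimation looks realistic.
    camera.wantsHDR = true
    camera.exposureOffset = -1       // assumed tuning values
    camera.minimumExposure = -1
    camera.maximumExposure = 3
}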
  • Create an SCNNode property named object.

var object = SCNNode()

  • Now specify a position in 3 dimensions relative to your phone, for example (0, 0, -0.6) as (x, y, z). Remember that z should be negative so that the object is placed in front of the camera (SceneKit's camera looks down the negative z-axis) and can be seen.

let position = SCNVector3(0, 0, -0.6)

  • Download a rectangular image, save it in art.scnassets, and name it arimage.jpg
  • Keep the object variable as a property so that we can track it and update it later at runtime.
  • Create a function that makes a spherical node with the image wrapped around it, as if the image were the wrapper of a round object (a sketch follows).
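The post doesn't include the original listing; a minimal sketch, assuming the createObject(at:) name used in the next step and the arimage.jpg texture from above –

func createObject(at position: SCNVector3) -> SCNNode {
    // A small sphere, 10 cm in radius (assumed size).
    let sphere = SCNSphere(radius: 0.1)

    // Wrap the downloaded image around the sphere as its diffuse texture.
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "art.scnassets/arimage.jpg")
    sphere.materials = [material]

    let node = SCNNode(geometry: sphere)
    node.position = position
    return node
}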

  • Now call this function from viewDidLoad to place the object at a distance of 0.6 meters from the iPhone when the view controller loads.

object = createObject(at: position)

  • Add this object as a child of the root node of sceneView's scene by adding this line after creating the object.

sceneView.scene.rootNode.addChildNode(object)

  • Now run the app and see your first self-made object in the augmented reality world.

Repositioning the object

Our next step is to remove the object if the object itself is tapped, and to add it at the tapped location if the tap lands anywhere other than the object.

First, we add the following helper method to our class.

It will return the first virtual object found at a given screen point in our augmented world.
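The original listing isn't shown in the post; a minimal sketch, assuming the object property from earlier and a standard SceneKit hit test –

func existingObject(at point: CGPoint) -> SCNNode? {
    // Hit-test the SceneKit scene at the screen point; bounding-box
    // tests are cheap and good enough for picking.
    let options: [SCNHitTestOption: Any] = [.boundingBoxOnly: true]
    let results = sceneView.hitTest(point, options: options)

    // Return the node only if it belongs to our placed object.
    return results.first(where: { $0.node === object })?.node
}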

  • Make an array of floats to save the recent distances between the camera and the object's position, so that placement can be smoothed.

var recentVirtualObjectDistances = [Float]()

  • Make a variable selectedObject to track the object being rendered, as you might render many objects in the future. For now, I'm only rendering one object.

var selectedObject: SCNNode?

Now add all these methods to your class. They will do a hit test at the tapped position to find the possible points where the object can be placed, and then move the object to the one closest to the tapped location.
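The post's listing is omitted here; a minimal sketch of the placement method, assuming the recentVirtualObjectDistances array above and the float4x4/Array helpers from the extensions described next (existingPlaneUsingExtent and featurePoint are standard ARKit 2 hit-test types) –

func setPosition(of node: SCNNode, from point: CGPoint) {
    // Prefer detected planes; fall back to raw feature points.
    let results = sceneView.hitTest(point, types: [.existingPlaneUsingExtent, .featurePoint])
    guard let closest = results.first else { return }

    var translation = closest.worldTransform.translation

    // Smooth the camera-to-object distance over the recent hits so the
    // object doesn't jump around as hit-test quality fluctuates.
    if let camera = sceneView.session.currentFrame?.camera {
        let cameraPosition = camera.transform.translation
        let distance = simd_length(translation - cameraPosition)
        recentVirtualObjectDistances.append(distance)
        recentVirtualObjectDistances = Array(recentVirtualObjectDistances.suffix(10))

        if let averageDistance = recentVirtualObjectDistances.average {
            let direction = simd_normalize(translation - cameraPosition)
            translation = cameraPosition + direction * averageDistance
        }
    }

    node.simdPosition = translation
}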

Add these extensions to the project. They convert the float4x4 values returned by the hit test into a position we can use, compute the average of an array of float values, track camera state, etc.
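Sketches of two of those helpers, modeled on common ARKit sample conventions (an assumption, not the post's original listing) –

extension float4x4 {
    // The translation (position) component of the transform matrix.
    var translation: SIMD3<Float> {
        let t = columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }
}

extension Array where Element == Float {
    // The average of all values, or nil for an empty array.
    var average: Float? {
        guard !isEmpty else { return nil }
        return reduce(0, +) / Float(count)
    }
}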


Add a tap gesture in viewDidLoad and make your ViewController its delegate

Now let's add the tap gesture handler to hide and show our object.
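A sketch of both pieces, assuming a handler named didTap(_:) (an assumed name) and that the ViewController adopts UIGestureRecognizerDelegate –

// In viewDidLoad:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
tapGesture.delegate = self
sceneView.addGestureRecognizer(tapGesture)

// The handler: remove the object when it is tapped, otherwise place
// it at the tapped location.
@objc func didTap(_ gesture: UITapGestureRecognizer) {
    let point = gesture.location(in: sceneView)

    if let tapped = existingObject(at: point) {
        tapped.removeFromParentNode()
    } else {
        setPosition(of: object, from: point)
        if object.parent == nil {
            sceneView.scene.rootNode.addChildNode(object)
        }
    }
}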

That's a lot of code, isn't it? Yes, but it needs to be added only once and can be reused many times. So it's a one-time ache; go on with it ;).

Moving the object

Let's move on to the fourth step, which is to make our object move according to a drag gesture on the screen. Let me take you through the steps we are going to follow –

Add a pan gesture in viewDidLoad and make your view controller its delegate (see the sketch below).
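A sketch of the registration, assuming a handler named didPan(_:) (an assumed name) and the ThresholdPanGesture subclass introduced next –

// In viewDidLoad:
let panGesture = ThresholdPanGesture(target: self, action: #selector(didPan(_:)))
panGesture.delegate = self
sceneView.addGestureRecognizer(panGesture)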

  • Add a new Cocoa Touch class ThresholdPanGesture, subclassing UIPanGestureRecognizer, and import SceneKit and UIKit.UIGestureRecognizerSubclass

import UIKit.UIGestureRecognizerSubclass

import SceneKit

  • Add all of this code to the class –
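The post's listing is omitted; a sketch in the spirit of Apple's ARKit interaction sample code – the gesture reports a drag only once the touch has moved past a small threshold, so taps and drags don't conflict –

// A pan gesture that ignores small movements until the touch has
// travelled past a threshold distance.
class ThresholdPanGesture: UIPanGestureRecognizer {

    // True once the displacement has exceeded the threshold.
    private(set) var isThresholdExceeded = false

    // Minimum distance (in points) the touch must travel (assumed value).
    private static let threshold: CGFloat = 30

    override var state: UIGestureRecognizer.State {
        didSet {
            switch state {
            case .began, .changed:
                break
            default:
                // Reset when the gesture ends or is cancelled.
                isThresholdExceeded = false
            }
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesMoved(touches, with: event)

        let magnitude = translation(in: view).length
        if !isThresholdExceeded && magnitude > ThresholdPanGesture.threshold {
            isThresholdExceeded = true
            // Re-base the translation so movement is measured from here.
            setTranslation(.zero, in: view)
        }
    }
}

extension CGPoint {
    // Euclidean length of the point treated as a vector.
    var length: CGFloat { return sqrt(x * x + y * y) }
}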

Now add the ARSCNViewDelegate renderer function, which is commented out by default; it is called at runtime for every frame of the screen so you can modify the scene. Delete the commented version and add the following code –
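A sketch of the handler and the per-frame update, assuming a currentTouchLocation property (a hypothetical name) that the pan handler keeps up to date –

var currentTouchLocation: CGPoint?

// Pan handler: start a drag only if the pan began on the object.
@objc func didPan(_ gesture: ThresholdPanGesture) {
    switch gesture.state {
    case .began:
        selectedObject = existingObject(at: gesture.location(in: sceneView))
    case .changed where gesture.isThresholdExceeded:
        currentTouchLocation = gesture.location(in: sceneView)
    default:
        // The drag ended or was cancelled: stop tracking.
        currentTouchLocation = nil
        selectedObject = nil
    }
}

// Called once per frame; moves the selected object under the finger.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    DispatchQueue.main.async {
        guard let object = self.selectedObject,
              let point = self.currentTouchLocation else { return }
        self.setPosition(of: object, from: point)
    }
}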

Now, in each frame, the tapped or dragged position on the screen is monitored; if your drag starts from the object, the object moves to the place you are currently touching on the screen. Otherwise it won't move.

Congratulations, you now have an object in the augmented world that can be placed anywhere and moved anywhere, as if it were dancing on your fingers. What else do you want? Speaking of dancing, I'm reminded of some salsa moves, like rotation.

Rotating the object

Let's make our object rotate in place as well.

    • Since we have already used the tap and pan gestures, we will use a rotation gesture for rotating the object. Add the rotation gesture in viewDidLoad and make your ViewController its delegate.

    • This is how your viewDidLoad function will look once all three gestures are registered (see the sketch after this list).

  • So let's add the rotation handler by adding the code sketched below to the ViewController.

  • It just changes the eulerAngles of the object relative to the screen.
  • So now your object can do some salsa moves too – enjoy the dancing!
  • Feel free to add any other movement to your augmented object.
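For reference, a sketch of the finished viewDidLoad and the rotation handler, using the helper and handler names assumed throughout this post (didTap, didPan, didRotate are our assumed names) –

override func viewDidLoad() {
    super.viewDidLoad()

    sceneView.delegate = self
    setupCamera()

    // Place the object 0.6 m in front of the camera at launch.
    object = createObject(at: SCNVector3(0, 0, -0.6))
    sceneView.scene.rootNode.addChildNode(object)

    // Register the tap, pan, and rotation gestures.
    let tapGesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
    let panGesture = ThresholdPanGesture(target: self, action: #selector(didPan(_:)))
    let rotationGesture = UIRotationGestureRecognizer(target: self, action: #selector(didRotate(_:)))
    for gesture in [tapGesture, panGesture, rotationGesture] {
        gesture.delegate = self
        sceneView.addGestureRecognizer(gesture)
    }
}

// Rotation handler: spin the object around its vertical axis.
@objc func didRotate(_ gesture: UIRotationGestureRecognizer) {
    guard gesture.state == .changed else { return }
    object.eulerAngles.y -= Float(gesture.rotation)
    gesture.rotation = 0
}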

Summary

It is not so difficult to work with ARKit. However, there are some limitations, such as scanning plain white, featureless surfaces and showcasing an object on them.

Other than that, you just need the proper configuration, and the rest will be taken care of by the framework.

Beyond this demo application, we have implemented ARKit in a few of our client projects, and it has been fun to work with.


Hire our experienced iOS app developers

Leveraging agile methodology and experienced iOS developers, we provide our clients with custom iOS app development solutions keeping in mind the design guidelines related to the Apple ecosystem.

Our iOS app development services offer –

ARKit

Build beautiful AR experiences for your users with Apple ARKit 3. With ARKit 3, Apple has introduced Reality Composer, an app that lets you easily create AR experiences; RealityKit, a powerful augmented reality framework; and Motion Capture, which lets you use human movement as input to AR.

iPhone app development

The iOS ecosystem has defined the ideal mobile environment and pushed the boundaries of smart mobile computing. We offer a wide range of iOS app development services to our global clients, and we build and deliver robust, scalable applications for the iPhone.

iPad app development

iPadOS is the latest offering from Apple, layering additional technologies on top of the iOS SDK to leverage the power of the iPad. With iPadOS, you can offer your users a multi-window experience and add full Apple Pencil drawing support to your iPad application.

Apple Watch app development

Give your users an easy way to complete quick actions by developing an application for the Apple Watch. watchOS leverages the power of SwiftUI and new APIs to deliver a robust experience. You can now build independent watchOS apps, without an iOS counterpart, using Core ML and the Neural Engine.

Request a callback from our industry experts.


Frequently Asked Questions

What is Apple ARKit?
Apple ARKit is a software development framework that enables developers to create augmented reality (AR) experiences for iOS devices. It provides tools and APIs for integrating virtual content into real-world environments, allowing for immersive AR applications. ARKit supports features like motion tracking, scene understanding, and light estimation, empowering developers to build interactive and engaging AR experiences for iPhone and iPad users.


Author:

Kuldeep is currently working as the Mobile Tech Lead (iOS) at VT Netzwelt. He has 12+ years of industry experience and has worked on various technology stacks, including .NET, iOS development, hybrid app development, React Native, PhoneGap, WebRTC, Objective-C, Swift, PJSIP, and Core Data. He is currently working with AI and ML Kit.
