Getting started with Augmented Reality (iOS)

What is Augmented Reality?
Augment means “to increase the size or value of something by adding something to it”.
Augmented reality, therefore, can be understood as “an enhanced version of reality”, created by using technology to overlay digital information on an image of something being viewed through a device (such as a smartphone camera).

With iOS 11, Apple has released many exciting features for developers. The most interesting of these are the ARKit, Core ML, and Vision frameworks.

ARKit is Apple's native augmented reality framework for apps. It uses computer vision for scene understanding and object recognition, so that virtual objects placed on the display can be context-aware.

Similarly, Core ML is a framework that makes it really easy for developers to integrate machine learning models into their apps. And the uses of the iOS 11 Vision framework range from text, barcode, face, and landmark detection to object tracking and image registration.

You can create some awesome apps using one of these frameworks, or a combination of them. Some common scenarios are:

1. Face Detection – This can be achieved using the Vision framework. You can detect faces, and facial features such as the eyes, nose, ears, and mouth, in still images as well as in a live camera feed by combining Vision with AVFoundation (a minimal code sketch follows at the end of this item).

While looking for AR-related examples, I came across an awesome example that uses the Vision framework for face landmark detection on a live camera feed in iOS 11. Here is the link:
https://github.com/DroidsOnRoids/VisionFaceDetection

Also, you can use the iPhone X's TrueDepth camera with ARKit to place and animate 3D content that follows the user's face and matches its facial expressions. Here is a sample from Apple:
https://developer.apple.com/documentation/arkit/creating_face_based_ar_experiences

Google has also provided a Mobile Vision SDK that can be used to achieve the same:
https://github.com/googlesamples/ios-vision
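
To give a flavour of the Vision API, here is a minimal Swift sketch (an illustration, not taken from the linked projects); it assumes you already have a CGImage from a photo or a camera frame:

import Vision
import CoreGraphics

// A minimal sketch: detect face landmarks in an image using the
// iOS 11 Vision framework. Obtaining the CGImage is left to the caller.
func detectFaceLandmarks(in cgImage: CGImage) {
    let request = VNDetectFaceLandmarksRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0...1)
            print("Face at \(face.boundingBox)")
            if let leftEye = face.landmarks?.leftEye {
                print("Left eye points: \(leftEye.normalizedPoints)")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}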

2. Placing virtual objects – ARKit can detect horizontal planes in the real world, and SceneKit can be used to render virtual objects placed on them (a code sketch follows at the end of this item). All you need is a device with an A9 or later processor (iPhone 6s or newer, iPhone SE, any iPad Pro, or the 2017 iPad), iOS 11, and Xcode 9 or higher.

The supported formats for virtual objects that you can add to a plane include COLLADA (COLLAborative Design Activity), identified by the .dae (digital asset exchange) extension, and Alembic, identified by the .abc extension.
After adding a .dae object to your Xcode project, you can convert it to the SceneKit file format (.scn) by choosing Editor → Convert to SceneKit scene file format (.scn).
However, most of the time, the conversion is not necessary. According to the SCNSceneSource documentation, when you include a model in DAE or Alembic format in your project, Xcode automatically converts the file to SceneKit’s scene format for use in the app, retaining its original .dae or .abc extension.

There is a good example project for tracking planes and placing furniture and other decorative virtual objects anywhere. Here is the link:
https://github.com/ignacio-chiazzo/ARKit
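
As a rough Swift sketch of the idea (an illustration, not the linked project's code), here is how plane detection and object placement fit together; the view controller setup and the model file chair.dae are assumptions:

import UIKit
import ARKit

// A rough sketch: run world tracking with horizontal plane detection and
// drop a (hypothetical) model onto each detected plane.
class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a node for a newly detected anchor
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Xcode converts the .dae model to SceneKit's format at build time
        guard let scene = SCNScene(named: "chair.dae"),
              let chair = scene.rootNode.childNodes.first else { return }
        node.addChildNode(chair)
    }
}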

3. Object Recognition – How do you create an app that recognises the object placed in front of the camera? This can be done by combining ARKit (SceneKit), the Vision framework, and a Core ML model to name the object in front of the camera (a rough sketch follows below). I looked into multiple tutorials on implementing object recognition, and the one I liked most is:
https://github.com/JordanOsterberg/ARKit-Vision-Example
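
As a rough Swift sketch (an illustration, not the tutorial's code), classifying a camera frame could look like this, assuming a Core ML model class such as Apple's downloadable Inceptionv3 has been added to the project:

import Vision
import CoreML

// A rough sketch: classify a camera frame with a Core ML model via Vision.
// Inceptionv3 here stands for whatever model class Xcode generated.
func classify(pixelBuffer: CVPixelBuffer) {
    guard let model = try? VNCoreMLModel(for: Inceptionv3().model) else { return }
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("\(best.identifier) (confidence: \(best.confidence))")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}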


With this basic understanding of augmented reality, and the links to different types of examples, think of some innovative ideas and start implementing interesting apps by combining all these awesome frameworks.

Written By: Neha Gupta


Codesign Installer package for distribution outside the Mac App Store

Sometimes you need to distribute your application installers outside the Mac App Store. You can code sign an installer so that Gatekeeper recognizes it as an identified-developer product. Once you code sign the installer with your Apple Developer ID certificate, Gatekeeper will allow it to open; otherwise it will show a dialog saying “The app cannot be opened because it is from an unidentified developer” (if Gatekeeper is set to ‘Mac App Store and identified developers’).

To know more about Gatekeeper options, see Apple's Gatekeeper documentation.

Here we will see how to sign an installer package so that Gatekeeper won't block it.

Installers created by PackageMaker with the minimum target set to 10.5 or above are flat packages, while installers created with the minimum target set to 10.4 are bundle packages.

Bundle-type installers cannot be signed using a Developer ID Installer certificate. They can be signed using a Developer ID Application certificate, but Gatekeeper will not pass them.

To sign a flat-type installer, first enroll in the Mac Developer Program and download your Developer ID Installer certificate. Double-click the downloaded certificate to load it into your keychain.

Once you have the certificate in your keychain, you can check it via Keychain Access. The certificate will be named something like “Developer ID Installer: Any Name”.

To code-sign your installer package, run the following command in Terminal:

productsign --timestamp=none --sign "Your Certificate Name" "/path/and/name/of/the/unsigned/installer" "/path/and/name/of/signed/installer"

For example, in my case:

productsign --timestamp=none --sign "Developer ID Installer: Neha Gupta" "/myApp.pkg" "/signed/myApp.pkg"

The new installer will be signed with your installer certificate and will be recognized by Gatekeeper as an identified-developer product. To check which certificate a package is signed with, launch the signed installer package and click the lock icon in the upper right corner.
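
You can also verify the signature from Terminal with pkgutil; if the package is signed correctly, the output lists the signing certificate chain:

pkgutil --check-signature "/signed/myApp.pkg"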

Written By: Neha Gupta

Round Cornered Views


Custom NSView with rounded corner:

To draw a custom view with one or more rounded corners, all you need to do is draw a bezier path of the required shape in the view's drawRect: method.
This can also be used to customize any NSView subclass, such as NSTextView or NSTextField.
Here I present the code to create a text field with any one corner rounded. The code can easily be changed to round more than one corner of a view.
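
As a minimal Swift sketch of the idea (an illustration, not the original post's code, which predates this style), here is a view that rounds only its top-left corner:

import AppKit

// A minimal sketch: an NSView that rounds only its top-left corner
// by filling a custom bezier path in draw(_:).
class RoundCornerView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        let radius: CGFloat = 10
        let path = NSBezierPath()
        // Start just past the rounded corner, trace the arc, then the rest
        // of the rectangle (in AppKit's default coordinates, maxY is the top).
        path.move(to: NSPoint(x: bounds.minX + radius, y: bounds.maxY))
        path.appendArc(withCenter: NSPoint(x: bounds.minX + radius, y: bounds.maxY - radius),
                       radius: radius, startAngle: 90, endAngle: 180)
        path.line(to: NSPoint(x: bounds.minX, y: bounds.minY))
        path.line(to: NSPoint(x: bounds.maxX, y: bounds.minY))
        path.line(to: NSPoint(x: bounds.maxX, y: bounds.maxY))
        path.close()
        NSColor.white.setFill()
        path.fill()
    }
}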


Circular Progress Bar


We are all familiar with the native progress bar, but a circular progress bar might be required in a project. In a circular progress bar, the progress of a process is shown by a bar filling in a clockwise or anti-clockwise direction. Here I am providing the code to create a circular progress bar:
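
As a minimal Swift sketch of the drawing involved (an illustration, not the original post's code; the class name is hypothetical):

import AppKit

// A minimal sketch: a view that draws a circular progress arc,
// filling clockwise as `progress` goes from 0 to 1.
class CircularProgressView: NSView {
    var progress: CGFloat = 0 { didSet { needsDisplay = true } }

    override func draw(_ dirtyRect: NSRect) {
        let center = NSPoint(x: bounds.midX, y: bounds.midY)
        let radius = min(bounds.width, bounds.height) / 2 - 5

        // Background track: a full circle
        let track = NSBezierPath()
        track.appendArc(withCenter: center, radius: radius, startAngle: 0, endAngle: 360)
        track.lineWidth = 8
        NSColor.lightGray.setStroke()
        track.stroke()

        // Progress arc: start at 12 o'clock (90 degrees) and sweep clockwise
        let arc = NSBezierPath()
        arc.appendArc(withCenter: center, radius: radius,
                      startAngle: 90, endAngle: 90 - progress * 360, clockwise: true)
        arc.lineWidth = 8
        NSColor.systemBlue.setStroke()
        arc.stroke()
    }
}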


Protocol Buffers with Objective-C


What are Protocol Buffers? Protocol Buffers are a way of encoding structured data: a flexible, efficient, automated mechanism for serializing structured data, like XML but smaller, faster, and simpler. You define how you want your data to be structured once; then you can use specially generated source code to easily write and read your structured data to and from a variety of data streams, using a variety of languages. You can even update your data structure without breaking deployed programs that are compiled against the “old” format.

How to use Protocol Buffers with Objective-C?
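
As a taste of the first step, a minimal .proto schema might look like this (a hypothetical example, not from the original post):

message Person {
  required string name = 1;
  optional int32 id = 2;
  optional string email = 3;
}

A protocol buffer compiler then generates source code, for Objective-C among other languages, for reading and writing Person messages.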


How to get selected text and its coordinates from any system wide application using Accessibility API?


In my previous post, I discussed accessing the text value from any system-wide application using the Accessibility API. Going further, you may want to access only the selected text, or the position of the selected text; we will discuss that in this article.
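
A minimal Swift sketch of reading the selected text (an illustration, not the original post's code), assuming you already have the AXUIElement for the focused element, which the next article covers:

import ApplicationServices

// A minimal sketch: read the selected text of a focused UI element.
// Requires the app to be trusted in Accessibility preferences.
func selectedText(of element: AXUIElement) -> String? {
    var value: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(element,
                     kAXSelectedTextAttribute as CFString, &value)
    guard result == .success else { return nil }
    return value as? String
}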


Accessing text value from any System wide Application via Accessibility API


Suppose I want to create an application that monitors typing, and if “macdevelopers” is typed anywhere, be it in TextEdit or Mail, the application performs an operation, for example automatically opening a website in a browser.

The text value of any system-wide application can be accessed using the Accessibility API. In this article I will discuss accessing the text value from an application such as TextEdit when its currently focused element is a text field or text view. Using this, you can implement functionality to access the active application's text field value.
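
A minimal Swift sketch of the core idea (an illustration, not the original post's code):

import ApplicationServices

// A minimal sketch: find the system-wide focused element and read its
// text value. Requires the app to be trusted in Accessibility preferences.
func focusedElementValue() -> String? {
    let systemWide = AXUIElementCreateSystemWide()
    var focused: CFTypeRef?
    guard AXUIElementCopyAttributeValue(systemWide,
              kAXFocusedUIElementAttribute as CFString, &focused) == .success
    else { return nil }
    let element = focused as! AXUIElement
    var value: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element,
              kAXValueAttribute as CFString, &value) == .success
    else { return nil }
    return value as? String
}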
