Social Support in Quick Link Scan

Tyler Hackbart
3 min read · Mar 9, 2021


If you do not know what Quick Link Scan is, it’s an app that I developed a couple of years ago. The original version only supported URL scanning, letting you pull a URL off a printed poster, takeout menu, business card, or wherever else you may find one, just by scanning it with your phone.

Recently, I added support for social.

The Elevator Pitch

An app that allows you to open social media platforms by scanning an account tag or hashtag.

At the time of writing, this is an iOS-only application built on top of ML Kit from Google’s Firebase framework. When working on the first version back in 2019, Google was king when it came to image processing for text recognition. However, in the last couple of years, Apple has made on-device machine learning very appealing.

Case Study — Google versus Apple for ML Text Recognition

While building out the social update for Quick Link Scan, I did test the new features added to iOS and Swift; however, processing speed and accuracy still depend on the device itself. On an iPhone 12, the accuracy of the text recognition output was good, but when I switched back to my older iPhone X running iOS 14, it dropped the ball. Both processing speed and camera quality, along with shading from light sources, would drastically change the outcome, with Apple’s model often returning a jumbled-letter response.

Now, don’t get me wrong: I was really excited when Apple announced VNRecognizeTextRequest at WWDC 2019. It is a step in the right direction from the limited VNRectangleObservation, but it is not quite there yet.
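To give a sense of what I was testing, here is a minimal sketch of an accurate-level VNRecognizeTextRequest run against a single image. This is not the code in Quick Link Scan, just the shape of the API; the function and variable names are my own for illustration.

```swift
import Vision
import UIKit

/// Runs Apple's on-device text recognition on a single image and
/// returns the best candidate string for each detected line.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep only the top candidate per observation; quality varies a lot by device.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // .fast trades accuracy for speed
    request.usesLanguageCorrection = true

    // Perform the request off the main thread; Vision work can be slow on older devices.
    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```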

Focus on Offline and On-Device Support

The biggest push since the beginning has always been on-device machine learning. Since the original version of Quick Link Scan, I have always focused, and always will, on the real-time, real-place situation of using the app: no worrying about HTTP requests, server processing, or, most importantly, requiring the user to have mobile data turned on while out and about just to use the app. Quick Link Scan is offline first, and on-device processing with Apple’s ML text recognition just can’t cut it yet.
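For comparison, this is roughly what the on-device path looks like with the Firebase ML Kit text recognizer the app is built on. Treat it as a sketch rather than the app’s actual code: the exact pod and class names depend on which ML Kit release you target, and the image handling here is simplified.

```swift
import FirebaseMLVision
import UIKit

/// Recognizes text on-device with ML Kit's text recognizer.
/// No network connection is needed for this step.
func recognizeTextOnDevice(in image: UIImage, completion: @escaping (String?) -> Void) {
    let vision = Vision.vision()
    let textRecognizer = vision.onDeviceTextRecognizer()
    let visionImage = VisionImage(image: image)

    textRecognizer.process(visionImage) { result, error in
        guard error == nil, let result = result else {
            completion(nil)
            return
        }
        // result.text is the full recognized string; blocks and lines give finer structure.
        completion(result.text)
    }
}
```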

Once you scan a social account tag or hashtag, Quick Link Scan does allow you to click into Twitter, Facebook, and Instagram to search and interact with it, which does require a network connection, but pulling those tags out of the image is all done without the internet. This lets a user without 24/7 access to the internet quickly scan a tag and save it for later, then view and interact with it once they have connectivity.
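The tag-pulling step itself is simple enough to sketch. The version below is only an illustration of the general idea, not the app’s actual parsing code: a regular expression finds @handles and #hashtags in the recognized text entirely offline, and only opening the web URL later needs a connection.

```swift
import Foundation

/// A social tag pulled out of scanned text.
enum SocialTag {
    case handle(String)   // e.g. @quicklinkscan
    case hashtag(String)  // e.g. #swift
}

/// Extracts @handles and #hashtags from recognized text. Works entirely offline.
func extractSocialTags(from text: String) -> [SocialTag] {
    let pattern = "[@#][A-Za-z0-9_]+"
    guard let regex = try? NSRegularExpression(pattern: pattern) else { return [] }
    let range = NSRange(text.startIndex..., in: text)
    return regex.matches(in: text, range: range).compactMap { match -> SocialTag? in
        guard let matchRange = Range(match.range, in: text) else { return nil }
        let token = String(text[matchRange])
        let name = String(token.dropFirst())
        if token.hasPrefix("@") {
            return .handle(name)
        } else {
            return .hashtag(name)
        }
    }
}

/// Builds a public web URL for a tag. Opening it is the only step that needs a connection.
func twitterURL(for tag: SocialTag) -> URL? {
    switch tag {
    case .handle(let name):
        return URL(string: "https://twitter.com/\(name)")
    case .hashtag(let name):
        return URL(string: "https://twitter.com/hashtag/\(name)")
    }
}
```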

Google’s ML text recognition still has hiccups, but the accuracy of Google’s models far outperforms what I was getting with Apple’s own native support. The more accurate the outcome, the more likely a user is to keep using Quick Link Scan. On top of that, I tested devices as far back as the iPhone 6s and still got highly accurate results, even with a six-year gap between it and the iPhone 12.

I will continue to watch as VNRecognizeTextRequest grows into a more mature and better-supported feature. Maybe after a future WWDC I will finally be able to switch to Apple’s native support. Until then, Google will continue to be my preferred framework.

Look out for future versions of Quick Link Scan as I continue with the core concept but expand beyond URLs and social tags; more on that in the future.

Some Additional Resources

Text Recognition in Vision Framework — Great session when Apple released VNRecognizeTextRequest.

ML Kit — Google Firebase Framework for machine learning.
