Little Lives Check-In
Face check-in features built into the Little Lives Check-In iOS application.
TECHNOLOGY
BUILD
Swift
UIKit
SnapKit
Vision
CoreData
Alamofire
Azure Face
TOOLS
Xcode
CocoaPods
Jazzy
SwiftLint
PLATFORMS
iOS
ABOUT
DESCRIPTION
Little Lives Check-In is an iOS application optimised for iPads and used to check in pre-schoolers by taking individual or group photos. The application uses face recognition technology to register attendance for individuals in the photos and sends the check-in photo, along with other details, to parents.
MOTIVATION
This application was built with the goal to speed up the check-in process of pre-schoolers. The face recognition solution was explored as it not only could boost efficiency, but also provide entertainment value and novel technical branding to the process. Detecting additional information such as emotions of children at check-in could prove useful to parents. As part of our software engineering module, it was also an opportunity to hone our software architecture design skills.
CHALLENGES
The main issue we faced was the speed and performance of face tracking, especially on older iPads. Since visual computations tend to be expensive, the lack of processing power on older tablet models would result in a sluggish interface and overheating, especially when processing multiple faces simultaneously. We moved the heavy processing to the cloud by using the Azure Face API for face detection, face recognition and emotion detection, and performed object tracking on-device periodically using the iOS Vision API.
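The on-device half of this split can be sketched with Vision's object-tracking API. This is a minimal illustration, not the app's actual code: the tracker is seeded with a bounding box obtained from an authoritative detection pass, then advanced frame by frame at low cost.

```swift
import Vision

// Sketch: periodic on-device tracking of a single face between
// authoritative (cloud) detection passes. The seed observation would
// come from an earlier face-detection result.
final class FaceTracker {
    private var sequenceHandler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation?

    /// Reseed the tracker with a bounding box from a fresh detection pass.
    func reset(to observation: VNDetectedObjectObservation) {
        sequenceHandler = VNSequenceRequestHandler()
        lastObservation = observation
    }

    /// Advance the tracked face into the next camera frame.
    /// Returns the new bounding box in Vision's normalised coordinates.
    func track(in pixelBuffer: CVPixelBuffer) -> CGRect? {
        guard let observation = lastObservation else { return nil }
        let request = VNTrackObjectRequest(detectedObjectObservation: observation)
        request.trackingLevel = .fast // favour frame rate over accuracy on older iPads
        try? sequenceHandler.perform([request], on: pixelBuffer)
        guard let result = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        lastObservation = result
        return result.boundingBox
    }
}
```

One `FaceTracker` instance would be kept per face in frame; `trackingLevel = .fast` trades some accuracy for the higher refresh rate the preview needs.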

Another issue was recognition accuracy for young children. We filtered, normalised and tagged a sizeable set of photos of participating pre-schoolers to train the model on Azure Face. We also proposed that newly taken and normalised profile photos be fed back into training so that the model progressively keeps up with the children's growth.
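The feedback loop above maps onto two Azure Face REST calls: adding a face to a person in a person group, then re-triggering training. The sketch below uses Alamofire 5; `endpoint`, `apiKey`, `groupId` and `personId` are placeholders, not values from the project.

```swift
import Alamofire
import Foundation

// Hedged sketch of updating an Azure Face person-group model with a newly
// normalised profile photo. Paths follow the Azure Face v1.0 REST API.
struct FaceModelUpdater {
    let endpoint: String // e.g. "https://<region>.api.cognitive.microsoft.com/face/v1.0"
    let apiKey: String

    /// Register a new face image against an existing person.
    func addFace(imageData: Data, groupId: String, personId: String,
                 completion: @escaping (Bool) -> Void) {
        let url = "\(endpoint)/persongroups/\(groupId)/persons/\(personId)/persistedFaces"
        let headers: HTTPHeaders = [
            "Ocp-Apim-Subscription-Key": apiKey,
            "Content-Type": "application/octet-stream"
        ]
        AF.upload(imageData, to: url, headers: headers).response { response in
            completion(response.error == nil)
        }
    }

    /// Kick off asynchronous retraining once new faces have been added.
    func train(groupId: String) {
        let url = "\(endpoint)/persongroups/\(groupId)/train"
        AF.request(url, method: .post,
                   headers: ["Ocp-Apim-Subscription-Key": apiKey]).response { _ in }
    }
}
```

Training is asynchronous on Azure's side, so a production version would poll the person group's training status before issuing identify calls against the updated model.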

A key concern was tracking the faces of active children, who tend to move about in the frame. We improved tracking fidelity by toggling between authoritative face detection using the Azure Face API and auxiliary object tracking on the Vision API. Since object tracking is less computation-intensive, we could afford higher refresh rates and more reliable tracking with a smaller performance trade-off.
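The toggling scheme can be sketched as a simple frame counter: one cloud detection pass every N frames, with cheap on-device tracking filling the gaps. The class and helper names here (`detectRemotely`, `trackLocally`) are hypothetical, standing in for the Azure and Vision calls respectively.

```swift
import CoreVideo

// Sketch of the detect/track toggle. Assumes a 30 fps camera feed and one
// authoritative cloud detection per second; every other frame is handled
// by lightweight on-device tracking.
final class HybridFaceLoop {
    private let detectionInterval = 30
    private var frameCount = 0

    func process(_ frame: CVPixelBuffer) {
        frameCount += 1
        if frameCount % detectionInterval == 0 {
            // Authoritative pass: Azure Face detection, recognition and
            // emotion tagging; its results reseed the local trackers.
            detectRemotely(frame)
        } else {
            // Cheap pass: advance existing bounding boxes with Vision tracking.
            trackLocally(frame)
        }
    }

    // Hypothetical hooks into the cloud and on-device pipelines.
    private func detectRemotely(_ frame: CVPixelBuffer) { /* Azure Face request */ }
    private func trackLocally(_ frame: CVPixelBuffer) { /* VNTrackObjectRequest */ }
}
```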
DESIGN
ARCHITECTURE
Little Lives Check-In was written in the VIPER (View, Interactor, Presenter, Entity, Router) architecture, along with Data Access Objects, Service Workers and Adapters for access to networked services, local services and database resources. The design was ultimately geared towards an extensible application that could be integrated with other third-party data storage and recognition services while maintaining the modularity of front-end components.
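As an illustration of what one such module might look like, the protocols below sketch the layer boundaries of a hypothetical check-in screen; the names are illustrative, not taken from the codebase. Each layer talks to its neighbour only through a protocol, which is what keeps the front-end components modular and swappable.

```swift
import CoreGraphics
import Foundation

// Hypothetical VIPER contracts for a check-in module.

protocol CheckInViewProtocol: AnyObject {
    func show(faces: [TrackedFace])           // Presenter -> View
}

protocol CheckInPresenterProtocol: AnyObject {
    func viewDidLoad()                         // View -> Presenter
    func didCapturePhoto(_ data: Data)
}

protocol CheckInInteractorProtocol: AnyObject {
    func recognise(photo: Data)                // Presenter -> Interactor
}

protocol CheckInRouterProtocol: AnyObject {
    func routeToManualTagging()                // Presenter -> Router
}

// Entities are plain models passed between layers.
struct TrackedFace {
    let boundingBox: CGRect
    let name: String?
    let emotion: String?
}
```

Swapping the recognition backend (say, replacing Azure Face) then only touches the Interactor's service adapter, leaving the View and Presenter untouched.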
FEATURES
MULTI-FACE RECOGNITION
Multiple faces can be detected and recognised in a single picture.
LIVE FACE TRACKING
Faces will be boxed and tracked when previewing before taking the photo.
EMOTION DETECTION
All faces will be tagged with their detected emotions if a network connection is available.
MANUAL SEARCH & TAG
If a face is tagged wrongly or has no valid tags, you may manually search for and tag the face using the school registry.
ACCESS CONTROL
Passwords can be set to lock certain views, such as the check-in view, against unauthorised access.
NOTE:
Due to a non-disclosure agreement, the screengrabs do not feature children at the pre-schools. Instead, we used stock photos and the faces of celebrities, politicians and myself.