Sample Swift app that takes a picture, sends it to Microsoft's Face API to detect emotion and draws emoji to match.
Setup:

- Get a Cognitive Services API key from Microsoft
- Clone the repo
- Open the workspace and replace the `COGNATIVE_SERVICE_API_KEY` constant value in AnnotatedPictureViewController.swift with your API key (see the sketch after these steps)
- Open the project file and set Team under Signing
- That's it! Run the app, snap away, and start detecting emotions!
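For reference, the constant you're replacing looks something like the line below; this is a sketch, and the exact declaration in AnnotatedPictureViewController.swift may differ:

```swift
// Paste the key from the Azure portal in place of the placeholder.
let COGNATIVE_SERVICE_API_KEY = "<your-api-key>"
```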
Dependencies:

- PromiseKit ~> 4.4
- Alamofire = 4.8
Features:

- Uses Microsoft's Face API to detect emotions (a request sketch follows this list). This app previously used the Emotion API, which is being deprecated; Emotion API features have been integrated into the Face API.
- Takes a picture and detects emotions immediately; no need to access the user's Photo Library.
- Supports multiple faces.
- Once a picture is analyzed, it is annotated with a rectangle and a matching emoji (a drawing sketch also follows this list).
- Quickly dismiss the annotated picture by swiping down.
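For the curious, here is a minimal sketch of what the detect request can look like with PromiseKit 4 and Alamofire 4. The Azure region (`westus`), function name, and response handling are assumptions, not the app's exact code:

```swift
import Alamofire
import PromiseKit

// Sketch of the Face API detect call using the app's two dependencies.
// The region, function name, and response handling are illustrative.
func detectEmotions(in imageData: Data) -> Promise<[[String: Any]]> {
    let endpoint = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
        + "?returnFaceAttributes=emotion"
    var request = URLRequest(url: URL(string: endpoint)!)
    request.httpMethod = "POST"
    request.setValue(COGNATIVE_SERVICE_API_KEY,
                     forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.setValue("application/octet-stream",
                     forHTTPHeaderField: "Content-Type")
    request.httpBody = imageData

    return Promise { fulfill, reject in
        Alamofire.request(request).validate().responseJSON { response in
            switch response.result {
            case .success(let json):
                // Each element describes one detected face, including its
                // faceRectangle and faceAttributes.emotion scores.
                fulfill(json as? [[String: Any]] ?? [])
            case .failure(let error):
                reject(error)
            }
        }
    }
}
```

The response is one object per detected face, each carrying a `faceRectangle` and per-emotion scores under `faceAttributes.emotion`, which is what gets mapped to an emoji.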
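The rectangle-and-emoji annotation can be done with a `UIGraphicsImageRenderer`; this is a hypothetical sketch with illustrative names, not the app's actual drawing code:

```swift
import UIKit

// Sketch: draw the picture, then a rectangle and emoji over each face.
func annotate(_ picture: UIImage, faceRects: [CGRect], emojis: [String]) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: picture.size)
    return renderer.image { context in
        picture.draw(at: .zero)
        for (rect, emoji) in zip(faceRects, emojis) {
            // Outline the detected face.
            context.cgContext.setStrokeColor(UIColor.yellow.cgColor)
            context.cgContext.setLineWidth(4)
            context.cgContext.stroke(rect)
            // Draw the matching emoji inside the rectangle.
            let attributes: [NSAttributedString.Key: Any] = [
                .font: UIFont.systemFont(ofSize: rect.height * 0.8)
            ]
            (emoji as NSString).draw(in: rect, withAttributes: attributes)
        }
    }
}
```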
Technologies used:

- AVFoundation (capture sketch below)
  - AVCaptureSession
  - AVCaptureVideoPreviewLayer
  - AVCapturePhotoOutput
- UIKit (dismissal sketch below)
  - UITapGestureRecognizer
  - UIVisualEffectView
  - UIPresentationController / UIViewControllerTransitioningDelegate
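A minimal sketch of how the AVFoundation pieces listed above fit together; class and property names are illustrative, and the app's actual wiring may differ:

```swift
import AVFoundation
import UIKit

// Sketch of the capture pipeline: session -> preview layer -> photo output.
final class CameraController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure(previewIn view: UIView) throws {
        // Wire the camera into the session.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        // Show the live camera feed.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        session.startRunning()
    }

    func takePicture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback with the captured photo (iOS 11+ signature).
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let data = photo.fileDataRepresentation() else { return }
        // Hand `data` to the Face API request sketched earlier.
        _ = data
    }
}
```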
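The swipe-down dismissal can be wired with a single gesture recognizer. The class name here is illustrative, and the custom UIPresentationController / UIViewControllerTransitioningDelegate pair the app uses for the presentation itself is omitted:

```swift
import UIKit

// Sketch of the swipe-to-dismiss feature on the annotated picture screen.
final class DismissableViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let swipe = UISwipeGestureRecognizer(target: self,
                                             action: #selector(didSwipeDown))
        swipe.direction = .down
        view.addGestureRecognizer(swipe)
    }

    @objc private func didSwipeDown() {
        dismiss(animated: true)
    }
}
```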