iOS Liveness SDK
Our iOS SDK enables seamless integration of real-time liveness detection into your mobile applications. To get started with our SDK, follow the guide below:
Youverify Liveness SDK is the recommended iOS SDK for integrating real-time liveness detection into your SwiftUI-based applications. It supports both passive and active liveness detection, offering a modern and secure way to verify user presence. This SDK is ideal for developers targeting iOS 13 and later, with additional features available on iOS 15 and above.
- Passive Liveness Detection: Analyzes facial features (e.g., blinking, skin texture) to confirm liveness without user interaction. Available on iOS 15+.
- Active Liveness Detection: Requires users to perform specific actions (e.g., head movements, answering questions) to verify liveness. Available on iOS 14+.
- Built with SwiftUI, aligning with modern iOS development practices.
Get started with the SDK in just a few steps:
Add to Your Podfile
Open your project's Podfile and include:
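The original snippet is missing from this page; a minimal Podfile entry might look like the sketch below. The pod name is an assumption, so confirm it against the SDK's release notes:

```ruby
# Podfile
platform :ios, '13.0'

target 'YourAppTarget' do
  use_frameworks!

  # Pod name assumed; confirm against the SDK's release notes.
  pod 'YouverifyLiveness'
end
```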
Install Dependencies
Run this command in your terminal:
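The command referred to above is the standard CocoaPods install step:

```shell
# Run from the directory containing your Podfile
pod install
```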
Open Your Project
Launch the generated .xcworkspace file in Xcode.
Project Configuration
To prevent linker errors, add the following post-install script to your application's Podfile, replacing "YourAppTarget" with the correct target name for your app. The body of the script was lost from this page; the skeleton below shows its structure, and the exact build settings to apply depend on your project:

```ruby
post_install do |installer|
  target_name = "YourAppTarget" # replace with your app's target name

  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      # Adjust the build settings required by the SDK here;
      # consult the full integration guide for the exact values.
    end
  end
end
```
To use the SDK in your SwiftUI view, import it at the top of your file:
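The import statement is missing from this page; assuming the module name matches the `YVLiveness` class used later in this guide, it would be:

```swift
import SwiftUI
// Module name assumed to match the SDK class; adjust if the
// framework ships under a different name.
import YVLiveness
```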
Create an instance of the SDK as a private variable within your SwiftUI view. Pass in the necessary configuration options during initialization:
The `publicKey` can be retrieved dynamically (e.g., from an environment variable), as shown in the working code with `Environment.shared.value(forKey: "API_KEY") ?? ""`.
Optional parameters like `onFailure`, `lastName`, and `email` can be included if needed (see Configuration Options).
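The original listing is missing; initialization might look like this sketch. `YVUser` and the exact initializer labels are assumptions based on the Configuration Options table:

```swift
struct LivenessContainerView: View {
    // Type and parameter names below are assumptions drawn from the
    // Configuration Options table; adjust them to the shipped API.
    private var sdk = YVLiveness(
        publicKey: Environment.shared.value(forKey: "API_KEY") ?? "",
        user: YVUser(firstName: "Ada", lastName: "Lovelace", email: "ada@example.com"),
        sandboxEnvironment: true,
        onSuccess: { data in print("Liveness passed: \(data)") },
        onFailure: { error in print("Liveness failed: \(error)") }
    )

    var body: some View {
        EmptyView() // see "Include the SDK's view" below
    }
}
```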
Trigger the liveness detection process by calling `startSDK()` on your SDK instance. Pass an array of tasks to specify the liveness checks to perform:
- Complete the Circle
- Yes or No
- Motions
- Passive Liveness
See the Tasks section for detailed task options.
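A call combining two of the tasks above might look like this; the enum case names are assumptions, so check the Tasks section for the exact spellings:

```swift
// Task case names are assumptions based on the task list above.
sdk.startSDK(tasks: [
    .completeTheCircle(difficulty: .easy),
    .passiveLiveness
])
```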
Include the SDK's view in your SwiftUI view hierarchy to display the liveness detection interface:
Unlike the previous version, this view should always be present in the view hierarchy (not conditionally rendered). The SDK internally manages when to show the liveness interface based on `startSDK()` calls.
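One way to keep the view unconditionally in the hierarchy is a `ZStack`; the property exposing the SDK's view (`livenessView` here) is an assumption:

```swift
var body: some View {
    ZStack {
        // Your app's own content goes here.
        Text("Tap Verify to begin")

        // Always present; the SDK shows and hides the liveness
        // interface itself in response to startSDK() calls.
        // Property name `livenessView` is an assumption.
        sdk.livenessView
    }
}
```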
The following options can be passed during initialization of `YVLiveness`:
| Option | Type | Required | Description | Default Value | Possible Values |
| --- | --- | --- | --- | --- | --- |
| `publicKey` | String | Yes | Your API key for authentication | - | Any string |
| `user` | Class | Yes | User details (requires `firstName`) | - | See below |
| `user.firstName` | String | Yes | User's first name | - | Any string |
| `user.lastName` | String | No | User's last name | `nil` | Any string |
| `user.email` | String | No | User's email | `nil` | Any string |
| `onSuccess` | Function | No | Callback on success with liveness data | `nil` | Any function |
| `onFailure` | Function | No | Callback on failure with error data | `nil` | Any function |
| `sandboxEnvironment` | Boolean | No | Toggle sandbox mode for testing | `true` | `true`, `false` |
| `tasks` | Array | No | Define liveness tasks for initialization | `nil` | See Tasks |
| `branding` | Class | No | Customize SDK appearance (e.g., color) | `nil` | `color` |
Tasks are interactive challenges or passive checks to confirm liveness. Specify them when calling `startSDK()`. Below are the available tasks:
Users trace a circle with head movements.
| Option | Type | Required | Description | Default | Possible Values |
| --- | --- | --- | --- | --- | --- |
| `difficulty` | TaskDifficulty | No | Task difficulty | `.medium` | `.easy`, `.medium`, `.hard` |
| `timeout` | Number | No | Time (ms) before task fails | `nil` | Any milliseconds |
Example:
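The original example is missing; a sketch follows, with the case and parameter labels assumed from the table above:

```swift
// Case and parameter names are assumptions based on the table above.
let circleTask: YVLivenessTask = .completeTheCircle(
    difficulty: .hard,  // .easy, .medium, or .hard; defaults to .medium
    timeout: 30_000     // fail the task after 30 seconds
)
```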
Users answer questions by tilting their head (right = yes, left = no).
| Option | Type | Required | Description | Default | Possible Values |
| --- | --- | --- | --- | --- | --- |
| `difficulty` | TaskDifficulty | No | Task difficulty | `.medium` | `.easy`, `.medium`, `.hard` |
| `timeout` | Number | No | Time (ms) before task fails | `nil` | Any milliseconds |
| `questions` | Array | Yes | List of yes/no questions | - | See below |
| `questions.question` | String | Yes | Yes/no question | - | Any string |
| `questions.answer` | Bool | Yes | Correct answer | - | `true`, `false` |
| `questions.errorMessage` | String | No | Message on wrong answer | `nil` | Any string |
Example:
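The original example is missing; a sketch follows, with the case name and the `YVQuestion` type assumed from the table above:

```swift
// Type, case, and label names are assumptions based on the table above.
let yesNoTask: YVLivenessTask = .yesOrNo(
    questions: [
        YVQuestion(
            question: "Is the sky blue?",   // tilt right = yes, left = no
            answer: true,
            errorMessage: "That's not quite right."
        )
    ],
    timeout: 20_000
)
```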
Users perform nods, blinks, or mouth movements.
| Option | Type | Required | Description | Default | Possible Values |
| --- | --- | --- | --- | --- | --- |
| `difficulty` | TaskDifficulty | No | Task difficulty | `.medium` | `.easy`, `.medium`, `.hard` |
| `timeout` | Number | No | Time (ms) before task fails | `nil` | Any milliseconds |
| `maxNods` | Int | No | Max nods to perform | `5` | Any number |
| `maxBlinks` | Int | No | Max blinks to perform | `5` | Any number |
Example:
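The original example is missing; a sketch follows, with the case and parameter labels assumed from the table above:

```swift
// Case and parameter names are assumptions based on the table above.
let motionsTask: YVLivenessTask = .motions(
    difficulty: .medium,
    maxNods: 3,    // defaults to 5
    maxBlinks: 2   // defaults to 5
)
```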
Analyzes facial features without user interaction (no additional parameters required).
Example:
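The original example is missing; since passive liveness takes no parameters, the sketch is a bare case (name assumed):

```swift
// Case name is an assumption; passive liveness takes no parameters.
let passiveTask: YVLivenessTask = .passiveLiveness
```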
The `onSuccess` callback returns the following data:
| Field | Type | Description |
| --- | --- | --- |
| `faceImage` | String | User's face image |
| `livenessClip` | String | Video of liveness check (if applicable) |
| `passed` | Bool | True if passed, false if failed |
| `metadata` | Any | Metadata from initialization |
The `onFailure` callback (if provided) returns error details as a string or object.
Here's the complete SwiftUI view from the working code, demonstrating the SDK integration:
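The original listing was lost from this page; the sketch below reassembles the steps in this guide into one view. All type, property, and case names beyond those named in this guide (`YVUser`, `livenessView`, the task cases) are assumptions:

```swift
import SwiftUI
import YVLiveness // module name assumed

struct ContentView: View {
    // Initializer labels are assumptions based on the
    // Configuration Options table.
    private var sdk = YVLiveness(
        publicKey: Environment.shared.value(forKey: "API_KEY") ?? "",
        user: YVUser(firstName: "Ada"),
        sandboxEnvironment: true,
        onSuccess: { data in print("Liveness passed: \(data)") },
        onFailure: { error in print("Liveness failed: \(error)") }
    )

    var body: some View {
        ZStack {
            Button("Verify") {
                // Task case names are assumptions; see Tasks.
                sdk.startSDK(tasks: [.passiveLiveness])
            }
            // Always present; the SDK manages its own visibility.
            sdk.livenessView
        }
    }
}
```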
This SDK is developed and maintained solely by