Rhino - iOS Quick Start


  • iOS (9.0+)


Picovoice Account & AccessKey

Sign up or log in to the Picovoice Console to get your AccessKey. Make sure to keep your AccessKey secret.

Quick Start


  1. Install Xcode.

  2. Install CocoaPods.

  3. Import the Rhino-iOS binding by adding the following line to the project's Podfile:

pod 'Rhino-iOS'

  4. Run the following from the project directory:

pod install

  5. Add the following to the app's Info.plist file to enable recording with your iOS device's microphone:

<key>NSMicrophoneUsageDescription</key>
<string>[Permission explanation]</string>


Include the context file (either a pre-built context file (.rhn) from the Rhino GitHub Repository or a custom context created with the Picovoice Console) in the app as a bundled resource (found by selecting in Build Phases > Copy Bundle Resources). Then, get its path from the app bundle:

let contextPath = Bundle.main.path(forResource: "${CONTEXT_FILE}", ofType: "rhn")
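Note that path(forResource:ofType:) returns an optional, which is nil if the file was not bundled. A minimal way to fail fast during development (the error message here is illustrative):

guard let contextPath = Bundle.main.path(forResource: "${CONTEXT_FILE}", ofType: "rhn") else {
    // nil means the .rhn file was not added under Copy Bundle Resources
    fatalError("Context file not found in app bundle")
}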

Create an instance of RhinoManager that infers custom commands:

import Rhino

do {
    let rhinoManager = try RhinoManager(
        accessKey: "${ACCESS_KEY}",
        contextPath: contextPath,
        onInferenceCallback: inferenceCallback)
} catch {
    // handle initialization error (e.g. invalid AccessKey or context path)
}

The onInferenceCallback parameter is a function that will be invoked when Rhino has returned an inference result:

let inferenceCallback: ((Inference) -> Void) = { inference in
    if inference.isUnderstood {
        let intent: String = inference.intent
        let slots: Dictionary<String, String> = inference.slots
        // take action based on inferred intent and slot values
    } else {
        // handle unsupported commands
    }
}

Start audio capture:

do {
    try rhinoManager.process()
} catch { }

Once an inference has been made, the inferenceCallback will be invoked and audio capture will stop automatically.

Release resources explicitly when done with Rhino:
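A sketch of the release step, assuming the SDK's standard delete() method on the manager instance:

rhinoManager.delete()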


Custom Contexts

Create custom contexts with the Picovoice Console. Download the custom context file (.rhn) and include it in the app as a bundled resource (found by selecting in Build Phases > Copy Bundle Resources).

Non-English Languages

Use the corresponding model file (.pv) to infer non-English commands. The model files for all supported languages are available on the Rhino GitHub repository.

Pass in the model file using the modelPath input argument to change the inference language:

let modelPath = Bundle.main.path(forResource: "${MODEL_FILE}", ofType: "pv")
do {
let rhinoManager = try RhinoManager(
accessKey: "${ACCESS_KEY}",
contextPath: contextPath,
modelPath: modelPath,
onInferenceCallback: inferenceCallback)
} catch { }


Demos

For the Rhino iOS SDK, we offer demo applications that demonstrate how to use the Speech-to-Intent engine on real-time audio streams (i.e., microphone input).


  1. Clone the repository:

git clone --recurse-submodules

  2. Install dependencies:

cd rhino/demo/ios/
pod install

  3. Replace let ACCESS_KEY = "${YOUR_ACCESS_KEY_HERE}" in the file ContentView.swift with a valid AccessKey.

  4. Open RhinoDemo.xcworkspace and run the demo.

For more information on our Rhino demos for iOS, head over to our GitHub repository.

Issue with this doc? Please let us know.