Rhino - Java API


This document outlines how to integrate Rhino Speech-to-Intent engine within an application using its Java API.

Requirements

  • Java 11+

Compatibility

  • Linux (x86_64)
  • macOS (x86_64)
  • Windows (x86_64)

Installation

You can install the Rhino Java SDK by downloading and referencing the latest Rhino JAR file.

Build from Source

To build from source, we recommend using the IntelliJ IDE. Open the .iml file with IntelliJ and click "Build > Build Project" to build, or "Build > Build Artifacts" to package it as a JAR file.

Usage

The easiest way to create an instance of the engine is with the Rhino Builder:

import ai.picovoice.rhino.*;

try {
    Rhino handle = new Rhino.Builder()
                        .setContextPath("/absolute/path/to/context")
                        .build();
} catch (RhinoException e) {
    // handle initialization error
}

Where the setContextPath() builder argument sets the absolute path to the Rhino Speech-to-Intent context.

The sensitivity of the engine can be tuned using the setSensitivity builder argument. It is a floating-point number within [0, 1]. A higher sensitivity reduces the miss rate at the cost of a (potentially) higher rate of erroneous inferences.

import ai.picovoice.rhino.*;

try {
    Rhino handle = new Rhino.Builder()
                        .setContextPath("/absolute/path/to/context")
                        .setSensitivity(0.25f)
                        .build();
} catch (RhinoException e) {
    // handle initialization error
}

Once initialized, the valid sample rate is given by handle.getSampleRate(), and the expected frame length (number of audio samples in an input array) is handle.getFrameLength(). The engine accepts 16-bit linearly-encoded PCM and operates on single-channel audio.

short[] getNextAudioFrame() {
    // .. get audioFrame
    return audioFrame;
}

while (true) {
    boolean isFinalized = handle.process(getNextAudioFrame());
    if (isFinalized) {
        RhinoInference inference = handle.getInference();
        if (inference.getIsUnderstood()) {
            String intent = inference.getIntent();
            Map<String, String> slots = inference.getSlots();
            // .. code to take action based on inferred intent and slot values
        } else {
            // .. code to handle unsupported commands
        }
    }
}
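Rhino consumes frames of 16-bit, little-endian, single-channel PCM. If your audio source delivers raw bytes (for example, a javax.sound.sampled TargetDataLine), a small helper can pack them into the short[] frames the engine expects. This is a sketch: the class and method names below are illustrative and not part of the Rhino SDK.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmUtil {
    // Convert raw little-endian 16-bit PCM bytes into the short[] samples
    // that handle.process() expects. pcm.length should be even
    // (two bytes per sample).
    public static short[] bytesToSamples(byte[] pcm) {
        short[] samples = new short[pcm.length / 2];
        ByteBuffer.wrap(pcm)
                  .order(ByteOrder.LITTLE_ENDIAN)
                  .asShortBuffer()
                  .get(samples);
        return samples;
    }
}
```

For example, with a frame length of 512, read 1024 bytes at a time from the audio line and pass the converted buffer to handle.process().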

Once you're done with Rhino, ensure you release its resources explicitly:

handle.delete();

Custom Context

You can create custom Rhino context models using Picovoice Console.

Non-English Contexts

In order to run inference on non-English contexts, you need to use the corresponding model file. Model files for all supported languages are available in the Rhino GitHub repository.
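As a sketch of how the model file is selected: language-specific model files follow a rhino_params_{language-code}.pv naming convention (the English model is rhino_params.pv), and the resulting path is passed to the builder's setModelPath() argument. The helper class below is hypothetical, not part of the SDK.

```java
public class RhinoModels {
    // Hypothetical helper: builds the expected model file name from an
    // ISO 639-1 language code, assuming Picovoice's naming convention
    // (rhino_params.pv for English, rhino_params_{code}.pv otherwise).
    public static String modelFileName(String languageCode) {
        return "en".equals(languageCode)
                ? "rhino_params.pv"
                : "rhino_params_" + languageCode + ".pv";
    }
}
```

For example, a German context would be paired with a model path ending in RhinoModels.modelFileName("de"), supplied via new Rhino.Builder().setModelPath(...).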
