Rhino - .NET API

  • Speech-to-Intent Engine
  • Domain Specific NLU
  • Offline NLU
  • Local Voice Recognition
  • Linux
  • macOS
  • Windows
  • .NET
  • C#

This document outlines how to integrate the Rhino Speech-to-Intent engine into an application using its .NET API.

Requirements

  • .NET SDK 3.1+

Compatibility

  • .NET Standard 2.0, .NET Core 2.0+ and .NET Framework 4.6.1+
  • Runs on Linux (x86_64), macOS (x86_64), Windows (x86_64) and Raspberry Pi (.NET Core 3.1+)

Installation

Install the latest version of Rhino by adding the Rhino NuGet package in Visual Studio or by using the .NET CLI:

dotnet add package Rhino

Usage

Create an instance of the engine:

using Pv;
Rhino handle = Rhino.Create(contextPath: "/absolute/path/to/context");

Where contextPath is the absolute path to a Rhino Speech-to-Intent context.

The sensitivity of the engine can be tuned using the sensitivity parameter, a floating-point number within [0, 1]. A higher sensitivity results in fewer misses at the cost of (potentially) increasing the erroneous inference rate.

using Pv;
Rhino handle = Rhino.Create(contextPath: "/absolute/path/to/context", sensitivity: 0.25f);

Once initialized, the required sample rate is given by handle.SampleRate, and the expected frame length (number of audio samples in an input array) is given by handle.FrameLength. The engine accepts 16-bit linearly-encoded PCM and operates on single-channel audio.

short[] GetNextAudioFrame()
{
    // .. get audioFrame
    return audioFrame;
}

while (true)
{
    bool isFinalized = handle.Process(GetNextAudioFrame());
    if (isFinalized)
    {
        Inference inference = handle.GetInference();
        if (inference.IsUnderstood)
        {
            string intent = inference.Intent;
            Dictionary<string, string> slots = inference.Slots;
            // .. code to take action based on inferred intent and slot values
        }
        else
        {
            // .. code to handle unsupported commands
        }
    }
}
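
For quick experimentation, the audio frames can come from a pre-recorded file rather than a live microphone. Below is a minimal sketch (not part of the Rhino API) that feeds the engine from a WAV file; it assumes the recording is single-channel, 16-bit PCM at handle.SampleRate with a canonical 44-byte header, and both paths are placeholders:

using System;
using System.IO;
using Pv;

using (Rhino handle = Rhino.Create(contextPath: "/absolute/path/to/context"))
using (BinaryReader reader = new BinaryReader(File.OpenRead("/absolute/path/to/audio.wav")))
{
    // Skip the canonical 44-byte WAV header; the samples that follow are assumed to be
    // single-channel, 16-bit PCM recorded at handle.SampleRate.
    reader.BaseStream.Seek(44, SeekOrigin.Begin);

    short[] frame = new short[handle.FrameLength];
    while (true)
    {
        // Read exactly one frame of samples; stop at the end of the file.
        byte[] bytes = reader.ReadBytes(frame.Length * sizeof(short));
        if (bytes.Length < frame.Length * sizeof(short))
        {
            break;
        }
        Buffer.BlockCopy(bytes, 0, frame, 0, bytes.Length);

        if (handle.Process(frame))
        {
            Inference inference = handle.GetInference();
            Console.WriteLine(inference.IsUnderstood ? inference.Intent : "Didn't understand the command.");
            break;
        }
    }
}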

Rhino's resources will eventually be freed by the garbage collector, but to release them deterministically as soon as you are done, wrap the instance in a using statement:

using (Rhino handle = Rhino.Create(contextPath: "/absolute/path/to/context"))
{
// .. Rhino usage here
}

Custom Context

You can create custom Rhino context models using Picovoice Console.

Non-English Contexts

To run inference on non-English contexts, you need to use the corresponding model file. The model files for all supported languages are available here.
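
As a sketch, assuming the Create factory method also accepts a modelPath parameter alongside contextPath and sensitivity (check the API reference for the exact signature), a non-English context would be paired with its model file like this; both paths are placeholders:

using Pv;

// Both paths are placeholders; the context and model file must be for the same language.
Rhino handle = Rhino.Create(
    contextPath: "/absolute/path/to/german_context",
    modelPath: "/absolute/path/to/german_model");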

