Crates.io | aristech-stt-client |
lib.rs | aristech-stt-client |
version | |
source | src |
created_at | 2024-10-21 11:47:15.760913 |
updated_at | 2024-12-05 07:04:01.334306 |
description | A Rust client library for the Aristech Speech-to-Text API |
homepage | https://github.com/aristech-de/stt-clients/blob/main/rust/README.md |
repository | https://github.com/aristech-de/stt-clients |
max_upload_size | |
id | 1417299 |
size | 0 |
This is the Rust client implementation for the Aristech STT-Server.

To use the client in your project, add it to your Cargo.toml, or use cargo to add it:

cargo add aristech-stt-client
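The usage example below also relies on tokio for the async runtime. If your project does not already depend on it, you can add it with something like the following (the macros and rt-multi-thread features cover the example below; adjust the feature set to your own setup):

cargo add tokio --features macros,rt-multi-thread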
use aristech_stt_client::{get_client, recognize_file, TlsOptions, Auth};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Connect to the server using TLS and token/secret authentication.
    let mut client = get_client(
        "https://stt.example.com",
        Some(TlsOptions {
            ca_certificate: None,
            auth: Some(Auth { token: "your-token", secret: "your-secret" }),
        }),
    ).await?;
    // Run recognition on a local audio file.
    let results = recognize_file(&mut client, "path/to/audio/file.wav", None).await?;
    for result in results {
        // Print the first alternative of the first chunk of each result.
        println!(
            "{}",
            result
                .chunks
                .get(0)
                .unwrap()
                .alternatives
                .get(0)
                .unwrap()
                .text
        );
    }
    Ok(())
}
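The chained unwrap() calls above will panic if a result happens to contain no chunks or alternatives. If you prefer not to panic, a small sketch using only the fields shown above could iterate over everything instead:

for result in results {
    // Walk every chunk and every alternative instead of assuming the first ones exist.
    for chunk in &result.chunks {
        for alternative in &chunk.alternatives {
            println!("{}", alternative.text);
        }
    }
}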
There are several examples in the examples directory, which you can run directly using cargo, as shown below. To point them at your server, first create a .env file in the rust directory:

HOST=stt.example.com
# The credentials are optional but probably required for most servers:
TOKEN=your-token
SECRET=your-secret
# The following are optional:
# ROOT_CERT=your-root-cert.pem # If the server uses a self-signed certificate
# If neither credentials nor an explicit root certificate are provided,
# you can still enable SSL by setting the SSL environment variable to true:
# SSL=true
# MODEL=some-available-model
# NLP_SERVER=some-config
# NLP_PIPELINE=function1,function2
With the .env file in place, run an example like this:

cargo run --example live
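If you want to reuse the same environment-based configuration in your own code, a minimal sketch could look like the following. It assumes the dotenvy crate for loading the .env file and mirrors the types used in the usage example above; the bundled examples may load their configuration differently.

use aristech_stt_client::{get_client, recognize_file, Auth, TlsOptions};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Load HOST, TOKEN and SECRET from the .env file if present
    // (dotenvy is an assumed extra dependency, not part of this crate).
    dotenvy::dotenv().ok();
    let host = format!("https://{}", std::env::var("HOST")?); // prepending the scheme is an assumption
    let token = std::env::var("TOKEN")?;
    let secret = std::env::var("SECRET")?;
    let mut client = get_client(
        host.as_str(),
        Some(TlsOptions {
            ca_certificate: None,
            auth: Some(Auth { token: token.as_str(), secret: secret.as_str() }),
        }),
    ).await?;
    // Recognize a file exactly as in the usage example above.
    let results = recognize_file(&mut client, "path/to/audio/file.wav", None).await?;
    println!("received {} result(s)", results.len());
    Ok(())
}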
To build the library, run:
cargo build