Crates.io | openai-flows |
lib.rs | openai-flows |
version | 0.9.1 |
source | src |
created_at | 2023-02-12 12:25:49.467581 |
updated_at | 2023-11-07 08:45:44.648741 |
description | OpenAI integration for flows.network |
homepage | |
repository | |
max_upload_size | |
id | 783041 |
size | 31,559 |
This is a library for integrating OpenAI into your flow function on flows.network. The example below shows a minimal flow function:
use flowsnet_platform_sdk::logger;
use lambda_flows::{request_received, send_response};
use openai_flows::{
    chat::{ChatModel, ChatOptions},
    OpenAIFlows,
};
use serde_json::Value;
use std::collections::HashMap;

#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    logger::init();
    // Register the handler for incoming Lambda requests.
    request_received(handler).await;
}

async fn handler(_qry: HashMap<String, Value>, body: Vec<u8>) {
    // Chat with GPT-3.5 Turbo, keep the conversation going (restart: false),
    // and use no system prompt.
    let co = ChatOptions {
        model: ChatModel::GPT35Turbo,
        restart: false,
        system_prompt: None,
    };
    let of = OpenAIFlows::new();

    // Send the request body as the user message; on success take the reply
    // text from `choice`, otherwise fall back to the error message.
    let r = match of
        .chat_completion(
            "any_conversation_id",
            String::from_utf8_lossy(&body).into_owned().as_str(),
            &co,
        )
        .await
    {
        Ok(c) => c.choice,
        Err(e) => e,
    };

    // Return the reply as a plain-text response.
    send_response(
        200,
        vec![(
            String::from("content-type"),
            String::from("text/plain; charset=UTF-8"),
        )],
        r.as_bytes().to_vec(),
    );
}
This example lets you have a conversation with ChatGPT through chat_completion, triggered by a Lambda request. Because restart is false, requests that reuse the same conversation ID continue the same conversation.
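If you want to give the assistant standing instructions, you can set a system prompt in ChatOptions. The snippet below is a minimal sketch rather than code from the crate's documentation: it assumes system_prompt takes an Option<&str> (consistent with the None above), reuses the handler from the example unchanged apart from the options, and "support_chat" is just an illustrative conversation ID.

    // Inside the same handler as above; only the options change.
    let co = ChatOptions {
        model: ChatModel::GPT35Turbo,
        restart: false,
        // Assumption: system_prompt is an Option<&str>, matching the `None` above.
        system_prompt: Some("You are a helpful assistant. Keep answers short."),
    };
    let of = OpenAIFlows::new();

    // "support_chat" is a hypothetical conversation ID; any stable string works.
    let reply = match of
        .chat_completion("support_chat", String::from_utf8_lossy(&body).as_ref(), &co)
        .await
    {
        Ok(c) => c.choice,
        Err(e) => e,
    };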
The full documentation for this crate is available on docs.rs.