bevy_openai

bevy_openai is an event-driven plugin for Bevy that provides convenient access to the OpenAI API.

Current features:

- Send a prompt to ChatGPT via SendToAiEvent, or with a custom client config via SendToAiWithConfigEvent
- Read ChatGPT's responses via AiResponseEvent

Installation

Add the crate as a dependency:

[dependencies]
bevy_openai = "0.1.1"

Add the plugin:

use bevy::prelude::*;
use bevy_openai::OpenAiPlugin;

fn main() {
    App::new()
        .add_plugins((DefaultPlugins, OpenAiPlugin))
        .run();
}

Usage

Set the OPENAI_API_KEY environment variable:

$ export OPENAI_API_KEY=sk-xxxxxxx

bevy_openai is event-driven: you send a prompt to ChatGPT and read the response back through Bevy events.

Use SendToAiEvent to send your prompt to ChatGPT.

fn send_to_ai(
    mut event_writer: EventWriter<SendToAiEvent>
) {
    event_writer.send(SendToAiEvent("Hello".to_string()));
}
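
A system like this fires the event on every frame it runs. Below is a minimal sketch that sends the prompt only once, using a Local<bool> flag as a guard; the flag and the import lines are additions for this sketch, while SendToAiEvent is the event shown above.

use bevy::prelude::*;
use bevy_openai::SendToAiEvent;

// Sends the prompt a single time; the Local<bool> flag remembers
// whether the event has already been fired.
fn send_once(
    mut sent: Local<bool>,
    mut event_writer: EventWriter<SendToAiEvent>,
) {
    if !*sent {
        event_writer.send(SendToAiEvent("Hello".to_string()));
        *sent = true;
    }
}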

Or use SendToAiWithConfigEvent to send your prompt to ChatGPT with a custom config.

fn send_to_ai(
    mut config_event_writer: EventWriter<SendToAiWithConfigEvent>,
) {
    // with config
    config_event_writer.send(SendToAiWithConfigEvent {
        prompt: "Hello".to_string(),
        config: ClientConfigBuilder::default()
            .api_endpoint("".to_owned()) // your custom API endpoint
            .api_key("".to_owned())      // your API key
            .build()
            .expect("Failed to build config"),
    });
}
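
The empty strings above are placeholders. One way to fill them is to read the values at runtime, sketched below; the import path for ClientConfigBuilder is assumed, and MY_OPENAI_ENDPOINT and MY_OPENAI_KEY are hypothetical environment variable names used only for this sketch.

use bevy::prelude::*;
use bevy_openai::{ClientConfigBuilder, SendToAiWithConfigEvent};

// Builds the config from environment variables instead of hard-coding
// credentials. MY_OPENAI_ENDPOINT and MY_OPENAI_KEY are placeholder
// names for this sketch.
fn send_with_env_config(
    mut config_event_writer: EventWriter<SendToAiWithConfigEvent>,
) {
    let endpoint = std::env::var("MY_OPENAI_ENDPOINT").unwrap_or_default();
    let api_key = std::env::var("MY_OPENAI_KEY").unwrap_or_default();

    config_event_writer.send(SendToAiWithConfigEvent {
        prompt: "Hello".to_string(),
        config: ClientConfigBuilder::default()
            .api_endpoint(endpoint)
            .api_key(api_key)
            .build()
            .expect("Failed to build config"),
    });
}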

Use AiResponseEvent to read the response from ChatGPT.

fn process_ai_response(mut event_reader: EventReader<AiResponseEvent>) {
    for event in event_reader.read() {
        println!("response: {}", event.0);
    }
}
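
Putting the pieces together, here is a minimal sketch of an app that wires both systems in; the Update schedule is chosen for brevity, the import paths for the event types are assumed, and it is assumed the plugin registers its own events.

use bevy::prelude::*;
use bevy_openai::{AiResponseEvent, OpenAiPlugin, SendToAiEvent};

fn main() {
    App::new()
        .add_plugins((DefaultPlugins, OpenAiPlugin))
        // The two systems from the snippets above: one writes the prompt
        // event, the other prints every response event.
        .add_systems(Update, (send_to_ai, process_ai_response))
        .run();
}

// Fires the prompt every frame; see the single-send sketch above if you
// only want to ask once.
fn send_to_ai(mut event_writer: EventWriter<SendToAiEvent>) {
    event_writer.send(SendToAiEvent("Hello".to_string()));
}

fn process_ai_response(mut event_reader: EventReader<AiResponseEvent>) {
    for event in event_reader.read() {
        println!("response: {}", event.0);
    }
}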

The full example is in the examples directory.
