| Crates.io | gemini_bridge |
|---|---|
| lib.rs | gemini_bridge |
| version | 0.1.16 |
| source | src |
| created_at | 2024-11-01 12:49:53.764956 |
| updated_at | 2024-11-21 22:52:10.422255 |
| description | Types and functions to interact with the Gemini AI API |
| repository | https://github.com/paoloposso/gemini_rs |
| id | 1431756 |
| size | 18,262 |
Gemini Bridge is a Rust crate for interacting with the Gemini API. It aims to provide a simple and efficient way to communicate with the API.
Add `gemini_bridge` to your `Cargo.toml`:

```toml
[dependencies]
gemini_bridge = "0.1.16" # example version; check crates.io for the latest
```
```rust
use gemini_bridge::{Content, GeminiRs, Part, RequestBody};

// An async runtime is required; Tokio's entry-point macro is shown as an example.
#[tokio::main]
async fn main() {
    let gen = GeminiRs::new_text_generator(
        "api_key_xxxxxxx".to_owned(),
        "gemini-1.5-flash-latest".to_owned(),
    );
    let res = gen
        .generate_content(RequestBody {
            contents: vec![Content {
                role: None,
                parts: vec![Part {
                    text: "send me a test response".to_owned(),
                }],
            }],
            safety_settings: None,
            generation_config: None,
        })
        .await;
    println!("{:?}", res);
}
```
You can build an interactive chat with the Gemini API by sending the whole conversation history in a single request, as the following example demonstrates. After each response, append the model's latest reply (and your next message) to the history and send it back, continuing the conversation turn by turn. Always keep the model's maximum token count in mind: send only the minimum useful information and limit the size of the history you include in each request.
```rust
use gemini_bridge::{Content, GeminiRs, Part, RequestBody};

// An async test runner is required; Tokio's test macro is shown as an example.
#[tokio::test]
async fn test_interactive_chat() {
    let gen = GeminiRs::new_text_generator(
        "api_key_xxxxxxx".to_owned(),
        "gemini-1.5-flash-latest".to_owned(),
    );
    let res = gen
        .generate_content(RequestBody {
            contents: vec![
                Content {
                    role: Some(String::from("user")),
                    parts: vec![Part {
                        text: "Hello".to_owned(),
                    }],
                },
                Content {
                    role: Some(String::from("model")),
                    parts: vec![Part {
                        text: "Great to meet you. What would you like to know?".to_owned(),
                    }],
                },
                Content {
                    role: Some(String::from("user")),
                    parts: vec![Part {
                        text: "I have two dogs in my house. How many paws are in my house?"
                            .to_owned(),
                    }],
                },
                Content {
                    role: Some(String::from("model")),
                    parts: vec![Part {
                        text: "That's a fun question! You have a total of **7 paws** in your house. 🐶🐾 \n"
                            .to_owned(),
                    }],
                },
                Content {
                    role: Some(String::from("user")),
                    parts: vec![Part {
                        text: "Thank you! How did you calculate that? Are you sure?"
                            .to_owned(),
                    }],
                },
            ],
            safety_settings: None,
            generation_config: None,
        })
        .await;
    match res {
        Ok(response) => {
            println!("{:?}", response);
            assert!(!response.candidates.is_empty());
        }
        Err(e) => panic!("Error: {:?}", e),
    }
}
```
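The append-and-trim loop described above can be sketched without the crate itself. The `Content` and `Part` structs below are simplified local stand-ins for the crate's types (the real ones may carry more fields), and `approx_tokens` is a rough character-based heuristic, not the API's real tokenizer:

```rust
// Simplified stand-ins for gemini_bridge's Content and Part types
// (assumption: illustrative only, not the crate's actual definitions).
#[derive(Clone, Debug)]
struct Part {
    text: String,
}

#[derive(Clone, Debug)]
struct Content {
    role: Option<String>,
    parts: Vec<Part>,
}

/// Very rough token estimate: roughly 4 characters per token for English text.
fn approx_tokens(history: &[Content]) -> usize {
    history
        .iter()
        .flat_map(|c| c.parts.iter())
        .map(|p| p.text.len() / 4 + 1)
        .sum()
}

/// Drop the oldest turns until the history fits within the token budget,
/// always keeping at least the most recent turn.
fn trim_history(history: &mut Vec<Content>, max_tokens: usize) {
    while history.len() > 1 && approx_tokens(history) > max_tokens {
        history.remove(0);
    }
}

fn main() {
    let mut history = vec![
        Content {
            role: Some("user".into()),
            parts: vec![Part { text: "Hello".into() }],
        },
        Content {
            role: Some("model".into()),
            parts: vec![Part { text: "Hi! How can I help?".into() }],
        },
    ];
    // After each round trip, append the model's reply and your next message...
    history.push(Content {
        role: Some("user".into()),
        parts: vec![Part { text: "Tell me a joke".into() }],
    });
    // ...and trim before the next request so the prompt stays within budget.
    trim_history(&mut history, 1000);
    println!("{} turns, ~{} tokens", history.len(), approx_tokens(&history));
}
```

A real token budget should come from the model's documented context window, and production code would count tokens with the provider's tokenizer rather than a character heuristic.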
This crate has not yet been tested with models other than the one shown above.