Crates.io | fgpt |
lib.rs | fgpt |
version | 0.1.6 |
source | src |
created_at | 2024-04-18 07:03:38.820585 |
updated_at | 2024-04-21 04:25:47.669942 |
description | A free reverse proxy and cli tool for OpenAI GPT-3.5-turbo. |
homepage | https://github.com/shenjinti/fgpt |
repository | https://github.com/shenjinti/fgpt |
max_upload_size | |
id | 1212198 |
size | 101,035 |
It allows you to use the GPT-3.5 API without signing up for an API key or paying for usage.
OpenAI GPT-3.5-turbo is free to use, without any account or API key. DON'T USE IN PRODUCTION; IT IS FOR PERSONAL USE AND TESTING ONLY.
Linux executable binary
Mac M1/M2 executable binary
Windows (Coming soon)
Or via Docker
Or build from source (see below; cargo is required)
cargo install fgpt
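If you prefer to build from a clone of the repository, a sketch (assuming a standard Cargo project layout):
git clone https://github.com/shenjinti/fgpt.git
cd fgpt
cargo build --release
# The resulting binary is at target/release/fgpt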
# To get answers from GPT-3.5
fgpt "How to get a domain's MX record on linux shell?"
# Output plain code -c/--code
fgpt -c "Write python code to reverse a string"
# With pipe
git diff | fgpt "Write a brief git commit message for the following diff"
# With stdin
fgpt "Convert the follow csv data to json, without any description" < contacts.csv
# With file -f/--file
fgpt -f contacts.csv "Convert the following CSV data to JSON, without any description"
# REPL mode
fgpt
>> Write JavaScript code to reverse a string
...
If you are unable to connect, try using a proxy. Both HTTP and SOCKS5 proxies are supported. For example:
# 1. pass the proxy address by -p/--proxy
fgpt -p 'socks5://127.0.0.1:9080' "Linux command to list files in a directory"
# 2. pass the proxy address by environment variable
export HTTPS_PROXY='socks5://127.0.0.1:9080'
fgpt "Linux command to list files in a directory"
# 3. use alias to set the proxy address
alias fgpt='fgpt -p "socks5://127.0.0.1:9080"'
fgpt "Linux command to list files in a directory"
# Show request statistics with --stats
fgpt --stats "Linux command to list files in a directory"
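Or run the same query via Docker: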
docker run -it --rm shenjinti/fgpt "Linux command to list files in a directory"
fgpt also offers free, self-hosted API access to ChatGPT, which is useful if you want to use the OpenAI API without signing up for an API key. Start the local server with:
fgpt -s 127.0.0.1:4090
Your local server will now be running and accessible at: http://127.0.0.1:4090/v1/chat/completions
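For example, query it with the official OpenAI Python client: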
import openai

# Point the OpenAI client at the local fgpt server; the key is not checked
openai.api_key = 'nothing'
openai.base_url = "http://127.0.0.1:4090/v1/"

completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Write a simple JavaScript program"},
    ],
)
print(completion.choices[0].message.content)
Or test with curl:
# Streaming response
curl -X POST -H "Content-Type: application/json" -d '{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Write a simple JavaScript program"}], "stream":true}' http://127.0.0.1:4090/v1/chat/completions
# Non-streaming response
curl -X POST -H "Content-Type: application/json" -d '{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Write a simple JavaScript program"}]}' http://127.0.0.1:4090/v1/chat/completions
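Streaming also works from Python. A minimal sketch, assuming the openai 1.x client and the local server started above:
import openai

# Client pointed at the local fgpt server; the API key is a placeholder
client = openai.OpenAI(api_key="nothing", base_url="http://127.0.0.1:4090/v1/")

stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a simple JavaScript program"}],
    stream=True,
)
# Print tokens as they arrive
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()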