| Crates.io | oneiromancer |
| lib.rs | oneiromancer |
| version | 0.6.3 |
| created_at | 2025-02-22 13:11:15.565716+00 |
| updated_at | 2025-09-18 09:51:49.275097+00 |
| description | Reverse engineering assistant that uses a locally running LLM to aid with pseudo-code analysis. |
| homepage | https://0xdeadbeef.info/ |
| repository | https://github.com/0xdea/oneiromancer |
| max_upload_size | |
| id | 1565381 |
| size | 65,211 |
"A large fraction of the flaws in software development are due to programmers not fully understanding all the possible states their code may execute in." -- John Carmack
"Can it run Doom?" -- https://canitrundoom.org/
Oneiromancer is a reverse engineering assistant that uses a locally running LLM that has been fine-tuned for Hex-Rays pseudo-code to aid with code analysis. It can analyze a function or a smaller code snippet, returning a high-level description of what the code does, a recommended name for the function, and variable renaming suggestions, based on the results of the analysis.

Under the hood, oneiromancer relies on the aidapal LLM, a fine-tune based on mistral-7b-instruct. When used as a library, it exposes the analyze_code and analyze_file functions to analyze pseudo-code and then process the analysis results.
The easiest way to get the latest release is via crates.io:
cargo install oneiromancer
To install as a library, run the following command in your project directory:
cargo add oneiromancer
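If you use oneiromancer as a library, the analyze_code and analyze_file functions mentioned above are your entry points. The sketch below is purely illustrative: the exact signature and return type of analyze_file are assumptions, not the crate's documented API.

// Hypothetical usage sketch: the signature of `analyze_file` and the shape of
// the returned analysis are assumptions made for illustration only.
use std::path::Path;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed behavior: send the Hex-Rays pseudo-code in the file to the
    // locally running LLM and get back a description, a suggested function
    // name, and variable renaming suggestions.
    let analysis = oneiromancer::analyze_file(Path::new("target_function.c"))?;
    println!("{analysis:?}");
    Ok(())
}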
Alternatively, you can build from source:
git clone https://github.com/0xdea/oneiromancer
cd oneiromancer
cargo build --release
Then download the fine-tuned aidapal model and register it with Ollama:
wget https://huggingface.co/AverageBusinessUser/aidapal/resolve/main/aidapal-8k.Q4_K_M.gguf
wget https://huggingface.co/AverageBusinessUser/aidapal/resolve/main/aidapal.modelfile
ollama create aidapal -f aidapal.modelfile
ollama list
Optionally, configure a custom Ollama base URL and model via environment variables:
export OLLAMA_BASEURL=custom_baseurl # if not set, the default will be used
export OLLAMA_MODEL=custom_model # if not set, the default will be used
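To sanity-check the setup independently of oneiromancer, the sketch below reads these variables, falls back to Ollama's default base URL (http://localhost:11434) and to the aidapal model created above (both fallbacks are assumptions), and sends a small pseudo-code snippet to Ollama's /api/generate endpoint. This is not how the crate itself is implemented, just a minimal illustration; it assumes reqwest (with the blocking and json features) and serde_json as dependencies.

// Minimal sanity-check sketch, not oneiromancer's own implementation.
// Assumed dependencies: reqwest = { version = "0.12", features = ["blocking", "json"] }, serde_json = "1"
use std::env;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fall back to defaults if the environment variables are not set
    // (the fallback values here are assumptions).
    let base_url =
        env::var("OLLAMA_BASEURL").unwrap_or_else(|_| "http://localhost:11434".into());
    let model = env::var("OLLAMA_MODEL").unwrap_or_else(|_| "aidapal".into());

    // A tiny pseudo-code snippet to send to the locally running model.
    let pseudo_code = "int __fastcall sub_1337(char *a1) { return strlen(a1); }";

    let body = serde_json::json!({
        "model": model,
        "prompt": pseudo_code,
        "stream": false,
    });
    let response: serde_json::Value = reqwest::blocking::Client::new()
        .post(format!("{base_url}/api/generate"))
        .json(&body)
        .send()?
        .json()?;

    // With stream set to false, the model's full answer is in the "response" field.
    println!("{}", response["response"]);
    Ok(())
}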
Then run oneiromancer against a file containing the Hex-Rays pseudo-code to analyze:
oneiromancer <target_file>.c
Analysis results will be saved in <target_file>.out.c, which you can open with your favorite editor:
vim <target_file>.out.c
code <target_file>.out.c
Note: for best results, don't submit more than one function at a time to the LLM for analysis.
Oneiromancer builds on the aidapal project <3. Planned improvements include tighter integration with the aidapal IDA Pro plugin (e.g., context), better support for the aidapal LLM, and a modular architecture to plug in custom LLMs.