# 🧠 CodeRecall

> Context-aware local AI assistant for developers using Ollama, ChromaDB, and VS Code's Continue extension.

CodeRecall ingests your entire codebase — including Git history and diffs — into a local vector database (ChromaDB), enabling RAG-augmented queries via Ollama models right inside VS Code.

No cloud APIs. No latency. Full control.

---

## 🚀 Features

- 🔍 **Semantic code search** across multiple languages
- 📜 **Git commit + diff embedding** for code-evolution awareness
- 🤖 **RAG integration** with local Ollama models (e.g. LLaMA 3)
- 💻 **VS Code Continue extension support**
- ⚙️ **Configurable** via a simple `config.ini`

---

## 🧩 Project Structure

```
CodeRecall/
├── chroma_ingest.py             # Ingest codebase + Git history into ChromaDB
├── chroma_context_provider.py   # VS Code Continue context provider
├── config.ini                   # Ollama + Chroma settings
├── chroma-db/                   # ChromaDB persistence directory
└── config.json                  # Continue extension config
```

---

## 🔧 Setup

### 1. Install dependencies

```bash
pip install chromadb requests
```

Make sure you also have:

- 🦙 [Ollama](https://ollama.com/) installed and running
- ✅ The [Continue extension](https://marketplace.visualstudio.com/items?itemName=Continue.continue) for VS Code
- 🐙 A Git repository initialized (optional, but recommended for commit/diff embedding)

### 2. Configure `config.ini`

```ini
[ollama]
url = http://localhost:11434

[chroma]
persist_directory = ./chroma-db
```
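
Both scripts can read these settings with Python's standard `configparser`. A minimal sketch (the `load_settings` helper and its fallback defaults are illustrative, not the exact code in the repo — only the section and key names come from the file above):

```python
import configparser

def load_settings(path="config.ini"):
    """Read the Ollama endpoint and Chroma persistence dir from config.ini,
    falling back to the documented defaults if the file or key is missing."""
    cfg = configparser.ConfigParser()
    cfg.read(path)  # silently yields an empty config if the file is absent
    return {
        "ollama_url": cfg.get("ollama", "url", fallback="http://localhost:11434"),
        "persist_directory": cfg.get("chroma", "persist_directory", fallback="./chroma-db"),
    }
```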

---

## 📥 Ingest your project

```bash
python chroma_ingest.py
```

This loads:

- Source code in common languages
- Markdown and plain-text docs
- Git commit messages and full diffs
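
Under the hood, ingestion roughly amounts to: read each file, split it into overlapping chunks, embed each chunk via Ollama, and store the vectors in ChromaDB. A simplified sketch of two of those steps (the `chunk_text` and `embed` helpers, chunk sizes, and overlap are illustrative assumptions, not the exact implementation in `chroma_ingest.py`):

```python
import json
import urllib.request

def chunk_text(text, chunk_size=1500, overlap=200):
    """Split text into overlapping character chunks so each fits the
    embedding model's context window while preserving local continuity."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def embed(text, url="http://localhost:11434", model="nomic-embed-text"):
    """Request an embedding vector from Ollama's /api/embeddings endpoint
    (requires a running Ollama instance with the model pulled)."""
    req = urllib.request.Request(
        url + "/api/embeddings",
        data=json.dumps({"model": model, "prompt": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]
```

Each chunk's vector would then go into the Chroma collection alongside metadata such as the source file path, so query results can be traced back to their origin.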

---

## 🧠 Add as a VS Code Context Provider

### `config.json` for Continue

```json
{
  "models": [
    {
      "title": "LLaMA 3 (Ollama)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://localhost:11434"
    }
  ],
  "contextProviders": [
    {
      "title": "ChromaDB Search",
      "provider": "custom",
      "path": "./chroma_context_provider.py"
    }
  ]
}
```
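
The context provider's core job is to embed the user's query, pull the top `n_results` matches from the Chroma collection, and hand them back to Continue as context. The retrieval call itself needs a live collection (Chroma's `collection.query(...)` returns parallel `documents` and `metadatas` lists), but the result-formatting step can be sketched in isolation — the `format_context` helper below and its `source` metadata key are illustrative assumptions, not the actual provider API:

```python
def format_context(documents, metadatas, max_chars=4000):
    """Join retrieved chunks into one context block, labeling each chunk
    with its source file and stopping once a character budget is hit."""
    parts = []
    total = 0
    for doc, meta in zip(documents, metadatas):
        entry = f"### {meta.get('source', 'unknown')}\n{doc}\n"
        if total + len(entry) > max_chars:
            break  # budget exhausted; drop lower-ranked results
        parts.append(entry)
        total += len(entry)
    return "\n".join(parts)
```

Capping the context by characters rather than by result count keeps the prompt within the model's window even when individual chunks vary widely in size.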

---

## ✨ Usage

1. Launch VS Code.
2. Open the Continue sidebar.
3. Set `"ChromaDB Search"` as your context provider.
4. Ask your model questions about your codebase, architecture, or commits.

Example prompt:

> _"How does the Git ingestion pipeline work?"_

---

## 📌 Notes

- The default embedding model is `nomic-embed-text` (served via Ollama).
- Adjust `n_results` in `chroma_context_provider.py` to broaden or narrow the retrieved context.
- Works fully offline; no API keys required.

---

## 🧪 Roadmap Ideas

- [ ] Add OpenAPI spec ingestion
- [ ] Enable full-text search fallback
- [ ] Support multi-repo ingestion
- [ ] Optional chunking for large diffs or files

---

## 🛡 License

MIT — free for personal and commercial use.