# 🧠 CodeRecall

> Context-aware local AI assistant for developers using Ollama, ChromaDB, and VS Code's Continue extension.

CodeRecall ingests your entire codebase, including Git history and diffs, into a local vector database (ChromaDB), enabling RAG-augmented queries via Ollama models right inside VS Code.

No cloud APIs. No latency. Full control.

---

## 🚀 Features

- 🔍 **Semantic code search** across multiple languages
- 📜 **Git commit + diff embedding** for code-evolution awareness
- 🤖 **RAG integration** with local Ollama models (e.g. LLaMA 3)
- 💻 **VS Code Continue extension support**
- ⚙️ **Simple configuration** via `config.ini`

---

## 🧩 Project Structure

```
CodeRecall/
├── chroma_ingest.py             # Ingest codebase + Git history into ChromaDB
├── chroma_context_provider.py   # VS Code Continue context provider
├── config.ini                   # Ollama + Chroma settings
├── chroma-db/                   # ChromaDB persistence directory
└── config.json                  # Continue extension config
```

---

## 🔧 Setup

### 1. Install dependencies

```bash
pip install chromadb requests
```

Make sure you have:

- 🦙 [Ollama](https://ollama.com/) installed and running
- ✅ [Continue extension](https://marketplace.visualstudio.com/items?itemName=Continue.continue) for VS Code
- 🐙 A Git repository initialized (optional but recommended)

### 2. Configure `config.ini`

```ini
[ollama]
url = http://localhost:11434

[chroma]
persist_directory = ./chroma-db
```

---

## 📥 Ingest your project

```bash
python chroma_ingest.py
```

This loads:

- Source code in common languages
- Markdown and text docs
- Git commit messages and full diffs

---

## 🧠 Add as a VS Code context provider

### `config.json` for Continue

```json
{
  "models": [
    {
      "title": "LLaMA 3 (Ollama)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://localhost:11434"
    }
  ],
  "contextProviders": [
    {
      "title": "ChromaDB Search",
      "provider": "custom",
      "path": "./chroma_context_provider.py"
    }
  ]
}
```

---

## ✨ Usage

1. Launch VS Code.
2. Open the Continue sidebar.
3. Set `"ChromaDB Search"` as your context provider.
4. Ask your model questions about your codebase, architecture, or commits.

Example prompt:

> _"How does the Git ingestion pipeline work?"_

---

## 📌 Notes

- The default embedding model is `nomic-embed-text` (via Ollama).
- Adjust `n_results` in `chroma_context_provider.py` for broader or narrower context.
- Works fully offline; no API keys required.

---

## 🧪 Roadmap Ideas

- [ ] OpenAPI spec ingestion
- [ ] Full-text search fallback
- [ ] Multi-repo ingestion
- [ ] Optional chunking for large diffs or files

---

## 🛑 License

MIT. Free for personal and commercial use.
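As a rough illustration of the configuration step, here is one way `chroma_ingest.py` could read the `config.ini` shown above. This is a minimal sketch using only the standard library; `load_settings` is a hypothetical helper name, not necessarily how the script is actually written, and the fallbacks simply mirror the documented defaults.

```python
import configparser

def load_settings(path="config.ini"):
    """Read Ollama and Chroma settings from config.ini.

    Hypothetical helper: falls back to the README's documented
    defaults when the file or a key is missing.
    """
    cfg = configparser.ConfigParser()
    cfg.read(path)  # a missing file is not an error; fallbacks apply
    return {
        "ollama_url": cfg.get("ollama", "url",
                              fallback="http://localhost:11434"),
        "persist_directory": cfg.get("chroma", "persist_directory",
                                     fallback="./chroma-db"),
    }
```

Keeping the fallbacks in one place means a fresh checkout works before any configuration is done.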
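The "Git commit messages and full diffs" part of ingestion can be sketched as follows. This is an assumption about the approach, not the actual `chroma_ingest.py` code: it shells out to `git log --patch` with a NUL-delimited format so each commit (header plus diff) becomes one document to embed.

```python
import subprocess

def read_git_log(repo_path="."):
    """Return raw `git log` output with full diffs.

    Hypothetical helper: %x00 emits a NUL byte before each commit so
    commits can be split reliably later. Returns "" outside a repo.
    """
    try:
        return subprocess.run(
            ["git", "-C", repo_path, "log", "--patch", "--date=iso",
             "--pretty=format:%x00%H %ad%n%s"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (subprocess.CalledProcessError, FileNotFoundError):
        return ""

def split_commits(log_text):
    """Split NUL-delimited log output into one document per commit."""
    return [chunk.strip() for chunk in log_text.split("\x00") if chunk.strip()]
```

Delimiting on NUL rather than on `commit ` avoids false splits when a diff itself contains a line starting with that word.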
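The roadmap item "optional chunking for large diffs or files" could look something like this: a line-based splitter with a small overlap so that context straddling a chunk boundary still appears in at least one embedded document. The function name and parameters are illustrative, not part of the current codebase.

```python
def chunk_lines(text, max_lines=120, overlap=20):
    """Split text into line-based chunks of at most max_lines lines.

    Hypothetical sketch: consecutive chunks share `overlap` lines so a
    function or hunk crossing a boundary is not cut in half. Short
    inputs are returned as a single chunk; empty input yields no chunks.
    """
    lines = text.splitlines()
    if len(lines) <= max_lines:
        return [text] if text else []
    step = max_lines - overlap  # advance by less than a full chunk
    return [
        "\n".join(lines[i:i + max_lines])
        for i in range(0, len(lines), step)
        if lines[i:i + max_lines]
    ]
```

Overlapping chunks cost a little extra storage in ChromaDB but noticeably improve recall for queries that land near chunk boundaries.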