Initial code commit

Lutchy Horace 2025-04-16 16:57:39 -04:00
parent 14c06787dc
commit 9842d55bbf
7 changed files with 441 additions and 32 deletions

README.md

@@ -1,4 +1,3 @@
-Heres a polished `README.md` for your project **CodeRecall**:
 ---
@@ -26,11 +25,11 @@ No cloud APIs. No latency. Full control.
 ```
 CodeRecall/
-├── chroma_ingest.py            # Ingest codebase + Git into ChromaDB
-├── chroma_context_provider.py  # VS Code Continue context provider
-├── config.ini                  # Ollama + Chroma settings
-├── chroma-db/                  # ChromaDB persistence directory
-└── config.json                 # Continue extension config
+├── lancedb_ingest.py           # Ingest codebase + Git into LanceDB
+├── lancedb_context_provider.py # VS Code Continue context provider
+├── config.ini.example          # Ollama + LanceDB settings
+├── lancedb-data/               # LanceDB persistence directory
+└── config.json                 # Continue extension config
 ```
 ---
@@ -40,7 +39,7 @@ CodeRecall/
 ### 1. Install dependencies
 ```bash
-pip install chromadb requests
+pip install lancedb
 ```
 Make sure you have:
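A quick way to verify the install from a Python shell (the directory name matches the `persist_directory` configured in the next step):

```python
import lancedb

# Connect to (or lazily create) the local database directory
db = lancedb.connect("./lancedb-data")
print(db.table_names())  # expect an empty list before ingestion
```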
@@ -51,11 +50,24 @@ Make sure you have:
 ### 2. Configure `config.ini`
 ```ini
 [ollama]
 url = http://localhost:11434
-[chroma]
-persist_directory = ./chroma-db
+[lancedb]
+persist_directory = ./lancedb-data
+
+[s3]
+enable = True
+bucket_name = my-s3-bucket
+access_key_id = my-access-key
+secret_access_key = my-secret-key
+region = us-east-1
+# Optional, if using a third-party S3 provider
+endpoint = http://minio:9000
+
+[server]
+host = 0.0.0.0
+port = 8080
 ```
 ---
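For reference, a minimal sketch of how a script like `lancedb_ingest.py` could read these settings with Python's standard `configparser`; the section and key names come from the example above, while the fallback values are assumptions:

```python
import configparser

config = configparser.ConfigParser()
config.read("config.ini")  # copied from config.ini.example

ollama_url = config.get("ollama", "url", fallback="http://localhost:11434")
db_dir = config.get("lancedb", "persist_directory", fallback="./lancedb-data")

# S3 sync is optional; `enable = True` parses cleanly via getboolean()
if config.getboolean("s3", "enable", fallback=False):
    bucket = config.get("s3", "bucket_name")
    # `endpoint` is only needed for third-party providers such as MinIO
    endpoint = config.get("s3", "endpoint", fallback=None)

host = config.get("server", "host", fallback="0.0.0.0")
port = config.getint("server", "port", fallback=8080)
```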
@@ -63,7 +75,7 @@ persist_directory = ./chroma-db
 ## 📥 Ingest your project
 ```bash
-python chroma_ingest.py
+python lancedb_ingest.py
 ```
 This loads:
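To illustrate what ingestion involves, here is a minimal sketch using the LanceDB Python API and Ollama's embeddings endpoint; the `code_chunks` table name and one-row-per-file layout are illustrative assumptions, not necessarily what `lancedb_ingest.py` does internally:

```python
import lancedb
import requests

def embed(text: str) -> list[float]:
    # nomic-embed-text is the README's default embedding model
    r = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": text},
    )
    r.raise_for_status()
    return r.json()["embedding"]

db = lancedb.connect("./lancedb-data")
rows = [
    {"path": path, "text": text, "vector": embed(text)}
    for path, text in [("hello.py", "print('hello')")]  # stand-in for a repo walk
]
db.create_table("code_chunks", data=rows, mode="overwrite")
```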
@@ -75,26 +87,41 @@ This loads:
 ## 🧠 Add as a VS Code Context Provider
-### `config.json` for Continue
+### `config.yaml` for Continue
-```json
-{
-  "models": [
-    {
-      "title": "LLaMA 3 (Ollama)",
-      "provider": "ollama",
-      "model": "llama3",
-      "apiBase": "http://localhost:11434"
-    }
-  ],
-  "contextProviders": [
-    {
-      "title": "ChromaDB Search",
-      "provider": "custom",
-      "path": "./chroma_context_provider.py"
-    }
-  ]
-}
+```yaml
+name: Local Assistant
+version: 1.0.0
+schema: v1
+models:
+  - name: Ollama Autodetect
+    provider: ollama
+    model: AUTODETECT
+    apiBase: http://localhost:11434
+  - name: Ollama Autocomplete
+    provider: ollama
+    model: qwen2.5-coder:1.5b-base
+    apiBase: http://localhost:11434
+    roles:
+      - autocomplete
+  - name: Nomic Embed Text
+    provider: ollama
+    model: nomic-embed-text
+    apiBase: http://localhost:11434
+    roles:
+      - embed
+context:
+  - provider: code
+  - provider: docs
+  - provider: diff
+  - provider: terminal
+  - provider: problems
+  - provider: folder
+  - provider: codebase
+  # LanceDB Context Provider
+  - provider: http
+    params:
+      url: http://localhost/retrieve
 ```
 ---
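Continue's `http` provider POSTs the current query to the configured URL and renders whatever context items come back. Below is a minimal sketch of a `/retrieve` service backed by LanceDB, assuming Flask, the illustrative `code_chunks` table from above, and a name/description/content item shape; check the Continue docs for the exact request/response contract:

```python
from flask import Flask, jsonify, request
import lancedb
import requests

app = Flask(__name__)
db = lancedb.connect("./lancedb-data")

def embed(text: str) -> list[float]:
    r = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": text},
    )
    r.raise_for_status()
    return r.json()["embedding"]

@app.route("/retrieve", methods=["POST"])
def retrieve():
    query = request.get_json().get("query", "")
    table = db.open_table("code_chunks")  # illustrative table name
    hits = table.search(embed(query)).limit(5).to_list()
    return jsonify(
        [{"name": h["path"], "description": h["path"], "content": h["text"]}
         for h in hits]
    )

if __name__ == "__main__":
    # Mirrors the [server] section of config.ini
    app.run(host="0.0.0.0", port=8080)
```

Note that `config.yaml` above points at `http://localhost/retrieve`; adjust the port (or put a proxy in front) so it matches wherever this service actually listens.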
@@ -103,7 +130,7 @@ This loads:
 1. Launch VS Code.
 2. Open the Continue sidebar.
-3. Set `"ChromaDB Search"` as your context provider.
+3. Set `"@HTTP"` as your context provider.
 4. Ask your model questions about your codebase, architecture, or commits.
 Example prompt:
@@ -114,7 +141,7 @@ Example prompt:
 ## 📌 Notes
 - Default embedding model is `nomic-embed-text` (via Ollama).
-- Change `n_results` in `chroma_context_provider.py` for broader/narrower context.
+- Change `n_results` in `lancedb_context_provider.py` for broader/narrower context.
 - Works offline, no API keys required.
 ---