likhith-bavisetti
Community Connect Newsletter RAG
RAG Python LangChain ChromaDB Ollama Local-only

AI chatbot grounded in UMSL Community Connect newsletter data — fully local, no external APIs.

AI / Backend Engineer

Overview

An AI chatbot that answers questions using UMSL Community Connect newsletters as its sole knowledge base. The system uses a Retrieval-Augmented Generation (RAG) pipeline with fully local models — no OpenAI, no external APIs, no data leaving the machine.

Key constraint: Entirely local. Embeddings and response generation both run on-device using open-source models served through Ollama.


Models

Embedding model: BAAI/bge-large-en-v1.5 — state-of-the-art open-source text embeddings
LLM: open-source model served locally via Ollama — response generation runs on-device


How It Works

Ingestion: Newsletter Documents → Chunking → Embedding (bge-large-en-v1.5) → ChromaDB Vector Store

Query: User Query → Embed → Vector Similarity Search (Top-K Chunks) → Ollama LLM + Retrieved Context → Grounded Response
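The retrieval half of the pipeline can be sketched in plain Python. This is an illustration of the underlying mechanics (chunking with overlap, cosine-similarity top-k search), not the project's actual code — in practice LangChain and ChromaDB handle these steps, and chunk sizes here are assumptions:

```python
import math

def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into overlapping character windows.
    Sizes are illustrative; real splitters often work on tokens."""
    step = chunk_size - overlap
    return [text[start:start + chunk_size]
            for start in range(0, max(len(text) - overlap, 1), step)]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, chunk_vecs, k=3):
    """Indices of the k stored chunks most similar to the query vector,
    most similar first — what a vector store's search returns."""
    ranked = sorted(range(len(chunk_vecs)),
                    key=lambda i: cosine(query_vec, chunk_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

A vector store like ChromaDB does the same ranking, but over approximate-nearest-neighbor indexes so it scales past brute-force comparison.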

Tech Stack

Python · LangChain · ChromaDB · Ollama

Setup

Install dependencies

uv pip install -r requirements.txt
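The repository's requirements.txt is not shown here; a plausible minimal set, inferred from the stack described above (exact package names and versions are assumptions):

```
# Illustrative requirements.txt — inferred, not the project's actual file
langchain
chromadb
fastapi
uvicorn
```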

Ingest newsletter data

python ingest.py

Start the server

uvicorn main:app --reload

Example query

User:  What was announced in the June 2025 newsletter?

Bot:   The June 2025 Community Connect newsletter highlighted
       upcoming community events, volunteer programs, and
       local initiatives.
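The grounding shown above comes from how the retrieved chunks are packed into the prompt before the Ollama call. A sketch of that prompt assembly — function name and wording are illustrative, not the project's actual code:

```python
def build_prompt(question, chunks):
    """Assemble a grounded prompt: number the retrieved newsletter
    excerpts and instruct the model to answer only from them."""
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    return (
        "Answer the question using ONLY the newsletter excerpts below.\n"
        "If the answer is not in the excerpts, say you don't know.\n\n"
        f"Excerpts:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

Constraining the model to the retrieved context like this is what keeps answers tied to the newsletters instead of the model's general knowledge.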