An AI-powered document question-answering system built with FastAPI, supporting multiple document formats and AI providers.
- Document Management: Upload, parse, and store documents (TXT, Markdown, PDF, DOCX)
- Intelligent Q&A: RAG-based question answering using semantic search
- Multi-turn Conversations: Context-aware chat with conversation history
- Multiple AI Providers: Support for OpenAI, Anthropic (Claude), and local models
- Vector Search: ChromaDB for efficient document chunk retrieval
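The RAG flow behind the features above (chunk a document, embed the chunks, retrieve the most relevant ones for a question) can be sketched without the real embedding model. The toy version below substitutes bag-of-words cosine similarity for Sentence Transformers embeddings; the function names are illustrative, not the project's actual API:

```python
import math
from collections import Counter

def chunk_text(text: str, size: int = 50, overlap: int = 10) -> list[str]:
    """Split a document into overlapping word-window chunks."""
    words = text.split()
    chunks = []
    step = size - overlap
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + size])
        if chunk:
            chunks.append(chunk)
        if start + size >= len(words):
            break
    return chunks

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks most similar to the question."""
    q = Counter(question.lower().split())
    scored = [(cosine(q, Counter(c.lower().split())), c) for c in chunks]
    scored.sort(key=lambda p: p[0], reverse=True)
    return [c for _, c in scored[:top_k]]
```

In the real pipeline, ChromaDB stores the embedded chunks and performs the nearest-neighbor search, so retrieval scales past what a linear scan like this can handle.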
- Backend: FastAPI, Async SQLAlchemy, Pydantic v2
- Database: PostgreSQL, Redis
- Vector Store: ChromaDB
- AI: OpenAI, Anthropic, Sentence Transformers
- Document Parsing: pypdf, python-docx, markdown
- Python 3.11+
- Docker and Docker Compose
- OpenAI or Anthropic API key
```bash
cd python-proj

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
make dev

# Configure environment
cp .env.example .env
# Edit .env with your settings (especially API keys)

# Start PostgreSQL and Redis
make docker-up

# Run database migrations
make migrate

# Start the application
make run
```

The API will be available at http://localhost:8000
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/documents | Upload a document |
| GET | /api/v1/documents | List all documents |
| GET | /api/v1/documents/{id} | Get document details |
| DELETE | /api/v1/documents/{id} | Delete a document |
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/qa/ask | Ask a question |
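Behind an endpoint like /api/v1/qa/ask, answering typically means stuffing the retrieved chunks into a grounded prompt before calling the model. A hedged sketch of that assembly step; the template wording is illustrative, not the project's actual prompt:

```python
def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble a grounded Q&A prompt from retrieved document chunks."""
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    return (
        "Answer the question using only the context below. "
        "Cite chunk numbers where relevant.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

Numbering the chunks lets the model cite its sources, which makes answers easier to verify against the original document.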
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/conversations | Create a conversation |
| GET | /api/v1/conversations | List conversations |
| GET | /api/v1/conversations/{id} | Get conversation details |
| PUT | /api/v1/conversations/{id} | Update a conversation |
| DELETE | /api/v1/conversations/{id} | Delete a conversation |
| POST | /api/v1/conversations/{id}/messages | Send a message |
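Multi-turn context usually means replaying recent history into each model call, trimmed to a budget so the prompt does not grow without bound. A sketch of that trimming; the role names follow the common chat-message convention, and the turn limit is illustrative:

```python
def build_messages(history: list[dict], new_message: str,
                   system: str = "You are a helpful assistant.",
                   max_turns: int = 5) -> list[dict]:
    """Keep the system prompt plus the last max_turns exchanges."""
    recent = history[-max_turns * 2:]  # each turn = user + assistant message
    return ([{"role": "system", "content": system}]
            + recent
            + [{"role": "user", "content": new_message}])
```

Production systems often summarize older turns instead of dropping them, but a sliding window like this is the simplest way to stay under the model's context limit.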
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v1/settings/ai-providers | List AI providers |
| POST | /api/v1/settings/ai-providers | Add AI provider |
| PUT | /api/v1/settings/ai-providers/{id} | Update AI provider |
| DELETE | /api/v1/settings/ai-providers/{id} | Delete AI provider |
| POST | /api/v1/settings/ai-providers/{id}/test | Test AI connection |
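Switching between OpenAI and Anthropic at runtime is typically a registry lookup behind one common interface. A toy sketch with stubbed clients; the class and method names are illustrative, not the project's code, and the real implementations would call each vendor's SDK:

```python
from abc import ABC, abstractmethod

class AIProvider(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class OpenAIProvider(AIProvider):
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"  # real code would call the OpenAI API

class AnthropicProvider(AIProvider):
    def complete(self, prompt: str) -> str:
        return f"[anthropic] {prompt}"  # real code would call the Anthropic API

REGISTRY = {"openai": OpenAIProvider, "anthropic": AnthropicProvider}

def get_provider(name: str) -> AIProvider:
    """Look up a provider by name, e.g. the DEFAULT_AI_PROVIDER setting."""
    try:
        return REGISTRY[name]()
    except KeyError:
        raise ValueError(f"Unknown provider: {name}")
```

The /test endpoint above would then amount to calling `complete` (or a cheap ping) on the selected provider and reporting success or failure.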
```bash
# Upload a document
curl -X POST "http://localhost:8000/api/v1/documents" \
  -F "file=@document.pdf" \
  -F "title=My Document"

# Ask a question
curl -X POST "http://localhost:8000/api/v1/qa/ask" \
  -H "Content-Type: application/json" \
  -d '{"question": "What is the main topic of the document?"}'

# Create a conversation
curl -X POST "http://localhost:8000/api/v1/conversations" \
  -H "Content-Type: application/json" \
  -d '{"title": "My Chat", "document_ids": ["doc-uuid-here"]}'

# Send a message in a conversation
curl -X POST "http://localhost:8000/api/v1/conversations/{id}/messages" \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me more about section 2"}'
```

```
python-proj/
├── app/
│   ├── api/routers/   # API route handlers
│   ├── services/      # Business logic
│   ├── daos/          # Data access layer
│   ├── models/        # SQLAlchemy models
│   ├── schemas/       # Pydantic schemas
│   ├── core/          # Config, exceptions, middleware
│   └── infra/         # Database, Redis, Vector store
├── migrations/        # Alembic migrations
├── tests/             # Test suite
├── docker/            # Docker configuration
└── docs/              # Documentation
```
```bash
# Run tests
make test

# Run linters
make lint

# Format code
make format

# Create a migration
make migration msg="add new table"

# Build the image
make docker-build

# Run with Docker Compose
make docker-run
```

Key environment variables:
| Variable | Description | Default |
|---|---|---|
| ENV | Environment (dev/test/prod) | dev |
| DB_HOST | PostgreSQL host | localhost |
| REDIS_HOST | Redis host | localhost |
| OPENAI_API_KEY | OpenAI API key | - |
| ANTHROPIC_API_KEY | Anthropic API key | - |
| DEFAULT_AI_PROVIDER | Default AI provider | openai |
See .env.example for all options.
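Loading these variables with the table's defaults is straightforward from the process environment. A minimal sketch using a plain dataclass (the project's actual settings layer likely uses Pydantic, so treat the field names here as illustrative):

```python
import os
from dataclasses import dataclass, field

@dataclass
class Settings:
    """Environment-backed settings with the defaults from the table above."""
    env: str = field(default_factory=lambda: os.getenv("ENV", "dev"))
    db_host: str = field(default_factory=lambda: os.getenv("DB_HOST", "localhost"))
    redis_host: str = field(default_factory=lambda: os.getenv("REDIS_HOST", "localhost"))
    openai_api_key: str = field(default_factory=lambda: os.getenv("OPENAI_API_KEY", ""))
    anthropic_api_key: str = field(default_factory=lambda: os.getenv("ANTHROPIC_API_KEY", ""))
    default_ai_provider: str = field(
        default_factory=lambda: os.getenv("DEFAULT_AI_PROVIDER", "openai"))
```

Using `default_factory` rather than plain defaults means the environment is read each time `Settings()` is constructed, which keeps tests that monkeypatch variables honest.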
MIT