Self-Hosted · Open Source · Sovereign

Personal AI Research Environment

Local LLMs, RAG systems, and agent workflows running on dedicated infrastructure in Germany. Fully self-hosted; no data leaves the server.

A personal research project by Aymen Mastouri. Not affiliated with any employer or client.

Services

Active
Authentik

Identity & Access Management. Central SSO for all lab services.

auth.aymenmastouri.io
Active
Open WebUI

Chat interface with RAG support. Qdrant + nomic-embed-text for document Q&A.

chat.aymenmastouri.io
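Under the hood, a document question follows the usual RAG loop: embed the query, retrieve the closest stored chunks from Qdrant, then put them into the model prompt. A minimal sketch of that flow with stubbed embedding and retrieval (the function names are illustrative, not Open WebUI internals):

```python
def embed(text: str) -> list[float]:
    """Stub: in the lab, nomic-embed-text (via LiteLLM) produces the real vector."""
    return [float(len(text))]  # toy one-dimensional placeholder

def retrieve(query_vector: list[float], chunks: dict[str, list[float]]) -> str:
    """Stub nearest-neighbour lookup: Qdrant does this over the stored chunks."""
    return min(chunks, key=lambda c: abs(chunks[c][0] - query_vector[0]))

def build_rag_prompt(question: str, context: str) -> str:
    """Combine retrieved context and the question into one grounded prompt."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

chunks = {
    "Qdrant stores vectors.": embed("Qdrant stores vectors."),
    "Authentik handles SSO.": embed("Authentik handles SSO."),
}
question = "What does Qdrant store?"
prompt = build_rag_prompt(question, retrieve(embed(question), chunks))
```

The real pipeline swaps the stubs for the embedding model and a Qdrant search call; the prompt-assembly step is the same shape.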
Active
LiteLLM

OpenAI-compatible API proxy. Routes to local Ollama models.

llm.aymenmastouri.io
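Because LiteLLM speaks the OpenAI API, any OpenAI-compatible client can point at the proxy unchanged. A sketch of the request shape, assuming the qwen2.5:3b model from the Ollama card is routed through the proxy (the path and bearer-token header are the standard OpenAI-compatible ones; the key is a placeholder):

```python
import json

# Proxy base URL from the card above; /v1/chat/completions is the
# standard OpenAI-compatible path LiteLLM exposes.
LITELLM_URL = "https://llm.aymenmastouri.io/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,  # LiteLLM maps this name to a local Ollama model
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("qwen2.5:3b", "One-line summary of RAG?")
# Send with any HTTP client, e.g.:
#   requests.post(LITELLM_URL, json=payload,
#                 headers={"Authorization": "Bearer <litellm-key>"})
print(json.dumps(payload, indent=2))
```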
Active
Qdrant

Vector database for semantic search. Connected to nomic-embed-text via LiteLLM.

qdrant.aymenmastouri.io
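Qdrant's search API takes a query vector and returns the nearest stored points. A sketch of the request body, assuming a collection whose vectors are 768-dimensional (the output size of nomic-embed-text); the collection name and payload flag are illustrative:

```python
# nomic-embed-text produces 768-dimensional embeddings, so the
# collection's configured vector size must match.
EMBED_DIM = 768

def build_search_request(query_vector: list[float], limit: int = 3) -> dict:
    """Body for a Qdrant point-search request against a collection."""
    return {
        "vector": query_vector,
        "limit": limit,          # how many nearest neighbours to return
        "with_payload": True,    # also return the stored document chunks
    }

# Placeholder vector; in the lab this comes from the embedding model.
search_body = build_search_request([0.0] * EMBED_DIM)
```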
Internal
Ollama

Local LLM runtime. qwen2.5:3b + nomic-embed-text + llama3.2:3b.

internal · :11434
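Ollama listens on port 11434 and is not exposed publicly; other services reach it internally. A sketch of an embedding request body for its `/api/embeddings` endpoint, the call the RAG path ultimately depends on (the internal hostname is a placeholder for however the containers address each other):

```python
# Internal-only endpoint (see the card above); "ollama" is a
# placeholder hostname for the internal network.
OLLAMA_URL = "http://ollama:11434/api/embeddings"

def build_embed_request(model: str, text: str) -> dict:
    """Body for Ollama's /api/embeddings endpoint."""
    return {"model": model, "prompt": text}

embed_body = build_embed_request("nomic-embed-text", "What does Qdrant store?")
# requests.post(OLLAMA_URL, json=embed_body) returns a JSON object
# containing the embedding vector.
```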

Roadmap

Planned
n8n

Workflow automation for AI agent pipelines.

Planned
Docling

Document parsing and extraction for RAG ingestion.

Planned
Langflow

Visual low-code builder for AI agents and chains.