Article 9 EU AI Act in Practice: How to Build a Risk Management System for High-Risk AI
You've run the analysis. Your AI system is classified as High-Risk. Now what? If you've followed our KI-Verordnung Framework guide, you know the four steps to clarity:
1. Scope Check — Does the AI Act apply? ✓
2. Risk Classification — What's the risk level? ✓
3. Role Determination — Provider, Deployer, or both? ✓
4. Obligations — What must you do? ← You are here
For High-Risk AI systems, one of the most critical obligations is Article 9: the Risk Management System (RMS). In…
Joe Simms
Apr 15 · 8 min read


Why You Shouldn't Use ChatGPT, Claude & Co. as Your EU AI Act Compliance Guide
• Modern LLMs (ChatGPT, Claude, Gemini, Grok) will be better than ever by 2026 — but they aren’t designed for compliance
• The problem is no longer obvious errors, but errors that sound convincing
• Without source references, an audit trail and project-specific context, LLM responses are unusable for the AI Regulation
• Alternatives: Research official sources yourself, use specialised tools with RAG, or consult experts
Joe Simms
Mar 27 · 8 min read