AI / GDPR / privacy / EU AI Act / compliance
AI and GDPR: what Belgian businesses need to know
One of the most frequent questions in our audits: “Are we allowed to do this? What about GDPR?”
It’s a fair question. And the short answer is: yes, you can use AI — but you need to do it right. Here’s what you concretely need to know as a Belgian business.
GDPR and AI: the core
GDPR doesn’t prohibit AI. What GDPR does require is that you handle personal data responsibly. The same principles that apply to your CRM, your email system, and your customer database also apply to AI applications:
Purpose limitation. You may only process personal data for the purpose it was collected for. If you have customer data for invoicing, you can’t just use it to train an AI model.
Data minimization. Only use the data that’s necessary. If you’re building a document generator, you don’t need to load a complete customer history when a name and project number suffice.
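In code, data minimization often comes down to whitelisting fields before anything reaches an AI service. A minimal sketch — the record layout and field names are hypothetical, purely for illustration:

```python
# Sketch: pass only the fields an AI document generator actually needs.
# The record structure below is hypothetical.

customer_record = {
    "name": "Acme BV",
    "project_number": "P-2024-017",
    "email": "contact@acme.example",        # not needed for the document
    "invoice_history": ["2023-001", "2023-014"],  # not needed either
}

# Data minimization: whitelist the necessary fields instead of
# forwarding the full customer record to the AI service.
ALLOWED_FIELDS = {"name", "project_number"}
minimal_payload = {k: v for k, v in customer_record.items() if k in ALLOWED_FIELDS}

print(minimal_payload)
# {'name': 'Acme BV', 'project_number': 'P-2024-017'}
```

The point of the whitelist (rather than a blacklist) is that any new field added to the record later stays out of the AI payload by default.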
Transparency. If AI makes decisions that affect people (customer communication, HR screening, credit assessment), the individuals involved need to know that AI is involved and how it works.
Right to explanation. With solely automated decision-making that significantly affects someone, individuals have the right to obtain human intervention and contest the decision. This is relevant if you use AI for things like customer scoring or application screening.
Where most SMEs are safe
The good news: most AI applications at SMEs fall into a low-risk category. Think of:
Internal tools. A document generator, a reporting system, or an internal knowledge base that works with business data (not personally identifiable data) has minimal GDPR impact.
Anonymized data. If you work with aggregated or anonymized datasets — “how many hours do we spend on average on X” — there’s no personal data processing involved.
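The "average hours on X" example can be made concrete: aggregation drops the personal identifier entirely, so the output contains no personal data. A sketch, with a hypothetical record structure:

```python
# Sketch: aggregate time-tracking records so no personal data remains.
# The record structure is hypothetical.
from statistics import mean

time_entries = [
    {"employee": "A. Janssens", "task": "X", "hours": 4.0},
    {"employee": "B. Maes",     "task": "X", "hours": 6.0},
    {"employee": "A. Janssens", "task": "Y", "hours": 2.0},
]

# Group hours per task; the 'employee' field is never carried over,
# so the result ("average hours per task") is no longer personal data.
by_task = {}
for entry in time_entries:
    by_task.setdefault(entry["task"], []).append(entry["hours"])

averages = {task: mean(hours) for task, hours in by_task.items()}
print(averages)  # {'X': 5.0, 'Y': 2.0}
```

Note that this only works if the groups are large enough that individuals can't be re-identified from the aggregate — an average over one employee is still personal data.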
Existing processing grounds. If you already have a legitimate interest to process customer data (for example, for service delivery), you can in many cases also use that data for AI applications that improve that same service. Provided it’s proportional and documented.
Where you do need to pay attention
Third-party AI tools. If you use ChatGPT, Copilot, or other cloud AI services, your data may leave the EU. Check where the data is processed and whether there’s a Data Processing Agreement (DPA). Most major providers now offer EU-hosted options.
Customer-facing AI. A chatbot that communicates with customers processes personal data. Ensure you have a privacy statement, an opt-out option, and logging of what the AI does.
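The three measures for a customer-facing chatbot — disclosure, opt-out, logging — can be sketched in a few lines. Everything here is a hypothetical placeholder, not a real chatbot API:

```python
# Sketch: minimal transparency measures around a customer-facing chatbot.
# handle_with_ai() and the log format are hypothetical placeholders.
import json
import time

# Disclosure shown at the start of every conversation.
AI_DISCLOSURE = "This conversation is supported by AI. Type 'human' to reach a colleague."

def handle_with_ai(message: str) -> str:
    return "(AI reply placeholder)"  # stand-in for the actual AI call

def handle_message(message: str, session_id: str) -> str:
    # Opt-out: route to a human on request instead of the AI.
    wants_human = message.strip().lower() == "human"
    reply = "Connecting you with a colleague." if wants_human else handle_with_ai(message)

    # Log what the AI does -- without storing more than necessary
    # (no message content here, just session and whether AI was used).
    log_entry = {"ts": time.time(), "session": session_id, "ai_used": not wants_human}
    print(json.dumps(log_entry))
    return reply
```

Logging only metadata (timestamp, session, AI involvement) rather than full message content is itself a data-minimization choice; log content only if you have a documented reason to.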
HR applications. AI for recruitment, performance evaluation, or roster management is sensitive territory. The EU AI Act classifies some HR applications as “high risk” with additional requirements.
Training on business data. If you train or fine-tune an AI model on your own data, document which data you use, why, and how long you retain it.
The EU AI Act — what’s new?
The EU AI Act entered into force in 2024, with obligations phasing in through 2026. The key points for SMEs:
Risk-based approach. AI systems are classified into four risk levels: minimal, limited, high, and unacceptable. Most SME applications fall into “minimal” or “limited.”
Transparency obligation. If customers or employees interact with AI (chatbot, virtual assistant), you need to make clear that it’s AI. A simple “This conversation is supported by AI” suffices.
Prohibited practices. Certain AI applications are banned outright: social scoring, manipulative systems, and real-time remote biometric identification in public spaces. None of these are relevant to typical SME applications.
SME exemptions. The EU AI Act contains specific provisions to limit the burden on SMEs. There will be sandboxes, simplified compliance documentation, and lower fines.
Practical checklist
For every AI application you implement, walk through these points:
What data does the tool use? Is it personally identifiable or business data?
Where is the data processed — in the EU or outside?
Is there a Data Processing Agreement with the AI provider?
Do affected parties (customers, employees) know that AI is being used?
Can a human intervene if the AI makes a mistake?
Is it documented in your processing register?
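If you answer these questions per tool anyway, it's a small step to record them in a structured form, for example as an entry in your processing register. A sketch with purely illustrative field names:

```python
# Sketch: record the checklist answers per AI tool, e.g. as an entry
# in a processing register. Field names are illustrative only;
# None marks a question that hasn't been answered yet.
checklist_entry = {
    "tool": "Internal document generator",
    "personal_data": False,
    "processing_location": "EU",
    "dpa_signed": True,
    "ai_use_disclosed": True,
    "human_oversight": True,
    "in_processing_register": True,
}

# A simple gate: every question must have an explicit answer
# (False is a valid answer; None means "not yet assessed").
unresolved = [k for k, v in checklist_entry.items() if v is None]
assert not unresolved, f"Unanswered checklist items: {unresolved}"
```

This keeps the audit trail trivial: one record per tool, and a failing assertion the moment a new tool is added without completing the assessment.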
If you have a clear answer to all these questions, you’re in good shape.
Our approach
In every audit, we include GDPR and AI Act compliance as a standard component. Not as legal advice — we’re not lawyers — but as practical guidelines: what data is needed, where it’s processed, and what precautions we take during implementation.
We prefer to build with EU-hosted services, minimize the use of personal data, and document the data flows for every system we deliver.