The Coming Compliance Divide
The EU AI Act marks a major shift in how artificial intelligence must be developed, deployed, and governed across Europe (European Union, 2024, Arts. 10–24). As the world moves toward more regulated AI environments, companies of all sizes will need to change their internal processes to meet transparency, documentation, and risk-management requirements. However, not all companies begin from the same starting point. Large companies often have mature compliance structures, while small and medium-sized enterprises (SMEs) face resource constraints that make regulatory adaptation much more difficult. This creates what many commentators describe as a “compliance divide” between large enterprises and smaller innovators (OECD, 2025; Linck, 2021).
The stakes are significant. According to the AI Act, AI systems that do not meet applicable requirements — especially high-risk systems — may not be placed on the EU market (European Union, 2024, Arts. 10–24). Non-compliance can also trigger penalties (European Union, 2024, Arts. 71–74). Beyond legal risk, businesses today face rising expectations from customers, partners, and investors, who increasingly associate regulatory compliance with trust, responsibility, and long-term viability. For proactive organizations, early preparation can become a strategic advantage. Companies capable of demonstrating that their systems are “AI Act–ready” will find it easier to enter markets, secure partnerships, and signal reliability in a competitive landscape.
Remark:
I) Arts. 10–24 cover technical documentation, risk management, and data governance; they are cited above for (a) the transparency, documentation, and risk-management requirements and (b) the rule that non-conforming systems may not be placed on the market (European Union, 2024, Arts. 10–24).
II) Chapter XII, Arts. 71–74, is cited for the point that non-compliance can trigger penalties.
Why Large Companies Are (Mostly) Ready
Large companies are generally better prepared for the EU AI Act because they already operate within mature compliance structures established under regulations such as the GDPR and ISO industry standards. These existing systems make it easier for them to adapt to new requirements around documentation, risk management, and oversight. They also have in-house legal and compliance teams that can interpret regulatory obligations and integrate them into internal processes. Many large organizations have even created Responsible AI or AI governance units, reflecting growing investment in ethical AI frameworks.
The AI Act sets out obligations for high-risk AI systems — including risk management (European Union, 2024, Art. 9), data governance (European Union, 2024, Art. 10), technical documentation (European Union, 2024, Art. 11), record-keeping (European Union, 2024, Art. 12), transparency (European Union, 2024, Art. 13), human oversight (European Union, 2024, Art. 14), and accuracy, robustness, and cybersecurity (European Union, 2024, Art. 15). While these obligations apply to all providers, large companies are typically more capable of meeting them due to greater financial capacity, technical expertise, and existing governance infrastructure.
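For a provider tracking these obligations internally, the list above maps naturally onto a simple compliance checklist. The sketch below is illustrative only: the article-to-obligation mapping mirrors the text, while the checklist mechanics (function and variable names) are assumptions, not anything prescribed by the Act.

```python
# Illustrative sketch: the high-risk obligations named in Arts. 9-15,
# tracked as a minimal checklist. Not a legal compliance tool.
HIGH_RISK_OBLIGATIONS = {
    "Art. 9": "risk management",
    "Art. 10": "data governance",
    "Art. 11": "technical documentation",
    "Art. 12": "record-keeping",
    "Art. 13": "transparency",
    "Art. 14": "human oversight",
    "Art. 15": "accuracy, robustness and cybersecurity",
}

def outstanding(completed: set) -> list:
    """Return the obligations not yet marked complete."""
    return [f"{art}: {name}"
            for art, name in HIGH_RISK_OBLIGATIONS.items()
            if art not in completed]

# Example: two obligations addressed so far, five still open.
print(outstanding({"Art. 9", "Art. 11"}))
```

Even this trivial structure makes the point of the section concrete: a large provider with existing governance tooling already maintains such registers, while an SME typically builds them from scratch.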
Large enterprises also benefit from alignment with recognized ISO/IEC standards such as ISO/IEC 27001, ISO 9001, and ISO/IEC 42001 (International Organization for Standardization, 2023), which mirror many AI Act requirements and reduce the compliance gap.
Why the AI Act Hits SMEs Much Harder
SMEs face significantly greater challenges in meeting the EU AI Act’s requirements due to limited financial, technical, and organizational resources. Unlike large companies, most SMEs do not have in-house legal or compliance teams to interpret regulatory obligations or assess how the Act applies to their AI systems (Sillberg et al., in press). One of the most difficult steps for SMEs is determining the correct risk category of their AI system. The AI Act classifies systems into minimal, limited, high-risk, and prohibited categories, but interpreting these distinctions — especially for high-risk use cases — requires specialized expertise (Sillberg et al., 2024). SMEs often report uncertainty about where their systems fit and what obligations follow from that classification.
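The classification step described above can be pictured as a first-pass triage across the Act's four tiers. The sketch below is a deliberately crude illustration under stated assumptions: the tier names come from the text, but the example use-case lists and the function name are hypothetical and are not a substitute for legal analysis of Annex III and the prohibited-practice provisions.

```python
# Illustrative sketch only: a rough first-pass triage across the AI Act's
# four risk tiers. The use-case keywords below are assumptions chosen for
# demonstration, NOT official classification criteria.
PROHIBITED_PRACTICES = {"social_scoring", "subliminal_manipulation"}
HIGH_RISK_DOMAINS = {"employment", "credit_scoring", "education", "biometrics"}
LIMITED_RISK_USES = {"chatbot", "deepfake_generation"}  # transparency duties

def triage_risk_category(use_case: str) -> str:
    """Return a first-pass risk tier for an AI use case (illustrative only)."""
    if use_case in PROHIBITED_PRACTICES:
        return "prohibited"
    if use_case in HIGH_RISK_DOMAINS:
        return "high-risk"
    if use_case in LIMITED_RISK_USES:
        return "limited"
    return "minimal"

print(triage_risk_category("credit_scoring"))  # high-risk
print(triage_risk_category("chatbot"))         # limited
```

The difficulty for SMEs is precisely that real systems rarely fall into such clean buckets: the borderline cases that dominate practice require the specialized expertise the text describes.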
Even after identifying the risk category, SMEs struggle with understanding which obligations apply, such as documentation, data governance, risk management, transparency requirements, or human oversight. The AI Act sets these obligations out clearly in law, but applying them in practice can be complex without regulatory experience. SMEs also face higher financial and administrative burdens. Preparing technical documentation, maintaining logs, implementing quality management systems, and undergoing conformity assessments require time and specialist knowledge — resources that are often limited in small organizations.
Real Consequences of the SME–Large Company Compliance Gap
The compliance gap between large companies and SMEs can have substantial downstream effects on innovation and competition. SMEs often hesitate to develop or deploy AI because the administrative and financial burden of compliance feels overwhelming. This uncertainty can lead to delays or even the decision to avoid building AI systems altogether. As a result, SMEs may adopt AI more slowly, which reduces their ability to innovate at the same pace as larger organizations. Without sufficient support, smaller companies risk falling behind in the AI market (Small Businesses’ Guide, 2024).
Another consequence is increased reliance on large AI providers. Many SMEs may prefer to buy AI solutions from large technology companies that are already able to meet compliance obligations, rather than developing their own systems. This dependency can reduce SMEs’ control over how AI is implemented and limit their ability to differentiate their products (Small Businesses’ Guide, 2024).
What SMEs Can Do Now
Although SMEs face significant challenges under the AI Act, several practical steps can help reduce the compliance burden and make the process more manageable. A key recommendation is to start small and early by documenting AI development, data sources, and decision-making processes, even before formal compliance work begins. This early preparation prevents gaps later and makes it easier to demonstrate conformity. SMEs are also encouraged to take advantage of provider documentation and pre-configured compliance features offered by larger AI or cloud providers. Using tools that already meet certain requirements can reduce the need for building everything from scratch.
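The "start small and early" recommendation above can be made concrete with even a minimal record-keeping structure for development decisions and data sources. The sketch below is an assumption-laden illustration: the field names and the example system are hypothetical and do not reproduce the Act's formal technical-documentation template (Annex IV), but keeping records in any machine-readable form early makes later conformity work easier.

```python
# Illustrative sketch: minimal early documentation of AI development
# decisions and data sources. Field names are assumptions, not the
# AI Act's formal technical-documentation template.
from dataclasses import dataclass, field, asdict
from datetime import date
import json

@dataclass
class DevelopmentRecord:
    system_name: str
    intended_purpose: str
    data_sources: list = field(default_factory=list)
    design_decisions: list = field(default_factory=list)
    recorded_on: str = date.today().isoformat()

# Hypothetical example system, for illustration only.
record = DevelopmentRecord(
    system_name="invoice-classifier",
    intended_purpose="Route incoming invoices to the correct department",
    data_sources=["internal invoice archive 2019-2023 (anonymised)"],
    design_decisions=["chose gradient-boosted trees for auditability"],
)

# Persisting records as JSON builds an audit trail that later
# conformity work can draw on.
print(json.dumps(asdict(record), indent=2))
```

The design choice here is simply that documentation captured at decision time costs minutes, whereas reconstructing it retroactively for a conformity assessment costs far more.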
The European Commission encourages SMEs to use EU support programs, such as digital innovation hubs and testing facilities, which help interpret obligations and reduce the administrative burden (Small Businesses’ Guide, 2024).
In conclusion, the EU AI Act creates important safeguards for trustworthy AI, but its impact is uneven. Large companies are generally well prepared due to their established compliance structures, while SMEs face higher administrative and financial burdens that can slow AI adoption and limit competitiveness. To bridge this gap, SMEs need clear guidance, practical tools, and access to support programs across Europe. The European Commission highlights that targeted assistance — such as digital innovation hubs and simplified guidance — is essential to ensure that regulation does not create disproportionate barriers for small companies (Small Businesses’ Guide, 2024).
References:
1. European Union. 2024. Regulation (EU) 2024/1689 of the European Parliament and of the Council on artificial intelligence (Artificial Intelligence Act). Official Journal of the European Union. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32024R1689
2. Linck, A. 2022. The AI Act & SMEs: How can the AI Act support innovation? European Digital SME Alliance. Available at: https://www.digitalsme.eu/fact-sheet-ai-act/
3. European Digital SME Alliance. 2025. SME Guide for the implementation of ISO/IEC 27001 on information security management. Brussels, Belgium. 1–51.
4. Organisation for Economic Co-operation and Development (OECD). 2025. SMEs and entrepreneurship. Available at: https://www.oecd.org/en/topics/smes-and-entrepreneurship.html
5. Linck, A. 2021. European AI Act – Possible risks for innovative European SMEs? European Digital SME Alliance. Available at: https://www.digitalsme.eu/european-ai-act-possible-risks-for-innovative-european-smes/
6. International Organization for Standardization. 2023. ISO/IEC 42001:2023 – Artificial intelligence management system (AI MS). Available at: https://www.iso.org/standard/81230.html
7. Sillberg, C. V., De Cerqueira, J. S., Sillberg, P., Kemell, K.-K., & Abrahamsson, P. 2024. The EU AI Act is a good start but falls short. In International Conference on Software Business (pp. 114–130). Cham: Springer Nature Switzerland.
8. Sillberg, C. V., Kemell, K.-K., Sillberg, P., Saari, M., Harjuveteläinen, K., Waseem, M., & Abrahamsson, P. (in press). Navigating Compliance: Strategic Guidance for SMEs under the EU AI Act. Springer Briefs in Computer Science.
9. ArtificialIntelligenceAct.eu. 2024. Small businesses’ guide to the AI Act. Available at: https://artificialintelligenceact.eu/small-businesses-guide-to-the-ai-act/
