AI Act High-Risk Requirements Readiness: Industrial Perspectives and Case Company Insights
The AI Act’s (AIA) requirements for high-risk AI systems affect many aspects of modern software systems. Knowing which AIA-related technical challenges are relevant to different companies is essential for focusing compliance-oriented research on the aspects that matter. We therefore conducted an interview study in collaboration with a case company that specializes in network video solutions within the security and surveillance industry. External experts enriched the study with a broader industry perspective. The goal was to analyze the case company’s readiness for the AIA’s high-risk requirements, based on methods and techniques established prior to the legislation. Our results revealed a positive sentiment toward the regulation and the planning security it brings, although a high workload was expected. We identified a solid foundation of well-established practices to build upon for the requirements on cybersecurity, human oversight, record-keeping, and technical documentation. However, we also report several open challenges, mainly connected to the requirement on data governance, followed by accuracy, robustness, and cybersecurity. The AIA specifically demands a post-market monitoring system (Art. 72) and the right to an explanation of individual decision-making (Art. 86); our respondents identified these two obligations as especially challenging. The results of this study are expected to steer future compliance-oriented work toward the most pressing challenges.