Germany Wants EU to Double Down on Idea That Would Hinder the AI Economy
Center for Data Innovation
by Eline Chivot
October 9, 2020
The European Commission has proposed strictly regulating AI systems only when they meet two conditions: they are used in a sector where significant risks can be expected, and they are used in a manner likely to create such risks. But Germany has called on the EU to abandon this proposal, arguing that tougher rules should apply to all sectors that use AI, and even to AI applications that do not pose a significant risk.
This is not the first time Germany has called for stricter regulation of AI, but now that it holds the EU Council presidency, its perspective is likely to carry more weight in the Commission's regulatory choices. Following Germany's advice, however, would have far-reaching negative implications for innovation in the EU.
First, imposing stricter rules on lower-risk AI systems would achieve little in the way of consumer protection, because these systems already pose little risk to consumers and existing consumer protection laws already apply to them. It does not make sense to require AI-powered dating apps to undergo the same level of scrutiny as credit scoring tools. Moreover, subjecting these AI systems to additional regulations would increase compliance costs and liability for companies, discouraging them from using AI and raising prices for consumers.
Second, applying regulation to AI applications outside sectors where significant risks are likely to occur means that more applications would have to undergo the Commission's proposed conformity assessments before going to market. These regulatory impediments would put European companies, especially SMEs, at a competitive disadvantage because it would take them longer and cost them more to bring products to market. This may also deter companies from launching AI products and services in the EU at all, leading them to focus their investments on friendlier markets instead.
Third, extending regulation to cover the whole spectrum of AI systems, from the simplest to the most advanced, would mean that many already-commercialized AI applications would have to be pulled from the market—at least until a risk assessment is performed—to the detriment of countless businesses and their customers.
To be clear, even the European Commission’s proposed ex ante conformity assessments for AI systems are unnecessary and problematic.
AI systems already must adhere to various European laws and regulations, including sector-specific ones. Moreover, applying additional rules to AI systems signals that the technology should be viewed with suspicion (thereby limiting market demand), while the assessments themselves slow companies from offering AI solutions to various sectors and use cases (thereby limiting market supply).
Instead of shackling AI, Germany needs to find ways to accelerate its use. The German Minister for Digital Affairs Dorothee Bär acknowledged during a European AI Forum held in June that the penetration of AI in the German economy remains partial and that SMEs (the backbone of its economy) "do not make enough use of the potential of AI."
The European Commission should resist the temptation to expand its proposed conformity assessments to more types of AI systems. Doing so would limit the use of AI applications in the EU and undermine the EU’s objective to catch up to the United States and China.
Image credits: Wikimedia Commons