Regulatory

EU Negotiators Reach Agreement on Landmark AI Regulations

BitOoda Regulatory Analysis, 12/13/23

Tom Nath
Key Takeaway #1

The new rules establish obligations for AI based on the level of risk and systemic nature of an AI model or system.

Key Takeaway #2

The EU AI Act will serve as a first-of-its-kind law that other countries will likely leverage when developing their own laws governing AI.

Key Takeaway #3

If a general-purpose AI model meets a certain set of criteria, it will be deemed a high-impact AI model and will be subject to heightened testing and reporting requirements.

Key Takeaway #4

Fines for violations of the Act range from €7.5 million or 1.5% of global turnover up to €35 million or 7% of global turnover.


On December 8, negotiators from the EU Parliament, Council, and Commission reached a provisional agreement on comprehensive regulations that will govern the use of AI by both private companies and government agencies. The EU AI Act (the “Act”) is expected to be passed next year and become effective in two years. The new rules establish obligations for AI based on the level of risk and systemic nature of an AI model or system. They also provide guardrails and exceptions for AI used by government agencies, specifically law enforcement.

When formally passed by the EU Parliament and Council, the EU AI Act will serve as a first-of-its-kind law that other countries will likely leverage when developing their own laws governing AI.

The Act prohibits certain applications of AI, including biometric categorization based on sensitive characteristics (e.g., religious beliefs, political beliefs, and race), untargeted scraping of facial images from the internet or CCTV footage to build facial recognition databases, and manipulation of human behavior. However, there are narrow exceptions for law enforcement agencies, permitting them to use biometric identification systems in connection with certain crimes.

Under the Act, all general-purpose AI systems (e.g., ChatGPT) will be subject to transparency requirements, including technical documentation, adherence to EU copyright law, and summaries of the information and content a model uses for training.

If a general-purpose AI model meets a certain set of criteria, it will be deemed a high-impact AI model and will be subject to heightened requirements (see the illustrative sketch after this list):

• Conducting model evaluations;

• Assessing and mitigating systemic risk;

• Conducting adversarial testing;

• Reporting serious incidents to the Commission; and

• Reporting on the model’s energy requirements.
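To make the tiered structure concrete, the sketch below encodes the obligations described above as a simple Python mapping. It is purely illustrative: the names BASELINE_OBLIGATIONS, HIGH_IMPACT_OBLIGATIONS, and obligations_for are our own shorthand for summarizing this note, not terminology or a legal taxonomy drawn from the Act itself.

```python
# Illustrative only: a simplified summary of the Act's tiered obligations for
# general-purpose AI models, as described in this note. The names and grouping
# below are our own shorthand, not the Act's legal taxonomy.

BASELINE_OBLIGATIONS = [
    "Maintain technical documentation",
    "Adhere to EU copyright law",
    "Publish summaries of the information and content used for training",
]

HIGH_IMPACT_OBLIGATIONS = BASELINE_OBLIGATIONS + [
    "Conduct model evaluations",
    "Assess and mitigate systemic risk",
    "Conduct adversarial testing",
    "Report serious incidents to the Commission",
    "Report on the model's energy requirements",
]

def obligations_for(is_high_impact: bool) -> list[str]:
    """Return the illustrative obligation checklist for a general-purpose AI model."""
    return HIGH_IMPACT_OBLIGATIONS if is_high_impact else BASELINE_OBLIGATIONS
```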

In addition to establishing standards for AI model usage and oversight, the Act provides consumers a path to file complaints against AI systems and to obtain explanations for how AI may have been used to affect outcomes. Fines for violations of the Act range from €7.5 million or 1.5% of global turnover up to €35 million or 7% of global turnover, depending on the violation.
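As a rough illustration of how the penalty caps scale with company size, the snippet below computes the maximum exposure for a hypothetical company. It assumes the cap is the greater of the fixed amount and the turnover percentage; the final legal text will govern how the caps are actually applied, and the function name max_fine_eur and the turnover figure are invented for the example.

```python
# Rough illustration of the Act's penalty caps as described above.
# Assumption: the cap is the greater of the fixed amount and the percentage of
# global turnover; the final legal text governs how the caps actually apply.

def max_fine_eur(global_turnover_eur: float, fixed_cap_eur: float, turnover_pct: float) -> float:
    """Return the larger of the fixed cap and the turnover-based cap."""
    return max(fixed_cap_eur, global_turnover_eur * turnover_pct)

# Hypothetical company with €2 billion in global turnover:
turnover = 2_000_000_000
print(max_fine_eur(turnover, 35_000_000, 0.07))   # top tier: 140,000,000 (7% of turnover exceeds €35M)
print(max_fine_eur(turnover, 7_500_000, 0.015))   # lowest tier: 30,000,000 (1.5% of turnover exceeds €7.5M)
```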

We applaud EU legislators for recognizing that governments need to balance the systemic risk certain AI models may pose against the benefits of AI-focused innovation. The regulatory clarity the Act offers could give EU-based AI developers an advantage over developers in jurisdictions that are still considering how to regulate the technology.

BitOoda will continue to share our insights as we work to advance the market and provide our clients with innovative and compliant solutions.

Disclosures

Purpose: This research is only for the clients of BitOoda. This research is not intended to constitute an offer, solicitation, or invitation for any securities and may not be distributed into jurisdictions where it is unlawful to do so. For additional disclosures and information, please contact a BitOoda representative at info@bitooda.io.

Analyst Certification: Tom Nath, the research analyst denoted by an “AC” on the cover of this report, hereby certifies that all of the views expressed in this report accurately reflect his personal views, which have not been influenced by considerations of the firm’s business or client relationships.

Conflicts of Interest: This research contains the views, opinions, and recommendations of BitOoda. This report is intended for research and educational purposes only. We are not compensated in any way based upon any specific view or recommendation.

General Disclosures: Any information (“Information”) provided by BitOoda Holdings, Inc., BitOoda Digital, LLC, BitOoda Technologies, LLC or Ooda Commodities, LLC and its affiliated or related companies (collectively, “BitOoda”), either in this publication or document, in any other communication, or on or through http://www.bitooda.io/, including any information regarding proposed transactions or trading strategies, is for informational purposes only and is provided without charge. BitOoda is not and does not act as a fiduciary or adviser, or in any similar capacity, in providing the Information, and the Information may not be relied upon as investment, financial, legal, tax, regulatory, or any other type of advice. The Information is being distributed as part of BitOoda’s sales and marketing efforts as an introducing broker and is incidental to its business as such. BitOoda seeks to earn execution fees when its clients execute transactions using its services. BitOoda makes no representations or warranties (express or implied) regarding, nor shall it have any responsibility or liability for, the accuracy, adequacy, timeliness, or completeness of the Information, and no representation is made or is to be implied that the Information will remain unchanged. BitOoda undertakes no duty to amend, correct, update, or otherwise supplement the Information. The Information has not been prepared or tailored to address, and may not be suitable or appropriate for, the particular financial needs, circumstances, or requirements of any person, and it should not be the basis for making any investment or transaction decision. The Information is not a recommendation to engage in any transaction. The digital asset industry is subject to a range of inherent risks, including but not limited to: price volatility, limited liquidity, limited and incomplete information regarding certain instruments, products, or digital assets, and a still emerging and evolving regulatory environment. The past performance of any instruments, products, or digital assets addressed in the Information is not a guide to future performance, nor is it a reliable indicator of future results or performance. All derivatives brokerage is conducted by Ooda Commodities, LLC, a member of NFA and subject to NFA’s regulatory oversight and examinations. However, you should be aware that NFA does not have regulatory oversight authority over underlying or spot virtual currency products or transactions or virtual currency exchanges, custodians, or markets. BitOoda Technologies, LLC is a member of FINRA.
“BitOoda”, “BitOoda Difficulty”, “BitOoda Hash”, “BitOoda Compute”, and the BitOoda logo are trademarks of BitOoda Holdings, Inc. Copyright 2023 BitOoda Holdings, Inc. All rights reserved. No part of this material may be reprinted, redistributed, or sold without prior written consent of BitOoda.

Related Research