  Rethinking open source generative AI: open-washing and the EU AI Act

Liesenfeld, A., & Dingemanse, M. (in press). Rethinking open source generative AI: open-washing and the EU AI Act. In The 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’24). ACM.

Basic

Genre: Conference Paper

Files

liesenfeld_dingemanse_2024_FAccT_generative_AI_open-washing_EU_AI_Act.pdf (Any fulltext), 906KB
Name: liesenfeld_dingemanse_2024_FAccT_generative_AI_open-washing_EU_AI_Act.pdf
Description: -
OA-Status: Not specified
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: CC-BY-4.0

Creators

Creators:
Liesenfeld, Andreas (1), Author
Dingemanse, Mark (1), Author
Affiliations:
(1) Center for Language Studies, External Organizations, ou_55238

Content

Free keywords: -
Abstract: The past year has seen a steep rise in generative AI systems that claim to be open. But how open are they really? The question of what counts as open source in generative AI is poised to take on particular importance in light of the upcoming EU AI Act that regulates open source systems differently, creating an urgent need for practical openness assessment. Here we use an evidence-based framework that distinguishes 14 dimensions of openness, from training datasets to scientific and technical documentation and from licensing to access methods. Surveying over 45 generative AI systems (both text and text-to-image), we find that while the term open source is widely used, many models are 'open weight' at best and many providers seek to evade scientific, legal and regulatory scrutiny by withholding information on training and fine-tuning data. We argue that openness in generative AI is necessarily composite (consisting of multiple elements) and gradient (coming in degrees), and point out the risk of relying on single features like access or licensing to declare models open or not. Evidence-based openness assessment can help foster a generative AI landscape in which models can be effectively regulated, model providers can be held accountable, scientists can scrutinise generative AI, and end users can make informed decisions.

Details

Language(s): eng - English
Dates: 2024-01-14, 2024-05-05
 Publication Status: Accepted / In Press
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: -
 Degree: -

Event

Title: Seventh Annual ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT 2024)
Place of Event: Rio de Janeiro, Brazil
Start-/End Date: 2024-06-03 - 2024-06-06

Source 1

Title: The 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’24)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: ACM
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: -