alt.chi commentary to: Piling et al. - The process of gaining an AI legibility Mark

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Abstract

Researchers and designers working in industrial sectors seeking to incorporate Artificial Intelligence (AI) technology will be aware of the emerging International Organisation for AI Legibility (IOAIL). IOAIL was established to overcome the eruption of obscure AI technology. One of the primary goals of IOAIL is the development of a proficient certification body providing knowledge to users regarding the AI technology they are being exposed to. To this end, IOAIL produced a system of standardised icons for attaching to products and systems, both to indicate the presence of AI and to increase the legibility of that AI's attributes. Whilst the process of certification is voluntary, it is becoming a mark of trust, enhancing the usability and acceptability of AI-infused products through improved legibility. In this paper we present our experience of seeking certification for a locally implemented AI security system, highlighting the issues generated for those seeking to adopt such certification.

Original language: English
Title of host publication: CHI EA 2020 - Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
ISBN (Electronic): 9781450368193
DOIs
Publication status: Published - 25 Apr 2020

Keywords

  • AI and transparency
  • Transparency
  • Design fiction
  • Marks
  • Artificial intelligence
  • Legibility

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design

