By: Emma Fitch, Clare McKenzie, Terri Janke and Adam Shul
Introduction
Artificial intelligence (AI) is increasingly being used for creative and artistic innovation. But how does AI interface with Indigenous culture and knowledge?
AI has the potential to assist people in many capacities. At the same time, AI raises issues and risks for copyright, Intellectual Property (IP), and Indigenous Cultural and Intellectual Property (ICIP). This blog post discusses the current landscape of AI and copyright, then explores how AI connects to ICIP. It outlines some key issues and concerns, identifies opportunities where AI can be beneficial, and presents ways to protect ICIP in projects that involve AI.
AI is a developing space, and there is still uncertainty about the relationship between AI and copyright. In Australia, AI tools do not currently have their own distinct legal status as copyright owners – AI tools generally cannot own copyright, as copyright is only afforded to human authors who have contributed ‘independent intellectual effort’. This means that a threshold of human input must be reached before an AI-generated work can be protected by copyright.[1]
This has led to many alleged copyright infringements perpetrated by AI companies globally. In September 2023, for example, thousands of books by recognised Australian authors were found to have been used, without the authors’ permission, in a dataset for training generative AI.[2] AI can also cause cultural offence. During the campaign for the Voice to Parliament referendum, a lobby group is alleged to have used AI to create advertisements that depicted Indigenous ‘characters’ who were voting no.[3] These two examples highlight the significant legal and ethical challenges in this space.
AI and ICIP
AI works by processing data and learning behavioural patterns with specialised algorithms. It then creates something, whether an image or text, based on the information it has learnt.
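To make this concrete, below is a minimal sketch, in Python, of the idea: a toy ‘Markov chain’ generator that learns which word tends to follow which in its training text, then strings new text together from those learned patterns. The corpus is invented for illustration and the sketch assumes nothing about any particular commercial system, but it shows why an AI’s output can only ever be as reliable as the material it was trained on.

    import random
    from collections import defaultdict

    # Toy illustration only: a tiny 'Markov chain' text generator.
    # Real generative AI is far more complex, but the core idea is the
    # same: learn statistical patterns from training text, then produce
    # new text from those patterns. This invented corpus stands in for
    # whatever material was scraped, credible or not.
    corpus = (
        "ai learns patterns from whatever text it is given "
        "ai repeats patterns found in its training data "
        "ai cannot verify whether its training data is accurate"
    )

    # 'Training': record which words follow each word in the corpus.
    transitions = defaultdict(list)
    words = corpus.split()
    for current_word, next_word in zip(words, words[1:]):
        transitions[current_word].append(next_word)

    # 'Generation': start from a word and repeatedly sample a follower.
    random.seed(1)
    word = "ai"
    output = [word]
    for _ in range(10):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)

    print(" ".join(output))

The generator produces fluent-sounding text without any way of checking whether what it learnt was true, which is precisely the problem discussed next.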
AI sources information by scouring the internet. That information may not be correct, and for the cultural information it can access online, there is no way of telling whether it is authoritative, credible, or accurate. For example, if you use ChatGPT to write about Wiradjuri culture, it could draw on documents that contain false, offensive, and/or secret/sacred cultural knowledge. In fact, when asked on two separate occasions to explain Wiradjuri kinship structures, ChatGPT provided two conflicting descriptions of the language word for moieties. The AI delivered the information in an authoritative way without being able to verify its source material.
Noting the above, ICIP could be inappropriately exploited by AI, especially if the AI program or the AI user has not obtained free, prior, and informed consent (FPIC) from knowledge holders to use that ICIP. There are also issues surrounding the lack of attribution for the knowledge, and the context in which the ICIP is used.
In sum, AI could negatively impact ICIP rights by:
Giving no attribution to the Traditional Owners who hold that knowledge and language;
Providing false or misleading information;
Disregarding cultural protocols that govern how art styles are used or who can share certain information, for example, who is permitted to tell or illustrate a story, or to use gender-specific artwork and designs;
Creating works that misappropriate ICIP without undergoing consultation, or obtaining FPIC; and
Producing works that may not meet the requirements for copyright protection, meaning the ICIP contained in them has no legal protection.
Aboriginal and Torres Strait Islander Artwork and AI
There are already issues with how ICIP is used in misappropriated artwork that inappropriately exploits Indigenous art styles and designs. The Productivity Commission’s study into Aboriginal and Torres Strait Islander arts and crafts found that inauthentic products are common.[4] The study identified that inauthentic art is especially prominent in the tourism industry, even without AI being used to create works.
Generative AI can compound the problem of inauthentic art because anyone can create ‘Aboriginal style art’. Continuing the Wiradjuri case study, the generative image AI program Stable Diffusion produced the following images when given these prompts:
[Images generated from the prompts ‘Aboriginal Art’ and ‘Wiradjuri Art’.[5]]
Please note that the above images are shown purely for demonstrative purposes and are not intended to cause offence.
A reverse Google image search shows how the AI draws on many examples from the internet.
This case study draws attention to the ways in which AI could create inauthentic works and infringe individual artists’ ICIP rights by using artworks without consent or proper attribution. Amy Allerton, a Gumbaynggir and Bundjalung woman and founder of Indigenous design company Indigico Creative, notes the danger of AI in this respect:
Having their [Indigenous Artists] artwork stolen and used without permission is a traumatic experience. The increased accessibility provided by AI technology could intensify these challenges within the online realm.[6]
The case study also highlights AI’s inability to understand and respect the important and meaningful connection that a person creating an artwork has to culture and Country, which informs what the artwork is about and how it is created.[7] Reinserting Indigenous sovereignty and self-determination practices into the AI space can alleviate this concern.
Some examples of this include:
Collating data about Country and Community – using data to drive initiatives;
Making data available to Community;
Tools and apps for monitoring and managing Country;
Writing scripts or essays that can be edited through a cultural lens; and
Virtual reality which shares images or stories of Country.
Protecting ICIP Rights
Users of AI should consider ICIP rights, recognise Indigenous Data Sovereignty principles as they relate to AI,[8] and ensure AI is used in ways that respect ICIP rights. ‘Out of the Black Box: Indigenous Protocols for AI’ (2021)[9] is a great guide that explains how AI projects can conform to these principles and practices. The project, led by Angie Abdilla, Megan Kelleher, Rick Shaw, and Tyson Yunkaporta, incorporates Indigenous ontology and epistemology, and Country-centred design, into AI. It demonstrates how Indigenous knowledge can guide the design and practices of an AI system.
AI can create something through ‘machine learning’, the process by which the AI creator trains the system on outside input.[10] The information provided to the AI system shapes its behaviour. To protect against misuse of ICIP, it is important that First Nations people are involved in the development of AI, that the creators of the AI system can recognise where ICIP may be used inappropriately, and that the information used for machine learning is limited to avoid negative impacts on ICIP rights and copyright. These strategies could help prevent the misuse of ICIP and other expressions of cultural heritage, such as artwork, songs, and stories.
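As a purely hypothetical sketch of the last of these strategies, limiting what goes into a training set, the Python snippet below admits material containing ICIP only where consent and attribution are recorded. Every field name and rule is invented for illustration; in a real project the criteria would be set with the relevant Traditional Owners under protocols such as FPIC, not hard-coded by a developer.

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical sketch only: the fields and rules below are invented
    # for illustration. In practice, criteria for using ICIP would be
    # set with the relevant Traditional Owners, not fixed in code.

    @dataclass
    class TrainingRecord:
        source: str
        contains_icip: bool          # does the item contain ICIP?
        fpic_obtained: bool          # free, prior and informed consent recorded?
        attribution: Optional[str]   # Traditional Owners to credit, if known

    def eligible_for_training(record: TrainingRecord) -> bool:
        # Admit a record only if it contains no ICIP, or if FPIC and
        # attribution are both in place for the ICIP it contains.
        if not record.contains_icip:
            return True
        return record.fpic_obtained and record.attribution is not None

    records = [
        TrainingRecord("public essay", False, False, None),
        TrainingRecord("community-shared story", True, True,
                       "Wiradjuri Traditional Owners"),
        TrainingRecord("scraped artwork image", True, False, None),  # excluded
    ]

    training_set = [r for r in records if eligible_for_training(r)]
    print(f"{len(training_set)} of {len(records)} records admitted")

The point of the sketch is simply that exclusion decisions can be made before training ever begins, rather than after a misuse has already occurred.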
Involving the First Nations people whose ICIP might be used by AI in the design of the system can protect ICIP rights. It can help maintain, contribute to, and preserve cultural knowledge. For example, Traditional Owners in Kakadu National Park are using AI to manage the growth of invasive para grass, which affects magpie geese populations, across a two million hectare area. Traditional Owners work with rangers and researchers in conservation management and assist in programming the AI with ICIP, including geographic knowledge of Country, that helps manage the para grass.[11]
TJC notes that True Tracks® protocols are necessary for any AI project involving Traditional Knowledge and ICIP. These protocols should stress respect for Indigenous Data Sovereignty, FPIC, and attribution and integrity. Traditional Knowledge can also be protected using contracts. If you are hoping to undertake a project that involves ICIP and AI, get in contact with us to find out more!
[1] Arts Law Centre of Australia, ‘Artificial Intelligence (AI) and Copyright’, Arts Law Centre of Australia (2023) <https://www.artslaw.com.au/information-sheet/artificial-intelligence-ai-and-copyright/>.
[2] Kelly Burke, ‘“Biggest Act of Copyright Theft in History”: Thousands of Australian Books Allegedly Used to Train AI Model’, The Guardian (online, 28 September 2023) <https://www.theguardian.com/australia-news/2023/sep/28/australian-books-training-ai-books3-stolen-pirated>.
[3] Josh Butler, ‘Unofficial Indigenous Voice No Campaigner Defends Use of AI-Generated Ads on Facebook’, The Guardian (online, 7 August 2023) <https://www.theguardian.com/australia-news/2023/aug/07/indigenous-voice-to-parliament-no-campaign-ai-facebook-ads>.
[4] Australian Government Productivity Commission, Aboriginal and Torres Strait Islander Visual Arts and Crafts Study Report (November 2022) <https://www.pc.gov.au/inquiries/completed/indigenous-arts/report>.
[5] Note that copyright in these images remains with the original artists.
[6] ‘NAIDOC Week: Amy Allerton Puts the Focus on Indigenous Art’, The Big Smoke (6 July 2023) <https://thebigsmoke.com.au/2023/07/07/naidoc-week-amy-allerton-puts-the-focus-on-indigenous-art/>.
[7] Bronwyn Carlson and Peita Richards, ‘Indigenous Knowledges Informing “Machine Learning” Could Prevent Stolen Art and Other Culturally Unsafe AI Practices’, The Conversation (8 September 2023) <https://theconversation.com/indigenous-knowledges-informing-machine-learning-could-prevent-stolen-art-and-other-culturally-unsafe-ai-practices-210625>.
[8] Angie Abdilla et al, Out of the Black Box: Indigenous Protocols for AI (2021) <https://www.anat.org.au/wp-content/uploads/2021/11/Out-of-the-Black-Box_Indigenous-protocols-for-AI.pdf/> 16.
[9] Angie Abdilla et al, Out of the Black Box: Indigenous Protocols for AI (2021) <https://www.anat.org.au/wp-content/uploads/2021/11/Out-of-the-Black-Box_Indigenous-protocols-for-AI.pdf/>.
[10] Bronwyn Carlson and Peita Richards, ‘Indigenous Knowledges Informing “Machine Learning” Could Prevent Stolen Art and Other Culturally Unsafe AI Practices’, The Conversation (8 September 2023) <https://theconversation.com/indigenous-knowledges-informing-machine-learning-could-prevent-stolen-art-and-other-culturally-unsafe-ai-practices-210625>.
[11] Kate Cranney, ‘Magpie Geese Return with Help from Ethical AI and Indigenous Knowledge’, CSIRO (19 November 2019) <https://www.csiro.au/en/news/all/articles/2019/november/magpie-geese-return-ethical-ai-indigenous-knowledge>.