(Un)read in the ledger

Elliott Bledsoe
8 min read · Apr 14, 2024


Weekly reading list: Monday 8 – Sunday 14 April 2024

Things I have been reading, stuff I have come back to for a re-read and things that I want to read but haven’t got to yet.

Image: AI-generated by Elliott Bledsoe using Text to Vector Graphic (Beta) in Adobe Illustrator.

This week we found out whether the Ladies Lounge at MONA was discriminatory, we learned Meta considered buying Simon & Schuster, and there’s lots to read about music and AI, among other things.

Read

What I’ve been reading this week:

  • Meta ‘discussed buying publisher Simon & Schuster to train AI’
    In a sadly unsurprising but deeply concerning turn of events, recordings of internal meetings at Meta reveal that staff discussed buying ‘Big Five’ publisher Simon & Schuster so they could use the books it publishes to train the company’s artificial intelligence tools. I saw it on Aimée Lindorff’s LinkedIn, where she rightly flagged that authors should be reviewing their publishing contracts to see what is what. While the acquisition didn’t actually happen, similar ones could, and the terms of creators’ agreements will determine whether their content could legally be fed into AI. This kind of move is also not surprising given the panic that the AI industry might be running out of training data (see Re-reads below).
    Ella Creamer – Wednesday 10 April 2024
    The Guardian
  • Artificial Intelligence (AI) content policy
    During the week Medium emailed participants in the Medium Partner Program (including me) to notify them of a policy update on the use of AI in writing that is paywalled as part of the program. The crux of it is that writing where the “… majority of the content has been created by an AI-writing program with little or no edits, improvements, fact-checking, or changes” is considered AI-generated writing and is not allowed to be paywalled, whether the use of AI was disclosed or not. Where AI was used to assist in the writing, it must be disclosed at the beginning of the story (within the first two paragraphs). {As you know, I disclose my use of AI, and the way I do that is now modelled on Kester Brewin’s AI transparency statement suggestions, which were in last week’s reading list.} AI-generated images also need to be captioned as such. While I get the intention, there is currently no disclosure of how the use of AI will be detected, what notification will be made to the author/user, or what, if any, right of reply the author/user has. Perhaps there’s a blog post in that?
    Medium Help Centre, Medium

Re-reads

Things I have circled back to:

  • Researchers warn we could run out of data to train AI by 2026. What then?
    In light of the revelation that Meta considered buying Simon & Schuster, I went back to Dr Rita Matulionyte’s article about running out of training data for AI from late last year. The problem is that accurate and high-quality AI algorithms need huge amounts of data but “… research shows online data stocks are growing much slower than datasets used to train AI.” Couple this with the increasing scrutiny and (rightful) questions being asked about where AI training data was sourced, and it is no surprise AI developers are looking around for ‘less copyright grey area’ content. In fact, Matulionyte suggested large publishers and offline repositories as possible sources!
    Rita Matulionyte – Wednesday 8 November 2023
    The Conversation

Add it to the pile

New additions to the unread pile:

  • Fairness And Fair Use In Generative AI
    I haven’t read it yet, but, as the abstract says, “Generative AI gives us yet another context to consider copyright’s most fundamental question: where do the rights of the copyright owner end and the freedom to use copyrighted works begin?” Sag notes that countries may try to answer that question through specific exceptions, through the application of existing fair use or fair dealing provisions, or by ‘hiding their heads in the sand’. Sag’s stated aim “… is not to establish that generative AI is, or should be, non-infringing; it is to outline an analytical framework for making that assessment in particular cases.” It should be an insightful article.
    Matthew Sag – Monday 1 April 2024
    Fordham Law Review, Vol 92
  • No Algorithm for Culture: How Humans See What AI Can’t
    {This is actually something to listen to, not to read, but still.} Patternmakers shared a link to Toygun Yilmazer’s recent talk at SXSW in their newsletter, and I’ve added it to the list. I am very interested in the ways different industries are rejecting, embracing, interrogating and internalising AI, and especially in how arguments are made in favour of human creativity, so I am keen to hear Yilmazer’s thoughts.
    Toygun Yilmazer – Sunday 10 March 2024
    SXSW

Of course, there’s lots of other stuff I have been reading that doesn’t make it into the weekly round-up. (If you have a Google Account you can even share links with me.)

Disclosure

Conflict of interest

I work part-time for the Australian Digital Alliance (ADA) and the Australian Libraries and Archives Copyright Coalition (ALACC). The ADA recently hosted Professor Matthew Sag as the keynote speaker at the ADA Copyright Forum 2024.

AI use

This blog post was drafted using Google Docs. No part of the text of this blog post was generated using AI. The original text was not modified or improved using AI. No text suggested by AI was incorporated. If spelling or grammar corrections were suggested by AI, they were accepted or rejected at my discretion (however, corrections of spelling, grammar and typos may sometimes have occurred automatically in Google Docs).

The banner image (i.e. the image at the top of the blog post) was generated by AI using Text to Vector Graphic (Beta) in Adobe Illustrator.

Credits

Image: A pile of books with purple, yellow and pink covers. An adaptation of an image generated by Elliott Bledsoe using Text to Vector Graphic (Beta) in Adobe Illustrator. Prompt: ‘pile of books uneven hand-drawn’.

Provenance

This blog post was produced by Elliott Bledsoe from Agentry, an arts marketing micro-consultancy. It was first published on 14 Apr 2024. It has not been updated. This is version 1.0. Questions, comments and corrections are welcome – you can get Elliott on elliott@agentry.au.

Reuse

You can’t keep a good idea to yourself. I believe in the power of open access to knowledge and creativity and a thriving commons of shared knowledge and culture. That’s why this blog post is licensed for reuse under a Creative Commons licence.

You can reuse the text of this blog post under the terms of a Creative Commons Attribution 4.0 International licence (CC BY 4.0) (see link below). Under the licence, you are free to copy, share and adapt this blog post, or any modified version you create from it, even commercially, as long as you acknowledge Elliott Bledsoe/Agentry as the original creator of it. So please make use of this blog post as you see fit.

Whether AI-generated outputs are protected by copyright remains contested. To the extent that copyright exists, if at all, in the banner image I generated using AI for this blog post (i.e. the image at the top of the blog post/article), I also license it for reuse under the terms of the Creative Commons licence (CC BY 4.0).
