{"id":1556,"date":"2023-11-10T10:51:35","date_gmt":"2023-11-10T10:51:35","guid":{"rendered":"https:\/\/davidmoore.io\/?p=1556"},"modified":"2024-05-03T15:57:59","modified_gmt":"2024-05-03T14:57:59","slug":"the-transparency-question-ais-role-in-content-creation-and-creative-integrity","status":"publish","type":"post","link":"https:\/\/davidmoore.io\/the-transparency-question-ais-role-in-content-creation-and-creative-integrity\/","title":{"rendered":"The Transparency Question: AI’s Role in Content Creation and Creative Integrity"},"content":{"rendered":"
I recently spoke at an internal Balfour Beatty Investments conference about our AI strategy, and someone asked me… “Should you declare if you’re using AI to generate content?” It also came up when I rambled at Multiverse students on an internal panel discussion about AI’s potential impact on the employment market, and again this week on an internal call with some Balfour Beatty employees.<\/p>\n
My answer varied each time because, honestly, I don’t know. I also don’t think it would be enforceable in any way. However, it’s currently often pretty obvious when AI has been used in its raw form, particularly in list articles and opinion pieces… we’re onto you, ‘5 things you should consider…’ writers \ud83d\ude09 I see the same on my LinkedIn<\/a> feed, in Medium articles and Reddit posts, and it keeps crossing my mind \u2013 maybe there are contexts where disclosure is necessary and, more to the point, ethical?<\/p>\n I recently watched some videos where new business owners described using ChatGPT to write whole books on specific subjects, deliberately targeting particular markets and demographics, sometimes putting the name of a human author on a book entirely written by an AI tool. To me, that feels disingenuous at best.<\/p>\n This video<\/a> was genuinely a bit troubling. It tells the story of a young guy who claims to have made more than $1m selling entirely AI-generated books. Several of the books are targeted at the white-middle-class-suburban-mum self-help market, presenting advice on improving self-confidence and calming minds.<\/p>\n There’s a lot of breathless excitement online about these sorts of startups, but I struggle to find any inspiration in a business model rooted in churning out generative AI content and wrapped in sophisticated online marketing techniques. The naked focus on profit and marketing, rather than quality content, is pretty disconcerting.<\/p>\n<\/div>\n I also know a ton of people who work in the creative industries, and this keeps coming up. 
Should a photographer declare if they’ve edited a photo using Stable Diffusion and Inpainting<\/a> or used Adobe Firefly<\/a> to create fake backgrounds or generate whole images?<\/p>\n A friend suggested that this is an extension of the never-ending debates around Photoshop use and disclosure in the photography world,<\/a> and he probably has a point. However, Photoshop is complex to learn, and the barrier to entry is much lower today: the new breed of free (or very cheap) tools requires little to no skill to use. In some contexts, I think that’s great, particularly when it enables non-technical users to take advantage of historically complex tools in areas like data analysis, but I’m not so sure the same applies when generating creative or expert advice content and then passing it off as being created by a human.<\/p>\n<\/div>\n
\nAI and Creative Boundaries<\/h4>\n