From Backderf to Munch; Crumb to Pollack; Kirby to Xbox
January 2, 2024 3:50 PM

 
Randall Munroe and Jeph Jacques popped out to me. I assume there's more wild stuff buried in there.
posted by supercres at 3:57 PM on January 2


Who/what is “my vision”?
posted by Going To Maine at 4:03 PM on January 2




Alan Davis
Scott McCloud
John Romita Jr
Steve Ditko
Bill Sienkiewicz
Bill Watterson
Berkeley Breathed
posted by ursus_comiter at 4:06 PM on January 2 [1 favorite]




Ditto:

Video Arcade System
Video Art
Video Challenger
Video Driver
Videopac+G7400
Videosmarts
View-Master Interactive
Vision
posted by Going To Maine at 4:07 PM on January 2




Wow, that “AI Spring” article on Wikipedia is some garbage.
posted by Going To Maine at 4:08 PM on January 2 [3 favorites]


Oh hey, Yoko Ono is in there. Gonna ask Midjourney to make a line and follow it
posted by Going To Maine at 4:11 PM on January 2


It's really obvious to us that this is ethically mega wrong, but getting a judge to agree that this amounts to a crime is another question. The defendant companies all basically say "yeah, we trained on those images, but they are not stored or represented in the resulting model in any real way except that it has learned certain distinctive aspects of them. So there is no copyright infringement."

The argument in this suit is basically that because the inputs were copyrighted or copyrightable works, then the output is necessarily derivative of them and a "digital collage" that competes directly with the artist and infringes on their work. It's not an easy case to make, and the companies being sued have lots of money, good lawyers, and the advantage of there being little to no precedent in this area.

Unfortunately I expect a lot of these suits to fail until there is some kind of legislative action here or some more compelling legal angle is arrived at to compel the generative AI companies to act ethically.

Fortunately I also expect the artistic limits of these tools to become more and more obvious as people find that they are not replacements for creation, and indeed don't really "create" at all.
posted by BlackLeotardFront at 4:15 PM on January 2 [5 favorites]


I am more hopeful that the courts will help rein this in, because if a system can reproduce a work in such a way that's nearly entirely accurate without human intervention, then regardless of the details it obviously is being stored in some meaningful way, or else copyright is irreparably broken, as you could scrub any artist's rights just by training an AI on their works then asking it to make new ones.
posted by JHarris at 4:33 PM on January 2 [3 favorites]


chavenet posted this to the other AI thread: Things are about to get a lot worse for Generative AI.

It's not artists who are going to cause trouble for the AI bros; it's megacorps invoking trademark law.

I mean, hoovering up art in order to regurgitate it for profit is immoral, but pissing off Disney is unwise.
posted by zompist at 4:36 PM on January 2 [18 favorites]


Small world. A brief skim revealed that I own an original sketch by one artist on this list (Gahan Wilson) and used to be good friends with another, until he turned out to be a manipulative asshole.

Appropriately, M.C. Escher is on the list twice.
posted by Faint of Butt at 4:49 PM on January 2 [7 favorites]


Burn it to the ground.
posted by cupcakeninja at 4:54 PM on January 2 [4 favorites]


I Am Here For This
posted by glonous keming at 5:10 PM on January 2 [1 favorite]


H R Giger

That's worrying.
posted by Pickman's Next Top Model at 5:11 PM on January 2 [2 favorites]


I am more hopeful that the courts will help rein this in, because if a system can reproduce a work in such a way that's nearly entirely accurate without human intervention, then regardless of the details it obviously is being stored in some meaningful way, or else copyright is irreparably broken, as you could scrub any artist's rights just by training an AI on their works then asking it to make new ones.

How is it different from you looking at those artists then doing your best to copy their work?
posted by Sebmojo at 5:13 PM on January 2 [3 favorites]


My other quibble is that we have spent 30 years patiently explaining to everyone how copyright doesn't actually matter, that nothing is lost when a copy is made, that information wants to be free etc. The web is built on endless, frictionless copyright infringement. It's a legally actionable breach of copyright every time you paste a funny meme into your socials. Compared to that, this is not close to a copyright issue imo, for all it's kind of fucked for working artists.
posted by Sebmojo at 5:19 PM on January 2 [3 favorites]


I don't even particularly think Disney would be down on generative AI even if it's trained on their stuff. The question I would think Disney would be asking itself would be: "So what if they used our art and style? If that makes it good at generating images that look the way that we in particular want them to look, and can be used as part of a production system where we don't need to hire as many artists or pay them as much? Hello, we would like to invest in your project!"
posted by notoriety public at 5:24 PM on January 2 [1 favorite]


How is it different from you looking at those artists then doing your best to copy their work?

You are a person and "generative AI" isn't.

BUT WHAT ABOUT STRONG AI DOESNT THAT RAISE QUESTIONS ABOUT THE NATURE OF

The day we have strong AI is the day we can consider it. That day is currently far, far off. Generative AI ain't it.
posted by star gentle uterus at 5:29 PM on January 2 [12 favorites]


With the way technology gets, AI art has probably already peaked. Like iTunes started out basic, got better and better, and then Apple started taking features away and making it worse until they replaced it with something entirely suckier. Or the way Google's search engine got better and better until it started getting suckier and now you might as well use Bing.
posted by rikschell at 5:32 PM on January 2 [1 favorite]


You are a person and "generative AI" isn't.

a paintbrush isn't a person, and nor is a camera. obviously they're different, but is the difference as material as it feels? you're still manipulating a tool, to produce a result.
posted by Sebmojo at 5:33 PM on January 2


Jack Kirby, Jim Lee, and John Byrne are there, but no Greg Land, so whoever is choosing who to steal from at least has a modicum of taste.
posted by signal at 5:33 PM on January 2 [3 favorites]


How is it different from you looking at those artists then doing your best to copy their work?

One difference is that when a computer looks at art, the word "looks" is a metaphor, and when a human looks at art, the word "looks" is not. The exact technical details will vary, but in the case of AI "looking" at art, what is likely going on is that it is being fed a perfect digital copy, and that copy is in all likelihood unambiguously the very thing copyright law concerns itself with.
posted by surlyben at 5:38 PM on January 2 [10 favorites]


Jack Kirby, Jim Lee, and John Byrne are there, but no Greg Land, so whoever is choosing who to steal from at least has a modicum of taste.

Also no Rob Liefeld, which is somewhat confusing, because that would at least partly explain any difficulties with generating images of feet.
posted by notoriety public at 5:39 PM on January 2 [18 favorites]


a paintbrush isn't a person, and nor is a camera. obviously they're different, but is the difference as material as it feels?

Yes.

you're still manipulating a tool, to produce a result.

You're moving the goalposts. Your first question was how is the system different than a human artist. Now it's "well, they're just tools manipulated by humans". Which is it? Paintbrushes and cameras don't by themselves generate anything, they are indeed tools manipulated by humans.
posted by star gentle uterus at 5:40 PM on January 2 [8 favorites]


if a system can reproduce a work in such a way that's nearly entirely accurate without human intervention, then regardless of the details it obviously is being stored in some meaningful way, or else copyright is irreparably broken

If 37,000 of the images a neural network is trained on (out of 5 billion) are tagged with "fish scale", then the network is going to form a rough idea of what a fish scale shape means, just in general, and that fish scales are usually rendered with very high specularity (shiny) in a highly repetitive and tight semi-staggered pattern. It will also infer that fish scales are usually associated with water in general / fish in particular, in a way that is distinct from other usages of the token "scale." None of those 37,000 images need to be in there - only the general shape of a fish scale as derived from all of them.

I know JHarris and most of the thread participants already know this, but: 3Blue1Brown's introduction to neural networks if you want an example of how a neural network is trained to recognize numerals in an image. It's just linear algebra but even a simple network can arrive at a set of weights that encompasses all of what it means to be 9-shaped. It may not have any notion of what 9 is or does or how it is used, let alone numerals, but it definitely understands what the shape of 9 generally means.
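
Something like this toy sketch (numpy only; the 5x5 bitmaps are made-up stand-ins for real training data) makes the point concrete: after training, all the model holds is a small grid of weights that responds to a nine-ish arrangement of pixels, not any stored copy of the images it saw.

```
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5x5 bitmaps: a crude "9" and a crude "1", standing in for training data.
nine = np.array([[1,1,1,1,0],
                 [1,0,0,1,0],
                 [1,1,1,1,0],
                 [0,0,0,1,0],
                 [0,0,0,1,0]], float).ravel()
one  = np.array([[0,0,1,0,0],
                 [0,0,1,0,0],
                 [0,0,1,0,0],
                 [0,0,1,0,0],
                 [0,0,1,0,0]], float).ravel()

X = np.stack([nine, one])
y = np.array([1.0, 0.0])                 # 1 = "looks like a 9"

w, b = np.zeros(25), 0.0
for _ in range(500):                     # plain logistic-regression training loop
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

# A noisy, never-before-seen "9" still scores near 1: the weights generalize the shape.
noisy_nine = np.clip(nine + rng.normal(0, 0.2, 25), 0, 1)
print(1 / (1 + np.exp(-(noisy_nine @ w + b))))
print(w.reshape(5, 5).round(2))          # what the model stores: weights, not an image
```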

It's really obvious to us that this is ethically mega wrong, but getting a judge to agree that this amounts to a crime is another question

Creative work has put food on my table for the last 20 years and this is not obviously wrong to me, at all. What is obvious to me is that it radically alters the financial landscape for visual artists in a way that is going to have an absolutely horrible impact, and that it is bad for artists to starve. The problem is the impact, not the intent.
posted by Ryvar at 5:40 PM on January 2 [1 favorite]


Photocopiers are 'just tools', but that doesn't make it ethical or legal to photocopy books.
If you make a tool that copies other people's work so you can profit off of it, saying it's 'just a tool' does not make your actions ethical or legal.
This is so obvious that people making this dumb argument about AI must be disingenuous.
posted by signal at 5:44 PM on January 2 [13 favorites]


Brief caveat that my "it is bad for artists to starve" is only 99.999% true:
Scott Adams
[sotto voce] Suffer.
posted by Ryvar at 5:49 PM on January 2 [15 favorites]


How is it different from you looking at those artists then doing your best to copy their work?

You all know that if I stare at a picture of Kirby and then draw a Kirby, I can't legally sell it on a t-shirt, even if I drew it myself, right? That's the whole point of copyright.

To be clear, I think it's spurious to try to treat this 'AI' routine like a person. But even if we do, this argument falls flat.
posted by SaltySalticid at 5:55 PM on January 2 [11 favorites]


Walt Disney
INCOMING!
posted by The Tensor at 6:00 PM on January 2 [1 favorite]


if I stare at a picture of Kirby and then draw a Kirby, I can't legally sell it on a t-shirt, even if I drew it myself, right?

Correct. But you could sell a new composition drawn “in the style of Jack Kirby,” which, in theory, is how this artist-name-as-a-prompt is meant to be used. In practice: usage is an entire spectrum from “in the style of” to “straight ripping off.” Also, recognizable artist names are such powerful triggers in a prompt that in the rare cases where I use one I usually have to manually reduce their weight to 0.7 or lower… which seems indicative of a broader training problem. (Edited to add: I do not use these tools commercially, ever)
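
(For reference, in AUTOMATIC1111-style front-ends that down-weighting is just prompt syntax, roughly as below; the exact syntax varies by tool and the prompt itself is only a made-up example.)

```
# hypothetical example of reducing an artist token's weight to 0.7
prompt = "new superhero team splash page, (jack kirby:0.7), rich colors"
```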
posted by Ryvar at 6:03 PM on January 2


Fortunately I also expect the artistic limits of these tools to become more and more obvious as people find that they are not replacements for creation, and indeed don't really "create" at all.

I'm not sure those limitations will last. But even then there is weird stuff happening. Someone on twitter posted a thread of bird photos generated by Adobe's AI plagiarist and they were pretty good and will probably only keep getting better. But a big part of Adobe's business is tools for professional photographers. By eliminating photographers ability to earn money for their photos they are going to bankrupt a profession and thus hurt their own bottom line.

Strange times.
posted by srboisvert at 6:06 PM on January 2 [1 favorite]


Who made the decision to sort this by author's first name? Does that seem inconvenient to anyone else, or am I being a grouchy old fart?

Or both?
posted by Saxon Kane at 6:11 PM on January 2 [4 favorites]


Ryvar's right: original works that are derivative of other people's are often OK. These systems go far beyond that though, right? To be clear, I'm pretty sure I can't sell a close copy of a given work, whether I make it by hand or with a copy machine.

And I can't sell an image of Kirby (sorry for any confusion there), even if the composition is all my own. I suppose that's likeness rights in addition to copyright, but that's just an additional layer of things I can't legally do, independent of whether it's free-hand or tool-assisted.
posted by SaltySalticid at 6:24 PM on January 2


lol screw AI and everything about it. what a joke.
posted by AlbertCalavicci at 6:27 PM on January 2 [1 favorite]


The problem is the impact, not the intent.

As the computer has no agency, intent cannot be predicated on it. People have agency and thus intent. As the people behind this project are not artists, I would not predicate that their intent is to create art. No, their intent is purely to make money. And that is the fucking problem as they are making money off the labor and property of other people without any recompense. The magic glow of technology is a falsehood. It’s not magic. In this case it is theft.
posted by njohnson23 at 6:30 PM on January 2 [6 favorites]


How is it different from you looking at those artists then doing your best to copy their work?

As others have said, an AI (which, again, isn't a great word for these things, so I'm going to start employing the scare quotes from here) isn't a person. Speaking entirely practically, "AIs," being computer programs, can be endlessly spun up. If an "AI" looks like a person subjectively, should it be given a vote? Okay then, but then what if a company with billions of dollars of resources can create them at a whim? What if such a company went about laundering the copyright of everyone in the whole damn world?

These AIs aren't people, and what's more they don't even look vaguely like people. They don't have opinions, they don't have inner lives. All they do is make things that look like other things! People deserve to be thought of as the creator of things, because no matter how much of our minds are composed of fragments of other sources (and that is a great deal, close to 100%), we can still add something out of ourselves to it, because we're also a lot more than a content generation algorithm. "AIs" are not.

Every time one of these threads about "AIs" comes up this question arises, but it always takes the view that something spooooooky is going on here when there really isn't, these kinds of "AIs" are not magical. This is the reason why I use the scare quotes, because calling them "AIs" is ultimately marketing, obscuring the fact that these are purpose-built machines.
posted by JHarris at 6:35 PM on January 2 [6 favorites]


I am deeply worried about letting rights holders have their every wish enforced, not because I care much about AI progress, but because transformative works have always been, intentionally, fair use. Copyright was supposed to be a balance, but the more monopoly power we cede to rights holders, the less room there is for public culture.
posted by Popular Ethics at 6:47 PM on January 2 [2 favorites]


These systems go far beyond that though right? To be clear, I'm pretty sure I can't sell a close copy of a given work, whether I make it by hand or with a copy machine.

These systems go far beyond that in the sense that Photoshop goes far beyond that: it's entirely based on what you prompt the AI with. If I wanted an image I knew for absolute certain could not possibly be a clone of any existing comics image then I could prompt Stable Diffusion with "four black women superheroes in the style of jack kirby" and be relatively certain there is no prior art to draw from. And I'd get this.

Full prompt:
Prompt: four black women superheroes in the style of jack kirby, (jack kirby), rich colors
Negative prompt: (worst quality, low quality, normal quality:1.7), (text:1.5), (error, cropped, blurry, signature, watermark, username:1.3), monochrome, extra limbs, fused fingers, too many fingers
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 3621751367, Size: 768x512, Model hash: c249d7853b, Model: dreamshaper_6BakedVae, Clip skip: 2, ENSD: 31337
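
For anyone curious how settings like these map onto the open source toolchain, here's a rough sketch using the Hugging Face diffusers library. The checkpoint below is a stock stand-in rather than the DreamShaper model named above, clip_skip needs a reasonably recent diffusers release, and the parenthesized "(term:1.7)" weights are an AUTOMATIC1111 convention that diffusers does not parse on its own - they're passed through as plain text here purely for illustration.

```
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",          # stand-in checkpoint, not dreamshaper_6BakedVae
    torch_dtype=torch.float16,
).to("cuda")

# roughly the "DPM++ 2M Karras" sampler named above
pipe.scheduler = DPMSolverMultistepScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)

image = pipe(
    prompt="four black women superheroes in the style of jack kirby, (jack kirby), rich colors",
    negative_prompt=(
        "(worst quality, low quality, normal quality:1.7), (text:1.5), "
        "(error, cropped, blurry, signature, watermark, username:1.3), "
        "monochrome, extra limbs, fused fingers, too many fingers"
    ),
    num_inference_steps=20,
    guidance_scale=7,
    width=768, height=512,
    generator=torch.Generator("cuda").manual_seed(3621751367),
    clip_skip=2,                               # supported in recent diffusers versions
).images[0]
image.save("kirby_style_team.png")
```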

posted by Ryvar at 6:54 PM on January 2 [1 favorite]


And if you want proof of what I'm talking about, there's a known phenomenon with these kinds of "AIs," about what happens when you train them on the work of other "AIs." It turns to crap. If "AIs" are fed other "AI"-generated output, then they become unmoored from reality and become more and more abstract, undergoing what is being called "model collapse."
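
A toy illustration of what that collapse looks like, with a one-dimensional Gaussian standing in for a real generative model (numpy only; the numbers are made up for the demonstration):

```
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=20)          # generation 0: "real" human-made data

for gen in range(1, 201):
    mu, sigma = data.mean(), data.std()       # "train" a model on whatever data we have
    data = rng.normal(mu, sigma, size=20)     # the next generation sees only model output
    if gen % 40 == 0:
        print(f"generation {gen:3d}: std = {sigma:.4f}")

# The spread drifts toward zero over the generations: each model trained on the
# previous one's output loses diversity, a miniature version of "model collapse."
```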

"AIs" can't produce anything meaningful without being trained ultimately on human work. It is only right that those humans have some say over whether their work can be fed into this infernal machine, and if it is, that they be credited and profit from it.
posted by JHarris at 6:56 PM on January 2 [8 favorites]


Photocopiers are 'just tools', but that doesn't make it ethical or legal to photocopy books.

How about photocopying a collage of copyrighted works as the cover of your 'zine? I have the feeling most of us would have approved of that....

Anyway, Midjourney is: 1) Obviously transformative and 2) Pretty cool. Has anyone played with Midjourney v6 yet? It's been out for a couple of weeks and it's impressive. I've been really enjoying the subreddit.

I think corporations imposing limits on how computers can view their work leans way too close to corporations imposing limits on how humans can view their work. Once I've paid for a piece of art and hold it in my hands (or on my ssd) I should be able to view it or allow my computer to view it however I like. I feel like we've been arguing for years in favor of borderless internet copyright rules--If you can see it in the US, why shouldn't you be able to see it in Canada? I see this as the same principle.

...no matter how much of our minds are composed of fragments of other sources (and that is a great deal, close to 100%), we can still add something out of ourselves to it...

I think this is pretty dubious. We're ultimately material.
posted by mr_roboto at 7:05 PM on January 2 [2 favorites]


>a paintbrush isn't a person, and nor is a camera. obviously they're different, but is the difference as material as it feels? you're still manipulating a tool, to produce a result.

What AI does is overwhelmingly dependent on random generation. You wouldn't roll dice and then say you "created" the resulting numbers by "manipulating a tool". When you use a paintbrush you have an extremely high level of control over what you're making at the largest and smallest levels. With AI you have almost none. You get what you get; it's not remotely the same thing as making something yourself. You can ask it for exactly the same thing, in exactly the same words, over and over and get dozens or thousands or millions of different results, and the differences have nothing to do with your "manipulation of a tool".
posted by Sing Or Swim at 7:14 PM on January 2 [2 favorites]


You wouldn't roll dice and then say you "created" the resulting numbers by "manipulating a tool".

I thought we had buried this argument 80 years ago.
posted by mr_roboto at 7:18 PM on January 2 [3 favorites]


If "AIs" are fed other "AI"-generated output, then they become unmoored from reality and become more and more abstract, undergoing what is being called "model collapse."

there's lots of fun stuff to unpack with model collapse. like, presumably there is some group hard at work trying to figure out an architecture that doesn't suffer from that problem. i strongly suspect the way to do it is to allow the model to select its own inputs rather than being force-fed whatever data was scraped to train it, i.e., endowing it with drives and urges that don't fit neatly in the generative AI framework. it would be at least a little hilarious if the thing that allowed for truly general AI was also the thing the prevented it from being monetized in any meaningful fashion.

i also wonder to what extent humans are susceptible to some form of model collapse. given how much of the content people consume is promoted by The Algorithm, and The Algorithm itself is likely informed by human data, it seems plausible that society, or at least significant portions thereof, are headed for a similar collapse.

the extent to which this has already happened is left as an exercise for the reader
posted by logicpunk at 7:24 PM on January 2 [3 favorites]


I’m sad my name isn’t on the list.
posted by chronkite at 7:48 PM on January 2


BOB ROSS!?
posted by clavdivs at 7:48 PM on January 2


I think this is pretty dubious. We're ultimately material.

Yeah, that's probably right. But we're also the material that makes up the world, that makes up civilization, that has to survive, and we're the material that makes the rules. We're the material that draws the lines.

The rules about who creates things, they're not edicts from some higher authority. We created them. We decided that authorship means something, that it imputes certain rights to the creator, because they're a human being. Giving those same rights to an "AI" is basically just giving them to the entity that owns the "AI," and the courts have already decided that "AI"s can't have copyright. We might be material, but we're material that has come up with rules around the production of creative work, and there's no reason that we have to grant "AIs," which were put together for this purpose, aren't even alive, and don't have to support themselves, rights that supersede ours, especially when they need to be fed gigantic amounts of human-produced art to function.

We might be material, but we're a lot better material, for this purpose, than a damn "AI." We don't need to cede this ground to the corporations who will fill our world with shit.
posted by JHarris at 8:15 PM on January 2 [7 favorites]


Sergio Aragonés isn't in the list, which explains why Midjourney can't generate little doodles in the margins.
posted by Catblack at 8:22 PM on January 2 [4 favorites]


"Transformative" is part of a fair use defense when you have been accused of unauthorized copying. To use it is kinda to acknowledge that there was unauthorized copying, but also, it is not an automatic win, and it must be considered along with other fair use principles. Financial impact is one of those principles. If the transformed derivative work competes with original work and causes financial harm, it might well be held to be a copyright violation.

Something like a zine with a bunch of different works on the cover probably isn't competing with the original work, and it might be offering commentary on the work, so it could possibly be fair use even if the original isn't transformed at all.

That said, I suspect the typical output of something like Midjourney likely *does* count as sufficiently transformative that the copyright liability is nonexistent. But the output isn't the only thing to consider. The training data may be a violation, if the company was just like, "oh yeah, we made a bunch of unauthorized copies and put them on our servers, and then used them however we wanted for months and years." The model itself may be a copyright violation, particularly if it can output copyrighted material (as in the New York Times case). It's generally access to the model that AI companies are selling, after all, not the (uncopyrightable) output of the model.
posted by surlyben at 8:31 PM on January 2


Anyway, Midjourney is: 1) Obviously transformative and 2) Pretty cool.

Just to be clear, in copyright law, "transformative" is an adjective applied to works, not tools.

Also, as surlyben pointed out, whether or not a work is transformative is not necessarily obvious. The court will apply a legal analysis that considers a number of different factors.
posted by Artifice_Eternity at 8:37 PM on January 2 [1 favorite]


There are more Freds than I would have expected.
posted by skyscraper at 9:44 PM on January 2 [1 favorite]


Linked from the "garners criticism" article: This new data poisoning tool lets artists fight back against generative AI

Seems like a fairly simple and equitable proposition. Don't want to fuck up your profitable data model? Don't scrape my art.
posted by flabdablet at 10:34 PM on January 2 [1 favorite]


It's propaganda to say that AIs do just what humans do. We have a whole vocabulary for things humans can do with other people's art that's illegal, unethical, questionable, or just tacky. And of course there are other things they can do that are parody, commentary, pastiche, or tribute. Getting courts to work all this out is tricky; getting developers to even understand the issue is trickier.

When some rando makes a cool video, and then a megacorporation rips off that video for a commercial... I don't know if it's illegal, but it sure is unethical. This seems close to what Generative AI does.

If you want to train your AI on dead artists, go crazy. Just make sure it knows that it can draw Mickey Mouse as Steamboat Willie today, but it can't put white gloves on him till next year.
posted by zompist at 10:42 PM on January 2 [8 favorites]


Nor can it depict him at the helm of an actual steamboat without running foul of trademark law.

Steamboat Willie version of Mickey Mouse enters public domain — but there is a catch
Disney said it will "work to safeguard against consumer confusion caused by unauthorised uses of Mickey and our other iconic characters."

The company has even added a clip from Steamboat Willie to the opening sequence of every Walt Disney Animation Studios film.

"They were very smart folks at Disney — they realised that the best thing to do was to establish that iconic sequence of Steamboat Willie as a trademark," Mr Hughes said.

Anybody using the classic image of Mickey at the helm of the boat on shirts, caps or mugs could be open to legal action, he said.

Anyone hoping to cash in on Disney's beloved mascot "should move cautiously and with counsel," added Mr Hughes.
posted by flabdablet at 10:53 PM on January 2


Can’t sleep. Thoughts:

The training data may be a violation, if the company was just like, "oh yeah, we made a bunch of unauthorized copies and put them on our servers, and then used them however we wanted for months and years." The model itself may be a copyright violation, particularly if it can output copyrighted material (as in the New York Times case).

Yeah this from the other thread was a great “disagree, but…” The thing I’m not sure about is where the courts might land on whether the spaghettified version of the inarguably copyrighted material used for training remains under copyright after. In theory you could feed all of those copyrighted images into the training process direct from the original host webserver just like a human browsing the images would view them, rather than cache a copy in the filesystem of an onsite harddrive. Nobody actually directly downloads images that I’m aware of because the performance bottleneck would be insane, but in principle they could.
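
For concreteness, the hypothetical never-touches-disk version would look something like this (requests and Pillow; the manifest and function name are made up for illustration):

```
import io
import numpy as np
import requests
from PIL import Image

def fetch_training_example(url: str) -> np.ndarray:
    # Pull one image straight from its host into memory, never onto disk.
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    img = Image.open(io.BytesIO(resp.content)).convert("RGB")
    return np.asarray(img, dtype=np.float32) / 255.0

# for url in training_manifest:        # hypothetical list of image URLs
#     example = fetch_training_example(url)
#     ...one training step...          # the pixels exist only transiently in RAM
```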

Further complicating this is that in some cases the spaghettification is incomplete or outright skipped leading to straight copyright violation in the output. Some of that can be written off as “oh we only had one image for that prompt term or only 37 variations of the exact same image at slightly different resolutions,” and presumably be fixed by just supplying a greater diversity of images for the offending term, but occasionally it happens on terms that damn well better have at least hundreds of affiliated images; and that level of training error/negligence is a serious problem both in terms of copyright and in terms of basic correct function.

Point being: there’s a lot of room for argument that copyright isn’t truly being violated beyond what a human user does, but every one of those arguments comes with a bunch of pretty ugly caveats.

getting developers to even understand the issue is trickier

The development is not being done with any amount of rigor: the open source toolchain is a security nightmare the likes of which I have never seen before. The megacorp implementations are likely worse under the hood.

I think a lot of “move fast and break things” came over with the crypto bros who saw AI exploding in capability and suddenly had a new use for all those now-useless GPUs. Hence the robber-baron approach to securing training materials. The community is problematic, porn obsession aside (underage shit has been shockingly and mercifully absent from what I have seen). I think they understand just fine, and a very large minority simply doesn’t give a fuck. See also: negligence in training. That’s not an argument against any of this in principle, just a terrible community that could stand to be hit with some legal consequences for excessive expedience.

I don't know if it's illegal, but it sure is unethical. This seems close to what Generative AI does.

Agreed, but it's a spectrum. Human etiquette around mashup culture / homages / straight ripoffs falls apart when people take a liberal usage approach that borders on (or even qualifies as) bad faith, and industrial-scale output on top of that is legit a problem. Impact, not intent.

You get what you get; it's not remotely the same thing as making something yourself.

Again: spectrum. Prompt engineering + Img2Img batches anchored on a source image + inpainting masks for “generate only here / generate everywhere not here” can turn into a huge amount of genuine work if you have a very specific concept, composition, and mixture of style in mind. Nearly as much work as artists doing the same thing with fine arts skills.

My personal use is mostly very specific images of kaiju-sized flying jellyfish consuming chunks of skyscrapers above burning cities. Dozens of hours, thousands of renders, still not quite right. It can be a genuine artform of sorts, but I strongly suspect that is not the vast majority, and it will only get worse as companies use it to flood the zone with bullshit.
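
For anyone who hasn't seen that workflow, one inpainting pass in the open source toolchain looks roughly like this (Hugging Face diffusers; the filenames and prompt are placeholders riffing on the jellyfish project above, and the mask is a plain black/white image where white marks the region to regenerate):

```
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# an earlier render plus a hand-painted mask of the region to redo
init_image = Image.open("jellyfish_city_v12.png").convert("RGB").resize((768, 512))
mask_image = Image.open("sky_chunk_mask.png").convert("RGB").resize((768, 512))

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",    # stock inpainting checkpoint
    torch_dtype=torch.float16,
).to("cuda")

result = pipe(
    prompt="kaiju-sized flying jellyfish lifting chunks of skyscrapers above a burning city",
    image=init_image,
    mask_image=mask_image,
    width=768, height=512,
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
result.save("jellyfish_city_v13.png")          # then inspect, adjust the mask, repeat
```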
posted by Ryvar at 1:41 AM on January 3


Once I've paid for a piece of art and hold it in my hands (or on my ssd) I should be able to view it or allow my computer to view it however I like.

I think this is a huge part of the argument against these training systems. In these cases, no one paid for the right to use these images in this way. They didn't even bother asking for permission in almost every case.

If they paid for usage, there wouldn't have been an issue. If they asked to compensate someone for the use of their work, this never would've been an issue.

This isn't as simple as someone being inspired by the work of another; this is a systemic theft of others' work. My work was likely used without my permission when they downloaded images from some major sites to train their system on. I was never given an option to opt out. In most cases these sites weren't even given a chance to let users know this was happening. I also earned nothing from the usage of my works.

The entire model is predicated on not paying or asking for permission to use copyrighted works. In your example you clearly paid to have that art, but these systems haven't. It's not even about the output, it's that they clearly and openly admit to using copyrighted materials without permission. They know it, they say it, their only defense is trying to get otherwise reasonable people to ignore it. Another step on the path of diminishing the value of creative works.
posted by JakeEXTREME at 2:37 AM on January 3 [5 favorites]


Prompt engineering + Img2Img batches anchored on a source image + inpainting masks for “generate only here / generate everywhere not here” can turn into a huge amount of genuine work if you have a very specific concept, composition, and mixture of style in mind. Nearly as much work as artists doing the same thing with fine arts skills.

Yeah, no, not even close.
posted by signal at 5:05 AM on January 3 [12 favorites]


shut up spending an evening refining a prompt is exactly the same as toiling for years to master the techniques and materials that allows one to execute a competent oil painting
posted by logicpunk at 6:42 AM on January 3 [10 favorites]


Which comes back to one of the oldest points with these things, which is that telling something what you want and taking what it gives you isn't called being an artist, it's called being a customer.
posted by Pope Guilty at 7:27 AM on January 3 [10 favorites]


In a decade or so these tools will be very useful to the biodiversity denialists I firmly expect to start appearing soon:
"Elephants never existed! They were just faked in midjourney by some woke teenager!"
posted by thatwhichfalls at 7:49 AM on January 3 [2 favorites]


Gentle reminder to people replying to me that what I said was very much NOT that they require similar amounts of skill or creativity, and that I currently work with literally dozens of artists who spent the past few weeks working out how to incorporate both Photoshop generative fill (identical to inpainting) and Midjourney (as a better targeted Pinterest reference search replacement) into their workflows. Logicpunk, I await with eager anticipation your first reply to me that is not burning a straw man for internet points.
posted by Ryvar at 7:56 AM on January 3 [1 favorite]


I think of AI prompts as essentially ordering food off a menu at a restaurant, and you pay that restaurant - except the food was made by other workers at a different restaurant and yoinked. Those workers who actually made the food get jack shit. And those workers are just told - oh well - too bad for you!
posted by Gyre,Gimble,Wabe, Esq. at 8:25 AM on January 3 [2 favorites]


mr_roboto….

Cage used the I Ching, rarely dice, in order to remove the idea of a creator from the process. Though he chose what musical elements corresponded to the I Ching results, the actual output was random, hence he didn’t make it. It just happened.
posted by njohnson23 at 8:38 AM on January 3


I think of AI prompts as essentially ordering food off a menu at a restaurant,

If you’re just tweaking prompts that is definitely the case, while the department collab I watched unfold was a mixture of all the available techniques (and I need to clarify: it was a group investigation as to whether any of this could be used to speed up their work, not a commitment to using them for their professional output going forward). Worth noting that every single one of these artists has their work in Stable Diffusion’s LAION-5B training set, without their consent.

The process of inpainting a mask in Photoshop with a Wacom tablet and stylus is very similar to quickly sketching concepts using a charcoal texture layer in Photoshop with a Wacom tablet. The best artist I have ever worked with can punch out jaw-dropping concept pieces with the latter technique every 45 minutes like clockwork. They’re like watching a timelapse video play out in person. Inpainting masks doesn’t shave much time off that, if any, and I did not see much enthusiasm for it on the whole.

The response to the new Midjourney was far more enthusiastic, it gave more junior artists a feeling like they were getting to play art director a bit during the reference search phase. It boils down to a faster, better Pinterest reference scroll before banging out the real thing exactly as before, but that’s not nothing.

I think the value of inpainting is mostly for people like me with an extremely specific vision and zero innate fine arts talent - there is no amount of time spent developing those skills that will bridge the gap between me and most of the artists I work with when they were 8 years old. But that value is built on the backs of countless artists without compensation. And I don’t have a good answer as to what to do about something that so many people want or are going to want, and is freely available to run on your personal PC, and is built without compensating those artists. The horse has long since fled the barn, those tools aren’t going away and the existing trained models are not going to disappear - Midjourney is a service and could be shut down, but outright passing a law that bans Stable Diffusion would play out exactly the same as passing a law banning Photoshop or game piracy.
posted by Ryvar at 9:13 AM on January 3


I think the notion that an algorithm can't be creative is incorrect. Evolution created the lemur. It's an algorithm that mutates configurations and then filters them. Perlin noise, constrained to Markov-chained chord changes, can write a melodic line that will prompt a passerby to stop and say, "I like that music." Stochastic processes can be a lot less "random" than you might think. Was something new created that has value? I say yes. Language models and generative networks are a lot more sophisticated than something I threw together. Creativity can happen outside a human brain.
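
A toy version of that constrained-randomness point (the chord table is hypothetical, and smoothed random noise stands in for true Perlin noise):

```
import numpy as np

rng = np.random.default_rng(7)

# MIDI note numbers for a few chords, plus a tiny Markov table of allowed changes
chords = {"C": [60, 64, 67], "F": [65, 69, 72], "G": [67, 71, 74], "Am": [57, 60, 64]}
transitions = {"C": ["F", "Am", "G"], "F": ["G", "C"], "G": ["C", "Am"], "Am": ["F", "G"]}

# crude 1-D "Perlin-ish" curve: random values smoothed with a moving average
noise = np.convolve(rng.random(64), np.ones(8) / 8, mode="same")

melody, chord = [], "C"
for beat in range(32):
    if beat % 4 == 0 and beat > 0:
        chord = str(rng.choice(transitions[chord]))     # Markov-chained chord change
    tones = chords[chord]
    melody.append(tones[int(noise[beat] * len(tones)) % len(tones)])  # noise picks a chord tone

print(melody)   # a key-constrained melodic line with no human composer in the loop
```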

Secondly, I think the notion that nothing "magical" is happening inside these things is also incorrect. I don't mean actual wizard magic. I'm referring to emergent behavior. It's been discovered that language models can not only write computer programs, but they can execute them as well. They were not designed to do this. This behavior somehow emerged from the complicated mathematics within the system. Similarly (sort of), text-to-image generators have been found to build 3D representations internally of scenes they produce. They'll give you real stereoscopic images if you ask for them. This ability was not programmed into the system. There is no way to predict what kind of emergent behavior will be discovered in these things.
posted by The Half Language Plant at 9:41 AM on January 3 [2 favorites]


Appropriately, M.C. Escher is on the list twice.

It's got both M.C. Escher and M. C. Escher.
posted by straight at 1:28 PM on January 3


The lemur just happened. Unless, of course, you believe in creationism. Evolution is not teleological. Please provide some examples of creativity outside of any living brain that do not involve anthropomorphism or include other living brains in the mix.
posted by njohnson23 at 1:31 PM on January 3 [1 favorite]


Evolution is not teleological.

If it were, you'd think the designer would have optimized past bogosort.
posted by flabdablet at 10:24 PM on January 3 [1 favorite]


What is this nonsense. It's a new level of Art to collaborate with a machine: meta-Art.

I don't believe in copyright really. Copying an idea doesn't deprive anyone else who has it. At the Substrate level, All Is Number, anyway. No human made the Numbers, and depicting Art as Number doesn't mean you own that Number and can prevent others from using it.

Data Wants To Be Free.

Intellectual property concepts and laws have held back human progress and wasted so much effort.

Restraints on sharing, spreading, copying, and remixing ideas will all eventually fall in the end.

Artists should not be deprived of what they need to live because ALL of society's members should be provided for, feeding and housing everyone as a baseline, plus a fair share of goodies and luxuries. Let's solve for *that*, and then *no one* starves.

Honestly this has echoes of the complaints of buggy-whip manufacturers at the dawn of the Automobile Age. Time and technology march on.
posted by cats are weird at 1:17 AM on January 4 [1 favorite]


Honestly this has echoes of the complaints of buggy-whip manufacturers at the dawn of the Automobile Age.

I keep seeing this comparison of a transitional technology that was never especially important to something that has been central to the experience of being human for tens of millennia, and it just fills me with despair.
How much of what we are are we willing to give up for some slight convenience?
posted by thatwhichfalls at 8:56 AM on January 4 [1 favorite]


How much of what we are are we willing to give up for some slight convenience?

Automobiles offer only a slight improvement over beasts of burden to society?

There was an Ask about that back in July. What are the pros and cons of reintroducing animal powered transport?
posted by achrise at 9:27 AM on January 4


the complaints of buggy-whip manufacturers at the dawn of the Automobile Age.


Do you have a source for that? I can't find any actual complaints from buggy-whip manufacturers anywhere, except as a just-so story used to ridicule people raising concerns about the intersection of technological advances and Capital.
posted by signal at 9:33 AM on January 4 [3 favorites]


Here is a report on the quantity of horse manure produced daily in New York City back in the days of horse drawn carts, etc.
posted by njohnson23 at 9:35 AM on January 4


Let's solve for *that*, and then *no one* starves.

How long do you anticipate solving this will take? And should artists be allowed to starve in the meantime?
posted by mittens at 9:37 AM on January 4 [1 favorite]


Automobiles offer only a slight improvement over beasts of burden to society?

This is not what I wrote or even close to it.
posted by thatwhichfalls at 9:37 AM on January 4


Not believing in copyright is fine, but do you not believe people should be paid for their labor? Why shouldn't I be paid for the use of my artwork, which will help create an unknown number of future projects, no fewer than the total number of images derived over the entire life of these bits of computer software?

There's a bit of purposeful obfuscation here: even if someone tries to generate "in the style of Romita," they will still have used the total set of works trained on to generate the mathematical output that becomes the image. So the entire business model is to repeatedly use the work of others that they never compensated them for. I've said in other circles, I'm not afraid of "AI" output, because it lacks creativity and originality, but I don't like that they stole my work to create it. They never even asked me; I've had no choice in the matter.

So again, shouldn’t I be paid for my labor?
posted by JakeEXTREME at 12:50 PM on January 5 [5 favorites]


Whip Socket production in the early 20th century.

"He had this to say in volume 25 of The Spokesman and Harness World magazine in 1909, “the present year’s business has been the largest in the history of the concern. For the past five months, the plant has been in operation night and day.” In 1910, the company was producing over a million sockets a year. Naturally, as the automobile industry exploded, the whip socket factory was shut down; its product rendered obsolete."
posted by clavdivs at 5:24 PM on January 5




This thread has been archived and is closed to new comments