alexiuss t1_j0sex86 wrote

>People scraping copyrighted works on the internet.

Stable Diffusion AI doesn't scrape anything at generation time. It was trained on billions of tagged images, and it produces each new image by running a denoising diffusion process (a Markov chain) over a 64x64 latent, guided by text-embedding vectors. It's nothing at all like scraping, and every image it produces is new and not like anything else, because generation starts from pure noise.
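
Roughly, generating with it looks like this (a minimal sketch using the Hugging Face diffusers library; the checkpoint name and settings are just illustrative):

```python
# Minimal sketch of text-to-image generation with Stable Diffusion via the
# Hugging Face `diffusers` library. Checkpoint name and settings are illustrative.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed SD 1.x checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Generation starts from a random 64x64 latent (pure noise) and is iteratively
# denoised, guided only by the prompt's text embedding; no training image is
# looked up or copied at this stage.
image = pipe(
    "a lighthouse on a cliff at sunset, oil painting",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
image.save("lighthouse.png")
```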

A GPT3 AI must study more books to become an amazing personal assistant. Anyone not letting it read everything is evil. An open source GPT3 will change everything:

>Privacy.

Use a personal GPT3 AI to generate fake info noise around yourself. It's the absolute privacy bubble.
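
A rough sketch of that idea: a locally-run open model churning out plausible-but-fake profile snippets (the model choice and prompt here are purely illustrative):

```python
# Rough sketch of the "privacy noise" idea: a locally-run open model generates
# plausible-but-fake profile snippets. Model choice and prompt are illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

prompt = "Short biography of a fictional person named Alex:"
decoys = generator(prompt, max_new_tokens=80, num_return_sequences=3,
                   do_sample=True, temperature=0.9)

for d in decoys:
    print(d["generated_text"])  # each sample invents different fake "facts"
```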

>AI being used to abuse power, especially in that technology has historically resulted in increased social stratification, not less.

Open source AIs will give power to the people. They're the ultimate equalizer.

1

OldWorldRevival OP t1_j0sf9ip wrote

> Stable Diffusion AI doesn't scrape anything. It studies a billion images as visual ideas to produce one image using fractal mathematics. It's nothing at all like scraping and every image it produces is new and not like anything else because it starts as noise at its base.

False.

To train the AI, data was scraped. The AI itself does not scrape data, but lots of private data was scraped without consent.

1

alexiuss t1_j0sh83m wrote

> but lots of private data was scraped without consent.

Not scraped, but tagged by LAION so the AI can study it to understand various shapes, colors and concepts. The final model doesn't store the images themselves, doesn't reference them, has no idea what it even learned, and cannot replicate the originals. It just knows ideas based on the billions of things it looked at.
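
For what it's worth, a LAION record is just a caption plus an image URL and some metadata, not the image itself. A rough sketch of peeking at a few records (the dataset repo and column names are assumptions based on the public LAION metadata release on the Hugging Face hub):

```python
# Sketch of what a LAION record actually is: a caption plus an image URL and some
# metadata, not the image itself. The dataset repo and column names are assumptions
# based on the public LAION metadata release on the Hugging Face hub.
from datasets import load_dataset

rows = load_dataset("laion/laion2B-en", split="train", streaming=True)
for i, row in enumerate(rows):
    print(row["TEXT"], "->", row["URL"])  # caption and link only
    if i == 2:
        break
```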

That's beside the point, though: open source AI should be allowed to learn from everything so it can assist everyone for free. It's an incredible tool that costs nothing and equalizes everyone, the first big step towards the singularity.

0

OldWorldRevival OP t1_j0stv3m wrote

Ok communist.

So if anyone does work, that means you're entitled to benefit from their labor, no matter how little means they have, then.

What an asinine argument. You obviously have a callous heart and are willing to take hard work from people so that your fancy tool can be 2% better.

And the irony is that the backlash provoked by people like you being so mindless about it is what will actually slow it down.

So, you're an asshole and you're slowing progress. Congrats.

1

alexiuss t1_j0z785o wrote

you're the giant asshole for assuming things here, dawg.

  1. I grew up in the USSR. Communism is an ideology that doesn't work and ended up killing 100 million people, including both of my great-grandfathers.
  2. open source personal ais are the future of uplifting individuals because they are teachers that can teach anything and personal assistants that can help out with any mundane mental task

>You obviously have a callous heart and are willing to take hard work from people so that your fancy tool can be 2% better.

eh? Why are you assuming such ridiculous nonsense? I'm working on a personal AI that uses public domain images and my own drawings. personal ais aren't corporate tools.

what you clearly misunderstand, because you're not a python programmer, is that artists won't win against corpo ais since the final product doesn't contain the scraped data in it. there's no legal avenue of attack for them, and most corpo ai companies are assholes who won't share their private training datasets. nobody has any idea what's inside novelai or MJ. not a single person can sue them.

corpo ais cannot be stopped, halting them is akin to trying to stop a chainsaw with bare hands

the open source stable diffusion code release launched an AI revolution, so there's a new AI company born every day now in nearly every country with python programmers. The spread of AIs worldwide can't be stopped, can't be halted. It's moving faster than any laws are. The artists' protest against ais is useless - the AI models are sprouting like mushrooms. Anyone who knows enough python can make their own AI model nowadays.

changing the law is useless because an AI can be hosted elsewhere. what is US jurisdiction going to do about an AI hosted in free zones of Malta, Georgia or Russia? nothing.

the only way forward for artists is to accept, evolve and survive - to build and use their own ai tools that are superior to their corporate counterparts, because corporate ais are bound by too many imbecilic restrictions

a network of personal ais like stable horde is the most incredible thing for artists and humanity - they can run on any device, even a phone and share processing power. consider reading about Stable Horde, it's the future of personal AIs that will be truly unstoppable and benefit all users.
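
A rough sketch of what talking to the horde looks like from python (the endpoint paths, anonymous api key and response fields are assumptions based on the public docs at https://stablehorde.net/api, so double-check them before relying on this):

```python
# Rough sketch of requesting an image from Stable Horde's crowdsourced workers
# over its REST API. Endpoint paths, api key and response fields are assumptions
# based on the public docs; verify them against https://stablehorde.net/api.
import time
import requests

BASE = "https://stablehorde.net/api/v2"
HEADERS = {"apikey": "0000000000"}  # assumed anonymous-access key

# submit an async generation job; some volunteer GPU on the network picks it up
job = requests.post(f"{BASE}/generate/async", headers=HEADERS,
                    json={"prompt": "a watercolor fox in a misty forest"}).json()
job_id = job["id"]

# poll until the distributed workers report the job as done
while not requests.get(f"{BASE}/generate/check/{job_id}").json().get("done"):
    time.sleep(5)

result = requests.get(f"{BASE}/generate/status/{job_id}").json()
print(result["generations"][0]["img"])  # generated image payload, per the docs
```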

1

OldWorldRevival OP t1_j0su0q5 wrote

FYI, LAION was funded by these companies so they could protect themselves from lawsuits. It was shady Machiavellianism. Shady Machiavellian types are building AI and people support them.

We are so completely fucked.

1

alexiuss t1_j0uqfj2 wrote

no we are not.

we're in an amazing timeline because of the manifestation of the open source ai movement.

open source ais benefit everyone for FREE

we, the people building open source ais are winning step by step, corporations are the ones who are slowly getting fucked because of their imbecility

do you not understand that open source ais cost nothing and provide assistance in numerous fields for free, for everyone? They are a god-sent tech that's slowly uplifting everyone.

1

OldWorldRevival OP t1_j0uszvp wrote

Yea... I take a bit of issue with the idea that open source intrinsically makes something good. If I open sourced biotechnology that allows you to create a supervirus, that would just be absolutely terrible.

Mainly in that I find a lot of people don't actually appreciate others' hard work and passion, and then they take that work for granted as well.

People want things like communism and shared labor, but then they fail to actually stand up for other people's hard work and for them being justly rewarded for their contribution. And because of that failure, we have exploitative capitalism. Capitalism and communism are both philosophies that stem from selfishness and an unwillingness to stand up for goodness itself, for all.

In this case, the AI art "scrape everyone's data" proponents have ZERO appreciation for the hard work, dedication and sacrifice involved, and because they found a new copyright loophole tool, they're fine using artists' work against them.

It's simple naiveté at best, and it's rotten exploitation at worst.

1

alexiuss t1_j0uv2y1 wrote

>open sourced biotechnology that allows you to create a supervirus

nothing like that exists yet. every single open source AI model just dreams using diffusion math, nothing else. A dreaming AI is completely harmless - it creates visual and textual lucid dreams

>because they found a new copyright loophole tool

this is just the start

the corporate models collected data through LAION, yes, but VERY soon there will be open source models based on public domain material or on artists' own work used to teach the models, we're about 80% there (see the sketch at the end of this comment).

> they're fine using artists' work against them

No. Corporations are bending over right now. The corporate models are slowly transitioning to completely de-listing artists as they're being constantly harassed by the artists who hate AIs.

SD 2.0 has already started purging artist names from its new training dataset & keywords, so you can't type in "by Greg Rutkowski" and get a result in Greg's style anymore.
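
Circling back to the "artists' own work" point above, here's a minimal sketch of turning a folder of your own drawings into a captioned dataset in the imagefolder + metadata.jsonl layout that Hugging Face datasets (and the diffusers example training scripts) can load. The "text" caption column follows those scripts' convention and is an assumption here:

```python
# Minimal sketch: turn a folder of your own drawings into a captioned dataset in
# the `imagefolder` + metadata.jsonl layout that Hugging Face `datasets` (and the
# diffusers example training scripts) can load. The "text" caption column follows
# those scripts' convention and is an assumption here.
import json
from pathlib import Path

folder = Path("my_drawings")                     # your own PNG drawings
with open(folder / "metadata.jsonl", "w") as f:
    for img in sorted(folder.glob("*.png")):
        record = {"file_name": img.name,
                  "text": f"an ink drawing, {img.stem.replace('_', ' ')}"}
        f.write(json.dumps(record) + "\n")

# datasets.load_dataset("imagefolder", data_dir="my_drawings") then picks this up
```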

1

OldWorldRevival OP t1_j0v67f1 wrote

Artists don't necessarily hate AI... they rightly hate their work being exploited.

Getting the ethics of AI art ironed out means, first, protecting artists' work from being used in these tools, and second, making it known when a piece of art is AI-generated.

A key difference between AI and photography is that you know a photo is a photo and a painting is a painting. AI image generators are a totally new paradigm.

The fact that it obscures the nature of the image is problematic, and tools that identify AI art will become increasingly necessary to preserve the knowledge that something is authentic human work.
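
One small example of that kind of tool, as a hedged sketch: Stable Diffusion's reference scripts stamp outputs with an invisible watermark via the invisible-watermark package, and checking for it is one weak signal that an image is AI-generated. The payload string and bit length below are assumptions based on the reference code, and many forks disable the watermark entirely:

```python
# Hedged sketch: Stable Diffusion's reference scripts stamp outputs with an
# invisible watermark via the `invisible-watermark` package, so checking for it
# is one weak signal that an image is AI-generated. The assumed payload is
# "StableDiffusionV1" (17 bytes = 136 bits); many forks disable the watermark.
import cv2
from imwatermark import WatermarkDecoder

bgr = cv2.imread("suspect_image.png")        # OpenCV loads images as BGR arrays
decoder = WatermarkDecoder('bytes', 136)     # expect a 136-bit byte payload
payload = decoder.decode(bgr, 'dwtDct')      # same embedding method the scripts use

try:
    print("decoded watermark:", payload.decode("utf-8"))
except UnicodeDecodeError:
    print("no readable watermark found")
```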

1