Comments


varkarrus t1_j1mynsr wrote

>AI got startlingly good in 2022. It began with the AI image generators [...]

Someone hadn't used GPT-3…

37

TouchCommercial5022 t1_j1mq4cf wrote

Submission Statement:

"The long-term trend has been that new technologies tend to exacerbate precariousness. Large, profitable industries typically turn away new entrants until they incorporate emerging technologies into their existing workflows."

This article is a very interesting way to look at the generative AI revolution of 2022. As with previous IT revolutions such as social media, it is the profit interests of business that are likely to prevail in how this technology shapes our future.

Blade Runner dystopia confirmed, got it.

I can't imagine how much cost they're racking up. I mean, they're already monetizing GPT-3, so I guess it's pretty clear what they're going to do next. This is the "gain publicity and users" phase. Making money will come soon enough.

They'll earn a lot of money from user and company subscriptions to access ChatGPT and their other services. It's free right now, but it won't be for long.

This is also why they are trying to remove Stable Diffusion, to incorporate it into the next Adobe release or something.

I really appreciate and admire what Stable Diffusion did. A few weeks after DALL-E and Midjourney made the rounds with their paid private services, they simply went out and released their work openly: open source and free to run at home. They threw a whole new "industry" that was just beginning to capitalize under the bus. The fucking rich who invested in DALL-E must have been furious.

So now only the rich will benefit from AI, the poor will eat shit as usual.

And with AI replacing the poor, very soon the rich won't need the poor at all.

but I would pay for chatgpt;

I can afford GPT-3 right now: 50,000 tokens (about 37,500 words, input and output both counted) for $1. GPT-3 is almost as good in many ways as ChatGPT.

$50 will get you 2.5 million tokens, or roughly 1.9 million words. An average page contains 500 words, so say your average query is half a page, 250 words (about 330 tokens). That makes $50 roughly 7,500 individual queries.

So basically you can buy it at that price right now (in the form of GPT-3), except you don't pay monthly, you pay per token, so you could spread those queries over many months if you wanted.
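The back-of-envelope math above can be checked with a quick sketch. The $1-per-50,000-token rate and the 0.75-words-per-token ratio are the commenter's figures, not official pricing:

```python
# Sanity-check of the pricing arithmetic in the comment above.
# Assumed: $1 per 50,000 tokens, ~0.75 words per token (both from the comment).
PRICE_PER_TOKEN = 1.0 / 50_000         # dollars per token
WORDS_PER_TOKEN = 0.75

budget = 50.0                          # dollars
tokens = budget / PRICE_PER_TOKEN      # total tokens the budget buys
words = tokens * WORDS_PER_TOKEN       # rough word equivalent

words_per_query = 250                  # half a page, input + output combined
tokens_per_query = words_per_query / WORDS_PER_TOKEN
queries = tokens / tokens_per_query    # queries the budget covers

print(f"{tokens:,.0f} tokens, {words:,.0f} words, ~{queries:,.0f} queries")
# -> 2,500,000 tokens, 1,875,000 words, ~7,500 queries
```

At those assumed rates, $50 works out to roughly 7,500 half-page queries.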

I suspect that chatGPT would have similar prices.

What I really can't wait to see and use is GPT4.

The genie is out of the bottle. It's all open source, so unless they start banning personal computer ownership and co-op working, there's going to be weird AI for the masses for the foreseeable future.

The analogy with social networks is incorrect, because social networks require everyone to be on the same network. The best analogy is the app store. There will be big players and little players, but getting locked out will only happen in the most extreme cases, and those guys will continue to thrive in their own corners.

15

visarga t1_j1my2tt wrote

If you want chatGPT to incorporate information from outside sources, you have to paste search results into the context. This can easily run 4,000 tokens. You then pay for those same 4,000 tokens on every subsequent interaction, because the whole history gets resent. After about 10 replies you'd have paid $1.
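The cost growth described here is easy to sketch, again using the $0.02-per-1K-token rate quoted upthread (an assumption, not an official figure):

```python
# Sketch of context-resend cost: a ~4,000-token context (pasted search
# results plus chat history) is billed again on every turn.
PRICE_PER_1K = 0.02  # dollars per 1,000 tokens, assumed rate

def conversation_cost(replies: int, context_tokens: int = 4000) -> float:
    """Total cost when each reply resends the full context."""
    return replies * context_tokens / 1000 * PRICE_PER_1K

print(f"${conversation_cost(10):.2f} after 10 replies")  # -> $0.80 after 10 replies
```

Add the new question and answer text on each turn and you land right around the $1-per-10-replies figure.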

You would need to do this when you want to summarise, or ask questions based on a reference article, or just use chatGPT as your top level above search, like you.com/chat

It's not cheap enough to use in bulk, for example to validate Wikipedia references. You'd need to call the model millions of times.

12

blueSGL t1_j1n8084 wrote

They seem to be getting clever, especially around certain concepts. I doubt they have hard-coded training around [subject] such that the returned text is always [block text from OpenAI]; more likely they have trained it to return a [keyword token] when [subject] gets mentioned, and that token is what pulls in the [block text from OpenAI].

You can bet they are going to work hard, with every trick they can think of, to reduce inference cost. Having a lookup table for a lot of common things and getting the model to return a [keyword token] that activates an entry would be one way of going about it.

That's also likely how this sort of system would work in tech support. You don't need the system waxing lyrical over [step n]; you just need it to tell the customer to perform [step n], with maybe a little fluff at the start or the end to make things flow smoother.
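The keyword-token idea above can be sketched as a simple substitution pass. Everything here (token names, canned text) is hypothetical illustration, not OpenAI's actual mechanism:

```python
# Hypothetical sketch: the model emits a short keyword token and a lookup
# table supplies the canned block text, instead of generating it each time.
CANNED_RESPONSES = {
    "<POLICY_BLOCK>": "I can't help with that request.",
    "<STEP_RESET_ROUTER>": "Please unplug your router for 30 seconds.",
}

def expand(model_output: str) -> str:
    """Replace any known keyword token with its canned block text."""
    for token, block in CANNED_RESPONSES.items():
        model_output = model_output.replace(token, block)
    return model_output

print(expand("First, <STEP_RESET_ROUTER> Then try again."))
```

The canned blocks cost nothing at inference time; the model only has to produce the short token and whatever fluff surrounds it.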

1

SnipingNinja t1_j1ni87i wrote

Look at Google's CaLM, it's trying to solve this exact issue afaict

2

breadsniffer00 t1_j1ogu5g wrote

“Only the rich will benefit from AI”. Average ppl were using GPT-3 before ChatGPT. It’s inexpensive and they even gave $18 in free credit. Not everything has to fit this dystopian anti capitalist narrative you’re creating

8

imlaggingsobad t1_j1oqq9c wrote

Also this is very early days still. Computers also started off expensive, so did phones, gaming consoles, and TVs. But now we have a huge market with many affordable options.

2

GuyWithLag t1_j1q43hg wrote

There are good indications that one can trade off training time and corpus size against model size, making the post-training per-execution cost smaller.

Note that ChatGPT is already useful to a great many people, but training a new version takes time. I'm guessing OpenAI is still in the iterative development phase, and each iteration needs to be short because it's still very early in the AI game.

1

DukkyDrake t1_j1mu3sl wrote

>I can't imagine how much cost they're racking up

I've seen estimates from $3m/day to $3m/month for chatGPT compute.

>average is probably single-digits cents per chat; trying to figure out more precisely and also how we can optimize it— Sam Altman (@sama) December 5, 2022

6

luciddream00 t1_j1nq26k wrote

Well, it did what it needed to do already. I think a lot of folks like myself were aware of what OpenAI was doing, but not necessarily aware of how good it had gotten. Within a week of seeing what ChatGPT was doing, I was making little prototypes using their API. I'm sure I'm not unique in that, and I expect an explosion of AI-driven software over the next 6 months to a year. This stuff has gotten good enough and easy enough to use that almost anyone could learn the AI part overnight. You still have to be able to make the rest of the software, and you have to have a vision for some kind of product that leverages the AI, but actually integrating the AI into a familiar framework is trivially easy.

Costs will surely go down over time too. Right now it's probably around a penny per chat interaction, going by the API costs, and it wouldn't take too many doublings in GPU efficiency before it got to the point that it could be ad-supported or something.
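The "not too many doublings" claim can be put in rough numbers. The penny-per-chat figure comes from the comment; the ad-revenue-per-interaction figure is purely an assumption for illustration:

```python
import math

# How many halvings of inference cost until ads could cover a chat?
cost_now = 0.01     # dollars per chat interaction (comment's estimate)
ad_revenue = 0.001  # dollars of ad revenue per interaction (assumed)

doublings = math.ceil(math.log2(cost_now / ad_revenue))
print(doublings)  # -> 4
```

Under those assumptions, four doublings in efficiency (a 16x cost reduction, comfortably past the 10x gap) would close it.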

8

odragora t1_j1o2x77 wrote

Money isn't the problem.

The real problem is the state of society, where normal people who just want to use the tool are stuck between people who abuse it to produce "funny" content about Hitler, racism and the like, and Karens whose life goal is to ban absolutely everything on Earth.

7

snwfdhmp t1_j1pq4ah wrote

Why would they be stuck because of funny content?

3

odragora t1_j1psm01 wrote

Because people who use the AI to generate stuff like that make it look, in the eyes of the broad public, like a tool for criminals or just pieces of shit.

Which inevitably leads to calls for banning it, or neutering it to the point it becomes absolutely crippled. Which is what's already happening.

5

BinyaminDelta t1_j1nlyi7 wrote

I'd pay for it right now, especially if Premium let me adjust the filters (I'm a grown-up, dammit) and use it via API.

5

snwfdhmp t1_j1pq7ce wrote

Most filters can be disabled, and GPT is already available via API.

1

Representative-Bag89 t1_j1n2svl wrote

I will happily pay for gpt-3, maybe with the option of having longer threads

4

zerocoldgg t1_j1o1ld9 wrote

Nope, not GPT-3. I will happily pay for GPT-4.

6

Beneficial_Fall2518 t1_j1pshav wrote

I'm not worried. These products may cost money next year, but in two years others will make better versions for free. The reason AI art programs cost $100+ is because the companies have a year or two to make a profit until they become small fish in a big pond.

3

epSos-DE t1_j1ombq5 wrote

Ask it: "How do I do blah blah? Give me 15 options."

A good chat AI does deliver options for fixing issues.

1

Rezeno56 t1_j1osx40 wrote

If ChatGPT's code is released open source, people could make ChatGPT-like AIs for different fields: storytelling, worldbuilding, journalism, coding, etc.

1

No_Ninja3309_NoNoYes t1_j1p9r41 wrote

I am not sure I would pay for ChatGPT. Yesterday it got stuck in a loop, giving me the same piece of text over and over. But I would probably accept a fixed, low monthly fee. Actually, the Wikipedia API and a summary API could cover some of my needs. I think Markov chain models, or whatever they were called, could do decent text generation. Not as good as GPT of course, but that technology is much simpler to implement. I won't be surprised if someone uses it in a novel way soon.

1

[deleted] t1_j1oka74 wrote

[deleted]

−1

madmadG t1_j1oxpuk wrote

Eh… ads mean the advertisers are in control. Of the content as well! Not so sure about this.

5

GlobusGlobus t1_j1nbrd3 wrote

No, money will make it better.

−4

Xist3nce t1_j1nnu2w wrote

That’s unfortunately very wrong. Greed makes every system worse. This is fresh and needs PR so the people will see it, but the moment a walled garden can be made easily for profit, it will be.

10

odragora t1_j1o7cca wrote

They will, but only when there's enough pressure from open-source tools to make them compete.

If their competitors only offer equally restricted, censored, neutered-to-oblivion products, we will stagnate hard.

0