ninjasaid13 t1_jc9m41f wrote

They said competitive landscape reasons or safety reasons, but we all know it's the former.

57

SidewaysFancyPrance t1_jcbeccw wrote

AI arms race, which is 100% about money and greed. Morality and ethics slow things down. Regulations slow things down. Wall Street and investors are watching for a prospective winner to bet big on. Tech is stale, old news. They need the next big win.

AI domination is going to be "first past the post" and all these companies know it. We're going to hear all kinds of BS about how they're trying to be ethical/etc but they really want to be first and snap up lucrative deals with their protected IP. What was open will stop being open very quickly once a public tech demo like this gets attention, and be strictly protected for that reason.

18

chuntus t1_jcbmk62 wrote

Why will it be 'first past the post'? The market tolerates multiple versions of other technologies; why not in this field?

3

CactusSmackedus t1_jcdmi15 wrote

It's not; the commenter doesn't know what they're talking about. There's a paper out in the last few days (I think) showing that weaker models can be fine-tuned on input/output pairs from a stronger model and approximate the stronger model's results. This implies any model with paid or unpaid API access could be subject to a sort of cloning, and it suggests that competitive moats will not hold.
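The cloning idea described above is: query the stronger model, log the (input, output) pairs, and fine-tune the weaker model on them. A toy sketch of that loop, where `teacher` is a hypothetical stand-in for the API-gated model and the "fine-tuning" is simplified to memorization (a real student would update its weights on this data):

```python
# Toy sketch of "cloning" a stronger model via its input/output pairs.
# `teacher` is a hypothetical stand-in for a paid-API model; the student
# is trained purely on the teacher's logged responses.

def teacher(prompt: str) -> str:
    # Pretend this is an expensive API call to the stronger model.
    return prompt.upper() + "!"

def collect_pairs(prompts):
    # Step 1: query the API and log (input, output) training pairs.
    return [(p, teacher(p)) for p in prompts]

class Student:
    # Step 2: "fine-tune" the weaker model on the logged pairs.
    # Memorization here stands in for gradient updates.
    def __init__(self):
        self.memory = {}

    def fine_tune(self, pairs):
        self.memory.update(pairs)

    def generate(self, prompt: str) -> str:
        # Approximates the teacher on prompts seen during fine-tuning.
        return self.memory.get(prompt, "")

pairs = collect_pairs(["hello", "open models"])
student = Student()
student.fine_tune(pairs)
print(student.generate("hello"))  # matches the teacher: "HELLO!"
```

The point of the sketch: nothing in the loop needs the teacher's weights, only its outputs, which is why API access alone weakens the moat.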

Plus (I have yet to reproduce this, since I've been away from my machine), apparently a Facebook model's weights got leaked in the last week, and apparently someone managed to run the full 60B-parameter model on a Raspberry Pi (very, very slowly). Two implications:

  1. "Stealing" weights continues to be a problem, this isn't the first set of model weights to get leaked iirc, and once you have a solid set of model weights out, experience with stable diffusion suggests there might could be an explosion of use and fine tuning.

  2. Very, very surprisingly (I am going to reproduce it if I can, because if true this is amazingly cool), consumer-grade GPUs can run these LLMs in some fashion. Previous open-sourced LLMs that fit in under 16 GB of VRAM were super disappointing: to get the model small enough to fit on the card, you have to limit the number of input tokens, which means the model "sees" very few words of input with which to produce output. Pretty useless.
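The back-of-envelope VRAM math behind point 2 is just parameter count times bytes per parameter. A minimal sketch (the parameter counts and byte widths are illustrative assumptions, not measurements of any specific model):

```python
# Back-of-envelope VRAM math for fitting LLM weights on a consumer GPU.
# Parameter counts and byte widths are illustrative assumptions.

def weights_gb(n_params_billion: float, bytes_per_param: float) -> float:
    # Weights only; activations and the cache for input tokens need more
    # memory on top, which is why tight VRAM budgets force short contexts.
    return n_params_billion * 1e9 * bytes_per_param / 1e9

for params in (7, 13, 60):
    fp16 = weights_gb(params, 2)    # 16-bit floats
    int4 = weights_gb(params, 0.5)  # 4-bit quantization
    fits = "fits" if int4 <= 16 else "does not fit"
    print(f"{params}B: {fp16:.0f} GB fp16 vs {int4:.1f} GB int4 "
          f"({fits} in 16 GB of VRAM)")
```

So a 7B model needs roughly 14 GB at fp16 but only about 3.5 GB at 4 bits per weight, which is what makes consumer cards (and, very slowly, even CPUs and a Raspberry Pi) plausible hosts.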

Now, I don't think we'll have competitive LLMs running on GPUs at home this year, but even if OpenAI continues to be super lame and political about their progress, eventually the moat will fall.

Also, all the money to be made (aside from Bing eating Google), or maybe I should say most of the value, is going to be captured by skilled consumers/users of LLMs, not by glorified compute providers.

2

Smeagollu t1_jcczori wrote

It's likely that the first broadly usable general AI will be ahead of all others by long enough that it can grab most of the market. It would make Bing the new go-to search engine, or Google could cement its place.

0

CactusSmackedus t1_jcdk9kb wrote

MONEY

AND GREEEEEEEEED

🙄🙄🙄 Garbage populist memery

1