Submitted by OmegaConstant t3_121qk23 in Futurology

This is the year of Google Search's death. Skeptical? Bear with me. I spent many years building startups in the AdTech space, so I'm pretty familiar with how ad networks work. And one thing is clear to me: OpenAI does not have to deal with publishers. At least for now. Google is obligated to. Google's entire business model is still mostly advertising; it is based on catering to both publishers and advertisers. Even if they replicate ChatGPT as a technology, it does nothing to stop the fall. Google can't just lock users into a chatbot that surfaces answers without a nod to publishers somehow. Of course they can't! They would deprive publishers of coveted eyeballs and of selling premium ads on their lovely websites! The entire Google ad network would go down.

On the other end, OpenAI has zero obligations to publishers for now and can strike new deals directly with those same publishers, reinventing the ad-selling business model (no, obviously not with ugly banners). They will easily disrupt the model with zero risk. They are not bound by multimillion-dollar yearly ad contracts. No obligations and no strings attached. Pure innovation and disruption. Goodbye Google. Welcome AI.

Edit: I see a lot of replies arguing about the costs of running inference. Believe me, it's a temporary state of things. Just this week there was a paper on training a model like this for $600, and the accidentally leaked, open-sourced LLaMA model has already been distilled into a model 100x smaller that runs on a PC. Just a day ago there was a meetup on running such models on a Raspberry Pi.

The cost always goes down. So it was with the first computers and the first phones; even the hard problem of solar panels got drastically cheaper.

It's just a temporary state of things. Extrapolate.




grundar t1_jdmzbxs wrote

> They will easily disrupt the [business] model with zero risk.

How often are grandiose statements like these made by anyone who is not woefully naive (or trying to con investors)?

Serving requests takes computers and electricity, which costs money. Even if chatGPT took over from Google search, OpenAI would still need to pay its bills. Just as Google didn't fundamentally change the search ecosystem when it took over from Yahoo et al., OpenAI would be almost certain to operate within the existing paradigm due to a need to pay its expenses.


psychotobe t1_jdod6bp wrote

Guess where the NFT people went after that bubble burst. Because that "we will change everything" statement is literally what crypto claimed constantly. Then billions got lost like three times a month and most people realized, "oh, it's a scam."

AI content is gonna go the same way. It's people in programming instead of cryptography this time thinking they found an infinite money glitch. It'll be something else when this bubble bursts. Meanwhile, the rest of us will get some neat tech that'll have a use at some point (although who knows what blockchain's use is) after they tear out their hair and rip open their wallets stress-testing it to its limits and beyond. My money is on AI art's real use, after the novelty wears off, being in the roleplay community. Mostly because it would be hilarious to see how the snobby tech bros feel about that.


JoeBookish t1_jdmwqdw wrote

Kind of a reach. I think the tech is revolutionary, but still a pretty niche product, especially when Google is so ubiquitous that you can type in just about any blank field on a screen and instantly do a Google search. I mean, Google is literally synonymous with 'internet search'. Let's come back in a year.


BonFemmes t1_jdn0j4f wrote

They are two different things.

Google: When is Tony's Pizza open?

AI: As a pizza maker, how do I make a Neapolitan pepperoni pizza? List the steps.

We will continue to ask Google questions that have definite answers. AI can be wrong. AI can be a valuable assistant: it writes stories and code. We need Google to check its work.


_Mechaloth_ t1_jdobb1x wrote

A recent thread on a history forum requested a bibliography on horse culture in 14th century Japan. A respondent gave an AI-compiled list which had a large number of invented/phantom sources, sources which the supposed authors themselves chimed in and said, “Yeah, I never wrote that.”


Sirisian t1_jdnndio wrote

OpenAI has a growing list of plugins. It could ask a plugin (OpenTable, I think) for details like opening hours when it doesn't have the information itself.
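As a rough sketch of that fallback pattern (the function names and the hard-coded answer below are hypothetical illustrations, not OpenAI's actual plugin API):

```python
# Hypothetical sketch: route a question to a live-data plugin when the
# model has no reliable answer of its own. All names are made up.
def model_answer(question: str):
    """Stand-in for the LLM; returns None when it can't answer reliably."""
    return None  # e.g. it can't know today's opening hours

def opentable_plugin(question: str) -> str:
    """Stand-in for a live-data plugin such as OpenTable."""
    return "Tony's Pizza: open 11:00-22:00 today"

def answer(question: str) -> str:
    # Try the model first; fall back to the plugin for live data.
    return model_answer(question) or opentable_plugin(question)

print(answer("When is Tony's Pizza open?"))
```

The real mechanism is richer (the model decides when and how to call a tool), but the fallback logic is the core idea.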


Current_Panic63 t1_jdnzfsp wrote

Where should I go to dinner? AI: Let me recommend these sponsored restaurants


imnos t1_jdpkoxx wrote

Not different things. ChatGPT will soon have internet access (i.e., live data) and a ton of plugins.


kikiubo t1_jdn563r wrote

ChatGPT is amazing but sometimes it just says complete nonsense with extreme confidence.


alecs_stan t1_jdn7oah wrote

Sounds like my uncle Brett.


mascachopo t1_jdn8v7f wrote

Except your uncle Brett is not being marketed as a genius that will solve all problems of humankind.


powaqqa t1_jdng0wx wrote

This. I tried it for the first time this week and it was pretty disappointing. Insane mistakes in the answers. But it's promising tech. Still a few years before it gives really trustworthy answers.


casentron t1_jdoedut wrote

Far too confident for someone with such limited knowledge of the numerous complexities at play in this situation. This post frankly feels a bit manic.

I know AdTech startup guys...let's just say I'm not at all impressed with those credentials.


OmegaConstant OP t1_jdoevmg wrote

Could you add anything helpful to the conversation besides insults?


Bewaretheicespiders t1_jdneo2a wrote

The cost of inference, in GPUs and thus electric power, of these LLMs is just too high. At 8.5 billion searches a day, replacing Google Search with GPT-4 would consume an estimated 7 billion watt-hours. A day. Just for the power consumed by the GPUs.

You would need over 638 Hoover Dams just to power that.
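Taking the comment's figures at face value (both numbers are the commenter's assumptions, not measurements), the implied per-search energy works out as follows:

```python
# Back-of-envelope check of the figures in the comment above.
searches_per_day = 8.5e9     # claimed daily Google search volume
gpu_wh_per_day = 7e9         # claimed GPU energy if GPT-4 served them all

wh_per_search = gpu_wh_per_day / searches_per_day
print(f"~{wh_per_search:.2f} Wh per search")

# 7 GWh/day corresponds to this continuous power draw:
avg_power_mw = gpu_wh_per_day / 24 / 1e6
print(f"~{avg_power_mw:.0f} MW continuous")
```

That is roughly 0.8 Wh per query under these assumptions, around 290 MW of continuous draw.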


vitalyc t1_jdo3iji wrote

So how are people running LLMs locally on laptops and phones? It seems the training costs are unimaginable but you can optimize the models to run on consumer hardware.


Bewaretheicespiders t1_jdokdmg wrote

They aren't running GPT-4 locally; it sends the request through an API.

GPT-3 has 175 billion parameters; at float16 that's about 350 gigabytes (326 GiB) just for the parameters. That would fill most phones' storage, never mind the 12 GB of RAM the most expensive phones have.
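The arithmetic behind that figure (175B parameters is GPT-3's published size; float16 storage is the commenter's assumption):

```python
# Memory needed just to hold GPT-3's weights at 16-bit precision.
params = 175e9          # GPT-3 parameter count
bytes_per_param = 2     # float16 = 2 bytes per parameter

total_bytes = params * bytes_per_param
print(f"{total_bytes / 1e9:.0f} GB")     # decimal gigabytes
print(f"{total_bytes / 2**30:.0f} GiB")  # binary gibibytes, the 326 figure
```

350 GB decimal and 326 GiB binary are the same quantity; either way it dwarfs consumer hardware.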

Then GPT-4 is many times that...


alpaca417 t1_jdmxfx7 wrote

So, for my own understanding, how is Google going away? Are you saying that OpenAI will always be an ad-free alternative to Google, or that OpenAI will replace Google once they strike new deals with publishers?

Because if it's the latter, then why would publishers want a different deal than the terms they get from Google? OpenAI would start moving closer to what Google is in terms of results being influenced, and Google can always close the gap on the technical side to get closer to OpenAI.


gjallerhorn t1_jdmz6n0 wrote

No. I keep seeing this argument and it doesn't make sense. Google is a search engine. Language AIs are not. They can regurgitate summaries of stuff they've been fed and have some limited ability to remix it into new forms.

But their data sets are often very sparse on info from the last several years. And they don't point you to actual sources, just info, which can be completely fabricated or wrong.


mascachopo t1_jdn8myn wrote

If we've learnt anything, it's that LLMs are very good at making stuff up with a high degree of confidence. This is not what you want from an internet search, so while I think they can be useful for a number of tasks, I really hope they won't replace search engines before this problem is solved, or we'll end up in an ocean of misinformation that will create a myriad of issues.


rixtil41 t1_jdnigvk wrote

It will not be its death. But if Google doesn't keep up with the pace, it will die. Fact-checking is still a problem with current AI.


thatsallweneed t1_jdn27eu wrote

Are there any numbers? Has Google lost at least 5%?


Diveinto_AI t1_jdnlgsu wrote

Probably hard to get those numbers. But OpenAI is mostly used for creative work now, not search. Yet all might change once plugins are in place, and that is just the beginning…


altmorty t1_jdnihat wrote

Dumb searches are way more efficient, though. How much more would GPT-4 cost to run at the same scale as Google?


Lysmerry t1_jdo1ya6 wrote

Google may give false results, but you can quickly go through a few of them and evaluate the websites and their trustworthiness on your own. With AI you just have to take its word for it.


yogurt_thrower_75 t1_jdoia3l wrote

OpenAI is only open because no one has monetized it yet. It was Google who kicked off this wave of AI with their machine-learning work years ago. To think they haven't seen this coming and don't have an answer for it is just silly.


ecnecn t1_jdq1wlk wrote

There are ads for the Google Pixel Pro smartphone and its AI-powered abilities to change and manipulate pictures. It feels really outdated compared to the actual developments in AI. It feels like a "KODAK moment on steroids," because you can tell that Google had all the tech already in place, but their management decided to take parts of it and build apps for their new mobile phone rather than becoming the first AI web service. Furthermore, their management, for some weird reason, favored black-hat SEO methods that actually killed the results. I wonder if Google management read the OpenAI and Stanford AI blogs (available to everyone), because they literally described their future steps and what was about to happen in AI development. Google is sleeping in its own IT bubble.