DrifterInKorea

DrifterInKorea t1_j6hp8sa wrote

Yes, in some sense: the ability to have a deep understanding of a language, be it human or computer, is a prerequisite for building and optimizing logical blocks, as in programming.

Human languages are far more complex than computer languages and carry far more context, requiring you to remember a lot of the text to follow what the subject is.

ChatGPT is really good at discerning what the subject and the intent are, while also being good at finding related topics.

I would not be surprised if it were revealed that they use similar models to develop, debug, or improve their existing code base.

ChatGPT also makes a lot of mistakes, which most likely prevents it from improving without supervision and tweaking from actual developers.
But we are definitely moving further and further into what was described as science fiction a few years ago.

2

DrifterInKorea t1_j4yy8bf wrote

That's not a TIFU... if the girl is weird enough to not even ask you why you did it and just goes straight to full ignore, you may have dodged a bullet.

Even though younger generations seem to struggle with basic communication.

10

DrifterInKorea t1_j2qotpn wrote

Yes, and it's hard to detect true bots (I mean automated processes that do not just follow links, like wget does), because even a simple curl call can spoof its signature and appear "human" to an external observer.

So it goes both ways:

  • on one side, you have users who may interact with tools that cause their traffic to be labelled as "bot".
  • on the other side, even simple scripts (bots) can alter their behavior to make it look like they are human (adding noise and delays to mouse cursor movements, randomizing IPs, using various user agents, etc.) and be labelled "human" (see the sketch below).
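
To illustrate that second point, here is a minimal sketch (in Python, using the requests library; the target URL, user-agent strings, and delay range are placeholder assumptions, not anything specific) of how a simple script can present itself as ordinary browser traffic:

    import random
    import time

    import requests

    # A couple of common desktop browser user agents to rotate through
    # (placeholder strings; any realistic browser UA plays the same role).
    USER_AGENTS = [
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/109.0 Safari/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
        "(KHTML, like Gecko) Version/16.1 Safari/605.1.15",
    ]

    def fetch_like_a_human(url: str) -> str:
        # Send a browser-looking user agent instead of the default
        # "python-requests/x.y.z" signature.
        headers = {"User-Agent": random.choice(USER_AGENTS)}

        # Wait a random, human-ish amount of time so the request timing
        # does not look machine-regular.
        time.sleep(random.uniform(1.0, 5.0))

        response = requests.get(url, headers=headers, timeout=10)
        return response.text

    if __name__ == "__main__":
        # Hypothetical target; to an external observer this request looks
        # like a regular browser visit rather than a script.
        html = fetch_like_a_human("https://example.com/")
        print(len(html))

From the server's side, the remaining differences from a real visitor are mostly statistical (timing, navigation patterns, IP reputation), which is exactly why labelling traffic as "bot" or "human" is so unreliable.
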
2

DrifterInKorea t1_j2px16p wrote

If true, it means there is something wrong with their detection.

It makes no sense to have fewer bots crawling the web when automation is getting bigger and bigger in every field.
Also, when you look at the explosion of bot-generated content on social media during the pandemic, which has not really slowed down since, the numbers are going in the opposite direction.

2

DrifterInKorea t1_j2n60xb wrote

It should be more than 50% by now, and it will keep growing for multiple reasons:

  • more SaaS and more servers talking to APIs to get or post data.
  • AIs and other statistical tools require lots and lots of data from the web, hence more crawling.
  • There are more and more motivations for crawling the web, and as we move up in layers, web pages will be crawled far more often to present their data in different formats.
  • AIs are most likely going to start building tools for us (and for themselves), and those tools will require far more data (a lot more) than what they use today.

Reasons 3 and 4 are basically extensions of reasons 1 and 2.

49

DrifterInKorea t1_ix21tdp wrote

They are cute.
I met one last week, thinking it was a cat.
But it was making sounds as it walked, so I thought it was a dog instead.
But then, getting closer, it looked like a very strange, not-doggy thing at all.
Once close enough it turned around and completely froze.

It was huge, so I guess it had been emptying the stray cat food that old ladies put out.

This was a bit random just like the encounter.

1

DrifterInKorea t1_iwpg6a2 wrote

People who seriously think Elon Musk is offering a potential solution with interplanetary transport should do a bit more research into what it takes to do something very, very hard (terraforming, or at least maintaining a habitable bubble) very, very far away, versus limiting pollution on our current planet.

The guy is a dreamer, which is great, but our problems should be answered with solutions, not dreams.

Like, the wars going on all over the world are polluting like crazy, and basically nobody ever has to take responsibility for it.

It's way easier and quicker to fix pollution (see the ozone hole) than to try to build something new.

4