Comments

Dr_Singularity OP t1_irkrmtd wrote

The memory requirement is greatly diminished because of their system-algorithm co-design approach. The proposed methods cut memory use by more than a factor of 1,000 compared to cloud training frameworks, and by a factor of 100 compared to the best edge training framework they could find (MNN).

The framework also saves energy and makes practical use more feasible by cutting per-iteration time by more than a factor of 20 compared to a dense update and a vanilla system design. Their findings show that small IoT devices can not only run inference but also learn from experience and acquire new skills over time.
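For anyone wondering what a "sparse update" looks like in practice, here is a rough PyTorch sketch of the general idea (my own illustration, not the authors' actual implementation; the model and the choice of which layers to unfreeze are made-up assumptions): freeze most of a pretrained network and only keep gradients for the biases and the final classifier, which is where the bulk of the training-memory savings comes from.

```python
# Rough sketch of a sparse update for on-device fine-tuning:
# freeze most of a pretrained network and train only the biases plus the
# final classifier, so gradient and optimizer-state memory shrink sharply.
# The model and layer choices here are illustrative, not from the paper.

import torch
import torch.nn as nn

model = nn.Sequential(                      # stand-in for a small pretrained backbone
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),                      # classifier head adapted on-device
)

# Dense update: every parameter needs a gradient (and optimizer state).
dense_params = sum(p.numel() for p in model.parameters())

# Sparse update: freeze everything, then re-enable only biases and the head.
for p in model.parameters():
    p.requires_grad = False
for name, p in model.named_parameters():
    if name.endswith("bias") or name.startswith("6."):   # "6." = final Linear
        p.requires_grad = True

trainable = [p for p in model.parameters() if p.requires_grad]
sparse_params = sum(p.numel() for p in trainable)
print(f"dense: {dense_params} params, sparse: {sparse_params} params "
      f"({dense_params / sparse_params:.0f}x fewer to store gradients for)")

# On-device fine-tuning step only touches the small trainable subset.
opt = torch.optim.SGD(trainable, lr=0.01)
x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
```

The ratio printed by this toy example is nowhere near the 1000x figure in the paper; that comes from the full system-algorithm co-design. But the mechanism is the same: fewer trainable tensors means fewer gradients and less optimizer state to keep in RAM.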

47

WashiBurr t1_irlimw6 wrote

There has to be some kind of catch to this. It seems too good to be true.

21

Tobislu t1_irllk8q wrote

The catch is that technology is just as easily used for evil as good.

Now deep learning doesn't need to go through well-known channels, because it can run locally on cheap hardware. These inferences can be made by anyone with a flip phone or a hacked microwave.

15

HofvarpnirStudios t1_irlvexn wrote

Power asymmetry can lead to nefarious use as well

As in, only those with massive GPUs can corner the market, or something like that

8

_____DEAD_____ t1_irpamzq wrote

Gonna be interesting to see general appliances having the ability to "learn". Give it a camera so it can see the world too; just don't forget to clean the microwave before it microwaves you

2

Quealdlor t1_irlw3j3 wrote

It feels unreal. I originally thought we would need terabytes of RAM, and that we would get them thanks to Moore's Law giving us 100x in a decade. Looks like we won't even need those terabytes; low gigabytes might be enough.

12

genshiryoku t1_irmxqml wrote

A consensus seems to be forming that we don't need any better technology than we already have right now.

If for some reason hardware development stopped today and nothing new were ever made, it's possible that with the right architectural/software breakthroughs we could still reach AGI.

Yeah, Moore's law is most likely going to end around the end of this decade, but we have more than enough processing power for the AGI revolution to still happen.

9

_____DEAD_____ t1_irqlbpd wrote

I agree that it's more a programming problem, but improving hardware brings increased speed and available memory, which absolutely benefits the development of these programs.

2

Quealdlor t1_irrr1tx wrote

But do you know how hard it is to run modern games on Ultra settings at 4K 60 fps, let alone higher framerates? We need better hardware or some software breakthroughs. DLSS looks horrible, btw.

1

GenoHuman t1_is15mcm wrote

A 4090 can easily run any modern game at 4K 60 fps; in fact, it can run most games at 4K 100+ fps with ray tracing.

0

Bakoro t1_irlyoif wrote

I want to make sure I've got this right: is it that you still train the pretty-good base model with major resources and the edge device improves it over time, or that you can start with a poop-tier model which can still become very good over time?

6

tokkkkaaa t1_irp0z2b wrote

Can someone explain this?

1