Comments

Purple_noise_84 t1_itgynxx wrote

A few more years and it will be as good as PyTorch is today :)

22

BITE_AU_CHOCOLAT t1_ithud6t wrote

Eh... I'm currently training a model with 700M parameters (most of which are in the embeddings used as input, not so much the hidden layers themselves) and PyTorch pretty much required at least 50GB per GPU, while TensorFlow was happy to train on 3090s, which were way, wayyyy cheaper to rent than A6000s, even though PyTorch managed better GPU utilization. So I think I'm just gonna stick with TF/Keras and TFLite for now.

−8

learn-deeply t1_itikofd wrote

PyTorch doesn't inherently use more or less memory than TensorFlow, there's a bug in your code. If it's easier to switch frameworks than debug, more power to you.
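
In my experience the usual culprit is accumulating the loss tensor itself, which silently keeps every iteration's autograd history alive. A minimal sketch of the pattern (a hypothetical toy loop, obviously not your actual code):

```python
import torch
from torch import nn

# Toy setup just to make the loop runnable.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()
data = [(torch.randn(32, 10), torch.randn(32, 1)) for _ in range(100)]

total_loss = 0.0
for x, y in data:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    total_loss += loss          # BUG: keeps each step's autograd history alive
    # total_loss += loss.item() # fix: accumulate a plain Python float instead
```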

14

BITE_AU_CHOCOLAT t1_itiofjz wrote

Well, I haven't "switched", since I've been using TensorFlow since the start of the project. I was just curious to see whether PyTorch could let me squeeze out more juice, and after spending a weekend trying to learn PyTorch's assembly-like syntax, it turns out that yes, but actually no. So yeah, I'm perfectly content with using model.fit and calling it a day for the time being.

Oh, and I also forgot: PyTorch won't train with a distributed strategy in a Jupyter environment. KEK.
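
For reference, the TF side of what I mean is roughly this (a toy sketch, not my actual model):

```python
import tensorflow as tf

# MirroredStrategy runs fine from a notebook cell,
# no separate launcher script needed.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

x = tf.random.normal((1024, 64))
y = tf.random.normal((1024, 1))
model.fit(x, y, batch_size=64, epochs=2)
```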

−11

jaschau t1_itguay4 wrote

Am I the only one for whom the announcement feels a bit out of touch with reality?

19

sharky6000 t1_iti87n8 wrote

I am not a fan of TF by any means, but:

> It’s the 3rd most-starred software repository on GitHub (right behind Vue and React) and the most-downloaded machine learning package on PyPI

Can't really make that stuff up. There are quite a lot of TF users out there.

13

jaschau t1_itk0o02 wrote

I completely agree that the numbers are correct. I just felt like they might not tell the whole story. For example, looking at the section where they say, I paraphrase, "x preprints are uploaded every day that mention TF", I don't doubt the numbers, but the way they tell it evokes a rather different image than saying "the share of preprints relying on TF has been steadily declining over the past years".

Regarding the many TF users out there, I would be curious what the main benefit is for them. Is it TPU support, TF serving, TF lite, something else?

6

LetterRip t1_itiqjqp wrote

Most people download PyTorch directly from the PyTorch site rather than from PyPI, so the download comparison is somewhat misleading.

−4

drinkingsomuchcoffee t1_ithb1xs wrote

Unsurprising. Google's been out of touch with reality for a while now. That's what happens when you have a near monopoly (besides Apple). Despite the claims of how elite they are, the APIs they produce are pretty garbage, except for a few lucky hits like JAX.

5

VirtualHat t1_itkajqr wrote

I use Pytorch every day and haven't gone back to TF for years. That being said, there are lots of old projects still on TF, and indeed on the older 1.x version before they fixed most of the stuff.

I'm glad they're working on XLA and JAX though.

3

johnnymo1 t1_ith4dxp wrote

I'm happy to learn about this KerasCV package. I've been using the TensorFlow Object Detection API for work lately and I hate it.

10

puppet_pals t1_ithf7m4 wrote

Luke here from KerasCV - the object detection API is still under active development. I recently got the RetinaNet to score a mAP of 0.49 on PascalVOC in a private notebook; that should be possible with just the standalone package in the 0.4.0 release. I'd say give it a month.

The API surface won't change a lot, but the results will get a lot better in the next release.

4

johnnymo1 t1_ithghw0 wrote

I think there may be a name collision going on here haha. You mean the object detection API within KerasCV?

1

IndieAIResearcher t1_ith113b wrote

JAX integration into TF inference, Lite, or JS is my fav!!
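
For example, jax2tf lets a JAX function ride the whole TF deployment stack (a rough sketch; the function and shapes are just placeholders):

```python
import jax.numpy as jnp
import tensorflow as tf
from jax.experimental import jax2tf

# A placeholder JAX function standing in for a real model.
def predict(x):
    return jnp.tanh(2.0 * x + 1.0)

# Wrap it as a tf.function; from here the usual
# SavedModel / TFLite / TF.js tooling applies.
tf_predict = tf.function(
    jax2tf.convert(predict),
    input_signature=[tf.TensorSpec([None, 8], tf.float32)],
    autograph=False,
)

module = tf.Module()
module.predict = tf_predict
tf.saved_model.save(module, "/tmp/jax_model")
```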

7

Lajamerr_Mittesdine t1_itgeiy5 wrote

Can someone who has the perspective of using both TensorFlow and PyTorch give their opinion on why you would or wouldn't use each?

And how does this announcement change things for you?

3

IndependentSavings60 t1_itgqugk wrote

I used TF back when deep learning was taking off, around 2016. Personally, it became really hard to use once TF allowed contributed modules to be merged into the codebase: inconsistent API interfaces and a wall of deprecation warnings appeared every time I ran code or updated the TF version. I think most people back then used TF or PyTorch to try out research ideas, and TF just didn't fit that role as well as PyTorch did. More and more researchers switched to PyTorch for its ease of use, and so did their new models; I doubt anyone besides Google still publishes models in TF.

12

anomaly_in_testset t1_itgwh3d wrote

I am primarily a TF 2.x user and have started using JAX for most of my thesis. For me, this is a huge update, since TF doesn't support item assignments, numpy-like behavior, etc. And XLA is a huge deal, as the speed difference between JAX and TF is significant. So overall, I am very excited.
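
To make the item-assignment point concrete (a minimal sketch):

```python
import jax.numpy as jnp
import tensorflow as tf

# JAX: functional index update, returns a new array.
x = jnp.zeros(5)
x = x.at[2].set(1.0)

# TF: tf.Tensor has no item assignment; a scatter op is needed instead.
y = tf.zeros(5)
# y[2] = 1.0  # TypeError: tensors do not support item assignment
y = tf.tensor_scatter_nd_update(y, indices=[[2]], updates=[1.0])
```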

2

Mefaso t1_itkp4he wrote

>And XLA is a huge update as the speed difference in Jax and TF is significant

Which one is significantly faster? I guess Jax?

2

fasttosmile t1_itiivv7 wrote

Interesting that they've decided to keep investing in it. I suppose with the amount of existing code they must have, it was hard not to.

1

learn-deeply t1_itikwmy wrote

They have to. If they created TF 3.0 without backward compatibility, or told people to switch to JAX, everyone who used TensorFlow would just move to PyTorch.

5

levilain35 t1_itrlfj0 wrote

I have tried TensorFlow 1, TensorFlow 2, and PyTorch. I love TensorFlow because it is very efficient at scaling big models across several GPUs or workers. On top of that, TFRecords are very cool: we never drop back into Python during training. That is a very important part of TensorFlow, even if PyTorch has its own strengths. For my applications I need to train very big models easily and then deploy them, and I am happy to hear that this is a main goal of TensorFlow.
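
For example, a typical TFRecord input pipeline stays entirely inside the tf.data runtime (a sketch; the feature names and shapes are placeholders):

```python
import tensorflow as tf

# Placeholder feature spec; real names/shapes depend on how
# the TFRecords were written.
feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse(record):
    example = tf.io.parse_single_example(record, feature_spec)
    image = tf.io.decode_jpeg(example["image"], channels=3)
    return tf.image.resize(image, [224, 224]) / 255.0, example["label"]

# Parsing, shuffling, and batching all run inside the tf.data graph,
# so training never drops back into Python per example.
dataset = (
    tf.data.TFRecordDataset(tf.io.gfile.glob("data/*.tfrecord"))
    .map(parse, num_parallel_calls=tf.data.AUTOTUNE)
    .shuffle(10_000)
    .batch(256)
    .prefetch(tf.data.AUTOTUNE)
)
```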

1