
suflaj t1_j2yhjdw wrote

My man, I only recently convinced myself to start using PyTorch Lightning. There's no way I'd be able to switch to some other new, hip, marginally better framework when it was this hard to start using something that speeds stuff up 10x.

Unless there are clear benefits to switching to some other new technology, it's not worth it.

5

BellyDancerUrgot t1_j3013um wrote

Quick question: is PyTorch Lightning a better PyTorch, or is it more ‘streamlined’ (meaning high-level), like, say, Keras for TF?

2

suflaj t1_j319k9o wrote

It's basically just a higher abstraction layer for PyTorch. It's completely separate but works in tandem with PyTorch.

I use LightningModules (analogous to torch.nn.Module) basically as decorators over ordinary PyTorch models. So you have your model class, and then you create a LightningModule which is instantiated with said model, where you implement, e.g., which optimizers and schedulers you use, how your training, evaluation, and testing go, what metrics you track and when, etc.

But once you're done with R&D you can just use ordinary PyTorch as-is; that's why I like it. It doesn't make setting stuff up for production different in any way, but it makes plenty of stuff during R&D effortless. It has some smelly parts, but IMO they're not a dealbreaker; it just takes a day or two to learn.

4

enterthesun t1_j30tlfx wrote

I hope PL is better than the Keras situation; I'm very excited about PL, and a big reason I started learning PyTorch was to avoid the crutch that is Keras.

2

BellyDancerUrgot t1_j30uc0r wrote

Keras is a good tool when getting started, imo. But eventually it's better to switch to PyTorch because it's more pythonic. And although TensorFlow used to be the deployment standard, PyTorch has caught up; add to that, TF doesn't even have a team working on it anymore, iirc, since everyone moved on to JAX. I will say, though, PyTorch can be very frustrating to work with initially, because a lot of the internal optimizations that Keras does for you are absent in PyTorch. I've never used PL, though.

2

enterthesun t1_j32div1 wrote

Wow, no one working on TF? That's crazy, I didn't know that!

2

kraegarthegreat t1_j316z7p wrote

I found PL helped reduce boilerplate code while still giving the niceties of torch versus tf.

The main thing I like is that it abstracts the training loops while still letting you add custom code to any part of them. This likely sounds weird, but check out their page. 12/10 recommend.

2

todeedee t1_j31v7pu wrote

JAX looks appealing, but totally agree -- I'm not ready to go back to using Bazel to build code from source

1