BellyDancerUrgot t1_j3013um wrote
Reply to comment by suflaj in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Quick question: is PyTorch Lightning a better PyTorch, or is it more "streamlined" (meaning high-level), like, say, Keras for TF?
suflaj t1_j319k9o wrote
It's basically just a higher abstraction layer for PyTorch. It's a completely separate library, but it works in tandem with PyTorch.
I use LightningModules (analogous to torch.nn.Module) basically as decorators over ordinary PyTorch models. So you have your model class, and then you create a LightningModule that is instantiated with said model, in which you implement, e.g., which optimizers and schedulers you use, how your training, evaluation, and testing loops run, which metrics you track and when, etc.
But once you're done with R&D you can just use ordinary PyTorch as-is; that's why I like it. It doesn't change how you set things up for production in any way, but it makes plenty of stuff during R&D effortless. It has some smelly parts, but IMO they're not a dealbreaker; it just takes a day or two to learn.
enterthesun t1_j30tlfx wrote
I hope PL is better than the Keras situation, because I'm very excited about PL, and a big reason I started learning PyTorch was to avoid the crutch that is Keras.
BellyDancerUrgot t1_j30uc0r wrote
Keras is a good tool when getting started, IMO, but eventually it's better to switch to PyTorch because it's more Pythonic. And although TensorFlow used to be the deployment standard, PyTorch has caught up; add to that, TF doesn't even have a team working on it anymore, IIRC, since everyone moved on to JAX. I will say, though, that PyTorch can be very frustrating to work with initially, because a lot of the internal optimizations that Keras does are absent in PyTorch. I've never used PL, though.
enterthesun t1_j32div1 wrote
Wow, no one working on TF? That's crazy, I didn't know that!
kraegarthegreat t1_j316z7p wrote
I found PL helped reduce boilerplate code while still giving the niceties of torch versus tf.
The main thing I like is that it abstracts the training loops while still giving you the ability to add custom code to any part of the training loop. This likely sounds weird, but check out their page. 12/10 recommend.
enterthesun t1_j32dgvr wrote
Thank you.