Hi,
The README.md of the NVidia-NeMo org states that the NeMo repo will be broken down "into a series of functional-focused repos to facilitate code discovery" and that it "will repurpose to focus on speech". I currently use NeMo Adapters on custom torch modules, which aren't linked to HuggingFace, for SFT and PEFT. I could not find any way to use the PEFT framework on arbitrary torch modules not linked to HF in the new repos, such as NeMo Megatron-Bridge or NeMo AutoModel. Do you plan to maintain this feature going forward, or will it be deprecated?

Thanks!
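For context, here is a minimal pure-PyTorch sketch of the kind of pattern I mean: a frozen, arbitrary `nn.Module` (no HuggingFace involvement) augmented with a small trainable adapter for PEFT. The class and parameter names below are illustrative, not NeMo's actual Adapter API:

```python
import torch
import torch.nn as nn

class LoRAAdapter(nn.Module):
    """Low-rank residual adapter: projects down to `rank`, back up to `dim`."""
    def __init__(self, dim: int, rank: int = 4):
        super().__init__()
        self.down = nn.Linear(dim, rank, bias=False)
        self.up = nn.Linear(rank, dim, bias=False)
        nn.init.zeros_(self.up.weight)  # zero-init: adapter starts as a no-op

    def forward(self, x):
        return self.up(self.down(x))

class AdaptedLinear(nn.Module):
    """Wraps an arbitrary frozen nn.Linear with a trainable adapter branch."""
    def __init__(self, base: nn.Linear, rank: int = 4):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze base weights (only adapter trains)
        self.adapter = LoRAAdapter(base.out_features, rank)

    def forward(self, x):
        y = self.base(x)
        return y + self.adapter(y)  # residual adapter on the base output

layer = AdaptedLinear(nn.Linear(16, 16))
out = layer(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 16])
```

This works on any custom torch module today via NeMo's adapter mixins; my question is whether an equivalent will exist in the new repos.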