Amazing work!
However, I wonder whether the pre-trained model can be used for downstream tasks like sequence classification. I'm currently trying to harness the ESM2 model for protein annotation and have found it hard to fine-tune. The examples I've found only use full-parameter fine-tuning, whereas I'd like to use LoRA/QLoRA or freeze the top layers. Where can I find examples of these parameter-efficient fine-tuning methods? For instance, can I use standard torch methods to freeze part of the layers in the ESM2 model and train only the remaining parameters?
Any help is welcome! Thanks.
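For what it's worth, here is a minimal sketch of both approaches (layer freezing with plain PyTorch, and LoRA via the `peft` library), assuming the Hugging Face `transformers` port of ESM2. The checkpoint name, the `esm.encoder.layer` module path, and the `"query"`/`"value"` target-module names are assumptions about that implementation, not something confirmed in this thread, so double-check them against your model's `named_modules()`.

```python
# Sketch: parameter-efficient fine-tuning of ESM2 for sequence classification.
# Assumes the Hugging Face `transformers` port of ESM2 and the `peft` library;
# checkpoint and internal module names below may differ in your setup.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

checkpoint = "facebook/esm2_t6_8M_UR50D"  # example ESM2 checkpoint; any size should work
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Option 1: freeze the backbone, then unfreeze only the last N encoder layers.
# The classification head lives outside `model.esm`, so it stays trainable.
n_trainable_layers = 2
for param in model.esm.parameters():
    param.requires_grad = False
for layer in model.esm.encoder.layer[-n_trainable_layers:]:
    for param in layer.parameters():
        param.requires_grad = True

# Option 2: LoRA adapters on the attention projections (assumed BERT-style
# "query"/"value" module names in the HF ESM implementation).
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["query", "value"],
)
lora_model = get_peft_model(
    AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2),
    lora_config,
)
lora_model.print_trainable_parameters()  # sanity check: only a small fraction trainable
```

Either model can then be trained with a regular PyTorch loop or `transformers.Trainer`; the freezing route needs no extra dependencies, while the LoRA route keeps the full backbone frozen and trains only the small adapter matrices.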