First of all, many thanks for the release of LLaMA 2 7B 32K and your valuable contributions!
It's appreciated that you provide example scripts for fine-tuning; however, the part most interesting to me, the continued pre-training mentioned in the blog post, is missing.
Would it be possible to provide that script as well? Many thanks in advance and all the best!
Hi, I am also interested in the pre-training part; a script would be great. Also, a quick question: what sequence length was used during pre-training, 4k or 32k?