Example script for continued pre-training? #162

Open
jphme opened this issue Jul 28, 2023 · 2 comments

jphme commented Jul 28, 2023

First of all, many thanks for the release of Llama 2 7B 32K and for your valuable contributions!

The example scripts for fine-tuning are much appreciated; however, the part that interests me most, the continued pre-training mentioned in the blog post, is missing.

Would it be possible to provide that script as well? Many thanks in advance and all the best!

@lllyyyqqq

Hi, I am also interested in the pre-training part; a script would be ideal. Also, a quick question: what sequence length was used during the continued pre-training, 4k or 32k?
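
For reference while waiting on the official script: a generic continued pre-training run with a plain causal-LM objective might look roughly like the sketch below, using Hugging Face Transformers. The checkpoint name, the wikitext dataset, the 32k block size, and the hyperparameters are illustrative assumptions only, not the configuration actually used for Llama 2 7B 32K.

```python
# Minimal sketch of continued pre-training (next-token prediction) with
# Hugging Face Transformers. Checkpoint, dataset, block size, and
# hyperparameters below are assumptions for illustration, not the
# repository's actual training setup.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "togethercomputer/LLaMA-2-7B-32K"  # assumed checkpoint
BLOCK_SIZE = 32768  # assumed long-context length; 4096 would mirror the base model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Any plain-text corpus works; wikitext is only a stand-in here.
raw = load_dataset("wikitext", "wikitext-103-raw-v1", split="train")

def tokenize(examples):
    return tokenizer(examples["text"])

tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)

def group_texts(examples):
    # Concatenate all tokens, then split into fixed-length blocks so every
    # training example is exactly BLOCK_SIZE tokens long.
    concatenated = {k: sum(examples[k], []) for k in examples}
    total = (len(concatenated["input_ids"]) // BLOCK_SIZE) * BLOCK_SIZE
    return {
        k: [v[i : i + BLOCK_SIZE] for i in range(0, total, BLOCK_SIZE)]
        for k, v in concatenated.items()
    }

lm_dataset = tokenized.map(group_texts, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="continued-pretraining",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=2e-5,
        num_train_epochs=1,
        bf16=True,
    ),
    train_dataset=lm_dataset,
    # mlm=False makes the collator build causal-LM labels from input_ids.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The block size is the main knob relevant to the question above: set it to 4096 to continue at the base model's context length, or to 32768 for long-context training (which typically also requires gradient checkpointing and a memory-efficient attention implementation to fit in GPU memory).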

csris (Contributor) commented Aug 6, 2023

@zhangce @LorrinWWW, can you please comment?
