Long Sequence Modeling with XGen: A 7B LLM Trained on 8K Input Sequence Length
28 Jun 2023 • #llm

TLDR: We trained a series of 7B LLMs named XGen-7B with standard dense attention on sequence lengths of up to 8K, for up to 1.5T tokens. We also fine-tuned the models on public-domain instructional data. The main take-aways are:

* On standard NLP benchmarks, XGen achieves comparable or better results than state-of-the-art open-source LLMs of similar model size.
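Because the models use standard dense attention, they can be loaded and run through the stock Hugging Face `transformers` API, with no special long-context machinery. Below is a minimal sketch; the checkpoint name `Salesforce/xgen-7b-8k-base` and the `trust_remote_code=True` requirement for the custom tokenizer reflect the public release as we understand it, and are assumptions rather than details stated in this excerpt.

```python
# Minimal sketch: loading an XGen-7B checkpoint via Hugging Face transformers.
# Assumes the base model is published as "Salesforce/xgen-7b-8k-base"; the
# XGen tokenizer is custom, so trust_remote_code=True is required to load it.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(
    "Salesforce/xgen-7b-8k-base", trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    "Salesforce/xgen-7b-8k-base", torch_dtype=torch.bfloat16
)

# Standard dense attention means no special long-context API is needed:
# prompts up to the 8K training length are handled like any other input.
inputs = tokenizer("The world is", return_tensors="pt")
sample = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(sample[0], skip_special_tokens=True))
```

The same pattern should apply to the instruction-tuned variant by swapping in its checkpoint name; only the model identifier changes, since the architecture is identical.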