Starling-LM 7B

Basic Information

Organization
Berkeley
Parameters
7B
License
Apache 2.0
Last Update
2023-11-05
Links

Benchmark Scores

Detailed Information

Description

Research-focused chat model trained by Berkeley researchers with reinforcement learning from AI feedback (RLAIF), with strong performance across a broad range of tasks

Features

  • Research optimization
  • Task versatility
  • Efficient architecture
  • Academic focus
  • Easy fine-tuning (see the sketch after this list)
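
A minimal LoRA fine-tuning sketch for the "Easy fine-tuning" point above. The Hugging Face repo ID berkeley-nest/Starling-LM-7B-alpha, the train.txt data file, and all hyperparameters are assumptions for illustration, not details taken from this card; the sketch uses the transformers Trainer together with peft low-rank adapters.

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "berkeley-nest/Starling-LM-7B-alpha"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for padded batches
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Wrap the base model with low-rank adapters so only a small set of
# additional weights is trained.
lora = LoraConfig(r=16, lora_alpha=32,
                  target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Placeholder dataset: any plain-text corpus, tokenized well below the
# model's 8,000-token context window.
dataset = load_dataset("text", data_files={"train": "train.txt"})["train"]
dataset = dataset.map(
    lambda x: tokenizer(x["text"], truncation=True, max_length=2048),
    remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="starling-lora",
                           per_device_train_batch_size=1,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

Because only the adapter weights are updated, the full 7B-parameter base model stays frozen, which keeps memory requirements modest.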

Use Cases

  • Research projects
  • Academic applications
  • Model development
  • Educational tools
  • Task evaluation

Architecture Details

Type
Research-optimized Transformer
Context Window
8,000 tokens
Training Data Size
1.0 trillion tokens
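
A minimal inference sketch tying the specification above together. The Hugging Face repo ID berkeley-nest/Starling-LM-7B-alpha is an assumption rather than a detail from this card, and the example assumes the tokenizer ships a chat template; the prompt and generation budget are placeholders chosen to stay well inside the 8,000-token context window.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "berkeley-nest/Starling-LM-7B-alpha"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Format a single-turn prompt with the tokenizer's chat template.
messages = [{"role": "user",
             "content": "Summarize the trade-offs of 7B-parameter models."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True,
                                       return_tensors="pt").to(model.device)

# Prompt length plus max_new_tokens should stay under the 8,000-token limit.
outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))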