GPT4All - A 7B Parameter Model

GPT4All - a 7B parameter model (based on LLaMA) trained on a massive collection of clean assistant data including code, stories, and dialogue.

This preliminary technical report describes the development of GPT4All, a chatbot trained over a massive curated corpus of assistant interactions including word problems, story descriptions, multi-turn dialogue, and code.

The collected data, data curation procedure, training code, and final model weights are released to promote open research and reproducibility. Additionally, the release of quantized 4-bit versions of the model allows virtually anyone to run the model on a CPU.
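The 4-bit quantization mentioned above works by storing each weight in 4 bits instead of 16 or 32, trading a small amount of precision for a roughly 4-8x reduction in memory, which is what makes CPU inference practical. As a rough illustration (not the exact scheme GPT4All uses), here is a minimal absmax-style 4-bit quantization sketch in Python:

```python
import numpy as np

def quantize_4bit(weights):
    """Absmax quantization: map floats to signed 4-bit ints in [-7, 7]."""
    scale = np.abs(weights).max() / 7.0
    q = np.clip(np.round(weights / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize_4bit(q, scale):
    """Recover approximate float weights from 4-bit codes."""
    return q.astype(np.float32) * scale

# Example: quantize a small weight vector and measure the round-trip error.
w = np.array([0.12, -0.5, 0.33, 0.9], dtype=np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s)
max_err = np.abs(w - w_hat).max()  # bounded by scale / 2
```

Real schemes (e.g. those used by llama.cpp-style runtimes) quantize weights in small blocks, each with its own scale, which keeps the error low even when weight magnitudes vary across a tensor.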

Categories: Computer Science, Machine Learning

