Meta's goal wasn't simply to replicate GPT. It says that LLaMA is a “smaller, more performant model” than its peers, built to achieve the same feats of comprehension and articulation with a smaller compute footprint*, and so a correspondingly smaller environmental impact. The fact that it's cheaper to run doesn't hurt, either. But the company also sought to differentiate itself in another way, by making LLaMA “open”, implicitly pointing out that despite its branding, “OpenAI” is...
