====== Large Language Model (LLM) ======

A neural network trained on large text corpora to model language. Modern LLMs use [[concepts:softmax_attention|transformer]] architectures with [[concepts:prenorm|PreNorm]] and [[concepts:residual_connections|residual connections]], and increasingly [[concepts:moe|MoE]] layers for efficiency. [[concepts:scaling_laws|Scaling laws]] govern their performance gains.

See also: [[concepts:softmax_attention]], [[concepts:moe]], [[concepts:scaling_laws]], [[concepts:prenorm]]
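The PreNorm-plus-residual pattern the entry refers to can be sketched in a few lines. This is a minimal, single-head, NumPy-only illustration, not any specific model's implementation; the weight names (`Wq`, `Wk`, `Wv`, `W1`, `W2`) are placeholders chosen for this sketch:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each token vector to zero mean, unit variance (no learned scale/shift here).
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def attention(x, Wq, Wk, Wv):
    # Single-head softmax attention over a sequence of token vectors.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def prenorm_block(x, Wq, Wk, Wv, W1, W2):
    # PreNorm: normalize *before* each sublayer, then add the residual.
    x = x + attention(layer_norm(x), Wq, Wk, Wv)
    x = x + np.maximum(layer_norm(x) @ W1, 0) @ W2  # ReLU MLP sublayer
    return x

# Example: 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
Wq, Wk, Wv = (0.1 * rng.standard_normal((8, 8)) for _ in range(3))
W1, W2 = 0.1 * rng.standard_normal((8, 16)), 0.1 * rng.standard_normal((16, 8))
y = prenorm_block(x, Wq, Wk, Wv, W1, W2)
```

Because each sublayer output is added back onto its input, the block preserves the input shape, which is what lets dozens of such blocks be stacked.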