State of the Universe

Repurposing Large Language Models for Cosmology

by Dr Daniel Schiller (Heidelberg University)

A304 and on Zoom


Description

Foundation models have proven very successful on linguistic tasks, so there is a natural desire to develop foundation models for physics data. Currently, existing networks for physics are much smaller than publicly available Large Language Models (LLMs), which typically have billions of parameters. By applying pretrained LLMs in an unconventional way, we introduce large networks for cosmological data at a relatively low training cost.
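
The abstract does not spell out the method, but one common way to repurpose a pretrained LLM for non-text data, and to keep training cheap, is to freeze the LLM backbone and train only small input/output projections around it. The following is a minimal sketch of that idea under stated assumptions: GPT-2 as the backbone, a per-timestep feature projection, and a regression head are all illustrative choices, not the speaker's actual setup.

```python
import torch
import torch.nn as nn
from transformers import GPT2Model


class FrozenLLMRegressor(nn.Module):
    """Hypothetical sketch: wrap a frozen pretrained GPT-2 backbone with
    small trainable projections so it can process non-text sequences
    (e.g. binned cosmological summary statistics). Only the two
    projection layers are trained, which keeps the training cost low."""

    def __init__(self, n_features: int, n_outputs: int):
        super().__init__()
        self.backbone = GPT2Model.from_pretrained("gpt2")  # ~124M parameters
        for p in self.backbone.parameters():               # freeze the LLM
            p.requires_grad = False
        d = self.backbone.config.hidden_size               # 768 for gpt2
        self.embed = nn.Linear(n_features, d)  # trainable input projection
        self.head = nn.Linear(d, n_outputs)    # trainable output head

    def forward(self, x):
        # x: (batch, seq_len, n_features); bypass the token embeddings
        # by feeding projected features directly as inputs_embeds.
        h = self.backbone(inputs_embeds=self.embed(x)).last_hidden_state
        return self.head(h[:, -1])             # predict from last position


model = FrozenLLMRegressor(n_features=1, n_outputs=2)
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")    # only the two projections
```

In a sketch like this, the billions (or, for GPT-2, millions) of pretrained weights act as a fixed feature extractor, so the optimizer only ever touches the small projections.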