Meta-in-context learning in large language models

Coda-Forno, J., Binz, M., Akata, Z., Botvinick, M., Wang, J., & Schulz, E. (2023). Meta-in-context learning in large language models. In Advances in Neural Information Processing Systems 36: 37th Conference on Neural Information Processing Systems (NeurIPS 2023).

Basic

Genre: Conference Paper

Locators

Description: -
OA-Status: Not specified

Creators

Creators:
Coda-Forno, J.¹, Author
Binz, M.¹, Author
Akata, Z., Author
Botvinick, M., Author
Wang, J. X., Author
Schulz, E.¹, Author
Affiliations:
¹ Research Group Computational Principles of Intelligence, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_3189356

Content

Free keywords: -
Abstract: Large language models have shown tremendous performance in a variety of tasks. In-context learning -- the ability to improve at a task after being provided with a number of demonstrations -- is seen as one of the main contributors to their success. In the present paper, we demonstrate that the in-context learning abilities of large language models can be recursively improved via in-context learning itself. We coin this phenomenon meta-in-context learning. Looking at two idealized domains, a one-dimensional regression task and a two-armed bandit task, we show that meta-in-context learning adaptively reshapes a large language model's priors over expected tasks. Furthermore, we find that meta-in-context learning modifies the in-context learning strategies of such models. Finally, we broaden the scope of our investigation to encompass two diverse benchmarks: one focusing on real-world regression problems and the other encompassing multiple NLP tasks. In both cases, we observe competitive performance comparable to that of traditional learning algorithms. Taken together, our work improves our understanding of in-context learning and paves the way toward adapting large language models to the environment they are applied in purely through meta-in-context learning rather than traditional finetuning.
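
To make the idea concrete, here is a minimal sketch of how a meta-in-context prompt for the one-dimensional regression setting could be assembled: several fully observed example tasks are concatenated ahead of the current, partially observed task, so that in-context learning on the final task is conditioned on the earlier ones. The prompt format, function names, and task parameters below are illustrative assumptions, not the authors' exact protocol.

import random

def make_regression_task(slope, intercept, n_points=5, seed=0):
    # Illustrative task generator: noiseless points from y = slope * x + intercept.
    rng = random.Random(seed)
    xs = [round(rng.uniform(0, 10), 1) for _ in range(n_points)]
    return [(x, round(slope * x + intercept, 1)) for x in xs]

def format_task(pairs, query_x=None):
    # Render one task as "x = ..., y = ..." lines; leave y open for the query point.
    lines = [f"x = {x}, y = {y}" for x, y in pairs]
    if query_x is not None:
        lines.append(f"x = {query_x}, y =")
    return "\n".join(lines)

def meta_in_context_prompt(demo_tasks, current_task, query_x):
    # Concatenate completed tasks before the current one; the earlier tasks are
    # what lets the model reshape its prior over tasks purely in-context.
    blocks = [f"Task {i + 1}:\n{format_task(t)}" for i, t in enumerate(demo_tasks)]
    blocks.append(f"Task {len(demo_tasks) + 1}:\n{format_task(current_task, query_x)}")
    return "\n\n".join(blocks)

# Three demonstration tasks drawn from the same task family, then a new task.
demos = [make_regression_task(slope=2.0, intercept=float(b), seed=b) for b in range(3)]
current = make_regression_task(slope=2.0, intercept=1.5, n_points=3, seed=7)
print(meta_in_context_prompt(demos, current, query_x=4.5))
# The resulting string would be sent to the language model, whose completion
# after the final "y =" serves as its prediction for the query point.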

Details

Language(s): -
Dates: 2023-11
Publication Status: Published online
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: -
Degree: -

Event

Title: Thirty-seventh Conference on Neural Information Processing Systems (NeurIPS 2023)
Place of Event: New Orleans, LA, USA
Start / End Date: 2023-12-10 - 2023-12-16

Source 1

Title: Advances in Neural Information Processing Systems 36: 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: -