simons.berkeley.edu
afiodorov.github.io
I am tired of Americanism: consuming in the Anglosphere sometimes makes me feel like all movies are the same. America ...
ssc.io
Data integration and cleaning have long been a key focus of the data management community. Recent research indicates the potential of large language models (LLMs) for such tasks. However, scaling and automating data wrangling with LLMs for real-world use cases poses additional challenges. Manual prompt engineering, for example, is expensive and hard to operationalise, while full fine-tuning of LLMs incurs high compute and storage costs. Following up on previous work, we evaluate parameter-efficient fine-tuning (PEFT) methods for efficiently automating data wrangling with LLMs. We conduct a study of four popular PEFT methods on differently sized LLMs for ten benchmark tasks, where we find that PEFT methods achieve performance on par with full fine-tuning, and ...
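For a rough sense of what parameter-efficient fine-tuning looks like in practice, here is a minimal sketch assuming the Hugging Face transformers and peft libraries with LoRA; the model name and the entity-matching prompt format are illustrative, not taken from the linked paper.

```python
# Minimal sketch: LoRA-style parameter-efficient fine-tuning of a causal LM
# for a data wrangling task phrased as text-to-text (here, entity matching).
# Model name and prompt format are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "gpt2"  # placeholder; the study covers differently sized LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA inserts small low-rank adapter matrices instead of updating all weights,
# so only a small fraction of parameters is trained and stored per task.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                        # rank of the adapter matrices
    lora_alpha=16,              # scaling factor for the adapters
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's attention projection layer
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% trainable

# Example wrangling prompt formatted as plain text:
prompt = (
    "Product A: 'iPhone 13 128GB blue'\n"
    "Product B: 'Apple iPhone 13, 128 GB, Blue'\n"
    "Do these records refer to the same product? Answer: "
)
inputs = tokenizer(prompt, return_tensors="pt")
# From here, a standard training loop updates only the adapter weights.
```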
pshapira.net
Using a generative language model (GPT-4) to produce labels and rationales for large-scale text analysis of Public Value Expressions in AI patent documents.
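The general pattern here is prompting a chat model to return a label plus a short rationale for each document. A minimal sketch, assuming the openai Python client; the label set, prompt wording, and JSON fields are illustrative, not taken from the linked post.

```python
# Minimal sketch: ask a chat model for a label plus a one-sentence rationale
# per document. Label set, prompt wording and JSON fields are illustrative.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def label_with_rationale(text: str) -> dict:
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": (
                    "You label patent text for public value expressions. "
                    "Reply as JSON with keys 'label' (one of: 'public_value', "
                    "'none') and 'rationale' (one short sentence explaining "
                    "the label)."
                ),
            },
            {"role": "user", "content": text},
        ],
        temperature=0,
    )
    return json.loads(response.choices[0].message.content)

# Single snippet for illustration; in practice this runs over a large corpus.
print(label_with_rationale(
    "The invention reduces energy consumption in municipal water treatment."
))
```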
www.dotnet-tv.com
[AI summary] The post discusses the Nemerle programming language, highlighting its features and capabilities as a .NET platform language.