The Kaitchup – AI on a Budget
LoRA Adapters: When a Naive Merge Leads to Poor Performance

The case of LoRA adapters fine-tuned with QLoRA

Benjamin Marie
Sep 07, 2023

QLoRA is a memory-efficient way to fine-tune LLMs. It quantizes the LLM and then fine-tunes a LoRA adapter on top of it. I have used this method many times in my previous articles to fine-tune GPT-NeoX, Falcon, and Llama 2 models.
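
For context, here is a minimal sketch of a typical QLoRA setup with Hugging Face Transformers, bitsandbytes, and PEFT. The base model name, LoRA rank, and target modules below are placeholders, not the exact configuration used in my fine-tuning articles:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model_name = "meta-llama/Llama-2-7b-hf"  # placeholder base model

# Quantize the frozen base model to 4-bit NF4, as QLoRA does
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    quantization_config=bnb_config,
    device_map="auto",
)
base_model = prepare_model_for_kbit_training(base_model)

# Attach a LoRA adapter; only its weights will be trained
lora_config = LoraConfig(
    r=16,                                  # placeholder rank
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # placeholder target modules
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the adapter parameters are trainable
```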

Related: Combine Multiple LoRA Adapters for Llama 2 (November 27, 2023)

QLoRA saves only the fine-tuned adapter, not the entire model, since the base model's parameters are kept frozen.
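
For instance, continuing the sketch above, saving the PEFT model after training writes only the adapter files (the adapter weights and adapter_config.json); the output directory name is just an example:

```python
# Writes only the LoRA adapter, a small fraction of the full model's size
model.save_pretrained("./llama-2-7b-qlora-adapter")  # example output directory
```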

But then, what do we do with this adapter?

We have two options for using it (both are sketched in code after this list):

  • Load it on top of the base model every time we need it

  • Merge it with the base model to get a new model
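
In PEFT terms, the two options look roughly like this; note that this is the naive version that the next paragraph warns about (model and directory names are placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

adapter_dir = "./llama-2-7b-qlora-adapter"  # placeholder adapter directory

# Option 1: load the adapter on top of the base model at inference time
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_dir)

# Option 2: merge the adapter into the base model's weights and save a new model
merged_model = model.merge_and_unload()
merged_model.save_pretrained("./llama-2-7b-merged")
```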

With either option, we have to be careful. We can't just naively load the base model and then load the adapter on top of it: we have to load the base model and preprocess it the same way it was preprocessed during QLoRA fine-tuning; otherwise, we may see a significant performance drop. The same applies if we want to merge the adapter.
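
For the loading case, a minimal sketch, assuming "the same way" means quantizing the base model with the same 4-bit bitsandbytes configuration used during fine-tuning before attaching the adapter (all values below are placeholders that must match your own run):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Reuse the SAME quantization configuration as during QLoRA fine-tuning
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",   # placeholder: same base model as during fine-tuning
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "./llama-2-7b-qlora-adapter")
```

How to merge the adapter without this kind of performance drop is what the rest of the article explains and benchmarks.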

In this article, I show you how to use the fine-tuned adapter. We will see that merging an adapter fine-tuned with QLoRA is not trivial, but there is a method that avoids the performance drop after merging; I will explain and benchmark it. All the code to reproduce my experiments, including the optimal merging strategy, is available on the notebook page:

Get the notebook (#14)
