SixFinger-8B Adapter for LLaMA 3.1 8B
This repository contains a LoRA adapter for the SixFinger-8B model.
The adapter adds fine-tuned response behavior on top of the base model unsloth/llama-3.1-8b-bnb-4bit without modifying the base weights.
Overview
- Base Model: unsloth/llama-3.1-8b-bnb-4bit
- Adapter Type: LoRA
- Quantization: 4-bit (via bitsandbytes)
- Purpose: Enhanced response generation for Turkish/English mixed datasets.
- Compatibility: Use with Hugging Face Transformers + PEFT library.
Installation
Install required dependencies:
```bash
pip install transformers accelerate bitsandbytes peft
```
Ensure you have a GPU with sufficient VRAM for 4-bit inference.
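As a rough rule of thumb, 4-bit inference for an 8B model needs on the order of 6–8 GB of VRAM. A quick sanity check before loading (the threshold is illustrative, not a hard requirement):

```python
import torch

# Confirm a CUDA GPU is visible and report its total memory
assert torch.cuda.is_available(), "bitsandbytes 4-bit inference requires a CUDA GPU"
total_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
print(f"{torch.cuda.get_device_name(0)}: {total_gb:.1f} GB VRAM")
```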
Loading the Model
- Load the Base Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/llama-3.1-8b-bnb-4bit",
    device_map="auto"
)
```

- Load the Adapter

```python
from peft import PeftModel

model = PeftModel.from_pretrained(
    base_model,
    "sixfingerdev/SixFinger-8B"
)
```

- Load the Tokenizer

```python
tokenizer = AutoTokenizer.from_pretrained("unsloth/llama-3.1-8b-bnb-4bit")
```
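The bnb-4bit checkpoint above ships with its quantization settings baked in. If you instead start from a full-precision Llama 3.1 base, a minimal sketch of quantizing it at load time with bitsandbytes (the meta-llama/Llama-3.1-8B repo ID and the NF4 settings here are assumptions, not part of this adapter's requirements):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit NF4 quantization config; these values mirror common bitsandbytes choices
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-8B",  # assumption: swap in your preferred base checkpoint
    quantization_config=bnb_config,
    device_map="auto",
)
```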
Example Usage
Generate text using the adapter:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch

# Base model
base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/llama-3.1-8b-bnb-4bit",
    device_map="auto"
)

# LoRA adapter
model = PeftModel.from_pretrained(base_model, "sixfingerdev/SixFinger-8B")

# Tokenizer
tokenizer = AutoTokenizer.from_pretrained("unsloth/llama-3.1-8b-bnb-4bit")

# Example text generation (the Turkish prompt asks "What is artificial intelligence?")
prompt = "Soru: Yapay zeka nedir?\nCevap:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
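For interactive use, you can stream tokens to stdout as they are generated instead of waiting for the full output. A minimal sketch using Transformers' TextStreamer, continuing from the example above (model and tokenizer already loaded; the prompt and sampling parameters are illustrative):

```python
import torch
from transformers import TextStreamer

# Prints tokens as they are generated, skipping the echoed prompt
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

inputs = tokenizer("Soru: LoRA nedir?\nCevap:", return_tensors="pt").to(model.device)
with torch.no_grad():
    model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7, streamer=streamer)
```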
Notes
- The adapter does not modify the base model; it only applies LoRA weights on top.
- 4-bit quantization significantly reduces VRAM usage. Ensure your GPU supports bitsandbytes 4-bit operations.
- You can merge the adapter into the base model for easier deployment if needed (see the sketch after this list).
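Merging is typically done against an unquantized copy of the base weights, since folding LoRA deltas into 4-bit tensors is lossy. A minimal sketch, assuming you have access to an fp16 base checkpoint (meta-llama/Llama-3.1-8B here is an assumption) and enough memory to hold it:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model in half precision (unquantized) so the merge is exact
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-8B",  # assumption: any compatible fp16 base checkpoint
    torch_dtype=torch.float16,
    device_map="auto",
)

# Apply the adapter, then fold the LoRA weights into the base layers
merged = PeftModel.from_pretrained(base, "sixfingerdev/SixFinger-8B").merge_and_unload()

# Save a standalone checkpoint that no longer needs PEFT at inference time
merged.save_pretrained("SixFinger-8B-merged")
AutoTokenizer.from_pretrained("unsloth/llama-3.1-8b-bnb-4bit").save_pretrained("SixFinger-8B-merged")
```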
License
The adapter and its usage are provided under the terms specified in the repository.
Ensure compliance with the base model's license (Meta's Llama 3.1 Community License).