# payelb/UltraFeedback_openbmb_TinyLlama-1.1B_aligned_with_WoN_deberta_RM
- Base model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- Alignment dataset: openbmb/UltraFeedback
- Reward model: payelb/UltraFeedback_openbmb_reward-model-deberta-v3-base_1k_fixed_WoN
- Method: PPO alignment with LoRA adapters
Notes:
- Reward normalization and clipping enabled
- KL control enabled
- pad_token_id/eos_token_id explicitly set
- DeBERTa RM loaded on a single device (no device_map='auto')
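As a rough illustration of the first two notes, here is a minimal, hypothetical sketch of reward normalization with clipping (a running mean/std via Welford's algorithm) and a KL penalty against a frozen reference policy. All names, constants, and the exact shaping formula are assumptions for illustration, not taken from the actual training configuration.

```python
import numpy as np

class RewardNormalizer:
    """Running mean/std normalizer with clipping (illustrative, not the actual trainer code)."""

    def __init__(self, clip=3.0, eps=1e-8):
        self.clip = clip
        self.eps = eps
        self.count = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's online algorithm)

    def update(self, rewards):
        # Fold a batch of raw reward-model scores into the running statistics.
        for v in rewards:
            self.count += 1
            delta = v - self.mean
            self.mean += delta / self.count
            self.m2 += delta * (v - self.mean)

    def normalize(self, rewards):
        # Standardize, then clip to [-clip, clip] to limit reward outliers.
        std = (self.m2 / max(self.count - 1, 1)) ** 0.5
        z = (np.asarray(rewards, dtype=float) - self.mean) / (std + self.eps)
        return np.clip(z, -self.clip, self.clip)

def kl_penalized_reward(rm_score, logp_policy, logp_ref, kl_coef=0.1):
    """Subtract a KL estimate (policy vs. frozen reference) from the RM score.

    Uses the summed per-token log-ratio as a simple KL estimator; kl_coef is
    a hypothetical coefficient standing in for the run's KL controller.
    """
    kl = float(np.sum(logp_policy - logp_ref))
    return rm_score - kl_coef * kl

# Example: normalize a batch of raw RM scores, then apply the KL penalty.
norm = RewardNormalizer(clip=3.0)
raw_scores = [1.2, -0.4, 2.7, 0.3]
norm.update(raw_scores)
shaped = [
    kl_penalized_reward(r, np.array([-1.0, -2.0]), np.array([-1.1, -1.9]))
    for r in norm.normalize(raw_scores)
]
```

In the actual run the KL coefficient would typically be managed by an adaptive controller rather than a fixed constant, as the "KL control enabled" note suggests.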