---
base_model:
- THUDM/GLM-4-32B-0414
library_name: transformers
tags:
- mergekit
- merge
---
# glm-e1

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method, with [THUDM/GLM-4-32B-0414](https://huggingface.co/THUDM/GLM-4-32B-0414) + /alloc/axolotl/data/32b-lora-out/checkpoint-718 as the base.

### Models Merged

No additional models were included beyond the base; the passthrough merge simply applies the LoRA adapter at /alloc/axolotl/data/32b-lora-out/checkpoint-718 onto the base model's weights.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: THUDM/GLM-4-32B-0414+/alloc/axolotl/data/32b-lora-out/checkpoint-718
dtype: bfloat16
merge_method: passthrough
models:
  - model: THUDM/GLM-4-32B-0414+/alloc/axolotl/data/32b-lora-out/checkpoint-718
```
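
### Usage

A minimal sketch of loading the merged model with Hugging Face `transformers`. The repo id `your-username/glm-e1` is a hypothetical placeholder, not part of this card; substitute the actual published repo or a local output directory. It also assumes a recent `transformers` release with native GLM-4 support.

```python
# Minimal usage sketch (assumptions: placeholder repo id, recent transformers with GLM-4 support).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/glm-e1"  # hypothetical repo id or local path to the merged weights

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config above
    device_map="auto",
)

prompt = "Explain what a passthrough merge does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```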