Dataset used for fine-tuning: google/code_x_glue_ct_code_to_text
CodeT5-small model, fine-tuned on the code summarization subtask of CodeXGLUE (Ruby programming language). This model can generate a docstring for a given function written in Ruby.
The notebook that I used to fine-tune CodeT5 can be found here.
Here's how to use this model:
from transformers import RobertaTokenizer, T5ForConditionalGeneration
model_name = "nielsr/codet5-small-code-summarization-ruby"
tokenizer = RobertaTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
code = """
def update_with_file_contents(digest, filename)
  File.open(filename) do |io|
    while (chunk = io.read(1024 * 8))
      digest.update(chunk)
    end
  end
end
"""
input_ids = tokenizer(code, return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# Update the digest with the contents of the given file
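For readers unfamiliar with Ruby, the function in the example reads a file in 8 KB chunks and feeds each chunk to a digest object. A Python sketch of the same logic using the standard-library hashlib module (the function name simply mirrors the Ruby example):

```python
import hashlib

def update_with_file_contents(digest, filename):
    """Update a hashlib digest with a file's contents, read in 8 KB chunks."""
    with open(filename, "rb") as io:
        while chunk := io.read(1024 * 8):
            digest.update(chunk)

# Usage: hash a file without loading it fully into memory
# digest = hashlib.sha256()
# update_with_file_contents(digest, "some_file.bin")
# print(digest.hexdigest())
```

Reading in fixed-size chunks keeps memory use constant regardless of file size, which is exactly the behavior the generated docstring summarizes.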