From 41876aee89dd7ca51936fbaae5dc6734d8fcd73b Mon Sep 17 00:00:00 2001
From: WANG Yue <337111657@qq.com>
Date: Tue, 16 May 2023 23:16:23 +0800
Subject: [PATCH] add InstructCodeT5+ 16B model

---
 CodeT5+/README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/CodeT5+/README.md b/CodeT5+/README.md
index 3fce6a8..3238d3e 100644
--- a/CodeT5+/README.md
+++ b/CodeT5+/README.md
@@ -24,6 +24,7 @@ We release the following CodeT5+ models:
 
 * CodeT5+ `220M` and `770M` at Huggingface [here](https://huggingface.co/Salesforce/codet5p-220m) and [here](https://huggingface.co/Salesforce/codet5p-770m), respectively.
 * CodeT5+ `220M` and `770M` that are further tuned on Python subset at Huggingface [here](https://huggingface.co/Salesforce/codet5p-220m-py) and [here](https://huggingface.co/Salesforce/codet5p-770m-py), respectively.
+* InstructCodeT5+ `16B` at Huggingface [here](https://huggingface.co/Salesforce/instructcodet5p-16b)
 * CodeT5+ `2B`, `6B`, `16B` will be released soon.
 
 # How to Use?
@@ -53,4 +54,4 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
   journal={arXiv preprint},
   year={2023}
 }
-```
\ No newline at end of file
+```
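For context, a minimal usage sketch for the newly listed `Salesforce/instructcodet5p-16b` checkpoint, mirroring the style of the README's "How to Use?" example. This is not part of the patch: the `trust_remote_code` flag, fp16 loading, and the `decoder_input_ids` seeding are assumptions about how this encoder-decoder checkpoint is loaded, not confirmed by the diff above.

```python
# Sketch only: load the InstructCodeT5+ 16B checkpoint added in this patch.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/instructcodet5p-16b"  # URL added by this patch
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,   # 16B parameters; half precision to reduce memory
    low_cpu_mem_usage=True,
    trust_remote_code=True,      # assumption: custom CodeT5+ modeling code on the Hub
).to(device)

encoding = tokenizer("def print_hello_world():", return_tensors="pt").to(device)
# Assumption: seed the decoder with the prompt, as the seq2seq checkpoint expects.
encoding["decoder_input_ids"] = encoding["input_ids"].clone()
outputs = model.generate(**encoding, max_length=15)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```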