[hotfix] fix variable type for top_p (hpcaitech#5313)
Co-authored-by: binmakeswell <[email protected]>
CZYCW and binmakeswell authored Feb 19, 2024
1 parent 705a62a commit b833153
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion applications/Colossal-LLaMA-2/inference_example.py
@@ -68,7 +68,7 @@ def generate(args):
     parser.add_argument("--do_sample", type=bool, default=True, help="Set whether or not to use sampling")
     parser.add_argument("--temperature", type=float, default=0.3, help="Set temperature value")
     parser.add_argument("--top_k", type=int, default=50, help="Set top_k value for top-k-filtering")
-    parser.add_argument("--top_p", type=int, default=0.95, help="Set top_p value for generation")
+    parser.add_argument("--top_p", type=float, default=0.95, help="Set top_p value for generation")
     parser.add_argument("--input_txt", type=str, default="明月松间照,", help="The prompt input to the model")
     parser.add_argument("--prompt_style", choices=["sft", "pretrained"], default="sft", help="The style of the prompt")
     args = parser.parse_args()
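The bug fixed here is easy to reproduce in isolation: `top_p` is a nucleus-sampling probability in (0, 1], so declaring it with `type=int` makes argparse call `int("0.95")` on the command-line string, which fails and aborts the program. A minimal standalone sketch (using a toy parser, not the repository's full CLI) shows the failure mode and the fix:

```python
import argparse

# Buggy declaration, as in the pre-patch file: argparse applies int() to the raw string.
buggy = argparse.ArgumentParser()
buggy.add_argument("--top_p", type=int, default=0.95)
try:
    buggy.parse_args(["--top_p", "0.95"])
except SystemExit:
    # int("0.95") raises ValueError; argparse reports the error and exits.
    print("type=int rejects fractional top_p values")

# Patched declaration: float() parses fractional probabilities correctly.
fixed = argparse.ArgumentParser()
fixed.add_argument("--top_p", type=float, default=0.95,
                   help="Set top_p value for generation")
args = fixed.parse_args(["--top_p", "0.9"])
print(args.top_p)  # 0.9
```

Note that the bug is silent when `--top_p` is omitted: argparse does not coerce defaults, so `default=0.95` stayed a float and only explicit command-line use triggered the error.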
