
Add variable llm-prompt-default-max-tokens #81

Triggered via pull request on September 7, 2024, 03:40
Status: Success
Total duration: 1m 1s

Workflow: ci.yaml
on: pull_request
Matrix: test
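
The run executes ci.yaml on the pull_request trigger, with the test job expanded from a matrix. The workflow body itself is not shown on this page; the sketch below is a minimal, hypothetical ci.yaml assuming the test job installs Emacs and runs the package's test suite. The matrix dimension, Emacs versions, action choices, and the `make test` target are assumptions, not taken from the actual workflow.

```yaml
# Hypothetical sketch only: the real ci.yaml behind this run is not shown on this page.
name: ci

on: pull_request

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Assumed matrix dimension; the actual workflow may vary other values.
        emacs_version: ["28.2", "29.1"]
    steps:
      - uses: actions/checkout@v4
      - uses: purcell/setup-emacs@master
        with:
          version: ${{ matrix.emacs_version }}
      - name: Run tests
        # Assumed test command; substitute the project's actual test entry point.
        run: make test
```

A matrix keyed on Emacs versions is the usual reason a "Matrix: test" group appears on a run page: GitHub Actions expands the single test job into one run per matrix value and reports them together under the matrix name.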