
[Feature]: supporting OpenRouter include_reasoning parameter #8130

Open
jamesbraza opened this issue Jan 31, 2025 · 2 comments · May be fixed by #8184
Assignees: krrishdholakia
Labels: enhancement (New feature or request)

Comments

@jamesbraza (Contributor)

The Feature

https://openrouter.ai/docs/parameters#include-reasoning

OpenRouter's opt-in include_reasoning flag lets one get reasoning traces out of a reasoning model hosted on OpenRouter, for example DeepSeek R1.

Motivation, pitch

Being able to get the "reasoning" trace when using OpenRouter.

Are you a ML Ops Team?

No

Twitter / LinkedIn details

No response

jamesbraza added the enhancement (New feature or request) label on Jan 31, 2025
@jamesbraza (Contributor, Author)

I got it to work with openai==1.60.2 via:

import os

from openai import OpenAI

client = OpenAI(
    base_url=os.getenv("OPENROUTER_API_URL", "https://openrouter.ai/api/v1"),
    api_key=os.environ["OPENROUTER_API_KEY"],
)
response = client.chat.completions.create(
    model="deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "What is 1+1?"}],
    # include_reasoning is OpenRouter-specific, so it goes in extra_body
    extra_body={"include_reasoning": True},
)
reply = response.choices[0].message

Or via curl:

curl -X POST https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-r1",
    "include_reasoning": true,
    "messages": [
      {"role": "user", "content": "What is 1+1?"}
    ]
  }'
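
In either case, the reasoning trace comes back on the assistant message itself. With the openai client snippet above, reading it looks roughly like this (assuming OpenRouter returns the trace as an extra reasoning field on the message and the client preserves unknown response fields):

# The trace sits alongside the normal content on the message; getattr guards
# against client versions that drop fields they don't recognize.
print(reply.content)
print(getattr(reply, "reasoning", None))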

I could not get this to work with litellm.completion, though.
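
For reference, the attempt looked roughly like this (a hypothetical reconstruction, assuming litellm's openrouter/ model prefix and relying on provider-specific kwargs being forwarded):

import os

import litellm

# Hypothetical reconstruction of the failing call: include_reasoning passed
# as a plain kwarg, hoping litellm forwards it to OpenRouter as-is.
response = litellm.completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "What is 1+1?"}],
    api_key=os.environ["OPENROUTER_API_KEY"],
    include_reasoning=True,
)
print(response.choices[0].message)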

krrishdholakia self-assigned this on Feb 1, 2025
@krrishdholakia (Contributor)

This seems like a bug in passing provider-specific params, which litellm does support.

@jamesbraza does it work via completion if you use extra_body?
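
For concreteness, the suggestion is something like this untested sketch (assuming litellm forwards extra_body through to the OpenAI-compatible OpenRouter endpoint):

import os

import litellm

# Untested sketch: same request, but with the OpenRouter-specific flag
# tucked into extra_body instead of passed as a top-level kwarg.
response = litellm.completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "What is 1+1?"}],
    api_key=os.environ["OPENROUTER_API_KEY"],
    extra_body={"include_reasoning": True},
)
print(response.choices[0].message)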
