DeepSeek R1 encountered a message truncation issue


In the plane quadrilateral ABCD, AB = AC = CD = 1, ∠ADC = 30°, ∠DAB = 120°. Fold triangle ACD along AC to triangle ACP, where P is a moving point. Find the minimum value of the cosine of the dihedral angle A - CP - B.

1 Like

@HarmonyOSam

At present the maximum context length is 4k, and the reasoning output counts toward that limit. Unfortunately, the UI currently just trims the output off at that point. An enhancement for a better return message in these cases has been filed, and larger context lengths are coming soon. I apologize for any inconvenience this may have caused.

-Coby

@HarmonyOSam I was wrong in my first statement: the GUI playground sets a max_tokens value of 3200.

{
  "body": {
    "messages": [
      {
        "role": "system",
        "content": "keep your reasoning short"
      },
      {
        "role": "user",
        "content": "In the plane quadrilateral ABCD, AB = AC = CD = 1, ∠ADC = 30°, ∠DAB = 120°. Fold triangle ACD along AC to triangle ACP, where P is a moving point. Find the minimum value of the cosine of the dihedral angle A - CP - B."
      }
    ],
    "max_tokens": 3200,
    "stop": [
      "<|eot_id|>"
    ],
    "stream": true,
    "stream_options": {
      "include_usage": true
    },
    "model": "DeepSeek-R1"
  },
  "env_type": "text",
  "fingerprint": "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
}

And if you look at the return in the browser's inspector, you will see that the stop reason is `length`.
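If you'd rather check programmatically than in the browser, here is a minimal sketch of parsing a streamed chunk and testing its stop reason. The chunk below is an illustrative example assuming the OpenAI-style SSE `data:` payload shape; your actual chunks may carry extra fields.

```python
import json

# Illustrative final SSE data chunk (structure assumed from the
# OpenAI-compatible streaming format; not a captured response).
chunk = '{"choices":[{"delta":{},"finish_reason":"length","index":0}],"model":"DeepSeek-R1"}'

parsed = json.loads(chunk)
# finish_reason == "length" means the output was cut off by max_tokens,
# not by a natural stop sequence.
truncated = parsed["choices"][0].get("finish_reason") == "length"
```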

Try the same call with the API, without max_tokens set, and you will get the full 4k. (Larger context lengths are pending.)
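As a sketch, the same request without `max_tokens` might look like this; the endpoint URL and API key are placeholders you would substitute for your deployment, not real values.

```python
import json
import urllib.request

# Placeholder endpoint and key -- substitute your deployment's values.
BASE_URL = "https://your-endpoint.example/v1/chat/completions"
API_KEY = "xxxxxxxxxxxxxxxxxxxxxxxxxxx"

payload = {
    "model": "DeepSeek-R1",
    "messages": [
        {"role": "system", "content": "keep your reasoning short"},
        {"role": "user", "content": "your prompt here"},
    ],
    # Note: no "max_tokens" key, so the server's default applies
    # and the full 4k context is available to the response.
    "stream": True,
    "stream_options": {"include_usage": True},
}

def build_request() -> urllib.request.Request:
    """Assemble the POST request; call urllib.request.urlopen() on it to send."""
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

# resp = urllib.request.urlopen(build_request())  # uncomment to actually send
```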

-Coby