Perplexity API JSON Mode Returns Markdown: Fix

When you use Perplexity API JSON mode, you expect structured JSON output. Instead, the API sometimes returns markdown text wrapped in triple backticks, which breaks your application logic and forces extra parsing. The root cause is usually a model that treats the response_format parameter as a suggestion rather than a strict rule, sometimes compounded by using the wrong endpoint. This article explains why JSON mode returns markdown and provides a step-by-step fix to get clean JSON output.

Key Takeaways: Fix Perplexity API JSON Mode Returning Markdown

  • API Endpoint /chat/completions with response_format: Forces the model to output raw JSON without markdown wrappers.
  • Model sonar-pro or sonar-medium-online: These models support JSON mode correctly; older models may not.
  • Explicit system message instruction: Add “Output only valid JSON, no markdown formatting” to the system prompt.


Why Perplexity API JSON Mode Returns Markdown

Perplexity API offers a JSON mode that should return structured JSON. When you set response_format: { "type": "json_object" }, the API is supposed to enforce JSON output. However, the underlying language model may still generate markdown code fences around the JSON. This typically happens with models not optimized for JSON mode or when the system prompt lacks explicit JSON-only instructions.

The technical cause is that the model treats the JSON mode as a suggestion rather than a strict rule. Some older models like sonar-small-online or sonar-medium-chat do not fully comply with the response_format parameter. The model may wrap JSON in triple backticks because it learned to do so from training data. The fix requires using a compatible model and reinforcing the JSON-only behavior through the system prompt.
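Putting the compatible model and the response_format parameter together, a request body can be sketched as follows. The model name and parameter shape are the ones described in this article; verify both against the current Perplexity API reference before relying on them.

```python
import json

# A minimal /chat/completions request body for JSON mode.
# "sonar-pro" and the response_format shape follow this article's
# description; confirm both against the current Perplexity API docs.
payload = {
    "model": "sonar-pro",
    "messages": [
        {
            "role": "system",
            "content": "You are a JSON-only assistant. Output only valid JSON. "
                       "Do not use markdown code fences or any other formatting.",
        },
        {
            "role": "user",
            "content": "Return a JSON object with keys 'name', 'age', 'city'. "
                       "No other text.",
        },
    ],
    # Tells the API to enforce a JSON object response.
    "response_format": {"type": "json_object"},
}

body = json.dumps(payload)  # serialize for the HTTP request
```

Sending this body to /chat/completions with your Authorization header is all the steps below amount to; the rest is defensive parsing on your side.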

Steps to Force Clean JSON Output

  1. Use a compatible model
    Set the model parameter to sonar-pro or sonar-medium-online. These models have been tested to return raw JSON when response_format is set. Avoid sonar-small-online and sonar-medium-chat for JSON mode.
  2. Set the response_format parameter
    Include "response_format": { "type": "json_object" } in your API request body. This tells the API to expect a JSON object and instructs the model accordingly.
  3. Add a system message with strict JSON instructions
    Include a system message that says: "You are a JSON-only assistant. Output only valid JSON. Do not use markdown code fences or any other formatting." This reinforces the behavior.
  4. Send a user message that requests JSON
    In the user message, explicitly ask for JSON. Example: "Return the following data as a JSON object with keys 'name', 'age', 'city'. No other text."
  5. Parse the response in your code
    Even with all precautions, some models may still return markdown. In your application code, strip the opening fence (three backticks, optionally followed by the word json) and the closing fence before parsing. A regex like /```(?:json)?\n?/g removes both.
  6. Test with a minimal request
    Send a simple request to verify clean JSON. Example curl command:
    curl -X POST https://api.perplexity.ai/chat/completions \
      -H "Authorization: Bearer YOUR_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{"model":"sonar-pro","messages":[{"role":"system","content":"Output only valid JSON, no markdown"},{"role":"user","content":"Return a JSON object with key \"test\" and value \"success\""}],"response_format":{"type":"json_object"}}'
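Step 5 above can be implemented as a small defensive parser. This is a sketch: the helper name is illustrative, and the regex is the fence pattern described in the steps.

```python
import json
import re

# Remove markdown code fences that some models wrap around JSON output.
# Matches ``` optionally followed by "json", plus a trailing newline,
# so it handles both ```json ... ``` and bare ``` ... ``` fences.
FENCE_RE = re.compile(r"```(?:json)?\n?")

def parse_json_response(text: str) -> dict:
    """Strip markdown fences, then parse the remaining text as JSON."""
    cleaned = FENCE_RE.sub("", text).strip()
    return json.loads(cleaned)

# Works whether or not the model added fences:
parse_json_response('```json\n{"test": "success"}\n```')  # {'test': 'success'}
parse_json_response('{"test": "success"}')                # {'test': 'success'}
```

Because the regex is a no-op on clean JSON, the same helper can sit in front of every JSON-mode call without a compatibility check.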


If Perplexity API Still Returns Markdown After the Fix

“The API returns markdown even with sonar-pro”

If you still see triple backticks, check your system message. Some models ignore a single instruction. Add a second system message or include the instruction in the user message as well. Also verify that your API request uses the correct endpoint /chat/completions, not the legacy /completions endpoint.
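One way to double up the instruction, as suggested above, is to repeat it in both the system and user messages. A sketch, reusing this article's suggested wording:

```python
# Reinforce the JSON-only instruction in both message roles, for models
# that ignore a single system message.
json_only = "Output only valid JSON. Do not use markdown code fences."

messages = [
    {"role": "system", "content": "You are a JSON-only assistant. " + json_only},
    {
        "role": "user",
        "content": "Return the data as a JSON object with keys "
                   "'name', 'age', 'city'. " + json_only,
    },
]
```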

“JSON mode is not working for streaming responses”

Streaming responses may return markdown even when non-streaming requests work. For streaming, the model may output partial tokens that include backticks. Consider using non-streaming requests for JSON mode. If you must stream, accumulate the full response and then strip markdown in your code.
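The accumulate-then-strip approach for streaming can be sketched as follows. Here stream_chunks is a placeholder for whatever iterator of text deltas your streaming client yields.

```python
import json
import re

FENCE_RE = re.compile(r"```(?:json)?\n?")

def collect_streamed_json(stream_chunks):
    """Accumulate streamed text deltas, then strip fences and parse once.

    `stream_chunks` is any iterable of text fragments (a stand-in for
    the deltas a streaming client yields).
    """
    full_text = "".join(stream_chunks)             # wait for the full response
    cleaned = FENCE_RE.sub("", full_text).strip()  # strip fences only at the end
    return json.loads(cleaned)

# Partial tokens can split the backticks across chunks, so stripping
# per-chunk would miss them; stripping the joined text does not:
collect_streamed_json(['``', '`json\n{"ok":', ' true}\n`', '``'])  # {'ok': True}
```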

“The API returns an error when I set response_format”

Some models do not support the response_format parameter. Check the Perplexity API documentation for the list of compatible models. As of 2025, sonar-pro and sonar-medium-online support it. Older models like sonar-small-online and sonar-medium-chat will return an error. Switch to a supported model.
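A defensive pattern for this error case is to attach response_format only for models known to accept it, and rely on prompt instructions plus fence stripping for the rest. The model set below mirrors this article's table and is an assumption that may change over time.

```python
# Only models known to accept response_format get the strict JSON flag;
# the set mirrors the article's compatibility table (an assumption).
JSON_MODE_MODELS = {"sonar-pro", "sonar-medium-online"}

def build_payload(model: str, messages: list) -> dict:
    """Build a /chat/completions body, adding response_format only when supported."""
    payload = {"model": model, "messages": messages}
    if model in JSON_MODE_MODELS:
        payload["response_format"] = {"type": "json_object"}
    return payload
```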

| Item | Models with JSON Mode Support | Models without JSON Mode Support |
| --- | --- | --- |
| Model name | sonar-pro, sonar-medium-online | sonar-small-online, sonar-medium-chat |
| JSON mode behavior | Returns raw JSON when response_format is set | May return markdown or ignore response_format |
| Recommended use | All JSON mode requests | General chat, not JSON mode |
| Error handling | No error with response_format | May return an error or fall back to markdown |

You can now fix the Perplexity API JSON mode issue by switching to a supported model, setting the response_format parameter, and adding strict JSON instructions to the system message. If problems persist, strip markdown in your code with a regex. For production applications, always validate JSON output with a JSON parser and log any parsing failures for debugging.
