
DeepSeek-V3-0324

Mixture-of-Experts model challenging top AI models at much lower cost. Updated on March 24th, 2025.

About model

DeepSeek-V3-0324 is a strong Mixture-of-Experts language model with 671B total parameters, of which 37B are activated per token, designed for efficient inference and cost-effective training. It outperforms other open-source models and rivals leading closed-source models, making it well suited to applications that require high-quality language understanding and generation.

This endpoint was updated on March 24th, 2025 to use the weights of the improved DeepSeek-V3-0324 model.
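
Hosting providers typically expose this endpoint through an OpenAI-compatible chat completions API. The snippet below is a minimal sketch of such a call, assuming that interface; the base URL and model identifier are placeholder assumptions rather than values confirmed on this page, so substitute your provider's documented values.

```python
# Minimal sketch: querying DeepSeek-V3-0324 through an OpenAI-compatible
# chat completions endpoint. base_url and the model ID are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="deepseek-v3-0324",  # hypothetical model ID; check your provider
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain Mixture-of-Experts inference in two sentences."},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```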

Performance benchmarks

[Benchmark comparison chart: DeepSeek-V3-0324 versus related open-source models and competitor closed-source models (Claude Opus 4.6, OpenAI o3, OpenAI o1, GPT-4o) on AIME 2025, GPQA Diamond, HLE, LiveCodeBench, MATH500, and SWE-bench Verified.]

    • Model provider
      DeepSeek
    • Type
      LLM
    • Main use cases
      Chat
      Function Calling
    • Features
      Function Calling (see the sketch after this list)
      JSON Mode
    • Fine tuning
      Supported
    • Deployment
      On-Demand Dedicated
      Monthly Reserved
    • Parameters
      671B
    • Context length
      131K
    • Input price
      $1.25 / 1M tokens
    • Output price
      $1.25 / 1M tokens
    • Input modalities
      Text
    • Output modalities
      Text
    • Released
      December 25, 2024
    • Last updated
      January 22, 2026
    • Quantization level
      FP8
    • External link
    • Category
      Chat
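
The Function Calling and JSON Mode features listed above correspond to the standard tool-calling and structured-output request fields of an OpenAI-compatible API. The sketch below assumes that interface; the tool name, base URL, and model ID are hypothetical placeholders rather than values taken from this page.

```python
# Minimal sketch of Function Calling and JSON Mode against an
# OpenAI-compatible endpoint. Tool, base_url, and model ID are placeholders.
import json
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")

# Function Calling: describe a tool the model may decide to invoke.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

tool_response = client.chat.completions.create(
    model="deepseek-v3-0324",  # hypothetical model ID
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)
call = tool_response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))

# JSON Mode: constrain the reply to a single valid JSON object.
json_response = client.chat.completions.create(
    model="deepseek-v3-0324",
    messages=[{"role": "user", "content": "List the three largest countries by area as a JSON object."}],
    response_format={"type": "json_object"},
)
print(json.loads(json_response.choices[0].message.content))
```

As a worked example of the listed pricing, at $1.25 per 1M tokens for both input and output, a call that consumes 10,000 prompt tokens and returns 1,000 completion tokens would cost roughly (10,000 + 1,000) / 1,000,000 × $1.25 ≈ $0.014.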