JSON to Pydantic Model Generator

Paste JSON, get a Pydantic BaseModel class with inferred types.


How to use JSON to Pydantic

  1. Paste your JSON into the left editor. It can be a single object, an array, or deeply nested data.
  2. Set the root class name using the input above the editors. This becomes the top-level BaseModel name (default is Root).
  3. Click "Generate Pydantic" to produce the model code in the right panel.
  4. Review the output. Sub-models are created automatically for every nested object. Fields use Optional when a key has a null value.
  5. Copy the code into your Python project. Add validators, Field descriptions, or custom Config as needed.

What is Pydantic?

Pydantic is the most widely used data-validation library in the Python ecosystem. It lets you declare data shapes as plain Python classes that inherit from BaseModel, and it validates incoming data at runtime — coercing types, raising clear errors, and producing clean dictionary or JSON output. FastAPI, LangChain, Instructor, and dozens of other popular frameworks depend on Pydantic models to define request bodies, database rows, and tool schemas.
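A minimal sketch of that runtime validation (class and field names here are illustrative, not part of the tool's output). Pydantic coerces compatible values and raises a structured error for incompatible ones; `error_count()` assumes Pydantic v2:

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int

# In Pydantic's default (lax) mode, compatible types are coerced:
# the string "3" becomes the int 3.
p = Point(x="3", y=4)
print(p.x)  # 3

# Incompatible data raises a clear ValidationError instead of failing silently.
try:
    Point(x="not a number", y=4)
except ValidationError as exc:
    print(exc.error_count())  # 1  (Pydantic v2)
```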

Writing these models by hand is straightforward for small objects, but when you are staring at a 200-line API response or a complex LLM extraction result, manually mapping every field is tedious and error-prone. This converter reads your JSON, walks the structure recursively, infers the narrowest Python type for each value, and emits ready-to-run BaseModel classes — including nested sub-models — in seconds. You get a working starting point that you can then enrich with validators, aliases, and docstrings.

Examples

FastAPI request body — user registration

Given this JSON payload:

{
  "username": "jdoe",
  "email": "jdoe@example.com",
  "age": 28,
  "is_active": true,
  "tags": ["admin", "beta"]
}

The generator produces:

from pydantic import BaseModel
from typing import List

class User(BaseModel):
    username: str
    email: str
    age: int
    is_active: bool
    tags: List[str]
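To sanity-check the generated model, you can validate the original payload with it directly. This sketch assumes Pydantic v2 for `model_dump()` (on v1, use `.dict()` instead):

```python
from typing import List
from pydantic import BaseModel

class User(BaseModel):
    username: str
    email: str
    age: int
    is_active: bool
    tags: List[str]

payload = {
    "username": "jdoe",
    "email": "jdoe@example.com",
    "age": 28,
    "is_active": True,
    "tags": ["admin", "beta"],
}

user = User(**payload)          # validates every field against its declared type
print(user.model_dump())        # round-trips back to a plain dict (Pydantic v2)
```

In a FastAPI app, declaring `user: User` as an endpoint parameter would give you this same validation on every incoming request body.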

LLM structured output — extraction result
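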

If you ask Claude or GPT to extract invoice data, you might receive:

{
  "invoice_number": "INV-2024-0042",
  "vendor": {
    "name": "Acme Corp",
    "tax_id": "US-123456789"
  },
  "line_items": [
    { "description": "Widget A", "quantity": 10, "unit_price": 4.99 }
  ],
  "total": 49.90,
  "currency": "USD",
  "notes": null
}

Output:

from pydantic import BaseModel
from typing import List, Optional

class Vendor(BaseModel):
    name: str
    tax_id: str

class LineItem(BaseModel):
    description: str
    quantity: int
    unit_price: float

class Invoice(BaseModel):
    invoice_number: str
    vendor: Vendor
    line_items: List[LineItem]
    total: float
    currency: str
    notes: Optional[str]
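Feeding the sample JSON back through the generated classes shows the nested models being built automatically; `model_validate` assumes Pydantic v2 (on v1, use `parse_obj`):

```python
import json
from typing import List, Optional
from pydantic import BaseModel

class Vendor(BaseModel):
    name: str
    tax_id: str

class LineItem(BaseModel):
    description: str
    quantity: int
    unit_price: float

class Invoice(BaseModel):
    invoice_number: str
    vendor: Vendor
    line_items: List[LineItem]
    total: float
    currency: str
    notes: Optional[str]

raw = '''{"invoice_number": "INV-2024-0042",
 "vendor": {"name": "Acme Corp", "tax_id": "US-123456789"},
 "line_items": [{"description": "Widget A", "quantity": 10, "unit_price": 4.99}],
 "total": 49.90, "currency": "USD", "notes": null}'''

# Nested dicts are validated into Vendor and LineItem instances automatically.
invoice = Invoice.model_validate(json.loads(raw))  # Pydantic v2
print(invoice.vendor.name)               # Acme Corp
print(invoice.line_items[0].unit_price)  # 4.99
```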

Nested config object

{
  "app_name": "my-service",
  "debug": false,
  "database": {
    "host": "localhost",
    "port": 5432,
    "credentials": {
      "user": "admin",
      "password": "secret"
    }
  },
  "feature_flags": ["dark_mode", "v2_api"]
}

This produces three models: Credentials, Database, and a root Config model with typed fields for every level.
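The generated code for that config would look roughly like this (a sketch of the expected output, not a verbatim copy):

```python
from typing import List
from pydantic import BaseModel

class Credentials(BaseModel):
    user: str
    password: str

class Database(BaseModel):
    host: str
    port: int
    credentials: Credentials

class Config(BaseModel):
    app_name: str
    debug: bool
    database: Database
    feature_flags: List[str]

cfg = Config.model_validate({
    "app_name": "my-service",
    "debug": False,
    "database": {
        "host": "localhost",
        "port": 5432,
        "credentials": {"user": "admin", "password": "secret"},
    },
    "feature_flags": ["dark_mode", "v2_api"],
})
print(cfg.database.credentials.user)  # admin
```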

Common use cases

  • Bootstrapping FastAPI request and response models from example payloads during API development.
  • Creating Instructor / LangChain output schemas so an LLM returns structured, validated data.
  • Generating models for webhook payloads from services like Stripe, GitHub, or Twilio.
  • Converting a legacy JSON config file into a typed, validated settings object.
  • Quickly producing type stubs when reverse-engineering an undocumented API.
  • Building data-ingestion pipelines where raw JSON from S3 or Kafka needs validation before processing.

Frequently asked questions

How does it handle Optional fields?

Any field whose value is null in the sample JSON is typed as Optional[T]. If the value is always null, the field becomes Optional[Any]. In practice you will usually want to refine this — replace Any with the actual expected type and add a default of None.
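A typical refinement might look like this (the `Article`/`subtitle` names are illustrative):

```python
from typing import Optional
from pydantic import BaseModel

class Article(BaseModel):
    title: str
    # Generated as Optional[str] because "subtitle" was null in the sample;
    # adding a default of None lets callers omit the key entirely.
    subtitle: Optional[str] = None

print(Article(title="Hello").subtitle)  # None
```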

Can it handle nested objects?

Yes. Every nested object becomes its own BaseModel subclass. The class name is derived from the key name (e.g. a key called "billing_address" produces a class named BillingAddress). Nesting depth is unlimited.

Does it support Pydantic v1 and v2?

The generated code uses the BaseModel import style that works with both Pydantic v1 and v2. If you are on v2 and prefer the newer model_config pattern or ConfigDict, you can add that after generation. The core field declarations are compatible with both versions.

What about lists of mixed types?

If an array contains elements of different primitive types (e.g. strings and numbers), the generator falls back to List[Any]. If all elements are objects with the same shape, it creates a typed sub-model. You may want to hand-edit mixed arrays into a Union type.
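A hand-edited Union might look like this (field names are illustrative; assumes Pydantic v2's smart-union behavior, which preserves each element's original type):

```python
from typing import List, Union
from pydantic import BaseModel

class Reading(BaseModel):
    # The generator's fallback for ["ok", 3, "warn", 5] would be List[Any];
    # hand-edited to the actual element types:
    values: List[Union[str, int]]

r = Reading(values=["ok", 3, "warn", 5])
print(r.values)
```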

How do I add validators after generating?

Copy the output into your project and use Pydantic's @field_validator (v2) or @validator (v1) decorator to add constraints. For example, add @field_validator('email') to check email format, or use Field(ge=0) to enforce non-negative numbers.
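Putting both techniques together on the earlier `User` model might look like this sketch (assumes Pydantic v2; the `@` check stands in for real email validation):

```python
from pydantic import BaseModel, Field, ValidationError, field_validator

class User(BaseModel):
    email: str
    age: int = Field(ge=0)  # non-negative constraint at the field level

    @field_validator("email")
    @classmethod
    def check_email(cls, v: str) -> str:
        # Minimal sanity check; swap in a real email validator as needed.
        if "@" not in v:
            raise ValueError("must contain '@'")
        return v

print(User(email="jdoe@example.com", age=28).age)  # 28
```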

Does it preserve field order?

Yes. Fields appear in the model in the same order they appear in the source JSON, which keeps the generated code predictable and easy to review.

Can I use this with JSON Schema Generator?

Absolutely. You can generate a JSON Schema from your data first, then use this tool to create a matching Pydantic model. Pydantic models can also export their own schema via .model_json_schema() (v2) or .schema() (v1), so the two representations stay in sync.
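Exporting the schema from a generated model is a one-liner; this sketch assumes Pydantic v2:

```python
from pydantic import BaseModel

class User(BaseModel):
    username: str
    age: int

schema = User.model_json_schema()  # Pydantic v2; on v1, use User.schema()
print(schema["properties"]["age"]["type"])  # integer
print(schema["required"])                   # ['username', 'age']
```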

Privacy & how it works

This tool runs entirely in your browser. Your JSON is parsed and converted to Pydantic code using client-side JavaScript — nothing is sent to a server, stored in a database, or logged anywhere. You can verify this by opening your browser's network tab while using the tool. For sensitive data like API keys or customer records, this means you get the convenience of automatic code generation without any privacy trade-off. See also the JSON Formatter and JSON to Zod tools, which operate the same way.