**Platform:** notion
**Action ID:** `conn_mod_def::GJ5EnlW1AEk::mEAQjNttT1GjVqNeMTvDiw`
**Action Name:** Create a Page
**What's wrong?** Other

**Description:**
The knowledge documents `maxLength=2000` on `rich_text[].text.content` in the schema tables; the limit appears 20+ times. However, AI agents consistently fail to design around it because it is buried in field-level schema details rather than called out as a practical constraint.
**What happened:**

An AI workflow that uses Claude to synthesize research into a Notion page produced 3375 characters of content. The flow passed all validation but failed at runtime:

```
body failed validation: body.children[0].paragraph.rich_text[0].text.content.length should be ≤ 2000, instead was 3375
```
The AI agent that built the flow read the knowledge, saw the schema, but didn't internalize "I need to split my content into chunks."
**Why this matters for AI agents:**
Any workflow that writes AI-generated content to Notion will hit this. AI analysis output (research briefs, SEO audits, sales briefings) regularly exceeds 2000 characters. This is the most common Notion write pattern for AI workflows, and it fails silently at validation but loudly at runtime.
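To illustrate the failure mode, a pre-flight check like the one below would surface the problem before the API call instead of at runtime. This is a sketch, not part of the Notion SDK; the helper name and limit constant are illustrative, with the 2000-character value taken from the Notion docs linked at the bottom of this issue.

```python
# Illustrative pre-flight check for Notion's 2000-character-per-block
# rich_text limit. Helper name is hypothetical; the limit value comes
# from the Notion API request-limits documentation.
NOTION_TEXT_LIMIT = 2000

def find_oversized_blocks(children: list[dict]) -> list[str]:
    """Return one error string per rich_text item that exceeds the limit."""
    errors = []
    for i, block in enumerate(children):
        block_type = block.get("type", "")
        rich_text = block.get(block_type, {}).get("rich_text", [])
        for j, rt in enumerate(rich_text):
            content = rt.get("text", {}).get("content", "")
            if len(content) > NOTION_TEXT_LIMIT:
                errors.append(
                    f"children[{i}].{block_type}.rich_text[{j}].text.content "
                    f"length {len(content)} > {NOTION_TEXT_LIMIT}"
                )
    return errors
```

Running this over the payload from the failure above would report the 3375-character paragraph before the request is ever sent.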
**Suggested addition:**
Add a prominent "Gotchas" or "Common Pitfalls" section near the top of the knowledge:
## Gotchas
- **Text content limit: 2000 characters per block.** Each `rich_text[].text.content`
  value must be ≤ 2000 characters. For dynamic or AI-generated content that may exceed
  this, split the text into multiple paragraph blocks. Example approach:
  split on double newlines, then create one `paragraph` block per chunk:
```json
{
  "children": [
    {
      "object": "block",
      "type": "paragraph",
      "paragraph": {
        "rich_text": [{ "text": { "content": "<first 2000 chars>" } }]
      }
    },
    {
      "object": "block",
      "type": "paragraph",
      "paragraph": {
        "rich_text": [{ "text": { "content": "<next 2000 chars>" } }]
      }
    }
  ]
}
```
This would prevent the most common runtime failure for AI workflows writing to Notion.
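The split-then-chunk approach suggested above can be sketched in a few lines. This is a minimal example, not library code: it splits on double newlines first, hard-splits any single paragraph that is still over the limit, and emits one Notion `paragraph` block per chunk. Function names are illustrative.

```python
# Sketch of the chunking approach suggested above: split on double
# newlines, keep every piece <= 2000 characters, and emit one Notion
# paragraph block per piece. Names here are illustrative.
NOTION_TEXT_LIMIT = 2000

def chunk_text(text: str, limit: int = NOTION_TEXT_LIMIT) -> list[str]:
    """Split text into pieces no longer than `limit` characters."""
    chunks: list[str] = []
    for para in text.split("\n\n"):
        # Hard-split any single paragraph that still exceeds the limit.
        while len(para) > limit:
            chunks.append(para[:limit])
            para = para[limit:]
        if para:
            chunks.append(para)
    return chunks

def to_paragraph_blocks(text: str) -> list[dict]:
    """Build the `children` array for a Notion create-page request body."""
    return [
        {
            "object": "block",
            "type": "paragraph",
            "paragraph": {"rich_text": [{"text": {"content": chunk}}]},
        }
        for chunk in chunk_text(text)
    ]
```

With this, the 3375-character output from the flow above would become two paragraph blocks instead of one oversized block that fails at runtime.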
**How we discovered this:**
Building a "LinkedIn Research & Content Creator" flow that uses Exa for web research, Claude for synthesis, and Notion for storage. Claude's synthesized output was 3375 chars — well within what AI analysis typically produces, but over Notion's per-block limit.
**Source:**
- Notion API docs — Block content limits: https://developers.notion.com/reference/request-limits#limits-for-property-values