You want to import large datasets into Notion without manually typing each row. Notion provides a REST API that lets you create, update, and query database pages programmatically. This article explains how to use the Notion API to perform a bulk upload of data from a CSV file into a Notion database. You will learn how to set up an integration, authenticate requests, and write a script that sends multiple pages in one batch.
Key Takeaways: Bulk Importing Data Into Notion via API
- Notion API Integration token: Required to authenticate all requests; create one in Settings & Members > My connections > Develop or manage integrations.
- Database ID: The unique identifier for your target Notion database; found in the database URL after the workspace name.
- Paced requests with retries: Use a loop that sends one page per API call, pauses about 0.35 seconds between calls to stay under the rate limit, and retries on 429 errors.
What the Notion API Bulk Upload Feature Does and What You Need
The Notion API allows external applications to create pages inside a database. Each page corresponds to one row in a database table. A bulk upload means you send multiple create requests in sequence, typically from a CSV file or JSON array. The API does not support a single request that creates many pages at once. You must send one POST request per page.
Before you start, you need three things. First, a Notion integration token. This token identifies your script to Notion’s servers. Second, the database ID of the target Notion database. Third, the source data, usually in CSV format, with column headers that match the database property names.
How the Notion API Rate Limit Affects Bulk Uploads
Notion enforces a rate limit of three requests per second per integration. If you send requests faster, the API returns a 429 Too Many Requests error. To stay under this limit, add a delay of at least 333 milliseconds between requests. A practical delay is 350 milliseconds to provide a safety margin.
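The pacing can be factored into a small wrapper instead of sprinkling sleep calls through the script. This is a sketch under the assumptions above (0.35 seconds between calls), not part of any Notion SDK:

```python
import time

def throttled(func, min_interval=0.35):
    """Wrap func so consecutive calls are at least min_interval seconds apart."""
    last_call = [0.0]  # closure state: monotonic timestamp of the last call
    def wrapper(*args, **kwargs):
        elapsed = time.monotonic() - last_call[0]
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
        last_call[0] = time.monotonic()
        return func(*args, **kwargs)
    return wrapper
```

Wrapping the page-creation function once (for example, `create_page = throttled(create_page)`) lets the rest of the script ignore timing entirely.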
Property Types That Require Special Handling
Notion database properties have different JSON structures. Title and rich text properties require an array of rich text objects. Select properties require an object with a name field. Date properties need an object with a start key in ISO 8601 format. Relation properties need an array of objects, each containing the id of a related page. Your script must map each CSV column to the correct property type. If the mapping is wrong, the API returns a 400 Bad Request error.
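These shapes can be collected into one helper so the mapping logic lives in a single place. The sketch below covers the types discussed above; the convention that a relation cell holds a comma-separated list of page IDs is an assumption of this example, not a Notion requirement:

```python
def to_notion_property(prop_type, value):
    """Build the JSON value for one Notion property from a CSV cell (string)."""
    if prop_type == 'title':
        return {'title': [{'text': {'content': value}}]}
    if prop_type == 'rich_text':
        return {'rich_text': [{'text': {'content': value}}]}
    if prop_type == 'select':
        return {'select': {'name': value}}
    if prop_type == 'date':
        return {'date': {'start': value}}  # value must already be ISO 8601
    if prop_type == 'relation':
        # Assumed convention: the cell is a comma-separated list of page IDs
        return {'relation': [{'id': pid.strip()} for pid in value.split(',')]}
    raise ValueError(f'unsupported property type: {prop_type}')
```

A wrong value here fails fast with a ValueError in your script instead of a 400 from the API.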
Steps to Import Data Into Notion via API Bulk Upload
The following steps assume you have a CSV file ready and a Notion database with properties that match the CSV columns. You will use Python with the requests library. If you prefer another language, the API calls are identical.
- Create a Notion integration
Go to https://www.notion.so/my-integrations. Click New integration. Give it a name, for example Bulk Upload Script. Select the workspace that contains your target database. Click Submit. Copy the Internal Integration Secret token. This token looks like secret_abc123.
- Share the target database with the integration
Open your Notion database in the browser. Click Share in the top-right corner. Click Invite. Paste the integration name from step 1. Click Invite. The integration now has permission to read and write pages in that database.
- Get the database ID
Open the database page in your browser. Look at the URL. It looks like https://www.notion.so/workspace-name/database-title-hash?v=xxx. The database ID is the 32-character string between the last slash and the question mark. For example, in https://www.notion.so/myworkspace/abc123def456?v=xxx, the ID is abc123def456 (shortened here for readability; real IDs are 32 hexadecimal characters).
- Prepare the CSV file
Ensure the first row contains column headers that exactly match the Notion property names. For example, if your Notion database has a Title property called Name and a Select property called Status, the CSV headers must be Name and Status. Save the file as data.csv in the same folder as your script.
- Write the Python script
Create a file named bulk_upload.py. Use the following code template. Replace YOUR_TOKEN with the token from step 1 and YOUR_DATABASE_ID with the ID from step 3.

```python
import requests
import csv
import time

NOTION_TOKEN = 'YOUR_TOKEN'
DATABASE_ID = 'YOUR_DATABASE_ID'

HEADERS = {
    'Authorization': f'Bearer {NOTION_TOKEN}',
    'Content-Type': 'application/json',
    'Notion-Version': '2022-06-28'
}

def create_page(data):
    url = 'https://api.notion.com/v1/pages'
    payload = {
        'parent': {'database_id': DATABASE_ID},
        'properties': data
    }
    response = requests.post(url, headers=HEADERS, json=payload)
    return response

with open('data.csv', newline='', encoding='utf-8') as csvfile:
    reader = csv.DictReader(csvfile)
    for row in reader:
        properties = {}
        for key, value in row.items():
            # Map each CSV column to its Notion property type.
            # Adjust this mapping based on your database schema.
            if key == 'Name':
                properties[key] = {
                    'title': [
                        {'text': {'content': value}}
                    ]
                }
            elif key == 'Status':
                properties[key] = {
                    'select': {'name': value}
                }
            elif key == 'Due Date':
                properties[key] = {
                    'date': {'start': value}
                }
            else:
                # Default to rich text for other columns
                properties[key] = {
                    'rich_text': [
                        {'text': {'content': value}}
                    ]
                }
        response = create_page(properties)
        if response.status_code == 200:
            print(f'Created page: {response.json()["id"]}')
        else:
            print(f'Error: {response.status_code} {response.text}')
        time.sleep(0.35)  # Respect the 3-requests-per-second rate limit
```

- Run the script
Open a terminal in the folder containing bulk_upload.py and data.csv. Run python bulk_upload.py. The script reads each row, sends a POST request, and prints the created page ID or an error. Wait for all rows to finish.
- Verify the data in Notion
Open your Notion database. You should see new pages for each row in the CSV. If some pages are missing, check the error messages in the terminal. Common errors include incorrect property mapping or missing required properties.
Common Bulk Upload Errors and How to Fix Them
API returns 400 Bad Request with validation error
This error means the JSON payload does not match the database schema. Check that every property name in the payload matches the exact property name in Notion. Also confirm that the property type matches. For example, if a property is a Select type, you must send an object with a name key, not a string.
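You can catch most of these mismatches before uploading by fetching the database schema (the Retrieve a database endpoint, GET https://api.notion.com/v1/databases/{database_id}, returns a properties object keyed by property name, each with a type field) and comparing it with your CSV headers. The comparison itself needs no network access; this sketch assumes you pass it that properties object:

```python
def validate_headers(csv_headers, schema_properties):
    """Compare CSV headers against a database's 'properties' mapping.

    Returns (matched, missing): matched maps each header to its Notion
    property type; missing lists headers with no matching property.
    """
    matched, missing = {}, []
    for header in csv_headers:
        if header in schema_properties:
            matched[header] = schema_properties[header]['type']
        else:
            missing.append(header)
    return matched, missing
```

Abort the upload, or drop the offending columns, when missing is non-empty, instead of letting every row fail with a 400.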
API returns 401 Unauthorized
The integration token is invalid or the integration does not have access to the database. Regenerate the token in the integrations page. Then verify that you shared the database with the integration by clicking Share in Notion and ensuring the integration name appears in the list.
API returns 429 Too Many Requests
Your script sent requests faster than the rate limit. Increase the delay between requests to at least 0.35 seconds. If the error persists, add a retry mechanism that waits 1 second and retries the failed request.
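A minimal retry wrapper might look like this. Here `send` stands in for any zero-argument function that returns a requests-style response object, and the linear backoff values are assumptions of this sketch, not official Notion guidance:

```python
import time

def post_with_retry(send, max_retries=3, backoff=1.0):
    """Call send(); on a 429 response, wait and retry up to max_retries times."""
    for attempt in range(max_retries + 1):
        response = send()
        if response.status_code != 429:
            return response
        if attempt < max_retries:
            time.sleep(backoff * (attempt + 1))  # 1s, 2s, 3s with the defaults
    return response
```

In the upload script this would wrap the create_page call, for example `post_with_retry(lambda: create_page(properties))`.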
Some rows are missing after the upload
The script may have stopped early due to an unhandled error. Add a try-except block around the API call to catch exceptions and continue with the next row. Also log failed rows to a separate file so you can retry them later.
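A sketch of that pattern: `create_page` stands in for the API call from the script above, and failed rows are appended to a separate CSV (the failed_rows.csv name is an arbitrary choice of this example) so they can be re-run later:

```python
import csv

def upload_rows(rows, create_page, failed_path='failed_rows.csv'):
    """Attempt every row; collect failures instead of stopping the run."""
    failed = []
    for row in rows:
        try:
            response = create_page(row)
            if response.status_code != 200:
                failed.append(row)
        except Exception:
            # Network errors and timeouts should not abort the whole upload
            failed.append(row)
    if failed:
        with open(failed_path, 'w', newline='', encoding='utf-8') as f:
            writer = csv.DictWriter(f, fieldnames=list(failed[0].keys()))
            writer.writeheader()
            writer.writerows(failed)
    return failed
```

Re-running the script with failed_rows.csv as the input file then retries only the rows that failed.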
Notion API Bulk Upload vs Manual Entry vs Third-Party Tools
| Item | API Bulk Upload | Manual Entry | Third-Party Tools |
|---|---|---|---|
| Speed | Fast for thousands of rows | Slow for more than 10 rows | Fast but limited by tool capacity |
| Cost | Free with Notion API | Free (labor cost) | Often paid subscription |
| Customization | Full control over property mapping | Full control but manual | Limited to tool features |
| Error handling | Must write retry logic | Immediate visual feedback | Built-in error handling |
| Learning curve | Requires programming knowledge | No learning needed | Low to medium |
The API method gives you the most control and is free, but it requires coding skills. Manual entry works for small datasets. Third-party tools like Zapier or Make offer a middle ground with a subscription fee.
You can now import data into Notion using the API with a script that reads a CSV and creates pages automatically. To improve reliability, add error logging and retry logic. As an advanced tip, retrieve the database schema through the API before uploading and validate your CSV headers against it; sending only properties that actually exist in the database reduces payload size and avoids validation errors.