Import CSV data directly into PostgreSQL.
Your users upload a spreadsheet. CSVbox maps their columns to your table schema, validates every row, and writes clean data straight into your PostgreSQL database. No intermediate API. No ETL layer. No manual imports.
- Validated data only
- Column mapping included
- SOC 2 Type II + GDPR
PostgreSQL is the database behind most SaaS applications. When your users need to bulk-load data — contacts, products, records, transactions — the typical path is a CSV upload that eventually lands in your Postgres tables. CSVbox closes that loop automatically.
Instead of building a pipeline to parse, validate, and insert spreadsheet data, you configure a destination in the CSVbox dashboard. CSVbox handles parsing, column mapping, type coercion, and validation. Your PostgreSQL table gets clean rows.
How It Works
1. Connect your database
In the CSVbox dashboard, add a PostgreSQL destination. Provide your host, port, database name, username, and password. CSVbox connects over TLS and stores credentials encrypted.
2. Map to your table
Select the target table. CSVbox reads your table’s column definitions. Map each schema field to a table column — or let CSVbox auto-match by name.
3. Choose insert or upsert
Select Insert (append all rows) or Upsert (insert or update based on a unique column — e.g., email or external ID).
4. Embed the importer
Add the CSVbox widget to your app. When users upload files, CSVbox validates their data and writes passing rows to your table. Failed rows surface inline for correction.
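The Upsert mode in step 3 maps onto PostgreSQL's `INSERT ... ON CONFLICT DO UPDATE`. As a rough sketch of the statement it corresponds to (the table, columns, and helper function here are illustrative, not CSVbox internals):

```python
def build_upsert_sql(table, columns, unique_col):
    """Build an INSERT ... ON CONFLICT DO UPDATE statement, mirroring
    the Upsert write mode: insert new rows, update existing ones that
    match the unique column. Names are illustrative."""
    cols = ", ".join(columns)
    placeholders = ", ".join(f"%({c})s" for c in columns)
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c != unique_col
    )
    return (
        f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
        f"ON CONFLICT ({unique_col}) DO UPDATE SET {updates}"
    )

# Hypothetical contacts table keyed on email:
sql = build_upsert_sql("public.contacts", ["email", "name", "phone"], "email")
```

With `Insert` mode, the `ON CONFLICT` clause is simply absent and every row is appended.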
Configuration Options
| Option | Description |
|---|---|
| Connection | Host, port, database, user, password (TLS enforced) |
| Target table | Select any table in the connected database |
| Write mode | Insert (append) or Upsert (insert/update on conflict) |
| Unique column | Column used for upsert conflict resolution |
| Column mapping | Map schema fields to table columns; rename or skip |
| Null handling | Empty cells → NULL or a default value |
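The null-handling option in the table above amounts to a per-cell coercion: an empty cell becomes `NULL` or a configured default. A minimal sketch of that rule (the function and defaults are illustrative, not CSVbox's actual implementation):

```python
def coerce_cell(value, default=None):
    """Map an empty or whitespace-only CSV cell to NULL (None) or a
    configured default value; non-empty cells pass through unchanged."""
    if value is None or value.strip() == "":
        return default
    return value

coerce_cell("")                      # empty cell -> None (NULL)
coerce_cell("", default="unknown")   # empty cell -> configured default
coerce_cell(" Alice ")               # non-empty cells are untouched
```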
Common Use Cases
- New customers upload their existing contacts, accounts, or products at signup. Data lands directly in your database, ready for use.
- Users add large datasets (employee lists, product catalogs, SKUs) without manual entry.
- Help customers migrate from a previous tool by accepting their exported CSV and writing it to your schema.
- Partners or vendors upload a file each week; rows are validated and appended to your tables automatically.
Frequently Asked Questions
Does CSVbox store my database credentials?
Credentials are encrypted at rest using AES-256. CSVbox connects to your database only during import processing. If you require data to never touch CSVbox infrastructure, use the API/Webhook destination with Private Mode instead.
Can I target a specific schema or table in a multi-schema database?
Yes. You can specify schema and table in the destination config (e.g., public.contacts or analytics.events).
What happens when a row fails validation?
Failed rows are surfaced in the importer UI so the user can correct them inline; they never reach your database. Only rows that pass validation are inserted.
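The pass/fail split described above can be pictured as a simple partition of rows against per-column checks (the row shape and validators here are hypothetical):

```python
def partition_rows(rows, validators):
    """Split rows into (passing, failing) using per-column validator
    functions. Passing rows would be inserted into the target table;
    failing rows are held back with their errors for correction."""
    passing, failing = [], []
    for row in rows:
        errors = {
            col: "invalid"
            for col, check in validators.items()
            if not check(row.get(col))
        }
        (failing if errors else passing).append((row, errors))
    return [row for row, _ in passing], failing

rows = [{"email": "a@example.com"}, {"email": "not-an-email"}]
validators = {"email": lambda v: v is not None and "@" in v}
ok, bad = partition_rows(rows, validators)  # one row passes, one is held
```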
Is there a row size limit?
CSVbox does not impose its own cap; PostgreSQL's limits apply. In practice these are generous — a single field value can hold up to 1 GB, and large text or JSONB values are stored out of line via TOAST — so typical imports are unaffected.
Does CSVbox support SSL/TLS connections?
Yes. All database connections use TLS. You can also provide a CA certificate for strict certificate verification.
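If you connect to the same database from your own code, the equivalent client-side settings look like this. This is a hedged sketch: the host, user, and CA path are placeholders, and the string follows libpq's connection-string format (used by drivers such as psycopg2), not a CSVbox API.

```python
def build_dsn(host, port, dbname, user, sslmode="verify-full", sslrootcert=None):
    """Assemble a libpq-style connection string with TLS enforced.
    'verify-full' requires encryption and verifies the server
    certificate and hostname against the given CA file."""
    parts = [
        f"host={host}", f"port={port}", f"dbname={dbname}",
        f"user={user}", f"sslmode={sslmode}",
    ]
    if sslrootcert:
        parts.append(f"sslrootcert={sslrootcert}")
    return " ".join(parts)

dsn = build_dsn("db.example.com", 5432, "app", "importer",
                sslrootcert="/etc/ssl/ca.pem")
# A driver call like psycopg2.connect(dsn) would then open the
# TLS-verified connection.
```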