Store import files directly in Amazon S3.
CSVbox accepts your users’ CSV and Excel uploads, validates the data, and saves the file as CSV or JSON into any S3 bucket you control. Use it as a landing zone for downstream ETL, analytics, or data lake pipelines.
- Validated data only
- Column mapping included
- SOC 2 Type II + GDPR
S3 is the natural landing zone for data pipelines. If your architecture reads from S3 — a Lambda trigger, a Glue job, an Athena query, a Redshift COPY command — routing CSVbox imports to S3 slots directly into that workflow.
Instead of building a custom file upload endpoint and piping it to S3, configure CSVbox as the upload UI and S3 as the destination. Validated files appear in your bucket; your downstream pipeline picks them up.
How It Works
1. Connect your S3 bucket
   Add an Amazon S3 destination in the CSVbox dashboard. Provide your AWS access key ID, secret access key, bucket name, and region. CSVbox needs only the minimum IAM permission (s3:PutObject).
2. Configure the file path
   Set a folder prefix for stored files. CSVbox generates a unique filename per import from the import ID and a timestamp.
3. Choose the file format
   Select CSV (cleaned and re-encoded) or JSON (rows as a JSON array, useful for Lambda triggers or Athena).
4. Embed and ship
   Add the CSVbox widget to your app. Users upload files; validated data lands in your S3 bucket.
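The naming scheme from step 2 can be sketched as follows. The exact filename format is internal to CSVbox, so `build_object_key` is a hypothetical illustration of a prefix + import ID + timestamp layout, not the real implementation:

```python
from datetime import datetime, timezone

def build_object_key(prefix: str, import_id: str, fmt: str = "csv") -> str:
    """Illustrative only: compose a unique S3 object key from a folder
    prefix, an import ID, and a UTC timestamp, as described in step 2."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{prefix.rstrip('/')}/import_{import_id}_{ts}.{fmt}"

key = build_object_key("imports/prod", "abc123", "json")
# e.g. "imports/prod/import_abc123_20250101T120000Z.json"
```

Because the key embeds both the import ID and a timestamp, repeated imports never overwrite each other, and a date-like timestamp component keeps objects roughly sorted when listed by prefix.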
Configuration Options
| Option | Description |
|---|---|
| AWS credentials | Access key ID and secret access key; s3:PutObject is the minimum permission |
| Bucket | Any S3 bucket you own |
| Region | Any AWS region |
| Folder prefix | Path prefix for stored objects |
| File format | CSV or JSON |
| Object metadata | Import ID, user ID, row count, schema ID, timestamp |
| Server-side encryption | SSE-S3 (AES-256) or SSE-KMS |
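If you were writing the same objects yourself with boto3, the options in the table map onto `put_object` parameters roughly as below. This is a sketch: the metadata key names shown are assumptions, not the keys CSVbox actually attaches.

```python
def build_put_kwargs(bucket, key, body, *, kms_key_id=None, metadata=None):
    """Assemble boto3 S3 put_object arguments with server-side
    encryption: SSE-KMS if a KMS key is given, otherwise SSE-S3."""
    kwargs = {"Bucket": bucket, "Key": key, "Body": body}
    if kms_key_id:
        kwargs["ServerSideEncryption"] = "aws:kms"
        kwargs["SSEKMSKeyId"] = kms_key_id
    else:
        kwargs["ServerSideEncryption"] = "AES256"  # SSE-S3
    if metadata:
        kwargs["Metadata"] = metadata  # e.g. import ID, row count
    return kwargs

# Usage with a boto3 client (network call, shown for context only):
# s3.put_object(**build_put_kwargs("my-bucket", "imports/file.csv", data,
#                                  metadata={"import-id": "abc123"}))
```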
S3 + Downstream Patterns
| Pattern | How It Works |
|---|---|
| Lambda trigger | S3 event notification fires on each CSVbox file; Lambda processes the rows |
| AWS Glue | Glue crawler catalogs the prefix; Glue job transforms and loads |
| Redshift COPY | COPY command loads CSVbox JSON or CSV from S3 into Redshift tables |
| Amazon Athena | Query CSVbox JSON files in S3 directly with Athena |
| Google BigQuery | Transfer via GCS to BigQuery using data transfer service |
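The Lambda-trigger pattern above usually starts by extracting the bucket and key from the S3 event notification. A minimal handler skeleton, with the actual row processing left as a stub (the event shape follows the standard S3 notification format):

```python
import urllib.parse

def handler(event, context=None):
    """Lambda entry point for S3 event notifications: collect the
    (bucket, key) pairs of newly written CSVbox files."""
    objects = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 events (spaces become '+')
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        objects.append((bucket, key))
        # ...fetch the object with boto3 and process its rows here...
    return objects

sample_event = {"Records": [{"s3": {"bucket": {"name": "my-bucket"},
                                    "object": {"key": "imports/import_abc123.json"}}}]}
```

Scope the event notification to the same folder prefix you configured in CSVbox so the function only fires on import files, not on other objects in the bucket.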
Frequently Asked Questions
What IAM permissions does CSVbox need?
Minimum: s3:PutObject on your bucket. Optionally s3:GetBucketLocation to verify region.
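An IAM policy for the user behind the access key, granting just those permissions, might look like this (the bucket name `my-import-bucket` is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::my-import-bucket/*",
        "arn:aws:s3:::my-import-bucket"
      ]
    }
  ]
}
```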
Does CSVbox support S3-compatible storage like MinIO or Cloudflare R2?
Yes, for providers that implement the S3-compatible API. Provide the custom endpoint URL in the destination config.
Can I encrypt objects in S3?
Yes. Configure SSE-S3 or SSE-KMS in the destination config.
Is there a file size limit?
CSVbox supports files up to 500 MB. Multipart upload is handled automatically for large files.