# Transform Platform

Move data between file formats and event streams — reliably, at scale, without writing pipeline code.

Transform Platform is a spec-driven transformation engine built on Kotlin and Spring Boot. Teams define a FileSpec — a JSON schema that describes a file format, its fields, business rules, and delivery targets — and the platform handles parsing, validation, correction, and delivery automatically. New file formats and integration routes are added through configuration, not code.
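For illustration, a FileSpec for a simple delimited format might look like the sketch below. The field names and rule vocabulary used here (correctionRules, validationRules, severity, delivery) are assumptions for this example, not the platform's actual schema:

```json
{
  "name": "partner-payments-csv",
  "format": "CSV",
  "delimiter": ",",
  "fields": [
    { "name": "accountNumber", "type": "STRING", "required": true },
    { "name": "amount", "type": "DECIMAL", "required": true }
  ],
  "correctionRules": [
    { "field": "amount", "rule": "TRIM_WHITESPACE" }
  ],
  "validationRules": [
    { "field": "amount", "rule": "POSITIVE", "severity": "FATAL" }
  ],
  "delivery": { "target": "KAFKA", "topic": "payments.validated" }
}
```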


## How It Works

Transform Platform handles both directions of data movement from a single spec. Whether you're ingesting partner files into your event stream or generating output files from processed events, the same FileSpec drives the pipeline.

### 📥 File → Events

📄 File In (CSV · NACHA · XML · FW) → 🔍 Parse (FileParser via spec) → ✏️ Correct (CorrectionRules) → ✅ Validate (ValidationRules) → 📤 Deliver (Kafka · DB · Webhook)

🗂 FileSpec Registry — single source of truth for both directions

### 📤 Events → File

📨 Events In (Kafka · REST · SFTP) → 🔀 Map (FieldMappings via spec) → ✏️ Correct (CorrectionRules) → ✅ Validate (ValidationRules) → 📄 Generate (NACHA · CSV · SWIFT)

A single FileSpec defines both directions. Add a new file format by implementing one interface and annotating it @Component — Spring discovers it automatically at startup, with no changes to the pipeline, registries, or any existing code.
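As a sketch of what such an extension might look like, here is a hypothetical pipe-delimited parser. The FileParser signature and the FileSpec, FieldDef, and ParsedRecord shapes are assumptions for illustration; the real interfaces live in the platform and may differ:

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.flow
import org.springframework.stereotype.Component
import java.io.InputStream

// Assumed shapes of the platform types (illustration only).
data class FieldDef(val name: String)
data class FileSpec(val format: String, val fields: List<FieldDef>)
data class ParsedRecord(val values: Map<String, String>)

interface FileParser {
    fun parse(input: InputStream, spec: FileSpec): Flow<ParsedRecord>
}

// @Component is all the wiring needed: Spring's component scan finds the
// class and the platform can register it at startup.
@Component
class PipeDelimitedParser : FileParser {
    override fun parse(input: InputStream, spec: FileSpec): Flow<ParsedRecord> = flow {
        // Stream line by line; the file is never fully loaded into memory.
        input.bufferedReader().useLines { lines ->
            for (line in lines) {
                val cells = line.split("|")
                emit(ParsedRecord(spec.fields.map { it.name }.zip(cells).toMap()))
            }
        }
    }
}
```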


## Key Principles

**🎯 Spec-Driven, Not Code-Driven**
Business rules live in a FileSpec — a JSON document that defines fields, correction logic, and validation constraints. New formats and rules are deployed as configuration changes, not code releases.

**⚡ Stream-First, Memory-Safe**
Records flow through the pipeline as a Kotlin coroutine Flow. Files are never fully loaded into memory — a 10 GB file consumes the same heap as a 10 KB one. Processing starts on the first byte.

**🔌 Open to Extension, Closed to Modification**
Add a new parser, writer, or rule by implementing one interface and adding @Component. Spring auto-discovers it. No changes to the pipeline, registries, or any existing code.

**🛡️ Errors Stay with Records, Not the Pipeline**
Validation failures are attached to the record that caused them. The pipeline never stops on bad data: FATAL records are quarantined, while WARNING and ERROR records flow through with their errors attached for downstream reporting.

**🔗 Dynamic Client Integrations**
SFTP servers, Kafka clusters, REST webhooks, and S3 buckets are registered via API at runtime. Credential rotation and connection changes hot-reload without service restarts or redeployments.

**🔒 Security Without Compromise**
Credentials are AES-256-GCM encrypted at rest and decrypted only in memory at connection time. Sensitive field values are masked in all logs. Credentials never appear in API responses or event payloads.
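The errors-stay-with-records principle can be sketched in a few lines of Kotlin. The Record, ValidationError, and Severity types below are illustrative assumptions, not the platform's actual model:

```kotlin
import kotlinx.coroutines.flow.*
import kotlinx.coroutines.runBlocking

// Illustrative types (the platform's actual model may differ).
enum class Severity { WARNING, ERROR, FATAL }
data class ValidationError(val field: String, val message: String, val severity: Severity)
data class Record(val values: Map<String, String>, val errors: List<ValidationError> = emptyList())

// A validation stage never throws: it returns the record with errors attached.
fun Flow<Record>.validateAmountPositive(): Flow<Record> = map { record ->
    val amount = record.values["amount"]?.toBigDecimalOrNull()
    if (amount == null || amount.signum() <= 0)
        record.copy(errors = record.errors +
            ValidationError("amount", "must be a positive number", Severity.FATAL))
    else record
}

fun main() = runBlocking {
    val records = flowOf(
        Record(mapOf("amount" to "12.50")),
        Record(mapOf("amount" to "-3"))
    ).validateAmountPositive()

    // FATAL records are diverted to quarantine; everything else flows on
    // with its errors attached. The pipeline itself never stops.
    val (quarantined, delivered) = records.toList()
        .partition { r -> r.errors.any { it.severity == Severity.FATAL } }
    println("delivered=${delivered.size}, quarantined=${quarantined.size}")
}
```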

## Supported File Formats

| Format | Type | Module |
| --- | --- | --- |
| CSV / Delimited | Character-separated values, any delimiter | platform-core |
| Fixed-Width | Positional flat files (ACH, NACHA, mainframe exports) | platform-core |
| XML | XPath-mapped field extraction | platform-core |

Adding a new format requires implementing FileParser for inbound or FileGenerator for outbound — both are single-method interfaces. See Adding a Parser for the step-by-step guide.
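For the outbound direction, a generator extension might look like the sketch below. The FileGenerator signature and the FileSpec and Record shapes are assumptions for illustration, mirroring the parser side:

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.collect
import org.springframework.stereotype.Component
import java.io.OutputStream

// Assumed shapes (illustration only).
data class FileSpec(val delimiter: String, val fields: List<String>)
data class Record(val values: Map<String, String>)

interface FileGenerator {
    suspend fun generate(records: Flow<Record>, spec: FileSpec, out: OutputStream)
}

// Writes one delimited line per record, streaming: records are consumed
// as they arrive and are never buffered in full.
@Component
class DelimitedFileGenerator : FileGenerator {
    override suspend fun generate(records: Flow<Record>, spec: FileSpec, out: OutputStream) {
        val writer = out.bufferedWriter()
        records.collect { record ->
            writer.appendLine(
                spec.fields.joinToString(spec.delimiter) { record.values[it].orEmpty() }
            )
        }
        writer.flush()
    }
}
```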


## Where to Go Next