Data Processing Engine
Data Platform automation service: transform data, run and orchestrate production-grade ETL/ELT workflows.
Process
Execute batch data processing jobs to extract, transform and load data from any source to any destination.
Automate
Create workflows with a low-code visual builder and schedule them to run in order to automate data-intensive tasks.
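Under the hood, "run in order" means executing tasks in dependency order. As a minimal sketch of that idea (the task names and dependency graph here are hypothetical, not the visual builder's actual model), Python's standard-library `graphlib` can resolve a workflow's execution order:

```python
from graphlib import TopologicalSorter

# Hypothetical workflow: each task name maps to the tasks it depends on.
workflow = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_workflow(dag):
    """Run tasks in dependency order and return the execution sequence."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        pass  # each task's real logic would run here
    return order

print(run_workflow(workflow))
```

A scheduler then only needs to trigger `run_workflow` on the chosen cadence; the dependency graph guarantees no task runs before its inputs exist.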
Develop
Code and run any custom Python or PySpark script and leverage a complete SDK with over 40 connectors.
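A custom script typically follows the extract-transform-load shape described above. This is an illustrative, standard-library-only sketch (the inline CSV source and the filter threshold are invented for the example; the SDK's real connectors would replace the extract and load steps):

```python
import csv, io, json

# Hypothetical source: inline CSV order data; destination: JSON lines.
SOURCE_CSV = "order_id,amount\n1,19.90\n2,5.50\n"

def extract(text):
    """Read CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast fields to numeric types and keep only orders above 10."""
    return [{"order_id": int(r["order_id"]), "amount": float(r["amount"])}
            for r in rows if float(r["amount"]) > 10]

def load(rows):
    """Serialise rows as JSON lines, standing in for a write to a destination."""
    return "\n".join(json.dumps(r) for r in rows)

print(load(transform(extract(SOURCE_CSV))))
```

The same three-stage structure scales up directly to a PySpark job, swapping the in-memory lists for DataFrames.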
Iterate
Organise and version your code via native versioning systems or Git integration.
Join the Data Platform Beta and try it for free
Who is this for?
Data Engineers
Create pipelines that extract data from enterprise data sources and aggregate it into data warehouse tables on whatever schedule you need.
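The aggregation step of such a pipeline can be sketched with the standard-library `sqlite3` module standing in for a warehouse (the `events` table and its values are invented for illustration):

```python
import sqlite3

# Illustrative stand-in for a warehouse: aggregate raw events into a summary table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (region TEXT, revenue REAL);
    INSERT INTO events VALUES ('eu', 100.0), ('eu', 50.0), ('us', 80.0);
    CREATE TABLE revenue_by_region AS
        SELECT region, SUM(revenue) AS total FROM events GROUP BY region;
""")
rows = conn.execute(
    "SELECT region, total FROM revenue_by_region ORDER BY region"
).fetchall()
print(rows)
```

In production the same `GROUP BY` aggregation would run against the platform's lakehouse tables rather than an in-memory database.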
MLOps Engineers
Carry out all the data cleaning and feature engineering needed for the training of machine learning (ML) models.
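Two of the most common cleaning and feature-engineering steps are imputing missing values and scaling features. A minimal standard-library sketch (the `ages` column and the mean-imputation choice are examples, not a prescribed recipe):

```python
from statistics import mean

# Hypothetical raw column with a missing value: impute None with the column
# mean, then min-max scale the feature into [0, 1] for model training.
ages = [20, 40, None, 60]

def impute(values):
    """Replace None with the mean of the observed values."""
    m = mean(v for v in values if v is not None)
    return [m if v is None else v for v in values]

def min_max_scale(values):
    """Rescale values linearly so the minimum maps to 0 and the maximum to 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

features = min_max_scale(impute(ages))
print(features)
```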
Software Engineers
Deploy any data-intensive code, such as custom Python optimisation solvers that compute equation optima.
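As a toy illustration of such a solver (the objective function and learning rate are invented for the example), plain gradient descent finds the minimum of f(x) = (x - 3)²:

```python
def minimise(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2 has its minimum at x = 3; its derivative is f'(x) = 2(x - 3).
x_star = minimise(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_star, 6))
```

A real deployment would wrap a solver like this (or a library such as SciPy) in a scheduled job so optima are recomputed as fresh data arrives.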
Simple, transparent, pay-as-you-go pricing
Data Platform's all-inclusive, usage-based pricing means you pay only for what you use:
- High-performance storage used by the data you keep on the platform, billed per GB per month.
- Serverless computing for the lakehouse service, with queries billed per TB of data scanned.
- Reserved computing power billed per hour or per month, available for all Data Platform services.
No additional user licence or traffic costs, so you can scale data projects without blowing up budgets.
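The three billing dimensions above combine into a single monthly figure. A sketch of that arithmetic follows; the rates are purely hypothetical placeholders, not the platform's actual prices:

```python
# Illustrative only: these rates are hypothetical, not real Data Platform prices.
STORAGE_PER_GB_MONTH = 0.02   # high-performance storage, per GB per month
QUERY_PER_TB_SCANNED = 5.00   # serverless lakehouse queries, per TB scanned
COMPUTE_PER_HOUR = 1.50       # reserved computing power, per hour

def monthly_cost(gb_stored, tb_scanned, compute_hours):
    """Sum the three usage-based billing dimensions for one month."""
    return (gb_stored * STORAGE_PER_GB_MONTH
            + tb_scanned * QUERY_PER_TB_SCANNED
            + compute_hours * COMPUTE_PER_HOUR)

print(monthly_cost(gb_stored=500, tb_scanned=2, compute_hours=100))
```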