Data Ingestion

Stay ahead of operation and maintenance
New ETL standard

Data Transfer (ETL) / Managed Data Transfer
Features

Helping you elevate your data
management experience

Supports ETL, ELT, and CDC, offering transfer methods that can be tailored to your specific needs.

ETL encompasses a range from no-code solutions to flexible conversion processing utilizing programming languages.
In contrast, ELT facilitates detailed data modeling through integration with dbt. Additionally, it enables data loading in the CDC (Change Data Capture) format, offering fully managed tracking of changes on the data source side.
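The difference between ETL and ELT is the order of operations: ETL transforms data before loading, while ELT loads raw data first and transforms it inside the warehouse (typically with dbt). A minimal sketch of the two orderings, with hypothetical function names that are not TROCCO's API:

```python
# Illustrative contrast of ETL vs. ELT ordering (not TROCCO internals).

def transform(row):
    # Example transformation: normalize an email address.
    return {**row, "email": row["email"].strip().lower()}

def etl(rows, warehouse):
    # ETL: transform first, then load only the cleaned rows.
    warehouse.extend(transform(r) for r in rows)

def elt(rows, raw_table, transformed_table):
    # ELT: load raw rows as-is, then transform inside the warehouse
    # (in practice this second step would be a dbt model / SQL).
    raw_table.extend(rows)
    transformed_table.extend(transform(r) for r in raw_table)

source = [{"id": 1, "email": "  Alice@Example.COM "}]
wh = []
etl(source, wh)
print(wh[0]["email"])  # alice@example.com
```

In the ELT case the raw table keeps the untouched source rows, which is what makes warehouse-side remodeling with dbt possible later.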

A rare offering worldwide: an ETL service grounded in DevOps principles.

This approach applies engineering best practices to data analysis infrastructure, featuring a range of functions designed to support full-scale operational phases:
Management of codebase configurations
Integration of Pull Request reviews with GitHub
Management of change history and rollback capabilities
Execution of tests prior to production deployment

Automatically tracks changes in data sources, reducing operation and maintenance efforts to zero.

The ETL pipeline's schema change detection tracks column and table alterations, including additions and deletions.
Managed data transfer monitors these changes and ensures seamless updates to data source APIs.
Automation within TROCCO® streamlines data management and minimizes manual monitoring of schema changes.
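Schema change detection can be thought of as a diff between two snapshots of a source table's columns. The sketch below is illustrative only (not TROCCO's implementation), comparing column-to-type mappings:

```python
# Minimal sketch of schema change detection between two snapshots of a
# source table (mapping column name -> type). Hypothetical, for illustration.

def diff_schema(old, new):
    added   = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    retyped = sorted(c for c in set(old) & set(new) if old[c] != new[c])
    return {"added": added, "removed": removed, "retyped": retyped}

before = {"id": "INT64", "name": "STRING", "score": "INT64"}
after  = {"id": "INT64", "name": "STRING", "score": "FLOAT64", "tags": "STRING"}

print(diff_schema(before, after))
# {'added': ['tags'], 'removed': [], 'retyped': ['score']}
```

A managed service runs a check like this on every sync and propagates the detected changes to the destination automatically.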

Basic ETL can be performed easily without coding, while also supporting customized implementations through the use of programming languages.

Enhances data management with features like masking, filtering, hashing, regex replacement, and standardization.
Offers programming support in Ruby and Python for tailored scripting and automation.
Streamlines ETL with GUI for basics and programming for complexity, ensuring ease and flexibility.
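The transformations named above (masking, hashing, regex replacement, filtering) are exposed as GUI options in the product; as plain Python they look roughly like this. Function names are illustrative, not TROCCO's API:

```python
import hashlib
import re

# Hedged sketch of common column-level transformations: masking, one-way
# hashing, regex replacement, and row filtering. Illustrative only.

def mask(value, keep=4):
    # Keep the last `keep` characters, mask the rest.
    return "*" * max(len(value) - keep, 0) + value[-keep:]

def hash_value(value):
    # One-way hash, e.g. to pseudonymize identifiers.
    return hashlib.sha256(value.encode()).hexdigest()

def redact_digits(text):
    # Regex replacement: collapse digit runs into a placeholder.
    return re.sub(r"\d+", "#", text)

rows = [{"card": "4242424242424242", "note": "call 555 0100"},
        {"card": "", "note": "ok"}]

# Filtering: drop rows with an empty card number.
kept = [r for r in rows if r["card"]]
print(mask(kept[0]["card"]))           # ************4242
print(redact_digits(kept[0]["note"]))  # call # #
```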

Featuring an intuitive and sophisticated UI/UX, we offer the most direct path for data engineering.

Our UI minimizes setup and operational frustration, giving engineers an efficient tool they will actually want to use for data engineering tasks.
Key features include draft-saving, label management, test runs, batch editing, connection verification, three-step setup, input suggestions, and code difference checks, all designed for the "shortest route" to task completion.
Technical Capabilities

Elevate Your Data Workflow and Streamline Analytics with Automated Schema, Custom Templates, and Dynamic Variables

Native support for BigQuery partitioned and clustered tables
Supports creating and updating BigQuery partitioned tables and clustered tables.
Automatic schema inference
If the transfer source is a file-based connector, the compression format, file format, and schema (data types) are automatically inferred.
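Type inference from file data can be sketched as trying progressively looser types against sample values. This is illustrative only; the type names and algorithm here are assumptions, not TROCCO's actual inference logic:

```python
# Minimal sketch of schema (data type) inference from sampled string values.
# Real inference also detects compression and file format, omitted here.

def infer_type(values):
    def is_int(v):
        try:
            int(v)
            return True
        except ValueError:
            return False

    def is_float(v):
        try:
            float(v)
            return True
        except ValueError:
            return False

    if all(is_int(v) for v in values):
        return "long"
    if all(is_float(v) for v in values):
        return "double"
    return "string"

sample = {"id": ["1", "2", "3"], "price": ["9.5", "12.0"], "name": ["a", "b"]}
schema = {col: infer_type(vals) for col, vals in sample.items()}
print(schema)  # {'id': 'long', 'price': 'double', 'name': 'string'}
```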
JSON type expansion
You can expand the source JSON data and map it to the destination column.
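Expanding JSON means flattening nested fields into individual destination columns. A minimal sketch, assuming dotted paths as the column-naming convention (the actual mapping rules are configurable in the product):

```python
import json

# Illustrative sketch of expanding a JSON source column into flat
# destination columns keyed by dotted paths.

def expand_json(record, prefix=""):
    flat = {}
    for key, value in record.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(expand_json(value, f"{path}."))
        else:
            flat[path] = value
    return flat

row = json.loads('{"user": {"id": 7, "geo": {"country": "JP"}}, "event": "click"}')
print(expand_json(row))
# {'user.id': 7, 'user.geo.country': 'JP', 'event': 'click'}
```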
Advertising report template
TROCCO®'s unique report templates streamline data extraction from advertising connectors and suggest the most suitable fields for advertising analysis.
Test run the job
You can perform a test run to check whether a job will execute without problems using the created transfer settings.
Custom variables
You can embed custom variables throughout your ETL pipeline settings. It can be used for backfill purposes such as manual re-execution, and supports flexible execution methods such as loop execution.
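Custom variables make backfills a loop: the same transfer setting is executed once per period with the variable substituted each run. The `${target_date}` syntax below is illustrative, not necessarily TROCCO's exact notation:

```python
from datetime import date, timedelta
from string import Template

# Sketch of custom-variable expansion for a backfill: one transfer setting,
# executed once per day with ${target_date} substituted on each run.

setting = Template("SELECT * FROM events WHERE dt = '${target_date}'")

def backfill(start, days):
    queries = []
    for i in range(days):
        d = start + timedelta(days=i)
        queries.append(setting.substitute(target_date=d.isoformat()))
    return queries

for q in backfill(date(2024, 1, 1), 3):
    print(q)  # dt = '2024-01-01', then '2024-01-02', then '2024-01-03'
```

The same mechanism covers manual re-execution: re-running a single failed day is just one iteration of the loop with that day's value.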
How it works

Automated data replication architecture

From source to destination, our core functionality automates the extract and load replication process for all of our connectors, so you can enjoy total pipeline peace of mind.
Data Integration / Ingestion
Begin by swiftly connecting to any data source, enabling the collection of diverse datasets within minutes.
Data Transformation
Convert the ingested raw data into structured business data models that are ready for analysis.
Data Orchestration
Automate and optimize the entire data flow, from initial ingestion to final storage.
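The three stages above can be sketched as plain functions chained end to end. This is a toy illustration of the flow, not the product's architecture; real pipelines run these stages as scheduled, monitored jobs:

```python
# Toy end-to-end sketch: ingestion -> transformation -> orchestration.

def ingest():
    # Stage 1: collect raw records from a source.
    return [{"amount": "10"}, {"amount": "32"}]

def transform(rows):
    # Stage 2: shape raw data into an analysis-ready model.
    return [{"amount": int(r["amount"])} for r in rows]

def orchestrate():
    # Stage 3: run the flow end to end and load/aggregate the result.
    destination = transform(ingest())
    return sum(r["amount"] for r in destination)

print(orchestrate())  # 42
```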

Still curious?

Watch our live demo video to see the platform in action. Witness firsthand how our ETL pipelines can transform your data processes, making them more efficient and effective.
Book a Demo

Frequently Asked Questions

01.
How to fix the error that occurs when the transfer volume from BigQuery is too large
Note: This is a machine-translated version of the original Japanese article; some of the information it contains may be inaccurate.
Summary: When specifying BigQuery as the transfer source, an error may occur if ...
02.
How to specify elements and extract values when an array is included in the source column JSON
Summary: If the JSON of the transfer source column contains an array and you wan...
03.
How to generate a webhook URL in Slack
Summary: Learn how to issue the webhook URL required for notifications to Slack....
04.
Is it possible to increase the transfer speed?
05.
Do you support transfers from an on-premise environment?

TROCCO is a trusted partner of, and certified with, several hyperscalers.