Data Pipeline Service
Data Pipeline Service (DPS) assembles data processing components into a pipeline (a data processing workflow) and then schedules, runs, manages, and monitors these pipelines. Using pre-packaged activities, including those for data copy and format conversion, DPS reliably processes and moves data between internal data sources. Creating complex pipelines that are reusable, fault-tolerant, and highly available is easier than ever.
Charged monthly or yearly, starting from CNY 85/pipeline/month. Pay for just 10 months and enjoy 1 year of service.
With a GUI and a variety of data processing components, pipeline orchestration is made simpler.
DPS schedules pipelines at a specified time or frequency, and lets you customize schedule configurations, keeping pipelines flexible and controllable.
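To picture what fixed-frequency scheduling means in practice, the sketch below derives a pipeline's run times from a start time and an interval. The function and field choices here are illustrative assumptions, not the DPS API:

```python
from datetime import datetime, timedelta

def run_times(start: datetime, every: timedelta, count: int) -> list[datetime]:
    """Compute the first `count` scheduled run times for a fixed-frequency pipeline."""
    return [start + i * every for i in range(count)]

# A hypothetical pipeline scheduled daily at 02:00, starting 2024-01-01.
runs = run_times(datetime(2024, 1, 1, 2, 0), timedelta(days=1), 3)
for t in runs:
    print(t.isoformat())
```

In DPS the equivalent choice is made through schedule configuration in the console rather than code.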
Everything about a pipeline, from creation to scheduling and running, is visible.
DPS runs pipelines, monitors events, and reports alarms without user intervention.
Data Movement Between Cloud Products
If you use several cloud products, each storing its own data, you may find it difficult to move data between them. DPS provides data transmission channels that move the data for you quickly.
Data Format Conversion
Do you run into problems converting a large volume of data from one format to another? The pre-packaged data conversion activities of DPS handle conversion quickly. That's not all: there are many more features worth your attention, such as compression, encryption, and decryption.
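Conceptually, a conversion activity combined with compression works like the following sketch, which turns CSV records into JSON Lines and gzip-compresses the result. The function name is illustrative; in DPS this is done through activity configuration, not code:

```python
import csv
import gzip
import io
import json

def csv_to_jsonl_gz(csv_text: str) -> bytes:
    """Convert CSV text to gzip-compressed JSON Lines (one JSON object per row)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    jsonl = "\n".join(json.dumps(row) for row in rows)
    return gzip.compress(jsonl.encode("utf-8"))

data = "id,name\n1,alice\n2,bob"
compressed = csv_to_jsonl_gz(data)
print(gzip.decompress(compressed).decode("utf-8"))
```

Encryption could be chained in the same way as a further step over the compressed bytes.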
Scheduled Batch Task Execution
Deep data analysis often requires a variety of complex tasks. DPS takes care of task configuration and scheduling, automatic monitoring, complex exception handling, and data recovery.
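The exception handling and recovery that DPS automates can be pictured as a retry loop around each task. This minimal sketch reruns a failed task up to a fixed number of attempts; the retry policy shown is an illustrative assumption, not the DPS mechanism:

```python
import time

def run_with_retry(task, max_attempts: int = 3, delay_s: float = 0.0):
    """Run `task` until it succeeds or the attempt budget is exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # surface the failure so an alarm can be reported
            time.sleep(delay_s)  # back off before the recovery attempt

# A flaky task that fails twice before succeeding.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("transient failure")
    return "done"

print(run_with_retry(flaky))  # prints "done" after two retries
```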
One-Stop Console for Various Cloud Products
By integrating with common big data processing components, DPS supports quick redirection and interactive configuration. Through a single console, you can monitor the operations, tasks, and data status of every data processing component, with an easy and consistent user experience.
Pipeline Creation and Management
Assembles data processing components (activities); defines component properties; moves, processes, and analyzes data.
Provides a consistent means of scheduling complex pipelines; processes data reliably; moves data between internal data sources or between computing and storage services at specified intervals.
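A pipeline of the kind described above can be pictured as an ordered list of activities applied to a dataset. The `Activity` type and the two sample steps below are illustrative assumptions, not a DPS interface:

```python
from typing import Callable

# An activity takes a batch of records and returns a processed batch.
Activity = Callable[[list[dict]], list[dict]]

def run_pipeline(records: list[dict], activities: list[Activity]) -> list[dict]:
    """Apply each activity to the output of the previous one."""
    for activity in activities:
        records = activity(records)
    return records

# Hypothetical activities: a data copy and a simple transformation.
copy_activity: Activity = lambda recs: [dict(r) for r in recs]
upper_names: Activity = lambda recs: [{**r, "name": r["name"].upper()} for r in recs]

result = run_pipeline([{"name": "alice"}], [copy_activity, upper_names])
print(result)  # [{'name': 'ALICE'}]
```

In DPS, the equivalent assembly is done by defining component properties in the console rather than writing code.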
Unified Operation and Monitoring
Monitors pipeline status and usage in real time; displays pipeline scheduling history; delivers a consistent configuration console experience.
Data Movement, Compression, and Conversion
Provides pre-packaged activities for data movement, compression, and conversion to reduce coding complexity.