One-Stop Data Warehouse Building
DLF supports one-stop building of cloud data warehouses, where you can complete data integration, script development, job development, job scheduling, job monitoring, and data management without the need for multiple tools.
Data Lake Development
DLF manages a variety of Big Data services, such as DWS and DLI, and allows jobs to be orchestrated and scheduled across different types of data services.
Diverse Data Types
DLF allows online collaborative development, supports online editing and real-time querying of SQL and Shell scripts, and enables job development for multiple types of data processing nodes, such as Data Migration, SQL, MR, Shell, Machine Learning, and Spark.
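As an illustration of the kind of Shell script that can be edited and debugged online, the sketch below cleanses a raw CSV extract before loading. The file names and sample data are hypothetical, not part of DLF:

```shell
#!/bin/sh
# Hypothetical cleansing step: create a small raw extract, then drop
# malformed rows (fewer than three fields) before loading.
printf 'id,region,amount\n1,east,100\n2,west\n3,east,250\n' > raw.csv

# Keep only rows with exactly three comma-separated fields.
awk -F',' 'NF == 3' raw.csv > clean.csv

# The header plus the two well-formed rows remain.
cat clean.csv
```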
Powerful Job Scheduling Capabilities
DLF provides you with diverse scheduling policies and powerful scheduling capabilities, supporting manual, periodic, and event-driven scheduling.
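Event-driven scheduling means a downstream job starts as soon as an upstream condition is met. DLF provides this natively; the marker file and check below are purely illustrative:

```shell
#!/bin/sh
# Sketch of an event-driven trigger: the downstream job runs only after
# an upstream "ready" marker appears. All file names are hypothetical.
touch upstream.ready          # simulate the upstream job completing

if [ -f upstream.ready ]; then
  echo "downstream job triggered" > run.log
fi
cat run.log
```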
Automated Data Analysis Service Flow
DLF automates the end-to-end (E2E) procedure from data import and cleansing, through machine learning and data backhaul, to report generation.
Orchestrates jobs of multiple cloud services, such as data migration, MR, Spark, Machine Learning, and SQL.
Supports GUI-based orchestration and can be used out of the box.
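The E2E flow above can be sketched as a toy three-stage pipeline, where each stage stands in for a cloud-service job that DLF would orchestrate. Every file name and data value here is made up for illustration:

```shell
#!/bin/sh
# Toy sketch of the flow: import -> cleanse -> report.
printf '5\n-1\n12\n' > imported.txt                # data import
grep -v '^-' imported.txt > cleansed.txt          # cleansing: drop negatives
awk '{ s += $1 } END { print "total=" s }' cleansed.txt > report.txt  # report
cat report.txt
```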
Automated Generation of Complex BI Reports
DLF enables flexible development and automated generation of BI reports.
Supports SQL and Shell script development and multiple types of data warehouses, addressing rapid development requirements for new reports.
Develops workflows using drag-and-drop on a GUI, provides flexible dependency and scheduling policies, and automates generation of BI reports.
Quick Building of Cloud Data Warehouses
DLF can quickly migrate offline data to the cloud and integrate the data into the Big Data cloud services. With DLF Console, you can directly carry out data development, making data warehouse building easier than ever before.
With a unified configuration on the DLF Console, both online and offline data can be quickly integrated into cloud data warehouses with just one click.
You can build data warehouses on whichever services you want, such as DWS, DLI, and HBase.
One-stop data warehouse building and stable service provisioning free you from building and maintaining Big Data clusters, dramatically reducing your costs in data warehouse building and improving the security of your data on the cloud.
Easy Analysis and Mining of Massive Amounts of Logs
After ingesting logs into OBS or Cloud Search Service through DIS, you can simply use DLF to analyze and mine massive amounts of logs by writing data development and data mining scripts.
Using Big Data analysis and mining capabilities becomes as simple as using utilities such as water or electricity, helping you easily explore your business value.
Log data can be quickly analyzed and mined on the one-stop development interface, several times faster than before.
You can leverage SQL scripts, MR scripts, Shell scripts, and Machine Learning scripts to perform data analysis and mining for different business scenarios.
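A log-mining script of the kind described above might count requests per status code. The access-log lines below are invented for illustration and do not come from any real service:

```shell
#!/bin/sh
# Hypothetical log-mining step: tally HTTP status codes in a sample log.
cat > access.log <<'EOF'
GET /index 200
GET /missing 404
POST /login 200
GET /index 200
EOF

# Count occurrences of each status code (third field) and sort the summary.
awk '{ count[$3]++ } END { for (c in count) print c, count[c] }' access.log \
  | sort > status_summary.txt
cat status_summary.txt
```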
Visually designs data models to adapt to personalized service requirements.
Easily integrates various data sources into data warehouses.
Supported data warehouse types including DWS, DLI, and MRS Hive
Visualized table building and data table management
Seamless integration of DLF and CDM
CDM-enabled data migration among 20+ heterogeneous data sources, driving smooth integration of data sources into data warehouses
Edits and debugs scripts online and orchestrates workflows using drag-and-drop.
Supports flexible scheduling, real-time monitoring, simple management and maintenance, and prompt alarm reporting for jobs.
Online collaborative development of SQL and Shell scripts
Drag-and-drop workflow development
Multiple supported task types including data integration, SQL, MR, Spark, Shell, machine learning, and REST
Flexible scheduling cycles by minute, hour, day, and week
Multiple alarming modes and manual retry of jobs
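The alarming and retry behavior above can be sketched as a retry-then-alarm loop. The `job_step` function and the retry count are hypothetical stand-ins, not DLF's actual mechanism:

```shell
#!/bin/sh
# Sketch of retry-then-alarm: retry a failing job a few times, then
# raise an alarm. job_step stands in for a job run that keeps failing.
job_step() { return 1; }

attempts=0
max_retries=3
until job_step; do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge "$max_retries" ]; then
    echo "ALARM: job failed after $attempts attempts" > alarm.log
    break
  fi
done
cat alarm.log
```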