        7954 results found.
        • Connecting OBS to Big Data Components - Object Storage Service

          Connecting OBS to Big Data Components: Supported Big Data Components, Connecting Hadoop to OBS, Connecting Hive to OBS, Connecting Spark to OBS, Connecting Presto to OBS, Connecting Flume to OBS, Connecting DataX to OBS, Connecting Druid to OBS, Connecting Flink to OBS, Connecting Logstash

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios
        • Automatically Creating an OBS Volume Through kubectl - Cloud Container Engine

          Automatically Creating an OBS Volume Through kubectl. Scenario: the required OBS buckets can be created automatically and mounted as volumes.

          Help > Cloud Container Engine > User Guide > Storage Management: FlexVolume (Deprecated) > Using OBS Buckets as Storage Volumes
        • Authorizing Access to an OBS Bucket - Video On Demand

          Authorizing Access to an OBS Bucket. Function: authorizes VOD to access an OBS bucket, or cancels the authorization. URI: PUT /v1.0/{project_id}/asset/authority. Table 1 (path parameters): project_id (mandatory, String): Project ID.

          Help > Video On Demand > API Reference > Uploads media files
        • API Overview of OBS SDK for Python - Object Storage Service

          API Overview of OBS SDK for Python Table 1 describes the APIs provided by OBS SDK for Python. You can click an API name in the table to see its detailed information and sample code.

          Help > Object Storage Service > Python
        • OBS Result Table - Data Lake Insight

          OBS Result Table Function The FileSystem result (sink) table is used to export data to the HDFS or OBS file system. It is applicable to scenarios such as data dumping, big data analysis, data backup, and active, deep, or cold archiving.

          Help > Data Lake Insight > Flink SQL Syntax Reference > Flink OpenSource SQL 1.15 Syntax Reference > Connectors > OBS
        • Delete OBS - DataArts Studio

          Delete OBS. Constraints: this function depends on OBS. Function: the Delete OBS node is used to delete a bucket or directory on OBS. Parameters: Table 1 and Table 2 describe the parameters of the Delete OBS node.

          Help > DataArts Studio > User Guide > DataArts Factory > Node Reference
        • Naming OBS Objects - Graph Engine Service

          Naming OBS Objects. The OBS object names supported by GES can contain the following characters: letters, digits 0-9, a-z, A-Z, and the special characters ! - _ . * ' ( ). The following special characters are not supported: \{^}%`]">[~<#|&@:,$=+?

          Help > Graph Engine Service > API Reference > Before You Start > Constraints and Limitations on Using GES
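The character rule quoted in this result can be sketched as a small validator. The function name, regex, and sample names below are illustrative assumptions, not part of GES or OBS.

```python
import re

# Characters the GES constraint above permits in OBS object names:
# letters, digits 0-9, and the special characters ! - _ . * ' ( )
_ALLOWED = re.compile(r"^[A-Za-z0-9!\-_.*'()]+$")

def is_valid_ges_object_name(name: str) -> bool:
    """Illustrative check: True only if every character is in the allowed set."""
    return bool(_ALLOWED.match(name))

print(is_valid_ges_object_name("graph_data-v1.csv"))  # True
print(is_valid_ges_object_name("bad#name"))           # False: '#' is disallowed
```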
        • Create OBS - DataArts Studio

          OBS Path: Yes. Path to the OBS bucket or directory. To create a bucket, enter //OBS bucket name; the OBS bucket name must be unique. To create an OBS directory, select the path to the OBS directory to be created, and enter /Directory name after that path.

          Help > DataArts Studio > User Guide > DataArts Factory > Node Reference
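The path convention this result describes (a bare bucket is written as //bucket name; a directory is that path plus /directory name) can be sketched as a tiny helper. The helper name is hypothetical, for illustration only.

```python
def obs_create_path(bucket: str, directory: str = "") -> str:
    """Build a path string following the convention quoted above:
    //<bucket> for a bucket, //<bucket>/<directory> for a directory."""
    path = f"//{bucket}"
    if directory:
        # Trim stray slashes so the separator appears exactly once.
        path += f"/{directory.strip('/')}"
    return path

print(obs_create_path("my-unique-bucket"))           # //my-unique-bucket
print(obs_create_path("my-unique-bucket", "logs/"))  # //my-unique-bucket/logs
```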
        • Using OBS - Object Storage Service

          Using OBS. You can manage OBS resources in the ways listed in a table (Tool, Description, How to Use, Reference). OBS Console: a web-based GUI; you can manage all your OBS resources through this console. Create an account or an IAM user to log in to OBS Console.

          Help > Object Storage Service > Operation Guide (Leaving soon. Moving to User Guide.) > Before You Start
        • Accessing OBS - VPC Endpoint

          You can only access OBS using the OBS domain name in the region where the VPC endpoint is located. Configure an OBS route from your on-premises data center to the Direct Connect or VPN gateway. The IP address of OBS belongs to 100.125.0.0/16.

          Help > VPC Endpoint > User Guide
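The snippet above states that OBS IP addresses belong to 100.125.0.0/16. A quick membership check (useful, say, when auditing on-premises routes) can be done with the standard ipaddress module; the function name is illustrative.

```python
import ipaddress

# Range quoted in the VPC Endpoint result above.
OBS_RANGE = ipaddress.ip_network("100.125.0.0/16")

def looks_like_obs_ip(ip: str) -> bool:
    """Illustrative check: is this address inside 100.125.0.0/16?"""
    return ipaddress.ip_address(ip) in OBS_RANGE

print(looks_like_obs_ip("100.125.3.7"))  # True
print(looks_like_obs_ip("10.0.0.1"))     # False
```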
        • Connecting Flume to OBS - Object Storage Service

          OBS and HDFS differ in consistency assurance.

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Connecting Druid to OBS - Object Storage Service

          Parent Topic: Connecting OBS to Big Data Components

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Connecting Logstash to OBS - Object Storage Service

          /conf/file2obs.conf Parent Topic: Connecting OBS to Big Data Components

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Connecting DataX to OBS - Object Storage Service

          Example: txtfilereader is the source, and OBS is the destination.

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Compatibility Between OBS APIs and PFS - Object Storage Service

          Compatibility Between OBS APIs and PFS You can call some OBS APIs to use PFS. There may be additional requirements when you call these APIs. For details about the OBS APIs, see Object Storage Service API Reference.

          Help > Object Storage Service > Parallel File System (Leaving soon. Moving to User Guide.) > Using PFS with OBS APIs
        • Connecting MRS to OBS - Object Storage Service

          Parent Topic: Connecting Big Data Platforms to OBS

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting Big Data Platforms to OBS
        • Connecting Spark to OBS - Object Storage Service

          Check whether the connection is successful:
          $SPARK_HOME/bin/run-example org.apache.spark.examples.JavaWordCount obs://obs-bucket/input/test.txt
          Parent Topic: Connecting OBS to Big Data Components

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Connecting Hive to OBS - Object Storage Service

          by ","; insert into table student select 6,"yangdong",29; Parent Topic: Connecting OBS to Big Data Components

          Help > Object Storage Service > Best Practices > Using OBS to Decouple Storage from Compute in Big Data Scenarios > Connecting OBS to Big Data Components
        • Where Is Data Stored in OBS? - Object Storage Service

          Where Is Data Stored in OBS? When creating a bucket on OBS, you can specify a region for the bucket. Then your data on OBS is stored on multiple storage devices in this region. Parent topic: OBS Basics

          Help > Object Storage Service > FAQs
        • Using ma-cli to Copy OBS Data - ModelArts

          /test/ obs://your-bucket/copy-data/
          # Download OBS file to local path
          ma-cli obs-copy obs://your-bucket/copy-data/test.zip ./test.zip
          # Download OBS directory to local path
          ma-cli obs-copy obs://your-bucket/copy-data/ .

          Help > ModelArts > DevEnviron > ModelArts CLI Command Reference
        © 2025, Huawei Cloud Computing Technologies Co., Ltd. and/or its affiliates. All rights reserved.