Subscribe to model assets (optional), training resources (optional), and inference resources. The paid or free models you have subscribed to can then be used in the agent.
NOTE: If the service address contains authentication information, you are advised to use the platform's authorization check (API Key) function.
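As an illustration only, the following is a minimal sketch of calling a service address with an API key, assuming the key is accepted as a Bearer token in the Authorization header; the URL, key, and request body shown are placeholders, not a confirmed request schema.

```python
import requests

# Hypothetical service address and API key issued by the platform (placeholders).
SERVICE_URL = "https://example.com/v1/infer"
API_KEY = "your_api_key"

resp = requests.post(
    SERVICE_URL,
    headers={
        "Authorization": f"Bearer {API_KEY}",   # assumption: key passed as a Bearer token
        "Content-Type": "application/json",
    },
    json={"prompt": "hello"},                   # hypothetical request body
)
resp.raise_for_status()
print(resp.json())
```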
Enter the load balancer name (in the format mas-xxx), set the listening port (30000 to 40000), and set the protocol type (HTTP or HTTPS, used when calling the inference model). Click OK.
They include services such as Object Storage Service (OBS) and Content Delivery Network (CDN).
Figure 4 Selecting a scope
Click OK.
Figure 2 Obtaining the call path of a workflow (2)
Obtaining a Token
Obtain the token by following the instructions in "Calling REST APIs" > "Authentication" in API Reference.
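The following is a minimal sketch of obtaining a token through the IAM token API (POST /v3/auth/tokens); the region, user name, password, domain, and project name are placeholders that you must replace with your own values. The token is returned in the X-Subject-Token response header.

```python
import requests

# IAM endpoint; the region (cn-north-4) is a placeholder.
IAM_ENDPOINT = "https://iam.cn-north-4.myhuaweicloud.com/v3/auth/tokens"

payload = {
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {
                "user": {
                    "name": "your_user_name",           # IAM user name (placeholder)
                    "password": "your_password",        # IAM user password (placeholder)
                    "domain": {"name": "your_domain"},  # account name (placeholder)
                }
            },
        },
        "scope": {"project": {"name": "cn-north-4"}},   # project name (placeholder)
    }
}

resp = requests.post(IAM_ENDPOINT, json=payload)
resp.raise_for_status()

# The token is carried in the X-Subject-Token response header.
token = resp.headers["X-Subject-Token"]
print(token[:32], "...")
```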
Response Parameters
Streaming (with stream set to true in the header)
Status code: 200
Table 4 Data units output in streaming mode
Parameter | Type | Description
data | String | If stream is set to true, agent execution messages are returned in streaming mode.
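Below is a minimal sketch of consuming the streaming output described in Table 4; the workflow URL, request body fields, and the assumption that each data unit arrives as an SSE-style "data:" line are placeholders rather than a confirmed request schema. The token is the one obtained in the previous step.

```python
import requests

WORKFLOW_URL = "https://example.com/agent/run"   # hypothetical call path (placeholder)

headers = {
    "X-Auth-Token": "<token>",    # token obtained from IAM in the previous step
    "Content-Type": "application/json",
    "stream": "true",             # per this section, stream is set to true in the header
}
body = {"input": "hello"}         # hypothetical request body

with requests.post(WORKFLOW_URL, headers=headers, json=body, stream=True) as resp:
    resp.raise_for_status()
    # Iterate over the response line by line and print each data unit as it arrives.
    for line in resp.iter_lines(decode_unicode=True):
        if not line:
            continue
        # Strip the SSE "data:" prefix if present (assumption about the wire format).
        chunk = line[len("data:"):].strip() if line.startswith("data:") else line
        print(chunk)
```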
Flexible MCP service: MCP uses a universal, standardized protocol to expose tools and data through the MCP server (develop once, connect without limit). This lets AI agents communicate with thousands of external tools and data sources more efficiently and conveniently.
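As a rough illustration of the develop-once model, the sketch below uses the open-source MCP Python SDK (an assumption; the platform may provide its own server tooling) to expose a single tool through an MCP server, which any MCP-capable agent can then connect to.

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical MCP server exposing one tool; the server name and tool are examples only.
mcp = FastMCP("demo-tools")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city (illustrative stub)."""
    return f"Sunny in {city}"

if __name__ == "__main__":
    # Runs the server over the default stdio transport so an agent can connect to it.
    mcp.run()
```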