Version: User Guides (BYOC)

Import Data (Console)

This page introduces how to import the prepared data on the Zilliz Cloud console.

Before you start

Make sure all prerequisite conditions are met before you begin.

Import data on the web UI

Once data files are ready, you can upload them to an object storage bucket for data imports.

📘Notes
  • You can have up to 10 running or pending import jobs in a collection.

  • The web console supports uploading a local JSON file of up to 1 GB. For larger files, upload them from an object storage bucket instead. If you have any difficulties with data import, please create a support ticket.
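As a sketch of the local-file path described above, the snippet below writes a small row-based JSON file with Python's standard library and checks that it stays under the 1 GB console limit. The field names (`id`, `vector`) and the `rows` layout are illustrative assumptions; use the fields defined in your collection schema and the format described in the data-preparation guide.

```python
import json
import os

SIZE_LIMIT = 1 * 1024**3  # 1 GB console upload limit

# Illustrative rows; "id" and "vector" are placeholder field names --
# substitute the fields from your own collection schema.
rows = [
    {"id": i, "vector": [0.1 * i, 0.2 * i, 0.3 * i]}
    for i in range(3)
]

# Row-based JSON layout (an assumption here; check the data-preparation
# guide for the exact structure your cluster expects).
with open("import_data.json", "w") as f:
    json.dump({"rows": rows}, f)

size = os.path.getsize("import_data.json")
assert size < SIZE_LIMIT, "File exceeds 1 GB; upload from an object storage bucket instead."
print(f"Wrote {len(rows)} rows ({size} bytes)")
```

Files over the limit should go through the remote-bucket path described in the next section.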

Remote files from an object storage bucket

To import remote files, you must first upload them to a remote bucket. You can easily convert your raw data into supported formats and upload the result files using the BulkWriter tool.

Once you have uploaded the prepared files to a remote bucket, select the object storage service, enter the path to the files in the bucket, and provide the bucket credentials so that Zilliz Cloud can pull the data from your bucket.
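To make the pieces concrete, here is a minimal sketch of assembling the information the console form asks for: an object path plus credentials. The function name, field names, and URI schemes below are illustrative assumptions, not an actual Zilliz Cloud API.

```python
from typing import Optional

# Hypothetical helper that collects the console form fields for a
# remote-bucket import: the object path plus the credentials Zilliz
# Cloud uses to pull the files. Names here are illustrative only.
def build_import_request(provider: str, bucket: str, prefix: str,
                         access_key: str, secret_key: str,
                         session_token: Optional[str] = None) -> dict:
    scheme = {"aws": "s3", "gcp": "gs"}.get(provider, "s3")
    request = {
        "objectUrl": f"{scheme}://{bucket}/{prefix.strip('/')}/",
        "accessKey": access_key,
        "secretKey": secret_key,
    }
    if session_token:  # session-based (temporary) credentials
        request["sessionToken"] = session_token
    return request

req = build_import_request("aws", "my-bucket", "exports/2024", "AK-example", "sk-example")
print(req["objectUrl"])  # s3://my-bucket/exports/2024/
```

Passing a `session_token` corresponds to the temporary-credential option discussed below; omitting it corresponds to long-term credentials.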

Based on your data security requirements, you can use either long-term credentials or session tokens during data import.

For more information about obtaining credentials, refer to the relevant guides.

For more information about using session tokens, refer to the FAQ.

📘Notes

Zilliz Cloud now allows you to import data from any object storage service to any Zilliz Cloud cluster, regardless of the cloud provider hosting the clusters. For instance, you can import data from an AWS S3 bucket to a Zilliz Cloud cluster deployed on GCP.

(Screenshot: importing remote files on the Zilliz Cloud console)

Verify results

You can view the progress and status of the import job on the Jobs page.

Supported object paths

For applicable object paths, refer to Tips on Import Paths.
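The authoritative path patterns are listed in Tips on Import Paths. As a rough illustration of the kind of pre-check you might run before filling in the console form, this sketch splits a bucket URI into its bucket and prefix with the standard library; the accepted schemes here (`s3`, `gs`) are assumptions, not the full supported list.

```python
from urllib.parse import urlparse

def split_object_path(uri: str) -> tuple:
    """Split an object-storage URI into (bucket, prefix).

    The schemes accepted here are illustrative; consult
    "Tips on Import Paths" for the paths your cluster accepts.
    """
    parsed = urlparse(uri)
    if parsed.scheme not in ("s3", "gs"):
        raise ValueError(f"Unsupported scheme: {parsed.scheme!r}")
    return parsed.netloc, parsed.path.lstrip("/")

bucket, prefix = split_object_path("s3://my-bucket/exports/2024/data.parquet")
print(bucket, prefix)  # my-bucket exports/2024/data.parquet
```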