Overview
When running COPY FROM queries, you have the option to include the endpoint URL. This feature is especially useful in scenarios where you need to provide credentials and specific endpoints.

Syntax
The syntax is as follows:
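A minimal sketch of the statement shape, assuming credentials are passed in a WITH clause and that AWS_CRED takes its parameters positionally (both details are assumptions based on the parameter list below):

```sql
COPY table_name FROM 'file_path'
WITH (
    AWS_CRED('aws_region', 'key_id', 'access_key', 'endpoint_url')
);
```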
Replace AWS_CRED with AZURE_CRED or GCS_CRED when copying from Azure Blob Storage or Google Cloud Storage (rough sketches of those variants follow the parameter list below).

- Shared parameters:
  - table_name: the table that will receive data from the file
  - file_path: link to the file location, accessible from the server
- Parameters in AWS_CRED:
  - aws_region: AWS region associated with the storage service (e.g. ‘region1’)
  - key_id: key identifier for authentication
  - access_key: access key for authentication
  - endpoint_url: URL endpoint for the storage service
- Parameters in GCS_CRED:
  - <path_to_credentials>: path to the JSON credentials file
  - <json_credentials_string>: contents of the GCS credentials file
- Parameters in AZURE_CRED:
  - tenant_id: tenant identifier representing your organization’s identity in Azure
  - client_id: client identifier used for authentication
  - client_secret: secret identifier acting as a password for authentication
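Under the same assumed WITH-clause form, the alternative credential clauses might look like this (parameter names mirror the lists above):

```sql
-- Google Cloud Storage: pass either a path to the credentials file
-- or the file's JSON contents as a string
COPY table_name FROM 'file_path'
WITH (GCS_CRED('<path_to_credentials>'));

-- Azure Blob Storage
COPY table_name FROM 'file_path'
WITH (AZURE_CRED('tenant_id', 'client_id', 'client_secret'));
```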
Examples
COPY FROM with AWS S3 Bucket
In this example, we are using the COPY FROM statement to import data from a file named students_file; the endpoint is s3.us-east-2.amazonaws.com.
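A sketch of the statement, assuming the WITH (AWS_CRED(...)) form from the Syntax section; the key ID and access key are placeholders:

```sql
COPY students FROM 'students_file'
WITH (
    AWS_CRED(
        'us-east-2',                   -- aws_region
        'your_key_id',                 -- key_id (placeholder)
        'your_access_key',             -- access_key (placeholder)
        's3.us-east-2.amazonaws.com'   -- endpoint_url
    )
);
```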
students_file is copied into the students table.
COPY FROM with Google Cloud Storage
This example shows how to use the COPY FROM statement to import data, but this time the data is stored on Google Cloud Storage:
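A sketch using GCS_CRED with a path to the credentials file (the path is a placeholder):

```sql
COPY project FROM 'project_file'
WITH (GCS_CRED('/path/to/credentials.json'));
```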
Instead of providing the path to the credentials.json file, you can also pass its contents as a string, as shown in the sketch below. Make sure that it is valid JSON.
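A sketch of the string form; the JSON is abbreviated placeholder content, not real credentials:

```sql
COPY project FROM 'project_file'
WITH (GCS_CRED('{"type": "service_account", "project_id": "your-project-id", "private_key": "..."}'));
```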
You can also connect using AWS_CRED like below, with the following endpoint: https://storage.googleapis.com.
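A sketch of that form, again assuming AWS_CRED's positional parameters; the HMAC access ID and secret are placeholders:

```sql
COPY project FROM 'project_file'
WITH (
    AWS_CRED(
        'auto',                            -- aws_region (placeholder)
        'your_hmac_access_id',             -- key_id: HMAC access ID (placeholder)
        'your_hmac_secret',                -- access_key: HMAC secret (placeholder)
        'https://storage.googleapis.com'   -- endpoint_url
    )
);
```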
project_file is copied into the project table.
For Google Cloud Storage, it’s recommended to use HMAC keys for authentication. You can find more details about that on the HMAC keys - Cloud Storage page.
COPY FROM with Azure Blob Storage
It’s a similar story for getting the data from Azure Blob Storage; in the sketch below, your_blob is copied into the taxi_data table.
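A sketch, assuming the same WITH-clause form; all three identifiers are placeholders:

```sql
COPY taxi_data FROM 'your_blob'
WITH (
    AZURE_CRED(
        'your_tenant_id',      -- tenant_id (placeholder)
        'your_client_id',      -- client_id (placeholder)
        'your_client_secret'   -- client_secret (placeholder)
    )
);
```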