Storage Options
How to authenticate using Activeloop storage, AWS S3, Microsoft Azure, and Google Cloud Storage.
Deep Lake datasets can be stored locally or on several cloud storage providers, including Deep Lake Storage, AWS S3, Microsoft Azure, and Google Cloud Storage. Datasets are accessed by choosing the correct prefix for the dataset path that is passed to methods such as deeplake.load(path) and deeplake.empty(path). The path prefixes are summarized in the table at the end of this section.
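As a quick illustration (the bucket, organization, and dataset names below are placeholders, and the datasets are assumed to already exist), the same API is used for every backend; only the path prefix changes:

```python
import deeplake

# The path prefix selects the storage backend; all names below are placeholders.
local_ds = deeplake.load('./existing_local_dataset')     # local folder
cloud_ds = deeplake.load('hub://my_org/my_dataset')      # Deep Lake Storage
s3_ds    = deeplake.load('s3://my_bucket/my_dataset')    # AWS S3
```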
Connecting Deep Lake datasets stored in your own cloud via Deep Lake Managed Credentials is required for accessing enterprise features, and it significantly simplifies dataset access.
In order to access datasets stored in Deep Lake, or datasets in other clouds that are managed by Activeloop, users must first register and authenticate with Activeloop.
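As a minimal sketch, assuming an API token has been generated after registration, it can be passed directly to the token parameter (the organization, dataset, and token values are placeholders):

```python
import deeplake

# Placeholder organization, dataset, and token values.
ds = deeplake.load('hub://org_id/dataset_name', token='<ACTIVELOOP_TOKEN>')
```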
Authentication with AWS S3 has 4 options:
1. Use Deep Lake on a machine in the AWS ecosystem that has access to the relevant S3 bucket via AWS IAM, in which case there is no need to pass credentials in order to access datasets in that bucket.
2. Configure AWS through the CLI using aws configure. This creates a credentials file on your machine that is automatically accessed by Deep Lake during authentication.
3. Save the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN (optional) in environment variables of the same name, which are loaded as default credentials if no other credentials are specified.
4. Create a dictionary with the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN (optional), and pass it to Deep Lake via the creds parameter, as shown in the sketch below this list. Note: the dictionary keys must be lowercase!
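A minimal sketch of the dictionary approach, assuming placeholder bucket and dataset names and key values filled in by the user:

```python
import deeplake

# Dictionary keys must be lowercase; the values are placeholders.
creds = {
    'aws_access_key_id': '...',
    'aws_secret_access_key': '...',
    'aws_session_token': '...',  # optional
}

ds = deeplake.load('s3://bucket_name/dataset_name', creds=creds)
```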
An endpoint_url entry can also be added to the creds dictionary in order to connect to other object storages supporting an S3-like API, such as MinIO, StorageGrid, and others.
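For example, a sketch of connecting to an S3-compatible store such as MinIO (the endpoint URL and key values are placeholders):

```python
import deeplake

# endpoint_url redirects S3 requests to an S3-compatible object store.
creds = {
    'aws_access_key_id': '...',
    'aws_secret_access_key': '...',
    'endpoint_url': 'https://minio.example.com',  # placeholder endpoint
}

ds = deeplake.load('s3://bucket_name/dataset_name', creds=creds)
```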
Authentication with Microsoft Azure has 3 options:
1. Log in from your machine's CLI using az login.
2. Save the AZURE_STORAGE_ACCOUNT, AZURE_STORAGE_KEY, or other credentials in environment variables of the same name, which are loaded as default credentials if no other credentials are specified.
3. Create a dictionary with the ACCOUNT_KEY or SAS_TOKEN and pass it to Deep Lake via the creds parameter, as shown in the sketch below this list. Note: the dictionary keys must be lowercase!
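A minimal sketch of the dictionary approach for Azure, assuming lowercase account_key or sas_token keys and placeholder values:

```python
import deeplake

# Use either an account key or a SAS token; keys must be lowercase.
creds = {
    'account_key': '...',
    # 'sas_token': '...',  # alternative to account_key
}

ds = deeplake.load('azure://account_name/container_name/dataset_name', creds=creds)
```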
Authentication with Google Cloud Storage has 2 options:
1. Create a service account, download the JSON file containing the keys, and then pass that file to the creds parameter in deeplake.load('gcs://.....', creds = 'path_to_keys.json'). It is also possible to manually pass the information from the JSON file into the creds parameter as a dictionary (see the sketches after this list).
2. Authenticate through the browser. This requires that the project credentials are stored on your machine, which happens after gcloud is initialized and logged in through the CLI. Afterwards, creds can be switched to creds = 'cache' (see the sketches after this list).
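Minimal sketches of the two options above; the bucket and dataset names are placeholders, and the dictionary fields mirror those found in a downloaded service-account JSON key file:

```python
import deeplake

# Option 1: point creds at the service-account key file...
ds = deeplake.load('gcs://bucket_name/dataset_name', creds='path_to_keys.json')

# ...or pass the contents of the key file as a dictionary (values are placeholders).
creds = {
    'type': 'service_account',
    'project_id': '...',
    'private_key_id': '...',
    'private_key': '...',
    'client_email': '...',
    # remaining fields from the JSON key file
}
ds = deeplake.load('gcs://bucket_name/dataset_name', creds=creds)

# Option 2: after authenticating through the browser once, reuse the cached credentials.
ds = deeplake.load('gcs://bucket_name/dataset_name', creds='cache')
```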
Path prefixes by storage location:

| Storage Location | Path | Notes |
| --- | --- | --- |
| Local | /local_path | |
| Deep Lake Storage | hub://org_id/dataset_name | |
| Deep Lake Managed DB | hub://org_id/dataset_name | Specify runtime = {"tensor_db": True} when creating the dataset |
| AWS S3 | s3://bucket_name/dataset_name | Dataset can be connected to Deep Lake via Managed Credentials |
| Microsoft Azure (Gen2 DataLake Only) | azure://account_name/container_name/dataset_name | Dataset can be connected to Deep Lake via Managed Credentials |
| Google Cloud | gcs://bucket_name/dataset_name | Dataset can be connected to Deep Lake via Managed Credentials |
| In-Memory | mem://dataset_name | |
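As noted in the Deep Lake Managed DB row, a minimal sketch of creating such a dataset, assuming deeplake.empty accepts the runtime argument (the organization and dataset names are placeholders):

```python
import deeplake

# runtime={"tensor_db": True} creates the dataset in the Deep Lake Managed DB.
ds = deeplake.empty('hub://org_id/dataset_name', runtime={"tensor_db": True})
```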