
Getting Started with Native Object Store and Microsoft Azure Object Storage in 5 Easy Steps

Learn the prerequisites and configuration required for Vantage with Native Object Store to easily access Azure Blob storage and Azure Data Lake Gen 2.

William Kulju
November 11, 2020
Vantage Native Object Store.

In case you missed the news, the latest release of Teradata Vantage delivered as-a-Service on cloud platforms includes a built-in Advanced SQL Engine capability called Native Object Store. It lets business users explore and analyze Parquet, CSV, and JSON datasets located on external object storage on demand, using familiar Teradata SQL.

Native Object Store is enabled “out-of-the-box”, which means Vantage users can start taking advantage of it immediately. Well, almost: there is a bit of setup needed to ensure the target object storage is accessible to Vantage. This article describes the prerequisites and configuration required to enable that access with Azure Blob storage and Azure Data Lake Gen 2 (ADLS-G2). Before beginning this process, keep your Azure public cloud administrator handy, as they may need to assist with some of the steps.

Step 1: Get a location string to your Azure object storage
Whether your corporation uses Azure Blob or ADLS-G2 storage, the two are functionally identical from a Native Object Store perspective and both are fully supported, provided you supply Vantage with an Azure Blob compatible location string when accessing the storage. The easiest way to confirm you have a compatible string is to check that it includes the substring ‘.blob.’ For example:

/az/foobar.blob.core.windows.net/nostest/

If this substring is missing, Vantage will be denied access to the Azure object storage even if all the other prerequisites and configuration requirements below have been met. Once you have the location string, copy it to a text file, as you will need it for the final validation test in Step 5.
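For reference, a compatible location string generally follows the pattern below, where the storage account, container, and object prefix are placeholders for your own values (only the ‘/az/’ prefix and the ‘.blob.core.windows.net’ host suffix are fixed in this sketch):

/az/<storage-account-name>.blob.core.windows.net/<container-name>/<optional-object-prefix>/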

Step 2: Get access keys to your Azure object storage
You also need to provide Vantage with your Azure object storage account name as well as a key to access the Azure object storage location from Step 1. You can generate the key by logging into your Azure Portal and locating the storage account that contains the target Azure object storage container. Once there, you can create either a storage account access key or a shared access signature (SAS) token, depending on your preference. When you have this information, copy it to your text file as well for the final validation test in Step 5. An example of valid account name / key pair syntax is as follows:

NAME: 'nospmadlsgen2'
KEY: '{your-api-key}'

If using the SAS token option, be aware that Vantage will be unable to access the Azure object storage if the SAS token was generated outside the Azure Portal and/or doesn’t provide full storage account access.
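If you prefer not to paste the account name and key into every query, Vantage also lets you wrap them in an authorization object and reference that object instead. The following is a minimal sketch assuming the CREATE AUTHORIZATION syntax covered in the Native Object Store documentation; the object name DefAuthAz is hypothetical, and the USER / PASSWORD values are the account name and key described above:

CREATE AUTHORIZATION DefAuthAz
AS DEFINER TRUSTED
USER 'nospmadlsgen2'        /* Azure storage account name from Step 2 */
PASSWORD '{your-api-key}';  /* access key or SAS token from Step 2    */

Foreign tables (and READ_NOS via its authorization parameter) can then reference DefAuthAz in place of the inline ACCESS_ID / ACCESS_KEY pair shown in Step 5; check the Native Object Store documentation for the exact clause.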

Step 3: Check the network connectivity between Teradata Vantage and your Azure object storage
You will need to establish TCP/IP connectivity between the Vantage environment and your corporate-owned Azure object storage. While Vantage only cares that you have network connectivity, not how you achieve it, “the how” can have a big impact on latency, throughput, and egress charges, so it’s worth putting some thought into this.

As a rule, throughput is maximized, and latency and egress charges are minimized, when you access your corporate-owned Azure object storage from a Vantage as-a-Service on Azure environment and both are located within the same region. Consequently, this is the recommended connectivity pattern to use with Native Object Store whenever possible.

That said, there are valid cross-region or multi-cloud use cases where, for example, you need to use Vantage as-a-Service on AWS or Google Cloud to access Azure object storage. In those cases, your only option may be to accept the higher latency and correspondingly reduced throughput.

Step 4: Check how your Azure object storage data is encrypted
Now it’s time to ensure your data at rest in Azure object storage is encrypted using Microsoft-managed keys. This is the default option and, unless you explicitly chose something else, is probably what you have. Customer-managed and customer-provided encryption keys are other options, but neither is currently supported with Native Object Store.

All in-flight communication between Native Object Store and Azure object storage is TLS encrypted by default. To confirm, ensure the ‘secure transfer required’ property is enabled in your Azure object storage account.

Step 5: Validate your configuration
From your Vantage as-a-Service environment, issue the following command from Teradata Studio or BTEQ, using the Azure object storage location and account name / key information from Steps 1 and 2.

SELECT location(CHAR(100)), ObjectLength FROM READ_NOS (
     USING
     LOCATION ('/az/foobar.blob.core.windows.net/nostest/')
          ACCESS_ID ('nospmadlsgen2')
          ACCESS_KEY ('{your-api-key}')
          RETURNTYPE ('NOSREAD_KEYS')
     )
AS d ORDER BY 1;

This command is essentially the equivalent of an ‘ls’ or ‘dir’ command and, assuming everything has been configured correctly according to this article, Vantage should return a listing of all objects at the Azure object storage location (possibly zero objects if the location is empty). At this point you are ready to start running the sample code found in the Native Object Store Getting Started Guide and/or the Native Object Store Orange Book (docs.teradata.com sign-in required).
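If you would like a quick look at the data itself before moving on, the same READ_NOS table operator can return records instead of object keys. The query below is only an illustrative sketch: it assumes the objects at the sample location are CSV or JSON (Parquet requires a different return type) and reuses the example location and credentials from Steps 1 and 2.

SELECT TOP 5 * FROM READ_NOS (
     USING
     LOCATION ('/az/foobar.blob.core.windows.net/nostest/')
          ACCESS_ID ('nospmadlsgen2')
          ACCESS_KEY ('{your-api-key}')
          RETURNTYPE ('NOSREAD_RECORD')
     )
AS d;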


About William Kulju

William Kulju is a Senior Product Manager at Teradata with 20 years’ experience in enterprise data management and big data analytics. William enjoys working with customers to build solutions they will love.
