
Teradata Migration To BigQuery


Big Iron to BigQuery

Has your organization had enough of:

  • High ongoing costs
  • Limited flexibility
  • Fear of vendor lock-in?

Migrating from Teradata to Google BigQuery is an attractive alternative for many enterprises running legacy Teradata enterprise data warehouses.

How does Google BigQuery stack up against an on-premises Teradata installation?


Pricing
  • On-Premise Teradata: Multi-year contracts, data-center hosted.
  • Google BigQuery: On-demand; pay for what you use.

Storage
  • On-Premise Teradata: Static – when you run out of disk space, you need to drop tables, archive data to tape, or call your Teradata rep to order more disk.
  • Google BigQuery: Dynamic – load as much data as you like; there is no practical upper limit.

Compute
  • On-Premise Teradata: Static – Teradata is one of the original MPP systems and offers good control of compute resource usage, but when your CPUs are busy there are no more to be had, and your workload slows down.
  • Google BigQuery: Dynamic – BigQuery allocates compute resources on demand and manages usage and balancing automatically. There is no separate charge for compute; on-demand billing is based on the amount of data scanned.

Tuning
  • On-Premise Teradata: Hot AMPs, spool space, hashes, PKs, NUPIs, distribution, partitioning, statistics.
  • Google BigQuery: Tuning on BigQuery means structuring your data to match how it is used. Need a nested reference dataset? No problem. Have a common join? De-normalize it and see the speed.

Data Loading
  • On-Premise Teradata: ETL – expensive external ETL servers and software are required to transform data outside the warehouse so that data loads don't consume all system resources.
  • Google BigQuery: ELT – transform data using cloud-based, elastic compute resources, with no risk of system slowdowns from over-utilization.
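
To make the tuning point concrete, here is a minimal sketch, using the google-cloud-bigquery Python client, of a denormalized orders table with a nested, repeated line_items field and date partitioning. The project, dataset, table and column names are hypothetical examples, not part of any specific migration.

```python
# Minimal sketch: a denormalized BigQuery table with a nested, repeated RECORD
# field, so a common order / line-item join is pre-materialized.
# Project, dataset, table and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_date", "DATE"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField(
        "line_items", "RECORD", mode="REPEATED",
        fields=[
            bigquery.SchemaField("sku", "STRING"),
            bigquery.SchemaField("quantity", "INTEGER"),
            bigquery.SchemaField("unit_price", "NUMERIC"),
        ],
    ),
]

table = bigquery.Table("my-project.sales.orders", schema=schema)
# Partitioning by date keeps scans (and on-demand cost) bounded.
table.time_partitioning = bigquery.TimePartitioning(field="order_date")
client.create_table(table, exists_ok=True)
```

Queries can then read order and line-item attributes from a single table with UNNEST, instead of paying to scan and join two tables on every run.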

What’s Important in a Teradata Migration to BigQuery?

It’s important to know that migrating to a serverless, next-gen data warehouse involves more than just lifting and shifting the data. Legacy data pipelines need to be refactored to load data into the new warehouse.  Data models and queries need to be evaluated and optimized to best take advantage of Google’s “pay-for-what-you-use” pricing model.
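
One lightweight way to evaluate a query against the pay-for-what-you-use model is a dry run, which reports the bytes a query would scan without executing it. Below is a minimal sketch using the google-cloud-bigquery Python client; the table and query are hypothetical.

```python
# Minimal sketch: dry-run a query to estimate bytes scanned (and therefore
# on-demand cost) before it is ever executed. Table name is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
query = """
    SELECT customer_id, SUM(amount) AS total
    FROM `my-project.sales.transactions`
    WHERE txn_date >= '2020-01-01'
    GROUP BY customer_id
"""
job = client.query(query, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```

The estimate can be compared against a team budget before a refactored query or pipeline is promoted to production.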

Important Questions to Ask if You're Considering Moving from Teradata to BigQuery

  • What is your strategy for ensuring the data loaded into your new cloud data warehouse is identical to the data in your on-premises warehouse?
  • How do you plan on switching over to BigQuery to minimize disruption to your end-users?
  • What technologies and architecture are you planning to use for your data pipelines?
  • What are your plans for integrating BigQuery with BI tools and reporting systems?
  • How are you going to monitor monthly usage and billing to make sure ghost polling processes don’t run up your costs? (A minimal monitoring sketch follows this list.)
  • What are your DBAs going to do with all the time they used to spend collecting statistics, managing lumpy data partitions and chasing down hot AMPs?
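
As a minimal sketch of the billing-monitoring question above, the query below summarizes bytes billed per user over the last seven days from BigQuery's INFORMATION_SCHEMA.JOBS_BY_PROJECT view. It assumes the project's jobs run in the US multi-region; adjust the region qualifier to match your deployment.

```python
# Minimal sketch, assuming jobs run in the US multi-region: summarize bytes
# billed per user over the last 7 days to spot runaway or "ghost" workloads.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
      user_email,
      COUNT(*) AS query_count,
      SUM(total_bytes_billed) / POW(2, 40) AS tib_billed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      AND job_type = 'QUERY'
    GROUP BY user_email
    ORDER BY tib_billed DESC
"""
for row in client.query(query):
    print(row.user_email, round(row.tib_billed or 0, 3), row.query_count)
```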

Breadth of Experience

Core Compete understands Teradata-to-BigQuery migration. We’ll help you understand the sometimes-hidden scope of a migration initiative, discuss architectures that can ease the conversion, and share lessons learned from migrating customers from Teradata to BigQuery.

We’ve developed specific planning, development and testing accelerators that enable us to migrate Teradata data warehouses to the cloud with less risk and faster time to value.

Teradata Data & Metadata Discovery Agents

How much data do you have in Teradata?  How is it structured?
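
A very small example of what a discovery pass can look like: the sketch below uses the open-source teradatasql driver to sum current permanent space per database from DBC.TableSizeV. The host and credentials are placeholders, and a real discovery agent would also capture table DDL, row counts, indexes and usage statistics.

```python
# Minimal discovery sketch: total current permanent space per database,
# summed across AMPs from DBC.TableSizeV. Connection details are placeholders.
import teradatasql

con = teradatasql.connect(host="td-host", user="dbc", password="***")
try:
    cur = con.cursor()
    cur.execute("""
        SELECT DataBaseName, SUM(CurrentPerm) / (1024*1024*1024) AS size_gb
        FROM DBC.TableSizeV
        GROUP BY DataBaseName
        ORDER BY size_gb DESC
    """)
    for database_name, size_gb in cur.fetchall():
        print(f"{database_name.strip():<30} {size_gb:,.1f} GB")
finally:
    con.close()
```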

Teradata Parallel Transporter (TPT) Integration

Run once or as part of a recurring job, TPT is the fastest way to extract data from Teradata for loading into your new cloud data warehouse.
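
TPT jobs themselves are written in TPT's own job script language, so the sketch below covers only the last mile: loading a pipe-delimited extract that has already been staged to Cloud Storage into BigQuery with the Python client. Bucket, dataset and table names are placeholders.

```python
# Minimal sketch of the load side only: a pipe-delimited file exported by TPT
# and staged to Cloud Storage is loaded into BigQuery. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    field_delimiter="|",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://my-staging-bucket/exports/orders_*.csv",
    "my-project.sales.orders",
    job_config=job_config,
)
load_job.result()  # waits for the load job to complete
print(f"Loaded {client.get_table('my-project.sales.orders').num_rows} rows")
```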

Teradata to BigQuery schema converter

Convert your existing schema to BigQuery native data types and structures
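
Conceptually, the core of a schema converter is a type map plus rules for everything that doesn't translate one-to-one. The sketch below shows only the simple mapping layer; the names and mappings are illustrative.

```python
# Illustrative Teradata-to-BigQuery type mapping. A real converter also has to
# handle precision, character sets, defaults, and Teradata constructs with no
# direct equivalent (e.g. PERIOD types, identity columns).
TERADATA_TO_BIGQUERY = {
    "BYTEINT": "INT64",
    "SMALLINT": "INT64",
    "INTEGER": "INT64",
    "BIGINT": "INT64",
    "DECIMAL": "NUMERIC",
    "FLOAT": "FLOAT64",
    "CHAR": "STRING",
    "VARCHAR": "STRING",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP",
    "BYTE": "BYTES",
    "VARBYTE": "BYTES",
}

def convert_column(name: str, teradata_type: str) -> str:
    """Return a BigQuery DDL fragment for a single Teradata column."""
    bq_type = TERADATA_TO_BIGQUERY.get(teradata_type.upper(), "STRING")
    return f"{name} {bq_type}"

print(convert_column("order_total", "DECIMAL"))  # -> "order_total NUMERIC"
```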

Teradata metadata usage analytics for BigQuery

How are you using your data?  Is it structured optimally for BigQuery’s pay-for-what-you-use pricing model?

Data Validation Framework

Be absolutely certain that the data in your migrated cloud data warehouse is identical to the data in your legacy system.  Reduce risk with rock-solid audit and traceability reporting.
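
A validation framework is built from many individual checks. As one hedged example, the sketch below compares the row count of a single table in Teradata (via the teradatasql driver) and in BigQuery; connection details and table names are placeholders, and a real framework would also compare column-level aggregates and checksums and log every result for audit.

```python
# Minimal sketch of one validation check: compare row counts for a single
# table between Teradata and BigQuery. Connection details are placeholders.
import teradatasql
from google.cloud import bigquery

def teradata_count(table: str) -> int:
    con = teradatasql.connect(host="td-host", user="dbc", password="***")
    try:
        cur = con.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        con.close()

def bigquery_count(table: str) -> int:
    client = bigquery.Client()
    row = list(client.query(f"SELECT COUNT(*) AS n FROM `{table}`"))[0]
    return row.n

td_rows = teradata_count("sales.orders")
bq_rows = bigquery_count("my-project.sales.orders")
print("MATCH" if td_rows == bq_rows
      else f"MISMATCH: Teradata={td_rows}, BigQuery={bq_rows}")
```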

Data Warehouse Continuity Framework

Build, migrate and test your BigQuery warehouse, then switch over with zero downtime for your end users and operational systems.

Case Studies

Top 10 Media Company

Connect with our Teradata to BigQuery Specialists

Kumar Majety

EXECUTIVE CONSULTANT

Kumar has over 20 years of consulting, product management and marketing leadership experience and delivers systems of innovation for our Global 1000 clients. He has worked at Lenovo, Intel and PwC on innovative architecture and technology solutions. He has an MBA from the Kellogg School of Management and a graduate degree in technology. He enjoys time with his family and practices yoga and meditation.

Khari Villela

DIRECTOR, SOLUTION ARCHITECTURE

Khari Villela leads the Cloud Data Warehousing practice at Core Compete. With 20 years of experience as an architect and CTO, Khari specializes in making data architecture easier to understand for business people and techies alike. Khari earned a degree in English from Yale University and lives in the woods with his family and far too many cats.

Sumanth Yamala

PRINCIPAL ARCHITECT

Sumanth is an architect and leads the US data engineering practice. Sumanth has experience architecting enterprise applications in analytic and pricing domains, and enabling them in the cloud. He loves nature and walking.

Dr. Christopher Houck

PARTNER

Dr. Houck has been working on Machine Learning ever since first using reinforcement learning to train a Neural Network to play blackjack during his Ph.D. studies. Chris has spent the past 25 years helping companies use advanced analytics to improve supply chain, merchandising, and business processes.

Krishna Kumar

SENIOR DIRECTOR, CUSTOMER ENGAGEMENT

Krishna has over 20 years of experience in Technology Program Management and worked at Amdocs, Visa, Standard Chartered Bank and Cognizant before joining Core Compete. Krishna manages several North American customer engagements and can dive just as easily into Technical Architecture (he is an AWS Professional Architect) as he can into establishing our Program Management processes.

Request a Briefing

Let’s Connect

Make a connection with one of our specialists to:

  • Learn how cloud analytics can drive business value in your organization
  • Understand how other companies are transforming their business and using AI and ML
  • Identify specific use cases where you can implement a transformation project in as little as 3 months