Introduction to Snowflake (2023)

Cloud computing is now well into its second decade of existence, and the development in that time has been incredible. We have gone far beyond virtual machines and cloud storage. We now have cloud functions (lambdas), containerization, and too many other technologies to cover in one article.

One area where there has been a lot of innovation is cloud databases. There are managed services that host familiar databases like MySQL, Postgres, and SQL Server, as well as offerings for many other styles of data, such as document, columnar, and key-value stores.

One of the relatively new cloud data platforms, and the focus of this article, is Snowflake. Snowflake is a unique offering because it provides so many of the features developers need today. Do you need a cloud-based SQL database? A database that can query JSON data stored in a column? The ability to securely share data with external partners without exposing your infrastructure to them? Snowflake handles all of these scenarios very well.

In my 30+ years of software development, I have seen many products come and go. Every once in a while, I come across a product that I consider to be a game changer. The product I'm about to talk about is called Snowflake, and I can say without a doubt that it is an innovative product.

Imagine a request from the marketing department: “We are going to be getting a large data set from Widgetemo, Inc. We will be conducting reviews on this data, and speedy processing is of paramount importance. The dataset will have around 100,000,000+ records and will be dropped into an Amazon S3 bucket. I know this shouldn't be hard, but there is one catch. Isn't there always? The data contains PII (personally identifiable information) and may only be seen by the review team. Oh, and there's one more little item: we need to share the raw results with the client in real time every time we process the data. Please get back to us with an estimate.”

There are two ways to handle this request. The first is the classic cloud process: add resources to EC2 (server) and RDS (database) instances to handle the load, and then:

  1. Create a process to download data from S3 into a set of objects.
  2. Send this data to Postgres (SQL Server, MySQL, etc.).
  3. Create a process to query the masked or unmasked data.
  4. Create a process to send data to the customer's preferred sharing technology (downloadable file, SFTP, S3 bucket, Azure Storage, etc.).
  5. Test the process and put it into production.

The second way to handle the load is to run a few commands in Snowflake:

  1. Add resources.
  2. Import data from S3.
  3. Control personally identifiable information:

CREATE OR REPLACE MASKING POLICY address_mask
  AS (val STRING) RETURNS STRING ->
  CASE WHEN current_role() IN ('MARKETING')
       THEN val ELSE '*********' END;

  4. Share data with the client.
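The steps above map to a handful of SQL statements. The following sketch uses hypothetical object names (warehouse COMPUTE_WH, database DEMO_DB, bucket, and client account); only the masking policy comes from the scenario itself:

```sql
-- 1. Add resources: resize the warehouse.
ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'XLARGE';

-- 2. Import data from S3.
COPY INTO DEMO_DB.PUBLIC.DEMO_DATA
  FROM 's3://widgetemo-bucket/reviews.csv'
  CREDENTIALS = (AWS_KEY_ID='<access key>' AWS_SECRET_KEY='<secret key>')
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = '|' SKIP_HEADER = 1);

-- 3. Mask PII for everyone outside the review team.
ALTER TABLE DEMO_DB.PUBLIC.DEMO_DATA
  MODIFY COLUMN ADDRESS SET MASKING POLICY address_mask;

-- 4. Share the results with the client.
CREATE SHARE CLIENT_SHARE;
GRANT USAGE ON DATABASE DEMO_DB TO SHARE CLIENT_SHARE;
GRANT USAGE ON SCHEMA DEMO_DB.PUBLIC TO SHARE CLIENT_SHARE;
GRANT SELECT ON TABLE DEMO_DB.PUBLIC.DEMO_DATA TO SHARE CLIENT_SHARE;
ALTER SHARE CLIENT_SHARE ADD ACCOUNTS = <client_account>;
```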

As you can see, either process can fulfill the request. The big difference is the time from inquiry to production. The first process can take days or more, depending on the backlog of the development team. The second consists of a few commands from start to finish. Speed is a clear competitive advantage, and it's Snowflake's built-in features that make it possible. Let's take a look.

Snowflake Features

Snowflake is a unique amalgamation of many of today's divergent database concepts. The following list represents some of its high-level features:

  • It supports SQL and offers SELECT, INSERT, UPDATE, DELETE, etc., as well as CREATE TABLE, CREATE VIEW, CREATE SCHEMA, and JSON queries.
  • Snowflake can store and query JSON data stored in a special column type.
  • It is cloud independent (Azure, AWS, GCP). I call it BYOC (Bring Your Own Cloud). You can set up your Snowflake infrastructure with the cloud provider of your choice.
  • Integrated Python. Snowflake can embed and call Python code from your queries and procedures.
  • Secure data sharing. Snowflake can share data with, and consume data shared by, other instances of Snowflake. Compute costs in shared environments are paid by the consumer of the data, not the host.
  • Multiple client options. You can access your data using the technology of your choice. There are drivers for ODBC, Python, Node, and Go.
  • Pay as you go with "infinite" scaling. With the touch of a button (actually, a simple command), you can increase or decrease the compute power applied to your queries.

“Infinite” compute with Snowflake Warehouses

When working with Snowflake, you need to understand that a warehouse is not what you might think of as organized data storage. Instead, a warehouse in Snowflake is COMPUTE, not storage. Let me say it again: it is COMPUTE and not data storage.


A good metaphor is a car that you can add cylinders to as you go. Want to save gas ($$$)? Choose a small engine. Speeding down the highway at 200 km/h? Convert the car to eight cylinders. Snowflake COMPUTE runs on powers of 2: 1, 2, 4, 8, 16, 32, and up to 256 "cylinders" of COMPUTE power. The unique aspect of Snowflake is that you can control how many cylinders you use for any given transaction. You pay for it, but the ability to change the performance of a single query or set of queries with a switch is attractive.
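Swapping cylinders is a single statement. A minimal sketch, assuming a warehouse with the hypothetical name COMPUTE_WH:

```sql
-- Drop to the smallest engine to save money.
ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'XSMALL';

-- Scale up before a heavy workload, then back down afterward.
ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'XXLARGE';
```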

Snowflake's payment model is simple: you pay for credits, priced according to the Snowflake edition you create. Credits pay for credit hours: one credit = one credit hour. Based on the size of the warehouse, the system determines how many credit hours are billed, and you pay more credits for larger warehouses. All queries are billed by the minute. Figure 1 displays the price per credit hour for each edition.

Figure 1: The price you pay for Snowflake depends on the edition and the size of your warehouse.

Some features, such as PCI or HIPAA compliance, require higher editions of Snowflake.

See Also
COPY INTO | Snowflake Documentation

To better understand how this works, let's look at an example. The following code is simple: it creates new tables using different warehouse sizes. The source table is called DEMO_DATA and contains 100,000,000 records. The code creates new copies of this table using different warehouse sizes. The command is the following:
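The original command isn't reproduced in the text; a sketch of what such a benchmark might look like, assuming a warehouse with the hypothetical name COMPUTE_WH:

```sql
ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'SMALL';
CREATE OR REPLACE TABLE DEMO_DATA_SMALL AS
    SELECT * FROM DEMO_DATA;

ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'MEDIUM';
CREATE OR REPLACE TABLE DEMO_DATA_MEDIUM AS
    SELECT * FROM DEMO_DATA;

-- ...and so on for LARGE and XXLARGE.
```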


The results are the following:

  • SMALL: 24 seconds
  • MEDIUM: 19 seconds
  • LARGE: 12 seconds
  • XXLARGE: 8 seconds

As you can see, performance improves with larger warehouse sizes. In some of my workloads, the tables contain more than a billion records. The difference is incredible.

Snowsight

The main mechanism for managing your Snowflake infrastructure is a web-based application called Snowsight, which offers many features. Figure 2, Figure 3, and Figure 4 highlight some of Snowsight's most-used features.

Figure 2: The data canvas is used to create data elements such as databases, schemas, tables, and views.
Figure 3: The activity log is used to monitor the performance, success, and failure of commands run on Snowflake.
Figure 4: The administration screen is used to add users, groups, warehouses, etc.

Understanding Worksheets

You can run scripts on the worksheet canvas. Figure 5 displays a worksheet with a code snippet. This snippet is used to create the database, table, and user account that you'll use in the examples in this article.

Figure 5: Snowsight's worksheet view is used to create, edit, run, and analyze Snowflake queries.
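The script in Figure 5 isn't reproduced in the text. A minimal sketch of such a setup script, using the database and table names from the examples later in this article (the user name and password are hypothetical):

```sql
CREATE DATABASE CODE_MAGAZINE_DEMO;

CREATE TABLE CODE_MAGAZINE_DEMO.PUBLIC.DEMO_DATA (
    UNIQUEID        STRING,
    LASTNAME        STRING,
    FIRSTNAME       STRING,
    ADDRESS         STRING,
    CITY            STRING,
    STATEORPROVINCE STRING
);

CREATE USER demo_user PASSWORD = '<password>' DEFAULT_ROLE = PUBLIC;
```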

Create a data load pipeline

Before you can start querying, analyzing, and sharing data, you must transfer that data to your Snowflake instance. The next section shows how to create a simple pipeline so that you can access the data through Snowflake. Building a pipeline consists of the following steps:

  1. Create a CSV file.
  2. Send CSV to an S3 bucket.
  3. Create a format in code.
  4. Create a stage in code.
  5. Create a COPY INTO command.
  6. Open a connection.
  7. Run a SQL command to load data.

Start your project

This example creates a small console application to generate some sample data and send it to Snowflake. The first step is to create a console application using the Visual Studio interface. The following code examples are based on C# 11 and its new raw string literals. To enable this in your project, add the following section to your Visual Studio project file.
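The exact project-file snippet isn't shown in the text; a minimal sketch of a console-app project file with C# 11 enabled (assuming .NET 7) looks like this:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net7.0</TargetFramework>
    <LangVersion>11</LangVersion>
  </PropertyGroup>
</Project>
```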



After creating your project and enabling C# 11, you can continue working on the pipeline. Add the code in Listing 1 to your application. Listing 1 is a class capable of creating a series of sample Person objects. These sample objects are returned in a List<T> structure that you can save to a CSV file, which ultimately goes into the Snowflake database via an S3 bucket.

Listing 1: Code to create sample data

namespace SnowflakeDemo;

public class RandomDataService
{
    public class Person
    {
        public string UniqueId { get; set; } = System.Guid.NewGuid().ToString();
        public string LastName { get; set; } = "";
        public string FirstName { get; set; } = "";
        public string Address { get; set; } = "";
        public string City { get; set; } = "";
        public string StateOrProvince { get; set; } = "";
    }

    public List<Person> GetSamplePeople(int howMany = 100000)
    {
        var lastNames = GetLastNames();
        var firstNames = GetFirstNames();
        var addresses = GetStreetNames();
        var cities = GetCities();
        var states = GetStatesAndProvinces();
        var retval = new List<Person>();
        for (int i = 0; i < howMany; i++)
        {
            var person = new Person();
            person.LastName = lastNames[Between0and9()];
            person.FirstName = firstNames[Between0and9()];
            person.City = cities[Between0and9()];
            person.StateOrProvince = states[Between0and9()];
            person.Address = $"{RandomAddressNumber()} " +
                $"{addresses[Between0and9()]}";
            retval.Add(person);
        }
        return retval;
    }

    public int RandomAddressNumber()
    {
        var random = Random.Shared.Next(1000, 99999);
        return random;
    }

    public int Between0and9()
    {
        var random = Random.Shared.Next(0, 9);
        return random;
    }

    public List<string> GetLastNames()
    {
        var retval = new List<string>();
        retval.Add("Lucas");
        retval.Add("Smith");
        retval.Add("Spielberg");
        retval.Add("Gygax");
        retval.Add("Garland");
        retval.Add("Wolff");
        retval.Add("West");
        retval.Add("Kardashian");
        retval.Add("Van Halen");
        retval.Add("Grohl");
        return retval;
    }

    public List<string> GetFirstNames()
    {
        var retval = new List<string>();
        retval.Add("Mary");
        retval.Add("Leslie");
        retval.Add("Jane");
        retval.Add("Jessica");
        retval.Add("John");
        retval.Add("Paul");
        retval.Add("George");
        retval.Add("Ringo");
        retval.Add("Eddie");
        retval.Add("Alex");
        return retval;
    }

    public List<string> GetStreetNames()
    {
        var retval = new List<string>();
        retval.Add("Orange");
        retval.Add("Main");
        retval.Add("Maple");
        retval.Add("Oak");
        retval.Add("Poplar");
        retval.Add("Brown");
        retval.Add("Elm");
        retval.Add("Redwood");
        retval.Add("Lincoln Boulevard");
        retval.Add("Sepúlveda Boulevard");
        return retval;
    }

    public List<string> GetCities()
    {
        var retval = new List<string>();
        retval.Add("Seattle");
        retval.Add("Austin");
        retval.Add("Regina");
        retval.Add("Calgary");
        retval.Add("Winnipeg");
        retval.Add("Portland");
        retval.Add("Los Angeles");
        retval.Add("Oakland");
        retval.Add("Montreal");
        retval.Add("Ottawa");
        return retval;
    }

    public List<string> GetStatesAndProvinces()
    {
        var retval = new List<string>();
        retval.Add("AB");
        retval.Add("SK");
        retval.Add("CA");
        retval.Add("OR");
        retval.Add("WA");
        retval.Add("TX");
        retval.Add("CO");
        retval.Add("NY");
        retval.Add("MN");
        retval.Add("KY");
        return retval;
    }
}

Send to CSV file

After you create your sample data, you can save these records to a CSV file. This example uses the CsvHelper library, which you can install using NuGet with the following command:

Install-Package CsvHelper -Version 30.0.1

After installing the package, add the code in Listing 2. This class has a function that can write data to a CSV file with a specific delimiter. I prefer the vertical bar ("|") because commas are more common in data than vertical bars.

Listing 2: Use CsvHelper to create a delimited CSV file

using System.Globalization;
using CsvHelper;
using CsvHelper.Configuration;

namespace SnowflakeDemo;

public class CsvTools
{
    public static void WriteCsvFile(dynamic dataToWrite,
        string outputFile, string delimiter)
    {
        var config = new CsvConfiguration(CultureInfo.InvariantCulture);
        // Include a header record.
        config.HasHeaderRecord = true;
        // Change the delimiter setting.
        config.Delimiter = delimiter;
        // Quote every field.
        config.ShouldQuote = args => true;
        using (var writer = new StreamWriter(outputFile))
        using (var csv = new CsvWriter(writer, config))
        {
            csv.WriteRecords(dataToWrite);
        }
    }
}

Send files to S3

After you create your CSV file, you must upload it to an Amazon S3 bucket. This is done using the AWSSDK.S3 package, which can be installed via NuGet with the following command:

Install-Package AWSSDK.S3

After installing the package, add the code in Listing 3. This code uses the Amazon SDK to write your file to an S3 bucket. You need a few items: the bucket you want to write to, and the AWS access and secret keys that grant access to that bucket.

Listing 3: Use the Amazon SDK to write the file to an S3 bucket

using Amazon;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Transfer;

namespace SnowflakeDemo;

public class AmazonTools
{
    public AmazonS3Client GetS3Client(BasicAWSCredentials creds,
        RegionEndpoint endpointRegion = null)
    {
        var clientRegion = RegionEndpoint.USEast1;
        if (endpointRegion != null)
        {
            clientRegion = endpointRegion;
        }
        var client = new AmazonS3Client(creds, clientRegion);
        return client;
    }

    public BasicAWSCredentials GetBasicAwsCredentials(
        string awsAccessKey, string awsSecret)
    {
        var retval = new BasicAWSCredentials(awsAccessKey, awsSecret);
        return retval;
    }

    public void UploadFile(AmazonS3Client client,
        string bucketName, string fileName)
    {
        var ms = FileToMemoryStream(fileName);
        var utility = new TransferUtility(client);
        var fi = new FileInfo(fileName);
        utility.Upload(ms, bucketName, fi.Name);
    }

    public MemoryStream FileToMemoryStream(string fileName)
    {
        var fs = new FileStream(fileName, FileMode.Open, FileAccess.Read);
        var ms = new MemoryStream();
        fs.CopyTo(ms);
        // Rewind the stream so the upload starts at the beginning.
        ms.Position = 0;
        return ms;
    }
}

Sending data to Snowflake

The final step in this process is to copy the data from S3 into your Snowflake database. This is done using Snowflake's COPY INTO command. The COPY INTO command relies on two supporting pieces that make loading easier: the STAGE and the FORMAT. Let's take a look at each.

A STAGE is a snippet of code that provides the information needed to access your S3 data. Specifically, you need the bucket, the item key, the AWS access key, and the AWS secret key. The following block of code represents a STAGE used to access data in an Amazon S3 bucket:
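The STAGE fragment isn't shown in the text; based on the GetStage helper later in this article, it looks like this (bucket, key, and credentials are placeholders):

```sql
's3://<bucket>/<itemKey>'
CREDENTIALS = (AWS_KEY_ID='<access key>'
    AWS_SECRET_KEY='<secret key>')
FORCE=true
```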


Snowflake supports several data formats that can be used to load data into Snowflake tables: CSV, JSON, XML, Parquet, and more. A FORMAT describes how the data in the file being imported is laid out. The following code represents the FORMAT of a CSV file that is delimited by vertical bars and encloses its data in double quotes:
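The FORMAT fragment isn't shown in the text; based on the GetFormat helper later in this article, a vertical-bar-delimited CSV format looks like this:

```sql
file_format=(type='csv'
    COMPRESSION='AUTO'
    FIELD_DELIMITER='|'
    RECORD_DELIMITER = '\n'
    SKIP_HEADER = 1
    TRIM_SPACE = FALSE
    ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
    ESCAPE = 'NONE'
    DATE_FORMAT = 'AUTO'
    TIMESTAMP_FORMAT = 'AUTO'
    NULL_IF = ('\\N','NULL'))
```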


The following code creates a STAGE fragment.

namespace SnowflakeDemo;

public class SnowflakeTools
{
    public string GetStage(string bucketName, string itemKey,
        string accessKey, string secretKey)
    {
        var retval = $"""
            's3://{bucketName}/{itemKey}'
            CREDENTIALS = (AWS_KEY_ID='{accessKey}'
                AWS_SECRET_KEY='{secretKey}')
            FORCE=true
            """;
        return retval;
    }
}

The following code creates a FORMAT snippet.


public string GetFormat(string delimiter)
{
    var retval = $"""
        file_format=(type='csv'
            COMPRESSION='AUTO'
            FIELD_DELIMITER='{delimiter}'
            RECORD_DELIMITER = '\n'
            SKIP_HEADER = 1
            TRIM_SPACE = FALSE
            ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
            ESCAPE = 'NONE'
            DATE_FORMAT = 'AUTO'
            TIMESTAMP_FORMAT = 'AUTO'
            NULL_IF = ('\\N','NULL'))
        """;
    return retval;
}

Create a COPY INTO command

Once you have the STAGE and FORMAT, you can create your COPY INTO command. The following code can be used to facilitate this process.

public string GetCopyCommand(string databaseName, string tableName,
    string stageInfo, string formatInfo)
{
    var retval = $"""
        COPY INTO {databaseName}.PUBLIC.{tableName}
        FROM {stageInfo}
        {formatInfo};
        """;
    return retval;
}

The following code shows the final COPY INTO command that you can run from a database connection or in a Snowsight worksheet.
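The assembled command isn't reproduced in the text. Combining the output of GetStage, GetFormat, and GetCopyCommand, the result should look roughly like this (bucket and credentials are placeholders):

```sql
COPY INTO CODE_MAGAZINE_DEMO.PUBLIC.DEMO_DATA
FROM 's3://<bucket>/CodeSampleData.csv'
    CREDENTIALS = (AWS_KEY_ID='<access key>'
        AWS_SECRET_KEY='<secret key>')
    FORCE=true
file_format=(type='csv' COMPRESSION='AUTO'
    FIELD_DELIMITER='|' RECORD_DELIMITER = '\n'
    SKIP_HEADER = 1 TRIM_SPACE = FALSE
    ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
    ESCAPE = 'NONE' DATE_FORMAT = 'AUTO'
    TIMESTAMP_FORMAT = 'AUTO'
    NULL_IF = ('\\N','NULL'));
```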


Running ADO.NET Commands

After you create your COPY INTO command, you must execute it to load your data into Snowflake. You can do this from a worksheet or from code. Figure 6 displays the code executed in a worksheet.

Figure 6: COPY INTO code executed from a worksheet

If you want to run this command from within your applications, you need one more piece of code: the connection string you use to call Snowflake from ADO.NET. The following snippet contains a method you can use to create a proper connection string:

public string GetConnectionString(string snowflakeIdentifier,
    string userName, string password,
    string databaseName, string tableName)
{
    var retval = $"""
        account={snowflakeIdentifier};
        user={userName};
        password={password};
        db={databaseName};
        schema=public;
        warehouse=COMPUTE_WH
        """;
    return retval;
}

The following code should look familiar. It simply opens a connection, creates a command object, and calls ExecuteNonQuery(). As I mentioned before, Snowflake meets you where you are and supports the tools you're already familiar with.

using (var conn = new SnowflakeDbConnection()
    { ConnectionString = connStringCode })
{
    conn.Open();
    var cmd = conn.CreateCommand();
    cmd.CommandText = copyCode;
    cmd.CommandType = CommandType.Text;
    cmd.ExecuteNonQuery();
}

The Pipeline Code

You now have a nice set of pipeline code. Listing 4 displays all the code you created earlier, assembled into the console application. Figure 7 displays the data that you created in the pipeline process.

Listing 4: The complete pipeline code

using System.ComponentModel;
using System.ComponentModel.DataAnnotations;
using System.Data;
using CsvHelper;
using Snowflake.Data.Client;

namespace SnowflakeDemo
{
    internal class Program
    {
        static void Main(string[] args)
        {
            if (Environment.GetEnvironmentVariable("DEMO_KEY") == null ||
                Environment.GetEnvironmentVariable("DEMO_SECRET") == null ||
                Environment.GetEnvironmentVariable("DEMO_SNOWFLAKE_USER") == null ||
                Environment.GetEnvironmentVariable("DEMO_SNOWFLAKE_PASSWORD") == null)
            {
                Console.WriteLine("You need the following environment " +
                    "variables configured:\n" +
                    "DEMO_KEY\n" +
                    "DEMO_SECRET\n" +
                    "DEMO_SNOWFLAKE_USER\n" +
                    "DEMO_SNOWFLAKE_PASSWORD");
                Console.ReadKey();
                return;
            }

            var awsAccessKey = Environment.GetEnvironmentVariable("DEMO_KEY");
            var awsSecret = Environment.GetEnvironmentVariable("DEMO_SECRET");
            var snowflakeUser = Environment.GetEnvironmentVariable("DEMO_SNOWFLAKE_USER");
            var snowflakePassword = Environment.GetEnvironmentVariable("DEMO_SNOWFLAKE_PASSWORD");
            var snowflakeIdentifier = "gra75419";
            var databaseName = "CODE_MAGAZINE_DEMO";
            var tableName = "DEMO_DATA";
            var bucketName = "dashpoint-demo";
            var fileName = @"D:\data\junk\CodeSampleData.csv";
            var itemKey = "CodeSampleData.csv";

            var random = new RandomDataService();
            var amazonTools = new AmazonTools();
            var snowflakeTools = new SnowflakeTools();

            var data = random.GetSamplePeople(250000);
            CsvTools.WriteCsvFile(data, fileName, "|");
            amazonTools.UploadFile(
                amazonTools.GetS3Client(
                    amazonTools.GetBasicAwsCredentials(awsAccessKey, awsSecret)),
                bucketName, fileName);
            var stage = snowflakeTools.GetStage(bucketName, itemKey,
                awsAccessKey, awsSecret);
            var format = snowflakeTools.GetFormat("|");
            var copyCommand = snowflakeTools.GetCopyCommand(databaseName,
                tableName, stage, format);
            var connString = snowflakeTools.GetConnectionString(
                snowflakeIdentifier, snowflakeUser, snowflakePassword,
                databaseName, tableName);

            // Send the data.
            using (var conn = new SnowflakeDbConnection()
                { ConnectionString = connString })
            {
                conn.Open();
                var cmd = conn.CreateCommand();
                cmd.CommandText = copyCommand;
                cmd.CommandType = CommandType.Text;
                cmd.ExecuteNonQuery();
            }
            Console.WriteLine("Data sent to Snowflake");
            Console.ReadKey();
        }
    }
}
Figure 7: An example of imported data using the pipeline code

Protecting information through masking

One of Snowflake's most compelling features is its ability to control who can view the content of different data elements. This is especially important when it comes to PII. This feature is known as a masking policy in Snowflake. The following command shows how to protect data using a masking policy. It creates a masking policy that shows unmasked data to people with the MARKETING role and asterisks to everyone else. The policy is then applied to the ADDRESS column in the DEMO_DATA table.
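The command isn't reproduced in the text; based on the policy fragment shown earlier in this article, it should look like this:

```sql
CREATE OR REPLACE MASKING POLICY address_mask AS (val STRING)
    RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('MARKETING') THEN val
         ELSE '*********' END;

ALTER TABLE CODE_MAGAZINE_DEMO.PUBLIC.DEMO_DATA
    MODIFY COLUMN ADDRESS SET MASKING POLICY address_mask;
```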


Figure 8 shows the unmasked data.

Figure 8: The unmasked data

Figure 9 displays the masked data that most users will see.

Figure 9: A sample of the imported data in a masked state, due to the ROLE selected in Snowsight

Sharing data

Once you have data in Snowflake, you can share it with other Snowflake users. The real benefit is that the consumer of that data pays the compute costs. With other cloud databases, the data owner is responsible for usage costs.


When sharing data, you have several options. The first option is to share data with a consumer in the same region where your Snowflake instance is set up. This is by far the simplest mechanism for sharing data and the most common use case in my experience.

The second option is to share data with a consumer who lives in a different region or cloud provider. This type of sharing uses Snowflake replication tools. In this article, I'll explore the simple use case.

There are two basic types of shares. Outbound shares let you share data from your Snowflake instance with other Snowflake users. Inbound shares are data sources shared with you by other Snowflake users.

There are two ways to create an outbound share: using SQL code or using Snowsight. This article demonstrates the SQL way. The first step is to run commands from a worksheet like this:

CREATE SHARE CODE_MAG_OUTBOUND;
GRANT USAGE ON DATABASE CODE_MAGAZINE_DEMO TO SHARE CODE_MAG_OUTBOUND;
GRANT USAGE ON SCHEMA CODE_MAGAZINE_DEMO.PUBLIC TO SHARE CODE_MAG_OUTBOUND;
GRANT SELECT ON VIEW CODE_MAGAZINE_DEMO.PUBLIC.V_SORTED_DEMO_DATA TO SHARE CODE_MAG_OUTBOUND;
ALTER SHARE CODE_MAG_OUTBOUND ADD ACCOUNTS=XXXXXX;

Once another Snowflake account has shared data with you, you can access that data through a database created from the share. There are two ways to create a database from a share. The first is through Snowsight: select Private Sharing data in Snowsight, then select "Shared with you" on the data sharing screen. This opens a screen listing the data sets shared with you, as shown in Figure 10.

Figure 10: The list of data sets shared with a Snowflake account

Click the share for which you want to create a database. You will be presented with the Get Data dialog box, as shown in Figure 11.

Figure 11: The Get Data screen is used to create databases from external shares.

This dialog allows you to specify the name of your database. Optionally, you can also specify which security groups can access this database.

The second way to create a database from a share is to run a command from a Snowsight worksheet. To create a database from a share, run this command:
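The command isn't reproduced in the text; a sketch, with a hypothetical database name and provider account (the share name comes from the earlier example):

```sql
CREATE DATABASE SHARED_DEMO_DATA
    FROM SHARE <provider_account>.CODE_MAG_OUTBOUND;
```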


The share name parameters for this command can be derived from the screen shown in Figure 11 in the previous example.

After creating a database from a share, you can access it through Snowsight, as shown in Figure 12.


Figure 12: The Snowsight Data screen showing the database created from a share

Figure 13 displays shared data from the demos created earlier in this article. One thing you'll notice right away is that the masking policy applies to the shared data.

Figure 13: The masking policy applies to shared data

I've been working with Snowflake for several years, and it wasn't until 2022 that I realized how monumental the ability to seamlessly share data is. We no longer need to set up S3 buckets or Azure Blob Storage accounts. We simply grant access to a set of tables/views, and the client can access them using the SQL commands they already know.


Snowflake is that rare commodity that has immense power but can be used by mere mortals to access that power. This article only scratches the surface of what Snowflake is capable of. Other areas of interest include Python integration, clean room technology, and other interesting use cases with another "snow" technology called Snowpipe. Check them out. You won't be disappointed.


What is Snowflake basic introduction?

Snowflake is a cloud-based data warehousing platform that is built on top of AWS and is a true SaaS offering. In contrast with traditional data warehouse solutions, Snowflake provides a data warehouse which is faster, easy to set up, and far more flexible. Its architecture consists of three layers:
  1. Database storage.
  2. Query processing.
  3. Cloud services.

What is Snowflake and how does it work?

Snowflake is an elastically scalable cloud data warehouse

It can automatically scale up/down its compute resources to load, integrate, and analyze data. As a result, you can run virtually any number of workloads across many users at the same time without worrying about resource contention.

How do I start learning Snowflake?

Getting Started
  1. Prerequisites.
  2. Log into SnowSQL.
  3. Create Snowflake Objects.
  4. Stage the Data Files.
  5. Copy Data into the Target Table.
  6. Query the Loaded Data.
  7. Summary and Clean Up.

Is Snowflake easy to learn?

Things are different with Snowflake since it is fully SQL-based. Chances are, you have some experience using BI or data analysis tools that work on SQL. Most of what you already know can be applied to Snowflake. Not to mention that SQL is an easy-to-learn language, a significant benefit for general users.

Is Snowflake a database or ETL?

Snowflake supports both ETL and ELT and works with a wide range of data integration tools, including Informatica, Talend, Tableau, Matillion and others.

Is Snowflake just SQL?

Snowflake is a data platform and data warehouse that supports the most common standardized version of SQL: ANSI. This means that all of the most common operations are usable within Snowflake. Snowflake also supports all of the operations that enable data warehousing operations, like create, update, insert, etc.

What is Snowflake for dummies?

The Snowflake Data Cloud is a global network where thousands of organizations mobilize data with near-unlimited scale, concurrency, and performance. The Data Cloud removes silos to enable discovering, managing, and sharing data among business units, suppliers, other business partners, and customers.

What is Snowflake best for?

Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and far more flexible than traditional offerings. The Snowflake data platform is not built on any existing database technology or “big data” software platforms such as Hadoop.

Why is Snowflake so popular?

Snowflake delivers a platform that is fast, flexible, and user-friendly. It provides the means for not only data storage, but also processing and analysis. It is considered cloud-agnostic, as it operates across Amazon Web Services (AWS), Microsoft Azure, or Google Cloud.

Does Snowflake need coding?

Snowflake has developed a one-of-a-kind architecture, a cloud data warehouse originally built on Amazon Web Services. Snowflake does not require any additional software, hardware, or maintenance over and above other platforms' needs.

What should I learn before Snowflake?

Getting Started
  • Snowflake in 20 Minutes. Prerequisites. Log into SnowSQL. ...
  • Getting Started with Snowflake - Zero to Snowflake.
  • Getting Started with Python.
  • Bulk Loading from a Local File System.
  • Bulk Loading from Amazon S3 Using COPY.
  • JSON Basics.
  • Loading JSON Data into a Relational Table.
  • Loading and Unloading Parquet Data.

What skills are required for Snowflake?

  • Bachelor's degree in Computer Science, Business Administration or related field.
  • Strong aptitude for modern BI solution development using Azure solution suite.
  • Ability to design and implement effective analytics solutions and models with Snowflake.

What coding language is Snowflake?

For general users, Snowflake provides complete ANSI SQL language support for managing day-to -day operations. It's cloud agnostic, with unlimited, seamless scalability across Amazon Web Services (AWS) and Microsoft Azure (with the prospect of adding Google Cloud soon).

How many days will it take to learn Snowflake?

About the Snowflake course

This particular course is a one-month advanced training program on Snowflake. Snowflake is the best cloud-based data warehousing and analytics tool.

Is Snowflake better than SQL?

Snowflake vs SQL Server: Performance

The rapidly increasing data in business activities is one of the core reasons you would want to switch from SQL Server to Snowflake. The moment you realize you are pushing more data through your SQL Server system than it can handle, you will have to add more resources.

Is Snowflake a PaaS or SaaS?

Snowflake is a SaaS solution: a cloud-based data warehouse that requires no physical space and takes care of maintenance itself.

Is Snowflake a data lake or warehouse?

Snowflake Has Always Been a Hybrid of Data Warehouse and Data Lake. There's a great deal of controversy in the industry these days around data lakes versus data warehouses. For many years, a data warehouse was the only game in town for enterprises to process their data and get insight from it.

Is Snowflake an OLAP or OLTP?

Snowflake uses OLAP as a foundational part of its database schema and acts as a single, governed, and immediately queryable source for your data. In addition to its built-in analytics features, the platform offers seamless integrations with popular business intelligence and analytics tools.

Is Snowflake relational or NoSQL?

Snowflake is a cloud-hosted relational database for building data warehouses.

Is Snowflake like AWS?

Snowflake is an AWS Partner offering software solutions and has achieved Data Analytics, Machine Learning, and Retail Competencies.

Is Snowflake owned by Amazon?

It runs on Amazon S3 since 2014, on Microsoft Azure since 2018 and on the Google Cloud Platform since 2019.
Snowflake Inc.
  • Type: Public company
  • Revenue: US$1.219 billion (2022)
  • Net income: US$−680 million (2022)
  • Total assets: US$6.650 billion (2022)
  • Total equity: US$5.049 billion (2022)

How is Snowflake different from AWS?

Snowflake implements instantaneous auto-scaling while Redshift requires addition/removal of nodes for scaling. Snowflake supports fewer data customization choices, where Redshift supports data flexibility through features like partitioning and distribution.

Is Snowflake hard to use?

Snowflake is recognized for an interface that is simple to use and intuitive. You can get started with the service quickly, and you can automatically or on the fly spin up and down compute clusters of any size for any user or workload without impacting other jobs.

Is Snowflake worth learning?

If you want an easier approach to data warehousing without vendor lock-in, Snowflake may be your best bet. If you have very large workloads and/or need deep analytics functionality, however, you may want to go with Amazon, Google, or Microsoft.

What are the weaknesses of Snowflake?

Lack of support for unstructured data: Snowflake only supports structured and semi-structured data. Higher cost: Depending on the use case, Snowflake can be more expensive than competitors such as Amazon Redshift.

What SQL does Snowflake use?

Snowflake supports standard SQL, including a subset of ANSI SQL:1999 and the SQL:2003 analytic extensions. Snowflake also supports common variations for a number of commands where those variations do not conflict with each other.
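The SQL:2003 analytic extensions include window functions such as RANK() OVER (...). Because the syntax is standard, it can be sketched locally with Python's sqlite3 (which supports window functions since SQLite 3.25); the table and data here are invented, but the query itself uses the same standard syntax Snowflake accepts:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 40.0), ("acme", 10.0), ("globex", 25.0)],
)

# RANK() OVER (...) is a SQL:2003 analytic (window) function.
ranked = conn.execute(
    "SELECT customer, amount, RANK() OVER (ORDER BY amount DESC) FROM orders"
).fetchall()
print(ranked)  # [('acme', 40.0, 1), ('globex', 25.0, 2), ('acme', 10.0, 3)]
```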

Why is Snowflake better than Azure?

Snowflake offers native connectivity to multiple BI, data integration, and analytics tools. Azure, by contrast, relies on integration tools such as Logic Apps, API Management, Service Bus, and Event Grid for connecting to third-party services.

What big companies use Snowflake?

Companies Using Snowflake:
  • Amazon.
  • Google.
  • Microsoft.
  • Capital One.
  • Warner Music Group.
  • jetBlue.
  • DoorDash.
  • Allianz.
(As of September 5, 2022.)

Why is Snowflake better than Oracle?

Snowflake may be easier to use and can work out cheaper because of its ability to pause clusters when no queries are running. However, Oracle comes with support for cursors and built-in machine learning capabilities, helping you program and generate advanced insights from workloads.

What is the salary for Snowflake skills?

Snowflake developer salaries in India range between ₹4.5 lakh and ₹11.5 lakh, with an average annual salary of ₹6.5 lakh.

Do I need an ETL tool for Snowflake?

Snowflake is a SaaS data warehouse, not an ETL tool. You can store and manage data within Snowflake, but you'll need a separate tool for the ETL (extract, transform, load) process itself. Note that many teams instead use ELT (extract, load, transform), the modern variation on the traditional ETL workflow in which raw data is loaded first and transformed inside the warehouse.
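The ELT ordering can be sketched in a few lines: land the raw extract in a staging table first, then transform it with SQL inside the warehouse. Here Python's sqlite3 stands in for Snowflake, and the CSV extract is invented for illustration:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract from a source system (note the bad row).
raw = "id,amount\n1,10\n2,oops\n3,30\n"

# Load first: land the data as-is in a staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [(r["id"], r["amount"]) for r in csv.DictReader(io.StringIO(raw))],
)

# Transform inside the warehouse: keep rows whose amount starts with
# a digit, and cast both columns to proper types.
clean = conn.execute(
    "SELECT CAST(id AS INTEGER), CAST(amount AS REAL) "
    "FROM staging WHERE amount GLOB '[0-9]*'"
).fetchall()
print(clean)  # [(1, 10.0), (3, 30.0)]
```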

Is Python required for Snowflake?

The Snowflake Connector for Python requires Python version 3.7 or later. For more information about installing the required version of Python, see: Installing the Required Version of Python.
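A quick way to check that version requirement up front, plus a sketch of what a connection looks like once `snowflake-connector-python` is installed. The account, user, and password below are placeholders, not real values:

```python
import sys

def python_supported() -> bool:
    """The Snowflake Connector for Python requires Python 3.7 or later."""
    return sys.version_info >= (3, 7)

print(python_supported())  # True on any supported interpreter

# With the requirement met (pip install snowflake-connector-python),
# connecting looks roughly like this; the credentials are placeholders:
#
#   import snowflake.connector
#   conn = snowflake.connector.connect(
#       account="my_account",
#       user="my_user",
#       password="my_password",
#   )
#   print(conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone())
```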

What are the three layers of Snowflake?

Snowflake has three layers:
  • Storage Layer.
  • Compute Layer.
  • Cloud Services Layer.

What kind of mathematics is involved in a snowflake?

A branch of geometry called fractal geometry helps explain the figures of snowflakes. A mathematician, Helge von Koch, created the Koch snowflake based on the Koch fractal curve.
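The defining property of the Koch construction is easy to compute: each iteration replaces every edge with four edges one third as long, so the perimeter grows by a factor of 4/3 per step and increases without bound. A small sketch:

```python
def koch_perimeter(side: float, iterations: int) -> float:
    """Perimeter of a Koch snowflake grown from an equilateral triangle.

    Each iteration turns every edge into 4 edges of 1/3 the length,
    multiplying the total perimeter by 4/3.
    """
    return 3 * side * (4 / 3) ** iterations

print(koch_perimeter(1.0, 0))            # 3.0: the starting triangle
print(round(koch_perimeter(1.0, 3), 4))  # 7.1111: already past twice the original
```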

Can anyone learn Snowflake?

Yes. Introductory courses are suitable for anyone with an interest in learning Snowflake from the ground up. In such a course you'll learn how Snowflake can be used to launch and provision petabyte-scale data warehouses underpinned by the power of the cloud.

What are the six workloads of Snowflake?

  • Data Applications.
  • Data Engineering.
  • Data Marketplace.
  • Data Science.
  • Data Warehousing.
  • Marketing Analytics.
  • Unistore.

Is Snowflake training free?

Free options are available; for example, the free online course "An Introduction to Snowflake" on Udemy (indexed by Class Central).

How much does Snowflake training cost?

Paid courses are inexpensive as well. One popular Udemy course, "Master Snowflake cloud data warehouse with hands-on exercises," is rated 4.5 out of 5 across 2,624 reviews, spans 21 hours over 256 lectures at all levels, and is currently priced at $17.99 (down from $29.99).

Is Snowflake a DevOps tool?

Snowflake for DevOps

Snowflake enables developers to build data-intensive applications with no limitations on performance, concurrency, or scale. Thanks to its multi-cluster, shared data architecture, it scales horizontally and vertically on demand, delivering fast response times regardless of load.

Is Snowflake on AWS or Azure?

A Snowflake account can be hosted on any of the following cloud platforms:
  • Amazon Web Services (AWS)
  • Google Cloud Platform (GCP)
  • Microsoft Azure

Why is Snowflake called Snowflake?

Origins of the allegoric meaning

It is popularly believed that every snowflake has a unique structure. Most usages of "snowflake" make reference to the physical qualities of snowflakes, such as their unique structure or fragility, while a minority of usages make reference to the white color of snow.

Does Snowflake pay well?

How much do Snowflake employees get paid? The median yearly total compensation reported at Snowflake is $255,125.

Is Snowflake a good career choice?

80% of Snowflake employees would recommend working there to a friend based on Glassdoor reviews. Employees also rated Snowflake 3.8 out of 5 for work life balance, 4.1 for culture and values and 4.2 for career opportunities.

Who is Snowflake's biggest competitor?

Top Snowflake Data Cloud Alternatives
  • MongoDB Atlas.
  • Oracle Database.
  • Amazon Redshift.
  • DataStax Enterprise.
  • Redis Enterprise Cloud.
  • Db2.
  • CDP Data Hub.
  • Couchbase Server.

Which is better, Azure or Snowflake?

When assessing the two solutions, reviewers found Snowflake easier to use and do business with overall. However, reviewers preferred the ease of setup and administration of Azure Data Lake Store, and felt that Azure Data Lake Store met the needs of their business better than Snowflake.

How do you explain snowflakes to kids?

Tiny crystals of ice that fall to Earth are called snow. A crystal is a solid substance that has flat surfaces and sharp corners. Snowfall is made up of both single ice crystals and clumps of ice crystals. The clumps are called snowflakes.

What is a "snowflake" in simple terms?

"Snowflake" is a derogatory slang term for a person, implying that they have an inflated sense of uniqueness, an unwarranted sense of entitlement, or are overly-emotional, easily offended, and unable to deal with opposing opinions.

How do you explain a Snowflake project in an interview?

Explain Snowflake architecture

Snowflake is a cloud-built data warehouse that runs on AWS, Azure, or GCP infrastructure and is truly a SaaS offering. There is no software, hardware, ongoing maintenance, or tuning needed to work with Snowflake. Three main layers make up the Snowflake architecture: database storage, query processing, and cloud services.

What are 3 facts about snowflakes?

Every snowflake contains approximately 200 snow crystals. A snowflake has six sides. A snowflake falls at a speed of 3–4 miles per hour. The majority of the world's fresh water supply is held in ice and snow.

What are the 7 main shapes of a snowflake?

This system defines the seven principal snow crystal types as plates, stellar crystals, columns, needles, spatial dendrites, capped columns, and irregular forms.

What is Snowflake vs SQL Server?

SQL Server gives you complete control over the database: the backup schedule, high availability and disaster recovery, encryption, the amount of logging, and so on. Snowflake, in its Enterprise Edition, assures total data security with customer-managed encryption keys and offers HIPAA and PCI compliance.

Is Snowflake SQL or NoSQL?

Snowflake is fundamentally built to be a complete SQL database. It is a columnar-stored relational database and works well with Tableau, Excel and many other tools familiar to end users.
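"Columnar-stored" is worth unpacking, since it is a big part of why analytic queries are fast. A toy contrast in plain Python (the data is invented for illustration):

```python
# Row layout: each record kept together. Good for OLTP point lookups,
# because one record is one contiguous unit.
rows = [
    {"id": 1, "region": "east", "amount": 100.0},
    {"id": 2, "region": "west", "amount": 75.0},
    {"id": 3, "region": "east", "amount": 250.0},
]

# Columnar layout: each column kept together. An aggregate reads only
# the columns it needs and skips the rest entirely.
columns = {
    "id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [100.0, 75.0, 250.0],
}

total = sum(columns["amount"])  # touches one column, never the other two
print(total)  # 425.0
```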

How does ETL work in Snowflake?

What is the ETL Process? ETL is an acronym that represents “extract, transform, load.” During this process, data is gathered from one or more databases or other sources. The data is also cleaned, removing or flagging invalid data, and then transformed into a format that's conducive for analysis.
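The "removing or flagging invalid data" step can be sketched in a few lines of Python; the record shapes and field names here are invented for illustration:

```python
def transform(records):
    """Transform step of ETL: flag rows that fail validation rather than
    silently dropping them, and cast the rest to analysis-ready types."""
    valid, flagged = [], []
    for rec in records:
        try:
            valid.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
        except (KeyError, ValueError):
            flagged.append(rec)  # kept for inspection, not loaded
    return valid, flagged

extracted = [{"id": "1", "amount": "10.5"}, {"id": "2", "amount": "n/a"}]
valid, flagged = transform(extracted)
print(valid)    # [{'id': 1, 'amount': 10.5}]
print(flagged)  # [{'id': '2', 'amount': 'n/a'}]
```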

What is Snowflake, in summary?

Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and far more flexible than traditional offerings. The Snowflake data platform is not built on any existing database technology or “big data” software platforms such as Hadoop.




Article information

Author: Kareem Mueller DO

Last Updated: 05/16/2023