Read CSV Data from External Stage & write in Permanent Snowflake Table using Snowflake Snowpark

This video explains how to read CSV data from a Snowflake external stage into a Snowpark DataFrame and write the DataFrame to a permanent Snowflake table.

Documentation Link:
-----------------------
https://docs.snowflake.com/ko/developer-guide/snowpark/reference/python/api/snowflake.snowpark.DataFrameReader.html

Prerequisite:
---------------
Getting Started with Snowpark for Python on Snowflake
https://youtu.be/1-lXIrPZTCo
SQL Code used:
--------------------
--drop database if required
drop database if exists ramu;
--Create Database
create database if not exists ramu;
--use the database
use ramu;
--Table Creation
create or replace table video (
    id int,
    sepallength number(10,4),
    sepalwidth number(10,4),
    petallength number(10,4),
    petalwidth number(10,4),
    classname varchar(200)
);
--file-format creation
create or replace file format my_csv_format
type = csv field_delimiter = ',' skip_header = 1
field_optionally_enclosed_by = '"'
null_if = ('NULL', 'null')
empty_field_as_null = true;
--stage creation
create or replace stage ramu.PUBLIC.snow_simple url="s3://...."
credentials=(aws_key_id='{}'
aws_secret_key='{}')
file_format = my_csv_format;
--check the data present in S3
list @ramu.PUBLIC.snow_simple;
--Execute the queries (processeddata & processeddata1 are created by the Snowpark code below)

select * from video;

select * from processeddata;

select * from processeddata1;
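The file format above controls how the raw CSV is parsed: comma delimiter, one skipped header row, optional double-quote enclosures, and the literals `NULL`/`null` mapped to SQL NULL. A minimal local sketch of those same rules with Python's stdlib `csv` module (the sample rows are hypothetical, not from the actual stage):

```python
import csv
import io

# Hypothetical sample mirroring an iris-style CSV: comma-delimited,
# one header row, optionally quoted text fields, literal NULLs.
raw = (
    "id,sepallength,sepalwidth,petallength,petalwidth,classname\n"
    '1,5.1,3.5,1.4,0.2,"Iris-setosa"\n'
    "2,4.9,NULL,1.4,0.2,Iris-versicolor\n"
)

# field_delimiter = ',' and field_optionally_enclosed_by = '"'
reader = csv.reader(io.StringIO(raw), delimiter=",", quotechar='"')
next(reader)  # skip_header = 1

rows = []
for rec in reader:
    # null_if = ('NULL', 'null'): map those literals to None
    rows.append([None if v in ("NULL", "null") else v for v in rec])

print(rows[1][2])  # → None (the NULL sepalwidth)
```

This is only an illustration of the parsing rules; in the real pipeline Snowflake applies them server-side when the stage is read.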
Snowpark Code:
---------------
from snowflake.snowpark import Session
from snowflake.snowpark.functions import *
from snowflake.snowpark.types import *
#set up the connection (fill in your own account, user, and password)
connection_parameters = {
    "account": "",
    "user": "",
    "password": "",
    "role": "ACCOUNTADMIN",
    "warehouse": "COMPUTE_WH",
    "database": "RAMU",
    "schema": "PUBLIC",
}

session = Session.builder.configs(connection_parameters).create()

#read without an explicit schema first
iris_flower_read=session.read.csv('@ramu.PUBLIC.snow_simple')
schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("sepallength", DoubleType(), True),
    StructField("sepalwidth", DoubleType(), True),
    StructField("petallength", DoubleType(), True),
    StructField("petalwidth", DoubleType(), True),
    StructField("classname", StringType(), True),
])

iris_flower_read=session.read.schema(schema).csv('@ramu.PUBLIC.snow_simple')

iris_flower_read.show()
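The `StructType` above is just a mapping from column names to types, with every column nullable. A local sketch (not the Snowpark API) of what applying that schema to one raw CSV row amounts to:

```python
# Local sketch of the StructType above: column name -> Python converter.
# All columns are nullable (the True in each StructField).
schema = [
    ("id", int), ("sepallength", float), ("sepalwidth", float),
    ("petallength", float), ("petalwidth", float), ("classname", str),
]

def apply_schema(raw_row):
    # None (i.e. SQL NULL) passes through; everything else is converted.
    return {name: (None if v is None else cast(v))
            for (name, cast), v in zip(schema, raw_row)}

row = apply_schema(["1", "5.1", "3.5", "1.4", "0.2", "Iris-setosa"])
print(row["sepallength"] + row["sepalwidth"])  # → 8.6
```

In Snowpark the conversion happens inside Snowflake; this just shows why the declared types matter, e.g. arithmetic works on the `DoubleType` columns once they are typed.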
#pass the file-format options explicitly
iris_flower_read=session.read.options({"field_delimiter": ",", "skip_header": 1}).\
schema(schema).csv('@ramu.PUBLIC.snow_simple')
iris_flower_read.show()

#now put a file with bad records in the stage and re-read
iris_flower_read=session.read.options({"field_delimiter": ",", "skip_header": 1}).\
schema(schema).csv('@ramu.PUBLIC.snow_simple')
iris_flower_read.show()
iris_flower_read.show(1000)  #fails on the bad records with the default ON_ERROR behaviour
#ON_ERROR = 'Continue' skips the bad records instead of failing the query
iris_flower_read=session.read.options({"field_delimiter": ",", "skip_header": 1,"ON_ERROR": "Continue"}).\
schema(schema).csv('@ramu.PUBLIC.snow_simple')
iris_flower_read.show(1000)
#DataFrames are lazy: each action below re-executes the query against the stage
iris_flower_read.count()
iris_flower_read.count()
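What `ON_ERROR = 'Continue'` buys can be sketched locally: rows that fail type conversion are skipped instead of aborting the whole load. This is plain Python with hypothetical rows, not the Snowflake loader itself:

```python
# Local sketch of ON_ERROR = 'Continue': skip unparseable rows, keep loading.
raw_rows = [
    ["1", "5.1", "3.5", "1.4", "0.2", "Iris-setosa"],
    ["2", "bad", "3.0", "1.4", "0.2", "Iris-setosa"],   # malformed numeric field
    ["3", "4.7", "3.2", "1.3", "0.2", "Iris-setosa"],
]

loaded, skipped = [], 0
for r in raw_rows:
    try:
        loaded.append((int(r[0]), *(float(v) for v in r[1:5]), r[5]))
    except ValueError:
        skipped += 1  # the row is dropped rather than failing the load

print(len(loaded), skipped)  # → 2 1
```

This is why `count()` after switching to `ON_ERROR: "Continue"` returns fewer rows than the files contain: the bad records are silently dropped.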

#cache_result materializes the rows once so later actions reuse them
ms=iris_flower_read.cache_result()
ms.count()
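The point of `cache_result()` is that a Snowpark DataFrame is lazy: every action (`show`, `count`) re-runs the query against the stage, while caching executes it once and reuses the materialized rows. A local sketch of that difference, with a stand-in function instead of a real Snowflake query:

```python
# Local sketch of lazy vs. cached evaluation (not the Snowpark internals).
executions = {"n": 0}

def run_query():
    # Stands in for Snowpark re-scanning the stage on every action.
    executions["n"] += 1
    return [1, 2, 3]

# Lazy DataFrame: every action triggers a fresh execution.
count_a = len(run_query())
count_b = len(run_query())
assert executions["n"] == 2

# cache_result(): execute once, then every action reuses the materialized rows.
cached = run_query()          # third and final execution
count_c = len(cached)
count_d = len(cached)
assert executions["n"] == 3
```

That is why calling `count()` twice on `iris_flower_read` above hits the stage twice, while repeated actions on `ms` do not.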

#write the staged data from S3 into permanent Snowflake tables
iris_flower_read.write.mode('overwrite').save_as_table('processeddata')

#'append' adds rows, so running it twice doubles the row count
iris_flower_read.write.mode('append').save_as_table('processeddata1')
iris_flower_read.write.mode('append').save_as_table('processeddata1')
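The difference between the two save modes can be sketched with SQLite (stdlib, not Snowflake): `overwrite` replaces the table's contents, while `append` keeps adding, so the double append to `processeddata1` above leaves it with twice the rows of `processeddata`. Table and column names here are illustrative:

```python
import sqlite3

# Sketch of 'overwrite' vs 'append' write semantics using SQLite.
rows = [(1, "Iris-setosa"), (2, "Iris-versicolor")]
con = sqlite3.connect(":memory:")

def write(table, mode):
    if mode == "overwrite":
        con.execute(f"DROP TABLE IF EXISTS {table}")   # replace prior contents
    con.execute(f"CREATE TABLE IF NOT EXISTS {table} (id INT, classname TEXT)")
    con.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)

write("processeddata", "overwrite")
write("processeddata", "overwrite")   # still 2 rows: contents were replaced
write("processeddata1", "append")
write("processeddata1", "append")     # now 4 rows: rows accumulate

counts = [con.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
          for t in ("processeddata", "processeddata1")]
print(counts)  # → [2, 4]
```

The three `select` queries in the SQL section let you confirm the same effect in Snowflake.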
Check this playlist for more Data Engineering related videos:
https://youtube.com/playlist?list=PLjfRmoYoxpNopPjdACgS5XTfdjyBcuGku

Apache Kafka from scratch
https://youtube.com/playlist?list=PLjfRmoYoxpNrs0VmIq6mOTqXP52RfZdRf

Snowflake Complete Course from scratch with an end-to-end project and in-depth explanation:
https://doc.clickup.com/37466271/d/h/13qc4z-104/d4346819bd8d510

🙏🙏🙏🙏🙏🙏🙏🙏
YOU JUST NEED TO DO
3 THINGS to support my channel
LIKE
SHARE
&
SUBSCRIBE
TO MY YOUTUBE CHANNEL

Video "Read CSV Data from External Stage & write in Permanent Snowflake Table using Snowflake Snowpark" by the Knowledge Amplifier channel
Video info: published April 2, 2023, 11:27:42 · duration 00:32:09