Full_Load - V2
Scenarios:
1. How to copy multiple tables from an Azure SQL Database to ADLS Gen2.
2. How to skip the Copy activity when a table is not available on the source side.
3. How to create and maintain a metadata table.
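The metadata table in scenario 3 can be sketched as a small control table in the source database that the Lookup activity reads; the names dbo.meta_tables, table_schema, and table_name below are illustrative assumptions, not fixed ADF names:

```sql
-- Hypothetical metadata (control) table that drives the pipeline.
CREATE TABLE dbo.meta_tables (
    table_schema  NVARCHAR(128) NOT NULL,
    table_name    NVARCHAR(128) NOT NULL
);

-- One row per source table to be copied to the bronze container.
INSERT INTO dbo.meta_tables (table_schema, table_name)
VALUES ('dbo', 'customers'), ('dbo', 'orders');

-- Alternatively, the Lookup query can read the table list directly from
-- INFORMATION_SCHEMA.TABLES instead of a maintained metadata table:
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';
```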
Estimations:
1. Azure Data Factory
- AutoResolve IR or Azure IR
- 4 datasets
- 3 linked services (source, destination, Key Vault)
- 1 pipeline
2. Lookup activity, ForEach activity (Get Metadata activity, If Condition activity
(True: Copy activity, False: Wait activity))
3. Azure SQL Server - source
4. Storage account - ADLS Gen2 - destination
5. Key Vault
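The activity chain in point 2 can be sketched as a trimmed pipeline definition; the pipeline, activity, and dataset names (pl_full_load_v2, LookupTableList, ForEachTable, and so on) are illustrative assumptions, and most typeProperties are omitted:

```json
{
  "name": "pl_full_load_v2",
  "activities": [
    { "name": "LookupTableList", "type": "Lookup",
      "typeProperties": { "firstRowOnly": false } },
    { "name": "ForEachTable", "type": "ForEach",
      "dependsOn": [ { "activity": "LookupTableList",
                       "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@activity('LookupTableList').output.value",
                   "type": "Expression" },
        "activities": [
          { "name": "Get Metadata1", "type": "GetMetadata" },
          { "name": "IfTableExists", "type": "IfCondition",
            "typeProperties": {
              "expression": {
                "value": "@equals(activity('Get Metadata1').output.exists, true)",
                "type": "Expression" },
              "ifTrueActivities":  [ { "name": "CopyTable", "type": "Copy" } ],
              "ifFalseActivities": [ { "name": "Wait1", "type": "Wait" } ]
            } }
        ]
      } }
  ]
}
```

Note that firstRowOnly must be false so the Lookup returns the whole table list, not just the first row.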
Imp points:
1. In ADF there is a relationship between the previous activity and the next activity.
2. The previous activity produces some output; that output serves as input to the
next activity.
3. The Lookup activity always talks in terms of "value".
4. The ForEach activity always talks in terms of "item".
5. From the SQL Server point of view, we talk in terms of "row".
6. Dataset parameters - for handling a bulk of tables, where each table has its
own schema name and table name, and for maintaining my own naming convention
once the data is successfully copied from Azure SQL to ADLS Gen2 under the
bronze container.
7. Don't create dataset parameters for the Lookup activity, because the purpose of
the Lookup activity here is to get the full list of tables, not to point at any
single table.
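Points 3-5 above show up directly in the ADF expression language; a minimal sketch, assuming the Lookup activity is named Lookup1 and its query returns a TABLE_NAME column:

```
// Point 3: the Lookup activity exposes its result set as an array called "value"
@activity('Lookup1').output.value

// Point 4: inside a ForEach, the current element is addressed with item()
@item()

// Point 5: each item is one SQL row, so its columns are reached by name
@item().TABLE_NAME
```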
Step 2:
Linked services:
Datasets:
Lookup activity:
1. Lookup
- dataset: no need to select any table
2. ForEach
Get Metadata:
- key: Exists
- dataset: no need to select any table
- Step 1: create dataset parameters.
- Step 2: go to the connection under the dataset, click on Table, click on Enter
manually, then click on Add dynamic content.
- Step 3: go to the pipeline, click on the Get Metadata activity, and you will see
the dataset parameters.
- We need to pass the values, but since we are inside the ForEach activity, ForEach
understands only "item".
- Pass dynamic expressions for the values.
- Take the left-side values from the Lookup activity output.
Dataset parameters for the Get Metadata activity:
@equals(activity('Get Metadata1').output.exists,true)
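Assuming the Get Metadata dataset has parameters named schemaname and tablename (illustrative names), and the Lookup query returned TABLE_SCHEMA and TABLE_NAME columns, the parameter values passed inside the ForEach would be:

```
// Dataset parameter values on the Get Metadata activity;
// TABLE_SCHEMA / TABLE_NAME are assumed column names from the Lookup query.
schemaname : @item().TABLE_SCHEMA
tablename  : @item().TABLE_NAME
```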
Copy activity:
Source:
Sink:
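For the sink side under the bronze container (point 6), the folder and file names can be built from the same dataset parameters; the container name, folder layout, and .csv extension below are assumptions for the naming convention, not fixed values:

```
// Sink dataset connection, built with dynamic content (illustrative)
container : bronze
directory : @{dataset().schemaname}
file name : @{dataset().tablename}.csv
```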