
Azure Data Factory (ADF) How To Tip: Metadata-driven Copy Task = Flexible Copy Activity

5,251 views

Data Platform Central

1 day ago

Comments: 24
@ASHOKKUMAR-vx9rh 2 years ago
Really nice explanation, Murugan. Without creating any pipeline activities manually in Data Factory, the tool generates the scripts and activities automatically, so it greatly reduces the time needed to copy data from SQL Server to ADLS. Thank you so much for sharing this information.
@dataplatformcentral3036 2 years ago
Thanks, Ashok. Glad that you liked it :) Feel free to share it across. Also feel free to subscribe and hit the bell icon to get notifications on new videos. A new video will be released each week on Azure/Power Platform topics.
@cedriclabuschagne5978 2 years ago
Great video. Now to do this programmatically rather than through the user interface: generation rather than configuration, to be truly metadata driven, for when one wants to do this for multiple sources and hundreds of tables.
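For illustration only, a minimal sketch of that "generation" idea against SQL Server system views, assuming a simplified custom control table (dbo.CopyControl, with hypothetical columns) rather than the exact table the Copy Data tool generates:

-- Hypothetical simplified control table (not the exact schema ADF generates)
CREATE TABLE dbo.CopyControl
(
    Id           INT IDENTITY(1,1) PRIMARY KEY,
    SourceSchema SYSNAME      NOT NULL,
    SourceTable  SYSNAME      NOT NULL,
    SinkFolder   VARCHAR(400) NOT NULL
);

-- Generate one row per user table instead of configuring each table in the UI
INSERT INTO dbo.CopyControl (SourceSchema, SourceTable, SinkFolder)
SELECT  s.name,
        t.name,
        CONCAT('raw/', s.name, '/', t.name)
FROM    sys.tables  AS t
JOIN    sys.schemas AS s ON s.schema_id = t.schema_id
WHERE   t.is_ms_shipped = 0;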
@dataplatformcentral3036 2 years ago
Glad that you liked it :) Feel free to like, subscribe and share it across.
@srikanthvarma5026 2 years ago
Super explanation, Murugan. Thank you for the video.
@dataplatformcentral3036 2 years ago
Thanks :) Feel free to like, share and subscribe. Make sure you press the bell icon to get notifications on the new videos every week.
@arindomjit 2 years ago
Thanks for the in-depth video. It was very helpful. If I have different sources (SQL, Oracle, DB2), how would you recommend making entries into the control table for each of these different sources?
@dataplatformcentral3036 2 years ago
If you have different sources, you need multiple metadata-driven pipelines. Properties like the IR name, database type and file format type cannot be parameterized as of today, so you would need a separate parameterized pipeline targeting each source. They can all use the same control table, though, and each will store its own metadata based on the tables included. This is documented here as well: docs.microsoft.com/en-us/azure/data-factory/copy-data-tool-metadata-driven
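To illustrate the "one shared control table, one pipeline per source" point, a rough sketch assuming a simplified custom control table with a hypothetical SourceSystem column (the real ADF-generated control table has its own schema):

-- Illustrative shared control table tagged per source system
CREATE TABLE dbo.SharedCopyControl
(
    Id           INT IDENTITY(1,1) PRIMARY KEY,
    SourceSystem VARCHAR(50)  NOT NULL,   -- 'SQLServer', 'Oracle', 'DB2', ...
    SourceSchema VARCHAR(128) NOT NULL,
    SourceTable  VARCHAR(128) NOT NULL,
    SinkFolder   VARCHAR(400) NOT NULL
);

INSERT INTO dbo.SharedCopyControl (SourceSystem, SourceSchema, SourceTable, SinkFolder)
VALUES ('SQLServer', 'Sales', 'SalesOrderHeader', 'raw/sqlserver/sales/salesorderheader'),
       ('Oracle',    'HR',    'EMPLOYEES',        'raw/oracle/hr/employees'),
       ('DB2',       'SALES', 'ORDERS',           'raw/db2/sales/orders');

-- The lookup in each source-specific pipeline filters on its own system
SELECT SourceSchema, SourceTable, SinkFolder
FROM   dbo.SharedCopyControl
WHERE  SourceSystem = 'Oracle';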
@vibhaskashyap8247 2 years ago
Nice video
@dataplatformcentral3036 2 years ago
Thanks! Feel free to like, share and subscribe. Click the bell icon to get notifications for new videos.
@prabhatratnala6589 1 year ago
Wonderful. Quick question: how does version control work for the control table? Does it auto-integrate with Azure Repos? If yes, how?
@dataplatformcentral3036 1 year ago
Sorry, can you be more specific? Version control of what?
@DilipDiwakarAricent 2 years ago
Nice explanation, Murugan. Did you create any video on building a Databricks Lakehouse solution using ADF?
@dataplatformcentral3036 2 years ago
Nope, haven't worked much with Databricks yet!
@terryliu3635 2 years ago
Great demo. Thanks for sharing! Quick question: do you know if this preview feature will be able to support the SAP Table connector, as we're trying to load the data from SAP ECC? Thanks.
@dataplatformcentral3036 2 years ago
Sorry, not too sure on that. I've not worked with SAP yet! But since it is supported as a source for the Copy activity, I would expect it to work fine for the metadata-driven copy task too.
@Deepaksharma-we4eo 2 years ago
Hi sir, I need your help. Is there any other approach for tables having foreign key relationships? I am getting a foreign key violation error message.
@dataplatformcentral3036 2 years ago
At which step are you getting the error?
@ketanmehta3058 2 years ago
How can we add a WHERE clause to the query?
@dataplatformcentral3036 1 year ago
A WHERE clause in your data extraction query or in the control table one?
@ketanmehta3058 1 year ago
@dataplatformcentral3036 In the metadata-driven copy we can select only the table name, but it did not give the option to apply filter criteria. As an example: select * from emp where dept = 'HR'. I want to include that WHERE clause for the table. How can I achieve it?
@nigelnaicker7948 1 year ago
@ketanmehta3058 When you select the views/tables you want to copy, you are given the option to configure the individual tables/views or use the same config for all. If you click on configure individually, there is an advanced expander; when you click it, you will see the query in use, which is generally 'select * from table', and you can edit this query however you want. Hope this helps.
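For example (purely illustrative, reusing the emp/dept example from the question above), the per-table query can be edited in that advanced expander from the default to a filtered one:

-- Default query generated for the selected table
SELECT * FROM emp;

-- Edited in the advanced expander to apply the filter criteria
SELECT * FROM emp WHERE dept = 'HR';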
@snad256 2 years ago
Hi, from where did you get the Sales table?
@dataplatformcentral3036 2 years ago
Sales is a schema under which different tables exist within the test database. It's taken from the AdventureWorks sample database.
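For reference, assuming the AdventureWorks sample database is restored as the test database, the tables under the Sales schema can be listed with:

-- List the base tables that belong to the Sales schema
SELECT TABLE_SCHEMA, TABLE_NAME
FROM   INFORMATION_SCHEMA.TABLES
WHERE  TABLE_SCHEMA = 'Sales'
  AND  TABLE_TYPE   = 'BASE TABLE';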