Comments
@peterdaniels3428
@peterdaniels3428 1 day ago
Awesome video. Thank you! I'm still confused about the region aspects/limitations for Managed Private Endpoints. Are you saying that the azure resource region and the fabric region have to be the same?
@bumdinh9911
@bumdinh9911 2 days ago
Sir, can we pass notebook parameters to a Lookup activity in a Data Pipeline?
@Tales-from-the-Field
@Tales-from-the-Field 9 hours ago
Hi @bumdinh9911, per Bradley: "Hello! That is a great question; sadly, I could not find a way to do it at this time. The closest I found was triggering the Job Scheduler API for Microsoft Fabric - you can run a job from a pipeline, but there are limitations, and one listed is passing a parameter via the API. Now, just to play the other side: while that is not possible, you could write the notebook information to a DW or to an Azure SQL Database table, and then do a Lookup operation to get the data. So we can accomplish the task of getting data from a notebook back into a pipeline, but it is not as straightforward as I was hoping. If that changes - or I should say 'when that changes' - I promise you I will make a video on it!"
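The notebook-to-table-to-Lookup workaround described above can be sketched like this. Everything here is a hypothetical illustration: sqlite3 stands in for the Azure SQL Database (or DW) table, and the function and table names are made up.

```python
import sqlite3

# Sketch of the workaround: the notebook persists its output to a SQL table,
# and the pipeline's Lookup activity reads it back. sqlite3 is a stand-in for
# the real Azure SQL Database / Fabric DW connection.

def notebook_write_output(conn, run_id, value):
    """What the notebook would do: persist its output for the pipeline."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notebook_output (run_id TEXT, value TEXT)"
    )
    conn.execute(
        "INSERT INTO notebook_output (run_id, value) VALUES (?, ?)",
        (run_id, value),
    )
    conn.commit()

def pipeline_lookup(conn, run_id):
    """What the Lookup activity would do: fetch the value for this run."""
    row = conn.execute(
        "SELECT value FROM notebook_output WHERE run_id = ?", (run_id,)
    ).fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
notebook_write_output(conn, "run-001", "42 rows processed")
print(pipeline_lookup(conn, "run-001"))  # -> 42 rows processed
```

In a real pipeline you would point the Lookup activity at the same table the notebook wrote to, keyed by some run identifier you control.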
@dubbocom
@dubbocom 6 days ago
Thank you! Great help/advice - cheers
@Godzillastinkydiaper
@Godzillastinkydiaper 13 days ago
#beforetheguywhocomment
@garimasharma2558
@garimasharma2558 19 days ago
Hello, nice explanation. Can you please tell me, when we create capacity from the Azure portal, under which type of Azure subscription can we create it? Is it PAYG, CSP, or CSP (NCE), or can we create it under all of these subscription types?
@Tales-from-the-Field
@Tales-from-the-Field 7 days ago
Hello Garima, thank you for reaching out. You can go to Fabric Capacity in your Azure portal to create one under the same subscription and tenant. You can create it under any of the subscription types.
@dankboii
@dankboii 21 days ago
Is it possible to trigger the pipeline when a file is added to the lakehouse?
@Tales-from-the-Field
@Tales-from-the-Field 9 days ago
Hi @dankboii, per Bradley: "Not yet - 'yet' being the key word. As of right now this is specifically for Azure Blob Storage only."
@User-k3z
@User-k3z 21 days ago
Hi, thanks for the video. I've created a VNet and subnet and connected them with a virtual network data gateway as per your video. Now I'm trying to connect to ADLS Gen2, but it shows an error like this: Invalid credentials. (Session ID: 7e3a7126-ca0c-4f1f-b762-c40a284fcc5a, Region: us). Please let me know - that would be really helpful for me.
@robcarrol
@robcarrol 25 days ago
Love this, thanks!
@Tales-from-the-Field
@Tales-from-the-Field 23 days ago
Thank you for watching @robcarrol!
@gonzamole
@gonzamole 29 days ago
How we've complicated our lives
@DBABulldog
@DBABulldog 7 days ago
We do, don't we? I did a more recent video on Elastic Jobs that is a wee bit lighter when it comes to moving pieces.
@Nalaka-Wanniarachchi
@Nalaka-Wanniarachchi 1 month ago
Wow, this seems like magic!
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Gotta love that Fabric Magic!
@SQLTalk
@SQLTalk 1 month ago
This video is SUPER helpful. Thank you very much for letting us know how to incorporate Fabric Eventhouse with Database Watcher. Super helpful.
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Thank you Kirby! We really appreciate you and all your great work!
@SQLTalk
@SQLTalk 1 month ago
Wonderful video. Thank you for making this.
@sheaerickson537
@sheaerickson537 1 month ago
Great video, thank you!
@DBABulldog
@DBABulldog 7 days ago
Glad you liked it, my friend. Keep coming back to the channel for more great content from the team.
@rhyslewis7922
@rhyslewis7922 1 month ago
Do you know if you can use Spark streaming from a mirrored Azure SQL data warehouse? If you try to enable change data feed via a notebook in the lakehouse the mirrored database is shortcutted from, it errors out saying it's unavailable from a shortcut. And yet SET TBL PERMISSIONS is not supported when trying to enable change data feed via the SQL endpoint in the data warehouse.
@TheSQLPro
@TheSQLPro 1 month ago
Great demo!!
@Tales-from-the-Field
@Tales-from-the-Field 9 days ago
Thank you @TheSQLPro!
@clvc699
@clvc699 1 month ago
How can you pass parameters to the WHERE in a SELECT statement?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @claudiovasquezcampos9558, are you looking for this in a Spark SQL statement in a notebook, or in a T-SQL statement as part of a data pipeline task against Azure SQL DB, Azure SQL MI, SQL Server, or Fabric DW?
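Whichever engine it turns out to be, the placeholder pattern is similar. Here is a minimal, self-contained sketch using Python's sqlite3 module (a stand-in so the example runs anywhere); the same "?" style applies with pyodbc against SQL Server / Fabric DW, and recent Spark notebooks accept named parameters via `spark.sql(query, args)`.

```python
import sqlite3

# Parameterized WHERE clause: pass the value via a placeholder instead of
# concatenating it into the SQL string. sqlite3 stands in for the real engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100), ("west", 250), ("east", 75)],
)

region = "east"  # the parameter you want to feed into the WHERE clause
rows = conn.execute(
    "SELECT region, amount FROM sales WHERE region = ? ORDER BY amount DESC",
    (region,),
).fetchall()
print(rows)  # -> [('east', 100), ('east', 75)]
```

Besides convenience, the placeholder keeps the parameter out of the SQL text, which avoids injection issues and quoting bugs.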
@fazizov
@fazizov 2 months ago
Great video, thanks, Bradley. Based on my understanding, mirroring doesn't allow cross-tenant connections. I have a trial Fabric account but can't add any other Azure services, like ASQL, there (the pay-as-you-go option is disabled for some reason). Any ideas on how to overcome that problem?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @fazizov, per Bradley: "Hello sir! If I'm correct in my understanding, you have a Fabric / Power BI tenant and the Azure account is not associated with it through the M365 tenant. The Azure account that you want to mirror must be associated with the M365 account (I'm not an M365 expert, but this is what I've been told). If that is the case, Mirroring should not be limited by the trial capacity; you should be able to mirror into it. The trial capacity is equivalent to an F64, so you should have the functionality of an F64, other than those items called out as not included in the trial capacity, like private endpoints and managed identity. If your Azure account is associated with the M365 account your tenant is in, please let me know - Mirroring should not be facing any other restriction."
@fazizov
@fazizov 1 month ago
Thank you very much for the detailed explanation, I will definitely try that advice!
@ismailbartolo9741
@ismailbartolo9741 2 months ago
Hi, can you make a tutorial on exploiting the capabilities of Delta tables? Specifically, how previously stored data can be accessed using versioning to analyze changes in our data over time. Should we store the data from our Delta table version 1 in another Delta table, or should we create a view for subsequent visualizations in Power BI? Thank you again for your videos!
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Absolutely love this suggestion @ismailbartolo9741, we will get Bradley to start working on this!
@ismailbartolo9741
@ismailbartolo9741 1 month ago
@@Tales-from-the-Field 🔥🔥🔥 thanks 😊
@abeerahmed5634
@abeerahmed5634 2 months ago
I want to use the output of Notebook 1 in another notebook (it is using SMTP to send mail, and the mail should contain the output). How do I do it?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @abeerahmed5634, could we get a little more information? Are you building text from Lakehouse fields, or is it plugging in numbers? You may not be able to get too specific, but we are trying to understand what parameters we need to define to send from Notebook 1 into the child notebook.
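While the details get worked out, the mail-building side of Notebook 2 can be sketched like this. Everything here is a hypothetical illustration: the output value, addresses, and SMTP server are placeholders, and the actual send is left commented out because credentials and server are deployment-specific.

```python
from email.message import EmailMessage

# Hypothetical value received from Notebook 1 - in Fabric this would arrive
# as a notebook parameter or be read back from a Lakehouse/SQL table.
notebook1_output = "Rows loaded: 1234"

# Build the mail so the body contains the output from Notebook 1.
msg = EmailMessage()
msg["Subject"] = "Notebook 1 results"
msg["From"] = "sender@example.com"      # placeholder address
msg["To"] = "recipient@example.com"     # placeholder address
msg.set_content(f"Run finished.\n\n{notebook1_output}")

# The send itself would use smtplib; server/credentials are assumptions:
# import smtplib
# with smtplib.SMTP("smtp.example.com", 587) as server:
#     server.starttls()
#     server.login("user", "password")
#     server.send_message(msg)

print(msg.get_content())
```

The key design point is separating "get the output into this notebook" (a parameter or a table read) from "embed it in the message body", so the mailing notebook does not care how the value was produced.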
@peterjacobs3749
@peterjacobs3749 2 months ago
Hello Daniel - thank you for the Video! I am teaching and want to bring my beginner SQL Admins to ADS, so this is very helpful! May I ask you which video software you are using for making this presentation? I like how you highlight the commands/areas to focus on. Is it Camtasia?
@danieltaylor6623
@danieltaylor6623 2 months ago
You're very welcome. Thank you for coming to the channel. The team does utilize Camtasia for all our video editing. If you keep an eye out, they drop good deals around the holidays.
@peterjacobs3749
@peterjacobs3749 2 months ago
@@danieltaylor6623 Thank you very much Daniel!
@777bashir
@777bashir 2 months ago
What if I need to use a pipeline in Fabric itself, not ADF - is that possible?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @777bashir Yes this is completely possible! Bradley released a video on this just 11 days ago, check it out here: kzfaq.info/get/bejne/obOGp6ulxNecoH0.html
@terryliu3635
@terryliu3635 2 months ago
Thanks for sharing!! The current copilot is related to DataFlowGen2, right? Are there any copilot capabilities associated with Pipeline orchestration?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @terryliu3635, per Bradley: "Yes Terry, you are correct - this is just for the Dataflow Gen2 process in this video. Ok, scratch my previous comment; I realized that link goes to the Data Factory page. On the public roadmap it says 'In the future (Q2 CY2024), we'll also introduce Copilot for Data Factory in Data Pipelines.' Here's the link to that: learn.microsoft.com/en-us/fabric/release-plan/data-factory#copilot . The good news is we are currently in Q2, so hopefully we will have this soon!"
@zongyili569
@zongyili569 2 months ago
Hi, I am able to create the VNet gateway, and the connection status is online. However, the data gateway is not loaded when I try to create the connection - the data gateway dropdown list is empty. Would you advise what the issue could be?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @zongyili569, per Bradley: "Just want to make sure I understand this correctly. We've registered the Microsoft.PowerPlatform resource provider, provisioned a virtual network, set up a subnet for the Microsoft Fabric tenant to use, and delegated the subnet to the service Microsoft.PowerPlatform/vnetaccesslinks. We went to our Fabric tenant, and when we selected Virtual network data gateways we had a license capacity; you could then select the Azure subscription and the resource group the VNet was provisioned in, and it didn't show up? My initial thought would be a tenant where the Azure subscription is not attached to the Fabric tenant, but if you can see the subscription and the resource group, then you may need to call support. I'd love to hear from you on this to see what the resolution is."
@zongyili569
@zongyili569 2 months ago
@@Tales-from-the-Field Hi, it was my mistake. Fabric pipelines don't support the VNet data gateway right now. That's why I can't see those VNet data gateway connections.
@OneNI83
@OneNI83 2 months ago
With this method, can we bulk import tables (let's say we want tables that are filtered using a query, and we need to import those tables)? What would be the maximum that can be exported in one go?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @OneNI83, per Bradley: "Look at this much like Azure Data Factory. Use it for your initial forklift, based on your comfort with this and other technologies. If you are a T-SQL warehouse person, land the data in files and do a CTAS to load it, or load it directly to a warehouse and then use T-SQL to transform your data. If you are a Spark person, land it in files and use a notebook to transform and load your data. This is a powerful tool, but there are multiple ways to ingest data after it lands. So, super long answer for a short question: yes, you can use this to bulk load. As for a maximum, there's not really a limit, and you could scale the parallelism to increase throughput for VLDB workloads."
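The "land it in files, then CTAS to load" idea from the reply can be illustrated with a tiny self-contained sketch. sqlite3 stands in here so the example runs anywhere: a staging table plays the role of the landed files, and CREATE TABLE ... AS SELECT produces the curated table; in a Fabric Warehouse the equivalent would be a T-SQL CTAS over the staged data.

```python
import sqlite3

# Staging table stands in for raw landed data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, 10.0, "ok"), (2, -1.0, "bad"), (3, 25.5, "ok")],
)

# CTAS: create the target table directly from a transforming SELECT,
# filtering out bad rows as part of the load.
conn.execute(
    """
    CREATE TABLE orders AS
    SELECT id, amount FROM staging_orders WHERE status = 'ok'
    """
)
rows = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(rows)  # -> 2
```

The design point is that the bulk copy only has to be fast and dumb; the transform lives in the CTAS (or a notebook) after the data lands.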
@adilmajeed8439
@adilmajeed8439 2 months ago
Thanks for sharing. When you created the connection in the Fabric service and then clicked to see the properties again, after scrolling to the bottom - I can't see the staging part in my connection that you showed in the properties dialog box. The Power BI Gateway has already been updated. Any pointers?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @adilmajeed8439 per Bradley, "Hello sir, it's a little hard to troubleshoot this on KZfaq. Could you hit me up on Linkedin and we could DM with screen shots?" www.linkedin.com/in/sqlballs/
@mathiassaint
@mathiassaint 2 months ago
Thanks for a great demo. Can you explain why you're not able to use the SQL endpoint directly in PBI?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @mathiassaint, per Bradley: "Sure thing. For this demo, what I wanted to showcase was that no matter what format the data is stored in within Microsoft Fabric, we can access it using any engine we would like. So why did I use a KQL DB instead of the SQL endpoint? I could have done that. However, this data set is very linear in nature and I'm using Direct Query. The KQL engine works really well with time-series data. I could create T-SQL stored procedures to accomplish something similar, but KQL's query capabilities over a specific slice of time - one that grows as I write more data to the database - let me get this quickly, in a way that will not deteriorate with volume over time and remains lightning fast. It could be nothing more than preference on my part, but if I were building this for customer use, that's the engine I would recommend for this workload. Thanks for the question!"
@knuckleheadmcspazatron4939
@knuckleheadmcspazatron4939 2 months ago
I think you went by the connection too fast. Spending more time on a properly authenticated method using identity should have been covered, because your viewers are working with production workloads, not just setting up a demo. Otherwise, very useful content. Thanks for taking the time.
@adilmajeed8439
@adilmajeed8439 2 months ago
Anything for Spatial?
@michamatracz706
@michamatracz706 2 months ago
I would use SQLAlchemy for the local connection, then load the data in this case. Great content!
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @michamatracz706, "Thank you! We've actually got some content we are planning to do with SQLAlchemy. That package has a lot of great stuff in it. Hope you stay tuned!"
@NicolasPappasA
@NicolasPappasA 2 months ago
Amazing Demo...Thank you Bradley.
@NicolasPappasA
@NicolasPappasA 2 months ago
Bradley how did you get the max bubble in your Power BI working like that?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @NicolasPappasA, per Bradley: "I'm breaking down building this report a bit more in my next video. For the max size, I used a measure to make the most recent bubble bigger than the others, and I put that measure in the size option for the ArcGIS map. The colors were a conditional statement I did not cover, but I used the values to make the trailing dots yellow and the most current dot red. I will try to expand on this the next time I'm putting together a video on the ISS. I'm not done yet - I've got some fun stuff planned for this!"
@NicolasPappasA
@NicolasPappasA 2 months ago
@@Tales-from-the-Field Amazing, looking forward to it! I created a measure too, with a max latitude, but couldn't get it the way you did. Sounds like I am missing the conditional statement - going to try that. Thanks again for replying to this.
@pawewrona9749
@pawewrona9749 3 months ago
Great demo, and thanks for adding Power BI into the picture. The KQL Database is not demoed very often so far; great to see this example as well!
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Thank you @pawewrona9749! We are BIG fans of the KQL DB over here. Agreed, it doesn't get nearly as much love as it should!
@DatahaiBI
@DatahaiBI 3 months ago
I'm glad my blog about temp tables in Synapse Serverless helped out with a solution!
@acxhgg
@acxhgg 3 months ago
Hello, thank you for this demonstration. Would it be possible to pass an 'Array' type as a base parameter to our notebook?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @acxhgg, per Bradley: "I hate to start off by saying it depends, but kinda. You only have four data types for parameters that can be passed from pipelines to notebooks (currently): Int (integer), Bool (boolean), Float, or String. The parameter's data type would need to be string. However, you could pass some type of small JSON array as text and then have the notebook parse the string to make it an array, but it wouldn't be native. I hope this helps!"
@acxhgg
@acxhgg 3 months ago
@@Tales-from-the-Field Thank you for your answer! I tried implementing logic that converts the data from an array to a single string variable. My current use case is to retrieve the latest JSON files from a specific folder of a lakehouse and to call a notebook with those latest files as a parameter, in order to perform certain actions on my JSON files. Hopefully it'll be native soon!
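The JSON-array-as-string workaround from this thread can be sketched in a few lines. The file names below are hypothetical; the point is only the serialize/parse round trip between the pipeline's string base parameter and the notebook.

```python
import json

# Pipeline side: since base parameters only support int, bool, float, and
# string, serialize the array to a JSON string before passing it.
latest_files = ["sales_2024_05.json", "sales_2024_06.json"]  # hypothetical names
param_value = json.dumps(latest_files)

# Notebook side: the parameter arrives as a plain string; parse it back
# into a native list before looping over the files.
files = json.loads(param_value)
for path in files:
    print("processing", path)  # stand-in for the real per-file logic

assert files == latest_files  # the round trip is lossless
```

Keeping the payload small matters here, since the value travels as an ordinary string parameter.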
@dagg497
@dagg497 3 months ago
ADLS Gen2 not being available in Fabric's Warehouse view, and only in pipelines, made me think this wasn't an option. As with all Azure products, I don't think their file menu system works well across the views: Lake / Warehouse / workbooks / file storage each have entirely different GUIs, which is confusing and apparently hides viable options, tricking you into thinking there are limitations. Not only that, but the workbooks and the separate GUI for Databricks have a horrible file explorer and DB explorer that expand to the right instead of downward (good luck knowing where you are three levels down when the screen is full of big expanded menus), whereas downward expansion, as in Visual Studio or Windows itself, would be more suitable.
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @dagg497, per Bradley: "Hi dagg497, I'm not sure what this comment is in regard to in the video. You wouldn't really use Dataflow Gen2 to load ADLS Gen2, as it is a transform mechanism to land data into OneLake, a Data Warehouse, or a Data Lake. Pipelines do have ADLS Gen2 as a destination, and they are used more for data movement."
@andreablenxcacopardo5359
@andreablenxcacopardo5359 3 months ago
Nice video, but I think it is not possible to proceed if TDE is active, because of the following message: "The backup operation for a database with service-managed transparent data encryption is not supported on SQL Database Managed Instance." TDE should be disabled.
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @andreablenxcacopardo5359, per Bradley: "Do you have customer-managed keys set up? By default, all databases in an Azure SQL MI instance use a Microsoft-managed key for TDE encryption, and copy-only backups cannot be taken using a Microsoft-managed key. So you must do one of two things: 1. Create an Azure Key Vault, create a key, and then associate that key with the Managed Instance; or 2. (WARNING: NOT A BEST PRACTICE!) Decrypt the database and drop the TDE key. If you have done those things and you are still getting an error, review the following documentation to make sure the Azure Key Vault can communicate with the instance. I hope this helps!" learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql
@TEMIDAYOOMONIYI-gl7dq
@TEMIDAYOOMONIYI-gl7dq 3 months ago
Thanks for sharing.
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Thank you for watching @Temidayoomoniyi!
@crax2000
@crax2000 3 months ago
Awesome demo! Thanks for sharing!
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Thank you @crax2000! Really appreciate you watching!
@bornluthor
@bornluthor 3 months ago
Great video. Question: It appears the alert will fire when any job succeeds (or fails). Is it possible to only fire when a specific job fails? Thanks
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
@bornluthor, this is DBABulldog (Daniel). You have the ability to set up three different signals. I only set one up for successful completion of the job, but there are three different signals (alerts) you can choose from the dropdown: you can fire an alert on success, on failure, and on timeout.
@gopalannasundaram9989
@gopalannasundaram9989 3 months ago
Great demo! Can we anonymize certain PII fields or select subset of columns in a table while setting up mirroring?
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Hi @gopalannasundaram9989, per Bradley: "Great question. This functionality is on the roadmap, but it is not currently in Azure SQL DB Mirroring in Microsoft Fabric. Right now there are some limitations with regard to data masking or encrypting the fields and bringing them over into Microsoft Fabric. That functionality is coming, but I'm not sure how soon it will be here."
@vishwassee
@vishwassee 3 months ago
Hi, is there a way to connect to an on-prem SQL Server using a Fabric notebook?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @vishwassee, per Bradley: "This is a really interesting question. I don't want to say no, but it may not be possible. The only way I could see this working would be if you have a site-to-site VPN connecting your on-premises network to Azure, a network gateway set up to that main network, and an F64 provisioned with a managed private endpoint for your workspace attached to that network. Provided you have end-to-end connectivity and can hit the source using JDBC or pyodbc (probably utilizing SQL authentication), maybe this would work. Keep in mind I'm not a networking expert, and I do not have the means to set up this environment and proof it out... but MAN do I WISH I could, because if it did work it would be really cool!"
@LearnMicrosoftFabric
@LearnMicrosoftFabric 3 months ago
Great demo Bradley!
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Thank you Will!! Always appreciate your support my friend!
@hendrawan11
@hendrawan11 3 months ago
Can we use this scenario when our largest data is 10 TB?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
@hendrawan11, this is DBABulldog (Daniel). In this particular scenario I was using Azure SQL DB. I am assuming that you are referring to Azure SQL Hyperscale when you mention 10 TB. This is supported; however, you may want to review the information in the provided link: learn.microsoft.com/en-us/azure/azure-sql/database/database-copy?view=azuresql&tabs=azure-powershell#database-copy-for-azure-sql-hyperscale
@GuillaumeBerthier
@GuillaumeBerthier 3 months ago
That's great stuff indeed! Now we would love to see a "Security / Access Control Management" layer on top of workspace folders eventually 😉
@olbu
@olbu 3 months ago
Q: Will apps benefit from folders too?
@user-gf4oy6qo4u
@user-gf4oy6qo4u 3 months ago
Hi, do you have any videos coming about using Azure HCI (private cloud) with the Azure public cloud as a single pane of glass? Extending SaaS offerings on-prem?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @user-gf4oy6qo4u, sorry, we do not have any videos planned on HCI devices at this time. I checked with the team, and we do work with a variety of services, but not that one.
@babakbarzegari9228
@babakbarzegari9228 4 months ago
Hi, thanks for a great video! I have a question about the linked service for Fabric that starts at minute 3:28 - how did you create it? What was that braball-msftfabric?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @babakbarzegari9228, per Bradley: "Sorry it took so long to get back to you, @babakbarzegari9228! That was the service principal I had created. You have to create an Entra ID service principal in order to add it to the Microsoft Fabric workspace, and then you can use it to create the linked service from Azure Data Factory to the Microsoft Fabric workspace. I hope this helps. Here's a tutorial on creating a service principal: jiasli.github.io/azure-notes/aad/Service-Principal-portal.html"
@adnanabdulmalik
@adnanabdulmalik 4 months ago
How do I handle errors from Invoke Pipeline? I want to get the error message from the Invoke Pipeline activity.
@rossgh76
@rossgh76 4 months ago
Thank you so much for this amazing content
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Thank you for watching @rossgh76!
@DanielSanchez-ik7gz
@DanielSanchez-ik7gz 4 months ago
Awesome video. Thanks for sharing!
@DBABulldog
@DBABulldog 4 months ago
Thank you, my friend. I also recently presented an additional option using Elastic Jobs. I would check that one out as well so you have options.
@jerrysmith9324
@jerrysmith9324 4 months ago
Danny! Love the beard bro! Always good to see you and 'balls' doing your thing.