Data Modeling Tutorial: Star Schema (aka Kimball Approach)

90,815 views

Kahan Data Solutions

1 day ago

It's hard to last as a data engineer without understanding basic data modeling.
In this video we'll cover the basics of one of the most common approaches: Star Schema data modeling.
...Aka Kimball modeling.
...Aka Dimensional modeling.
We'll discuss the high-level concepts, I'll show how to build one from scratch, and we'll end with a review of the benefits & future topics to explore.
Thank you for watching!
►► The Starter Guide for Modern Data → bit.ly/starter-mds
Simplify “modern” architectures + better understand common tools & components
Timestamps:
00:00 - Intro
00:31 - High-level Overview
01:35 - Intro to Fact Tables
03:56 - Create a Fact Table
07:25 - Intro to Dimension Tables
08:43 - Create Dimension Tables
11:53 - Join to Create Marts
14:55 - Benefits & Future Topics
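The flow in the timestamps above (build a fact table, build dimension tables, join them into marts) can be sketched end to end. This is an illustrative example with hypothetical table and column names, not the exact ones from the video, using SQLite via Python as a stand-in for a cloud warehouse:

```python
import sqlite3

# One fact table keyed to two dimension tables -- the star schema shape.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customers (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE dim_products  (product_id  INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fct_orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customers(customer_id),
        product_id  INTEGER REFERENCES dim_products(product_id),
        amount      REAL
    );
    INSERT INTO dim_customers VALUES (1, 'Acme', 'EMEA'), (2, 'Globex', 'NA');
    INSERT INTO dim_products  VALUES (10, 'Widget', 'Hardware');
    INSERT INTO fct_orders    VALUES (100, 1, 10, 25.0), (101, 2, 10, 40.0);
""")

# A "mart" is just the fact table joined out to its dimensions.
mart = con.execute("""
    SELECT c.region, p.category, SUM(f.amount) AS revenue
    FROM fct_orders f
    JOIN dim_customers c USING (customer_id)
    JOIN dim_products  p USING (product_id)
    GROUP BY c.region, p.category
    ORDER BY c.region
""").fetchall()
print(mart)  # [('EMEA', 'Hardware', 25.0), ('NA', 'Hardware', 40.0)]
```

The fact table holds the measurable events (orders and amounts); the dimensions hold the descriptive context (who, what) that the mart query groups by.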
Title & Tags:
Data Modeling Tutorial: Star Schema (aka Kimball Approach)
#kahandatasolutions #dataengineering #datamodeling

Comments: 63
@KahanDataSolutions
@KahanDataSolutions 1 year ago
►► The Starter Guide for Modern Data → bit.ly/starter-mds Simplify “modern” architectures + better understand common tools & components
@vrta
@vrta 1 year ago
I've read the whole "The data warehouse toolkit" book by Kimball twice and this video explains the most important part of the 500 page book as clear as possible. Well done!
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Love to hear that! Thanks for your feedback
@jhonsen9842
@jhonsen9842 17 days ago
So if someone watched the video it's not necessary to read that 500-page book? That's quite an optimistic statement.
@joshi1q2w3e
@joshi1q2w3e 1 year ago
Please make a video like this on Conformed Dimensions! Also, how do you handle the lack of primary key and foreign key constraints in Snowflake?
@AkhmadMizkat
@AkhmadMizkat 2 months ago
This is a very clear explanation with a precise example of the star schema. Thanks a lot for the good video.
@mojtabapeyrovi9841
@mojtabapeyrovi9841 1 year ago
Thank you for the great and to-the-point presentation. It's especially great that you showed simple live code and executed it, since most tutorials just repeat the books, present the theory, and don't show how to code it. It would be great, though, to cover SCD implementation in the dimension tables, and what happens to the natural keys when we have to use surrogate keys for the dimension and fact tables. In almost all real-world DWH examples there is a need to keep the history of dimensions inside the dim tables, where the OLTP primary keys will not be applicable.
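A minimal sketch of the SCD Type 2 pattern this comment asks about, with a surrogate key alongside the natural key and valid_from/valid_to columns. Table and column names are hypothetical, and SQLite via Python stands in for the warehouse:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customers (
        customer_sk  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id  INTEGER,                            -- natural (OLTP) key
        address      TEXT,
        valid_from   TEXT,
        valid_to     TEXT                                -- NULL = current row
    );
    INSERT INTO dim_customers (customer_id, address, valid_from)
    VALUES (1, 'Old Street 1', '2023-01-01');
""")

# Customer 1 moves: close out the current row, then insert a new version.
# Both versions share the natural key but get distinct surrogate keys.
con.execute("""UPDATE dim_customers
               SET valid_to = '2024-01-01'
               WHERE customer_id = 1 AND valid_to IS NULL""")
con.execute("""INSERT INTO dim_customers (customer_id, address, valid_from)
               VALUES (1, 'New Street 9', '2024-01-01')""")

rows = con.execute("""SELECT customer_sk, customer_id, address, valid_to
                      FROM dim_customers ORDER BY customer_sk""").fetchall()
print(rows)
# [(1, 1, 'Old Street 1', '2024-01-01'), (2, 1, 'New Street 9', None)]
```

Facts loaded during each period reference the surrogate key that was current at the time, which is what preserves history.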
@lucashoww
@lucashoww 1 year ago
Such a cool data concept man! Thanks for introducing me to it. Cheers!
@saheenahzan7825
@saheenahzan7825 1 year ago
Wow... very beautifully explained. Loved it ❤ It makes me want to explore more of your content. Thank you.
@jpzhang8290
@jpzhang8290 1 year ago
Good in-depth data engineering video for professionals!
@apibarra
@apibarra 1 year ago
I would agree with joshi. It would be cool for a future video to see how to take data from the conformed layer and then create a star schema using dbt.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Noted! Thanks for the feedback
@alecryan8220
@alecryan8220 4 months ago
What's a conformed layer?
@reviewyup592
@reviewyup592 2 months ago
Very succinct and practical.
@nlopedebarrios
@nlopedebarrios 9 months ago
Hi, love your channel, I'm learning a lot. What are your thoughts on star schema vs One Big Table (OBT)? Would you make a video comparing the pros and cons of each?
@varuntirupati2566
@varuntirupati2566 7 months ago
Thanks for this video. I have a few questions: 1) In a star schema we don't do any joins between the dimension tables, right? So why did you create a table by joining all the dimension tables to flatten them into a single table? 2) Since we are creating the mart tables by joining other tables, how do these tables get refreshed, given that they are not views or materialized views?
@mehdinazari8896
@mehdinazari8896 1 year ago
Great tutorial, Thanks for putting this together.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
You're very welcome! Hope it was helpful.
@marcosoliveira8731
@marcosoliveira8731 6 months ago
Such a good explanation!
@JA-bp3yt
@JA-bp3yt 4 months ago
Excellent
@lingerhu4339
@lingerhu4339 1 year ago
Really great video!!!! Thank you!!! Hope to see new video concerning SCD, indexing!!!!
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Thanks for watching!
@sadfasde3108
@sadfasde3108 1 year ago
This is fantastic - keep it up.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Thanks, will do!
@user-cq7lc9ww2w
@user-cq7lc9ww2w 4 months ago
Very crisp!!
@sievpaobun
@sievpaobun 1 year ago
Brilliant video. So helpful for a basic understanding of star schema design. Keep up the great work, bro.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Glad it helped!
@sievpaobun
@sievpaobun 11 months ago
@@KahanDataSolutions Hi, it would be even more helpful if you could make a tutorial video on how to add time and date dimensions in a star or snowflake schema.
@shermin2o9
@shermin2o9 1 year ago
Hi! I am enjoying your content as a new subscriber. I am a business analyst trying to become a data engineer, and would love to begin my own projects building warehouses and pipelines, etc. I have been researching about using Snowflake as an individual and how much it would realistically cost me to use it for a project (to showcase on my resume), and cannot get a clear answer. Hoping you can assist me with this. Thanks!
@happyheart9431
@happyheart9431 7 months ago
Million thanks
@ruslandr
@ruslandr 1 year ago
Thanks! Great and simple overview for someone who is pretty new to this. What I would recommend is to explain pitfalls like: how to load dimensions so their values are unique, why you have to care about having unique values in dimensions, and what will happen if you left join to a dimension that has duplicate records.
@ruslandr
@ruslandr 1 year ago
This can help beginners avoid mistakes at early stages.
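The duplicate-dimension pitfall mentioned above (fan-out) can be demonstrated in a few lines; this is an illustrative sketch with hypothetical tables, using SQLite via Python:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE fct_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE dim_customers (customer_id INTEGER, name TEXT);
    INSERT INTO fct_orders VALUES (100, 1, 50.0);
    -- Customer 1 was accidentally loaded twice into the dimension:
    INSERT INTO dim_customers VALUES (1, 'Acme'), (1, 'Acme');
""")

total = con.execute("""
    SELECT SUM(f.amount)
    FROM fct_orders f
    LEFT JOIN dim_customers c USING (customer_id)
""").fetchone()[0]

# The single $50 order is double-counted because it matched two dim rows.
print(total)  # 100.0
```

This is why dimension loads should enforce (or at least test for) one row per natural key before any mart is built on top.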
@thesaintp100
@thesaintp100 9 months ago
@Kahan What tool are you using to show JSON along with other sources while querying? Thanks much.
@kalyanben10
@kalyanben10 1 year ago
Not related to the topic.. But, @Kahan, what do you think about Meltano? Will you add it to your Modern Data Stack?
@karangupta_DE
@karangupta_DE 8 months ago
Hi, could you kindly make a video on how to load the fact table incrementally? What if we have an OLTP system and the dimension tables get big really quickly?
@AdamSmith-lg2vn
@AdamSmith-lg2vn 1 year ago
Typo on chapter header #7. Great video though, on a hard-to-teach and under-covered topic. I strongly agree about the usability case for star schemas even in a modern stack. It creates a highly ordered, easy-to-reason-about junction point between the chaos of sources, ingestion, data lakes, etc. and the complexity of ~infinite data consumer use cases. The payoff in downstream development efficiency is huge.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Ah, dang. Good catch on the typo. Unfortunately I can't edit that part after posting. Really appreciate your feedback on the video too. As you can probably tell, I'm on the same page as you and think it's still a great strategy.
@freshjulian1
@freshjulian1 5 months ago
Hey, thanks for the video! It gives a good overview of how to model a star schema. But how could new staging data be ingested into the tables of a star schema? For example, an easy but inefficient approach would be to re-create the tables on a daily basis. To be more optimal, you would need a process to ingest new data into the tables. Do you have an idea how that could be done in modern warehouses like Snowflake, or some resources on that? I think it would be helpful to add some technical columns to the raw data layer, like a load timestamp, to track the records that need to be ingested. Furthermore, valid_to and valid_from timestamps could be added to dimension tables where changes can occur (e.g. a changed customer address).
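One common way to do the incremental ingestion described above is a high-watermark on the load timestamp: each run loads only raw rows newer than the latest timestamp already in the fact table. A minimal sketch with hypothetical tables (SQLite via Python; the watermark is hard-coded here, where a real pipeline would persist it between runs):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL, _loaded_at TEXT);
    CREATE TABLE fct_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 10.0, '2024-01-01'),
        (2, 20.0, '2024-01-02'),
        (3, 30.0, '2024-01-03');
    -- Order 1 was already loaded by an earlier run.
    INSERT INTO fct_orders VALUES (1, 10.0);
""")

# Watermark = the latest _loaded_at already reflected in the fact table.
watermark = '2024-01-01'
con.execute("""
    INSERT INTO fct_orders
    SELECT order_id, amount FROM raw_orders WHERE _loaded_at > ?
""", (watermark,))

count = con.execute("SELECT COUNT(*) FROM fct_orders").fetchone()[0]
print(count)  # 3
```

Tools like dbt formalize this pattern as incremental models, but the underlying idea is exactly this filter on a load-time column.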
@wingnut29
@wingnut29 1 year ago
Great video. One recommendation would be to change the font color of the text that is commented out, it is nearly impossible to read.
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Ah good catch. Sorry about that but will take note for next time!
@Bulldoges90
@Bulldoges90 10 months ago
Question: would you even need this in a DWH such as GCP BigQuery, since it already uses the Dremel architecture?
@jeffrey6124
@jeffrey6124 1 year ago
Great videos as always! Is there a way, though, to edit or re-create some of your videos, including this one, to zoom in/out on parts of the screen, especially when highlighting stuff 🙂
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Thanks Jeff! Regarding the editing - unfortunately there's not much that can be done after it's initially uploaded (other than minor trimming).
@ostrich97
@ostrich97 7 months ago
You are great
@MohamedMontaser91
@MohamedMontaser91 1 month ago
Great video! I'm doing almost the same thing, but I'm using views for marts instead of tables because views update automatically. I would rather use tables, but creating a table from a SELECT statement doesn't update automatically. Is there a way to do it without using triggers?
@SoumitraMehrotra-ri4zb
@SoumitraMehrotra-ri4zb 9 months ago
Thanks for the video. Quick question: this might be subjective and vary from business problem to business problem, but is it good practice to create fact tables before dimensions? I mean, isn't it necessary to understand what dimensions exist and what their keys are before creating a fact table?
@vinaychary7815
@vinaychary7815 3 months ago
First create the dimension tables, then go for the fact table.
@sakeeta6498
@sakeeta6498 1 month ago
Was planning to comment the same: you should create the dimension tables first, since the fact table points to them using foreign keys.
@GT-bf7io
@GT-bf7io 3 months ago
Hi, I loved your video. Is there any way to get a sample database with raw tables, just to try and practice putting it together on our own?
@vinaychary7815
@vinaychary7815 3 months ago
Take one dataset and create your own dimension and fact tables.
@prafulbs7216
@prafulbs7216 1 year ago
How did you insert JSON data into a Snowflake table? Is it a direct load from local files using a file format? I am using Apache NiFi to insert JSON data into a table from Azure Blob. (I was unsuccessful inserting the JSON object, so I converted the JSON data to a string and then parsed the JSON into another column in a dbt SQL script.) Is there a way to insert JSON object data directly into a Snowflake table column with data type VARIANT?
@mitchconnor7066
@mitchconnor7066 1 year ago
Great tutorial. I have trouble understanding the difference between a data warehouse and a database. The thing you made in this video is a DWH, but what would be the DB in this example?
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Great question! This is something that took me a while to understand as well. The way I think about it is that a data warehouse is just a way to describe HOW you are using a database (or multiple databases). For example, in this scenario, the two databases I'm using are "RAW" and "ANALYTICS_100". And I'm creating new tables within the "ANALYTICS_100" database in a way that resembles what we would call a data warehouse design. But if you strip away the term data warehouse, it's still just a database with tables (or views). Using a star schema (facts & dimensions) is just one way to create your tables in an intentional way that's capable of operating in a way that we all agree to call a "data warehouse". I also just thought of this example - it's almost like saying you can build a physical "house" or an "office building" or an "apartment". They have different purposes and terminology but underneath it all they have the same components (windows, floors, roof, etc.). Just designed in different ways. Maybe not the best example but hopefully that helps!
@mitchconnor7066
@mitchconnor7066 1 year ago
@@KahanDataSolutions After numerous confusing articles on the topic of "DB vs DWH", I think I finally got it. A DWH is just a bunch of tables within a DB that are designed in a specific way that supports analytics. Your example in this video and your comment made it crystal clear. Thank you!
@KahanDataSolutions
@KahanDataSolutions 1 year ago
@@mitchconnor7066 You got it!
@user-mz8zv5uk5r
@user-mz8zv5uk5r 1 year ago
@Kahan Data Solutions, I wish you had covered the concept of surrogate keys with SCD Type 2. In this video you conveniently skipped that and made it look like a simple task by joining multiple entities, which Ralph Kimball strongly advocates avoiding. I'd really like to see your approach to some of the most difficult questions, such as many-to-many relationships in the real world.
@MManv84
@MManv84 6 months ago
Agreed. I was shocked to see he was apparently (?) using natural keys from the raw source as the dimensional keys, instead of surrogates. This is a pretty basic no-no, and this model will break once it encounters common complications like "late arriving" dimensional data, the need to combine data from multiple sources (either parallel systems, or migrations across source systems over time), as well as the SCD issue you bring up. As I type I see he does mention surrogate keys at the very end, but only as an *alternative* to the natural keys of the source system, not as standard practice. So I guess he would advocate using natural keys as dimension/fact foreign keys in some situations, then switch to surrogate keys only when there are dimensions without natural keys he likes, or (as you point out) as soon as he needs to move beyond a Type 1 SCD for something like a customer or employee? Yuck. Just use surrogate keys consistently everywhere, as Kimball strongly advocates.
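The consistent surrogate-key practice advocated here typically means resolving each natural key to its surrogate key at fact-load time, via a lookup against the dimension. A minimal sketch with hypothetical tables, again using SQLite via Python:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customers (customer_sk INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE staging_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE fct_orders (order_id INTEGER, customer_sk INTEGER, amount REAL);
    INSERT INTO dim_customers VALUES (1001, 1), (1002, 2);
    INSERT INTO staging_orders VALUES (100, 1, 25.0), (101, 2, 40.0);
""")

# The fact load swaps each natural key for the dimension's surrogate key,
# so the fact table never carries source-system keys directly.
con.execute("""
    INSERT INTO fct_orders
    SELECT s.order_id, d.customer_sk, s.amount
    FROM staging_orders s
    JOIN dim_customers d USING (customer_id)
""")
rows = con.execute("SELECT * FROM fct_orders ORDER BY order_id").fetchall()
print(rows)  # [(100, 1001, 25.0), (101, 1002, 40.0)]
```

This decoupling is what lets the warehouse absorb key changes, multiple source systems, and SCD Type 2 versioning without rewriting the facts.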
@olayemiolaigbe8000
@olayemiolaigbe8000 1 year ago
Great content, bro! However, you haven't done full justice to the subject: a data model is a representation of business objects, their corresponding attributes, the relationships among them, and other business semantics, usually represented through an entity-relationship diagram (ERD) and, sometimes, class diagrams. What you've done here isn't data modeling per se; it is, however, an explanation of how a data model is used/leveraged. Great resource, nonetheless.
@carltonseymour869
@carltonseymour869 10 months ago
ERDs are used as a modeling tool in both OLTP and OLAP systems, but their specific application and usage differ between the two. In OLTP, ERDs are used for database design and representation of operational data structures, while in OLAP, ERDs provide a conceptual foundation for building the data model used for analytical processing and data warehousing.
@punkdigerati
@punkdigerati 1 year ago
Crap, I thought it was how to date a model...
@KahanDataSolutions
@KahanDataSolutions 1 year ago
Classic
@dertrickwinn7982
@dertrickwinn7982 1 month ago
Too much bass in your voice in the recordings.
@ktg8742
@ktg8742 4 days ago
Bro what does this have to do with the information 😂