FAANG Coding Interviews / Data Structures and Algorithms / Leetcode
Comments: 45
@GregHogg · a month ago
Master Data Structures & Algorithms For FREE at AlgoMap.io!
@markusklyver6277 · a month ago
This is NOT the definition of O(1). O(1) is the set of all bounded non-negative functions from N to R.
@Stepbrohelp · a month ago
As someone with a mathematical background but who just got into coding, THIS is the explanation I was looking for. I always had a hunch that this must be true, but everywhere I looked, people seemed to just say “constant time is fastest” as if it’s ALWAYS true. It has been driving me MAD to get to the bottom of this. Thank you so much for the confirmation!
@joergsonnenberger6836 · a month ago
A good example of why constants matter: the best-scaling primality test known is ECPP. Its complexity is O((log n)^5). The problem: the constants are so bad that exponential algorithms are faster for most interesting numbers.
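The point above can be sketched with a toy cost model. The constants here are made up purely for illustration (they are not ECPP's real constants): an algorithm with better asymptotic scaling but a huge constant factor loses to a worse-scaling algorithm for all inputs below a very large crossover point.

```python
import math

def cost_a(n):
    # "Good" scaling, O(log n), but with a huge constant factor
    # (the constant 1_000_000 is an illustrative assumption).
    return 1_000_000 * math.log2(n)

def cost_b(n):
    # "Bad" scaling, O(n), but with a tiny constant factor.
    return 2 * n

# For n up to about 10 million, the asymptotically "worse"
# algorithm is actually cheaper.
print(cost_a(10**6) > cost_b(10**6))  # True
```

The crossover only arrives once n is large enough for the growth rate to overwhelm the constant, which for some real algorithms is beyond any input anyone cares about.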
@kostya48i57 · a month ago
Big-O notation is not about "coding"; it is asymptotic notation, commonly used in calculus (for example, in Taylor series). It is strange that you have a "math background" but have never heard of these notations.
@polycrystallinecandy · a month ago
The asymptotic growth notations come from mathematics
@Mystic998 · a month ago
Mathematicians don't use asymptotic notation often outside of computer science topics. I've only ever seen it occasionally referenced when dealing with series, and usually not in any rigorous, well-explained way.
@joergsonnenberger6836 · a month ago
@@Mystic998 They are used in numerical analysis, but yeah, it's quite easy to study a lot of math without ever being exposed to asymptotic notation.
@nolanrata7537 · a month ago
O(1) more accurately means that the time will remain below a constant value even for very large (up to infinity) values of n. The time is still allowed to vary so long as it remains below that upper bound.
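A minimal sketch of that point: the function below is not "flat" (its step count grows with n at first), but the step count is bounded above by a constant, which is exactly what O(1) requires. The cap of 100 is an arbitrary illustrative bound.

```python
def bounded_work(n):
    # Step count grows with n for small n, then caps at 100.
    # Not flat, but bounded above by a constant, so it is O(1):
    # "bounded", not "always the same".
    steps = 0
    for _ in range(min(n, 100)):
        steps += 1
    return steps
```

So `bounded_work(5)` does 5 steps and `bounded_work(10**9)` does 100, yet both are within the same constant bound.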
@GregHogg · a month ago
Yeah, true, it could be under the line as well.
@ashwin_mahajan · a month ago
I study time and space complexity every time, and every time I forget it
@abebuckingham8198 · a month ago
I only write O(∞) algorithms.
@asango_aham · a month ago
Chad
@ponmuthu..4796 · a month ago
That breaks the definition of an algorithm.
@markusklyver6277 · a month ago
Every algorithm is O(∞).
@abebuckingham8198 · a month ago
@@markusklyver6277 That's the joke. 😆
@RS-fz4ps · a month ago
The point of big-O runtime analysis is scaling. The goal should be to keep this in mind when writing a solution that could be called, say, millions of times consecutively. Your answer may take constant runtime, but if that constant is greater than the runtime of a comparable O(N) solution, you might be missing out. This is why most practical sorting algorithms used in the real world are hybrids.
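A rough sketch of that hybrid idea (not any library's actual implementation): insertion sort is O(n^2) but has a tiny constant factor, so it wins on small inputs, while merge sort's O(n log n) wins once the input is large. The cutoff of 32 is an assumed threshold; real libraries tune it by benchmarking.

```python
def insertion_sort(a):
    # O(n^2) worst case, but very low overhead on tiny inputs.
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

CUTOFF = 32  # assumed threshold, tuned empirically in practice

def hybrid_sort(a):
    # Small inputs: insertion sort's small constant wins.
    if len(a) <= CUTOFF:
        return insertion_sort(a)
    # Large inputs: fall back to O(n log n) merge sort.
    mid = len(a) // 2
    left, right = hybrid_sort(a[:mid]), hybrid_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

CPython's built-in sort (Timsort) follows the same spirit: it runs insertion sort on small runs inside a merge-sort framework.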
@dumnor · a month ago
There are also theory-vs-reality considerations. Computer architecture and physical limitations can make runtime coefficients vary: memory limits, graphics cards doing certain operations faster, cache access, and registers within the CPU. For example, linked lists vs arrays: in theory both iterate in O(n), but modern computers "cheat" by fetching whole cache lines rather than a single byte at a time, which makes arrays faster to iterate through in practice.
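Python hides memory layout, so the cache-line effect itself needs a lower-level language to observe directly, but a hand-rolled linked list still sketches the pointer-chasing half of the story: iterating it must follow a reference per element, while a list walks a contiguous sequence.

```python
class Node:
    # Minimal singly linked list node.
    __slots__ = ("val", "next")
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def build_linked(n):
    # Build a list holding 0 .. n-1, head first.
    head = None
    for v in range(n - 1, -1, -1):
        head = Node(v, head)
    return head

def sum_linked(head):
    # Traversal chases one pointer per element.
    total = 0
    while head is not None:
        total += head.val
        head = head.next
    return total

n = 10_000
arr = list(range(n))
linked = build_linked(n)
assert sum(arr) == sum_linked(linked)  # same result either way
```

Timing the two traversals with `timeit` on a large n typically shows the contiguous version ahead, even before cache effects are considered.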
@ashiqurrahman-nu7ik · a month ago
Well explained! Good content!
@alchenerd · a month ago
What would an O(1/N) task look like? Perhaps filling an N-element array up to a constant length using copies of existing elements in the same array?
@markusklyver6277 · a month ago
This is NOT the definition of O(1). O(1) is the set of all bounded non-negative functions from N to R.
@Nikarus2370 · a month ago
IDK what you're on about; this is a video talking about "Big O" in programming, and the OP's description of O(1) is accurate.
@dylanisaac1017 · a month ago
@Nikarus2370 no???
@davidespinosa1910 · a month ago
@@Nikarus2370 Markus is correct. O(1) means bounded. The function doesn't have to be flat.
@speakoutloud7293 · a month ago
Brilliant! Keep going.
@balu5387 · a month ago
Please make full videos on time and space complexity.
@GregHogg · a month ago
Working on it
@davidespinosa1910 · a month ago
O(1) doesn't mean flat. It means bounded. Also, for all the big-O notations, it's irrelevant what the program does for small n. So if n is bounded (less than 1 gazillion, for example), then all programs are O(1).
@dhirajsharma5191 · a month ago
Can you also explain O(log n) and O(n log n)?
@GregHogg · a month ago
I explained log n about 10 shorts ago; you can check it out.
@DreadTeamLeader · a month ago
I had a feeling someone would have to talk about speed regarding this.
@Jay_Blvck · a month ago
Nice explanation 🔥
@GregHogg · a month ago
Thank you!
@dotnet8925 · a month ago
Wow
@Sirbikingviking · a month ago
So what if you implement a function that uses linear time under a certain input size and switches to hashing over a certain input size?
@Nikarus2370 · a month ago
Well, you'd probably have two (or more) functions dedicated to doing the job, each tuned for a different range of N, and then one function that checks the input and picks which of them should be used.
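That dispatcher idea can be sketched as a membership check that scans linearly for small inputs and hashes for large ones. The threshold of 64 is an assumed cutoff that would be tuned by benchmarking, and in real code you'd build the set once and reuse it across queries rather than per call.

```python
THRESHOLD = 64  # assumed cutoff; tune by benchmarking

def contains_small(items, x):
    # Linear scan: O(n), but tiny constant and no setup cost.
    for it in items:
        if it == x:
            return True
    return False

def contains_large(items, x):
    # Hash lookup: O(1) average per query after an O(n) set build.
    return x in set(items)

def contains(items, x):
    # Dispatcher: checks the input size and picks the routine.
    if len(items) <= THRESHOLD:
        return contains_small(items, x)
    return contains_large(items, x)
```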
@808brotherstv4 · a month ago
Wow i never thought of it in this perspective
@KaramAlayan · a month ago
That's the importance of estimating the scale of your project. For example, your website probably won't reach 10 million users, so you should only take into account something like 100k to be very optimistic here. But when a company like Google creates a website, it should be scalable to far higher numbers, probably for 75% of the people on earth.
@MK-13337 · a month ago
Every program is O(1) in the real world, where inputs don't get arbitrarily large. But I guess big-O is somewhat useful.
@MK-13337 · a month ago
Every program that terminates
@guythat779 · a month ago
I guess an example would be checking an input value against every word in a .txt file.
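That example can be sketched as follows, using an in-memory `StringIO` as a stand-in for a real file on disk: every query rescans the whole file, so the check is O(n) in the number of words.

```python
import io

# Stand-in for a real .txt file on disk (hypothetical contents).
fake_txt = io.StringIO("apple\nbanana\ncherry\n")

def word_in_file(f, target):
    # O(n) per query: rewinds and compares against every word.
    f.seek(0)
    return any(line.strip() == target for line in f)

print(word_in_file(fake_txt, "banana"))  # True
```

Loading the words into a set once would turn each subsequent lookup into an O(1)-average operation, which is exactly the trade-off the thread above discusses.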