Pure Information Gives Off Heat

449,151 views

Up and Atom

1 year ago

Sign up to Brilliant to receive a 20% discount with this link! brilliant.org/upandatom/
Hi! I'm Jade. If you'd like to consider supporting Up and Atom, head over to my Patreon page :)
/ upandatom
Visit the Up and Atom store
store.nebula.app/collections/...
Subscribe to Up and Atom for physics, math and computer science videos
/ upandatom
Why Time Actually Flows Both Ways
• The Time-Reversibility...
Follow me @upndatom
Up and Atom on Twitter: upndatom?lang=en
Up and Atom on Instagram: / upndatom
For a one time donation, head over to my PayPal :) www.paypal.me/upandatomshows
A big thank you to my AMAZING PATRONS!
Michael Seydel, Cy 'kkm' K'Nelson, Rick DeWitt, Thorsten Auth, Purple Penguin, AndrewA, Izzy Ca, bpatb, Michael Martin, Scott Ready, John H. Austin, Jr., Brian Wilkins, Thomas V Lohmeier, David Johnston, Thomas Krause, Yana Chernobilsky, Lynn Shackelford, Ave Eva Thornton, Andrew Pann, Anne Tan, James Mahoney, Jim Felich, Fabio Manzini, Jeremy, Sam Richardson, Robin High, KiYun Roe, Christopher Rhoades, DONALD McLeod, Ron Hochsprung, OnlineBookClub.org, Aria Bend, James Matheson, Robert A Sandberg, Kevin Anderson, Tim Ludwig, Alexander Del Toro Barba, Corey Girard, Justin Smith, Emily, A. Duncan, Mark Littlehale, Lucas Alexander, Jan Gallo, Tony T Flores, Jeffrey Smith, Alex Hackman, Joel Becane, Michael Hunter, Paul Barclay, 12tone, Zhong Cheng Wang, Sergey Ten, Damien Holloway, Mikely Whiplash, John Lakeman, Jana Christine Saout, Jeff Schwarz, George Fletcher, Louis Mashado, Michael Dean, Chris Amaris, Matt G, Broos Nemanic, Dag-Erling Smørgrav, John Shioli, Joe Court, Todd Loreman, Susan Jones, Richard Vallender, jacques magraith, William Toffey, Michel Speiser, Rigid Designator, James Horsley, Bryan Williams, Craig Tumblison, Rickey Estes, Cameron Tacklind, 之元 丁, Kevin Chi, Paul Blanchard, Lance Ahmu, Tim Cheseborough, Nico Papanicolaou, keine, Markus Lindström, Jeffrey Melvin, Midnight Skeptic, Kyle Higgins, aeidolos, Mike Jepson, Dexter Scott, Potch, Thomas P Taft, Indrajeet Sagar, Markus Herrmann (trekkie22), Gil Chesterton, Alipasha Sadri, Pablo de Caffe, Alexander230, Taylor Hornby, Eric Van Oeveren, Mark Fisher, Phizz, Rudy Nyhoff, Colin Byrne, Nick H, Jesper de Jong, Loren Hart, Ari Prasetyo, Sofia Fredriksson, Phat Hoang, Spuddy, Sascha Bohemia, tesseract, Stephen Britt, KG, Dagmawi Elehu, Hansjuerg Widmer, John Sigwald, Carlos Gonzalez, Jonathan Ansell, Thomas Kägi, James Palermo, Gary Leo Welz, Chris Teubert, Fran, Joe, Robert J Frey, The Doom Merchant, Wolfgang Ripken, Jeremy Bowkett, Vincent Karpinski, Nicolas Frias, Louis M, kadhonn, Moose Thompson, Andrew, Sam Ross, Garrett Chomka, Bobby Butler, Rebecca Lashua, Pat Gunn, Elze Kool, RobF, Vincent Seguin, Shawn, Israel Shirk, Jesse Clark, Steven Wheeler, Philip Freeman, KhAnubis, Jareth Arnold, Simon Barker, Dennis Haupt, Lou, amcnea, Simon Dargaville, and Magesh.
Creator - Jade Tan-Holmes
Script - Jack Johnson
Animations - Standard Productions
Music - Epidemic Sound

Comments: 1,700
@compuholic82 · 1 year ago
Fun fact: Reversible logic is really important in quantum computing. Since all state changes can be represented as unitary matrices, quantum gates are always reversible.
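The point above can be made concrete with a minimal pure-Python sketch (the function name `cnot` is illustrative, not from any library): a reversible gate is a bijection on basis states, which is exactly what a unitary permutation matrix implements.

```python
from itertools import product

def cnot(a, b):
    """Reversible CNOT on classical bits: (control, target) -> (control, control XOR target)."""
    return a, a ^ b

# Self-inverse: applying CNOT twice restores any input.
for a, b in product([0, 1], repeat=2):
    assert cnot(*cnot(a, b)) == (a, b)

# Bijective: 4 distinct inputs map to 4 distinct outputs, so nothing is
# erased -- unlike AND, the inputs can always be recovered from the outputs.
outputs = {cnot(a, b) for a, b in product([0, 1], repeat=2)}
assert len(outputs) == 4
```

Written as a 4x4 matrix on basis states, this map is a permutation matrix, and permutation matrices are unitary, which is the sense in which quantum gates are always reversible.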
@hyperduality2838 · 1 year ago
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionsim is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@laughingone3728 · 1 year ago
@@hyperduality2838 Nicely stated. Thanks for that.
@Snowflake_tv · 1 year ago
Really?
@hyperduality2838 · 1 year ago
@@laughingone3728 You're welcome, it gets better:- There is also a 5th law of thermodynamics, energy is duality, duality is energy! Energy is dual to mass -- Einstein. Dark energy is dual to dark matter. Action is dual to reaction -- Sir Isaac Newton (the duality of force). Attraction is dual to repulsion, push is dual to pull -- forces are dual. If forces are dual then energy must be dual. Energy = force * distance. Electro is dual to magnetic -- Maxwell's equations Positive is dual to negative -- electric charge. North poles are dual to south poles -- magnetic fields. Electro-magnetic energy is dual. "May the force (duality) be with you" -- Jedi teaching. "The force (duality) is strong in this one" -- Jedi teaching. There are new laws of physics! Your mind creates or synthesizes syntropy! Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic. Duality creates reality! Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Everything in physics is made from energy hence duality. Concepts are dual to percepts -- the mind duality of Immanuel Kant.
@hyperduality2838 · 1 year ago
@@Snowflake_tv There is also a 5th law of thermodynamics, see my next comment.
@bobdiclson4173 · 1 year ago
I love Jade's vibe of having a secret to share.
@vigilantcosmicpenguin8721 · 1 year ago
Yeah, that's a perfect description. The way she grins when she gets to the juicy part of the secret.
@Blue.star1 · 1 year ago
She looks like a meson
@PhysioAl1 · 1 year ago
You're right!
@sabouedcleek611 · 1 year ago
@@user-bt1hf9cr5n Looks a bit like Modified Newtonian Dynamics, though I don't think √(1+(GM/(RC²))²) satisfies the interpolation requirement in the overview of the wiki.
@communitycollegegenius9684 · 1 year ago
Too bad. No one would watch or take her seriously without her vibe. Science and humanity lose.
@patrickhanft · 1 year ago
I do have a degree in computer science, and I find you, again and again, to be one of my best CS teachers on topics that were never discussed, or were badly explained, during my studies!
@zen1647 · 1 year ago
Great video! You're awesome!
@hyperduality2838 · 1 year ago
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionsim is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@sherpajones · 1 year ago
16:23 What if the logic gate operated by interpreting the interference pattern of light? If you only had one light source, there would be no pattern. If you had two, there would be a pattern. The presence or absence of a pattern can be your 0 or 1. This should easily be reversible.
@moth5799 · 1 year ago
@@hyperduality2838 Christ mate you really think quoting Yoda makes you look smart lmfao? There are 4 laws of thermodynamics but that's only because we call one of them the Zeroth law. Your one is just made up.
@hyperduality2838 · 1 year ago
@@moth5799 Subgroups are dual to subfields -- the Galois correspondence. The Galois correspondence in group theory is based upon duality. There are new laws of physics -- Yoda is correct. Energy is dual to matter -- Einstein. Dark energy is dual to dark matter. Energy is duality, duality is energy -- the 5th law of thermodynamics! Potential energy is dual to kinetic energy -- gravitational energy is dual. Energy is measured in Joules (duals, jewels) in physics.
@fmeshna · 8 months ago
Jade, your ability to explain complex quantitative concepts so clearly is exceptional. We need more teachers like you.
@domotheus · 1 year ago
Good stuff! If you're interested there's an even weirder (and very theoretical) application of reversible computing called a "Szilard engine" where you can go back and forth between waste data and waste energy. Using the wasted bits of reversible computing you can theoretically extract energy out of a system that's at an equilibrium state, basically meaning you can convert energy into data and data into energy
@zdlax · 1 year ago
mass = energy = information
@landspide · 1 year ago
isn't this like Maxwell's demon?
@abdobelbida7170 · 1 year ago
How so?
@MasterHigure · 1 year ago
@@abdobelbida7170 Maxwell's demon can separate a fluid into fast / slow atoms, or a fluid mix into its constituent parts. You can extract energy from the recombining. The cost is that the demon's knowledge of the state of the system becomes less and less useful as it does its work.
@lubricustheslippery5028 · 1 year ago
I am confused. I thought that information needs a physical representation according to information theory, so something like pure information should not be a thing, just as pure energy is not a thing. Or is it something like being able to convert between entropy and enthalpy? In chemistry you calculate with Gibbs free energy, which combines entropy and enthalpy, to see if a reaction can occur.
@demetrius235 · 1 year ago
I worked in the semiconductor industry (DRAM) for a few years and now one of the courses I teach is Thermodynamics. I had no idea about the Landauer limit so thanks for teaching me something new! Also, good work pointing out that a completely reversible process is not possible as there is always some energy loss (collisions in your billiard ball case). This was an excellent video!
@anntakamaki1960 · 1 year ago
Is it in Russia?
@demetrius235 · 1 year ago
@@anntakamaki1960 "it"?? I did not work in Russia and I have never been to Russia. I have no desire to set foot in Russia.
@mathslestan3323 · 1 year ago
@@demetrius235 Wow so much love for Russia 😑
@Nehmo · 1 year ago
How can you be associated with semiconductors in any way and not know about the Landauer limit? Sorry to be critical, but it's rather basic. Now that you know about it, you will recognize encountering it again and again.
@katrinabryce · 1 year ago
@@Nehmo Possibly because it is so small that for all practical purposes it is zero? The Landauer limit of a typical modern 35W CPU is about 35 nanowatts. And for DRAM it is actually 0W, because the whole point of RAM is that you get back out what you put in, so it is reversible.
@bntagkas · 1 year ago
I'm just a stupid high-school dropout, but to me it seems all of these energies are really kinetic energy. Whether it moves a car, manipulates information or produces heat, if you zoom in you move atoms in a way that benefits you: you move atoms to move the car (kinetic), you move atoms/electrons/photons to manipulate information, you move atoms to produce heat. So it seems to me all kinds are really one: kinetic.
@stufarnham · 1 year ago
This has become my favorite YouTube channel. These short, digestible discussions of deep topics are endlessly fascinating. I especially enjoy the discussions of paradoxes. Also, you are a great presenter - clear and engaging. Keep it up, please!❤
@pedromartins1474 · 1 year ago
This took me back to my statistical physics classes! Wonderfully explained! Thank you so much!
@JanStrojil · 1 year ago
The fact that information contains energy always boggles my mind.
@goldenwarrior1186 · 1 year ago
It makes sense. There’s nothing to really think of (don’t mean to be mean)
@MyMy-tv7fd · 1 year ago
That is because it does not. There is no necessary information in a logic gate switching; it could randomly switch, or just be set to oscillate. No information is involved in mere switching, but energy obviously is. Either this is clickbait, or she does not understand that information is the intentional switching of logic gates to produce a certain storage pattern, whether volatile like RAM or non-volatile like an SSD.
@dominobuilder100 · 1 year ago
@@MyMy-tv7fd give an example then of any sort of information that does not contain energy
@MyMy-tv7fd · 1 year ago
@@dominobuilder100 - no information whatsoever contains energy; it is conceptual. But there is always a physical substrate, whether it be the page of a book or a RAM stick. The change in the substrate could contain information, or just be random, but the change itself will always require energy. The information is the 'ghost in the machine'.
@paulthompson9668 · 1 year ago
@@goldenwarrior1186 Hindsight is always 20/20
@gotbread2 · 1 year ago
While the second law gives a mathematical justification for that energy loss, it does not give a deeper "why". The fundamental issue is information erasure itself, which comes down to collapsing a state to a single value. Imagine the 2 bits getting reduced to 1 bit. This means we force one bit from having a variable state (either 0 or 1) to a fixed state. It can be any value, it does not matter, but it is now a constant and no longer a variable. This is where the loss occurs.

One helpful visual is a ball in a potential with 2 valleys (as a stand-in for a particle in a double-well potential). This ball can be in either of the 2 valleys initially; by definition we don't know which, else it would be a known constant and not a variable. Say we want to move this ball into the left valley, from any starting valley. The issue is that whatever we come up with needs to work for both starting valleys, just as bit erasure must be able to set a 0 to a 0, but also a 1 to a 0. In the ball case, you can move it over to the other valley, but then it will have some speed left, which you need to dissipate in order for it to come to rest at the bottom and keep this state. This is exactly where the loss happens. You can add some kind of reversible damping to "catch" this energy, but then it won't work for the case where the ball was already in the correct valley. Whatever case you design for will always cause an energy loss for the other case, since you need to go from a state with potentially "some" kinetic energy to a state with "zero" kinetic energy, without knowing the direction of the motion. (This is similar to Maxwell's demon.)

Now how much energy do we need to dissipate? Also easy to see. In order to differentiate between the 2 bit states, there needs to be a potential barrier between them, high enough to prevent thermal motion from flipping the bit on its own. The energy you need to dissipate while "catching" the bit and bringing it to rest comes directly from the energy you need to expend to cross this barrier. Since the barrier is temperature related (more temperature -> more thermal energy -> higher barrier needed to avoid flips), the energy loss is also temperature dependent. This is where the "T" in the equation comes from; the Boltzmann constant is in a way mandatory to match the units. The last piece of the puzzle is the ln(2). We can either be satisfied with using the second law as a shortcut here, or the ln(2) can be derived directly from the "geometry" of this "information bit in 2 potential wells" problem.
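Plugging numbers into that result: at room temperature the kT·ln(2) floor works out to a few zeptojoules per erased bit. A small Python check, assuming only T = 300 K (the constants are the exact 2019 SI values):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI definition)
T = 300.0            # room temperature, K

# Landauer limit: minimum heat dissipated per erased bit.
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per bit")              # prints "2.871e-21 J per bit"
print(f"{E_bit / 1.602176634e-19:.4f} eV")   # prints "0.0179 eV"
```

For comparison, this is roughly a thousand times less than the ~25 meV of average thermal energy kT at the same temperature scale, which is why practical switching energies dwarf it.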
@dtkedtyjrtyj · 1 year ago
Wow. I actually think I understood some of that. It makes intuitive sense...?
@rewe3536 · 1 year ago
Thank you! The video makes it seem like it's just magic, it just happens.
@garyw.9628 · 1 year ago
Really nice analysis of why erasing information necessitates a loss of energy. Also very appropriate to mention Maxwell's demon, since his thought experiment cleverly demonstrated the important link between information and energy. But in the derivation of the Landauer limit, and of the Boltzmann constant itself, there seems to be an assumption of a system consisting of the atoms and molecules of a gas. What if the computing device consisted of something smaller than atoms, like photons, neutrinos or quarks? Would the corresponding Landauer limit then, by necessity, have a much lower value?
@gotbread2 · 1 year ago
@@garyw.9628 We did not make any assumptions about what the system is made of. All we need to assume is "some element" (here a particle, but even a wave function works too) which can be in different states, and that the 2 states are separated by an energy barrier. We need this barrier in order to preserve the state (else it would evolve between the states over time), and the height of the barrier is based on the expected disturbances (thermal energy). Further, we assume that crossing this barrier of potential energy requires a certain kinetic energy, which we eventually need to dissipate again. Notice how abstract that setup is: it makes no mention of the particle type, or even a particle at all (even a field configuration would work here). You are correct about the Boltzmann constant, however. This carries some assumptions with it. Since this constant relates the temperature of a gas to its average kinetic energy, it all comes down to how you define "temperature" for your system. If you use different particles, or something different entirely, your definition of what a temperature is may change, thus changing the Boltzmann constant.
@ericmedlock · 1 year ago
Great video! I learned a bunch of this in university a million years ago but you do a super job of simplifying a really complex set of concepts. Kudos!
@E4tHam · 1 year ago
I'm currently pursuing a master's in VLSI, so thanks for introducing these concepts to people! Although the built-in impedances in metals and semiconductors will always overshadow the Landauer limit by several orders of magnitude, this is an interesting thought experiment.
@adamnevraumont4027 · 1 year ago
always is a long time to make a promise for
@adivp7 · 1 year ago
@@adamnevraumont4027 Even if you eliminate metal impedances with superconduction, you definitely need semiconducting material for a transistor. And semiconductors will always produce heat.
@adamnevraumont4027 · 1 year ago
@@adivp7 which you then follow with a formal proof that all computation requires transistors made of semiconductors? No? Well then making a promise for "always" is beyond your pay grade. Always is a very long time. Always is not just 10 years, it is 100 years, it is 10000 years, it is 100,000,000 years, it is 10^16 years, it is 10^256 years, it is 10^10^10^10^10 years, it is G64 years, it is TREE3 years, it is BB(50) years. It is a really long time.
@dot32 · 1 year ago
​@@adamnevraumont4027 lmao, it's physics. You need semiconductors for transistors. If you found something other than a transistor, you may not need semiconductors, but semiconductors are what transistors are afaik
@adivp7 · 1 year ago
@@adamnevraumont4027 Technically, none of those are "always", but fair point. What I meant to say is you need energy for switching. I can't see how you can have switching that doesn't use or release any energy.
@Alestrix76 · 1 year ago
I was wondering about the "many million more" at 10:00 and did some math, as I thought this sounded a little too much. A somewhat modern 64-bit x86 CPU has around 5*10^8 logic gates. Let's say with each cycle 1% of those gates get flipped and there are 2*10^9 cycles per second (2 GHz); then we end up with a Landauer floor of around 30 µW. Modern power-efficient x86 processors need roughly 10 W, which is a few hundred thousand times this. Not sure what the numbers are like with an ARM processor in a smartphone. Of course this is just ballpark math.
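Re-running this ballpark explicitly (every input below is the commenter's rough assumption, not a measurement) puts the kT·ln(2) floor in the tens of microwatts, so a ~10 W CPU sits around five to six orders of magnitude above it:

```python
import math

# Ballpark inputs from the comment above (assumptions, not measurements):
gates = 5e8           # logic gates in a modern x86 CPU
flip_fraction = 0.01  # fraction of gates switching per cycle
clock_hz = 2e9        # 2 GHz clock

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0             # room temperature, K
e_bit = k_B * T * math.log(2)   # Landauer energy per erased bit, ~2.87e-21 J

bit_ops_per_s = gates * flip_fraction * clock_hz   # 1e16 bit operations/s
landauer_power = bit_ops_per_s * e_bit
print(f"Landauer floor: {landauer_power:.2e} W")   # about 29 microwatts
print(f"A 10 W CPU is {10 / landauer_power:.0f}x above this floor")
```

The exact ratio shifts with the assumed gate count and activity factor, but the qualitative conclusion (real CPUs waste vastly more than the Landauer minimum) is robust.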
@greenaum · 10 months ago
The latest AMD CPUs have 8.2*10^10 transistors, or 82 billion. A logic gate might have maybe 5 transistors, so you're off by a factor of about 100. And a lot more than 1% of the gates get flipped. CPUs are designed to use as much of the hardware as possible, all the time, to get more processing done. You don't want bits of the chip sitting around idle. This is why you might have "4 cores, 8 threads". Each core runs two "threads": if, say, the chip's multiplier unit is being used by one instruction, there might be a memory access it can make, using its memory access hardware, for another instruction. It runs two instructions at once, but it's not two entire processors. Instead, it's two front-ends that work out which instructions can be run with the currently unused parts of the CPU. So you get, say, 1.8 effective processors per core, with just the addition of a second front-end and a bit of logic to figure out what's compatible. There's also pipelining, where an instruction might take, say, 5 stages of operations to complete. The 5 stages are built separately, and as an instruction leaves the first stage, a new one is brought in, so all 5 stages are busy, all the time, with 5 instructions. Then there's out-of-order execution and all sorts of other mad tricks to try and get processing done in almost no time at all. CPUs will have multiple adder units and other parts. It's not just having multiple cores; each core does multiple things. So they're busy, by design, to produce the most throughput of calculation for the silicon you buy. To have circuits sitting idle is wasteful, and indeed they analyse that at the factory: if a part isn't pulling its weight, they'll replace it with something else that does. It's all about getting the most processing possible done, because that's what they compete on, and what they set their prices by.

In power-saving chips, like for phones, the opposite is true. They try to switch off as many sub-units as possible while still providing just enough CPU power for whatever you're doing at that moment. Entire cores will usually be shut down, but they can wake up quickly. Plus modern phones might have 4 high-power, fast processors and 4 slower, very low-power ones, designed with different techniques, switched on and off as the operating system decides it needs them.
@esquilax5563 · 1 year ago
This is one of my favourite videos of yours. Most of them cover topics I have a fair bit of familiarity with, but your "where's the mystery?" intro made me realise I've barely thought about this at all
@itsawonderfullife4802 · 1 year ago
Great video as always. One little reminder though: at 7:53 you also have to consider the power plant (and include it in the "system") that makes the electricity used to produce the work (= energy for the refrigerator's compressor) the refrigerator needs to reverse and decrease the entropy of its interior. Then the 2nd law is maintained overall, and the entropy of a closed system (most often) increases.
@danielschein6845 · 1 year ago
Amazing to think about. I spent 10 years designing actual microprocessors and always thought of energy in terms of the electrical current flowing through the device.
@ralfbaechle · 1 year ago
To be honest, the Landauer limit is so low that we can spend another century without reaching it. Figuratively speaking, that is; I've not done an estimate of how long we'd need to reach the Landauer limit from where technology is now, because, let's face it, such estimates are usually wrong :-) So it's perfectly OK to concentrate on all the other losses.
@DrewNorthup · 1 year ago
And even that is an oversimplification… The need to understand the impact of both resistance and reactance escapes a good many people. Switching losses are so large compared to Landauer that I'd not expect the latter to factor in meaningfully for quite some time.
@triffid0hunter · 1 year ago
Sure, but the Landauer limit says that a modern CPU must use at least a nanowatt or so, and since they _actually_ use about a hundred watts, we've got a _long_ way to go before having to deal with the limit - Wikipedia's article says maybe 2080 if Koomey's law holds, although it doesn't mention which Koomey's law figure (there are two in the relevant article) was used to derive that estimate.
@_ninthRing_ · 1 year ago
@@ralfbaechle Perhaps at the scale of a single CPU in a home computer, but at the scale of the massive data centers behind internet servers, or the supercomputers used to simulate the climate/weather patterns of the entire Earth, the need to reduce the tremendous heat being generated (both in terms of efficiency and especially the cost of cooling these mega-systems) makes current efforts to approach the Landauer limit financially viable. Just look at how much energy is estimated to be diverted from electrical power grids towards cryptocurrency farming, and try to estimate the vast amount of waste heat generated by all this data processing (especially in hotter regions like Texas, where there's already a profound energy cost in maintaining functional computing environments). Any tweak to processor design which reduces, or more likely merely narrows the gap to, the Landauer limit is going to be a worthwhile achievement.
@jesuss.c.8869 · 1 year ago
Great video, Jade. Thank you for introducing such a complex topic in an easy and fun way. 👍
@Kimwilliams45 · 1 year ago
Thank you. I had never heard about the Landauer limit even though I was a physics student. Very good explanation.
@maxmustsleep · 1 year ago
This is so fascinating! Great job explaining this complex concept, I think even without much CS knowledge you'll be able to understand it from this video
@sachamm · 1 year ago
The thought experiment I was given when learning about reversible computing referred to the elasticity of atomic bonds and how energy could be returned when a molecule returned to its original conformation.
@slevinchannel7589
@slevinchannel7589 Жыл бұрын
Not many have the intellectual integrity to watch the harsh history coverage that Some More News did in the video "Our Fake Thanksgiving".
@david203
@david203 Жыл бұрын
I don't see it. You would have to identify where the "atomic bonds" are in the logic circuits. Read Gotbread's analysis in another comment here. It makes more sense.
@heartofdawn2341
@heartofdawn2341 Жыл бұрын
The question then is, where does that second bit of output information go? How is it stored and used? If you simply discard it later, all that happens is that you push the source of the Landauer limit further downstream.
@JB52520
@JB52520 Жыл бұрын
No one really knows because there's no design for a reversible computer yet. The billiard ball example shows how the balls might be returned to the correct place with a logic gate that doesn't expend energy, but it doesn't show where they're stored or how they'll return at the precise time (as far as I remember; it's been a while since I read about this). I'm just guessing, but a useful metaphor might be to picture a mechanical computer where each of the waste bits is stored in a tiny spring, such that computing would be like winding a clock. Once the result is obtained, the program runs in reverse to unwind the system and return the stored energy. (How it would actually work, I have no idea.) It's also like the difference between standard car brakes and regenerative braking. The former just radiates heat, and the latter runs one or more generators, storing energy to accelerate the car later. As far as I can tell, reversible computing doesn't have to be perfect, just like regenerative brakes. Even if a program can only run backward part way before releasing the remainder of its stored energy as heat, that's still better than releasing all of it, and it might be enough for a computer of the distant future to sidestep the Landauer limit.
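The "logic gate that doesn't expend energy" idea can be sketched with the standard textbook example of a reversible gate, the Toffoli (controlled-controlled-NOT) gate, which embeds AND without discarding its inputs. This is only a toy illustration in Python, not any actual hardware design:

```python
# Toffoli (CCNOT) gate: flips the target bit c only when both controls a, b are 1.
# It is its own inverse, so applying it twice recovers the original inputs --
# no information is destroyed, unlike a plain AND or NAND gate.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# AND can be embedded reversibly: with c = 0, the third output is a AND b,
# while a and b are carried through so the step can later be undone.
a, b = 1, 1
out = toffoli(a, b, 0)   # (1, 1, 1): third bit holds a AND b
back = toffoli(*out)     # applying the gate again restores the inputs
assert back == (1, 1, 0)
```

Running the gate forward and then backward returns every bit to its starting state, which is the logical counterpart of the "unwinding the clock spring" picture above.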
@erkinalp
@erkinalp Жыл бұрын
@@JB52520 All quantum computers use reversible computational elements to prevent immediate collapses of superposition.
@glenncurry3041
@glenncurry3041 Жыл бұрын
@@erkinalp Qubits are not single binary bits.
@brandonklein1
@brandonklein1 Жыл бұрын
@@erkinalp but you're still subject to the Landauer limit once you make a measurement.
@hyperduality2838
@hyperduality2838 Жыл бұрын
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionsim is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@saggezza-artificiale
@saggezza-artificiale Жыл бұрын
Very nice video. I'd just suggest clarifying that, if we consider real use cases, we'll probably never get around Landauer's principle, even if we developed perfect reversible computing. This is because reversible computing lets us handle information without erasing it, but unless we have a device that can store an infinite quantity of information, sooner or later we'll need to delete some of it, and that can't be done in a reversible way.
@zscriptwriter
@zscriptwriter 3 ай бұрын
Back in college in 1984 I created a circuit gate simulator on my TI-99 computer that allowed the user to place circuit gates onto the screen, connect them, input values, and then compute the output. Given enough memory, the simulator could reverse-lookup the initial values. Thank you Jade for reminding me how much fun I had in college. You are an awesome person with an unlimited imagination.
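A reverse lookup like the one described is easy to sketch for single two-input gates. This is a minimal Python illustration (the gate set and structure are assumptions for the example, not the original TI-99 program):

```python
from itertools import product

# A few basic two-input gates.
GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
}

def reverse_lookup(gate, output):
    """Enumerate every input pair that could have produced `output`."""
    fn = GATES[gate]
    return [(a, b) for a, b in product((0, 1), repeat=2) if fn(a, b) == output]

print(reverse_lookup("AND", 1))  # exactly one candidate: [(1, 1)]
print(reverse_lookup("AND", 0))  # three candidates: information was lost
```

The ambiguous case (three candidate inputs for an output of 0) is exactly why, as the video explains, erasing the inputs of an AND gate must cost entropy.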
@trewaldo
@trewaldo Жыл бұрын
I went to watch this video with the preconceived notion about entropy's connection with energy and information. But this is different. The explanation made it better! Thanks, Jade. Cheers! 🥰🤓😍
@pandoorloki1232
@pandoorloki1232 Жыл бұрын
Shannon entropy is not thermodynamic entropy--they are fundamentally different things.
@fios4528
@fios4528 Жыл бұрын
Thank you for making this video. I've genuinely spent many a sleepless night thinking about the lifespan and logistics of a minimum energy computer for a future civilization that lives in a simulation.
@berniv7375
@berniv7375 Жыл бұрын
Could you possibly do a video about data banks and how they are taking over the world. I was astonished to learn how much energy is required just to take a digital photograph and how much energy is required to cool down data banks. 🌱
@cate01a
@cate01a Жыл бұрын
if the universe is a simulation, why care for energy? like the people controlling the simulation could either supply it with nuclear or better energy that would last many universes' lifetimes. or more likely is they'd speed up the simulation speed to like tree(10^^^^^^^^^^^^^^^^^^^^^^^^^10), so that maintenance is a non issue because they (might, havent done the maths, can you even calculate that?? no probably not since iirc we only know a COUPLE digits of tree(3) so fuck that bigass number) would have already simulated more universes than we could literally comprehend in less than a microsecond. though thatd take fucking insane technology, probably impossible for the laws of physics, so then the simulator controllers would need to exist in a much more advanced, cooler universe, but then the persons controlling THEM would need even IMMENSELY more fucking power, like holy shit, not even a trillion bajillion shit tonne quatrillion lifecycles of the entire universe/existence could even come CLOSE to the amount of damn energy needed for that to happen so uh i guess simulation is outta the question unless like god is real and on his impossible computer he's playing the sims10 or some shit, though yknow that makes zero fucking sense too
@anywallsocket
@anywallsocket Жыл бұрын
@@cate01a A^^A = A^A
@paulmichaelfreedman8334
@paulmichaelfreedman8334 Жыл бұрын
Funny thing is, if it turns out we're bound to our solar system (FTL completely impossible for complexly arranged matter), at least part of the planet's population will choose to live in a simulation created by ourselves by means of VR immersion with whatever technology is invented for this purpose in the future. Games like EVE, WoW and Elite Dangerous are examples of current-day escape to a universe in which much more is possible than in the real one. If you have seen the movie Surrogates or Ready Player One you'll catch my drift.
@cate01a
@cate01a Жыл бұрын
@@paulmichaelfreedman8334 faster than light travel for complex matter/humans ships plants etc should be possible (not feasible (yet)) by manipulating space time, which is possible - and by making the spacetime go faster than light rather than the matter, the matter is still travelling whilst not going against the laws of physics: kzfaq.info/get/bejne/frt9esZpzavPoJc.html&ab_channel=PBSSpaceTime havent watched that specific vid but probably explains same concept
@0ptikGhost
@0ptikGhost Жыл бұрын
I love this video as it pertains to theoretical computation. Realizable computers today generally use electric current flowing through substrates that always have some level of impedance. The real reason our current-day computers generate heat has nothing to do with the Landauer Limit but rather with the technology we use to build said computers. We don't stand a chance of getting anywhere near the Landauer Limit, regardless of the temperature we run the computer at, unless we figure out how to make computers using superconductors.
@david203
@david203 Жыл бұрын
Yes, that is today's understanding. But the theory itself doesn't require superconductors. And CMOS does actually use less energy than, say, TTL.
@TheDavidlloydjones
@TheDavidlloydjones Жыл бұрын
@@david203 You rather seem to miss Ghost's sensible point. Try reading it over again. Or look for E4tHam's sensible post, below in my feed, making Ghost's point in slightly different form.
@david203
@david203 Жыл бұрын
@@TheDavidlloydjones I fully agree with Ghost's point, thanks. The current required to activate computer logic causes vastly larger heat than does the actual processing of information. That's why it's called a Limit.
@greenaum
@greenaum 10 ай бұрын
Right. Charging and discharging the gates of umpty-billion transistors requires dumping the energy as heat. Resistors make heat, though CMOS is usually quite high-impedance. There's reversible logic, but frankly it looks like a pain in the arse and it seems like most of it happens only on paper and the rest might be that someone strings together a couple of logic gates, not an entire CPU. Even then it's probably made of snooker balls!
@allanwrobel6607
@allanwrobel6607 11 ай бұрын
A fundamental concept explained so clearly. You have a rare talent; keep going with these.
@cykkm
@cykkm Жыл бұрын
There is a little big problem with the reversible billiard ball (RBB) computer (tangentially, the problem was noticed by Landauer himself in the original paper, but I'll translate it into the RBB language). Suppose you place ideal compressive springs at all outputs of the RBB logical circuit to _actually_ reverse the computation. This works indeed, but then you _don't know the result of the computation!_ You can adiabatically _uncompute_ any computation as long as you don't read its result. If you want to know the result, i.e. whether or not a ball popped out from an output and bounced back, you have no option but to touch the system. Even a single photon interacting with a ball, such that you can detect whether there was a ball bouncing off the spring or not, transfers momentum to the ball, breaking the time symmetry of the RBB. A reversible computation is possible, _as long as you uncompute it all without reading out the result!_ The act of reading a result must increase the computer's entropy, even if the computer is reversible. This was one of the main Landauer results. His paper connected Shannon and Boltzmann entropy so clearly.
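The compute/uncompute cycle described here (usually associated with Bennett's work on reversible computing) can be mimicked with a self-inverse bit operation. This is only a classical caricature of the idea, with the readout flagged as the step that breaks reversibility:

```python
# Classical caricature of a compute -> read -> uncompute cycle.
# The reversible step is a Toffoli-style update, which is its own inverse.
def step(state):
    a, b, c = state
    return a, b, c ^ (a & b)   # reversibly embeds AND into the third bit

initial = (1, 1, 0)
computed = step(initial)       # forward: third bit now holds 1 AND 1
result = computed[2]           # reading out the answer is the irreversible part
restored = step(computed)      # backward: the same step undoes itself
assert restored == initial and result == 1
```

The toy model runs forward and backward for free; what it cannot model is the physical cost of extracting `result`, which is exactly the point of the comment above.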
@JB52520
@JB52520 Жыл бұрын
I hadn't read that Landauer was working from a time symmetry perspective. If you're going to run the universe backwards, there's no need for a special computer. Heat will flow back into a normal computer, cooling it and pushing electricity back into the wall with perfect efficiency. If time symmetry must actually remain unbroken and that's not a joke, there's nothing clear about this concept. Can you have a system that's not influenced by the probabilistic nature of quantum effects? Even if that's not a problem, reversible computing couldn't give off photons, including infrared, because they'd have to turn around and hit the computer precisely where and when they left. Any irreversible interaction with the environment would also be forbidden. This means reversible computing would require impossibly perfect efficiency in perfect isolation, ignoring quantum effects and spontaneously traveling backward in time, while being theoretically guaranteed to produce no results. I don't know anything anymore. This explains nothing "so clearly". At least you easily understand the incomprehensible and see utility in the useless. Screw it, everyone else is awesome and I'm wrong about everything. The more I try to learn or think, the more I realize it's a miracle I ever learned to tie my shoes. Being born was a terrible idea. Hey, I finally understand something.
@michaelharrison1093
@michaelharrison1093 Жыл бұрын
This is along the same lines as the argument that I made in a comment I submitted. Reversibility does not eliminate the change in entropy.
@mihailmilev9909
@mihailmilev9909 Жыл бұрын
@@JB52520 lmao well fucking said dude
@mihailmilev9909
@mihailmilev9909 Жыл бұрын
@@JB52520 one of my favorite comments ever
@mihailmilev9909
@mihailmilev9909 Жыл бұрын
@@michaelharrison1093 what was the context? Who's in ur pfp btw, some professor?
@hoptanglishalive4156
@hoptanglishalive4156 Жыл бұрын
Disinformation gives me the chills but information warms my soul. Like Prometheus giving the fire of knowledge to humanity, science educators like shining Jade are doing vital work.
@havenbastion
@havenbastion Жыл бұрын
"Pure information" still exists as a pattern in a physical substrate and the movement of physical things is heat. There's no mystery unless you imagine pure information to be non-physical, which is an existential metaphysical category error.
@thelocalsage
@thelocalsage Жыл бұрын
yeah yeah yeah and really there are no exact circles in nature and technically pi is a rational number because there aren’t *really* uncountably many things to construct pi from and blah blah blah we’re dealing in mathematical abstraction here you don’t need to “well, actually” a really good piece of science communication by telling mathematicians they’re performing errors of metaphysics when it takes only a modicum of emotional intelligence to see these ideas are a platform for discussing purely mathematical systems
@pandoorloki1232
@pandoorloki1232 Жыл бұрын
"unless you imagine pure information to be non-physical, which is an existential metaphysical category error." That is precisely backwards. Confusing information with physical implementations is a category mistake. P.S. The response is more of the same conceptually confused nonsense. Information is a mathematical abstraction ... it doesn't need a physical instantiation or a "physical need"--that confuses the colloquial meaning of information with the formal meaning.
@havenbastion
@havenbastion Жыл бұрын
@@pandoorloki1232 The information always has a physical instantiation or it couldn't be information. In order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.
@thelocalsage
@thelocalsage Жыл бұрын
@@havenbastion a circle always has a physical instantiation or it couldn’t be a circle. in order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.
@lensman7519
@lensman7519 Жыл бұрын
Great content ty Nice convergences with plank length, spooky connectivity, entropy and quantum computers. Thx for the headache
@nbooth
@nbooth Жыл бұрын
Minor quibble but the second law doesn't imply that entropy can only increase, only that it is more likely to. You can always get lucky and have total entropy go down. I'm sure it happens all the time.
@nmarbletoe8210
@nmarbletoe8210 Жыл бұрын
indeed! And at the maximum entropy state, it is more likely that entropy will increase. Randomness is fun
@IllIl
@IllIl Жыл бұрын
Thank you very much for the video! This is by far the best explanation of the Landauer limit that I've heard, it actually makes sense to me now. One question I still have is that this equivalence between information and entropy seems to be a purely theoretical limit that comes out of the math when we take the 2nd law as axiomatic and then "balance the entropy books" of a gate that has less information on the output than the input. But this seems to make an assumption that "information" obeys the 2nd law. In reality, the information of a logic gate is something that a human interprets. The reality is just physical conductors and semi-conductors. The physical reality of _any_ implementation of _any_ logic gate should always be sufficient to preserve the 2nd law. Or is that the point? That if we found the ultimate physical implementation of a logic gate, its waste heat would be equal to (or greater than) that limit?
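For a sense of scale, the limit under discussion is k_B·T·ln 2 of heat per erased bit, which at room temperature works out to a few zeptojoules:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact value in the 2019 SI)
T = 300.0            # an assumed room temperature in kelvin

# Landauer limit: minimum heat dissipated when one bit is erased.
E_per_bit = k_B * T * math.log(2)
print(f"{E_per_bit:.2e} J")   # ~2.87e-21 J, roughly 0.018 eV
```

That any real implementation must dissipate at least this much when it destroys a bit, whatever the substrate, is the "balancing the entropy books" the comment asks about.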
@Elear.
@Elear. Жыл бұрын
I have been wondering about this for years. Finally. Thank you
@deslomator
@deslomator Жыл бұрын
I didn't know this concept and it's really well presented, thank you.
@atrus3823
@atrus3823 Жыл бұрын
Great video, as always! Based on my absolutely zero research, my first thoughts are that though the entropy change (assume lossless collisions) of the gate is 0, half of the output energy is not used in the final calculation. There would need to be a way in the system as a whole to make use of the waste balls, or else you're right back where you started.
@waylonbarrett3456
@waylonbarrett3456 Жыл бұрын
This has been considered. See the Szilard engine.
@atrus3823
@atrus3823 10 ай бұрын
Thanks for the replies. Someday, I'll hopefully find time to follow up on this.
@elroyfudbucker6806
@elroyfudbucker6806 Жыл бұрын
Practically speaking, the heat generated within a microprocessor comes during the infinitesimally short time that the MOSFETs that make up the logic gates change state. When they are either conducting or not conducting current, they generate no heat. Multiply this extremely tiny amount of heat by the millions of MOSFETs that are changing state millions of times a second & you see the need to have a cooling fan on a heatsink mounted directly on the microprocessor & why it's not a good idea for your daughter to have her laptop on the bed.
@htomerif
@htomerif Жыл бұрын
Minor correction: what you said was exactly true 20 years ago. Right now, with the size of junctions in gates, a good chunk of the heat generated by processors is from quantum tunneling (i.e. leakage current intrinsic to the gate). I say "gate" because in modern processors you don't have individual FETs. You have compound devices that serve the purpose of a logic gate without actually having a recognizable FET. The only way to stop that kind of power drain is to kill the power entirely to large sections of a processor; otherwise they constantly drain power, flipping bits or not. I realize I used the word "gate" both meaning the insulated physical gate that controls current flow in a semiconductor device and meaning "logic gate" here. Hopefully it's relatively clear which is which.
@KirbyZhang
@KirbyZhang Жыл бұрын
@@htomerif does this mean older FET processes can produce more power efficient chips?
@htomerif
@htomerif Жыл бұрын
@@KirbyZhang It's not so much about "older". It's about what they were designed for. OP isn't wrong that FETs changing state *used* to be the primary power drain, or that it is *still* a significant contribution to TDP. It's just been optimized now in CPUs for minimum power per computation, and that means gate leakage outweighs state changes. Unfortunately, this gets complicated. You probably know that past a certain point (I don't know, about 100nm?) the process nodes have nothing to do with size anymore. For example, the 7 "nm" node can't make features that are 7 nanometers. Its minimum feature size depends on the type of feature, but it's up around 50 actual nanometers. It's kind of a big fat lie. Ultimately, the answer to your question is: using current process technology, you can optimize it for power efficiency by increasing gate sizes and increasing gate dielectric thickness, but it comes at the cost of more silicon real estate. They do use older processes and equipment for microcontrollers like TI's MSP430 series or Microchip's ATmega series, but it's older equipment that's been continuously developed for making these slow, extremely power efficient processors. I guess it really depends what you mean by "power efficient". If you mean "minimum standby TDP" then yes. If you mean "minimum power clocked at 100MHz, with a modern design" then also yes. If you mean "absolute minimum cost per floating point operation" then no. And all of those are important considerations. A rackmount server might be going all-out for 5 years straight. A desktop processor will probably spend most of its life at a much lower clock rate than maximum, and a mobile (i.e. phone) processor will probably spend a whole lot of its life in standby at or near absolute minimum power. I hope that helped. It's hard to pack a whole lot of information into a small space that someone would theoretically read.
@c1dv1c1ous
@c1dv1c1ous Жыл бұрын
Jade is a treasure. Thanks for sharing your passion for science!
@mmicoski
@mmicoski Жыл бұрын
Very nice explanation with the billiard balls: each input has to go somewhere if not as output information, as loss energy (heat).
@AdrianBoyko
@AdrianBoyko Жыл бұрын
What a great presentation of these topics. This is Carl Sagan level exposition!
@equesdeventusoccasus
@equesdeventusoccasus Жыл бұрын
I am glad to see a new video from you. Excellent video as always. I think that the true energy limit will always be tied to the efficiency of the system. For instance, build a computer that channels the heat back into electricity, and you have automatically made the system more effective, thus decreasing the amount of energy required to run the system. As far as logic gate reversibility goes, a checksum could be incorporated into the gate that would provide the reversibility being sought. It would require 3 bits rather than 1, but it would be completely reversible. The first rule of computer journaling is that it is resource intensive. However, it can be worth it.
@paulthompson9668
@paulthompson9668 Жыл бұрын
That sounds like the beginning of a journal article. I'm looking forward to seeing you published.
@equesdeventusoccasus
@equesdeventusoccasus Жыл бұрын
@@paulthompson9668 I retired 15 years ago, my article writing days are long past.
@paulthompson9668
@paulthompson9668 Жыл бұрын
@@equesdeventusoccasus But if you have a way of building a computer that channels the heat back into electricity, then you'll be doing society a huge favor by explaining how. Likewise for finding a more efficient way of providing logic gate reversibility.
@pablosartor6715
@pablosartor6715 Жыл бұрын
Great video Jade!! There are others information paradoxes in physics, all related to entropy. This is a very interesting topic.
@LMProduction
@LMProduction Жыл бұрын
The Landauer limit can be circumvented by noting that entropy can increase in other conserved quantities, not just energy. A paper from 2009 (and a few more papers in the years since) by J. Vaccaro and others showed that in principle, you can transfer the k_B ln(2) of entropy per bit to an angular momentum reservoir, therefore inducing no energy loss. We're currently working on a qubit operation to show a single-bit transfer of thermal energy from the environment into a spin bath experimentally.
@douggale5962
@douggale5962 Жыл бұрын
The fundamental component of modern computers is the field-effect transistor. They have a metal-oxide-semiconductor structure, so they are known as MOS transistors. There are positive and negative constructions; the two are suited for pulling the voltage up and pulling it down, respectively. Those two constructions are "complementary", so the fundamental component is, more specifically, the CMOS FET. You switch FETs by charging or discharging the FET's gate layer. The power used by a processor is the dissipation due to the resistance of conductors transferring charge in and out of the gate. Holding a 1 or a 0 state does not consume power; only the changes dissipate significant power.
@abdobelbida7170
@abdobelbida7170 Жыл бұрын
Is that what "switch losses" means?
@douggale5962
@douggale5962 Жыл бұрын
@@abdobelbida7170 Yes. Usually, switching losses primarily refer to the time when a transistor is partially on, and a voltage drop across the source and drain (the switched path) dissipates power. In a switching power supply with large currents, the partially-on pulse of losses is large. That's one huge transistor with one huge pulse of losses. In a CPU, there are no large currents going through any individual transistor, but an enormous number of gates being charged or discharged simultaneously, so an enormous number of small losses sum up to a large loss that would be correctly called switching loss. All of the power that goes into a CPU is lost, all of it becomes waste heat, eventually. Another way of looking at it is, CPUs use no power, other than the power they need for the losses, once they get going.
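The charging/discharging losses described in this thread are usually summarized by the standard dynamic-power estimate P ≈ α·C·V²·f. The numbers below are made-up illustrative values for the sketch, not measurements of any real chip:

```python
# Rough dynamic-power model for CMOS switching: P ~ alpha * C * V^2 * f,
# where alpha is the activity factor (fraction of gates switching per cycle).
# All values here are illustrative assumptions.
alpha = 0.1    # activity factor
C = 1e-9       # total switched capacitance in farads (assumed)
V = 1.0        # supply voltage in volts
f = 3e9        # clock frequency in Hz

P_dynamic = alpha * C * V**2 * f
print(f"{P_dynamic:.1f} W")   # 0.3 W for these made-up numbers
```

The V² term is why lowering the supply voltage has historically been the biggest lever for cutting switching losses, though as noted above it does nothing about leakage.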
@NotHPotter
@NotHPotter Жыл бұрын
Makes sense. In order to maintain a system of any kind of order (and thus store useful information), the system internally needs to expend energy to resist the natural entropy that seeks to degrade that information. Some kind of input is gonna be necessary to resist that decay.
@heinzerbrew
@heinzerbrew Жыл бұрын
we have lots of methods of storing information that don't require energy to maintain. Sure eventually entropy will destroy those storage devices, but we don't operate on that time scale.
@NotHPotter
@NotHPotter Жыл бұрын
@@heinzerbrew Those more stable methods are also a lot slower, although even early CDs and DVDs are approaching the point where they're no longer readable. Books weather and wear. Ultimately, it doesn't matter, though, because even if you're going to quibble over time scales, there is still some necessary effort made to preserve data in a useful state.
@TimJSwan
@TimJSwan Жыл бұрын
You killed it again, Jade. I love the topic you covered, as usual and I never saw the billiard ball example before. You should check out hashlife algorithm which gives 10^800 speedup. Another entropy saver right there.
@Snowflake_tv
@Snowflake_tv Жыл бұрын
So did I. Anyway, Hashlife algorithm???
@MrAttilaRozgonyi
@MrAttilaRozgonyi Жыл бұрын
What an awesome video. Thoroughly engaging. I couldn’t stop watching. Fantastic!! ☺️👍
@davestopforth
@davestopforth Жыл бұрын
I'm struggling with the definition of information here. Surely information is based on our perception of states and conditions. Of course, for information to exist requires a transfer of energy at some point, and the information literally exists because it is a state, but it doesn't become information until we begin to perceive it. For example, a particle travelling through space can have its direction and velocity determined, but the particle doesn't care; it just exists, and it just weighs X whilst travelling at Y towards Z. That's it. For that to become meaningful information requires some perception. Would it not be better to say it requires energy for the creation, transfer or manipulation of information?
@heinzerbrew
@heinzerbrew Жыл бұрын
Yeah, it would have been nice if she had defined "pure information" because the information I know about doesn't need energy to simply exist. You are correct it is the creation, changing, and refreshing that requires energy. Not really sure how she doesn't get that.
@DrewNorthup
@DrewNorthup Жыл бұрын
Information at rest does actually have an energy component, but she'd be here all week explaining it.
@hyperduality2838
@hyperduality2838 Жыл бұрын
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionsim is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@compuholic82
@compuholic82 Жыл бұрын
@Dave "Would it not be better to say it requires energy for the creation, transfer or manipulation of information?" But that also means that information itself must be associated with an energy state, does it not? If the manipulation (i.e. change of state) requires or releases energy there must have been an energy level before the manipulation and an energy level after the manipulation. The difference in these energy levels is the amount of energy needed for the manipulation. In that way it is no different than any other measurement of energy. Take gravitational potential energy. If you drop a ball you can assign an energy level to the ball before and after the drop and the difference in energy levels is released as kinetic energy.
@hyperduality2838
@hyperduality2838 Жыл бұрын
@@compuholic82 There is also a 5th law of thermodynamics, energy is duality, duality is energy! Energy is dual to mass -- Einstein. Dark energy is dual to dark matter. Action is dual to reaction -- Sir Isaac Newton (the duality of force). Attraction is dual to repulsion, push is dual to pull -- forces are dual. If forces are dual then energy must be dual. Energy = force * distance. Electro is dual to magnetic -- Maxwell's equations Positive is dual to negative -- electric charge. North poles are dual to south poles -- magnetic fields. Electro-magnetic energy is dual. "May the force (duality) be with you" -- Jedi teaching. "The force (duality) is strong in this one" -- Jedi teaching. There are new laws of physics! Your mind creates or synthesizes syntropy! Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic. Duality creates reality!
@Zeero3846
@Zeero3846 Жыл бұрын
I got so excited that I actually managed to guess your topic of reversible computing. I had only recently learned about it through some crazy crypto proposal about an energy-backed stablecoin, and it fundamentally relied on the idea of reversible computing. In particular, it suggested a way to transfer energy over the internet. A computation doesn't necessarily have to be located on the same computer. It could happen over the network if you imagine the network as one giant computer. In performing the computation, the input side would generate heat, while the output absorbed an equivalent amount, and you essentially have a new kind of heat pump. Of course, this sounds way too easy to be true, and that's because it is, but it's still definitely cool to think about. It's somewhat reminiscent of Nikola Tesla's wireless energy tower, except perhaps it's a little less scary as it's not trying to transmit energy directly through some physical medium, like the earth and sky, but rather over our telecommunications infrastructure as data.
@josephvanname3377
@josephvanname3377 Жыл бұрын
That sounds like a complete and total snake oil scam. You cannot transfer energy over the internet. It is really sad how people buy into scams hook, line, and sinker while failing to realize how cryptocurrency technologies can truly benefit reversible computation. You have also failed to even provide the name of this 'energy-backed stablecoin', which makes me even more suspicious of you.
@robbeandredstone7344
@robbeandredstone7344 Жыл бұрын
9:40 For anyone wondering, that is approximately 75eV to put it into perspective.
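For reference, the theoretical Landauer floor itself is easy to check from standard constants. A quick sketch (my own back-of-envelope, not anything from the video — at room temperature the floor comes out to a tiny fraction of an eV, so a per-bit figure in the tens of eV presumably describes real hardware dissipation rather than the theoretical minimum):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI 2019)

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum energy to erase one bit: k_B * T * ln(2)."""
    return k_B * temp_kelvin * math.log(2)

E = landauer_limit_joules(300)           # room temperature
E_eV = E / 1.602176634e-19               # convert joules to electronvolts
print(f"{E:.3e} J per bit (~{E_eV:.4f} eV)")
```

At 300 K this gives roughly 2.9e-21 J, i.e. about 0.018 eV per bit erased.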
@TechnoMageB5
@TechnoMageB5 Жыл бұрын
2 things: 1) With current electronic computers, there is no way even with reversible logic gates to make a net zero energy use computer. The architecture itself requires energy to operate, thus the laws of entropy cannot be circumvented. Quantum computing still needs to solve the architecture problem, but at least the computing itself could theoretically work. 2) This video reminds me of a discussion at work I had with a girl years ago, who was a college student working there at the time, about logic states. She was curious as to how computers "think". I started with a similar example, the light being on or off (while switching the light in the room on and off as an illustration), that by storing and comparing millions of on/off states, computers evaluate conditions for us, and that those "light switches" are represented by the presence or absence of voltage - more basic than this video. Next time I visit that location, she tells me that literally a few days after our talk, her professor starts talking about the same thing, and here she is in class smiling to herself the whole time thinking "I know where this is going..."
@itsawonderfullife4802
@itsawonderfullife4802 Жыл бұрын
Insightful video. The "reversible logic gate" (referred to near the end of the video) is simply using (controlled and confined) classical scattering to compute. And we already know that scattering is a reversible process because it is directly dictated by the laws of mechanics (and conservation laws) such as Newton's 2nd law. And Newton's 2nd law is evidently time-reversible (as are other fundamental laws of nature) because they are expressed as 2nd order differential equations (involving acceleration) which are invariant under a time-reversal transformation (=keep positions and reverse velocities). The question then again goes back to this: given that the fundamental laws of nature are time-reversible, how come we have a thermodynamic arrow of time and irreversible macro processes (such as a traditional irreversible logic gate operating)? A common answer is that irreversibility (=the thermodynamic arrow of time) is an emergent feature of an ensemble of many particles (it's simply mathematics and probability). So the model "reversible logic gate" solves really nothing. It's just a toy model for controlled Newtonian scattering, which we have known about for hundreds of years. That does not tell us how to build computers which do not increase entropy.
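That time-reversal symmetry of Newtonian dynamics is easy to demonstrate numerically. A minimal sketch (my own illustration, not the billiard-ball gate from the video): integrate a unit-mass particle on a spring with a leapfrog scheme, flip the velocity, integrate again, and you land back on the initial state up to rounding error.

```python
def leapfrog(x, v, dt, steps):
    """Kick-drift-kick integrator for a unit-mass spring (force = -x).
    Like Newton's second law itself, this update rule is time-reversible."""
    for _ in range(steps):
        v -= x * dt / 2      # half kick
        x += v * dt          # drift
        v -= x * dt / 2      # half kick
    return x, v

x0, v0 = 1.0, 0.0
x, v = leapfrog(x0, v0, dt=0.01, steps=5000)   # run forward
x, v = leapfrog(x, -v, dt=0.01, steps=5000)    # reverse velocities, run forward again
print(x, -v)  # recovers (x0, v0) up to floating-point rounding
```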
@dwightk.schrute8696
@dwightk.schrute8696 Жыл бұрын
To illustrate the scale at play here - imagine you have an "optimal counter" that operates at the Landauer limit and simply increments a number by 1. Now you're going to do something spectacular - let's convert all the mass in our Hubble volume to energy and feed it to our optimal counter using Einstein's famous mass-energy equivalence (roughly 2x10^69 J). Assuming you and the counter survived this event, you'd see the number 2^341 (given the current temperature of space, 2.72 K). If you have a modern CPU, it probably supports the AVX-512 instructions, which allow it to easily operate on 512-bit numbers. The vast majority of numbers between 0 and 2^512 will never exist in any shape or form. This is sometimes used in computer security to inform us about the strength of systems against certain classes of attacks. I have a pet theory that the computational density of the universe is constant at any point in time - with the expansion of spacetime our Hubble volume shrinks and we have less available energy to use; on the other hand, thanks to the expansion everything gets cooler, and I think the numbers even out.
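A quick back-of-envelope check of that estimate, using the same inputs as the comment. With the simplest accounting (one Landauer-limit bit erasure per increment) it lands near 2^305; the exact exponent depends on how many bit erasures you charge per increment, but the scale of the argument holds either way.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T_cmb = 2.72         # K, present temperature of deep space (as quoted above)
E_total = 2e69       # J, the mass-energy figure quoted above

e_bit = k_B * T_cmb * math.log(2)   # Landauer cost of one bit erasure
n_ops = E_total / e_bit             # total affordable erasures
print(f"~2^{math.log2(n_ops):.0f} erasures")  # roughly 2^305 with this simple accounting
```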
@gnanay8555
@gnanay8555 Жыл бұрын
Reading this in Dwight's voice took longer than I expected 🤔
@vigilantcosmicpenguin8721
@vigilantcosmicpenguin8721 Жыл бұрын
@@gnanay8555 It took a second for my brain to determine what tone of voice Dwight would say it in. I read it as condescending, with a few digressions into matter-of-factly.
@jonathandawson3091
@jonathandawson3091 Жыл бұрын
This is an incredible video. Huge respect to you Up and Atom, I learned a lot from you - thank you!
@mystyle43
@mystyle43 Жыл бұрын
This is a very important phenomenon. It even has a connection to metaphysics: the equivalence between energy, matter and information.
@Mike500912
@Mike500912 Жыл бұрын
1. Microprocessors do output energy via driving external ports, used for driving other subcircuits. 2. A logic gate isn't a fundamental building block. It itself is made up of transistors, a more fundamental building block.
@danielclv97
@danielclv97 Жыл бұрын
I love the video, but I'd like a follow-up, maybe with Maxwell's demon. Like, if this can solve the need for energy consumption during information manipulation, what stops Maxwell's demon from breaking the laws of thermodynamics? Or is it still physically impossible to store information with no expense of energy? And if so, isn't the billiard ball computer useless in the sense that it will consume the same minimum amount of energy? You can't use information if you don't store it to interpret it, after all. You'll have to at least read the positions of the balls after they pass through the computer.
@geraldsnodd
@geraldsnodd Жыл бұрын
Maxwell's Demon is a cool topic in itself :) Check out Eugene Khutoryansky's video.
@christopherknight4908
@christopherknight4908 Жыл бұрын
If I'm not mistaken, Maxwell's daemon has to expend energy to erase information. Would reversible computing eliminate this requirement? Perhaps information storage is just a different problem that would need to be solved, in addition to the energy usage of computing.
@_kopcsi_
@_kopcsi_ Жыл бұрын
@@christopherknight4908 yes, information storage is totally different from computing, but interestingly the two problems have the same source: "Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment"." -- en.wikipedia.org/wiki/Landauer%27s_principle storage of information is static, manipulation of information is dynamic. these two have totally different nature. but when we erase information, we actually manipulate it. erasure is dynamic. that's why the same principle can provide a solution for the problem of Maxwell's demon and an explanation for the fundamental energy cost of irreversible logical operators.
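The difference between an irreversible operator and a reversible one can be checked by brute force over all inputs. A small sketch of the general idea (my own illustration, not anything from the video): AND collapses distinct inputs onto the same output, while a CNOT-style gate is a bijection, so no information is erased.

```python
from itertools import product

# Irreversible: AND maps four input pairs onto only two outputs,
# so distinct inputs collide and information is lost.
and_outputs = {(a, b): a & b for a, b in product((0, 1), repeat=2)}
assert len(set(and_outputs.values())) < len(and_outputs)   # many-to-one

# Reversible: CNOT keeps both bits, flipping the second
# when the first is 1 -- a bijection on the four input pairs.
def cnot(a, b):
    return (a, a ^ b)

cnot_outputs = {(a, b): cnot(a, b) for a, b in product((0, 1), repeat=2)}
assert len(set(cnot_outputs.values())) == len(cnot_outputs)  # one-to-one

# Being its own inverse, applying CNOT twice returns the original input.
for a, b in product((0, 1), repeat=2):
    assert cnot(*cnot(a, b)) == (a, b)
print("AND loses information; CNOT does not")
```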
@_kopcsi_
@_kopcsi_ Жыл бұрын
@@hyperduality2838 dude smoke less weed
@granduniversal
@granduniversal Жыл бұрын
This approach to entropy is like making a database for an application. You make that based upon the business rules. When you are trying to find out the business rules, you will have to deal with a lot of uncertainty. That uncertainty is not unlike the challenges that surround trying to figure out chaos math. Anyway, the more uncertainty, the more entropy. And the natural state for most people to be in is that of ignorance.

When you are figuring out the business rules you will find varying interpretations, and varying ideas about what kind of a world those can go in. That part is not unlike how consciousness is not a broad-spectrum thing, but rather about what we can focus on at any one point in time. In the end you hope to get to the truth. The truth is the only "state" that can answer all of the questions and logically lead to the end result desired for a particular circumstance.

This thing about getting at the business rules is important as a metaphor. It speaks about how you can try to do math between apples and oranges sometimes, but we want there to be an equivalency. Well, there will be, if you borrow from the outside. I don't mean using things like big data to help you find the business rules. That will most likely result in scope creep, as it points out to you more ways to make differentiations, flinging you into what could be before you understand what is. I mean wrestling with it by using as little information to deduce what you have to as possible. If there is an equivalency, then the correct answer should be surrounded by a lot of falsehood, that is, interest in remaining ignorant, or potential other path traces, or heat. Because this is hard work, and nobody should go around saying it isn't. If you are right about knowledge, or wisdom, there will always be a lot of people going around telling you that you are wrong, in other words. That's no reason to give up.

The example also points out something about linguistics that I find fascinating. It only affects the consistency of one side of the two sides that should be active to gain effective communication. But we see from things like animal training that that is really all that is necessary to achieve success, for the master. It is about reward. And reward is about what an object needs or wants, not what the master demands it needs or wants. Making databases, however, is about making it easier for everybody to do things repeatedly, with less error. Does this lesson tell us that is achievable with only one side being consistent? You know, with the system being built correctly? I guess we find that out when we use a good app versus a bad one?
@thattimestampguy
@thattimestampguy Жыл бұрын
0:55 Heat Production & Waste
1:55 Information has an energy cost ℹ️ The Landauer Limit
2:25 Logic Gate
More than one Logical Gate makes a Logical Network
3:50
4:04 4 Possible Combinations, 2 Possible Outputs
00 01 10 11
6:37 Entropy Formula
8:57 Fewer Outputs Than Input States
10:26 Cooling The Computer 💻 🧊
11:37 Reversible Computers
12:07 Irreversible Operation, You can’t determine the inputs from the output
13:36 It’s Possible To Reverse
14:24
15:28 Billiard Balls
17:04
@leonstenutz6003
@leonstenutz6003 Жыл бұрын
Awesome, thx!
@landsgevaer
@landsgevaer Жыл бұрын
After having spent so much time thinking about how to carefully explain an unintuitive topic and managing to produce such a wonderful video, it must be so disappointing reading so many of the comments here that don't show anything has landed. Fortunately, there are a few exceptions too. Rest assured that it is the YT audience that is lacking, not the producer... 😉
@En_theo
@En_theo Жыл бұрын
Well, she kinda misunderstood some concepts herself since she said @6:24 that entropy "never" goes down, which is false. It tends to increase but there is still a possibility that it goes down. If the universe is eternal, then that hypothesis becomes a certainty and you find yourself with renewed energy again (after a veryyyy long time though).
@landsgevaer
@landsgevaer Жыл бұрын
@@En_theo That may be a theoretical note to make, but I wouldn't say that makes the statement "wrong" for this audience. We can hopefully agree that many comments are much worse... 😉
@En_theo
@En_theo Жыл бұрын
@@landsgevaer People are entitled to their comments, as long as they don't pretend to be physicists bringing new knowledge. I'm not blaming her; it's a common misconception that entropy "always" increases. A simple sentence like "it tends to, but the details are for another video" would have been nice.
@paulthompson9668
@paulthompson9668 Жыл бұрын
Hi Jade, this video seems like a wonderful lead-in to a video on how quantum computers will require less energy due to the reversibility of their gates.
@dodokgp
@dodokgp Жыл бұрын
Not unless the cryogenic operation is obviated.
@paulthompson9668
@paulthompson9668 Жыл бұрын
@@dodokgp So let's get cracking on quantum computing at room temperature!
@AlexSchendel
@AlexSchendel Жыл бұрын
Extreme overclockers use liquid nitrogen to massively increase the clock speed of processors. Since cooling the transistor gate oxide massively increases its stability, they can massively increase the voltage they run at. But as you cover in the video, this amount of cooling drastically reduces the power consumed by the processor, so it all ends up still being stable, and the higher voltage allows for a much higher frequency because the time taken for the transistors to flip is effectively reduced. Very cool to learn about the Landauer limit!
@muthukumaranl
@muthukumaranl Жыл бұрын
Wonderful! Thank you! I am glad the recommender bubbled this up to me. I will now check out your other ones as well!
@davidh.4649
@davidh.4649 Жыл бұрын
So Jade ... does the Landauer Limit apply to synapses in the human brain as well? 😁 Good thought provoking video as always!
@gotbread2
@gotbread2 Жыл бұрын
It does. It's a fundamental consequence of information erasure.
@MrAttilaRozgonyi
@MrAttilaRozgonyi Жыл бұрын
I remember reading somewhere that information actually has weight. In other words, a hard drive full of data is measurably a tiny bit heavier than that same drive when erased. I wish I could remember the actual reason why. Maybe that could make an interesting video? All the best! ☺️☺️
@Snowflake_tv
@Snowflake_tv Жыл бұрын
I'm interested in that, too.
@markricher2065
@markricher2065 Жыл бұрын
I was thinking the very same thing 🤔
@Snowflake_tv
@Snowflake_tv Жыл бұрын
Is information then mass???
@MrAttilaRozgonyi
@MrAttilaRozgonyi Жыл бұрын
I have a vague recollection that it *may* be limited to the older style HDD’s which have magnetic platters and it may not be the case in SSD drives… but that’s all I can remember. :) I’d love to know more about this.
@RobBCactive
@RobBCactive Жыл бұрын
As HDD platters magnetise the coating when the drive heads set bits and sit in the Earth's magnetic field, it's possible the measured weight changes a little by adding a magnetic force but that is not gaining or losing mass.
@majorfallacy5926
@majorfallacy5926 Жыл бұрын
I'm quite familiar with thermodynamics but not so much with computer science, so this completely blew my mind
@glenncurry3041
@glenncurry3041 Жыл бұрын
What an interesting approach, from a strictly information-physics perspective! Being old, my formal electronics education was in discrete components. ICs were first coming out and the CPU did not exist. IOW what you show as a symbol for a logic gate was, to me, originally a batch of steering diodes and mono- or bi-stable multivibrators. To me the total energy loss for that gate is based on having to overcome the barrier voltages at each of the solid-state devices' internal junctions, e.g. 0.7 V for a silicon PN junction. Plus additional components, wiring... Now cascade them... and suddenly multiply that by billions on one chip! Even with today's abilities to cram massive numbers of these onto one chip, they are each still discrete devices and junctions with those needs. While you are working at a completely different level. And I see a great gap between them yet.
@micro2743
@micro2743 Жыл бұрын
"Pure Information" (Data) can be stored on a Floppy Drive/Hard Drive/CD/DVD/Thumb Drive/SSD/etc... and it requires no energy and gives off no heat. Moving information requires energy and produces heat! Data stored in active memory must be refreshed and also requires energy and produces heat, and of course physical hard drives spin for a while even when they are idle. You covered logic gates, but it should also be noted that often computers are just moving data, i.e. copying an image from a web server to your local hard drive, then to active memory, and eventually to your graphics card so you can see it on your screen.
@StefanNoack
@StefanNoack Жыл бұрын
Well, the floppy media requires mass, and that is equivalent to energy ;-)
@Rithmy
@Rithmy Жыл бұрын
And the wear and tear of the floppy media can be said to equal energy loss. Almost like radiation in big.
@micro2743
@micro2743 Жыл бұрын
@@StefanNoack A mass at rest is potential energy and I don't think it produces heat.
@micro2743
@micro2743 Жыл бұрын
@@Rithmy I am talking about cold storage, and I know a floppy can last 20+ years. I don't think a CDR/DVDR lasts that long. Retail CDs/DVDs are usually pressed and degrade much more slowly. Since energy cannot be destroyed, you are correct. That magnetic field has to be moving somewhere else as the floppy degrades, and theoretically creates a small amount of heat. Would we even be able to measure it?
@Rithmy
@Rithmy Жыл бұрын
@@micro2743 Isn't mass alone creating heat by pressure?
@techslfink9722
@techslfink9722 Жыл бұрын
Information is data that induces a change. If the second output of the gate is not observed, will it still have its function? I love your channel - I’ve had very few teachers who explained as well as you do!
@TheHallucinati
@TheHallucinati Жыл бұрын
I remember how my friends and I were tripping on acid around a bonfire during one of our camping excursions, and someone (can't remember who it was exactly) asked something like: "Can every question theoretically be answered at some point?". And somehow it steered the conversation to the point where I was like: " - We don't have enough energy to answer some of our questions" , and it just sounded hilarious, like "we're too tired to answer your question now, go away". So I tried to explain what I meant. Unfortunately I quickly realised that although I was in no danger of running out of energy, my verbal eloquence was very quickly exhausted. I always envied people who not only could explain things like that in a clear and concise form, but actually keep the listeners interested.
@MemphiStig
@MemphiStig Жыл бұрын
Jade's just as good as any other science channel out there, and she's more charismatic (and much cuter!) than most, if not all. She really deserves more subs and views.
@sajidrahman8613
@sajidrahman8613 Жыл бұрын
This is brilliant! Thank you so much for doing this.
@commandershepard6189
@commandershepard6189 Жыл бұрын
Cool! Good video... Some people don't understand that 0 is an open circuit while 1 is a closed circuit. Meaning 0 is the absence of a bit while 1 is a bit. In microchips, 0 still has work or force being applied due to heat transfer from the nearby transistors and the opening of that transistor... this equates to energy loss in the application. Yeah, work applied but no work output. Cool stuff. The problem: we'll never get around thermodynamics.
@jwjarro73
@jwjarro73 6 ай бұрын
This is one of the most intuitive descriptions of entropy I've heard
@firstolasto1518
@firstolasto1518 Жыл бұрын
Brilliant!! Thank you. Amazing video. We’ll have to check out your others
@vansf3433
@vansf3433 Жыл бұрын
Whenever current runs through a logic gate or gated latch, an "on" signal comes out of an output - that signal corresponds to 1 - and when there is no current at an output, the lack of a signal is interpreted as "off", or 0. And when you have a bunch of gated latches arranged in matrices to keep such bits of 1 and 0, you have a computer memory.
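The gated-latch idea above can be sketched in a few lines of Python (the class name and structure are my own illustration, not a circuit-level model):

```python
class GatedDLatch:
    """Minimal model of a gated D latch: while enable is 1 the stored
    bit follows the data line; while enable is 0 it holds its value."""
    def __init__(self):
        self.q = 0                      # the stored bit

    def tick(self, data: int, enable: int) -> int:
        if enable:
            self.q = data               # transparent: follow the input
        return self.q                   # opaque: hold the last value

latch = GatedDLatch()
latch.tick(data=1, enable=1)   # write a 1
latch.tick(data=0, enable=0)   # gate closed: input ignored
print(latch.q)                 # still 1 -- one bit of memory
```

Arrange a grid of these and you have the matrix of 1s and 0s the comment describes.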
@JohnSmith-ut5th
@JohnSmith-ut5th Жыл бұрын
You made a logical leap from defeating the Landauer limit for a single gate to doing so for a whole computer. Adding multiple layers of the gates you described *exponentially* increases information, unless you discard information, which is irreversible. In 2001, I developed the mathematics of a reversible computer that only needs 1 additional bit of information per gate layer (unpublished), so only linear growth.
@ciroguerra-lara6747
@ciroguerra-lara6747 6 ай бұрын
Another way to look at the Landauer principle, from what I've read, is as the energy needed to erase one bit. I've also read that thermodynamic and computational reversibility are not always equivalent, and that the Landauer limit applied to a Maxwell demon implies that, since in reality the demon will eventually have to erase information (we do not have infinite resources to keep it all), irreversibility and entropy increase eventually catch up with us.
@stillme4084
@stillme4084 Жыл бұрын
Great video, amazing info and fun. Thanks & be well.
@sunnybeach4025
@sunnybeach4025 Жыл бұрын
I find it especially captivating that you seem to specialize in language yet use the strong physical foundation in which you conduct this display. I believe my time is best spent learning more about modern language and and and a...
@user-vb3xq9pm8t
@user-vb3xq9pm8t Жыл бұрын
Wow, you are amazing! The way you present information is great and charming! Everything is very clear. I am totally in love! Thanks for your work)
@vimicito
@vimicito Жыл бұрын
Regarding the amount of heat generated by a logic gate, there are the following considerations that - to the best of my knowledge - we tend to apply in practice. 1. The most fundamental part of a processor is the transistor. Multiple transistors can make up a logic gate, and multiple logic gates can make up an instruction. Instructions can be as simple as comparing two registers, or as complex as hardware-accelerated cryptography. 2. Instructions don't always take a single cycle. They may take more than one, especially in RISC computers, which combine simpler instructions to do something more complex. Something like, say, reducing a power to its sequence of multiplications, though in base-2 systems things get far simpler than this. 3. A processor is filled to the brim with transistors. These transistors are like electrical switches, where the push of a button is replaced by a voltage on a logic gate (the transistor's actual... well, gate!). That gate needs a certain time to fill up its charge, depending on the amount of current the driver can supply (this is why MOSFET drivers are a thing), the amount of equivalent capacitance the gate has (capacitors are like electrical reservoirs), and finally the voltage that is needed to saturate the gate (i.e. turn it on). As an engineer, you can play with all of these to get the desired result. 4. The processor will run at a certain frequency, and lower is better for energy savings. This is because of the time to charge/discharge the transistor. In the trade, we call these the rise and fall times respectively. In these charging and discharging states, the transistor will emit the energy put into it as heat (acting as a resistor). This is why you want these rise and fall times to be low, but too low creates ringing, so don't take it too far. Increasing the frequency of the processor will cause a larger percentage of time to be spent in these states, where heat is generated.
So in practice, processors tend to prefer running at a lower frequency, both to save (battery) power and to generate less heat when a high frequency is not necessary. Manufacturers like Intel, AMD and ARM do this through stepping, which can be controlled either by the BIOS or by the operating system kernel. Most of the time will be spent at the low idling frequency, which then ramps up quickly when you want to do stuff (say, loading a web page). Then it will usually gradually step back down. Performance improvements could be made by making instructions take fewer cycles, at the cost of CPU complexity (Intel is known for this, and its avoidance is what underpins RISC platforms). There are other ways to do these things in practice too, but this comment is already quite long. For the handful of people that got here, thank you for reading this. You're awesome!
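The frequency/voltage tradeoff described above is commonly summarized by the dynamic-power relation P ≈ α·C·V²·f. A sketch with purely illustrative numbers (not any real chip's parameters):

```python
def dynamic_power(alpha, c_farads, v_volts, f_hertz):
    """Switching (dynamic) power of CMOS logic: P = alpha * C * V^2 * f,
    where alpha is the fraction of gates toggling each cycle."""
    return alpha * c_farads * v_volts ** 2 * f_hertz

# Illustrative numbers only: 1 nF of effective switched capacitance,
# 10% of gates toggling per cycle.
base = dynamic_power(0.1, 1e-9, 1.2, 3.0e9)   # 3 GHz at 1.2 V
idle = dynamic_power(0.1, 1e-9, 0.8, 0.8e9)   # stepped down to 0.8 GHz at 0.8 V
print(f"{base:.2f} W vs {idle:.3f} W")
```

Because voltage enters squared and frequency linearly, stepping both down together cuts power far more than either alone, which is why idle stepping saves so much heat.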
@williamlangley1610 1 year ago
I think your explanations are very easily understood...thanks.
@BrianHeilig1 1 year ago
That was cool! I found myself hoping you would dive into the energy loss at the logic gate, e.g. what happens if we have super conductors and cool the gate to absolute zero? Where specifically is the energy lost in this logic gate?
@jayantchoudhary1495 1 year ago
new to this channel, stuff here is awesome, keep up the good work
@gbear1005 1 year ago
Richard Feynman (the Nobel physicist and father of QED) gave talks on reversible computing and energy use.
@stevesloan6775 1 year ago
That was really awesome. Reminds me of how the universe was created. It knows what it is, so it can run through its program and know all of its outcomes without any real effort at all.
@TheoWerewolf 1 year ago
To be more precise: it's the cost of *destroying* information. You can read a bit of data with almost no energy (system inefficiencies and basic particle interactions are the limit there - you can't cheat the second law, after all), but *writing* information, if you're changing state, means destroying the original state, and THAT'S what's expensive.
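For scale, the Landauer bound on erasing one bit is E = k_B·T·ln 2. The workload numbers in the second half of this sketch (a billion bits erased, three billion times a second) are made up purely to show how far real chips sit above the floor.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI)

def landauer_energy(temp_kelvin):
    """Minimum heat dissipated by erasing one bit at temperature T."""
    return K_B * temp_kelvin * math.log(2)

e_bit = landauer_energy(300)  # room temperature
print(f"Landauer limit at 300 K: {e_bit:.3e} J per bit erased")
# Landauer limit at 300 K: 2.871e-21 J per bit erased

# A hypothetical chip erasing 1e9 bits per cycle at 3 GHz would
# dissipate under 10 mW at this floor -- real chips burn tens of
# watts, i.e. many orders of magnitude above the limit.
floor_power = e_bit * 1e9 * 3e9
print(f"Landauer floor for that workload: {floor_power * 1e3:.1f} mW")
```

This is the same k·T·ln 2 figure the video's discussion of the Landauer limit refers to; only erasure (state destruction) is charged, which is exactly the commenter's point.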
@gsittly 1 year ago
Channel: ♥️ Topics: ♥️♥️ Explanations: ♥️♥️♥️ Jade: ♥️♥️♥️♥️ Thank you for this great stuff you share with us 😊
@MultiSteveB 1 year ago
A few thoughts and questions: 1. A logically reversible logic gate would (IMO) require more circuitry than the non-reversible one - and that comes with an energy cost to run said circuits, which could exceed the Landauer limit itself. 2. By adding more states (even if only for the output), does that not increase entropy? 3. Is there any relation/correlation between the Landauer limit and the Planck energy?
@TheViolaBuddy
@TheViolaBuddy Жыл бұрын
For Point 1, for real systems yes, and that's also why Jade was saying that in practice real systems operate millions of times above the Landauer limit, so this isn't a practical consideration for now. But the idea is thinking about a hypothetical future where we figure out how to avoid generating any waste heat at all - that would mean our circuits come with zero energy cost in terms of the circuitry and setup. But even in that super idealized case, if our logic is based on irreversible gates, we will still not be able to use zero energy (nor asymptotically close to zero energy) to do actual calculations with the circuits.

For Point 2, yes - that's exactly the point, in fact. The whole problem was that entropy in the universe cannot decrease, so if the entropy of our information goes down, then the entropy must go somewhere else, and that somewhere else is heat. But if we instead keep the entropy of our information the same, there is no excess entropy to "leak out" as heat. The entropy is sort of "trapped" in the information instead of being lost as heat.

(For Point 3 I know very little about Planck energy so I cannot say definitively. I suspect not, just because you only really hear about Planck energy in cosmological contexts and this isn't one, but physics can have all sorts of surprising connections.)
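The irreversible-vs-reversible point can be made concrete by counting states: an AND gate collapses 4 input states to 2 output states (information, and with it entropy, leaves the logical state), while the Toffoli gate is a bijection on its 8 input states. A minimal sketch:

```python
from itertools import product

def and_gate(a, b):
    """Ordinary irreversible AND."""
    return a & b

def toffoli(a, b, c):
    """Reversible CCNOT: flips the target c iff a and b are both 1."""
    return (a, b, c ^ (a & b))

and_outputs = {and_gate(a, b) for a, b in product((0, 1), repeat=2)}
toffoli_outputs = {toffoli(a, b, c) for a, b, c in product((0, 1), repeat=3)}

# AND: 4 input states collapse onto 2 outputs -- inputs unrecoverable.
print(len(and_outputs))      # 2
# Toffoli: 8 input states map to 8 distinct outputs -- nothing forgotten.
print(len(toffoli_outputs))  # 8

# With the target preset to 0, the third wire carries AND(a, b),
# so Toffoli computes AND while staying reversible.
assert toffoli(1, 1, 0)[2] == 1 and toffoli(1, 0, 0)[2] == 0
```

The extra wires are exactly the "more circuitry" cost raised in Point 1: reversibility is bought by carrying the inputs through to the output.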
@cmilkau 1 year ago
If you want to use reversibility to eliminate the entropy going down, you have to make the gates produce their outputs by overwriting their inputs (e.g. like quantum computers do). If you have an extra output wire, and the gate changes it to the "correct" value, no matter what value it had before, that gate reduces entropy by 1 ("output wire has any value" to "output wire has a definite value"). The only way to compensate for that would be to "forget" (overwrite) one of the input values. The billiards ball machine does that, as the balls leave the input pipes. Quantum computers also do that, as the inputs can no longer be measured after they have been processed.
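The billiard-ball machine mentioned above is usually modelled with the Fredkin (controlled-swap) gate, which is both reversible and "conservative": the number of 1s (balls) going in equals the number coming out, so nothing has to be created or forgotten. A sketch of those two properties:

```python
from itertools import product

def fredkin(c, a, b):
    """Controlled swap: if the control bit c is 1, swap a and b."""
    return (c, b, a) if c else (c, a, b)

states = list(product((0, 1), repeat=3))
outputs = [fredkin(*s) for s in states]

# Reversible: the map is a bijection on the 8 three-bit states...
assert len(set(outputs)) == 8
# ...and self-inverse: applying it twice restores the input.
assert all(fredkin(*fredkin(*s)) == s for s in states)
# Conservative: the count of 1s ("balls") is preserved.
assert all(sum(s) == sum(o) for s, o in zip(states, outputs))
print("Fredkin gate: bijective, self-inverse, conserves 1s")
```

Because every output state pins down exactly one input state, no wire is ever forced to a "correct" value regardless of what it held before - which is the entropy-accounting point the comment makes.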
@richardleeskinneriii9640 1 year ago
This is interesting. In cybernetics, variety is the number of possible states of a system; the larger a system's variety, the more entropy it has. This link between thermodynamics and cybernetics is something I'm very interested in.
@michaelgonzalez9058 1 year ago
That perception is the overall output input reveal all knowing
@alansmithee419 1 year ago
My understanding of entropy was that the more states a system can be in *that look nearly identical to its current state*, the higher its entropy? Like, some gas all shoved into the corner of a room has much lower entropy than that same gas spread out to fill the whole room, even though in both cases it's the same amount of gas and it would take the same amount of data to record the exact position of every molecule.
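That is the standard statistical picture: entropy counts the microstates consistent with a macrostate, S = k_B·ln Ω. The gas example can be made quantitative with a toy lattice-gas model - the cell and particle counts below are arbitrary illustrative numbers, not physics:

```python
import math

def log_multiplicity(n_cells, n_particles):
    """ln(Omega) for indistinguishable particles placed on n_cells sites,
    at most one per site -- a toy lattice-gas model of the room."""
    return math.log(math.comb(n_cells, n_particles))

# The same gas (5 particles), two macrostates:
corner = log_multiplicity(10, 5)    # confined to a 10-cell corner
spread = log_multiplicity(100, 5)   # free to occupy all 100 cells

# Entropy in units of k_B is ln(Omega): the spread-out macrostate has
# vastly more microstates that "look the same", hence higher entropy,
# even though it's the same amount of gas either way.
print(f"corner: ln O = {corner:.2f} k_B, spread: ln O = {spread:.2f} k_B")
```

Here "look nearly identical" is formalized as "belong to the same macrostate" (gas in the corner vs. gas anywhere), which matches the comment's intuition.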
@vishalmishra3046 1 year ago
3:45 The largest prime under 1 billion is *999,999,937*. Computers are indeed amazing in their ability to answer unexpected questions with such low effort, cost and time.
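That figure checks out, and it's cheap to verify yourself with plain trial division (no libraries needed; numbers around 10^9 only require testing divisors up to about 31,623):

```python
def is_prime(n):
    """Trial division up to sqrt(n) -- plenty fast for n near 1e9."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

# Scan downward from 999,999,999 for the largest prime below one billion.
p = next(n for n in range(999_999_999, 2, -1) if is_prime(n))
print(p)  # 999999937
```

Only 63 candidates need checking before the search hits 999,999,937, which is why this finishes instantly.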
@jakobthomsen1595 4 months ago
Nice explanation of reversible computers! An additional challenge implementing them seems to be that the extra outputs can't be thrown away (that would produce heat again), so they must somehow be "recycled".
@kozmizm 1 year ago
In classical computing some information is destroyed (lost) when doing a calculation, but if you do things cleverly, you can create an odd symmetry where the calculation done forward in time is essentially the same as the calculation done backward in time - reversible computing - in which no information is lost, even though many more states must be preserved. The result is sort of the same thing as the initial state, which adds quite a lot of difficulty to the design of the machine. Yet in a superconducting environment this is theoretically 100% efficient, leading to zero energy wasted. That is, if my understanding of all that is correct. Hence, it is theoretically possible to have a special quantum computer that does not waste any electricity. Instead of destroying information by combining multiple pieces of information down into fewer bits, we preserve all the information during the transformation using our special knowledge of physics: the number of inputs equals the number of outputs and everything can be deterministically reversed. Even though we only care about a portion of those outputs, no information was destroyed. Every bit has been transformed but preserved. It's hard to wrap your head around, but it seems to be a sound idea. Kudos to whoever came up with it! Essentially it preserves everything, so there is basically zero entropy production and therefore no loss. Even if there are minuscule amounts of entropy, the circuit is so efficient that the energy loss is almost zero compared to classical computing. You can't do this using regular transistors. You have to build this symmetry into the physical design of the system itself, using the natural properties of nature to outsmart entropy.
It's not a simple problem, but with quantum gates in quantum computing it is theoretically possible - I'm using that word again, "theoretically". Honestly, from a purely theoretical point of view it makes sense. Just look at superconductors and we see that entropy can be overcome in certain circumstances. Now imagine superconducting materials combined with some form of super-symmetrical logic gates to overcome entropic waste. It's like taking superconducting to a meta level. It's the supersymmetry hypercube, man. Like, totally dude! Far out!
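The "deterministically reversed" idea can be sketched classically with Toffoli and CNOT gates: compute the answer into a scratch bit, copy it to an output wire, then re-run the (self-inverse) computation to restore every input - Bennett's compute/copy/uncompute trick. The wiring below is an illustrative toy, not any real machine's circuit:

```python
def toffoli(state, a, b, t):
    """In-place CCNOT on bit list `state`: flip state[t] iff bits a, b are 1."""
    if state[a] and state[b]:
        state[t] ^= 1

def cnot(state, a, t):
    """In-place CNOT: flip state[t] iff bit a is 1."""
    if state[a]:
        state[t] ^= 1

def reversible_and(x, y):
    """Compute AND(x, y) without destroying any information."""
    state = [x, y, 0, 0]     # inputs, scratch ancilla, output wire
    toffoli(state, 0, 1, 2)  # compute: ancilla = x AND y
    cnot(state, 2, 3)        # copy the answer to the output wire
    toffoli(state, 0, 1, 2)  # uncompute: ancilla restored to 0
    return state

for x in (0, 1):
    for y in (0, 1):
        assert reversible_and(x, y) == [x, y, 0, x & y]
print("AND computed; inputs intact, scratch bit clean")
```

Every step is its own inverse, so running the gate sequence backward literally undoes the computation - the forward and backward calculations are the same, as the comment says.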
@MynameisBrianZX 1 year ago
14:36 Thanks for this part in particular - I never understood "why don't they just make new computers now" until this.
@garyfilmer382 1 year ago
Fascinating! I was thinking of quantum computers, time crystals, reversibility. Energy? There must be some, but the mystery is that there's no discernible change in temperature. There must be one! It's just so small that we haven't found the means to measure it. All intriguing!