The Growing Semiconductor Design Problem

273,738 views

Asianometry

A day ago

In 1997, American chip consortium SEMATECH sounded an alarm to the industry about the chip design productivity gap.
They observed that integrated chip manufacturing capabilities were expanding at about 40% a year. At the time.
Yet IC design capabilities were only growing at about half of that rate. Thus, a potential crisis was brewing where design capabilities lag far behind manufacturing.
This crisis never took place for reasons we will discuss later. In turn, however, a new challenge for the industry has emerged: Verification.
In this video, we are going to look at why this previously unheralded step has become a rather big deal in today's competitive chip design world.
Links:
- The Asianometry Newsletter: asianometry.com
- Patreon: / asianometry
- The Podcast: anchor.fm/asianometry

Comments: 490
@Asianometry
@Asianometry 2 жыл бұрын
I hope you enjoyed this video. It's a favorite of mine. Also. Yeah, I know I have over 100K subs now. I recorded this video 2 months ago, back when it seemed so far away. Check out other technology reviews here: kzfaq.info/sun/PLKtxx9TnH76RiptUQ22iDGxNewdxjI6Xh
@masternobody1896
@masternobody1896 2 жыл бұрын
I wish I was genius so I can make chips faster
@masternobody1896
@masternobody1896 2 жыл бұрын
rip gamiing fps stuck at 240fps
@Crunch_dGH
@Crunch_dGH 2 жыл бұрын
Very helpful, thank you very much!
@Shackkobe
@Shackkobe 2 жыл бұрын
Excellent content. But...I'm pretty sure you have already achieved this: kzfaq.info/get/bejne/qNqRlLKVrNuxc5c.htmlm54s
@gazz01
@gazz01 2 жыл бұрын
Thanks for the great content, it feels nostalgic. I studied all of this in my Electronics and Communications Engineering (ECE) bachelor's degree, and I still have working knowledge of EDA, Virtuoso, Verilog, VHDL, chip design and embedded systems. In India there are fewer opportunities for fresh graduates; most of the companies (Nvidia, Synopsys, Intel, Siemens, Qualcomm and STMicroelectronics) require a postgraduate (master's) degree. Because of my financial situation, I studied Computer Science during my final year and changed my whole stream to get a job at a product-based company. My batch had 60 students, and 40 of them joined CS/IT product-based companies during campus placements. But I don't regret my decisions; competition is high in both fields and it pushes me to improve. I hope one day I can work at the intersection of the two, or pursue higher studies to become a design/verification engineer.
@ulwen
@ulwen 2 жыл бұрын
My father ran a product development consultant company for 30 years. He said that depending on the size/maturity of the customer the project budget would go from 80% design 20% testing for small/new customers to 20% design 80% testing for large/mature customers. Seems like chip design is no different.
@TheVlad33
@TheVlad33 2 жыл бұрын
When chips get bigger it’s almost always 80% verification. Too much money at stake
@meatybtz
@meatybtz 2 жыл бұрын
Reusable "code" was a revolution in programming that led to incredible growth over time. The same held true with hardware and programmable logic arrays. For someone who "did it from scratch" every time, the advent of FPGAs was a big deal. Our mainboards could do new things, but there was a development overhead in reinventing the wheel with them. Being able to buy logic was a time saver. Same thing with the development of automated trace routing: we went from our king artist, who could actually lay out traces manually in the program and optimize them in his head, to being able to click a button and populate a board. I had not thought about any of that in almost 25 years now. Sheesh.
@yum33333
@yum33333 2 жыл бұрын
"Automatic trace routing".... good lord, does anyone actually use this stuff? Even the high end ones are just so horrendous.
@monad_tcp
@monad_tcp 2 жыл бұрын
You need compilers: not only reuse, but automation. Of course, VHDL is basically the same idea we programmers had with Fortran in the 1950s. The problem is that VHDL is horribly dated; we have made much progress with formal proofs in programming languages, and hardware languages need to catch up. Actually, my PhD thesis in computing science will be about compiler hardware synthesis, but totally automated.
@monad_tcp
@monad_tcp 2 жыл бұрын
@@yum33333 yes, when you have a billion transistors, you use it.
@monad_tcp
@monad_tcp 2 жыл бұрын
@@yum33333 also, whats "horrendous", if its ugly, but works, who cares ? does it respect the ERC ? then that's enough. assembly programmers used to say the same thing about machine generated machine code...
@monad_tcp
@monad_tcp 2 жыл бұрын
I know routing could be better, but that's usually a PSPACE problem in computing; it's really hard to solve very efficiently. It doesn't matter much, though: you aren't usually constrained by area in silicon, not the same way as you are on a PCB. So that's not a problem for in-chip routing. Imagine routing a 10 cm PCB but now in 1 m of space. There's plenty of space on the die.
@kensgold
@kensgold 2 жыл бұрын
Keep in mind that in the post-silicon verification process we have to test the silicon across the manufacturing process, temperature, and operating voltage (or, as we call it, PVT) that we support. The production wafers we get back from the fab have some variance in the parameters used to make them, so during the verification process we get "split" silicon, i.e. silicon made using the extreme edge cases of those parameters, to ensure that the chip functions even at those extremes. Here's a good example of how this gets out of hand fast. Imagine your company makes a Bluetooth transmitter chip, and you want to make sure you meet the Bluetooth FCC standards. You have to test a minimum of 3 pieces of silicon from each process corner. You have to test your minimum voltage, nominal voltage, and max voltage, all across your temperature range; let's just use -40 C, -20 C, 25 C, 60 C, 80 C. All to test your spectrum emissions across all 40 broadcast channels. So in summary: 3 chips, 4 corners, 3 voltages, 5 temperatures, 40 channels, for a total test count of 7200 test cases. That is for a single functional test. Shit gets wild
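As a rough sketch of how that matrix multiplies out, here is a SystemVerilog loop nest using the corner, voltage, and temperature values from the comment above; the module and the commented-out test hook are assumptions for illustration, not a real production test plan:

```systemverilog
module pvt_matrix_demo;
  localparam string CORNERS  [4] = '{"ss", "ff", "sf", "fs"};   // process corners
  localparam real   VOLTAGES [3] = '{0.90, 1.00, 1.10};         // min / nom / max (assumed values)
  localparam int    TEMPS_C  [5] = '{-40, -20, 25, 60, 80};
  localparam int    NUM_PARTS    = 3;                           // silicon samples per corner
  localparam int    NUM_CHANNELS = 40;                          // broadcast channels

  initial begin
    int total = 0;
    for (int c = 0; c < 4; c++)
      for (int v = 0; v < 3; v++)
        for (int t = 0; t < 5; t++)
          repeat (NUM_PARTS)
            repeat (NUM_CHANNELS) begin
              // run_emissions_test(CORNERS[c], VOLTAGES[v], TEMPS_C[t]);  // hypothetical hook
              total++;
            end
    $display("runs needed for ONE functional test: %0d", total);  // 4*3*5*3*40 = 7200
  end
endmodule
```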
@waynemorellini2110
@waynemorellini2110 6 ай бұрын
I'm curious: how difficult is it to set up an automated test for those? And when you talk about the manufacturers cutting into the parameters, do they cut the safety margin to the bone now? I know an old chip designer who got 10x advantages back in the day by seeing how close he could cut into the safety margin on hand-designed chips and tiling software.
@kensgold
@kensgold 6 ай бұрын
@@waynemorellini2110 The parameters I was referring to are the actual transistors' performance metrics: how much charge needs to build up before they flip, how fast that charge can build up, and, once flipped, how fast the charge drains. All of those characteristics vary within a window on a normal, non-process-controlled production wafer. As for automation, you basically have to automate everything. We have entire engineering teams dedicated to making our test infrastructure and test cases, not to mention running the testing.
@waynemorellini2110
@waynemorellini2110 6 ай бұрын
@@kensgold Thanks. I was wondering how much time goes into preparing the test code and rig?
@kensgold
@kensgold 6 ай бұрын
@@waynemorellini2110 It very much depends on the company and their validation strategy. Generally, after tape-out and RTL freeze, there is pre-production. This is where we simulate the chip on FPGAs so we can port our functional tests and system tests to the new chip. During that time we design and manufacture custom PCBs that we use to do validation testing. That can take anywhere from 2-3 months depending on how much changes between the old chip and the new chip. That time estimate is very dependent on the chip, however: if it's a generational change it will be longer than a simple rev A to rev B of the same chip.
@scottfranco1962
@scottfranco1962 2 жыл бұрын
Good overview. I left verification just after the Verilog revolution came to pass. We used hardware simulators at that point, with accelerated simulation. Sea-of-FPGA emulation was not yet popular. The bottom line was that the company I worked for (nameless, but huge, trust me) was new to hardware chip design. They had lots of money to buy toys, but had the idea that the tool would solve all. Thus the expectation was that the new, more expensive tool would relieve them from the need to write test cases. I enjoyed being the one employee who understood the tools, but rapidly realized that it was a falling rock I had volunteered to sit under. I left the company soon after.
@prioris55555
@prioris55555 2 жыл бұрын
I worked for company (nameless but huge and begins with letter M) that verified hamburgers... :)
@spikester
@spikester 2 жыл бұрын
@@prioris55555 Nameless but huge, when a company moves so much product that makes everything work that nobody really knows who you are by a brand name. Unacceptable in todays shareholder farms. They could had chosen a better name like Corsair as a flanker brand, as they failed to move retail products Ballistixally. LOL
@gamar1226
@gamar1226 Жыл бұрын
@@prioris55555 Melanox?
@ttb1513
@ttb1513 Жыл бұрын
After the Verilog revolution? That was a long time ago. I got into chip design after helping to create that Verilog revolution. It is truly incredible how massively the transistor budgets, or design sizes, have grown. This video does a good job, but it doesn't convey just how incredibly complex the corner cases are for chip design as compared to SW. A SW programmer would start to get a sense of what it's like when they go from single-threaded to multi-threaded programming. The amount of pipelining and concurrency and speculative execution in hardware designs is enormous. This video also doesn't mention formal verification. Those tools can do things like exhaustively testing a 64-bit adder, while any brute-force or constrained-random testing cannot achieve the same in any of our lifetimes.
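For the formal-verification point above, a minimal sketch of what an adder proof harness can look like in SystemVerilog; "adder64" and its port names are assumptions, and the actual formal tool invocation is omitted:

```systemverilog
module adder64_fv (input logic clk, input logic [63:0] a, b);
  logic [63:0] sum;
  logic        carry;

  adder64 dut (.a(a), .b(b), .sum(sum), .carry(carry));  // design under test (assumed ports)

  // A formal tool treats a and b as free variables and proves this holds for
  // every one of the 2^128 input combinations, instead of simulating them one by one.
  assert property (@(posedge clk) {carry, sum} == ({1'b0, a} + {1'b0, b}));
endmodule
```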
@scottfranco1962
@scottfranco1962 Жыл бұрын
@@ttb1513 The last company I worked at doing silicon design, Seagate, was emblematic of the basic problems with ASICs. It was 1995 and they were new at ASIC design in house. And they weren't very good at it. It was random logic design before Verilog. An example of this was our timing verification. We had timing chains that were too long for the cycle time, and thus they simply allowed for the fact that the signal would be sampled in the next clock cycle. Now, if you were an ASIC designer back then, what I just said would have made you reach for the Tums, if not a 911 call for cardiac arrest. It's an open invitation to metastability. And indeed, our AT&T fab guys were screaming at us to stop that. I got put in charge of hardware simulation for the design, and I have detailed this fiasco in these threads before, so won't go over it again. The bottom line was that ASIC process vendors were losing trust in their customers to perform verification. The answer was that they included test chains in the designs that would automatically verify the designs at the silicon level. It meant that the manufactured silicon would be checked, that is, defects on the chip would be caught regardless of what the design did. My boss, who with the freedom of time I can now certify was an idiot, was ecstatic over this new service. It was a gonna fixa all o' de problems don't ya know? I pointed out to him, pointlessly I might add, that our design could be total cow shit and still pass these tests with flying colors. It was like talking to a wall. In any case, the entire industry went that way. Designs are easier to verify now that the vast majority of designs are in Verilog. I moved on to software only, but I can happily say that there are some stunning software verification suites out there, and I am currently working on one, so here we are.
@JZL003
@JZL003 2 жыл бұрын
The random checks remind me a lot of fuzzing, a common programming practice where you throw tons of random inputs at a program and see if it crashes. Typically in software you use something like American Fuzzy Lop (AFL), which actually looks at which code is hit during each run and tries to tailor the new test cases to explore more of it. So, as an example, if you're fuzzing a JPG parser, even if you start from a random seed, it'll learn to create the JPG image header and from there explore weird pixel values.
@PaulSpades
@PaulSpades 2 жыл бұрын
software fuzzing comes from hardware testing.
@Blox117
@Blox117 2 жыл бұрын
so thats why video games are so buggy these days
@tissuepaper9962
@tissuepaper9962 2 жыл бұрын
@@Blox117 it's actually the reason they're only as buggy as they are. A million chimpanzees at teletypes will do things in your game that no human would ever think to do.
@imacds
@imacds 2 жыл бұрын
@@Blox117 Game developers aren't given enough time to add all the features let alone fix most of the bugs (bug fixing can easily take up 60%+ of the dev time) because you can always just do both of these in post-launch patches and charge extra for the former as a result. Game projects also tend to have lower testability due to the intense time constraints, the fact that it really doesn't matter if they are buggy or fail as they are just entertainment software, and the fact they can be patched at a later date so there isn't any incentive to spend extra time getting it right now vs spending a little bit longer to patch it later.
@PamSesheta
@PamSesheta 2 жыл бұрын
Fuzzing is a less constrained input approach. In verification, more constraints are applied to limit "crazy" combinations to the actions customers will actually take. For the keyboard example, it might be better to constrain the simultaneous key presses to a max of 10 and not allow nearly all the keys to be pressed at once all the time in all of the tests. This lets you focus on things that are likely to happen in the course of normal use rather than finding obscure issues in corner cases that won't happen. The config space for ASICs is wide enough that full fuzzing would be time intensive and difficult to debug; it would probably reveal more problems with the test environment than with the design. I've been interested in a fuzzing approach in verification but, like I said, tracing back misbehavior is often harder and the scenarios may not even be possible on the bench. It is more valuable to find bugs that impact the major use cases and can be replicated on the bench.
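A minimal SystemVerilog sketch of the constraint idea described above; the class, the field names, and the driver task are assumptions, while the 10-key limit follows the comment:

```systemverilog
class keyboard_stimulus;
  rand bit [103:0]  keys;         // one bit per key of a hypothetical 104-key matrix
  rand int unsigned hold_cycles;  // how long the combination is held

  // Keep stimulus "realistic": at least one key, never more than 10 at once.
  constraint c_keys { $countones(keys) inside {[1:10]}; }
  constraint c_hold { hold_cycles inside {[1:500]}; }
endclass

// Usage sketch:
//   keyboard_stimulus stim = new();
//   repeat (10_000) begin
//     void'(stim.randomize());
//     drive_keys(stim.keys, stim.hold_cycles);  // hypothetical driver task
//   end
```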
@diffore
@diffore 2 жыл бұрын
As a full-time verification engineer, I hate the shortened chip development cycle (and modern capitalism in general, tbh). It puts amazing stress on the verification team because of the limited time (that, and the increased tape-out queues at the fab: if you miss your time slot you have to wait another half a year). Besides, there are times when throwing more people at the project will only cause more problems, in the same way that throwing more cores at a processor will not yield the expected N-times productivity. I sometimes think about switching back to software jobs simply because verification usually ends up being an overtime mess.
@hoodoooperator.5197
@hoodoooperator.5197 2 жыл бұрын
Throwing more people at projects was what my former employer thought to be the best solution... In all cases. Needless to say, it didn't end well for them.
@ThylineTheGay
@ThylineTheGay 2 жыл бұрын
It’s so annoying how capitalism hinders growth, like rather than these companies all fighting and trying to screw each other over they could all work together to make the best product possible
@MrAngryCucaracha
@MrAngryCucaracha 2 жыл бұрын
@@ThylineTheGay i dont think you can find a more growth-oriented system than capitalism.
@TheDXPower
@TheDXPower 2 жыл бұрын
@@felixknight2278 If you actually look at historical sources (primary and secondary), their scientific complex was not efficient at all. Take their rocket program for example: In the Space Race the extremely bureaucratic nature of the Soviet's development agencies were so restricting and inefficient the top scientists in the various agencies made an "underground" board to handle matters for themselves rather, calling themselves the "Council of Chief Designers". For example, when discussing the practicality of reproducing the A-4 (technical name for the German V-2 rocket), the men in NII-1 had to report to the chiefs in the Ministry of Aviation Industries, who then proceeded to deny the military applications of rockets. The team at NII-1, in violation of their orders, still decided to document and analyze the A-4 rocket in secret. Had the team not defied instructions, the development of the R-1 would have taken significantly longer. Additionally, during this time, the three major agencies involved in rocket-related technologies (NII-1, OKB-SD, OKB-51) “worked independently of each other, despite the fact that … they were employed by the same ‘ministry,’ the People’s Commissariat of Aviation Industry.” This led to rampant conflicts-of-interest and duplicated work. Source: Challenge to Apollo - The Soviet Union and the Space Race, 1945-1974 by Asif A. Siddiqi (2000)
@rashidisw
@rashidisw 2 жыл бұрын
There's a serious lack of massive death counts during non-war situations for capitalism, but such lists certainly exist for communism.
@adamrashid4100
@adamrashid4100 2 жыл бұрын
Thank you so much indeed. I can’t wait for these titles. I am starting a new job in a semi conductor company in January. 🤞
@kevinbyrne4538
@kevinbyrne4538 2 жыл бұрын
Recalls the HAL 9000 (from "2001") who confidently stated that any error made by a computer was ultimately due to human error.
@user-ty2uz4gb7v
@user-ty2uz4gb7v 2 жыл бұрын
GIGO
@Poptartsicles
@Poptartsicles 2 жыл бұрын
Good video! This is why we're hearing more and more each day about massive hardware vulnerabilities. It's getting to the point where data centers should probably stay clear of new tech for the first few years until they can be sure they won't suffer massive breaches.
@FisherGrubb
@FisherGrubb 2 жыл бұрын
I have a great quote that I believe is from a game designer at Nintendo: "It's better to release a game late than a bad one, because a late one is only late until it's released, but a bad one keeps its bad reviews even if it shipped on time." Kind of like how important verification is for stripping out as many bugs as possible.
@SianaGearz
@SianaGearz 2 жыл бұрын
Botched the wording a little bit, not to speak of the kind of vocabulary not used by lead representatives of a polite, child friendly company. Shigeru Miyamoto, quoted regarding N64 delay: "A delayed game is eventually good; but a rushed game is forever bad." Interestingly reviews don't even play into it, as they knew they had young loyal fans who would be buying the product without consulting any reviews. Disappointing them would burn this relationship forever, which is much worse than bad reviews. Remember, people didn't have Internet access back then. Magazines were around, but their reach was in tens of thousands copies while product reach was in millions for the same given region.
@DANTHETUBEMAN
@DANTHETUBEMAN 2 жыл бұрын
Seems like a good place for a A I.
@unknownmaster4342
@unknownmaster4342 2 жыл бұрын
Tell that to Cyberpunk 2077 and GTA remastered trilogy.
@FutureBoyWonder
@FutureBoyWonder 2 жыл бұрын
See this quote everywhere
@vzx
@vzx 2 жыл бұрын
That assumes delayed production does not incur further expense. In reality, workers still have to be compensated for the extra weeks/ months they have to work on the delayed project.
@catsspat
@catsspat 2 жыл бұрын
The key word here is, "art." I'm serious. Some decent (but still relatively small) percentage of people can, "learn to code," but only a very small number of them can become functional artists.
@apptouchtechnologies3722
@apptouchtechnologies3722 2 жыл бұрын
Amen
@TheNefastor
@TheNefastor 2 жыл бұрын
Unfortunately, just like artists, our talent tends be recognized only after we're gone 😅
@qk.6535
@qk.6535 2 жыл бұрын
Hello! I am a beginner in coding. I would like to get better so what do you mean exactly as ‘artist?’ Thanks!
@TheNefastor
@TheNefastor 2 жыл бұрын
@@qk.6535 basically, it's when you care about using elegant code instead of just "good enough" code. It's also when you come up with funky algorithms that can speed your code a thousandfold. Like FFT. When you see your data the way Neo sees the Matrix at the end of the movie, you'll know you've become a code artist and not just another StackOverflow leech 😄
@catsspat
@catsspat 2 жыл бұрын
@@qk.6535 This isn't a topic that could be described easily, as there are many aspects to it. For example, there are people who own 10x, or even 100x the workload of a typical code monkey, yet aren't stressed out, nor spend extra hours at work. They spend much of their time thinking about the problem, and proportionally *very small amount* of time actually coding, and practically no time debugging (at least their own code). Also, when new requirements come up (new features, specification changes, etc.), they take very little time updating their existing code to support them, because it's almost as if the code was written to support the new features. It's almost like magic, like an ability to predict the future, but it really isn't. One of the "art" aspect of chip verification is the ability to verify it for the actual functionality in real life, rather than against just some specification. For example, the specification might contain a big table of what the inputs and expected outputs are. Validating the hardware against the spec is not even a half step. Understanding the reasoning behind the entire architecture, and verifying for that is many classes beyond just standard verification.
@wfjhDUI
@wfjhDUI 2 жыл бұрын
It's horrifying and absolutely hilarious that even Intel engineers working on hardware are still mostly making regular ol' typo and "copy-paste" errors like us normies.
@abebuckingham8198
@abebuckingham8198 2 жыл бұрын
These are also the most common errors in advanced math textbooks too. I have a keen eye for errata so I notice when it's a 1 instead of an I. It's why I'm so fun at parties.
@TDRinfinity
@TDRinfinity 2 жыл бұрын
If your conception is that Intel engineers are good programmers, I have some bad news for you. Hardware engineers in general are pretty poor programmers, most having mostly written C and perl code, without much or any experience with modern software engineering practices
@wfjhDUI
@wfjhDUI 2 жыл бұрын
@@TDRinfinity They literally make the best optimizing compilers and performance tooling. Apparently they know a great deal about modern software engineering.
@Shrouded_reaper
@Shrouded_reaper 2 жыл бұрын
Also consider there are likely malicious actors who are part of designing the chips for most big enterprises.
@slicer95
@slicer95 2 жыл бұрын
@@wfjhDUI The performance, compilers and tooling team are not the ones involved in Verification.
@HaroldR
@HaroldR 2 жыл бұрын
Thank you for these amazing videos. The hardware manufacturing world is a beast of it's own and I love learning about it through your content.
@SamuelLanghorn
@SamuelLanghorn 2 жыл бұрын
what do you do with the knowledge you acquire?
@TheOnlyDamien
@TheOnlyDamien 2 жыл бұрын
I love how when you were talking about how they relied on other peoples work as dependencies I instantly was like "That sounds a lot like our old pals NodeJS/NPM" and then you immediately went into a talk about the web design comparison lol. Have you ever thought of doing interviews with people in these kinds of fields? Could be cool to hear from those dealing with it directly interspersed throughout one of these! Obviously they can't get too detailed but just general personal input would be cool to hear from those suffering on the chip frontlines lol.
@Asianometry
@Asianometry 2 жыл бұрын
I do have conversations with industry professionals on occasion, but mostly on background. They mostly ask not to be quoted.
@TheOnlyDamien
@TheOnlyDamien 2 жыл бұрын
@@Asianometry Oh yeah that's totally fair. I was thinking it would probably be a nightmare to get any to actually come on and be frank about stuff. I can only imagine how hard your research portion is as it goes now! Love the videos man, I get hyped everytime I see a notification.
@smoofles
@smoofles 2 жыл бұрын
Interviewing people who have to deal with NodeJS and NPM (and web dev in general) will just get you hours of someone going between laughing and crying without a word uttered. 🙃 (And after the interview, they’ll immediately run off all enthusiastic about this new JS framework that they _need_ to rewrite their project with.)
@Olivia-W
@Olivia-W 2 жыл бұрын
_Cough_ Log4j _cough._
@TheOnlyDamien
@TheOnlyDamien 2 жыл бұрын
@@Olivia-W How could you do this to me on this day of days, fucking Log4j. Ridiculous situation and I feel horrible for anyone dealing with it in a complex corporate environment.
@R_Haas
@R_Haas 2 жыл бұрын
Another great video. Your grasp of the subject and productivity in making these videos are a big inspiration to me. keep up the good work.
@Asianometry
@Asianometry 2 жыл бұрын
*Tips fedora*
@abeecee
@abeecee 2 жыл бұрын
this is so quality. When I get that first big embedded systems internship from my Asianometry industry knowledge, I'll be sure to sub to the patreon :)
@DavidTriphon
@DavidTriphon 2 жыл бұрын
This was an amazing video. I just started working in this field but I already recognize many of the things you've described in my workplace. Seeing it all compiled so concisely helps me maintain a model in my head of everything that is going on. Thank you! You did a great job.
@Nekroido
@Nekroido 2 жыл бұрын
As a software engineer I'm excited to reinforce my understanding that hardware design is literally the same thing on abstract level. Thank you for this great insight of the chip development hurdles
@andersjjensen
@andersjjensen 2 жыл бұрын
Intel shipped the original Pentium with a serious calculation bug. As in, you could trigger it in a simple spreadsheet...
@FisherGrubb
@FisherGrubb 2 жыл бұрын
Intel also had to keep a bug in their chips for generations due to it being a "feature" that was baked into Windows etc & fixing it would have made MS recompile a lot. Pretty sad...
@TheNefastor
@TheNefastor 2 жыл бұрын
Is that the IEEE 754 non-compliance ? IIRC their hardware floating point support was notoriously unusable in anything critical.
@MrGoatflakes
@MrGoatflakes 2 жыл бұрын
@@TheNefastor nah it's the Pentium division flaw. en.m.wikipedia.org/wiki/Pentium_FDIV_bug
@TheNefastor
@TheNefastor 2 жыл бұрын
@@MrGoatflakes that's exactly what I said. Faulty floating point support.
@TheFreshSpam
@TheFreshSpam 2 жыл бұрын
Your videos are amazing on all topics. I can watch all of them all day. Great introductions and layouts to complex things we dont hear enough about
@kristianTV1974
@kristianTV1974 2 жыл бұрын
I used to work for Altera (now Intel FPGA) in the UK Mid 2000's and visited ARM in Cambridge regularly as they used the largest Stratix devices of the time (EP1S80?) to do design verification and IIRC they used the largest equivalent Xilinx Virtex devices too. The thing with FPGA though, is that due to the reprogrammable nature of the fabric, it will never operate at the same speed (it's always slower) as dedicated silicon, so the verification would be very much functional rather than timing based.
@exponentmantissa5598
@exponentmantissa5598 2 жыл бұрын
I worked in wireless design for years, and one thing I can tell you about short design cycles is that although you get stuff out fast, the chance of errors is of course increased. But here is the catch: in the wireless arena you had at most a 6-month to one-year lead on competitors. If you screw up and miss with a design cycle, you have to respin and now your lead is gone. Do it again and now you are in your opponent's rear-view mirror. I have a couple of fantastic stories about what happens when you push too hard on the design cycle.
@evil1knight
@evil1knight 2 жыл бұрын
i literally have no clue what most the stuff you talk about in your videos but its so interesting
@kwgm8578
@kwgm8578 2 жыл бұрын
In the early theoretical days of object-oriented programming, i.e. the methodology of building large programs using small, encapsulated, proven software modules, a pioneer in the field coined the term "software ICs" to describe what today are called objects, since computer engineers of that period were familiar with the concept of building systems from components in the hardware domain. It was interesting to hear you bring the concept full circle at the close of this video.
@autohmae
@autohmae 2 жыл бұрын
That's amazing and makes so much sense. The difference obviously now is the scale of the software and hardware designs.
@monad_tcp
@monad_tcp 2 жыл бұрын
It amazed me how they still didn't introduce more formal verification, like we have in modern statically typed programming languages. There's an entire volume of computing science tricks that could be applied to electrical engineering. There's an entire PhD for me on that.
@kwgm8578
@kwgm8578 2 жыл бұрын
@@monad_tcp Go, Luiz! You'll discover if you're onto something rather quickly in an engineering PhD program. The real world is very unlike a KZfaq comment thread -- no BS allowed!
@ttb1513
@ttb1513 Жыл бұрын
@@kwgm8578No "BS" allowed? Only PhD’s? 😂. Or other type of BS?
@ttb1513
@ttb1513 Жыл бұрын
@@monad_tcp Formal verification is used in chip design. As a trivial example, a 64-bit or 128-bit adder can be formally verified quickly to be 100% correct for all input combinations. Look up BDDs for a starter. Controlled randomized testing, or even an exhaustive sweep at the unit or block level, would never come close in our lifetimes. There is inherently just a huge amount of concurrency in a hardware design. And race conditions galore. A lot of people will take away that copy-paste errors are a common error and be astounded. Those are rare; I've never seen that. Intel's ancient 'divide' bug was due to such an error: only part of a table of values was copied over. What's much more common (and can be lumped in with typos) is using the wrong delayed version of a pipeline state bit. A design can work often, but if the timing of some event shifts by one clock cycle, the design can break. Possibly because of using "state_bit_d2" instead of "state_bit_d1". That's really a logic error, though, not a typo. It would be like the traffic light example in this video. A possible error could be a race condition where the 2nd car arrives at just the right time as the 1st opposing car exits the intersection. The light could fail to change for the 2nd car, getting stuck. It is a contrived example, but it amounts to the decision to change the light as car 1 exits mistakenly looking at whether car 2 was already there 1 clock cycle before it showed up, so that logic decides no change is needed; and car 2 arrives and sees, say, that car 1 is in the intersection and that it will trigger the light to change when it exits (which it mistakenly does not). This is contrived, but it is actually a better example of a bug: a bug where the design does not meet the spec. The video mentions a spec error, not a design bug: the spec failed to state the light must be fair when congested.
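A small sketch of the "wrong delayed version of a pipeline state bit" pitfall described above; the state_bit_d1/state_bit_d2 names come from the comment, while the surrounding module and the qualifier signal are invented for illustration:

```systemverilog
module delayed_state_sketch (
  input  logic clk,
  input  logic state_bit,       // the raw event
  input  logic grant_window,    // invented qualifier signal
  output logic change_light
);
  logic state_bit_d1, state_bit_d2;

  always_ff @(posedge clk) begin
    state_bit_d1 <= state_bit;      // event delayed by 1 cycle
    state_bit_d2 <= state_bit_d1;   // event delayed by 2 cycles
  end

  // Intended: use the 1-cycle-old snapshot that lines up with grant_window.
  assign change_light = grant_window & state_bit_d1;

  // Buggy variant (kept commented out): it passes most tests, but if the event
  // timing shifts by one clock the decision samples stale state and the design hangs.
  // assign change_light = grant_window & state_bit_d2;
endmodule
```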
@alexanderphilip1809
@alexanderphilip1809 2 жыл бұрын
I'll say this again. Yours is a criminally undersubscribed channel.
@hoodoooperator.5197
@hoodoooperator.5197 2 жыл бұрын
Great video man. I've just finished my first term in MSc Digital Systems Engineering, it was so insanely hard, but I loved it. Particularly using cadence for layout. I seem to have a knack for layout 😎
@josho6854
@josho6854 2 жыл бұрын
Excellent video! I think your comparison of chip design and web development is mostly valid. I used to work in the dram business, and respins were the rule rather than the exception. Analog circuits have infinite degrees of freedom, and are very difficult to simulate perfectly. First time right is much more achievable with a pure logic chip.
@Asianometry
@Asianometry 2 жыл бұрын
DRAM
@meneldal
@meneldal 11 ай бұрын
I do disagree about one thing: in web development, if the library doesn't work it's mostly your problem if the maintainer doesn't feel like fixing it, while if you buy an IP from ARM, Cadence, or Synopsys and show it doesn't follow the spec, they will fix it and apologize. I would also mention that far fewer critical bugs happen, and they publish clear notes about what was changed and how it can affect you. It is very unlike companies like Google that change stuff and break your code.
@GThu1
@GThu1 7 ай бұрын
Great presentation! As a very senior software developer, I found this very interesting. It looks like the HDL side is very close to regular software development, including the (very familiar) execution and emergent problems of the practical processes and testing/QA. I do want to make one remark: you mentioned, "In the software world, it is easy to handle such design problems by issuing an update." Well, it's really not, in most cases. Rolling out an update for software with a major design flaw, once it is already released, causes a lot of headaches and costs, especially when the software creates a lot of data. Also, with today's firmware-based designs, you can handle a lot of things with an update to the hardware's firmware. The two kinds of development are coming closer and closer over time.
@mikhailryzhov9419
@mikhailryzhov9419 2 жыл бұрын
Besides FPGAs, you can prototype on ASICs designed to simulate RTL, like Palladium. You can get pretty good frequencies.
@theowl2044
@theowl2044 2 жыл бұрын
I just hopped over to the silicon device industry, which I know nothing about. I didn't study all this in EE. Thank you for your videos. I hope your channel takes off.
@ronnycook3569
@ronnycook3569 2 жыл бұрын
It's very, very much like software - most of the time is not spent on the original implementation, but on looking for problems with the implementation and fixing them before they affect deployment to a production environment. Except that it's much, much harder to patch, so all that work is front-loaded.
@demoniack81
@demoniack81 2 жыл бұрын
Honestly that's only true if you suck at _originally implementing_ or you're working on old legacy software that's full of bugs already. Bugs happen obviously and they can be hard to solve, but if it's taking you more developer time to test than to actually implement things it means something's gone badly wrong. In a properly designed codebase 90% of bugs should be caught during implementation by the developer himself, and by the time you push to the test environment the application should be almost bug free. The only bugs that should be left are integration bugs that cannot be discovered untill all the other teams working on other systems have _also_ deployed to the test environment, and those should cause fatal errors right away and be easy to track down because you _should_ have extensive validation on all inputs and service responses. If you don't, you're asking for it.
@ronnycook3569
@ronnycook3569 2 жыл бұрын
@@demoniack81 The key word in your reply is "should." Many of the things that "should" happen, in practice, don't. Even in the case of faultless coding, many bugs are a result of a poorly written or incorrectly interpreted specification. Or of bugs in the tools being used; the tools are themselves typically "legacy software." The only cases I've seen where debugging and maintenance did not consume the bulk of development time were cases where the developer either couldn't be bothered or was cut short. Counterexamples welcome.
@MrRosselhoff
@MrRosselhoff 2 жыл бұрын
First of all, great video; this channel is absolutely one of my favourites! This sounds a lot like the job of CSV (computerised system validation) in pharmaceutical manufacture. Although I admit it's technically far behind the complexity of chip design, the principles are the same: we want to avoid bugs like the incident with the Therac-25 radiation therapy machine, which killed a number of patients undergoing cancer treatment due to an error in the code. We work to the GAMP 5 ISPE standards to set the level of validation. To get over the problem of performing endless testing to cover every case, we use a branch of maths called DOE, design of experiments, to calculate the best permutation of experiments to conduct. I was wondering if the constrained randomisation that you described utilises some DOE, or would the maths become too complex to be practical?
@FUCKaNIGGAcalledYou
@FUCKaNIGGAcalledYou 2 жыл бұрын
Amazing video. I work in this field exactly and it's a very good recollection of what we have to do. Good job!
@jgaztelu
@jgaztelu 2 жыл бұрын
There is a big push in the industry to adopt formal verification, as it can greatly reduce the effort for some verification tasks. However, it's a totally different paradigm from the traditional UVM/constrained random flow and it can't fully replace it, at least for the moment.
@abuttandahalf
@abuttandahalf 10 ай бұрын
How is formal verification different from what is currently done?
@adissentingopinion848
@adissentingopinion848 8 ай бұрын
@@abuttandahalf Basically, there are mathematical proof-finding algorithms already in use in academia. The gist is that the formal verification engine takes in "assertions" that you write, establishing things like "condition A must occur after condition B within X clock cycles", allowing individual line-item requirements to be broken down into checks in the code. This is also what is done in UVM and constrained-random flow, but formal has quite a few limitations and extensions that are far less hardware and more mathematical. Here are two statements we can use:
p until q - if q never happens, p holds forever
p s_until q - q must eventually happen in the proof
Leading to statements like:
(@posedge clk) start ##1 !complete[*] ##1 complete |=> done ##0 eventually finished
Which means "on each clock edge, if the start condition is valid, AND on the next clock cycle complete is NOT valid for an indeterminate number of clock cycles, AND complete is valid on the next clock cycle, then on the next clock cycle done MUST be valid AND finished must be valid *at some point*". It's got the same level of density as regex, only now it's time-based, and you can overload the formal verification engine by writing testing conditions that explode exponentially. Yippee.
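For reference, a compile-ready rendering of the example above; the signal names are assumed, and the original's "##0 eventually" is replaced with "and s_eventually" (the strong form) so that the property parses as legal SVA:

```systemverilog
// Assumed to live inside (or be bound to) a module where these are 1-bit status flags.
module handshake_sva (input logic clk, start, complete, done, finished);
  property p_handshake;
    @(posedge clk)
      start ##1 (!complete)[*0:$] ##1 complete
        |=> (done and s_eventually finished);
  endproperty
  a_handshake: assert property (p_handshake);
endmodule
```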
@sUmEgIaMbRuS
@sUmEgIaMbRuS 2 жыл бұрын
I've specifically aimed to be a verification engineer ever since I heard about it being a thing during my Bachelor's in EE. This dream came true recently, but I've been struggling to explain to friends and family what exactly it is that I'm doing. It's cool to see it addressed in a popular channel's video 😀
@Asianometry
@Asianometry 2 жыл бұрын
It’s cool to be called a popular channel. Thanks
@sUmEgIaMbRuS
@sUmEgIaMbRuS 2 жыл бұрын
@@Asianometry 100K subs means top 10 in my country :)
@jonathanthomas7287
@jonathanthomas7287 2 жыл бұрын
which country?
@sUmEgIaMbRuS
@sUmEgIaMbRuS 2 жыл бұрын
@@jonathanthomas7287 Hungary (Central Europe)
@tonymccann1978
@tonymccann1978 Жыл бұрын
Binged your videos, great stuff, keep it up
@kmikc909
@kmikc909 2 жыл бұрын
This is a great video! Your channel is really good! Thank you for sharing!
@marcin_bruczkowski
@marcin_bruczkowski 2 жыл бұрын
Amazing what YT suggests for me to watch, but I found it fascinating, relevant (chips all around us, even though I'm a not in a technical field) and very well presented. Thank you!
@anv.4614
@anv.4614 2 жыл бұрын
I have to thank you. Appreciate your competence and your knowledge which you are willing to share with your followers. Thanks
@johnl.7754
@johnl.7754 2 жыл бұрын
Well one thing I can predict about the future from when he created this video is: That he got his 100,000+ subscribers
@Asianometry
@Asianometry 2 жыл бұрын
Ha, when I made this video three months ago - I thought I would NEVER get to 100K ever.
@12-343
@12-343 2 жыл бұрын
Never really learned about the hardware side of design much, but this is very interesting. Cool video!
@ttb1513
@ttb1513 Жыл бұрын
A better example for the traffic light at 5:00 is if the later car shows up just +/-1 ns from when the 1st car clears the intersection. This could create a corner case where two "rare" events happening at just the right relative times cause the system to hang or screw up, like the light getting stuck and never changing for the 2nd car. The "fairness" example given was that of a spec being met but the spec being incomplete. That is a problem. But just getting a design to meet a correct spec when there are tons of concurrent and interacting events going on is very difficult in itself, especially when some of those events are quite rare.
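A sketch of how that fairness/liveness requirement could be written as an assertion for the video's traffic-light example; every signal name here is an assumption, and the 1000-cycle bound in the second check is arbitrary:

```systemverilog
module light_fairness_sva (input logic clk, rst_n, side_request, side_green);
  // Liveness form, suited to formal tools: a waiting car must eventually see green.
  a_live:    assert property (@(posedge clk) disable iff (!rst_n)
                              side_request |-> s_eventually side_green);

  // Bounded form, friendlier to simulation (the 1000-cycle limit is arbitrary).
  a_bounded: assert property (@(posedge clk) disable iff (!rst_n)
                              side_request |-> ##[1:1000] side_green);
endmodule
```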
@ttb1513
@ttb1513 Жыл бұрын
Also, relating to the keyboard testing example at 13:40, this video did not mention doing unit tests of blocks of the design. At that level it is much easier to control and sweep through interesting cases, often exhaustively; something that usually cannot be done when the design block is exercised within the context of the entire chip. Additionally, formal verification tools are also used. They can verify properties exhaustively that cannot even be tested with functional tests. A simple example is that a 64-bit or 128-bit adder can be exhaustively verified with formal verification tools. Constrained-random or brute-force sweep testing cannot do the same in our lifetimes.
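A sketch of the exhaustive unit-level sweep mentioned above, assuming a small 8x8 multiplier block ("mult8" is a made-up DUT name); 65,536 cases are trivial at this level, but the same approach is hopeless at full-chip scale:

```systemverilog
module mult8_exhaustive_tb;
  logic [7:0]  a, b;
  logic [15:0] p;

  mult8 dut (.a(a), .b(b), .p(p));  // assumed combinational DUT and ports

  initial begin
    for (int i = 0; i < 256; i++)
      for (int j = 0; j < 256; j++) begin
        a = i[7:0];
        b = j[7:0];
        #1;  // let the combinational result settle
        if (p !== 16'(i * j))
          $error("mismatch: %0d * %0d returned %0d", i, j, p);
      end
    $display("swept all 65536 input combinations");
  end
endmodule
```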
@martinsimlastik5457
@martinsimlastik5457 2 жыл бұрын
Great video again! Those shorter and shorter design cycles are so familiar to me :(
@ab76254
@ab76254 2 жыл бұрын
I don't have nearly enough understanding of chips to be able to meaningfully engage with this video, but very interesting nonetheless!
@jurevreca9229
@jurevreca9229 2 жыл бұрын
Hey Asianometry, I really love your videos. In case you take requests for video ideas, I would love to see a video on the growing open-source EDA tools ecosystem (Chisel, cocotb, OpenRoad, OpenRAM...).
@RomainCavallini
@RomainCavallini 2 жыл бұрын
My brother is a verification engineer, its super interesting ! and its the first time I see it talked about on youtube, I'm so stoked !
@Maouww
@Maouww 2 жыл бұрын
Yo this video is so helpful to me (as an elec eng student)! Thanks heaps
@celeron800
@celeron800 2 жыл бұрын
What's with all the screenshots of PCB designs in an ASIC video :D . Anyway the DFT (design for testability) and DV (design verification) teams are 2 to 3 times the size of DE (design engineers) + PD (physical designers) and try and work in parallel with the design process with periodic feedback provided by the other teams. Also, as a PD you run multiple time consuming verification steps - timing, noise, DRC, ERC, LVS, LEC, ANT, power, fill checks etc. which factor into the ~70-60% verification time.
@Asianometry
@Asianometry 2 жыл бұрын
Got to have something to show people
@leomethyst5045
@leomethyst5045 2 жыл бұрын
It would be nice to hear your opinion on future of RISC-V or Arm Total Solutions for IoT. Great video as always.
@Asianometry
@Asianometry 2 жыл бұрын
The story of RISC-V is still developing. I feel I wouldn't have much to say here.
@diggleboy
@diggleboy 2 жыл бұрын
I really enjoy your videos, Jon. This video essay is a great take on the semiconductor design problem. I do believe that AI will gradually step in to help alleviate some of the challenges and issues with manufacturing silicon, but the industry has been slow to adopt this technology to make vast improvements to silicon manufacturing quality. Some in technology think "if it ain't broke, don't fix it", and that needs to change in my opinion.
@vyor8837
@vyor8837 2 жыл бұрын
AI isn't magic
@PKAdazGalaxiaz
@PKAdazGalaxiaz 2 жыл бұрын
The one suits all approach is hard. Taking time to make an ai might not be worth it and instead paying a team of 10 to do the testing comes cheaper. We have seen an increase in ai used for the architecture of chips but even then you spend years developing the parameters and code for the ai itself. AI shouldn't be seen as a way to put out chips at a faster rate, they screw up too. I think we just need to understand that all aspects of the chain need time to catch up, we have research projects that show we can shove more transistors into a smaller area but so what, if the machinery isn't caught up and only a few men are knowledgable enough to put it into practice it is all for nothing. Customers want to hear that the next product has a newer chip with faster speeds, they have grown accustomed to getting a new phone every year with some sort of improvement. This can only last for so long, the chain is bound to collapse at some point. wouldn't it be a waste to invest billions into a technology you know will only be relevant for a year or two. Any ai you make will lose its efficiency as soon as the chips get an upgrade and now you are left with a useless ai that needs to be changed to suit the new chip. People with proper teaching are just more cost efficient for a business.
@MoritzvonSchweinitz
@MoritzvonSchweinitz 2 жыл бұрын
Thank you for your videos! But I'm very confused because the plug for your newsletter was at the end and not the beginning!
@Asianometry
@Asianometry 2 жыл бұрын
I’ve been trying that yeah
@wngimageanddesign9546
@wngimageanddesign9546 2 жыл бұрын
Good video on verification! I entered the design verification arena after graduating with a BS in EE/CE over 20 years ago. There were still companies making shipping-container-sized hardware verification stations back then, but I was entering the fledgling realm of simulation-software-based HW verification. It truly is a resource-intensive endeavor, even back then, and I can't imagine what it's like now with extreme-UV-lithography ICs. Designers were the rockstars, but the verification engineers worked just as hard and got no recognition.
@kumartatsat868
@kumartatsat868 2 жыл бұрын
makes me dive into the world of FPGAs and learn electronics from the first principles even more. thank you for such an enlightening video!
@DD8MO610
@DD8MO610 2 жыл бұрын
Great channel. I am graduating with a computer engineering degree soon and looking to do hardware related things. This vid gave great insight.
@rfengr00
@rfengr00 2 жыл бұрын
Nice overview. There is also formal verification, which I believe is akin to a mathematical proof that the logic works as intended. The formal verification is needed as there are too many combinations to check. I was also surprised that first pass success was only 38%. I wonder if this is mostly for analog and RFIC?
@lubricustheslippery5028
@lubricustheslippery5028 2 жыл бұрын
I have a feeling that it can be harder to do the formal verification than the actual chip. And then it will also be a bigger risk for errors in the formal verification than the actual product. It must be so hard just to define exact how a chip should function.
@developandplay
@developandplay 2 жыл бұрын
As a software person dabbling with chip design I initially assumed by verification the hw people mean formal verification. I was so surprised to find out that verification for hw is just a different word for testing. Honestly having worked on a few open-source hw design projects the test coverage is often worse compared to sw projects. Even talking to a few industry folks it seems like hw verification is a lot more manual and inconsistent than software test cases. Now obviously the big chip makers can afford to hire huge teams of verification engineers but it looks pretty weird to me. For software projects pure test engineers are pretty rare as just writing tests all day long would be pretty demeaning to most software engineers.
@rfengr00
@rfengr00 2 жыл бұрын
Let's say you had something simple, like a counter with set, reset, and clock. Now make that counter 64 bits wide. It's impossible to check the transition from every state to the next, for every input sequence; it would maybe take 100 years even in FPGA verification. With formal verification you can do it in a few seconds. Ha ha, don't ask me how it works. I just know you mathematically prove it works, and it takes about the same time as verifying a 4-bit counter.
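A back-of-the-envelope check of that claim, assuming (generously) that the prototype can be exercised at 10^9 cycles per second:

```latex
\[
\frac{2^{64}\ \text{counter states}}{10^{9}\ \text{cycles/s}}
\approx 1.8\times10^{10}\ \text{s} \approx 585\ \text{years}
\]
% ...and that only visits each count value once, ignoring the set/reset input
% sequences the comment mentions. A formal proof (for example, by induction on
% the counter) never enumerates the states at all, which is why it finishes in seconds.
```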
@developandplay
@developandplay 2 жыл бұрын
@@rfengr00 Well with formal verification you do mathematical proofs on the structural properties of the circuit. While I do think this would have a lot of potential for both hardware and software engineering the reality is that it requires quite specific skills and has not really caught on for centuries. Aside from security critical applications that may require formal proofs like cryptography it seems like the effort is too big for broader adoption. However I do hope this changes as for hardware we should insist on correctness.
@someonespotatohmm9513
@someonespotatohmm9513 2 жыл бұрын
@@rfengr00 It becomes more complicated on more complex systems. Its not realy my field but if your code or harware has to handle a lot of edge cases it also becomes hard to describe them for formal proofs.
@anonyshinki
@anonyshinki 2 жыл бұрын
Constrained random verification, I thought I'd never hear the term again after defending my undergrad paper over ten years ago :D We were doing very basic stuff with old MIPS designs though, I was writing a Ruby DSL that generated a boatload of assembly tests for testing Verilog designs. But my academic advisor was doing similar work with the Elbrus processors (classified, of course, so we never saw any of it), so the various hardware verification related stuff that students and research assistants under him were doing might have been repurposed for that.
@msimon6808
@msimon6808 Жыл бұрын
I did a stint as a test equipment designer in aerospace. I loved it. The scheduled time for a project was never adhered to because the other design work had used it up. "We can make it up in test eqpt." must have been a mantra. Or a face saver. I loved beating their most optimistic goals. Which I mostly did. Fun. Fun. Fun.
@fbkintanar
@fbkintanar 2 жыл бұрын
Interesting. I know that hardware verification has some links to formal methods in software design. I wonder how ideas from functional programming in software, like monads for managed (side-)effects, has any present or future impact on hardware verification.
@gregparrott
@gregparrott 2 жыл бұрын
Just a few weeks ago, an 'Asianometry' video listed ~90k subscribers and the speaker said he would like to reach 100k subscribers. He made this request again here (15:54) on the day this video was released (12/5/21), and it lists 140k subscribers (including me). It looks like his base is quickly growing.
@Asianometry
@Asianometry 2 жыл бұрын
I made the video 3 months ago.
@gregparrott
@gregparrott 2 жыл бұрын
@@Asianometry Thanks for the reply. More importantly, your channel is fast growing ...well deserved.
@OrangeC7
@OrangeC7 2 жыл бұрын
These two videos were an awesome peek into how people actually manage to create the magic black box that's in all of our modern devices!
@gebys4559
@gebys4559 2 жыл бұрын
Great video. I personally hated doing verification on FPGA.
@brokula1312
@brokula1312 2 жыл бұрын
You are brilliant! Time for popcorns and another semiconductor video. Thank you!
@rammahesh3349
@rammahesh3349 2 жыл бұрын
Thanks for explaining for a layman like me in electronic industry 🎉👌
@NightRogue77
@NightRogue77 2 жыл бұрын
I’m only four minutes in and this is already one of the most fascinating things I’ve seen in a while
@ParadoxPerspective
@ParadoxPerspective 2 жыл бұрын
Jon, you're probably the best independent tech journalist alive in the world today. Thank you for your devotion to your passion and profession.
@sauravkumargupta2379
@sauravkumargupta2379 2 жыл бұрын
I am a formal verification engineer and I totally agree with your video. The last part of the video is what we do in formal verification, i.e. random generation of numbers.
@kelvinnkat
@kelvinnkat 2 жыл бұрын
The "I'd like to teach 100,000 subscribers one day" is some sort of running gag, right?
@martinchabot_FR
@martinchabot_FR 2 жыл бұрын
FPGAs for verification are a thing of the past (mostly). We still use them, but for very limited SoC or single-IP verification. Today we rely on accelerated emulation like Synopsys ZeBu, which can hold massive designs with ease, allowing multiple instances to run different tests.
@Samsul2013
@Samsul2013 Ай бұрын
I thought the video was about open-source tools for chip design; the few videos I watched were about chip development in the past. Good for students who want to know about chip development history.
@gibbogle
@gibbogle 2 жыл бұрын
Great presentation, really interesting.
@nandakanda001wasabi
@nandakanda001wasabi 2 жыл бұрын
Remember having to explain to management how I mistakenly mirrored a design layout after several sleepless days designing it to get it done. Conclusion is that sleep deprivation affects concentration.
@Travlinmo
@Travlinmo 2 жыл бұрын
This feels like one where you could re-do this ever year or two to update. Something is likely to give to speed up the QA portion of this (i.e., someone decides getting an AI involved and builds AI chips just to build the test routines).
@GoogleUser-ee8ro
@GoogleUser-ee8ro 2 жыл бұрын
Can I say, this trait of chip design (verification difficulty scales up exponentially with the complexity of the chip) will fare better on an ASIC than on a general-purpose chip such as a CPU or GPGPU, which has way more "corner cases" to cover and discover! It may suggest it will be increasingly uneconomic to design "general purpose" chips in the future.
@W1ldTangent
@W1ldTangent 2 жыл бұрын
I think for a time, using machine learning techniques mixed with traditional ones, we'll find a lot of short-term efficiencies, but long-term it'll be rendered moot if we ever get a solid grasp on quantum computing. That's a ways off though; for now, best let the algos have a crack at it before we drive our small minds insane 😂
@Bianchi77
@Bianchi77 2 жыл бұрын
Nice video clip to watch, keep it up, thank you :)
@thosewhowish2b693
@thosewhowish2b693 2 жыл бұрын
Man, 2:36 gave me pause. Is that Eclipse for RTL design? Can you explain what is going on there? Is that an existing toolchain or did someone hack together an environment with open source synthesis tools and etc.?
@01sevensix
@01sevensix 2 жыл бұрын
More outstanding work jon. thank you.
@luket2915
@luket2915 2 жыл бұрын
how did you even comment
@skazka3789
@skazka3789 2 жыл бұрын
Why is this comment from 2 months ago when the video just came out a day ago lmao
@manhoosnick
@manhoosnick 2 жыл бұрын
Wtf
@casparfriedrich3119
@casparfriedrich3119 2 жыл бұрын
@@skazka3789 Yeah I thought it was a bug with the new KZfaq Android update, seems not.
@n-i-n-o
@n-i-n-o Жыл бұрын
I can't believe this is free on the internet, thank you!
@rayoflight62
@rayoflight62 2 жыл бұрын
Verification can be conducted automatically only to a very limited extent. EDA tools work like a software compiler, but the output is not code to run on a computer; it is a physical portion of a circuit to be built on silicon. Only in this way can you generate a chip with four billion transistors. Verification requires that you build dozens of successive prototypes and test the device for inconsistencies (like when you add one more core but the speed of the chip decreases for some functions). It can take years, but hands-on testing is the only way to produce a faultless SoC chip like the 888. I believe that, year on year, the improvements on existing chips will become more and more marginal. Great analysis you have made here.
@nikolaradakovic5050
@nikolaradakovic5050 2 жыл бұрын
It's impossible to make a testbench for every possible case; that's where functional coverage and formal verification kick in.
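A minimal sketch of what functional coverage looks like in SystemVerilog; the transaction fields, bins, and names below are invented for illustration. Instead of trying every case, you declare which cases matter and measure whether the random tests actually hit them:

```systemverilog
package pkt_cov_pkg;
  class pkt_txn;
    rand bit [1:0] port;                    // 4 destination ports
    rand bit [9:0] len;                     // payload length
    constraint c_len { len inside {[1:512]}; }
  endclass

  covergroup pkt_cg with function sample (pkt_txn t);
    cp_port : coverpoint t.port;
    cp_len  : coverpoint t.len {
      bins tiny    = {[1:4]};
      bins typical = {[5:256]};
      bins jumbo   = {[257:512]};
    }
    cx : cross cp_port, cp_len;             // every port must see every length class
  endgroup
endpackage
```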
@bearcb
@bearcb 2 ай бұрын
Working on verification for more than two decades here; nice video. I'd just make some clarifications:
1- Most functional verification is done through simulation. The video shows emulators and FPGAs, which are often used, yes, but they are complementary resources. Same for formal verification, which is not mentioned. FPGAs and emulation are fast enough to allow testing of SoC firmware, which may not be viable in simulation (too slow), so that's their main use, not hardware verification per se.
2- In SoC verification one doesn't cover all the functionality of the IPs used. They are assumed to be verified already, so the focus is on their interconnection, aka integration verification. Usually IPs are verified in isolation, covering their whole functionality, before they are used in an SoC.
@valentinohose1723
@valentinohose1723 Ай бұрын
Hi, I'm an EE student graduating this year. What do you think about verification engineer as a career path compared to design engineer?
@bearcb
@bearcb Ай бұрын
@@valentinohose1723 there is more demand for verification, because at least 2 verification engineers are needed for each designer, ideally 3 (which is not often the case, unfortunately). However design is usually preferred because it's seen as more creative, and has less pressure: the deadlines are beared by the verification team. Verification closure is the last deliverable before moving on to layout/fabrication for IPs/SoCs
@valentinohose1723
@valentinohose1723 Ай бұрын
@@bearcb Thank you for your answer. As someone with 20 years of experience in the industry, would you recommend pursuing a career as a verification engineer? I'm interested in understanding the potential career advancement opportunities in this field. I've found verifying a 32-bit RISC-V core with UVM in a school project to be quite enjoyable.
@bearcb
@bearcb Ай бұрын
@@valentinohose1723 it's a personal decision, you have to balance pros and cons. Demand will always be there. If you enjoyed your verification experience, I'd say go for it. But again, it's very personal.
@valentinohose1723
@valentinohose1723 Ай бұрын
Thank you
@firstnamelastname3389
@firstnamelastname3389 2 жыл бұрын
"slap together a few libraries and ... boom, professional" haha too true 15:00
@davechapman6609
@davechapman6609 Жыл бұрын
During the period from 1987 to 2001, I was working as a consultant, mostly doing chip verification. When the Tech Bubble ended, the bean counters decided that Verification Engineers were a waste of money, and got rid of us all. People went and did something else. I went into Real-Time Embedded Software. A few years later, as the NRE of ASICs continued to rise, they repented and tried to hire Verification Engineers again. They found that the former Verification guys were not interested. They also found that the former Verification guys were advising the young ones not to take such jobs. I remember various conversations with headhunters about Verification Engineering jobs, which were quite unpleasant. For them. "I am not interested in a Verification job because Verification Engineers don't get enough respect." "What do you mean?" "In America today, we are able to accurately measure respect. To. The. Penny." "So, this is about money?" "Sort of. Money is a symbol of whether they really value us." "How much would it take to get you into this job?" "110% of what the Design Engineers get." "Hah. They won't do that!" "That's why their chips are not gonna work." These days, I do FPGA design. FPGAs continue to gain market share from ASICs because, well, the ASICs still have problems with Verification. Maybe they could pay what it takes to get the top people to work in Verification. Maybe they could say "Thank You" once in a while. Maybe they could try to provide some job security. Nah. It ain't gonna happen. ASICs are doomed.
@davechapman6609
@davechapman6609 Жыл бұрын
A 50% rate of re-spins sounds right to me (maybe 65% is closer to the truth). The fact that slang terms exist for "a chip which came back from the fab and does nothing" tells you a lot... I also suspect that many of the new versions of a chip which are supposedly "to add features" were actually involuntary bug fixes.
@jdcjr50
@jdcjr50 2 жыл бұрын
Thank you. Designers could use a class/genus/species type of chip design to help close the verification gap.
@avamander.
@avamander. 2 жыл бұрын
Such a great video. How do things like open RISC-V cores change up the process? Does it reduce the amount of verification needed? Are reusable and mass tested cores the only reasonable path forwards?
@slicer95
@slicer95 2 жыл бұрын
No, being open source by itself doesn't reduce the verification required.
@avamander.
@avamander. 2 жыл бұрын
​@@slicer95 I think you oversimplified my question, it's not about "just being open-source".
@slicer95
@slicer95 2 жыл бұрын
@@avamander. RISC-V by itself doesn't change anything for verification. RISC-V is just an open instruction specification, where x86 and ARM were proprietary; the instruction specification and the verification effort are orthogonal. For the second part: even if you get reusable and mass-tested cores, they still have to be verified when they are combined, which is non-trivial.
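To make the orthogonality concrete: the open spec tells you what an instruction such as ADDI must do, but checking that a given implementation actually does it still takes a testbench comparing it against a reference model. A minimal Python sketch of that style of check follows; the "DUT" here is just a second, deliberately independent software implementation standing in for RTL running in a simulator:

```python
import random

MASK32 = 0xFFFFFFFF

def sign_extend_12(imm):
    """Sign-extend a 12-bit immediate, as the RISC-V spec defines for ADDI."""
    imm &= 0xFFF
    return imm - 0x1000 if imm & 0x800 else imm

# Reference ("golden") model written straight from the ISA spec.
def golden_addi(rs1, imm12):
    return (rs1 + sign_extend_12(imm12)) & MASK32

# Stand-in for the design under test: in reality this would be the RTL in a
# simulator; here it's just a second, independent implementation.
def dut_addi(rs1, imm12):
    imm = imm12 & 0xFFF
    if imm & 0x800:
        imm |= ~0xFFF & MASK32          # manual sign extension to 32 bits
    return (rs1 + imm) & MASK32

# Directed corner cases plus random stimulus: the open spec doesn't make this
# step go away.
cases = [(0, 0), (MASK32, 0x7FF), (0, 0x800), (0x80000000, 0xFFF)]
cases += [(random.getrandbits(32), random.getrandbits(12)) for _ in range(10_000)]

for rs1, imm in cases:
    assert dut_addi(rs1, imm) == golden_addi(rs1, imm), (hex(rs1), hex(imm))
print(f"{len(cases)} ADDI cases match the reference model")
```

The open spec makes writing the golden model easier; it does nothing to shrink the space of cases the implementation has to be checked against.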
@robertcormia7970
@robertcormia7970 Жыл бұрын
Really well done, verification is anything but mundane!
@YoutubeBorkedMyOldHandle_why
@YoutubeBorkedMyOldHandle_why 2 жыл бұрын
Another great video as usual. I especially love the way you seamlessly mix PCB and chip design graphics. You seem to gloss over FPGAs, describing them as mere tools for testing rather than real hardware. But I wonder if this might be our future, where all devices employ chips which are programmable to some degree. It seems to me that hardware design today has a lot more to do with programming than with building circuits on boards. Since it has proven to be almost impossible to get programming correct the first time around, having the ability to issue an update to completely reprogram devices in the wild would seem like a no-brainer. I am imagining hybrid processors, which contain several rock-solid core elements, along with an FPGA component into which companies would program their own IP. This could eliminate most costly and embarrassing fabrication goof-ups, and would also allow for cheaper standardized chips.
@7008aspen
@7008aspen 2 жыл бұрын
Dogstyle Joe Biden
@SianaGearz
@SianaGearz 2 жыл бұрын
But an FPGA is also an SRAM device, which makes it somewhat slow, space and power inefficient, and cost inefficient compared to dedicated silicon at sufficient volume. Decades ago you'd have antifuse FPGAs, which were essentially programmable-ROM-based configurable logic devices, but those seem to have died out, having been in a bit of a no-man's-land between in-system programmable devices and dedicated silicon. Before that, there were ULAs, which were mask-ROM-configurable logic devices. And then microcontrollers with configurable peripherals such as DMACs, and things like the RP2040's PIO peripheral, probably eat quite substantially into CPLD/FPGA territory for a lot of the tasks these would have been used for before, as they're much easier to program. You're also correct in recognising a configurability increase in insanely complex circuits, and that it's absolutely vital: PC CPUs contain fixed ALUs, but the actual software-facing instruction set is implemented as "microcode", a rough and usually barely functional version of which is on the silicon, and it gets updated from the PC firmware on every boot. Spinning up such a device costs a fortune, and they are prone to bugs being discovered in the field which need to be fixed. Also, so many hardware designs have leaned into SoC territory as time went on, like "oof, this is hard - let's stick a processor core in there", and thus everything, even devices which you think of as fixed-function like SD cards and just about every USB peripheral, has become firmware driven. When you buy a USB-to-serial dedicated-function chip today, you find inside an 8051-compatible core and a ROM. As universal as FPGAs are, they have also always been a niche device. The promise of FPGAs everywhere is somehow always nearer and always further away, and it has been this way for like 20 years.
@theInsaneRodent
@theInsaneRodent 2 жыл бұрын
Have you seen the Xilinx Zynq chips? They have an ARM core with an FPGA attached. I think Xilinx's MPSoC products also have hardened elements separate from the processing cores, but I don't remember what.
@YoutubeBorkedMyOldHandle_why
@YoutubeBorkedMyOldHandle_why 2 жыл бұрын
@@theInsaneRodent Yes of course. As I was clicking the submit button, I kind of anticipated that someone would make the observation. The thing is, today FPGA manufacturers often build in microcontrollers as sub-components on primarily FPGA chips. But what I am imagining is more along the lines of device manufacturers starting with a custom set of hard microprocessor cores and peripherals, and then adding an FPGA module as a sub-component. Apple for example, started with an ARM core and then added their own 'hard' IP to create a custom chip. Apple is big enough, but still mistakes could be very costly. By doing essentially the same thing, but instead building in a soft FPGA module, the chip could be less risky (no pun intended), and could be marketed to other device manufacturers as well. Mostly semantics I suppose, but I have a feeling this is where things are heading.
@spehropefhany
@spehropefhany 2 жыл бұрын
Thanks. Amusing that you are using software metaphors to explain physical designs. We've come full-circle.
@petergoodall6258
@petergoodall6258 2 жыл бұрын
One man’s hardware is another man’s software
@WarrenKLiu
@WarrenKLiu 2 жыл бұрын
Is it true that the gate size, i.e. the actual physical size of the manufactured feature, no longer matches the process node name 1:1 since 32nm? I vaguely remember reading somewhere that today's 7nm and upcoming 5nm processes basically have gates larger than that. If true, what impact does this have on chips, now that we are already at the 2nm process?
@VikramBamel
@VikramBamel 2 жыл бұрын
Yes, you're correct. Today it's usually the smallest feature on the device (which may or may not be the gate) that is taken into consideration when deciding on the naming convention.
@jarikosonen4079
@jarikosonen4079 2 жыл бұрын
Since a lot of time is taken up by verification, that time can probably only be reduced with more and more simulation capacity; using 16 or more cores can be required depending on the task. And analog problems can be even harder to debug. Quite a lot of verification software exists, but with high license costs.
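On the capacity point, a big part of the answer in practice is simply running a regression's many tests and seeds in parallel across whatever cores (and license seats) are available. Below is a rough Python sketch of the orchestration side only; the test list and the simulator command line are placeholders, since every tool has its own invocation:

```python
import sys
import subprocess
from concurrent.futures import ProcessPoolExecutor

# Hypothetical regression list of (test name, random seed) pairs.
# Real flows generate hundreds or thousands of these.
REGRESSION = [("smoke", s) for s in range(16)] + [("random_traffic", s) for s in range(16)]

def run_one(job):
    test, seed = job
    # Placeholder command: substitute your simulator's actual command line here.
    cmd = [sys.executable, "-c", f"print('simulating {test} seed={seed}')"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return test, seed, result.returncode == 0

if __name__ == "__main__":
    # 16 workers mirrors the "16 or more cores" point above; in practice the
    # limit is usually simulator licenses as much as CPU cores.
    with ProcessPoolExecutor(max_workers=16) as pool:
        results = list(pool.map(run_one, REGRESSION))

    failures = [(t, s) for t, s, ok in results if not ok]
    print(f"{len(results) - len(failures)}/{len(results)} passed, failures: {failures}")
```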
@x2ul725
@x2ul725 2 жыл бұрын
Some things, like traffic signals for railways, require even more verification steps. They require fail-safe circuits so that unsafe situations cannot be latched into place by failed software or failed hardware. This is not easy and turns out to be very expensive. Standards...
@NeutralGenericUser
@NeutralGenericUser 2 жыл бұрын
I love your videos. Thank you!
@Joemama555
@Joemama555 2 жыл бұрын
15:55 lol "i would like to reach 100,000 subscribers some day". I guess that was recorded a while ago... :)
@Asianometry
@Asianometry 2 жыл бұрын
Yeah my bad.
@mhamma6560
@mhamma6560 2 жыл бұрын
Were you born in the US or CA (I'm leaning CA)? In TW now!? Usually that line of work would leave one little time to create vids; curious to hear your story. Thanks
@MooseBoys42
@MooseBoys42 2 жыл бұрын
Verification coverage gaps are largely mitigated these days by three factors. First, most devices are online and able to receive software and firmware updates that can work around the problem. Second, most consumer devices are relatively inexpensive and designed to be replaced after less than two years, so the effect of a faulty chip is limited in scope. Third, mature supply chains mean that spinning hardware revisions is cheap, so tight iteration with progressively broader test audiences can be used. Also, "respins" are *extremely* common, and very inexpensive. Most GPUs are shipped on their third or fourth spin. It doesn't require reverification or relayout. Instead, chips are built with disconnected transistors and basic logic scattered around. If there's a bug, a change in the metal mask can be done very easily to tweak the logic.
@jagannathdas5491
@jagannathdas5491 2 жыл бұрын
You gave me a business idea!!! But if the problem has existed for that long, there must already be a product or suite that at least claims to solve it.
@Android480
@Android480 2 жыл бұрын
The concept of abstracting the physical design of a chip into a high level programming language is absolutely fascinating.
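It is a fascinating abstraction, and the core of it is surprisingly simple: at the register-transfer level, the "program" mostly just says what each register should hold after the next clock edge. Here is a tiny Python sketch of that mental model, using a made-up 4-bit counter rather than any real HDL:

```python
# Register-transfer-level thinking in a few lines: state lives in "registers",
# and one function says what each register's next value should be.

class Counter4:
    def __init__(self):
        self.count = 0                      # the register (4 bits wide)

    def tick(self, enable, clear):
        """One rising clock edge: compute next state from current state and inputs."""
        if clear:
            nxt = 0
        elif enable:
            nxt = (self.count + 1) & 0xF    # wrap at 4 bits
        else:
            nxt = self.count
        self.count = nxt                    # the "clock edge": register updates

# Drive it cycle by cycle, the way a simulator drives synthesized hardware.
dut = Counter4()
for cycle in range(20):
    dut.tick(enable=True, clear=(cycle == 10))
    print(f"cycle {cycle:2d}: count = {dut.count}")
```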
@nulnoh219
@nulnoh219 2 жыл бұрын
3:43. For all I know, that might as well be a magic circle from one of those fantasy novels.
@WalterBurton
@WalterBurton 2 жыл бұрын
Pleasantly surprised. Thanks, Mr. Algorithm. Burnt-out old software developers like me love this.
@0MoTheG
@0MoTheG 2 жыл бұрын
I studied digital design in the 2000s but never found a job in the industry. Now, a decade later, there are many verification / non-regression testing jobs.