This is Destroying Our FPS

  394,660 views

Vex

1 day ago

DirectX 12 could be destroying our FPS. There are many reasons for this, but top of the list is the sheer amount of control the API grants to developers while at the same time introducing SO many NEW and graphically demanding features like ray tracing. Devs are having to learn faster than ever to adapt and optimize properly. What do you think?
==JOIN THE DISCORD!==
/ discord
Valorant: • GREATER THAN ONE // Ge...
MuleSoft: • What is an API?
Nvidia Explains DX12 Ultimate: • DirectX 12 Ultimate on...
Daniel Owen: • Your PC isn't ready fo...
Gameranx: • What Is Vulkan & Why S...
Vulkan: • Bringing Ray Tracing t...
en.wikipedia.org/wiki/DirectX
www.digitaltrends.com/computi...
My Spotify:
open.spotify.com/artist/3Xulq...
==WINDOWS AT A HUGE DISCOUNT!==
Windows 10 pro ($15): biitt.ly/TFR1G
Windows 11 pro($21): biitt.ly/U9jCZ
Code: "vex" for 25% off!
0:00- Current State of New Games
0:30- The Common Denominator
0:57- DX11 vs DX12 in new games
2:03- How APIs affect games
3:15- DX12 is SUPPOSED to be better
3:53- DX12 Ultimate go brrrrr
5:40- Warning signs
6:43- Inexperience working with DX12
7:35- What about Vulkan?
9:47- We want better

Comments: 2,200
@vextakes 9 months ago
Just to clarify, the Witcher 3 DX12 update (released Dec, 2022) does seem like it is particularly bad compared to the original DX11. But, what does this mean for other recent releases? Lol also it’s “Application Programming Interface” :) Feeling like LTT with these corrections 🫠😳
@dafyddrogers9385 9 months ago
Is the Witcher 3 DX12 build so bad because it's patched on top, not built from the ground up like the DX11 version was?
@damara2268 9 months ago
Also note that the high VRAM allocation you showed in Control is not a bad sign. It means the devs configured the game through DX12 to preload more data into video memory when it's available, because that reduces potential stutter when going into a new area. If your GPU has 10GB of memory, why only use 5 and leave the other 5 unused even though the space could be used for smoother gameplay?
@loganbogan9 9 months ago
@@dafyddrogers9385 they actually didn't "build" anything at all. Instead of porting the game to a newer engine version, or at least porting the current engine to DX12, they used D3D11On12. It's a translation layer that lets DX11 apps run on DX12. It's horribly optimized, and Microsoft themselves said to only use it for 2D elements. So essentially the entire game is run through a compatibility layer.
@Antek_S 9 months ago
PIN IT! Before Steve comes looking for you.
@dafyddrogers9385 9 months ago
@@loganbogan9 It certainly feels that way whilst playing.. Thanks for the explanation
@acuteaura 9 months ago
With DX12, you can treat the GPU like a console, squeezing out every inch of performance. But barely anyone seems to be putting in that effort.
@redbullsauberpetronas 9 months ago
Incompetent devs, competency crisis in action
@sidewinder86ify 9 months ago
Not really... consoles aren't magic when it comes to optimizing. The secret really is "lower the graphics till it stops lagging, then lock the FPS at 30 or 60". I don't call 30 FPS at 720p good optimization. The PS4 and PS5 are x86 systems, basically a locked-down PC with Linux.
@SethOmegaful 9 months ago
@@sidewinder86ify The magic of consoles is less hardware to test when optimizing your software. That's it.
@duxnihilo 9 months ago
@@redbullsauberpetronas Wouldn't they have to account for every single GPU out there? I don't think the current batch of devs, who put Witcher 3, RDR, Wolfenstein and Doom on the fucking Switch, are incompetent.
@grisu1934 9 months ago
@@sidewinder86ify That's not how it works. You can precompile shaders for specific consoles, and because there are far fewer consoles than GPU configurations it's feasible to do so. Also, the PS4 and PS5 don't run Linux; they're based on FreeBSD.
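The precompilation point above can be made concrete with a rough back-of-the-envelope sketch. All numbers and names here are made up for illustration, not real shader or GPU counts:

```python
# Toy sketch of why shader precompilation is feasible on consoles but not PC:
# each (shader, hardware target) pair needs its own compiled binary.
import itertools

def compiled_variants(shaders, gpu_configs):
    """Every shader must be compiled separately for every target."""
    return [(s, g) for s, g in itertools.product(shaders, gpu_configs)]

shaders = [f"shader_{i}" for i in range(2000)]   # rough order of magnitude for a big game

console_targets = ["PS5", "XboxSeriesX", "XboxSeriesS"]   # fixed hardware: ship binaries on disc
pc_targets = [f"gpu_{i}" for i in range(500)]             # PC: hundreds of GPU/driver combos

print(len(compiled_variants(shaders, console_targets)))   # 6000: feasible to pre-ship
print(len(compiled_variants(shaders, pc_targets)))        # 1000000: must compile on the
                                                          # player's machine, hence stutter risk
```

This is why PC ports compile (and ideally cache) shaders at runtime, and why skipping that warm-up step shows up as compilation stutter.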
@hawns3212 9 months ago
Game Developer / Graphics Programmer here. One of the main issues when switching from D3D11 to 12 is the level of control. In 11, a lot of menial tasks are done for you, pre-optimized for the GPU. In D3D12 you have much more control over the fine details of those tasks, but the average programmer doesn't know how to use it efficiently enough to make the extra control worthwhile. On top of that, 11 is also a lot easier to learn because of how much simpler it is.
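One of those "menial tasks" can be sketched as a toy Python model. This is not real D3D code — just an illustration of implicit (DX11-style) versus explicit (DX12-style) resource-state management, with all names invented:

```python
# Toy model: in "DX11 style" the driver transitions resources for you;
# in "DX12 style" the application must record the barrier itself.

class Texture:
    def __init__(self, name):
        self.name = name
        self.state = "COMMON"

def dx11_style_read(texture):
    # The runtime/driver silently inserts the transition before the read.
    if texture.state != "SHADER_READ":
        texture.state = "SHADER_READ"
    return f"sampling {texture.name}"

def dx12_barrier(texture, new_state):
    # The dev's new responsibility: explicit state transitions.
    texture.state = new_state

def dx12_style_read(texture):
    # Reading a resource left in the wrong state is invalid.
    if texture.state != "SHADER_READ":
        raise RuntimeError(f"{texture.name} is in {texture.state}; missing barrier!")
    return f"sampling {texture.name}"

t = Texture("shadow_map")
print(dx11_style_read(t))           # just works: the "driver" fixed the state

t2 = Texture("shadow_map")
try:
    dx12_style_read(t2)             # forgot the barrier -> error here; on real
except RuntimeError:                # hardware, corruption or lost performance
    dx12_barrier(t2, "SHADER_READ")
print(dx12_style_read(t2))          # valid once the barrier is recorded
```

Multiply this by every texture, buffer, queue and frame in flight, and it becomes clear why the extra control is only a win for teams that invest in it.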
@loganbogan9 9 months ago
I wish Microsoft would release a DX12-Lite or something. It'd have things like RT and DirectStorage, but all optimized by Microsoft so it's easier for devs to implement properly. A higher-level API, but with the features of DX12.
@JathraDH 9 months ago
Yup. Sadly, devs these days have been coddled by years of high-level APIs and have no idea how to actually program anything anymore. You don't get that type of knowledge in a game dev course; you need a compsci background to learn it in school.
@beetheimmortal 9 months ago
@@JathraDH Which is exactly why older devs from the '90s and early 2000s were basically magicians compared to these guys. They really knew what they were doing.
@KPHIBYE 9 months ago
As a regular full-stack programmer, I have a question for you. Have they removed the existing DX11 pre-optimized methods for the menial tasks in DX12 and just left you with the low-level ones? And if they did, surely someone would have already written an open-source wrapper library to reintroduce them? I just can't imagine that devs are now stuck with a low-level API and have to reinvent the wheel individually.
@InvadeNormandy 9 months ago
@@JathraDH This is also why we have games that have looked the same for almost two decades but run worse and worse. Turns out when you turn every game engine into Unity, you get Unity developer quality.
@ik4659 9 months ago
The biggest problem is that the solution has always been "throw more hardware at the problem" rather than "let's code smarter". Hardware has far outpaced the software at this point and it's a shame. I am willing to bet that there is a ton of untapped potential with this hardware that's being held back by convenient development practices. This is why I love Nintendo's engineers. Look at what they have accomplished with limited hardware. I wish more developers worked in constrained environments just so they could learn what optimization *really* looks like.
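A tiny, generic example of "coding smarter" rather than leaning on hardware — the classic squared-distance trick game programmers use to avoid a square root per object. This is purely illustrative and not from the video:

```python
# "Code smarter": cull far-away objects by comparing squared distances,
# skipping one sqrt per object. Same result, strictly less work per frame.
import math
import random

random.seed(1)
objects = [(random.uniform(-100, 100), random.uniform(-100, 100))
           for _ in range(10000)]
MAX_DIST = 50.0

def visible_naive(objs):
    # sqrt on every object, every frame
    return [o for o in objs if math.sqrt(o[0]**2 + o[1]**2) <= MAX_DIST]

def visible_smart(objs):
    # compare against the squared limit instead; sqrt is monotonic,
    # so the selected set is identical
    limit_sq = MAX_DIST * MAX_DIST
    return [o for o in objs if o[0]**2 + o[1]**2 <= limit_sq]

assert visible_naive(objects) == visible_smart(objects)
```

It's a micro-example, but the mindset scales: constrained platforms force developers to find equivalences like this everywhere, which is the kind of optimization the comment is praising.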
@trypwyre9024 8 months ago
I accede.
@heylow8849 8 months ago
I concur.
@MrGamelover23 8 months ago
What I don't understand is: how are games so badly optimized when they have to be able to run on the Series S? If you can get your game to run well on the Series S, it should run fine on PC, since both platforms technically run Windows and use DX11 and 12.
@-TheOddity 6 months ago
@@MrGamelover23 one would think, but optimization comes in a lot of different forms. Usually a studio makes the game specifically for Xbox and PlayStation, and it's the publisher's responsibility to port it to PC, so typically they put in minimal effort to release it faster and keep costs down. Drivers play a big role as well. Consoles usually have particular acceleration hardware alongside the GPU and CPU to optimize rendering, and they don't have to run a bloated operating system in the background, so they free up computing resources: RAM, clock cycles, etc.
@WeeeAffandi 5 months ago
Agreed. RE: Revelations on a 3DS is by far the most mind-blowing thing I've ever seen.
@LuaanTi 9 months ago
Starting from scratch with DX12 is a great experience. But most games are not made with DX12 from scratch. They use the tooling and assets already developed for the company's earlier games. They use abstractions that were developed all the way back for DX9 (and OpenGL) that were themselves kind of poor because they were really made for DX7, and it goes on and on. They have intermediate layers that allow you to switch graphics APIs, which means you can't really do anything well: you still have to pick one as your "main" and hopefully wire things together decently enough to make the alternate APIs barely worth it. Grafting DX12 or Vulkan on top of a game built for DX10/OpenGL doesn't do much... much of the time. The main problem with DX has always been the fact that it's a Microsoft product, utterly dependent on and controlled by them... and _mostly_ locked to Windows. Of course, that's still the _vast_ majority of PC gaming, and you can pretty safely ignore the alternatives if you're not explicitly targeting niche audiences. I started with OpenGL. I made a few simple 3D games. But when I tried DirectX for the first time... I really never felt like OpenGL had anything on offer that DirectX didn't do better. I never felt like I was missing out. Of course, Carmack always preferred OpenGL because of its openness, portability and C-style API, and was a big supporter of Vulkan for all the same reasons (as well as the added ability for low-level coding), but the vast majority of graphics engine developers aren't Carmack or Sweeney. Vulkan is a pain to use for very little benefit the vast majority of the time. Mind, it would probably be worth it if the game environment were a bit more static, with more of a chance of that investment paying off (the way it always was on consoles, where you had a typical progression from early games that didn't really know how to use the hardware yet to late games that squeezed out stuff no one thought possible). But PCs were never like that.
And the thing is, while there are many similarities between the APIs... to really get their worth, you need to build your entire game around a particular API. Anything else is a waste of loads of effort, even with all the modern helper tools like cross-compiling shaders. Giving you the option to switch costs _so much_ in either performance or development time... and often both. This only got worse with time (though even the very first release of OpenGL already did some things differently from DX for no other reason than spite; people do those kinds of things to each other). In the end, unless you're Epic... you really want to make a choice. Pick the API that feels comfortable, fits your goals, has great tooling and support, and build your game/engine around it. If you care about performance or efficiency, there is no other way, really. The kind of abstractions you need to simultaneously target DX10, DX12, OpenGL and Vulkan mean you lose way too much. The tooling will suffer too, and you'll have to build a lot of stuff on your own for almost no reason. You'll be investigating absolutely infuriating performance problems that seemingly have no reason all the time. It's incredible how much abstraction in the GPU space can hurt... and how hard it is to find how, and to fix it. I've had cases where I literally traced everything instruction-by-instruction, slot-by-slot... and still couldn't figure out why it was suddenly impossible to push 100k polygons on the screen at 60 FPS with 100% GPU load. Rewriting the same code in raw DX12 was simpler, faster and suddenly, that same GPU could push 100M polygons easily. The worst thing with GPUs (even before 3D accelerators) was always how fickle they are, and how random the performance and other issues are. In my case, the vast majority of games run much better on DX12 than on DX11 or Vulkan on _my particular hardware_ . The switch from 2D to 3D is deceptively simple... 
until you start testing on various random PCs and find out that half of them can't even run your game for obscure reasons, while the other half has inexplicable performance issues. It's crazy, and always has been. And as always: make sure your drivers are fully up to date. But of course it can't be that simple, really - some updates break games and/or their performance :D PCs are complicated.
@trypwyre9024 8 months ago
Darn, what a twisted and complicated world we live in.
@flyingflynn 4 months ago
i will not be reading all this
@wile123456 4 months ago
That's why with games that use Vulkan, like Doom, you know you're getting a well-optimized experience: there's no legacy code the devs can rely on. They have to optimize or their game won't release.
@whannabi 4 months ago
@@wile123456 yeah, but if I understand correctly, it's only a matter of time before they do it again.
@AffectionateLocomotive 4 months ago
Damm
@Barkebain 9 months ago
Video game optimization was something we took for granted, but that's no longer the case. The reason game optimization took place was that the video game was the product. Today the product is the monetization platform, and the video game elements added to that platform are seen as an expense to be minimized. When the product was a video game, optimizing that product was considered a valid expense. When the product is a monetization platform, clearly that's not the case.
@tsunekakou1275 9 months ago
You're making a lot of sense.
@KraszuPolis 9 months ago
Why not? If they want to monetize the game, they want to optimize it as well so more people end up playing it. Also, Remnant 2 has no monetization AFAIK; it is a product, and it is one of the worst-optimized games, as is TLoU.
@InZiDes 9 months ago
@@KraszuPolis That time and money can be used to make more games. Most games are not hits; making more games increases the chances. However, that is not the root of the optimization problem. All modern software suffers from it. You can search "Computer Speed Gains Erased By Modern Software".
@cherryrook8684 9 months ago
I think it's more that they can now get away with it because "we have the hardware". Why optimize your game when it can be brute-forced with top-end hardware (even though the average consumer likely has mid-to-high-spec hardware)? Back then, devs had to work with very limited resources, so they had to work smarter and more creatively in terms of problem solving.
@abdulhkeem.alhadhrami 9 months ago
True words of wisdom! Just look at GTA 5 to see how other developers wish they could be. They're going downhill; heck, they've even passed it at this point, and they keep on digging!
@Drischdaan 9 months ago
The problem with The Last of Us is that it runs on D3D11On12 which internally converts directx11 calls to directx12 calls. That creates unneeded overhead and decreases performance
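The overhead described here can be sketched as a toy model: every DX11-style call passes through a translation layer that does extra bookkeeping before any real work reaches the DX12 backend. Class and method names below are invented for illustration, not real API calls:

```python
# Toy model of a D3D11-on-D3D12-style translation layer: the backend does
# the real work, while the layer adds bookkeeping on every single call.

class DX12Backend:
    """Stands in for the real low-level API."""
    def __init__(self):
        self.commands = 0

    def execute(self, cmd):
        self.commands += 1

class D3D11On12Layer:
    """Translates one DX11-style call into DX12 work, plus layer overhead."""
    def __init__(self, backend):
        self.backend = backend
        self.translation_steps = 0

    def draw(self):
        self.translation_steps += 2   # e.g. state validation + resource wrapping
        self.backend.execute("draw")

backend = DX12Backend()
layer = D3D11On12Layer(backend)
for _ in range(1000):                 # roughly a frame's worth of draw calls
    layer.draw()

print(backend.commands)               # 1000 real commands reach the backend...
print(layer.translation_steps)        # ...plus 2000 units of pure overhead
```

The per-call cost is small, but it is paid thousands of times per frame, every frame, which is why translation layers show up as lost FPS rather than a one-time hit.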
@damara2268 9 months ago
Same for Witcher 3.
@Gramini 9 months ago
I've heard that DXVK performs much better than D3D11On12, might be worth a try to use that to translate D3D11 into Vulkan.
@loganbogan9 9 months ago
@@Gramini Yeah, it's funny you mention that. Recently, DXVK and VKD3D-Proton have both gained support to run D3D11On12 games outside of its emulation layer, meaning all the DX11 calls get immediately converted to Vulkan, and all the DX12 calls get immediately converted to Vulkan. Before this major update it would go DX11 -> DX12 -> Vulkan. This should honestly be installed immediately in any game that decided to use D3D11On12 as a horrendous conversion layer.
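The difference between the old chained path and the new direct path comes down to how many times each call gets rewritten. A purely illustrative hop-counting sketch (real translation layers do far more than rename calls):

```python
# Sketch: DX11 -> DX12 -> Vulkan stacks two translation layers per call,
# while a direct DX11 -> Vulkan path does one. Names are illustrative.

def translate(call, layers):
    """Each layer rewrites the call once; more layers = more per-call work."""
    hops = 0
    for src, dst in layers:
        assert call.startswith(src), f"layer expects a {src} call"
        call = call.replace(src, dst, 1)
        hops += 1
    return call, hops

old_path = [("dx11", "dx12"), ("dx12", "vulkan")]   # D3D11On12 then DX12->Vulkan
new_path = [("dx11", "vulkan")]                     # direct translation

print(translate("dx11:draw", old_path))   # ('vulkan:draw', 2)
print(translate("dx11:draw", new_path))   # ('vulkan:draw', 1)
```

Both paths produce the same final call; the direct path just pays the translation cost once instead of twice, which is the whole win being described.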
@Drischdaan 9 months ago
@@Gramini my problem with that: why would you want to do that in the first place? Such translation layers create unneeded overhead. If you ship for Windows, choose DirectX; if you ship for Linux, choose Vulkan or OpenGL. If you want to use Vulkan on Windows, just use it and don't introduce translation layers. Using the API directly is better in every case.
@loganbogan9 9 months ago
@@Drischdaan Well, you have to do that because developers use a translation layer (one Microsoft says to use for 2D elements) for their 3D AAA games. In the case of DXVK & VKD3D-Proton, you aren't actually adding a translation layer, but more so replacing the terribly optimized D3D11On12 wrapper with a very streamlined Vulkan implementation.
@MaxIronsThird 8 months ago
DX11 did the work for the developers. People like id Software chose Vulkan because they wanted to squeeze even more performance out of the hardware by going with a really low-level API. MS, seeing that, chose to make DX12 more similar to Vulkan. But guess what: not that many developers used Vulkan to begin with, and most of them don't bother to optimize that much, which is why DX11 performs so much better in so many games. Bring back DX11 for devs with no API knowledge.
@kreuner11 4 months ago
It doesn't have to be brought back, it's still available?
@MaxIronsThird 4 months ago
@@kreuner11 There is a lot of newer rendering tech that is not available on Dx11.
@tosemusername 4 months ago
@@MaxIronsThird Nah, MS is a for-profit company, and keeping redundant stuff around is usually left to the open-source community. I wish MS dropped Windows 11, but here we are. Another problem is that it gives in to a flawed approach to software engineering that ultimately results not in the sacrifice, but in the complete disregard, of quality, excellence and competence, simply because the responsibility can be pushed to someone else, even if that someone else doesn't exist. Wirth's Law already exemplifies this someone being the hardware engineers, but the end of Moore's law and Dennard scaling should say it all. Something will have to give, and I just hope we don't have to exhaust every other possible option before coming to the all-along obvious conclusion that the problem is us, the developers.
@Pavilion942 1 month ago
Just get a ps5
@MaxIronsThird 1 month ago
@@Pavilion942 wat
@jagersama9792 9 months ago
Low level APIs are not a problem, it is more difficult to get the same result but certainly have the potential to be much better although it needs to be in the right hands. We can see the potential with emulators that use Vulkan or even DXVK.
@or1on89 9 months ago
I think DX12 has been out for 8 years now. Microsoft keeps improving it and giving devs ways to optimize and gain control via hardware scheduling. Nvidia keeps relying on CPUs for graphics scheduling, and devs keep cutting corners under pressure from publishers... it's not a DX issue, it's an industry issue.
@arenzricodexd4409 9 months ago
That's why DirectX and OpenGL were created in the first place: the aim was to make things less complex in certain respects. DirectX 12 brings that complexity back.
@FrancisBurns 9 months ago
I am with you here, it is simply too reductionist to blame one API or the other like VEX is doing in the video.
@triadwarfare 9 months ago
@@arenzricodexd4409 some people like more complexity for finer control. It's like high-level vs low-level programming languages. High-level languages are easier to program in, check for errors in, and learn, but you're limited to what the language developers intended. Low-level languages are hard to learn, hard to code in, and you can break stuff without warning, but if you know what you're doing, you can do pretty powerful things with maximum efficiency, since you're working on the "bare metal" of the device, where performance isn't degraded by APIs and translation processes.
@arenzricodexd4409 9 months ago
@@triadwarfare both options should be open. Initially that's what MS intended to do: give DX12 to those who want more fine control and DX11 to those who don't want the complexity. Hence DX11 was still given some feature parity with DX12. But then MS decided to add new features to DX12 only.
@einarabelc5 9 months ago
No, it's a cultural issue.
@PotatMasterRace 9 months ago
DX12 and Vulkan are lower-level APIs and will be faster than DX11 when used right. The problem is the level of expertise and effort needed to use these tools optimally.
@Ribs351 9 months ago
0:59 minor correction here but API stands for Application Programming Interface. In the context of graphics, an API is basically a middle man between the game you're running and your GPU, it tells the GPU what the game wants to render.
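That "middle man" role can be sketched in a few lines of Python. This is a toy model; these classes don't correspond to any real D3D or Vulkan objects:

```python
# Toy model of a graphics API as the middle man between game and GPU:
# the game makes one friendly call, the API expands it into GPU commands.

class FakeGPU:
    """Stands in for the driver/hardware; only understands low-level commands."""
    def __init__(self):
        self.command_log = []

    def submit(self, command):
        self.command_log.append(command)

class GraphicsAPI:
    """The middle man: translates the game's intent into GPU commands."""
    def __init__(self, gpu):
        self.gpu = gpu

    def draw_mesh(self, mesh_name, triangle_count):
        # One high-level call from the game becomes several GPU commands.
        self.gpu.submit(f"BIND_VERTEX_BUFFER {mesh_name}")
        self.gpu.submit("SET_PIPELINE default")
        self.gpu.submit(f"DRAW {triangle_count * 3} vertices")

gpu = FakeGPU()
api = GraphicsAPI(gpu)
api.draw_mesh("player_model", triangle_count=1500)
print(gpu.command_log)   # the single draw_mesh call became 3 GPU commands
```

The difference between DX11 and DX12 is roughly how thick this middle layer is: DX11 expands and validates a lot on the game's behalf, DX12 makes the game spell most of it out itself.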
@bbbl67 9 months ago
The history of DX12 and Vulkan comes either directly or indirectly from AMD's Mantle. Mantle was a proof of concept by AMD that was also practical: some games and benchmarks using Mantle, like Ashes of the Singularity, came out. Mantle eventually got folded directly into Vulkan, and it inspired DX12. The whole concept was to take control of the fine-tuning details, the idea being to optimize at the lowest levels. Developers were apparently "crying out" for such a feature. But DX11 had a lot of features that actually made developing games much easier without going to the bare metal. So it seems the amount of optimization available in DX12 was just too much, and this level of control wasn't always needed.
@arenzricodexd4409 9 months ago
"Developers were apparently "crying out" for such feature." looking at how slow DX12 adoption i doubt developer really "crying out" for it. the one that really hope for things like this to happen on PC gaming is AMD. because that way they can pretty much reduce their graphic driver team to bare minimum and let game developer to do all the optimization job themselves. but they way how PC hardware have thousands of multiple config it was an impossible dream to begin with.
@bbbl67 9 months ago
@@arenzricodexd4409 Even back then, many game developers were investigating the bottlenecks. I'm sure some were saying that if they could get access to those bottleneck areas, they could clean that stuff up. I think the problem was that if you got access to all of the low-level stuff, you couldn't get access to all of the high-level stuff, or vice versa. If they had combined the high-level with the low-level, then people could probably have gotten used to the low-level stuff. That's why it took so long for developers to get up to speed on DX12.
@arenzricodexd4409 9 months ago
@@bbbl67 personally, I think the good thing about DX12/Vulkan is on the CPU side of things. But on the GPU side, they end up complicating things by giving developers access to work usually done by drivers. Ultimately, the IHV should know their GPU better than a game developer ever will. Plus, game developers are not going to have the time to cater to every architecture out there. That's why, to me, this low-level stuff has so far benefited things like emulators the most, where it's the developer's own pet project and they can take their time to implement and properly learn all the bells and whistles.
@Rexhunterj 8 months ago
At the time, the dev team behind BF3 and BF4 probably were crying out for the bare metal features of Mantle/Vulkan, but those devs stopped working there AFTER the launch of BF4 sadly...
@bbbl67 6 months ago
Yup, the low-level stuff was so different from the higher-level stuff, it was a new paradigm of programming. It surprised me that they didn't just give access to the low-level stuff while keeping access the same for the high-level stuff.
@neffix6976 9 months ago
Bring back the original Dishonored graphics, they look amazing and every toaster can run it
@freaklatino13 9 months ago
looks terrible even at 4k
@AhmadWahelsa 9 months ago
looks amazing even at 800x600, fun game and great art direction
@michaelmonstar4276 9 months ago
Bad idea for creativity. That's like making every game look like 'Mini Ninjas'. - Like what?... Oh, sorry: Like "Breath of the Wild", which wasn't even the first.
@Eduard0Nordestino 9 months ago
​@@freaklatino13u gai bro
@michaelmonstar4276 9 months ago
@@AhmadWahelsa It makes it "great art direction" when it's still unique. Shit shouldn't be copied exactly.
@darkassasin111 9 months ago
Vulkan coper here, just wanted to point out the enormous gains in 1% lows vs. DX12. In several of the RDR2 examples, the frametime graphs show big spikes in DX12 but not VK. 20+% improvement in the lows is no joke. I think you're much more likely to notice a few big stutters than ~5% average fps, so I use Vulkan/DXVK pretty much wherever possible.
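For readers unfamiliar with the metric: "1% lows" are the average FPS over the slowest 1% of frames, which is why a few big frametime spikes tank them while barely moving the average. A small sketch with made-up frametime data (not the video's actual captures):

```python
# Computing average FPS and 1% lows from a frametime capture (times in ms).
# The two runs below have similar averages, but one has stutter spikes.

def fps_stats(frametimes_ms):
    frametimes_ms = sorted(frametimes_ms)
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    # The slowest 1% of frames (at least one frame).
    worst = frametimes_ms[-max(1, len(frametimes_ms) // 100):]
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return round(avg_fps, 1), round(low_1pct_fps, 1)

smooth = [10.0] * 1000                 # steady 10 ms frames
spiky  = [9.5] * 990 + [60.0] * 10     # similar average, occasional 60 ms spikes

print(fps_stats(smooth))   # (100.0, 100.0)
print(fps_stats(spiky))    # average barely changes, 1% lows collapse
```

This is exactly the pattern the comment describes: two runs can post near-identical average FPS while one of them feels visibly stuttery, and only the 1% (and 0.1%) lows expose it.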
@xXSilentAgent47Xx 9 months ago
Except Vulkan shows more stutters than DX12. Everyone recommended switching to DX12 because Vulkan was causing big FPS and gameplay issues in RDR2. I couldn't even stand to look at the game running poorly on Vulkan. What's even worse is DLSS making the game worse with that ghosting on hair, trees and grass. Even the latest DLSS is still worse. Never turn on that trash called "DLSS".
@loganbogan9 9 months ago
​@@xXSilentAgent47XxSource?
@vaudou_ 9 months ago
Yeah I was looking at that part of the video. It's easy to see which version is going to feel better to play. 1% and 0.1% lows are everything.
@Bulletsforfree 9 months ago
Tales from the Ass. Vulkan runs way better in RDR2 and the very few other games that use it.
@xXSilentAgent47Xx 9 months ago
@@loganbogan9 Source? Play the game. It even has a VRAM issue that I had to get a mod to fix. I don't know what's up with all this Vulkan and DX12 stuff, but if GTA 5 runs smoothly then RDR2 should too (and GTA 5 only has DX11). The only problems in GTA 5 are too much grass, and interiors like the nightclub causing heavy FPS drops. But never turn on that trash called DLSS. At first it looks smooth, until you notice the motion blur and ghosting.
@jamaicankyng 9 months ago
Gotta hand it to you man, I've come across your videos before but... this topic, pacing and educational mixture got me to finally subscribe. Awesome stuff man.
@notarandom7 9 months ago
There's a reason most indie devs use D3D11 and OpenGL. It's just a lot easier! Most developers simply don't have the time to do more for a chance at more performance.
@Vincentsgm 9 months ago
What I hate about modern game development is that newer devs tend to use 4k textures for everything, even the smallest of assets that you'll never see.
@JamesJones-zt2yx 9 months ago
I can't help being a little amused by this, because at least at one time this is what makers of objects in Second Life were pointed out as doing. The standard complaint was "game developers optimize their work; you'd never see them doing this."
@justshitposting8411 9 months ago
Or the devs of Pokémon Sword and Shield copying and pasting every texture every time, leaving hundreds of copies of a grass texture in both memory and the game's folders instead of referencing the one texture for each use. The amount of engineering, trickery, and optimization that brought us the first Pokémon game was nuts. What happened?
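The duplication described here is easy to quantify. A sketch with illustrative sizes (treating each copy as an uncompressed 4-bytes-per-pixel 1024x1024 texture; the real game's formats and counts will differ):

```python
# Copy-per-use vs. deduplicated texture storage, with made-up sizes.
import hashlib

TEX_BYTES = 1024 * 1024 * 4            # 4 MiB per uncompressed copy
grass = b"grass-pixels" * 100          # stand-in for the actual pixel data

placements = [grass] * 300             # the same texture used in 300 places

# Naive: every placement carries its own full copy of the texture.
naive_vram = len(placements) * TEX_BYTES

# Deduplicated: store each unique texture once, keyed by a content hash,
# and have every placement reference the shared copy.
unique = {hashlib.sha256(t).hexdigest(): t for t in placements}
dedup_vram = len(unique) * TEX_BYTES

print(naive_vram // 2**20, "MiB vs", dedup_vram // 2**20, "MiB")  # 1200 MiB vs 4 MiB
```

Content-hash deduplication like this is a standard step in asset pipelines; skipping it is how identical textures end up multiplied across both disk and memory.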
@muramasa870 9 months ago
They think they are Skyrim modders or something smh😂
@mythydamashii9978 9 months ago
They should also reduce rendering of the far background, since no one is going to stare at mountains 24/7 in a zombie survival game.
@user-fg6mq3dg3d 9 months ago
What I hate about modern games is that using textures to fake high fidelity and 3D characteristics on an object should already be an outdated method, when Nanite can do all of that with actual polygons without using 5092350329502395 GB of VRAM.
@gamestar81 9 months ago
Graphics dev here. I'm gonna try to explain what's going on in less technical terms, so apologies if it seems a bit rough or I explain something poorly. Generally speaking, it's all about how these API calls are utilized by developers, in conjunction with how well documented the new features in these updates/versions are and how much was added to that documentation. DX11 vs DX12 is essentially opening a floodgate of freedom to the "bare metal" of the system, which causes a lot of development overhead: more setup may be needed to achieve the same result as before. Devs then need to spend more time going through the documentation, which takes away from development time, which causes them to rush, etc. Compounded by trying to fit in all the new shiny features people want to see, it leads to shortcutting and not fully understanding the tools they are working with. You would see a lot of the same if these studios simply switched to Vulkan, because then the developers would have to learn a whole new API, which is just more development overhead. There's more at play in this issue altogether, but that would be outside the scope of talking about the APIs themselves; a summary would be crunch and a lack of senior developers (studio depending).
@xalenthas 9 months ago
What blows my mind is the lack of foresight by the management teams in software houses. It's not like DX12 is some "fad" that'll be here today, gone tomorrow. Assigning real time for developers to truly dig into what's possible with the API, and then developing in-house tools to make full use of what's available, would mean development times ultimately get reduced in the long run and a far more consistent experience for both the team and consumers. The sheer lack of foresight and planning (you know, the stuff they, the management, supposedly get paid to do) is ridiculous, and of course it's then the developers' fault for poor implementations and never that of the "boss"...
@arthurbonds7200 9 months ago
This. DX11 had more abstraction and thus the driver does more of the heavy lifting compared to the CTM approach of DX12. DX12 allows for finer control, yet a lot more control also means a lot more places to mess up. In a perfect world, the baseline performance should always be DX11-level, not less.
@The_Noticer. 9 months ago
Woah, you mean if you're not using instanced drawcall commands in DX12 it doesn't actually improve performance? /sarcasm
@kaktusgoreng 9 months ago
So it's a matter of documentation and mastery then? Why not create a new role for Vulkan engineering or a DirectX 12 advisor and such? A specific department for low-level APIs.
@PinHeadSupliciumwtf 9 months ago
So with everything shareholders/suits are the issue?
@JoaoMPdS 8 months ago
To put it simply: it's like Windows 10 & 11. Windows 10 often performs better, but people still upgrade.
@ojonathan 9 months ago
I'll bring my “I'm a graphics developer” moment, but before this, I need to clarify two things: I'm actually a GPGPU developer, I don't do real-time graphics stuff in general, but I do a lot of computing, and second I do agree that games seems to be struggling way more on DX12 than DX11, but I'll get into that in a moment. First I need to bring up one thing: Cyberpunk 2077 and The Witcher 3 do not really use DX12, they use D3D11On12. D3D11On12 basically translates DX11 calls to DX12 at runtime, and gives you access to features that would otherwise not be accessible on DX11, like Ray Tracing and DLSS. The problem is that D3D11On12 IS NOT MEANT to be used in production, Microsoft themselves states this. It's not like DXVK and VKD3D which is being extensively optimized and developed. About games performing worse, besides CP2077 and TW3 that still uses DX11 behind the scenes and has a lot of overhead, it's because developers need to learn a lot of stuff that was all hidden from them on DX11, and this also means that they will make a lot of mistakes, they will mismanage memory, they will use suboptimal synchronization techniques, they will do a lot of wrong things because they still don't know how to do things properly, and that's expected, it's the same for Vulkan and Metal, which are low-level APIs. Developers cannot rely on the API Runtime and Drivers doing all the optimizations for them (drivers still can do, but not to the same extent), now that's their responsibility to do this and to use those APIs correctly, and this has a really steep learning curve, graphics programming and optimization in specific is a tough task. As for games like Doom running really well on Vulkan, it's not really about Vulkan (although I would love to see more games using it instead of DX12), but about the devs. 
id Software devs are damn good and love to test new tech and try new things. They were really into learning and implementing Vulkan, and they were not rushing because they needed to implement it in order to deliver those new fancy features like Ray Tracing; by the time id Software started working on their Vulkan renderer, Nvidia's 2000 series wasn't even a thing. So they were implementing this because they wanted to, not because they were forced to, which also means that they had a lot of time and way less pressure than the others.
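A rough sketch of the translation layering this comment describes (all class and method names here are invented for illustration; this is not the real D3D11On12 implementation):

```python
# Toy model of a D3D11On12-style translation layer: DX11-style immediate
# calls are re-expressed as lower-level DX12-style commands at runtime.
# Every name below is hypothetical, chosen only to illustrate the idea.

class DX12CommandList:
    def __init__(self):
        self.commands = []

    def record(self, op, *args):
        self.commands.append((op, args))

class D3D11On12Device:
    """Presents a DX11-like immediate API, translating to DX12 commands."""
    def __init__(self):
        self.cmdlist = DX12CommandList()
        self.translated_calls = 0

    def draw_indexed(self, index_count, start_index):
        # One DX11 call fans out into several lower-level DX12 commands,
        # plus bookkeeping (state validation, barriers) on every call.
        self.cmdlist.record("set_pipeline_state")
        self.cmdlist.record("resource_barrier")
        self.cmdlist.record("draw_indexed_instanced", index_count, 1, start_index)
        self.translated_calls += 1

device = D3D11On12Device()
for mesh in range(3):
    device.draw_indexed(index_count=36, start_index=0)
print(device.translated_calls, len(device.cmdlist.commands))
```

The point of the sketch is the call amplification: each immediate-style DX11 call has to be rebuilt as multiple DX12 commands plus bookkeeping at runtime, which is part of the overhead being described.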
@huntercz1226
@huntercz1226 7 ай бұрын
You got me thinking: why did vkd3d-proton (which translates DX12 to Vulkan) work with Witcher 3 even before D3D11On12 support in DXVK DXGI? Did they compile it statically? Also, Train Simulator is a similar case, but it uses D3D9On12 instead, and the DXVK devs just worked around it by using the D3D9 path. Kinda confusing, not going to lie.
@futuza
@futuza 4 ай бұрын
This guy is one of the few in the comments who actually knows what he's talking about. Ultimately the issue is not the APIs, but the lack of developer skill using them, and in turn the lack of skill of managers trying to find good devs and then manage, train, and use them correctly.
@ojonathan
@ojonathan 3 ай бұрын
@@huntercz1226 Sorry for the late reply, I really don't look at my YouTube notifications. VKD3D can run D3D11On12 games because D3D11On12 translates all calls to DX12, and VKD3D knows how to translate DX12 to Vulkan. DXVK support for D3D11On12 just replaces the D3D11On12 code with its own and interfaces with VKD3D to share information; that way DXVK handles DX11-to-Vulkan translation, and VKD3D handles DX12-to-Vulkan translation. The same goes for D3D9On12. This greatly reduces the overhead, but requires effort on two fronts, DXVK and VKD3D, and both implementations need to share resources, which adds some complexity; that's why it took some time. Because D3D11On12 is just a library that implements D3D11 calls, one can simply hook in their own implementation and override the D3D11On12 one (which is very easy under Wine).
@ShinichiKudoQatnip
@ShinichiKudoQatnip 9 ай бұрын
Well, we remember the time when Vulkan and DX12 brought in asynchronous compute, which boosted performance in games to almost double compared to DX11 modes, especially on AMD's Polaris GPUs.
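A toy timing model of what async compute buys (the numbers are hypothetical, purely illustrative): if the graphics and compute workloads in a frame are independent, a GPU with async queues can overlap them, so frame time approaches the maximum of the two instead of their sum.

```python
# Back-of-the-envelope model of async compute. Assumes the two workloads
# are fully independent and the GPU can overlap them perfectly, which is
# the best case; real gains are smaller.

def serial_frame_time(graphics_ms, compute_ms):
    # Without async compute: workloads run one after the other.
    return graphics_ms + compute_ms

def async_frame_time(graphics_ms, compute_ms):
    # With perfect overlap: frame time is bounded by the longer workload.
    return max(graphics_ms, compute_ms)

g, c = 9.0, 5.0  # made-up per-frame costs in milliseconds
print(serial_frame_time(g, c))  # 14.0
print(async_frame_time(g, c))   # 9.0
```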
@Juanguar
@Juanguar 9 ай бұрын
Only for AMD to ditch async after Polaris
@enricod.7198
@enricod.7198 9 ай бұрын
​@@Juanguarjust because devs didn't use it and that was because nvidia was shit at it lmao
@Ph42oN
@Ph42oN 9 ай бұрын
​@@Juanguar Halo infinite does have async compute, and that game runs like shit on polaris, and if i remember right async compute caused performance loss on my old RX 480.
@Psychx_
@Psychx_ 9 ай бұрын
@@Juanguar What are you talking about? Async compute is still supported and usable in all AMD graphics cards.
@niks660097
@niks660097 9 ай бұрын
@@Juanguar its not "ditched", almost all games post 2018 heavily use async compute to run compute shaders, there is too much compute going on and without async compute, even the most high end GPU will grind to halt, on vulkan its always on by default, nvidia just caught on after 10 series..
@eliadbu
@eliadbu 9 ай бұрын
Let us face the truth: most developers can't optimize their games well, or at least not within the time frame for releasing them. So what we get is an unfinished game more fitting for a beta release than a full release, which the devs then finish over the following months/years. This phenomenon has become so common that it seems like the norm rather than the outlier. DX12 is not inherently awful, but since it gives more control to the game developer, it also requires them to do a lot of things themselves.
@grumpyoldwizard
@grumpyoldwizard 3 ай бұрын
This is a great channel. Subbed. Thank you.
@WSS_the_OG
@WSS_the_OG 9 ай бұрын
It's tough to pin the blame on any specific API. I think the main issue is the time constraints studio managers and publishers put on developers to meet launch-date targets. Instead of having the time to tweak and optimise games (textures being the most obvious recent example), they're busy making objects to stuff into their microtransaction stores and maintaining their "live services." When devs are busy trying to fix a game that's broken on release (probably due to unrealistic time constraints), they obviously have zero bandwidth for making their code more elegant and efficient. DX12 just happens to co-exist with all of these industry trends developing simultaneously with its rise. On paper, at least, DX12 provides many more levers for devs to pull (vs DX11) for better performance and optimisation; I just don't think they're given enough time to do that.
@kira.herself
@kira.herself 4 ай бұрын
the api itself and how things are scheduled are great, it's merely a skill or time issue to get things right
@KaidenBird
@KaidenBird 4 ай бұрын
@@kira.herself 'tis a skill issue
@futuza
@futuza 4 ай бұрын
@@KaidenBird partially, it's a skill issue, but it's also a company culture issue. The big managers of these AAA studios, want to squeeze maximum profit out of their games, so that means paying their developers as little as possible, to the point where they can get away with less skilled developers as long as their games still sell. If they wanted better devs, with better skills, they need to offer more competitive salaries, benefits, and work environment. The best devs will go wherever they're offered the most and most of the time that isn't the game industry. So yes a skill issue, but mostly with management, rather than the actually developers being the root cause.
@KaidenBird
@KaidenBird 4 ай бұрын
@@futuza very true. Vulkan is currently kicking my ass. DX12 was substantially easier, and I had more fun optimizing for DX12 than vulkan :)
@OneAngrehCat
@OneAngrehCat 9 ай бұрын
Application Programming Interface = API. Not "Advanced". Just FYI
@vextakes
@vextakes 9 ай бұрын
My b!
@FOUR22
@FOUR22 9 ай бұрын
Thank fk I am no longer on my 2060 😂
@OneAngrehCat
@OneAngrehCat 9 ай бұрын
@@vextakes It's cool, I just had a brain freeze moment and wondered if I had somehow forgotten my job
@diablosv36
@diablosv36 9 ай бұрын
It's not that DX11 is faster; it's all the optimisation done in the GPU driver for DX11 that's outdoing the dev optimisations in the other APIs. Having said that, you'll still see DX12 and Vulkan being faster in some scenes in these games due to some of the bottlenecks.
@guitaristkuro8898
@guitaristkuro8898 9 ай бұрын
Vulkan is simply the vastly superior API that is underutilized. It will be the future of gaming, more so than DX12. Nvidia has shown heavy adoption of the API in its Next-Gen technology implementations which involves constantly updating its library of calls and functions. Combine this with Nvidia NSight and developers get a debug view of their game purely focused on performance where they can click on something and see that it needs to be optimized and what calls they can implement. Along with video guides on crucial pipeline additions to speed up rendering implementations.
@smarterthanall1659
@smarterthanall1659 9 ай бұрын
@@guitaristkuro8898 Honestly irrelevant DX will always be the leader.
@shelletonianhuman
@shelletonianhuman 9 ай бұрын
​@@smarterthanall1659Where are the facts that support this?
@smlgd
@smlgd 9 ай бұрын
@@guitaristkuro8898 It isn't vastly superior. Having used both, they're pretty much (mostly) the same. Vulkan's biggest advantage is its portability. It is also its biggest drawback because it has to have a big scope in order to work on everything, so it ends up packed with features that may mislead developers into false optimizations and makes it harder for driver developers to implement. I'm not even talking about Linux stuff, more like Android focused stuff like subpasses that are essential to GPUs that use tiling (mobile GPUs) but actually hurt performance if you use them in desktop GPUs. All in all they're just as good as the developer that uses them and the driver that implements them (and Vulkan is significantly less performant in Nvidia cards)
@Blurredborderlines
@Blurredborderlines 9 ай бұрын
Dx12 could be 1000% faster but if it’s also allocating 1000% more resources are the results actually practical for the common use? You can argue all you want that it’s “more efficient” or that it uses said resources better - it’s been almost a decade and we still have the same or even worse performance than with dx11. Partitioning 75-100% of VRAM while only accessing about 50% of it is neither efficient nor does it adequately use resources at hand - at worst this would be called “bloatware” if it was related to CPU performance.
@Zumito
@Zumito 8 ай бұрын
I think I discovered the problem: when you use DirectX 12 Ultimate or Vulkan (the newer versions), these bring CPU access to the VRAM, and it's like "standby memory" in RAM, but for VRAM.
@jcm2606
@jcm2606 4 ай бұрын
That's not the issue. It's true that DX12 and Vulkan grant access to CPU-accessible memory (host visible, host coherent and host cached, to be specific) but that's an opt-in thing that you choose when allocating memory (you'll want device local if you're frequently accessing the memory from the GPU). The reason why performance tends to be worse with DX12 and Vulkan is purely because the driver is doing less bookkeeping for you. With DX11 and OpenGL the driver would be responsible for deciding where to allocate memory, when to share memory between resources, how to synchronise different memory accesses and how to order work optimally, by both analysing the commands you've given and factoring usage hints into the analysis to determine what you intend on doing, so that it can figure out how to do that thing optimally. With DX12 and Vulkan almost all of that responsibility is shifted onto you, the developer, so now you are the one responsible for all of this, on top of the major paradigm shifts introduced, such as the movement from an immediate mode to a retained mode API and significant changes to how resource descriptors work. It's like taking somebody proficient in C# and telling them to make something in C: they'd obviously need time to adapt, which they're currently not being given in the vast majority of development studios.
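The opt-in memory-type choice described above can be sketched roughly like this (the flag names mirror Vulkan's memory property flags, but the heap layout and all values are invented for illustration):

```python
# Sketch of explicit memory-type selection, the kind of decision the driver
# used to make for you under DX11/OpenGL. The discrete-GPU-like layout below
# is hypothetical: [VRAM, upload heap, readback heap].

DEVICE_LOCAL  = 0x1  # fast for GPU access (VRAM)
HOST_VISIBLE  = 0x2  # CPU can map it
HOST_COHERENT = 0x4  # no explicit flush needed
HOST_CACHED   = 0x8  # fast for CPU reads

MEMORY_TYPES = [
    DEVICE_LOCAL,                                  # type 0: VRAM
    HOST_VISIBLE | HOST_COHERENT,                  # type 1: upload
    HOST_VISIBLE | HOST_COHERENT | HOST_CACHED,    # type 2: readback
]

def find_memory_type(required_flags):
    """Return the first memory type index with all required flags set."""
    for i, flags in enumerate(MEMORY_TYPES):
        if flags & required_flags == required_flags:
            return i
    raise RuntimeError("no suitable memory type")

# GPU-frequent resources want DEVICE_LOCAL; CPU->GPU staging wants HOST_VISIBLE.
print(find_memory_type(DEVICE_LOCAL))                # 0
print(find_memory_type(HOST_VISIBLE))                # 1
print(find_memory_type(HOST_VISIBLE | HOST_CACHED))  # 2
```

Picking the wrong type (e.g. a host-visible heap for a GPU-hot resource) is exactly the kind of mistake that silently costs performance under the new APIs.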
@kklljj70
@kklljj70 9 ай бұрын
Good informal video bro keep em up :) ILY
@dragojess
@dragojess 9 ай бұрын
DX12 itself is actually incredibly optimized, faster than DX11 by default. My own game is on it, and easily hits a stable 120fps on an integrated GPU. It's more of a problem with the way modern games use the extra headroom and features of DX12 to push both the CPU and GPU harder. And speaking of Vulkan: from my experience, while DX12 generally has more stable fps, Vulkan is better in terms of raw speed; it generally has slightly higher max and min framerates.
@enricod.7198
@enricod.7198 9 ай бұрын
Also, I'm tired to ask this, where are mesh shaders and sampler feedback streaming? The first should be free fps and the latter should be like free vram reduction according to microsoft. Yet, basically no studio uses it and crappy companies like epic don't bother to implement those into unreal engine (also, they still have 4 threads cpu util by default)
@SL4PSH0CK
@SL4PSH0CK 9 ай бұрын
There's a lot of visual eye candy; it often took me a print-screen comparison to tell the difference in the game. What I noticed most was texture usage. In MHW, as a 3GB VRAM user, it seems like a bug, since the textures look blurry but the game runs better.
@asdion
@asdion 9 ай бұрын
DX12 being optimized is irrelevant because of the nature of shifting responsibilities. It's like saying a DIY 3D printer is better than a prebuilt one while giving the DIY kit to an overworked, underpaid employee who will be fired if he dares to overstep the deadline. DX12 should never have been this hands-off; it should have offered the same abstraction as DX11 while giving devs the option to go deeper.
@floorgang420
@floorgang420 9 ай бұрын
@@enricod.7198 Ray tracing won and VR lost. Mesh shaders were originally created for VR, to render only what your eye can actually see, so to speak. RT is the opposite: it renders off-screen geometry into the current scene. With RT shadows, for example, the engine needs the model, structure and alpha of a tree to render its shadow into your view.
@erikhendrickson59
@erikhendrickson59 9 ай бұрын
When I think of Vulkan, which was formerly AMD Mantle, *_CPU MULTITHREADING_* is what comes to mind first. AMD & DICE were basically a full decade ahead of their time. I remember when BF4 released and made my 4790K beg for mercy, because that game would *_gladly_* push all 4 cores/8 threads to 99% when playing large-scale multiplayer maps. In fact, I remember using BF4 as my primary system stress tester, because it would simultaneously push both my CPU & GPU to 99% usage while utilizing a boatload of DRAM as well, and would identify instability better than just about any synthetic stress test at the time. Even to this day, a full decade later, I would be hard-pressed to name a game engine with better multithreaded performance than Frostbite.
@v01d_r34l1ty
@v01d_r34l1ty 9 ай бұрын
Issue is likely porting to DX12 vs. developing around it. Same issue with Vulkan and OpenGL. Vulkan released an article saying you should develop around Vulkan rather than developing around existing OpenGL if you actually want performance gains.
@jcm2606
@jcm2606 9 ай бұрын
OpenGL vs Vulkan is at least a little closer than DX11 vs DX12 since there's a number of OpenGL extensions that make it into Vulkan-lite. Bindless textures, direct state access, multi-draw indirect, NVIDIA's command lists extension, etc. Still a far cry from Vulkan but not as bad as the jump from DX11 to DX12.
@Rexhunterj
@Rexhunterj 8 ай бұрын
@@jcm2606 Those are some of my favourite OpenGL extensions, the best additions to the API since its inception, IMO.
@ahmedameurkh1809
@ahmedameurkh1809 4 ай бұрын
What do you use to show the stats overlay at the top? Btw, cool video!
@shakkaka12435924
@shakkaka12435924 9 ай бұрын
Good info and all but damn ! Slappin with that music at the end ! Damn bro lets go
@pr0fessoro
@pr0fessoro 9 ай бұрын
I have a friend who is a programmer, and many years ago he told me "you can't make games with two left hands". This was in 1997-1998 (when we played Duke Nukem 3D, Doom, Heretic, Half-Life...). Back then most computers had Pentium 75-100 processors and video cards with 1-4MB of RAM. A lot of optimization was needed if you wanted to sell your game; the more machines it ran on, the better. Now programmers are lazy: "I don't optimize the game, whoever wants to play should get a powerful computer..."
@seedbarrett
@seedbarrett 9 ай бұрын
@@sadmoneysoulja Bro, every dev has to optimize when they can. It's not magic; it's mostly logic and math. Learn some basic algorithms and you'll be fine. I'm a dev myself, and the other day I did a HUGE optimization on my software. How? Just using the same path-tracing logic as DOOM.
@tarnishedpose
@tarnishedpose 9 ай бұрын
@@sadmoneysoulja Can't you see how pathetic of a defeatist mentality that is? You haven't even started your engine and you're already complaining that you'll lose the race because you THINK the crowd does not know what it takes to drive a car. That's like a politician justifying their incompetence by saying "people don't know what it's like to be a politician". No, we don't. That's why we're civilians. But we don't need to read 20 books on economy and socio-politics to tell that something is going wrong with a country. This is quite simple. Most of us know what the purpose of "OPTIMIZATION" is on a conceptual level. But if you are a developer, then it is YOUR job (not mine) to have an in-depth understanding of the implications of "OPTIMIZATION" on a technical level, so as to be able to put them into practice and actually optimize your games properly to begin with. But it is funny that you mentioned you're just an enthusiast. I mean, it really is pathetic. You're not even an actual developer, but you're pulling out some sort of reverse appeal to authority by questioning (outright undermining) the common folk's knowledge on the subject, based on the to-be-proven "fact" that they're not developers themselves (or even enthusiasts... like yourself lol)? Like seriously, man. Did no one in your environment ever teach you common sense?
@moonman2051
@moonman2051 9 ай бұрын
​@@sadmoneysoulja I like that game you made, that... uh.... ah, yes, nobody knows what games you made because they're probably shit anyway.
@cz5836
@cz5836 9 ай бұрын
Easy to call them lazy now but keep in mind games had terrible graphics back in 97/98 compared to now so optimization is probably a lot more complicated now.
@HunterTracks
@HunterTracks 9 ай бұрын
It's hard to call devs lazy when the gaming industry is notorious for crunch and massive amounts of overworking. We'd get much better games with much better optimizations if the industry had learned to slow the fuck down.
@longjohn526
@longjohn526 9 ай бұрын
The problem isn't the DX12 API; in fact, when properly programmed it is faster than DX11, because 1. it multicore/multithreads better and 2. the shader pipeline is no longer essentially FIFO but can do parallel processing as well as take out-of-order instructions. Key words are PROPERLY PROGRAMMED. Take Witcher 3 Next Gen: the reason it performs poorly in DX12 compared to DX11 is that the CPU threading isn't properly programmed in DX12. With DX11 the API takes control of multithreading, BUT is limited to 4 cores with the rest just coasting. In DX12 it's up to the game engine and programmers to properly set up multithreading, but it can handle pretty much unlimited threads. If Witcher 3 multithreaded correctly in DX12, instead of trying to do everything on 1 or 2 threads/cores, the game's performance would shoot right up.
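A minimal sketch of the parallel command-recording pattern this comment describes (names are illustrative; in real DX12 each thread would record into its own ID3D12GraphicsCommandList before a single ExecuteCommandLists call on the queue):

```python
# Each worker records its own command list independently, with no shared
# mutable state, so recording needs no locks; the main thread then submits
# everything at one point. This is the part DX11 did for you (on up to a
# few cores) and DX12 leaves to the engine.

from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_calls):
    # One thread's worth of recording: turn draws into commands.
    return [("draw", d) for d in draw_calls]

def record_frame_parallel(all_draws, workers=4):
    # Split the frame's draws across workers (round-robin here).
    chunks = [all_draws[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        lists = list(pool.map(record_command_list, chunks))
    # Single submission point, analogous to ExecuteCommandLists.
    return [cmd for cl in lists for cmd in cl]

frame = record_frame_parallel(list(range(1000)))
print(len(frame))  # 1000
```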
@Blurredborderlines
@Blurredborderlines 9 ай бұрын
W3 needs to be completely reworked to operate correctly in a new API that it wasn’t designed for and this is your argument for why it’s better? This sounds like cope to me - the performance figures speak for themselves, Dx12 is a complete resource dump that doesn’t even use the VRAM it allocated while pushing the GPU against its own ceiling in order to operate WORSE than with an older API.
@malatindez9788
@malatindez9788 7 ай бұрын
Multithreading isn't necessarily a magic solution to anything and it SHOULDN'T be seen as a universal panacea. More threads won't always help, since render threads in games are primarily limited by RAM latency and CPU cache size. In fact, more threads might make things *slower* by increasing cache misses as the threads fight over what memory to access. Moreover, PCIe bandwidth isn't limitless; often that is the bottleneck, and multiple threads will only contend for it. What's great about DX12, and what you probably meant, is command lists. They work completely fine single-threaded, but they are a powerful tool for harnessing the capabilities of modern GPUs. Basically, instead of telling the GPU to do stuff step by step, you send a pre-compiled list of commands, which can even be cached. This means that even if the CPU hasn't yet formed a complete command list, a new frame can be rendered with just a few changed commands (such as view matrix updates) while the CPU gathers the necessary changes.
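A toy version of the record-once/replay-with-updates idea (hypothetical names, not real API calls):

```python
# Record an expensive command stream once, then replay it every frame,
# patching only the cheap per-frame data (here, a "view" constant). The
# CommandList class and command names are invented for illustration.

class CommandList:
    def __init__(self, commands):
        self.commands = commands  # recorded once, treated as immutable

    def replay(self, per_frame_constants):
        # Only the per-frame constants change between replays; the rest
        # of the recorded stream is reused verbatim.
        return [(op, per_frame_constants if op == "bind_view" else arg)
                for op, arg in self.commands]

recorded = CommandList([("bind_view", None), ("draw_scene", 5000)])
frame1 = recorded.replay({"view": "camera_at_origin"})
frame2 = recorded.replay({"view": "camera_moved"})
print(frame1[0][1], frame2[0][1])
```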
@picklechin2716
@picklechin2716 9 ай бұрын
I have an I5 6500, RX480 8gb and 16gb of 2133mhz ram. I find that in Rainbow 6 siege, DX is a slide show, while Vulkan runs the game at 144 fps(with stutters and other problems).
@deadfry42
@deadfry42 4 ай бұрын
as for porting Direct X games to other platforms, there is a tool called DXVK that translates DX api calls into Vulkan calls on the fly, and I’ve heard it results in higher frame rates, but I haven’t done the testing to prove that.
@PalladinPoker
@PalladinPoker 4 ай бұрын
On games that run DX8-10 (or older) it usually gives massive gains, DX11 is hit and miss and expect a performance drop on DX12 in most cases.
@nbohr1more917
@nbohr1more917 9 ай бұрын
The easiest way to think of it is like this: "Using DX12 \ Vulkan is like writing your own graphics drivers." Hardware manufacturers love it since they don't need to hire as many driver optimization experts but game studios are feeling the pain because the optimization task has been shifted to them.
@arenzricodexd4409
@arenzricodexd4409 9 ай бұрын
more like AMD likes it because they can have less people on driver team. still remember a few years ago when phoronix came with a news that AMD end up firing more people in their linux division once their open source driver end up being more established. on nvidia side they got the money so they can afford to do their own optimization. after all nvidia engineers should knows their hardware better than game developer that never work on the hardware development.
@CyrilCommando
@CyrilCommando 9 ай бұрын
Isn't that just a straight negative? Why should people want to "write their own drivers" when an API exists? We are seeing the result of this backwards decision in play here in this video. When things get more complex, it gets harder to program for, harder to use, even if the theoretical ceiling is higher, it doesn't matter jack shit because it's 60x harder to use and therefore no one can use it. As a programmer who's gotten into the space just in the last ~4 years, I see this ALL over the fucking industry. With frameworks & complicated design models where there could just be a simple top down procedural script. What argument is there for theoretical benefits when they are too hard to make use of practically?
@nbohr1more917
@nbohr1more917 9 ай бұрын
@@CyrilCommando yes this all seems like a downgrade. AMD pushed for it since OpenGL is so closely mapped to Nvidia hardware that it was hard to get developers to create games that ran equally well on their chips. I think they made a deal with the devil and basically told Nvidia that they would make some concessions by not directly competing in some performance segments in exchange for Nvidia adopting Vulkan and DX12 standards. Nvidia probably agreed because they too wanted to fire those expensive driver authors.
@jcm2606
@jcm2606 9 ай бұрын
@@CyrilCommando Because the older APIs were clunky, outdated and had inherent flaws with how they approached certain things, OpenGL especially. The core problem is that to optimise a game's usage of the GPU, the driver has to know how the game is using the GPU in the first place. Assuming _total_ driver optimisation (ie NVIDIA or AMD literally runs the game internally and records commands being sent to the GPU to base their driver optimisations off of) isn't taking place then the driver is essentially having to do this on-the-fly as its ingesting commands for the GPU, with some hints from the developer. Essentially imagine that the driver takes in maybe a fifth of the frame, analyses it to see what the game is doing, builds out "optimisation strategies" for lack of a better term, builds _new_ commands that take said "strategies" into account, then sends those new commands off to the GPU before moving on to the next fifth of the frame. Frame finished, move onto next frame, driver maybe uses what it learned from previous frame to skip some work in the current frame. Except now the game is doing something new, it's rendering objects that weren't included in the past frame. Uh oh, have to re-analyse it again! This not only limits the optimisations that the driver is capable of doing, it also adds a *lot* of baggage onto the driver which eats into precious CPU resources. Hence there have been multiple attempts to streamline this and move some of this onto the developer (I won't link anything since there's a lot, but look up Approaching Zero Driver Overhead if you want to know the details) which culminated into Mantle and eventually DirectX 12 and Vulkan. The idea with these newer APIs is that the developer is in a *far* better position to make these optimisations since they should know what's in each frame, so the API will instead give the developer the tools to make these optimisations themselves. 
Where with older APIs the driver would take care of command recording and submission, resource allocation and management, synchronisation and such, now newer APIs require the developer to take care of these themselves. This _does_ significantly increase the amount of work and responsibilities that the developer has to take care of, but it _also_ significantly increases the optimisation opportunities that are available to the game. Older APIs may not have recognised that a certain set of commands are repeated and can essentially be prerecorded and reused multiple times, but developers should ideally recognise this as they're designing their games rendering pipeline and can build that into the engine. Older APIs may not have recognised that Task A and Task B can both overlap since they don't depend on each other, but developers ideally should, etc. Initially this _was_ a royal pain in the ass with how verbose the APIs were, but nowadays there are various strategies that have been adopted by the industry to streamline things. Modern game engines have moved over to "render graph" designs where the entire frame is laid out in a sort of "flow chart" type system, allowing the engine to know exactly what should be running when. Libraries have been built up around the more annoying aspects of the APIs such as Vulkan Memory Allocator which makes memory allocation considerably easier for the developer. APIs have changed to relax some of their verbosity such as Vulkan's introduction of dynamic rendering and, recently, shader objects which makes modern game engines considerably easier to write.
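The render-graph design mentioned above can be sketched in a few lines: the developer declares passes and their dependencies up front, and the engine derives a valid execution order from that declaration (a real engine would also derive barriers and overlap opportunities from the same data; the pass names here are invented):

```python
# Minimal render-graph sketch: passes map to their predecessor passes,
# and a topological sort yields an order where every dependency runs
# before its dependents.

from graphlib import TopologicalSorter

passes = {
    "shadow_map": [],
    "gbuffer": [],
    "lighting": ["gbuffer", "shadow_map"],  # needs both inputs ready
    "post_fx": ["lighting"],
}

order = list(TopologicalSorter(passes).static_order())
print(order)
```

Because the whole frame is declared ahead of time, the engine knows (rather than guesses) what each pass reads and writes, which is exactly the information the old driver-side analysis had to reverse-engineer on the fly.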
@kira.herself
@kira.herself 4 ай бұрын
thats bs and entirely not true
@frozby5973
@frozby5973 9 ай бұрын
When comparing DX12 to Vulkan, it's hard to say, because games that let you switch between DX and Vulkan are usually the ones that don't have the APIs implemented well. It's even said in a couple of Vulkan learning resources that in order to use Vulkan well you need to structure your application around it; abstracting Vulkan (like in the games that do switch between APIs) gives you nothing more than a more verbose version of OpenGL... which in the end is probably sometimes the goal: to have the simplicity of OpenGL/DX11 with new features like RTX and DLSS.
@CMak3r
@CMak3r 9 ай бұрын
DX12 gives more control over hardware to developers, but some of them don't yet know how to utilize it properly. The thing with real-time shader-compilation stuttering is that game shaders are compiled at runtime. It's possible to precompile all shaders before loading the game; they can be delivered through Steam services, or compiled while loading levels. This is for developers to decide. I hope that with time more developers will be adequately qualified to work with the DX12 API.
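The precompile-during-loading idea above can be sketched like this (all names are hypothetical; a real engine would key the cache on the full pipeline state and persist compiled binaries to disk):

```python
# Shader cache sketch: compile every known pipeline variant while the
# level loads, so the draw path only does cheap dictionary lookups instead
# of a slow mid-gameplay compile (the source of stutter).

class ShaderCache:
    def __init__(self):
        self._cache = {}
        self.runtime_compiles = 0  # the hitches we want to avoid

    def _compile(self, key):
        return f"binary({key})"    # stands in for a slow driver compile

    def precompile(self, keys):
        # Called during a loading screen, where stalls are acceptable.
        for k in keys:
            self._cache[k] = self._compile(k)

    def get(self, key):
        if key not in self._cache:  # fallback: compile mid-frame (stutter!)
            self.runtime_compiles += 1
            self._cache[key] = self._compile(key)
        return self._cache[key]

cache = ShaderCache()
cache.precompile(["opaque", "skinned", "transparent"])  # loading screen
for key in ["opaque", "skinned", "opaque", "transparent"]:
    cache.get(key)                                      # during gameplay
print(cache.runtime_compiles)  # 0
```

The hard part in practice is knowing the full set of variants ahead of time; any variant missed during precompilation falls through to the stutter-causing runtime path.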
@Ryukaschien
@Ryukaschien 9 ай бұрын
As I watched this video, while the information was all the more worthwhile, I started to get distracted (or rather attracted) to the possible Pokemon Gym music in the background... or was it just me? Lol. Good use of it!
@joey_f4ke238
@joey_f4ke238 9 ай бұрын
Long story short, developers suck, and I say this as one myself. Bad practices are everywhere, and the bigger the team, the more inconsistencies in development, shortcuts, or just unoptimized programming. Some of this comes from tight release schedules as well, and it has been proven time and time again that every game has the potential to even double performance given time: like the guy here on YouTube who made a modified Mario 64 version for the GameCube and improved performance to the moon by himself. Now, the problem with DX12 and Vulkan is that the more direct hardware access you give a developer, the more responsible they are for properly managing resources. This happens with regular programming languages as well: low-level programming always needs the developer to be very careful with how they handle resources, while high-level languages and APIs kind of hold your hand in that regard.
@Rexhunterj
@Rexhunterj 8 ай бұрын
The difference between that Mario 64 mod and the mods I make for Carnivores (1998) Dinosaur Hunter on PC is that those games were made before our current understanding of code optimisation and of particular software-dev and game-dev techniques existed. I can shit all over the performance of the world renderer in Carnivores today, because a 1024x1024-cell terrain (each cell is 1 meter square) is simple stuff, and we have since discovered methods of optimising the processing of a terrain mesh/tree.
@user-cg2gk1yw7w
@user-cg2gk1yw7w 9 ай бұрын
To be fair there's even plenty of DX11 features that devs didn't use/aren't using properly. Which is a shame. Any API could do a lot more if properly understood by devs.
@tik2368
@tik2368 8 ай бұрын
Two other important things worth mentioning. First, DX11 largely left optimization to drivers and graphics vendors, so Nvidia and AMD performance improvements came through a wealth of driver optimizations, whereas DX12 puts control fully in developers' hands. This means that when the time is put in, a game can utilize resources more efficiently depending on what it needs, but it's a double-edged sword: if the time isn't put in, it can end up harming performance beyond the one-size-fits-all approach of DX11. It's a complicated API that requires dedicated work to optimize properly, but the PC port is often the least-prioritized version, since it has the smallest audience yet requires the most work. Second, VRS is a good example of how DX12 requires real work, since it has to be carefully balanced against game visuals, or it can blur things improperly without gaining back much performance. Upscalers, on the other hand, are much less hands-on, with few resources required to make them work properly, seeing as most games can easily be modded with support for things like DLSS regardless of official support, as long as they run on DX12.
@josephdias3968
@josephdias3968 8 ай бұрын
I think the biggest problem is the D3D11On12 conversion wrapper vs. straight DX12. Vulkan is similar, but tends to be the better API when it comes to conversion layers.
@olixrr
@olixrr 9 ай бұрын
I believe that a lot of the performance hit also has to do with DRM in the background. I tried a (safe) cracked version of Dead Island 2, which is a DX12 title and features FSR2 upscaling and Variable Rate Shading, which I haven't seen used before. The pirated version used most of my GPU and little of my CPU while averaging 170-200 FPS. Once I bought the game through Epic, I noticed a huge FPS drop, now running at 100-150 and utilizing more of my CPU. It might be interesting delving into what the DRM is doing to cause such a performance hit. I know Dead Island 2 specifically uses Denuvo v18. And my specs are a 12600K OC (P cores @ 5GHz, E cores @ 4.2GHz) with 3600 DDR4 quad channel and a 3070 Gaming X Trio flashed with a Suprim X vBIOS for +50W, which helps for a small 10-15% FPS increase in titles.
@csgosh-tter
@csgosh-tter 9 ай бұрын
literal pirates make games run better than developers.
@loganbogan9
@loganbogan9 9 ай бұрын
I don't believe this has anything to do with APIs really. If a game uses DRM and has a DX11 and DX12 mode both will run the DRM. You are right that DRM causes performance loss, but this is the one performance penalty that shouldn't be blamed on DX12
@Psychx_
@Psychx_ 9 ай бұрын
Denuvo has been said to hurt CPU performance for years at this point.
@lilpain1997
@lilpain1997 9 ай бұрын
There have been multiple vids showing that DRM sometimes does and sometimes doesn't affect performance at all. Just go look them up. Some games get hit hard, others don't.
@loganbogan9
@loganbogan9 9 ай бұрын
​@@lilpain1997 Just like APIs, it's all about implementation lmao. If only developers were given time to port their games to PC.
@PeninsulaCity2024
@PeninsulaCity2024 9 ай бұрын
At some point I'll be playing an older game with "last gen" graphics, and I can't help but notice that it still looks as good as, or even better than, newer games, with a lot less stress on the system. It makes me wonder why we need to push current hardware to the extreme just to get, in my opinion, a hardly noticeable difference in graphics other than ray tracing and 4K textures, which may or may not be worth it.
@cattywampusq
@cattywampusq 9 ай бұрын
I love this content and this discussion. I did want to bring up the difference Vulkan makes in RDR2: the 1% and 0.1% lows are much higher than on DX12, meaning a smoother experience overall. The advantages of Vulkan aren't always in the high framerates.
@SplitScreamOFFICIAL
@SplitScreamOFFICIAL 9 ай бұрын
Are we talking about driver overhead, or is it just that devs' optimization techniques are specific to DX11 while they're still inexperienced with DX12? I want to see Gamers Nexus cover this and find the actual cause and results.
@tomars3645
@tomars3645 9 ай бұрын
Gamers Nexus isn't a low-level developer, or even a developer at all.
@biglittleboy9827
@biglittleboy9827 9 ай бұрын
The first game using DX12 came out in 2015: it was Ashes of the Singularity. That's almost 8 and a half years. I wouldn't say DX12 is still inexperienced.
@PadaV4
@PadaV4 9 ай бұрын
DX12 launched 8 years ago; at this point either the devs have a skill issue or the API itself is trash.
@Psychx_
@Psychx_ 9 ай бұрын
Optimizations that were previously done in the driver now have to be done by the game devs. Stuff like batching draw calls, overlapping work that can happen in parallel, memory management, … Having a lot of control also comes with increased responsibilities.
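The batching Psychx_ mentions can be caricatured with a toy model (plain Python with made-up pipeline and mesh names, not real graphics code): submitting draws grouped by pipeline state costs far fewer state changes than submitting them in arbitrary order, which is exactly the kind of work a DX11 driver could quietly do for you and a DX12 renderer has to do itself.

```python
def state_changes(draws):
    """Count how many pipeline-state binds are needed to submit draws in order."""
    changes, current = 0, None
    for pipeline, _mesh in draws:
        if pipeline != current:
            changes += 1
            current = pipeline
    return changes

def batch_by_pipeline(draws):
    """Reorder draws so identical pipeline states are submitted together."""
    return sorted(draws, key=lambda d: d[0])

# Hypothetical frame: draws arrive interleaved across two materials.
draws = [("metal", "crate"), ("glass", "window"), ("metal", "barrel"),
         ("glass", "bottle"), ("metal", "pipe")]

print(state_changes(draws))                     # naive submission order: 5 binds
print(state_changes(batch_by_pipeline(draws)))  # batched order: 2 binds
```

The saving here is tiny, but a real frame has thousands of draws, and each avoided state change is avoided CPU and GPU overhead.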
@biggo4637
@biggo4637 9 ай бұрын
​@@tomars3645 They are tech journalists and have the resources to find expert developers. I mean, they went into AMD fabs and EVGA's! They definitely have connections.
@xalenthas
@xalenthas 9 ай бұрын
One thing I believe is worth mentioning, and it shows on the Red Dead Redemption 2 footage: while there appears to be little difference between the DX12 and Vulkan overall FPS, the 1% lows are generally much better with Vulkan. Look at the frame time chart; Vulkan tends to have less "stuttering" and, in my opinion, gives a better overall experience.
@Ghanemq8
@Ghanemq8 9 ай бұрын
Armored Core 6 is going to be built on DX12. I really hope it's not as big of an issue. I hope it runs well
@MaxIronsThird
@MaxIronsThird 8 ай бұрын
it does, Elden Ring was also DX12.
@Ghanemq8
@Ghanemq8 8 ай бұрын
@@MaxIronsThird That explains why the performance is so shit lmao. I've got an RTX 3070 and an i7-10700K and I can barely run the game at a stable frame rate at medium settings. So many stutters and hiccups.
@MaxIronsThird
@MaxIronsThird 8 ай бұрын
@@Ghanemq8 There must be something wrong with your PC; I'm playing the game at 120fps (not locked, but VRR makes it feel really smooth).
@soundywaivy
@soundywaivy 3 ай бұрын
Armored core 6 is the best optimized fromsoftware game, smooth as butter. Especially compared to elden ring
@hendranatanael3875
@hendranatanael3875 8 ай бұрын
Sorry, off topic, but may I know the music used in the background starting at minute 9:47, sir 😄?
@achillesa5894
@achillesa5894 9 ай бұрын
It's definitely a per-game optimization thing. And you have to look not just at average FPS but also 1% lows (stutters). For example in Apex I get like 50% higher average FPS with DX12 when nothing is going on, but when the fighting starts I get way more noticeable stutters so I stick to DX11 despite the average FPS being lower. While on The Witcher 3 DX12 obviously allows for DLSS Quality + Frame Generation + Raytracing so it's the superior choice (for my 4070). It's unfortunate that we have to do this, but trying both (and Vulkan) on every single game is the way to go.
@Bsc8
@Bsc8 9 ай бұрын
I will always be impressed by Black Mesa, an HL2 mod that became a standalone gorgeous game (the best next-gen HL1 experience so far). Made by a small indie studio, it uses *DirectX 9.0c;* it looks like it has RT on, but it's all PURE OPTIMIZED RASTER! Running smooth like DOOM Eternal. _edit: the engine used is the latest 2007 update of Source, an engine made in 2004. The Apex Legends and Titanfall games run on a heavily modified DX11 version of Source too._
@Ay-xq7mj
@Ay-xq7mj 9 ай бұрын
Source has baked ray-traced lighting into its maps since CS:S, so it's cheating.
@ShoryYTP
@ShoryYTP 9 ай бұрын
​@@Ay-xq7mj almost every game does that
@biglittleboy9827
@biglittleboy9827 9 ай бұрын
That's because they actually worked to make the game beautiful instead of just relying on the engine's assets and technology to do the work for them. Most devs nowadays are lazy and incompetent because of how much easier technological progress has made their lives. Modders are passionate people, so if they're competent you can be sure they'll do a good job.
@Bsc8
@Bsc8 9 ай бұрын
​@@Ay-xq7mj Maybe, but it looks like raytraced stuff, just not in real time. So to me, no performance impact = shut up and take my money 😂
@Bsc8
@Bsc8 9 ай бұрын
@@biglittleboy9827 facts!
@yoda29000
@yoda29000 9 ай бұрын
Vulkan/DX12 both originate from AMD's Mantle API. The idea was to give developers more hardware control, but in exchange they had to plan for every GPU out there. What we're seeing now is devs who programmed for one single GPU.
@pigpuke
@pigpuke 4 ай бұрын
The whole point of an API is so you _don't_ have to code for every GPU. Otherwise, we've just been thrown back in time to the late 1990s/early 2000s, when they had to code for specific 3D cards. This is a massive step backwards, not a step forward.
@jcm2606
@jcm2606 3 ай бұрын
You don't need to plan for every GPU under DX12/Vulkan. You _can_ since they both expose enough information and functionality to do so, but there's always a universally supported, "good enough" approach that is guaranteed to be available to you as the DX12/Vulkan specs mandate that the vendors *must* support that approach. The problem with DX12 and Vulkan is that they both require you to be much more in tune with how the hardware works, namely when it comes to recording and submitting commands, synchronising work between commands or groups of commands, and managing memory allocations and resources. The APIs mandate that *all* vendors must support a "good enough" approach for each of these, so you don't need to manage multiple distinct code paths for each vendor or architecture, but the APIs also expose enough information and functionality that you _can_ manage multiple distinct code paths if you want to.
@titanicgames6323
@titanicgames6323 8 ай бұрын
Some of the performance issues that DX/D3D12 has are due to a Windows security feature named "Control Flow Guard", which can cause stuttering and inconsistent frametimes. DX11 is also bound by single-thread speed, so depending on your CPU you can get much more FPS with DX12 if your CPU has slower cores (e.g. Ryzen 7 2700X and older, or old Intel CPUs). Games like Fortnite can yield ~30% higher FPS with DX12 when CPU-bound, as long as Windows security features aren't killing your frames.
@deimiosxxx
@deimiosxxx 9 ай бұрын
Yeah I noticed the DX11 vs DX12 yesterday while playing Path of Exile. On DX12 it struggled to hit 120FPS while on DX11 it was smooth sailing with 120FPS at like 60% GPU usage. To be fair it is clearly labeled that DX12 is Beta, still it's a massive difference.
@Winnetou17
@Winnetou17 9 ай бұрын
To be fair, I trust them to bring it to a good level, unlike the soulless cashgrabs that are churned left and right.
@14Gab88
@14Gab88 9 ай бұрын
I tend to have the best experience using Vulkan on PoE
@pf100andahalf
@pf100andahalf 9 ай бұрын
I was thinking about this yesterday. Things have gotten so bad that frame generation "fixes" the cpu being hammered in TLOU and jedi survivor (jedi has a dlss3 mod and it also uses 16gb vram with all the settings maxxed out). With dx12 and dx12 ultimate, devs have bitten off more than they can chew.
@TechDunk
@TechDunk 9 ай бұрын
I know for Unity games that will come out in the next few years (if they use the latest version or update the engine), dx12 will get a measurable performance boost and is as fast or faster in most cases. So that's good news
@papp215
@papp215 8 ай бұрын
@vextakes Was Control Flow Guard on or off when you tested the DX12 games?
@Zakmakoto
@Zakmakoto 9 ай бұрын
Vulkan, like DX12, is a close-to-the-metal API, meaning the GPU vendor's driver layer is kept to a bare minimum. That doesn't mean vendors can't optimize it further, or (often in Nvidia's case) deviate from what MS or the Khronos Group have standardized (RTX, later integrated; VK_NV... extensions), keeping headaches for non-Nvidia users and developers. When you transfer the responsibility for optimizing draw calls, compute, etc. on the CPU side from GPU vendors to developers and their engines (looking at you, UE), you often get an unoptimized mess: either they don't fully control those API calls when using a third-party engine like UE or Unity, and it's up to Epic, for instance, to support them or they have to figure it out themselves, or they simply don't have the time. With DX11, the GPU vendor quite often came to the rescue and tweaked their drivers to optimize things further; that's still the case with DX12, but the results are minimal every time. In the end, are DX12/Vulkan a bad thing? No, they appeared at a time when they were needed and developers wanted more control. Now, with the growth of UE games, we're simply seeing the limits of what a "generic" game engine can do, even if the demos are incredible; there's no way they cover all game use cases perfectly. While game dev carelessness is surely one reason for unoptimized games, in most cases the problem is the lack of control and the separation between game development and game engine development. Remnant II is a perfect example of this, and many more are.
@ashz2913
@ashz2913 9 ай бұрын
Engine programmer here, and what you said is 100% true. These newer graphics APIs make no assumptions about what the user wants to do and require the user to explicitly declare every little thing that could previously be handled via driver optimization. With these newer graphics APIs, the responsibility lies with the game developers rather than the vendors. It is often, if not always, better to have more control, with access to lower-level features that let us interact with the hardware more closely, which in turn results in a better and more optimized real-time renderer. But like you said, this benefit can't be obtained if game developers choose to rely only on a commercial game engine like UE and aren't willing to engineer their engine to better fit the kind of game or scene they want to build. The architecture of UE itself is designed to be modular and easy for game developers to extend on top of the classes Epic provides. But that design also comes with a lot of overhead, and it's not entirely about the graphics API. The games made with UE recently really aren't that complex; in fact they're quite simple, considering the games only consist of a few entities controlled by AI along with other triggers and events in a level. They're not RTSes or games that require a lot of physics simulation, or that simply have a lot of activity going on. It really shouldn't be this demanding to run on modern hardware. Though I don't expect many developers to spend time acquiring the right technology for their game. They'll just rely on marketing to get people interested enough to purchase the game, then patch it after release, if they ever bother to address these issues at all.
@tux_the_astronaut
@tux_the_astronaut 9 ай бұрын
I think another issue in Unreal games is devs seeing Nanite and Lumen as magical tools that will make their game look and run great. Like, yeah, Nanite is nice for LOD, but that doesn't mean you should stop caring about polygon limits.
@GrandHighGamer
@GrandHighGamer 9 ай бұрын
VRS has a pretty minimal performance gain, while also having visible artifacting if you know what to look for. You're still basically reducing the resolution of parts of the screen (shader wise, anyway). To even get a decent improvement, you need it aggressive enough that blockiness starts to be visible.
@smlgd
@smlgd 9 ай бұрын
I think it's very niche today, like for racing games where the environment becomes a big blurry mess anyway, but it's very promising for the future if we ever get foveated rendering for VR; it might even bring VR performance close to flat-screen performance. But I agree that it's not, like he said, "free performance". In fact, it's probably not worth how hard it is to actually implement.
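VRS's savings-vs-blockiness trade-off can be put in rough numbers with a toy tile model (plain Python; the 8x8 tile size, resolution, and coarse region are made-up assumptions, and real VRS hardware works per-draw or via a shading-rate image): coarse-shading even a large part of the screen at 2x2 only roughly halves the pixel-shader work, which is why aggressive settings are needed for a visible win.

```python
def shading_invocations(width, height, tile, rate_for_tile):
    """Count pixel-shader invocations when each tile shades at a given rate.
    rate_for_tile(tx, ty) returns 1 (full rate) or 2 (one shade per 2x2 block)."""
    total = 0
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            rate = rate_for_tile(tx, ty)
            total += (tile // rate) * (tile // rate)
    return total

W, H, TILE = 1920, 1080, 8
full = shading_invocations(W, H, TILE, lambda tx, ty: 1)
# Coarse-shade the outer thirds of the screen, full rate in the centre third.
vrs = shading_invocations(W, H, TILE,
                          lambda tx, ty: 2 if tx < W // 3 or tx >= 2 * W // 3 else 1)
print(full, vrs, vrs / full)
```

Here two thirds of the screen at quarter shading density still leaves half the original invocations, and the perceived cost (blocky shading across two thirds of the frame) is exactly what the comments above describe.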
@nikidrawsstuffs
@nikidrawsstuffs 8 ай бұрын
I've done my own tests with DX11 vs DX12 in Fortnite on PC (and some side research), and DX12 gives a significant boost to performance (60-80 FPS for me), but when you first switch to it, it stutters a lot and needs time to adjust or something.
@Ochumvru
@Ochumvru 9 ай бұрын
the rate at which you and your channel is growing is incredible
@dand337
@dand337 9 ай бұрын
To give you some context, DX12 in Witcher 3 was very poorly made because it was mainly an experiment for CD Projekt devs to learn how to use some tools and add RT.
@Morpheus-pt3wq
@Morpheus-pt3wq 9 ай бұрын
I think it's important to note that the devs who made W3 are probably no longer at the company. Thus, the devs porting the game to DX12 had very little to no idea how things work. Not to mention, they made the patch mandatory instead of optional...
@-MaheenUddin
@-MaheenUddin 9 ай бұрын
In RDR2, using DirectX causes the frame time graph to spike often, which is especially noticeable on low-end GPUs, whereas Vulkan has a smooth frame time graph, even in your video. The 1% lows matter more than average FPS.
@Brunnen_Gee
@Brunnen_Gee 9 ай бұрын
Many of the games I've played in recent years have the ability to swap between DX(insert version) and Vulkan. What ends up happening is you get a ton of people asking "Which is better?" And my answer is always "Try both and see."
@SyntaxDaemon
@SyntaxDaemon 9 ай бұрын
Excellent video. I'm no expert, but I do know more than enough to know that you've really done a great job.
@dunndudebemelol
@dunndudebemelol 9 ай бұрын
Early DX11 and tessellation were like this too.
@Tainted79
@Tainted79 9 ай бұрын
In RDR2, Vulkan has better performance under heavy CPU workloads like massive AI crowds in towns/cities. DX12 has better image quality. A good example is the field of bluebonnets (flowers) near Adler Ranch in the northwest of Strawberry.
@thomasgessert8518
@thomasgessert8518 9 ай бұрын
The question is, how much time is really spent on code optimization? As you can see from the speed increase on CPUs with big caches, code and data locality is pretty low, probably caused by many indirections via pointers. The next question: is all the control these APIs offer really used by the developers, or do they just use a high-level game engine that hides all of this? To my understanding, Unreal Engine 5 does exactly that: it hides the graphics APIs. I'm no developer, but sometimes I need to program at work. A quickly delivered solution is often preferred by the bosses over a speedy program.
@MooshPaw
@MooshPaw 9 ай бұрын
I find this interesting, because with my 1060 I run both Fortnite and Kena better on DX12 than on 11. Both get about 20 FPS more on average, and the stutters are heavily reduced.
@deblxdee
@deblxdee 9 ай бұрын
Devs haven't really figured out DX12. Additionally, for it to work better you need to run around for a while so the cache can build up; then DX12 provides slightly lower FPS but much more stable, since it's way better at multi-threading than DX11. Also, we don't have insight into how companies allocate their programming staff, so it's hard to say whether there's one team working with multiple APIs, a bunch of teams, or some other arrangement.
@loganbogan9
@loganbogan9 9 ай бұрын
Ugh, I CAN NOT BELIEVE shader compilation is a problem in 2023, considering how many tools most devs have nowadays that support a variety of ways of eliminating it. You can build shaders at start, and that'll eliminate all stutter throughout the game (if it's not Jedi Survivor). The downside is that it can take 10-20 minutes before you can play. Another wonderful option is async shader compilation. This keeps the game running while the shader compiles, reducing performance versus compiling everything at start, but still better than the standard pause-then-continue method DX12 normally uses. The downside is that, depending on the game, you may be able to see through objects, or effects won't load properly for a few seconds the first time. The ultimate solution is to merge the two methods. Nixxes' Spider-Man port compiles shaders at start for about 30 seconds; it's the reason you can't skip the logos on first launch. After it compiles the basic shaders, any shader needed later is compiled asynchronously, leading to very fluid gameplay with virtually no async artifacts.
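The async-compile-with-fallback pattern described above can be sketched in a few lines (a Python toy using a thread pool and a sleep as a stand-in for the expensive driver compile; the shader names and timings are made up, real engines do this against the graphics API's pipeline objects):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def compile_shader(name):
    time.sleep(0.05)  # stand-in for an expensive driver-side compile
    return f"compiled:{name}"

pool = ThreadPoolExecutor()
cache = {}

def get_shader(name):
    """Return the compiled shader if ready, else a fallback, never blocking the frame."""
    fut = cache.get(name)
    if fut is None:
        fut = pool.submit(compile_shader, name)  # kick off compilation in the background
        cache[name] = fut
    return fut.result() if fut.done() else "fallback"

first = get_shader("water")   # compilation starts; this frame renders the fallback
time.sleep(0.1)               # a few frames later...
later = get_shader("water")   # the real shader is now ready
print(first, later)
```

The visible cost is exactly what the comment notes: while `"fallback"` is in use, the water would render with a placeholder effect instead of pausing the whole game.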
@Lil_GroceryBag
@Lil_GroceryBag 4 ай бұрын
wait so should i not have installed dx12 ultimate when there was that driver to support it? or would i just not be able to play games using the dx12 ultimate api
@janus798
@janus798 9 ай бұрын
7:07 - It's not a matter of "choosing" to use it. It still requires a tremendous effort to implement those features, on top of re-architecting your backend to work with DX12 / VK.
@olebrumme6356
@olebrumme6356 9 ай бұрын
I'm not a programmer, but I've got experience selling custom/prebuilt PCs and benchmarking games. Vulkan runs better for the most part on most systems, especially low and midrange PCs. In RDR2, for example, Vulkan gives a smoother experience over DX12: higher 1% lows and all that.
@BlindBison
@BlindBison 9 ай бұрын
The main problem is inefficient/poorly optimized CPU side code - in particular developers are not putting in the leg work for efficient asset streaming/traversal systems and shader compilation stuttering has also been a big problem in recent years. GPU side scaling has usually been OK but it’s the stutter struggle that’s been really unacceptable in recent times. The amount of UE games especially that ship with streaming/traversal stutter and/or shader stutter that are quite literally never fixed is astonishing. The fact that devs and publishers think it’s acceptable to make your product with such glaring technical problems is a big issue. It’s gotten so bad I pretty much just don’t buy UE games anymore on PC. DX12 / Vulkan “can be” powerful tools but only if the developer’s engineering team are up to the task - most do not seem to be unfortunately. DOOM Eternal, Red Dead 2, and post patch Cyberpunk come to mind for titles that are actually great on PC now and use low level graphics APIs.
@Wylie288
@Wylie288 4 ай бұрын
Doom Eternal is just magic. I can get it to run at 90 FPS most of the time on my Steam Deck without even turning on resolution scaling. Some of the first few levels are a little iffy, but most of the game runs great on it. RDR2, meanwhile, is actually just not that graphically demanding: Rockstar hired fantastic technical artists. The game has one light source, very few real-time shadows, and the shadows that are there are generally very low quality. But it's packed with lots of particle effects, post-processing, and decal effects that people DO notice instead. On a technical level it's rather low fidelity, but to human eyeballs it looks very good. It's not that it's well optimized; it just had a good team of technical ARTISTS, not just technical developers.
@Proxima_X
@Proxima_X 5 ай бұрын
What is the piano song in the end called? Gives me mad nostalgia and I don’t know for what
@lolok.9656
@lolok.9656 9 ай бұрын
Great video man!
@ReaperX7
@ReaperX7 9 ай бұрын
Denuvo is the bane of any and all games.
@anon2036
@anon2036 9 ай бұрын
Never went to college, so I don't really consider myself a true "developer", but what I have is 5 years of experience in the industry. What I can say right now is that a lot of the newer devs only seem to be getting into the industry because of the demand and pay. Work environments vary company to company, but they're usually acceptable. I can count on one hand the juniors or 1-2 year experienced devs who are actually passionate about their jobs. I've gotten flak from managers for being too critical in code reviews, but I can visibly see the decrease in quality of work over these past 3 years. In my last code review I dealt with a dude (not a jr, btw) who parsed an XML file into an array, then proceeded to loop over it 5 times to get item names and descriptions, all single-threaded. That's no problem on the server; companies are rich, they can afford some of that CPU time. But people shouldn't pull that shit client-side, it makes the menu laggy.
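For what it's worth, the review comment above boils down to something like this (a toy Python sketch with made-up item data, not the actual code under review): one full pass per field versus pulling every field you need in a single pass.

```python
import xml.etree.ElementTree as ET

XML = """<items>
  <item><name>Sword</name><description>Sharp</description></item>
  <item><name>Shield</name><description>Sturdy</description></item>
</items>"""

items = ET.fromstring(XML).findall("item")

# Anti-pattern from the review: one complete pass over the items
# per field you want, repeated for every field.
names = [i.findtext("name") for i in items]
descriptions = [i.findtext("description") for i in items]

# Single pass: extract all the fields you need in one loop.
parsed = [(i.findtext("name"), i.findtext("description")) for i in items]
print(parsed)
```

On a server the difference is noise; in a UI thread building a menu from thousands of items, the repeated passes are exactly the kind of thing that makes it laggy.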
@exusxt
@exusxt 4 ай бұрын
I can confirm the quality of code has been decreasing for over 10 years now. Performance issues are not only common in video games. This is why the product often seems to work in the test environment, and then when you try it on a production system it performs really badly or even crashes.
@stealthhunter6998
@stealthhunter6998 2 ай бұрын
Is there any particular reason traversal stuttering and shader compilation stutter have been far worse for me in DX12 than in DX11 or Vulkan? (One reason I use Vulkan every time I see it is that the stutter is lessened.) Is there anything that can be done to help it, other than running the game through DXVK, especially for UE5 games or something like Dead Space Remake?
@SumeaBizarro
@SumeaBizarro 4 ай бұрын
One thing with VRS is that only quite new cards seem to support it. I wasn't surprised that the 1060, which I was still rocking a year ago, didn't support it, but that the 5700 XT also doesn't support VRS IS surprising to me. DLSS is also exclusive to fairly fresh cards on Nvidia, but FSR is supported by much older cards, which is why it makes sense as a performance-enhancing option. The cards most likely to need a performance-enhancing option don't support VRS.
@jvne3497
@jvne3497 9 ай бұрын
As a hobbyist graphics engine developer, it's often very hard to choose which graphics API to use. DX works on Windows; Vulkan runs on most platforms (though macOS only supports Metal natively, with Vulkan available via the MoltenVK translation layer); OpenGL runs on virtually everything but is old and fairly slow. Some ecosystems have moved to GPU graphics/compute libraries that merge all of these, such as wgpu (which I believe uses gfx-rs for its graphics and compute abstractions) in the Rust programming language. These use different shader formats such as SPIR-V or WGSL (both great, though SPIR-V can be a nightmare sometimes). So now let's say I want to make a game. I'm going to use a game engine so I can avoid the nightmare of coding against all these graphics APIs myself; I just wanna write some shaders. Lots use an actor/MVC model (CryEngine), some use an Entity Component System (Bevy Engine), some use a node-based system (Godot Engine); these all have their uses and their own performance characteristics. But how are they built? Bevy Engine, for example, is 100% async with easy multi-threading, but it's hard to use. So why don't people use it? Well, simply look at the engine: it's a bit of a nightmare and doesn't have all the "modern" features, requiring you to code your own shaders for ray tracing and the like. So what's the issue? Honestly, I'm not 100% sure. We have the options, we have the technology, we have very talented low-level programmers (such as the wgpu and gfx-rs teams) making it so we don't have to interact with this pile of graphics APIs by hand at next to no cost, and we have people creating massive projects and graphics renderers that can do volumetric clouds at 0 cost (@rust_gamedev on Twitter shares some neat things; even Frostbite engine devs are working with Rust and seeing wonderful results!).
So the only thing I can think of is investors pushing releases before optimization happens: gotta meet those deadlines and those dollar values.
@cryptic_daemon_
@cryptic_daemon_ 9 ай бұрын
Fellow embedded engineer here. It amazes me that game developers are working with much more complex systems, while meanwhile my work is very low level, mostly with Xtensa and ARM microcontrollers. Makes me kinda glad I'm not a game dev ngl xD. My question to you: do you think the Rust language will help gaming?
@tsunekakou1275
@tsunekakou1275 9 ай бұрын
Only Rust programmers can say "0 cost" that confident 😚
@tsunekakou1275
@tsunekakou1275 9 ай бұрын
@@cryptic_daemon_ If you think about it, what Rust offers vs. what game devs need... Rust's workflow is even slower than C++'s, which kind of goes against the crunchiness big studios want. High friction and slow iteration mean indies are out. Rust eliminates some classes of bugs, but, like, does game dev care that much about bugs? Who knows, things could change in a few years.
@jvne3497
@jvne3497 9 ай бұрын
@@tsunekakou1275 Near 0 cost; I'm sorry, I didn't diligently review a post I made at 12 AM. Runtime always has a cost, and as a proficient developer you know this, I would hope. Much of the shader code was calculated at compile time, which means that while it's limited in its uses, it does work.
@Bunuffin
@Bunuffin 9 ай бұрын
DX12 and Vulkan are quite hard to work with since they're closer to the hardware... Vulkan can push way more (Doom, anyone?), but guess what, there's more coding to do. Linux translates DX to Vulkan, so I prefer games shipping with Vulkan: fewer performance losses for me.
@keit99
@keit99 9 ай бұрын
Interestingly, I've seen some games that seem to run with less stutter via DXVK (on Linux) than pure DX (on Windows).
@benjib2691
@benjib2691 9 ай бұрын
In regard to Doom, I think it's more that id Software has some of the best engine developers out there. I've never seen a single game based on idTech 5, 6 or 7 that runs poorly. Doom 2016, Doom Eternal, Wolfenstein The New Order and Wolfenstein II The New Colossus all run perfectly on a wide range of hardware with excellent visuals and image quality.
@b-ranthatway8066
@b-ranthatway8066 9 ай бұрын
I just wish gaming companies would take the extra time to release a well-polished game. I don't care how long it takes; make it flawless and people will buy it.
@havingabrainisoverrated6214
@havingabrainisoverrated6214 8 ай бұрын
Hey wait how do you get the performance overlay?
@PercyPanleo
@PercyPanleo 9 ай бұрын
I feel like Vulkan will probably be used more often going forward, if developers want to put in the effort of optimizing their games graphically, since the Steam Deck only supports Vulkan and OpenGL (using a translation layer to turn DirectX into Vulkan) and acts as an optimization benchmark. If your game runs well on the Steam Deck, it will run well on most relatively modern PCs.
@zybch
@zybch 6 ай бұрын
You might as well say the same about the Switch though. It and the deck are such pathetically low performance devices that ANYTHING that can run on them will run far better on even older PCs.
@PercyPanleo
@PercyPanleo 6 ай бұрын
@@zybch The thing with the Switch is that, since it isn't a PC, developers can use settings and optimizations that they leave unavailable in the PC version. With something like the Steam Deck you can't do that, since it is a PC.
@TanteEmmaaa
@TanteEmmaaa 9 ай бұрын
It was always like this. For example, almost every game with a DX9 rendering path had a lot more FPS than the DX10 or DX11 path in the same game. It's sad, but it really was always like that: in 95% of games, using the LOWER API version gives you better FPS, and often a more stable game.
@scorpiom8053
@scorpiom8053 8 ай бұрын
Yeah but then again, dx10 and 11 had more graphical features over dx9. Between 9 and 10 tho most preferred to use 9.
@m4rt_
@m4rt_ 4 ай бұрын
An Application Programming Interface (API) is, as the name implies, some kind of interface that makes it easier to do things. In this case, a graphics API is an API that lets you interact with the GPU more easily and tell it how to render something. Another example of an API is a device driver: a piece of software that helps you interact with the computer's hardware.
@kZerby
@kZerby 9 ай бұрын
Hi, could someone please tell me the name of the background song for the 'We want better' part?
@RAXN12
@RAXN12 8 ай бұрын
I had worse performance with DX12 when playing Control. Another thing I'd like to add: it's not just FPS, thermals seem to go wonky too. I hit 86°C with DX12 but only 72°C with DX11.
@tsunekakou1275
@tsunekakou1275 9 ай бұрын
You're blaming the wrong thing, mate. I haven't worked with DX12 yet, but it theoretically gives you more control, so you can optimize the graphics pipeline better and reduce overhead. Done correctly, DX12 games should have less overhead than DX11 games (in a fair comparison); the same goes for Vulkan vs. OpenGL. These next-gen modern APIs allow developers to juice more out of the hardware, but sometimes they just juice it too hard. They should care more about optimizing for mid-range and recent low-end hardware rather than chasing pretty graphics at the high end. Maybe game engine devs get too much money and become lazy, who knows 🤣
@loganbogan9
@loganbogan9 9 ай бұрын
He did point out that DX12 should be faster than DX11, although I do think more scrutiny could be put on devs as they're the ones completely mishandling their tools.
@thejackimonster9689
@thejackimonster9689 9 ай бұрын
He's not blaming the wrong thing, to be honest. The truth is that most developers have learned to use DX11- or OpenGL-like APIs over the years, which heavily depend on driver optimization. As a developer you didn't need to deal with memory management yourself in those older APIs, which explains why some DX12 games require more VRAM these days. You also didn't need to synchronize pipeline executions or shader compilation yourself for the most part. So games could be written with much less complexity, and developers were able to focus on the game rather than learning how a GPU works. Vulkan, for example, is very honest about this: officially it's not called a graphics API but a GPU API. In my experience you not only need to deal with the hardware but also with vendor-specific differences if you want to truly optimize your application. That takes much more time and resources than was necessary before. Theoretically you can optimize much more than with DX11 or OpenGL: you can reduce CPU overhead heavily, utilize all CPU threads most efficiently, and access the most modern hardware features. But it takes longer to develop, and you need much more low-level knowledge to do it properly.
@tsunekakou1275
@tsunekakou1275 9 ай бұрын
@@loganbogan9 Can we blame the chainsaw if the lumberjack prefers the axe? Probably a bit. I watched the video and I still kinda don't understand what "This" in the title refers to. Obviously Vex doesn't know; not to flame him, I don't know either, but I don't think graphics APIs are the main reason. Maybe PC gaming isn't profitable anymore and they focus on consoles???
@shawdou3327
@shawdou3327 9 ай бұрын
@@tsunekakou1275 I feel it's more like comparing a fully functional chainsaw that you just have to fill with fuel, versus getting all the possible chainsaw parts and having to build your own. Technically you can make a more optimal one when you build it yourself, but it takes way more time, and if you don't want to spend all that time picking ideal parts you might end up with one that's worse than the prebuilt one.
@tsunekakou1275
@tsunekakou1275 9 months ago
@thejackimonster9689 Game engine developers aren't game developers; they are quite different, imho. I'm sure these AAA studios have dedicated engines and teams. If we are talking about optimization, we must know how to optimize for different vendors, unless we target only a few consoles. DX12 and Vulkan weren't created in a vacuum; they were designed with help from the game industry as well. I doubt complexity is the main issue with optimization; the two have gone hand in hand since the beginning of computers, and we don't get better optimization by eliminating essential complexity. If I may speculate: if the problem were complexity, then maybe the game industry has run dry of new talent and the rest don't care anymore, at least at those AAA studios with bad work environments.
@uss_liberty_incident
@uss_liberty_incident 9 months ago
Thanks for putting this together
@PeterPauls
@PeterPauls 9 months ago
If you watch the CPU usage in Witcher 3 DX11 vs DX12, the CPU is underutilized in DX12 and performance is often heavily CPU-limited; in DX11 it isn't, and the game uses more than 4 CPU cores. When I had an i5 6600K, I saw all 4 of my threads at 100% while my GTX 1070 was at 99% all the time; a CPU upgrade to an i7 7700K solved that problem. Witcher 3 was the first game that made me think we were headed in a good direction. But since then... not so much, and I now have a Ryzen 7 5800X3D and RTX 4080 PC.
@xenxboi5755
@xenxboi5755 9 months ago
The Vulkan option in RDR2 is massively better because you get a lot less stutter, as you can see in your comparison video. Look at the frame time: a lot more consistent and a LOT better feeling than DX12.
@K.C-2049
@K.C-2049 4 months ago
I switched from DX12 to Vulkan and my frame rate in RDR2 doubled. I'm running a 4070 with a 5700X at 1440p; it's nothing spectacular, but I was shocked by the change.
@user-tb5re6zs2r
@user-tb5re6zs2r 9 months ago
Direct3D 12 (D3D12) is far superior to Direct3D 11 (D3D11). OpenGL and D3D11 essentially manage memory behind the scenes for the developer; D3D12 and Vulkan leave the allocation and lifetime of that memory to the developer. Implemented correctly, the performance from Vulkan is insane. However, not many engines use these APIs well, and they are often an afterthought. Still, more and more tech is adopting these newer APIs, because a GPU is basically a giant parallel computing machine and D3D12 and Vulkan are designed to take advantage of parallel computing. Poor results are not the fault of the API; it's that developers aren't implementing it well.
@Qunia
@Qunia 9 months ago
The "VRAM crisis" is the sole reason I bought a 7900 XTX. If a game somehow manages to max out the VRAM on that, I will throw an actual riot.
@KeinNiemand
@KeinNiemand 9 months ago
What we need is a high-level API like DX11 or OpenGL but with modern features like ray tracing, with maybe an option to drop down and do the low-level, bare-metal stuff. Forcing developers to use a low-level API will just result in them doing whatever is easiest, which ends up being worse than whatever DX11 does behind the scenes at that low level.
@Psychx_
@Psychx_ 9 months ago
Then we're back to games not being able to do multi-threading, drivers getting bloated and needing extensive per-title tuning, and draw calls limiting scene complexity again. Btw, there already are abstraction layers that wrap DX12 and Vulkan into a more high-level-ish interface.
@meanmole3212
@meanmole3212 9 months ago
wgpu is the real rising champion here. Maybe billion-dollar studios have the time and money to optimize their software for every device and platform using a different API for each, but not indie or small development teams who still want to do their GPU programming themselves. Simplicity and portability packed together under a single API with solid performance.
@tsunekakou1275
@tsunekakou1275 9 months ago
@meanmole3212 😅 If a person has needs that call for low-level APIs, why would they use someone else's wrapper API written in an immature language? Indie and small development teams would probably use off-the-shelf engines, but I guess there is a niche market for wgpu?
@meanmole3212
@meanmole3212 9 months ago
@tsunekakou1275 Because as OpenGL gets left in the dark and replaced by Vulkan, a lot of developers find themselves in a tough spot. OpenGL is simple enough to get stuff done relatively quickly compared to the complexity overhead Vulkan adds. As you said, there is a void to fill for people who do not want to jump to Unity, Unreal, or any other bloated framework. If you just want to program the GPU and write your own shaders much the way you do in OpenGL, without worrying too much about different operating systems or devices (in fact, portability is easily worse in OpenGL than in wgpu), then wgpu is a solid choice. Note that wgpu is not a game rendering library or engine; it's as much a GPU API as OpenGL is, focused solely on programming the GPU. And yes, since it is still an abstraction layer over the other GPU APIs (you can switch arbitrarily between backends: DirectX, Vulkan, Metal, WASM, OpenGL), you can't do all the fine-tuned, API-specific things you can do with DirectX directly, for example. Naturally there's also a performance overhead because of the abstraction, but in my experience it is very competitive. I think it is very sad that the industry does not have a single standard for programming the GPU and writing shaders; instead, you as a developer sometimes need to write support for several APIs in addition to writing your shaders multiple times.
@LKNear
@LKNear 9 months ago
Whatever happened to variable rate shading? It was supposed to improve GPU performance a ton lol
@filipborch-solem1354
@filipborch-solem1354 9 months ago
Could be that it doesn't work well with upscaling. Look at Dead Space.