$1,200 GPU from 2002 - Can it run Doom 3?

23,548 views

Tales of Weird Stuff

1 day ago

It's #GPUJune 2024! Released in late 2002, around the same time as the Radeon 9700 and the Geforce FX, the top-of-the-line card built on the 3Dlabs P10 architecture cost $1,200. Let's compare the architecture of the 3Dlabs parts with their contemporaries, and, naturally, let's see if it can run Doom 3.
Twitter: / talesofweird
Instagram: / talesofweirdstuff
00:00 Intro
00:44 Previous GPU generation
04:20 Shader model 2.0 generation
07:38 Visual Processing Unit
11:25 VP990 Pro Unboxing
13:22 The Good
20:46 The Bad
26:24 The Ugly
31:43 The Weird
37:56 Windows XP
39:54 Doom 3
42:29 Closing words
OpenGL extension registry:
registry.khronos.org/OpenGL/i...
3Dlabs P9 / P10 documentation:
www.vgamuseum.info/images/doc/...

Comments: 186
@phirenz
@phirenz 10 күн бұрын
I looked into the "only supporting integer" stuff. The "shader processors" that 3Dlabs are talking about are not the full equivalent of modern fragment shaders. 3Dlabs actually split the fragment shader across two separate cores, these shader processors and the "texture coordinate processors", and the texture coordinate processors do support full 32-bit floating point. These two cores are detached from each other, presumably with a longish FIFO between them. The idea is that the texture coordinate processor does all the texture coordinate processing, fetches the textures, does dependent textures, and then passes the final texture fetch colors on to the shader processors, which do the final calculations with 10-bit integer precision. The documentation explicitly points out that the texture processors can pass any raw value they calculate through to the shading units without needing to do a texture fetch.

And if you check the ARB_fragment_shader extension (notice how the contact detail was 3Dlabs themselves), you will notice that it only requires the full 2^32 range for positions, normals, or UV coords. Color values are only required to support the range of 2^10.

This split between texture processors and final color processing was not unique to 3Dlabs. I believe all major graphics vendors implemented such a design for their register combiner and early pixel shader GPUs. It's why Pixel Shader Model 1.x and Pixel Shader 2.0 (not 2.0a) have the concept of "texture instructions" and "arithmetic instructions". I don't think the Pixel Shader Model documentation ever actually stated those types of instructions would be run on two completely independent shader cores... but that's what was happening under the hood.

I believe it was Nvidia who unified both jobs into a single "pixel shader" core with their FX line of GPUs, and this unification was part of the reason why those GPUs had bad performance, because they had to use 32-bit floats for color. On the plus side, it was the first GPU with proper flow control. Even the Radeon R300 still had this split; I'm seeing references in the register documentation to separate "texture microcode" and "ALU microcode".
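To make the split concrete, here is a minimal sketch (my own illustration, not something from the comment or the video) of the kind of GLSL fragment shader being discussed, compiled through the era-appropriate ARB_shader_objects entry points. On a design like the P10, the coordinate math and the dependent fetch would run on the floating-point texture coordinate processor, while the final color multiply could stay on the 10-bit integer shader processor. The shader and function names here are hypothetical.

#include <GL/glew.h>  /* assumed loader; anything exposing the ARB entry points works */

/* Hypothetical shader: a dependent fetch (lut indexed by the first texture's
 * result) followed by simple color arithmetic clamped to [0, 1]. */
static const char *frag_src =
    "uniform sampler2D base;\n"
    "uniform sampler2D lut;\n"
    "void main()\n"
    "{\n"
    "    vec4 c = texture2D(base, gl_TexCoord[0].xy);  /* texture processor */\n"
    "    vec4 d = texture2D(lut, c.rg);                /* dependent fetch */\n"
    "    gl_FragColor = clamp(c * d * 2.0, 0.0, 1.0);  /* low-precision color math */\n"
    "}\n";

GLhandleARB compile_test_shader(void)
{
    GLhandleARB fs = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(fs, 1, &frag_src, NULL);
    glCompileShaderARB(fs);

    GLint ok = GL_FALSE;
    glGetObjectParameterivARB(fs, GL_OBJECT_COMPILE_STATUS_ARB, &ok);
    /* On failure, glGetInfoLogARB() would explain why the driver rejected it. */
    return ok ? fs : 0;
}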
@noth606
@noth606 8 күн бұрын
Hmm, interesting, in particular the bit about the FX stuff. I should still have one of the big boys of the FX line in storage, and at the time I had a bunch of discussions with people who were "pooping" on it due to articles that came out dissing the architecture. Sure, it was imperfect, but it was far from the absolute dog it was made out to be; at least the higher-end models were quite OK. Expecting big things from the FX 5200, or whatever it was called, was a strange idea anyway.
@cesaru3619
@cesaru3619 8 күн бұрын
3dlabs FANBOY in denial.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
The 3Dlabs split of higher precision texture coordinate processing and lower precision fragment processing seems most similar to the Geforce3 and Geforce4. NVIDIA had a very similar split there, and this can be seen in the NV_texture_shader and NV_register_combiners split.

At least some modern architectures still have vestiges of this. On Intel GPUs, the texture instructions conceptually send a message to an asynchronous unit that performs the texture lookup. Once the data has been fetched and processed, it is delivered back to the shader execution unit. Depending on numerous factors (e.g., caches) the texture access can take a wildly variable amount of time. The texture sampler hardware is on a separate piece of the die from the execution unit. The huge difference between current and vintage architectures is that current architectures are asynchronous (other things can happen in the shader while waiting for texture results) and vintage architectures were synchronous (the next phase of ALU operations would block until the texture unit was done).

As I mentioned in the video, Geforce FX only had flow control in the vertex stages. Fragment processing had predication that could be used to emulate if-then-else structures (similar to conditional move instructions on CPUs), but loops were impossible.

You are correct that a lot of hardware from this era has a concept of alternating "texture phases" and "ALU phases." These architectures have a limited number of texture indirections, so the compiler / driver has to try to group texture accesses into bundles to reduce the number of phases. Each Shader Model has a limit for the number of texture indirections that are guaranteed to be supported. On R200 it's really small... 2 or 3. I know ATI_fragment_shader only allowed 2 passes. I think I'm going to have to do a deep-dive video on the Radeon 8500 architecture. On Intel i915, it's also pretty small. I want to say 4, but it has been years since I did any serious work on that driver.

The general idea is that a texture phase receives data from a previous stage. That is either the vertex processing / interpolation or a previous ALU phase. Texture lookups occur, and that data is fed to either the next ALU stage or the color / depth outputs. ALU stages have some number of values that can persist by passing through the texture stages (this is related to the "pass any raw value they calculate" bit that you mention) and some temporary values that are lost at each texture stage.

Fragment shader color inputs (via gl_Color and gl_SecondaryColor variables), color outputs (via gl_FragColor), and depth output (via gl_FragDepth) are implicitly clamped to [0, 1] and may have reduced precision. Intermediate calculations are expected to be "full" precision. The spec is a little vague about what full precision would be. Issue 30 of the GLSL 1.10 spec defers to the OpenGL 1.4 spec: "It is already implicit that floating point requirements must adhere to section 2.1.1 of version 1.4 of the OpenGL Specification." Section 2.1.1 is where I assume you got the 2^32 and 2^10 values. That section also says, "The maximum representable magnitude for all other floating-point values must be at least 2^32." "All other floating-point values" would include intermediate calculations in a shader.

NV30 and R300 had different representations. NV30 could use either 32-bit single precision or 16-bit half precision. The latter was not directly exposed in GLSL. R300 used some sort of 24-bit precision internally.
It met the requirements of all the specifications at the time, but I have some memory of it causing problems for developers that expected / needed full single precision.

Given that the P10 shaders can supposedly be quite large, there are a few ways they could have gotten around this. A clever compiler could analyze the shaders to determine that some calculations would be fine at lower precision. I implemented a similar pass in my compiler. For many values in simple shaders, it is very easy to prove that the range will be limited. That would cover a lot of things. The GLSL spec and OpenGL 1.4 are pretty loose in the definition of precision and the processing of exceptional values (infinity, NaN, etc.). Perhaps this enables the compiler to use a different representation for floating point values... like storing an integer numerator and integer denominator. This is just speculation.

There is one thing that still bugs me. GLSL is a superset of ARB_fragment_program (the assembly language shader). If the driver and hardware can do the former, it can, by definition, do the latter. Why not support it? Why not support Shader Model 2.0?
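To make the range-analysis idea concrete, here is a minimal sketch (an illustration of the general technique, not the actual Mesa pass): propagate [lo, hi] intervals through the shader's operations, starting from inputs that the spec already clamps to [0, 1], and mark any value whose interval provably stays inside [0, 1] as safe to evaluate on the low-precision fixed-point units.

#include <stdbool.h>

/* Closed interval [lo, hi] tracked for one shader value. */
struct range { float lo, hi; };

/* gl_Color / gl_SecondaryColor inputs and clamped texture results start
 * out as [0, 1]. */
static const struct range UNIT = { 0.0f, 1.0f };

static struct range range_mul(struct range a, struct range b)
{
    float p[4] = { a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi };
    struct range r = { p[0], p[0] };
    for (int i = 1; i < 4; i++) {
        if (p[i] < r.lo) r.lo = p[i];
        if (p[i] > r.hi) r.hi = p[i];
    }
    return r;
}

static struct range range_add(struct range a, struct range b)
{
    struct range r = { a.lo + b.lo, a.hi + b.hi };
    return r;
}

/* A value whose proven range fits in [0, 1] loses nothing by being
 * computed at 10-bit fixed point instead of full float. */
static bool fits_low_precision(struct range r)
{
    return r.lo >= 0.0f && r.hi <= 1.0f;
}

For example, a plain modulate like color * texel starts from two [0, 1] operands, so range_mul(UNIT, UNIT) stays in [0, 1] and passes fits_low_precision, while something like color * 4.0 + offset would not and would have to stay in the higher-precision path.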
@Hugobros3
@Hugobros3 11 күн бұрын
now you're getting into the really good stuff ! I absolutely love these, and especially the later P20/25 cards.
@TalesofWeirdStuff
@TalesofWeirdStuff 11 күн бұрын
Those later cards seem like they might be more interesting. Did you have any of the 3Dlabs cards back in the day? I am curious to hear about people's real world experiences with them.
@Hugobros3
@Hugobros3 10 күн бұрын
@@TalesofWeirdStuff I was born far too late for that. I got my first 3dlabs card (Wildcat Realizm 200) in 2011 at a flea market, without a clue about the significance it had. It was already outdated by then but ran TrackMania and Minecraft okay-ish on the terrible hand-me-down computer I had. I have collected a few other cards since, and I even managed to score some ZiiLabs ZMS stuff (the embedded ARM SoCs with StemCell GPUs, derived from their existing IP).
@kenh6096
@kenh6096 10 күн бұрын
​@@TalesofWeirdStuff I remember the realizm cards whilst impressive for some professional applications had real issues if you attempted to game on them.
@seebarry4068
@seebarry4068 9 күн бұрын
I had the 9700 pro. Good card, got about 8 years use from it.😊
@supabass4003
@supabass4003 9 күн бұрын
Damn you're lucky, mine died in 2004 :( I had the Powercolor 9700Pro AIW.
@RJARRRPCGP
@RJARRRPCGP 7 күн бұрын
@@supabass4003 Why did it die? That's a mystery, because high-end cards, especially that high-end, aren't even likely to have bad caps. Motherboards aside, I usually saw bad caps more in 2004-2005.
@coolmoments2994
@coolmoments2994 8 күн бұрын
I had bought a FireGL X1; it was about $1,000 USD new, and I still have it somewhere, actually. I was able to play games back in the early 2000s at insane resolutions on a super high-res CRT I used to have. It was awesome.
@TokeBoisen
@TokeBoisen 9 күн бұрын
I absolutely do not miss my first GPU, the Geforce FX5600XT. At that time Nvidia used XT to mean "low-performance" versions, something 15-year old me didn't know when I bought my first pre-built. Upgraded to a 6800GT later that year, and that thing was a beast at the time. Thanks for this blast from the past
@noth606
@noth606 7 күн бұрын
The FX5600XT, if I recall correctly, was a system-integrator-only SKU; it came as standard in some branded PC package deals? Pre-built can mean a local shop-built PC package, or something totally different like an Acer, HP or similar. I recall at least Medion had FX5x00XT system-integrator-only cards that were strange ones; they seemed to be binning rejects or bus-width-gimped GPUs. I had either a 5500XT or 5600XT that was a "faulty/disabled" 5800 where the memory controller was limited to 64MB at half speed or something like that. Very strange card, but not very useful.
@xRichUKx
@xRichUKx 11 күн бұрын
Very interested to hear about your career and experience writing graphics drivers!
@redgek
@redgek 10 күн бұрын
Yes! I'd love to hear more about.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
I expect that it will come out more as I continue to talk about graphics hardware and OpenGL. :)
@themidcentrist
@themidcentrist 10 күн бұрын
I had 2 Voodoo 2 cards in SLI back in the day. Great stuff! I used it to play the original Tomb Raider with a patch that allowed it to support 3DFX glide.
@NavJack27gaming
@NavJack27gaming 10 күн бұрын
Please Please cover more stuff like this! i had absolutely no idea you had this in-depth background with GPUs.
@johnmay4803
@johnmay4803 11 күн бұрын
i just want 2 say i love your vids keep up the fantastic work pal
@supabass4003
@supabass4003 9 күн бұрын
The moment you said you didn't own a 9700Pro in 2002, I knew you were going to say that you had an 8500LE!!!!
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
Ha! The worst part is... I just got it in October or November of 2002 to upgrade my original Radeon DDR. A little late to that particular party.
@raresmacovei8382
@raresmacovei8382 5 күн бұрын
@@TalesofWeirdStuff Not really. Due to the PS2 and original Xbox, games ran on DirectX8.1 until around 2006, with some games even in 2007+ still running on that API.
@drewnewby
@drewnewby 10 күн бұрын
From tomshardware, "Old Hand Meets Young Firebrand: ATi FireGL X1 and Nvidia Quadro4 980XGL" The FireGL X1 128 MB is shown at $749, and $949 for the 256 MB version.
@drewnewby
@drewnewby 10 күн бұрын
Uwe Scheffel's other articles are worth a read too, in this case ... " British Understatement: 3Dlabs' Wildcat VP Put To The Performance Test"
@TalesofWeirdStuff
@TalesofWeirdStuff 10 күн бұрын
Thanks for finding that. $1,200 is still quite a bit more, but it's not as absurd.
@KleinMeme
@KleinMeme 8 күн бұрын
@@TalesofWeirdStuff Correlates with what I found (mostly German hardware/IT sites that had reports on these cards from 2002-2003). :D One German hardware website had a benchmark report stating that the ATI FireGL X1 256MB was listed and sold for €995. The 3Dlabs Wildcat VP990 Pro had an MSRP of €999 before taxes, €1,160 with VAT applied. As stated in a report from June 24th, 2002 on a German IT website: the VP760 was listed for $449, the VP870 for $599, and the VP970 for $1,199.
@Arivia1
@Arivia1 10 күн бұрын
I've barely started this video but I love how you immediately dive into the programming and feature set nitty-gritty of this? these? GPUs and other graphics technologies. It's a clear tier above most any other historical GPU content on youtube (no shame on PixelPipes I love you but this is a whole other level).
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
I think there's plenty of room for lots of different perspectives. I equally enjoy LGR videos and Modern Vintage Gamer videos, but they tend to be very different levels of technical depth. I've been doing videos for almost 4 years, and I'm still trying to find where "my place" is.
@CallOFDutyMVP666
@CallOFDutyMVP666 8 күн бұрын
Great vid. Boosting engagements
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
Every little bit helps. :)
@jameskolby
@jameskolby 6 күн бұрын
Thanks for the wonderful video! As a computer engineering student, I find more technical explanations of relatively old hardware to be a fascinating look into how we got to where we are today.
@TheMajorTechie
@TheMajorTechie 10 күн бұрын
Very interesting! I've been interested in trying my hand at Windows graphics driver writing myself, though I haven't really had much of a clue of where to even start yet outside of grabbing the pre-made example drivers from the Windows DDK.
@slimebuck
@slimebuck 9 күн бұрын
back then my gpu was a voodoo 3 3000. to this day it is my favorite ever gpu. it blew me away back then. It replaced my 2x Voodoo2 SLi set up
@drewnewby
@drewnewby 10 күн бұрын
I had quite a few video cards prior, but I remember the ATI Radeon 9500 Pro as the first one I referred to as a graphics card proper.
@majbt45
@majbt45 8 күн бұрын
Really cool and in depth discussion on this sophisticated piece of graphics technology.
@gelijkjatoch1009
@gelijkjatoch1009 6 күн бұрын
Finally a very light game looking good, nicely aged.
@TalesofWeirdStuff
@TalesofWeirdStuff 6 күн бұрын
The bummer is that the scenery looks pretty good... too bad it's so dark you can't see any of it.
@gelijkjatoch1009
@gelijkjatoch1009 5 күн бұрын
@@TalesofWeirdStuff The games were also so scary back then that people would rather turn it off.
@buggerlugz6753
@buggerlugz6753 10 күн бұрын
The VGA capture impressed me! :)
@TalesofWeirdStuff
@TalesofWeirdStuff 10 күн бұрын
I realized afterwards that since these cards have DVI, I could probably use my regular HDMI capture device. 🤦‍♂️ I'll try that for the next video.
@seasidegalaxystreet
@seasidegalaxystreet 8 күн бұрын
I remember back in the day owning a £600 IIyama Vision Master Pro 510 22” NF diamondtron monitor and how cool that was. A great time to be in computing.
@jtsiomb
@jtsiomb 10 күн бұрын
While you were scrolling through the extensions, I could see both ARB_vertex_shader and ARB_fragment_shader (and the obligatory umbrella ARB_shading_language_100), so it absolutely looks like it has full GLSL support. Which makes sense, because 3DLabs was at the time at the forefront of GLSL and GL2 development. I had never heard of 3DLabs before reading their whitepapers and slides about it back then. The big question then is, does it have a GLSL noise() function that doesn't always return 0? Because I want one if it does.
@TalesofWeirdStuff
@TalesofWeirdStuff 10 күн бұрын
That is a fabulous question! I will for sure check the noise function. For a while Mesa had a real noise function, but we took it out because lower-end GPUs couldn't do it. It generated way too many instructions.
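A quick way to check (a sketch, assuming a GLSL-capable context; this is not from the video): draw with a shader whose output depends only on noise1() and read a few pixels back. GLSL 1.10 specifies noise1() as returning values in [-1, 1], so a driver that stubs it out to a constant 0 produces a flat 0.5 gray everywhere, while a real implementation produces visible variation.

/* Hypothetical test shader: remap noise1()'s [-1, 1] output into a color.
 * If every pixel reads back as 0.5 gray, noise() is the zero stub. */
static const char *noise_test_src =
    "void main()\n"
    "{\n"
    "    float n = noise1(gl_TexCoord[0].xy * 16.0);\n"
    "    gl_FragColor = vec4(vec3(0.5 + 0.5 * n), 1.0);\n"
    "}\n";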
@AluminumHaste
@AluminumHaste 2 күн бұрын
Man, the Radeon 9700 Pro was such a beast. I had one for like 3 days before I returned it; my CPU was too slow to actually use all of it. It was $700 CDN at Best Buy back in like 2003-ish.
@djayjp
@djayjp 8 күн бұрын
I recall the 6600GT was kind of considered **the** Doom 3 GPU.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
Somewhere in my collection I have a 6600GT with Doom 3 on the fan shroud. I think eVGA made it?
@bobomabuse2633
@bobomabuse2633 Күн бұрын
I also had the 6600GT. A great card.
@konsolendoc
@konsolendoc 9 күн бұрын
Cool Video
@pacocarrion7869
@pacocarrion7869 9 күн бұрын
512MB is a lot, remember that Quadro FX 3000G has 256MB (2003 released)
@AnonyDave
@AnonyDave 10 күн бұрын
Glad you have a sane definition for gpu. Nothing worse than someone looking at the most basic frame buffer (like say a sun cg3, where it's literally just a chunk of ram you can write to, and a ramdac that converts data from that ram into video data) and saying "look at this gpu"
@TalesofWeirdStuff
@TalesofWeirdStuff 10 күн бұрын
I rarely "correct people," but it does annoy me when people call even something like a Voodoo 2 a GPU. But... I always tell people the correct way to say GIF is the *opposite* of however they just said it. 😂
@kenh6096
@kenh6096 10 күн бұрын
Very glad I just found your channel, I am looking forward to the follow-up videos :)
@PixelPipes
@PixelPipes 10 күн бұрын
This is so interesting! I wonder if Doom 3 is defaulting to fixed function mode. That framerate though... Does it not have occlusion culling or am I misunderstanding that? I do remember my friends and I back in the day saw this card on NewEgg and such and just gawked at the ridiculously huge framebuffer. But we all knew it wasn't suitable for gaming.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
I'm going to delve into the rendering modes in the next video. I finally looked it up... "gfxinfo" will show which modes are available, and "r_renderer" can be used to specify a mode. Based on the available functionality, it should be able to do the NV20 mode, but it might be falling back to ARB. The card doesn't have occlusion query, but I don't think Doom 3 uses that. I'm not 100% sure, though. It's possible that's a factor.
@CarlScripter
@CarlScripter 10 күн бұрын
KEEP THIS GOING! 🤟😎🤟
@JamesSmith-sw3nk
@JamesSmith-sw3nk 23 сағат бұрын
"$1200? That sounds fine." - 4090 owners. 😁
@TalesofWeirdStuff
@TalesofWeirdStuff 8 сағат бұрын
Yeah... maybe in 20 years I'll do a video about that card too. 🤣
@mwitters1
@mwitters1 8 күн бұрын
I would really like a video on the geforce ti4600. That was my first really high end GPU.
@SianaGearz
@SianaGearz 9 күн бұрын
The term GPU was introduced on PC by Nvidia with the Geforce 256, which introduced a vector processor for a fixed function vertex pipeline and register combiners for the pixel pipeline. Configurable, not programmable. Outside the PC the term had popped up previously with very fixed function devices. SONY called the video subsystem of the original 1994 Playstation the GPU, and I believe there may have been earlier examples. But I guess if we agree to call only programmable units GPUs I wouldn't mind :D

Dreamcast not having a DVD drive is... I don't know, this seems largely irrelevant? I mean, who cares that the Gamecube has a drive with a DVD pickup; the discs have barely more capacity than the Dreamcast's, and you can't fit a DVD in there. The rotary speed of the Dreamcast drive is also not crazy high, it's not like any of the 52x PC drives; it's just noisy because there's a massive huge gap between the lid and the drive recess, so there is a lot of air noise from air running through that gap.

I'm going to bet the ARB fragment shader support is very incomplete and that it will actually reject valid programs that don't conform to hardware constraints. This is not something they could sanely do with the assembly ARB fragment program.
@ammaina01
@ammaina01 7 күн бұрын
The right way. Without hardware and software that we know, your knowledge is abstract to me. What did I understand in the end? The Weird would have been a great character in the movie.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
That comment made me laugh so hard. Thank you! 😆😂🤣
@OzzyFan80
@OzzyFan80 8 күн бұрын
That was very interesting. I love tech and hardware, especially GPUs, but just computers in general. I also come from the same time period as you do. To see these $1,200 video cards that can't run a game like Doom 3, while we now have $500 plastic boxes that can run it at 60 to 120 fps, is amazing. I just bought a new rig with a 4070 Ti Super, and the price of that GPU was hard to swallow, but it will last quite some time. Before the 4070, the last GPUs I bought were 2 EVGA superclocked editions or something, and they were 768 MB cards, I think, without looking up the specifics. But yes, EVGA is a company we need to come back. They made some of the most stable cards back then.
@homersimpson8955
@homersimpson8955 10 күн бұрын
Radeons support vertex texturing starting from the DX10-era cards, R600 and higher. But I have a feeling that R500 should also have gotten support, and it was disabled for some reason. Maybe it was buggy. It would be interesting to read the R500 specification about this. That feeling comes from Chronicles of Riddick: Assault on Dark Athena. This game specifically requires vertex texturing and treats R500 as unsupported, but a few Catalyst drivers actually pass the check and allow you to play the game for a while. The game still crashes on level 3 and there is no way to go further on R500, but it made me think that vertex texturing may be implemented but buggy, or maybe level 3 is the first level that really requires it.))

It's interesting that 3Dlabs implemented so many advanced features but forgot about basic ones like occlusion query. It's also not 100% necessary; I'm pretty sure Doom 3 does not use occlusion query and does geometry culling on the CPU, fighting overdraw with an early Z pass. Waiting for the next video with more games on P10.
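For anyone unfamiliar with the early Z pass mentioned above, here is a minimal OpenGL sketch of the general technique (an illustration of the idea, with a hypothetical draw_scene() standing in for the engine's geometry submission; it is not Doom 3's actual renderer code): lay down depth first with color writes off, then do the expensive shading with the depth test set to GL_EQUAL so occluded fragments never reach the shading work.

#include <GL/gl.h>

void draw_scene(void);  /* hypothetical: submits the level geometry */

void render_frame(void)
{
    /* Pass 1: depth only. No color writes; this just fills the Z-buffer
     * with the nearest surface at every pixel. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    glDepthFunc(GL_LESS);
    draw_scene();

    /* Pass 2: shading. Only fragments whose depth matches what pass 1
     * stored survive, so hidden surfaces skip the expensive per-pixel
     * work even without any occlusion queries. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_FALSE);
    glDepthFunc(GL_EQUAL);
    draw_scene();
}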
@arnox4554
@arnox4554 8 күн бұрын
Hey, this is mostly unrelated, but where did you get that shirt? What does it say? I'm really curious now!
@TalesofWeirdStuff
@TalesofWeirdStuff 8 сағат бұрын
The shirt is from the OpenGL "birds of a feather" session at SIGGRAPH 2021. It was celebrating the 20th anniversary of OpenGL. I'll show it a little better in the next video. :)
@danthompsett2894
@danthompsett2894 10 күн бұрын
I feel sorry that I missed this card in 2003. Seeing a label with 512MB on it, 2 years ahead of its competitors, would have really opened my eyes. Then again, this card really isn't designed for gaming; it's designed for making games, or 3D rendering for making films, etc. But then, even now I refuse to pay over £300-£400 for a graphics card. Although I am leaning towards the 4070 Ti, I'm still holding out on spending over £700 just on a graphics card as long as possible, especially since I'm still stuck on AM4 with DDR4, not AM5 with DDR5, which bothers me, since der8auer with his Thermal Grizzly line of products has made delidding and water cooling the AM5 CPUs as easy as pie, with their delidder and frame making it compatible with any water cooler block.
@davidlloyd1526
@davidlloyd1526 9 күн бұрын
Doom 3 was released in 2004, so I would expect a 2002 card to be able to run it.
@TheVanillatech
@TheVanillatech 9 күн бұрын
iD Software were known for pushing the boundaries. Back then, the game developers pushed the hardware guys every few months, and iD had been groundbreaking since Wolfenstein and Doom. Quake 1 required a very beefy computer, and Doom III was probably their most ambitious project of all. They worked directly with ATI (at first), gaining access to ATI's latest unreleased hardware, to code the game to stress the most capable cards out there from day one of the game's release. iD made games ahead of their time. So a 2-year-old GPU had no chance! Only the top-end cards at the time of release were gonna "breeze" through Doom, the same way only a top-end Pentium 200 could "breeze" through Quake 1, 7 years earlier.

ATI ripped iD Software off: a member of ATI Technologies "leaked" the code to the E3 Doom 3 demo onto the web, betraying iD's trust and releasing Carmack's code to ALL his competitors, for free! iD abandoned working with ATI on Doom 3 and switched instead to their main rival, Nvidia. This hiccup didn't do performance any favours for the gamers, as they had to essentially recode a lot of enhancements from scratch.

In short, Doom 3 needed a MODERN, FAST GPU to run smoothly. 2-year-old hardware need not apply. The Radeon 9700, a powerhouse of a card, struggled to maintain 30fps at decent settings. Same with the top-end Geforce FX5800 and FX5900. It wasn't until Nvidia released their Geforce 6xxx series, a week after the launch of Doom 3, that gamers could enjoy the engine cap of 60fps implemented in Doom 3, by using the latest, greatest 6800 Ultra / 6800 GT. The 6600GT did a commendable job if you sacrificed a few settings. And ATI's X800 series was fine too, although it was clear that iD had truly abandoned optimizing for ATI (thanks to that aforementioned leak), as the 6800 series was at least a tier ahead of the X800s.
@dave7244
@dave7244 8 күн бұрын
I had a 9800 Pro and it ran pretty well. These cards aren't really for gaming. They are for stuff like CAD, CAM etc.
@TheVanillatech
@TheVanillatech 7 күн бұрын
@@dave7244 9800Pro ran at 30-40fps at decent settings. Definitely playable! But no chance of maxing the game out, even at 800x600. The 6600GT did much better, even beating the 9800XT by some margin, despite the mid range Nvidia card being less expensive. 9800Pro beat the 6600GT in HalfLife 2 though! Didn't play well with OpenGL games but it was a DX9 powerhouse.
@DanielCardei
@DanielCardei 9 күн бұрын
I have one of those cards and it's artifacting. I thought it was worth the time and money to rebuild it, but seeing how it runs Doom, I think it's good for the shelf, on display 🤭
@AldrichQuaiHoi
@AldrichQuaiHoi 9 күн бұрын
Had a Wildcat card in a Dell rig with Xeons and Rambus RAM, specifically for running Hash Animation Master 3D modelling and animation. Man, these were some awesome cards; they made the software super stable. I was looking to upgrade to one of the Realizm ones, but they canned the cards soon after.
@ruthlessadmin
@ruthlessadmin 9 күн бұрын
The only thing I can imagine that would make a shader so big, is if you're building a data payload directly into it. Is that something that would ever make sense to do, though? Either way, very fascinating product. Thanks for sharing.
@fxandbabygirllvvs
@fxandbabygirllvvs 7 күн бұрын
I would hope 3Dlabs were still out there producing stuff.
@Pachupp85
@Pachupp85 10 күн бұрын
interesting content. Keep up.
@MaTtRoSiTy
@MaTtRoSiTy Күн бұрын
I had a 5600XT when this game released and I rushed out to buy a copy of D3 only to get a rude shock when I tried to play it. Thankfully I later upgraded to a 6600GT then a cheap used 6800GT which both ran it perfectly fine
@kenh6096
@kenh6096 10 күн бұрын
I think I recall on the old VP cards there was a slider in the drivers that did something that really helped if you tried to game on it. Could be thinking of the wrong card? sorry it's been a long time.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
There is a slider somewhere in the configuration panel for geometry vs texture. I think that just adjusts how memory allocation on the card is prioritized, but I'm not sure. I would hope selecting "OpenGL gaming" would set the tweaks for gaming performance. There may yet be registry keys or something that could be adjusted to improve performance.
@andycraig6905
@andycraig6905 19 сағат бұрын
Damn, i thought i knew enough to watch this but it became ancient greek in about 30 seconds lol
@TheVanillatech
@TheVanillatech 9 күн бұрын
My friend bought a Radeon 8500LE. I was running a Geforce 2 Ultra back then, I advised my friend to get a Geforce 3 to get similar performance to me (he was struggling to play Morrowind on his aging Geforce 2 MX). His budget didn't quite stretch to the £180 Geforce 3, so we settled on the Radeon 8500LE which was only £110. After some overclocking and tweaking, his card was surprisingly good! In 32-bit colour it could rival my GF2 Ultra in some titles, such as Giants : Kabuto. And Morrowind was vastly improved for him, along with those water shaders. Definitely not a full 8500, but still ... best bang for buck at the time.
@Tarukeys
@Tarukeys 9 күн бұрын
The Radeon 9800 was a DX9 card and supported Shader Model 2. I had one and it was an incredible card for its time. I was running Far Cry maxed out (it needed 1 GB of RAM or there were stutters, I tested it) with this beast when the game came out. Before the FX 5900, nVidia had no card worth a damn against it or the previous one, the legendary Radeon 9700 Pro. The FX 5800 and the lower cards were really bad. I built a Pentium 4C 2.4GHz with dual-channel RAM (first gen with it), 1 GB of Corsair TwinX and the Radeon 9800. It was one hell of a PC and served me well for years, which was unusual back then. PCs went obsolete so fast in that era. Question: Are you the guy who was making the custom Omega drivers for ATI cards?
@harrylarkins1310
@harrylarkins1310 8 күн бұрын
Are you sure it needed 1gb? I played it with a 8800gtx (and a 4200ti and 5900gtx before that) with 768mb and I don't recall that, maybe I'm wrong, It was a long time ago.
@dancar2537
@dancar2537 9 күн бұрын
wow, video cards are some kind of wild thing. parallelism without limits. had i been an european politician looking to start an industry in europe i would hire you if you would like to come
@antssaar863
@antssaar863 10 күн бұрын
Awesome video on an awesome card! Wish I could get one of these. Speaking of weird video cards (I kinda collect them), how about the stuff Intergraph made? Like the GLZ thingys :D Intergraph also had their Wildcat series; it came out around 1998 and had 80MB of memory. I have a GLZ2 or something from 1995 and it has 21MB or 24MB (insane for its time, 72 memory modules, a CAD monster :) Another fun one that tried to compete with Nvidia and ATI in the early days was the Kyro / Kyro 2. There are also Telesensory cards; though they used standard "GPUs", they added functionality for the visually impaired. I'm quite sure you would have a fun time messing with these :) I have some older ones from makers forgotten or never well known, but I can't remember them all right now.
@harrylarkins1310
@harrylarkins1310 8 күн бұрын
I had a kyro 2, love to see a video on that chip.
@spencerbentley8852
@spencerbentley8852 7 күн бұрын
I'm wondering if the person in this video is related in some way to the owner of the BPS space channel?
@TalesofWeirdStuff
@TalesofWeirdStuff 6 күн бұрын
We are not related that I'm aware of, but... it looks like an interesting channel!
@MrJorgalan
@MrJorgalan 10 күн бұрын
Nowadays, any graphics chip, even from the 8 and 16-bit era, is considered a GPU, which to me is an aberration. They were not GPUs, nor did we use that term back then! The term GPU started to be used with the GeForce 256 in 1999 and, as you mentioned, it should be programmable, which, as far as I know, happened a few years later. But as I said, back then we used the term graphics chip or VDP ("Video Display Processor"), or even PPU ("Pixel Processing Unit") for the SNES. Your videos are super interesting.
@eurocrusader1724
@eurocrusader1724 10 күн бұрын
Yup, I remember the GeForce 256 SDR (which I bought at the time for my Athlon 500 rig) being addressed as a geometry processing unit (GPU). Before that time, my Voodoo 2 12 meg was defined as a 3D add-in card, just like my ATi Rage Pro 4 meg, which can also be defined as a display adapter. Although my S3 ViRGE 2 meg before that was my real first 3D card. I know it's bad, but seeing Dark Forces 2 run in 320x200 with texture filtering hooked me for life.
@dyslectische
@dyslectische 9 күн бұрын
GPU came after T&L support. But really, a full GPU is DirectX 10, after the introduction of unified shaders.
@seebarry4068
@seebarry4068 9 күн бұрын
Video card was the term I was using back then.
@MrJorgalan
@MrJorgalan 9 күн бұрын
@@seebarry4068 Today, the term "graphics card" is also used, I believe. I think the term "GPU" refers more to the chip itself in a strict sense, and "video card" to the card in general.
@Sean-fj9pn
@Sean-fj9pn 9 күн бұрын
I remember in the 90s the term graphics accelerator was used a lot.
@itsGeorgeAgain
@itsGeorgeAgain 10 күн бұрын
God i was drooling soooo much over a 3DLabs card! I was a teen, but i was into 3D modeling. and i did know about them. But how the heck was i to utilize such a card, even if i could afford one. *EDIT: I would love a video from you talking about 3dfx's T-buffer if you have any insights on its working and how it compared to how D3D would work.
@AzraelSWFC2011
@AzraelSWFC2011 10 күн бұрын
First time watching one of your videos... Took me a second or two to realise you were NOT Brent Spiner... lmao. ;)
@ChrisJackson-js8rd
@ChrisJackson-js8rd 10 күн бұрын
whats your background re gpus? you clearly have some real expertise here
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
I started by coding Amiga demos in the early 90s. More recently, I've been working at Intel on the open source (Linux) drivers for their GPUs.
@kristapsvecvagars5049
@kristapsvecvagars5049 6 күн бұрын
The blue gadget next to the mouse - a zip drive by any chance?
@TalesofWeirdStuff
@TalesofWeirdStuff 5 күн бұрын
That is a SCSI Zip drive. I used it in my previous video about NEC PC, and I hadn't put it away yet.
@kristapsvecvagars5049
@kristapsvecvagars5049 5 күн бұрын
@@TalesofWeirdStuff Awesome. Hadn't seen one in ages.
@AgentLazarus
@AgentLazarus 2 күн бұрын
I didnt know Shoenice had a brother
@ainohautamaki2648
@ainohautamaki2648 7 күн бұрын
ATi FireGL Z1: €599; X1 128 MB: €799; X1 256 MB: €999. 3Dlabs Wildcat VP970: €1,350 for comparison. Couldn't find a period price for the 990 Pro.
@themidcentrist
@themidcentrist 10 күн бұрын
I'd love to see Skyrim running on an overclocked high-end 486 and the most powerful video card its motherboard can support. Maybe it could get 10 SPF?
@MakeSh00t
@MakeSh00t 9 күн бұрын
Today people complain that the 4090 is expensive, but 20 years ago pay was 4 times worse in my country. And yes, my pay is only 800 euros and I have a 4090.
@drumsmoker731
@drumsmoker731 9 күн бұрын
Priorities 😁
@morkalan5226
@morkalan5226 8 күн бұрын
800 per month or per year? Not trying to be condescending, but sadly there are many, many countries where 800 per year is more than double or triple (or even more than that) their salary per year.
@JustAGuy85
@JustAGuy85 8 күн бұрын
I played Doom 3 on a 9800PRO. Also played it on a 6600GT. The 6600GT played it far better, but that's when Nvidia was still KILLING image quality through the drivers, too. (They also were doing the same on the previous FX5000 series, too. BLURRY image quality for higher performance) Then ATi came out with drivers that increased Anti-Aliasing performance by 40% which made the 9800PRO like a new card. But, sure, the 6600GT was faster. And I even got a 6800GS AGP later, and then an HD3850 AGP (the last AGP ever made). Waste of money, because my 3400+ (s754) wasn't powerful enough to really unleash it. But yeah, the 9800PRO played Doom 3 "okay".
@P5BDeluxeWiFi
@P5BDeluxeWiFi 8 күн бұрын
My first graphics card was a PCI FX5200; it could not run Doom 3... TPU says "Since GeForce FX 5200 does not support DirectX 11 or DirectX 12, it might not be able to run all the latest games." I think that might be true.
@davidbrennan5
@davidbrennan5 2 күн бұрын
I had a 9800 pro and a athlon 64 3000 when doom 3 dropped.
@TalesofWeirdStuff
@TalesofWeirdStuff 8 сағат бұрын
I have a suspicion that a strong CPU would help the performance of this card. Doom 3 was demanding on the GPU and the CPU too. The Pentium 4 2.4GHz seems a little low. The minimum system spec was 1.5GHz. We'll find out soon enough.
@davidbrennan5
@davidbrennan5 Сағат бұрын
@@TalesofWeirdStuff I had the 256-bit bus, 128MB RAM version of the card, I believe. It was fairly smooth at 1024x768. I had a good air cooler on the chip with heat pipes and a medium overclock. I think it was running in the mid to high 50s FPS from what I remember, and we thought that was great at the time. I think the RAM I had in the system was running at 400MHz. That was the first game I played with other people online; I had played games on a network or 1v1 with a modem, but not online before this. I was working at a grocery store and saved my money for the parts. The Pentium 4 was real close in actual games from what I remember.
@davidbrennan5
@davidbrennan5 Сағат бұрын
@@TalesofWeirdStuff I had the card with the 256bit bus and 400mhz Ram medium overclock with heatpipe air-cooler. Ran smooth.
@dave7244
@dave7244 8 күн бұрын
Not really surprised it runs poorly. These look like cards for CAD / CAM and professional work.
@BrettHoTep
@BrettHoTep 8 күн бұрын
Doom 3 on a 1200 dollar graphics card from 2002?? I ran doom 3 on a 64mb mobile Radeon in 2004 and finished it so yes it will probably run just fine. The laptop was at least a year old at that point so I wouldn’t be surprised if it was at best from 2002.
@BrettHoTep
@BrettHoTep 8 күн бұрын
Mobility Radeon 9200. Turns out it was 16MB; the Mac Pro my friend had, I remember having double the RAM, and those ran 32MB. Geez.
@TalesofWeirdStuff
@TalesofWeirdStuff 8 сағат бұрын
Wow... you *really* wanted to play that game. Lol.
@deus_nsf
@deus_nsf 9 күн бұрын
Cool stuff! What about turning off shadows? or maybe reducing a couple of special options?
@turbinegraphics16
@turbinegraphics16 10 күн бұрын
I stopped pc gaming during this period so its very interesting.
@minombredepila1580
@minombredepila1580 5 күн бұрын
Amazed by your knowledge. We know that your career is related to OpenGL, but we want to know more 😉 I would also appreciate some more links to investigate, if possible. I wonder if the name "Wildcat" is related somehow to the Intergraph Intense 3D Wildcat. I own one (the 4210 model) and I am still looking for a MoBo with a Pro110 AGP slot to plug it into...
@TalesofWeirdStuff
@TalesofWeirdStuff 8 сағат бұрын
3dlabs bought Intergraph IP in 2000, and they continued to use the Wildcat branding after that. I don't think it's a continuation of the architecture, but I also don't think it's a continuation of the Gamma / MX architecture from the 3dlabs in-house chips.
@minombredepila1580
@minombredepila1580 8 сағат бұрын
@@TalesofWeirdStuff Your knowledge left me speechless. Thanks for your answer ;-)
@TalesofWeirdStuff
@TalesofWeirdStuff 7 сағат бұрын
In fairness... I had to look it up on Wikipedia. :D
@minombredepila1580
@minombredepila1580 7 сағат бұрын
@@TalesofWeirdStuff Shhhhhhhh !!!!! 😀
@obi-wankenobi1190
@obi-wankenobi1190 6 күн бұрын
R300 decimated everything NV30 stood for; the game benchmarks prove it more than enough. NV30, like NV35 and NV38, struggled a lot in DirectX 9.0a/9.0b due to the pixel pipeline / vertex shader pipeline build-up. The R300, which was used for the Radeon 9500/9700 series, only did DX 9.0a, whereas the 9800 series used the R350, which had DX 9.0b support; this fixed a few things the R300 lacked. The R360 was used for the 9800 XT, which was just the final card for the Radeon 9000 series. R420 added 4:1 texture compression and a 256-bit GDDR3 VRAM interface, and beyond this, things only got better for ATi. NVIDIA's FX series was their absolute worst; ATi really caught them by surprise. The architecture of R300 was far superior as well: it simply could do more, and it cost less.
@mraltoid19
@mraltoid19 10 күн бұрын
Doom 3 came out 2 years after this card. Pretty sure an RX-6950XT and RTX3090 will be able to play the new Doom coming out in a couple years pretty well .
@TheVanillatech
@TheVanillatech 9 күн бұрын
Things were different back then. Rapidly accelerating technology, hardware guys pushing software developers and vice versa; it was still more than a 2-horse race (barely), so more competition. Generational gains in GPU speed were insane. Doom III was groundbreaking, as were all iD games of that era. A 2-year-old GPU didn't stand a chance; that gap was more like 10 years in today's industry. They were aiming for the GF3, but that boat sailed, as the Ti4600 provided almost 2x the horsepower of the GF3, and even the flagship GF4 didn't stand a chance in Doom III.

The Geforce FX5800/5900 did an okay job (playable at least), as did the Radeon 9700/9800, but neither could come close to running Doom III at max settings at the engine cap of 60fps! More like 25-35fps, until you sacrificed details in settings or dropped to lower resolutions. Doom III didn't run "smooth" on anything until a week after its release, when Nvidia released the Geforce 6800GT/Ultra. ATI's X800 series also did the game justice, to a lesser extent.
@prezeskodaty4637
@prezeskodaty4637 3 күн бұрын
radeon X1950 XT agp is the most efficient graphics card for agp8 bus if anyone is looking for maximum performance for retro desktop computer that's why you need to buy motherboard with agp8pro bus for example abit ic7max3 to connect more efficient graphics cards. graphics cards in the future because agp8pro gives you a larger power supply and the next generation agp9 bus will be identical to the agp8ultra bus but agp9 will not support backwards compatibility with agp4 which is why agp8pro is better supporting agp4
@keithsimpson2685
@keithsimpson2685 8 күн бұрын
Radeon 9800 pro was the first real gfx card I had. Got it late when it was cheap.
@RJARRRPCGP
@RJARRRPCGP 7 күн бұрын
Unfortunately, in 2007, I got a used one, and it was a bad video card! Even the BIOS screen would get corrupted!
@DenverStarkey
@DenverStarkey 8 күн бұрын
I had a Radeon 9700 Pro back when Doom 3 came out. It ran it at like 25-40 fps @ 1024x768 (4:3), settings on high with shadows on medium. Some areas it'd drop to 18fps though, and that shit used to piss me off.
@harrylarkins1310
@harrylarkins1310 8 күн бұрын
I had a fx 5900 at the time and it wasn't too good there either.
@DenverStarkey
@DenverStarkey 7 күн бұрын
@@harrylarkins1310 Yeah, Doom 3 was a late OG Xbox / PS2 era game (but on PC), and the Radeon 9000 series and Nvidia 5000 series were really kinda prior to that gen, so they struggled with games like Doom 3 or Far Cry that were games of the latter half of the PS2 / Xbox era. Now, the cards that ran them like champs for the time were the Nvidia 6000 series and the ATI Radeon X800 XT series (not to be confused with AMD's modern XT line). A year after it came out, I actually got a Radeon X850 XT that ran Doom 3 at 1024x768 with settings maxed, at frame rates of 50+, which at the time was amazing to me. Prior to that card, my frame rate target had always been 35-40 fps; 60 fps was a rarity, because hardware and software were developing so fast from 1996-2005.
@Agoz8375
@Agoz8375 8 күн бұрын
Hello, I run Doom 3 with a GeForce 2 MX400 64 MB SDRAM lol. It runs good on very low settings 😊.
@AshtonCoolman
@AshtonCoolman 10 күн бұрын
OpenGL is really forgiving
@mauriciosanchez5922
@mauriciosanchez5922 10 күн бұрын
44 minutes to test the game for 2 minutes.
@zulutgseta8276
@zulutgseta8276 8 күн бұрын
I clearly have no time for such a long chit-chat. 😫
@DanielWha
@DanielWha 8 күн бұрын
1200$ and now they prob cost 5 bucks (rtx 4090 will meet this fate some day)
@awilliams1701
@awilliams1701 10 күн бұрын
I'm actually impressed. While it looks like crap (I think the colors don't look quite right), it's semi smooth. On my high end card doom 3 ran worse than this. I had an athlon xp.
@kaptentigu5919
@kaptentigu5919 9 күн бұрын
Even Voodoo2 can run Doom3.
@CompatibilityMadness
@CompatibilityMadness 10 күн бұрын
Hello, first up: great and informative video! I hope to hear more about those cards (and maybe the P20s) later :)

For Doom 3 you could just run the built-in benchmark (hit "Ctrl + Shift + ~" and type in the console: "timedemo demo1 1"; that "1" at the end is for precaching stuff, to make the first test result more in line with the next runs). This should make benchmarking easier and more 1:1 between cards. Side note: Doom 3 v1.0 doesn't have the 60FPS lock, but all versions after it do. So, if you go over 60FPS during testing, it won't show it in the results.

Alternatively, you could try to run Far Cry 1 (which has both a D3D and an OpenGL path as well), which has a free and working HOC benchmark tool available for easy benchmarking.

Do you plan to test fillrate on those? I usually run Fillrate Benchmark 2004 (it's free and has multiple options for fillrate), or 3DMarks with feature tests enabled, to see how different cards stack up (the advanced version keys required for feature tests can be copied from the official legacy UL website).
@Vote.ReformUK
@Vote.ReformUK 10 күн бұрын
I had a 9700 Pro, the first GPU I bought with my first work pay cheque. I couldn't afford to build the whole PC; I had to build it slowly over the year. That thing lasted me until the 8800GT. I remember having to mod games to get them to work; Oblivion in 2006, I think, wouldn't run, but someone had a shader model patch or something. The only difference it made was that you couldn't run all the nasty blur and bloom that game used :/ So you made the game artificially refuse to run on the 9700 Pro for that? To make the game look worse! Bioshock was another game that did that, and Shader Model 3 made no difference whatsoever; it was a UE 2.5 game :/ Though that is a 9800 Pro here, because of that silver heatsink. I'm so glad I never knew about 9700 Pros failing back in the day. I always had mine laid on the desk with no case (couldn't afford one), so I had a desk fan blowing directly on it, and that must have saved it from overheating.
@supabass4003
@supabass4003 9 күн бұрын
Yeah man you were lucky to have yours last until the 8000 series, my Powercolor 9700Pro AIW died in 2004.
@Vote.ReformUK
@Vote.ReformUK 9 күн бұрын
@@supabass4003 I simply think it was because I had that extra cooling blowing on it.
@wumpols
@wumpols 9 күн бұрын
sorry i have to say it, you look like an old mark zuckerburg
@thelasthallow
@thelasthallow 9 күн бұрын
To all of the complaints about this card: why do you think the company doesn't exist anymore? Because they tried to do a lot of things nobody else did, but then left out a lot of useful features that others like Nvidia and AMD used. Nvidia and AMD were more generalised and cheaper.
@vigilant_1934
@vigilant_1934 6 күн бұрын
ATI not AMD. AMD didn't acquire ATI until 2006 so 4 years after this 3DLabs card came out. ATI had been in the graphics business since the late 1980's if I recall correctly.
@thelasthallow
@thelasthallow 4 күн бұрын
@@vigilant_1934 who gives a shit, you didnt contribute anything to the discussion what so ever, just came in with your dumb ass superiority complex.
@guily6669
@guily6669 9 күн бұрын
Those 512MB do nothing for gaming, jeez. In 2004 I bought the ATI X9600XT for less than €100 and could play Doom 3 fine at my max res of 1024x768 and almost maxed out; I believe I just reduced shadows and had no AA...
@supabass4003
@supabass4003 9 күн бұрын
My 9700pro couldn't do that, I think your memory might be a bit fuzzy!
@guily6669
@guily6669 9 күн бұрын
@@supabass4003 Nope, for almost 2 years most games ran on high; I just usually reduced shadows and used no AA. Where I might be wrong is the resolution; my display was maybe 1024x768. PS: and of course it wasn't 100% 60FPS all the time, but it was always close to it for most of the game... I had more trouble with Far Cry 1, and even when I upgraded to an ATI X1800 GTO they had updated Far Cry and I could never run it maxed out LOL; it was the benchmark before Crysis 😁
@prezeskodaty4637
@prezeskodaty4637 3 күн бұрын
graphics card nvidia quadro fx 1000 agp connect with sli cable to one nvidia quadro1nv10 pci to one 3dfx voodoo1 pci to one 3dfx voodoo2 pci to make windows millennium work normally and computer game elder scrols 3
@Synthematix
@Synthematix 10 күн бұрын
And nvidia claim ray tracing is something new, er no. its been around since the mid 90s
@ruthlessadmin
@ruthlessadmin 9 күн бұрын
It's actually been a part of 3D computer graphics from the start
@kungfujesus06
@kungfujesus06 9 күн бұрын
Sure, but not typically in graphics hardware. Even Pixar's renderman cheated with rasterization tricks. Some sfx houses were doing ray tracing by that point but it was almost entirely on the CPU and usually distributed over a render farm.
@ruthlessadmin
@ruthlessadmin 9 күн бұрын
@@kungfujesus06 I thought you meant the rendering technique in general.
@techpriest4787
@techpriest4787 9 күн бұрын
Erm. Nvidia never said that. It was even Nvidia that taught us about science that happened during the middle ages with strings, if I recall. What Nvidia said and did was to give us hardware RT, which is also not to be confused with compute RT, which is not actual hardware RT. Also, before you get "the more you buy, the more you save" wrong: no, it is not for gamers. That is the reason why you fail to understand it in the first place. It is for businesses. The more productive the tools are, the more profit there is. So you buy more tools to save even more. Duh...
@williehrmann
@williehrmann 9 күн бұрын
Of course it was. I had been waiting on real-time ray tracing in games since 2002-ish. I was sure sooner or later it would come and I wouldn't have to puke anymore every time I see a screen-space reflection disappear when I look down.
@powerplay074
@powerplay074 8 күн бұрын
Can a $1,200 GPU from 2002 run a game from 2003? Lol
@brugj03
@brugj03 9 күн бұрын
Time saver: it can't run Doom 3. Done.
@MrDukeeeey
@MrDukeeeey 10 күн бұрын
Your definition of GPU is odd. Old Voodoo cards were definitely GPUs. They had no 2D support. Sure, the pipeline was fixed function, but that's not as limiting as it sounds; you could do vertex shaders essentially CPU-side. But they did rasterisation, texture filtering, etc. Modern GPUs are really less about rasterisation and drawing and more just about general compute, just in a SIMD fashion.
@tenchi71
@tenchi71 9 күн бұрын
No. They were 3d accelerators.
@TalesofWeirdStuff
@TalesofWeirdStuff 7 күн бұрын
This is a bit pedantic, and that's why I rarely give people grief about it. But since you brought it up... ;)

Back in the day... that would have been called a "3D accelerator," as @tenchi71 says. This was to differentiate it from a card like the ET4000 that was just a "video card." At one time even cards like the ET4000 and S3 Trio were called out as "Windows accelerators" or "2D accelerators" to differentiate them from more simple VGA cards like the Trident 8900.

GPU is supposed to be a riff on CPU. The feature that sets CPUs apart from the more primitive logic that preceded them was the ability to run programs. That's why I use that as the difference between a GPU and a 3D accelerator. In all of these cases, the cards in the old category eventually cease to be made, so the old words disappear. When the old words disappear, people start calling everything by the new words... even when it doesn't really apply.
@jimblorg6263
@jimblorg6263 6 күн бұрын
Anyone who was playing Doom 3 was playing on a 6800 GT. Doom 3 released in late 2004; the 6800 released in spring that year.
@awilliams1701
@awilliams1701 10 күн бұрын
I'm not optimistic (just starting the video). I found that Doom 3 just doesn't work on AGP cards (at least not with a playable framerate). I had a high-end 7000 series card; it was supposed to be the fastest AGP card ever made... it couldn't do Doom 3. Or maybe it was some kind of AMD + AGP bug?
@JohnDoe-ip3oq
@JohnDoe-ip3oq 10 күн бұрын
What? Doom 3 ran fine on literally all reasonable agp cards, even the gf4mx which was a rebranded gf2. People even hacked the voodoo 2 driver for it. I know it ran fine on dx8 and dx9 cards. If you can't run it, you're doing something wrong, bad hardware/driver IDK. Also, afaik Radeon had the better agp cards, and switched to bridge chips later than Nvidia. Runs great on the x800 through x1900. X800 doesn't do sm3, but most sm3 games had community sm3 patches.
@awilliams1701
@awilliams1701 10 күн бұрын
@@JohnDoe-ip3oq For whatever reason it ran like shit for me. My sister had this cheap ass computer and put in an GF 8100 and got better general performance than my high end 7000 series. I want to say it was a 7950. It sounded like it should have had 2 GPUs, but it didn't. Once I moved over to core 2 with PCI-E I never had anymore issues.
@JohnDoe-ip3oq
@JohnDoe-ip3oq 9 күн бұрын
@@awilliams1701 Those dual-GPU cards were super broken, especially because they were not intended for AGP; they were optimized for PCIe's independent lanes, and putting that on a bridge chip with wonky drivers is a guaranteed bad time. SLI was never perfect, but definitely trash until DX10, bare minimum.

I would not recommend Nvidia for retro gaming anyway, because Nvidia had garbage hardware until DX10. DX8 was kinda acceptable if you didn't use anisotropic filtering, but nobody wrote games for it on PC. Xbox did, but even Halo was ported to DX9. Devs did not care to write assembly for DX8 on both vendors, and ended up at the lowest common denominator. Then only ATI had good DX9, and Nvidia yet again ruined games with poor performance; optimization stayed lowest common denominator until SM3. Which Nvidia had poor performance on again, but developers at that point didn't care and maxed out SM3, which ran best on Radeon. The DX8 Radeons did have driver issues on Windows 98; no problem on XP/DX9.

OpenGL was better on Nvidia, but you are basically talking Quake 3 and Doom 3, which were fine on Radeon. So why Nvidia at all? Maybe some games like Splinter Cell using Nvidia-specific shadow effects. That's about it, and definitely not a good reason for one game and overall worse performance. Also, anything not Windows 98 doesn't need period-accurate hardware, and runs better on modern hardware and Windows. Especially if you're using Steam; it doesn't work on XP anymore. At least a dual-GPU AGP card should have collector's value on eBay. That's one positive, I guess.
@awilliams1701
@awilliams1701 9 күн бұрын
@@JohnDoe-ip3oq It had the number indicating dual GPU, but it wasn't. It was single. It was like 7950 or something. But.......I also had a 6800.....or was it a 6600? I don't recall. Neither card was playable on my Athlon XP in doom 3. They were perfectly fine in unreal 3 which looked just as good. So I don't know. lol
@awilliams1701
@awilliams1701 9 күн бұрын
@@JohnDoe-ip3oq I do remember when I moved from a GeForce 2 GTS Pro to a 5200 (still in the mindset that there were only normal and Ti models, and having no idea the 5200 was trash). I found the performance was mostly unchanged... except... particle effects. They were a LOT faster. I remember seeing smoke in most games would tank the frame rate... but not anymore.
@cesaru3619
@cesaru3619 8 күн бұрын
lol rip 3dlabs your gpus were crap because you didn't learn to research new chips or even rent new ones...
@user-dh8ji4ti8b
@user-dh8ji4ti8b 9 күн бұрын
Doom 3 came out in 2004, and it is a little bit strange to ask if it will run on a 2002 GPU.
@michaelcosta2181
@michaelcosta2181 Күн бұрын
Too much bla bla 👎