Yes, I know: more AI. Love it or hate it, that's the way things are going. For us gamers, it started with upscaling, then frame gen, then Multi Frame Gen, and soon, it seems, fully AI-generated frames.
At GDC today Nvidia announced that "neural shading support will come to DirectX preview in April, unlocking the power of AI Tensor Cores in NVIDIA GeForce RTX GPUs inside of graphics shaders used to program video games…"
The end goal, presumably, is to have the game engine tell the GPU about the primary in-game elements—objects, movement, and so on—and let AI flesh out the rest of the picture.
It's difficult to imagine how that could work without any information on how to flesh out said picture, but that's where the "game data and shader code" training comes in: developers can give the AI model a good idea of what things should look like when rendered, and then, when players actually fire up the game, the model can do its damnedest to replicate that.
As Nvidia's Blackwell white paper explains: "Rather than writing complex shader code to describe these [shader] functions, developers train AI models to approximate the result that the shader code would have computed."
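To make that idea concrete, here's a toy sketch of the principle the white paper describes. This is emphatically not Nvidia's pipeline (real neural shaders run in HLSL on Tensor Cores via the Cooperative Vectors API); it just trains a tiny network to approximate one simple shader function, a clamped Lambertian diffuse term, from example inputs and outputs:

```python
import numpy as np

# Toy illustration only: approximate a "shader" -- here, the Lambertian
# diffuse term max(dot(normal, light), 0) -- with a small MLP trained
# offline on sampled inputs, as a stand-in for learned shader code.

rng = np.random.default_rng(0)

def shader(n, l):
    # "Ground truth" shader output: clamped dot of normal and light dir.
    return np.maximum((n * l).sum(axis=1, keepdims=True), 0.0)

def unit(v):
    # Normalise rows to unit length.
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Training data: random unit normals and light directions.
X_n = unit(rng.normal(size=(4096, 3)))
X_l = unit(rng.normal(size=(4096, 3)))
X = np.hstack([X_n, X_l])   # 6 inputs per sample
y = shader(X_n, X_l)        # 1 output per sample

# One-hidden-layer MLP, trained by plain full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(6, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)    # ReLU hidden layer
    pred = h @ W2 + b2
    err = pred - y
    # Backprop for mean-squared error.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.maximum(X @ W1 + b1, 0.0) @ W2 + b2 - y) ** 2).mean())
print(f"final MSE vs. the real shader: {mse:.4f}")
```

The trained weights stand in for the "shader code" at runtime: instead of evaluating the original function, you evaluate the network. The pitch for doing this on a GPU is that the same trick scales to functions far too expensive to compute directly per pixel.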
This will presumably be tailored to Blackwell given Nvidia has worked with Microsoft to develop the Cooperative Vectors API, though Nvidia does say that "some of [the developer-created neural shaders] will also run on prior generation GPUs."
We already had an idea that this was in the works: in December 2024, Inno3D spoke about "Neural Rendering Capabilities" in its then-upcoming graphics cards, and we'd seen mention of neural rendering from Nvidia before that, though not in a context that could actually be implemented in games just yet.
And then, with the launch of the RTX 50-series cards and the RTX Blackwell architecture, we had our first look at Neural Shaders in action at CES, with the likes of neural texture compression (offering a touted 7x saving in VRAM usage), RTX Skin (as seen in HL2 Remix's meaty headcrabs), RTX Neural Radiance Cache (also featured in HL2 Remix), RTX Neural Faces, and RTX Neural Materials all promising an enhanced level of realism in games without utterly tanking frame rates.
Nvidia VP of Developer Technology John Spitzer calls this "the future of graphics" and Microsoft Direct3D dev manager Shawn Hargreaves seems to agree, saying that its addition of "Cooperative Vectors support to DirectX and HLSL… will advance the future of graphics programming by enabling neural rendering across the gaming industry."
It's almost a reflex for me to be sceptical of anything AI, but I must remember that my scepticism over frame gen has slowly abated. I remember seeing character hands moving through the in-game HUD and writing off DLSS 3 frame gen when it launched, but now those problems are rare and even latency isn't half-bad if you have a high baseline frame rate.
So I'll try to keep my mind open to at least the possibility that this could actually be a step forward. At any rate, we'll find out before long—just a few weeks until devs can start trying it out.
