Archive: nVidia Cg and AVS


24th July 2002 01:59 UTC

nVidia Cg and AVS
Some of you may have seen this post or heard of it elsewhere, but nVidia has just launched a new graphics-oriented programming language called Cg. When I heard about it, one thing that popped into my head was that this might be a sweet thing to do AVS in. Now, I know nothing about C and little about AVS, but Cg seems to have some simple sin, cos, etc. functions, and if AVS were done in Cg it would be hardware accelerated and therefore extremely fast; we're talking 1600x1200 32bpp 4x anti-aliased here. I would imagine, anyway. Maybe not. :weird::confused::)

OK, now that that's out of the way: this may be a really stupid idea, and someone will clue me in as to why it won't work. I know many people have wanted hardware-accelerated AVS for a long, long time, and I just thought this could be something to think about.

I don't know if each 'preset' would have to be written in Cg, or if a code base could be established behind a user-friendly interface like the current one.


24th July 2002 07:55 UTC

Cg is just a nice way to program vertex and pixel shaders... it's not a complete programming language. And just because two languages can both calculate a sine doesn't mean they are even closely related.

However, using pixel and vertex shaders for a hardware version of AVS would be cool: shaders, especially pixel shaders, are quite revolutionary and a huge step forward. But Cg doesn't really add anything to that; it just makes it easier for programmers to write shaders.

Doing the actual scripting in AVS in Cg doesn't make much sense: programming shaders requires a good knowledge of what you're doing and isn't everyone's cup of tea. And the 'type random formulas' approach doesn't work as well with them.
However, shaders could be used to implement some AVS effects that 3D hardware couldn't do efficiently before (e.g. Trans/Movement).

An important thing to remember is that it would only work on the most recent cards, though, and considering that people still regularly come here asking for a non-MMX version of AVS, it would mean a much smaller userbase.

Don't get me wrong, I'd love to see a hardware-accelerated AVS too; just don't expect the whole problem to be solved because nVidia pulls another stunt :). Those nVidia demos are always very cool, but they are usually misleading because they focus on one effect only. So while you might indeed be able to draw a human face in full detail with their cards, it'd take all the card's power. And in any real 3D application there's certainly more than just one human face.


24th July 2002 13:27 UTC

Thanks :D :D Like I said I wasn't really sure what I was talking about so thanks for the clarification.
As far as the accuracy of nVidia's statements goes, I agree they're usually pretty hyped up by the PR dept. However, I'm pretty sure this stuff can be handled full screen by the next-gen GPUs; their CineFX has to compete with ATI's RasterMan or whatever they call it. But that's not really the topic here, so it doesn't matter.

Anyway, thanks for the input. Maybe hardware support will come later; until then I'll just watch my 14 fps fullscreen vis. ;) :winamp: :D


24th July 2002 13:34 UTC

By the way, I didn't mean to bash nVidia: they make great products. I just wanted to point out that you'll never see something like that picture of the human face in a game today, because a game has to do a LOT more than just human faces, while the demos nVidia makes are focused completely on one thing and completely optimized for their hardware.