[wplug] video card power consumption -- performance comparison ...

Bryan J. Smith b.j.smith at ieee.org
Mon Sep 10 08:29:09 EDT 2007


On Mon, 2007-09-10 at 08:15 -0400, Bryan J. Smith wrote:
> With that all said ...
> nVidia's latest "lower-end" products in the G80/8000 series are supposed
> to cut power significantly over the G70/7000 and NV40/6000 series.
> E.g., they regularly highlight the power savings of the Go 8400 and Go 8600
> in notebooks over their equivalent Go 7000 series.  But they haven't
> released a "lower power" version for the desktop yet AFAIK, or maybe
> there is a GeForce 8400 now.  In any case, there are GeForce 7050
> chipset integrated, as well as GeForce 6150, 6100 and 6150LE (basically
> worse than 6100) chipset integrated video.  The great thing is that they
> _commonly_ come in sub-$50 MicroATX mainboards, great for small
> form-factor.  The other nice thing is that the MCP44, MCP51 and MCP61
> peripherals have _very_well_supported_ and 100% GPL drivers for ATA,
> NIC, audio, etc... on kernel 2.6 -- better than Intel IMHO (especially
> ATA at times, Intel seems to keep 'screwing the pooch' with newer ICH8
> and ICH9 logic changes that prevent the existing support from working).

The G70 series (e.g., the GeForce 7050 chipset-integrated GPU) is supposed
to have far better video off-load than the NV40 (e.g., the GeForce 61x0
chipset-integrated GPU), and the G80 is better still (although I haven't
seen a G80 chipset-integrated GPU yet -- they typically come out much
later than the cards).  MPlayer and other open source projects often take
advantage of these nVidia accelerations, and standard Xv, overlay and
other video features "just work" because nVidia is really good at
supporting open standards (even if their drivers are not open source --
that's always been their hallmark, including supporting AIGLX almost
"off-the-bat" in a beta driver).

I don't think people realize how much is "missing" in the Intel open
source drivers -- both the utter lack of a kernel driver and the lack
of features in the user-space X11 driver -- which affect even 2D,
including video playback, overlay, etc.

Anyhoo, my point is that your current nVidia GeForce 6600LE _is_ overkill.
Back in 2006, I used to maintain tables on the 6000/7000 series ... (I
stopped because of BlogSpot's stupid formatting issues):  
http://thebs413.blogspot.com/2006/02/geforce-6-and-7-series-variants-nuts.html


The NV43 (GeForce 6600 series) uses an 8/4 vertex/pixel design with a
128-bit external memory bus.  The GeForce 6600LE is a "crippled" 4/3
vertex/pixel design, paired with standard DDR or sometimes DDR2 memory.
I.e., they are probably the GPU ICs that had damaged units on the wafer,
or otherwise had units "fail" tests under nominal operation.

The NV44 (GeForce 6100/6150) is a 2/1 vertex/pixel design, using the
main system's external memory bus.  It's going to be slower for 3D
rendering than the 6600LE (which is probably 3-4x faster), but for 2D
output, including video playback and overlay, it should have no issues
and look no different.
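
(To put a rough number on that 3-4x figure: 3D throughput scales
roughly with pixel units times core clock.  The little calculation
below is only my own illustration -- the clock is an assumed figure,
set equal for both parts just to isolate the pipeline counts quoted
above; the 6600LE's dedicated 128-bit bus vs. shared system memory is
what pushes the real-world gap from ~3x toward 4x.)

/* Back-of-the-envelope: rough 3D throughput ~ pixel units * core clock.
 * The clock value is an assumption (and equal for both on purpose);
 * only the pixel-unit counts come from the figures above. */
#include <stdio.h>

int main(void)
{
    double clock_mhz       = 425.0;  /* assumed, for illustration only  */
    double le_pixel_units  = 3.0;    /* GeForce 6600LE  ("4/3" above)   */
    double igp_pixel_units = 1.0;    /* GeForce 6100/6150 ("2/1" above) */

    double le_fill  = le_pixel_units  * clock_mhz;  /* ~Mpixels/s */
    double igp_fill = igp_pixel_units * clock_mhz;

    printf("6600LE   : ~%4.0f Mpix/s\n", le_fill);
    printf("61x0 IGP : ~%4.0f Mpix/s\n", igp_fill);
    printf("ratio    : ~%.1fx before memory-bandwidth effects\n",
           le_fill / igp_fill);
    return 0;
}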

The G72/G74(?) (GeForce 7050) is also a 2/1 vertex/pixel design, using
the main system's external memory bus.  There are improvements in the
architecture, so even with a drop-off in clock (also saving power), it
typically bests the NV44 by 10-25%.  But it's not a major jump up in
performance either.

Again, I haven't seen any "chipset integrated" GPUs yet from the G80
series, not even for notebooks.  But I haven't kept up either.


-- 
Bryan J. Smith         Professional, Technical Annoyance
mailto:b.j.smith at ieee.org   http://thebs413.blogspot.com
--------------------------------------------------------
        Fission Power:  An Inconvenient Solution



