Talk:GeForce 8 series
First Article
This is the first time I have created a new article on Wikipedia. Constructive feedback will definitely be needed. I read many of the rules and regulations about posting, and have made many edits to existing articles. I just need to know what else I need to make this "fit" with the GeForce series. Please submit feedback! Bourgeoisdude 18:32, 21 February 2006 (UTC)
Perhaps some readers will feel that this article is too speculative or unneeded. Personally I have no problem with the article, since the next graphics series from Nvidia will be notable and will probably be named GeForce 8. Nonetheless, we should add only official statements to the article, avoiding speculation. Shawnc 05:42, 25 February 2006 (UTC)
- Thanks Shawnc. Bourgeoisdude 23:25, 27 February 2006 (UTC)
- It seems to me like this article contains a lot of original research, which is against Wikipedia policy. Please cite sources for these rumors. 66.245.44.26 21:42, 17 July 2006 (UTC)
Dunno about you guys, but I think that DirectX 10 is used for generating Unreal 2006-level graphics (shaders, light/glow effects, etc.). Also, I would imagine that clock speeds for the next generation of GeForce would reach 1 GHz. 165.21.154.111
Why is there nothing posted regarding the fact that the entire 8400M and 8600M series chips are faulty? They literally self-destruct due to a major design flaw. Please help me understand the logic behind this. AeroAtom (talk) 04:14, 20 May 2009 (UTC)
No mention of the introduction of CUDA, which is a major thing. — Preceding unsigned comment added by Akostadi (talk • contribs) 08:43, 23 December 2017 (UTC)
Release Date
Are you sure the expected release date is June/July this year? Narwaffle 04:32, 21 April 2006 (UTC)
- No. That's been the rumor though...I think you're right, it is likely that rumor is false. Will see if any "more accurate" rumors are present... Bourgeoisdude 17:52, 21 April 2006 (UTC)
- I cannot find any significant rumors about the release date...unless someone or something tells me otherwise, I will edit this without the release date info as I am unable to verify its authenticity. Bourgeoisdude 17:18, 27 April 2006 (UTC)
- Found rumors that seem to indicate that NVIDIA's newest GPU family will be released in the late fall, so I changed the article accordingly. Bourgeoisdude 23:05, 2 May 2006 (UTC)
We should make it a rule: a link to the Inquirer doesn't count as a source.
Why the recent edit of release date by anon. ip? I have made a thorough search for information, but the best info I could find was this Inquirer article. (Yes, I know. It's the Inquirer, but still, there isn't exactly an abundance of reliable sources here...) No sources I could find pointed towards anything but a 2006 launch, though. --Fat Hobbit 18:23, 3 August 2006 (UTC)
Nov 8th is the launch date! Source: http://www.geforcelan.com Looks like they are launching the next "platform" at a LAN Party.
Also, when talking of dates please avoid this howler: "in the summer of 2007". That's January, where I live. 150.101.30.44 (talk) 05:17, 2 February 2008 (UTC)
Vandalism Watch
Why is it that this page is such a magnet for vandalism? Was it something I said? Bourgeoisdude 14:42, 18 May 2006 (UTC)
- I would guess it's because of the constant ATI/Nvidia fanboy warring... --Fat Hobbit 18:23, 3 August 2006 (UTC)
Uhh...
[edit]"In order to combat power supply concerns, Nvidia has declared that G80 will be the first graphics card in the world to run entirely off of the souls of dead babies. This will make running the G80 much cheaper for the average end user."
Blasphemy. :D
Vandalism?
Is it wrong for me to be LMAO at that statement? I must give him credit at least for the humor; nonetheless, this is not the place... Bourgeoisdude 19:16, 3 October 2006 (UTC)
Dual Chip vs Dual Core
I changed it to say 'dual chip', since 'dual core' video cards (like the 7950 GX2) actually use two separate chips, each with one core on them. The concept of a 'dual core' video card is a misnomer; i.e., every card since the Voodoo2 has been 'dual core' in the sense that they have multiple pixel pipelines.
Manufacturing process?
The article says that they'll be switching to 80 nm some time in the future, but makes no mention of what it is now... --203.206.183.160 08:04, 26 October 2006 (UTC)
- Who posted that they'll use an 80 nm process? That may be true, but I'm suspicious, since the standard processes of today and tomorrow are 90 nm, 65 nm, and 45 nm. This is the first I've heard of an 80 nm process. Patrick Gill 01:13, 12 November 2006 (UTC)
- No need to be suspicious; they will make an 80 nm chip. AMD wanted to release an ATI chip using 80 nm before Nvidia, but AMD is having some trouble getting it out, so Nvidia will release their 80 nm 8800GTX before AMD/ATI. Here is an article from The Inquirer: http://www.theinq.com/default.aspx?article=36673
Article Expansion/Restructure
With the release of the cards coming tomorrow, the article should head in the direction of the other graphics card articles. For example: http://en.wikipedia.org/wiki/GeForce_6_Series, http://en.wikipedia.org/wiki/GeForce_7_Series, http://en.wikipedia.org/wiki/Radeon_R520. Pictures and individual benchmarks, while exciting for us hardware enthusiasts, aren't encyclopedic. The rumors section will need to go. The official nVidia logos for the GeForce 8 cards should be added. Great care should be taken to cite the cards' hardware specs instead of just editing in information and numbers. These are just my thoughts; I look forward to working with everyone on the article =) Tyro 05:02, 8 November 2006 (UTC)
- Agreed, let's cut the 'rumors' for now since the real thing is here. Bourgeoisdude 21:32, 8 November 2006 (UTC)
Benchmark Testing
Can you be more specific about which cards you are using, whether they are stock or overclocked, and at what speeds?
RTM is here
Alright guys, I got us started on moving this from a future product to a current one; now we need to change everything else. Any help in transitioning would be great, especially as I am still trying to figure out all the ways that wiki works. The card looks really awesome by the way: http://www.nvidia.com/page/geforce_8800.html Bourgeoisdude 21:30, 8 November 2006 (UTC)
Fixed - Almost
I've done most everything to make it updated, but it's in a weird format.
The "Addressing the Rumors" section was just the old one with some minor word changes, and some question marks for unknown information (better than being outdated and misleading).
I got rid of some old references, corrected the details table and added information to it, and changed the "Production Information" section. Viperman5686 --November 8, 7:16PM Eastern.
- I did some pretty heavy chopping up and reorganizing; I think this layout will work though. A section to say what all the GF8 cards have in common, and then a section for each series (8800, 8600 when it comes out, etc). I tried to keep the language from being too technical as well, so someone reading the article won't need a huge GPU vocab dictionary to get by. Tyro 09:46, 9 November 2006 (UTC)
Pixel Pipelines?
Correct me if I'm wrong, but I'm fairly certain this card has pixel pipelines. Someone took my section of the technical summary out mentioning pixel pipelines and changed the wiki to say, "Historically, graphics cards had fixed numbers of non-unified shaders or pipelines. The graphics card's rendering power could become bottlenecked waiting on one high demand shader type."
Look at the fill rate; a simple math equation can tell you the 8800GTX has 64 pipelines and the 8800GTS has 48: (fill rate × 1000) / (core MHz), so 36800/575 = 64 and 24000/500 = 48. Does the above quote mean they have 64 and 48 including the vertex pipelines, effectively reducing the real-world fill rate?
Viperman5686 7:03AM Eastern November 9th, 2006.
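For anyone checking the arithmetic above, here it is written out (this is only a sketch of the fill-rate reasoning; as the replies below note, the quotient counts texture throughput per clock rather than classic pixel pipelines):
<math>\frac{36800\ \text{MTexels/s}}{575\ \text{MHz}} = 64, \qquad \frac{24000\ \text{MTexels/s}}{500\ \text{MHz}} = 48</math>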
- I may have removed this while I was trying to reorganize the article, sorry. I used [3] and several pages from [4] as my basis to write about the new shaders. As I understand it, the old-style pipelines have been done away with in favor of the new "stream processors". Tyro 20:40, 9 November 2006 (UTC)
- You definitely can't refer to pixel pipelines anymore. That was becoming inaccurate with cards from the last couple years too. G80 doesn't have anything like the pixel pipelines of older cards. --Swaaye 21:09, 9 November 2006 (UTC)
- So basically it has "Stream Pipelines" that do vertex, pixel, physics, and geometry? Or can the term pipeline not be used at all? --Viperman5686 Thursday, 2006-11-09 T 21:37 UTC
- Never mind, I get it now. Pretty weird that nVidia didn't even tell us whether they would use a unified architecture or not; it was hard to understand at first. It looks like we'll have to come up with a new way to easily define the raw power of same-brand cards, because before it was just MHz × pipelines. --Viperman5686 Thursday, 2006-11-09 21:48 UTC
- I'm not entirely sure myself yet. Trying to digest all of this stuff takes some time. But yea, the stream processors do most of the work and then the ROP area outputs pixels. Or something close to that lol. Read the insanely in-depth Beyond3D article if you'd like to take a gander at understanding it (yikes).
- Some programs (like Everest Engineer Ed.) tell me that my 8800GTS has 20 pixel pipelines with 1 TMU per pipeline. Not sure if that is correct. The GPU is running at 575 MHz, and fill rate is reported at 11500 Mpixels/s.
- Pipelines haven't been a good measure for a LONG time. Consider that the X800XT PE massively outgunned the 6800 Ultra in fill rate, yet it wasn't much ahead of it in most cases. Pixel pipelines date back to the Voodoo1 (a 1x1 design). Things have changed so much since then that they became almost unrelated to overall performance. Another example: the 16-pipe R580 defeats the 24-pipe G71 because it has more pixel shader resources, and pixel shader numbers were decoupled from the pixel pipelines even in the previous generation. Then there's also memory efficiency, ROP numbers, etc.--Swaaye 21:52, 9 November 2006 (UTC)
- Yeah, when you compare ATi cards vs. nVidia cards, then that's exactly what you get. What I meant was: when you want to look at the 7900 GT vs the 7600 GT, the number of pipelines and clock speed can give a good idea of the performance difference. --Viperman5686 Thursday, 2006-11-09 21:48 UTC
- Yeah that's true I suppose. In each generation at least. With GeForce 8 it'll probably come down to how many sets of stream processors there are. 8 in 8800, etc.--Swaaye 01:34, 10 November 2006 (UTC)
- Surely that won't be the case either. Obviously it will make a big difference, but it would seem to me that stream processor clock speed (which will affect how much each processor can do per second) is probably going to have a fair effect, as will memory bandwidth (there will surely still be instances when the card is memory limited) and also core clock speed (I'm assuming this could be a limiting factor depending on the scene too). Nil Einne 07:31, 10 November 2006 (UTC)
- The number of stream processor groups will definitely drop with lower end models. Why? Because it will create smaller GPUs. A mid-range or low-end card can't have a 670-ish million transistor chip. Clock speed will change too, but I don't know how much. Previous generations had mid-range boards running similar clock speeds as high-end models, but without the same amount of actual computational resources they were obviously slower. --Swaaye 18:02, 10 November 2006 (UTC)
Table
They both say nVidia GeForce 8800 GTX; the one with the lower specs should say nVidia GeForce 8800GTS.
High end & mid range
Does anyone really think the 8800 GTS can be called mid-range? I would argue the 8800 GTS is the high end and the 8800 GTX is the ultra high end. The midrange and entry level are unannounced, although as this article speculates, they'll probably be the 8300 and 8600. Nil Einne 04:49, 10 November 2006 (UTC)
- I've been bold and changed it back to how it was. An anon is the one who changed it here. This is also more consistent with this article and our Geforce 7 and I guess 6 articles BTW... Nil Einne 04:54, 10 November 2006 (UTC)
8 groups of 16 stream processors
I don't have a great understanding of processor design, so forgive me if this is stupid, but the article is not 100% clear on whether there is any restriction within a group. What I mean is: are all 128 stream processors completely independent from each other? From the article and my general knowledge I'm guessing they are all more or less independent. Obviously you could only disable a whole group (for the GTS etc.), but other than that I'm assuming they can each act independently. However, perhaps there are limitations, e.g. on bandwidth etc. Nil Einne 07:24, 10 November 2006 (UTC)
- Check out the reference link. Or just check out either Tech Report's review or Beyond3D's architectural analysis. Both have more data than needs to be in this article. --Swaaye 18:04, 10 November 2006 (UTC)
- I see it like 8 processors with 16 cores. They each have their own cache. The 8800 GTS has 6 processors with 16 cores. They can all work together. --Viperman5686
- Does the 8800 GTS really have 6 processors with 16 cores (or whatever you want to call it), or 8 processors with 16 cores, 2 of which have been disabled and may or may not actually be working fine? I would assume it's the latter. Nil Einne 12:44, 11 November 2006 (UTC)
- It is the same chip, built from the same wafers as the GTX variant, except that indeed some part of it may be malfunctioning or not working at the target rate; the design allows those parts to be shut down, making a fully functional, slightly less powerful chip. We may see more variants in the future, as the possibilities are almost endless (number of memory partitions, number of ALUs, clocks, etc.).
The description saying that the card is composed of "8 groups of 16 stream processors" does not match with the programming information presented in the CUDA documentation: http://developer.download.nvidia.com/compute/cuda/0_8/NVIDIA_CUDA_Programming_Guide_0.8.pdf See page 49 for a description of the 8800 hardware. Using NVIDIA's terminology, there are 16 "multiprocessors" each composed of 8 "processors". It should be noted that the "processors" are not independent, and are more like floating point units rather than independent execution cores. All 8 of them in a group have to execute the same instruction at once. At the multiprocessor level, they can act more independently. So the current description should be changed to "16 groups of 8 stream processors" for the GTX and "12 groups of 8 stream processors" for the GTS. Stan Seibert 22:46, 17 March 2007 (UTC)
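A minimal sketch of the counting in the CUDA guide's terminology described above; the per-multiprocessor lockstep behavior is noted in comments rather than simulated:
<syntaxhighlight lang="python">
# G80 layout per the CUDA programming guide cited above: "multiprocessors"
# each contain 8 "processors" that execute the same instruction in lockstep
# (SIMD), so they act like ALU lanes rather than independent cores.
PROCESSORS_PER_MULTIPROCESSOR = 8

def stream_processors(multiprocessors: int) -> int:
    return multiprocessors * PROCESSORS_PER_MULTIPROCESSOR

assert stream_processors(16) == 128  # 8800 GTX: 16 groups of 8
assert stream_processors(12) == 96   # 8800 GTS: 12 groups of 8
</syntaxhighlight>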
AGP
Will this card support AGP?
- Not natively, but in theory at least, it is perhaps conceivable that one of those PCI-E to AGP bridge chips could be used to allow it to work in an AGP slot, though I'm going to say at this point it is highly unlikely. The new-gen video cards really need newer CPU architectures to perform optimally anyway, so putting a new-gen graphics card in an old-gen PC may be an unwise decision. Heck, I'd be willing to bet that an old PCI version of a GeForce 8 series card (8300LE??) may be released before an AGP version would. So far, no official announcements have been made by nvidia or any of its manufacturers to support AGP or PCI versions, but we can't yet rule it out. Bourgeoisdude 22:05, 29 November 2006 (UTC)
Power consumption
GeForce 8 has very high power consumption!
- I think you should more appropriately say GeForce 8800 has high power consumption. I have no doubt that the mid/low-end models will be much more frugal.--Swaaye 20:27, 18 November 2006 (UTC)
- It says the 8600GTS draws 71 W, but tests on the net (Xbitlabs, which is a believable source) show it to draw a modest 47 W. That sounds more plausible, as there are passively cooled 8600GTS cards whose heatsinks aren't overdimensioned. 85.19.140.9 01:39, 20 August 2007 (UTC)
SLI
It's been about 6 years since I've built a gaming desktop PC, and I'm thinking about doing it again soon. I know absolutely nothing about SLI except that having two 8800GTX's sounds good. The Wikipedia SLI article mentions that on high-end cards the advantages of SLI can be diminished; does anyone know how this applies to the 8800GTX? Does anyone know what its SLI performance is like? --GothMoggie 15:21, 23 November 2006 (UTC)
- This page is for helping to improve the article. Your question would be better answered on a tech forum such as arstechnica or tomshardware or anandtech or something. Please see WP:NOT. Thanks. --Yamla 15:37, 23 November 2006 (UTC)
- SLI mode on two 8800GTX's uses 75% of the total processing capability of the two cards. It's like having the power of one card and half the power of the other card. However, one 8800GTX/GTS outperforms two 7950GTs in SLI. —The preceding unsigned comment was added by 167.206.216.189 (talk) 20:16, 12 January 2007 (UTC).
I'd like to see the specs on two 8600 Ultras SLI'd together... the total cost would be LESS than a single 8800GTS! I can't wait for March... --Dante Alighieri | Talk 22:43, 25 January 2007 (UTC)
We should talk about architecture improvements
I think that someone (I will, if you agree with me on this) should write about the new features in a way that shows the improvements in the GeForce 8, which is completely different from the GeForce 7 architecture. There are many, like the filters, the geometry shader, and CUDA processing. Why bother making a page that only shows the card clocks? You have to show people why this card is good, especially since the biggest change was in the architecture. On Nvidia's page there is a 55-page PDF that shows all the improvements; we only need to write them up here. Thanks!!--Darktorres 18:45, 1 December 2006 (UTC)
- I agree with you--or at least I agree with what I believe you to be saying. I think it should follow the format/layout of previous wiki articles, but we should have extra content because it is a major architecture change. Let's model the page to be similar in layout to the GeForce 7 article, and have content extras like those in the GeForce 256 article (since it was the last "major architectural change"). Last edited by Bourgeoisdude 20:58, 1 December 2006 (UTC)
- OK, I saw the GeForce 256 article and I will use the same layout. I intend to make a 'changes to architecture' section and show what has changed. Something that could change is the fact that other articles (GeForce 7, for example) don't talk about architecture, only numbers and numbers. Numbers aren't the only important thing; it's like thinking that only the gigahertz of your processor counts. But my reference covers the difference between the 7 and 8 architectures, and I will try to make it as simple as possible! --Darktorres 00:43, 3 December 2006 (UTC)
- Problem is that I doubt any of us have a grasp on the hardware. I know I don't. Beyond3D has the best architectural look bar none. Those guys are graphics programmers and have chatted with the engineers. Trying to duplicate that seems a bit like reinventing the wheel, especially when we don't really know for sure that we're right. I definitely think what I threw together for the 8800 here is enough. I have considered making that info more generic too. We sure don't need that kind of dense coverage on every eventual model, and all of the models will undoubtedly have the same technology at work. --Swaaye 23:20, 5 December 2006 (UTC)
- I agree completely. The 7 Series wiki didn't define pixel shaders, vertex shaders, ROPs, etc., and neither should this. Viperman5686 01:30, 9 December 2006 (UTC)
- Wikipedia:Make_technical_articles_accessible. This article is already too technical, the average reader would have a very hard time understanding it. Tyro 10:47, 9 December 2006 (UTC)
- Hmm, OK. But some info like the CUDA tech is missing; it's just like PureVideo, something new that could end up with everyone using it (or not). I will add these. I didn't want to add shaders, ROPs, etc., but the fact that it can do HDR+AA should be mentioned. Darktorres 21:34, 12 December 2006 (UTC)
I added some of the modifications I had made in Word, but they need some formatting; I am still learning how this wiki works... Darktorres 21:37, 12 December 2006 (UTC)
Bottlenecks
Is the 8800's performance less dependent on CPU and memory speed than older cards?
- Due to the 8800's massively powerful GPU, it is quite easy for the CPU to hold it up. Cooldude7273 03:45, 20 January 2007 (UTC)
Should we make subheadings for 8800GTS and 8800GTX, like previous geforce wiki formats?
Well, the question says it all. So should they be in their own separate sections? (e.g., GeForce 7800 GT and 7800 GTX are in separate sections, etc.) Bourgeoisdude 16:04, 21 December 2006 (UTC)
- The differences between the 8800GTX and GTS are GPU clock speed, memory clock speed, and stream processor count. There is not really enough difference to warrant their own sections; the sections would repeat 90% of the same information. —The preceding unsigned comment was added by 167.206.216.189 (talk) 20:20, 12 January 2007 (UTC).
- You can make subheadings eventually... as more 8 series cards are released. Right now, the list just isn't long enough.
better than xenos and RSX
Is the GeForce 8 series better than the Xbox 360's and PS3's GPUs? —The preceding unsigned comment was added by Falcon866 (talk • contribs) 01:47, 23 December 2006 (UTC).
- Much better in basically every way. Also vastly more expensive. --Swaaye 07:18, 23 December 2006 (UTC)
8900?
Is there any word on when the 8900 will be released, and what specs we will be looking at, so this can be added to the page? Mattm591 21:04, 26 December 2006 (UTC)
- Nope--so far, no official word from anyone about the 8900 series at this time...Bourgeoisdude 01:04, 13 January 2007 (UTC)
Upcoming Products
[edit]Under the "Rumors" section, I added brief information about the release of the middle-end and lower-end cards. The article I got my information can be found here.
8600 specs, but "official" enough?
[edit]that page lists the specifications of the 8600GT and 8600Ultra. but is it a good enough source to be used as a source here? Pik d 09:00, 3 February 2007 (UTC)
The Inquirer I would not call a good source of anything, and I would take what they say with a grain of salt. Candle 86 15:35, 20 February 2007 (UTC)
- Sounds like you're confusing The Inquirer with The Enquirer. 69.85.180.21 19:28, 26 February 2007 (UTC)
No I'm not; Fuad is always wrong. Candle 86 04:10, 6 March 2007 (UTC)
Longest card ever?
It seems unlikely that this is the longest consumer graphics card ever, since a full length PCI card is a bit over 12" long, and I'm pretty darn sure that there existed some full length PCI graphics cards which would have qualified as "consumer". --Dyfrgi 06:45, 8 February 2007 (UTC)
No, maybe full length ISA cards, but I have yet to see any full size PCI graphics cards at all. The biggest PCI card I ever saw was a Banshee, and it wasn't that long, about the same size as a GeForce 5900. Candle 86 15:35, 20 February 2007 (UTC)
Wasn't there a dual-GPU Voodoo (or something from around that era) that was full-length PCI? --71.124.171.221 18:33, 28 August 2007 (UTC)
PCI-e/8600
It's been said that the 8800 cards won't run with just 4 or 8 PCI-E lanes (e.g. [5]). Any news on whether or not the same will be true for 8600 cards? 69.85.163.51 04:05, 26 February 2007 (UTC)
Theoretical peak performance
I modified the theoretical peak performance from 520 to 345 GFLOPs. The REAL specs, released by NVidia, can be found on NVidia's CUDA site, in the CUDA programming guide. I also linked to a forum where an internal NVidia developer explains how the FLOPs are estimated. The previous link was obviously wrong: the SP cannot perform 3 FLOPs per clock, but 2 (one MAD, or one MUL and an ADD; not one MAD and an ADD as the old link suggests).
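For clarity, here are the two estimates being contrasted, assuming the 8800 GTX's 128 SPs at the 1.35 GHz shader clock (the corrected figure counts a MAD as 2 FLOPs per SP per clock; the retracted one counted 3):
<math>128 \times 1.35\ \text{GHz} \times 2 = 345.6\ \text{GFLOPS}, \qquad 128 \times 1.35\ \text{GHz} \times 3 = 518.4\ \text{GFLOPS}</math>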
Something
How about not using symbols like the tilde (~) to represent terms like "about (amount)"... They look awful in an encyclopedia. --202.71.240.18 09:26, 8 April 2007 (UTC)
It's not official until it's official
Since NVIDIA has not made any official announcements as to the 8600 series' availability, other than generalised statements like "Spring of 2007", the April 17th date is not official, is it? I just hate having things stated as fact before we know they are fact, and the chart just proclaims the release of the 8600 series as the 17th, but we do not KNOW that. Put it in the article and say "such and such source" claims the release will be April 17th or whatever, but in the chart it is as if NVIDIA has it set in stone, and that can be misleading. Bourgeoisdude 15:09, 10 April 2007 (UTC)
- Never mind; I read the DailyTech article again, and although nvidia.com does not say it is happening, nvidia has indeed announced the new cards. I guess I'm just a little too over-anxious to ensure the article only proclaims future cards if they're official, but they are official this time... Bourgeoisdude 15:19, 10 April 2007 (UTC)
Is 8500 really "entry-level"?/ 8300 release date
According to the infobox at the top of the article, the 8500 GT is listed as an "entry-level"/low-end GPU. I don't think that's quite accurate. Although the 8500 is the weakest card of the current bunch (as of April 17th), it surely won't be in a couple of months. I think we should leave this entry labeled "TBA", and follow the style of the other GeForce series articles (GeForce FX, 6, 7, etc.) by removing the "GT", "GTS", etc. to prevent eventual overcrowding. And we probably shouldn't put in the 8300 until it's more official. So, I think it should look something like this:
Release date: November 2006
Codename: G80+
Cards:
- Entry-level: TBA
- Mid-range: GeForce 8500[1], GeForce 8600[2]
- High-end: GeForce 8800
DirectX: 10.0, Shader Model 4.0
Finally, correct me if I'm wrong, but I don't think any of the sources after the 1st bullet point in the "Future Development" section say that any card other than the 8500 GT, 8600 GT, and 8600 GTS will ship on April 17th. So I think the bullet point should be reworded to indicate that the other cards mentioned (the 8600 Ultra and the 8300 series) will ship soon after the 17th, but not on the 17th (however, it is important to note that they will ship, as there are entries in the .inf files of newer drivers, and some websites that mark their existence).
67.167.93.51 22:27, 10 April 2007 (UTC)
I believe Nvidia claimed the 8500 as entry level. It has the same price as entry-level GPUs in the 7 and 6 series, anyway (just like the 6200 when I bought it a year ago). But if you insist, you can leave it that way, because 8500 is indeed too high a number, and there will someday be an 8300/8400 in the future.
References
No 8800gs
This website says that NVIDIA will NOT release an 8800 GS; it was a typo in the 158 driver release notes. [6] 199.8.170.40 15:09, 26 April 2007 (UTC)
- I've removed the reference to it. --Xyzzyplugh 06:16, 6 May 2007 (UTC)
GeForce 8M series announced
While there is little if any info in the GeForce 7 article in regards to the mobility chips, can we at least attempt to integrate the info on the GeForce 8 mobile GPUs? I'm just thinking that if we don't start now, we'll never do it. I'll start a section and put in the reference URL and such, assuming I stay 'unbusy' at work for a while... Bourgeoisdude 16:02, 10 May 2007 (UTC)
- Well, I hate to start a frame and not finish it... unexpected issues came up. Any help would be greatly appreciated, thanks. Bourgeoisdude 16:32, 10 May 2007 (UTC)
Floating Point Performance
According to several websites, the peak theoretical FLOPS don't reach ~500 gigaflops because the MUL operation isn't always available. So should I change the FLOP count, or say that it is not always possible to reach this peak FLOP count because the MUL operation isn't always available?
AMD's Radeon HD 2900 XT graphics processor - The Tech Report
And that page links this Beyond 3D page as its source NVIDIA G80: Architecture and GPU Analysis - Page 11
Someone look into this because I'm not completely knowledgeable with this FLOP stuff, just repeating what the site said. --Sat84 11:33, 15 May 2007 (UTC)
The numbers in each of these individual 'series' pages do not match the numbers in the full List_of_Nvidia_graphics_processing_units, and it seems that the full list is closer to reality; for example, https://www.techpowerup.com/gpudb/758/geforce-8800-gts-512 states the same 416 GFLOPS as the list. Gendalv (talk) 01:45, 24 February 2018 (UTC)
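As a sanity check of that 416 GFLOPS figure, assuming the 8800 GTS 512's stock 1.625 GHz shader clock and MAD-only counting (2 FLOPs per SP per clock):
<math>128 \times 1.625\ \text{GHz} \times 2 = 416\ \text{GFLOPS}</math>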
The best?
Isn't the 8800 Ultra card currently the best in the world? Shouldn't this be noted? --AnYoNe! 17:19, 22 May 2007 (UTC)
- It all depends on the POV, and NPOV. Plouiche 15:49, 25 May 2007 (UTC)
- I think it was already noted on the main GeForce page, if not here, but a more accurate term would be "the fastest" rather than best. 74.103.180.140 17:15, 14 June 2007 (UTC)
- Content like that also dates the page and that means more work in the near future when it is inevitably surpassed. It's also not the fastest in every case, such as in some older games that don't work quite right with the cards due to driver issues. --Swaaye 17:37, 14 June 2007 (UTC)
Power consumption
I'm reasonably sure what you've got posted as the 8800GTX's power consumption is wrong, and I remember reading in two places that the 8800 Ultra was a revision of the G80 that allowed it to consume less power than the 8800GTX, which I believe was rated for 177 W.
Yeah, here we are. FiringSquad claims: "8800GTX 177W, 8800 Ultra 175W, 8800GTS 147W". http://firingsquad.com/hardware/nvidia_geforce_8800_ultra/ Anand states: "Beyond cooling, NVIDIA has altered the G80 silicon. Though they could not go into the specifics, NVIDIA indicated that layout has been changed to allow for higher clocks. They have also enhanced the 90nm process they are using to fab the chips. Adjustments targeted at improving clock speed and reducing power (which can sometimes work against each other) were made."
I'm gonna go ahead and change that, till someone can prove me wrong.
- Er, you are wrong. I don't exactly know what max board power means, but I'm sure the 8800 Ultra WILL draw more power, as its core/shader/mem clocks are higher, and as I remember, the revision enabled higher clock speeds, not lower power consumption. Here's a link to some power consumption figures [7] and [8], but I just reverted to the old ones.--Sat84 04:29, 7 June 2007 (UTC)
GeForce 8800 GTX (XFX model pictured) Image
I notice the "GeForce 8800 GTX (XFX model pictured)" image is not very sharp. I was wondering if it would be okay/proper to upload a sharper photo, but of a different model (EVGA 8800GTS).
Not a huge/important thing, but curious.
--KittenMya 20:55, 14 June 2007 (UTC)
- certainly. I was thinking of doing the same but I haven't had the will to pull the card out of my computer. :) --Swaaye 22:51, 14 June 2007 (UTC)
Crossfire on Intel MoBo??
I heard the guys at Voodoo managed to make Crossfire work on an nvidia board. Could it be done on Intel boards too? —Preceding unsigned comment added by 203.130.242.203 (talk) 07:10, 9 September 2007 (UTC)
8800GT specs
I know http://forums.firingsquad.com/firingsquad/board/message?board.id=hardware&message.id=110268 isn't exactly a reliable source, but it's probably not far wrong in this case. I can't think of a clean way to tag the 8800GT line as "not certain", so if people feel strongly enough about it, undo the edit.
AntiStatic 07:21, 6 October 2007 (UTC)
GeForce 8800 GT stuff
Just a quick 8800 GT "to do"/discussion list to avoid any edit warring:
1) 8800 GT GPU clock - The link I gave said 740 MHz; someone edited it to 600 MHz, which also rings a bell for me, though I'm not sure why. Include one, both, neither?
2) Shader clock - I can't find any mention of it anywhere. Ditto for power and transistor count.
3) Pictures - the only pics I'm aware of are the ones released by mobile01.com and also at http://forums.vr-zone.com/showthread.php?t=193203 - both of these appear to be from the same source, as they are identical save for the watermarks. Either that or the vr-zone pictures are just ripoffs of the mobile01 images with the watermarks photoshopped out. Although they look like press-pack images, I'm pretty sure neither of these are GFDL'd nor could be used via fair use.
Also, we need a cite for the 8800 GTS v2. I can probably hunt one down tomorrow if no one else can be bothered :)
- Done; cited an Inquirer article.
Gah, too many people editing the same page at once ... AntiStatic 15:51, 8 October 2007 (UTC)
I moved the 8800 section to the "Future Development" section for now, because it is a future chip right now. A section with words like "leaked" and other things of a speculative nature should stay under the future development section until more is known (for sure) about the card. —Preceding unsigned comment added by 71.235.45.130 (talk) 20:05, 25 October 2007 (UTC)
Midrange
- Although marketed as midrange cards, the 8600 series has been criticised by a few publications, such as PC Format and Custom PC, for being underpowered successors to the GeForce 7 series, in a similar way to ATI's competing HD 2600.
While I'm aware of the criticisms, as it stands the above statement is potentially misleading. There is no question that the 8600 series is better than the 7600 series. The reason for the criticism has been that the 7900GS and X1950Pro, which in some locations (particularly around the launch of the 8600 series) were comparably priced, were usually better performing. This criticism has been particularly vocal given that the X1650 and 7600 were generally unquestionably better than the previous high end, and the same with the 6600GT compared to the 9800XT. However, the 8600 series was obviously always intended to succeed the mid range only, so even though there is valid criticism, it needs to be worded properly. Nil Einne 21:21, 27 October 2007 (UTC)
- I'm left confused by the claims that the 8500GT is a bit slower than the 6600GT, partly because I swapped my 6600GT out for an 8500GT a while back and benchmarked the crap out of both in the process. The 8500 was slightly faster than the 6600 in some tests, and massively faster in others. Supertin (talk) 08:04, 15 October 2008 (UTC)
8800gts rev2
The info about this card in the table is inaccurate. The clock speed is around 576 MHz, not 600. The shader clocks are not, to my knowledge, 1500. The card does not have a release date of December. Certain companies, such as BFG, are already selling new cards with 112 shader processors. If you look on their site, it will say either "112 shader processors" or "96+ shader processors", which is, I assume, the "rev 2" card with more shaders enabled that everyone is expecting.
-reply:
I would have corrected this but I don't know how to edit the charts. Check this thread for an explanation of the differences in full: http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2126925
Basically what happened here is that the chart is referring to the 8800GTS rev3 as rev2. Rev2 was released the same day as the 8800GT, and it is based on the G80 with more stream processors enabled. Rev3 will be released Dec 12th, and it is based on a G92 with more stream processors enabled compared to the GT.
Each SP cluster is 16 stream processors:
- 8800GTSv1 = G80 with 6/8 SP clusters
- 8800GTSv2 = G80 with 7/8 SP clusters
- 8800GTX = G80 with 8/8 SP clusters
- 8800GT = G92 with 7/8 SP clusters
- 8800GTSv3 = G92 with 8/8 SP clusters
So the one you already saw being sold with "112 shader processors" is the real rev2; the one listed here in the chart as rev2 is actually rev3.
If someone knows how to edit those charts please correct this.
Also, there are stock speeds declared by nvidia, and then there is a plethora of overclock speeds set by each manufacturer.
Taltamir (talk) 23:25, 9 December 2007 (UTC)
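A quick sketch of the cluster arithmetic laid out above (assuming 16 stream processors per cluster, as the comment states):
<syntaxhighlight lang="python">
# Stream processor counts implied by the cluster breakdown above
# (16 SPs per cluster; "6/8" means 6 of 8 clusters enabled).
SPS_PER_CLUSTER = 16

variants = {
    "8800GTSv1 (G80, 6/8)": 6,
    "8800GTSv2 (G80, 7/8)": 7,
    "8800GTX (G80, 8/8)": 8,
    "8800GT (G92, 7/8)": 7,
    "8800GTSv3 (G92, 8/8)": 8,
}

for name, clusters in variants.items():
    print(f"{name}: {clusters * SPS_PER_CLUSTER} stream processors")
# -> 96, 112, 128, 112, and 128 respectively
</syntaxhighlight>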
Removal of 8600 "controversy" section
I'm not so sure what is controversial about the 8500/8600 series. The 8600 GT thoroughly outperforms the 7600 GT and X1600, while the 8600 GTS does outperform cards such as the 7900 GT and X1950 Pro in a few games (i.e. Oblivion). Secondly, they have dropped in price dramatically since launch, making them excellent values, especially if you desire their top-of-generation video decoding hardware.--Swaaye 20:14, 6 November 2007 (UTC)
Do we REALLY need a US-centric tag?
We're dealing with graphics cards here; of course it's US-centric. They're owned by US companies who design them in the US, build them in the US, and reveal them in the US. Please, if you can somehow make computer parts more geographically appealing, be my guest. I'll give you until the end of the month before I remove the template. —Preceding unsigned comment added by 71.199.151.242 (talk) 01:28, 15 November 2007 (UTC)
- They are built in Taiwan, much more likely. Is there any piece of consumer electronics still made in the US? 219.79.72.53 (talk) 08:18, 21 November 2007 (UTC)
Those mindless violent battle games are certainly made in the US. Along with actual guns, drugs, sex toys, and budget deficits.220.244.75.163 (talk) 00:23, 18 October 2013 (UTC)
Adding PureVideo/2 to notes
[edit]The section says that 8500/8600 both have Purevideo2, does the 8300 GS/GT have it?219.79.72.53 (talk) 08:19, 21 November 2007 (UTC)
8800GTS rev2 doesn't add up
Unless I'm missing something here, the listed stats for the GTS rev2 don't make sense - they indicate a shader clock speed lower than the GT (same core) and the same number of shader processors, but still show a notably higher shader performance in GFLOPS. In theory, the card is supposedly more powerful, so the problem lies either with the number of shader pipes (possibly 128) or the shader clock speed (1500 or higher). According to http://www.tweaktown.com/articles/1234/2 their anonymous engineering sample lists 650/1625 MHz core/shader clocks, though those may not be reliable due to the nature of the sample. Also, http://www.thinkdigit.com/forum/showthread.php?p=672226 lists stats for the 8800GTS rev2 and some sources, indicating the same clock speeds as that engineering sample and 128 shader units. This makes sense, as producing a GPU core with 112 shader units doesn't make a lot of sense, considering the power-of-two grouping of shader groups. Taking these into account, as well as the 3 FLOPS per ALU value used by the other 8800 cards, I have updated the table accordingly to bring the G92 cards in line with what makes sense. —Preceding unsigned comment added by 154.5.186.216 (talk) 10:30, 2 December 2007 (UTC)
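To make the discrepancy concrete, under the 3-FLOPs-per-ALU convention the table uses, GFLOPS scales as shader units × shader clock × 3. Taking the disputed table figures versus the engineering-sample figures cited above (both taken here as assumptions, not confirmed specs):
<math>112 \times 1.5\ \text{GHz} \times 3 = 504\ \text{GFLOPS}, \qquad 128 \times 1.625\ \text{GHz} \times 3 = 624\ \text{GFLOPS}</math>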
Fair use rationale for Image:GeForce newlogo.png
Image:GeForce newlogo.png is being used on this article. I notice the image page specifies that the image is being used under fair use, but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.
Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Wikipedia:Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.
If there is other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images lacking such an explanation can be deleted one week after being tagged, as described on criteria for speedy deletion. If you have any questions please ask them at the Media copyright questions page. Thank you.
BetacommandBot (talk) 23:46, 22 December 2007 (UTC)
Speculation removed
[edit]"...rumoured on both Chiphell.com and vr-zone.com for the G100 chip, which will result in the 9800GTX, Nvidia's new high end offer to come:
- Codenamed G100
- 65nm process
- 256 shader processors
- 780MHz core clock
- 3200MHz memory clock
- 512-bit memory width
- 2048MB (256X8) GDDR5 chips
- GDDR5 @ 0.25-0.5ns
- Dual DVI-out
- Supports DX 10.1, VP3
- 15-25% lower TDP than 8800GTS
"
These are really just rumors based on highly uneducated guesses: wrong chip type (a 512-bit bus means 16 chips, not 8); 256 processors would drive the chip size beyond TSMC's limits; wrong TDP estimate (it calculates out pretty clearly); GDDR5 isn't expected to run at 3200 MHz; et cetera, et cetera. I removed that nonsense and put in an internal link to Comparison of NVIDIA Graphics Processing Units, where some data confirmed by sources has appeared by now. CP/M comm |Wikipedia Neutrality Project| 16:18, 2 January 2008 (UTC)
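For the bus-width point above: GDDR memory chips typically have 32-bit interfaces, so a 512-bit bus implies
<math>\frac{512\ \text{bits}}{32\ \text{bits/chip}} = 16\ \text{chips},</math>
which is why the rumored 256×8 chip configuration doesn't add up.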
8800GT compatible with PCI Express 1.0a
There are still no citations for the statements regarding the 8800GT being incompatible with PCI-E 1.0a. Nvidia says they are fully backward compatible (Answer ID 2120), and I know from personal experience that it works on my DFI NF4 PCI-E 1.0a motherboard.
PCI Express 2.0 Support
Question: Are PCI-Express 2.0 graphics cards / motherboards compatible with PCI-Express 1.1 and 1.0?
Answer: Yes, PCI Express 2.0 products are fully backwards compatible with existing PCI Express products and platforms. —Preceding unsigned comment added by 168.166.80.136 (talk) 20:16, 10 March 2008 (UTC)
168.166.80.136 (talk) 16:47, 11 March 2008 (UTC)
I seem to be having the exact symptoms listed under the incompatibility issues; however, my hardware doesn't match. I'm using a BFG GeForce 8800 GT 512MB PCIe 2.0 x16, and my motherboard is an MSI P7N SLI Platinum LGA 775 NVIDIA nForce 750i SLI ATX, which claims to be PCIe 2.0 x16. I had this problem when I first got the BFG, but when I populated the memory timings in the BIOS (rather than leaving them on AUTO), the problem went away. For some reason, it's not going away this time. I've already replaced the card with the manufacturer; it just acts like it's not even there. My old, significantly weaker ATI Radeon X850 XT will work fine without any problems, however. Any suggestions? --MarkoOhNo (talk) 09:37, 8 February 2010 (UTC)
Apparently there is an 8800GS
Or else why would Newegg have them in stock? [9] 24.6.46.92 (talk) 22:36, 5 April 2008 (UTC)
GTX8800 size issue?
I'm looking at a PC case whose instructions say "NVIDIA 8800GTX compatible." Is there something weird about the 8800GTX that would make it incompatible with certain PC cases? If so, this should be mentioned. ---Ransom (--208.25.0.2 (talk) 19:44, 7 August 2008 (UTC))
- At launch, they were larger (longer) than anything that came before. It's less special today, but back then many enthusiast cases required cutting metal to get the part to fit. Add it if you feel like it. —Preceding unsigned comment added by 198.36.86.87 (talk) 11:08, 7 October 2008 (UTC)
Sega Europa-R
It has been suggested that Wikipedia note the use of the GeForce 8 series in Sega's most recent arcade board, the one used in Sega Rally 3. It is called the Sega Europa-R. The bold-printed text is an advisory about this.
Check this note: Europa-R Specifications
- CPU: Intel Pentium Dual-Core
- RAM: 4 GB
- GPU: NVIDIA GeForce 8800
- Other: Compatible HDTV (High Definition), DVD Drive Support, Sega ALL.NET online support
- Protection: High Spec original security module. —Preceding unsigned comment added by 125.160.194.63 (talk) 23:33, 11 October 2008 (UTC)
8500/8600 section
This section needs many citations for its claims about the 8500 GT. I haven't seen any articles claiming that the 8500 is "on par" with the 6600 GT. There are so many things wrong with this section - comparing it with the high end 8800 cards? Obviously they won't perform as well; they're not high end cards. The "older cards" such as the 7900 GS mentioned cannot be compared with these mid-range cards either - the 7900 is a high end card (albeit of an older series). All in all, these many criticisms need sources, and the section needs to be rewritten to reflect a neutral POV. Delta (talk) 01:08, 28 November 2008 (UTC)
Apparently, the 8400GS has a version with 512 MB memory as well. --89.147.67.118 (talk) 12:17, 28 March 2009 (UTC)
Redirection from NV50
http://en.wikipedia.org/wiki/NV50 redirects to http://en.wikipedia.org/wiki/Geforce8. This is not really correct, because the GeForce 9 series also uses the NV50 chipset. I don't know what is best to do in this situation; a new page just for NV50 seems superfluous. —Preceding unsigned comment added by 89.245.198.184 (talk) 02:12, 8 April 2009 (UTC)
Acronyms
Need to define the suffixes GS, GT, GTX, etc. —Preceding unsigned comment added by Mwaisberg (talk • contribs) 08:41, 1 December 2009 (UTC)
Mac Pro compatibility
This article is completely silent on the issues with compatibility with Mac Pro computers, specifically the two models of the 8800 GT for Mac and their compatibility with the 32-bit and 64-bit EFI in 1st and 2nd generation Mac Pro computers (MacPro1,1 and MacPro3,1) respectively. Apple has an article on this ( http://support.apple.com/kb/HT2848 ) and NVIDIA does also ( http://www.nvidia.com/object/product_geforce_8800gt_for_mac_us.html , under the "1st Generation Mac Pro" link). Other information is hard to come by; even vendors of these cards can't seem to keep their inventory straight, shipping 630-9897 cards for machines in which only 630-9492 will work. GeForce 8-series cards are required for compatibility of Portal and other Source-engine games for the Mac Pro under Steam.
Some message boards talk of certain character sequences in serial numbers on these cards, which don't appear to be wholly accurate. 216.229.13.9 (talk) 20:44, 18 July 2010 (UTC)
Geometry Shaders
Isn't this the first nVidia card with geometry shaders? Why isn't the term "geometry shader" mentioned anywhere on the page? —Preceding unsigned comment added by 65.50.39.225 (talk) 04:02, 16 September 2010 (UTC)
Mostly Dead Links circa Sep 2011
When I started checking some of the links, I got 404 errors from most of the nvidia.com links - too many to mark each one. The original content might be moved to the legacy devices section, or perhaps it is stored in the Wayback Machine on archive.org? It is difficult to keep up with such vendors. There are probably a lot of folks who still own these video cards, so I can't advocate archiving the site. For example, I was looking for some information on how to configure Linux kernel options, GART window size and all. Nvidia considers the series obsolete, so this might be of historical interest, although the reference citations no longer exist. There are probably a lot of pages in a similar state; just opening the subject for conversation, what do we do with such data? Wiki should be the go-to place for cutting-edge technology, maybe also for information you can't find anywhere else? Hpfeil (talk) 19:49, 22 September 2011 (UTC)
8200?
[edit]No information whatsoever? — Preceding unsigned comment added by 207.81.82.232 (talk) 05:42, 5 October 2011 (UTC)
8800GT Initial Price
I made the edit changing the initial price of the 8800GT from $200 to $300. I specifically remember going to Micro Center right around the time it came out: they wanted $389 for a reference model (PNY), and on Newegg it was only $329. I asked the salesperson to price match it and he initially said no, but as I was walking through the parking lot the salesperson stopped me and said "We can match the price!", and to this day I will never forget that moment. --Greg Pace (talk) 19:48, 9 March 2016 (UTC)
External links modified
Hello fellow Wikipedians,
I have just added archive links to one external link on GeForce 8 series. Please take a moment to review my edit. You may add {{cbignore}} after the link to keep me from modifying it if I keep adding bad data, but formatting bugs should be reported instead. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether, but this should be used as a last resort. I made the following changes:
- Attempted to fix sourcing for http://support.apple.com/kb/TS2377
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—cyberbot IITalk to my owner:Online 12:22, 28 March 2016 (UTC)
External links modified
Hello fellow Wikipedians,
I have just modified 2 external links on GeForce 8 series. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
- Added archive https://web.archive.org/web/20070925073721/http://www.theinquirer.net/default.aspx?article=38884 to http://www.theinquirer.net/default.aspx?article=38884
- Added archive https://web.archive.org/web/20100105062655/http://support.apple.com/kb/TS2377 to http://support.apple.com/kb/TS2377
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—InternetArchiveBot (Report bug) 19:49, 20 July 2016 (UTC)
Screen refresh rates
The article does not mention screen refresh rates yet. I remember a GeForce from circa 2007, likely an 8400, having no fixed limit on screen refresh rate.
The forum post at forums.t***hardware.com/threads/can-gs-8400-go-1920x1080-144hz-120hz.2917022/ (not directly linked because its owners excluded themselves from the Wayback Machine, and are therefore untrustworthy) claims that it goes up to 144 Hz. Yoshi the crocodile (talk) 22:06, 1 April 2022 (UTC)
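As a rough sanity check of that forum claim (ignoring blanking overhead, which raises the real requirement), 1920 × 1080 at 144 Hz needs a pixel clock of at least
<math>1920 \times 1080 \times 144\ \text{Hz} \approx 298.6\ \text{MHz},</math>
which is beyond single-link DVI's 165 MHz limit but within dual-link DVI's 330 MHz, so the claim is at least physically plausible on a card with a dual-link output.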