<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; fx5800</title>
	<atom:link href="http://www.vrworld.com/tag/fx5800/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:52:59 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>GeForce GTX285 on sale, our specs confirmed</title>
		<link>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/</link>
		<comments>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/#comments</comments>
		<pubDate>Fri, 02 Jan 2009 11:40:54 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Gigabyte]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt206 specs]]></category>
		<category><![CDATA[gtx285]]></category>
		<category><![CDATA[Hong Kong]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[quadro fx4800]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=891</guid>
		<description><![CDATA[<p>For the past couple of weeks, I&#8217;ve been closely following what&#8217;s going on with the 55nm refresh from Nvidia. GT200b (GT200-100-B2) series chips began their ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/">GeForce GTX285 on sale, our specs confirmed</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>For the past couple of weeks, I&#8217;ve been closely following what&#8217;s going on with the 55nm refresh from Nvidia. GT200b (GT200-100-B2) series chips began their life in Quadro CX and FX4800/5800 cards, and then started selling as the GeForce GTX260 55nm.</p>
<p>On January 8, 2009, Nvidia will officially introduce the GeForce GTX285 1GB and GTX295 1.8 GB cards. Or that was the theory. As usually happens, manufacturers &#8220;accidentally&#8221; started selling early, and this time the &#8220;honor&#8221; of going on sale first goes to GigaByte.</p>
<p>Thanks to HKEPC, we learned that <a href="http://www.hkepc.com/2178" target="_blank">two Hong Kong shops sell the GTX285 by Gigabyte</a>. This means GigaByte will be remembered as the first company to offer the GTX285 for sale (first blood for the GTX260 55nm went to EVGA). Prices range between $410 and $440, but you can expect them to drop further &#8211; as usual, these boards sell with at least a $30-50 per-store margin for being first.</p>
<p>GPU-wise, specifications are identical to the Quadro FX 5800 &#8211; the GPU is clocked at 648 MHz, while the shaders work at 1.48 GHz. The GDDR3 memory is clocked at 1.24 GHz, meaning you have 158,976 MB/s, or 155.25 GB/s, to play with. Power consumption is set at 183W, and this was the reason for using a 6+6-pin PEG configuration instead of the usual 8+6.<br />
While this may be good news for owners of older PSUs without an 8-pin PEG connector, overclockers will turn their heads to enthusiast manufacturers such as BFG, EVGA, PALIT and others for 8+6 versions of the card. A 6+6+PCIe slot configuration can only provide 236W of power, meaning you have 53W left for overclocking.</p>
<p>In the days of the original GTX280, TDP was set at 236W and the 8+6+PCIe slot configuration could provide 300W of juice &#8211; a 64W margin. Still, I may be wrong on this one, since Shamino recently broke the 3DMark world record using a single GeForce GTX 285 card with a 1.1 GHz core and 2 GHz shader clock (you think Peter did that with a 65nm GPU? Think again <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_wink.gif" alt=";-)" class="wp-smiley" /> )</p>
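As a quick sanity check of the bandwidth figure quoted above, here is a minimal sketch; it assumes a 512-bit memory bus for the GTX285, which the article itself does not state:

```python
# Sanity-check of the quoted GTX285 memory-bandwidth figures.
# Assumption (not stated in the article): a 512-bit memory bus.
mem_clock_mhz = 1242      # the "1.24 GHz" GDDR3 base clock
transfers_per_clock = 2   # GDDR3 is double data rate
bus_bytes = 512 // 8      # 512-bit bus = 64 bytes per transfer

mb_per_s = mem_clock_mhz * transfers_per_clock * bus_bytes
print(mb_per_s)                    # 158976 MB/s, matching the article
print(round(mb_per_s / 1024, 2))   # 155.25 GB/s with a binary divisor
```

The two figures in the article agree with each other only under this 512-bit assumption and a 1242 MHz base clock.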
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/">GeForce GTX285 on sale, our specs confirmed</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Nvidia to launch 55nm GPUs on Tuesday, December 16th?</title>
		<link>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/</link>
		<comments>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/#comments</comments>
		<pubDate>Thu, 11 Dec 2008 01:59:39 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[55nm gt200]]></category>
		<category><![CDATA[c1060]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[graphics naming convention]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX260-216 1.79 GB]]></category>
		<category><![CDATA[GTX260-216 896MB]]></category>
		<category><![CDATA[GTX280-240 1.0 GB]]></category>
		<category><![CDATA[GTX280-240 2.0GB]]></category>
		<category><![CDATA[gtx290]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia marketing mess]]></category>
		<category><![CDATA[Pentium 4 Extreme Edition 955]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[quadro fx4800]]></category>
		<category><![CDATA[Radeon X1800XTX CrossFire Edition]]></category>
		<category><![CDATA[Tesla]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=783</guid>
		<description><![CDATA[<p>Nvidia is preparing a launch of 55nm parts for 2008 - nope, they're not going to wait for CES 2009. At least, that's what I heard from a couple of sources...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/">Nvidia to launch 55nm GPUs on Tuesday, December 16th?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>We heard that nVIDIA is preparing something big for next week&#8230; in the form of properly timed events taking place around the globe. Well, there are no confirmations, but there will be the usual suspects: a gathering of press, partners, nVIDIA executives&#8230; and so on. If those rumors are true, the press would get a weekend of testing, and the products would launch either on Tuesday, December 16th, or Thursday, December 18th. Personally, I feel this belongs in the &#8220;no way&#8221; category, but I cannot dismiss the rumor while there is something whispering from the rumor mill.</p>
<p>Everybody we asked remained coy about the timelines. Nobody wanted to confirm anything, but one thing is certain &#8211; according to my sources, nVIDIA is not going to wait for CES 2009 to introduce its 55nm parts. Are these rumors true? Well, like any rumor, take it with a fairly large grain of salt. But I am just the messenger here, don&#8217;t shoot <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_smile.gif" alt=":-)" class="wp-smiley" /></p>
<p>The 55nm die-shrink, GT206, is already shipping in volume in the form of Quadro CX, FX 4800 and FX 5800 boards. The same story applies to Tesla cards &#8211; we saw some papers from system integrators mentioning reduced power consumption for the C1060.</p>
<p>And with the number of leaked pictures floating around showing a GeForce GTX260 with 896 MB of memory, plus talk of a GTX260 with 1.79GB and GTX280 cards with 2GB of GDDR3 memory&#8230; we know at least two partners are seriously contemplating releasing the 1.79GB and 2GB cards, so the battle among nVIDIA partners is definitely heating up.</p>
<p>But will we see a surprise (and most certainly paper) launch as early as this week? Only time can tell. Back in 2005, ATI had a Christmas launch of the Radeon X1800XTX CrossFire Edition, aligned with the launch of the Pentium 4 Extreme Edition 955. Both turned out to be power hogs and performance duds: the X1800XTX was replaced by the more powerful and elegant X1900XTX, while the 955 was replaced by the worst-overheating CPU of all time, the 965. The stigma around the 965 was so strong that Intel decided to relaunch the number with the Core i7 Extreme 965. We spoke with some Intel folk, and they told me that &#8220;it was time to do 965 right&#8221;.</p>
<p>Will nVIDIA have more luck with a Christmas launch of 55nm parts, even if it&#8217;s only for show? Only time will tell. For now, there are some promising facts, like the single 6-pin power connector on the Quadro CX and FX 4800 (effectively a GTX 260).</p>
<p>Is this the 55nm line-up?</p>
<p>GTX260-216 896MB</p>
<p>GTX260-216 1.79 GB</p>
<p>GTX280-240 1.0 GB</p>
<p>GTX280-240 2.0GB</p>
<p>And here lies the question: why did the company not rename the 55nm parts to GTX270 and GTX290 &#8211; the clocks are different, the power is different, only the cooling is the same? Well, brace for impact. According to a large nVIDIA partner, it seems the company ran out of time to let its AIB partners print new boxes, so the decision was made to simply reuse the older names and still have new stuff to ship. I cannot validate the source, since it looks, well&#8230; just incredible &#8211; but I have to leave this option open.</p>
<p>All in all, 2008 turned into a big mess as far as Nvidia is concerned. The company delivered the world&#8217;s first GPU with hardware FP64 double-precision and sacrificed a hefty part of the die for it (bear in mind that a DP unit takes as much space as three regular ones&#8230; and there is one unit per cluster of eight), but the naming confusion is something this company should not allow.</p>
<p>If the company started anew with the GTX 200 series, and had prepared a renaming of the old parts &#8211; G80-based (GeForce 9300, 9400, 9600) and GT130 (9800GT, 9800GTX+) &#8211; why oh why didn&#8217;t it go with GTX 240 (for GTX260-192), GTX 260 (GTX260-216), GTX 270 (55nm part), GTX 280 (GTX 280), GTX 290 (55nm part) and end up with GX2 295 for the dual part? Logic is so easy to find in the car industry, but impossible to find in the IT industry. AMD, Intel, ATI and Nvidia alike did a fine job of crapping on their engineers&#8217; brilliant work.</p>
<p>What&#8217;s wrong with &#8220;AMD Phenom X4 2.5 GHz&#8221;, since that&#8217;s what most people are going to call the product anyway? Core i7 965? Call it The Overclocking Monster Core i7 3.2 GHz and we&#8217;re clear. Radeon 4870? Well, that&#8217;s a good continuation from the 3870 &#8211; let&#8217;s hope they won&#8217;t mess it up.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/">Nvidia to launch 55nm GPUs on Tuesday, December 16th?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Zotac leaks pictures of 55nm GTX260</title>
		<link>http://www.vrworld.com/2008/12/05/zotac-leaks-pictures-of-55nm-gtx260-with-15-gb-of-memory/</link>
		<comments>http://www.vrworld.com/2008/12/05/zotac-leaks-pictures-of-55nm-gtx260-with-15-gb-of-memory/#comments</comments>
		<pubDate>Fri, 05 Dec 2008 11:02:33 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[15GB]]></category>
		<category><![CDATA[3 gb]]></category>
		<category><![CDATA[55nm gpu]]></category>
		<category><![CDATA[896 mb]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GPU power consumption]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[gtx260 overclocking]]></category>
		<category><![CDATA[gtx260-216]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[quadro fx4800]]></category>
		<category><![CDATA[Zotac]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=709</guid>
		<description><![CDATA[<p>First leaked news about GeForce cards with the upcoming 55nm GPU. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/05/zotac-leaks-pictures-of-55nm-gtx260-with-15-gb-of-memory/">Zotac leaks pictures of 55nm GTX260</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>And so it happens&#8230; after several leaks <a href="http://theovalich.wordpress.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/" target="_blank">about the deployment of 55nm GPUs as Quadro CX / FX 4800 / 5800</a>, we finally received some solid 55nm GeForce news from the Far East. Our Chinese colleagues at <a href="http://www.expreview.com/news/hard/2008-12-05/1228468866d10731.html" target="_blank">Expreview managed to get their hands on a Zotac GTX 260-216 based on the P654 PCB design</a>.</p>
<div id="attachment_710" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-710" title="zotac_55nmgtx260216" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/zotac_55nmgtx260216.jpg" alt="55nm chip on a GeForce card" width="500" height="344" /><p class="wp-caption-text">55nm chip on a GeForce card</p></div>
<p style="text-align:left;">This card features Volterra multiphase power regulation (<a href="http://theovalich.wordpress.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/" target="_blank">no more Nvidia squealing, yes!</a>), 14 memory chips (instead of the standard seven) and a 55nm GT200-103-B2 chip. Fourteen memory chips leave room for cards with 1.5 GB of GDDR3 memory, and if dual-bank chips are used, the GTX260 can support 3GB of memory on a single card.</p>
<p style="text-align:left;">Does this mean GTX295 will feature 3GB of GDDR3 memory? Only time will tell&#8230;</p>
<p style="text-align:left;">The Zotac board comes with standard GTX260-216 clocks, but features two 6-pin PEG connectors. Since the Quadro FX 4800 works with just one, this board may be an overclocker&#8217;s dream. The second PEG connector provides an additional 75W, so the board can consume 225W instead of the 150W maximum on the Quadro CX/FX4800.</p>
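The power-budget arithmetic above follows directly from the PCI Express limits (75 W from the slot, 75 W per 6-pin PEG connector), and can be sketched as:

```python
# Board power budgets behind the paragraph above, using the PCIe
# limits: 75 W from the slot, 75 W per 6-pin PEG connector.
SLOT_W = 75
SIX_PIN_W = 75

fx4800_budget = SLOT_W + 1 * SIX_PIN_W   # one 6-pin connector
zotac_budget  = SLOT_W + 2 * SIX_PIN_W   # two 6-pin connectors
print(fx4800_budget, zotac_budget)       # 150 225
```

These match the 150W and 225W figures quoted in the article.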
<p style="text-align:left;">When this card hits the market, you can expect to overclock it to at least 650 MHz for the GPU and 1500 MHz for the shaders (the default clocks on the FX5800). It will be interesting to see how far enthusiasts can push the 55nm GPU, since this board should work wonders when cooled with water or something even more exotic&#8230;</p>
<p style="text-align:left;">As it stands right now, the only card with a 55nm GPU featuring all 240 shader units is the Quadro FX 5800. It is possible that current yields are simply that bad&#8230; until we see a GTX &#8220;270&#8221; or a GTX280 based on the P656 PCB, we can assume there aren&#8217;t many 55nm GPUs with all 240 shaders available for production.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/05/zotac-leaks-pictures-of-55nm-gtx260-with-15-gb-of-memory/">Zotac leaks pictures of 55nm GTX260</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/05/zotac-leaks-pictures-of-55nm-gtx260-with-15-gb-of-memory/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</title>
		<link>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/</link>
		<comments>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/#comments</comments>
		<pubDate>Thu, 04 Dec 2008 17:00:49 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[3d professor]]></category>
		<category><![CDATA[40nm]]></category>
		<category><![CDATA[55nm gpu]]></category>
		<category><![CDATA[6-pin PCIe]]></category>
		<category><![CDATA[6-pin PEG]]></category>
		<category><![CDATA[fx4800]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[gpu-z]]></category>
		<category><![CDATA[gt200-b]]></category>
		<category><![CDATA[gt200-c]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia 55nm]]></category>
		<category><![CDATA[power consumption]]></category>
		<category><![CDATA[Quadro]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=695</guid>
		<description><![CDATA[<p>  A while ago, I wrote a piece stating that Nvidia decided to launch 55nm GT206 as Quadros first. The reason for that is the ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/">Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[
<p>A while ago, I wrote a piece stating that <a href="http://theovalich.wordpress.com/2008/11/11/55nm-gt206-gpu-powers-both-gtx290-and-quadro-fx-5800/" target="_blank">Nvidia decided to launch the 55nm GT206 as Quadros first</a>. The reason is <a href="http://www.theinquirer.net/gb/inquirer/news/2008/12/03/nvidia-55nm-parts-update" target="_blank">the number of problems Nvidia had in the die-shrink process</a>, so the company had to roll out GT206 the same way as its old NV30 (the Quadro FX 2000 shipped before the GeForce FX5800), or the way AMD likes to launch its CPUs &#8211; commercial parts (Opteron) first, followed by consumer ones (Phenom, Athlon, Turnmeon).</p>
<p>Thus, GT206 (G200 B series &#8211; the A series marked 65nm parts, the B series denotes 55nm parts, and the G200 C series should mark the 40nm GPUs) debuted as Quadro CX, FX 4800 and FX 5800. Quadro CX and FX 4800 are essentially identical parts: a 55nm GPU with 192 shaders (48 shaders and 6 double-precision units are disabled for yield purposes) and 1.5 GB of GDDR3 memory, while the FX 5800 combines the 55nm GPU with 4GB of GDDR3 memory.</p>
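The shader counts above can be checked against GT200's organization as described elsewhere in this feed (clusters of eight shaders, one DP unit per cluster); a quick sketch:

```python
# GT200 shader arithmetic: clusters of 8 SPs, one FP64 (DP) unit each.
SP_PER_CLUSTER = 8
TOTAL_CLUSTERS = 30            # 30 x 8 = 240 shaders on the full die

disabled_clusters = 6          # 6 clusters fused off for yield
enabled_sp  = (TOTAL_CLUSTERS - disabled_clusters) * SP_PER_CLUSTER
disabled_sp = disabled_clusters * SP_PER_CLUSTER
print(enabled_sp, disabled_sp)  # 192 shaders enabled, 48 (and 6 DP units) off
```

Disabling six clusters removes exactly 48 shaders and 6 DP units, consistent with the figures quoted for the Quadro CX / FX 4800.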
<div id="attachment_696" style="width: 508px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" rel="lightbox-0"><img class="size-full wp-image-696" title="nvidia_55nmvs65nmgpu" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" alt="55nm vs. 65nm parts, with power consumption and all..." width="498" height="203" /></a><p class="wp-caption-text">55nm vs. 65nm parts, with power consumption and all...</p></div>
<p>Getting back on track with this story, the honor of <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P1.htm" target="_blank">the first review of the GT206 GPU belongs to none other than 3D Professor</a>, who got his hands on the Quadro FX 4800, a part that was silently rolled out yesterday. In his review, the declared maximum power consumption was only 146 Watts. What makes matters more interesting is that this is the first high-end graphics card in three years to feature just one 6-pin power connector. It seems the GPU manufacturers have finally started to truly work on reducing power consumption while offering more and more performance.</p>

<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg' rel="lightbox[gallery-0]"><img width="498" height="203" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" class="attachment-vw_medium" alt="55nm vs. 65nm parts, with power consumption and all..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_01.jpg' rel="lightbox[gallery-0]"><img width="500" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_01-500x420.jpg" class="attachment-vw_medium" alt="The test system over at 3D Professor - Core i7 meets Quadro FX 4800" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_02.jpg' rel="lightbox[gallery-0]"><img width="500" height="213" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_02.jpg" class="attachment-vw_medium" alt="High-end GPU is there, paired with only one 6-pin power connector..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_03.jpg' rel="lightbox[gallery-0]"><img width="390" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_03-390x420.jpg" class="attachment-vw_medium" alt="GPU-Z 0.2.8 was not able to detect the GPU properly, same case with 0.2.9." /></a>

<p>We managed to get a screenshot from GPU-Z, but as you can see for yourself, GPU-Z does not correctly recognize the Quadro FX 4800 and its 55nm GPU. The only numbers that correlate with Nvidia&#8217;s official product page are the GPU clocks. What makes the situation interesting is that Nvidia declares memory bandwidth at 76.8 GB/s, or 700 MHz DDR. In fact, the 1.5 GB of GDDR3 memory comes clocked at 800 MHz DDR (1.6 GT/s) and has 87.5 GB/s to play with.</p>
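The bandwidth figures being compared here all come from the standard formula (bandwidth = memory clock x 2 transfers per clock x bus width in bytes); a minimal sketch, assuming the Quadro FX 4800's commonly listed 384-bit memory bus, which the article does not state:

```python
# GDDR3 bandwidth in decimal GB/s: clock (MHz) x 2 (DDR) x bytes per clock.
# Assumption: a 384-bit memory bus, as commonly listed for the Quadro FX 4800.
def gddr3_bandwidth_gbps(clock_mhz, bus_bits=384):
    return clock_mhz * 2 * (bus_bits // 8) / 1000

print(gddr3_bandwidth_gbps(800))  # 76.8 -- Nvidia's declared figure
print(gddr3_bandwidth_gbps(700))  # 67.2
```

Under this assumption, 76.8 GB/s corresponds to 800 MHz DDR rather than 700 MHz, and 87.5 GB/s would imply a clock of roughly 912 MHz; readers can plug in whichever clock GPU-Z reports.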
<p>Well, there is more at 3D Professor&#8217;s page &#8211; enjoy the <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P1.htm" target="_blank">world&#8217;s first review of the Quadro FX 4800</a>, and the first review of a 55nm GPU from Nvidia. Bear in mind this is a professional review of a professional card for professionals &#8211; which means no 3DMark score :-(. But the <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P11.htm" target="_blank">PCMark score is here</a>.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/">Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
	</channel>
</rss>
