<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; GTX260</title>
	<atom:link href="http://www.vrworld.com/tag/gtx260/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:54:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>Nvidia to launch 55nm GPUs on Tuesday, December 16th?</title>
		<link>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/</link>
		<comments>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/#comments</comments>
		<pubDate>Thu, 11 Dec 2008 01:59:39 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[55nm gt200]]></category>
		<category><![CDATA[c1060]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[graphics naming convention]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX260-216 1.79 GB]]></category>
		<category><![CDATA[GTX260-216 896MB]]></category>
		<category><![CDATA[GTX280-240 1.0 GB]]></category>
		<category><![CDATA[GTX280-240 2.0GB]]></category>
		<category><![CDATA[gtx290]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia marketing mess]]></category>
		<category><![CDATA[Pentium 4 Extreme Edition 955]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[quadro fx4800]]></category>
		<category><![CDATA[Radeon X1800XTX CrossFire Edition]]></category>
		<category><![CDATA[Tesla]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=783</guid>
		<description><![CDATA[<p>Nvidia is preparing a launch of 55nm parts for 2008 - nope, they're not going to wait for CES 2009. At least, that's what I heard from a couple of sources...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/">Nvidia to launch 55nm GPUs on Tuesday, December 16th?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>We heard that nVIDIA is preparing something big for next week&#8230; in the form of properly timed events taking place around the globe. There are no confirmations yet, but the usual suspects will be there: a gathering of press, partners, nVIDIA executives&#8230; and so on. If those rumors are true, the press would get a weekend of testing, and the products would launch either on Tuesday, December 16th, or Thursday, December 18th. Personally, I feel this belongs in the &#8220;no way&#8221; category, but I cannot dismiss the rumor while something keeps whispering from the rumor mill.</p>
<p>Everybody we asked remained coy about the timelines. Nobody wanted to confirm anything, but one thing is certain &#8211; according to my sources, nVIDIA is not going to wait for CES 2009 to introduce its 55nm parts. Are these rumors true? Well, like any rumor, take it with a fairly large grain of salt. But I am just the messenger here, don&#8217;t shoot <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_smile.gif" alt=":-)" class="wp-smiley" /></p>
<p>The 55nm die-shrink, GT206, is already shipping in volume in the form of Quadro CX, FX 4800 and FX 5800 boards. The same story applies to Tesla cards &#8211; we saw some papers from system integrators mentioning reduced power consumption for the C1060.</p>
<p>And with the number of leaked pictures floating around &#8211; a GeForce GTX260 with 896 MB of memory, talk of a GTX260 with 1.79GB, GTX280 cards with 2GB of GDDR3 memory&#8230; &#8211; we know at least two partners are seriously contemplating releasing the 1.79GB and 2GB cards, so the battle among nVIDIA partners is definitely heating up.</p>
<p>But will we see a surprise (and most certainly paper) launch this week already? Only time will tell. Back in 2005, ATI had a Christmas launch of the Radeon X1800XTX CrossFire Edition, aligned with the launch of the Pentium 4 Extreme Edition 955. Both turned out to be power hogs and performance duds: the Radeon was replaced by the more powerful and elegant X1900XTX, while the 955 was replaced with the worst-overheating CPU of all time, the 965. The stigma around the 965 was so strong that Intel decided to relaunch the number with the Core i7 Extreme 965. We spoke with some Intel folk, and they told me that &#8220;it was time to do 965 right&#8221;.</p>
<p>Will nVIDIA have more luck with a Christmas launch of 55nm parts, even if it&#8217;s only for show? Only time will tell. For now, there are some promising facts, like the single 6-pin power connector on the Quadro CX and FX 4800 (effectively a GTX 260).</p>
<p>Is this the 55nm line-up?</p>
<ul>
<li>GTX260-216 896MB</li>
<li>GTX260-216 1.79 GB</li>
<li>GTX280-240 1.0 GB</li>
<li>GTX280-240 2.0GB</li>
</ul>
<p>And here lies the question: why did the company not rename the 55nm parts to GTX270 and GTX290? The clocks are different, the power is different, only the cooling is the same. Well, brace for impact. According to a large nVIDIA partner, it seems the company ran out of time to let its AIB partners print new boxes, so the decision was made to simply rename the older parts and still have new stuff to ship. I cannot validate the source, since it looks, well&#8230; just incredible &#8211; but I have to leave the option open.</p>
<p>All in all, 2008 turned into a big mess as far as Nvidia is concerned. The company delivered the world&#8217;s first GPU with hardware FP64 double-precision and sacrificed a hefty part of the die for it (bear in mind that a DP unit takes as much space as three regular ones&#8230; and there is one such unit per cluster of eight), but the naming confusion is something this company should not have allowed.</p>
<p>If the company started anew with the GTX 200 series, and had prepared a renaming of the old parts &#8211; G80 (GeForce 9300, 9400, 9600) and GT130 (9800GT, 9800GTX+) &#8211; why oh why oh why didn&#8217;t they go with GTX 240 (for GTX260-192), GTX 260 (GTX260-216), GTX 270 (55nm part), GTX 280 (GTX 280), GTX 290 (55nm part), and end up with GX2 295 for the dual part? Logic is so easy to find in the car industry, but impossible to find in the IT industry. AMD, Intel, ATI and Nvidia alike did a fine job of crapping on their engineers&#8217; brilliant work.</p>
<p>What&#8217;s wrong with &#8220;AMD Phenom X4 2.5 GHz&#8221;, since that&#8217;s how most people are going to refer to the product anyway? Core i7 965? Call it The Overclocking Monster Core i7 3.2 GHz and we&#8217;re clear. Radeon 4870? Well, that&#8217;s a good continuation from the 3870 &#8211; let&#8217;s hope they won&#8217;t mess it up.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/">Nvidia to launch 55nm GPUs on Tuesday, December 16th?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/11/nvidia-to-launch-55nm-gpus-on-tuesday-december-16th/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</title>
		<link>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/</link>
		<comments>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/#comments</comments>
		<pubDate>Thu, 04 Dec 2008 17:00:49 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[3d professor]]></category>
		<category><![CDATA[40nm]]></category>
		<category><![CDATA[55nm gpu]]></category>
		<category><![CDATA[6-pin PCIe]]></category>
		<category><![CDATA[6-pin PEG]]></category>
		<category><![CDATA[fx4800]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[gpu-z]]></category>
		<category><![CDATA[gt200-b]]></category>
		<category><![CDATA[gt200-c]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia 55nm]]></category>
		<category><![CDATA[power consumption]]></category>
		<category><![CDATA[Quadro]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=695</guid>
		<description><![CDATA[<p>  A while ago, I wrote a piece stating that Nvidia decided to launch 55nm GT206 as Quadros first. The reason for that is the ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/">Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[
<p>A while ago, I wrote a piece stating that <a href="http://theovalich.wordpress.com/2008/11/11/55nm-gt206-gpu-powers-both-gtx290-and-quadro-fx-5800/" target="_blank">Nvidia decided to launch 55nm GT206 as Quadros first</a>. The reason for that is <a href="http://www.theinquirer.net/gb/inquirer/news/2008/12/03/nvidia-55nm-parts-update" target="_blank">the number of problems Nvidia had in the die-shrink process</a>, so the company had to roll out GT206 the same way as its old NV30 (Quadro FX 2000 shipped before the GeForce FX5800), or the way AMD likes to launch its CPUs &#8211; commercial parts (Opteron) first, followed by consumer ones (Phenom, Athlon, Turnmeon).</p>
<p>Thus, GT206 (G200 B series &#8211; the A series marked 65nm parts, the B series denotes 55nm parts, and the G200 C series should mark the 40nm GPUs) debuted as the Quadro CX, FX 4800 and FX 5800. The Quadro CX and FX 4800 are essentially identical parts: a 55nm GPU with 192 shaders (48 shaders and six double-precision units are disabled for yield purposes) and 1.5 GB of GDDR3 memory, while the FX 5800 features a combo of a 55nm GPU and 4GB of GDDR3 memory.</p>
<div id="attachment_696" style="width: 508px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" rel="lightbox-0"><img class="size-full wp-image-696" title="nvidia_55nmvs65nmgpu" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" alt="55nm vs. 65nm parts, with power consumption and all..." width="498" height="203" /></a><p class="wp-caption-text">55nm vs. 65nm parts, with power consumption and all...</p></div>
<p>Getting back on track with this story, the honor of publishing <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P1.htm" target="_blank">the first review of a GT206 GPU belongs to none other than 3D Professor</a>. 3D Professor got his hands on the Quadro FX 4800, a part that was silently rolled out yesterday. In his review, the declared maximum power consumption was only 146 Watts. What makes matters more interesting is that this is the first high-end graphics card in three years to feature just one 6-pin power connector. It seems the GPU manufacturers have finally started to truly work on reducing power consumption while offering more and more performance.</p>
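The single 6-pin connector is consistent with that 146 W figure: per the PCI Express specification, the x16 slot delivers up to 75 W and each 6-pin PEG connector adds up to 75 W. A minimal sketch of the budget arithmetic (the function name is our own, for illustration):

```python
# Power budget available to a PCIe graphics card, per the PCI Express spec:
# the x16 slot supplies up to 75 W, each 6-pin PEG connector up to 75 W.
SLOT_W = 75
SIX_PIN_W = 75

def board_power_budget(six_pin_connectors: int) -> int:
    """Maximum board power (watts) without exceeding the spec."""
    return SLOT_W + six_pin_connectors * SIX_PIN_W

print(board_power_budget(1))  # 150 W - comfortably above the declared 146 W
```

With two 6-pin connectors the budget would rise to 225 W, which is why earlier high-end boards needed the second plug.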

<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg' rel="lightbox[gallery-0]"><img width="498" height="203" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_55nmvs65nmgpu.jpg" class="attachment-vw_medium" alt="55nm vs. 65nm parts, with power consumption and all..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_01.jpg' rel="lightbox[gallery-0]"><img width="500" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_01-500x420.jpg" class="attachment-vw_medium" alt="The test system over at 3D Professor - Core i7 meets Quadro FX 4800" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_02.jpg' rel="lightbox[gallery-0]"><img width="500" height="213" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_02.jpg" class="attachment-vw_medium" alt="High-end GPU is there, paired with only one 6-pin power connector..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_03.jpg' rel="lightbox[gallery-0]"><img width="390" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_quadro4800_3dprof_03-390x420.jpg" class="attachment-vw_medium" alt="GPU-Z 0.2.8 was not able to detect the GPU properly, same case with 0.2.9." /></a>

<p>We managed to get a screenshot from GPU-Z, but as you can see for yourself, GPU-Z does not correctly recognize the Quadro FX 4800 and its 55nm GPU. The only numbers that correlate with Nvidia&#8217;s official product page are the GPU clocks. What makes the situation interesting is that Nvidia declares memory bandwidth at 76.8 GB/s, or 700 MHz DDR. In fact, the 1.5 GB of GDDR3 memory comes clocked at 800 MHz DDR (1.6 GT/s) and has 87.5 GB/s to play with.</p>
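For readers who want to check these numbers themselves, peak GDDR3 bandwidth is clock &times; 2 (DDR transfers twice per clock) &times; bus width in bytes. A minimal sketch, assuming the FX 4800&#8217;s commonly cited 384-bit memory bus (the function name is our own):

```python
def gddr3_bandwidth_gb_s(clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: DDR doubles the transfer rate; 8 bits per byte."""
    return clock_mhz * 2 * bus_width_bits / 8 / 1000  # MB/s -> GB/s

# Assuming a 384-bit bus on the Quadro FX 4800:
print(gddr3_bandwidth_gb_s(800, 384))  # 800 MHz DDR (1.6 GT/s)
print(gddr3_bandwidth_gb_s(700, 384))  # 700 MHz DDR
```

Plugging in the two clocks quoted above lets you see which one actually matches the official 76.8 GB/s figure.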
<p>Well, there is more at 3D Professor&#8217;s page &#8211; enjoy the <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P1.htm" target="_blank">world&#8217;s first review of the Quadro FX 4800</a>, which is also the first review of a 55nm GPU from Nvidia. Bear in mind this is a professional review of a professional card for professionals &#8211; which means no 3DMark score :-(. But a <a href="http://www.3dprofessor.org/Reviews%20Folder%20Pages/FX4800/FX4800P11.htm" target="_blank">PCMark score is here</a>.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/">Nvidia 55nm GT206 reviewed, dramatic reduction in power consumption</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/04/nvidia-55nm-gt206-reviewed-dramatic-reduction-in-power-consumption/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>UPDATED: Nvidia&#8217;s &#8220;deadly&#8221; flaw and how to fix it &#8211; no more squealing!</title>
		<link>http://www.vrworld.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/</link>
		<comments>http://www.vrworld.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/#comments</comments>
		<pubDate>Mon, 24 Nov 2008 12:00:19 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[8600gts]]></category>
		<category><![CDATA[8800GT]]></category>
		<category><![CDATA[8800gts]]></category>
		<category><![CDATA[9800gx2]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[ATI]]></category>
		<category><![CDATA[EVGA]]></category>
		<category><![CDATA[Gainward]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Palit]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Sapphire]]></category>
		<category><![CDATA[squealing]]></category>
		<category><![CDATA[x1800xt]]></category>
		<category><![CDATA[x850xt]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=566</guid>
		<description><![CDATA[<p>It is no secret that I am a huge fan of the Folding@Home project, or that I love to play computer games (when I find time :-(. ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/">UPDATED: Nvidia&#8217;s &#8220;deadly&#8221; flaw and how to fix it &#8211; no more squealing!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>It is no secret that I am a huge fan of the Folding@Home project, or that I love to play computer games (when I find the time :-(. Both of these activities put high amounts of strain on the components inside a computer, and any weakness in product design is easily discovered.</p>
<p>This tale speaks of a company that makes great chips, but also has a serious design flaw: PCB design. For as long as the story of &#8220;Built by Nvidia&#8221; components has been told, there have been <a href="http://www.google.com/search?hl=en&amp;q=nvidia+squealing&amp;btnG=Google+Search&amp;aq=f&amp;oq=" target="_blank">isolated cases of &#8220;squealing&#8221;</a>. This squealing is caused by the vibration of copper coils, and is not present on products designed by people who pay attention to these things. Read: if your card has a Digital Voltage Regulation Module (DVRM, as Iwill originally called it &#8211; Digital PWM is more popular these days) or all solid-state caps and shielded chokes, no sound should be produced. But if your part has coils or non-shielded capacitors/chokes, you could be &#8220;enjoying&#8221; the squealing sounds of electronics.</p>
<p>To make matters clear, certain products from BOTH ATI and Nvidia can squeal under load. ATI moved to clear up the issue; Nvidia didn&#8217;t. Unfortunately, I wasn&#8217;t able to record the squealing with any of my microphones (the upcoming test lab will feature ultra-sensitive microphone equipment), but in a silent computer with three Noctua fans, any non-standard behavior is noticeable. This high-pitched noise is often drowned out by the sound of fans, but if you have a silent rig, it gets really &#8211; I mean REALLY &#8211; annoying.</p>
<p>The squealing only appears when the GPU is cranked all the way up &#8211; in Folding@Home, Far Cry 2, Crysis: Warhead. The same cards that squealed like pigs in Crysis didn&#8217;t do so in Unreal Tournament 3, Fallout 3 or Race Driver: GRID.</p>
<p>After experiencing squealing with my reference Nvidia GTX280 card over the past month or so, I&#8217;ve thoroughly checked the following products:</p>
<ul>
<li>ATI Radeon X850XT</li>
<li>ATI Radeon X1800XT CrossFire Edition</li>
<li>ATI Radeon 2900XT 512MB</li>
<li>ATI FireGL V8600 1024MB (2900XT)</li>
<li>ATI Radeon 3850 256MB</li>
<li>ASUS EN9800GX2 1024MB TOP</li>
<li>ASUS EN9800GTX 512MB TOP</li>
<li>EVGA GeForce GTX260 Core 216 896MB x2</li>
<li>EVGA GeForce GTX280 SuperClocked 1024MB</li>
<li>EVGA GeForce GTX280 SSC 1024MB x2</li>
<li>Gainward GeForce 8800GTS 640MB</li>
<li>Gainward GeForce 8800GT 512MB</li>
<li>Palit Radeon 4850 512MB x2</li>
<li>Palit Radeon 4870 512MB x2</li>
<li>Palit GeForce 9800GX2 1024MB x2</li>
<li>Palit GeForce GTX280 1024MB</li>
<li>Sapphire Atomic 3870 512MB</li>
<li>XFX GeForce 8600GTS 256MB XXX Edition</li>
</ul>
<p>&#8220;Squealing&#8221; appeared on several Nvidia boards and a single ATI board &#8211; and on an EVGA 680i motherboard. On EVGA&#8217;s 780i and 790i FTW boards, where the Nvidia design was replaced with EPoX engineering brilliance, no squealing appeared. I never noticed any squealing on the following motherboards:</p>
<ul>
<li>ASUS M3A78-T (AMD 790GX+SB750)</li>
<li>ASUS Maximus Formula (X38+ICH9R)</li>
<li>ASUS Maximus II Formula (P45+ICH10R)</li>
<li>ASUS P5E Deluxe (X48+ICH9R)</li>
<li>GigaByte MA-790GX-DQ6</li>
<li>MSI K9A2 Platinum (790FX+SB600)</li>
</ul>
<p>Since the squealing comes as a consequence of a high-amp 12V rail, I decided to test the cards with several power supplies:</p>
<ul>
<li><a href="http://www.antec.com/usa/productDetails.php?lan=us&amp;id=27850" target="_blank">Antec TruePower Quattro 850W </a></li>
<li><a href="http://www.corsair.com/products/hx/default.aspx" target="_blank">Corsair HX620W</a></li>
<li><a href="http://www.hipergroup.com/products.php?lv=3&amp;cate=1&amp;type=25&amp;pid=25" target="_blank">Hiper  Type R II 680W</a></li>
<li><a href="http://www.hipergroup.com/products.php?lv=3&amp;cate=1&amp;type=25&amp;pid=28" target="_blank">Hiper Type R II 880W</a></li>
<li><a href="http://thermaltakeusa.com/Product.aspx?S=1207&amp;ID=1503" target="_blank">Thermaltake Toughpower 850W </a></li>
<li>Thermaltake Toughpower 900W Prototype &#8211; never released</li>
</ul>
<p>I also had the luck of testing the 9800GX2, GTX280 and ATI Radeon 2900, 3850 and 4850/4870 cards on two continents. The first place where I did the test was Livermore, CA, using standard US 110V/60Hz mains; the second location was Zagreb, Croatia, using standard European 220V/50Hz mains.<br />
This is the list of products that squealed in Crysis/Crysis: Warhead/Far Cry 2/Folding@Home:</p>
<ul>
<li>ATI Radeon X850XT</li>
<li>ATI Radeon X1800XT CrossFire Edition</li>
<li>ATI Radeon 3850 256MB</li>
<li>ASUS EN9800GX2 1024MB TOP</li>
<li>EVGA GeForce GTX280 SuperClocked 1024MB</li>
<li>Gainward GeForce 8800GT 512MB</li>
<li>Nvidia GeForce GTX280 1024MB</li>
<li>Palit GeForce 9800GX2 1024MB</li>
<li>Palit GeForce GTX280 1024MB</li>
</ul>
<p>As you can see, quite a large number of cards produced some sort of noise, with different variations. The most irritating were the ASUS/Palit 9800GX2 and Nvidia&#8217;s GTX280, while the other cards produced a more subtle, but still high-pitched noise. Power hogs like the ATI Radeon 2900XT and new babies such as the Palit Radeon 4850 and 4870 didn&#8217;t squeal. The reason is very simple: ATI pioneered the use of digital power management (an excellent design by Volterra) with the 2900XT/V8600, went back to cost-effective analog capacitors/chokes on the 3800, saw squealing re-appear, and went digital again with the 4800 series. The result is very simple &#8211; no squealing under any circumstance.</p>
<p><strong>Solution</strong><br />
If you own a card that squeals, you might ask yourself what to do. At present, only EVGA makes its own custom-design cards, with the GeForce GTX 260 Core 216 and the latest GTX 280 designs. All other partners are forced to use Nvidia&#8217;s reference design, so squealing may or may not appear on your setup.</p>
<p>If you own a card that squeals, you should try the following things:</p>
<ul>
<li>Change the power cable. Incredible, but it did work in some cases reported by my friends.</li>
<li>Is your power clean or &#8220;dirty&#8221;? Putting a power filter such as a UPS in line might help.</li>
<li>If these two fail &#8211; mod the board.</li>
</ul>
<p>Note that, for instance, one Palit 9800GX2 squealed and two didn&#8217;t. After the mod, not a single one did. The EVGA GTX280 SuperClocked board (nV reference design) squealed, while the SSC ones (EVGA design) were as good as gold. Gainward&#8217;s 8800GT continued to squeal even after the mod.</p>
<p>We&#8217;re not talking here about &#8220;if you get an Nvidia card, it will squeal&#8221;; rather, this issue is an isolated one &#8211; a matter of just how lucky you are. However, this does not absolve board designers of blame, since the &#8220;slaughtered pig squeal&#8221; issue could have been avoided by using digital circuitry.</p>
<p>Personally, I decided to go with the warranty-voiding &#8220;coloring&#8221; of the board using colorless nail polish. For this experiment, we took Palit&#8217;s GTX280 and dismantled it. Daniela took each and every power component and soaked it with polish, and where she could, she filled the inside of the capacitor/choke. We also removed all the factory-default thermal paste from the GPU and replaced it with Gelid&#8217;s GX-1 compound. That reduced the load temperature by 3 degrees, as we wanted to lower the thermal load on the PCB.</p>

<a href='http://cdn.vrworld.com/wp-content/uploads/2009/02/gtx280_step1_2.jpg' rel="lightbox[gallery-1]"><img width="500" height="329" src="http://cdn.vrworld.com/wp-content/uploads/2009/02/gtx280_step1_2.jpg" class="attachment-vw_medium" alt="Caps should be the ones producing squealing sound, but in case of our card, nail polish was needed elsewhere as well." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_01.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_01-750x420.jpg" class="attachment-vw_medium" alt="Daniela started to dismantle the board - for precise things, deploy precise people ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_03.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_03-750x420.jpg" class="attachment-vw_medium" alt="Unscrewing proved to be quite uneventful..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_02.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_02-750x420.jpg" class="attachment-vw_medium" alt="GTX280 is not that hard to dismantle, but there are some things you have to be careful about - for instance, the board is not connected only with scews... you need to use manual force as well" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_04.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_04-750x420.jpg" class="attachment-vw_medium" alt="Removing the power connector and we were almost done with the first stage" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_05.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_05-750x420.jpg" class="attachment-vw_medium" alt="GTX280 ready to be modified..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_06.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_06-750x420.jpg" class="attachment-vw_medium" alt="...but not without mandatory money shot. This time around, I chose 5 Kunas. There are already ton of pics on the net with quarter dollar or euro, so this one targets new audience ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_07.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_07-750x420.jpg" class="attachment-vw_medium" alt="Receipt for the evening - GTX280, and nail polish. No, this Palit baby will not go out and join Paris Hilton in partying ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_08.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_08-750x420.jpg" class="attachment-vw_medium" alt="Every capacitor and choke was drowned in polish, since our first attempt didn&#039;t end up well - squeeling still existed. This &quot;drowning&quot; worked ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_09.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/nvidia_squealgtx280_09-750x420.jpg" class="attachment-vw_medium" alt="And that was that." /></a>

<p>After putting the card back in the system, we turned Folding@Home back on and found the squealing almost gone, with only the CPU and PSU fans running (OCZ Vendetta + Thermaltake Toughpower). It is not a 100% solution, but with all the fans back in the system, the board continued to fold and rock in games.</p>
<p><strong>Conclusion<br />
</strong>This issue is only the latest in the recent history of scrutiny of Nvidia parts. Personally, I do not understand the dumb move by circuitry designers who decided to keep using old, analog power management at a time when digital PWM is becoming more and more available. It is not true that Nvidia didn&#8217;t know about the issue, since the first reports of squealing trace back to the nForce 680i and GeForce 8600GTS cards. Nvidia&#8217;s GTX200 series debuted at $449 and $649 price points, and there is no explanation why the more expensive digital circuitry could not be used. ATI introduced digital PWM with the 2900XT, went back to analog with the 3800 series, saw squealing re-appear, and went fully digital with the 4800 series. Case closed as far as the Red Team is concerned. I spoke with several sources inside Nvidia&#8217;s and ATI&#8217;s partners, and they have all moved to clear up the squealing issue in their own custom designs, such as EVGA&#8217;s FTW series of motherboards or its latest GTX200 cards.</p>
<p>We hope that GT206 and GT212-based cards will feature digital circuitry and that Nvidia will move into the 21st century as far as PCB design is concerned. Nvidia, here&#8217;s a free hint: if you need a contact at Volterra, I know a guy who knows a guy &#8211; we can make GT212 work all nicely, and SILENTLY!</p>
<p><em><strong>P.S. </strong>I wish to thank Ivan and his girlfriend Daniela for all the help and dismantling their own GTX280 board. BTW Ivan, sorry to put it in public, but the digital camera on Sony Ericsson P1e sux.</em> I wasn&#8217;t able to kill the noise even after 20min per picture in Photoshop. Grrr&#8230;</p>
<p><strong>UPDATE</strong><strong> February 1, 2009 00:40AM CET</strong>: I decided to update this article with a detailed picture of the GeForce GTX280 and markings showing where nail polish or hot glue should be applied. Note that I haven&#8217;t tried the hot glue method myself. What needs to be isolated are the caps (marked with a red line), but in the case of the Palit GeForce GTX280, the squealing didn&#8217;t stop until Daniela put nail polish on the remaining power distribution elements as well (blue line).</p>
<div id="attachment_1013" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-1013" title="gtx280_step1_2" src="http://cdn.vrworld.com/wp-content/uploads/2009/02/gtx280_step1_2.jpg" alt="Caps should be the ones producing squealing sound, but in case of our card, nail polish was needed elsewhere as well." width="500" height="329" /><p class="wp-caption-text">Caps should be the ones producing squealing sound, but in case of our card, nail polish was needed elsewhere as well.</p></div>
<p>Picture is provided courtesy of TechPowerUp! Thanks guys.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/">UPDATED: Nvidia&#8217;s &#8220;deadly&#8221; flaw and how to fix it &#8211; no more squealing!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/11/24/nvidias-deadly-flaw-and-how-to-fix-it-no-more-gtx280-squealing/feed/</wfw:commentRss>
		<slash:comments>92</slash:comments>
		</item>
		<item>
		<title>ASUS kills PATA and PCI standards!</title>
		<link>http://www.vrworld.com/2008/10/29/asus-kills-pata-and-pci-standards/</link>
		<comments>http://www.vrworld.com/2008/10/29/asus-kills-pata-and-pci-standards/#comments</comments>
		<pubDate>Wed, 29 Oct 2008 20:00:09 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Memory & Storage Space]]></category>
		<category><![CDATA[3-Way SLI]]></category>
		<category><![CDATA[4850]]></category>
		<category><![CDATA[4870]]></category>
		<category><![CDATA[4870X2]]></category>
		<category><![CDATA[6-Way GPU]]></category>
		<category><![CDATA[Ageia]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[ATI]]></category>
		<category><![CDATA[Bigfoot]]></category>
		<category><![CDATA[Creative Labs]]></category>
		<category><![CDATA[CrossFire]]></category>
		<category><![CDATA[CrossFireX]]></category>
		<category><![CDATA[Folding@Home]]></category>
		<category><![CDATA[GPGPU]]></category>
		<category><![CDATA[GPU Computing]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[IBM]]></category>
		<category><![CDATA[Killer NIC]]></category>
		<category><![CDATA[motherboard]]></category>
		<category><![CDATA[nForce 200]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[PATA]]></category>
		<category><![CDATA[PCI Express]]></category>
		<category><![CDATA[PCIe]]></category>
		<category><![CDATA[PhysX]]></category>
		<category><![CDATA[PS/2]]></category>
		<category><![CDATA[rendering]]></category>
		<category><![CDATA[revolution]]></category>
		<category><![CDATA[SATA]]></category>
		<category><![CDATA[scientific]]></category>
		<category><![CDATA[single slot]]></category>
		<category><![CDATA[SLI]]></category>
		<category><![CDATA[Ultimate]]></category>
		<category><![CDATA[ultimate folding]]></category>
		<category><![CDATA[water cooling]]></category>
		<category><![CDATA[X58]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=229</guid>
		<description><![CDATA[<p>Back on the INQ, I wrote about dangers lying ahead for AGEIA, Creative Labs and Bigfoot Networks, representatives of these respected companies just told me ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/29/asus-kills-pata-and-pci-standards/">ASUS kills PATA and PCI standards!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Back on the INQ, I wrote about the dangers lying ahead for AGEIA, Creative Labs and Bigfoot Networks; representatives of these respected companies told me at the time that their business models were solid and that they were, indeed, future-proof.<br />
Well, that turned out nicely &#8211; AGEIA never took off because of the $250 charge for a PCI card, Creative now exists almost solely on patent lawsuits and selling off its own property, while Bigfoot Networks made the greatest network card on the planet &#8211; and failed to package it in an attractive, future-proof product.<br />
The reason for this rant is <a href="http://www.xfastest.com/viewthread.php?tid=15508&amp;extra=&amp;page=1" target="_blank">a story on Xfastest.com</a> introducing the ASUS P6T6-WS Revolution motherboard. This board previews the look of motherboards coming to market in the next couple of years.<br />
The P6T6-WS is based on Intel&#8217;s X58 plus the nForce 200 chip, and the reason for the REVOLUTION name is that there are no PCI slots on the motherboard. Yes, the P6T6-WS features no fewer than six PCI Express x16 slots &#8211; offering the possibility of installing six single-slot graphics cards.</p>
<div id="attachment_230" style="width: 510px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/asus_p6tws.jpg" rel="lightbox-0"><img class="size-full wp-image-230" title="asus_p6tws" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/asus_p6tws.jpg" alt="The motherboard for the ultimate workstation" width="500" height="333" /></a><p class="wp-caption-text">The motherboard for the ultimate workstation</p></div>
<p>The board supports both SLI and CrossFire in their respective maximum configurations (3 or 4 GPUs), but what makes this board really interesting is that you could connect 12 LCD displays to it, or create a GPGPU/rendering/scientific/folding farm in a single case. Installing six ATI Radeon 4850 graphics cards would yield roughly 6 TFLOPS of computing power. In Nvidia&#8217;s case, you would have to pick the GeForce 9800GT (Palit has a single-slot 1GB card) and accept less theoretical computing power, but in terms of folding you would be looking at a 30,000-35,000 PPD system (at the cost of two GTX260 cards).<br />
This is a really impressive engineering feat from ASUS, with the only disappointment being the use of a Realtek GbE controller. For a workstation motherboard, I would be much happier if a Marvell controller were on board.<br />
Storage-wise, you can install no fewer than eight SATA devices and not a single IDE device, since ASUS stayed on the &#8220;Revolution&#8221; theme and killed off the PATA connector. I also found the shared PS/2 port to be a pretty neat solution, even though a real revolution would have killed both PS/2 slots. As it stands, you are still left with one legacy part: PS/2.</p>
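<p>The rough 6 TFLOPS estimate above can be sanity-checked with a little arithmetic; a minimal sketch, assuming the commonly quoted Radeon 4850 figures of the era (800 stream processors at 625 MHz, two FLOPs per ALU per clock via MAD) &#8211; the helper name is mine, not from any vendor tool:</p>

```python
# Back-of-the-envelope peak-throughput check (illustrative, not a benchmark).
def theoretical_tflops(alus, clock_mhz, flops_per_clock=2):
    """Peak single-precision TFLOPS: ALUs x clock x FLOPs per ALU per clock."""
    return alus * clock_mhz * 1e6 * flops_per_clock / 1e12

# Radeon 4850 specs as commonly quoted in 2008 (assumption)
single_card = theoretical_tflops(800, 625)
farm = 6 * single_card  # six single-slot cards on the P6T6-WS
print(single_card, farm)  # 1.0 6.0
```

<p>One card works out to roughly 1 TFLOPS theoretical, which is where the &#8220;roughly 6 TFLOPS&#8221; for a fully populated board comes from.</p>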
<div id="attachment_232" style="width: 510px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/asus_p6tws_ps2.jpg" rel="lightbox-1"><img class="size-full wp-image-232" title="asus_p6tws_ps2" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/asus_p6tws_ps2.jpg" alt="There is one shared PS/2 port, for either keyboard or a mouse" width="500" height="174" /></a><p class="wp-caption-text">There is one shared PS/2 port, for either keyboard or a mouse</p></div>
<p>The funny part of this story is that if anybody had a time machine, went back to IBM&#8217;s engineers in the 1986-87 timeframe and told them that the only remnant of their failed standard would be a keyboard/mouse connector, and that the PS/2 connector would outlive PATA, I guess they would call you&#8230; crazy? Lunatic? Infidel? <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_wink.gif" alt=";-)" class="wp-smiley" /></p>
<p>P.S. If you&#8217;re wondering&#8230; yes, the answer is yes. There are no technical issues that would prevent you from installing a 3-Way SLI and 4-Way CrossFireX setup consisting of three GTX280 and two 4870X2 cards. The only problem is that you would need a water-cooling setup, since you are limited to single-slot cooling solutions. I guess Asetek, CoolIT or somebody similar could come up with a solution for this &#8220;problem&#8221;.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/29/asus-kills-pata-and-pci-standards/">ASUS kills PATA and PCI standards!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/10/29/asus-kills-pata-and-pci-standards/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Nvidia&#8217;s $50 card destroys ATI&#8217;s $500 one or &#8220;Why ATI sucks in Folding?&#8221;</title>
		<link>http://www.vrworld.com/2008/10/24/why-nvidia-destroys-ati-in-folding-at-hom/</link>
		<comments>http://www.vrworld.com/2008/10/24/why-nvidia-destroys-ati-in-folding-at-hom/#comments</comments>
		<pubDate>Fri, 24 Oct 2008 16:00:19 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[8800]]></category>
		<category><![CDATA[9600]]></category>
		<category><![CDATA[9600 gso]]></category>
		<category><![CDATA[9800]]></category>
		<category><![CDATA[ATI]]></category>
		<category><![CDATA[berkeley]]></category>
		<category><![CDATA[EVGA]]></category>
		<category><![CDATA[FireGL]]></category>
		<category><![CDATA[FirePro]]></category>
		<category><![CDATA[Folding]]></category>
		<category><![CDATA[Folding@Home]]></category>
		<category><![CDATA[Gainward]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GPGPU]]></category>
		<category><![CDATA[GPU Computing]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[LeadTek]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Palit]]></category>
		<category><![CDATA[PowerColor]]></category>
		<category><![CDATA[Quadro]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Sapphire]]></category>
		<category><![CDATA[seti]]></category>
		<category><![CDATA[seti@home]]></category>
		<category><![CDATA[stanford university]]></category>
		<category><![CDATA[XFX]]></category>
		<category><![CDATA[Zotac]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=190</guid>
		<description><![CDATA[<p>As you might already know, I am a bit enthusiastic when it comes to distributed computing. I&#8217;ve been looking for aliens through SETI@home, later with ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/24/why-nvidia-destroys-ati-in-folding-at-hom/">Nvidia&#8217;s $50 card destroys ATI&#8217;s $500 one or &#8220;Why ATI sucks in Folding?&#8221;</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>As you might already know, I am a bit enthusiastic when it comes to distributed computing. I&#8217;ve been looking for aliens through SETI@home, later with BOINC&#8230; but then <a href="http://folding.stanford.edu/English/Science" target="_blank">Folding@Home</a> showed up and I became an enthusiast for this valuable project from Stanford University. My family has had its share of dealings with Alzheimer&#8217;s (aka AD) and Parkinson&#8217;s (aka PD) diseases, and I won&#8217;t go into the psychological and, ultimately, financial stress that families around the world, including my own, have to endure.<br />
Folding@Home is also the project that pioneered the use of GPUs for distributed computing (if I am wrong on this one, feel free to correct me). Back in the summer of 2006, I heard that ATI and Stanford were working on a Folding@Home GPGPU client. I remember my own articles, and those of many colleagues, criticizing Nvidia for not having an F@H client.</p>
<div id="attachment_196" style="width: 510px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/folding_nvdavsati.jpg" rel="lightbox-0"><img class="size-full wp-image-196" title="folding_nvdavsati" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/folding_nvdavsati.jpg" alt="Nvidia's client may not look as nice as ATI one, but it's the efficiency that counts..." width="500" height="348" /></a><p class="wp-caption-text">Nvidia&#39;s client may not look as nice as ATI one, but it&#39;s the efficiency that counts...</p></div>
<p>Fast forward to the GTX280 launch, and Vijay Pande&#8217;s team debuted the Folding@Home client for Nvidia chips as well. Nvidia and ATI waged a short marketing war over who could fold better, and then things went quiet&#8230; apparently, for a reason.<br />
The reason things went quiet is probably an &#8220;inconvenient truth&#8221;: ATI showed up with the Radeon 4800 series and demolished Nvidia&#8217;s dominance in the segment, with the GTX260 and 280 going through radical price drops in order to stay competitive. However, ATI&#8217;s Radeon 4800 series has one field where it loses to cards five to ten times cheaper: Folding@Home.<br />
The 10x figure comes from comparing ATI&#8217;s current flagship, the Radeon 4870X2, with Nvidia&#8217;s GeForce 9600GSO. This $50 card can easily out-fold the ATI Radeon 4870X2, which retails for more than 500 USD / 450 EUR in respective markets.<br />
Over the past weeks, I&#8217;ve conducted a series of tests with various graphics cards (all that I own or could get my hands on), and the results were quite depressing if you own an ATI card. I&#8217;ve asked some of my contacts at AMD why the performance is so bad, and the answers ranged from &#8220;we wanted to make the best gamer&#8217;s card, not a card for Folding&#8221; to sad silence. It seems to me that the difference lies in shader type and clock: ATI&#8217;s R6xx and RV7xx architectures are built around a few big, fat units and a lot of tiny ones (64+256 in the case of the Radeon 3800, 80+720 in the case of the Radeon 4800), and the clock is much lower than on GeForce cards. Nvidia went the other route and came up with a large number of &#8220;fat&#8221; units, while not even counting the &#8220;thin&#8221; (MADD) ones.<br />
When we compare the GTX280 and the 4870X2, the numbers are just astounding: over a period of a month, EVGA&#8217;s GTX280 SSC achieved an average of 6,802 points per day, while the ATI Radeon 4870X2 managed a puny 3,870 PPD. At the same time, I&#8217;ve witnessed higher PPD scores achieved even by a two-year-old GeForce 8800GTS 640MB, which was quite a surprise. Around two weeks ago, I started following PPD numbers using FahMon on a large number of systems that mostly share the same configuration: a dual-core processor or better, 2GB of system memory or more, and the graphics card under test. In all cases, with the help of my friends, I&#8217;ve managed to check FahMon and KakaoStats for roughly 25 cards and came to a surprising result.<br />
With the recent update to the GPU2 client and the new Fah_Core11.exe (ATI uses v1.17, Nvidia v1.15), the community witnessed a further fall in the number of completed packets per day. If you&#8217;re not familiar with Folding@Home packets: every packet features a certain number of mathematical simulations for the tested protein &#8211; in Nvidia&#8217;s case, a packet consists of 25 million operations, while ATI&#8217;s features 10 million. However, due to the different types of mathematical operations, Nvidia&#8217;s packet will usually result in 480 points, while ATI&#8217;s 10 million will return 548 points (or 338 points for the recently introduced ATI packets).<br />
As I previously wrote, the table below is not the result of one packet score and an Excel calculation, but rather continuous number crunching over the course of several weeks, with one week used for measurement.</p>
<p><strong><br />
Improvised Top 20 Folding@Home GPUs:</strong></p>
<ol>
<li><span style="color:#339966;">Nvidia GeForce GTX280 1GB (EVGA SSC)</span></li>
<li><span style="color:#339966;">Nvidia GeForce GTX260-216 896MB (EVGA SSC)</span></li>
<li><span style="color:#339966;">Nvidia GeForce GTX260 896MB (EVGA Superclocked)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 9800GTX+ 512MB (ASUS TOP)</span></li>
<li><span style="color:#339966;">Nvidia Quadro FX 4600 SDI 768MB (PNY)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 9800GTX 512MB (ASUS TOP)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8800GTX 768MB (Zotac AMP! Edition)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8800Ultra 768MB (XFX XXX Edition)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8800GTS 512MB (Gainward)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8800GT 512MB (Gainward)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 9600GSO 768MB (EVGA)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8800GTS 640MB (LeadTek)</span></li>
<li><span style="color:#ff0000;">ATI Radeon 4870X2 2GB (PowerColor)</span></li>
<li><span style="color:#ff0000;">ATI Radeon 4870 512MB (PALIT)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 9600GT 256MB (Zotac)</span></li>
<li><span style="color:#ff0000;">ATI Radeon 4850 512MB (PALIT)</span></li>
<li><span style="color:#ff0000;">ATI Radeon 3870 512MB (Sapphire Atomic)</span></li>
<li><span style="color:#ff0000;">ATI FireGL V8600 1GB (ATI)</span></li>
<li><span style="color:#339966;">Nvidia GeForce 8600GTS 256MB (XFX XXX Edition)</span></li>
<li><span style="color:#ff0000;">ATI Radeon 3850 256MB (Sapphire)</span></li>
</ol>
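<p>For context, the packet figures quoted earlier can be turned into a rough points-per-unit-of-work comparison; a minimal sketch using only the article&#8217;s own numbers (480 points per 25-million-step Nvidia packet, 548 points per 10-million-step ATI packet, and the measured 6,802 vs. 3,870 PPD) &#8211; the helper and variable names are mine:</p>

```python
# Credit per million simulation steps, from the packet sizes cited above.
def points_per_million(points, million_steps):
    return points / million_steps

nvidia_ppm = points_per_million(480, 25)  # 19.2 points per million steps
ati_ppm = points_per_million(548, 10)     # 54.8 points per million steps

# ATI is credited far MORE per step, so its lower PPD implies it completes
# proportionally far fewer steps per day (assuming all-typical packets).
nvidia_steps = 6802 / nvidia_ppm  # roughly 354 million steps per day
ati_steps = 3870 / ati_ppm        # roughly 71 million steps per day
print(round(nvidia_steps), round(ati_steps))
```

<p>In other words, the points system actually flatters ATI per unit of work; the raw throughput gap behind these PPD numbers is even larger than the table suggests.</p>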
<p>This is by no means a complete table, since I am missing several newer GPUs. But even in this one, as you can see for yourself, the results are quite dramatic for the red team. Two-year-old GeForce GPUs demolished the otherwise-brilliant Radeon series, and it is incredible that even a GeForce 9600 will out-fold a Radeon 4850. This is a rude wake-up call for the guys at Markham, because this is just unbelievable.<br />
Personally, I am running a combination of an AMD Spider platform (9850BE + 790GX + ATI Radeon 4870X2) and a hybrid Intel V8/Skulltrail platform with a Quadro FX 4600 SDI.<br />
Of course, everything could change with a simple driver update. I don&#8217;t understand what happened with AMD/ATI, a company that led the field of GPGPU computing for so long &#8211; why wouldn&#8217;t AMD work on optimizing the Folding@Home client? I am aware that AMD poached Mike Houston from Stanford to work on Brook+ and now the OpenCL API, but surely performance didn&#8217;t go downhill from the influence of just one person. Or just maybe&#8230;<br />
Overall, I hope that Catalyst 8.11 or 8.12 will bring more performance for ATI cards, since I do not believe it would be so hard to optimize the drivers for GPGPU/GPU computing usage. For now, in Folding@Home, ATI is a complete washout.</p>
<p>To close this article: if you find that your GPU cycles could be used for something good, I invite you to <a href="http://theovalich.wordpress.com/2008/10/19/foldinghome-team/" target="_blank">read the following article</a> and join the F@H family, regardless of what client (<a href="http://folding.stanford.edu/English/Download" target="_blank">CPU</a> or <a href="http://folding.stanford.edu/English/DownloadWinOther" target="_blank">GPU</a>) or team you choose in the end. Intel, AMD, ATI, Nvidia, Windows, Linux or Mac OS &#8211; it does not matter, just join &#8211; if you want, of course.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/24/why-nvidia-destroys-ati-in-folding-at-hom/">Nvidia&#8217;s $50 card destroys ATI&#8217;s $500 one or &#8220;Why ATI sucks in Folding?&#8221;</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/10/24/why-nvidia-destroys-ati-in-folding-at-hom/feed/</wfw:commentRss>
		<slash:comments>41</slash:comments>
		</item>
		<item>
		<title>Nvidia makes a &#8220;stupid&#8221; call with brilliant RapiHD</title>
		<link>http://www.vrworld.com/2008/10/23/nvidia-makes-a-stupid-call-with-gpu-accelerated-rapihd/</link>
		<comments>http://www.vrworld.com/2008/10/23/nvidia-makes-a-stupid-call-with-gpu-accelerated-rapihd/#comments</comments>
		<pubDate>Thu, 23 Oct 2008 04:00:22 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[32-bit]]></category>
		<category><![CDATA[64-bit]]></category>
		<category><![CDATA[Adobe]]></category>
		<category><![CDATA[CS4]]></category>
		<category><![CDATA[Elemental technologies]]></category>
		<category><![CDATA[FX 5800]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[Hollywood]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Opteron]]></category>
		<category><![CDATA[Premiere Pro]]></category>
		<category><![CDATA[Quadro]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[RapiHD]]></category>
		<category><![CDATA[SDI]]></category>
		<category><![CDATA[SDI add-on]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=146</guid>
		<description><![CDATA[<p>With the release of Adobe Creative Studio 4, Elemental Technologies finally launched their own RapiHD CUDA-accelerator for Premiere Pro. As team of users of Sony ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/23/nvidia-makes-a-stupid-call-with-gpu-accelerated-rapihd/">Nvidia makes a &#8220;stupid&#8221; call with brilliant RapiHD</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>With the release of Adobe Creative Studio 4, Elemental Technologies finally launched their own RapiHD CUDA-accelerator for Premiere Pro.</p>
<div id="attachment_150" style="width: 260px" class="wp-caption alignright"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/elementaltech_rapihdlogo.jpg" rel="lightbox-0"><img class="size-full wp-image-150" title="elementaltech_rapihdlogo" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/elementaltech_rapihdlogo.jpg" alt="RapiHD... only for Quadro CS... CX. Darn shame..." width="250" height="176" /></a><p class="wp-caption-text">RapiHD... only for Quadro CS... CX. Darn shame...</p></div>
<p>As users of Sony 1080p and RED One (4K FTW!) cameras, my team expected RapiHD to be a brilliant add-on to my configuration, consisting of two quad-core Intel Xeons at 3 GHz, a brilliant ASUS Skulltrail-lite motherboard and an Nvidia Quadro FX 4600 SDI. SDI is paramount when working with an HD signal and the RED camera, since it tremendously speeds up the workflow. I am the first to admit that I am not exactly at home with video production per se, but I know well what to do to make things work, and make them work fast. And now, here comes RapiHD. I wrote about this plug-in a couple of times on TG Daily, and saw the funny side of the 32-bit limitation when testing a Quadro FX 5800 on a 32-bit OS (the FX 5800 has 4GB of RAM, so performance suffered unless you reg-hacked the card down to 2GB or upgraded to a 64-bit OS).</p>
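<p>The 32-bit limitation mentioned above is simple address-space arithmetic; a minimal sketch of my framing of it (not Elemental&#8217;s or Nvidia&#8217;s), assuming default Windows 32-bit behavior:</p>

```python
# Why a 4 GB card is awkward on a 32-bit OS: the process address space
# simply isn't big enough to see all of the card's memory.
ADDRESS_BITS = 32
process_space_gb = 2 ** ADDRESS_BITS / 2 ** 30  # 4.0 GB of virtual address space total
windows_user_space_gb = process_space_gb / 2    # 2.0 GB user-mode portion by default
fx5800_memory_gb = 4.0                          # Quadro FX 5800 on-board memory

# The card carries more memory than a 32-bit process can address, which is
# why the reg-hack down to 2 GB (the default user-space limit) helped.
print(windows_user_space_gb, fx5800_memory_gb > windows_user_space_gb)
```

<p>The 2GB figure the reg-hack targeted lines up with the default user-mode portion of a 32-bit Windows process; a 64-bit OS removes the ceiling entirely.</p>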
<p>The CS4 launch came and went, and the RapiHD site got links to the Quadro CX product page. So, what happened? Well, Nvidia has an exclusive bundling deal for RapiHD, and the company decided to bundle the pack exclusively with the Quadro CX. I spoke with Elemental Technologies&#8217; Sam Blackman, and he told me that Adobe certified the Nvidia Quadro CX as the preferred solution for Adobe CS4. Alarms went off in my head like no tomorrow, since the Quadro CX comes with quite an interesting set of specs. Of course, I reserve the right to be proven wrong if the Quadro CX has some special juice inside. But I see a cut-down GT200 chip with 192 processors, a 384-bit memory controller and 1.5GB of video memory. In essence, we&#8217;re talking about this:</p>
<p>&#8220;Old&#8221; GTX260 GPU &#8211; one 64-bit memory controller lane + higher-density GDDR3 = Quadro CX.</p>
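<p>That back-of-the-envelope equation can be written out in numbers; a minimal sketch, assuming GT200&#8217;s commonly quoted layout (64-bit memory channels, the &#8220;old&#8221; GTX260 using seven of them with 128MB per channel for its 896MB) &#8211; the constants are my illustrative figures, not official specs:</p>

```python
# GT200 memory-configuration arithmetic (illustrative figures as quoted in 2008).
CHANNEL_WIDTH_BITS = 64     # one GT200 memory channel
MB_PER_CHANNEL_GDDR3 = 128  # density on the "old" GTX260 (assumption)

gtx260_bus = 7 * CHANNEL_WIDTH_BITS              # 448-bit bus, 7 x 128 MB = 896 MB
quadro_cx_bus = gtx260_bus - CHANNEL_WIDTH_BITS  # drop one channel: 384-bit
quadro_cx_mb = 6 * (2 * MB_PER_CHANNEL_GDDR3)    # double the density: 1536 MB = 1.5 GB

print(gtx260_bus, quadro_cx_bus, quadro_cx_mb)  # 448 384 1536
```

<p>Dropping one channel and doubling the chip density lands exactly on the Quadro CX&#8217;s published 384-bit / 1.5GB configuration, which is what makes the cut-down-GTX260 reading so plausible.</p>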
<p>So yes, compatibility with a 32-bit OS is a given here, but shouldn&#8217;t you tell the video production industry: &#8220;please migrate to 64-bit, things are rosy there&#8221;? Everybody and his uncle knows that Hollywood listened, and now FX work is better than ever (BTW, the full-scale models in Battlestar Galactica are rendered on an AMD Opteron machine with a Quadro FX inside &#8211; no render-farm mumbo jumbo &#8211; and that was made possible by a 64-bit OS and 16GB of memory). Guys, please, offer RapiHD as an option or a bundle with the SDI add-on card as well, since the most powerful and most wanted solution for movie producers would be a Quadro FX 5800 plus the SDI add-on card. That&#8217;s my opinion &#8211; my two cents, of course.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/23/nvidia-makes-a-stupid-call-with-gpu-accelerated-rapihd/">Nvidia makes a &#8220;stupid&#8221; call with brilliant RapiHD</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/10/23/nvidia-makes-a-stupid-call-with-gpu-accelerated-rapihd/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>UPDATE: EVGA to launch Intel X58 motherboards</title>
		<link>http://www.vrworld.com/2008/10/22/world-exclusive-evga-to-launch-intel-x58-motherboards/</link>
		<comments>http://www.vrworld.com/2008/10/22/world-exclusive-evga-to-launch-intel-x58-motherboards/#comments</comments>
		<pubDate>Tue, 21 Oct 2008 22:00:52 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[3-SLI]]></category>
		<category><![CDATA[3-Way]]></category>
		<category><![CDATA[55nm]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[DDR3]]></category>
		<category><![CDATA[epox]]></category>
		<category><![CDATA[EVGA]]></category>
		<category><![CDATA[Gigabyte]]></category>
		<category><![CDATA[GTX260]]></category>
		<category><![CDATA[GTX270]]></category>
		<category><![CDATA[Kinc]]></category>
		<category><![CDATA[lifetime warranty]]></category>
		<category><![CDATA[MSI]]></category>
		<category><![CDATA[nForce 200]]></category>
		<category><![CDATA[nForce bridge]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Overclocking]]></category>
		<category><![CDATA[Peter Tan]]></category>
		<category><![CDATA[Shamino]]></category>
		<category><![CDATA[Step-up]]></category>
		<category><![CDATA[Tri-SLI]]></category>
		<category><![CDATA[Triple-SLI]]></category>
		<category><![CDATA[Vince]]></category>
		<category><![CDATA[X58]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=134</guid>
		<description><![CDATA[<p>When it comes to add-in board vendors, EVGA is probably the most faithful company in the business. Ever since the company launched, Nvidia was the ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/22/world-exclusive-evga-to-launch-intel-x58-motherboards/">UPDATE: EVGA to launch Intel X58 motherboards</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>When it comes to add-in board vendors, EVGA is probably the most faithful company in the business. Ever since the company launched, Nvidia has been the only name EVGA wanted to hear about. But things are about to change.<br />
Here are the facts:<br />
1)    EVGA does not want to miss the Core i7 train<br />
2)    Nvidia is not making a chipset for Intel Core i7<br />
3)    EVGA poached an excellent engineering team from the now-defunct EPoX and does not want that team sitting idle until the MCP8 series shows up<br />
Well, those facts add up to a really simple result: EVGA is preparing to launch its first non-Nvidia-based motherboard, but it will still have Nvidia chips on it. You&#8217;ve guessed it right &#8211; X58 + nForce 200 bridges for full Triple-SLI capability.</p>
<div id="attachment_139" style="width: 510px" class="wp-caption alignnone"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/evga_x58.jpg" rel="lightbox-0"><img class="size-full wp-image-139" title="evga_x58" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/evga_x58.jpg" alt="EVGA's first non-Nvidia chipset based motherboard. Note that all SATA ports are angled, so even three GPUs won't limit your storage capabilities" width="500" height="657" /></a><p class="wp-caption-text">EVGA&#39;s first non-nV-based motherboard - all SATA connectors are angled, so your storage expansion is not limited even with 3-Way SLI. Good move.</p></div>
<p>The motherboard is being designed by the ex-EPoX engineering team, bringing plenty of overclocking capability, digital PWM, and fully solid-state caps across the board. Special attention is being paid to feeding top juice to the graphics cards, so if you decide to go for gold and grab a 3-SLI setup with three water-cooled GTX270 boards, be our guest.<br />
But that is not all. This is just the first motherboard; it will be followed by the ultimate motherboard for this Christmas, no questions asked. That enthusiast board is actually being designed by a world-class overclocker &#8211; yes, the one and only Peter Tan, a.k.a. Shamino, who is making a &#8220;Shamino special&#8221;.</p>
<p>For those unaware &#8211; Brian Flood from Mushkin, Shamino, Kinc and I share a special connection. According to the German police, we all had the honor of dying and resurrecting. Last year, we were all pronounced &#8220;missing, presumed dead&#8221; when some <a href="http://www.theinquirer.net/gb/inquirer/news/2007/08/24/54ghz-conroe-gets-stolen-in-theo-disaster-diary" target="_blank">East German thieves stole a hell of a lot of stuff from our brand new VW Passat Variant</a> en route to the airport. Well, VW is one car I will *never* buy. Piece of alarm-unsecured junk.<br />
Getting back to the subject: EVGA is bringing several things to the high-end X58 market that nobody else has. First of all, the company will offer a limited lifetime warranty (just as with all of its high-end products) and, yes, a 90-day Step-Up program.<br />
If that does not make ASUS, Gigabyte, MSI and other motherboard makers sweat, we don&#8217;t know what will. A lifetime warranty on a motherboard? A free upgrade program? Bloody hell, I am buying that one.</p>
<p>UPDATED: Inserted picture of the motherboard. Courtesy of <a href="http://www.xtremesystems.org/forums/showthread.php?p=3373662#post3373662" target="_blank">XtremeSystems forums</a>.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/22/world-exclusive-evga-to-launch-intel-x58-motherboards/">UPDATE: EVGA to launch Intel X58 motherboards</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/10/22/world-exclusive-evga-to-launch-intel-x58-motherboards/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
	</channel>
</rss>
