<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; GeForce</title>
	<atom:link href="http://www.vrworld.com/tag/geforce/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:54:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>Nvidia Teases More Pascal Details at GTC 2015</title>
		<link>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/</link>
		<comments>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/#comments</comments>
		<pubDate>Wed, 18 Mar 2015 01:29:32 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Companies]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GTC 2015]]></category>
		<category><![CDATA[Jen-Hsun Huang]]></category>
		<category><![CDATA[Maxwell]]></category>
		<category><![CDATA[NASDAQ: NVDA]]></category>
		<category><![CDATA[Pascal]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50192</guid>
		<description><![CDATA[<p>New GPU architecture promises ten-times the performance of Maxwell.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/">Nvidia Teases More Pascal Details at GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1471" height="932" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/PascalBoard-1.jpg" class="attachment-post-thumbnail wp-post-image" alt="PascalBoard (1)" /></p><p><a href="http://www.vrworld.com/tag/nvidia/">Nvidia’s</a> (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) CEO Jen-Hsun Huang gave the world another look at the GPU successor to Maxwell at its GPU Technology Conference (GTC 2015) in San Jose Tuesday.</p>
<p>Pascal was first announced as a mystery GPU between Maxwell and Volta at last year’s GTC. Tuesday’s announcement gives us the first concrete details of Pascal.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal1.png" rel="lightbox-0"><img class="aligncenter size-medium wp-image-50194" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal1-600x209.png" alt="Pascal1" width="600" height="209" /></a></p>
<p>Huang said that Pascal, which is set to arrive in 2016, would offer a ten-fold overall average improvement over Maxwell and a four-fold boost in mixed-precision workloads. In terms of performance per watt, it will offer a two-fold improvement over Maxwell. However, he later cautioned that this was “CEO Math” and actual performance may vary.</p>
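<p>As a back-of-the-envelope illustration (our own sketch, not Nvidia's published methodology), "CEO math" of this sort typically multiplies per-feature gains as if every workload benefited from all of them at once, which is why the figure deserves Huang's caveat:</p>

```python
# Our own sketch, not Nvidia's methodology: "CEO math" multiplies
# per-feature gains as if they all applied to the same workload.
claimed_gains = {
    "mixed-precision throughput": 4.0,  # four-fold boost cited in the keynote
    "memory bandwidth (HBM)": 3.0,      # three-fold bandwidth improvement cited
}

headline = 1.0
for factor in claimed_gains.values():
    headline *= factor

print(headline)  # 12.0, in the rough ballpark of the ten-fold claim
```

<p>Real applications rarely stack every factor at once, hence the "actual performance may vary" disclaimer.</p>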
<p><strong><strong><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal2.png" rel="lightbox-1"><img class="aligncenter size-medium wp-image-50193" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal2-600x206.png" alt="Pascal2" width="600" height="206" /></a></strong></strong></p>
<p>Pascal will use a suite of new technologies, including 3D-stacked memory and NVLink (Huang says it will offer a five-fold improvement over PCI-E). It will be built on TSMC’s (<a href="http://www.google.com/finance?cid=674465">TPE: 2330</a>) 16nm FF+ (FinFET Plus &#8212; the follow-up to FinFET) process node. It will also use High Bandwidth Memory, allowing a three-fold improvement in bandwidth for its 32 GB of RAM.</p>
<p>Cards with Pascal will likely be marketed towards CUDA workstations, or perhaps as some sort of competitor to Intel’s (<a href="http://www.google.com/finance?cid=284784">NASDAQ: INTC</a>) Xeon Phi co-processors. Game developers have a tough time pushing the limits of current generation cards as it is.</p>
<p>Pricing and availability details for Pascal-based cards are expected later this year or early next year.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/">Nvidia Teases More Pascal Details at GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Galax Brings Out The New GTX 970 Black Edition</title>
		<link>http://www.vrworld.com/2014/12/02/galax-brings-new-gtx-970-black-edition/</link>
		<comments>http://www.vrworld.com/2014/12/02/galax-brings-new-gtx-970-black-edition/#comments</comments>
		<pubDate>Wed, 03 Dec 2014 07:38:24 +0000</pubDate>
		<dc:creator><![CDATA[VR World Staff]]></dc:creator>
				<category><![CDATA[News]]></category>
		<category><![CDATA[970]]></category>
		<category><![CDATA[980]]></category>
		<category><![CDATA[Black Edition]]></category>
		<category><![CDATA[form factor]]></category>
		<category><![CDATA[Galax]]></category>
		<category><![CDATA[Galaxy]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Graphics Card]]></category>
		<category><![CDATA[gtx]]></category>
		<category><![CDATA[GTX 970]]></category>
		<category><![CDATA[MATX]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[sff]]></category>
		<category><![CDATA[small form factor]]></category>
		<category><![CDATA[video card]]></category>
		<category><![CDATA[videocard]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=42593</guid>
		<description><![CDATA[<p>Galax has added its own entry to the growing lineup of small form factor GTX 970s: the new GTX 970 Black Edition, which is only 19 cm long.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/12/02/galax-brings-new-gtx-970-black-edition/">Galax Brings Out The New GTX 970 Black Edition</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1200" height="913" src="http://cdn.vrworld.com/wp-content/uploads/2014/12/Galax-GTX-970-Black-Edtion-1.jpg" class="attachment-post-thumbnail wp-post-image" alt="Galax GTX 970 Black Edition - 1" /></p><p>Galax has added its own entry to the growing lineup of small form factor GTX 970s. The new GTX 970 Black Edition is only 19 cm long and packs as much power as a standard-sized GTX 970.</p>
<p>Today Galax (aka Galaxy) announced a brand new video card in its GTX 970 lineup, the GTX 970 Black Edition. The card is a small form factor design compact enough to fit in many small form factor (SFF) cases. It is only 19 cm long and has a very nice custom cooler employing twin 80 mm fans sitting atop aluminum fins fed by three heat pipes. The card comes factory overclocked with a base clock of 1126MHz and a boost clock of 1266MHz. Galax decided to leave the memory alone, as it runs the standard 7GHz memory clock. The card will be available very shortly for about €320 from a variety of retailers and etailers.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/12/Galax-GTX-970-Black-Edtion-2.jpg" rel="lightbox-0"><img class="aligncenter size-medium wp-image-42596" src="http://cdn.vrworld.com/wp-content/uploads/2014/12/Galax-GTX-970-Black-Edtion-2-600x471.jpg" alt="Galax GTX 970 Black Edtion - 2" width="600" height="471" /></a></p>
<p>The Nvidia GTX 970 is currently the &#8220;it&#8221; card of the moment, as it is quite frankly the model that provides the best price-to-performance ratio. The GTX 970 has been constantly moving in and out of stock at various sellers due to its popularity. This has created a lot of frustration for consumers who want to upgrade, only to find the card constantly out of stock, turning the purchase into a game of checking the store for availability. These cards are so powerful and so energy efficient that a small form factor build with one of them can be just as powerful as a full-sized system. Small form factor builds are becoming increasingly popular among enthusiasts, and with Intel X99 motherboards coming out in mATX sizes, users can now create powerful rigs that take only a fraction of the space they used to while being more powerful.<a href="http://cdn.vrworld.com/wp-content/uploads/2014/12/12c.jpg" rel="lightbox-1"><img class="aligncenter size-medium wp-image-42594" src="http://cdn.vrworld.com/wp-content/uploads/2014/12/12c-600x557.jpg" alt="Galax GTX 970 Black Edition - 3" width="600" height="557" /></a></p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/12/02/galax-brings-new-gtx-970-black-edition/">Galax Brings Out The New GTX 970 Black Edition</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/12/02/galax-brings-new-gtx-970-black-edition/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia&#039;s GTX 980 and GTX 970 Launch, One Month Later</title>
		<link>http://www.vrworld.com/2014/10/24/nvidias-gtx-980-and-gtx-970-launch-one-month-later/</link>
		<comments>http://www.vrworld.com/2014/10/24/nvidias-gtx-980-and-gtx-970-launch-one-month-later/#comments</comments>
		<pubDate>Sat, 25 Oct 2014 00:39:05 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[DSR]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX 970]]></category>
		<category><![CDATA[GeForce GTX 980]]></category>
		<category><![CDATA[GTX 970]]></category>
		<category><![CDATA[GTX 980]]></category>
		<category><![CDATA[mfaa]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[SLI]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=40429</guid>
		<description><![CDATA[<p>It has been a month since Nvidia launched their Maxwell GTX 980 and GTX 970 GPUs, and we are taking a look at how well (or poorly) they have sold.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/10/24/nvidias-gtx-980-and-gtx-970-launch-one-month-later/">Nvidia&#039;s GTX 980 and GTX 970 Launch, One Month Later</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="638" height="748" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/Untitled2.png" class="attachment-post-thumbnail wp-post-image" alt="ATTO Disk Benchmark D3 Station" /></p><p>Since the September launch of high-end Maxwell, we&#8217;ve seen Nvidia launch the GeForce GTX 980 as well as the GTX 970 and their accompanying mobile brethren, the GeForce GTX 980M and GTX 970M. Now that Nvidia&#8217;s high-end GPUs have had roughly a month of sales, we want to take a look at how well they&#8217;re selling and why. The GTX 980 is going for anywhere between $549 and $599, and the GTX 970 is selling for anywhere between $329 and $409.</p>
<p>We first decided to talk to some retailers, specifically some etailers across the globe. In our conversations, they indicated that they are essentially selling as many GTX 980s and GTX 970s as they possibly can, and that when the cards initially launched, they had difficulty keeping them in stock. If we look at the stock situation now, it appears to be better. A quick <a href="http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&amp;N=100007709%20600536666&amp;IsNodeId=1&amp;name=GeForce%20GTX%20900%20series" target="_blank">stock check of GTX 980 and GTX 970s on Newegg</a> shows a fair number of cards in stock, while some others are still out of stock or on order. If you <a href="http://www.amazon.com/gp/bestsellers/electronics/284822/" target="_blank">look on Amazon</a>, you can see that other than &#8216;budget&#8217; cards like the GTX 750 Ti, 8400GS and HD5450, the rest of the top 10 cards on Amazon are all GTX 980s and 970s, mostly 970s. Do keep in mind that Amazon&#8217;s numbers are a rolling total rather than a weekly or monthly figure, meaning the relatively new GTX 970s are already selling like crazy.</p>
<p>This lines up with what we&#8217;ve been hearing from our retailer friends, who have all said that the GTX 970 is their best-selling graphics card and that in many cases people are buying two of them at a time because of the relatively inexpensive price of $329. Two GTX 970s in SLI are going to be a lot more powerful than a single GTX 980, at only a $100 price premium, if that. The retailers we spoke with also indicated that they had double-digit SLI sales (dual 970) and that it continues to be the best-selling card in their inventory. Some retailers even said that if they had more cards available, they&#8217;d probably sell even more.</p>
<p>In addition to retailers, we also polled a few system builders anonymously. In their case, we saw essentially the same behavior, with the GTX 970 being the most popular card that they sold and the GTX 980 coming in second. Considering the $220 price difference, that doesn&#8217;t seem far-fetched. All system builders also indicated that the release of these cards, like many graphics card launches, helped boost sales significantly, to the point where they could actually measure the difference in demand.</p>
<p>The really big factor in all of this is that Nvidia has had an incredibly strong launch and has sold more cards than we&#8217;re actually able to count. What&#8217;s more interesting is that one of the retailers raised a fairly thoughtful point: we haven&#8217;t officially broken into the holiday buying season yet, and they believe that Nvidia&#8217;s graphics card sales are going to skyrocket during the holidays. And considering <a title="GeForce GTX 980 Review: More Performance at Lower Power" href="http://www.brightsideofnews.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/" target="_blank">how dominant the GTX 980 is in benchmarks</a>, they suspect that many GTX 980 buyers are going to wait until the holiday season, when they&#8217;ve been able to save up more money or get their holiday bonuses. The GTX 980 is a more expensive card, which is partially keeping it from being as successful as the GTX 970. The GTX 970 is 30% cheaper and only about 10-20% slower than the GTX 980, and the more affordable card will always sell better than the highest-end card if it delivers good value, which the GTX 970 undeniably does.</p>
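<p>As a quick sanity check of that value argument (the $549 and $329 prices and the 10-20% slowdown range are the figures cited above; the perf-per-dollar calculation itself is our own sketch):</p>

```python
# Perf-per-dollar sketch using the figures cited in the article:
# GTX 980 at $549 (its low-end street price), GTX 970 at $329,
# with the 970 assumed 10-20% slower than the 980.
gtx_980_price = 549.0
gtx_970_price = 329.0

def value_ratio(slowdown):
    """Perf-per-dollar of the GTX 970 relative to the GTX 980 (980 = 1.0 perf)."""
    value_980 = 1.0 / gtx_980_price
    value_970 = (1.0 - slowdown) / gtx_970_price
    return value_970 / value_980

for slowdown in (0.10, 0.20):
    print(f"{slowdown:.0%} slower: {value_ratio(slowdown):.2f}x the perf per dollar")
```

<p>Even at the pessimistic end of the range, the 970 comes out roughly a third ahead on performance per dollar, which matches the conclusion above.</p>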
<p>So, we&#8217;ll be looking towards this holiday season to see exactly how well Nvidia&#8217;s GeForce GTX 980 and GTX 970 do, but if early sales figures are any indication they are going to do extremely well and have a fantastic quarter.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/10/24/nvidias-gtx-980-and-gtx-970-launch-one-month-later/">Nvidia&#039;s GTX 980 and GTX 970 Launch, One Month Later</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/10/24/nvidias-gtx-980-and-gtx-970-launch-one-month-later/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Nvidia May Launch the GeForce GTX 880 in September</title>
		<link>http://www.vrworld.com/2014/08/05/nvidia-may-launch-geforce-gtx-880-september/</link>
		<comments>http://www.vrworld.com/2014/08/05/nvidia-may-launch-geforce-gtx-880-september/#comments</comments>
		<pubDate>Tue, 05 Aug 2014 14:29:38 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX 880]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Graphics Card]]></category>
		<category><![CDATA[GTX 880]]></category>
		<category><![CDATA[IDF]]></category>
		<category><![CDATA[Maxwell]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=37176</guid>
		<description><![CDATA[<p>Nvidia’s next-generation video card launch details may have been inadvertently leaked by a Gigabyte executive. In an interview with Expreview, the executive claimed that Nvidia ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/08/05/nvidia-may-launch-geforce-gtx-880-september/">Nvidia May Launch the GeForce GTX 880 in September</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="980" height="653" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/Gamers-Day1.jpg" class="attachment-post-thumbnail wp-post-image" alt="Nvidia Gamer&#039;s Day" /></p><p class="p1"><span class="s1">Nvidia’s next-generation video card launch details may have been inadvertently leaked by a Gigabyte executive. In an interview with <a href="http://www.expreview.com/35092.html" target="_blank">Expreview</a>, the executive claimed that Nvidia will unveil the GeForce GTX 880 in September, with Gigabyte coming out with a custom card based on the reference design, dubbed the GTX 880 G1.Gamer.</span></p>
<p class="p1"><span class="s1">Details of the card are scant at this moment, but it is estimated that Gigabyte’s GTX 880 will feature a cooler that can handle loads as high as 650W and a silent cooling solution that makes little to no noise at idle. Also rumored is a 256-bit interface, along with 4GB of GDDR5 video memory. The first details about the card <a href="http://www.brightsideofnews.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/">leaked online</a> in June. The GPU that will power the card is allegedly called GM204, and will be manufactured on a 28 nm process. Difficulties in shifting to large-scale manufacturing are said to be a limiting factor in launching cards on the 20 nm process.</span></p>
<p class="p1"><span class="s1">Additional information based on Zauba database listings suggests that Nvidia is currently in the testing phase with the GTX 880. Nvidia has a facility in Hyderabad that works on integrating the software drivers for its video cards, so it is likely that reference designs of the GTX 880 have been sent to the facility for testing. </span></p>
<p class="p1"><span class="s1">More details regarding the launch of the GTX 880 will likely be divulged at Gamescom, which is being held later this week.</span></p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/08/05/nvidia-may-launch-geforce-gtx-880-september/">Nvidia May Launch the GeForce GTX 880 in September</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/08/05/nvidia-may-launch-geforce-gtx-880-september/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>New GeForce GTX 880, GTX 870 Details Leak</title>
		<link>http://www.vrworld.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/</link>
		<comments>http://www.vrworld.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/#comments</comments>
		<pubDate>Tue, 17 Jun 2014 17:04:49 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX 880]]></category>
		<category><![CDATA[GK-110]]></category>
		<category><![CDATA[GK104]]></category>
		<category><![CDATA[GM 204]]></category>
		<category><![CDATA[GM 207]]></category>
		<category><![CDATA[GM-104]]></category>
		<category><![CDATA[GM-110]]></category>
		<category><![CDATA[GM204]]></category>
		<category><![CDATA[GTX 870]]></category>
		<category><![CDATA[GTX 880]]></category>
		<category><![CDATA[Kepler]]></category>
		<category><![CDATA[Maxwell]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35985</guid>
		<description><![CDATA[<p>As always, there are going to be a plethora of rumors about the next GPUs coming from AMD and Nvidia, so it comes as no ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/">New GeForce GTX 880, GTX 870 Details Leak</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="668" height="258" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/FeatureImage-geforce-gtx-6601.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX 880" /></p><p>As always, there are going to be a plethora of rumors about the next GPUs coming from AMD and Nvidia, so it comes as no surprise that new details are leaking about Nvidia&#8217;s next generation of GPUs based on Maxwell. Nvidia has already launched the Maxwell architecture with the <a title="Nvidia’s GTX 750 Ti Introduces Maxwell and a Whole Lotta Cache" href="http://www.brightsideofnews.com/2014/02/18/nvidias-gtx-750-ti-introduces-maxwell-and-a-whole-lotta-cache/">incredibly efficient and modular GTX 750 Ti</a>. However, this remains their one and only Maxwell part, and many people are wondering when we&#8217;ll see the mid-range and high-end parts.</p>
<p>So, <a href="http://www.sweclockers.com/nyhet/18948-geforce-gtx-880-och-gtx-870-med-maxwell-anlander-till-hosten" target="_blank">today&#8217;s rumor</a> about the GeForce GTX 880 and GTX 870 is clearly feeding off of this desire for information, and it sheds some light on when and what we can expect to see from Nvidia. The most important detail is that their sources claim the GeForce GTX 880 and GTX 870 will launch in the fourth quarter of 2014, most likely in October or November. This would give Nvidia ample time to prepare for the fall/winter game release rush and to have a new and exciting product for the holiday season.</p>
<p>Also, as expected, this part will still be based upon the 28nm process from TSMC, as the countless rumors of TSMC&#8217;s 20nm delays seem to never end. While we all believe that both AMD and Nvidia would love to manufacture their latest generation of high-end GPUs on TSMC&#8217;s 20nm process, it simply does not appear to be ready for their multi-billion transistor chips. Let&#8217;s remember that Nvidia&#8217;s GK-110 is actually a 7.1 billion transistor chip, and there is a very strong likelihood that the Maxwell part we would expect to be the GM-110 will be a GTX 9 series part and not a GeForce GTX 880.</p>
<p>As we saw from Nvidia in the past, they generally launch a cut-down version of the architecture first, like the GTX 680 (GK104), and then, once the process is more mature and thermals/yields are manageable, you&#8217;ll see a GTX 780 Ti (GK110) that fully follows the &#8216;full-blown&#8217; architecture design without any compromises. A lot of sites are referring to the Maxwell batch of GPUs as GM204 and GM-210; however, there is no reason to believe that Nvidia would do this, especially since the GTX 750 Ti is a GM107 chip. If the process hasn&#8217;t changed and the architecture is the same, I don&#8217;t see any reason why the GeForce GTX 880 could not be a GM-104, with its full-blown successor being a GM110 or GM210 if they decide to go with the 20nm die shrink (very likely).</p>
<p>However, some of the rumors continually refer to the high-end GPUs as GM204, which leads some people to believe that we may see more than just higher performance out of a GeForce GTX 880 or GTX 870 part. Realistically, though, I don&#8217;t really see this being possible and those parts may be reserved for professional or HPC applications that will get released shortly after the GeForce GTX 880.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/">New GeForce GTX 880, GTX 870 Details Leak</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/06/17/new-geforce-gtx-880-gtx-870-details-leak/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Nvidia Throws Gamer&#039;s Day at Computex 2014</title>
		<link>http://www.vrworld.com/2014/06/02/nvidia-throws-gamers-day-computex-2014/</link>
		<comments>http://www.vrworld.com/2014/06/02/nvidia-throws-gamers-day-computex-2014/#comments</comments>
		<pubDate>Mon, 02 Jun 2014 14:58:22 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[2014]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[g-sync]]></category>
		<category><![CDATA[Gamer's Day]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[gsync]]></category>
		<category><![CDATA[League of Legends]]></category>
		<category><![CDATA[LOL]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Tegra Note 7]]></category>
		<category><![CDATA[Ultra HD]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35504</guid>
		<description><![CDATA[<p>Today, the day before Computex 2014 officially starts, Nvidia held their very first Gamer&#8217;s Day across the street from Taipei&#8217;s famous Taipei 101 at ATT ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/02/nvidia-throws-gamers-day-computex-2014/">Nvidia Throws Gamer&#039;s Day at Computex 2014</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="980" height="653" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/Gamers-Day1.jpg" class="attachment-post-thumbnail wp-post-image" alt="Nvidia Gamer&#039;s Day" /></p><p>Today, the day before Computex 2014 officially starts, Nvidia held their <a href="http://www.geforce.com.tw/landing-page/nvidia-gamers-day" target="_blank">very first Gamer&#8217;s Day</a> across the street from Taipei&#8217;s famous Taipei 101 at ATT 4 Fun&#8217;s 7th floor Show Box. The company had trucks driving around the area making the public and Computex attendees alike aware of the all-day festivities, which started at 11 am and ran until 7 pm.</p>
<p>While we didn&#8217;t get a chance to stay for the whole event, we did spend quite a bit of time walking around the venue and watching the festivities unfold, mostly in Mandarin. The focal point of the event was the professional League of Legends tournament being held on the main stage, with commentators and a $10,000 USD prize. There were also a series of amateur gamers playing League of Legends on separate machines. All of the gaming machines for the day, with the exception of the three 4K gaming displays, were powered by MSI, whose logo was plastered all over the event.</p>
<p>The Gamer&#8217;s Day also had a series of giveaways that would go on over the course of the day, including Nvidia-powered devices like a Gigabyte Tegra Note 7 as well as Nvidia SHIELDs. The last and ultimate giveaway of the day was one GeForce GTX Titan Z. To do the math: this card costs $3,000 USD, which comes out to about 90,000 NTD (New Taiwan Dollars), an even more daunting-sounding price. According to Nvidia&#8217;s figures, they had around 1,500 people pre-register for the all-day Gamer&#8217;s Day event. While we didn&#8217;t get any exact figures on how many people attended, I can say with absolute certainty that the numbers were well into the high hundreds as people came and went over the course of the day.</p>
<p>Nvidia was also showing off a multitude of new G-Sync monitors from their partners, including Acer&#8217;s 4K G-Sync monitor, ASUS&#8217; 2560&#215;1440 G-Sync monitor, as well as G-Sync monitors from AOC and Philips.</p>
<p>Nvidia chose to do a Gamer&#8217;s Day instead of having a booth on the show floor this year, and it isn&#8217;t going to be the last time they do it either. At E3, Nvidia has already announced similar plans to forego a booth on the show floor and to build a Gamer&#8217;s Day-like experience outside of the convention center, in order to make themselves more accessible to the public and local gamers. Nvidia is still going to have to pay the convention organizers at both events in order to do what they&#8217;re doing.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/02/nvidia-throws-gamers-day-computex-2014/">Nvidia Throws Gamer&#039;s Day at Computex 2014</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/06/02/nvidia-throws-gamers-day-computex-2014/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Launches GeForce GTX Titan Z Today, for $3,000</title>
		<link>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/</link>
		<comments>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/#comments</comments>
		<pubDate>Wed, 28 May 2014 18:41:11 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[295X2]]></category>
		<category><![CDATA[AMD Radeon R9 295X2]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GeForce GTX Titan Z Review]]></category>
		<category><![CDATA[GTX Titan]]></category>
		<category><![CDATA[GTX Titan Z]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Nvidia Titan Z]]></category>
		<category><![CDATA[R9 295X2]]></category>
		<category><![CDATA[Titan Z]]></category>
		<category><![CDATA[Titan Z Review]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35390</guid>
		<description><![CDATA[<p>So, Nvidia has finally launched their much-awaited $3,000 graphics card, which isn&#8217;t quite good enough for professional applications (no professional drivers, like Quadro) and ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/">Nvidia Launches GeForce GTX Titan Z Today, for $3,000</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="670" height="447" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/titan-z-51.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX Titan Z" /></p><p>So, Nvidia has finally launched their much-awaited $3,000 graphics card, which isn&#8217;t quite good enough for professional applications (no professional drivers, like Quadro) and is too expensive for 99.99% of gamers to ever consider as a GPU. The <a title="GTC 2014 Keynote – GTX Titan Z and Pascal Announced" href="http://www.brightsideofnews.com/2014/03/25/gtc-2014-keynote-gtx-titan-z-and-pascal-announced/">Titan Z was originally announced</a> at Nvidia&#8217;s GTC 2014 back in March, and there were rumors it was supposed to <a title="Nvidia GeForce GTX Titan Z is… Coming Soon?" href="http://www.brightsideofnews.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">drop earlier this month but got delayed</a>. Either way, the card is now available, and if you&#8217;re willing to pay twice the price of an <a title="AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card" href="http://www.brightsideofnews.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2</a>, which isn&#8217;t really much slower, then you can buy it right now from multiple online e-tailers and system builders.</p>
<p>The Titan Z is clearly Nvidia&#8217;s response to the <a title="AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card" href="http://www.brightsideofnews.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">R9 295X2 which we reviewed last month</a>, but Nvidia somehow justifies charging double the price because the card combines two of Nvidia&#8217;s fastest GTX Titan GPUs. Even so, those two GPUs ended up clocked much lower than a GTX Titan bought on its own. Yes, it does have 12 GB of GDDR5, but the reality is that you don&#8217;t need 12 GB of GDDR5 to drive a single 4K display, and a single <a href="http://www.nvidia.com/gtx-700-graphics-cards/gtx-titan-z/" target="_blank">GTX Titan Z</a> isn&#8217;t going to be enough to drive three 4K displays and actually game on them at decent FPS. So, in reality, you&#8217;ll need two GTX Titan Zs to drive three 4K displays anyway, which is going to be incredibly hot and expensive: $6,000 for the GPUs alone, when you could spend $4,000 and probably get better performance.</p>
<div id="attachment_35396" style="width: 1280px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/05/TitanZBare1.jpg" rel="lightbox-0"><img class="size-full wp-image-35396" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/TitanZBare1.jpg" alt="GTX Titan Z Bare GPU" width="1270" height="714" /></a><p class="wp-caption-text">The GTX Titan Z&#8217;s GPUs shown bare, compliments of Digital Storm</p></div>
<p>The GTX Titan Z is Nvidia&#8217;s attempt to one-up AMD, but the problem is that because it is air cooled, Nvidia was forced to dial down the card&#8217;s thermals and power consumption. AMD&#8217;s solution was to fully watercool their card &#8211; which drew some criticism &#8211; but ultimately the card runs very cool and very quiet for a dual-GPU monster. In fact, Nvidia clocked their GPUs down quite a bit: a <a href="http://www.nvidia.com/gtx-700-graphics-cards/gtx-titan-black/" target="_blank">GTX Titan Black</a>, which is comparable to the GPUs on board this card, is clocked at an 889 MHz base clock and a 980 MHz boost clock, while the GTX Titan Z&#8217;s GPUs clock in at a much slower 705 and 876 MHz base and boost clock, making it fundamentally a lot slower as well. Also, according to <a href="http://www.digitalstormonline.com/unlocked/nvidia-geforce-gtx-titan-z-review-and-4k-benchmarks-idnum280/" target="_blank">Digital Storm&#8217;s review</a> (notably, the only review on the internet at launch comes from a system builder), the Titan Z isn&#8217;t really much faster than AMD&#8217;s R9 295X2 and only barely beats it in a few games.</p>
<p>So, basically, with the GTX Titan Z you&#8217;re paying double what AMD charges for the R9 295X2 for barely better (or worse) performance and a downclocked card (while AMD actually overclocked the R9 295X2 compared to the 290X). And not just that: you&#8217;re simply better served buying GTX Titan Black cards instead, because they are fully unlocked GPUs and, thanks to their higher clocks, they consistently outperform the GTX Titan Z by a pretty wide margin.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/">Nvidia Launches GeForce GTX Titan Z Today, for $3,000</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Reports Strong 1Q 2015, Following Error</title>
		<link>http://www.vrworld.com/2014/05/08/nvidia-reports-strong-1q-2015-following-error/</link>
		<comments>http://www.vrworld.com/2014/05/08/nvidia-reports-strong-1q-2015-following-error/#comments</comments>
		<pubDate>Thu, 08 May 2014 23:56:20 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[1Q 2014]]></category>
		<category><![CDATA[1Q 2015]]></category>
		<category><![CDATA[earnings]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Nvidia Earnings]]></category>
		<category><![CDATA[Professional]]></category>
		<category><![CDATA[Q1 2014]]></category>
		<category><![CDATA[Q1 2015]]></category>
		<category><![CDATA[Quadro]]></category>
		<category><![CDATA[server]]></category>
		<category><![CDATA[Supercomputer]]></category>
		<category><![CDATA[Tegra]]></category>
		<category><![CDATA[Tesla]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35032</guid>
		<description><![CDATA[<p>As we had reported, Nvidia announced very strong preliminary earnings for the fiscal first quarter of 2015, calendar Q1 2014. They were supposed to announce ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/08/nvidia-reports-strong-1q-2015-following-error/">Nvidia Reports Strong 1Q 2015, Following Error</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="2000" height="1476" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/Nvidia-Logo1.png" class="attachment-post-thumbnail wp-post-image" alt="Nvidia GPU Logo" /></p><p>As we <a href="http://www.brightsideofnews.com/2014/05/07/nvidia-mistakenly-announces-quarterly-earnings-full-report-tomorrow/" target="_blank">had reported</a>, Nvidia announced very strong preliminary earnings for the fiscal first quarter of 2015, calendar Q1 2014. They were supposed to announce their earnings today, May 8th; however, someone mistakenly sent the preliminary earnings announcement to 100 internal users, and the company decided to make those figures public to avoid any potential insider trading issues.</p>
<p>In terms of Nvidia’s earnings [<a href="https://www.google.com/finance?q=NASDAQ:NVDA" target="_blank">NASDAQ:NVDA</a>] themselves, <a href="http://nvidianews.nvidia.com/News/NVIDIA-Financial-Results-for-First-Quarter-Fiscal-2015-b1c.aspx" target="_blank">the company reported for their fiscal first quarter of 2015</a>, which is actually the first calendar quarter of 2014, earnings of $136 million on $1.1 billion in revenue. Both revenue and profit are down 4% sequentially from the fourth quarter, which is traditionally Nvidia&#8217;s strongest. Taking only a 4% sequential reduction from your strongest quarter is actually quite good; most companies see a much larger decrease because of how big their third and fourth quarters are for their business. It comes as no surprise, then, that compared to the same quarter a year ago, Nvidia was up 16% in revenue ($954 million last year vs. $1,102 million this year) and 85% in profitability, which is HUGE. Nvidia took in $77.9 million in profit in their first quarter last year, compared to $136 million this year, which accounts for the 85% GAAP increase. If you look at their non-GAAP EPS instead, you’ll see a slightly more moderate 61% increase, but even so, it indicates Nvidia is as strong as ever, even with some weaker business units.</p>
<p>GAAP EPS was $0.24, up 85% from $0.13 a year ago and down 4% from $0.25 in the previous quarter; non-GAAP EPS was $0.29, up 61% from $0.18 and down 9% from $0.32 in the previous quarter. Nvidia also saw their gross margin grow to 54.8%, up from 54.1% in the previous quarter and 54.3% in the same quarter a year ago.</p>
<div id="attachment_35033" style="width: 990px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/05/NvidiaRevenue1.jpg" rel="lightbox-0"><img class="wp-image-35033" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/NvidiaRevenue1.jpg" alt="NvidiaRevenue" width="980" height="266" /></a><p class="wp-caption-text">Nvidia Quarterly Revenue by Business Unit</p></div>
<p>As you can see above, Nvidia&#8217;s revenue in their GPU business was actually down quarter over quarter, reflecting the seasonality of GPU sales, but still up 14% year over year. This is primarily because of Nvidia&#8217;s renewed strength in their mobile GPU business in laptops, where they saw their strongest market share since 2010. They also saw some renewed strength in their Tegra processor division, which grew thanks to automotive design wins that are now shipping inside Audi&#8217;s vehicles. This is combined with Nvidia&#8217;s continued strength in their Quadro and Tesla business, which has buoyed the company&#8217;s profitability for the past few years and appears to continue to do so, though to a lesser degree this quarter. Overall revenue was up 16% year over year, meaning that Nvidia continues to build strength, especially with their high-end products, where they saw growth as high as 50%. And even though notebook sales and volumes went down, Nvidia still saw growth in their side of the business: selling the dedicated GPU.</p>
<p>The good thing for Nvidia is that they&#8217;ve diversified the placement of their Tegra business across smartphones, tablets and automotive. Compared to their competitors, who are more mobile-heavy, Nvidia is seeing much better growth in their mobile business because of their exposure to automotive in addition to tablets and smartphones. Realistically, though, Nvidia still needs a lot of help with their smartphone design wins when compared to their competitors. Perhaps with Tegra K1 they will get much better pull than they did with the Tegra 4 SoC.</p>
<p>Nvidia&#8217;s outlook for the next quarter is effectively flat, in terms of both revenue and operating expenses. As a result of today&#8217;s announcements and the company&#8217;s outlook, shares of NVDA are actually trading down 3%, which is disappointing considering that their outlook isn&#8217;t bad and they weren&#8217;t doing poorly.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/08/nvidia-reports-strong-1q-2015-following-error/">Nvidia Reports Strong 1Q 2015, Following Error</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/05/08/nvidia-reports-strong-1q-2015-following-error/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</title>
		<link>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/</link>
		<comments>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/#comments</comments>
		<pubDate>Wed, 30 Apr 2014 17:03:26 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GK-110]]></category>
		<category><![CDATA[GTX Titan Z]]></category>
		<category><![CDATA[GTXTITANZ-12GD5]]></category>
		<category><![CDATA[Kepler]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Nvidia GeForce]]></category>
		<category><![CDATA[Titan]]></category>
		<category><![CDATA[Titan Z]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=34811</guid>
		<description><![CDATA[<p>We were a bit surprised to see an announcement on Techpowerup! that ASUS had launched their GTX Titan Z without any hoopla from Nvidia or ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="805" height="465" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_011.jpg" class="attachment-post-thumbnail wp-post-image" alt="ASUS_GTXTITANZ-12GD5_01" /></p><p>We were a bit surprised to see <a href="http://www.techpowerup.com/200339/asus-announces-the-geforce-gtx-titan-z-dual-gpu-graphics-card.html" target="_blank">an announcement on Techpowerup!</a> that ASUS had launched their GTX Titan Z without any hoopla from Nvidia or any of their other board partners. So, it comes as little surprise that the article itself has since been pulled and that any and all mentions of ASUS&#8217; GTX Titan Z have disappeared. But of course, the damage has already been done and Pandora&#8217;s box has been opened. However, there isn&#8217;t that much about this card that is really a mystery.</p>
<p>The GTX Titan Z is expected to be a dual-Titan graphics card, air cooled, that delivers some of the best double-precision compute performance on earth while still providing an incredible gaming experience, all from a single graphics card. This is possible because of the dual GPUs and 12 GB of RAM, which in theory would make the Titan Z possibly a better single-card solution for driving three 4K monitors simultaneously. Obviously, based on our findings with the AMD R9 295X2, the likelihood of gaming across all three of those 4K monitors is out of the question, but driving a single 4K monitor is very likely possible, if not encouraged. What&#8217;s more interesting is that Nvidia told us the price of the GTX Titan Z before we actually knew anything else about the card: at $3,000 a pop, it costs as much as two of AMD&#8217;s latest high-end GPU, the R9 295X2.</p>
<div id="attachment_34815" style="width: 845px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_021.jpg" rel="lightbox-0"><img class="size-full wp-image-34815" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_021.jpg" alt="ASUS GTX Titan Z" width="835" height="705" /></a><p class="wp-caption-text">ASUS GTX Titan Z</p></div>
<p>Based on what we saw from Techpowerup, the ASUS card will have its own real-time graphics tuning utility called GPU Tweak to allow on-the-fly adjustments, even though I don&#8217;t know many people who need to make on-the-fly adjustments to a $3,000 card. Remember, this isn&#8217;t quite a professional card, but it also isn&#8217;t quite a gaming card, since at $3,000 it is out of the reach of about 99.99% of gamers. Each GPU will run at a base clock of 705 MHz and a boost clock of 876 MHz, meaning that this card is significantly slower per GPU than its single-GPU counterparts, which isn&#8217;t necessarily unexpected considering the TDP of each GPU. With both GPUs and memory combined, this card will have a total of 12 GB of GDDR5 memory and 5760 CUDA cores, which is exactly what this card is designed to deliver: lots of memory and lots of cores.</p>
<p>According to the Techpowerup! article, this GPU was supposed to be available worldwide on April 29th, yesterday. However, we&#8217;re hearing that there is a slight delay with this card and that the date has been pushed back to May 8th, just over a week away. That may explain why ASUS had Techpowerup pull the article and why there is no news about it; Techpowerup may have just not gotten the memo.</p>
<p style="text-align: center;"><b>SPECIFICATIONS: </b><b style="text-align: center;">GTXTITANZ-12GD5</b></p>
<p style="text-align: center;"><span style="text-align: center;">Graphics Engine: NVIDIA GeForce GTX TITAN Z</span><br />
<span style="text-align: center;">Bus Standard: PCI Express 3.0</span><br />
<span style="text-align: center;">OpenGL: OpenGL 4.4</span><br />
<span style="text-align: center;">Video Memory: 12 GB GDDR5</span><br />
<span style="text-align: center;">GPU Boost Clock: 876 MHz</span><br />
<span style="text-align: center;">GPU Base Clock: 705 MHz</span><br />
<span style="text-align: center;">CUDA Cores: 5760</span><br />
<span style="text-align: center;">Memory Clock: 7000 MHz</span><br />
<span style="text-align: center;">Memory Interface: 768 bit</span><br />
<span style="text-align: center;">Output: 1 x Native DVI-I, 1 x Native DVI-D,1 x Native HDMI, 1 x Native DisplayPort 1.2</span></p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>BSN 5th Anniversary Nvidia Giveaway Winners Selected!</title>
		<link>http://www.vrworld.com/2014/04/09/bsn-5th-anniversary-nvidia-giveaway-winners-selected/</link>
		<comments>http://www.vrworld.com/2014/04/09/bsn-5th-anniversary-nvidia-giveaway-winners-selected/#comments</comments>
		<pubDate>Wed, 09 Apr 2014 16:09:47 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Bright Side of News]]></category>
		<category><![CDATA[brightsideofnews.com]]></category>
		<category><![CDATA[contest]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Giveaway]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[GTX Titan]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[SHIELD]]></category>
		<category><![CDATA[Tegra]]></category>
		<category><![CDATA[Tegra Note 7]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=34327</guid>
		<description><![CDATA[<p>As promised, we&#8217;ve selected the winners for our Nvidia 5th Anniversary Giveaway. This giveaway was the second massive giveaway of our multi-week-long giveaway to commemorate ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/bsn-5th-anniversary-nvidia-giveaway-winners-selected/">BSN 5th Anniversary Nvidia Giveaway Winners Selected!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1920" height="1094" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/GTXTitan_1920_11.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX Titan" /></p><p>As promised, we&#8217;ve selected the winners for our <a href="http://www.brightsideofnews.com/news/2014/3/31/bsn--and-nvidia-partner-up-for-massive-5th-anniversary-giveaway.aspx">Nvidia 5th Anniversary Giveaway</a>. This was the second massive giveaway in our multi-week-long series commemorating our 5th anniversary as a site. It followed our <a href="http://www.brightsideofnews.com/news/2014/3/25/bsn--and-thermaltake---5-year-anniversary-giveaway.aspx#">Thermaltake</a> and <a href="http://www.brightsideofnews.com/2014/03/28/bsn-and-ea-partner-for-titanfall-giveaway/" target="_blank">Titanfall</a> giveaways. We had over 876 entries, and we&#8217;ve selected the winners based on their comments, tweets, FB shares and Google+ +1&#8217;s. Some contestants did not follow the rules and attempted to leave multiple comments under the same user name, or under different user names using different IP addresses. So, please, do not try to cheat the system; you only end up cheating yourself out of a prize. In this case, one of you lost the chance to win an Nvidia SHIELD, so next time follow the rules and leave a comment, not comments. We&#8217;re happy to run these giveaways for our readers, but we also want to ensure that the winners won their prizes fair and square.</p>
<p style="text-align: center;"><strong>The winners of the contest are as follows:</strong></p>
<p style="text-align: center;">GTX Titan:<br />
Jake P</p>
<p style="text-align: center;">Nvidia SHIELD:<br />
Jason Li</p>
<p style="text-align: center;">Nvidia Tegra Note 7 with Smartcover:<br />
David (no last name, from the UK)<br />
Mack Brown</p>
<p>Congratulations to the winners, and we hope you remember to claim your prizes within 24 hours via our giveaways at brightsideofnews dot com email address. Thank you all for participating, and don&#8217;t forget that we&#8217;re going to have a TON more giveaways in the coming weeks &#8211; we&#8217;re only about halfway through our pile of hardware to give away&#8230;</p>
<p>Once again, thank you all for participating and good luck in our other giveaways!</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/bsn-5th-anniversary-nvidia-giveaway-winners-selected/">BSN 5th Anniversary Nvidia Giveaway Winners Selected!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/09/bsn-5th-anniversary-nvidia-giveaway-winners-selected/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>UPDATE: Nvidia owes millions of $ in GTX295 backlog</title>
		<link>http://www.vrworld.com/2009/03/14/nvidia-owes-millions-of-in-gtx295-backlog/</link>
		<comments>http://www.vrworld.com/2009/03/14/nvidia-owes-millions-of-in-gtx295-backlog/#comments</comments>
		<pubDate>Sat, 14 Mar 2009 00:41:56 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[ati vs nvidia 2009]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[gtx 295]]></category>
		<category><![CDATA[gtx 295 shortage]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia shortage]]></category>
		<category><![CDATA[nvidia vs ati 2009]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=1176</guid>
		<description><![CDATA[<p>It seems to us that Nvidia did a neat PR stunt called GeForce GTX 295. This card is maybe the fastest on the market, but ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/03/14/nvidia-owes-millions-of-in-gtx295-backlog/">UPDATE: Nvidia owes millions of $ in GTX295 backlog</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>It seems to us that Nvidia pulled off a neat PR stunt called the GeForce GTX 295. This card may be the fastest on the market, but the company is not making great progress in actually getting those cards to partners. We spoke to several sources at different vendors across the globe, and one thing was consistent: for the past couple of weeks, Nvidia has not delivered GTX295 cards, and the backlog of already-purchased cards is now measured at well over a million greenbacks.</p>
<div id="attachment_1180" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-1180" title="gtx_295_right" src="http://cdn.vrworld.com/wp-content/uploads/2009/03/gtx_295_right.jpg" alt="GTX295 naked. Picture courtesy of TechPowerUp!" width="500" height="384" /><p class="wp-caption-text">GTX295 naked. Picture courtesy of TechPowerUp!</p></div>
<p>Yep, you&#8217;ve read it correctly: Nvidia&#8217;s partners sold thousands of GTX 295 boards, and at a price of 520-550 bucks (or euros) per card, we&#8217;re talking about millions of USD/EUR. One can only wonder what is going on in Graphzilla&#8217;s head… there is an alleged recession going on, their quarterly results dropped by 50% to less than $500M a quarter, and they are failing to deliver already-sold boards &#8211; backorders.</p>
<p>We wonder what kind of excuse nV will come up with this time if the company misses its own sales estimates. They succeeded in selling thousands and thousands of GTX295 boards, but their backlog is staggering.</p>
<p><strong>UPDATE &#8211; March 14, 2009 12:51 CET</strong> &#8211; I had uploaded the wrong picture&#8230; the previous picture was of the 9800GX2. Apologies for any confusion; this is the right image of the naked GTX295. Thanks to all the readers who noticed that something was &#8220;off&#8221; with the story&#8230;</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/03/14/nvidia-owes-millions-of-in-gtx295-backlog/">UPDATE: Nvidia owes millions of $ in GTX295 backlog</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/03/14/nvidia-owes-millions-of-in-gtx295-backlog/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Legal and PR trouble looming for mobile GTX260/280?</title>
		<link>http://www.vrworld.com/2009/02/26/legal-and-pr-trouble-looming-for-mobile-gtx260280/</link>
		<comments>http://www.vrworld.com/2009/02/26/legal-and-pr-trouble-looming-for-mobile-gtx260280/#comments</comments>
		<pubDate>Thu, 26 Feb 2009 21:30:43 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Mobile Computing]]></category>
		<category><![CDATA[consumer protection]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[geforce rebranding]]></category>
		<category><![CDATA[gtx 260]]></category>
		<category><![CDATA[gtx 260M]]></category>
		<category><![CDATA[gtx 280]]></category>
		<category><![CDATA[gtx 280M]]></category>
		<category><![CDATA[gtx260m]]></category>
		<category><![CDATA[gtx280m]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia cheats]]></category>
		<category><![CDATA[nvidia misleads]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=1127</guid>
		<description><![CDATA[<p>After going through dozen or so phone calls and IM conversations with several worried investors, analysts and attorneys, I felt inclined to write this story. ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/02/26/legal-and-pr-trouble-looming-for-mobile-gtx260280/">Legal and PR trouble looming for mobile GTX260/280?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>After going through a dozen or so phone calls and IM conversations with several worried investors, analysts and attorneys, I felt inclined to write this story. One has to wonder what the branding wizards at Nvidia were thinking when they decided to brand a three-year-old architecture under the same name as current desktop parts. In case you didn&#8217;t know, Nvidia is going to use the same 55nm G92b chip on GeForce GTS240/250 boards for desktops and GTX260M and GTX280M boards for notebooks. This is nothing else but a disastrous call.</p>
<p>The G92 chip now spans four product generations &#8211; GeForce 8800GT/GTS512/GSO, 9800GT/GTX/GTX+, GTS240/250 and GTX260M/280M. The decision Nvidia has made could constitute &#8220;deliberate misleading naming for confusing of consumers&#8221; and exposes the company not just to consumer (private class-action/consumer watchdog) action, but also to legal wrath from consumer protection agencies in the US and in several countries in the EMEA region &#8211; Germany, Sweden and the UK being prime examples.</p>
<p>If Nvidia labeled their mobile parts GTS240/250, nobody would notice &#8211; it would be a simple case of rebranding (in all honesty, ATI isn&#8217;t a saint there either, branding some Mobility Radeon 2000 parts as the 3000 series and then rebranding some 3000 chips into the 4000 series). But in this case, the company is deliberately naming what are essentially GTS240/250 parts after already-shipping desktop parts &#8211; the GTX260 and 280, both based on the GT200 chip.</p>
<div id="attachment_1128" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-1128" title="real_gtx280" src="http://cdn.vrworld.com/wp-content/uploads/2009/02/real_gtx280.jpg" alt="Will the real GeForce GTX 280 please stand up?" width="500" height="210" /><p class="wp-caption-text">Will the real GeForce GTX 280 please stand up?</p></div>
<p>The list of differences doesn&#8217;t stop there &#8211; we have a GTX280 consisting of 128 shader processors (GTX280M) or 240 (the original GTX280), different memory controllers (256-bit vs. 512-bit), and the list goes on.</p>
<p>The potential legal issue lies in the fact that if a consumer types something like &#8220;GeForce GTX 280 notebook&#8221; into an &#8220;accepted way of finding information relevant to the consumer&#8221; such as a search engine, that consumer will find information and glowing reviews about the desktop parts and potentially form a wrong opinion/expectation about the product itself. This is a slippery slope that no IT company has slipped on yet, but as computers become a commodity, you can expect more and more watchdogs looking into the land of (rebranded) silicon.</p>
<p>As even the birds in the trees know, the GT200 chip is in a whole other league compared to the G92 architecture, and this is nothing else but deliberate misleading of consumers. If the G92b really does ship as the GTX260M/280M, expect a PR disaster and potential legal issues.</p>
<p>Sorry Nvidia, but according to conversations I have had over the past 48 hours, it looks like the Intel chipset license war won&#8217;t be the only court meeting. This branding exercise could lead to a class-action lawsuit on behalf of consumers and, worst of all, an official inquiry is possible in four states (that I know of so far). The biggest damage would, of course, be a stock downgrade from worried investors.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/02/26/legal-and-pr-trouble-looming-for-mobile-gtx260280/">Legal and PR trouble looming for mobile GTX260/280?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/02/26/legal-and-pr-trouble-looming-for-mobile-gtx260280/feed/</wfw:commentRss>
		<slash:comments>10</slash:comments>
		</item>
		<item>
		<title>Germans test latest Matrox hardware</title>
		<link>http://www.vrworld.com/2009/01/27/germans-test-latest-matrox-hardware/</link>
		<comments>http://www.vrworld.com/2009/01/27/germans-test-latest-matrox-hardware/#comments</comments>
		<pubDate>Tue, 27 Jan 2009 16:35:22 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[ATI]]></category>
		<category><![CDATA[Chrome]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[m9125]]></category>
		<category><![CDATA[Matrox]]></category>
		<category><![CDATA[matrox mistake]]></category>
		<category><![CDATA[millenium]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[S3]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=979</guid>
		<description><![CDATA[<p>Point a finger at your brain and ask yourself, when was the last time you heard about a review of a Matrox graphics card? Yep, ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/27/germans-test-latest-matrox-hardware/">Germans test latest Matrox hardware</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Point a finger at your brain and ask yourself, when was the last time you heard about a review of a Matrox graphics card? Yep, my thoughts exactly… see, telepathy works. <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_smile.gif" alt=":-)" class="wp-smiley" /></p>
<p>Courtesy of the legendary German site 3DCenter.Org, we can see what Matrox is actually… manufacturing these days. Fellow journalists spent some time with the Matrox M9125 graphics card and compared it to graphics cards from ATI, Nvidia, Intel and S3. A five-vendor GPU test… when was the last time you saw something like that? Agreed &#8211; a trip down memory lane…</p>
<div id="attachment_980" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-980" title="matrox_gpu" src="http://cdn.vrworld.com/wp-content/uploads/2009/01/matrox_gpu.jpg" alt="Note the &quot;Golden Finger&quot; on the upper left side of the PCB. No, this is not for a Multi-GPU mode, but for connecting to specialized peripheral cards from Matrox." width="500" height="361" /><p class="wp-caption-text">Note the &quot;Golden Finger&quot; on the upper left side of the PCB. No, this is not for a Multi-GPU mode, but for connecting to specialized peripheral cards from Matrox.</p></div>
<p>Anyways, the boards this card was tested against were the Radeon 3450, X4500, GeForce 7300GT and 8400GS and, finally, the Chrome 430GT. If you want to see how Matrox scored against these &#8220;monsters&#8221;, head over to <a href="http://www.3dcenter.org/artikel/matrox-m9125-benchmarks" target="_blank">3DCenter and check it out</a>.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/27/germans-test-latest-matrox-hardware/">Germans test latest Matrox hardware</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/01/27/germans-test-latest-matrox-hardware/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>STEAM: Intel, ATI gain users, AMD+Nvidia continue the negative trend</title>
		<link>http://www.vrworld.com/2009/01/15/steam-intel-ati-gain-users-amdnvidia-continue-the-negative-trend/</link>
		<comments>http://www.vrworld.com/2009/01/15/steam-intel-ati-gain-users-amdnvidia-continue-the-negative-trend/#comments</comments>
		<pubDate>Thu, 15 Jan 2009 15:50:34 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[amd/ati]]></category>
		<category><![CDATA[Athlon]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[multi-gpu]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Phenom]]></category>
		<category><![CDATA[quad core]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Steam]]></category>
		<category><![CDATA[steam hardware survey]]></category>
		<category><![CDATA[Valve]]></category>
		<category><![CDATA[widescreen]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=940</guid>
		<description><![CDATA[<p>Like clockwork, Steam released its hardware survey for December 2008. A lot of interesting gains, with the biggest winners being Intel processors, ATI graphics cards and the Windows Vista operating system.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/15/steam-intel-ati-gain-users-amdnvidia-continue-the-negative-trend/">STEAM: Intel, ATI gain users, AMD+Nvidia continue the negative trend</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Following its tradition, Valve released December stats for its Steam Hardware Survey. Without any doubt, this is the most comprehensive &#8220;feel&#8221; for what gamers use, given the fact that Steam is used by over 15 million people.</p>
<div id="attachment_941" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-941" title="steamsurvey122008" src="http://cdn.vrworld.com/wp-content/uploads/2009/01/steamsurvey122008.jpg" alt="Steam Hardware Survey, December 2008" width="500" height="398" /><p class="wp-caption-text">Steam Hardware Survey, December 2008</p></div>
<p><a href="http://theovalich.wordpress.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/" target="_blank">Following our article with November 2008 details</a>, here are the changes:</p>
<ul>
<li>AMD lost 1.04% of CPU share in December alone &#8211; Intel gained that share</li>
<li>Nvidia lost 0.47% of its GPU share, while AMD/ATI gained 0.45%</li>
<li>Out of those 0.47%, 0.25% bought a graphics card with 1GB of video memory</li>
<li>Quad-Core processors are now used by 10.76%, up 0.64% from Nov &#8217;08</li>
<li>50.81% of all users have at least 2 cores to play with</li>
<li>DirectX 10 API is now used by 22.88% of all users, up 1.45% from Nov &#8217;08</li>
<li>41.5% of users run a widescreen resolution, up 1.47% from Nov &#8217;08</li>
<li>Multi-GPU configurations are still used by less than 2% of all users, at 1.84% (+0.06%). This means that just 300,000 Steam gamers have more than one GPU in their computer.</li>
<li>Revenue-wise, these 300,000 gamers spent 200+ million USD on their GPU hardware alone &#8211; equal revenue to 1.2 million mainstream users (if each spent $199 on a card)</li>
</ul>
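<p>A quick sketch of the arithmetic behind those last two bullets &#8211; a rough back-of-the-envelope calculation using the survey&#8217;s rounded figures (the $199 mainstream card price is the article&#8217;s own assumption):</p>

```python
# Back-of-the-envelope math for the multi-GPU bullets above.
# Inputs are the article's rounded figures, not exact survey data.

steam_users = 15_500_000        # total Steam users cited in the article
multi_gpu_share = 0.0184        # 1.84% run more than one GPU

multi_gpu_users = steam_users * multi_gpu_share
print(f"Multi-GPU users: ~{multi_gpu_users:,.0f}")  # ~285,200 - roughly 300,000

# Taking the "200+ million USD" at face value, against a $199 mainstream card:
multi_gpu_revenue = 200e6
mainstream_card_price = 199
equivalent_mainstream_users = multi_gpu_revenue / mainstream_card_price
print(f"Equivalent mainstream buyers: ~{equivalent_mainstream_users:,.0f}")
# ~1,005,000 - the article's 1.2 million figure implies closer to $240M spent
```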
<p>We can conclude that roughly 1% of all users (150,000) bought new systems, mostly equipped with quad-core processors and ATI Radeon graphics cards. Given that the cut-off for the SHS is the 10th of the month, only the Steam Hardware Survey for January 2009 will show the full Christmas sales results. Then we can see what happened with the base of Steam users.</p>
<p>This does not cover the whole world, so there is a lot of room for free interpretation, but bear in mind that this sample includes more than 15.5 million users worldwide. And Steam itself is gaining users, so it will be interesting to see how it all evens out.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/15/steam-intel-ati-gain-users-amdnvidia-continue-the-negative-trend/">STEAM: Intel, ATI gain users, AMD+Nvidia continue the negative trend</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/01/15/steam-intel-ati-gain-users-amdnvidia-continue-the-negative-trend/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia&#8217;s 3D Glasses to cost $199? That&#8217;s hellishly expensive!</title>
		<link>http://www.vrworld.com/2009/01/05/nvidias-3d-glasses-to-cost-199-thats-hellishly-expensive/</link>
		<comments>http://www.vrworld.com/2009/01/05/nvidias-3d-glasses-to-cost-199-thats-hellishly-expensive/#comments</comments>
		<pubDate>Mon, 05 Jan 2009 13:20:18 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[3D]]></category>
		<category><![CDATA[3D glasses]]></category>
		<category><![CDATA[3d googles]]></category>
		<category><![CDATA[3D Vision]]></category>
		<category><![CDATA[CES]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Las Vegas]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Skype]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=899</guid>
		<description><![CDATA[<p>Nvidia 3D glasses to cost $199... incredibly expensive.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/05/nvidias-3d-glasses-to-cost-199-thats-hellishly-expensive/">Nvidia&#8217;s 3D Glasses to cost $199? That&#8217;s hellishly expensive!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>As the first day of CES approaches, more and more journalists are adorning their Skype moods and Facebook statuses with &#8220;NDA end is drawing near&#8221;, mostly referring to the launch of the AMD Phenom II, Nvidia 3D Vision, and GeForce GTX 285 and 295. The same applies to many other products that will launch at CES.</p>
<p>Personally, I am interested in 3D Vision the most. Then again, you already know I am a bit quirky when it comes to IT, since I also game a lot using the OCZ NIA, steering wheels and so on. For as long as I can remember, 3D was a big thing for me. I remember buying comic books that came with glasses and becoming hooked on seeing Phantom, Mister No, Transformers, Superman and all, &#8220;hovering&#8221; above the pages. Almost a decade later, I watched Freddy&#8217;s Dead using the same red/blue paper glasses. Anaglyph technology was far from perfect, could cause hellish headaches and, overall, was not a good technology. But it was cheap with a capital C. Watching movies such as Superman Returns in IMAX 3D (all too short at 20 minutes) only begged for decent 3D technology. Dolby 3D provides just that, but the technology is not usable in the world of computers. A big <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_sad.gif" alt=":-(" class="wp-smiley" /> from me.</p>
<p>Nvidia&#8217;s 3D Vision promised all that, but using active shutter technology meant only one thing &#8211; it would be expensive to make. <a href="http://www.theinquirer.net/inquirer/news/192/1050192/nvidia-picks-the-wrong-3d-glasses-technology" target="_blank">My ex-colleague from The INQ already gave his take on how expensive this technology will be</a>, but he didn&#8217;t mention the price, and wrote some inaccuracies (one emitter can play host to 1000 glasses if they&#8217;re in range – nV demonstrated the 3D tech to press using a single emitter for the 50 of us in Munich). I managed to get the price from some sources, and all I can say is that I really hope it won&#8217;t be this high. The price I am hearing is $199 for one box. If this really is the future price in e-tail and retail, there is not much to be said besides &#8211; you screwed up. Badly.</p>
<p>I would understand if Nvidia had teamed up with Oakley and brought top-notch looks, offered customization (for prescription lenses), or something like that, but a cheap plastic-looking part (they actually use an expensive material to ensure durability, but it does look… cheap) costing more than $99 is just something that will not work. I am waiting to see the final product before passing judgment on it, but unless the part comes in at an acceptable price point, I am sad to say that 3D will probably stay away from the mainstream audience. Even the OCZ NIA is now in the sub-$100 bracket, and yes, I would recommend that you go out and buy that part. Playing around with Nvidia&#8217;s 3D glasses was a blast at both Nvision and the GeForce Plus event in Munich, but the key part is availability to the market. Pricing it out of range is not something that will speed up the adoption of the technology…</p>
<p>Thus, maybe 3D Vision will be a cool product in 2010, when the company is forced to cut the price down to $99. Until then, this is an expensive gimmick. A gimmick that really works and offers a great experience (in my opinion), but a gimmick nevertheless.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/05/nvidias-3d-glasses-to-cost-199-thats-hellishly-expensive/">Nvidia&#8217;s 3D Glasses to cost $199? That&#8217;s hellishly expensive!</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/01/05/nvidias-3d-glasses-to-cost-199-thats-hellishly-expensive/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>GeForce GTX285 on sale, our specs confirmed</title>
		<link>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/</link>
		<comments>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/#comments</comments>
		<pubDate>Fri, 02 Jan 2009 11:40:54 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[fx5800]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Gigabyte]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt206 specs]]></category>
		<category><![CDATA[gtx285]]></category>
		<category><![CDATA[Hong Kong]]></category>
		<category><![CDATA[Quadro CX]]></category>
		<category><![CDATA[quadro fx4800]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=891</guid>
		<description><![CDATA[<p>For the past couple of weeks, I&#8217;ve been closely following what&#8217;s going on with the 55nm refresh from Nvidia. GT200b (GT200-100-B2) series chips began their ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/">GeForce GTX285 on sale, our specs confirmed</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>For the past couple of weeks, I&#8217;ve been closely following what&#8217;s going on with the 55nm refresh from Nvidia. GT200b (GT200-100-B2) series chips began their life in Quadro CX and FX4800/5800 cards, and then started selling as the GeForce GTX260 55nm.</p>
<p>On January 8, 2009, Nvidia will officially introduce the GeForce GTX285 1GB and GTX295 1.8 GB cards. Or that was the theory. As usually happens, manufacturers &#8220;accidentally&#8221; started to sell early, and this time the &#8220;honor&#8221; of going on sale first goes to GigaByte.</p>
<p>Thanks to HKEPC, we learned that <a href="http://www.hkepc.com/2178" target="_blank">two Hong Kong shops are selling the GTX285 by Gigabyte</a>. This means GigaByte will be remembered as the first company to offer the GTX285 for sale (first blood for the GTX260 55nm went to EVGA). Prices range between 410-440 USD, but you can expect them to drop further &#8211; these boards sell with at least $30-50 of per-store margin for being first (as usual).</p>
<p>GPU-wise, specifications are identical to the Quadro FX 5800 &#8211; the GPU is clocked at 648 MHz, while the shaders work at 1.48 GHz. GDDR3 memory is clocked at 1.24 GHz, meaning you have 158,976 MB/s, or 155.25 GB/s, to play with. Power consumption is set at 183W, and this was the reason for going with 6+6-pin PEG connectors instead of the usual 8+6 configuration.<br />
While this may be good news for owners of older PSUs without an 8-pin PEG connector, overclockers will turn their heads to enthusiast manufacturers such as BFG, EVGA, PALIT and others for the 8+6 versions of the card. Two 6-pin connectors plus the PCIe slot can only provide 225W of power, meaning you have 42W left for overclocking.</p>
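<p>That 158,976 MB/s figure follows directly from the clock and bus width; here is a minimal sketch (the 1,242 MHz base clock is inferred from the rounded 1.24 GHz above):</p>

```python
# Memory bandwidth math for a 512-bit GDDR3 interface.

mem_clock_mhz = 1242      # base memory clock (rounded to 1.24 GHz in the text)
bus_width_bits = 512      # GTX285 memory bus

transfers_per_sec_m = mem_clock_mhz * 2                    # DDR: two transfers per clock
bandwidth_mb_s = transfers_per_sec_m * bus_width_bits // 8 # bits -> bytes
print(bandwidth_mb_s)          # 158976 MB/s
print(bandwidth_mb_s / 1024)   # 155.25 GB/s, using the binary prefix
```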
<p>In the days of the original GTX280, TDP was set at 236W, and the 8+6+PCIe slot configuration could provide 300W of juice &#8211; a 64W margin. Still, I may be wrong on this one, since Shamino recently broke the 3DMark world record using a single GeForce GTX 285 card with a 1.1 GHz core and 2 GHz shader clock (you think Peter did that with a 65nm GPU? Think again <img src="http://cdn.vrworld.com/wp-includes/images/smilies/icon_wink.gif" alt=";-)" class="wp-smiley" /> )</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/">GeForce GTX285 on sale, our specs confirmed</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2009/01/02/geforce-gtx285-on-sale-our-specs-confirmed/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Leaked GTX295 scores are genuine</title>
		<link>http://www.vrworld.com/2008/12/16/leaked-gtx295-scores-are-genuine/</link>
		<comments>http://www.vrworld.com/2008/12/16/leaked-gtx295-scores-are-genuine/#comments</comments>
		<pubDate>Tue, 16 Dec 2008 22:57:17 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[$499 card]]></category>
		<category><![CDATA[4870X2]]></category>
		<category><![CDATA[ati 2009]]></category>
		<category><![CDATA[Dual GPU]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[graphics cards 2009]]></category>
		<category><![CDATA[gt200 gx2]]></category>
		<category><![CDATA[gt206]]></category>
		<category><![CDATA[gt212]]></category>
		<category><![CDATA[gtx295]]></category>
		<category><![CDATA[multi-gpu]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia 2009]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[rv770]]></category>
		<category><![CDATA[Santa Clara]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=823</guid>
		<description><![CDATA[<p>A Far Eastern site leaked the first performance results of Nvidia's answer to the awesome 4870X2. The name is GTX295, and it is based upon two 55nm GT206 chips and an odd 1.79 GB of video memory.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/16/leaked-gtx295-scores-are-genuine/">Leaked GTX295 scores are genuine</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>IT168.com is a site located in the Far East, and these guys are known for snatching exclusives from the factory floors. In the case of everybody&#8217;s favorite green parts, the guys from vga.it168.com managed to get their hands on the upcoming GeForce GTX 295 card and ran some preliminary benchmarks. Sadly, IT168.com retracted the story, but it was too late &#8211; the Internet caught up with the pictures, which I am bringing here for your viewing pleasure.</p>
<p>From what we can see, this card is an interesting combo of the GTX260 and 280. For starters, this is nothing else but two GTX260 boards, aside from the shader count.</p>
<ul>
<li>Clock speeds? GTX260 x2.</li>
<li>Number of ROP units? GTX260 x2.</li>
<li>Number of Texture units? GTX260 x2.</li>
<li>Amount of memory? GTX260 x2.</li>
<li>Number of shaders? Well&#8230; GTX280. x2.</li>
</ul>
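<p>As a side note on that memory line &#8211; my best guess at where the odd 1.79 GB total comes from, assuming the same 448-bit bus and 64 MB chips as on the desktop GTX260:</p>

```python
# Likely origin of the odd 1.79 GB total (assumption: GTX260-style memory layout).

bus_width_bits = 448
chip_width_bits = 32
mb_per_chip = 64

chips_per_gpu = bus_width_bits // chip_width_bits  # 14 memory chips per GPU
per_gpu_mb = chips_per_gpu * mb_per_chip           # 896 MB per GPU
total_mb = per_gpu_mb * 2                          # two GPUs on one board
print(total_mb)  # 1792 MB - quoted as "1.79 GB" when read as 1792/1000
```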

<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_02.jpg' rel="lightbox[gallery-0]"><img width="500" height="319" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_02.jpg" class="attachment-vw_medium" alt="The board in its final design... HDMI and two DVIs make the end of one GTX295..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295.jpg' rel="lightbox[gallery-0]"><img width="500" height="319" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295.jpg" class="attachment-vw_medium" alt="...for LEGO lovers, this is how the part is going to look inside." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_03.jpg' rel="lightbox[gallery-0]"><img width="500" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_03-500x420.jpg" class="attachment-vw_medium" alt="Spec comparison, courtesy of IT168.com" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_04.jpg' rel="lightbox[gallery-0]"><img width="500" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_04-500x420.jpg" class="attachment-vw_medium" alt="There were two performance tables, but I will omit the PhysX one... just makes no point. Scores in Dead Space promise a lot of high-res fun." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_05.jpg' rel="lightbox[gallery-0]"><img width="500" height="320" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_gtx295_05.jpg" class="attachment-vw_medium" alt="Power consumption should put all those &quot;55nm eats too much power&quot; rumors to rest." /></a>

<p>So, we have a part that was intended to be a doubled GTX260, but Nvidia saw that, performance-wise, it might not be enough to overtake the Radeon 4870X2 or its upcoming overclocked versions&#8230; thus, the company decided to play it safe and unlock all 240 shaders that each part possesses.</p>
<p>Smart move or not? Well, the launch date was moved from early December to the first day of CES, but what can you do. This plays perfectly into the decision not to go with a separate 55nm GeForce GTX 270, because of all the 65nm inventory the company currently has. Thus, all those chips will become GTX260-216 and (if any) GTX260-192 parts. However, the GTX280 will not receive 55nm chips; instead, the company is preparing the GTX285, a part with the same clocks as the Quadro FX5800. The core is 648 MHz, shaders are set at 1.48 GHz, while 1GB of memory is set at 2.48 GHz.</p>
<p>The GTX 295 is going to sit on top of the lineup, followed by the GTX285, the remaining stock of GTX280 and an abundance of GTX260-216 parts (a 55nm/65nm combo). Despite some sites claiming that 55nm is a power hog, it is now more than obvious that the 55nm GTX260 consumes less power than the Radeon 4870, and that is no small feat indeed. After all, RV770 features 999 million transistors, while the GT206 carries a whopping 40% more, at 1.4 billion.</p>
<p>Now, if only the people in charge of the memory controller hadn&#8217;t made a grave mistake and nuked GDDR5 support for political reasons (back in the days when Nvidia was, well&#8230; feeling quite egoistic), who knows where the power consumption battle would have ended. The kicker is that the memory controller people are furious with the upper echelons at Graphzilla, because if the company had adopted GDDR5 support for the 55nm refresh, the GTX295 could feature GDDR5 memory: the traces would be much simpler to route, and there would be none of the &#8220;PCB looking like a maze&#8221; issues that every GDDR3-based design has.</p>
<p>GDDR5 memory is the way to go for the future of this industry, and even though the GTX295 will have excellent performance, the green goblins have only themselves to blame for leaving us wondering what would have happened with GT206 + GDDR5. We won&#8217;t know the answer before the GT212 (40nm die-shrink) at the earliest.</p>
<p>All we know is that we have a heated battle for the $499 market range again. But this time, with two dual-GPU parts carrying more than 1.5 GB of memory each. Well, ATI has the advantage there: 2GB of GDDR5 vs. 1.79GB of GDDR3.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/16/leaked-gtx295-scores-are-genuine/">Leaked GTX295 scores are genuine</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/16/leaked-gtx295-scores-are-genuine/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Nvidia&#8217;s 3D glasses are surprisingly expensive</title>
		<link>http://www.vrworld.com/2008/12/15/nvidias-3d-glasses-are-surprisingly-expensive/</link>
		<comments>http://www.vrworld.com/2008/12/15/nvidias-3d-glasses-are-surprisingly-expensive/#comments</comments>
		<pubDate>Mon, 15 Dec 2008 16:54:35 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[3D glasses]]></category>
		<category><![CDATA[3D goggles]]></category>
		<category><![CDATA[3D Vision]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[goggles]]></category>
		<category><![CDATA[graphics plus]]></category>
		<category><![CDATA[munich]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[nvidia 3d glasses]]></category>
		<category><![CDATA[oakley]]></category>
		<category><![CDATA[stereoscopy]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=820</guid>
		<description><![CDATA[<p>Back at Nvision 08 in August, Jen-Hsun demonstrated 3D Vision during his keynote speech. While the whole audience enjoyed the demonstration of 3D technology, nobody knew how the technology worked. We bring more details and a gallery of this hot upcoming product.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/15/nvidias-3d-glasses-are-surprisingly-expensive/">Nvidia&#8217;s 3D glasses are surprisingly expensive</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Last week in Munich, Nvidia held its Graphics Plus event, where the company demonstrated its new marketing strategy, but also announced products that will heat up the hearts of the press and the minds of potential buyers. After driving through the highest snow to fall in the Kärnten region of Austria in 80 years, it was time to hear what Nvidia has to offer in 2009.</p>
<p>There were talks about PhysX games, upcoming GeForce cards, the re-birth of the old-new SLI mode… general applications such as CyberLink PowerDirector and Adobe CS4… and, of course, 3D Vision. Back at Nvision 08 in August, Jen-Hsun demonstrated 3D Vision during his keynote speech. While the whole audience enjoyed the demonstration of 3D technology, nobody knew how the technology worked.</p>
<p>Nvidia wanted to push the product out in time for Christmas, but 3D Vision relies on the availability of a 120 Hz refresh rate. This means your decade-old CRT will do for recreating a 3D world, but you can&#8217;t expect that shiny new 30&#8243; Dell or HP to work. Samsung and ViewSonic are going to bring 120Hz LCD displays to market, but not in 2008. Thus, you can expect demos of this technology at CES 2009 in early January, and a real worldwide rollout by CeBIT 2009 in March. It all depends on how much time ViewSonic, Samsung and others need to deliver 120Hz LCD monitors.</p>
<p>First of all, the product is divided into two major parts: the glasses and an IR emitter. The 3D goggles are wireless, with a USB port added for charging purposes. There is no set limit on the number of goggles that can connect to a single IR-emitting cube, but we don&#8217;t know whether the company will make a kit featuring more than one pair of glasses.</p>
<p>The IR emitter can connect to an LCD TV, LCD display, projector or just about anything able to pull off a refresh rate of 120 Hz. The technology works by giving each eye a 60 Hz refresh: the glasses blank each lens 60 times per second, so that a 120 Hz display yields smooth 3D gameplay at 60 fps or so… I have tried the technology several times, and I have to say, it is really enjoyable. During the past couple of events, Nvidia demonstrated the technology using Call of Duty 5, Far Cry 2, Crysis Warhead, Race Driver GRID and so on. The enjoyable part is that the 3D Vision tech works with almost every game, and this immediately brings up the replay value of games you have already purchased.</p>
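<p>The shutter arithmetic above is simple enough to sketch (a hypothetical helper for illustration, not anything from Nvidia&#8217;s drivers):</p>

```python
# Frame-sequential stereo: the display alternates left-eye and right-eye frames,
# and each shutter lens is open only during its own frames.

def per_eye_hz(panel_refresh_hz: float) -> float:
    """Effective refresh rate seen by each eye."""
    return panel_refresh_hz / 2

print(per_eye_hz(120))  # 60.0 - smooth per-eye motion
print(per_eye_hz(60))   # 30.0 - why an ordinary 60 Hz LCD won't cut it
```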

<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_01.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_01-750x420.jpg" class="attachment-vw_medium" alt="Retail box for the product..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_02.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_02-750x420.jpg" class="attachment-vw_medium" alt="...hides the following content: glasses, cleaning cloth, carrying case, USB cable, DVI-HDMI adapter cable, Stereo cable and manuals." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_03.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_03-750x420.jpg" class="attachment-vw_medium" alt="HDMI cable is a must for any 120Hz LCD TV..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_09.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_09-750x420.jpg" class="attachment-vw_medium" alt="Ready to play on a DLP projector ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_04.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_04-750x420.jpg" class="attachment-vw_medium" alt="The list of box content..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_05.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_05-750x420.jpg" class="attachment-vw_medium" alt="...hardware requirements are plain simple. GeForce 8 and newer hardware...Vista is mandatory, sadly." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_06.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_06-750x420.jpg" class="attachment-vw_medium" alt="How stuff works dotcom ;)" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_07.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_07-750x420.jpg" class="attachment-vw_medium" alt="How to connect..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_08.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_08-750x420.jpg" class="attachment-vw_medium" alt="Mandatory epilepsy warning and registration card..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_11.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_11-750x420.jpg" class="attachment-vw_medium" alt="Non-3D image..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_12.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_12-750x420.jpg" class="attachment-vw_medium" alt="Pressing the green button on the emitter turns the stereovision on. Note that your goggles have to be switched on." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_13.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_13-750x420.jpg" class="attachment-vw_medium" alt="Back of the IR emitter hides the scroll wheel - decide what level of stereoscopy you want..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_14.jpg' rel="lightbox[gallery-1]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_14-750x420.jpg" class="attachment-vw_medium" alt="During presentation, a single IR emitter was used to display image to several dozen glasses." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_10.jpg' rel="lightbox[gallery-1]"><img width="750" height="374" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/nvidia_3dvision_10-750x374.jpg" class="attachment-vw_medium" alt="Yep, playing games with prescription glasses on." /></a>

<p>Playing WoW, Warhammer Online or GRID is one thing, but you can also take the old Age of Empires or Anno 1503 and dive into the 3D worlds of these titles&#8230; In the case of Age of Empires III, I want to play the game again as soon as I can lay my hands on these glasses.</p>
<p>Working against this standard are the price and the still-unknown distribution model. Nvidia is advertising 3D Vision as its own product, just like Tesla, with no room for AIBs. We have asked key partners whether they plan to carry this product under their own brand, and the answer is still up in the air. It is widely expected that Nvidia will treat this product the same way as Quadro cards, selecting partners for each region. We&#8217;ll see&#8230;</p>
<p>But the major show-stopper is the price. I asked around and heard $199 from more than one source (including people from Nvidia). Now, this is where I have to raise the red flag. The 3D technology looks awesome, and the glasses work like a charm, but $199?</p>
<p>For $199, you can get Oakley Thump 2 sunglasses with an integrated MP3 player. The quality of Nvidia&#8217;s goggles is higher than that of regular 3D glasses and yes, higher than that of Dolby 3D goggles. But Nvidia&#8217;s 3D Vision goggles cannot hold a candle to the build quality of Oakley shades. For $199, top quality is expected, not shiny plastic. This may sound harsh, but this is a product that should last for the next couple of years, and this is where Dolby Laboratories made the better design move&#8230; I am just not sure that the glasses are sturdy enough to survive long gaming sessions, LAN parties and so on. Who knows, nV might bring a nice surprise, but I&#8217;ll remain cautious for now.</p>
<p>The technology is great, but the price has to come down to the $99 range to have a shot at becoming a mainstream standard.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/15/nvidias-3d-glasses-are-surprisingly-expensive/">Nvidia&#8217;s 3D glasses are surprisingly expensive</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/15/nvidias-3d-glasses-are-surprisingly-expensive/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
		</item>
		<item>
		<title>ANNOUNCEMENT: Winners of PALIT Folding Challenge 2008</title>
		<link>http://www.vrworld.com/2008/12/15/announcement-winners-of-palit-folding-challenge-2008/</link>
		<comments>http://www.vrworld.com/2008/12/15/announcement-winners-of-palit-folding-challenge-2008/#comments</comments>
		<pubDate>Mon, 15 Dec 2008 10:32:52 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[3DMark]]></category>
		<category><![CDATA[3dmark vantage]]></category>
		<category><![CDATA[Boris Romac]]></category>
		<category><![CDATA[Chad Hudson]]></category>
		<category><![CDATA[Eyvind Niklasson]]></category>
		<category><![CDATA[F@H]]></category>
		<category><![CDATA[Folding]]></category>
		<category><![CDATA[folding stats]]></category>
		<category><![CDATA[Folding@Home]]></category>
		<category><![CDATA[Francisc Kurko]]></category>
		<category><![CDATA[Futuremark]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[Goran Gjura]]></category>
		<category><![CDATA[GTX280]]></category>
		<category><![CDATA[Ivan Faj]]></category>
		<category><![CDATA[Jen-Hsun Huang]]></category>
		<category><![CDATA[Jonathan Worrel]]></category>
		<category><![CDATA[Joza Habijan]]></category>
		<category><![CDATA[Nenad Miodrag]]></category>
		<category><![CDATA[Nikola Peruncic]]></category>
		<category><![CDATA[Palit]]></category>
		<category><![CDATA[Radu Neagu]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=796</guid>
		<description><![CDATA[<p>And the winners of PALIT Folding Challenge 2008 are...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/15/announcement-winners-of-palit-folding-challenge-2008/">ANNOUNCEMENT: Winners of PALIT Folding Challenge 2008</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>It is with great pleasure that I can announce the winners of the PALIT Folding Challenge 2008. This contest surprised me with the enthusiasm of its members, and I cannot say how happy I am to be in a position to give something back. Without further ado, the winners of the inaugural Folding Challenge are:</p>
<p><strong>Francisc Kurko, Romania</strong> &#8211; PALIT GeForce GTX280 graphics card, signed by Jen-Hsun Huang</p>
<p>Winners of 3DMarkVantage retail licenses are:</p>
<p><strong>Radu Neagu, Romania</strong></p>
<p><strong>Boris Romac, Croatia</strong></p>
<p><strong>Jonathan Worrel </strong>(Furious Fandango)<strong>, United States of America</strong></p>
<p><strong>Ivan Faj, Croatia</strong></p>
<p><strong>Nikola Peruncic</strong> (McPingvin), Croatia</p>
<p><strong>Chad Hudson</strong> (Dragoon1101)<strong>, United States of America</strong></p>
<p><strong>Nenad Miodrag</strong> (nmiodrag)<strong>, Croatia</strong></p>
<p><strong>Goran Gjura </strong>(spikeygoran)<strong>, Croatia</strong></p>
<p><strong>Eyvind Niklasson, Sweden</strong></p>
<p><strong>Joza Habijan, Croatia</strong></p>
<p>In the end, I wish to congratulate everyone who competed in the PALIT Folding Challenge 2008. I am pleasantly surprised by the effort put in by all members of this small team. Also, bear in mind that the next giveaway is already planned, and this time I will focus on giving away as many prizes as possible, with several tweaks to the contest rules. Rest assured, everything that I do will be global, so no exclusions. Everybody is welcome in the group, and as the new site starts to grow, you can be sure there are plenty more surprises. For starters, we&#8217;ll make sure that all folders have a dedicated place on the site, in the form of a site section and a forum section.</p>
<p>All winners need to contact me at <a href="mailto:FoldingChallenge@gmail.com" target="_blank">FoldingChallenge@gmail.com</a> with their shipping address, so that I can arrange shipping of the prizes.</p>

<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge1stprize.jpg' rel="lightbox[gallery-2]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge1stprize-750x420.jpg" class="attachment-vw_medium" alt="Jen-Hsun signed the board as soon as winner was known..." /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge2ndprizes.jpg' rel="lightbox[gallery-2]"><img width="350" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge2ndprizes-350x420.jpg" class="attachment-vw_medium" alt="Futuremark provides retail licenses for 3DMarkVantage Advanced Edition" /></a>
<a href='http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge15122008.jpg' rel="lightbox[gallery-2]"><img width="750" height="420" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/foldingchallenge15122008-750x420.jpg" class="attachment-vw_medium" alt="The Final Table... congratulations to all winners.." /></a>

<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/15/announcement-winners-of-palit-folding-challenge-2008/">ANNOUNCEMENT: Winners of PALIT Folding Challenge 2008</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/15/announcement-winners-of-palit-folding-challenge-2008/feed/</wfw:commentRss>
		<slash:comments>19</slash:comments>
		</item>
		<item>
		<title>New Steam survey confirms Intel, Nvidia dominate the market share</title>
		<link>http://www.vrworld.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/</link>
		<comments>http://www.vrworld.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/#comments</comments>
		<pubDate>Sun, 14 Dec 2008 01:37:17 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[ATI]]></category>
		<category><![CDATA[CrossFire]]></category>
		<category><![CDATA[DirectX 10]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[geforce 8800]]></category>
		<category><![CDATA[hardware survey]]></category>
		<category><![CDATA[multi-gpu]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[overclockers]]></category>
		<category><![CDATA[Overclocking]]></category>
		<category><![CDATA[SLI]]></category>
		<category><![CDATA[Steam]]></category>
		<category><![CDATA[Valve]]></category>
		<category><![CDATA[Vista]]></category>
		<category><![CDATA[vista is failure]]></category>
		<category><![CDATA[widescreen]]></category>
		<category><![CDATA[Windows XP]]></category>
		<category><![CDATA[WinXP]]></category>
		<category><![CDATA[worlds largest survey]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=789</guid>
		<description><![CDATA[<p>Developing a game and want to know what gamers use? Valve Corporation has gathered more than 15 million users on its Steam digital distribution platform, and probably the most interesting part is the world-famous "Steam Hardware Survey". How many people use high-end hardware? What kind of displays do gamers use? We analyze the last six months...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/">New Steam survey confirms Intel, Nvidia dominate the market share</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>When it comes to game development, Valve&#8217;s software distribution platform is probably the most important part of the industry.</p>
<p>Steam has more than 15 million users worldwide and is to software distribution what World of Warcraft is to MMO games. But probably the most interesting part of <a href="http://store.steampowered.com/hwsurvey" target="_blank">Steam is the world-famous &#8220;Steam Hardware Survey&#8221;</a>. Every month, Valve collects data from 15 million users and lists what kind of hardware people use. This is an invaluable process that gives developers golden information about the platforms they should target. In my discussions with game developers, I have had countless verbal battles with people who didn&#8217;t want to create a game for high-end hardware because the adoption rate was too low. Well, think again.</p>
<p>In the survey for November 2008, the stats show that Intel leads the CPU share with 63.62%, while AMD owns the remaining 36.38%. Dual-core dominates with 49.04%, i.e. almost eight million people own a dual-core processor. Quad-core captured just 10.43%, which means the quads still have a mountain to climb. It is surprising to see that 40.19% still own a single-core processor, but multi-thread support is a must-have feature today, not tomorrow &#8211; 59.47% own a multi-thread capable computer.</p>
<p>CPU-wise, the best sellers are Core 2 Duo E6600 and E6700 processors (2.4-2.66 GHz), while owners of the AMD platform just cannot get enough of the Athlon 64 X2 4400+ (2.2 GHz). Graphics-wise, Nvidia captured 65.11% of all Steam users, which translates into roughly ten million gamers. Here comes the most interesting part of the survey. According to Steam, the GeForce 8800 captured the hearts of no less than 32.35% of them. Out of those ten million Nvidia users, more than three million people own a high-performing GeForce 8800 card &#8211; it is almost incredible to see that amount of 3D horsepower taking more market share than the numerous mainstream and low-end cards.</p>
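To put these shares in perspective, here is a back-of-the-envelope sketch of how the survey percentages translate into headcounts. The ~15 million user base and the percentages are taken from the figures above; the helper function is purely illustrative, not anything Valve publishes:

```python
# Back-of-the-envelope conversion of Steam survey shares into headcounts.
# The ~15 million base and the percentages come from the article; the 32.35%
# GeForce 8800 figure is read as a share of Nvidia users, as the text does.
STEAM_USERS = 15_000_000

def share_to_users(percent, base=STEAM_USERS):
    """Convert a survey share (in percent) into an approximate headcount."""
    return round(base * percent / 100)

nvidia_users = share_to_users(65.11)                # ~9.8 million of all users
gf8800_users = share_to_users(32.35, nvidia_users)  # ~3.2 million of Nvidia users

print(f"Nvidia users:       ~{nvidia_users:,}")
print(f"GeForce 8800 users: ~{gf8800_users:,}")
```

The same two-step arithmetic explains why a 32.35% model share yields "more than three million" cards rather than five million: it is a share of the Nvidia installed base, not of all 15 million users.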
<p>API support is a key factor in deciding which platform to target, and software developers will appreciate this fact: even though almost half of all Steam users own DirectX 10 hardware, only 21.43% of all Steam users can actually use the DX10 API &#8211; the massive majority of DX10 hardware owners still run Windows XP.</p>
<div id="attachment_790" style="width: 510px" class="wp-caption aligncenter"><img class="size-full wp-image-790" title="steam_hw-survey" src="http://cdn.vrworld.com/wp-content/uploads/2008/12/steam_hw-survey.jpg" alt="Numbers don't lie, and these are results from more than 15M people around the world. World's largest IT-related survey, that's for sure." width="500" height="398" /><p class="wp-caption-text">Numbers don&#39;t lie, and these are results from more than 15M people around the world. World&#39;s largest IT-related survey, that&#39;s for sure.</p></div>
<p><strong>Summary</strong></p>
<p>All in all, I would conclude that this survey proves just how popular the Core 2 Duo and GeForce 8800 are. If you&#8217;re working on game code optimization today and plan to launch your game on Steam in 2009, focus your efforts on the following points:</p>
<ul>
<li>Vista is a failure; gamers are waiting for Windows 7 to say goodbye to Windows XP</li>
<li>Focus optimization on two cores, most likely in the 2.5 GHz range</li>
<li>480,000 users overclocked their CPU beyond the fastest shipping clock, and an additional 1.5 million moderately overclock their machines</li>
<li>Users have around 100GB of free space on their hard drives</li>
<li>Most users have 512MB of video memory</li>
<li>The most popular resolution is 1024&#215;768</li>
<li>Most widescreen users have 27&#8243; screens (surprised?), followed by 24&#8243; ones &#8211; so for widescreen, focus on Full HD resolution</li>
<li>Multi-GPU is esoteric at best, with 1.79% of the overall share. Yes, only 280,000 people have a multi-GPU configuration, with SLI dominating that share at 1.55% (240,000)</li>
<li>Currently, nobody uses four GPUs with ATI chips, and a mere 4,600 people own two 7950GX2 cards</li>
<li>Valve currently runs the world&#8217;s largest IT-related survey, and probably one of the largest surveys in existence (does anybody know how many people Nielsen actually tracks?)</li>
</ul>
<p>The Steam survey offered some interesting insights. I&#8217;ll follow up on this in future monthly reports on Steam.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/">New Steam survey confirms Intel, Nvidia dominate the market share</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/12/14/new-steam-survey-confirms-intel-nvidia-dominate-the-market-share/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
		</item>
	</channel>
</rss>
