<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; Nvidia</title>
	<atom:link href="http://www.vrworld.com/tag/nvidia/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Thu, 09 Apr 2015 20:31:19 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>Uncle Sam Shocks Intel With a Ban on Xeon Supercomputers in China</title>
		<link>http://www.vrworld.com/2015/04/07/usa-shocks-intel-ban-on-china-xeon-supercomputers/</link>
		<comments>http://www.vrworld.com/2015/04/07/usa-shocks-intel-ban-on-china-xeon-supercomputers/#comments</comments>
		<pubDate>Tue, 07 Apr 2015 04:21:15 +0000</pubDate>
		<dc:creator><![CDATA[VR World Staff]]></dc:creator>
				<category><![CDATA[AMD]]></category>
		<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Asia Pacific (APAC)]]></category>
		<category><![CDATA[Breaking]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[China]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Enterprise]]></category>
		<category><![CDATA[Exclusive]]></category>
		<category><![CDATA[Global Politics]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[US]]></category>
		<category><![CDATA[Alpha]]></category>
		<category><![CDATA[APAC]]></category>
		<category><![CDATA[ASEAN]]></category>
		<category><![CDATA[Brian Krzanich]]></category>
		<category><![CDATA[Changsha]]></category>
		<category><![CDATA[Denial List]]></category>
		<category><![CDATA[EFLOPS]]></category>
		<category><![CDATA[Exascale]]></category>
		<category><![CDATA[GPGPU]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Guangzhou]]></category>
		<category><![CDATA[Guangzhou Supercomputer Center]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[IBM]]></category>
		<category><![CDATA[INTC]]></category>
		<category><![CDATA[ISA]]></category>
		<category><![CDATA[Loongson]]></category>
		<category><![CDATA[MIPS]]></category>
		<category><![CDATA[NUDT]]></category>
		<category><![CDATA[NVDA]]></category>
		<category><![CDATA[PFLOPS]]></category>
		<category><![CDATA[Power8]]></category>
		<category><![CDATA[Power9]]></category>
		<category><![CDATA[Rule Britannia]]></category>
		<category><![CDATA[Shenwei]]></category>
		<category><![CDATA[Tesla]]></category>
		<category><![CDATA[Tianhe]]></category>
		<category><![CDATA[Tianhe-2]]></category>
		<category><![CDATA[Uncle Sam]]></category>
		<category><![CDATA[x86]]></category>
		<category><![CDATA[xeon]]></category>
		<category><![CDATA[Xeon Phi]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=51616</guid>
		<description><![CDATA[<p>Just as Intel&#8217;s (NASDAQ: INTC) CEO Brian Krzanich opens the regular staff meetings before a dramatically reduced IDF2015 Shenzhen conference, it is a good time to review how ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/04/07/usa-shocks-intel-ban-on-china-xeon-supercomputers/">Uncle Sam Shocks Intel With a Ban on Xeon Supercomputers in China</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1000" height="513" src="http://cdn.vrworld.com/wp-content/uploads/2015/04/China_Tianhe2.jpg" class="attachment-post-thumbnail wp-post-image" alt="China&#039;s Tianhe-2 supercomputer is world&#039;s fastest supercomputer, at 33 PFLOPS demonstrated and 55 PFLOPS theoretical performance." /></p><p>Just as <a title="Intel Corporate Bios" href="http://www.intel.com/newsroom/assets/bio/CorpOfficers.htm" target="_blank">Intel&#8217;s (NASDAQ: INTC) CEO Brian Krzanich</a> opens the regular staff meetings before a dramatically reduced <a title="IDF2015 Shenzhen" href="http://www.intel.com/content/www/us/en/intel-developer-forum-idf/shenzhen/2015/idf-2015-shenzhen.html" target="_blank">IDF2015 Shenzhen</a> conference, it is a good time to review how government and enterprises don&#8217;t see eye to eye when it comes to strategic business.</p>
<div id="attachment_51624" style="width: 610px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2015/04/China_Tianhe2.jpg" rel="lightbox-0"><img class="wp-image-51624 size-medium" src="http://cdn.vrworld.com/wp-content/uploads/2015/04/China_Tianhe2-600x308.jpg" alt="China's Tianhe-2 supercomputer is world's fastest supercomputer, at 33 PFLOPS demonstrated and 55 PFLOPS theoretical performance." width="600" height="308" /></a><p class="wp-caption-text">China&#8217;s Tianhe-2 supercomputer is world&#8217;s fastest supercomputer, at 33 PFLOPS demonstrated and 55 PFLOPS theoretical performance.</p></div>
<p>Remember the Tianhe-2 machine at the Guangzhou Supercomputer Center, the current world&#8217;s number one according to the Top 500 supercomputer list? Unlike some other Chinese supercomputers, Tianhe-2 is a fully Intel-based machine, the world’s largest assembly of Intel Xeon CPUs and Xeon Phi accelerators.</p>
<p>Even though Intel ‘opened the kimono’ and gave a nearly 70% discount on its processors and accelerators, the deal gave Intel, and therefore the US technology sector, a major foothold in China and the Asian region as a whole. Over the course of the past two years, we were involved in many discussions with Intel staff who were not privy to the financial impact of the deal, and who even argued against our undoubtedly solid information. We’re not here to report how things should be, or how they appear in marketing and investor presentations, but how things really are.</p>
<p>During 2015, the Tianhe-2 supercomputer was supposed to be doubled in size, up to 110 PFLOPS peak, again using the very same Intel processors and accelerators. Since these are now mature products with a lower real manufacturing cost for Intel, the company could finally make some real money.</p>
<p>Well, it was not to be: our tweety bird at the window chirped to us that Uncle Sam has put this supercomputer centre, together with the National University of Defense Technology in Changsha, the system’s creators, and the Tianjin centre, among others, on a so-called &#8220;Denial List&#8221;, which prevents any high technology from the USA from being sold to these sites. Our sources used even <a href="https://www.youtube.com/watch?v=E_Vhdfao0Zs.">harsher words</a>.</p>
<p>Knowing that these several sites alone were expected to order some 250+ PFLOPS of compute in the next few years (around 500,000 top-end Broadwell-EP Xeon E5v4 processors, or approximately $1 billion at high-margin list prices), and that they were THE Intel-friendly ones, this is quite a loss to Intel, thanks to Uncle Sam.</p>
<p>But the worse strategic loss, over time, is that China&#8217;s indigenous high-end processor architectures can now use this decision as an excuse to push the government to gradually remove any dependence on the US. This means just one thing: AMD and Intel x86 processor technology is increasingly becoming persona non grata. Should the Chinese government react in force, it will give Chinese vendors blank-check support to go all the way in developing their Alpha, POWER and MIPS processors for both government and mainstream commercial use.</p>
<p>You may think they are not up to the mark, but remember how fast the British ARM architecture became the dominant processing architecture in the world. And this group doesn&#8217;t need to worry about the antiquated x86 ISA, about satisfying the dumbed-down shareholder masses, or about overpaying its marketing and sales staff, not to mention the fat-check, golden-parachute-protected CxOs.</p>
<p>They have taken the best that the USA developed and then discarded due to corporate shenanigans (some key Alpha, GPGPU and MIPS architects left the US over the course of the past four years, many due to non-renewed visas), and continued developing it much further than anyone expected, on both the hardware and software sides.</p>
<div id="attachment_51622" style="width: 610px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2015/04/ShenWei_SW1600.jpg" rel="lightbox-1"><img class="wp-image-51622 size-medium" src="http://cdn.vrworld.com/wp-content/uploads/2015/04/ShenWei_SW1600-600x342.jpg" alt="Five years ago, ShenWei showed a CPU that performed faster than the fastest GPUs of the time. Now, fourth generation is approaching." width="600" height="342" /></a><p class="wp-caption-text">Five years ago, ShenWei showed a CPU that performed faster than the fastest GPUs of the time. Now, fifth generation is approaching, slotting between Tesla and FirePro GPGPUs and next-gen Xeon Phi accelerators. However, this is not an accelerator or a GPGPU &#8211; this is a CPU.</p></div>
<p>So, thanks to Uncle Sam, China might not get a 110 PFLOPS Intel-based supercomputer, but it definitely will launch a 100 PFLOPS system based on the upcoming 64-core, TFLOPS-class <a title="ShenWei on Wikipedia" href="http://en.wikipedia.org/wiki/ShenWei" target="_blank">ShenWei Alpha</a>, with true-blue CPUs possibly faster per socket than even the next-generation Xeon Phi or Volta/Pascal-based Teslas. Next, of course, comes a 100 PFLOPS Chinese POWER8 or POWER9 system (thank you, IBM), and then possibly even <a title="Loongson" href="http://www.loongson.cn/" target="_blank">Loongson MIPS</a> &#8211; it may come back into the high-end field with renewed government support because of this Uncle Sam move. All are clean, elegant, scalable high-end RISC architectures.</p>
<p>So who are the winners and losers from this?</p>
<p>NUDT and Tianhe may be the losers for now, but only in the short term. They will simply speed up their HPC ARM plans.</p>
<p>Intel comes out the big loser here, and by a lot: who will want to do a phased deployment of a large x86 machine in China now, and worry about future phases? Then comes Uncle Sam himself: he lost even the little bit of influence he had on high-end Chinese HPC. How is that for &#8220;cutting off your nose to spite your face&#8221;?</p>
<p><strong><em>VR World&#8217;s</em> Analysis:</strong> US government moves accelerate the Chinese CPU roadmap while curtailing the juiciest sales for Intel and other US vendors.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/04/07/usa-shocks-intel-ban-on-china-xeon-supercomputers/">Uncle Sam Shocks Intel With a Ban on Xeon Supercomputers in China</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/04/07/usa-shocks-intel-ban-on-china-xeon-supercomputers/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Nvidia May Already Be Working On A GTX 980 Ti</title>
		<link>http://www.vrworld.com/2015/03/28/nvidia-may-already-be-working-on-a-gtx-980-ti/</link>
		<comments>http://www.vrworld.com/2015/03/28/nvidia-may-already-be-working-on-a-gtx-980-ti/#comments</comments>
		<pubDate>Sat, 28 Mar 2015 15:57:16 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Companies]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[GeForce GTX 980 Ti]]></category>
		<category><![CDATA[GM200]]></category>
		<category><![CDATA[GM204]]></category>
		<category><![CDATA[GTX 980 Ti]]></category>
		<category><![CDATA[GTX 990]]></category>
		<category><![CDATA[Radeon R9 390X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=51053</guid>
		<description><![CDATA[<p>The GTX 980 Ti is said to feature the same GM200 silicon as the Titan X, and will likely be available in factory overclocked editions. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/28/nvidia-may-already-be-working-on-a-gtx-980-ti/">Nvidia May Already Be Working On A GTX 980 Ti</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="5616" height="3744" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/GeForce_GTX_590.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce_GTX_590" /></p><p>After launching the <a href="http://www.vrworld.com/tag/titan-x" target="_blank">Titan X</a> earlier this month, it looks like Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=N1UWVdnHJM6GuQTf64C4Bw" target="_blank">NASDAQ:NVDA</a>) is set to unveil a second video card based on the GM200 silicon, the GeForce GTX 980 Ti. The card is rumored to offer a significant increase in performance compared to the GM204-based <a href="http://www.vrworld.com/tag/gtx-980" target="_blank">GTX 980</a>, which is why Nvidia may decide to call the card the GTX 990.</p>
<p>Based on the <a href="http://www.sweclockers.com/nyhet/20265-geforce-gtx-980-ti-anlander-efter-sommaren" target="_blank">latest leak</a>, the GTX 980 Ti is said to offer the full complement of 3,072 CUDA cores available in the GM200, while reducing the video memory to 6GB of GDDR5, down from the 12GB featured on the Titan X. The memory bus will remain the same at 384-bit, and with the GTX 980 Ti, Nvidia will allow its partners to offer custom variations of the reference design, including custom cooling solutions and factory-overclocked models.</p>
<p>This gives the GTX 980 Ti the potential to be faster than the Titan X, which is offered only in a reference design with the stock cooler. There&#8217;s no confirmation as to when we&#8217;ll see the card, with the rumor only suggesting that Nvidia is timing its launch to coincide with that of AMD&#8217;s (<a href="https://www.google.com/finance?q=amd&amp;ei=QVUWVdiYCuP_uQSd6oDYCQ" target="_blank">NASDAQ:AMD</a>) Radeon R9 390X. The Radeon R9 300 series will make its debut <a title="AMD R9 300 Series Said to Launch At Computex 2015" href="http://www.vrworld.com/2015/03/08/amd-r9-300-series-said-launch-computex-2015/" target="_blank">at Computex</a>, so it is possible that we may see the GTX 980 Ti at around the same time.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/28/nvidia-may-already-be-working-on-a-gtx-980-ti/">Nvidia May Already Be Working On A GTX 980 Ti</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/28/nvidia-may-already-be-working-on-a-gtx-980-ti/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Intel Gunning To Challenge Nvidia At HPC With &#8216;Knights Landing&#8217; Xeon Phi Processor</title>
		<link>http://www.vrworld.com/2015/03/27/intel-gunning-to-challenge-nvidia-at-hpc-with-knights-landing-xeon-phi-processor/</link>
		<comments>http://www.vrworld.com/2015/03/27/intel-gunning-to-challenge-nvidia-at-hpc-with-knights-landing-xeon-phi-processor/#comments</comments>
		<pubDate>Fri, 27 Mar 2015 03:43:01 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Companies]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[Knights Landing]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Servers]]></category>
		<category><![CDATA[Xeon Phi]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=51021</guid>
		<description><![CDATA[<p>Intel's Knights Landing is set to offer three times the performance of the current-gen Knights Corner.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/27/intel-gunning-to-challenge-nvidia-at-hpc-with-knights-landing-xeon-phi-processor/">Intel Gunning To Challenge Nvidia At HPC With &#8216;Knights Landing&#8217; Xeon Phi Processor</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1277" height="717" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Knights-Landing.jpg" class="attachment-post-thumbnail wp-post-image" alt="Knights Landing" /></p><p>Intel (<a href="https://www.google.com/finance?q=intel&amp;ei=h70UVeHlJ4nwuATq94DYBQ" target="_blank">NASDAQ:INTC</a>) has shed further details on its second-generation Xeon Phi CPU, known as Knights Landing.</p>
<p>The processor features several technical achievements, starting with a 14nm manufacturing process, which is a first in this series. Designed to offer high-performance computing, Knights Landing differs from other server-based CPUs in that it uses lots of low-energy cores to run parallel tasks, whereas offerings from IBM (<a href="https://www.google.com/finance?q=ibm&amp;ei=sL0UVbm9HcevugS_r4KAAQ" target="_blank">NYSE:IBM</a>) or Oracle (<a href="https://www.google.com/finance?q=oracle&amp;ei=DL4UVfjpI47luATvpIG4Dg" target="_blank">NYSE:ORCL</a>) use fewer but more powerful cores.</p>
<p>Built on Intel&#8217;s MIC (Many Integrated Core) architecture with a total of 8 billion transistors, Knights Landing runs a modified version of the Atom Silvermont x86 core in a tile configuration, with a single tile featuring two cores and vector execution units, along with shared L2 cache as well as circuitry that connects the tile to the rest of the mesh network. Intel has mentioned that each Knights Landing package would include a processor with 30 or more tiles and eight on-package memory modules. Another major highlight of Knights Landing is that it can function as a host processor, meaning that it can boot and run x86 operating systems and application code without any need for recompilation. It can also act as a co-processor.</p>
<p>As for memory, the chip vendor has announced that Knights Landing will feature eight 2GB stacks of on-package memory, totaling 16GB. The memory is manufactured by Micron, and looks to be a variant of the company&#8217;s Hybrid Memory Cube, which stacks memory dies and uses an embedded logic chip to deliver higher bandwidth at lower power. Micron has said that its HMC modules can transfer data 15 times faster than a standard DDR3 module while using 70% less energy. Along with the on-package memory, Knights Landing will come with six memory channels that can connect a total of 384GB of DDR4 memory.</p>
<p>The result of the new manufacturing process, core design and memory is that Knights Landing will offer three times the performance of the current-gen Knights Corner, with Intel claiming 3 teraflops of double-precision and 6 teraflops of single-precision performance. That number is close to the 7 teraflops figure Nvidia (<a href="https://www.google.com/finance?q=nvidia" target="_blank">NASDAQ:NVDA</a>) <a href="http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/" target="_blank">touted</a> during the launch of its latest video card, the <a href="http://www.vrworld.com/tag/titan-x" target="_blank">Titan X</a>.</p>
<p>It is no wonder, then, that Intel is aiming for the same use cases as Nvidia with Knights Landing, with the chip vendor stating that the CPU can be used for deep learning and data analytics. Nvidia, however, has invested significant resources in its platform, and is offering tools such as the <a href="http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/" target="_blank">Digits</a> software framework. Even if Intel does not manage to successfully challenge Nvidia with Knights Landing, the chip is seeing strong demand, with over 50 companies set to sell server systems with the CPU as the host.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/27/intel-gunning-to-challenge-nvidia-at-hpc-with-knights-landing-xeon-phi-processor/">Intel Gunning To Challenge Nvidia At HPC With &#8216;Knights Landing&#8217; Xeon Phi Processor</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/27/intel-gunning-to-challenge-nvidia-at-hpc-with-knights-landing-xeon-phi-processor/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>EVGA&#8217;s GTX 980 Hybrid Comes With An AIO Liquid Cooler</title>
		<link>http://www.vrworld.com/2015/03/27/evgas-gtx-980-hybrid-comes-with-an-aio-liquid-cooler/</link>
		<comments>http://www.vrworld.com/2015/03/27/evgas-gtx-980-hybrid-comes-with-an-aio-liquid-cooler/#comments</comments>
		<pubDate>Fri, 27 Mar 2015 03:42:36 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Companies]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[EVGA]]></category>
		<category><![CDATA[GeForce GTX 980]]></category>
		<category><![CDATA[GM204]]></category>
		<category><![CDATA[GTX 980 Hybrid]]></category>
		<category><![CDATA[Maxwell]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=51022</guid>
		<description><![CDATA[<p>The GTX 980 Hybrid makes an already stunning card even better. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/27/evgas-gtx-980-hybrid-comes-with-an-aio-liquid-cooler/">EVGA&#8217;s GTX 980 Hybrid Comes With An AIO Liquid Cooler</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1131" height="1128" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/GTX-980-Hybrid.jpg" class="attachment-post-thumbnail wp-post-image" alt="GTX 980 Hybrid" /></p><p>EVGA has launched the first GeForce GTX 980 with an all-in-one liquid and air cooler, with the card dubbed the GTX 980 Hybrid. The custom cooling option is claimed to reduce temperatures by a full 25 degrees Celsius when compared to a reference GTX 980, allowing EVGA to substantially overclock the card out of the gate.</p>
<p>The GTX 980 Hybrid features a base clock of 1291MHz and a boost clock of 1393MHz, with the vendor stating that there will be plenty of headroom left for further overclocking. The card comes with 4GB of GDDR5 memory on a 256-bit interface with an effective memory clock of 7010MHz. The liquid-cooled portion directs heat away from the GPU, while the included air cooler is used to cool the memory and VRM sections of the video card.</p>
<p>This isn&#8217;t the first AIO liquid-cooled video card, as that distinction goes to AMD&#8217;s Radeon R9 295X2, but this is the first time we&#8217;re seeing this system used on the GTX 980. Based on Nvidia&#8217;s (<a href="https://www.google.com/finance?q=nvidia&amp;ei=QMUUVZGYEtOLuQTn6ICoDQ" target="_blank">NASDAQ:NVDA</a>) new Maxwell architecture, the GTX 980 features the GM204 GPU with 2048 CUDA cores.</p>
<p>The GTX 980 is listed on <a href="http://www.evga.com/products/Product.aspx?pn=04G-P4-1989-KR" target="_blank">EVGA&#8217;s website</a> for US $649.99/€779,00. If you&#8217;re interested in just getting the cooler, EVGA is <a href="http://www.evga.com/products/Product.aspx?pn=400-HY-H980-B1" target="_blank">offering it</a> for $99.99.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/27/evgas-gtx-980-hybrid-comes-with-an-aio-liquid-cooler/">EVGA&#8217;s GTX 980 Hybrid Comes With An AIO Liquid Cooler</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/27/evgas-gtx-980-hybrid-comes-with-an-aio-liquid-cooler/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Officially Launches The GeForce GTX Titan X</title>
		<link>http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/</link>
		<comments>http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/#comments</comments>
		<pubDate>Wed, 18 Mar 2015 11:23:37 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GeForce GTX 980]]></category>
		<category><![CDATA[GM200]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Maxwell]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan-X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50226</guid>
		<description><![CDATA[<p>The Titan X is billed as the most powerful single-GPU solution available today. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/">Nvidia Officially Launches The GeForce GTX Titan X</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="893" height="768" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-GTX-Titan-X.jpg" class="attachment-post-thumbnail wp-post-image" alt="Nvidia GTX Titan X" /></p><p>After serving up a few details regarding the card at the Game Developers Conference earlier this month, Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=wV8JVeHoJsevugSlzYD4BA" target="_blank">NASDAQ:NVDA</a>) has officially launched its latest offering in the Titan series, the <a href="http://www.vrworld.com/tag/titan-x" target="_blank">Titan X</a>. The card will be priced at $999 and will be available in May.</p>
<p>The specs on offer with the Titan X match what the <a title="Nvidia Titan X Specs Detailed Ahead Of Official Launch" href="http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/" target="_blank">last round of leaks</a> predicted, with the card featuring a 28nm GM200 GPU that offers 3072 CUDA cores, 192 TMUs, 96 ROPs, 3MB of L2 cache along with 12GB of GDDR5 memory with a 384-bit memory interface. Base clock for the card is at 1002MHz, with a boost clock of 1078MHz.</p>
<p>Nvidia has stated that there is more than sufficient headroom available for overclocking the Titan X, and that the card will reach frequencies of 1200MHz without any hassles. The GM200 silicon is basically an enhanced version of the GM204 GPU that was introduced with the GTX 980, with a 50% increase in bandwidth, CUDA cores and ROP count, and the end result is a card with a maximum TDP of 250W.</p>
<p>The cooler itself is similar to what we&#8217;ve seen on earlier-generation Titan cards, although this time around the metal shroud has a black coat of paint. Nvidia is once again touting the Titan X as the fastest single-GPU solution available in the world today, billing the card more as a luxury gaming video card than an entry-level compute card like the Kepler-based GTX Titan and GTX Titan Black. For that reason, its $999 price may turn out to be the only sore point, given that two GTX 980s in SLI can outperform the Titan X.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/">Nvidia Officially Launches The GeForce GTX Titan X</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/18/nvidia-officially-launches-the-geforce-gtx-titan-x/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>GTC 2015: Nvidia Unveils Digits DevBox Supercomputer Aimed At Researchers</title>
		<link>http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/</link>
		<comments>http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/#comments</comments>
		<pubDate>Wed, 18 Mar 2015 02:48:09 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Digits DevBox]]></category>
		<category><![CDATA[GTC 2015]]></category>
		<category><![CDATA[NASDAQ: NVDA]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan-X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50208</guid>
		<description><![CDATA[<p>The Digits DevBox offers the most computational performance you can get from a machine powered by a wall socket.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/">GTC 2015: Nvidia Unveils Digits DevBox Supercomputer Aimed At Researchers</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="640" height="426" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/DevBox.png" class="attachment-post-thumbnail wp-post-image" alt="DevBox" /></p><p>Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=qeMIVcC6K8HkuATckIGIAQ" target="_blank">NASDAQ:NVDA</a>) has announced its own computing solution aimed at deep-learning researchers at the GPU Technology Conference. Called Digits DevBox, the system includes four <a href="http://www.vrworld.com/tag/titan-x/" target="_blank">Titan X</a> GPUs in quad-SLI, making it the most powerful configuration of GPU hardware in a single PC available today.</p>
<p>The box will feature the Digits software framework, which gives deep-learning researchers a series of tools that will enable them to configure and monitor deep neural networks, as well as process data. The system itself is powered by Ubuntu 14.04, and comes with the following frameworks and libraries installed: Caffe, Torch, Theano, BIDMach, cuDNN v2, and CUDA 7.0.</p>
<p>Other hardware includes an Asus X99 motherboard with an Intel Haswell-E Core i7 processor, a combination of mechanical and flash-based storage that includes 3&#215;3TB drives in RAID 5 as well as M.2 SATA SSDs, and a 1500W power supply.</p>
<p>The system&#8217;s $15,000 price is several times what you would pay for the hardware alone, so it&#8217;s clear Nvidia is also charging a premium for its software-driven framework. The chip vendor is <a href="https://developer.nvidia.com/devbox" target="_blank">now taking registrations</a> for the Digits DevBox, and there&#8217;s an online configurator as well that can be accessed by filling out <a href="http://info.nvidianews.com/build_a_system_nvidia_3_15.html" target="_blank">this form</a>.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/">GTC 2015: Nvidia Unveils Digits DevBox Supercomputer Aimed At Researchers</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/18/gtc-2015-nvidia-unveils-digits-devbox-supercomputer-aimed-at-researchers/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Announces Pricing And A Launch Date For Drive PX Self-Driving Car Platform At GTC 2015</title>
		<link>http://www.vrworld.com/2015/03/18/nvidia-announces-pricing-and-a-launch-date-for-drive-px-self-driving-car-platform-at-gtc-2015/</link>
		<comments>http://www.vrworld.com/2015/03/18/nvidia-announces-pricing-and-a-launch-date-for-drive-px-self-driving-car-platform-at-gtc-2015/#comments</comments>
		<pubDate>Wed, 18 Mar 2015 02:04:48 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[News]]></category>
		<category><![CDATA[autonomous cars]]></category>
		<category><![CDATA[Drive PX]]></category>
		<category><![CDATA[GTC 2015]]></category>
		<category><![CDATA[Jen-Hsun Huang]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50191</guid>
		<description><![CDATA[<p>Drive PX will be powered by a deep learning platform that will enable the system to react to different scenarios over time. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-announces-pricing-and-a-launch-date-for-drive-px-self-driving-car-platform-at-gtc-2015/">Nvidia Announces Pricing And A Launch Date For Drive PX Self-Driving Car Platform At GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="650" height="433" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-Drive-PX.png" class="attachment-post-thumbnail wp-post-image" alt="Nvidia Drive PX" /></p><p>At the GPU Technology Conference, Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=zdoIVfnRK5KMuQS5i4LoBw" target="_blank">NASDAQ:NVDA</a>) announced a developer kit for its self-driving car platform, dubbed <a title="CES 2015: Nvidia Announces Mobile Maxwell, Automotive Computers [Updated]" href="http://www.vrworld.com/2015/01/05/ces-2015-nvidia-announces-mobile-maxwell/" target="_blank">Drive PX</a>.</p>
<p>One of the key features of the platform is its hardware: two Tegra X1 chips that can simultaneously process video from 12 different HD cameras placed around a vehicle, each feeding information about the environment back to the system. This allows Drive PX to facilitate surround view, collision avoidance, pedestrian detection, mirror-less operation, cross-traffic monitoring and driver-state monitoring.</p>
<p>Nvidia is billing deep learning as the key differentiator for the Drive PX platform, with CEO Jen-Hsun Huang stating that the system is far more advanced than the conventional Advanced Driver Assistance Systems available today, most of which center on lane guidance and collision prevention. Drive PX will be able to train itself to react to different situations, and any actions learned in one vehicle are shared with the platform as a whole.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/nvidia-drive-px.jpg" rel="lightbox-0"><img class="alignnone size-large wp-image-50203" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/nvidia-drive-px-1920x1080.jpg" alt="nvidia-drive-px" width="1140" height="641" /></a></p>
<p>Enabling the learning experience is a deep neural network software development kit called Digits that allows the Drive PX system to understand objects in the real world and react to ever-changing scenarios. The system will be available in May to select automakers, automotive suppliers and research institutions, with the developer kit priced at $10,000.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-announces-pricing-and-a-launch-date-for-drive-px-self-driving-car-platform-at-gtc-2015/">Nvidia Announces Pricing And A Launch Date For Drive PX Self-Driving Car Platform At GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/18/nvidia-announces-pricing-and-a-launch-date-for-drive-px-self-driving-car-platform-at-gtc-2015/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Teases More Pascal Details at GTC 2015</title>
		<link>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/</link>
		<comments>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/#comments</comments>
		<pubDate>Wed, 18 Mar 2015 01:29:32 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Companies]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GTC 2015]]></category>
		<category><![CDATA[Jen-Hsun Huang]]></category>
		<category><![CDATA[Maxwell]]></category>
		<category><![CDATA[NASDAQ: NVDA]]></category>
		<category><![CDATA[Pascal]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50192</guid>
		<description><![CDATA[<p>New GPU architecture promises ten times the performance of Maxwell.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/">Nvidia Teases More Pascal Details at GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1471" height="932" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/PascalBoard-1.jpg" class="attachment-post-thumbnail wp-post-image" alt="PascalBoard (1)" /></p><p><a href="http://www.vrworld.com/tag/nvidia/">Nvidia’s</a> (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) CEO Jen-Hsun Huang gave the world another look at the GPU successor to Maxwell at its GPU Technology Conference (GTC 2015) in San Jose Tuesday.</p>
<p>Pascal was first announced as a mystery GPU between Maxwell and Volta at last year’s GTC. Tuesday’s announcement gives us the first concrete details of Pascal.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal1.png" rel="lightbox-0"><img class="aligncenter size-medium wp-image-50194" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal1-600x209.png" alt="Pascal1" width="600" height="209" /></a></p>
<p>Huang said that Pascal, which is set to arrive in 2016, would deliver a ten-fold overall average improvement over Maxwell, and a four-fold boost in mixed-precision workloads. In terms of performance per watt, it will offer twice that of Maxwell. However, he later cautioned that this was “CEO Math” and actual performance may vary.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal2.png" rel="lightbox-1"><img class="aligncenter size-medium wp-image-50193" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Pascal2-600x206.png" alt="Pascal2" width="600" height="206" /></a></p>
<p>Pascal will use a suite of new technologies, including 3D-stacked memory and NVLink (Huang says it will offer a five-fold improvement over PCI-E). It will be built on TSMC&#8217;s (<a href="http://www.google.com/finance?cid=674465">TPE: 2330</a>) 16nm FF+ (FinFET Plus, the follow-up to FinFET) process node. It will also use High Bandwidth Memory, allowing a three-fold improvement in bandwidth for its 32 GB of RAM.</p>
<p>Cards with Pascal will likely be marketed towards CUDA workstations, or perhaps as some sort of competitor to Intel’s (<a href="http://www.google.com/finance?cid=284784">NASDAQ: INTC</a>) Xeon Phi co-processors. Game developers have a tough time pushing the limits of current generation cards as it is.</p>
<p>Pricing and availability of Pascal-based cards will be announced later this year or early next year.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/">Nvidia Teases More Pascal Details at GTC 2015</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/18/nvidia-teases-more-pascal-details-at-gtc-2015/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Titan X Specs Detailed Ahead Of Official Launch</title>
		<link>http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/</link>
		<comments>http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/#comments</comments>
		<pubDate>Tue, 17 Mar 2015 14:55:30 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Game Developers Conference]]></category>
		<category><![CDATA[GM200]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[GPU Technology Conference]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan-X]]></category>
		<category><![CDATA[video card]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=50111</guid>
		<description><![CDATA[<p>Titan X offers a bandwidth of 336GB/s and an astounding 12GB video memory. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/">Nvidia Titan X Specs Detailed Ahead Of Official Launch</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1920" height="1094" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/GTXTitan_1920_11.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX Titan" /></p><p>Although Nvidia announced the <a title="Nvidia’s Titan X Is The Most Powerful GPU In The World" href="http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/" target="_blank">Titan X</a> during Epic Games&#8217; demo at the Game Developers Conference earlier this month, the chip vendor has yet to share the full specs of the video card. The card is set to be officially announced at Nvidia&#8217;s <a href="http://www.gputechconf.com/" target="_blank">GPU Technology Conference</a> later this week, but the full specifications of the GM200 silicon — which is what the card is based on — have been leaked.</p>
<p><a href="http://videocardz.com/55136/nvidia-geforce-gtx-titan-x-specifications" target="_blank">Videocardz</a> posted the block diagram of the GPU, with the fully-unlocked silicon featuring 6 Graphics Processing Clusters (GPCs), each holding 4 Maxwell Streaming Multiprocessors (SMMs), for a total of 24 SMMs, or 3072 CUDA cores. A total of 6 memory controllers, each 64 bits wide, means that the Titan X has a 384-bit memory bus. Other specs include 192 Texture Mapping Units (TMUs) and 96 Raster Operation Units (ROPs).</p>
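<p>Those totals are straightforward multiplication; a quick sanity-check sketch in Python (the 128-cores-per-SMM figure is the standard Maxwell layout, an assumption not stated in the leak itself):</p>

```python
# Titan X (GM200) totals implied by the leaked block diagram.
gpcs = 6                  # Graphics Processing Clusters
smms_per_gpc = 4          # Maxwell Streaming Multiprocessors per GPC
cores_per_smm = 128       # assumed: standard Maxwell SMM layout

total_smms = gpcs * smms_per_gpc          # 24 SMMs
cuda_cores = total_smms * cores_per_smm   # 3072 CUDA cores

memory_controllers = 6
bus_width_bits = memory_controllers * 64  # each controller is 64 bits wide

print(total_smms, cuda_cores, bus_width_bits)  # 24 3072 384
```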
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/NVIDIA-Maxwell-GM200-Block-Diagram.jpg" rel="lightbox-0"><img class="alignnone size-large wp-image-50136" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/NVIDIA-Maxwell-GM200-Block-Diagram-1557x1080.jpg" alt="NVIDIA-Maxwell-GM200-Block-Diagram" width="1140" height="791" /></a></p>
<p>In terms of clock speeds, the Titan X runs at 1002MHz base and 1089MHz boost, and will offer an astounding 12GB of GDDR5 memory at 1753MHz. While not confirmed, the video card is set to have a TDP of 250W, with one eight-pin and one six-pin PCI-Express power connector. Display connectors include three DisplayPort, one HDMI and one DVI port. <a title="First GeForce Titan X Benchmarks Appear Online" href="http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/" target="_blank">Synthetic benchmarks</a> of the card leaked last week, indicating significant gains in performance over the GTX 980.</p>
<p>Pricing of the Titan X is said to be the same as earlier-generation models, which would put it at $999. The card is slated to make its debut tomorrow, which is when we&#8217;ll have further details, as well as real-world benchmarks.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/">Nvidia Titan X Specs Detailed Ahead Of Official Launch</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/17/nvidia-titan-x-specs-detailed-ahead-of-official-launch/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>First GeForce Titan X Benchmarks Appear Online</title>
		<link>http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/</link>
		<comments>http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/#comments</comments>
		<pubDate>Thu, 12 Mar 2015 05:00:55 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GeForce Titan X]]></category>
		<category><![CDATA[NASDAQ: NVDA]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan X benchmarks]]></category>
		<category><![CDATA[Titan-X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=49767</guid>
		<description><![CDATA[<p>The full details on the card won’t be released until Nvidia’s GTC, but these leaked benchmarks reveal a lot. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/">First GeForce Titan X Benchmarks Appear Online</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="800" height="533" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/TItan-Z-Jen-Hsun-Huang.jpg" class="attachment-post-thumbnail wp-post-image" alt="TItan-Z-Jen-Hsun-Huang" /></p><p><a href="http://www.vrworld.com/tag/nvidia/">Nvidia’s</a> (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) upcoming <a href="http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/">GeForce Titan X</a> is shrouded in secrecy until its official launch later this month at the GPU Technology Conference.</p>
<p>But with any product like this there are always leaks. <a href="http://videocardz.com/55013/nvidia-geforce-gtx-titan-x-3dmark-performance"><i>Videocardz </i></a>has published benchmarks on how the card stacks up to the competition.</p>
<p>Based on these benchmarks, the Titan X appears to have 3,072 CUDA cores, 192 TMUs, 96 ROPs, a 1,002MHz core clock speed (boost is unknown, and the core clock could be subject to change) and a 1,750MHz memory clock. The memory bus is 384-bit, resulting in 336 GB/s of available bandwidth. We already know that the card will ship with 12GB of VRAM.</p>
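<p>The 336 GB/s figure is consistent with the leaked bus width and memory clock; here is a rough back-of-the-envelope check in Python (assuming GDDR5&#8217;s usual quad data rate):</p>

```python
def gddr5_bandwidth_gb_s(bus_width_bits: int, memory_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: GDDR5 transfers data four times per clock."""
    effective_mt_s = memory_clock_mhz * 4               # mega-transfers/second
    return effective_mt_s * (bus_width_bits / 8) / 1000  # bytes per transfer, then GB/s

# 384-bit bus at a 1,750MHz memory clock
print(round(gddr5_bandwidth_gb_s(384, 1750)))  # 336
```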
<p>Here’s a breakdown of the results:</p>
<ul>
<li>22903 points in 3DMark 11 Performance, a 34.8% boost over a reference GeForce GTX 980</li>
<li>7427 points in 3DMark 11 Extreme, a 39.9% boost over a reference GeForce GTX 980</li>
<li>17470 points in 3DMark Fire Strike, a 36% boost over a reference GeForce GTX 980</li>
</ul>
<p>The important thing to remember is that this performance is subject to change. The card’s drivers are likely not yet finalized, thus these numbers from synthetic benchmarks are not representative of the final performance of the card.</p>
<p>The card&#8217;s exact availability will be announced at GTC later this month. The big sticking point for the card will be its pricing. A competitively priced card will make it a force to be reckoned with.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/">First GeForce Titan X Benchmarks Appear Online</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/12/first-geforce-titan-x-benchmarks-appear-online/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Take a Good Look at the Nvidia Titan X</title>
		<link>http://www.vrworld.com/2015/03/09/take-good-look-nvidia-titan-x/</link>
		<comments>http://www.vrworld.com/2015/03/09/take-good-look-nvidia-titan-x/#comments</comments>
		<pubDate>Mon, 09 Mar 2015 11:00:26 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GeForce Titan X]]></category>
		<category><![CDATA[GPU Technology Conference]]></category>
		<category><![CDATA[GTC]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan-X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=49524</guid>
		<description><![CDATA[<p>Not much is known about the specs of the Titan X, but these photos reveal a bit more about Nvidia’s upcoming card. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/09/take-good-look-nvidia-titan-x/">Take a Good Look at the Nvidia Titan X</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="2000" height="1476" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/Nvidia-Logo1.png" class="attachment-post-thumbnail wp-post-image" alt="Nvidia GPU Logo" /></p><p>Last week at the <a href="http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/">Game Developers Conference in San Francisco</a>, <a href="http://www.vrworld.com/tag/nvidia/">Nvidia</a> (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) gave us a first glimpse at its upcoming Titan X card.</p>
<p>Not much is known about the card as the exact specs will be revealed during its launch at the upcoming Nvidia GPU Technology Conference taking place later this month. We do know, however, that its GPU will feature 12GB VRAM, the transistor count will be upped from 7 billion to 8 billion, and the memory bus interface is set to 384-bit.</p>
<p><a href="http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_we_touched_it_2015#slide-2"><i>Maximum PC</i></a> was able to get its hands on the Titan X and posted a few close-up pictures of it. While the magazine wasn’t able to reveal anything more due to an NDA, we do see that the card has both 8-pin and 6-pin power connectors as well as a bevy of output ports: three DisplayPorts, one HDMI, and one DVI.<br />
<img class="aligncenter wp-image-49526 size-medium" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/29b-1-600x534.jpg" alt="titan-x-2" width="600" height="534" /><img class="aligncenter wp-image-49525 size-medium" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/29a-600x494.jpg" alt="titan-x-3" width="600" height="494" />As the Titan X will be based on Nvidia&#8217;s Maxwell architecture, many are looking forward to seeing the card in benchmarking action. With specs that are sure to be much more impressive than anything the competition can offer, the card should be able to easily chew through 4K and VR gaming &#8212; both are very high-intensity applications.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/29d.jpg" rel="lightbox-0"><img class="aligncenter wp-image-49527 size-medium" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/29d-600x489.jpg" alt="titan-x-1" width="600" height="489" /></a></p>
<p>Nvidia&#8217;s last attempt at a Titan, the Titan Z, was riddled with problems. Originally announced at the GTC 2014 conference, the card was long delayed, and when it eventually came out it was not competitive with the Radeon R9 295X2 on price or performance. While it was really intended for CUDA programming as opposed to gaming, the Titan Z did eventually find a win inside Alienware&#8217;s Aurora gaming rig.</p>
<p>More details including pricing and availability will be available March 17-20 at the GPU Technology Conference.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/09/take-good-look-nvidia-titan-x/">Take a Good Look at the Nvidia Titan X</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/09/take-good-look-nvidia-titan-x/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia&#8217;s Titan X Is The Most Powerful GPU In The World</title>
		<link>http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/</link>
		<comments>http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/#comments</comments>
		<pubDate>Thu, 05 Mar 2015 09:47:04 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[2015]]></category>
		<category><![CDATA[Game Developer Conference (GDC)]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GeForce Titan X]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Titan-X]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=49154</guid>
		<description><![CDATA[<p>The Titan X features an astounding 12GB VRAM. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/">Nvidia&#8217;s Titan X Is The Most Powerful GPU In The World</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="668" height="258" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/FeatureImage-geforce-gtx-6601.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX 880" /></p><p>During Epic Games&#8217; Unreal Engine 4 session at the Game Developers Conference, Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=RyD4VMqmFYfkugTwl4GQCg" target="_blank">NASDAQ:NVDA</a>) CEO Jen-Hsun Huang <a href="http://blogs.nvidia.com/blog/2015/03/04/smaug/" target="_blank">announced</a> the vendor&#8217;s latest GPU in the Titan family, the Titan X. Huang hasn&#8217;t shared hardware details of the card, but we do know that the GPU will feature 12GB VRAM, twice as much as the first-generation Titan.</p>
<p>Also revealed was the transistor count, which sees an increase from 7 billion to 8 billion. The memory bus interface is at 384-bit, the same as the earlier iterations in the Titan series. TDP figures were not mentioned either, although considering that the card is based on the Maxwell architecture, it is likely we&#8217;ll see significant improvements in power draw. Nvidia is retaining the metal cooler and shroud, with the manufacturing process of the card also set to remain at 28nm.</p>
<p>The idea behind unveiling the Titan X during GDC was to highlight the graphics capabilities of the card, which was used to drive Weta Digital&#8217;s (the studio behind <em>The Hobbit</em> films) “Thief in the Shadows” VR demo running on Oculus&#8217; VR headset. Nvidia will reveal all the features on offer with the Titan X at its GPU Technology Conference, which is being held later this month in San Jose.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/">Nvidia&#8217;s Titan X Is The Most Powerful GPU In The World</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/05/nvidias-titan-x-powerful-gpu-world/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia&#8217;s New Shield Console Leads The Charge In Next-Gen Android Gaming</title>
		<link>http://www.vrworld.com/2015/03/05/nvidia-shield-console-gdc-2015/</link>
		<comments>http://www.vrworld.com/2015/03/05/nvidia-shield-console-gdc-2015/#comments</comments>
		<pubDate>Thu, 05 Mar 2015 03:48:30 +0000</pubDate>
		<dc:creator><![CDATA[Derek Strickland]]></dc:creator>
				<category><![CDATA[2015]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Game Developer Conference (GDC)]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Android TV]]></category>
		<category><![CDATA[GDC 2015]]></category>
		<category><![CDATA[Microconsole]]></category>
		<category><![CDATA[NASDAQ: NVDA]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=49077</guid>
		<description><![CDATA[<p>The new Nvidia Shield console marries the worlds of "PC-quality gaming" with 4K video streaming all in a sleek, high-performance package.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/05/nvidia-shield-console-gdc-2015/">Nvidia&#8217;s New Shield Console Leads The Charge In Next-Gen Android Gaming</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="843" height="565" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-Shield-COnsole-3.png" class="attachment-post-thumbnail wp-post-image" alt="Nvidia Shield COnsole 3" /></p><p>At GDC 2015 Nvidia (<a href="https://www.google.com/finance?cid=662925" target="_blank"><strong>NASDAQ: NVDA</strong></a>) took the stage to announce the <a href="http://shield.nvidia.com/store/console" target="_blank">Nvidia Shield console</a>, the &#8220;world&#8217;s first 4K Android TV&#8221; set-top box with a beefy emphasis on gaming.</p>
<p>Set in a sleek and angular form, the Nvidia Shield console takes aim at the consumer&#8217;s living room space by providing 4K-ready TV streaming as well as high-performance gaming. Essentially it aims to marry the best of both worlds with 4K HD video and &#8220;PC-like&#8221; game performance all at a reasonable cost.</p>
<p>While the new box does multimedia streaming, Nvidia makes it clear that it&#8217;s primarily aimed at gamers.</p>
<p><img class=" size-medium wp-image-49085 aligncenter" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-shield-microconsole-600x438.png" alt="Nvidia shield microconsole" width="600" height="438" /></p>
<p>At the GDC reveal Nvidia CEO Jen-Hsun Huang outright claimed that the new Shield console is &#8220;35 times more powerful than the next set-top box&#8221;, and even went as far as to say it delivers twice the performance of an Xbox 360.</p>
<p>Harnessing the power of its new Nvidia Tegra X1 &#8220;mobile superchip&#8221;, the Nvidia Shield will run a number of demanding Android-ported games like <em>Crysis 3</em>, <em>Doom 3: BFG Edition</em>, <em>Borderlands: The Pre-Sequel</em>, <em>Metal Gear Rising: Revengeance</em> and more.</p>
<p>Huang affirmed that over 50 games would be available upon launch.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Tegra-X1-Processor.jpg" rel="lightbox-0"><img class=" size-medium wp-image-49094 aligncenter" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Tegra-X1-Processor-600x352.jpg" alt="Tegra X1 Processor" width="600" height="352" /></a></p>
<p>The Tegra X1 is only half of the equation, though; the other half leverages the power of the cloud via the supercomputer that runs Nvidia&#8217;s new <a href="http://www.theverge.com/2015/3/3/8146065/nvidia-grid-1080p-game-streaming" target="_blank">GRID subscription-based game streaming service</a>.</p>
<p>What&#8217;s unique about the GRID service is that subscribers can remote-play games that are streamed from a faraway supercomputer. It&#8217;s a bit like the PlayStation 4&#8217;s Remote Play with the PS Vita handheld, only on a bigger and much more powerful scale.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Shield-Games.jpg" rel="lightbox-1"><img class=" size-medium wp-image-49088 aligncenter" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Shield-Games-600x352.jpg" alt="Shield Games" width="600" height="352" /></a></p>
<p>And the GRID service will have major AAA games like <em>The Witcher 3: Wild Hunt,</em> <em>Batman: Arkham Knight</em>, <em>Metal Gear Solid V: Ground Zeroes</em> and many, many more.</p>
<p>Nvidia has strongly affirmed that the Shield microconsole will be the portal that leads to the &#8220;next generation of Android gaming&#8221;, promising &#8220;PC-quality graphics&#8221; and fluid performance.</p>
<p>For those who prefer marathon gaming sessions, Huang promised that the Shield&#8217;s controllers would have at least 30 hours of life&#8211;which is a nice touch to be sure.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Android-TV.png" rel="lightbox-2"><img class=" size-medium wp-image-49102 aligncenter" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Android-TV-600x243.png" alt="Android TV" width="600" height="243" /></a></p>
<p>As for video streaming, the set-top box comes with a nifty remote with a built-in microphone for easy hands-free searching. As far as 4K TV goes, the Shield can stream local and internet-based 4K video over a Gigabit Ethernet connection and output it through HDMI.</p>
<p>It&#8217;s light on internal storage, with just 16GB on board, but it&#8217;s expandable via microSD and sports two USB 3.0 ports for external HDDs. Considering how big 4K video files are, you&#8217;re going to need every bit of extra space you can get.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-Shield-Console-Android-TV.jpg" rel="lightbox-3"><img class=" size-medium wp-image-49086 aligncenter" src="http://cdn.vrworld.com/wp-content/uploads/2015/03/Nvidia-Shield-Console-Android-TV-600x300.jpg" alt="Nvidia Shield Console Android TV" width="600" height="300" /></a></p>
<p>With a number of peripherals all vying for control of the living room, it&#8217;s interesting to see Nvidia embrace a kind of Xbox One-style approach to the next generation of Android TV microconsoles.</p>
<p>In terms of power, performance and functionality, the Nvidia Shield console is attempting to find the sweet spot between gaming consoles and PCs, but only time will tell whether it hits the mark.</p>
<p>The Shield console is slated to release in <strong>May 2015</strong> for a price point of $200, and will come with the set-top box and a controller, plus all the applicable cords and hook-ups.</p>
<p>Below we have a full spec sheet as well as compatibility and other features.</p>
<table class="table-specs-inner">
<tbody>
<tr>
<td>Processor</td>
<td>NVIDIA® Tegra® X1 processor<br />
256-core NVIDIA Maxwell™ GPU with 3 GB RAM</td>
</tr>
<tr>
<td>Video Features</td>
<td>4K Ultra-HD ready with 4K playback and capture up to 60 fps (VP9, H265, H264)</td>
</tr>
<tr>
<td>Audio</td>
<td>7.1 and 5.1 surround sound pass through over HDMI<br />
High-resolution audio up-sample to 24-bit/192 kHz over USB<br />
High-resolution audio playback up to 24-bit/192 kHz over HDMI and USB</td>
</tr>
<tr>
<td>Storage</td>
<td>16 GB</td>
</tr>
<tr>
<td>Wireless</td>
<td>802.11ac 2&#215;2 MIMO 2.4 GHz and 5 GHz Wi-Fi<br />
Bluetooth 4.1/BLE</td>
</tr>
<tr>
<td>Interfaces</td>
<td>Gigabit Ethernet<br />
HDMI 2.0<br />
Two USB 3.0 (Type A)<br />
Micro-USB 2.0<br />
MicroSD slot<br />
IR receiver (compatible with Logitech Harmony)</td>
</tr>
</tbody>
</table>
<p><strong>Gaming Features</strong></p>
<p>SHIELD controller compatible<br />
<a href="http://shield.nvidia.com/game-stream">NVIDIA GameStream</a><br />
<a href="http://shield.nvidia.com/share">NVIDIA Share</a><br />
<a href="http://shield.nvidia.com/grid-game-streaming">NVIDIA GRID game streaming service</a></p>
<p><strong>Software Updates:</strong> SHIELD software upgrades directly from NVIDIA</p>
<p><strong>Power:</strong> 40W power adapter</p>
<p><strong>Weight and Size:</strong> Weight: 23oz / 654g, Height: 5.1in / 130mm, Width: 8.3in / 210mm, Depth: 1.0in / 25mm</p>
<p><strong>Operating System:</strong> Android TV™, Google Cast™ Ready</p>
<p><strong>Included Apps:</strong> PLEX</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/03/05/nvidia-shield-console-gdc-2015/">Nvidia&#8217;s New Shield Console Leads The Charge In Next-Gen Android Gaming</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/03/05/nvidia-shield-console-gdc-2015/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Your Next Gaming PC Might Have a Radeon and Nvidia GPU</title>
		<link>http://www.vrworld.com/2015/02/26/next-gaming-pc-will-might-radeon-nvidia-gpu/</link>
		<comments>http://www.vrworld.com/2015/02/26/next-gaming-pc-will-might-radeon-nvidia-gpu/#comments</comments>
		<pubDate>Thu, 26 Feb 2015 03:10:14 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Direct X 12]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=47915</guid>
		<description><![CDATA[<p>According to one report, DirectX 12 will support Explicit Asynchronous Multi-GPU capabilities across different manufacturers’ GPUs. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/26/next-gaming-pc-will-might-radeon-nvidia-gpu/">Your Next Gaming PC Might Have a Radeon and Nvidia GPU</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="580" height="388" src="http://cdn.vrworld.com/wp-content/uploads/2014/10/directx-12-logo-100251209-large.png" class="attachment-post-thumbnail wp-post-image" alt="directx-12-logo-100251209-large" /></p><p>Microsoft’s (<a href="http://www.google.com/finance?cid=358464">NASDAQ: MSFT</a>) upcoming DirectX 12 has a ton of exciting features and optimizations that we already know about. However, the most interesting new feature comes via a report from <a href="http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html"><i>Tom’s Hardware</i></a>: the ability to pool resources from multiple GPUs from different manufacturers.</p>
<p>The ability to work with two different GPU architectures in the same system comes from DirectX 12’s support of something called Explicit Asynchronous Multi-GPU capabilities and Split Frame Rendering (SFR). Effectively this pools all GPU resources into a bucket and allows them to be utilized as one.</p>
<p>SFR allows developers to manually divide the data for each frame between the two GPUs, so that the cards work together on every frame. The older method, Alternate Frame Rendering (AFR), is considered less efficient because it requires both GPUs to hold all of the data in their frame buffers, with each GPU rendering an alternate frame. The downside is that the cards work in parallel rather than pooling their memory: if you had two cards with 4GB of memory each, you’d still only have 4GB of usable space in the frame buffer in total.</p>
<p>The kicker is that <i>Tom’s</i> source said that SFR will be supported across multiple GPU architectures in the same system. It will treat both GPUs as one. Reportedly this would allow a system with an AMD (<a href="http://www.google.com/finance?cid=327">NASDAQ: AMD</a>) APU and an Nvidia (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) GPU to have the two GPUs work together. The same would go for a discrete AMD and Nvidia GPU in the same system.</p>
<p>In the end this will be up to developers to make use of and optimize their code accordingly. AMD’s Mantle already supports a form of SFR, so in a way the precedent is already there.</p>
<p>More details on DirectX 12 &#8212; and a confirmation of this report &#8212; will likely be available during March’s Game Developer Conference in San Francisco.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/26/next-gaming-pc-will-might-radeon-nvidia-gpu/">Your Next Gaming PC Might Have a Radeon and Nvidia GPU</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/26/next-gaming-pc-will-might-radeon-nvidia-gpu/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia CEO Comes Clean on GTX 970: ‘We’ll Do A Better Job Next Time’</title>
		<link>http://www.vrworld.com/2015/02/25/nvidia-ceo-comes-clean-gtx-970-well-better-job-next-time/</link>
		<comments>http://www.vrworld.com/2015/02/25/nvidia-ceo-comes-clean-gtx-970-well-better-job-next-time/#comments</comments>
		<pubDate>Wed, 25 Feb 2015 04:12:50 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GTX 970]]></category>
		<category><![CDATA[Jen-Hsun Huang]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=47845</guid>
		<description><![CDATA[<p>Jen-Hsun Huang explains his side of the story. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/25/nvidia-ceo-comes-clean-gtx-970-well-better-job-next-time/">Nvidia CEO Comes Clean on GTX 970: ‘We’ll Do A Better Job Next Time’</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1383" height="1080" src="http://cdn.vrworld.com/wp-content/uploads/2015/01/Nvidia-GTX-970.jpg" class="attachment-post-thumbnail wp-post-image" alt="Nvidia GTX 970" /></p><p>Nvidia’s (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) CEO Jen-Hsun Huang has posted an open letter on his <a href="http://blogs.nvidia.com/blog/2015/02/24/gtx-970/">company’s blog</a> addressing concerns over the GTX 970.</p>
<p>Nvidia and its add-in-board partners have faced a rough month because of the GTX 970 and its lack of advertised memory. <a href="http://www.vrworld.com/2015/01/26/nvidia-denies-design-flaws-gtx-970-vram-allocation-issue/">Initially</a>, in late January, Nvidia denied any problems with the card; however, <a href="http://www.vrworld.com/2015/01/30/nvidia-admits-error-specifications-gtx-970-will-help-refunds/">a few days later</a> the company issued another press statement acknowledging that it didn’t communicate the specifications correctly, and offered a driver update.</p>
<p>“We invented a new memory architecture in Maxwell. This new capability was created so that reduced-configurations of Maxwell can have a larger framebuffer – i.e., so that GTX 970 is not limited to 3GB, and can have an additional 1GB. GTX 970 is a 4GB card,” Huang wrote. “However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth. This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment.”</p>
<p>Huang blamed a lack of proper communication with the press and reviewers for the confusion.</p>
<p>“Instead of being excited that we invented a way to increase memory of the GTX 970 from 3GB to 4GB, some were disappointed that we didn&#8217;t better describe the segmented nature of the architecture for that last 1GB of memory,” he wrote.</p>
<p>“This new feature of Maxwell should have been clearly detailed from the beginning.”</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/25/nvidia-ceo-comes-clean-gtx-970-well-better-job-next-time/">Nvidia CEO Comes Clean on GTX 970: ‘We’ll Do A Better Job Next Time’</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/25/nvidia-ceo-comes-clean-gtx-970-well-better-job-next-time/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>GTX 970 Owners In India Left To Fend For Themselves</title>
		<link>http://www.vrworld.com/2015/02/23/gtx-970-owners-india-left-fend/</link>
		<comments>http://www.vrworld.com/2015/02/23/gtx-970-owners-india-left-fend/#comments</comments>
		<pubDate>Mon, 23 Feb 2015 12:06:23 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Asia Pacific (APAC)]]></category>
		<category><![CDATA[Global Politics]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GTX 970]]></category>
		<category><![CDATA[India]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[video cards]]></category>
		<category><![CDATA[Zotac]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=47671</guid>
		<description><![CDATA[<p>Indian users with the GTX 970 are left without any options as Nvidia's partners fail to acknowledge issues with the card. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/23/gtx-970-owners-india-left-fend/">GTX 970 Owners In India Left To Fend For Themselves</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="2480" height="1890" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/Zotac-GTX-970-Omega-Edition-.jpg" class="attachment-post-thumbnail wp-post-image" alt="Zotac GTX 970 Omega Edition" /></p><p>Nvidia (<a href="https://www.google.com/finance?q=nvda&amp;ei=LRXrVMG7AaqxiQKC9YGwCQ" target="_blank">NASDAQ:NVDA</a>) is under fire for incorrectly listing the specs of the GTX 970, and while the manufacturer has announced that it will provide a way for users looking to return the card to do so, that does not seem to extend to a majority of Nvidia&#8217;s partners around the globe.</p>
<p>Major retailers such as <a href="http://www.reddit.com/r/hardware/comments/2w4g58/amazon_issuing_20_refunds_to_customers_who_bought/" target="_blank">Amazon and NewEgg</a> are offering refunds to any user looking to return the GTX 970, but in countries like India, customers are left to fend for themselves as retailers are refusing to process returns.</p>
<p>The <a href="http://www.vrworld.com/2015/01/30/nvidia-admits-error-specifications-gtx-970-will-help-refunds/">issue with the GTX 970</a> is that Nvidia implemented a new memory segmentation scheme in the card, which means the 4GB of available memory is sectioned into a 3.5GB segment and a 512MB segment. The memory controllers in the 3.5GB segment have direct access to the L2 cache and deliver an advertised bandwidth of 196GB/s, but the last 512MB has to interface with another memory controller to access the cache, leading to a drastic reduction in bandwidth to 28GB/s.</p>
<p>The situation is more pronounced when a user has two or more GTX 970 cards in SLI, with a noticeable decrease in framerates when gaming at higher resolutions. Indian hardware site <em><a href="http://www.hardwarebbq.com/community/threads/issues-with-gtx-970.1011/" target="_blank">HardwareBBQ</a> </em>posted the correspondence of a user facing such issues with the GTX 970 in SLI, detailing his failed attempts to get a refund for the card. The conversation starts off with the user talking to an Nvidia engineer on the GeForce forums, after which he is redirected to the card vendor (in this case Zotac India) to initiate the refund process.</p>
<p class="p1"><span class="s1">However, the e-mail communique results in the technical manager for Zotac India, Swarn Singh, refusing to provide a refund as the GTX 970 was delivered &#8220;as it is written on the box with all specification.&#8221; The user was redirected to contact Nvidia, with Singh stating that a refund would be entertained only if the card suffered from any warranty-related issues. A discussion with an Nvidia engineer based out of India by the name of Rajaram also fails to yield any results, with the engineer stating that it is not &#8220;possible to suffer lag (microstutters) when the FPS is high&#8221; on the GTX 970.</span></p>
<p class="p1"><span class="s1">Zotac India&#8217;s refusal to acknowledge that an issue exists with the GTX 970 is alarming, considering that the brand is one of several vendors processing returns on a global level. The situation is exacerbated by the fact that video cards often carry a markup of 30 to 35% in the country.</span></p>
<p><span style="line-height: 1.5;">Furthermore, Nvidia claimed that it would assist users that were unable to claim a return for their GTX 970, but the conversations with retailers and <a href="http://www.erodov.com/forums/no-refund-india-gtx-970-cards/77399.html" target="_blank">discussions on message boards</a> in India show that the chip manufacturer is not offering a solution to users who purchased the card in the country.</span></p>
<p>With card vendors like Zotac directing users back to Nvidia and the chip vendor not providing any recourse regarding the issue, Indian users are largely left in the cold when it comes to dealing with returns or refunds for the beleaguered GTX 970.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/23/gtx-970-owners-india-left-fend/">GTX 970 Owners In India Left To Fend For Themselves</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/23/gtx-970-owners-india-left-fend/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>What’s Nvidia Planning for March 3?</title>
		<link>http://www.vrworld.com/2015/02/11/whats-nvidia-planning-march-3/</link>
		<comments>http://www.vrworld.com/2015/02/11/whats-nvidia-planning-march-3/#comments</comments>
		<pubDate>Wed, 11 Feb 2015 06:45:26 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Tegra]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=47039</guid>
		<description><![CDATA[<p> Something that the company spent a half-decade working on gets its debut in March. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/11/whats-nvidia-planning-march-3/">What’s Nvidia Planning for March 3?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="2000" height="1476" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/Nvidia-Logo1.png" class="attachment-post-thumbnail wp-post-image" alt="Nvidia GPU Logo" /></p><p>Nvidia (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) has been sending out invitations to journalists for a product launch in San Francisco, scheduled for March 3, that will “redefine the future of gaming.”</p>
<p>Here&#8217;s the invitation below, via <em>Android Police</em>.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2015/02/nexus2cee_invite_thumb.png" rel="lightbox-0"><img class="aligncenter size-medium wp-image-47040" src="http://cdn.vrworld.com/wp-content/uploads/2015/02/nexus2cee_invite_thumb-600x592.png" alt="nexus2cee_invite_thumb" width="600" height="592" /></a></p>
<p>It’s anyone’s guess as to what exactly Nvidia will be showing off. The date falls right in the middle of Mobile World Congress 2015, so it’s doubtful that it would be something mobile-related. Rumors of the mystery device being a follow-up to the Shield 2 are also bunk, as it doesn’t fit the five-years-in-the-making narrative. The virtual reality market is crowded, and it would make more sense for Nvidia to cooperate with Oculus rather than compete against it. So what else could it be? An Nvidia-powered console?</p>
<p>Nvidia is completely mum on the issue, and there have been no definite leaks to point in any direction. Until March 3, it will be a mystery.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/11/whats-nvidia-planning-march-3/">What’s Nvidia Planning for March 3?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/11/whats-nvidia-planning-march-3/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Aiming To Roll Out Shield Tablet With Tegra X1 Shortly</title>
		<link>http://www.vrworld.com/2015/02/06/nvidia-aiming-roll-shield-tablet-tegra-x1-shortly/</link>
		<comments>http://www.vrworld.com/2015/02/06/nvidia-aiming-roll-shield-tablet-tegra-x1-shortly/#comments</comments>
		<pubDate>Fri, 06 Feb 2015 08:12:34 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Shield Tablet]]></category>
		<category><![CDATA[tegra x1]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=46661</guid>
		<description><![CDATA[<p>The 2015 Shield Tablet will feature Nvidia's recently announced Tegra X1. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/06/nvidia-aiming-roll-shield-tablet-tegra-x1-shortly/">Nvidia Aiming To Roll Out Shield Tablet With Tegra X1 Shortly</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1456" height="970" src="http://cdn.vrworld.com/wp-content/uploads/2015/01/Tegra-X1.jpg" class="attachment-post-thumbnail wp-post-image" alt="Tegra X1" /></p><p>After rolling out the Shield Tablet last year, it looks like Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=mlvUVNSpIdPaqQHXrYHwAg" target="_blank">NASDAQ:NVDA</a>) is set to announce a 2015 refresh of the device featuring the vendor&#8217;s recently announced Tegra X1. The vendor has not shared a timeline as to when the tablet will be made available, but <a href="http://www.fudzilla.com/news/mobile/36937-shield-tegra-x1-coming-soon" target="_blank">recent rumors</a> suggest that we&#8217;ll know more about the tablet at Nvidia&#8217;s GPU Technology Conference, which is scheduled for March 17.</p>
<p>The Tegra X1 is billed as a super chip, and features a 64-bit CPU in an octa-core configuration: four Cortex A57 cores and four Cortex A53 cores. The Cortex A57 cluster has a 2MB L2 cache shared across all four cores, with a 32KB L1 data cache and a 48KB L1 instruction cache per core. The Cortex A53 cluster has a 512KB L2 cache shared by all four cores, with 32KB L1 data and 32KB L1 instruction caches per core.</p>
<p>Nvidia has stuck with stock Cortex cores (unlike the Denver variant of the Tegra K1) mainly due to time-to-market reasons, and as such we may get to see a custom-core configuration at a later stage this year. The Tegra X1 utilizes the big.LITTLE architecture, but Nvidia has mentioned that it is using its own interconnect in lieu of ARM&#8217;s CCI-400 interconnect, and cluster migration rather than global task scheduling, through which it is able to activate all eight cores at once.</p>
<p>On the GPU side, the Tegra X1 brings Nvidia&#8217;s Maxwell architecture to mobile devices. Boasting 256 CUDA cores, the GPU in the Tegra X1 is claimed to be twice as fast as that of the Tegra K1, at half the energy consumption. The Tegra X1 can drive 4K screens, and features an H.265 hardware decoder.</p>
<p>As for the tablet that will feature the Tegra X1, the 2015 Shield Tablet is rumored to retain the 8-inch form factor, with two configurations on offer: a Wi-Fi-only variant and an LTE model. We&#8217;ll know more in March.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/06/nvidia-aiming-roll-shield-tablet-tegra-x1-shortly/">Nvidia Aiming To Roll Out Shield Tablet With Tegra X1 Shortly</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/06/nvidia-aiming-roll-shield-tablet-tegra-x1-shortly/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Analyst Says Nvidia-Samsung Alliance Possible for 14 nm</title>
		<link>http://www.vrworld.com/2015/02/05/analyst-says-nvidia-samsung-alliance-possible-14-nm/</link>
		<comments>http://www.vrworld.com/2015/02/05/analyst-says-nvidia-samsung-alliance-possible-14-nm/#comments</comments>
		<pubDate>Thu, 05 Feb 2015 04:07:40 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[samsung]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=46602</guid>
		<description><![CDATA[<p>Despite ongoing legal battles, the two will work together to produce new chips. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/05/analyst-says-nvidia-samsung-alliance-possible-14-nm/">Analyst Says Nvidia-Samsung Alliance Possible for 14 nm</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="600" height="399" src="http://cdn.vrworld.com/wp-content/uploads/2015/02/29222413.jpg" class="attachment-post-thumbnail wp-post-image" alt="29222413" /></p><p>Rivals in the technology industry &#8212; even those locked in legal battle &#8212; have a history of working together, and according to <a href="http://www.businesskorea.co.kr/article/8864/leading-ap-production-samsung-expected-lead-ap-production-supplying-qualcomm-apple">one report</a> from Korea, Samsung (<a href="http://www.google.com/finance?cid=151610035517112">KRX:005930</a>) and Nvidia (<a href="http://www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) will cooperate to manufacture Nvidia’s new GPUs on Samsung’s 14nm process.</p>
<p>The report stems from a claim by analyst Park Yu-ak of Meritz Securities, who says that Samsung’s semiconductor business will begin mass-supplying chips for the likes of Apple (<a href="http://www.google.com/finance?cid=22144">NASDAQ: AAPL</a>), Qualcomm (<a href="http://www.google.com/finance?cid=656142">NASDAQ: QCOM</a>), and Nvidia in the second quarter of this year. The exact extent of Samsung’s cooperation with Apple and Qualcomm is unclear; Taiwan’s TSMC (<a href="http://www.google.com/finance?cid=674465">TPE: 2330</a>) is also said to be working with Apple on production of its next-generation chips.</p>
<p>The reports of this partnership seem deeply ironic, as Nvidia and Samsung are locked in a legal battle over the very thing the two might be working together to produce. In September, Nvidia sued Samsung and Qualcomm <a href="http://www.vrworld.com/2014/09/04/nvidia-sues-samsung-qualcomm-patent-infringement/">over allegations</a> of patent infringement relating to its GPU technology. In turn, Samsung countersued and pushed the US International Trade Commission to block the sale of Nvidia’s desktop GPU products in the US.</p>
<p>A desktop GPU produced on the 14nm process is unlikely, so the likely product is the Tegra X1. Tegra doesn’t have a great track record of consistent hardware wins, so the volume Samsung would be producing is anyone’s guess.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/05/analyst-says-nvidia-samsung-alliance-possible-14-nm/">Analyst Says Nvidia-Samsung Alliance Possible for 14 nm</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/05/analyst-says-nvidia-samsung-alliance-possible-14-nm/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>MSI Marks 100 Million Nvidia GeForce Card Sales With Limited Edition GTX 970 And GTX 960</title>
		<link>http://www.vrworld.com/2015/01/30/msi-marks-100-million-nvidia-geforce-card-sales-limited-edition-gtx-970-gtx-960/</link>
		<comments>http://www.vrworld.com/2015/01/30/msi-marks-100-million-nvidia-geforce-card-sales-limited-edition-gtx-970-gtx-960/#comments</comments>
		<pubDate>Fri, 30 Jan 2015 03:58:31 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[GTX 960]]></category>
		<category><![CDATA[GTX 970]]></category>
		<category><![CDATA[Limited edition]]></category>
		<category><![CDATA[MSI]]></category>
		<category><![CDATA[Nvidia]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=46153</guid>
		<description><![CDATA[<p>MSI's latest limited edition cards feature Nvidia's bright green color scheme. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/01/30/msi-marks-100-million-nvidia-geforce-card-sales-limited-edition-gtx-970-gtx-960/">MSI Marks 100 Million Nvidia GeForce Card Sales With Limited Edition GTX 970 And GTX 960</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="825" height="652" src="http://cdn.vrworld.com/wp-content/uploads/2015/01/GTX-970-limited-edition.jpg" class="attachment-post-thumbnail wp-post-image" alt="GTX 970 limited edition" /></p><p>At CES, MSI (<a href="http://www.google.com/finance/company_news?q=TPE:2377">TPE: 2377</a>) announced that it had sold 100 million Nvidia (<a href="https://www.google.com/finance?q=nvidia&amp;ei=i5rKVLuEMK-ziALz6oDgCw" target="_blank">NASDAQ:NVDA</a>) video cards. To commemorate the milestone, the vendor is launching limited edition GTX 970 and GTX 960 cards with a bright green color scheme that matches Nvidia&#8217;s logo.</p>
<p>Both cards feature MSI&#8217;s Twin Frozr V design, and come with technologies such as Zero Frozr – which aims to eliminate noise by disabling the fans when not under load. There are also dual 100mm Torx fans, as well as a custom backplate designed to improve structural rigidity. Along with the green color scheme, MSI has stated that it will include a &#8220;gift for dedicated MSI fans&#8221; in the box with each video card.</p>
<p>The GTX 970 is based on the GM204 GPU and has a core clock of 1,140 MHz with a boost clock of 1,279 MHz. There&#8217;s 4GB GDDR5 video memory clocked at 7.0 GHz, and connectivity includes DL-DVI-D, DL-DVI-I, HDMI and DisplayPort 1.2.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2015/01/GTX-960-limited-edition.jpg" rel="lightbox-0"><img class="alignnone size-full wp-image-46168" src="http://cdn.vrworld.com/wp-content/uploads/2015/01/GTX-960-limited-edition.jpg" alt="GTX 960 limited edition" width="818" height="669" /></a></p>
<p>MSI&#8217;s limited edition GTX 960 has the GM206 GPU, which features a much smaller die size aimed at maximizing energy efficiency. The card&#8217;s core clock is 1,216 MHz, with boost clock at 1,279 MHz. Video memory is at 2GB, and connectivity is in the form of DL-DVI-I, HDMI and 3x DisplayPort 1.2.</p>
<p>These aren&#8217;t the first limited edition GTX 900 series cards MSI has launched, as the vendor <a title="MSI Announces The GTX 970 Gaming 4G Golden Edition" href="http://www.vrworld.com/2014/11/17/msi-announces-gtx-970-gaming-4g-golden-edition/" target="_blank">introduced the golden</a> GTX 970 in November 2014. As is often the case with such editions, the 100 million commemorative editions of the GTX 970 and GTX 960 will likely command a premium over the standard cards.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/01/30/msi-marks-100-million-nvidia-geforce-card-sales-limited-edition-gtx-970-gtx-960/">MSI Marks 100 Million Nvidia GeForce Card Sales With Limited Edition GTX 970 And GTX 960</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/01/30/msi-marks-100-million-nvidia-geforce-card-sales-limited-edition-gtx-970-gtx-960/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
	</channel>
</rss>
