<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; GeForce GTX</title>
	<atom:link href="http://www.vrworld.com/tag/geforce-gtx/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:54:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>GeForce GTX 980 Review: More Performance at Lower Power</title>
		<link>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/</link>
		<comments>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/#comments</comments>
		<pubDate>Fri, 19 Sep 2014 02:30:21 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Reviews]]></category>
		<category><![CDATA[256 Bit]]></category>
		<category><![CDATA[290]]></category>
		<category><![CDATA[290X]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[AA]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[API]]></category>
		<category><![CDATA[Asynchronous Warp]]></category>
		<category><![CDATA[bus]]></category>
		<category><![CDATA[DirectX 12]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DSR]]></category>
		<category><![CDATA[DX 11.3]]></category>
		<category><![CDATA[DX12]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GeForce GTX 980]]></category>
		<category><![CDATA[Global Illumination]]></category>
		<category><![CDATA[Graphics Card]]></category>
		<category><![CDATA[GTX 980]]></category>
		<category><![CDATA[GTX 980 Review]]></category>
		<category><![CDATA[GTX980]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[price]]></category>
		<category><![CDATA[R9 290]]></category>
		<category><![CDATA[R9 290X]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Supersampling]]></category>
		<category><![CDATA[Voxel]]></category>
		<category><![CDATA[Voxel Global Illumination]]></category>
		<category><![CDATA[VXGI]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=38897</guid>
		<description><![CDATA[<p>The Nvidia GeForce GTX 980 is Nvidia&#8217;s latest and greatest graphics card featuring the company&#8217;s new Maxwell GPU architecture. Nvidia claims that Maxwell is able to ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/">GeForce GTX 980 Review: More Performance at Lower Power</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="980" height="452" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front.jpg" class="attachment-post-thumbnail wp-post-image" alt="NVIDIA GeForce GTX 980" /></p><p>The Nvidia GeForce GTX 980 is Nvidia&#8217;s latest and greatest graphics card, featuring the company&#8217;s new Maxwell GPU architecture. Nvidia claims that Maxwell maintains performance while delivering better power efficiency. Sure, the Kepler architecture brought some amazing improvements over the infamous Fermi architecture, but it was less revolutionary than Maxwell, which debuted earlier this year in the GTX 750 Ti.</p>
<p>Below, you can see a single SMM block diagram of the Maxwell architecture, followed by the full GM-204 architecture. Keep in mind that this is not the full-blown version of Maxwell.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GeForce_GTX_980_SM_Diagram_FINAL.jpg" rel="lightbox-0"><img class="aligncenter size-medium wp-image-38907" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GeForce_GTX_980_SM_Diagram_FINAL-320x600.jpg" alt="GeForce_GTX_980_SM_Diagram_FINAL" width="320" height="600" /></a></p>
<p>The GeForce GTX 980 is based upon Nvidia&#8217;s GM-204 GPU, a mid-range version of Nvidia&#8217;s full Maxwell architecture. Even though the GTX 980 is being sold as a high-end card, it actually slots into Nvidia&#8217;s product lineup much as the GTX 680 did.</p>
<p>The GTX 680 eventually became the GTX 770 and slotted in below the GTX 780 (a cut-down Titan) and the 780 Ti, which was the full Kepler chip, and above the 760 Ti, also a cut-down part. So, with the GTX 980, the natural comparisons are the GTX 680, which was GK-104, and the GTX 780 Ti, which was full-blown Kepler. The GTX 980 also has a TDP 30 watts lower than the GTX 680&#8217;s while performing far faster.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GeForce_GTX_980_Block_Diagram_FINAL.jpg" rel="lightbox-1"><img class="aligncenter size-medium wp-image-38905" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GeForce_GTX_980_Block_Diagram_FINAL-600x559.jpg" alt="GeForce_GTX_980_Block_Diagram_FINAL" width="600" height="559" /></a></p>
<p>In the new GPU, one of the most notable improvements is the increase of the L2 cache from 512 KB all the way up to 2048 KB. You can also see that Nvidia has made significant improvements across much of the GPU&#8217;s design to improve efficiency. The net result is that the GTX 980 has a TDP of 165W where the GTX 680 had a TDP of 195W, a reduction of 30W, or roughly 15%, in a single generation (going from GK-104 to GM-204) on the same process node (28nm). However, to build a GM-210, Nvidia will need a process shrink to reduce the die size, gain even more power efficiency, and build a very dense chip of 10 billion-plus transistors.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/SpecsTable_980.jpg" rel="lightbox-2"><img class="aligncenter size-medium wp-image-38949" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/SpecsTable_980-600x486.jpg" alt="SpecsTable_980" width="600" height="486" /></a></p>
<p>In addition to the GM-204 GPU, Nvidia also opted for a standard 4GB of GDDR5 memory at 7 Gbps, resulting in some impressive memory bandwidth figures even though the card has only a 256-bit memory bus.</p>
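<p>That bandwidth figure follows directly from the bus width and per-pin data rate. As a quick sanity check (the 224 GB/s result below is computed here from the card&#8217;s published specs rather than quoted from this review):</p>

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# 256-bit bus and 7 Gbps GDDR5 are the GTX 980's published specs.

def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: bytes transferred per cycle times rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 980: 256-bit bus at 7 Gbps per pin.
print(memory_bandwidth_gbs(256, 7.0))  # 224.0 GB/s
```

<p>For comparison, the same formula gives 336 GB/s for a 384-bit bus at the same data rate, which is why the narrow 256-bit bus drew attention at launch.</p>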
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Maxwell_GM204_DIE_3D_V17_Final.jpg" rel="lightbox-3"><img class="aligncenter size-medium wp-image-38918" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Maxwell_GM204_DIE_3D_V17_Final-600x337.jpg" alt="Maxwell_GM204_DIE_3D_V17_Final" width="600" height="337" /></a></p>
<h2>Hardware</h2>
<p>Moving on from the GPU and its architecture, the GTX 980&#8217;s hardware bears a very strong resemblance to Nvidia&#8217;s Kepler-era reference designs, starting with the GTX Titan. It differs in a few ways, however, including the fact that the card has two 6-pin PCIe connectors, which means it can draw up to 225W in total from the PCIe slot and power connectors. So even though this card has a TDP of 165W, it can theoretically draw up to 225W, which means it could be an impressive overclocker with the appropriate cooling and voltage regulation.</p>
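<p>The 225W ceiling is simple arithmetic from the PCIe specification&#8217;s power limits: 75W from the x16 slot plus 75W per 6-pin auxiliary connector (and 150W per 8-pin). A minimal sketch:</p>

```python
# Maximum board power from PCIe power sources, using the PCIe spec
# limits: 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin.

CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    """Total power ceiling: the slot plus each auxiliary connector."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

# GTX 980 reference board: two 6-pin connectors.
print(max_board_power(["6-pin", "6-pin"]))  # 225 W
```

<p>The gap between that 225W ceiling and the 165W TDP is exactly the overclocking headroom the review refers to.</p>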
<p>Nvidia also included a backplate on the GTX 980 to help cool the back of the graphics card more evenly. A section of this backplate near the power connectors can be removed, however, to allow proper airflow into the fan when the card is run in a very close SLI configuration with two or more cards.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front.jpg" rel="lightbox-4"><img class="size-medium wp-image-38929 aligncenter" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front-600x276.jpg" alt="NVIDIA GeForce GTX 980" width="600" height="276" /></a></p>
<p><img class="aligncenter size-medium wp-image-38928" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_BackPiece-600x366.jpg" alt="NVIDIA_GeForce_GTX_980_BackPiece" width="600" height="366" /></p>
<p>Below, you can see the GTX 980 with the fan shroud removed but with the GPU heatsink, memory heatsink and fan still attached.</p>
<p><img class="aligncenter size-medium wp-image-38931" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontNoShroud-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontNoShroud" width="600" height="399" /></p>
<p>Once the GPU heatsink is removed you can see the bare GPU with the memory heatsink and fan (which are one assembly).</p>
<p>&nbsp;</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontFan.jpg" rel="lightbox-5"><img class="aligncenter size-medium wp-image-38930" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontFan-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontFan" width="600" height="399" /></a></p>
<p>Then, once the whole assembly is removed, you can see the GPU, memory chips, power phases and all of the various PCB markings. These show that Nvidia included only 5 power phases on the GTX 980 even though the PCB can accommodate up to 7, which suggests that seriously overclocked versions built on the reference PCB may already be available at launch.<a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontNoShroud.jpg" rel="lightbox-6"><br />
</a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontPCB.jpg" rel="lightbox-7"><img class="aligncenter size-medium wp-image-38932" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontPCB-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontPCB" width="600" height="399" /></a></p>
<p>The card also features three DisplayPort 1.2 connectors as well as a dual-link DVI connector and an HDMI 2.0 connector, which gives you multiple ways to drive 4K. Nvidia also talks about running displays at up to 5K, but HDMI 2.0 tops out at 4K and DisplayPort 1.2 technically only supports 4K as well, so the real maximum resolution per display is still 4096 x 2160.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/980Back_98.jpg" rel="lightbox-8"><img class="aligncenter size-medium wp-image-38965" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/980Back_98-600x232.jpg" alt="980Back_98" width="600" height="232" /></a></p>
<h2>Software</h2>
<p>During Nvidia&#8217;s recent Editor&#8217;s Day, an event used to brief the press on upcoming products, Nvidia showed off a lot of things that directly and indirectly involve the GTX 980. Many of the card&#8217;s advancements come in the form of software, including DirectX 12 and DirectX 11.3. Notably, Nvidia was already running a DX 12-ported demo of Fable on two GTX 980s.</p>
<p>Nvidia made four big announcements about the GTX 980 outside of DX 12 and DX 11.3, pertaining to Nvidia&#8217;s own VXGI, MFAA, DSR and its advancements with HMDs (head-mounted displays) like the Oculus Rift.</p>
<p>MFAA, or Multi-Frame Sampled Anti-Aliasing, is Nvidia&#8217;s own technique for delivering higher AA visual quality at only a few percentage points of performance cost compared to a lower-quality MSAA mode. Essentially, Nvidia is claiming to deliver 4X MSAA-level quality at 2X MSAA performance (give or take a few percentage points). However, this feature is not quite finished yet and will be enabled in a future driver.</p>
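<p>To illustrate the idea behind that claim (this is our own toy model, not Nvidia&#8217;s implementation, and the sample positions are made up): alternate a two-sample pattern each frame and filter across consecutive frames. For a static edge, the blended two-frame result matches what a four-sample pattern computes in a single frame:</p>

```python
# Toy sketch of the MFAA idea: alternate a 2-sample pattern per frame,
# then average consecutive frames. For a static edge this equals the
# coverage a single 4-sample (4x MSAA-like) pattern would compute.
# Sample positions are invented for illustration only.

def coverage(sample_xs, edge_x):
    """Fraction of sample points on the 'inside' of a vertical edge."""
    return sum(x < edge_x for x in sample_xs) / len(sample_xs)

FULL_PATTERN = [0.1, 0.35, 0.6, 0.85]        # 4 samples, one frame
FRAME_A, FRAME_B = [0.1, 0.6], [0.35, 0.85]  # 2 samples, alternating frames

edge = 0.5
msaa_4x = coverage(FULL_PATTERN, edge)
mfaa_ish = (coverage(FRAME_A, edge) + coverage(FRAME_B, edge)) / 2
print(msaa_4x, mfaa_ish)  # identical for a static scene
```

<p>The performance win comes from taking only two samples per frame; the catch, which the driver has to handle, is that moving scenes no longer line up frame to frame, which is presumably part of why the feature shipped later in a driver update.</p>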
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/MFAA.jpg" rel="lightbox-9"><img class="aligncenter size-medium wp-image-38919" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/MFAA-600x333.jpg" alt="MFAA" width="600" height="333" /></a></p>
<p>In addition to MFAA, Nvidia has also implemented DSR (Dynamic Super Resolution), which is essentially smart supersampling with an applied filter. It tricks the game into thinking you have a much higher resolution display (like a 4K panel), so the game serves higher-quality textures and renders at 4K; DSR then scales the image back down to your monitor&#8217;s native resolution (like 1080P), generally resulting in a much higher quality image. This is great for both Nvidia and gamers: gamers get a better looking game without spending money on a new monitor, and Nvidia can sell more powerful, more expensive graphics cards without consumers needing to buy expensive 4K monitors.</p>
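<p>The final step of the DSR pipeline, scaling the oversized frame back to native resolution, is ordinary supersampled downfiltering. A minimal sketch, assuming a plain 2x2 box filter as a stand-in for Nvidia&#8217;s actual (smarter) filter:</p>

```python
# Minimal stand-in for DSR's downscale step: render at 2x the native
# resolution, then average each non-overlapping 2x2 block back down.
# A simple box filter; Nvidia's real DSR filter is more sophisticated.

def downscale_2x(img):
    """Average 2x2 blocks of a grayscale image given as a list of rows."""
    out = []
    for y in range(0, len(img), 2):
        out.append([
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
            for x in range(0, len(img[y]), 2)
        ])
    return out

hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [255, 255, 0, 0],
          [255, 255, 0, 0]]   # 4x4 "rendered" frame
print(downscale_2x(hi_res))  # [[0.0, 255.0], [255.0, 0.0]]
```

<p>Because every output pixel averages several rendered samples, edges and fine detail come out smoother than a frame rendered natively, which is the entire appeal of the feature.</p>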
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/DSR.jpg" rel="lightbox-10"><img class="aligncenter size-medium wp-image-38904" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/DSR-600x336.jpg" alt="DSR" width="600" height="336" /></a></p>
<p>Nvidia also talked about its new VXGI technology with a demonstration of the moon landing that uses the company&#8217;s voxel-based global illumination engine. VXGI uses dedicated features in Maxwell&#8217;s hardware along with support in the game engine itself (Unreal Engine 4) to recreate the bouncing of light off objects more efficiently and realistically, in realtime. VXGI isn&#8217;t implemented in any engine yet, but the expectation is that Unreal Engine 4 should have it by the fourth quarter of this year, and we could very likely see it in games as soon as next year.</p>
<p>In addition to VXGI, Nvidia also took a stab at head-mounted displays and the latency problem. The company&#8217;s solution, dubbed Asynchronous Warp, is designed to halve the latency of VR gaming in order to improve the overall experience and responsiveness of the platform. Nvidia went step by step looking for ways to improve VR performance until it arrived at Asynchronous Warp.</p>
<p><img class="aligncenter size-medium wp-image-38913" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency-600x342.jpg" alt="HMDLatency" width="600" height="342" /></p>
<p><img class="aligncenter size-medium wp-image-38914" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency2-600x337.jpg" alt="HMDLatency2" width="600" height="337" /></p>
<p><img class="aligncenter size-medium wp-image-38915" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency3-600x337.jpg" alt="HMDLatency3" width="600" height="337" /></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/HMDLatencyAsyncWarp.jpg" rel="lightbox-11"><img class="aligncenter size-medium wp-image-38916" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatencyAsyncWarp-600x338.jpg" alt="HMDLatencyAsyncWarp" width="600" height="338" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/HMDLatency3.jpg" rel="lightbox-12"><br />
</a>Asynchronous warp takes the last scene rendered by the GPU and updates it based on the latest head position information taken from the VR sensor. By warping the rendered image late in the pipeline to more closely match head position, Nvidia avoids discontinuities between head movement and action on screen while also dramatically reducing latency. We haven&#8217;t tested this out ourselves yet, but this is a pretty drastic leap forward for VR if it can actually be applied across the VR landscape.</p>
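<p>Conceptually, the warp amounts to re-projecting the last finished frame by however much the head has moved since it was rendered, instead of waiting for a new render. A heavily simplified sketch, reducing the problem to a horizontal pixel shift (real timewarp applies a full 3D rotational re-projection plus lens correction; the names and numbers here are our own illustration):</p>

```python
# Toy sketch of the asynchronous-warp idea: shift the last finished
# frame by the head yaw accumulated since it was rendered, rather than
# re-rendering. Real timewarp does full 3D rotation and lens correction;
# this 1D scanline shift only illustrates the concept.

def warp_row(row, degrees_moved, pixels_per_degree):
    """Shift one scanline by the head movement, padding the exposed edge."""
    shift = round(degrees_moved * pixels_per_degree)
    if shift == 0:
        return list(row)
    if shift > 0:  # head turned right: the image slides left
        return row[shift:] + [row[-1]] * shift
    return [row[0]] * -shift + row[:shift]

frame_row = [10, 20, 30, 40, 50, 60]
# 2 degrees of yaw since the frame was rendered, at 1 pixel per degree:
print(warp_row(frame_row, 2, 1))  # [30, 40, 50, 60, 60, 60]
```

<p>The padding at the exposed edge hints at the technique&#8217;s main limitation: the warp can only reuse pixels that were already rendered, so large or fast movements reveal regions the last frame never covered.</p>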
<h2>Performance</h2>
<p>For performance, we looked at the GTX 980&#8217;s synthetic, compute, and gaming benchmarks to evaluate whether or not it really is as significant an improvement over the GTX 680, and possibly even the GTX 780 Ti. After all, Nvidia wouldn&#8217;t name this card the GTX 980 unless it could really perform accordingly.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX980_980.jpg" rel="lightbox-13"><img class="aligncenter size-medium wp-image-38966" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX980_980-600x337.jpg" alt="GTX980_980" width="600" height="337" /></a></p>
<p>The testbed consisted of an Intel Core i7 4960X cooled by a Corsair H100 on a Gigabyte X79 motherboard with 16 GB of DDR3 2400 MHz memory along with a Thermaltake 1475W Gold PSU and Patriot 128GB SSD all sitting atop a Dimastech Hard Bench.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/3DMark-Fire-Strike-Extreme.jpg" rel="lightbox-14"><img class="aligncenter size-medium wp-image-38899" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/3DMark-Fire-Strike-Extreme-600x293.jpg" alt="3DMark Fire Strike Extreme" width="600" height="293" /></a></p>
<p>First, we tested 3DMark using the Fire Strike Extreme test to get the best idea of high-end performance against other cards. Here, the GTX 980 fell between two GTX 680s in SLI and two 7970s in CrossFireX. It did beat the GTX 780 Ti, and proved that it is indeed more than twice as fast as the GTX 680, which Nvidia was essentially claiming throughout its presentations.</p>
<p>After 3DMark, we also wanted to take a look at the Unigine set of synthetic benchmarks with Unigine&#8217;s Heaven and Valley benchmarks.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Unigine-Heaven-4.0-Benchmark.jpg" rel="lightbox-15"><img class="aligncenter size-medium wp-image-38935" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Unigine-Heaven-4.0-Benchmark-600x265.jpg" alt="Unigine Heaven 4.0 Benchmark" width="600" height="265" /></a></p>
<p>&nbsp;</p>
<p>As you can see from Unigine Heaven, the GTX 980 outperformed the GTX Titan and R9 290 by a fairly healthy margin and sat somewhere close to the HD 7970 GHz editions in CrossFire. Obviously this is a single GPU, but the fact that it falls within the realm of multi-GPU performance is awesome on its own.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/UnigineValley.jpg" rel="lightbox-16"><img class="aligncenter size-medium wp-image-38955" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/UnigineValley-600x266.jpg" alt="UnigineValley" width="600" height="266" /></a></p>
<p>In the Unigine Valley benchmark, we saw a much less dramatic performance difference, with the GTX 980 essentially falling between the GTX 780 and GTX Titan, though still comfortably outperforming the R9 290 and AMD&#8217;s Hawaii GPU.</p>
<p>Following those benchmarks, we also took a look at two OpenCL benchmarks to see how Maxwell stacks up against AMD and how much Nvidia has improved over the previous Kepler generation. There was much talk that Nvidia had improved its OpenCL performance from one generation to the next, so it was interesting to see whether that was true and by how much. We tested LuxMark 2.0 and CompuBench 1.5 for our OpenCL testing.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/LuxMarkOpenCL.jpg" rel="lightbox-17"><img class="aligncenter size-medium wp-image-38917" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/LuxMarkOpenCL-600x255.jpg" alt="LuxMarkOpenCL" width="600" height="255" /></a></p>
<p>In LuxMark, the GTX 980 performed fantastically, coming out faster than two GTX Titans and an R9 290. Of course, it wasn&#8217;t as fast as three GTX Titans, multiple 7970s, a 7990 or an R9 295X2, but since all of the faster AMD configurations are multi-GPU, I suspect that multiple GTX 980s could give AMD a run for its money.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Compubench-1.5.jpg" rel="lightbox-18"><img class="aligncenter size-medium wp-image-38901" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Compubench-1.5-600x236.jpg" alt="Compubench 1.5" width="600" height="236" /></a></p>
<p>In CompuBench we saw some interesting results, with the GTX 980 trading punches with the R9 290X, beating it in some OpenCL tests and losing to it in others. If anything, the GeForce GTX 980 shows that Nvidia is a far more capable OpenCL competitor to AMD than the GTX 780 Ti ever was.</p>
<p>Following those synthetic benchmarks, we ran a series of 4K benchmarks to see how the GTX 980 stacks up against the most stressful gaming environments. In our tests, we played Battlefield 4, Crysis 3 and Counter Strike: Global Offensive at varying levels of detail.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Battlefield-4-Benchmark.jpg" rel="lightbox-19"><img class="aligncenter size-medium wp-image-38900" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Battlefield-4-Benchmark-600x247.jpg" alt="Battlefield 4 Benchmark" width="600" height="247" /></a></p>
<p>In Battlefield 4, we can clearly see that the GTX 980 outperforms the GTX 780 Ti as well as the R9 290, but still falls well short of the monstrous $1,500 R9 295X2. Even so, the GTX 980 delivered unquestionably playable frame rates and never dipped below 30 FPS in our measurements.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Crysis-3-4K-Benchmarks.jpg" rel="lightbox-20"><img class="aligncenter size-medium wp-image-38902" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Crysis-3-4K-Benchmarks-600x276.jpg" alt="Crysis 3 4K Benchmarks" width="600" height="276" /></a></p>
<p>&nbsp;</p>
<p>In Crysis 3, we once again saw the GTX 980 outperform the GTX 780 Ti and the R9 290, but it still struggled to keep up with the R9 295X2 (which costs nearly three times as much). This is primarily due to a lack of memory and memory bandwidth at those settings. So, if you want to run Crysis 3 at Very High settings with 4x MSAA, you&#8217;ll probably need a second GPU, at which point frame rates should be quite playable.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/CSGO-Benchmark.jpg" rel="lightbox-21"><img class="aligncenter size-medium wp-image-38903" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/CSGO-Benchmark-600x275.jpg" alt="CSGO Benchmark" width="600" height="275" /></a></p>
<p>&nbsp;</p>
<p>In Counter Strike: Global Offensive, we weren&#8217;t expecting to see anything but triple-digit FPS, but what is important is that the GTX 980 beats out the R9 290 and 780 Ti in 4K performance and at times hit the 300 FPS cap. If you want the ultimate 4K gaming experience in CSGO you can have it with any of these cards, but the GTX 980 does it at a fraction of the power.</p>
<h2>Power and Overclocking</h2>
<p>At idle, the card ran at about 10% of TDP, or 16W, and drew up to 90% of TDP, or 148W, under most gaming scenarios we measured. The card never went over 80C and idled at 36C under normal usage. Both the maximum and idle temperatures may be higher than expected because the test environment had higher-than-normal ambient temperatures due to a heatwave.</p>
<p>Last but not least was overclocking, which proved more impressive than anyone would have expected. Sure, this is a very low-power card with a lot of in-bound power, but the overclocks achieved were simply mind-blowing. To validate the overclocks, we ran 3DMark Fire Strike Extreme.</p>
<p>In overclocking this card, we were able to push a GPU clock offset of +260 MHz on the base clock and +100 MHz on the memory frequency. As a result, the GPU base clock went up to 1,387 MHz with a boost clock of a whopping 1,553 MHz, something we have never seen from an air-cooled GPU (yes, the fans were at 100% at that point). The performance was astonishing and resulted in some amazing 3DMark Fire Strike Extreme scores. We&#8217;ve also included some of the other overclocks achieved on the way to the maximum.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX-980-OC-1388.jpg" rel="lightbox-22"><img class="aligncenter size-medium wp-image-38912" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX-980-OC-1388-600x442.jpg" alt="GTX 980 OC 1388" width="600" height="442" /></a></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX-980-3DMark-Overclocking.jpg" rel="lightbox-23"><img class="aligncenter size-medium wp-image-38909" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX-980-3DMark-Overclocking-600x316.jpg" alt="GTX 980 3DMark Overclocking" width="600" height="316" /></a></p>
<p>As you can see above, the overclocked GTX 980 actually outperforms two Radeon HD 7970 GHz Editions in CrossFireX as well as every other card near it; only two GTX Titans in SLI and an R9 295X2 remain faster. It does this while drawing very little power, 206W to be exact, which leaves about 19W of overclocking headroom under the card&#8217;s 225W ceiling. As such, one would expect manufacturers to ship overclocked versions of the GTX 980 with impressive factory clocks that can very likely be pushed even further.</p>
<h2>Conclusion</h2>
<p>The GTX 980 is an absolutely stunning graphics card that delivers on many of Nvidia&#8217;s promises (namely the 2x-plus performance over the GTX 680) and does it at an amazingly low level of power. But that&#8217;s not even the best part: Nvidia released this card today at an even more competitive price of $549, which is why AMD&#8217;s R9 290X recently had a price drop from $549 to $449. Do keep in mind that even though the R9 290X is cheaper, it still draws more power and won&#8217;t overclock anywhere near as well as this card.</p>
<p>Nvidia is also releasing a cost-down version of the GTX 980 in the GTX 970, which is understandably somewhat slower, at $329. Unfortunately, we weren&#8217;t sent one for testing, so we can&#8217;t tell you exactly how much slower it is, but it may be a major consideration if the GTX 980 is too rich for your blood.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_3Qtr.jpg" rel="lightbox-24"><img class="aligncenter size-medium wp-image-38923" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_3Qtr-600x473.jpg" alt="NVIDIA_GeForce_GTX_980_3Qtr" width="600" height="473" /></a></p>
<p>Nvidia has without a doubt hit a home run with the GTX 980 and Maxwell, and it will be interesting to see how AMD answers this astounding performance and power improvement over the previous generation. This may not be a huge upgrade for anyone running a GTX 780 Ti, but it is a pretty serious upgrade for almost any other gamer out there. And not just that: the GTX 780 Ti is a $700 graphics card, and here you&#8217;re getting better performance at significantly lower wattage for much less money.</p>
<p>The GTX 980 is a great piece of GPU engineering and a must-buy for anyone looking for a new high-end graphics card this holiday season. It only makes us wonder what will eventually be possible once Nvidia unleashes the full-blown GM-210 Maxwell on the world, hopefully next year. As such, this card wins our Editor&#8217;s Choice Award and an immediate buy recommendation.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/">GeForce GTX 980 Review: More Performance at Lower Power</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia Launches GeForce GTX Titan Z Today, for $3,000</title>
		<link>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/</link>
		<comments>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/#comments</comments>
		<pubDate>Wed, 28 May 2014 18:41:11 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[295X2]]></category>
		<category><![CDATA[AMD Radeon R9 295X2]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GeForce GTX Titan Z Review]]></category>
		<category><![CDATA[GTX Titan]]></category>
		<category><![CDATA[GTX Titan Z]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Nvidia Titan Z]]></category>
		<category><![CDATA[R9 295X2]]></category>
		<category><![CDATA[Titan Z]]></category>
		<category><![CDATA[Titan Z Review]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35390</guid>
		<description><![CDATA[<p>So, Nvidia has finally launched their much awaited $3,000 graphics card which isn&#8217;t quite good enough for professional applications (no professional drivers, like Quadro) and ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/">Nvidia Launches GeForce GTX Titan Z Today, for $3,000</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="670" height="447" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/titan-z-51.jpg" class="attachment-post-thumbnail wp-post-image" alt="GeForce GTX Titan Z" /></p><p>So, Nvidia has finally launched its much-awaited $3,000 graphics card, which isn&#8217;t quite good enough for professional applications (no professional drivers, like Quadro) and is too expensive for 99.99% of gamers to ever consider as a GPU. The <a title="GTC 2014 Keynote – GTX Titan Z and Pascal Announced" href="http://www.brightsideofnews.com/2014/03/25/gtc-2014-keynote-gtx-titan-z-and-pascal-announced/">Titan Z was originally announced</a> at Nvidia&#8217;s GTC 2014 back in March, and there were rumors it was supposed to <a title="Nvidia GeForce GTX Titan Z is… Coming Soon?" href="http://www.brightsideofnews.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">drop earlier this month but got delayed</a>. Either way, the card is now available, and if you&#8217;re willing to pay twice the price of an <a title="AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card" href="http://www.brightsideofnews.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2</a>, which isn&#8217;t really much slower, you can buy it right now from multiple online e-tailers and system builders.</p>
<p>The Titan Z is clearly Nvidia&#8217;s response to the <a title="AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card" href="http://www.brightsideofnews.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">R9 295X2 which we reviewed last month</a>, but Nvidia somehow justifies charging double the price because it combines two of Nvidia&#8217;s fastest GTX Titan GPUs. Even so, those two GPUs ended up clocked much lower than a GTX Titan bought on its own. Yes, it does have 12 GB of GDDR5, but the reality is that you don&#8217;t need 12 GB of GDDR5 to drive a single 4K display, and a single <a href="http://www.nvidia.com/gtx-700-graphics-cards/gtx-titan-z/" target="_blank">GTX Titan Z</a> isn&#8217;t going to be enough to drive three 4K displays and actually game on them at decent FPS. So, in reality, you&#8217;ll need two GTX Titan Zs to drive three 4K displays anyway, which is going to be incredibly hot and expensive, since it&#8217;ll end up costing you $6,000 for the GPUs alone when you could spend $4,000 and probably get better performance.</p>
<div id="attachment_35396" style="width: 1280px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/05/TitanZBare1.jpg" rel="lightbox-0"><img class="size-full wp-image-35396" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/TitanZBare1.jpg" alt="GTX Titan Z Bare GPU" width="1270" height="714" /></a><p class="wp-caption-text">The GTX Titan Z&#8217;s GPUs shown bare, compliments of Digital Storm</p></div>
<p>The GTX Titan Z is Nvidia&#8217;s attempt to one-up AMD, but because the card is air cooled, Nvidia was forced to dial back its thermals and power consumption. AMD&#8217;s solution was to fully watercool its card, and while it drew some criticism for having to do so, the card ultimately runs very cool and very quiet for a dual-GPU monster. In fact, Nvidia clocked its GPUs down quite a bit: a <a href="http://www.nvidia.com/gtx-700-graphics-cards/gtx-titan-black/" target="_blank">GTX Titan Black</a>, which is comparable to the GPUs onboard this card, runs at an 889 MHz base clock and a 980 MHz boost clock. Meanwhile, the GTX Titan Z&#8217;s GPUs clock in at a much slower 705 MHz base and 876 MHz boost, making it fundamentally a lot slower as well. Also, according to <a href="http://www.digitalstormonline.com/unlocked/nvidia-geforce-gtx-titan-z-review-and-4k-benchmarks-idnum280/" target="_blank">Digital Storm&#8217;s review</a> (curiously, the only review on the internet at launch comes from a system builder), the Titan Z isn&#8217;t really much faster than AMD&#8217;s R9 295X2 and only barely beats it in a few games.</p>
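<p>To put that clock deficit in rough numbers, here is a quick back-of-the-envelope sketch using the clocks quoted above; this compares clock speeds only, not measured performance:</p>

```python
# Clock deficit of the GTX Titan Z's GPUs versus a single GTX Titan Black,
# using the clocks quoted in the article (MHz).
# Prints roughly: base 20.7% lower, boost 10.6% lower.
titan_black = {"base": 889, "boost": 980}
titan_z = {"base": 705, "boost": 876}

for kind in ("base", "boost"):
    deficit = 100 * (titan_black[kind] - titan_z[kind]) / titan_black[kind]
    print(f"{kind}: {deficit:.1f}% lower")
```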
<p>So, basically, with the GTX Titan Z you&#8217;re paying double what AMD charges for the R9 295X2 for barely better (or sometimes worse) performance from a downclocked card (while AMD actually overclocked the R9 295X2 relative to the 290X). On top of that, you&#8217;re simply better served buying two GTX Titan Black cards: they are fully unlocked GPUs, and thanks to their higher clocks they consistently outperform the GTX Titan Z by a pretty wide margin.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/">Nvidia Launches GeForce GTX Titan Z Today, for $3,000</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/05/28/nvidia-launches-geforce-gtx-titan-z-today-3000/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</title>
		<link>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/</link>
		<comments>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/#comments</comments>
		<pubDate>Wed, 30 Apr 2014 17:03:26 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GK-110]]></category>
		<category><![CDATA[GTX Titan Z]]></category>
		<category><![CDATA[GTXTITANZ-12GD5]]></category>
		<category><![CDATA[Kepler]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Nvidia GeForce]]></category>
		<category><![CDATA[Titan]]></category>
		<category><![CDATA[Titan Z]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=34811</guid>
		<description><![CDATA[<p>We were a bit surprised to see an announcement on Techpowerup! that ASUS had launched their GTX Titan Z without any hoopla from Nvidia or ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="805" height="465" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_011.jpg" class="attachment-post-thumbnail wp-post-image" alt="ASUS_GTXTITANZ-12GD5_01" /></p><p>We were a bit surprised to see <a href="http://www.techpowerup.com/200339/asus-announces-the-geforce-gtx-titan-z-dual-gpu-graphics-card.html" target="_blank">an announcement on Techpowerup!</a> that ASUS had launched its GTX Titan Z without any hoopla from Nvidia or any of its other board partners. So, it comes as little surprise that the article has since been pulled and that any and all mentions of ASUS&#8217; GTX Titan Z have disappeared. But the damage has already been done and Pandora&#8217;s box has been opened. Besides, there isn&#8217;t much about this card that is really a mystery.</p>
<p>The GTX Titan Z is expected to be an air-cooled, dual-Titan graphics card that delivers some of the best double-precision compute performance available while still providing an incredible gaming experience, all from a single board. This is possible because of its dual GPUs and 12 GB of RAM, which in theory could make the Titan Z a better single-card solution for driving three 4K monitors simultaneously. Based on our findings with the AMD R9 295X2, gaming across all three of those 4K monitors is out of the question, but driving a single 4K monitor is very likely possible, if not encouraged. What&#8217;s more interesting is that Nvidia told us the price of the GTX Titan Z before we actually knew anything else about the card: at $3,000 it costs as much as two of AMD&#8217;s latest high-end dual-GPU card, the R9 295X2.</p>
<div id="attachment_34815" style="width: 845px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_021.jpg" rel="lightbox-0"><img class="size-full wp-image-34815" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/ASUS_GTXTITANZ-12GD5_021.jpg" alt="ASUS GTX Titan Z" width="835" height="705" /></a><p class="wp-caption-text">ASUS GTX Titan Z</p></div>
<p>Based on what we saw from Techpowerup, the ASUS card will ship with its own real-time graphics tuning utility, GPU Tweak, to allow on-the-fly adjustments, even though I don&#8217;t know many people who need to make on-the-fly adjustments to a $3,000 card. Remember, this isn&#8217;t quite a professional card, but it isn&#8217;t quite a gaming card either: at $3,000 it is out of reach of about 99.99% of gamers. Each GPU will run at a base clock of 705 MHz and a boost clock of 876 MHz, meaning this card is significantly slower per GPU than its single-GPU counterparts, which isn&#8217;t necessarily unexpected considering the TDP of each GPU. Across both GPUs, the card carries a total of 12 GB of GDDR5 memory and 5760 CUDA cores, which is exactly what it is designed to deliver: lots of memory and lots of cores.</p>
<p>According to the Techpowerup! article, this GPU was supposed to be available worldwide on April 29th, yesterday. However, we&#8217;re hearing that there is a slight delay and that the date has been pushed back to May 8th, essentially a week from tomorrow. That may explain why ASUS had Techpowerup pull the article and why there is no other news about it; Techpowerup may simply not have gotten the memo.</p>
<p style="text-align: center;"><b>SPECIFICATIONS: </b><b style="text-align: center;">GTXTITANZ-12GD5</b></p>
<p style="text-align: center;"><span style="text-align: center;">Graphics Engine: NVIDIA GeForce GTX TITAN Z</span><br />
<span style="text-align: center;">Bus Standard: PCI Express 3.0</span><br />
<span style="text-align: center;">OpenGL: OpenGL 4.4</span><br />
<span style="text-align: center;">Video Memory: 12 GB GDDR5</span><br />
<span style="text-align: center;">GPU Boost Clock: 876 MHz</span><br />
<span style="text-align: center;">GPU Base Clock: 705 MHz</span><br />
<span style="text-align: center;">CUDA Cores: 5760</span><br />
<span style="text-align: center;">Memory Clock: 7000 MHz</span><br />
<span style="text-align: center;">Memory Interface: 768 bit</span><br />
<span style="text-align: center;">Output: 1 x Native DVI-I, 1 x Native DVI-D,1 x Native HDMI, 1 x Native DisplayPort 1.2</span></p>
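<p>A quick sanity check on the memory figures above: assuming the 7000 MHz memory clock is the effective (data) rate of the GDDR5, and that the 768-bit interface is both GPUs&#8217; buses counted together, the aggregate bandwidth works out as follows:</p>

```python
# Back-of-the-envelope aggregate memory bandwidth from the spec sheet above.
# Assumes 7000 MHz is the effective (data) rate of the GDDR5 and that the
# 768-bit figure is the combined width of both GPUs' memory interfaces.
effective_clock_hz = 7000e6   # 7000 MHz effective memory clock
bus_width_bytes = 768 / 8     # 768-bit combined interface, in bytes

bandwidth_gb_s = effective_clock_hz * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s total")  # 672 GB/s across both GPUs
```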
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/">Nvidia GeForce GTX Titan Z is&#8230; Coming Soon?</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/30/nvidia-geforce-gtx-titan-z-coming-soon/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
