<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; Hawaii</title>
	<atom:link href="http://www.vrworld.com/tag/hawaii/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:54:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>AMD Announces Radeon R9 285 During Live Stream</title>
		<link>http://www.vrworld.com/2014/08/23/amd-announces-radeon-r9-285-r9-285x-live-stream/</link>
		<comments>http://www.vrworld.com/2014/08/23/amd-announces-radeon-r9-285-r9-285x-live-stream/#comments</comments>
		<pubDate>Sat, 23 Aug 2014 17:56:21 +0000</pubDate>
		<dc:creator><![CDATA[Sam Reynolds]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[AMD Radeon]]></category>
		<category><![CDATA[AMD Radeon R9 285]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Hawaii]]></category>
		<category><![CDATA[Hawaii Pro]]></category>
		<category><![CDATA[R9 285]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Shader Cores]]></category>
		<category><![CDATA[Tahiti]]></category>
		<category><![CDATA[Tahiti Pro]]></category>
		<category><![CDATA[TFLOPS]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=37973</guid>
		<description><![CDATA[<p>In a live event streamed from AMD’s office in Austin and hosted by Richard Huddy, AMD announced the Radeon R9 285, giving the company&#8217;s most ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/08/23/amd-announces-radeon-r9-285-r9-285x-live-stream/">AMD Announces Radeon R9 285 During Live Stream</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="2847" height="1537" src="http://cdn.vrworld.com/wp-content/uploads/2014/08/amd-stage-apu-131.jpg" class="attachment-post-thumbnail wp-post-image" alt="AMD Restructuring" /></p><p>In a live event streamed from AMD’s office in Austin and hosted by Richard Huddy, AMD announced the Radeon R9 285, giving the company&#8217;s most recent Hawaii architecture a modified and more affordable variant. This card is clearly AMD&#8217;s attempt to attack Nvidia&#8217;s GeForce GTX 760.</p>
<p>The card is intended to replace the R9 280 (Tahiti Pro) card in AMD’s lineup. According to specs released by AMD, the card will have 1,792 Stream Processors, a 918 MHz clock, 2GB of memory clocked at 5.5 GHz on a 256-bit wide memory interface, 32 ROPs and 112 texture units. The card is said to offer 3.29 TFLOPS of peak performance.</p>
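<p>The headline figure is easy to sanity-check: peak single-precision throughput for a GCN part is stream processors &#215; clock &#215; 2, since each stream processor can retire one fused multiply-add per clock. A back-of-the-envelope sketch follows; note the 176 GB/s bandwidth figure is derived from the specs above, not quoted by AMD:</p>

```python
def peak_tflops(stream_processors, clock_mhz):
    # 2 FLOPs (one fused multiply-add) per stream processor per clock
    return stream_processors * clock_mhz * 2 / 1e6

def bandwidth_gbs(effective_clock_ghz, bus_width_bits):
    # effective memory transfer rate times bus width in bytes
    return effective_clock_ghz * bus_width_bits / 8

# 1,792 stream processors at 918 MHz -> the quoted 3.29 TFLOPS
print(round(peak_tflops(1792, 918), 2))   # 3.29
# 5.5 GHz effective GDDR5 on a 256-bit bus -> 176 GB/s
print(bandwidth_gbs(5.5, 256))            # 176.0
```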
<p>This is compared to its predecessor, the <a href="http://www.amd.com/en-gb/products/graphics/desktop/7000/7900" target="_blank">Radeon HD 7950,</a> which was only capable of 2.87 TFLOPS of peak performance, even though it had 3GB of RAM, a 384-bit bus and exactly the same number of shader cores (at roughly similar clocks).</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/08/radeon-r295x2-1.jpg" rel="lightbox-0"><img class="aligncenter size-full wp-image-37976" src="http://cdn.vrworld.com/wp-content/uploads/2014/08/radeon-r295x2-1.jpg" alt="radeon-r295x2-1" width="1254" height="726" /></a></p>
<p>According to AMD production manager Devon Nekechuk, all of AMD&#8217;s GCN GPUs, including the newly announced Radeon R9 285, will support DirectX 12.</p>
<p>AMD has also released some early performance numbers for the card running 3DMark Fire Strike. On a system with a Core i7-4960X CPU and 16GB of DDR3 memory, the R9 285 scored P7066 in performance mode and X3513 in extreme mode. These benchmarks have not been independently verified.</p>
<p>Both versions (2GB and 4GB) of the card will be released on September 2, starting at $249.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/08/23/amd-announces-radeon-r9-285-r9-285x-live-stream/">AMD Announces Radeon R9 285 During Live Stream</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/08/23/amd-announces-radeon-r9-285-r9-285x-live-stream/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>AMD Launches W8100, Cuts GPUs Prices 50% for First GPU</title>
		<link>http://www.vrworld.com/2014/06/23/amd-launches-w8100-cuts-gpus-prices-50-first-gpu/</link>
		<comments>http://www.vrworld.com/2014/06/23/amd-launches-w8100-cuts-gpus-prices-50-first-gpu/#comments</comments>
		<pubDate>Tue, 24 Jun 2014 03:01:28 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Audio/Video]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[AMD FirePro]]></category>
		<category><![CDATA[FirePro]]></category>
		<category><![CDATA[FirePro W8100]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Hawaii]]></category>
		<category><![CDATA[K20]]></category>
		<category><![CDATA[K5000]]></category>
		<category><![CDATA[Kepler]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[OpenCL]]></category>
		<category><![CDATA[Professional]]></category>
		<category><![CDATA[W8100]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=36140</guid>
		<description><![CDATA[<p>Today was an interesting day in AMDland: first the company announced their latest GPU, the FirePro W8100, and then later in the day they announced ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/23/amd-launches-w8100-cuts-gpus-prices-50-first-gpu/">AMD Launches W8100, Cuts GPUs Prices 50% for First GPU</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="980" height="431" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_9801.jpg" class="attachment-post-thumbnail wp-post-image" alt="W8100" /></p><p>Today was an interesting day in AMDland: first the company <a href="http://www.amd.com/en-us/press-releases/Pages/new-amd-professional-2014jun23.aspx" target="_blank">announced their latest GPU</a>, the FirePro W8100, and then later in the day they announced a program under which you can buy one of their latest GPUs for a whopping 50% off, as long as it&#8217;s your first one; every subsequent one will be full price. But first, you have to go through <a href="http://www.fireprographics.com/experience/us/apply.asp" target="_blank">an &#8216;approval process&#8217;</a>. Now, let&#8217;s get back to the new GPU AMD just announced: what is it, exactly? The FirePro W8100 is part of AMD&#8217;s professional line of graphics cards, branded FirePro.</p>
<p>So, looking at the rough specs, we can see that the W8100 delivers over 2 TFLOPs of double precision, which is actually less than what <a title="Intel’s New Knight’s Landing Xeon Phi Combines Omni Scale Fabric with HMC" href="http://www.brightsideofnews.com/2014/06/23/intel-new-knights-landing-combines-omni-scale-fabric-hmc/" target="_blank">Intel&#8217;s new Knight&#8217;s Landing</a>, also announced today, is capable of delivering. It does, however, also do over 4 TFLOPs of single precision, which is quite impressive since it&#8217;s double the Nvidia K5000&#8217;s 2.1 TFLOPs. This GPU is effectively a professional version of <a title="AMD Radeon R9 290: Blowing the Doors off the Competition" href="http://www.brightsideofnews.com/2013/11/08/amd-radeon-r9-290-blowing-the-doors-off-the-competition/" target="_blank">AMD&#8217;s R9 290 GPU which we reviewed</a> and found to be a very impressive GPU for the money, and it still is. What makes this GPU different, however, is that it can drive four 4K displays simultaneously and has 8 GB of GDDR5 memory as opposed to 4 GB, making better use of the 512-bit memory bus on the Hawaii Pro GPU inside. This is still less than the six 4K displays the W9100 supports. Realistically you won&#8217;t be doing any gaming on these 4K displays, but it doesn&#8217;t seem outrageous to think someone could be using roughly 33 million pixels. AMD accomplishes this by putting four DisplayPort 1.2 connectors on the back of the card, as you can see above and below.</p>
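<p>Those display claims are just pixel arithmetic on UHD panels (3840&#215;2160 each). A minimal check, assuming every connected panel runs at UHD resolution:</p>

```python
UHD_W, UHD_H = 3840, 2160  # one 4K (UHD) panel

def total_pixels(displays, w=UHD_W, h=UHD_H):
    # total pixels a card must drive across identical panels
    return displays * w * h

print(total_pixels(4))  # 33177600 -> the W8100's four-display load, ~33 million pixels
print(total_pixels(6))  # 49766400 -> the W9100's six-display ceiling
```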
<div id="attachment_36145" style="width: 990px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_6_9801.jpg" rel="lightbox-0"><img class="size-full wp-image-36145" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_6_9801.jpg" alt="W8100" width="980" height="426" /></a><p class="wp-caption-text">W8100 Specifications, current and future</p></div>
<p>As you can see from the above specs, AMD has decided to refer to the GPU as an &#8216;engine&#8217; and says it&#8217;s clocked at 824 MHz, a solid 123 MHz less than the R9 290 gaming graphics card that it mimics. It does, however, have double the memory of the R9 290, which is why it is capable of driving up to four 4K displays. AMD also powers it with two 6-pin power connectors, drawing 220W and supporting PCIe 3.0; everything is pretty standard here. It also supports OpenCL 1.2 and already has OpenCL 2.0 support baked in, which is good to know for anyone planning to buy a &#8216;future-proof&#8217; GPU. It also supports OpenGL 4.3 and will support OpenGL 4.4, which isn&#8217;t much of a feat as most of that support will be accomplished through a driver update. What is interesting, though, is that it supports DirectX 11.2, yet AMD makes no mention of future compatibility with DirectX 12 at all, which seems like a notable omission. It isn&#8217;t anything shocking since this graphics card is based on a GPU that was announced in 2013, but it is still interesting that AMD has nothing to say there.</p>
<p>AMD also couldn&#8217;t help but compare itself to Nvidia&#8217;s Quadro K5000, Nvidia&#8217;s older professional workstation GPU (Nvidia is currently on the K6000), so naturally, in AMD&#8217;s comparison they basically spank Nvidia. Yes, the W8100 is $2,499, which makes it more comparable in price with the K5000 than with the K6000, which sells for a whopping $4,999 and is a better match for AMD&#8217;s W9100.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_2_9801.jpg" rel="lightbox-1"><img class="size-full wp-image-36141" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_2_9801.jpg" alt="W8100" width="980" height="484" /></a></p>
<p>AMD also draws a comparison against <a title="Nvidia Maximus 2 Reviewed – The Great One" href="http://www.brightsideofnews.com/2013/09/12/nvidia-maximus-2-reviewed-the-great-one/" target="_blank">Nvidia&#8217;s Maximus 2 development platform, which we also reviewed</a>, as that solution is absolutely bulletproof but also incredibly expensive. Here AMD is claiming that it delivers more performance with fewer GPUs and comparable memory. However, AMD doesn&#8217;t talk about the development scenarios its platform enables or how good its professional drivers are compared to Nvidia&#8217;s. The Maximus 2 platform (and subsequent versions) is all about stability and reliability, not necessarily performance, as we learned in our review. So, until AMD can put these GPUs in our hands and show us that its GPUs and platforms are as stable as Nvidia&#8217;s in the same applications, we&#8217;re not entirely sure that AMD can draw these comparisons. Yes, fewer GPUs will consume less power, but power isn&#8217;t always as much of a concern in professional graphics scenarios.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_3_9801.jpg" rel="lightbox-2"><img class="aligncenter wp-image-36142 size-full" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_3_9801.jpg" alt="W8100" width="980" height="404" /></a></p>
<p>&nbsp;</p>
<p>Last but not least, AMD&#8217;s W8100 was benchmarked in a ton of AMD-favorable benchmarks and applications (mostly OpenCL-heavy), and it obviously won handily. However, the most interesting benchmark to me, and one that isn&#8217;t cherry-picked by AMD, was the DaVinci Resolve benchmark showing how performance scales across multiple W8100s. In that benchmark AMD shows almost 100% scaling in DaVinci Resolve, which may be incredibly attractive to professionals who do lots of heavy post-processing.</p>
<div id="attachment_36147" style="width: 990px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_Resolve_9801.jpg" rel="lightbox-3"><img class="size-full wp-image-36147" src="http://cdn.vrworld.com/wp-content/uploads/2014/06/W8100_Resolve_9801.jpg" alt="W8100" width="980" height="479" /></a><p class="wp-caption-text">DaVinci Resolve performance scaling with W8100</p></div>
<p>Also, in regards to <a href="http://links.em.experience.amd.com/servlet/MailView?ms=MjEwMzQ4MTES1&amp;r=NzMzNTE5MTkwMzgS1&amp;j=MzQxMjc5MjU0S0&amp;mt=1&amp;rt=0" target="_blank">AMD&#8217;s 50% off promotion</a>, only specific GPUs are actually eligible, including the W9100. And frankly, if you&#8217;re going to use the 50% off promotion, you might as well use it on their fastest, most expensive and most capable professional graphics card. Other options include $800 off the MSRP of the W8000, $450 off the MSRP of the W7000, $1,250 off the S9000&#8217;s MSRP and $715 off the S7000&#8217;s MSRP. So, obviously it isn&#8217;t 50% off all professional graphics cards, but rather up to 50% off some of them.</p>
<p>I&#8217;m not sure why AMD is doing this; maybe it wants to introduce people to its GPUs by letting them buy one cheaply, which isn&#8217;t a bad sales strategy. However, it may also be that AMD is desperate to sell these GPUs and is cherry-picking specific models and prices to make sure it is still making a profit on them.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/06/23/amd-launches-w8100-cuts-gpus-prices-50-first-gpu/">AMD Launches W8100, Cuts GPUs Prices 50% for First GPU</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/06/23/amd-launches-w8100-cuts-gpus-prices-50-first-gpu/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</title>
		<link>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/</link>
		<comments>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/#comments</comments>
		<pubDate>Wed, 09 Apr 2014 17:55:31 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Reviews]]></category>
		<category><![CDATA[3840x2160]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[4K Gaming]]></category>
		<category><![CDATA[500 Watts]]></category>
		<category><![CDATA[500w]]></category>
		<category><![CDATA[512-bit]]></category>
		<category><![CDATA[8GB]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[AMD Radeon]]></category>
		<category><![CDATA[AMD Radeon R9 295X2]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[Dual GPU]]></category>
		<category><![CDATA[GDDR5]]></category>
		<category><![CDATA[Hawaii]]></category>
		<category><![CDATA[Islands]]></category>
		<category><![CDATA[memory bus]]></category>
		<category><![CDATA[power]]></category>
		<category><![CDATA[PSU]]></category>
		<category><![CDATA[Shader Cores]]></category>
		<category><![CDATA[Vesuvius]]></category>
		<category><![CDATA[Volcanic Islands]]></category>

		<guid isPermaLink="false">http://wp.bsne.ws/?p=34324</guid>
		<description><![CDATA[<p>AMD has been teasing their Radeon R9 295X2, codenamed Vesuvius, for quite some time now, including a lengthy &#8216;viral&#8217; campaign that involved secret agents, semi-ambiguous ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1920" height="1000" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00381.jpg" class="attachment-post-thumbnail wp-post-image" alt="DSC_0038" /></p><p>AMD has been teasing their Radeon R9 295X2, codenamed Vesuvius, for quite some time now, including a lengthy &#8216;viral&#8217; campaign that involved secret agents, <a title="Viral Marketing for AMD Vesuvius" href="http://www.brightsideofnews.com/news/2014/3/19/amds-2betterthan1-vesuvius-viral-marketing-continues.aspx" target="_blank">semi-ambiguous packages of Volcanic Island water and chips</a> as well as <a title="AMD Teases new Dual GPU" href="http://www.brightsideofnews.com/news/2014/3/13/amd-teases-new-dual-gpu-card-with-2betterthan1-viral-ad.aspx" target="_blank">creepy photos of yours truly</a>. Now that the secret is out of the case, we can finally tell you about AMD&#8217;s new card and exactly what it is intended to do. First and foremost, this card&#8217;s sole purpose is to deliver a single-card 4K gaming experience, something that is currently impossible even with the latest crop of cards like AMD&#8217;s Radeon R9 290X and Nvidia&#8217;s GeForce GTX 780 Ti. In this review, we&#8217;ll see whether AMD&#8217;s latest and greatest graphics card is capable of delivering true 4K gaming on a single card.</p>
<p>The name of the AMD Radeon R9 295X2 doesn&#8217;t quite make sense to me when you consider that it is really just two R9 290Xs in a single dual-GPU graphics card. I would&#8217;ve called it the R9 290X2, but perhaps I&#8217;m missing the point here. When you take into consideration that this card is effectively two R9 290Xs strapped onto a single graphics card, and you remember all of the thermal issues AMD had with a single R9 290X, it was obvious they needed to go with liquid cooling. And they did: they tapped Asetek for a custom cooling solution to cool both Hawaii GPUs in this Vesuvius card.</p>
<p>The Radeon R9 295X2 has both an LED-lit fan and an LED-lit logo between the two liquid cooling tubes.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01051.jpg" rel="lightbox-0"><img class="aligncenter  wp-image-34313" alt="DSC_0105" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01051.jpg" width="1152" height="574" /></a></p>
<p><span style="line-height: 1.5em;">The R9 295X2 came to us in a briefcase, which we originally teased in an article, even though we&#8217;re not actually sure what the retail packaging for this card will be, since these were press samples. Judging by </span><a style="line-height: 1.5em;" href="http://rog.asus.com/314282014/gaming-graphics-cards-2/asus-announces-r9-295x2-graphics-card/" target="_self">ASUS&#8217; own packaging</a><span style="line-height: 1.5em;">, it doesn&#8217;t look like it&#8217;ll be coming in a briefcase like ours did. What a shame.</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_9979_9101.jpg" rel="lightbox-1"><img class="aligncenter size-full wp-image-34336" alt="DSC_9979_910" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_9979_9101.jpg" width="910" height="490" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_0004_9101.jpg" rel="lightbox-2"><img class="aligncenter size-full wp-image-34335" alt="DSC_0004_910" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_0004_9101.jpg" width="910" height="910" /></a></p>
<p>Looking at the card itself in detail, you are getting a dual-Hawaii-ASIC graphics card with a maximum capacity of 8GB of GDDR5 memory. When you compare it side by side against the R9 290X and the R9 290, you can see exactly what you&#8217;re getting in terms of performance and raw specs.</p>
<p>Furthermore, the compute performance, at 11.5 TFLOPS, is actually more than double the 5.6 TFLOPS that the R9 290X is rated at, rather than the expected 11.2 TFLOPS, because the GPUs actually clock up to 1018 MHz instead of the R9 290X&#8217;s 1 GHz (the reason for the 295X2 naming?). The higher clocks were enabled thanks to the cooling and improved binning more than anything else, which we&#8217;ll talk about next. The overall compute capability of 11.5 TFLOPS comes from the card&#8217;s 5,632 cores/stream processors, and it also boasts a texture fill rate of up to 358.3 GT/s.</p>
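<p>Both headline numbers fall out of the specs above with the same FMA-per-clock arithmetic. A quick sketch; the TMU count (176 per Hawaii GPU, 352 total) is an assumption carried over from the R9 290X rather than something stated here:</p>

```python
def peak_tflops(stream_processors, clock_mhz):
    # 2 FLOPs (one fused multiply-add) per stream processor per clock
    return stream_processors * clock_mhz * 2 / 1e6

def fill_rate_gts(texture_units, clock_mhz):
    # texels per second = texture units x clock
    return texture_units * clock_mhz / 1e3

# 5,632 stream processors at up to 1018 MHz -> ~11.5 TFLOPS (vs. 2 x 5.6 = 11.2 at 1 GHz)
print(round(peak_tflops(5632, 1018), 1))   # 11.5
# 352 TMUs (assumed) at 1018 MHz -> the quoted 358.3 GT/s
print(round(fill_rate_gts(352, 1018), 1))  # 358.3
```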
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardSpecs1.jpg" rel="lightbox-3"><img class="aligncenter  wp-image-34294" alt="CardSpecs" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardSpecs1.jpg" width="1152" height="624" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/TransistorsAndShaders1.jpg" rel="lightbox-4"><img class="aligncenter  wp-image-34317" alt="TransistorsAndShaders" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/TransistorsAndShaders1.jpg" width="1152" height="603" /></a></p>
<p>&nbsp;</p>
<p>In order to combat the overheating and thermal throttling issues that plagued the R9 290X, AMD approached Asetek to help water cool the entire graphics card. Asetek is very good at making self-contained liquid cooling systems and is capable of producing them in the thousands, if not hundreds of thousands. As a result, Asetek helped AMD engineer a solution that would cool not just a single Hawaii GPU but two slightly higher-clocked Hawaii GPUs, a pretty tall order.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/AsetekWatercooling11.jpg" rel="lightbox-5"><img class="aligncenter  wp-image-34337" alt="AsetekWatercooling" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/AsetekWatercooling11.jpg" width="1152" height="620" /></a></p>
<p>If you look at the card itself, there are a few major parts to the whole system: the metal backplate; the PCB with the GPUs and memory; the Asetek liquid cooling blocks and full-length cooling plate; and the metal cover and fan. There is also a heat exchanger (radiator) attached to the Asetek waterblock/pumps, and each block and pump is connected to the other. If the GPUs are already water cooled, why have a fan on the cover of the card, some of you might ask? Because the fan&#8217;s purpose is to keep the memory chips cool, as well as the voltage regulators in the middle. Remember, voltage regulators are typically the first failure points of most dual-GPU cards.</p>
<p>&nbsp;</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardBreakDown1.jpg" rel="lightbox-6"><img class="aligncenter  wp-image-34292" alt="CardBreakDown" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardBreakDown1.jpg" width="1152" height="575" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardDimensions1.jpg" rel="lightbox-7"><img class="aligncenter  wp-image-34293" alt="CardDimensions" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardDimensions1.jpg" width="1152" height="508" /></a></p>
<p>As a result, you get a very powerful graphics card that is fully contained and self-sufficient, as long as you give it 500 watts of power. That means this card has two PCI Express 8-pin power connectors, and it also means you need the right PSU, with enough current on the +12V rail, to power it safely and stably. For this purpose, we will be using a Thermaltake Toughpower XT 1475W power supply. Obviously you don&#8217;t need that much unless you plan to run two of these cards in a single system.</p>
<p>&nbsp;</p>
<p>The back of the card is populated with four MiniDP connectors and one Dual-Link DVI connector, netting you a total of five monitors at up to 2560&#215;1600 resolution if you use all five. You could also theoretically drive three 4K monitors off of this card &#8211; realistically you&#8217;d want two of these cards in order to drive a triple-4K display setup smoothly. Gaming in 25 million pixels? Everything is possible.</p>
<p>That takes us to the GPU&#8217;s overall performance, evaluating its capability as a 4K gaming card and whether it is actually worth the $1,499 price tag. Not just that, but how well does it utilize the 8GB of GDDR5, and does it run anywhere near as hot as the R9 290X with the new cooling solution?</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00301.jpg" rel="lightbox-8"><img class="aligncenter  wp-image-34303" alt="DSC_0030" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00301.jpg" width="1152" height="533" /></a></p>
<p>For our testing, we ran synthetic benchmarks, OpenCL benchmarks and game benchmarks. All of these benchmarks were run with both Nvidia&#8217;s and AMD&#8217;s latest graphics drivers, and AMD&#8217;s Mantle was enabled where possible (Battlefield 4). The system we tested the cards on was powered by a Core i7-4960X with 16GB of DDR3-1600 on an X79 board, with the cards in an x16 PCIe slot. The PSU, as mentioned earlier, was a Thermaltake Toughpower XT 1475W, and we used a Sharp PN-K321 monitor to test 4K. We did not test at any resolution other than 4K because it seems ridiculous to spend $1,500 on a graphics card and use it for anything other than 4K, seriously. Theoretically, you could dedicate it to compute, but in that case we&#8217;ve got you covered as well.</p>
<p>The tests that we ran in our benchmarking were 3DMark Fire Strike Extreme, Kishonti Compubench (CLBenchmark), Luxmark v2.0, Battlefield 4, Crysis 3 and Counter Strike: Global Offensive.</p>
<p>In our testing, we compared the R9 295X2, the R9 290 and the GTX 780 Ti. We had some issues getting the last-gen Radeon HD 7990 to work properly with our 4K monitor in the game tests, so we had to throw it out, and we will likely have to figure out a solution to get it to work in the future. Unfortunately, that issue showed AMD&#8217;s drivers at their worst. However, we do have some non-gaming HD 7990 numbers in our 3DMark and OpenCL benchmarks to give an idea of the R9 295X2&#8217;s gains over the HD 7990.</p>
<p><span style="font-weight: bold;">Synthetic Benchmarks</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/3DmarkFireStrikeExtremeFixed1.jpg" rel="lightbox-9"><img class="aligncenter size-full wp-image-34318" alt="3DmarkFireStrikeExtremeFixed" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/3DmarkFireStrikeExtremeFixed1.jpg" width="1012" height="599" /></a></p>
<p>In 3DMark, the first test we ran, the R9 295X2 came in as the fastest single card that we&#8217;ve ever tested, by a huge margin. The R9 295X2 scored 8403 in 3DMark Fire Strike Extreme, which bested our GTX Titan SLI setup&#8217;s 7391 and the HD 7990&#8217;s 4639. It also nearly doubled the R9 290&#8217;s 4631, though the 290X would have scored higher, so the R9 295X2 isn&#8217;t quite double the performance of a single Hawaii GPU in 3DMark. Even so, it is still the fastest multi-GPU setup we&#8217;ve tested in 3DMark.</p>
<p><span style="font-weight: bold;">Compute Performance</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/KishontiCLBench1.jpg" rel="lightbox-10"><img class="aligncenter size-full wp-image-34322" alt="KishontiCLBench" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/KishontiCLBench1.jpg" width="919" height="466" /></a></p>
<p>In Kishonti&#8217;s CompuBench, the new name for their CLBenchmark (since they&#8217;ve added RenderScript on mobile), we wanted to see how each card handled compute, as Nvidia has traditionally struggled against AMD here, and we found some really interesting results. In terms of OpenCL compute, AMD&#8217;s Radeon R9 295X2 actually outperformed three of Nvidia&#8217;s GTX Titans, and obviously the short-lived Radeon HD 7990. The card scored 686,475 points, higher than the three Titans&#8217; 646,233 and the Radeon HD 7990&#8217;s 560,695. What is interesting, though, is that the R9 295X2 scored more than double the R9 290&#8217;s 333,444, meaning that it likely scores roughly double that of the R9 290X.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/LuxMarkv21.jpg" rel="lightbox-11"><img class="aligncenter size-full wp-image-34323" alt="LuxMarkv2" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/LuxMarkv21.jpg" width="913" height="440" /></a></p>
<p>The story was a little different in Luxmark v2.0; however, in the end the R9 295X2 was still much faster than all of the competing GPU solutions, and AMD actually took the top three slots, even against three of Nvidia&#8217;s GTX Titans. The R9 295X2 is a really powerful card for compute, and that is quite evident here with its score of 5350 over the 7970 GHz Editions in CrossFireX (4679) and the HD 7990 (4475).</p>
<p><span style="font-weight: bold;">Real World Game Performance</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/BF4_4K1.jpg" rel="lightbox-12"><img class="aligncenter size-full wp-image-34319" alt="BF4_4K" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/BF4_4K1.jpg" width="769" height="353" /></a></p>
<p>In Battlefield 4 we wanted a playable scenario for all of the cards that we tested, so we set the settings to the High preset rather than the highest preset, which I believe is Ultra. We also opted for 4x MSAA instead of 8x MSAA. As a result, all of the cards provided playable frame rates, with the R9 295X2 delivering an average of 81 FPS compared to the R9 290 and 780 Ti, which both averaged 40 FPS. Considering that the R9 295X2 averaged 81 FPS, we would likely be able to crank the settings up to the Ultra preset and still have playable frame rates at near-maximum settings. Thanks to Mantle, AMD&#8217;s R9 295X2 is able to deliver an impressive amount of performance in BF4 at 4K. During our benchmarking the R9 295X2 used a maximum of 6.7 GB of the GPU&#8217;s available 8GB of memory, indicating some wiggle room for a graphics settings bump, especially with Mantle, which would probably bring the card closer to 8GB.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/Crysis3_4K1.jpg" rel="lightbox-13"><img class="aligncenter" alt="Crysis3_4K" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/Crysis3_4K1.jpg" width="763" height="394" /></a></p>
<p>With Crysis 3 we just wanted to make the graphics cards cry and see what would happen if we ran the game at nearly the maximum settings possible while still keeping the R9 295X2 playable. This was achieved by setting the game to Very High and dialing the MSAA to 4x instead of 8x. As a result, the R9 295X2 averaged 29 FPS while the 780 Ti averaged 20 FPS and the R9 290 18 FPS. Even though the game looked absolutely stunning at these settings in 4K, it was only playable on the R9 295X2 and would&#8217;ve required two 780 Tis or two R9 290s to get anywhere near similar performance. Most interesting was that this game pushed the GPU&#8217;s memory utilization far higher than Battlefield 4 did. At its peak, the R9 295X2 used 7.5 GB of the 8 GB of memory, indicating that we had effectively maxed out the card&#8217;s capabilities at 4K.</p>
<p>Thus, if you think that 8 GB of video memory is overkill for a graphics card, think again.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CSGO_4k_6891.jpg" rel="lightbox-14"><img class="aligncenter size-full wp-image-34321" alt="CSGO_4k_689" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CSGO_4k_6891.jpg" width="760" height="387" /></a></p>
<p>In testing Counter Strike: Global Offensive, we knew we weren&#8217;t going to see anything crazy in terms of graphics, but we did know that we could easily play the game at 4K on all of our cards and get frame rates high enough to differentiate between them more effectively. Not to mention, this is a Source Engine game while the others run on Frostbite and CryEngine. Personally, I play a lot of Counter Strike, so this interested me a lot, and since Counter Strike: Global Offensive is on its way to becoming one of the most played games on Steam, it only makes sense to test it at 4K and see what kind of performance it delivers. We also set the game to its maximum possible settings in order to lower the FPS as much as possible and improve the overall visual quality at 4K.</p>
<p>No surprise: at max settings all of these cards averaged nearly 150 FPS or more. The R9 295X2 delivered an average of 250 FPS (300 is the cap set by the Source Engine) while the R9 290 managed 163 FPS and the 780 Ti 148 FPS. So with the R9 295X2 you&#8217;re getting almost 100 FPS more than the single GPUs at 4K. And in this scenario the R9 295X2&#8217;s memory utilization was, as expected, much lower than in the other two games, at only 3.6 GB of the 8 GB of memory.</p>
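The multi-GPU scaling implied by those averages can be sketched quickly. A minimal sketch using the FPS figures measured above; note the single-GPU baseline here is the R9 290 we tested, not an exact 290X:

```python
# CrossFire scaling of the dual-GPU R9 295X2 over the single R9 290,
# using the average FPS figures from our 4K runs.
results = {  # game: (R9 295X2 avg FPS, R9 290 avg FPS)
    "Battlefield 4": (81, 40),
    "Crysis 3": (29, 18),
    "CS:GO": (250, 163),
}

for game, (dual, single) in results.items():
    print(f"{game}: {dual / single:.2f}x over a single Hawaii GPU")
```

Scaling is near-perfect in Battlefield 4 (about 2x) and drops off in the GPU-bound Crysis 3 and the engine-capped CS:GO.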
<p>That wraps up our performance testing. But before we move on to thermals, we want to explain why we kept mentioning memory utilization: to illustrate the point of going with an 8 GB card rather than a 16 GB card. 8 GB is pretty much enough in almost any scenario, unless you believe you&#8217;ll be running Crysis 3 at 4K. Anyone trying to sell you a card with more than 8 GB is probably selling you more memory than you need, and if you&#8217;re not running 4K you probably don&#8217;t need more than 3 or 4 GB. Keep that in mind as add-in board partners try to charge more for memory you&#8217;ll never use.</p>
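As a rough summary of that headroom argument, here are the peak figures quoted above set against the card&#8217;s advertised 8 GB frame buffer (illustrative numbers from our own runs only):

```python
# Peak VRAM use observed on the R9 295X2 at 4K, versus its 8 GB frame buffer.
CAPACITY_GB = 8.0

peak_usage_gb = {
    "Battlefield 4 (High, 4x MSAA)": 6.7,
    "Crysis 3 (Very High, 4x MSAA)": 7.5,
    "CS:GO (max settings)": 3.6,
}

for game, used in peak_usage_gb.items():
    print(f"{game}: {used:.1f} GB used, "
          f"{CAPACITY_GB - used:.1f} GB headroom ({used / CAPACITY_GB:.0%})")
```

Only Crysis 3 comes close to the ceiling, which is the whole case against paying for more than 8 GB.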
<p>We also want to note that even with AMD&#8217;s latest drivers and a custom CrossFire profile, certain benchmarks simply would not run on both GPUs. We also had some problems configuring the 4K monitor with the R9 295X2, though we were able to work those out. It seemed to us that this card&#8217;s drivers still needed some work, but they held up in the three games we tested. AMD&#8217;s frame pacing also appeared to work well: we didn&#8217;t notice any stuttering, nor nearly as much tearing as we had with previous AMD dual-GPU setups.</p>
<p><span style="font-weight: bold;">Temperatures</span></p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00111.jpg" rel="lightbox-15"><img class="aligncenter  wp-image-34299" alt="DSC_0011" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00111.jpg" width="1152" height="394" /></a></p>
<p>Now, getting back to the card, let&#8217;s talk thermals. When I first heard about this card and its possible existence, my first concern was cooling and temperatures. AMD struggled with the R9 290X in terms of perception because the card topped out at a near-water-boiling 95 degrees Celsius under extreme loads and, as a result, actually throttled performance. Their add-in board partners eventually released better cooling solutions that alleviated the issue to a degree, but the concern came back once I heard AMD would be making what was supposed to be a dual-Hawaii GPU. To be honest, my first thought was that a dual-Hawaii card would have to be two 290s, and that it would have to be water cooled.</p>
<p>When I found out that this card would actually be two slightly higher-clocked 290Xs, I was a little concerned about how it would perform thermally and whether the cooler could handle the heat generated by these GPUs. Even when I saw the cooling solution, I wasn&#8217;t sure the single 120mm Asetek radiator could handle a maximum of 500W of GPU power dissipation. After all, most CPU liquid coolers use the same 120mm radiators to cool 130W CPUs, a fraction of that heat load.</p>
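To put that concern in rough numbers (the 130W CPU figure and the 500W GPU figure are the ballpark TDPs discussed above, not measured values):

```python
# Rough heat-load comparison for a single 120mm radiator: a typical
# 130W CPU loop versus the ~500W the R9 295X2's two GPUs can dissipate.
cpu_tdp_w = 130
gpu_tdp_w = 500

ratio = gpu_tdp_w / cpu_tdp_w
print(f"The 295X2's radiator must shed roughly {ratio:.1f}x the heat "
      f"of a typical 120mm CPU loop.")
```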
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00251.jpg" rel="lightbox-16"><img class="aligncenter  wp-image-34301" alt="DSC_0025" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00251.jpg" width="1152" height="766" /></a></p>
<p>After all of my testing, I found that the first GPU idled at a normal temperature of around 36C, which is pretty good for an ordinary graphics card and about what you&#8217;d expect from a water-cooled one. What really caught me by surprise was the temperature of the first GPU (the most heavily used one) under maximum load: it never got hotter than 67C, even when the radiator and tubes were warm to the touch. Because of that, this card can deliver a very stable level of performance, and do it consistently time and time again. I don&#8217;t know exactly what the people at Asetek and AMD did, <a href="http://asetek.com/press-room/news/2014/amd-selects-asetek-to-liquid-cool-the-world%E2%80%99s-fastest-graphics-card.aspx" target="_self">but they have engineered a pretty fantastic cooling solution</a>, one that I frankly thought might be underpowered for these GPUs.</p>
<p><span><span style="font-weight: bold;">Conclusion</span></span></p>
<p>The Radeon R9 295X2 is a $1,500 card. There&#8217;s no getting around that. In fact, the price makes a lot of sense if you think about it. The R9 290X should sell for around $550, even though there&#8217;s still a fair amount of gouging left over from last year&#8217;s cryptocurrency rush. <a href="http://www.newegg.com/Product/Product.aspx?gclid=CObM56jb1L0CFUNhfgod6ocAUg&amp;Item=N82E16814129278&amp;nm_mc=KNC-GoogleAdwords&amp;cm_mmc=KNC-GoogleAdwords-_-pla-_-Desktop+Graphics+Cards-_-N82E16814129278&amp;ef_id=UhMeXwAAAWKGwRMG:20140410005755:s" target="_self">Newegg is selling a Visiontek R9 290X</a> for just under $600 with a non-reference cooler. If you take the $550 price and double it, you&#8217;ve already arrived at an $1,100 card purely on the assumption of 2x the price. This card is also water cooled, which normally carries a $100-$200 premium on any graphics card, bringing the expected retail price up to about $1,300. However, dual-GPU cards rarely sell for 2x the price of their base-model single-GPU parents, and <a href="http://www.brightsideofnews.com/news/2014/3/25/gtc-2014-keynote---gtx-titan-z-and-pascal-announced.aspx" target="_self">Nvidia announced the Titan Z would be selling for $3,000</a>. That gave AMD some wiggle room on price, and I suspected they would sell it somewhere between $1,300 and $1,500. And here we are, at $1,500.</p>
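That back-of-the-envelope math works out as follows; the $550 base price and the $200 water-cooling premium are the review&#8217;s own ballpark figures, not official pricing:

```python
# Estimating the R9 295X2's expected price from its single-GPU parent.
r9_290x_price = 550          # approximate R9 290X street price (USD)
water_cooling_premium = 200  # upper end of a typical factory-loop premium

estimate = 2 * r9_290x_price + water_cooling_premium
print(f"Expected retail price: about ${estimate:,}")  # vs. the actual $1,500
```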
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01021.jpg" rel="lightbox-17"><img class="aligncenter  wp-image-34310" alt="DSC_0102" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01021.jpg" width="1152" height="824" /></a></p>
<p>When compared to buying two R9 290Xs, the card simply does not provide more value than owning two separate cards. However, it is significantly quieter and cooler than two R9 290Xs, and it delivers a very high level of 4K gaming performance from a single board. To properly make use of either this card or any dual-290X setup, you should really be looking at a 4K display. So this card makes sense if you want the absolute fastest, if you want to cram four Hawaii GPUs onto a board with only two PCIe slots, or if your case simply isn&#8217;t big enough to fit four GPUs. Not to mention, cooling four Hawaii GPUs on air in any case would be significantly more difficult than cooling two of these Vesuvius cards. In fact, you would spend around $2,300 on four 290Xs and probably end up with a much hotter and louder setup than if you spent the extra $700 and went with two 295X2s. So there is some sense in going with these cards if you absolutely have to have the fastest and the best, and you play at 4K.</p>
<p>In terms of value, keep one thing in mind: for the price of a single GeForce GTX Titan Z, you can purchase an R9 295X2 and not one, but TWO 28&#8243; 4K 60Hz panels from Samsung. If you dislike TN panels, you can still afford a 24&#8243; IPS 4K panel from Dell instead. Talk about a value deal.</p>
<p>I have to say that I was personally quite impressed with this card, especially on the performance and thermals side of things. If you want the absolute latest, greatest, and fastest card in the world, you have to buy the R9 295X2. And if you want to buy just one card for 4K gaming, this is it. Because of that, we&#8217;re awarding this card our editor&#8217;s choice award.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/editors-choice_prosumer1.gif" rel="lightbox-18"><img class="aligncenter size-full wp-image-34342" alt="editors-choice_prosumer" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/editors-choice_prosumer1.gif" width="618" height="68" /></a></p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
	</channel>
</rss>
