<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VR World &#187; DisplayPort</title>
	<atom:link href="http://www.vrworld.com/tag/displayport/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.vrworld.com</link>
	<description></description>
	<lastBuildDate>Fri, 10 Apr 2015 07:54:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.1</generator>
	<item>
		<title>Embedded DisplayPort Standard 1.4a Supports 8K, AMD FreeSync</title>
		<link>http://www.vrworld.com/2015/02/11/embedded-displayport-standard-1-4a-supports-8k-amd-freesync/</link>
		<comments>http://www.vrworld.com/2015/02/11/embedded-displayport-standard-1-4a-supports-8k-amd-freesync/#comments</comments>
		<pubDate>Wed, 11 Feb 2015 10:34:48 +0000</pubDate>
		<dc:creator><![CDATA[Harish Jonnalagadda]]></dc:creator>
				<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[8K]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[eDP]]></category>
		<category><![CDATA[eDP 1.4a]]></category>
		<category><![CDATA[VESA]]></category>

		<guid isPermaLink="false">http://www.vrworld.com/?p=46995</guid>
		<description><![CDATA[<p>The latest eDP standard will allow your mobile to drive an 8K screen. </p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/11/embedded-displayport-standard-1-4a-supports-8k-amd-freesync/">Embedded DisplayPort Standard 1.4a Supports 8K, AMD FreeSync</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="640" height="348" src="http://cdn.vrworld.com/wp-content/uploads/2015/02/VESA-8K.jpg" class="attachment-post-thumbnail wp-post-image" alt="VESA-8K" /></p><p>The Video Electronics Standards Association (VESA) has announced an update to the Embedded DisplayPort (eDP) standard, bringing it to 1.4a. The major additions in this version include the introduction of the Display Stream Compression (DSC) standard (1.1) and a new segmented panel architecture that facilitates an increase in the amount of bandwidth available.</p>
<p>With eDP 1.4a, the focus is on enabling higher bandwidth in systems with integrated graphics such as smartphones, tablets and notebooks. With Quad HD mobile devices already available and 4K touted as the next big thing in smartphones, the bandwidth required to drive a screen is continually growing. To that end, eDP 1.4a provides four lanes, each with a bandwidth of 8.1Gbps. The lanes can be used individually or combined for a theoretical bandwidth of 32.4Gbps, enough to drive a 4K screen (3840×2160) at 120Hz with 10-bit color or an 8K display at 60Hz.</p>
<p>Display Stream Compression is a compression technology that VESA claims produces no noticeable difference in quality, yielding &#8220;visually lossless&#8221; content while reducing the data to as little as one-third of its original size (up to 3:1). There are also enhancements to Panel Self-Refresh, which further reduce data transfer by updating only the pixels that change from frame to frame. We can only gauge the veracity of these claims once we see the standard in consumer-level hardware, which will occur next year.</p>
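<p>As a back-of-the-envelope check on those figures (our own arithmetic, ignoring blanking intervals and link-layer encoding overhead), here is how the raw pixel data rates compare to the link:</p>
<pre><code># Rough eDP 1.4a bandwidth check (our arithmetic; ignores blanking
# intervals and link-layer encoding overhead).
LANE_GBPS = 8.1                 # per-lane rate in eDP 1.4a
LANES = 4
link_gbps = LANE_GBPS * LANES   # 32.4 Gbps theoretical maximum

def raw_gbps(width, height, hz, bits_per_pixel):
    """Raw (uncompressed) pixel data rate in Gbps."""
    return width * height * hz * bits_per_pixel / 1e9

print(link_gbps)                                    # 32.4
print(round(raw_gbps(3840, 2160, 120, 30), 1))      # 29.9 -- 4K/120Hz/10-bit fits
print(round(raw_gbps(7680, 4320, 60, 24), 1))       # 47.8 -- 8K/60Hz exceeds the link
print(round(raw_gbps(7680, 4320, 60, 24) / 3, 1))   # 15.9 -- fits easily with 3:1 DSC
</code></pre>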
<p>Also available in eDP 1.4a is optional support for Adaptive Sync, the basis of AMD&#8217;s FreeSync. The technology works in conjunction with a compatible AMD video card to minimize frame tearing and stutter. As the feature is optional in the standard, we likely won&#8217;t see it in every eDP 1.4a display.</p>
<p>While eDP 1.4a is a move toward enabling 4K and higher resolutions, other significant factors remain in bringing 4K or even 8K to a mobile device, including GPU performance and display technology.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2015/02/11/embedded-displayport-standard-1-4a-supports-8k-amd-freesync/">Embedded DisplayPort Standard 1.4a Supports 8K, AMD FreeSync</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2015/02/11/embedded-displayport-standard-1-4a-supports-8k-amd-freesync/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>AMD Partners With Samsung for FreeSync Displays</title>
		<link>http://www.vrworld.com/2014/11/20/amd-partners-samsung-freesync-displays/</link>
		<comments>http://www.vrworld.com/2014/11/20/amd-partners-samsung-freesync-displays/#comments</comments>
		<pubDate>Fri, 21 Nov 2014 06:21:08 +0000</pubDate>
		<dc:creator><![CDATA[VR World Staff]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[005930]]></category>
		<category><![CDATA[1.2a]]></category>
		<category><![CDATA[144Hz]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[Acer]]></category>
		<category><![CDATA[Adaptive-Sync]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Asus]]></category>
		<category><![CDATA[BenQ]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[Free Sync]]></category>
		<category><![CDATA[FreeSync]]></category>
		<category><![CDATA[Future of Compute]]></category>
		<category><![CDATA[g-sync]]></category>
		<category><![CDATA[monitor]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Philips]]></category>
		<category><![CDATA[QHD]]></category>
		<category><![CDATA[ROG]]></category>
		<category><![CDATA[samsung]]></category>
		<category><![CDATA[Swift]]></category>
		<category><![CDATA[UE850]]></category>
		<category><![CDATA[UHD]]></category>
		<category><![CDATA[VESA]]></category>
		<category><![CDATA[XB280HK]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=42012</guid>
		<description><![CDATA[<p>Today at AMD's Future of Compute event in Singapore, AMD announced that it has partnered with Samsung to put out FreeSync displays.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/11/20/amd-partners-samsung-freesync-displays/">AMD Partners With Samsung for FreeSync Displays</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1271" height="708" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/FreeSync.jpg" class="attachment-post-thumbnail wp-post-image" alt="AMD FreeSync" /></p><p>AMD (<a href="https://www.google.com/finance?cid=327">NYSE<span id="dccae15a-a141-4575-8f04-f04ad29acdeb" class="GINGER_SOFTWARE_mark">:</span>AMD</a>) and Samsung (<a href="https://www.google.com/finance?q=KRX%3A005930&amp;sq=samsung&amp;sp=1&amp;ei=r79uVPnmJOaTswfU64H4Ag">KRX:005930</a>) announced a partnership today at the AMD Future of Compute event in held in Singapore.</p>
<p>The partnership will have Samsung making FreeSync-enabled displays, which will begin to appear in March of next year. The first will be 23.6-inch and 28-inch versions of the UD590, both UHD (4K) displays. Later will come more UHD displays in the UE850 line, with 23.6-inch, 28-inch, and 31.5-inch models. AMD&#8217;s FreeSync is its alternative to Nvidia&#8217;s G-SYNC tech, which has been on the market for over a year now. AMD went the royalty-free route with FreeSync, basing it on the Adaptive-Sync open standard for DisplayPort, which VESA is implementing in the new DisplayPort 1.2a and 1.3 specifications. The lower cost of FreeSync will no doubt encourage more manufacturers to use it in upcoming models.</p>
<p><iframe src="//www.youtube.com/embed/QAWtKK9ga2k" width="1280" height="720" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p>It may be fairly late in the game for FreeSync, since G-Sync has done well with gaming enthusiasts looking for a good gaming monitor. Nvidia (<a href="www.google.com/finance?cid=662925">NASDAQ: NVDA</a>) has made headway getting companies on the G-SYNC train, such as ASUS (<a href="www.google.com/finance?cid=674388">TPE: 2357</a>), BenQ (TPE:8215), Philips, and Acer (<a href="www.google.com/finance?cid=681406">TPE:2353</a>). The 27&#8243; QHD (2560&#215;1440) 144Hz <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16824236405&amp;cm_re=PG278Q-_-24-236-405-_-Product">ASUS ROG Swift PG278Q</a> is one of the best G-SYNC monitors currently available, but it sells for $799 and isn&#8217;t even 4K. There are also UHD versions such as the <a href="http://www.amazon.com/gp/product/B00O0Z5682/ref=as_li_tl?ie=UTF8&amp;camp=1789&amp;creative=9325&amp;creativeASIN=B00O0Z5682&amp;linkCode=as2&amp;tag=brsiofne0e-20&amp;linkId=5JIIPESM6OHDKTG5">Acer XB280HK, which retails for just under $800</a>. But even with a decent number of monitors on the market, Nvidia&#8217;s G-Sync monitors command a serious premium over equally specced monitors without G-Sync. As such, this could potentially put Nvidia at a disadvantage, since many monitor manufacturers are constantly looking for ways to differentiate without spending too much money on the bill of materials (BOM).</p>
<p>There is little doubt that FreeSync, and the partnership between AMD and Samsung, will do well. With about four months until the models come out, users now have a date and plenty of time to set some money aside. These will be the go-to monitors for AMD fans, and there are plenty of them waiting for a next-gen AMD product to pair with a FreeSync display. There has also been talk of companies like BenQ and even ASUS potentially releasing FreeSync-compatible monitors. The reality of the situation is that adaptive sync is the superior technology and will very likely be adopted by Intel, which would help AMD overcome Nvidia&#8217;s overall size and dominance in the graphics market. Once AMD can partner with Intel (crazy to think) to push FreeSync/Adaptive-Sync in both companies&#8217; products, G-Sync will effectively be dead. But you can&#8217;t blame Nvidia for wanting to improve gaming, even at a fairly high cost.</p>
<p>The partnership will see Samsung releasing new FreeSync monitors in March 2015.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/11/20/amd-partners-samsung-freesync-displays/">AMD Partners With Samsung for FreeSync Displays</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/11/20/amd-partners-samsung-freesync-displays/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Top Five Notebooks of 2014</title>
		<link>http://www.vrworld.com/2014/11/17/top-five-notebooks-of-2014/</link>
		<comments>http://www.vrworld.com/2014/11/17/top-five-notebooks-of-2014/#comments</comments>
		<pubDate>Mon, 17 Nov 2014 08:54:07 +0000</pubDate>
		<dc:creator><![CDATA[VR World Staff]]></dc:creator>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Guides]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Holiday Shopping Guides]]></category>
		<category><![CDATA[Mobile Computing]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[1080P]]></category>
		<category><![CDATA[128GB]]></category>
		<category><![CDATA[16GB]]></category>
		<category><![CDATA[1TB]]></category>
		<category><![CDATA[2014]]></category>
		<category><![CDATA[4710HQ]]></category>
		<category><![CDATA[8GB]]></category>
		<category><![CDATA[980m]]></category>
		<category><![CDATA[Apple]]></category>
		<category><![CDATA[Black Friday]]></category>
		<category><![CDATA[Black Friday 2014]]></category>
		<category><![CDATA[Chrome]]></category>
		<category><![CDATA[Chrome OS]]></category>
		<category><![CDATA[Chromebook]]></category>
		<category><![CDATA[Chromebook 2]]></category>
		<category><![CDATA[DDR3]]></category>
		<category><![CDATA[display]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[Dominator Pro]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[GT60]]></category>
		<category><![CDATA[GT70]]></category>
		<category><![CDATA[GT72]]></category>
		<category><![CDATA[gtx]]></category>
		<category><![CDATA[HDMI]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[IPS]]></category>
		<category><![CDATA[Laptop]]></category>
		<category><![CDATA[Lenovo]]></category>
		<category><![CDATA[M.2]]></category>
		<category><![CDATA[Macbook]]></category>
		<category><![CDATA[Macbook Air]]></category>
		<category><![CDATA[MacBook Pro]]></category>
		<category><![CDATA[MSI]]></category>
		<category><![CDATA[Notebook]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[OSX]]></category>
		<category><![CDATA[PC]]></category>
		<category><![CDATA[raid]]></category>
		<category><![CDATA[resolution]]></category>
		<category><![CDATA[Retina]]></category>
		<category><![CDATA[SSD]]></category>
		<category><![CDATA[SuperRAID]]></category>
		<category><![CDATA[Top]]></category>
		<category><![CDATA[Toshiba]]></category>
		<category><![CDATA[Wi-Fi]]></category>
		<category><![CDATA[Windows]]></category>
		<category><![CDATA[X1]]></category>
		<category><![CDATA[X1 Carbon]]></category>
		<category><![CDATA[Yoga]]></category>
		<category><![CDATA[Yoga 3 Pro]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=41690</guid>
		<description><![CDATA[<p>With the holidays approaching fast, many people will be getting new notebooks this holiday season; we give you our picks for the top five.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/11/17/top-five-notebooks-of-2014/">Top Five Notebooks of 2014</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1500" height="918" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/toshiba-chromebook-2-MAIN.jpg" class="attachment-post-thumbnail wp-post-image" alt="toshiba chromebook 2 -MAIN" /></p><p>With the holidays approaching fast and the sales that come with it many people will be getting new notebooks this holiday season.  Researching what the best notebooks out there are takes time and we though we could help out by giving you our choices in five different categories of notebooks.  They are gaming, ultraportable, Chromebook, business, and Apple MacBook.  With the latest computer parts in these notebooks it will be certain that they will be able to earn there keep for quite some time.  There are many things to consider when looking for a notebook and budget is a big one.  Most of these choices are going to be in the higher end range since they perform better.</p>
<h2>Gaming Notebook: <a href="http://www.amazon.com/gp/product/B00O4ORYN4/ref=as_li_tl?ie=UTF8&amp;camp=1789&amp;creative=9325&amp;creativeASIN=B00O4ORYN4&amp;linkCode=as2&amp;tag=brsiofne0e-20&amp;linkId=FC6GA66ARVBS5TLB">MSI Computer GT72 Dominator Pro with GTX 980M &#8211; $2,299</a></h2>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/11/msi-gt72-dominator-pro.jpg" rel="lightbox-0"><img class="aligncenter size-medium wp-image-41701" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/msi-gt72-dominator-pro-600x493.jpg" alt="msi gt72 dominator pro" width="600" height="493" /></a></p>
<p>This gaming notebook is the top of the line from MSI, and it does not disappoint. Equipped with an Intel Core i7-4710HQ, 16GB of DDR3-1600, a 128GB SSD, and an Nvidia GTX 980M with 8GB of memory, it will tear through any recent or upcoming game. The best thing about the GT72 Dominator Pro is that MSI has taken steps to ensure that you can upgrade the graphics down the line when new and better GPUs come out. This means the investment made in this computer will last far longer than comparable notebooks that are stuck with their graphics for the life of the product. If you want more speed from the notebook, throw four identical M.2 SSDs inside and put them in RAID 0 for read/write speeds of up to 1600MB/s. From experience with MSI&#8217;s SuperRAID, as it is called, I can attest to how massive an improvement speeds like this can be; they make using the notebook so much more enjoyable. You can even attach three monitors to the notebook via two Mini DisplayPort outputs and one HDMI output. The keyboard feels like no other notebook keyboard and is made by the gaming company SteelSeries. The bottom line: if this is in your budget and you are a gamer or even a power user, I have little doubt that you will love this notebook.</p>
<h2>Apple Notebook: <a href="http://www.amazon.com/gp/product/B00G2MB7KW/ref=as_li_tl?ie=UTF8&amp;camp=1789&amp;creative=9325&amp;creativeASIN=B00G2MB7KW&amp;linkCode=as2&amp;tag=brsiofne0e-20&amp;linkId=54HBDESOICO2KH4B">15&#8243; 2.5GHz MacBook Pro with Retina display &#8211; $2,374.99</a></h2>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/11/apple-macbook-pro-15-retina.jpg" rel="lightbox-1"><img class="aligncenter size-medium wp-image-41698" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/apple-macbook-pro-15-retina-600x369.jpg" alt="apple macbook pro 15 retina" width="600" height="369" /></a></p>
<p>There is no denying it: Apple makes good notebooks, and they are arguably the best notebooks for running Windows on as well. The Retina display allows for better multi-tasking, as its resolution gives you lots of screen real estate, and it will be great for those who edit photos or video since it reproduces colors very well. The construction of the MacBook Pro is amazing, and the fit and finish are things that other companies could learn a thing or two from. We chose the 2.5GHz Intel Core i7 as the most capable and powerful option at a reasonable price. At Apple you can customize the MacBook Pro with up to a 1TB SSD and a 2.8GHz Intel Core i7.</p>
<h2>Ultraportable Notebook: <a href="http://www.amazon.com/gp/product/B00OVFGU36/ref=as_li_tl?ie=UTF8&amp;camp=1789&amp;creative=9325&amp;creativeASIN=B00OVFGU36&amp;linkCode=as2&amp;tag=brsiofne0e-20&amp;linkId=BW7WV4SKM5HG3H2N">Lenovo Yoga 3 Pro &#8211; $1,382.99</a></h2>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/11/Lenovo-Yoga-3-Pro.jpg" rel="lightbox-2"><img class="aligncenter size-medium wp-image-41700" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/Lenovo-Yoga-3-Pro-600x590.jpg" alt="Lenovo Yoga 3 Pro" width="600" height="590" /></a></p>
<p>The 2-in-1 Lenovo Yoga 3 Pro is the latest model in the Yoga line and features an amazing watchband-style hinge. The hinge is made up of hundreds of individual pieces, allowing the notebook to flex like no other and fold into different positions. The screen is a high-resolution (3200&#215;1800) IPS touchscreen that reproduces colors very well and looks much better than lower-quality panels. If you need something very light and portable with long battery life, this notebook should be on your shortlist. The stylish Yoga 3 Pro is one of the most stunning designs out now and is powered by the brand-new Intel Core M processor.</p>
<h2>Chromebook: <a href="http://www.toshiba.com/us/computers/laptops/chromebook/cb30-2hd">Toshiba Chromebook 2 1080p &#8211; $329.99</a></h2>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/11/toshiba-chromebook-2.jpg" rel="lightbox-3"><img class="aligncenter size-medium wp-image-41702" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/toshiba-chromebook-2-600x538.jpg" alt="toshiba chromebook 2" width="600" height="538" /></a></p>
<p>Chromebooks are a new breed of notebook, and people are really loving them. Chromebooks are generally affordable and light with long battery life, making them an easy choice for those who need something portable. The best thing about this model is likely the 1080p IPS panel, which will look much better than the panels in other models. Powering the notebook is an Intel Celeron processor; while not very powerful, it will get the job done while sipping power for longer battery life. The battery should be good for up to 8 hours and 45 minutes. The audio has been fine-tuned by Skullcandy, meaning it should deliver much nicer sound than the standard Chromebook, which is useful if you enjoy streaming music while you surf the web or do work. The 802.11ac Wi-Fi is blazing fast and will give you better coverage than other Chromebooks.</p>
<h2>Business Notebook: <a href="http://www.amazon.com/gp/product/B00HQ96JU8/ref=as_li_tl?ie=UTF8&amp;camp=1789&amp;creative=9325&amp;creativeASIN=B00HQ96JU8&amp;linkCode=as2&amp;tag=brsiofne0e-20&amp;linkId=H57OVDOLZLV2PZX6">Lenovo X1 Carbon &#8211; $1,619</a></h2>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/11/Lenovo-X1-Carbon.jpg" rel="lightbox-4"><img class="aligncenter size-medium wp-image-41699" src="http://cdn.vrworld.com/wp-content/uploads/2014/11/Lenovo-X1-Carbon-600x400.jpg" alt="Lenovo X1 Carbon" width="600" height="400" /></a></p>
<p>When looking for a business notebook, it is wise to choose something portable and light so you can easily get work done no matter where you are. The X1 Carbon also offers great battery life, allowing up to 9 hours from a single charge. Lenovo is the most used PC maker among businesses, and it clearly knows what it is doing with a lineup geared for productivity. The keyboard is surprisingly good and has backlighting, making it easier to get work done on that red-eye to an important meeting or at home in the evening. The integrated fingerprint reader should help simplify secure login by allowing you to simply swipe a finger over it. When it comes to a business notebook, Lenovo is a wise choice that millions have made before.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/11/17/top-five-notebooks-of-2014/">Top Five Notebooks of 2014</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/11/17/top-five-notebooks-of-2014/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>One Cable to Rule Them All: USB Type C With DisplayPort Alt Mode</title>
		<link>http://www.vrworld.com/2014/09/22/one-cable-rule-usb-type-c-displayport-alt/</link>
		<comments>http://www.vrworld.com/2014/09/22/one-cable-rule-usb-type-c-displayport-alt/#comments</comments>
		<pubDate>Mon, 22 Sep 2014 21:09:25 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Audio/Video]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Mobile Computing]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DisplayPort 1.2]]></category>
		<category><![CDATA[DisplayPort 1.3]]></category>
		<category><![CDATA[DisplayPort Alt]]></category>
		<category><![CDATA[DisplayPort Alt Mode]]></category>
		<category><![CDATA[DisplayPort over USB]]></category>
		<category><![CDATA[usb]]></category>
		<category><![CDATA[usb 3.0]]></category>
		<category><![CDATA[USB 3.0 Promoter Group]]></category>
		<category><![CDATA[USB 3.1]]></category>
		<category><![CDATA[USB IF]]></category>
		<category><![CDATA[USB Type C]]></category>
		<category><![CDATA[VESA]]></category>
		<category><![CDATA[VESA DisplayPort]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=39106</guid>
		<description><![CDATA[<p>After just recently announcing the DisplayPort 1.3 standard, VESA has today announced yet another major step forward for the entire electronics industry. VESA, the governing ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/22/one-cable-rule-usb-type-c-displayport-alt/">One Cable to Rule Them All: USB Type C With DisplayPort Alt Mode</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="958" height="568" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/DisplayPortAltMode_Trimmed2.jpg" class="attachment-post-thumbnail wp-post-image" alt="DisplayPort Alt Mode" /></p><p>After just recently announcing the DisplayPort 1.3 standard, VESA has today <a href="http://www.vesa.org/news/vesa-brings-displayport-to-new-usb-type-c-connector/" target="_blank">announced</a> yet another major step forward for the entire electronics industry.</p>
<p><a href="http://www.vesa.org/" target="_blank">VESA</a>, the governing body behind DisplayPort Alt actually has been working with the <a href="http://www.usb.org/home" target="_blank">USB 3.0 promoter group</a> to integrate the new DisplayPort Alt protocols into the new USB Type C connector due to be implemented in future computers and mobile devices. The brilliance of this partnership and announcement is that it combines the simplicity of USB Type C with the interoperability of DisplayPort across various standards and connectors. Remember, USB Type C is the USB IF&#8217;s own third standard connector (in addition to A and B) which allows for a perfectly reversible connector that is not only significantly smaller than the current USB connectors, but also orientation agnostic and capable of delivering up to 100 watts of power.</p>
<div id="attachment_39127" style="width: 610px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/USBTypeCPinOutDiagram.jpg" rel="lightbox-0"><img class="size-medium wp-image-39127" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/USBTypeCPinOutDiagram-600x292.jpg" alt="USB Type C Pinout Diagram" width="600" height="292" /></a><p class="wp-caption-text">USB Type C Pin Out Diagram &#8211; Notice how its symmetrical and reversible</p></div>
<p>In simple terms, that means you can have a 4K video signal transmitted over the very same cable that also powers your device and carries its other data. You could theoretically connect a Dell (<a href="www.google.com/finance?cid=153088">NASDAQ: DELL</a>) 4K display with USB 3.1 ports and power it over a single USB Type C connector, which would mean that most devices would only require a single USB Type C connector for all purposes. This usage model works only when a USB Type C connector is connected to another USB Type C connector; however, there is still quite a bit of interoperability between previous USB connectors and standards and USB Type C with DisplayPort Alt Mode.</p>
<div id="attachment_39125" style="width: 610px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/ExampleConfigs.jpg" rel="lightbox-1"><img class="size-medium wp-image-39125" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/ExampleConfigs-600x434.jpg" alt="Example Configurations of USB Type C with DisplayPort Alt Mode" width="600" height="434" /></a><p class="wp-caption-text">Example Configurations of USB Type C with DisplayPort Alt Mode</p></div>
<h2>Watch out Thunderbolt</h2>
<p>With an industry-standard USB Type C connector on both ends, consumers can expect their device to have up to USB 3.1 Gen 1 (5 Gbps) bandwidth, combined with up to 100 watts of power and DisplayPort audio and video signaling capability. This presents a direct challenge to the Thunderbolt connector Apple has favored, which is capable of both data and video like USB 3.1 but lacks power delivery as well as the ability to send data, power and video at the same time. In fact, there have been rumors that Apple (<a href="www.google.com/finance?cid=22144">NASDAQ: AAPL</a>) is actually behind the push for USB Type C and DisplayPort Alt Mode, given that its Thunderbolt cables still require Intel&#8217;s (<a href="www.google.com/finance?cid=284784">NASDAQ: INTC</a>) proprietary technology and don&#8217;t improve bandwidth much beyond what USB 3.1 Gen 2 offers. A USB Type C cable with USB 3.1 Gen 2 (10 Gbps) is essentially just as capable as Thunderbolt, as both standards support 10 Gbps per cable.</p>
<p>Also, with DisplayPort Alt Mode, a laptop or mobile device manufacturer doesn&#8217;t have to worry about which display a consumer might want to use, because DisplayPort effectively supports all of the legacy standards (HDMI, DVI, VGA) in addition to DisplayPort itself.</p>
<p>It also removes the problem that DisplayPort has been having with manufacturers: battling for connector space on PCBs and fitting into manufacturers&#8217; progressively thinner designs. Now, everyone will want a USB Type C connector in their devices purely because it fully supports DisplayPort Alt Mode, which means you get full DisplayPort functionality through a USB cable without needing an additional connector (as you would with HDMI or a standard DisplayPort connector). DisplayPort Alt Mode essentially means that mobile device manufacturers can toss standards like HDMI and MHL to the wind and adopt a single cable for everything while still supporting legacy standards at the same time.</p>
<p>VESA is working with the USB IF to create a standard set of testing procedures that certify cables for DisplayPort Alt Mode. Certified cables will carry a simple DisplayPort logo, letting the consumer know the cable is capable of DisplayPort Alt Mode and therefore everything that comes with it, including video signaling. The goal is to make this certification and testing procedure part of the USB Type C certification process so that manufacturers can easily adhere to it without needing additional equipment or testing procedures.</p>
<p>With this announcement, device manufacturers can now focus on a single connector standard and unify around it on a global scale. It will satisfy governments&#8217; demands for a single power connector across all mobile devices, and it may even entice Apple to move away from its own proprietary Lightning connector in order to make consumers&#8217; lives easier and manufacturing costs lower. No more apologizing for not having a Lightning connector or worrying about whether or not a friend might have a spare charger. It will also reduce the number of connectors and cables that consumers have to deal with in general as the industry moves toward a single standard connector for virtually everything &#8216;cabled&#8217;.</p>
<p>USB Type C and DisplayPort Alt mode are a match made in heaven and are a really great example of when companies (and standards organizations) work together to create industry standards that make technology better for everyone.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/22/one-cable-rule-usb-type-c-displayport-alt/">One Cable to Rule Them All: USB Type C With DisplayPort Alt Mode</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/09/22/one-cable-rule-usb-type-c-displayport-alt/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>GeForce GTX 980 Review: More Performance at Lower Power</title>
		<link>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/</link>
		<comments>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/#comments</comments>
		<pubDate>Fri, 19 Sep 2014 02:30:21 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Reviews]]></category>
		<category><![CDATA[256 Bit]]></category>
		<category><![CDATA[290]]></category>
		<category><![CDATA[290X]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[AA]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[API]]></category>
		<category><![CDATA[Asynchronous Warp]]></category>
		<category><![CDATA[bus]]></category>
		<category><![CDATA[DirectX 12]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DSR]]></category>
		<category><![CDATA[DX 11.3]]></category>
		<category><![CDATA[DX12]]></category>
		<category><![CDATA[GeForce GTX]]></category>
		<category><![CDATA[GeForce GTX 980]]></category>
		<category><![CDATA[Global Illumination]]></category>
		<category><![CDATA[Graphics Card]]></category>
		<category><![CDATA[GTX 980]]></category>
		<category><![CDATA[GTX 980 Review]]></category>
		<category><![CDATA[GTX980]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[price]]></category>
		<category><![CDATA[R9 290]]></category>
		<category><![CDATA[R9 290X]]></category>
		<category><![CDATA[Radeon]]></category>
		<category><![CDATA[Supersampling]]></category>
		<category><![CDATA[Voxel]]></category>
		<category><![CDATA[Voxel Global Illumination]]></category>
		<category><![CDATA[VXGI]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=38897</guid>
		<description><![CDATA[<p>The Nvidia GeForce GTX 980 is Nvidia&#8217;s latest and greatest graphics card featuring the company&#8217;s new Maxwell GPU architecture. Nvidia claims that Maxwell is able to ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/">GeForce GTX 980 Review: More Performance at Lower Power</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="980" height="452" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front.jpg" class="attachment-post-thumbnail wp-post-image" alt="NVIDIA GeForce GTX 980" /></p><p>The Nvidia GeForce GTX 980 is Nvidia&#8217;s latest and greatest graphics card featuring the company&#8217;s new Maxwell GPU architecture. Nvidia claims that Maxwell is able to maintain performance while delivering better power efficiency. Sure, the Kepler architecture brought some amazing improvements when compared to the infamous Fermi architecture, but it was less revolutionary than the Maxwell architecture which debuted last year in the GTX 750 Ti.</p>
<p>Below, you can see a single SMM block diagram of the Maxwell architecture, followed by the full GM-204 architecture. Keep in mind that this is not the full-blown version of Maxwell.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GeForce_GTX_980_SM_Diagram_FINAL.jpg" rel="lightbox-0"><img class="aligncenter size-medium wp-image-38907" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GeForce_GTX_980_SM_Diagram_FINAL-320x600.jpg" alt="GeForce_GTX_980_SM_Diagram_FINAL" width="320" height="600" /></a></p>
<p>The GeForce GTX 980 is based upon Nvidia&#8217;s GM-204 GPU, a mid-range version of Nvidia&#8217;s full Maxwell architecture. Even though the GTX 980 is being sold as a high-end card, it actually slots into Nvidia&#8217;s product lineup much as the GTX 680 did.</p>
<p>The GTX 680 eventually became the GTX 770, slotting in below the GTX 780 (a chopped-down Titan) and the 780 Ti (the full Kepler architecture), and above the 760 Ti, also a chopped-down card. So with the GTX 980 the natural comparisons are the GTX 680, which was GK-104, and the GTX 780 Ti, which was full-blown Kepler. The GTX 980 also has a TDP 30 watts lower than the GTX 680&#8217;s while performing far faster.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GeForce_GTX_980_Block_Diagram_FINAL.jpg" rel="lightbox-1"><img class="aligncenter size-medium wp-image-38905" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GeForce_GTX_980_Block_Diagram_FINAL-600x559.jpg" alt="GeForce_GTX_980_Block_Diagram_FINAL" width="600" height="559" /></a></p>
<p>In the new GPU, one of the most notable improvements is the increase of the L2 cache from 512 KB all the way up to 2048 KB. You can also see that Nvidia has made significant improvements across the GPU&#8217;s design to improve efficiency. The net result is that the GTX 980 has a TDP of 165w while the GTX 680 had a TDP of 195w; that&#8217;s a reduction of 30w, or about 15%, in a single generation (going from GK-104 to GM-204) on the same 28nm process node. However, to build a GM-210, Nvidia will need a process shrink to reduce the die size, gain even more power efficiency, and build a very dense chip of more than 10 billion transistors.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/SpecsTable_980.jpg" rel="lightbox-2"><img class="aligncenter size-medium wp-image-38949" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/SpecsTable_980-600x486.jpg" alt="SpecsTable_980" width="600" height="486" /></a></p>
<p>In addition to the GM-204 GPU, Nvidia also opted for a standard 4GB of GDDR5 memory at 7 Gbps, resulting in some impressive memory bandwidth figures even though the card has only a 256-bit memory bus.</p>
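<p>For context, the peak bandwidth those numbers imply is straightforward to compute (our own arithmetic): a 256-bit bus is 32 bytes wide, and at an effective 7 Gbps per pin that works out to 224 GB/s.</p>
<pre><code># Peak GDDR5 bandwidth implied by the GTX 980 specs (our arithmetic).
bus_width_bits = 256
effective_gbps_per_pin = 7.0   # effective GDDR5 data rate

# bytes moved per transfer * per-pin rate = GB/s
bandwidth_gbs = (bus_width_bits / 8) * effective_gbps_per_pin
print(bandwidth_gbs)  # 224.0 GB/s
</code></pre>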
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Maxwell_GM204_DIE_3D_V17_Final.jpg" rel="lightbox-3"><img class="aligncenter size-medium wp-image-38918" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Maxwell_GM204_DIE_3D_V17_Final-600x337.jpg" alt="Maxwell_GM204_DIE_3D_V17_Final" width="600" height="337" /></a></p>
<h2>Hardware</h2>
<p>Moving on from the GPU architecture of the GTX 980, it&#8217;s easy to see that the hardware bears a very strong resemblance to the Kepler-era designs that began with the GTX Titan. It is different in a few ways, though, including the fact that the card has two 6-pin PCIe connectors, which means it can draw up to 225w in total from the PCIe slot and power connectors. So even though this card has a TDP of 165w, it can theoretically draw up to 225w, which means this card could be an impressive overclocker with the appropriate cooling and voltage regulation.</p>
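<p>That 225w figure follows directly from the PCIe power budget (standard allowances; our own arithmetic): the slot supplies up to 75w and each 6-pin connector another 75w.</p>
<pre><code># PCIe power budget for a card with two 6-pin connectors (our arithmetic,
# using the standard PCIe allowances).
SLOT_W = 75      # PCIe x16 slot
SIX_PIN_W = 75   # per 6-pin connector
TDP_W = 165      # GTX 980 rated TDP

budget_w = SLOT_W + 2 * SIX_PIN_W   # 225 W available in total
headroom_w = budget_w - TDP_W       # 60 W of headroom over the rated TDP
print(budget_w, headroom_w)         # 225 60
</code></pre>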
<p>Nvidia also included a backplate on the GTX 980 to help cool the back of the graphics card more evenly. A section of the backplate near the power connectors can be removed, though, to allow proper airflow into the fan when the card runs in a tight SLI configuration with two or more cards.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front.jpg" rel="lightbox-4"><img class="size-medium wp-image-38929 aligncenter" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_Front-600x276.jpg" alt="NVIDIA GeForce GTX 980" width="600" height="276" /></a></p>
<p><img class="aligncenter size-medium wp-image-38928" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_BackPiece-600x366.jpg" alt="NVIDIA_GeForce_GTX_980_BackPiece" width="600" height="366" /></p>
<p>Below, you can see the GTX 980 with the fan shroud removed but with the GPU heatsink, memory heatsink and fan still attached.</p>
<p><img class="aligncenter size-medium wp-image-38931" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontNoShroud-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontNoShroud" width="600" height="399" /></p>
<p>Once the GPU heatsink is removed you can see the bare GPU with the memory heatsink and fan (which are one assembly).</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontFan.jpg" rel="lightbox-5"><img class="aligncenter size-medium wp-image-38930" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontFan-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontFan" width="600" height="399" /></a></p>
<p>Then, once the whole assembly is removed, you can see the GPU, memory chips, power phases and all of the various PCB markings. These show that Nvidia included only 5 power phases on the GTX 980 even though the PCB can accommodate up to 7, which could mean seriously overclocked versions built on the reference PCB will be available right at launch.<a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontNoShroud.jpg" rel="lightbox-6"><br />
</a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontPCB.jpg" rel="lightbox-7"><img class="aligncenter size-medium wp-image-38932" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_FrontPCB-600x399.jpg" alt="NVIDIA_GeForce_GTX_980_FrontPCB" width="600" height="399" /></a></p>
<p>The card also features three DisplayPort 1.2 connectors as well as a dual-link DVI connector and an HDMI 2.0 connector, which gives you the ability to drive 4K in multiple ways. And while the card is nominally capable of up to 5K per display, HDMI 2.0 only supports 4K and DisplayPort 1.2 technically tops out at 4K as well, so really the maximum resolution per display is still 4096 x 2160.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/980Back_98.jpg" rel="lightbox-8"><img class="aligncenter size-medium wp-image-38965" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/980Back_98-600x232.jpg" alt="980Back_98" width="600" height="232" /></a></p>
<h2>Software</h2>
<p>During Nvidia&#8217;s recent Editor&#8217;s Day &#8212; which is used to brief the press on upcoming products &#8212; Nvidia showed off a lot of things that directly and indirectly involve the GTX 980. Many of the GTX 980&#8217;s advancements come in the form of software, which includes DirectX 12 and DirectX 11.3. In fact, Nvidia was already running a DX 12 port of a Fable demo on two GTX 980s.</p>
<p>Nvidia made four big announcements about the GTX 980 outside of DX 12 and DX 11.3, pertaining to Nvidia&#8217;s own VXGI, MFAA and DSR technologies and its advancements with HMDs (head-mounted displays) like the Oculus Rift.</p>
<p>MFAA &#8211; Multi-Frame Sampled Anti-Aliasing &#8211; is Nvidia&#8217;s own technique for delivering higher AA visual quality at only a few percentage points of performance cost compared to a lower-quality MSAA mode. Essentially, Nvidia claims to deliver 4X MSAA-level quality at 2X MSAA performance (give or take a few percentage points). However, this feature is not quite finished yet and will be enabled in a future driver.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/MFAA.jpg" rel="lightbox-9"><img class="aligncenter size-medium wp-image-38919" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/MFAA-600x333.jpg" alt="MFAA" width="600" height="333" /></a></p>
<p>In addition to MFAA, Nvidia has also implemented DSR (Dynamic Super Resolution), which is essentially smart supersampling with an applied filter. It lets you trick the game into thinking you have a much higher resolution display (like a 4K panel), so the game serves higher-quality textures and renders in 4K; DSR then shrinks the image back down to your monitor&#8217;s native resolution (like 1080P). This generally results in much higher quality images. It is great for both Nvidia and gamers: gamers get a better-looking game without spending money on a new monitor, and Nvidia can sell more powerful graphics cards without consumers needing to buy expensive 4K displays.</p>
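<p>Conceptually, DSR is supersampling: render at a higher internal resolution, then filter the result down to the native resolution. A minimal sketch of the downscale step (our own illustration using a plain box filter, not Nvidia&#8217;s actual proprietary filter):</p>
<pre><code>import numpy as np

# Toy version of the DSR idea: render at 2x the native resolution on each
# axis (4x the pixels), then average each 2x2 block down to one pixel.
# Nvidia applies its own smoothing filter; this is a plain box filter.
def downsample_2x(frame):
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a 4K render target
native = downsample_2x(hi_res)          # 1080p output for the display
print(native.shape)                     # (1080, 1920, 3)
</code></pre>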
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/DSR.jpg" rel="lightbox-10"><img class="aligncenter size-medium wp-image-38904" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/DSR-600x336.jpg" alt="DSR" width="600" height="336" /></a></p>
<p>Nvidia also showed off its new VXGI technology with a demonstration of the moon landing that uses the company&#8217;s own voxel-based global illumination engine. VXGI utilizes features of Maxwell&#8217;s hardware and of the game engine itself (Unreal Engine 4) to more efficiently and realistically recreate the bouncing of light off objects, and to do it in realtime. VXGI isn&#8217;t implemented in any engine yet, but the expectation is that Unreal Engine 4 should have it by the fourth quarter of this year, and we could very likely see it in games as soon as next year.</p>
<p>In addition to the VXGI stuff, Nvidia also took a stab at head-mounted displays and their latency problem. The company&#8217;s solution, dubbed Asynchronous Warp, is designed to halve the latency of VR gaming in order to improve the overall experience and responsiveness of the platform. The company went step by step looking for ways to improve VR performance until it arrived at Asynchronous Warp.</p>
<p><img class="aligncenter size-medium wp-image-38913" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency-600x342.jpg" alt="HMDLatency" width="600" height="342" /></p>
<p><img class="aligncenter size-medium wp-image-38914" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency2-600x337.jpg" alt="HMDLatency2" width="600" height="337" /></p>
<p><img class="aligncenter size-medium wp-image-38915" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatency3-600x337.jpg" alt="HMDLatency3" width="600" height="337" /></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/HMDLatencyAsyncWarp.jpg" rel="lightbox-11"><img class="aligncenter size-medium wp-image-38916" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/HMDLatencyAsyncWarp-600x338.jpg" alt="HMDLatencyAsyncWarp" width="600" height="338" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/HMDLatency3.jpg" rel="lightbox-12"><br />
</a>Asynchronous warp takes the last scene rendered by the GPU and updates it based on the latest head position information taken from the VR sensor. By warping the rendered image late in the pipeline to more closely match head position, Nvidia avoids discontinuities between head movement and action on screen while also dramatically reducing latency. We haven&#8217;t tested this out ourselves yet, but this is a pretty drastic leap forward for VR if it can actually be applied across the VR landscape.</p>
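<p>To make the idea concrete, here is a deliberately simplified sketch of the warp step (our own illustration, not Nvidia&#8217;s implementation): take the last rendered frame and shift it by however much the head has yawed since it was rendered, converting the rotation delta into a pixel offset via the display&#8217;s field of view. A real implementation re-projects in 3D and handles all axes of rotation.</p>
<pre><code>import numpy as np

# Toy "asynchronous warp": shift the last rendered frame to account for
# head rotation that happened after rendering, so the view tracks the
# latest sensor reading instead of the (older) render-time pose.
def warp_frame(frame, yaw_delta_deg, fov_deg=90.0):
    h, w, _ = frame.shape
    pixels_per_degree = w / fov_deg
    shift = int(round(yaw_delta_deg * pixels_per_degree))
    return np.roll(frame, -shift, axis=1)  # yaw right, scene slides left

last_frame = np.zeros((1080, 960, 3))  # one eye of a rendered view
warped = warp_frame(last_frame, yaw_delta_deg=1.5)
</code></pre>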
<h2>Performance</h2>
<p>For performance, we looked at the GTX 980&#8217;s synthetic, compute, and gaming benchmarks to evaluate whether it really is as significant an improvement over the GTX 680, and possibly even the GTX 780 Ti, as claimed. After all, Nvidia wouldn&#8217;t be naming this card the GTX 980 unless it could really perform that way.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX980_980.jpg" rel="lightbox-13"><img class="aligncenter size-medium wp-image-38966" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX980_980-600x337.jpg" alt="GTX980_980" width="600" height="337" /></a></p>
<p>The testbed consisted of an Intel Core i7 4960X cooled by a Corsair H100 on a Gigabyte X79 motherboard with 16 GB of DDR3 2400 MHz memory along with a Thermaltake 1475W Gold PSU and Patriot 128GB SSD all sitting atop a Dimastech Hard Bench.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/3DMark-Fire-Strike-Extreme.jpg" rel="lightbox-14"><img class="aligncenter size-medium wp-image-38899" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/3DMark-Fire-Strike-Extreme-600x293.jpg" alt="3DMark Fire Strike Extreme" width="600" height="293" /></a></p>
<p>First, we tested 3DMark using the Fire Strike Extreme test to give the best idea of high-end performance against other cards. Here, the GTX 980 fell between two GTX 680s in SLI and two 7970s in CrossFireX. It did beat the GTX 780 Ti, and it proved to be more than twice as fast as the GTX 680, which is essentially what Nvidia claimed throughout its presentations.</p>
<p>After 3DMark, we also took a look at Unigine&#8217;s Heaven and Valley synthetic benchmarks.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Unigine-Heaven-4.0-Benchmark.jpg" rel="lightbox-15"><img class="aligncenter size-medium wp-image-38935" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Unigine-Heaven-4.0-Benchmark-600x265.jpg" alt="Unigine Heaven 4.0 Benchmark" width="600" height="265" /></a></p>
<p>As you can see from Unigine Heaven, the GTX 980 outperformed the GTX Titan and R9 290 by a fairly healthy margin and sat somewhere close to the HD 7970 GHz editions in CrossFire. Obviously this is a single GPU, but the fact that it falls within the realm of multi-GPU performance is awesome on its own.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/UnigineValley.jpg" rel="lightbox-16"><img class="aligncenter size-medium wp-image-38955" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/UnigineValley-600x266.jpg" alt="UnigineValley" width="600" height="266" /></a></p>
<p>In the Unigine Valley benchmark, we saw a much less drastic performance difference, with the GTX 980 essentially falling between the GTX 780 and GTX Titan while still well outperforming AMD&#8217;s Hawaii-based R9 290.</p>
<p>Following those benchmarks, we also took a look at two OpenCL benchmarks to see how Maxwell stacks up against AMD and how much Nvidia has improved over the previous Kepler generation. There was much talk that Nvidia had improved its OpenCL performance from one generation to the next, so it was interesting to see if that was true and by how much. We used LuxMark 2.0 and CompuBench 1.5 for our OpenCL testing.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/LuxMarkOpenCL.jpg" rel="lightbox-17"><img class="aligncenter size-medium wp-image-38917" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/LuxMarkOpenCL-600x255.jpg" alt="LuxMarkOpenCL" width="600" height="255" /></a></p>
<p>In LuxMark, the GTX 980 performed fantastically, showing that it was faster than two GTX Titans and an R9 290. Of course, it wasn&#8217;t as fast as three GTX Titans or multiple 7970s, a 7990 or an R9 295X2, but I suspect that multiple GTX 980 GPUs could give AMD a run for their money since all of the faster AMD cards are multi-GPU.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Compubench-1.5.jpg" rel="lightbox-18"><img class="aligncenter size-medium wp-image-38901" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Compubench-1.5-600x236.jpg" alt="Compubench 1.5" width="600" height="236" /></a></p>
<p>In CompuBench we saw some interesting results, with the GTX 980 trading punches with the R9 290X, beating it in some OpenCL tests and losing to it in others. If anything, the GeForce GTX 980 shows that Nvidia is a far more capable OpenCL competitor to AMD than the GTX 780 Ti ever was.</p>
<p>Following those synthetic benchmarks, we ran a series of 4K benchmarks to see how the GTX 980 stacks up against the most stressful gaming environments. In our tests, we played Battlefield 4, Crysis 3 and Counter Strike: Global Offensive at varying levels of detail.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Battlefield-4-Benchmark.jpg" rel="lightbox-19"><img class="aligncenter size-medium wp-image-38900" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Battlefield-4-Benchmark-600x247.jpg" alt="Battlefield 4 Benchmark" width="600" height="247" /></a></p>
<p>In Battlefield 4, we can clearly see that the GTX 980 outperforms the GTX 780 Ti as well as the R9 290, but it still falls short of the monstrous $1,500 R9 295X2. The GTX 980 nevertheless delivered playable frame rates throughout and never dipped below 30 FPS in our measurements.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/Crysis-3-4K-Benchmarks.jpg" rel="lightbox-20"><img class="aligncenter size-medium wp-image-38902" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/Crysis-3-4K-Benchmarks-600x276.jpg" alt="Crysis 3 4K Benchmarks" width="600" height="276" /></a></p>
<p>In Crysis, we once again saw the GTX 980 outperform the GTX 780 Ti and the R9 290, but it still struggled to keep up with the R9 295X2 (which is triple the price). This is primarily because of the lack of memory and memory bandwidth to properly play Crysis 3 at those settings. So, if you want to run Crysis 3 at Very High settings with 4x MSAA, you&#8217;ll probably need a second GPU and then you should get pretty playable FPS.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/CSGO-Benchmark.jpg" rel="lightbox-21"><img class="aligncenter size-medium wp-image-38903" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/CSGO-Benchmark-600x275.jpg" alt="CSGO Benchmark" width="600" height="275" /></a></p>
<p>In Counter Strike: Global Offensive, we weren&#8217;t expecting to see anything but triple-digit FPS, but what is important is that the GTX 980 beats out the R9 290 and 780 Ti in 4K performance and hit the 300 FPS cap at times. If you want the ultimate 4K gaming experience in CSGO you can have it with any of these cards, but the GTX 980 does it at a fraction of the power.</p>
<h2>Power and Overclocking</h2>
<p>At idle, the card ran at about 10% of TDP, or 16W, and drew up to 90% of TDP, or 148W, under most gaming scenarios that we measured. The card never went over 80C and idled at 36C under normal usage. Both the maximum and idle temperatures may actually be higher than expected because the testing environment had higher ambient temperatures than normal due to a heatwave.</p>
<p>Last but not least was overclocking, which proved more surprising than anyone would have expected. Sure, this card is a very low-power card with a lot of in-bound power, but the overclocks achieved were simply mind-blowing. To validate the overclocks, we ran 3DMark Fire Strike Extreme.</p>
<p>We were able to push the card to a +260 MHz offset on the GPU base clock and +100 MHz on the memory frequency. As a result, the base clock rose to 1,387 MHz with a boost clock of a whopping 1,553 MHz, something we have never seen from an air-cooled GPU (yes, the fans were at 100% at that point). This resulted in some amazing 3DMark Fire Strike Extreme scores. We&#8217;ve also included some of the other overclocks we achieved on the way to the maximum.</p>
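<p>For those keeping score, the offset arithmetic works out as follows. The starting base clock here is inferred from the result rather than measured separately, and the 1,553 MHz boost exceeds base-plus-offset because GPU Boost opportunistically raises clocks within the card&#8217;s power and thermal limits:</p>
<pre><code># Offset arithmetic behind the overclock figures above.
base_mhz   = 1127   # starting base clock implied by the article (assumption)
gpu_offset = 260    # MHz added to the GPU base clock
mem_offset = 100    # MHz added to the memory frequency

oc_base = base_mhz + gpu_offset   # 1,387 MHz, as reported
print(f"overclocked base clock: {oc_base} MHz")
</code></pre>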
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX-980-OC-1388.jpg" rel="lightbox-22"><img class="aligncenter size-medium wp-image-38912" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX-980-OC-1388-600x442.jpg" alt="GTX 980 OC 1388" width="600" height="442" /></a></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/GTX-980-3DMark-Overclocking.jpg" rel="lightbox-23"><img class="aligncenter size-medium wp-image-38909" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/GTX-980-3DMark-Overclocking-600x316.jpg" alt="GTX 980 3DMark Overclocking" width="600" height="316" /></a></p>
<p>As you can see above, the overclocked GTX 980 outperforms two Radeon HD 7970 GHz Editions in CrossFireX as well as every other card near it; only two GTX Titans in SLI and the R9 295X2 are faster. It does this while drawing remarkably little power, 206W to be exact, which leaves roughly 19W of overclocking headroom on the card. As such, consumers can expect factory-overclocked versions of the GTX 980 with impressive clocks that could very likely be pushed even further.</p>
<h2>Conclusion</h2>
<p>The GTX 980 is a stunning graphics card that delivers on many of Nvidia&#8217;s promises (namely more than 2x the performance of the GTX 680) at an amazingly low level of power. That&#8217;s not even the best part: Nvidia released this card today at a competitive $549, which is why AMD&#8217;s R9 290X recently dropped from $549 to $449. Keep in mind, though, that even at the lower price the R9 290X still draws more power and won&#8217;t overclock anywhere near as well as this card.</p>
<p>Nvidia is also releasing a cost-down version of the GTX 980, the GTX 970, which is understandably somewhat slower at $329. Unfortunately, we weren&#8217;t sent one for testing, so we can&#8217;t tell you exactly how much slower it is, but it may be worth considering if the GTX 980 is too rich for your blood.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_3Qtr.jpg" rel="lightbox-24"><img class="aligncenter size-medium wp-image-38923" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/NVIDIA_GeForce_GTX_980_3Qtr-600x473.jpg" alt="NVIDIA_GeForce_GTX_980_3Qtr" width="600" height="473" /></a></p>
<p>Nvidia has without a doubt hit a home run with the GTX 980 and Maxwell, and it will be interesting to see how AMD answers this astounding improvement in performance and power over the previous generation. This may not be a huge upgrade for anyone running a GTX 780 Ti, but it is a serious upgrade for almost any other gamer. And consider that the GTX 780 Ti is a $700 graphics card: you&#8217;re getting better performance at significantly lower wattage for much less money.</p>
<p>The GTX 980 is a great piece of GPU engineering and a must-buy for anyone shopping for a new high-end graphics card this holiday season. It only makes us wonder what will be possible once Nvidia unleashes the full-blown Maxwell GM-210, hopefully next year. As such, this card wins our Editor&#8217;s Choice Award and an immediate buy recommendation.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/">GeForce GTX 980 Review: More Performance at Lower Power</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/09/18/geforce-gtx-980-review-performance-lower-power/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>VESA DisplayPort 1.3 Standard Announced</title>
		<link>http://www.vrworld.com/2014/09/15/vesa-displayport-1-3-standard-announced/</link>
		<comments>http://www.vrworld.com/2014/09/15/vesa-displayport-1-3-standard-announced/#comments</comments>
		<pubDate>Mon, 15 Sep 2014 16:10:11 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[Audio/Video]]></category>
		<category><![CDATA[Enterprise]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[32.4 Gbps]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DisplayPort 1.2]]></category>
		<category><![CDATA[DisplayPort 1.3]]></category>
		<category><![CDATA[Dockport]]></category>
		<category><![CDATA[HDMI]]></category>
		<category><![CDATA[HDMI 2.0]]></category>
		<category><![CDATA[VESA]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=38800</guid>
		<description><![CDATA[<p>VESA officially announced the DisplayPort 1.3 Standard Monday, something long-time readers of Bright Side of News* may have already been familiar with. Many months ago, ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/15/vesa-displayport-1-3-standard-announced/">VESA DisplayPort 1.3 Standard Announced</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="4971" height="3058" src="http://cdn.vrworld.com/wp-content/uploads/2014/09/Displayport-cable.jpg" class="attachment-post-thumbnail wp-post-image" alt="DisplayPort 1.3 Cable" /></p><p>VESA officially announced the DisplayPort 1.3 Standard Monday, something long-time readers of <em>Bright Side of News* </em>may already be familiar with.</p>
<p>Many months ago, there was <a title="BSN* Exclusive: DisplayPort 1.3 to Support 8K and 4K 3D" href="http://www.brightsideofnews.com/2013/12/03/displayport-13-to-support-8k2c-standard-expected-in-q2-2014/">exclusive talk about DisplayPort 1.3 on <em>BSN*</em></a>, and those rumors pointed to 8K and 4K 3D. In terms of bandwidth, the rumored 8.1 Gbps per lane was spot on, as VESA&#8217;s new DisplayPort 1.3 clocks in at exactly 32.4 Gbits/second across four lanes. This is more than enough to support 4K video (DisplayPort 1.2 already does so with much less bandwidth). The increase to 32.4 Gbps therefore looks like future-proofing, as DisplayPort 1.3 already incorporates 4K support as well as Adaptive-Sync from the <a title="Adaptive-Sync Added to VESA DisplayPort 1.2a Standard" href="http://www.brightsideofnews.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/">DisplayPort 1.2a</a> standard that preceded it.</p>
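<p>The lane math behind that headline figure is straightforward. A quick sketch follows; note that the 8b/10b line coding and its 20% overhead are long-standing properties of the DisplayPort physical layer, not something stated in this announcement:</p>
<pre><code># Four lanes at the 8.1 Gbps HBR3 rate give the headline 32.4 Gbps.
lanes         = 4
gbps_per_lane = 8.1

raw_gbps = lanes * gbps_per_lane   # 32.4 Gbps raw link rate
# DisplayPort 1.3 still uses 8b/10b encoding, so roughly 80% of the
# raw rate is left for actual pixel data.
effective_gbps = raw_gbps * 0.8    # ~25.92 Gbps usable
print(f"raw {raw_gbps:.1f} Gbps, effective {effective_gbps:.2f} Gbps")
</code></pre>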
<div id="attachment_38810" style="width: 610px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/09/DisplayPortBandwidth.jpg" rel="lightbox-0"><img class="size-medium wp-image-38810" src="http://www.brightsideofnews.com/wp-content/uploads/2014/09/DisplayPortBandwidth-600x600.jpg" alt="DisplayPort Bandwidth" width="600" height="600" /></a><p class="wp-caption-text">DisplayPort Bandwidth</p></div>
<p>The new DisplayPort 1.3 standard also brings HDCP 2.2 and HDMI 2.0 compatibility, enabling a DisplayPort connector to be turned into an HDMI 2.0 connector via an adapter. Thanks to the added bandwidth, DisplayPort 1.3 can drive 4K displays at 60 Hz with up to 24-bit color, and as if that weren&#8217;t enough, additional data channels are available for USB 3.0 connectivity via DockPort. This means we could see 4K monitors with USB 3.0 connectors that use only a single data cable, plus a separate cable for power.</p>
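<p>To see how comfortably 4K at 60 Hz fits into that budget, here is a rough payload calculation; blanking intervals are ignored, so real display timings need somewhat more than this:</p>
<pre><code># Rough pixel payload for 3840x2160 at 60 Hz with 24-bit color.
width, height, hz, bpp = 3840, 2160, 60, 24

payload_gbps = width * height * hz * bpp / 1e9   # ~11.9 Gbps
print(f"4K60 at 24-bit needs ~{payload_gbps:.1f} Gbps, well under the "
      f"~25.9 Gbps DisplayPort 1.3 offers after encoding overhead")
</code></pre>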
<p>For DisplayPort 1.3, VESA is currently only talking about 5K resolution support, as on the <a href="http://www.dell.com/learn/us/en/uscorp1/secure/dell-venue-sf?c=us&amp;l=en&amp;s=corp" target="_blank">recently announced Dell 5K monitor</a>, which very likely means we won&#8217;t see 4K 3D and 8K support until a DisplayPort 1.3a that adopts VESA&#8217;s compression standard. While there&#8217;s no exact timeframe for such an update, there&#8217;s a good chance the industry sees it follow shortly after today&#8217;s DisplayPort 1.3 announcement. It is a little disappointing that the compression standard didn&#8217;t make it into DisplayPort 1.3, but timing issues likely kept the compression needed for 8K out of this release.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/09/15/vesa-displayport-1-3-standard-announced/">VESA DisplayPort 1.3 Standard Announced</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/09/15/vesa-displayport-1-3-standard-announced/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Adaptive-Sync Added to VESA DisplayPort 1.2a Standard</title>
		<link>http://www.vrworld.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/</link>
		<comments>http://www.vrworld.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/#comments</comments>
		<pubDate>Mon, 12 May 2014 16:59:19 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Audio/Video]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Adaptive-Sync]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Display Stream Compression]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DisplayPort 1.2]]></category>
		<category><![CDATA[DisplayPort 1.2a]]></category>
		<category><![CDATA[FPS]]></category>
		<category><![CDATA[Frame Rate]]></category>
		<category><![CDATA[Framrate]]></category>
		<category><![CDATA[FreeSync]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Refresh Rate]]></category>
		<category><![CDATA[Sync]]></category>
		<category><![CDATA[VESA]]></category>
		<category><![CDATA[VSync]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=35063</guid>
		<description><![CDATA[<p>As we had already reported, a component of the AMD technology coined as FreeSync has finally been ratified and standardized by VESA as part of the ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/">Adaptive-Sync Added to VESA DisplayPort 1.2a Standard</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="982" height="333" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/VESA1.jpg" class="attachment-post-thumbnail wp-post-image" alt="VESA Logo" /></p><p>As we <a title="AMD’s ‘FreeSync’ Ratified by VESA, More to Come" href="http://www.brightsideofnews.com/2014/04/11/amds-freesync-ratified-by-vesa/" target="_blank">had already reported</a>, a component of the AMD technology coined FreeSync has been ratified and standardized by VESA as part of the DisplayPort 1.2a standard. As of today, the standard will be known as <a href="http://www.vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/" target="_blank">Adaptive-Sync</a>. The name may not be as catchy for companies like AMD, which liked to brag that their version of adaptive refresh-rate syncing was a &#8216;free&#8217; upgrade, but Adaptive-Sync describes the technology best. It is a more open version of what <a title="Nvidia Introduces G-Sync – The Death of V-Sync" href="http://www.brightsideofnews.com/2013/10/18/nvidia-introduces-g-sync-the-death-of-v-sync/" target="_blank">Nvidia is currently doing with its G-Sync technology</a>, which is also an adaptive-sync approach but requires both an Nvidia GPU and Nvidia monitor electronics, a combination that is not only cost-prohibitive but fairly closed to others. Even so, Nvidia should be applauded for bringing the technology to market and making the issue a topic of discussion, ultimately resulting in DisplayPort supporting adaptive-sync.</p>
<div id="attachment_35066" style="width: 478px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/05/a-sync1.jpg" rel="lightbox-0"><img class="size-full wp-image-35066" src="http://cdn.vrworld.com/wp-content/uploads/2014/05/a-sync1.jpg" alt="Adaptive-Sync" width="468" height="191" /></a><p class="wp-caption-text">How Adaptive-Sync works in different scenarios</p></div>
<p>Adaptive-Sync is a great technology because it allows both gaming desktop and notebook manufacturers not only to smooth out frame rates, but also to refresh the monitor only as many times as the GPU can actually deliver frames. Fewer refreshes mean lower power consumption: smaller power bills for people who always have their monitors on, and better battery life for mobile devices driving an external display. Adaptive-Sync has been part of VESA&#8217;s embedded DisplayPort (eDP) spec since 2009, so much of the technology is already incorporated into display components that rely on eDP for internal signaling.</p>
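<p>A toy example makes the refresh-saving point concrete; the frame times below are invented for illustration, not measurements:</p>
<pre><code># With a fixed 60 Hz refresh the panel redraws on a rigid schedule;
# with Adaptive-Sync it redraws only when the GPU delivers a frame.
frame_times_ms = [16.7, 25.0, 40.0, 33.3, 16.7]   # hypothetical frame times

fixed_redraws    = sum(frame_times_ms) / (1000 / 60)  # redraws in that span
adaptive_redraws = len(frame_times_ms)                # one redraw per frame
print(f"fixed 60 Hz: ~{fixed_redraws:.0f} redraws, "
      f"Adaptive-Sync: {adaptive_redraws} redraws")
</code></pre>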
<p>Implementation of DisplayPort Adaptive-Sync technology is offered to VESA members without any license fee, which means big monitor manufacturers like Dell and Samsung will likely adopt the standard fairly quickly. This announcement arrived in May, <a href="http://www.brightsideofnews.com/2014/04/11/amds-freesync-ratified-by-vesa/" target="_blank">as we had stated earlier</a> it would, and precedes <a href="http://www.brightsideofnews.com/2013/12/03/displayport-13-to-support-8k2c-standard-expected-in-q2-2014/" target="_blank">VESA&#8217;s DisplayPort 1.3</a> announcement, which is expected in the late Q2 to early Q3 timeframe and will include all of the DisplayPort 1.2a features, adaptive-sync included. Thanks to the inclusion of adaptive-sync in DisplayPort 1.2a, the DisplayPort 1.3 standard may have an easier time enabling features like 8K video, since supporting 8K at variable, lower frame rates is less bandwidth-intensive than fixed 30p or 60p. While we don&#8217;t know the exact final specifications of DisplayPort 1.3, it will likely include VESA&#8217;s new video compression standard, Display Stream Compression, which is designed to support up to 8K video. As we&#8217;ve said before, VESA would really like to maintain the same cabling as DisplayPort 1.1 and 1.2, but that may not be possible while enabling 8K and 4K 3D. Until then, we&#8217;ll keep you updated on the latest from VESA and the video standards.</p>
<h4><strong>Update 11:35 am: AMD has sent us a series of Q&amp;A about the new DisplayPort Adaptive-sync technology addressing some people&#8217;s questions</strong></h4>
<p style="padding-left: 30px;"><strong>Q:What is DisplayPort™ Adaptive-Sync?</strong><br />
A: DisplayPort™ Adaptive-Sync is a new addition to the DisplayPort™ 1.2a specification, ported from the embedded DisplayPort™ v1.0 specification. DisplayPort™ Adaptive-Sync provides an industry-standard mechanism that enables real-time adjustment of a display’s refresh rate over a DisplayPort™ link.</p>
<p style="padding-left: 30px;"><strong>Q: What is Project FreeSync?</strong><br />
A: Project FreeSync is an AMD effort to leverage industry standards, like DisplayPort™ Adaptive-Sync, to deliver dynamic refresh rates. Dynamic refresh rates synchronize the refresh rate of a compatible monitor to the framerate of a user’s AMD Radeon™ graphics to reduce or eliminate stuttering, juddering and/or tearing during gaming and video playback.</p>
<p style="padding-left: 30px;"><strong>Q: How are DisplayPort™ Adaptive-Sync and Project FreeSync different?</strong><br />
A: DisplayPort™ Adaptive-Sync is an ingredient DisplayPort™ feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.</p>
<p style="padding-left: 30px;"><strong>Q: Is DisplayPort™ Adaptive-Sync the industry-standard version of Project FreeSync?</strong><br />
A: The DisplayPort™ Adaptive-Sync specification was ported from the Embedded DisplayPort™ specification through a proposal to the VESA group by AMD. DisplayPort™ Adaptive-Sync is an ingredient feature of a DisplayPort™ link and an industry standard that enables technologies like Project FreeSync.</p>
<p style="padding-left: 30px;"><strong>Q: What are the requirements to use FreeSync?</strong><br />
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.</p>
<p style="padding-left: 30px;"><strong>Q: When can I buy a monitor compatible with Project FreeSync?</strong><br />
A: AMD has undertaken every necessary effort to enable Project FreeSync in the display ecosystem. Monitor vendors are now integrating the DisplayPort™ Adaptive-Sync specification and productizing compatible displays. AMD is working closely with these vendors to bring products to market, and we expect compatible monitors within 6-12 months.</p>
<p style="padding-left: 30px;"><strong>Q: What AMD Radeon™ GPUs are compatible with Project FreeSync?</strong><br />
A: The first discrete GPUs compatible with Project FreeSync are the AMD Radeon™ R9 290X, R9 290, R7 260X and R7 260 graphics cards. Project FreeSync is also compatible with AMD APUs codenamed “Kabini,” “Temash,” “Beema,” and “Mullins.” All compatible products must be connected via DisplayPort™ to a display that supports DisplayPort™ Adaptive-Sync.</p>
<p style="padding-left: 30px;"><strong>Q: How is Project Freesync different from NVIDIA G-Sync?</strong><br />
A: While both technologies have similar benefits, G-Sync uses expensive and proprietary hardware. In contrast, Project FreeSync utilizes the industry-standard DisplayPort™ Adaptive-Sync specification to promote wider adoption, lower cost of ownership, and a broad ecosystem of compatibility.</p>
<p style="padding-left: 30px;"><strong>Q: Why should gamers purchase a system that utilizes Project FreeSync?</strong><br />
A: Project FreeSync’s ability to synchronize the refresh rate of a display to the framerate of a graphics card can eliminate visual artifacts that many gamers are especially sensitive to: screen tearing, input lag, and stuttering. Project FreeSync aims to accomplish this through an open ecosystem that does not require licensing fees from participants, which encourages broad adoption and low end-user costs.</p>
<p style="padding-left: 30px;"><strong>Q: What is the supported range of refresh rates with FreeSync and DisplayPort™ Adaptive-Sync?</strong><br />
A: AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort™ Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.</p>
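<p>In practice, the driver has to keep the refresh rate inside whatever range the display reports. A minimal sketch of that clamping logic (the function and values here are illustrative, not AMD&#8217;s actual driver interface):</p>
<pre><code># Clamp the delivered frame rate into the panel's reported range.
def effective_refresh_hz(frame_time_ms, panel_min_hz, panel_max_hz):
    rate = 1000.0 / frame_time_ms   # rate the GPU is delivering, in Hz
    return max(panel_min_hz, min(panel_max_hz, rate))

# Using the 21-144 Hz range listed above:
print(effective_refresh_hz(12.0, 21, 144))   # ~83.3 fps maps to ~83.3 Hz
print(effective_refresh_hz(5.0, 21, 144))    # 200 fps is capped at 144 Hz
print(effective_refresh_hz(60.0, 21, 144))   # ~16.7 fps is floored at 21 Hz
</code></pre>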
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/">Adaptive-Sync Added to VESA DisplayPort 1.2a Standard</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>AMD&#039;s &#039;FreeSync&#039; Ratified by VESA, More to Come</title>
		<link>http://www.vrworld.com/2014/04/11/amds-freesync-ratified-by-vesa/</link>
		<comments>http://www.vrworld.com/2014/04/11/amds-freesync-ratified-by-vesa/#comments</comments>
		<pubDate>Fri, 11 Apr 2014 16:56:17 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Audio/Video]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Rumors]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Bandwidth]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DisplayPort 1.2a]]></category>
		<category><![CDATA[DisplayPort 1.3]]></category>
		<category><![CDATA[FPS]]></category>
		<category><![CDATA[Frame Rate]]></category>
		<category><![CDATA[Framerate]]></category>
		<category><![CDATA[g-sync]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[gsync]]></category>
		<category><![CDATA[monitor]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[VESA]]></category>

		<guid isPermaLink="false">http://www.brightsideofnews.com/?p=34414</guid>
		<description><![CDATA[<p>As many of you may already know, AMD has proposed a standard unofficially dubbed FreeSync as a way to allow monitors to sync with graphics ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/11/amds-freesync-ratified-by-vesa/">AMD&#039;s &#039;FreeSync&#039; Ratified by VESA, More to Come</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="800" height="539" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/AMDVESA1.jpg" class="attachment-post-thumbnail wp-post-image" alt="AMDVESA" /></p><p>As many of you may already know, AMD has proposed a standard unofficially dubbed FreeSync that lets monitors sync with graphics cards so they never display partial frames, matching the monitor&#8217;s refresh rate to the GPU&#8217;s frame rate. The standard is designed to be a &#8216;free&#8217; alternative to <a href="http://www.brightsideofnews.com/2013/10/18/nvidia-introduces-g-sync-the-death-of-v-sync/">Nvidia&#8217;s announced G-Sync</a>, which will only work with Nvidia&#8217;s own GPUs and monitor hardware. AMD&#8217;s solution is designed to be more &#8216;open&#8217; and &#8216;free,&#8217; meaning you don&#8217;t have to commit to a specific brand of graphics card or monitor. While I haven&#8217;t seen FreeSync, I have seen G-Sync, and I can tell you the experience without a doubt elevates the quality of gaming.</p>
<p>A rumor has been circulating in the press for the past few days that AMD&#8217;s FreeSync standard has been ratified by VESA and will become part of the DisplayPort standard. We did some digging and managed to get some fairly definitive answers about this new standard, which will be part of DisplayPort 1.2a. We were told that while the name itself isn&#8217;t bad, it will very likely be dropped when the standard is officially announced as part of DisplayPort. We were also told that this could happen as soon as May, shortly before VESA makes another announcement.</p>
<p>That announcement will precede an even bigger one: DisplayPort 1.3. For those unfamiliar with it, <a href="http://www.brightsideofnews.com/2013/12/03/displayport-13-to-support-8k2c-standard-expected-in-q2-2014/">we wrote a lengthy exclusive article back in December</a> detailing DisplayPort 1.3; once FreeSync is integrated into 1.2a, the feature will carry over to 1.3 by default. I suspect FreeSync could help DisplayPort 1.3 gain adoption faster, since generally lower refresh rates let displays draw less power. In fact, one of the best applications of FreeSync in DisplayPort 1.2a and later standards is exactly that: monitors spend less time refreshing and therefore draw less power, on top of the experiential improvements. It may also be easier to implement features like 8K video over DisplayPort 1.3 if initial testing can be done at refresh rates below 60 Hz. Hopefully the cable bandwidth issue with DisplayPort 1.3 has been resolved; if we learn anything new about FreeSync or DisplayPort 1.3, we&#8217;ll report it as soon as possible.</p>
<p>The expectation is that DisplayPort 1.3 will be announced sometime in late Q2 or early Q3, which leaves DisplayPort 1.2a to arrive anywhere between now and then.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/11/amds-freesync-ratified-by-vesa/">AMD&#039;s &#039;FreeSync&#039; Ratified by VESA, More to Come</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/11/amds-freesync-ratified-by-vesa/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</title>
		<link>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/</link>
		<comments>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/#comments</comments>
		<pubDate>Wed, 09 Apr 2014 17:55:31 +0000</pubDate>
		<dc:creator><![CDATA[Anshel Sag]]></dc:creator>
				<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Reviews]]></category>
		<category><![CDATA[3840x2160]]></category>
		<category><![CDATA[4K]]></category>
		<category><![CDATA[4K Gaming]]></category>
		<category><![CDATA[500 Watts]]></category>
		<category><![CDATA[500w]]></category>
		<category><![CDATA[512-bit]]></category>
		<category><![CDATA[8GB]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[AMD Radeon]]></category>
		<category><![CDATA[AMD Radeon R9 295X2]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[Dual GPU]]></category>
		<category><![CDATA[GDDR5]]></category>
		<category><![CDATA[Hawaii]]></category>
		<category><![CDATA[Islands]]></category>
		<category><![CDATA[memory bus]]></category>
		<category><![CDATA[power]]></category>
		<category><![CDATA[PSU]]></category>
		<category><![CDATA[Shader Cores]]></category>
		<category><![CDATA[Vesuvius]]></category>
		<category><![CDATA[Volcanic Islands]]></category>

		<guid isPermaLink="false">http://wp.bsne.ws/?p=34324</guid>
		<description><![CDATA[<p>AMD has been teasing their Radeon R9 295X2 codenamed Vesuvius for quite some time now, including a lengthy &#8216;viral&#8217; campaign that involved secret agents, semi-ambiguous ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><img width="1920" height="1000" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00381.jpg" class="attachment-post-thumbnail wp-post-image" alt="DSC_0038" /></p><p>AMD has been teasing their Radeon R9 295X2, codenamed Vesuvius, for quite some time now, including a lengthy &#8216;viral&#8217; campaign that involved secret agents, <a title="Viral Marketing for AMD Vesuvius" href="http://www.brightsideofnews.com/news/2014/3/19/amds-2betterthan1-vesuvius-viral-marketing-continues.aspx" target="_blank">semi-ambiguous packages of Volcanic Island water and chips</a> as well as <a title="AMD Teases new Dual GPU" href="http://www.brightsideofnews.com/news/2014/3/13/amd-teases-new-dual-gpu-card-with-2betterthan1-viral-ad.aspx" target="_blank">creepy photos of yours truly</a>. Now that the secret is out, we can finally tell you about AMD&#8217;s new card and exactly what it is intended to do. First and foremost, this card&#8217;s sole purpose is to deliver a single-card 4K gaming experience, something that is currently impossible even with the latest crop of AMD&#8217;s Radeon R9 290X and Nvidia&#8217;s GeForce GTX 780 Ti. In this review, we&#8217;ll see whether AMD&#8217;s latest and greatest graphics card can deliver true 4K gaming on a single card.</p>
<p>The name AMD Radeon R9 295X2 doesn&#8217;t quite make sense to me when you consider that it is really just two R9 290Xs on a single dual-GPU graphics card. I would&#8217;ve called it the R9 290X2, but perhaps I&#8217;m missing the point. Given that this card is effectively two R9 290Xs strapped onto one board, and remembering all of the thermal issues a single R9 290X had, it was obvious AMD needed to go with liquid cooling. And they did: they tapped Asetek for a custom cooling solution to cool both Hawaii GPUs on this Vesuvius card.</p>
<p>The Radeon R9 295X2 has both an LED-lit fan and an LED-lit logo between the two liquid cooling tubes.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01051.jpg" rel="lightbox-0"><img class="aligncenter  wp-image-34313" alt="DSC_0105" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01051.jpg" width="1152" height="574" /></a></p>
<p><span style="line-height: 1.5em;">The R9 295X2 came to us in a briefcase, which we originally teased in an article, even though we&#8217;re not actually sure what the retail packaging for this card will be since these were press samples. Judging by </span><a style="line-height: 1.5em;" href="http://rog.asus.com/314282014/gaming-graphics-cards-2/asus-announces-r9-295x2-graphics-card/" target="_self">ASUS&#8217; own packaging</a><span style="line-height: 1.5em;">, it doesn&#8217;t look like it&#8217;ll be coming in a briefcase like ours did, what a shame.</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_9979_9101.jpg" rel="lightbox-1"><img class="aligncenter size-full wp-image-34336" alt="DSC_9979_910" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_9979_9101.jpg" width="910" height="490" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_0004_9101.jpg" rel="lightbox-2"><img class="aligncenter size-full wp-image-34335" alt="DSC_0004_910" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_0004_9101.jpg" width="910" height="910" /></a></p>
<p>Looking at the card in detail, you are getting a dual Hawaii ASIC graphics card with a maximum of 8GB of GDDR5 memory. Comparing it side by side against the R9 290X and the R9 290 shows exactly what you&#8217;re getting in terms of performance and raw specifications.</p>
<p>Furthermore, compute performance is actually more than double the 5.6 TFLOPS the R9 290X is rated at: 11.5 TFLOPS instead of the expected 11.2 TFLOPS, because the GPUs run at up to 1,018 MHz rather than the R9 290X&#8217;s 1 GHz (perhaps the reason for the 295X2 name?). The higher clocks were enabled by the cooling and improved binning more than anything else, which we&#8217;ll discuss next. The 11.5 TFLOPS figure comes from the card&#8217;s 5,632 stream processors, and its texture units deliver a fill rate of up to 358.3 GT/s.</p>
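<p>The math checks out if you take double the R9 290X&#8217;s published unit counts (2,816 stream processors and 176 texture units per GPU, figures from AMD&#8217;s spec sheets rather than this review):</p>
<pre><code># Where the 11.5 TFLOPS and 358.3 GT/s figures come from.
stream_processors = 5632    # 2 x 2,816 (two R9 290X-class GPUs)
texture_units     = 352     # 2 x 176
clock_ghz         = 1.018

tflops  = stream_processors * 2 * clock_ghz / 1000   # 2 FLOPs/clock (FMA)
gtexels = texture_units * clock_ghz                  # texture fill, GT/s
print(f"{tflops:.1f} TFLOPS, {gtexels:.1f} GT/s")
</code></pre>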
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardSpecs1.jpg" rel="lightbox-3"><img class="aligncenter  wp-image-34294" alt="CardSpecs" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardSpecs1.jpg" width="1152" height="624" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/TransistorsAndShaders1.jpg" rel="lightbox-4"><img class="aligncenter  wp-image-34317" alt="TransistorsAndShaders" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/TransistorsAndShaders1.jpg" width="1152" height="603" /></a></p>
<p>To combat the overheating and thermal throttling issues it had with the R9 290X, AMD approached Asetek to help water cool the entire graphics card. Asetek is very good at building self-contained liquid cooling systems and can manufacture them at a scale of thousands, if not hundreds of thousands, of units. As a result, they helped AMD engineer a solution that cools not one but two slightly higher-clocked Hawaii GPUs, a pretty tall order.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/AsetekWatercooling11.jpg" rel="lightbox-5"><img class="aligncenter  wp-image-34337" alt="AsetekWatercooling" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/AsetekWatercooling11.jpg" width="1152" height="620" /></a></p>
<p>Looking at the card itself, the system has a few major parts: the metal backplate, the PCB with the GPUs and memory, the Asetek liquid cooling blocks with a full-length cooling plate, and the metal cover and fan. A heat exchanger (radiator) is attached to the Asetek waterblock/pumps, and the two block/pump units are connected to each other. Why have a fan on the cover if the GPUs are already water cooled, some of you might ask? The fan&#8217;s purpose is to keep the memory chips cool, along with the voltage regulators in the middle. Remember, voltage regulators are typically the first failure point of most dual-GPU cards.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardBreakDown1.jpg" rel="lightbox-6"><img class="aligncenter  wp-image-34292" alt="CardBreakDown" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardBreakDown1.jpg" width="1152" height="575" /></a> <a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardDimensions1.jpg" rel="lightbox-7"><img class="aligncenter  wp-image-34293" alt="CardDimensions" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CardDimensions1.jpg" width="1152" height="508" /></a></p>
<p>As a result, you get a very powerful graphics card that is fully contained and self-sufficient, as long as you can feed it 500 watts. The card has two 8-pin PCI Express power connectors, which means you need a PSU with enough current on the +12V rail to power it safely and stably. For this purpose, we used a Thermaltake Toughpower XT 1475W power supply; obviously you don&#8217;t need that much unless you plan to run two of these cards in a single system.</p>
<p>The back of the card is populated with four Mini DisplayPort and one dual-link DVI connector, letting you drive a total of five monitors at up to 2560&#215;1600 each if you use all five. You could theoretically drive three 4K monitors off of this card, though realistically you&#8217;d want two of these cards to run a triple-4K setup smoothly. Gaming across 25 million pixels? Everything is possible.</p>
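<p>That 25-million-pixel figure is simply the triple-4K pixel count:</p>
<pre><code># Pixel count for a triple-4K (3840x2160) setup.
pixels = 3 * 3840 * 2160
print(f"{pixels:,} pixels (~{pixels / 1e6:.1f} million)")   # 24,883,200
</code></pre>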
<p>That takes us to the card&#8217;s overall performance: evaluating its capability as a 4K gaming card and whether it is actually worth the $1,499 price tag. Beyond that, how does it utilize the 8GB of GDDR5, and does it run anywhere near as hot as the R9 290X with the new cooling solution?</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00301.jpg" rel="lightbox-8"><img class="aligncenter  wp-image-34303" alt="DSC_0030" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00301.jpg" width="1152" height="533" /></a></p>
<p>Our testing covered synthetic, OpenCL and game benchmarks. All were run with both Nvidia&#8217;s and AMD&#8217;s latest graphics drivers, and AMD&#8217;s Mantle was enabled where possible (Battlefield 4). The test system was powered by a Core i7-4960X with 16GB of DDR3-1600 on an X79 board, with the cards in an x16 PCIe slot. The PSU, as mentioned earlier, was a Thermaltake Toughpower XT 1475W, and we used a Sharp PN-K321 monitor for 4K. We did not test at any resolution other than 4K, because it seems ridiculous to spend $1,500 on a graphics card and use it for anything less. Theoretically, you could dedicate it to compute, and in that case our compute benchmarks have you covered.</p>
<p>The benchmarks we ran were 3DMark Fire Strike Extreme, Kishonti CompuBench (formerly CLBenchmark), LuxMark v2.0, Battlefield 4, Crysis 3 and Counter Strike: Global Offensive.</p>
<p>We tested the R9 295X2, the R9 290 and the GTX 780 Ti. We had issues getting the last-generation Radeon HD 7990 to work properly with our 4K monitor in the game tests, so we had to drop it and will have to find a fix in the future; unfortunately, that issue showed AMD&#8217;s drivers at their worst. We do, however, have HD 7990 results in our 3DMark and OpenCL benchmarks to give an idea of the new card&#8217;s gains over it.</p>
<p><span style="font-weight: bold;">Synthetic Benchmarks</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/3DmarkFireStrikeExtremeFixed1.jpg" rel="lightbox-9"><img class="aligncenter size-full wp-image-34318" alt="3DmarkFireStrikeExtremeFixed" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/3DmarkFireStrikeExtremeFixed1.jpg" width="1012" height="599" /></a></p>
<p>In 3DMark, the first test we ran, the R9 295X2 came in as the fastest single card we&#8217;ve ever tested, by a huge margin. It scored 8403 in Fire Strike Extreme, besting our GTX Titan SLI setup&#8217;s 7391 and the HD 7990&#8217;s 4639. It also nearly doubled the R9 290&#8217;s 4631, though a 290X would have scored higher, so the R9 295X2 isn&#8217;t quite double the performance of a single Hawaii GPU in 3DMark. Even so, it is still the fastest multi-GPU setup we&#8217;ve tested in 3DMark.</p>
<p><span style="font-weight: bold;">Compute Performance</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/KishontiCLBench1.jpg" rel="lightbox-10"><img class="aligncenter size-full wp-image-34322" alt="KishontiCLBench" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/KishontiCLBench1.jpg" width="919" height="466" /></a></p>
<p>In Kishonti&#8217;s CompuBench, the new name for CLBenchmark (they&#8217;ve added RenderScript support on mobile), we wanted to see how each card handled compute, since Nvidia has traditionally struggled against AMD here, and we found some really interesting results. In OpenCL compute, AMD&#8217;s Radeon R9 295X2 outperformed three of Nvidia&#8217;s GTX Titans as well as the short-lived Radeon HD 7990, scoring 686,475 points against the three Titans&#8217; 646,233 and the HD 7990&#8217;s 560,695. Interestingly, the R9 295X2 scored more than double the R9 290&#8217;s 333,444, suggesting it scores roughly double what a single R9 290X would.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/LuxMarkv21.jpg" rel="lightbox-11"><img class="aligncenter size-full wp-image-34323" alt="LuxMarkv2" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/LuxMarkv21.jpg" width="913" height="440" /></a></p>
<p>The story was a little different in LuxMark v2.0; in the end, the R9 295X2 was still much faster than all competing GPU solutions, and AMD took the top three slots even against three of Nvidia&#8217;s GTX Titans. The R9 295X2 is a really powerful compute card, which is evident here in its score of 5350 against the 7970 GHz Editions in CrossFireX (4679) and the HD 7990 (4475).</p>
<p><span style="font-weight: bold;">Real World Game Performance</span></p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/BF4_4K1.jpg" rel="lightbox-12"><img class="aligncenter size-full wp-image-34319" alt="BF4_4K" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/BF4_4K1.jpg" width="769" height="353" /></a></p>
<p>In Battlefield 4 we wanted a playable scenario for all of the cards we tested, so we used the High preset rather than the highest (Ultra) and opted for 4x MSAA instead of 8x. As a result, all of the cards delivered playable frame rates, with the R9 295X2 averaging 81 FPS against the R9 290 and 780 Ti, which both averaged 40 FPS. At 81 FPS, we could likely crank the settings up to the Ultra preset and still have playable graphics at near-maximum settings. Thanks to Mantle, AMD&#8217;s R9 295X2 delivers an impressive amount of performance in BF4 at 4K. During our benchmarking, the R9 295X2 used a maximum of 6.7GB of its 8GB of memory, indicating some wiggle room for a settings bump, which, especially with Mantle, would probably push the card closer to 8GB.</p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/Crysis3_4K1.jpg" rel="lightbox-13"><img class="aligncenter" alt="Crysis3_4K" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/Crysis3_4K1.jpg" width="763" height="394" /></a></p>
<p>With Crysis 3 we just wanted to make the graphics cards cry: to see what would happen at nearly the maximum possible settings while keeping the R9 295X2 playable. We set the game to Very High and dialed MSAA to 4x instead of 8x. As a result, the R9 295X2 averaged 29 FPS, while the 780 Ti managed 20 FPS and the R9 290 18 FPS. Even though the game looked absolutely stunning, at these settings in 4K it was only playable on the R9 295X2 and would&#8217;ve required two 780 Tis or two R9 290s to get anywhere near similar performance. Most interesting was that this game pushed GPU memory utilization far beyond what Battlefield 4 did: at its peak, the R9 295X2 used 7.5GB of its 8GB of memory, indicating that we had effectively maxed out the card&#8217;s capabilities at 4K.</p>
<p>Thus, if you think 8GB of video memory is overkill for a graphics card, think again.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/CSGO_4k_6891.jpg" rel="lightbox-14"><img class="aligncenter size-full wp-image-34321" alt="CSGO_4k_689" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/CSGO_4k_6891.jpg" width="760" height="387" /></a></p>
<p>With Counter Strike: Global Offensive, we knew we weren&#8217;t going to see anything crazy in terms of graphics, but we also knew the game would run easily at 4K on all of our cards, with FPS high enough to differentiate the cards effectively. It also gave us a Source Engine game alongside the Frostbite and CryEngine titles. Personally, I play a lot at 4K, and since Counter Strike: Global Offensive is on its way to becoming one of the most played games on Steam, it made sense to test it at 4K. We set the game to its maximum possible settings to lower the FPS as much as possible and improve the visual quality at 4K.</p>
<p>No surprise: at max settings, all of these cards managed nearly 150 FPS or more. The R9 295X2 delivered an average of 250 FPS (300 is the cap set by the Source Engine), while the R9 290 managed 163 FPS and the 780 Ti 148 FPS. With the R9 295X2, you&#8217;re getting almost 100 FPS more than the single GPUs at 4K. In this scenario, the R9 295X2&#8217;s memory utilization was, as expected, much lower than in the other two games, at only 3.6GB of the 8GB.</p>
<p>That wraps up our performance testing, but before we move on to thermals, a word on why we tracked memory utilization. We wanted to illustrate the point of an 8GB card rather than a 16GB one: 8GB is enough in almost any scenario, unless you plan on running Crysis 3 at 4K. Anyone trying to sell you a card with more than 8GB is probably selling you more memory than you need, and if you&#8217;re not running 4K, you probably don&#8217;t need more than 3 or 4GB. Keep that in mind as add-in board partners try to charge more for memory you&#8217;ll never use.</p>
<p>We also want to note that even with AMD&#8217;s latest drivers and a custom CrossFire profile, certain benchmarks simply would not use both GPUs. We had some problems configuring the 4K monitor with the R9 295X2 as well, though we were able to work those out. To us, the card&#8217;s drivers still need some work, but the three games we tested were playable. AMD&#8217;s frame pacing also appeared to work well: we didn&#8217;t notice any stuttering, or nearly as much tearing as with previous AMD dual-GPU setups.</p>
<p><span style="font-weight: bold;">Temperatures</span></p>
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00111.jpg" rel="lightbox-15"><img class="aligncenter  wp-image-34299" alt="DSC_0011" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00111.jpg" width="1152" height="394" /></a></p>
<p>Now, getting back to the card: thermals. When I first heard this card might exist, my first concern was cooling and temperatures. AMD struggled with the R9 290X&#8217;s reputation because the card topped out at a near-boiling 95C under extreme loads and throttled performance as a result. Their add-in board partners eventually released better cooling solutions that alleviated the issue to a degree, but the concern came back once I heard AMD would be making a dual Hawaii GPU. To be honest, my first thought was that it had to be dual 290s, and that it would have to be water cooled.</p>
<p>When I found out the card would be two slightly higher-clocked 290Xs, I was concerned about how it would perform thermally and whether the cooler could handle the heat these GPUs generate. Even seeing the cooling solution, I wasn&#8217;t sure a single 120mm Asetek radiator could handle up to 500W of GPU power dissipation; after all, most CPU coolers with 120mm radiators cool 130W CPUs, roughly half of what just one of these GPUs can dissipate.</p>
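<p>A rough comparison puts that concern in numbers; radiator capacity depends heavily on fan speed and coolant flow, so this is only illustrative:</p>
<pre><code># Rough heat-load comparison for a single 120 mm radiator.
card_max_w   = 500   # maximum board power cited above
cpu_cooler_w = 130   # typical CPU load a 120 mm AIO handles

ratio = card_max_w / cpu_cooler_w
print(f"the R9 295X2's radiator handles ~{ratio:.1f}x the heat "
      f"of a typical 120 mm CPU cooler")
</code></pre>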
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00251.jpg" rel="lightbox-16"><img class="aligncenter  wp-image-34301" alt="DSC_0025" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_00251.jpg" width="1152" height="766" /></a></p>
<p>After all of my testing, I can say the first GPU idled at a normal temperature of around 36C, which is good for any graphics card but expected for a water-cooled one. What really caught me by surprise was the temperature of the first GPU (the most heavily used one) under maximum load: it never got hotter than 67C, even when the radiator and tubes were warm to the touch. Because of that, this card can deliver a very stable level of performance consistently, time and time again. I don&#8217;t know exactly what the guys at Asetek and AMD did, <a href="http://asetek.com/press-room/news/2014/amd-selects-asetek-to-liquid-cool-the-world%E2%80%99s-fastest-graphics-card.aspx" target="_self">but they have engineered a fantastic cooling solution</a>, one I frankly thought might be underpowered for these GPUs.</p>
<p><span><span style="font-weight: bold;">Conclusion</span></span></p>
<p>The Radeon R9 295X2 is a $1,500 card; there&#8217;s no getting around that. In fact, the price makes a lot of sense if you think about it. The R9 290X should sell for around $550, even though there&#8217;s still a fair amount of gouging left over from last year&#8217;s cryptocurrency rush (<a href="http://www.newegg.com/Product/Product.aspx?gclid=CObM56jb1L0CFUNhfgod6ocAUg&amp;Item=N82E16814129278&amp;nm_mc=KNC-GoogleAdwords&amp;cm_mmc=KNC-GoogleAdwords-_-pla-_-Desktop+Graphics+Cards-_-N82E16814129278&amp;ef_id=UhMeXwAAAWKGwRMG:20140410005755:s" target="_self">Newegg is selling a VisionTek R9 290X</a> with a non-reference cooler for just under $600). Take that $550 price and double it, and you&#8217;ve already arrived at an $1,100 card. This card is also water cooled, which normally carries a $100-$200 premium on any graphics card, bringing the expected retail price to about $1,300. But dual-GPU cards rarely sell for exactly twice the price of their single-GPU parents, and <a href="http://www.brightsideofnews.com/news/2014/3/25/gtc-2014-keynote---gtx-titan-z-and-pascal-announced.aspx" target="_self">Nvidia announced the Titan Z would sell for $3,000</a>. That gave AMD some wiggle room on price, and I suspected they would land somewhere between $1,300 and $1,500. And here we are, at $1,500.</p>
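<p>Spelled out, the pricing arithmetic from that paragraph looks like this:</p>
<pre><code># Pricing logic: two R9 290X GPUs plus a typical water-cooling premium.
r9_290x_price = 550
base          = 2 * r9_290x_price   # $1,100 for two Hawaii GPUs

low_estimate  = base + 100          # low end of the water-cooling premium
high_estimate = base + 200          # high end of the premium
print(f"expected retail: ${low_estimate:,}-${high_estimate:,}; actual: $1,500")
</code></pre>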
<p style="text-align: center;"><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01021.jpg" rel="lightbox-17"><img class="aligncenter  wp-image-34310" alt="DSC_0102" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/DSC_01021.jpg" width="1152" height="824" /></a></p>
<p>Compared to buying two R9 290Xs, the card simply does not provide more value than owning two separate cards. However, it is significantly quieter and cooler than two R9 290Xs, and it provides a high level of 4K gaming performance from a single graphics card. To make proper use of either this card or any dual-290X setup, you should really be looking at a 4K display. So, this card makes sense if you want the absolute fastest, if you want to cram four Hawaii GPUs onto a board with only two PCIe slots, or if your case simply isn&#8217;t big enough for four GPUs. Cooling four air-cooled Hawaii GPUs in any case would also be significantly harder than cooling two of these Vesuvius cards. In fact, you would spend around $2,300 on four 290Xs and probably end up with a much hotter and louder setup than if you spent the extra $700 on two 295X2s. So there is some sense in these cards if you absolutely must have the fastest and the best, and you game at 4K.</p>
<p>In terms of value, keep one thing in mind: for the price of a single GeForce GTX Titan Z, you can purchase an R9 295X2 and not one but TWO 28&#8243; 4K 60Hz panels from Samsung. If you dislike TN panels, you can still afford a 24&#8243; IPS 4K panel from Dell. Talk about a value deal.</p>
<p>I have to say I was personally quite impressed with this card, especially on the performance and thermal fronts. If you want the absolute latest, greatest and fastest card in the world, you have to buy the R9 295X2. If you want just one card for 4K gaming, this is it. Because of that, we&#8217;re awarding it our Editor&#8217;s Choice award.</p>
<p><a href="http://cdn.vrworld.com/wp-content/uploads/2014/04/editors-choice_prosumer1.gif" rel="lightbox-18"><img class="aligncenter size-full wp-image-34342" alt="editors-choice_prosumer" src="http://cdn.vrworld.com/wp-content/uploads/2014/04/editors-choice_prosumer1.gif" width="618" height="68" /></a></p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/">AMD Radeon R9 295X2 Review: The Definitive 4K Gaming Graphics Card</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2014/04/09/amd-radeon-r9-295x2-review-the-definitive-4k-gaming-graphics-card/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Zotac releases affordable DisplayPort cards</title>
		<link>http://www.vrworld.com/2008/11/11/zotac-releases-affordable-displayport-cards/</link>
		<comments>http://www.vrworld.com/2008/11/11/zotac-releases-affordable-displayport-cards/#comments</comments>
		<pubDate>Tue, 11 Nov 2008 18:00:19 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[el cheapo]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[HDMI]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Zotac]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=378</guid>
		<description><![CDATA[<p>Since DisplayPort will be the &#8220;Flavor of the year&#8221; in 2009, and start to replace DVI and analog D-SUB, more and more companies are joining ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/11/11/zotac-releases-affordable-displayport-cards/">Zotac releases affordable DisplayPort cards</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Since DisplayPort will be the &#8220;flavor of the year&#8221; in 2009 and start to replace DVI and analog D-SUB, more and more companies are joining in with products that feature the connector.<br />
Zotac has decided to launch the most affordable such cards so far &#8211; based on the GeForce 9400GT and 9500GT, these boards target the entry-level &#8220;Christmas special&#8221; systems that will ship equipped with &#8220;displays for 2009&#8221;.</p>
<div id="attachment_379" style="width: 510px" class="wp-caption aligncenter"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/11/zotac_displayport_9400and95.jpg" rel="lightbox-0"><img class="size-full wp-image-379" title="zotac_displayport_9400and95" src="http://cdn.vrworld.com/wp-content/uploads/2008/11/zotac_displayport_9400and95.jpg" alt="Two el cheapo parts..." width="500" height="186" /></a><p class="wp-caption-text">Two el cheapo parts...</p></div>
<p>All in all, interesting parts.</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/11/11/zotac-releases-affordable-displayport-cards/">Zotac releases affordable DisplayPort cards</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/11/11/zotac-releases-affordable-displayport-cards/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>New &#8220;nForce for AMD&#8221; chipsets take shape under GeForce name</title>
		<link>http://www.vrworld.com/2008/10/20/new-nforce-for-amd-chipsets-take-shape-under-geforce-name/</link>
		<comments>http://www.vrworld.com/2008/10/20/new-nforce-for-amd-chipsets-take-shape-under-geforce-name/#comments</comments>
		<pubDate>Mon, 20 Oct 2008 12:00:39 +0000</pubDate>
		<dc:creator><![CDATA[Theo Valich]]></dc:creator>
				<category><![CDATA[3D]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Business]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Graphics]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Memory & Storage Space]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[ACC]]></category>
		<category><![CDATA[Advanced Clock Calibration]]></category>
		<category><![CDATA[AM2]]></category>
		<category><![CDATA[Athlon]]></category>
		<category><![CDATA[Computex]]></category>
		<category><![CDATA[DDR2]]></category>
		<category><![CDATA[DDR3]]></category>
		<category><![CDATA[DisplayPort]]></category>
		<category><![CDATA[DVI]]></category>
		<category><![CDATA[GeForce]]></category>
		<category><![CDATA[HDMI]]></category>
		<category><![CDATA[HybridSLI]]></category>
		<category><![CDATA[nForce]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[OCZ]]></category>
		<category><![CDATA[Phenom]]></category>
		<category><![CDATA[SLI-Ready]]></category>
		<category><![CDATA[Taipei]]></category>

		<guid isPermaLink="false">http://theovalich.wordpress.com/?p=105</guid>
		<description><![CDATA[<p>Last week, Chinese site Expreview.com published a story about the new generation of nForce chipsets for AMD processors. We managed to find more details over ...</p>
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/20/new-nforce-for-amd-chipsets-take-shape-under-geforce-name/">New &#8220;nForce for AMD&#8221; chipsets take shape under GeForce name</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Last week, Chinese site Expreview.com published a story <a href="http://en.expreview.com/2008/10/16/nvidia-never-leaves-mobo-business-preparing-mcp82.html#more-1086" target="_blank">about the new generation of nForce chipsets for AMD</a> processors.</p>
<p>We managed to find more details over the course of the weekend. For starters, the lineup will consist of three (not two) chipsets with varying capabilities.</p>
<p>MCP82-S1, MCP82-S2 and MCP82-S3 will round out the lineup, each targeting its respective market (high-end, mainstream and entry-level). The S1 and S2 will support SLI, while the S3 targets lucrative OEM/ODM deals &#8211; our sources indicate that this variant will be pitched as a successor to the GeForce 6150 line that conquered many Dells, HPs and Acers out there.<br />
<strong>MCP82-S1</strong> targets the high-end, with Hybrid-SLI technology and support for up to three graphics cards in 3-Way SLI (one card connected via an x16 slot, two via x8 slots). The chipset features 35 PCIe lanes and 7 links. You can expect a typical MCP82-S1 mainboard layout to include three PCIe x16 slots, two x1 slots and one PCI 3.0 slot, or two PCI slots and a single x1 slot.<br />
<strong>MCP82-S2</strong> supports 19 PCIe lanes and 4 links, so a typical layout will be two PCIe x8 slots and two x1 slots, or a single PCIe x16 slot and three x1 slots. As you can see, this board does not target the multi-GPU gamer, but rather offers a compelling single-card experience. HybridSLI is supported, of course (see the lane-budget sketch after this paragraph).<br />
<strong>MCP82-S3</strong> should be available only in the micro-ATX form factor, offering a cut-down, cost-effective version of the S2. These motherboards will target customers who are just entering the world of computing, and if Nvidia gets the price down, you may even see this chipset in netbooks. We feel that a combination of the next-generation Tegra and MCP82-S3 would be quite interesting for the netbook market &#8211; if, of course, Microsoft allows licensing of Windows Mobile for a &#8220;cut-down notebook&#8221;.</p>
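<p>For illustration only &#8211; a minimal sketch in which the <code>fits()</code> helper and the one-link-per-slot assumption are ours, not Nvidia&#8217;s &#8211; here is how the slot layouts listed above fit within the rumored lane and link budgets:</p>
<pre><code># Illustrative sanity check of the rumored MCP82 lane budgets.
# The lane/link figures come from the story above; the fits() helper
# and the assumption that each slot consumes one link are ours.

def fits(layout, lanes, links):
    """True if a slot layout stays within the chipset's lane and link budget."""
    return sum(layout) &lt;= lanes and len(layout) &lt;= links

# MCP82-S1: 35 lanes, 7 links; 3-way SLI wired as x16 + x8 + x8, plus two x1 slots
print(fits([16, 8, 8, 1, 1], lanes=35, links=7))  # True: 34 lanes, 5 links

# MCP82-S2: 19 lanes, 4 links; both layouts mentioned above check out
print(fits([8, 8, 1, 1], lanes=19, links=4))      # True: 18 lanes, 4 links
print(fits([16, 1, 1, 1], lanes=19, links=4))     # True: 19 lanes, 4 links
</code></pre>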
<div id="attachment_108" style="width: 510px" class="wp-caption alignnone"><a href="http://cdn.vrworld.com/wp-content/uploads/2008/10/nvidia_nf780aref.jpg" rel="lightbox-0"><img class="size-full wp-image-108" title="nvidia_nf780aref" src="http://cdn.vrworld.com/wp-content/uploads/2008/10/nvidia_nf780aref.jpg" alt="In the future, you should connect your monitor to the motherboard." width="500" height="306" /></a><p class="wp-caption-text">In the future, you should connect your monitor to the motherboard.</p></div>
<p>The integrated GPU in all three chipsets is based on an architecture that sits between the G92 and GT200 chips, offering improved performance in CUDA applications. I haven&#8217;t been able to find out whether it will bring 32 or 40 shader processors (thus, two or three FP64 DP units). The mGPU features Dual-Link DVI, HDMI and analog VGA connectors. If Nvidia really wants us to connect our displays to their mGPU, they should include two Dual-Link DVIs or two DisplayPorts. Only then can they start thinking about removing one DVI connector from their GeForce cards, not before (more about that later).<br />
Storage-wise, Nvidia continues with hardware RAID support &#8211; RAID 0, 0+1 and 5 are all supported through six SATA connectors (a quick usable-capacity comparison follows after this paragraph). The network hardware will continue to offer GbE or paired-GbE speeds through two RJ-45 connectors; activate the paired-GbE mode and the two ports team up into a single logical connection with 2Gbps of aggregate bandwidth.<br />
Memory-wise, the MCP82 series supports AM2, AM2+ and AM3 processor sockets, so you can expect to find DIMM slots for either DDR2-800/1066 or DDR3-1066/1333/1600. SLI Memory will continue to be supported out of the box, and there is little doubt that new SLI-Ready memory will appear in time for launch. If AMD scores a big one with its 45nm lineup, we might even see the continuation of AMD-CPU-only memory from the likes of OCZ.<br />
Further integration with AMD processors includes full support for Advanced Clock Calibration (ACC), a feature from the latest ATI southbridge chips. With ACC, you can push the CPUs further than before; the usual gain is around 200-400 MHz (from 2.8-3.0GHz to 3.2GHz on air). The overclocking capabilities of AM3 processors remain unknown.<br />
The biggest question hanging in the air is: will the nForce brand survive? The destiny of the S3 is already sealed and delivered (GeForce), but time will tell whether we will see an nForce 890a and 870a, or whether the naming convention will swing the GeForce way. We expect these chips to launch in June, during Computex Taipei 2009.</p>
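<p>As a rough guide &#8211; the formulas below are the standard RAID-capacity rules rather than anything specific to Nvidia&#8217;s implementation, and the six 500GB drives are a hypothetical example &#8211; here is what those RAID levels yield in usable space:</p>
<pre><code># Illustrative only: usable capacity for the RAID levels listed above,
# assuming six identical drives of `size` GB each (hypothetical figures).

def usable_capacity(level, drives, size):
    if level == "0":      # striping: full capacity, no redundancy
        return drives * size
    if level == "0+1":    # mirrored stripes: half the capacity
        return drives * size // 2
    if level == "5":      # striping with parity: one drive's worth lost
        return (drives - 1) * size
    raise ValueError("unsupported level: " + level)

for level in ("0", "0+1", "5"):
    print(level, usable_capacity(level, drives=6, size=500), "GB")
# prints: 0 3000 GB, 0+1 1500 GB, 5 2500 GB
</code></pre>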
<p>The post <a rel="nofollow" href="http://www.vrworld.com/2008/10/20/new-nforce-for-amd-chipsets-take-shape-under-geforce-name/">New &#8220;nForce for AMD&#8221; chipsets take shape under GeForce name</a> appeared first on <a rel="nofollow" href="http://www.vrworld.com">VR World</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.vrworld.com/2008/10/20/new-nforce-for-amd-chipsets-take-shape-under-geforce-name/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
	</channel>
</rss>
