|Latest EXTREME Overclocking Reviews
AMD Phenom II X4 975 BE & 840 Quad-Core Processors
Review | January 4, 2011
Today AMD is releasing two new processors for the mainstream market. First up is a new top-end quad-core, the 3.6 GHz Phenom II X4 975 Black Edition. The X4 975 BE is simply a 100 MHz speed bump over the already familiar X4 970 BE with no other design changes. The second new processor is a budget quad-core, the 3.2 GHz Phenom II X4 840.
AMD Phenom II X6 1100T BE Hex-Core Processor
Review | December 7, 2010
Back in September, AMD released six new CPUs to add to their product mix. Today AMD is showcasing three new processors, including a new flagship model, the AMD Phenom II X6 1100T Black Edition. The other two processors are a Phenom II X2 565 Black Edition and an Athlon II X3 455. All three of these processors are simply 100 MHz speed-bumps over previous top-end models.
AMD Phenom II X6 1075T & X4 970 BE & X2 560 BE Processors
Review | September 21, 2010
About five months ago AMD made a ground-breaking announcement with their first pair of Phenom II X6 hex-core desktop processors. Today AMD is releasing six new processors to add to their existing mix of Phenom II & Athlon II models. The most notable of these new processors is a mid-range hex-core, the Phenom II X6 1075T, priced almost in between the existing two X6 models. The quad-core Phenom II X4 970 BE is also a welcome addition.
ASUS P7P55D-E Deluxe Motherboard
Review | September 14, 2010
With the mid-range LGA 1156 socket becoming ever so popular, ASUS is bringing their offering to the table. The ASUS P7P55D-E Deluxe motherboard has a great feature set including USB 3.0 and SATA 6Gb/s. Coupled with their TurboV EVO overclocking package and Hybrid 48-phase power, this board should really shine. ASUS has taken the P55 chipset and added some extras of their own to take this motherboard to the next level.
SilverStone Grandia GD05 SFF HTPC Chassis
Review | July 14, 2010
HTPCs are gaining popularity among computer system builders. With digital movies and the streaming media market growing every day, more and more people want a computer hooked to their television for entertainment. They don’t want an ugly computer sitting in their living room either. Enter the SilverStone Grandia GD05. This beautiful chassis combines sleek lines with a black brushed aluminum finish to bring computing into your living room.
Tuesday July 22nd, 2014
| Recently appointed CEO Satya Nadella announced the largest layoffs in Microsoft’s 39-year history today, with a staggering 18,000 jobs on the chopping block. The goal, according to Nadella, is to “simplify the way we work to drive greater accountability, become more agile and move faster,” signifying Nadella's intent to bring some focus to Microsoft's portfolio of services while also seemingly looking to play down the job losses.
The last large round of layoffs at Microsoft came in 2009, after the stock market crash. That round of layoffs was the previous largest ever at 5,800 positions, and today’s announcement dwarfs that number substantially. But not all departments will share this burden evenly, with the recently acquired Nokia employees bearing the brunt of the cuts. In April, Microsoft closed the acquisition of the Nokia mobile phone business, and in the process added 25,000 employees to its payroll. Nadella announced today that 50% of those employees will be let go. Some will be factory workers from some of the in-house manufacturing Nokia owned, and the remainder will be from the handset business itself.
The remaining 5,500 employees to be laid off will therefore come from within Microsoft itself, as it attempts to concentrate on some of its more successful offerings. Excluding the Nokia losses, which are often expected after a merger of this sort, the total number of Microsoft employees being affected is not significantly different than the 2009 cuts.
Former Nokia CEO, now Microsoft Executive VP of Devices and Services, Stephen Elop laid out some of the upcoming changes in his own letter to his employees. Elop promises a focus on Windows Phone, with a near-term goal of driving up Windows Phone volume by focusing on the affordable smartphone segments. With that announcement comes the death of the strange Nokia X series of AOSP phones, which debuted at MWC 2014 and were updated with a new model only a couple of weeks ago. While I would make the argument that there was little need for the X series at all, it is doubly frustrating to anyone who bought into the platform to find it killed off so quickly. The X series would be easy prey for cuts like these, because it didn’t really offer anything new to Android or to Microsoft. While it promised to be low cost, retail pricing for the X line was often higher than that of the low-cost Lumia phones. The X series had no place in a Microsoft-owned Nokia, and should have been killed a while ago.
Elop also announced that they would continue to work on the high end phone range as well. Historically Windows Phone has suffered selling flagship models for many reasons, but it appears that they are not ready to give up the fight in this market yet. He also specifically called out Surface, Perceptive Pixel, and Xbox as new areas of innovation, which likely means those brands are safe for the time being.
The remainder of the Nokia feature phone lines appear to be immediately canceled. This is a segment that has been rapidly shrinking in recent years, with the consumer push towards smartphones, so this is likely a good strategic move by Microsoft. The work done on Windows Phone to allow it to work well on low cost hardware is also likely another big reason for this.
Another major announcement was the closure of the Xbox Entertainment Studios which had a goal of providing original content for Xbox Live members. Several projects such as “Signal to Noise” and “Halo: Nightfall” that were mid production will be completed, but after that content is delivered the studio will be closed.
The full ramifications of these job cuts won’t be known for some time, but it seems fair to say that Nadella wants to put his own stamp on the company. Removing the Nokia X line, the Asha and S40 lines, and an entertainment studio seems like a reasonable set of cuts if you want to focus your company. Nadella speaks about flattening the organization, which should help them execute on ideas more quickly. These kinds of steps, though painful for the employees, can be better for the company in the long run. For quite some time the perception has been that Microsoft is not agile enough to respond to new markets, and Satya Nadella appears to be trying to focus his company on its strengths, which should be a net positive. Microsoft’s next earnings call comes on July 22nd, at which point we may get more details about upcoming plans.
Posted By CybrSlydr @ 8:48 PM
Wednesday July 16th, 2014
| That thing in the above picture is an SSD, and a hoofing big one too. The Plextor M6e is the first M.2 SSD I’ve had arrive in the office, and it’s a 512GB drive that aims to circumvent the limitations of current SATA connections by using the same PCI Express bus that's been providing oodles of bandwidth to graphics cards for years.
In fairness, some SSD manufacturers, like OCZ and Kingspec, have already been producing PCIe-based drives that slot in side-by-side with your graphics card. Those have been using the combined performance of multiple SSDs to create the extra speed, whereas this Plextor M6e is doing all the work itself.
The M.2 interface in most of the Z97 motherboards I’ve tested has a theoretical limit of 1GB/s compared with the 600MB/s limits of SATA. The beauty of using the PCIe bus is that in the future manufacturers can open up more PCIe lanes to allow for even higher possible bandwidth around the 4GB/s mark.
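Those figures fall out of straightforward PCIe arithmetic. A quick sketch, assuming the ~1GB/s slots are PCIe 2.0 x2 (5 GT/s per lane, 8b/10b encoding) and the future ~4GB/s figure corresponds to PCIe 3.0 x4 (8 GT/s, 128b/130b) -- assumptions on my part, since lane counts aren't specified above:

```python
# Back-of-the-envelope check of the M.2 bandwidth figures.
# Assumed link configurations: PCIe 2.0 x2 today, PCIe 3.0 x4 in future.

def pcie_bandwidth_gbps(lanes, gt_per_s, encoding_efficiency):
    """Usable bandwidth in GB/s for a PCIe link (bits -> bytes)."""
    return lanes * gt_per_s * encoding_efficiency / 8

gen2_x2 = pcie_bandwidth_gbps(2, 5.0, 8 / 10)     # 8b/10b encoding
gen3_x4 = pcie_bandwidth_gbps(4, 8.0, 128 / 130)  # 128b/130b encoding

print(f"PCIe 2.0 x2: {gen2_x2:.2f} GB/s")  # 1.00 GB/s
print(f"PCIe 3.0 x4: {gen3_x4:.2f} GB/s")  # ~3.94 GB/s
```

The "4GB/s mark" quoted above is the rounded-up marketing figure for that second configuration.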
The Plextor M6e isn’t quite up in those numbers just yet. My preliminary tests have the 512GB version hitting sequential read/write figures of 676MB/s and 620MB/s respectively in the AS SSD benchmarking software.
While that bests any SATA-based SSD I’ve ever tested—including Samsung’s latest V-NANDtastic 850 Pro—the 4k random read/write results are nowhere near as spectacular.
At 31MB/s for the reads it’s up there with the best, but at just 72MB/s for the writes it’s considerably slower than the bargain-basement Crucial MX100 512GB SATA drive.
So, while the Plextor M6e is demonstrably breaking the limits of the SATA barrier, it’s not doing it by much and not making any inroads into the responsiveness 4k tests.
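Those 4k numbers are easier to compare as I/O operations per second. A rough conversion, assuming 4096-byte transfers and decimal megabytes:

```python
# Convert 4k random throughput (MB/s) into approximate IOPS.

def mbps_to_iops(mb_per_s, block_bytes=4096):
    """IOPS implied by a given MB/s figure at a fixed block size."""
    return mb_per_s * 1_000_000 / block_bytes

read_iops = mbps_to_iops(31)   # ~7,600 IOPS
write_iops = mbps_to_iops(72)  # ~17,600 IOPS

print(f"4k random read:  ~{read_iops:,.0f} IOPS")
print(f"4k random write: ~{write_iops:,.0f} IOPS")
```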
It’s early days for the M.2 interface, but once we get the proper SSD-focused NVMe standard rather than the current AHCI—the standard protocol for elderly spinning platter hard drives—I think we’ll start to see things change massively in the SSD world.
Source: PC Gamer
Posted By CybrSlydr @ 12:02 PM
Monday July 7th, 2014
| A new computer called the "HummingBoard" takes on the same basic shape as the Raspberry Pi but uses a more powerful processor and supports more operating systems.
SolidRun, which also makes the CuBox-i computer we wrote about, just started selling the HummingBoard in several configurations ranging from $45 to $100, not including the price of a power adapter and Micro SD card.
"The HummingBoard allows you to run many open source operating systems—such as Ubuntu, Debian, and Arch—as well as Android and XBMC," SolidRun says. "With its core technology based on SolidRun’s state-of-the-art Micro System on a Module (MicroSOM), it has ready-to-use OS images, and its open hardware comes with full schematics and layout. Best of all, as a Linux single board computer, the HummingBoard is backed by the global digital maker community, which means you can alter the product in any way you like and get full kernel upstreaming support and all the assistance you need."
HummingBoard uses a 1GHz ARMv7 processor rather than the 700MHz ARMv6 one that has worked well for the Raspberry Pi yet limits the number of operating systems it can run. HummingBoard configurations use single- and dual-core i.MX 6 chips based on the ARM Cortex-A9 architecture, and they range from 512MB to 1GB of memory.
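The operating-system limitation above comes down to the instruction-set version, which Linux exposes in /proc/cpuinfo. A minimal sketch of checking it, using a hard-coded sample string (the field names are standard; the values here are illustrative):

```python
# Extract the ARM "CPU architecture" field from /proc/cpuinfo-style text.

import re

def cpu_architecture(cpuinfo_text):
    """Return the 'CPU architecture' field value, or None if absent."""
    match = re.search(r"^CPU architecture\s*:\s*(\S+)", cpuinfo_text, re.MULTILINE)
    return match.group(1) if match else None

# Illustrative sample; on a real board you would read open("/proc/cpuinfo").
sample = """\
model name      : ARMv7 Processor rev 10 (v7l)
CPU architecture: 7
CPU part        : 0xc09
"""

print(cpu_architecture(sample))  # -> 7
```

A "7" here means ARMv7-only distributions (such as stock Ubuntu ARM builds) can run, which is exactly where the Raspberry Pi's ARMv6 core falls short.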
Other features include OpenGL support, up to Gigabit Ethernet, support for mSATA and PCIe mini cards, HDMI, GPIO pins, LVDS display out, a camera interface, and powered USB.
The HummingBoard was "cleverly designed to mimic the Raspberry Pi’s dimensions and layout," Geek.com wrote. "That means it’ll fit into the hundreds of ready-to-use Raspberry Pi cases." In addition, "[t]he processor sits on its own module, which means you may be able to purchase upgrades for it in the future."
Here's a video from SolidRun that compares the HummingBoard to the Raspberry Pi: https://www.youtube.com/watch?v=dnGiYir07as
Source: Ars Technica
Posted By CybrSlydr @ 9:47 AM
| I attended both Apple's and Google's developer conference keynotes last month, and I experienced strong deja vu on more than one occasion. Both companies talked about design and consistency. Both companies talked about improving back-end services. And both companies talked about new initiatives to make stuff on your phone appear seamlessly on your tablet or laptop.
"Users almost always have a smartphone with them, including when they are using a Chromebook," said Google's Sundar Pichai. "So we want to increasingly connect those experiences together, so they have a seamless experience across their devices."
At or around the time the Android L release comes out this fall, your phone and your Chromebook are going to be able to share even more stuff than they already do. If you have your phone with you, it can unlock your Chromebook (and if you have your smartwatch with you, it can unlock your phone). If you get a call or a text or your battery is running low, you'll be told about it on your Chromebook. Some Android apps are even going to be able to run in Chrome OS, though Google didn't talk much about the technical details.
It was all very similar to the "Continuity" feature that Apple's Craig Federighi showed off on the same stage in the same room three weeks before (at least both companies can still share a conference hall). When iOS 8 and OS X Yosemite arrive in the fall, AirDrop will be able to move files between iOS devices and Macs. "Handoff" can send e-mails, webpages, and even files from iCloud-enabled applications on iOS to their counterparts in OS X (or vice-versa). You can receive texts alongside iMessages in the Messages app, and you can make and receive phone calls from your Mac even if your phone is in another room.
This isn't about which company is copying from which—this kind of integration is a logical next step for both Apple and Google after years of moving various operating systems and services closer and closer together. This is about ecosystem lock-in. All of these features sound like great, logical ways to extend both companies' platforms, since you can often assume that someone using an Apple phone will be using an Apple computer. They're also going to make it harder than ever for you to extricate yourself from a given company's ecosystem once you've become embedded in it.
Follow link for rest of story
Source: Ars Technica
Posted By CybrSlydr @ 9:43 AM
| By Luke Karmali
Microsoft has announced Kinect 2 for PC costs £159 / $199.
The device is available now for pre-order on the Microsoft Store, ahead of release on July 15.
Though Kinect's primary use on Xbox One was arguably gaming, the device's launch on PC will target applications more heavily. It also ships without software, which is licensed separately.
It's also worth noting that this is not the device that will work with the new Kinect-less Xbox One; we still don't know how much you'll pay for one of those.
Posted By CybrSlydr @ 9:31 AM
Wednesday June 25th, 2014
| We will not reveal the identity of our source dev (though we have confirmed his employment), so the info should be taken with a grain of salt. Today, I have the ultimate displeasure to inform the public that apparently the E3 demo of The Division was running on a PC and the game will be downgraded overall. Developers often shoot for the moon and end up delivering next to nothing, with the final retail copies becoming visual garbage (Far Cry 3, Watch Dogs, Dark Souls II in comparison to their non-downgraded counterparts). This sort of false advertising and marketing absolutely has to stop. It is a vile and dementedly sick way for companies to make money off of people who obviously preorder because the game is visually impressive. Yes – some may claim “gameplay weighs in more” but this is arguable.
He tells us the following:
We really loved the reception to the demo we showed on the PC version at E3. Currently as it stands, there is definitely a lot of push coming from publishers to not make the experience so different on consoles as to alienate people into thinking that next generation is not as powerful as PC. This is probably what happened at Ubisoft Montreal. I think that while making stability changes is definitely important, it does not completely obliterate a lot of enhanced rendering applications.
Right now we already took out quite a lot of screen space reflections from the game and are working on asset management the best we can given consoles have that great unified memory. Naturally we will also be using online servers and have to produce a synchronization that higher graphics add to the latency so it had to be turned down. To me it still looks good, but not as good as the original reveal. I am sure as we get closer to launch and the actual console versions of the game featuring SD (Snowdrop) that it will start to seem all too obvious to people especially those on PCs. I just wanted to write and let you know that it definitely is not just stability but marketing politics plays into this a lot as well.
UPDATED 2nd Response from The Division Developer: Truth be told in regards to your question that while ‘Yes’ the lead platform is the PC, we simply cannot have such a big gap. As you know when the first WATCH DOGS Review was published by that one site, Ubisoft called it a “false review” and I am sure everyone can see how bad that sounded when they saw the game did look marginally better than something that was a last generation GTA IV. But no, they will not admit that they practice this or actively downgrade a game. It is much easier to say they removed things for stability which is often a lie as you can tell by the post-issues which are expected in any production we do.
Also to answer your 3rd question, no…they will never fully disclose what was removed from what build as no laws ask them to do so in terms of consumer rights. If we as developers published that information in very real terms for the consumer such as “Replaced particle fog simulation with 2d layer simulation in 3d space, removed particles from all explosions, lowered explosion volume multiplier by 20x, removed X # of trees and civilians, etc.” we would be out of a lot of sales and probably it would actually require too much time to deliver on the current hype that a lot of downgraded games see which look incredible with a vertical slice. I do share this in the hopes that my colleagues and publishers and a lot of people who make false promises and do demonstrations which wrongfully create too much hype that they cannot deliver on ultimately stop doing such things. I want to see the industry actually move forward and not be so full of itself by promising too much and delivering too little. Regards
Our insider, who is currently in the graphics technical division at Ubisoft Massive in Sweden, contacted us because he too is sick of the practices that a company like Ubisoft has become all too known for. If Ubisoft denies that downgrades have happened and uses the lame excuse that “it is for the gamers and stability we did what we did,” then there is certainly no reason for the PC/console parity to exist, because the downgraded Watch Dogs still runs sub-par, which is an utter joke. Everyone knows “next generation” as it currently stands is utter marketing BS. Of course, a lot of the uneducated folks out there believe it is more. Next gen means next gen! If that were the case, then consider that PC raw throughput has the greatest power of any platform despite getting lower development focus (due to piracy). Essentially, if it is not obvious by now: next gen has diminished any chance of real graphics leaps, letting the marketers make more money on “next-gen” until the next next-gen comes out. It is a great marketing hype that is all too common in the gaming industry.
Bottom line: Publishers and developers – stop lying and rely on actual gameplay that is close to the real thing to do your marketing for you. And if you did remove a lot of features that affected the stability of the game, make sure to release a full disclosure of what this is before the game comes out. Oh wait…but then you would not see as many sales. Tsk tsk.
Posted By: Usman Ihtsham
On Friday, June 20th, 2014
Source: What If Gaming
This isn't your normal news site, but it bears posting simply because of what it says - and it sounds reasonable.
Posted By CybrSlydr @ 9:03 AM
Monday June 23rd, 2014
| Crytek may be enduring some financial difficulties of late, as Eurogamer reports that the developer has missed payroll at both its Bulgarian and UK offices recently.
That report followed an article on German outlet GameStar, which claimed the developer was on the brink of bankruptcy. It cited a source with a large publisher as saying "the vultures are circling," and rival companies have begun attempts to poach the studio's best talent.
However, the report also said Crytek co-founder Avni Yerli admitted earlier this month that the studio's transition to free-to-play games had been difficult, but claimed the studio was also on the verge of securing new financing. GameStar suggested an acquisition from World of Tanks publisher Wargaming was a possibility, but Eurogamer said the company may be entertaining an investment from a Chinese outfit instead.
Eurogamer reported that staff at Crytek's Sofia studio in Bulgaria hadn't been paid for two months. Meanwhile, Crytek UK, which just recently unveiled Homefront: The Revolution, failed to pay employees on time. The site added that staff have been upset by what they see as a lack of transparency from management over those issues.
A Crytek representative denied the claims to Eurogamer, saying, "Regardless of what some media are reporting, mostly based on a recent article published by GameStar, the information in those reports and in the GameStar article itself are rumors which Crytek deny."
Source: Games Industry.biz
Related Update: 6/26/2014
40 employees have left and studio can't bring in replacements quick enough, source claims
A number of staff at Crytek UK have not been paid their full salary since April 21st, a source connected to the matter has told Develop.
The source, who has ties with the studio, said that since April, employees had received small payments of around £700 last month. At the time they had been told a deal was being made to secure money from Deutsche Bank, but that has since been delayed.
A further payment was paid on June 16th, with staff then told to expect payment on Friday, June 27th. Our source claims however “this now looks like it won't happen either”.
Crytek UK is currently working on Homefront: The Revolution with what has been a team of 90 developers.
Develop understands that since work on the game began, 40 staff have left, and the issues with salaries “has added to that number”. While there has been high turnover with new recruits coming in, it was said that the team cannot hire as fast as people are leaving.
It was also claimed that a number of staff have been promoted to senior roles recently, with such employees required to hand in three months' notice if they choose to resign. These staff have also allegedly not been given pay rises “equivalent to the job role”, leading to suggestions it was a tactic to ensure staff stay on at the studio.
As a result of these issues, morale at the studio is said to be generally low, with unhappiness particularly directed at “differences with the creative direction of the project”, as well as pay.
Despite the staff departures, our source said that following the cancellation of Ryse 2, many developers from the Frankfurt office – where the sequel to the Xbox One launch title was to be made – are now working with Crytek UK on Homefront.
Developers at the Germany-based office have had their own problems however, with issues apparently starting as early as last year, though at that time the UK studio was unaffected. It is not currently clear however exactly what has happened in Frankfurt.
A Crytek spokesperson declined to comment on the matter.
Source: Develop-Online.net
Welp, I mean look at EA - it was cheaper for DICE to make their own engine than to license and modify Crytek's engine.
They should have copied Epic; they had the talent. Oh well.
Posted By @dmin @ 8:20 PM
Wednesday June 11th, 2014
| One of the most interesting things I saw at this year’s Computex was CVision's glasses-free 3D technology. You likely have not heard of the company before because they are currently not spending any money on B2C marketing or PR as they are focusing on selling/licensing the technology to OEMs to bring it to the mainstream market.
The way their technology works is unique. Instead of requiring a special panel or hardware, all that is needed is a custom film, or convergence of thin-film barrier as it's officially called, that is applied on top of the panel. That film, along with CVision's software, is able to produce a 3D experience that doesn't require glasses and, to be honest, the quality was just awesome. CVision showed me a couple of short videos to highlight the 3D experience and I didn't notice that it was 3D unless I specifically looked for it. I mean, that's how smooth it was. There was no ghosting or bleeding, just a sharp picture in 3D. The viewing angles were also as good as you would expect from an IPS panel -- the 3D effect didn't suffer at all even when viewed from an angle. Of course, if I moved the device or my eyes/head while playing back the video, the smoothness was lost, but as long as I held the device steady and focused on the screen there were no significant drawbacks compared to 2D.
CVision's software even supports 2D to 3D conversion on the fly, so playing Angry Birds in 3D wasn't a problem at all, and it was actually very cool as the game itself is well suited to 3D. Photos can also be converted to 3D, and CVision showed me a couple of photos they had taken on the show floor with the phone. The camera itself was similar to what you can find inside any smartphone, so the conversion was done purely in software and the result was decent. I'm not sure if an exhibition hall is the best subject for 3D photos as obviously the effect works best when you are focusing on one object, but the result was still clearly 3D, though not as impressive as the videos or Angry Birds.
The main advantage of CVision's technology is that it can be applied to any device without the need for major re-engineering. The film itself is very thin and it is the only thing that is needed in terms of hardware and the prototype devices CVision had at their booth were as slim as any other high-end smartphone in the market. Currently the cost is about $3 per inch but CVision believes that they can cut this to half with higher volumes. The technology can scale to any size but as CVision is more of a technology company than a real manufacturer, they don't have the equipment to manufacture the films for TV size screens at this moment. However, their roadmap does include a 42" 1080p TV but it might be more of a concept at this point.
All in all, this is the first time I'm truly excited about 3D. I've never been a fan of the glasses and all the glasses-free 3D technologies I've seen so far have had too many limitations to make them better than 2D in my opinion. CVision is currently in talks with several smartphone and tablet OEMs to bring the technology to the mainstream market and I sure hope the OEMs see the potential. I mean, either I got totally fooled by their (non-existent) marketing or their technology "just works".
Posted By CybrSlydr @ 1:34 PM
| We are now six months down the line from the AMD Kaveri launch, and the only two Kaveri processors available on Newegg today are the A10-7850K at $170 and the A10-7700K at $160. Both of these SKUs come with games as part of the purchase, but as AMD’s biggest desktop processor launch of the year, one might have expected more processors to come to market by this point. This is especially true as AMD sampled the A8-7600 SKU to media with a configurable TDP which showcased a large jump in graphics APU performance at the 45W TDP margin, but this model number has not hit consumer shelves in North America. Perhaps then we get a sigh of relief that AMD is announcing seven new Kaveri APUs, including that A8-7600.
Posted By CybrSlydr @ 1:32 PM
| If anyone outside Apple saw Swift coming, they certainly weren't making any public predictions. In the middle of a keynote filled with the sorts of announcements you'd expect (even if the details were a surprise), Apple this week announced that it has created a modern replacement for Objective-C, the programming language the company has used since shortly after Steve Jobs founded NeXT.
Swift wasn't a "sometime before the year's out"-style announcement, either. The same day, a 550-page language guide appeared in the iBooks store. Developers were also given access to Xcode 6 betas, which allow application development using the new language. Whatever changes were needed to get the entire Cocoa toolkit to play nice with Swift are apparently already done.
While we haven't yet produced any Swift code, we have read the entire language guide and looked at the code samples Apple provided. What follows is our first take on the language itself, along with some ideas about what Apple hopes to accomplish.
Source: Ars Technica
So, as someone who is completely unfamiliar with coding, how does this sound?
Posted By CybrSlydr @ 1:29 PM
Tuesday June 10th, 2014
Here is an official Doom 4 teaser trailer. It looks like the game will just be called Doom, and it is set to finally be revealed at QuakeCon next month!
Posted By kyle2227 @ 12:56 PM
Wednesday June 4th, 2014
| Core M finally looks like a substantial upgrade from the previous generation.
When it comes to processors used in today’s computers (be they laptops, desktops, or servers), Intel remains the king. However, as consumers find themselves increasingly moving away from being tied down to a desktop towards mobile devices, Intel still wants to be at the forefront of innovation when it comes to processor performance and efficiency.
With processors based on ARM architecture clearly dominating in the smartphone and tablet space, Intel is looking to push back heavily starting at the convertible PC level and downward. To show its commitment, Intel is introducing a new Core M processor that is based on the 14nm Broadwell architecture. Intel calls the Core M the “most energy-efficient Intel Core processor” to date, and states that the processor will enable a broad range of thin, lightweight, and more importantly, quiet mobile devices.
Compared to the previous generation Core offerings, the Core M will have a 60 percent lower TDP, 20 to 40 percent better performance, and a 50 percent smaller package footprint.
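Intel's percentages are easier to judge with a baseline. Assuming the comparison point is the 11.5W TDP of the previous-generation Haswell-Y parts (an assumption on my part; Intel's figures above don't name the baseline), the claimed 60 percent reduction works out to roughly 4.6W:

```python
# What Intel's "60 percent lower TDP" implies for the assumed baseline.

def apply_reduction(value, percent_lower):
    """Value after reducing it by the given percentage."""
    return value * (1 - percent_lower / 100)

haswell_y_tdp_w = 11.5  # assumed previous-generation baseline
core_m_tdp_w = apply_reduction(haswell_y_tdp_w, 60)

print(f"Implied Core M TDP: ~{core_m_tdp_w:.1f} W")
```

A TDP in that range is low enough for passive cooling, which fits the "quiet mobile devices" Intel is promising.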
Posted By WiCKeD @ 1:56 AM
Thursday May 29th, 2014
| By Seth G. Macy
Nvidia announced today that its newest graphics card, the GTX Titan Z, is now available for purchase.
Initially announced back in March, Nvidia says the Titan Z is the fastest and "most advanced" card it's ever built. Nvidia makes the claim that a system running three of its new Titan Z GPUs could run the same workload as Google Brain, Google's multi-million dollar neural network.
The card has 12 GB of 7Gbps GDDR5 video memory, 5,760 of Nvidia's own CUDA cores, and two GK110 GTX Titan Black GPUs. Nvidia says that the card delivers resolutions of 3840x2160 on 4K monitors.
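For the curious, those memory specs imply the card's peak bandwidth. A quick sketch, assuming each GK110 keeps the 384-bit memory bus of the single-GPU Titan Black (the bus width isn't stated above):

```python
# Peak GDDR5 bandwidth from data rate and bus width.

def memory_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak bandwidth in GB/s for one GPU's memory interface."""
    return data_rate_gbps * bus_width_bits / 8  # bits -> bytes

per_gpu = memory_bandwidth_gbs(7, 384)  # 7 Gbps GDDR5, 384-bit bus
total = per_gpu * 2                     # two GK110 GPUs on the card

print(f"Per GPU: {per_gpu:.0f} GB/s, card total: {total:.0f} GB/s")
```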
To keep the card running cool, Nvidia has cased it in an aluminum outer body and placed an aluminum heat plate between the fan and the VRAM. The plate has ducted channels designed to improve airflow and the "acoustic qualities" of the card.
"With its efficient power usage and other advanced attributes, the GeForce GTX TITAN Z is the perfect card for Small Form Factor gaming systems," said the company on its website.
The card has DVI-I, DVI-D, DisplayPort and HDMI outputs and can support "multi-monitor" displays, according to Nvidia.
Pricing for this behemoth begins at $2,999 USD.
Posted By CybrSlydr @ 5:18 PM
Wednesday May 28th, 2014
| "Last weekend AMD issued some bold statements to Forbes about Nvidia’s GameWorks developer program, and how it may have impacted performance of Ubisoft’s Watch Dogs on AMD hardware. These claims stretched beyond just Watch Dogs and extended into the greater PC gaming ecosystem, with AMD’s Robert Hallock passionately explaining that GameWorks represents “a clear and present threat to gamers by deliberately crippling performance on AMD products.” Now Nvidia is firing back, intent on setting the record straight."
"Nvidia’s Cebenoyan responded directly to this during our conversation: “I’ve heard that before from AMD and it’s a little mysterious to me. We don’t and we never have restricted anyone from getting access as part of our agreements. Not with Watch Dogs and not with any other titles. Our agreements focus on interesting things we’re going to do together to improve the experience for all PC gamers and of course for Nvidia customers. We don’t have anything in there restricting anyone from accessing source code or binaries. Developers are free to give builds out to whoever they want. It’s their product.” "
Come on AMD, that's quite a childish move to make with NO PROOF to back you up.
Posted By Prozium @ 10:21 PM
Wednesday May 21st, 2014
| The last half-year or so has seen the concept of variable refresh desktop monitors advance rather quickly. After sitting on the technology backburner for a number of years, the issue came to the forefront of the PC graphics industry late last year when NVIDIA announced G-Sync, the first such desktop implementation (and unfortunately proprietary implementation) of the concept. AMD in turn fired back at NVIDIA at CES this year, demonstrating their FreeSync concept, which could implement variable refresh through features found in the embedded DisplayPort (eDP) standard. Since then the technology has been in something of a holding pattern – NVIDIA and their partners are still prepping retail G-Sync monitors, meanwhile AMD and the VESA have needed to bridge the specification gap between eDP and DisplayPort.
To that end, the VESA sends word today that they have done just that with the latest update to the DisplayPort 1.2a standard. Adaptive-Sync (not to be confused with NVIDIA’s Adaptive V-Sync), the eDP feature that allows for variable refresh monitors, has been added to the DisplayPort 1.2a standard as an optional feature. We’ve been expecting this addition since AMD first announced their FreeSync concept, however until now it wasn’t clear whether Adaptive-Sync would first be added to DisplayPort 1.2a or rolled into the forthcoming DisplayPort 1.3 standard, so we’re glad to see that it’s the former rather than the latter.
With the standard now having been settled, this frees up GPU manufacturers and display manufacturers to move forward on implementing it in hardware and drivers. The good news is that the underlying technology is fairly old – eDP was ratified in 2009 – so while we’re not accustomed to seeing Adaptive-Sync on desktop hardware, there are GPU and display/controller manufacturers who have experience with the technology. That said, since this feature isn’t present in today’s display controllers there’s still a need to iterate on the hardware and its firmware, even if it’s just making small modifications to existing designs (this being the advantage of doing a DP 1.2a extension).
AMD for their part sent over a notice that they’re already working with display manufacturers to get the technology into future monitors, with their estimate being 6-12 months for Adaptive-Sync capable displays to hit the market. There’s no real precedent for this kind of change, so it’s hard to say just what a realistic number within that window is; but historically vendors have been slow to update their hardware for new DisplayPort standards, and NVIDIA’s own G-Sync efforts have still taken many months even with the company’s extra muscle and close partner relationships. With that in mind we suspect 12 months is more realistic than 6, though we’d be happy to be wrong.
Meanwhile the VESA for their part is touting the full range of benefits for Adaptive-Sync. This includes both the gaming angle that NVIDIA and AMD have recently been pushing and the power savings angle that drove the creation of Adaptive-Sync and eDP in the first place. Admittedly the power gains are minuscule and generally unimportant for a desktop scenario, but they are there. Outside of gaming what’s more interesting is the ability to apply Adaptive-Sync to video playback, allowing for the elimination of the judder that’s common when playing back 24fps/25fps content on today’s 60Hz displays.
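To see where that judder comes from, consider the arithmetic of showing 24fps film on a fixed 60Hz panel: 60 doesn't divide evenly by 24, so frames must be held for alternating counts of refreshes (the familiar 3:2 pulldown), giving uneven on-screen durations. A minimal Python sketch of this model (our own illustration, not any vendor's code):

```python
import math

def pulldown_durations(content_fps, display_hz, n_frames):
    """On-screen duration (ms) of each content frame on a fixed-rate display.

    Each frame is replaced at the first refresh boundary at or after the
    next frame's ideal presentation time, so durations are forced to whole
    multiples of the refresh interval."""
    refresh_ms = 1000.0 / display_hz
    durations = []
    prev_refresh = 0
    for i in range(1, n_frames + 1):
        end_refresh = math.ceil(i * display_hz / content_fps)
        durations.append((end_refresh - prev_refresh) * refresh_ms)
        prev_refresh = end_refresh
    return durations

# 24fps on a fixed 60Hz panel: durations alternate 50ms / 33.3ms -> judder.
print(pulldown_durations(24, 60, 4))
# A variable-refresh display can instead hold every frame for a uniform
# 1000/24 ~= 41.7ms, eliminating the cadence irregularity.
```

With Adaptive-Sync the panel simply refreshes when a new frame is ready, so every frame gets the same 41.7ms, which is exactly the smoothness fix being described.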
Along with the addition of Adaptive-Sync to the DisplayPort standard, the VESA will also be putting together a new (yet to be revealed) logo for the technology. Since Adaptive-Sync is an optional feature not every DisplayPort device will support it, so those devices that do support it will sport a logo to visibly indicate their compliance. The logo will go hand-in-hand with the VESA’s forthcoming Adaptive-Sync compliance test, so manufacturers will need to pass the test before they’re able to use the logo.
Moving on, coinciding with today’s announcement from the VESA AMD sent along their own release on the subject. In it, AMD notes that they’re immediately preparing for Adaptive-Sync, though they will be continuing to promote it under the FreeSync brand. AMD is telling us that as of this point most of their GCN 1.1 products will support Adaptive-Sync, including the desktop Radeon 290 and 260 series, along with most of AMD’s current APUs: Beema/Mullins, Kaveri (AMD had mistakenly omitted this from their list), and even the GCN 1.0 Kabini/Temesh. Meanwhile AMD has not yet commented on whether their GCN 1.0 video cards will support Adaptive-Sync, so the outcome of that remains to be seen. But for all of the supported products the underlying hardware is already Adaptive-Sync capable, so it’s just a matter of AMD rolling out support for it in their drivers.
AMD’s release also contains an interesting note on supported refresh rates: “Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.” While the upper bounds of those ranges are in line with numbers we’ve seen before, the sub-30Hz refresh rates on the other hand are unexpected. As you might recall from our look at G-Sync, even though LCD monitors don’t suffer from anything quite like the phosphor decay of CRT monitors, there is still a need to periodically refresh an LCD to keep the pixels from drifting. As a result G-Sync has a minimum refresh rate of 30Hz, whereas AMD is explicitly promising lower refresh rates. Since pixel drift is an underlying issue with LCD technology there is presumably something in Adaptive-Sync to compensate for this – the display is likely initiating a self-refresh – though at the end of the day variable refresh means you can always set the refresh rate to a multiple of the targeted frame rate and get the same visual result.
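That last point, driving the panel at a multiple of the content rate when the content rate falls below the panel's minimum, is simple to sketch. The function below is purely illustrative (the name and the idea of a fixed min/max window are our assumptions, not anything from the DisplayPort spec): it finds the smallest integer multiple of the target frame rate that the panel can sustain, so each frame is just scanned out two or more times.

```python
def effective_refresh(target_fps, min_hz, max_hz):
    """Smallest integer multiple of target_fps inside the panel's supported
    [min_hz, max_hz] variable-refresh window, or None if none fits.

    Illustrative sketch only; real drivers/firmware will be more involved."""
    multiple = 1
    while target_fps * multiple <= max_hz:
        if target_fps * multiple >= min_hz:
            # Each content frame gets repeated `multiple` times, so the
            # viewer still sees a perfectly regular cadence.
            return target_fps * multiple
        multiple += 1
    return None

# e.g. 24fps content on a panel with a 30Hz floor: drive it at 48Hz and
# show every frame twice.
print(effective_refresh(24, 30, 60))
```

This is why a 9Hz lower bound in AMD's list is less alarming than it first sounds: even if the panel itself can't hold an image that long, repeating frames at a multiple of the content rate preserves the judder-free cadence.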
Finally, we also had a brief chat with NVIDIA about whether they would support Adaptive-Sync on current generation hardware. NVIDIA tells us that they can’t comment at this time since there aren’t any Adaptive-Sync displays available. It’s entirely possible this is just NVIDIA being coy, however like all device vendors they do have to pass the VESA’s compliance tests. So if nothing else NVIDIA’s “no comment” is technically correct: until they pass that test they are limited in what they can say about being Adaptive-Sync compliant.
Though while we’re on the subject, this also brings up the matter of NVIDIA’s competing G-Sync technology. Because of NVIDIA’s head-start on the variable refresh concept with G-Sync, for the next year or so they will continue to be the only vendor with retail support for variable refresh. The modified Asus monitors have been available for a few months now, and the retail G-Sync monitors are still due this quarter the last we heard from NVIDIA. So until Adaptive-Sync monitors hit the market G-Sync is the only option.
Ultimately it remains to be seen what will become of G-Sync – NVIDIA seems to be in this for the long haul as part of their broader ecosystem plans – and there is the matter of whether the technical differences between Adaptive-Sync and G-Sync result in meaningful performance differences between the two technologies. With that said, even if NVIDIA keeps G-Sync around we would hope to see them support Adaptive-Sync just as well as AMD.
Posted By CybrSlydr @ 10:22 PM