Friday June 16th, 2017
| On June 8, a 21-second video appeared at ataribox.com, teasing a plastic and wood box cryptically described as "a brand-new Atari product, years in the making." It was unclear whether the console was simply an emulator box for retro Atari games or something actually new. Atari failed to show the teased product at last week's E3 video game expo.
However, Atari CEO Fred Chesnais has now told GamesBeat that a new Atari game console is indeed in the works.
"We're back in the hardware business," Chesnais said.
He gave scant details about the new console and when it would be officially unveiled, but said it's "based on PC technology" and added that "Atari is still working on the design and will reveal it at a later date."
Posted By CybrSlydr @ 11:19 AM
Wednesday May 31st, 2017
| Threadripper stepped out in public for the first time at Computex in Taipei, where AMD ran its 16-core Ryzen-based desktop chip through various benchmarks and games. Even better for AMD fans: The company showed off the highly anticipated monster CPU running two of the company’s Radeon RX Vega graphics cards in CrossFire mode.
AMD also hit back at Intel after Tuesday's revelation of the Core i9. Sure, Intel's new high-end CPU will max out at 18 cores compared to Threadripper's 16. AMD's surprise: Threadripper would not have 44 PCIe lanes as previously rumored, but rather an insane 64 lanes of PCIe Gen 3.0.
Having access to a full 64 lanes on all Threadripper chips is a big deal. Typical consumer platforms such as the Ryzen 7 and Core i7-7700K top out at 16 lanes of PCIe. Intel’s new Core i9/X299 platform hikes that up to 44 lanes of PCIe, but only in the higher-end chips, creating artificial limitations on more modest versions.
AMD’s Ryzen Threadripper/X399 platform takes I/O to a level never seen before in a consumer desktop machine. Most consumers won’t need that kind of speed, but anyone who piles on the storage options and multiple GPUs will welcome AMD’s 64 PCIe lanes.
AMD officials also showed off the company’s upcoming Ryzen-based mobile chip in what looked like a reference convertible laptop. Officials said it would feature four cores alongside Radeon graphics in a single chip.
What’s amazing is the contrast in scale between the monster Ryzen Threadripper and the mobile chip, code-named Raven Ridge.
Source: PC World
Posted By CybrSlydr @ 3:46 PM
Tuesday May 30th, 2017
| By Devindra Hardawar
Last year at Computex, Intel unveiled its first 10-core consumer CPU, the company's move into the world of "megatasking." It was a pricey chip, launching at around $1,700, but it satisfied users who needed to juggle several intensive tasks at once. Now, Intel is upping the ante with a whole new family of processors for enthusiasts, the Core X-series, and it's spearheaded by its first 18-core CPU, the i9-7980XE.
Priced at $1,999, the 7980XE is clearly not a chip you'd see in an average desktop. Instead, it's more of a statement from Intel. It beats out AMD's 16-core Threadripper CPU, which was slated to be that company's most powerful consumer processor for 2017. And it gives Intel yet another way to satisfy the demands of power-hungry users who might want to do things like play games in 4K while broadcasting them in HD over Twitch. And as if its massive core count wasn't enough, the i9-7980XE is also the first Intel consumer chip that packs in over a teraflop worth of computing power.
If 18 cores is a bit too rich for you, Intel also has other Core i9 Extreme Edition chips in 10, 12, 14 and 16-core variants. Perhaps the best news for hardware geeks: the 10-core i9-7900X will retail for $999, a significant discount from last year's version.
All of the i9 chips feature base clock speeds of 3.3GHz, reaching up to 4.3GHz dual-core speeds with Turbo Boost 2.0 and 4.5GHz with Turbo Boost 3.0. And speaking of Turbo Boost 3.0, its performance has also been improved in the new Extreme Edition chips to increase both single and dual-core speeds. Rounding out the X-series family are the quad-core i5-7640X and i7 chips in 4-, 6- and 8-core variants.
While it might all seem like overkill, Intel says its Core i9 lineup was driven by the surprising demand for last year's 10-core chip. "Broadwell-E was kind of an experiment," an Intel rep said. "It sold... Proving that our enthusiast community will go after the best of the best... Yes we're adding higher core count, but we're also introducing lower core counts. Scalability on both ends are what we went after."
As you can imagine, stuffing more cores into a processor leads to some significant heat issues. For that reason, Intel developed its own liquid cooling solution, which will work across these new chips, as well as some previous generations. All of the new Core i9 processors, along with the 6 and 8-core i7 chips, feature scorching hot 140W thermal design points (TDPs), the maximum amount of power that they'll draw. That's the same as last year's 10-core CPU, but it's still well above the 91W TDP from Intel's more affordable i7-7700K.
Over the past few years, Intel's laptop chips have been far more interesting than its desktop CPUs. Partially, that's because the rise of ultraportables and convertible laptops has shifted its focus away from delivering as much computing power as possible, to offering a reasonable amount of processing power efficiently. The new Core i9 X-series processors might not be feasible for most consumers, but for the hardware geeks who treat their rigs like hot rods, they're a dream come true.
Posted By CybrSlydr @ 7:34 AM
Tuesday May 16th, 2017
| Let's be honest, Intel hasn't released anything particularly exciting in a long while now. It's the reason my primary desktop is still rocking an Intel Core i7-4790K Devil's Canyon processor released several years ago. Sure, I could scamper over to an X99 configuration, but at this point it makes more sense to see what Skylake-X and Kaby Lake-X bring to the table. That remains to be seen, though if the latest rumor is to be believed, at minimum we can expect new branding, at least for Skylake-X.
A forum member at AnandTech posted what is purported to be an internal slide outlining a new crop of Core i9 processors. The photo is blurry and low quality, of course, because for whatever reason every leaker in the world seems to own a Fisher Price camera and has an aversion to screenshots. But criticisms over the quality of the leak aside, it looks like Intel is readying a potent lineup.
If the information is accurate, Intel's Kaby Lake-X processors will stick with Intel's Core i7 branding. There will be two chips in this tier:
Core i7-7740K: 4C/8T, 4.3GHz to 4.5GHz, 8MB cache, 112W, 16 PCIe lanes
Core i7-7640K: 4C/4T, 4GHz to 4.2GHz, 6MB cache, 112W, 16 PCIe lanes
The other four processors shown in the slide are all Skylake-X chips with Intel's new Core i9 branding. It's a sensible change, with Intel moving all new 6-core and higher processors to the new brand. The i9 parts consist of the following:
Core i9-7920X: 12C/24T, unknown clocks, 16.5MB cache, 140W, 44 PCIe lanes
Core i9-7900X: 10C/20T, 3.3GHz to 4.3GHz, 13.75MB cache, 140W, 44 PCIe lanes
Core i9-7820X: 8C/16T, 3.6GHz to 4.3GHz, 11MB cache, 140W, 28 PCIe lanes
Core i9-7800X: 6C/12T, 3.5GHz to 4GHz, 8.25MB cache, 140W, 28 PCIe lanes
The two middle SKUs will also feature Turbo Max support with both the Core i9-7920X and Core i9-7900X being able to hit 4.5GHz in some situations. None of the other processors leaked here list Turbo Max support, though it's possible the Core i9-7920X will, since no clockspeeds were provided.
All the Core i9 Skylake-X and Core i7 Kaby Lake-X CPUs will run on Intel's upcoming X299/LGA2066 platform. That includes support for quad-channel DDR4-2666 memory, according to the slide.
None of this is official, of course, but if the specs do end up being accurate, it is interesting that Intel will again slash the number of PCIe lanes on its lower end enthusiast processors. That could push power users to purchase a higher end part even if the additional physical cores and threads aren't needed. Meanwhile, it looks like AMD may take a different approach with its Ryzen 9 series for enthusiasts, all of which are rumored to offer 44 PCIe lanes.
In related news, Guru3D dug up some supposed benchmarks of the Core i9-7900X and Core i9-7920X. In UserBenchmark, the former scored 107 points in single-core performance and 1,467 points in multi-core performance, whereas the latter scored 130 points and 1,760 points, respectively. As a point of reference, Intel's Core i7-6950X scored 117 points in single-core performance and 1,526 points in multi-core performance.
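Taking those UserBenchmark figures at face value, a quick back-of-the-envelope comparison (a sketch using only the scores quoted above) shows how the leaked chips would stack up against the Core i7-6950X:

```python
# Leaked UserBenchmark scores quoted above.
scores = {
    "Core i9-7900X": {"single": 107, "multi": 1467},
    "Core i9-7920X": {"single": 130, "multi": 1760},
    "Core i7-6950X": {"single": 117, "multi": 1526},  # baseline for comparison
}

def delta_vs_baseline(chip, metric, baseline="Core i7-6950X"):
    """Percentage difference of a chip's score against the baseline chip."""
    base = scores[baseline][metric]
    return 100 * (scores[chip][metric] - base) / base

for chip in ("Core i9-7900X", "Core i9-7920X"):
    for metric in ("single", "multi"):
        print(f"{chip} {metric}-core: {delta_vs_baseline(chip, metric):+.1f}% vs i7-6950X")
```

By that math the 10-core part trails the 6950X in both tests, while the 12-core part pulls ahead, which is what you'd hope for from a generational successor.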
Source: PC Gamer
Posted By CybrSlydr @ 3:14 PM
Monday May 8th, 2017
| Modern processors can run at temperatures ranging from 25 to 90 degrees Celsius, depending on configuration, cooling and workload. That said, when a CPU takes on a heavy load, the temperature increase tends to be gradual rather than instantaneous. And it certainly shouldn't occur during basic, undemanding tasks. Unfortunately, Intel's Core i7-7700K might have a temperature problem, with spikes of 30°C not uncommon when, say, opening a webpage.
Intel officially took notice of the 7700K's supposed issues after a post by "BC93Key" appeared on the company's forums. However, it seems reports of the processor's unpredictable behaviour had been doing the rounds among users before then.
Here's the gist of BC93Key's complaint:
I have found that the i7-7700k reports a momentary (a second or less) temperature spike +25 > 35 degrees Celsius anytime a program is opened, a webpage is opened, a background app runs etc. The temperature blip cascades through the cores in random order; not the same every time. This causes my heatsink fan to constantly cycle up and down. Temperatures otherwise report as steady, normal increases. Peak temperature under Prime95 blend test is 71 degrees Celsius.
It's important to note that BC93Key is running their system stock — that is, no overclocking or modifications to the hardware.
Now, it's not unusual for an idling processor to ramp up quickly once something starts happening, but a spike of 30°C is insane. It didn't take long for others to come out of the woodwork and report similar experiences.
Aside from basic troubleshooting, it took three weeks before Intel responded with concrete news, though it wasn't what users wanted to hear:
In our internal investigation, we did not observe temperature variation outside of the expected behavior and recommended specifications. For processor specifications, please refer to the Intel® Core™ i7-7700K Processor Product Specifications ... We do not recommend running outside the processor specifications, such as by exceeding processor frequency or voltage specifications, or removing of the integrated heat spreader (sometimes called "de-lidding"). These actions will void the processor warranty.
So as far as Intel is concerned, it's working as intended, which means anyone hoping for a driver update, microcode patch or refund may be out of luck. For those unsatisfied with the company's response, well, Intel's not the only player in town.
You've got to love the solution though; pretty much they're saying that the unlocked multiplier isn't to be used for overclocking.
“We do not recommend running outside the processor specifications, such as by exceeding processor frequency or voltage specifications, or removing of the integrated heat spreader (sometimes called “de-lidding”). These actions will void the processor warranty.”
So why is it unlocked Intel?
They're actually suggesting you change your fan curve so you don't have to listen to your fans rapidly changing rpms when you open Firefox....
Relid ftw! Void that warranty!
Posted By The Dude @ 6:20 AM
Thursday March 30th, 2017
| The teasing is over: Destiny 2, a video game for which I am already brainstorming silly ledes, will be out September 8 for PS4, Xbox One, and PC.
And here’s Bungie describing the game:
Humanity’s last safe city has fallen to an overwhelming invasion force, led by Ghaul, the imposing commander of the brutal Red Legion. He has stripped the city’s Guardians of their power, and forced the survivors to flee. You will venture to mysterious, unexplored worlds of our solar system to discover an arsenal of weapons and devastating new combat abilities. To defeat the Red Legion and confront Ghaul, you must reunite humanity’s scattered heroes, stand together, and fight back to reclaim our home.
As we reported last year, this Destiny sequel is aiming to feel like a fresh start for Bungie’s ongoing franchise, which has picked up a great deal of baggage since it first launched in September of 2014. In addition to coming to PC, Destiny 2 will offer a clean break for everyone, leaving behind all of our old weapons and gear.
Because the day wouldn’t be complete without more teasing, Bungie says there’ll be a “gameplay premiere live stream” on May 18.
Posted By CybrSlydr @ 12:45 PM
| If you’re skeptical whether “optimizations” can truly improve gaming performance on the disruptive new Ryzen CPU, AMD has a message for you: They really can.
On Thursday the company released benchmark results from a beta version of Ashes of the Singularity that showed a sizable increase in performance from just a few weeks of tuning for the company’s new CPU.
Why this matters: When AMD’s Ryzen launched with bat-out-of-hell application performance but somewhat slower gaming performance than Intel’s rival CPUs, it spawned an Unsolved Mysteries-like search for the cause of such a puzzling disparity. Many theories later (including one that has absolved Microsoft), the only one left standing is the games themselves.
AMD’s numbers show that patching Ashes of the Singularity: Escalation with Ryzen optimization could increase performance 26 to 34 percent, a significant boost for Ryzen.
Here's your independent verification, too: AMD officials gave PCWorld early access to a beta that features the Ryzen optimizations, which we tested under our control.
How we tested
For our original Ryzen review, we tested using four DDR4/2133 modules, which is the maximum clock speed for RAM when the memory controller is fully loaded. Because AMD says Ryzen performance can be improved using higher-clocked memory, we stripped out two modules, bringing the system to 16GB, and upped the speed to DDR4/2933. We also updated the BIOS on our Asus Crosshair VI Hero motherboard to the latest publicly available. The same GeForce GTX 1080 GPU handled the graphics chores.
The beta game executable was downloaded from Steam directly and not provided by AMD. Our Ryzen review actually used the original Ashes of the Singularity, but for this test, the beta required using the Ashes of the Singularity: Escalation expansion pack version.
The result? AMD’s not fronting. Our own tests found that the Ryzen-optimized Ashes of the Singularity: Escalation beta delivered a 20- to 28-percent boost under our testing conditions.
We also conducted CPU-centric testing, which puts more objects on the screen with more AI and physics to stress more cores. The bump wasn’t quite as significant, but there’s still a healthy increase in performance from just tweaking the game code.
The good news is, you can test it too. A patched version of the game containing the Ryzen optimizations should be immediately available on Steam for you to download and test.
But what about Intel?
Of course, you’re wondering how this optimization helps Ryzen compete with Intel’s chips, such as the Core i7-7700K. The patch helps, but it doesn’t close the gap. In the first chart, for example, a stock-clocked Core i7-7700K would be pushing 92 frames per second. Some of that clearly comes from Kaby Lake’s higher clock speed (which generally runs 500MHz or more faster), but some of it also comes from games optimization.
In fact, that’s why I featured the same Ryzen CPU in our charts above. Developers tell PCWorld Ryzen tuning is still in its infancy, and it’s somewhat unfair to pit the two chips against each other right now with the code as it is.
“Every processor is different on how you tune it, and Ryzen gave us some new data points on optimization,” Oxide’s Dan Baker told PCWorld. “We’ve invested thousands of hours tuning Intel CPUs to get every last bit of performance out of them, but comparatively little time so far on Ryzen.”
Baker said Oxide wanted to get the beta out to the world so users could at least see the potential. Oxide’s CEO also said (in a statement released by AMD), “as good as AMD Ryzen is right now—and it’s remarkably fast—we’ve already seen that we can tweak games like Ashes of the Singularity to take even more advantage of its impressive core count and processing power. AMD Ryzen brings resources to the table that will change what people will come to expect from a PC gaming experience.”
Oxide isn’t the only one starting to tune for Ryzen. Bethesda also said it had formed a partnership with AMD to optimize and support the company’s CPUs and GPUs.
What this all means: When AMD CEO Lisa Su addressed the gaming disparity just after Ryzen’s launch by saying “vital optimizations” will only make it better, I have to admit I was in the skeptical column. That’s because promised optimizations are basically the tech industry’s version of “the check is in the mail.” But with Oxide squeezing out so much more performance in just a few short weeks of tuning, there’s probably a lot more to come from Ryzen.
Source: PC World
Posted By CybrSlydr @ 12:20 PM
Tuesday March 28th, 2017
| Late last year Razer resurrected its Blade Pro laptop line, finally stuffing it with hardware worthy of the “Pro” appellation. Our three-word review: We loved it. It’s a great machine, if you can afford it.
And now it’s a bit better, thanks to the standard year-over-year refresh. Razer released details on a new Blade Pro today—it’s keeping Nvidia’s GeForce GTX 1080, but moving over from an Intel i7-6700HQ at 2.6GHz to an overclockable i7-7820HK processor at 2.9GHz. The Blade Pro’s 32GB of RAM also gets a timing bump up to 2,667MHz (from 2,133MHz).
The really interesting news though: The Blade Pro is now the first-ever laptop to receive THX Mobile Certification, “an accreditation reserved for high-performance mobile phones, tablets, and laptops.” From the press release:
“Through the processes of THX, the Razer Blade Pro screen is calibrated and tested for resolution, color accuracy and video playback performance...Similarly, the audio jack on the new Razer Blade Pro met THX requirements for voltage output, frequency response, distortion, signal-to-noise ratio, and crosstalk that guarantees clear sound through headphones.”
It’s worth noting that only the headphone jack is THX-certified, not the built-in speakers—an important point, I think, given people usually associate THX with surround sound systems. While the Blade Pro’s speakers are certainly better than your average laptop's, they’re still not amazing by any means.
We could also debate all day about the usefulness of THX certification. Is your non-certified Blade Pro from six months ago suddenly a decrepit old hag? Not at all. Razer’s even using the same 4K IGZO display on this new THX-certified laptop as it did on the 2016 model—just calibrated slightly differently, and with (presumably) a big ol’ THX stamp on the box. So yeah, this is a bit of a marketing win more than anything else.
On the other hand it does prove the Blade Pro is one hell of a laptop. A THX representative confirmed to me that competing laptops have undergone testing, but Razer’s is the first to meet the standards of this new Mobile Certification program. That makes it somewhat-objectively the best laptop in the world for the moment, at least by THX’s standards—meaning as far as the display and headphone jack are concerned.
Is that useful? I don’t know. The display is certainly an important aspect with laptops, so THX Mobile Certification isn’t a wholly made-up honorific. Still, it does seem of limited use to tech nerds—no consideration given to internal hardware, benchmarks, or anything we usually use to compare laptops. The Blade Pro is THX-certified to be easy on the eyes, and that’s about it.
I guess you’ll have to keep reading our PCWorld reviews for the full picture.
Source: PC World
Posted By CybrSlydr @ 11:13 AM
Monday March 27th, 2017
| If you’ve ever boiled with inner turmoil at the failure of your Android device to recognize an “OK Google” command, you know AI speech recognition and natural language processing still have a long way to go. In many ways, it’s symptomatic of taking a disembodied, top-down approach to language that treats words as sounds, rather than experiences. However, the folks at the OpenAI group (of Elon Musk fame) have made new strides in creating an AI that uses grounded, compositional language the way we do. This is both inspirational — what could be the dawn of a new era in communication — and more than a little alarming.
To better appreciate the departure this research represents, it’s important to understand the relevance of grounded, compositional language, as opposed to the canned responses offered up by Siri or Google Assistant. For decades, a faction of AI researchers has insisted that in order for AI to ever achieve something like common sense and an ability to communicate in a non-rigid fashion, it would need some form of embodiment – that is, some experiential locus from which to view and interpret its surroundings. The concept of grounded language follows from this principle and implies the ability to connect words and their meaning with one’s individual experiences.
This is an important distinction. Imagine a blind person who has never seen the color blue interpreting the word’s meaning in a sentence. They have no reference beyond the way other people have used the word “blue” before. This could be likened to how Siri or Google Assistant responds to a query – there is no experiential basis behind the response. A grounded use of language stems from an entity’s personal experiences. This is precisely the kind of language adopted by the agents in the OpenAI research project.
Compositional language, on the other hand, denotes the ability to string together multiple words to form more complex meanings. Certain monkeys for instance have different warning calls they use to differentiate between a snake and a bird of prey. But their language cannot be termed compositional, because they will never string these together to form more complex meanings, such as “the bird is carrying the snake.” The language developed by the AI agents at OpenAI, though still simple by human standards, represents advancement beyond almost anything seen in the animal kingdom.
Even more amazing, the researchers never explicitly programmed this AI communication. Instead, it “evolved” as a response to a reinforcement learning problem. While the jargon can get a bit technical, the OpenAI blog does a decent job of parsing it. The important thing to grok is the language was never defined, but rather hit upon as a solution to a general problem of learning to communicate. This type of AI method is called reinforcement learning, and involves the use of a reward signal to continually guide the agent towards an optimum outcome. It can be likened to the difference between giving someone a map up a hill, and handing them an altimeter and saying don’t stop hiking until you reach a maximum altitude. One approach lends itself to a single path, the other to a galaxy of alternatives.
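The altimeter analogy above can be sketched directly: a toy "agent" (purely illustrative — the landscape and numbers here are invented, and this is not OpenAI's actual setup) that never sees a map of the goal, only a scalar reward, and keeps whatever random change improves it.

```python
import random

def altitude(x):
    # Hypothetical reward landscape with a single peak at x = 3.0.
    # The agent only ever reads this "altimeter"; it never sees the peak.
    return -(x - 3.0) ** 2

def hill_climb(steps=10_000, step_size=0.1, seed=0):
    """Reward-guided search: try a random step, keep it if reward improves."""
    rng = random.Random(seed)
    x = 0.0  # start at the bottom of the hill
    for _ in range(steps):
        candidate = x + rng.uniform(-step_size, step_size)
        if altitude(candidate) > altitude(x):  # the only feedback signal
            x = candidate
    return x

print(hill_climb())  # ends up near the peak at 3.0
```

Real reinforcement learning replaces this blind local search with gradient-based policy updates over many agents, but the core loop is the same: no prescribed path, just a signal saying "warmer" or "colder."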
It’s not surprising, therefore, that AI agents developed some truly weird methods of communication – for instance, one in which the length of the spaces between communications came to represent different meanings, not unlike Morse code. At the moment, the AI language is completely non-human, with no English equivalent. And while there has been some talk of creating a translation tool to make the language readable in English, I think it’s worth simply marveling at the weirdness of these new communications. They may represent the closest thing to an alien language we have thus far encountered.
Posted By CybrSlydr @ 10:21 AM
Thursday March 9th, 2017
| Nvidia’s mighty Titan has fallen, as it always does.
Jaws dropped when the second-gen Titan X stomped onto the scene in August, and for more reasons than one. The monster graphics card was the first to ever flirt with consistently hitting the hallowed 60-frames-per-second mark at 4K resolution with everything cranked to 11—but that privilege cost a cool $1,200. Fast-forward five months: Nvidia’s teasing the GTX 1080 Ti as the “ultimate GeForce” card, with more performance than the Titan X for just—“just”—$700. That’s what the GTX 1080 Founders Edition cost at launch, and Nvidia says the Ti stomps the base GTX 1080.
Graphics-card lust truly is the cruelest obsession.
But does the GeForce GTX 1080 Ti live up to Nvidia’s hype? Is this the 4K-capable graphics card that gamers flush with tax-return money have been waiting for?
Yes. Oh my, yes. Let’s dig in.
Source: PC World
Posted By CybrSlydr @ 1:18 PM
Tuesday March 7th, 2017
| He has a new challenge as AMD's server chief: to bring back the glory days of the chipmaker's server business, which is now in tatters. A mega-chip called Naples, which has 32 cores and is based on the Zen architecture, will be the first test of AMD's return to the server market.
Source: PC World
Posted By Almost Tactf @ 11:33 AM
Thursday March 2nd, 2017
| Admit it. You love underdog tales. The Cleveland Cavaliers coming back from a 3-1 deficit against the Golden State Warriors. The New York Giants defeating the 18-0 New England Patriots, and the Average Joes beating the heavily favored Purple Cobras in the dodgeball finals.
Well, you can now add AMD’s highly anticipated Ryzen CPU to that list of epic comebacks in history. Yes, disbeliever, AMD’s Ryzen almost—almost—lives up to the hype. What’s more, it delivers the goods at an unbeatable price: $499 for the highest-end Ryzen 7 1800X, half the cost of its closest Intel competitor.
But before AMD fanboys run off to rub it into Intel fanboys’ faces, there’s a very important thing you need to know about this CPU and its puzzling Jekyll and Hyde performance. For some, we dare say, it might even be a deal breaker. Read on.
Source: PC World
Posted By CybrSlydr @ 10:03 AM
Wednesday March 1st, 2017
| One of the key detractors from complete immersion in virtual reality is the presence of cables that hinder your movement. Hence the question for the VR industry: what is freedom, and how does one become free? Ask that question of someone from MSI, and he or she would just smile and hand you something resembling a backpack. You would then realize that suddenly it's possible to have your cake and eat it too. It is possible to hunt dragons next to your sleeping girlfriend or boyfriend without waking her (or him). Actually, not just one pack of dragons, but several big, ugly packs.
What comes next is to ride off into the sunset with what is an awesome bag. During the Tokyo Game Show 2016, MSI announced the "world's first-ever VR Backpack," the MSI VR One. We find that proclamation somewhat funny, given that HP announced its OMEN backpack in May, Alienware followed at E3 2016 in June, and XMG from Schenker Technologies announced its product at Gamescom in Cologne, Germany. Then again, MSI's is the only VR backpack actually available for purchase.
Source: VR World
Yeah, I don't see this catching on.
Posted By CybrSlydr @ 6:55 AM
As we close out the second day of the Game Developers Conference (GDC) in San Francisco, Nvidia has revealed that a new high-end graphics card will arrive on the market next month.
The GeForce GTX 1080 Ti will be the firm's fastest gaming GPU to date based on the Pascal architecture. The GTX 1080 Ti comes with 11GB of GDDR5X memory and offers impressive specifications that deliver "up to 35 percent more performance than the GTX 1080".
Some of the finer details of the card lie below:
Massive Features for Massive Performance:
• The GTX 1080 Ti includes 3,584 NVIDIA® CUDA® cores and a massive 11GB frame buffer running at an unheard-of 11Gbps. It delivers up to 35 percent faster performance than the GeForce GTX 1080 and up to 78 percent faster performance than the GTX 1070. The GTX 1080 Ti is even faster than the NVIDIA TITAN X Pascal, its $1,200 big brother that was designed for deep learning and artificial intelligence.
• Next-Gen Memory Architecture: GTX 1080 Ti is the world's first GPU to feature Micron's next-gen G5X memory. 11GB of G5X memory running a blazing 11Gbps quad data rate delivers the most effective memory bandwidth of any modern gaming GPU. And it still has plenty of headroom for overclocking.
• Advanced FinFET Process: The GTX 1080 Ti is manufactured on the industry's cutting-edge FinFET process. Its 12 billion transistors deliver a dramatic increase in performance and efficiency over previous-generation products.
• Meticulous Craftsmanship: The GTX 1080 Ti runs as cool as it looks due to superior heat dissipation from a new high-airflow thermal solution with vapor chamber cooling, 2x the airflow area and a power architecture featuring a seven-phase power design with 14 high-efficiency dualFETs.
• Support for Advanced Graphics Technologies: 4K, VR, NVIDIA G-SYNC™ HDR and NVIDIA GameWorks™ offer interactive, cinematic experiences accompanied by incredibly smooth gameplay.
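One sanity check on the memory bullet above: the quoted 11Gbps is a per-pin data rate, so total bandwidth depends on the memory bus width. The 352-bit figure below is not stated in the press release; it is assumed from the card's published specifications.

```python
# Back-of-the-envelope bandwidth for the GTX 1080 Ti's G5X memory.
data_rate_gbps = 11      # effective data rate per pin, from the release above
bus_width_bits = 352     # assumed bus width; not stated in the release

def memory_bandwidth_gbs(rate_gbps, width_bits):
    """Total bandwidth in GB/s: per-pin rate times bus width, bits to bytes."""
    return rate_gbps * width_bits / 8

print(memory_bandwidth_gbs(data_rate_gbps, bus_width_bits))  # 484.0 GB/s
```

That works out to 484GB/s under these assumptions, which is what Nvidia means by "the most effective memory bandwidth of any modern gaming GPU."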
The new GeForce GTX 1080 Ti will be available from various partners with the Founders Edition being made available for pre-order starting on March 2 for a price of $699. The card will make its retail debut on March 10.
Posted By sysigy @ 3:29 AM
Tuesday February 28th, 2017
| The expected showdown between Radeon Vega GPUs and the GeForce GTX 1080 Ti at GDC on Tuesday won’t be a showdown after all. New technical details about AMD’s hotly anticipated enthusiast-class graphics cards were almost nowhere to be found during the company’s “Capsaicin & Cream” livestream—though Radeon head Raja Koduri did reveal that the brand name for Vega GPUs will indeed be “Radeon RX Vega,” rather than RX 490 or RX 580.
What a tease.
Koduri also showcased a brief Deus Ex: Mankind Divided demo that suggested that Vega’s high-bandwidth cache controller can increase average and minimum frame rates by 50 and 100 percent, respectively, in memory-limited games. Impressive! Another quick demo with AMD’s TressFX technology revealed that Vega’s rapid packed math feature could double compute rates, which let the demo render twice as many hair strands as a Vega system with RPM disabled.
But the majority of the event focused on the sort of nitty-gritty technical details befitting a Game Developers Conference—though some of the announcements benefit everyday gamers, too.
Most notable is a new technology deal with Bethesda, the publisher of Doom, Fallout, The Elder Scrolls, Dishonored, and more. While partnerships between graphics companies and developers have typically involved just a single, specific game, AMD’s deal with Bethesda spans multiple games across a range of series. The crux is primarily to implement Vulkan, the open DirectX 12 alternative that rose from the ashes of AMD’s Mantle technology, as well as “the computing and graphics power of AMD Ryzen CPUs [and] Radeon GPUs.”
Bethesda’s id Software worked closely with AMD to implement Vulkan and other technologies in Doom, and the results were nothing short of spectacular. It’s tantalizing to think of that technical expertise potentially supercharging Bethesda’s other series. We’ll have to see how it shakes out; no additional specifics were announced, though earlier this month rumors of Vega being tied to Bethesda’s ambitious Prey surfaced.
Radeon’s also fueling a more affordable rival to GeForce Now for PCs. LiquidSky, like GFN, allows you to stream full-blown PC games from its cloud servers to any PC, Mac, or Android device—even ones that couldn’t otherwise play games. And soon those servers will rely on Radeon Vega graphics, AMD announced, both to deliver higher performance and to split the capabilities of a single GPU among several users.
The rest of the announcements revolved around virtual reality. First, AMD’s unlocking some nifty VR technology tricks. Most notable is support for the HTC Vive’s async reprojection: This works somewhat similarly to the Oculus Rift’s Asynchronous Timewarp to reduce nausea-inducing judder when a game’s frame rate drops below 90 frames per second.
Nvidia graphics cards have supported async reprojection since the tech launched last November, but AMD needed to unlock new hardware features in Radeon cards. Whereas the Rift’s Timewarp utilizes the dedicated asynchronous compute engine hardware inside Radeon cards, the Vive’s async reprojection leans on graphics threads, and AMD’s implementation relies on ultra-fast preemption and context switching between those threads. Look for it to release sometime in March.
AMD is also adding support for forward rendering in virtual reality, because standard deferred rendering has a performance cost and doesn’t work nicely with MSAA antialiasing in VR. The newfound support will appear in version 4.15 of Epic’s widely used Unreal Engine 4.
Finally, some new virtual-reality games were exclusively revealed during AMD’s livestream, with the developers of each praising Radeon’s new forward-rendering technology. Sprint Vector, a new game by Survios, the creators of the wildly popular Raw Data, introduced a “unique intelligent fluid locomotion system”—a whole new way of moving in VR. Also on display: Overrun, an expansion for the ROM: Extraction game developed by a squad of Call of Duty veterans, and Reaping Rewards, a VR experience by Limitless Studios that explores “the emotional choices of a young Grim Reaper as you learn about life and death from your mentor.”
Hardware enthusiasts definitely won’t be left hanging at GDC 2017. During last week’s Ryzen launch event, AMD showed the first-ever Vega graphics card running in the wild. And it looks ****ed near certain that Nvidia will reveal the long-awaited GTX 1080 Ti at its own event later tonight, before those long-awaited Ryzen processors hit the streets on March 2.
Editor's note: This article was updated to include Vega brand info.
Source: PC World
Posted By CybrSlydr @ 5:07 PM