Wednesday August 5th, 2015
| A large number of users invested in Intel-based platforms during the Core 2 Quad, Nehalem and Sandy Bridge releases. Sandy Bridge was notable because it offered a large performance gain at stock speeds, and with a good processor anyone could reach 4.7 GHz or even higher using a good high-end cooler. As a result, Intel has had a problem enticing these users to upgrade, because their performance has been consistently matched by Ivy Bridge, Haswell and Broadwell: for every 5% IPC increase from the CPU, an average of 200 MHz was lost from a good overclock, and users would have to hunt for a good overclocking CPU all over again. There was no great reason, apart from chipset functionality, to upgrade.
That changes with Skylake.
From a clock-to-clock performance perspective, Skylake gives an average ~25% better performance in CPU-based benchmarks, and when running both generations of processors at their stock speeds that increase jumps to 37%. In specific tests it is even higher. Even when you pit a 4.5 GHz Skylake against a 4.7 GHz Sandy Bridge, the ~4% frequency deficit erases only a tiny portion of that gain. There are other added benefits, such as the move to a DDR4 memory topology with denser memory modules, as well as PCIe storage and PCIe 3.0 graphics connectivity.
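As a rough sanity check, the two effects can be combined in a back-of-the-envelope calculation (a sketch only; it assumes performance scales linearly with clock speed, and uses the ~25% average figure from above):

```python
# Back-of-the-envelope relative performance: IPC gain x clock ratio.
# Assumes performance scales linearly with clock speed, which only
# holds approximately for real workloads.
sandy_clock = 4.7    # GHz, a good Sandy Bridge overclock
skylake_clock = 4.5  # GHz, a good Skylake overclock
ipc_gain = 1.25      # the ~25% clock-for-clock gain cited above

relative = ipc_gain * (skylake_clock / sandy_clock)
print(f"{(relative - 1) * 100:.0f}% faster")  # -> 20% faster
```

Even after giving back 200 MHz, the overclocked Skylake chip still comes out comfortably ahead on this estimate.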
As I said above, Skylake is not necessarily the most ground-breaking architecture compared with Haswell. It affords a 19% CPU performance gain over the i7-4770K and 5% over the i7-4790K. There is a minor deficit in gaming that disappears in synthetic tests, but it amounts to only a couple of percentage points. For that minor hit, the combined package of processor, chipset and DDR4 should intrigue Sandy Bridge (and older) owners.
Posted By chartiet @ 8:31 AM
Friday July 31st, 2015
| In February 2010, Google announced its Fiber project, designed to offer gigabit service with broadband speeds 100 times faster than what most Americans were receiving at the time. For several years, incumbent providers did not react with improved services, saying instead that such services were too expensive and consumers didn't care about getting a gig.
In the last year, however, as Google started to offer gigabit services at prices comparable to much slower offerings by cable and phone company providers and expanded Fiber's service territory, the incumbents and others have started to announce their own gigabit offerings at similar price points, setting the stage for a potential "game of gigs," in which tens of millions of Americans have the potential to receive faster, better and cheaper broadband.
There is no question that Google Fiber is a seminal development in the broadband market. The question is what is the lesson for policy?
Posted By CybrSlydr @ 9:41 PM
Monday July 27th, 2015
| By Seth G. Macy
Nvidia's next mid-range graphics card is rumored to debut on August 17. The GeForce GTX 950, as it's now known, will compete against AMD's Radeon R7 370.
According to TechSpot, the GTX 950 will have Nvidia's GM206 GPU, which first debuted in the GTX 960 back in January. However, 25% of the CUDA cores in the GM206 will be disabled for the GTX 950, giving it a core count of 768.
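As a quick sanity check on that figure (assuming the full GM206 die's 1024 CUDA cores, as found in the GTX 960):

```python
# Full GM206 die (GTX 960) has 1024 CUDA cores; the rumor says
# 25% of them are disabled for the GTX 950.
gm206_cores = 1024
gtx950_cores = gm206_cores * 3 // 4  # 75% of the die enabled
print(gtx950_cores)  # -> 768
```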
The reported clock speeds for the GTX 950 are between 1150 and 1250 MHz, with a boost clock speed between 1350 and 1450 MHz. This actually puts the speed of the 950 higher than the 960, which has a base clock speed of 1127 MHz and a boosted speed of 1178 MHz.
The card will come with 2GB GDDR5 memory, and the 90W TDP card is apparently powered by a single 8-pin PCIe connector.
The Radeon R7 370, the card the GTX 950 is aimed at, has 4GB of GDDR5 memory and a similar $150 USD price point. AMD unveiled its 300 series of cards at an event at E3.
Nvidia released its GTX 980 Ti earlier this year, a positive beast of a graphics card with a beastlier price tag than the reported 950, coming in at an MSRP of $650. The 950 should satisfy budget-conscious gamers who want to strike a balance between power and price.
Posted By CybrSlydr @ 4:49 PM
Thursday July 9th, 2015
| While Intel is finally getting its 14-nanometer chips out to the public, IBM today announced an even more impressive silicon breakthrough: the production of the first working 7nm chip. It's particularly impressive since it took chip makers like Intel years to move from 22nm chips to 14nm, which offer better power efficiency and faster overall speeds thanks to their denser manufacturing. IBM's 7nm chip, produced together with partners including GlobalFoundries (which is taking over IBM's semiconductor business) and Samsung, will offer similar benefits, but the road to get there was vastly more complex than for 14nm chips. IBM says it's using silicon germanium in the electricity-conducting channels on the chip, as well as a new lithography method, dubbed extreme ultraviolet (EUV), to print finer circuits (around 10,000 times thinner than a human hair). Perhaps most intriguingly, it also keeps Moore's Law, the notion that computing power will double roughly every 18 months, alive for the next few years.
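To see why each node shrink matters, here is the ideal-case geometric scaling from 14nm to 7nm (a sketch only; real process nodes rarely achieve ideal scaling, and marketing node names don't map cleanly to actual feature sizes):

```python
# Ideal-case transistor density scaling between process nodes:
# density ratio = (old_feature_size / new_feature_size) ** 2,
# since area shrinks with the square of the linear dimension.
old_node_nm, new_node_nm = 14, 7
density_ratio = (old_node_nm / new_node_nm) ** 2
print(density_ratio)  # -> 4.0
```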
"The implications of our achievement are huge for the computer industry," wrote Mukesh Khare, IBM's VP of semiconductor technology research. "By making the chips inside computers more powerful and more efficient, IBM and our partners will be able to produce the next generations of servers and storage systems for cloud computing, big data analytics and cognitive computing."
Posted By CybrSlydr @ 3:43 PM
| Intel Compute Stick Review: Don’t Buy It
Who wants a cheap HDMI stick that can turn any TV into a full Windows computer? Everybody, right? That’s what we thought. Oh god were we wrong. When Intel announced the $150 Compute Stick at CES, we figured it could become the ultimate miniature PC for all kinds of people. Too bad it’s terrible.
Theoretically, there are loads of things you could do with a computer this tiny. You could work from it, of course, or browse the web from your couch. Watch Hulu without a subscription. Stream games from another computer. My editor Sean Hollister was excited to load Steam on it, plug in an Xbox 360 wireless adapter, and play lightweight games like Nidhogg with buddies on a big screen without lugging a console around. I was cautiously optimistic I could turn the Stick into a Kodi media streamer, accessing videos from my desktop PC over my home network.
Do some of these things work? Sure. But using this under-equipped PC is a giant pain in the ass—to the point that it’s probably not worth it.
What is it?
An attempt to cram an entire desktop computer into a tiny $150 HDMI dongle that you plug into any TV—that doesn’t quite stick the landing. Boy, it sure sounded good on paper, though: a quad-core 1.33GHz Intel Atom processor, 2GB of RAM, and 32GB of storage, all in a compact, pocketable package? Yes please. The Compute Stick’s tiny case even features a full-sized USB port and a MicroSD card reader. What’s not to like? As it turns out, almost everything.
All you need is an app to make your phone a wireless keyboard/mouse.
Posted By maxgull @ 9:36 AM
Thursday July 2nd, 2015
| By Alex Osborn
Batman: Arkham Knight publisher Warner Bros. Interactive was apparently well aware of the technical issues that currently plague the PC version of Rocksteady Studio's trilogy-capper.
According to Kotaku's sources, the publisher knew the PC version had problems months ago but decided to ship the game anyway.
"I will say that it’s pretty rich for WB to act like they had no idea the game was in such a horrible state," an anonymous QA tester said. "It’s been like this for months and all the problems we see now were the exact same, unchanged, almost a year ago."
A few days ago, Warner Bros. decided to suspend sales of Arkham Knight on PC in response to an overwhelming number of reports from gamers having issues. We've reached out to Warner Bros. for comment regarding its alleged knowledge of these issues pre-release, and will update as soon as we receive a reply.
That said, the console version of Rocksteady's superhero game has been received incredibly well by critics, with IGN's Dan Stapleton praising the title in his review for its "excellent gameplay variety" and "detailed open world."
That's not good.
Posted By CybrSlydr @ 9:04 AM
Wednesday June 24th, 2015
| Last week at E3, AMD pulled the wraps off its new flagship GPU: the liquid-cooled Radeon R9 Fury X. It’s the most powerful GPU AMD has ever created, as well as the company’s first video card to feature new architecture and a new type of memory. Even before its announcement, speculation ran wild about how it’d compare to Nvidia’s GTX 980 Ti, which is equally priced at $650.
When specs and AMD-provided performance numbers came out late last week, it appeared that the two cards were pretty closely matched—and after some extensive testing, we can confirm that both cards are indeed pretty darn close in performance.
Posted By CybrSlydr @ 11:42 PM
Tuesday June 16th, 2015
| The wait is finally over. A year and a half after the launch of the R200-series, months after Nvidia refreshed its entire GeForce lineup, AMD has lifted the veil off its new Radeon graphics card lineup: the Fury series, powered by revolutionary high-bandwidth memory (HBM), and the company’s brand new Fiji GPU.
| Leading the charge is the AMD Radeon R9 Fury X, which is capable of driving Tomb Raider to 45 frames per second at 5K resolution and Ultra settings, and offers 1.5 times the performance per watt of AMD’s previous R9 290X flagship. The graphics card boasts 4,096 stream processors—an incredible jump over the R9 290X’s 2,816. The Radeon R9 Fury X packs 8.9 billion transistors, compared to the R9 290X’s 6.3 billion transistors. Given that AMD—like Nvidia—is still using the 28nm manufacturing process, the Fiji GPU itself must be massive.
Fury X has officially launched.
The R9 300 series will ship this coming Thursday.
Fury X Water Cooling Edition is shipping June 24th @ $649
Fury Air cooled is shipping July 14th @ $549
Fury X Specs
8.9 Billion Transistors
Fury Nano is coming later this summer. AMD CEO Lisa Su stated 1.5x the performance per watt of the 290X for Fury X and 2x the performance per watt for the Fury Nano. Pics of Fury X, Nano, and dual Fury in attachments. VSR is coming as well, something that has been lacking from AMD for some time.
Surprisingly, I'm loving the form factor. I could make one hell of an mITX build with two Fury Nanos, all on water. It would make for the ultimate LAN PC or travel PC. Meanwhile, Project Quantum would be smaller than any console and as powerful as a full-size tower with a Fury X in it. It seems about the same size as the smallest Steam Machines.
Fury X industrial design
Fury X product overview
High-Bandwidth Memory with more than 3x the bandwidth per watt of GDDR5, along with a 4096-bit memory interface - the highest AMD GPU memory bandwidth ever.
Sleek industrial styling including GPU Tach activity indicator and LED illumination all in a compact 7.5-inch card that’s all about raw graphics power.
Powerful performance, bleeding-edge technology, liquid-cooled, whisper-quiet and future-ready for extreme 4K and VR gaming.
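For reference, the peak bandwidth of an interface this wide can be estimated from bus width and memory clock; the 500 MHz HBM1 clock below is the commonly reported Fury X figure, not a number taken from AMD's material above:

```python
# Peak memory bandwidth = bus width (bits) x effective clock / 8.
# 500 MHz is the commonly reported HBM1 base clock on Fury X.
bus_width_bits = 4096
effective_clock_hz = 500e6 * 2  # double data rate

bandwidth_gb_s = bus_width_bits * effective_clock_hz / 8 / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 512 GB/s
```

The enormous bus width is how HBM reaches this figure at such a low clock, which is where the bandwidth-per-watt advantage over GDDR5 comes from.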
Posted By Blackops_2 @ 2:28 PM
Thursday June 11th, 2015
| The virtual reality company, bought by Facebook last year for $2 billion, said Thursday it plans to launch its consumer Rift headset early next year in a partnership with Microsoft that will tie the Xbox One game console to the Oculus platform.
The final version of the Rift, shown today for the first time, will come with an Xbox One wireless controller, as well as a standing camera to track your head movements and whether you're standing and moving around the room or sitting down. Oculus is also working on hand controllers of its own, called Oculus Touch, that resemble small joysticks with looping rings around the base. That hardware is being designed to bring more realistic hand-motions to VR worlds that will let people interact with the environment.
Posted By CybrSlydr @ 1:40 PM
Wednesday June 10th, 2015
| ...As usual, we have several synthetic benchmarks covered by Videocardz, taken from the 3DMark database, which has validated several new cards. Videocardz's previous GeForce GTX 980 Ti synthetic benchmarks turned out to be close to the real numbers, so we can again expect these results to be close to the cards that hit the market later this month. The benchmark shows the Fiji XT-based Radeon Fury X performing around 100 3DMarks better than the GeForce GTX 980 Ti in both Firestrike Ultra and Extreme (both cards clocked at reference speeds). The overclocked GeForce GTX 980 Ti shows additional performance due to its higher clock speeds, but we can expect the same from the Fury X, whose unique water-cooling design is built with overclocking in mind. The GeForce GTX Titan X is faster in 3DMark Firestrike Extreme but slightly slower than the Radeon Fury X in 3DMark Firestrike Ultra, whose design benefits gaming at 4K and higher resolutions despite the Titan X's higher ROP count.
The Radeon R9 390X and Radeon R9 390 can also be seen in the benchmarks. The R9 390, with its faster-clocked Hawaii Pro GPU, performs slightly ahead of the GeForce GTX 970, while the Radeon R9 390X is shown to be faster than the Radeon R9 290X thanks to its larger VRAM and higher clock speeds, but it lags behind the GeForce GTX 980 in both 3DMark benchmarks.
Posted By CybrSlydr @ 1:57 PM
Friday June 5th, 2015
| By Luke Karmali
Valve has announced the first pre-order details for a series of upcoming Steam Machines and when they'll be available.
Though lots of Steam Machines have been announced, only a few will be available to pre-order now along with the Steam Controller. Doing so, however, will allow you to pick the hardware up several weeks before anyone else.
North American gamers will be able to pre-order an Alienware Steam Machine, Steam Link or Steam Controller from GameStop or Steam as of today and get their order on October 16, with the official launch not happening until November 10. This offer is only available for a limited time.
In Europe and Canada, GameStop, EB Games, Micromania, GAME and Steam are offering the Steam Link and Steam Controller for pre-order.
CyberPower's Steam Machine is set to be available directly from its site in all regions. More information on other Steam Machines will be coming in the next few weeks.
For a full breakdown on the variety of models, check out IGN's Steam Machine roundup.
Posted By CybrSlydr @ 11:01 AM
Monday June 1st, 2015
| For much of the last year now, the story of the high-end video card market has been the story of NVIDIA. In September 2014 the company launched the GeForce GTX 980, the first and, at the time, most powerful member of their Maxwell 2 architecture, setting a new mark for both power efficiency and performance and securing their lead in the high-end video card market. NVIDIA then followed that up in March with the launch of the GeForce GTX Titan X, NVIDIA’s true flagship Maxwell part, and a part that only served to further cement their lead.
Based on the very powerful (and very large) GM200 GPU, GTX Titan X is currently untouched in performance. However, priced at $1,000, it is also currently untouched in price. In NVIDIA’s current lineup there is a rather sizable gap between the $550 GTX 980 and the $1,000 GTX Titan X, and perhaps more significantly, GTX Titan X was the only GM200 part on the market. With NVIDIA launching their fully enabled flagship card first, it was only a matter of time until they released a cheaper card based on a cut-down version of the GM200 GPU in order to fill that pricing hole and to put salvaged GM200s to good use.
Now, just a bit over two months since the launch of the GTX Titan X, NVIDIA is launching their second GM200 card, the GeForce GTX 980 Ti. Based on the aforementioned cut-down version of GM200, GTX 980 Ti is the expected junior version of GTX Titan X, delivering GM200 at a cheaper price point. But calling GTX 980 Ti a cheaper GM200 may be selling it short; “cheaper” implies that GTX 980 Ti is a much lesser card. At $649, GTX 980 Ti is definitely cheaper, but the card that is launching today is not to be underestimated. GTX 980 Ti may be intended to be GTX Titan X’s junior, but with the excellent performance it delivers, GTX 980 Ti may as well be GTX Titan X itself.
Posted By CybrSlydr @ 11:47 AM
| XCOM 2 Announced -- IGN First
Fight to free Earth from alien domination in the PC-exclusive sequel.
By Dan Stapleton
Last week, 2K teased us with a cryptic website: AdventFuture.org. On it, a supposedly utopian near-future government called the Advent Administration offered a better life for citizens through genetic enhancement, while hackers subversively replaced text and images revealing that Advent isn’t as benevolent as it seems.
Today, IGN First officially announces that the game 2K hinted at is in fact Firaxis Games’ XCOM 2, a full sequel to 2012’s critically acclaimed XCOM: Enemy Unknown and a continuation of the now 20-year-old legendary turn-based tactical squad combat series, which will be available exclusively on PC this November. Watch the debut trailer above for a glimpse into a future where the aliens are in control of the Earth, and XCOM has gone underground to fight to overthrow their Advent government. This new guerrilla force will face more powerful enemies in unpredictable combat scenarios as they fight to turn the tables on a technologically superior enemy.
Over the next month IGN will reveal huge details about XCOM 2, uncovered during our visit to Firaxis’ Maryland studio. Tomorrow you’ll get an in-depth overview and interview with Creative Director Jake Solomon that’ll highlight the major new features, including new soldier classes, new aliens, stealth-infused tactics, procedurally generated maps, and more. Following that, we’ll have a Rewind Theater examination of all the details in today’s trailer reveal with Solomon and Lead Producer Garth DeAngelis.
And, later this week we’ll go in-depth with Solomon and DeAngelis on the reasons behind the bold move to take XCOM from a multiplatform series to a PC exclusive, and how XCOM 2 will be tailored to take advantage of that single platform’s strengths – followed by Firaxis’ exciting plans to support modders and their work.
You might notice a lack of gameplay video, but fear not: in two weeks, we’ll debut the first in-game footage of XCOM 2 on our IGN E3 Live Show.
All of that and much, much more XCOM 2 is coming your way. As a huge XCOM fan, I couldn’t be more excited to tell you all about it.
Posted By CybrSlydr @ 11:39 AM
Saturday May 23rd, 2015
| Microsoft just rolled out the latest Windows 10 preview build, and a Reddit user is already tinkering with the preview build of DirectX 12 that comes with it. If you have been on the internet any time since the Redmond-based firm announced the API at GDC last year, you would know that there's a grand debate over whether or not DirectX 12 will improve the performance of PC and Xbox One hardware by leaps and bounds. So to look into things first-hand, the user tested the preview build of the API on his more than three-year-old hardware, an Nvidia GeForce GTX 670 and an Intel Core i7-2600K, and the results were quite amazing.
According to him, his tests show a several-fold boost in draw call throughput. As visible in the image below, DirectX 11 managed 1,515,965 draw calls single-threaded and 2,532,181 multi-threaded. When the user switched to DirectX 12, the count rose to 8,562,158, roughly a 465% increase over single-threaded DirectX 11 and about 3.4 times the multi-threaded result.
The user stated that he was quite surprised to see DirectX 12 performing this well despite it being a preview build. He also explained how the benchmark software measures the improvement: “There’s no actual point score. All it’s doing is increasing the number of draw calls by increasing scene complexity. It just keeps going until the framerate drops to 30, then notes the calls/sec and bails. Since it’s only issuing calls for primitives (apparently anyways) it’s actually giving you a solid idea of how raw output is limited by the number of draw calls that can be dispatched.”
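Running the quoted figures through the standard percentage-increase formula gives:

```python
# Percentage increase in draw call throughput, using the figures
# quoted from the 3DMark API overhead test above.
dx11_single = 1_515_965
dx11_multi = 2_532_181
dx12 = 8_562_158

def pct_increase(old, new):
    return (new - old) / old * 100

print(f"{pct_increase(dx11_single, dx12):.0f}%")  # -> 465%
print(f"{pct_increase(dx11_multi, dx12):.0f}%")   # -> 238%
```

Note that the test measures raw dispatch rate for simple primitives, so these gains won't translate directly into frame rates in real games.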
Posted By CybrSlydr @ 5:06 PM
Friday May 8th, 2015
| Windows 10 is going to be the last major revision of the operating system.
Jerry Nixon, a Microsoft development executive, said in a conference speech this week that Windows 10 would be the "last version" of the dominant desktop software.
His comments were echoed by Microsoft which said it would update Windows in future in an "ongoing manner". Instead of new stand-alone versions, Windows 10 would be improved in regular instalments, the firm said. Mr Nixon made his comments during Microsoft's Ignite conference held in Chicago this week.
In a statement, Microsoft said Mr Nixon's comments reflected a change in the way that it made its software. "Windows will be delivered as a service bringing new innovations and updates in an ongoing manner," it said, adding that it expected there to be a "long future" for Windows.
Which likely means that there will be "service packs" or "tiers" of the software, and you'll have to pay at regular intervals to stay updated. Looks like MS really is going to go for a fee-based OS from now on.
Posted By lexandro @ 3:23 PM