Tuesday October 6th, 2015
| After weeks of leaks, Microsoft is officially unveiling the Lumia 950 today. It has been 18 months since the last flagship Windows phone was announced, and the new Lumia 950 is here to help launch Windows 10 Mobile. The Lumia 950 will be available starting in November for $549.
Microsoft has opted for a 5.2-inch WQHD (1440 x 2560) OLED display on the Lumia 950, coupled with 3GB of RAM and a Snapdragon 808 processor. That makes it one of the more powerful Windows phones we've seen to date, and the first to make use of Qualcomm's latest processors.
Also, it has liquid cooling. Seriously.
Source: The Verge
Posted By CybrSlydr @ 10:22 AM
Friday October 2nd, 2015
| By Mitch Dyer
Update: Microsoft has provided official comment to IGN on numerous subjects.
Havok will not be limited to Microsoft exclusively. “We will continue to license Havok’s technology to the broad AAA games industry," a representative told IGN. "This also means that we will continue to license Havok’s technology to run across various game consoles including Sony and Nintendo.”
On the subject of what Microsoft paid for the acquisition, we were told, “We are not discussing financial details at this time.”
Finally, Microsoft told IGN, "We are working closely with the team at Havok to ensure a smooth transition, but have nothing further to announce at this time.”
Havok, the company known for its fantastic physics in video games, has been acquired by Microsoft. The Xbox publisher purchased Havok from Intel. Microsoft explained that this is part of "building the most complete cloud service."
"Microsoft’s acquisition of Havok continues our tradition of empowering developers by providing them with the tools to unleash their creativity to the world," the company's corporate article reads. "We will continue to innovate for the benefit of development partners. Part of this innovation will include building the most complete cloud service, which we’ve just started to show through games like 'Crackdown 3.'"
Havok was used recently for Destiny, Mortal Kombat X, Dark Souls 2, and Watch Dogs.
Microsoft did not announce if Havok will continue to provide physics for third-party games, nor how much the acquisition cost.
Posted By CybrSlydr @ 5:55 PM
Tuesday September 22nd, 2015
| Office is 25 years old this year. Some of the individual components are older, but it was 1990 that Microsoft first released a combined Office bundle, containing Word 1.1, Excel 2, and PowerPoint 2. Through the peculiar quirks of Microsoft's versioning scheme, today marks the 17th release, version number 16.0, branded Office 2016.
You can tell that you're using Office 2016 and not its predecessor, Office 2013, because by default Office 2016 colorizes the title bar of each app to reflect that application's distinctive color (except for Outlook, which remains distressingly blue after Microsoft decided, for some inexplicable reason, that it should no longer be gold). That's optional, and if you prefer the more muted look you can disable it, leaving you with something that looks like Office 2013 with only a few minor variations.
Desktop productivity applications haven't really changed much for many years. Office 2007 shook up the interface in a big way with its ribbon, and Office 2010 refined it with the introduction of the "backstage" view used for saving, opening, and printing documents. But even the ribbon didn't change the basic structure of the apps, or the way they interoperate with one another. One feels that there hasn't really been anything new in how these core word processing, spreadsheet, presentation, and e-mail/calendaring apps work since the days of the (long forgotten) Lotus Improv, which did at least try to offer an alternative to the formulae-in-a-grid-of-cells approach that defines spreadsheets today.
Office 2016 doesn't do anything to buck that trend.
Posted By CybrSlydr @ 9:21 AM
| In what is one of the most Goldblum-like moments of the year so far, Nvidia has partnered with OEMs like Asus and MSI to cram the full desktop version of its high-end GTX 980 graphics card into laptops. Thanks to its full array of 2048 CUDA cores, up to 8GB of 7GHz GDDR5 memory, and 1126MHz core clock, Nvidia claims the new laptop GTX 980 offers around a 30 percent performance boost over its previous flagship laptop GPU, the GTX 980M.
Even crazier, Nvidia has also managed to convince OEMs to let users overclock the GTX 980 too. Coupled with Intel's upcoming unlocked K-series Skylake laptop CPUs, users will be able to eke out a significant amount of extra performance from their laptops, cooling permitting. To help things along, Nvidia's laptop GTX 980s will differ slightly from their desktop counterparts in that they'll be binned for lower leakage and power consumption.
Nvidia says the binning process will ensure each laptop GTX 980 is guaranteed to hit the advertised 1126MHz GPU core clock and 1216MHz boost clock, as well as achieve overclocks somewhere in the region of 200MHz. That's a modest increase over the stock clock, but given the thermal restraints of a notebook chassis it's still rather impressive. To hit those overclocked speeds, users will be able to tweak the fan curve of the GPU (a first for laptops), as well as adjust the core clock and memory speeds.
However, users will be limited to a fan speed offset set by the notebook manufacturer. The overall power target as well as voltage control will also continue to be locked down. Other features of the laptop GTX 980 include between four- and eight-phase power supplies for better, cleaner power delivery, as well as support for three-panel surround gaming. Some OEMs are equipping their notebooks with three discrete outputs, although others will work via DisplayPort daisy chaining.
Naturally, cramming a 165W GPU into a laptop chassis does come with some compromises. For starters, all the notebooks available at launch feature a 17-inch or larger screen, which—when coupled with the gargantuan external power supplies they require—means they're not exactly something you want to carry around with you too often. All the launch models also come with 1080p displays only, albeit displays that support Nvidia's variable refresh rate technology, G-Sync.
That's disappointing given the sheer graphics grunt of the GTX 980, which is more than capable of pushing 1440p or 4K visuals at high settings. That said, Asus has teased that some of its upcoming 17-inch gaming notebooks will feature a 4K option. The company's madcap watercooled GX700VO—which is just as big and outrageous in person as you'd imagine—will feature a GTX 980 when it launches sometime in November. The rest of the laptop line-up, including a table-burning 18.4-inch SLI model from MSI, will launch later this month. Pricing is to be confirmed, but expect it to be very high indeed (probably £2,000 or more).
While the practicality of putting a GTX 980 into a notebook is questionable, the fact that Nvidia convinced OEMs to design notebooks with better cooling and power delivery in order to make it happen is impressive, and getting them to allow overclocking is even more so. Nvidia's senior product manager for GeForce notebooks, Brian Choi, told Ars that overclocking in particular has been "difficult for [Nvidia] to encourage OEMs to do."
"When we develop a desktop GPU part we control the GPU temperature, we control the fan, we control everything, and we can expose that to the overclocker very easily," continued Choi. "In a notebook environment, OEMs like Asus and MSI are responsible for the safety and reliability and longevity of the entire system. So they don't really want to give away control of cooling, because that can affect the warranty, and stability of their brand."
As for whether anyone actually wants a desktop GPU in their laptop, along with all the compromises in size, battery life, and noise that brings, Choi was optimistic.
"We're not going for the mainstream guy who's looking for something thin and light with a desktop-class CPU and GPU. God knows I would love to do that one day, I think we all do. But physics is physics, and the fact that we're able to get a great flagship GTX 980 into a notebook is a real milestone," said Choi.
"The industry has sort of been kept in a time loop, because no one pushed [OEMs]. I consider it similar to the four-minute mile. No one thought you could break the four-minute mile, but as soon as someone did everyone was piling in. No one thought you could make a thin gaming-class notebook until Razer did it. After Razer did it, everyone figured out it wasn't that hard and started piling in. In this case, we're telling the industry to try harder, to make a desktop-class enthusiast notebook and to not be shy about it because people want this."
Posted By CybrSlydr @ 9:18 AM
| The Samsung 950 Pro SSD—the follow up to the legendary Samsung 850 Pro SSD—has been unveiled by the company at its annual SSD summit in Seoul, Korea. The 950 Pro will be available at retail in October, with MSRPs of $199.99 (probably ~£150) for the 256GB version, and $349.99 (~£280) for the 512GB version. UK pricing is yet to be confirmed.
Based on Samsung's V-NAND technology and available in 512GB and 256GB capacities, the 950 Pro shuns the common 2.5-inch form factor and SATA interface for cutting-edge M.2 2280 and PCIe 3.0 x4. It also makes use of the Non-Volatile Memory Host Controller Interface, better known as NVMe.
Most SSDs still make use of the AHCI (Advanced Host Controller Interface) architecture, which was originally developed for spinning platter SATA hard drives back in 2004. While AHCI works fine for traditional hard drives, it was never designed for low latency NAND chips. As flash speeds have increased, AHCI has become a performance bottleneck. NVMe exploits both the PCIe bus and NAND flash memory to offer higher performance and lower latency.
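The queueing limits behind that bottleneck are easy to quantify. A quick sketch of the figures from the AHCI and NVMe specifications (my own arithmetic, not from the article or Samsung):

```python
# AHCI exposes a single command queue, 32 commands deep; NVMe allows up
# to 65,535 I/O queues, each up to 65,536 commands deep. The difference
# in maximum outstanding commands is what lets NVMe keep low-latency
# NAND busy in a way AHCI never could.
ahci_queues, ahci_depth = 1, 32
nvme_queues, nvme_depth = 65_535, 65_536

ahci_outstanding = ahci_queues * ahci_depth  # 32 commands in flight
nvme_outstanding = nvme_queues * nvme_depth  # ~4.3 billion commands

print(f"AHCI: {ahci_outstanding:,} max outstanding commands")
print(f"NVMe: {nvme_outstanding:,} max outstanding commands")
```

That is a theoretical ceiling, of course; real drives (the 950 Pro included) implement far fewer queues, but still vastly more than AHCI's single 32-slot queue.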
In the case of the 512GB Samsung 950 Pro, the combination of NVMe, speedy V-NAND chips, and a triple core, eight-channel UBX controller has resulted in some eye-popping performance. Sequential read speeds top out at 2500MB/s, while sequential writes hit 1500MB/s. By comparison, Samsung's OEM-only SM951 AHCI drive—which is based on the same UBX controller, albeit paired with planar NAND—tops out at 2150MB/s sequential reads and 1500MB/s sequential writes.
Random read performance on the 512GB 950 Pro is up to 300K IOPS (Input/Output Operations Per Second), with write speeds of up to 110K IOPS. Power tops out 5.7W on average, 7.0W in burst mode, and 1.7W at idle. The drive also features 512MB of DRAM memory, and support for 256-bit AES encryption. A future firmware update also promises to add TCG Opal support for Microsoft's eDrive standard.
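As a rough sanity check, those random-I/O figures can be converted into approximate throughput, assuming the 4 KiB block size conventionally used for IOPS ratings (an assumption; Samsung's test block size isn't stated in the article):

```python
# Convert an IOPS figure into approximate throughput, assuming each
# operation moves one block of the given size (4 KiB by default).
def iops_to_mibps(iops: int, block_kib: int = 4) -> float:
    """Approximate throughput in MiB/s for a given IOPS rating."""
    return iops * block_kib / 1024  # KiB/s -> MiB/s

read_tput = iops_to_mibps(300_000)   # 512GB 950 Pro random read
write_tput = iops_to_mibps(110_000)  # 512GB 950 Pro random write
print(f"random read:  ~{read_tput:.0f} MiB/s")
print(f"random write: ~{write_tput:.0f} MiB/s")
```

Under that assumption, 300K read IOPS works out to roughly 1.2 GB/s of random reads, i.e. about half the drive's 2500MB/s sequential read figure.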
Because the 950 Pro is a consumer drive—unlike the OEM-only SM951—Samsung is bundling it with its own proprietary NVMe driver, although it will also be compatible with the standard driver available for Windows 7 and up. Both capacities ship with a five-year limited warranty covering up to 200 TBW (terabytes written) for the 256GB model and 400 TBW for the 512GB model, which is strangely less than the 10-year warranty of the 850 Pro.
Posted By CybrSlydr @ 9:14 AM
Wednesday September 2nd, 2015
| Developer DICE confirmed the news via EA community manager Sledgehammer70 on the game's subreddit, stating that while a server browser will not be available, the game will use a "new skill-based matchmaking system" to match online players.
Dice could **** up a wet dream...
Posted By CybrSlydr @ 3:09 PM
Wednesday August 5th, 2015
| A large number of users invested in Intel-based platforms during the Core 2 Quad, Nehalem, and Sandy Bridge releases. Sandy Bridge was notable because it offered a large performance gain at stock speeds, and with a good processor anyone could reach 4.7 GHz or even higher using a good high-end cooler. Because of that, Intel has had a problem enticing these users to upgrade: their overclocked performance has been consistently matched by Ivy Bridge, Haswell, and Broadwell. For every 5% IPC increase, an average of 200 MHz was lost from the typical good overclock, and buyers would have to hunt down a good overclocking CPU all over again. There was no great reason, apart from chipset functionality, to upgrade.
That changes with Skylake.
From a clock-to-clock performance perspective, Skylake gives an average ~25% better performance in CPU based benchmarks, and when running both generations of processors at their stock speeds that increase jumps up to 37%. In specific tests, it is even higher. When you scale up to a 4.5 GHz Skylake against a 4.7 GHz Sandy Bridge, the 4% frequency difference is only a tiny portion of that. There are other added benefits, such as the move to a DDR4 memory topology that has denser memory modules, as well as PCIe storage and even PCIe 3.0 graphics connectivity.
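The overclocked comparison above is simple to check. A back-of-the-envelope sketch (my arithmetic, assuming performance scales linearly with clock speed, which real workloads only approximate):

```python
# ~25% clock-for-clock (IPC) gain for Skylake over Sandy Bridge,
# scaled by the clock deficit of a 4.5 GHz Skylake overclock versus
# a 4.7 GHz Sandy Bridge overclock.
ipc_gain = 1.25
clock_ratio = 4.5 / 4.7            # ~0.957, the "4% frequency difference"
net_gain = ipc_gain * clock_ratio  # ~1.20, i.e. ~20% faster overall

print(f"net overclocked gain: ~{(net_gain - 1) * 100:.0f}%")
```

So even giving up 200 MHz of overclocking headroom, a Skylake chip should land comfortably ahead of a well-tuned Sandy Bridge.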
As I said above, Skylake is not necessarily the most groundbreaking architecture over Haswell. It affords a 19% CPU performance gain over the i7-4770K and 5% over the i7-4790K. There is a minor issue with gaming performance that disappears when you use synthetics, but only to the tune of a couple of percentage points. For that minor hit, the combined package of processor, chipset, and DDR4 should intrigue Sandy Bridge (and older) owners.
Posted By chartiet @ 8:31 AM
Friday July 31st, 2015
| In February 2010, Google announced its Fiber project, designed to offer gigabit service with broadband speeds 100 times faster than what most Americans were receiving at the time. For several years, incumbent providers did not react with improved services, arguing instead that such services were too expensive and that consumers didn't care about getting a gig.
In the last year, however, as Google started to offer gigabit services at prices comparable to much slower offerings by cable and phone company providers and expanded Fiber's service territory, the incumbents and others have started to announce their own gigabit offerings at similar price points, setting the stage for a potential "game of gigs," in which tens of millions of Americans have the potential to receive faster, better and cheaper broadband.
There is no question that Google Fiber is a seminal development in the broadband market. The question is what is the lesson for policy?
Posted By CybrSlydr @ 9:41 PM
Monday July 27th, 2015
| By Seth G. Macy
Nvidia's next mid-range graphics card is rumored for debut on August 17. The GeForce GTX 950, as it's now known, will compete against AMD's Radeon R7 370.
According to TechSpot, the GTX 950 will have Nvidia's GM206 GPU, which first debuted in the GTX 960 back in January. However, 25% of the CUDA cores in the GM206 will be disabled for the GTX 950, giving it a core count of 768.
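The reported core count checks out against the full chip. A quick sketch (the 1,024-core figure for the full GM206 comes from the GTX 960, mentioned above):

```python
# The full GM206 die in the GTX 960 carries 1,024 CUDA cores; disabling
# 25% of them yields the 768 cores reported for the GTX 950.
gm206_cores = 1024
disabled_fraction = 0.25
gtx950_cores = int(gm206_cores * (1 - disabled_fraction))

print(f"GTX 950 CUDA cores: {gtx950_cores}")  # 768
```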
The reported clock speeds for the GTX 950 are between 1150 and 1250 MHz, with a boost clock speed between 1350 and 1450 MHz. This actually puts the speed of the 950 higher than the 960, which has a base clock speed of 1127 MHz and a boosted speed of 1178 MHz.
The card will come with 2GB GDDR5 memory, and the 90W TDP card is apparently powered by a single 8-pin PCIe connector.
The Radeon R7 370, the card the GTX 950 is aimed at, has 4GB of GDDR5 memory and a similar $150 price point. AMD unveiled its Radeon 300 series of cards at an event at E3.
Nvidia released its GTX 980 Ti earlier this year, a positive beast of a graphics card with a beastlier price tag than the reported 950, coming in at an MSRP of $650. The 950 should satisfy budget-conscious gamers who want to strike a balance between power and price.
Posted By CybrSlydr @ 4:49 PM
Thursday July 9th, 2015
| While Intel is finally getting its 14-nanometer chips out to the public, IBM today announced an even more impressive silicon breakthrough: the production of the first working 7nm chip. It's particularly impressive since it took years for chip makers like Intel to move from 22nm chips to 14nm, which offer better power efficiency and faster overall speeds thanks to their denser manufacturing. IBM's 7nm chip, produced together with partners including GlobalFoundries (which is taking over IBM's semiconductor business) and Samsung, will offer similar benefits, but the road to get there was vastly more complex than it was for 14nm chips. IBM says it's using silicon germanium in the electricity-conducting channels on the chip, as well as a new lithography method, dubbed extreme ultraviolet (EUV), to print finer circuits (which are around 10,000 times thinner than a human hair). Perhaps most intriguingly, it also keeps Moore's Law, the notion that computing power will double roughly every 18 months, alive for the next few years.
"The implications of our achievement are huge for the computer industry," wrote Mukesh Khare, IBM's VP of semiconductor technology research. "By making the chips inside computers more powerful and more efficient, IBM and our partners will be able to produce the next generations of servers and storage systems for cloud computing, big data analytics and cognitive computing."
Posted By CybrSlydr @ 3:43 PM
| Intel Compute Stick Review: Don’t Buy It
Who wants a cheap HDMI stick that can turn any TV into a full Windows computer? Everybody, right? That’s what we thought. Oh god were we wrong. When Intel announced the $150 Compute Stick at CES, we figured it could become the ultimate miniature PC for all kinds of people. Too bad it’s terrible.
Theoretically, there are loads of things you could do with a computer this tiny. You could work from it, of course, or browse the web from your couch. Watch Hulu without a subscription. Stream games from another computer. My editor Sean Hollister was excited to load Steam on it, plug in an Xbox 360 wireless adapter, and play lightweight games like Nidhogg with buddies on a big screen without lugging a console around. I was dubiously optimistic I could turn the Stick into a Kodi media streamer, accessing videos from my desktop PC over my home network.
Do some of these things work? Sure. But using this under-equipped PC is a giant pain in the ass—to the point that it’s probably not worth it.
What is it?
An attempt to cram an entire desktop computer into a tiny $150 HDMI dongle that you plug into any TV—one that doesn't quite stick the landing. Boy, it sure sounded good on paper, though: a quad-core 1.33GHz Intel Atom processor, 2GB of RAM, and 32GB of storage all in a compact, pocketable package? Yes please. The Compute Stick's tiny case even features a full-sized USB port and a MicroSD card reader. What's not to like? As it turns out, almost everything.
All you need is an app to make your phone a wireless keyboard/mouse.
Posted By maxgull @ 9:36 AM
Thursday July 2nd, 2015
| By Alex Osborn
Batman: Arkham Knight publisher Warner Bros. Interactive was apparently well aware of the technical issues that currently plague the PC version of Rocksteady Studios' trilogy-capper.
According to sources close to Kotaku, the publisher knew the PC version had problems months ago, but decided to ship the game anyway.
"I will say that it’s pretty rich for WB to act like they had no idea the game was in such a horrible state," an anonymous QA tester said. "It’s been like this for months and all the problems we see now were the exact same, unchanged, almost a year ago."
A few days ago, Warner Bros. decided to suspend sales of Arkham Knight on PC in response to an overwhelming number of reports from gamers having issues. We've reached out to Warner Bros. for comment regarding its alleged knowledge of these issues pre-release, and will update as soon as we receive a reply.
That said, the console version of Rocksteady's superhero game has been received incredibly well by critics, with IGN's Dan Stapleton praising the title in his review for its "excellent gameplay variety" and "detailed open world."
That's not good.
Posted By CybrSlydr @ 9:04 AM
Wednesday June 24th, 2015
| Last week at E3, AMD pulled the wraps off its new flagship GPU: the liquid-cooled Radeon R9 Fury X. It's the most powerful GPU AMD has ever created, as well as the company's first video card to feature a new architecture and a new type of memory. Even before its announcement, speculation ran wild about how it would compare to Nvidia's GTX 980 Ti, which is equally priced at $650.
When specs and AMD-provided performance numbers came out late last week, it appeared that the two cards were pretty closely matched—and after some extensive testing, we can confirm that both cards are indeed pretty darn close in performance.
Posted By CybrSlydr @ 11:42 PM
Tuesday June 16th, 2015
| The wait is finally over. A year and a half after the launch of the R200-series, months after Nvidia refreshed its entire GeForce lineup, AMD has lifted the veil off its new Radeon graphics card lineup: the Fury series, powered by revolutionary high-bandwidth memory (HBM), and the company’s brand new Fiji GPU.
Leading the charge is the AMD Radeon R9 Fury X, which is capable of driving Tomb Raider to 45 frames per second at 5K resolution and Ultra settings, and offers 1.5 times the performance per watt of AMD's previous R9 290X flagship. The graphics card boasts 4,096 stream processors—an incredible jump over the R9 290X's 2,816. The Radeon R9 Fury X packs 8.9 billion transistors, compared to the R9 290X's 6.3 billion. Given that AMD—like Nvidia—is still using the 28nm manufacturing process, the Fiji GPU itself must be massive.
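Putting those generational jumps side by side (straightforward ratios of the figures quoted above, nothing more):

```python
# Fury X versus R9 290X, using the numbers AMD has published.
sp_gain = 4096 / 2816          # ~1.45x the stream processors
transistor_gain = 8.9 / 6.3    # ~1.41x the transistor count

print(f"stream processors: {sp_gain:.2f}x")
print(f"transistors:       {transistor_gain:.2f}x")
```

The two ratios tracking each other so closely is what makes the on-die growth plausible: at the same 28nm node, ~41% more transistors means a substantially bigger die.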
Fury X has officially launched.
The R9 300 series will ship this coming Thursday.
Fury X Water Cooling Edition is shipping June 24th @ $649
Fury Air cooled is shipping July 14th @ $549
Fury X Specs
8.9 Billion Transistors
Fury Nano is due later this summer. Lisa Su stated 1.5x the performance per watt of the 290X for Fury X, and 2x the performance per watt for Fury Nano. Pics of Fury X, Nano, and dual Fury in attachments. VSR is coming as well, something that has been lacking from AMD for some time.
I think, surprisingly, I'm loving the form factor. I could make one hell of an mITX build with two Fury Nanos, all on water. It would make for the ultimate LAN PC or travel PC. Meanwhile, Quantum would be smaller than any console and as powerful as a full-size tower with a Fury X in it. Seems about the same size as the smallest Steam Machines.
Fury X industrial design
Fury X product overview
High-Bandwidth Memory with more than 3x the bandwidth per watt of GDDR5, along with a 4096-bit memory interface - the highest AMD GPU memory bandwidth ever.
Sleek industrial styling including GPU Tach activity indicator and LED illumination all in a compact 7.5-inch card that’s all about raw graphics power.
Powerful performance, bleeding-edge technology, liquid-cooled, whisper-quiet, and future-ready for extreme 4K and VR gaming.
Posted By Blackops_2 @ 2:28 PM
Thursday June 11th, 2015
| The virtual reality company, bought by Facebook last year for $2 billion, said Thursday it plans to launch its consumer Rift headset early next year in a partnership with Microsoft that will tie the Xbox One game console to the Oculus platform.
The final version of the Rift, shown today for the first time, will come with an Xbox One wireless controller, as well as a standing camera to track your head movements and whether you're standing and moving around the room or sitting down. Oculus is also working on hand controllers of its own, called Oculus Touch, that resemble small joysticks with looping rings around the base. That hardware is being designed to bring more realistic hand motions to VR worlds, letting people interact with the environment.
Posted By CybrSlydr @ 1:40 PM