Saturday November 21st, 2015
| “Hunting for distribution rights is essentially detective work,” says Marcin Paczyński, Head of Product at GOG. “Rights can repeatedly change hands or be split up between different parties, and it’s our job to get to the bottom of what happened.”
Preservation of old games involves more than just an extra patch. The journey from dusty, unplayable relic to polished, cross-platform installer is a minefield of technical and legal obstacles. The team at Good Old Games remains the industry leader in the restoration of classic PC games, tasked with reverse engineering code written more than 20 years ago, unraveling knotty licensing issues left behind by defunct development studios, and battling lethargy on the part of skeptical publishers. It's a thrilling and, at times, grueling process, but – as the GOG team will testify – it never fails to surprise.
Posted By CybrSlydr @ 9:07 AM
Sunday November 15th, 2015
| By Lewis Leong
What do you do if you want to play your PC games in your living room? You could move your PC into the living room, but if your PC weighs 50 lbs like mine, you’re going to have a bad time. You could also buy yourself a fancy Steam Machine, but they’re expensive and you’ll be limited to SteamOS’s game library, which is a fraction of Steam’s entire game library.
Enter the $50 Steam Link. This little box hooks up to your TV via HDMI and will stream games from your PC to your TV over your home network. There are three USB 2.0 ports for gaming peripherals like controllers or even a keyboard and mouse. There's also Bluetooth 4.0 for wireless accessories, and you can hook the Link up to your network via a 100 Mbit/s Ethernet connection or Wireless AC. It's strange that Valve left out Gigabit Ethernet, but the 100 Mbit/s connection works well.
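A quick back-of-the-envelope sketch of why 100 Mbit/s is enough here (my own rough numbers, not Valve's): raw 1080p60 video is enormous, but the stream is H.264-encoded, and even a generous encode fits comfortably inside the link.

```python
# Rough estimate of streaming bandwidth needs (illustrative numbers only).

def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in Mbit/s."""
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_bitrate_mbps(1920, 1080, 60)   # ~2986 Mbit/s uncompressed
encoded = raw / 100                      # H.264 manages roughly 100:1 on game video
print(f"raw: {raw:.0f} Mbit/s, encoded: ~{encoded:.0f} Mbit/s")
# Even a fat ~30 Mbit/s encode leaves plenty of headroom on a 100 Mbit/s link.
```

The compression ratio is an assumption for illustration; the point is simply that the encoded stream is orders of magnitude smaller than the raw frames, which is why Gigabit Ethernet isn't strictly necessary.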
What I like most about the Steam Link is its simplicity. Turn it on, choose your language, connect it to your network, enter a 4-digit code on your PC, and you’re ready to stream your games. You’ll see a list of computers on your network that are capable of streaming to the Steam Link and selecting one launches Big Picture Mode on that PC.
To start playing, you can plug in an Xbox controller (360 or One), a Logitech F710, or Valve's own Steam Controller. It's nice to be able to use the Steam Link with your existing peripherals.
Overall, setup is extremely simple and you can get started playing in a matter of minutes. However, I did have issues using the Steam Link on an older Sony Bravia 720p TV. The Steam Link interface was too big and created overscan issues. Parts of the interface couldn’t be shown since they were off-screen and I had to guess where the overscan settings were in the menu. Valve’s website states the Steam Link works with 720p televisions, so your mileage may vary.
Instead of relying on powerful onboard hardware to render your games, the Steam Link leans on your gaming PC or laptop to do the heavy lifting. This keeps the cost of the Steam Link down, but it also introduces some problems. Since the Steam Link simply mirrors what's happening on your PC, there are times when Steam Big Picture drops out and you see your desktop, leaving you with limited control. During some first game launches, I was met with login prompts, forcing me to walk to my computer, since the Link doesn't pull up a virtual keyboard for typing. If you have a dual- or triple-monitor setup, the Steam Link will show every desktop you have open, letterboxed with black bars on the top and bottom, so prompts may be impossible to read.
The NVIDIA Shield TV has a better game streaming interface, allowing you to zoom in and out of your desktop, but it also costs four times the price of the Link and only supports NVIDIA GPUs. The Shield TV does a lot more though, packing in TV, movies, music, and its GeForce Now game streaming service. The Steam Link, on the other hand, can only stream your games.
Video quality will depend on your home network. If you have a slower connection, expect to see compression and artifacting, as well as some input latency. If you have a quick network, graphics look great, but will never be as good as playing directly from your computer. I didn't mind the slight hit in graphics too much, though blacks did look consistently lighter than they should and there was some softness overall.
Streaming performance of the Steam Link, whether wired or wireless, is excellent. In my tests, games played without lag (for the most part) over my Wireless AC network. Enabling the Steam Link’s diagnostic tools showed a smooth 60 fps at 1080p using wireless and low latency. The wired Ethernet connection was even smoother and exhibited less latency.
I played a variety of games using the Steam Link and they all worked well after some initial launch issues (I had to get up and click on the install prompts). To test latency, I played Dirt Rally, a challenging racing game that requires fast reaction times. The game ran smoothly and registered all my inputs as if I were playing directly on my PC, with peak ping reaching 30 milliseconds. Audio is also pumped to the TV over the network and sounds good for the most part. There were times when the audio stuttered, but it didn't happen often.
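The kind of round-trip figure the Link's diagnostic overlay reports boils down to timestamping a packet, echoing it back, and timing the loop. A hypothetical sketch on loopback (this is not Valve's actual diagnostic protocol; the names and echo setup are mine):

```python
# Hypothetical round-trip latency measurement over UDP on loopback.
import socket
import threading
import time

def echo_once(sock):
    """Receive one datagram and echo it back to the sender."""
    data, addr = sock.recvfrom(64)
    sock.sendto(data, addr)

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # bind to any free port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
start = time.perf_counter()
client.sendto(b"ping", ("127.0.0.1", port))
client.recvfrom(64)                      # blocks until the echo comes back
rtt_ms = (time.perf_counter() - start) * 1000
print(f"round trip: {rtt_ms:.2f} ms")
```

On loopback this is sub-millisecond; a Wi-Fi hop plus encode/decode is what pushes a real streaming setup toward the 30 ms peaks seen in testing.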
Playing Grand Theft Auto V with the Steam Link and Xbox controller was like playing it on a console. The game’s gamepad support made it feel fluid and its graphics were good overall. With games that support controllers, the Steam Link provides a console-like experience. The only hint that you're playing from the PC is when you launch or quit a game, showing you a glimpse of your desktop at times.
The Steam Link is the simplest and most elegant solution for playing PC games in your living room. Sure, it has some bugs and will require you to get up to troubleshoot issues at times. But it works so well when playing that I can forgive most of its flaws.
If you already have a gaming PC, pass on buying the much more expensive Steam Machines. The $50 Steam Link will let you game on the couch and also gives you full access to Steam’s game library, unlike SteamOS.
Valve will continue developing the Steam Link and it will only get better in the future.
Posted By CybrSlydr @ 7:10 PM
Tuesday October 6th, 2015
| After weeks of leaks, Microsoft is officially unveiling the Lumia 950 today. It has been 18 months since the last flagship Windows phone was announced, and the new Lumia 950 is here to help launch Windows 10 Mobile. The Lumia 950 will be available starting in November for $549.
Microsoft has opted for a 5.2-inch WQHD (1440 x 2560) OLED display on the Lumia 950, coupled with 3GB of RAM and a Snapdragon 808 processor. That makes it one of the more powerful Windows phones we've seen to date, and the first to make use of Qualcomm's latest processors.
Also, it has liquid cooling. Seriously.
Source: The Verge
Posted By CybrSlydr @ 10:22 AM
Friday October 2nd, 2015
| By Mitch Dyer
Update: Microsoft has provided official comment to IGN on numerous subjects.
Havok will not be limited to Microsoft exclusively. “We will continue to license Havok’s technology to the broad AAA games industry," a representative told IGN. "This also means that we will continue to license Havok’s technology to run across various game consoles including Sony and Nintendo.”
On the subject of what Microsoft paid for the acquisition, we were told, “We are not discussing financial details at this time.”
Finally, Microsoft told IGN, "We are working closely with the team at Havok to ensure a smooth transition, but have nothing further to announce at this time.”
Havok, the company known for its fantastic physics in video games, has been acquired by Microsoft. The Xbox publisher purchased Havok from Intel. Microsoft explained that this is part of "building the most complete cloud service."
"Microsoft’s acquisition of Havok continues our tradition of empowering developers by providing them with the tools to unleash their creativity to the world," the company's corporate article reads. "We will continue to innovate for the benefit of development partners. Part of this innovation will include building the most complete cloud service, which we’ve just started to show through games like 'Crackdown 3.'"
Havok was used recently for Destiny, Mortal Kombat X, Dark Souls 2, and Watch Dogs.
Microsoft did not announce if Havok will continue to provide physics for third-party games, nor how much the acquisition cost.
Posted By CybrSlydr @ 5:55 PM
Tuesday September 22nd, 2015
| Office is 25 years old this year. Some of the individual components are older, but it was 1990 that Microsoft first released a combined Office bundle, containing Word 1.1, Excel 2, and PowerPoint 2. Through the peculiar quirks of Microsoft's versioning scheme, today marks the 17th release, version number 16.0, branded Office 2016.
You can tell that you're using Office 2016 and not its predecessor, Office 2013, because by default Office 2016 colorizes the title bar of each app to reflect that application's distinctive color (except for Outlook, which remains distressingly blue after Microsoft inexplicably decided it should no longer be gold). The colored title bars are optional, and if you prefer the more muted look you can disable them. This leaves you with something that looks like Office 2013 with only a few minor variations.
Desktop productivity applications haven't really changed much in many years. Office 2007 shook up the interface in a big way with the ribbon, and Office 2010 refined it with the introduction of the "backstage" view used for saving, opening, and printing documents. But even the ribbon didn't change the basic structure of the apps, or the way they interoperate with one another. One feels that there hasn't really been anything new in how these core word processing, spreadsheet, presentation, and e-mail/calendaring apps work since the days of the (long forgotten) Lotus Improv, which did at least try to offer an alternative to the formulae-in-a-grid-of-cells approach that defines spreadsheets today.
Office 2016 doesn't do anything to buck that trend.
Posted By CybrSlydr @ 9:21 AM
| In what is one of the most Goldblum-like moments of the year so far, Nvidia has partnered with OEMs like Asus and MSI to cram the full desktop version of its high-end GTX 980 graphics card into laptops. Thanks to its full array of 2048 CUDA cores, up to 8GB of 7GHz GDDR5 memory, and 1126MHz core clock, Nvidia claims the new laptop GTX 980 offers around a 30 percent performance boost over its previous flagship laptop GPU, the GTX 980M.
Even crazier, Nvidia has also managed to convince OEMs to let users overclock the GTX 980 too. Coupled with Intel's upcoming unlocked K-series Skylake laptop CPUs, users will be able to eke out a significant amount of extra performance from their laptops, cooling permitting. To help things along, Nvidia's laptop GTX 980s will differ slightly from their desktop counterparts in that they'll be binned for improved leakage and power consumption.
Nvidia says the binning process will ensure each laptop GTX 980 is guaranteed to hit the advertised 1126MHz GPU core clock and 1216MHz boost clock, as well as achieve overclocks somewhere in the region of 200MHz. That's a modest increase over the stock clock, but given the thermal constraints of a notebook chassis it's still rather impressive. To hit those overclocked speeds, users will be able to tweak the GPU's fan curve (a first for laptops), as well as adjust the core clock and memory speeds.
However, users will be limited to a fan speed offset set by the notebook manufacturer. The overall power target as well as voltage control will also continue to be locked down. Other features of the laptop GTX 980 include between four- and eight-phase power supplies for better, cleaner power delivery, as well as support for three-panel surround gaming. Some OEMs are equipping their notebooks with three discrete outputs, although others will work via DisplayPort daisy chaining.
Naturally, cramming a 165W GPU into a laptop chassis does come with some compromises. For starters, all the notebooks available at launch feature a 17-inch or larger screen, which—when coupled with the gargantuan external power supplies they require—means they're not exactly something you want to carry around with you too often. All the launch models also come with only 1080p displays, albeit displays that support Nvidia's variable refresh rate technology, G-Sync.
That's disappointing given the sheer graphics grunt of the GTX 980, which is more than capable of pushing 1440p or 4K visuals at high settings. That said, Asus has teased that some of its upcoming 17-inch gaming notebooks will feature a 4K option. The company's madcap watercooled GX700VO—which is just as big and outrageous in person as you'd imagine—will feature a GTX 980 when it launches sometime in November. The rest of the laptop line-up, including a table-burning 18.4-inch SLI model from MSI, will launch later this month. Pricing is to be confirmed, but expect it to be very high indeed (probably £2,000 or more).
While the practicality of putting a GTX 980 into a notebook is questionable, the fact that Nvidia convinced OEMs to design notebooks with better cooling and power in order to make it happen is impressive—getting them to allow overclocking, even more so. Nvidia's senior product manager for GeForce notebooks, Brian Choi, told Ars that overclocking in particular has been "difficult for [Nvidia] to encourage OEMs to do."
"When we develop a desktop GPU part we control the GPU temperature, we control the fan, we control everything, and we can expose that to the overclocker very easily," continued Choi. "In a notebook environment, OEMs like Asus and MSI are responsible for the safety and reliability and longevity of the entire system. So they don't really want to give away control of cooling, because that can affect the warranty, and stability of their brand."
As for whether anyone actually wants a desktop GPU in their laptop, along with all the compromises in size, battery life, and noise that brings, Choi was optimistic.
"We're not going for the mainstream guy who's looking for something thin and light with a desktop-class CPU and GPU. God knows I would love to do that one day, I think we all do. But physics is physics, and the fact that we're able to get a great flagship GTX 980 into a notebook is a real milestone," said Choi.
"The industry has sort of been kept in a time loop, because no one pushed [OEMs]. I consider it similar to the four-minute mile. No one thought you could break the four-minute mile, but as soon as someone did everyone was piling in. No one thought you could make a thin gaming-class notebook until Razer did it. After Razer did it, everyone figured out it wasn't that hard and started piling in. In this case, we're telling the industry to try harder, to make a desktop-class enthusiast notebook and to not be shy about it because people want this."
Posted By CybrSlydr @ 9:18 AM
| The Samsung 950 Pro SSD—the follow-up to the legendary Samsung 850 Pro SSD—has been unveiled by the company at its annual SSD summit in Seoul, Korea. The 950 Pro will be available at retail in October, with MSRPs of $199.99 (probably ~£150) for the 256GB version and $349.99 (~£280) for the 512GB version. UK pricing is yet to be confirmed.
Based on Samsung's V-NAND technology and available in 512GB and 256GB capacities, the 950 Pro shuns the common 2.5-inch form factor and SATA interface for cutting-edge M.2 2280 and PCIe 3.0 x4. It also makes use of the Non-Volatile Memory Host Controller Interface, better known as NVMe.
Most SSDs still make use of the AHCI (Advanced Host Controller Interface) architecture, which was originally developed for spinning platter SATA hard drives back in 2004. While AHCI works fine for traditional hard drives, it was never designed for low latency NAND chips. As flash speeds have increased, AHCI has become a performance bottleneck. NVMe exploits both the PCIe bus and NAND flash memory to offer higher performance and lower latency.
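The queue arithmetic behind that bottleneck claim makes the gap concrete (figures from the AHCI and NVMe specifications, not from this article): AHCI offers a single command queue 32 commands deep, while NVMe allows up to 65,535 I/O queues with up to 65,536 commands each.

```python
# Maximum outstanding commands: AHCI vs NVMe (per their specifications).
ahci_outstanding = 1 * 32               # one queue, 32 commands deep
nvme_outstanding = 65_535 * 65_536      # up to 64K queues, 64K commands each

print(ahci_outstanding)   # 32
print(nvme_outstanding)   # 4294901760
```

That eight-orders-of-magnitude difference in parallelism is why NVMe, paired with four PCIe 3.0 lanes instead of a single SATA link, stops being the bottleneck for fast NAND.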
In the case of the 512GB Samsung 950 Pro, the combination of NVMe, speedy V-NAND chips, and a triple core, eight-channel UBX controller has resulted in some eye-popping performance. Sequential read speeds top out at 2500MB/s, while sequential writes hit 1500MB/s. By comparison, Samsung's OEM-only SM951 AHCI drive—which is based on the same UBX controller, albeit paired with planar NAND—tops out at 2150MB/s sequential reads and 1500MB/s sequential writes.
Random read performance on the 512GB 950 Pro is up to 300K IOPS (Input/Output Operations Per Second), with write speeds of up to 110K IOPS. Power consumption tops out at 5.7W on average, 7.0W in burst mode, and 1.7W at idle. The drive also features 512MB of DRAM memory, and support for 256-bit AES encryption. A future firmware update also promises to add TCG Opal support for Microsoft's eDrive standard.
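To relate those IOPS figures back to bandwidth, assuming the usual 4 KiB transfer size for random I/O benchmarks (the article doesn't state the block size Samsung used):

```python
# Convert random IOPS to throughput, assuming 4 KiB transfers.
def iops_to_mb_per_s(iops, block_bytes=4096):
    """Throughput in MB/s for a given IOPS rate and block size."""
    return iops * block_bytes / 1e6

print(iops_to_mb_per_s(300_000))  # 1228.8 -> ~1.2 GB/s of random reads
print(iops_to_mb_per_s(110_000))  # 450.56 -> ~450 MB/s of random writes
```

In other words, even the drive's random workloads would saturate a SATA III link; the PCIe 3.0 x4 interface is what makes these numbers usable.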
Because the 950 Pro is a consumer drive—unlike the OEM-only SM951—Samsung is bundling it with its own proprietary NVMe driver, although it will also be compatible with the standard driver available for Windows 7 and up. Both capacities ship with a five-year limited warranty covering up to 200 TBW (terabytes written) for the 256GB version and 400 TBW for the 512GB version, which is strangely less than the 10-year warranty of the 850 Pro.
Posted By CybrSlydr @ 9:14 AM
Wednesday September 2nd, 2015
| Developer DICE confirmed the news via EA's community manager, Sledgehammer70, on the game's subreddit, stating that while a server browser will not be available, the game will use a "new skill-based matchmaking system" to match online players.
Dice could **** up a wet dream...
Posted By CybrSlydr @ 3:09 PM
Wednesday August 5th, 2015
| A large number of users invested in Intel-based platforms during the Core 2 Quad, Nehalem, and Sandy Bridge releases. Sandy Bridge was notable because it delivered a large performance gain at stock speeds, and with a good processor anyone could reach 4.7 GHz or even higher using a good high-end cooler. Since then, Intel has had a problem enticing these users to upgrade, because their performance has been consistently matched by Ivy Bridge, Haswell, and Broadwell: for every 5% IPC increase from the CPU, an average of 200 MHz was lost on the attainable overclock, and users would have to hunt for a good overclocking chip all over again. Apart from chipset functionality, there was no great reason to upgrade.
That changes with Skylake.
From a clock-to-clock performance perspective, Skylake gives an average ~25% better performance in CPU based benchmarks, and when running both generations of processors at their stock speeds that increase jumps up to 37%. In specific tests, it is even higher. When you scale up to a 4.5 GHz Skylake against a 4.7 GHz Sandy Bridge, the 4% frequency difference is only a tiny portion of that. There are other added benefits, such as the move to a DDR4 memory topology that has denser memory modules, as well as PCIe storage and even PCIe 3.0 graphics connectivity.
| As I said above, Skylake is not necessarily the most groundbreaking architecture over Haswell. It affords a 19% CPU performance gain over the i7-4770K and 5% over the i7-4790K. There is a minor deficit in gaming that disappears when you use synthetics, but only to the tune of a couple of percentage points. For that minor hit, the combined package of processor, chipset, and DDR4 should intrigue Sandy Bridge (and older) owners.
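The upgrade math above follows from a simple model in which performance scales with IPC times clock speed. The percentages are the article's; the model is a deliberate simplification that ignores memory and platform differences:

```python
# Simplified performance model: relative perf = (1 + IPC gain) * clock ratio.
def relative_perf(ipc_gain, clock_ghz, base_clock_ghz):
    """Performance relative to the baseline chip at base_clock_ghz."""
    return (1 + ipc_gain) * (clock_ghz / base_clock_ghz)

# Skylake at 4.5 GHz with ~25% higher IPC vs a 4.7 GHz overclocked Sandy Bridge:
print(f"{relative_perf(0.25, 4.5, 4.7):.2f}x")  # ~1.20x
```

Even conceding the ~4% frequency deficit, the IPC advantage leaves an overclocked Skylake chip roughly 20% ahead of an overclocked Sandy Bridge, which is the crux of the upgrade argument.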
Posted By chartiet @ 8:31 AM
Friday July 31st, 2015
| In February 2010, Google announced its Fiber project, designed to offer a gigabit service with broadband speeds 100 times faster than what most Americans were receiving at the time. For several years, incumbent providers did not react with improved services, saying instead that such services were too expensive and that consumers didn't care about getting a gig.
In the last year, however, as Google started to offer gigabit services at prices comparable to much slower offerings by cable and phone company providers and expanded Fiber's service territory, the incumbents and others have started to announce their own gigabit offerings at similar price points, setting the stage for a potential "game of gigs," in which tens of millions of Americans have the potential to receive faster, better and cheaper broadband.
There is no question that Google Fiber is a seminal development in the broadband market. The question is what is the lesson for policy?
Posted By CybrSlydr @ 9:41 PM
Monday July 27th, 2015
| By Seth G. Macy
Nvidia's next mid-range graphics card is rumored for an August 17 debut. The GeForce GTX 950, as it's now known, will compete against AMD's Radeon R7 370.
According to TechSpot, the GTX 950 will have Nvidia's GM206 GPU, which first debuted in the GTX 960 back in January. However, 25% of the CUDA cores in the GM206 will be disabled for the GTX 950, giving it a core count of 768.
The reported clock speeds for the GTX 950 are between 1150 and 1250 MHz, with a boost clock speed between 1350 and 1450 MHz. This actually puts the speed of the 950 higher than the 960, which has a base clock speed of 1127 MHz and a boosted speed of 1178 MHz.
The card will come with 2GB GDDR5 memory, and the 90W TDP card is apparently powered by a single 8-pin PCIe connector.
The Radeon R7 370, the card the GTX 950 is aimed at, has 4GB of GDDR5 memory and a similar $150 price point. AMD unveiled its Radeon 300 series of cards at an event at E3.
Nvidia released its GTX 980 Ti earlier this year, a positive beast of a graphics card with a beastlier price tag than the reported 950, coming in at an MSRP of $650. The 950 should satisfy budget-conscious gamers who want to strike a balance between power and price.
Posted By CybrSlydr @ 4:49 PM
Thursday July 9th, 2015
| While Intel is finally getting its 14-nanometer chips out to the public, IBM today announced an even more impressive silicon breakthrough: the production of the first working 7nm chip. It's particularly impressive since it took years for chip makers like Intel to move from 22nm chips to 14nm, which offer better power efficiency and faster overall speeds thanks to their denser manufacturing. IBM's 7nm chip, produced together with partners including GlobalFoundries (which is taking over IBM's semiconductor business) and Samsung, will offer similar benefits, but the road to get there was vastly more complex than it was for 14nm chips. IBM says it's using silicon germanium in the electricity-conducting channels on the chip, as well as a new lithography method, dubbed Extreme Ultraviolet, to print finer circuits (which are around 10,000 times thinner than a human hair). Perhaps most intriguingly, it also keeps Moore's Law, the notion that computing power will double roughly every 18 months, alive for the next few years.
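The density payoff of a node shrink can be sketched with simple geometry: features shrink in two dimensions, so idealized transistor density scales with the square of the shrink. (Real process nodes don't scale this cleanly, and node names are partly marketing, so treat this as an upper bound.)

```python
# Idealized transistor density gain from a process node shrink.
def density_gain(old_nm, new_nm):
    """Area scales with feature size squared, so density does too."""
    return (old_nm / new_nm) ** 2

print(f"22nm -> 14nm: {density_gain(22, 14):.2f}x")  # ~2.47x
print(f"14nm -> 7nm:  {density_gain(14, 7):.2f}x")   # 4.00x
```

Halving the feature size quadruples ideal density, which is why a working 7nm part is enough to keep the Moore's Law cadence plausible for a few more generations.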
"The implications of our achievement are huge for the computer industry," wrote Mukesh Khare, IBM's VP of semiconductor technology research. "By making the chips inside computers more powerful and more efficient, IBM and our partners will be able to produce the next generations of servers and storage systems for cloud computing, big data analytics and cognitive computing."
Posted By CybrSlydr @ 3:43 PM
| Intel Compute Stick Review: Don’t Buy It
Who wants a cheap HDMI stick that can turn any TV into a full Windows computer? Everybody, right? That’s what we thought. Oh god were we wrong. When Intel announced the $150 Compute Stick at CES, we figured it could become the ultimate miniature PC for all kinds of people. Too bad it’s terrible.
Theoretically, there are loads of things you could do with a computer this tiny. You could work from it, of course, or browse the web from your couch. Watch Hulu without a subscription. Stream games from another computer. My editor Sean Hollister was excited to load Steam on it, plug in an Xbox 360 wireless adapter, and play lightweight games like Nidhogg with buddies on a big screen without lugging a console around. I was cautiously optimistic I could turn the Stick into a Kodi media streamer, accessing videos from my desktop PC over my home network.
Do some of these things work? Sure. But using this under-equipped PC is a giant pain in the ass—to the point that it’s probably not worth it.
What is it?
An attempt to cram an entire desktop computer into a tiny $150 HDMI-dongle that you plug into any TV—that doesn’t quite stick the landing. Boy, it sure sounded good on paper, though: a quad-core 1.33GHz Intel Atom processor, 2GB of RAM, and 32GB of storage all in a compact, pocketable package? Yes please. The Compute Stick’s tiny case even features a full-sized USB port and a MicroSD card reader. What’s not to like? As it turns out, almost everything.
All you need is an app to make your phone a wireless keyboard/mouse.
Posted By maxgull @ 9:36 AM
Thursday July 2nd, 2015
| By Alex Osborn
Batman: Arkham Knight publisher Warner Bros. Interactive was apparently well aware of the technical issues that currently plague the PC version of Rocksteady Studio's trilogy-capper.
According to sources close to Kotaku, the publisher knew the PC version had problems months ago, but decided to ship the game anyway.
"I will say that it’s pretty rich for WB to act like they had no idea the game was in such a horrible state," an anonymous QA tester said. "It’s been like this for months and all the problems we see now were the exact same, unchanged, almost a year ago."
A few days ago, Warner Bros. decided to suspend sales of Arkham Knight on PC in response to an overwhelming number of reports from gamers having issues. We've reached out to Warner Bros. for comment regarding its alleged knowledge of these issues pre-release, and will update as soon as we receive a reply.
That said, the console version of Rocksteady's superhero game has been received incredibly well by critics, with IGN's Dan Stapleton praising the title in his review for its "excellent gameplay variety" and "detailed open world."
That's not good.
Posted By CybrSlydr @ 9:04 AM
Wednesday June 24th, 2015
| Last week at E3, AMD pulled the wraps off its new flagship GPU: the liquid-cooled Radeon R9 Fury X. It’s the most powerful GPU AMD has ever created, as well as the company’s first video card to feature new architecture and a new type of memory. Even before its announcement, speculation ran wild about how it’d compare to Nvidia’s GTX 980 Ti, which is equally priced at $650.
When specs and AMD-provided performance numbers came out late last week, it appeared that the two cards were pretty closely matched—and after some extensive testing, we can confirm that both cards are indeed pretty darn close in performance.
Posted By CybrSlydr @ 11:42 PM