Sunday August 31st, 2014
| The leader of one of the most well-known hardware review sites is retiring. Best wishes to him, and thanks for the years of informative articles.
On April 26, 1997, armed with very little actual knowledge, I began to share what I had with the world on a little Geocities site named Anand’s Hardware Tech Page. Most of what I knew was wrong or poorly understood, but I was 14 years old at the time. Little did I know that I had nearly two decades ahead of me to fill in the blanks. I liked the idea of sharing knowledge online and the thought of building a resource where everyone who was interested in tech could find something helpful.
But after 17.5 years of digging, testing, analyzing and writing about the most interesting stuff in tech, it’s time for a change. This will be the last thing I write on AnandTech as I am officially retiring from the tech publishing world. Ryan Smith (@RyanSmithAT) is taking over as Editor in Chief of AnandTech.
Full Article @ Anandtech
Posted By WiCKeD @ 12:30 PM
Tuesday August 26th, 2014
| We're sorry to break the bad news, but that 5TB hard drive you bought last week? Yeah, it's already obsolete. Seagate has started shipping the first-ever 8TB desktop hard disk, doubling the 4TB capacities that seemed huge just a couple of years ago. If it's any consolation, though, this machinery isn't ready to go inside your hot gaming PC. Right now, all those terabytes are destined for data centers where capacity trumps every other concern; Seagate isn't mentioning prices, but enterprise-class storage is rarely cheap. You may want to set aside some money all the same. These extra-roomy drives have a tendency to filter down to the mainstream pretty quickly, so you may soon have more free disk space than you know what to do with... at least, for a little while.
Source : Engadget
Posted By Prozium @ 10:12 PM
| August 22, 2014, 1:44 AM — Many concepts of computing have moved to the cloud, but gaming has not been one of them. Even with the fastest pipe into your home, latency is inevitable, and who wants to die in a "Call of Duty" deathmatch because of lag? We get enough of that as it is with the software loaded on our PCs.
Cloud-based gaming would also help overcome the problem of aging console hardware, because it would require just a thin client to display the game rather than hefty hardware to render it. Displaying video is a lot easier and less system-intensive than rendering each frame. Given how underpowered the Xbox One is, cloud-based rendering would help overcome its shortcomings.
But how do you get the rendered frames down the pipe to the gamer quickly? Microsoft Research may have a solution in a project called DeLorean. In a nutshell, it renders frames before an event occurs in the game based on a number of variables; the correct set of frames is then sent down to your device.
A recently published white paper from Microsoft lays out the concept and solution. Microsoft notes that through cloud gaming, people could enjoy high-end graphics without needing a high-end GPU. However, cloud gaming is hindered by latency; even delays as low as 60ms degrade the experience.
Microsoft calls its solution "speculative execution." It uses future input prediction, which is predictable based on player behavior, along with speculation of multiple outcomes and error compensation. Microsoft also came up with a new form of bandwidth compression that uses the speculation component to take advantage of the frames being similar from one to the next.
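As a rough illustration of the idea (a toy sketch only, not Microsoft's actual implementation; the inputs, state, and "renderer" here are invented for the example), the server speculatively renders frames for the most likely next inputs, and the client picks whichever frame matches what the player actually pressed:

```python
# Toy sketch of speculative frame delivery: the server renders frames for
# several predicted inputs ahead of time; the client uses the pre-rendered
# frame on a hit, hiding the network round trip.

def render(state, inp):
    # Stand-in for a real renderer: a "frame" is just a string here.
    return f"frame(state={state}, input={inp})"

def speculate(state, candidate_inputs):
    # Server side: render one frame per plausible next input.
    return {inp: render(state, inp) for inp in candidate_inputs}

def resolve(speculative_frames, actual_input, state):
    # Client side: on a speculation miss, the system would fall back to
    # error compensation (modelled here as a fresh, latency-bound render).
    if actual_input in speculative_frames:
        return speculative_frames[actual_input], True
    return render(state, actual_input), False

frames = speculate(state=0, candidate_inputs=["left", "right", "fire"])
frame, hit = resolve(frames, "fire", state=0)
print(hit)    # True: the frame was ready before the input even arrived
frame2, hit2 = resolve(frames, "jump", state=0)
print(hit2)   # False: unpredicted input, fall back to normal rendering
```

The bandwidth-compression piece exploits the fact that the speculative frames for one moment are nearly identical to each other, so they compress well together.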
With this, Microsoft was able to achieve playable cloud-based versions of "Doom 3" and "Fable 3," both framerate-intensive games, which ran well on thin clients despite latency of over 250ms. Microsoft found players preferred DeLorean over traditional thin clients, and that DeLorean can successfully mimic a low-latency network.
So when will we see it? As with other Microsoft Research projects, there is no release date; this is still a lab experiment. But it could herald a day when gaming, like Salesforce's CRM, is a SaaS experience rather than 5-10GB on your hard drive.
Posted By CybrSlydr @ 11:05 AM
Monday August 25th, 2014
| As part of its 30 Years of Graphics celebration, AMD today announced a forthcoming addition to the Radeon R9 200 graphics card lineup.
Launching on September 2nd will be the company’s new midrange enthusiast card, the Radeon R9 285.
The R9 285 will take up an interesting position in AMD's lineup, being something of a refresh of a refresh that spans all the way back to Tahiti (Radeon 7970). Spec-wise it ends up extremely close on paper to the R9 280 (née 7950B), and it's telling that the R9 280 is no longer advertised by AMD as a current member of the R9 lineup. However, with a newer GPU under the hood the R9 285 stands to eclipse the 280 in features, and with sufficient efficiency gains we hope to see it eclipse the 280 in performance too.
Finally, coinciding with the launch of the R9 285 will be a refresh of AMD's Never Settle bundles. The details are still murky at this time, but AMD is launching what it calls the Never Settle Space Edition bundle, which will see Alien Isolation and Star Citizen bundled with all R9 series cards. What's unclear is whether this replaces the existing Never Settle Forever bundle, or whether these games are being added to the Never Settle Forever lineup in some fashion. AMD has said that current Silver and Gold voucher holders will be able to get the Space Edition bundle with their vouchers, which lends credence to the idea that these are new games in the NSF program rather than a different program entirely.
Both Alien Isolation and Star Citizen are still in development. Alien Isolation is a first-person survival horror game expected in October of this year. Meanwhile, the space sim Star Citizen does not yet have a release date, and as best as we can tell won't actually be finished until late 2015 at the earliest. In that case the inclusion here is more about access to the ongoing beta, which is the first time we've seen beta access used as part of a bundle in this fashion.
Posted By CybrSlydr @ 4:46 PM
Thursday August 21st, 2014
| AMD is preparing to announce three new FX processors on September 1, 2014: the FX-8370, FX-8370E and FX-8320E, reports X-bit labs. AMD is also expected to lower pricing on older FX processors, and there might be some new chipsets as well.
****it - I was just coming here to post this. lol
The 95W FX's look somewhat interesting, considering you get a higher clock at a lower TDP (and hopefully less heat, which was always a detriment for me). I wonder how they'll OC?
Posted By chartiet @ 9:59 AM
Friday August 15th, 2014
| Last Tuesday Microsoft issued its August security and bug-fix updates; unfortunately, they render Windows 8.1 completely un-bootable for a lot of end users, who end up with a black failure screen. The issue resides in two of the updated files, and the Microsoft support forum is raining complaints about the so-called August update. Read more after the break.
Users that have a system restore point can return the OS to its pre-update state and get Windows going again. Those with System Restore disabled are in a world of hurt and might have to resort to a system OS backup, or revert to a clean install. Either way, make sure you uninstall the following updates: KB2982791 and the optional update KB2975719, as these two are responsible for all this.
Posted By chartiet @ 11:32 AM
Saturday August 2nd, 2014
| Thought all you needed to get a 4K TV working is HDMI 2.0? Guess again. The next generation of content protection is called HDCP 2.2, and not only is it not backwards compatible, many new 4K devices don't even support it.
So it's possible that the 4K TV you bought last year, or even the receiver you buy this year, might not be able to receive/pass all future 4K content.
Sound crazy? Sadly, it's not. Here's the skinny.
What it is
Copy protection/content protection has been around since the VHS era, as anyone who tried to copy a Blockbuster rental can tell you. Back then it was called Macrovision, which evolved into CSS for DVD and finally HDCP, which stands for High-bandwidth Digital Content Protection, for Blu-ray players and HDTV devices like satellite and cable boxes.
HDCP 2.2 is the latest evolution of copy protection. It's designed to create a secure connection between a source and a display. Ostensibly this is so you can't take the output from a source (a Blu-ray player, say) and plug it into some kind of recorder, to make a copy of the content. DRM, the encryption of the content itself, is a separate issue. HDCP doesn't care what goes across the cable, as long as that cable is secure.
It does this by creating encrypted keys between the source and the display (called the sink). Enabled repeaters, like receivers, can be in the chain as well. The source and the sink need to be in agreement, understanding their keys, or no content gets transferred. If you've ever hooked up gear and gotten a blank screen (or turned on gear in the wrong order and gotten a blank screen), this HDCP "handshake" is usually the issue.
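The handshake can be pictured with a toy sketch (an illustration of the general idea only; real HDCP uses certified device keys and a specified key-exchange protocol, none of which appears here, and the secrets and nonce below are invented for the example):

```python
# Toy model of an HDCP-style handshake: source and sink each derive a
# session key from their device secret; content flows only if the two
# derived keys match -- otherwise the sink shows a blank screen.
import hashlib
import hmac

def derive_key(device_secret: bytes, nonce: bytes) -> bytes:
    return hmac.new(device_secret, nonce, hashlib.sha256).digest()

def handshake(source_secret: bytes, sink_secret: bytes) -> bool:
    nonce = b"session-nonce"  # in reality, freshly generated per connection
    return hmac.compare_digest(derive_key(source_secret, nonce),
                               derive_key(sink_secret, nonce))

# Matching keys: the handshake succeeds and content is sent.
print(handshake(b"licensed-key", b"licensed-key"))   # True
# Mismatched keys: no agreement, no picture.
print(handshake(b"licensed-key", b"unlicensed"))     # False
```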
HDCP isn't solely over HDMI. It can be implemented to work over DVI, DisplayPort, USB, and more.
So what's new? The encryption on the keys in version 2.2 is more advanced than in previous versions, which, in theory, makes the whole chain harder to break. One other interesting change with 2.2 is a "locality check." The source sends a signal to the sink, and if the sink doesn't get that signal within 20ms, the source kills the connection. In theory, this shouldn't cause any issues in home setups, even over long HDMI runs (unless you have more than 3,740 miles of cable).
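That mileage figure follows from simple propagation arithmetic. A quick back-of-the-envelope check (assuming the signal travels at the speed of light in vacuum, which overstates what a real copper cable achieves):

```python
# How far can a signal travel within the 20ms locality-check window?
SPEED_OF_LIGHT_MPS = 299_792_458   # metres per second, in vacuum
WINDOW_S = 0.020                   # 20ms locality-check timeout
METRES_PER_MILE = 1609.344

max_metres = SPEED_OF_LIGHT_MPS * WINDOW_S
max_miles = max_metres / METRES_PER_MILE
print(round(max_miles))  # ~3726 miles at c; real cables propagate slower
```

Signals in copper travel at roughly two-thirds of c, so the practical limit is even shorter, but either way no home run comes close to tripping the check.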
See Source for rest of article.
Posted By CybrSlydr @ 9:54 PM
Tuesday July 29th, 2014
| Portable electronics are very convenient except for one thing: their battery life is horrible. No matter the capacity of their battery, they all require frequent recharging throughout the day and have a limited lifespan of between 400 and 1,200 charge cycles. Eventually, these batteries all die and need to be replaced with new (and often expensive) versions that will also meet the same fate. Still, these types of batteries are the best available and have also become a popular choice for electric vehicles, aerospace applications and even military projects.
These batteries are called lithium-ion (li-ion) and became the industry standard for consumer electronics in the early 1990s. For 25 years we have used them to power our cell phones, laptops and most gadgets that need to function without being plugged in all the time. But future applications in portable electricity will soon demand higher energy storage density and something will have to replace traditional li-ion batteries because they simply won’t be powerful enough.
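Those cycle counts translate directly into lifespan. Assuming one full charge cycle per day (an assumption for illustration; real usage varies widely), the arithmetic looks like this:

```python
# Rough lifespan estimate from cycle counts, assuming one full cycle/day.
CYCLES_LOW, CYCLES_HIGH = 400, 1200
CYCLES_PER_YEAR = 365  # one full charge/discharge cycle per day

years_low = CYCLES_LOW / CYCLES_PER_YEAR
years_high = CYCLES_HIGH / CYCLES_PER_YEAR
print(f"{years_low:.1f} to {years_high:.1f} years")  # 1.1 to 3.3 years
```

That range lines up with everyday experience: a heavily cycled phone battery fades noticeably after a year or two.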
Advantages of Li-ion Batteries
- High energy density with potential for higher capacities
- Don't require prolonged priming when new
- Relatively low self-discharge rate
- Low maintenance
- Specialty cells can provide high current to many different types of applications

Disadvantages of Li-ion Batteries
- Require a protection circuit to maintain voltage
- Subject to aging, even when not in use
- Must be stored in a cool place to reduce the aging effect
- Transportation restrictions
- Expensive to manufacture

Many scientists have focused their research efforts on high-capacity electrode materials that use silicon and tin as anodes, and sulfur and oxygen as cathodes. But pure lithium metal is still the optimum choice because it has the highest capacity of them all (3,860 mAh g–1). Unfortunately, it is also very dangerous.
Source: Newegg Unscrambled
Posted By CybrSlydr @ 5:23 PM
Monday July 28th, 2014
| Industry veteran John Romero is sceptical about the future of VR but sees PC leaving consoles behind.
By Luke Reilly
Industry veteran John Romero, best known for his work at id Software as a designer on Wolfenstein 3D, Doom and Quake, and later as the creator of Daikatana, believes PC and mobile are dominating console platforms through price, but can't really see the new wave of VR gaining much traction with most players.
Speaking to GamesIndustry.biz at the Strong National Museum of Play in Rochester, New York, at an event marking the addition of his old Apple II Plus computer to the museum's permanent eGameRevolution exhibit, Romero shared his thoughts on how free-to-play continues to shake up the industry.
“With PC you have free-to-play and Steam games for five bucks,” said Romero. “The PC is decimating console, just through price. Free-to-play has killed a hundred AAA studios.”
Romero believes there are two ways to do free-to-play and he hopes that players will gravitate towards games that get it right, comparing the model to the shareware era.
“It’s a different form of monetization than Doom or Wolfenstein or Quake where that’s free-to-play [as shareware],” said Romero. “Our entire first episode was free – give us no money, play the whole thing. If you like it and want to play more, then you finally pay us. To me that felt like the ultimate fair [model]. I'm not nickel-and-diming you. I didn't cripple the game in any design way.”
“Everybody is getting better at free-to-play design, the freemium design, and it’s going to lose its stigma at some point. People will settle into [the mindset] that there is a really fair way of doing it, and the other way is the dirty way. Hopefully that other way is easily noticeable by people and the quality design of freemium rises and becomes a standard. That’s what everybody is working hard on. People are spending a lot of time trying to design this the right way. They want people to want to give them money, not have to. If you have to give money, you’re doing it wrong... For game designers, that’s the holy grail.”
Romero went on to highlight the obvious technological advantages of PC over consoles (“With PCs if you want a faster system you can just plug in some new video cards, put faster memory in it, and you'll always have the best machine that blows away PS4 or Xbox One,” he said), although he remains unconvinced that VR headsets are going to make a significant impact.
“Before using Oculus, I heard lots of vets in the industry saying this is not like anything we’ve seen before. This is not the crap we saw back in the late ’80s. I was excited to check it out and I was just blown away by just how amazing it was to just be in an environment and moving my head was just like mouse-look. I thought that was really great but when I kind of step back and look at it, I just don’t see a real good future for the way VR is right now. It encloses you and keeps you in one spot – even the Kinect and Move are devices I wouldn’t play because they just tire you out.”
“VR is going away from the way games are being developed and pushed as they go back into multiplayer and social stuff. VR is kind of a step back, it's a fad.”
“Even though I’m excited about VR and how cool games look, I can’t see it becoming the way people always play games... If you're inside of a cockpit, that’s cool, but if you’re supposed to be running around a world and you can’t physically run but you can look around, it’s a weird disconnect and it doesn’t feel right.”
Posted By CybrSlydr @ 9:34 AM
Tuesday July 22nd, 2014
| Recently appointed CEO Satya Nadella today announced the largest layoffs in Microsoft's 39-year history, with a staggering 18,000 jobs on the chopping block. The goal, according to Nadella, is to “simplify the way we work to drive greater accountability, become more agile and move faster,” signifying his aim to bring some focus to Microsoft's portfolio of services while also seemingly playing down the job losses.
The last large round of layoffs at Microsoft came in 2009, after the stock market crash. That round of layoffs was the previous largest ever at 5,800 positions, and today’s announcement dwarfs that number substantially. But not all departments will share this burden evenly, with the recently acquired Nokia employees getting the brunt of the cuts. In April, Microsoft closed the acquisition of the Nokia mobile phone business, and in the process added 25,000 employees to its payroll. Nadella announced today that 50% of those employees will be let go. Some will be factory workers from some of the in-house manufacturing Nokia owned, and the remainder will be from the handset business itself.
The remaining 5,500 employees to be laid off will therefore come from within Microsoft itself, as it attempts to concentrate on some of its more successful offerings. Excluding the Nokia losses, which are often expected after a merger of this sort, the number of Microsoft employees affected is not significantly different from the 2009 cuts.
Former Nokia CEO, now Microsoft Executive VP of Devices and Services, Stephen Elop laid out some of the upcoming changes in his own letter to his employees. Elop promises a focus on Windows Phone, with a near term goal of driving up Windows Phone volume by focusing on the affordable smartphone segments. With that announcement comes the death of the strange Nokia X series of AOSP phones, which debuted at MWC 2014 and were updated with a new model only a couple of weeks ago. While I would make the argument that there was little need for the X series at all, it is doubly frustrating to anyone who bought into the platform to find it killed off so quickly. The X series would be easy prey for cuts like these, because it didn’t really offer anything new to Android or to Microsoft. While it promised to be low cost, retail pricing for the X line was often more than the low cost Lumia phones. The X series had no place in a Microsoft owned Nokia, and should have been killed a while ago.
Elop also announced that they would continue to work on the high end phone range as well. Historically Windows Phone has suffered selling flagship models for many reasons, but it appears that they are not ready to give up the fight in this market yet. He also specifically called out Surface, Perceptive Pixel, and Xbox as new areas of innovation, which likely means those brands are safe for the time being.
The remainder of the Nokia feature phone lines appear to be immediately canceled. This is a segment that has been rapidly shrinking in recent years, with the consumer push towards smartphones, so this is likely a good strategic move by Microsoft. The work done on Windows Phone to allow it to work well on low cost hardware is also likely another big reason for this.
Another major announcement was the closure of the Xbox Entertainment Studios which had a goal of providing original content for Xbox Live members. Several projects such as “Signal to Noise” and “Halo: Nightfall” that were mid production will be completed, but after that content is delivered the studio will be closed.
The full ramifications of these job cuts won't be known for some time, but it seems fair to say that Nadella wants to put his own stamp on the company. Removing the Nokia X line, the Asha and S40 lines, and an entertainment studio all seem like reasonable cuts if you want to focus the company. Nadella speaks about flattening the organization, which should help it execute on ideas more quickly. These kinds of steps, though painful for the employees, can be better for the company in the long run. For quite some time the perception has been that Microsoft is not agile enough to respond to new markets, and it appears that Satya Nadella is trying to focus the company on its strengths, which should be a net positive. Microsoft's next earnings call comes on July 22nd, at which point we may get more details about upcoming plans.
Posted By CybrSlydr @ 8:48 PM
Wednesday July 16th, 2014
| That thing in the above picture is an SSD, and a hoofing big one too. The Plextor M6e is the first M.2 SSD I’ve had arrive in the office, and it’s a 512GB drive that aims to circumvent the limitations of current SATA connections by using the same PCI Express bus that's been providing oodles of bandwidth to graphics cards for years.
In fairness, some SSD manufacturers, like OCZ and Kingspec, have already been producing PCIe-based drives that slot in side-by-side with your graphics card. Those have used the combined performance of multiple SSDs to create the extra speed, whereas this Plextor M6e is doing all the work itself.
The M.2 interface in most of the Z97 motherboards I’ve tested has a theoretical limit of 1GB/s compared with the 600MB/s limits of SATA. The beauty of using the PCIe bus is that in the future manufacturers can open up more PCIe lanes to allow for even higher possible bandwidth around the 4GB/s mark.
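Those ceilings line up with simple PCIe lane arithmetic (the slot widths below are assumptions for illustration, based on typical implementations of the era: PCIe 2.0 x2 for today's M.2 slots, PCIe 3.0 x4 for the future case):

```python
# Per-lane PCIe throughput after encoding overhead, in MB/s.
PCIE2_LANE_MBPS = 500   # 5 GT/s with 8b/10b encoding -> ~500 MB/s per lane
PCIE3_LANE_MBPS = 985   # 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane

m2_today = 2 * PCIE2_LANE_MBPS   # PCIe 2.0 x2: the ~1GB/s M.2 limit quoted
m2_future = 4 * PCIE3_LANE_MBPS  # PCIe 3.0 x4: approaching the 4GB/s mark
sata3 = 600                      # SATA 6Gb/s ceiling for comparison

print(m2_today, sata3, m2_future)  # 1000 600 3940
```

The jump from 8b/10b to 128b/130b encoding is why PCIe 3.0 nearly doubles per-lane throughput despite the transfer rate rising by only 60%.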
The Plextor M6e isn’t quite up in those numbers just yet. My preliminary tests have the 512GB version hitting sequential read/write figures of 676MB/s and 620MB/s respectively in the AS SSD benchmarking software.
While that bests any SATA-based SSD I’ve ever tested—including Samsung’s latest V-NANDtastic 850 Pro—the 4k random read/write results are nowhere near as spectacular.
At 31MB/s for the reads it’s up there with the best, but at just 72MB/s for the writes it’s considerably slower than the bargain-basement Crucial MX100 512GB SATA drive.
So, while the Plextor M6e is demonstrably breaking the limits of the SATA barrier, it’s not doing it by much and not making any inroads into the responsiveness 4k tests.
It’s early days for the M.2 interface, but once we get the proper SSD-focused NVMe standard rather than the current AHCI—the standard protocol for elderly spinning platter hard drives—I think we’ll start to see things change massively in the SSD world.
Source: PC Gamer
Posted By CybrSlydr @ 12:02 PM
Monday July 7th, 2014
| A new computer called the "HummingBoard" takes on the same basic shape as the Raspberry Pi but uses a more powerful processor and supports more operating systems.
SolidRun, which also makes the CuBox-i computer we wrote about, just started selling the HummingBoard in several configurations ranging from $45 to $100, not including the price of a power adapter and Micro SD card.
"The HummingBoard allows you to run many open source operating systems—such as Ubuntu, Debian, and Arch—as well as Android and XBMC," SolidRun says. "With its core technology based on SolidRun’s state-of-the-art Micro System on a Module (MicroSOM), it has ready-to-use OS images, and its open hardware comes with full schematics and layout. Best of all, as a Linux single board computer, the HummingBoard is backed by the global digital maker community, which means you can alter the product in any way you like and get full kernel upstreaming support and all the assistance you need."
HummingBoard uses a 1GHz ARMv7 processor rather than the 700MHz ARMv6 one that has worked well for the Raspberry Pi yet limits the number of operating systems it can run. HummingBoard configurations use single- and dual-core i.MX 6 chips based on the ARM Cortex-A9 architecture, and they range from 512MB to 1GB of memory.
Other features include OpenGL support, up to Gigabit Ethernet, support for mSATA and PCIe mini cards, HDMI, GPIO pins, LVDS display out, a camera interface, and powered USB.
The HummingBoard was "cleverly designed to mimic the Raspberry Pi’s dimensions and layout," Geek.com wrote. "That means it’ll fit into the hundreds of ready-to-use Raspberry Pi cases." In addition, "[t]he processor sits on its own module, which means you may be able to purchase upgrades for it in the future."
Here's a video from SolidRun that compares the HummingBoard to the Raspberry Pi: https://www.youtube.com/watch?v=dnGiYir07as
Source: Ars Technica
Posted By CybrSlydr @ 9:47 AM
| I attended both Apple's and Google's developer conference keynotes last month, and I experienced strong deja vu on more than one occasion. Both companies talked about design and consistency. Both companies talked about improving back-end services. And both companies talked about new initiatives to make stuff on your phone appear seamlessly on your tablet or laptop.
"Users almost always have a smartphone with them, including when they are using a Chromebook," said Google's Sundar Pichai. "So we want to increasingly connect those experiences together, so they have a seamless experience across their devices."
At or around the time the Android L release comes out this fall, this means your phone and your Chromebook are going to be able to share even more stuff than they already do. If you have your phone with you, it can unlock your Chromebook (and if you have your smartwatch with you, it can unlock your phone). If you get a call or a text or your battery is running low, you'll be told about it on your Chromebook. Some Android apps are even going to be able to run in Chrome OS, though Google didn't talk much about the technical details.
It was all very similar to the "Continuity" feature that Apple's Craig Federighi showed off on the same stage in the same room three weeks before (at least both companies can still share a conference hall). When iOS 8 and OS X Yosemite arrive in the fall, AirDrop will be able to move files between iOS devices and Macs. "Handoff" can send e-mails, webpages, and even files from iCloud-enabled applications on iOS to their counterparts in OS X (or vice-versa). You can receive texts alongside iMessages in the Messages app, and you can make and receive phone calls from your Mac even if your phone is in another room.
This isn't about which company is copying from which—this kind of integration is a logical next step for both Apple and Google after years of moving various operating systems and services closer and closer together. This is about ecosystem lock-in. All of these features sound like great, logical ways to extend both companies' platforms, since you can often assume that someone using an Apple phone will be using an Apple computer. They're also going to make it harder than ever for you to extricate yourself from a given company's ecosystem once you've become embedded in it.
Follow link for rest of story
Source: Ars Technica
Posted By CybrSlydr @ 9:43 AM
| By Luke Karmali
Microsoft has announced Kinect 2 for PC costs £159 / $199.
The device is available now for pre-order on the Microsoft Store, ahead of release on July 15.
Though Kinect's primary use on Xbox One was arguably gaming, the device's launch on PC will target applications more heavily. It also ships without software, which is licensed separately.
It's also worth noting that this is not the device that will work with the new Kinect-less Xbox One; we still don't know how much you'll pay for one of those.
Posted By CybrSlydr @ 9:31 AM
Wednesday June 25th, 2014
| We will not reveal our source (a developer whose employment we confirmed and whose identity we vowed to protect), so the info should be taken with a grain of salt as a result. Today, I have the ultimate displeasure of informing the public that the E3 demo of The Division was apparently running on a PC and that the game will be downgraded overall. Developers often shoot for the moon and end up delivering next to nothing, the final retail copies becoming visual garbage (Far Cry 3, Watch Dogs, Dark Souls II compared to their non-downgraded reveals). This sort of false advertising and marketing absolutely has to stop. It is a vile and demented way for companies to make money off people who preorder precisely because the game looks visually impressive. Yes, some may claim “gameplay weighs in more,” but this is arguable.
He tells us the following:
We really loved the reception to the demo we showed on the PC version at E3. Currently as it stands, there is definitely a lot of push coming from publishers to not make the experience so different on consoles as to alienate people into thinking that next generation is not as powerful as PC. This is probably what happened at Ubisoft Montreal. I think that while making stability changes is definitely important, it does not completely obliterate a lot of enhanced rendering applications.
Right now we already took out quite a lot of screen space reflections from the game and are working on asset management the best we can given consoles have that great unified memory. Naturally we will also be using online servers and have to produce a synchronization that higher graphics add to the latency so it had to be turned down. To me it still looks good, but not as good as the original reveal. I am sure as we get closer to launch and the actual console versions of the game featuring SD (Snowdrop) that it will start to seem all too obvious to people especially those on PCs. I just wanted to write and let you know that it definitely is not just stability but marketing politics plays into this a lot as well.
UPDATED 2nd Response from The Division Developer: Truth be told in regards to your question that while ‘Yes’ the lead platform is the PC, we simply cannot have such a big gap. As you know when the first WATCH DOGS Review was published by that one site, Ubisoft called it a “false review” and I am sure everyone can see how bad that sounded when they saw the game did look marginally better than something that was a last generation GTA IV. But no, they will not admit that they practice this or actively downgrade a game. It is much easier to say they removed things for stability which is often a lie as you can tell by the post-issues which are expected in any production we do.
Also to answer your 3rd question, no…they will never fully disclose what was removed from what build as no laws ask them to do so in terms of consumer rights. If we as developers published that information in very real terms for the consumer such as “Replaced particle fog simulation with 2d layer simulation in 3d space, removed particles from all explosions, lowered explosion volume multiplier by 20x, removed X # of trees and civilians, etc.” we would be out of a lot of sales and probably it would actually require too much time to deliver on the current hype that a lot of downgraded games see which look incredible with a vertical slice. I do share this in the hope’s that my colleagues and publishers and a lot of people who make false promises and do demonstrations which wrongfully create too much hype that they cannot deliver on ultimately stop doing such things. I want to see the industry actually move forward and not be so full of itself by promising too much and delivering too little. Regards
Our insider, who currently works in the graphics technical division at Ubisoft Massive in Sweden, contacted us because he too is sick of the practices that a company like Ubisoft has become all too known for. If Ubisoft denies that downgrades have happened and uses the lame excuse that “it is for the gamers and stability we did what we did,” then there is certainly no reason for the PC/console parity to exist, because the downgraded Watch Dogs still runs sub-par, which is an utter joke. Everyone knows “next generation” as it currently stands is utter marketing BS. Of course, a lot of uneducated folks out there believe it is more. Next gen means next gen! Yet PC raw throughput has the greatest power of any platform despite receiving less development focus (due to piracy). Essentially, if it is not obvious by now: this console generation has diminished any chance of real graphics leaps, letting marketers keep making money on “next-gen” until the next next-gen comes out. It is a brand of marketing hype all too common in the gaming industry.
Bottom line: Publishers and developers – stop lying and rely on actual gameplay that is close to the real thing to do your marketing for you. And if you did remove a lot of features that affected the stability of the game, make sure to release a full disclosure of what this is before the game comes out. Oh wait…but then you would not see as many sales. Tsk tsk.
Posted By: Usman Ihtsham
ON Friday, June 20th, 2014
Source: What If Gaming
This isn't your normal news site, but it bears posting simply because of what it says, and it sounds reasonable.
Posted By CybrSlydr @ 9:03 AM