Friday September 26th, 2014
| By Alaina Yee
Last week, three teams were given 24 hours - the span of Nvidia's GAME24 livestream - to craft one-of-a-kind, high-end gaming PCs. The result, of course, was three unique, fantastic-looking systems featuring some kickass handiwork.
We've got the video showcasing the final products from Lee Harrington and Ron Lee Christianson (Team Mongoose), Bob Stewart and Rod Rosenberg (BSMODS), and Rich Surroz and Travis Jank (Team Kill Ninja) - one of which included a tribute to Phillip Scholz, an Nvidia marketing manager who died earlier this year. Take a look at their amazing work, then watch the judges deliver their decisions.
Video at source link
Posted By CybrSlydr @ 10:17 AM
Tuesday September 23rd, 2014
| Interestingly, the cooler on the EVGA GeForce GTX 970 ACX series graphics cards seems to be somewhat poorly designed. Well, that or somebody forgot to look up the GPU placement on the PCB and compare it to the cooler spec sheet. Photos from reviews with the cooler removed clearly show that the GPU is mounted well off the position of the heatpipes.
Redditors point out that the ACX cooler on last year's EVGA GeForce GTX 760 SC had pretty much the same design, with just two of the three direct-touch heatpipes actually touching the GPU. This could explain why EVGA's ACX cooler performs worse and runs noisier in reviews than other custom cooling solutions: with one heatpipe missing the GPU entirely, almost a third of the cooler's heatpipe contact goes unused.
Posted By chartiet @ 10:28 AM
Friday September 19th, 2014
| At the risk of sounding like a broken record, the biggest story in the GPU industry over the last year has been over what isn’t as opposed to what is. What isn’t happening is that after nearly 3 years of the leading edge manufacturing node for GPUs at TSMC being their 28nm process, it isn’t being replaced any time soon. As of this fall TSMC has 20nm up and running, but only for SoC-class devices such as Qualcomm Snapdragons and Apple’s A8. Consequently if you’re making something big and powerful like a GPU, all signs point to an unprecedented 4th year of 28nm being the leading node.
We start off with this tidbit because it’s important to understand the manufacturing situation in order to frame everything that follows. In years past TSMC would produce a new node every 2 years, and farther back yet there would even be half-nodes in between those 2 years. This meant that every 1-2 years GPU manufacturers could take advantage of Moore’s Law and pack more hardware into a chip of the same size, rapidly increasing their performance. Given the embarrassingly parallel nature of graphics rendering, it’s this cadence in manufacturing improvements that has driven so much of the advancement of GPUs for so long.
With 28nm however that 2 year cadence has stalled, and this has driven GPU manufacturers into an interesting and really unprecedented corner. They can’t merely rest on their laurels for the 4 years between 28nm and the next node – their continuing existence means having new products every cycle – so they instead must find new ways to develop new products. They must iterate on their designs and technology so that now more than ever it’s their designs driving progress and not improvements in manufacturing technology.
What this means is that for consumers and technology enthusiasts alike we are venturing into something of an uncharted territory. With no real precedent to draw from we can only guess what AMD and NVIDIA will do to maintain the pace of innovation in the face of manufacturing stagnation. This makes this a frustrating time – who doesn’t miss GPUs doubling in performance every 2 years – but also an interesting one. How will AMD and NVIDIA solve the problem they face and bring newer, better products to the market? We don’t know, and not knowing the answer leaves us open to be surprised.
Out of NVIDIA the answer to that has come in two parts this year. NVIDIA’s Kepler architecture, first introduced in 2012, has just about reached its retirement age. NVIDIA continues to develop new architectures on roughly a 2 year cycle, so new manufacturing process or not they have something ready to go. And that something is Maxwell.
Looks pretty good for only $550. Looks to edge out the 290X and the 780 Ti performance-wise, although it doesn't save that much power or heat OC'd (maybe not as much as I thought), but it's a ref cooler and not an aftermarket/ACX cooler. Let's see what the Classy does ;)
Posted By chartiet @ 6:53 AM
Tuesday September 16th, 2014
| The DisplayPort 1.3 standard increases the maximum bandwidth to a staggering 32.4 Gb/s, which not only offers support for 4K at 120 Hz but can also be used to daisy-chain two 4K 60 Hz DisplayPort 1.3-enabled monitors. Still not impressed? It will also allow you to run a 5K monitor over a single DisplayPort cable. So far, we've only seen one of those monitors from Dell, although we're not sure whether it comes with DisplayPort 1.3 support...
...In order to ensure that the standard is a little more future-proof, VESA has also given it native support for the 4:2:0 sampling method. With this compression, you'll be able to drive a future 8K display from what will then be an antiquated DisplayPort 1.3 output. This is similar to what Nvidia has done in order to achieve 4K at 60 Hz over an HDMI 1.4 interface.
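As a rough sanity check on those claims, the raw pixel bandwidth for each mode is easy to work out: pixels per frame, times refresh rate, times bits per pixel. Here's a minimal sketch in Python (it ignores blanking intervals, so real figures run somewhat higher):

    # Uncompressed video bandwidth: width * height * refresh * bits per pixel.
    def gbps(w, h, hz, bpp=24):
        return w * h * hz * bpp / 1e9

    print(f"4K @ 120 Hz: {gbps(3840, 2160, 120):.1f} Gb/s")              # ~23.9
    print(f"5K @ 60 Hz:  {gbps(5120, 2880, 60):.1f} Gb/s")               # ~21.2
    print(f"8K @ 60 Hz, 4:2:0: {gbps(7680, 4320, 60, bpp=12):.1f} Gb/s") # ~23.9

All three land just under the roughly 25.9 Gb/s of effective data rate left after 8b/10b encoding overhead is subtracted from DisplayPort 1.3's 32.4 Gb/s raw figure - and the 8K case only fits because 4:2:0 subsampling halves the bits per pixel from 24 to 12.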
Posted By Prozium @ 4:33 PM
Friday September 12th, 2014
| Announced at the beginning of this year, Intel's Edison is the chipmaker's latest foray into the world of low power, high performance computing. Originally envisioned to be an x86 computer stuffed into an SD card form factor, this tiny platform for wearables, consumer electronic designers, and the Internet of Things has apparently been redesigned a few times over the last few months. Now, Intel has finally unleashed it to the world. It's still tiny, it's still based on the x86 architecture, and it's turning out to be a very interesting platform.
The key feature of the Edison is, of course, the Intel CPU. It's a 22nm SoC with dual cores running at 500 MHz. Unlike so many other IoT and micro-sized devices out there, the chip in this device, an Atom Z34XX, has an x86 architecture. Also on board is 4GB of eMMC Flash and 1 GB of DDR3. Also included in this tiny module is an Intel Quark microcontroller (the same as found in the Intel Galileo) running at 100 MHz. The best part? Edison will retail for about $50. That's a dual core x86 platform in a tiny footprint for just a few bucks more than a Raspberry Pi.
When the Intel Edison was first announced, speculation ran rampant that it would take on the form factor of an SD card. This is not the case. Instead, the Edison has a footprint of 35.5mm x 25.0mm, just barely larger than an SD card. Dumping that form factor was a great idea: instead of being limited to the nine pins present on SD cards and platforms such as the Electric Imp, Intel is using a 70-pin connector to break out a bunch of pins, including an SD card interface, two UARTs, two I²C buses, SPI with two chip selects, I²S, twelve GPIOs with four capable of PWM, and a USB 2.0 OTG controller. There are also a pair of radio modules on this tiny board, making it capable of 802.11 a/b/g/n and Bluetooth 4.0.
The Edison will support Yocto Linux 1.6 out of the box, but because this is an x86 architecture, there is an entire universe of Linux distributions that will also run on this tiny board. It might be theoretically possible to run a version of Windows natively on this module, but this raises the question of why anyone would want to.
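For a taste of what driving those pins looks like from userspace, here's a minimal blink sketch in Python using mraa, the low-level I/O library Intel provides for the Galileo and Edison (the pin number below is illustrative and depends on which breakout board you're using):

    import time
    import mraa  # Intel's low-level I/O library for Galileo/Edison

    led = mraa.Gpio(13)    # pick a GPIO that your breakout board exposes
    led.dir(mraa.DIR_OUT)  # configure the pin as an output

    while True:            # toggle the pin once per second, forever
        led.write(1)
        time.sleep(0.5)
        led.write(0)
        time.sleep(0.5)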
The first round of Edison modules will be used with either a small breakout board that provides basic functionality, solder points, a battery charger power input, and two USB ports (one OTG port), or a larger Edison board for Arduino that includes the familiar Arduino pin header arrangement and breakouts for everything. The folks at Intel are a generous bunch, and in an effort to put these modules in the next generation of Things for Internet, have included Mouser and Digikey part numbers for the 70-pin header (about $0.70 at quantity one). If you want to create your own breakout board or include Edison in a product design, Edison makes that easy.
There is no word on where or when the Edison will be available. Someone from Intel will be presenting at Maker Faire NYC in less than two weeks, though, and we already have our media credentials. We'll be sure to get a hands-on then. I did grab a quick peek at the Edison while I was in Vegas for Defcon, but I have very little to write about that experience except for the fact that it existed in August.
Update: You can grab an Edison dev kit at Make ($107, with the Arduino breakout) and Sparkfun (the link was down as of this update - never mind, Sparkfun has a ton of boards made for the Edison. It's pretty cool).
Source: Hack a Day
Posted By CybrSlydr @ 12:26 PM
| While we're not sure about any specific launch dates, we are expecting the GTX 980 and GTX 970 to launch very soon. The latest clue is that MSI is teasing the design of its GTX 970 on Facebook. Teases of this caliber usually only happen close to the launch.
The card is definitely recognizable as an MSI card, as it follows all the design cues of previous cards. Note, though, that this card is carrying the new Twin Frozr V cooler, which has a number of small improvements over the older design. We can't be sure about all the changes yet, but the cooler does have a large fin array, heat pipes, and two 100 mm fans.
The GTX 970 is expected to carry 1664 CUDA cores, along with 4 GB of memory accessible over a 256-bit memory interface. (We're speculating a bit here though, so take these specs with a grain of salt.) What is almost certain is that the GTX 970 will be based on the new Maxwell architecture, which will allow it to provide more performance per watt. If you look carefully at this card, you can also see two six-pin PCI-Express power connectors, which is a little less than the 6-pin + 8-pin design on the GTX 770.
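For context, the PCI Express specification caps how much power each of those sources can deliver, so the connector layout alone bounds the board's power budget. A quick sketch of the arithmetic in Python:

    # Power ceilings from the PCI Express specification, in watts.
    PCIE_SLOT = 75   # delivered through the x16 slot itself
    SIX_PIN   = 75   # per 6-pin auxiliary connector
    EIGHT_PIN = 150  # per 8-pin auxiliary connector

    gtx_970_max = PCIE_SLOT + 2 * SIX_PIN          # 225 W (two 6-pin)
    gtx_770_max = PCIE_SLOT + SIX_PIN + EIGHT_PIN  # 300 W (6-pin + 8-pin)
    print(gtx_970_max, gtx_770_max)

A ceiling 75 W lower than the GTX 770's fits neatly with the Maxwell performance-per-watt story.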
Stay tuned for the actual launch.
Source: Tom's Hardware
Posted By CybrSlydr @ 10:29 AM
| GTA 5 Release Date For PS4, Xbox One and PC Revealed
By Luke Karmali
Grand Theft Auto V is set to release on PS4 and Xbox One on November 18, and on PC on January 27, with a host of changes and improvements in tow.
The news comes after weeks of speculation and proves that a recent leak by a retailer was completely correct. But while it's exciting to hear that we'll finally be able to jump back into Los Santos before too long, what's even more impressive is the improvements Rockstar is saying we'll be able to see. Alongside new weapons, vehicles and activities, additional wildlife, denser traffic, a new foliage system and enhanced damage and weather effects, the developer is promising much more for those who think they've seen all the game has to offer.
"All players who pre-order the game will get $1,000,000 in-game bonus cash to spend across Grand Theft Auto V and Grand Theft Auto Online (GTA$500K each for your Story Mode and for your GTA Online in-game bank accounts)," Rockstar writes.
"A host of new, exclusive content also awaits for players returning from the PlayStation 3 and Xbox 360 versions including rare versions of classic vehicles to collect from across the Grand Theft Auto series such as the Dukes, the Dodo Seaplane and a faster, more maneuverable Blimp; activities including wildlife photography and new shooting range challenges, new weapons and more.
"Enhancements to Grand Theft Auto Online include an increased player count, with online play now for up to 30 players on PlayStation 4 and Xbox One. All existing gameplay upgrades and Rockstar-created content released since the launch of Grand Theft Auto Online will also be available for the PlayStation 4, Xbox One and PC with much more to come."
Grand Theft Auto V launched last September, and we loved it. It made $800 million in a single day, and has since gone on to sell 33 million copies. At E3, it was announced for next-gen consoles and PC.
Wow - long break between the console and PC - November then not until nearly February? ****.
Posted By CybrSlydr @ 10:18 AM
Sunday August 31st, 2014
| The leader of one of the most well-known hardware review sites is retiring. Best wishes to him and thanks for the years of informative articles.
On April 26, 1997, armed with very little actual knowledge, I began to share what I had with the world on a little Geocities site named Anand's Hardware Tech Page. Most of what I knew was wrong or poorly understood, but I was 14 years old at the time. Little did I know that I had nearly two decades ahead of me to fill in the blanks. I liked the idea of sharing knowledge online and the thought of building a resource where everyone who was interested in tech could find something helpful.
But after 17.5 years of digging, testing, analyzing and writing about the most interesting stuff in tech, it's time for a change. This will be the last thing I write on AnandTech as I am officially retiring from the tech publishing world. Ryan Smith (@RyanSmithAT) is taking over as Editor in Chief of AnandTech.
Full Article @ Anandtech
Posted By WiCKeD @ 12:30 PM
Tuesday August 26th, 2014
| We're sorry to break the bad news, but that 5TB hard drive you bought last week? Yeah, it's already obsolete. Seagate has started shipping the first-ever 8TB desktop hard disk, doubling the 4TB capacities that seemed huge just a couple of years ago. If it's any consolation, though, this machinery isn't ready to go inside your hot gaming PC. Right now, all those terabytes are destined for data centers where capacity trumps every other concern; Seagate isn't mentioning prices, but enterprise-class storage is rarely cheap. You may want to set aside some money all the same. These extra-roomy drives have a tendency to filter down to the mainstream pretty quickly, so you may soon have more free disk space than you know what to do with... at least, for a little while.
Source : Engadget
Posted By Prozium @ 10:12 PM
| August 22, 2014, 1:44 AM - Many concepts of computing have moved to the cloud, but gaming has not been one of them. Even with the fastest pipe into your home, latency is inevitable, and who wants to die in a "Call of Duty" deathmatch because of lag? We get enough of that as it is with the software loaded on our PCs.
Cloud-based gaming would also help overcome the problem of fixed console hardware, because it would require just a thin client to display the game rather than hefty hardware to render it. Displaying the video is a lot easier and less system-intensive than rendering each frame. Given how underpowered the Xbox One is, cloud-based rendering would help overcome its shortcomings.
But how do you get the rendered frames down the pipe to the gamer quickly? Microsoft Research may have a solution in a project called DeLorean. In a nutshell, it renders frames before an event occurs in the game, based on a number of variables, and the correct set of frames is then sent down to your device.
A recently published white paper from Microsoft lays out the concept and solution. Microsoft notes that people could enjoy high-end graphics without needing a high-end GPU through cloud gaming. However, cloud gaming is hindered by network lag: latency as low as 60ms is enough to degrade the experience.
Microsoft calls its solution "speculative execution." It uses future input prediction (player inputs are fairly predictable based on past behavior), along with speculation over multiple possible outcomes and error compensation. Microsoft also came up with a new form of bandwidth compression that uses the speculation component to take advantage of consecutive frames being similar to one another.
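Microsoft's actual system is considerably more involved, but the core trick can be sketched in a few lines of Python (all names here are hypothetical, for illustration only, and not from the paper):

    # Speculative frame delivery: render a candidate frame for each input
    # the player is likely to produce, then ship the one matching what the
    # player actually did, hiding a full network round trip.
    def speculative_frame(state, predicted_inputs, actual_input, render):
        candidates = {inp: render(state, inp) for inp in predicted_inputs}
        if actual_input in candidates:
            return candidates[actual_input]  # prediction hit: no added latency
        # Misprediction: render on demand and correct the frame already
        # shown, which is where the paper's error compensation comes in.
        return render(state, actual_input)

Rendering several candidate frames costs extra bandwidth, which is exactly what the compression scheme mentioned above is meant to claw back.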
With this, Microsoft was able to achieve cloud-based versions of "Doom 3" and "Fable 3," both framerate-intensive games, that were easily playable on thin clients despite a latency of over 250ms. Microsoft found players preferred DeLorean over traditional thin clients and that DeLorean can mimic a low-latency network successfully.
So when will we see it? Like with other Microsoft Research projects, they give no release date. This is still a lab experiment. But it could herald a day when gaming, like Salesforce's CRM, is a SaaS experience rather than 5-10GB on your hard drive.
Posted By CybrSlydr @ 11:05 AM
Monday August 25th, 2014
| As part of their 30 years of graphics celebration, AMD today announced a forthcoming addition to the Radeon R9 200 graphics card lineup.
Launching on September 2nd will be the company's new midrange enthusiast card, the Radeon R9 285.
The R9 285 will take up an interesting position in AMD's lineup, being something of a refresh of a refresh that spans all the way back to Tahiti (Radeon 7970). Spec-wise it ends up being extremely close on paper to the R9 280 (née 7950B), and it's telling that the R9 280 is no longer being advertised by AMD as a current member of their R9 lineup. However, with a newer GPU under the hood the R9 285 stands to eclipse the 280 in features, and with sufficient efficiency gains we hope to see it eclipse the 280 in performance too.
Finally, coinciding with the launch of the R9 285 will be a refresh of AMD's Never Settle bundles. The details on this are still murky at this time, but AMD is launching what they call the Never Settle Space Edition bundle, which will see Alien: Isolation and Star Citizen as part of a bundle for all R9 series cards. What's unclear is whether this replaces the existing Never Settle Forever bundle, or whether these games are being added to the Never Settle Forever lineup in some fashion. AMD has said that current Silver and Gold voucher holders will be able to get the Space Edition bundle with their vouchers, which lends credence to the idea that these are new games in the NSF program rather than a different program entirely.
Both Alien: Isolation and Star Citizen are still-in-development games. Alien: Isolation is a first-person survival horror game and is expected in October of this year. Meanwhile the space sim Star Citizen does not yet have a release date, and as best as we can tell won't actually be finished until late 2015 at the earliest. In which case the inclusion here is more about access to the ongoing beta, which is the first time we've seen beta access used as part of a bundle in this fashion.
Posted By CybrSlydr @ 4:46 PM
Thursday August 21st, 2014
| AMD is preparing to announce three new FX processors on September 1, 2014, including models FX-8370, FX-8370E and FX-8320E. It is also stated that AMD will lower pricing of older FX processors and that there might be some new chipsets.
The new FX-series microprocessors from AMD, which are due to be formally introduced on September 1, 2014, are the FX-8370 and the FX-8370E, reports X-bit labs.
****it - I was just coming here to post this. lol
The 95W FXs look somewhat interesting, considering you get higher clocks at lower TDP and hopefully less heat, which was always a detriment for me. I wonder how they will OC?
Posted By chartiet @ 9:59 AM
Friday August 15th, 2014
| Last Tuesday Microsoft issued their August updates for fixes and security; unfortunately, the update renders Windows 8.1 completely unbootable for a lot of end-users, who end up with a black failure screen. The issue resides in two updated files. On the Microsoft support forum, complaints are pouring in about the so-called August update. Read more after the break.
Users that have a system restore point enabled can regain access to the OS in the pre-update state and get Windows going again. Those that have System Restore disabled are in a world of hurt and might have to resort to a system OS backup, or revert to a clean install. For those with a system restore point, please make sure that you uninstall the following updates: KB2982791 and the optional update KB2975719, as these are the two responsible for all this.
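If you can get back into Windows, the two updates can be removed from an elevated command prompt with the stock wusa utility (shown here with silent, no-reboot flags; reboot manually once both have run):

    wusa /uninstall /kb:2982791 /quiet /norestart
    wusa /uninstall /kb:2975719 /quiet /norestart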
Posted By chartiet @ 11:32 AM
Saturday August 2nd, 2014
| Thought all you needed to get a 4K TV working was HDMI 2.0? Guess again. The next generation of content protection is called HDCP 2.2, and not only is it not backwards compatible, many new 4K devices don't even support it.
So it's possible that the 4K TV you bought last year, or even the receiver you buy this year, might not be able to receive/pass all future 4K content.
Sound crazy? Sadly, it's not. Here's the skinny.
What it is
Copy protection/content protection has been around since the VHS era, as anyone who tried to copy a Blockbuster rental can tell you. Back then it was called Macrovision, which evolved into CSS for DVD and finally HDCP, which stands for High-bandwidth Digital Content Protection, for Blu-ray players and HDTV devices like satellite and cable boxes.
HDCP 2.2 is the latest evolution of copy protection. It's designed to create a secure connection between a source and a display. Ostensibly this is so you can't take the output from a source (a Blu-ray player, say) and plug it into some kind of recorder, to make a copy of the content. DRM, the encryption of the content itself, is a separate issue. HDCP doesn't care what goes across the cable, as long as that cable is secure.
It does this by creating encrypted keys between the source and the display (called the sink). Enabled repeaters, like receivers, can be in the chain as well. The source and the sink need to be in agreement, understanding their keys, or no content gets transferred. If you've ever hooked up gear and gotten a blank screen (or turned on gear in the wrong order and gotten a blank screen), this HDCP "handshake" is usually the issue.
HDCP isn't solely over HDMI. It can be implemented to work over DVI, DisplayPort, USB, and more.
So what's new? The encryption on the keys in version 2.2 is more advanced than previous versions which, in theory, makes the whole chain harder to break. One other interesting change with 2.2 is a "locality check." The source sends a signal to the sink, and if the sink doesn't get that signal within 20ms, the source kills the connection. In theory, this shouldn't cause any issues in home setups, even over long HDMI runs (unless you have more than 3,740 miles of cable).
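That mileage figure is just how far a signal can travel in 20ms. A one-line check in Python, assuming propagation at the speed of light in a vacuum (real cables are somewhat slower, so the practical limit is shorter):

    C_MILES_PER_SEC = 186_282       # speed of light
    print(C_MILES_PER_SEC * 0.020)  # ~3,726 miles within the 20ms window

which lines up with the roughly 3,740-mile figure above.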
See Source for rest of article.
Posted By CybrSlydr @ 9:54 PM
Tuesday July 29th, 2014
| Portable electronics are very convenient except for one thing: their battery life is horrible. No matter what the capacity of their battery is, they all require frequent recharging throughout the day and have a limited lifespan of between 400 and 1,200 charge cycles. Eventually, these batteries all die and need to be replaced with new (and often expensive) versions that will also meet the same fate. Still, these types of batteries are the best available and have also become a popular choice for electric vehicles, aerospace applications and even military projects.
These batteries are called lithium-ion (li-ion) and became the industry standard for consumer electronics in the early 1990s. For 25 years we have used them to power our cell phones, laptops and most gadgets that need to function without being plugged in all the time. But future applications in portable electricity will soon demand higher energy storage density and something will have to replace traditional li-ion batteries because they simply won't be powerful enough.
Advantages of Li-ion Batteries
- High energy density with potential for higher capacities
- Don't require prolonged priming when new
- Relatively low self-discharge rate
- Low maintenance
- Specialty cells can provide high current to many different types of applications

Disadvantages of Li-ion Batteries
- Require protection circuit to maintain voltage
- Subject to aging, even when not in use
- Must be stored in a cool place to reduce aging effect
- Transportation restrictions
- Expensive to manufacture

Many scientists have focused their research efforts on high-capacity electrode materials that use silicon and tin as anodes, and sulfur and oxygen as cathodes. But pure lithium metal is still the optimum choice because it has the highest capacity (3,860 mAh/g) of them all. Unfortunately, it's also very dangerous.
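As a quick check on that 3,860 mAh/g figure: lithium gives up one electron per atom, so its theoretical specific capacity follows directly from Faraday's law. A short sketch in Python:

    # Theoretical capacity (mAh/g) = F / (3.6 * M) for a one-electron metal,
    # where F is the Faraday constant and M the molar mass.
    F = 96485.0    # C/mol, charge carried by one mole of electrons
    M_LI = 6.94    # g/mol, molar mass of lithium
    capacity = F / (3.6 * M_LI)  # 1 mAh = 3.6 C
    print(round(capacity))       # ~3862 mAh/g, matching the quoted number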
Source: Newegg Unscrambled
Posted By CybrSlydr @ 5:23 PM