EXTREME Overclocking
Latest EXTREME Overclocking Reviews
AMD Phenom II X4 975 BE & 840 Quad-Core Processors
Review | January 4, 2011
Today AMD is releasing two new processors for the mainstream market. First up is a new top-end quad-core, the 3.6 GHz Phenom II X4 975 Black Edition. The X4 975 BE is simply a 100 MHz speed bump over the already familiar X4 970 BE with no other design changes. The second new processor is a budget quad-core, the 3.2 GHz Phenom II X4 840.

  Tuesday October 14th, 2014

Every year NVIDIA launches quite a few new products; some are better than others, but they're all interesting. This fall, the big news is Maxwell 2.0, aka GM204. GM204 initially launched last month in the desktop GTX 980 and GTX 970, and NVIDIA is hopefully changing the way notebook gamers get treated by bringing the mobile version to market just one month later.

We've already covered all of the new features in the desktop launch, so things like DSR, MFAA, VXGI, DX12, and GameWorks are all part of the notebook launch marketing materials as well. Of course, as a notebook GPU there are a few extra features available that you don't see on desktop GPUs, mostly because such features aren't really needed there. Optimus Technology has been around for several years now, so there's not much to add: it lets a laptop run on the lower-power integrated graphics when you're not doing anything demanding, and switch on the faster discrete NVIDIA GPU when it's needed.


BatteryBoost is a related technology that was first introduced with the GTX 800M series of GPUs, and it seeks to improve gaming battery life. Our test platform at the time didn't really give us the gains we were hoping to see, but NVIDIA assures us that the new GM204 mobile graphics chips will do much better at providing decent battery life while running games. We'll be diving into this in more detail once we get our test notebooks.
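NVIDIA hasn't published BatteryBoost's internals, but the publicly described idea is straightforward: cap the frame rate (30 fps is the usual target) so the GPU and CPU spend more of each frame idle at lower clocks. A toy Python sketch of that frame-limiting principle; render_frame here is a hypothetical stand-in for a game's per-frame work:

import time

TARGET_FPS = 30                  # BatteryBoost-style cap
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def render_frame():
    # Hypothetical stand-in for a game's real per-frame work.
    time.sleep(0.005)

for _ in range(90):              # ~3 seconds at 30 fps
    start = time.monotonic()
    render_frame()
    elapsed = time.monotonic() - start
    # Sleep off the unused budget; on real hardware this idle time is
    # what lets the GPU drop to lower clocks and save power.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)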


Source: Anandtech

Posted By CybrSlydr @ 6:43 PM


After a weekend of rumors spurred on by a Wall Street Journal report, HP has confirmed this morning that the company intends to split in half next year. The process will see each half become its own independent company, allowing for what amounts to HP’s enterprise and consumer divisions to go their separate ways. By doing so, HP is looking to allow each half to focus on one subset of HP’s overall business, allowing for more focused execution and growth while cutting the bonds that HP believes have made them slow to move in the past.
The split will see HP’s core businesses assigned to one of two companies. HP Inc., the closer of the two to an immediate successor of the current HP, will take HP’s PC and printing businesses, along with HP’s other consumer/mobile businesses such as the company’s Chromebooks and tablets. Internally these products are already organized under HP’s Printing and Personal Systems business, so in some sense this is merely moving a business that was its own division into its own company entirely. The split will also see the current EVP of the Printing and Personal Systems business, Dion Weisler, promoted to CEO of the new HP Inc. Finally, HP Inc. will retain the current HP branding.

Meanwhile the rest of HP’s businesses – servers, networking, storage, software, financial services, and other services – will all be split off together to form the new Hewlett-Packard Enterprise. As alluded to by the name, Hewlett-Packard Enterprise will be focused on HP’s enterprise businesses, where divisions such as the company’s networking business are potential rapid-growth markets for HP. HP’s current CEO, Meg Whitman, will transition over to CEO of Hewlett-Packard Enterprise.


Source: Anandtech

Posted By CybrSlydr @ 6:41 PM


Samsung today announced that it has begun mass producing the industry's first 3-bit multi-level-cell (MLC) three-dimensional (3D) Vertical NAND (V-NAND) flash memory, for use in solid state drives (SSDs).
"With the addition of a whole new line of high density SSDs that is both performance- and value-driven, we believe the 3-bit V-NAND will accelerate the transition of data storage devices from hard disk drives to SSDs," said Jaesoo Han, Senior Vice President, Memory Sales & Marketing, Samsung Electronics. "The wider variety of SSDs will increase our product competitiveness as we further expand our rapidly growing SSD business."

The 3-bit V-NAND is Samsung's latest second-generation V-NAND device, which stacks 32 cell layers vertically per NAND memory chip. Each chip provides 128 gigabits (Gb) of memory storage.

In Samsung's V-NAND chip structure, each cell stores its charge in a non-conductive layer using charge trap flash (CTF) technology. The cell arrays are stacked vertically on top of one another to form multibillion-cell chips.


The use of 3-bit-per-cell, 32-layer vertically stacked cell arrays sharply raises the efficiency of memory production. Compared to Samsung's 10-nanometer-class 3-bit planar NAND flash, the new 3-bit V-NAND has more than doubled wafer productivity.
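For a rough sense of those figures, here's a quick back-of-envelope check in Python, using only the numbers quoted above:

# Back-of-envelope check of the figures quoted above.
die_capacity_gbit = 128   # gigabits per V-NAND die
bits_per_cell = 3         # 3-bit MLC (i.e. TLC)
layers = 32               # vertically stacked cell layers

cells = die_capacity_gbit * 1e9 / bits_per_cell
print(f"~{cells / 1e9:.1f} billion cells per die")             # ~42.7 billion
print(f"~{cells / layers / 1e6:.0f} million cells per layer")  # ~1333 million
print(f"{die_capacity_gbit / 8:.0f} GB of storage per die")    # 16 GB

That "multibillion-cell" claim checks out at roughly 42.7 billion cells per 128 Gb die.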

Samsung introduced its first-generation V-NAND (24-layer) in August 2013, and introduced its second-generation V-NAND (32-layer) cell array structure in May 2014. With the launch of the 32-layer, 3-bit V-NAND, Samsung is leading the 3D memory era by speeding up the evolution of V-NAND production technology.

After having first produced SSDs based on 3-bit planar NAND flash in 2012, Samsung has proven that there is indeed a mass market for high-density 3-bit NAND SSDs.

The industry's first 3-bit 3D V-NAND will considerably expand market adoption of V-NAND memory to SSDs suitable for general PC users, in addition to efficiently addressing the high-endurance storage needs of today's servers.

Source: Guru 3D

Posted By CybrSlydr @ 6:39 PM


You can now download 3DMark Fire Strike Ultra, the world's first 4K Ultra HD benchmark, for 3DMark Advanced Edition. This new test is only available in the Advanced and Professional Editions at this time. For the free Basic Edition, this update brings a new design for the benchmark selection screen, improved logging, and better robustness in identifying problems and in hardware monitoring. Full patch notes are available at the source link. The other benchmark tests included in 3DMark have not been changed, and results are fully comparable with those from the previous version.

Source: Guru 3D Download

Posted By CybrSlydr @ 6:35 PM


  Thursday October 9th, 2014

The Evil Within will run at 30 frames per second on all formats upon its release. Via the official blog, Bethesda reveals The Evil Within is intended to be played at 30fps when it releases on PS3, PS4, Xbox 360, Xbox One, and PC. There is some leeway if you're playing on PC, by way of an option to alter the framerate, though the post goes on to mention that altering the framerate is neither recommended nor supported. "Shinji Mikami and the team at Tango designed The Evil Within to be played at 30fps and to utilize an aspect ratio of 2.35:1 for all platforms," the post reads.
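For reference, a 2.35:1 picture means letterboxing on a standard 16:9 display. A quick sketch of the arithmetic (our own numbers, not Bethesda's) shows how large the black bars end up at common resolutions:

def letterbox(width, height, aspect=2.35):
    """Active picture height and per-side bar height when a 2.35:1
    image is letterboxed on a width x height display."""
    picture = round(width / aspect)
    bar = (height - picture) / 2
    return picture, bar

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    pic, bar = letterbox(w, h)
    print(f"{w}x{h}: picture {w}x{pic}, black bars ~{bar:.0f}px top and bottom")

At 1920x1080, for example, the active picture is 1920x817 with bars of roughly 132 pixels each.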


Source: IGN

http://i.imgur.com/q3iCWqE.gif

Posted By Urbanfox @ 12:32 PM


This article will not be a guide on how to overclock your video card; instead, we'll be doing a round-up of the utilities that help make overclocking possible. Overclocking with the use of software utilities continues to be the most popular method to date. There are over a dozen utilities that can be used to overclock today’s video cards, but today we'll only be looking at the most popular within our community. These utilities include AMD OverDrive, Sapphire TriXX, ASUS GPU Tweak, EVGA Precision X, and MSI Afterburner. As we delve into these utilities, we’ll be analyzing each of their strengths and weaknesses and discovering what separates them from each other as well. Hopefully this round-up will provide some useful data and help you decide which utility is right for you.
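As an aside, under the hood all of these tools lean on the same vendor interfaces; on the NVIDIA side, much of what they display (core clock, memory clock, temperature) is exposed through NVML. A minimal read-only sketch using the pynvml Python bindings, assuming an NVIDIA GPU, driver, and the pynvml package are present; it only queries values and does not overclock anything:

# Read-only sketch using the pynvml bindings for NVIDIA's NVML library
# (pip install pynvml); assumes an NVIDIA GPU and driver are present.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

name = pynvml.nvmlDeviceGetName(handle)
core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"{name}: core {core} MHz, memory {mem} MHz, {temp} C")
pynvml.nvmlShutdown()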

Source: AnandTech

I use EVGA PrecX for EVGA/nVidia cards (has K-Boost) and MSI AB for everything else. Personally I am not a fan of PrecX 16. MSI AB has a simpler UI while still packing a lot of features. Cheers!

Posted By chartiet @ 12:14 PM


  Wednesday October 1st, 2014

By Mike Mahardy

After a plethora of milestones in Star Citizen's lucrative crowdfunding campaign, Roberts Space Industries has reached another as it achieves $55 million in funding: the title of largest crowdfunded project ever.

This record extends past the video game industry, encompassing any project on Kickstarter or similar sites to date. Creator Chris Roberts called this "unthinkable" on the Star Citizen blog, saying that the company still intends to spend all of the money on development rather than taking profits before release.

"Every effort is about enriching the game’s vision," he writes. "Funding to date has allowed us to go so far beyond what I thought was possible in 2012. You’re still getting that game, no question, but it will be all the richer and so much more immersive because of the additional funding."

Roberts Space Industries and Cloud Imperium Games added original alien languages when funds surpassed $50 million; anyone who pledged before the $55 million milestone will receive the Ballistic Gatling, allowing them to swap between two ammunition types without exiting their ship. If the developer reaches $56 million, the Arena Commander Upgrade will be unlocked. There aren't any details about this milestone, but as of this writing, funding is about $500k away from the new goal.

Roberts said that his continued crowdfunding efforts are aimed at creating an ever-expanding project, and that he's had to reevaluate the entire development process.

"It’s not being developed like a normal game and it’s not being funded like a normal game," he writes. "I’ve had to toss aside a lot of my knowledge from the old way of developing and embrace a completely new world. There is no publisher. There is no venture capitalist wanting a massive return in three years. There is no need to cram the game onto a disc and hope we got it all right.

"Star Citizen is not the type of game that will be played for a few weeks, then put on a shelf to gather dust."



Source: IGN

Posted By CybrSlydr @ 7:22 PM


Today Microsoft introduced their new Miracast-based wireless streaming device for HDMI monitors and televisions. Its lengthy name is the Microsoft Wireless Display Adapter, and it's Microsoft's take on an HDMI streaming dongle. One of the most popular devices in this category is Google's Chromecast, and there are many similarities between the two. Like the Chromecast, the Microsoft Wireless Display Adapter is a small adapter that plugs into an HDMI port on your television and uses a USB port for power. From the photos Microsoft has provided, the USB connector seems to be wired directly into the adapter, which could pose a problem depending on your television's arrangement of ports, as the cord does not look very long.


The adapter allows streaming and display mirroring from any device with support for Miracast screencasting. Because of this, the adapter works with a variety of devices running different operating systems, rather than being limited to devices that run Windows or Windows Phone 8.

At $59.95 USD, the Microsoft Wireless Display Adapter is around $25 more expensive than Google's Chromecast. It is available for pre-order now on Microsoft's online store, and it will ship in October 2014.



Source: Anandtech

Posted By CybrSlydr @ 7:14 AM



It was only two years ago that Windows 8 was unleashed on the world. Microsoft tried to usher in an era of “Touch First” applications with a new look and feel for Windows. To say that Windows 8 was unsuccessful would be an understatement, and from both Microsoft’s and users’ perspectives, it was certainly a failure. Two years in, Windows 8 and its 8.1 derivative have struggled to gain market share over Windows 7 and XP, which still command the lion’s share of the desktop OS pie. A new interface, unfamiliar to users, did little to sway their wallets, and other market factors have come into play as well.

Source: Anandtech

Posted By CybrSlydr @ 7:12 AM


  Saturday September 27th, 2014

A week ago Samsung acknowledged the existence of the read performance bug in the SSD 840 EVO, and I just received a note that the fixed firmware is in the validation process and is expected to be released to the public on October 15th. Unfortunately I don't have any further details about the bug or the fix at this point, or whether the update is coming to the 'vanilla' SSD 840 and OEM models, but I hope to get more details as the public release gets closer, so stay tuned.

Source: AnandTech

Posted By chartiet @ 3:05 PM


  Friday September 26th, 2014

By Alaina Yee

Last week, three teams were given 24 hours—the span of Nvidia's GAME24 livestream—to craft one-of-a-kind, high-end gaming PCs. The result, of course, was three unique, fantastic-looking systems featuring some kickass handiwork.

We've got the video showcasing the final products from Lee Harrington and Ron Lee Christianson (Team Mongoose), Bob Stewart and Rod Rosenberg (BSMODS), and Rich Surroz and Travis Jank (Team Kill Ninja)—one of which included a tribute to Phillip Scholz, an Nvidia marketing manager who died earlier this year. Take a look at their amazing work, then watch the judges deliver their decisions.


Video at source link

Source: IGN

Posted By CybrSlydr @ 10:17 AM


  Tuesday September 23rd, 2014

Interestingly, the cooler on EVGA's GeForce GTX 970 ACX series graphics cards seems to be somewhat poorly designed. Either that, or somebody forgot to look up the GPU placement on the PCB and compare it to the cooler spec sheet. Photos from reviews with the cooler removed clearly show that the GPU is mounted well off the position of the heatpipes.
As Redditors point out, the ACX cooler on last year's EVGA GeForce GTX 760 SC had pretty much the same design, with just two of the three direct-touch heatpipes actually touching the GPU. This could explain why EVGA's ACX cooler performs worse and runs noisier in reviews than other custom cooling solutions: it's missing almost a third of the cooler's performance.

Source: Guru3D

Whoopsidaisies :yawnsigh

Posted By chartiet @ 10:28 AM


  Friday September 19th, 2014

At the risk of sounding like a broken record, the biggest story in the GPU industry over the last year has been over what isn’t as opposed to what is. What isn’t happening is that after nearly 3 years of the leading edge manufacturing node for GPUs at TSMC being their 28nm process, it isn’t being replaced any time soon. As of this fall TSMC has 20nm up and running, but only for SoC-class devices such as Qualcomm Snapdragons and Apple’s A8. Consequently if you’re making something big and powerful like a GPU, all signs point to an unprecedented 4th year of 28nm being the leading node.

We start off with this tidbit because it’s important to understand the manufacturing situation in order to frame everything that follows. In years past TSMC would produce a new node every 2 years, and farther back yet there would even be half-nodes in between those 2 years. This meant that every 1-2 years GPU manufacturers could take advantage of Moore’s Law and pack in more hardware into a chip of the same size, rapidly increasing their performance. Given the embarrassingly parallel nature of graphics rendering, it’s this cadence in manufacturing improvements that has driven so much of the advancement of GPUs for so long.

With 28nm however that 2 year cadence has stalled, and this has driven GPU manufacturers into an interesting and really unprecedented corner. They can’t merely rest on their laurels for the 4 years between 28nm and the next node – their continuing existence means having new products every cycle – so they instead must find new ways to develop new products. They must iterate on their designs and technology so that now more than ever it’s their designs driving progress and not improvements in manufacturing technology.

What this means is that for consumers and technology enthusiasts alike we are venturing into uncharted territory. With no real precedent to draw from we can only guess what AMD and NVIDIA will do to maintain the pace of innovation in the face of manufacturing stagnation. This makes it a frustrating time – who doesn’t miss GPUs doubling in performance every 2 years? – but also an interesting one. How will AMD and NVIDIA solve the problem they face and bring newer, better products to the market? We don’t know, and not knowing the answer leaves us open to be surprised.

Out of NVIDIA the answer to that has come in two parts this year. NVIDIA’s Kepler architecture, first introduced in 2012, has just about reached its retirement age. NVIDIA continues to develop new architectures on roughly a 2 year cycle, so new manufacturing process or not they have something ready to go. And that something is Maxwell.
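As an aside on the manufacturing math behind that 2-year cadence: transistor density scales roughly with the inverse square of the feature size, so a full node shrink has historically meant close to twice the transistors in the same die area. A rough first-order illustration (idealized scaling; real nodes are messier):

# First-order scaling: transistor density ~ 1 / (feature size)^2.
# Real nodes never scale this cleanly, but it shows what stalls
# when a new node doesn't arrive.
for old_nm, new_nm in [(40, 28), (28, 20), (20, 14)]:
    gain = (old_nm / new_nm) ** 2
    print(f"{old_nm}nm -> {new_nm}nm: ~{gain:.2f}x transistors per mm^2")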

Source: Anandtech

Looks pretty good for only $550. Looks to edge out the 290X and the 780 Ti performance-wise, although it doesn't save that much power or heat OC'd (maybe not as much as I thought), but it's a ref cooler and not an aftermarket/ACX cooler. Let's see what the Classy does ;)

Posted By chartiet @ 6:53 AM


  Tuesday September 16th, 2014

The DisplayPort 1.3 standard increases the maximum bandwidth to a staggering 32.4 Gb/s, which not only offers support for 4K at 120 Hz but can also be used to daisy-chain two 4K 60 Hz DisplayPort 1.3-enabled monitors. Still not impressed? It will also allow you to run a 5K monitor over a single DisplayPort cable. So far, we've only seen one of those monitors from Dell, although we're not sure whether it comes with DisplayPort 1.3 support...

...In order to ensure that the standard is a little more future-proof, VESA has also given it native support for the 4:2:0 sampling method. With this compression, you'll be able to drive a future 8K display from what will then be an antiquated DisplayPort 1.3 output. This is similar to what Nvidia has done in order to achieve 4K at 60 Hz over an HDMI 1.4 interface.
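The arithmetic behind those claims is easy to check. DisplayPort 1.3's 32.4 Gb/s is the raw link rate; after 8b/10b line coding, roughly 25.92 Gb/s remains for pixel data. A rough feasibility check (our own numbers, ignoring blanking overhead, which adds a few percent in practice):

# Rough payload math for DisplayPort 1.3; ignores blanking overhead.
EFFECTIVE_GBPS = 32.4 * 8 / 10  # ~25.92 Gb/s left after 8b/10b line coding

def payload_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

modes = [
    ("4K @ 120 Hz, 24-bit RGB",          3840, 2160, 120, 24),
    ("two 4K @ 60 Hz, daisy-chained",    3840, 2160, 2 * 60, 24),
    ("5K @ 60 Hz, 24-bit RGB",           5120, 2880, 60, 24),
    ("8K @ 60 Hz, 4:2:0 (~12 bpp avg)",  7680, 4320, 60, 12),
]
for name, w, h, hz, bpp in modes:
    need = payload_gbps(w, h, hz, bpp)
    fits = "fits" if need <= EFFECTIVE_GBPS else "does not fit"
    print(f"{name}: {need:.1f} Gb/s ({fits})")

All four modes come in under the ~25.92 Gb/s ceiling, which is exactly why 4:2:0 subsampling (halving the average bits per pixel) is what makes 8K at 60 Hz plausible on this link.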


NICE!

Source: Tom's Hardware

Posted By Prozium @ 4:33 PM


  Friday September 12th, 2014

Announced at the beginning of this year, Intel’s Edison is the chipmaker's latest foray into the world of low power, high performance computing. Originally envisioned as an x86 computer stuffed into an SD card form factor, this tiny platform for wearables, consumer electronics designers, and the Internet of Things has apparently been redesigned a few times over the last few months. Now, Intel has finally unleashed it to the world. It’s still tiny, it’s still based on the x86 architecture, and it’s turning out to be a very interesting platform.

The key feature of the Edison is, of course, the Intel CPU. It’s a 22nm SoC with dual cores running at 500 MHz. Unlike so many other IoT and micro-sized devices out there, the chip in this device, an Atom Z34XX, has an x86 architecture. On board are 4 GB of eMMC flash and 1 GB of DDR3, and this tiny module also includes an Intel Quark microcontroller – the same as found in the Intel Galileo – running at 100 MHz. The best part? Edison will retail for about $50. That’s a dual-core x86 platform in a tiny footprint for just a few bucks more than a Raspberry Pi.

When the Intel Edison was first announced, speculation ran rampant that it would take on the form factor of an SD card. This is not the case. Instead, the Edison has a footprint of 35.5 mm x 25.0 mm, just barely larger than an SD card. Dumping the SD card form factor was a great idea – instead of being limited to the nine pins present on SD cards and platforms such as the Electric Imp, Intel is using a 70-pin connector to break out a bunch of pins, including an SD card interface, two UARTs, two I²C buses, SPI with two chip selects, I²S, twelve GPIOs with four capable of PWM, and a USB 2.0 OTG controller. There are also a pair of radio modules on this tiny board, making it capable of 802.11a/b/g/n and Bluetooth 4.0.
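For tinkerers: Intel's mraa library (with Python bindings) is the usual way to drive those pins on the Edison and Galileo. A minimal sketch, assuming the mraa package is installed; the pin number is illustrative and depends on which breakout board you use:

# Minimal GPIO sketch using Intel's mraa bindings on the Edison.
# The pin number is illustrative; mappings depend on the breakout board.
import time
import mraa

led = mraa.Gpio(13)        # hypothetical pin wired to an LED
led.dir(mraa.DIR_OUT)

for _ in range(10):        # blink at roughly 1 Hz
    led.write(1)
    time.sleep(0.5)
    led.write(0)
    time.sleep(0.5)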

The Edison will support Yocto Linux 1.6 out of the box, but because this is an x86 architecture, there is an entire universe of Linux distributions that will also run on this tiny board. It might be theoretically possible to run a version of Windows natively on this module, but this raises the question of why anyone would want to.

The first round of Edison modules will be used with either a small breakout board that provides basic functionality, solder points, a battery charger power input, and two USB ports (one OTG port), or a larger Edison board for Arduino that includes the familiar Arduino pin header arrangement and breakouts for everything. The folks at Intel are a generous bunch, and in an effort to put these modules in the next generation of Things for Internet, have included Mouser and Digikey part numbers for the 70-pin header (about $0.70 in quantity one). If you want to create your own breakout board or include Edison in a product design, Edison makes that easy.

There is no word yet on where or when the Edison will be available. Someone from Intel will be presenting at Maker Faire NYC in less than two weeks, though, and we already have our media credentials, so we'll be sure to get a hands-on then. I did grab a quick peek at the Edison while I was in Vegas for Defcon, but I have very little to write about that experience except for the fact that it existed in August.

Update: You can grab an Edison dev kit at Make ($107, with the Arduino breakout) and at SparkFun (the link was down as of this update – never mind, SparkFun has a ton of boards made for the Edison; it's pretty cool).


Source: Hack a Day

Posted By CybrSlydr @ 12:26 PM
