Friday September 12th, 2014
| Announced at the beginning of this year, Intel’s Edison is the chipmaker’s latest foray into the world of low-power, high-performance computing. Originally envisioned as an x86 computer stuffed into an SD card form factor, this tiny platform for wearables, consumer electronics designers, and the Internet of Things has apparently been redesigned a few times over the last few months. Now, Intel has finally unleashed it to the world. It’s still tiny, it’s still based on the x86 architecture, and it’s turning out to be a very interesting platform.
The key feature of the Edison is, of course, the Intel CPU: a 22nm SoC with dual cores running at 500 MHz. Unlike so many other IoT and micro-sized devices out there, the chip in this device, an Atom Z34XX, has an x86 architecture. On board are 4GB of eMMC Flash and 1GB of DDR3, along with an Intel Quark microcontroller – the same as found in the Intel Galileo – running at 100 MHz. The best part? The Edison will retail for about $50. That’s a dual-core x86 platform in a tiny footprint for just a few bucks more than a Raspberry Pi.
When the Intel Edison was first announced, speculation ran rampant that it would take on the form factor of an SD card. This is not the case. Instead, the Edison has a footprint of 35.5mm x 25.0mm; just barely larger than an SD card. Dropping the SD card form factor was a good call: instead of being limited to the nine pins present on SD cards and platforms such as the Electric Imp, Intel is using a 70-pin connector to break out a bunch of pins, including an SD card interface, two UARTs, two I²C buses, SPI with two chip selects, I²S, twelve GPIOs with four capable of PWM, and a USB 2.0 OTG controller. There are also a pair of radio modules on this tiny board, making it capable of 802.11a/b/g/n and Bluetooth 4.0.
The Edison will support Yocto Linux 1.6 out of the box, but because this is an x86 architecture, there is an entire universe of Linux distributions that will also run on this tiny board. It might be theoretically possible to run a version of Windows natively on this module, but this raises the question of why anyone would want to.
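Because the Edison boots a stock Linux, those GPIOs can be driven straight from userspace through the kernel’s sysfs interface, no vendor SDK required. A minimal Python sketch of the idea; note that pin 48 is purely illustrative, and the real mapping from the 70-pin connector to Linux GPIO numbers comes from Intel’s hardware guide:

```python
# Minimal sysfs GPIO control, as found on a stock Yocto image.
# The sysfs root is a parameter so the routine can be exercised
# against a fake directory tree; pin 48 below is illustrative only.
import os

def set_gpio(pin, value, root="/sys/class/gpio"):
    """Export a GPIO if needed, configure it as an output, and drive it."""
    gpio_dir = os.path.join(root, "gpio%d" % pin)
    if not os.path.isdir(gpio_dir):
        # Ask the kernel to expose the pin under <root>/gpioNN
        with open(os.path.join(root, "export"), "w") as f:
            f.write(str(pin))
    with open(os.path.join(gpio_dir, "direction"), "w") as f:
        f.write("out")
    with open(os.path.join(gpio_dir, "value"), "w") as f:
        f.write("1" if value else "0")

# set_gpio(48, True)   # drive the (illustrative) pin high
```

The same files can be poked from a shell with echo, which is handy for quick bring-up before writing any real code.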
The first round of Edison modules will be used with either a small breakout board that provides basic functionality, solder points, a battery charger and power input, and two USB ports (one an OTG port), or a larger Edison board for Arduino that includes the familiar Arduino pin header arrangement and breakouts for everything. The folks at Intel are a generous bunch, and in an effort to put these modules in the next generation of Things for Internet, have included Mouser and Digikey part numbers for the 70-pin header (about $0.70 in quantity one). If you want to create your own breakout board or include the Edison in a product design, Intel makes that easy.
There is no word on where or when the Edison will be available. Someone from Intel will be presenting at Maker Faire NYC in less than two weeks, though, and we already have our media credentials. We’ll be sure to get a hands-on then. I did grab a quick peek at the Edison while I was in Vegas for Defcon, but I have very little to say about that experience except that it existed in August.
Update: You can grab an Edison dev kit at Make ($107, with the Arduino breakout) and Sparkfun (the link was down as of this update; never mind, Sparkfun has a ton of boards made for the Edison. It’s pretty cool).
Source: Hack a Day
Posted By CybrSlydr @ 12:26 PM
| While we're not sure about any specific launch dates, we are expecting the GTX 980 and GTX 970 to launch very soon. The latest clue is that MSI is teasing the design of its GTX 970 on Facebook. Teases of this caliber usually only happen close to the launch.
The card is definitely recognizable as an MSI card, as it follows all the design cues of previous cards. Note, though, that this card is carrying the new TwinFrozr V cooler, which has a number of small improvements over the older design. We can't be sure about all the changes yet, but the cooler does have a large fin array, heat pipes, and two 100 mm fans.
The GTX 970 is expected to carry 1664 CUDA cores, along with 4 GB of memory accessible over a 256-bit memory interface. (We're speculating a bit here though, so take these specs with a grain of salt.) What is almost certain is that the GTX 970 will be based on the new Maxwell architecture, which will allow it to provide more performance per watt. If you look carefully at this card, you can also see two six-pin PCI-Express power connectors, which is a little less than the 6-pin + 8-pin design on the GTX 770.
Stay tuned for the actual launch.
Source: Tom's Hardware
Posted By CybrSlydr @ 10:29 AM
| GTA 5 Release Date For PS4, Xbox One and PC Revealed
By Luke Karmali
Grand Theft Auto V is set to release on PS4 and Xbox One on November 18, and on PC on January 27, with a host of changes and improvements in tow.
The news comes after weeks of speculation and proves that a recent leak by a retailer was completely correct. But while it's exciting to hear that we'll finally be able to jump back into Los Santos before too long, what's even more impressive are the improvements Rockstar says we'll see. Alongside new weapons, vehicles and activities, additional wildlife, denser traffic, a new foliage system and enhanced damage and weather effects, the developer is promising much more for those who think they've seen all the game has to offer.
"All players who pre-order the game will get $1,000,000 in-game bonus cash to spend across Grand Theft Auto V and Grand Theft Auto Online (GTA$500K each for your Story Mode and for your GTA Online in-game bank accounts)," Rockstar writes.
"A host of new, exclusive content also awaits for players returning from the PlayStation 3 and Xbox 360 versions including rare versions of classic vehicles to collect from across the Grand Theft Auto series such as the Dukes, the Dodo Seaplane and a faster, more maneuverable Blimp; activities including wildlife photography and new shooting range challenges, new weapons and more.
"Enhancements to Grand Theft Auto Online include an increased player count, with online play now for up to 30 players on PlayStation 4 and Xbox One. All existing gameplay upgrades and Rockstar-created content released since the launch of Grand Theft Auto Online will also be available for the PlayStation 4, Xbox One and PC with much more to come."
Grand Theft Auto V launched in early September last year, and we loved it. It made $800 million in a single day, and has since gone on to sell 33 million copies. At E3, it was announced for next-gen consoles and PC.
The slideshow below contains 16 new screenshots to flick through:
Wow - long break between the console and PC - November then not until nearly February? ****.
Posted By CybrSlydr @ 10:18 AM
Sunday August 31st, 2014
| Leader of one of the most well known hardware review sites is retiring. Best wishes to him and thanks for the years of informative articles.
On April 26, 1997, armed with very little actual knowledge, I began to share what I had with the world on a little Geocities site named Anand’s Hardware Tech Page. Most of what I knew was wrong or poorly understood, but I was 14 years old at the time. Little did I know that I had nearly two decades ahead of me to fill in the blanks. I liked the idea of sharing knowledge online and the thought of building a resource where everyone who was interested in tech could find something helpful.
But after 17.5 years of digging, testing, analyzing and writing about the most interesting stuff in tech, it’s time for a change. This will be the last thing I write on AnandTech as I am officially retiring from the tech publishing world. Ryan Smith (@RyanSmithAT) is taking over as Editor in Chief of AnandTech.
Full Article @ Anandtech
Posted By WiCKeD @ 12:30 PM
Tuesday August 26th, 2014
| We're sorry to break the bad news, but that 5TB hard drive you bought last week? Yeah, it's already obsolete. Seagate has started shipping the first-ever 8TB desktop hard disk, doubling the 4TB capacities that seemed huge just a couple of years ago. If it's any consolation, though, this machinery isn't ready to go inside your hot gaming PC. Right now, all those terabytes are destined for data centers where capacity trumps every other concern; Seagate isn't mentioning prices, but enterprise-class storage is rarely cheap. You may want to set aside some money all the same. These extra-roomy drives have a tendency to filter down to the mainstream pretty quickly, so you may soon have more free disk space than you know what to do with... at least, for a little while.
Source : Engadget
Posted By Prozium @ 10:12 PM
| August 22, 2014, 1:44 AM — Many concepts of computing have moved to the cloud, but gaming has not been one of them. Even with the fastest pipe into your home, latency is inevitable, and who wants to die in a "Call of Duty" deathmatch because of lag? We get enough of that as it is with the software loaded on our PCs.
Cloud-based gaming would also help overcome the problem of console hardware because it would require just a thin client to display the game rather than hefty hardware to render it. Displaying the video is a lot easier and less system intensive than having to render each frame. Given how underpowered the Xbox One is, cloud-based rendering would help overcome its shortcomings.
But how do you get the rendered frames down the pipe to the gamer quickly? Microsoft Research may have a solution in a project called DeLorean. In a nutshell, it renders frames before an event occurs in the game, based on a number of variables, and then sends the correct set of frames down to your device.
A recently published white paper from Microsoft lays out the concept and a solution. Microsoft notes that cloud gaming could let people enjoy high-end graphics without needing a high-end GPU, but that the experience is hindered by latency; even delays as low as 60ms are noticeable.
Microsoft calls its solution "speculative execution." It uses future input prediction, which is predictable based on player behavior, along with speculation of multiple outcomes and error compensation. Microsoft also came up with a new form of bandwidth compression that uses the speculation component to take advantage of the frames being similar from one to the next.
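The core loop is easy to sketch. In the toy model below (all names and the stand-in renderer are mine, not Microsoft's), the server pre-renders a frame for each input it considers likely; if the player's actual input was among those predicted, the matching frame is already on hand and the round trip is hidden, otherwise the client pays the full network latency:

```python
# Toy model of speculative frame rendering for cloud gaming: the
# server renders ahead for several predicted inputs, and the client
# displays whichever frame matches the input that actually happened.

def render(state, user_input):
    """Stand-in for a frame renderer: returns a 'frame' for a state+input."""
    return ("frame", state, user_input)

def speculate(state, predicted_inputs):
    """Server side: pre-render one frame per plausible input."""
    return {inp: render(state, inp) for inp in predicted_inputs}

def display(speculated, actual_input, rtt_ms=250):
    """Client side: a correctly speculated frame shows with no added
    latency; a misprediction falls back to a full round trip."""
    if actual_input in speculated:
        return speculated[actual_input], 0
    return render("resync", actual_input), rtt_ms

frames = speculate("t0", ["left", "right", "fire"])
frame, latency = display(frames, "left")
assert latency == 0  # predicted correctly: the round trip is hidden
```

The real system adds input prediction tuned to player behavior, error compensation for mispredictions, and compression that exploits the similarity between the speculated frames.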
With this, Microsoft was able to achieve playable cloud-based versions of "Doom 3" and "Fable 3," both framerate-sensitive games, on thin clients despite latencies of over 250ms. Microsoft found players preferred DeLorean over traditional thin clients, and that DeLorean can successfully mimic a low-latency network.
So when will we see it? As with other Microsoft Research projects, there's no release date; this is still a lab experiment. But it could herald a day when gaming, like Salesforce's CRM, is a SaaS experience rather than 5-10GB on your hard drive.
Posted By CybrSlydr @ 11:05 AM
Monday August 25th, 2014
| As part of its 30 years of graphics celebration, AMD today announced a forthcoming addition to the Radeon R9 200 graphics card lineup.
Launching on September 2nd will be the company’s new midrange enthusiast card, the Radeon R9 285.
The R9 285 will take up an interesting position in AMD’s lineup, being something of a refresh of a refresh that spans all the way back to Tahiti (Radeon 7970). Spec-wise it ends up extremely close on paper to the R9 280 (née 7950B), and it’s telling that the R9 280 is no longer being advertised by AMD as a current member of the R9 lineup. However, with a newer GPU under the hood the R9 285 stands to eclipse the 280 in features, and with sufficient efficiency gains we hope to see it eclipse the 280 in performance too.
Finally, coinciding with the launch of the R9 285 will be a refresh of AMD’s Never Settle bundles. The details on this are still murky at this time, but AMD is launching what it calls the Never Settle Space Edition bundle, which will see Alien Isolation and Star Citizen as part of a bundle for all R9 series cards. What’s unclear is whether this replaces the existing Never Settle Forever bundle, or whether these games are being added to the Never Settle Forever lineup in some fashion. AMD has said that current Silver and Gold voucher holders will be able to get the Space Edition bundle with their vouchers, which lends credence to the idea that these are new games in the NSF program rather than a different program entirely.
Both Alien Isolation and Star Citizen are still in development. Alien Isolation is a first-person survival horror game and is expected in October of this year. Meanwhile, the space sim Star Citizen does not yet have a release date, and as best we can tell won’t actually be finished until late 2015 at the earliest. In that case the inclusion here is more about access to the ongoing beta; this is the first time we’ve seen beta access used as part of a bundle in this fashion.
Posted By CybrSlydr @ 4:46 PM
Thursday August 21st, 2014
| AMD is preparing to announce three new FX processors on September 1, 2014, including the FX-8370, FX-8370E and FX-8320E. It is also stated that AMD will lower pricing on older FX processors and that there might be some new chipsets.
The new FX-series microprocessors from AMD, due to be formally introduced on September 1, 2014, are the FX-8370 and the FX-8370E, X-bit labs reports.
****it - I was just coming here to post this. lol
The 95W FXs look somewhat interesting, considering you get higher clocks at a lower TDP, and hopefully less heat, which was always a detriment for me. I wonder how they'll OC?
Posted By chartiet @ 9:59 AM
Friday August 15th, 2014
| Last Tuesday Microsoft issued its August round of fixes and security updates; unfortunately, it renders Windows 8.1 completely unbootable for a lot of end-users, who end up with a black failure screen. The issue resides in two updated files. On the Microsoft support forum it’s raining complaints about the so-called August update. Read more after the break.
Users who have a system restore point enabled can regain access to the OS in its pre-update state and get Windows going again. Those who have system restore disabled are in a world of hurt and might have to resort to a system OS backup, or revert to a clean install. For those with a system restore point, please make sure that you uninstall the following updates: KB2982791 and the optional update KB2975719 (from an elevated command prompt: wusa /uninstall /kb:2982791), as these are the two responsible for all this.
Posted By chartiet @ 11:32 AM
Saturday August 2nd, 2014
| Thought all you needed to get a 4K TV working is HDMI 2.0? Guess again. The next generation of content protection is called HDCP 2.2, and not only is it not backwards compatible, many new 4K devices don't even support it.
So it's possible that the 4K TV you bought last year, or even the receiver you buy this year, might not be able to receive/pass all future 4K content.
Sound crazy? Sadly, it's not. Here's the skinny.
What it is
Copy protection/content protection has been around since the VHS era, as anyone who tried to copy a Blockbuster rental can tell you. Back then it was called Macrovision, which evolved into CSS for DVD and finally HDCP, which stands for High-bandwidth Digital Content Protection, for Blu-ray players and HDTV devices like satellite and cable boxes.
HDCP 2.2 is the latest evolution of copy protection. It's designed to create a secure connection between a source and a display. Ostensibly this is so you can't take the output from a source (a Blu-ray player, say) and plug it into some kind of recorder, to make a copy of the content. DRM, the encryption of the content itself, is a separate issue. HDCP doesn't care what goes across the cable, as long as that cable is secure.
It does this by creating encrypted keys between the source and the display (called the sink). Enabled repeaters, like receivers, can be in the chain as well. The source and the sink need to be in agreement, understanding their keys, or no content gets transferred. If you've ever hooked up gear and gotten a blank screen (or turned on gear in the wrong order and gotten a blank screen), this HDCP "handshake" is usually the issue.
HDCP isn't solely over HDMI. It can be implemented to work over DVI, DisplayPort, USB, and more.
So what's new? The encryption on the keys in version 2.2 is more advanced than previous versions which, in theory, makes the whole chain harder to break. One other interesting change with 2.2 is a "locality check." The source sends a signal to the sink, and if the sink doesn't get that signal within 20ms, the source kills the connection. In theory, this shouldn't cause any issues in home setups, even over long HDMI runs (unless you have more than 3,740 miles of cable).
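That miles figure is just light-speed arithmetic; a quick back-of-the-envelope check, assuming the signal propagates at the speed of light in vacuum (real copper or fiber is somewhat slower, so the practical limit is shorter still):

```python
# How far can a signal travel out and back within HDCP 2.2's 20 ms
# locality window, assuming vacuum light speed?
C_MILES_PER_SEC = 186_282   # speed of light, in miles per second
WINDOW_SEC = 0.020          # the 20 ms locality-check timeout

total_travel_miles = C_MILES_PER_SEC * WINDOW_SEC  # out-and-back distance
max_cable_miles = total_travel_miles / 2           # one-way cable length

print(round(total_travel_miles))  # 3726: roughly the figure quoted above
print(round(max_cable_miles))     # 1863 miles of one-way cable at best
```

The quoted figure evidently counts the total out-and-back distance; either way, no home theater run is remotely at risk of tripping the check.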
See Source for rest of article.
Posted By CybrSlydr @ 9:54 PM
Tuesday July 29th, 2014
| Portable electronics are very convenient except for one thing: their battery life is horrible. No matter the capacity of their battery, they all require frequent recharging throughout the day and have a limited lifespan of between 400 and 1,200 cycles. Eventually, these batteries all die and need to be replaced with new (and often expensive) versions that will meet the same fate. Still, these types of batteries are the best available and have also become a popular choice for electric vehicles, aerospace applications and even military projects.
These batteries are called lithium-ion (Li-ion) and became the industry standard for consumer electronics in the early 1990s. For some 25 years we have used them to power our cell phones, laptops and most gadgets that need to function without being plugged in all the time. But future portable applications will soon demand higher energy-storage density, and something will have to replace traditional Li-ion batteries because they simply won’t be powerful enough.
Advantages of Li-ion Batteries
- High energy density with potential for higher capacities
- Don’t require prolonged priming when new
- Relatively low self-discharge rate
- Low maintenance
- Specialty cells can provide high current to many different types of applications
Disadvantages of Li-ion Batteries
- Require a protection circuit to maintain voltage
- Subject to aging, even when not in use
- Must be stored in a cool place to reduce the aging effect
- Transportation restrictions
- Expensive to manufacture
Many scientists have focused their research efforts on high-capacity electrode materials that use silicon and tin as anodes, and sulfur and oxygen as cathodes. But pure lithium metal is still the optimal choice because it has the highest capacity (3,860 mAh/g) of them all. Unfortunately, it is also very dangerous.
Source: Newegg Unscrambled
Posted By CybrSlydr @ 5:23 PM
Monday July 28th, 2014
| Industry veteran John Romero is sceptical about the future of VR but sees PC leaving consoles behind.
By Luke Reilly
Industry veteran John Romero, best known for his work at id Software as a designer on Wolfenstein 3D, Doom and Quake and later as the creator of Daikatana, believes PC and mobile are dominating console platforms on price, but can’t really see the new wave of VR gaining much traction with most players.
Speaking to GamesIndustry.biz at the Strong National Museum of Play in Rochester, New York, at an event marking the addition of his old Apple II Plus computer to the museum's permanent eGameRevolution exhibit, Romero shared his thoughts on how free-to-play continues to shake up the industry.
“With PC you have free-to-play and Steam games for five bucks,” said Romero. “The PC is decimating console, just through price. Free-to-play has killed a hundred AAA studios.”
Romero believes there are two ways to do free-to-play and he hopes that players will gravitate towards games that get it right, comparing the model to the shareware era.
“It’s a different form of monetization than Doom or Wolfenstein or Quake where that’s free-to-play [as shareware],” said Romero. “Our entire first episode was free – give us no money, play the whole thing. If you like it and want to play more, then you finally pay us. To me that felt like the ultimate fair [model]. I'm not nickel-and-diming you. I didn't cripple the game in any design way.”
“Everybody is getting better at free-to-play design, the freemium design, and it’s going to lose its stigma at some point. People will settle into [the mindset] that there is a really fair way of doing it, and the other way is the dirty way. Hopefully that other way is easily noticeable by people and the quality design of freemium rises and becomes a standard. That’s what everybody is working hard on. People are spending a lot of time trying to design this the right way. They want people to want to give them money, not have to. If you have to give money, you’re doing it wrong... For game designers, that’s the holy grail.”
Romero went on to highlight the obvious technological advantages of PC over consoles (“With PCs, if you want a faster system you can just plug in some new video cards, put faster memory in it, and you'll always have the best machine that blows away PS4 or Xbox One,” he said), although he remains unconvinced that VR headsets are going to make a significant impact.
“Before using Oculus, I heard lots of vets in the industry saying this is not like anything we’ve seen before. This is not the crap we saw back in the late ’80s. I was excited to check it out and I was just blown away by just how amazing it was to just be in an environment and moving my head was just like mouse-look. I thought that was really great but when I kind of step back and look at it, I just don’t see a real good future for the way VR is right now. It encloses you and keeps you in one spot – even the Kinect and Move are devices I wouldn’t play because they just tire you out.”
“VR is going away from the way games are being developed and pushed as they go back into multiplayer and social stuff. VR is kind of a step back, it's a fad.”
“Even though I’m excited about VR and how cool games look, I can’t see it becoming the way people always play games... If you're inside of a cockpit, that’s cool, but if you’re supposed to be running around a world and you can’t physically run but you can look around, it’s a weird disconnect and it doesn’t feel right.”
Posted By CybrSlydr @ 9:34 AM
Tuesday July 22nd, 2014
| Recently appointed CEO Satya Nadella announced the largest layoffs in Microsoft’s 39-year history today, with a staggering 18,000 jobs on the chopping block. The goal, according to Nadella, is to “simplify the way we work to drive greater accountability, become more agile and move faster,” signifying Nadella's aim to bring some focus to Microsoft's portfolio of services while also seemingly looking to play down the job losses.
The last large round of layoffs at Microsoft came in 2009, after the stock market crash. That round was the previous largest ever, at 5,800 positions, and today’s announcement dwarfs that number substantially. But not all departments will share this burden evenly, with the recently acquired Nokia employees bearing the brunt of the cuts. In April, Microsoft closed its acquisition of the Nokia mobile phone business, adding 25,000 employees to its payroll in the process. Nadella announced today that 50% of those employees will be let go. Some will be factory workers from Nokia’s in-house manufacturing, and the remainder will come from the handset business itself.
The remaining 5,500 employees to be laid off will therefore come from within Microsoft itself, as it attempts to concentrate on some of its more successful offerings. Excluding the Nokia losses, which are often expected after a merger of this sort, the total number of Microsoft employees being affected is not significantly different than the 2009 cuts.
Former Nokia CEO, now Microsoft Executive VP of Devices and Services, Stephen Elop laid out some of the upcoming changes in his own letter to his employees. Elop promises a focus on Windows Phone, with a near term goal of driving up Windows Phone volume by focusing on the affordable smartphone segments. With that announcement comes the death of the strange Nokia X series of AOSP phones, which debuted at MWC 2014 and were updated with a new model only a couple of weeks ago. While I would make the argument that there was little need for the X series at all, it is doubly frustrating to anyone who bought into the platform to find it killed off so quickly. The X series would be easy prey for cuts like these, because it didn’t really offer anything new to Android or to Microsoft. While it promised to be low cost, retail pricing for the X line was often more than the low cost Lumia phones. The X series had no place in a Microsoft owned Nokia, and should have been killed a while ago.
Elop also announced that they would continue to work on the high-end phone range as well. Historically, Windows Phone has struggled to sell flagship models for many reasons, but it appears Microsoft is not ready to give up the fight in this market yet. He also specifically called out Surface, Perceptive Pixel, and Xbox as new areas of innovation, which likely means those brands are safe for the time being.
The remainder of the Nokia feature phone lines appear to be immediately canceled. This is a segment that has been rapidly shrinking in recent years, with the consumer push towards smartphones, so this is likely a good strategic move by Microsoft. The work done on Windows Phone to allow it to work well on low cost hardware is also likely another big reason for this.
Another major announcement was the closure of Xbox Entertainment Studios, which had a goal of providing original content for Xbox Live members. Several mid-production projects, such as “Signal to Noise” and “Halo: Nightfall”, will be completed, but once that content is delivered the studio will be closed.
The full ramifications of these job cuts won’t be known for some time, but it seems fair to say that Nadella wants to put his own stamp on the company. Removing the Nokia X line, the Asha and S40 lines, and an entertainment studio seem like reasonable things to cut if you want to focus your company. Nadella speaks about flattening the organization, which should help Microsoft execute on ideas more quickly. These kinds of steps, though painful for the employees, can be better for the company in the long run. For quite some time, the perception has been that Microsoft is not agile enough to respond to new markets, and it appears that Satya Nadella is trying to focus his company on its strengths; that should be a net positive for the company. Microsoft’s next earnings call comes on July 22nd, at which point we may get more details about upcoming plans.
Posted By CybrSlydr @ 8:48 PM
Wednesday July 16th, 2014
| That thing in the above picture is an SSD, and a hoofing big one too. The Plextor M6e is the first M.2 SSD I’ve had arrive in the office, and it’s a 512GB drive that aims to circumvent the limitations of current SATA connections by using the same PCI Express bus that's been providing oodles of bandwidth to graphics cards for years.
In fairness, some SSD manufacturers, like OCZ and Kingspec, have already been producing PCIe-based drives that slot in side-by-side with your graphics card. Those have been using the combined performance of multiple SSDs to create the extra speed, whereas this Plextor M6e is doing all the work itself.
The M.2 interface in most of the Z97 motherboards I’ve tested has a theoretical limit of 1GB/s compared with the 600MB/s limits of SATA. The beauty of using the PCIe bus is that in the future manufacturers can open up more PCIe lanes to allow for even higher possible bandwidth around the 4GB/s mark.
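Those headline numbers fall out of simple per-lane math. The M6e reportedly rides a PCIe 2.0 x2 link, and a future PCIe 3.0 x4 M.2 link is what gets you near the 4GB/s mark; a quick sanity check, using the usual post-encoding-overhead per-lane approximations:

```python
# Rough PCIe bandwidth math behind the 1 GB/s and ~4 GB/s figures.
# Effective per-lane throughput after encoding overhead: PCIe 2.0
# uses 8b/10b (~500 MB/s per lane); PCIe 3.0 uses 128b/130b
# (~985 MB/s per lane).
PCIE2_LANE_MBPS = 500    # MB/s per PCIe 2.0 lane
PCIE3_LANE_MBPS = 985    # MB/s per PCIe 3.0 lane

m6e_link = 2 * PCIE2_LANE_MBPS       # a PCIe 2.0 x2 link
future_link = 4 * PCIE3_LANE_MBPS    # a PCIe 3.0 x4 link

print(m6e_link)     # 1000 MB/s: the 1GB/s M.2 limit quoted above
print(future_link)  # 3940 MB/s: roughly the "4GB/s mark"
```

So the ceiling isn't the M.2 slot itself; it's how many lanes, and of which PCIe generation, the motherboard wires up to it.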
The Plextor M6e isn’t quite up in those numbers just yet. My preliminary tests have the 512GB version hitting sequential read/write figures of 676MB/s and 620MB/s respectively in the AS SSD benchmarking software.
While that bests any SATA-based SSD I’ve ever tested—including Samsung’s latest V-NANDtastic 850 Pro—the 4k random read/write results are nowhere near as spectacular.
At 31MB/s for the reads it’s up there with the best, but at just 72MB/s for the writes it’s considerably slower than the bargain-basement Crucial MX100 512GB SATA drive.
So, while the Plextor M6e is demonstrably breaking the limits of the SATA barrier, it’s not doing it by much and not making any inroads into the responsiveness 4k tests.
It’s early days for the M.2 interface, but once we get the proper SSD-focused NVMe standard rather than the current AHCI—the standard protocol for elderly spinning platter hard drives—I think we’ll start to see things change massively in the SSD world.
Source: PC Gamer
Posted By CybrSlydr @ 12:02 PM
Monday July 7th, 2014
| A new computer called the "HummingBoard" takes on the same basic shape as the Raspberry Pi but uses a more powerful processor and supports more operating systems.
SolidRun, which also makes the CuBox-i computer we wrote about, just started selling the HummingBoard in several configurations ranging from $45 to $100, not including the price of a power adapter and Micro SD card.
"The HummingBoard allows you to run many open source operating systems—such as Ubuntu, Debian, and Arch—as well as Android and XBMC," SolidRun says. "With its core technology based on SolidRun’s state-of-the-art Micro System on a Module (MicroSOM), it has ready-to-use OS images, and its open hardware comes with full schematics and layout. Best of all, as a Linux single board computer, the HummingBoard is backed by the global digital maker community, which means you can alter the product in any way you like and get full kernel upstreaming support and all the assistance you need."
HummingBoard uses a 1GHz ARMv7 processor rather than the 700MHz ARMv6 one that has worked well for the Raspberry Pi yet limits the number of operating systems it can run. HummingBoard configurations use single- and dual-core i.MX 6 chips based on the ARM Cortex-A9 architecture, and they range from 512MB to 1GB of memory.
Other features include OpenGL support, up to Gigabit Ethernet, support for mSATA and PCIe mini cards, HDMI, GPIO pins, LVDS display out, a camera interface, and powered USB.
The HummingBoard was "cleverly designed to mimic the Raspberry Pi’s dimensions and layout," Geek.com wrote. "That means it’ll fit into the hundreds of ready-to-use Raspberry Pi cases." In addition, "[t]he processor sits on its own module, which means you may be able to purchase upgrades for it in the future."
Here's a video from SolidRun that compares the HummingBoard to the Raspberry Pi: https://www.youtube.com/watch?v=dnGiYir07as
Source: Ars Technica
Posted By CybrSlydr @ 9:47 AM