Thursday August 21st, 2014
| AMD is preparing to announce three new FX processors on September 1, 2014, including models FX-8370, FX-8370E and FX-8320E. It is also stated that AMD will lower pricing on older FX processors and that there might be some new chipsets.
The new FX-series microprocessors from AMD, which are due to be formally introduced on September 1, 2014, are the FX-8370 and the FX-8370E, reports X-bit labs.
****it - I was just coming here to post this. lol
The 95W FX's look somewhat interesting, considering you get higher clocks at a lower TDP and, hopefully, less heat, which was always a detriment for me. I wonder how they will OC?
Posted By chartiet @ 9:59 AM
Friday August 15th, 2014
| Last Tuesday Microsoft issued its August updates for fixes and security; unfortunately, they render Windows 8.1 completely un-bootable for a lot of end-users, who end up with a black failure screen. The issue resides in two updated files. The Microsoft support forum is flooded with complaints about the so-called August update. Read more after the break.
Users that have a system restore point enabled can roll the OS back to its pre-update state and get Windows going again. Those that have System Restore disabled are in a world of hurt and might have to resort to a system OS backup, or revert to a clean install. For those with a system restore point, please make sure that you uninstall the following updates: KB2982791 and the optional update KB2975719, as these two are responsible for all this.
Posted By chartiet @ 11:32 AM
Saturday August 2nd, 2014
| Thought all you needed to get a 4K TV working is HDMI 2.0? Guess again. The next generation of content protection is called HDCP 2.2, and not only is it not backwards compatible, many new 4K devices don't even support it.
So it's possible that the 4K TV you bought last year, or even the receiver you buy this year, might not be able to receive/pass all future 4K content.
Sound crazy? Sadly, it's not. Here's the skinny.
What it is
Copy protection/content protection has been around since the VHS era, as anyone who tried to copy a Blockbuster rental can tell you. Back then it was called Macrovision, which evolved into CSS for DVD and finally HDCP, which stands for High-bandwidth Digital Content Protection, for Blu-ray players and HDTV devices like satellite and cable boxes.
HDCP 2.2 is the latest evolution of copy protection. It's designed to create a secure connection between a source and a display. Ostensibly this is so you can't take the output from a source (a Blu-ray player, say) and plug it into some kind of recorder, to make a copy of the content. DRM, the encryption of the content itself, is a separate issue. HDCP doesn't care what goes across the cable, as long as that cable is secure.
It does this by creating encrypted keys between the source and the display (called the sink). Enabled repeaters, like receivers, can be in the chain as well. The source and the sink need to be in agreement, understanding their keys, or no content gets transferred. If you've ever hooked up gear and gotten a blank screen (or turned on gear in the wrong order and gotten a blank screen), this HDCP "handshake" is usually the issue.
HDCP isn't solely over HDMI. It can be implemented to work over DVI, DisplayPort, USB, and more.
So what's new? The encryption on the keys in version 2.2 is more advanced than previous versions which, in theory, makes the whole chain harder to break. One other interesting change with 2.2 is a "locality check." The source sends a signal to the sink, and if the sink doesn't get that signal within 20ms, the source kills the connection. In theory, this shouldn't cause any issues in home setups, even over long HDMI runs (unless you have more than 3,740 miles of cable).
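That 3,740-mile figure falls straight out of the 20ms locality window; here's a rough back-of-the-envelope check, a sketch assuming the signal travels one-way at the speed of light in a vacuum (real cables propagate slower, so the practical limit is shorter):

```python
# Rough upper bound on cable length implied by HDCP 2.2's 20 ms locality check.
# Assumption: one-way propagation at the speed of light in a vacuum.
SPEED_OF_LIGHT_KM_S = 299_792.458
LOCALITY_WINDOW_S = 0.020  # the 20 ms locality window

def max_cable_miles(window_s: float = LOCALITY_WINDOW_S) -> float:
    """Distance a signal at light speed covers one-way within the window, in miles."""
    km = SPEED_OF_LIGHT_KM_S * window_s
    return km / 1.609344  # kilometres per mile

print(round(max_cable_miles()))  # roughly 3,726 miles, in line with the 3,740 quoted
```

In other words, no home theater HDMI run is anywhere near long enough for the locality check to trip on distance alone.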
See Source for rest of article.
Posted By CybrSlydr @ 9:54 PM
Tuesday July 29th, 2014
| Portable electronics are very convenient except for one thing: their battery life is horrible. No matter what the capacity of their battery is, they all require frequent recharging throughout the day and have a limited lifespan of between 400 and 1,200 charge cycles. Eventually, these batteries all die and need to be replaced with new (and often expensive) versions that will also meet the same fate. Still, these types of batteries are the best available and have also become a popular choice for electric vehicles, aerospace applications and even military projects.
These batteries are called lithium-ion (li-ion) and became the industry standard for consumer electronics in the early 1990s. For 25 years we have used them to power our cell phones, laptops and most gadgets that need to function without being plugged in all the time. But future applications in portable electricity will soon demand higher energy storage density and something will have to replace traditional li-ion batteries because they simply won’t be powerful enough.
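Those cycle-count ratings translate into a concrete lifespan; a minimal sketch, assuming a fixed number of full charge cycles per day (the one-cycle-per-day rate is an illustrative assumption, not a figure from the article):

```python
def lifespan_years(cycle_limit: int, cycles_per_day: float = 1.0) -> float:
    """Years until a battery reaches its rated charge-cycle limit."""
    return cycle_limit / (cycles_per_day * 365.0)

# A battery rated for 400 cycles, fully charged once a day:
print(round(lifespan_years(400), 1))   # about 1.1 years
# The 1,200-cycle upper end stretches that to:
print(round(lifespan_years(1200), 1))  # about 3.3 years
```

Which is why a heavily used phone often needs a battery replacement after a year or two, while a lightly cycled one can last several.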
Advantages of Li-ion Batteries
- High energy density with potential for higher capacities
- Don’t require prolonged priming when new
- Relatively low self-discharge rate
- Low maintenance
- Specialty cells can provide high current to many different types of applications
Disadvantages of Li-ion Batteries
- Require protection circuit to maintain voltage
- Subject to aging, even when not in use
- Must be stored in a cool place to reduce aging effect
- Transportation restrictions
- Expensive to manufacture
Many scientists have focused their research efforts on high-capacity electrode materials that use silicon and tin as anodes, and sulfur and oxygen as cathodes. But pure lithium metal is still the optimum choice because it has the highest capacity (3,860 mAh g–1) of them all. Unfortunately, it’s also very dangerous.
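The 3,860 mAh g–1 figure for lithium metal quoted above can be sanity-checked with Faraday's law; a quick sketch, assuming the standard Faraday constant and lithium's molar mass:

```python
# Theoretical specific capacity of an electrode material from Faraday's law:
# capacity (mAh/g) = n * F / (3.6 * M), where n is electrons transferred per ion.
FARADAY_C_PER_MOL = 96_485.332   # Faraday constant, coulombs per mole
LITHIUM_MOLAR_MASS_G = 6.941     # molar mass of lithium, g/mol

def specific_capacity_mah_per_g(molar_mass: float, electrons: int = 1) -> float:
    """Theoretical capacity in mAh/g; 1 mAh = 3.6 coulombs."""
    return electrons * FARADAY_C_PER_MOL / (3.6 * molar_mass)

print(round(specific_capacity_mah_per_g(LITHIUM_MOLAR_MASS_G)))  # 3861 mAh/g
```

That matches the quoted figure almost exactly, which is why lithium metal remains the theoretical ceiling that silicon, tin, sulfur and oxygen chemistries are measured against.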
Source: Newegg Unscrambled
Posted By CybrSlydr @ 5:23 PM
Monday July 28th, 2014
| Industry veteran John Romero is sceptical about the future of VR, but sees PC leaving consoles behind.
By Luke Reilly
Industry veteran John Romero, best known for his work at id Software as a designer for Wolfenstein 3D, Doom and Quake and later as the creator of Daikatana, believes PC and mobile are dominating console platforms through price but can’t really see the new wave of VR gaining much traction with most players.
Speaking to GamesIndustry.biz at the Strong National Museum of Play in Rochester, New York, at an event marking the addition of his old Apple II Plus computer to the museum's permanent eGameRevolution exhibit, Romero shared his thoughts on how free-to-play continues to shake up the industry.
“With PC you have free-to-play and Steam games for five bucks,” said Romero. “The PC is decimating console, just through price. Free-to-play has killed a hundred AAA studios.”
Romero believes there are two ways to do free-to-play and he hopes that players will gravitate towards games that get it right, comparing the model to the shareware era.
“It’s a different form of monetization than Doom or Wolfenstein or Quake where that’s free-to-play [as shareware],” said Romero. “Our entire first episode was free – give us no money, play the whole thing. If you like it and want to play more, then you finally pay us. To me that felt like the ultimate fair [model]. I'm not nickel-and-diming you. I didn't cripple the game in any design way.”
“Everybody is getting better at free-to-play design, the freemium design, and it’s going to lose its stigma at some point. People will settle into [the mindset] that there is a really fair way of doing it, and the other way is the dirty way. Hopefully that other way is easily noticeable by people and the quality design of freemium rises and becomes a standard. That’s what everybody is working hard on. People are spending a lot of time trying to design this the right way. They want people to want to give them money, not have to. If you have to give money, you’re doing it wrong... For game designers, that’s the holy grail.”
Romero went on to highlight the obvious technological advantages of PC over consoles (“With PCs if you want a faster system you can just plug in some new video cards, put faster memory in it, and you'll always have the best machine that blows away PS4 or Xbox One,” he said), although he remains unconvinced that VR headsets are going to make a significant impact.
“Before using Oculus, I heard lots of vets in the industry saying this is not like anything we’ve seen before. This is not the crap we saw back in the late ’80s. I was excited to check it out and I was just blown away by just how amazing it was to just be in an environment and moving my head was just like mouse-look. I thought that was really great but when I kind of step back and look at it, I just don’t see a real good future for the way VR is right now. It encloses you and keeps you in one spot – even the Kinect and Move are devices I wouldn’t play because they just tire you out.”
“VR is going away from the way games are being developed and pushed as they go back into multiplayer and social stuff. VR is kind of a step back, it's a fad.”
“Even though I’m excited about VR and how cool games look, I can’t see it becoming the way people always play games... If you're inside of a cockpit, that’s cool, but if you’re supposed to be running around a world and you can’t physically run but you can look around, it’s a weird disconnect and it doesn’t feel right.”
Posted By CybrSlydr @ 9:34 AM
Tuesday July 22nd, 2014
| Recently appointed CEO Satya Nadella today announced the largest layoffs in Microsoft’s 39-year history, with a staggering 18,000 jobs on the chopping block. The goal, according to Nadella, is to “simplify the way we work to drive greater accountability, become more agile and move faster”, signifying Nadella's aim to bring some focus to Microsoft's portfolio of services while also seemingly looking to play down the job losses.
The last large round of layoffs at Microsoft came in 2009, after the stock market crash. That round of layoffs was the previous largest ever at 5,800 positions, and today’s announcement dwarfs that number substantially. But not all departments will share this burden evenly, with the recently acquired Nokia employees getting the brunt of the cuts. In April, Microsoft closed the acquisition of the Nokia mobile phone business, and in the process added 25,000 employees to its payroll. Nadella announced today that 50% of those employees will be let go. Some will be factory workers from some of the in-house manufacturing Nokia owned, and the remainder will be from the handset business itself.
The remaining 5,500 employees to be laid off will therefore come from within Microsoft itself, as it attempts to concentrate on some of its more successful offerings. Excluding the Nokia losses, which are often expected after a merger of this sort, the total number of Microsoft employees being affected is not significantly different than the 2009 cuts.
Former Nokia CEO, now Microsoft Executive VP of Devices and Services, Stephen Elop laid out some of the upcoming changes in his own letter to his employees. Elop promises a focus on Windows Phone, with a near term goal of driving up Windows Phone volume by focusing on the affordable smartphone segments. With that announcement comes the death of the strange Nokia X series of AOSP phones, which debuted at MWC 2014 and were updated with a new model only a couple of weeks ago. While I would make the argument that there was little need for the X series at all, it is doubly frustrating to anyone who bought into the platform to find it killed off so quickly. The X series would be easy prey for cuts like these, because it didn’t really offer anything new to Android or to Microsoft. While it promised to be low cost, retail pricing for the X line was often more than the low cost Lumia phones. The X series had no place in a Microsoft owned Nokia, and should have been killed a while ago.
Elop also announced that they would continue to work on the high-end phone range as well. Historically, Windows Phone has struggled to sell flagship models for many reasons, but it appears that they are not ready to give up the fight in this market yet. He also specifically called out Surface, Perceptive Pixel, and Xbox as new areas of innovation, which likely means those brands are safe for the time being.
The remainder of the Nokia feature phone lines appear to be immediately canceled. This is a segment that has been rapidly shrinking in recent years, with the consumer push towards smartphones, so this is likely a good strategic move by Microsoft. The work done on Windows Phone to allow it to work well on low cost hardware is also likely another big reason for this.
Another major announcement was the closure of Xbox Entertainment Studios, which had a goal of providing original content for Xbox Live members. Several projects that were mid-production, such as “Signal to Noise” and “Halo: Nightfall”, will be completed, but after that content is delivered the studio will be closed.
The full ramifications of these job cuts won’t be known for some time, but it seems fair to say that Nadella wants to put his own stamp on the company. Removing the Nokia X line, the Asha and S40 lines, and an entertainment studio seem like reasonable things to cut if you want to focus your company. Nadella speaks about flattening the organization, which should help them execute on ideas more quickly. These kinds of steps, though painful for the employees, can be better for the company in the long run. For quite some time, the perception has been that Microsoft is not agile enough to respond to new markets, and it appears that Satya Nadella is trying to focus his company on its strengths, which should be a net positive for the company. Microsoft’s next earnings call comes on July 22nd, at which point we may get more details about upcoming plans.
Posted By CybrSlydr @ 8:48 PM
Wednesday July 16th, 2014
| That thing in the above picture is an SSD, and a hoofing big one too. The Plextor M6e is the first M.2 SSD I’ve had arrive in the office, and it’s a 512GB drive that aims to circumvent the limitations of current SATA connections by using the same PCI Express bus that's been providing oodles of bandwidth to graphics cards for years.
In fairness, some SSD manufacturers, like OCZ and Kingspec, have already been producing PCIe-based drives that slot in side-by-side with your graphics card. Those have been using the combined performance of multiple SSDs to create the extra speed, whereas this Plextor M6e is doing all the work itself.
The M.2 interface in most of the Z97 motherboards I’ve tested has a theoretical limit of 1GB/s compared with the 600MB/s limits of SATA. The beauty of using the PCIe bus is that in the future manufacturers can open up more PCIe lanes to allow for even higher possible bandwidth around the 4GB/s mark.
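Those ceilings follow from PCIe lane arithmetic; a sketch, assuming (as is typical for these boards, though the article doesn't specify) that the ~1GB/s M.2 figure is a PCIe 2.0 x2 link and the ~4GB/s future figure is PCIe 3.0 x4:

```python
# Usable bandwidth of a PCIe link, accounting for line-encoding overhead.
# Assumption: M.2 ~1 GB/s = PCIe 2.0 x2; future ~4 GB/s = PCIe 3.0 x4.
LANE_SPECS = {
    2: (5e9, 8 / 10),     # PCIe 2.0: 5 GT/s per lane, 8b/10b encoding
    3: (8e9, 128 / 130),  # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
}

def link_bandwidth_mb_s(gen: int, lanes: int) -> float:
    """Theoretical usable bandwidth of a PCIe link in MB/s."""
    transfer_rate, efficiency = LANE_SPECS[gen]
    return transfer_rate * efficiency / 8 / 1e6 * lanes  # bits -> bytes -> MB

print(round(link_bandwidth_mb_s(2, 2)))  # 1000 MB/s: today's M.2 ceiling
print(round(link_bandwidth_mb_s(3, 4)))  # 3938 MB/s: around the 4GB/s mark
```

The jump comes from both the faster signalling rate and the far more efficient 128b/130b encoding, which is why doubling the lanes roughly quadruples the usable bandwidth.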
The Plextor M6e isn’t quite up in those numbers just yet. My preliminary tests have the 512GB version hitting sequential read/write figures of 676MB/s and 620MB/s respectively in the AS SSD benchmarking software.
While that bests any SATA-based SSD I’ve ever tested—including Samsung’s latest V-NANDtastic 850 Pro—the 4k random read/write results are nowhere near as spectacular.
At 31MB/s for the reads it’s up there with the best, but at just 72MB/s for the writes it’s considerably slower than the bargain-basement Crucial MX100 512GB SATA drive.
So, while the Plextor M6e is demonstrably breaking the limits of the SATA barrier, it’s not doing it by much and not making any inroads into the responsiveness 4k tests.
It’s early days for the M.2 interface, but once we get the proper SSD-focused NVMe standard rather than the current AHCI—the standard protocol for elderly spinning platter hard drives—I think we’ll start to see things change massively in the SSD world.
Source: PC Gamer
Posted By CybrSlydr @ 12:02 PM
Monday July 7th, 2014
| A new computer called the "HummingBoard" takes on the same basic shape as the Raspberry Pi but uses a more powerful processor and supports more operating systems.
SolidRun, which also makes the CuBox-i computer we wrote about, just started selling the HummingBoard in several configurations ranging from $45 to $100, not including the price of a power adapter and Micro SD card.
"The HummingBoard allows you to run many open source operating systems—such as Ubuntu, Debian, and Arch—as well as Android and XBMC," SolidRun says. "With its core technology based on SolidRun’s state-of-the-art Micro System on a Module (MicroSOM), it has ready-to-use OS images, and its open hardware comes with full schematics and layout. Best of all, as a Linux single board computer, the HummingBoard is backed by the global digital maker community, which means you can alter the product in any way you like and get full kernel upstreaming support and all the assistance you need."
HummingBoard uses a 1GHz ARMv7 processor rather than the 700MHz ARMv6 one that has worked well for the Raspberry Pi yet limits the number of operating systems it can run. HummingBoard configurations use single- and dual-core i.MX 6 chips based on the ARM Cortex-A9 architecture, and they range from 512MB to 1GB of memory.
Other features include OpenGL support, up to Gigabit Ethernet, support for mSATA and PCIe mini cards, HDMI, GPIO pins, LVDS display out, a camera interface, and powered USB.
The HummingBoard was "cleverly designed to mimic the Raspberry Pi’s dimensions and layout," Geek.com wrote. "That means it’ll fit into the hundreds of ready-to-use Raspberry Pi cases." In addition, "[t]he processor sits on its own module, which means you may be able to purchase upgrades for it in the future."
Here's a video from SolidRun that compares the HummingBoard to the Raspberry Pi: https://www.youtube.com/watch?v=dnGiYir07as
Source: Ars Technica
Posted By CybrSlydr @ 9:47 AM
| I attended both Apple's and Google's developer conference keynotes last month, and I experienced strong deja vu on more than one occasion. Both companies talked about design and consistency. Both companies talked about improving back-end services. And both companies talked about new initiatives to make stuff on your phone appear seamlessly on your tablet or laptop.
"Users almost always have a smartphone with them, including when they are using a Chromebook," said Google's Sundar Pichai. "So we want to increasingly connect those experiences together, so they have a seamless experience across their devices."
At or around the time the Android L release comes out this fall, this means your phone and your Chromebook are going to be able to share even more stuff than they already do. If you have your phone with you, it can unlock your Chromebook (and if you have your smartwatch with you, it can unlock your phone). If you get a call or a text or your battery is running low, you'll be told about it on your Chromebook. Some Android apps are even going to be able to run in Chrome OS, though Google didn't talk much about the technical details.
It was all very similar to the "Continuity" feature that Apple's Craig Federighi showed off on the same stage in the same room three weeks before (at least both companies can still share a conference hall). When iOS 8 and OS X Yosemite arrive in the fall, AirDrop will be able to move files between iOS devices and Macs. "Handoff" can send e-mails, webpages, and even files from iCloud-enabled applications on iOS to their counterparts in OS X (or vice-versa). You can receive texts alongside iMessages in the Messages app, and you can make and receive phone calls from your Mac even if your phone is in another room.
This isn't about which company is copying from which—this kind of integration is a logical next step for both Apple and Google after years of moving various operating systems and services closer and closer together. This is about ecosystem lock-in. All of these features sound like great, logical ways to extend both companies' platforms, since you can often assume that someone using an Apple phone will be using an Apple computer. They're also going to make it harder than ever for you to extricate yourself from a given company's ecosystem once you've become embedded in it.
Follow link for rest of story
Source: Ars Technica
Posted By CybrSlydr @ 9:43 AM
| By Luke Karmali
Microsoft has announced Kinect 2 for PC costs £159 / $199.
The device is available now for pre-order on the Microsoft Store, ahead of release on July 15.
Though Kinect's primary use on Xbox One was arguably gaming, the device's launch on PC will target applications more heavily. It also ships without software, which is licensed separately.
It's also worth noting that this is not the device that will work with the new Kinect-less Xbox One; we still don't know how much you'll pay for one of those.
Posted By CybrSlydr @ 9:31 AM
Wednesday June 25th, 2014
| We will not reveal our source dev (whose employment we confirmed, and whose identity we vowed not to reveal), but the info should be taken with a grain of salt as a result. Today, I have the ultimate displeasure of informing the public that the E3 demo of The Division was apparently running on a PC and that the game will be downgraded overall. Developers often shoot for the moon and end up delivering next to nothing, with the final retail copies becoming visual garbage (Far Cry 3, Watch Dogs and Dark Souls II in comparison to their non-downgraded counterparts). This sort of false advertising and marketing absolutely has to stop. It is a vile and dementedly sick way for companies to make money off people who obviously preorder because the game is visually impressive. Yes, some may claim “gameplay weighs in more”, but this is arguable.
He tells us the following:
We really loved the reception to the demo we showed on the PC version at E3. Currently as it stands, there is definitely a lot of push coming from publishers to not make the experience so different on consoles as to alienate people into thinking that next generation is not as powerful as PC. This is probably what happened at Ubisoft Montreal. I think that while making stability changes is definitely important, it does not completely obliterate a lot of enhanced rendering applications.
Right now we already took out quite a lot of screen space reflections from the game and are working on asset management the best we can given consoles have that great unified memory. Naturally we will also be using online servers and have to produce a synchronization that higher graphics add to the latency so it had to be turned down. To me it still looks good, but not as good as the original reveal. I am sure as we get closer to launch and the actual console versions of the game featuring SD (Snowdrop) that it will start to seem all too obvious to people especially those on PCs. I just wanted to write and let you know that it definitely is not just stability but marketing politics plays into this a lot as well.
UPDATED 2nd Response from The Division Developer: Truth be told in regards to your question that while ‘Yes’ the lead platform is the PC, we simply cannot have such a big gap. As you know when the first WATCH DOGS Review was published by that one site, Ubisoft called it a “false review” and I am sure everyone can see how bad that sounded when they saw the game did look marginally better than something that was a last generation GTA IV. But no, they will not admit that they practice this or actively downgrade a game. It is much easier to say they removed things for stability which is often a lie as you can tell by the post-issues which are expected in any production we do.
Also, to answer your 3rd question: no, they will never fully disclose what was removed from which build, as no consumer-rights laws require them to do so. If we as developers published that information in very real terms for the consumer, such as “Replaced particle fog simulation with 2d layer simulation in 3d space, removed particles from all explosions, lowered explosion volume multiplier by 20x, removed X # of trees and civilians, etc.”, we would lose a lot of sales, and it would probably take too much time to deliver on the current hype that a lot of downgraded games enjoy, games which look incredible as a vertical slice. I share this in the hope that my colleagues, publishers and a lot of people who make false promises and give demonstrations that wrongfully create hype they cannot deliver on ultimately stop doing such things. I want to see the industry actually move forward and not be so full of itself by promising too much and delivering too little. Regards
Our insider, who currently works in the graphics technical division at Ubisoft Massive in Sweden, contacted us because he too is sick of the practices that a company like Ubisoft has become all too known for. If Ubisoft denies that downgrades have happened and uses the lame excuse that “it is for the gamers and stability we did what we did”, then there is certainly no reason for PC/console parity to exist, because the downgraded Watch Dogs still runs sub-par, which is an utter joke. Everyone knows “next generation”, as it currently stands, is utter marketing BS, even if a lot of uneducated folks out there believe it is more than that. Next gen means next gen! As it is, the PC's raw throughput exceeds that of any console, despite receiving less development focus (due to piracy). Essentially, if it is not obvious by now: "next gen" has diminished any chance of real graphics leaps, letting marketers keep cashing in on “next-gen” until the next next-gen comes out. It is a kind of marketing hype that is all too common in the gaming industry.
Bottom line: publishers and developers – stop lying and let actual gameplay that is close to the real thing do your marketing for you. And if you did remove a lot of features to improve the stability of the game, make sure to release a full disclosure of what was removed before the game comes out. Oh wait… but then you would not see as many sales. Tsk tsk.
Posted By: Usman Ihtsham
On Friday, June 20th, 2014
Source: What If Gaming
This isn't your normal news site, but it bears posting simply because of what it says - and it sounds reasonable.
Posted By CybrSlydr @ 9:03 AM
Monday June 23rd, 2014
| Crytek may be enduring some financial difficulties of late, as Eurogamer reports that the developer has missed payroll at both its Bulgarian and UK offices recently.
That report followed an article on German outlet GameStar, which claimed the developer was on the brink of bankruptcy. It cited a source with a large publisher as saying "the vultures are circling," and rival companies have begun attempts to poach the studio's best talent.
However, the report also said Crytek co-founder Avni Yerli admitted earlier this month that the studio's transition to free-to-play games had been difficult, but claimed the studio was also on the verge of securing new financing. GameStar suggested an acquisition from World of Tanks publisher Wargaming was a possibility, but Eurogamer said the company may be entertaining an investment from a Chinese outfit instead.
Eurogamer reported that staff at Crytek's Sofia studio in Bulgaria hadn't been paid for two months. Meanwhile, Crytek UK, which just recently unveiled Homefront: The Revolution, failed to pay employees on time. The site added that staff have been upset by what they see as a lack of transparency from management over those issues.
A Crytek representative denied the claims to Eurogamer, saying, "Regardless of what some media are reporting, mostly based on a recent article published by GameStar, the information in those reports and in the GameStar article itself are rumors which Crytek deny."
Source: Games Industry.biz
Related Update: 6/26/2014
40 employees have left and studio can't bring in replacements quick enough, source claims
A number of staff at Crytek UK have not been paid their full salary since April 21st, a source connected to the matter has told Develop.
The source, who has ties with the studio, said that since April, employees had received only small payments, of around £700 last month. At the time they had been told a deal was being made to secure money from Deutsche Bank, but that has since been delayed.
A further payment was made on June 16th, with staff then told to expect payment on Friday, June 27th. Our source claims, however, that “this now looks like it won't happen either”.
Crytek UK is currently working on Homefront: The Revolution with what has been a team of 90 developers.
Develop understands that since work on the game began, 40 staff have left, and the issues with salaries “has added to that number”. While there has been high turnover with new recruits coming in, it was said that the team cannot hire as fast as people are leaving.
It was also claimed that a number of staff have been promoted to senior roles recently, with such employees required to give three months' notice if they choose to resign. These staff have also allegedly not been given pay rises “equivalent to the job role”, leading to suggestions it was a tactic to ensure staff stay on at the studio.
As a result of these issues, morale at the studio is said to be generally low, with unhappiness particularly directed at “differences with the creative direction of the project”, as well as pay.
Despite the staff departures, our source said that following the cancellation of Ryse 2, many developers from the Frankfurt office – where the sequel to the Xbox One launch title was to be made – are now working with Crytek UK on Homefront.
Developers at the Germany-based office have had their own problems, however, with issues apparently starting as early as last year, though at that time the UK studio was unaffected. It is not currently clear exactly what has happened in Frankfurt.
A Crytek spokesperson declined to comment on the matter.
Source: Develop-Online.net
Welp, I mean look at EA - it was cheaper for DICE to make their own engine than to license and modify Crytek's engine.
They should have copied Epic; they had the talent. Oh well.
Posted By @dmin @ 8:20 PM
Wednesday June 11th, 2014
| One of the most interesting things I saw at this year’s Computex was CVision's glasses-free 3D technology. You likely have not heard of the company before because they are currently not spending any money on B2C marketing or PR as they are focusing on selling/licensing the technology to OEMs to bring it to the mainstream market.
The way their technology works is unique. Instead of requiring a special panel or hardware, all that is needed is a custom film, or convergence of thin-film barrier as it's officially called, that is applied on top of the panel. That film, along with CVision's software, is able to produce a 3D experience that doesn't require glasses, and to be honest, the quality was just awesome. CVision showed me a couple of short videos to highlight the 3D experience, and I didn't notice that it was 3D unless I specifically looked for it. I mean, that's how smooth it was. There was no ghosting or bleeding, just a sharp picture in 3D. The viewing angles were also as good as you would expect from an IPS panel -- the 3D effect didn't suffer at all even when viewed from an angle. Of course, if I moved the device or my eyes/head while playing back the video, the smoothness was lost, but as long as I held the device steady and focused on the screen there were no significant drawbacks compared to 2D.
CVision's software even supports 2D-to-3D conversion on the fly, so playing Angry Birds in 3D wasn't a problem at all, and it was actually very cool as the game is well suited to 3D. Photos can also be converted to 3D, and CVision showed me a couple of photos they had taken on the show floor with the phone. The camera itself was similar to what you can find inside any smartphone, so the conversion was done purely in software, and the result was decent. I'm not sure an exhibition hall is the best setting for 3D photos, as the effect obviously works best when you are focusing on one object, but the result was still clearly 3D -- just not as impressive as the videos or Angry Birds.
The main advantage of CVision's technology is that it can be applied to any device without the need for major re-engineering. The film itself is very thin and is the only thing that is needed in terms of hardware, and the prototype devices CVision had at their booth were as slim as any other high-end smartphone on the market. Currently the cost is about $3 per inch, but CVision believes it can cut this in half with higher volumes. The technology can scale to any size, but as CVision is more of a technology company than an actual manufacturer, they don't have the equipment to produce the films for TV-size screens at this moment. However, their roadmap does include a 42" 1080p TV, though it might be more of a concept at this point.
All in all, this is the first time I'm truly excited about 3D. I've never been a fan of the glasses, and all the glasses-free 3D technologies I've seen so far have had too many limitations to make them better than 2D, in my opinion. CVision is currently in talks with several smartphone and tablet OEMs to bring the technology to the mainstream market, and I sure hope the OEMs see the potential. I mean, either I got totally fooled by their (non-existent) marketing or their technology "just works".
Posted By CybrSlydr @ 1:34 PM
| We are now six months down the line from the AMD Kaveri launch, and the only two Kaveri processors available on Newegg today are the A10-7850K at $170 and the A10-7700K at $160. Both of these SKUs come with games as part of the purchase, but as AMD’s biggest desktop processor launch of the year, one might have expected more processors to come to market by this point. This is especially true as AMD sampled the A8-7600 SKU to media with a configurable TDP which showcased a large jump in graphics APU performance at the 45W TDP margin, but this model number has not hit consumer shelves in North America. Perhaps then we get a sigh of relief that AMD are announcing seven new Kaveri APUs, including that A8-7600.
Posted By CybrSlydr @ 1:32 PM
| If anyone outside Apple saw Swift coming, they certainly weren't making any public predictions. In the middle of a keynote filled with the sorts of announcements you'd expect (even if the details were a surprise), Apple this week announced that it has created a modern replacement for Objective-C, the programming language the company has used since shortly after Steve Jobs founded NeXT.
Swift wasn't a "sometime before the year's out"-style announcement, either. The same day, a 550-page language guide appeared in the iBooks store. Developers were also given access to Xcode 6 betas, which allow application development using the new language. Whatever changes were needed to get the entire Cocoa toolkit to play nice with Swift are apparently already done.
While we haven't yet produced any Swift code, we have read the entire language guide and looked at the code samples Apple provided. What follows is our first take on the language itself, along with some ideas about what Apple hopes to accomplish.
Source: Ars Technica
So, as someone who is completely unfamiliar with coding, how does this sound?
Posted By CybrSlydr @ 1:29 PM