Monday, October 20, 2008

GITEX: Intel displays computing for the distant future


Dubai: It might look like a high school science project now, but anyone looking at the wire- and tube-filled box at Intel's Gitex stand is looking at the future of personal computing.

Despite its rather ugly appearance, the machine is one of the most advanced pieces of technology at the show: an 80-core chip from Intel's research and development department.

However, visitors shouldn't expect anything spectacular. While the chip reaches speeds of up to one teraflop, or one trillion operations a second, about 100 times faster than many of the dual-core chips that are standard in computers today, it is being used only to run mathematical equations, albeit some very complex ones.
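As a rough sanity check on that figure, assuming for illustration that a dual-core desktop chip of the era managed on the order of 10 billion operations per second (a number not given in the article):

```python
# Back-of-the-envelope check (assumed figures, not from the article):
# the research chip at 1 teraflop vs. a dual-core chip assumed to
# deliver roughly 10 gigaflops.
teraflop_chip = 1e12   # operations per second
dual_core_chip = 10e9  # assumed typical dual-core figure

speedup = teraflop_chip / dual_core_chip
print(speedup)  # 100.0, matching the "about 100 times faster" claim
```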

"We can't run anything like game on it. The software hasn’t been written yet" said Nitin Borkar, a senior engineer at Intel who demonstrates what the chips can do.

While the machine may seem like a technophile's dream, the 80-core chip isn't available for sale and would be unusable to anyone who wanted to take it home. Software applications have to be designed to take advantage of multi-core chips, and no commercial programme today would be able to use that many cores.
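The software point is the crux: a programme benefits from extra cores only if its work is explicitly divided among them. A minimal sketch of the idea in Python (a hypothetical illustration, not Intel's software), splitting one summation across worker processes:

```python
# Minimal sketch: the same summation split across worker processes.
# Software written for a single core gains nothing from 80 of them;
# the work has to be divided up explicitly, as below.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))  # same answer as the serial sum
```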

According to Sam Al Schamma, Intel's regional director, it could be 10 to 15 years before that many cores begin appearing in home computers. By then, he says, the system will probably help users conduct searches that would be impossible today. One possible application would allow users to identify a person in a picture and have the computer search for other photos of that person, not by keyword but by comparing facial features.

"If you get to that stage, it will all be about pattern recognition," he said.

Al Schamma said multi-core chips are allowing manufacturers to continue what is known as Moore's Law, which states that the number of transistors on a chip doubles every 18-24 months. Before the advent of multi-cores, each new generation of computer chips was producing more and more heat, which could damage the computer. Manufacturers were being forced to add components, such as multiple fans, to their computers to help keep the chips cool.
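Moore's Law as stated here is simple compound doubling. Assuming an illustrative starting point of 100 million transistors (a figure not from the article) and the slower 24-month doubling rate:

```python
# Moore's Law as described in the article: transistor counts double
# every 18-24 months. Here the starting count (100 million) is an
# assumed, illustrative figure, projected at the 24-month rate.
def projected_transistors(start, years, doubling_years=2):
    return start * 2 ** (years / doubling_years)

print(projected_transistors(100e6, 10))  # 3.2 billion after a decade
```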

Multi-core chips help reduce the heat being generated, but the amount of heat generated by 80 cores was still enough to "just melt it down," Borkar said, so the chip's developers began looking for ways to minimise the chip's energy consumption. At top speed, the current 80-core chip uses about 35 watts, far below the 100 watts the engineers were originally aiming for, but it still requires a liquid-cooling system to keep it from getting too hot.

The 80-core chip isn't new – it was developed two years ago – but Borkar wouldn't reveal what else Intel has in research and development, except to say the company was working with AI (artificial intelligence).
