Thursday, September 16, 2010

Yahoo Computers Smash Pi Record

Nicholas Sze, of technology firm Yahoo, determined that the two-quadrillionth digit of pi - when expressed in binary - is 0.

Mr Sze used Yahoo's Hadoop cloud computing technology to more than double the previous record.

The computation took 23 days on 1,000 of Yahoo's computers, racking up the equivalent of more than 500 years of a single computer's effort.

The heart of the calculation made use of an approach called MapReduce, originally developed by Google, that divides large problems into smaller sub-problems and combines the answers to solve otherwise intractable mathematical challenges.

At Yahoo, a cluster of 1,000 computers implemented this algorithm to solve an equation that plucks out specific digits of pi.
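The divide-and-combine idea can be illustrated with a minimal sketch in plain Python - this is not Yahoo's actual Hadoop job, just the map/reduce pattern on a toy series, where independent chunks of terms are summed separately and the partial results are combined:

```python
from functools import reduce

def partial_sum(chunk):
    """Sum one chunk of a toy geometric series; chunks are independent,
    so each one could run on a different machine (the "map" phase)."""
    start, stop = chunk
    return sum(1.0 / 16 ** k for k in range(start, stop))

chunks = [(0, 5), (5, 10), (10, 15)]          # sub-problems handed to workers
partials = map(partial_sum, chunks)            # "map": solve each independently
total = reduce(lambda a, b: a + b, partials)   # "reduce": combine the answers

# Same result as summing the whole series in one go
assert abs(total - sum(1.0 / 16 ** k for k in range(15))) < 1e-12
```

The point of the pattern is that the chunks share no state, so adding machines speeds up the map phase almost linearly.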

The pursuit of longer versions of pi is a long-standing pastime among mathematicians.

But this approach is very different from the full calculation of all of the digits of pi - the record for that was set in January at 2.7 trillion digits.

Instead, each of the Hadoop computers was working on a formula that turns a complex equation for pi into a small set of mathematical steps, returning just one specific piece of pi.

"Interestingly, by some algebraic manipulations, (our) formula can compute pi with some bits skipped; in other words, it allows computing specific bits of pi," Mr Sze explained to BBC News.
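The best-known formula of this kind is the Bailey-Borwein-Plouffe (BBP) formula, which yields hexadecimal digits of pi at an arbitrary position without computing the digits before it. The sketch below implements that classic formula, not Yahoo's exact variant; modular exponentiation keeps every intermediate value small, which is what makes each digit an independent sub-problem:

```python
def pi_hex_digit(n):
    """Hexadecimal digit of pi at position n after the point (0-indexed),
    via the Bailey-Borwein-Plouffe digit-extraction formula."""
    def series(j):
        # Fractional part of sum over k of 16^(n-k) / (8k + j).
        # For k <= n, pow(16, n - k, 8k + j) does the modular exponentiation.
        s = 0.0
        for k in range(n + 1):
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        # A few tail terms with negative powers of 16, until negligible.
        k, term = n + 1, 1.0
        while term > 1e-17:
            term = 16.0 ** (n - k) / (8 * k + j)
            s = (s + term) % 1.0
            k += 1
        return s

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(frac * 16)

# pi = 3.243F6A88... in hexadecimal
print([pi_hex_digit(i) for i in range(4)])  # -> [2, 4, 3, 15]
```

Because each call is self-contained, thousands of positions can be computed in parallel - the property the article describes Hadoop exploiting at scale.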

Fabrice Bellard, who undertook the full calculation announced in January, told BBC News that the single-digit and full pi calculations are vastly different in the degree to which they can be "parallelised" - that is, cut into manageable pieces shared among different computers.

He said the current single-digit record is "more a demonstration of the Hadoop parallelisation framework... it can demonstrate the power of new algorithms that could be useful in other fields".

The record-breaking MapReduce approach, he said, is useful in physics, cryptography and data mining.

Mr Sze added that the calculation was also a good test of Hadoop hardware and methods.

"This type of calculation is useful in benchmarking and testing," he said.

"We have used it to compare the [processor] performance among our clusters."

