
Multicore for Mega to Be Chewed Over at High-tech Conference

Published: Wed 23 Jan 2013 09:39 AM


www.MulticoreWorld.com
Meaning of Multicore for Mega to Be Chewed Over at High-tech Wellington Conference - Feb 19th and 20th 2013
Press Release - Wednesday, 9 January 2013
Multicore World 2013
The Mega launch is an interesting case study of the revolution in computing processing power being brought about by multicore computing.
Kim Dotcom’s establishment of Mega (Mega.co.nz), a new and initially free way to store and share content through encrypted files, is a good illustration of multicore computing and parallel programming being put to use in a New Zealand context.
Multicore World 2013, at the Wellington Town Hall on February 19 and 20, is about the present and future of computing, says its founder Nicolas Erdody.
“Whatever its merits, Mega probably couldn’t exist and scale without the power of multicore computers and specialised programming, to deliver Kim Dotcom’s business model,” says Erdody.
“The challenges he’s had due to massive user demand would’ve been a hundred times worse without multicore – in fact, Mega would have been simply impossible without the many processors on one chip that multicore provides.
“In a sense it's a coincidence that the creation of Mega illustrates the importance of multicore; business, government and other organisations need to be at the conference to see where they fit into this rapidly changing environment.”
Mega’s establishment in New Zealand also reveals flaws in the country’s IT infrastructure.
The fact that the country doesn’t yet have a second major fibre-optic connection beyond the Southern Cross Cable means, for example, that cloud storage services cannot be provided from here for international use and consumption.
“The wider debate about what is required to build multicore-oriented competence and services out of New Zealand is to be had at the conference,” says Erdody. “There’s no other forum that addresses this key component of our IT future,” he says.
Multicore World 2013 boasts an international line-up of speakers who are authorities on the computer architecture that allows parallel processing and massively increased performance in computers, smartphones and other devices.
Among Multicore World 2013 experts are IBM’s Paul McKenney, Intel’s Tim Mattson, Prof Ian Foster of Argonne National Laboratory and FreeBSD’s Poul-Henning Kamp.
Discounted registration tickets to Multicore World 2013, sponsored by GreenButton, Catalyst IT and Scoop.co.nz, are available during January for $750. The full registration fee is $950.
Contacts
Nicolas Erdody, Director, Open Parallel. Nicolas.erdody@openparallel.com (027 521 4020)
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented. But as computing power has increased, processing speed has hit a physical barrier: more processing power cannot be packed into a single processor without the chip overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power-efficient, and the number of cores that can be added is, in theory, virtually unlimited.
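Software can even ask the operating system how many cores it has to work with. A minimal sketch in C, assuming a POSIX system such as Linux or Mac OS X (other platforms expose this through different calls):

    /* Minimal sketch: query the number of processor cores currently
       online. Assumes a POSIX system; Windows uses different calls. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        long cores = sysconf(_SC_NPROCESSORS_ONLN);  /* cores online now */
        printf("Cores online: %ld\n", cores);
        return 0;
    }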
Previously impossible computational tasks can now be achieved, and processes which previously took days or even weeks to perform can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written to work in parallel.
But parallel programming is different from traditional programming, and so far few programmers have experience with it.
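To give a flavour of the shift, here is a minimal sketch in C using POSIX threads (an illustrative example only, not code from Mega or the conference). It sums a large array by dividing the work across four threads: a sequential program would use a single loop, while the parallel version must split the data, keep the threads from writing to shared results, and combine the partial sums at the end.

    /* Minimal sketch: summing an array in parallel with POSIX threads.
       The array size, values and thread count are illustrative only. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    #define NTHREADS 4

    static long data[N];
    static long partial[NTHREADS];  /* one result slot per thread */

    static void *sum_chunk(void *arg) {
        long id = (long)arg;               /* this thread's index          */
        long lo = id * (N / NTHREADS);     /* start of this thread's slice */
        long hi = lo + (N / NTHREADS);     /* end of this thread's slice   */
        long s = 0;
        for (long i = lo; i < hi; i++)
            s += data[i];
        partial[id] = s;                   /* write only our own slot: no race */
        return NULL;
    }

    int main(void) {
        pthread_t t[NTHREADS];
        for (long i = 0; i < N; i++)
            data[i] = 1;                          /* known values, so the answer is N */
        for (long i = 0; i < NTHREADS; i++)       /* fan out: one thread per core */
            pthread_create(&t[i], NULL, sum_chunk, (void *)i);
        long total = 0;
        for (long i = 0; i < NTHREADS; i++) {     /* fan in: wait, then combine */
            pthread_join(t[i], NULL);
            total += partial[i];
        }
        printf("total = %ld\n", total);           /* prints total = 1000000 */
        return 0;
    }

Even in this tiny example the parallel mindset shows: the work must be divided evenly, the threads must not step on each other’s results, and the partial answers must be merged once every thread has finished.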
Multicore hardware is now mainstream, but the skills to program it remain (as yet) niche.
In the next 10-15 years there will be huge opportunities to translate ‘traditional’ sequential legacy code, and to create new software that takes full advantage of the thousands of cores in the next generation of chips.
Around the world, parallel computing is currently used to process the vast quantities of data produced by the internet and the "big data" originating from social networks and the millions of intelligent data-recording devices attached to the internet.
Here in NZ it is also used at Wellington's Weta Digital, the biggest CGI rendering facility in the world.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope – a global scientific project that New Zealand is a part of.
In addition, there is a wide range of services, solutions and systems integration challenges in connecting these two worlds – traditional sequential software and parallel multicore hardware – together.
ENDS
