
Distributed computing - the next big thing

Distributed computing is the process in which a single body of work is split into units and distributed to many computers across the world.

As technology accelerates the pace of innovation and discovery, our reliance on computer systems grows just as rapidly. The secret of this remarkable success has been the exponential growth of computing power, captured by Moore's law, which predicts that computing power doubles roughly every 18 months.

When human genome sequencing was first proposed, it was estimated the sequence would take 20 years to complete. Progress was dismal for the first few years, putting many detractors on the warpath citing technological infeasibility, morals and ethics, Frankenstein scenarios and whatever else they could conjure up. But surprisingly, the entire sequencing effort was completed in a very short time; so short, in fact, that we have since been able to sequence the genomes of a variety of species. This was because of the exponential growth in processing capacity in the late 1990s.


This remarkable rise in computing power has given birth to new scientific possibilities. For instance, the earlier method of finding cures or potential drugs was to actually produce them, research them and test them. Today, we can simulate complex drugs and proteins in their zillion combinations and test them virtually. This may not hand us a cure on a platter, but it filters the noise out from the genuine possibilities; the fewer the possibilities scientists have to work through, the quicker the discovery.

The computing power of an ordinary calculator is many times greater than that of the systems that put man on the moon. Yet in spite of this exponential growth, there are calculations that no single computer system can handle, nor is the required computing power concentrated in one place.

To harness the latent computing potential of diverse systems spread across different locations, David Anderson and his team at Berkeley developed software called BOINC (Berkeley Open Infrastructure for Network Computing), one of the first platforms for mass distributed computing. BOINC splits the "work" into several million units and sends each 'work unit' to an individual computer. When the results are received after computation, they are validated and entered into a database for further processing.
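As a rough illustration of that lifecycle, the sketch below splits a job into work units, mimics several volunteer hosts computing each unit, and accepts a result only when a quorum of hosts agree. All the names and the quorum size here are illustrative assumptions of mine, not BOINC's actual code or API.

```python
# A minimal sketch of a BOINC-style work-unit lifecycle (illustrative only).
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class WorkUnit:
    unit_id: int
    payload: list                           # the slice of data this unit processes
    results: list = field(default_factory=list)

def split_work(data, unit_size):
    """Split a large job into independent work units."""
    return [WorkUnit(i, data[i:i + unit_size])
            for i in range(0, len(data), unit_size)]

def compute(unit):
    """Stand-in for the science code a volunteer's computer would run."""
    return sum(x * x for x in unit.payload)

def validate(unit, quorum=2):
    """Accept a result only when `quorum` hosts agree on it --
    redundancy like this guards against faulty or malicious machines."""
    value, votes = Counter(unit.results).most_common(1)[0]
    return value if votes >= quorum else None

# Each unit is "sent" to several volunteers; here we simply call compute()
# three times to mimic three independent hosts returning answers.
units = split_work(list(range(100)), unit_size=10)
for unit in units:
    unit.results = [compute(unit) for _ in range(3)]
    assert validate(unit) is not None       # validated, ready for the database
```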

Scientists have always faced the problem of finding enough computing power to analyze their data. For instance, when David Baker of the University of Washington tried to use computers to predict the structure of proteins, he found the task was far beyond what his university's computers could do. Baker then brought his project onto Anderson's BOINC platform, starting the project called Rosetta@home. This project, along with the iconic SETI@home, has been among the longest-running distributed computing projects. SETI@home uses distributed computing to analyze the radio signals it collects; the objective is to filter genuine signals out of the white noise in the search for signs of extraterrestrials.

Speed is of the essence for BOINC, and it is useful because anybody can lend this latent potential to a good scientific cause. Blue Gene from IBM is the fastest supercomputer, with a crunching capacity of 360 trillion operations per second. The second fastest system, the Earth Simulator in Japan, crunches 40 trillion operations per second. BOINC and its supporters have achieved the milestone of 1 petaflop, or 1,000 trillion operations per second, at a fraction of what these dedicated supercomputers cost to build and maintain.
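For a sense of scale, here is the back-of-the-envelope arithmetic behind those figures, taken exactly as quoted above (they date the comparison to the mid-2000s):

```python
# Throughput figures as stated in the post, in operations per second.
blue_gene = 360e12      # IBM Blue Gene
earth_sim = 40e12       # Earth Simulator
boinc     = 1000e12     # BOINC volunteer network, 1 petaflop

print(boinc / blue_gene)   # ~2.8x the fastest supercomputer
print(boinc / earth_sim)   # 25x the second fastest
```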

Parallel computing

The origins of these systems lie in parallel computing. In parallel computing, a task is split across several processing cores that work at the same time, so more gets done than a single core could complete in the same period. BOINC uses the same concept, except that the "cores" are volunteer computers connected through the internet.
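Here is a toy sketch of that idea using Python's standard multiprocessing module to fan a CPU-bound task out over four local cores. The task and chunk sizes are made up for demonstration; in BOINC the "workers" would be volunteer machines rather than local processes.

```python
# Split one job into chunks and process them on several cores at once.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) -- deliberately CPU-bound."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Four chunks, one per worker process, covering 0..100,000 between them.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below 100,000: {total}")   # 9592
```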

Cloud computing

Distributed computing is not limited to social and volunteer causes; it is also used extensively by companies to reduce their hardware investments and squeeze the most out of the resources they already own. For example, pharmaceutical companies like Glaxo have used distributed computing within their own organizations, saving millions of dollars in purchasing, running and maintaining new hardware.

Conclusion

Distributed computing is the way of the future. The participation of the general public will only increase over the years, and the computing will shift from just computers to other devices like mobile phones, DVD players, televisions, calculators and the like; such a shift has already begun, with projects emerging to harness the potential of the PlayStation 3. Participating in these projects is very simple, the only requirement being an internet connection to reach the central servers. The programs are highly configurable, so users do not experience any 'nuisance' or slowing down of their computer systems. There are wide-ranging projects a user can select based on his or her inclination and interest. It is also a fun way to stay in touch with a field of interest that we would otherwise not have time for.
