Distributed computing is an approach in which a single large task is split into work units and distributed across many computers around the world.
As technology intensifies its pace of innovation and discovery, reliance on computer systems is growing just as rapidly. The secret of this remarkable success is the exponential growth of computing power, captured by Moore's law, which states that computing power doubles roughly every 18 months.
When human genome sequencing was first proposed, it was estimated that completing the sequence would take 20 years. Progress was dismal for the first few years, putting many detractors on the warpath over technological feasibility, morals and ethics, Frankenstein scenarios and whatever else they could conjure up. Yet, surprisingly, the sequencing was completed in a very short time; so short, in fact, that we have since been able to sequence the genomes of a variety of species. This was made possible by the exponential growth in processing capacity in the late 90s.

This remarkable rise in computing power has opened up new scientific possibilities. For instance, the earlier method of finding cures or potential drugs was to actually synthesize them, research them and test them. Today, we can simulate complex drugs and proteins in their zillions of combinations and test them virtually. This may not hand us a cure on a platter, but it filters the noise out from the genuine possibilities, and the fewer possibilities scientists have to work through, the quicker the discovery.
The computing power of an ordinary calculator today is many times greater than that of the systems that put man on the moon. Yet in spite of this exponential growth, there remain calculations that no single computer system can handle, and the resources needed for them are rarely concentrated in one place.
To harness the latent computing potential of diverse systems spread across different locations, Dave Anderson and his team developed software called BOINC, the first form of mass distributed computing. BOINC splits the work into several million units and sends each 'work unit' to an individual computer. When the results are received after computation, they are validated and entered into a database for further processing.
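The split-compute-validate cycle can be sketched in a few lines. This is an illustrative simulation, not BOINC's actual code; the quorum-of-replicas check stands in for BOINC's practice of sending the same unit to several volunteers and accepting a result only when enough of them agree.

```python
from collections import Counter

def split_work(data, unit_size):
    """Split a large job into independent work units."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def simulate_volunteer(unit):
    """Stand-in for a volunteer computer processing one unit."""
    return sum(unit)  # the 'computation' here is just a sum

def validate(replica_results):
    """Accept a result only if at least two replicas agree (BOINC-style redundancy)."""
    value, count = Counter(replica_results).most_common(1)[0]
    return value if count >= 2 else None

# Server side: split the job, send each unit to 3 volunteers, validate replies.
job = list(range(100))
database = []
for unit in split_work(job, unit_size=10):
    replicas = [simulate_volunteer(unit) for _ in range(3)]
    checked = validate(replicas)
    if checked is not None:
        database.append(checked)

print(sum(database))  # equals sum(job) when every unit validates
```

In the real system the volunteers are untrusted machines on the internet, which is exactly why the redundancy step matters: a single faulty or malicious host cannot corrupt the database.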

Scientists have always faced the problem of analyzing their data. For instance, when David Baker of the University of Washington tried to use computers to predict the structure of proteins, he found the task was far beyond what his university's computers could handle. Baker then built his project on Dave Anderson's BOINC platform and started the project called Rosetta. This project, along with the iconic SETI@Home, is among the longest-running distributed computing projects. SETI@Home uses distributed computing to analyze the radio signals it collects; the objective is to separate genuine signals from white noise in the search for signs of extraterrestrial life.
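The core idea of separating a genuine signal from white noise can be illustrated with a simple threshold test. This is a toy sketch only: SETI@Home's real analysis is far more sophisticated, involving Fourier transforms, Doppler drift correction and pulse detection, while here we merely flag power readings that rise several standard deviations above the noise floor.

```python
import random
import statistics

random.seed(42)

# Simulated power readings: Gaussian background noise with a few
# strong 'signals' injected at known positions (illustrative only).
readings = [random.gauss(0.0, 1.0) for _ in range(1000)]
for i in (100, 500, 900):        # positions of the injected signals
    readings[i] += 10.0

mean = statistics.fmean(readings)
sigma = statistics.pstdev(readings)
threshold = mean + 5 * sigma     # flag anything well above the noise floor

candidates = [i for i, power in enumerate(readings) if power > threshold]
print(candidates)
```

Pure noise almost never crosses a five-sigma threshold, so whatever survives the cut is worth a closer look; this filtering is exactly the kind of embarrassingly parallel work that distributes well across volunteer machines.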
Speed is the essence of BOINC, and its appeal is that anybody can put this latent potential to use for a good scientific cause. Blue Gene from IBM is the fastest computer, with a crunching potential of 360 trillion operations per second; the second fastest, the Earth Simulator in Japan, crunches at 40 trillion operations per second. BOINC and its supporters have crossed the milestone of 1 petaflop, or 1,000 trillion operations per second, at a fraction of what these dedicated supercomputers cost to build and maintain.
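A quick back-of-the-envelope calculation using the figures quoted above shows the scale of the volunteer network:

```python
# Throughput figures quoted above, in operations per second.
TRILLION = 10 ** 12
blue_gene = 360 * TRILLION        # IBM Blue Gene
earth_simulator = 40 * TRILLION   # Earth Simulator, Japan
boinc = 1000 * TRILLION           # BOINC volunteer network (1 petaflop)

print(boinc / blue_gene)          # ~2.8x the fastest supercomputer
print(boinc / earth_simulator)    # 25x the second fastest
```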
Parallel computing
The origins of these systems lie in parallel computing, where a task is divided across several processing cores at the same time, so more work gets done than a single core could complete in the same period. BOINC uses the same concept, except that the 'cores' are volunteer computers connected over the internet.
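The divide-and-combine pattern looks like this in miniature; here a sum of squares is split into chunks and farmed out to local worker processes, where BOINC would instead farm the chunks out to volunteer machines. The function names are illustrative, not from any particular library.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    """Work done independently by one core (or, in BOINC, one volunteer PC)."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(numbers, workers=4):
    # Split the input into one chunk per worker, compute in parallel, combine.
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))
```

Because each chunk is independent, the workers never need to talk to one another; that independence is what lets the same pattern scale from four cores in one machine to millions of computers across the internet.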
Cloud computing
Distributed computing is not limited to social and volunteer causes; it is also being used extensively by companies to reduce their hardware investments and make maximum use of existing resources. For example, pharmaceutical companies such as Glaxo have used distributed computing within their own organizations, saving millions of dollars on purchasing, running and maintaining new hardware.
Conclusion
Distributed computing is the way of the future. Public participation will increase over the years, and computing will shift from just computers to other devices like mobile phones, DVD players, televisions and calculators; projects are already emerging to harness the potential of the PlayStation 3. Participating in these projects is very simple, the only requirement being an internet connection to reach the central servers. The programs are highly configurable, so users do not experience any 'nuisance' or slowing down of their computer systems. There is a wide range of projects a user can select from based on his or her inclination and interest. It is also a fun way to stay in touch with a field of interest we would otherwise not have time for.