Definitions

A supercomputer (also called a high-end computer) is

a powerful computer consisting of many processors that work in parallel to perform a complex task.[1]

Supercomputers are

very tightly coupled computer clusters with lots of identical processors and an extremely fast, reliable network between the processors.[2]
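
To make the definition concrete, here is a minimal sketch (plain standard-library Python, not drawn from the cited sources; the problem size and function names are illustrative). It divides one large computation among several worker processes, the same divide-the-work-and-run-in-parallel idea that a supercomputer applies across thousands of tightly coupled processors:

    # Split one large summation across parallel worker processes.
    from multiprocessing import Pool

    def partial_sum(bounds):
        """Compute one slice of the overall summation."""
        start, stop = bounds
        return sum(i * i for i in range(start, stop))

    if __name__ == "__main__":
        n, workers = 10_000_000, 8          # problem size, process count
        step = n // workers
        chunks = [(i * step, (i + 1) * step) for i in range(workers)]
        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))  # slices run in parallel
        print(total)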

Overview

Supercomputers were introduced in the 1960s and were designed primarily by Seymour Cray at Control Data Corporation (CDC), which led the market into the 1970s, when Cray left to form his own company, Cray Research. With his new designs he then took over the supercomputer market, holding the top spot in supercomputing for five years (1985-1990). In the 1980s a large number of smaller competitors entered the market, in a parallel to the creation of the minicomputer market a decade earlier, but many of them disappeared in the mid-1990s "supercomputer market crash." Today, supercomputers are typically one-of-a-kind custom designs produced by "traditional" companies such as Cray, IBM, and HP, which purchased many of the 1980s companies to gain their experience.

The term supercomputer itself is rather fluid, and today's supercomputer tends to become tomorrow's ordinary computer. Current parallel designs are based on "off-the-shelf" server-class microprocessors, such as the PowerPC, Opteron, or Xeon, and most modern supercomputers are highly tuned computer clusters that combine commodity processors with custom interconnects.
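
The article does not show how such clusters are programmed, so a brief, hedged illustration may help. Cluster supercomputers are usually programmed in a message-passing style; the sketch below assumes the third-party mpi4py package and an MPI runtime (launched, for example, with mpiexec -n 4 python sum.py), neither of which the cited sources name. Each process ("rank") computes a share of the work locally, and the partial results are combined over the interconnect:

    # Message-passing sketch: each rank works locally, then results are
    # combined across the network. Assumes mpi4py and an MPI runtime.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's id within the job
    size = comm.Get_size()   # total number of cooperating processes

    # Strided share of the work: rank r handles i = r, r+size, r+2*size, ...
    local = sum(i * i for i in range(rank, 10_000_000, size))
    total = comm.reduce(local, op=MPI.SUM, root=0)  # sum over the interconnect

    if rank == 0:
        print(total)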

Supercomputers have vast processing capability and are typically used for scientific and engineering applications such as weather forecasting, aircraft and automotive design, and pharmaceuticals. They are also employed in areas of research that generate huge datasets, such as particle physics and genetics. Over time, these capabilities will become available to the mass market, as new applications in media, gaming, and ubiquitous computing require massive resources. The major computer and Internet companies are already recognising and exploiting these opportunities, but realising them may depend on the skills to design software that can exploit the massive processing power.[3]

Common uses

Supercomputers are used for highly calculation-intensive tasks such as problems involving quantum mechanical physics, weather forecasting, climate research (including research into global warming), molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), physical simulations (such as simulation of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research into nuclear fusion), cryptanalysis, and the like. Major universities, military agencies, and scientific research laboratories are heavy users.

Supercomputers are used to model and simulate complex, dynamic systems with many data points that would be too expensive, impractical or impossible to physically demonstrate. Data scientists write codes and algorithms that simulate each individual component of the model or process being explored. This means that scientists can simulate the evolution of our universe star by star and galaxy by galaxy, or even the heart's electrical system at the cellular level. Supercomputers also play a critical role in keeping our nuclear stockpile safe, secure and effective. Supercomputer simulations help scientists understand everything from weapon design to safety features and overall performance — all without physical testing.[4]
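
As a deliberately tiny illustration of "simulating each individual component" (plain Python with numpy; a sketch of the general pattern, not taken from the cited DOE source), the model below is discretised into components, here the cells of a one-dimensional heat-diffusion grid, and every cell is updated from its neighbours at each time step. Production simulation codes follow the same pattern across billions of cells, with the grid partitioned over many processors:

    # Toy 1-D heat diffusion: every grid cell is one simulated "component,"
    # updated from its neighbours each step (explicit finite differences).
    import numpy as np

    cells, steps, alpha = 100, 500, 0.1   # grid size, time steps, diffusion rate
    temp = np.zeros(cells)
    temp[cells // 2] = 100.0              # start with a hot spot in the middle

    for _ in range(steps):
        # each interior cell relaxes toward the average of its two neighbours
        temp[1:-1] += alpha * (temp[2:] - 2 * temp[1:-1] + temp[:-2])

    print(temp.round(2))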

References

  1. "What is a supercomputer?" (full-text).
  2. Australian Public Service Better Practice Guide for Big Data, at 11.
  3. Technology and Innovation Futures, at 32-33.
  4. Department of Energy, "Big Science: Supercomputing at the National Labs" (June 9, 2014) (full-text).

This page uses Creative Commons Licensed content from Wikipedia.