GETTING STARTED WITH SUPERCOMPUTING
An Approach for High School Students

by Donald W. Hyatt
Thomas Jefferson High School
for Science and Technology

1.0 Introduction

Because of remarkable advances in computer technology, scientists now have a new problem-solving tool: the supercomputer. Supercomputers supported by high-powered graphics workstations are extremely valuable resources for a wide range of scientific investigations. High performance computing is becoming increasingly important to many scientific and engineering disciplines, so it is important for educators to prepare future generations for that growing demand.

Supercomputer access was originally reserved for graduate students and research scientists. However, in 1988 this power was made available to high school students through a national contest called SuperQuest, sponsored by ETA Systems, a division of Control Data Corporation. After Control Data closed down ETA Systems and terminated SuperQuest in 1989, this exciting educational initiative was picked up by several of the national supercomputing centers, including the Cornell Theory Center, the National Center for Supercomputing Applications, and the University of Alabama/Alabama Supercomputing Network, with joint sponsorship from the National Science Foundation and various private corporations. To be ready for the technical expertise that will be required in the next century, both students and teachers must learn to use high performance computing and computational science as powerful investigative tools.

As a teacher whose students won the ETA10-P supercomputer, the million-dollar grand prize in that first SuperQuest, I have felt an obligation to develop curriculum appropriate for the secondary school level that would utilize our supercomputer. Although the ETA10-P at Thomas Jefferson High School for Science and Technology is no longer working due to an unfortunate leak in the roof of our computer lab, many of the materials we have developed can be used on far more modest machines. At Jefferson, we now use ordinary Pentium systems in a networked UNIX (Linux) environment. By utilizing PVM (Parallel Virtual Machine), a parallel programming environment developed at Oak Ridge National Laboratory, we are able to attain computational power comparable to that of our former supercomputer.
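
As a rough sketch of what PVM programming looks like, the short master program below enrolls in the virtual machine, spawns several copies of a worker executable across the networked machines, and collects one integer result from each. The worker program name, the number of tasks, and the message tag are invented for this illustration.

    /* master.c -- a minimal PVM sketch; it assumes a worker program   */
    /* named "worker" is installed on each host in the virtual machine */
    #include <stdio.h>
    #include <pvm3.h>

    #define NWORKERS   4
    #define RESULT_TAG 1              /* arbitrary message tag */

    int main(void)
    {
        int tids[NWORKERS];           /* task ids of the spawned workers */
        int i, value, sum = 0;

        pvm_mytid();                  /* enroll this process in PVM */
        pvm_spawn("worker", (char **)0, PvmTaskDefault, "", NWORKERS, tids);

        /* gather one integer from each worker and accumulate the results */
        for (i = 0; i < NWORKERS; i++) {
            pvm_recv(-1, RESULT_TAG); /* receive from any task   */
            pvm_upkint(&value, 1, 1); /* unpack a single integer */
            sum += value;
        }
        printf("Sum of worker results: %d\n", sum);

        pvm_exit();                   /* leave the virtual machine */
        return 0;
    }

A matching worker would compute its piece of the problem, pack its result with pvm_initsend() and pvm_pkint(), and send it back to the master with pvm_send(); both programs are linked against the PVM library.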

2.0 Background
2.1 Science Before Supercomputers

Throughout history, scientists have been searching for the fundamental laws which govern our universe. Dr. Clifford N. Arnold [1], former manager of Computational Research at ETA Systems, pointed out to participants of the first SuperQuest that there are really four fundamental methods of scientific research, one of which involves supercomputing.

Until recently, only three of these techniques had been used for most scientific investigations. In the first method, observational science, researchers study some situation or phenomenon and then carefully document their discoveries. Some examples of observational science include studying the social behavior of gorillas in various habitats, or mapping the geological formations in the Grand Canyon.

The second method is referred to as experimental science. With this technique, an experiment is designed that will provide some insight into a basic scientific principle. It is important in experimental science to have control groups for comparison, and to try to hold many factors constant in order to isolate cause and effect. Examples of experimental science include tests to determine the appropriate concentrations for a new medication, or comparative tests of airplane wing designs in a wind tunnel.

In the third method, theoretical science, a law or theory is hypothesized and then substantiated by additional research and rigorous mathematics. Examples of theoretical science include the complex equations describing fluid flow, and the familiar formula E = mc^2 from Einstein's theory of relativity.

2.2 Computational Science

The fourth and newest method of scientific investigation uses advanced computer technology and is referred to as computational science. Because high performance computers have become so powerful, scientists are now able to use them as tools to study a wide variety of very complex problems. There are applications in almost every subject area. For detailed examples of some standard supercomputer applications, the author suggests the introductory text by Karin and Smith, The Supercomputer Era [6], and various publications including Supercomputer Review [9], the International Journal on Supercomputing Applications [14], many of the ACM and IEEE conference proceedings [10], or supercomputer resources on the Internet. Supercomputers can help devise safer automobiles, build better airplanes, create special effects in Hollywood movies, and can even be used to design faster supercomputers.

A very practical application of supercomputer power is in the area of computer modeling and simulation. One distinct advantage of certain computer models is that they can be used to speed up extremely slow processes in order to predict potential outcomes in the future. Long range weather forecasting and the problems of global warming are environmental studies which are being investigated by supercomputer models. Scientists have even been able to study the effects of seemingly harmless forest management techniques that have now contributed to increased forest fire potential.

Computer simulation models are also excellent choices for studying processes that happen too quickly to observe by direct experiment. They are very useful for investigating the behavior of things that are too small to examine by any physical means. Some examples include molecular dynamics simulations and three-dimensional modeling of chemical compounds that can help scientists understand the physical properties of these structures.

Computer models can be used to explore situations that researchers cannot experience directly. For instance, scientists can investigate the nature of black holes using supercomputer simulations. Others have developed computer models of developing thunderstorms in order to understand the internal conditions that bring about severe weather phenomena such as tornados.

2.3 Graphics Visualization

Closely related to supercomputing is the field of graphics visualization. Instead of just looking at numbers, scientists can display supercomputer output in a visual form using high quality computer graphics. Through computer graphics, researchers are often better able to comprehend large quantities of data and notice subtle trends. In the series NCSA RealTime [14], the National Center for Supercomputing Applications has compiled some excellent video segments that show the wide range of graphics visualization techniques being used in certain supercomputer applications.

Some graphics visualization approaches may use just simple two-dimensional graphs of lines or dots on the screen, while others can display three-dimensional renderings of objects, contours, and surfaces. Advanced visualization techniques may include computer animations with realistic colors, reflections, and shading [4]. At Jefferson, we use the OpenGL graphics library to give our students access to graphics visualization.
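
To give a flavor of what such a program looks like, the short sketch below uses OpenGL with the GLUT toolkit to open a window and plot a sine curve as a simple two-dimensional line graph. The window size, title, and plot range are arbitrary choices for this illustration.

    /* sineplot.c -- a minimal OpenGL/GLUT sketch that plots y = sin(x) */
    #include <math.h>
    #include <GL/glut.h>

    void display(void)
    {
        double x;

        glClear(GL_COLOR_BUFFER_BIT);        /* erase the window   */
        glColor3f(1.0, 1.0, 0.0);            /* draw in yellow     */
        glBegin(GL_LINE_STRIP);              /* connect the points */
        for (x = 0.0; x <= 6.28318; x += 0.01)
            glVertex2f((float)x, (float)sin(x));
        glEnd();
        glFlush();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
        glutInitWindowSize(500, 250);
        glutCreateWindow("Simple Plot");
        glMatrixMode(GL_PROJECTION);         /* map plot coordinates */
        gluOrtho2D(0.0, 6.28318, -1.2, 1.2); /* onto the window      */
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }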

Color is a very valuable tool in graphics visualization. It can be used to enhance images or provide emphasis to details that might not be readily apparent. Engineers can better view air currents and turbulence around new car designs with color enhancement. Architects can look for regions of material stress in structures that might fatigue under heavy use.
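
One simple way to apply color in this fashion is to map each data value onto a color ramp. The small routine below shows one possible mapping; the function name and the particular blue-to-red ramp are our own choices for illustration, and the resulting red, green, and blue values could be handed to glColor3f() to shade the points or cells of a plot like the one above.

    /* Map a value in [lo, hi] onto a simple blue-to-red color ramp:  */
    /* low values come out blue, high values come out red.            */
    void value_to_color(double value, double lo, double hi,
                        float *r, float *g, float *b)
    {
        double t = (value - lo) / (hi - lo);   /* normalize to 0..1       */

        if (t < 0.0) t = 0.0;                  /* clamp out-of-range data */
        if (t > 1.0) t = 1.0;

        *r = (float)t;                         /* red grows with the value  */
        *g = 0.0f;                             /* no green in this ramp     */
        *b = (float)(1.0 - t);                 /* blue fades as value rises */
    }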

Doctors are using three-dimensional imaging to view cancerous tumors in the body which might be very difficult to discern by conventional x-rays. Using similar techniques, anthropologists can even investigate the interior of ancient mummies without ever unwrapping them.

3.0 Supercomputing Applications for High School Students

Students and teachers at the secondary school level should realize that many of the previously mentioned applications using supercomputers involve huge projects with teams of scientists, technicians, and programmers. The mathematics being used in some of these investigations is often far beyond the scope of a typical high school curriculum. However, there are computational science techniques that can be used to augment the typical high school program. Some methods that have been used successfully at Thomas Jefferson High School for Science and Technology are presented in the following examples.

3.1 Example #1 - Missile Trajectory Simulation

This activity first teaches students to develop a simple computational model of a projectile in flight. It then uses a computational science approach to investigate the complete problem space where all possible angles are matched with all possible velocities to look at the valid combinations that form the solution set.
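
The sketch below illustrates the basic structure of such a model; the target distance, tolerance, and step sizes are invented for this illustration. A small function steps the projectile forward in time with a simple Euler method, and two nested loops sweep the angle/velocity problem space, printing each combination that lands near the chosen target.

    /* trajectory.c -- sweep the angle/velocity problem space of a projectile */
    #include <stdio.h>
    #include <math.h>

    #define PI 3.14159265358979
    #define G  9.8        /* acceleration of gravity (m/s^2)  */
    #define DT 0.01       /* time step for the simulation (s) */

    /* Simulate one launch and return the horizontal range in meters. */
    double range(double angle_deg, double velocity)
    {
        double theta = angle_deg * PI / 180.0;
        double x = 0.0, y = 0.0;
        double vx = velocity * cos(theta);
        double vy = velocity * sin(theta);

        do {                          /* Euler time stepping until it lands */
            x  += vx * DT;
            y  += vy * DT;
            vy -= G * DT;
        } while (y > 0.0);
        return x;
    }

    int main(void)
    {
        double target = 1000.0;       /* desired range in meters (example)  */
        double tol    = 5.0;          /* how close counts as a hit (meters) */
        double angle, vel;

        /* Sweep the angle/velocity combinations and report the solution set. */
        for (angle = 1.0; angle < 90.0; angle += 1.0)
            for (vel = 10.0; vel <= 200.0; vel += 1.0)
                if (fabs(range(angle, vel) - target) < tol)
                    printf("angle %5.1f deg  velocity %6.1f m/s\n", angle, vel);
        return 0;
    }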

3.2 Example #2 - Gypsy Moth Population Growth Simulation

This activity utilizes an approach to problem solving called a cellular automaton. It studies a biological simulation of a population of voracious caterpillars as they expand into adjacent territory and devour existing resources.
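
The toy program below suggests how such a cellular automaton might be organized; the grid size, spread rule, and number of generations are all invented for this illustration. Each cell of a grid holds a food supply, an infested cell consumes its own food, the infestation spreads to neighboring cells that still have food, and starved cells die out.

    /* gypsy.c -- a toy cellular automaton of a spreading caterpillar outbreak */
    #include <stdio.h>
    #include <string.h>

    #define SIZE  20          /* grid dimensions, chosen for illustration */
    #define STEPS 10          /* number of generations to simulate        */

    int food[SIZE][SIZE];     /* remaining food in each cell              */
    int bugs[SIZE][SIZE];     /* 1 if the cell is infested, 0 otherwise   */

    int main(void)
    {
        int i, j, t;
        int next[SIZE][SIZE];

        /* start with uniform food and a single infested cell in the center */
        for (i = 0; i < SIZE; i++)
            for (j = 0; j < SIZE; j++)
                food[i][j] = 5;
        bugs[SIZE / 2][SIZE / 2] = 1;

        for (t = 0; t < STEPS; t++) {
            memcpy(next, bugs, sizeof(next));
            for (i = 0; i < SIZE; i++)
                for (j = 0; j < SIZE; j++)
                    if (bugs[i][j]) {
                        if (food[i][j] > 0)
                            food[i][j]--;          /* devour the local food */
                        else
                            next[i][j] = 0;        /* starve out            */
                        /* spread to the four neighbors that still have food */
                        if (i > 0        && food[i-1][j] > 0) next[i-1][j] = 1;
                        if (i < SIZE - 1 && food[i+1][j] > 0) next[i+1][j] = 1;
                        if (j > 0        && food[i][j-1] > 0) next[i][j-1] = 1;
                        if (j < SIZE - 1 && food[i][j+1] > 0) next[i][j+1] = 1;
                    }
            memcpy(bugs, next, sizeof(bugs));
        }

        /* print the final grid: '*' infested, '.' eaten bare, digit = food left */
        for (i = 0; i < SIZE; i++) {
            for (j = 0; j < SIZE; j++)
                putchar(bugs[i][j] ? '*' : (food[i][j] ? '0' + food[i][j] : '.'));
            putchar('\n');
        }
        return 0;
    }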

4.0 Conclusions

At Jefferson, we have been fortunate to have the computational resources and talented students to experiment with as we try to develop some new secondary school curriculum in computational science. A summary of the courses we offer through the Computer Systems Lab is available via the net. We suggest you investigate some of the materials we present here, and would appreciate your sharing successful activities with our teachers and other interested educators.

We have been pleased to see our students use many of the tools they have learned in our elective courses as they pursue other projects and activities at Jefferson. We feel that the diversity of senior projects in the CS Lab involving computational science is a direct result of the dynamically evolving curriculum we try to offer. We are especially gratified by the large number of former students who have gone on to advanced degrees at prestigious universities, and are kind enough to keep us informed of their accomplishments. After ten years of supercomputing at TJHSST, we feel very confident that the experiment started by ETA Systems was an unqualified success.

5.0 Bibliography


dhyatt@tjhsst.edu