Earlham College Cluster Computing Group


Overview


Introduction

    The Cluster Computing Group at Earlham is a group of faculty and student researchers working on tools and techniques for effectively using Beowulf style compute clusters for interdisciplinary computational science research. Within that framework we have focused on these particular areas:

    • Parallel Programming and Cluster Computing Education. Software and hardware tools and curriculum modules for teaching parallel programming to undergraduate students and faculty (a minimal example of the kind of program such modules begin with appears below, after this list).

    • Computational Science/Computational Thinking Education. Curriculum modules for teaching computational science to undergraduate students and faculty.

      These projects are done in conjunction with the Shodor Education Foundation/National Computational Science Institute, HPC-U, TeraGrid/XSEDE, and the SuperComputing Conference series.
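
      As a taste of what these modules cover, here is a minimal MPI "hello world" of the sort a parallel programming curriculum typically opens with. It is an illustrative sketch only (the actual module contents live on the project pages below) and assumes an MPI implementation such as Open MPI or MPICH is installed.

        /* Minimal MPI program: every process reports its rank. */
        #include <stdio.h>
        #include <mpi.h>

        int main(int argc, char **argv)
        {
            int rank, size;

            MPI_Init(&argc, &argv);                /* start the MPI runtime         */
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's id, 0..size-1  */
            MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes     */

            printf("Hello from rank %d of %d\n", rank, size);

            MPI_Finalize();                        /* shut the runtime down cleanly */
            return 0;
        }

      Compiled with mpicc hello.c -o hello and launched with mpirun -np 4 ./hello, it prints one line per process, a natural first exercise on a small cluster such as LittleFe or a machine booted from the BCCD.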

Projects

    Active Projects

      Acme : Acme = LittleFe + Bootable Cluster CD + HPC-U Curriculum Modules

      LittleFe : The inexpensive, portable cluster for parallel programming and cluster computing education, http://LittleFe.net

      The Bootable Cluster CD (BCCD) : Software tools for parallel programming and cluster computing education, http://bccd.net

      • TeraGrid10 Paper (Best in EOT Track)

      Blue Waters Undergraduate Petascale Education Program : Working with the Shodor Foundation, we are developing curriculum modules to support teaching students and faculty how to design, build, test, and scale parallel software for petascale computational resources. (Blue Waters modules)

      National Computational Science Institute (NCSI) workshop instruction and curriculum module development. The subset of these workshops that we focus on is designed primarily for faculty who want to incorporate parallel programming and cluster computing into the undergraduate computer science curriculum. Other workshops in the series cover computational science/thinking at a variety of levels, as well as computational biology, chemistry, engineering, and physics for teachers of those disciplines.

    Archived Projects

      Low Latency Linux Kernel

      Folding@Clusters : Harnessing HPC resources for large scale distributed molecular dynamics

      Benchmarking and tuning molecular dynamics packages

      Methods for calculating 1/sqrt(x) in the context of molecular dynamics simulations
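
      Because 1/sqrt(x) is evaluated for every pair distance when normalizing inter-atomic vectors, fast approximations to it matter in molecular dynamics. The sketch below shows one well-known approach for 32-bit floats: a cheap bit-level initial guess refined by a single Newton-Raphson step. It is illustrative only and is not necessarily one of the methods examined in the project above.

        /* Approximate 1/sqrt(x) for 32-bit floats: a bit-level initial
         * guess followed by one Newton-Raphson refinement,
         *   y <- y * (1.5 - 0.5 * x * y * y).
         * Illustrative sketch only. */
        #include <stdint.h>
        #include <string.h>
        #include <stdio.h>
        #include <math.h>

        static float approx_rsqrt(float x)
        {
            float y = x;
            uint32_t i;

            memcpy(&i, &y, sizeof i);      /* reinterpret the float's bits */
            i = 0x5f3759df - (i >> 1);     /* rough initial estimate       */
            memcpy(&y, &i, sizeof y);

            return y * (1.5f - 0.5f * x * y * y);  /* one Newton step      */
        }

        int main(void)
        {
            float r2 = 2.0f;               /* e.g. a squared pair distance */
            printf("approx %f vs libm %f\n", approx_rsqrt(r2), 1.0f / sqrtf(r2));
            return 0;
        }

      A single refinement step brings the relative error to well under 1%; trading extra Newton steps against speed is the kind of accuracy-versus-performance question such projects explore.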

Facilities

Currently we support 7 modest clusters and the associated server, software, and network plumbing. The two most powerful, Al-salam and BobSCEd, are listed first:

  1. Al-salam : 13 SuperMicro 1U nodes, each with 2 4-core Intel Xeon E5530 CPUs (104 cores total) and 16GB of RAM. Two of the nodes have NVIDIA Tesla cards. There is about 1.5TB of local disk storage.

  2. BobSCEd : 8 nodes, each with 2 motherboards, each motherboard carrying one 4-core Xeon CPU (64 cores total); runs Debian over an InfiniBand network fabric, ~500 GFLOPS peak (see the peak-performance sketch after this list).

  3. Cairo : 4 PowerPC G4 dual-processor nodes, each with 1GB of RAM, running Yellow Dog Linux. This cluster was effectively retired in 2010; it was a 16-node system. Detailed description

  4. Bazaar : 4 PIII dual-processor nodes running SuSE Linux. This cluster was effectively retired in 2010; it was a 20-node system. Detailed description

  5. t-voc : ... This cluster was named after the grandfather of Ned Thanhouser of Intel, who arranged to have it donated to the Cluster Computing Group.

  6. LittleFe :

  7. BigFe :
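
  To sanity-check the core counts and the ~500 GFLOPS figure above, the sketch below multiplies out nodes x CPUs x cores x clock x FLOPs/cycle. The FLOPs-per-cycle value and the BobSCEd clock speed are assumptions for illustration, not measured numbers.

    /* Back-of-the-envelope peak performance for the two larger clusters.
     * Clock speeds and FLOPs/cycle are assumptions, not benchmarks. */
    #include <stdio.h>

    static double peak_gflops(int nodes, int cpus_per_node, int cores_per_cpu,
                              double ghz, double flops_per_cycle)
    {
        return nodes * cpus_per_node * cores_per_cpu * ghz * flops_per_cycle;
    }

    int main(void)
    {
        /* Al-salam: 13 nodes x 2 CPUs x 4 cores = 104 cores; the Xeon E5530
         * is a 2.4 GHz part, and we assume 4 FLOPs/cycle. */
        printf("Al-salam: %d cores, ~%.0f GFLOPS peak\n",
               13 * 2 * 4, peak_gflops(13, 2, 4, 2.4, 4.0));

        /* BobSCEd: 8 nodes x 2 motherboards x 1 quad-core Xeon = 64 cores;
         * assuming 2.0 GHz and 4 FLOPs/cycle gives roughly 500 GFLOPS. */
        printf("BobSCEd:  %d cores, ~%.0f GFLOPS peak\n",
               8 * 2 * 4, peak_gflops(8, 2, 4, 2.0, 4.0));

        return 0;
    }

  With these assumptions BobSCEd works out to about 512 GFLOPS, consistent with the ~500 GFLOPS quoted above; sustained application performance is of course only a fraction of peak.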

Support

    Our work is supported by donations and grants from the National Computational Science Institute/Shodor Education Foundation, the SC Education Program, the Intel Corporation, Freescale Incorporated, Genesi Incorporated, Ray Ontko & Company, the Howard Hughes Medical Institute, The Arthur Vining Davis Foundation, Earlham College, and Safe Passage Communications, Inc.