
July 2016



Research Computing Newsletter: July 2016

What’s New

This newsletter!  We hope that this new monthly newsletter will help keep you updated on the services, events, and other activities related to research computing on campus.  If you have suggestions for items that you feel would be useful to include, please send them to us or fill out our Google form.

The HPC Cluster has a name!  Welcome to RedCat!

After more than ten years of operating a nameless HPC cluster at CWRU, we've decided that it should have a name.  Please let us introduce RedCat!  RedCat is named after the athletic team mascot of Western Reserve University, the Red Cats.

Please use the new load-balanced hostname to access the RedCat cluster rather than the names you've used in the past.  It provides access to a set of two (or more) login nodes that automatically load-balance, so all users are spread approximately evenly among them.  As loads increase, we will consider adding more login nodes to keep response times where you need them.

PLEASE NOTE: This will work for simple SSH connections, but may not work if you are using NX or another GUI interface.  If that is the case, please use one of the individual login-node hostnames.  Please let us know right away if you run into this or any other issue when trying to access the cluster.
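For command-line users, a host alias in ~/.ssh/config keeps the connection details in one place.  The sketch below uses a placeholder hostname and username, since the actual load-balanced address is the one announced by Research Computing:

```
# ~/.ssh/config entry (HostName below is a placeholder; substitute the
# load-balanced RedCat address announced by Research Computing)
Host redcat
    HostName redcat.example.edu
    User your_case_id
```

With this entry in place, "ssh redcat" connects you to whichever login node the load balancer selects.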

We will be "rebranding" our HPC documentation to make it clear that it refers to our newest SLURM-based cluster, RedCat.  If you still need to use the old, minimized Moab-based cluster, you can access it using the old method for a limited time.

Many thanks, again, for your patience and understanding over the past months as we phased in the new SLURM-based cluster, RedCat.

HPC Maintenance

Both HPC clusters will be shut down for scheduled quarterly maintenance on July 14 for 24 hours. During this downtime, we will upgrade the compute nodes to RHEL 6.7.  We will also move the hpctransfer and hpcviz head nodes to the RedCat cluster; they will serve as the transfer and visualization nodes, just as before, but now within the new RedCat cluster. After this shutdown, the old hostname will provide the only access to the old Moab-based cluster. This very small cluster is provided only for users who still cannot run their jobs on the RedCat cluster, and it will be maintained for a limited time for that purpose.

Research Storage Maintenance

The Research Data Storage (RDS) service, based on Dell FluidFS technology, will be migrated on July 14 to a new network that increases performance capacity for users who access the service (dell-ffs-ksl) outside of our data centers, whether from their offices or from off campus. The maintenance outage will begin at 12 AM and be completed by 6 AM on July 14.

Featured Service

High Performance Computing

Besides load-balancing the login nodes of RedCat and replacing Moab with the SLURM resource manager, high performance computing at CWRU has been improved in other important ways recently. During the past year, we installed 12 Nvidia K40 GPU cards to support massively parallel computing.  We upgraded our high performance storage (Panasas) director blades, which helped mitigate the storage issues that were present during the past winter. In the upcoming year, we will be acquiring 18 new compute nodes and replacing some older 10Gb switches as part of our HPC refresh efforts, providing higher computational capacity for the research community.  
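With the move from Moab to SLURM, jobs on RedCat are submitted with sbatch scripts.  The following is a minimal sketch only; the resource values are illustrative, and the GPU request line reflects common SLURM convention rather than CWRU-specific settings, so please consult the RedCat documentation for actual partition names and limits:

```shell
#!/bin/bash
# Minimal SLURM batch script sketch (resource values are illustrative).
#SBATCH --job-name=example
#SBATCH --ntasks=1
#SBATCH --time=00:10:00
#SBATCH --mem=2G
# To use one of the K40 GPUs, a typical SLURM request looks like:
# #SBATCH --gres=gpu:1

msg="Job running on $(hostname)"
echo "$msg"
```

Submit the script with "sbatch script.sh" and check its status with "squeue -u $USER".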

Snapshot of HPC as of July 1, 2016:

Total CPU cores: 3,120

Total GPU cores: 58,880

Total system-wide RAM: 11,776 GB

Total high performance storage: 170 TB

Number of faculty labs registered to date: 253

Number of users registered to date: 1295

HPC Statistics for July 2015 - June 2016:

Submitted jobs: 2.15 million

Total CPU hours consumed:  8.6 million

Number of active faculty labs: 237

Number of active users: 999 

Research Computing provides a full catalog of services and support options available to the research community. Please see our service catalog for more information.


Cyberinfrastructure Day 2016

Our second annual Cyberinfrastructure Day was held on May 7, 2016 at the Tinkham Veale University Center Ballroom. This year's theme was “Big Data and Machine Learning”. Our featured speakers discussed topics such as medical image interpretation for predicting diseases, material degradation evaluation from power data, and human sound and movement interpretation. The playlist of the recorded talks is available online:

2016 Cyberinfrastructure Day: Welcome and Research Computing

2016 Cyberinfrastructure Day: Computational Imaging - Dr. Anant Madabhushi

2016 Cyberinfrastructure Day: Big Data Analytics - Dr. Roger French

2016 Cyberinfrastructure Day: Multimodal Communication - Dr. Mark Turner

2016 Cyberinfrastructure Day: Big Data in Humanities - Dr. Alan Craig

2016 Cyberinfrastructure Day: Big Data Management

2016 Cyberinfrastructure Day: CWRU Research Computing Resources

2016 Cyberinfrastructure Day: Ohio Supercomputer Center

2016 Cyberinfrastructure Day: New XSEDE Resources

2016 Cyberinfrastructure Day: Big Data in Higher Education

Tea Time SDLE - Summer 2016, through July 28

This summer, Dr. Roger French’s group from the Applied Data Science minor program continues to have afternoon “tea time” sessions on Tuesday, Wednesday, and Thursday from 3:30 until 4:15 in White 411 on topics of interest in data science. The sessions are open to all interested parties and no registration is needed.

Previous sessions (on R and Python) can be viewed online.

Future sessions include:

July 12-14: LaTeX

July 19-21 and July 26-28: Statistics for Data Science: linear models (lm), predictive models, SEM and network models, and time-series and forecasting models

HPC Bootcamp - Tuesday, Sep 13, 9 a.m. to 5 p.m., Toepfer Room, Adelbert Hall

The first session in our HPC training series will kick off in September. Topics include an introduction to HPC at CWRU, submitting jobs, using MATLAB on RedCat, and much more!

We are always looking to promote events going on in the research community. Please let us know if you would like to have your event advertised in the newsletter.