HCC AI Resources page is now available!  

HCC helps University of Nebraska researchers develop, run, and scale modern Artificial Intelligence (AI) and Machine Learning (ML) workflows.

For more information on the AI/ML resources, services, software, and training HCC provides, please see https://hcc.unl.edu/ai


Using Large Language Models (LLMs) on Swan  

HCC has recently installed new tools for working with Large Language Models (LLMs) on Swan, including a system-wide Ollama module and an LM Studio Open OnDemand app. In addition to the Ollama module, several common public models (e.g., llama, gpt-oss, and deepseek) are available system-wide via the `mldata` module.

More information can be found on the Using LLMs on HCC resources page. 
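As a quick illustration, once the Ollama server is running on a Swan compute node, a model can be queried from Python over Ollama's local REST API. This is a minimal sketch, not HCC's documented workflow; the model name below is illustrative, and the exact module and setup steps are covered on the Using LLMs on HCC page.

```python
# Minimal sketch: query a locally served Ollama model over its REST API.
# Assumes the Ollama server is already running on this node (default port 11434)
# and that the model named below is available (e.g., via the mldata module).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL = "llama3.1"  # illustrative model name; substitute one available on Swan

payload = {
    "model": MODEL,
    "prompt": "Summarize what a Slurm job array is in two sentences.",
    "stream": False,  # return the full response at once instead of streaming
}

response = requests.post(OLLAMA_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["response"])  # the model's generated text
```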


HCC In The News

UNL professor weighs in on growing power demands of artificial intelligence


Reducing file quota impact of conda environments 

Conda environments often consist of many small files, which can quickly approach the 5-million-file quota on HCC's filesystems. These files also place a substantial load on the filesystems' metadata servers, which can degrade performance.

HCC now has a guide on converting a conda environment into a single file using Apptainer, available HERE.
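To gauge how much a given environment contributes to your file quota before converting it, one simple approach is to count the files it contains. This is a hypothetical sketch; the environment path below is illustrative and should be replaced with the location of your own environment (e.g., from `conda env list`).

```python
# Sketch: count the files in a conda environment to estimate its quota impact.
import os

ENV_PATH = os.path.expanduser("~/.conda/envs/my-env")  # illustrative path

file_count = 0
for _root, _dirs, files in os.walk(ENV_PATH):
    file_count += len(files)

print(f"{ENV_PATH} contains {file_count:,} files")
```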


Class account information 

Are you interested in using HCC resources for your class? Instructors from the University of Nebraska can request an HCC class group.

More details on this are available in the Class Group section of HCC’s Policies. 


HCC is hiring! 

HCC is hiring undergraduate and graduate students for multiple positions. For more information, please see: 


Upcoming Training Opportunities! 

HCC will be hosting an intermediate Python workshop covering machine learning and big data in October. If you are interested, please keep an eye on your email or check the Upcoming Events page.


Looking to reduce your wait time on Swan? 

HCC wants to hear more about your research! If you acknowledge HCC in your publications, posters, or journal articles, you can receive a boost in priority on Swan! 

Check out the links below for more information!

Submit a request for the priority boost

https://hcc.unl.edu/acknowledgement-submission 

Details on Acknowledgement Credits

https://hcc.unl.edu/docs/submitting_jobs/hcc_acknowledgment_credit/ 


HCC Summary for the Last Month:

CPU Hours Utilized: 6.7 Million Hours

GPU Hours Utilized: 69,600 Hours

New Software Installs: 19 Software Packages

Data Stored: ~3,200 TiB

Training Events in the last month:

FDST867 Class Introduction
HCC Kickstart Fall 2025
GP-ENGINE - Migrating AI/ML workflows to Nautilus


Additional Computational Resources

  • The National Research Platform (NRP) provides free access to a variety of distributed CPUs, GPUs, and storage, arranged into a Kubernetes cluster. For more information, please see https://hcc.unl.edu/docs/nrp/

  • The OSG Consortium provides free access to distributed compute and storage resources suitable for high-throughput computing. For more information, please see https://hcc.unl.edu/docs/osg/

  • PATh provides free, dedicated, and distributed compute resources to NSF-funded projects. For more information, please see https://path-cc.io/

  • The NAIRR Pilot provides free Researcher and Classroom/Educator resources for U.S. researchers and educators. For more information, please see https://nairrpilot.org/

  • ACCESS allocations are free and available to any U.S.-based researchers or educators. For more information, please see https://access-ci.org/