Holland Computing Center Updates
https://hcc.unl.edu

February 15th Open Office Hours
Mon, 12 Feb 2024
https://news.hcc.unl.edu/publications/february-15th-open-office-hours

HCC will not be hosting the virtual Open Office Hours on February 15th, as HCC staff will be participating in internal training.

If you have any questions during this period, please feel free to contact HCC support at [email protected].

Tags: Announcement

January 18th Open Office Hours
Mon, 08 Jan 2024
https://news.hcc.unl.edu/publications/january-18th-open-office-hours

HCC will not be hosting the virtual Open Office Hours on January 18th, as HCC staff will be participating in internal training.

If you have any questions during this period, please feel free to contact HCC support at [email protected].

Tags: Announcement

HCC Holiday Schedule 2023
Tue, 12 Dec 2023
https://news.hcc.unl.edu/publications/hcc-holiday-schedule-2023

In accordance with the NU holiday schedule, HCC staff will be on break from Friday, December 22, 2023, through Monday, January 1, 2024. All HCC resources will remain operational during this break.

HCC staff will monitor the systems to ensure availability throughout the break. HCC User Services staff will periodically check the ticketing system and address any system-critical issues; non-critical tickets will be addressed when we return from winter break on January 2, 2024.

Please email [email protected] if you have any questions.

Happy Holidays!

Hongfeng Yu, Director
Holland Computing Center

Tags: Announcement

HCC now has courses!
Thu, 05 Oct 2023
https://news.hcc.unl.edu/publications/hcc-now-has-courses

The Holland Computing Center now has three optional courses available.

These courses are only available to those within the University of Nebraska system at this time.

HCC Group Owner Important Information

Link: https://nebraska.bridgeapp.com/learner/courses/a90b1752/enroll

A short course strongly recommended for new group owners.

It covers HCC policies, account creation, an overview of data storage (including the purge policy on $WORK), support, training, and acknowledgment credits.

HCC Account Important Information

Link: https://nebraska.bridgeapp.com/learner/courses/77529776/enroll

A short course strongly recommended for those new to using HCC resources.

It covers HCC policies, managing your account, an overview of data storage (including the purge policy on $WORK), support, training, and acknowledgment credits.

Introduction to Holland Computing Center Resources

Link: https://nebraska.bridgeapp.com/learner/courses/edb77314/enroll

This longer course provides an introduction to using HCC’s Swan supercomputer, including basic data management, software usage, submitting jobs, reviewing jobs, and acknowledgment credits. It is recommended for those new to using HCC resources and anyone wanting to learn more about HCC.

Tags: New, Announcement, Training

Crane Storage Final Reminder
Tue, 01 Aug 2023
https://news.hcc.unl.edu/publications/crane-storage-final-reminder

Crane was retired as an HPC cluster resource on July 1st, 2023, with a planned power-off of the remaining storage and service nodes on August 1st, 2023. We have had a handful of requests to extend the availability of Crane so that ongoing file transfers can finish. We are extending the power-off to the firm date of September 1st, 2023, and will not extend or provide access beyond that time. On September 1st, 2023, the Crane login and transfer hosts will be turned off completely, and the /work filesystem storage will be erased in compliance with NU policies. If you do not act soon, any data not copied elsewhere from /work by September 1st, 2023, will be lost permanently.

Tags: Announcement, Crane

Crane Retirement - FINAL Reminder
Mon, 26 Jun 2023
https://news.hcc.unl.edu/publications/crane-retirement-final-reminder

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside its other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. This will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 640 million core hours and 2.3 million GPU hours to over 2,600 Nebraska researchers spanning 140 departments. Crane has stopped accepting new job submissions; existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023, to allow migration of any remaining data.

With the retirement of Crane, all data stored on both the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files as soon as possible. For precious data, HCC provides the Attic resource, which reliably stores data for a nominal cost. The $COMMON filesystem is another option: it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note that data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.
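
For those who prefer to script transfers, the Globus Python SDK can submit this kind of transfer programmatically. The sketch below is a minimal illustration, not an official HCC recipe: the token, endpoint UUIDs, and paths are placeholders to replace with your own values (collection UUIDs can be looked up in the Globus web app), and it assumes you already hold a valid transfer access token.

    # Minimal sketch using the Globus Python SDK (pip install globus-sdk).
    # The token, endpoint UUIDs, and paths below are placeholders, not
    # real HCC values; substitute your own before running.
    import globus_sdk

    TRANSFER_TOKEN = "..."                       # obtained via a Globus auth flow
    CRANE_ENDPOINT = "<crane-collection-uuid>"   # placeholder
    SWAN_ENDPOINT = "<swan-collection-uuid>"     # placeholder

    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
    )

    # The checksum sync level makes Globus verify data on both ends.
    tdata = globus_sdk.TransferData(
        tc, CRANE_ENDPOINT, SWAN_ENDPOINT,
        label="Crane work to Swan", sync_level="checksum",
    )
    tdata.add_item("/work/mygroup/myuser/", "/work/mygroup/myuser/",
                   recursive=True)

    task = tc.submit_transfer(tdata)
    print("Submitted Globus task:", task["task_id"])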

As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns. 

Tags: Announcement, Crane

Crane Retirement - Reminder and GPU Update
Mon, 22 May 2023
https://news.hcc.unl.edu/publications/crane-retirement-reminder-and-gpu-update

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside its other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. This will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 640 million core hours and 2.3 million GPU hours to over 2,600 Nebraska researchers spanning 140 departments. Crane is planned to stop accepting new job submissions on June 23rd, 2023. Existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023, to allow migration of any remaining data.

With the retirement of Crane, all data stored on the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the Attic resource, which reliably stores data for a nominal cost. The $COMMON filesystem is another option: it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note that data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.
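
Because the $WORK purge policy keys on file age, it can help to see which files would fall past a 6-month cutoff before deciding what to move. The sketch below is purely illustrative and not an HCC tool; the directory is a placeholder, and whether the policy uses access time or modification time is a detail to confirm with HCC.

    # Illustrative sketch (not an HCC tool): list files under a directory
    # whose modification time is older than roughly six months. The path
    # is a placeholder; the real purge policy's age criterion may differ.
    import os
    import time

    WORK_DIR = "/work/mygroup/myuser"        # placeholder $WORK directory
    CUTOFF = time.time() - 182 * 24 * 3600   # ~6 months ago, in seconds

    for root, _dirs, files in os.walk(WORK_DIR):
        for name in files:
            path = os.path.join(root, name)
            try:
                if os.stat(path).st_mtime < CUTOFF:
                    print(path)
            except OSError:
                pass  # file vanished or is unreadable; skip it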

As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

REMINDER:

As part of the retirement process, GPUs on Crane will no longer be available as of May 24th. Any jobs that cannot complete by May 24th will not start. This allows HCC to migrate the GPUs from Crane to Swan and consolidate the pool of GPU resources into Swan, greatly increasing the number of available GPUs on the Swan cluster.

Transitioning GPU workflows to Swan should only require re-creating any needed environments and migrating any needed data. If you have any questions about this process, please contact [email protected] or join our Open Office Hours every Tuesday and Thursday from 2-3 PM via Zoom.
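
As one example of re-creating an environment, a conda environment on Crane can be exported to a YAML file and rebuilt on Swan. The sketch below shows the commands involved, wrapped in Python for illustration; the environment name is a placeholder, and the export and create steps run on different machines.

    # Illustrative sketch of re-creating a conda environment; the name
    # "gpu-env" is a placeholder. Step 1 runs on Crane, step 3 on Swan.
    import subprocess

    # Step 1 (on Crane): export the environment to a portable YAML file.
    subprocess.run(
        ["conda", "env", "export", "--name", "gpu-env",
         "--file", "gpu-env.yml"],
        check=True,
    )

    # Step 2: copy gpu-env.yml to Swan (e.g. with Globus or scp).

    # Step 3 (on Swan): re-create the environment from the YAML file.
    subprocess.run(
        ["conda", "env", "create", "--file", "gpu-env.yml"],
        check=True,
    )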

HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns. 

Tags: Announcement, Crane

Crane Retirement - Update and Reminder
Mon, 24 Apr 2023
https://news.hcc.unl.edu/publications/crane-retirement-update-and-reminder

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside its other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. This will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 625 million core hours and 2.2 million GPU hours to over 2,600 Nebraska researchers spanning 140 departments. Crane is planned to stop accepting new job submissions on June 23rd, 2023. Existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023, to allow migration of any remaining data.

With the retirement of Crane, all data stored on the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the Attic resource, which reliably stores data for a nominal cost. The $COMMON filesystem is another option: it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note that data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.

As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

UPDATE:

As part of the retirement process, GPUs on Crane will no longer be available starting May 20th. This allows HCC to migrate the GPUs from Crane to Swan and consolidate the pool of GPU resources into Swan. GPUs on Swan are unaffected and will continue to run jobs.

HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns. 

Tags: Announcement, Crane

Crane Retirement - Reminder
Mon, 27 Mar 2023
https://news.hcc.unl.edu/publications/crane-retirement-reminder-1

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside its other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. This will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 625 million core hours and 2.2 million GPU hours to over 2,600 Nebraska researchers spanning 140 departments. Crane is planned to stop accepting new job submissions on June 23rd, 2023. Existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023, to allow migration of any remaining data.

With the retirement of Crane, all data stored on the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the Attic resource, which reliably stores data for a nominal cost. The $COMMON filesystem is another option: it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note that data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.

As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns. 

Tags: Announcement, Crane

Crane Retirement - Reminder
Mon, 27 Feb 2023
https://news.hcc.unl.edu/publications/crane-retirement-reminder

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside its other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. This will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 625 million core hours and 2.2 million GPU hours to over 2,600 Nebraska researchers spanning 140 departments. Crane is planned to stop accepting new job submissions on June 23rd, 2023. Existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023, to allow migration of any remaining data.

With the retirement of Crane, all data stored on the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the Attic resource, which reliably stores data for a nominal cost. The $COMMON filesystem is another option: it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note that data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.

As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns. 

Tags: Announcement, Crane