# Holland Computing Center Updates

*News and announcements from the Holland Computing Center (hcc.unl.edu).*

## Crane Storage Final Reminder

*August 1, 2023*

Crane was retired as an HPC cluster resource on July 1st, 2023, with a planned power-off of the remaining storage and service nodes on August 1st, 2023. We have had a handful of requests to extend the availability of Crane for ongoing file transfers, so we are extending the power-off to the firm date of **September 1st, 2023**, and will not extend or provide access beyond that time. On **September 1st, 2023**, the Crane login and transfer hosts will be turned off completely, and the `/work` filesystem storage will be erased for compliance with NU policies. If you do not act soon, any data on `/work` not copied elsewhere by September 1st, 2023 **will be lost permanently.**
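If you are unsure whether you still have data on `/work`, a quick check from a Crane login or transfer host will show what remains before the deadline. A minimal sketch, assuming the usual HCC layout of `/work/<group>/<user>` (adjust the path to match your account):

```bash
# Summarize how much data remains in your /work area.
# The /work/<group>/<user> layout is an assumption; adjust as needed.
du -sh "/work/$(id -gn)/$USER"

# List the 50 largest files, to decide what is worth copying off.
find "/work/$(id -gn)/$USER" -type f -printf '%s\t%p\n' | sort -rn | head -n 50
```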
## Crane Retirement - FINAL Reminder

*June 26, 2023*

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. Retiring it will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 640 million core hours and 2.3 million GPU hours to more than 2,600 Nebraska researchers spanning 140 departments. Crane has stopped accepting new job submissions; existing jobs will be allowed to complete, **with all jobs stopped by June 30th**. The login node and storage will remain available until August 1st, 2023 to allow for migration of any remaining data.

With the retirement of Crane, all data stored on both the `$HOME` and `$WORK` filesystems of Crane **will be removed** following decommissioning. Users with data on Crane are strongly encouraged to move their files as soon as possible. For precious data, HCC provides the [Attic](https://hcc.unl.edu/attic) resource, which stores data reliably for a nominal cost. The [$COMMON filesystem](https://hcc.unl.edu/docs/handling_data/data_storage/using_the_common_file_system/) is another option: it is available on both Crane and Swan, and is not subject to the [6-month purge policy](https://hcc.unl.edu/docs/handling_data/data_storage/preventing_file_loss/) in effect on the `$WORK` filesystem. Note that data on `$COMMON` is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30 TB of `$COMMON` space at no charge; additional space is available for a fee.
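For data that should stay on HCC systems, one straightforward option is an `rsync` copy from `$WORK` into `$COMMON` on a login node. A minimal sketch; `my_project` is a placeholder directory name:

```bash
# Copy a project directory from $WORK into $COMMON, preserving
# permissions and timestamps; re-running resumes an interrupted copy.
rsync -av "$WORK/my_project/" "$COMMON/my_project/"

# Dry-run checksum comparison: lists any files that would still be
# copied, i.e. files that differ between the two locations.
rsync -avnc "$WORK/my_project/" "$COMMON/my_project/"
```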
For large data transfers, we strongly encourage using [Globus](https://hcc.unl.edu/docs/handling_data/data_transfer/globus_connect/). The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and perform checks on both ends of the transfer.
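Globus transfers can be started from the web interface or scripted with the Globus CLI. A minimal sketch with placeholder endpoint IDs and paths (look up the actual Crane, Swan, and Attic collection IDs with `globus endpoint search`):

```bash
# One-time authentication for the Globus CLI.
globus login

# Find the endpoint IDs for the HCC systems.
globus endpoint search "HCC"

# Placeholder UUIDs -- replace with the endpoint IDs found above.
SRC="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"   # Crane
DST="11111111-2222-3333-4444-555555555555"   # Swan

# Recursive transfer of a directory; Globus monitors the transfer,
# retries failures, and verifies the data on both ends.
globus transfer "$SRC:/work/mygroup/myuser/my_project" \
                "$DST:/work/mygroup/myuser/my_project" \
                --recursive --label "crane-migration"
```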
As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane's replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant amount of resources available for general use, and the remaining in-warranty hardware from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.
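As an illustration of that portability, here is a minimal sketch of a SLURM submission script that would run unchanged on either cluster; the module name and program are placeholders:

```bash
#!/bin/bash
#SBATCH --job-name=example
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --mem=8gb
#SBATCH --time=02:00:00

# Load the same module stack used on Crane; verify availability on
# Swan first with `module spider <name>`, as versions can differ.
module load python   # placeholder module name

srun python my_analysis.py   # placeholder program
```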
HCC will work to make this transition as minimally disruptive as possible. More information can be found on the [Crane retirement FAQ](https://hcc.unl.edu/docs/faq/crane_retirement/). Please contact [email protected] with any questions or concerns.

## Crane Retirement - Reminder and GPU Update

*May 22, 2023*

The retirement timeline and data-migration guidance from the original notice (January 23, 2023, below) remain in effect: Crane is planned to stop accepting new job submissions on June 23rd, 2023, existing jobs will be allowed to complete with all jobs stopped by June 30th, and the login node and storage will remain available until August 1st, 2023. Since its 2013 launch, Crane has now provided over 640 million core hours and 2.3 million GPU hours to more than 2,600 Nebraska researchers spanning 140 departments.

**REMINDER:** As part of the retirement process, GPUs on Crane will no longer be available on May 24th; any GPU jobs that cannot complete by May 24th will not start. This allows HCC to migrate the GPUs from Crane to Swan and consolidate the pool of GPU resources into Swan, greatly increasing the number of GPUs available on the Swan cluster.

Transitioning GPU workflows to Swan should only require re-creating any needed environments and migrating any needed data. If you have any questions about this process, please contact [email protected] or join our [Open Office Hours](https://hcc.unl.edu/OOH) every Tuesday and Thursday from 2-3 PM via Zoom.
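For conda-based GPU workflows, re-creating an environment on Swan can be as simple as exporting it on Crane and rebuilding it from the export. A minimal sketch, assuming conda is provided through the module system and using `my_env` as a placeholder environment name:

```bash
# On Crane: export the environment to $COMMON, which both clusters see.
module load anaconda            # assumption: conda is module-provided
conda activate my_env
conda env export > "$COMMON/my_env.yml"

# On Swan: rebuild the environment from the exported file.
module load anaconda
conda env create -f "$COMMON/my_env.yml"
```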
HCC will work to make this transition as minimally disruptive as possible. More information can be found on the [Crane retirement FAQ](https://hcc.unl.edu/docs/faq/crane_retirement/). Please contact [email protected] with any questions or concerns.

## Crane Retirement - Update and Reminder

*April 24, 2023*

The retirement timeline and data-migration guidance from the original notice (January 23, 2023, below) remain in effect.

**UPDATE:** As part of the retirement process, GPUs on **Crane** will no longer be available starting May 20th. This allows HCC to migrate the GPUs from Crane to Swan and consolidate the pool of GPU resources into Swan. GPUs on Swan are unaffected and will continue to run jobs.

HCC will work to make this transition as minimally disruptive as possible. More information can be found on the [Crane retirement FAQ](https://hcc.unl.edu/docs/faq/crane_retirement/). Please contact [email protected] with any questions or concerns.

## Crane Retirement - Reminder

*March 27, 2023*

This reminder repeats the retirement timeline and data-migration guidance from the original notice of January 23, 2023 (below): Crane is planned to stop accepting new job submissions on June 23rd, 2023, with all jobs stopped by June 30th, and the login node and storage will remain available until August 1st, 2023.
## Crane Retirement - Reminder

*February 27, 2023*

This reminder likewise repeats the retirement timeline and data-migration guidance from the original notice of January 23, 2023 (below).
## Crane Retirement - Notice

*January 23, 2023*

Since 2013, HCC has operated and maintained multiple generations of the Crane HPC cluster alongside other HPC clusters. With the original hardware well out of warranty and becoming unmaintainable, Crane is set to be retired as an HCC resource after nearly 10 years of service to Nebraska researchers. Retiring it will allow HCC to consolidate hardware into a single resource and improve performance and reliability. Since its launch in 2013 at 475th place on the Top500 list, Crane has provided over 625 million core hours and 2.2 million GPU hours to more than 2,600 Nebraska researchers spanning 140 departments. Crane is planned to stop accepting new job submissions on June 23rd, 2023. Existing jobs will be allowed to complete, with all jobs stopped by June 30th. The login node and storage will remain available until August 1st, 2023 to allow for migration of any remaining data.
With the retirement of Crane, all data stored on the `$HOME` and `$WORK` filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the [Attic](https://hcc.unl.edu/attic) resource, which stores data reliably for a nominal cost. The [$COMMON filesystem](https://hcc.unl.edu/docs/handling_data/data_storage/using_the_common_file_system/) is another option: it is available on both Crane and Swan, and is not subject to the [6-month purge policy](https://hcc.unl.edu/docs/handling_data/data_storage/preventing_file_loss/) in effect on the `$WORK` filesystem. Note that data on `$COMMON` is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30 TB of `$COMMON` space at no charge; additional space is available for a fee. For large data transfers, we strongly encourage using [Globus](https://hcc.unl.edu/docs/handling_data/data_transfer/globus_connect/). The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and perform checks on both ends of the transfer.
As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane's replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant amount of resources available for general use, and the remaining in-warranty hardware from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.

HCC will work to make this transition as minimally disruptive as possible. More information can be found on the [Crane retirement FAQ](https://hcc.unl.edu/docs/faq/crane_retirement/). Please contact [email protected] with any questions or concerns.

## Jupyter Notebook App Retirement

*May 13, 2022*

The Jupyter **Notebook** app in Open OnDemand on Crane and Rhino will be retired on May 20th, 2022. The Notebook app has been superseded by the Jupyter Lab environment, which provides all the functionality of Notebook and more. All notebooks and environments created in the Jupyter Notebook environment are available through the Jupyter **Lab** app; no additional action is needed on your part to use them there. You may also revert to the classic Notebook interface at any time by selecting "Help" > "Launch Classic Notebook". Please contact [email protected] with any questions or concerns.

## Crane: Returned to Service

*April 13, 2021*

The Crane resource is available for use once again after a much-needed downtime to upgrade the base operating system to CentOS 8. The main methods by which users interact with Crane, including Shell access, Open OnDemand, and Globus, are all still available. While we attempted to keep as much of the available software as possible, certain modules may no longer be available due to their age and incompatibility with modern Linux. Especially for MPI-based software, HCC recommends verifying that the appropriate modules are present before launching production jobs. As on Rhino, a reasonably modern set of default modules is now provided.
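One way to verify a module stack before resubmitting production work, sketched with placeholder module names:

```bash
# Search the whole module tree for a package and its versions.
module spider openmpi        # placeholder package name

# Load a specific version and confirm the resulting stack.
module load openmpi/4.1      # placeholder version
module list
```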
The SLURM queue of jobs was not maintained during this upgrade, to limit unexpected side effects; you will need to resubmit any work that was in the queue when the downtime began.

After an upgrade of this scale, there will likely be some latent issues due to the complexity of the changes. Please email [email protected] with any problems or questions, and we will work to resolve them as quickly as possible.

## Crane: Downtime Update

*April 12, 2021*

Although we intended to have Crane available for general use no later than today (Monday, April 12th), unfortunate hardware failures over the weekend and this morning have made it necessary to postpone opening Crane until tomorrow morning (Tuesday, April 13th). One of today's failures involved Crane's `/work` filesystem, and out of an abundance of caution we are letting the affected components rebuild to completion before allowing access. This process should finish overnight and allow us to open Crane in a healthy state tomorrow. No data on `/work` was affected by the failure, so all your files on `/work` should remain.

We will send an announcement as soon as Crane is open for general use, including details about changes made during this downtime and how they may affect your use of the system.