Like most of HCC's file systems, NRDStor is available on Swan and can be used within computational jobs. What sets NRDStor apart from other HCC file systems is that you can access it directly from your computer as a network drive when connected on campus or through the University of Nebraska VPN. No additional software or command-line tools are needed. For more information about NRDStor, please visit the NRDStor documentation.
As we continue to refine and enhance NRDStor, we invite you to explore its features and share your feedback, which will play a crucial role in shaping the future of this new storage platform.
To request access to NRDStor, please visit this link. Before access to NRDStor can be granted, you will need to pass a short Bridge course.
New HCC account holders are strongly encouraged to attend the FREE June Workshop Series, offered both in person and remotely every Thursday in June. More details are available here: https://hcc.unl.edu/june-workshop-series-2024
If you have any questions during this period, please feel free to contact HCC support at [email protected].
Please note that *each user must have their own CryoSPARC license ID*. Attempting to share licenses, even within a lab group, will result in errors and failed jobs. CryoSPARC licenses are available at no charge for academic use and can be obtained here. HCC has created a documentation page with full details on using the CryoSPARC App, available here. We encourage every user to read the documentation page before using the app, as it contains important details on how to best utilize the app.
Please note that since the app is newly developed, small issues may arise as it sees wider use. Please feel free to contact HCC at [email protected] with any issues or questions you may have while using the app.
Please also note that several HCC resources, including Swan, are scheduled for maintenance starting Wednesday, May 22nd at 8AM CST. Jobs (including CryoSPARC tasks) that cannot complete before the maintenance begins will be held. *HCC strongly recommends waiting until after the downtime to start any workflows that may result in held CryoSPARC jobs.* If these jobs begin prior to the app being relaunched, errors may result.
If you have any questions during this period, please feel free to contact HCC support at [email protected].
HCC staff will be monitoring the systems to ensure availability through the break. HCC User Services staff will periodically monitor the ticketing system during the break and will address any system-critical issues. Non-critical tickets and issues will be addressed when we return after the winter break on January 2, 2024.
Please email [email protected] if you have any questions.
Happy Holidays!
Hongfeng Yu, Director
Holland Computing Center
These courses are only available to those within the University of Nebraska system at this time.
Link: https://nebraska.bridgeapp.com/learner/courses/a90b1752/enroll
A short course containing important information that is strongly recommended for new group owners.
This includes information about HCC policies, account creation, an overview of data storage covering details about the purge policy on $WORK, support, training, and acknowledgment credits.
Link: https://nebraska.bridgeapp.com/learner/courses/77529776/enroll
A short course containing important information that is strongly recommended for those new to using HCC resources.
This includes information about HCC policies, managing your account, an overview of data storage covering details about the purge policy on $WORK, support, training, and acknowledgment credits.
Link: https://nebraska.bridgeapp.com/learner/courses/edb77314/enroll
This is a longer course providing an introduction to using HCC’s Swan supercomputer, including basic data management, software usage, submitting jobs, reviewing jobs, and acknowledgment credits. This course is recommended for those new to using HCC resources, and anyone wanting to learn more about HCC.
With the retirement of Crane, all data stored on both the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files as soon as possible. For precious data, HCC provides the Attic resource to reliably store data for a nominal cost. The $COMMON filesystem is another option, as it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, HCC strongly encourages using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.
As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.
HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns.
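A Globus transfer like the one described above can also be driven from the command line with the `globus` CLI. The sketch below is only illustrative: the collection UUIDs and paths are placeholders (the real IDs for the Crane and Attic collections appear in the Globus web app), and the command is echoed as a dry run rather than submitted.

```shell
#!/bin/sh
# Placeholder collection UUIDs -- look up the real Crane and Attic
# collection IDs in the Globus web app before running for real.
SRC="aaaaaaaa-0000-0000-0000-000000000000"   # hypothetical Crane collection
DST="bbbbbbbb-0000-0000-0000-000000000000"   # hypothetical Attic collection

# --recursive copies a whole directory tree; --sync-level checksum makes
# Globus verify each file on both ends of the transfer.
# Echoed as a dry run; remove 'echo' to actually submit the transfer.
echo globus transfer --recursive --sync-level checksum \
    "$SRC:/work/mygroup/myuser/project" \
    "$DST:/mygroup/project-archive"
```

The `--sync-level checksum` option is the slowest but safest setting, since Globus re-checksums every file rather than comparing only size or modification time.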
With the retirement of Crane, all data stored on the $HOME and $WORK filesystems of Crane will be removed following decommissioning. Users with data on Crane are strongly encouraged to move their files sooner rather than later. For precious data, HCC provides the Attic resource to reliably store data for a nominal cost. The $COMMON filesystem is another option, as it is available on both Crane and Swan and is not subject to the 6-month purge policy in effect on the $WORK filesystem. Please note data on $COMMON is not backed up; precious data should additionally be saved elsewhere. Each group is allocated 30TB of $COMMON space at no charge; additional space is available for a fee. For large data transfers, HCC strongly encourages using Globus. The Globus transfer servers for Crane, Swan, and Attic provide a faster connection and verify the data on both ends of the transfer.
As part of the long-running plan to retire Crane, HCC deployed the Swan cluster in May 2022 as Crane’s replacement, made possible by investments from the UNL Office of Research and Economic Development and the Nebraska Research Initiative. Swan already has a significant number of resources available for general use, and the remaining in-warranty resources from Crane will be migrated into the Swan cluster. Most workflows from Crane can be used on Swan with little to no modification to software or submission scripts.
REMINDER:
As part of the retirement process, GPUs on Crane will no longer be available on May 24th. Any jobs that can't complete by May 24th will not start. This is to allow HCC to migrate the GPUs from Crane to Swan and consolidate the pool of GPU resources into Swan. This will greatly increase the number of available GPUs on the Swan cluster.
Transitioning GPU workflows to Swan should only require re-creating any needed environments and migrating any needed data. If you have any questions about this process, please contact [email protected] or join our Open Office Hours every Tuesday and Thursday from 2-3PM via Zoom.
HCC will work to make this transition as minimally disruptive as possible. More information can be found at this page. Please contact [email protected] with any questions or concerns.