A recommended storage pattern is to keep the master copy of data on Atlas (project folder) and to store data on the UL HPC Clusters only temporarily, for the duration of the computational analysis. Derived data and results should afterwards be transferred back to Atlas. This How-to Card describes the different methods to transfer data between Atlas and the UL HPC Clusters. The three recommended methods are:
1. [Via laptop using `scp` or `rsync`](#1-via-laptop-using-scp-or-rsync)
2. [Via dedicated Virtual Machine (VM) using `rsync`](#2-via-dedicated-virtual-machine-vm-using-rsync)
3. [Via Large File Transfer (LFT)](#3-via-large-file-transfer-lft)
Please refer to the dedicated knowledge base articles to see how to [connect to the UL HPC Clusters](https://hpc-docs.uni.lu/connect/access/) and how to [mount Atlas](https://service.uni.lu/sp?id=kb_article_view&sysparm_article=KB0010233).
...
...
When using the UL laptop to transfer data between the UL HPC Clusters and Atlas, you need to have Atlas mounted on the laptop and an SSH connection to the clusters configured.
Please visit the [UL HPC documentation](https://hpc-docs.uni.lu/data/transfer/#data-transfer-tofromwithin-ul-hpc-clusters) to see how to use `rsync` and `scp`.
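As an illustration only, the snippet below sketches such a transfer under assumed names: the Atlas share is taken to be mounted at `~/atlas` on the laptop, `iris-cluster` stands for an SSH alias set up as in the UL HPC documentation, and `PROJECT` and `<username>` are placeholders for your project folder and cluster login.

```bash
# Illustrative sketch: adjust all paths and host aliases to your own setup.
# "iris-cluster" assumes an SSH alias configured as in the UL HPC documentation;
# "~/atlas", "PROJECT" and "<username>" are placeholders.

# Copy a dataset from the mounted Atlas project folder to the cluster with scp
scp -r ~/atlas/PROJECT/raw-data iris-cluster:/scratch/users/<username>/

# Or use rsync, which only sends files that changed and can be re-run safely
rsync -avz --progress ~/atlas/PROJECT/raw-data iris-cluster:/scratch/users/<username>/

# When the analysis is finished, transfer derived data and results back to Atlas
rsync -avz --progress iris-cluster:/scratch/users/<username>/results ~/atlas/PROJECT/
```

For large or recurring transfers, `rsync` is generally preferable to `scp`, as re-running the same command only sends files that have changed since the previous run.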
## 2. Via dedicated Virtual Machine (VM) using rsync
Data can be transferred via a dedicated VM, which can be requested via [ServiceNow](https://service.uni.lu/sp?id=sc_cat_item&table=sc_cat_item&sys_id=49956812db3fa010ca53454039961978).
Instead of transferring data between Atlas and the UL HPC Clusters through the laptop as described above, the transfer goes through the dedicated VM. Once connected to the VM and with Atlas mounted, the `rsync` command can be used in the same way as described in the [UL HPC documentation](https://hpc-docs.uni.lu/data/transfer/#data-transfer-tofromwithin-ul-hpc-clusters). This method is recommended for **recurring transfers of very large datasets** that benefit from the high-speed network connection between the VM and the clusters.
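As a minimal sketch, assuming Atlas is mounted at `/mnt/atlas` on the VM and `aion-cluster` is an SSH alias for one of the clusters (both names are placeholders, as are `PROJECT` and `<username>`), a recurring transfer could look as follows:

```bash
# Placeholders: /mnt/atlas (Atlas mount point on the VM), aion-cluster (SSH alias),
# PROJECT and <username> (project folder and cluster login).

# Push a large dataset from Atlas (mounted on the VM) to the cluster
rsync -avz --progress /mnt/atlas/PROJECT/raw-data aion-cluster:/scratch/users/<username>/

# For recurring transfers, --delete keeps the destination in sync with Atlas;
# use with care, as it removes destination files that no longer exist on Atlas
rsync -avz --delete /mnt/atlas/PROJECT/raw-data/ aion-cluster:/scratch/users/<username>/raw-data/
```

Re-running the same `rsync` command only transfers files that have changed since the previous run, which keeps recurring transfers of large datasets fast.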