FAQ

HUN-REN Cloud is available to researchers from the HUN-REN network and the higher education sector in Hungary. To use it, you must fill out the project request form, which requires eduID login. More information can be found on our how to join page. If you have any further questions, please contact the developers at info@science-cloud.hu.

We will contact you within a week after your project request has been filed to discuss the details (e.g. project quota size). After your project is successfully created, we’ll notify you via e-mail, and provide all necessary information about logging in.

We use eduID-based authentication in our cloud. If you do not have an eduID account, or you are unsure whether your institute has joined the eduID federation, you can check its status on the eduID members page. If your institute has not yet joined the federation, you may initiate the process with the head of your institute's IT department.

Alternatively, you can use the ‘Akadémiai Adattár’ (Academy database, AAT) identifier, which is compatible with eduID. You can register on their homepage by selecting the ‘Request for admittance’ button in the header and clicking on the ‘into MTA’s public body’ option. The username and password you create here will be used to log into AAT in the future.

In case of any other login issues or if you would like temporary access, please contact us at info@science-cloud.hu.

We will send you the full quota assigned to your project in our notification e-mail. You can cover this quota with virtual machines of the different sizes listed in the table below. Flavors with a GPU are only available if the need was indicated when the project was requested. The following virtual machine sizes are available to HUN-REN Cloud users:

Flavor       vCPU   RAM     GPU RAM
m2.tiny        1     1 GB     -
m2.small       1     2 GB     -
m2.medium      2     4 GB     -
m2.large       4     8 GB     -
m2.xlarge      8    16 GB     -
m2.2xlarge    16    32 GB     -
m2.4xlarge    32    64 GB     -
r2.medium      2     8 GB     -
r2.large       4    16 GB     -
r2.xlarge      8    32 GB     -
r2.2xlarge    16    64 GB     -
g2.medium      2     8 GB    4 GB
g2.large       4    16 GB    8 GB
g2.xlarge      8    32 GB   16 GB
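As a worked example of how flavors consume the project quota (the quota figures below are hypothetical, for illustration only): two m2.large instances plus one r2.medium together use 10 vCPUs and 24 GB RAM, which fits into a quota of 12 vCPUs and 32 GB RAM.

```shell
# Hypothetical project quota: 12 vCPUs, 32 GB RAM
quota_vcpu=12
quota_ram=32

# Two m2.large (4 vCPU / 8 GB each) plus one r2.medium (2 vCPU / 8 GB)
used_vcpu=$((2 * 4 + 2))   # = 10
used_ram=$((2 * 8 + 8))    # = 24

if [ "$used_vcpu" -le "$quota_vcpu" ] && [ "$used_ram" -le "$quota_ram" ]; then
  echo "fits within quota"
fi
```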

Yes, it is possible. The quota can be modified by editing the project request. This change request is handled the same way as a new request: the quota increase is not automatic and must be approved.

To add another user, that user must first join the given project. This can be initiated by clicking the 'Request to join' button on the project's data sheet. Once the user has submitted the request, users with the project manager role can approve it: click the 'Members' tab on the project's data sheet and edit the membership of the new user. When editing, the user can be set as a member or a manager of the project. Users with manager permissions can modify the project and user memberships; there are no extra permissions attached to this role in the OpenStack cloud itself.

In order to reach the virtual machine over SSH, you must first enable the necessary firewall rule. This is done by clicking the ‘+’ symbol next to ‘ssh’ (port 22) in the ‘Security Groups’ tab when launching a new instance (Compute/Instance/Launch instance). Furthermore, to log in without a password (using a key pair), the SSH key must also be selected (‘Key Pair’ menu).

Firewall rules can also be enabled on a virtual machine that is already running. In the ‘Instances’ tab there is a drop-down list (for the particular instance) in which you will find the ‘Edit Security Groups’ menu. Once you have clicked the ‘+’ sign next to ‘ssh’ (port 22), don’t forget to click the ‘Save’ button.

Connecting via SSH:

  1. ‘Compute’ → ‘Instances’, then ‘Edit Instance’ → ‘Security Groups’ in the menu in the row of the virtual machine. In this tab you can assign rules to the virtual machine. Assign the SSH rule here, which opens port 22, necessary for SSH access.
  2. Clicking on the arrow pointing down in the ‘Actions’ column (‘Compute’ → ‘Instances’ row ) of the virtual machine opens up a drop-down menu. Click on ‘Associate Floating IP’ here. In the modular window popping up, click on ‘IP Address’ to select the already allocated floating IP address from the drop-down menu. Clicking the ‘Associate’ button assigns the public IP to the virtual machine.
  3. The public IP can be checked in the ‘IP Address’ column, under the ‘Floating IPs’ entry, in the table on the ‘Compute’ → ‘Instances’ page. The floating IP maps the internal address to an external one: 10.1.20.X → 193.224.176.X. From then on, the virtual machine is reached via the 193.224.176.X address (e.g. 10.1.20.80 → 193.224.176.80). On the SZTAKI site, the last octet of the private address and of the floating IP are the same, so the last digits do not need to be changed.
  4. The private key must be saved to a safe location (the ~/.ssh/id_rsa file) with the appropriate permissions: chmod 600 [private_key_file].
    (If you do not yet have a key pair, you can generate one in the ‘Compute’ → ‘Access & Security’ → ‘Key Pairs’ menu.)
  5. To establish the SSH connection, run the following command in a terminal:
    ssh -i [private_key_path] [username]@[virtual_machine_public_ip]
    Sample: ssh -i ~/.ssh/id_rsa johndoe@148.6.200.80
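Steps 4 and 5 can be sketched as the following terminal session. The key path and the floating IP are placeholders; substitute your own values. (As an alternative to the dashboard, a key pair can also be generated locally with ssh-keygen, as shown, but never overwrite an existing key.)

```shell
# Step 4: store the private key with owner-only permissions.
# SSH refuses to use a key that other users can read.
key="$HOME/.ssh/id_rsa"                 # placeholder path; adjust to your own file
mkdir -p "$HOME/.ssh"
[ -f "$key" ] || ssh-keygen -q -t ed25519 -N "" -f "$key"   # only if you have no key yet
chmod 600 "$key"

# Step 5: connect to the VM's floating IP; the username depends on the
# image (e.g. 'ubuntu' on Ubuntu images). Commented out here because it
# requires a reachable virtual machine:
# ssh -i "$key" ubuntu@193.224.176.80
```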

The ‘Image description’ field of the image selected in the ‘Image’ menu contains the default username and password. For Ubuntu these are ‘ubuntu’/‘ubuntu’; for Windows, ‘windows’/‘windows’. These must be changed after the first login.

HUN-REN Cloud provides one public IP address per project, so multiple instances cannot be reached directly from the outside. This one IP address can be assigned to any of the virtual machines (‘Associate Floating IP’ in the drop-down menu next to the instance in the virtual machine list). To be able to log in, the SSH key pair must be supplied and selected before launch (‘Key Pair’ menu), and the firewall must be configured (‘Security Groups’).

We recommend that you amend the ‘default’ firewall rule set. Add a rule in the ‘Security Groups’ menu (‘Compute’ → ‘Access & Security’) by clicking ‘default’, then ‘Manage Rules’, then ‘Add Rule’. Here you can add the ssh rule (port 22) to the firewall. You can then access the other virtual machines through their private addresses using SSH agent forwarding, or via a VPN server set up on the public machine.
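For example, with OpenSSH you can hop through the single public machine to an internal one, either with the ProxyJump option or with agent forwarding. All addresses and the 'ubuntu' username below are placeholders; the connecting commands are commented out because they need the actual machines.

```shell
# One-step jump through the public VM (floating IP) to an internal VM
# addressed by its private IP:
# ssh -J ubuntu@193.224.176.80 ubuntu@10.1.20.81

# Equivalent agent-forwarding variant: load the key into the local
# agent, connect with -A, then ssh onward from the public VM:
# ssh-add ~/.ssh/id_rsa
# ssh -A ubuntu@193.224.176.80
#   ...then, on the public VM:  ssh ubuntu@10.1.20.81

# The effective client configuration can be previewed without
# connecting anywhere:
ssh -G -J ubuntu@193.224.176.80 ubuntu@10.1.20.81 | grep -i proxyjump
```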

HUN-REN Cloud provides one public IP address for every project. This one IP address can be assigned to any of the virtual machines (‘Associate Floating IP’ in virtual machines list, in the drop-down menu next to the computer). Reaching the virtual machines is not possible via their private addresses without either a Floating IP or a VPN. The virtual machine is not aware of its floating IP (although it can be retrieved from the metadata service, Linux cannot use it). If you’d like to communicate with multiple virtual machines from the outside, the floating IP must be bound to them. The floating IP is only a DNAT and SNAT rule, which translates between the internal (private) and external (public) address.
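To illustrate the last point: in iptables-save notation, the mapping 10.1.20.80 → 193.224.176.80 would correspond roughly to the following pair of NAT rules. This is an illustration only; OpenStack manages these rules itself on its network node, and the addresses are the placeholder examples used above.

```
*nat
-A PREROUTING  -d 193.224.176.80/32 -j DNAT --to-destination 10.1.20.80
-A POSTROUTING -s 10.1.20.80/32 -j SNAT --to-source 193.224.176.80
COMMIT
```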

The main purpose of the HUN-REN Cloud service is to aid research projects by providing computing power, not storage space. We can however satisfy such demands by providing a certain quota, from which you may launch a Linux based virtual machine to store your data, accessed through SSH.

We recommend SCP for data transfer: SCP runs over SSH, so it is pre-installed in the official image files. As with SSH, port 22 must be enabled in Security Groups. We recommend the scp command on Linux clients and the WinSCP program on Windows clients. We also suggest putting the data on a separate volume.
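Typical transfers look like the commands below; the key path, IP address, username, and remote paths are all placeholders. The remote commands are commented out because they need a reachable VM, but scp with two local paths simply performs a local copy, which is a convenient way to try the syntax.

```shell
# Upload a local file to the VM (port 22 must be open in the Security Group):
# scp -i ~/.ssh/id_rsa data.tar.gz ubuntu@193.224.176.80:/mnt/data/
# Download results from the VM:
# scp -i ~/.ssh/id_rsa ubuntu@193.224.176.80:/mnt/data/results.csv .

# scp between two local paths just copies the file locally:
echo "sample" > /tmp/scp_demo_src.txt
scp -q /tmp/scp_demo_src.txt /tmp/scp_demo_dst.txt
```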

The 3 main types of cloud computing service models are infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). HUN-REN Cloud currently offers the IaaS level. As an IaaS cloud HUN-REN Cloud allows researchers to create different types and sizes of e-infrastructures that are dynamically adjustable according to the current needs of their ongoing projects without having to go through the usual lengthy and complicated procurement procedures.

Installing the necessary software is the user’s responsibility, but we do provide some applications, in the form of reference architectures, that make using our services significantly easier. The available reference architectures and their technical descriptions can be found on our dedicated page.

Yes, a virtual machine may run indefinitely; however, HUN-REN Cloud accepts no responsibility if the machine is shut down due to a malfunction.

Within a project, every user has the same permissions, so everyone can create new virtual machines. The administrator is simply the person who requested, and is liable for, the given project; they also decide who is allowed to use the resources allocated to it.

If you have successfully joined and used the HUN-REN Cloud service and published an article as a result, please include the following acknowledgement in either English or Hungarian:

Köszönetnyilvánítás

A .............. projekt nevében köszönetet mondunk a HUN-REN Cloud (lásd: Héder et al. 2022; https://science-cloud.hu/) használatáért, ami hozzájárult a publikált eredmények eléréséhez.

Acknowledgement

On behalf of the ............... project we are grateful for the possibility to use HUN-REN Cloud (see Héder et al. 2022; https://science-cloud.hu/) which helped us achieve the results published in this paper.

In the reference list, please include the publication presenting HUN-REN Cloud:

  • APA:
    Héder, M., Rigó, E., Medgyesi, D., Lovas, R., Tenczer, S., Török, F., Farkas, A., Emődi, M., Kadlecsik, J., Mező, G., Pintér, Á., & Kacsuk, P. (2022). The Past, Present and Future of the ELKH Cloud. Információs Társadalom, 22(2), 128. https://doi.org/10.22503/inftars.xxii.2022.2.8
  • IEEE:
    M. Héder et al., “The Past, Present and Future of the ELKH Cloud,” Információs Társadalom, vol. 22, no. 2, p. 128, Aug. 2022, doi: 10.22503/inftars.xxii.2022.2.8
  • MLA:
    Héder, Mihály, et al. “The Past, Present and Future of the ELKH Cloud.” Információs Társadalom, vol. 22, no. 2, Aug. 2022, p. 128. Crossref, https://doi.org/10.22503/inftars.xxii.2022.2.8
  • Chicago:
    Héder, Mihály, Ernő Rigó, Dorottya Medgyesi, Róbert Lovas, Szabolcs Tenczer, Ferenc Török, Attila Farkas, et al. “The Past, Present and Future of the ELKH Cloud.” Információs Társadalom 22, no. 2 (August 31, 2022): 128. https://doi.org/10.22503/inftars.xxii.2022.2.8

Please send us your article and a link to it if it gets published, so we can include it on our website. This increases the impact and visibility of your work, and also serves as a reference for HUN-REN Cloud, demonstrating the practical use of our services.

We recommend using the OpenVPN reference architecture, however, there are many guides available on the Internet to help you set up your OpenVPN server and client.

Yes, there is a possibility for extension, but please note that this option may be revoked if our cloud nears saturation. To extend the project, edit the completion deadline on the project's data sheet.

The SZTAKI side of HUN-REN Cloud has NVIDIA Tesla V100 GPUs with a maximum of 32 GB of GPU memory. These GPUs can be divided into smaller vGPUs, so it is possible to request them with 8 or 16 GB of memory. To request one, select the appropriate flavor (starting with g2) on the application form. Currently only one vGPU can be attached to a virtual machine.

The Wigner data center has NVIDIA Tesla A100 GPUs with a maximum of 40 GB of GPU memory. These can be divided into vGPUs with 5, 10 or 20 GB of memory. Again, flavors starting with g2 must be selected, but in this case the largest flavor has a 40 GB GPU instead of 32 GB, the medium 20 GB instead of 16 GB, and so on.

If you prefer one of the GPUs (V100 or A100) please notify us in the comment section when requesting a project.

The project users have access to, and can manage, the requested resources within the project. On virtual machines running the Ubuntu operating system, ubuntu is both the default username and the password. Of course, new users can be added on the launched virtual machines as needed; these do not have to be project users, but note that the project manager is also responsible for their activities. In conclusion, the users of the project are not the same as the users of the created virtual machines.

When requesting a project, the closing date of the project must be specified. As this date approaches, our colleagues will contact the project manager to find out if the project can be terminated on the given date. If the project ends before the planned closing date, please delete the virtual machines and volumes, and send an e-mail to us at info@science-cloud.hu with your project's name to initiate the shutdown. After the project is closed, all used resources will be deleted, so please make sure to save the data beforehand. Closing a project is never automatic, we will always consult with the given project manager.