IIHE local cluster

Overview

The cluster is composed of 4 machine types:

  • User Interfaces (UI)

These are the cluster front-ends; to use the cluster, you need to log into one of these machines.

Servers: ui01, ui02

  • Computing Element (CE)

This server is the core of the batch system: it runs submitted jobs on the worker nodes.

Servers: ce

  • Worker Nodes (WN)

These are the computing power of the cluster: they run the jobs and report their status back to the CE.

Servers: slave*

  • Storage Elements

These are the memory of the cluster: they contain the data, the software, ...

Servers: datang (/data, /software), lxserv (/user), x4500 (/ice3)

How to connect

To connect to the cluster, use your IIHE credentials (the same as for the wifi):

ssh username@icecube.iihe.ac.be

TIP: icecube.iihe.ac.be automatically points to one of the available UIs (ui01, ui02, ...)


After a successful login, you'll see this message:

==========================================
Welcome on the IIHE ULB-VUB cluster

Cluster status http://ganglia.iihe.ac.be
IT Help support-iihe@ulb.ac.be
==========================================

username@uiXX:~$

Your default current working directory is your home folder.
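For example, running pwd right after login should print your home folder (with "username" replaced by your own login):

pwd
# prints /user/username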


Directory Structure

Here is a description of the most useful directories:

/user/username

Your home folder

/data

Main data repository

/software

The custom software area

/software/src


/software/icecube

IceCube-specific tools:

/software/icecube/i3_ports

This folder contains the I3 ports used by the IceCube (meta-)projects.

To use it, you must define the environment variable $I3_PORTS:

export I3_PORTS="/software/icecube/i3_ports"

This variable is only set for the current session; to avoid defining it every time, you can add this command to your .bashrc.
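For example, the following commands (a minimal sketch) append the export to your .bashrc and reload it in the current session:

# add the I3_PORTS definition to your shell start-up file (only needed once)
echo 'export I3_PORTS="/software/icecube/i3_ports"' >> ~/.bashrc
# reload .bashrc in the current session
source ~/.bashrc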

/software/icecube/offline-software
/software/icecube/icerec

This folder contains the icerec meta-project

To use it, just run the following command (note the dot at the beginning of the line):

. /software/icecube/icerec/[VERSION]/env-shell.sh

Available versions:

  • V04-05-00
  • V04-05-00-jkunnen
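For example, to load the V04-05-00 release listed above (note the leading dot):

. /software/icecube/icerec/V04-05-00/env-shell.sh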
/software/icecube/simulation


/ice3

Batch System

Queues

The cluster is divided into queues:


                          any   lowmem   standard   highmem   gpu
Description
CPUs
Walltime default/limit
Memory default/limit


Job submission

To submit a job, just use the qsub command:

qsub myjob.sh

OPTIONS

-q queueName : choose the queue (default: any)

-N jobName : name of the job

-I : run in interactive mode

-m : mail options

-l : resource options
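As a sketch (assuming the Torque/PBS syntax for the -l and -m options; the script content and the job name are only placeholders), a minimal job script and a submission combining these options could look like this:

#!/bin/bash
# myjob.sh - a minimal job script
echo "Running on $(hostname)"

# submit to the highmem queue, name the job "myanalysis",
# request 12 hours of walltime and 4 GB of memory,
# and ask for a mail when the job aborts (a) or ends (e)
qsub -q highmem -N myanalysis -l walltime=12:00:00,mem=4gb -m ae myjob.sh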


Job management

To see all jobs (running or queued), you can use the qstat command or go to the JobMonArch page.

qstat

OPTIONS

-u username : list only jobs submitted by username

-n : show nodes where jobs are running

-q : show the distribution of jobs over the queues
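For example, to list only your own jobs together with the nodes they run on, or to see how the jobs are spread over the queues:

# your jobs and the worker nodes they run on
qstat -u username -n
# overview of the queues
qstat -q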


Useful links

Ganglia Monitoring: servers status

JobMonArch: jobs overview