
Lustre and StoRM

Lustre Configuration and Setup

Software Installation.

O.S:

Headnode and OSS with Scientific Linux IFIC 4.4 (x86_64)
Client with Scientific Linux IFIC 3.0.6 (i386)

Updated software in sli44 machines:

acl-2.2.39-1.1.x86_64.rpm
 libacl-2.2.39-1.1.x86_64.rpm
 attr-2.4.28-1.2.x86_64.rpm
 libattr-2.4.28-1.2.x86_64.rpm
 dump-0.4b41-1.x86_64.rpm
 rmt-0.4b41-1.x86_64.rpm
 e2fsprogs-1.39.cfs2-0.x86_64.rpm (the e2fsprogs version must be equal or higher than 1.38)
Updated/Installed/Removed software in sli306 machine:
rpm -Uhv libgcc-3.4.6-3.i386.rpm
 rpm -ihv libstdc++-3.4.6-3.i386.rpm
 rpm -ihv db4-4.2.52-7.1.i386.rpm
 rpm -Uhv e2fsprogs-1.39.cfs2-0.i386.rpm
 rpm -e modutils
 rpm -ihv module-init-tools-3.1-0.pre5.3.2.i386.rpm

Lustre: kernel, modules and lustre utilities.

kernel-lustre-smp-2.6.9-42.0.10.EL_lustre_1.6.0.1.x86_64.rpm
 lustre-modules-1.6.0.1-2.6.9_42.0.10.EL_lustre_1.6.0.1smp.x86_64.rpm 
lustre-1.6.0.1-2.6.9_42.0.10.EL_lustre_1.6.0.1smp.x86_64.rpm
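A minimal installation sketch, assuming the RPMs above have been downloaded to the current directory (reboot into the Lustre kernel afterwards, before formatting any target):

rpm -ivh kernel-lustre-smp-2.6.9-42.0.10.EL_lustre_1.6.0.1.x86_64.rpm
 rpm -ivh lustre-modules-1.6.0.1-2.6.9_42.0.10.EL_lustre_1.6.0.1smp.x86_64.rpm
 rpm -ivh lustre-1.6.0.1-2.6.9_42.0.10.EL_lustre_1.6.0.1smp.x86_64.rpm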

Setup

Synchronize the nodes with NTP. Disable SELinux until a proper profile is created.
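For example, on each node (a sketch; the NTP server is the one used later in the yaim configuration):

ntpdate 147.156.1.1
 chkconfig ntpd on && service ntpd start
 setenforce 0
 sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config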

Nodes setup

Headnode: will host MGS + MDT

MGS
mkfs.lustre --mgs /dev/sda5
 mkdir -p /mnt/lustre/mgs
 mount -t lustre /dev/sda5 /mnt/lustre/mgs
MDT
mkfs.lustre --fsname=ificfs --mdt --mgsnode=wn181@tcp0 --mountfsoptions=acl /dev/sda6
 mkdir -p /mnt/lustre/mdt-ificfs
 mount /dev/sda6 /mnt/lustre/mdt-ificfs

Look up the partition labels with e2label and add the following lines to /etc/fstab for subsequent mounts.

LABEL=MGS               /mnt/lustre/mgs        lustre  defaults,_netdev 0 0
 LABEL=ificfs-MDT0000    /mnt/lustre/mdt-ificfs lustre  defaults,_netdev 0 0
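The labels can be verified with e2label, e.g.:

e2label /dev/sda5   # -> MGS
 e2label /dev/sda6   # -> ificfs-MDT0000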
OSS:
mkfs.lustre --fsname=ificfs --ost --mgsnode=wn181@tcp0 /dev/sda5
 mkdir -p /mnt/lustre/ost0-ificfs
 mount -t lustre /dev/sda5 /mnt/lustre/ost0-ificfs/
fstab entry:
LABEL=ificfs-OST0000    /mnt/lustre/ost0-ificfs          lustre  defaults,_netdev 0 0
Client:
mkdir -p /lustre/ific.uv.es/
mount -t lustre wn181.ific.uv.es:/ificfs /lustre/ific.uv.es/
fstab entry:
wn181@tcp:/ificfs       /lustre/ific.uv.es      lustre  defaults,_netdev        0 0
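Once mounted, a quick sanity check on the client (a sketch) is to list the MDT/OST usage and write a test file:

lfs df -h                       # shows MDT and OST usage for ificfs
 touch /lustre/ific.uv.es/testfile && rm /lustre/ific.uv.es/testfile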

StoRM Configuration and Setup

Current services

The current production installation is based on StoRM release 1.4.0.
We are currently evaluating version 1.5, to be installed first on the pre-production services: http://igrelease.forge.cnaf.infn.it/doku.php?id=doc:updates:ig3_1_32:infngrid-update31-61

OS, Required packages & Storm Installation

O.S: StoRM is currently released for Scientific Linux 4, and we are using SLI 4.7, which is the CERN version with some IFIC packages and configuration for our institute.
A release for SL5 has been promised by the developers; although some users have installed it on SL5, the heavy dependency on legacy packages makes it pointless for us to do so at the moment.

Repositories:

We are using yum for the OS, base and middleware repositories.

-rw-r--r--  1 root root 256 Nov 13 12:03 ific-extra.repo
-rw-r--r--  1 root root 585 Nov 13 12:03 glite-generic.repo
-rw-r--r--  1 root root 362 Nov 13 12:03 dag.repo
-rw-r--r--  1 root root 279 Nov 13 12:03 lcg-CA.repo
-rw-r--r--  1 root root 630 Nov 13 12:03 jpackage.repo
-rw-r--r--  1 root root 330 Nov 13 12:03 ig.repo
-rw-r--r--  1 root root 350 Nov 13 12:03 ific-update-srpms.repo
-rw-r--r--  1 root root 328 Nov 13 12:03 ific-update.repo
-rw-r--r--  1 root root 475 Nov 13 12:03 ific-test-srpms.repo.rpmorig
-rw-r--r--  1 root root 244 Nov 13 12:03 ific-srpms.repo
-rw-r--r--  1 root root 222 Nov 13 12:03 ific.repo
-rw-r--r--  1 root root 279 Nov 13 12:03 ific-extra-srpms.repo
-rw-r--r--  1 root root 225 Nov 13 12:38 ific4x.repo
-rw-r--r--  1 root root 331 Nov 13 12:39 ific-update-4x.repo
NTP: Install and configure ntp for node time synchronization.
Java: Install Java 1.5.0_15 and exclude jre from updates (/etc/yum.conf).
java-1.5.0-sun-devel-1.5.0.15-1jpp
 java-1.5.0-sun-1.5.0.15-1jpp
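The exclusion in /etc/yum.conf can look like this (the exact pattern is an assumption; the intent is to keep the pinned jre/java packages from being updated):

# in the [main] section of /etc/yum.conf
exclude=jre* java-1.5.0-sun*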

If there are dependency problems during the installation you can remove the packages that require jre; in our SLI installation these were:

mozilla-plugin, ificdummy openoffice* jere susefax susefax-cover
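The packages that pull in jre can be listed with rpm before removing them, for example:

rpm -q --whatrequires jre      # packages that depend on the jre capability
 rpm -qa 'openoffice*'          # expand the wildcard names before rpm -e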

Install CAs and Storm packages:

yum install lcg-CA
 yum install -y ig_SE_storm_backend ig_SE_storm_frontend

We are using separate node(s) for GridFTP, so we do not install GridFTP here, although that is a possible scenario for smaller sites. Depending on the access pattern you can also install the frontend and backend on different nodes, or even run several frontends for larger sites.

Other:

Copy the grid host certificates to /etc/grid-security.

In production we create the user "storm" (with uid:gid 666:666) prior to the yaim configuration, so that the permissions to access the data in /lustre are preserved.
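A minimal sketch of that user creation, assuming uid/gid 666 is free on the node:

groupadd -g 666 storm
 useradd -u 666 -g 666 storm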

Configuration

We are using the ig-yaim release provided by INFN, which builds on top of the generic gLite releases, following the instructions at http://igrelease.forge.cnaf.infn.it/doku.php?id=doc:guides:install-3_1

Yaim packages:

Ig-yaim-storm-4.0.8-1
 glite-yaim-core-4.0.8-7
 ig-yaim-4.0.8-3_1
Prepare the configuration files: yaim maintains a site config file plus a per-service file holding the variables belonging to each service. We include here the StoRM-specific variables from the file services/ig-se_storm_backend (*see below some notes about parameters not included).
#########################################
# StoRM backend configuration variables #
#########################################

#ig : needed for the ig-yaim configuration
 NTP_HOSTS_IP=147.156.1.1

STORM_HOST=$SE_SRM_HOST
 STORM_PORT=8443
 STORM_USER=storm

STORM_GRIDFTP_POOL_LIST=(tux4u01.ific.uv.es)

# Database settings.
# Host for database connection. COMPULSORY
 STORM_DB_HOST=localhost
# User for database connection. COMPULSORY
 STORM_DB_USER=storm

# Protocol support.
# If set to 'FALSE', the following variables prevent the corresponding protocol
# to be published by the StoRM gip. OPTIONAL - Available values: [true|false] - Default value: true
#STORM_INFO_FILE_SUPPORT=true
#STORM_INFO_GRIDFTP_SUPPORT=true
#STORM_INFO_RFIO_SUPPORT=true
#STORM_INFO_ROOT_SUPPORT=true
 STORM_INFO_RFIO_SUPPORT=false
 STORM_INFO_ROOT_SUPPORT=false

STORM_DEFAULT_ROOT=/lustre/ific.uv.es/grid

ATLAS_STORAGE_AREAS="atlas atlasdatadisk atlasmcdisk atlasenduser atlasgroupdisk atlashotdisk atlaslocalgroupdisk atlasproddisk atlasscratchdisk atlasuserdisk"
STORM_STORAGEAREA_LIST=" $ATLAS_STORAGE_AREAS dteam ops swetest ific opscsic gencsic ngies opsngi genngi formngi grid4build nanodev mosfet archist blast slgrid photonics qcomp frodock odthpiv filogen timones gphase turbulencia meteo"

STORM_ATLASDATADISK_VONAME="atlas"
STORM_ATLASDATADISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasdatadisk
 STORM_ATLASDATADISK_TOKEN="ATLASDATADISK"
STORM_ATLASDATADISK_ACCESSPOINT=$STORM_ATLASDATADISK_ROOT

STORM_ATLASMCDISK_VONAME="atlas"
STORM_ATLASMCDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasmcdisk
 STORM_ATLASMCDISK_TOKEN="ATLASMCDISK"
STORM_ATLASMCDISK_ACCESSPOINT=$STORM_ATLASMCDISK_ROOT

STORM_ATLASENDUSER_VONAME="atlas"
STORM_ATLASENDUSER_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasenduser
 STORM_ATLASENDUSER_TOKEN="ATLASENDUSER"
STORM_ATLASENDUSER_ACCESSPOINT=$STORM_ATLASENDUSER_ROOT

STORM_ATLASGROUPDISK_VONAME="atlas"
STORM_ATLASGROUPDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasgroupdisk
 STORM_ATLASGROUPDISK_TOKEN="ATLASGROUPDISK"
STORM_ATLASGROUPDISK_ACCESSPOINT=$STORM_ATLASGROUPDISK_ROOT

STORM_ATLASHOTDISK_VONAME="atlas"
STORM_ATLASHOTDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlashotdisk
 STORM_ATLASHOTDISK_TOKEN="ATLASHOTDISK"
STORM_ATLASHOTDISK_ACCESSPOINT=$STORM_ATLASHOTDISK_ROOT

STORM_ATLASLOCALGROUPDISK_VONAME="atlas"
STORM_ATLASLOCALGROUPDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlaslocalgroupdisk
 STORM_ATLASLOCALGROUPDISK_TOKEN="ATLASLOCALGROUPDISK"
STORM_ATLASLOCALGROUPDISK_ACCESSPOINT=$STORM_ATLASLOCALGROUPDISK_ROOT
 STORM_ATLASPRODDISK_VONAME="atlas"
STORM_ATLASPRODDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasproddisk
 STORM_ATLASPRODDISK_TOKEN="ATLASPRODDISK"
STORM_ATLASPRODDISK_ACCESSPOINT=$STORM_ATLASPRODDISK_ROOT

STORM_ATLASSCRATCHDISK_VONAME="atlas"
STORM_ATLASSCRATCHDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasscratchdisk
 STORM_ATLASSCRATCHDISK_TOKEN="ATLASSCRATCHDISK"
STORM_ATLASSCRATCHDISK_ACCESSPOINT=$STORM_ATLASSCRATCHDISK_ROOT

STORM_ATLASUSERDISK_VONAME="atlas"
STORM_ATLASUSERDISK_ROOT=$STORM_DEFAULT_ROOT/atlas/atlasuserdisk
 STORM_ATLASUSERDISK_TOKEN="ATLASUSERDISK"
STORM_ATLASUSERDISK_ACCESSPOINT=$STORM_ATLASUSERDISK_ROOT

STORM_ATLAS_VONAME="atlas"
STORM_ATLAS_ROOT=$STORM_ATLASSCRATCHDISK_ROOT
 STORM_ATLAS_TOKEN="ATLAS"
STORM_ATLAS_ACCESSPOINT=$STORM_ATLAS_ROOT


Only non-default parameters have been included here; the rest are #commented and omitted for the sake of simplicity.

The storage areas are declared in the STORM_STORAGEAREA_LIST variable and configured through the related STORM_<VO_NAME>_* variables. Only the ones referring to ATLAS tokens are included here, since the others are similar and only ATLAS is in the scope of this wiki.

Configuration files issue:

The INFN release installs by default several services that we do not need. We remove the configuration for GRIDEYE and NFS by commenting out or removing the corresponding entries in the yaim file /opt/glite/yaim/node-info.d/ig-se_storm_backend:

/opt/glite/yaim/node-info.d/ig-se_storm_backend:
#config_fmon_client
#config_nfs_sw_dir_server

Configure the services:

/opt/glite/yaim/bin/ig_yaim -c -s site-info-pps-ific-SL4.def -n ig_SE_storm_backend -n ig_SE_storm_frontend
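After the configuration it is worth checking that the frontend is listening on the SRM port and that the StoRM processes are running, for instance (port 8443 as configured above):

netstat -tlnp | grep 8443      # frontend listening on the SRM port
 ps -ef | grep -i storm         # backend and frontend processes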

ATLAS authorization configuration

To achieve a proper authorization configuration and be able to access the data, we use a special authorization plugin developed by us, which is now included in the main StoRM sources. This means you can use it, although it is not very well documented.

Set the plugin to be used, and the related variables, in the storm.properties file:

/opt/storm/backend/etc/storm.properties:

        authorization.sources = LocalAuthorizationSource
        # =======================
        # new directives
        # =======================
        # set write permission on newly created directories
        # required by LocalAuthorizationSource
        directory.writeperm = true

In our SRM and GridFTP configuration, all ATLAS users with a given role are mapped to a single account. Check the mapping files:

/etc/grid-security/voms-grid-mapfile:

        "/atlas/Role=lcgadmin" atls000
        "/atlas/Role=lcgadmin/Capability=NULL" atls000
        "/atlas/Role=production" atlp000
        "/atlas/Role=production/Capability=NULL" atlp000
        "/atlas/Role=software" atls000
        "/atlas/Role=software/Capability=NULL" atls000
        "/atlas/Role=pilot" atlu000
        "/atlas/Role=pilot/Capability=NULL" atlu000
        "/atlas" atlu000
        "/atlas/Role=NULL" atlu000
        "/atlas/Role=NULL/Capability=NULL" atlu000

file /opt/edg/etc/edg-mkgridmap.conf:

        # ATLAS
        # Map VO members  (prd)
        group vomss://voms.cern.ch:8443/voms/atlas?/atlas/Role=production atlp000

        # Map VO members  (sgm)
        group vomss://voms.cern.ch:8443/voms/atlas?/atlas/Role=lcgadmin atls000

        # Map VO members  (root Group)
        group vomss://voms.cern.ch:8443/voms/atlas?/atlas atlu000

The idea is that with this configuration the plugin uses the ACLs on disk to grant or deny access to the files, according to the ATLAS policies. You can check the ACLs that we have defined for the following spaces of the ATLAS pool in /lustre/ific.uv.es/grid/atlas:

# file: ..                                        
# owner: root                                     
# group: root                                     
user::rwx                                         
group::r-x                                        
other::r-x                                        

# file: .
# owner: atlu000
# group: atlas  
user::rwx       
user:101:rwx    
group::rwx      
mask::rwx       
other::r-x      

# file: atlasdatadisk
# owner: storm       
# group: storm       
user::rwx            
group::r-x           
group:atlas:r-x      
group:atlp:rwx       
mask::rwx            
other::---           
default:user::rwx    
default:group::r-x   
default:group:atlas:r-x
 default:mask::rwx      
default:other::---     

# file: atlasenduser
# owner: storm      
# group: storm      
user::rwx           
group::rwx          
group:atlas:rwx     
mask::rwx           
other::---          

# file: atlasgroupdisk
# owner: storm        
# group: storm        
user::rwx             
group::r-x            
group:atlas:r-x       
group:atlp:rwx        
mask::rwx             
other::---            
default:user::rwx     
default:group::r-x    
default:group:atlas:r-x
 default:mask::rwx      
default:other::---     

# file: atlashotdisk
# owner: storm      
# group: storm      
user::rwx           
group::r-x          
group:atlas:r-x     
group:atlp:rwx      
group:atls:--x      
mask::rwx           
other::---          
default:user::rwx   
default:group::r-x  
default:group:atlas:r-x
 default:mask::rwx      
default:other::r-x     

# file: atlaslocalgroupdisk
# owner: storm             
# group: storm             
user::rwx                  
group::rwx                 
group:atlas:rwx            
group:atlp:--x             
mask::rwx                  
other::---                 

# file: atlasmcdisk
# owner: storm
# group: storm
 user::rwx
 group::r-x
 group:atlas:r-x
 group:atlp:rwx
 mask::rwx
 other::---
default:user::rwx
 default:group::r-x
 default:group:atlas:r-x
 default:mask::rwx
 default:other::---

# file: atlasproddisk
# owner: storm
# group: storm
 user::rwx
 group::r-x
 group:atlas:r-x
 group:atlp:rwx
 mask::rwx
 other::---
default:user::rwx
 default:group::r-x
 default:group:atlas:r-x
 default:mask::rwx
 default:other::---

# file: atlasscratchdisk
# owner: storm
# group: storm
 user::rwx
 group::rwx
 group:atlas:rwx
 group:atlp:--x
 mask::rwx
 other::---

# file: atlasuserdisk
# owner: storm
# group: storm
 user::rwx
 group::rwx
 group:atlas:rwx
 group:atlp:--x
 group:atls:--x
 mask::rwx
 other::---

# file: generated
# owner: atlu000
# group: atlas
 user::rwx
 group::rwx
 other::r-x

See for example the atlasproddisk space in the box above: ordinary ATLAS users can access and read the files, but only production users (the atlp mapped GID) can write. The access ACLs implement this policy, and the default ACLs and mask propagate it to files and subdirectories created inside.
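As an illustration, ACLs like the atlasproddisk ones above could be (re)applied with setfacl along these lines (a sketch using the group names shown above):

setfacl -m g:atlas:r-x,g:atlp:rwx,m:rwx,o::--- /lustre/ific.uv.es/grid/atlas/atlasproddisk
 setfacl -d -m u::rwx,g::r-x,g:atlas:r-x,m:rwx,o::--- /lustre/ific.uv.es/grid/atlas/atlasproddisk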

Dynamic Info Provider

The initial default configuration for the available and free disk space checks the disk, but it does not take into account our pool configuration for the different spaces, especially for the ATLAS VO. While running, it does not correctly update the information in the Information System nor in the internal StoRM database.

To be able to publish the information correctly we developed a script that updates this data frequently; you can find it here: space-tokens-dynamicinfo_v2.tgz
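It could for example be run periodically from cron; the script path and frequency below are only an illustration (adapt them to where you unpack the tarball):

*/10 * * * * root /opt/storm/backend/sbin/space-tokens-dynamicinfo.sh >> /var/log/storm/dynamicinfo.log 2>&1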

Configuration issues

  • If you are not using the default port, check the configuration file, since yaim does not change it properly:
 /opt/storm/backend/etc/storm.properties:
        storm.service.port=8444
        fe.port = 8444
  • On a previous version the configuration did not correctly interpret the FQANs in VOMS proxies, like this:
/opt/srmv2storm/var/log/srmv2storm.log:
06/14 18:15:46 26838,0 Rm: UserDN=/C=ES/O=DATAGRID-ES/O=IFIC/CN=Alvaro Fernandez Casani
 06/14 18:15:46 26838,0 Rm: Number of FQANs: 0

  • In this case, check that the yaim configuration creates the correct directories containing the VOMS server certificates (or, for newer versions, the DN of the VOMS server):
[root@srmv2 storm]# ll /etc/grid-security/vomsdir/
total 432                                         
drwxr-xr-x  2 root root 4096 Nov 26 15:12 atlas   
-rw-r--r--  1 root root 4620 Jan 19 15:41 cclcgvomsli01.in2p3.fr.1413.pem
-rw-r--r--  1 root root 1440 Dec  2 16:09 cert-voms-01.cnaf.infn.it.pem  
-rw-r--r--  1 root root 1440 Dec  2 16:09 cert-voms-01.cnaf.infn.it.pem.1
-rw-r--r--  1 root root 1424 Dec  2 16:09 cert-voms-01.cnaf.infn.it.pem.2
-rw-r--r--  1 root root 5857 Mar 17  2009 dgrid-voms.fzk.de.4515.pem
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 dteam
-rw-r--r--  1 root root 5182 Mar 17  2009 glite-io.scai.fraunhofer.de.4499.pem
-rw-r--r--  1 root root 1436 Dec  2 16:09 grid12.lal.in2p3.fr.pem
-rw-r--r--  1 root root 5132 Mar 17  2009 grid-voms.desy.de.5369.pem
-rw-r--r--  1 root root 5132 Mar 17  2009 grid-voms.desy.de.8119.pem
-rw-r--r--  1 root root 1428 Mar 17  2009 grid-voms.esrf.eu.4140.pem
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 ific
-rw-r--r--  1 root root 5968 Jan 19 15:41 lcg-voms.cern.ch.2009-03-03.pem
-rw-r--r--  1 root root 6782 Jan 19 15:41 lcg-voms.cern.ch.2010-01-18.pem
-rw-r--r--  1 root root 5154 Dec  2 16:09 mu4.matrix.sara.nl.pem
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 ngi-es
 drwxr-xr-x  2 root root 4096 Nov 23 22:40 ops
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 swetest
-rw-r--r--  1 root root 2496 Nov 27 13:57 swevo.ific.uv.es.pem
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.archist.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.blast.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.filogen.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.formacion.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.frodock.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.general.csic.es
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.general.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.gphase.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.grid4build.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.meteo.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.mosfet.es-ngi.eu
-rw-r--r--  1 root root 2502 Jul  7  2009 voms01.ifca.es.pem
-rw-r--r--  1 root root 1419 Dec  2 16:09 voms-01.pd.infn.it.pem
-rw-r--r--  1 root root 1420 Dec  2 16:09 voms-01.pd.infn.it.pem.1
-rw-r--r--  1 root root 1419 Dec  2 16:09 voms-02.pd.infn.it.pem
-rw-r--r--  1 root root 1420 Dec  2 16:09 voms-02.pd.infn.it.pem.1
-rw-r--r--  1 root root 1424 Dec  2 16:09 voms2.cnaf.infn.it.pem
-rw-r--r--  1 root root 1424 Dec  2 16:09 voms2.cnaf.infn.it.pem.1
-rw-r--r--  1 root root 1404 Dec  2 16:09 voms2.cnaf.infn.it.pem.2
-rw-r--r--  1 root root 6552 Jan 19 15:41 voms.cern.ch.2009-06-22.pem
-rw-r--r--  1 root root 1419 Dec  2 16:09 voms.cnaf.infn.it.pem
-rw-r--r--  1 root root 1419 Dec  2 16:09 voms.cnaf.infn.it.pem.1
-rw-r--r--  1 root root 1399 Dec  2 16:09 voms.cnaf.infn.it.pem.2
-rw-r--r--  1 root root 4865 Jan 19 15:41 voms.fnal.gov.35501.pem
-rw-r--r--  1 root root 1484 Dec  2 16:09 voms.fnal.gov.pem
-rw-r--r--  1 root root 1298 Dec  2 16:09 voms.fnal.gov.pem.1
-rw-r--r--  1 root root 1843 Dec  2 16:09 voms.gridpp.ac.uk.pem
-rw-r--r--  1 root root 2138 Dec  2 16:09 voms.gridpp.ac.uk.pem.1
-rw-r--r--  1 root root 1651 Dec  2 16:09 voms.grid.sara.nl.pem
-rw-r--r--  1 root root 5152 Dec  2 16:09 voms.grid.sara.nl.pem.1
-rw-r--r--  1 root root 2483 Nov 27 14:05 voms.ific.uv.es.pem
-rw-r--r--  1 root root 6566 Jan 19 15:41 voms-pilot.cern.ch.2009-06-30.pem
-rw-r--r--  1 root root 1472 Dec  2 16:09 voms.research-infrastructures.eu.pem
-rw-r--r--  1 root root 1472 Dec  2 16:09 voms.research-infrastructures.eu.pem.1
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.nanodev.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.odthpiv.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.operaciones.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 26 15:12 vo.ops.csic.es
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.photonics.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.qcomp.es-ngi.eu
-rw-r--r--  1 root root 4697 Jan 19 15:41 vo.racf.bnl.gov.40260.pem
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.slgrid.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.timones.es-ngi.eu
 drwxr-xr-x  2 root root 4096 Nov 13 13:15 vo.turbulencia.es-ngi.eu
  • Then correct interpretation of the FQANs is accomplished:
02/18 13:13:08   495,0 Mkdir: UserDN=/DC=es/DC=irisgrid/O=ific/CN=Alvaro-Fernandez
 02/18 13:13:08   495,0 Mkdir: Number of FQANs: 2
 02/18 13:13:08   495,0 Mkdir: FQAN[0]: /atlas/Role=NULL/Capability=NULL
 02/18 13:13:08   495,0 Mkdir: FQAN[1]: /atlas/lcg1/Role=NULL/Capability=NULL
  • If there are problems with Java during the reconfiguration, it has trouble locating java and starting the storm-backend server. Solution:
[root@ccc01 yaim]# alternatives --install /usr/bin/java java /usr/java/jdk1.5.0_07/bin/java 1
[root@ccc01 yaim]# alternatives --config java

There are 2 programs which provide 'java'.

  Selection    Command
-----------------------------------------------
*+ 1           /usr/share/java/libgcj-java-placeholder.sh
   2           /usr/java/jdk1.5.0_07/bin/java

Enter to keep the current selection[+], or type selection number: 2

Srm command testing

We developed a simple battery of tests to check the SRM functionality of the installed StoRM server. You can find the results below and the test script here: test-srm.sh

You will need the SRM client distributed on the StoRM web page: http://storm.forge.cnaf.infn.it/downloads/client

srmPing

Testing Ping command: ./clientSRM Ping -e httpg://srmv2.ific.uv.es:8443
============================================================           
Sending Ping request to: httpg://srmv2.ific.uv.es:8443                 
============================================================           
Request status:                                                        
  statusCode="SRM_SUCCESS"(0)                                          
  explanation="SRM server successfully contacted"                      
============================================================           
SRM Response:                                                          
  versionInfo="v2.2"                                                   
  otherInfo (size=2)                                                   
    [0] key="backend_type"                                             
    [0] value="StoRM"                                                  
    [1] key="backend_version"                                          
    [1] value="<FE:1.4.0-01.sl4><BE:1.4.0-00>"                         
============================================================
Ping Command:                                                          
                                                           [  OK  ]

srmMkdir

Testing Mkdir command: ./clientSRM mkdir -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_mkdir_26999
============================================================                                                                                                            
Sending Mkdir request to: httpg://srmv2.ific.uv.es:8443                                                                                                                 
============================================================                                                                                                            
Request status:                                                                                                                                                         
  statusCode="SRM_SUCCESS"(0)                                                                                                                                           
  explanation="Directory created with success"                                                                                                                          
============================================================                                                                                                            
SRM Response:                                                                                                                                                           
============================================================                                                                                                            
Mkdir Command:                                                                                                                                                          
                                                           [  OK  ]

srmRmdir

Testing Rmdir command: ./clientSRM rmdir -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_mkdir_26999
============================================================                                                                                                            
Sending Rmdir request to: httpg://srmv2.ific.uv.es:8443                                                                                                                 
============================================================                                                                                                            
Request status:                                                                                                                                                         
  statusCode="SRM_SUCCESS"(0)                                                                                                                                           
  explanation="Directory removed with success!"                                                                                                                         
============================================================                                                                                                            
SRM Response:                                                                                                                                                           
============================================================                                                                                                            
Rmdir Command:                                                                                                                                                          
                                                           [  OK  ]

srmReserveSpace

Testing ReserveSpace command: ./clientSRM ReserveSpace -e httpg://srmv2.ific.uv.es:8443 -a 10000000 -b 5000000 -r 0,0                                                   
============================================================                                                                                                            
Sending ReserveSpace request to: httpg://srmv2.ific.uv.es:8443                                                                                                          
============================================================                                                                                                            
Request status:                                                                                                                                                         
  statusCode="SRM_SUCCESS"(0)                                                                                                                                           
  explanation="Space Reservation done"                                                                                                                                  
============================================================                                                                                                            
SRM Response:                                                                                                                                                           
  sizeOfTotalReservedSpace=10000000                                                                                                                                     
  sizeOfGuaranteedReservedSpace=10000000                                                                                                                                
  lifetimeOfReservedSpace=2147483647                                                                                                                                    
  spaceToken="D2EF3295-2B01-1F6D-939C-74E8AF1C71F6"                                                                                                                     
============================================================                                                                                                            
ReserveSpace Command:                                                                                                                                                   
                                         [  OK  ]

where the parameters are the following:
  • -a <desiredSizeOfTotalSpace> in bytes (10000000 ≈ 10 MB)
  • -b <desiredSizeOfGuaranteedSpace> in bytes (5000000 ≈ 5 MB)
  • -r <retentionPolicy,accessLatency> retention policy (0=replica, 1=output, 2=custodial) and access latency (0=online, 1=nearline)

Currently StoRM supports:
* Retention Policy = { REPLICA }
* Access Latency = { ONLINE }
* ExpirationMode = { releaseWhenExpire, neverExpire } (Note: neverExpire will be the default)
* Access Protocol = { gsiftp, file, rfio }
* Access Pattern = { WAN, LAN } (Note: it depends on the access protocol)

srmCopy

*Note*: srmCopy works only with SURLs and not with local files. This is not reflected in the documentation.
[alferca@cg02 srmtest]$ ./clientSRM copy -e httpg://ccc01.ific.uv.es:8444/ -s file:///tmp/kk.log srm://ccc01.ific.uv.es:8444/atlas/filekk
============================================================
Sending Copy request to: httpg://ccc01.ific.uv.es:8444/
============================================================
Request status:
  statusCode="SRM_REQUEST_QUEUED"(17)
  explanation=""
============================================================
SRM Response:
  requestToken="59cbef94-a552-4886-96af-e55dae60c144"
  arrayOfFileStatuses (size=1)
      [0] sourceSURL="file:///tmp/kk.log"
      [0] targetSURL="srm://ccc01.ific.uv.es:8444/atlas/filekk"
      [0] status: statusCode="SRM_REQUEST_QUEUED"(17)
                  explanation=""
============================================================
[alferca@cg02 srmtest]$ ./clientSRM statuscopy -e httpg://ccc01.ific.uv.es:8444/ -t 59cbef94-a552-4886-96af-e55dae60c144
============================================================
Sending StatusCopy request to: httpg://ccc01.ific.uv.es:8444/
============================================================
Request status:
  statusCode="SRM_FAILURE"(1)
  explanation="This SRM Copy request contained nothing to process!"
============================================================
SRM Response:
  remainingTotalRequestTime=0
  arrayOfFileStatuses (size=1)
      [0] sourceSURL="file:///tmp/kk.log"
      [0] targetSURL="srm://ccc01.ific.uv.es:8444/atlas/filekk"
      [0] estimatedWaitTime=-1
      [0] status: statusCode="SRM_FAILURE"(1)
                  explanation="This SRM Copy request contained nothing to process!"
============================================================
srmCopy in pull mode is not implemented, and this test currently *FAILS*. If you want to add a local file to the storage through SRM, we suggest the following sequence of operations: prepareToPut with the target SURL, globus-url-copy with the TURL returned by the prepareToPut, and srmPutDone to notify the SRM of the end of the transfer.
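A compact sketch of that sequence with the same clientSRM tool used below (the SURL and local file are placeholders):

./clientSRM PtP -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/myfile
 ./clientSRM StatusPtp -e httpg://srmv2.ific.uv.es:8443 -t <requestToken>     # repeat until SRM_SPACE_AVAILABLE, note the TURL
 globus-url-copy file:///tmp/myfile <TURL>
 ./clientSRM PutDone -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/myfile -t <requestToken>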

SrmPtP

Testing Ptp command: ./clientSRM Ptp -e httpg://srmv2.ific.uv.es:8443 -s file:////tmp/testfile.26999 srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999
============================================================                                                                                                                        
Sending PtP request to: httpg://srmv2.ific.uv.es:8443                                                                                                                               
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_REQUEST_QUEUED"(17)                                                                                                                                               
  explanation=""                                                                                                                                                                    
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  requestToken="ff4a0ea8-07d4-498b-a37b-847a7f530f3a"                                                                                                                               
  arrayOfFileStatuses (size=2)                                                                                                                                                      
      [0] SURL="file:////tmp/testfile.26999"                                                                                                                                        
      [0] status: statusCode="SRM_REQUEST_QUEUED"(17)                                                                                                                               
                  explanation=""                                                                                                                                                    
      [1] SURL="srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                         
      [1] status: statusCode="SRM_REQUEST_QUEUED"(17)                                                                                                                               
                  explanation=""                                                                                                                                                    
============================================================                                                                                                                        
Ptp Command:                                                                                                                                                                        
                                                           [  OK  ]

SrmStatusPtP

Testing StatusPtp command: ./clientSRM StatusPtp -e httpg://srmv2.ific.uv.es:8443 -t ff4a0ea8-07d4-498b-a37b-847a7f530f3a                                                           
============================================================                                                                                                                        
Sending StatusPtP request to: httpg://srmv2.ific.uv.es:8443                                                                                                                         
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)                                                                                                                                                       
  explanation="All chunks successfully handled!"                                                                                                                                    
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  remainingTotalRequestTime=0                                                                                                                                                       
  arrayOfFileStatuses (size=2)                                                                                                                                                      
      [0] SURL="file:////tmp/testfile.26999"                                                                                                                                        
      [0] status: statusCode="SRM_FAILURE"(1)                                                                                                                                       
                  explanation="This chunk of the request is malformed!"                                                                                                             
      [0] estimatedWaitTime=-1                                                                                                                                                      
      [1] SURL="srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                         
      [1] status: statusCode="SRM_SPACE_AVAILABLE"(24)                                                                                                                              
                  explanation="srmPrepareToPut successfully handled!"                                                                                                               
      [1] estimatedWaitTime=-1                                                                                                                                                      
      [1] TURL="gsiftp://tux4u04.ific.uv.es:2811//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                    
============================================================ 
StatusPtp Command:                                                                                                                                              
                                                           [  OK  ]

When StatusPtP is OK, the data can be sent with GridFTP:

globus-url-copy file:///tmp/testfile.26999 gsiftp://tux4u04.ific.uv.es:2811//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999
 Transfered file with globus-url-copy
 1024+0 records in                                          [  OK  ]                                                                                                     
1024+0 records out

SrmPutDone

Testing PutDone command: ./clientSRM PutDone -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999 -t ff4a0ea8-07d4-498b-a37b-847a7f530f3a
============================================================                                                                                                                        
Sending PutDone request to: httpg://srmv2.ific.uv.es:8443                                                                                                                           
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)                                                                                                                                                       
  explanation="All file requests are successfully completed"                                                                                                                        
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  arrayOfFileStatuses (size=1)                                                                                                                                                      
      [0] SURL="srm://srmv2.ific.uv.es:8443/lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                          
      [0] status: statusCode="SRM_SUCCESS"(0)                                                                                                                                       
                  explanation=""                                                                                                                                                    
============================================================                                                                                                                        
PutDone Command:                                                                                                                                                                    
                                                           [  OK  ]

SrmPrepareToGet

Testing PtG command: ./clientSRM PtG -p -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999 -T -P file   
============================================================                                                                                                                        
Sending PtG request to: httpg://srmv2.ific.uv.es:8443                                                                                                                               
============================================================                                                                                                                        
Polling request status:                                                                                                                                                             
Current status: SRM_REQUEST_QUEUED (Ctrl+c to stop polling).                                                                                                                        
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)                                                                                                                                                       
  explanation="All chunks successfully handled!"                                                                                                                                    
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  requestToken="e11bd7aa-5136-4d7a-b740-be2dd6a0c3b6"                                                                                                                               
  remainingTotalRequestTime=0                                                                                                                                                       
  arrayOfFileStatuses (size=1)                                                                                                                                                      
      [0] status: statusCode="SRM_FILE_PINNED"(22)                                                                                                                                  
                  explanation="srmPrepareToGet successfully handled!"                                                                                                               
      [0] sourceSURL="srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                   
      [0] fileSize=1048576                                                                                                                                                          
      [0] estimatedWaitTime=-1                                                                                                                                                      
      [0] transferURL="file:///lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file_26999"                                                                                       
============================================================                                                                                                                        
PtG Command:                                                                                                                                                                        
                                                           [  OK  ]

SrmGetSpaceTokens

Testing GetSpaceTokens command: ./clientSRM GetSpaceTokens -v -e httpg://srmv2.ific.uv.es:8443  -d ATLASMCDISK                                                                      
============================================================                                                                                                                        
Sending GetSpaceTokens request to: httpg://srmv2.ific.uv.es:8443                                                                                                                    
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)                                                                                                                                                       
  explanation=""                                                                                                                                                                    
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  arrayOfSpaceTokens(size=1)                                                                                                                                                        
  [0] "1405D063-2901-073C-939C-74E8BBCFCD45"                                                                                                                                        
============================================================                                                                                                                        
GetSpaceTokens Command:                                                                                                                                                             
                                                           [  OK  ]

SrmGetSpaceMetadata

Testing GetSpaceMetadata command: ./clientSRM GetSpaceMetadata -v -e httpg://srmv2.ific.uv.es:8443  -s                                                                              
1405D063-2901-073C-939C-74E8BBCFCD45                                                                                                                                                
============================================================                                                                                                                        
Sending GetSpaceMetaData request to: httpg://srmv2.ific.uv.es:8443                                                                                                                  
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)                                                                                                                                                       
  explanation=""                                                                                                                                                                    
============================================================                                                                                                                        
SRM Response:                                                                                                                                                                       
  arrayOfSpaceDetails (size=1)                                                                                                                                                      
      [0] spaceToken="1405D063-2901-073C-939C-74E8BBCFCD45"                                                                                                                         
      [0] status: statusCode="SRM_SUCCESS"(0)                                                                                                                                       
                  explanation="Valid space token"                                                                                                                                   
      [0] owner="/DC=it/DC=infngrid/OU=Services/CN=storm"                                                                                                                           
      [0] totalSize=154568936964096                                                                                                                                                 
      [0] guaranteedSize=154568936964096                                                                                                                                            
      [0] unusedSize=60829260051832                                                                                                                                                 
      [0] lifetimeAssigned=0                                                                                                                                                        
      [0] lifetimeLeft=0                                                                                                                                                            
============================================================                                                                                                                        
GetSpaceMetadata Command:                                                                                                                                                           
1024+0 records in                                          [  OK  ]                                                                                                                 
1024+0 records out                                                                                                                                                                  

Testing the complete cycle to put a file

Testing Ptp with space token command: ./clientSRM Ptp -e httpg://srmv2.ific.uv.es:8443 -p -t 
1405D063-2901-073C-939C-74E8BBCFCD45 -s file:////tmp/testfile2.26999 srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999
============================================================                                                                                                    
Sending PtP request to: httpg://srmv2.ific.uv.es:8443                                                                                                           
============================================================                                                                                                    
Polling request status:                                                                                                                                         
Current status: SRM_REQUEST_QUEUED (Ctrl+c to stop polling).                                                                                                    
============================================================                                                                                                    
Request status:                                                                                                                                                 
  statusCode="SRM_SUCCESS"(0)                                                                                                                                   
  explanation="All chunks successfully handled!"                                                                                                                
============================================================                                                                                                    
SRM Response:                                                                                                                                                   
  requestToken="86311065-0014-4e12-9b7f-dc4cdc0850ef"                                                                                                           
  remainingTotalRequestTime=0                                                                                                                                   
  arrayOfFileStatuses (size=2)                                                                                                                                  
      [0] SURL="file:////tmp/testfile2.26999"                                                                                                                   
      [0] status: statusCode="SRM_FAILURE"(1)                                                                                                                   
                  explanation="This chunk of the request is malformed!"                                                                                         
      [0] estimatedWaitTime=-1                                                                                                                                  
      [1] SURL="srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999"                                                    
      [1] status: statusCode="SRM_SPACE_AVAILABLE"(24)                                                                                                          
                  explanation="srmPrepareToPut successfully handled!"                                                                                           
      [1] estimatedWaitTime=-1                                                                                                                                  
      [1] TURL="gsiftp://tux4u01.ific.uv.es:2811//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999"                                               
============================================================                                                                                                    
Ptp with space token Command:                                                                                                                                   
                                                           [  OK  ]                                                                                             
Testing StatusPtp command: ./clientSRM StatusPtp -e httpg://srmv2.ific.uv.es:8443 -t 86311065-0014-4e12-9b7f-dc4cdc0850ef                                       
============================================================                                                                                                    
Sending StatusPtP request to: httpg://srmv2.ific.uv.es:8443                                                                                                     
============================================================                                                                                                    
Request status:                                                                                                                                                 
  statusCode="SRM_SUCCESS"(0)                                                                                                                                   
  explanation="All chunks successfully handled!"                                                                                                                
============================================================                                                                                                    
SRM Response:                                                                                                                                                   
  remainingTotalRequestTime=0                                                                                                                                   
  arrayOfFileStatuses (size=2)                                                                                                                                  
      [0] SURL="file:////tmp/testfile2.26999"                                                                                                                   
      [0] status: statusCode="SRM_FAILURE"(1)                                                                                                                   
                  explanation="This chunk of the request is malformed!"                                                                                         
      [0] estimatedWaitTime=-1                                                                                                                                  
      [1] SURL="srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999"                                                    
      [1] status: statusCode="SRM_SPACE_AVAILABLE"(24)                                                                                                          
                  explanation="srmPrepareToPut successfully handled!"                                                                                           
      [1] estimatedWaitTime=-1                                                                                                                                  
      [1] TURL="gsiftp://tux4u01.ific.uv.es:2811//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999"                                               
============================================================                                                                                                    
StatusPtp Command:                                                                                                                                              
globus-url-copy file:///tmp/testfile2.26999 gsiftp://tux4u01.ific.uv.es:2811//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999                    
Transfered file with globus-url-copy:                                                                                                                           
                                                           [  OK  ]                                                                                             
Testing PutDone command: ./clientSRM PutDone -e httpg://srmv2.ific.uv.es:8443 -s srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999 -t 86311065-0014-4e12-9b7f-dc4cdc0850ef
============================================================                                                                                                                        
Sending PutDone request to: httpg://srmv2.ific.uv.es:8443                                                                                                                           
============================================================                                                                                                                        
Request status:                                                                                                                                                                     
  statusCode="SRM_SUCCESS"(0)
  explanation="All file requests are successfully completed"
============================================================
SRM Response:
  arrayOfFileStatuses (size=1)
      [0] SURL="srm://srmv2.ific.uv.es:8443/lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_26999"
      [0] status: statusCode="SRM_SUCCESS"(0)
                  explanation=""
============================================================
PutDone Command:
1024+0 records in                                          [  OK  ]
1024+0 records out
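
For reference, the whole put cycle above can be driven from a small script. A minimal sketch follows (it mirrors what the test-srm.sh attachment exercises; the endpoint, space token and paths are the site-specific values from the example, and only the destination SURL is passed to -s, which avoids the malformed file:// chunk seen in the output above):

#!/bin/bash
# Sketch of the complete put cycle: PtP -> StatusPtp -> globus-url-copy -> PutDone
ENDPOINT=httpg://srmv2.ific.uv.es:8443
SPACETOKEN=1405D063-2901-073C-939C-74E8BBCFCD45
LOCAL=/tmp/testfile2.$$
SURL=srm://srmv2.ific.uv.es:8443//lustre/ific.uv.es/grid/atlas/atlasscratchdisk/test_file2_$$

dd if=/dev/zero of=$LOCAL bs=1k count=1024        # 1 MB test file

# 1) srmPrepareToPut with the space token; keep the request token from the output
OUT=$(./clientSRM Ptp -e $ENDPOINT -p -t $SPACETOKEN -s $SURL)
TOKEN=$(echo "$OUT" | awk -F'"' '/requestToken/ {print $2}')

# 2) Poll the request until the SURL is SRM_SPACE_AVAILABLE and a TURL is returned
OUT=$(./clientSRM StatusPtp -e $ENDPOINT -t $TOKEN)
TURL=$(echo "$OUT" | awk -F'"' '/TURL/ {print $2}')

# 3) Transfer the data to the TURL, then close the request with PutDone
globus-url-copy file://$LOCAL $TURL
./clientSRM PutDone -e $ENDPOINT -s $SURL -t $TOKEN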


GFAL testing

  • GFAL and the lcg_util tools support the srmv2.2 protocol starting with lcg_util-1.5.2; older versions do not (a quick version check is sketched below).
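
A quick way to confirm that the installed client tools are recent enough is to check the package versions on the UI or WN (package names as shipped with gLite 3.x; adjust if your installation differs):

rpm -q lcg_util GFAL-client   # anything older than lcg_util-1.5.2 cannot talk srmv2.2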

lcg-cr: copy a file to the remote SE and register it in the catalog

[alferca@cg02 jdl]$ lcg-cr -v --vo atlas -d ccc01.ific.uv.es -l lfn:/grid/atlas/ific.uv.es/test.$$ file:///tmp/kk.log
 Using grid catalog type: lfc
 Using grid catalog : lfc02.pic.es
 Using LFN : /grid/atlas/ific.uv.es/test.12208
 Using SURL : srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88: srmPrepareToPut successfully handled!
Source URL: file:///tmp/kk.log
 File size: 9
 VO name: atlas
 Destination specified: ccc01.ific.uv.es
 Destination URL for copy: gsiftp://ccc01.ific.uv.es:2811/lustre/ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88
# streams: 1
# set timeout to 0 seconds
 Alias registered in Catalog: lfn:/grid/atlas/ific.uv.es/test.12208
            9 bytes      0.02 KB/sec avg      0.02 KB/sec inst
 Transfer took 1060 ms
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88:
Destination URL registered in Catalog: srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88
 guid:573a6648-bb4e-4724-ae77-b70141a60946
   

lcg-cp: copy a file back and store it locally

[alferca@cg02 jdl]$ lcg-cp -v --vo atlas lfn:/grid/atlas/ific.uv.es/test.$$ file:///tmp/kk-copied.log
 Using grid catalog type: lfc
 Using grid catalog : lfc02.pic.es
 VO name: atlas
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88: srmPrepareToGet successfully handled!
Source URL: lfn:/grid/atlas/ific.uv.es/test.12208
 File size: 9
 Source URL for copy: gsiftp://ccc01.ific.uv.es:2811/lustre/ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88
 Destination URL: file:///tmp/kk-copied.log
# streams: 1
# set timeout to  0 (seconds)
            0 bytes      0.00 KB/sec avg      0.00 KB/sec inst
 Transfer took 1020 ms
 Released
   

lcg-rep: replicate a file to a remote storage element

[alferca@cg02 alferca]$ lcg-rep -v --vo atlas -d grid007g.cnaf.infn.it lfn:/grid/atlas/ific.uv.es/test3
 Using grid catalog type: lfc
 Using grid catalog : lfc02.pic.es
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18: srmPrepareToGet successfully handled!
Source URL: lfn:/grid/atlas/ific.uv.es/test3
 File size: 9
 VO name: atlas
 Destination specified: grid007g.cnaf.infn.it
 Source URL for copy: gsiftp://ccc01.ific.uv.es:2811/lustre/ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18
 Destination URL for copy: gsiftp://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-25/filecea8d543-fd6a-4934-be52-0245cd014a7a
# streams: 1
# set timeout to 0
            0 bytes      0.00 KB/sec avg      0.00 KB/sec inst
 Transfer took 2020 ms
 Released
 Destination URL registered in LRC: sfn://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-25/filecea8d543-fd6a-4934-be52-0245cd014a7a
   

lcg-lr: list replicas

[alferca@cg02 alferca]$ lcg-lr -v --vo atlas  lfn:/grid/atlas/ific.uv.es/test3
 sfn://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-25/filecea8d543-fd6a-4934-be52-0245cd014a7a
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18
   

lcg-del: remove replicas of files

[alferca@cg02 alferca]$ lcg-del -v --vo atlas -a  lfn:/grid/atlas/ific.uv.es/test3
 VO name: atlas
 Using GUID : 90d0dc51-db18-47c2-9d0c-c6b37e62833e
 set timeout to 0 seconds
 sfn://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-25/filecea8d543-fd6a-4934-be52-0245cd014a7a is deleted
 sfn://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-25/filecea8d543-fd6a-4934-be52-0245cd014a7a is unregistered
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18: File removed
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18 is deleted
 srm://ccc01.ific.uv.es/atlas/generated/2007-07-25/filed5fcf388-ac93-4f2e-a88d-b127eabded18 is unregistered
   

Common Errors

  1. Authentication errors on the MySQL database, like:

    [root@ccc01 current]# /etc/rc.d/init.d/srmv2storm restart
     srmv2storm already stopped:                                [FAILED]
    Starting srmv2storm: bash: /root/.bashrc: Permission denied
     07/03 15:16:47 14106 srm_main: verbose level = 4
     07/03 15:16:47 14106 srmv2: started
     07/03 15:16:47 14106 srmv2: Using 'http://localhost:8080/RPC2' as XMLRPC endpoint
     07/03 15:16:47 14106 srmv2: Proxy directory for srmCopy: /opt/storm/var/proxies
     07/03 15:16:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:17:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:18:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:19:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:20:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:21:47 14106 storm_opendb: CONNECT error: Access denied for user 'storm'@'localhost' (using password: YES)
    07/03 15:21:47 14106 storm_opendb: Too many connection attempts to the DB. Cowardly refusing to continue.
    07/03 15:21:47 14106 srmv2: SRM02 - get_supported_protocols error : Cannot allocate memory
       
    
  • Cause: the configuration files were not modified according to the user password set in the YAIM configuration files.
  • Solution: modify the password by hand in the following files (a sketch follows the file list).

     
    /etc/sysconfig/srmv2storm.nsconfig -> /opt/srmv2storm/etc/sysconfig/srmv2storm.nsconfig
    /opt/srmv2storm/etc/db/storm_mysql_grant.sql
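
A minimal sketch of the fix, assuming the password set in the YAIM configuration is the one that should end up in both files (check the contents of the grant file before re-applying it):

# Edit the two files listed above so they carry the same storm DB password
vi /etc/sysconfig/srmv2storm.nsconfig
vi /opt/srmv2storm/etc/db/storm_mysql_grant.sql
# Re-apply the grants as the MySQL root user and restart the service
mysql -u root -p < /opt/srmv2storm/etc/db/storm_mysql_grant.sql
/etc/rc.d/init.d/srmv2storm restart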
       
    
  2. Errors creating remote directories:
  • The logs show errors like: "Caused by: Malformed SURL does not contains StFNRoot: srm://ccc01.ific.uv.es:8444/atlas/XXX" (a quick check of the configured StFN roots is sketched below).
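
The StFN roots the backend knows about can be checked in the namespace configuration; the path below is an assumption for the /opt/storm layout of this StoRM release and may differ on other installations:

# List the StFN roots configured for the storage areas (path is an assumption)
grep -i 'StFNRoot' /opt/storm/backend/etc/namespace.xml
# The path in the SURL (after host:port) must start with one of these roots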
  3. Connection errors using lcg_utils:

    [alferca@cg02 srmtest]$ lcg-cr -v --vo dteam -d ccc01.ific.uv.es file:///tmp/kk.log
     Using grid catalog type: lfc
     Using grid catalog : lfc02.pic.es
     Using LFN : /grid/dteam/generated/2007-07-05/file-121bde62-fc39-4df1-a9e4-e215c403e72c
     Using SURL : srm://ccc01.ific.uv.es/lustre/ific.uv.es/dteam/generated/2007-07-05/filefc39f658-05d0-4c48-9b5f-6eac8ad24954
     CGSI-gSOAP: Could not open connection !
    lcg_cr: Connection refused
       
    
  • The error seems to be related to the fact that the endpoint for the SE ccc01 is not found in the PPS BDII. Check the information published for the StoRM server in the top BDII (one way to query it is sketched below).
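
One way to check what is published is to query the BDII directly with ldapsearch (the BDII host below is a placeholder for the PPS/top BDII in use):

# Entries published for the SE itself
ldapsearch -x -LLL -H ldap://<bdii-host>:2170 -b "o=grid" '(GlueSEUniqueID=ccc01.ific.uv.es)'
# Service endpoints mentioning the SE host
ldapsearch -x -LLL -H ldap://<bdii-host>:2170 -b "o=grid" '(GlueServiceEndpoint=*ccc01.ific.uv.es*)' GlueServiceEndpoint GlueServiceType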
  4. lcg_utils "The path specified in the SURL does not have a local equivalent":
  • The following error is raised because of wrong handling of the srmv2.2 protocol by the GFAL high-level tools (lcg-cr in this case). The tool does not create the proper directories on the remote SE node:

    [alferca@cg02 srmtest]$ lcg-cr -v --vo dteam -d ccc01.ific.uv.es file:///tmp/kk.log
     Using grid catalog type: lfc
     Using grid catalog : lfc02.pic.es
     Using LFN : /grid/dteam/generated/2007-07-05/file-b1ecae0d-4958-4aa5-bf84-763d3e20fddc
     Using SURL : srm://ccc01.ific.uv.es/dteam/generated/2007-07-05/file5d81d810-4df8-4975-b8f8-c3b9fed42835
     The path specified in the SURL does not have a local equivalent!
    lcg_cr: No such file or directory
       
    
  • Versions of GFAL and the lcg_util tools older than lcg_util-1.5.1-1 do not support the srmv2.2 protocol.
  5. grid-ftp error when doing 3rd-party transfers (as lcg-rep does): "the server sent an error response: 500 500 Illegal PORT Command"

    [alferca@cg02 jdl]$ lcg-rep -v --vo atlas -d grid007g.cnaf.infn.it lfn:/grid/atlas/ific.uv.es/test.12208
     Using grid catalog type: lfc
     Using grid catalog : lfc02.pic.es
     srm://ccc01.ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88: srmPrepareToGet successfully handled!
    Source URL: lfn:/grid/atlas/ific.uv.es/test.12208
     File size: 9
     VO name: atlas
     Destination specified: grid007g.cnaf.infn.it
     Source URL for copy: gsiftp://ccc01.ific.uv.es:2811/lustre/ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88
     Destination URL for copy: gsiftp://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-24/file4aa0dccf-f746-4628-bdd9-475c2043a896
    # streams: 1
    # set timeout to 0
                0 bytes      0.00 KB/sec avg      0.00 KB/sec instthe server sent an error response: 500 500 Illegal PORT Command
       
    
  • This is a configuration problem of the gridftp server on the StoRM node, which by default is not allowed to do passive transfers. Check the gridftp configuration (a direct reproduction with globus-url-copy is sketched below).
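
The problem can be reproduced outside lcg-rep by attempting the same third-party transfer directly between the two gsiftp TURLs shown in the output above; once the gridftp configuration on the StoRM node allows passive data connections, the copy should go through:

# Third-party gsiftp transfer equivalent to what lcg-rep attempts (TURLs from the example above)
globus-url-copy -vb \
  gsiftp://ccc01.ific.uv.es:2811/lustre/ific.uv.es/atlas/generated/2007-07-24/file175f15c4-da6a-470c-a8cd-597d2cbd4c88 \
  gsiftp://grid007g.cnaf.infn.it/flatfiles/SE00/atlas/generated/2007-07-24/file4aa0dccf-f746-4628-bdd9-475c2043a896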
  6. /etc/rc.d/init.d/storm-backend status does not work
  • The status action is not implemented in the current version (a workaround is sketched after the log):

[root@ccc01 root]#  /etc/rc.d/init.d/storm-backend status
 storm-backend: Usage: /etc/rc.d/init.d/storm-backend {start|stop|restart|force-reload|status|suspend|resume}
StoRM SRMv2 backend server                                 [FAILED]
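
Since the status action is not implemented, a simple workaround is to look for the running process directly (a minimal sketch; the backend runs as a Java process, so the pattern below is an assumption about its command line):

# Check by hand whether the backend process is up
pgrep -fl storm-backend || echo "storm-backend NOT running"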
   

Workarounds and scripts

-- AlvaroFernandez - 1 Mar 2010 -- JavierSanchez - 04 Jun 2007

  • space-tokens-dynamicinfo_v2.tgz: space tokens dynamic info, version 2. Script to check dynamic information about the space tokens on Lustre, publish it through the information provider and update the StoRM MySQL database.

  • test-srm.sh: battery of simple srm tests to check storm
Topic attachments
 Attachment                        Size    Date                 Who              Comment
 namespace.xml                     9.7 K   23 Jul 2007 - 17:32  AlvaroFernandez  old Namespace config file for storm
 static-file-SE.ldif               6.6 K   24 Sep 2007 - 09:50  AlvaroFernandez  old gip ldif file of Storm in preproduction
 space-tokens-dynamicinfo.tgz     10.0 K   25 Nov 2008 - 15:01  AlvaroFernandez  space tokens dynamic info
 space-tokens-dynamicinfo_v2.tgz  10.0 K   02 Mar 2010 - 11:36  AlvaroFernandez  space tokens dynamic info version 2; checks dynamic info related to spaces at Lustre, publishes it in the info provider and updates the StoRM MySQL database
 test-srm.sh                       4.9 K   02 Mar 2010 - 11:40  AlvaroFernandez  battery of simple srm tests to check storm