Ciop-copy » History » Revision 3
Herve Caumont, 2013-06-18 16:26
Ciop-copy¶
Command Line (CLI) reference
Name¶
ciop-copy copies data from a logical or physical location to a local directory of the sandbox
Synopsis¶
ciop-copy [-h] [-a] [-q] [-f] [-b <url-base>] [-d <driver-def>] [-o|O <sink-dir>] [-w <work-dir>] [-c] [-p <prefix>] [-z|-Z] [-r <num-retries>] [-t <timeout>] [-R] [-l <log-label>] [-D] <url1> [<url2> ... <urlN>]
URL Parameters:
Arguments are URL strings provided by seurl.
If a parameter is specified as '-', URLs are read from standard input and inserted in its place.
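The '-' parameter can be sketched as follows. Since `ciop-copy` itself is only available inside a sandbox, it is stubbed here with a shell function that echoes one fake local path per URL read from standard input, mimicking the documented behaviour; all paths and URLs are placeholders:

```shell
#!/bin/sh
# Stub standing in for the real ciop-copy: when '-' is among the
# arguments, read URLs from stdin and echo one local sink path per URL.
ciop-copy() {
  for a in "$@"; do
    [ "$a" = "-" ] || continue
    while read -r url; do
      echo "/tmp/sandbox/$(basename "$url")"
    done
  done
}

# Feed a URL list on standard input via the '-' parameter.
printf '%s\n' \
  "https://example.com/data/product1.tgz" \
  "hdfs:///ciop/run/inputs/product2" |
  ciop-copy -o /tmp/sandbox -
```

This pattern is useful when the URL list is produced by a previous processing step rather than known in advance.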
Description¶
The following options are available:
-h displays this help page
-a abort on first error without attempting to process further URLs
-q quiet mode, local filenames are not echoed to stdout after transfer
-f force transfer of a physical copy of the file for NFS and HDFS URLs.
-d <driver-file> loads additional drivers from the shell file <driver-file>. Drivers shall contain a function named <protocol>Driver
-o|O <out-dir> defines the output directory for transfers (default is /root). With -O, sink files or directories already existing in the output directory are overwritten.
-co <out-dir> same as -c -o <out-dir>. Kept for backward compatibility.
-c creates the output directory if it does not exist
-p <prefix> prepend the given prefix to all output names
-z provide output as a compressed package (.gz for files or .tgz for folders). NOTE that it will not compress already compressed files (.gz, .tgz or .zip)
-U|--no-uzip disable automatic decompression of .gz, .tgz and .zip files.
-r <num-retries> defines the maximum number of retries (default is 5)
-rt <seconds> defines the time (in seconds) between retries (default is 60)
-t <timeout> defines the watchdog timeout (in seconds), applicable to the gridftp, scp, ftp, http and https schemes (default is 600 seconds)
-R do not retry transfer after timeout
-D set debug mode for command output parsing debugging
-H do not follow HTML and RDF tags nor .uar archives.
-s skip download if sink path already exists
-x <pattern> exclude the files matching the pattern
-w do not overwrite single files if they already exist
Output¶
unless the quiet option (-q) is used, the local path of each file (or directory) downloaded is echoed after each URL transfer, one per line. Unless the -U option is used, output files with a .gz or .tgz extension are decompressed.
unless the -H option is specified, the software follows the RDF <dclite4g:onlineResource> and the HTML href and refresh tags.
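Because each local path is echoed one per line, a processing step can iterate over the transferred files directly. A minimal sketch of that pattern, with `ciop-copy` stubbed so the loop runs outside a sandbox (the stub echoes a fake sink path per URL argument; the /tmp/outputs directory and URLs are placeholders):

```shell
#!/bin/sh
# Stub for ciop-copy: echo one fake sink path per URL argument
# (arguments without '://' are treated as options here and skipped).
ciop-copy() {
  for a in "$@"; do
    case $a in
      *://*) echo "/tmp/outputs/$(basename "$a")" ;;
    esac
  done
}

# Typical pattern: capture the echoed local paths and process each one.
ciop-copy -c -o /tmp/outputs \
    "https://example.com/a.tif" "hdfs:///ciop/run/b.tif" |
while read -r local_file; do
  echo "processing $local_file"
done
```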
Exit status¶
0 all URLs were successfully downloaded
1 an error occurred during processing
255 environment is invalid (e.g. invalid working directory) or invalid options are provided
254 output directory does not exist or failed creating it (with -c option)
if the -a option is used, the exit code is set to the error code of the last URL transfer:
252 no driver available for URL
251 an existing file or directory conflicts with the sink for the URL in the output directory
250 an error occurred while unpacking the output file or when packaging/compressing it (when the -z or -Z option is used)
128 a timeout occurred while fetching a URL
127 a fatal error occurred; the source of the error is unknown or not handled by the driver
<128 error codes specific to the transfer scheme
1 resource pointed by input URL does not exist
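In a script, these exit codes can drive retry or reporting logic. The pattern can be sketched as follows; `ciop-copy` is stubbed to return 128 (the timeout code above) so the handling is demonstrable, and the `fetch` wrapper and its messages are illustrative, not part of the tool:

```shell
#!/bin/sh
# Stub: pretend every transfer times out (exit code 128 per the table above).
ciop-copy() { return 128; }

# Illustrative wrapper mapping ciop-copy exit codes to messages.
fetch() {
  ciop-copy -a -o "$1" "$2"
  rc=$?
  case $rc in
    0)   echo "ok: $2" ;;
    128) echo "timeout while fetching $2" ;;
    252) echo "no driver available for $2" ;;
    *)   echo "transfer of $2 failed (code $rc)" ;;
  esac
}

fetch /tmp/out "hdfs:///ciop/run/product"
```

Storing `$?` in `rc` immediately after the call matters: `$?` is reset by subsequent commands, including the `case` branches themselves.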