Understanding the Sandbox
The Sandbox filesystems
In the context of the application life-cycle, the Sandbox has three filesystems (or directories):
- /home/<user> that we refer to as HOME
- /application that we refer to as APPLICATION
- /share that we refer to as SHARE
HOME directory
A user's home directory is intended to contain that user's files: text documents, music, pictures, videos, etc. It may also contain configuration files holding the user's preferred settings for the software used there: web browser bookmarks, favorite desktop wallpaper and themes, passwords to external services accessed via a given application, etc. The user can install executable software in this directory, but it will only be available to users with permission to access this directory. The home directory can be further organized with sub-directories.
As such, the HOME is used to store the user's files. It can be used to store source files (the compiled programs would then go in APPLICATION).
At job or workflow execution time, the Sandbox uses a system user to execute the application. This system user cannot read files in HOME.
When the application is run on the Sandbox Runtime Environment, the HOME directory is not available on any of the computing nodes.
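For example, one might compile a tool under HOME and install the result under APPLICATION, so that it remains visible to the system user at execution time (a minimal sketch; the tool and job folder names are hypothetical):

    cd $HOME/src/mytool                  # source files stay in HOME
    make                                 # build the tool
    cp mytool /application/my_job/bin/   # installed binaries belong in APPLICATION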
APPLICATION filesystem
The APPLICATION filesystem contains all the files required to run the application.
The APPLICATION filesystem is available on the Sandbox as /application.
Whenever an application wrapper script needs the APPLICATION path (/application), it should use the variable $_CIOP_APPLICATION_PATH. Example:

    export BEAM_HOME=$_CIOP_APPLICATION_PATH/common/beam-4.11

The APPLICATION contains:
- the Application Descriptor File, named application.xml and described here: Application descriptor
- a folder for each job template
- the streaming executable, a script that deals with the stdin managed by the Sandbox (e.g. EO data URLs to be passed to ciop-copy). There isn't a defined naming convention although it is often called run.
Tip: the streaming executable reads its inputs via stdin, managed by the underlying Hadoop MapReduce streaming layer (see the sketch after this list)
- a set of folders such as:
- /application/<job template name>/bin standing for "binaries", containing the job executables and utilities invoked by the job wrapper script
- /application/<job template name>/etc containing job-wide configuration files
- /application/<job template name>/lib containing the job libraries
- ...
There aren't any particular rules for the folders in the job template folder.
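As an illustration, the run script of a job template might look like the sketch below. It is only a sketch: the BEAM path comes from the example above, the processing step is a placeholder, and it assumes ciop-copy accepts an -o option to set the target directory and prints the resulting local path.

    #!/bin/bash
    # Hypothetical streaming executable for a job template
    export BEAM_HOME=$_CIOP_APPLICATION_PATH/common/beam-4.11   # job environment

    # Read one input reference per line from stdin (managed by the
    # Hadoop MapReduce streaming layer) and stage it locally.
    while read inputURL
    do
        localFile=$(ciop-copy -o /tmp "$inputURL")   # stage the EO product
        # ... process $localFile here ...
    done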
The APPLICATION of a workflow with two jobs can then be represented as:

    /application/
        application.xml
        /job_template_1
            run
            /bin
            /etc
        /job_template_2
            run
            /bin
            /lib
SHARE filesystem
The SHARE filesystem is the Sandbox distributed filesystem mount point. It is an HDFS filesystem used to store the application's job outputs generated by the execution of ciop-simjob and/or ciop-simwf.
The SHARE filesystem is available on the Sandbox as /share, and the HDFS distributed filesystem access point is /tmp; thus, on the Sandbox, /share/tmp is the root of the distributed filesystem.
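The two paths therefore give two views of the same content (a minimal sketch; it assumes the standard Hadoop command-line client is available on the Sandbox):

    ls /share/tmp/sandbox          # via the local mount point
    hadoop fs -ls /tmp/sandbox     # the same content via the HDFS access point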
SHARE for ciop-simjob
When the ciop-simjob is invoked to run a node of the workflow, the outputs are found in:
/share/tmp/sandbox/<workflow name>/<node name>
A job can be executed several times, but the results of the previous execution are deleted at each new run.
Tip: the workflow and node names are found in the Application Descriptor File, named application.xml and described here: Application descriptor
Tip: ciop-simjob -n will list the workflow node name(s), check the ciop-simjob reference page here: ciop-simjob
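For example (a sketch assuming the node name is passed as the ciop-simjob argument; the workflow and node names below are hypothetical, substitute those declared in application.xml):

    ciop-simjob -n                               # list the workflow node names
    ciop-simjob my_node                          # run the node named my_node
    ls /share/tmp/sandbox/my_workflow/my_node    # inspect the node outputs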
SHARE for ciop-simwf
When the ciop-simwf is invoked to run the complete application workflow, the outputs are found in a dedicated folder under SHARE:
/share/tmp/sandbox/run/<run identifier>/<node name>/data
Unlike ciop-simjob, ciop-simwf keeps the outputs of all workflow execution runs. This allows, for example, comparing the results obtained with different sets of parameters.
Tip: check the Application descriptor page to see how to define default parameter values and how to override them in the workflow
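For example (the node name is hypothetical; the run identifier is assigned at execution time):

    ciop-simwf                                                  # execute the complete workflow
    ls /share/tmp/sandbox/run/                                  # one folder per execution run
    ls /share/tmp/sandbox/run/<run identifier>/my_node/data     # outputs of one node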