dpdispatcher.contexts package

Contexts.

Submodules

dpdispatcher.contexts.dp_cloud_server_context module

class dpdispatcher.contexts.dp_cloud_server_context.BohriumContext(*args, **kwargs)[source]

Bases: BaseContext

Methods

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

bind_submission

check_file_exists

check_finish

check_home_file_exits

clean

download

load_from_dict

read_file

read_home_file

upload

upload_job

write_file

write_home_file

write_local_file

alias: Tuple[str, ...] = ('DpCloudServerContext', 'LebesgueContext')
bind_submission(submission)[source]
check_file_exists(fname)[source]
check_home_file_exits(fname)[source]
clean()[source]
download(submission)[source]
classmethod load_from_dict(context_dict)[source]
classmethod machine_subfields() List[Argument][source]

Generate the machine subfields.

Returns:
list[Argument]

machine subfields
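
For reference, a minimal sketch of inspecting this schema from Python. It assumes the dargs package (the provider of Argument) is installed; printing arg.gen_doc() is only one illustrative way to render the subfields.

    # Sketch: list the machine subfields documented by this context.
    # Assumes dargs (the package that provides Argument) is installed.
    from dpdispatcher.contexts.dp_cloud_server_context import BohriumContext

    for arg in BohriumContext.machine_subfields():
        print(arg.name)        # subfield name, e.g. "remote_profile"
        print(arg.gen_doc())   # rendered documentation of its accepted keys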

read_file(fname)[source]
read_home_file(fname)[source]
upload(submission)[source]
upload_job(job, common_files=None)[source]
write_file(fname, write_str)[source]
write_home_file(fname, write_str)[source]
write_local_file(fname, write_str)[source]
dpdispatcher.contexts.dp_cloud_server_context.DpCloudServerContext

alias of BohriumContext

dpdispatcher.contexts.dp_cloud_server_context.LebesgueContext

alias of BohriumContext
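
Because these are plain aliases, the historical context names resolve to the same class, as the following sketch checks:

    # DpCloudServerContext and LebesgueContext are name aliases, not subclasses.
    from dpdispatcher.contexts.dp_cloud_server_context import (
        BohriumContext,
        DpCloudServerContext,
        LebesgueContext,
    )

    assert DpCloudServerContext is BohriumContext
    assert LebesgueContext is BohriumContext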

dpdispatcher.contexts.hdfs_context module

class dpdispatcher.contexts.hdfs_context.HDFSContext(*args, **kwargs)[source]

Bases: BaseContext

Methods

check_file_exists(fname)

Check whether the given file exists; this is often used to check whether the corresponding job has finished.

download(submission[, check_exists, ...])

Download backward files from HDFS root dir.

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

upload(submission[, dereference])

Upload forward files and forward command files to HDFS root dir.

bind_submission

check_finish

clean

get_job_root

load_from_dict

read_file

write_file

bind_submission(submission)[source]
check_file_exists(fname)[source]

Check whether the given file exists; this is often used to check whether the corresponding job has finished.

Parameters:
fname : string

file name to be checked

Returns:
status : boolean

clean()[source]
download(submission, check_exists=False, mark_failure=True, back_error=False)[source]

Download backward files from HDFS root dir.

Parameters:
submission : Submission class instance

represents a collection of tasks, carrying information such as the backward file names

check_exists : bool

whether to check if the file exists

mark_failure : bool

whether to mark the task as failed if the file does not exist

back_error : bool

whether to download error files

Returns:
None

get_job_root()[source]
classmethod load_from_dict(context_dict)[source]
read_file(fname)[source]
upload(submission, dereference=True)[source]

Upload forward files and forward command files to HDFS root dir.

Parameters:
submission : Submission class instance

represents a collection of tasks, carrying information such as the forward file names

dereference : bool

whether to dereference symbolic links

Returns:
None

write_file(fname, write_str)[source]
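
A minimal sketch of selecting this context through the top-level Machine API; the batch_type, the paths, and the empty remote_profile below are placeholders, and the keys actually accepted are defined by machine_arginfo().

    # Illustrative only: batch_type and the paths are assumptions, not defaults.
    from dpdispatcher import Machine

    machine = Machine.load_from_dict({
        "batch_type": "Slurm",                        # any supported batch plugin
        "context_type": "HDFSContext",
        "local_root": "./work",                       # local staging directory
        "remote_root": "/user/someone/dpdispatcher",  # hypothetical HDFS directory
        "remote_profile": {},
    })
    # Once bound to a Submission, upload() pushes forward files to the HDFS
    # root dir and download() pulls backward files back, as documented above.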

dpdispatcher.contexts.lazy_local_context module

class dpdispatcher.contexts.lazy_local_context.LazyLocalContext(*args, **kwargs)[source]

Bases: BaseContext

Run jobs in the local server and local directory.

Parameters:
local_root : str

The local directory to store the jobs.

remote_root : str, optional

This argument takes no effect.

remote_profile : dict, optional

The remote profile. The default is {}.

*args

The arguments.

**kwargs

The keyword arguments.
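
A minimal sketch of a machine dict that selects this context; the batch_type "Shell" is only an illustrative choice, and remote_root is kept as a placeholder because, as noted above, it takes no effect.

    # Illustrative only: batch_type "Shell" is an assumption, not a requirement.
    from dpdispatcher import Machine

    machine = Machine.load_from_dict({
        "batch_type": "Shell",
        "context_type": "LazyLocalContext",
        "local_root": "./",      # jobs run directly inside this directory
        "remote_root": "./",     # takes no effect for LazyLocalContext
    })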

Methods

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

bind_submission

block_call

block_checkcall

call

check_file_exists

check_finish

clean

download

get_job_root

get_return

load_from_dict

read_file

upload

write_file

bind_submission(submission)[source]
block_call(cmd)[source]
block_checkcall(cmd)[source]
call(cmd)[source]
check_file_exists(fname)[source]
check_finish(proc)[source]
clean()[source]
download(jobs, check_exists=False, mark_failure=True, back_error=False)[source]
get_job_root()[source]
get_return(proc)[source]
classmethod load_from_dict(context_dict)[source]
read_file(fname)[source]
upload(jobs, dereference=True)[source]
write_file(fname, write_str)[source]
class dpdispatcher.contexts.lazy_local_context.SPRetObj(ret)[source]

Bases: object

Methods

read

readlines

read()[source]
readlines()[source]

dpdispatcher.contexts.local_context module

class dpdispatcher.contexts.local_context.LocalContext(*args, **kwargs)[source]

Bases: BaseContext

Run jobs in the local server and remote directory.

Parameters:
local_root : str

The local directory to store the jobs.

remote_root : str

The remote directory to store the jobs.

remote_profile : dict, optional

The remote profile. The default is {}.

*args

The arguments.

**kwargs

The keyword arguments.
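
As above, a minimal sketch of a machine dict that selects this context; here both directories are meaningful, since jobs are staged from local_root into remote_root. The batch_type and paths are placeholders.

    # Illustrative only: batch_type "Shell" and both paths are placeholders.
    from dpdispatcher import Machine

    machine = Machine.load_from_dict({
        "batch_type": "Shell",
        "context_type": "LocalContext",
        "local_root": "./work",          # where task inputs and outputs live locally
        "remote_root": "/tmp/dp_jobs",   # where the jobs are actually executed
    })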

Methods

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

bind_submission

block_call

block_checkcall

call

check_file_exists

check_finish

clean

download

get_job_root

get_return

load_from_dict

read_file

upload

write_file

bind_submission(submission)[source]
block_call(cmd)[source]
block_checkcall(cmd)[source]
call(cmd)[source]
check_file_exists(fname)[source]
check_finish(proc)[source]
clean()[source]
download(submission, check_exists=False, mark_failure=True, back_error=False)[source]
get_job_root()[source]
get_return(proc)[source]
classmethod load_from_dict(context_dict)[source]
read_file(fname)[source]
upload(submission)[source]
write_file(fname, write_str)[source]
class dpdispatcher.contexts.local_context.SPRetObj(ret)[source]

Bases: object

Methods

read

readlines

read()[source]
readlines()[source]

dpdispatcher.contexts.openapi_context module

class dpdispatcher.contexts.openapi_context.OpenAPIContext(*args, **kwargs)[source]

Bases: BaseContext

Methods

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

bind_submission

check_file_exists

check_finish

check_home_file_exits

clean

download

load_from_dict

read_file

read_home_file

upload

upload_job

write_file

write_home_file

write_local_file

bind_submission(submission)[source]
check_file_exists(fname)[source]
check_home_file_exits(fname)[source]
clean()[source]
download(submission)[source]
classmethod load_from_dict(context_dict)[source]
read_file(fname)[source]
read_home_file(fname)[source]
upload(submission)[source]
upload_job(job, common_files=None)[source]
write_file(fname, write_str)[source]
write_home_file(fname, write_str)[source]
write_local_file(fname, write_str)[source]

dpdispatcher.contexts.ssh_context module

class dpdispatcher.contexts.ssh_context.SSHContext(*args, **kwargs)[source]

Bases: BaseContext

Attributes:
sftp
ssh

Methods

block_checkcall(cmd[, asynchronously, ...])

Run command with arguments.

machine_arginfo()

Generate the machine arginfo.

machine_subfields()

Generate the machine subfields.

bind_submission

block_call

call

check_file_exists

check_finish

clean

close

download

get_job_root

get_return

list_remote_dir

load_from_dict

read_file

upload

write_file

bind_submission(submission)[source]
block_call(cmd)[source]
block_checkcall(cmd, asynchronously=False, stderr_whitelist=None)[source]

Run command with arguments. Wait for command to complete. If the return code was zero then return, otherwise raise RuntimeError.

Parameters:
cmd : str

The command to run.

asynchronously : bool, optional, default=False

Run the command asynchronously. If True, nohup will be used to run the command.

stderr_whitelist : list of str, optional, default=None

If not None, stderr will be checked against the whitelist. If stderr contains any of the strings in the whitelist, the command will be considered successful.
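
An illustrative call, assuming context is an SSHContext that has already been created via load_from_dict and bound to a submission with bind_submission(); the command and whitelist entry are placeholders.

    # Illustrative only: `context` is assumed to be a configured SSHContext.
    # A non-zero exit status raises RuntimeError, unless stderr contains one of
    # the whitelisted strings, in which case the command is treated as successful.
    context.block_checkcall(
        "tar xzf inputs.tar.gz",
        asynchronously=False,
        stderr_whitelist=["Removing leading"],  # hypothetical benign warning
    )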

call(cmd)[source]
check_file_exists(fname)[source]
check_finish(cmd_pipes)[source]
clean()[source]
close()[source]
download(submission, check_exists=False, mark_failure=True, back_error=False)[source]
get_job_root()[source]
get_return(cmd_pipes)[source]
list_remote_dir(sftp, remote_dir, ref_remote_root, result_list)[source]
classmethod load_from_dict(context_dict)[source]
classmethod machine_subfields() List[Argument][source]

Generate the machine subfields.

Returns:
list[Argument]

machine subfields

read_file(fname)[source]
property sftp
property ssh
upload(submission, dereference=True)[source]
write_file(fname, write_str)[source]
class dpdispatcher.contexts.ssh_context.SSHSession(hostname, username, password=None, port=22, key_filename=None, passphrase=None, timeout=10, totp_secret=None, tar_compress=True, look_for_keys=True)[source]

Bases: object

Attributes:
remote
rsync_available
sftp

Returns sftp.

Methods

inter_handler(title, instructions, prompt_list)

inter_handler: the callback for paramiko.transport.auth_interactive.

arginfo

close

ensure_alive

exec_command

get

get_ssh_client

put

static arginfo()[source]
close()[source]
ensure_alive(max_check=10, sleep_time=10)[source]
exec_command(**kwargs)
get(from_f, to_f)[source]
get_ssh_client()[source]
inter_handler(title, instructions, prompt_list)[source]

inter_handler: the callback for paramiko.transport.auth_interactive.

The prototype for this function is defined by Paramiko, so all of the arguments need to be there, even though we don’t use ‘title’ or ‘instructions’.

The function is expected to return a tuple of data containing the responses to the provided prompts. Experimental results suggest that there will be one call of this function per prompt, but the mechanism allows for multiple prompts to be sent at once, so it is best to assume that this can happen.

Since tuples can’t really be built on the fly, the responses are collected in a list which is then converted to a tuple when it’s time to return a value.

Experiments suggest that the username prompt never happens. This makes sense, but a username prompt is included here just in case.

put(from_f, to_f)[source]
property remote: str
property rsync_available: bool
property sftp

Returns sftp. Opens a new one if none exists.
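
A minimal sketch that uses only the constructor arguments and methods documented above; the host name, user name, key path, and file paths are placeholders.

    # Illustrative only: hostname, username, key_filename, and paths are placeholders.
    from dpdispatcher.contexts.ssh_context import SSHSession

    session = SSHSession(
        hostname="cluster.example.com",
        username="alice",
        key_filename="/home/alice/.ssh/id_rsa",  # or use password=, passphrase=, totp_secret=
        port=22,
        timeout=10,
        tar_compress=True,                       # whether transferred tarballs are compressed
    )
    session.put("input.json", "/home/alice/work/input.json")  # upload one file
    session.get("/home/alice/work/log.out", "log.out")        # download one file
    session.close()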