Linux – Executing local function code on a remote server

bash, linux, shell-scripting, ssh

I am updating about 20 bash scripts that perform various server-side operational tasks: checking status, sending reports, and so on. In some environments these scripts run locally; in others they need to run remotely. The scripts should detect whether they need to run remotely or locally and 'do the right thing'.

#!/bin/bash

# Generate the foo report

execute_host=appdev7

if [[ "$execute_host" != "$(hostname)" ]]; then
    # ssh-agent will provide passwordless logon
    ssh "$execute_host" < "$0"
else
    # run report
    echo "Report"
fi

Some of the scripts use shared functions and environment variables. I can pass through environment variables using the ssh SendEnv option, but I can't figure out a nice way to make shared functions available when running remotely.
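As an aside, SendEnv only works when the server's sshd_config whitelists the variable via AcceptEnv. A sketch of an alternative that needs no server-side configuration is to serialise the variables with declare -p and prepend them to the script on ssh's stdin:

```bash
# declare -p re-emits the named variables as valid bash assignments, so
# the remote shell defines them before it reads the rest of the script.
ssh -T "$execute_host" bash -s < <(declare -p report_host; cat "$0")
```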

./shared.sh

# Shared functions and variables

export report_host="appdev7"

function run_report() {
    # run report
    echo 'Report'
}

./example

#!/bin/bash

# Generate the foo report

[[ -f shared.sh ]] && source ./shared.sh

export execute_host="$report_host"

if [[ "$execute_host" != "$(hostname)" ]]; then
    # ssh-agent will provide passwordless logon
    ssh -o SendEnv='report_host' "$execute_host" < "$0"
else
    # This doesn't work when the script is run remotely
    run_report
fi
  • I can place shared.sh on all our hosts, but then I have to keep the files in sync somehow, and that will inevitably go wrong at some point.
  • I can place shared.sh on an NFS share, but if NFS goes down we lose the ability to use our scripts.
  • I could scp shared.sh to the server before running the script. This would work, but might be a little slow, and if shared.sh depends on some other script we would have to copy that file too.
  • I can use declare -f to extract the code of my functions, but I can't figure out a nice way to shuffle them over to the remote server. Function dependencies might also cause problems.
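The declare -f route can in fact be made to work by letting the local shell serialise the function definitions and stream them ahead of the script body; a sketch, assuming shared.sh has already been sourced locally:

```bash
# declare -f re-emits the named function definitions as valid bash, so
# the remote bash reads the functions and then the script as one stream.
ssh -T "$execute_host" bash -s < <(declare -f run_report; cat "$0")
```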

The cleanest solution I have found is to feed the shared library inline using process substitution:

#!/bin/bash

# Generate the foo report

# source the shared code/vars if we're running locally
[[ -f shared.sh ]] && source shared.sh

execute_host="$report_host"

if [[ "$execute_host" != "$(hostname)" ]]; then
    # ssh-agent will provide passwordless logon
    # Note that we source the shared library in-line with the script
    ssh -T "$execute_host" < <(cat shared.sh "$0")
else
    run_report
fi
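If shared.sh itself depends on another file, say a hypothetical shared2.sh, the same trick extends by concatenating the dependencies in the order they would be sourced, provided any source lines inside them are guarded so they do not fail on the remote side:

```bash
# Hypothetical layout: shared.sh uses definitions from shared2.sh.
# Concatenate the dependencies in source order, then the script itself,
# and feed the whole stream to the remote shell.
ssh -T "$execute_host" < <(cat shared2.sh shared.sh "$0")
```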

My questions then:

  1. Are there any problems with this approach?
  2. Can you see a way to avoid referencing the shared library in multiple places?
  3. Any way to resolve dependencies in shared.sh? (eg if shared.sh depends on shared2.sh)
  4. Is there any better way to solve this problem?

Best Answer

I have been using a different approach for this task. First I created a wrapper script called remotely:

#!/bin/bash

# Usage:
# Put this thing as a shebang into your scripts
# i.e.:
# 
#   #!/bin/remotely user@your-server.com
#   
#   echo "hello world from $(hostname)"
#

login=$1
script=$2
shift 2    # everything left in "$@" is the script's own arguments

# Copy the script to the remote host, unpack it there, and run it
tar cz "$script" | ssh "$login" "tar xz && bash --login '$script' $*"

Then I use it in the scripts that should be executed remotely:

#!/bin/remotely user@your-server.com

echo "hello world from $(hostname)"

It can also be tweaked to support multiple hosts using something like gnu-parallel.
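A minimal sketch of the multi-host variant, using a plain background loop instead of gnu-parallel (the host names and script name are placeholders):

```bash
#!/bin/bash

# Fan the same stdin-fed script out to several hosts concurrently
for host in user@host1 user@host2; do
    ssh -T "$host" bash -s < script.sh &
done
wait    # block until every remote run has finished
```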