Running remote commands on a Webfusion shared service

The general advice here is: don’t, especially if you are not familiar with the bash command shell and Linux environments in general. However, there are still cases where you have the necessary expertise and need to run a command on the shared service environment. For example, I develop all my scripts and draft my blogs on my local test system, and only synchronise changes with the live ellisons.org.uk site when I am happy with them. I have a script that creates a tarball of the files added or changed since the last synch and then FTP copies this to my _private sub-directory on the Webfusion shared server. I then need to unpack this at the server end, and that requires running a tar command on the server itself.

I run Ubuntu on my laptop, which allows me to use the ssh command-line utility to access the dedicated and virtual servers that I help administer. I have aliases set up for each so, for example, I can check the disk space on the user.services.openoffice.org server by typing oooprod df -h at the command prompt. This works well as long as (i) ssh-client is installed locally, (ii) ssh-server is installed remotely, and (iii) I have an ssh-enabled account on the remote machine. Unfortunately, Webfusion no longer provide ssh access to their shared servers, so is there a simple mechanism for implementing an equivalent? The answer is yes, and the last section of this article provides both the client and server sides of the code.
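For anyone unfamiliar with the technique, such an alias is just a stored ssh invocation (the username here is illustrative); any arguments typed after the alias are appended and executed on the remote host:

alias oooprod='ssh myuser@user.services.openoffice.org'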

There are various protocols for open client-server transport, such as SOAP and XML-RPC, but the APIs for providing services based on these aren’t fully implemented on the Webfusion configuration. However, this application is a closed solution, and by using PHP at both ends (CLI-based for the client and suPHP-based for the server) I can adopt a much simpler approach. Whilst there are some web-based “command interfaces” available, sticking to a pure command-line approach keeps the code simple: why implement aliases, command history and so on when my local scripting environment (bash in my case) already does all of this for me? Even so, there are some issues that I still need to address:

  • Transport. I use a simple HTTP POST for the input arguments, returning an application/gzip response stream for the output. This keeps parsing to a minimum: the server encodes an array containing the stdout, stderr and return status of the command with a single echo gzcompress( serialize( $out ) ).
  • Security. I need a strong method of ensuring that the server can’t be attacked. This is made a lot easier by the fact that I control both ends, so the scheme can be based on a shared secret: (i) I use the shared HTTPS service (in my case https://fusion.webfusion-secure.co.uk/~ellisons.org.uk/) for transport, to minimise potential eavesdropping, and (ii) I pass the MD5 of the command prefixed by the shared secret, and the server only honours the request if it can reproduce the same MD5 (see the sketch after this list).
  • Process Execution. I use a fairly standard template based on proc_open and its related functions, because I want to handle STDERR and the return status separately from STDOUT at the client end.
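To make the shared-secret handshake and payload format concrete, here is a minimal sketch of the two halves in isolation. The SALT and command values are placeholders, and the complete scripts follow in the implementation section below.

<?php
define( 'SALT', 'some private expression' );   # known to both client and server

# Client side: sign the command with the shared secret before POSTing it
$command = 'ls -l';
$check   = md5( SALT . $command );             # sent as the 'check' POST field

# Server side: recompute the digest and only run the command on an exact match
if( $check === md5( SALT . $command ) ) {
   # ... execute $command via proc_open(), collecting stdout, stderr and status ...
   $out = array( 1 => '...stdout...', 2 => '...stderr...', 'status' => 0 );
   echo gzcompress( serialize( $out ) );       # the client reverses this with unserialize( gzuncompress( ... ) )
}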

The client side is a simple CLI script. Installing php-cli is trivial under Linux. (An equivalent is possible on Windows, though the installation is slightly more convoluted.) All I need to complete this is an alias in my .bashrc (shown after the configuration notes below), and now typing webfusion ls -l b does exactly what I want it to do. There are some limits to what you can execute on the server: specifically, the command sub-process is unprivileged and runs under your UID. The webfusion (professional) configuration also defines:

RLimitCPU   15
RLimitNPROC 7
RLimitMem   250000000

which set the CPU time (in seconds), process count and memory (in bytes) limits respectively. These are reasonable and adequate for the sort of commands that you should need to execute.
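As for the alias itself, it is a one-liner in my .bashrc; the script name and location (~/bin/webfusion.php) are just illustrative, so use wherever you save the client script below:

alias webfusion='php ~/bin/webfusion.php'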

The implementation

The server end is as follows:

<?php
define( 'SALT', 'some private expression' ); # replace this with your own version, as I have

# Only honour requests signed with the shared secret. Note the strict (===) comparison:
# a loose == can be fooled by PHP's numeric-string coercion of hex digests.
if( isset( $_POST['check'] ) && isset( $_POST['cmd'] ) &&
   $_POST['check'] === md5( SALT . $_POST['cmd'] ) ) {

   $cmd = $_POST['cmd'];
   $io  = array();
   # Collect the command's STDOUT (1) and STDERR (2) on separate pipes
   $descriptor = array( 1 => array('pipe', 'w'), 2 => array('pipe', 'w') );

   $p = proc_open( $cmd, $descriptor, $io );

   $out = array(
      1 => stream_get_contents( $io[1] ),
      2 => stream_get_contents( $io[2] ),
      );

   fclose( $io[1] ); fclose( $io[2] );
   $out['status'] = proc_close( $p );           # the command's exit status

   header( 'Content-Type: application/gzip' );  # The O/P is a gzipped serialised response
   echo gzcompress( serialize( $out ) );
}
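I save this in the web root as cmdServer.php, the name assumed by the client’s REMOTE_SERVICE constant below. Note that a request which fails the digest check simply falls through and produces an empty response; the client treats anything that it cannot decode as an error.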

The client end is as follows:

#! /usr/bin/php
<?php
# replace this SALT with your own version and use a different procedure name, as I have
define( 'SALT', 'some private expression' );
define( 'REMOTE_SERVICE', "https://fusion.webfusion-secure.co.uk/~ellisons.org.uk/cmdServer.php" );

# Reassemble the command line from the CLI arguments and sign it with the shared secret
$command = implode( ' ', array_slice( $argv, 1 ) );
$check   = md5( SALT . $command );

$ch = curl_init();

curl_setopt_array( $ch, array(
   CURLOPT_URL        => REMOTE_SERVICE,
   CURLOPT_POST       => 1,
   CURLOPT_POSTFIELDS =>
      array( 'cmd' => $command, 'check' => $check, ),
   CURLOPT_HEADER     => 0,
   CURLOPT_RETURNTRANSFER => 1,
   ) );
$response = curl_exec( $ch );

if( $response === false ) {                      # transport-level failure
   file_put_contents( 'php://stderr', curl_error( $ch ) . "\n" );
   curl_close( $ch );
   exit( 1 );
}
curl_close( $ch );

# The server replies with a gzipped serialised array: 1 => stdout, 2 => stderr, 'status'
$response = @unserialize( @gzuncompress( $response ) );
if( $response === false ) {                      # e.g. the digest check failed at the server
   file_put_contents( 'php://stderr', "Invalid response from server\n" );
   exit( 1 );
}

file_put_contents( 'php://stdout', $response[1] );
file_put_contents( 'php://stderr', $response[2] );

exit( $response['status'] );
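Putting this together with the synchronisation workflow described at the start of this article, a typical session (with an illustrative tarball name) now looks like this:

webfusion tar -xzvf _private/changes.tgz
webfusion ls -l _private

The remote command’s stdout and stderr appear on my local terminal, and its exit status becomes the local exit status.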
