ABX Action to Run Scripts on Cloud Assembly Deployments (SKKB1054) | Spas Kaloferov's Blog


In this blog post we are going to look at an ABX action that allows us to run any (Shell, PowerShell, or CMD) script on any Linux or Windows machine deployment in any (Amazon Web Services, Azure, vCenter, VMware Cloud on AWS) cloud account in VMware Cloud Assembly.




Action features:

  • Run scripts on machine deployments through VMware Cloud Assembly.
  • Run scripts regardless of the underlying OS: Linux (shell) or Windows (PowerShell/CMD).
  • Run scripts regardless of which cloud account we deploy to: Amazon Web Services (AWS), Google Cloud Platform (GCP) (under development), Microsoft Azure, VMware vCenter, VMware Cloud on AWS (VMC).
  • Authenticate both with Username and Password and with SSH keys.
  • Run a given script for all deployments.
  • Run a different script for each machine in each blueprint. Scripts are stored in the Blueprint.
  • A combination of both of the above: run a given script for all machines and, in addition, a custom script for each machine in each Blueprint.

The action can be found on VMware Sample Exchange or in the following Gitlab repo: bit.ly/The-Gitlab. The action name is casRunScript.
Special thanks goes to Kaloyan Kolev!

Let’s dig into the action to see what it does.


Using the ABX Inputs

Let’s examine the action inputs and what they do:

  • psUserIn (String): Windows Username
  • sshUserIn (String): Linux Username
  • cmdHostABXIn (String): IP Address for test purposes from within ABX
  • actionOptionSshAuthIn (String): Authentication method
    • key: Uses Username and SSH key
    • password: Uses Username and Password
      • psPassIn (String): Windows Password
      • sshPassIn (String): Linux Password
  • cmdDefaultShellTypeIn (String): Default script type. Used if one is not defined in the Blueprint.
    • linux (String): Linux shell script.
    • powershell (String): PowerShell/CMD shell script.
  • cmdActionPreOneDelayIn (String): Delay in seconds before the FIRST PRE ABX Action is started.
  • cmdActionPostOneDelayIn (String): Delay in seconds before the FIRST POST ABX Action is started.
  • cmdActionPreOneScriptIn (String): First Pre script. Will be run before any Blueprint scripts.
  • cmdActionPostOneScriptIn (String): First Post script. Will be run after any Blueprint scripts.
  • actionOptionAllowBlueprintScriptsIn (Boolean): Whether to allow Blueprint scripts to execute.
    • true: Allows the Blueprint script to execute. The script will run after cmdActionPreOneScriptIn and before the cmdActionPostOneScriptIn script.
    • false: The Blueprint script will not be run.
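Putting these inputs together, the order in which scripts run can be sketched as below. This is a minimal illustration, not the action's actual code: the real casRunScript action executes each script remotely (SSH or PowerShell) after the stated delay, while this sketch only builds the ordered plan. The helper function name is made up; the input names mirror the ABX inputs above.

```python
# Sketch of the execution order: PRE action script, then the
# blueprint script (only if allowed), then the POST action script.
def build_run_plan(inputs, blueprint_script=None, blueprint_delay=0):
    """Return an ordered list of (delay_seconds, script) steps."""
    plan = []
    pre = inputs.get("cmdActionPreOneScriptIn", "")
    if pre:
        plan.append((int(inputs.get("cmdActionPreOneDelayIn", 0)), pre))
    if inputs.get("actionOptionAllowBlueprintScriptsIn") and blueprint_script:
        plan.append((int(blueprint_delay), blueprint_script))
    post = inputs.get("cmdActionPostOneScriptIn", "")
    if post:
        plan.append((int(inputs.get("cmdActionPostOneDelayIn", 0)), post))
    return plan
```

For example, with an empty PRE script, a POST script, and blueprint scripts allowed, the plan contains the blueprint script first and the action's POST script last, matching the ordering described above.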

Now let’s say I want to execute a particular script for all machines provisioned from VMware Cloud Assembly, I want to use Username and SSH Key to log in, and I want that script to start 2 minutes after the action triggers.
I will have to configure the settings as follows:

  • psUserIn : <username>
  • sshUserIn: <username>
  • actionOptionSshAuthIn : key
  • cmdDefaultShellTypeIn: shell
  • cmdActionPreOneDelayIn : 120
  • cmdActionPreOneScriptIn : < action script to execute>
  • actionOptionAllowBlueprintScriptsIn: false

Now let’s say that this is not enough and I want to extend my customization. I want to run a different script for every machine resource in every blueprint that I have. After this script finishes, I want to trigger a script from the action that is the same for all machines. I will have a mixture of Windows and Linux machines, so my shells will be different. I do not want any delay for the Action script, but I want a 2-minute delay for the Blueprint script.
The settings would look like this:

  • psUserIn : <username>
  • sshUserIn: <username>
  • actionOptionSshAuthIn : key
  • cmdDefaultShellTypeIn: shell
  • cmdActionPreOneDelayIn : 1
  • cmdActionPreOneScriptIn : “”
  • cmdActionPostOneDelayIn: 1
  • cmdActionPostOneScriptIn: <action script to execute>
  • actionOptionAllowBlueprintScriptsIn: true

As you can see, we have set actionOptionAllowBlueprintScriptsIn: true to allow the blueprint script to execute.
Now we have to prepare our blueprint with a couple of custom properties.
For each machine in each blueprint for which we want to run a custom script, we have to place the following properties:

  #--------------------------------------------------------#
  #                     Spas Kaloferov                     #
  #                   www.kaloferov.com                    #
  # bit.ly/The-Twitter      Social     bit.ly/The-LinkedIn #
  # bit.ly/The-Gitlab        Git         bit.ly/The-Github #
  # bit.ly/The-BSD         License          bit.ly/The-GNU #
  #--------------------------------------------------------#
  info: |-
    #
    #     VMware Cloud Assembly Blueprint Code Sample
    #
  #-------------------------INPUTS-------------------------#
  #------------------------RESOURCES-----------------------#
  resources:
    db-tier:
      type: Cloud.Machine
      properties:
        #-----------------CUSTOM PROPERTIES----------------#
        # Command to execute
        abxRunScript_script: mkdir bp-dir
        # Time delay in seconds before the script is run
        abxRunScript_delay: 120
        # Type of the script: shell (Linux) or powershell (Windows)
        abxRunScript_shellType: linux
        # Specifies in which Cloud Account the deployment is running: azure, aws, etc.
        # Used for Subscription Condition Filtering
        abxRunScript_endpointType: '${self.endpointType}'
        #--------------------------------------------------#

Let’s see what each of these properties does.

  • abxRunScript_script: Command to execute. E.g.: mkdir bp-dir
  • abxRunScript_delay: Time delay in seconds before the script is run. E.g. (1m): 60
  • abxRunScript_shellType: Type of the script. E.g.: shell (Linux), powershell (Windows)
  • abxRunScript_endpointType: Endpoint type which we use to filter in Subscriptions. ${self.endpointType} will resolve to e.g. aws, azure, etc.
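Inside the action, these properties arrive on the subscription event payload (event.data.customProperties). A minimal sketch of how they might be read, with the action's cmdDefaultShellTypeIn used as a fallback when the blueprint does not set a shell type. The helper function itself is hypothetical; the property names follow the list above.

```python
def read_blueprint_script(custom_properties, default_shell_type):
    """Extract the abxRunScript_* custom properties from the event's
    customProperties dict, falling back to the action's default shell
    type (cmdDefaultShellTypeIn) when the blueprint does not set one."""
    return {
        "script": custom_properties.get("abxRunScript_script", ""),
        "delay": int(custom_properties.get("abxRunScript_delay", 0)),
        "shell_type": custom_properties.get("abxRunScript_shellType",
                                            default_shell_type),
        "endpoint_type": custom_properties.get("abxRunScript_endpointType", ""),
    }
```

Note that custom property values arrive as strings, which is why the delay is cast to an integer before use.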

In this case I wanted a 2m delay for my blueprint script, so I’ve set abxRunScript_delay: 120. The script that will be executed is set via abxRunScript_script: mkdir bp-dir. In this case this is a Linux machine, so I’ve specified the shell via abxRunScript_shellType: linux.

If we want to run another script before or after this one, we can specify it in the ABX Action via cmdActionPreOneScriptIn and cmdActionPostOneScriptIn.

Now, if we are deploying to multiple cloud accounts (AWS, vCenter, Azure, VMC), we need to use multiple FaaS providers for the function, so that the function can run in the same environment where the machine is being provisioned.
To accomplish this we need to:

  • Clone the ABX action multiple times, one for each Cloud Account that we have.
  • Set a different FaaS provider for each action.
  • Create a subscription for each action.
  • In the Subscription condition filter, create a filter for each cloud account and map it to the corresponding action. E.g., for the Subscription that will call the action configured for Azure, we will add the following condition filter: event.data.customProperties.abxRunScript_endpointType == "azure". For the Subscription that will trigger the ABX action configured to run on AWS FaaS, we will add the filter event.data.customProperties.abxRunScript_endpointType == "aws", etc.
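The per-cloud routing in the last step amounts to a simple equality match on the endpoint type. As a sketch (the cloned action names here are made up for illustration; only the filter expression itself comes from the steps above):

```python
# Hypothetical mapping from endpoint type to the cloned action that
# each subscription triggers; mirrors the condition filters above.
SUBSCRIPTION_FILTERS = {
    "azure": "casRunScript-azure",
    "aws": "casRunScript-aws",
    "vsphere": "casRunScript-vsphere",
}

def route_event(event):
    """Return which cloned action a subscription's condition filter
    would select, i.e. which value of
    event.data.customProperties.abxRunScript_endpointType matched."""
    props = event["data"]["customProperties"]
    return SUBSCRIPTION_FILTERS.get(props.get("abxRunScript_endpointType"))
```

Each real subscription holds just one of these filters; the dictionary above only collects them in one place to show the overall mapping.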


Testing the ABX Action

Let’s run an example test of the action. Here are the inputs we’ve set for the action.

As you can see, I’ve allowed the blueprint script to run. I have also defined two scripts that will run before and after the blueprint script, as explained earlier. One of them has a pre-delay of 30 seconds, the other one 60 seconds. I’ve selected SSH Key as the authentication method and I’ve specified usernames for both my Linux and Windows boxes.
Note that in this case the Windows virtual machine has to have an SSH client. Otherwise we will have to set actionOptionSshAuthIn: password and use username and password for both Linux and Windows machines.
We’ve subscribed the action to the compute.provision.post Event Topic.

Our Blueprint looks like this:

Let’s run this, connect to the VM which gets provisioned, and check the created folders.

As you can see, all folders were created, including those specified in the script in the VM and those specified in both scripts in the action.





Final Step

If all went well, go grab a beer.

