IaC for vRealize: Documenting Code with JSDoc and Syntax Checking and Code Style Management with ESLint

Welcome to the fourth post in the series, where I will take a look at how we can add comments to our code using JSDoc and perform syntax checking and style management with ESLint. I felt this was a good point in the series to cover these topics, as they establish fundamental development practices and make your vRO development life easier for both you and your peers.

Once you have implemented these techniques, you will begin a journey of well-documented, standardised code and take a step through the doorway of Test-Driven Development (TDD).

Prerequisites

Several packages that will be used in this post need to be installed. My previous article on Using Visual Studio Code for your vRealize Orchestrator Development covers the installation of NodeJS (needed for NPM), ESLint and some other plugins.

Please have a look at this post and ensure you have everything installed. Don’t pay too much attention to the JSDoc examples in that post as they are very specific to documenting Actions (with no function declaration). I’ll be demonstrating a different approach here with Maven.

Also, you will need to install the following packages using NPM (I’m installing globally for this tutorial, but production environments should make use of local dependencies, a topic too long for this post that I will cover in a later one):

npm install -g jsdoc eslint-plugin-jsdoc eslint

I will detail how and where these are used throughout this post.

You can also download my ‘.eslintrc.json‘ configuration here:

Documenting Code with JSDoc

Developers are often expected to add comments to their code, as this helps both you and others figure out what the code is trying to achieve. However, this practice has evolved, with tools that are much more integrated and can automate the creation of documentation.

Many programming languages offer some form of documentation tooling: Java has Javadoc, Python has its native docstrings, and in JavaScript, we have JSDoc.

JSDoc makes use of annotations and tags that are added alongside the code and are used to describe constructs such as functions, classes, namespaces, and variables. Some tags allow metadata such as author information, versioning and licensing to be added to the documentation.

JSDoc can also be used to describe variable types, and with some advanced settings in VS Code and a few extra lines of code, it can enforce static type checking (static typing being one of the major reasons developers turn to TypeScript for their JavaScript development). However, my attempts to get this to work have been unsuccessful and I leave it open to anyone willing to give it a try. The issue was dealing with vRO object types, such as ‘REST:RESTHost’, where VS Code/TypeScript interprets the colon as an error.

Adding JSDoc Annotations To Your Code

Adding JSDoc annotations to your code is a hard requirement, due to the way the vRealize Build Tools have been implemented. The ‘vRealize:push’ goal has been designed to depend on the JSDoc description and tags (specifically @param and @returns) when creating the Action in vRO. If these properties are missing or specify the wrong data types, then those mistakes will be carried through to vRO.

The best way to explain JSDoc and how to use it is to provide an example of it being used to describe one of my vRO functions.

Below is a JSDoc annotation used to define my ‘attachSecurityTagToVcVm’ function.

/**
 * Attaches the specified nsx security tag to a vCenter virtual machine.
 * @author Gavin Stephens <gavin.stephens@simplygeek.co.uk>
 * @function attachSecurityTagToVcVm
 * @param {REST:RESTHost} nsxRestHost The NSX Rest Host.
 * @param {VC:VirtualMachine} vcVm The vCenter Virtual Machine.
 * @param {string} tagName The NSX security tag name.
 * @returns {void}
 */

The JSDoc annotations are always placed inside a comment block (/**   */). Each line within the comment block starts with an asterisk (*). Pay close attention to the indentation of the commented lines; they are indented by a single space. The comment block itself should be placed on the line immediately before the item you are documenting (i.e. the function declaration).
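
To make the placement concrete, here is a rough sketch of how the comment block sits immediately above the anonymous function declaration in an action file (the tags are trimmed, the body is omitted, and your exported action files may differ slightly):

/**
 * Attaches the specified nsx security tag to a vCenter virtual machine.
 * @param {REST:RESTHost} nsxRestHost The NSX Rest Host.
 * @param {VC:VirtualMachine} vcVm The vCenter Virtual Machine.
 * @param {string} tagName The NSX security tag name.
 * @returns {void}
 */
(function (nsxRestHost, vcVm, tagName) {
    // Action code goes here (body omitted for brevity).
});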

The first line should always be the description. It should clearly describe what the function does or is used for. The description is mandatory and is added to the description field for the vRO Action.

@author allows me to add author details such as my name and email address. This tag is optional.

@function allows the function name to be specified. I use this because function declarations for vRO Actions are anonymous; therefore, when generating documentation, JSDoc doesn’t know what the function is called, and this tag helps it along. This tag is optional.

@param describes a parameter that the function expects, which includes the data type, name and description. Note that the data type is specified exactly as it would appear in an Action in vRO, for example ‘VC:VirtualMachine’ and not ‘vcVirtualMachine’. The ‘vRealize:push’ goal will apply these data types exactly as-is to the Action’s inputs.

The parameters can also be specified as optional by enclosing the parameter name in square brackets; in my example, if ‘tagName’ were optional, it would be specified as ‘[tagName]’. This does not affect how parameters are presented to vRO and is used for documentation purposes only. You will have to rely on adding code that handles mandatory and optional parameters, including setting defaults.
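
For example, the optional form of that tag in my annotation would look like this:

 * @param {string} [tagName] The NSX security tag name.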

@returns describes any value that is returned by the function. Normally, it would be fine to omit this tag if the function does not return a value; however, vRO Actions must specify ‘void’ if no value is returned, and it’s also good to be explicit. Note that when the vRealize Build Tools pull down any of your Actions for the first time, they will set an @return tag, but it is recommended to use @returns.

Check out https://devdocs.io/jsdoc/ to learn more about JSDoc and all the available tags.

One thing to note is that any additional tags you use, such as @author, @function or anything else apart from @param and @returns, will be appended to the description field of the vRO Action. Read more “IaC for vRealize: Documenting Code with JSDoc and Syntax Checking and Code Style Management with ESLint”

IaC for vRealize: Define Dependencies, Manage Versions, Prepare & Release Packages & Deploy Artifacts

Welcome to the third part in the series on working with the vRealize Build Tools. At this stage, you should have a fully working CI infrastructure and have all of your vRO code exported using packages and stored in Git repositories. In this post, I will show you how to manage dependencies across your packages and how you can use the Maven plugins to automatically update packages so that they use the latest dependency versions.

I will also show you how to manage versioning for development (Snapshots) and production (release) code. Once we have our versioning in place, I will detail how to prepare the releases and finally push the artefacts to the Artifactory repositories, which can be picked up by a release pipeline (something I will cover in much more detail in a later post).

I also want to point out that I had to update my previous post to include the Git SCM connections in your pom.xml files, as these are required for this post, so make sure to go back and check that out.

Understanding Snapshot vs Release Versions

The first thing to understand is the ‘-SNAPSHOT’ string that is suffixed to package versions. You will have noticed that all of the packages you created in my previous post were at version ‘1.0.0-SNAPSHOT’. This suffix tells Maven that the code inside the package is in a development stage and not suitable for production release. The Maven release plugins understand this and will prevent the dependency handler from including snapshots as valid dependencies for your packages, as this could result in unstable code (though they can be defined manually).

If you are following these posts, then currently, all your packages will be at version 1.0.0-SNAPSHOT. Don’t worry, we will change this later, but for now, accept this as-is.

Define Dependencies

In my previous post, I provided an example of three packages that I had created, along with a table that detailed the ‘groupId’ and ‘artifactId’ for these packages. I am going to extend this table to list their dependencies on other packages. I have kept this simple, in that no dependencies exist outside of these three packages.

A dependency exists when one package calls code from another package (typically via System.getModule()). When dependencies are mapped, their artefacts and required versions are released as part of the release process.
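
To illustrate (the module path and action name below are hypothetical, not taken from my actual packages), a dependency is created as soon as an action in one package calls an action from another:

// Inside an action in the 'rest' package: calling an action that lives in
// the 'logger' package creates a dependency on that package.
var logger = System.getModule("com.simplygeek.library.logger").createLogger("restClient");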

groupId                  artifactId   Dependency
com.simplygeek.library   logger       (none)
com.simplygeek.library   rest         logger
com.simplygeek.library   nsx          logger, rest

I have added the ‘Dependency‘ column, which lists the projects/artefacts that this package depends on. All of my packages depend on ‘logger‘, whereas ‘nsx‘ also depends on ‘rest‘.

We now need to add some XML to the project’s ‘pom.xml’ file, which is located at the root of the project folder. This is a set of ‘<dependencies>’ tags that define the ‘<dependency>’ entries for the project. You should insert these tags immediately after the ‘<packaging>’ tags but before the ‘<scm>’ tags.

We also want to omit ‘-SNAPSHOT‘ from the version, since we’ll be pushing actual releases.

Below is an example of what the dependency XML insert looks like for the ‘rest‘ package:

<dependencies>
    <dependency>
        <groupId>com.simplygeek.library</groupId>
        <artifactId>logger</artifactId>
        <type>package</type>
        <version>1.0.0</version>
    </dependency>
</dependencies>
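
For context, this is roughly where the block sits within the ‘rest’ project’s pom.xml. The sketch below is trimmed and illustrative only; your generated pom.xml will contain additional elements (parent, properties and so on):

<groupId>com.simplygeek.library</groupId>
<artifactId>rest</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>package</packaging>
<dependencies>
    <dependency>
        <groupId>com.simplygeek.library</groupId>
        <artifactId>logger</artifactId>
        <type>package</type>
        <version>1.0.0</version>
    </dependency>
</dependencies>
<scm>
    <!-- Git SCM connection details from the previous post -->
</scm>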

Once you have completed defining your dependencies, save the ‘pom.xml‘ file and make sure to commit the changes to git.
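
If you are unsure about the Git part, a minimal sequence (the commit message is just an example) would be:

git add pom.xml
git commit -m "Define package dependencies"
git push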

Repeat this process for all of your projects.

Prepare & Release Packages

Maven provides several lifecycle phases and goals that can be used to prepare and perform the releases. We will also need to push the release artefacts to the Artifactory repositories, as this will be queried for dependencies when releasing a package that has any defined.

You should first focus on the core projects that make up most of your dependencies, starting with the projects lowest in the dependency order: the ‘logger’ project has no dependencies, so it will be released first; ‘rest’ depends on ‘logger’, so it will be released next; and so on.

For all the release tasks, we’re going to specify the ‘release’ plugin followed by the goal (example invocations are sketched after the list below). We will use the following goals:

  • Clean
  • Prepare
  • Perform
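
As a rough sketch, these goals are invoked from the root of the project with the Maven release plugin (release:prepare will prompt for the release version, SCM tag and next development version unless you supply them on the command line):

mvn release:clean
mvn release:prepare
mvn release:perform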

Read more “IaC for vRealize: Define Dependencies, Manage Versions, Prepare & Release Packages & Deploy Artifacts”

IaC for vRealize: Manage Existing vRO Code With vRealize Build Tools & Set up Git Repositories

In my previous post on Deploying vRealize Build Tools To Allow Infrastructure As Code for vRA and vRO, I covered how to set up the CI infrastructure and your developer workstation, in preparation for managing your vRO code as projects with Visual Studio Code and Maven. In this post, I will explain how you can work with your existing code base and manage it using the build tools. A major part of this will be creating and managing new projects that will map to our existing code in vRO.

Once projects have been created, I will detail how Git repositories can be used to store and manage your vRO code, map your project dependencies and allow development teams to work collaboratively, without the risk of overwriting the work of others (a major problem when developing using the vRO client). Git is going to bring some very useful processes and methodologies to the table, such as branching, tagging and merge conflict resolution.

I also want to point out that I am currently only focusing on Actions, as I believe this is where all your code should exist. I will have follow-up posts that cover strategies for managing Workflows and other items.

Setting Up GitLab

One thing that I didn’t cover in my previous post was setting up Git. I purposely reserved this topic for this post, as it is more relevant here. To start, you will need to have a GitLab server deployed that can be used to create the repositories for storing your vRO projects. You can use GitHub if you wish (it doesn’t matter too much), but using GitLab doesn’t require that your environments have access to the Internet. GitLab also allows you to create groups to organise multiple repositories, which is going to be useful.

If you need to install GitLab, then there is a good guide on VULTR, that details how to install GitLab and enable HTTPS.

Create User

You should be using your own user account when working with Git and not the default root account, so log into the admin area and create a new local account for your personal use. Alternatively, you can configure GitLab to allow users from Active Directory to log in, using the guide provided by GitLab here.

Create a Personal Access Token

Once you have a Git user account set up, you will need to create a Personal Access Token. An Access Token can be used instead of a password when authenticating with Git over HTTPS. This will provide safer storage of user credentials in the Git configuration files.

Click on the profile icon on the top right of the page and select Settings:

On the Settings page select Access Tokens.

On the ‘Add a personal access token‘ page, give the access token a name and the ‘write_repository‘ permissions. Set an expiry date if you wish, or leave blank to never expire.

Once you click ‘Create personal access token‘, the token will be displayed. You will need to copy and save this token somewhere safe as you will not be able to view it again. If you lose this token, you will have to create a new one to replace it.
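
As an illustration (the hostname, group and project names below are placeholders), the token is then supplied in place of your password whenever Git asks for credentials over HTTPS, or it can be embedded directly in the clone URL:

git clone https://<username>:<personal-access-token>@gitlab.example.com/vro/logger.git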

Create a Group

A Group allows multiple projects/repositories to be created under a single namespace. This is useful when a project spans many repositories and you need to keep these together so that they are easy to locate and manage. Our vRO projects will be using multiple repositories, therefore we’ll create a Group for them. A Group also simplifies granting access to projects, as collaborators can be granted access to the group and, inherently, the projects it contains.

To create a new group, select ‘Groups‘ from the main menu at the top and then select ‘Your groups‘.

Click the ‘New group‘ button and give your group a name and settings that you require (I simply called my group ‘vRO’).

We will create all vRO projects under this new group.

Install Git Client

You will also need to install the Git client for your workstation. You can download the client for your OS here. Read more “IaC for vRealize: Manage Existing vRO Code With vRealize Build Tools & Set up Git Repositories”

IaC for vRealize: Deploying vRealize Build Tools To Allow Infrastructure As Code for vRA and vRO

As any vRealize Orchestrator developer will tell you, managing code outside of the appliance is difficult. I recently wrote a post about Using Visual Studio Code for your vRealize Orchestrator Development, where I highlighted some of the challenges with this. The issue is that we’re not given the freedom to use any IDE we want, easily run unit tests on our code, or do continuous integration with tools like Jenkins.

I did mention that a couple of solutions were on their way; one of these was the internal tooling that VMware’s CoE team currently uses for their vRO development (you can read the article here: https://blogs.vmware.com/management/2018/11/automating-at-scale-with-vro-and-vra.html). It wasn’t possible to get access to these tools without engaging with CoE and forking out a bit of cash.

That is until now, as VMware has released these tools as a new fling. The fling is currently in preview, but you can check it out here: https://labs.vmware.com/flings/vrealize-build-tools. I think this is quite an exciting time for VMware developers as these tools could finally change the way we develop and manage our code and integrate into the wider developer ecosystem.

This is my first blog on this topic, but if I find these tools useful, then there will be plenty more to follow. Getting the environment set up to use these tools is not straightforward and has several dependencies. These include deploying supporting infrastructure such as JFrog Artifactory, preparing all the required artefacts that are sourced from the vRO server and getting the workstation set up to create and manage packages.

Deploy and Configure Platform

Before the developer can begin using the vRealize Build Tools, the supporting platform has to be deployed and configured. This consists of setting up an Artifactory server to store Maven artefacts and build dependencies and preparing the artefact repositories.

Deploy Artifactory Server (skip if you already have this deployed)

This section will detail how to set up the Artifactory server and its required dependencies. Note that the steps below deploy only a single Artifactory node, with the database instance running on the same machine. For a production environment, it is recommended that Artifactory is deployed with high availability and connects to an external/dedicated database instance.

Install Java Development Kit (JDK)

JFrog Artifactory requires Java JDK 8 or above to be installed and the JAVA_HOME variable configured. I am using the open-source version of these tools. Install using the following command:

sudo yum install -y java-1.8.0-openjdk-devel

Add the following lines to ‘/etc/profile’ to set the ‘JAVA_HOME’ environment variable and add the Java bin directory to the path.

export JAVA_HOME=$(dirname $(dirname $(readlink $(readlink $(which javac)))))
export PATH=$PATH:$JAVA_HOME/bin

Then source the file and check that the variables have been correctly configured:

source /etc/profile
env | grep JAVA_HOME
env | grep PATH
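
You can also confirm that the JDK itself resolves correctly:

java -version
javac -version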

Read more “IaC for vRealize: Deploying vRealize Build Tools To Allow Infrastructure As Code for vRA and vRO”

Using Visual Studio Code for your vRealize Orchestrator Development

Please note that this post was created before VMware released the vRealize Build Tools fling. I have a new series that covers using these tools and, in effect, supersedes this post. Please check out my new series on IaC for vRealize: Deploying vRealize Build Tools To Allow Infrastructure As Code for vRA and vRO


When you are developing in a vRealize Orchestrator environment, one of the biggest frustrations is being limited to the vRO IDE. The vRO IDE is very simple, in that it does not provide any of the features that you would expect from an IDE, such as IntelliSense and quality-of-life extensions/add-ons, and only provides basic syntax checking.

There is no integration with source control management systems such as Git (it has an internal system which isn’t great), moving code around can be difficult, and unit testing in the true sense of development doesn’t exist. A connection to the vRO IDE uses a Java client and requires a constant connection to the vRO server, so developing on the move or offline isn’t possible.

Anyone that has developed on this platform will have experienced the same issues and can only dream of the day when this is no longer the case.

You can use an IDE called Visual Studio Code, which can help make your development life easier. Admittedly, this alone doesn’t solve all of the problems discussed, but it does allow you to leverage the power of this IDE to assist in code development. There are still restrictions, such as the lack of integration with vRO itself, which requires the code to be manually copied to the vRO server (yes, annoying). The good news, however, is that solutions are starting to become available to provide that integration. I am going to expand more on this at the end of this post.

If you haven’t heard of Visual Studio Code, it is a lightweight and feature-rich IDE created by Microsoft. The Visual Studio Code website describes it as:

Visual Studio Code combines the simplicity of a source code editor with powerful developer tooling, like IntelliSense code completion and debugging.

First and foremost, it is an editor that gets out of your way. The delightfully frictionless edit-build-debug cycle means less time fiddling with your environment, and more time executing on your ideas.

Visual Studio Code supports macOS, Linux, and Windows – so you can hit the ground running, no matter the platform.

I do almost all of my vRO development using Visual Studio Code, which gives me access to useful extensions and most importantly, keeps me in the mindset of how a developer should work.

In this post, I am going to cover how to set up a Visual Studio Code environment on Windows, install some useful extensions that I like to use, and install Git and other required software components.

Setting Up Your Development Environment

The first thing that you will want to do is download and install Visual Studio Code from here. When you launch it for the very first time, you’ll get an immediately good impression from how quickly it loads and how lightweight it feels. The default dark theme is also quite nice.

Read more “Using Visual Studio Code for your vRealize Orchestrator Development”