This section details the module plugins that are included with JobServer.NET when it is initially installed. Watch for additional downloadable modules which can be installed optionally.

Before reviewing the individual modules, note a convention the installed modules share. Modules that take a list of one or more files as a main input parameter will usually name that parameter FileSource. Any module that outputs a list of one or more files directly compatible with those inputs will usually name that output parameter FileList. We will refer to both in general terms as a FileList, and we will see later why this sometimes carries more significance. For now, it is sufficient to say that this is simply a list of files that we either provide as an input parameter to a module, or generate as an output parameter listing the files a module found or processed. The most common example of the latter is using the [Files] Find module to pick a selection of files and then passing that on to another module as the list of files for it to process. For modules with a FileSource input parameter, if you do not need the more powerful options the [Files] Find module provides, you can generally just supply the folder with a filename pattern on the end of the path, such as C:\Temp\*.log.

[ActiveDirectory] Find Users

This module is used to look up a list of one or more user accounts in Active Directory. The module needs an Active Directory connection; the remaining optional parameters can then be combined in various ways to select the Active Directory user accounts of interest.

[Email] Send Message

This module is used to send an email message to one or more recipients. It requires the connection information for an SMTP server that will accept the email messages for forwarding, or delivery if the messages are all going to local recipients.

Parameter I/O Description
SMTPConnection In This should specify an SMTP connection.
To In This should specify one or more destination email addresses. Multiple email addresses should be separated by commas. See the paragraph below on email address formatting.
From In This can be left empty as long as the connection being used specifies a From address. Otherwise, this overrides the From address in the connection and specifies the email address the message should appear to come from. For automated messages, a typical convention is to use an address like noreply@example.com.
Subject In The subject line of the email message.
Message In The body text of the email message. The body text can contain basic HTML for formatting.
FileSource In An optional FileList of files to include as attachments to the email message.
Priority In What priority level the message should be sent as. Valid values are Low, Normal, or High. The default is Normal.
LogOutputLevel In Minimal: Normal output to the log. Verbose: More detailed output is written to the log, suitable for debugging purposes.

Email address formatting uses either the plain inbox-host format or the fully quoted display-name format. The plain inbox-host format, {inbox}@{hostname}, is what you are typically used to seeing when you type in an email address; an example is aaron88@example.com. The fully quoted display-name format supports a much friendlier display on the recipient's end when you have the recipient's full name along with the email address. These are properly formatted as "Lastname, Firstname <inbox@hostname>". A corresponding address formatted this way would look like "Abernathy, Aaron <aaron88@example.com>".
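The quoted display-name format can be produced programmatically rather than assembled by hand. A minimal sketch using Python's standard library, shown only to illustrate the format (the module itself does not expose this API):

```python
from email.utils import formataddr, parseaddr

# Build a display-name address; the comma in the name forces the quoting shown above.
addr = formataddr(("Abernathy, Aaron", "aaron88@example.com"))
print(addr)  # "Abernathy, Aaron" <aaron88@example.com>

# And split one back apart into (display name, inbox@hostname).
name, inbox = parseaddr('"Abernathy, Aaron" <aaron88@example.com>')
```

Note that the quoting is only required when the display name contains special characters such as the comma; a simple display name may appear unquoted.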

[Excel] Template Merge

For each source file provided, the module creates a new Excel file in the specified destination folder by copying cell values from the source worksheet into a designated area of the template worksheet. The resulting output files retain the template’s styling and structure while incorporating the source data.

[Facebook] Post Message

This module is used to post a message to a Facebook account wall, or optionally, in a specified group if the account is a member of the group.

[Files] Are Identical

This module is used to check whether a file is identical to another referenced file by comparing the contents of the two files.

[Files] Checksum

This module is used to generate a checksum value for one or more files. This is useful for comparing files to each other, or to a known value, to determine if the file is the same as another file or not. The module supports the following algorithms for generating the checksum value: MD5, SHA256, and SHA512.

[Files] Compress

This module compresses one or more files, either to a specified destination folder or in-place. In regular usage, the files are compressed as they are copied or moved to their destination, with each compressed file keeping the same root filename as its original source file. Note that the file extension will be changed to denote the file as being compressed. Thus, a source file with the filename “Chain Quarterly Inventory Results 2012.xlsx” will become “Chain Quarterly Inventory Results 2012.xlsx.zip”.

Additionally, there is an optional set of grouping options which changes the behavior at the specified destination. When the grouping options are used, the destination files may no longer have a 1:1 relationship to the source files: a single destination file may contain multiple files from the source, based on the selected grouping option. Grouping allows the source files to be compressed together into a single .zip file on each execution. The grouping options are based on a time period setting which controls how the destination filename is generated; this setting is combined with the file prefix option and the date/time the module is executed to determine the actual name of the destination file.

The table below illustrates how the combinations can be used together when the input files are these three files: Daily Sales 1.xlsx, Daily Sales 2.xlsx, and Daily Sales 3.xlsx.

For illustrative purposes, we assume the module executed on 2012-02-03 at exactly 23:00:00; this is reflected in the date- and time-based values in the destination filenames for all options except the None option:

Grouping    File Prefix    Destination Filename(s)
None        (none)         Daily Sales 1.xlsx.zip, Daily Sales 2.xlsx.zip, Daily Sales 3.xlsx.zip
None        Hello          Hello_Daily Sales 1.xlsx.zip, Hello_Daily Sales 2.xlsx.zip, Hello_Daily Sales 3.xlsx.zip
Timestamp   (none)         2012-02-03_23-00-00.zip
Timestamp   Hello          Hello_2012-02-03_23-00-00.zip
Hour        (none)         2012-02-03_23.zip
Hour        Hello          Hello_2012-02-03_23.zip
Day         (none)         2012-02-03.zip
Day         Hello          Hello_2012-02-03.zip
Week        (none)         2012-02-W05.zip
Week        Hello          Hello_2012-02-W05.zip
Month       (none)         2012-02.zip
Month       Hello          Hello_2012-02.zip
Quarter     (none)         2012-Q1.zip
Quarter     Hello          Hello_2012-Q1.zip
Year        (none)         2012.zip
Year        Hello          Hello_2012.zip

As we can see from the examples in the table, when no grouping option is selected, the files keep their original base filenames. A file prefix can be specified for the None option, and doing so adds the prefix to the original filenames. The prefix may not seem especially useful with the None option, but that changes when it is combined with any of the other grouping options.

Except for the None option, when the destination filename is based on the selected grouping option, the module will generate the same destination filename each time it is executed within the time period that corresponds to the selected grouping option. In other words, if you selected the Month option for grouping, every time you execute the module within the same month, it will use the same destination filename. This is beneficial in that new input files on subsequent executions will be continually added to the compressed file. So, in the scenario using the Month option, all files for a given month will be added to that month's destination compressed file.

Again, keep in mind that all the grouping options besides None, and the values used in determining the output filenames, are based on the date and time the module is executed, not on the source files' filenames. It may superficially appear otherwise when grouping is not used and the original source files happen to have dates and/or times in their filenames.
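The filename generation described above can be sketched as a small function. This is an illustration of the documented patterns, not the module's actual code; in particular, the week-number convention (week of the year embedded after the month) is inferred from the example table and should be treated as an assumption:

```python
from datetime import datetime

def grouped_zip_name(grouping: str, prefix: str, run_time: datetime) -> str:
    # The None grouping keeps each source filename (plus ".zip") and is not handled here.
    p = f"{prefix}_" if prefix else ""  # "<prefix>_" is prepended when a prefix is given
    if grouping == "Timestamp":
        stem = run_time.strftime("%Y-%m-%d_%H-%M-%S")
    elif grouping == "Hour":
        stem = run_time.strftime("%Y-%m-%d_%H")
    elif grouping == "Day":
        stem = run_time.strftime("%Y-%m-%d")
    elif grouping == "Week":
        # Assumed: ISO week of the year, zero-padded (e.g. 2012-02-W05).
        stem = run_time.strftime("%Y-%m") + f"-W{run_time.isocalendar()[1]:02d}"
    elif grouping == "Month":
        stem = run_time.strftime("%Y-%m")
    elif grouping == "Quarter":
        stem = f"{run_time.year}-Q{(run_time.month - 1) // 3 + 1}"
    elif grouping == "Year":
        stem = str(run_time.year)
    else:
        raise ValueError(f"unsupported grouping: {grouping}")
    return f"{p}{stem}.zip"
```

Because the stem depends only on the execution time, every run inside the same period produces the same name, which is what lets subsequent runs append into the same archive.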

The parameters that the module uses are detailed in the following table.

Parameter I/O Description
FileSource In This is a list of paths to existing files for the module to process. There should be one or more entries for the module to process.
Action In Copy: Copies the source files to the destination. Move: Copies the source files to the destination and then deletes from the source folder the files that were copied.
DestinationFolder In This should be a full path to the folder where you want your compressed output files to be written. A path can be specified as a local path or as a UNC path. Examples: D:\Sales\DailyDetailArchive or \\ServerName\ShareName\Sales\DailyDetailArchive
CompressionGrouping In None: Creates one destination zip file for each source file, with the same name as the source file, plus a “.zip” extension. Timestamp: Compresses all files to a single file named yyyy-mm-dd_HH-MM-SS.zip Hour: Compresses all files to a single file named yyyy-mm-dd_HH.zip Day: Compresses all files to a single file named yyyy-mm-dd.zip Week: Compresses all files to a single file named yyyy-mm-W??.zip, where ?? is the week number Month: Compresses all files to a single file named yyyy-mm.zip Quarter: Compresses all files to a single file named yyyy-Q?.zip, where ? is the quarter (1-4) Year: Compresses all files to a single file named yyyy.zip See the table in the section above for more details.
FilePrefix In This string, plus an underscore ("_") character is added to the start of the destination filename when specified. Applies to all CompressionGrouping settings above.
LogOutputLevel In Minimal: Normal output to the log. Verbose: More detailed output is written to the log, suitable for debugging purposes.
FileList Out On a successful outcome, this will be a list of the files written to the compressed file(s).

[Files] Contains

This module is used to check whether one or more specified files contain the specified search content.

[Files] Copy/Move

This module copies or moves files from one location to another. Typically, you would first set up a job step that uses the [Files] Find module, then use its output as the input to this module. You can also forgo this and supply a file or folder name in FileSource.

[Files] Decode Base64

This module is used to decode a file that is encoded in Base64 format. A file encoded in Base64 is typically used to encode a binary file or object for transmission through a system that normally only properly handles text values. A commonly found example is its use in legacy SMTP email systems.

To use, pass in a list of one or more files that need to be Base64 decoded. The files are created in the output folder, fully decoded and ready for use in any next steps on your job.

[Files] Decompress

This module can be used to decompress one or more files that are stored in a .ZIP compressed file format. The compressed files are extracted and written out to the specified destination folder.

[Files] Delete

This module allows you to delete the files that are specified. This is a common baseline file handling feature and works exactly as you would guess. While you can use it to delete a specific file, or group of files matching a pattern, it becomes more flexible when you combine it with other modules such as the [Files] Find module that is outlined below.

[Files] Delimiter Convert

This module is used with text-based data files, usually provided by legacy systems, external sources, and other places where the choice of output file formats is fairly restricted. The text file may contain any number of records, and the fields within each record may use a delimiter that is not convenient for your target process. This module provides a way to easily automate that type of change. For example, a legacy system may only produce TAB-delimited data/text files, while the system you wish to feed the file into only accepts Comma Separated Values. The module provides a convenient way of transforming the file into the proper format.

The module can convert your data/text files using Comma, Tab, or Pipe characters as field delimiters.

[Files] Download Zone Info

This module allows you to manage the download zone information flag for files stored on any filesystem that supports it. The download zone information flag is stored within an alternate file stream attached to the file, which is only supported on filesystems like NTFS. Filesystems like FAT32 and other older or legacy filesystems do not support features like alternate file streams.

The download zone information flag can be a help or a hindrance to end users, depending on your specific usage needs, and this module assists you in managing it either way. When files are downloaded through users’ browsers from untrusted networks, the download zone is set to indicate the file may be from an untrustworthy source. In such cases, when end users attempt to do certain things with the file, they are prompted with a warning that the file could be dangerous. Sometimes you may want this behavior; at other times it may be an unnecessary hindrance, such as when you are downloading files from a trusted vendor.

The module allows you to remove the flag from one or more files, or set it to a specific level if needed.
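On NTFS this flag lives in an alternate data stream named Zone.Identifier attached to the file. Its content is a small INI-style fragment; for example, a file marked as coming from the Internet zone typically carries something like the following (shown for illustration; additional fields such as the source URL may also be present):

```
[ZoneTransfer]
ZoneId=3
```

ZoneId values 0 through 4 correspond to Local Machine, Local Intranet, Trusted Sites, Internet, and Restricted Sites; deleting the stream clears the flag entirely.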

[Files] Encode Base64

This module is used to encode a file, typically a binary file, so that its contents are encoded in Base64 format. A file encoded in Base64 is typically used to transmit a binary file or object through a system that normally only properly handles text values. A commonly found example is its use in SMTP email systems.

To use, pass in a list of one or more files that need to be Base64 encoded. The files are created in the output folder, fully encoded and ready for use in any next steps on your job.

[Files] Find

This module allows you to locate a selection of files that you want to process further in a following step or module. The module provides a variety of options to gather up a list of files from a specified location, and combining those options allows for some extraordinarily rich filtering of which files you want it to identify. We will start by outlining the parameters the module uses in the following table.

Parameter I/O Description
FileSource In This should be the base folder (path) that you want to have the find module begin looking for eligible files. Note: while most other built-in modules support a filename pattern at the end of the folder path for this same named parameter, this is not supported in this module. The pattern feature is significantly more enhanced in this module and it is specified as a separate parameter.
FilenamePattern In This is a simple filename pattern, as found in many legacy applications, using the asterisk (*) and question mark (?) characters as wildcards for simple pattern matching. These simple patterns can be combined by using the vertical bar separator (|); see the discussion below the table for details and examples.
IncludeRegEx In This allows specifying a regular expression to find files from the source path. The matching is done by file name only, not on the entire path. Using a regular expression allows for complex selection of filename rules. Since regular expressions can be involved, we will provide more details and examples in the section below.
ExcludeRegEx In This allows specifying a regular expression to filter out any files from the source path that we do not want. The matching is done by file name only, not on the entire path. Since this uses a regular expression, it can be combined with both the filename pattern and include regex options to provide a rich set of rules for processing specifically named files in a source path.
IncludeSubfolders In True: The find operation will dig down into any sub-folders located in the source path and will include any files matched in the sub-folders. The sub-folders option is recursive and will keep looking for files any number of subfolders deep. False (default): The find operation is only going to look for files located directly within the source path.
AgeType In Newer: Match only files newer than the thresholds specified below. Older: Match only files older than the thresholds specified below. None (default): The threshold options below have no effect on the matching files found.
ThresholdValue In An integer value for the age threshold. This is combined with the next option to tell us what the threshold is. So, we are able to input a value of 3 here and then choosing the next option is what differentiates the threshold from being set to 3 days or 3 weeks.
ThresholdDuration In Minutes, Hours, Days, Weeks, Months Combining this with ThresholdValue above allows for selecting a wide range of aging options for file selection. Defaults to Minutes.
NoFilesFoundOutcome In ContinueJob (default): If no files are found, continue to the next step of the job. StopJobWithSuccess: If no files are found, stop the job with a success status. StopJobWithFailure: If no files are found, stop the job with a failure status to indicate that this might not be intended results.
LogOutputLevel In Minimal: Normal output to the log. Verbose: More detailed output is written to the log, suitable for debugging purposes.
FileList Out This parameter lists all the files from the Find operation that the module was able to identify.

For most situations where you have a single folder where you pick up files for some process, you will often limit yourself to specifying just the FileSource folder location. At a minimum, though, you should consider specifying a filename pattern and limiting it to the specific type of file(s) you expect to see. For example, if you have a folder location where some other process always drops .log files that you are going to do something with, and you never expect to see any other types of files, you should use a *.log pattern rather than specifying a *.* pattern explicitly or not supplying a pattern at all.

Almost all Windows applications support this same style of filename pattern in various ways, but you are often limited to a single pattern. The filename pattern in this module allows you to specify nearly any number of filename patterns by simply separating each with the vertical bar (|), also referred to as the pipe character. So if you have a process that is going to read a variety of image files, and you can support multiple different formats, you might use a filename pattern like the following example. Notice how this handles the fact that some people and applications create JPEG image files with the .jpg extension while others use .jpeg.

*.jpg|*.jpeg|*.gif|*.png|*.bmp
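The multi-pattern behavior can be approximated with Python's fnmatch module: split the pattern string on the vertical bar and accept a file if any sub-pattern matches. This is a sketch of the semantics, not the module's implementation:

```python
from fnmatch import fnmatch

def matches_pattern(filename: str, pattern: str) -> bool:
    # A file is eligible if ANY of the |-separated patterns matches it.
    return any(fnmatch(filename, p) for p in pattern.split("|"))
```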

The IncludeRegEx option works like a filter as well and can be combined with whatever options you have in the filename pattern. This helps in the filename pattern example above: since the pattern already limits the supported image filename extensions, the regular expression does not need to encode them. It matches against just the filename portion of the file (not the full source path or any sub-folders the file might be found in). Continuing with the example above, suppose we want to make sure our image files have what looks like a valid date in YYYYMMDD format anywhere in the filename. We can do that with an expression like the following example.

^.*\d\d\d\d(0[1-9]|1[0-2])(0[1-9]|[12][0-9]|3[01]).*\..+$

The ExcludeRegEx option continues to filter this set of files; when it is used, any files whose filenames match this regular expression are excluded. Because it can be used to filter out specific exclusions, it helps keep your IncludeRegEx much shorter and easier to read, without resorting to overly complex expressions. Continuing with the example above, suppose we want to make sure we do not pick up any files that have the text “_OLD” at the end of the filename, just before the extension.

^.*_OLD\..+$

By combining these various parameters together, we wind up with a comprehensive set of rules that define what filenames we want to recognize in the source folder, while keeping the set of regular expressions we use to accomplish that to a more simplified and easier to read format. For a more visual introduction on how to utilize the combinations of filename patterns and the include/exclude regular expressions, check out the article at the following URL for additional examples and more.

https://kb.jobserver.net/Q100032
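Putting the three filters together, the overall selection logic behaves roughly like the following sketch (an approximation of the documented semantics using Python's re and fnmatch modules, with the example pattern and expressions from above):

```python
import re
from fnmatch import fnmatch

PATTERN = "*.jpg|*.jpeg|*.gif|*.png|*.bmp"
INCLUDE = r"^.*\d\d\d\d(0[1-9]|1[0-2])(0[1-9]|[12][0-9]|3[01]).*\..+$"
EXCLUDE = r"^.*_OLD\..+$"

def is_selected(filename: str) -> bool:
    # Matching is done on the filename only, never the full path.
    if not any(fnmatch(filename, p) for p in PATTERN.split("|")):
        return False            # must match at least one wildcard pattern
    if not re.match(INCLUDE, filename):
        return False            # must also match the include regex
    if re.match(EXCLUDE, filename):
        return False            # and must not match the exclude regex
    return True
```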

The option to include subfolders is used when you want all the files in the specified path, plus all the files located in any of the subfolders underneath that top-level path. This option is recursive and will retrieve all the files located anywhere in the folder structure under the top-level path. When the FileList output of the [Files] Find module is connected to another module that can work with relative paths, the relative path information is included in the FileList output and allows such modules to offer enhanced functionality. For example, if the subfolders are populated with files and you use the output with the [Files] Copy/Move module, one of that module's options is to replicate the subfolders at the destination. It is this enhanced metadata encoded in the FileList output that allows the original relative hierarchy of files found in the source to be preserved when desired. Modules that do not recognize the hierarchy data simply treat the list of files as a flat list from one source folder, ignoring the hierarchy information. This will be covered in more detail in the Copy/Move modules.

The next three parameters work together as a single set of options for the functionality we’re about to review. Under normal circumstances, if these parameters are left at their default values, the files returned from the find operation will include all files that match the previously defined filtering options. By default, file timestamps are not considered. These three parameters introduce an additional and very useful capability: filtering files based on age relative to the current execution time. This allows you to process only files that are either older or newer than a specified time range. The first parameter, AgeType, defaults to None. When set to None, age-based filtering is disabled, and the other two parameters—ThresholdValue and ThresholdDuration—have no effect. If AgeType is set to either Newer or Older, age-based filtering is enabled. In that case, ThresholdValue and ThresholdDuration define the time window used to determine which files are included in the results.

When the AgeType setting is changed to Newer or Older, we can easily specify a timeframe for file modification and narrow our focus to just those files. The timeframe can be specified all the way down to a period of minutes, or up to a period of months. You can effectively specify years too, but you will have to do a tiny bit of math and enter 36 Months if you want three years. You could select only files older than 90 days by setting the three parameters to AgeType: Older; ThresholdValue: 90; ThresholdDuration: Days. You might connect a [Files] Delete module to this to clean up a folder of old files used by some other process. Going the other direction, you could select only files newer than 60 minutes by setting the three parameters to AgeType: Newer; ThresholdValue: 60; ThresholdDuration: Minutes. You might connect this to a process that runs once an hour for importing data files and needs to evaluate only the most recently updated file(s).
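The age filter amounts to comparing each file's modification time against a cutoff computed from ThresholdValue and ThresholdDuration. A sketch follows; note that how the module converts Months internally is not documented, so approximating a month as 30 days here is purely an assumption:

```python
from datetime import datetime, timedelta

DURATIONS = {
    "Minutes": timedelta(minutes=1),
    "Hours": timedelta(hours=1),
    "Days": timedelta(days=1),
    "Weeks": timedelta(weeks=1),
    "Months": timedelta(days=30),  # assumption: calendar months approximated
}

def file_passes_age_filter(modified: datetime, now: datetime,
                           age_type: str, value: int, duration: str) -> bool:
    if age_type == "None":
        return True  # age-based filtering disabled
    cutoff = now - value * DURATIONS[duration]
    if age_type == "Older":
        return modified < cutoff
    return modified > cutoff  # "Newer"
```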

The parameter NoFilesFoundOutcome provides a way to override some of the default behavior of the module. The default value for this parameter is ContinueJob; we will see how the other options differ in just a moment. Normally, once all your filtering options are combined and the module finds all the matching files, it emits them through the FileList output parameter. It does this whether it finds a few files, thousands of files, or no files at all. This means that when no files match your specifications, that empty result will still be passed on to the next step or module in your job. This should be fine for most modules, as a well-written module should handle an empty input list properly. But there can be times when either a specific module does not handle an empty file list as input, or it simply makes sense that if [Files] Find did not find any files matching your search request, the job should stop at that point. It is in this last situation that this option becomes particularly useful.

By changing this parameter to either StopJobWithSuccess or StopJobWithFailure, the module will stop the job execution from proceeding any further in the event the find operation returns no matching files. The only difference between these options is whether the step is flagged as completing successfully or not. If the find operation found no files and there is no further work for the job to do, then stopping with a success status makes sense: there is no cause for concern, and it is safe to stop any further processing here. Otherwise, if a particular job always expects to find one or more files when it is run, and finding none on a given run might indicate some sort of problem, then setting the failure option allows the run to stand out in your log and gives you the ability to trigger DevOps notifications in the job definition if desired.

The parameter LogOutputLevel controls the amount of detail that is included in the log activity when the job is executed. For [Files] Find, there is normally no reason to set a higher level of logging detail. The exception might be when you are setting up a non-trivial combination of options and want more information recorded in the log about what the module used and the results it generated; in that case, setting this to the higher Verbose option can be useful.

[Files] Find And Replace

The [Files] Find and Replace module provides a method for applying content changes to text/data files. This can be used to manipulate or fix data in plain text files, such as replacing all occurrences of “flat-white paint” with “White Paint, Matte Finish”.

[Files] Generate Filename

This module is used to generate a new, unique filename based on the supplied parameters.

[Files] Join

The [Files] Join module provides a method for joining multiple input files into a singular text/data file. Various options provide for joining files under a variety of conditions.

[Files] Mutex Lock

The [Files] Mutex Lock module allows jobs which may interact with each other when running to take an action based on lock activity. This can be used to prevent two jobs that might try to use the same resource from running at the same time. When used in this manner, the job that received the lock first will be able to continue running, while the job that encounters an existing lock will be stopped. Another form of operation is to let the lock act as a gatekeeper, pausing any subsequent jobs and only allowing them to continue once the previous lock is cleared.

This is useful when multiple jobs need access to a limited resource, or when a process would not work correctly if multiple jobs attempted to work with a given resource at the same time.

[Files] Mutex Unlock

The [Files] Mutex Unlock module allows for jobs which may interact with each other when running to take an action based on lock activity. This is used in conjunction with the Mutex Lock module. They are used together to wrap around a critical section of job steps. See the Mutex Lock KB article for a complete description and examples.

[Files] Pick Subset

The [Files] Pick Subset module picks a subset of files from the supplied FileSource, up to the specified number of files, and returns their names in an output list parameter.

[Files] Read

The [Files] Read module provides a means to read the text contents of a file and make it available as an output parameter for use by subsequent steps in a job. Thus, it takes only a single input parameter as the source filename to read. The contents of the file are made available on the TextValue output parameter.

[Files] Render FileList as Html

This module takes a list of files and renders it as an HTML table.

[Files] Set Attributes

The [Files] Set Attributes module allows you to operate on one or more specified files and set the filesystem attributes for all of the indicated files.

[Files] Split

The [Files] Split module allows you to take one or more large files, and split them into individual smaller sized files. Splitting large files like this is typically used when it is necessary to transmit or store the files on different media or filesystems that might not properly handle the original large size files. The files can be joined back together when needed after transmission or when recovered.

For example, suppose you have a 100 MB file and you want to split it into 5 MB chunks for storage/transmission. Your original input file might be named ThisSeasonsData.dat and with specifying 5 MB for the SplitAtSize parameter, you should end up with 20 output files, sequentially named from ThisSeasonsData_0001.dat through to ThisSeasonsData_0020.dat.

[Files] Text Convert

The [Files] Text Convert module provides a variety of options that allow for quick and simple cleanup of text based data files you might have received or generated from one system that needs to have some minor manipulation occur on the file before it can be used or consumed by another system. This module offers several common options for cleaning up text-based data files of this type.

[Files] Touch

The [Files] Touch module provides a way for the supplied list of files to have the file date and time timestamp updated for the file(s) along with setting or clearing one or more other flags on the file(s).

[Files] Validate Checksum

This module validates a list of file(s) against a checksum manifest file. The manifest file is in either CSV, JSON, or SUM format. The supported checksum types are MD5, SHA256, and SHA512. The module will return a list of files and pass/fail status.

[Files] Write

The [Files] Write module implements a method for taking an output parameter from another module and writing the contents to a specified text file.

[Folders] Check Size

The [Folders] Check Size module provides a way to evaluate a folder based on the number of files it contains, or the total size of those files, with the option of limiting those values to the top-level folder or including the hierarchy of any nested subfolders.

[Folders] Create

This module creates one or more folders in a specified root path. It returns a list of the folders that were created.

[Folders] Delete

This module deletes one or more folders in a specified root path. It returns a list of the folders that were deleted.

[FTP] Copy/Move from Remote

This module allows you to copy or move one or more files from an FTP server to a different location, e.g., on a local or network drive. Typically, you would first set up a job step that uses the [FTP] Find on Remote module, then use its output as the input to this module. You can also forgo this and supply a file or folder name in FileSource.

The module will connect to an FTP server, copy each specified file to the desired destination, and then optionally delete it from the FTP server once it has been successfully copied.

[FTP] Copy/Move to Remote

This module provides a way to copy or move one or more files from a local or network drive and upload them to an FTP server. Typically, you would first set up a job step that uses the [Files] Find module, then use its output as the input to this module.

The module will connect to an FTP server, copy each specified file to the desired destination, and then optionally delete it from the local source once it has been successfully copied.

[FTP] Delete from Remote

This module allows you to delete files on a remote FTP server. You can specify a specific file or an FTP FileList, which is typically done using the [FTP] Find on Remote module.

[FTP] Find on Remote

This module allows you to locate a selection of files on an FTP server that you want to process further in a following step or module. Like the [Files] Find module used for selecting files on local drives or network connections, this module provides a variety of options for gathering a list of files, in this case from an FTP server, and those options can be combined to filter exactly which files it identifies.

[Hyper-V] Action

The [Hyper-V] Action module allows you to use a job to control a Hyper-V virtual machine. It allows you to start and stop VMs, along with a few other operations.

[Hyper-V] Checkpoint

The [Hyper-V] Checkpoint module allows you to manage checkpoints on a Hyper-V virtual machine.

[IIS] Action

The [IIS] Action module provides a means to use a job to control an IIS server. It allows you to start and stop websites on the server, along with a few other operations.

[Images] Constrain

The [Images] Constrain module provides a flexible means of resizing images to consistent dimensions. It constrains each image to the specified settings while aiming not to distort the image when the aspect ratios of the source image and the desired target image do not match.

The parameters that the module uses are detailed in the following table.

Parameter I/O Description
FileSource In This parameter specifies the source of the image files as a FileList type of parameter. As a FileList, this can be specified as just the path for a specific folder. Or it can be specified as the path to a folder with a filename pattern to limit it to specific types of filenames or extensions. Or it can be linked to a preceding [Files] Find module for more flexibility in choosing what files to process.
TargetFolder In By default, re-encoded images are written to the target folder without affecting the original source image. This means the target folder must be a different location from the source folder.
DeleteSourceFiles In The default value for this parameter is false (unchecked) meaning the original source image is not affected by the re-encoding process. If changed to true (checked), then the source image is deleted only once the modified file is successfully written to the target folder.
Operation In Crop (default): Determines the best way to obtain a usable image from the source image by cropping it to fit the target size while maintaining the targeted aspect ratio. Scale: Maximizes the original image inside the target size, maintaining the aspect ratio. If the source image and the target size are not an exact match to the aspect ratio, then the canvas color will be visible as either horizontal or vertical bars as needed.
Width In Either the target height or width of the output image should be specified. If only one is specified, then the aspect ratio of the target must also be specified. If both are specified, then the target aspect ratio is not needed, as supplying both the width and height defines the new aspect ratio.
Height In Optional in some cases, as explained in the description of the Width parameter above.
AspectRatio In If only a target width or height (but not both) are specified, then the target aspect ratio must also be specified. Aspect ratio is supplied as a string value such as “16:9” which is the standard aspect ratio for 1080p HDTV resolution. An aspect ratio of “4:3” would match the aspect ratio of the older SDTV resolution.
SourceAnchor In None, Top, TopLeft, TopRight, Bottom, BottomLeft, BottomRight, Left, Right When determining how to best fit the source image into the target image, the default is None, which centers the source image both vertically and horizontally either within or over the canvas for the target image. This behavior can be modified by having the source image optionally anchor itself to one or two sides of the target. An example might be if all the source images you are processing are photos of people’s faces and the top of their head is generally near the top of the photo, then you might change the anchor to Top to make sure it does not crop out the subject’s hair.
CanvasColor In On scale operations, the canvas of the target image may end up slightly larger than the scaled source image, leaving a portion of the canvas uncovered. In that case, you may want to set the canvas color to match your usage. Colors can be specified in HEX-RGB format, such as “#808080”, or as any of the standardized names recognized by the .NET System.Drawing.Color namespace.
AutoRotateOnExif In True (default): The image will automatically be rotated to the correct orientation if there is embedded EXIF sensor data that would indicate the image is not stored oriented to the expected viewing angle. False: The EXIF data is ignored, and no rotation will occur regardless of any EXIF data embedded in the image.
LogOutputLevel In Minimal: Normal output to the log. Verbose: More detailed output is written to the log, suitable for debugging purposes.
FileList Out When the module completes any work successfully, this is a FileList of all the specific images that were re-encoded and written to the target folder.

Let’s review how the crop and scale operations work. When the system prepares the target image, it first creates a blank canvas sized according to your specifications. At this point, no source image has been applied—just the empty canvas. The crop operation trims the source image as little as possible so that it completely covers the canvas. In other words, it ensures there are no empty areas, even if that means cutting off some of the image. The scale operation works differently. It resizes the source image so that the entire image remains visible while fitting inside the canvas as closely as possible. This guarantees nothing is cut off, though there may be empty space around the image if the aspect ratios differ.
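The difference between the two operations boils down to which scale factor is chosen. A hedged Python sketch of that arithmetic (illustrative only, not the module's implementation; the fit_dimensions helper is hypothetical):

```python
def fit_dimensions(src_w, src_h, dst_w, dst_h, operation="crop"):
    """Return (w, h) the source is resized to before being centered on the canvas.
    crop: cover the canvas fully (excess is trimmed away).
    scale: fit the whole image inside the canvas (bars may remain)."""
    scale_x = dst_w / src_w
    scale_y = dst_h / src_h
    factor = max(scale_x, scale_y) if operation == "crop" else min(scale_x, scale_y)
    return round(src_w * factor), round(src_h * factor)
```

For a 4000x3000 photo placed on a 1920x1080 canvas, crop resizes it to 1920x1440 and trims 360 pixels of height, while scale resizes it to 1440x1080 and leaves canvas-colored bars on the left and right.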

The crop operation is the method you would want to use if you are certain that the content of your image is either usually centered, or closest to one edge or corner of the source images. This allows you to get the maximum amount of the original image into your target image and will completely cover the canvas of the target image. Using crop allows you to wind up with a target image that is consistent in size and does not have any bleed through of the canvas on the target image.

((TODO: add example constrained:crop images))

The scale operation is the method you would want to use if you are uncertain that the content of your image will be mostly centered or in a predictable relative position within the source image. This preserves your entire source image at the expense of having the canvas of the target image visible along one or two edges of the target image.

((TODO: add example constrained:scale images))

For more examples of the constrain module and the differences between the crop and scale operations, you can review the article below for additional examples and demonstrations of how this works.

[Images] Contact Sheet

This module allows you to create a contact sheet of multiple images.

[Images] Encode

This module allows you to take one or more image files and convert them all to a consistent encoding. If you have a folder full of images where some are .gif, some are .jpg, and some are .png, you may want to re-encode them all for proper use with some other process. With this module, you can ensure they all get changed to .png, or whatever your desired image format is. When re-encoding images, the module uses the best quality settings on any image codecs that offer settings for balancing size against quality.

[Images] Overlay

This module overlays an image onto one or more input image files. Opacity, X offset, and Y offset can be specified. The image can optionally be resized to fit the input image.

[Images] Quantize

This module quantizes images, reducing the number of individual colors used in each individual image. Number of colors, color reduction method, and dithering method can be specified.

[Images] Rotate

This module allows you to take one or more image files and either rotate them automatically to match any recorded sensor data in each image’s EXIF data, or explicitly rotate them to any 90-degree angle as needed. Devices with GPS and/or orientation sensors usually record EXIF information in the photos they take. EXIF is an acronym for Exchangeable Image File Format; it can record data such as the location where the image was taken as well as the orientation of the camera. If the camera was held at an angle different from the default, applications that do not know how to interpret this orientation data may display the image sideways or upside down. To fix such situations, the automatic setting of this module looks at the orientation data for each image, rotates the image to match, and then removes the orientation data so that the image displays as expected in all applications.

[JSON] Minify

This module formats an input JSON string to minimize the amount of whitespace, making it smaller without altering the data. Note that minification comes at the expense of readability.

[JSON] Prettify

This module formats an input JSON string, adding whitespace and line breaks without altering the data, in order to make it more legible.
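The two transformations above are inverses in spirit: both round-trip the same data, changing only the whitespace. A quick illustration using Python's standard json module (a sketch of the concept, not JobServer code):

```python
import json

ugly = '{ "name": "JobServer",   "steps": [1, 2, 3] }'

# Minify: drop all optional whitespace without altering the data
minified = json.dumps(json.loads(ugly), separators=(",", ":"))

# Prettify: add indentation and line breaks for readability
pretty = json.dumps(json.loads(ugly), indent=2)
```

Both strings parse back to the identical structure; only the presentation differs.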

[JSON] Render Html

This module renders a block of JSON data to HTML for visualization purposes. It can be used to display JSON data in a more user-friendly format, making it easier to read and understand the structure of the data.

[LogFiles] Archive

This is a specific task-driven module. It combines several steps into one that would otherwise require you to build the same functionality out of several steps with other base modules. Its primary focus is providing a quick way for managing an archive of log files. While we focus on log files here, it certainly could be used on any type of collection of files that you would want to handle in the same manner.

The module archives log files, with the option to use a different folder for the archive than the source. When a log file is archived, it is moved into a compressed destination file, meaning the original raw log file is removed once it has been successfully archived. The module optionally offers compression grouping features similar to those found in the [Files] Compress module for the compressed archive file(s). After processing all those options, it can also automatically prune the archived .zip files beyond a certain age. This bundles the functionality of multiple modules into one while keeping the number of parameters to a minimum for the common task of managing application log files.
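The archive-then-prune sequence can be sketched as follows (an illustrative Python sketch under the assumption of one zip per log and no compression grouping; the archive_and_prune helper is hypothetical, not the module's implementation):

```python
import os
import time
import zipfile

def archive_and_prune(log_dir, archive_dir, max_age_days):
    """Zip each .log file into archive_dir, remove the original (move
    semantics), then prune archive zips older than max_age_days."""
    os.makedirs(archive_dir, exist_ok=True)
    for name in os.listdir(log_dir):
        if not name.endswith(".log"):
            continue
        src = os.path.join(log_dir, name)
        dst = os.path.join(archive_dir, name + ".zip")
        with zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zf:
            zf.write(src, arcname=name)
        os.remove(src)  # raw log removed only after a successful archive
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(archive_dir):
        path = os.path.join(archive_dir, name)
        if name.endswith(".zip") and os.path.getmtime(path) < cutoff:
            os.remove(path)
```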

[Logic] Branch

This module causes the JobServer to immediately jump to a different step in the same job. This is useful for skipping steps, or for creating loops in a job.

[Logic] Compare And Branch

This module controls execution based on a logical comparison of two values specified in the module. It compares the values and can jump to different steps depending on whether the result is less than, equal to, or greater than.

[Logic] Contains

This module is used to control execution based on string value comparison. The module will check if a string contains another string, and can jump to different steps when the comparison is true or false. A regular expression can be used for the comparison.

[Logic] Stop

This module is used to immediately stop a job. The execution outcome of the job can be set to Success, Warning, or Failure. This is useful for stopping a job when an error condition is detected, or for stopping a job when it has completed its work.

[M365] Extract Attachments

This module is used to extract attachments from emails in Microsoft 365. It takes a list of message Ids (such as those from the M365 Inbox Watcher trigger) and then extracts the attachments from those emails.

[Machine] Activate Power Plan

This module provides a job with the means to control the local machine’s power plan. The active power plan will affect certain operational characteristics of the machine. These characteristics can control the power consumption and processor speed of the machine.

[Machine] Hardware Inventory

This module allows you to retrieve the general hardware inventory of the local or specified machine. When specifying a machine other than the local machine, the JobServer service must be running in an account with the proper domain trust relationship for domain member machines.

[Machine] Purge Downloads

This module will iterate through all the user profiles on the local machine and will purge the downloads folders of files left behind.

[Machine] Service Control

This module allows you to take action on the services installed on the local machine. It can be used in response to other steps in a job to start, stop, or restart specific services. In what situations would you want to control services on the machine? One example would be a service from a third-party vendor that has a memory leak or some other resource problem. By using this module in conjunction with the scheduler, you could periodically shut down and restart the service to force it to release resources.

[Machine] Shutdown/Restart

This module allows you to shut down or restart a running machine. This can be used for the local machine as well as any machine in a member or trusted domain. While this may have many uses, one example would be when you want all your network users to shut off their machines when they leave for the day. If a problematic end user constantly forgets to do this, you could easily force their machine to shut down after a certain time each day using this module.

[Network] Http Action

This module allows you to generate a basic HTTP request of various types. The parameters can be used to build up the request you want to send to the webserver.

[Network] Http Ping

This module allows you to send an HTTP GET request to a specified URL. Its purpose is to send the request and retrieve the HTTP status code of the response. Typical uses are keeping a specific page of a website loaded in cache, or doing a basic status-code check of a URL to make sure a site or webserver is alive and responding. This module is meant for quickly creating very simple HTTP requests; for other types of requests, use the more flexible [Network] Http Action module.
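The idea of a status-code ping is simple enough to show in a few lines of Python (a conceptual sketch using the standard library, not JobServer code; the http_ping helper is hypothetical):

```python
import urllib.error
import urllib.request

def http_ping(url, timeout=10):
    """Send an HTTP GET and return the response status code (e.g. 200, 404)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # non-2xx responses still carry a status code
```

A monitoring job would then branch on the returned code, treating anything outside the 2xx range as a sign the site needs attention.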

[Network] VPN

This module allows a job to start or stop a VPN that is defined in the Windows networking configuration. In Windows 10, for example, these connections are found in Control Panel > Network and Internet > Network and Sharing Center.

[Parameter] Set

This module is used to set, increment, or decrement a local or global parameter’s value. Combined with the [Logic] Compare And Branch module, this allows you to create loops and conditional branches in your job.

[PDF] Merge

This module is used to merge multiple PDF files into a single PDF file. The module takes a list of PDF files as input and produces a single PDF file as output.

[PDF] Password Maintenance

This module is used to set/clear a PDF file’s passwords. The module takes a PDF file as input and produces a PDF file as output. Note that if the PDF is already protected by a password, the password must be provided in order to clear it or to set a new password. User & Owner passwords are supported.

[Perl] Execute

This module allows a step to call and execute a specified Perl script. In order to run the script, a Perl interpreter must be installed on the local server. Currently, this module looks for and uses an installation of Strawberry Perl to execute Perl scripts.

[PGP] Decrypt

This module provides a way to decrypt any number of files that have been encrypted using PGP.

[PGP] Encrypt

This module provides a way to encrypt any number of files using the PGP encryption algorithm.

[Python] Execute

This module allows a step to call and execute a specified Python script. In order to run the script, a Python interpreter must be installed on the local server. Currently, this module looks for and uses an installation from www.python.org to execute Python scripts.

[Shell] Command Line

This module is built to provide a quick method for executing a legacy command-line command such as: DIR, XCOPY, DEL, RMDIR, and so on.

[Shell] PowerShell Command

This module is built to provide a quick method for executing a PowerShell command.

[Shell] Sleep

This is a simple module. Its purpose is to delay for a specified number of seconds. Any job steps coming after the sleep command must wait until its countdown has finished. It is an ideal module to use for testing when learning how to use and configure job definitions in JobServer. It also helps account for potential lag between operations in a single step, ensuring that subsequent steps observe the expected results.

[Shell] Sleep Random

This is much like the [Shell] Sleep module, except that you can specify a range of seconds to sleep, and it will initiate a countdown for a random duration within that range. Like [Shell] Sleep, any job steps coming after it must wait until the countdown has finished.

[Slack] Send Message

This module allows a job to have the ability to send messages to a specified Slack channel.

[SMS] Send Message

This module allows your job to send an SMS message. The SMS Connection defined for the module determines how the system transmits messages via a messaging provider. Currently, there is one supported provider, Twilio. For more information on configuring messaging providers for SMS, refer to the article on SMS Connections.

[SQL Server] Agent

This module allows a job to trigger a SQL Server Agent Job. This allows the JobServer to coordinate any activity it might perform in its own jobs with the SQL Server Agent jobs.

[SQL Server] Create Tables

This module creates tables in a SQL Server database based on the provided source data files. The source data files can be in CSV or Excel format. The module reads the structure of the source data and executes the necessary SQL commands to create tables with appropriate columns and data types in the target SQL Server database.
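To make the idea of deriving a table from source data concrete, here is a deliberately simplified Python sketch (the module's actual type-inference rules are not documented here; the infer_sql_type and create_table_sql helpers, and the three types used, are hypothetical):

```python
import csv
import io

def infer_sql_type(values):
    """Pick a SQL Server column type from sample string values (simplified:
    real inference would handle dates, NULLs, lengths, and more)."""
    try:
        for v in values:
            int(v)
        return "INT"
    except ValueError:
        pass
    try:
        for v in values:
            float(v)
        return "FLOAT"
    except ValueError:
        return "NVARCHAR(255)"

def create_table_sql(table, csv_text):
    """Build a CREATE TABLE statement from a CSV header row and data rows."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    cols = [f"[{name}] {infer_sql_type([r[i] for r in data])}"
            for i, name in enumerate(header)]
    return f"CREATE TABLE [{table}] ({', '.join(cols)})"
```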

[SQL Server] Execute

This module is built to provide a quick method for executing a SQL command on a Microsoft SQL Server database.

[SQL Server] Export

This module allows a quick method for creating data output from a SQL command. A variety of options allow for a wide variety of configurations.

[SQL Server] Get Value

This module executes a SQL command on specified SQL Server and retrieves a singular value as the result. The module is designed to return a single value from the database, which can be used in subsequent steps of the workflow. It allows you to specify the SQL command to be executed and the connection details for the SQL Server.

[SQL Server] Import

This module allows a quick method for bringing data from external files into a SQL database. A variety of options allow for a wide range of configurations.

[SQL Server] SSAS Execute

This module allows the execution of a SQL Server Analysis Services XML command.

[Teams] Copy/Move From Remote

This module provides a job the ability to copy or move files from a remote Teams channel or folder to the local machine or network.

[Teams] Copy/Move To Remote

This module provides a job the ability to copy or move files from the local machine or network to a remote Teams channel or folder.

[Teams] Find Files

This module provides a job the ability to locate files of interest on a remote Teams site.

[Teams] Send Message

This module provides a job the ability to send a message to a Microsoft Teams channel.

[Telegram] Send Message

This module provides a method for jobs to send a message to a Telegram Channel.

[Twitter] Send Message

This module provides a method for jobs to send a message via X/Twitter.