Apache Ant: Generating a build file
After Apache Ant and the Java JDK have been installed and their environment variables set up correctly, a build.xml file can be generated for your scheduling requirements.
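As a sketch, the prerequisite environment variables typically look like the following on a Unix-like shell (the install paths here are placeholders; substitute your own Ant and JDK locations, and on Windows set the equivalent variables via System Properties):

```shell
# Placeholder install locations - adjust to where Ant and the JDK
# are actually installed on your machine.
export ANT_HOME=/opt/apache-ant
export JAVA_HOME=/opt/jdk

# Put the Ant and Java executables on the PATH so 'ant' resolves.
export PATH="$ANT_HOME/bin:$JAVA_HOME/bin:$PATH"

# Confirm the Ant bin directory is now on the PATH.
echo "$PATH" | grep -c apache-ant
```

If the final command prints a non-zero count, the Ant directory is on your PATH; a "not recognized" error later usually means one of these variables is missing or wrong.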
This build.xml file defines the scope and requirements of the scheduling task, including what tests should be run, the test environment to be used, the report recipients and other details.
You can generate a build.xml file automatically using Provar’s Run Under ANT option.
Run Under ANT
Start by opening Provar. If Provar is already running and has not been restarted since you updated your Environment Variables, close and restart it now.
In the Navigator view, locate the test case or test folder you wish to schedule.
Right-click on the test case or test folder and select Run Under ANT.
The following ANT Dialog window will appear.
This window provides all the options available for test scheduling. Note that these options default to the most common configuration.
Note: This page does not run through every option available in Run Under ANT. If you require more information on a specific option, refer to the Run Under ANT Options page. Or, to learn more about the parameters in your Build File, refer to ANT Task Parameters.
In particular, note the ‘Start Ant execution’ checkbox at the bottom of the window. If ticked, Test Case execution will be initiated through ANT as soon as the ‘Run’ button is clicked. If ‘Start Ant execution’ is not ticked, the Build File will be created without executing the Test Cases. This is useful if you want to create the Build File to be run on a different machine.
We recommend unticking Start Ant execution for now, since we will test the Build File manually after it is created.
Setting up a report email
We recommend that your scheduling sends a report email to stakeholders after the scheduled run has been completed.
To set up an email report, click the Email tab on the ANT Dialog window.
Note that, from Provar 1.8.11 onwards, there are some additional configuration options for the run report.
Refer to Exporting Reports for more information on these options.
Once you have selected your preferred options, click the Email Configuration button.
Enter the details of the email account you want to use to send the report email:
Click the Test Connection button to verify the email account details. Note that clicking this button will send a test email from the account, to the same account, to confirm access.
If successful, a test email will be sent from and received by the email account:
And the following success message will be displayed:
Click OK to remove this message, and then OK again to confirm the email connection details.
Creating the build file
After you have set your preferred options, click the ‘Run’ button from any tab of the ANT Dialog window.
This will create the Build File in the location you defined:
If the Start Ant Execution checkbox was ticked on the Details tab, this will also begin Ant execution automatically. If you unticked Start Ant execution, follow the steps below to run the Build File manually on your local machine.
To learn more about the parameters in your Build File, refer to ANT Task Parameters.
Running the build file locally
Now you can test the Build File by running it locally. The following steps describe how to do this using the command line. (Note that you can do this automatically when creating the Build File by ticking the ‘Start Ant Execution’ checkbox.)
To run the Build File, use the command ant -f build.xml runtests from the directory where the Build.xml file is stored. Follow the steps below if you are not sure how to do this.
Navigate to the folder which contains the new Build File.
Highlight the full file path by clicking in the address bar at the top of the window.
Then type cmd and press Enter.
This will open a Command Prompt in the directory of the Build File.
Type ant into cmd.
Note: this simple ant instruction will work fine if there is only one target in the Build File. If your Build File was generated using the default settings supplied in Run Under ANT, you can use this command. However, if you have modified the Build File so that it contains multiple targets, instead of entering ant you should enter ant -f build.xml runtests, where runtests is the name of the target you want to run:
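To illustrate why the target name matters, here is a minimal sketch of an Ant build file with two targets; the project and target names below are illustrative only, not the ones Provar generates:

```shell
# Write a minimal build.xml with two targets (illustrative names).
cat > build.xml <<'EOF'
<project name="demo" default="runtests">
  <target name="runtests">
    <echo message="running the full test suite"/>
  </target>
  <target name="smoke">
    <echo message="running smoke tests only"/>
  </target>
</project>
EOF

# With more than one target, name the one you want explicitly:
#   ant -f build.xml runtests
# Running bare 'ant' would fall back to the file's default target.

# List the targets defined in the file.
grep -o 'target name="[^"]*"' build.xml
```

Running bare ant uses the project's default attribute, which is why a single-target file generated with default settings works without arguments.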
Press Enter to run the command above. If you get the following ‘not recognized’ error, check your ANT_HOME and Path variables:
Wait for the command to complete.
Viewing Test Results
On completion, the interface will look something like this:
If you specified an email as part of the Build Task, you should also have received an email at the addresses you specified:
A detailed execution report will also have been created in the Results directory specified in the Build File (this defaults to /ANT/Results).
This contains an index.html file which can be opened in any web browser:
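The default results layout described above can be sketched as follows (the folder and file are created here only to illustrate the structure; a real run populates the report contents):

```shell
# Simulate the default results location from the doc: /ANT/Results
# (relative path used here for illustration).
mkdir -p ANT/Results

# Placeholder for the generated report entry point.
: > ANT/Results/index.html

# The report index sits at the top of the Results directory;
# open it in any web browser to view the detailed execution report.
ls ANT/Results
```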