Apache ANT: Generating ANT build file
After Apache ANT and the Java JDK have been installed and the ANT environment variables are set up correctly, a build.xml file can be generated for your scheduling requirements.
This build.xml file defines the scope and requirements of the scheduling task, including what tests should be run, the test environment to be used, the report recipients, and other details.
You can generate a build.xml file automatically using Provar’s Run Under ANT option.
Run Under ANT in Provar
Start by opening Provar. If Provar is already running and has not been restarted since you updated your Environment Variables, please close and restart it now.
In the Navigator view, locate the test case or test folder you wish to schedule.
Right-click on the test case or test folder and select Run Under ANT.
The following ANT Dialog window will appear.
This window provides all the options available for test scheduling. Note that these options default to the most common configuration.
Note: This page does not run through every option available in Run Under ANT. Refer to the Run Under ANT Options page for more information on a specific option. Or, to learn more about the parameters in your Build File, refer to ANT Task Parameters.
In particular, note the ‘Start Ant execution’ checkbox at the bottom of the window. If ticked, Test Case execution will be initiated through ANT as soon as the ‘Run’ button is clicked. If it is not ticked, the Build File will be created without executing the Test Cases. This is useful if you want to create the Build File to run on a different machine.
We recommend unticking Start Ant execution for now since we will test the Build File manually after it is created.
Setting Up a Report Email
We recommend that your scheduling sends a report email to stakeholders after completing the scheduled run.
Click the Email tab on the ANT Dialog window to set up an email report.
Note that, from Provar 1.8.11 onwards, additional configuration options exist for the run report.
Refer to Exporting Reports for more information on these options.
Once you have selected your preferred options, click the Email Configuration button.
Enter the details of the email account you want to use to send the report email:
Click the Test Connection button to verify the email account details. Clicking this button sends a test email from the account to itself to confirm access.
If successful, the test email will be sent from and received by the email account:
And this success message will display:
Click OK to dismiss this message, then click OK again to confirm the email connection details.
Creating the Build File
After you have set your preferred options, click the ‘Run’ button from any tab of the ANT Dialog window.
This will create the Build File in the location you defined:
If the Start Ant execution checkbox is ticked on the Details tab, this will also begin ANT execution automatically. If you unticked Start Ant execution, follow the steps below to run the Build File manually on your local machine.
To learn more about the parameters in your Build File, refer to ANT Task Parameters.
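For orientation, the sketch below shows the general shape of an ANT build file with a single target named runtests (matching the command used later in this guide). This is an illustrative skeleton only; a real Provar-generated Build File contains Provar-specific task definitions and test-run parameters, which are documented in ANT Task Parameters.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative skeleton only: a Provar-generated build.xml declares
     Provar's own task definitions and test-run parameters here. -->
<project name="ProvarScheduledRun" default="runtests" basedir=".">

  <!-- The target name is what you pass on the command line:
       ant -f build.xml runtests -->
  <target name="runtests">
    <echo message="Provar test-execution tasks are generated here"/>
  </target>

</project>
```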
Running the Build File Locally
Now you can test the Build File by running it locally. The following steps describe how to do this using the command line. (You can do this automatically when creating the Build File by ticking the ‘Start Ant Execution’ checkbox.)
To run the Build File, use the command ant -f build.xml runtests from the directory where the build.xml file is stored. Follow the steps below if you are not sure how to do this.
Navigate to the folder which contains the new Build File.
Click in the address bar at the top of the window to highlight the full file path.
Then type cmd and hit return.
This will open a Command Prompt in the directory of the Build File.
Type ant into cmd.
Note: this simple ant instruction will work fine if there is only one target in the Build File, which is the case if the Build File was generated using the default settings supplied in Run Under ANT. However, if you have modified the Build File so that it contains multiple targets, enter ant -f build.xml runtests instead of just ant, where runtests is the name of the target you want to run:
Hit return to run the command above. If you get the following ‘not recognized’ error, check your ANT_HOME and Path variables:
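If you see the ‘not recognized’ error, the following Windows Command Prompt checks (matching the command-line steps above) confirm whether ANT and the JDK are visible to the shell. The directory shown in the comments is an example only:

```bat
:: Windows Command Prompt: confirm the variables ANT needs.
:: Each echo should print a real directory (e.g. C:\apache-ant),
:: not the literal variable name.
echo %ANT_HOME%
echo %JAVA_HOME%
:: This should resolve to a path ending in \bin\ant.bat
where ant
```

If ANT_HOME is missing or where ant finds nothing, revisit your Environment Variables and restart the Command Prompt before retrying.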
Wait for the command to complete.
Viewing Test Results
On completion, the interface will look something like this:
If you specified an email as part of the Build Task, you should also have received an email at the addresses you specified:
A detailed execution report will also be created in the Results directory specified in the Build File (this defaults to /ANT/Results).
This contains an index.html file that can be opened in any web browser: