Reporting
After running a test, the Test Runner will provide a detailed output of the tests you have run. You can also export a report of the test run.
Provar supports exporting test results into CSV or PDF format.
PDF reports have options for the following information:
- Screenshots
- Salesforce IDs
- Summary information
- Test summary based on Tags added to the tests
- Logging levels for output messages
To export a report, choose the export option in the Test Runner. From this screen you can specify the level of detail you would like in the PDF report, or select the CSV option if you would like to import the results into Excel.
Some additional configuration options are also available:
- Show Test Case Path Hierarchy: Show or hide the full path hierarchy. (Defaults to Show.)
- Configure Screenshot Size: Include screenshot thumbnails or full-size screenshots. (Defaults to thumbnails.)
If Show Test Case Path Hierarchy is enabled (the default), the run report includes the test case folder hierarchy. If it is disabled, the run report displays only the test case names.
If Screenshots (Thumbnails) is ticked (the default), the run report shows thumbnail captures. If Screenshots (Full) is ticked instead, the run report shows full-size captures.
Note: If both options are ticked, preference will be given to showing full screenshots.
The default location for the report is your project folder. To see the report file in the Navigator, right-click on the project and choose Refresh.
Customizing your report
This feature allows you to generate customized reports in CSV and PDF formats.
To use this feature, do the following:
- Create a folder named reports inside the src folder of the test project.
- Create a file named TestRunMonitorDemo.java. The code is listed below for reference.
- Add the .java file to the new reports folder.
```java
package reports;

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

import com.provar.core.testapi.ITestExecutionContext;
import com.provar.core.testapi.annotations.TestExecutionContext;
import com.provar.core.testapi.annotations.TestRunExporter;
import com.provar.testrunner.AbstractTestRunMonitor;
import com.provar.testrunner.ITestRunExporter;
import com.provar.testrunner.api.IExecutionItem;
import com.provar.testrunner.exporters.TestCaseReportingOptions;

public class TestRunMonitorDemo extends AbstractTestRunMonitor {

    // Exporters injected by Provar; the default exporter writes PDF,
    // and the mimetype selects the CSV exporter.
    @TestRunExporter
    private ITestRunExporter pdfExporter;

    @TestRunExporter(mimetype = "application/csv")
    private ITestRunExporter csvExporter;

    @TestExecutionContext
    private ITestExecutionContext testExec;

    String pdfPath;
    FileWriter fileWriter = null;
    String fileName = "ProvarTestRunExport";
    TestCaseReportingOptions testCaseReportingOptions =
            new TestCaseReportingOptions(true, false, true, true, true, true, true, false, true);
    File reportsDir;

    public void initialize() {
        // Create a Reports folder under the project if it does not exist yet.
        reportsDir = new File(testExec.getProjectPath() + File.separator + "Reports");
        if (!reportsDir.exists()) {
            reportsDir.mkdirs();
        }
        pdfPath = this.testExec.getProjectPath();
        try {
            // Overwrite the CSV file (append = false) and write the header row.
            fileWriter = new FileWriter(reportsDir.getAbsoluteFile() + File.separator + fileName + ".csv", false);
            fileWriter.write("run id, started, ended, path, name, successful, skipped, failed, total, failure Message\n");
            fileWriter.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void writeTestCaseExecution(IExecutionItem executionItem) {
        // Export a per-test-case PDF report.
        try {
            pdfExporter.exportExecutionItem(getRuntimeConfiguration(), pdfPath, executionItem,
                    getTestArtifactsPath(), testCaseReportingOptions, null, true, "pdf",
                    null, null, testExec.getRunId());
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Append (append = true) one CSV row for this test case.
        try {
            fileWriter = new FileWriter(reportsDir.getAbsoluteFile() + File.separator + fileName + ".csv", true);
            csvExporter.exportExecutionItem(null, null, executionItem, null, null, null, false, "csv",
                    fileWriter, ",", testExec.getRunId());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void writeTestRunSummary(IExecutionItem rootExecutionItem) {
        // Export a summary PDF for the whole run.
        try {
            pdfExporter.exportExecutionItem(getRuntimeConfiguration(),
                    reportsDir.getAbsoluteFile() + File.separator + "Summary.pdf", rootExecutionItem,
                    getTestArtifactsPath(), testCaseReportingOptions, null, false, "pdf",
                    fileWriter, ",", testExec.getRunId());
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Append the summary rows to the CSV file.
        try {
            fileWriter = new FileWriter(reportsDir.getAbsoluteFile() + File.separator + fileName + ".csv", true);
            csvExporter.exportExecutionSummary(rootExecutionItem, fileWriter, ",");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
The customized report includes the following information for each test:
- Start time
- End time
- Test location path
- Test case name
- Test status (successful, skipped, failed)
- Any environment variables which have been defined
Report location
The name and path of the generated report can be changed to a different location inside or outside the test project.
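As a sketch of how a configurable location could work, the snippet below uses a hypothetical helper (not part of the Provar API): an absolute path is used as-is, for example a folder outside the test project, while a relative path is resolved against the project path, as reportsDir is in TestRunMonitorDemo.

```java
import java.io.File;

public class ReportLocation {
    // Hypothetical helper (not part of the Provar API): resolve the reports
    // directory from a configurable path. Absolute paths are used as-is;
    // relative paths are resolved against the project path.
    public static File resolveReportsDir(String projectPath, String configuredPath) {
        File dir = new File(configuredPath);
        if (!dir.isAbsolute()) {
            dir = new File(projectPath, configuredPath);
        }
        dir.mkdirs(); // create the directory tree if it does not exist yet
        return dir;
    }

    public static void main(String[] args) {
        System.out.println(resolveReportsDir("/tmp/TestProject", "Reports"));
        System.out.println(resolveReportsDir("/tmp/TestProject", "/tmp/SharedReports"));
    }
}
```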
Data writing
You can also control whether the CSV file is rewritten at the beginning of the run or appended to after each test. In the code above this is determined by the boolean append argument passed to the FileWriter constructor.
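A minimal standalone sketch of this overwrite-versus-append behaviour, using only java.io (the file and row contents here are illustrative):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;

public class AppendDemo {
    // Write one CSV line. The append flag mirrors the second argument to the
    // FileWriter constructor in TestRunMonitorDemo: false truncates the file
    // (used once for the header), true appends (used for each test row).
    public static void writeLine(File csv, String line, boolean append) throws IOException {
        try (FileWriter fw = new FileWriter(csv, append)) {
            fw.write(line + "\n");
        }
    }

    public static void main(String[] args) throws IOException {
        File csv = File.createTempFile("ProvarTestRunExport", ".csv");
        writeLine(csv, "run id, started, ended, path, name, successful", false); // overwrite: header
        writeLine(csv, "1, 09:00, 09:01, /Tests, MyTest, true", true);           // append: one row
        System.out.println(Files.readAllLines(csv.toPath()).size()); // prints 2
        csv.delete();
    }
}
```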