As an open source project, the source of the GNSSTk is subject to intermittent updates, contributions, and corrections. The GNSSTk testing process has been redesigned to build confidence in the functionality of the toolkit. Testing within the GNSSTk is designed with the following distinct goals in mind:
- Testing is repeatable with minimal effort.
- Testing is distributed along with the source to support both internal testing and to assure outside users and contributors of the quality of the library.
- Testing is designed to accommodate easy additions to the existing test suite.
- Testing is implemented to ensure changes have not broken functionality.
All testing is performed using cmake/ctest. This allows the testing to function on all supported platforms.
The goal is to have some level of testing performed on all classes and applications in the GNSSTk core. In fact, some testing is a requirement for a class or application to be included in core. However, writing tests is encouraged for all developed code, whether it resides in core or ext.
In general, content in core is tested and stable. This stability has two sides. First, content in core is stable in the sense that the features currently expressed function according to the tests' expectations. Second, content in core is stable in the sense that any changes to its interfaces must be agreed upon by the community. The extent of testing required is therefore highly dependent on the needs of those who want the content in core. Most important is that the tests cover whatever functionality is vital to the software built on the toolkit. In this way the content is verified to function, and other software can use it with confidence that the interfaces will not shift with each release of the toolkit.
- Using build.sh
$ cd ~/git/gnsstk
$ build.sh -te
- Manually
$ cd ~/git/gnsstk/build
$ cmake .. -DTEST_SWITCH=ON
$ make
$ ctest
- Run ctest with the -V option, or build.sh with the -v option
- Examine the detailed log generated by ctest (does not require -V)
- build/Testing/Temporary/LastTest.log
- Write a C++ program in core/tests/... or ext/tests/...
- File name starts with class name and ends with _T before the ".cpp". For example, the test program for a class named foo would be called foo_T.cpp.
- Add any required data to the appropriate data directory. See Testing Data section for more detail.
- Modify the CMakeLists.txt to build and run the tests.
- Unit tests for a particular GNSSTk library class are organized in a single cpp file named after the class under test, with _T.cpp appended.
- Unit test files are kept in gnsstk/core/tests and gnsstk/ext/tests in the same subdirectories as in gnsstk/core/lib/ and gnsstk/ext/lib/.
- The individual cpp files are broken into two parts, a test class to test the GNSSTk library class and a main() segment to run those tests.
- The test class is organized into multiple public methods in which each method contains multiple assertions which test a particular feature of the GNSSTk library class under test.
- The test class might inherit from the GNSSTk library class in order to access protected members for direct checking of values.
- To facilitate reporting to the testing logs, GNSSTk uses its own TestUtil class. TestUtil provides standardized output containing information on the GNSSTk library class being tested, feature of class being tested, test file name, line number of test in that file, pass/fail bit, and a failure message should the test have failed. It also provides the number of failures to the main() portion of the test cpp file. The current style of using the TestUtil class is by use of its macros.
- The main() portion of the code creates the test class object and executes its methods. It then tallies the number of failures and reports it to the screen/log.
- Data for testing is located in the gnsstk/data directory. Only place data in there that is publicly releasable.
- The file build_config.h.in is configured by the cmake process to define some functions to allow C++ programs to find this data after they are compiled.
- The CMAKE variable GNSSTK_TEST_DATA_DIR can be used to find the data from cmake. It is defined in the top level CMakeLists.txt file.
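For illustration, a CMakeLists.txt fragment could hand this directory to a test script as follows. The SD/TD variable names match the rmwcheck example later in this document, but the application name myapp and the exact plumbing are hypothetical, not the toolkit's literal build files:

```cmake
# Sketch: pass the shared test data directory into a test script.
set( SD ${GNSSTK_TEST_DATA_DIR} )       # source/reference data
set( TD ${CMAKE_CURRENT_BINARY_DIR} )   # per-build output area

add_test(NAME myapp_Valid_1
         COMMAND ${CMAKE_COMMAND}
         -DTEST_PROG=$<TARGET_FILE:myapp>
         -DSOURCEDIR=${SD}
         -DTARGETDIR=${TD}
         -P ${CMAKE_CURRENT_SOURCE_DIR}/../testsuccexp.cmake)
```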
- The application tests utilize CMake scripts to run the GNSSTk applications with varying options and data.
- The tests are added to the CMakeLists.txt file in the application's corresponding subdirectory of gnsstk/core/tests/ or gnsstk/ext/tests/.
- When possible, utilize one of the shared .cmake files in gnsstk/core/tests/ in order to perform tests. The shared .cmake files include the following functionality:
- testhelp.cmake - Runs the application with various forms of help options to ensure they all work.
- testfailexp.cmake - Runs the application with given options and expects the application to exit with a non-zero code, but not a segmentation fault.
- testsuccexp.cmake - Runs the application with given options and expects the application to exit with code 0. It can also compare whole files.
- testsuccdiff.cmake - Runs the application with given options and then diffs the output of the application with a stored expected output.
- If the tested application's use case does not fit one of the above scripts, feel free to create a new one. Be sure to note in the script's comments why a new one was needed.
- Store the cmake script in core/tests/dir, where dir corresponds to the core/apps/dir where the program source resides. Name the file after the test being run.
- Add any required source or reference data to the appropriate data directory. See Testing Data section for more detail.
- Any outputs created should be named for the test that creates them. For instance, if a test, called rinex_creator, creates three files (one for each of the rinex filetypes), then the files should be named something like rinex_creator.robs, rinex_creator.rnav, and rinex_creator.rmet.
- Ensure any output files that are created are given unique names. CTest may run tests in parallel, which creates a race condition if two applications write to the same output file name.
- If a test needs output from another test, then specify the dependency by using the set_tests_properties CMake command in the CMakeLists.txt file. This would look something like set_tests_properties( test1 PROPERTIES DEPENDS test2 ).
- If a test needs multiple commands, they can be strung together with COMMAND statements in one execute_process.
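As a sketch of these last two points (the writer/reader test and target names are invented for illustration; note that when execute_process is given multiple COMMAND clauses, CMake pipes each command's standard output into the next):

```cmake
# In CMakeLists.txt: reader_test consumes a file produced by writer_test,
# so declare the dependency for CTest's parallel scheduler.
add_test(NAME writer_test
         COMMAND ${CMAKE_COMMAND}
         -DTEST_PROG=$<TARGET_FILE:writer>
         -P ${CMAKE_CURRENT_SOURCE_DIR}/../testsuccexp.cmake)
add_test(NAME reader_test
         COMMAND ${CMAKE_COMMAND}
         -DTEST_PROG=$<TARGET_FILE:reader>
         -P ${CMAKE_CURRENT_SOURCE_DIR}/../testsuccexp.cmake)
set_tests_properties( reader_test PROPERTIES DEPENDS writer_test )

# In a test script: two commands chained in one execute_process.
execute_process(COMMAND ${TEST_PROG} ${ARG_LIST}
                COMMAND sort
                RESULT_VARIABLE HAD_ERROR)
```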
Any input data should be placed in the gnsstk/data/inputs/ directory and follow this naming convention:
- File name should describe the key content of the data. For example, a v2.11 Rinex Obs file from day 360 of 2015 could be called robs.v2_11.doy360.yr2015.
- A file taken from a production system may keep its original name if the file is unchanged.
- Any inputs generated from existing input files should have the modification description appended to it. Using the example Rinex obs file as a base, a new input with only the first half of that day's data could be called robs.v2_11.doy360.yr2015.firsthalfday.
Any expected result data should be placed in the gnsstk/data/expected/ directory and follow this naming convention:
- File name should be the test's name ending with .exp. For example, expected output for a test called Foo_bar should be named Foo_bar.exp.
Also, any further sub-organization of the testing data is left to the developer's discretion, but make sure that the name of the grouped content is clear. For example, if testing an app called foo required one of each of the Rinex filetypes for a single day, all of the data could be grouped into a directory named for the day.
As of the October 2022 release, the test data for gnsstk is kept in an independent repository and managed via submodules.
If you wish to run the unit tests, new clones can be created with the --recurse-submodules option, e.g.
git clone --recurse-submodules $GITSERVER/gnsstk.git
For existing clones, you can set up the submodules using:
git fetch
git submodule update --init
Changes to the code repositories work the same as always. If you make changes to the test data repo, you must also make sure the code repositories' submodule references point to a consistent commit of that changed repo.
During development, you can use commands like the following to make temporary changes which will be reflected in the CI pipeline (replacing "feature/tks-XXX" with the branch name you want to use for the data repo changes):
~/src/gnsstk$ cd data
~/src/gnsstk/data$ git checkout -b "feature/tks-XXX" main
*edit, change, etc.*
~/src/gnsstk/data$ git commit -am "Made some changes"
~/src/gnsstk/data$ git push -u origin "feature/tks-XXX"
~/src/gnsstk/data$ cd ..
~/src/gnsstk$ git commit -am "Made some changes"
~/src/gnsstk$ git push
NOTE: this pushes a commit that changes the submodule commit reference to point at the above branch.
When the CI pipeline for the code repository (gnsstk) succeeds, you can merge the changes to the data repository (gnsstk-data). After those changes are merged, you will need to update the code submodule commit reference:
~/src/gnsstk$ git submodule update --recursive --remote
~/src/gnsstk$ git commit -am "Updated test data submodule reference"
~/src/gnsstk$ git push
The CI pipeline should complete successfully. Repeat this process in other repositories that use the gnsstk-data repo as a submodule (e.g. gnsstk-apps).
This illustrates one test of the rmwcheck application. The test verifies that the application fails when given a file that is not a RINEX Met file. It requires one file, arlm200a.15n, to be in the gnsstk/data directory.
This file is where the test script's parameters are set and the test is added to the CTest suite.
...
# check a valid RINEX Nav file (should fail, as it isn't a Met file)
add_test(NAME rmwcheck_Invalid_1
         COMMAND ${CMAKE_COMMAND}
         -DTEST_PROG=$<TARGET_FILE:rmwcheck>
         -DSOURCEDIR=${SD}
         -DTARGETDIR=${TD}
         -DNODIFF=1
         -DARGS=${SD}/arlm200a.15n
         -DGNSSTK_BINDIR=${GNSSTK_BINDIR}
         -P ${CMAKE_CURRENT_SOURCE_DIR}/../testfailexp.cmake)
...
This is the script that is run to execute the test.
# Generic test where failure is expected
# stick a space-separated argument list in ARGS
# Convert ARGS into a cmake list
IF(DEFINED ARGS)
    string(REPLACE " " ";" ARG_LIST ${ARGS})
ENDIF(DEFINED ARGS)

execute_process(COMMAND ${TEST_PROG} ${ARG_LIST}
                OUTPUT_QUIET
                ERROR_QUIET
                RESULT_VARIABLE HAD_ERROR)

if(HAD_ERROR EQUAL 0)
    message(FATAL_ERROR "Test failed")
endif()

if(HAD_ERROR STREQUAL "Segmentation fault")
    message(FATAL_ERROR "Test had a seg fault")
endif()
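For comparison, a success-with-diff script in the spirit of testsuccdiff.cmake might be sketched as follows. This is an illustration of the pattern, not the toolkit's actual script; OUT_FILE and EXP_FILE are invented parameter names:

```cmake
# Sketch: run the program, expect exit code 0, then compare its output
# against a stored expected file.
execute_process(COMMAND ${TEST_PROG} ${ARG_LIST}
                OUTPUT_FILE ${OUT_FILE}
                RESULT_VARIABLE HAD_ERROR)
if(HAD_ERROR)
    message(FATAL_ERROR "Test failed: exit code ${HAD_ERROR}")
endif()

# cmake -E compare_files returns non-zero when the files differ.
execute_process(COMMAND ${CMAKE_COMMAND} -E compare_files
                        ${OUT_FILE} ${EXP_FILE}
                RESULT_VARIABLE FILES_DIFFER)
if(FILES_DIFFER)
    message(FATAL_ERROR "Output does not match expected result")
endif()
```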
These files illustrate how library unit tests are added to the system. The example shown will be how the ValidType class's tests are created and run.
This snippet creates the executable to run the tests, links the executable to the gnsstk library, and adds the executable to be run as part of the test suite.
...
add_executable(ValidType_T ValidType_T.cpp)
target_link_libraries(ValidType_T gnsstk)
add_test(Utilities_ValidType ValidType_T)
...
This is the program where the individual unit tests are stored. The file is broken into two parts: the test class, and a main segment which instantiates the test class and runs its methods. Each of the test class's methods, shown below, contains multiple tests of a broader topic. For instance, methodTest exercises ValidType's methods.
#include "ValidType.hpp"
#include "TestUtil.hpp"
#include <iostream>
#include <string>
#include <sstream>
#include <cmath>

using namespace gnsstk;

class ValidType_T
{
public:
   ValidType_T() { eps = 1E-15; } // Default Constructor, set the precision value
   ~ValidType_T() {}              // Default Destructor

   int methodTest(void)
   {
      TUDEF("ValidType", "isValid");
      std::string failMesg;

      ValidType<float> vfloat0;
         // Is the invalid Valid object set as valid?
      TUASSERT(!vfloat0.is_valid());
         // Is the invalid Valid object's value 0?
      TUASSERTFE(0.0, vfloat0.get_value());

      ValidType<float> vfloat(5);
         // Does the get_value method return the correct value?
      TUASSERTFE(5.0, vfloat.get_value());
         // Is the valid Valid object set as valid?
      TUASSERT(vfloat.is_valid());

      vfloat.set_valid(false);
         // Was the valid Valid object correctly set to invalid?
      TUASSERT(!vfloat.is_valid());
      TURETURN();
   }

   int operatorTest(void)
   {
      TUDEF("ValidType", " == Operator");
      std::string failMesg;

      ValidType<float> Compare1(6.);
      ValidType<float> Compare2(6.);
      ValidType<float> Compare3(8.);
      ValidType<int> Compare4(6);
      ValidType<float> vfloat;

         // Are two equivalent objects equal?
      TUASSERT(Compare1 == Compare2);
         // Are two non-equivalent objects equal?
      TUASSERT(Compare1 != Compare3);

      vfloat = 7.;
      TUCSM(" = Operator");
         // Did the = operator store the value correctly?
      TUASSERT(vfloat.get_value() == 7.);
         // Did the = operator set the object as valid?
      TUASSERT(vfloat.is_valid());

      TUCSM(" += Operator");
      vfloat += 3.;
         // Did the += operator store the value correctly?
      TUASSERT(vfloat.get_value() == 10.);
         // Did the += operator change the object's valid bool?
      TUASSERT(vfloat.is_valid());

      TUCSM(" -= Operator");
      vfloat -= 5.;
         // Did the -= operator store the value correctly?
      TUASSERT(vfloat.get_value() == 5.);
         // Did the -= operator change the object's valid bool?
      TUASSERT(vfloat.is_valid());

      TUCSM(" << Operator");
      vfloat = 11;
      std::stringstream streamOutput;
      std::string stringOutput;
      std::string stringCompare;

      streamOutput << vfloat;
      stringOutput = streamOutput.str();
      stringCompare = "11";
         // Did the << operator output the valid object correctly?
      TUASSERT(stringCompare == stringOutput);

      streamOutput.str(""); // Resetting stream
      vfloat.set_valid(false);
      streamOutput << vfloat;
      stringOutput = streamOutput.str();
      stringCompare = "Unknown";
         // Did the << operator output the invalid object correctly?
      TUASSERT(stringCompare == stringOutput);
      TURETURN();
   }

private:
   double eps;
};

int main() // Main function to initialize and run all tests above
{
   int errorTotal = 0;
   ValidType_T testClass;

   errorTotal += testClass.methodTest();
   errorTotal += testClass.operatorTest();

   std::cout << "Total Failures for " << __FILE__ << ": " << errorTotal << std::endl;

   return errorTotal; // Return the total number of errors
}
The results are stored in the build directory's Testing/Temporary/LastTest.log file. Here are snippets of that file corresponding to the two examples.
...
187/205 Testing: rmwcheck_Invalid_1
187/205 Test: rmwcheck_Invalid_1
Command: "/usr/bin/cmake" "-DTEST_PROG=/home/nfitz/git/appUnitTesting/gnsstk/build/hpub5-moreAppUnitTests/core/apps/checktools/rmwcheck" "-DSOURCEDIR=/home/nfitz/git/appUnitTesting/gnsstk/data" "-DTARGETDIR=/home/nfitz/git/appUnitTesting/gnsstk/build/hpub5-moreAppUnitTests/Testing/Temporary" "-DNODIFF=1" "-DARGS=/home/nfitz/git/appUnitTesting/gnsstk/data/arlm200a.15n" "-DGNSSTK_BINDIR=" "-P" "/home/nfitz/git/appUnitTesting/gnsstk/core/tests/checktools/../testfailexp.cmake"
Directory: /home/nfitz/git/appUnitTesting/gnsstk/build/hpub5-moreAppUnitTests/core/tests/checktools
"rmwcheck_Invalid_1" start time: Jul 14 10:32 CDT
Output:
----------------------------------------------------------
<end of output>
Test time = 0.04 sec
----------------------------------------------------------
Test Passed.
"rmwcheck_Invalid_1" end time: Jul 14 10:32 CDT
"rmwcheck_Invalid_1" time elapsed: 00:00:00
----------------------------------------------------------
...
...
70/205 Testing: Utilities_ValidType
70/205 Test: Utilities_ValidType
Command: "/home/nfitz/git/appUnitTesting/gnsstk/build/hpub5-moreAppUnitTests/core/tests/Utilities/ValidType_T"
Directory: /home/nfitz/git/appUnitTesting/gnsstk/build/hpub5-moreAppUnitTests/core/tests/Utilities
"Utilities_ValidType" start time: Jul 14 10:32 CDT
Output:
----------------------------------------------------------
GNSSTkTest, Class=ValidType, Method=isValid, testFile=ValidType_T.cpp, testLine=23, subtest=1, failBit=0
GNSSTkTest, Class=ValidType, Method=isValid, testFile=ValidType_T.cpp, testLine=26, subtest=2, failBit=0
GNSSTkTest, Class=ValidType, Method=isValid, testFile=ValidType_T.cpp, testLine=31, subtest=3, failBit=0
GNSSTkTest, Class=ValidType, Method=isValid, testFile=ValidType_T.cpp, testLine=34, subtest=4, failBit=0
GNSSTkTest, Class=ValidType, Method=isValid, testFile=ValidType_T.cpp, testLine=39, subtest=5, failBit=0
GNSSTkTest, Class=ValidType, Method=== Operator, testFile=ValidType_T.cpp, testLine=56, subtest=1, failBit=0
GNSSTkTest, Class=ValidType, Method=== Operator, testFile=ValidType_T.cpp, testLine=59, subtest=2, failBit=0
GNSSTkTest, Class=ValidType, Method== Operator, testFile=ValidType_T.cpp, testLine=65, subtest=3, failBit=0
GNSSTkTest, Class=ValidType, Method== Operator, testFile=ValidType_T.cpp, testLine=68, subtest=4, failBit=0
GNSSTkTest, Class=ValidType, Method=+= Operator, testFile=ValidType_T.cpp, testLine=74, subtest=5, failBit=0
GNSSTkTest, Class=ValidType, Method=+= Operator, testFile=ValidType_T.cpp, testLine=77, subtest=6, failBit=0
GNSSTkTest, Class=ValidType, Method=-= Operator, testFile=ValidType_T.cpp, testLine=84, subtest=7, failBit=0
GNSSTkTest, Class=ValidType, Method=-= Operator, testFile=ValidType_T.cpp, testLine=87, subtest=8, failBit=0
GNSSTkTest, Class=ValidType, Method=<< Operator, testFile=ValidType_T.cpp, testLine=103, subtest=9, failBit=0
GNSSTkTest, Class=ValidType, Method=<< Operator, testFile=ValidType_T.cpp, testLine=114, subtest=10, failBit=0
Total Failures for /home/nfitz/git/appUnitTesting/gnsstk/core/tests/Utilities/ValidType_T.cpp: 0
<end of output>
Test time = 0.00 sec
----------------------------------------------------------
Test Passed.
"Utilities_ValidType" end time: Jul 14 10:32 CDT
"Utilities_ValidType" time elapsed: 00:00:00
----------------------------------------------------------
...