OGRE Wiki
Support and community documentation for Ogre3D
History: Visual Unit Testing Framework
Source of version: 12 (current)
!__Abstract__

This is the documentation for OGRE's visual unit testing framework. The framework allows you to perform image-based comparisons of test scenes between builds.

!__Introduction__

OGRE already uses [http://sourceforge.net/projects/cppunit/|CppUnit] for a selection of unit tests that cover the basics (vectors, string functions, and so on), but this handful of existing tests is far from comprehensive. Moreover, given the high degree of interdependency within the engine and its dependence on graphics APIs, traditional unit testing wouldn't be possible without a very complex testing setup involving mock rendersystems and a huge amount of testing code. Even if traditional unit testing were doable without significant commitment, it still wouldn't be especially helpful in the context of a rendering engine: you can't effectively decide "is the rendered image correct?" with assertions alone.

Given that the output of a rendering engine is an image, why not test it using just that? This framework makes testing possible by creating test scenes that can be screen-captured and compared between builds. Features can then be tested very simply: implement the feature in a simple test scene, and the framework does the rest; there's no need for assertions or elaborate test cases.

[https://bitbucket.org/RileyA/ogre-gsoc-testingframework/overview|Here] is the fork where you can find this project.

{maketoc}

!__Build Details__

Simply build with the __OGRE_BUILD_TESTS__ option enabled in CMake. Also note that test plugins are fully compatible with the sample browser.

!__Running Tests__

!!__Generating Reference Images__

Due to differences in drivers and so forth, it works best to generate a reference image set for each machine you will be testing with; this greatly reduces the chance of false positives due to driver issues.
The easiest way to generate reference image sets is as follows:

''__1.__ Build Ogre with the CMake option __OGRE_BUILD_TESTS__ enabled.''

''__2.__ Run the TestContext executable from the command line with the following options:''

{CODE(wrap="1", colors="c++")}TestContext.exe -r -d{CODE}

The __-r__ generates a reference set, and the __-d__ forces the config dialog to appear. Select your preferences from the dialog. A few notes:

* Lower resolutions are highly recommended; with over 100 tests, the time taken by the image comparison and the disk space used scale up quickly.
* VSync isn't recommended; it just slows down execution of the tests (they run for a set number of frames, and VSync locks the max framerate).
* Fullscreen isn't recommended; I haven't tested with it much, and it may produce different results than windowed mode.

''__3.__ Repeat step 2 for each render system you're using.''

!!__Testing__

Once you have reference image sets ready, you can run tests. The testing is integrated with Ogre's CMake system, so the easiest way of running tests is through your build environment:

__Using Unix Makefiles:__
* {CODE(wrap="1", colors="c++")}make test{CODE}

__Using Visual Studio:__
* Build the RUN_TESTS project

__Using NMake Makefiles (run from the "Visual Studio Command Prompt"):__
* {CODE(wrap="1", colors="c++")}nmake test{CODE}

For other build systems, consult the CMake/CTest documentation; in general there should be a "target", "project", or equivalent for running tests.

Alternatively, you can run the tests directly through CTest from the command line in your build directory (you may need to specify a build configuration):

{CODE(wrap="1", colors="c++")}ctest -C Release{CODE}

!__CDash and Automated Testing__

Using CMake and CTest, it's possible to upload build and test results to a CDash web dashboard.
!!__Experimental Builds__

The simplest way of running tests and uploading results to the dashboard is through an 'Experimental' target in your build system (with makefiles this'll be "make Experimental"; Visual Studio will have an "Experimental" target). As above, you can also run it directly through CTest:

{CODE(wrap="1", colors="c++")}ctest -C Release -D Experimental{CODE}

!!__CTest Scripts__

CTest allows you to use .cmake scripts to script the build/test process, which is useful for automated building. You can test with a script like so:

{CODE(wrap="1", colors="c++")}ctest -S script.cmake -V{CODE}

The -V enables verbose output (otherwise it's almost silent; -VV makes it even more verbose). See below for example Nightly and Continuous build scripts. __Note that the scripts I've provided use NMake in Windows, so you'll need to run them from the "Visual Studio Command Prompt".__

!!__Nightly Builds__

Nightly builds grab a snapshot of the source repository once a day, then build, test, and upload the results. You'll need to use your OS's scheduling system to run the script (see the CTest wiki entry for details on setting this up on various OS's). The script below is something of a template; there are a lot of machine-specific details that will need configuration, so read through it and set it up to match your needs:

{CODE(wrap="1", colors="c++")}
# Where to find OGRE's dependencies (very important in Windows, /usr should work in Unix)
set(DEPENDENCIES_DIR "...")

# determine home directory
if (WIN32)
    # ctest seems to choke on backslash path separators...
    string(REPLACE "\\" "/" HOME_DIRECTORY "$ENV{HOMEDRIVE}$ENV{HOMEPATH}")
elseif (UNIX)
    set(HOME_DIRECTORY "$ENV{HOME}")
endif ()

# choose where you want source/builds to go
set(CTEST_SOURCE_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/nightly/source")
set(CTEST_BINARY_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/nightly/build")

# set any additional build options you need here
set(BUILD_OPTIONS
    "OGRE_BUILD_TESTS=ON" # tests need to be enabled to do unit testing
    "OGRE_DEPENDENCIES_DIR=${DEPENDENCIES_DIR}" # set where to find dependencies
)

# Everything after here should (hopefully) work without any editing
#--------------------------------------------------------------

# site name will just be hostname
site_name(CTEST_SITE)

if(UNIX)
    # In Unix (including OSX, Cygwin, et al) use plain ol' makefiles
    set(CTEST_CMAKE_GENERATOR "Unix Makefiles")
    set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-make-gcc")
elseif(WIN32)
    # This must be run from the Visual Studio command prompt for CMake to find everything
    set(CTEST_CMAKE_GENERATOR "NMake Makefiles")
    set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-nmake-vs")
endif()

# Configuration (release will be quickest to run the tests)
set(CTEST_BUILD_CONFIGURATION "Release")

# Setup build options
set(CTEST_BUILD_OPTIONS "")
foreach (opt ${BUILD_OPTIONS})
    set(CTEST_BUILD_OPTIONS "${CTEST_BUILD_OPTIONS}-D${opt} ")
endforeach ()

# memcheck/coverage aren't set up (yet?)
set(WITH_MEMCHECK FALSE)
set(WITH_COVERAGE FALSE)

# look for mercurial executable
find_program(CTEST_HG_COMMAND NAMES hg)

# do a fresh clone if necessary
if(NOT EXISTS "${CTEST_SOURCE_DIRECTORY}")
    set(CTEST_CHECKOUT_COMMAND "${CTEST_HG_COMMAND} clone https://bitbucket.org/RileyA/ogre-gsoc-testingframework ${CTEST_SOURCE_DIRECTORY}")
endif()

# tell CTest how to update
set(CTEST_UPDATE_COMMAND "${CTEST_HG_COMMAND}")

# set this all up
set(CTEST_CONFIGURE_COMMAND "${CMAKE_COMMAND} -DCMAKE_BUILD_TYPE:STRING=${CTEST_BUILD_CONFIGURATION}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} ${CTEST_BUILD_OPTIONS}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"-G${CTEST_CMAKE_GENERATOR}\"")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"${CTEST_SOURCE_DIRECTORY}\"")

# do a full build, from an empty bin directory
set(CTEST_START_WITH_EMPTY_BINARY_DIRECTORY_ONCE 1)

# build and quit
ctest_start("Nightly")
ctest_update()
ctest_configure()
ctest_build()
ctest_test()
ctest_submit()

# manually call post-test cleanup script
exec_program("cmake -P ${CTEST_BINARY_DIRECTORY}/Tests/VisualTests/PostTest.cmake")
{CODE}

!!__Continuous Builds__

You can also run a continuous build setup that checks the repository every few minutes and does a build/test/upload whenever new changes are found. This script will probably take more setup work than the Nightly one, so be sure to look it over and configure it as needed:

{CODE(wrap="1", colors="c++")}
# Where to find OGRE's dependencies (very important in Windows, /usr should work in Unix)
set(DEPENDENCIES_DIR "...")

# How long to run in seconds
set(TIME_TO_RUN 43200) # default of 12hrs

# How long between checking for changes
set(TIME_BETWEEN_UPDATES 300) # default of 5mins

# determine home directory
if (WIN32)
    # ctest seems to choke on backslash path separators...
    string(REPLACE "\\" "/" HOME_DIRECTORY "$ENV{HOMEDRIVE}$ENV{HOMEPATH}")
elseif (UNIX)
    set(HOME_DIRECTORY "$ENV{HOME}")
endif ()

# choose where you want source/builds to go
set(CTEST_SOURCE_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/continuous/source")
set(CTEST_BINARY_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/continuous/build")

# set any additional build options you need here
set(BUILD_OPTIONS
    "OGRE_BUILD_TESTS=ON" # tests need to be enabled to do unit testing
    "OGRE_DEPENDENCIES_DIR=${DEPENDENCIES_DIR}" # set where to find dependencies
)

# Everything after here should (hopefully) work without any editing
#--------------------------------------------------------------

# site name will be hostname
site_name(CTEST_SITE)

if(UNIX)
    # In Unix (including OSX, Cygwin, et al) use plain ol' makefiles
    set(CTEST_CMAKE_GENERATOR "Unix Makefiles")
    set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-make-gcc")
elseif(WIN32)
    # In Windows, NMake seems to be the only way of automating this
    # This must be run from the "Visual Studio Command Prompt" for CMake to find everything
    set(CTEST_CMAKE_GENERATOR "NMake Makefiles")
    set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-nmake-vs")
endif()

# Configuration (release will be quickest to run the tests)
set(CTEST_BUILD_CONFIGURATION "Release")

# Setup build options
set(CTEST_BUILD_OPTIONS "")
foreach (opt ${BUILD_OPTIONS})
    set(CTEST_BUILD_OPTIONS "${CTEST_BUILD_OPTIONS}-D${opt} ")
endforeach ()

# memcheck/coverage aren't set up (yet?)
set(WITH_MEMCHECK FALSE)
set(WITH_COVERAGE FALSE)

# look for mercurial executable
find_program(CTEST_HG_COMMAND NAMES hg)

# do a fresh clone if necessary
if(NOT EXISTS "${CTEST_SOURCE_DIRECTORY}")
    set(CTEST_CHECKOUT_COMMAND "${CTEST_HG_COMMAND} clone https://bitbucket.org/RileyA/ogre-gsoc-testingframework ${CTEST_SOURCE_DIRECTORY}")
endif()

# tell CTest how to update
set(CTEST_UPDATE_COMMAND "${CTEST_HG_COMMAND}")

# set this all up
set(CTEST_CONFIGURE_COMMAND "${CMAKE_COMMAND} -DCMAKE_BUILD_TYPE:STRING=${CTEST_BUILD_CONFIGURATION}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} ${CTEST_BUILD_OPTIONS}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"-G${CTEST_CMAKE_GENERATOR}\"")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"${CTEST_SOURCE_DIRECTORY}\"")

# do a full build once
set(CTEST_START_WITH_EMPTY_BINARY_DIRECTORY_ONCE 1)
set(STARTED_CTEST FALSE)

# start continuous integration
while (${CTEST_ELAPSED_TIME} LESS ${TIME_TO_RUN})
    set(START_TIME ${CTEST_ELAPSED_TIME})

    # make sure this only gets called once for each build/test
    if(NOT STARTED_CTEST)
        ctest_start("Continuous")
        set(STARTED_CTEST TRUE)
    endif()

    ctest_update(RETURN_VALUE NUM_UPDATED)

    # if files were updated: build, test and submit
    if (${NUM_UPDATED} GREATER 0)
        ctest_configure()
        ctest_build()
        ctest_test()
        ctest_submit()
        # manually call post-test cleanup script
        exec_program("cmake -P ${CTEST_BINARY_DIRECTORY}/Tests/VisualTests/PostTest.cmake")
        set(STARTED_CTEST FALSE)
    else ()
        exec_program("cmake -E echo \"No Updates found, waiting...\"")
    endif ()

    # wait before checking again
    ctest_sleep(${START_TIME} ${TIME_BETWEEN_UPDATES} ${CTEST_ELAPSED_TIME})
endwhile()
{CODE}

!__TestContext Command Line Options__

Note that this uses Ogre's built-in [http://www.ogre3d.org/docs/api/html/group__General.html#gad78e25bde5597796c07e75d4f857a3cd|findCommandLineOpts], which does not allow combining of options (i.e.
foo -abc bar would have to be written as foo -a -b -c bar).

__-h or --help__ Usage details.

__-r__ Generate a reference set.

__-d__ Force the config dialog to appear.

__--no-html__ Suppress HTML output.

__--nograb__ Do not restrict the mouse to the window (WARNING: may affect test results).

__-rs "[[Render System]"__ Specify the rendersystem.

__-m "[[comment]"__ Add an optional comment to be associated with the generated image set.

__-ts "[[test set name]"__ Select the test set to use (default is 'VTests').

__-c "[[image set name to compare against]"__ Select which image set you want to compare this run with (default is 'Reference').

__-n "[[name]"__ Specify a name for this set (omitting this, or choosing 'AUTO', will result in an automatically generated name).

__-o "[[path]"__ Generate a summary file at the specified path for the test results (this is used by CTest).

!__The Output__

Whenever a set is created, the test images themselves, along with a small config file containing data about the set (resolution, date/time, name, etc.), are written to a new directory (see ((Visual Unit Testing Framework|#Where to Find the Output|below)) for details on the directory structure). The primary output is an HTML document containing an overview of the test run, with side-by-side images from the reference set and the newly generated set. A small linked JavaScript file allows some basic diffing to be done within a web browser (requires HTML5/Canvas). [http://rileyadams.net/gsoc/July5/out.html|Here] is a sample of the HTML output.

!!__Where to Find the Output__

Output is generated in the same directory as the logs and cfg's for the sample browser. This is generally in your My Documents or home directory (or your OS's equivalent). There should be an Ogre directory, with a subdirectory for the Ogre version. From there the structure looks like:

*VisualTests
** [[Test set name]
*** [[Rendersystem]
**** out.html
**** [[Reference]
***** info.cfg
***** Reference screenshots (.png's)...
**** [[Test set name]_[[date]
***** info.cfg
***** Screenshots (.png's)...

!__Image Comparison__

The images are compared to reference images using a selection of common metrics. A failed test will report values for the following metrics in the HTML output:

The most basic is the __absolute difference__: how many pixels differ between the two images.

Next is the __Mean Squared Error (MSE)__, which, as the name suggests, is the average squared error (per-pixel difference between the images). Lower is better.

Next is the __Peak Signal-to-Noise Ratio (PSNR)__, which measures the ratio between the maximum possible signal (in this case, full colour values in each channel) and the corrupting noise (the differences between the images). Higher values of this metric are better.

Last is the __Structural Similarity (SSIM) index__, a more recent development (see [https://ece.uwaterloo.ca/~z70wang/publications/ssim.html|this] 2004 paper for in-depth details) that aims to provide a metric better aligned with human perception (images with identical MSE may actually be of very different perceived quality). It gives a value in the range -1 to 1, with 1 meaning identical.

!__Creating New Tests__

The testing framework is built on top of the existing sample framework, so creating a test is very similar to creating a sample. Tests are created in plugins that the TestContext is able to load dynamically.

!!__General__

Create a class derived from VisualTest, override whichever functions you need (the same FrameListener-style functions used in Samples apply here), and add it to a test plugin.

__Some Things to Note:__

* You will need to specify when you want test screenshot(s) to be taken, with addScreenshotFrame (timing is done by frame to prevent floating point issues).
* Tests must be deterministic, so use the delta time (time since last frame) passed to the frameStarted/frameEnded functions for any timing needs.
* Keep tests simple; the idea is to isolate and test a single feature as completely as possible.
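The metrics described under __Image Comparison__ above (absolute difference, MSE, PSNR) can be sketched in a few lines of Python. This is a simplified illustration on made-up pixel data, not the framework's actual implementation; SSIM is omitted for brevity:

```python
import math

def pixel_metrics(img_a, img_b, max_value=255):
    """Compare two equally-sized images given as flat lists of channel values.

    Returns (absolute difference count, MSE, PSNR). This is a sketch of the
    metrics described in this article, not the framework's own code.
    """
    # absolute difference: count of channel values that differ at all
    diff_count = sum(1 for a, b in zip(img_a, img_b) if a != b)
    # MSE: average of the squared per-channel differences (lower is better)
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    # PSNR: ratio of peak signal to noise, in dB (higher is better);
    # identical images have zero error, hence infinite PSNR
    psnr = float("inf") if mse == 0 else 10 * math.log10(max_value ** 2 / mse)
    return diff_count, mse, psnr

# Example: two 2x2 single-channel "images" differing in one pixel
ref = [10, 20, 30, 40]
test = [10, 20, 30, 44]
count, mse, psnr = pixel_metrics(ref, test)
# count == 1, mse == 4.0
```

Real comparisons would of course operate on the per-channel data of the captured .png screenshots rather than toy lists, but the arithmetic is the same.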
!!__Defining Test Sets__

Test sets (groupings of tests that will be generated and compared together) are defined as collections of test plugins in the 'tests.cfg' file. Plugins can belong to more than one test set. An example tests.cfg configuration is below:

{CODE(wrap="1", colors="c#")}
# where the test plugins are located
TestFolder=[Ogre lib dir]

# A set of all visual tests
[VTests]
TestPlugin=PlayPenTests
TestPlugin=VTests

# Only the playpen tests
[Playpen]
TestPlugin=PlayPenTests
{CODE}

!__Tests__

The following are the initial tests being used with the system:

{FANCYTABLE(head="Tests")}
PlayPen_ManualLOD|PlayPen_ManualLODFromFile|PlayPen_ManualBlend
PlayPen_ProjectSphere|PlayPen_CameraSetDirection|PlayPen_MorphAnimationWithNormals
PlayPen_MorphAnimationWithoutNormals|PlayPen_PoseAnimationWithNormals|PlayPen_PoseAnimationWithoutNormals
PlayPen_SceneNodeTracking|PlayPen_StencilGlow|PlayPen_TransparencyMipMaps
PlayPen_BasicPlane|PlayPen_MultiViewports|PlayPen_Distortion
PlayPen_AttachObjectsToBones|PlayPen_Ortho|PlayPen_StencilShadows
PlayPen_2Spotlights|PlayPen_LotsAndLotsOfEntities|PlayPen_StaticGeometry
PlayPen_StaticGeometryWithLOD|PlayPen_BillboardTextureCoords|PlayPen_ReflectedBillboards
PlayPen_ManualObjectNonIndexed|PlayPen_ManualObjectNonIndexedUpdateSmaller|PlayPen_ManualObjectNonIndexedUpdateLarger
PlayPen_ManualObjectIndexed|PlayPen_ManualObjectIndexedUpdateSmaller|PlayPen_ManualObjectIndexedUpdateLarger
PlayPen_BillboardChain|PlayPen_CubeDDS|PlayPen_Dxt1
PlayPen_Dxt1FromMemory|PlayPen_Dxt1Alpha|PlayPen_Dxt3
PlayPen_Dxt3FromMemory|PlayPen_Dxt5|PlayPen_RibbonTrail
PlayPen_BlendDiffuseColour|PlayPen_CustomProjectionMatrix|PlayPen_PointSprites
PlayPen_BillboardAccurateFacing|PlayPen_MultiSceneManagersSimple|PlayPen_NegativeScale
PlayPen_SRGBtexture|PlayPen_LightScissoring|PlayPen_LightClipPlanes
PlayPen_LightClipPlanesMoreLights|PlayPen_MaterialSchemes|PlayPen_BuildTangentOnAnimatedMesh
PlayPen_BillboardOrigins|PlayPen_DepthBias|PlayPen_16Textures
PlayPen_FarFromOrigin|PlayPen_AlphaToCoverage|PlayPen_BlitSubTextures
PlayPen_ImageCombine|PlayPen_WindowedViewportMode|PlayPen_Bsp
PlayPen_SkeletalAnimation|PlayPen_SubEntityVisibility|PlayPen_SkeletonAnimationOptimise
PlayPen_TextureShadows|PlayPen_TextureShadowsIntegrated|PlayPen_TextureShadowsIntegratedPSSM
PlayPen_TextureShadowsCustomCasterMat|PlayPen_TextureShadowsCustomReceiverMat|PlayPen_ManualObject2D
PlayPen_LiSPSM|PlayPen_MaterialSchemesWithLOD|PlayPen_MaterialSchemesWithMismatchedLOD
PlayPen_ClearScene|PlayPen_StencilShadowsMixedOpSubMeshes|PlayPen_ManualIlluminationStage
PlayPen_Projection|PlayPen_CompositorTextureShadows|PlayPen_CompositorTechniqueSwitch
PlayPen_ManualBoneMovement|PlayPen_IntersectionSceneQuery|PlayPen_RaySceneQuery
PlayPen_ReloadResources|PlayPen_SuppressedShadows|PlayPen_DepthShadowMap
PlayPen_TextureShadowsTransparentCaster|PlayPen_ViewportNoShadows|PlayPen_SpotlightViewProj
PlayPen_NormalMapMirroredUVs|PlayPen_SerialisedColour|PlayPen_ShadowLod
PlayPen_MaterialSchemesListener|PlayPen_ReinitialiseEntityAlteredMesh|PlayPen_InfiniteAAB
PlayPen_GeometryShaders|PlayPen_VertexTexture|PlayPen_NonUniqueResourceNames
ParticleTest|StencilShadowTest|TransparencyTest
CubeMappingTest|TextureEffectsTest
{FANCYTABLE}

!__Known Issues__

* I haven't tested with OSX.
* I haven't tested the Direct3D 11 render system.
* NMake doesn't support precompiled headers at the moment, and this seems to cause at least one test to produce different results than regular VC++.

!__Future Improvements__

* More tests!
* A more robust image comparison algorithm (e.g. something along the lines of [http://pdiff.sourceforge.net/|PerceptualDiff]).
* Integrate the CppUnit tests with the CTest testing setup.
* More testing dashboard features (better Mercurial integration, emailing committers when tests fail as a result of their changes, etc., depending on how much CDash allows for).
!__Additional Links__

* [http://www.ogre3d.org/forums/viewtopic.php?f=13&t=63582|Summer of Code forum thread]
* [http://aras-p.info/blog/2007/07/31/testing-graphics-code/|Unity3d's test framework]
* [https://bitbucket.org/RileyA/ogre-gsoc-testingframework/overview|Bitbucket fork]