Abstract

This is the documentation for OGRE's visual unit testing framework. The framework allows you to perform image-based comparisons of test scenes between builds.

Introduction

OGRE already uses CppUnit for a selection of unit tests that cover the basics: Vectors, String functions, and so on; however, this handful of existing tests is far from comprehensive. Moreover, given the engine's high degree of interdependency and its dependence on graphics APIs and the like, traditional unit testing really wouldn't be possible without a very complex testing setup involving mock rendersystems and a huge amount of testing code.

Even if traditional unit testing were doable without significant commitment, it still wouldn't be especially helpful in the context of a rendering engine (you really can't effectively decide "Is the rendered image correct?" with just assertions and so forth).

Given that the output of a rendering engine is an image, why not test it using just that? This framework aims to make testing possible by creating test scenes that can be screen-captured and compared between builds. This allows for features to be tested very simply (just implement the feature in a simple test scene, and the framework does the rest; there's no need for assertions or elaborate test cases).

The project is developed in a fork of OGRE: https://bitbucket.org/RileyA/ogre-gsoc-testingframework

Build Details

Simply build with the OGRE_BUILD_TESTS option enabled in CMake. Also, note that test plugins are fully compatible with the sample browser.

Running Tests

Generating Reference Images

Due to differences in drivers and so forth, it works best to generate a reference image set for each machine you will be testing with; this greatly reduces the chance of false positives caused by driver issues.

The easiest way to generate reference image sets is as follows:

1. Build Ogre with the CMake option OGRE_BUILD_TESTS enabled.
2. Run the TestContext executable from the command line with the following options:

TestContext.exe -r -d

The -r generates a reference set, and the -d forces the config dialog to appear. Select your preferences from the dialog.

A few notes:

  • Lower resolutions are highly recommended; with over 100 tests, the time needed for image comparison and the disk space used scale up quickly.
  • VSync isn't recommended; it just slows down execution of the tests (since they run for a set number of frames, and VSync caps the framerate).
  • Fullscreen isn't recommended; I haven't tested with it much, and it may produce different results than windowed mode.

3. Repeat step 2 for each render system you're using.

Testing

Once you have reference image sets ready, you can run tests.

The testing is integrated with Ogre's CMake system, so the easiest way of running tests is through your build environment:

Using Unix Makefiles:

  • make test

Using Visual Studio:

  • Build the RUN_TESTS project

Using NMake Makefiles: (run from the "Visual Studio Command Prompt")

  • nmake test

For other build systems, consult the CMake/CTest docs; in general there should be a "target", "project", or equivalent for running tests.

Alternatively, you can invoke CTest directly from the command line in your build directory (you may need to specify a build configuration):

ctest -C Release

CDash and Automated Testing

Using CMake and CTest, it's possible to upload build and test results to a CDash web dashboard.

Experimental Builds

The simplest way of running tests and uploading results to the dashboard is through an 'Experimental' target in your build system (with makefiles this will be "make Experimental"; Visual Studio will have an "Experimental" project).

As above, you can also invoke CTest directly:

ctest -C Release -D Experimental

CTest Scripts

CTest allows you to use .cmake scripts to drive the build/test process, which is useful for automated builds. You can test with a script like so:

ctest -S script.cmake -V

The -V sets verbose output (otherwise it's almost silent; -VV makes it even more verbose).

See below for example Nightly and Continuous build scripts.

Note that the scripts I've provided use NMake in Windows, so you'll need to run them from the "Visual Studio Command Prompt".

Nightly Builds

Nightly builds grab a snapshot of the source repository once a day, then build, test, and upload results.

You'll need to use your OS's scheduling system to run the script (see the CTest wiki for details on setting this up on various operating systems).

Here's the script. It's something of a template; a number of specific details will need configuration, so read through it and set it up to match your needs:

# Where to find OGRE's dependencies (very important in Windows, /usr should work in Unix)
set(DEPENDENCIES_DIR "...")

# determine home directory
if (WIN32)
  # ctest seems to choke on backslash path separators...
  string(REPLACE "\\" "/" HOME_DIRECTORY "$ENV{HOMEDRIVE}$ENV{HOMEPATH}")
elseif (UNIX)
  set(HOME_DIRECTORY "$ENV{HOME}")
endif ()

# choose where you want source/builds to go
set(CTEST_SOURCE_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/nightly/source")
set(CTEST_BINARY_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/nightly/build")

# set any additional build options you need here
set(BUILD_OPTIONS 
  "OGRE_BUILD_TESTS=ON" # tests need to be enabled to do unit testing
  "OGRE_DEPENDENCIES_DIR=${DEPENDENCIES_DIR}" # set where to find dependencies
  )

# Everything after here should (hopefully) work without any editing
#--------------------------------------------------------------

# site name will just be hostname
site_name(CTEST_SITE)

if(UNIX)
  # In Unix (including OSX, Cygwin, et al) use plain ol' makefiles
  set(CTEST_CMAKE_GENERATOR "Unix Makefiles")
  set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-make-gcc")
elseif(WIN32)
  # This must be run from the Visual Studio command prompt for CMake to find everything
  set(CTEST_CMAKE_GENERATOR "NMake Makefiles")
  set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-nmake-vs")
endif()

# Configuration (release will be quickest to run the tests)
set(CTEST_BUILD_CONFIGURATION "Release")

# Setup build options
set(CTEST_BUILD_OPTIONS "")
foreach (opt ${BUILD_OPTIONS})
  set(CTEST_BUILD_OPTIONS "${CTEST_BUILD_OPTIONS}-D${opt} ")
endforeach ()

# memcheck/coverage aren't set up (yet?)
set(WITH_MEMCHECK FALSE)
set(WITH_COVERAGE FALSE)

# look for mercurial executable
find_program(CTEST_HG_COMMAND NAMES hg)

# do a fresh clone if necessary
if(NOT EXISTS "${CTEST_SOURCE_DIRECTORY}")
  set(CTEST_CHECKOUT_COMMAND "${CTEST_HG_COMMAND} clone https://bitbucket.org/RileyA/ogre-gsoc-testingframework ${CTEST_SOURCE_DIRECTORY}")
endif()

# tell CTest how to update
set(CTEST_UPDATE_COMMAND "${CTEST_HG_COMMAND}")

# set this all up
set(CTEST_CONFIGURE_COMMAND "${CMAKE_COMMAND} -DCMAKE_BUILD_TYPE:STRING=${CTEST_BUILD_CONFIGURATION}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} ${CTEST_BUILD_OPTIONS}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"-G${CTEST_CMAKE_GENERATOR}\"")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"${CTEST_SOURCE_DIRECTORY}\"")

# do a full build, from an empty bin directory
set(CTEST_START_WITH_EMPTY_BINARY_DIRECTORY_ONCE 1)

# build and quit
ctest_start("Nightly")
ctest_update()
ctest_configure()
ctest_build()
ctest_test()
ctest_submit()
# manually call post-test cleanup script
exec_program("cmake -P ${CTEST_BINARY_DIRECTORY}/Tests/VisualTests/PostTest.cmake")

Continuous Builds

You can also run a continuous build setup that checks the repository every few minutes and does a build/test/upload whenever new changes are found.

This script will probably take more setup work than the Nightly one, so be sure you look over it and configure as needed:

# Where to find OGRE's dependencies (very important in Windows, /usr should work in Unix)
set(DEPENDENCIES_DIR "...")

# How long to run in seconds
set(TIME_TO_RUN 43200) # default of 12hrs
# How long between checking for changes
set(TIME_BETWEEN_UPDATES 300) # default of 5mins

# determine home directory
if (WIN32)
  # ctest seems to choke on backslash path separators...
  string(REPLACE "\\" "/" HOME_DIRECTORY "$ENV{HOMEDRIVE}$ENV{HOMEPATH}")
elseif (UNIX)
  set(HOME_DIRECTORY "$ENV{HOME}")
endif ()

# choose where you want source/builds to go
set(CTEST_SOURCE_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/continuous/source")
set(CTEST_BINARY_DIRECTORY "${HOME_DIRECTORY}/dashboards/ogre/continuous/build")

# set any additional build options you need here
set(BUILD_OPTIONS 
  "OGRE_BUILD_TESTS=ON" # tests need to be enabled to do unit testing
  "OGRE_DEPENDENCIES_DIR=${DEPENDENCIES_DIR}" # set where to find dependencies
  )

# Everything after here should (hopefully) work without any editing
#--------------------------------------------------------------

# site name will be hostname
site_name(CTEST_SITE)

if(UNIX)
  # In Unix (including OSX, Cygwin, et al) use plain ol' makefiles
  set(CTEST_CMAKE_GENERATOR "Unix Makefiles")
  set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-make-gcc")
elseif(WIN32)
  # In Windows, NMake seems to be the only way of automating this
  # This must be run from the "Visual Studio Command Prompt" for CMake to find everything
  set(CTEST_CMAKE_GENERATOR "NMake Makefiles")
  set(CTEST_BUILD_NAME "${CMAKE_SYSTEM_NAME}-nmake-vs")
endif()

# Configuration (release will be quickest to run the tests)
set(CTEST_BUILD_CONFIGURATION "Release")

# Setup build options
set(CTEST_BUILD_OPTIONS "")
foreach (opt ${BUILD_OPTIONS})
  set(CTEST_BUILD_OPTIONS "${CTEST_BUILD_OPTIONS}-D${opt} ")
endforeach ()

# memcheck/coverage aren't set up (yet?)
set(WITH_MEMCHECK FALSE)
set(WITH_COVERAGE FALSE)

# look for mercurial executable
find_program(CTEST_HG_COMMAND NAMES hg)

# do a fresh clone if necessary
if(NOT EXISTS "${CTEST_SOURCE_DIRECTORY}")
  set(CTEST_CHECKOUT_COMMAND "${CTEST_HG_COMMAND} clone https://bitbucket.org/RileyA/ogre-gsoc-testingframework ${CTEST_SOURCE_DIRECTORY}")
endif()

# tell CTest how to update
set(CTEST_UPDATE_COMMAND "${CTEST_HG_COMMAND}")

# set this all up
set(CTEST_CONFIGURE_COMMAND "${CMAKE_COMMAND} -DCMAKE_BUILD_TYPE:STRING=${CTEST_BUILD_CONFIGURATION}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} ${CTEST_BUILD_OPTIONS}")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"-G${CTEST_CMAKE_GENERATOR}\"")
set(CTEST_CONFIGURE_COMMAND "${CTEST_CONFIGURE_COMMAND} \"${CTEST_SOURCE_DIRECTORY}\"")

# do a full build once
set(CTEST_START_WITH_EMPTY_BINARY_DIRECTORY_ONCE 1)

set(STARTED_CTEST FALSE)

# start continuous integration
while (${CTEST_ELAPSED_TIME} LESS ${TIME_TO_RUN})
  set (START_TIME ${CTEST_ELAPSED_TIME})

  # make sure this only gets called once for each build/test
  if(NOT STARTED_CTEST)
    ctest_start("Continuous")
    set(STARTED_CTEST TRUE)
  endif()

  ctest_update(RETURN_VALUE NUM_UPDATED)

  # if files were updated: build, test and submit
  if (${NUM_UPDATED} GREATER 0)
    ctest_configure()
    ctest_build()
    ctest_test()
    ctest_submit()
    # manually call post-test cleanup script
    exec_program("cmake -P ${CTEST_BINARY_DIRECTORY}/Tests/VisualTests/PostTest.cmake")
    set(STARTED_CTEST FALSE)
  else ()
    exec_program("cmake -E echo \"No Updates found, waiting...\"")
  endif ()

  # wait before checking again
  ctest_sleep( ${START_TIME} ${TIME_BETWEEN_UPDATES} ${CTEST_ELAPSED_TIME})

endwhile()

TestContext Command Line Options

Note that TestContext uses Ogre's built-in findCommandLineOpts, which does not allow combining options (e.g. foo -abc bar would have to be written as foo -a -b -c bar).
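To illustrate why options can't be combined, here is a toy parser written in the spirit of findCommandLineOpts (a hedged sketch, not Ogre's actual C++ implementation; the function name find_opts and its signature are made up for this example). Each token is matched whole, so "-rd" is not expanded into "-r -d":

```python
def find_opts(argv, unary_opts, binary_opts):
    """Toy whole-token option parser. unary_opts are flags; binary_opts
    consume the following token as their value."""
    unary = {opt: False for opt in unary_opts}
    binary = {}
    it = iter(argv)
    for tok in it:
        if tok in unary:
            unary[tok] = True               # flag seen
        elif tok in binary_opts:
            binary[tok] = next(it, "")      # next token is the value
        # unrecognized tokens (like a combined "-rd") are simply ignored
    return unary, binary
```

For example, find_opts(["-r", "-d"], {"-r", "-d"}, set()) sets both flags, while find_opts(["-rd"], {"-r", "-d"}, set()) sets neither.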

-h or --help
Usage details.

-r
Generate a reference set.

-d
Force the config dialog to appear.

--no-html
Suppress html output.

--nograb
Do not restrict mouse to window (WARNING: may affect test results).

-rs "[Render System]"
Specify the rendersystem.

-m "[comment]"
Add an optional comment to be associated with the generated image set.

-ts "[test set name]"
Select the test set to use (default is 'VTests').

-c "[image set name to compare against]"
Select which image set you want to compare this run with (default is 'Reference').

-n "[name]"
Specify a name for this set (omitting this, or choosing 'AUTO' will result in an automatically generated name).

-o "[path]"
Generate a summary file at the specified path for the test results (this is used for CTest).

The Output

Whenever a set is created, the test images themselves, along with a small config file containing data about the set (resolution, date/time, name, etc.), are created in a new directory (see below for details on directory structure).

The primary output is an HTML document containing an overview of the test and side-by-side images from the reference set and the newly generated set. A small linked JavaScript file allows some basic diffing to be done within a web browser (requires HTML5/Canvas).

Here is a sample of the HTML output.

Where to Find the Output

Output is generated in the same directory as the logs and .cfg files for the sample browser. This is generally in your My Documents or home directory (or your OS's equivalent). There should be an Ogre directory with a subdirectory for the Ogre version. From there the structure looks like:

  • VisualTests
    • [Test set name]
      • [Rendersystem]
        • out.html
        • [Reference]
          • info.cfg
          • Reference screenshots (.png's)...
        • [Test set name]_[date]
          • info.cfg
          • Screenshots (.png's)...

Image Comparison

The images are compared to reference images using a selection of common metrics. A failed test will report values for the following metrics in the HTML output:

The most basic is the absolute difference: how many pixels differ between the two images.

Next is the Mean Squared Error (MSE), which, as the name suggests, is the average squared error (per-pixel difference) between the images. Lower is better.

Next is the Peak Signal-to-Noise Ratio (PSNR), which measures the ratio between the maximum possible signal (in this case, full color values in each channel) and the corrupting noise (the differences between the images). Higher values of this metric are better.

Last is the Structural Similarity (SSIM) index, a more recent development (see the original 2004 paper for in-depth details) that aims to provide a metric better correlated with human perception (images with identical MSE may actually be of very different quality levels). It gives a value in the range -1 to 1, with 1 being identical.
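The metrics above can be sketched in a few lines of Python. This is an illustration only, not the framework's actual C++ implementation, and the SSIM shown is a simplified single-window variant (the real metric averages local windows across the image):

```python
import math

def abs_diff(a, b):
    """Count of pixels that differ at all (images as flat lists of 0-255 values)."""
    return sum(1 for x, y in zip(a, b) if x != y)

def mse(a, b):
    """Mean Squared Error: average squared per-pixel difference. Lower is better."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    """Peak Signal-to-Noise Ratio in dB. Higher is better; identical images give infinity."""
    m = mse(a, b)
    return math.inf if m == 0 else 10 * math.log10(peak ** 2 / m)

def ssim_global(a, b, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Simplified single-window SSIM; ranges from -1 to 1, with 1 meaning identical."""
    n = len(a)
    mu_a, mu_b = sum(a) / n, sum(b) / n
    var_a = sum((x - mu_a) ** 2 for x in a) / n
    var_b = sum((y - mu_b) ** 2 for y in b) / n
    cov = sum((x - mu_a) * (y - mu_b) for x, y in zip(a, b)) / n
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))
```

For instance, for a = [0, 0, 0, 0] and b = [0, 0, 0, 10], one pixel differs, the MSE is 25.0, and the PSNR is about 34.15 dB.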

Creating New Tests

The testing framework is built on top of the existing sample framework, so it is very similar to creating a sample. Tests are created in plugins that the TestContext is able to load dynamically.

General

Create a class derived from VisualTest, override whichever functions you need (the same FrameListener-style functions used in Samples apply here), and add it to a test plugin.

Some Things to Note:

  • You will need to specify when you want test screenshot(s) to be taken, with addScreenshotFrame (timing is done by frame to prevent floating point issues).
  • Tests must be deterministic, so use the delta time (time since the last frame) passed to the frameStarted/frameEnded functions for any timing needs.
  • Keep tests simple; the idea is to isolate and test a single feature as completely as possible.
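The first two notes can be sketched as follows. This is a hypothetical Python illustration (the real tests are C++ classes; run_test and screenshot_frames here are invented for the example, while addScreenshotFrame is the actual framework call being mimicked):

```python
def run_test(total_frames, screenshot_frames, fixed_delta=1.0 / 60.0):
    """Simulate a test run: animation advances by the per-frame delta, but
    screenshot captures are triggered by integer frame index, so every run
    captures exactly the same frames regardless of float accumulation."""
    captured, elapsed = [], 0.0
    for frame in range(1, total_frames + 1):
        elapsed += fixed_delta            # animate from delta time, never wall clock
        if frame in screenshot_frames:    # schedule captures by frame number
            captured.append(frame)
    return captured
```

A run of 600 frames capturing at frames 250 and 500 will always produce the same two screenshots, which is what makes the image comparison meaningful.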

Defining Test Sets

Test sets (a grouping of tests that will be generated and compared together) are defined as a collection of test plugins in the 'tests.cfg' file. Plugins can belong to more than one test set.

An example tests.cfg is below:

# where the test plugins are located
TestFolder=[Ogre lib dir]

# A set of all visual tests
[VTests]
TestPlugin=PlayPenTests
TestPlugin=VTests

# Only the playpen tests
[Playpen]
TestPlugin=PlayPenTests

Tests

The following are the initial tests being used with the system:

PlayPen_ManualLOD PlayPen_ManualLODFromFile PlayPen_ManualBlend
PlayPen_ProjectSphere PlayPen_CameraSetDirection PlayPen_MorphAnimationWithNormals
PlayPen_MorphAnimationWithoutNormals PlayPen_PoseAnimationWithNormals PlayPen_PoseAnimationWithoutNormals
PlayPen_SceneNodeTracking PlayPen_StencilGlow PlayPen_TransparencyMipMaps
PlayPen_BasicPlane PlayPen_MultiViewports PlayPen_Distortion
PlayPen_AttachObjectsToBones PlayPen_Ortho PlayPen_StencilShadows
PlayPen_2Spotlights PlayPen_LotsAndLotsOfEntities PlayPen_StaticGeometry
PlayPen_StaticGeometryWithLOD PlayPen_BillboardTextureCoords PlayPen_ReflectedBillboards
PlayPen_ManualObjectNonIndexed PlayPen_ManualObjectNonIndexedUpdateSmaller PlayPen_ManualObjectNonIndexedUpdateLarger
PlayPen_ManualObjectIndexed PlayPen_ManualObjectIndexedUpdateSmaller PlayPen_ManualObjectIndexedUpdateLarger
PlayPen_BillboardChain PlayPen_CubeDDS PlayPen_Dxt1
PlayPen_Dxt1FromMemory PlayPen_Dxt1Alpha PlayPen_Dxt3
PlayPen_Dxt3FromMemory PlayPen_Dxt5 PlayPen_RibbonTrail
PlayPen_BlendDiffuseColour PlayPen_CustomProjectionMatrix PlayPen_PointSprites
PlayPen_BillboardAccurateFacing PlayPen_MultiSceneManagersSimple PlayPen_NegativeScale
PlayPen_SRGBtexture PlayPen_LightScissoring PlayPen_LightClipPlanes
PlayPen_LightClipPlanesMoreLights PlayPen_MaterialSchemes PlayPen_BuildTangentOnAnimatedMesh
PlayPen_BillboardOrigins PlayPen_DepthBias PlayPen_16Textures
PlayPen_FarFromOrigin PlayPen_AlphaToCoverage PlayPen_BlitSubTextures
PlayPen_ImageCombine PlayPen_WindowedViewportMode PlayPen_Bsp
PlayPen_SkeletalAnimation PlayPen_SubEntityVisibility PlayPen_SkeletonAnimationOptimise
PlayPen_TextureShadows PlayPen_TextureShadowsIntegrated PlayPen_TextureShadowsIntegratedPSSM
PlayPen_TextureShadowsCustomCasterMat PlayPen_TextureShadowsCustomReceiverMat PlayPen_ManualObject2D
PlayPen_LiSPSM PlayPen_MaterialSchemesWithLOD PlayPen_MaterialSchemesWithMismatchedLOD
PlayPen_ClearScene PlayPen_StencilShadowsMixedOpSubMeshes PlayPen_ManualIlluminationStage
PlayPen_Projection PlayPen_CompositorTextureShadows PlayPen_CompositorTechniqueSwitch
PlayPen_ManualBoneMovement PlayPen_IntersectionSceneQuery PlayPen_RaySceneQuery
PlayPen_ReloadResources PlayPen_SuppressedShadows PlayPen_DepthShadowMap
PlayPen_TextureShadowsTransparentCaster PlayPen_ViewportNoShadows PlayPen_SpotlightViewProj
PlayPen_NormalMapMirroredUVs PlayPen_SerialisedColour PlayPen_ShadowLod
PlayPen_MaterialSchemesListener PlayPen_ReinitialiseEntityAlteredMesh PlayPen_InfiniteAAB
PlayPen_GeometryShaders PlayPen_VertexTexture PlayPen_NonUniqueResourceNames
ParticleTest StencilShadowTest TransparencyTest
CubeMappingTest TextureEffectsTest

Known Issues

  • I haven't tested with OSX.
  • I haven't tested the Direct3D 11 render system.
  • NMake doesn't support precompiled headers at the moment, and this seems to cause at least one test to produce different results from regular VC++.

Future Improvements

  • More Tests!
  • A more robust image comparison algorithm (e.g. something along the lines of PerceptualDiff)
  • Integrate CppUnit tests with the CTest testing setup.
  • More testing dashboard features (better Mercurial integration, emailing committers when tests fail as a result of their changes, etc., depending on how much CDash allows for).

Creative Commons Copyright -- Some rights reserved.


THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS CREATIVE COMMONS PUBLIC LICENSE ("CCPL" OR "LICENSE"). THE WORK IS PROTECTED BY COPYRIGHT AND/OR OTHER APPLICABLE LAW. ANY USE OF THE WORK OTHER THAN AS AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS PROHIBITED.

BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED HERE, YOU ACCEPT AND AGREE TO BE BOUND BY THE TERMS OF THIS LICENSE. THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED HERE IN CONSIDERATION OF YOUR ACCEPTANCE OF SUCH TERMS AND CONDITIONS.

1. Definitions

  • "Collective Work" means a work, such as a periodical issue, anthology or encyclopedia, in which the Work in its entirety in unmodified form, along with a number of other contributions, constituting separate and independent works in themselves, are assembled into a collective whole. A work that constitutes a Collective Work will not be considered a Derivative Work (as defined below) for the purposes of this License.
  • "Derivative Work" means a work based upon the Work or upon the Work and other pre-existing works, such as a translation, musical arrangement, dramatization, fictionalization, motion picture version, sound recording, art reproduction, abridgment, condensation, or any other form in which the Work may be recast, transformed, or adapted, except that a work that constitutes a Collective Work will not be considered a Derivative Work for the purpose of this License. For the avoidance of doubt, where the Work is a musical composition or sound recording, the synchronization of the Work in timed-relation with a moving image ("synching") will be considered a Derivative Work for the purpose of this License.
  • "Licensor" means the individual or entity that offers the Work under the terms of this License.
  • "Original Author" means the individual or entity who created the Work.
  • "Work" means the copyrightable work of authorship offered under the terms of this License.
  • "You" means an individual or entity exercising rights under this License who has not previously violated the terms of this License with respect to the Work, or who has received express permission from the Licensor to exercise rights under this License despite a previous violation.
  • "License Elements" means the following high-level license attributes as selected by Licensor and indicated in the title of this License: Attribution, ShareAlike.

2. Fair Use Rights

Nothing in this license is intended to reduce, limit, or restrict any rights arising from fair use, first sale or other limitations on the exclusive rights of the copyright owner under copyright law or other applicable laws.

3. License Grant

Subject to the terms and conditions of this License, Licensor hereby grants You a worldwide, royalty-free, non-exclusive, perpetual (for the duration of the applicable copyright) license to exercise the rights in the Work as stated below:

  • to reproduce the Work, to incorporate the Work into one or more Collective Works, and to reproduce the Work as incorporated in the Collective Works;
  • to create and reproduce Derivative Works;
  • to distribute copies or phonorecords of, display publicly, perform publicly, and perform publicly by means of a digital audio transmission the Work including as incorporated in Collective Works;
  • to distribute copies or phonorecords of, display publicly, perform publicly, and perform publicly by means of a digital audio transmission Derivative Works.
  • For the avoidance of doubt, where the work is a musical composition:
    • Performance Royalties Under Blanket Licenses. Licensor waives the exclusive right to collect, whether individually or via a performance rights society (e.g. ASCAP, BMI, SESAC), royalties for the public performance or public digital performance (e.g. webcast) of the Work.
    • Mechanical Rights and Statutory Royalties. Licensor waives the exclusive right to collect, whether individually or via a music rights society or designated agent (e.g. Harry Fox Agency), royalties for any phonorecord You create from the Work ("cover version") and distribute, subject to the compulsory license created by 17 USC Section 115 of the US Copyright Act (or the equivalent in other jurisdictions).
    • Webcasting Rights and Statutory Royalties. For the avoidance of doubt, where the Work is a sound recording, Licensor waives the exclusive right to collect, whether individually or via a performance-rights society (e.g. SoundExchange), royalties for the public digital performance (e.g. webcast) of the Work, subject to the compulsory license created by 17 USC Section 114 of the US Copyright Act (or the equivalent in other jurisdictions).


The above rights may be exercised in all media and formats whether now known or hereafter devised. The above rights include the right to make such modifications as are technically necessary to exercise the rights in other media and formats. All rights not expressly granted by Licensor are hereby reserved.

4. Restrictions

The license granted in Section 3 above is expressly made subject to and limited by the following restrictions:

  • You may distribute, publicly display, publicly perform, or publicly digitally perform the Work only under the terms of this License, and You must include a copy of, or the Uniform Resource Identifier for, this License with every copy or phonorecord of the Work You distribute, publicly display, publicly perform, or publicly digitally perform. You may not offer or impose any terms on the Work that alter or restrict the terms of this License or the recipients' exercise of the rights granted hereunder. You may not sublicense the Work. You must keep intact all notices that refer to this License and to the disclaimer of warranties. You may not distribute, publicly display, publicly perform, or publicly digitally perform the Work with any technological measures that control access or use of the Work in a manner inconsistent with the terms of this License Agreement. The above applies to the Work as incorporated in a Collective Work, but this does not require the Collective Work apart from the Work itself to be made subject to the terms of this License. If You create a Collective Work, upon notice from any Licensor You must, to the extent practicable, remove from the Collective Work any credit as required by clause 4(c), as requested. If You create a Derivative Work, upon notice from any Licensor You must, to the extent practicable, remove from the Derivative Work any credit as required by clause 4(c), as requested.
  • You may distribute, publicly display, publicly perform, or publicly digitally perform a Derivative Work only under the terms of this License, a later version of this License with the same License Elements as this License, or a Creative Commons iCommons license that contains the same License Elements as this License (e.g. Attribution-ShareAlike 2.5 Japan). You must include a copy of, or the Uniform Resource Identifier for, this License or other license specified in the previous sentence with every copy or phonorecord of each Derivative Work You distribute, publicly display, publicly perform, or publicly digitally perform. You may not offer or impose any terms on the Derivative Works that alter or restrict the terms of this License or the recipients' exercise of the rights granted hereunder, and You must keep intact all notices that refer to this License and to the disclaimer of warranties. You may not distribute, publicly display, publicly perform, or publicly digitally perform the Derivative Work with any technological measures that control access or use of the Work in a manner inconsistent with the terms of this License Agreement. The above applies to the Derivative Work as incorporated in a Collective Work, but this does not require the Collective Work apart from the Derivative Work itself to be made subject to the terms of this License.
  • If you distribute, publicly display, publicly perform, or publicly digitally perform the Work or any Derivative Works or Collective Works, You must keep intact all copyright notices for the Work and provide, reasonable to the medium or means You are utilizing: (i) the name of the Original Author (or pseudonym, if applicable) if supplied, and/or (ii) if the Original Author and/or Licensor designate another party or parties (e.g. a sponsor institute, publishing entity, journal) for attribution in Licensor's copyright notice, terms of service or by other reasonable means, the name of such party or parties; the title of the Work if supplied; to the extent reasonably practicable, the Uniform Resource Identifier, if any, that Licensor specifies to be associated with the Work, unless such URI does not refer to the copyright notice or licensing information for the Work; and in the case of a Derivative Work, a credit identifying the use of the Work in the Derivative Work (e.g., "French translation of the Work by Original Author," or "Screenplay based on original Work by Original Author"). Such credit may be implemented in any reasonable manner; provided, however, that in the case of a Derivative Work or Collective Work, at a minimum such credit will appear where any other comparable authorship credit appears and in a manner at least as prominent as such other comparable authorship credit.

5. Representations, Warranties and Disclaimer

UNLESS OTHERWISE AGREED TO BY THE PARTIES IN WRITING, LICENSOR OFFERS THE WORK AS-IS AND MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE MATERIALS, EXPRESS, IMPLIED, STATUTORY OR OTHERWISE, INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTIBILITY, FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OF ABSENCE OF ERRORS, WHETHER OR NOT DISCOVERABLE. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF IMPLIED WARRANTIES, SO SUCH EXCLUSION MAY NOT APPLY TO YOU.

6. Limitation on Liability.

EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE LAW, IN NO EVENT WILL LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, PUNITIVE OR EXEMPLARY DAMAGES ARISING OUT OF THIS LICENSE OR THE USE OF THE WORK, EVEN IF LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

7. Termination

  • This License and the rights granted hereunder will terminate automatically upon any breach by You of the terms of this License. Individuals or entities who have received Derivative Works or Collective Works from You under this License, however, will not have their licenses terminated provided such individuals or entities remain in full compliance with those licenses. Sections 1, 2, 5, 6, 7, and 8 will survive any termination of this License.
  • Subject to the above terms and conditions, the license granted here is perpetual (for the duration of the applicable copyright in the Work). Notwithstanding the above, Licensor reserves the right to release the Work under different license terms or to stop distributing the Work at any time; provided, however that any such election will not serve to withdraw this License (or any other license that has been, or is required to be, granted under the terms of this License), and this License will continue in full force and effect unless terminated as stated above.

8. Miscellaneous

  • Each time You distribute or publicly digitally perform the Work or a Collective Work, the Licensor offers to the recipient a license to the Work on the same terms and conditions as the license granted to You under this License.
  • Each time You distribute or publicly digitally perform a Derivative Work, Licensor offers to the recipient a license to the original Work on the same terms and conditions as the license granted to You under this License.
  • If any provision of this License is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this License, and without further action by the parties to this agreement, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable.
  • No term or provision of this License shall be deemed waived and no breach consented to unless such waiver or consent shall be in writing and signed by the party to be charged with such waiver or consent.
  • This License constitutes the entire agreement between the parties with respect to the Work licensed here. There are no understandings, agreements or representations with respect to the Work not specified here. Licensor shall not be bound by any additional provisions that may appear in any communication from You. This License may not be modified without the mutual written agreement of the Licensor and You.