Introduction

This article explains how to use Ogre's shadow technique management to implement custom algorithms. We focus on a well-known texture-based technique called Shadow Mapping (SM). The shader code is written in NVIDIA's Cg language.

This article is somewhat technical and aimed at advanced users; newcomers who are not yet familiar with Ogre or standard 3D techniques may want to begin with something lighter, though they are welcome to give it a try.

What is Shadow Mapping?

Shadow Mapping is an image-based shadow determination algorithm published by L. Williams in 1978. It has been used extensively since, both in offline rendering (e.g. Toy Story by Pixar) and in real-time graphics (most games). It consists of two passes:

  • first, render a 2D depth buffer (called the "depth" or "shadow" map) from the light's point of view
  • second, render the scene from the eye's point of view


In the second pass, the depth map is used during lighting computations to determine whether each shaded fragment is in shadow, according to the following algorithm:

  • determine the fragment's xyz position relative to the light
  • compare the fragment's depth z to the depth value stored at position xy in the shadow map
  • if it is greater, the fragment is in shadow, since something closer to the light occludes it (see the sketch below)

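In code, the core test of the second pass looks roughly like the following Cg sketch (the names are illustrative: lightPosition holds the fragment's position in light clip space and shadowMap is the depth map rendered in the first pass):

// Minimal sketch of the per-fragment shadow test
float2 uv = lightPosition.xy / lightPosition.w;          // position xy in the shadow map
float fragmentDepth = lightPosition.z;                   // fragment's depth z seen from the light
float storedDepth = tex2D(shadowMap, uv).r;              // nearest occluder depth stored at xy
float lit = (fragmentDepth <= storedDepth) ? 1.0 : 0.0;  // 0 means the fragment is in shadow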

As this article is not a tutorial about shadow mapping itself but about implementing it with Ogre, readers can refer to http://www.paulsprojects.net/tutorials/smt/smt.html for more details.

How to do it with Ogre?

For this, you will need to understand how additive light masking works in Ogre (you will have to set up your materials and shaders properly to work with it). It consists of rendering the scene several times, each time computing a single light contribution whose influence is masked out in areas of shadow. Each contribution is added to the previous ones, so that by the end of the rendering all light contributions have correctly accumulated in the scene. To do this, Ogre categorises the passes of your material into three types:

  • ambient, which handles any effect independent of the lights (light emission, ambient light, reflection mapping, etc.)
  • lighting, which is the pass used to compute each single light contribution
  • decal, which is used to modulate the accumulated lighting with the final texture color (not used here, as texture color may be used for lighting too)


First, we have to let Ogre know which shader to use when computing the shadow map. We assume no vertex deformation, so all materials can share the same vertex program, which simply projects the geometry onto the screen (see next section). However, an object with vertex deformation may provide its own vertex shader so that the deformation is reflected in the shadow map. Second, we have to let Ogre know which shaders to use when performing the lighting computations for a given light using the corresponding shadow map. These are similar to your standard shaders, but with the lighting computation restricted to a single light and with a new texture as input (the shadow map in unit 0). Here is the syntax for material scripts:

material
{
  technique default
  {
    pass ambient
    {
      // Not needed for rendering, but as information
      // to lighting pass categorisation routine
      ambient 1 1 1 
      diffuse 0 0 0
      specular 0 0 0 0

      vertex_program_ref AmbientVP
      {
      }

      fragment_program_ref AmbientFP
      {
      }   
   
      shadow_caster_vertex_program_ref ShadowCasterVP
      {
      }

      ...
    }

    pass lighting
    {
      // Not needed for rendering, but as information
      // to lighting pass categorisation routine
      ambient 0 0 0 
      diffuse 1 1 1
      specular 1 1 1 1
      
      vertex_program_ref LightingVP
      {
      }

      fragment_program_ref LightingFP
      {
      }

      shadow_receiver_vertex_program_ref LightingWithShadowMapVP
      {
      }

      shadow_receiver_fragment_program_ref LightingWithShadowMapFP
      {
      }

      ...
    }
  }
}


To prepare Ogre for shadow mapping, we also indicate that we will use a texture-based shadow technique and handle self-shadowing. We will also use a floating-point render target to store the depth map (a standard byte texture can be used too).

// mSceneMgr is assumed to be a pointer to your SceneManager
mSceneMgr->setShadowTexturePixelFormat(Ogre::PF_FLOAT16_R);
mSceneMgr->setShadowTechnique(Ogre::SHADOWTYPE_TEXTURE_ADDITIVE);
mSceneMgr->setShadowTextureSelfShadow(true);
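
A few related settings can be tuned as well; a hedged sketch of optional tweaks (the values below are illustrative):

// Optional: control the shadow map resolution and the number of maps
// (one map is used per shadow-casting light rendered each frame)
mSceneMgr->setShadowTextureSize(1024);
mSceneMgr->setShadowTextureCount(1);
// Optional: limit the distance up to which shadows are rendered
mSceneMgr->setShadowFarDistance(500.0f);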

Setting up the shadow materials

Shadow materials are global shadow settings used by Ogre when rendering shadows and shadow maps. They are set in Ogre using:

mSceneMgr->setShadowTextureCasterMaterial("ShadowCaster");
mSceneMgr->setShadowTextureReceiverMaterial("ShadowReceiver");


The shadow caster material holds the shaders that fill in the depth map (they are detailed in the next section):

vertex_program ShadowCasterVP cg
{
    source v-shadow-caster.cg
    entry_point main
    profiles arbvp1

    default_params
    {
        param_named_auto p_ModelViewProjection worldviewproj_matrix
        param_named_auto p_AmbientLight ambient_light_colour
    }
}

fragment_program ShadowCasterFP cg
{
    source f-shadow-caster.cg
    entry_point main
    profiles arbfp1
    // Store normalized (useful to avoid overflowing) or non-normalized depth?
    //compile_arguments -DSTORE_NORMALIZED_DEPTH

    default_params
    {
        // Only used when storing normalized depth values
        //param_named_auto p_Near near_clip_distance
        //param_named_auto p_Far far_clip_distance
        param_named p_DepthOffset float 0.01
    }
}

material ShadowCaster
{
    technique default
    {
        // Z-write only pass
        pass Z-write
        {
            vertex_program_ref ShadowCasterVP
            {
            }
            fragment_program_ref ShadowCasterFP
            {
            }
        }
    }
}


The shadow receiver material simply holds the shadow map in the first texture unit:

material ShadowReceiver
{
    technique default
    {
        pass lighting
        {
            texture_unit ShadowMap
            {
                tex_address_mode clamp
                filtering none
            }
        }
    }
}


Starting from these materials and the material of your object, Ogre derives the final material used for rendering by combining them. In practice, it provides the shadow caster vertex program if it is not overridden in the material (overriding is only required when vertex deformation occurs) and adds the shadow map to your lighting pass.

As of newer Ogre versions, you can alternatively 'pull in' the shadow texture directly into your regular materials. This effectively removes the need for a shadow receiver material. To accomplish this, we use content_type as described here: http://www.ogre3d.org/docs/manual/manual_17.html#SEC70

texture_unit
{
    content_type shadow
    tex_address_mode clamp
    filtering none
}

You'd simply include the above texture unit in your usual lighting passes, addressing it with texture_viewproj_matrix as you'll read below; a sketch of such a pass follows.
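
For instance, such a lighting pass might look like the following sketch (MyObject is a placeholder material name; the program references are the ones defined later in this article):

material MyObject
{
    technique default
    {
        pass lighting
        {
            vertex_program_ref LightingWithShadowMapVP
            {
            }

            fragment_program_ref LightingWithShadowMapFP
            {
            }

            // The shadow map is pulled straight into unit 0
            texture_unit
            {
                content_type shadow
                tex_address_mode clamp
                filtering none
            }
        }
    }
}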

Writing the depth map

The idea here is to keep the shaders as simple as possible in order to draw the depth map as fast as possible. We therefore use simple ambient shaders that additionally store the depth of each fragment. Only the ambient pass is required, because we only have to draw the depth of the objects in the scene; no lighting computations are done at this point.

Depth values can be stored as normalized values in [0,1] or in a non-normalized way. The normalization method was used before floating-point targets appeared, when GPUs were restricted to byte components. In this case, more precision is given to objects close to the near plane and less to objects close to the far plane; otherwise depth storage would have very poor precision, as the distance between the far and near planes can be very large. For example, with the formula used below and near = 1, far = 100, a fragment at depth 2 already stores about 0.5: half the representable range covers the region closest to the light. This problem can be avoided with today's graphics hardware and floating-point targets.

To avoid precision issues when comparing the depth value stored in the depth map with that of the currently rendered fragment (expressed in the light's frustum), we add a small bias to the stored depth in the fragment shader.

Vertex shader:

// Define inputs from application.
struct VertexIn
{
  float4 position : POSITION;       // Vertex in object-space
  float2 texCoords  : TEXCOORD0;    // Vertex's Texture Coordinates
};

// Define outputs from vertex shader.
struct Vertex
{
  float4 position   : POSITION;     // Vertex position in screen-space
  float4 color      : COLOR;        // Vertex color
  float2 texCoords  : TEXCOORD0;    // Vertex Texture Coordinates
  float depth       : TEXCOORD1;    // Vertex depth in eye space
};

Vertex main(VertexIn p_In,
            uniform float4 p_AmbientLight,          // Ambient light in scene
            uniform float4x4 p_ModelViewProjection  // Model view projection matrix
           )
{
    Vertex l_Out;
    
    // Transform vertex position into homogenous screen-space.
    l_Out.position = mul(p_ModelViewProjection, p_In.position);
    
    // Store depth
    l_Out.depth = l_Out.position.z;
    
    // Store ambient color
    l_Out.color = p_AmbientLight;
    
    // Pass texture coordinates to fragment shader
    l_Out.texCoords = p_In.texCoords;

    return l_Out;
}


Fragment shader:

struct Vertex
{
  float4 position   : POSITION;     // Fragment position in screen-space
  float4 color      : COLOR;        // Fragment color
  float2 texCoords  : TEXCOORD0;    // Fragment's Texture Coordinates
  float depth       : TEXCOORD1;    // Fragment depth in eye-space
};

struct Fragment
{
    float4 color  : COLOR0;
};

Fragment main(Vertex p_In
              
            #ifdef STORE_NORMALIZED_DEPTH
              ,uniform float p_Near // Near distance
              ,uniform float p_Far  // Far distance
            #endif

              ,uniform float p_DepthOffset  // Depth offset
              )
{
    Fragment l_Out;

#ifdef STORE_NORMALIZED_DEPTH

    // Store normalized depth in [0,1] to avoid overflowing,
    // even when using half precision floating point render target
    float l_Depth = (1.0/p_Near - 1.0/p_In.depth) / (1.0/p_Near - 1.0/p_Far);

    // Use some bias to avoid precision issue
    // TODO : As depth is not distributed uniformly across the range
    // we should bias proportionately to the depth value itself.
    // The absolute bias closer to the camera is lower than the bias further away.
    l_Depth += p_DepthOffset;

#else

    // Store non-normalized depth
    float l_Depth = p_In.depth;
    
    // Use some bias to avoid precision issue
    l_Depth += p_DepthOffset;

#endif
    
    // Write the depth value to the depth map
    l_Out.color.r = l_Depth;
    
    return l_Out;
}

Performing lighting computations

Here come the shaders used to compute each single light contribution. As lighting with or without the shadow map is very similar, we can use the same shaders, with the few differences guarded by a #define directive. Ogre automatically provides the transformation needed to compute the fragment position in light space: the view/projection matrix of the current shadow projector is exposed through the texture_viewproj_matrix auto parameter.

vertex_program LightingVP cg
{
    source v-lighting.cg
    entry_point main
    profiles arbvp1

    default_params
    {
        param_named_auto p_ModelView worldview_matrix
        param_named_auto p_InverseModelView inverse_worldview_matrix
        param_named_auto p_ModelViewProjection worldviewproj_matrix
        param_named_auto p_LightPosition light_position_object_space
    }
}

fragment_program LightingFP cg
{
    source f-lighting.cg
    entry_point main
    profiles arbfp1
    
    default_params
    {
        param_named_auto p_LightDiffuse light_diffuse_colour 0
        param_named_auto p_LightSpecular light_specular_colour 0
        param_named_auto p_LightPower light_power 0
        param_named p_Diffuse float4 0.5 0 0 1 
        param_named p_Specular float4 1 1 1 30
    }
}

vertex_program LightingWithShadowMapVP cg
{
    source v-lighting.cg
    entry_point main
    profiles arbvp1
    // Similar to standard lighting but using the shadow map in addition
    compile_arguments -DSHADOW_MAP
    
    default_params
    {
        param_named_auto p_ModelView worldview_matrix
        param_named_auto p_InverseModelView inverse_worldview_matrix
        param_named_auto p_ModelViewProjection worldviewproj_matrix
        param_named_auto p_LightPosition light_position_object_space
        // Required to express fragment's position in light space
        param_named_auto p_Model world_matrix
        param_named_auto p_TextureViewProjection texture_viewproj_matrix
    }
}

fragment_program LightingWithShadowMapFP cg
{
    source f-lighting.cg
    entry_point main
    profiles arbfp1
    // Similar to standard lighting but using the shadow map in addition
    compile_arguments -DSHADOW_MAP
    
    default_params
    {
        param_named_auto p_LightDiffuse light_diffuse_colour 0
        param_named_auto p_LightSpecular light_specular_colour 0
        param_named_auto p_LightPower light_power 0
        param_named p_Diffuse float4 0.5 0 0 1 
        param_named p_Specular float4 1 1 1 30
    }
}


Vertex shader:

// Define inputs from application.
struct VertexIn
{
  float4 position   : POSITION;   // Vertex in object-space
  float4 normal     : NORMAL;     // Vertex's Normal
  float2 texCoords  : TEXCOORD0;  // Vertex's Texture Coordinates
};

// Define outputs from vertex shader.
struct Vertex
{
  float4 position       : POSITION;     // Vertex position in screen-space
  float2 texCoords      : TEXCOORD0;    // Vertex texture coordinates
  float3 normal         : TEXCOORD1;    // Normal in eye-space
  float3 halfVector     : TEXCOORD2;    // Half angle vector in eye space
  float3 lightVector    : TEXCOORD3;    // Light vector in eye space

#ifdef SHADOW_MAP
  float4 lightPosition  : TEXCOORD4;    // Vertex position in light space
#endif
};

Vertex main(VertexIn p_In,
            uniform float4x4 p_ModelViewProjection // Model view projection matrix
            ,uniform float4 p_LightPosition         // Light position in object-space
            ,uniform float4x4 p_ModelView           // Model view matrix
            ,uniform float4x4 p_InverseModelView    // Model view matrix inverted

            #ifdef SHADOW_MAP
              ,uniform float4x4 p_Model                 // Model matrix
              ,uniform float4x4 p_TextureViewProjection  // Texture view projection matrix
            #endif
            )
{
    Vertex l_Out;

    // Compute light position in eye-space
    float4 l_LightPosition4 = mul(p_ModelView, p_LightPosition);
    float3 l_LightPosition3 = l_LightPosition4.xyz;
    
    // Compute vertex position in eye-space
    float4 l_Position4 = mul(p_ModelView, p_In.position);
    float3 l_Position3 = l_Position4.xyz / l_Position4.w;
    
    // Transform normal from model-space to eye-space.
    l_Out.normal = normalize(mul(transpose(p_InverseModelView), p_In.normal).xyz);
    
    // Light vector.
    l_Out.lightVector = l_LightPosition3 - (l_Position3 * l_LightPosition4.w);
    
    // Half angle vector = light vector + eye vector
    l_Out.halfVector = l_Out.lightVector + (- l_Position3);

#ifdef SHADOW_MAP

    // Compute vertex position in light space
    // First object to world space
    l_Out.lightPosition = mul(p_Model, p_In.position);
    // Then world to light space
    l_Out.lightPosition = mul(p_TextureViewProjection, l_Out.lightPosition);

#endif

    // Transform vertex position into homogenous screen-space.
    l_Out.position = mul(p_ModelViewProjection, p_In.position);

    // Pass texture coordinates to fragment shader
    l_Out.texCoords = p_In.texCoords;

    return l_Out;
}


Fragment shader:

// The shadow map always comes in texture unit 0,
// so we have to shift all other textures up by one unit, if any...
#ifdef SHADOW_MAP
  sampler2D p_ShadowMap : TEXUNIT0;
  //sampler2D p_DiffuseMap : TEXUNIT1;
#else
  //sampler2D p_DiffuseMap : TEXUNIT0;
#endif

struct Vertex
{
  float4 position       : POSITION;     // Fragment's position in screen-space
  float2 texCoords      : TEXCOORD0;    // Fragment's texture coordinates
  float3 normal         : TEXCOORD1;    // Fragment's normal in eye-space
  float3 halfVector     : TEXCOORD2;    // Fragment's half angle vector in eye-space
  float3 lightVector    : TEXCOORD3;    // Fragment's light vector in eye-space

#ifdef SHADOW_MAP
  float4 lightPosition  : TEXCOORD4;    // Fragment's position in light space
#endif
};

struct Fragment
{
    float4 color  : COLOR0;
};

Fragment main(Vertex p_In,
              uniform float4 p_LightDiffuse,        // Light diffuse component
              uniform float  p_LightPower,          // Light power
              uniform float4 p_Diffuse,             // Material diffuse component
              uniform float4 p_LightSpecular,       // Light specular component
              uniform float4 p_Specular             // Material specular component + specular exponent
              )
{
    Fragment l_Out;

    // Normalized normal.
    float3 l_Normal = normalize(p_In.normal);

    // Normalized light vector.
    float3 l_LightVector = normalize(p_In.lightVector);
    
    // Normalized half angle vector.
    float3 l_HalfVector = normalize(p_In.halfVector);
    
    // Diffuse component
    // -----------------

    // Angle between normal and light vector
    float l_CosNL = saturate(dot(l_Normal, l_LightVector));

    // No light can reach back surfaces...
    if (l_CosNL == 0)
        discard;
    
    l_Out.color.rgb = p_Diffuse.rgb * p_LightDiffuse.rgb * l_CosNL;
    
    // Specular component
    // ------------------

    // Apply cosine power distribution around mirror direction
    float l_CosNH = saturate(dot(l_Normal, l_HalfVector));
        
    float l_SpecularPower = pow(l_CosNH, p_Specular.a);
    
    float3 l_Specular = p_Specular.rgb * p_LightSpecular.rgb * l_SpecularPower;

    // Add specular component
    l_Out.color.rgb += l_Specular.rgb;

    // Modulate by light incoming power
    l_Out.color.rgb *= p_LightPower;
    
#ifdef SHADOW_MAP

    // Test if fragment is in shadow
    // -----------------------------
    
    // Compute the distance from light of the rasterized fragment (normalized in [0,1] or not)
    #ifdef STORE_NORMALIZED_DEPTH
        float l_LightDistance = p_In.lightPosition.z / p_In.lightPosition.w;
    #else
        float l_LightDistance = p_In.lightPosition.z;
    #endif

    // Compute fragment position in shadow map (texture) space
    float2 l_ShadowMapTexCoords = float2(p_In.lightPosition.x / p_In.lightPosition.w,
                                         p_In.lightPosition.y / p_In.lightPosition.w);
    
    // Get the stored nearest fragment distance from light in the shadow map (normalized in [0,1] or not)
    float3 l_ShadowDistance = tex2D(p_ShadowMap, l_ShadowMapTexCoords).rgb;

    // Perform standard shadow map comparison
    float l_Lit = (l_LightDistance <= l_ShadowDistance.r ? 1 : 0);

    // Attenuate the light contribution as necessary to compute the final color
    l_Out.color.rgb *= l_Lit;

#endif

    return l_Out;
}

Conclusion


Some screenshots to finish (images omitted here; see the original post linked below):

  • a scene without shadow mapping
  • the same scene with shadow mapping
  • the same scene with the shadow map projected onto the geometry

This article was initiated by Luc Claustres, and the original post http://www.ogre3d.org/forums/viewtopic.php?f=11&t=20840&p=150916#p150916 contains some more screenshots. I have successfully implemented standard shadow mapping as well as some derived algorithms with Ogre. Variance Shadow Mapping (http://www.punkuser.net/vsm/) only requires two small modifications (in addition to writing the proper shader code, of course):

  • add a blur compositor on top of your shadow map (you can find a blur example in the compositor sample, for instance; obtaining the shadow map viewport used below is sketched after this list)

CompositorManager::getSingleton().addCompositor(l_ShadowMapViewport, "ShadowMapBlur");
CompositorManager::getSingleton().setCompositorEnabled(l_ShadowMapViewport, "ShadowMapBlur", true);
  • set bilinear filtering on the shadow map in the shadow receiver material

material ShadowReceiver
{
    technique default
    {
        pass lighting
        {
            texture_unit ShadowMap
            {
                tex_address_mode clamp
                filtering linear linear none
            }
        }
    }
}
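
For reference, the l_ShadowMapViewport used in the compositor calls above can be obtained from the shadow texture itself. A minimal sketch, assuming Ogre 1.x texture shadows have already been set up on mSceneMgr:

// Fetch the viewport of the first shadow texture's render target
Ogre::TexturePtr l_ShadowMap = mSceneMgr->getShadowTexture(0);
Ogre::Viewport* l_ShadowMapViewport =
    l_ShadowMap->getBuffer()->getRenderTarget()->getViewport(0);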

Percentage Closer Filtering (http://www.mpi-sb.mpg.de/~brabec/doc/brabec_cgi01.pdf) simply consists of averaging multiple shadow comparisons in order to get various intensity levels instead of a binary result (shadowed or not).
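
A minimal 4-tap PCF sketch in Cg, assuming the same inputs as the lighting fragment shader above (texelSize is the reciprocal of the shadow map resolution; all names are illustrative):

float pcf(sampler2D shadowMap, float2 uv, float fragmentDepth, float texelSize)
{
    float lit = 0.0;

    // Average four neighbouring shadow comparisons
    for (int i = 0; i < 2; ++i)
    {
        for (int j = 0; j < 2; ++j)
        {
            float2 offset = float2(i, j) * texelSize;
            float stored = tex2D(shadowMap, uv + offset).r;
            lit += (fragmentDepth <= stored) ? 1.0 : 0.0;
        }
    }

    return lit / 4.0; // 0 = fully shadowed, 1 = fully lit
}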

Many other derived algorithms are possible from this base, such as soft-edged shadows (http://www.gamedev.net/reference/articles/article2193.asp), soft PCF (http://www.cs.utah.edu/classes/cs5610/projects-2005/lha/, http://download.nvidia.com/developer/presentations/2005/SIGGRAPH/PCSS.pdf), and so on.

