IMPORTANT: This framework is meant to be used with old releases of Ogre. For new projects, rather use OgreBites::InputListener or SDL2.

Tutorial Introduction
Ogre Tutorial Head

This tutorial will introduce the use of buffered input with OIS. Instead of asking for input information every frame, we will have callback methods defined that are called every time an input event occurs.

The full source for this tutorial is here.

Any problems you encounter while working through this tutorial should be posted in the Help Forum.

Prerequisites

This tutorial assumes that you already know how to set up an Ogre project and compile it successfully. If you need help with this, then read Setting Up An Application. This tutorial is also part of the Basic Tutorials series and knowledge from the previous tutorials will be assumed.

tudor_house_visual.png

Setting Up the Scene

There is a 'tudorhouse.mesh' included in the Samples directory with Ogre, but the texture it uses ('fw12b.jpg', referenced in the material script below) no longer seems to be included. The texture attachment that was once embedded on this page is no longer available.


After adding the mesh and texture to your project, you need to make sure this material entry is in one of your material scripts:

material Examples/TudorHouse
{
	technique
	{
		pass
		{
			texture_unit
			{
				texture fw12b.jpg
				tex_address_mode clamp
			}
		}
	}
}

Finally, you'll need to set up a basic scene for this tutorial. Compile and run this code. You should see the Tudor house in front of you.

TutorialApplication.h
#ifndef TUTORIALAPPLICATION_H
#define TUTORIALAPPLICATION_H

#include "BaseApplication.h"

class TutorialApplication : public BaseApplication
{
public:
  TutorialApplication();
  virtual ~TutorialApplication();

private:
  virtual void createScene();
  virtual bool frameRenderingQueued(const Ogre::FrameEvent& fe);

  Ogre::Real mRotate;
  Ogre::Real mMove;
  Ogre::SceneNode* mCamNode;
  Ogre::Vector3 mDirection;

};

#endif
TutorialApplication.cpp
#include "TutorialApplication.h"

TutorialApplication::TutorialApplication()
  : mRotate(.13),
    mMove(250),
    mCamNode(0),
    mDirection(Ogre::Vector3::ZERO)
{
}

TutorialApplication::~TutorialApplication()
{
}

void TutorialApplication::createScene()
{
  mSceneMgr->setAmbientLight(Ogre::ColourValue(.2, .2, .2));

  Ogre::Entity* tudorEntity = mSceneMgr->createEntity("tudorhouse.mesh");
  Ogre::SceneNode* node = mSceneMgr->getRootSceneNode()->createChildSceneNode(
    "Node");
  node->attachObject(tudorEntity);

  Ogre::Light* light = mSceneMgr->createLight("Light1");
  light->setType(Ogre::Light::LT_POINT);
  light->setPosition(Ogre::Vector3(250, 150, 250));
  light->setDiffuseColour(Ogre::ColourValue::White);
  light->setSpecularColour(Ogre::ColourValue::White);

  mCamera->setPosition(0, -370, 1000);

}

bool TutorialApplication::frameRenderingQueued(const Ogre::FrameEvent& fe)
{
  bool ret = BaseApplication::frameRenderingQueued(fe);

  return ret;
}

// MAIN FUNCTION OMITTED FOR SPACE

An Introduction to Buffered Input

In the previous tutorial, we used unbuffered input. For instance, every frame we checked the OIS::Keyboard instance to see if any keys were being held down. This tutorial will use buffered input. This approach involves registering a listener class to report input events directly. It is called buffered input because the events are fed into a buffer and then dispatched via the callback methods. Don't worry if this seems too abstract; we will give concrete examples soon that will help clarify things.

With buffered input we no longer have to devise a system for keeping track of whether a button was held down the previous frame. Instead, there are two separate input events that are fired: a key pressed event and a key released event. And each event carries information like what key was pressed or released.

It is important to know that OIS only allows one listener per Keyboard, Mouse, or Joystick object. This is done for performance reasons. If you try to register a second listener, then it will simply replace the first. If you need multiple objects to be notified of input events, then you should implement your own message dispatching system.

The KeyListener Interface

The listener class provided by OIS for the keyboard is called KeyListener. It provides two pure virtual callback methods: keyPressed and keyReleased. Both of these functions take a single KeyEvent as a parameter. This object contains information about which key is being used.

You should add overrides of the KeyListener methods to your header and cpp file. Add the declaration to the private section.

TutorialApplication.h
virtual bool keyPressed(const OIS::KeyEvent& ke);
virtual bool keyReleased(const OIS::KeyEvent& ke);
TutorialApplication.cpp
bool TutorialApplication::keyPressed(const OIS::KeyEvent& ke) 
{ 
  return true; 
}

bool TutorialApplication::keyReleased(const OIS::KeyEvent& ke) 
{ 
  return true; 
}

The MouseListener Interface

The listener class that OIS provides for the mouse is called MouseListener. It is only slightly more complex than the KeyListener. It contains two methods to check the status of mouse buttons: mousePressed and mouseReleased. It also has a third method, mouseMoved, that is called whenever an event caused by moving the mouse is detected. The MouseEvent that is passed to these functions contains both the relative movement of the mouse (how far it has moved since the last event) and its position in absolute screen coordinates. You'll find situations where both pieces of information are useful.

Go ahead and add overrides for the MouseListener methods to your header and cpp file. Add the declaration to the private section as well. Do not compile your application yet. We've taken over some input management from BaseApplication and you'll be stuck with no way to move around or exit. We'll fix that soon.

TutorialApplication.h
virtual bool mouseMoved(const OIS::MouseEvent& me);
virtual bool mousePressed(const OIS::MouseEvent& me, OIS::MouseButtonID id);
virtual bool mouseReleased(const OIS::MouseEvent& me, OIS::MouseButtonID id);
TutorialApplication.cpp
bool TutorialApplication::mouseMoved(const OIS::MouseEvent& me) 
{ 
  return true; 
}

bool TutorialApplication::mousePressed(
  const OIS::MouseEvent& me, OIS::MouseButtonID id) 
{ 
  return true; 
}

bool TutorialApplication::mouseReleased(
  const OIS::MouseEvent& me, OIS::MouseButtonID id) 
{ 
  return true; 
}

Setting Up SceneNodes For Camera Positioning

Now we are going to create two SceneNodes that we can attach the Camera to. This will allow us to toggle back and forth between the two viewpoints using buffered input. The first thing you should do is remove this line in createScene:

mCamera->setPosition(0, -370, 1000);

We're now positioning the camera by attaching it to a SceneNode, so this line is no longer needed. Now we are going to build the SceneNodes. Add the following to createScene right after we create the Light:

node = mSceneMgr->getRootSceneNode()->createChildSceneNode(
  "CamNode1", Ogre::Vector3(1200, -370, 0));
node->yaw(Ogre::Degree(90));

Here we have created a new SceneNode, reusing the node variable, and rotated it 90 degrees around the y-axis. The next thing we will do is assign our mCamNode pointer to our new SceneNode and attach our Camera to it. mCamNode will track our current SceneNode. We will point it at a different SceneNode based on keyboard input.

mCamNode = node;
node->attachObject(mCamera);

This will mean our Camera will now start out at the position of the SceneNode it was just attached to. Now we want to create another SceneNode that we can move the Camera to.

node = mSceneMgr->getRootSceneNode()->createChildSceneNode(
  "CamNode2", Ogre::Vector3(-500, -370, 1000));
node->yaw(Ogre::Degree(-30));

Now we have two "anchors" that we can attach the Camera to. Remember that the Camera won't just get placed at the position of the SceneNode, it will also inherit any transformations like rotations that have been applied to the node.

Registering the Listeners

As with most listener classes, we need to register the KeyListener and MouseListener with their respective objects for them to work correctly. This is something that is already being done for us in BaseApplication. If you look at BaseApplication::createFrameListener, you will see these two calls:

mMouse->setEventCallback(this);
mKeyboard->setEventCallback(this);

They register our class (which inherits from KeyListener and MouseListener) as the source of the callback methods. Since we aren't overriding BaseApplication::createFrameListener, this registration still happens in BaseApplication.

Capturing Input State

OIS does not automatically capture the state of the keyboard and mouse every frame. We need to explicitly call the capture method to do this. This is another thing that is already being done by the tutorial framework. If you look at BaseApplication::frameRenderingQueued, you will see these two lines:

mKeyboard->capture();
mMouse->capture();

These two calls fill the input "buffers" with events, which is why this approach is referred to as buffered input. While we're here, you should also notice the line that makes sure setting mShutDown causes the application to exit:

if (mShutDown) return false;

These are all things we would have to handle in our own class if BaseApplication wasn't taking care of them for us. The Intermediate Tutorials no longer use the BaseApplication framework, to help move you away from relying on code that is "behind the scenes". For now, just keep in mind that a lot of our functionality is still coming from BaseApplication. We are pointing it out here so things start to seem less mysterious. Also keep in mind that if we override one of the methods from BaseApplication, then we must remember to call the parent method so we continue to get its functionality (like we are currently doing with frameRenderingQueued).

Dealing With Input Events

We are now ready to develop our actual input code. The first thing we will do is allow the user to exit the application by pressing the escape key. Add the following to TutorialApplication::keyPressed:

switch (ke.key)
{
case OIS::KC_ESCAPE: 
  mShutDown = true;
  break;
default:
  break;
}

When the keyPressed method is called, the KeyEvent parameter ke contains a member called key. We can use its value to determine which key was pressed. We use a switch statement that tests for the key code associated with the escape key. The OIS namespace defines a series of key code constants that all start with the prefix "KC_". When we discover the escape key has been pressed, we simply set our shutdown flag to true. BaseApplication::frameRenderingQueued will then return false on the next frame and the application will exit as we wanted.

If you compile your application now, you won't be able to move the Camera, but you should be able to exit by pressing escape.

Changing Viewpoints

Next we are going to allow the user to jump between SceneNodes by pressing 1 or 2 on the keyboard. Add the following to the switch statement we just set up in keyPressed:

case OIS::KC_1:
  mCamera->getParentSceneNode()->detachObject(mCamera);
  mCamNode = mSceneMgr->getSceneNode("CamNode1");
  mCamNode->attachObject(mCamera);
  break;
 
case OIS::KC_2:
  mCamera->getParentSceneNode()->detachObject(mCamera);
  mCamNode = mSceneMgr->getSceneNode("CamNode2");
  mCamNode->attachObject(mCamera);
  break;

This will detach the Camera and reattach it whenever these keys are pressed. Compile your application. You should be able to change viewpoints. The only other thing you can do is press escape to exit.

Camera Movement

Now we will add Camera movement back into our application. We are going to do this by using mDirection as a velocity vector. Whenever the user presses the W key we will set the velocity vector's z component to equal -mMove. This is because the Camera's default orientation is facing in the negative direction down the z-axis. We will apply the same logic to all of the different directions the user could move. Add the following to the switch statement in keyPressed that we've been working with:

case OIS::KC_UP:
case OIS::KC_W:
    mDirection.z = -mMove;
    break;
  
case OIS::KC_DOWN:
case OIS::KC_S:
    mDirection.z = mMove;
    break;
  
case OIS::KC_LEFT:
case OIS::KC_A:
    mDirection.x = -mMove;
    break;

case OIS::KC_RIGHT:
case OIS::KC_D:
    mDirection.x = mMove;
    break;
 
case OIS::KC_PGDOWN:
case OIS::KC_E:
    mDirection.y = -mMove;
    break;
  
case OIS::KC_PGUP:
case OIS::KC_Q:
    mDirection.y = mMove;
    break;

You might want to take a moment to convince yourself this aims the vector in the right directions. The next thing we need to do is return a component to zero if the key is released. Add the following to keyReleased:

switch (ke.key)
{
case OIS::KC_UP:
case OIS::KC_W:
    mDirection.z = 0;
    break;

case OIS::KC_DOWN:
case OIS::KC_S:
    mDirection.z = 0;
    break;

case OIS::KC_LEFT:
case OIS::KC_A:
    mDirection.x = 0;
    break;

case OIS::KC_RIGHT:
case OIS::KC_D:
    mDirection.x = 0;
    break;

case OIS::KC_PGDOWN:
case OIS::KC_E:
    mDirection.y = 0;
    break;

case OIS::KC_PGUP:
case OIS::KC_Q:
    mDirection.y = 0;
    break;

default:
    break;
}
return true;

This switch is just like the one we were using in keyPressed. All of this code has only built the velocity vector we needed. Now we have to translate the Camera's current SceneNode based on this vector so that the Camera actually moves. Add the following to frameRenderingQueued right after we call the BaseApplication method:

mCamNode->translate(mDirection * fe.timeSinceLastFrame, Ogre::Node::TS_LOCAL);

You should recognize our use of the local transformation space from the past tutorial. This coordinate frame stays attached to mCamNode, so the negative z-axis will always point forwards in this transformation space. This is crucial to our method. You should also notice the method we used to smooth out the Camera's movement by scaling it based on the time that has passed since the last frame. If any of this is unclear, then refer to Basic Tutorial 4.

Compile and run your application. We can now move the Camera using the keyboard again, but we've done it all using the buffered input system. Try moving the Camera and then jumping to the other SceneNode and back again. Notice that you return to the place you moved the SceneNode to. It does not "reset", because we are actually translating the SceneNode the Camera is attached to, not the Camera itself. The SceneNode is not just a stationary place-holder.

Toggling a Light With Buffered Input

We will now add in some mouse functionality. We will start by toggling the scene's light on and off like we did in the last tutorial. This will be a very similar process to what we've just done with the keyboard. It might seem strange, but the MouseEvent does not contain information about what button was pressed. Instead we have a second parameter that holds a MouseButtonID. This is what we will use in the switch to determine what button was pressed. Add the following to mousePressed:

switch (id)
{
case OIS::MB_Left:
{
  // Braces are required here because we declare a variable inside the case;
  // without them the declaration would illegally cross the next case label.
  Ogre::Light* light = mSceneMgr->getLight("Light1");
  light->setVisible(!light->isVisible());
  break;
}

default:
  break;
}

The first thing we do is get a pointer to the light we created for our scene. Then we toggle the Light's visibility when the left mouse button is pressed. Compile and run your application. That's all there is to it!

Using the Mouse for Camera Movement

The last thing we are going to do is add back the ability to rotate the camera using the mouse. We will bind the right mouse button to this "mouse look mode". Every time the mouse is moved, we will check whether the right mouse button is held down. If it is, then we will rotate the Camera based on the relative movement of the mouse. The movement values are contained in a member of the MouseEvent called state, which has X and Y axis components; each component holds a rel value with the relative movement and an abs value with the absolute position of the cursor.

Add the following to mouseMoved:

if (me.state.buttonDown(OIS::MB_Right))
{
  mCamNode->yaw(Ogre::Degree(-mRotate * me.state.X.rel), Ogre::Node::TS_WORLD);
  mCamNode->pitch(Ogre::Degree(-mRotate * me.state.Y.rel), Ogre::Node::TS_LOCAL);
}

Compile and run your application. You should now be able to look around freely when holding down the right mouse button. The yaw and pitch calls are mildly complicated. You should look at them and make sure you understand how they work. In particular, why is mRotate negative for both calls, and why are we using the world transformation space for yaw, but the local transformation space for pitch?

Other Input Systems

There are other input systems available that work well with Ogre. One of the more popular options is SDL. SDL provides cross-platform windowing/input systems and game controller input. To begin working with SDL's joystick system, we would wrap our application's go method with calls to SDL_Init and SDL_Quit.

SDL_Init(SDL_INIT_JOYSTICK | SDL_INIT_NOPARACHUTE);
SDL_JoystickEventState(SDL_ENABLE);

app.go();

SDL_Quit();

To set up a joystick, we would call SDL_JoystickOpen with the joystick number as a parameter (0, 1, 2...).

SDL_Joystick* mJoystick;
mJoystick = SDL_JoystickOpen(0);
 
if (mJoystick == NULL)
{
  // No joystick found
}

If SDL_JoystickOpen returns NULL, then there was a problem opening the joystick. This often means the requested joystick doesn't exist. You can call SDL_NumJoysticks to see how many joysticks were recognized. You also need to close the joystick when you're done with it.

SDL_JoystickClose(mJoystick);

To use the joystick, we call SDL_JoystickGetButton and SDL_JoystickGetAxis. Here is a snippet of movement code that uses these SDL methods:

SDL_JoystickUpdate();
 
mTrans.z += evt.timeSinceLastFrame * mMoveAmount * SDL_JoystickGetAxis(mJoystick, 1) / 32767;
mTrans.x += evt.timeSinceLastFrame * mMoveAmount * SDL_JoystickGetAxis(mJoystick, 0) / 32767;

xRot -= evt.timeSinceLastFrame * mRotAmount * SDL_JoystickGetAxis(mJoystick, 3) / 32767;
yRot -= evt.timeSinceLastFrame * mRotAmount * SDL_JoystickGetAxis(mJoystick, 2) / 32767;

The mTrans variable is used with the SceneNode::translate method to move the Camera later in the code. xRot and yRot are used in similar ways. SDL_JoystickGetAxis returns a 16-bit value in the range -32768 to 32767. In this snippet, dividing by 32767 scales it to roughly fit between -1 and 1 to match up with Ogre's coordinate system.

Conclusion

This tutorial has covered the basics of buffered input with OIS. This input method does not rely on asking directly for the state of an input device whenever it is needed. Instead, it uses a listener system to "call back" methods that were defined to deal with the input events. The system is called buffered input because it fills a buffer with input events and then delivers them via the callbacks.

We also mentioned that OIS is not the only option for input with Ogre. The reason why Ogre is kept strictly to a graphics engine is so that you can make your own choices about what other systems to use in your application. Another popular choice for input is the Simple DirectMedia Layer (SDL). This is a widely-used and very stable library that provides simple access to audio, keyboard, mouse, and joystick. It also provides access to OpenGL for rendering graphics, but it does not render 3D.

Full Source

The full source for this tutorial is here.

Next

Basic Tutorial 6


Alias: Basic_Tutorial_5

Creative Commons Copyright -- Some rights reserved.


THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS CREATIVE COMMONS PUBLIC LICENSE ("CCPL" OR "LICENSE"). THE WORK IS PROTECTED BY COPYRIGHT AND/OR OTHER APPLICABLE LAW. ANY USE OF THE WORK OTHER THAN AS AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS PROHIBITED.

BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED HERE, YOU ACCEPT AND AGREE TO BE BOUND BY THE TERMS OF THIS LICENSE. THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED HERE IN CONSIDERATION OF YOUR ACCEPTANCE OF SUCH TERMS AND CONDITIONS.

1. Definitions

  • "Collective Work" means a work, such as a periodical issue, anthology or encyclopedia, in which the Work in its entirety in unmodified form, along with a number of other contributions, constituting separate and independent works in themselves, are assembled into a collective whole. A work that constitutes a Collective Work will not be considered a Derivative Work (as defined below) for the purposes of this License.
  • "Derivative Work" means a work based upon the Work or upon the Work and other pre-existing works, such as a translation, musical arrangement, dramatization, fictionalization, motion picture version, sound recording, art reproduction, abridgment, condensation, or any other form in which the Work may be recast, transformed, or adapted, except that a work that constitutes a Collective Work will not be considered a Derivative Work for the purpose of this License. For the avoidance of doubt, where the Work is a musical composition or sound recording, the synchronization of the Work in timed-relation with a moving image ("synching") will be considered a Derivative Work for the purpose of this License.
  • "Licensor" means the individual or entity that offers the Work under the terms of this License.
  • "Original Author" means the individual or entity who created the Work.
  • "Work" means the copyrightable work of authorship offered under the terms of this License.
  • "You" means an individual or entity exercising rights under this License who has not previously violated the terms of this License with respect to the Work, or who has received express permission from the Licensor to exercise rights under this License despite a previous violation.
  • "License Elements" means the following high-level license attributes as selected by Licensor and indicated in the title of this License: Attribution, ShareAlike.

2. Fair Use Rights

Nothing in this license is intended to reduce, limit, or restrict any rights arising from fair use, first sale or other limitations on the exclusive rights of the copyright owner under copyright law or other applicable laws.

3. License Grant

Subject to the terms and conditions of this License, Licensor hereby grants You a worldwide, royalty-free, non-exclusive, perpetual (for the duration of the applicable copyright) license to exercise the rights in the Work as stated below:

  • to reproduce the Work, to incorporate the Work into one or more Collective Works, and to reproduce the Work as incorporated in the Collective Works;
  • to create and reproduce Derivative Works;
  • to distribute copies or phonorecords of, display publicly, perform publicly, and perform publicly by means of a digital audio transmission the Work including as incorporated in Collective Works;
  • to distribute copies or phonorecords of, display publicly, perform publicly, and perform publicly by means of a digital audio transmission Derivative Works.
  • For the avoidance of doubt, where the work is a musical composition:
    • Performance Royalties Under Blanket Licenses. Licensor waives the exclusive right to collect, whether individually or via a performance rights society (e.g. ASCAP, BMI, SESAC), royalties for the public performance or public digital performance (e.g. webcast) of the Work.
    • Mechanical Rights and Statutory Royalties. Licensor waives the exclusive right to collect, whether individually or via a music rights society or designated agent (e.g. Harry Fox Agency), royalties for any phonorecord You create from the Work ("cover version") and distribute, subject to the compulsory license created by 17 USC Section 115 of the US Copyright Act (or the equivalent in other jurisdictions).
    • Webcasting Rights and Statutory Royalties. For the avoidance of doubt, where the Work is a sound recording, Licensor waives the exclusive right to collect, whether individually or via a performance-rights society (e.g. SoundExchange), royalties for the public digital performance (e.g. webcast) of the Work, subject to the compulsory license created by 17 USC Section 114 of the US Copyright Act (or the equivalent in other jurisdictions).


The above rights may be exercised in all media and formats whether now known or hereafter devised. The above rights include the right to make such modifications as are technically necessary to exercise the rights in other media and formats. All rights not expressly granted by Licensor are hereby reserved.

4. Restrictions

The license granted in Section 3 above is expressly made subject to and limited by the following restrictions:

  • You may distribute, publicly display, publicly perform, or publicly digitally perform the Work only under the terms of this License, and You must include a copy of, or the Uniform Resource Identifier for, this License with every copy or phonorecord of the Work You distribute, publicly display, publicly perform, or publicly digitally perform. You may not offer or impose any terms on the Work that alter or restrict the terms of this License or the recipients' exercise of the rights granted hereunder. You may not sublicense the Work. You must keep intact all notices that refer to this License and to the disclaimer of warranties. You may not distribute, publicly display, publicly perform, or publicly digitally perform the Work with any technological measures that control access or use of the Work in a manner inconsistent with the terms of this License Agreement. The above applies to the Work as incorporated in a Collective Work, but this does not require the Collective Work apart from the Work itself to be made subject to the terms of this License. If You create a Collective Work, upon notice from any Licensor You must, to the extent practicable, remove from the Collective Work any credit as required by clause 4(c), as requested. If You create a Derivative Work, upon notice from any Licensor You must, to the extent practicable, remove from the Derivative Work any credit as required by clause 4(c), as requested.
  • You may distribute, publicly display, publicly perform, or publicly digitally perform a Derivative Work only under the terms of this License, a later version of this License with the same License Elements as this License, or a Creative Commons iCommons license that contains the same License Elements as this License (e.g. Attribution-ShareAlike 2.5 Japan). You must include a copy of, or the Uniform Resource Identifier for, this License or other license specified in the previous sentence with every copy or phonorecord of each Derivative Work You distribute, publicly display, publicly perform, or publicly digitally perform. You may not offer or impose any terms on the Derivative Works that alter or restrict the terms of this License or the recipients' exercise of the rights granted hereunder, and You must keep intact all notices that refer to this License and to the disclaimer of warranties. You may not distribute, publicly display, publicly perform, or publicly digitally perform the Derivative Work with any technological measures that control access or use of the Work in a manner inconsistent with the terms of this License Agreement. The above applies to the Derivative Work as incorporated in a Collective Work, but this does not require the Collective Work apart from the Derivative Work itself to be made subject to the terms of this License.
  • If you distribute, publicly display, publicly perform, or publicly digitally perform the Work or any Derivative Works or Collective Works, You must keep intact all copyright notices for the Work and provide, reasonable to the medium or means You are utilizing: (i) the name of the Original Author (or pseudonym, if applicable) if supplied, and/or (ii) if the Original Author and/or Licensor designate another party or parties (e.g. a sponsor institute, publishing entity, journal) for attribution in Licensor's copyright notice, terms of service or by other reasonable means, the name of such party or parties; the title of the Work if supplied; to the extent reasonably practicable, the Uniform Resource Identifier, if any, that Licensor specifies to be associated with the Work, unless such URI does not refer to the copyright notice or licensing information for the Work; and in the case of a Derivative Work, a credit identifying the use of the Work in the Derivative Work (e.g., "French translation of the Work by Original Author," or "Screenplay based on original Work by Original Author"). Such credit may be implemented in any reasonable manner; provided, however, that in the case of a Derivative Work or Collective Work, at a minimum such credit will appear where any other comparable authorship credit appears and in a manner at least as prominent as such other comparable authorship credit.

5. Representations, Warranties and Disclaimer

UNLESS OTHERWISE AGREED TO BY THE PARTIES IN WRITING, LICENSOR OFFERS THE WORK AS-IS AND MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE MATERIALS, EXPRESS, IMPLIED, STATUTORY OR OTHERWISE, INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTIBILITY, FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OF ABSENCE OF ERRORS, WHETHER OR NOT DISCOVERABLE. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF IMPLIED WARRANTIES, SO SUCH EXCLUSION MAY NOT APPLY TO YOU.

6. Limitation on Liability.

EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE LAW, IN NO EVENT WILL LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, PUNITIVE OR EXEMPLARY DAMAGES ARISING OUT OF THIS LICENSE OR THE USE OF THE WORK, EVEN IF LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

7. Termination

  • This License and the rights granted hereunder will terminate automatically upon any breach by You of the terms of this License. Individuals or entities who have received Derivative Works or Collective Works from You under this License, however, will not have their licenses terminated provided such individuals or entities remain in full compliance with those licenses. Sections 1, 2, 5, 6, 7, and 8 will survive any termination of this License.
  • Subject to the above terms and conditions, the license granted here is perpetual (for the duration of the applicable copyright in the Work). Notwithstanding the above, Licensor reserves the right to release the Work under different license terms or to stop distributing the Work at any time; provided, however that any such election will not serve to withdraw this License (or any other license that has been, or is required to be, granted under the terms of this License), and this License will continue in full force and effect unless terminated as stated above.

8. Miscellaneous

  • Each time You distribute or publicly digitally perform the Work or a Collective Work, the Licensor offers to the recipient a license to the Work on the same terms and conditions as the license granted to You under this License.
  • Each time You distribute or publicly digitally perform a Derivative Work, Licensor offers to the recipient a license to the original Work on the same terms and conditions as the license granted to You under this License.
  • If any provision of this License is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this License, and without further action by the parties to this agreement, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable.
  • No term or provision of this License shall be deemed waived and no breach consented to unless such waiver or consent shall be in writing and signed by the party to be charged with such waiver or consent.
  • This License constitutes the entire agreement between the parties with respect to the Work licensed here. There are no understandings, agreements or representations with respect to the Work not specified here. Licensor shall not be bound by any additional provisions that may appear in any communication from You. This License may not be modified without the mutual written agreement of the Licensor and You.