This tutorial explains how QuickGUI works on a high level, and includes what is needed by QuickGUI in order to be used in your application. For those who would like to see a full solution, a QuickGUIOgreDemo application is packaged with QuickGUI, which shows how Ogre and QuickGUI are used together.


Previously, QuickGUI used Ogre as its primary method of rendering the UI; in this new version, rendering has been abstracted out of the library. QuickGUI introduces a few abstract classes that must be implemented and passed in for QuickGUI to operate. Not only does this remove QuickGUI's dependency on Ogre, but it gives the developer control over how rendering occurs. This is especially useful if you work in an environment with certain limitations (e.g., power-of-2 texture sizes) or use a specific or custom graphics library. In any case, you are in complete control of the rendering, so you can always adjust it without modifying QuickGUI source!

What if I don't know how to write a solution using Ogre for manual rendering?

In this tutorial I will provide an implementation that uses Ogre 1.7. It's important to realize that you can write your own rendering solution, and that the provided solution may not be the most efficient for all scenarios.

Creating QuickGUI Core

The Core class is similar to Ogre's Root class, providing access to all main subsystems within QuickGUI, and allowing creation and destruction of Interface objects. An Interface is a class that represents a collection of Window and RenderTarget objects, and will be covered later in this tutorial.

Core constructor:

/**
 * Constructor. The passed-in ResourceManager and SkinEffectManager will NOT be owned by the QuickGUI Core, as
 * they might be part of a class with scope larger than QuickGUI. Passing in a NULL value for the SkinEffectManager
 * is valid, as long as your UI Skins do not have any Skin Effects.
 */
Core(ResourceManager* m, SkinEffectManager* sem);

NOTE: Both the constructor and destructor are public, and should be created/destroyed by your application.

As seen in the comments, a ResourceManager and SkinEffectManager are required. The SkinEffectManager is part of the Skinning system, and will be covered in another tutorial. Passing in NULL for the SkinEffectManager is perfectly acceptable, provided you don't use any SkinEffects (which will be covered in the same tutorial as the SkinEffectManager).


The ResourceManager class is an abstract class in QuickGUI, meant to be derived and implemented by your application code. It provides the following functionality:

  • Access to Images by name. An Image is a class representing an image file on disk, and gives read-only access to pixel data. Depending on your implementation, this could also work for Images in memory.
  • Creation and destruction of RenderTexture objects. This class represents a texture in memory that supports drawing of Images and rectangles to it.
  • Creation and destruction of Texture objects. A Texture represents an image in memory, and provides both read and write access to pixel data.

Here is the provided implementation used by the QuickGUIOgreDemo application:

NOTE: This class won't compile successfully until the Image, RenderTexture, and Texture classes have been defined and included!
NOTE: When you implement a ResourceManager for QuickGUI, you do not have to create a stand alone class specifically for QuickGUI. For example, your application could have a generic ResourceManager that inherits from the QuickGUI ResourceManager. Pass a pointer to this generic resource manager to the Core constructor, and QuickGUI will work just the same.


A recognized QuickGUI Image must have the following interface:


	/** Returns the ColorValue of the pixel at the position specified. */
	virtual ColorValue getColorAtPosition(const Position& p) = 0;
	/** Gets the name of this Image. */
	virtual std::string getName() = 0;
	/** Gets the size of this Image, in pixels. */
	virtual Size getSize() = 0;

Here is the provided implementation used by the QuickGUIOgreDemo application:


The RenderTexture interface is quite large. Here are a few of the APIs required by the class:

  • clear
  • drawLine
  • drawImage
  • drawRect
  • drawTiledImage
  • getClippingBounds
  • setClippingBounds
  • writeContentsToFile

Here is the provided implementation used by the QuickGUIOgreDemo application:

It is strongly recommended to look through the source of these files to gain a better understanding of one approach to providing an implementation for this class.


A recognized QuickGUI Texture must have the following interface:


	/**
	 * Copies a portion of an Image's contents to this Texture.
	 * NOTE: A width and height of 0 will be expanded to take the maximum width and height
	 *  of the source or destination area.
	 */
	virtual void copyImageToTexture(Image* i, Rect dest = Rect::ZERO, Rect source = Rect::ZERO) = 0;

	/** Locks the pixel buffer for reading and writing purposes. */
	virtual unsigned char* lockBuffer() = 0;

	/** Unlocks the pixel buffer. */
	virtual void unlockBuffer() = 0;

	/** Writes the Texture contents out to an image file. */
	virtual void writeContentsToFile(const std::string& fileName) = 0;

NOTE: The Texture class inherits from the Image class, so the derived Texture class will need to implement all APIs defined by the Texture interface as well as the Image interface.

Here is the provided implementation used by the QuickGUIOgreDemo application:

So now we've provided QuickGUI with methods to retrieve image data and draw it to textures in memory. What's next?

QuickGUI Scene

As you might have guessed, QuickGUI works by generating Textures and displaying them. We've provided methods to generate textures, but what about displaying them? The Scene class provides QuickGUI with this functionality via two kinds of RenderTargets: the Overlay and UIPanel classes.

The Scene class should be thought of as the object representing your 3D scene. Overlays are 2D panels that are drawn on top of your view, and do not change with Camera orientation. UIPanels are 3D panels inside your scene that can change orientation, and get smaller, larger, or move out of view, depending on your Camera's position and orientation within the 3D scene. It's worth noting that the UIPanel doesn't have to be implemented if you don't want to support 3D UI. Likewise, the Overlay class doesn't have to be implemented if you don't want to support 2D UI. Simply return NULL for the creation and accessor methods, allowing the Scene class to compile correctly.

Here is the provided implementation used by the QuickGUIOgreDemo application:

NOTE: This class won't compile successfully until the Overlay and UIPanel classes have been defined and included!


The Overlay class is pretty easy to implement, since Ogre already has an Overlay implementation.

Some of the APIs required by the QuickGUI Overlay interface:

  • getPosition
  • getRenderTexture
  • getSize
  • getWindow
  • getZOrder
  • offsetPosition
  • setPosition
  • setRenderTexture
  • setSize
  • setWindow
  • setZOrder

NOTE: The Overlay interface inherits from the QuickGUI RenderTarget interface, so the derived class will need to implement APIs required by both interfaces.

Here is the provided implementation used by the QuickGUIOgreDemo application:


In terms of Ogre, the UIPanel is a very simple ManualObject consisting of 6 vertices, or 2 triangles, that make up a quad. The required interface is very similar to the QuickGUI Overlay interface, except for getPosition and setPosition.

NOTE: The UIPanel interface inherits from the QuickGUI RenderTarget interface, so the derived class will need to implement APIs required by both interfaces.
NOTE: QuickGUI does not require any UIPanel APIs related to orientation, but that doesn't mean you shouldn't add these in. This will allow you to perform special effects such as UI facing certain targets, or changing orientation.

Here is the provided implementation used by the QuickGUIOgreDemo application:

So now that we have a working Scene implementation, how do we let QuickGUI use it?


In QuickGUI, an Interface represents a grouping of UI Windows and the RenderTargets that display them, and allows for input injection to interact with the UI. Note that it's perfectly acceptable to have multiple RenderTargets sharing the same Window (texture). This means you could have multiple Overlays and UIPanels displaying the same Window; any change to the Window's texture will be reflected by all RenderTargets.

Interfaces are created via Core APIs:

/** Creates an empty Interface. */
Interface* createInterface(const std::string& name, Scene* s);
/** Creates an Interface as defined in XML data, using the name given. */
Interface* createInterface(const std::string& name, XMLData* d, Scene* s);

The second API makes use of the XMLData class, which will be outlined in another tutorial.

Input Injection

Input injection is the primary way users will interact with the GUI, and is done through Interface APIs:

/**
 * The KeyCode represents the keyboard key that went down; the unicode_char represents the Text that is injected.
 * If there is no text associated with the key, Text::UNICODE_NEL should be passed in.
 * Example: KC_H, "h"
 */
bool injectKeyDown(const KeyCode& kc, unicode_char c = Text::UNICODE_NEL);
bool injectKeyUp(const KeyCode& kc);
/** The following functions interact with the GUI using the Mouse. */
bool injectMouseButtonDown(const MouseButtonID& button);
bool injectMouseButtonUp(const MouseButtonID& button);
bool injectMouseClick(const MouseButtonID& button);
bool injectMouseDoubleClick(const MouseButtonID& button);
bool injectMouseTripleClick(const MouseButtonID& button);
/**
 * This function changes the position of the Mouse Cursor, potentially entering or leaving widgets,
 * causing UI state changes.
 */
bool injectMousePosition(const int& xPixelPosition, const int& yPixelPosition);
bool injectMouseWheelChange(float delta);

Some important things to note:

  • QuickGUI does not filter code point injection, as in previous versions. It's the responsibility of the application to decide what input to pass into the Interface.

From the QuickGUIOgreDemo application:

bool MainWindow::keyPressed(const OIS::KeyEvent &arg)
{
	unsigned int codePoint = arg.text;

	// By default, we support codepoints 9, and 32-166. Keys outside this
	// range carry no text, so inject UNICODE_NEL in their place.
	if(!((codePoint == 9) || ((codePoint >= 32) && (codePoint <= 166))))
		codePoint = QuickGUI::Text::UNICODE_NEL;

	// Forward the key press; adjust the KeyCode cast if your key mapping differs.
	mInterface->injectKeyDown(static_cast<QuickGUI::KeyCode>(arg.key), codePoint);

	return true;
}

  • If using OIS, the MouseButtonIDs cannot be directly cast to QuickGUI MouseButtonIDs, as in previous versions.

From the QuickGUIOgreDemo application:

bool MainWindow::mousePressed(const OIS::MouseEvent &arg, OIS::MouseButtonID id)
{
	// QuickGUI's MouseButton IDs are binary values (1/2/4/8/16/32/64/128),
	// while OIS button IDs are sequential, so shift to convert.
	mInterface->injectMouseButtonDown(static_cast<QuickGUI::MouseButtonID>(1 << id));

	return true;
}

  • By default, the Interface class will automatically determine click, double click, and triple click events and inject them for you. If you want to inject these input events manually, call Interface::setDetermineClickEvents(false);.

What's next?

Now that you're all set up and ready to write GUIs, where do you start? The next tutorial will cover creating a basic UI. (However, if you want to dive in, the Interface class is the place to start!)