Wednesday, 11 December 2013

Demo Day! Feedback on our Game

Welcome to the final blog of my game engines course!

In this blog I will be going over the presentation day for our game, starting with the feedback we received from the people who played it during the demo day. I will also cover the feedback we received in general over the course of the semester from everyone who played the game or at least had a chance to see it.

Our studio ran a couple of live streams that showed the development of our game. In our first stream we showed lots of people what our game looked like and how it plays. At that point we were three weeks away from the final demo day, so the viewers' input was very useful for our future development. Some of the feedback from that first stream was that the player needed more visual feedback. We also got positive comments on the overall idea of the game, with people saying it was a creative concept they hadn't seen before.

Since the viewers of the stream could only see the game, they did not get to experience the feel of the controls or hear any of the audio. With this problem in mind, we took our game to the lab and got others in our program to play it, so they could hear the game and feel the controls. Our play testers said the controls were not responsive enough. With this feedback we went back to the physics component of the engine and tweaked the forces being applied to the player's helicopter. Initially we just increased the applied force, and testers liked it: the controls were much more responsive and they enjoyed the game more. However, we had a better idea. We modified the throttle of the helicopter so it would return to a hover after using full throttle, which fixed the problems people had with trying to shoot and fly at the same time. We also changed the forces applied to the helicopter so they only affect the thrust of the throttle, which keeps the flying responsive.
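
Here's a minimal sketch of that return-to-hover behaviour. Everything in it is made up for illustration (the constants `kHoverThrottle` and `kReturnRate` are not our engine's real names or values): each frame the throttle eases back toward the hover point whenever the player isn't holding a thrust key.

```cpp
#include <algorithm>

// Hypothetical constants for illustration; our engine's real values differ.
const float kHoverThrottle = 0.5f; // throttle that exactly cancels gravity
const float kReturnRate    = 2.0f; // throttle change per second

// Called once per frame: push the throttle toward full/zero while a key is
// held, otherwise ease it back toward the hover point.
float UpdateThrottle(float throttle, bool thrustHeld, bool brakeHeld, float dt)
{
    if (thrustHeld)
        throttle = std::min(1.0f, throttle + kReturnRate * dt);
    else if (brakeHeld)
        throttle = std::max(0.0f, throttle - kReturnRate * dt);
    else if (throttle > kHoverThrottle)
        throttle = std::max(kHoverThrottle, throttle - kReturnRate * dt);
    else
        throttle = std::min(kHoverThrottle, throttle + kReturnRate * dt);
    return throttle;
}
```

The nice part of this design is that the player can mash full throttle to dodge, let go to shoot, and the helicopter settles itself instead of sinking.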

After showing our game in the lab, we wanted to show it to Dr. Nacke, our game design professor, and see what he had to say. His feedback was very useful: he said we need a tutorial. He tried to play our game and, even with us telling him the controls, he still couldn't work the basic mechanics of flying and shooting. He also wanted to play the game with a controller and felt it would be easier with one. This was common feedback from lots of people who played our game, and we plan to implement controller support next semester, both to give us the ability to send physical feedback to the player and to make the game easier to play.

So with a tutorial in mind and less than a week until demo day, we set out to put together a tutorial for players to learn the ropes of our new game. We decided to run another live stream while we created it to get more feedback on the best tutorial. As we did this we got even more comments and critiques from our peers, and even from one random guy from the States. People suggested lots of ways of presenting the game's mechanics that we really liked, so we took the ideas down and saved them for next semester. For now, we used text, with Chrystal reading it, for our tutorial. We broke the basic mechanics of flying down into small steps so the player didn't feel overwhelmed by all the controls.

This turned out to be quite effective when we finally got to demo day! By demo day, three weeks after we started getting others' feedback, we had not made many changes to the internal engine of our game; all the changes were to gameplay. The gameplay systems for our game should have been done in scripts, similar to how Vandelay Industries did it in their game Etesian, so we could have added, fixed, and tweaked gameplay mechanics more efficiently. Instead, our gameplay was just plastered across the classes within the gameplay components of our engine, such as the player, level, and enemy classes. It was messy but got the job done. We plan to clean up most of the gameplay code over the holiday break so it is more organized and easier to read. Code like this is what I like to call crunch code.

To end this blog off, I want to say one more time: thank you to all of those who gave us their time to look at and play our game. We really appreciate all of your feedback and help with making our game that much better. I also really enjoyed the game engines course; I learned a lot about the internal systems of an engine as well as different ways to set up an engine of my own.

Tuesday, 26 November 2013

Scripting

These are the notes I made during the Wednesday scripting lecture.

So what is scripting? Scripting exposes an interface to engine code, in our case code written in C++. A script has functions and classes. Programs that handle scripts use a special run-time environment that interprets, rather than compiles, the code, and automates the execution of tasks that would otherwise be executed one by one by a programmer or scripter. What this means is that there are files read in by the engine to determine the game's logic. This is good because you do not have to recompile that logic every time it is modified.

Here is a list of scripting languages that we came up with in class:

Squirrel
AngelScript
Lua
Python
JavaScript
ActionScript
HeroScript
MEL
GameMonkey
ChaiScript

Quake C
UnrealScript

I left Quake C and UnrealScript at the end because these two languages were built specifically for their own engines. They are technically scripting languages, but they are most at home inside their respective engines.

In class we also talked about data-declarative styles of scripting language. For example, Ogre's particle system uses a data-declarative scripting style: it declares a specific type of data and how it is going to be used. Below I have an example of a data-declarative scripting language:

XML is data declarative. XML lets you create your own markup language and read it through the DOM.

<character name="Joe">
  <strength>5</strength>
</character>

<particleSystem id="fire">
  <force>0,0,5</force>
</particleSystem>

In this example the code defines a character named Joe and then a particle system called fire that has a force of 5 in the z direction. Code like this is really simple to read and write. If you are looking for an easy data-declarative format, use XML. It lets you load and initialize this data before run time. In Doom 3 they had all of their data files in this declarative style.
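
To show how little code a reader for this style needs, here's a rough sketch that pulls Joe's strength out of the snippet above. It's a hand-rolled string search assuming well-formed, non-nested input; a real engine would use a proper XML/DOM parser, and `GetElementText` is a name I made up for illustration.

```cpp
#include <string>

// Extract the text between <tag> and </tag>. Returns "" if the tag is absent.
// Assumes well-formed, non-nested input; real code should use an XML library.
std::string GetElementText(const std::string& xml, const std::string& tag)
{
    const std::string open  = "<" + tag + ">";
    const std::string close = "</" + tag + ">";
    const size_t start = xml.find(open);
    if (start == std::string::npos) return "";
    const size_t begin = start + open.size();
    const size_t end = xml.find(close, begin);
    if (end == std::string::npos) return "";
    return xml.substr(begin, end - begin);
}
```

For example, `GetElementText(levelXml, "strength")` would give back "5" for the character snippet above.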

On a side note, we discussed file formats for objects that can be loaded in game. Only about three engines use the COLLADA format, so I don't recommend it; use the FBX or OBJ formats instead. That's about all I wanted to say about file types, since I want to focus on scripting.

There's another style I'll call runtime languages; C and C++ are the examples. They're nice because they are compiled ahead of time. The compiler takes a text file, parses it, and converts it into machine-readable code. On typical machines that means binary, so the compiler emits assembly language, which has a one-to-one mapping to binary.

There are also interpreted languages, which use a just-in-time (JIT) compiler: the code compiles while it runs rather than being fully compiled first. Here is a list of interpreted languages:

Python
Java
and all other scripting languages listed above.

I like these interpreted languages because they minimize the initial compile time, leaving more time for debugging. Java is half interpreted and half runtime: it compiles to bytecode, which the JVM then interprets or JIT-compiles.

The point of scripting is to remove the parts of C and C++ that coders don't like. It basically allows you to create gameplay code simply and quickly. A lot of the gameplay functionality lives in the scripts and never touches the engine code. The great thing about scripting is that when you are tweaking small variables you don't have to recompile your engine every time you do.
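
A tiny sketch of that tweak-without-recompiling idea, using a plain `key=value` text format rather than a real scripting language (the `LoadTweaks` helper is hypothetical, not our engine's code): the engine reads gameplay numbers from a file at startup, so changing a force or a speed is just an edit and a restart, not a rebuild.

```cpp
#include <map>
#include <sstream>
#include <string>

// Parse "key=value" lines into a name -> number table.
// A stand-in for a real scripting layer; illustrative only.
std::map<std::string, float> LoadTweaks(const std::string& text)
{
    std::map<std::string, float> values;
    std::istringstream lines(text);
    std::string line;
    while (std::getline(lines, line))
    {
        const size_t eq = line.find('=');
        if (eq == std::string::npos) continue; // skip blank/invalid lines
        const std::string key = line.substr(0, eq);
        values[key] = std::stof(line.substr(eq + 1));
    }
    return values;
}
```

With something like `thrust=10` in a text file, a designer can rebalance the helicopter without ever opening the compiler.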

Python and Lua are the most commonly used. Coders like Python the most because it's very C-like. Lua is nice because it is very lightweight, so it doesn't take much compile time, and your entire game could be written in Lua. It's free with a good license, and new updates come out every few months.

Interpreted, runtime, and data-declarative languages are the ones we need to focus on in this course, since we need a runtime language for our engine and either an interpreted or data-declarative language for the scripting in the game.

There are other types of languages too. Here I have a small list of other language types plus some examples of these languages:

Procedural Languages

These languages are based on the concepts of the unit and scope. This is a long list, including C, Java, Perl, Python, etc.

Functional Languages

LISP
F#
Haskell has a lot of multithreading capability.

Logic Based Languages

These languages specify a set of attributes that a solution must have, rather than a set of steps to obtain a solution such as ALF, Fril, Janus, and Leda.

Prolog Languages

These languages are general-purpose logic programming languages associated with AI and computational linguistics.

GUI Programming Language

LegoLogo
Alice
.Net
Visual Basic
ShaderLang (Insomniac Games)

Friday, 8 November 2013

Workflow of Aeolus

Hey there! Welcome to my blog on the workflow of Aeolus.

In this blog I will be going through the design and development flow of our levels. This will probably be more intricate next week after we add more tools and components to our game engine. Until then, here is our flow of work to get a level into our game.

To start, our flow begins in Maya. This workflow assumes the assets for the levels are done and the concept design of the level structure is complete. We load in all the assets we need, then make sure each object has the following in our game's resources folder:

- Maya scene: for you to work in
- Script: additional parameters
- Fbx: mesh export for engine
- Hkx: physics export for engine
- Material: auto-generated the first time the object is loaded

Once all of those are made and put in the right locations within our resources folder, we export our level scene. The level scene has all of the rigid body objects in the positions and orientations you want. Here are the steps to export the Havok scene correctly:

Havok Export Settings
Transform Scene
Create Rigid Bodies
Create World
Write to platform
- MAKE SURE YOU DOUBLE CHECK THE FILE NAME YOU ARE SAVING
- Must be Binary + Packfile
Preview tool

The preview tool is there to ensure everything in your scene is right. The biggest thing it catches is whether mass has been applied to the right objects in the scene.

Now that the models are rigid bodies and in the right spots, we want to add our locators for intangible things like sounds and particles. It is important that there is at least one camera locator in the scene, or else the scene will crash Aeolus. Each locator needs to be named with our naming convention so our parser can read it properly. Here are the locators working in the parser so far:

Only Locators are exported for the level
Scenery Locators
"Scenery_<Script name>_0"
Player Locator
"Helicopter_Player_0"
Particle Locators
"Particle_<Particle Effect Name>_0"
Camera Locator
"Camera_MainCamera_0"

We set it up like this so we can have many different scenery objects, like different buildings in different amounts. There will be different particle types like smoke, fire, flares, lasers, etc. Later we hope to have multiple cameras, with a small camera system to work through them all, to give our game a more cinematic feeling.

We only export the locators in the scene, and then during initialization we load each model, particle, sound, etc. for which a locator was made.
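
The parser's job is mostly string splitting. Here's a rough sketch of how a name like "Scenery_Building_0" breaks into its three parts; this is a simplified stand-in for our actual parser, and `ParseLocator` is a made-up name, not code from our engine.

```cpp
#include <string>

struct LocatorName
{
    std::string type;  // e.g. "Scenery", "Particle", "Camera"
    std::string name;  // e.g. the script or effect name
    std::string index; // e.g. "0"
    bool valid;
};

// Split "Type_Name_Index" on the first and last underscore, so names with
// internal underscores still parse. Illustrative only.
LocatorName ParseLocator(const std::string& full)
{
    LocatorName out;
    const size_t first = full.find('_');
    const size_t last = full.rfind('_');
    out.valid = (first != std::string::npos && last != first);
    if (!out.valid) return out;
    out.type  = full.substr(0, first);
    out.name  = full.substr(first + 1, last - first - 1);
    out.index = full.substr(last + 1);
    return out;
}
```

Rejecting anything that doesn't match the convention (the `valid` flag) is what keeps a stray locator from silently loading nothing.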

Then at the end of the day we get a Maya-generated level in our game! Best level editor ever, thanks Maya : D

Dynamic Cameras in God of War 3

We're back with another blog on the God of War 3 cameras!

I feel my last blog was missing an important feature of their camera system: the weighted camera component. There is a lot to say on this topic, but I just want to talk about my experience with it so far. If you are interested in reading about this system, which I highly recommend because it will really make your game stand out from the rest when you're developing your own, I will be posting links to my references on the topic.

So I was in the lab the other day working, and a fourth year next to me was coming up with the design for his game. I got distracted from my work and looked at the video of Smite he was watching, and heard him saying that he likes the camera style in the game and thinks it should be put into their game. As I was watching, I mentioned to him that the static third person camera is very dry.

After learning about cameras in class, I've noticed that anything less than God of War's cameras is not a very immersive camera system. Now, I recognize that Hi-Rez Studios, the company that made Smite, has about 60 staff, while God of War 3 had over 120 people work on it. I'm not looking to criticize Smite or their development team. I'm saying that the GOW team did an amazing job on the game's camera system, and Smite is still in beta. You never know, they might secretly release crazy camera dynamics in the end.

Anyways, back to my story. I was telling the fourth year about the dynamic camera system I learned about in my engines class. I explained to him that it is important to have more than just a static camera in your game. The camera can really improve the gameplay as well as the feeling and mood of a scene. The easiest way it improves gameplay is by bringing the player in and keeping them immersed in the game's mechanics.

The main thing I explained to them was how they could make their game more dynamic the way God of War did. From what I learned in class, they made their camera follow the player like a normal third person camera and added features to it, such as the dolly I talked about last blog. Then they added weights to key elements in the scene, like objectives and enemies. These weights pull the camera's focus towards the weighted object. The main character has the most weight, which keeps the camera focused mostly on him. The camera then moves along the dolly to fit most of the weighted objects in the scene.
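
As a rough sketch of the idea (my own toy version, not Sony's actual code): the camera's look-at point is just a weight-blended average of the interesting positions in the scene, with the player carrying the largest weight.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

struct CameraTarget
{
    Vec3 position;
    float weight; // the player gets the biggest weight; enemies/objectives less
};

// Blend all targets into one look-at point. A negative weight would push
// the focus away from that object instead of toward it.
Vec3 ComputeFocus(const std::vector<CameraTarget>& targets)
{
    Vec3 sum = {0, 0, 0};
    float totalWeight = 0;
    for (const CameraTarget& t : targets)
    {
        sum.x += t.position.x * t.weight;
        sum.y += t.position.y * t.weight;
        sum.z += t.position.z * t.weight;
        totalWeight += t.weight;
    }
    if (totalWeight != 0)
    {
        sum.x /= totalWeight;
        sum.y /= totalWeight;
        sum.z /= totalWeight;
    }
    return sum;
}
```

With a player at the origin weighted 3 and an enemy four units away weighted 1, the focus drifts only one unit toward the enemy, so the hero stays framed while the threat pulls the shot.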

So I looked at Smite's gameplay and showed them that if the camera panned out and to the side where the most enemies were, it would give the gameplay a more cinematic feeling. The camera would then dynamically move as the player moved and the enemies started to die. I thought it would be a really good idea for their game, since there are enemy players as well as enemy AI: you can weight the enemy players heavily, and with the panning cameras it would really engage the player to face the enemy player. This idea would work great for any arena style game, such as League of Legends or Dota.

Then they said they want their game to let players sneak up on each other, and they don't want the camera to ruin this mechanic if the camera is based on weights. Well, I thought, maybe take the camera system a step further and apply negative weights to certain objects in the scene, weights that push the camera away from the object. This would give players the ability to sneak up on each other. To counter this, the player being sneaked up on would have their camera pan up to more of a bird's eye view when standing still, so they can see behind them better. As they ran, the camera would pan down and look towards where they were going, leaving them more open to a sneak attack.

Well, this was my experience with dynamic cameras. Maybe later I will talk about how I noticed the dynamic cameras in the new Assassin's Creed 4: Black Flag.

Oh yeah, and here are those links I was talking about.

The not so long but still long version = http://web.cs.wpi.edu/~rich/courses/imgd4000-d10/lectures/camera.pdf

The long version = http://twvideo01.ubm-us.net/o1/vault/gdc2011/slides/Phil_Wilkins_IteratingDynamicCamera.pdf

Cameras in God of War 3

Hey guys and girls, welcome to my blog on cameras!

This blog covers the camera systems used in God of War 3. This game did an amazing job of showing how important the camera is to improving gameplay. Their ability to make every scene feel cinematic, and techniques such as putting the camera on rails, are some of the reasons why the cameras in this game are among the best I've ever seen.

These cameras had many hours put into them, with four teams working to ensure they were great. They would sometimes spend months making certain areas really epic. On top of passing camera scenes back and forth between the programmers and artists, the QA team spent lots of time making sure every moment looked as juicy as the last.

A scene could have dozens of cameras in it, all used at different times to give different effects and feels to the gameplay. They interpolate between the cameras to give smooth transitions. There is great emphasis on the fact that the camera isn't fixed: all of their cameras are highly scripted to provide a cinematic play experience.

The cameras are not controllable by the player. They did this on purpose, because the team carefully crafted each scene to look as cinematic as possible. The main tool they used to create a film-like experience is a dolly on a rail. This dolly has full maneuverability: it can move in, move out, pan, zoom, and tilt while on an arm.

With this system, every game frame recalculates the position and orientation of the camera. The rails depend on factors like the player's position and orientation as well as other important things in the scene, such as enemies or even art assets the designers wanted to show off.

Since the player has no control and some of the level designs are tricky, there needs to be a way to help the player without bluntly telling them what to do. The camera team decided to use the cameras themselves to guide the player toward where they need to go or what they need to do. They do this by setting up an on-rail camera to give the best possible viewpoint on specific points of the level.

There are some links below if there are still any questions regarding the cameras in God of War 3.

Here I have a link to a reference on the God of War 3 cameras.
http://www.eurogamer.net/articles/the-making-of-god-of-war-iii?page=2

If you are more of a video type of person like me, I have a video link of the camera systems here too.




Thursday, 7 November 2013

Phantom Triggers/Events with Havok

So for my next blog I decided to do some more Havok!

Here I will post how we plan to implement phantom triggers for our game. These are some notes I took while reviewing the Havok demos on phantom events.

You can use #include <common/visualize/hkDebugDisplay.h> to display some collision-based info.
This is where all of the 'magic' for this demo takes place, and it all centres around the custom implementations of the two pure virtual methods in the base class, namely phantomEnterEvent(…) and phantomLeaveEvent(…). These methods notify us of an entry or exit event for the phantom volume and, most importantly, provide a reference to the collidable that has penetrated the phantom. The pseudocode is below.

class MyPhantomShape : public hkpPhantomCallbackShape
{
public:
    MyPhantomShape() {}

    // hkpPhantom interface implementation
    virtual void phantomEnterEvent(const hkpCollidable* collidableA, const hkpCollidable* collidableB, const hkpCollisionInput& env)
    {
        // the colour can only be changed once the entity has been added to the world
        hkpRigidBody* owner = hkpGetRigidBody(collidableB);
        // the "collidables" here are faked, so it's necessary to get the owner first in order to get the real collidable
        // this is where the event code goes
    }

    // hkpPhantom interface implementation
    virtual void phantomLeaveEvent(const hkpCollidable* collidableA, const hkpCollidable* collidableB)
    {
        // the colour can only be changed once the entity has been added to the world
        hkpRigidBody* owner = hkpGetRigidBody(collidableB);
        // the "collidables" here are faked, so it's necessary to get the owner first in order to get the real collidable
        // this is where the event code goes
    }
};

// the reason why we call getOwner() on the original collidable, only to later call
// getCollidable() on the owner, is that the collidable passed by reference may be a 'sub'
// collidable of the rigid body, perhaps a hkpTriangleShape belonging to a larger hkpMoppBvTreeShape.
class PhantomEvents : public hkDefaultPhysicsDemo
{
public:
    PhantomEvents(hkDemoEnvironment* env) : hkDefaultPhysicsDemo(env)
    {
        // set up the camera
        // create the world
        // register the collision agents
        hkpAgentRegisterUtil::registerAllAgents( m_world->getCollisionDispatcher() );
        // create terrain
        // create models/scenery

        // below is the construction code for the phantom volume:
        // creating a PHANTOM FLOOR
        {
            hkpRigidBodyCinfo boxInfo;
            hkVector4 boxSize( 4.0f, 1.5f, 2.0f );
            hkpShape* boxShape = new hkpBoxShape( boxSize, 0 );
            boxInfo.m_motionType = hkpMotion::MOTION_FIXED;
            boxInfo.m_position.set( 2.0f, -4.0f, 0.0f );

            // MyPhantomShape is the demo implementation of the hkpPhantomCallbackShape.
            MyPhantomShape* myPhantomShape = new MyPhantomShape();
            hkpBvShape* bvShape = new hkpBvShape( boxShape, myPhantomShape );
            boxShape->removeReference();
            myPhantomShape->removeReference();

            boxInfo.m_shape = bvShape;

            hkpRigidBody* boxRigidBody = new hkpRigidBody( boxInfo );
            bvShape->removeReference();

            m_world->addEntity( boxRigidBody );

            // the colour can only be changed after the entity has been added to the world
            HK_SET_OBJECT_COLOR( (hkUlong)boxRigidBody->getCollidable(), hkColor::rgbFromChars(255, 255, 255, 50) );

            boxRigidBody->removeReference();
        }
    }
};


The phantom volume is created using a hkpBvShape, with a hkpBoxShape as the bounding volume and a hkpPhantomCallbackShape as the child shape. The volume is fixed in space and coloured with a slight alpha blend. Once the simulation starts, the ball drops into the phantom volume; upon notification of this event we colour the ball red and wait for an exit event. As soon as that event is raised, we colour the ball green and apply an impulse upwards back towards the wall, and the simulation repeats.

So basically, after making the phantom callback shape, you wrap it as the child shape of a bounding volume that marks the trigger's space. That volume is fixed in space here, but it doesn't have to be; it could be a dynamic object such as a moving objective.



Havok in Aeolus

So we finally got Havok working in our game!

In this blog I will be going over Havok in more detail, starting with something that is not Havok, kind of: Maya. The reason I'm starting with Maya is that it will be our source of Havok files. After the Havok plugin is installed in your copy of Maya, we can start adding physics to our Maya scene. The most important tool in Maya that you will use is the rigid body component. This is what applies physics properties to an object in your scene. By default it makes your object a static rigid body, which means it is collidable but will not move. Once a mass is applied to the model, it can have gravity and forces applied to it as well. This makes it a dynamic rigid body.

Once your scene has been set up in Maya, you've applied rigid body properties to the objects in your scene, and you're happy with what you have, we open up the Havok content tools. This is what it looks like.
There are 7 different filters available that let you modify the configuration set. When we create our scenes for Aeolus, we try to use the configuration set circled above. It's a very simple base-level config set where we transform our scene, then create the rigid bodies and the world. After that we have a write-to-platform config, which is used to save the hkx file. Circled on the right-hand side of the image is the format for this file: it needs to be binary and packfile, and it has to be written for Windows 32 for our game to read the files correctly. Above that, we give it a filename and the location where we want to store the file. At the bottom of this config set we have the preview tool, which we use to preview our scene after running the configuration.

Now we have this hkx file, but what is an hkx file? This is taken directly from Val, a Havok software engineer who focuses on physics, answering one of my peers when he asked this question on the Havok forum: "hkx is usually the default extension for Havok asset files, so they are used to export data from the modellers (e.g. Maya) with the Havok content tools, and to load them into a game at run-time." Thanks Val!

On to our game. We set up a small test scene in Maya, exported it, and loaded it up. Here is our model, well, just the top, since the camera is parented to the player in third person mode. I know the scene is boring, but we plan to have a polished demo with gameplay, lots of particles and scenery, plus 3D sounds by November 13. In terms of physics in this scene, we have our player, a dynamic rigid body with gravity applied to it. Using the standard WASD keys I can move the player around with forces. Since we are flying a helicopter, we also need to be able to elevate and rotate, so those controls are mapped near WASD. They are also simple applied forces.


So here is our very poorly done, very hard-coded, planning-on-fixing-it-soon code for moving the helicopter around. This is just for moving forward, but the code for the rest of the control keys is pretty much the same aside from some vector changes. As you will see from the code, the forces are not applied locally to the player, so once the player rotates, the vectors move the player inaccurately. This will be fixed soon; the goal is the date stated earlier.

switch (arg.key)
{
case OIS::KC_W: // move forward
{
    if (_RigidBody)
    {
        int lSpeed = 10;
        hkVector4 lNewVel = TOOLS_HAVOK::ToHKVector(Ogre::Vector3(1, 0, 0) * Ogre::Real(lSpeed));
        hkVector4 lNewPos = TOOLS_HAVOK::ToHKVector(Ogre::Vector3(0, 0, 0));
        hkWorld->markForWrite();
        {
            _RigidBody->GetRB()->applyForce(Ogre::Real(lSpeed), lNewVel, lNewPos);
            _RigidBody->GetRB()->setAngularVelocity(TOOLS_HAVOK::ToHKVector(Ogre::Vector3(0, 0, 0)));
            _RigidBody->GetRB()->setLinearVelocity(lNewVel);
        }
        hkWorld->unmarkForWrite();
    }
    break;
}
case OIS::KC_Z: // rotate left
    if (_RigidBody)
    {
        hkWorld->markForWrite();
        {
            _RigidBody->GetRB()->setAngularVelocity(TOOLS_HAVOK::ToHKVector(Ogre::Vector3(0, 1, 0)));
            _RigidBody->GetRB()->setLinearVelocity(_RigidBody->GetRB()->getLinearVelocity());
        }
        hkWorld->unmarkForWrite();
    }
    break;
}

I also put the rotate code in here since it's a bit different: it uses a setAngularVelocity call rather than applyForce.
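
The fix we're planning looks roughly like the sketch below (my own illustration, not engine code): rotate the local thrust vector by the helicopter's yaw before applying the force, so "forward" always means the way the nose points. A full 3D version would use the body's orientation quaternion; yaw alone is enough to show the idea.

```cpp
#include <cmath>

struct Vec { float x, y, z; };

// Rotate a local-space thrust vector into world space by the body's yaw
// (rotation about the y axis). With the body's full orientation you'd do
// the same transform in 3D; this 2D version is illustrative only.
Vec LocalToWorldThrust(const Vec& local, float yawRadians)
{
    const float c = std::cos(yawRadians);
    const float s = std::sin(yawRadians);
    return { local.x * c + local.z * s,
             local.y,
             -local.x * s + local.z * c };
}
```

With this in place, the KC_W case would pass `LocalToWorldThrust(forward, yaw)` to applyForce instead of a fixed world-space vector, and turning the helicopter would no longer break the controls.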

For my next blog I'm not sure what I plan on doing but it will be something cool.