Game Jam – Leap Motion 3D Jam – Day 5

Win conditions, particle effects, other minor changes

 

After showing the game off at the Abertay Game Development Society again a few nights ago, I got some more valuable feedback about how I could improve the game.  The main comments were;

  • Hotseat multiplayer needed a longer switch-over time; it was too chaotic for everyone to try to get into position, and the game would often start without them.  To solve this I made the game wait for the previous hands to disappear and new hands to be found in the scene.
  • People were still having trouble with some of the minigame instructions; as a result I removed the Scream and Bomb Defusal games (which lacked polish anyway).  I also changed the phrasing of the target practice to mention shooting, so that the gestures would be more obvious.
  • Watching people try the Target Practice minigame, I noticed almost everyone first tried to fire by imitating a gun’s hammer with their thumb, so I added this as the main gesture.  It feels a lot better and is much easier to aim with than the previous recoil gesture.

Most of the game foundation is functional now, leaving me to;

  • Improve the main menu
  • Generally polish the game
  • Add more minigames

Game Jam – Leap Motion 3D Jam – Day 4

Implemented the first iteration of hot-seat multiplayer today.

The main menu now allows the users to select how many players there are (up to four), who are each represented differently when they play. There are currently four characters; Weird Business, Bling Bling, Ruby Slippers, Unknown.

I also experimented a little with using a gesture to return to the main menu, one which would be easy to perform deliberately but unlikely to be triggered accidentally during gameplay. I went for a ‘timeout’ style one-hand-perpendicular-to-the-other gesture, which should be unambiguous.
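
I haven’t shown the detection code here, but the core of a check like this is just vector maths: the two hands’ direction (or palm normal) vectors should be roughly perpendicular, i.e. their dot product should be close to zero.  A minimal sketch (the vectors and tolerance are placeholders, not the actual implementation):

```python
def is_timeout_gesture(dir_a, dir_b, tolerance=0.25):
    """Rough 'timeout' pose check: the two hands' direction vectors
    should be roughly perpendicular, i.e. their dot product is close
    to zero.  dir_a/dir_b are unit (x, y, z) tuples from the tracker."""
    dot = sum(a * b for a, b in zip(dir_a, dir_b))
    return abs(dot) < tolerance

# Example: one hand pointing forward, the other pointing sideways.
print(is_timeout_gesture((0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))  # True
```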

Game Jam – Leap Motion 3D Jam – Day 2

Today I worked on adding a couple more minigames and polishing up the game aesthetic.

Leap Jam 2

One of the new minigames is the Target Practice mode shown above.  Players shape their hands like guns and imitate a gun’s recoil in order to fire.

I decided to go with an art style similar to the one I used last year with Breakfast Simulator, a flat coloured cel-shaded aesthetic.  I also replaced the default Arial font with something with a carnival style to better fit the theme.

 

After showing the game off to a few friends at the Abertay Game Development Society and watching their fumblings, I realised that many of the gestures for the minigames needed to be clearer.  Initially I had the gun fire on a recoil gesture straight backwards away from the scene, whereas many people expected to fire either by pressing their thumb down like a gun’s hammer or by recoiling at an angle.  Overall the feedback was very positive & I hope to make the experience more accessible before presenting it to them again next week.

 

I also set up the GitHub repository today;
 
johnjoemcbob/LeapJam

Game Jam – Leap Motion 3D Jam – Day 1

I’ve decided to enter the Leap Motion 3D Jam again this year; the jam lasts 6 weeks and is only 2 weeks through, so I still have plenty of time to work on my submission.  Last year I spent a few days working on an Oculus Rift DK2/Leap Motion Controller experience and created Breakfast Simulator.  Overall I’m happy with what I accomplished; it was my first experience with developing for or using any virtual reality technology, so I spent most of my time playing around in the sandbox kitchen I created.  This year, however, I’ve decided to forgo VR and have a larger focus on actual gameplay.

With this in mind, I have developed a simple prototype for a WarioWare style game (i.e. a multiplayer collection of quick minigames) which I can expand and polish easily.  Due to the limitations of the Leap, I plan to implement a hot-seat style multiplayer mode (i.e. players take turns to complete each minigame).  Players will have unlimited retries on each game until one of them completes it, at which point the remaining players will have one last chance to succeed before moving on.
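
As a rough sketch of that round structure (the players list and the minigame.attempt call are hypothetical stand-ins, not the actual game code):

```python
def play_round(players, minigame):
    """One hot-seat round: players retry freely until somebody completes
    the minigame, then the remaining players get exactly one final
    attempt each before the game moves on."""
    finished = set()

    # Free retries until the first player succeeds.
    while not finished:
        for player in players:
            if player not in finished and minigame.attempt(player):
                finished.add(player)
                break

    # Everyone else gets one last chance.
    for player in players:
        if player not in finished and minigame.attempt(player):
            finished.add(player)

    return finished
```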

 

Here is a video of my day one prototype progress;

Rust Modding: Team Colours

Just a quick update today: implemented the team selection menu & chose some appropriate clothing to identify the teams.

Rust: Team Select 1

I ran into a bug which caused the client and server to become unresponsive, so I’ll be looking into that tomorrow.

I’ve also been trawling the forums for references to plugins changing weapon variables.  After a while I stumbled across someone with a similar question; they wanted to know how to change the types of ammo a weapon would accept.  The general consensus of the replies seems to be that these variables have client-side counterparts which plugins cannot currently change, which lines up with what I was experiencing.  This means that I’ll have to make do with the weapons as they are now, which is workable if a bit disappointing.  After Garry’s Mod I can only assume that Facepunch have plans for an official modding API, or at least more support, in the future.

Rust Modding: Changing Game Variables

I realised yesterday, after creating the basic loadout system and testing it with the Soldier class, that for the plugin to be functional I would have to edit the base game variables of weapons and ammo to alter fire rates and damage.

Rust: Class Select 2

With this in mind, today I have been exploring similar plugins which edit base game values, such as Stack Size Controller and ItemConfig.  Reading through Stack Size Controller gave me a good base knowledge of how to achieve this, which ItemConfig – being the more complex of the two – built on.  I couldn’t find any examples of plugins modifying weaponry specifically, so I spent a lot of time printing out the various Components each GameObject possessed and trying to work out which was which.

 

At the end of the day I’m still hitting a wall trying to apply changes to weaponry, so I’ll be leaving this for a bit and working on team colours tomorrow.


Rust Modding: Introduction

Started looking into modding Rust today.

Rust: Class Select 1

While reading through the Rust devblogs and community updates I discovered a few server-side mods which caught my attention, and so I decided to have a closer look at the tools available.  I’ve decided to use the Oxide modding API for now, as it seems to have been in consistent development longer than the other options and allows for plugins created using C#.  I also found that Garry added a “community entity” to Rust, which allows modders to send GUI JSON code through remote procedure calls to the clients.  This opens up a lot of possibilities for in-depth Rust plugins which can communicate with players without spamming the chat window, which I had previously assumed would be my only option.


Honours Project: Initial Thoughts

With this being my final year of studying Computer Games Technology I’ll be working on a games programming related honours project, and later writing a dissertation about it.  I will be posting updates here as I progress with the project.

 

The main outcome I want for this year is to have a quality game prototype to add to my portfolio.  The project must have a technical focus to allow me to research & document my findings, and so my first goal is to decide what this focus will be.  Here are my current thoughts;

  • Procedural dungeon generation
    • This would allow me to write about the technicalities of the generation, while giving me the perfect foundation to create the rogue-lite game I want to prototype.
  • Analytics to support game design
    • To create a tool for a current game engine (Unity 5, Unreal 4) allowing developers to gain a better perspective of how people are interacting with their game & present this in a useful way.
  • Analytics to create dynamic difficulty
    • A game system which would use the analytical data collected from the current player’s session and compare it against previous sessions in order to alter the difficulty of the game through level design, resource spawn rates, and AI complexity scaling.

Project – Multiple Fire Planes

This week I finalised the simulation by adding functionality for the fire to spread between multiple planes.

The fire spreading information is stored in an array of MAX_OBJECT objects.  Each fire spreading plane in the scene is initialized with a unique ‘fireID’ attribute which gives the index in the array to look up when simulating the fire.  Each object has a separate particle system; however, they are all linked to the same nucleus and so all share the same wind and gravity forces.
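
A minimal sketch of how this lookup could work, assuming the simulation is driven from Maya Python; the attribute and function names here are illustrative rather than the actual script:

```python
import maya.cmds as cmds

MAX_OBJECTS = 16
GRID_SIZE = 16
# One entry of fire-spreading data per registered plane.
fire_grids = [None] * MAX_OBJECTS

def register_fire_plane(plane, fire_id):
    """Tag a plane with a 'fireID' attribute so the simulation can find
    its spreading grid in the array later."""
    if not cmds.attributeQuery('fireID', node=plane, exists=True):
        cmds.addAttr(plane, longName='fireID', attributeType='long')
    cmds.setAttr(plane + '.fireID', fire_id)
    fire_grids[fire_id] = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]

def grid_for(plane):
    """Look up the spreading grid belonging to a plane via its fireID."""
    return fire_grids[cmds.getAttr(plane + '.fireID')]
```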

I ran into a problem where all the fire objects ignited in the same way at the same time, despite having different array elements describing them.  After looking up similar issues, I found that it was due to the way I was creating the two-dimensional arrays, which meant all the objects would share the same inner array data.
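
If the grids are built as nested Python lists, this is the classic aliasing pitfall: multiplying the outer list duplicates references to a single inner list rather than creating independent rows.

```python
# Broken: the outer list holds 16 references to the SAME inner list,
# so igniting one cell appears to ignite it in every row/object.
shared_grid = [[0.0] * 16] * 16

# Fixed: build a fresh inner list for every row.
independent_grid = [[0.0] * 16 for _ in range(16)]

shared_grid[0][0] = 1.0
independent_grid[0][0] = 1.0
print(shared_grid[5][0])       # 1.0 - the rows are aliased
print(independent_grid[5][0])  # 0.0 - each row is its own list
```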

 

Collision events are fired when the blue (less frequent) particles collide with a rigid body.  When this happens, the mesh the body belongs to is looked up, the face closest to the collision is set alight, and the simulation starts.  The main visual fire particles have collision turned off due to their number, in an attempt to counteract any slowdown caused by the simulation, as I am targeting real-time game applications.
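
A rough sketch of the closest-face lookup, assuming Maya Python; a brute-force centroid comparison is enough for the low-resolution planes used here (names are placeholders):

```python
import maya.cmds as cmds

def closest_face(mesh, point):
    """Return the index of the mesh face whose centroid is nearest to
    the given world-space collision point."""
    best_face, best_dist = -1, float('inf')
    for i in range(cmds.polyEvaluate(mesh, face=True)):
        # xform on a face component returns its vertex positions, flattened.
        verts = cmds.xform('%s.f[%d]' % (mesh, i), query=True,
                           worldSpace=True, translation=True)
        count = len(verts) / 3.0
        cx = sum(verts[0::3]) / count
        cy = sum(verts[1::3]) / count
        cz = sum(verts[2::3]) / count
        d = (cx - point[0]) ** 2 + (cy - point[1]) ** 2 + (cz - point[2]) ** 2
        if d < best_dist:
            best_face, best_dist = i, d
    return best_face
```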

Project – Fire Particles & Wind

My main focus this week was on making the particles look more like real fire.

After the issues last week, my first task was to fix the particles not rising out of the plane at all.  After some experimentation I realised that the particles would have to be initialized on the first frame of the animation, even if they didn’t start emitting at that point – just to ensure the dynamics were properly set up for playback.

Next I added an object called ‘FEP_Controller’ (Fire Emission Plane), which controls the wind direction and strength for the fire spreading, allowing the user to keyframe these values for more control over the effect.

I then made the wind values looked up from the controller influence the particles in the 3D scene by applying them to the simulation’s nucleus.
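
Something along these lines in Maya Python; the windStrength/windDirection attribute names on the controller are placeholders, while windSpeed and windDirectionX/Y/Z are the standard nucleus attributes:

```python
import maya.cmds as cmds

def apply_wind_from_controller(controller='FEP_Controller', nucleus='nucleus1'):
    """Read the keyframed wind values from the controller object and push
    them onto the simulation's nucleus, so the particles share the same
    wind as the spreading logic."""
    strength = cmds.getAttr(controller + '.windStrength')
    direction = cmds.getAttr(controller + '.windDirection')[0]  # (x, y, z)

    cmds.setAttr(nucleus + '.windSpeed', strength)
    cmds.setAttr(nucleus + '.windDirectionX', direction[0])
    cmds.setAttr(nucleus + '.windDirectionY', direction[1])
    cmds.setAttr(nucleus + '.windDirectionZ', direction[2])
```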

 

I thought I would have a chance to go back to working on the wood blackening effect, and to add the metal glowing interaction; however, at this point I am running out of time and want to focus on making the simulation work properly.  With this in mind, my final task will be to have the fire spread across multiple objects.

 

Project – Fire Particles

My work this week was focused on combining the extra fire grid polygons and emitting particles from the surface of the united output, which can be seen in the video below;

The planes are combined two at a time as they are created: the first one starts as the ‘united plane’, then each one after it renames the previous result and combines with it to form a new ‘united plane’.  The particle emitter is then deleted and recreated on the new ‘united plane’, and linked to the existing particle system.  This particle system is set up once at the start of the animation and initialized with some basic colour changing attributes, which give it a fiery effect.
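
A hedged sketch of that combine-and-re-emit step in Maya Python (the object and particle names are placeholders, not the actual script):

```python
import maya.cmds as cmds

UNITED = 'united_plane'

def add_plane_to_united(new_plane, particles='fireParticleShape1'):
    """Combine a newly lit plane into the running 'united plane', then
    rebuild the surface emitter on the result and reconnect it to the
    existing fire particle system."""
    if cmds.objExists(UNITED):
        previous = cmds.rename(UNITED, UNITED + '_old')
        united = cmds.polyUnite(previous, new_plane, name=UNITED,
                                constructionHistory=False)[0]
    else:
        united = cmds.rename(new_plane, UNITED)

    # Delete the old emitter and emit from the surface of the new result.
    if cmds.objExists('united_emitter'):
        cmds.delete('united_emitter')
    emitter = cmds.emitter(united, type='surface', rate=200,
                           name='united_emitter')[-1]
    cmds.connectDynamic(particles, emitters=emitter)
```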

 

I currently have an issue where the particles do not rise from the plane at all, even though the inverted gravity in the scene should be pulling them up into the air.  My next goal is to fix this.

Project – Fire Spreading 3D

After this week’s work, the grid of fire which used to output to the console now creates 3D planes representing the subdivisions of the burning plane.  I have also been experimenting with using HLSL shaders inside the Maya Viewport 2.0, creating the blackening effect seen on the wood texture in the video.

The shader uses the original texture and a noise filter texture in order to create the blackening effect when a burn value is increased.  Parts of the noise texture are interpreted as embers, displayed as red lines through the wood, and parts as blackened.
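
The shader itself is HLSL, but the thresholding idea can be illustrated with a few lines of Python-style pseudocode (this is my interpretation of the effect, not the actual shader):

```python
def shade_pixel(wood_rgb, noise, burn, ember_band=0.05):
    """Conceptual per-pixel version of the blackening shader: pixels whose
    noise value has been passed by the burn value turn black, and pixels
    right at the burn front glow as red embers."""
    if abs(noise - burn) < ember_band:
        return (0.9, 0.2, 0.05)                   # glowing ember at the front
    if noise < burn:
        return tuple(c * 0.1 for c in wood_rgb)   # already burnt: blackened
    return wood_rgb                               # untouched wood
```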

 

Each extra polygon is stored in a group within the scene, so that the outliner is not crowded and to allow easy deletion of all of them on the first frame.

The green 3D planes represent which subdivisions are currently alight, and will be combined into one polygon to use as a surface particle emitter for the fire effects, which is my goal for next week.

Game Jam – The Meatly – Day 3

By the end of yesterday I felt I had a pretty good grasp of the ChilliSource engine, so today I just did some more experimentation with its UI system.

 

I still didn’t have a game by the end of the jam, but I’m quite happy with having experimented with dialogue systems again, and having an attempt at isometric drawing.

The isometric style was easy to pull off in ChilliSource; as it is both a 2D and 3D engine, I simply placed the tiles at descending y values rather than having to hard-code the draw order.
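
As a rough illustration of the idea (the tile sizes and depth convention are assumptions, not ChilliSource code), the tile’s screen position and depth both fall straight out of its grid coordinates, so no explicit draw ordering is needed:

```python
TILE_W, TILE_H = 64, 32  # assumed on-screen size of one isometric tile

def place_tile(gx, gy):
    """Project integer grid coordinates to an isometric screen position and
    derive a depth value from the row, so tiles further down the map are
    drawn behind nearer ones automatically."""
    screen_x = (gx - gy) * TILE_W / 2.0
    screen_y = (gx + gy) * TILE_H / 2.0
    depth = -(gx + gy)  # descending values as we move down the map
    return screen_x, screen_y, depth
```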

 

I saw this pop up on my Twitter feed, which inspired me; moving forward I’m considering creating a procedural city with Kenney’s isometric assets (he has also provided the 3D source files) for my university Procedural Methods coursework, rather than the procedural terrain I was working on before.

Game Jam – The Meatly – Day 2

So far today I’ve been working on the random event dialogue system.  ChilliSource’s UI system took a bit of experimentation to get working, but now that I understand how to use it, it seems very powerful.  The event panel was easy to set up & was a nice change from building a UI system from scratch, as I did for Frontier Town‘s dialogue.

 

Here’s a video of what I have so far;

 

I’m happy with the way it’s coming together so far, especially the title screen, but in hindsight I should have planned more of the actual gameplay before I began.  For now it seems like my entry will be mostly about the dialogue system, with stats increasing depending on the options you pick.

 

Next Steps:

  • Game developer skills screen & progression

 

Extra:

  • NPCs
  • Proper game city (will probably just load a map)
  • Vehicles

Game Jam – The Meatly – Day 1

I’m participating in The Meatly Game Jam this weekend as a practice run for the upcoming ChilliSource Game Jam, as I want to get some experience with the engine rather than wrestling with it during the proper jam.

 

The theme for this jam is “All things TheMeatly: Life as a Game Developer” which is pretty wide open.  I’ve decided to make a dialogue based game in which you play as an aspiring game developer, going to game jams and studying to increase your skills.  I’ve also been meaning to make a game using Kenney.nl’s isometric city assets, so I’m going to incorporate them into the game.  I’m planning for it to be similar to Game Dev Tycoon, but more dialogue based.

 

 

The jam started at 16:00, and so far I’ve spent 5 hours experimenting with ChilliSource & making my title screen.  A lot of what I have so far is placeholder, including the title, but here’s a screenshot of what it looks like;

Game Dev Story

Next Steps:

  • Dialog boxes
  • Random events
  • Game developer skills screen & progression

 

Extras:

  • Proper game city (will probably just load a map)
  • Vehicles

Project – Fire Spreading Grid

This week I’ve been working on getting the basic fire spreading working in a two-dimensional array of grid spaces.  The fire spreads strongest in the direction which aligns with the wind, but also spreads (more slowly) against the wind’s push.

The wind’s effect on each of the four cardinal spreading directions is calculated and cached each time the wind direction or strength changes.
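
A small sketch of that caching step (the influence constant and base rate are placeholder values, not the project’s actual numbers):

```python
WIND_INFLUENCE = 0.75  # how strongly the wind biases the spread (assumed)

# Cardinal spreading directions as unit vectors: N, E, S, W.
DIRECTIONS = {'N': (0, 1), 'E': (1, 0), 'S': (0, -1), 'W': (-1, 0)}

def cache_spread_rates(wind_dir, wind_strength, base_rate=0.1):
    """Recompute the per-direction spread rates whenever the wind changes.
    Spreading with the wind is strongest; spreading against it still
    happens, just more slowly."""
    rates = {}
    for name, (dx, dy) in DIRECTIONS.items():
        alignment = dx * wind_dir[0] + dy * wind_dir[1]  # -1 .. 1
        rates[name] = base_rate * (1.0 + WIND_INFLUENCE * wind_strength * alignment)
    return rates

# e.g. wind blowing east at full strength
spread_rates = cache_spread_rates((1.0, 0.0), 1.0)
```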

 

My next steps will be towards displaying the spreading on 3D planes, by creating new geometry in real time to emit particles from.

Maya nHair

This week I worked on adding hair to a mannequin bust.  I added a curly beard to the model, which behaved strangely when the simulation was running, extending outwards quickly.

Project Idea – Blacksmith

Starting to think about what I want to make as my coursework piece, one idea I have is to simulate a medieval blacksmith’s workshop.

This would give me the chance to;

  • Simulate fire
  • Experiment with dynamic interaction between fire and different material types
  • Work more on my molten metal fluid
  • Implement some cloth based mechanical systems

When I was experimenting with Unreal 4 I had a look at their fire material interaction, which inspired me & is what I would be aiming for.  You can see it below.

 

I also found this video recently;

This caught my interest as it shows how diverse the uses of cloth can be, and I think a rope pulley system could fit well in my workshop as an extra dynamics system.

Cloth and Game Engines

While researching different methods of creating cloth in Maya, I decided to look into each of the big three game engines (Unreal 4, Unity 4, CRYENGINE 3) to see what cloth types they support.

First I looked at Unreal 4, as I have had some brief experience with it.  It uses Nvidia PhysX, which can be added to Maya as a plugin.  This allows the cloth to be authored and previewed in Maya, giving the artist the ability to affect the simulation by painting values, while the cloth is still simulated in engine.  This means that cloth can be affected by outside forces in the game (e.g. getting hit by a projectile).

After then looking into support for cloth in Unity 4, I found that the best process seemed to be using Unity’s built-in cloth components.  This is useful as it allows values to be tweaked quickly, without re-importing, and while the game is running.  However, it also means that the artist will need access to the engine while adding cloth to characters, rather than being able to build it on top of the character rig in Maya.

CRYENGINE 3 has support for importing cloth from Maya.  However, it seems to depend on CRYENGINE’s own interpretation of the cloth rather than using the PhysX or nDynamics cloth systems.  This means that, as with Unity, the artist will probably be unable to see a representation of how the cloth will look without actually importing it and running it in the engine.

One method that would work for all three engines is simply to cache & bake the cloth simulation into the animation once the artist is happy with how it looks.  This has the advantage of looking exactly the same in-game as the artist intended; however, it also means that the cloth cannot be affected by outside forces in the game.  From what I have seen, Unreal 4’s system seems best suited to artists, though Unity 4’s could also be useful for quick tweaks or for someone with less Maya experience.

Merry-Go-Round

In the labs this week we were working on our first procedural animation, and were tasked with making a Merry-Go-Round spin with functionality to animate it changing speed. We were using the scene provided in this tutorial.

Scripting and Dynamics: Merry-Go-Round

 

The horses are set up parented to the circular base they are attached to, which rotates during the animation at the set speed.  Each horse is then moved up and down vertically using a sine wave of the set frequency.

Scripting and Dynamics: Merry-Go-Round

 

At this point there were several problems with the animation:

  • The animation wasn’t the same every time, even if the same values were used: this was because the speed was simply used as the amount to increase the rotation by each frame.  This was solved by deriving the per-frame rotation increase from both the speed & the maximum number of frames, meaning that for a speed of 1 the whole animation represents exactly 1 full rotation (see the sketch below).
  • It was tedious to try and change the frequency of every horse to be the same: every slider would have had to be adjusted to the same position.  This was solved by adding a ‘change all frequency’ slider to the top of the menu, while still having the individual horse values below.
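
A minimal Maya Python sketch of the fixed behaviour (the object names and constants are placeholders, not the actual script):

```python
import math
import maya.cmds as cmds

SPEED = 1.0        # 1.0 == exactly one full rotation over the animation
GLOBAL_FREQ = 2.0  # bobbing cycles per rotation, shared by every horse
BOB_HEIGHT = 0.5

def update_merry_go_round(base='merryBase', horses=('horse1', 'horse2')):
    """Per-frame update: rotation is derived from the current frame and the
    total frame count, so the same settings always produce the same
    animation, and every horse bobs with the shared frequency."""
    frame = cmds.currentTime(query=True)
    total = cmds.playbackOptions(query=True, maxTime=True)

    rotation = SPEED * 360.0 * frame / total
    cmds.setAttr(base + '.rotateY', rotation)

    for i, horse in enumerate(horses):
        phase = math.radians(rotation * GLOBAL_FREQ) + i  # offset each horse
        cmds.setAttr(horse + '.translateY', BOB_HEIGHT * math.sin(phase))
```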

Scripting and Dynamics: Merry-Go-Round

I then added the option to change the vertical offset of the horses, which allowed the user to change the distance displaced from their original positions.

Scripting and Dynamics: Merry-Go-Round

Molten Metal

I continued looking into how best to simulate molten metal during the week and came across an official Autodesk tutorial, which proved to be very useful.

It began quite differently from my own attempt at the same effect, as it ignored the liquid simulation settings and instead used the default ball emitter settings.  It also created a mesh from the particles rather than relying on overlapping spheres to look whole, which also makes sense for exporting into games, because they can easily render triangle-based meshes.  With one mesh instead of a great number of spheres, I was also able to apply a texture as described in the tutorial, rather than creating the effect with different coloured materials.  The overall result is shown below;

screenshot2

As you can see, this is slightly better looking than the original silver molten metal I created last time.  The flowing liquid looks a lot smoother than the lumpy look which the spheres created before, and also travels a lot faster (as it should).  However the texture stretching on the sides of the liquid detracts quite a bit from the overall look.

Next the tutorial explained how to create a fire/smoke effect using the fluid containers in Maya, by linking its emitter to the liquid mesh and toggling AutoResize on so that the effect would always follow the liquid.  The image below shows the first implementation of the smoke on top of the liquid.

screenshot3

The effect helps to provide detail to the simulation, and makes it harder to notice the texturing errors on the liquid.  After a few tweaks to the smoke colours the result was further improved, as shown below.

screenshot4

Last time I was looking into creating a bubbling effect on the liquid; however, I was unable to get it to emit from all points rather than only the center of the liquid.  With the liquid now being a mesh, a particle emitter could be set up to emit properly using the ‘Emit from surface’ setting.  I used this second system to spawn particles under the surface which collide with the molten particles, forcing them upwards.  This let me influence the liquid itself rather than just displaying a second set of particles, which wouldn’t have any effect on the liquid mesh.  The effect can be seen in the gif below (smoke/flames are hidden here to better show the bubbling);

Molten Bubbles

Scripting and Dynamics

The introductory lab of this module had us learning the basics of particle systems by creating a simple fluid.  I first decided to attempt something resembling lava.  To do this I created the default water particle system, then tried various drag values in order to make it move very slowly; however, this was unconvincing as it sometimes led to the particles floating.  I later tried altering the damping value instead, which led to a more realistic look.  I also added different colours, one of which would be randomly selected when a particle was created.  In order to make it appear as one object rather than a mass of spheres moving together, I made the visible radius of the spheres larger than their colliders.

screenshot5

At this point I realised that the effect looked more like molten pizza than lava, so I decided to try a silvery molten metal look instead.

screenshot

I then decided to add to the effect by creating another particle system which was meant to emit from the particles in the fluid system.  This did not give me the desired effect, however, as it would only ever emit from the midpoint between all the particles.

I was told that the effect I wanted would be best created by using expressions on the individual particles in order to emit from them.  I plan to look into this further in the weeks to come.

Final Pose Saver

screenshotui2
This is the view of my final implementation of the Pose Saver script.

I had to cut some of the desired features due to time constraints, but the included features are as follows:

  • Reset scene control shapes using a prefix defined by the user
  • Save the currently selected control shapes out to a file with the name provided by the user
  • Saving also takes a screenshot of the current 3D view to use as an icon for the saved pose
  • Loaded poses are displayed in the list at the top of the user interface, and can be renamed or deleted

Other features I would have liked to include are:

  • A search bar, allowing the user to narrow down the list of poses
  • A sort option, helping to find poses when there is a long list
  • The reset button also resetting custom attributes (currently only resets translation and rotation)
  • Blending between two poses when they use the same control shapes

Pose Saver Progress

I’ve been experimenting with taking screenshots of the main Maya view using Python, in order to save one when a pose is created and allow the user to visualise the pose more easily.

The script also hides all UI/HUD elements in the view before taking the screenshot, and then reverts them all back to their previous settings after it is taken.  This keeps the pose images uncluttered, without adversely affecting the user by removing any elements they were using.  I was initially storing these in ‘.iff’ format, as the documentation mentioned that this was the default and would be fastest; however, I had difficulty loading these into the UI panel, and so they are now saved and loaded as ‘.jpg’.
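
The actual script toggles the HUD elements individually, but a single playblast call can achieve much the same thing; a rough sketch (the path handling and image size are assumptions, not my final implementation):

```python
import maya.cmds as cmds

def capture_pose_icon(path):
    """Grab a single-frame screenshot of the current view for the pose icon,
    with HUD ornaments hidden only for the duration of the capture."""
    frame = cmds.currentTime(query=True)
    cmds.playblast(frame=frame, format='image', compression='jpg',
                   completeFilename=path, viewer=False,
                   showOrnaments=False, percent=100,
                   widthHeight=(256, 256), offScreen=True)
    return path
```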

I have also been playing around with Maya UI panel layouts, to help me structure my final script.  This layout also contains an image loaded after being output by the screenshot part of the script.

screenshotui

The screenshot is currently visibly squashed horizontally, so from here I will either change the aspect ratio of the images displayed in the pose list, or crop the images into a square shape before they are saved.

Facial Rigging

I chose to use blend shapes for my character’s face, rather than a joint setup.  This method seems to be a lot easier for the animator to use, as it allows them to change a value and see the face interpolate between two states (e.g. left_eye_closed = 0 would be open, 0.5 would be half closed, & 1 would be fully closed). It is also supported in Unreal 4 and Unity, the two modern engines which I have experience with, which makes it ideal for me.

For the initial trial of creating blend shapes I used the whole mesh, rather than just the face.  If I was to redo this process I would either choose a model with a separate mesh for the head, or cut the head vertices off myself and then create the blend shapes from that.

I was happy to find that the model had a mouth sock, as I hadn’t noticed it when I was first selecting the model.  I decided to test this piece of the mesh by making the first blend shape an interpolation between the mouth closed and the mouth wide open.

Mouth Blend

When I was happy with the way this blend looked, I decided to set up an easier way of changing the value.  I figured there must be a way to link the blend value to an attribute on the base mesh using set driven key, and found this tutorial on doing so (http://www.creativecrash.com/maya/tutorials/animating/c/using-blend-shape-and-set-driven-key-for-facial-expressions). Following it, I selected the blend shape as the driven value and the base mesh attribute as the driver, linking them together.  This gives me easier access to the blend shapes later, which further improves the usability of the rig for animators.
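
A minimal sketch of that set driven key link in Maya Python (the blend shape and attribute names are placeholders):

```python
import maya.cmds as cmds

def drive_blend_shape(blend_attr='blendShape1.mouthOpen',
                      driver_attr='baseMesh.mouthOpen'):
    """Link a blend shape weight to a custom attribute on the base mesh via
    set driven key, so the animator never has to open the blend shape
    editor.  Two keys map the driver's 0-1 range onto the weight."""
    for value in (0.0, 1.0):
        cmds.setDrivenKeyframe(blend_attr, currentDriver=driver_attr,
                               driverValue=value, value=value)
```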

Introduction to Skinning

This week I started the skinning process on my model.  It turned out to be almost identical to the weight painting part of setting up PhysX cloth physics, which helped as I already had some idea of what to do. However, due to the number of joints & the complexity of the model in comparison with my half-sphere cloth, the skinning process is clearly going to take a lot longer.

One feature of the painting tool which should help with this is the ‘flood’ option, which allows me to select a group of vertices on the model and then set them all to the same value.  This proved useful when I had to remove links between joints and areas of the model the joint should not have any influence on.
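
The same kind of batch edit can also be done from script with skinPercent; a small sketch with placeholder names:

```python
import maya.cmds as cmds

def zero_influence(skin_cluster, joint, vertices):
    """'Flood'-style weight edit: set the given joint's influence to zero on
    every selected vertex in one call, instead of painting each by hand."""
    cmds.skinPercent(skin_cluster, vertices,
                     transformValue=[(joint, 0.0)])

# Example: stop a fingertip joint affecting a range of leg vertices.
zero_influence('skinCluster1', 'L_index_tip_jnt', 'body_mesh.vtx[1200:1320]')
```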

When the skin was first applied, because of how close the fingers are to the rest of the mesh, each finger had a large amount of influence on the model’s legs.  This meant that when the hands were moved, some of the leg vertices would follow.  However, when I tried to paint zero influences to solve this issue, as I had done with other areas of the model, I ran into problems.  For some reason, when using the joint at the tip of each finger (the end of the chain), the zero weighting values I painted would be reset when I deselected the joint.

Finger Influence

Tidy Workplace

As the role of the technical artist is to create a rig which will eventually be used by others, it is important to make it as user friendly as possible.

To help with this, my rig has three visibility layers; Mesh, Joint, & Control Point.  This will allow the animator more control over what they see, leading to easier selection of specific parts of the rig (e.g. hide the joint layer and only view the mesh & control points).

Layers

The rig’s components are also grouped into categories; Mesh, Joint & Control.  This makes the base Outliner easier to search through as it appears less cluttered, and the animator can simply expand the group containing the type of object they are interested in.

Outliner

When I was creating the custom driven attributes on the mesh (e.g. hand curling) I also applied limits to the range of values they can be set to (e.g. between -1 and 1 for the curling), which helps show the animator the range of motion they will have access to with that variable.

I then went on to lock and hide any attributes that the rig does not support, or that the animator should not have access to.  My rig was not built to be scalable, so I locked and hid the scale values on the joints.
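
A short sketch of both steps in Maya Python (the control and attribute names are placeholders, not my rig’s actual naming):

```python
import maya.cmds as cmds

def setup_hand_control(ctrl='L_hand_ctrl'):
    """Add a limited, keyable custom attribute for the hand curl and
    lock/hide the scale channels the rig does not support."""
    # Curl attribute clamped to the supported -1..1 range.
    cmds.addAttr(ctrl, longName='handCurl', attributeType='double',
                 minValue=-1.0, maxValue=1.0, defaultValue=0.0, keyable=True)

    # The rig is not scalable, so remove scale from the channel box entirely.
    for axis in 'XYZ':
        cmds.setAttr('%s.scale%s' % (ctrl, axis),
                     lock=True, keyable=False, channelBox=False)
```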

The Last of Us Character Rigging

A friend pointed me towards this video, which gives a good insight into Naughty Dog’s technical art processes for The Last of Us;

The video begins by talking about the facial animation of Ellie and Joel.  One of the changes the Naughty Dog team made for this project was to begin facial rigging with the eyes and mouth half way between their minimum and maximum points.  This gave the models a more natural look when animating, because the elements would have less displacement from their base positions and so would stretch less.

Their arm joints have some interesting additions.  They have six ‘arm mover controls’ from shoulder to wrist on Ellie’s arm, which allow the animator to fix small mesh penetration problems by subtly moving parts of the arm around.  This doesn’t seem too complex, so I’ll have a look into using these for my rig.  They also have ‘cone readers’ on the wrist, which help solve the problem of losing volume when rotating the joint by adding a few extra joints around it.  The setup seems to have four wide cones facing various directions around the wrist, and the video describes how, as the joint moves in and out of their bounds, different position values are given to the extra joints.  The video describes the method as complex and I agree; this is way out of scope for my rig and I don’t fully understand their system.

They go on to discuss their method for dispersing the amount of twist in the wrist back up through the other joints in the arm, in order to give a more natural feel of the arm working together.  This is something I need to look into further, as the ‘candy wrapper’ effect was also brought up in the ‘Topology VS Skeletons’ lecture (when the wrist is rotated 180 degrees and looks like a cone).

Something I noticed during the video was that they had some key values displayed in the scene at appropriate points on the model; this seemed useful, so I had a look into doing it for my own rig.

Pose Saver Script

My script will be a pose saver tool, allowing the user to save poses, load poses, blend poses and reset control shapes.

One of the most advanced pose saver tools I have come across is shown below.

An example of a poser tool

From this tool I have decided to support the following (explained in more detail in the full feature list):

  • Pose blending
  • Rename/delete/replace poses
  • Remove a pose’s influence on a specific control/joint
  • Taking a screenshot of the scene to use as the icon

Another tool of a similar feature level is shown below.

An example of a poser tool

This tool supports the same basic features as I have decided to build in.

The full list of features I have planned is as follows:

  • Pose blending
    • The option to simply load the pose (overwriting any other pose on these controls) or add a percentage of this pose’s influence to the control’s current pose
  • Rename/delete/replace poses
    • From a drop down menu similar to the two examples above, right clicking a pose’s icon will bring up options for that pose
  • Selecting only the controls/joints which should be saved to the pose
    • All attributes of these objects will be saved, including custom additions
  • Remove a pose’s influence on a specific control/joint
    • Accessed as ‘edit pose’ from the pose drop down menu described.  This will present the user with a list of all controls saved to the pose, allowing controls to be removed from certain poses later on
  • Taking a screenshot of the scene to use as the icon
  • A search bar and sorting type
    • Searching will remove any poses from the list which do not match
    • Sorting will allow the user to view the list of poses in ascending/descending order of the following:
      • Alphabetically
      • Chronologically
    • (Will have to store date&time on creation of a pose)

User Interface Diagram:

UI Design