Pinned; Updates on this blog ('TheMattePoser' included) can now be followed via this twitter channel.
Goodbye Kansas Work Contributions
Posted on Wednesday, March 14, 2024
Note; For examples of tool creation/coding and projects I have been part of while working at Goodbye Kansas Studios, please contact me directly (privacy concerns).

Upcoming Blog Updates
Posted on Sunday, July 03, 2016
I have decided to start using the Twitter channel created for TheMattePoser for updates made on this blog as well. So if you are among the people interested in this site (all 10 of you :) ) and you feel like getting a ping whenever I post something new, you can follow this Twitter channel.

Maya Expression/Node Network Tool
Posted on Sunday, July 03, 2016
This is a highly personal tool. However, given the lack of, hum, shall we call it "smoothness" in Maya when creating node math network solutions, I figured I would post this and make an inquiry to see if others are interested in trying it out as well. If it has worked out well for me, it might for others too.

So what is this!? It is quite a simple tool that generates a node network from a user-entered expression. Revolutionary? Nope. Useful? Yes indeed! When manually designing logic with standard Maya nodes, I find the stock tools (hypergraph/node editor/connection editor) time consuming and tedious (and I have met others who think so as well).

So while not extremely visual, like the node editor for example, this tool generates the nodal setup much, much faster from an expression than one could ever do manually. For really complex expressions/node setup logic it might not be the best solution, but for your average rigging/animation math situations it has worked a treat in speeding up workflow and testing (for me anyway).
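To give a feel for the idea (this is not the actual tool; the wiring details and generated names are hypothetical stand-ins), a minimal expression-to-node translation can be sketched with the classic shunting-yard algorithm, emitting real Maya node types such as multiplyDivide and plusMinusAverage:

```python
def expression_to_nodes(expr):
    """Turn an infix expression into an ordered list of node-creation ops.

    Returns tuples of (node_type, operator, input_a, input_b, node_name).
    """
    tokens = expr.replace("(", " ( ").replace(")", " ) ").split()
    prec = {"+": 1, "-": 1, "*": 2, "/": 2}

    # Shunting-yard: infix -> reverse Polish notation
    out, ops = [], []
    for tok in tokens:
        if tok in prec:
            while ops and ops[-1] in prec and prec[ops[-1]] >= prec[tok]:
                out.append(ops.pop())
            ops.append(tok)
        elif tok == "(":
            ops.append(tok)
        elif tok == ")":
            while ops[-1] != "(":
                out.append(ops.pop())
            ops.pop()
        else:
            out.append(tok)
    while ops:
        out.append(ops.pop())

    # RPN -> node list; each operator becomes one Maya-style math node
    node_type = {"+": "plusMinusAverage", "-": "plusMinusAverage",
                 "*": "multiplyDivide", "/": "multiplyDivide"}
    stack, nodes, count = [], [], 0
    for tok in out:
        if tok in prec:
            b, a = stack.pop(), stack.pop()
            count += 1
            name = "node%d" % count
            nodes.append((node_type[tok], tok, a, b, name))
            stack.append(name + ".output")  # feeds the next operator
        else:
            stack.append(tok)
    return nodes
```

In Maya, one would then walk the emitted list, creating each node with `cmds.createNode` and wiring attributes with `cmds.connectAttr` (the exact attribute names depend on the node type).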

Being the only one who has used this, it might be somewhat rough around the edges. Despite that, is anyone else interested in testing it out? If enough people are interested I will see if I can dig up some more bugs, kill them off, and then post it as is.

Anyone Interested?

Implementation (Python)
'TheMattePoser' Maya Poser Tool
Posted on Sunday, September 20, 2015
For a period of time I have been busy constructing a pose tool for Maya, similar in functionality to Motionbuilder's.

TheMattePoser - Maya Poser Tool

For more detailed information about this, and how it might be useful for you on a day-to-day basis, follow the sub-site link above.

Questions/comments/ideas/requests regarding anything on this site? Don't hesitate to initiate a mail dialogue...
Scene Objects Picker Tool (Motionbuilder)
Posted on Friday, October 17, 2014
Long time no posting! So high time to post some stuff.

The following is a video description of how to use a name-based object "Picker" tool I put together for Motionbuilder, supporting namespaces and hierarchy-based pick entries. The video shows, in its simplest form, how to use the tool. I opted for a video rather than text-based instructions on how to use it, with some complementary descriptions of the other features in text instead.

It should work in the past three versions of Motionbuilder, i.e. 2013, 2014, and 2015. The tool can be downloaded HERE.

Install instructions are pretty simple; Download the *.rar and unpack all files to whatever directory you want the tool in. Drag-n-drop the "*" file into Motionbuilder to fire up the tool itself. From there on, the video should do a decent job describing how to use the tool.

Some stuff that is NOT mentioned in the video; The "Select Body Part" feature, briefly shown at the beginning of the video, is a small tool separate from the picker which I decided to add to this package as well. It selects ALL objects that are part of a body part/full body on any FBIK character. The built-in ways to do this have a tendency to miss certain associated controllers, which means that if you move keys in the fcurve editor, for example, you will un-sync the keys associated with a full-body/body-part key. Really annoying in my opinion. It supports multi-selection of body parts as well, as shown.

The Picker; All picker entries are saved in the scene as relation constraints with extra info attached to them, so when you reload a scene your previously saved picker entries will be there. This means you can also move these constraints to other scenes by using "Save Selection" and merging that *.fbx into the other scene. By doing so, you get a set of picker items that you can use as templates for other scenes, for example.

You can also do some operations on the members/items in the list. The menu "Edit List" is only briefly opened in the video; most of the operations there should be self-explanatory, so I omitted any sort of demo from the video itself. The "Select Branch Data Scene Node" menu alternative is related to the constraints described above: select one or more entries in the picker UI and run this command, and it will select the associated relation constraints holding that specific picker information, for easier "Save Selection" when you want to move things between files.

Implementation (Python)
'mnAuxControl' More In-Depth
Posted on Saturday, September 07, 2013
One of the systems that is part of the Motionbuilder rigging system I set up a while back (described in this previous post) is the auxiliary control part of things. I mentioned it briefly at the end of the video, but the plan was to come back to it later with a little more info. So with this post I will try to fill some of the gaps that were present in the video. Look at the video/info in the older post to get a feel for what all this is about.

Firstly; Any object in the scene can take advantage of this, through tools used when doing rigging and setup with this in mind. When part of the rig is created using automation, much of this is automated into the rig, however much can be created manually, "standalone", as well.

I will refer to the following image, and the numbers found in it, when trying to explain stuff;

1. I mentioned this briefly in the video. This is used to plot the current in-view motion back to the controllers themselves, either to the IK or the FKs. This happens on a limb basis, as defined during the rigging phase of course. So a standard biped, for example, would have 4 limb layers (LLeg, RLeg, LArm and RArm) by default. This is of course used when the need arises to modify the motion in a more "motion capture" editing sort of way, using layers to offset/reposition torso/arms etc., while trying to keep the original motion as intact as possible without having to "re-animate" things that one wants to stay in place.

2. Every Aux or Pivot object has a set of internal operations that you can perform on it individually, via the button labeled "O" for Operation, labeled 2 in the image. Pressing it will pop up a new window that allows some specific operations, labeled as 3 in the image.

These do the following;
"Select" - Select the object in the scene.
"Focus Curve" - Will focus this object's weight animation curve in the fcurve editor.
"Creation Frame" - This is the frame on the timeline at which you created this Aux/Pivot. Pressing this will send the timehead to that point in time.
"Creation Position" - The position of the Aux/Piv is also recorded when created, so similar to the frame button, this will set the position back to its creation value.
"Plot Me From Motion" - Similar to the "limb layer" plot but only for a specific Aux/Piv object. Here you can also select a section of your current timeline to do the plot over (the two dropdown boxes in the image set the start/stop of the plot span).

It is likely that I will end up adding more "per-Aux/Piv" operations as specific needs arise (or a good idea present itself).

Implementation ( Python)
Skin Sliding Animation Node (Maya)
Posted on Monday, July 22, 2013
So what's in this video: A fairly low-res hand with a custom realtime *.cgfx skin shader.

The effect of skin sliding is achieved using a custom-written UV animation node that animates the upper hand's UVs when the fingers curl. A pretty simple idea, but the effect works pretty well, as the video shows. This could of course be used in any situation where a skin-sliding effect is needed, be it high-res or low-res, since the only things that move are the UVs.

Tools are in place to create/edit the saved animation targets on the node itself (Using the normal Maya UV Editor). Briefly shown in the video as well.

The plan is to put together a similar node for motionbuilder. More of that to come...


Edit Monday, August 19, 2013
After testing some of the ideas I had, it seems that being able to "animate" UVs on an individual UV basis in Motionbuilder, similarly to Maya, is not possible (at least not in the way I was hoping). I kind of suspected this would be the case to some degree (knowing a little bit about how Motionbuilder works internally). I will probably keep trying to figure something out over time though (because I would like to have this possibility in the long run).

It is not essential to have this work in Motionbuilder, but it would be nice. If anyone has come up with something that may help in solving this, feel free to contact me.

"The Area" forum post; UV Additive Texture Animation

Implementation (Python/Maya C++ API)
Ripple Weight Effect Node (Maya + Motionbuilder)
Posted on Sunday, June 30, 2013
This node implements a simple way to "split" a value ranging from 0-1 into a set of "rippling" 0-1 output values, depending on how many outputs the user wants (with a maximum of 10 in this case).
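As a sketch of the underlying math (my assumption of how such a split behaves, based on the description above, not the node's actual code): each output gets a triangular weight centred evenly along the 0-1 range, so the active output "ripples" along as the driver value sweeps from 0 to 1:

```python
def ripple_weights(t, n):
    """Split driver value t (0-1) into n overlapping triangular weights.

    Output i peaks (weight 1.0) at t == i / (n - 1) and fades linearly
    to 0.0 at the neighbouring peaks, so adjacent outputs cross-fade.
    """
    if n == 1:
        return [1.0]
    width = 1.0 / (n - 1)  # distance between neighbouring peaks
    return [max(0.0, 1.0 - abs(t - i * width) / width) for i in range(n)]

# As the driver sweeps 0 -> 1, the "active" output ripples along:
# ripple_weights(0.0, 3)  -> [1.0, 0.0, 0.0]
# ripple_weights(0.25, 3) -> [0.5, 0.5, 0.0]
# ripple_weights(1.0, 3)  -> [0.0, 0.0, 1.0]
```

Fed into blendshape target weights, this is exactly the cross-fade an in-between blendshape performs.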

What can this be used for? My main reason for creating it was to emulate in-between blendshapes in Motionbuilder, which Maya supports by default but Motionbuilder does not. The node has therefore been implemented for both Maya and Motionbuilder (as the image shows), so that one can set it up in Maya and then get the same visual behavior in Motionbuilder while animating.

Usage could be, for example, more advanced corrective blendshape setups in both Maya and Motionbuilder, as well as any situation where a generic ripple/wave effect is needed to control a set of output weights.

Implementation ( Motionbuilder C++ API / Maya C++ API)
Animation "Exchange" System (Maya/ Motionbuilder)
Posted on Sunday, June 9, 2013
Moving animation between Maya and Motionbuilder while working is a recurring fact of life. Unfortunately, using the staple .fbx format for this exchange often produces more headache than it helps (even with the "Send" functionality quite recently added by Autodesk), especially when dealing with a more complex sequence.

This is my take on such an animation "exchange" format for Maya/Motionbuilder. It is "simpler" in its functionality, for lack of a better word, and the idea is to ONLY move the motion itself, without affecting much else in the source/target scene. Both scenes are kept as intact as possible, with constraints/expressions/API connections and other more technical connections preserved, while still being able to "update" the motion without destroying parts of the setup that are already active. This makes "on the fly" changes easier once something has been set up.

A simple format, "*.mnTransfer", has been set up as a sharing format. The setup reads/writes using the timeline to get/set whatever motion you want to move between the two applications. Upon import, the only thing updated is the motion itself. Being mainly Python based also means that you can use different versions of Maya/Motionbuilder (Maya 2012/MB 2013 in my example case) and it will still work without a problem (unlike Autodesk's "Send" functionality).

The following video demonstrates a situation where this can be of use; The mechanical arm was keyframed to do its work in Motionbuilder. Its rig was set up using the "Motionbuilder custom rigging system" described more thoroughly in this previous post. The camera switcher has also been used in this sequence, since I have added functionality to automatically generate the motion + camera settings + camera cuts upon import in Maya. So one can use this tool to rebuild all motion in Maya from a cutscene in Motionbuilder, for example for rendering or whatever else. Hopefully the video gives you more ideas of how this can be used;

With the simulation scene already set up in Maya, the simplicity of this allows one to, for example, make some animation changes on the arm itself (be it polish or whatever), bring that back to Maya again, and just re-run the simulation in Maya for the cable motion without really having to "re-setup" anything technically. This allows for a good level of tweaking of the final outcome.

The use of this is of course not limited to simulations; I only used that as an example. Any object attached to something moving can be used effectively with this system. Animated channels are also supported on the objects themselves, which can be used for blendshape control, facial channels, driven keys, or whatever else one might use to drive something. UI changes might occur as I continue to make changes/updates to this.

Implementation ( Python/ MEL/ Animation)
Transform Blend Node (Maya)
Posted on Thursday, May 9, 2013
A common method to create shape-based animation using joints in Maya is to use driven keys. This can be used anywhere a "blendshape" kind of effect is needed and actual vertex transformation is not possible. The driven key setup works but has some drawbacks: one, you can get euler issues when blending poses; two, it creates quite a lot of nodes in Maya, which hurts realtime performance pretty rapidly.

So, instead of using driven keys, I have created a transformation blending node (mnBlendTransform) that does this in quaternion space, meaning you won't get any euler issues. Each of these nodes also replaces the many underlying nodes of a driven key setup with just one per transform, which changes realtime performance for the better.
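The internals of mnBlendTransform are not shown here, but the core idea of blending rotations in quaternion space rather than interpolating euler angles can be sketched as a standard slerp (a generic illustration, not the node's actual code):

```python
import math

def slerp(q0, q1, t):
    """Spherical interpolation between unit quaternions (w, x, y, z).

    Unlike euler interpolation, this follows the shortest arc between
    two orientations and cannot hit gimbal-style artifacts.
    """
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                 # flip to take the short way around
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:              # nearly identical: plain lerp is fine
        return tuple(a + (b - a) * t for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
```

Blending a pose then becomes slerping each joint's rest orientation toward its target orientation by the channel weight.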

A tool base is in place to manage/create these systems, see image below. You can have multiple "sets" or groups of this in the scene, with multiple channels within each. The tool is also used to switch between "editing mode" and "animation mode", and you can edit the poses in place with animation on the channels.

Typical usage for this is joint-based facial setups, which are quite common one way or another (as the example in the image suggests).

Implementation ( MEL / Python / Maya C++ API)
Motionbuilder *.FBX Animation Exporter
Posted on Tuesday, April 16, 2013
With the fbx format being a really popular format for importing data into engines nowadays (although I do have my reservations about its design/implementation methodology), I have put together an exporter that can export multiple characters in multiple takes and still maintain a decent organisation of the files you are working with.

I believe strongly in keeping one's files organized and the work files separated from the actual export files (for good transparency). The exporter also reads the timeline rather than forcing one to effectively plot data down onto something. Meaning that if you are animating a prop, for example, you don't have to plot down onto the transforms; you can keep your keys as is, and the outputted fbx will still contain valid "plotted" data. The export itself never modifies your working scene at all; it only reads through it and spits out fbx file(s) containing only the information necessary for import into an engine or whatever.

I have put together a video to demonstrate this in use. Hopefully it helps people understand somewhat more how to use the tool.

This may be of interest to others, so it can be downloaded here if anyone wishes to try it out. It was written while I was doing some work in Unity, which uses fbx natively, and it has worked a charm when working with animations for a Mecanim tree. With that said, I offer this as is; use at your own risk. Right now the exporter only works on joints + single animated channels on Null objects. It acts on selection, and of course on namespaces, to identify the different characters/sets of objects that are part of the same export group.
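The namespace grouping the exporter relies on can be illustrated like this (a generic sketch, not the exporter's actual code):

```python
def group_by_namespace(selection):
    """Bucket selected node names into export groups by namespace.

    'hero:hips' and 'hero:spine' land in the 'hero' group; names
    without a namespace go into the '' (root) group. Nested
    namespaces like 'shotA:hero:hips' keep the full 'shotA:hero'
    prefix as the group key.
    """
    groups = {}
    for name in selection:
        ns, _, _local = name.rpartition(":")
        groups.setdefault(ns, []).append(name)
    return groups
```

Each resulting group would then be written to its own fbx, which is how one selection can produce one file per character.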

I have mainly used this in MB 2013, however it should work in 2012 as well.

Download the file. Place the 2 files wherever you want. Run the "" file in Motionbuilder to start the tool.

Implementation ( Python)
Mocap Retarget Solution (for MB Rigg System)
Posted on Saturday, March 23, 2013
Building the custom rigging system (see previous post) got me thinking about setting up a way to get mocap onto characters made using the system, so that the setup supports a mocap workflow as well as keyframing. In most cases this only makes sense for bipedal characters/creatures, however the support is good to have for other reasons as well, since it can also be used to retarget animation from a hand-keyed character, for example, to another.

So for that I set up a tool for doing just this. The tool itself is currently fairly technical to set up; more work is needed to make it more obvious to the uninitiated. As of right now, the main core logic is there to be able to retarget from one character to another, which is the most important thing.

Simply put; Add two characters to the Motionbuilder scene, one having animation and the other being the target. Use the tool to set up how the animation should transfer from the source to the target. The source does not need to have an attached FBIK to be usable; the idea is to use the source's animated bones directly. After setup, just plot the take(s) and the animation will be transferred to the other character.

Below is a video showing some mocap animation retargeted to the character used in the earlier rigging system post. The motion comes from a cutscene in the EA-released Syndicate game that I worked on a few years back. The troll animation has been retargeted from another character with the same skeleton as the guy on the left in the video.

This is from a work-in-progress scene, so many things are not working properly or need work. I have only set up the body motion in this case, so no face and no fingers on the troll, as you can see. It is a dialogue sequence, however, and I have stripped away the sound since the motion itself is the important thing here.

Implementation (Python /Motionbuilder C++ API /Animation)
Motionbuilder Custom Rigging System
Posted on Sunday, February 3, 2013
The following video is a demonstration/description of a custom rigging system for Motionbuilder that I have spent some time developing. So what is this for!? And why have I spent time putting this together!?

Being a real Motionbuilder enthusiast, I have always felt that being able to rig just about anything is something Motionbuilder lacks support for. This has been my answer to that: a system that allows the user to build a more "Maya"-style rig (for lack of a better word), with all the benefits Motionbuilder has compared to Maya in terms of animation itself, and at the same time have a tool setup that helps the user interact with the rig as smartly as possible.

So what does this support? Anything, really, that has spines/limbs in any configurable setup you can think of. So it is NOT locked in what you can do with it, unlike the FBIK. The setup consists of 2 "main" parts. Firstly, you do the skeleton creation/skinning setup and so forth in Maya, as you would for any character. After that, a Maya tool is used to define how the spine/limbs should be created upon import into Motionbuilder. Then export into Motionbuilder, followed by running a battery of scripts that will set up the things you defined earlier in Maya.

This is however not an autorigger, so after running the script modules one still has to define what the character's control hierarchy is going to look like, plus use some other tools to define how the rig will work, depending on what sort of motion is needed etc. This is intentional, of course, so the setup as a whole can be used for many different things rather than limiting the user. During the script phase, many of the hooks needed for all the tools shown in the video below are automatically set up and built into the rig itself.

The above is of course somewhat of an overview description of how to create the actual rig. The clip below is more focused on how the user interacts with the rig when working with animation on it, which is of more interest for everyone to see. I might add a more technical video later on.

So without further ado, here is the video, narrated by yours truly. Yes, I am rambling somewhat here and there, but most of the important information is there.

I will most likely do some tweaking on the tool setup/interaction etc and so forth as time goes by. Improving/modifying/adding things etc.

Remember; most parts of the above setup can be used for any character, with all the benefits shown in this video. Below is a screengrab of a spider character which has all the same control possibilities as the biped shown in the video (it has been set up using the same system).

If anyone has more of a technical interest in this setup (or anything else for that matter), feel free to contact me via mail.

Implementation (Python/ Motionbuilder C++ API/ MEL/ Maya C++ API/ Animation)
2D Video Tracker + FACS Solver (Motionbuilder)
Posted on Wednesday, August 15, 2012
Facial animation is something I really care about, and in the process of making believable characters it is essential. A common method for encoding facial animation from human performance is a system named FACS, a set of poses that, when layered, generates fairly real-looking sequences.

Being able to generate this myself from tooling has always appealed to me, so this is my take on that. Implemented as a tool for Motionbuilder, it can be used to track 2D facial video footage, shot to certain pre-decided guidelines of course. It works by tracking key positions on the face using markers (seen as green dots in the smaller "picture-in-picture" video below), and from that extrapolating poses back to the FACS encoding, which can then be put on the character itself.
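The solving step is glossed over above; as an illustration of the principle (an assumption on my part, not the tool's actual math), one way to recover a single FACS pose's weight is to least-squares project the measured marker displacements onto that pose's calibrated marker displacements:

```python
def solve_pose_weight(measured, calibrated):
    """Estimate the weight of one FACS pose from tracked 2D markers.

    measured:   flat list of marker-displacement components from the
                current video frame (relative to the neutral face).
    calibrated: the same components when the pose is fully engaged.

    Least-squares projection w = (m . c) / (c . c), clamped to 0-1
    so the result can drive a pose channel directly.
    """
    num = sum(m * c for m, c in zip(measured, calibrated))
    den = sum(c * c for c in calibrated)
    if den == 0.0:
        return 0.0
    return max(0.0, min(1.0, num / den))
```

A real solver would handle all poses jointly (their marker displacements overlap), but the per-pose projection conveys the idea of turning green-dot motion into channel data.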

Below is a screenshot of how the tool currently looks;

To explain the following video somewhat more; The left/right characters share the same pose set, with the one on the left being the original. The right character's performance has been generated by the tool solely from the small inlay "picture-in-picture" video, converting the motion of the small green markers into channel data and re-creating the 3D performance that you see on the right.

The setup itself is a realtime capture from a Motionbuilder viewport, no rendering, using a *.cgfx wrinkle-map animated shader to generate the surfacing on the character's face.

Of course, the main idea is to eventually use a face-mounted camera on a real human, meaning the small inlay picture would be recorded from real life. I haven't had the chance to test the tool with that sort of footage just yet, but I plan to when I get the opportunity, so more on that to come...

The output data (channels) is fully editable in realtime on the character in Motionbuilder after creation via this tool.

Note: As often the above description is a pretty simplified one, however it should give a clue of what this can be used for.

Implementation (Python / C++ /Animation)
Morpheme *.XMD Exporter/Importer (Motionbuilder)
Posted on Saturday, April 21, 2012
The morpheme blending system, a 3rd-party in-game animation blending solution made by NaturalMotion, is a really nice system to work against. However, its native export tools for Motionbuilder (which come bundled with morpheme for licensees) are very simple and limited in design/usefulness.

This is my take on such a tool. It supports multiple characters in multiple takes: 2 characters in a scene with two takes, for example, can be exported in one action. The export result will be 4 files, 2 for each take, with the motion of the individual characters in separate files. It acts on selection, and characters/props are separated using namespaces. The export samples the timeline, so it does not change the work file in any way.
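The one-file-per-character-per-take bookkeeping described above is simple to illustrate (the file naming below is my own assumption, not morpheme's convention):

```python
def export_file_names(take_names, character_namespaces, ext=".xmd"):
    """One export file per (take, character) pair.

    2 takes x 2 characters -> 4 files, matching the example above.
    """
    return ["%s_%s%s" % (take, ns, ext)
            for take in take_names
            for ns in character_namespaces]
```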

A tool for importing *.xmd files back onto characters/props has also been developed (shown briefly in the video).

The following video gives some idea of how to use the tool (without any in-depth details);

Implementation (Python / Motionbuilder C++ API)
Motionbuilder Driven Key System (Maya Style)
Posted on Monday, March 12, 2012
A driven key system, similar to what Maya has, is something Motionbuilder lacks. The "half" solution that can already be set up in Motionbuilder often produces strange behavior and unpredictable results; most people who have done more serious rigging in Motionbuilder can tell you this.

So the following setup I have put together is a way to add this sort of functionality, very similar to Maya's, to Motionbuilder. Having access to this means you can (predictably) build much more clever/advanced rigs in Motionbuilder, and it helps in any situation where a 0-1 input needs to map to a non-linear output value, all while taking advantage of Motionbuilder's native realtime engine when dealing with deformation/animation.

My Motionbuilder relation constraint node supports cascading, meaning you can layer driven key nodes on top of each other. It has the same pre and post behavior settings you find in Maya (Constant, Linear, Cycle, Cycle with Offset), as settings on every individual node. It supports single values as well as compound vector inputs/outputs. Tools are in place for management in both Maya and Motionbuilder.
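To illustrate what those pre/post behaviors mean, here is a generic sketch of a driven-key evaluation with Maya-style infinity modes; it uses plain linear interpolation between keys, whereas the real node's curve evaluation is more involved:

```python
import math

def evaluate_driven_key(keys, x, pre="constant", post="constant"):
    """Evaluate a driven-key curve at driver value x.

    keys: sorted list of (driver, value) pairs, linearly interpolated.
    pre/post: behavior outside the key range, one of
    'constant', 'linear', 'cycle', 'cycle_offset'.
    """
    keys = sorted(keys)
    (x0, v0), (xn, vn) = keys[0], keys[-1]
    span = xn - x0
    offset = 0.0
    if x < x0 or x > xn:
        mode = pre if x < x0 else post
        if mode == "constant":               # hold the end value
            return v0 if x < x0 else vn
        if mode == "linear":                 # extend the end segment's slope
            if x < x0:
                x1, v1 = keys[1]
                return v0 + (x - x0) * (v1 - v0) / (x1 - x0)
            xm, vm = keys[-2]
            return vn + (x - xn) * (vn - vm) / (xn - xm)
        k = math.floor((x - x0) / span)      # whole cycles away from range
        x = x - k * span                     # wrap back into the key range
        if mode == "cycle_offset":           # each cycle shifts by the range delta
            offset = k * (vn - v0)
    for (xa, va), (xb, vb) in zip(keys, keys[1:]):
        if xa <= x <= xb:
            t = (x - xa) / (xb - xa)
            return va + t * (vb - va) + offset
    return vn + offset
```

With keys (0, 0) and (1, 2), a driver value of 1.5 gives 2.0 under constant, 3.0 under linear, 1.0 under cycle, and 3.0 under cycle with offset, which is exactly the difference between the four modes.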

The following video shows a situation where one can use driven keys to get something sliding along a surface (not a clear-cut rigging situation, but it demonstrates the behavior pretty well), and have it work the same way in Maya and Motionbuilder;

Obviously this is a simplified overview of what can be done with this. If anyone has a greater interest in this feel free to mail me.

Implementation (Python / MEL / Motionbuilder C++ API)
Skinning Split/Merge Tool (Maya)
Posted on Tuesday, January 10, 2012
Often when dealing with LODs in production for characters/props (and for a bunch of other reasons as well), the need arises to split things up as well as merge them together. In these cases, this might be needed after one has already gone through skinning etc. Most of the time you end up copy/pasting different meshes, followed perhaps by some of the limited "skinning" export tools already available in Maya; copy skin weights is usually in there somewhere as well. It's usually a bunch of steps to do a pretty simple thing.
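The core of the split side can be sketched like this (a generic illustration of carrying skin weights over to a sub-mesh and pruning unused influences, not the tool's actual code):

```python
def split_skin_weights(weights, sub_mesh_verts):
    """Extract skin weights for a sub-mesh and drop influences that
    carry no weight on any of its vertices.

    weights: {vertex_id: {joint_name: weight}} for the full mesh.
    Returns the same structure restricted to sub_mesh_verts, with
    zero-everywhere joints removed so the new skinCluster stays lean.
    """
    sub = {v: dict(weights[v]) for v in sub_mesh_verts}
    used = {j for vw in sub.values() for j, w in vw.items() if w > 0.0}
    return {v: {j: w for j, w in vw.items() if j in used}
            for v, vw in sub.items()}
```

Merging is the inverse: union the per-vertex dictionaries of the parts and bind the combined mesh to the union of their influences.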

So, here is a tool to be able to do this in a much simpler manner. The following video shows how my take on this works;

Implementation (Python / MEL )
Site Up N Running!
Posted on Thursday, December 15, 2011
Introducing my animation-aimed blog site, "TheMatte"!

Here I will post some of the stuff that I create in the name of animation (be it tech or pure simple animation), or just vent my opinion on things related to 3D in general.

Whether you are a professional or just a layman interested in 3d animation, welcome to my site.

Implementation ()
Pinned; I give thanks to an old friend, Eric Krona, for helping me set this website up as well as hosting the data. /Thx
2011 - 2024 Mattias Nyberg