As anyone can guess, the last three years were quite challenging: I was lucky enough to travel and work in Canada, Spain and Scotland. I didn't manage to update this blog in the meantime, but today's post will be related to Maya development.
Downloadable files and tools:
After a review of this task and some additional background, you will be able to see benchmarks using different methods and tools. The one I wrote for this occasion can be grabbed from my GitHub account:
State of the art and current configuration:
Saving and loading weights is a common task any character TD will face at some point. In the rigging pipeline, an update is usually motivated when:
- the character model has changed,
- a rig module has changed, or the animation department requests a special feature.
These updates mean our rig needs to be rebuilt.
Back in the day, changing the skeleton layout required people to detach their skin deformer, but this is no longer needed. Another observation is that we are usually interested in updating a whole character's skinCluster or building it from scratch:
- loading weights on selected points is quite rare (but easy to do).
Practical interaction and scripting access:
# let's try this code in the Python script editor
import maya.cmds as cmds

# when you want to write some point weights
# (the vertex indices below are examples; the original indices were lost)
vertsToWrite = ['bodySuit.vtx[0]', 'bodySuit.vtx[1]', 'bodySuit.vtx[2]']
cmds.skinPercent('bodySuitSkinCluster1', vertsToWrite,
                 transformValue=[('spine1', 0.5), ('spine2', 0.5)])

# note how you can specify as arguments the components you wish to act upon
This example illustrates some of Maya's strong points: its nodal nature, openness and scripting capacity. A very interesting design choice was made by the Maya architects:
- letting users have free access to attributes.
This freedom lets you read and write data at will and connect compatible elements together, without having to select their parent node.
In contrast, in 3ds Max the skinOps interface is only accessible from the command panel after a skin modifier has been selected; unfortunately:
- some exposed functions are broken or don't support undo…
SkinCluster node study:
To continue the 3ds Max analogy, an operator which acts on a set of points is called a deformer:
- it is a specialized node which only updates point attributes (normals/positions are quite common, but usually not the topology nor the UVs).
In our case, a skinCluster's purpose is to attach a set of points to joints and then define a smooth falloff between regions (the equation shown above in academic papers is not very useful or readable, is it?). In common terms, each influence moves all points rigidly (as if the whole shape was parented to it), and the per-vertex bone weights define how we mix the results together.
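For the curious, the equation those papers show boils down to the following (with w_i the vertex weight for influence i, M_i its current worldMatrix and B_i^{-1} its inverse bind matrix):

```latex
% linear blend skinning: each influence moves the point rigidly,
% the per-vertex weights mix the results
v' = \sum_{i} w_i \, M_i \, B_i^{-1} \, v ,
\qquad \text{with } \sum_{i} w_i = 1
```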
Without any surprise, the main input used to define this behavior is the matrix array attribute. Each influence's worldMatrix feeds this list and tells the deformer when to update our shape's point positions. In order to "parent" the shape to a joint we need to convert points into that joint's space:
- at bind time, Maya stores each joint's worldInverseMatrix into the corresponding bindPreMatrix attribute.
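To make the role of these two matrix attributes concrete, here is a minimal linear blend skinning sketch in plain Python (no Maya needed; the function names are mine, and matrices are assumed to be 4x4 row-major nested lists):

```python
# A minimal linear blend skinning sketch in plain Python (no Maya needed).
# Assumption: matrices are 4x4 row-major nested lists, points are row vectors.

def transform_point(point, matrix):
    """Apply a 4x4 matrix to a 3d point (implicit w = 1)."""
    x, y, z = point
    return tuple(
        x * matrix[0][c] + y * matrix[1][c] + z * matrix[2][c] + matrix[3][c]
        for c in range(3)
    )

def skin_point(point, weights, bind_pre_matrices, world_matrices):
    """point' = sum_i  w_i * (point * bindPreMatrix_i * worldMatrix_i)"""
    out = [0.0, 0.0, 0.0]
    for w, bind, world in zip(weights, bind_pre_matrices, world_matrices):
        moved = transform_point(transform_point(point, bind), world)
        for axis in range(3):
            out[axis] += w * moved[axis]
    return tuple(out)

# With identity bind matrices and one joint translated +1 in x,
# a point weighted 50/50 between that joint and a static one moves halfway:
IDENTITY = [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]
MOVED_X = [[1.0, 0.0, 0.0, 0.0],
           [0.0, 1.0, 0.0, 0.0],
           [0.0, 0.0, 1.0, 0.0],
           [1.0, 0.0, 0.0, 1.0]]
halfway = skin_point((0.0, 0.0, 0.0), [0.5, 0.5],
                     [IDENTITY, IDENTITY], [MOVED_X, IDENTITY])
# halfway == (0.5, 0.0, 0.0)
```

This is exactly the job the skinCluster does on every point: bindPreMatrix undoes the bind pose, worldMatrix applies the current pose, and the weights blend the results.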
Technically, the weight data is stored in a compound attribute:
- each vertex has a sparse array of floats/doubles matching the joints influencing it.
- One other fun fact is that you can connect this attribute (Maya being a nodal application, remember?) to get animatable weights (it will slow down your rig tremendously on a complex character, though).
This choice was mostly driven by flexibility:
- when you paint weights or colors on a set of points, a sparse array attribute saves space and memory.
It is interesting to see that this attribute is inherited from the weightGeometryFilter parent class and as such is shared between all kinds of deformers in Maya (a very useful example of class design and inheritance).
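A plain-Python picture of that sparse idea (a hypothetical sketch, not Maya's actual storage): each vertex only keeps entries for the joints that really influence it, which is what makes painting cheap on dense meshes:

```python
# Hypothetical sketch of a sparse weight layout (not Maya's actual storage):
# each vertex only keeps entries for the joints that really influence it.
sparse_weights = {
    0: {0: 1.0},          # vertex 0: fully bound to influence 0
    1: {0: 0.5, 1: 0.5},  # vertex 1: split between influences 0 and 1
    2: {1: 1.0},          # vertex 2: fully bound to influence 1
}

def weight_of(vertex, influence):
    """Absent entries implicitly weigh 0.0, so a mesh with hundreds of
    joints stores almost nothing for most vertex/joint pairs."""
    return sparse_weights.get(vertex, {}).get(influence, 0.0)
```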
Let's try some simple experiments to illustrate the linear blend skinning algorithm:
Saving weights from the ground up:
It can be tempting to write and read this data to a text/XML/JSON file when trying to complete this task faster. It can give reasonable results speed-wise but falls apart on feature-film-quality characters (waiting minutes for skin weights to load or save, or for a rig build using PyMEL, is not acceptable, by the way). To this end, the logical answer is to have a look at the API:
    MStatus getWeights( const MDagPath &path,
                        const MObject &components,
                        MDoubleArray &weights,
                        unsigned int &influenceCount ) const

Above, one of the methods of Maya's MFnSkinCluster function set.
On the API side we can see that we have more options to extract and write weights:
- one common point is that we have to provide the shape path and the components we wish to act upon;
- a component here refers to a mesh vertex, a curve CV, a lattice point, etc.
Years ago, when I was testing the different options to save weights faster, I started by saving individual vertex data to a JSON file. One thing that struck me was that people are still trying to export skin weights in XML on a per-vertex basis.
The next step was to reverse the logic and save the data from a joint's point of view:
- each influence carries a list of vertex indices and a matching list of weights.
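That joint-centric layout is a simple pivot of the per-vertex data; a plain-Python sketch (the function name is mine):

```python
def pivot_to_influences(per_vertex):
    """Turn a per-vertex mapping {vertex: {influence: weight}} into the
    joint-centric layout {influence: ([vertex indices], [matching weights])}."""
    per_influence = {}
    for vertex in sorted(per_vertex):
        for influence in sorted(per_vertex[vertex]):
            indices, values = per_influence.setdefault(influence, ([], []))
            indices.append(vertex)
            values.append(per_vertex[vertex][influence])
    return per_influence

# example: two vertices, influence 0 touches both, influence 1 only one
pivoted = pivot_to_influences({0: {0: 1.0}, 1: {0: 0.5, 1: 0.5}})
# pivoted == {0: ([0, 1], [1.0, 0.5]), 1: ([1], [0.5])}
```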
This showed a great improvement in speed, but in the end the correct answer in my case was to dump the whole shape:
- instead of filtering point weights below a certain threshold, the API method exposes all values (even zero ones) for all influences.
One interesting side effect is that for a mesh with 12000 vertices (36000 tris) and 170 joints, you will get a list of roughly 2 million floats (and no need for influence indices). It can be limiting on very dense assets, but it has proved to be the fastest method Maya can offer.
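The flat array you get back in this dense case is component-major (all influence weights for vertex 0 first, then vertex 1, and so on), so a single index computation replaces any per-influence bookkeeping. A sketch (`weight_at` is my helper name; `weights` stands in for the MDoubleArray):

```python
# Dense layout: weights are component-major, i.e. all influence weights
# for vertex 0 first, then vertex 1, and so on.
def weight_at(weights, vertex, influence, influence_count):
    return weights[vertex * influence_count + influence]

# 3 vertices and 2 influences give a flat list of 6 doubles
dense = [1.0, 0.0,
         0.5, 0.5,
         0.0, 1.0]

# the 12000-vertex / 170-joint case above gives 12000 * 170 doubles
assert 12000 * 170 == 2040000
```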
Above: the Sven rig, stripped down to only its influence objects and geometries (from the Autodesk open source project).
In Python it makes more sense to save this kind of data set as a whole, in a binary format:
import struct

with open(binFile, 'wb') as weightData:
    weightData.write(struct.pack('%dd' % len(exposedSkinData), *exposedSkinData))

Here exposedSkinData is the MDoubleArray filled by your getWeights method, and the dump will roughly take …. seconds.
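The same round trip can be exercised outside Maya with the standard array module (a self-contained sketch; the flat list of doubles and the file name simply stand in for the MDoubleArray and your target path):

```python
import array
import os
import tempfile

# stand-in for the MDoubleArray produced by getWeights
exposed = [1.0, 0.0, 0.5, 0.5, 0.0, 1.0]

bin_file = os.path.join(tempfile.mkdtemp(), 'suitWeights.bin')

# one call dumps the whole buffer: no per-vertex loop, no text formatting
with open(bin_file, 'wb') as weight_data:
    array.array('d', exposed).tofile(weight_data)

# reading it back is just as direct
restored = array.array('d')
with open(bin_file, 'rb') as weight_data:
    restored.fromfile(weight_data, len(exposed))
```

Writing the buffer in one shot is what makes this approach so much faster than serializing vertices one by one to text.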
Another alternative, open to MEL scripters and native Maya users, is to take advantage of Maya's nature and leverage its strong points. It is kind of sad to see nowadays Softimage users bashing MEL and trying to promote PyMEL as the only correct way to interact with Maya (sorry, but ICE and visual programming are neither cool nor new to any nodal application user). Unfortunately, PyMEL's design, which enforces object-oriented programming:
- cuts Maya users off from its architecture,
- uses and exposes API elements to beginners (messing with MObjects can have terrible consequences in your UI or scripts; look at MotionBuilder's instability if you want a concrete example of this potential disaster),
- is slower than both MEL and maya.cmds,
- carries subtle bugs,
- pollutes the Maya namespace and scripted plug-ins at import time.
End of the rant…
Benchmarking and final thoughts:
So the funny trick to save and load weights is to actually let Maya do all the work:
- exporting a selected skinCluster node in either binary or ASCII format (you need to ensure no connections, history or other accessory elements are included at save time).
Let's call it Maya ASCII/binary injection (there is a really interesting section in the API documentation on the mayaAscii filter with similar concepts).
(The following tests can be carried out on your side if you download and use the script from:
On my laptop, using Chris's script (after correcting the missing import statements), the Sven export-weights method produces this result in the script editor:
Exported skinWeights for 22 meshes in 1.80200004578 seconds.
The binary extraction method from the script I will share on GitHub:
Processings took 0.82452176412 seconds
The ASCII extraction method:
Processings took 2.28121973499 seconds
(slower, but still reasonable considering the amount of work involved…)
The last method takes advantage of an Alembic cache node:
it saves the heavy elements into an .abc archive
and lays out additional information in a JSON file; the asset elements are then packaged into a zip file.
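The packaging step can be sketched with the standard json and zipfile modules (the file names and manifest keys here are hypothetical; in the real tool the .abc file comes from the Alembic exporter):

```python
import json
import os
import tempfile
import zipfile

staging = tempfile.mkdtemp()
abc_path = os.path.join(staging, 'suitSkinCluster1.abc')
zip_path = os.path.join(staging, 'suitSkinCluster1.zip')

# stand-in for the real Alembic export of the heavy weight samples
with open(abc_path, 'wb') as abc_file:
    abc_file.write(b'alembic-placeholder')

# the additional information goes into a small json manifest
manifest = {'influences': ['spine1', 'spine2'], 'vertexCount': 18148}

# both files are then packaged into one zip per asset
with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as package:
    package.write(abc_path, arcname='suitSkinCluster1.abc')
    package.writestr('manifest.json', json.dumps(manifest))
```

One zip per asset keeps the heavy binary samples and the human-readable metadata together without inflating load times.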
Saving 22 elements took 0.473954696144 seconds
Export to D:/SuitSkinCluster1.abc was successful
Number of vertex 18148
Number of influences 216
Number of weights Samples 3919968
Processings took 0.313700978128 seconds
(SKIPPING 21 other elements)
Processings took 0.516421672774 seconds
That's it for today. I will update the post with techniques related to nodecast / ASCII injection in the coming days.