Morphy needs to store some skins…
As I’ve been closing in on finishing Morpheus 2, I found myself in need of a distributable skin data system to apply skinning information to Morphy meshes after they’d been customized and no longer matched up with the base mesh. Not being able to find a good way of doing it natively in Maya, and not finding any open source options, writing our own was the only way forward.
Thanks to Alex Widener and Chad Vernon for some tech help along the way.
Before delving in, here are some lessons learned along the way.
- mc.setAttr — found this to be an unreliable method of setting weights via the ‘head_geo_skinNode.weightList[0].weights[0]’ call convention. It just didn’t seem to set properly via any call but the API.
- mc.skinPercent — this call is hopelessly slow and should never be used for intensive work. A query loop went from 78 seconds to 1.3 simply by switching to an API weights data call, even with having to re-parse the weights data into a usable format (see the API sketch after this list).
- weights — speaking of, this was an obtuse concept to me. This is in regards to the doubleArray list used with an MFnSkinCluster. In short, the easiest way to get to a spot in this data set is as follows:
- weights.set(value, vertIdx*numInfluences+jointIdx)
- weights — doubleArray list instance
- value is the given value you want
- vertex index * the number of influences + the joint index = the index in the array
- Normalizing skin data — You usually want your skin values to add up to 1.0, so here’s a chunk to help
L = [.2, .5]#...list of values
normalizeTo = 1.0#...value to normalize the sum to
[float(i)*normalizeTo for i in [float(i2)/sum(L) for i2 in L]]#...scale each value so the list sums to normalizeTo
#...thanks to http://stackoverflow.com/questions/26785354/normalizing-a-list-of-numbers-in-python
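To make the API route above a bit more concrete, here’s a minimal sketch (not the cgm code itself) of pulling a skinCluster’s weights through OpenMaya and indexing into the flat doubleArray the way the formula above describes. The node names and the helper name are placeholders you’d swap for your own.

import maya.cmds as mc
import maya.OpenMaya as om
import maya.OpenMayaAnim as oma

def get_flat_weights(mesh, skin_cluster):
    #...grab the shape and the skinCluster as api objects
    sel = om.MSelectionList()
    sel.add(mesh)
    sel.add(skin_cluster)
    mesh_dag = om.MDagPath()
    sel.getDagPath(0, mesh_dag)
    mesh_dag.extendToShape()#...assumes a transform with a single shape
    skin_obj = om.MObject()
    sel.getDependNode(1, skin_obj)
    fn_skin = oma.MFnSkinCluster(skin_obj)

    #...component covering every vertex on the mesh
    comp_fn = om.MFnSingleIndexedComponent()
    components = comp_fn.create(om.MFn.kMeshVertComponent)
    comp_fn.setCompleteData(mc.polyEvaluate(mesh, vertex = True))

    #...influence count straight from the cluster
    infl_paths = om.MDagPathArray()
    fn_skin.influenceObjects(infl_paths)
    num_influences = infl_paths.length()

    #...one api call returns the whole flat weight list
    weights = om.MDoubleArray()
    util = om.MScriptUtil()
    util.createFromInt(0)
    uint_ptr = util.asUintPtr()
    fn_skin.getWeights(mesh_dag, components, weights, uint_ptr)
    return weights, num_influences

#...swap in your own mesh / cluster names here
weights, num_influences = get_flat_weights('pSphere1', 'pSphere1_skinCluster')
vert_idx, joint_idx = 0, 1
#...vertex index * number of influences + joint index = index in the flat array
w = weights[vert_idx * num_influences + joint_idx]

#...pull one vertex's weights back out and normalize them to 1.0
vert_weights = [weights[vert_idx * num_influences + j] for j in range(num_influences)]
total = sum(vert_weights) or 1.0
normalized = [v / total for v in vert_weights]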
The initial list of requirements for the functions was as follows:
- Readable data format — decided on configobj, having used it with some Red9 stuff and found it easy to use (see the configobj sketch after this list)
- Export/import data sets
- Work completely from the data file for reference (no source skin necessary)
- Work with different vertex counts if similar shape
- Use indexed data sets for easy remapping of influences
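For the curious, here’s a rough idea of what a configobj round trip looks like. This is a hedged sketch of the general pattern, not the actual skinDat file layout; the keys, file name, and values below are made up for illustration.

from configobj import ConfigObj

#...illustrative skin data; the real file stores quite a bit more
config = ConfigObj()
config.filename = 'pSphere1_skin.cfg'
config['mesh'] = 'pSphere1'
config['influences'] = {'0': 'joint1', '1': 'joint2', '2': 'joint3'}#...indexed for easy remapping
config['weights'] = {'0': [0.6, 0.3, 0.1], '1': [0.5, 0.5, 0.0]}#...per-vertex weights by influence index
config.write()

#...configobj hands everything back as strings, so cast on the way in
loaded = ConfigObj('pSphere1_skin.cfg')
weights_v0 = [float(w) for w in loaded['weights']['0']]
joint_for_index_1 = loaded['influences']['1']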
With that being said, here’s the demo file link; you’ll need the latest cgm package to follow along. Open up a Python tab in the Script Editor and try these things one line at a time.
import cgm.core.lib.skinDat as SKIN
reload(SKIN)
import maya.cmds as mc

SKIN.data()#...nothing
#select pSphere1 then try again
d1 = SKIN.data()#...now we have a validated source and throw it to a variable
d1.report()#...this will show some of what we have stored. No config file yet so there's not a ton there.

#...let's write our skin data to a file
d1.write()#...this will 1) gather the skinning data and 2) write it to a file where you specify
d1.report()#...bit more info now...

"""
As you can see if you peruse, we're storing a lot of extra data. The idea here is to store
enough that we can do some much neater stuff down the line.
"""
d1.validateTargetMesh('pSphere2')#...let's add a target mesh to our data object

"""
Before we apply our data, let's talk about a couple of modes we have available:
:target - Uses the existing target object's skin cluster influences
:source - Uses the source mesh's skin cluster influences
:config - Uses the config file's joint names
:list - Uses a list of joints (must match the config data set length and be indexed how you want it mapped)

In this case, we have no skinCluster on our object yet, so source is the mode we'll try.
"""
d1.applySkin(influenceMode = 'source')#...no go as we don't have a source yet
reload(SKIN)

#...what about for different vert counts
d1.validateTargetMesh('pSphere_moreverts')#...let's add a new target mesh
d1.applySkin(influenceMode = 'source')#...no go as we don't have a source yet

"""
Say we wanna map our data to new joints...
"""
mc.delete('pSphere2_skinCluster')#...cause I don't have influence changing for existing clusters working yet
newJointList = [u'joint4', u'joint4|joint2', u'joint4|joint2|joint3']
d1.validateTargetMesh('pSphere2')#...let's add a target mesh to our data object
d1.applySkin(influenceMode = 'list')#...oops
d1.applySkin(influenceMode = 'list', jointList = newJointList)#...there we go

#...other calls
d1.read()#...read a file
d1.updateSourceSkinData()#...this is a call to update the source data with new weighting should you change it
SKIN.gather_skinning_dict('pSphere1')#...the data gatherer
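For reference, here’s one naive way the “similar shape, different vert count” case could be handled: for each target vertex, find the closest source vertex and reuse that vertex’s stored weights. This is purely a hedged sketch of the idea, not how skinDat actually does it; the function name and the brute-force loop are illustrative only (a closestPointOnMesh node or the API’s MMeshIntersector would be the faster route).

import maya.cmds as mc

def closest_source_vert_map(source_mesh, target_mesh):
    #...cache source vertex world positions
    src_positions = []
    for i in range(mc.polyEvaluate(source_mesh, vertex = True)):
        src_positions.append(mc.xform('%s.vtx[%i]'%(source_mesh, i), q = True, ws = True, t = True))

    #...for every target vertex, record the index of the nearest source vertex
    mapping = {}
    for i in range(mc.polyEvaluate(target_mesh, vertex = True)):
        tx, ty, tz = mc.xform('%s.vtx[%i]'%(target_mesh, i), q = True, ws = True, t = True)
        best_idx = 0
        best_d = None
        for j, (sx, sy, sz) in enumerate(src_positions):
            d = (tx - sx)**2 + (ty - sy)**2 + (tz - sz)**2
            if best_d is None or d < best_d:
                best_idx, best_d = j, d
        mapping[i] = best_idx
    return mapping

#...e.g. vert_map = closest_source_vert_map('pSphere1', 'pSphere_moreverts')
#...target vert i would then borrow the stored weights of source vert vert_map[i]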
This is a first pass on this thing till Morphy 2 is done.
Cheers!
j@cgm

