Working With Nifs 301 : Nodes Breakdown
Node breakdown!
I've covered the basic NiNode in an earlier guide, and many of these nodes have similar options available, so I won't be covering basics (like children and property arrays).
I'm going alphabetically again. There are a lot of nodes that go unused, or that we don't know how to use properly, neither of which I'll be discussing here. I offer guides on how to use specific nodes, some with step-by-step processes.
AmbientLight: I know I saw this used on an official file somewhere, but I can't recall where. I've never gotten this one to work. I'd reckon this node attaches an ambient light to the scene.
DirectionalLight: Same, I think I saw this on the sigil stone nifs.
BillboardNode
My hero! This node will always face the camera. Commonly used for particle effects, distant trees, or effects like a faked light or torch flame. There are different modes that can be configured. Official files almost always have a StringExtraData attached that also declares which billboard mode to use. -There may be more billboard modes available through this. Also, some particle stacks use a regular NiNode with the string data attached: this seems to work just as well.
Also, placement is important: say you've modeled a tall candle, and have a flat image plane for the flame. You will want to place the billboard node right at the tip of the wick (in the 3d space), with the image plane trishape attached to the billboard node. The bottom of your image plane should then be adjusted to sit just on top of the center of the billboard node. -If you had the billboard placed, say, at the bottom of the candle's base, and then adjusted the image plane from there, you'd get some pretty wild rotations as the billboard did its thing.
Billboard Modes:
Always_face_camera: The billboard will always face the camera, with a bias on the Y axis. (is that right? I get lost sometimes)
Rotate_about_up: The billboard will only rotate on its Z axis (up) to view the camera.
If a billboard using this mode is attached to a node that changes direction and rotation, the billboard's Z axis is translated with the parent node. ex: If attached on a weapon nif, the billboard's Z axis will change in relation to what direction the weapon is pointed.
Rigid_face_camera: The billboard will always face the camera, but will not use a parent's co-ordinates for the Z axis.
ex: The same way a torch's flame would always point up regardless of how you move the torch around.
Always_face_center: Similar to face_camera.
Rigid_face_center: Similar to rigid_face_camera.
BinaryExtraData
Used to store raw binary data; not something one really plays with often. Tangent space is stored in one of these. There are other ExtraData blocks, each for storing a different type of data.
StringExtraData
The most common one you're likely to play with is a StringExtraData, which houses string data, oddly enough. Just another way to convey data to the engine. Commonly seen named "UPB", which seems to hold optimizer info, terrain and visibility, LOD, billboard info, and so on.
BHK *
All the bhk nodes are for havok and collision. Keep in mind nifskope doesn't have a full understanding of all the data blocks for these nodes. Again, I don't mess around with havok often; see other guides for information.
BSFurnitureMarker
I have never really touched this one. It is the hook by which Oblivion can attach an actor to perform an action, like sit or lie down. See the various furniture nifs for examples.
Controllers, Interpolators and Data
There are various controllers, Interpolators for those controllers, and data for the controllers. These are for animating an aspect of a .nif or otherwise moving data. For more information, see many of Bethesda's .nifs for examples of them in action. Larger stacks are overseen by a ControllerManager, which is the start of complex, time-based actions, and I've had little success creating one from scratch. Much easier to find a suitable effect/structure, paste that into your work, and edit from there.
To use one: After picking the appropriate controller, it links into the Controller value of a given Node or Property. The controller has a slot to link the Interpolator to, and the Interpolator has a slot to link the data block to. Here's a breakdown.
Controller Flags: Active (on/off) and the loop mode: Clamp or Cycle (and reverse). I'm not sure how the engine interprets the loop mode.
Frequency: The timing at which the animation is played out, standard is 1.
Phase: Usually no need to set this, but it's an offset for start time.
Start Time: Time in seconds along the animation loop to start. (usually 0)
Stop Time: Time in seconds at which the animation ends.
Target: Defines what exactly this controller is affecting. Usually, you will link this back to the node you attached it to. ex: TextureTransformController is attached to a TexturingProperty: the texturingProperty will be the Target of the controller.
Interpolator: There's usually little to do here except link the Data block.
Data: This contains all the keys for the animation. And they generally look like this:
Num Keys: Defines how many keys to use.
Interpolation/Type: Defines what kind of keys to use. Linear or Quadratic keys are the most common. Linear keys are pretty straightforward, and Quadratic keys can do fancier curves and generally confuse me.
After defining the key number and type, you will need to update the array of keys to see the changes.
Linear Keys and XYZ Rotations: They have two values, Time and Value. Time is the time in seconds to play this key, and the key array should be advancing forward at all times. (don't use 3 seconds for the first key, then try and start the second key at 2 seconds.) Value is the number by which to make a change. This will vary depending on what kind of controller you're using.
Quadratic Keys have those two, as well as a Forward and Backward vector, I get a little lost at this point. I've only done these things by hand.
Unknown Key: It's a mystery.
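To make the Linear key behavior concrete, here's a minimal Python sketch of how an engine might evaluate a list of Linear keys: find the two keys bracketing the current time and blend their values proportionally. This is an illustration of the idea, not actual engine or NifSkope code.

```python
# Rough sketch (not engine code) of Linear key evaluation:
# find the two keys around time t and blend proportionally.
def eval_linear_keys(keys, t):
    """keys: list of (time, value) pairs, times always increasing."""
    if t <= keys[0][0]:
        return keys[0][1]   # before the first key: hold its value
    if t >= keys[-1][0]:
        return keys[-1][1]  # after the last key: hold its value
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)  # 0 at first key, 1 at second
            return v0 + frac * (v1 - v0)

# Two keys: value 0 at 0 seconds, value 10 at 2 seconds.
keys = [(0.0, 0.0), (2.0, 10.0)]
print(eval_linear_keys(keys, 1.0))  # halfway through: 5.0
```

This also shows why key times must keep advancing: the blend fraction divides by (t1 - t0), so a key that starts before the previous one makes no sense.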
Here are some of the controllers:
MaterialColorController
This is capable of animating a Material's color settings over time. I've forgotten if it can only change the diffuse channel or all of them. It attaches to a MaterialProperty.
It accepts a Point3Interpolator; the "Point 3 Value" in it should be set to 1,1,1.
PosData is the data channel here. The material can be animated over time by Quadratic or Linear keys.
TextureTransformController
Can move a texture on a mesh over time in different ways. This controller also has an Operation flag, which defines which type of animation to perform. It attaches to a TexturingProperty.
0: Scroll along U
1: Scroll along V
2: Rotate (you may need to tinker with Offset settings back in TexturingProperty>Base Texture)
3: Weird zoom along U
4: Similar weird zoom along V
Accepts a FloatInterpolator, which takes a FloatData.
The keys here are straightforward. Just know that nifskope sometimes bugs out while changing keys/modes. Workaround: go back to the TexturingProperty, adjust Base Texture>Translation, hit enter, then change it back.
FlipController
Another texture animation method. This one cycles through a given list of texture files over time. It attaches to a TexturingProperty. It has two new values:
Num sources: Defines how many source images to use.
Sources: The list of individual SourceTexture nodes to play in the sequence. The first in the list should be the same source texture that the base TexturingProperty uses.
Accepts a FloatInterpolator, which takes a FloatData.
The FloatData here should use "Unknown Key", or possibly linear. Again, time is in seconds, and Value corresponds with the Sources list defined in the controller: 1 for the first source image, 2 for the second, 3 for the third, and so on.
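A small Python sketch of the mapping just described: at any given time, the active key's Value picks an entry out of the Sources list (1-based, per the convention above; the real engine may index differently, and the filenames here are made up for illustration).

```python
# Hedged sketch: map a FlipController's FloatData keys to entries
# in its Sources list. Value 1 = first source, as described above.
def source_at(keys, sources, t):
    """Return the source texture active at time t (seconds)."""
    current = keys[0][1]
    for time, value in keys:   # keys are sorted by time
        if time <= t:
            current = value    # most recent key at or before t wins
    return sources[int(current) - 1]  # 1-based per the text above

# Hypothetical 3-frame flame cycle, one frame every 0.1 seconds.
sources = ["flame01.dds", "flame02.dds", "flame03.dds"]
keys = [(0.0, 1), (0.1, 2), (0.2, 3)]
print(source_at(keys, sources, 0.15))  # flame02.dds
```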
AlphaController
Animates the alpha transparency over time, and attaches to an AlphaProperty, or possibly a MaterialProperty.
I can't find an example for this, so I can't check it. I know I've seen it, though. I'm sure it would take a FloatInterpolator and FloatData. Using Linear or Quadratic keys, one could then control the alpha value.
UVController
It sounds neat, but I've never seen it used. It could possibly flip between UV sets, or change UV dimensions for tiling, if supported.
KeyframeController
Haven't explored this node.
Bspline
...and other skeletal animation nodes: I also haven't explored these.
PathInterpolator
When pointed at a parent NiNode, this interpolator can then move said node along a path (meshes/landscape/butterfly01.nif, for example). Note: it attaches to a NiNode. You would then attach your mesh to the same NiNode; don't attach the path to your mesh directly.
It takes a PosData and a FloatData. The PosData contains the keyframes for all the locations to define the curve of the total path. The FloatData just seems to reinforce start/stop times? I'm not sure if the exporters are capable of exporting one of these nodes, and making one by hand is dizzying. (One could possibly define a curve in their 3d app, and get the coordinates for each point on the curve, and input those by hand in nifskope; just a thought.)
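Following up on that last thought, here's a hypothetical Python sketch of generating curve coordinates outside a 3d app: it samples points along a flat circle, which you could then type into a PosData's keys by hand. The radius and point count are made-up values for illustration, and nothing here reads or writes .nif files.

```python
import math

# Hypothetical helper: sample N points along a flat circle, to be
# typed into a PathInterpolator's PosData by hand. Radius and count
# are made-up example values.
def circle_path(radius, num_points, z=0.0):
    points = []
    for i in range(num_points):
        angle = 2 * math.pi * i / num_points  # evenly spaced around the circle
        points.append((radius * math.cos(angle),
                       radius * math.sin(angle),
                       z))
    return points

# Print 8 points on a circle of radius 64 game units.
for x, y, z in circle_path(64.0, 8):
    print(f"{x:.2f}, {y:.2f}, {z:.2f}")
```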
TransformController
Another fun one. It will do XYZ rotation, translation and scale on its parent node. It accepts a TransformInterpolator and TransformData. Just like the PathInterpolator, it affects a node, not a mesh. Anything attached to this node will follow the node's translations. Again, Target for the controller will point to the parent node; everything else is the same as the others. -Be sure to set your Frequency to 1 at the very least before playing, otherwise you won't see anything changing! Also, anything animated this way won't work when equipped to an armor or clothing slot.
TransformInterpolator
I always use meshes/landscape/miscbutterfly01.nif, and copy the one from there. You can't create the large numbers seen there in a file by hand, just take this data. Then re-point it to the next in chain: TransformData!
Also, it's handy to turn on the Axis and Nodes views from the Render menu while testing animations.
TransformData
So here we have the data by which to animate from:
Num Rotation Keys: Set this to 1.
Rotation Type: Always seen set to XYZ_rotation_key; this presumably defines what kind of rotation we're doing.
XYZ Rotations
There are three channels, one for each axis individually. Each subset defines how many keys to use, and this does not need to be consistent over all three of the axis channels. Keys can be Linear or Quadratic. Again, Time is in seconds to execute each key. Value is rotation in radians (not degrees). You can always use a google search like "90 degrees in radians"; it tells me that will be 1.57079633.
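If you have Python handy, its standard library does the same degrees-to-radians conversion:

```python
import math

# Quick degrees-to-radians conversions for rotation key values.
print(math.radians(90))   # ~1.5707963
print(math.radians(360))  # ~6.2831853 (a full rotation)
```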
Translations
Used for moving X Y or Z along a path defined by the keys. Enter the number of keys, then use linear or quadratic keys, then update the key array. The values on the keys are fairly straightforward, I'm sorry I cannot explain quadratic keys well. I don't know how to calc out the scales for writing curves this way.
Scales: Change the size of the shape over time. The keys work the same here too, I have no idea what Scale the scale values actually work in.
Example: You want a rotating pommel on a sword, but only the pommel should rotate. From the root NiNode, attach your sword mesh (the pommel should be a separate trishape). Then attach a NiNode as a child of the root node; rename this node Pommel just for fun. Now attach the TransformController AND the trishape for your pommel to this second node.
First, we copy and paste a working TransformInterpolator from meshes/landscape/miscbutterfly01.nif. Then open our sword nif, and Block>Insert>NiNode, TransformController, and TransformData. Then paste in the TransformInterpolator. It should look like this now:
0 NiNode (Scene Root)
-1 BSXFlags (BSX)
-2 Collision stuff
-3 NiStringExtraData (PRN)
-4 TriShape (sword)
-5 NiNode (Pommel)
--6 NiTransformController
---7 NiTransformInterpolator
----8 NiTransformData
--9 TriShape (pommelshape)
To recap, the 0-NiNode has two extra datas and two children. The children are 4-TriShape and 5-NiNode. This second NiNode, numbered 5, also has one Controller, 6-TransformController, and one Child, 9-TriShape. In the TransformController's data, we will set Flags to Active/Cycle, Frequency=1, Stop Time=whatever. For Target, we enter 5, the parent NiNode, and then set Interpolator to 7 to link the TransformInterpolator. Now we click on the interpolator and edit Data to 8, for the TransformData block. In TransformData, change Num Rotation Keys to 1, and Type to XYZ.
Let's just use the second set of keys, so expand that, change Num Keys to 2 / Linear_Key. Now rClick "keys" and Array>Update to display the list. (and expand both of them)
For the second key, give it a Time of 2, and a Value of 6.2831. Now, when you hit play, you should get a smooth full rotation that loops.
In this linear mode, the engine automatically tweens the rotation values for you. In this case, we set the animation's length to 2 seconds. The first key is all 0, the default pose, so to say. The second key is at 2 seconds; 6.2831 radians is 360 degrees. By the two-second mark, the mesh has been rotated to 6.2831, then it starts over again at 0 to do it again and again.
Enjoy!
GeomMorphController
This node is another special-use animator, used for creature facial animations and bows. It is the handler for vertex-based animations. Instead of using keyframes, each morph is a savestate of vertex data rearranged into different positions. The animation is done by blending the base position with the morph position linearly. Open any of Bethesda's bows in nifskope, hit play, or scrub the playbar. You can see that there is a base position, the bow in a relaxed state, as well as a fully-drawn state. As time progresses, it goes from relaxed to drawn, with a slight back/forth jitter at the end, to simulate the string vibrating from the release. (bows also have several bone nodes for attaching arrows, but I'll get into that later.)
Now then:
A triShape/Strip has a Controller slot where the GeomMorphController attaches. The controller has three attachments: one MorphData, and two FloatInterpolators, each in their respective slots. Also, there is an "Unknown Int" that has a value of two, which corresponds with the number of interpolators attached.
I'll talk about the interpolators first: since we have two states of animation, the game needs one for the "base" and one for the "morph" positions. The first one is for the base position, and you can see that its keys are quite bare, save for the last key, which only has the final animation time listed for courtesy.
The second interpolator manages display of the morphed version. Time is in seconds, and defines changes in the animation. Value is a scale of 0 to 1: 0 will be 100% base frame, 1 will be 100% morph frame, and 0.5 would be halfway in between the two. We can see that by .333 seconds the bow is fully drawn, and then fully released at .3367 seconds. At .5 there is a slight jump towards the morph frame again, the string vibrating effect. -Overall, a pretty straightforward and easy animation.
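The blend itself is simple enough to sketch in a few lines of Python. This is just the linear mix described above, with tiny made-up vertex lists, not real bow data:

```python
# Sketch of the linear morph blend: weight 0 shows the base frame,
# 1 shows the morph frame, 0.5 is halfway. Vertices are (x, y, z)
# tuples; both frames must have the same count and order.
def blend_morph(base, morph, weight):
    return [tuple(b + weight * (m - b) for b, m in zip(bv, mv))
            for bv, mv in zip(base, morph)]

# Two made-up vertices that rise 2 units in the morph frame.
base  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
morph = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
print(blend_morph(base, morph, 0.5))  # each vertex halfway up: z = 1.0
```

Note how the pairing by `zip` depends on matching vertex counts and order between the two frames, which is exactly the "same number of vertices, same order" rule the MorphData imposes below.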
Now the MorphData: it contains vertex positions for both the Base frame, and the Morph frame.
Num Morphs: the number of morph frames to use.
Num Vertices: The number of vertices in all frames used. <<important! Both the Base and the Morph need to have the same number of vertices. Vertex order should also be maintained for the animation to look right.
Morphs: Each entry has a name that will be called by the engine. Vectors is the big list of vertices for a given frame.
Cool, so let's make a bow work! I can't help with the modelling aspect, but you will need to do these final preparations:
Finish your base mesh, UV mapping, normals and texture. I'd suggest making a duplicate of this mesh, and using whatever means you wish to work it into the morphed position. Now export both of these meshes into the .obj format, each with a unique filename.
Save yourself some time by opening a bethesda bow. One, the framework is already there, we just need to inject our model. Two, the floatInterpolators here have another funny number that is difficult to create from scratch.
- If the bow's mesh is a triStrip, rClick it, and do Mesh>Triangulate.
- Now rClick it again, and do .OBJ>Import Mesh, and import your Base .obj. Now rClick the GeomMorphController, and do Morph>Save Vertices to frame>Base. (Nifskope will tell you that it is resetting anything that was in the morphData.)
- rClick the trishape again, and do .OBJ>Import mesh, this time importing your morphed bow mesh.
- rClick GeomMorphController again, and Morph>Save Vertices to frame>BowMorph.
- rClick trishape one last time, and reimport your base mesh.
- Hit play to make sure everything is in place. Any distortions or undesired results will be caused by a mismatch in vertices; you will need to fix this in your 3d app.
- Now update tangent space, change texture paths or material settings as usual, and you're finished!
What we did there was straightforward: take in the base mesh and save its position in the base frame, then insert the morph data. Finally, we put the base mesh back in, so the bow doesn't always look like it's drawn back.
Particles!
I would really love to write a particle guide. But unfortunately, Nifskope doesn't have a total grasp on all of the various particle nodes, and doesn't currently render the particles. Also, some particle nodes can't be created via Block>Insert and still work, but you can easily copy/paste those. (unless this has been improved since the last nifskope version I used. It has been quite a while). These things can make creating new particle effects a real drag. Make a change, load game, check effect, quit back to nifskope, make more changes, load game again, ad nauseam. But I can offer some advice and suggestions. I should also note that I've spent very little time creating particle stacks in 3d apps, and know little :) Oh, and particle nodes seem to be disabled when placed on armor. Use the CS to attach one via the MagicEffectShader.
If you want to make some, go check them out in-game for a while. Then find the nif that uses the stack, copy/paste the whole thing into your file, and make with the mad scientist goggles from there. Sometimes, you can get away with piecing multiple stacks together into something that works well. Some effects are simpler than others, and are much easier to manage, like a simple foggy cloud that kind of wobbles around.
Particle branches governed by a MultiTargetTransformController can be difficult to get working on their own. Many of this type are set to receive other input to turn on or off, and that needs to be circumvented. I remember having trouble with this before, but again, it's been a while. Sorry!
So, based on what you've learned about linking so far, feel free to put it to use in whatever creative fashion. The stack starts with a NiParticleSystem node, which works just like other nodes, but has Modifiers instead of children. Nearly any NiPSys* node attached to the Modifier list also has a Target entry, which should link back to the ParticleSystem node.
In the Controller field goes your Emitter controller. Easiest just to copy this one, due to funky numbers within. Aside from that, it works like any other controller, but also has a special entry for Visibility control, which may be used in toggle effects (not sure).
It's common for the ParticleSystem node to have properties attached to it directly, or to its parent NiNode if used in an array of different particle stacks. Billboard, alpha, vertex color and zbuffer are common. Again, having these rigged to a parent ensures that all aspects of the particle generators will have these properties.
Emitter: Emits particles for the modifiers to play with. I'll do a brief look at a Box Emitter. (there is also a mesh emitter, which can take a trishape, it works more or less the same :) )
(remember I know little right now)
Name: The naming convention for particles seems to be the same as the node type, followed by a :0, :1, :3, etc. No idea why. It doesn't seem to relate to stack order.
Order: Particles seem to run in stacks, or layers. Useful in saying that you want a rotation before a gravity hit.
Target: the ParticleSystem node it is attached to.
Active: y/n.
Speed: How much/often to emit.
Speed Variation: for stuttering time?
Declination: The declination in which to emit. I'm not sure how this is measured. Also has a Variation value.
Planar Angle: Which angle to emit at. Also has a variation value.
Initial color: Yup, initial color of the emitted particle.
Initial Radius: I think this is how wide of an area we're covering. Also has Variation.
Life Span: How long before a particle "dies". <<important so as not to overload the system. Also has a variation.
Emitter Object: I'm not sure I understand this one. But it links back to the NiNode that is the parent of the ParticleSystem node. (two steps up the hierarchy.)
Width, Height and Depth: Not sure if these get used. Possibly for volumetric-based effects.
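The Life Span warning above can be sanity-checked with back-of-the-envelope arithmetic: at a steady state, the number of live particles is roughly the emission rate multiplied by the lifespan. This assumes Speed behaves like a particles-per-second rate, which is my guess, not something the format documents; the numbers are made up.

```python
# Rough steady-state particle count: emission rate (particles/sec,
# assumed) times lifespan (seconds). Made-up example numbers.
def steady_state_count(rate_per_sec, life_span_sec):
    return rate_per_sec * life_span_sec

print(steady_state_count(50, 2.0))   # 100.0 particles alive at once
print(steady_state_count(50, 10.0))  # 500.0 -- long lifespans pile up fast
```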
So, with this, we're emitting a box particle at a specified speed, direction, lifespan, etc. After that, we hit it with the various modifiers: change its color, change directions, add gravity or wind, and so on. I won't cover them here, but it's easy to see what they all do. Again, check with working particle stacks for ideas/info. I'm just regurgitating what I see on my screen at the moment. Nifskope will offer info via tooltips: just hover over a Name or Value block. If there's information known, it will display the tooltip.
All particle stacks seem to end at Order=7000, contained within a niPSysBoundUpdateModifier.
And that's that for now! I'd like to talk more about MultiTargetTransforms/ControllerManager setups, but I really don't know enough to speak on them.