Animate your robot in Blender

You’ve built a robot full of servos and now you’re ready for the fun part: programming your new dancing animatronic bear! The pain in your life is just beginning. Imagine you’ve decided the bear should raise its arm. If you simply set a new servo position, the motor slews there as fast as it can. What you need is an animation, ideally one with smooth acceleration and deceleration.

You can work through all the math yourself. After half an hour of fumbling with the numbers, the bear raises its arm like a stiff zombie. And then you realize the bear has 34 more servos.
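For a sense of the math involved, here is a minimal sketch in Python (our own illustration, not from any particular project) of the ease-in/ease-out ramp you would otherwise be computing by hand for every servo:

```python
import math

def eased_positions(start_deg, end_deg, duration_s, fps=10):
    """Hand-rolled ease-in/ease-out: one servo angle per frame,
    following a cosine ramp so the motor accelerates and decelerates
    smoothly instead of slamming to the new position."""
    frames = int(duration_s * fps)
    positions = []
    for i in range(frames + 1):
        t = i / frames                           # 0..1 through the move
        blend = (1 - math.cos(math.pi * t)) / 2  # cosine ease, 0..1
        positions.append(start_deg + (end_deg - start_deg) * blend)
    return positions

# Raise the bear's arm from 0 to 90 degrees over 2 seconds at 10 fps.
arm = eased_positions(0, 90, 2.0)
print(len(arm), arm[0], arm[-1])  # 21 frames, from 0 up to 90 degrees
```

Now imagine repeating that, with coordinated timing, for 35 servos at once.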

Render of an industrial-style robot arm with pedestal, base, upper arm, lower arm, and IK ball

Fortunately, there is Blender for jobs like this. Blender is all about creating smooth motion for animation and computer graphics, and it makes robot animation, if not easy, at least tolerable. We created a sample project to explain: a 3-axis robotic arm with a non-moving pedestal, a rotating base, an upper arm, and a lower arm. We’ll animate it in Blender first, then translate the file into something a little script can use to drive the servos.

Now, Blender is notorious for a tough user interface. The good news is that, as of version 2.9, it has moved to a much more conventional one. It is still a big program, with 23 different editors and literally thousands of controls, but we’ll only use a small subset to move our robot. We won’t teach you Blender here, because there are thousands of great Blender tutorials online. You’ll want to focus on animation, and the Humane Rigging series comes especially recommended.

Here are the main steps to animate a robot:

  1. Create a ‘skeleton’ (armature) that matches your robot
  2. Rig the armature so that it moves like your robot and is convenient to animate
  3. Animate the armature
  4. Export the servo positions to your robot control program
Four-bone armature. A bone is an octahedron with a ball at its head and at its tail
Robot armature

Typically, a computer animation has an armature with a mesh for the body hung on it. For a robot we don’t need the mesh, just the armature, because we’re not making a movie. The armature still has to match the hardware’s dimensions, though. Import a CAD file, trace over an image, or just measure the robot.

A robot’s rest pose is the reference position of its joints: where the robot sits when all the axes read zero. Ours has the arm pointed straight up.

The robot arm pointing straight up
Rest pose

Bones have a head and a tail, and pivot about the head. The head of your shin bone is at your knee and its tail is at your ankle. Bones can have parents: your shin bone is a child of your thigh bone, so if you move your femur (thigh bone), your shin goes with it.
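The head/tail/parent idea can be sketched in a few lines of Python. This toy 2D armature is our own illustration, not Blender code; it shows how rotating a parent bone carries its child along:

```python
import math

class Bone:
    """A 2D stand-in for a Blender bone: a pivot (head), a length out
    to its tail, a local rotation, and an optional parent."""
    def __init__(self, length, parent=None):
        self.length = length
        self.parent = parent
        self.angle_deg = 0.0  # local rotation about the head

    def world_angle(self):
        a = self.angle_deg
        if self.parent:
            a += self.parent.world_angle()  # children ride along
        return a

    def head(self):
        # A child bone's head sits at its parent's tail.
        return self.parent.tail() if self.parent else (0.0, 0.0)

    def tail(self):
        x, y = self.head()
        a = math.radians(self.world_angle())
        return (x + self.length * math.cos(a), y + self.length * math.sin(a))

thigh = Bone(0.4)
shin = Bone(0.4, parent=thigh)  # shin's head is at the knee (thigh's tail)
thigh.angle_deg = 90            # rotate only the thigh...
print(shin.tail())              # ...and the ankle moves with it
```

Blender does the same bookkeeping in 3D, for an entire hierarchy of bones at once.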

Bones rotate only on certain axes. Knees swing only back and forth, not sideways. You can twist your wrist, but not your knee. Joints also have range limits: you probably can’t bend your knee backwards.

Hinge joints only rotate, but some bones might scale (a soft robot) or translate (a CNC router).

There are more complex constraints, too. The front wheels of a car steer together. The rods of a steam locomotive stay on their pins. A dog can roam freely, but only to the end of its leash.

The animator may not want to pose bones by their rotations directly. It would be awkward to animate a character’s eyes by typing in their angles; it’s better to add an extra target bone and a constraint that keeps the eyes pointed at it. Likewise, we usually curl all the joints of one finger at once, so why not an extra bone that controls them all together?

An armature with all these extra control bones and constraints is called a rig. Blender has powerful tools for building such rigs.

The first bone in our example, Pedestal, represents the robot’s pedestal, the part bolted to the floor. Bones are posable; that’s the whole point, you can animate them. But this bone never moves, so we’ve locked everything: position, rotation, and scale.

Blender’s N-key panel, showing every axis locked except Y rotation

The base rotates on the pedestal, so the Base bone is a child of the Pedestal bone, and its tail sits at the upper arm’s pivot. The base turns about the bone’s Y axis. That’s its only motion, so we’ve locked all the other axes.

The other bones are much the same. The upper arm is a child of Base, with its tail at the elbow, and the lower arm runs from the elbow to the wrist. These joints bend by rotating about a single local axis.

The armature works now. In Pose mode, select Base and rotate it (r, then y), and the arm bones rotate with it. If you experiment and make a mess of the pose, just undo, or set the unlocked axes back to zero.

Our sample robot has a simple mesh that moves with the bones, so you can see the robot’s shape as it moves.

But it’s still possible to put the robot in a position the hardware can’t reach. Suppose our sample robot’s base can only rotate 90 degrees to either side of center. We can stop the animator from posing impossible moves with bone constraints: we add a Limit Rotation constraint to each bone.
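On the control side, a Limit Rotation constraint boils down to a clamp. A minimal sketch, assuming the ±90 degree base limit described above:

```python
def limit_rotation(angle_deg, min_deg=-90.0, max_deg=90.0):
    """What a Limit Rotation constraint amounts to in a control script:
    clamp a requested joint angle to the hardware's reachable range."""
    return max(min_deg, min(max_deg, angle_deg))

print(limit_rotation(120))   # an impossible pose gets pinned to 90
print(limit_rotation(-45))   # a legal pose passes through unchanged
```

Running the same clamp in your servo driver is cheap insurance against a bad frame slipping through.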


That’s great, but now we want our robot bear to reach out and grab something: this is how robots interact with the outside world. The solution is inverse kinematics (IK). IK lets us say where we want the wrist to be, instead of setting the shoulder and elbow joints ourselves. So we added a ball named IK to the scene, and gave the lower arm an IK constraint telling it to try to reach the ball.
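Blender solves IK internally, but the core idea can be sketched for a two-link planar arm with the law of cosines. This is our own simplified illustration, not Blender’s solver:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Minimal planar inverse kinematics: given a wrist target (x, y),
    return (shoulder, elbow) angles in degrees for link lengths l1, l2."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow bend needed to span the distance.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # The shoulder aims at the target, corrected for the bent elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

# Arm fully stretched straight up: shoulder near 90 degrees, elbow near 0.
print(two_link_ik(0, 2.0, 1.0, 1.0))
```

We ask for a wrist position and the joint angles fall out, which is exactly the convenience the IK ball gives the animator.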

If you’re following along, move the sample’s timeline to frame 120, so that IK is turned on and you’re not stepping on our animations.

The robot’s wrist now ‘grabs’ the IK ball. In Object mode, select the ball and use the g key to grab it, then drag the ball around. The robot follows nicely, with the joints setting themselves however they need to. Much more convenient.

If you want to poke at the IK constraint, its home is the lower arm. In Pose mode, select the lower arm and open the Bone Constraints tab in the Properties editor on the right. The IK constraint lives there.

A brilliant bit of Blender is that almost anything can be animated. Most values have a small dot to their right; clicking it turns the dot into a diamond and makes the value animated. We do this to Influence, the ‘strength’ of the constraint.

Animation time

We’re now ready to animate. 10 fps is fine for most robots; set it in the Render properties. We put all the animations in the timeline one after another and grab the slices we want, so maybe the ‘bear wave’ is frames 50 to 70.

Blender’s Graph Editor showing the motion of the IK ball
IK ball motion. The black dots are keyframes.

Back when we were typing in numbers to move the bear’s arm, we had to work out every frame. Fortunately, no more: we only have to pose the robot at enough frames to pin down the behavior we want. These are keyframes; Blender fills in the frames in between.

Blender eases in and out by default, so the servos accelerate and decelerate smoothly. Select the IK ball and check the Graph Editor if you want to see the curves.

There’s a difference between posing and creating a keyframe. Posing alone doesn’t change the robot’s animation; it’s creating a keyframe (the i key) that changes the animation.

In frames 50 to 75 the robot picks something up and moves it. Notice that the animation took only four rounds of ‘move ball, make keyframe’, and less than two minutes to create in total. We never animated the arm joints themselves; inverse kinematics took care of that for us.

From frames 90 to 105 the robot steers around a nearby table while placing something on it. In practice, we’d have to run moves like these on the real robot and tweak them a dozen times. Doing that without software support would be a nightmare.

Robots from animation

We’re ready to get our animations into our robot control program, and there’s a nifty hack for this. The de facto standard for computer animation files is the Biovision Hierarchy (BVH) format. Blender can export it, though you may need to enable the add-on and select the armature. Here’s a sample.

ROOT pedestal
        OFFSET 0.000000 0.000000 -1.000000
        CHANNELS 6 Xposition Yposition Zposition Xrotation Yrotation Zrotation
        JOINT base
                OFFSET 0.000000 0.000000 1.000000
                CHANNELS 3 Xrotation Yrotation Zrotation
                JOINT upperarm
                        OFFSET 0.000000 0.000000 0.500000
                        CHANNELS 3 Xrotation Yrotation Zrotation
                        JOINT lowerarm
                                OFFSET 0.000000 0.000000 3.100000
                                CHANNELS 3 Xrotation Yrotation Zrotation
                                End Site
                                        OFFSET 0.000000 0.000000 3.100000
Frames: 251
Frame Time: 0.100000
0.000000 0.000000 -1.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 
0.000000 0.000000 -1.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 
0.000000 0.000000 -1.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 -0.000000 0.000000 -0.000000 
... lots of lines of this ...

Although it looks unpleasant to parse, the hack is that we don’t care about the top part (the skeleton) at all. We just want the motion data frames at the bottom.
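The same hack works in a few lines of Python, if you’d rather skip the shell pipeline. This sketch simply discards everything up to and including the ‘Frame Time:’ line:

```python
def bvh_frames(lines):
    """Skip a BVH file's skeleton header and return the motion data as
    a list of per-frame channel values. `lines` can be an open file or
    any iterable of lines."""
    it = iter(lines)
    for line in it:
        if line.startswith("Frame Time:"):
            break  # everything after this is motion data
    return [[float(v) for v in line.split()]
            for line in it if line.strip()]

# Works on a real file: bvh_frames(open("robotarm.bvh"))
sample = ["HIERARCHY", "ROOT pedestal", "Frames: 2", "Frame Time: 0.100000",
          "0.0 0.0 -1.0", "0.0 0.0 -1.0"]
print(bvh_frames(sample))  # [[0.0, 0.0, -1.0], [0.0, 0.0, -1.0]]
```

Each returned row holds one value per CHANNELS entry, in hierarchy order.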

Find the line number of the line after ‘Frame Time:’, 29 for our file, and use tail -n +29 ./robotarm.bvh | sed --expression='s/ $//g' | sed --expression='s/ /,/g' > robotarm.csv to get a CSV file of joint angles for each frame.

Which of these numbers is which servo? And how do we map them onto the values we send to the servos?

We added an animation (frames 1 to 6) that exercises each free axis in turn: the base, the upper arm, then the lower arm. Looking at the CSV file to see which channels change in that order, we find that channel 8 is the base, 10 is the upper arm, and 13 is the lower arm. If you know where each servo sits in its hardware range, you can map one onto the other.
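One way to automate that channel hunt is to record the first frame in which each channel moves. A small sketch, with made-up sample data of our own:

```python
def first_motion(frames):
    """For each channel, find the first frame where its value differs
    from frame 0. Channels that move earliest belong to the axes we
    exercised first, which is how we matched channels to joints."""
    first = {}
    for i, row in enumerate(frames[1:], start=1):
        for ch, (v, v0) in enumerate(zip(row, frames[0])):
            if ch not in first and abs(v - v0) > 1e-6:
                first[ch] = i
    return first  # {channel_index: first frame it moved}

# Channel 1 starts moving at frame 1, channel 2 at frame 2:
frames = [[0.0, 0.0, 0.0], [0.0, 5.0, 0.0], [0.0, 9.0, 3.0]]
print(first_motion(frames))  # {1: 1, 2: 2}
```

Run it on the real CSV data and the channel order falls out without eyeballing columns.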

The numbers themselves are Blender’s joint angles in degrees, so all that’s left is to set each servo once per frame time, and your animation plays out in real life.
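A playback loop can then be as simple as the sketch below. The channel numbers are the ones we found for our sample file (yours may differ), and set_servo() is a hypothetical stand-in for whatever call your servo library actually provides:

```python
import time

# Channel-to-joint mapping discovered above for our sample file.
SERVO_CHANNELS = {"base": 8, "upperarm": 10, "lowerarm": 13}
FRAME_TIME = 0.1  # matches the BVH 'Frame Time:' line (10 fps)

def frame_to_angles(row):
    """Pick this frame's joint angles (degrees) out of one BVH data row."""
    return {joint: row[ch] for joint, ch in SERVO_CHANNELS.items()}

def set_servo(joint, degrees):
    # Hypothetical stand-in: swap in your servo library's call here.
    print(f"{joint}: {degrees:.1f} deg")

def play(frames):
    """Set each servo once per frame time; the animation plays out."""
    for row in frames:
        for joint, degrees in frame_to_angles(row).items():
            set_servo(joint, degrees)
        time.sleep(FRAME_TIME)
```

Feed play() the rows parsed from the CSV (or BVH) file and the robot follows the Blender animation.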

Don’t forget that if you run one animation after another, the second starts with the robot wherever the first left it. And remember that you can only ask for a servo position; you may not get it. If your robot moves slowly and you ask for a fast move, you’ll lose position control. Blender doesn’t know your hardware’s limits.

Finally, note that the ‘robot’ doesn’t have to be a robot. Anything that needs scripted motion can be treated this way.

Of course, this isn’t the whole recipe for an intelligent robot assistant. Tasks like vision and grasping need real-time control, adjusting speeds on the fly, smooth transitions between canned animations, easing in and out of them, and blending them together. But Blender and a simple export routine can get you started.

Making robots is half the fun; playing with them is the other half. We hope we’ve inspired you to try Blender, and maybe some animatronics.
