Pick-and-Place Workflow using Stateflow for MATLAB

This example shows how to set up an end-to-end pick-and-place workflow for a robotic manipulator like the KINOVA® Gen3.


This example identifies and sorts objects onto two tables using a KINOVA Gen3 manipulator. The example uses tools from four toolboxes:

  • Robotics System Toolbox™ is used to model, simulate, and visualize the manipulator, and for collision-checking.

  • Model Predictive Control Toolbox™ and Optimization Toolbox™ are used to generate optimized, collision-free trajectories for the manipulator to follow.

  • Stateflow® is used to schedule the high-level tasks in the example and step from task to task.

This example builds on key concepts from two related examples:

  • Plan and Execute Collision-Free Trajectories using KINOVA Gen3 Manipulator

  • Plan and Execute Task- and Joint-space Trajectories using KINOVA Gen3 Manipulator

Stateflow Chart

This example uses a Stateflow chart to schedule tasks in the example. Open the chart to examine the contents and follow state transitions during chart execution.

edit exampleHelperFlowChartPickPlace.sfx

The chart dictates how the manipulator interacts with the objects, or parts. It consists of basic initialization steps, followed by two main sections:

  • Identify Parts and Determine Where to Place Them

  • Execute Pick-and-Place Workflow

Initialize the Robot and Environment

First, the chart creates an environment consisting of the Kinova Gen3 manipulator, three parts to be sorted, the tables used for sorting, and a blue obstacle. Next, the robot moves to the home position.

Identify the Parts and Determine Where to Place Them

In the first step of the identification phase, the parts must be detected. The CommandDetectParts class directly gives the object poses. Replace this class with your own object detection algorithm based on your sensors or objects.

Next, the parts must be classified. The CommandClassifyParts class classifies the parts into two types to determine where to place them (left or right table). Again, you can replace this class with any method for classifying parts.
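A replacement detection or classification method only needs to produce object poses and part types. As a minimal stand-in sketch (the poses, widths, and variable names here are illustrative, not values from the example):

```matlab
% Stand-in detector: hard-coded part poses instead of a real sensor pipeline
partPoses = {trvec2tform([0.5 0.15 0.25]), trvec2tform([0.5 -0.15 0.25])};

% Stand-in classifier: assign each part to the left (1) or right (2) table
% based on an illustrative property such as its width
partWidths = [0.04 0.07];
partTypes = ones(size(partWidths));
partTypes(partWidths > 0.05) = 2;   % wide parts go to the second table
```

A real implementation would replace the hard-coded values with output from, for example, a camera-based pose estimator, but the downstream workflow only consumes the poses and types.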

Execute Pick-and-Place Workflow

Once parts are identified and their destinations have been assigned, the manipulator must iterate through the parts and move them onto the appropriate tables.

Pick up the Object

The picking phase moves the robot to the object, picks it up, and moves to a safe position, as shown in the following diagram:

The CommandComputeGraspPose class computes the grasp pose. The class computes a task-space grasping position for each part. Intermediate steps for approaching and reaching towards the part are also defined relative to the object.
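These relative poses can be expressed as rigid transforms composed with the object pose. The following sketch shows the idea; the pose values and offsets are illustrative, not the exact ones used by CommandComputeGraspPose:

```matlab
% Hypothetical object pose on the table, in the robot base frame
objectPose = trvec2tform([0.4 0.2 0.2]);

% Grasp the part from above: rotate the end effector to point downward
graspPose = objectPose*axang2tform([0 1 0 pi]);

% Approach and retract poses are offset along the world vertical axis
approachPose = trvec2tform([0 0 0.1])*graspPose;  % 10 cm above the grasp
retractPose  = trvec2tform([0 0 0.2])*graspPose;  % 20 cm above the grasp
```

Premultiplying by a translation offsets the pose in the base frame, so the approach and retract motions stay vertical regardless of the grasp orientation.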

This robot picks up objects using a simulated pneumatic gripper. When the gripper is activated, CommandActivateGripper adds the collision mesh for the part onto the rigidBodyTree representation of the robot, which simulates grabbing it. Collision detection includes this object while it is attached. Then, the robot moves to a retracted position away from the other parts.
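The attach-and-detach mechanism can be sketched roughly as follows; the body and joint names are illustrative, and the actual helper command handles the transforms and mesh data for the specific part:

```matlab
% Create a rigid body representing the grasped part, fixed to the gripper
partBody = rigidBody('graspedPart');
partBody.Joint = rigidBodyJoint('graspedPartJoint','fixed');

% Give the part a box-shaped collision geometry (dimensions in meters)
addCollision(partBody,'box',[0.05 0.05 0.05]);

% Fix the part to the end effector so it moves, and collides, with the robot
addBody(robot,partBody,'EndEffector_Link');

% Later, when the gripper is deactivated, remove the part again
removeBody(robot,'graspedPart');
```

Because the part is now a body of the rigidBodyTree, the same collision-checking calls used for the arm automatically account for the carried object.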

Place the Object

The robot then places the object on the appropriate table.

As with the picking workflow, the placement approach and retracted positions are computed relative to the known desired placement position. The gripper is deactivated using CommandActivateGripper, which removes the part from the robot.

Moving the Manipulator to a Specified Pose

Most of the task execution consists of instructing the robot to move between different specified poses. The exampleHelperPlanExecuteTrajectoryPickPlace function defines a solver using a nonlinear model predictive controller (see Nonlinear MPC (Model Predictive Control Toolbox)) that computes a feasible, collision-free optimized reference trajectory using nlmpcmove and checkCollision. Collision-checking is computed for the manipulator and environment using methods similar to Check for Environmental Collisions with Manipulators. The helper function then simulates the motion of the manipulator under computed-torque control as it tracks the reference trajectory using the jointSpaceMotionModel object, and updates the visualization. The helper function is called from the Stateflow chart via CommandMoveToTaskConfig, which defines the correct inputs.

This workflow is examined in detail in Plan and Execute Collision-Free Trajectories using KINOVA Gen3 Manipulator. The controller is used to ensure collision-free motion. For simpler trajectories where the paths are known to be obstacle-free, trajectories could be executed using trajectory generation tools and simulated using the manipulator motion models. See Plan and Execute Task- and Joint-space Trajectories using KINOVA Gen3 Manipulator.
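For such obstacle-free motions, a joint-space trajectory between two configurations can be generated directly. A minimal sketch using trapveltraj (the goal configuration here is an illustrative offset, not one from the example):

```matlab
% Two joint configurations to move between (row vectors for 'row' format)
qStart = homeConfiguration(robot);
qGoal  = qStart + 0.3;   % illustrative goal, offset from home

% Interpolate a trapezoidal-velocity profile between the waypoints;
% trapveltraj expects waypoints as columns, so transpose the rows
numSamples = 100;
[q,qd] = trapveltraj([qStart' qGoal'],numSamples);
```

The resulting samples could then be fed to a motion model or visualized directly, without invoking the nonlinear MPC solver.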

Task Scheduling in a Stateflow Chart using a Command Dispatcher

This example uses a Stateflow chart to direct the workflow in MATLAB®. For more information on creating Stateflow charts, see Create Stateflow Charts for Execution as MATLAB Objects (Stateflow).

The Stateflow chart directs task execution in MATLAB by using command classes that are dispatched by exampleHelperDispatcherPickPlace. By dispatching a sequence of commands, you can avoid creating a large stack of function calls in the chart. When a command finishes executing, the dispatcher sends an input event to wake up the chart and proceed to the next step of the task execution. For more information, see Execute a Standalone Chart (Stateflow).

Executing Commands

The dispatcher maintains a queue of handles to command classes as a property. The Stateflow chart, exampleHelperFlowChartPickPlace, adds commands such as CommandActivateGripper or CommandDetectParts to this queue following the state transitions defined in the chart. As long as the dispatcher is running, it checks for new commands in the queue at a constant rate and dispatches them for execution.
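The dispatch loop can be pictured roughly as follows. This is a simplified sketch: the property names, the commandDone event, and the polling interval are illustrative, not the actual implementation of exampleHelperDispatcherPickPlace:

```matlab
% Simplified dispatch loop (illustrative property and event names)
while dispatcher.IsRunning
    if ~isempty(dispatcher.CommandQueue)
        % Pop the oldest command from the queue
        command = dispatcher.CommandQueue{1};
        dispatcher.CommandQueue(1) = [];

        % Execute it; doit() reads from and writes back to the dispatcher
        command.doit(dispatcher);

        % Send an input event to wake the Stateflow chart for the next step
        dispatcher.flowChart.commandDone;
    end
    pause(0.1);  % check for new commands at a constant rate
end
```

This decoupling is what keeps the chart itself small: the chart only enqueues commands and reacts to completion events, while the loop above does the blocking work.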

Command Definitions

Commands are classes derived from a common base class; each describes a distinct operation the robot must execute to complete the pick-and-place workflow.

To add a custom command, use any of the provided commands as a template. At a minimum, a command must:

  • Inherit from the handle class exampleHelperCommandPickPlace.

  • Include a doit() method that implements the required functionality, using data from the dispatcher and sending data back to it. For example, if part detection relies on point-cloud classification, the dispatcher would hold the point cloud, and the doit() method of the CommandDetectParts command would analyze the point cloud to determine the part location.
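A minimal custom command might look like the following sketch, saved as its own class file. The class body is hypothetical; only the base class name and the doit() requirement come from the example:

```matlab
classdef CommandCountParts < exampleHelperCommandPickPlace
    % Hypothetical command that counts the detected parts

    methods
        function doit(obj, dispatcher)
            % Read data held by the dispatcher (illustrative property name)
            numParts = numel(dispatcher.Parts);
            disp("Number of detected parts: " + numParts)

            % Write results back for later commands to use
            dispatcher.NumParts = numParts;
        end
    end
end
```

Once defined, the chart could enqueue this command like any other, and the dispatcher would execute it in turn.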

Run and Visualize the Simulation

This simulation uses a KINOVA Gen3 manipulator. Load the robot using loadrobot. Specify the data format as 'row' to get row-vector robot configurations.

robot = loadrobot('kinovaGen3','DataFormat','row');

Initialize the Pick and Place Command Dispatcher

Set the initial robot configuration. Create the dispatcher by specifying the robot model, initial configuration, and end-effector name.

currentRobotJConfig = homeConfiguration(robot);
dispatcher = exampleHelperDispatcherPickPlace(robot,currentRobotJConfig, "EndEffector_Link");

Specify the pick-and-place dispatcher properties.

dispatcher.homeRobotTaskConfig = trvec2tform([0.4, 0, 0.6])*axang2tform([0 1 0 pi]);
dispatcher.placingPose{1} = trvec2tform([0.31 0.62 0.36])*axang2tform([0 1 0 pi]);
dispatcher.placingPose{2} = trvec2tform([0.31 -0.62 0.36])*axang2tform([0 1 0 pi]);

Start the Simulation

Connect the Command Dispatcher to the Stateflow Chart. Once started, the Stateflow chart is responsible for continuously going through the states of detecting objects, picking them up and placing them in the correct staging area.

dispatcher.flowChart = exampleHelperFlowChartPickPlace('dispatcher', dispatcher); 

Use a dialog to start the pick-and-place task execution. Continue to run the command dispatcher until a fixed number of detection runs have occurred. When the runs have completed, delete the dispatcher object.

answer = questdlg('Do you want to start the pick-and-place job now?', ...
         'Start job','Yes','No','No');

switch answer
    case 'Yes'
        % Run until the fixed number of detection runs has occurred
        while dispatcher.numDetectionRuns < 4
            pause(1); % let the chart and dispatcher continue executing
        end
        % The runs have completed; delete the dispatcher object
        delete(dispatcher);
    case 'No'
        delete(dispatcher);
end

Observe the Simulation States

During execution, the active states at each point in time are highlighted in blue in the Stateflow chart. This helps you keep track of what the robot is doing and when. You can click through the subsystems to see the details of each state in action.

Visualize the Pick-and-Place Action

The visualization shows the robot in the working area as it moves parts around. The robot avoids obstacles in the environment (blue ball) and places objects based on their classification. The robot continues working until all parts have been placed.

Copyright 2019 The MathWorks, Inc.