The control APP has been written in Python; the latest version of the code is available here. The app was created with a simple interface, yet it offers plenty of features.

On the left side of the User Interface (UI) you will find a TOP view of the pyBot SCARA robotic arm. In the green working area, four concentric circles mark the 5/10/15/20 cm radial distances from the pivoting point of the arm. A 2.5 cm grid will help you know where the arm is while controlling the robot. You can print the pyBot Working template on two A4 sheets, stick both sheets together (with a minor overlap) and place the robot's base on top of it. The printed template has the same GRID layout and reference circles. Those marks will help you position the arm in space.
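If you prefer to plot the template yourself (or simply want to check the scale of your printout), the short Python sketch below draws the 2.5 cm grid and the 5/10/15/20 cm reference circles. The 40 x 40 cm page size, the pivot placed at the bottom edge and the styling are assumptions, not the official template.

# Sketch of the working template: a 2.5 cm grid plus the 5/10/15/20 cm
# reference circles centred on the arm's pivoting point.
# Page size, pivot placement and styling are assumptions, not the official template.
import numpy as np
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(8, 8))

# 2.5 cm grid over an assumed 40 x 40 cm working area
for g in np.arange(-20, 20.1, 2.5):
    ax.axvline(g, color="lightgray", linewidth=0.5)
for g in np.arange(0, 40.1, 2.5):
    ax.axhline(g, color="lightgray", linewidth=0.5)

# Concentric reference circles at 5, 10, 15 and 20 cm from the pivot (0, 0)
for r in (5, 10, 15, 20):
    ax.add_patch(plt.Circle((0, 0), r, fill=False, color="green"))
    ax.annotate(f"{r} cm", (0, r), ha="center", va="bottom", fontsize=8)

ax.set_aspect("equal")
ax.set_xlim(-20, 20)
ax.set_ylim(0, 40)
ax.set_xlabel("X (cm)")
ax.set_ylabel("Y (cm)")
plt.show()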


USER INTERFACE


On the right side, a SIDE view of the Robotic Arm indicates whether the gripper is OPEN or CLOSED and the distance from the ground plane to the lowest point of the gripper (when it is completely closed).

Starting with the basics: above is the view of the user interface after launching the control APP. The first step is to configure your system by clicking on CONFIG in the BUTTONS PANEL.

Config button

The LOG BOX will continuously list the waypoints according to this format:

1-[1.3,12.3,32.0,15,False]
2-[11.4,1.5,12.0,39,False]
3-[34.1,62.3,142.0,60,True]

where the first value (1,2,3…) is the POINT number

Then come the coordinates [X, Y and Z] of the GRIPPER in millimeters, followed by the WRIST ANGLE and a field indicating whether the GRIPPER is OPEN or CLOSED [True or False].
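For reference, a waypoint line from the LOG BOX can be parsed with a few lines of Python. This is only a helper sketch; the field names are descriptive labels, not identifiers taken from the control APP.

# Parse a LOG BOX waypoint line such as "3-[34.1,62.3,142.0,60,True]".
# The field names are descriptive only, not identifiers from the control APP.
def parse_waypoint(line):
    number, values = line.split("-", 1)
    x, y, z, wrist, gripper = values.strip("[]").split(",")
    return {
        "point": int(number),
        "x": float(x), "y": float(y), "z": float(z),  # gripper coordinates (mm)
        "wrist_angle": float(wrist),
        "gripper": gripper == "True",                 # open/closed flag (True/False)
    }

print(parse_waypoint("1-[1.3,12.3,32.0,15,False]"))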

You can edit the waypoints with the EDIT TRAJECTORY button

CONTROL PANEL EXPLAINED


BUTTONS PANEL EXPLAINED


CONTROLLING the Robotic Arm

Moving the pyBot Robotic Arm is quite simple. You have 4+1 options: Blockly, Real Time, Trajectory and Leap Motion. The additional way is to control the pyBot via Python code, simply telling the Robotic Arm where to go by defining the movement/trajectory parameters.

1. Using Google's BLOCKLY. This visual language is easy to understand yet powerful when mastered, and it is perfect if you want to precisely control the pyBot Robotic Arm and command it to do complex tasks. Just connect instruction blocks and press RUN. Learn how to use BLOCKLY here.

pyBot Blockly's blocks. Perfect for doing whatever you want with the arm.

To use BLOCKLY, you will need to launch the pyBot control APP and then open, with your default internet browser (Chrome recommended), the file index.html located inside the Blockly folder. Your browser will display an environment like this one:


2. Real Time motion. In the pyBot control APP, tick the Real Time Motion box and activate the robot (Activate Robot). You will be able to control the Robotic Arm in real time with your mouse. Use the mouse wheel to move the arm up and down and the RIGHT/LEFT arrow keys to rotate the gripper. Left-click to OPEN/CLOSE the gripper. TIP: you can adjust the robot's movement speed and acceleration (X/Y/Z axes) in the CONFIG menu (button).

Keyboard ↔ Robot Actions Mapping

Space bar: Activates/deactivates the robot
L: Activates Leap Motion control
R: Real Time Motion
T: Trajectory Motion (waypoints)
A: Absolute (the gripper will keep its orientation)
V: Activates the Vision System (is the webcam connected?)
i: Changes the arm's elbow laterality
K: Starts the Kinematics mode
S: Starts the SCAN mode (to be used with the LIDAR sensor)
↓ ↑ or mouse wheel: Moves the robot's arm down/up
→ ←: Rotates the gripper's wrist (works in Real Time and Trajectory mode)


3. Trajectory MODE. By clicking on the working area, you will create waypoints. After creating one, the arm will travel to it taking the shortest path. Every waypoint saves the current X, Y, Z and gripper positions. The precision of a waypoint is 0.1 mm. Add as many as you want or need. If you need to modify a waypoint, go to EDIT TRAJECTORY and click on any of them to drag it to its new position. Once finished, press END EDITION.
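Since the arm takes the straight-line (shortest) path between consecutive waypoints, you can estimate how far the gripper will travel along a programmed trajectory with a few lines of Python. The waypoint values below are illustrative only, written in the same X, Y, Z order as the LOG BOX.

# Estimate the total travel length of a trajectory, assuming the gripper
# moves in a straight line between consecutive waypoints (X, Y, Z in mm).
import math

waypoints = [
    (1.3, 12.3, 32.0),    # illustrative values, same order as the LOG BOX
    (11.4, 1.5, 12.0),
    (34.1, 62.3, 142.0),
]

total = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
print(f"Total path length: {total:.1f} mm")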


4. Leap MOTION. If you have a LEAP MOTION sensor and the drivers and SDK installed (files here), controlling the pyBot is quite easy. Just tick the LEAP MOTION box and move your hand over the sensor. The arm will follow your hand and even open/close the gripper accordingly.


5. Python can directly control the pyBot Robotic Arm. Take a look at these examples:

WORK IN PROGRESS
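Until the official examples are published, the snippet below is only a hypothetical sketch of what direct control from Python could look like. The pybot module, the Robot class and every method name are invented for illustration; they are not the actual pyBot API.

# HYPOTHETICAL sketch only: the module, class and method names below are
# invented for illustration and are NOT the actual pyBot API.
from pybot import Robot   # hypothetical import

robot = Robot(port="/dev/ttyUSB0")   # hypothetical serial connection
robot.activate()

# Move the gripper through a simple pick-and-place (X, Y, Z in mm, wrist in degrees)
robot.move_to(x=120.0, y=80.0, z=40.0, wrist=0)
robot.open_gripper()
robot.move_to(x=120.0, y=80.0, z=10.0, wrist=0)
robot.close_gripper()
robot.move_to(x=60.0, y=150.0, z=40.0, wrist=90)
robot.open_gripper()

robot.deactivate()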

OPENCV. The pyBot vision system

The pyBot Robot features artificial vision detection. Just connect a webcam or any other OPENCV-compatible camera to your computer and calibrate its POV of the robot's working area. The control APP has a VISION SYSTEM CONFIGURATION menu (launched by clicking on the VISION SETUP button) where you can define the "detection parameters" needed to properly detect any feature in front of the robot/camera. NOTE: you will have to tick the VISION box in the SCARA PARAMETERS CONFIGURATION to use the camera.

A very good CAMERA CALIBRATION is mandatory. If the camera is a little bit off or tilted, the system (robot + camera) will not know where everything is located with accuracy.

What is the HSV space?

HSV is an acronym for Hue + Saturation + Value:

  • Hue: the dominant color as perceived by the observer
  • Saturation: the amount of white light mixed with a Hue
  • Value: the chromatic notion of intensity

As these definitions are a little bit ambiguous and not easy to grasp, let's play a little bit with the HSV space here: http://colorizer.org/new/ or here: http://aleto.ch/color-mixer/. A good explanation of the HSV color scheme can be found here.

A video about the HSV color space (the explanation starts at 2:23, but the whole video is quite interesting, as it explains how to work with the HSV color scheme in Python).
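To get a feel for the actual numbers, you can convert any RGB color to HSV directly in Python with OpenCV. Bear in mind that OpenCV stores Hue in the 0-179 range and Saturation/Value in 0-255, while many online tools use 0-360 degrees and percentages; the sample color below is just an arbitrary orange.

# Convert an RGB color to HSV with OpenCV to see the numeric ranges it uses:
# H in 0-179, S and V in 0-255 (online tools often use 0-360 degrees and %).
import cv2
import numpy as np

rgb = np.uint8([[[255, 120, 0]]])            # an arbitrary orange (R, G, B)
hsv = cv2.cvtColor(rgb, cv2.COLOR_RGB2HSV)
print("HSV:", hsv[0, 0])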

To detect an object, the control APP uses a set of parameters to discriminate it from the background. First, the color (defined in the HSV color space) of the object is extracted from the camera's image. That color then defines a surface (and limits it with boundaries). The discrimination process takes into account the color and its "size" (using the MAX and MIN AREA values as surface limits).
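As a rough illustration of that pipeline (not the control APP's actual code), the OpenCV sketch below thresholds a webcam frame in HSV space and keeps only the blobs whose area falls between a MIN and a MAX value. The HSV bounds and area limits are placeholder numbers; in practice you would tune them with the VISION SYSTEM CONFIGURATION sliders.

# Rough illustration of the detection pipeline (not the control APP's code):
# threshold the frame in HSV space, then keep only blobs within MIN/MAX area.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                    # webcam index 0
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Could not read a frame from the webcam")

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Placeholder HSV bounds (roughly orange) and area limits, to be tuned by hand
lower = np.array([5, 100, 100])
upper = np.array([25, 255, 255])
MIN_AREA, MAX_AREA = 300, 5000

mask = cv2.inRange(hsv, lower, upper)        # the binary "COLOR MASK" image
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    area = cv2.contourArea(c)
    if MIN_AREA <= area <= MAX_AREA:         # discard blobs outside the size limits
        x, y, w, h = cv2.boundingRect(c)
        print(f"Object at ({x + w // 2}, {y + h // 2}) px, area {area:.0f} px")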

Example: some M&Ms have been randomly placed inside the robot's "working area". Above, on top: a photo taken from above of the M&Ms and the base of the robot. Bottom left image: what the webcam sees (bear in mind that this won't be the perfect alignment for a camera, as it is not perfectly perpendicular to the ground plane and the grid shows perspective deviation. But for detection purposes, it works).

After playing a little bit with the HSV parameters, we were able to discriminate the "button-shaped chocolates" perfectly in the COLOR MASK window. If we switch to IMAGE BINARY (a control APP option), the RED M&M can be seen isolated from the others. TIP: play with the HSV sliders of the VISION SYSTEM CONFIGURATION menu; the HSV color space is extremely good for object detection.

The VISION SYSTEM CONFIGURATION menu: here you can set the detection parameters (MIN and MAX object area; H, S, V and their H wide, S wide and V wide thresholds). To compensate for the camera position (its height in relation to the ground plane), use Xmm and Ymm to compress the camera's X and Y axes.
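If you are wondering how a center value plus a "wide" threshold translates into a detection range, the helper below shows one plausible interpretation (center plus/minus half the width, clamped to OpenCV's HSV limits). This is an assumption about how the sliders map to bounds, not code taken from the APP.

# Assumed interpretation of the H / H wide style parameters: each center value
# plus a width defines a clamped [lower, upper] detection band (not APP code).
import numpy as np

def hsv_bounds(h, s, v, h_wide, s_wide, v_wide):
    center = np.array([h, s, v], dtype=float)
    wide = np.array([h_wide, s_wide, v_wide], dtype=float)
    lower = np.clip(center - wide / 2, 0, None)
    upper = np.minimum(center + wide / 2, [179, 255, 255])   # OpenCV HSV limits
    return lower.astype(int), upper.astype(int)

print(hsv_bounds(15, 180, 200, 10, 80, 80))   # placeholder slider values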

VIDEO: How to detect orange CUBES inside the working area with the VISION SYSTEM CONFIGURATION menu

Uniform and good lighting is recommended if you want to make it easier for the camera to recognize colors and shapes. Confirm you have the VIDEO option activated in the CONFIG menu (BUTTONS PANEL), then click on the VISION SETUP button. That action will open the VISION SYSTEM CONFIGURATION window. In this new panel you can set the projected area of the object you want to discriminate, set the HSV and HSV wide parameters, and adjust the X and Y camera axis extensions (these last two values work as an X and Y axis ZOOM). Watch the video below: it is an example of how to set the parameters to detect two orange cubes in the robot's working area.

A correct location + orientation of the camera is critical if you want the robot to pick up objects correctly. The camera calibration will help the robot know where it is (and where all the detected objects are in relation to it), but everything starts with good camera placement.