Rethink Robotics Sawyer - Camera Access

Happy New Year!

Kicking off the new year with a quick technical post. I've been working with a Rethink Robotics Sawyer robot for the last while, and while the SDK is great for research, it's not without quirks.

One issue I recently had was getting a video stream from both of the robot's cameras simultaneously; it seemed that only one stream could be active at any one time.

After a bit of digging under the hood, I found the topics and services used to communicate with the robot's I/O components; their functionality is somewhat masked because the components are controlled via JSON messages. Luckily, I just wanted to turn the cameras on or off, so the message is pretty simple!
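If you'd rather not hand-write the escaped JSON string, you can build it with `json.dumps`. This is just a sketch of the payload structure used in the example below; the `camera_streaming` signal name comes from the camera component's own status messages:

```python
import json

def streaming_args(enabled):
    """Build the JSON args string for the camera's I/O command service."""
    payload = {
        "signals": {
            "camera_streaming": {
                "data": [bool(enabled)],
                "format": {"type": "bool"},
            }
        }
    }
    return json.dumps(payload)

print(streaming_args(True))
```

The resulting string can be assigned directly to `cmd.args` in the service call below.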

Below is a code example for how to use the relevant service. Just specify the camera you would like to command and whether it should be activated (True) or deactivated (False).

Given this isn't documented on the SDK website, I thought it might be helpful to post it here. I've also posted this on the support forum, which you should join if you're working with a Sawyer!

Note that there might be a specific reason (bandwidth, etc.) why only one camera can be active at a time... I've asked Rethink and will update here if there's any info on that.

camera_command_client.py
#!/usr/bin/env python
# Aran Sena 2018
#
# Code example only, provided without guarantees
#
# Example for how to get both cameras streaming together
#
####

import rospy
from intera_core_msgs.msg import IOComponentCommand
from intera_core_msgs.srv import IOComponentCommandSrv


def camera_command_client(camera, status, timeout=0.0):
    """Enable or disable streaming for the named camera via its I/O command service."""
    service_name = '/io/internal_camera/' + camera + '/command'
    rospy.wait_for_service(service_name)
    try:
        cam_control = rospy.ServiceProxy(service_name, IOComponentCommandSrv)
        cmd = IOComponentCommand()
        cmd.time = rospy.Time.now()
        cmd.op = 'set'
        if status:
            cmd.args = '{"signals": {"camera_streaming": {"data": [true], "format": {"type": "bool"}}}}'
        else:
            cmd.args = '{"signals": {"camera_streaming": {"data": [false], "format": {"type": "bool"}}}}'
        resp = cam_control(cmd, timeout)
        print(resp)
    except rospy.ServiceException as e:
        print("Service call failed: %s" % e)


if __name__ == '__main__':
    rospy.init_node('camera_command_client')

    camera_command_client(camera='head_camera', status=True)
    camera_command_client(camera='right_hand_camera', status=True)
 
 
 

Solve for X-MAS

It's that time of the year again: the days are short, the decorations are up, and the college is getting quiet - not a creature stirring and all that... well, except maybe the mice, or the postgrads in the labs.

It's been a pretty hectic run-up to Christmas between invited talks and trying to get some actual work done; here are a few highlights!

[Images: christmas_skate.jpg, mince_pie_bot.jpg]

KCL Christmas Lectures

The first event of a busy week: my supervisor Dr. Matthew Howard delivered a lecture to a room full of eager secondary-school students, asking them who they think will be making their Christmas dinner in the future. We put forward our vision: mincepie-bot, which the students helped us program in a slightly panto style.

 

GROWBOT also made an appearance at the National Farmers' Union (NFU) AGM, where Matthew gave a talk on the research our group is working on for applications in agriculture and horticulture.

After all that, it's nearly time to take a breather and enjoy a mince pie or two!

Task Parameteris(/z?)ed Learning

I've put together this short demo video for a talk I'm giving soon, showing some recent work with task parameterised (TP) learning methods on our Sawyer robot: https://www.youtube.com/watch?v=i0dcedYD-II

Specifically, this is using a TP Gaussian mixture model (TP-GMM) to encode the demonstrations, and then Gaussian mixture regression (GMR) to generate the new trajectories from the model.
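To give a flavour of the GMR step, here's a minimal numpy sketch of conditioning a joint GMM over (time, position) on a time input to predict a position. The mixture parameters here are hand-picked for illustration, not fitted from demonstrations, and it's 1-D in and 1-D out rather than the full task-parameterised setup:

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gmr(t, weights, means, covs):
    """Gaussian mixture regression: condition a joint GMM over (t, x) on t.

    means[k] = [mu_t, mu_x]; covs[k] is the 2x2 joint covariance.
    """
    # responsibilities of each component for this input
    h = np.array([w * gaussian_pdf(t, m[0], c[0, 0])
                  for w, m, c in zip(weights, means, covs)])
    h /= h.sum()
    # blend the per-component conditional means
    x_hat = 0.0
    for hk, m, c in zip(h, means, covs):
        x_hat += hk * (m[1] + c[1, 0] / c[0, 0] * (t - m[0]))
    return x_hat

# Two components: one around (t=0, x=0), one around (t=1, x=1).
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
covs = [np.array([[0.05, 0.0], [0.0, 0.05]])] * 2

print(gmr(0.0, weights, means, covs))  # close to 0
print(gmr(1.0, weights, means, covs))  # close to 1
```

In the real TP case, the same conditioning is done per task frame and the resulting Gaussians are combined, which is exactly what the PbDlib library below handles for you.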

This is a method developed by Sylvain Calinon et al., and there's a nice software library to help get you started :) http://www.idiap.ch/software/pbdlib/

For a more detailed insight into TP methods, I highly recommend this paper: http://calinon.ch/paper4018.htm

In order to incorporate grasping into the model, I parameterised the control of the AR10 hand and recorded the control signals as a continuous variable, which happily seems to work perfectly.

Starting to get the hang of this robot, looking forward to trying out more things.