Older blog entries for Pi Robot (starting at number 13)

3 Oct 2012 (updated 3 Oct 2012 at 00:49 UTC) »
The First Book on ROS for Beginners

I have written a book on ROS for beginners called ROS By Example, available on Lulu in paperback and PDF formats.

PLEASE NOTE: The book was written for and tested against ROS Electric under Ubuntu 10.04 (Lucid) and 11.10 (Oneiric). Some of the code samples definitely will not work under ROS Fuerte at this time, and Debian packages for ROS Electric are not available for Ubuntu 12.04 (Precise).

Please also note that the book assumes that the reader has already worked through the ROS Beginner Tutorials.

ROS Head Tracking Tutorial Available

For those of you getting started with ROS, I have written up a do-it-yourself tutorial for tracking a colored object using a web cam and AX-12 pan and tilt servos. You can find it here:

ROS by Example: Visual Object Tracking
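
To give a feel for the color-tracking step before you dive in, here is a simplified sketch in Python of finding the centroid of a colored blob with OpenCV; this is not the tutorial's actual code, and the HSV thresholds are made-up values:

    # Simplified illustration -- not the tutorial's code. Returns the
    # centroid of all pixels that fall inside an HSV color range.
    import cv2
    import numpy as np

    def find_colored_blob(frame_bgr, hsv_lower, hsv_upper):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, hsv_lower, hsv_upper)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None  # nothing in range this frame
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    # e.g. a rough "green" range:
    # centroid = find_colored_blob(frame, np.array([40, 70, 70]),
    #                              np.array([80, 255, 255]))

The servo side of the tutorial then steers the pan and tilt joints toward that centroid on each frame.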


26 Nov 2010 (updated 26 Nov 2010 at 15:02 UTC) »
Robot Cartography: ROS + SLAM

I have posted a short article on using SLAM with ROS, including a couple of videos of Pi Robot mapping out part of an apartment using a Hokuyo laser scanner and the gmapping package. See:

http://www.pirobot.org/blog/0015/
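
If you want to peek at what gmapping is producing while the robot drives around, here is a minimal rospy sketch (my own illustration, assuming a standard setup that publishes nav_msgs/OccupancyGrid on /map):

    # Print the occupancy grid's size each time gmapping updates it.
    import rospy
    from nav_msgs.msg import OccupancyGrid

    def map_callback(grid):
        info = grid.info
        rospy.loginfo("map is now %d x %d cells at %.3f m/cell",
                      info.width, info.height, info.resolution)

    rospy.init_node("map_watcher")
    rospy.Subscriber("/map", OccupancyGrid, map_callback)
    rospy.spin()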

Pi Robot Meets ROS

For the past several months, I have been learning the basics of ROS from Willow Garage. At the same time, I have been testing Mike Ferguson's "Poor Man's Lidar" or PML as an alternative to a more expensive laser range finder. The results are encouraging--at least for obstacle avoidance and simple navigation tasks. You can see the report at:

http://www.pirobot.org/blog/0014/
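
To make the obstacle-avoidance idea concrete, here is a rough rospy sketch of the reactive style of control a PML-type scanner makes possible; the topic names and the 0.5 meter threshold are illustrative, not the values Pi Robot actually uses:

    # Turn away when the nearest scan reading gets too close;
    # otherwise creep forward. Illustrative values throughout.
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    def scan_callback(scan):
        cmd = Twist()
        valid = [r for r in scan.ranges
                 if scan.range_min < r < scan.range_max]
        if valid and min(valid) < 0.5:   # assumed threshold (meters)
            cmd.angular.z = 0.5          # rotate away from the obstacle
        else:
            cmd.linear.x = 0.2           # path looks clear
        cmd_pub.publish(cmd)

    rospy.init_node("pml_avoider")
    cmd_pub = rospy.Publisher("cmd_vel", Twist)
    rospy.Subscriber("scan", LaserScan, scan_callback)
    rospy.spin()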

10 Aug 2010 (updated 25 Aug 2010 at 14:54 UTC) »
Robot Agents, Messages and The Society of Mind

I recently converted most of the C# code for my Pi Robot project to Python. At the same time, I am changing the programming architecture to use message passing among nodes. To get started, I wrote up a little introduction to the topic at:

Robot Agents, Messages and The Society of Mind
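
As a toy illustration of the idea (mine, not code from the article): each agent runs independently and interacts with the others only by passing messages through a shared queue.

    # Two "agents" that know nothing about each other's internals and
    # communicate only through a message queue. All names are illustrative.
    import threading
    import Queue  # renamed to "queue" in Python 3

    def sonar_agent(outbox):
        # Pretend to take a reading and announce it as a message.
        outbox.put(("sonar_cm", 42.0))

    def motor_agent(inbox):
        topic, value = inbox.get()  # block until a message arrives
        print("motor agent heard %s = %s" % (topic, value))

    channel = Queue.Queue()
    agents = [threading.Thread(target=sonar_agent, args=(channel,)),
              threading.Thread(target=motor_agent, args=(channel,))]
    for t in agents:
        t.start()
    for t in agents:
        t.join()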

30 Apr 2010 (updated 30 Apr 2010 at 23:32 UTC) »
An Introduction to Robot Coordinate Frames

I finally had a chance to write up the math behind the Pi Robot arm tracking video. Keep in mind that I am only using the two shoulder joints in each arm--the elbow and wrist servos are fixed--so the inverse kinematics is fairly straightforward. Later on I'll have to deal with the other joints...

Here is the link to the write-up:

http://www.pirobot.org/blog/0011/
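
For the impatient, the two-joint pointing solution boils down to a couple of arctangents. A minimal sketch, assuming the target position is already expressed in the shoulder's own frame with x forward, y to the left, and z up (the axis conventions and function name are mine):

    from math import atan2, sqrt

    def shoulder_angles(x, y, z):
        """Rotate the shoulder so the outstretched arm points
        at the target (x, y, z) given in the shoulder frame."""
        yaw = atan2(y, x)                   # swing left/right first
        pitch = atan2(z, sqrt(x*x + y*y))   # then raise/lower the arm
        return yaw, pitch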

--patrick

23 Mar 2010 (updated 24 Mar 2010 at 13:51 UTC) »
Visually-Guided Grasping

Here is a follow-up video to my previous blog entry. In this video, a number of independent behavioral threads run concurrently to enable the robot to track and grasp the green balloon. Whenever the green balloon is grasped, the robot turns its attention to the red balloon; when the green balloon is released, tracking returns to it and the red balloon is ignored. I use RoboRealm to do the green/red tracking. A sonar sensor on the inside of the left hand tells the robot when something is ready to be grasped. The robot can also do this using vision alone along with some trigonometry, but the result is more reliable when using the sensor.
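
In skeleton form, the threading arrangement looks something like the sketch below; the robot object and its methods are hypothetical stand-ins for the actual code, and the 5 cm threshold is made up:

    # Each behavior runs in its own thread; the threads coordinate only
    # through shared state on the robot object. All names are hypothetical.
    import threading
    import time

    GRASP_RANGE_CM = 5.0  # assumed sonar threshold, not the real value

    def track_behavior(robot):
        while robot.running:
            robot.center_camera_on(robot.current_target)
            time.sleep(0.05)

    def grasp_behavior(robot):
        while robot.running:
            if robot.hand_sonar_cm() < GRASP_RANGE_CM:
                robot.close_gripper()
                robot.current_target = "red"  # attention shifts after a grasp
            time.sleep(0.05)

    # threading.Thread(target=track_behavior, args=(robot,)).start()
    # threading.Thread(target=grasp_behavior, args=(robot,)).start()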

--patrick

http://www.pirobot.org

Robotic Eye-Hand Coordination

I just finished up some work on using RoboRealm to guide my robot as it reaches toward a target object. The ultimate goal is for the robot to be able to pick up the object from a random location or take it from someone's hands. For now, I simply wanted to work out the coordinate transformations from visual space to arm space to get the two hands to point in the right direction as the target is moved about. The following video shows the results so far:

I don't have a full write-up yet on how I did this, but it basically uses 3-D coordinate transformations from the head angles and the distance to the target (as measured by sonar and IR sensors mounted near the camera lens) to a frame of reference attached to each shoulder joint. The Dynamixel AX-12 servos are nice for this application since they can be queried for their current position. The distance to the balloon as measured by the sonar and IR sensors is a little hit-or-miss, and I think I'd get better performance using stereo vision instead.
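
Until the write-up is ready, here is the gist of the transformation sketched in Python; the axis conventions (x forward, y left, z up), the rotation order, and the fixed head-to-shoulder offset are placeholders rather than the robot's actual numbers:

    # Rotate the camera-frame target by the queried pan/tilt angles,
    # then shift by the fixed head-to-shoulder offset.
    import numpy as np

    def rot_z(a):  # head pan
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    def rot_y(a):  # head tilt
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def target_in_shoulder_frame(pan, tilt, range_m, shoulder_offset):
        target_cam = np.array([range_m, 0.0, 0.0])  # straight ahead
        target_torso = rot_z(pan).dot(rot_y(tilt)).dot(target_cam)
        return target_torso - np.asarray(shoulder_offset)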

--patrick

http://www.pirobot.org

8 Jan 2010 (updated 8 Jan 2010 at 22:32 UTC) »

Hello,

I put together a new robot using Dynamixel AX-12+ servos, and I wanted to test an algorithm for tracking a moving object. The camera is a DLink 920 wireless camera operating over 802.11g, and the visual tracking is done using RoboRealm. All processing is done on my desktop PC. The full write-up can be found here:

http://www.pirobot.org/blog/0008/
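
At its core, this kind of tracking loop is just a proportional controller: nudge the pan and tilt servos by an amount proportional to how far the tracked object sits from the image center. A sketch with made-up gain and image size (not the code from the write-up):

    IMAGE_W, IMAGE_H = 320, 240   # assumed camera resolution
    GAIN = 0.002                  # radians of servo motion per pixel of error

    def update_pan_tilt(blob_x, blob_y, pan, tilt):
        """One control step: returns the new servo angles in radians."""
        pan += GAIN * (IMAGE_W / 2.0 - blob_x)
        tilt += GAIN * (IMAGE_H / 2.0 - blob_y)
        return pan, tilt  # command the AX-12s with these each frame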

--patrick

