Presented Master Thesis on a 6-DOF Educational Manipulator

On December 10th, 2014, José María presented his Master Thesis on an educational 6-DOF manipulator programmable under Simulink.

Educational Manipulator Programmable under Simulink

The arm has been designed to be used on the PIERO platform, and built using Dynamixel servos controlled by an Arduino Mega.

Good work.

Thanks!

Presented Master Thesis on Navigation for Social Mobile Robots

On July 15th, 2014, Antonio Martín presented his Master Thesis on Navigation for Social Mobile Robots.

He developed a ROS-based platform that uses an RGB-D sensor for people detection. Some sonars were added for local obstacle avoidance.

The ROS-based mobile platform developed

The original report, “Robot móvil con comportamientos sociales” (“Mobile robot with social behaviours”), can be downloaded.

Thanks, Antonio!


Presented Master Thesis on Educational Mobile Robots

On July 15th, 2014, J. Gil Lozano presented his Master Thesis on Educational Mobile Robots programmable under Matlab/Simulink. The original title is “Sistema didáctico para programación de robots móviles en Matlab/Simulink” (“Educational system for programming mobile robots in Matlab/Simulink”).

The student participated in the development of the PIERO mobile platform for teaching robotics and mechatronics. He worked with version 1.0 of the platform and developed lab work using both the Arduino and the Raspberry Pi computing boards.

More details will be published at the annual conference of the Spanish society of automation.

The report (Project Report) is available, although it was written in Spanish.

Thanks, Juan!

Presented Master Thesis on Remote Environment Modelling for Off-line Robot Programming

The student J.C. Camacho presented a work entitled “3D MODEL DRIVEN DISTANT ASSEMBLY” on remote environment modelling for off-line programming using computer vision.

He uses structured light and robot parallax to build geometrical models of the objects found. These models are later used to build automatic assembly plans.

Virtual pillars and camera motion are used for model building.

This work was developed in the Wise-ShopFloor research group at the University of Skövde, where he worked with J. Cana Quijada under the supervision of Abdullah Mohammed.

A remote programming interface was also developed

The project report (3D model driven distant assembly) was written in English and includes an abstract in Spanish.


Wall-E Animatronic Robot at a Comic Conference

Last weekend, Wall-E showed off its latest capabilities:

– Head and arm gesture control.

– Locomotion control.

– Wireless communication with the central controller (Raspberry Pi based).

– LoL (Lots of LiPos).

Esteban and Felix worked very hard to get all this running together.

People enjoyed watching this expressive character.

Among the things still to be done are:

– Minor surgery on its left arm.

– A little brain able to link all its sensors with the locomotion and gesture system.

I’m using this platform as a valuable tool for Master Thesis students. They learn a lot and have a good time, so more students are wanted to complete the robot.

… and by the way, if you are in the area and interested in a show, let us know!

Cheers


The PIERO mechanical design available for download

PIERO is a mobile platform for educational purposes. It is modular, layered, open source, Arduino controlled, and can be programmed under the Simulink environment.

If you are interested in the project, visit the PIERO page at: http://gomezdegabriel.com/wordpress/projects/piero-mobile-robot-platform/ and download the latest CAD files in SketchUp format.


How to remotely reset your Arduino with Bluefruit from Matlab

I’m using the Bluefruit Bluetooth adaptor for the Arduino-controlled mobile robot PIERO.

Soldering JST-PH male socket connectors to the Bluefruit modules

This module is very useful for mobile robotics, since it allows you to communicate wirelessly with your Arduino and, more importantly, to program the Arduino remotely.

In order to reprogram your Arduino, the module needs (apart from dynamically adapting its UART speed) to invoke the Arduino bootloader by resetting the microcontroller. This is done using the module’s DTR output. This signal is connected to the Arduino’s reset line through a high-pass filter (an electrolytic capacitor in series), as described in this tutorial. This hardware handshake signal can be controlled if you are using Bluetooth EDR 4.0 or later, so choose your computer’s adapter carefully.

If you just want to reset your Arduino remotely and you are using Matlab, all you have to do is switch the DTR (Data Terminal Ready) line on and off. Assuming that the COM port for the Bluetooth device is “COM20”, write in the Matlab console:

s = serial('COM20');          % create the serial port object for the Bluetooth link
fopen(s);                     % open the connection
s.DataTerminalReady = 'on';   % assert DTR...
s.DataTerminalReady = 'off';  % ...and release it; the edge produces a short pulse on the reset line
fclose(s);                    % close and release the port

This way, the Arduino receives a short pulse whose duration depends on the capacitor size: long enough to reset the micro, but short enough to let your program run afterwards.
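
As a rough estimate (assuming the typical 0.1 µF auto-reset capacitor and the 10 kΩ pull-up on the Arduino’s reset line, values not stated in this post), the pulse decays with a time constant of about $\tau = RC = 10\,\mathrm{k\Omega} \times 0.1\,\mu\mathrm{F} = 1\,\mathrm{ms}$, which is plenty to trigger a reset without holding the microcontroller in reset afterwards.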

Enjoy

PIERO is almost ready

More laser-cut acrylic components arrived from Ingeniería UNO workshops for the PIERO educational robot.

The finish is excellent and everything is looking good.

The Arduino Megas are stuck in a post office in Barajas and it’s going to take a while to get them.

So far we have tested the L298 with the EMG30 and, although the encoder resolution is low, a x4 quadrature decoding interface is going to be enough for the platform’s PID speed control.


Thank you, Alejandro. Great design and accurate manufacturing!

Soon, the fully assembled prototypes.

Presented Master Thesis on Facial Expressions for an Animatronic Robot

Daniel Casale presented his Master Thesis project in December 2013, obtaining the maximum qualification.

Thanks to Daniel’s good work, the robot’s head is finally moving and making recognizable expressions despite the lack of a “mouth”.

He also conducted a survey at the School of Engineering of the Universidad de Málaga in order to measure how accurately the set of facial expressions is recognized.

The rest of the animatronic platform was started years ago and is documented (in Spanish) on the amateur blog: http://hombremecatronico.es/2010/05/electronica-de-control-de-la-cabeza/

Some other projects are to be developed on this platform. Interested students, please contact me.

Thanks again, Daniel.

Mobile Robot Modelling and Simulation in Simulink (Ideal Robot)

The goal of this entry is to learn how to build a simulation model that can be used to obtain the position of a mobile robot moving on a flat surface, knowing its local velocities. Let’s start by introducing this ideal vehicle model in a two-dimensional world.

Ideal mobile robot model

Let’s assume that the mobile robot moves over a flat surface (the X-Y plane) with arbitrary linear and angular speeds $V_c$ and $W_c$.

If $\{C\}$ is the platform reference frame, it is natural to consider the vehicle’s local linear and angular speeds $^cV_c$ and $^cW_c$, possibly caused by wheels attached to the platform. Those velocities are a function of the vehicle’s wheel velocities; however, the wheels are not considered yet.

But how can we know the absolute global position of the robot at any moment?

I’m sorry to say that there is no closed-form analytical solution for this in general. However, a numerical solution can be found by simulation.

Global mobile robot coordinate frames

The absolute robot position and orientation are given by $P_c$ and $\psi$.

The local linear speed $^cV_c$ of the robot has to be rotated according to the robot orientation $\psi$ in order to obtain $V_c$. However, $^cW_c$ remains unchanged by the vehicle orientation.

As a result, the vehicle’s position is obtained by integrating the vehicle’s velocity expressed in the global reference frame.

Every integration needs an initial value, which in this case is the starting (initial) pose of the platform.

This can be summarized very easily in the corresponding Simulink model:

Simulink model for an ideal mobile platform moving on a flat surface

In this model, the block labeled “Velocidad Local a global” (local-to-global velocity) implements a rotation about the z axis applied to the input vector. Note that the third component of the input vector is the angular speed and remains unchanged.

$$ R_z(\alpha) = \begin{bmatrix} \cos(\alpha) & -\sin(\alpha) & 0 \\ \sin(\alpha) & \cos(\alpha) & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

$$ V_c = R_z(\psi)\,{}^cV_c $$

$$ P_c(t) = P_c(0) + \int_0^t \! V_c(\tau) \, \mathrm{d}\tau $$

Rotation along the Z axis. Not my favorite implementation.

The integrator converts the velocities into positions. In this case, a vector with the vehicle’s initial pose has been included as the integrator’s initial condition. Note also that the angular position is fed back to the rotation block.
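
If you want to check the model outside Simulink, the same ideal kinematics can be sketched as a plain Matlab script using a simple fixed-step (Euler) integration. This is only a minimal sketch: the speed values, step size and variable names below are my own choices and are not taken from the Simulink model.

% Minimal sketch: ideal mobile robot kinematics integrated with Euler steps
cVc = [0.5; 0];    % local linear speed in frame {C} [m/s] (assumed values)
cWc = 0.5;         % local angular speed [rad/s] (assumed value)
dt  = 0.01;        % integration step [s]
N   = round(10/dt);          % 10-second simulation
pose = [0; 0; 0];            % initial pose [x; y; psi]
traj = zeros(3, N);
for k = 1:N
    psi = pose(3);
    Rz  = [cos(psi) -sin(psi) 0; sin(psi) cos(psi) 0; 0 0 1];
    Vc  = Rz * [cVc; cWc];   % local-to-global rotation (third component unchanged)
    pose = pose + Vc*dt;     % integrate the global velocities into the pose
    traj(:,k) = pose;
end
plot(traj(1,:), traj(2,:)); axis equal   % followed path, as in the XY Graph block

With constant inputs, this script traces the same kind of circular path as the Simulink model above.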

Simulation results can be recorded for later use, but for model verification two graphical outputs have been added: the robot trajectory (Cartesian coordinates over time) using the Scope block, and a plot of the followed path using the XY Graph block.

In the example model above, with constant local speeds, we get the following trajectory for $X$ (yellow), $Y$ (purple) and orientation $\psi$ (turquoise) over a 10-second simulation.

Circular trajectory as a result of constant linear and angular speeds

The resulting path for this trajectory is shown by double-clicking on the XY Graph block after the simulation is complete.

Circular path as a result of constant linear and angular speeds

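As a quick sanity check (this is the general relation only, since the constant speed values are not listed here), a constant linear speed and a constant angular speed produce a circle whose radius is their ratio:

$$ R = \frac{\lVert {}^cV_c \rVert}{\lvert {}^cW_c \rvert} $$

so the radius seen in the XY Graph should match the ratio of the two constant inputs.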

Now we know how to model and simulate the motion of a mobile robot. However, our platform needs some actuators, like wheels, legs or propellers…

In future posts we’ll learn how to add the actuators’ models to the robot simulation.

See you soon.