# Goodbye, “Computer Controlled Systems” subject

I had been teaching this subject for 25 years, until today, at the end of the 2013-14 course.

Study programs change, and this subject belongs to a program that no longer exists. I have just graded the exams of its last course.

This year I had the largest number of students I have ever had in this subject. That forced me to reorganize the laboratory lessons, with extra work, but I didn’t mind because I had such a nice group of students.

I feel a bit sad about it but I’m excited about what’s to come! I’m hungry, and way too foolish.

Computer Controlled Systems 2013-14 by Slidely Slideshow

Thanks to absolutely all my students.

# Presented Master Thesis about a 3D Printer Color Extruder

Today, Víctor Andueza presented his Master Thesis for the Spanish Industrial Engineering degree. He gave a great presentation (soon available in Spanish) about a large project developed in the laboratories of the Department of Systems Engineering and Automation of the University of Málaga.

He designed and built a multifilament ABS color extruder for 3D printing, capable of generating virtually any color from three primary colors plus black and white. It is a multidisciplinary work in which the analysis, design, implementation and experiments were all carried out by the student.

Thanks to the examination committee too: they asked about many aspects of the work and sustained an interesting debate. Víctor obtained the highest mark. Congratulations!

Once again, this work would not have been possible without the collaboration of UNO Engineering. Thanks, Alejandro.

# Stay tuned

More laser-cut acrylic components arrived from Ingeniería UNO workshops for the PIERO educational robot.

The finish is excellent and everything is looking good.

The Arduino Mega boards are stuck in a post office in Barajas, and it is going to take a while to get them.

So far we have tested the L298 driver with the EMG30 motors, and although the encoder resolution is low, a x4 quadrature decoding interface will be enough for the platform’s PID speed control.

Thank you, Alejandro. Great design and accurate manufacturing!

Soon, the fully assembled prototypes.

# Presented Master Thesis about Fuzzy Control of Blood Glucose Level

Thanks to David Castro Chica, who developed and presented his Master Thesis project in December 2013.

A simulation model was developed in MATLAB, and two fuzzy glucose-level controllers were implemented and tested.

These controllers could be implemented on a portable insulin pump.

He also obtained the highest mark.

# Start Programming a Falcon haptic with LabVIEW

If you already have the “hdl” library, it is time to start programming. Otherwise, you can learn how to build it yourself here.

An impedance-type haptic device like this one is very simple: it receives Cartesian forces and returns its position. As it is a three degrees-of-freedom (DOF) system, only forces and positions along the X, Y and Z axes are exchanged. The status of the tool-tip buttons can also be read.

The Cartesian workspace is about 20 cm wide, with its origin at the center position. The Y axis points up and the Z axis points toward the user.

The good news is that, thanks to the concurrent nature of the LabVIEW language, and despite the lack of a conventional class architecture, programming the haptic is easier than it may seem.

## Minimum example

First you need to “init” the device to get a device ID. If this fails (the device is not ready), a -1 is returned and your program should stop. Otherwise, you can “start” the servo thread and make that ID the default for the read and write operations. It is your responsibility to read from and write to the haptic repeatedly, at a rate of about 1000 times per second (yes, 1000); a lower rate just produces a poorer haptic illusion. At the end you must “stop” the servo thread and “uninit” the used ID.
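LabVIEW block diagrams cannot be shown as text, so here is the same lifecycle sketched in Python. `FakeHaptic` is a stand-in for the real device: its method names mirror the steps just described, not the actual hdl API.

```python
import time

class FakeHaptic:
    """Stand-in for the real device; the actual hdl call names differ."""
    def init(self):            # returns a device ID, or -1 on failure
        return 0
    def start(self):           # start the servo thread
        pass
    def read(self):            # position (x, y, z) and button states
        return (0.0, 0.0, 0.0), (False, False, False, False)
    def write(self, force):    # send a Cartesian force (fx, fy, fz)
        pass
    def stop(self):
        pass
    def uninit(self, dev_id):
        pass

def servo_loop(haptic, seconds=1.0, rate_hz=1000):
    dev_id = haptic.init()
    if dev_id == -1:           # device not ready: stop right here
        raise RuntimeError("haptic not ready")
    haptic.start()
    cycles = 0
    period = 1.0 / rate_hz     # about 1 ms per read/write cycle
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        position, buttons = haptic.read()
        haptic.write((0.0, 0.0, 0.0))   # zero force: read-only behaviour
        cycles += 1
        time.sleep(period)
    # always stop and uninit, or the driver is left in a bad state
    haptic.stop()
    haptic.uninit(dev_id)
    return cycles
```

The important part is the shape: init, start, a fast read/write loop, then stop and uninit, always.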

Warning: Always “uninit” your device ID before “initing” another one; otherwise your computer will get a bad digestion and you will need to restart LabVIEW. So don’t forget to stop your program with your panel Stop button instead of the “Abort” button. You have been warned!

Here, if the haptic is not ready, you get a message and your program stops.

You can start by making a read-only program like this one. The haptic is not going to move, so don’t be afraid to try:

Check in the user panel that the buttons and the position readings are working. You can use a chart and the default array indicators.

Remember to stop the program by pressing the “Stop” Boolean control.

Now you are ready for the next step.

## Use the force, Luke

Let’s make another example, this time with force generation. A simple elastic model is used: the output force opposes, and is proportional to, the displacement of the haptic from its workspace origin.

The elasticity constant can be changed on-line, and the positions are shown in a chart.
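The control law itself is a one-liner. Here it is as plain Python (not LabVIEW; the function name is mine):

```python
def elastic_force(position, k):
    """Elastic model: force opposing, and proportional to, the
    displacement of the tool tip from the workspace origin (F = -k * p)."""
    return tuple(-k * p for p in position)

# Inside the 1 kHz servo loop you would read the position,
# compute the force and write it back to the haptic, e.g.:
#   force = elastic_force(haptic_position, k=80.0)
```

The larger the elasticity constant `k`, the stiffer the virtual spring the user feels at the tool tip.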

You can see the program in action in the following video.

Here you can see the behaviour of the haptic with different elasticities.

## If you are lazy…

Here you can find the hdl library (hdl Library). Make sure you have installed the latest Falcon drivers and SDK, and then copy the “hdl” folder into your LabVIEW user.lib folder.

The archive with the two example VIs from above is here (FalconLabVIEWExamples).

This is a very simple model and I’ll be happy to see your programs.

Cheers!

# Presented Master Thesis about Facial Expressions on Animatronic Robot

Daniel Casale presented his Master Thesis project last December 2013, obtaining the highest mark.

Thanks to Daniel’s good work, the robot’s head is finally moving and making recognizable expressions despite the lack of a “mouth”.

He also conducted a survey at the School of Engineering of the Universidad de Málaga in order to measure how accurately the set of facial expressions is recognized.

The rest of the animatronic platform was started years ago and is documented in the amateur Spanish blog: http://hombremecatronico.es/2010/05/electronica-de-control-de-la-cabeza/

Thanks again, Daniel.

# Mobile Robot Modelling and Simulation in Simulink (Ideal Robot)

The goal of this entry is to learn how to build a simulation model that can be used to obtain the position of a mobile robot moving on a flat surface, knowing its local velocities. Let’s start by introducing this local ideal vehicle model on a two-dimensional world.

## Ideal mobile robot model

Let’s assume that the mobile robot moves over a flat surface (the X-Y plane) with arbitrary linear and angular speeds $V_c$ and $W_c$.

If $\{C\}$ is the platform reference frame, it is natural to consider the vehicle’s local linear and angular speeds $^cV_c$ and $^cW_c$, possibly produced by wheels attached to the platform. Those velocities are a function of the vehicle’s wheel velocities; however, the wheels are not considered yet.

But how can we know the absolute, global position of the robot at any moment?

I’m sorry to say that there is no closed-form analytical solution for this. However, a numerical solution can be found through simulation.

Global mobile robot coordinate frames

The absolute robot position is given by $P_c$ and $\psi$.

The local linear speed $^cV_c$ of the robot has to be rotated according to the robot orientation $\psi$ in order to obtain $V_c$. The angular speed $^cW_c$, however, is unchanged by the vehicle orientation.

As a result, the vehicle’s position is the integral of the vehicle’s speed with respect to the global reference frame.

Every integration process needs an initial value, which in this case is the initial position of the platform.

This can be summarized very easily in the corresponding Simulink model:

Simulink model for an ideal mobile platform moving on a flat surface

In this model, the block labeled “Velocidad Local a global” (local-to-global velocity) is a rotation matrix that rotates the input vector about the Z axis. Note that the third component of the input vector is the angular speed, which remains unchanged.

$$R_z(\alpha) = \begin{bmatrix} \cos(\alpha) & -\sin(\alpha) & 0 \\ \sin(\alpha) & \cos(\alpha) & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

$$V_c = R_z(\psi)\,{}^cV_c$$

$$P_c(t) = P_c(0) + \int_0^t \! V_c(\tau) \, \mathrm{d}\tau$$

Rotation about the Z axis. Not my favorite implementation.

The integrator converts speeds into positions. In this case, a vector with the vehicle’s initial position has been included. Note also that the angular position is fed back to the rotation block.

Simulation results can be recorded for later use, but for model verification two graphical outputs have been added: the robot trajectory (Cartesian coordinates over time), using the Scope block, and a plot of the followed path, using the XY Graph block.

In the example model above, with constant local speeds, we get the following trajectory for $X$ (yellow), $Y$ (purple) and orientation $\psi$ (turquoise) over a 10-second simulation.

Circular trajectory resulting from constant linear and angular speeds

The path followed in this trajectory is shown by double clicking on the XY Graph block after the simulation is complete.

Circular path resulting from constant linear and angular speeds
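The Simulink diagram itself cannot be embedded in text, but the same equations can be cross-checked with a few lines of Python using forward Euler integration (the function and its parameter names are mine, not part of the Simulink model):

```python
import math

def simulate(v_local, w, t_end=10.0, dt=0.001, x0=0.0, y0=0.0, psi0=0.0):
    """Integrate the ideal planar robot model with forward Euler.

    v_local: local linear speed (vx, vy) in the platform frame {C}
    w:       local angular speed (unchanged by the rotation)
    Returns the final global pose (x, y, psi)."""
    x, y, psi = x0, y0, psi0
    steps = int(t_end / dt)
    for _ in range(steps):
        vx, vy = v_local
        # rotate the local velocity to the global frame: V_c = R_z(psi) cV_c
        gx = math.cos(psi) * vx - math.sin(psi) * vy
        gy = math.sin(psi) * vx + math.cos(psi) * vy
        x += gx * dt
        y += gy * dt
        psi += w * dt
    return x, y, psi
```

With constant linear speed $v$ and angular speed $w$, the path is a circle of radius $v/w$, which is exactly the behavior shown in the plots above.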

Now we know how to model and simulate the motion of a mobile robot. However, our platform needs some actuators: wheels, legs, propellers…

In future posts we’ll learn how to add the actuators’ models to the robot simulation.

See you soon.