Author Archives: paul

M3D Micro Unpacking and Software

Unpacking

The M3D printer is the result of a successful Kickstarter campaign in which 11,855 backers pledged $3,401,361 USD. I came across this beautiful little device at the Chemnitzer Linux-Tage 2016 and somehow fell in love. I immediately ordered one from the company's website printm3d.com, which was possible because the company had started selling various models shortly before. Having the choice, I ordered a crystal clear case for $349 USD, plus a $25 USD fee for being clear and $49 USD for packaging and worldwide express shipping.

Only a few days later a little package arrived at my door, carried by a UPS delivery guy asking for an additional VAT of about 80 € EUR, which I had expected but was never prepared for by the manufacturer. Overall, in Germany the printer is not as cheap as promoted everywhere, but unpacking it was still really pleasant:

I took these pictures a while ago; nowadays there are nice videos of the unpacking process that I highly recommend:

Youtube Video

Software

Besides the price, there are other pros and cons to this printer. First of all, its software is genius and catastrophic at the same time. From a beginner's point of view, having a nice, catchy Windows application that runs on the latest platform and service pack is perfect. But having no Linux interface and no reliable instructions for getting a working setup is nothing M3D should be proud of. It took me several days to connect the printer to my VirtualBox Windows environment. It never ran stably and embarrassed me and my enthusiasm in front of friends I wanted to show the little magic cube.

Nevertheless, there is hope! A nice plugin called M3D-Fio exists for the multi-printer software OctoPrint. It can be installed on a simple Raspberry Pi and offers a website where you can upload .stl files, slice them to M3D G-code using Ultimaker's slicing software Cura, and flash the current M3D firmware. Only by that is the printer usable to me. And it has the nice side effects of letting me leave the room and take my laptop with me, and of saving a lot of energy by not requiring a dedicated Windows machine to run while printing.


My Octoprint setup.

For the sake of completeness: I can't evaluate the macOS version of the software, because I do not use an Apple laptop.

M3D – while printing

Overall I can say the M3D I bought was really worth it.

Leap Motion in a low cost robotics context

As a partner of ASUS, the Leap Motion uses the same emitted and reflected infrared light for tracking parts of the human body as the ASUS Xtion Pro. Available since July 2013, the Leap Motion is, at about 90 EUR, an inexpensive but limited input device, optimized for tracking fingers and hands as shown in the following illustrations.


The Leap Motion visualization API allows tracking of two hands and advanced gesture recognition


The Leap Motion device emits infrared light, which can be seen with an unfiltered camera

The main features include tracking two hands simultaneously, with gesture recognition for all ten fingers. At distances between 10 cm and 1 m in daylight, the device works reliably.

During my thesis, I tested the existing ROS driver, which currently only supports one hand and was not able to provide 3D point cloud data. In brief, the Leap Motion is unfortunately inappropriate for our project, as its only use would be unreliable robot control by hand gestures.

ROS Basics – challenges in the robotic low cost context

Applications in robotics need to solve a lot of computationally intensive tasks. While some of them can be outsourced to an externally powered device like a laptop or a server, others essentially must be calculated on the UGV.

Examples are collecting sensor data, receiving and executing commands, and streaming data. Balancing these is a challenging task, because on concurrently executing systems all processes can influence each other, especially when computational power is cut down to the limit in order to save energy. Like most libraries, frameworks, or software environments, ROS requires additional resources compared to a single-purpose application.

In conclusion, providing enough computational power while using a reasonable amount of energy is an important problem to solve.

Physical properties
Physical dimensions and requirements result from a tradeoff between cost and size, where smaller UGVs tend to be more expensive and complex. On the other hand, an upper bound, among others, is set by being manageable in terms of transport and storage.

The low cost target UGV is a ground robot driven by four wheels or two tracks, with physical dimensions below 150 mm × 300 mm × 300 mm (height, width, length). The drive should accordingly deliver an effective torque of more than 100 Ncm for moving, or holding torque in case of a stronger slope. Additionally, tracks are the preferred primary propulsion system, as they have better grip and only require simple motor control. Another nice-to-have would be the capability of turning on the spot, which would allow operating in small areas and facilitate 3D scans of rooms without moving further than required. Another optional point, if the robot is going to be used outside of buildings or around kids, is a splash-proof case that would increase the robot's life. Furthermore, modular extensibility would increase the usability of the robot significantly.

Modular design

In order to meet the tremendous requirements of robotics in a low cost context, we need to think outside the box while structuring the challenges into solvable problems. As the following graph shows, we should divide the functionality of UGVs into four main modules: first, Sensors, the parts the robot requires to sense the outside world; next, Accumulators, serving and storing power; followed by Processors, the units processing the information gathered by the Sensors; and finally, Actuators, which provide physical movement. These areas in turn are separated into further sections, which we discuss one by one in the next posts.
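As a rough illustration, this decomposition could be modelled like the following sketch. All class and attribute names here are invented for this example and are not part of any real ROS API:

```python
class Module:
    """Base class for a UGV module with a power budget in watts."""
    def __init__(self, name, power_draw_w):
        self.name = name
        self.power_draw_w = power_draw_w

class Sensor(Module): pass        # senses the outside world
class Accumulator(Module): pass   # serves and stores power
class Processor(Module): pass     # processes information gathered by Sensors
class Actuator(Module): pass      # provides physical movement

def total_draw(modules):
    """Sum the power draw of all modules that consume power."""
    return sum(m.power_draw_w for m in modules if not isinstance(m, Accumulator))

# An example configuration with one module per area:
ugv = [Sensor("depth_camera", 2.5), Accumulator("battery", 0.0),
       Processor("sbc", 4.0), Actuator("track_motors", 10.0)]
print(total_draw(ugv))  # 16.5
```

A structure like this makes the tradeoff from the previous post explicit: the Accumulator has to cover the summed draw of all other modules.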

[Diagram: the functionality of a UGV divided into Sensors, Accumulators, Processors and Actuators]

ROS Basics – ROS in a low cost robotic context

UGVs like those found in industry, education, or do-it-yourself (DIY) communities are currently not affordable for the average technology enthusiast, teachers in schools, or sometimes even universities. The concept of low cost robots tries to solve that issue.

What is low cost in a robotic context?

The traditional interpretation of low cost is minimizing expenses while keeping the most important features. Within the borders of mostly expensive robotics, this term needs the same differentiation as between cheap, meaning coming with a significantly reduced price and quality, and keen, considered as maintaining a certain amount of quality at a reduced total cost. For example, the 50,000 USD UBR-1 is a low cost version of the 250,000 to 400,000 USD PR2 by Willow Garage, but it is still far away from the term cheap in the common sense. Another example, and at the same time another robot Melonee Wise worked on, is the TurtleBot, which was constructed with the intent to be the lowest cost version of a ROS robot at the time of its creation.

How to achieve low cost?

There is no general solution to this problem. But one approach in the robotic context is to replace expensive single purpose solutions, produced by companies in low quantities, with mass produced products that get customized to suit the application.
A demonstration of this positive misuse are the first versions of the TurtleBot. Instead of constructing the robot with an expensive 3D laser scanner, it was equipped with a Microsoft Kinect originating from the gaming industry. Furthermore, it used an iRobot Roomba and later an iRobot Create as a low cost mobile base, as constructing a custom movable footprint would have been far more expensive. Also, the mass produced product came at a lower cost and with unharmed warranty. An important side effect of these replaceable parts is independence from unique, cost intensive and sometimes, due to customs regulations, not easily accessible parts. The power to choose a cheap replacement at any time thereby reduces overall expenses and total risk.

As a consequence, a low cost UGV should be easy to build and reproduce, affordable for education, and able to run ROS with some kind of 3D measuring device. It should further consist of easily obtainable or replaceable parts.
In conclusion, these properties lead to a modular design concept with communication interfaces between the inexpensive components. A certain degree of flexibility is also required to maintain extensibility and independence from expensive parts.

ROS Basics – ROS Transformations

One of the most important packages a ROS robot should implement is TF (Transformations), because it enables the robot to keep track of multiple coordinate systems (frames) and their relations to each other over time. Following the ROS Enhancement Proposals (REPs), especially REP 105, the most global frame should be the world frame. Every other frame derives from it in a tree structure and can be transformed back into world coordinates using the units of measurement defined in REP 103.

Another important frame tree is the robot itself. Starting with a mobile base_link, further attached elements called links, like wheels or cameras, have their own frames and are connected via relations, also called joints. Those joints can be static or dynamic. A sample configuration can be seen in the following images:

To define a robot, ROS offers a special XML description file using the Unified Robot Description Format (URDF), which is further improved by special markups and an additional interpreter called XML Macros (XACRO). In ROS, all non-time-related relations can be defined in a single file and published periodically by the robot_state_publisher, for example for simulation purposes. In advanced setups, publishing the robot's joint states, and especially the relation of the base_link, is a complex task. Therefore it is divided into separate processes like navigation, mapping, or the hardware controllers.
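To make the idea of chaining frames concrete, here is a minimal 2D sketch in plain Python. The frame names and poses are invented for illustration; a real ROS setup would use the tf library instead of doing this by hand:

```python
import math

def make_tf(tx, ty, yaw):
    """2D homogeneous transform (3x3) from a child frame into its parent frame."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, tx], [s, c, ty], [0, 0, 1]]

def apply(tf, point):
    """Transform a (x, y) point with a homogeneous transform."""
    x, y = point
    return (tf[0][0] * x + tf[0][1] * y + tf[0][2],
            tf[1][0] * x + tf[1][1] * y + tf[1][2])

def compose(a, b):
    """Chain two transforms: first apply b, then a (matrix product a*b)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Example frame tree: world -> base_link -> camera
world_T_base = make_tf(2.0, 1.0, math.pi / 2)  # base_link pose in the world frame
base_T_cam = make_tf(0.1, 0.0, 0.0)            # camera mounted 10 cm in front of base_link
world_T_cam = compose(world_T_base, base_T_cam)

# The camera origin expressed in world coordinates: (2.0, 1.1)
print(apply(world_T_cam, (0.0, 0.0)))
```

This is exactly what TF automates for whole trees of frames, additionally interpolating the transforms over time.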

aMoSeRo source code online on Github

I reactivated my old laptop and released the source code of the aMoSeRo robot. In case you want to build your own, do not hesitate to contact me to get detailed construction info for free.

Have a nice day!

Building tinc1.1pre11 on Ubuntu

Because local package sources currently only offer tinc 1.0* versions, we need to compile tinc ourselves to use 1.1 features like invite or join.

## we need root privileges
sudo -i

apt-get install build-essential liblzo2-dev libssl-dev libncurses5-dev libreadline-dev libghc-zlib-dev

cd /usr/local/src

wget http://www.tinc-vpn.org/packages/tinc-1.1pre11.tar.gz
tar -xvzf tinc-1.1pre11.tar.gz

cd tinc-1.1pre11

./configure --prefix= --sysconfdir=/etc --localstatedir=/var && make && make install

 

esp8266 ws2812b hostname triggered wifi light

Today I built a small wifi light which rotates whenever a certain hostname (my smartphone) is in the local wifi. As these devices tend not to answer ping or ARP requests, and Bonjour or mDNS were too slow, I crawl my DHCP server every five to ten seconds. Additionally, I decreased the DHCP lease time to improve the switch-off response time. As my smartphone usually logs into my wifi instantly when I enter the house, the light usually switches on before the main door has been opened.
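The crawling idea can be sketched roughly like this. The lease page format and the hostname are assumptions for the sketch, not a copy of my actual setup:

```python
# Mocked DHCP lease listing; a real setup would fetch this page from the
# router over HTTP every five to ten seconds.
LEASE_PAGE = """\
192.168.1.23  aa:bb:cc:dd:ee:ff  my-smartphone
192.168.1.42  11:22:33:44:55:66  desktop-pc
"""

def host_present(page, hostname):
    """Return True if the hostname appears in the lease listing."""
    return any(hostname in line for line in page.splitlines())

print(host_present(LEASE_PAGE, "my-smartphone"))  # True

# The lamp's main loop would then look roughly like:
# while True:
#     set_light(host_present(fetch_lease_page(), "my-smartphone"))
#     time.sleep(5)
```

Shortening the DHCP lease time matters because a host only disappears from this listing once its lease expires, which directly bounds the switch-off delay.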

The interior of the lamp consists of seventeen ws2812b RGB LEDs, which I controlled in an intermediate stage with the esp8266 OPC code on github. As this would have required a constant flow of network packets and a device delivering the UDP packages, I later switched over to controlling the LED animation on the esp8266 itself.

Something I discovered today is that soldering the 2 mm grid esp8266 upside down onto a 2.54 mm grid prototyping circuit board improves handling and assembly speed, as well as the size of the final circuit. You can also still see the pin map information on the final product, which is nice.

After putting some hot glue on the board to prevent shorts and improve lifetime, I took some measurements regarding current consumption: about 0.1 A at 5 V, which should result in 0.5 W with a constantly rotating light and wifi crawling. This makes the device capable of running on most USB power supplies. The final result looks like this in action:

Youtube

As the code is very specific and dependent on my local setup, I will not post it on github this time. Just one thing I would have found really helpful on the internet, while I was struggling with an esp8266 that constantly reset without any information, would have been this:

How to grab and parse an HTTP auth protected website with the esp8266 as a client:

bool getPage() {
  bool foundHost = false;

  WiFiClient client; // initialising the client globally leads to crashes

  if (client.connect(http_site, 80)) { // the more common version !client.connect() crashes

    // Create a URI for the request
    String url = "/dhcp";

    // Send the request to the server
    client.print(String("GET ") + url + " HTTP/1.1\r\n" +
                 "Host: " + http_site + "\r\n" +
                 "Authorization: Basic YWRTeW4kYWRmaW4=\r\n" + // HTTP auth as a client (Base64 "user:password")
                 "Connection: close\r\n\r\n");
    delay(500); // you'll need to wait for the response
    String line = "";
    // Read all the lines of the reply from the server
    while (client.available()) {
      line = client.readStringUntil('\r');
      //Serial.print(line);
      if (line.indexOf(hostname) != -1) {
        foundHost = true;
        break;
      }
    }
  } else {
    Serial.println("connection failed");
  }
  return foundHost;
}

In conclusion, this was a nice little project I really enjoyed doing in a sleepless night 🙂 And at about 10 €, plus the lamp I got as a gift a long time ago, it was not that expensive.

Controlling ws2812b with an esp8266 by open-pixel-control protocol

Harder than it looks: controlling a 5 m LED strip with the esp8266 via the Open Pixel Control protocol took me a whole night (which might be the reason for extra bad English, as I write this post directly afterwards). But it's real fun!

There are several ways to make the controller blink; the easiest is shown here:

while true; do ( echo -en '\x00\x00\x02\xA6'; dd if=/dev/urandom bs=678 count=1 status=none ) | ncat --send-only --udp 172.22.99.155 2342; sleep 0.1; done

For the duration of infinity, it sends the static 4-byte header (channel, command, and a 16-bit data length), followed by 8 bits red, 8 bits green and 8 bits blue for each LED of the strip. It gets the blinking values from Linux's source of randomness. It lacks a bit of white, as my power source reached its limits; so if you reimplement this, use 5 V and 1 A per 30 LEDs.

Another thing to mention is the data length field, bytes 3-4 of the header, or \x02\xA6 in the command above. This length needs to equal the number of LEDs times three; so in this example 226 LEDs were controlled, as the two bytes in network byte order work out to 678.
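The length calculation can be double-checked with Python's struct module:

```python
from struct import pack

leds = 226
# OPC-style header: channel byte, command byte, 16-bit big-endian data length
header = pack('!BBH', 0, 0, leds * 3)
print(header)  # b'\x00\x00\x02\xa6'
```

The `!` prefix selects network byte order, which is why 678 ends up as the two bytes \x02\xA6.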

This results in that little animation:

Youtube Video

Another possibility is to send these packets with a small python script like this:

import socket
import time
from struct import pack

HOST = 'your-hostname'
PORT = 2342
NUM_LEDS = 226
colors = [(255, 255, 255), (255, 0, 0), (0, 0, 255), (0, 255, 0)]

for color in colors:
    print("sending color {} {} {}".format(color[0], color[1], color[2]))
    # header: channel, command, 16-bit big-endian data length (226 LEDs * 3 bytes)
    data = [pack('B', 0), pack('B', 0), pack('!H', NUM_LEDS * 3)]

    # one red, green, blue byte per LED
    for i in range(NUM_LEDS):
        data.append(pack('BBB', color[0], color[1], color[2]))

    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    # repeat the packet to compensate for UDP losses
    for i in range(1024):
        s.sendto(b"".join(data), (HOST, PORT))

    time.sleep(0.5)
    s.close()

Code for the controller is on github.

5V Light Detector analog / digital “Flying Fish – MH Sensor Series”

It took me a while to find the purpose of this little device I had in the mail recently. It's a light detection sensor, which I connected to an Arduino Nano to test its functionality. It serves the amount of light from 0 (very bright) to 1024 (very dark) on the analog pin. To use this device with the ESP8266, you'll probably need to adapt the voltage and transform it to between 0 and 1 V. But for now it works fine, costing around 2 EUR 🙂

int sensorPin = A0; // select the input pin for the light sensor
int ledPin = 13; // select the pin for the LED
int sensorValue = 0; // variable to store the value coming from the sensor
 
void setup () 
{
  pinMode (ledPin, OUTPUT);
  Serial.begin (9600);
}
 
void loop () 
{
  sensorValue = analogRead (sensorPin);
  digitalWrite (ledPin, HIGH);
  delay (sensorValue);
  digitalWrite (ledPin, LOW);
  delay (sensorValue);
  Serial.println (sensorValue, DEC);
}

Output looks like this (analog):

21 -> bright
90
68
63
81
81
83
89
78
85
99
558
897
822
882
864 -> dark
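As for the ESP8266 remark above, the 0-1 V adaptation could be done with a simple resistor divider. A quick sanity check of the ratio; the resistor values here are illustrative only, not a tested circuit:

```python
def divider_out(v_in, r_top, r_bottom):
    """Output voltage of a resistor divider feeding the ADC."""
    return v_in * r_bottom / (r_top + r_bottom)

# A 40k over 10k divider scales the sensor's 0-5 V output down to 0-1 V.
print(divider_out(5.0, 40e3, 10e3))  # 1.0
```

In practice the resistors should be large enough not to load the sensor output noticeably, while staying small against the ADC input impedance.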