Autonomous Mobile Robot

Matebot part 1 – Hardware setup

1. The vision

I love Argentinian Mate tea and robotics, so why not combine both? The robot is supposed to drive around autonomously in my room and create a map of it. Once the robot spots a mate teacup, it marks the spot on the map and sends a notification to a phone.

One of the most reliable ways to create an indoor map is to use a Lidar. That’s a sensor that measures the distance to a target by illuminating the target with laser light and measuring the reflected light. By rotating the sensor, the whole surrounding environment is captured.

The mate teacup will be detected by a camera using computer vision and machine learning. For this robot, I'm using a stereo camera, so it's even possible to extract the depth information from the image.

To process the sensor data and the actuator signals, I'm using the open-source middleware ROS (Robot Operating System).

2. Parts

3. Printable Parts

The frame consists of four corner pieces; mount the motors and aluminum profiles to them. The middle profiles need the T and H brackets, which carry all the power electronics. You can find all the printable .stl files here.

4. Assembling MateBot

You can see the main structure of MateBot in the following scheme. First, assemble the frame with the aluminum profiles and motors. Next, we need to mount the components. Every component has a corresponding printable mount so that you can mount it directly to the aluminum profile.

The following table describes the wire functions of the motor:

The motor power pins are connected to the MD13S motor drivers, each of which has a DIR pin and a PWM pin. Be careful! You can only connect the PWM pins to PWM-capable Arduino pins (digital pins 3, 5, 6, 9, 10, 11). I used the following configuration:

Connect the encoder power pins to the corresponding Arduino pins and the encoder data pins to the remaining Arduino pins.

The motor drivers have an input voltage of 12V. The step-down modules have two potentiometers: one limits the output voltage, the other the current. Adjust the first potentiometer until the output voltage is 12V, then connect the motor drivers.

For the second step-down converter, set the output voltage to 5V and connect the NVIDIA Jetson Nano.

The Lidar has two micro-USB ports: one for data and one for power. Connect the USB charger module to the battery and power the Lidar with a micro-USB cable.

5. Code

I will write a second tutorial on installing ROS and the required nodes, but if you want to test the hardware setup, here's some sample code. First, you need to install this Arduino library.

#include "CytronMotorDriver.h"

// Configure the motor drivers.
CytronMD motor1(PWM_DIR, 3, 2);  // PWM = Pin 3, DIR = Pin 2.
CytronMD motor2(PWM_DIR, 5, 4);  // PWM = Pin 5, DIR = Pin 4.
CytronMD motor3(PWM_DIR, 9, 8);  // PWM = Pin 9, DIR = Pin 8.
CytronMD motor4(PWM_DIR, 6, 7);  // PWM = Pin 6, DIR = Pin 7.

// The setup routine runs once when you press reset.
void setup() {
}

// The loop routine runs over and over again forever.
void loop() {
  motor1.setSpeed(150);  // Run forward
  motor2.setSpeed(150);  // Run forward
  motor3.setSpeed(150);  // Run forward
  motor4.setSpeed(150);  // Run forward
}

If you want to test the motor encoders, I recommend checking out this tutorial.

6. Future work

In the next tutorial, we will install Ubuntu and ROS on the Nvidia Jetson Nano. Then you’ll learn how to build a ROS node to control the motors. For the Lidar and stereo Camera, we’ll install the corresponding node and use the Hector Slam algorithm to create a map of the room.

If you are interested in the current progress, check out my Instagram (@code_byter) and GitHub (code-byter) page.

Written by code_byter

Electrical engineering student
💻 #MachineLearning
🤖 #Robotics and #3Dprinting
