Wednesday, December 10, 2025

Laser Mapping

 


Goal of the Project

The goal of this project is to build a navigation system that allows the robot to autonomously explore an unknown environment while generating an occupancy map using LIDAR data.

The robot must:

  • Detect obstacles and free space using a probabilistic laser sensor model.

  • Explore unknown areas by moving toward the closest frontier.

  • Safely update its trajectory as the occupancy grid changes.

  • Move waypoint by waypoint until no more unexplored space is reachable.

This results in a robot capable of mapping an entire warehouse-like area on its own.

 

Project Overview

In this project I implemented a frontier-based exploration system combined with a probabilistic occupancy grid. The idea is simple: the robot keeps expanding the map and always tries to move toward the nearest “unknown border” of the map until the entire environment has been explored.

To do this, the robot maintains a continuous estimate of its own position in pixel coordinates.
The simulator provides the robot position in meters, and I use poseToMap() to convert the robot’s real-world position into the exact pixel indices inside the occupancy grid.

I also use the same conversion to compute the robot’s heading direction in pixel coordinates, which is essential for the angular control and for knowing where the robot is “looking” inside the map.
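The conversion itself is not shown in the post, but a minimal sketch of the idea could look like the following. The grid resolution, origin, and the row/column orientation are assumptions for illustration; the real values come from the simulator's map configuration:

```python
import math

# Hypothetical grid parameters -- the real values come from the simulator's map.
RESOLUTION = 0.05                   # meters per pixel (assumed)
ORIGIN_X, ORIGIN_Y = -10.0, -10.0   # world coordinates of pixel (0, 0) (assumed)

def poseToMap(x, y):
    """Convert a world position in meters to (row, col) pixel indices."""
    col = int((x - ORIGIN_X) / RESOLUTION)
    row = int((y - ORIGIN_Y) / RESOLUTION)
    return row, col

def heading_in_pixels(x, y, yaw):
    """Project a point 1 m in front of the robot and return its heading
    angle in pixel space, as described in the text."""
    r0, c0 = poseToMap(x, y)
    r1, c1 = poseToMap(x + math.cos(yaw), y + math.sin(yaw))
    return math.atan2(r1 - r0, c1 - c0)
```

Projecting a point one meter ahead and converting both points avoids having to reason separately about how yaw maps onto the (possibly flipped) image axes.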

 

Implementation 

BFS Frontier Search

At the beginning of each cycle, the robot launches a BFS (Breadth-First Search) that tries to locate the closest unknown pixel in the occupancy grid.

To make the search faster, I don’t explore pixel by pixel.
Instead, I move in steps of 15 pixels, which drastically reduces the number of visited nodes while still giving smooth exploration behavior.

The BFS expands outward from the robot’s current pixel position and checks cells that are:

  • free according to the occupancy grid, and

  • safe (meaning no obstacles are too close in the vicinity).

Safety rectangle (experimentally tuned)

To determine whether a cell is safe for the robot, I implemented a function that checks a rectangular region around each candidate pixel.
This rectangle extends 12 pixels in all directions, forming a 25×25 pixel safety box.

These exact dimensions were not arbitrary:
I determined them experimentally by observing how many pixels a wall or obstacle occupied inside the grid and how much clearance was needed to prevent the robot from clipping corners or touching walls.

This ensures that BFS never selects waypoints that are technically free but too close to walls in practice.
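As a sketch of this safety check, assuming the log-odds convention used later in the post (positive values mean occupied), the rectangular test can be written as a single NumPy slice:

```python
import numpy as np

SAFE_MARGIN = 12  # pixels in each direction, experimentally tuned (25x25 box)

def is_safe(grid, row, col, margin=SAFE_MARGIN):
    """Return True if no cell inside the (2*margin+1)^2 box around
    (row, col) is occupied. Assumes log-odds convention: > 0 = occupied."""
    r0 = max(0, row - margin)
    r1 = min(grid.shape[0], row + margin + 1)
    c0 = max(0, col - margin)
    c1 = min(grid.shape[1], col + margin + 1)
    return not np.any(grid[r0:r1, c0:c1] > 0)
```

Clamping the slice bounds to the grid shape also handles candidates near the map border without extra code.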

As soon as the BFS finds a free cell that is adjacent to an unknown area, that position is considered a frontier, and the BFS reconstructs the shortest path back to the robot.

This path becomes the list of waypoints the robot will follow.

If the BFS can’t find any safe frontier, the robot backs up very slowly.
This is something I added because sometimes the robot gets too close to a wall, and then the starting BFS node is not considered safe, causing BFS to refuse to expand.
By adding a small negative velocity, the robot moves away from the obstacle until a safe BFS start becomes possible again.

 

Probabilistic Laser Mapping (Log-Odds)

The map is built using a classical log-odds occupancy update. For each laser beam:

  • I compute its world angle using the robot’s yaw.

  • I check the measured distance.

While testing, I confirmed that the effective maximum range of the laser in this environment is about 3.5 meters, so:

  • If the laser detects an obstacle, I mark the endpoint as occupied.

  • If the laser returns no hit, I treat it as a max-range reading and set its reading to 3.5 m.

In both cases, I fill all intermediate cells between the robot and the endpoint as free.

For every cell traversed:

  • I add log_free.

  • If the endpoint corresponds to an actual hit, I also add log_occ at the final cell.

These log-odds updates accumulate over time and converge to probabilities where:

  • positive means occupied,

  • negative means free,

  • values close to zero remain unknown.

This results in a clean, continuously updated occupancy map.

The map is only updated after the robot has traveled about 0.5 m since the previous update, so that successive scans are approximately independent measurements and the probabilistic grid does not saturate from repeated identical readings.
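The per-beam update described above can be sketched as follows. The log-odds increments and grid resolution are placeholder values, and the ray is traced with simple rounding rather than a full Bresenham line; the stop-at-border and stop-at-hit behavior matches the text:

```python
import math

LOG_FREE = -0.4    # log-odds increment for a traversed cell (assumed value)
LOG_OCC = 0.85     # log-odds increment for a hit endpoint (assumed value)
MAX_RANGE = 3.5    # effective maximum laser range in meters
RESOLUTION = 0.05  # meters per pixel (assumed)

def update_ray(grid, robot_px, angle, distance):
    """Apply the log-odds update for one laser beam.
    robot_px: (row, col) of the robot; angle: beam angle in pixel space;
    distance: measured range in meters (math.inf for a no-hit reading)."""
    hit = distance < MAX_RANGE            # no-hit readings become max-range
    d = min(distance, MAX_RANGE)
    n = int(d / RESOLUTION)
    r0, c0 = robot_px
    for i in range(n):                    # all intermediate cells are free
        r = int(round(r0 + i * math.sin(angle)))
        c = int(round(c0 + i * math.cos(angle)))
        if not (0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]):
            return                        # stop at the map border
        grid[r, c] += LOG_FREE
    if hit:                               # only a real hit marks the endpoint
        r = int(round(r0 + n * math.sin(angle)))
        c = int(round(c0 + n * math.cos(angle)))
        if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]:
            grid[r, c] += LOG_OCC
```

Stopping the loop at the endpoint is also what prevents the ray from "painting" cells behind obstacles, the bug discussed in the Problems section below.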

 

Robot Movement and Path Following

Once BFS generates the waypoint list, the robot follows each point sequentially.

I use a P-controller for angular velocity:

w = K * angular_error

To compute the angular error, I use the robot’s orientation in pixel coordinates.
I take the robot’s current world position, convert it to pixel coordinates with poseToMap(), and do the same with a point located one meter in front of the robot.

This gives me two pixel points:

  • the robot’s current pixel location

  • the “front direction” pixel location

Using these two points I compute the robot’s heading angle in pixel space and compare it with the angle to the waypoint.

The robot:

  • Rotates toward the waypoint using the P-controller.

  • Moves forward only when the heading error is small enough.

  • Switches to the next waypoint when it gets sufficiently close.

This leads to smooth and reliable navigation.
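The waypoint-following logic above can be condensed into a small sketch. The gain, heading tolerance, and forward speed are illustrative values, not the tuned ones:

```python
import math

K = 0.5             # proportional gain (assumed)
HEADING_TOL = 0.15  # rad: drive forward only below this error (assumed)
V_FORWARD = 0.3     # m/s cruise speed (assumed)

def normalize(a):
    """Wrap an angle into (-pi, pi] so the error never exceeds half a turn."""
    return math.atan2(math.sin(a), math.cos(a))

def follow_waypoint(robot_px, heading, waypoint_px):
    """Return a (v, w) command toward a waypoint in pixel coordinates;
    the robot rotates in place until roughly aligned, then moves forward."""
    dr = waypoint_px[0] - robot_px[0]
    dc = waypoint_px[1] - robot_px[1]
    target = math.atan2(dr, dc)       # angle to the waypoint in pixel space
    error = normalize(target - heading)
    w = K * error                     # P-controller on the heading error
    v = V_FORWARD if abs(error) < HEADING_TOL else 0.0
    return v, w
```

Wrapping the error into (-pi, pi] before multiplying by the gain is important; otherwise a waypoint just behind the robot can produce a huge spin command.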

 

Safety Checks While Moving

Even after BFS finds a path, the robot constantly rechecks if the next waypoint is safe.
Because the map updates continuously, an area thought to be free may later be detected as dangerous when new laser scans arrive.

If a waypoint becomes unsafe:

  • the robot cancels the current path,

  • switches back to BFS,

  • and if the robot is too close to a wall, it backs up slowly until a safe position is reached again.

This combination of real-time mapping and active safety makes the robot robust in tight or cluttered environments.
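The recheck amounts to re-running the same safety test on the remaining waypoints against the freshest grid; a minimal sketch, with the safety predicate passed in:

```python
def check_path(grid, waypoints, is_safe):
    """Re-validate the remaining waypoints against the latest occupancy
    grid; return True if all are still safe, False if a replan is needed."""
    return all(is_safe(grid, r, c) for r, c in waypoints)
```

When this returns False, the controller discards the path and falls back to BFS, backing up slowly first if the robot's own cell has become unsafe.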

 

Problems and Solutions

Robot stuck near walls.

Sometimes BFS rejected the robot’s starting cell because it was too close to a wall.
Solution: add a slow backward motion when BFS fails.

Laser maximum distance behavior.

The simulation’s effective maximum laser distance is ~3.5 m.
Solution: treat no-return readings as 3.5 m and mark the entire ray as free.

Efficient exploration.

A pixel-by-pixel BFS was too slow.
Solution: BFS moves in steps of 15 pixels, which preserves exploration quality while being much faster.

The robot hit the walls and shelves when moving.

At first, the BFS only checked whether each individual pixel was free. As a result, the robot planned paths that passed too close to obstacles and collided with them.

Solution: I created a function that checks a square of pixels around each candidate cell, so that neither the cell itself nor its surroundings are occupied.

Laser rays painting behind obstacles.

The laser sometimes assigned values to pixels behind obstacles, which introduced errors into the map.

Solution: I made each ray stop at the first detected obstacle, so no cells beyond the hit point are updated.

 

Video

    Perfect Localization 

         

    Odom localizations:

With odometry-based localization the robot does not know its exact position: it starts where it should, but the estimate slowly drifts, the room gets mapped incorrectly, and in the Odom3 run the robot finally hits an obstacle.

 

    Odom1: 

             



    Odom2:


In the Odom2 run the robot’s estimated location is inaccurate, so the resulting map of the room comes out deformed.

 

    Odom3:

        


The robot hits the wall because of the drifted Odom3 localization.
