Lab 9: Localization on the real robot

Objective

The purpose of this lab is to implement grid localization on the actual robot.

Simulator

As shown above, grid localization in the simulator runs smoothly. There is a small difference between the ground truth and the odometry pose, a discrepancy that originates from the measurements rather than the code.
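The filter's measurement update is what reconciles each 18-point scan with the map. For reference, below is a minimal sketch of a grid Bayes filter measurement update assuming a Gaussian sensor model; the array names, shapes, and sigma value are illustrative, not the course framework's actual API:

    import numpy as np

    def measurement_update(bel_bar, obs, expected_obs, sigma=0.1):
        # bel_bar:      predicted belief over the (x, y, theta) grid
        # obs:          the 18 measured ranges of the current scan, shape (18,)
        # expected_obs: precomputed ray-cast ranges per grid cell, shape (..., 18)
        # Weight each cell by the Gaussian likelihood of all 18 ranges, then normalize
        lik = np.exp(-0.5 * ((expected_obs - obs) / sigma) ** 2).prod(axis=-1)
        bel = bel_bar * lik
        return bel / bel.sum()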

Real Robot

I was not able to complete the lab on the real robot for the following reasons:
  • Initially, my VM stopped working. While trying to retrieve my data from it, I ended up deleting the VM and could not reinstall it, so I was unable to work on Lab 9 (or Lab 8) during the week Lab 9 was due.
  • Vivek helped me boot up the VM again and I started working on Lab 9, but my Bluetooth adapter broke, and I was only able to get a replacement on Tuesday of this week.
  • I spent a few hours writing the code shown below in the Arduino IDE to send data over Bluetooth. I tested it on Windows (not on the VM) and it worked.
  • Setting up Bluetooth on the VM took two more hours. I did not have a keyboard with a right Ctrl key, so I had to acquire one that did. I was in scaled view mode, so I could not see the Devices tab at first, and I spent a lot of time trying to find other ways to enable Bluetooth before realizing that opening the Devices tab was the only option.
  • Finally, I had to set up the Jupyter lab to accept data over Bluetooth, which took a few more hours: I had used Bluetooth outside the VM for all the other labs, so at first I had no idea what I was doing, and there were not many resources online on how to approach the problem. After following the steps in the videos posted by Alex, I finally reached a point where I could communicate with the robot (e.g., ping it, send a message). However, this is where my progress halted.
  • I spent 27 hours debugging, testing, writing code, and fixing other issues (e.g., the VM). My robot kept failing to connect over Bluetooth (shown below); at one point it threw 11 bleak errors in a row. Restarting the system/VM usually fixed the problem, but that took additional time.
  • I was really looking forward to testing this on the map with the real robot, and was disappointed not to get very far. I intend to continue working this Monday and Tuesday, and to post the completed report by Tuesday night.

Arduino Code (Real Robot):

The code below rotates the robot 360 degrees, storing distance readings at 20-degree increments, and then sends the data over Bluetooth to the computer to be analyzed for localization.

 // ------------------------  Store Data at angles needed ------------------------ //
    if (yaw >= yaw_want[i] && STOP == 0) { // only at the angles we want 


      distanceSensor.startRanging(); //Write configuration bytes to initiate measurement
      int distance = distanceSensor.getDistance(); //Get the result of the measurement from the sensor
      dist[i] = distance;
      distanceSensor.clearInterrupt();
      distanceSensor.stopRanging();

      Serial.print("Distance(mm): ");
      Serial.print(distance);
      Serial.print(" | ");
      Serial.print("yaw_wanted (deg): ");
      Serial.print(yaw_want[i]);
      Serial.print(" | ");
      Serial.print("yaw (deg): ");
      Serial.print(yaw);
      Serial.println();

      i++;
  
      if (yaw > 340) { // If done with rotation
        STOP = 1;
        // ----------------------------------  STOP Motors  ---------------------------------- //
        digitalWrite( LEDPIN, LEFT_MOTOR );
        digitalWrite( LEDPIN, RIGHT_MOTOR );
        myMotorDriver.setDrive( LEFT_MOTOR, 1, 0); // Set speed to 0 to stop the motor
        myMotorDriver.setDrive( RIGHT_MOTOR, 1, 0); // Set speed to 0 to stop the motor

        // ----------------------------------  Send Data  ---------------------------------- //
        res_cmd->command_type = BYTESTREAM_TX;
        res_cmd->length = 72; // payload: 18 measurements x 4 bytes

        // Pack the 18 measurements taken at the different angles
        for (int j = 0; j < 18; j++) {
          memcpy(res_cmd->data + 4 * j, &dist[j], 4);
        }

        //Serial.printf("Stream %d \n", bytestream_active);
        amdtpsSendData((uint8_t *)res_cmd, 74); // 2 header bytes + 72 data bytes
        
      }

    }
    else if (STOP == 0) {
      // ------------------------  Move Motors at Steady speeds ------------------------ //
      digitalWrite( LEDPIN, LEFT_MOTOR );
      digitalWrite( LEDPIN, RIGHT_MOTOR );
      myMotorDriver.setDrive( LEFT_MOTOR, 1, 150); //Rotate
      myMotorDriver.setDrive( RIGHT_MOTOR, 1, 150);

      // ------------------------  Calculate Yaw and ang. vel------------------------ //
      if ( myICM.dataReady() ) {
        myICM.getAGMT();                // The values are only updated when you call 'getAGMT'

        // yaw from Gyro
        timecurr = millis();
        dt = (timecurr - timeprev) / 1000.0; // ms -> s (avoid integer division)
        timeprev = timecurr;
        yaw = yaw + (myICM.gyrZ()) * dt;     // integrate gyro z-rate into yaw
        thd = myICM.gyrZ();
        Input = thd;                         // angular rate is the PID input
      }

      // ---------------------------------  PID --------------------------------- //
      myPID.Compute();
    }
  }
  timeprev = timecurr;
  trigger_timers();
  delay(10);
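On the computer side, the 74-byte packet sent by amdtpsSendData has to be decoded back into the 18 distances. Below is a minimal sketch of that decoding, assuming the layout above (a 1-byte command type, a 1-byte length, then 18 little-endian 32-bit integers in millimeters); unpack_distances is a hypothetical helper name, not part of the course framework:

    import struct

    def unpack_distances(packet: bytes):
        # Skip the 2 header bytes (command_type, length), then read the
        # 18 little-endian int32 distance values (mm) packed by the Arduino
        return struct.unpack("<18i", packet[2:2 + 18 * 4])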
                

Map

Offline - Real Robot

After not being able to make Bluetooth work, I decided to run the Bayes filter offline. I took the depth measurements using Bluetooth on Windows and transferred the 18 depth measurements to the VM in a text file. I used numpy to load the .txt file into Jupyter and into the function "perform_observation_loop"; a sketch of this step is shown after the Arduino code below. I divided the values by 1000 to convert the units to meters, and added 3 inches (0.0762 m) to account for the distance from the sensor to the center of the robot (about which it rotates). I chose 6 different positions to localize. I measured the ground truth pose for each position with a measuring tape, and ran the Arduino code below to rotate and measure the depth at 20-degree intervals:

Arduino code (offline)


if (bytestream_active)
  {

    int Stop = 0; // flag set when the rotation is complete

    // Rotate 360 degrees until all angles have been reached
    distanceSensor.startRanging(); //Write configuration bytes to initiate measurement
    while (yaw < 340) {

      if (i == 0) { // first iteration: take the 0-degree reading and start the timer

        int distance = distanceSensor.getDistance(); //Get the result of the measurement from the sensor
        dist[i] = distance;

        i++; timecurr = millis();
      }
      else { // everything except first iteration
        // ------------------------  Calculate Yaw and ang. vel------------------------ //
        if ( myICM.dataReady() ) {
          myICM.getAGMT();                // The values are only updated when you call 'getAGMT'

          // yaw from Gyro
          timecurr = millis();
          dt = (timecurr - timeprev) / 1000.0; // ms -> s (avoid integer division)
          timeprev = timecurr;
          yaw = yaw + (myICM.gyrZ()) * dt;     // integrate gyro z-rate into yaw
          thd = myICM.gyrZ();
          Input = thd;                         // angular rate is the PID input
        }

        // ---------------------------------  PID --------------------------------- //
        myPID.Compute();

        // Store a reading once the next target angle is reached
        if (yaw >= yaw_want[i]) { // only at the angles we want

          int distance = distanceSensor.getDistance(); //Get the result of the measurement from the sensor
          dist[i] = distance;

          Serial.print("Distance(mm): ");
          Serial.print(distance);
          Serial.print(" | ");
          Serial.print("yaw_wanted (deg): ");
          Serial.print(yaw_want[i]);
          Serial.print(" | ");
          Serial.print("yaw (deg): ");
          Serial.print(yaw);
          Serial.println();

          i++;
        }
      }
      timeprev = timecurr;

      // ------------------------  Move Motors at Steady speeds ------------------------ //
      digitalWrite( LEDPIN, LEFT_MOTOR );
      digitalWrite( LEDPIN, RIGHT_MOTOR );
      myMotorDriver.setDrive( LEFT_MOTOR, 1, 170); //Rotate
      myMotorDriver.setDrive( RIGHT_MOTOR, 1, 170);

      if (yaw >= 340) { // rotation complete (matches the while condition)
        Stop = 1;
      }

    }

    if (Stop == 1) {
      // ------------------------  STOP Motors ------------------------ //
      digitalWrite( LEDPIN, LEFT_MOTOR );
      digitalWrite( LEDPIN, RIGHT_MOTOR );
      myMotorDriver.setDrive( LEFT_MOTOR, 1, 0); // Set speed to 0 to stop the motor
      myMotorDriver.setDrive( RIGHT_MOTOR, 1, 0); // Set speed to 0 to stop the motor

      // ----------------------------------  Send Data  ---------------------------------- //
      res_cmd->command_type = BYTESTREAM_TX;
      res_cmd->length = 72; // payload: 18 measurements x 4 bytes

      // Pack the 18 measurements taken at the different angles
      for (int j = 0; j < 18; j++) {
        memcpy(res_cmd->data + 4 * j, &dist[j], 4);
      }

      //Serial.printf("Stream %d \n", bytestream_active);
      amdtpsSendData((uint8_t *)res_cmd, 74); // 2 header bytes + 72 data bytes

      distanceSensor.clearInterrupt();
      distanceSensor.stopRanging();

    }
  }
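For the offline step described above, the loading and unit conversion inside perform_observation_loop might look like the following sketch; the file name is a placeholder, and the exact signature and return convention depend on the course framework:

    import numpy as np

    def perform_observation_loop(self):
        # Load the 18 ToF readings (mm) recorded over Bluetooth on Windows;
        # "pose1_distances.txt" is an illustrative file name
        ranges_mm = np.loadtxt("pose1_distances.txt")
        # mm -> m, plus the 3 in (0.0762 m) sensor-to-center-of-rotation offset
        return ranges_mm / 1000.0 + 0.0762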
                

Ground truth Pose

Six Positions on Map

Results

Pose 1:

Pose 2:

Pose 3:

Pose 4:

Pose 5:

Pose 6:

Sources of error:

As shown in each of the representations above, there is a discrepancy (sometimes small, sometimes large) between the true pose and the belief determined from the measurements and the Bayes filter. This is due to the following reasons:
  • The depth measurements are taken while the robot is rotating, which makes them inaccurate and in turn degrades the localization. This could be fixed by pausing the rotation at each target angle to take the measurement, rather than measuring in one continuous motion.
  • The map has certain locations where the depth measurements are similar yet not identical; therefore, the belief may land on one of two (or more) locations that produce similar depth scans after a rotation. To mitigate this, I put in an obstacle, but in future labs I will offset the obstacle to one side so the map is no longer symmetric.
  • The robot rotated faster than I would have liked. After changing the batteries many times and testing the motion, I arrived at a motor input of 175, which produced a smooth rotation that was still a bit fast; any lower and the motion became jagged and the robot no longer rotated about its center.

Conclusion

Considering the possible sources of error, most of the measurements shown above resulted in beliefs that were relatively close to the ground truth pose. In future labs, this localization will help with path planning, since the results seem dependable. Furthermore, localization for a moving robot will also depend on the odometry (linear and angular velocities), so the results should not jump around, as each belief builds on the previous localization. I can choose an initial point that I know reduces the impact of the errors mentioned above, so that subsequent poses will be relatively more accurate.