I can connect to my Raspberry Pi via SSH.
I can deploy an app to a Raspberry Pi.
You will submit a link to your GitHub repo.
You will demonstrate your ability to control your robot.
You will be asked regular comprehension questions about Raspberry Pis.
During our school's open house days, I set up a few GoPiGo robots that respond when visitors hold a hand in front of the sensor. We'll make our own version of this behavior. It's fun and an easy way for us to start responding to data from the ultrasonic sensor.
Power up your robot and connect over SSH
Go to your project folder: cd PnR-Final
Launch Python: python
Import our file: import student
Instantiate our Piggy: p = student.Piggy()
Experiment with the sensor with p.dist()
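Put together, those steps look something like the session below (the shell prompt and the 42 cm reading are just illustrative):

```
pi@raspberrypi:~ $ cd PnR-Final
pi@raspberrypi:~/PnR-Final $ python
>>> import student
>>> p = student.Piggy()
>>> p.dist()
42
```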
Let's look at our menu and create a new option for open house. We'll keep checking dist() and perform some sort of action when it returns a distance that's too close (one way to structure that option is sketched below).
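Here's a minimal sketch of that idea, assuming we add a method to our Piggy class and hook it into the menu. The open_house name, the 30 cm threshold, and the print reaction are all placeholders to replace with your own choices:

```python
import time  # put this at the top of student.py

def open_house(self):
    """Keep watching the sensor and react when a visitor's hand gets close."""
    while True:
        reading = self.dist()        # latest ultrasonic reading, in centimeters
        if 0 < reading < 30:         # something is closer than ~30 cm (threshold is a guess)
            print("Hello, visitor!") # stand-in reaction; swap in a spin, a dance, a sound...
            time.sleep(2)            # pause so one hand wave doesn't trigger many reactions
        time.sleep(0.1)              # short delay between sensor checks
```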
Before you start on your project to experiment with and grow your own navigational algorithm, let's go over a few helpful tricks.
Let's imagine a list of fifteen distance measurements:
[200, 210, 204, 3, 2, 7, 197, 221, 211, 1, 5, 5, 3, 205, 202]
How many obstacles do you think were found in this set of distance measurements?
Our GoPiGos keep a list of 180 measurements, self.scan, whose indices correspond to the servo's angle when each measurement was taken. Here's the starting code to count obstacles directly in front of the robot:
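(What follows is a sketch of that idea rather than the exact code from student.py; the 60-120 degree window and the 50 cm "close" threshold are assumptions you should tune.)

```python
def obstacle_count(self):
    """Count distinct obstacles in the arc directly in front of the robot."""
    self.wide_scan()                 # refresh self.scan with a fresh servo sweep
    count = 0
    in_obstacle = False              # are we currently inside a run of close readings?
    for angle in range(60, 121):     # roughly the front-facing slice of the 0-179 sweep
        reading = self.scan[angle]
        if 0 < reading < 50:         # close reading (zeros usually mean no echo, so skip them)
            if not in_obstacle:
                count += 1           # a new obstacle begins at this angle
                in_obstacle = True
        else:
            in_obstacle = False      # a gap -- the next run of close readings is a new obstacle
    return count
```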
Let's try this out and see if it's accurate. Can you improve its accuracy?
Next, let's try modifying the method to get a 360-degree view of all the obstacles around our robot.
We've already experimented with self.wide_scan(). We use that method to fill the self.scan list with data. For example, self.scan[60] is the distance found at the servo's 60-degree angle.
What's the distance at the midpoint of your robot's scan?
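As a quick check from the Python prompt, something like this works (index 90 as the midpoint is an assumption based on the 0-179 sweep, and the distances shown are made up):

```python
>>> p.wide_scan()     # sweep the servo and fill p.scan with distance readings
>>> p.scan[60]        # distance (cm) seen at the 60-degree servo angle
187
>>> p.scan[90]        # the middle of the sweep -- straight ahead of the robot
42
```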
We've also used the self.is_clear() method to tell whether or not the space right in front of the robot is clear. Create a loop that keeps turning the robot until the path ahead is clear (one possible sketch follows).
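Here's a minimal sketch of that loop. It assumes a turning helper like self.turn_by_deg() exists on our Piggy class (substitute whatever turning method you've actually been using), and the 15-degree step is arbitrary:

```python
def find_clear_path(self):
    """Hypothetical helper: keep turning in small steps until the way ahead is clear."""
    while not self.is_clear():    # is_clear() tells us whether the space in front is open
        self.turn_by_deg(15)      # assumed turning helper; rotate a small step and re-check
    print("Path ahead is clear!") # at this point the robot faces an open direction
```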