After four days of vigorous competition, our Maze team Polaris (Alexander Lisenko, Jeffrey Cheng, and Julian Lee) brought home two top awards from the 2019 RoboCup Junior World Maze Competition: 3rd Place and the Best Engineering Strategy Award. The tournament was held in Sydney, Australia from July 3 to 7. **News at the NJ Star Ledger**

*Besides Team Polaris, two all-rookie teams competed at the event. Although they did not place in the top ranks, through their admirable effort they gained a substantial amount of knowledge from this rare opportunity, which will surely contribute to their future success. These teams are: Maze, with Victor Hu and Varun Sarabudla, and Rescue Line Open, with Andy Cheng, Dhruva Chakravarthi, and Peter Lin. The event took place July 4-7 in Sydney, Australia.*


**June 23rd (Sunday) at 2:15 pm** (Reservation is required.)

**Speaker : Alex Zhang**

Alex won 2nd Place in the BioEngineering research category at the 2019 International Science and Engineering Fair (ISEF), a highly prestigious science and engineering competition. He also received a Computer Science Award even though his work was in a different category.

Please indicate the event name when making your reservation.

This year Storming Robots sent 19 teams to compete in different leagues, including Rescue, Soccer, and OnStage. Check below to see where each team placed!

**Pictures coming soon!**

For more information, please visit https://junior.robocup.org/

High school teams from across the globe were tasked with programming actual robotic satellites aboard the International Space Station. Students watched via live downlink as the competition was judged by astronauts aboard the ISS. The NJ team, Quark Charm, from the Branchburg-based robotics and computer science learning center Storming Robots, earned an admirable 4th place.

The live broadcast featured presentations by former NASA astronauts Cady Coleman and Jeffrey Hoffman, and by Hans-Juergen Zachrau, Vice President of Airbus DS Space Systems. The 2018 game, ECO-SPHERES, centers around the removal of space debris from Low Earth Orbit. Students were challenged with the unprecedented task of attempting to hook two satellites together in microgravity.

Congratulations to all the teams from around the globe that made it to the ISS finals. The ISS Champions of this year’s Zero Robotics High School Tournament 2018 are Naughty Dark Spaghetti (an alliance of three teams from Italy, NY, and WI)!

This year’s game, ECO-SPHERES, centers around the removal of space debris from Low Earth Orbit. Students were challenged with the unprecedented task of successfully hooking two satellites together in microgravity: each team must command its satellite to navigate to a disabled spacecraft (another satellite), hook onto it, and tow it back safely. The goal of the game is to complete these tasks in the shortest amount of time. To be victorious, however, students must keep their own satellite from colliding with space debris, which incurs thruster damage. During the on-orbit competition, the SPHERES were configured with hooks that had been 3D printed on the ISS.

The NJ Storming Robots’ Quark Charm team formed an alliance team with two foreign teams from Poland and Russia.

Members of the NJ group:

- Warren H.S.: Jagdeep Bhatia, 2nd Captain (Gr. 11); Mayur Sharma, 2nd Captain (Gr. 11); Prateek Humane, 1st Captain (Gr. 12).
- Bridgewater-Raritan H.S.: Jeffrey Cheng, 2nd Captain (Gr. 10); Sunny Cheng, 2nd Captain (Gr. 12).
- Montgomery H.S.: Daniel Xue (Gr. 10); Shikhar Ahuja (Gr. 10).
- Somerville H.S.: Andrew Dailey (Gr. 9); Rishi Purohit (Gr. 8; although attending middle school, Rishi also attends a math program at the H.S.).
- The rest of the team: Julian Lee, 2nd Captain (Gr. 10, Pingry School); Arya Nagabhyru (Gr. 11, Pittstown H.S.); Adithya Swaminathan (Gr. 11, South Brunswick H.S.); Deep Patel (Gr. 11, The Pennington School, PA); Mehal Kashyap (Gr. 12, Edison H.S.); Jalen Patel (Gr. 11, Moorestown Friends School, PA).

Zero Robotics is an annual robotics programming competition where the robots are SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) inside the International Space Station. This is from the SPHERES-ZERO-Robotics program run by the Massachusetts Institute of Technology (MIT) and National Aeronautics and Space Administration (NASA). Students (Gr.9+) write algorithms for the SPHERES satellites to accomplish tasks relevant to future space missions.

Want to also sign up for the Math Contest? Go here.

Furthermore, registration for the competition will open soon, slated for next week. Be sure to look out for it!

**Topics and Presenters**

| Topic | Presenter |
| --- | --- |
| Our 2018 ZeroRobotics Team and Participation | Elizabeth Mabrey |
| Highlights of tech info about 2018 ZeroRobotics, what you need to know, and strategies | Sunny Cheng |
| Navigation with Dijkstra | Jagdeep Bhatia |
| The Physics and Math Work Behind It! | Mayur Sharma |
| Automated Simulations with AWS! | Prateek Humane |
| Data Modeling used to maximize predictability! | Sunny Cheng |

There are two primary types of cameras available for LEGO: the PixyCam and the NXTCam (which can be used with both NXT and EV3). The PixyCam comes in two very different versions, the original Pixy and the Pixy for LEGO. Mindsensors produces an adapter that allows the original Pixy to be used with LEGO and programmed like an NXTCam. This recap applies to the original Pixy with the Mindsensors adapter; however, the programming section likely also works for the NXTCam, and the setup portion mostly applies to the Pixy for LEGO as well. A brief addition regarding the Pixy for LEGO is included at the end of the recap for reference.

- Download and install the PixyMon configuration software from https://pixycam.com/downloads/.
- Focus the camera: turn the lens until the image becomes clear.
- Plug in the PixyCam. The camera feed should begin streaming. Make sure the "cooked" video option is set (the chef button on the top bar, next to the raw-meat symbol and settings).
- Configure the communication interface (Configure -> Interface; set the data out protocol to UART and the baud rate to 115200). Note this is different for the PixyCam for LEGO; see Dennis' post.
- Configure the colors: point your camera at a color you wish to detect, then Action -> Set signature x and drag over the area of that color on the now-frozen image. You can set one color per signature.

Go to https://github.com/botbench/robotcdriversuite, click "Clone or download," and download as a ZIP. Extract the contents into the directory of your RobotC project. The relevant header is `mindsensors-nxtcam.h`. To include the driver, use `#include "robotcdriversuite-master/include/mindsensors-nxtcam.h"`, which points to the path of the file.

See the sample code located in `robotcdriversuite-master/examples/mindsensors-nxtcam-test1.c`.

First, include the library; then initialize the camera. Define a camera variable of type `tSensors` whose value is the sensor port, for example `tSensors cam = S1;`. Then simply call `NXTCAMinit(cam);`.

To get the blobs, first declare a variable of type `blob_array` (`blob_array blobs;`), then pass it to `nblobs = NXTCAMgetBlobs(cam, blobs, condensed);`. If `condensed` is true, colliding blobs will be combined.

Each element in a `blob_array` is of type `blob` and has the following fields: `x1` (the left bound), `y1` (the upper bound), `x2` (the right bound), `y2` (the bottom bound), `colour` (the signature number), and `size`.

There are also a number of auxiliary functions that can help with blob analysis, such as `NXTCAMgetAverageCenter`; they can be found in the header file that was included.

A brief sample code has been provided below that initializes a camera and continually grabs the blobs and prints out the location and size of the biggest blob.

```
#include "robotcdriversuite-master/include/mindsensors-nxtcam.h"

task main()
{
  tSensors cam = S1;
  NXTCAMinit(cam);

  blob_array blobs;
  bool condensed = true;

  while (true)
  {
    short nblobs = NXTCAMgetBlobs(cam, blobs, condensed);
    if (nblobs > 0)
    {
      _sortBlobs(nblobs, blobs);
      // blobs[0] is now the largest blob.
      displayTextLine(1, "Number of blobs: %d", nblobs);
      displayTextLine(3, "Top left: (%d, %d)", blobs[0].x1, blobs[0].y1);
      displayTextLine(5, "Bottom right: (%d, %d)", blobs[0].x2, blobs[0].y2);
      displayTextLine(7, "Colour: %d", blobs[0].colour);
      displayTextLine(9, "Size: %d", blobs[0].size);
    }
  }
}
```

First, for setup: the actions for configuring the interface are different. Make sure the interface is set to LEGO I2C, not UART.

When programming the Pixy for LEGO, use the LEGO I2C functions. Documentation on the I2C registers can be found here: http://cmucam.org/documents/36. Documentation on the RobotC I2C functions can be found here: http://botbench.com/driversuite/group__common.html.

Example code is below (it has not yet been tested, so it may not work exactly as expected):

```
#include "robotcdriversuite-master/include/common.h"

tByteArray msg;
tByteArray reply;

// "port" is the sensor port; "reg" is the register you want info for
// (0x50 for general, 0x51-0x57 for specific signatures -- see the
// documentation). The information you get back will be in "reply".
bool getInfo(tSensors port, ubyte reg)
{
  short replyLen;
  if (reg == 0x50)
    replyLen = 6;
  else if (reg >= 0x51 && reg <= 0x57)
    replyLen = 5;
  else
    return false;

  memset(msg, 0, sizeof(msg));
  msg[0] = reg;

  bool result = writeI2C(port, msg, reply, replyLen);
  return result;
}
```

https://drive.google.com/open?id=1aosqiHktFN_0wDS9-guM2N0HI30AHSRk

**Note:** some of the variable names have changed from the video, specifically x, y, and h.

θ = theta = the angle

x = horizontal distance

y = vertical distance

h = the distance measured by the sensor (the hypotenuse)

A right triangle is formed with h, x, and y. Thus, we can establish the following trigonometric identities:

sin(θ) = y / h

cos(θ) = x / h

Here, we are assuming that we know the angle θ (for example, from a gyro sensor or IMU) and the distance h as measured by a sensor. We can use this information to calculate the distances x and y by using the trigonometric identities listed above:

y = h sin(θ)

x = h cos(θ)

Let us define h=h1+h2. h is therefore the total length of the hypotenuse in the right triangle. Since we know the horizontal distance x, which is the width of the field, we can therefore find the angle θ.

Let us first establish the following trigonometric identities by simply taking the reverse of the previous identities discussed (note that arcsin is the inverse function of sin):

θ = arcsin(y / h)

θ = arccos(x / h)

By these identities, we can simply plug in h we measure with our sensors to get θ = arccos((4 feet)/h), and thus we have the angle the robot is facing. Note that the angle you get won’t be able to distinguish whether the robot is pointing towards h1 or h2. Note the following image will yield the same angle theta, but the robot will be oriented differently.

Note first the difference between radians and degrees; both are ways of measuring angles. There are 360 degrees in a circle, whereas there are 2π radians in a circle. Therefore, 1 degree = π/180 radians.

In RobotC, these are the function calls for trigonometric functions:

`asin(y/h)`

returns the arcsine in radians (arcsin(y / h)).

`acos(x/h)`

returns the arccosine in radians (arccos(x / h)).

`sin(θ)`

returns the value of y/h for an angle θ in radians.

`cos(θ)`

returns the value of x/h for an angle θ in radians.

`sinDegrees(θ)`

returns the value of y/h for an angle θ in degrees.

`cosDegrees(θ)`

returns the value of x/h for an angle θ in degrees.

`radiansToDegrees(θ)`

returns the angle in degrees for a given angle θ in radians.

`degreesToRadians(θ)`

returns the angle in radians for a given angle θ in degrees.

Note that the values you get back from a sensor may not be exactly accurate. The value an ultrasonic or infrared sensor returns can differ from the true distance, especially at long range, since in practice it is impossible to make a perfectly accurate sensor. In particular, both infrared and ultrasonic sensors have a maximum distance they can sense, and an ultrasonic sensor will return junk values if it is placed directly against something (when the distance between the object and the sensor is very small). Otherwise, they should be fairly accurate.

Another thing to note is that ultrasonic sensors don’t see in a straight line, but instead a cone. That means the value you get back will be approximately the closest distance to any obstacle inside a conical region in front of it, as demonstrated by the image below.

In the diagram below, even though the obstacle (denoted by the black circle) is not directly in front of the sensor, since it is in the conical field of view the sensor will not return the distance to the wall but instead something closer to the distance to the obstacle.

Finally, some infrared sensors (such as the LEGO EV3 infrared sensor) don't provide an absolute distance measurement; their readings must first be converted to distances, and the values may change depending on the color of the surface.

Motor encoders measure how many degrees a motor has rotated. So if the motor encoder value reported is 720, the motor has gone 2 full rotations. Therefore, the formula to calculate the number of rotations is (# of rotations) = (# of degrees)/360°. Let (# of rotations) = r and (# of degrees) = n. Thus, r = n/360°.

**Note**: in the stock graphical LEGO software, encoder values can be provided as either rotations or degrees.

First, we can easily find the circumference of a wheel by measuring its radius and applying the equation c = 2π × radius. Note that the circumference c is how far the robot moves with one rotation of the motor (and thus the wheel).

To find the distance d the robot moves in r rotations, use d = c*r. Since r = n/360°, we have d = c*n/360°. Rearranging gives n = 360d/c. Therefore, given the distance we want to travel, we know how many degrees the motor needs to spin. This can be used in conjunction with a loop or a similar construct to implement a moving function.

To turn a certain angle in place, we can execute a point turn where the robot’s two motors turn in opposite directions. Thus, when the robot turns 360°, each motor traces the circumference of a circle with a diameter of the distance between the robot’s wheels.

If we define the wheelbase (the distance between the wheels) to be w, we can calculate the circumference of this circle as πw. To turn an arbitrary amount, we multiply this circumference by the ratio of the desired angle a to one full revolution: a/360°. Now that we know the distance each wheel needs to travel, we can apply the equations from moving in straight lines; the only difference is that one wheel rotates in the opposite direction. Therefore the degrees a motor needs to turn is n = (πw × a/360°) × 360°/c = aπw/c, where c is the circumference of the wheel.

- Resources
- Resources for learning how to program in RobotC, including basic programming constructs and accessing sensors and motors, can be found on the Storming Robots site.
- Please sign up for the forum if you have not done so already. There you can post any questions or concerns you may have. Instructions are on last week’s recap.

- Sensors
- Touch
- Detects when it pushes against something (like a button). Can potentially be used to detect if the robot hits a wall or the other robot.

- Light
- Measures how much light is reflected back (white reflects more than black). Can be used to detect what color the robot is on, white or green. Since penalty boxes, the midfield line, etc are white, the light sensor may be used to help determine the location of the robot.

- Ultrasonic/Infrared
- Measures the distance from the sensor to an object, such as a wall. Can also be used to determine the location of the robot, specifically to see if it's getting too close to a wall.

- Gyro
- Measures the rotation of the robot. In other words, it can help you figure out what direction the robot is facing.

- Motor Encoder
- Measures how many times the motor has spun. Can possibly be used to aid movement functions (figure out how far the robot has moved) and other applications. This functionality is built right into the motor and does not need a separate sensor.

- Cameras
- Can track different blobs of color and return their location. Essential for tracking the orange ball. We will go more into this topic in a future meeting.

- Allowed sensors
- A full list of allowed sensors will be released with the rules. Tentatively, first party sensors (sensors made by LEGO) and third party sensors from Mindsensors and HiTechnic will be allowed. For cameras, PixyCam and NXTCam from Mindsensors will be allowed. If there are any sensors that are not on the approved list, please reach out to the organizers through the forum.

- Summary of rules Q&A
- To rephrase the rules, the robot must fit within a circle 23cm in diameter.
- A “handle” is simply something that allows referees to easily pick up the robot without causing damage. It can be made out of anything; a loop of zip-ties connected together will do.

- What to do before the next meeting
- We recommend continuing to plan the robot design, especially with an understanding of the available sensors in mind.

Full meeting recording:
