A well-established education company named __GoWell from Hong Kong__ has engaged Storming Robots to run a two-week robotics workshop during the summer for a group of students from overseas. See __this brochure__ created by GoWell’s marketing team; __here__ is a Chinese version.

High school teams from across the globe were tasked with programming actual robotic satellites aboard the International Space Station. Students watched via live downlink as the competition was judged by astronauts aboard the ISS. The NJ team, Quark Charm, from the Branchburg-based robotics and computer science learning center Storming Robots, earned an admirable 4th place.

The live broadcast featured presentations by former NASA Astronauts Cady Coleman, Jeffrey Hoffman and Vice President of Airbus DS Space Systems, Hans-Juergen Zachrau. The 2018 game, ECO-SPHERES, centers around the removal of space debris from Low Earth Orbit. Students were challenged with the unprecedented task of attempting to hook two satellites together in microgravity.

Congratulations to all teams from around the globe that made it to the ISS finals. The ISS Champions of this year’s Zero Robotics High School Tournament 2018: Naughty Dark Spaghetti (an alliance of three teams from Italy, NY, and WI)!

This year’s game, called ECO-SPHERES, centers around the removal of space debris from Low Earth Orbit. Students have been challenged with the unprecedented task of successfully hooking two satellites together in microgravity. Students must command their satellite to navigate to a disabled spacecraft (another satellite), hook onto it, and tow it back safely. The goal of the game is to complete these tasks in the shortest amount of time. To be victorious, however, students must prevent their own satellite from colliding with space debris, which incurs thruster damage. During the on-orbit competition, the SPHERES will be configured with hooks that were 3D printed on the ISS.

The NJ Storming Robots’ Quark Charm team formed an alliance team with two foreign teams from Poland and Russia.

Members of the NJ group consist of: three from Warren H.S.: Jagdeep Bhatia /2nd-Captain (Gr. 11), Mayur Sharma /2nd-Captain (Gr. 11), and Prateek Humane /1st-Captain (Gr. 12). Two from Bridgewater Raritan H.S.: Jeffrey Cheng /2nd-Captain (Gr. 10) and Sunny Cheng /2nd-Captain (Gr. 12). Two from Montgomery H.S.: Daniel Xue (Gr. 10) and Shikhar Ahuja (Gr. 10). Two from Somerville H.S.: Andrew Dailey (Gr. 9) and Rishi Purohit (Gr. 8; although attending middle school, Rishi also attends a math program at the high school). The rest of the team members include: Julian Lee /2nd-Captain (Gr. 10 from Pingry School), Arya Nagabhyru (Gr. 11 from Pittstown H.S.), Adithya Swaminathan (Gr. 11 from South Brunswick H.S.), Deep Patel (Gr. 11 from The Pennington School, PA), Mehal Kashyap (Gr. 12 from Edison H.S.), and Jalen Patel (Gr. 11 from Moorestown Friends School, PA).

Zero Robotics is an annual robotics programming competition where the robots are SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) inside the International Space Station. The SPHERES Zero Robotics program is run by the Massachusetts Institute of Technology (MIT) and the National Aeronautics and Space Administration (NASA). Students (Gr. 9+) write algorithms for the SPHERES satellites to accomplish tasks relevant to future space missions.

Want to also sign up for the Math Contest? Go here.

Furthermore, registration for the competition will be opening soon, slated for next week. Be sure to look out for that!

| Topic | Presenter |
| --- | --- |
| Our 2018 ZeroRobotics Team and Participation | Elizabeth Mabrey |
| Highlight of Tech Info about 2018 ZeroRobotics, and What You Need to Know | |
| Strategies | Sunny Cheng |
| Navigation with Dijkstra | Jagdeep Bhatia |
| The Physics and Math Work Behind It! | Mayur Sharma |
| Automated Simulations with AWS! | Prateek Humane |
| Data Modeling Used to Maximize Predictability! | Sunny Cheng |

There are two primary types of cameras available for LEGO: the PixyCam and the NXTCam (which can be used with both the NXT and the EV3). The PixyCam has two very different versions: the original Pixy and the Pixy for LEGO. Mindsensors produces an adapter that allows the original Pixy to be used with LEGO and programmed like an NXTCam. This recap applies to the original Pixy with the Mindsensors adapter; however, the programming section likely also works for the NXTCam, and the setup portion will apply mostly to the Pixy for LEGO. A brief addition regarding the Pixy for LEGO is included in the recap for reference.

- Download and install the PixyMon configuration software from https://pixycam.com/downloads/.
- Focus the camera: turn the lens until the image becomes clear.
- Plug in the PixyCam. The camera feed should begin streaming. Make sure the cooked-video option is set (the chef button on the top bar, next to the raw-meat symbol and settings).
- Configure the communication interface (Configure -> Interface; set the data-out protocol to UART and the baud rate to 115200). Note this step is different for the PixyCam for LEGO; see Dennis’ post.
- Configure the colors. Point your camera at a color you wish to detect, then use Action -> Set signature x and drag over the area of that color on the now-frozen image. You can set one color per signature.

Go to https://github.com/botbench/robotcdriversuite, click “Clone or download”, and download as a zip. Extract the contents into the directory of your RobotC project. The relevant header is mindsensors-nxtcam.h. To include the driver, use `#include "robotcdriversuite-master/include/mindsensors-nxtcam.h"`, which points to the path of the file.

See the sample code located in robotcdriversuite-master/examples/mindsensors-nxtcam-test1.c.

First, include the library; then initialize the camera. Define a camera variable of type tSensors whose value is the sensor port, for example `tSensors cam = S1;`. From there, simply call `NXTCAMinit(cam);`.

To get the blobs, first declare a variable of type blob_array (`blob_array blobs;`), which you pass to the function `nblobs = NXTCAMgetBlobs(cam, blobs, condensed);`; if condensed is true, colliding blobs will be combined.

Each element in a blob_array is of type blob, and has the following variables: x1 (the left bound), y1 (the upper bound), x2 (the right bound), y2 (the bottom bound), colour (the signature number), and size.

There are also a number of auxiliary functions that can help with blob analysis, such as NXTCAMgetAverageCenter, which can be found in the header file that was included.

A brief sample code has been provided below that initializes a camera and continually grabs the blobs and prints out the location and size of the biggest blob.

```c
#include "robotcdriversuite-master/include/mindsensors-nxtcam.h"

task main()
{
  tSensors cam = S1;
  NXTCAMinit(cam);

  blob_array blobs;
  bool condensed = true;

  while (true)
  {
    short nblobs = NXTCAMgetBlobs(cam, blobs, condensed);
    if (nblobs > 0)
    {
      _sortBlobs(nblobs, blobs);
      // blobs[0] is now the largest blob.
      displayTextLine(1, "Number of blobs: %d", nblobs);
      displayTextLine(3, "Top left: (%d, %d)", blobs[0].x1, blobs[0].y1);
      displayTextLine(5, "Bottom right: (%d, %d)", blobs[0].x2, blobs[0].y2);
      displayTextLine(7, "Colour: %d", blobs[0].colour);
      displayTextLine(9, "Size: %d", blobs[0].size);
    }
  }
}
```

First, the setup differs: when configuring the interface, make sure it is set to LEGO I2C, not UART.

When programming the Pixy for LEGO, use the LEGO I2C functions. Documentation on the I2C registers can be found here: http://cmucam.org/documents/36. Documentation on the RobotC I2C functions can be found here: http://botbench.com/driversuite/group__common.html.

Example code is below (it has not yet been tested, so it may not work exactly as expected):

```c
#include "robotcdriversuite-master/include/common.h"

tByteArray msg;
tByteArray reply;

// "port" is the sensor port; "reg" is the register you want info for
// (0x50 for general, 0x51 - 0x57 for specific signatures -- see documentation).
// The information you get back will be in the array "reply".
bool getInfo(tSensors port, ubyte reg)
{
  short replyLen;
  if (reg == 0x50)
    replyLen = 6;
  else if (reg >= 0x51 && reg <= 0x57)
    replyLen = 5;
  else
    return false;

  memset(msg, 0, sizeof(msg));
  msg[0] = reg;
  bool result = writeI2C(port, msg, reply, replyLen);
  return result;
}
```

https://drive.google.com/open?id=1aosqiHktFN_0wDS9-guM2N0HI30AHSRk

Nov. 27th | 6:00pm | RoadMap At SR. Email us to reserve one timeslot.

Nov. 13th | 6:00pm | RoadMap Online. Email us to reserve one timeslot.

Nov. 17th | 4:00pm | New Student Evals At SR. Click here to register

Dec. 1st | 4:00pm | New Student Eval At SR. Click here to register

Dec. 15th | 4:00pm | New Student Eval At SR. Click here to register

**Note:** some of the variable names have changed from the video, specifically x, y, and h.

θ = theta = angle

x = horizontal distance

y = vertical distance

h = hypotenuse (the distance measured by the sensor)

A right triangle is formed with h, x, and y. Thus, we can establish the following trigonometric identities:

sin(θ) = y / h

cos(θ) = x / h

Here, we are assuming that we know the angle θ (for example, from a gyro sensor or IMU) and the distance h as measured by a sensor. We can use this information to calculate the distances x and y by using the trigonometric identities listed above:

y = h sin(θ)

x = h cos(θ)

Let us define h=h1+h2. h is therefore the total length of the hypotenuse in the right triangle. Since we know the horizontal distance x, which is the width of the field, we can therefore find the angle θ.

Let us first establish the following trigonometric identities by simply taking the inverse of the previous identities discussed (note that arcsin is the inverse function of sin):

θ = arcsin(y / h)

θ = arccos(x / h)

By these identities, we can simply plug in the h we measure with our sensors to get θ = arccos((4 feet)/h), and thus we have the angle the robot is facing. Note that the angle you get won’t distinguish whether the robot is pointing towards h1 or h2: the following image will yield the same angle θ, but the robot will be oriented differently.

Note first the difference between radians and degrees. Both are ways of measuring angles: there are 360 degrees in a circle, whereas there are 2π radians in a circle. Therefore, 1 degree = π/180 radians.

In RobotC, these are the function calls for trigonometric functions:

`asin(y/h)`

returns the arcsine in radians (arcsin(y / h)).

`acos(y/h)`

returns the arccosine in radians (arccos(x / h)).

`sin(θ)`

returns the value of y/h for an angle θ in radians.

`cos(θ)`

returns the value of x/h for an angle θ in radians.

`sinDegrees(θ)`

returns the value of y/h for an angle θ in degrees.

`cosDegrees(θ)`

returns the value of x/h for an angle θ in degrees.

`radiansToDegrees(θ)`

returns the angle in degrees for a given angle θ in radians.

`degreesToRadians(θ)`

returns the angle in radians for a given angle θ in degrees.

Note that the values you get back from a sensor may not be exactly accurate. The value an ultrasonic or infrared sensor returns can vary from the true distance, especially at long distances, since in practice it is impossible to make a perfectly accurate sensor. In particular, both infrared and ultrasonic sensors have a maximum distance they can sense up to, and the ultrasonic sensor will give junk values if it is placed directly against something (when the distance between the object and the sensor is very small). In other cases, they should be fairly accurate.

Another thing to note is that ultrasonic sensors don’t see in a straight line, but instead a cone. That means the value you get back will be approximately the closest distance to any obstacle inside a conical region in front of it, as demonstrated by the image below.

In the diagram below, even though the obstacle (denoted by the black circle) is not directly in front of the sensor, since it is in the conical field of view the sensor will not return the distance to the wall but instead something closer to the distance to the obstacle.

Finally, some infrared sensors (such as the LEGO EV3 infrared sensor) don’t provide an absolute distance measurement; their readings must first be converted to distances, and they may change depending on the color of the surface.

Motor encoders measure how many degrees a motor has rotated. So if the motor encoder value reported is 720, the motor has gone 2 full rotations. Therefore, the formula to calculate the number of rotations is (# of rotations) = (# of degrees)/360°. Let (# of rotations) = r and (# of degrees) = n. Thus, r = n/360°.

**Note**: in the stock graphical LEGO software, encoder values can be provided as either rotations or degrees.

First, we can easily find the circumference of a wheel by measuring its radius and applying the equation c = 2π·(radius). Note that the circumference c is how far the robot moves with one rotation of the motor (and thus the wheel).

The distance d the robot moves with r rotations is d = c·r. Since r = n/360°, d = c·n/360°. By simply rearranging the equation, we get n = d·360°/c. Therefore, given the distance we want to travel, we know how many degrees the motor needs to spin. This can be used in conjunction with a loop or similar to implement a moving function.

To turn a certain angle in place, we can execute a point turn where the robot’s two motors turn in opposite directions. Thus, when the robot turns 360°, each motor traces the circumference of a circle with a diameter of the distance between the robot’s wheels.

If we define the wheelbase (the distance between the wheels) to be w, the circumference of this circle is πw. To turn an arbitrary angle a, we multiply this circumference by the ratio of a to one full revolution, a/360°, giving the distance each wheel must travel: d = πw·a/360°. Now that we know the distance each wheel needs to travel, we can apply the straight-line equation n = d·360°/c; the only difference is that one wheel rotates in the opposite direction. Substituting d gives n = aπw/c, where c is the circumference of the wheel.

By Cy Westbrook and Rishi Sappidi (both 8th Grade)

**Abstract:** Embedded systems are widely used for many applications. They control many appliances, as well as many simpler electronic devices. At the heart of these embedded systems is an MCU. Many of us are used to using Arduinos, which have MCUs inside of them, but it is important to understand the differences between Arduinos and the more advanced MCUs that we learned about in this class over the summer. In this presentation we will be discussing how and why the transition from Arduinos to these MCUs should be made.

By Cy Westbrook (8th Grade)

**Abstract:**

As systems get more complicated, we begin to have multiple devices that need to communicate with each other. These devices include sensors, displays, motors, other MCUs, and more. Many communication protocols have been established in order for these devices to communicate with each other, including I2C. In this presentation I will explain the different terms and concepts behind many of the communication protocols that are used.

By Neil Chalil (11th Grade)

**Abstract:**

Soldering and circuit construction allow for the physical connections comprising the PCB. Soldering allows for secure electrical connections between pins or devices such as MCUs, sensors, motors, or displays and the board. There are several techniques and options for creating proper electrical connections when soldering. We used soldering in the class in order to connect our devices to our boards. In this presentation I will explain the basic concepts and techniques used when soldering.

By Dheeraj Kattar (9th Grade)

**Abstract:**

Bare Metal Programming represents what embedded systems engineers do in the real world. It is the idea of working with the embedded system at the most basic level: operating the system by changing registers in memory. I will be explaining the purpose of Bare Metal Programming and how we used this concept in our summer course.

By Rishi Sappidi (8th Grade) and Allen Zhang (11th Grade)


Winter Term Renewal will start on November 15th. All current students will receive a notification with recommendations no later than November 15th.

**For New students: **

Gr. 7+ : Required to complete an assessment worksheet for eligibility. Steps:

- Submit an application __here__.
- Our office will email you an evaluation worksheet within the same day (M-F).
- You will be informed of your eligibility status within two business days.

Gr. 4 – 6 : Need to receive an acceptance after an evaluation session for eligibility. Steps:

- Submit an application __here__. In the application, you will also be asked to sign up for an assessment session. If you are unable to attend an evaluation session, please email our Program Coordinator to discuss an alternative.
- You will be informed of your eligibility status within two business days.

Please note that all assessments are subject to a non-refundable $25.00 fee.
