Furthermore, registration for the competition is slated to open next week. Be sure to look out for that!
There are two primary types of cameras available for LEGO: the PixyCam and the NXTCam (which can be used with both the NXT and the EV3). The PixyCam comes in two very different versions, the original Pixy and the Pixy for LEGO. Mindsensors produces an adapter that allows the original Pixy to be used with LEGO and programmed like an NXTCam. This recap applies to the original Pixy with the Mindsensors adapter; however, the programming section likely also works for the NXTCam, and the setup portion will apply mostly to the Pixy for LEGO. A brief addition regarding the Pixy for LEGO is included in the recap for reference.
Go to https://github.com/botbench/robotcdriversuite, click "Clone or download", and download as a ZIP. Extract the contents into the directory of your RobotC project. The relevant header is mindsensors-nxtcam.h. To include the driver, use #include "robotcdriversuite-master/include/mindsensors-nxtcam.h", which points to the path of the file.
See the sample code located in robotcdriversuite-master/examples/mindsensors-nxtcam-test1.c.
First, include the library; then you must initialize the camera. Define a camera variable of type tSensors whose value is the sensor port, for example "tSensors cam = S1;". From there, simply call "NXTCAMinit(cam);".
To get the blobs, you must first declare a variable of type blob_array ("blob_array blobs;"), which you pass to the function "nblobs = NXTCAMgetBlobs(cam, blobs, condensed);". If condensed is true, overlapping blobs will be combined.
Each element in a blob_array is of type blob, and has the following variables: x1 (the left bound), y1 (the upper bound), x2 (the right bound), y2 (the bottom bound), colour (the signature number), and size.
There are also a number of auxiliary functions that can help with blob analysis, such as NXTCAMgetAverageCenter, which can be found in the header file that was included.
Brief sample code is provided below that initializes a camera, continually grabs the blobs, and prints the location and size of the biggest blob.
#include "robotcdriversuite-master/include/mindsensors-nxtcam.h"
task main()
{
tSensors cam = S1;
NXTCAMinit(cam);
blob_array blobs;
bool condensed = true;
while (true)
{
short nblobs = NXTCAMgetBlobs(cam, blobs, condensed);
if (nblobs > 0)
{
_sortBlobs(nblobs, blobs);
// blobs[0] is now the largest blob.
displayTextLine(1, "Number of blobs: %d", nblobs);
displayTextLine(3, "Top left: (%d, %d)", blobs[0].x1, blobs[0].y1);
displayTextLine(5, "Bottom right: (%d, %d)", blobs[0].x2, blobs[0].y2);
displayTextLine(7, "Colour: %d", blobs[0].colour);
displayTextLine(9, "Size: %d", blobs[0].size);
}
}
}
First, the setup differs: when configuring the interface, make sure it is set to LEGO I2C, not UART.
When programming the Pixy for LEGO, use the LEGO I2C functions. Documentation on the I2C registers can be found here: http://cmucam.org/documents/36. Documentation on the RobotC I2C functions can be found here: http://botbench.com/driversuite/group__common.html.
Example code is below (it has not yet been tested, so it may not work exactly as expected):
#include "robotcdriversuite-master/include/common.h"
tByteArray msg;
tByteArray reply;
// "port" is just the sensor port, "reg" is the register you want info for (0x50 for general, 0x51 to 0x57 for specific signatures; see documentation). The information you get back will be in the array "reply".
bool getInfo(tSensors port, ubyte reg)
{
short replyLen;
if (reg == 0x50)
replyLen = 6;
else if (reg >= 0x51 && reg <= 0x57)
replyLen = 5;
else
return false;
memset(msg, 0, sizeof(msg));
msg[0] = reg;
bool result = writeI2C(port, msg, reply, replyLen);
return result;
}
https://drive.google.com/open?id=1aosqiHktFN_0wDS9-guM2N0HI30AHSRk
Nov. 27th | 6:00pm | RoadMap At SR. Email us to reserve one timeslot.
Nov. 13th | 6:00pm | RoadMap Online. Email us to reserve one timeslot.
Nov. 17th | 4:00pm | New Student Evals At SR. Click here to register
Dec. 1st | 4:00pm | New Student Eval At SR. Click here to register
Dec. 15th | 4:00pm | New Student Eval At SR. Click here to register
Note: some of the variable names have changed from the video, specifically x, y, and h.
θ = theta = angle
x = horizontal distance
y = vertical distance
A right triangle is formed with h, x, and y. Thus, we can establish the following trigonometric identities:
sin(θ) = y / h
cos(θ) = x / h
Here, we are assuming that we know the angle θ (for example, from a gyro sensor or IMU) and the distance h as measured by a sensor. We can use this information to calculate the distances x and y by using the trigonometric identities listed above:
y = h sin(θ)
x = h cos(θ)
Let us define h=h1+h2. h is therefore the total length of the hypotenuse in the right triangle. Since we know the horizontal distance x, which is the width of the field, we can therefore find the angle θ.
Let us first establish the following trigonometric identities by simply taking the reverse of the previous identities discussed (note that arcsin is the inverse function of sin):
θ = arcsin(y / h)
θ = arccos(x / h)
By these identities, we can simply plug in the h we measure with our sensors to get θ = arccos((4 feet)/h), which gives us the angle the robot is facing. Note that the angle you get won't distinguish whether the robot is pointing towards h1 or h2: the following image yields the same angle θ, but the robot is oriented differently.
Note first the difference between radians and degrees. They are both ways of measuring angles. There are 360 degrees in a circle, whereas there are 2π radians in a circle. Therefore, 1 degree = π/180 radians.
In RobotC, these are the function calls for trigonometric functions:
asin(y/h)
returns the arcsine in radians (arcsin(y / h)).
acos(y/h)
returns the arccosine in radians (arccos(x / h)).
sin(θ)
returns the value of y/h for an angle θ in radians.
cos(θ)
returns the value of x/h for an angle θ in radians.
sinDegrees(θ)
returns the value of y/h for an angle θ in degrees.
cosDegrees(θ)
returns the value of x/h for an angle θ in degrees.
radiansToDegrees(θ)
returns the angle in degrees for a given angle θ in radians.
degreesToRadians(θ)
returns the angle in radians for a given angle θ in degrees.
Note that the values you get back from a sensor may not be exactly accurate. First, the value an ultrasonic or infrared sensor returns can vary from the true result, especially at long distances, since it is practically impossible to make a perfectly accurate sensor. In particular, both infrared and ultrasonic sensors have a maximum distance they can sense up to, and the ultrasonic sensor will give junk values if it is placed directly up against something (when the distance between the object and the sensor is very low). In other cases, they should be fairly accurate.
Another thing to note is that ultrasonic sensors don’t see in a straight line, but instead a cone. That means the value you get back will be approximately the closest distance to any obstacle inside a conical region in front of it, as demonstrated by the image below.
In the diagram below, even though the obstacle (denoted by the black circle) is not directly in front of the sensor, since it is in the conical field of view the sensor will not return the distance to the wall but instead something closer to the distance to the obstacle.
Finally, some infrared sensors (such as the LEGO EV3 infrared sensor) don’t provide an absolute distance measurement; they must first be converted to distances, and their values may change depending on the color of the surface.
Motor encoders measure how many degrees a motor has rotated. So if the motor encoder value reported is 720, the motor has gone 2 full rotations. Therefore, the formula to calculate the number of rotations is (# of rotations) = (# of degrees)/360°. Let (# of rotations) = r and (# of degrees) = n. Thus, r = n/360°.
Note: in the stock graphical LEGO software, encoder values can be provided as either rotations or degrees.
First, we can easily find the circumference of a wheel by measuring the radius and applying the equation c = 2πr. Note that the circumference c is how far the robot moves with one rotation of the motor (and thus the wheel).
The distance d the robot moves with r rotations is given by d = c*r. Since r = n/360°, d = c*n/360°. By simply rearranging the equation, we get n = d*360°/c. Therefore, given the distance we want to travel, we know how many degrees the motor needs to spin. This can be used in conjunction with a loop or similar to implement a moving function.
To turn a certain angle in place, we can execute a point turn where the robot’s two motors turn in opposite directions. Thus, when the robot turns 360°, each motor traces the circumference of a circle with a diameter of the distance between the robot’s wheels.
If we define the wheelbase (the distance between the wheels) to be w, we can calculate the circumference of this circle with πw. To turn an arbitrary angle a, we multiply this circumference by the ratio of a to one full revolution, a/360°, giving the distance each wheel must travel: πw*a/360°. Now that we know the distance each wheel needs to travel, we can apply the equations from moving in straight lines; the only difference is that one wheel rotates in the opposite direction. Therefore the degrees each motor needs to turn is n = (πw*a/360°)*(360°/c) = aπw/c, where c is the circumference of the wheel.
By Cy Westbrook and Rishi Sappidi (both 8th Grade)
Abstract: Embedded systems are widely used for many applications. They control many appliances, as well as many simpler electronic devices. At the heart of these embedded systems is an MCU. Many of us are used to using Arduinos, which have MCUs inside of them, but it is important to understand the differences between Arduinos and the more advanced MCUs that we learned about in this class over the summer. In this presentation we will be discussing how and why the transition from Arduinos to these MCUs should be made.
By Cy Westbrook (8th Grade)
Abstract:
As systems get more complicated, we begin to have multiple devices that need to communicate with each other. These devices include sensors, displays, motors, other MCUs, and more. Many communication protocols have been established in order for these devices to communicate with each other, including I2C. In this presentation I will explain the different terms and concepts behind many of the communication protocols that are used.
By Neil Chalil (11th Grade)
Abstract:
Soldering and circuit construction allow for the physical connections comprising the PCB. Soldering allows for secure electrical connections between pins or devices such as MCUs, sensors, motors, or displays and the board. There are several techniques and options for creating proper electrical connections when soldering. We used soldering in the class in order to connect our devices to our boards. In this presentation I will explain the basic concepts and techniques used when soldering.
By Dheeraj Kattar (9th Grade)
Abstract:
Bare Metal Programming represents what embedded systems engineers do in the real world. It is the idea of working with an embedded system at the most basic level: operating the system by changing registers in memory. I will explain the purpose of Bare Metal Programming and how we used this concept in our summer course.
By Rishi Sappidi (8th Grade) and Allen Zhang (11th Grade)
For New students:
Gr. 7+ : Will be required to work on an assessment worksheet for eligibility. Steps:
Gr. 4 – 6 : Need to receive an acceptance after an Evaluation Session for eligibility. Steps:
Please note that all assessments are subject to a non-refundable $25.00 fee.
Full meeting recording:
About
Currently, there are other sophisticated robotics soccer competitions for pre-college students, such as the RobocupJunior Robotics Soccer League. However, they can be overwhelmingly complex, especially in the hardware component, as well as very costly for an individual.
This competition is organized by two of SR's high school students, Daniel Xue and Ethan Wu. (Learn more about them in the "Committee" section below.) Due to trademark restrictions, this competition league is called the Amateur Robotics Soccer League for the time being, instead of the LEGO Robotics League.
This robotics soccer league offers one unique aspect: game modes with scaffolded complexity, namely Free Kick, Goalie, and Full Game. This structure encourages participating students to divide and conquer tasks which are all critical to achieving the final Full Game.
When : March 24th, 2019 | 1:30 to 3:30pm.
Where : Raritan Valley Community College, Branchburg, NJ.
Duration : As a pilot event, it will only be two hours, held right after the 3rd Annual Math Contest.
Organizer : Storming Robots
Visit our Blog to view all the latest news about this LEGO Soccer Competition.
Who may participate
Age Group : Gr. 6+. (Participants do not need to be current Storming Robots students.)
Team Size | # of Robots
For Free Kick and Goalie leagues: 1 to 2 members | 1 robot. Only one robot is allowed; participants cannot use a 2nd alternate robot for a different round, although they may modify the single robot between rounds.
For Full Game: 2 to 4 members | 2 robots. Only two robots are allowed; participants cannot use a 3rd alternate robot for a different round, although they may modify the same two robots between rounds.
All team members MUST be active participants, i.e. taking important technical roles in the team.
View the Game Book for Detailed Rules .
Sign up for the free Online Sessions (see below) to learn how to get started.
The Technical Committee...
Daniel Xue: Daniel is currently a 10th grader at Montgomery High School in NJ. He has been learning robotics since he was in 6th grade. He loves just about anything computer-related, especially algorithms, computer science (particularly AI), and mechatronics.
Ethan Wu: Ethan is currently an 11th grader at BRHS in NJ. He has participated in various robotics competitions. His hobbies include anything computer-related, including web design, algorithms, CGI art, Linux, and mechatronics. Despite his young age, the target audiences of his tech talks are usually interested in embedded systems, college-level EE, computer engineering, and AI-focused computer science.
Both members participated in and earned high rankings at the 2017 and 2018 World RobocupJunior events, held in Japan and Montreal respectively. In 2017, the duo won the Best Engineering Design Award in the Maze League at the World event.
Both have also delivered tech talks multiple times at the World Maker Faire in NYC, as well as to the World community at the 2018 International RobocupJunior in Montreal. Learn more about Ethan's and Daniel's work.
Quote From Daniel Xue: This new competition serves as an excellent launchpad to advanced robotics projects and competitions by expanding the scope of both software and hardware competitors get to interact with, compared to familiar competitions like the First LEGO League. The unique establishment of different game modes not only gives an opportunity for beginners to compete and achieve, but also provides a foundation for newcomers who wish to compete in more advanced divisions in the future.
Advisors...
Full meeting recording: