Last week, two of our members from the Hackers group were at the BT Young Scientist & Technology Exhibition, where they presented their projects. Luke, who goes to Clarin College in Athenry, presented a project called “Relax and Reward” …
Here are photos from our Christmas party and Show & Tell day at CoderDojo Athenry on 08 December 2018.
It was fantastic to see the things that our young people had created.
We are very grateful to our supporters in the community around Athenry:
- Clarin College and Principal Ciaran Folan, who are so generous with their space every week
- Galway & Roscommon Education & Training Board, who provide us with an annual Youth Club Grant
- HEA (Higher Education Authority) and NUI Galway School of Computer Science, who provide us with funding towards equipment
- Medtronic and Declan Fox, who have provided us with a grant linked to Declan’s volunteering
- Hewlett Packard Enterprise and Mark Davis, who provide us with loaner laptops
- Boston Scientific and Kevin Madden, who provide us with the loan of 3D printers
- Supermacs, who gave us a great deal on the food for the Christmas party
And of course, we are eternally grateful to our wonderful mentors, and to the parents who come along with their children every week. Thank you!
This week we looked at two ways to do object recognition.
Kevin went through the steps involved in finding an object of a particular colour in an image. He started with an image of six circles, each a different colour, and demonstrated finding the green circle in the image. Then he stepped through the Python code and explained each task.
Here’s the image:
OpenCV, the Open Source Computer Vision library, has lots of functions for transforming and processing images.
We started with a standard RGB (red, green, blue) JPEG image, which OpenCV stores in memory as BGR (blue, green, red). Then we transformed the image to HSV (hue, saturation, value) format. The HSV colour space has a very useful property: the hue and saturation describe the colour itself, while the value represents its brightness. This means that we can use H and S to find a colour, without having to worry much about lighting or shadows.
Next we used the known value for the green circle to apply a threshold to the image: any pixels outside the threshold range are converted to black, and any pixels within it are converted to white. Here’s what the thresholded image looked like:
Then we found the white areas in the image. To do that, we used an OpenCV function that gets the co-ordinates of contours that can be drawn around the boundaries of all the white regions in the image. We calculated the area of each contour, and took the largest. We’ll find out why this is useful later.
To show that we had found the right circle, we calculated the co-ordinates of its centre-point. Finally, we drew a small cross at that centre-point to mark it, and displayed the full image on screen. This is what we ended up with:
Since we had a contour, we also used that contour to draw a line around the perimeter of the circle.
Next, we took a photo of a blue marker, found the HSV value of the blue, and used that value to find the marker in a live image. We held the marker in front of a laptop webcam, moved the marker around, and watched as the spot moved with it. Since we use a contour, our method for finding a particular colour works for any shape, not just circles.
Michael introduced us to TensorFlow, a machine learning library. Once trained, TensorFlow can identify specific objects by name. It’s a lot more sophisticated than finding something by colour. We spent some time setting the library up on a Raspberry Pi. The Pi isn’t powerful enough to train the software, but it is capable of doing the recognition using models trained on more powerful computers.
Our final goal is to build an autonomous robot to play a game of hide and seek. We can use one of our remote-controlled battlebots from last year to hide, and the new robot to do the seeking on its own. One way to do the seeking would be to go after the biggest thing in the robot’s field of view that has a particular colour – the colour of the hiding robot. Another way would be to train a TensorFlow model with pictures of the hiding robot, so that the seeker can recognise it from different angles.
It’s going to take us a while to figure out what works best, and then we have to work out how to control our robot. It should be an interesting new year.
Last week in Creators, we looked at Vectors and how they can be used to specify things like position, velocity and acceleration. This week, we looked at what can actually cause something to accelerate – FORCE! First we talked and played a bit with force and then we created a simulation of something that we could apply different forces to. We wanted to think about a scene where we had lots of objects and different forces acting like wind, gravity, friction, etc.
What the heck IS force?
We started off by looking at the laws of motion from Sir Isaac Newton, who was the first guy to think hard about this (or at least to come up with good theories). Everyone knew the story about the apple, but not many knew what it made him think of. We googled his laws of motion and arrived at a NASA web page that had a nice short explanation:
Like all great ideas, these look obvious when you know the answer, but were huge ideas at the time! I’ll butcher the above laws by trying to re-word them to capture the interesting thing about them from our point of view!
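The key idea for our simulation is Newton’s second law: acceleration is force divided by mass. A tiny sketch in plain Python (the numbers here are made up):

```python
# Newton's second law: a = F / m, applied to each component of a 2D vector
mass = 2.0
force = (4.0, -2.0)   # e.g. a gust of wind pushing right and up (made-up numbers)
acceleration = (force[0] / mass, force[1] / mass)
print(acceleration)   # (2.0, -1.0)
```

This is why the same wind blows a light object around more than a heavy one: dividing by a bigger mass gives a smaller acceleration.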
This week in Creators we covered a few very important concepts which we will likely need to run over a few times until we get them solid!
What are Vectors?
First we looked into a concept called Vectors, in particular 2D Vectors. We saw that at the most basic, a Vector is a simple way of holding an x and a y in one variable. This is useful for keeping track of the position of an object as you only need to keep one variable e.g. “rocketPosition” rather than two “rocket_x, rocket_y”.
The other thing we talked about is how this X and Y can represent a change in a given direction – i.e. a vector with x=10 and y=5 can mean “change X by 10 and Y by 5”. This makes vectors useful for ANYTHING that has a direction – things like velocity and acceleration, for example!
The other cool thing about the vector object in p5 is that it has a bunch of functions that allow you to add, subtract and scale them. This allows you to, for example, add a velocity vector to a position vector to come up with a new position vector!
This web page has some nice details on the maths behind vectors.
Position, Velocity and Acceleration
We saw that position, velocity and acceleration are really common uses of vectors.
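Here’s a minimal sketch of those ideas in plain Python (standing in for p5’s createVector, with made-up numbers): motion is just repeated vector addition, once per frame.

```python
# A vector is just an (x, y) pair; adding two vectors adds their components
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

position = (0.0, 0.0)
velocity = (10.0, 5.0)      # "change x by 10 and y by 5" each frame
acceleration = (0.0, 1.0)   # e.g. gravity pulling down a little each frame

for frame in range(3):
    velocity = add(velocity, acceleration)  # acceleration changes velocity
    position = add(position, velocity)      # velocity changes position

print(position)   # (30.0, 21.0)
```

In p5 the same two lines (velocity.add(acceleration); position.add(velocity)) sit inside draw(), which runs every frame.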
This week we looked at animations, and how to make objects move in p5.js.
- How to animate things in p5
- Some mouse interaction
To save some typing, there is a video below with a quick overview of what we did:
Okay, it’s pretty simple stuff and not exactly the most exciting animation in the world, but you guys did some playing around and came up with several interesting variations. Now that we know how to do it, the sky is the absolute limit on what we can build 🙂 !
As usual, the code is up on GitHub to be pulled down and played with or changed to your heart’s content!
Thanks for coming Saturday even though the weather was dreadful.
This week, I helped Bo Peep find her sheep! Some of you did the same, and some used ideas like Minecraft Steve finding Iron Ore and Diamonds, knights finding dragons or a princess finding flowers.
Before we even started our game this week we talked a little bit about File Management and about the importance of keeping your files somewhere you can access them quickly and giving them a meaningful name.
To this end, we all created a folder where we will keep our files in the future, and within that, a sub-folder for this week’s files.
We started our game by drawing our background on our stage:
Unfortunately, due to internet problems we could not search the internet for images for our Sprites, but we could still use the sprites from the Scratch Library.
This week, we decided to move our main sprite using the arrow keys. For this we had to learn a small bit about the X and Y axis and I gave you a little tip on how to remember which is which!
Here are the notes for this week’s session in PDF: CDA-S7-Week_05-BoBeep.pdf
Julie, Ruaidhrí and Eoin
Today we continued towards building a temperature controller.
Last week, we used an LM35 temperature sensor to read the temperature in the room and report it on the Arduino serial console.
We were able to get a reading for the temperature of the air in the room, and the temperature of a cup of coffee touching the sensor. We couldn’t be sure, however, that the readings were correct. So, we decided to test with a potentiometer and a voltmeter.
We supplied 5 V to the potentiometer and fed the output to the Arduino. We used a voltmeter to measure the output from the pot and compared it to the reading from the Arduino. Once we got the code working, we got good agreement on voltage readings between the meter and the Arduino.
The sensor presents a voltage on its output pin that represents the temperature it’s measuring. In its basic mode, the sensor reads temperatures between 2 and 150 degrees Celsius. Every degree is 10 mV, so the output voltage ranges from 20 mV to 1500 mV. Converting our voltage reading to a temperature is then simple: divide the voltage reading in millivolts by 10.
We can use one of the analog I/O pins on the Arduino to read the voltage. The analogRead() function returns a value between 0 and 1023 for voltages between 0 and 5 V.
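The conversion maths can be sketched in Python (the Arduino code itself is in C; the helper names and the sample reading here are hypothetical):

```python
# Convert a 10-bit Arduino ADC count (0-1023 for 0-5 V) to volts
def counts_to_volts(counts, vref=5.0):
    return vref * counts / 1023.0

# LM35: 10 mV per degree Celsius, so millivolts / 10 gives the temperature
def lm35_temp_c(volts):
    return volts * 1000.0 / 10.0

v = counts_to_volts(61)       # ~0.298 V for an example reading of 61
print(round(lm35_temp_c(v), 1))   # ~29.8 degrees C
```

Note that the floating-point literals matter here, which is exactly the bug we ran into on the Arduino later in the session.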
We digressed into how to use a multimeter correctly. The first thing to do is make sure that our meter is set up correctly.
We tried measuring a 9 V DC battery with the DC voltage setting and got 9.2 V. We then tried measuring the same battery with the AC voltage setting. This time we got 19.6 V.
There’s a lot of potential (sorry) to get confused, then. Worse again, if we try measuring 220 VAC with the meter set to DC, we get a reading close to zero. In other words, a live AC supply looks safe if we set our meter wrong.
Next, we start with the highest reading range and step down. For a 1.5 V battery, we start with the 200 V reading, then step down to 20 V. This is to protect the meter: if the voltage is higher than we expected, we might damage the meter.
Voltage readings are taken in parallel with the circuit, and while it is live.
Resistance and continuity readings, on the other hand, are taken with the component disconnected from the circuit.
We worked out why with a simple circuit:
F is a fuse. W is a wire between the two ends. We’re not sure if the fuse is blown. We try testing for continuity across the fuse by putting our meter leads on A and B.
Even if the fuse is blown, we will get continuity because W provides a circuit. We need to cut the wire to get a true reading.
Our first attempt at Arduino code for the voltage reading gave us a surprise. Our meter displayed 2.5 V. The Arduino displayed 2 V. We used this formula to calculate the voltage:
v = 5 * analogRead_reading / 1023.
v was declared as a float variable; analogRead_reading was declared as an int. The Arduino code multiplied two ints, divided the result by another int, truncated the result, and stored that int in the float. When we made the 5 and 1023 floating-point numbers (5.0 and 1023.0), we got the right answer.
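The bug can be mimicked in Python, where // behaves like C’s integer division (the reading of 512 is an example value, chosen because it sits near mid-scale):

```python
reading = 512                       # example analogRead() value (an int)
v_int   = 5 * reading // 1023       # all-int maths, truncated like C -> 2
v_float = 5.0 * reading / 1023.0    # float maths -> ~2.5
print(v_int, round(v_float, 2))     # 2 2.5
```

That truncation is exactly the half-volt gap we saw between the meter (2.5 V) and the Arduino (2 V).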
Once we were happy with the Arduino code, we replaced the potentiometer with an LM35. Unfortunately, we didn’t notice the “bottom view” label on the datasheet drawing. We connected the 5 V supply to the wrong side of the sensor. It’s amazing how hot a temperature sensor can get! It was too hot to touch. And we couldn’t measure the temperature because…
After we disconnected the LM35, we made another discovery: we continued to get random voltage readings on the Arduino even though there was nothing connected. With the analog pin left floating, analogRead() happily returns values from random electrical noise.
Next week, we’ll use an LM35 connected the right way around, and try controlling a relay to switch power on and off to an external device. We’ll build a temperature controlled soldering station carefully – we want to solder with the tip of the iron, not the temperature sensor.
What is P5?
But really, what is P5?
Hello again everybody.
This week in the Bodgers group we started working on our code for the Mission Zero Challenge.
We began by writing a simple text message on the 8×8 full-colour LED display, then we changed the text and background colours. We then coded a picture by assigning a colour to each of the 64 LEDs on the display. We finished the session by taking a quick look at using the temperature sensor to read the temperature. Here are my slides from this week.
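A sketch of what that kind of code looks like. The sense_hat calls follow the official Sense HAT API, but the message text, colours and pattern here are made up, and the calls themselves are commented out so the snippet runs without the hardware:

```python
# An 8x8 Sense HAT picture is just a list of 64 (r, g, b) tuples, row by row
g = (0, 200, 0)      # green
w = (255, 255, 255)  # white
pixels = [g if (x + y) % 2 == 0 else w for y in range(8) for x in range(8)]

# On an Astro Pi / Sense HAT you would then run:
# from sense_hat import SenseHat
# sense = SenseHat()
# sense.show_message("Hello!", text_colour=g, back_colour=w)  # scrolling text
# sense.set_pixels(pixels)                                    # 64-LED picture
# print(sense.get_temperature())                              # degrees Celsius
print(len(pixels))   # 64
```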
Next week we will recap what we covered this week and we will start to personalise our code for the challenge.
In the meantime, here are a couple of fun videos on how the Astro Pi computers got to the ISS.
See you all next Saturday
Declan, Dave and Alaidh