Machine learning in Scratch?? 🐱💡

Wait, really?

Yes, absolutely! Scratch has all the necessary tools for a working linear regression with gradient descent. With a few nifty tricks, we can even visualize how the algorithm works!



What is Scratch, though?

Scratch is a project of the Scratch Foundation, in collaboration with the Lifelong Kindergarten Group at the MIT Media Lab. It is available for free at https://scratch.mit.edu. Scratch is a tool to teach programming by offering a GUI where instructions are combined via drag and drop. Sprites and backgrounds allow for a user interface with animations and user input. One can create all kinds of programs, like simple interactive movies, entire games, or, well, machine learning algorithms.

Some example code in Scratch



Machine learning in Scratch? Why would you even do this?

Because it’s possible. I wanted to explore the boundaries of Scratch and figure out new ways to solve certain problems. It made me understand the algorithm better and think outside the box. Besides, I’ve got a little history of implementing these things in languages or tools you wouldn’t expect.

This post also explains how linear regression works and how to implement it.



Ok, let’s implement linear regression in Scratch then!

Awesome! First, I need some data. For linear regression, that’s a bunch of XY-coordinates. Since Scratch doesn’t have nested lists, I keep them in separate lists. A third list keeps track of the error values, so we can inspect or display them later.

The lists in Scratch's side bar
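To keep the structure in mind, here is the same setup as a minimal sketch. I'll use Python for this and the following snippets, since the Scratch blocks themselves are only shown as screenshots; the list names simply mirror the ones in the side bar.

```python
# Three parallel lists instead of one nested list,
# mirroring the lists shown in Scratch's side bar.
xs = []      # x coordinate of each data point
ys = []      # y coordinate of each data point
errors = []  # error value of each gradient descent iteration
```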

I’m generating the data now, because why not. I could simply generate random data, but some kind of correlation would be nice. UVM has a good formula for that. First, x and y are set randomly. Then, a new y is calculated with this formula, with r as the “rate of correlation”:

y = x \cdot r + y \cdot \sqrt{1 - r^2}
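Expressed outside of Scratch, generating a single correlated data point could look like the sketch below. It only illustrates the formula above; the coordinate ranges are an assumption based on Scratch’s 480 x 360 stage, not taken from the project.

```python
import math
import random

def correlated_point(r):
    """Generate one (x, y) pair with correlation rate r (0..1)."""
    # Both coordinates start out random; the ranges roughly match
    # Scratch's stage size (480 x 360), which is an assumption here.
    x = random.uniform(-240, 240)
    y = random.uniform(-180, 180)
    # Mix x into y so the resulting points end up correlated.
    y = x * r + y * math.sqrt(1 - r * r)
    return x, y
```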

I create a new sprite with a single red dot in the center and tell it to calculate its own coordinates as soon as it’s created. I can then clone this point sprite to create as many new data points as I want:

The code for the data point sprite

The nice thing about this: The data is already visualized. This comes more or less out of the box.

I now introduce the variable m and set it to 100, to know how many data points I need. Next, I create a “main” sprite that’s empty, basically the main entry point of my program. Think of it like Java’s public static void main(String[] args). There I reset all the variables, set the parameters I need for the linear regression/gradient descent and clone the data point m times:

The main code to kick off the algorithm
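In text form, the “main” sprite boils down to something like the following sketch. Only m, c0, c1, maxIter and currentIter appear in the post; the concrete values, the learning rate and the correlation rate are illustrative assumptions, and correlated_point() is the helper from the sketch above.

```python
# Rough sketch of the "main" sprite: reset state, set parameters,
# then create the data points.
m = 100               # number of data points
max_iter = 60         # maxIter: total gradient descent iterations (assumed value)
current_iter = 0      # currentIter: iteration counter
learning_rate = 0.01  # assumed; the post only speaks of "parameters"
c0, c1 = 0.0, 0.0     # coefficients of the line, start flat through the origin
r = 0.75              # assumed rate of correlation

xs, ys, errors = [], [], []
for _ in range(m):                 # "clone the data point m times"
    x, y = correlated_point(r)     # each clone computes its own coordinates
    xs.append(x)
    ys.append(y)
```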

And that’s the result so far:

Data points, neatly distributed with some correlation

Next, I create a sprite called “error point” to plot the error rate. The “error point” is similar to the data point sprite. Also, since the “errors” list is already public, I introduce two more variables: maxIter for the total number of iterations of gradient descent and currentIter to know which iteration I’m currently at. For each iteration, I create a clone of the “error point” and tell it to adjust itself, to get a nice plot of the error rate:

Error point code
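A sketch of what an “error point” clone might do is shown below: the x axis spans the iterations, the y axis shows the error value. The exact scaling is an assumption of this sketch, not taken from the project; only the stage size of 480 x 360 is a Scratch given.

```python
def error_point_position(current_iter, max_iter, error, error_scale=1.0):
    """Place one error point: left to right over the iterations,
    bottom to top with the (scaled) error value."""
    x = -240 + (current_iter / max_iter) * 480  # spread iterations over the stage width
    y = -180 + error * error_scale              # grow upwards with the error
    return x, y
```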

Next, I introduce a sprite called “line”. This is where the main logic will happen. The line can basically adjust itself, rotate and move, whatever its internals tell it to do.

Linear regression basically tells me the coefficients c_0 and c_1 of a linear function. I can use those to calculate a single point (i.e. (x, c_0 + c_1 * x)). To calculate the angle \alpha of the hypotenuse, I can use this formula:

\tan \alpha = \frac{\text{opposite side}}{\text{adjacent side}}

Therefore:

\alpha = \arctan \frac{\text{opposite side}}{\text{adjacent side}}

When I assume x to be 1 and neglect c_0 (which only moves the line, it doesn’t rotate it), I can calculate the angle of the line like this:

\alpha = \arctan c_1
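In code, turning the slope into an angle is a one-liner; the sketch below converts to degrees because Scratch’s trig blocks work in degrees. How exactly the project rotates the sprite (e.g. via “point in direction”, where 90 means pointing right) is an assumption here.

```python
import math

def line_angle(c1):
    """Angle of the regression line above the x axis, in degrees."""
    return math.degrees(math.atan(c1))
```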

This is then part of the iteration of gradient descent in the “line” sprite:

Line sprite, basic reset code, and the loop for gradient descent

So far so good. I’ve got data points, I’ve got an error plot, I’ve got a line that I can move around. Now for the fun part: The linear regression with gradient descent itself.

I’ve worked with two lists for x’s and y’s since Scratch doesn’t allow for nested lists. Minor inconvenience, but nothing that’ll stop me.

So, first, I’ll give the variables used in the loop some default values:

Default values in gradient descent loop

Next up: looping through the data points to calculate the difference between the predicted Y and the actual Y:

Looping through all data points

Next, I use the sums descentSumC0 and descentSumC1 to calculate new versions of c0 and c1 and set them simultaneously:

Setting of C0 and C1

Once this block has run, the line adjusts itself and a new iteration starts, until the maximum number of iterations is reached.
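Put together, one iteration of the loop could look like the sketch below. descentSumC0, descentSumC1, c0 and c1 are named in the post; the learning rate and the mean squared error as the error metric are assumptions of this sketch, not necessarily the exact Scratch blocks.

```python
def gradient_descent_step(xs, ys, c0, c1, learning_rate):
    """One pass over all data points, i.e. one iteration of the loop above."""
    m = len(xs)
    descent_sum_c0 = 0.0
    descent_sum_c1 = 0.0
    squared_error = 0.0

    # Loop through all data points and accumulate the difference
    # between the predicted y and the actual y.
    for x, y in zip(xs, ys):
        diff = (c0 + c1 * x) - y
        descent_sum_c0 += diff       # drives the update of c0
        descent_sum_c1 += diff * x   # drives the update of c1
        squared_error += diff ** 2

    # Update both coefficients "simultaneously", i.e. from the same sums.
    new_c0 = c0 - learning_rate * descent_sum_c0 / m
    new_c1 = c1 - learning_rate * descent_sum_c1 / m

    return new_c0, new_c1, squared_error / m
```

Calling this maxIter times, appending the returned error to the errors list and letting the line sprite redraw itself with the new c0 and c1 after every call mirrors the loop in the “line” sprite.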



And this works?

Yup:

Animation of Scratch executing linear regression with gradient descent

For the full code and to try it yourself, here’s the Scratch project: scratch.mit.edu/projects/520553339

I hope you enjoyed reading this article as much as I enjoyed writing it! If so, leave a ❤️! I write tech articles in my free time and like to drink coffee every once in a while.

If you want to support my efforts, buy me a coffee or follow me on Twitter! You can also support me directly via PayPal!


