In an afternoon, I decided to learn Terraform to spin up an EC2 instance with all the necessary ports open and my existing key pair attached, so that I could SSH into the box and start working. I wanted to deploy the API I was building at the time and invite my client for a quick test round.
Terraform was something I had wanted to try from the first day I heard about it. Clicking back and forth in the AWS console didn’t feel productive at all.
The following diagram explains what I wanted to achieve.
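The actual configuration isn’t shown in this excerpt; a minimal Terraform sketch for this kind of setup might look like the following. The AMI ID, the key-pair name, and the API port (3000) below are placeholders of mine, not necessarily the values actually used.

```hcl
# Security group opening SSH (22) and an assumed API port (3000)
resource "aws_security_group" "api" {
  name = "api-sg"

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 3000
    to_port     = 3000
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

# EC2 instance with an existing key pair attached for SSH access
resource "aws_instance" "api" {
  ami                    = "ami-xxxxxxxx"          # placeholder AMI ID
  instance_type          = "t2.micro"
  key_name               = "my-existing-key-pair"  # placeholder key-pair name
  vpc_security_group_ids = [aws_security_group.api.id]
}
```

Running `terraform apply` against a configuration like this creates both resources and tears them down again with `terraform destroy`, which is exactly the convenience the console workflow lacks.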
At a time when humans are researching how to synthesize human consciousness, audio synthesis may sound primitive. But can’t we agree that, being tied to one of the five senses, audible waves play a leading role in our day-to-day lives?
This article's intention is to explore the practical applications of digital audio synthesis in a developer-friendly environment using the Web Audio API.
To see how powerful the Web Audio API is, I built a synthesizer that runs in the browser and managed to pack the following functionality into the “synth”.
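The synth itself uses the browser’s Web Audio API, but the core building block of any synthesizer, an oscillator, can be sketched independently in a few lines of NumPy. This is an illustration of the concept rather than the article’s actual code; the frequency, duration, and amplitude values are arbitrary choices of mine.

```python
import numpy as np

def oscillator(freq_hz, duration_s, sample_rate=44100, amplitude=0.5):
    """Generate samples of a sine oscillator, the basic voice of a synth."""
    # Sample times: 0, 1/sr, 2/sr, ... up to the requested duration
    t = np.arange(int(sample_rate * duration_s)) / sample_rate
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

samples = oscillator(440.0, 1.0)  # one second of A4 (440 Hz)
print(samples.shape)              # (44100,)
```

Feeding these samples to any audio sink (or, in the browser, letting an `OscillatorNode` generate them natively) produces the familiar pure tone.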
Suppose we want a piece of software that multiplies any number we give it by 2. This is a very simple computation: if the input is a, the output should be 2a. But here we have to write the multiplier 2 into our software (hard code it). Instead of doing that, we can also write this software using machine learning.
We can tell the software: “If the input is 2, the output should be 4; if the input is 3, the output should be 6.” Using a collection of inputs and outputs like this, we can find the relationship between those numbers. …
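The idea above can be sketched in a few lines: instead of hard-coding the multiplier, estimate it from input–output pairs. This is a closed-form least-squares estimate, a simpler stand-in for the learning procedure the article describes; the example pairs are mine.

```python
inputs = [2, 3, 4, 5]
outputs = [4, 6, 8, 10]

# Least-squares estimate of k in output = k * input:
# k = sum(x*y) / sum(x*x)
k = sum(x * y for x, y in zip(inputs, outputs)) / sum(x * x for x in inputs)
print(k)  # 2.0
```

The multiplier 2 never appears in the code; it is recovered from the data, which is the essence of the machine-learning approach.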
Let’s implement gradient descent in Python, using only NumPy.
In a previous article we discussed two series of numbers ([1,2,3,4,5], [3,6,9,12,15]) and an artificial neuron figuring out the relationship between them. This article intends to implement the analogy we discussed there.
The cost function and its derivative with respect to b are as follows.
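The equations themselves don’t survive in this excerpt. For a linear unit $a_i = w x_i + b$ with a mean-squared-error cost over $n$ samples, the standard forms (an assumption consistent with the implementation output below) are:

```latex
C(w, b) = \frac{1}{2n} \sum_{i=1}^{n} \bigl( (w x_i + b) - y_i \bigr)^2
\qquad
\frac{\partial C}{\partial b} = \frac{1}{n} \sum_{i=1}^{n} \bigl( (w x_i + b) - y_i \bigr)
```

The $\frac{1}{2}$ factor is a common convention that cancels the 2 produced by differentiating the square.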
There are two dependencies for the following implementation.
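The implementation itself doesn’t appear in this excerpt; a minimal sketch consistent with the output below might look like this. Seeding NumPy with 0 (which reproduces the initial weight 1.764… and bias 0.400… shown) is inferred from the output, but the learning rate and iteration count are guesses of mine, so the final numbers won’t match the article exactly.

```python
import numpy as np

np.random.seed(0)
inputs = np.array([[1, 2, 3, 4, 5]])
outputs = np.array([[3, 6, 9, 12, 15]])

weight = np.random.randn(1, 1)  # 1.764... with seed 0
bias = np.random.randn(1, 1)    # 0.400... with seed 0

learning_rate = 0.01
for _ in range(1000):
    activation = weight * inputs + bias      # forward pass: a = w*x + b
    error = activation - outputs
    d_weight = np.mean(error * inputs)       # dC/dw
    d_bias = np.mean(error)                  # dC/db
    weight -= learning_rate * d_weight       # gradient-descent update
    bias -= learning_rate * d_bias

print("weight:", weight.flatten(), "bias:", bias.flatten())
```

With enough iterations the weight approaches 3 and the bias approaches 0, matching the relationship between the two series.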
Running the above code in a Jupyter notebook should return the following output.
inputs: [[1 2 3 4 5]]
outputs: [[ 3 6 9 12 15]]
weight: [[1.76405235 1.76405235 1.76405235 1.76405235 1.76405235]]
bias: [[0.40015721 0.40015721 0.40015721 0.40015721 0.40015721]]
activation: [[2.16420955 3.9282619 5.69231425 7.45636659 9.22041894]]
weight: [2.6302523] bias: [0.71394367]
This is part 2 of the Universal function approximators series; part 1 is also available for reading.
I think you are familiar with the euphoric feeling of understanding how things work. So, let’s explore the inner workings of an artificial neural network.
These are the building blocks of neural networks. A bunch of these neurons can solve complex problems, yet the basic concept is remarkably simple.
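That simple concept can be shown in a few lines: a neuron computes a weighted sum of its inputs plus a bias, then passes the result through an activation function. This is a generic sketch of a single artificial neuron; the sigmoid activation and the example numbers are illustrative choices of mine.

```python
import numpy as np

def neuron(x, w, b):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    squashed through a sigmoid activation into the range (0, 1)."""
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2, 3.0])  # example inputs
w = np.array([0.4, 0.7, -0.2])  # one weight per input
b = 0.1
result = neuron(x, w, b)
print(result)  # ≈ 0.24
```

A network is just layers of these neurons wired together, with learning amounting to adjusting the weights and biases.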
This is part 1 of the Universal function approximators series; part 2 is also available for reading.
“Universal function approximators” sounds pretty happy, right?
I don’t know about you, but for me anything that begins with “Universal” makes me happy, because I only have to learn that one thing and then go on and apply it to pretty much everything. So does this mean we can approximate any function in the universe? Given enough data, we could :)
Let’s get to work, shall we?
First of all, we need a function that nobody knows about. Since nobody knows what it is, we have to…
It was a love story.
Back in 2015 I fell in love with a popular PaaS, Red Hat OpenShift. It was my first PaaS of this kind, so it felt special; I was deploying applications in my dreams. Node.js applications with MongoDB databases and all those cartridges — it felt so dreamy.
Like in all love stories, money came into the equation, and I decided to do some freelance work.
One of my friends introduced me to a client who wanted to plant 1,000,000 plants. After several meetings and brainstorming sessions with the team, we came up with naturenurture.lk.
Certified dreamer, Award winning procrastinator