# Feel calculus with your programming sense

I believe people who can code have the ability to understand concepts far more complicated than calculus. The reason so many talented programmers still struggle with calculus is not that it is hard. It is mostly that we were taught it the wrong way (along with many other topics).

If you have done any coding in your life, even toy exercises like generating prime numbers, you have probably already done things very similar to what happens in calculus, but never had the chance to relate the two.
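To make that concrete, here is a sketch of my own (not from the article): the kind of accumulation loop many programmers have written without realizing it is an integral.

```python
# Summing tiny slices of x**2 between 0 and 1 -- an ordinary accumulation
# loop, and also exactly what the integral of x**2 from 0 to 1 computes.
dx = 0.0001          # width of each thin slice
total = 0.0
x = 0.0
while x < 1.0:
    total += x * x * dx   # area of one thin rectangle
    x += dx
print(total)  # close to 1/3, the exact value of the integral
```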

The objective of this writing is neither…

# Normal Distribution — an intuitive introduction without math

I will try to keep this article as free from equations and jargon as possible. I do, however, need you to have at least the following capabilities:

1. Able to interpret simple graphs
2. Elementary knowledge of probability. You at least understand that there is a 50% probability of heads if I toss a fair coin.
3. Integral calculus and quantum physics (kidding!)

## Why Normal Distribution

An intuitive way to understand something is to investigate why it is needed. Let's do that for the normal distribution (also called the Gaussian distribution).

Say, you have a funny habit. Every day you toss a coin 100…
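Although the preview cuts off here, the thought experiment is easy to simulate. A minimal sketch (my own, assuming a fair coin and 10,000 repeated days):

```python
import random
from collections import Counter

random.seed(0)                      # reproducible
days = 10_000
# Each day: toss a fair coin 100 times and count the heads
counts = [sum(random.random() < 0.5 for _ in range(100)) for _ in range(days)]

hist = Counter(counts)              # how many days produced each head-count
# Days cluster around 50 heads; extreme counts are rare -- the bell shape
print(hist[50], hist[40])
```

Plotting `hist` gives the familiar bell curve that the normal distribution describes.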

# An intuitive explanation of how meaningless filters in CNN take meaningful shapes

Prerequisites: I need you to have some basic understanding of Convolutional Neural Networks. It is okay if you don’t understand the backpropagation in CNN yet. But of course, you need to have a reasonably clear understanding of how backpropagation works in a fully connected network. Have a look here if that is not clear to you yet.

## The Question

You might know by now that in a 2D CNN, filters are basically matrices initialized with random values. Through training, these random matrices take on meaningful shapes.

-How?
-Through backpropagation.
-Okay, but how?

Backpropagation in Deep Learning is the same…
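The explanation is cut off above, but the core idea fits in a toy sketch of mine (not the article's code, and deliberately simpler than real backpropagation through a convolution): start the weights at random and let gradient descent pull them toward whatever shape lowers the loss.

```python
import random

random.seed(1)
target = [1.0, 0.0, -1.0]                         # a "meaningful" edge-detector shape
filt = [random.uniform(-1, 1) for _ in range(3)]  # random initialization, like a CNN filter

lr = 0.1
for step in range(200):
    # Gradient of the mean-squared error between the filter and the target
    grad = [2 * (f - t) / len(filt) for f, t in zip(filt, target)]
    filt = [f - lr * g for f, g in zip(filt, grad)]

print([round(f, 3) for f in filt])  # converges toward [1.0, 0.0, -1.0]
```

In a real CNN there is no explicit `target` for the filter; the gradient comes from the task loss. But the mechanism of nudging random weights step by step toward something meaningful is the same.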

# Translational Invariance and Translational Equivariance in CNN

I found so many contradictory arguments on the internet that I ended up even more confused. Other explanations were too long or too technical for such a simple question.

So here, in simple words, is what I understood from all those sources.

## Translational Invariance

It simply means that applying a translation does not change the result.

## Convolution is NOT Invariant with respect to Translation

Yes, you read that right. Imagine applying a translation to the input: the convolution operation (multiplying each pixel by the corresponding element of the filter, then taking the sum of all the products) will surely…
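A one-dimensional toy example (mine, not the article's) makes the distinction visible: shifting the input shifts the convolution output by the same amount. That is equivariance, not invariance.

```python
def conv1d(signal, kernel):
    """Valid-mode correlation: slide the kernel, multiply, and sum."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

signal  = [0, 0, 1, 2, 1, 0, 0, 0]
shifted = [0, 0, 0, 0, 1, 2, 1, 0]   # same pattern, moved 2 to the right
kernel  = [1, 0, -1]                 # a simple edge detector

a = conv1d(signal, kernel)
b = conv1d(shifted, kernel)
print(a)
print(b)  # same response values as `a`, shifted right by 2 -- equivariance
```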

# Speech Recognition with PyTorch for beginners

## Setup The Environment

You can use your favorite IDE, or mine — PyCharm.

If you need help setting up PyCharm, have a look here.

You will also need to install PyTorch by running a simple command in the terminal; see https://pytorch.org/ for more information. As an example, if you are using Windows with pip, for the stable version (1.8.1) at the time of writing this article, you need to run:

```shell
pip3 install torch==1.8.1+cpu torchvision==0.9.1+cpu torchaudio===0.8.1 -f https://download.pytorch.org/whl/torch_stable.html
```

## The Project

Using PyTorch's SPEECHCOMMANDS dataset, which includes 35 voice commands (down, follow, forward, etc.), we will build a command recognizer.

# Feel Euler’s Number e with your programming sense — no equation

Sometimes programmers understand concepts better from code than from technical talks, or even worse, painful analogies.

So, I will first give here the code to evaluate e, and then start the explanation (which might not be needed after you see the code):

```python
STEP = 100
GROWTH = 1
GROWTH_RATE = GROWTH / STEP
e = 1
for i in range(STEP):
    e = e + e * GROWTH_RATE
print(e)
```

That’s it!

If you plot a graph of e for all values of STEP between 1 and 100, it will look like this:

Here is the code to plot the graph…
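The article's own plotting code is truncated here, but the values behind the graph can be reproduced with a short loop (a sketch using the same STEP/GROWTH idea as above); passing `values` to matplotlib's `plot` then draws the curve.

```python
def approximate_e(steps):
    """Grow 1.0 by a rate of 1/steps, `steps` times: (1 + 1/steps) ** steps."""
    growth_rate = 1 / steps
    e = 1.0
    for _ in range(steps):
        e = e + e * growth_rate
    return e

# One point per STEP value; the curve rises quickly, then flattens near e
values = [approximate_e(s) for s in range(1, 101)]
print(values[0], values[-1])  # 2.0 at STEP=1, about 2.7048 at STEP=100
```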

# Deep learning journey — things to avoid

Earlier I wrote an article suggesting what to DO when you start your journey on deep learning. This article will be on what NOT TO DO.

1. Don't be over-obsessed with terminology

Trust me: you can learn deep learning at a deep level without knowing the perfect formal definitions of the terms Machine Learning, Data Science, Artificial Intelligence, Deep Learning, Big Data, Data Mining, blah blah blah, and their differences or interrelationships. Although some high-level understanding is helpful, it is okay if you can't initially figure out what type of artificial intelligence is not machine learning, and how…

# Debug Python code in PyCharm with conditional breakpoint

One reason I'm such a big fan of PyCharm (or any IDE from JetBrains) is its smart debugger.

Consider a simple example:

```python
import random

for i in range(1000):
    x = random.randint(1, 100)
    print(x)
```

To pause execution at some point (say, the last line), just click beside the line number:
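A conditional breakpoint pauses only when an expression you supply is true. In plain Python terms (a sketch of mine; the real feature lives in the IDE's breakpoint dialog), the condition plays the role of the `if` below:

```python
import random

random.seed(0)
hits = 0
for i in range(1000):
    x = random.randint(1, 100)
    if x == 42:     # the condition you would type into PyCharm's breakpoint dialog
        hits += 1   # a breakpoint here would pause only on these iterations
print(hits)         # the debugger would stop this many times, not 1000
```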

# Silence Trimmer — Your first speech/audio processing exercise in Python

One great thing about Python is that there are so many ways to do one thing, which at times is also a curse.

For example, consider a case where you have to perform a series of simple operations on an audio dataset: trimming silence, calculating the average duration, and then making every clip the same length as that average, trimming the longer ones and padding the shorter ones. There are many cool libraries that serve some or all of these purposes, and for a beginner it creates confusion about which one to use for what.

I…
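The pipeline described above fits in a few lines of dependency-free Python (a sketch of mine; a real project would likely reach for a library such as librosa, pydub, or torchaudio instead):

```python
def trim_silence(samples, threshold=0.01):
    """Drop leading/trailing samples whose amplitude is below the threshold."""
    loud = [i for i, s in enumerate(samples) if abs(s) >= threshold]
    if not loud:
        return []
    return samples[loud[0]:loud[-1] + 1]

def pad_or_trim(samples, length):
    """Force a clip to exactly `length` samples: cut the end or zero-pad."""
    if len(samples) >= length:
        return samples[:length]
    return samples + [0.0] * (length - len(samples))

# Trim silence, compute the average duration, then equalize all lengths
clips = [trim_silence(c) for c in [[0.0, 0.3, 0.5, 0.0], [0.0, 0.0, 0.7, 0.2]]]
average = sum(len(c) for c in clips) // len(clips)
clips = [pad_or_trim(c, average) for c in clips]
print([len(c) for c in clips])  # every clip now has the average length
```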

# Custom dataset in PyTorch

PyTorch provides a dataset class (torch.utils.data.Dataset) and a dataloader class (torch.utils.data.DataLoader). You could write a full-fledged, commercial-grade application without ever using them. Why do we need them, then?

You can find the answer in the comparison below: both do the same thing, one with and one without PyTorch's dataset/dataloader classes. You don't need to understand the code; I just need you to agree with me that the right-hand one looks much cleaner.

In short, we use PyTorch's dataset and dataloader mainly to get rid of the complexity of keeping track of variables for batches, epochs, etc., as well as those nasty index calculations. …
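A minimal sketch of the Dataset idea in plain Python (my own toy, not the article's example): any object with `__len__` and `__getitem__` fits the protocol. In real code you would subclass torch.utils.data.Dataset and hand the object to a DataLoader, which then does the batch and index bookkeeping for you.

```python
class SquaresDataset:
    """Hypothetical toy dataset: sample i is the pair (i, i * i)."""
    def __init__(self, n):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        if not 0 <= idx < self.n:
            raise IndexError(idx)
        return idx, idx * idx

ds = SquaresDataset(10)
# The nasty index arithmetic that a DataLoader would hide from you:
batch_size = 4
batches = [[ds[i] for i in range(start, min(start + batch_size, len(ds)))]
           for start in range(0, len(ds), batch_size)]
print(len(batches))  # 3 batches: sizes 4, 4, and 2
```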

## Noobest

Jack of a few trades, master of none.
