I am **Saumya Ranjan Nayak**

Goal: **To solve a problem on Earth using Neural Nets.**

I am fascinated and driven by the vision of a future where man and machine work together, expanding the horizons of mankind's knowledge. I am keen to make that happen and to impact lives through the efficient and ethical use of technology.

| Birthday | 15-12-2000 |
|---|---|
| Email | futuredrivenme@gmail.com |
| Website | www.thecsengineer.com |

I have decided to share my journey of learning Machine Learning, so I keep creating and uploading informational videos on my channel, Future Driven.

Do subscribe, and let's grow together!

Here I explain what Machine Learning is, in a very simple way:

Thank you!!

Hey there! Welcome to the blog. Here I am, attending another awesome event organized by Major League Hacking after the NewYearNewHackDay Hackathon, where I presented our app; you can watch that HERE.

I have joined a beautiful community of tech enthusiasts called EquiCode as a community influencer, and the most awesome thing is that EquiCode is participating as a guild in the ongoing Local Hack Day event by Major League Hacking. I would definitely suggest that anyone interested in programming or the tech community register for the event HERE.

**What is Local Hack Day 2021?**

So, this is how it works: after registering for the contest, you join Major League Hacking's Discord server, where you will find many guilds. I would definitely suggest you join our guild, EquiCode, as we are one of the top guilds on the leaderboard, currently holding the 3rd spot.

After you join a guild, you will find a lot of enthusiasts like you who want to learn and grow.

To earn points, you and your teammates have to register and check in daily for very simple tasks, where you learn a lot from the people at MLH, and then submit forms showing that you have completed the challenge. You earn points for yourself as well as your guild. It's as simple as that!

At the end of the day you can see both the team and individual leaderboards HERE, and work with your guild mates to climb to the top by earning more points.

**My Experience:**

My guild mates at EquiCode and I are learning a lot throughout this process of attending workshops, earning points and taking our guild to the top.

I have come to know and connect with some awesome people in just this span of 2-3 days of chatting around the Discord channels. Today especially was great, as we hopped on a video call to complete a challenge named "Share the Meal". It turned out to be pretty awesome: we finally discovered the faces behind the Discord usernames, and our screenshots of the call ended up in MLH's Instagram stories. So, that was pretty great.

Honestly, I love our EquiCode community a lot, and I would love to be a part of it for years to come and help make it big and impactful.

Here's a glimpse of the happy faces :

Here's the leaderboard after Day 2 :

It may have been updated by the time you read this, so you can see the live version HERE.

Also, according to the individual rankings as of now, around 5-6 people from our guild are in the top 30 individuals, which is pretty awesome. I was there after Day 1, so now I have to work a bit harder, I guess.

**At Last:**

It's going pretty great, and my team and I will make sure that EquiCode wins Local Hack Day 2021, or at least tries its best.

At last, a big thank you to Major League Hacking for arranging such an awesome event; we are all having a lot of fun.

Thanks for reading it so far !! ✔️🧡

Hey there! So recently I took part in my first hackathon, organized by Major League Hacking, and this is how it went.

**How did I end up hacking on a New Year weekend?**

I got an email about the New Year New Hack Day Hackathon, and as you might guess, like most programmers I didn't have anything exciting to do on New Year's weekend, so I signed up for it.

I saw someone asking for a Flutter developer, and I knew I was not that good, but I thought: who am I to judge myself? Let's get in and see what happens, and not lose such an opportunity. So I responded to him, and in no time we were a team of four.

**The Execution:**

As the time limit was less than 24 hours, we hopped on a video call, and after exchanging some ideas we finalized that we would make a fitness app that tracks the user's steps, shows their progress, and ranks them against their friends. It also has levels that unlock as you complete specific step milestones.

I also named the app FitYear, and everyone found it great, so we finalized it.

I pitched in to make the three intro screens that show up the first time the user opens the app, and to edit and upload the presentation video, which I did just thirty minutes before the deadline. Everyone liked the video!

Here's the pitch video :

My teammates did a great job making the UI and connecting it to the Google Firebase backend, which made the app more dynamic.

Here's the GitHub link for the project; if you like it, do Star🌟, Fork 🍴 or Watch 👀 it.

**The Result :**

As the hackathon was about simple projects from around the world, the result came out approximately 3 hours after submission, and guess what: we did not win. But I think it was a great way to start the year, and a confidence booster for the year ahead. I guess we will get some swag, like stickers, for presenting our app in the competition.

That's how my first hackathon went, you can also share your thoughts in the comments below. I would love to read those.

I just realized this is the first blog of 2021.

**Happy New Year & Have a great 2021 !! ✔️🧡 **

Hi, welcome to the blog! After a good response to the post where I implemented and explained the univariate (single variable) version of these algorithms, here is another walkthrough tutorial of how this works when there are multiple variables and we want to predict something.

Here we have 2 features, the size (i.e. the area) of the house and the number of rooms, and our goal is to predict the price for an unseen input, which we do at the end of the tutorial.

If you are new to these algorithms and want to know their formulas and the math behind them, I have covered that in this Machine Learning Week 1 Blog.

If you are going through Andrew Ng's Machine Learning course and want to learn how to implement it in Python instead of Octave, which is used in the course, then you have come to the right place.

Let's get started !!

From the Machine and Deep Learning Repository :

Hi there, I am Saumya, and in this notebook I have implemented Multiple Variable Linear Regression and Gradient Descent from scratch, with an explanation of every step and line.

**I hope you have a great time going through it !! ✔️**

In [26]:

```
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
```

In [27]:

```
data = pd.read_csv('multLinRData.txt', header=None)
data.head()
```

Out[27]:

|   | 0 | 1 | 2 |
|---|---|---|---|
| 0 | 2104 | 3 | 399900 |
| 1 | 1600 | 3 | 329900 |
| 2 | 2400 | 3 | 369000 |
| 3 | 1416 | 2 | 232000 |
| 4 | 3000 | 4 | 539900 |

Here we used pandas to read our text file, and the 'head' function to see the first 5 entries of our data.
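Since the file has no header row, pandas labels the columns 0, 1 and 2. If you prefer descriptive labels, `read_csv` also accepts a `names` parameter; here is a minimal sketch, using a small in-memory sample in place of multLinRData.txt (the column names are my own choice, not from the notebook):

```python
import io
import pandas as pd

# Small in-memory stand-in for multLinRData.txt
sample = io.StringIO("2104,3,399900\n1600,3,329900\n2400,3,369000")

# names= supplies column labels since the file itself has no header row
df = pd.read_csv(sample, header=None, names=["Size", "Rooms", "Price"])
print(df.columns.tolist())  # → ['Size', 'Rooms', 'Price']
```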

In [28]:

```
fig, axes = plt.subplots(figsize=(12, 4), nrows=1, ncols=2)
axes[0].scatter(data[0], data[2])
axes[0].set_xlabel("Size")
axes[0].set_ylabel("Price")
axes[0].set_title("Size-Price Plot")

axes[1].scatter(data[1], data[2], color='b')
axes[1].set_xlabel("No. of Rooms")
axes[1].set_ylabel("Price")
axes[1].set_title("No. of Rooms - Price Plot")

plt.tight_layout()
```

Line 1: Here we use the 'subplots' function, which stores multiple graphs in a sort of array, here called axes. We set the number of rows to 1 and the number of columns to 2, like a 2D grid of 1x2 dimensions, so there is one plot in each column of the only row. Thus, the first and second plots are referred to as axes[0] and axes[1] respectively.

Lines 2 & 7: These take the data and plot the blue dots that you see, in a scatter format, as the name suggests.

Lines 3, 4 & 8, 9: These label the axes.

Lines 5 & 10: These add the titles.

Line 12: This fits the multiple graphs in the figure area.

This StackOverflow answer may help : Link

In [29]:

```
def CostFunction(X, y, theta):
    m = len(y)
    J = 1/(2*m) * np.sum((X.dot(theta) - y)**2)
    return J
```

**Cost Function:**

This function takes in the input array (X), the output array (y) and the parameters (theta), and computes the Cost Function formula that we learnt in my ML WEEK 1 BLOG.
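As a quick sanity check of the formula, here is a tiny worked example on made-up numbers (not the notebook's data): with theta at zero the errors are just -y, so the cost is (2² + 3²) / (2·2) = 3.25.

```python
import numpy as np

def CostFunction(X, y, theta):
    m = len(y)
    return 1/(2*m) * np.sum((X.dot(theta) - y)**2)

# Made-up toy data: two examples, a bias column plus one feature
X = np.array([[1.0, 1.0], [1.0, 2.0]])
y = np.array([[2.0], [3.0]])
theta = np.zeros((2, 1))

print(CostFunction(X, y, theta))  # → 3.25
```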

In [30]:

```
def GradientDescent(X, y, theta, learningRate, iterations):
    m = len(y)
    J_list = []
    for i in range(iterations):
        theta = theta - learningRate*(1/m)*(X.transpose().dot(X.dot(theta) - y))
        J_list.append(CostFunction(X, y, theta))
    return theta, J_list
```

**Gradient Descent on Linear Regression Function:**

Here we implement the formula of Gradient Descent that I explained in the BLOG. The function takes X, y, theta, the learning rate and the number of iterations as parameters.

Inside, we maintain a list of the cost function values for the varying theta at every iteration, for visualising the descent later.
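To see the descent in action, here is a toy run on made-up data following y = 2x (with the two functions repeated so the snippet runs on its own); the cost should shrink and theta should approach [0, 2].

```python
import numpy as np

def CostFunction(X, y, theta):
    m = len(y)
    return 1/(2*m) * np.sum((X.dot(theta) - y)**2)

def GradientDescent(X, y, theta, learningRate, iterations):
    m = len(y)
    J_list = []
    for i in range(iterations):
        theta = theta - learningRate*(1/m)*(X.transpose().dot(X.dot(theta) - y))
        J_list.append(CostFunction(X, y, theta))
    return theta, J_list

# Made-up data following y = 2x, with a bias column of ones
X = np.array([[1, 1], [1, 2], [1, 3], [1, 4]], dtype=float)
y = np.array([[2], [4], [6], [8]], dtype=float)

theta, J_list = GradientDescent(X, y, np.zeros((2, 1)), 0.1, 500)
print(J_list[0] > J_list[-1])   # the cost decreased over the iterations
print(round(theta[1, 0], 2))    # the slope, close to 2.0
```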

In [31]:

```
def FeatureNormalization(X):
    mean = np.mean(X, axis=0)
    stdDev = np.std(X, axis=0)
    X_Normalized = (X - mean)/stdDev
    return X_Normalized
```

**Feature Normalization Function:**

This function normalizes, or rescales, our input data to a small range, which makes Gradient Descent converge faster.

We do that by updating every value of a feature: subtracting the mean of all the values of that feature from each value, and dividing the result by the standard deviation of that feature. (A common alternative is to divide by the range, i.e. the max value minus the min value of the feature; the code above uses the standard deviation.)
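A quick check on a made-up feature column: after this transformation the column has mean 0 and standard deviation 1, regardless of its original scale.

```python
import numpy as np

def FeatureNormalization(X):
    mean = np.mean(X, axis=0)
    stdDev = np.std(X, axis=0)
    return (X - mean) / stdDev

# Made-up feature column on a large scale
X = np.array([[1000.0], [2000.0], [3000.0]])
X_norm = FeatureNormalization(X)
print(X_norm.ravel())  # values centred at 0, roughly [-1.22, 0, 1.22]
```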

In [32]:

```
data_arr = data.values
m = len(data_arr[:, -1])
X = data_arr[:, 0:2].reshape(m, 2)
y = data_arr[:, -1].reshape(m, 1)
Norm_X = FeatureNormalization(X)
Norm_X = np.append(np.ones((m, 1)), Norm_X, axis=1)
theta = np.zeros((3, 1))
CostFunction(Norm_X, y, theta)
```

Out[32]:

65591548106.45744

Line 1: We extract the data from the dataframe into a NumPy array; think of it as a table holding a copy of our data.

Line 2: We get the number of training examples, i.e. the number of rows.

Line 3: Here we store the inputs in a separate array, X, of m x 2 dimensions.

Line 4: Here we store the outputs in a separate array, y, of m x 1 dimensions.

Line 5: Here we call the Feature Normalization function, which normalizes the input data.

Line 6: Here we add an additional column of 1s at the front, and the X matrix becomes m x 3.

Line 7: Here we create the theta array, which is just a 3x1 matrix of zeros.

Line 8: Here we call the function that computes the cost for this initial theta.
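The shape bookkeeping in those lines can be checked on a tiny made-up array standing in for data.values:

```python
import numpy as np

# Made-up stand-in for data.values: 3 examples, 2 features + 1 target
data_arr = np.array([[2104, 3, 399900],
                     [1600, 3, 329900],
                     [2400, 3, 369000]])
m = len(data_arr[:, -1])
X = data_arr[:, 0:2].reshape(m, 2)
y = data_arr[:, -1].reshape(m, 1)

# Prepend a column of ones for the intercept term theta_0
X = np.append(np.ones((m, 1)), X, axis=1)
print(X.shape, y.shape)  # → (3, 3) (3, 1)
```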

In [33]:

```
opt_theta, J_List = GradientDescent(Norm_X, y, theta, 0.1, 400)
print("h(x) = " + str(round(opt_theta[0,0], 2)) + " + " + str(round(opt_theta[1,0], 2)) + "x1 + " + str(round(opt_theta[2,0], 2)) + "x2")
```

h(x) = 340412.66 + 109447.8x1 + -6578.35x2

Line 1: Here we just call the function with the appropriate values, learning rate = 0.1 and number of iterations = 400, and we get back the (near-)optimal theta (opt_theta) and the list of cost values.

Line 2: Here we use the obtained optimal theta to display the **Multivariate Hypothesis Function.**

In [34]:

```
plt.plot(J_List)
plt.xlabel("Iteration")
plt.ylabel("$J(\Theta)$")
plt.title("Cost function using Gradient Descent")
```

Out[34]:

Text(0.5, 1.0, 'Cost function using Gradient Descent')

**Cost Function vs Iteration Plot**

In [35]:

```
def Predict(x, theta):
    prediction = theta.transpose().dot(x)
    return prediction[0]
```

**Prediction Function**

This function gives the prediction according to the formula: h(x) = θᵀx = θ₀ + θ₁x₁ + θ₂x₂.

In [36]:

```
mu = np.mean(X, axis=0)
sigma = np.std(X, axis=0)
x_sample = (np.array([1650, 3]) - mu) / sigma
x_sample = np.append(np.ones(1), x_sample)
predict3 = Predict(x_sample, opt_theta)
print("We predict that for the size 1650 and the no. of rooms 3 the price will be $" + str(round(predict3, 0)))
```

We predict that for the size 1650 and the no. of rooms 3 the price will be $293081.0

Note that the sample has to be normalized with the mean and standard deviation of the *training* set (computed from X above), not with its own statistics, and that the model predicts a price, since that is what we trained on.

I hope you learnt a lot! If you want the Jupyter notebook code, you can find it in my Machine and Deep Learning repository on GitHub. If you like the repository, do Star🌟, Fork 🍴 or Watch 👀 it.

Thank You and waiting for your comments !!✔️🧡

Qualified for Facebook Hacker Cup 2020 Round 1

2016
##### Boards 10th - ICSE

Scored 80.3%

2016 - 2018
##### High School 12th - CBSE

Scored 83.2%

2019- Present
##### University

Currently I am pursuing a Bachelors in Computer Science, trying to get the most out of my time and achieving small goals on my way to the seemingly bigger ones.