# Linear Regressions

```elixir
Mix.install([
  {:nx, "~> 0.5"},
  {:scholar, "~> 0.1"}
])
```

`:ok`

## Data

We are going to run a simple experiment: extrapolating the formula $c = 2a - 3b + 5$ from a few points on its graph.

```elixir
data =
  Nx.tensor([
    [1, 1, 4],
    [1, 2, 1],
    [1, 3, -2],
    [2, 1, 6],
    [3, 1, 8],
    [4, 1, 10],
    [6, 2, 11],
    [7, 6, 1],
    [8, 2, 15]
  ])
```

```
#Nx.Tensor<
  s64[9][3]
  [
    [1, 1, 4],
    [1, 2, 1],
    [1, 3, -2],
    [2, 1, 6],
    [3, 1, 8],
    [4, 1, 10],
    [6, 2, 11],
    [7, 6, 1],
    [8, 2, 15]
  ]
>
```

Take the example row `[4, 1, 10]` from the data above. It can be *verified* against the formula as:

```
c = 2a - 3b + 5
  = 2(4) - 3(1) + 5
  = 8 - 3 + 5
  = 10
```
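The same check can be run over the whole dataset. A minimal sketch in plain Elixir (no Nx needed), using the same rows as above:

```elixir
# Each row is [a, b, c]; confirm that c = 2a - 3b + 5 holds for every row.
rows = [
  [1, 1, 4],
  [1, 2, 1],
  [1, 3, -2],
  [2, 1, 6],
  [3, 1, 8],
  [4, 1, 10],
  [6, 2, 11],
  [7, 6, 1],
  [8, 2, 15]
]

Enum.all?(rows, fn [a, b, c] -> 2 * a - 3 * b + 5 == c end)
# => true
```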

## Extract Features and Labels

To build our inference model, in this case a linear regression, we need to split `a` and `b` (the features, also known as `x`) from `c` (the label, or `y`).

```elixir
features = data[[.., 0..1]]
label = data[[.., 2]]
{features, label}
```

```
{#Nx.Tensor<
   s64[9][2]
   [
     [1, 1],
     [1, 2],
     [1, 3],
     [2, 1],
     [3, 1],
     [4, 1],
     [6, 2],
     [7, 6],
     [8, 2]
   ]
 >,
 #Nx.Tensor<
   s64[9]
   [4, 1, -2, 6, 8, 10, 11, 1, 15]
 >}
```
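The access syntax above slices the tensor by column ranges. The same split can also be expressed with Nx's slicing functions — a sketch, assuming the `data` tensor defined earlier:

```elixir
# Columns 0..1 (a and b) are the features; column 2 (c) is the label.
features = Nx.slice_along_axis(data, 0, 2, axis: 1)

# Slicing keeps the axis, so squeeze it away to get a flat vector of labels.
label =
  data
  |> Nx.slice_along_axis(2, 1, axis: 1)
  |> Nx.squeeze(axes: [1])

{features, label}
```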

## Train the Model

Now we can use Scholar to run a linear regression on the data.

```elixir
alias Scholar.Linear.LinearRegression, as: LR
model = LR.fit(features, label)
```

```
%Scholar.Linear.LinearRegression{
  coefficients: #Nx.Tensor<
    f32[2]
    [1.9954067468643188, -2.999237060546875]
  >,
  intercept: #Nx.Tensor<
    f32
    5.015231132507324
  >
}
```

## Predict The Result

Since this is a simple model, we will just run a few manual tests. The outputs look very accurate, with a fitted model of approximately $c = 2.00a - 3.00b + 5.02$.

```elixir
for [a, b] <- [[0, 0], [4, 2], [5, 5], [10, 10]] do
  answer = LR.predict(model, Nx.tensor([a, b])) |> Nx.to_number() |> Float.round(2)
  IO.puts("a #{a} and b #{b} => #{answer}")
  answer
end

:ok
```

```
a 0 and b 0 => 5.02
a 4 and b 2 => 7.0
a 5 and b 5 => 0.0
a 10 and b 10 => -5.02
```

`:ok`
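For comparison, the original formula evaluated at the same test points (plain Elixir, no model needed) gives values that agree with the predictions above up to the small error in the fitted weights:

```elixir
# c = 2a - 3b + 5 evaluated at the same points as the predictions above.
for [a, b] <- [[0, 0], [4, 2], [5, 5], [10, 10]] do
  2 * a - 3 * b + 5
end
# => [5, 7, 0, -5]
```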