Objective Function
The objective function for logistic regression is the log likelihood. Assume we have $N$ samples in our dataset, $y_i \in \{0, 1\}$ is the true label for the $i$-th sample, and $p(y_i \mid f1_i, f2_i, f3_i)$ is the predicted probability that the $i$-th sample belongs to class 1. The log likelihood is then:
$L(\theta) = \sum_{i=1}^{N} \left[ y_i \log\left(p(y_i \mid f1_i, f2_i, f3_i)\right) + (1 - y_i) \log\left(1 - p(y_i \mid f1_i, f2_i, f3_i)\right) \right]$
Here, $\theta$ denotes the parameters of the model.
Loss Function
The negative log likelihood is commonly used as the loss function in logistic regression. It is simply the negation of the log likelihood:
$J(\theta) = -L(\theta)$
So, the loss function for binary classification with logistic regression becomes:
$J(\theta) = -\sum_{i=1}^{N} \left[ y_i \log\left(p(y_i \mid f1_i, f2_i, f3_i)\right) + (1 - y_i) \log\left(1 - p(y_i \mid f1_i, f2_i, f3_i)\right) \right]$
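This loss can be sketched in a few lines of NumPy. The function below is a minimal illustration, assuming the three features are stacked into an $N \times 3$ matrix `X` (columns $f1, f2, f3$), the predicted probability comes from a sigmoid over a linear score $X\theta$ with no intercept term, and probabilities are clipped to avoid $\log(0)$; the names `negative_log_likelihood`, `X`, `y`, and `theta` are hypothetical, not from the original text.

```python
import numpy as np

def negative_log_likelihood(theta, X, y):
    """Negative log likelihood J(theta) for binary logistic regression.

    theta : (3,) weight vector (no intercept, for simplicity).
    X     : (N, 3) feature matrix with columns f1, f2, f3 (assumed layout).
    y     : (N,) array of 0/1 labels.
    """
    # Predicted probability of class 1 for each sample: sigmoid(X @ theta).
    p = 1.0 / (1.0 + np.exp(-X @ theta))
    # Clip probabilities so log() never sees exactly 0 or 1.
    eps = 1e-12
    p = np.clip(p, eps, 1.0 - eps)
    # J(theta) = -sum_i [ y_i log p_i + (1 - y_i) log(1 - p_i) ]
    return -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
```

With $\theta = 0$ every predicted probability is $0.5$, so the loss reduces to $N \log 2$ regardless of the labels, which is a convenient sanity check.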