Factor of 2 in objective for DP logistic regression #71

Open

fl16180 opened this issue Aug 29, 2022 · 0 comments

fl16180 commented Aug 29, 2022

Hi, I wanted to ask about a factor of 2 in the implementation, specifically the conversion \Lambda = 1 / (2 n C) discussed in a previous PR #10.

In Corollary 11 of [CMS18], the regularization term is N(f) = 1/2 ||f||^2, so their proof already accounts for the 1/2 factor in the objective formulation, that is, J = (1/n) sum(loss) + (1/2) \Lambda ||f||^2. This matches the sklearn objective, J = C * sum(loss) + (1/2) w^T w. If that is the case, then the conversion should be \Lambda = 1 / (n C), equivalently \Lambda = alpha / n, which implies that `0.5 * alpha` in your code should just be `alpha`, e.g.:

```python
epsilon_p = self.epsilon - 2 * np.log(1 + self.function_sensitivity * self.data_sensitivity /
```
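For what it's worth, here is a minimal numerical sketch of the claimed conversion (toy data and names are my own, not the library's code): with \Lambda = 1 / (n C), the objective (1/n) sum(loss) + (\Lambda/2) ||w||^2 and the sklearn-style objective C * sum(loss) + (1/2) ||w||^2 differ only by the constant factor n*C, so they share a minimizer.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n) * 2 - 1  # labels in {-1, +1}
C = 0.7
lam = 1 / (n * C)  # proposed conversion Lambda = 1 / (n C)

def log_loss(w):
    # per-sample logistic loss log(1 + exp(-y * <x, w>))
    return np.log1p(np.exp(-y * (X @ w)))

def j_cms(w):
    # CMS-style objective: (1/n) sum(loss) + (Lambda/2) ||w||^2
    return log_loss(w).mean() + 0.5 * lam * (w @ w)

def j_sklearn(w):
    # sklearn-style objective: C * sum(loss) + (1/2) w^T w
    return C * log_loss(w).sum() + 0.5 * (w @ w)

w_cms = minimize(j_cms, np.zeros(d), tol=1e-10).x
w_skl = minimize(j_sklearn, np.zeros(d), tol=1e-10).x
print(np.allclose(w_cms, w_skl, atol=1e-4))  # True: same minimizer
```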

Thank you
