Security and privacy are rapidly emerging as critical research areas.
Vulnerabilities in software are found and exploited almost every day, with
increasingly serious consequences (e.g., the massive Equifax data breach).
Moreover, our private data is increasingly at risk, so techniques that enhance
the privacy of sensitive data, known as privacy-enhancing technologies (PETs),
are becoming increasingly important. Machine learning (ML) is also increasingly
used to make decisions in critical sectors (e.g., health care, automation, and
finance). However, when these algorithms are deployed, the presence of
malicious adversaries is generally ignored.
This advanced topics class will tackle techniques related to all of these
themes. We will cover topics drawn from the following broad areas, depending on
student interests:

### Differential Privacy
- Basic properties and examples
- Advanced mechanisms
- Local differential privacy
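
As a taste of the basic mechanisms above, here is a minimal sketch of the
Laplace mechanism applied to a counting query. The function names
(`laplace_noise`, `private_count`) are illustrative, not from any course
material; the sketch assumes the standard fact that a counting query has
sensitivity 1.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(values, predicate, epsilon):
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices for epsilon-DP.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller `epsilon` means more noise and stronger privacy; the noisy counts are
unbiased, so averaging many independent releases concentrates around the true
count (at a corresponding privacy cost under composition).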
### Cryptographic Techniques
- Zero-knowledge proofs
- Secure multi-party computation
- Verifiable computation
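
To give a flavor of secure multi-party computation, here is a toy additive
secret-sharing sketch (names like `share` and `secure_sum` are illustrative,
not a real MPC library): each party splits its input into random shares that
sum to the input modulo a prime, and the parties can compute a joint sum
without any single party seeing another's raw input.

```python
import random

PRIME = 2**61 - 1  # field modulus (a Mersenne prime)

def share(secret, n):
    """Split `secret` into n additive shares mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

def secure_sum(all_inputs, n_parties=3):
    """Each party shares its input; summing shares position-wise and
    reconstructing yields the total, while each position's partial sum
    reveals nothing about any individual input on its own."""
    share_lists = [share(x, n_parties) for x in all_inputs]
    partial_sums = [sum(col) % PRIME for col in zip(*share_lists)]
    return reconstruct(partial_sums)
```

Real MPC protocols add malicious-security checks and support multiplication,
but the additive-sharing core is the same idea.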
### Language-Based Security
- Secure information flow
- Differential privacy
- Symbolic cryptography
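
The core idea of secure information flow can be sketched with a two-point
security lattice, LOW ⊑ HIGH: operations propagate the join of their operands'
labels, and LOW sinks reject HIGH data. This is a hypothetical dynamic-tracking
sketch (the class `Labeled` and both functions are invented for illustration);
language-based approaches typically enforce this statically via a type system.

```python
from dataclasses import dataclass

LOW, HIGH = 0, 1  # two-point security lattice: LOW is below HIGH

@dataclass
class Labeled:
    value: int
    label: int  # LOW or HIGH

def add(a, b):
    """The result's label is the join (max) of the operands' labels,
    so anything derived from a secret stays secret."""
    return Labeled(a.value + b.value, max(a.label, b.label))

def low_sink(x):
    """A LOW sink may only receive LOW-labeled data, preventing
    explicit flows from HIGH to LOW."""
    if x.label == HIGH:
        raise PermissionError("illegal flow: HIGH data to LOW sink")
    return x.value
```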
### Adversarial Machine Learning
- Training-time attacks
- Test-time attacks
- Model-theft attacks
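
As an example of a test-time attack, here is a sketch of the Fast Gradient
Sign Method (FGSM) against a plain logistic-regression model; the function
name and parameters are illustrative. The attacker perturbs each input feature
by `eps` in the direction that increases the model's loss, which can flip the
prediction with a small perturbation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm_attack(x, w, b, y, eps):
    """Fast Gradient Sign Method on logistic regression.

    For binary cross-entropy loss, d(loss)/dz = p - y and z = w.x + b,
    so the input gradient is (p - y) * w_i. Moving each feature by
    eps in the sign of that gradient maximally increases the loss
    under an L-infinity budget of eps.
    """
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    grad = [(p - y) * wi for wi in w]
    return [xi + eps * math.copysign(1.0, g) if g != 0 else xi
            for xi, g in zip(x, grad)]
```

Against deep networks the same recipe applies with the gradient computed by
backpropagation; training-time (poisoning) and model-theft attacks target the
pipeline at other stages.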