Sunday, July 30, 2006

PET Workshop - Day 2 (June 29, 2006)



It’s Thursday, the second day of the Privacy Enhancing Technologies Workshop, and I’m still adjusting to the jet lag. Below are notes from today’s sessions. But first, I’ll summarize the rest of the day:

Today’s meetings ended with a rump session, in which anyone could get up and talk about their new projects or proposals for 5 minutes. There was high-level research, low-level research, proposals for conference hosting, calls for research support, and even self-confessionals. Quite unlike anything I’ve seen before. It was simply a whirlwind of people and topics.

This evening the attendees went to dinner at a very cute hotel restaurant. It was a short bike ride from my B&B - ah, the freedom of bicycling in this wonderful town! The food was excellent and the table conversation even better. But what I found most interesting was how we began dinner with an aperitif of Pimm’s. Quite a curious little highballish drink that included strawberries, lemons, cucumbers, fresh mint, and who knows what else.

Now, back to the sessions.

Session 2: Privacy Policies

Presentation 1: Enhancing Consumer Privacy in the Liberty Alliance Identity Federation and Web Services Frameworks
- A systematic walkthrough of the frameworks to identify potential privacy breaches, followed by recommendations on how to fix them
- Liberty Alliance is a framework for federated identity
- Allows a user to connect to multiple sites using different logins, but based on a common core
- Example problem: introducing users to new groups may be a privacy violation
- Proposed solution: get user consent for every introduction


Presentation 2: Traceable and Automatic Compliance of Privacy Policies in Federated Digital Identity Management
- Investigates inter-organization management of identity information
- Describes policy “harmonization” mechanisms
- The approach relies on pre-specifying an ontology to model semantic relationships
- Uses privacy policy templates to ease policy specification
- Question from the audience: how can we address matching of schemas across organizations?
- Speaker comment: the goal is to allow service providers to exchange information about a user without contacting the user first
- Speaker comment: believes it can be done if the policies specify this is agreeable


Presentation 3: Privacy Injector: Automated Privacy Enforcement Through Aspects
- Wants to address two challenges, the first being: how can we “consistently” enforce a privacy policy throughout the life cycle of data?
- Uses an aspect-oriented programming (AOP) language, such as AspectJ (Java-based)
- Desirable properties of AOP:
i) modularization (i.e., decomposition of events/concerns)
ii) crosscutting concerns (which affect the whole organization)
iii) it does not change existing applications - works on bytecode, not source code
- Privacy Injector: an AOP application for administering sticky policies (a toy sketch follows below)
1) Tracks data by assigning metadata and storing structured policies
2) Attempts to ensure persistence and enforcement on the fly (e.g., querying the storage DB)
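The paper’s mechanism is AspectJ-based, but the interception idea can be illustrated with a plain Python decorator standing in for a pointcut/advice pair. A minimal sketch under my own assumptions - the registry, function names, and purpose strings are hypothetical, not from the paper:

```python
import functools

# Hypothetical sticky-policy registry: data id -> purposes its owner allowed.
STICKY_POLICIES = {"cust-42": {"billing"}}

def enforce_policy(func):
    """The 'advice': intercept any call that touches personal data."""
    @functools.wraps(func)
    def wrapper(data_id, purpose, *args, **kwargs):
        allowed = STICKY_POLICIES.get(data_id, set())
        if purpose not in allowed:
            raise PermissionError(f"purpose {purpose!r} not permitted for {data_id!r}")
        return func(data_id, purpose, *args, **kwargs)
    return wrapper

@enforce_policy  # the 'pointcut': every decorated access point gets checked
def read_record(data_id, purpose):
    return f"record {data_id} (read for {purpose})"

print(read_record("cust-42", "billing"))   # allowed
# read_record("cust-42", "marketing")      # would raise PermissionError
```

The appeal of doing this with AOP rather than decorators is property iii above: the checks can be woven into existing bytecode without touching application source.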


Presentation 4: A Systematic Approach to Automate Privacy Policies Within Enterprises
- HP Labs’ approach to “privacy-aware access control”
- Simple approach: intercept queries against the database so that purpose, consent, and actions (such as filtering / obfuscation) are applied at query runtime
- Query rewriting achieves this (think JDBC proxy; a toy example follows below)
- Extends to the business / enterprise level
- Prior examples include IBM/Tivoli Privacy Manager and IBM Hippocratic Databases

- HP takes an “identity management” perspective, which incorporates:
  - User provisioning & account management
  - Privacy-aware access control system
    - Manages consent & other preferences
  - Obligation management system
    - Privacy obligations and policies
- All of which sits on top of the data repositories
- Front-end access via a web portal
- Architecture based on a “Validator” and an “Enforcer”
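A toy illustration of the query-rewriting idea, in the spirit of a JDBC proxy. The table name, the consent_purposes column, and the string-based rewriting are all my own assumptions; a real implementation would rewrite the parsed query rather than the SQL text:

```python
# Toy "privacy-aware access control" rewriter: append a consent filter to
# every SELECT before it reaches the database. Table and column names are
# illustrative assumptions, not from the talk.

def rewrite_for_purpose(sql: str, purpose: str) -> str:
    """Restrict a SELECT on `customers` to rows whose owners consented."""
    if not sql.lstrip().upper().startswith("SELECT"):
        return sql  # this sketch only rewrites SELECTs
    clause = f"consent_purposes LIKE '%{purpose}%'"
    if " WHERE " in sql.upper():
        return f"{sql} AND {clause}"
    return f"{sql} WHERE {clause}"

print(rewrite_for_purpose("SELECT name, email FROM customers", "marketing"))
# -> SELECT name, email FROM customers WHERE consent_purposes LIKE '%marketing%'
```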


Session 3: Privacy in the Real World

Presentation 5: One Big File Is Not Enough: A Critical Evaluation of the Dominant Free-Space Sanitization Technique (Garfinkel & Malan)
- One model for overwriting a disk is to create one BIG file (i.e., pick a sector and continually grow the file)
- Alternatively, create lots of little files
- The big-file approach works assuming that you can access all sectors
- Unfortunately, it cannot access “slack space”: when a file is partially overwritten by a smaller file, a small amount of the original data remains
- Question: can we recover files in the slack space?
- Developed a technique to find the slack space and evaluated it against different erasing and secure-erasing techniques (a rough sketch of the signature-scanning step follows below)
- Experiments show you can find some of the slack space, the signatures of files, and reconstruct files
- Also looked at a Big+Little technique
- The big file deletes most “deleted” files, but:
  - many file names & times remain
  - there are times when complete files remain
- “Journaled” file systems are harder to sanitize
- Big conclusion: you need to work at the file-system level for proper / complete sanitization
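As a rough sketch of what the signature-scanning step might look like (the image path, chunk size, and signature list are my own illustrative assumptions, not the authors’ tool):

```python
# Scan a raw disk image for file signatures that survive in slack space.
# A real tool would walk the file system's sector map; this sketch just
# streams the image and looks for magic bytes. Signatures that straddle
# a chunk boundary are missed, which a careful scanner would handle.

SIGNATURES = {
    b"\xff\xd8\xff": "JPEG header",
    b"%PDF-": "PDF header",
}

def scan_image(path, chunk_size=4096):
    hits = []
    with open(path, "rb") as img:
        offset = 0
        while chunk := img.read(chunk_size):
            for magic, label in SIGNATURES.items():
                pos = chunk.find(magic)
                if pos != -1:
                    hits.append((offset + pos, label))
            offset += len(chunk)
    return hits

for off, label in scan_image("disk.img"):  # "disk.img" is a placeholder
    print(f"{label} at byte offset {off}")
```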


Presentation 6: Protecting Privacy with the MPEG-21 IPMP Framework
- Draws a parallel between copyright protection and privacy protection
- MPEG-21: specifications for using, manipulating, & navigating multimedia content
- MPEG-21 Intellectual Property Management and Protection (IPMP) provides digital rights management (DRM) protection
- The application requires:
1) A rights expression language for privacy specification
   - P3P: does not define an enforcement algorithm
   - EPAL & XACML: do not have a vocabulary
   - MPEG REL & ODRL (their choice): do not support elements used in privacy, so they extended the language
2) An enforcement system
3) A suite of tools for users / ease of use (web-based)

Presentation 7: Personal Rights Management
- Worried about cellphone-camera pictures being taken and published all over the place (blogs, Flickr, MySpace, etc.)
- Tries to let people control pictures of them taken by other people
- Legal controls exist in Europe and other jurisdictions
- Related work:
  - Artificial shutter noise
  - Camera blockers: restricted cameras (HP); blinding cameras with lasers (Georgia Tech)
  - Cell phone bans (policy-specific)
- Goals:
1) Defend against non-professionals (i.e., can’t handle hacking, telephoto lenses, etc.)
2) Protect the photographer as well (don’t infringe on the photographer’s privacy)
- Model: the camera sends pictures to a website (without the consent of the individual?)
- Digital watermarks vs. hashing of pictures (a hashing sketch follows below)
- Broadcast what is available on the web and allow me to find it and take it down
- Open problems: blogs and de-identification or trustworthiness, “beating people up with a script”, a public internet pillory
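To make the watermarks-vs-hashing trade-off concrete, here is a minimal average-hash sketch (using the Pillow imaging library; the function names and match threshold are my own assumptions). A perceptual hash like this survives recompression, which a plain cryptographic hash of the file bytes would not:

```python
from PIL import Image  # Pillow imaging library

def average_hash(path, size=8):
    """Downscale to size x size grayscale; each bit = pixel above the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A recompressed copy found on a blog should land within a few bits of
# the original; the threshold of 5 bits is an arbitrary illustration.
# if hamming(average_hash("mine.jpg"), average_hash("found.jpg")) < 5:
#     print("likely a copy of my picture")
```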


Session 4: Anonymity

Presentation 1: Improving Sender Anonymity in a Structured Overlay with Imprecise Routing
- Why study structured overlays? Good for building network services - routing predictably and efficiently converges; usually fault tolerant (see Chord)
- Goal: make it difficult for an adversary to determine which key was contributed (or is administered) by which IP address (i.e., the “finger” position)
- Solution: obscure some information in the routing table
- Instead of reporting the exact node position, report a random point within a defined range (a.k.a. imprecise routing; a toy example follows below)
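A toy version of that reporting rule (the ring size and spread are made-up parameters, not the paper’s):

```python
import random

RING_BITS = 16
RING_SIZE = 1 << RING_BITS  # identifiers live on a Chord-style ring

def imprecise_position(true_id: int, spread: int = 256) -> int:
    """Report a random point in [true_id - spread, true_id + spread]
    instead of the node's exact position on the ring."""
    return (true_id + random.randint(-spread, spread)) % RING_SIZE

node_id = 40_000
print(imprecise_position(node_id))  # varies per query, hiding the exact id
```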


Presentation 2: Selectively Traceable Anonymity
- Traceability is one way to deal with abuse
- Some systems have “implicit” tracing policies - but which ones can be supported?
- Result summary:
  - Many models can be augmented to support tracing
  - The transformation preserves “coercible” anonymity schemes
  - Traceability achieved via group signatures & voting
- Little problem 1: incentive to read unsigned messages
- Little problem 2: the sender has no incentive to sign messages
- Big problem: no incentive to pass along unsigned messages
- Solution: prove the sender’s output is consistent with signed output
  - Uses non-interactive zero-knowledge proofs
- Basic example of a non-coercible anonymity scheme: Shamir secret sharing (a minimal sketch follows below)
  - It’s uncoercible because you can lie about the share that you held
- Extension to coercible: use commitments with the shares
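For those who haven’t seen Shamir secret sharing, a minimal sketch (the prime, parameter names, and the k=3/n=5 choice are mine). The non-coercibility point is visible in the code: a share is just a point on a random polynomial, so a coerced shareholder can claim any value and nothing in the share itself proves them wrong:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; the field choice is an assumption

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    poly = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789  # any 3 shares suffice
```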


Presentation 3: Valet Services: Improving Hidden Servers with a Personal Touch
- Example: censorship-resistant publishing
- Example: multilevel secure chat servers
- Goals:
  - Accessible from anyone / anywhere
  - Resistant to unauthorized users, DoS attacks, physical attacks
- Existing model: rendezvous at a server via Tor / onion routing
- In the current network, everyone knows the “introduction points” that lead to hidden services
- But if we introduce “valet” nodes, or intermediaries, we can obscure the introduction points
  - Public directory servers don’t know the hidden service exists, nor how to access it
  - Reduces denial-of-service attacks and adds quality of service as an option
- Extended model: add “valet nodes” - in theory, one can be given out per person, per contact ticket / session
- Alice connects to the valet, then the valet connects to the introduction point
  - Decreases the probability that malicious parties can detect the introduction points


Presentation 4: Blending Different Latency Traffic With Alpha Mixing
- This research looks at the batching strategy of mix-nets; in other words, higher latency means larger batches of messages and more anonymity
- Most systems use high latency for anonymous email and low latency for web browsing (where most users are)
- Alpha mixing: introduce a delay parameter, alpha, that is specified by the user (per mix)
- Add a parameter T, the period; each time period, alpha gets decremented, and when alpha = 0 the message is forwarded on (a toy simulation follows below)
- Alternatively, when n messages with alpha = 0 have accumulated, the messages are forwarded
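A toy simulation of the time-based variant (the class and method names are mine, not the paper’s):

```python
class AlphaMix:
    """Each message carries a user-chosen alpha; every period T the mix
    decrements all alphas and flushes the messages that reach zero."""

    def __init__(self):
        self.pool = []  # list of [remaining_alpha, message]

    def accept(self, message, alpha):
        self.pool.append([alpha, message])

    def tick(self):
        """Called once per time period; returns messages to forward."""
        for entry in self.pool:
            entry[0] -= 1
        ready = [msg for a, msg in self.pool if a <= 0]
        self.pool = [e for e in self.pool if e[0] > 0]
        return ready

mix = AlphaMix()
mix.accept("email (patient sender)", alpha=3)
mix.accept("web request (impatient sender)", alpha=1)
print(mix.tick())  # ['web request (impatient sender)']
print(mix.tick())  # []
print(mix.tick())  # ['email (patient sender)']
```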
- Brad’s Opinion - for those of you who have not read it, this research is very similar in spirit to the following paper on k-anonymity:
Buğra Gedik and Ling Liu. A Customizable k-Anonymity Model for Protecting Location Privacy. IEEE International Conference on Distributed Computing Systems (ICDCS), 2005.
