Healthcare Privacy


SLIDE 1

Healthcare Privacy

[Diagram: patient information flowing among Patient, Physician, Nurse, Hospital, Drug Company, and Auditor, governed by a Privacy Policy]


SLIDE 2

Why is it being formalized?

  • A patient goes to a hospital and provides medical information to a physician.
  • The physician needs help and passes some of this information along to a nurse.
  • The nurse then turns around and sells the information to a drug company that uses it for marketing.
  • Since some patients object to such uses of their health information, and we, as a society, want to encourage open communication between patients and their physicians, we have adopted privacy policies, such as HIPAA, that prohibit such uses of patient information without the patient's consent.
  • To ensure that employees comply with these policies, hospitals employ auditors who examine accesses to and transmissions of protected information, looking for actions that violate the privacy policies in place.

SLIDE 3

A Research Area

  • Formalize Privacy Policies

– Precise semantics of privacy concepts

  • Enforce Privacy Policies

– Audit

  • Detect violations of policy

– Accountability

  • Identify agents to blame for policy violations
  • Punish to deter policy violations (resource allocation)


SLIDE 4

Purpose in Privacy Policies

  • Yahoo!'s practice is not to use the content of messages […] for marketing purposes.
  • By providing your personal information, you give [Social Security Administration] consent to use the information only for the purpose for which it was collected.
  • Purpose =?= Operation
  • Purpose =?= Operation + Context

SLIDE 5

Purpose Restrictions in Privacy Policies

  • Yahoo!'s practice is not to use the content of messages […] for marketing purposes. ("Not for")
  • By providing your personal information, you give [Social Security Administration] consent to use the information only for the purpose for which it was collected. ("Only for")

SLIDE 6

Purpose-of-use

  • Several privacy policies and laws are defined in terms of the purposes for which the information may be used
– For example, the HIPAA privacy rule stipulates that medical information may be used only for certain purposes, such as treatment

SLIDE 7

Purpose Restrictions are Ubiquitous

  • OECD’s Privacy Guidelines
  • US Privacy Laws

– HIPAA, GLBA, FERPA, COPPA,…

  • EU Privacy Directive
  • Enterprise Privacy Policies

– Google, Facebook, Yahoo, …
– Hospitals, banks, educational institutions, government

SLIDE 8

Goal of Current Approaches

  • Give a semantics to
– "Not for" purpose restrictions
– "Only for" purpose restrictions
that is parametric in the purpose
  • Provide automated enforcement of purpose restrictions for that semantics

SLIDE 9

Auditing

[Diagram: the audit procedure takes the auditee's behavior, a purpose restriction, and an environment model as inputs, and returns one of three verdicts: Obeyed, Violated, or Inconclusive]

SLIDE 10

Information Privacy

  • In the context of information systems, where sensitive information is collected, stored, processed, and communicated automatically among components in digital form, the challenge is to design workflows that comply with the relevant / applicable privacy laws and to enforce strict controls on access to and disclosure of information

SLIDE 11

Facebook → Zynga Policy


SLIDE 12

Web Security

  • The web is a complex, heterogeneous platform for sharing information, with distributed applications that process information from multiple sources / stakeholders, each with their own security requirements
– Entertainment
– Education
– Financial transactions
– Social interactions

SLIDE 13

Schematic of the Web depicting the interactions among its components

SLIDE 14

Request (Rq) from C to S, response (Rsp) from S to C:

Initially:     C(C,{C,S},∅)   S(S,{C,S},∅)
C creates Rq:  Rq(C,{C,S},{C})
S reads Rq:    S(S,{C,S},{C})
S creates Rsp: Rsp(S,{C,S},{C,S})
C reads Rsp:   C(C,{C,S},{C,S})

Typical web interactions of a client (C) with a server (S); labels are (creator, readers, influencers) triples and ∅ denotes an empty influencer set
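To make the label discipline concrete, here is a minimal Python sketch of the triples and of the create/read steps traced above. It illustrates the labelling scheme shown on the slides rather than reproducing the authors' implementation; the names `Labeled`, `create`, and `read` are ours. Later slides reuse this sketch.

```python
from dataclasses import dataclass, field

# A label is the triple (creator, readers, influencers) used on the slides,
# e.g. Rq(C,{C,S},{C}).
@dataclass
class Labeled:
    name: str
    creator: str
    readers: set                                    # who may read this
    influencers: set = field(default_factory=set)   # taint gathered so far

def create(subject: Labeled, obj_name: str, readers: set) -> Labeled:
    # A new object is influenced by its creator and by the creator's own taint.
    return Labeled(obj_name, subject.name, set(readers),
                   {subject.name} | subject.influencers)

def read(subject: Labeled, obj: Labeled) -> None:
    # Reading requires membership in the reader set, and taints the reader.
    if subject.name not in obj.readers:
        raise PermissionError(f"{subject.name} may not read {obj.name}")
    subject.influencers |= obj.influencers

# Replaying the client-server exchange above:
C = Labeled("C", "C", {"C", "S"})      # C(C,{C,S},∅)
S = Labeled("S", "S", {"C", "S"})      # S(S,{C,S},∅)
Rq = create(C, "Rq", {"C", "S"})       # Rq(C,{C,S},{C})
read(S, Rq)                            # S(S,{C,S},{C})
Rsp = create(S, "Rsp", {"C", "S"})     # Rsp(S,{C,S},{C,S})
read(C, Rsp)                           # C(C,{C,S},{C,S})
```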

SLIDE 15

C S1 S2

1. Rq1
2. Rsp1
3. Rq2
4. Rsp2
5. Rq3

Interactions in the case of a cross-origin request

SLIDE 16

C(C,{C,S1},) S1(S1,{C,S1},) C(C,{C,S2},) S2(S2,{C,S2},) C(C,{C,S1},) S1(S1,{C,S1},) C(C,{C,S2},) S2(S2,{C,S2},) Rq1(C,{C,S1},{C}) C(C,{C,S1},) S1(S1,{C,S1},{C}) C(C,{C,S2},) S2(S2,{C,S2},) Rq1(C,{C,S1},{C})

C creates Rq1 S1 reads Rq1

C(C,{C,S1},) S1(S1,{C,S1},{C}) C(C,{C,S2},) S2(S2,{C,S2},) Rq1(C,{C,S1},{C}) Rsp1(S1,{C,S1},{C,S1})

S1 creates Rsp1

C(C,{C,S1},{C,S1}) S1(S1,{C,S1},C) C(C,{C,S2},) S2(S2,{C,S2},) Rq1(C,{C,S1},{C}) Rsp1(S1,{C,S1},{C,S1})

C reads Rsp1

C(C,{C,S1},{C,S1}) S1(S1,{C,S1},C) C(C,{C,S2},) S2(S2,{C,S2},) Rq1(C,{C,S1},{C}) Rsp1(S1,{C,S1},{C,S1}) Rq2(C,{C,S2},{C})

C creates Rq2 S2 reads Rq2

Information flow diagram in the case of a cross-origin request

SLIDE 17

S2 reads Rq2:    S2(S2,{C,S2},{C})
S2 creates Rsp2: Rsp2(S2,{C,S2},{C,S2})
C reads Rsp2:    C(C,{C,S2},{C,S2})
C creates Rq3:   Rq3(C,{C,S2},{C,S2})

Information flow diagram in the case of a cross-origin request (continued)

SLIDE 18

Cross-Origin Requests

  • It is a common occurrence for a web page to combine content (scripts and other resources) from several sources – mashups
  • This is one of the top 10 web security concerns
  • Important web security issues, including cross-origin requests, the origin header, referrer validation, open redirectors, etc., share a common cause
– Failure to accurately identify all the stakeholders responsible for a request

SLIDE 19

Cross-Origin Requests

  • Using our approach, request 3 (Rq3) in the example is labelled (C,{C,S2},{C,S2}), indicating that it was created by C, is readable by C and S2, and has been influenced by C and S2
  • As such, if this request is sent to S1, S1 cannot read it
  • Declassification is required to allow S1 to read it
– Possible only under certain trust assumptions (a mechanical check is sketched below)
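Continuing the Python sketch from slide 14, the unreadability claim can be checked mechanically: S1 simply does not appear in Rq3's reader set.

```python
# Rq3 as labelled on this slide: created by C, readable by {C,S2},
# influenced by {C,S2}.
Rq3 = Labeled("Rq3", "C", {"C", "S2"}, {"C", "S2"})
S1 = Labeled("S1", "S1", {"C", "S1"})

try:
    read(S1, Rq3)
except PermissionError as e:
    print(e)   # "S1 may not read Rq3" -- declassification is needed first
```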

SLIDE 20

Cross-Origin Requests

  • Assume S2 has the necessary trust in S1, permitting downgrading of Rq3 to (C,{C,S1,S2},{C,S2}) to enable S1 to read it
  • When S1 receives the message, it can clearly see that the message has been influenced by both C and S2
  • Depending on the trust S1 places in S2, S1 responds appropriately to the message
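One plausible way to encode the downgrade in the running sketch is below. The rule that only an influencer of the object may widen its reader set is our reading of the slide's trust assumption; the `declassify` helper is illustrative, not part of the deck.

```python
def declassify(obj: Labeled, truster: str, new_reader: str) -> None:
    # Only a principal that influenced the object (here S2) may widen its
    # reader set -- this encodes the trust assumption stated on the slide.
    if truster not in obj.influencers:
        raise PermissionError(f"{truster} cannot declassify {obj.name}")
    obj.readers.add(new_reader)

declassify(Rq3, "S2", "S1")   # Rq3 becomes (C,{C,S1,S2},{C,S2})
read(S1, Rq3)                 # now permitted; S1 picks up the taint
print(S1.influencers)         # {'C', 'S2'}: S1 sees both influencers
```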

SLIDE 21

WebAuth Protocol

  • WebAuth# is a web-based authentication protocol based on Kerberos
  • Three stakeholders:
– User-Agent (UA), the user's browser
– WebAuth-enabled Application Server (WAS), a web server that serves content authenticated via WebAuth
– WebKDC, the login server and provider of authenticators to the other two components

# http://webauth.stanford.edu/protocol.html#rfc.references

SLIDE 22

WebAuth protocol for a user logging in for the first time

SLIDE 23

MODELLING FUNCTIONAL REQUIREMENTS

SLIDE 24

Information flow diagram in the case of WebAuth

SLIDE 25

Information flow diagram in the case of WebAuth

SLIDE 26

Information flow diagram in the case of WebAuth

SLIDE 27

Information flow diagram in the case of WebAuth

SLIDE 28

Labels of objects in the WebAuth scenario

SLIDE 29

INCORPORATING THE SECURITY REQUIREMENTS

SLIDE 30

Security Requirements

  • From the detailed descriptions of the steps involved in the protocol, it becomes apparent that certain data objects are not meant to be read by certain subjects
– The request_token provided by WAS to UA in steps 4 and 5 is not meant for consumption by UA, but only to be forwarded to WebKDC

SLIDE 31

Security requirements of objects in the WebAuth scenario

SLIDE 32

Refined Information Flow Diagram

  • The refined information flow diagram, which also accounts for the security requirements, is obtained by introducing appropriate relabelling actions (at the marked states) and the associated state transitions

SLIDE 33

Vulnerabilities in WebAuth

  • Mitchell et al.# formally modelled WebAuth and, using the Alloy tool, discovered the following vulnerability:
– An attacker carries out steps 1–8 of the protocol and shares the link to the WAS resource (containing his id token) with an honest user, thereby giving the honest user access to sensitive information on the server and violating session integrity

# Devdatta Akhawe, Adam Barth, Peifung E. Lam, John Mitchell, and Dawn Song. "Towards a Formal Foundation of Web Security", IEEE CSF 2010

SLIDE 34

Vulnerabilities in WebAuth

  • Mitchell et al.'s proposed fix involves changing the protocol itself: binding the user generating the requests in steps 3 and 9 of the protocol via a cookie, thus avoiding the attack
  • Our view:
– There is no flaw in the protocol as such
– A method to express the intentions of the designer unambiguously is needed, so that appropriate implementations adhering to the specification can be achieved

SLIDE 35

Vulnerabilities in WebAuth

  • Our approach using IFC provides such a means of unambiguous specification
  • In this example, let U denote the attacker and U′ the honest user
  • Once U successfully logs in, the confirmation page and the id token are labelled (K,{U,K},{W,K,U}) and (K,{W,K},{W,K,U}) respectively
  • First defense: U′ cannot read the confirmation page, which is essential for preparing the rerequest

SLIDE 36

Vulnerabilities in WebAuth

  • Second defense: if somehow U′ is able to generate a request to W using the id token, the request would be labelled (U′,{U′,W},{U′,K,W})
  • W, upon receiving the request, checks the labels of the id token and the rerequest (as sketched below)
– It immediately makes out that the id token was issued for U, while the request is influenced by U′
– WAS denies the request, thus blocking access to sensitive content by U′
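In the same sketch, W's check can be phrased as a comparison of influencer sets after discounting the infrastructure principals K and W. How exactly W identifies "the user" inside a label is our assumption, not something the deck specifies.

```python
INFRA = {"K", "W"}   # WebKDC and WAS principals

def session_integrity_ok(id_token: Labeled, rerequest: Labeled) -> bool:
    # The user identities left after removing infrastructure taint must match.
    return (id_token.influencers - INFRA) == (rerequest.influencers - INFRA)

id_token = Labeled("id_token", "K", {"W", "K"}, {"W", "K", "U"})
forged = Labeled("rerequest", "U'", {"U'", "W"}, {"U'", "K", "W"})

print(session_integrity_ok(id_token, forged))   # False: the token was issued
                                                # for U, the request bears U'
```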

SLIDE 37

(Reasonable) Object Labelling

  • accountPersonalInfo(M,{M,U},{M,U})
  • searchInfoIP(S,{S,A,F,E},{S,U})
  • searchInfoCID(S,{S,A,F,E},{S,U})
  • searchInfoOther(S,{S,A,F,E,O},{S,U})
  • expImpInfo(E,{E},{S,E,U,F,A})
  • pgViewInfo(F,{F,A,E},{S,U,F,A})
  • clickInfo(F,{F,A,E},{S,U,F,A})
  • adInfo(A,{F,A,E},{S,U,F,A})
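
From these labels one can already compute each principal's view of the object space. A small sketch (reader sets copied from the list above; the variable and function names are ours) that mechanically confirms the containment discussed on the next slide:

```python
# Reader sets taken from the object labels above.
READERS = {
    "accountPersonalInfo": {"M", "U"},
    "searchInfoIP":        {"S", "A", "F", "E"},
    "searchInfoCID":       {"S", "A", "F", "E"},
    "searchInfoOther":     {"S", "A", "F", "E", "O"},
    "expImpInfo":          {"E"},
    "pgViewInfo":          {"F", "A", "E"},
    "clickInfo":           {"F", "A", "E"},
    "adInfo":              {"F", "A", "E"},
}

def readable_by(principal: str) -> set:
    return {o for o, readers in READERS.items() if principal in readers}

# E (experience improvement) reads strictly more than S (search engine),
# matching the annotations on the lattice slide.
assert readable_by("S") < readable_by("E")
```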
SLIDE 38

Objects on Lattice

Objects on the lattice, as (reader set, influencer set) pairs:
– Personal Info: ({M,U},{M,U})
– Search Identifier Info: ({S,A,F,E},{S,U})
– General Search Info: ({S,A,F,E,O},{S,U})
– Fingerprinting and Ad Info: ({F,A,E},{S,U,F,A})
– Experience Improvement Info: ({E},{S,U,F,A,E})

Annotations:
– Advertisements presented may be influenced by the keywords searched
– The experience improvement program has access to all the info that the search engine has access to
– Advertisements presented are not influenced by the user's sensitive personal information
– The search engine does not have access to some information accessible by the experience improvement program (hence the need for subject isolation)

SLIDE 39

Inferred Subject Isolation

  • In the above example, S, A, E and O must be different programs (isolated), i.e., they must execute as different processes with different identities
– For enforcing access restrictions, the identity of the requester is important
– Whenever there is information (for example, advertising info) that can be accessed by one subject but not another (in our example, by the advertising program but not by the search engine), these subjects must have different identities
  • Without subject isolation (for example, if the code for the search engine and advertising is in the same program), the objective behind labelling objects (achieving fine-grained access control) cannot be fulfilled, because this introduces covert channels between programs beyond our observation power

SLIDE 40

Advantage of IFC

  • Information flow policies make explicit the isolation necessary in both the object space and the subject space, and clearly bring out the permissible communication channels between them, avoiding unwanted information flows ("non-interference")

SLIDE 41

Policy Comparison

  • Given an information system with a set of objects O, and two policies P1 and P2 (where P(o) denotes the set of principals permitted to read o under policy P), we say that
– P1 is weaker than P2 iff ∀o ∈ O: P1(o) ⊇ P2(o) (P1 permits more people to read o than P2),
– P1 is stronger than P2 iff P2 is weaker than P1, and
– P1 and P2 are incomparable iff P1 is neither stronger nor weaker than P2
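A direct transcription of this definition, modelling a policy as a map from each object to its permitted reader set; Python's set operators provide the partial order (the function name `compare` is ours):

```python
def compare(P1: dict, P2: dict) -> str:
    # A policy maps each object o to P(o), the set of permitted readers.
    objs = P1.keys()
    weaker   = all(P1[o] >= P2[o] for o in objs)   # P1 permits more readers
    stronger = all(P1[o] <= P2[o] for o in objs)
    if weaker and stronger:
        return "equivalent"      # both directions hold only for equal policies
    if weaker:
        return "P1 weaker than P2"
    if stronger:
        return "P1 stronger than P2"
    return "incomparable"
```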

SLIDE 42

Network of Systems

  • Often there is a need for two (or more) information systems to interact by sharing / exchanging data
– This can be achieved by forming a network
  • A major concern in a networked system is secure data sharing, i.e., the compatibility of the data-usage policies of the systems involved

SLIDE 43

Secure Information Sharing

  • When information system IS1, having policy P1, wishes to share data item o1 with information system IS2, having policy P2, such sharing is secure iff P1(o1) ⊇ P2(o1)
  • The asymmetry in the above definition reflects the directionality of data movement
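Under the same policy model, the sharing condition is a single set comparison; the direction reflects that data moves from IS1 to IS2 (the helper name is ours):

```python
def sharing_secure(P1: dict, P2: dict, o: str) -> bool:
    # Secure iff IS2's policy opens o to no one that IS1's policy excludes.
    return P1[o] >= P2[o]
```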

SLIDE 44

Example

  • Facebook's policy (P1) prohibits the transfer of its user data to advertising partners, while it permits the use of this data by platform content providers like Zynga
  • Zynga's policy (P2) permits the transfer of user-ids to advertisers for preventing fraud
  • Facebook sharing user data with Zynga is insecure because Ad ∉ P1(user data) but Ad ∈ P2(user data) – a clear policy conflict

SLIDE 45

Related Work

  • In [2], the authors developed a system for automatic privacy compliance checking in big-data systems and demonstrated its application to Bing
  • Drawbacks of this approach compared to ours:
– Greater manual effort involved
– Works only for a centrally managed system
– Does not control data propagation once released

[2] Shayak Sen, Saikat Guha, Anupam Datta, Sriram K. Rajamani, Janice Tsai, and Jeannette M. Wing. "Bootstrapping Privacy Compliance in Big Data Systems", IEEE S&P, 2014