The objective of this course is:


    • To draw an overview of security and privacy challenges in multi-scale digital ecosystems
    • To identify candidate methodologies to address these issues
    • To analyze a privacy-preserving decentralized reputation protocol, and
    • To discuss some hints for a research agenda


(Digital Ecosystems)


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

  • Privacy-Preserving Trust and Reputation protocols

  • Some Hints for a Research Agenda



(Digital Ecosystems)


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

    • Identity
    • Location
    • Accountability
    • Trust
    • Reputation
  • Privacy-Preserving Trust and Reputation protocols

  • A Proposition of Research Agenda



Both concepts designate properties of a system and, by extension, their enforcement

  • Security focuses on protecting users and businesses from intrusions, attacks, vulnerabilities, etc.

  • Security provides a safe environment and secure communication along with end user and business protection (Chang et al., 2005)

  • Feeling of security



Privacy: “State of being alone and not watched or disturbed by other people / State of being free from the attention of the public” (Oxford Dictionary)


  • The perception of privacy is shaped by

    • the perceived identity of the information receiver
    • the perceived usage of the information
    • the subjective sensitivity of the disclosed information, and
    • the context in which the information is disclosed
        • (Adams, cited by Lederer et al, 2003)
  • “An individual actively yet intuitively monitors and adjusts his behavior in the presence of others in an attempt to control their conceptions of his identity”

  • (Goffman, cited by Lederer et al, 2003)

  • (Feelings of) security and privacy are user-sensitive (user-dependent) concepts

  • Security and privacy are key enablers for interaction, collaboration, and DE dynamics



Digital Ecosystems


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

    • Identity
    • Location
    • Accountability
    • Trust
    • Reputation
  • Privacy-Preserving Trust and Reputation protocols

  • A Proposition of Research Agenda



Pervasiveness/smartness means personalization


  • Personalization needs context and user (profile) information disclosure

  • Privacy needs context and user (profile) information hiding

  • The central dilemma: Personalization or Privacy

  • The central challenge: reconcile personalization and privacy



(Digital Ecosystems)


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

    • Identity
    • Location
    • Accountability
    • Trust
    • Reputation
  • Privacy-Preserving Trust and Reputation protocols

  • A Proposition of Research Agenda



Who are you? Is this information private or public?


  • One identity?

    • SIP address-of-record (RFC 3674)?
    • RFID tag?
    • Electronic Product Code (EPC) – EPCGlobalNetwork – Object Naming Service (ONS)
    • IETF Host Identity Protocol (HIP)
  • Multiple identities/avatars? Linked identities? Not a technological but a political question

  • From a technical point of view, identity management needs authentication

    • through a third party (certificate, signature…)?
    • through a challenge/response protocol?
      • => needs a common framework, often incompatible with the dynamic interactions of open systems
    • through social recognition?
      • Trust and Reputation
  • Identity inference



“Where are you”: is this information private or public?


  • Location is the key component of all Location-Based Services (LBS)

  • Location is related to identity

    • as a husband, I was on the French Riviera (with my wife)
    • as a PhD adviser, the secretary said I was “busy”
    • my wife changed her Facebook status to “Romantic weekend on the Riviera”
    • my wife is friends with my PhD students


IETF (SIMPLE WG)


    • Notions of presence system/server and “presentity” i.e., unique identity in a presence system
    • Main security and privacy requirements
      • Updating presence information requires a priori authentication and authorization
      • Subscribing to/Watching presence information requires authentication
      • Watching requires compliance with the privacy filtering/policy, incl. authorization
      • Confidentiality and integrity of presence information and privacy policy must be ensured
      • (from Singh et al.)


IETF (SIMPLE WG): Privacy policy requirements (Geopriv, RFC 3693)


    • Authentication/Authorization of watchers
    • Selective information: ability to specify what parts of presence information are given to watchers
    • Differential information: ability to distribute different presence information to different watchers
    • Authorization policy for anonymous subscriptions (cf. anonymity in RFC 3323)
    • National policy overrides user’s policy
  • Authorization Policy = set of rules: (conditions, actions, transformations) (action + transformation = permission)

  • The Authorization Policy is itself (very) sensitive information => it needs protection!

  • Issue: enable a presence server to filter and distribute presence information while hiding its actual value (Singh et al., 2006)
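
The rule structure above, (conditions, actions, transformations) with action + transformation forming the permission, can be sketched as follows. This is an illustrative assumption in Python, not actual Geopriv/RFC 4745 syntax; all field names and the `evaluate` helper are invented for the sketch:

```python
from dataclasses import dataclass

# Hypothetical sketch of an authorization policy as a set of rules
# (conditions, actions, transformations). Field names are illustrative.

@dataclass
class Rule:
    conditions: dict       # when the rule applies, e.g. a watcher identity
    actions: dict          # what the watcher may do, e.g. subscribe
    transformations: dict  # how the information is degraded before release

def evaluate(rules, request):
    """Combine the permissions of every rule whose conditions match the request."""
    permission = None
    for rule in rules:
        if all(request.get(k) == v for k, v in rule.conditions.items()):
            permission = {**(permission or {}), **rule.actions, **rule.transformations}
    return permission  # None means no rule matched: deny by default

rules = [Rule({"watcher": "bob@example.org"},
              {"allow_subscription": True},
              {"location_granularity": "city"})]

print(evaluate(rules, {"watcher": "bob@example.org"}))
# {'allow_subscription': True, 'location_granularity': 'city'}
print(evaluate(rules, {"watcher": "eve@example.org"}))  # None
```

Note how the transformation degrades (here, coarsens) the location rather than simply granting or denying access, which is what makes selective and differential information possible.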



What happens after disclosing your location information???


  • A first approach: use complex cryptography or steganography.

  • No, bad idea

  • A second approach: pray the watcher is a good and intelligent guy

  • A third approach: pray the watcher is a good guy that will respect your location privacy rules (ensure he agrees with that); give him the usage rules he is concerned by; if the guy appears as a bad guy, have a discussion together (or start a legal action)

  • This last issue illustrates a very important privacy concern: the further (uncontrolled) usage of disclosed information (see also: Creative Commons)



Definition: “Condition in which individuals who exercise power are constrained by external means and by internal norms” (Public Administration Dictionary)


  • Accountable services/systems

    • Undeniability (non-repudiation of actions)
    • Verifiability (correctness and deviations)
    • Detection of deviations
    • (from Malone et al.)
    • => monitoring/logging
    • => easy with trusted third parties; complex otherwise


Open  unpredictable  un-secure: live is risk

  • Open  unpredictable  un-secure: live is risk

  • The alternative would be Big Brother…

  • Use security tools, forget security systems

  • Shift from (false) determinism (of security) to probability, risk management, and social-awareness



Gambetta (1990): “Trust […] is a particular level of the subjective probability with which an agent assesses that another agent [..] will perform a particular action […] in a context in which it affects his own action”


  • Wang (2003): “an Agent’s belief in another Agent’s capabilities, honesty and reliability based on its own direct experiences”

  • Chang et al. (2005): “Belief that the Trusting Agent has in the Trusted Agent’s willingness and capability to deliver a quality of service in a given context and in a given Timeslot”

  • Jøsang et al. (2007):

    • Trust (reliability trust) is the subjective probability by which an individual, A, expects that another individual, B, performs a given action on which its welfare depends
    • Trust (decision trust) is the extent to which one party is willing to depend on something or somebody in a given situation with a feeling of relative security, even though negative consequences are possible
  • Marsh (1994): all studies on trust make the assumption of the presence of a society



Quantifiable as a subjective probability


  • Binary, relational, and directional (non-symmetric)

  • Contextual (wrt an action)

  • Dynamic (evolves with time)

  • Relativity / Subjectivity / Fuzziness

  • Is trust reflexive?

  • Is trust transitive?



Direct Trust, based on analysis of truster/trustee past interactions


  • Trust in the trustee’s profile elements (e.g., diploma, employer, experience…)

  • Transitive Trust / Trust recommendation and propagation (e.g., co-citation)

  • Record-based Trust, computed by analyzing the trustee (certified) record/history

  • Trust negotiation

  • Social-network-based trust: trust ~ centrality

  • Reputation of the trustee in a group

  • Usage: P2P systems (e.g., EigenTrust), routing algorithms, roaming, dynamic virtual organizations…
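
The EigenTrust approach cited above can be sketched in a few lines: global trust is the principal left eigenvector of the matrix of normalized local trust values, obtained by power iteration. The matrix below is invented demo data; each row i holds peer i's normalized local trust in the other peers and sums to 1:

```python
# Toy sketch of EigenTrust-style global trust (cf. Kamvar et al., 2003),
# assuming local trust values have already been normalized row-wise.
C = [
    [0.0, 0.7, 0.3],
    [0.5, 0.0, 0.5],
    [0.4, 0.6, 0.0],
]

def global_trust(C, iterations=100):
    """Power iteration t <- C^T t converges to the principal eigenvector."""
    n = len(C)
    t = [1.0 / n] * n                 # start from uniform trust
    for _ in range(iterations):
        t = [sum(C[i][j] * t[i] for i in range(n)) for j in range(n)]
        total = sum(t)
        t = [v / total for v in t]    # keep it a probability distribution
    return t

t = global_trust(C)
print(t)  # global trust scores, summing to 1
```

The fixed point aggregates transitive trust: a peer trusted by highly-trusted peers ends up with a high global score, even without direct interactions.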



Reputation is the general opinion of the/a community about the trustworthiness of an individual or an entity


  • Reputation is computed by aggregating feedback values provided by rating agents

  • By nature, reputation is decentralized

  • Studied in economics, psychology, sociology, and computing



Objective: discourage dishonest behavior


  • Reputation = aggregate of feedbacks provided by others
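
As a minimal illustration of such aggregation (the [0, 1] feedback scale, the function name, and the optional trust weights are assumptions for the sketch, not a prescribed model):

```python
# Reputation as the mean of feedback values in [0, 1], optionally
# weighted by how much we trust each rating agent.
def reputation(feedbacks, weights=None):
    if weights is None:
        weights = [1.0] * len(feedbacks)
    total = sum(weights)
    return sum(f * w for f, w in zip(feedbacks, weights)) / total

print(reputation([1.0, 0.8, 0.0]))               # plain average: 0.6
print(reputation([1.0, 0.8, 0.0], [1, 1, 0.1]))  # discounting the third rater raises the score
```

Real systems (EigenTrust, beta reputation, etc.) differ mainly in how these weights are chosen and updated; the aggregation skeleton stays the same.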





Rooting out fake identities in social networks: Unvarnished.com, Duedil.com


  • Defeating pollution in peer-to-peer file sharing networks: [Costa and Almeida, 2007], [Yu, 2006], EigenTrust [Kamvar et al., 2003]

  • Discouraging selfish behavior in mobile ad-hoc networks: [Hu and Burmester, 2006], [Buchegger et al., 2004], [Buchegger, 2002] [Miao et al., 2012]



Centralized vs decentralized architecture


  • Feedback aggregation model

  • Reputation visibility

  • Reputation durability

  • Feedback durability



Easy in a centralized or trusted environment: use a trusted third party to collect the feedbacks and return the reputation value


  • More complex in a decentralized environment



(Digital Ecosystems)


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

    • Identity
    • Location
    • Accountability
    • Trust
    • Reputation
  • Privacy-Preserving Trust and Reputation protocols

  • A Proposition of Research Agenda



The fear of retaliation prevents source agents from providing their real feedback [Resnick, 2002], [ebay.com, 2008]

  • A privacy preserving trust system protects the user from unwanted information disclosure (~personalization vs privacy dilemma)

  • A privacy preserving reputation system protects a feedback provider by hiding either the feedback value or the identity of the user (~anonymization)

  • Issue: anonymity => risk of attacks

    • Sybil attack (attacker creates multiple identities)
    • Self-promotion, ballot stuffing, slandering (in coalition or not)
    • Whitewashing
    • Oscillation


Agent q needs to learn r_t = Σ_i l_i

  • Privacy of each feedback value l_i must be preserved

  • The computation must be efficient
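
A standard way to compute r_t = Σ_i l_i without revealing any individual l_i is additive secret sharing. The following toy sketch (three agents, invented feedback values) assumes honest-but-curious agents and pairwise private channels:

```python
import random

# Each agent i splits its private feedback l_i into n random additive shares
# modulo M, sends one share to each agent; only the total sum is reconstructed.
M = 2**31  # public modulus, large enough that the true sum never wraps around

def make_shares(secret, n):
    shares = [random.randrange(M) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % M)  # shares sum to secret mod M
    return shares

feedbacks = [5, 3, 8]                      # private inputs l_i of three agents
n = len(feedbacks)
all_shares = [make_shares(l, n) for l in feedbacks]

# Agent j locally sums the shares it received (column j); the partial sums
# are then published and combined. No individual l_i is ever revealed.
partial = [sum(all_shares[i][j] for i in range(n)) % M for j in range(n)]
r_t = sum(partial) % M
print(r_t)  # 16 == 5 + 3 + 8
```

Each agent sees only uniformly random shares, so any n-1 colluding agents learn nothing about the remaining input beyond what the final sum itself reveals.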









The Semi-Honest Model


    • Follow the protocol according to the specification
    • Passively attempt to learn the inputs of honest agents
  • The Disruptive Malicious Model

    • Provide out of range values as their inputs
    • Make erroneous computations
    • Refuse to participate in the protocol
    • Drop messages
    • Prematurely abort the protocol
    • Wiretap and tamper with the communication channels




Technique 1: Additive Homomorphic Cryptosystems


    • Product of ciphertexts = Sum of plaintexts
    • E(3).E(4) = E(3+4) = E(7)
    • Paillier Cryptosystem [Paillier, 1999]
  • Used to encrypt the shares and the sum of the shares
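
The additive homomorphism E(3)·E(4) = E(7) can be demonstrated with a toy Paillier implementation. The tiny primes below are for illustration only (real deployments use ≥ 2048-bit moduli); this follows the standard g = n+1 variant of the scheme:

```python
import math
import random

# Toy Paillier cryptosystem (Paillier, 1999). Requires Python 3.9+
# (math.lcm and pow(x, -1, n) for the modular inverse).
p, q = 293, 433                  # demo primes only: NOT secure
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)     # Carmichael lambda(n)
g = n + 1                        # standard generator choice

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2   # g^m * r^n mod n^2

def decrypt(c):
    def L(u):
        return (u - 1) // n
    mu = pow(L(pow(g, lam, n2)), -1, n)           # precomputable constant
    return (L(pow(c, lam, n2)) * mu) % n

c = (encrypt(3) * encrypt(4)) % n2   # multiply the ciphertexts...
print(decrypt(c))                    # ...and the plaintexts add up: 7
```

This is exactly the property the protocol exploits: agents can multiply encrypted shares so that only the encrypted sum, never an individual share, needs to be decrypted.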



Technique 2: Non-Interactive Zero-Knowledge Proofs (ZKP)

    • A Prover convinces a Verifier that a statement is true
    • No additional information is revealed


ZKP: Set Membership


    • Given a ciphertext E_u(x) and a public set S
    • Agent u proves: x ∈ S
    • x is not revealed
  • ZKP: Plaintext Equality

    • Given two ciphertexts E_u(x) and E_v(x)
    • Agent u proves: both E_u(x) and E_v(x) encrypt the same plaintext x
    • x is not revealed
  • Used to

    • Prove that the feedback provided by an agent (i.e., the sum of its shares) is correct (lies in a specified interval)
    • Prove that the shares sent to the nodes are the correct ones
    • Prove that all agents compute their own sum correctly
    • Prove that the received u has the correct value
  • Reference

    • O. Hasan, L. Brunie, E. Bertino, N. Shang, “A Decentralized Privacy Preserving Reputation Protocol for the Malicious Adversarial Model,” IEEE Transactions on Information Forensics and Security, vol. 8, no. 6, pp. 949-962, 2013.
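
The set-membership and plaintext-equality proofs used in the protocol are intricate; as a minimal illustration of the prove/verify pattern only, here is a Schnorr proof of knowledge of a discrete logarithm, made non-interactive with the Fiat-Shamir heuristic. The parameters are toy-sized assumptions, not the construction from the paper:

```python
import hashlib
import random

p = 2**61 - 1        # a prime modulus (toy-sized; real systems use far larger groups)
g = 3                # public base
x = random.randrange(2, p - 1)   # the prover's secret
y = pow(g, x, p)                 # public value: y = g^x mod p

def prove(x):
    """Prove knowledge of x such that y = g^x, without revealing x."""
    k = random.randrange(2, p - 1)
    t = pow(g, k, p)                                            # commitment
    c = int.from_bytes(hashlib.sha256(f"{g}{y}{t}".encode()).digest(), "big")
    s = (k + c * x) % (p - 1)                                   # response
    return t, s

def verify(t, s):
    c = int.from_bytes(hashlib.sha256(f"{g}{y}{t}".encode()).digest(), "big")
    return pow(g, s, p) == (t * pow(y, c, p)) % p               # g^s == t * y^c ?

t, s = prove(x)
print(verify(t, s))  # True
```

The verifier checks one equation and learns nothing about x beyond the fact that the prover knows it: replacing the interactive challenge by a hash of the commitment is what makes the proof non-interactive.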




(Digital Ecosystems)


  • Security and Privacy

  • The Personalization vs Privacy Dilemma

  • Enforcing Security and Privacy

    • Identity
    • Location
    • Accountability
    • Trust
    • Reputation
  • Privacy-Preserving Trust and Reputation protocols

  • Some Hints for a Research Agenda



Seamless certified and secure integration of multiple heterogeneous ecosystems, e.g., sensor network and cloud infrastructure


  • Holistic trust, reputation and security business-centric value-aware framework (do not forget security…)

  • Lifecycle of a piece of information (is a piece of information a new “thing”?)

  • The issue of identity and anonymity

  • Personalization vs Privacy dilemma / User-centric privacy management proxy

  • Enforcing new rights: indifference and oblivion

  • A social Web of things

    • « [In the] Internet of Things (IoT) […] physical and virtual ‘things’ have identities […] and virtual personalities and […] are expected to become active participants in business, information and social processes […] » (CERP-IoT)
    • Identity? Personality? Relationship? Social network of things? Trust? Privacy?

