Transparency versus Privacy

One of the challenges we face in building OpenReputation is addressing the inherent conflict between transparency and privacy. As we see it, a reputation is a predictor of future behavior based on past behavior. One such predictor is the degree to which an entity publicly discloses its true identity. Public disclosure (increased transparency) exposes an entity to the consequences of its actions, both good and bad. This risk/reward consequence of disclosure influences those actions and therefore provides predictive value. The problem is that disclosure may expose an entity to consequences unrelated to the immediate purpose of the disclosure. In general, building a reputation by tracking and measuring behavior can generate a great deal of information about an entity, information that might have unintended consequences if made public.

Suppose for example that Alice wants to post a comment on Bob's blog. Bob will only accept comments from commenters who disclose their identity with a verifiable name and address. Bob has set this restriction to reduce rude, vulgar, and impolite comments on his blog, because he knows that commenters are less likely to act badly if their bad acts can be traced back to their true identities. Alice decides to disclose and as a result is allowed to comment and participate in the online conversation on Bob's blog. Because all the other commenters have also disclosed their names and addresses, and therefore are more likely to behave civilly, they all benefit from a more polite conversation. The problem arises when a bad actor uses the disclosed information for some other purpose. For example, suppose Alice mentions that she is going out of town on vacation the following week. Because Alice disclosed her valid name and address, a thief would now know both her address and a period of time when she will not be home. Advertisers could likewise send targeted advertising spam to her home.

Our goal in addressing the inherent conflict between reputation and privacy is to follow a policy of least exposure, that is, only expose or make public the least amount of information necessary to enable an interaction to occur. The degree of exposure should be at the discretion and control of the associated parties. This is a difficult challenge as there may be no way to accurately predict the future consequences of any degree of exposure. We can only make a best effort. But some degree of control is better than none.


The fundamental information item in OpenReputation is called a reputational event, or repute for short. Each repute has a unique identifier that is always public. A repute also has identifiers for the associated parties. These identifiers may be associated with public identities or with anonymous or pseudonymous identities. A repute has a details section that includes the actual repute content and context. The details section may be encrypted. An example repute might be a rating made by someone about a blog comment. In this case the repute details would include the actual rating and the context in which the rating was made. In addition, the repute includes what can be called metadata, primarily identifiers of who made the rating (the reputer), who wrote the comment (the reputee), and what entity recorded the comment (the reputery). The following diagram shows this relationship between reputer and reputee.
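The structure described above can be sketched as a simple record. This is an illustrative sketch only; the field names and the rating/context layout are assumptions, not the actual OpenReputation schema.

```python
from dataclasses import dataclass
import uuid

@dataclass
class Repute:
    """Hypothetical sketch of a reputational event (repute)."""
    repute_id: str                   # unique identifier, always public
    reputer_id: str                  # who made the rating
    reputee_id: str                  # who the rating is about
    reputery_id: str                 # entity that recorded the event
    details: bytes                   # rating content and context; may be ciphertext
    details_encrypted: bool = False

def make_rating_repute(reputer, reputee, reputery, rating, context):
    """Build a plaintext rating repute, e.g. a rating of a blog comment."""
    details = f"rating={rating};context={context}".encode()
    return Repute(
        repute_id=str(uuid.uuid4()),
        reputer_id=reputer,
        reputee_id=reputee,
        reputery_id=reputery,
        details=details,
    )
```

Note that the identifiers live outside the details section: even when the details are encrypted, the repute identifier remains public, matching the description above.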

Because public disclosure is a one-way operation (once something private is made public, it may never be private again), OpenReputation must be architected such that reputes are private by default. The associated parties may make them public, but they must opt into that publicity. The consequence of not making a repute public may be a lower reputation score. The degree of privacy is determined by a terms-of-use agreement and the degree of disclosure of the associated entities.

Three Degrees of Privacy

There are three basic privacy degrees in OpenReputation: full public, semi-private, and full private.

In a full public repute, the details section is unencrypted and the identifiers are associated with public identities. In a semi-private repute, either (a) the details section is encrypted but at least one of the identifiers is public, or (b) the details section is unencrypted but at least one of the identifiers is anonymous. In a full private repute, the details section is encrypted and the identifiers are anonymous.
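The three degrees can be expressed as a small classification rule. This sketch assumes each repute carries an encryption flag plus one anonymity flag per associated identifier; those representations are assumptions for illustration.

```python
def privacy_degree(details_encrypted: bool, identifiers_anonymous: list) -> str:
    """Classify a repute into one of the three privacy degrees.

    identifiers_anonymous: one bool per associated identifier
    (True means that identifier is anonymous/pseudonymous).
    """
    any_anon = any(identifiers_anonymous)
    all_anon = all(identifiers_anonymous)
    if not details_encrypted and not any_anon:
        return "full public"      # open details, all identities public
    if details_encrypted and all_anon:
        return "full private"     # hidden details, no public identity
    return "semi-private"         # every other mix of the two axes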

Semi-private reputes allow interactions where a public reputation associated with a public identity can be used to initiate an otherwise private conversation. This is like a telephone call: the parties involved, identified by their phone numbers, are known, but the spoken conversation itself is not.

Privacy of the details of a repute is accomplished by encrypting the details. Only those parties that have access to the decryption key may view the information. In many cases the repute information is public to begin with and may not be encrypted at all. In either case, signatures allow verification that the underlying repute details have not been tampered with.
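The tamper-verification step above can be sketched with the Python standard library. A real deployment would use asymmetric signatures so third parties can verify without the key; HMAC-SHA256 here is a stdlib stand-in, and the function names are hypothetical.

```python
import hmac
import hashlib

def sign_details(details: bytes, signing_key: bytes) -> str:
    """Produce an integrity tag over the repute details.
    (HMAC stands in for a real asymmetric signature in this sketch.)"""
    return hmac.new(signing_key, details, hashlib.sha256).hexdigest()

def verify_details(details: bytes, tag: str, signing_key: bytes) -> bool:
    """Verify the details have not been tampered with since signing."""
    expected = hmac.new(signing_key, details, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

Verification works the same way whether the details bytes are plaintext or ciphertext, which matches the point above: signatures protect integrity independently of whether the details were encrypted.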

Anonymous identifiers can be created using a key generation function that derives key pairs from a master key, where the function is seeded with both the repute identifier and a private master key. This way the associated anonymous key can be recreated by the owner of the master key but cannot be traced back to that owner. Various schemes exist for key generation functions that produce anonymous or blinded keys. Similar approaches can be used for generating encryption keys.
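A minimal sketch of this derivation, assuming HMAC-SHA256 as the key generation function (real schemes might use hierarchical deterministic or blinded key derivation instead):

```python
import hmac
import hashlib

def derive_anonymous_key(master_key: bytes, repute_id: str) -> bytes:
    """Derive a per-repute anonymous key from a private master key.

    The owner of master_key can recreate the key for any repute_id,
    but an observer holding only the derived key cannot link it back
    to the master key or to keys derived for other reputes.
    """
    return hmac.new(master_key, repute_id.encode(), hashlib.sha256).digest()
```

The same (master key, repute identifier) pair always yields the same key, while keys for different reputes are unlinkable to each other.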

Group Privacy

Multiple entities may be associated with a given repute and therefore may need the ability to decrypt its details. One approach is to share the secret location of the decryption key with the members of the group. Another is to use a group encryption key. In either case, a distributed keychain service could give multiple entities the ability to decrypt the same repute. The keychain service would be responsible for implementing the key distribution policy. The signatories of a repute are the members of the associated group and have access to the associated key. The policies may include rules for inclusion in and revocation of membership from the group, as well as permission for disclosure to external parties. Groups may be nested.

A group key can be created using a group-shared Diffie-Hellman ECC key, built by successive creation of shared key pairs. For example, A, B, and C can create a single shared key: first A creates a shared key AB with B, and then the shared key ABC is created by sharing AB with C. The group key may not be created until the members of the group claim association with the repute. Group keys may be created for both encryption and signing.
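The successive sharing step (AB with B, then ABC with C) can be sketched with classic finite-field Diffie-Hellman. The document describes ECC keys; plain modular DH is used here only because it keeps the sketch stdlib-only, and the prime, generator, and private keys are toy values, not production parameters.

```python
# Toy Diffie-Hellman parameters; real deployments would use ECC or a
# standardized large group, and randomly generated private keys.
P = 2**127 - 1   # a Mersenne prime, toy-sized
G = 3

def shared(priv: int, other_pub: int) -> int:
    """Standard DH shared-secret computation: other_pub ** priv mod P."""
    return pow(other_pub, priv, P)

# Private keys for the three group members (toy values)
a, b, c = 1234, 5678, 9012
A_pub, B_pub, C_pub = pow(G, a, P), pow(G, b, P), pow(G, c, P)

# Step 1: A and B establish the pairwise secret AB = g^(a*b)
AB = shared(a, B_pub)            # A's computation; B gets the same via shared(b, A_pub)

# Step 2: AB itself acts as a private key and is shared with C,
# yielding the group key ABC = g^(a*b*c)
AB_pub = pow(G, AB, P)           # published so C can participate
ABC_for_C = shared(c, AB_pub)    # C's view of the group key
ABC_for_AB = shared(AB, C_pub)   # A and B's view (both already know AB)
```

Both computations of ABC agree, so all three members hold the same group key without any pairwise channel between A and C or B and C individually.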

One use of group encryption and signing keys is to enable group anonymity, that is, a group reputation may be used to facilitate an interaction between a member of a group and some external party without identifying the specific member of the group involved in the interaction. Group reputation effectively provides a fourth type of privacy, where the group is not anonymous but the specific member of the group associated with the repute is anonymous.


By supporting multiple degrees of privacy, we give the user better than black-or-white control over public disclosure. Over time we hope to further refine these approaches to better match the needs of reputation with the least amount of privacy exposure.



© 2014 The World Table | Legacy Forum | Terms of Service | Privacy Policy