I want to get feedback on the "managed trust" concept we've been talking about.
I've been assuming that:
* post-Snowden, people are more concerned about data privacy.
* people are also concerned about large providers with access to all the personal data they host.
* A partial but workable solution to this problem is for "managed trust" groups to host their own data
* "managed" trust groups are groups with collective identities where people have close working relationships (like Enspiral).
* If you're hosting your own data, you might as well share that data between different apps (starting with login, profile, and group data, etc.) and avoid tiresome admin work like manually keeping data in sync across several apps.
Groups like this would be our main target market and form the nodes in the Open-App network.
Offline, and in her comments in the comms doc, Alanna made a really good point that "maximised trust" groups have lots of efficiency gains. Enspiral managed to avoid a lot of bureaucracy by...trusting each other.
This sets the Open-App network apart from other decentralised web projects that assume "zero trust" and need lots of strong cryptography, anonymity, and other technical hurdles to make them work.
What do people think about this?
Joshua Vial Wed 6 Aug 2014 4:40AM
Agree with the concept but I would use the word high-trust rather than managed trust.
Simon Tegg Wed 6 Aug 2014 5:03AM
"High Trust" also works.
Not everyone in Enspiral has "high" trust with every other person. New contributors for example.
Joshua Vial Wed 6 Aug 2014 6:05AM
true, but if you take the average trust in the group it is a lot higher than most.
Simon Tegg Wed 6 Aug 2014 6:38AM
Enspiral: higher trust than your average sloth of bears
I think "managed" was getting at the delegation of trust to group admins and specific individuals.
Mikey Wed 6 Aug 2014 7:13AM
yeah, i used "managed-trust" instead of "high-trust" because not every trust relationship has to have a high value; it's managed by you to be any value.
i'm good with whatever people think is best, but keep in mind that not every group will be as trusty as Enspiral; imagine if the region of Wellington were a group.
Simon Tegg Wed 6 Aug 2014 8:23AM
@ahdinosaur I'm not sure what you mean by "managed by you to be any value"
Mikey Wed 6 Aug 2014 8:28AM
- i trust Simon 0.7 (on a scale from 0.0 to 1.0).
- i trust Enspiral 0.5.
- i trust a lot of people with letting me sleep at their place. :)
i'm using trust in the following structure:
- who you trust (a person or group)
- how much you trust (variable value from 0.0 to 1.0)
- about what do you trust (any thing)
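as a rough sketch in python (names here are mine, not any agreed spec), a single trust statement could look like:

```python
from dataclasses import dataclass

@dataclass
class Trust:
    """one directed trust statement: who, how much, about what."""
    truster: str  # who is doing the trusting
    trustee: str  # who is trusted (a person or group)
    value: float  # how much, from -1.0 to 1.0
    context: str  # about what, e.g. "couchsurfing" or "general"

# "i trust Simon 0.7"
t = Trust("mikey", "simon", 0.7, "general")
```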
trust forms a web:
- if i trust you, and you trust Bob, then i trust Bob with some decayed value of our trust.
- if you are a member of a group i trust, then it's the same as if i trusted you with the same value.
also, trust endorsements can decay over time. think about reviews of restaurants on Yelp: recent reviews are a better indicator than old reviews.
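a minimal sketch of both kinds of decay, assuming a made-up per-hop decay factor and a review half-life (both numbers are mine, not anything we've agreed on):

```python
HOP_DECAY = 0.5  # hypothetical per-hop decay factor

def transitive_trust(my_trust_in_you, your_trust_in_bob, decay=HOP_DECAY):
    """if i trust you, and you trust Bob, i trust Bob a decayed amount."""
    return my_trust_in_you * your_trust_in_bob * decay

def aged_trust(value, age_days, half_life_days=365.0):
    """old endorsements count less, like old Yelp reviews."""
    return value * 0.5 ** (age_days / half_life_days)
```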
then, we can leverage our trust while we offer:
- i offer anyone i trust with ($500 * (trust value ^ 2)) of personal IOUs in exchange for their personal IOUs.
- i offer anyone i trust with (10 GB * trust value) of their data duplicated on my machine in exchange for my data on their machine.
- i offer anyone i trust more than 0.35 to use my car in exchange for either a shared group currency or a personal IOU.
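the offer formulas above could be sketched like this (the function names and defaults are just illustrative):

```python
def iou_offer(trust_value, base=500.0):
    """personal IOU credit line: $500 * (trust value ^ 2)."""
    return base * trust_value ** 2

def storage_offer(trust_value, base_gb=10.0):
    """data replication offer: 10 GB * trust value."""
    return base_gb * trust_value

def car_offer(trust_value, threshold=0.35):
    """all-or-nothing offer gated on a trust threshold."""
    return trust_value > threshold
```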
or while we request:
- i request anyone who trusts me more than 0.3 and i trust more than 0.1 to have me over for a meal one day every 2 weeks in exchange for either a shared group currency or a personal IOU.
- i request anyone who trusts me more than 0.5 and i trust more than 0.3 to let me sleep on their couch one week every 12 weeks in exchange for either a shared group currency or a personal IOU.
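a request like the ones above is just a pair of thresholds on mutual trust; a sketch (names are mine):

```python
def request_matches(their_trust_in_me, my_trust_in_them,
                    min_theirs=0.3, min_mine=0.1):
    """a request matches anyone who trusts me enough and whom i trust enough."""
    return their_trust_in_me > min_theirs and my_trust_in_them > min_mine
```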
trust can also relate to who we allow access to our data:
- i allow Derek to access all of my data
- i allow anyone who i trust more than 0.7 to access all of my data
- i allow Enspiral to access my photos
- i allow anyone who i trust more than 0.4 to access my photos
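those access rules could be evaluated with something like this sketch (the rule format is my own guess, not a spec):

```python
def can_access(requester, resource, trust_of, rules):
    """check a requester against rules of (resource, allowed), where
    allowed is either a named identity or a minimum trust value."""
    for rule_resource, allowed in rules:
        if rule_resource not in (resource, "all"):
            continue  # this rule doesn't cover the requested resource
        if isinstance(allowed, str) and allowed == requester:
            return True  # named directly, e.g. "i allow Derek..."
        if isinstance(allowed, float) and trust_of(requester) > allowed:
            return True  # trust threshold met
    return False

# the four example rules from above
rules = [("all", "derek"), ("all", 0.7),
         ("photos", "enspiral"), ("photos", 0.4)]
my_trust = {"derek": 0.2, "alice": 0.5}
trust_of = lambda who: my_trust.get(who, 0.0)
```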
then there's also negative trust, from -1.0 to 0.0, that can signal intentions such as bad reviews or permanent blocks.
last but not least, i'm sure someone is wondering "how can you reduce a relationship to a number!?" well... yeah, at least i think it's better than a binary (trust / no-trust). if we want the numbers to be more accessible we can always hide them behind words (1.0 is "love", 0.5 is "like", 0.0 is "meh", -0.5 is "sucks", -1.0 is "fuck off", etc).
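hiding the numbers behind words could be as simple as (the thresholds are my guess):

```python
def label(value):
    """map a trust value from -1.0..1.0 onto a friendly word."""
    scale = [(0.75, "love"), (0.25, "like"),
             (-0.25, "meh"), (-0.75, "sucks")]
    for threshold, word in scale:
        if value >= threshold:
            return word
    return "fuck off"
```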
this turned into a much longer post, hehe. i hope it makes the idea of how we can "manage trust" more clear. even if the above features are dream features, it does suggest the possibility of systems based on trust.
Caroline Smalley Wed 6 Aug 2014 4:38PM
what creates the trust values? developed over time like a credit rating? each action (such as sending a particular person a photo, or file sharing / co-writing a blog) adds to the score, which has a demurrage system of sorts, so values don't 'get stale'.
as for protocols, see last comment on backbones = cms. our intention is to incorporate a commons based governance agreement that would be managed through P2P. the initiative i'm working on is oriented around a co-op. when members join, they'll be required to 'sign' an agreement, which would in turn increase trust value.
Simon Tegg Wed 6 Aug 2014 7:36PM
as @ahdinosaur said trust values are dream features and I personally think binaries (trust, no-trust) would be an easier place to start.
The idea of the "managed trust" concept is to put people and groups at the centre, so people would directly set their personal trust of other entities. I wouldn't "trust" an algorithm to decide how much I trust another person :). You can do interesting algorithm things once you have a layer of trust relationships in place though.
Josef Davies-Coates Wed 6 Aug 2014 8:22PM
I instinctively like the idea of 'managed trust' (or whatever you want to call it) more than the 'zero trust' crypto stuff.
@tav has been talking about using 'trust maps' for years and once got this nice demo up http://www.trustmap.org/ :) - pretty much exactly what you describe above @ahdinosaur :) i.e. you can say who you trust and in what context you trust them
Theodore Taptiklis Thu 7 Aug 2014 1:37AM
Ok. Problem for me here is 'trust' is a Big Word...an abstract notion that means different things to different people.
I'm thinking a lot at present about the notion of group...and the difference between a self-organising, creative, generative, organic group like Enspiral and other kinds of groups whose compatibility is an accident of time, space or task.
So in place of Simon's 'identity' as the basic building block I'm wondering about 'relationship'. The qualities of groups and relationships seem to me to be more prospective than notions of trust and identity. For me this is because trust and identity belong to a world in which we are oriented towards individualism, rather than a future world in which we become oriented towards collaboration and participation.
Alanna Irving Thu 7 Aug 2014 2:30AM
I'm having repshare flashbacks :p
I think "trusting" might be a good term. "Managed" makes me assume the system will be doing the managing, when what I think you are talking about is the system taking advantage of the efficiencies of trust relationships humans naturally form. But they are not always "high-trust" relationships; they just depend on trust to an appropriate level as judged by the people. It also differentiates it usefully from "zero-trust" systems, making it a unique proposition in the decentralisation space.
Simon Tegg Thu 7 Aug 2014 8:31AM
Yeah, right now I'm more interested in the overall concept of nodes as groups of people with a focal point for their higher-trust relationships than in the specifics of a web-of-trust implementation.
Caroline Smalley Thu 7 Aug 2014 5:38PM
Thanks @simontegg ..on reading all the comments, I see I completely missed the point! Lesson learned. Trust me next time? Actions.. Actioned Trust? ...or simply Actual Trust: ratings based on actions / permissions for what people can/can't do. 'Actual' because we get to control it. To a certain extent, Facebook already do this. Please don't throw eggs when I say this.. I totally agree it's a nightmare to manage/'control'. Indeed, the less hands-on management, the better, hence I agree with @alanna and @joshuavial that 'managed' is an off-putting term.
Needs to be simple to implement.. thinking on/off vs levels. Could have basic and advanced levels for more complex scenarios. Need to make 'setting' the control station an 'as you go' process.
Couple more thoughts.. maybe an obvious 'to do', but it would help if you could create groups of 'friends' so that setting on/off permissions becomes a less arduous task. This could perhaps be complemented by an automated check whereby, if a program detects that one of your trusted connections may have compromised your good faith, it is brought to your attention. There's a name for this.. ?
@ahdinosaur never trust me to get a time right?! sorry to have missed the hangout..
Devon Auerswald Fri 17 Oct 2014 8:39AM
not sure just how appropriate this is to this specific project, but some trust is far more vital than people seem to understand (not here specifically, but in general).
Consider the following possibility for example:
1) Blocking Google from tracking any and all data from search engine results pages on a zero-trust basis becomes a thing, everyone starts doing it, and being invisible/anonymous becomes really popular.
2) People notice search results become worse and worse as Google can no longer obtain high-confidence data as quickly as it did when most search volume returned usable tracking data.
3) Meanwhile, the spammers working to manipulate rankings by letting Google track their "intended to manipulate rankings" data (trust me, this is an all-day, every-day, every-niche occurrence) become a larger ratio of the data Google is collecting and using to rank websites. Spam begins to float to the top like it's 1999.
4) Because people generally don't understand or care that Google is almost completely reliant on the tracking and analysis of user data, they're probably just going to assume Google got worse because of X, Y or Z unrelated reason.
Whether users notice or not, they begin to search less, find less, and accomplish less. Let's assume, for example's sake, that 300 million people lose 1 minute of productivity a day because of this beating Google is taking, partly due to the mistrust driven by government corruption. And because most people are not data scientists, nobody realizes (yet) that the 1 minute 300 million people lose every day adds up to 570 years of time, every day.
That possibility scares the crap out of me whenever the privacy issue comes up, because even many tech people don't catch it right away. I figured this is something worth bringing up. Trust is important. Zero trust is a poison apple in some circumstances.
Whether this example can be applied specifically to this topic or not, I'm not really sure. I know it will likely be important to consider whenever addressing the issue of user privacy and tracking.
For anyone scratching their head about why Google would get worse: they track your mouse, clicks, search revisions, your post-search last destination, the order of sites you visit, the first site you visit, your last search, and your next search. All of that is tracked, and it most likely impacts rankings in a more meaningful way than typically known SEO tactics like link building, because it is far more difficult to fake and a lot more intimate with Google's core goal: UX.
Mikey · Wed 6 Aug 2014 4:21AM
i'll see if i can explain it another way, in case it helps anyone. we are interested in "managed-trust" decentralized systems, which are different from "zero-trust" distributed systems.
in "zero-trust" distributed systems like Bitcoin, Ethereum, or MaidSafe, the intent is to only have to trust cryptography and the algorithms that run the network. this lack of trust comes at a cost, which is usually increased duplication: for Bitcoin, the cost is huge amounts of duplicate CPU work in Proof-of-Work cycles; for MaidSafe, it is huge amounts of duplicate data.
in "managed-trust" decentralized systems, we hope to centralize the system on our identities, both us as a person and us as a member of groups. between these identities, we hope to create trust relationships. these trust relationships form the topology of the network, so if we are sharing resources (CPU, data, food, shelter, ...), we can share directly instead of through a global network. this does mean though that we have to trust cryptography, the algorithms, and the identities we have trust relationships with.
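as a sketch of what "trust topology" could mean in code (assuming a simple map of direct trust values; nothing here is a real protocol), picking which trusted peers to replicate data to instead of spraying duplicates across a global network:

```python
def replication_peers(direct_trust, min_trust=0.3, max_peers=3):
    """pick direct trust relationships to replicate data to,
    instead of duplicating across a global zero-trust network."""
    candidates = [(value, peer) for peer, value in direct_trust.items()
                  if value >= min_trust]
    candidates.sort(reverse=True)  # most-trusted first
    return [peer for _, peer in candidates[:max_peers]]

peers = replication_peers({"simon": 0.7, "enspiral": 0.5, "stranger": 0.1})
```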