Loomio
Sun 20 Jul 2014 5:58AM

How To Stop Censorship & Protect EndUsers

DU L4C0F Public Seen by 99

I've been saying this all along, but it only became entirely clear yesterday, thanks to:

https://www.loomio.org/d/ikJUaz3l/word-tag-filter-option

My original concerns, post & inquiry re: these issues began after I was directed, both on Twitter & while on Diaspora itself, to come here:

https://www.loomio.org/d/tuFjC4Tn/concerns-re-abuse-child-safety

So, this is what I propose (& have all along), & why.

#1. You are making #podmins #censors, which is not needed. The only system in place in the current build of this network is to 'report/flag' someone for what they say. That makes the podmin the censor. That's not freedom, that's not free speech, & it is entirely contrary to why we all need & desire a voice on a decentralized FOSS platform.

#2. Comply with both US & international law re: minors and pornography. That part is really, really simple: when a new user signs up on any pod, make them designate themselves as an adult or a minor in their own country.

Once that part is in place, restrict all NSFW content so minors' accounts can't view it. Voila! You've just eliminated ALL legal liability for anything that is legal, i.e. adults can post porn, speak freely & be trolls, while kids can use the FOSS Diaspora social network on any pod without getting sex ed the wrong way.
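The proposed rule is simple enough to sketch. This is a minimal illustration under assumed, hypothetical names (`Account`, `Post`, `stream_for` are not part of Diaspora's actual codebase; diaspora* is Ruby on Rails, so this Python is purely for exposition): an account records the adult/minor designation made at sign-up, and any post tagged #nsfw is hidden from minors' streams.

```python
# Hypothetical sketch of the "minor accounts can't see #nsfw" rule.
# All names here are illustrative, not diaspora*'s real data model.

from dataclasses import dataclass, field

@dataclass
class Account:
    username: str
    is_adult: bool  # designated once at sign-up, per the proposal

@dataclass
class Post:
    author: str
    body: str
    tags: set = field(default_factory=set)

def visible_to(viewer: Account, post: Post) -> bool:
    """Hide any #nsfw-tagged post from accounts designated as minors."""
    normalized = {t.lower().lstrip("#") for t in post.tags}
    if "nsfw" in normalized:
        return viewer.is_adult
    return True

def stream_for(viewer: Account, posts: list) -> list:
    """Return only the posts this account is allowed to see."""
    return [p for p in posts if visible_to(viewer, p)]
```

Note that, as later posts in this thread point out, the rule only works when posters actually tag their content; untagged adult content passes straight through.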

#3. Ultimately, this is my belief: using tags, minor accounts, & clearly warning kids & parents PUBLICLY about this project, while making an effort to keep it safe for kids to use alongside free-speaking adults, ultimately strengthens Diaspora's position in the future development of decentralized social networks, empowering free speech instead of censoring it as you do now.

#4. I don't code Ruby on Rails; this is now your work ;-)

DU

L4C0F Sun 20 Jul 2014 6:12AM

Whether you realize it or not, this thread determines whether Diaspora & Loomio become a safe haven for abuse & violence, or instead make it legally fine for both kids & adults to say literally anything, just not threaten violence, post illegal porn, or show adult content to kids. It's quite simple.

Think about it. I will return tomorrow.

T

Theatre-X Sun 20 Jul 2014 6:15AM

Which comment do you speak of? I was just putting that out there if you were speaking of the anarchy one. Hmm. I'll meditate on this a bit.

DU

rekado Sun 20 Jul 2014 6:26AM

Once that part is in place, restrict all NSFW content so minors' accounts can't view it. Voila!

Please clarify "all NSFW content" --- content that has been tagged with "#nsfw" or any NSFW content, including untagged adult pictures, pornographic texts, hate speech...?

DU

L4C0F Sun 20 Jul 2014 6:39AM

+1 rekado! Thank You!

OK: NSFW is porn.

Hate speech, and the freedom to spout anything & everything unless it's literally a threat under law, should NOT be NSFW.

Re: minor accounts. This is WHY having a disclaimer & warning parents publicly is a GOOD IDEA. It actually saves all your asses from lawsuits & angry folks like people I know on Twitter lol.

So, if a kid sees a neo-Nazi trolling someone, that's fine by me, as long as the kid can stop the abuse if the troll targets their public page on posts they make for their friends & family.

I see no reason why disabling comments on posts from an abusive user is 'censorship', since that user can easily post public rants about said end user, & I support such action should they desire it. That, I suppose, is more of a cyberbullying issue I'm less concerned with, as I feel:

a.) A child & parent should know beforehand that people speak freely here, & just how freely lol; that's needed.

b.) Kids should have a safe place to be amongst their peers & adults, while being protected from abuse on their own pages.

c.) A group for handling harassment, like Theatre-X proposed, would I think be good for public harassment or threats.

The entire point is: protect kids from porn, protect podmins from lawsuits, protect free speech, & stop pretending the system you have in place isn't CENSORSHIP LOL.

It most certainly is. Think about all this.

DU

L4C0F Sun 20 Jul 2014 7:29AM

Also, to note: in my report I suggested you strip, & enforce stripping of, ALL EXIF data on every pod in every build, & make it a priority.
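For readers unfamiliar with what stripping EXIF entails, here is a minimal byte-level sketch. It is purely illustrative, with hypothetical names: a real pod would more likely use an image-processing library, but the idea is just to drop the APP1 segments (where EXIF metadata, including GPS coordinates, lives) from a JPEG before storing it.

```python
# Illustrative sketch only: remove APP1 (EXIF) segments from a JPEG
# byte stream by walking the marker segments. A production pod would
# likely use an image library rather than hand-rolled parsing.

import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Return the JPEG with all APP1 (EXIF) segments removed."""
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows, copy the rest
            out += jpeg[i:]
            break
        # Segment length is big-endian and includes the two length bytes.
        (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        if marker != 0xE1:  # keep every segment except APP1 (EXIF)
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

The pixel data is untouched; only the metadata segment disappears, which is exactly the privacy property being asked for.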

Additionally, from an infosec perspective, not encrypting user data on pods is a serious security fail. Think about that & create a thread on it, as I don't do Ruby on Rails & I am not here to flood this community with MY voice only.

The other question I have is Diaspora pods & Tor.
Why isn't this being done?

Why on earth hasn't Moglen figured out yet that this is essential for future FreedomBox dev? ;-) Think about it, people.

Also, I suggested that instead of arguing over better & safer ways to improve your network, maybe you could even have people teach users about security: things like steganography, for example ;-)

https://plus.google.com/113772323131310131648/posts/EKGTrhg1kQA

I mean, doesn't this all actually make sense to you now, thinking about it? I'm your ally, not the enemy, folks.

BK

Brad Koehn Sun 20 Jul 2014 12:17PM

There are a number of challenges:

  • There's no consensus on what constitutes porn, legally or morally. As one example, possession of child porn was legal in Japan until earlier this month.

  • There's no way to force users to tag porn, or for software to automatically recognize it. See previous point.

  • It's unlikely you'll be able to get users to tag content that is illegal as such, for fear of prosecution.

As far as I can tell, this means we need a mechanism whereby users can report posts and/or comments they find offensive or illegal, so a human being can make a judgement based on their own values, laws, and TOS. A mechanism which, as far as I can tell, we already have.

L

lnxwalt Sun 20 Jul 2014 12:40PM

It is not possible to automatically block underage users from seeing any youth-inappropriate content. That is why youth have parents.

  • Children and teens routinely claim to be older so they can evade any restrictions on what they can view and post. We all know people who had Facebook accounts when they were 8-10 years old, despite the US age limit of 13.
  • Without voluntary "NSFW" classification by the original poster, it would require invasive content-scanning, which is guaranteed to misclassify some posts. There is no guarantee that the original poster will be in a jurisdiction with the same laws as the viewer, either, so even after automatic scanning, some inappropriate posts will be delivered.
  • School filters: I still remember hearing teens talk about how easily they evaded the filters and how many things the filters did not catch. Why would anyone think that we're suddenly going to solve something that school censors, and the companies they hire, still cannot fix after fifteen or twenty years of continued effort?

In the end, we have to depend on those who post to have the courtesy to mark things #nsfw that may not be appropriate in workplaces, homes, schools. This is a judgment call by the poster and may not match what an underage viewer in another country should see.

DU

Deleted account Sun 20 Jul 2014 12:48PM

"It is not porn, it is art!" Hop! You're stuck! What is porn to you might not be porn to me, especially with child porn. E.g.: is a picture of a nude child child porn? Yes? Unless it is a holiday picture in a naturist camp, which is perfectly legal in some countries. And nude babies? Would you be offended? Try to count how many parents post photos of their babies in the bath on Facebook. Nobody's shouting against that. So, above which age does nudity stop being cuteness and become child porn?

Furthermore, you can't demand an end to podmin censorship while trying to enforce your own censorship for some categories of users! Your argument is pointless, man!

DU

Deleted account Sun 20 Jul 2014 1:18PM

Concerning the law: in France, here is what the law says, which means that in its current form, D* respects it perfectly.

But you didn't read it, since you can't speak French, right?
So how can you assert that D* is not in accordance with international law, since you cannot have read the laws of all 197 countries in the world?!

G

goob Sun 20 Jul 2014 2:39PM

As far as I can tell, this means we need a mechanism whereby users can report posts and/or comments they find offensive or illegal, so a human being can make a judgement based on their own values, laws, and TOS. A mechanism which, as far as I can tell, we already have.

Exactly.

L

lnxwalt Sun 20 Jul 2014 2:41PM

Additionally from an infosec perspective, not encrypting user data on pods is a serious security fail.

It is routine to encrypt (or rather, hash) passwords, but not any other part of the database. Encrypting everything else would make most web applications unusable: the continual encrypt/decrypt cycle would nearly paralyze a server (pod) with more than a very minimal amount of activity (including activity federated from other pods).

Do you store the keys on the server, or do you make podmins enter them every time there is a reboot? Either way, it would certainly be hard to run most SELECT x WHERE ... queries when the contents of the database are just opaque blobs.

So it is not just Diaspora that doesn't do this. It is nearly every site anywhere.
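The "opaque blobs" problem can be made concrete with a toy sketch. The XOR "cipher" below is purely illustrative (not real cryptography), and the schema is hypothetical; the point is that once a column holds ciphertext, SQL's WHERE clause can no longer match on the plaintext, so the application must fetch and decrypt every row itself.

```python
# Toy illustration: SQL cannot query encrypted columns by plaintext.
# The XOR "cipher" is for demonstration only, not real cryptography.

import sqlite3

KEY = 0x5A

def enc(text: str) -> bytes:
    return bytes(b ^ KEY for b in text.encode())

def dec(blob: bytes) -> str:
    return bytes(b ^ KEY for b in blob).decode()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (author BLOB, body BLOB)")
for author, body in [("alice", "hello"), ("bob", "hi")]:
    db.execute("INSERT INTO posts VALUES (?, ?)", (enc(author), enc(body)))

# The natural query finds nothing: the database only sees opaque blobs.
miss = db.execute("SELECT body FROM posts WHERE author = ?", ("alice",)).fetchall()

# To find alice's posts, the application must decrypt every row itself.
hits = [dec(body) for author, body in db.execute("SELECT author, body FROM posts")
        if dec(author) == "alice"]
```

On a pod with any real volume of federated activity, that full-table decrypt on every lookup is exactly the paralysis described above.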

The other question I have is Diaspora pods & Tor. Why isn't this being done?

That is up to the individual podmin and that person's hosting company. Some hosting services forbid Tor entirely. Some may allow Tor hidden services but forbid relays or exit nodes. Tor is not a magic wand, so those who choose to host Tor services, relays, or exit nodes should take the time to understand the implications of what they are doing first.

... maybe you could even have people teach users about security, things like steganography for example ;-)

Steganography is security by obscurity. As soon as someone who can intercept your messages suspects that you are using steganography (hiding secret messages within the content of other files or messages), the content of your hidden messages is in danger. Naturally, it depends on the resources available to your presumed attacker, but if you assume that a government agency, or one of the large telecoms that carry the data, has an interest in it, they will get the hidden message.

That being said, individual users on various pods can (and have) discussed steganography. It is definitely not something that should occupy the time of Diaspora developers (that is, D* should not add a data-hiding tool), but it is certainly of interest to some users.
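For readers curious what the technique actually looks like, here is a minimal least-significant-bit (LSB) sketch, with hypothetical names, of one common steganography scheme: the message rides in the low bit of each carrier byte. It also demonstrates the objection above: anyone who suspects LSB hiding can extract the message with the same few lines of code.

```python
# Minimal LSB steganography sketch (illustrative only).
# The message bits replace the least significant bit of each carrier byte,
# so the carrier changes by at most 1 per byte and looks nearly identical.

def embed(carrier: bytes, message: bytes) -> bytes:
    """Hide message bits (LSB-first) in the low bit of each carrier byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract(carrier: bytes, length: int) -> bytes:
    """Read `length` bytes back out of the carrier's low bits."""
    bits = [b & 1 for b in carrier[:length * 8]]
    return bytes(
        sum(bit << i for i, bit in enumerate(bits[n * 8:(n + 1) * 8]))
        for n in range(length)
    )
```

Note that `extract` needs no key or secret at all, which is the "security by obscurity" weakness in a nutshell; real use would at least encrypt the message before embedding it.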

BK

Brad Koehn Sun 20 Jul 2014 2:48PM

I would recommend that we restrict the conversation to the topic described. If you wish to discuss security or steganography please open another topic.

DU

L4C0F Mon 21 Jul 2014 2:36AM

BK

Brad Koehn Mon 21 Jul 2014 4:01PM

I don't think conflating the NSFW tag with content inappropriate for minors relieves one of any legal liability (I don't think there is such liability in most countries, including the US). Can you cite the specific US and international laws that say a website cannot show pornographic content to minors and is liable for damages if it does? I would think most porn sites would be in serious legal trouble if that were the case.