334 - Platform Shoves | Scoins.net | DJS

334 - Platform Shoves

This page has loose threads, dangling part(iciple)s and is incomplete, but I ran into interruptions and lost track of what it was I wanted to know.

A long time ago, long before Facebook was dragged off into court, Mark Zuckerberg was explaining the issues that stem from being a social media platform. When it comes to content, this often reduces to a binary position of platform or publisher. 

A platform enables communication or distribution of information; a publisher curates the content it distributes. That means that a social media platform acting as publisher is in some way judging the uploaded content as acceptable and is seen to be policing what is posted after the fact. That implies the existence of policy, which might itself be published. It rather suggests that platforms which accept they are publishers require proactive techniques (software) that inspect content at upload and reject some of it; the user is generally unaware that there is a process (it is sufficiently fast that content simply appears to upload, or to fail to upload). There are subsequent issues over what biases are imposed, deliberate or otherwise, to which I may return.   [1,3]

Of course, the binary choice between platform and publisher is incorrect; some platforms may become or behave as publishers, or indeed be deemed to have become so. Let us reserve platform for the process which hosts, whether or not it attempts to curate at all. Some publishers may also be platforms. I don't suppose that prevents external bodies from demanding action, and I can easily imagine a state body having the power to demand that a site be closed in some peremptory and immediate manner; but that is reactive, quite possibly for failing to delete illegal content, or content which could be argued as such. Again, that can be a position that varies between nations – think of the French/Islamist issues over Charlie Hebdo. As an example of national differences, burning a national flag means little in Britain but is of great significance in the USA. We rapidly reach a position where within a national border some platforms are allowed and some are not, such as Facebook in China.

One of the many concerns is where opinion shades into truth, particularly when opinion is shared more widely than in a two-person interchange, because at that point we move to places in which opinion is formed, or nudged in chosen directions. That would include counter-factual 'evidence' and the pushing of fake news, causing people to believe less in one truth simply because other 'truths' are on offer. In such ways are opinions formed, or prevented from forming.

"Social media companies are not held legally liable for any illegal content, as they are likely to fall within the ‘hosting’ exemption, where the provider’s relationship to that content as a host is considered merely ‘technical, automatic or passive’," the Committee said. "The hosting exemption requires that the company does not have knowledge of the illegal activity or information, and removes or disables access to it ‘expeditiously’ if it becomes aware of it. This has formed the basis for what is called the ‘notice and takedown’ model."  [4], quoting the EU e-commerce directive.

Just because Facebook has admitted it acts as a publisher does not make any other social media platform also a publisher. As yet, anyway; one of the issues to consider is whose laws might apply. 

It would be wrong to think that biased media are anything new; bias was present as soon as we had presses, since what one printed was entirely one's own issue (ha!). In Britain (and Europe) we have rules about what can or cannot be printed. However, factual reporting still has shading. One of the most controversial issues in modern reporting is media bias, particularly on political issues, but also with regard to cultural and other issues. [7] The brevity of news reports and the use of soundbites have reduced fidelity to the truth, and may contribute to a lack of the context needed for public understanding. From outside the profession, the rise of news management contributes to the real possibility that news media may be deliberately manipulated.

Some of the battle for public opinion falls into cyberwarfare, which I use here to describe state-sponsored action on an online medium. As Clausewitz put it, "War is the continuation of politics by other means." [7] In the context of the discussion I'm trying to provoke, I refer to the formation of political opinion and to propaganda.¹  Of particular concern is the deliberate manipulation of political opinion within a state; that field can include many conflicting actors, including the state itself and external nations.

Just within the relative safety of the UK, manipulation of opinion is something that every business wants to be involved in; it is called marketing. Within politics it is already disturbing how easily this manipulation of opinion can be done. We have powerful influencers (think media baron) with a serious vested interest in preserving the status quo – which might equate directly to wielding power. We also have nation-states interested in affecting opinion in foreign states (what I think of as cyber skirmishing); this is evident in allegations of Russian involvement with elections in the US and UK – and that's just what has reached the press. Hunting for content, I came across the Oxford Internet Institute [8], OII, which identified the UK, US and Russia as among the worst perpetrators of political misinformation, having found evidence of social media manipulation by government agencies, politicians and political parties, private contractors, civil society organisations, ordinary people and influencers in those countries. [11] The Institute for Strategic Dialogue, a London-based counter-extremism think tank, said it had recorded tactics that included the use of anti-Muslim tropes on Twitter accounts to try to influence Hindu voters to abandon Labour, and fake tweets and WhatsApp messages sent in the immediate aftermath of the London Bridge attack suggesting that Jeremy Corbyn had criticised the police response and been sympathetic to the killer. [12] I want a lot more detail about how this occurs and how the UK rates as a major proponent. (Suggested reading: [14].)

What can we do about this? We have free speech, which means that to a large extent you can say what you like. To curb excesses, this in turn means there should be a calling-out of untruth; doing this with politesse can be difficult, but it is often sufficient to point to correct fact and so to separate fact from opinion. Where there is opinion without fact, recognise this and, if you see the opinion as extreme and dangerous, dare to invite justification of said opinion. Personally, I'm more likely to write that person off as too extreme to be worth knowing, but that is a form of cowardice that preserves my own echo-chamber. Surely what we need is for dissenting voices to be heard, but not by shouting. Many of us simply watch but don't participate, such is the level of abuse directed at anyone disagreeing with a violently held (often unsubstantiated) opinion. I try to view such interchanges as if on the street and gauge my responses accordingly, though I am concerned that people are using the anonymity offered by platforms to raise their abuse levels well beyond what they'd dare if physically present. I see this as a social issue, not easily solved, which is going to get far worse before it improves.

Fact checkers are very useful. They need funding. They need crowd support. What I'm expecting to see are false fact suppliers masquerading as truth-tellers. Or truth-sellers.

We can have legislation, but it needs to adapt swiftly to new circumstance, and that is probably difficult to keep at a level of acceptability without falling into censorship. As [10, page 8] describes, Australia punishes acting against the national interest (a pretty broad brush), Germany acts where free speech conflicts with constitutional values, and France has judges rule on untruthful content. Quite what is illegal varies widely across the globe, and this too is a cause of conflict; things you cannot do or say in China are fairly easily found once you have access outside the border – what I have long thought of as a pipe through the Great Firewall. ²  A more positive approach is to raise public media literacy – I can see this being taught, too – but this sort of literacy applies to institutions quite as much as it does to individuals. In a society that purports to transparency (Canada, Norway and the UK are notable leaders, however much the natives may criticise), this has added value, as some sorts of fact are very readily checked. Having a recognised official source of detail goes a long way towards defeating fake truths.

DJS 20210115

Platform shoes plus a v. Hence top pic.

1   Propaganda can become a pejorative. Propaganda is biased communication. In general, propaganda is a neutral term, but is often used to imply manipulation. If researching this, look for internet manipulation.

2  The Great Firewall of China is a long-running reference to the Great Wall itself, originally built either to keep the Chinese in or the wandering hordes out. The Firewall serves a similar purpose; see also 35th May. A pipe is a UNIX technique for passing information from one program to another. You knew all this, didn't you?
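For the curious, a pipe in the literal UNIX sense looks like this – one process's output becomes another's input, with no file in between. The example below wires up the shell pipeline printf 'alpha\nbeta\ngamma\n' | grep -c a by hand (the test words are arbitrary; it assumes a Unix-like system with printf and grep on the path).

```python
import subprocess

# Producer: printf emits one line per argument.
producer = subprocess.Popen(["printf", "%s\n", "alpha", "beta", "gamma"],
                            stdout=subprocess.PIPE)
# Consumer: grep -c counts the lines containing 'a', reading the
# producer's stdout directly as its stdin -- that link IS the pipe.
consumer = subprocess.Popen(["grep", "-c", "a"],
                            stdin=producer.stdout, stdout=subprocess.PIPE)
producer.stdout.close()  # close our copy so grep sees end-of-file
count = consumer.communicate()[0].decode().strip()
print(count)  # all three test words contain an 'a', so this prints 3
```

In a shell the same thing is just the `|` character; the metaphor in the text is that a VPN or similar tunnel plays the pipe's role through the Firewall.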

Propaganda: when arguing whether the big white bird in view is a duck or a goose, discovering that not-duck is also not-female.

3 Secretary of State for Digital, Culture, Media and Sport. There really is a comma and it is not digital culture, but both digital and culture and media and sport. I know not what digital is as a noun. Oliver Dowden, as of Feb 2020. 

These reference links are not in an order, except as accessed and recorded.

[1]  https://yaircohen.co.uk/why-facebook-is-a-publisher/

[2] https://www.tutor2u.net/sociology/reference/who-owns-the-uk-media

[3] https://subsign.medium.com/is-facebook-a-platform-or-a-publisher-f2e2fd04d4eb  Curious use of qurate for the verb curate.

[4] https://www.pinsentmasons.com/out-law/news/uk-government-called-on-to-shift-liability-for-illegal-online-content-online-to-social-media-companies1

[5] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/666927/6.3637_CO_v6_061217_Web3.1__2_.pdf  Chapter 2 especially.

[6] https://en.wikipedia.org/wiki/Journalism_ethics_and_standards

[7] https://en.wikipedia.org/wiki/Journalism_ethics_and_standards. As usual, a wide perspective. Yes, I've contributed again. Also, https://en.wikipedia.org/wiki/Cyberwarfare and https://en.wikipedia.org/wiki/Internet_manipulation.   

[8] Oxford Internet Institute. An interesting source. Upcoming, Alternative News Networks. Abstracted while writing this piece: The populist campaigns against European public service media: Hot air or existential threat? https://journals.sagepub.com/doi/10.1177/1748048520939868  "First, the impartiality and objectivity of news media has generally become less taken-for-granted in a 'high-choice' media environment offering various news products of different quality. Secondly, historical left-right distinctions have become less clear-cut, also because right-wing populists challenge them. Consequently, the role of PSM in creating a shared national conversation which represents the diversity of society has also come under siege. At the same time, partisan websites and social media platforms enable certain groups to showcase content that is more aligned with the perspectives of right-wing populists."

I am reminded that it is extremely simplistic to require a political spread to be one-dimensional, only left-right. Read. My own brief thinking came up with something like the Nolan model or the spatial model - why stop at only two dimensions? What things do you think form any relevant axis?

[9] Report on Open Government: https://webfoundation.org/projects/ogd/ and https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/473603/51973_Cm_9151_Transparency_Accessible.pdf

[10] https://www.stratcomcoe.org/government-responses-malicious-use-social-media.   An interesting read. I thought this was Oxford Internet Institute output, but most of that is behind an academic wall. How I dislike that; where does the money come from, if not us the public?

[11] https://inews.co.uk/news/technology/social-media-political-manipulation-soared-disinformation-uk-us-russia-propaganda-826934?ito=twitter_share_article-top

[12] https://inews.co.uk/news/long-reads/general-election-2019-political-lies-disinformation-normalised-374249. You might also look at Compassion in Politics.

[13] https://docs.google.com/a/independent.gov.uk/viewer?a=v&pid=sites&srcid=aW5kZXBlbmRlbnQuZ292LnVrfGlzY3xneDo1Y2RhMGEyN2Y3NjM0OWFl  A 50-page report from a Parliamentary Committee. As readable as all such documents; I view this as the sort of thing I'd like to (have) put in front of a sixth-form tutor group. Quoting it: "we have accordingly decided to produce a shorter Report than usual, which takes the form of a summary of the most important points we have noted during the Inquiry, at a high level, without revealing underlying detail. We have supplemented this with a substantial Annex" (not published, so as to stop the Russians from reading it). An attempt to read what has been published soon runs up against redacted content. There is a conflict here: the security services are loath to involve themselves in democratic processes, which makes their defence of those processes difficult. The DCMS (the department for digital, culture, media and sport) ³ holds the remit for disinformation campaigns but has never looked outside the national borders. It would seem that the issue of defending the UK's democratic processes and discourse has appeared to be something of a 'hot potato', with no one organisation recognising itself as having an overall lead.

[14] https://www.europarl.europa.eu/RegData/etudes/STUD/2019/608864/IPOL_STU(2019)608864_EN.pdf   According to a recent study, a significant generational divide can be observed: people over 65 share seven times more fake news than young users do. In addition, the sinking popularity of Facebook and the growing popularity of messaging services such as Snapchat may also signal that this phenomenon, which has dominated public concerns for democracy in the past few years, may be taking a new direction. Just a teaser to persuade you to read more of this.


  Email: David@Scoins.net      © David Scoins 2020