Thursday, 21 July 2011

Knowledge and public interest: the six knowledge traps

Most of today's governance challenges sit at the interface of knowledge and public interest.

Regulation of information represents a unique challenge; in regulating information, we're regulating the way we communicate - and the ability to communicate openly and freely with another human being is undoubtedly a fundamental right.

In a world where some knowledge interchanges are prohibited by law, we risk a Heller-esque lock-in whereby evidence that could exonerate the defendant cannot be examined openly during a public trial because of the need for information to remain out of the public domain. This becomes incredibly dangerous if the state is the claimant and the judicial process is not transparent to public scrutiny. The legal system could gradually become subverted by vested interests within the state.

Whilst it's unhelpful to say the internet changes everything, it's equally unhelpful to bury our heads in a hacking scandal and ignore the wider problem: far more people than ever before hold powerful tools capable of intrusion, harassment and knowledge-abuse.



Intrusion: from simple digital recorders used openly or covertly, to stealing a password and accessing someone else's email.  Even open use of a camera at a private party may breach implied social confidence, if the picture is later published. People behave according to their environment and current company. Isn't it perfectly reasonable that people behave differently with friends, in a way they wouldn't want their boss - or parents - to see?

Harassment: can take many forms, from using multiple pseudonymous identities to attack character and integrity; to tagging hurtful pictures with the subject's name; to publishing sensitive information - or outright lies.

Knowledge-abuse: I was going to say blackmail, but it can be more subtle than demanding cash or favours in exchange for keeping schtum. The mutual exchange of favours forms the basis of civil society, so where do we draw the line? A press office keeping a list of favoured reporters who have in the past kept a damaging story quiet? Some damaging but legally snapped pictures, where the subject offers me more money than a tabloid would for the publication rights in order to keep the photographs out of the public domain - is that blackmail?

All these problems existed in various forms prior to the internet; but the internet, in making certain offending behaviour easier, has exposed flaws in both society and the law.

If we create additional regulation to attempt to solve such issues we risk six knowledge traps:

  1. Innovation: we don't give technology a chance to solve the problems it has exposed. We risk inhibiting progress by drafting laws and regulations based on communications capability less than 20 years into the www revolution. Progress would be stemmed by non-compliance.
  2. Accountability: it could become impossible to investigate those who influence our lives, giving them power without scrutiny. Individual interest could trump transparency. This is far more subtle than treating elected public officials differently - who polices the police in a world where recording the police in the course of their duty is a criminal offence? What about celebrities and other social leaders, who have the power to sell us one drinks brand over another? Business leaders responsible for corporate ethics? Religious leaders, with influence over their congregations? And where do we draw the line between the freedom to raise damaging allegations and any protection from defamation?
  3. Social progress: could be stemmed by regulation that is a generation behind. Technological capability aside, each new generation learns to live within a new reality with a revamp of social norms and customs. A generation opposed to CCTV cameras is replaced by a generation happy to live with the new reality. A generation unaccustomed to a world of blunt personal abuse bordering on bullying from distant contacts and complete strangers may be replaced by a generation with a thicker skin, living at least as happily as the previous generation within the new reality. We simply can't predict and pre-empt problems that a future society might face, so we shouldn't try. Let them figure it out for themselves, and regulate after the dust from the communications explosion has settled.
  4. Scalability of justice: if the law attempts to intervene too early to change widespread behaviour, there is a serious risk of illiberal and disproportionate sentencing - severe sentences handed down to act as a deterrent to others. In a country with around one thousand national, local and regional newspapers it's still possible to hold individual editors to account. With around half the country - 30 million people - having registered to use at least one social internet service, hierarchical oversight is simply not possible.
  5. Transparency of justice and democracy: as I mentioned at the start, regulating information can have serious consequences. Already we have a justice system cloaked in secrecy and control, on the pretence that a controlled examination of the evidence is the best way to determine guilt. The new reality has information everywhere. We don't know how often juries are influenced by uncontrolled information via the internet, because it's a criminal offence under the Contempt of Court Act 1981 to solicit any information from a juror about any trial. Decisions to limit public access to certain information can never be open to full public scrutiny without disclosing the information required to be kept secret.
  6. Competition: media plurality is more important than media accuracy now that the public can interact with each other directly and with ease. Inaccurate reports can be questioned, and reputations elevated or trashed, based on the accuracy of reporting. Any central control or regulation which acts as a barrier to entry for new and often innovative services threatens to put media control above the public interest. There can be no central control over who is and who is not a journalist without introducing a serious risk that this control mechanism can be subverted against the public interest.

Yet I'm not arguing for information anarchy or to legitimise intrusion. Privacy is so important, which is why I'm co-founding the Open Digital Privacy Foundation; but I feel that unless we greatly scale back our boundaries and expectations, we may all end up losing out in a mess of unworkable legislation and public confusion.

What should be regulated:

I feel the criminal law must focus on outlawing a few discrete areas: criminalising intrusion where it can be clearly defined with minimal risk from trap 2, and enforcing corporate responsibility on any organisation which holds personal data. These laws must be enforced vigorously.

Wire-tapping, bugging and similar are clearly areas where legislation should remain, or be reinforced.

But in blagging or bribing call centre staff for personal data, the newspapers have in a way exploited a loophole that I'm sure already existed, and will continue to exist even if the demand from the newspaper industry is removed.

It's not the right time for the law to attempt to control or censor information on a case-by-case basis (see Ryan Giggs).  This isn't scalable (trap 4) and is open to subversion (trap 5). 

Nor is it beneficial to focus solely on the supply of and demand for personal information. It can be counter-productive, because the criminal market is largely unaffected by the law: if there's a market, there are criminals willing to service it, despite the risk of punishment.

Information is undeniably valuable to blackmailers, organised crime, terrorists, etc. (apologies for falling into the "in these dangerous times" trap). We've got about the same chance of controlling the supply of illicit drugs as we have of preventing a call centre worker (who may or may not have a drug problem!) from selling data on the illicit market.

But, by focussing on corporate responsibility we can drive technological solutions. Give technology a chance, without falling into trap 1!  Data systems can be designed with privacy in mind - privacy by design.  Audit trails can highlight disproportionate requests.  In a call centre, the number of clients whose records are accessed should be approximately equal to the number of phone calls made or received. 
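To make the audit-trail idea concrete, here's a minimal sketch in Python. The log format, agent identifiers and the 1.5x threshold are illustrative assumptions rather than any real system's schema; the point is simply that accesses can be compared against calls handled to flag agents for human review.

```python
# Sketch: flag call-centre agents whose record accesses are out of proportion
# to the calls they handled. Log structure and threshold are hypothetical.
from collections import Counter


def flag_disproportionate_access(access_log, call_log, ratio_threshold=1.5):
    """access_log: iterable of (agent_id, customer_id) pairs.
    call_log: iterable of (agent_id, call_id) pairs.
    Returns {agent_id: (record_accesses, calls_handled)} for agents to review."""
    accesses = Counter(agent for agent, _ in access_log)
    calls = Counter(agent for agent, _ in call_log)

    flagged = {}
    for agent, n_accesses in accesses.items():
        n_calls = calls.get(agent, 0)
        # An agent should touch roughly one customer record per call handled;
        # anything well beyond that ratio is worth a closer look.
        if n_accesses > ratio_threshold * max(n_calls, 1):
            flagged[agent] = (n_accesses, n_calls)
    return flagged


if __name__ == "__main__":
    # Example: agent "a2" accessed 40 records but handled only 3 calls -> flagged.
    accesses = [("a1", f"c{i}") for i in range(5)] + [("a2", f"c{i}") for i in range(40)]
    calls = [("a1", f"call{i}") for i in range(5)] + [("a2", f"call{i}") for i in range(3)]
    print(flag_disproportionate_access(accesses, calls))
```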

Data can be compartmentalised, records assigned to teams, and access locked to each respective team. This introduces some inefficiencies, but what price privacy? Is the alternative - a battered reputation and/or a large fine from the Information Commissioner - worse?
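And a toy illustration of that compartmentalisation, again as a sketch: the agent names, team names and in-memory lookup tables below are hypothetical, and a production system would lean on the database's row-level security or a dedicated access-control service rather than dictionaries.

```python
# Sketch: each record belongs to a team, and an agent may only read records
# owned by their own team. All names and tables here are illustrative.
RECORD_OWNERSHIP = {"rec-1001": "team-north", "rec-1002": "team-south"}
AGENT_TEAM = {"alice": "team-north", "bob": "team-south"}


class AccessDenied(Exception):
    pass


def read_record(agent_id: str, record_id: str) -> str:
    agent_team = AGENT_TEAM.get(agent_id)
    owner_team = RECORD_OWNERSHIP.get(record_id)
    # Deny by default: unknown agents, unknown records and cross-team requests
    # all fail; every decision would also be written to the audit trail.
    if agent_team is None or owner_team is None or agent_team != owner_team:
        raise AccessDenied(f"{agent_id} may not read {record_id}")
    return f"contents of {record_id}"


if __name__ == "__main__":
    # alice (team-north) can read rec-1001 but would be denied rec-1002.
    print(read_record("alice", "rec-1001"))
```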

Good business can solve many problems with only a thin framework of legislation required. Legislation to ensure boards act in the interest of shareholders also binds boards to act in the interest of consumers, since failure to protect consumer interest can lead to reputational damage and financial losses.

This is just one reason why it's vital that consumers should be free to discuss corporate behaviour openly without the threat of libel. Transparency is more important than a corporation's ability to defend a manufactured reputation, as it drives ethical regard and is undoubtedly in the public interest. And once the link between consumer interest and corporate behaviour is established, boards will be bound to act in the interest of consumers - give or take a few reality checks to my self-regulating dream!

Social regulation

None of the above solves the social problems caused by individuals posting damaging or disturbing information about others. We already have laws to protect against out-and-out harassment. That aside, we're already seeing online behaviour moderated by new social norms.

Social reputation is a fantastic moderator. Just as in the real world, those who publish personal tittle-tattle or lies, outside of any public interest, become known as the neighbourhood gossip. People are wary of bringing them into their circle.

Unfriending can occur simply for tagging an awful photo of someone - the rules are down to each and every online community, and people choose to join or leave a community based on the norms of that community. Is it okay to tag a picture of a mate throwing up in the street? In some circles, yes. In mine, no.

Both technology and social systems are evolving to solve the problems brought on by the massive step change in communications capability the internet brings.  Community controls help prevent people I don't trust from bringing photos of me to the attention of those I do trust.  Inappropriate content such as pornography can be flagged by users, and excluded by choice from those who don't want to be offended.

On one hand this can be viewed as a race to the gutter, since it allows communities with what others see as very low social standards to evolve and coexist alongside all other communities. I feel we should all do our best to educate and promote our own ideals of responsible online behaviour. Sometimes people don't understand the long-term consequences of the information they choose to put online.

Yet sometimes those complaining don't understand the new culture emerging because of the internet - a culture that will undoubtedly normalise some activity previously frowned upon. There's only so much the law can achieve in preventing informational self-harm without having an adverse effect on democratic rights and freedoms.
