Web Consortium's failures show limits of self-regulation

  • W3C's credibility rocked by the failure of ‘Do Not Track’ and its embrace of DRM on the Web
  • Industry self-regulation is being passed off as ‘multi-stakeholderism’

Digital Consumers by Dr Jeremy Malcolm
 
ONE of the standard arguments that the United States and other developed countries make in opposing changes to Internet governance is that the Internet is already well governed through a multi-stakeholder model by a network of grassroots Internet technical community organisations.
 
These are said to include the IETF (Internet Engineering Task Force), ICANN (the Internet Corporation for Assigned Names and Numbers), the RIRs (Regional Internet Registries) and the W3C (World Wide Web Consortium).
 
Yet when you look a little closer, none of these organisations actually represents grassroots Internet users, or is even multi-stakeholder by any usual definition of the term. Nor are they capable of legitimately dealing with broader public policy issues that go beyond purely technical standards development and the allocation of Internet resources.
 
As a result, the process by which they reach such decisions is undemocratic, and some of the policy choices embodied in those decisions are unsupportable.
 
Unfortunately, these organisations often don’t seem to realise this, and will quite happily go about making policy heedless of their own limitations.
 
An example is the failed process by which the W3C’s tracking preference working group sought to develop a specification for a standard called ‘Do Not Track’ or DNT. The concept behind this standard (which I've written about in detail elsewhere) was to specify how a website or advertiser should respond to a notification expressed by users (typically through a browser setting) that they do not wish to be tracked online.
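 
For the technically curious, the mechanism itself was never the hard part. The minimal sketch below is purely illustrative (the server, handler name and response messages are mine, not the working group's text): the browser sends a "DNT: 1" request header when the user opts out, a compliant site is expected to refrain from tracking, and the draft also defined a "Tk" response header for acknowledging the signal. What the group could never agree on was what 'refraining from tracking' should actually require of websites and advertisers.
 
```python
# A minimal, illustrative sketch of honouring the DNT signal -- not the
# W3C draft itself.  The browser sets the "DNT" request header; the server
# checks it and skips its tracking logic when the user has opted out.
from http.server import BaseHTTPRequestHandler, HTTPServer

class DNTAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # "1" = do not track, "0" = user consents, absent = no preference
        dnt = self.headers.get("DNT")
        tracking_allowed = dnt != "1"

        if tracking_allowed:
            # analytics or ad-targeting code would run here
            body = b"tracking permitted for this request\n"
        else:
            body = b"tracking disabled at the user's request\n"

        self.send_response(200)
        # The draft spec also defined a "Tk" response header so a server
        # can state how it treated the signal ("N" = not tracking).
        self.send_header("Tk", "T" if tracking_allowed else "N")
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DNTAwareHandler).serve_forever()
```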
 
A few hard-working consumer representatives invested enormous effort in injecting public interest considerations into the standard, most notably privacy advocate Jonathan Mayer of Stanford University. Exasperated by the lack of progress, Mayer quit the group this August, followed shortly afterwards by the Digital Advertising Alliance (DAA), which declared the process a colossal failure.
 
Although it's not yet official, the working group is essentially dead.
 
Why? Because this is not the sort of standard that can be developed through the kind of industry-led process for which the W3C is suited.
 
The W3C is, as its name implies, essentially an industry consortium, with annual membership fees that start in the thousands and run into the tens of thousands of dollars (does this sound ‘grassroots’ to you?).
 
The ultimate decision-making power in the organisation lies with its Director, who has shown his willingness to override community views in favour of those of corporate members. The latest example is his unpopular decision, confirmed last week, to allow the W3C to add support for DRM-protected content to the official specification for the World Wide Web.
 
The W3C’s process is simply unsuitable for making any progress on a technical standard that involves disputed public policy issues, particularly those that impact broader community interests.
 
These public policy issues have to be resolved first – generally at a political level, and preferably through a more structured multi-stakeholder process – before it becomes possible to develop a technical standard on the basis of those decisions.
 
This was the mistake of the European Commission and the US Federal Trade Commission (FTC) in abdicating responsibility for regulating online tracking, in favour of the W3C process.
 
By and large, an industry body will not adopt public interest considerations unless forced to do so by a firm regulatory hand.
 
About 10 years ago, as an IT lawyer in Australia, I chaired what is described as a ‘co-regulatory’ industry panel that was responsible for developing a new policy for the Internet industry.
 
Co-regulation is essentially a form of self-regulation that occurs with oversight from a regulator. In the Australian model, the regulator can direct an industry member to comply with a registered co-regulatory code, even if it did not individually agree to its terms.
 
The concept is that industry will develop a strong code, in the knowledge that if the regulator isn't satisfied with the code, the industry will face direct regulation instead.
 
The practice, as I can attest from my experience as chair and from subsequent work as a consumer advocate, is that industry will grudgingly accept a code that doesn't significantly impact on its existing operations, but will actively obstruct anything stronger.
 
Community bodies captured by industry
 
The W3C is not the only example of this sort of dysfunction. The IETF has, to its credit, acknowledged its own limited inclusiveness (its parent body, the IAB, has 11 white males on its 13-member board).
 
ICANN has recently received blistering criticism over its failure to pay attention to the community's wishes, even while drawing in millions from the new global top-level domain goldrush. Meanwhile, soon-to-be-released research will show how decisions of the RIRs, such as APNIC, are similarly driven by shallow discussion among a narrow segment of stakeholders, even though that discussion takes place on notionally open mailing lists.
 
The underlying problem is that the Internet community bodies have been captured by industry, and by a narrow segment of civil society that is beholden to industry (exemplified by the global Internet Society, ISOC).
 
As a result, Internet technical standards are biased in favour of a US-led, free market-directed model of competition, which fails to incorporate broader public interest objectives (this has even been formalised in the OpenStand Declaration).
 
Standards development that involves issues such as consumer privacy and access to knowledge is a political process, and as such, capture by powerful interests becomes inevitable unless safeguards are set in place.
 
The industry-led specifications that have resulted from this paradigm speak for themselves. In July, industry released a standard for mobile apps to notify users of data collection using short-form notices, rather than lengthy privacy policies.
 
This voluntary standard, although based on a supposedly multi-stakeholder process set up by the US National Telecommunications and Information Administration (NTIA), has been criticised by American consumer groups both for its substance and for the process by which it was developed, which allowed an industry-dominated panel to push through a code that served their commercial interests.
 
Another example is the United States’ Copyright Alert System (CAS), by which Internet users' privacy is sacrificed to facilitate the delivery of copyright infringement notices to those who share content online – the system does not take account of ‘fair use’ or other copyright user rights.
 
This follows on from the 2007 Principles for User Generated Content Services, also written by industry, that were adopted by most major content platforms, and from codes agreed by major credit card companies and payment processors in June 2011, and by advertisers in May 2012, to withdraw payment services from websites allegedly selling counterfeit and pirated goods.
 
No consumer representatives (or even elected governments) had any say in the development of these codes. How is this a ‘multi-stakeholder’ model?
 
True multi-stakeholder processes (as defined at the 2002 Earth Summit, long before the Internet technical organisations appropriated the term) are:
 

processes which aim to bring together all major stakeholders in a new form of communication, decision-finding (and possibly decision-making) on a particular issue. They are also based on recognition of the importance of achieving equity and accountability in communication between stakeholders, involving equitable representation of three or more stakeholder groups and their views. They are based on democratic principles of transparency and participation, and aim to develop partnerships and strengthened networks between stakeholders.

 
Although often described (for example by the United States Government, and by bodies like ISOC that follow US foreign policy) as 'the multi-stakeholder model' of Internet governance, the Internet technical community organisations actually don't tend to embody these principles very well.
 
Although they are typically open to participants from stakeholder groups, no attempt is made to balance their participation so that the voices of weaker stakeholders (such as consumers) are not drowned out by those with the most resources or privilege.
 
New approach needed
 
Having open mailing lists is not enough, and indeed can mask abuses of the process – after all, it has been revealed that the NSA used IETF processes with the aim of weakening encryption standards.
 
A better approach is to craft, very deliberately, a public sphere for deliberation that is inclusive of all major stakeholder groups, and whose processes are participatory, transparent and democratic.
 
At the global level, this is accomplished using a stakeholder representation model (though, as we learn from ICANN, other mechanisms of accountability are also required to ensure that even this does not result in industry capture).
 
This is a work in progress, but even so, we do have an agreed template for what a multi-stakeholder Internet governance process should look like: the Tunis Agenda, which proposed a new Internet Governance Forum (IGF) and an enhanced cooperation process that together would exemplify this model.
 
The IGF, which meets later this month in Bali, could potentially become a much better place for determining globally applicable public policy principles, through a process of multi-stakeholder deliberation that could later be implemented by bodies like the W3C through technical standards.
 
If the IGF had been in a position to fulfil this role (to be clear, it isn't yet – as I will explain in my next article), the DNT standard might perhaps have stood a better chance of success.
 
In the meantime, responsibility for the failure of DNT lies, if anywhere, with the European Commission and the FTC for handballing the job, in the absence of firm policy guidance, to an industry-dominated standards body that was ill-equipped to carry it out.
 
The ball now returns to the regulators' court: regulation is a necessary part of the patchwork of governance (rules, norms, markets and code) that constitutes the true multi-stakeholder system of Internet governance, and norm-, code- and market-based solutions to protect consumer privacy have manifestly failed.
 
Dr Jeremy Malcolm is an Internet and Open Source lawyer, consumer advocate and geek. He is also a senior policy officer at Consumers International and can be found on Twitter and LinkedIn.
 
Previous Instalments of Digital Consumers:
 
Is digital piracy harmful to consumers?
 
DRM, the rights of the consumer ... and the UN
 
How the PRISM surveillance scandal affects Asia
 
Online advocacy, slacktivism and making a real difference

Consumers, domains and astroturfing

How the Trans-Pacific Partnership threatens online rights and freedoms

Internet freedom in a world of states

Copyright enforcement is killing people

WCIT: Freemasons, Internet memes and salt

 