Exploring Frameworks for Ethically Responsible and Scalable Network Interference Measurement

Blog Post
Dec. 7, 2015

OTI fundamentally believes that everyone has the right to access an Internet that is open and secure. Over the last few years, however, wide-scale network interference and surveillance have proliferated. As a result, there has been growing interest in how to detect and measure network interference, and in understanding the various forms it can take. While some interference is blatant and obvious, much of it is far more subtle, and by design nearly impossible for anyone but an IT expert to detect, much less protest or mitigate.

In 2014, supported in part by the Knight Foundation, Open Technology Fund, and TIDES Foundation, OTI began researching network interference measurement to explore how it could support our larger research and policy objectives. This research consisted of experimenting with censorship measurement tools and analysis techniques, as well as convening researchers and tool developers already working in the space.

From this research and these convenings, several core themes emerged as areas of focus and improvement for the field:

  1. Community building. Given the nascent nature of the field, better communication is crucial to building a strong community. Communication makes it possible to establish and enforce norms around vetting methodologies, sanitizing and securing data, and other facets of research that are not currently formalized.

  2. Vetting methodologies. We need to cultivate greater interdisciplinary expertise, and to ensure that all new methodologies are vetted by people with sufficient breadth to make meaningful determinations about what is responsible, ethical, and possible, and what is not.

  3. Informed consent. We need flexible guidelines and structures for determining when informed consent must be obtained, and how to obtain it. For example, when a study involves highly technical interventions, adjustments must be made to facilitate the participation of people who have almost no prior experience with the subject and may not understand the technical language.

  4. Data standards and sharing. Common standards, structures, and processes for collecting and analyzing data will enable better sharing within the community and beyond, permitting external validation of studies and comparable results across tools and processes. This should include standard practices for what personal metadata can be safely gathered and what needs to be scrubbed, limited, or not collected at all (a minimal sketch of such scrubbing follows this list).

  5. Information and systems risk. It is critical to develop standards for study design that build in a realistic, iterative assessment of risk to every participant in the study (human or otherwise). These standards need to incorporate the input of major stakeholders, including the provider(s) of infrastructure used for the study, institutional review boards (IRBs), and study subjects themselves.
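
To make the data-standards theme concrete, below is a minimal, hypothetical sketch of a pre-publication scrubbing step for a single measurement record. The field names, the `scrub_record` function, and the specific policy choices (coarsening client IPs, truncating timestamps) are illustrative assumptions for this post, not part of any existing tool; actual practices would be set through the community processes described above.

```python
"""Hypothetical sketch: scrubbing personal metadata from a measurement
record before sharing. Field names and policies are illustrative only."""

import ipaddress

# Fields that identify the volunteer rather than the network behavior
# under study; under this sketch's policy they are never shared.
DROP_FIELDS = {"client_hostname", "user_agent", "operator_notes"}


def scrub_record(record: dict) -> dict:
    """Return a copy of `record` that is safer to share.

    Policy sketched here:
      * drop directly identifying fields entirely;
      * coarsen the client IP to its /24 (IPv4) or /48 (IPv6) network,
        keeping enough detail to attribute results to a network,
        not a person;
      * truncate the timestamp to the hour to resist correlation.
    """
    clean = {k: v for k, v in record.items() if k not in DROP_FIELDS}

    if "client_ip" in clean:
        ip = ipaddress.ip_address(clean.pop("client_ip"))
        prefix = 24 if ip.version == 4 else 48
        network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        clean["client_network"] = str(network)

    if "timestamp" in clean:
        # Assumes an ISO 8601 string such as "2015-12-07T14:23:09Z".
        clean["timestamp"] = clean["timestamp"][:13] + ":00:00Z"

    return clean
```

Even a small shared convention like this, if vetted by the interdisciplinary reviewers described in point 2, would let independent teams compare results without exposing the people who contributed the measurements.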

Read the full report.