GISWatch 2019 call for proposals: Artificial intelligence: Human rights, social justice and development
Global Information Society Watch (GISWatch) 2019
Annual Report
Call for proposals
Terms of reference (TOR) for country reports
Theme: Artificial intelligence: Human rights, social justice and development
Deadline for proposals: 19 April 2019
Introduction
This 2019 edition of GISWatch will focus on the implications of artificial intelligence (AI) systems for human rights, social justice and development in the local context, with particular attention to countries in the global South.[1]
AI[2] is now receiving unprecedented global attention as it finds widespread practical application in multiple spheres of activity: from aviation and transport, to medicine, agriculture and climate change; from policing, surveillance and military robotics, to warehouse operations management, the provision of social services, smart technology in the home, search engines and social media.[3]
AI can be defined broadly as computer systems designed to perform tasks in a way that is considered to be intelligent, including those that “learn” through the application of algorithms to large amounts of data. It is not a new phenomenon: it has been around for six decades at least. But while definitions of AI (and subsets of AI)[4] might vary, it is really through the context of its conceptualisation, design and application that the meaning and social implications of its use can be understood, particularly as a result of how bias and power are embedded in AI systems.[5]
The conversation on AI has so far been driven largely by Western and global North perspectives. However, the assumptions, values, incentives and socioeconomic environments within which AI technologies function vary greatly across jurisdictions. There is no single metric that can be applied to properly understand the success, pitfalls and effects of AI on societies across the world. The questions remain: What specific and unique issues arise in different contexts? How can we make the conversation around AI more global and inclusive from the outset?
We are interested in the human rights, social justice and development implications of the application of AI in specific contexts, with an emphasis on developing countries in the global South. What effect do AI systems have on the distribution of wealth and access to resources in the digital era? What impact do these systems have on vulnerable and marginalised populations around the world? How do they impact – positively or negatively – human rights concerns such as privacy, freedom of expression and association, access to information, access to work, the right to organise and join trade unions, the right to food or housing, and the right to life? What are the political implications of the widespread use of data in building AI systems? What does this mean from an intersectional feminist perspective? Are there any implications for transparency and accountability, or related concerns such as open knowledge systems, open hardware and open internet architectures?
How to participate in this call
1) Read the instructions contained in this call, and if you wish to participate, send your proposal before the deadline to GISWatch production coordinator Maja Romano (maja@apc.org) and editor Alan Finlay (editor@giswatch.org). The proposal must be written in English, reach us by 19 April 2019 at the latest, and include the following information (no more than 400 words):
a) Name, organisation, country
b) Outline of the issue or topic you will write about. We need to know:
i) What area of AI will you be exploring?[6]
ii) What is the context that you will be writing about? Include here the specific social application of the AI technology you will be discussing. We are interested in concrete, real-life situations that can be described so that the implications of AI become "visible" for the reader. One way to do this is to focus on a story or narrative (e.g. a local community, a policy advocacy process, a new technical development, an event) that helps to set the scene for the broader discussion of AI and human rights in a concrete and meaningful way. It is by exploring specific experiences at the local level – rather than in a high-level abstract way – that the nuanced implications of the use of technology can be understood.
iii) What are the expected human rights and/or social justice and/or development implications of the application of AI in this context that you will be exploring?
iv) What are the envisaged policy advocacy implications of your report that you expect to discuss?
v) Are there any specific research methods you will follow in writing your report? For example, will you conduct interviews with stakeholders, conduct a survey, or convene a workshop/meeting where others can share their views?
vi) How will you engage other civil society organisations working in this field in your country?
2) The authors will be selected by the middle of May. If you are selected, you will have up to two months to write your report, which must be submitted by 30 June 2019.
More on the report writing process
1) If your proposal is selected, the report you write on your chosen topic must be written in English and have a maximum length of 2,300 words. For consistency, the report should be developed using a template that will be provided to authors. APC will provide you with background readings and online training, and will support you during the writing process. You will also be able to share your progress and ideas with other authors through a mailing list set up for the country report authors selected for this edition; this exchange helps make the publication more cohesive and representative of the global situation.
2) Once submitted, your report will enter the editing process. The report will be edited by the GISWatch editor and returned to you for clarifications or to respond to editorial comments. To ensure consistency in the quality of the published reports, editorial comments are often substantial, so authors should allocate enough time to respond to the questions and changes raised. This process will take place from August until September 2019.
3) Once the final report has been accepted, organisations will receive a payment of USD 700 in support of the writing.
If you have questions, do not hesitate to contact us:
GISWatch: Maja Romano (GISWatch production coordinator, maja@apc.org), cc'ing Alan Finlay (editor@giswatch.org).
Website: https://www.giswatch.org/
We look forward to your report proposal! Remember the deadline is 19 April!
Important: Please note that the aim of GISWatch is to encourage local participation in rights-based issues. Because of this, for this edition it is critical that lead authors or organisations are resident in the country they are writing about. Under certain circumstances, we may accept proposals from lead authors who are not resident in the country they wish to write about, such as proposals from displaced persons or from authors with strong firsthand experience of the country. Lead authors may also wish to coordinate co-authors for the chapter; these co-authors do not necessarily need to be based in the same country.
Potential report angles
While your report proposal should be concrete and specific in discussing the local context (please see proposal requirements above), you might want to use the suggestions below as topics to explore or as entry points for your analysis:
Gender: Feminist critiques of AI systems, including analysis of the design, development, deployment, solutions and use of AI systems.
Inclusivity: What are the various ways, whether through training, policy or access, in which AI and machine-learning systems exclude or include vulnerable communities?
Technical considerations: What technical fixes or limitations are important to consider in the context of AI and machine-learning systems in specific contexts?
Design: How do design choices impact the use and effect of AI and machine-learning systems?
Human rights: What is the nexus between AI and machine-learning systems and human rights in general?
AI and new forms of censorship: Given that AI is increasingly used to police unlawful or infringing content, what implications does this have for freedom of speech and expression? What about the increasing demand to use AI tools to police online toxicity such as gender-based hate speech and hate speech targeting minorities?
AI and privacy: From data used to train AI systems to AI applications like facial recognition, what impact do AI and machine-learning systems have on privacy?
AI and data protection: How do AI and machine-learning systems interact with data protection regimes across the world? For example, the right not to be subject to automated decision making, or the right to an explanation when people are legally or significantly affected by automated decision making.
Economic, social and cultural rights (ESCRs): How can ESCRs inform the ongoing debate about the impact and potential of AI and machine-learning systems? What are the human rights challenges of labour automation? What about the positive implications of using AI to monitor workers in order to improve their safety in the workplace?
Anti-trust: What impact do AI and machine-learning systems have on competition in economies?
Public and private accountability: What impact, positive or negative, can AI and machine learning have on accountability mechanisms? These can be technical, legal or social.
Economy: What is the impact of AI and automated systems on the way in which resources and wealth are distributed locally?
Knock-on effect: How do AI and automated decision making intersect with and affect other structures and mechanisms of societal decision making?
Liability and responsibility: Who is responsible for the consequences of automated decision making, particularly when these cause harm or negatively affect people? What are the minimum standards for regulatory compliance and transparency?
National security and the military: The use of AI in national security, law enforcement and the military, such as automated weapons systems (AWS) and the use of AI in cyberwarfare.
Timeline
Deadline for proposals: 19 April 2019
Authors informed of accepted proposals: mid-May 2019
Authors to prepare country chapter: May-June 2019
Deadline for country chapter: 30 June 2019
Editing process: July-August 2019
Deadline for final country report: 31 August 2019
Notes
[1] Although the primary aim of GISWatch is to promote the views and experiences of authors from countries in the global South, we also welcome proposals from authors in developed countries across the world.
[2] Our use of the term artificial intelligence (AI) is broad, and encompasses a range of technologies, approaches and processes including algorithmic decision making, machine learning, deep learning platforms, natural language processing and speech recognition, virtual agents, biometrics, computer vision and robotic process automation.
[3] For example, through the aggregation and curation of content appearing on our social media feeds, online profiling and the delivery of targeted advertising, and the construction and operation of content filters.
[4] For example, machine learning is the most popular subset of AI.
[5] Our popular understanding of AI is also often framed by science fiction utopias and dystopias. The AI of science fiction involves the concept of “strong AI” (AI that replicates human intelligence, rather than performing tasks associated with it – the super-intelligent robots of the future). But bias and power are just as easily embedded in what we might call “weak AI”, such as the algorithms that shape our everyday digital lives. While weak AI is less “theatrical”, its threats have very real consequences for human development and human rights, especially considering its ubiquity in everyday computer systems. (See John R. Searle, “Is the Brain’s Mind a Computer Program?”, Scientific American 262, no. 1 (1990): 25–31.)
[6] For example, machine learning, natural language processing, neural nets and so on.