Welcome to the Official Schedule for RightsCon Toronto 2018. This year’s program, built by our global community, is our most ambitious one yet. Within the program, you will find 18 thematic tracks to help you navigate our 450+ sessions.
Build your own customized RightsCon schedule by logging into Sched (or creating an account), and selecting the sessions that you wish to attend. Be sure to get your ticket to RightsCon first. You can visit rightscon.org for more information.
If you’ve created a profile with a picture and bio, please allow a few hours for the RightsCon team to merge it with your existing speaker profile.
Last updated: Version 2.3 (Updated May 15, 2018).
This session, jointly organized by UNESCO and ICANN’s Non-Commercial Users’ Constituency, aims to foster a robust discussion on the strengths and weaknesses of multistakeholder models and to develop recommendations to improve these processes in the future.
Multistakeholder governance models are built around the idea of bringing diverse stakeholders together to collaborate on policy making solutions. These models have become particularly prevalent in Internet governance, where representatives from the commercial, technical, academic, governmental and civil society sectors all have a seat at the table and share a role in policymaking. However, systems that are designed to be egalitarian can nonetheless manifest biases in practice. In ICANN’s case, although the IANA transition has already occurred, uncertainty persists over where governments’ role in Internet governance ends and ICANN’s begins, allowing the former to wield a powerful stick over the process. Meanwhile, human rights advocates and other non-commercial interests, who in theory engage on an equal footing with their counterparts from the business community, can be placed at a natural disadvantage by the fact that they generally have fewer resources to work with.
The session will welcome participants from ICANN and from CGI.br, as well as multistakeholder Internet governance participants from civil society, academia, and the private sector, to discuss the challenges and future of multistakeholder Internet governance. Among the entry points to the discussion will be a recent study released by UNESCO, “What if we all governed the Internet? Advancing multistakeholder participation in Internet governance”, which was developed as part of the UNESCO Series on Internet Freedom.
The session will be moderated by UNESCO representative Ms. Xianhong Hu and Mr. Michael Karanicolas of the Executive Committee of ICANN's Non-Commercial Users’ Constituency.

The current technological landscape has several tools for secure, encrypted, real-time group communication, be that text chat, voice chat, or even video. Tools that can make this happen include Signal, WhatsApp, and Wire, and they are awesome! One property these tools all share is reliance on a piece of central infrastructure to orchestrate not just the communication logistics, but the security assurances as well.
The (n+1)sec protocol offers the possibility to have end-to-end encrypted group communications in a decentralized environment. It can be used as an overlay to any general-purpose communication network, including federated and peer-to-peer infrastructures. This session will describe the social implications and complications of running an encrypted group discussion without a central authority for key selection, transcript consistency and room moderation. We would like to engage the audience in a discussion around the presented challenges as well as the broader applications for Internet communications privacy.
https://github.com/equalitie/np1sec
The Digital Security Clinic is run by the staff of Access Now’s Digital Security Helpline, a free-of-charge resource for civil society around the world. We offer real-time, direct technical assistance and advice to activists, independent media, and civil society organizations. You can find out more about the Helpline at https://www.accessnow.org/help
The Helpline will be holding open drop-in hours in the Access Now Lounge on the second floor from 1:30pm to 5pm on Wednesday and Friday, and from 9am to 1:30pm on Thursday. You can also arrange for a one-on-one appointment with a Helpline staff member throughout the conference by emailing help@accessnow.org.
The staff identify problems and help attendees implement practices that can protect them from threats. During an average visit, Clinic technologists:
1// Assess risks and needs
2// Analyze current practices
3// Troubleshoot problems
4// Provide tools and training to address emergent issues
5// Initiate cases with Access Now’s Digital Security Helpline for any issues not resolved on the spot
6// Refer to specialists where the Helpline is unable to assist
In this session, we will bring together panellists from the European Union Agency for Fundamental Rights (FRA), the Office of the Privacy Commissioner of Canada, the Canadian Security Intelligence Review Committee, and the U.S. Office of the Director of National Intelligence, each of whom is an expert in issues relating to privacy, digital rights and oversight of intelligence services. The discussion will be moderated by Amie Stepanovich, U.S. Policy Manager at Access Now.
The panellists, briefly drawing upon their experiences from Canada and the United States, will discuss the importance of effective oversight as an indispensable safeguard for reviewing the activities of the intelligence services and upholding the rule of law. Following a presentation of the critical comparative EU research findings of the recently published (October 2017) second surveillance report (Surveillance by intelligence services: fundamental rights safeguards and remedies in the EU), which brings a European Union comparative perspective to the debate, emphasis will be placed on the institutional guarantees that oversight mechanisms need to incorporate in order to be independent, efficient and transparent. Contemporary challenges for oversight bodies, which arise from a field traditionally shrouded in secrecy, such as limited powers, access to intelligence files, resources and expertise, will be thoroughly discussed. Finally, the debate will touch upon oversight bodies’ competence over international intelligence sharing, as a means of tackling accountability gaps and ensuring robust supervision throughout the whole spectrum of the intelligence process.
Panellists will exchange ideas, share empirical findings and, in collaboration with active participants in the audience, work toward articulating innovative approaches as well as new collaborations and strategies for responding to challenges faced by many states. The panellists will launch the discussion, but the substantive session is designed to be very interactive, and will solicit engagement and concrete, solutions-oriented feedback from key participants in the audience. These will include, among others, human rights lawyers, NGO activists, technologists, privacy commission experts, and government officials.
In light of the recently published (October 2017) second volume of the FRA Report on surveillance (Surveillance by intelligence services: fundamental rights safeguards and remedies in the EU), which provides critical comparative research findings from the surveillance fieldwork and will be disseminated to participants, various examples from Europe, Canada and New Zealand will be thoroughly examined. Through an interactive discussion, participants will be sensitised to the challenges rooted in the habitual practice of the intelligence services and will investigate, together with panellists, good practices and concrete solutions for achieving robustness without compromising national security or circumventing human rights.

The Ally Skills Workshop teaches simple everyday ways for people to use their privilege and influence to support people who are targets of systemic oppression in their workplaces and communities. This includes women of all races, people of color, people with disabilities, LGBTQ folks, parents, caregivers of all sorts, and people of different ages.
The workshop will be structured as a brief talk followed by a series of scenarios that are discussed in small groups.
The use of technology in the weather and climate sphere challenges the protection of human rights in ways that are not obvious. If law, policy and governance, including their role in standardization, are to have emancipatory potential, it will stem from their ability to account for and safeguard rights and equity concerns in the face of challenges that result from the integration of emerging technologies. In the context of satellite technologies and Distributed Ledger Technologies, panelists discuss how we think about the relationship between innovations and techniques to address changing weather and climate, given prevailing social conditions and the need for a better understanding of the world around us. But what exactly needs to be safeguarded while solutions are rolled out? Could safeguarding rights mean individuals having the courage to secure the role of setting the standards, rules and smart contracts, in order to preempt the powerful from having total control of future innovations and outcomes? What is the role of transparency as a safeguard, and of technology frameworks that not only support incentive systems but give individuals the perception that they are in control of the processes happening around them, so that decisions made in the use of technology also reflect the values, beliefs and expectations of marginalized populations? A pure focus on individuals, however, may be incomplete: addressing the systemic issue of weak central institutions that do not adequately consider the challenges faced by society is also an important priority.
Join us and explore the roles technology can play in systemic solutions to displaced population crises. Through an interactive discussion among experts, activists, innovators, and session attendees, we will:
We will focus on two case studies:
- A research project to assess how technology may support child refugees in Central America (UNICEF + Article One).
- The use of blockchain technology in financial and identity services for internally displaced populations.
This session will be an interactive, solutions-oriented panel discussion about strategies for understanding and effectively combating online violence against women in public life - particularly journalists and women in politics. Harassment and other types of violence against women online affect women in journalism and politically active women alike, and manifest in a variety of forms on almost every online platform available. Indeed, the media’s slow response to harassment of women within their own industry has arguably led to an inability to appropriately deal with the harassment of other women online - including politically active women. For both women in politics and in journalism, there is often a lack of infrastructure - be it within political parties, from social media platforms, or on the part of media organizations - to address harassment and combat the problem of violence online. This can result in a chilling effect for women online, including through choosing not to participate in leadership or political debates, re-evaluating the types of journalistic beats they cover, deactivating or permanently deleting their online accounts, or even leaving their profession entirely. The resulting limitation of both the number of women able to participate and the range of issues discussed in politics and the media poses a fundamental challenge to democracy, progress towards gender equality and women’s empowerment, as well as to the integrity of the information space.
The panel will engage experts across multiple sectors including civic technology, gender equality, and democracy and governance to discuss methods for building international understanding of this issue and identifying strategies for combating it. It will also include tangible examples from women in these sectors who have experienced this type of violence. A moderator will first introduce the issue of online violence against women in politics and in journalism, framing the issue for the audience and highlighting the key issues to be explored in the session. The moderator will then open the floor for each additional speaker to share their perspectives, experiences and approaches through a series of guiding questions, and will facilitate a lively dialogue among the speakers with frequent engagement from the audience. The goal is for participants to work together to (1) raise awareness of the prevalence and anti-democratic impacts of online violence against women in politics and in journalism; (2) foster knowledge- and idea-sharing among panelists and audience participants on strategies for understanding, documenting, and combating this type of online violence; and (3) emerge with tangible takeaways and a framework for thinking about best practices to combat online violence across sectors.
Please join us for a special preview of the Special Rapporteur’s upcoming report on State regulation and commercial moderation of user-generated online content. The Special Rapporteur examines how States should fulfill their primary duty to ensure an enabling environment for freedom of expression and access to information online, even in the face of contemporary threats such as “fake news” and online extremism. The Special Rapporteur also conducts an in-depth investigation of how Internet companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands. The report is the culmination of a year-long series of consultations, visits to major internet companies, and a wide range of State and civil society input.
Women represent approximately half of the world's population. However, many cannot even imagine that they could benefit from ICT for their own development and empowerment, because they are denied both access to digital technologies and their fundamental rights for socio-economic, socio-political and/or socio-cultural reasons. In addition, women are 50% less likely than men to access the Internet, and 30-50% less likely to use it for personal empowerment (Web Foundation, 2015). And although the barriers are varied, the top three are quite telling: lack of knowledge, high costs and little relevant content.
Given this scenario, in this session we want to put faces to these statistics and show some of the efforts that seek to empower women from underserved and discriminated communities and change these figures. On the one hand, we will screen a 15-minute documentary made by a group of women from a slum in Bogotá who, while being trained in techno-policy, talk about their lives, their hopes and their desire to empower themselves and build collective dreams. This will be followed by a conversation presenting the experiences from the project. On the other, we will share the experiences and lessons learned from a digital literacy project for indigenous women in Mexico, and discuss how the digital divide becomes a double element of exclusion in their social, political and community contexts.
We want to promote this dialogue because we believe that the ideal Internet is one where women can create, innovate, exchange ideas, express their sexuality, become a source of information, run their own businesses and participate on equal terms with their male counterparts.
Join us for a discussion about leveraging digital innovation to combat disinformation, including computational propaganda.
The use of disinformation by malicious actors to interfere in democratic processes has become a center-stage issue for policymakers and other stakeholders around the world. Serious reflection about how to effectively address and counter this complex problem is underway, and requires input from the full spectrum of stakeholders.
A key question on everyone's mind is: how can we prevent and counter such nefarious activities while at the same time protecting and promoting human rights, with freedom of expression and other civil and political rights at the forefront? And while regulation and sanctions loom large, it is an opportune moment to examine how innovation, including digital tools, apps, AI and blockchain, can be leveraged to protect civil society and democracy.
The profound changes wrought by the digital communications revolution are posing an increasingly serious threat to free and fair elections. A central challenge is how to regulate politically partisan speech in the new environment in a way that promotes a level electoral playing field and yet respects freedom of expression. Traditional systems – campaign finance regimes, rules on balance and impartiality in the broadcast media and bans on foreign interference – are either easy to get around or increasingly less relevant. In addition, the dominant online platforms wield enormous powers over speech that could potentially be used to influence elections.
This session will explore these challenges, the impact they are having on free elections and what options there are for technical and/or legal/regulatory solutions that are effective and yet respect international standards on freedom of expression. This panel will be a rare chance for experts from different regions and disciplines to provide critical perspectives on these challenges and put forward new and innovative possible solutions.
Contemporary technology - including social media platforms, mobile phones, Artificial Intelligence, robotics, and many others - can positively impact the advancement of gender equality. At the same time, it can aggravate existing forms of violence against women and girls (VAWG) and enable new ones. Technology-facilitated VAWG is a growing and constantly evolving phenomenon with devastating consequences for the lives of women and girls around the world, from all walks of life. Join us for a discussion about emerging trends, international responses, and what we can do next to combat the problem.
Join a portal, Live from Gaza, to hear young writers, NGO directors, and a solar entrepreneur from within the blockade as they describe what they are seeing in the unfolding human rights emergency.
Here are our speakers and audience in Gaza:
This session will consist of an introductory 10-minute presentation followed by a 15-minute discussion. We will share results of the “Mapping the Artificial Intelligence, Networked Hate, and Human Rights Landscape” research and outreach project at MIGS’s Digital Mass Atrocity Prevention Lab. This project is running from November 2017 until April 2018 and is conducted in collaboration with the Digital Inclusion Lab at Global Affairs Canada. The goal of the project is to map out and better understand the global Artificial Intelligence, Networked Hate, and Human Rights landscape, with a special focus on the role and extent of Artificial Intelligence systems in the takedown of extremist content on social media platforms.
We will also share insights from a related research trip to Germany which zooms in on the societal and parliamentary discussions around (and early experiences with) the German Network Enforcement Act (or, as some call it, the “Facebook act”). We finished conducting interviews with representatives from think tanks (e.g. Stiftung Neue Verantwortung, Stiftung Wissenschaft und Politik), research centers (e.g. Center for Internet and Human Rights, Humboldt Institute for Internet and Society), German Parliament staff who worked on the law, and members of the Global Diplomacy Lab (an initiative by the German Foreign Ministry).
Additionally, in our session we plan to discuss outcomes of a workshop in March 2018 which we hosted in collaboration with the Tech Against Terrorism project. Together we examined AI systems’ application for the takedown of extremist content, AI’s possible acceleratory effects for extremist causes and ideologies, and AI systems’ potential misuse by terror organisations.
The UN Human Rights "Standards of Conduct for Tackling Discrimination against LGBTI people", launched in September 2017, provide five concrete steps that companies can take to align their policies and practices with international standards on the human rights of LGBTI people. They reiterate that companies should prevent human rights violations against LGBTI customers, suppliers and distributors.
During the last decades we have seen increased awareness and, consequently, advancement in the field of corporate social responsibility on the human rights of LGBTI people, particularly in the US, UK and parts of Europe. Yet in "hostile environments", many tech companies are quick to distance themselves from LGBTI people and at times are complicit in limiting their human rights, including freedom of expression.
The session will discuss these challenges, safety and harm reduction, and what the Standards mean by "acting in the public sphere", and will define possible solutions so the industry better aligns its policies and practices with the Standards.
The session will also mark May 17th - the International Day Against Homophobia, Transphobia and Biphobia which is celebrated around the world.
Link to Standards: http://unfe.org/standards
#Biz4LGBTI #StandUp4HumanRights #LGBTI #IDAHOTB #whoisrightscon
UNESCO will take the occasion to present its draft Internet Universality Indicators and engage with RightsCon stakeholders to facilitate the application and implementation of these indicators at national levels. The indicators were developed as an immediate response to UNESCO’s adoption of the ‘CONNECTing the Dots’ Outcome document in 2015, when UNESCO put the concept of ‘Internet Universality’ at the heart of its work to promote an Internet that works for all. Internet Universality points to four fundamental norms – known for short as the ROAM principles – which are the guiding framework that promotes an Internet based on human rights, and the principles of openness, accessibility, and multi-stakeholder participation.
The Internet Universality indicator framework is structured around the four ROAM principles of UNESCO’s Internet Universality concept (human rights, openness, accessibility, and multi-stakeholder participation), alongside cross-cutting indicators concerned with gender and the needs of children and young people, sustainable development, trust and security, and legal and ethical aspects of the Internet. The framework includes a mix of quantitative and qualitative indicators.
UNESCO advocates these indicators in order to enrich the stakeholders’ capacity for assessing Internet development, broaden international consensus, and foster online democracy and human rights towards knowledge societies engaged in sustainable development. These indicators will help governments and other stakeholders to assess their own national Internet environments and to promote the values associated with Internet Universality. The work on the project to define Internet Universality Indicators is being led for UNESCO by the Association for Progressive Communications (APC).
UNESCO seeks to expand partnerships and synergies with stakeholders in implementing Internet Universality principles and applying Internet indicators in different countries once they are available in June 2018. The session will be an interactive roundtable discussion. It will start with a brief presentation of the draft Indicators, followed by comments from panelists and the audience. The document of Internet Universality Indicators will be circulated in advance of the consultation and can be downloaded at (http://www.unesco.org/new/en/internetuniversality).
The development and application of Artificial Intelligence technologies could profoundly shape humanity’s access to information and knowledge, impact communication and the practice of journalism, and carry positive implications particularly for the rights to freedom of expression, privacy and association. AI also has great potential to foster open and inclusive knowledge societies (especially for marginalized groups, including persons with disabilities, rural populations, and indigenous peoples), and to promote open educational resources, digital persistence, cultural diversity, and scientific progress. In turn, these can all contribute to achieving democracy, peace and the sustainable development goals.
On the other hand, AI could also exacerbate inequalities such as between languages in terms of speech-to-text and translation capabilities. AI and automated processes, fuelled by big data, also raise concerns for human rights, particularly to freedom of information, freedom of expression and the right to privacy. The use of AI in content moderation on the Internet without human judgement or due process can have a negative impact on the right to impart, seek and receive information, as well as on accountability, transparency and a shared public sphere. Issues of gender and racial discrimination are also being embedded into AI systems, with adverse effects. These ethical issues accompany questions about the technical divide that already exists between developed and developing countries.
UNESCO therefore sees an urgent need to take a global, pluralistic, multidisciplinary and multi-stakeholder (e.g. public and private sectors, developed and developing countries, etc.) reflection on the ethical framework that will guide the development of AI technologies. The Organization wishes to explore these issues and reflect whether it is possible to harness big data and AI technologies as a process to advance human rights, build inclusive knowledge societies and achieve the 2030 Sustainable Development Goals.
This session aims to trigger debates and reflections on multiple ethical/political/social/legal implications of the development and application of big data and Artificial Intelligence towards building inclusive knowledge societies and achieving the 2030 Sustainable Development Goals.
Key Insights:
So, you also want to have a digital security training for your organization or community. How do you do it? How do others do it? For what kinds of audiences? What are you looking to protect?
This session creates an occasion for digital security trainers and promoters to gather, get to know each other, talk about their experiences and learn from others. Whether you:
- are a digital security trainer
- have participated in some digital security trainings
- want to have a training for your organization/community
you are welcome to join.
We will invite trainers we know to introduce their particular experiences in their regions or with their audiences. (Pablo and Lobsang, 8 minutes each)
We will also look at the trends by asking these questions (each 8 minutes):
In your career as a digital security trainer, what trends are you seeing?
Are trainings sufficient for changing security behaviors, or is ongoing support more important?
How do you provide support? Share your experience and best practices.
Trends in digital rights violations - how should trainers react and adapt?
In the end, we will take questions from the audience and discuss our takeaways. (12 mins)
We hope that through this session, experienced and inexperienced trainers can connect, and together look into the future of digital security trainings.
The panel will feature two interventions focused on the current state of Internet censorship and Internet governance in Russia. The first one, based on ongoing research at the Citizen Lab, will focus on the shadowy Russian market of middleboxes - technological solutions for lawful interception and filtering. How is this market regulated (if at all)? What are the economic incentives behind Internet censorship and lawful interception? How do these boxes work? What's the best methodology for middlebox research, and what can we learn from it?
The second intervention, by Samat Galimov, ex-CTO of Meduza.io, one of the most popular opposition media outlets in Russia, will give a practitioner's perspective on "experiencing censorship". Samat will analyze in detail the recent (unfruitful) attempts of the Russian government to block Telegram. What can we learn about the flaws of Russia's censorship infrastructure from the Telegram case? How can we explain the effect of the Telegram ban on civil society and the ongoing protests in Russia?
The interventions will be followed by an interactive session. We hope to engage the audience to define together a set of outcomes: recommendations for Russian Internet freedom advocacy groups, methodological insights for middlebox research, recommendations for Russian users, similar experiences from CIS countries (and around the world).
Canada is a global leader in Artificial Intelligence (AI) and in human rights. As such, it is developing a foreign policy strategy that articulates how AI impacts human rights, and identifies considerations that should be taken into account in AI-mediated contexts in order to protect, respect and promote human rights.
While AI technologies can accelerate progress and sustainable development, they can also exacerbate existing human rights challenges and foster new types of violations. Affected rights include equality, privacy and freedom of expression. Promoting rights-respecting AI development requires multidisciplinary collaboration. In the spirit of “policy-making out loud”, the Digital Inclusion Lab at Global Affairs Canada invites RightsCon participants to provide input based on the various conversations taking place at RightsCon on AI, and apply the learning directly into a policy discussion in order to create a policy position for Canada that can bring leadership into this rapidly evolving and complex space.
In light of the ongoing challenges to tackle disinformation in the age of technology, there is an undeniable need for young leaders to understand how to combat disinformation in their communities, as well as to learn from each other’s experiences.
The world is witnessing a surge of traditional and online media being used to manipulate political and civic sentiment, as well as voter behavior. Both traditional and citizen journalists are on the frontlines of the fight against the threats of disinformation as they attempt to deliver facts in an information world dominated by the 24-hour news cycle and a generation of youth relying heavily on social media for their information.
In this session, speakers will present their experiences in encountering the challenges of disinformation, and the different methods they utilize to continue to provide access to accurate information to citizens, especially those living in closed or closing spaces where information is increasingly manipulated by domestic and/or foreign influencers. The panel will also address the larger question of the role of youth in the fight against disinformation, and will engage the audience in identifying creative solutions for youth journalists and activists to utilize to promote a culture of transparency through their work.
The session looks at how to build organizations and communities that are resilient to abusive behavior. It covers preventing abuse, detecting patterns of abuse, and handling abuse should it occur. The first half will be structured as a talk and the second half as a Q&A.
Join a worldwide effort to get Palestine visible and on the map across all platforms.
Using the Humanitarian OpenStreetMap Tasking Manager, participants will learn how to use geographic information science (GIS) mapping techniques to put roads, buildings, and residential and agricultural areas on OpenStreetMap, digitizing high-resolution imagery in real time to identify Palestinian villages. We’ll teach the audience how to map, and as they do, our presenters will describe their work and the advocacy needed to keep these villages safe from demolition.
We will encourage mappers to work in teams of two, to develop a discerning eye for map imagery and spark discussion. Here’s the agenda:
1) Go to www.OpenStreetMap.org to set up your OSM account;
2) Learn how to map: a short tutorial in which mappers complete a practice run;
3) Distribution of assignments: begin mapping, with coaches coming by to help;
4) An update on efforts to map Palestinian villages;
5) Next steps: how to advocate, locally and worldwide, to keep the villages standing.
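For participants curious how features mapped during the workshop can later be retrieved programmatically, here is a minimal Python sketch using the Overpass API, OpenStreetMap’s public read-only query service. This is an illustration, not part of the workshop itself: the bounding-box coordinates are placeholders rather than a real Tasking Manager task square, and the network call is shown only in comments.

```python
# Minimal sketch: build an Overpass QL query that counts building footprints
# (ways tagged "building") inside a bounding box of mapped OSM data.
# The coordinates below are illustrative placeholders, not a real task square.

def overpass_building_query(south, west, north, east):
    """Return an Overpass QL query counting building ways in a bounding box."""
    bbox = f"{south},{west},{north},{east}"
    return f"[out:json];way[building]({bbox});out count;"

query = overpass_building_query(31.9, 35.2, 31.95, 35.25)
print(query)

# Running the query requires network access, e.g. with the requests library:
#   import requests
#   r = requests.post("https://overpass-api.de/api/interpreter",
#                     data={"data": query})
#   total = r.json()["elements"][0]["tags"]["total"]
```

Queries like this are one way advocates can track, over time, how much of a village has been mapped.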
Young people today are the first generation that has been tracked since birth. Their information has been recorded, shared and sold since their first days, rendering their right to privacy a polite fiction at best. The internet creates new possibilities for children to express themselves, but this continual monitoring deeply undermines the value of these opportunities. Too many platform holders and governments are currently blithely ignoring their responsibility to uphold these rights.
This interactive session will critique specific examples of ongoing corporate and governmental privacy practices around young people, and ask our panelists to imagine a better future for their rights: what it would look like, and how we could get there from here.
Tech Against Terrorism is convening a session, bringing together leading tech companies, human rights advocates, academics, and government representatives, to discuss “Respecting human rights in tackling terrorist exploitation of smaller companies”. Specifically, we’ll ask participants to discuss areas where further research is needed and how we can assist smaller companies navigating this complex environment.
As part of the Tech Against Terrorism initiative, we have developed six guiding principles, titled “The Pledge”, which inform our approach and will underpin a framework for engaging with technology companies. This work will help reinforce the importance of anticipating and addressing challenging content, and support technology companies in articulating their commitment to respecting freedom of expression and diversity in ways that are transparent, accountable, and collaborative.
Tech Against Terrorism was launched and is supported by the UN Counter-Terrorism Executive Directorate (UN CTED) to work with the global tech sector in improving knowledge-sharing and supporting companies to tackle the terrorist exploitation of the internet whilst respecting human rights. Our objective is to galvanise the tech industry and support industry-led self-regulation as one successful approach to managing harmful exploitation of the internet whilst protecting the values of openness and freedom of expression.
The advocacy community is increasingly at risk online, offline, and between the two. Whilst threats to the digital and operational security of frontline defenders are increasingly understood, threats to their mental health and ability to maintain resilience offline are less so. Advocacy organizations and those who seek to support them need to better understand the range of threats faced, online and off, in order to begin developing more robust, cross-sectoral, and multifaceted responses to reclaim civic space and maintain activists’ resilience while doing so.
This drop-in session will bring together participants from the activist, funder, tech, and capacity-building communities to discuss: the continuum of online and offline threats frontline defenders face in closing-space countries, the linkages between and reinforcing nature of these threats, and ways in which online and offline communities can better collaborate to resist closing space and ensure activists’ resilience.
The Digital Security Clinic is run by the staff of Access Now’s Digital Security Helpline, a free-of-charge resource for civil society around the world. We offer real-time, direct technical assistance and advice to activists, independent media, and civil society organizations. You can find out more about the Helpline at https://www.accessnow.org/help
The Helpline will be holding open drop-in hours in the Access Now Lounge on the second floor from 1:30pm to 5pm on Wednesday and Friday, and from 9am to 1:30pm on Thursday. You can also arrange for a one-on-one appointment with a Helpline staff member throughout the conference by emailing help@accessnow.org.
The staff identify problems and help attendees implement practices that can protect them from threats. During an average visit, Clinic technologists:
1// Assess risks and needs
2// Analyze current practices
3// Troubleshoot problems
4// Provide tools and training to address emergent issues
5// Initiate cases with Access Now’s Digital Security Helpline for any issues not resolved on the spot
6// Refer to specialists where the Helpline is unable to assist
Data alone does not change the world, but it can empower advocates by revealing insights, shedding light on marginalized groups, identifying needs, and informing which policies work in what context and which do not, in the drive for positive and lasting progress towards equality. This information will be critical to reaching the SDGs by 2030. Joint efforts by the UN, civil society and the private sector can play an instrumental role in advocating for the rights of girls and women – when investors and business leaders contribute their voices and provide platforms for the voices of girls and women to advocate for gender equality laws and policies, policymakers will listen. When combined with a strong evidence base, the power to amplify the call for equal rights can be harnessed towards transformative change.
For this session, Equal Measures 2030, a cross-sectoral partnership focused on connecting data and evidence with advocacy and action for gender equality, will share the cross-sectoral work they’re already developing on the SDGs, including lessons learned, challenges and exciting next steps. In an interactive discussion, we’ll explore how data and technology can, and must, be harnessed to accelerate progress and why achieving the Global Goals cannot be done without collaboration across sectors.
Please note, registration for this course is required on our website first: https://www.rightscon.org/cle/.
All attorneys participating in the 2018 RightsCon Toronto conference are invited to attend Access Now’s first Minimum Continuing Legal Education course, “Ethical Duties in the Digital Age: Encryption Done Dirt Cheap.”
Speakers will engage participants in an interactive workshop on how to protect information, particularly given the rise of cybersecurity breaches. The panel discussion will examine attorneys’ ethical duties in the digital age.
This course is CPD accredited by the Law Society of Ontario and CLE approved by the California State Bar Association. NY State attorneys will receive CLE credits through the New York State Bar Association Approved Jurisdiction policy. Pursuant to this policy, bar associations from other jurisdictions may recognize this course’s CLE credits.
The State of the Internet in Syria: Critical and Challenged (The SecDev Foundation)
Speakers: Abdulrahman al-Masri
Accessing the internet has become a critical need for Syrians, especially after war erupted in 2011. All Syrian actors use the internet to organize, plan, and share information. Syrian non-militant actors rely heavily on the internet to maintain contact with scattered family members and communities, and to access vital information on local threats, risks, and resources, as well as news and education more broadly. Local governance actors depend on the internet to communicate with partners and beneficiaries. Activists leverage the internet to document human rights abuses, bear witness, and reach out to the world.
The capacity of Syrian civil society depends on its ability to operate online, safely and effectively. At the same time, cyberspace is awash in risks, both political and opportunistic, for these users, with devastating real-life consequences. This talk will explore how and why the internet became essential for the resilience of Syria's civil society.
First Steps in Understanding Freedom of Expression Online and Offline
Speakers: Bogdan Manolea
The Internet has significantly changed our lives in recent years in many areas, including the way we access and publish information. Most importantly, it has enhanced the exercise of our freedom of expression rights, both by allowing access to various sources of information and by significantly democratizing the open publishing of any kind of information.
Consequently, this has turned freedom of expression – especially in the online environment – into a subject that concerns us all, not just journalists or the NGOs dealing with freedom of expression. It has also become essential that the basic concepts of freedom of expression and the relevant courts’ jurisprudence be simplified and explained to the broad audience interested in the subject. While many books and legal studies have been published for judges and other legal practitioners, we believe there is now an even greater need to simplify and explain the fundamentals of freedom of expression, especially as applied to the digital world. All of this is presented through the lens of the current jurisprudence of the European Court of Human Rights (ECtHR), which should be the main reference point for all European Internet users.
Involving Rural Women in the Fight Against Climate Change: Why Innovative Solutions Should Reflect Their Reality
Speakers: Elizabeth Musoki
In most African countries, agricultural policies and investments, including blockchain initiatives, still fail to take into account the differences in resources available to men and women, their roles, their workloads, and the differential constraints they face.
Both men and women engage in activities that catalyse the degradation of water, the environment, and related resources. Land use practices such as bush burning, deforestation for wood and charcoal, farming on river banks and wetlands, and poor disposal of waste products lead to the depletion of natural forest cover, topsoil, and water bodies. As a result, global warming, prolonged droughts, floods, landslides, soil erosion, and soil infertility are affecting some parts of the world, including Uganda.
Whereas climate disasters strike indiscriminately, research shows that women and children are more affected than men. “Forced” migration, displacement, water scarcity, low agricultural production and productivity, hunger, disease, and loss of income affect women the most. Rural women are estimated to bear 70 per cent of the vulnerability to climate variability, natural disasters, and food insecurity. Such a trend calls for deliberate efforts to enhance women’s resilience to match that of men.
Increasingly sophisticated spyware is used to target activists, journalists, and human rights defenders around the globe. What happens from that first encounter with a malicious link or suspicious code? There are organizations and helplines that can assist in regaining control and security, and there are those who can help analyze where the code came from and archive it… and then there are those who publish comprehensive analytical reports, and those who hold the companies and vendors behind the technology accountable.
This roundtable session aims to bring together some of the experts behind this work and give RightsCon attendees a closer look at the reporting chain and how they can contribute. From the frontlines to the world’s capitals, we will look at what is available to civil society and its many actors in holding businesses and governments accountable for their actions.