
Safety online

The time to regulate social media platforms has come.

Young people who sign up to use Instagram, TikTok or similar social media sites are targeted with potentially harmful material within 24 hours. New research by the 5Rights Foundation, an international initiative to promote online safety, shows that this is built into the design of the platforms; it is not just an unfortunate side effect. In Canada, there are increasing reports of online hate and of the online sexual grooming and exploitation of children at younger ages. Online safety is now both a Canadian and a global issue.

Online safety is easy to support as a broad goal but messy in the details and difficult to achieve. Online hate, for example, is known to be harmful but hard to define. Sexual exploitation may be easier to define but difficult to eradicate. Any effort to regulate online activity raises challenging questions about rights to access information, privacy and freedom of speech, expression and religion. On the responsibility side, tensions arise among the duties of providers, individual users and governments. Even well-balanced legislation will be a challenge to enforce.

New Canadian law

All of these issues are part of an online consultation that may lead to legislation in the fall. Whichever party wins the election, legislation of some kind is likely to come soon, and given the link to racism, it may become an election issue. Australia, the UK and the EU countries are leading the way with legislation that requires the operators of social media platforms to better regulate activity on their sites.

The Canadian proposal, released in July, would prohibit five kinds of harmful content: terrorist content; content that incites violence; hate speech; non-consensual sharing of intimate images; and child sexual exploitation. The central means of control proposed is user complaint: if a user flags content that meets established criteria, the provider of the platform will be required to remove the material within 24 hours. Providers will also be required to set up their own robust systems for flagging, notices and appeals. Additional components include a Digital Safety Commissioner to monitor implementation, a Digital Recourse Council to hear appeals and arbitrate contested decisions, and an Advisory Board to provide expert advice as the digital world changes.

In a related move, Bill C-36, introduced on the last day of Parliament, would add an individual offence of online hatred to the Criminal Code and establish a complaints process under the Canadian Human Rights Act. One unusual provision is a proposal for peace bonds to prevent a person from posting hateful content.

Any regulation of the online world is complex and sure to draw fire from all sides. Those who see it as state intrusion into areas of individual freedom will try to reduce its scope, while affected persons will question whether it is robust enough to be effective. One can expect many appeals in the early days and frequent amendment of the rules, if the legislation ever gets through Parliament. If the work of the UK Digital Commissioner is any guide, the increased public awareness alone is beneficial.

Young people have a right to participate fully in society, including the online world. That means being able to access information online, learning how to discern threats and manage their own activity, and being protected from exploitation. The most important action for young people is equipping them to be alert to the threats, to exercise their own agency as users to protect themselves, and to seek early help from adults. In my view, this is an area for faith communities to work with young people, for their own safety and for the broader public good; their voices as digital citizens are important. It is worth noting that informing young people about how they are being manipulated by owners, advertisers and exploiters leads to effective resistance and protective action.

The time to regulate social media platforms has come. When cars replaced horses, governments realized they had to regulate roadways and build highways. The information highway has been unregulated for so long that it will be difficult to put rules in place, but we need them. We can be part of shaping those rules, and of informing ourselves and young people so that we exercise our collective agency as digital citizens.
