Kuala Lumpur, 15 October 2025 – INITIATE.MY convened a regional webinar titled “Are We Ready to Tackle Tech-Facilitated Harms?”, bringing together 50 participants from Malaysia, Indonesia, and Bangladesh. Attendees included civil society organisations (CSOs), academics, researchers, lawyers, and government officials, all committed to understanding and mitigating online harms. The session explored the growing complexity of technology-facilitated risks and the current gaps in policies, regulations, and social safeguards.
The full webinar report is available for more details.
The webinar was co-organised by INITIATE.MY with support from regional partners, providing a platform to exchange knowledge, build capacity, and encourage multi-stakeholder collaboration. INITIATE.MY, introduced as a Malaysian initiative, leverages research, strategic consultation, and capacity building to promote tolerance and prevent violence. Its Knowledge Hub, launched in November 2024 with support from the Luminate Foundation, equips CSOs with tools to monitor online harms, translate data into advocacy, and strengthen digital resilience.
Experts examined how digital platforms, while enabling innovation and connectivity, also facilitate harmful content, opaque algorithms, and profit-driven amplification of disinformation, scams, and gender-based violence. Speakers highlighted that online harms often spill into offline spaces, deepening inequality and eroding trust. Discussions emphasised that technology is not neutral: algorithmic systems shape behaviour, amplify extremist narratives, and influence perceptions across linguistically and culturally diverse communities.
Participants analysed Malaysia’s legal and regulatory landscape, identifying gaps in enforcement and fragmented responses. Current laws, including the Communications and Multimedia Act and proposed cybersecurity legislation, often fail to protect victims of cyberstalking, doxxing, and non-consensual intimate-image sharing. Experts stressed that reactive, complaint-driven enforcement is insufficient and called for proactive, system-level approaches underpinned by multi-stakeholder collaboration.
The session also addressed accountability for platforms and content moderators. Speakers recommended transparency, external oversight, and mandatory mental health support for human moderators exposed to disturbing content. Civil society’s role was highlighted as essential; despite limited resources and platform access, CSOs act as frontline responders and independent watchdogs. Participants explored “data solidarity,” advocating for cross-country collaboration to share evidence, monitor harms, and promote accountability.
The webinar concluded that creating safer digital spaces requires coordinated action among governments, platforms, and CSOs, underpinned by human rights–based governance and digital literacy. Participants reported increased knowledge, practical skills, and motivation to implement evidence-informed strategies to prevent technology-facilitated harms.
