Make no mistake, downloading and viewing ‘child porn’ is a crime in Malaysia — here's what the numbers tell us

Malay Mail

KUALA LUMPUR, Jan 8 — Have you ever wondered if viewing child pornography — more accurately labelled “child sexual abuse material” (CSAM) in local laws — is a crime in Malaysia? And whether investigators and those seeking to protect children are able to detect such activity here?

The short answer is yes, and yes.

Bukit Aman’s Sexual, Women and Child Investigation Division (D11) principal assistant director Assistant Commissioner of Police Siti Kamsiah Hassan said most of the individuals found by the police to have accessed CSAM claimed they did “not know” that doing so is an offence.

But she said the Sexual Offences Against Children Act 2017 clearly lists this as an offence, with six categories of offences under Sections 5 to 10 covering crimes involving CSAM, and punishments ranging from up to five years’ jail to up to 30 years’ jail. (Section 4 defines CSAM as including any visual, audio or written representation of a child engaged in sexually explicit conduct, or realistic or graphic images of a child engaged in sexually explicit conduct.)

“The minimum is if child sexual abuse material is in your possession, it can be up to five years’ jail. The maximum, if you produce child pornography, you record and make child sexual abuse material, it can go up to 30 years’ jail.

“So the category of offences is different based on severity,” she told Malay Mail in a recent interview.

For example, Siti Kamsiah said accessing CSAM is a crime under Section 10, downloading CSAM means one can be charged with possession of CSAM under the same section, while downloading and then sharing the CSAM with others through WhatsApp or email amounts to disseminating CSAM, an offence punishable under Section 8.

Specifically for those viewing or owning CSAM, it is a crime under Section 10.

Under Section 10, any person who accesses, or has in his possession or control, any CSAM commits an offence punishable with up to five years’ jail, a fine of up to RM10,000, or both.

In explaining Section 10, the Sexual Offences Against Children Act states that a person is considered to have accessed CSAM if he knowingly views it or knowingly causes such material to be sent to himself.

The Act gives three illustrations of the Section 10 offence. In the first two, you receive an email from an unknown sender with an untitled attachment and open the attachment without knowing that it contains CSAM.

Whether you have committed a crime depends on how you respond to the unknown attachment: if you immediately delete the email upon seeing the attachment’s content, you are not guilty; but if you continue viewing the CSAM in the attachment despite knowing what it contains, you are guilty of the Section 10 offence.

In the third illustration, if you open another person’s computer and discover CSAM on its hard disk, then transfer that material onto your pen drive and keep the pen drive in your office, you are guilty of the Section 10 offence.

Why does this all matter?

If you are wondering why Malaysia needs a refresher on crimes related to CSAM, it is because of the sharp increase in reported incidents of CSAM online around the world, which has made it one of the biggest current threats to children’s safety on the internet.

This trend is mirrored in the growing number of tips received by Malaysian police — from both international and local sources such as law enforcement agencies — about locally-registered IP addresses that have accessed CSAM. (These figures should be interpreted with care, however, as various other factors could also lie behind such increases.)

One key source of tips for Malaysian police on Internet Protocol (IP) addresses — which can be used to track down suspects — is the US-based non-profit National Center for Missing & Exploited Children (NCMEC).

In annual terms, NCMEC’s CyberTipline — through which the public and electronic service providers report incidents of suspected child sexual exploitation globally, including those involving CSAM — received 4,560 reports in 1998, before hitting 1.1 million in 2014, climbing to 10.2 million in 2017 and reaching 32 million in 2022.

In the most recent years (21.7 million reports in 2020, over 29.3 million in 2021 and 32 million in 2022), 99 per cent of the CyberTipline’s global reports of suspected child sexual exploitation each year concerned child pornography (including the possession, manufacture and distribution of such CSAM).

Specifically for Malaysia, available NCMEC data shows there were 96,627 such CyberTipline reports of suspected child sexual exploitation in 2017, 219,459 in 2018, 183,407 in 2019, 204,506 in 2020 and 269,671 in 2021. (These figures are taken both directly from NCMEC and from the “Disrupting Harm in Malaysia: Evidence on online child sexual exploitation and abuse” report by ECPAT International, the International Criminal Police Organisation (Interpol) and the United Nations Children’s Fund (Unicef).)

(NCMEC noted that CyberTipline reports include information on the upload location of CSAM, but cautioned that country-specific statistics could be impacted by the use of proxies and anonymisers, and that the numbers are for informational purposes and do not indicate the level of child sexual abuse in a particular country.)

While such reports cover various types of incidents, such as online enticement of children for sexual acts or unsolicited obscene material sent to a child, NCMEC’s CyberTipline reports for Malaysia were likewise overwhelmingly related to CSAM (including possession, manufacture and distribution): over 99.96 per cent or 96,594 of the 96,627 CyberTipline reports in 2017, over 99.98 per cent or 219,433 of 219,459 reports in 2018, and again over 99.98 per cent or 183,383 of the 183,407 reports in 2019.

Popular social network platform Facebook alone submitted between 94 and 96 per cent of the CyberTipline reports for Malaysia each year from 2017 to 2019: 92,138 reports (2017), 211,739 (2018) and 172,294 (2019).

Other electronic service providers that submitted CyberTipline reports to NCMEC on suspected child sexual exploitation in Malaysia include Instagram and Google, each with thousands of such reports from 2017 to 2019, as well as Twitter Inc./Vine.co, WhatsApp Inc and Tumblr, which submitted hundreds of such reports over the same period.

In other words, online CSAM is increasingly being reported not just globally, but in Malaysia too.

Siti Kamsiah said ignorance of the law is not a defence, and that the court would evaluate the facts when considering the sentence to be imposed on an offender for having CSAM. — Picture by Sayuti Zainudin

What happens when Malaysia receives tips on IP addresses linked to CSAM?

Based on data provided by the D11 unit, the number of locally-registered Internet Protocol addresses — the unique number assigned to each device, such as a computer or mobile phone, connected to the internet — detected to have accessed CSAM and reported to Malaysian police has continued to grow over the years.

From just 2,660 such IP addresses reported to Malaysian police in 2018 from both local and international sources, the number grew within five years to 49,621 in 2022. In the period from January to November 27, 2023 alone, such reports surged to 119,825 — the highest since 2018.

Siti Kamsiah said it is rare for victims to include IP addresses in their reports; usually, the Malaysian police receive tips — through entities such as the Malaysian Communications and Multimedia Commission (MCMC), international law enforcement agencies and sources like NCMEC — about IP addresses in Malaysia that were detected to have accessed CSAM.

“All the IP addresses that we investigate involve local IP addresses. All the IP addresses that we received are registered in Malaysia. It doesn’t matter whether the subscriber is not a local, maybe it is a foreigner living in Malaysia, but the IP address is registered in Malaysia,” she said.

Based on an IP address, the police would then check with the internet service provider — the telecommunications company, or telco, providing the internet access — to identify the IP address’s registered owner.

The police would then physically go to the IP address’s owner, such as the house where the Wi-Fi linked to that IP address was used, and inspect devices there, such as laptops, handphones and pen drives, for CSAM.

“So when we go, we see there is no evidence of that CSAM, we just give a warning. We know they had accessed it, maybe they deleted it, so we don’t find it.

“If we don’t find it, we can arrest, because we already have a basis for further investigations, for forensics to check the details.

“But for now, because we know that the public also does not know it is an offence, so that’s why for now, if we go, we say ‘We received information’ and so on, so we advise them — if that CSAM is not found — that it is an offence, we can actually bring them in for further investigation and arrest them, but we don’t arrest unless we found it.

“Usually for those we arrested, we have already gone and seen in the computer, it does have (CSAM) and most times they admit it and think this is not wrong. Usually, these people keep it for themselves; when they view it, they download it, they don’t know it is wrong; most of the public don’t know that accessing child pornography is an offence,” she said.

Asked if that means suspects would not know it is wrong to view CSAM, keep it for themselves or share it with friends, Siti Kamsiah said most of them would say they did not know it is wrong to do so.

Siti Kamsiah said ignorance of the law is not a defence, and that the court would evaluate the facts when considering the sentence to be imposed on an offender for having CSAM.

“In law, cannot say I don’t know that is wrong. But the court can make consideration whether before this there are many (CSAM), how come you don’t know. It depends on case-by-case basis, the penalty can go up to five years.

“But if it’s the first time, it’s just one image, then maybe the court may consider maybe a fine, maybe an advisory, depends on the case,” she said.

Some may claim that they had accidentally accessed CSAM and that they do not know how there could be CSAM in their device, but such claims would be evaluated based on evidence. Siti Kamsiah cautioned that those who accidentally view CSAM could become addicted.

If the police find thousands of CSAM files, they would ask the suspect whether they are a supplier or seller, and whether they have any child victims.

For example, Alladin Lanim — who was one of the world’s most-wanted child sex offenders over his years of prolific anonymous uploading of CSAM to the dark web — was arrested in Sarawak in 2021 with thousands of CSAM files found on his handphone.

Alladin’s arrest was the result of a joint effort between Malaysian and Australian police, who managed to identify him; he was later sentenced to more than 48 years’ jail and 15 strokes of whipping over 18 sexual offences against children to which he pleaded guilty. Investigators identified 34 child victims he had abused, but believed there could be more.

Siti Kamsiah said that there are individuals who distribute or exchange CSAM through peer-to-peer file-sharing networks — where files pass directly between computers on the same network without a centralised server, and each computer that already has the file shares pieces of it rather than one computer sending the complete file — in order to avoid being traced as the supplier.

Police would sometimes not find CSAM on a suspect’s computer, as the suspect may not have downloaded it or may have deleted the downloaded material.

But Siti Kamsiah said that even if an individual merely clicks on, accesses or views CSAM on peer-to-peer networks, their IP address would already be flagged as having accessed the material.

This is because CSAM that is shared on the internet would typically be tagged with a unique ID, which would enable easy detection.

(NCMEC, for example, analyses each image or video identified as CSAM and tags it with a unique digital fingerprint, a numerical value known as a hash value; these hash values are then added to a list or database shared with technology companies to enable the detection, reporting and removal of such material. As of 2023, NCMEC has tagged millions of CSAM files.)
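To illustrate the general idea of hash matching (a minimal sketch only, not NCMEC’s actual system; the hash list and file paths here are hypothetical placeholders), a platform can compute a cryptographic fingerprint of an uploaded file and check it against a shared database of known fingerprints:

```python
import hashlib
from pathlib import Path

# Hypothetical placeholder list; real hash lists are maintained and shared by
# organisations such as NCMEC and are never hard-coded like this.
KNOWN_HASHES = {
    "0" * 64,  # dummy 64-character SHA-256 digest, purely for illustration
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_material(path: Path) -> bool:
    """True if the file's fingerprint appears on the shared hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

An exact-match check like this only flags files that are bit-for-bit identical to a known item; in practice, detection systems also rely on perceptual hashes that tolerate resizing or re-encoding, which is beyond this sketch.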

Siti Kamsiah highlighted the importance of police acting on tips and reports on IP addresses that have accessed CSAM, as this would enable the police to stop those who are viewing CSAM and prevent such individuals from carrying on further sexual crimes against children. — Picture by Sayuti Zainudin

How tracking down those behind CSAM-linked IP addresses can help protect more children

Siti Kamsiah highlighted the importance of police acting on tips and reports on IP addresses that have accessed CSAM, as this would enable the police to stop those who are viewing CSAM and prevent such individuals from carrying on further sexual crimes against children.

“Because when it starts with just the individual liking child pornography, it may trigger other crimes. They only view at the start, then they are addicted or they want to apply or explore. If the suspect is a child, they may want to try after viewing it.

“So maybe we have identified from early on who always views CSAM, so we can identify them and take action to stop their conduct, so that they will not commit offences, we may prevent them from committing other offences. If we don’t stop them, they may later carry out grooming on children like their child or their younger sibling, and carry out sexual offences,” she said.

Based on the Malaysian police’s experience when acting on tips on IP addresses, Siti Kamsiah said most of those arrested were adults, but there were also children — those aged below 18, such as teenage students — who were found to like viewing CSAM.

“Based on cases before this, from someone who is not interested in child pornography, but because the individual always views it, it can stimulate their minds to like it. We once arrested a suspect in Pahang who said at the start, they did not like to view CSAM and did not like children, but after viewing child pornography involving video of child sexual acts, they start to like it and be interested and become addicted,” she said.

She also said there have been cases where a sexual offender who was initially not interested in children developed an interest in children after viewing CSAM, and subsequently carried out physical sexual assaults on children. She stressed that this was why police action is important to prevent more children from becoming victims of sexual abuse.

Whether the offender who viewed CSAM is a child or an adult, Siti Kamsiah believes that it is never “too late” to prevent more child sexual offences from happening.

While it may be easier to rehabilitate a child addicted to CSAM than an adult offender, harsh penalties could also act as a deterrent for adults, she said.