Innocence lost: AI’s impact on privacy


PETALING JAYA (ANN/THE STAR) – Civil society groups in Malaysia are sounding a cautionary note, highlighting how seemingly innocent images and videos shared on social media can be manipulated into explicit content using artificial intelligence (AI).

Women and children are the demographics most at risk.

Mak Chee Kin, the chairman of the Melaka Action Group for Parents in Education (Magpie), emphasised the need for parents to exercise caution when it comes to sharing visual content of their children on social media. 

He pointed out that individuals with technological proficiency and a lack of responsibility have the capability to manipulate online images and videos for various purposes.

Photo shows logos of some of the popular social media platforms. PHOTO: AP

“In this era of AI and advanced technology, incidents of scammed or doctored images, and in some cases, voices, are happening.

“The truth is that the more one exposes themselves to social media, the higher the risks they will encounter,” he said.

He pointed out that even those who innocently share images or videos of loved ones on social media could suffer unintended consequences.

“What was supposed to be a fun and happy thing could lead to unforeseen problems,” Mak added.

Parent Action Group for Education chairman Datin Noor Azimah Abdul Rahim said parents should prepare their children to be mentally strong and smart enough to face these technological challenges.

“What we can do is take care of our own children and build their confidence and mental strength to overcome such situations.

“Though it is easier said than done, it is best to talk about it with the children and strengthen their minds,” she said.

When asked, Noor Azimah agreed that the culture of oversharing online is partly to blame, as children are too eager to share without realising the consequences.

Persatuan Sahabat Wanita Selangor executive director Irene Xavier also said women should be wary of uploading pictures of themselves and their families.

“We have cases of women who uploaded pictures of themselves in revealing positions for their boyfriends, and when the relationship goes sour, these pictures have been abused,” she said.

She added that women should think hard before posting photographs of themselves.

In September, a Melaka lawmaker highlighted how a private Telegram group uploaded hundreds of doctored and obscene pictures of innocent victims, comprising female executives, housewives and even young men.

Kota Laksamana assemblyman Low Chee Leong said the doctored images included one of a 65-year-old grandmother wearing a bikini.

Earlier this year, Mstar also highlighted how certain quarters used AI to superimpose women’s pictures onto nude images.

It quoted user @magmalaya, who made a post on X (formerly known as Twitter) cautioning women against posting too much on social media because of this.

The comment section of the post was also filled with users discussing how some were conducting “business” by selling the images.

Recently in Spain, over 20 girls in one town came forward as victims of AI-generated nude images, which were created by feeding an AI app with fully clothed pictures of them taken from social media.

Some of the girls received demands for payment from the creators of these images, who threatened to upload the pictures to adult websites otherwise.

MCA Public Services and Complaints Department head Datuk Seri Michael Chong said devious individuals would usually conduct background checks on those they wanted to target with deepfakes.

Chong said some of the cases he handled involved couples, both boyfriends and girlfriends and married partners, who had parted ways.

“There are cases of perpetrators superimposing images of people in obscene acts, destroying their reputation, and even extorting them for money.

“If you are a victim, don’t keep it to yourself. Speak to those around you or to non-governmental organisations that have expertise in dealing with such matters.

“From there, weigh your options and decide what you want to do next,” he said, adding that this included lodging police reports.