
Investigating social media harm is an excellent idea, but parliament is about to see how complicated it is to fix


Barely a day has gone by this month without politicians or commentators talking about online harms.

There have been multiple high-profile examples spurring on the conversation. There was the circulation of videos of Bishop Mar Mari Emmanuel being stabbed in the Sydney church attack. The normalisation of violent content online has also been central to the discussion of the domestic violence crisis.

Then, of course, there are the expressions of disdain for the Australian legal system by X (formerly Twitter) owner Elon Musk.

Inevitably, there are calls to “do something” and broad public appetite for changes in regulations. A new parliamentary committee will explore what that change should look like, but it may have to contend with a variety of legal, practical and ethical obstacles along the way.



Ten busy days

On May 1 and May 10, the federal government made two major announcements.

The first was a Commonwealth response to some of the online harms identified by National Cabinet. At the May 1 meeting, the Commonwealth promised to deliver new measures to deal with violent online pornography and misogynistic content targeting children and young people. This included promised new laws to ban deepfake pornography and to fund a pilot project on age-assurance technologies.

Communications Minister Michelle Rowland and Financial Services Minister Stephen Jones announced the new committee.
Bianca De Marchi/AAP

The second was an announcement establishing a Joint Parliamentary Select Committee to look into the influence and impacts of social media on Australian society. The government wants the committee to examine and report on four major issues:

  1. the decision of Meta to abandon deals under the News Media and Digital Platforms Bargaining Code

  2. the important role of Australian journalism, news and public-interest media in countering misinformation and disinformation on digital platforms

  3. the algorithms, systems and company decision-making of digital platforms in influencing what Australians see, and the impacts of this on mental health

  4. other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material.

However, the final terms of reference will be drafted after consultation with both the Senate crossbench and the opposition, so they may change a little.

Why would they do that?

Asking the committee to review the Meta decision is an odd move.

In practice, Financial Services Minister Stephen Jones can “designate” Meta without a referral to the parliament. That is, the minister can decide that all the obligations of the News Media Bargaining Code apply to Meta.

However, a sounding by the committee may help to ensure Meta keeps focusing on the issue. It also provides the opportunity to restate the underlying principles behind the code and the parlous state of much of the Australian news media.

In relation to harmful or illegal content disseminated over social media, there’s already a review of the Online Safety Act underway. The terms of reference appear to ask the committee to offer input into the review.



The issue of misinformation and disinformation has also been the subject of review. The government released a draft of a proposed bill to combat misinformation and disinformation in June 2023. It would give the Australian Communications and Media Authority (ACMA) power to enforce an industry code, or to make one if the industry cannot.

That draft was criticised by the opposition at the time. However, there have been shifts since then, and the committee might be a vehicle for the introduction of an amended version of the bill.

An age-old issue

Online age verification is a simple concept that is difficult to implement unless there are significant consequences for service providers that fail to comply.

Work in this area by the UK’s communications regulator, Ofcom, and the UK Information Commissioner’s Office is often cited as leading practice. However, the commissioner’s website notes “age assurance is a complex area with technology developing rapidly”.

Measures to limit children’s access to social media will be investigated by the committee.
Shutterstock

One approach is for the minor to identify themselves to a platform by uploading a video or sending a photo of their ID. This is entirely contrary to the eSafety Commissioner’s messaging on online safety. The Commissioner advises parents to make sure children don’t share images or videos of themselves, and to never share their ID.

In practice, the most effective age identification for minors requires parents to intervene. This can be done by using software to limit access or by supervising screen time. If children and teenagers can get around the rules simply by borrowing a device from a school friend, age verification won’t do much.

As the International Association of Privacy Professionals has found, age verification and data protection are far harder than they appear. It is especially difficult if the age barrier is not one already in place – such as the adult rights that those over the age of 18 possess – but rather a seemingly arbitrary point in the mid-teens. Other than online, the most important age to verify is 18, for things such as alcohol sales and credit. It is also the age at which contracts can be enforced.

Countries vs corporations

One issue that is often raised about social media platforms is how Australia can deal with a global business.

Here, the European approach in the Digital Markets Act provides some ideas. The act defines corporations with a powerful market position as “gatekeepers” and sets out rules they must follow. Under the act, important data must be shared as directed by the user to make the web fairer and to ensure different sites and software can communicate with one another. It also calls for algorithms to be made more transparent, though these rules are a little more limited.

European Commissioner for a Europe Fit for the Digital Age, Margrethe Vestager, helps administer the Digital Markets Act.
Virginia May/AP

In doing so, it limits the ability of gatekeeper corporations, including Alphabet (Google), Amazon, Apple, ByteDance (TikTok), Meta and Microsoft.

Obviously, Australia can’t harness the collective power of a group of countries in the same way the European Union does, but that doesn’t preclude some of the measures from being useful here.

There is considerable public support for governments to “do something” about online content and social media access, but there are both legal and practical obstacles to imposing new laws.

There is also the issue of achieving political consensus on such measures, as seen with the controversy surrounding the misinformation bill.

But it is clear that in Australia, both citizens and governments have been losing patience with letting tech corporations regulate themselves and shifting responsibility to parents.
