Meta and Twitter have called for Australia's federal government to review the effectiveness of the country's digital platforms regulation in light of the passing of the Online Safety Act, along with the anti-trolling and online privacy laws currently under consideration.
Both tech giants made these demands in submissions to the Select Committee on Social Media and Online Safety, with Twitter writing that the committee should conduct a review of the online safety space in Australia one year from its initial report, which is due next month.
The Select Committee on Social Media and Online Safety was established late last year to inquire into the practices of major technology companies and consider evidence relating to the impact social media platforms have on the mental health of Australians.
The committee's inquiry was approved by the federal government with the intention of building on the proposed social media legislation to "unmask trolls".
Twitter said that with the Online Safety Act only recently passed and the government's federal probe running for just three months, there has not been enough time to effectively implement digital platforms legislation.
"With the range of factors that need to considered to holistically advance online safety, we therefore ask for the timeline be extended for the Select Committee Inquiry into Social Media and Online Safety to allow for the effective introduction and implementation of the Online Safety Act 2021 (Cth) and to ensure meaningful consultation with the community," Twitter wrote to the committee.
Meta, meanwhile, wrote in its submission that the federal government should make statutory reviews of new digital platforms legislation mandatory to ensure they are effective and fit-for-purpose, specifically pointing to the "significant amount of new legislation that has been passed".
"Policymakers should be alive to the risk of overlapping, duplicative or inconsistent rules across different laws," Meta said.
Digital Industry Group Inc (DiGi), the Australian industry group advocating for tech giants, including Facebook, Google, TikTok, and Twitter, shared a similar sentiment in its submission to the parliamentary committee.
In its submission, DiGi wrote that proposed regulatory measures, such as making age verification mandatory on social media platforms, have been put in the limelight without any legislative notice. It said that, given the unprecedented implications of age verification for Australians across a range of digital services, wider consultation must first take place before any such measure is implemented.
DiGi added that the slew of new laws could result in overlap, and recommended that the federal government consider streamlining online safety legislation into a single Online Safety Act.
Parliamentary committee hears testimony of death and rape threats on social media
Days after these submissions were publicly released, Australian television presenter Erin Molan appeared before the committee yesterday morning. During her appearance, she testified that trolls have faced no consequences for sending her death and rape threats on social media platforms.
"These are sent directly to me on platforms that I use professionally," Molan told the Select Committee on Social Media and Online Safety, when explaining how she received threats that were directed at herself and her daughter.
"It was almost impossible to get help and you almost feel silly. As I said, the personal impact of this on people and we've seen people take their lives, we've seen kids try to take their lives. We've seen so many lives ruined by this kind of behaviour."
She also told the committee that the work performed by social media platforms, such as Facebook and Twitter, to assist police in certain instances is "not that effective" in preventing trolling behaviour, and that victims often feel powerless when reporting online abuse.
In light of this, Molan called for the eSafety commissioner's powers to be expanded as well as for legislation to be introduced to put more accountability on social media platforms.
"[Big tech] generate a ton of money and with that comes responsibility. They, of course, have the responsibility to ensure their platforms are a safe space because every workplace in the country needs to ensure that, within their walls, it is a safe place for their employees, but they won't do it. Unless there are laws that punish them for not doing it, why would they do it?" Molan told the committee.
University of New South Wales sociology associate professor Michael Salter, who also appeared before the committee, said that Molan's experience of feeling powerless to prevent online abuse was a common one, especially among children.
"It's actually really hard to get [children] to tell an adult in their life ... so this is a really complex situation. There is work to do here in Australia, to think about how we develop a holistic response to children in order to target their unique needs and vulnerabilities," Salter said.
Salter also testified about instances in which YouTube's algorithm created playlists of children dancing and performing gymnastics that were only visible to paedophiles.
He explained that because these playlists are not surfaced to non-paedophile communities, YouTube's mechanisms for detecting and reporting inappropriate behaviour can be ineffective.
"Having basic safety expectations built into platforms from the get-go is not too much to expect from an online service provider," Salter added.