Advocates are demanding social media companies step up and intervene in cases of abuse and trolling to keep people safe online.
Disability advocate Carly Findlay has told a federal parliamentary inquiry social media companies have taken little action when she has been trolled and abused.
"I don't think there has been any intervention," she told the committee on Wednesday.
"The only time I have ever seen anything happen is when Twitter says a profile has been removed for breaching standards, and that is rare.
"There's no recourse in online safety."
Ms Findlay said the eSafety Commission was a useful advocacy and reporting tool, but argued it was largely powerless to stop online abuse.
She wanted social media to be made safer, particularly given the large numbers of people living with disabilities who rely on such platforms.
"Online is our workplace and where we also socialise, and we can't just switch off," Ms Findlay said.
"There's the expectation we're able to switch off and walk away, and particularly for the disability community we rely on it for our safety and community."
The inquiry comes as the federal government pushes for tighter social media regulations and new laws to hold tech giants to account.
The proposal would force social media platforms to take down offending posts and, in some circumstances, reveal the identity of anonymous posters.
Advocacy group the Harmony Alliance warns these anti-trolling laws could create additional problems.
"I'm just worried with this legislation, if they're not careful, (it) will be harmful by design," chair Nyadol Nyuon said.
"Those with access and the powerful will be able to litigate and silence voices."
Representing the Let Her Speak campaign, lawyer Michael Bradley said the anti-trolling bill could have a "chilling" effect on free speech, potentially making it easier for powerful people to use defamation law to silence their critics.
Mr Bradley said he saw no evidence defamation laws had anything to do with online safety.
Australian Muslim Advocacy Network advisor Rita Jabri Markwell said her organisation's goal was to prevent another Christchurch massacre, arguing that the dehumanisation of Muslim people online had contributed to the attack.
"I was terrified for my family that day and that sinking feeling has never gone away," she said.
Ms Markwell added that a Facebook whistleblower had recently given evidence that the company's content curation algorithm prioritised engaging content.
She questioned what human moderation and content regulation was being done to stop engaging content from harming minority groups.
The inquiry is due to report back in mid-February.