AS THE 18th Congress draws to a close, a Senate committee has approved a report recommending that social media platforms like Facebook, YouTube, and TikTok be held accountable for the spread of disinformation.

Senator Francis “Kiko” Pangilinan, chairman of the Senate Committee on Constitutional Amendments and Revision of Codes, drafted the report on social media disinformation contained in the approved Senate Resolution 953.

“To discourage inaction by the social media platforms, malice should be presumed on the part of the publisher (i.e., social media platform) if the libelous comment is made by a fake or fictitious person and such platform fails to take down the libelous content within a reasonable time,” the report recommended.

The report noted that legislation needs to be broad enough to capture the techniques that disinformation producers use to elude accountability.

It also wants government offices to ensure that their employees are not engaging in or spreading disinformation and hate speech outside of their official functions.

Chaired by Pangilinan, the committee held four hearings on the rise of social media platforms and the rapid advancement of technology, where journalists, including Nobel Peace Prize Laureate Maria Ressa, academicians doing research on networks of disinformation, representatives of advertising and public relations agencies, a retired Supreme Court justice, and other government officials presented their positions.

The report, approved on the last session day of the 18th Congress on June 1, also seeks to amend the libel law so that the use of fake accounts or fake names in making libelous comments is treated as per se proof of malice. “Social media platforms should be held accountable,” it said.

It also seeks the revision of the Cybercrime Prevention Act of 2012 to enable it to address the exponential rise in the use of social media platforms for disinformation.

“COMELEC should play an active role in combating disinformation during elections given that a representative democracy is not possible with rampant disinformation,” it added.

 

Kiko’s work as senator commended: Senators on Wednesday, June 1, 2022, adopt Proposed Senate Resolution No. 1018 commending Sen. Francis “Kiko” Pangilinan for his immeasurable contributions to the chamber’s fruitful achievements during the 17th and 18th Congresses. The resolution cited Pangilinan’s work as chairperson of the Committee on Constitutional Amendments and Revision of Codes, and of the Committee on Agriculture and Food. Pangilinan was also recognized for his deep concern over the plight of small farmers and his role in the passage of Republic Act No. 11598 or the Cash Assistance for Filipino Farmers Act, RA 11524 or the Coconut Farmers and Industry Trust Fund Act, RA 11511 or the amendments to the Organic Agriculture Act, and RA 11321 or the Sagip Saka Act. (Senate PRIB Photos)

The other recommendations in the committee report are the following:

  1. Refile the SIM card registration bill. Persons spreading hate speech and disinformation hide behind fake accounts and fake names. The SIM card registration bill may help in determining the identities of these disinformation and hate speech peddlers. Penalties should be imposed on telcos that violate the said law;
  2. File a bill that will compel social media platforms to require users to prove their identities before they can use the platforms’ services. Peddlers of disinformation and hate speech usually hide behind fake names and fake accounts;
  3. File a bill requiring influencers or social media personalities with large followings to disclose to their followers whether they received material or monetary considerations from advertisers, politicians, and personalities. The bill should be patterned after the United States Federal Trade Commission (FTC) disclosure rules where the FTC requires influencers to disclose whenever they have any financial, employment, personal, or family relationship with a brand, including receiving free or discounted products and other perks;
  4. Require government offices to have policies governing their employees’ “sideline” digital media work while handling their respective official social media accounts. Government offices that maintain official social media accounts, such as the PCOO, should have a policy regulating their employees’ digital work outside of their official functions;
  5. Campaign finance regulations should bring in transparency and accountability. People hiring digital campaigners should be compelled to disclose what campaigns they have commissioned, how much they spent, and who is involved. Also, campaigns now take on very different formats, such as influencer posts or hashtags made to trend;
  6. Pass the following legislative measures: a. Impose administrative sanctions against government officials or employees who use government resources to wage disinformation campaigns against the public they are supposed to serve; b. Strengthen the capacity of the educational bureaucracy to produce high-quality textbooks; c. A memory law patterned after that of Germany, which penalizes the denial of agreed-upon historical truths, subject to the right to freedom of expression and with an assurance of independent judicial intervention; d. Strengthen the capacity of the government’s own massive media and information infrastructure to report the news independent of government influence; e. Hold social media platforms accountable and treat them as information utilities; f. Strengthen the capacity of the public to become critical and discriminating users of content by reviewing and improving the Department of Education’s current Media and Information Literacy or MIL program for high school students; g. Prohibit creators of harmful content from monetizing their content;
  7. Promote a “whole of society” approach that will require a lot of monitoring, civil society support, and support for independent audits. Ensure the participation of all stakeholders, especially the social media platforms, advertisers, media, and public relations agencies;
  8. Social media platforms should be more transparent in relation to microtargeting. These platforms should also provide the necessary tools for advertisers to better monitor and have more control over their ad placements. This is in response to the advertisers and media agencies who said that it is “practically impossible” for them to monitor ads inadvertently placed alongside disinformation content;
  9. Social media platforms should be made responsible for their algorithms, which, in some instances, create a cycle of feeding harmful, inflammatory, or untrue content to their users. They should be compelled to release the details of their algorithms and core functions to trusted independent researchers to determine if such algorithms artificially amplify false and manipulative information on a wide scale;
  10. Social media platforms should allow advertisers to do a detailed audit on where (i.e., specific page, channel, or video) their ads appear;
  11. Social media platforms should extend their direct reporting system for takedown requests to civil society organizations. Reporting should be more open to the public;
  12. Social media platforms should have policies requiring collaboration with civil society groups, advertisers, media agencies, and government;
  13. Academe and civil society groups should be allowed to help in the direct reporting of takedown requests, which is currently only available to law enforcement agencies;
  14. Government and/or social media platforms should consider accrediting independent civil society groups, non-governmental organizations, or members of the academe to review content and identify which channels are purveyors of disinformation;
  15. Academe and civil society groups should collaborate with regional and global institutions to have a wider perspective in combatting disinformation;
  16. Advertisers and their creatives and media agencies should be in constant dialogue with social media platforms to ensure clean and credible content and improve algorithms for the proper placement of advertisements;
  17. 4As, MSAP, ASC, PANA, IMMAP, and other similar associations should update their self-regulatory standards, including their respective Code of Ethics, to encourage transparency and accountability in digital marketing;
  18. KBP should expand the coverage of its self-regulatory standards to cover the social media accounts and podcasts of its anchors;
  19. Government should provide more support or social safety nets to digital workers;
  20. Government should fund more research on networked disinformation;
  21. Government should identify the role of government officials in disinformation efforts (based on fact-checking efforts, most online disinformation comes from government officials);
  22. Government should focus on strengthening enforcement actions and compelling compliance of internet service providers with their obligations under the Cybercrime law; and
  23. Schools should include multi-platform information literacy and critical thinking in the basic education curriculum, similar to Finland’s model.
