Algorithms used by tech companies, such as those that recommend content or filter information, are often proprietary and closely guarded secrets. Proponents argue that transparency would prevent abuses and ensure fair practices. Opponents argue that it would harm business confidentiality and competitive advantage.
@ISIDEWITH 9mos
Yes
@9MS3NJS 9mos
What would be the purpose in publishing the algorithms, we already understand that they optimise for engagement and attention above all else in the same way that a business optimises around revenue.
What we really need is transparency around hostile agents interfering with discourse between voters — which has nothing to do with the algorithms.
@ISIDEWITH 9mos
No
@9MS3NJS 9mos
Algorithms change daily in a continual attempt to optimise on engagement, forcing companies to publish them will do little to educate the public on the contents of these algorithms and will only force unnecessary regulatory burden on social media companies.
Regulatory burdens often have the effect of locking in the current players, reducing competition in a particular market as the fixed costs incurred by complying with these regulations prevent new entrants from starting to compete. I would very much like to see Meta taken down by competitors.
@9MKHVGZ 9mos
Yes, a third party oversight committee that doesn’t answer to the government should be able to be petitioned by the government when they wish to examine a company’s algorithm and then the committee will decide the validity of the request. If deemed valid any company that wishes to operate in the UK has to comply.
@B2QWQBN 1wk
Yes, it is necessary for national security that the government keeps an eye on what content tech companies, especially social media sites, are putting out.
No, but if a company wants to copyright the algorithm then this should be subject to a professional 'Advanced Computer Logic' branch of the Government, for security, correct use, and oversight.
@9Q83PJ8 7mos
No, but tech companies must be able to prove their algorithms do not pander to extreme views for profit.
@9ZJL9ZR Liberal Democrat 3mos
No, because many algorithms work in a way that cannot be adequately comprehended by human intuition as they are not entirely human-created
@9QSCSHJ 7mos
No, but monitoring should be done to apply selective enforcement to prevent personal and psychological harm to users
@9QPBNC5 7mos
Yes, definitely social media companies and others if they are suspected of causing harm or law-breaking.
@9QG96KK 7mos
Yes and no. These algorithms are intellectual property. But if they are being used on the internet they should be shared. They shouldn’t be shared if used internally within companies.
@9PFPS6W 8mos
No, but some form of certification and validation process should exist to verify they meet certain ethical standards.
@9PK9J6D 8mos
Only if the intellectual property of the algorithm they use is kept confidential and not shared or leaked by the Government.
@9PCGLSS Conservative 8mos
As before, I believe that experts should regulate them, not the government, but it must be a mandatory system with teeth. The algorithms must be protected, with some redactions to protect IP.
@9P8XT9Z 8mos
Would regulators know what an algorithm did even if it was shared with them?? I very much doubt it sadly.
@9N3KP4S 9mos
Yes, and legislation should be introduced to break up large-tech companies or limit their power and influence
@9MS6VQ4 9mos
No, but their algorithms must be explainable from a high level perspective
@9MQS674 9mos
Yes for transparency, but only if the regulators understand it.
@9MQKL9W 9mos
No, this is a nonsensical position. Share what algorithms? Pagerank? It's not 2006.
@9MQ472F 9mos
Only in extreme cases that are in the public interest.
@9MP3PJM 9mos
They should be subject to review and come with warnings if found to promote extreme content
@9MNPM3H 9mos
Yes but only if there's evidence of promoting misinformation
this is an interesting question. If it is proven the algorithms are harming people, then yes. But I do not think this should be the default position.
@9PSZ333 8mos
It is impossible to share algorithms. Even the creators often no longer know how they work after a time.
@9PQBLRX 8mos
No, tech companies rarely understand how their algorithms work due to the nature of how they are created; giving regulators access to these algorithms would be pointless
This for me depends on the company's purpose and the purpose of the algorithm. There may be circumstances where for children safety etc that it is necessary.
@9NRGF29 8mos
Yes and work with tech firms to ensure they pay to host news content on their sites such as a News Bargaining Code like they have in Australia
@9NQW4NC 8mos
They should be required to provide a little bit of information but only if it is deemed useful or necessary
@9NQ6N2C 8mos
No, but they should require the tech companies to evidence that the algorithms do not have bias or create bubbles of visibility
@9NGTV9S 8mos
YES YES YES. These algorithms are a scourge on the human mind. They're made in ways that get those with low attention spans addicted to the internet, making them spend hours doom scrolling as the algorithm learns everything about you and then sells that information to third parties, who in turn pay to send you adverts so they can keep learning more about you.
@9N8Z6ZX 8mos
That is a stupid question. This really only relates to AI and there are no 'algorithms' that could be shared. They are mostly blackbox decision makers.
@9N7GZ6Q 8mos
Unless they are company specific, like KFC's secret recipe, where sharing would harm the product or output of that company.
@9N6Y9CH 8mos
Only if there is a threat to national security or a fear of criminality, and this must be decided by a court.
@9MYDT3G 9mos
Large tech companies' algorithms should be included under the Freedom of Information Act and should be available on request.
Yes, but the regulators must be from a working background relative to the field - capable of understanding the algorithms.
@9MTK2S3 9mos
No, but governments should offer advice to people on how companies use the algorithms
@9CKTJSM 9mos
No, the integrity of the algorithm could be compromised or illegally shared.
@9MSFJB4 9mos
Yes, however NDAs should be required from the regulators.