Changes Coming To GOT

GrandOldTeam

Moderator
Staff member
Short Version

The UK government is forcing us to remove the Current Affairs forum and NSFW-type threads, and possibly user-to-user private messaging too.

Google 'Online Safety Act' if interested.

Long Rambling Version:

We are, of course, an Everton forum.

But in our 18+ years (not one of you wished the forum a happy birthday last week, ffs), we've become a community beyond Everton.

Since 2017, we've sought to best accommodate 'current affairs' discussion in its own sub-forum - out of the way, with an 'enter at your own risk', hands-off moderation policy.

It's worked well.

Or as well as it could have done. Online. Politics.

However, the Online Safety Act (OSA) in the UK comes into effect on March 17, 2025. Someone much more articulate than I am provides some commentary here. There's also more background info and its implications here.

The implications of the act have already caused some forums to close entirely, while others have disabled posting. There was some discussion of the act on here back in December.

Moderating an Everton forum is challenging enough - we don't have the resources, ability, or diplomacy to moderate politics in a way that would make us compliant with the act. Even if we did, I don't think we'd have the inclination to, nor would we be prepared to accept the personal liability involved.

Sadly then, we need to close the Current Affairs forum from 1st March.

We'll also need to close 'Not Safe For Work' threads, like 'The Fit Birds' thread.

There's a chance we may also need to disable user-to-user private messaging, but I'm awaiting clarification on that.

But fear not. We can still fume about Everton.
 

The private messaging bit is mad. On that interpretation of the act, would WhatsApp, Facebook Messenger, etc. fall under it too?

That's around scanning, mate:

The Online Safety Act 2023 introduces new rules aimed at making the internet safer, but it has raised concerns regarding private messaging. The main issues are:

1. End-to-End Encryption Concerns

The law gives Ofcom the power to require platforms to scan private messages for harmful content (e.g., child sexual abuse material).

This could undermine end-to-end encryption (E2EE) in apps like WhatsApp, Signal, and iMessage.

If platforms comply, they may need to introduce client-side scanning, which critics argue weakens security for all users.


2. Threat to Privacy & Free Speech

Enforcing scanning could mean that tech companies or even the government could access private messages.

Privacy advocates argue this sets a dangerous precedent, allowing for mass surveillance.

Some companies (like Signal) have said they will not comply and may withdraw from the UK market.


3. Technical Feasibility Issues

Many experts argue there’s no safe way to scan messages without creating security vulnerabilities.

Any backdoor created for government use could also be exploited by hackers.


4. Platform Liability

The Act places responsibility on tech companies to prevent harmful content, even in private messages.

This could lead to stricter moderation or even blocking encrypted messaging services in the UK.
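For anyone wondering what "client-side scanning" actually means in practice, here's a deliberately simplified Python sketch. Everything in it is illustrative: real proposals involve perceptual hashes of images matched against databases like those used for CSAM detection, not exact hashes of text, and the function names here are made up. The point it shows is the structural one critics make - the scan runs on the *plaintext*, on your device, before encryption ever happens, which is why it's said to bypass end-to-end encryption rather than break the maths of it.

```python
# Toy model of client-side scanning (illustrative only).
import hashlib

# A blocklist of hashes of known prohibited content,
# distributed to the client by the platform or a third party.
BLOCKLIST = {hashlib.sha256(b"known-bad-content").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the message matches the blocklist.

    Crucially, this runs on the plaintext, on the sender's device,
    BEFORE encryption - the ciphertext itself is never broken.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send(plaintext: bytes, encrypt):
    """Hypothetical send path under a scanning mandate."""
    if client_side_scan(plaintext):
        # A match could be reported to the platform or authorities
        # before the message is ever encrypted and sent.
        return "flagged"
    return encrypt(plaintext)
```

The security objection follows directly from the sketch: whoever controls `BLOCKLIST` controls what gets flagged, the matching code is a new attack surface on every device, and nothing in the design limits it to any one category of content.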

 
Jesus, that's absolutely terrifying, and it has the potential for incredible levels of abuse. I can't believe that part made it through.

I have been a paid-up member of the EFF for years, and this was called out when it first reared its head as the Online Harms White Paper.

At its core, the OSA embodies a fundamentally broken approach to online governance: rather than addressing the systemic failures of Big Tech, it attacks encryption, undermines privacy, and empowers corporate platforms to dictate what constitutes "safety." The government’s proposals to force companies to weaken end-to-end encryption are among the most extreme anti-privacy measures ever attempted in a democracy. The mere idea that "client-side scanning" or government-mandated backdoors can exist without compromising security is a delusional fantasy pushed by officials with no understanding of cryptography and a deep hostility toward an unmonitored public sphere.

Encryption is not some sinister tool for criminals - it is the bedrock of digital security. It protects everything from personal conversations to financial transactions, from whistleblowers exposing corruption to journalists reporting from authoritarian regimes. The UK government, much like its counterparts in the United States, refuses to accept that breaking encryption for the "good guys" means breaking it for everyone. Once a backdoor exists, it is only a matter of time before it is exploited, whether by malicious actors, hostile states, or even the very governments demanding its creation.

The supposed "balance" that lawmakers propose - where encryption can remain "safe" while still allowing government-mandated access - is basically a fiction. Security is binary.

Either everyone is protected, or no one is. The push to break encryption under the OSA is not about safety; it is about expanding state surveillance and increasing corporate control over user data.

The irony is that the UK’s Online Safety Act, for all its posturing, does nothing to address the actual failures of platform governance. It focuses on forcing compliance through mass surveillance, rather than holding platforms accountable for the design choices that foster online harm. The problem was never encryption - it was never the inability to monitor users. The problem is that companies like Meta, X, and Google prioritise engagement over well-being.

These platforms are fundamentally incapable of self-regulation because their profit model depends on amplifying harm. Our government's solution? Instead of taking on the trillion-dollar tech empires that deliberately allow misinformation, extremism, and abuse to flourish, it is demanding universal surveillance of private communications.

Instead of requiring platforms to be transparent about their algorithms, it is forcing companies to weaken the strongest protections users have against authoritarian overreach, cybercrime, and corporate exploitation.

It's just maddening.
 

And let's not forget that pretty much all governmental digital infrastructure now runs on Microsoft/Google/Amazon web services, so as ever they're punishing the people - it's one rule for them and another for the rest of us.
 
Could also argue that people self-censoring, even in private communications, is a feature of the act - an extra layer to stop the "wrong" ideas from being spread at all.

I could imagine an argument in 2020/21 where sharing anti-lockdown or vaccine-hesitant sentiments with friends on WhatsApp could be seen as a public health risk, in the same way the tweets people got arrested for were (clearly) inciting violence. This act opens the door to that with its vague nature, and I guess it's down to each situation whether each person agrees with it or not.
 

Off Topic
Would you say the same if it was a Palestinian flag? I doubt it.
Well, no, I wouldn't. The state of Palestine has not carried out any recent atrocities, nor anything of particular controversy. I need a new pic anyway, so I have narrowed it down to 3:
1) The Al-Qaeda flag with the slogan 'Allahu Akbar'.
2) The apartheid South Africa flag with the slogan 'I support the Afrikaans'.
3) The union flag with the slogan 'I support the government of Lord John Russell (1846-1852)'.

I'm sure all of these will be well received.
 

100% correct.

But they've embedded themselves into the economy and have so much power that they're untouchable.
 
