Changes Coming To GOT

GrandOldTeam

Moderator
Staff member
Short Version

The UK government is forcing us to remove the Current Affairs forum and NSFW-type threads. Possibly also user-to-user private messaging.

Google 'Online Safety Act' if interested.

Long Rambling Version;

We're, of course, an Everton forum.

But in our 18+ years (not one of you wished the forum a happy birthday last week ffs), we've become a community beyond Everton.

Since 2017, we've sought to best accommodate 'current affairs' discussion in its own sub forum - out of the way, with an 'enter at your own risk'/hands-off moderation policy.

It's worked well.

Or as well as it could have done. Online. Politics.

However, the Online Safety Act (OSA) comes into effect in the UK on March 17, 2025. Someone much more articulate than I provides some commentary here. There's also some more background info on the act and its implications here.

The implications of the act have already caused some forums to close entirely; others have disabled the ability to post. There's some discussion of the act on here from December.

Moderating an Everton forum is challenging enough - we don't have the resources, ability, or diplomacy to moderate politics in a way that would make us compliant with the act. Even if we did, I don't think we'd have the inclination to. Nor would we be prepared to accept the personal liability of doing so.

Sadly then, we need to close the Current Affairs forum from 1st March.

We'll also need to close 'Not Safe For Work' threads, like 'The Fit Birds' thread.

There's a chance we may also need to disable user-to-user private messaging, but I'm awaiting clarification on that.

But fear not. We can still fume about Everton.
 

They definitely are.

I don't think anyone can really dispute that.
They can try, but I guarantee it won't happen. Encryption can't be "ended". It's an algorithm and is the foundation of security for most online transactions. Without it, the internet and cyber security end. They will lean on the big companies, but those have already told the government to do one.

Interestingly, companies are using more obscure ways to generate truly random seeds for their encryption, to try and stop hackers. One company (I think it's Cloudflare...) has a room full of lava lamps with cameras pointed at them. The cameras take a picture of the wax in the lamps and convert the pixels in the image to numbers, which are used in the seed for the algorithm.

There's another company that has a giant mobile hanging from the ceiling in one of their offices, and similarly convert pictures of it into a numeric value as the basis for the seed. It is affected by the air conditioning and by people walking past, which helps with its randomness.
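As a rough illustration of the idea only (a sketch, not how Cloudflare or anyone else actually does it - the function name and the mixing scheme are invented for this example), turning a camera frame into a seed can be as simple as hashing the raw pixel bytes, so that every pixel influences the result:

```python
import hashlib
import random
import secrets

def seed_from_image(image_bytes: bytes) -> int:
    # Hash the pixel data so a change to any single pixel changes the
    # whole digest, then mix in OS-provided entropy so the seed can't
    # be reproduced even by someone who saw the same frame.
    digest = hashlib.sha256(image_bytes + secrets.token_bytes(32)).digest()
    return int.from_bytes(digest, "big")

# Stand-in for a camera frame of the lava lamps / hanging mobile.
frame = bytes(range(256)) * 10
rng = random.Random(seed_from_image(frame))
print(rng.randrange(10**6))
```

Note that `random.Random` here is just for demonstration - it isn't cryptographically secure. Real systems feed entropy like this into the operating system's entropy pool or a proper CSPRNG rather than a general-purpose generator.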
 
The below article from New Scientist gives a decent overview.

Hundreds of small websites may shut down due to UK's Online Safety Act

Hundreds of community websites run for fans of everything from cycling to Sunderland AFC may be forced to shut down by the UK's Online Safety Act, which is designed to protect children from harmful content

The UK’s new Online Safety Act may result in hundreds of community websites and forums being permanently shut down, as site administrators say they fear the law imposes onerous obligations and exposes them to potential million-pound fines.

“We fall firmly into scope, and I have no way to dodge it,” says Dee Kitchen, who runs the cycling forum LFGSS for its 70,000 members. “The Act is too broad.”

The Online Safety Act (OSA) is designed to regulate online speech and media, essentially protecting children from “legal but harmful” content, such as pornography or bullying. Nadine Dorries, at the time the UK’s secretary of state for digital, culture, media and sport, said in a statement in 2022 that tech firms “haven’t been held to account when harm, abuse and criminal behaviour have run riot on their platforms”.

The law applies to the owner of any “online service” where users can interact, a definition so broad that it captures essentially anything but static websites with no interactive functionality. Failure to adhere comes with potential fines of up to £18 million or 10 per cent of annual turnover, whichever is higher.

Site owners like Kitchen say the new law hasn't taken small operations like theirs into account. Their Microcosm forum software is used to power 300 online communities, but Kitchen is planning to delete them all on 16 March, the day before the OSA comes into force.

“I run these communities philanthropically, giving my time and money to do so,” says Kitchen. “It doesn’t matter that they’re run by an individual and not a company, that it loses money every month. Merely by being linked to the UK and allowing users to speak to users, it’s within scope. The penalties of non-compliance would be so devastatingly ruinous to me that I don’t see I have a choice. It’s devastating.”

Liam Dawe, who founded the website GamingOnLinux, says he is also being forced to shut a forum that has almost 15,000 users. He attributes that decision directly to the OSA, which he describes as “an incredibly wide-reaching law”.

“The whole thing is just ridiculous, putting a huge burden on individuals and micro-businesses,” says Dawe. “It will be a constant admin headache and a time-sink.”

The administrator of Sunderland Association Football Club fan website Ready To Go also told its 1700 users in a post that the site would be closing due to the OSA. “Continuing to provide the service will simply not be practical with the resources we have,” they said. “It will all just be too onerous.”

Users and administrators of other forums, including one for fans of Porsche 911 cars and the forum of a publishing house called Sea Lion Press, also expressed concerns that the OSA could force them to close, but said that they would await further detail from Ofcom, the body responsible for regulating online safety in the UK.

James Baker at Open Rights Group, a non-profit internet advocacy organisation, says this is the first evidence he has seen of websites shutting because of the OSA, but that he and his colleagues had predicted this would begin to happen as the implementation date of the law drew near.

“It’s understandable if you’re a volunteer and you’re doing something without much resource that when you suddenly get new requirements and legal obligations, that might be the final straw,” says Baker. It is possible that LFGSS and others are the “canary in the coal mine”, he says, and that a wave of closures will begin as more people become aware of the regulations they will have to comply with, and the potential personal risks for failing to do so properly. He is also concerned that foreign websites will simply ban UK users rather than have to take on additional complications.
A spokesperson for Ofcom told New Scientist that more help will be available next year for those running online communities to help them better understand their obligations.

“Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. There are some things that all services will need to do, and other things that will depend on the risks associated with a particular service, and its size,” says a spokesperson.

Ofcom has launched a tool to help businesses and online communities determine if they fall under the scope of the OSA, but hasn’t yet released its promised advisory materials – which it calls the Digital Support Service – that will provide guidance on harmful content and outline the necessary risk assessment process and record-keeping obligations.

Software engineer and consultant Russ Garrett says he wouldn’t blame anyone running a small forum who looks at more than 1000 pages of documentation published by Ofcom and decides to shut it.

“If Ofcom and the government didn’t intend to shut down these tiny, low-risk websites, they should have provided much more accessible, practical guidance for small sites,” says Garrett. “For a time-limited, risk-averse volunteer website operator, I think the only other option at this point is to engage a lawyer at a cost of many thousands of pounds.”
 
Just based on a few points there, it is quite frightening that UK law is based around things that are not absolutes. Hate speech and misinformation should be black and white, for example, but the last few years have shown it is simply whatever is being defined as such at the time. There has been plenty of 'misinformation' that turned out to be true at a later date, and 'hate speech' that wasn't actual hate speech, just 'I don't like what you're saying'. It screams of censoring the internet, by forcing sites either to go beyond their means to adhere, as in cases like this, or simply close down. It would make a lot more sense to age-restrict content by proving age than to go this route.
Laws are virtually never based in absolutes. That's the whole point in lawyers really, most of the time they aren't there to argue in pure black and white terms that their client did or didn't do something, they're arguing that the law can be applied in a way that means they should or shouldn't be found guilty. As somebody who's been involved in regulation I can tell you it's virtually impossible to write legislation or regulation in a purely black and white way, there will always be shades of grey.
 

Indeed.

Doesn't prevent it being a time bandit to tick boxes and ensure documentation to show those ticked boxes.

In response to a bit of criticism, they're softening their stance a little, and there's some reference to more help and support to come.

“Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. There are some things that all services will need to do, and other things that will depend on the risks associated with a particular service, and its size,” says a spokesperson.

Ascertaining our size and risk (current affairs really inflates the latter) isn't enthralling.
 

Oh definitely. Don't get me wrong, you're the one who would have to deal with the consequences, and if you're in any doubt at all as to whether you could be done for it, then you're 100% doing the right thing. That's kind of what I mean really: no law is ever going to say 'you can do this, you can't do this'. It will always be open to subjective interpretation to some extent, and so if you think you could be found on the wrong side of that, it's just not worth the hassle.
 
I thought that would be a bit of a workaround too. I figured early days that all we'd need to do is document that.

But it's not just illegal content as we've long defined it. There's misinformation, hateful content, etc. One member reported another last month as making hateful posts because they were autistic. I disagreed. They've said they'll report that. When that happens in future, I need to ensure I, and the platform, are protected and fully compliant. If I can't - what happens then?

I appreciate it’s you who is the one who’s going to have to deal with it (so this is a very easy thing for me to say) but I think you’d be safe on CA provided there was an effective intervention mechanism and you recorded with sensible rationale why a post was or wasn’t removed.

A malicious poster or a malicious campaign (like something around one side or another of the Israel-Palestine thing) could cause grief by reporting to OFCOM if they were unhappy with the decision - but I’d be amazed if they entertained it; they’ll have their hands full with genuinely criminal matters.
 

I think a lot of that is right.

We're defined as a small platform - this is one interpretation:

1. Small Platforms

  • Definition: Platforms with fewer than 1 million UK users.
  • Regulatory Impact: These platforms still have obligations, but they are more minimal compared to medium and large platforms. You'll need to:
    • Have a clear user safety policy.
    • Provide easy reporting mechanisms for harmful content.
    • Ensure moderation of illegal content (e.g., hate speech, threats).

On the surface, it's pretty basic stuff. As a forum, excluding CA/NSFW threads - we already do more than the act asks. And have for... well, since we started in 2007.

The issue is around moderation of CA, and definitions around illegal content in CA.

Our risk assessment (see an example doc template here) which feeds into this is pretty different if we host a dedicated CA forum.

I've had to come to the conclusion that operating CA as we have, and allowing NSFW, makes us vulnerable to the agitators you reference in your second paragraph. I'd also be amazed if they entertained it, but even just the process is enough of a deterrent. Time and resource aren't infinite.

Ultimately you end up weighing up risk and reward.

The picture might change too. But all we can do for now is interpret as best as we can, and ensure we're compliant/remove vulnerabilities.
 


