

Posted

I read this article in today's Observer, by the respected journalist on digital matters, John Naughton:    https://www.theguardian.com/commentisfree/2025/jan/12/note-to-no-10-one-speed-doesnt-fit-all-when-it-comes-to-online-safety

It tells of a UK online 'platform' called "Microcosm" that hosts a wide variety of hobby and social interest groups from cancer support to cycling.  The man who runs it  as his own hobby will close it down with all the groups it supports, as he cannot afford the safety measures demanded by the OSA, which he says will cost thousands of pounds a year.

Naughton writes that the OSA is a blunt instrument,  clumsily written against the major social media, that puts ordinary sites in danger.

Should we, the other Triumph boards or indeed all the classic car and hobby sites in the UK be worried?  Is the widely used system of appointing board members as  moderators insufficient?

John

Posted

In my ignorance, I thought this might have a very negative effect on this and many other hobby sites, but no one has responded. So forgive me for 'bumping' it.

Can anyone reassure me, or share my concern?

John 

Posted

Having just read into this (having somehow missed this entire thing before), I think that in the long run it will have little effect.

Essentially it is a very poorly written piece of legislation that looks to be unenforceable in reality. My thought on the "Microcosm" scenario is that someone is making a mountain out of a molehill, possibly because they don't want to run it any more.

The real battle on this will be with big tech, not small special-interest forums like ours! And besides, we don't have encrypted messaging on here, and that appears to be where the rub lies: to enforce the Act, companies would have to weaken encryption in order to view users' messages etc.

So no, I wouldn't be too concerned as it stands.

Posted

I have a vested interest in this, as I look after the CT forum.

I have just glanced at this page https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer  which is having a good stab at being succinct. 

It appears that there just needs to be a system in place to deal with inappropriate/harmful content. Which we all have, as the online community is very good at reporting dodgy content, and moderators act swiftly to deal with it. And to be perfectly honest, in 15 years I have never seen anything remotely harmful. It is all either spam or spats between people who ought to know better.

Posted
5 hours ago, JohnD said:

In my ignorance, I thought this might have a very negative effect on this and many other hobby sites, but no one has responded. So forgive me for 'bumping' it.

Can anyone reassure me, or share my concern?

John 

Go have a nice cup of cocoa and worry about something else. 

 

Posted
1 hour ago, zetecspit said:

I have a vested interest in this as I look after the CT forum

I have just glanced at this page https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer  which is having a good stab at being succinct. 

It appears that there just needs to be a system in place to deal with inappropriate/harmful content. Which we all have, as the online community is very good at reporting dodgy content, and moderators act swiftly to deal with it. And to be perfectly honest, in 15 years I have never seen anything remotely harmful. It is all either spam or spats between people who ought to know better.

Yes, that's what I thought! Moderation and peer pressure prevent boards such as ours from offending against the OSA.

Naughton is a newspaper columnist, who cynics might say was just after copy, but he is also "senior research fellow in the Centre for Research in the Arts, Social Sciences, and Humanities at Cambridge University, Director of the Press Fellowship Programme at Wolfson College, Cambridge, emeritus professor of the public understanding of technology at the British Open University, and adjunct professor at University College, Cork" (Wikipedia)  So DO he and his informant have real grounds for concern?

I've written to him c/o the Observer asking him to use his next column to explain why he thinks ordinary sites are so threatened.

John

Posted
18 hours ago, JohnD said:

I've written to him c/o the Observer asking him to use his next column to explain why he thinks ordinary sites are so threatened.

I had missed this too John so apologies.

But a short perusal of the links does suggest to me that Roger's assessment is quite correct, and that whilst the legislation may apply to us Sidewaysers, the protections in place are 'suitable and sufficient' for the site. As I am sure an (unnecessary) forensic analysis of the forum content would demonstrate.

Posted (edited)

An item in Private Eye (p.22) is relevant, and refers to an Ofcom page that lists "more than 40 measures they're expected to take".

Going there, I find that: "Specifically, the rules cover services where:

people may encounter content (like images, videos, messages or comments), that has been generated, uploaded or shared by other users."

So the OSA does apply to this site.

I'm posting this on my phone and pursuing the other obligations will be easier on the desktop, but if anyone else wants to dig, it's at https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/guide-for-services/

John

Edited by JohnD
Posted
1 minute ago, JohnD said:

So the OSA does apply to this site.

John. My reading is that all the above posts agree that this measure applies to this site. 

What we are not agreeing is that it is complete armageddon and we have to either put in place lots of expensive systems and supervision or shut down. 

Back to 'suitable and sufficient'.

Private Eye is a slightly different kettle of fish compared to Sideways. I would suggest anyway.

And I personally would not be concerned as an owner, developer, admin of this site.

Posted

Well, maybe not! Ofcom provide a "helpful tool" (sic!) so that you, the owner of the website, may know if it is exempt from the Act! Six questions, some with obvious answers, like, "No, we don’t publish/display any pornographic content." And then:

5. Do any exemptions apply to the content on your online service?

Some types of user-to-user service are exempt from the Act. Generally, this is because there are limits to the ways users can communicate on your online service or there are limits to the type of content users can generate or share on your online service.

Your online service will be exempt if:

  • The only way users can communicate on your online service is by email, SMS, MMS and/or one-to-one live aural communications; and/or
  • Users can only interact with content generated by your business/the provider of the online service. Such interactions include: comments, likes/dislikes, ratings/reviews of your content including using emojis or symbols. For example, this exemption would cover online services where the only content users can upload or share is comments on media articles you have published, or reviews of goods and services your business provides. It would not apply if users can interact with content generated by other users.

This is confusing as previously on another page it said:

Specifically, the rules cover services where:

  • people may encounter content (like images, videos, messages or comments), that has been generated, uploaded or shared by other users. Among other things, this includes private messaging, and services that allow users to upload, generate or share pornographic content. The Act calls these ‘user-to-user services’;

I had to read this several times before I understood it, and I think it means "Can people post messages on your site?" In which case, yes. Although in that case, why does the 'tool' conclude that the website so described is exempt, when we could be discussing less salubrious content than classic cars? Anyway, it then absolves itself of all responsibility and says that we should consult a lawyer!

It also says that:

The rules apply to organisations big and small, from large and well-resourced companies to very small ‘micro-businesses’. They also apply to individuals who run an online service.     

But then it refers to the size of the website, defining a 'large site' as "a service which has an average user base of 7 million or more per month in the UK", implying that small websites may be exempt.

This is too much for my little brain.   

John 

Posted (edited)

Thank you, zetec, and thanks to Russ Garrett!    But I read that "You’re a low-risk service only if you have assessed your risk as low for all 17 kinds of illegal harms."

And that even in that case, you must, "Name an individual (accountable to the most senior governance body) to handle content safety, reporting and complaints." 

We rely on Craig to look after the nut'n'bolts of this board, and a few leading members as voluntary moderators. Rather than impose on them any further, do we need to appoint another to do the above? Clearly they should be familiar with the Internet and the OSA!

I note that the above assessment must be done before the 16th March 2025, so there's not much time!

John

Edited by JohnD
Posted
1 hour ago, JohnD said:

We rely on Craig to look after the nut'n'bolts of this board, and a few leading members as voluntary moderators. Rather than impose on them any further, do we need to appoint another to do the above?

No, that is sufficient.

Honestly John, you seem to be falling into the same trap as the guys in the first article you posted, in making mountains out of molehills.

This new Act is designed to try and force large networking sites to "self-moderate" to prevent various illegal activities from being carried out on their platforms. This is the likes of WhatsApp, Facebook, Telegram, Apple apps and a host of "semi-darkweb" sites etc.

The likes of us, CT, TSSC and the multiple other classic car or indeed "minor enthusiast" forums are not exactly hotbeds of criminal activity. This law is not aimed at us; it is aimed at those sites where criminal activity is known to be taking place in the background.

Could it affect us? Yes, of course. Is it likely to affect us? No. Much the same as multiple other laws out there that "could" affect us on a day to day basis.

So the question here is: do we have sufficient moderation in place? And for what we are, yes. We have moderators, and I presume we quite likely have software in the background that automatically flags the more suspicious content if it were to be posted.
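For what it's worth, that sort of background flagging needn't be anything exotic. Here is a minimal sketch in Python of keyword-based flagging, purely illustrative: the patterns and function name are my own invention, not anything taken from this forum's actual software.

```python
# Illustrative sketch of simple pattern-based post flagging.
# The patterns below are made-up examples, not a real moderation list.
import re

SUSPICIOUS_PATTERNS = [
    r"\bfree\s+crypto\b",        # typical spam phrasing
    r"\bclick\s+here\s+now\b",   # another common spam tell
]

def flag_post(text: str) -> bool:
    """Return True if the post matches any pattern and should be
    queued for a human moderator to review."""
    return any(re.search(p, text, re.IGNORECASE) for p in SUSPICIOUS_PATTERNS)

print(flag_post("Get FREE crypto, click here now!"))          # True
print(flag_post("Anyone rebuilt a Triumph gearbox lately?"))  # False
```

The point being that automatic flagging only queues things for a human; the moderators still make the call, which is exactly the 'system in place' the Act seems to ask for.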

If this new law were seen to be a threat to all and sundry, you would have seen multiple people and companies offering paid services in how to comply (similar to what happened with the Millennium Bug), as it would be a huge money maker. The fact that we don't tells you all you need to know about how the industry feels about it.
