#openweb
Posts tagged #openweb on Bluesky
🛡️ Straylight Sentinel Intelligence Report | Saturday, March 14, 2026 | 14:50 UTC | Riley. Listen to the Sentinel Brief. BLUF (B...

A little late today but I had some morning errands to run. Here is your Saturday, March 14, 2026 | 14:50 UTC Edition of The Straylight Sentinel Intelligence Brief and Podcast for all of my #cybersecurity #infosec and #openweb friends on these interwebs.

Dries Buytaert’s Analysis Shows AI Crawlers Read Far More Than They Send Back A month-long experiment by Dries Buytaert reveals how heavily AI systems crawl the open web while sending little traffic back to publishers. After making Markdown versions of every page on his personal site available to AI crawlers, Buytaert analysed...

Analysis by Dries Buytaert shows AI crawlers read far more pages than they cite.

Cloudflare logs revealed bots fetched ~1,241 pages for every citation returned in AI answer engines.
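As a rough illustration of the arithmetic behind that figure, here is a minimal sketch of the crawl-to-citation ratio. The function and the example counts are hypothetical, not Buytaert's actual pipeline or data; only the ~1,241:1 ratio comes from the post above.

```python
def crawl_to_citation_ratio(pages_fetched: int, citations: int) -> float:
    """Pages a crawler fetched per citation it sent back to the publisher."""
    if citations == 0:
        return float("inf")  # crawled heavily but never cited at all
    return pages_fetched / citations

# Made-up counts chosen to match the reported ~1,241:1 ratio.
ratio = crawl_to_citation_ratio(pages_fetched=620_500, citations=500)
print(f"{ratio:,.0f} pages fetched per citation")  # prints: 1,241 pages fetched per citation
```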

https://bit.ly/4sJDpbU

#Drupal #AI #OpenWeb #WebPublishing

This is not a conspiracy. It is commercial logic operating in plain sight. Social media platforms have disrupted the legacy media business model: advertising revenue that once flowed to newspapers and television now flows to Meta, TikTok, and YouTube. A law compelling platforms to remove millions of under-16 accounts does not just advance child protection; it also redirects children's attention and advertiser spending back toward traditional media. Online news outlet Crikey noted drily that the ban is "as much a News Corp policy as it is a government policy."


You've Been Murdoched: Australia's Teen Ban Offers a Warning for Europe www.techpolicy.press/youve-been-m...

In the UK, News UK "holds the same commercial position … a once-dominant publisher hollowed out by the platforms now being targeted."

#socialmediaban #censorship #openweb #techpolicy

Original post on mastodon.social

Manifesto for the Hashtag Commons hamishcampbell.com/manifesto-for-the-hashta... This story is now done enough to act as a tool: a framework that connects the projects, the struggles, the seeds of the #openweb still alive beneath the concrete of the #dotcons. It is a useful […]

A note on the current voices speaking for the #Fediverse

Something that's worth saying out loud: many of the people currently talking for the #Fediverse had very little to do with the generation that seeded this version. That doesn't automatically make what they say wrong. But it does mean we should be careful about building strategy around their narratives.

A lot of the early Fediverse energy came from the older #openweb traditions of hacker and #FOSS culture, experiments in federated infrastructure, and grassroots publishing networks: the long history of things like RSS feeds, blogging, and projects like #indymedia. The #Fediverse didn't appear out of nowhere; it grew from decades of experimentation with open protocols, decentralised communication, and commons-based infrastructure.

Some of the current commentators arrived after the seeds had already been planted. That's normal; every movement eventually attracts interpreters, professionalisers, and institutions. But it does mean there is a risk that the story gets rewritten in ways that lose the original lessons.

One of those lessons is simplicity. The systems that spread tend to follow a basic rule, #KISS – Keep It Simple: simple protocols, simple tools, simple ways for people to publish and connect. When infrastructure becomes complicated – governance layers, funding structures, branding strategies, endless #NGO-mediated theoretical debates – the distance grows between the actual people and the invisible elitism occupying the space and talking the loudest.

The Fediverse itself only exists because a handful of people quietly built working code and released it under #4opens licences. Communities adopted it because it worked: not because it was well marketed, not because institutions endorsed it, and not because a conference panel explained its importance.

For projects growing the #openweb, the lesson is straightforward: don't get too distracted by who is currently speaking for the ecosystem.
Look at flows: what is being built, what people have used, and what follows the basic principles of the commons. And keep things simple – #KISS is still the best guide we have.

> Yes, There Are Parasites. And Yes, There's Shit to Shovel

Stepping around the recurring #NGO voices in #openweb debates: to do this, the problem we need to compost is our lack of balance. Many of the people talking for us have done the same thing in each generation of the open web, and bluntly, their "common sense" has always failed because it is not native to the #openweb. These people have no idea that they keep circling this mess, so please try to step around them. Because they talk loudly and consistently, newcomers often assume they represent the ecosystem; they don't.

The practical lesson is simple:

* Notice them.
* Learn from the patterns of past generations.
* Step around them.

Our task is to grow native, functioning, living networks, not to repeat old mainstreaming debates that have consistently led nowhere. In other words: don't argue with the noise, build around it. Keep the focus on grassroots projects, real communities, and real trust-based infrastructure. That's how the #openweb moves forward.

> #FOSS needs to take a social lead

A note on the current voices speaking for the #Fediverse hamishcampbell.com/a-note-on-the-current-vo... For projects trying to grow the #openweb, the lesson is straightforward: Don’t get too distracted by who is currently speaking for the ecosystem.

A retro-styled presentation thumbnail titled "Reclaiming Our News Feeds: A Fast-Paced Resource Showcase for the Open Web" by Dr. Wes Fryer. The central illustration depicts a broadcast tower and satellite dishes emitting concentric signal waves, surrounded by floating digital icons such as birds, winged envelopes, and speech bubbles. The design features a mid-century modern aesthetic with a warm orange and teal color palette, sunburst rays, and stylized circuit patterns, listing the websites wesfryer.com and HealOurCulture.org at the bottom.


Check out the video from my March 5th presentation at the #Thrive2026 Democracy conference: “Reclaiming Our News Feeds” (22.5 min)
www.youtube.com/watch?v=8v4U...

More resources on:
wiki.wesfryer.com/Home/thrive2...

#openweb #SocialMedia #Fediverse #Mastodon #Flipboard #VibeCoding #BlueSky #RSS


#Creativity #ArtOverAlgorithm #OpenWeb #DigitalCapitalism #ArtistsSupportArtists
Creativity over metrics // I'm not a product
rosedreams.net ~ Web 1.0 creativity with Web 2.0 structure


I lost thousands of followers on social platforms when I stopped feeding the algorithm. I chose creativity over metrics and refused to turn myself into a product. I’m not sorry.

#Creativity #ArtOverAlgorithm #OpenWeb


Marketing should be authentic, not manipulative.
At Vibe Digital Marketing, we believe in transparency, creativity, and ethical growth. 🌍

Join the digital movement that puts people first.

#EthicalMarketing #OpenWeb #DigitalStrategy #VibeDigital #SustainableGrowth


@zenbrowser is so beautiful, innovative, and useful -- a great example of how a clear product vision can really add value even to a mature code base like #Firefox. Great work!

#opensource #innovation #openweb


CWS Bill debate yesterday hansard.parliament.uk/commons/2026...

MPs rejected a #socialmediaban in favour of giving ministers broad powers to age-gate internet services without further primary legislation publications.parliament.uk/pa/bills/cbi...

#OnlineSafetyAct #censorship #openweb #techpolicy

Kendall recently launched a consultation into banning social media for under-16s, which is expected to report in the summer.

She said on Monday that the government would seek to pass new laws after that consultation, though added this could be done without allowing MPs a chance to amend them.

Campaigners for a ban believe Keir Starmer is likely to back their cause, but worry that ministers will implement a relatively weak ban that they will not be given a chance to strengthen in parliament.

"They'll get a vote in the Commons," Kendall said, though added: "It could be secondary." Unlike government bills, secondary legislation does not allow time for MPs to amend it.


Ministers must act more quickly on deepfakes to protect women and girls, Kendall says www.theguardian.com/politics/202... (UK)

#OnlineSafetyAct #socialmediaban #censorship #openweb #techpolicy

Yep, this all sounds very … safe

Shovels, Hashtags, and Revolutions: Digging Up the Roots of the #openweb

It's obvious to everyone paying attention that the relentless push of #mainstreaming over the last forty years has not made society healthier or more stable. Quite the opposite: the result has been accelerating social disintegration and the rapid expansion of #climatechaos. If the current trajectory continues, the consequences are catastrophic. Over the next fifty years we are looking at millions dead and billions displaced by climate breakdown, ecological collapse, and the political instability that follows. Flooded cities, failing agriculture, collapsing states, mass migration – these are no longer speculative futures. They are already visible on the horizon.

What makes this situation so disturbing is not ignorance. For the last decade, the consequences have been very clear. Climate science, ecological data, and lived experience have converged into a single message: the system driving this crisis cannot continue. Yet those with the power to change course continue pushing the same policies, the same economic logic, and the same institutional inertia that produced the crisis in the first place. This is not simply failure, it is knowing failure. And that raises an uncomfortable question: when does systemic negligence become a crime?

For forty years the dominant ideology has been the worship of endless growth, deregulation, privatization, and extraction – what many people now recognize as the #DeathCult of #neoliberalism. On this path, ecosystems are treated as expendable, communities are hollowed out, and public institutions are dismantled in the name of "efficiency". The result is the hollowing-out of social structures and the destabilization of the planet itself. This isn't an accident; the evidence has been overwhelming for decades. From early climate warnings in the 1980s to the now constant stream of scientific reports and disasters, we have known where this path leads.

And yet the machine keeps running. At some point we have to confront the idea that what we are witnessing is not just bad policy but something closer to systemic criminality. When leaders, corporations, and institutions knowingly pursue actions that will cause mass death and displacement, we enter the territory of #CrimeAgainstHumanity.

The historical analogy that needs resurfacing is Nuremberg. After the Second World War, the world established that individuals in positions of power could be held legally responsible for crimes that harmed humanity as a whole. The principle was simple: "just following the system" is not a defence. Today we face a different kind of global crime – slower, more bureaucratic, wrapped in economic language – but far larger in scale. If millions die and billions are displaced because decision-makers continued destructive policies long after the dangers were clear, then, if social democracy survives, future generations will have every reason to hold those decision-makers accountable. This is not about vengeance; it's about accountability and the possibility of changing course before the worst outcomes arrive.

The tragedy is that alongside this destructive path there have always been alternatives – social, technological, and cultural. Grassroots networks, commons-based governance, cooperative systems, and the original ideals of the #openweb all point toward more resilient and humane ways of organising society. But these paths have been buried under forty years of blinded #mainstreaming, where every institution, including our own #NGO people, forces alignment with this narrow economic logic.

Digging out of this mess requires more than better technology or better policy papers; it requires collective action, memory, and courage. In other words: Shovels. Hashtags. And revolutions. Because the first step in changing the future is digging up the truth about how we got here.

#OMN #techshit #compost

Shovels, Hashtags, and Revolutions: Digging Up the Roots of the #openweb hamishcampbell.com/shovels-hashtags-and-rev... Digging out requires more than better technology or better policy papers, it requires collective action, memory, and courage.

Update - 9 March 2026

On 3 December 2025, we issued a Confirmation Decision to AVS Group Ltd, in which we imposed two single penalties for failures to comply with the Online Safety Act, consisting of:

● £1 million for its contravention of section 12 of the Act; and
● £50,000 for its contravention of section 102(8) of the Act.

The deadline for paying the single penalties has now passed. We have yet to receive payment of the penalties and are considering next steps for recovery.

We also imposed two daily penalties to reflect that these breaches were still ongoing at the date of the Confirmation Decision. In the event of continuing non-compliance, we set out that the daily penalties would continue to accrue for both breaches until either: the company took steps to come into compliance with the relevant duties; or to a maximum of 100 days for the breach of section 12 of the Act, and 60 days for the breach of section 102(8) of the Act.

After the Confirmation Decision was issued, AVS Group Ltd took steps to introduce age assurance that is capable of being highly effective at identifying whether a user is a child across all 18 of the websites under investigation. Ofcom confirmed on 8 January 2026 that the total daily penalty to be imposed for this breach, therefore, was £8,000. The deadline for payment of this element of the penalty has now passed. We have yet to receive payment and are considering next steps for recovery.

In relation to the breach of section 102(8) of the Act, the maximum accrual period of 60 days ended on 1 February 2026. As such, on 13 February 2026 Ofcom confirmed that the total daily penalty to be imposed was £18,000. In accordance with the Act, Ofcom must provide a reasonable period from the final date the penalty was incurred (being 1 February 2026) for payment to be made. In this matter, Ofcom has allowed AVS Group Ltd until 1 April 2026, by which time the penalty must be paid.

…
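The daily-penalty figures quoted above follow a simple accrual rule: a fixed daily rate that stops when the company comes into compliance or when the statutory maximum number of days is reached. This is a hedged sketch of that arithmetic only; the daily_rate value is an assumption for illustration, since Ofcom's decision states totals (£8,000 and £18,000), not rates.

```python
def accrued_penalty(daily_rate: int, days_noncompliant: int, max_days: int) -> int:
    """Total daily penalty: rate times days, capped at the statutory maximum."""
    return daily_rate * min(days_noncompliant, max_days)

# Section 102(8) breach: accrual capped at 60 days. With an assumed rate of
# £300/day, non-compliance continuing past the cap accrues the £18,000 total.
print(accrued_penalty(daily_rate=300, days_noncompliant=75, max_days=60))  # prints 18000
```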


Ofcom fines since Apr 2025 (not just OSA) www.ofcom.org.uk/about-ofcom/...

Update on tube site provider AVS Group, which has failed to pay Online Safety Act fines exceeding £1 million – Ofcom is "considering next steps for recovery" www.ofcom.org.uk/online-safet...

#censorship #openweb #techpolicy

The internet watchdog issued financial penalties on six separate companies since bringing in stricter legislation, but only one has paid up.

As part of LBC's Online Safety Day, we can reveal that over a million pounds worth of fines remain outstanding, with Ofcom bosses insisting success isn't measured through fines but instead through the "outcomes we are driving."

Ofcom has confirmed to LBC that since March last year, 30 companies have been investigated covering 96 sites, with 23 of those probes still ongoing, covering 79 sites.


Adult sites brazenly 'ignore' Ofcom fines www.lbc.co.uk/article/ofco...

We already knew this, and it's unsurprising. Itai Tech has paid up, but all of Ofcom's other targets for Online Safety Act penalties are outside the UK. The big fines are just safety theatre.

#censorship #openweb #techpolicy

The first amendment, to the Crime and Policing Bill, would empower any senior government minister to amend the Online Safety Act near unilaterally for the purposes of "minimizing or mitigating the risks of harm to individuals" presented by illegal AI-generated content.  

The second amendment, to the Children's Wellbeing and Schools Bill, looks to go even further, giving ministers the ability to alter any piece of primary legislation to restrict children's access to "certain internet services." 

The Department for Science, Innovation and Technology (DSIT) has said it wants to act "at pace" in response to the findings of its consultation, the "key focus" of which is whether to ban social media for under-16s, a policy idea which has picked up momentum in multiple countries since Australia introduced a ban at the end of last year.


Coverage of measures highlighted last week in my posts above:

UK eyes sweeping powers to regulate tech without parliamentary scrutiny www.politico.eu/article/uk-e...

("Regulating tech" = blocking content and discussion)

#OnlineSafetyAct #socialmediaban #genAI #censorship #openweb #techpolicy

Screenshot of the webpage shows it's been blocked.

I used to go to this adult site all the time for years, looking for inspiration from doujin artists like Homare and others, getting new creative ideas for my doujin artworks since I was 18. But now it's all gone for real this time... RIP 🪦🔞🇦🇺😞


E-Hentai org is officially blocked in Australia... What a shame, RIP ⚰🔞
#OnlineSafetyAct #censorship #openweb #fanworks
#doujinshi

Meta's reasoning is straightforward. Anyone who uses BitTorrent to transfer files automatically uploads content to other people, as it is inherent to the protocol. In other words, the uploading wasn't a choice, it was simply how the technology works.

Meta also argued that the BitTorrent sharing was a necessity to get the valuable (but pirated) data. In the case of Anna's Archive, Meta said, the datasets were only available in bulk through torrent downloads, making BitTorrent the only practical option.

"Meta used BitTorrent because it was a more efficient and reliable means of obtaining the datasets, and in the case of Anna's Archive, those datasets were only available in bulk through torrent downloads," Meta's attorney writes.


Uploading Pirated Books via BitTorrent Qualifies as Fair Use, Meta Argues torrentfreak.com/uploading-pi... (scroll down for links to docs from lawsuit in California)

Context: Anna's Archive en.wikipedia.org/wiki/Anna%27...

#genAI #IPlaw #fairuse #filesharing #shadowlibraries #openweb #techpolicy

"Copyright is going to be kicked down the road," said one person with knowledge of the government's planned response to a two-month consultation on how to regulate AI companies' access to copyrighted material, due to be released in the next fortnight.

Responses to the consultation did not favour any of the government's proposed models for AI use of copyrighted materials. Ministers have instead decided they need to go back to the drawing board, gathering more evidence and spending longer consulting on various options, according to two people briefed on the plans. 

People close to the process said there was now no expectation that the government would include an AI bill in the King's Speech, due in May, and instead push any decisions and new legislation into next year.


UK to delay difficult decisions on AI copyright rules www.ft.com/content/e759... (£)

2024 consultation www.gov.uk/government/c...

We've had strong signals since at least December that the #AIBill was being dropped

#genAI #IPlaw #openweb #techpolicy

Key recommendations

To achieve this, the report calls on the Government to: 

● Rule out a new commercial text and data mining (TDM) exception with an opt-out model. Mixed public messaging from the Government and an extended consultation period have undermined trust and stalled licensing and investment. The Government should, in the next year, publish a final decision on its approach to AI and copyright. In the meantime, it should set out clearly that it will not introduce a new TDM exception with an opt-out mechanism, as initially proposed in its consultation on AI and copyright.
● Close gaps in protection for identity, style and digital replicas: The Government should introduce protections against unauthorised digital replicas and harmful ‘in the style of’ AI outputs. These must give creators and performers clear control over commercial exploitation of their identity.
● Make transparency about AI training data a statutory obligation. The Government should establish a clear mandatory transparency framework for UK AI developers, as well as considering how public procurement and regulatory tools could promote compliance with UK transparency requirements by international developers.
● Create the conditions for a fair and inclusive UK licensing market. A market for licensing content for AI use is already emerging and, given its wealth of creative content, the UK is well placed to benefit. The Government should support this market to grow in a way that works for AI developers and rightsholders of different sizes. It should also back the creation and adoption of the technical tools that will support a licensing-first approach: open, globally aligned standards for rights reservation, data provenance and the labelling of AI-generated content.
● Prioritise the development and adoption of sovereign AI models. International examples demonstrate that domestically governed AI systems can offer an alternative to an overreliance on opaquely trained US-based models. …


AI, copyright and the creative industries committees.parliament.uk/committee/17... report from Lords Comms/Digital Committee

Guardian coverage www.theguardian.com/technology/2...

Copyright maximalism all the way; no concessions to balance or indeed practicality

#genAI #IPlaw #openweb #techpolicy

A note to #FOSS funders

I’ve been working at the heart of this space for more than 30 years, funded and unfunded. In that time I’ve seen hundreds of alternative tech projects start with energy and good intentions. Most of them wither on the vine; a very small number flower. After watching this cycle repeat for decades, one thing has become clear: the projects that survive and grow almost always follow a simple pattern. I call this the #4opens. Other people describe similar ideas as open source development, open governance, or commons-based development. The label doesn’t matter – the practice does. If you want to know which projects will flower and which will wither, look at the ground, not the words.

The #4opens ask four very simple questions:

* Open data – can people access and reuse the information?
* Open source – can people read, modify, and share the code?
* Open process – can people see and participate in how decisions are made?
* Open standards – can different systems interoperate and grow a wider ecosystem?

Projects that are open in all four of these ways tend to build living ecosystems. Projects that are only partially open tend to stall or collapse.

The two repeating problems. Over the years, two patterns have constantly undermined good projects:

* #geekproblem – a teenage mix of arrogance and ignorance that is surprisingly universal in tech culture. Developers assume technical elegance (and complexity) will automatically solve social problems, and they underestimate governance, community, and messy human reality.
* #dotcons – the opposite pressure: corporate platforms pushing business models that prioritise extraction and growth over human need. They happily wrap themselves in the language of “open” while building fundamentally closed systems.

Both pressures distort funding decisions. Both lead to projects that sound open but aren’t. Money is a dangerous subject: yes, funding matters, but money inside infrastructure projects too often distorts them quickly.

For #openweb work, a useful rule of thumb is: keep the core simple. Focus funding on maintaining the #4opens infrastructure, and let many different organisations, businesses, and NGOs build external services and applications on top. This keeps the core commons stable while allowing diversity and experimentation around it. It’s the #KISS principle applied to digital commons. When funding pushes too many external agendas into the core, projects become heavy, political, and fragile.

Some uncomfortable truths. Over the last decade we’ve been told several stories about security and scale that simply don’t hold up:

* There is no security in closed systems – security emerges from open scrutiny and shared responsibility.
* There is no security in radical individualism – security emerges from community.
* There is no security in “trustless” systems – real resilience grows from social trust.

These ideas have been obscured by hype cycles and by the influence of #dotcons and their shadow allies, the #encryptionists who push purely technical “trustless” thinking. Both camps wrap themselves in the language of openness, but their systems remain structurally closed. Words are wind – look at the ground: #4opens.

The unspoken scaling problem. There is also an unspoken #geekproblem around how we think about scaling. When many developers talk about #p2p, they imagine data-to-data scaling: systems optimised to move information as efficiently as possible. From that perspective, human friction looks like a problem. But if you see #p2p as human-to-human, the picture changes. Human scaling limits – smaller communities, slower processes, local trust networks – are not bugs, they are virtues that create resilience and accountability. The data-first model is the one favoured by the #dotcons. The human-first model is the one the #openweb actually needs. Funders should be aware of which philosophy a project is building around.

A simple test. If you want a quick filter when looking at proposals, ask:

* Does this project genuinely follow the #4opens?
* Does it build community and governance, not just code?
* Is it resilient without permanent central funding?
* Does it strengthen the commons, rather than a future platform?

Projects that pass these tests are the ones most likely to flower; everything else tends to wither. Food for thought.

#EU #NLnet #NGI #funding > Make some FOSS compost

For #openweb work, a useful rule of thumb is: Keep the core simple. Focus funding on maintaining the #4opens infrastructure. A note to #FOSS funders https://hamishcampbell.com/a-note-to-foss-funders/

0 0 0 0
Open

Investigation into: The provider of two image board services. Due to the nature of these services, we have decided not to name the provider and services.

Case opened: 6 March 2026

Summary: We are investigating whether a provider of two image board services has failed/is failing to comply with its duties under the Online Safety Act 2023 to:

● Complete and keep a record of a suitable and sufficient illegal content risk assessment; and
● Comply with the safety duties about illegal content and the duties relating to content reporting, complaints procedures, and terms of service which apply in relation to regulated user-to-user services.

Relevant legal provision(s): Sections 9, 10, 20 and 21 of the Online Safety Act 2023.

UK regulator Ofcom has launched a new investigation into the provider of two image board services for "failing to comply with its duties" under the Online Safety Act www.ofcom.org.uk/online-safet...

… but isn't identifying them, so ¯\_(ツ)_/¯

Probably chans I guess

#censorship #openweb #techpolicy

1 0 1 0

Why I chose @eurosky.social: 🇪🇺✨
✅ Digital sovereignty: my data now sits securely on EU servers under EU law.
✅ Full compatibility: feels exactly like before, only now “hosted in Europe”.
✅ Independence

#Eurosky #Fediverse #ATProtocol #OpenWeb

5 2 0 1

Giving @eurosky.social a try, an EU-hosted, decentralized social identity on the AT Protocol. Post, follow, move apps—your network travels with you. #OpenWeb #Eurosky

47 3 1 0

The Open Network Shout Looking for a massive digital refresh for 2026? The Winter Shop Clear Sale is live! Get 15k+ Art, 1,500+ PDFs, and 300+ Vintage Games in one Mega Bundle. No fluff, just pure value this Eid season.
ko-fi.com/blase98248/s...
#OpenWeb #DigitalAssets #WinterClearance #Eid2026 #Gaming

0 0 0 0
The Web Isn’t Free The internet isn’t truly free: we “pay” with our time, attention, and data, which fuel platforms designed to capture and manipulate us. Alternatives like Web Monetization aim to restore privacy, fairness, and control by letting users directly support creators.

What if creators could be supported without relying only on ads or subscriptions? 🌱

Web Monetization enables real-time, frictionless support, expanding revenue options while preserving privacy and openness.

Read more: https://interledger.org/news/web-isnt-free

#Interledger #OpenWeb

0 1 0 0
Consent withdrawal provisions were also added to the sweeping Crime and Policing Bill, allowing anyone who appears in adult content to withdraw consent at any time, or producers of the material could face imprisonment and fines. Initial consent to publication would be viewed as irrelevant. If consent is withdrawn, platforms and studios must comply with the request and remove the content within 24 hours of notice. Baroness Alison Levitt said she supported the sentiment behind the consent withdrawal policy but found its enforcement problematic. Levitt observed before the House of Lords, "Where content is produced legally, as with the wider film industry, the rules and regulations governing its use are usually a commercial matter to be agreed between the performer and the production company, taking into account the intellectual property framework." That framework could pose further risks to adult entertainment performers, including those based in the United Kingdom.

Parliament Motions to Ban 'Step' and Other Taboo Porn in the U.K. avn.com/news/legal/p... industry coverage of increasingly broad anti-porn/sw measures pushed by Baroness Bertin et al in the UK Parliament

Crime and Policing Bill bills.parliament.uk/bills/3938/p...

#censorship #openweb #techpolicy

3 0 1 0
In fact, the risks may increase as new Online Safety Act duties, such as proactive scanning for illegal content, pre-publication filtering, algorithmic suppression and emergency take-down powers are introduced. Automated systems cannot understand context, political nuance or fast-changing legal realities, yet they are increasingly being used to make decisions that shape what the public can speak about online. Without clear guidance, platforms will continue to over-remove lawful content to avoid regulatory consequences.

When support for a non-violent protest movement can be interpreted as support for terrorism, the boundary between dissent and criminality becomes dangerously blurred. The result is a system in which political speech can be filtered or hidden before it is even published and where individuals may face real world consequences for lawful expression. Ultimately, the High Court's judgment shows how dangerous it is to build online regulatory frameworks on legal foundations that may not withstand scrutiny.

Ofcom urged to clarify if Palestine Action content should still be removed online www.theguardian.com/world/2026/m... (UK)

Letter from ORG et al www.openrightsgroup.org/press-releas...

#OnlineSafetyAct #GazaGenocide #censorship #openweb #techpolicy

1 0 1 0
Preview
Mastodon Finally Gets a Universal Share Button for the Web Mastodon's new official share widget works across all servers with no tracking, solving a long-standing usability problem for the decentralised network.

Mastodon Finally Gets a Universal Share Button for the Web

#Mastodon #Fediverse #OpenWeb #SocialMedia #TechNews #AusNews

thedailyperspective.org/article/2026-03-02-masto...

0 0 0 0