Hacker News | duped's comments

> no more plastic for milk jugs

AIUI, most of the plastic used for this in the US comes from the ethylene byproducts of natural gas/fracking.

That said, it would be unironically fantastic if we could wean ourselves off plastics and go back to reusable glass jugs. If I could bring my growler to Kroger for milk and cream, I'd be a happy camper.


FWIW, in Germany and Switzerland milk is mostly sold in cartons. I never understood why the US stuck with plastic jugs.

Because it's cheap

This doesn't address the wider array of age-verification-related problems that people want to solve, like social media, where age verification is needed to police interactions between users.

Such censorship shouldn't exist in the first place.

I could be misunderstanding the context, but to me that sounds like a moderation issue, assuming we even want small children on social media in the first place. There should probably be a dedicated child-safe social media site that limits what communication can take place between small children and has severe punishments for adults pretending to be children for the purpose of grooming.

Moderation is like law enforcement: it doesn't prevent crimes from happening, it just punishes the people it can catch. Severe punishments already exist for the kinds of behavior I'm talking about, but unsurprisingly, they neither stop kids from being harmed nor undo the harm.

This isn't hypothetical, by the way. There are adults catfishing kids into producing CSAM [0], kidnapping and assaulting minors [1], [2], and in the most extreme case, there's a borderline cult of crazy young adults who terrorize people for fun [3].

It is a constant game of whack-a-mole for moderators/admins to keep this behavior out of online spaces where kids hang out.

I recognize that this is a "think of the children" argument, but indeed that's the point. The anonymous web was created without thinking about the children, just like how all social media was created without thinking about how it could be used to harm people. Age verification is the smallest step towards mitigating that harm.

Now I disagree very strongly with the laws proposed (and indeed, I've been writing/calling/talking with state reps about this locally, because I don't want my state's bill passed). But the technical challenge needs to address the real problems that legislators are trying to go after.

[0] https://www.justice.gov/usao-wdnc/pr/discord-user-who-catfis...

[1] https://www.nbcnews.com/news/us-news/kidnapping-roblox-rcna2...

[2] https://www.nbcmiami.com/news/local/nebraska-man-charged-wit...

[3] https://www.fbi.gov/contact-us/field-offices/boston/news/ope...


I am only interested in protecting the majority of children, which I believe my proposal more than covers. There will always be exceptions. Today, teens share porn, warez, and pirated movies and music with small children in rated-G video games. I am not proposing anything for that; it is up to businesses to detect and block such things.

Point being, there will be a myriad of exceptions. I am not looking to address the exceptions. Those can be a game of whack-a-mole as they are today. I am proposing something that would prevent the vast majority of children from being exposed to the trash we today call social media and of course also porn sites.


Look, please don't sideline/marginalize people by using the "whataboutism" term. That's being used more and more to silence dialogue from people who see problems outside the focus of a specific area. It's important that we see ALL sides of the problem.

Fair enough. Even though I don't perceive it that way, I removed it in case a majority of others have come to this conclusion.

Thank you for understanding. I know sometimes topics can get out of hand with comments about related things, but in this case we might be better off looking at all the extremes.

These aren't exceptions or whataboutism. It's the debate being had on the floors of state legislatures.

> It is up to businesses to detect and block such things.

Which is exactly why age verification legislation is hitting the books. No one (serious) cares about whether kids can download porn and R-rated movies. Parental controls already exist if the threat model is preventing access to specific content that is able to report itself as _being_ that content.

Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content. They define specifically what qualifies as an addictive stream and put the onus on service providers to assert that they're not delivering addictive streams of media to kids. An HTTP header isn't enough, because it's not about the content being shown to kids but the design patterns of how it's accessed.

Essentially: age verification isn't about porn. 18+ content stirs the pot a bit with the evangelical crowd but it's really not what people are worried about when it comes to controlling digital media access with age gates.


> Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content.

That sounds simple to me. If a type of content is addictive then require the RTA header.

- Adult content, or possible adult content.

- User contributed or generated content (this covers most of social media)

- Sites with psychological design patterns that are deemed addictive (TikTok and their ilk)

Overall, we are describing things that are harmful to the development of small children's minds. If adults wish to avoid such content, they can create a child account on their device for themselves to be excluded from this behavior as well. I use a child account in a couple of popular video games to avoid most of the trash talking and spam. I'm not hiding my age, since the games have my debit card information; rather, I opt in to parental controls.
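The RTA-header idea above can be sketched concretely. The label value below is the published RTA ("Restricted To Adults") string; the header name and the server/filter wiring are my own illustrative assumptions, not any spec:

```rust
// Hypothetical sketch of RTA-style labeling. RTA_LABEL is the published
// "Restricted To Adults" value; label_response/is_restricted and the
// "Rating" header name are assumptions for illustration only.
const RTA_LABEL: &str = "RTA-5042-1996-1400-1577-RTA";

// Server side: tag responses carrying restricted content with the label.
fn label_response(headers: &mut Vec<(String, String)>, restricted: bool) {
    if restricted {
        headers.push(("Rating".to_string(), RTA_LABEL.to_string()));
    }
}

// Filter side: a parental-control proxy or child-account browser checks
// for the label and blocks the response.
fn is_restricted(headers: &[(String, String)]) -> bool {
    headers
        .iter()
        .any(|(name, value)| name.eq_ignore_ascii_case("rating") && value.contains(RTA_LABEL))
}

fn main() {
    let mut headers = Vec::new();
    label_response(&mut headers, true);
    assert!(is_restricted(&headers));
    assert!(!is_restricted(&[]));
}
```

The point is that the burden on the site is one static header, while enforcement lives entirely in the parental-control layer.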


This is assuming children should be on social media at all, which I for one would debate.

> Sure there are many edge cases, but surely the OS and FS can just be abstracted away and you can verify that "rm .//" actually ends up doing what is expected ?

This is one reason why Windows disables symlinks by default, and it's not an abstraction but wholesale removal of a feature. Unixes can't do that without breaking decades of software that relies on their existence.

macOS does something similar; for example, the chroot() bug isn't an issue in practice because macOS forbids chroot() by default (you need to disable System Integrity Protection).

The fundamental problem is caused by the POSIX APIs. They have sharp edges by their very nature. The "fix" is to remove them.
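One of those sharp edges is easy to show, a minimal sketch assuming a Unix system: the path-based APIs only let you check and then act, and the gap between the two is a race (TOCTOU) that symlinks make exploitable.

```rust
// Sketch of a POSIX sharp edge (Unix only): std::fs exposes path-based
// check-then-act calls, and nothing stops another process from swapping
// a path for a symlink between the check and the act.
use std::fs;
use std::os::unix::fs::symlink;

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("symlink_sharp_edge");
    let _ = fs::remove_dir_all(&dir);
    fs::create_dir_all(&dir)?;
    fs::write(dir.join("target.txt"), b"hi")?;
    symlink(dir.join("target.txt"), dir.join("link"))?;

    // metadata() follows the link; symlink_metadata() does not.
    assert!(fs::metadata(dir.join("link"))?.is_file());
    assert!(fs::symlink_metadata(dir.join("link"))?.file_type().is_symlink());

    // The sharp edge: between the symlink_metadata() check above and any
    // later open/remove, another process can replace "link" with a link
    // to somewhere sensitive. Closing that window takes openat()- and
    // O_NOFOLLOW-style APIs, which these plain path-based calls don't model.
    fs::remove_dir_all(&dir)
}
```

This is exactly the class of bug that "abstracting away the FS" can't verify out of existence, because the race is between processes, not inside one program.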


Just want to point out that race conditions are a correctness problem, not a performance problem.

An accurate, a.k.a. "correct", implementation of ACID needs a single (central) source of truth and temporal serializability (or something close to that).

In practice this always "impacts" performance.

If I understand it correctly, then in physics this is called an event horizon.


Not necessarily. Most race conditions violate the `A` in ACID, but the finicky thing about atomicity is that a sequence of N > 1 actions, each atomic in and of itself, is not atomic as a whole. So any atomic store is possible to misuse by composing multiple atomic operations on it.

In addition, ACID isn't always provided by the floor beneath your programs; sometimes the programs on top are designed to uphold it (and/or not require it), allowing you to relax the constraints on your lower-level interfaces for performance reasons.


Firstly, atomicity and/or thread-safety not composing is where the Consistency and Isolation come in.

The "application layer" always has to enforce its own consistency guarantees. If the lower layers are total garbage, then the system is garbage. And the lower layers can be infinitely fast and it won't matter if the application itself has a latency floor. So optimize them all you want.


This is a people problem and Canonical just isn't good at hiring people

I’ve gotta agree. Some horror stories were going around about their interview process. It seemed highly optimized to select people willing to put up with insane top-down BS.

GitHub actions sucked and fell over itself long before vibe coding became mainstream.

Great! I'll take my money to someone else who can handle the current state of engineering instead of wasting it trying to predict the future.

Then they shouldn't be encouraging AI development tool usage.

I've never pushed a commit and thought huh, I wonder what copilot thinks of this.


So if you are building something where you control every SVG ever produced and rendered then this is totally reasonable.

If you ever need to interface with other tools that generate SVG you now need to have a way of essentially transpiling SVG from the wild into your tamed SVGs. Oftentimes this is done by hand, by a software developer and designer (sometimes the same person).

And this is for basic functionality that your designers expect and have trivial controls for in their vector editors, like "add a drop shadow."

The article goes into some issues with sanitization itself, and except for <script> these are a bunch of reasonable things that someone might expect to work or not have issues with. Sandboxing rendering isn't an unreasonable approach if you're not writing the parser and renderer yourself.


I see beginners trip over themselves when they get obsessed with allocation (never mind that they're coming from, like, Python), but in a language like Rust (or C++) you are writing programs with the intent to control how memory is managed. So it makes a lot of sense to tie memory to types as part of their semantics.

It's not without problems, but the idea is less confusing in practice than it seems.
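A minimal sketch of what "memory as part of the type's semantics" means in Rust (the `Buffer` type is made up for illustration): ownership says who frees the allocation, and moving the value transfers that responsibility, checked at compile time.

```rust
// A type whose semantics include owning a heap allocation: dropping it
// frees the memory, and moving it transfers that responsibility.
struct Buffer {
    data: Vec<u8>,
}

fn consume(b: Buffer) -> usize {
    b.data.len()
    // `b` (and its heap allocation) is dropped here, at end of scope.
}

fn main() {
    let b = Buffer { data: vec![0u8; 1024] };
    let n = consume(b); // ownership moves into `consume`
    // println!("{}", b.data.len()); // would not compile: `b` was moved
    assert_eq!(n, 1024);
}
```

The beginner-facing payoff is that "who frees this?" stops being a convention and becomes something the compiler answers for you.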

