Hacker News | ckozlowski's comments

Well said. I'll also add that with these networks, the sooner you can get traffic off your network, the better. There's strong incentive to have your datacenter near these peering points. And since MAE-East was the first, it snowballed into the largest: AOL's HQ was here, Equinix built its peering point soon after MAE-East, etc.

There's a great read about the whole area here: https://www.amazon.com/Internet-Alley-Technology-1945-2005-I...

As for AWS, I often see it repeated that the DCs are the oldest and therefore in disrepair. That's not true; many of the first ones have since been replaced. But there are services that are located here and only here.

But I'll also add, a lot of customers default to using US-East-1 without considering others, and too many deploy in only one AZ. Part of this is AWS's fault, as their new services often launch in US-East-1 and US-West-2 first, so customers go to East-1 to get the new features first.

Speaking as one who was with AWS for 10 years as a TAM and Well-Architected contributor, I saw a lot of customers who didn't design with much resiliency in mind, and so they get adversely affected when east-1 has an issue (either regional or AZ-level). The other regions have their fair share of issues as well. It's not so much that east-1 necessarily fails more than the others; it's that it has so many AZs and so many workloads that people notice it more.


> But there are services that are located here and only here

Why is that? You would think company-ending events, like IAM going poof due to it being dependent on us-east-1, would be a top priority to fix?


Oh, I feel this. As a pre-teen, I loved it. SETI@Home running on our family's Pentium 100, X-Files, and hanging out on the "Parascope" forum and chatroom on AOL for all things UFO related. An "I Want To Believe" poster on my wall.

I've (thankfully) moved on past that, but I look back at that with nostalgia.


Same here, you either move past it or go crazy, if the UFO subreddits are any indication. Given the population was already obsessed with blurry clumps of pixels in short video clips, AI is going to send a lot of those people into trailers in the desert with a lifetime supply of aluminum foil.

And now that we have mass-produced hyper-maneuverable quad-copter drones, the whole "it moved backward in a way that no aircraft ever could!" doesn't really hit as hard.


My current UFO conspiracy brain is that the UAP makers want Greenland as their home, and the US wants to get the benefits of that. Because that would be less crazy to me than what it's really about.


This is probably the reason, except the UAP makers are now us.


They saw a huge uptick in users during the COVID pandemic. Since the coronavirus has a protein shell and their software folds protein molecules, they were able to apply it to look for sites where other molecules could attach to the virus, blocking where it would normally latch onto a cell. That could then lead to treatments.

They'd found some promising results, and were working with a pharmaceutical company to manufacture the first compounds that could then be tested. Unfortunately that company's facility was located in eastern Ukraine. =(

But that aside, they've still been going strong.


I could not find mentions of any Ukraine-based company working with them. Do you have more info?


I agree, and I think the "not the first time" is key there. Setting expectations is crucial. For ours (5yo), we're clear about what he can watch and for how long. We control the device. "Two episodes before dinner" or so. Over time, he learns how this works. And we're not afraid to tell him that now isn't a good time for the TV.

It's not to say we never have any complaints over this, but when we do, it's rare and usually because something else is amiss (hungry, frazzled, tired).

But most instances it's like last night, where we were clear that we had time for two episodes of Tumble Leaf before dinner. At the end of the second one he announced "last one!" and got up off the couch as we picked up the remote.


5yo parent here. Agreed. And sometimes they just need to chill.

I agree with the overall sentiment. Too much screen time is bad. Kids need to get out and play, indoors or out. In our house, it's a lot of biking and playing with friends outside, Legos, Brio, Magnatiles, matchbox cars, or just crafts.

But sometimes they're frazzled, out of sorts, and would benefit from just being able to sit and chill.

So we'll put on something for him that we're comfortable with. Tumble Leaf, Blaze & The Monster Machines, Trash Truck, or the occasional Ghibli movie.

We do not give him a tablet or other portable device. He sits and watches on the couch, we set an expectation, and stick to that.

I think controlling the device is important. Keeping the screen as something we control and not something he carries around seems to allow us better control and helps him understand the limits in play. 90% of the time, we have no fuss.

And it's not bad. In moderation, TV can be just fine. Often it genuinely helps him soothe and relax (especially if he's been really active and engaged all day), and as you said, it helps us get something done. Two episodes of one of his favorite shows is great to help him unwind while we're making dinner.

But we keep time/episode limits as well, and that seems to keep things in balance along with the aforementioned things.


Seconding this. We've made Daddy Mix Tapes, "Mommy Reads Stories", and other compilations.

Adding to the plethora of good ideas here: My wife bought these hanging tabs to stick onto the cards[1], and then strings a keycable[2] through them so my son has groups of them together. Yoto makes folding binders for them as well, but the keycable method seems to be a bit easier for our 5yo to handle.

1. https://www.amazon.com/dp/B0B2JL79PY

2. https://www.amazon.com/dp/B06XXFZHJQ


Many, but not all. There were Coppermine derivatives eventually: https://www.cpu-world.com/CPUs/Celeron/TYPE-Celeron%20(Coppe...


Thanks for looking up the numbers!

That would be quite the "budget" SMP build. The 366MHz "Mendocino" was based on the prior Pentium II core I believe. So quite the disparity in single-threaded workloads.


Yes, because there weren't really CPUs then that had double the performance.

Celeron CPUs were usually CPUs that shared the same core architecture as the current Pentium standard, but often had a lower core clock speed, lower core memory speed, and/or had smaller L2 caches.

Workloads have different constraints, however, and simply doubling cache, clock speed, or memory bandwidth doesn't necessarily double performance, especially when running more than one application at once. Keep in mind, this is the Windows 98/NT/2000 era here.

Symmetric multi-processing (SMP) could be of huge benefit however, far more than simply doubling any of the above factors. Running two threads at once was unheard of on the desktop; SMP was usually reserved for higher-binned parts, like full-fledged Pentium workstations and Xeons (usually the latter). Were two Celerons cheaper than a single fast CPU? Probably not in all cases (it depended on speeds). But Abit's board gave users an option in between a single fast Pentium and a far more expensive professional workstation: a pair of cheaper CPUs for desktop SMP. And that was in reach of more people.

In short, two Celerons were probably more expensive than a single fast Pentium, but having SMP meant being able to run certain workloads faster, or more workloads at once, at a time when any other SMP system would have cost tons.


>Celeron CPUs were usually CPUs that shared the same core architecture as the current Pentium standard, but often had a lower core clock speed, lower core memory speed, and/or had smaller L2 caches.

This had an interesting side effect: Celerons of that era overclocked extremely well (stable 300 -> 500MHz+), due to the smaller and simpler on-die L2 cache relative to the Pentiums of the era, whose L2 cache was much larger but had to be off-die (and less amenable to overclocking) as a result.

An overclocked dual Celeron could easily outperform the highest-end Pentiums of the era on clock-sensitive, cache-insensitive applications, especially those designed to take advantage of parallelism.


IIRC the Celeron's cache was actually faster because it was on-die; the Pentiums mitigated this by having more of it. It seemed like in games the faster cache performed better.

Another thing that helped the Celeron overclocking craze is that Intel damaged the brand badly out of the gate. The original Celerons had no cache at all, performed terribly, and took a beating in PC reviews. So even though the A variants were much better, the brand still carried that stink.

The thing that probably helped the Celeron the most with overclocking, though, was that Intel gimped them with only a 66MHz front-side bus. Since the multiplier was locked, you had to raise the bus speed to push the CPU clock up, and that was an advantage for overclockers: you could buy a capable motherboard and run it at a stable 100MHz. You'd have far more system-wide problems trying to push a Pentium's 100MHz bus higher.
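The bus-times-multiplier arithmetic above can be sketched in a few lines (a hypothetical helper; the 300A's 4.5x multiplier on a ~66MHz bus is the classic example, and the figures here are illustrative):

```python
# Core clock = front-side bus (FSB) frequency x locked multiplier.
# With the multiplier locked, raising the FSB is the only lever.
def core_clock(fsb_mhz: float, multiplier: float) -> float:
    """Return the resulting core clock in MHz."""
    return fsb_mhz * multiplier

stock = core_clock(66.6, 4.5)       # ~300 MHz as sold (Celeron 300A)
overclocked = core_clock(100, 4.5)  # 450 MHz on a board stable at 100 MHz

print(f"stock: {stock:.0f} MHz, overclocked: {overclocked:.0f} MHz")
```

The same jump on a Pentium already running a 100MHz bus would have required pushing the bus well past what chipsets and peripherals of the day tolerated, which is why the Celeron's "gimped" 66MHz bus turned out to be headroom.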


Yeah; mine ran very stable at 466MHz for more than a decade. It was impressive.

You could attempt to head toward ~700MHz, but I never could keep it stable there.


That was a bit of a two-edged sword, as the heavily overclocked Celerons would benchmark extremely well but be somewhat disappointing in actual applications due to the lack of cache space. It was right at the start of the era when cache misses became the defining factor in real-world performance. CPUs ran ahead of DRAM and it has never caught back up, even as per-core CPU performance plateaued.


Going from a single CPU to a dual CPU would, in theory, double performance _at best_. In other words, only under workloads that supported multithreading perfectly.

But in the real world, the perceived performance improvement was more than a doubling. The responsiveness of your machine might seem 10 or 100x improved, because suddenly that blocking process is no longer blocking the new process you're trying to launch, or your user interface, or whatever.
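The "double at best" ceiling above is just Amdahl's law with two processors; a quick sketch of the formula (a hypothetical helper, not anything from the thread):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the work and n is the CPU count.
def amdahl_speedup(parallel_fraction: float, n_cpus: int) -> float:
    """Theoretical speedup of a workload on n_cpus processors."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cpus)

# A perfectly parallel workload on two CPUs doubles throughput...
print(amdahl_speedup(1.0, 2))  # 2.0
# ...but one that is only half parallelizable gains far less.
print(amdahl_speedup(0.5, 2))  # ~1.33
```

The perceived 10-100x responsiveness gain isn't captured by this formula at all; it comes from latency (a stuck process no longer starving everything else), not throughput.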


One thing I've noticed is that the phrase "CPU hog" has faded from common usage.


Very interesting observation. Multicore systems have been fairly standard for the last 10+ years, and while you occasionally notice a misbehaving process hog an entire core, it never visibly impacts system performance because there are still several other idle cores, so you don't notice said "hogs."

It's much rarer to see misbehaving multithreaded processes hog all of the cores. Perhaps most processes are not robustly multithreaded, even in 2025. Or perhaps multithreading is a sufficiently complex engineering barrier that highly parallelized processes rarely misbehave, since they are developed to a higher standard.


> Multicore systems have been fairly standard for the last 10+ years, and while you occasionally notice a misbehaving process hog an entire core, it never visibly impacts system performance because there are still several other idle cores, so you don't notice said "hogs."

Except on Windows laptops, where, although the computer is idle, your favourite svchost.exe will heat your system and trigger thermal throttling.


100%. It's common for non-technical users to complain their laptop is faulty because it gets hot and the battery drains very quickly. They have no concept of a runaway process in a hard loop causing this.


I didn't either!

Granted, I wasn't good at video games in general. And this one infuriated me, because I loved it. I could easily beat the first level, but then I crashed on carrier landing. This happened for years. I only ever saw the first level of this game.

Then one day, while staying at my elementary afterschool sitter's house, one of the kids there told me he played Top Gun as well. He could land, but wasn't very good at the rest of the game.

A plan was formed.

The next day, I brought the cartridge over, and we settled in. I'd play the level, then hand him the controller, at which point he'd plant it on the deck. Rinse and repeat. Top Gun and Top Gun: The Second Mission didn't have too many levels (6, maybe?), and I don't think it took us too long to beat. Neither one of us had seen much of the game. But working together, we beat both in a matter of hours.

I still look back on that as one of the few NES games I finished without codes or a Game Genie, just the help of a friend. =D


The blog says that failing to land on the carrier didn't actually fail the mission. Maybe you're misremembering? I just remember this game being so frustrating that I never replayed it.


Entirely possible I misremembered. As another commenter pointed out, it might be that I never got past mission 2. On further recollection, I think it was just Top Gun: The Second Mission that I owned. I remember playing both, but it might have been the second that vexed me the most.


I also clearly recall it failing the mission. Could there possibly have been different versions of the game? I've heard of Nintendo distributing slight variations of the same game based on region back in those days; perhaps that's what's going on?


You did lose a life; so if you failed the landing, then the in-air refuel, then the landing again, it was game over at the end of the second level.

