Foreign investment isn't fake growth and money being spent in the country is definitely a good thing. It's how Singapore managed to kickstart its economy in the 1960s. Lee Kuan Yew tried very hard, and succeeded, in getting foreign corporations to set up shop in Singapore. The key is to capture value and move up the chain over time rather than getting stuck as a "cheaper back office".
Yep, and today the situation is completely reversed: through acquisition and business development, Singapore is now the country that owns the brands and invests in other countries. Poland just needs to stick to the formula. Its citizens are building world-class professional, managerial, and business-development experience, and soon (if not already) those employees will start itching to build their own businesses. Poland just needs to maintain a competitive environment and not let international companies suppress local startups by lobbying for anti-competitive laws and policies that favor the big guys, foreign or domestic. If it wants to give local companies a leg up, it should do so indirectly by investing in education and research.
I remember reading about this 4 years ago as the new Chris Lattner project and was super excited, though a little skeptical.
I think that nowadays with vibe/agentic coding, high performance Python-like languages become ever more important. Directly using AI agents to code, say, C++, is painful as the verbose nature of the language often causes the context window to explode.
I was thinking about doing the opposite for the common task of "SVG of a pelican riding a bike". Obviously, directly spitting out the SVG is gonna be bad. But image gen can produce a really stunning photorealistic image easily. Probably a good way to get an LLM to produce a decent bike-pelican SVG is to generate an image first and then get the model to trace it into an SVG. After all, few human beings can generate SVG works of art by just typing out numbers into Notepad. At the core of it, we still rely on looking at it and thinking about it as an image.
Nowadays some parents are going back to cloth diapers. Apart from the obvious environmental aspect, there's the idea that ultra-absorbent, comfy diapers disincentivize babies from signalling that they are about to poop. Apparently, babies can communicate when they need to go even quite early on, in what's called "elimination communication". This also makes them a lot easier to potty train later on.
I used cloth diapers to great effect with my two kids. We'd use disposable ones when going out, but for around the house (and at daycare, bless them!) we were able to use cloth. I think we saved a pile of money, and yes, they were both trained pretty early.
Nobody wants them, even free... I guess I'll just throw them all out eventually. I've offered them to new parents and they're all horrified by the concept.
Our baby was capable of sending these signals when she was only a few weeks old, so she does most of her pees held over the sink. This saves so many diapers, it's crazy. And it's much more comfortable for her to never have a wet butt, not even for a minute. Would recommend!
I think within the next few months we can actually get her to go to the potty by herself. She’s 15 months now.
This industry wasn’t just good. It also destroyed babies’ sensitivity to soiling.
Agreed - like the sibling comment, we also used cloth diapers with our two kids. They were actually great. The ones we had were basically two-part construction: there was an outer shell with adjustable snaps for appropriate sizing, and an inner liner that absorbed the moisture. Both were easily washable. Like other parents we knew who did this, we added a small hand-held sprayer / bidet wand to one of our toilets and used it to hose off the diaper and liners. We would then toss them in the washing machine. I think these also provided more cushion for the kids’ bottoms and they both ended up sitting and scooting on them pretty fast. Also like the others here, we used disposables on the go / on vacation. Just my two cents, but we loved our cloth diapers.
We had cloth diaper service for our two children, where they'd deliver a huge stack of nice soft thick cotton squares, and take away the dirty ones, once a week. They barely smelled, especially in the beginning before solid foods start. They were excellent as burpy cloths on the shoulder too. Disposable diapers were more excellent for outside, and at later times for sleeping through the night when we realized that the absorbency was better for sleep. We definitely felt better about the environment with the reusable cloth ones.
That reminds me of Paquerette Down the Bunburrows [1] which is a very fun pathfinding game where the bunnies will pathfind to try to run away from you. It's not exactly what you described, but it is very fun and surprisingly deep and challenging.
That brings up an interesting issue, which is that many systems do have more noise in y than in x. For instance, time series data from an analog-to-digital converter, where time is based on a crystal oscillator.
Well yeah, x is specifically the thing you control, y is the thing you don't. For all but the most trivial systems, y will be influenced by something besides x, which will be a source of noise no matter how accurately you measure. Noise in x is purely due to setup error. If your x noise were greater than your y noise, you generally wouldn't bother taking the measurement in the first place.
You could, and maybe sometimes you would, but generally you won't. If at all possible, it makes a lot more sense to reduce the x noise, either with a better setup or by changing your x to something you can control more precisely.
(Generalized) linear models have a straightforward probabilistic interpretation -- E(Y|X) -- which I don't think is true of total least squares. So it's more of an engineering solution to the problem, and in statistics you'd be more likely to go for other methods such as regression calibration to deal with measurement error in the independent variables.
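To make the OLS-vs-TLS distinction concrete, here's a minimal numpy sketch on simulated data (the data, noise levels, and seed are all made up for illustration). OLS minimizes vertical residuals and its slope is attenuated toward zero when x itself is noisy; TLS minimizes perpendicular residuals, and can be computed from the SVD of the centered data matrix, where the singular vector with the smallest singular value is the normal to the fitted line.

```python
import numpy as np

rng = np.random.default_rng(0)

# True relationship y = 2x + 1, with noise in BOTH x and y.
n = 200
x_true = np.linspace(0, 10, n)
x = x_true + rng.normal(0, 0.5, n)
y = 2 * x_true + 1 + rng.normal(0, 0.5, n)

# Ordinary least squares: minimizes vertical residuals only.
slope_ols, intercept_ols = np.polyfit(x, y, 1)

# Total least squares: minimizes perpendicular residuals.
# The right singular vector with the smallest singular value of the
# centered data matrix is the normal (a, b) to the fitted line.
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, Vt = np.linalg.svd(X, full_matrices=False)
a, b = Vt[-1]
slope_tls = -a / b
intercept_tls = y.mean() - slope_tls * x.mean()

# OLS is biased toward zero in expectation when x is noisy;
# TLS is unbiased when the x and y noise variances are equal.
print(slope_ols, slope_tls)
```

Note that this TLS-via-SVD trick assumes equal noise variance in x and y; with unequal variances you'd rescale the axes first.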
Is there any way to improve upon the fit if we know that e.g. y is n times as noisy as x? Or more generally, if we know the (approximate) noise distribution for each free variable?
> Or more generally, if we know the (approximate) noise distribution for each free variable?
This was a thing 30 odd years ago in radiometric spectrometry surveying.
The X var was the time slot, a sequence of (say) one-second observation accumulation windows; the Yn vars were 256 (or 512, etc.) sections of the observable ground gamma-ray spectrum: many low-energy counts from the ground (uranium, thorium, potassium, and associated breakdown daughter products), and some high-energy counts from the infinite cosmic background that made it through the radiation belts and atmosphere to near-surface altitudes.
There was a primary NASVD (Noise-Adjusted SVD) algorithm (a simple variance adjustment based on expected gamma event distributions by energy level), and a number of tweaks and variations based on how much other knowledge seemed relevant (broad-area geology, radon expression by time of day, etc.).
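The core NASVD idea can be sketched in a few lines of numpy. This is my reconstruction, not the surveying industry's actual implementation: counts in each spectral channel are assumed Poisson, so dividing each channel by the square root of its mean count equalizes the noise variance; an SVD then concentrates the real spectral structure in the first few components, and truncating the rest smooths the spectra. The synthetic spectra and component count below are invented for illustration.

```python
import numpy as np

def nasvd_denoise(spectra, n_components):
    """Noise-adjusted SVD smoothing of count spectra.

    spectra: (n_windows, n_channels) array of raw counts, one row per
    accumulation window. Assumes Poisson counting noise, so each
    channel is scaled by 1/sqrt(mean counts) before the SVD.
    """
    mean_spectrum = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean_spectrum, 1e-12))
    whitened = spectra / scale                  # equalize noise variance
    U, s, Vt = np.linalg.svd(whitened, full_matrices=False)
    s[n_components:] = 0                        # keep dominant components
    return (U * s) @ Vt * scale                 # reconstruct and unscale

# Synthetic example: three spectral shapes mixed over time, Poisson noise.
rng = np.random.default_rng(1)
channels = np.arange(256)
shapes = np.stack([np.exp(-(channels - c) ** 2 / 200) for c in (40, 120, 200)])
mixing = rng.uniform(0.5, 2.0, (500, 3))
clean = 50 * mixing @ shapes
noisy = rng.poisson(clean).astype(float)
denoised = nasvd_denoise(noisy, n_components=3)
```

The denoised spectra should sit much closer to the underlying clean rates than the raw counts do, since the counting noise is spread across all 500 components while the signal lives in just three.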
Yeah, you can generally "whiten" the problem by scaling it along each axis until the variance is the same in every dimension. What you describe corresponds to x and y having a covariance matrix like
[ σ², 0;
0, (nσ)² ]
but whitening also works in general for any arbitrary covariance matrix too.
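For the simple diagonal case above, whitening is just one division: scale y by 1/n so both axes have equal noise, do the perpendicular fit (here via SVD), and map the slope back. A hedged numpy sketch, with made-up data and noise levels:

```python
import numpy as np

def whitened_tls_slope(x, y, n):
    """Perpendicular-residual slope fit when y's noise std is n times x's.

    Divides y by n so both axes have equal noise variance, fits the line
    via SVD (smallest singular vector = normal to the line), then undoes
    the scaling. Assumes a diagonal covariance diag(sigma^2, (n*sigma)^2).
    """
    yw = y / n                              # whiten: equalize noise std
    X = np.column_stack([x - x.mean(), yw - yw.mean()])
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    a, b = Vt[-1]                           # normal to fitted line
    return (-a / b) * n                     # slope back in original units

rng = np.random.default_rng(2)
x_true = np.linspace(0, 10, 500)
x = x_true + rng.normal(0, 0.3, x_true.size)
y = 2 * x_true + 1 + rng.normal(0, 0.9, x_true.size)   # y is 3x noisier
slope = whitened_tls_slope(x, y, 3.0)
```

For a full (non-diagonal) covariance matrix you'd multiply the data by the inverse Cholesky factor of the covariance instead of a per-axis scale, but the idea is the same.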
So maybe that's why OP didn't realize it had already been posted recently. With the older scheme, not only was SEO bad, but it was really hard to remember which date corresponded to which blog post, and people could easily brute-force search for my hidden (unpublished) blog posts.