Hacker News — perks_12's comments

I did. I keep my charging cable close, or basically always attached. Battery life is abysmal. No idea why, had other Linux-based laptops before. Battery life was never good, but never this bad.

Just give us the option to get the quality back, Anthropic. I get that even a $200 subscription may not be sustainable forever, but give us the option to subscribe to a $1000 tier, or tell us to use the API, but give us some consistency.


This. I get much more value than 90€ from my Claude Code subscription. I am willing to pay more for consistency and not having to watch my back all the time, because I might get screwed over.


[flagged]


Can a druggie stop using when the quality gets too poor? I get your analogy, but it doesn't apply here.


[flagged]


The parallel druggies are the AI companies, who want to quit burning cash but realize their users are all addicted to 40k GPUs that cost hundreds of dollars a month to use, and there's no way to train a SOTA model better and guarantee better efficiency; so you promo double tokens as cover for a quant downgrade while publishing a reskinned "upgrade" as super killer AI, hoping some B2B customer will take a hit off the crack pipe.

</tinfoil>


I am not familiar with time series models, but judging from your answer, it would be necessary to feed long time series into this model for it to detect trends. What is a token here? Can it, for lack of a better example, take in all intraday movements of a stock for a day, a week, a month, etc.?


I tend to avoid time series forecasting when I can help it because I find it hard to communicate to stakeholders that a neural network (or another method) is not an oracle.

If you are talking about the granularity of observations, it would depend on what you are trying to predict (the price in an hour or the price in 12 months?) and how quickly you need the prediction (100ms? Tomorrow morning?). If I had infinite data I would treat granularity as a hyperparameter and tune it to a level that produced the best test results.

I am for example currently using weekly averages for non-price data forecasting. I could use daily data but weekly is absolutely adequate for this purpose.
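To make the "granularity as a hyperparameter" idea concrete, here's a minimal sketch in pandas. Everything in it is hypothetical: the synthetic random-walk series, the naive last-value forecast standing in for a real model, and the specific backtest split are all illustrative assumptions, not anything from the comments above.

```python
import numpy as np
import pandas as pd

# Hypothetical daily series (a random walk) standing in for real data.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=365, freq="D")
series = pd.Series(np.cumsum(rng.normal(size=365)), index=idx)

def backtest_error(s: pd.Series) -> float:
    # Naive one-step-ahead forecast (carry the last value forward) as a
    # stand-in for a real model; score on the final 20% of observations.
    split = int(len(s) * 0.8)
    preds = s.shift(1).iloc[split:]
    actual = s.iloc[split:]
    return float(np.mean(np.abs(actual - preds)))

# Treat the resampling frequency like any other hyperparameter:
# daily, weekly, and month-start averages, each backtested the same way.
scores = {freq: backtest_error(series.resample(freq).mean())
          for freq in ["D", "W", "MS"]}
best = min(scores, key=scores.get)
```

In a real setup you would swap the naive forecast for your actual model and use a proper rolling-origin backtest, but the selection loop over granularities stays the same.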


You can use lightgbm with appropriate feature engineering.
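For anyone wondering what "appropriate feature engineering" looks like here: gradient-boosted trees like lightgbm don't see time directly, so you typically turn the series into a tabular problem with lag, rolling, and calendar features. A hypothetical sketch (the synthetic data, lag choices, and column names are all illustrative assumptions):

```python
import numpy as np
import pandas as pd

# Hypothetical daily series turned into a supervised learning table.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=200, freq="D")
df = pd.DataFrame({"y": np.cumsum(rng.normal(size=200))}, index=idx)

for lag in (1, 7, 14):                    # autoregressive lag features
    df[f"lag_{lag}"] = df["y"].shift(lag)
# Rolling mean computed on shifted values so no future data leaks in.
df["roll_mean_7"] = df["y"].shift(1).rolling(7).mean()
df["dow"] = df.index.dayofweek            # calendar feature
df = df.dropna()                          # drop rows with undefined lags

X, y = df.drop(columns="y"), df["y"]
# model = lgb.LGBMRegressor().fit(X, y)   # requires `pip install lightgbm`
```

The shift-before-rolling detail is the part people usually get wrong; without it the model trains on information it won't have at prediction time.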


Using many different models, just not NN for this particular application.


I know The Hustle is a HubSpot content factory, but I've got to admit they've been capturing more and more of my reading and (YouTube) watch time recently. They cover fascinating topics and seem to be very well researched.


I google a lot (or rather, Kagi). I loved to explore the web when I was younger. But over time I lost any interest in trying to gather informational bits from increasingly shitty websites designed to cram in more ads and hide relevant information behind as many ad slots as possible. These days I hit the quick answer button inside Kagi more often and just accept that I might get some false information. If it is critical to be right, I usually consult primary sources directly anyway.


> These days I hit the quick answer button inside Kagi more often

Just in case you didn't know: you can append ? to any query and get a quick answer straight away.


Actually, I didn't know. Thanks!


I've been using Linux on my devices for quite some time now. I was pleasantly surprised when I had to start 4k video editing work and could just use Davinci Resolve. 2026 might not be the year of the Linux desktop, but it's getting better day by day.


Hosting for their documentation would only be a noteworthy amount if they chose to host on Vercel. Other than that it's a Hetzner box at $100 per month tops.


Couldn't this be laid out as: "We assume scraping and parsing liability unless it is ruled illegal, in which case your use would be illegal and our liability shield wouldn't help you anyway"?


You actually get more: for the 1500€ you don't have to mess with sticks yourself. They solder it directly onto your mainboard. If that isn't great service, I don't know what is.


I don’t know why you are getting downvoted. The sarcasm in the comment is pretty obvious.


I'm not sure if it's sarcasm or not, but either way it's just true


This whole discussion is weird. The ETSC-linked sources do not make any statements regarding vehicle size or US car standards. They just claim that European standards 'supported' fewer deaths.

I am European, I don't think big trucks are particularly well supported by our road systems but I don't think we need to look at American car standards to get the next 10x reduction in traffic-related deaths.

IMHO it is inexplicable that in 2025 there are still cars sold without LIDAR-based anti-collision systems. How are these still an extra? Blind-spot warning systems are available yet not mandatory.

This reads like the classic Western-world strawman to me. Instead of looking at how to improve things, we just make sure things don't get worse, by burning a strawman: in this case trucks from the US, which are best described as a niche market over here. But now that we have a newly defined enemy, we don't have to confront our shitty carmakers about technological advancement.

These people do not care about human lives, they care about politics.


You can take one side's complaints about "trucks" or "immigrants", swap the word, and sell it directly to the other side.

It’s 95% a political football; the other 5% is people actually concerned about the issue.


One of the points was that European manufacturers will start making more cars in the US purely because it is cheaper to do so due to the lower bar. Why would we want that? Our market is quite big anyway, and this agreement is an attempt to shoulder their way into it without the sacrifices that local manufacturers are subjected to. Besides, cars from the US can already be bought and imported.



