Hacker News | aggregator-ios's comments

If I am understanding this correctly, the $99/year Apple Developer Program allows you to notarize applications for macOS so users do not receive the Gatekeeper warning / "damaged" binary dialog. I simply had AI generate the signing code, and you can run that script on any CI/CD system or on your own machine and push the artifacts to a CDN. It works wonderfully for macOS, and users of my app have had no issues with it.

Let me know by replying here if you want me to share the build+sign code or have any questions.
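Not the commenter's actual script, but a minimal sketch of the usual sign-then-notarize-then-staple CI steps, written in Python driving Apple's real CLI tools (`codesign`, `ditto`, `xcrun notarytool`, `xcrun stapler`). The identity string, keychain profile name, and paths are placeholders you'd swap for your own:

```python
import subprocess

def sign_and_notarize(app_path: str, identity: str, profile: str,
                      zip_path: str = "app.zip", dry_run: bool = False):
    """Build (and optionally run) the codesign -> notarize -> staple steps."""
    steps = [
        # Sign with the hardened runtime, which notarization requires.
        ["codesign", "--force", "--options", "runtime",
         "--sign", identity, "--timestamp", app_path],
        # Zip the bundle for upload (ditto preserves the bundle structure).
        ["ditto", "-c", "-k", "--keepParent", app_path, zip_path],
        # Submit to Apple's notary service and wait for the verdict.
        ["xcrun", "notarytool", "submit", zip_path,
         "--keychain-profile", profile, "--wait"],
        # Staple the ticket so Gatekeeper can verify offline.
        ["xcrun", "stapler", "staple", app_path],
    ]
    if dry_run:
        return steps
    for cmd in steps:
        subprocess.run(cmd, check=True)
    return steps
```

With `dry_run=True` it just returns the command lists, which makes the step order easy to inspect in CI before wiring up real credentials (stored beforehand with `xcrun notarytool store-credentials`).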


I'm definitely interested in hearing more about this. Please share more details about how you got it working.

I would’ve read what the site was about if I hadn't gotten the most complicated cookie consent modal. Just backed out and won’t be visiting it now.

FWIW I did not see a cookie modal. Most likely it was blocked by uBlock Origin's Annoyances filters. You should give it a try, it fixes a lot of this crap.

I usually don't look at the details of the cookies, but this one is insane.

"Necessary cookies help make a website usable by enabling basic functions like page navigation and access to secure areas of the website. The website cannot function properly without these cookies." Includes: Eventbrite, Google, LinkedIn, Shopify, Stripe, NY Times, and more.

Goodbye.


Yes, there are _40_ "necessary" cookies which you can’t deny. Forty. It’s basically an ad in disguise.

Yeah, one of the most obnoxious cookie walls. You can visit these links directly though :)

kokaachi.com

www.maachis.art

harshitagrawal.com

map-india.org/matchbox-momentos


I was introduced to the 83 Plus and it was simply the most mindblowing device at the time. We were given a sheet to share with our parents on why it was an important device to own/borrow. Several friends and I would trade apps over the TI-Link cable; we would play games and write software for it, and there was even a popularity ranking at school based on whose programs were installed on the most calculators.

For a lot of people it was their introduction to TI-BASIC, which was quite capable, and for others you could get into assembly, which allowed for more powerful applications. There were two parts to the memory: BASIC programs lived in regular RAM, which could be easily erased, and Flash Apps lived in a separate Flash region.

I later upgraded to the 89, which had a faster CPU and a higher-resolution screen, and it was phenomenal in helping me understand every single math class, including EE/EECS. It made me sad to see them banned in exams, because having an 83+/89/any calculator was in no way helpful in any of the exams I took; it was more of a "control the students" thing in college. The math department determined that because they couldn't prove students weren't hiding internet-capable portable PCs inside their calculators, they couldn't guarantee the fairness of the exams.

A weird argument to make, given that a 20-year-old student who could engineer a fully internet-capable PC into a calculator would, at the time, have been the envy of the world (and of every engineering program).

This all depends on the quality of the education: instead of handing out problems that require rote memorization of the methods to solve an equation, have students derive or figure out the equation themselves after understanding the problem. After that, they're free to use the calculator to "plug and chug".


Was the card $10 or are you saying that the chip is a $10 part?


For those that read the article and are still confused (as I was) about what Apple hardware would give you the full 10GbE speeds:

- 10GbE Thunderbolt adapter is still the best. Full symmetrical 10GbE on laptops as far back as the 2018 MacBook Pro 13" (Intel) and every laptop since. Including the Airs starting with the M1 chip (Not sure about Neo).

- No Apple hardware supports the USB 3.2 Gen 2x2 standard (20Gbps), so your connection will be downgraded to 10Gbps on these RTL8159 chips. Because of processing overhead, you will only get 5-7Gbps of actual Ethernet throughput.

- An upgraded Mac mini, or any base-model Mac Studio, has a built-in 10GbE port.

For now, Thunderbolt adapters are still the most reliable way to get 10GbE on Apple laptops.


> 10GbE Thunderbolt adapter is still the best. Full symmetrical 10GbE on laptops as far back as the 2018 MacBook Pro 13" (Intel) and every laptop since. Including the Airs starting with the M1 chip (Not sure about Neo).

The Neo doesn't have Thunderbolt at all, so no, that won't fly.


Luckily, I suspect the intersection of the Venn diagram between Neo buyers and those wanting/needing 10GbE isn't huge.


Yup.


Thinking about it, it would be pretty magical: a Neo with 10GbE to fast storage, plus its CPU and GPU. A thin client that's pretty damn thick for how thin it is.


Thank you, I was suspecting the same but was not sure.


I'm building https://jsonquery.app to query, store, and organize JSON, with natural-language querying so you don't have to learn yet another query language.

Also building BetterGit (https://www.satishmaha.com/BetterGit/): a simpler, cross-platform Git GUI where all the commonly used actions are right in front of you.

And also Crush Depth: a remake of a tower defense game from 13 years ago, for Apple's platforms (iOS, iPadOS, and macOS). Check out the TestFlight: https://testflight.apple.com/join/gkD5c2U1


Love it! I was instantly hooked and it's a lot of fun to play. And it's so smooth and easy to pick up!


I tested the E2B and E4B models, and they get close but produce inaccurate (non-working) results when generating jq queries from natural language.

This is of importance to me as I work on https://jsonquery.app and would prefer to use a model that works well with browser inference.

gemma-4-26b-a4b-it and gemma-4-31b-it produced accurate results in a few of my tests, but those are 50-60GB in size. Chrome has a developer preview that bundles Gemini Nano (under 2GB); it used to work really well, but it requires a few flags to be manually enabled, and its quality has recently gotten worse in my jq-generation tests.


Same, I quickly tested it for code gen and it produced mostly good code for simple problems, but it sometimes hallucinated words in non-English scripts inside the code.


What CRDTs solve is conflicts at the system level, not at the semantic level. Two or more engineers setting a variable to different values cannot be meaningfully handled by a CRDT:

Engineer A intended value = 1

Engineer B intended value = 2

CRDT picks 2

The outcome could be semantically wrong. It doesn't reflect the intent.
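The "picks 2" behavior above is the classic last-writer-wins (LWW) register. A toy sketch (timestamps and actor names are made up) shows why the merge is deterministic but intent-blind:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LWWRegister:
    """Last-writer-wins register: merge keeps the write with the highest
    (timestamp, actor) pair. Deterministic, but it silently discards
    the losing writer's intent."""
    value: int
    timestamp: int
    actor: str

    def merge(self, other: "LWWRegister") -> "LWWRegister":
        # Tie-break on actor id so all replicas converge in any merge order.
        if (self.timestamp, self.actor) >= (other.timestamp, other.actor):
            return self
        return other

a = LWWRegister(value=1, timestamp=100, actor="engineer-a")
b = LWWRegister(value=2, timestamp=101, actor="engineer-b")

assert a.merge(b).value == 2   # replicas converge on 2...
assert b.merge(a).value == 2   # ...regardless of merge order,
# but engineer A's intended value=1 is gone without any conflict mark.
```

Both replicas agree on the final value, which is all the CRDT guarantees; whether that value is semantically right is outside its scope.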

I think the primary issue with git and every other version control system is the terrible names for everything: pull, push, merge, fast-forward, stash, squash, rebase, theirs, ours, origin, upstream, and that's just a subset. And the GUIs are all very confusing, even to engineers who have been doing this for a decade. On top of this, conflict resolution is confusing because you don't get any prior warning.

It would be incredibly useful if, before you were about to edit a file, the version control system warned you that someone else has already made changes to it or is actively working on it. In large teams, this sort of automation would reduce conflicts, as long as humans agree not to touch the same file. It would also reduce the number of quality regressions that result from bad conflict resolutions.
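At its core, such a warning is just a set intersection: the files you are about to touch versus the files changed upstream or claimed by teammates. This is a hypothetical helper, not a feature of git or any existing VCS; the file names and "claims" feed are invented for illustration:

```python
def files_to_warn_about(my_edits, upstream_changes, active_claims):
    """Return the files a user should be warned about before editing:
    those already changed upstream or actively being worked on by others."""
    risky = (set(upstream_changes) | set(active_claims)) & set(my_edits)
    return sorted(risky)

warnings = files_to_warn_about(
    my_edits=["app/auth.py", "app/ui.py"],
    upstream_changes=["app/auth.py", "README.md"],  # e.g. from a background fetch
    active_claims=["app/ui.py"],                    # e.g. teammates' "I'm on it" signals
)
assert warnings == ["app/auth.py", "app/ui.py"]
```

In practice the `upstream_changes` set could come from a periodic background `git fetch` plus a diff against the remote branch; the "active claims" side is the part no decentralized tool provides out of the box.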

Shameless self plug: I am trying to solve both issues with a simpler UI around git that automates some of this and it's free. https://www.satishmaha.com/BetterGit


> CRDT picks 2

They don’t have to.

The CRDT library knows that the value is in conflict, and it decides what to do about it. Most CRDTs are built for realtime collaborative editing, where just picking an answer is an acceptable choice. But the CRDT can instead add conflict marks and make the user decide.

Conflicts are harder for a CRDT library to deal with, because you need to keep merging and growing a conflict range, and do that in a way that converges no matter the order in which operations arrive. But it’s a very tractable problem: someone’s just gotta figure out the semantics of conflicts in a consistent way, code it up, and put a decent UI on top.
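A sketch of the alternative described here: a multi-value register that, instead of picking a winner, keeps every concurrent write and surfaces the conflict for a human to resolve. This is a simplified toy (a real MV-register tracks version vectors so it can drop values that were causally superseded, which is omitted here):

```python
class MVRegister:
    """Multi-value register: merge keeps all concurrent values instead of
    picking one, so the UI can show conflict marks and let the user decide."""

    def __init__(self, values=None):
        self.values = frozenset(values or [])

    def write(self, value):
        # A local write (e.g. the user resolving a visible conflict)
        # replaces whatever was there.
        return MVRegister([value])

    def merge(self, other):
        # Concurrent writes accumulate; merge order doesn't matter,
        # so all replicas converge on the same conflict set.
        return MVRegister(self.values | other.values)

    @property
    def in_conflict(self):
        return len(self.values) > 1

a = MVRegister().write(1)   # engineer A writes 1
b = MVRegister().write(2)   # engineer B concurrently writes 2
merged = a.merge(b)
assert merged.in_conflict
assert merged.values == frozenset({1, 2})
# The user then resolves the conflict with a fresh write:
assert not merged.write(2).in_conflict
```

The union-based merge is commutative, associative, and idempotent, which is exactly the convergence property the comment is asking for; the hard remaining work is the UI and the semantics of growing conflict ranges in text.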


For that you need a very centralized VCS, not a decentralized one. Perforce allows you to lock a file so everybody else cannot make edits to it. If they implemented more fine-grained locking within files, or added warnings to other users trying to check them out for edits, they'd be just where you want a VCS to be.

How, or better yet, why would Git warn you about a potential conflict beforehand, when the use case is that everyone has a local clone of the repo and might be driving it in different directions? You're just supposed to pull commits from someone's local branch or push toward one, hence the wording. The fact that it makes sense to cooperate and work in the same direction, to avoid friction and pain, is a natural accident that grows from the humans using it, not something ingrained in the design of the tool.

We're collectively using Git for the silliest and simplest subset of its possibilities (a VCS with a central source of truth), while bearing the burden of complexity that comes with a tool designed for distributed workflows.


> It would be incredibly useful if before you were about to edit a file, the version control system would warn you that someone else has made changes to it already or are actively working on it. In large teams, this sort of automation would reduce conflicts, as long as humans agree to not touch the same file. This would also reduce the amount of quality regressions that result from bad conflict resolutions.

Bringing me back to my VSS days (and I'd much rather you didn't)


I knew I should have put a trigger warning, because I was thinking of this as I was typing it. Sorry!


Yeah, same thoughts. I also think semantic merge is the best. Also it would be nice if you could add a plugin for custom binary file formats, such as sqlite (which obviously can't be merged like a text file).


Well, the mismatch here is widened by the fact that almost everyone, it seems, uses git with a central, prominent, visible remote repository, whereas git was designed with a truly distributed vision. Sure, that distributed work only becomes "final" when it reaches some central repo, but it's quite different from how we all actually use it.


I haven't used them, but don't SVN or Mercurial do something like this? They block people from working on a file by locking it. The problem is that in large teams there are legitimate reasons for multiple people to work on the same files, especially something like a large i18n file.


Absolutely. Cheaper Chromebooks are terrible machines. Those screens should be illegal and probably cause a lot of eye strain and headaches. Same with a lot of the sub-$800 PC laptops: the colors aren't even... colors. The trackpad? Yuck. Everything else falls apart right after the warranty period of just 90 days or 1 year. Oh, and good luck spending the first day uninstalling bloatware, reformatting from scratch, and getting the vendor-specific features to work again.

For people who have always wanted an Apple laptop, this is it. The niceties are not necessary, and they're the perfect little things to cut to bring the price down for the masses.

