Hacker News | impulser_'s comments

"We haven’t committed to rewriting. There’s a very high chance all this code gets thrown out completely."

People conflate “high chance of X” with “X will happen” all the time. See elections, for example.

The phrasing strongly implies that they are taking the migration seriously and carefully. Merging straight to canary after 9 days is insane.

People need to understand Google. They have a long line of failures because they are an innovative company. Their whole goal is to scale products to billions of users, so if they release a product and see no path to billions of users, they cut it and move on.

This has always been the way Google has worked. This is why they are literally the most successful company in the history of the world.


That's the economic model of Saudi Arabia. Just because the world wants to pay for oil and apparently ad inventory today doesn't mean that it'll do so forever.

People really need to stop assuming that more training data is always better. That's not how it works. LLMs thrive on consistency.

Go, for example, has significantly less training data than Python, but LLMs are best at it. Why? Go is usually written the same way. You go from project to project and the code all looks the same. There are only a few ways to write Go.


Have you tried purego?

You can just embed the C library into the binary of the Go app and call it directly from Go. Most of the time I have found calling C from Go faster than, or on par with, calling C from Python.

https://github.com/ebitengine/purego


Local models are always going to be useless unless compute gets significantly cheaper, and it's not. TSMC might literally run out of capacity to build any consumer compute products.

Once compute constraints ease up, you will see much larger models. The reason LLMs seem to have stalled a bit is that there just isn't enough compute.

You have more people using AI, which requires more compute, and you want to build larger models, which requires more compute, and you have limited compute. What do you do?


Right.. and computers were once the size of a large room vs now fit into a pocket.

"The reason LLMs seem to have stalled a bit is that there just isn't enough compute."

lol okay mate.


https://x.ai/news/anthropic-compute-partnership

There is literally not enough compute, dude. If we had infinite compute, we would see everyone building Mythos-size models.

"Right.. and computers were once the size of a large room vs now fit into a pocket."

Yeah, because we offload much of the compute off-device. You think we would have these small computers in our pockets if the internet didn't exist?


> Right.. and computers were once the size of a large room vs now fit into a pocket.

and yet now we have far bigger rooms with far bigger computers anyway

Hardware may improve exponentially, but demand for compute increases double-exponentially. We'll always need more, bigger computers.


The majority of apps are B2C apps; they don't need any of this.

All you need is Apple and Google OAuth.


If you are just starting out, it's probably a good idea. Think about the case where Google bans either your app or your app's users.

Then your business is entirely screwed anyway because you've just lost half the market

At least to me it sounded very much like they were talking about mobile.


It depends on your use case.

If you are a B2C app, you are probably more concerned about:

- social providers (Apple and Google being the big ones, but others could play a role--FB or Tiktok for example)

- easy registration (but not too easy, you want to avoid bot spam)

- self-service account management (updating profile fields, consents [CCPA, GDPR, others], resetting passwords)

- single sign-on between your apps (if you have multiple)

- language support (for your backend, and mobile/web front end)

- cost

- possibly MFA, possibly passkeys


Yeah, but data centers create jobs, which give people money to buy food.

People should also look at Railway, especially if the majority of your users are in a single region, because you only really pay during active times; during periods of low activity you pay almost nothing.

Everyone thought Google Search would die from AI, but people are searching more than ever.


Not sure how they will justify zero-click searches to advertisers though, except YouTube.

https://www.google.com/search?q=did+google+seach+increase&oq...


I think they make most of their money off searches with intent (“vehicle detailing near me”) and things for which they still send you somewhere. The kind of searches that an LLM can just answer probably largely just sent you to Wikipedia or somewhere nobody was paying much for anyway.

It’s possible AI will do a better job of capturing ad dollars by better serving intentional searchers.


There's a part of the tech industry that uses what I would call dark influencer techniques. Search is dead. Lidar is too expensive for AVs. LLMs are as scary as thermonuclear bombs. China China China. Without ALPRs you'll get carjacked picking up Tommy from soccer.

Some of it is for stock pumping, some for regulatory capture, some is flooding the zone with shit.

This kind of "marketing" is part of the reason why tech is held in low esteem now. It destroys the sense of optimism and replaces it with fake tech bro worship.


I hate the China fearmongering. It's like the 50s Red Scare but 10x dumber, since by all realistic accounts China is just another government. A scary and powerful one, yes, but so is the US. They aren't a rogue state like the DPRK or Iran, they aren't funding terrorism by any realistic account, and they realise that starting any wars is a very bad idea.


I think it's that they don't want to set a precedent of refunding for bugs, because one bug could cost them millions.


Is that even legal? What happens if my landlord accidentally charges me 10x rent this month and refuses to correct it even after I ask? That's just straight up stealing. I feel like at a minimum I'm getting my money back one way or another, and they are likely to face consequences for theft.


But there's no need to set a precedent: I'm quite confident that a US court would refund a person or company that overpaid due to a bug in Anthropic's billing.


This is not just one bug, though; it’s a bug that takes money that ain’t theirs to take.

