People need to understand Google. They have a long line of failed products because they're an innovative company. Their whole goal is to scale products to billions of users, so if they release a product and see no path to billions of users, they cut it and move on.
This has always been the way Google has worked. This is why they are literally the most successful company in the history of the world.
That's the economic model of Saudi Arabia. Just because the world wants to pay for oil and apparently ad inventory today doesn't mean that it'll do so forever.
People really need to stop assuming that more training data is always better. That's not how it works. LLMs thrive on consistency.
Go, for example, has significantly less training data than Python, but LLMs are the best at it. Why? Go code tends to look the same everywhere. You go from project to project and the code all looks alike. There are only a few ways to write Go.
You can just embed the C library into the binary of the Go app and call it directly from Go. Most of the time I have found calling C from Go faster than or on par with calling C from Python.
Local models are always going to be useless unless compute gets significantly cheaper, and it isn't. TSMC might literally run out of capacity to build any consumer compute product.
Once compute constraints ease up, you will see much larger models. The reason LLMs seem to have stalled a bit is that there just isn't enough compute.
You have more people using AI, which requires more compute; you want to build larger models, which requires more compute; and you have limited compute. What do you do?
People should also look at Railway, especially if the majority of your users are in a single region, because you only really pay during active periods, and during periods of low activity you pay almost nothing.
I think they make most of their money off searches with intent (“vehicle detailing near me”) and things for which they still send you somewhere. The kind of searches that an LLM can just answer probably largely just sent you to Wikipedia or somewhere nobody was paying much for anyway.
It’s possible AI will do a better job of capturing ad dollars by better serving intentional searchers.
There's a part of the tech industry that uses what I would call dark influencer techniques. Search is dead. Lidar is too expensive for AVs. LLMs are as scary as thermonuclear bombs. China China China. Without ALPRs you'll get carjacked picking up Tommy from soccer.
Some of it is for stock pumping, some for regulatory capture, some is flooding the zone with shit.
This kind of "marketing" is part of the reason why tech is held in low esteem now. It destroys the sense of optimism and replaces it with fake tech bro worship.
I hate the China fear mongering. It's like the '50s Red Scare but 10x dumber, since by all realistic accounts China is just another government. A scary and powerful one, yes, but so is the US. They aren't a rogue state like the DPRK or Iran, aren't funding terrorism by any realistic account, and realize that starting any wars is a very bad idea.
Is that even legal? What happens if my landlord accidentally charges me 10x rent this month and refuses to correct it even after I ask? That's just straight-up stealing. I feel like at a minimum I'm getting my money back one way or another, and they are likely to face consequences for theft.
But there's no need to set a precedent: I'm quite confident that a US court would refund a person or company that overpaid due to a bug in Anthropic's billing.