
Handling cash costs money too though. I know some small businesses are credit/debit card only since they don't want to deal with the hassle of cash. Out of everywhere I have been, only one place (some grocery chain in SLC) has accepted debit cards but not credit cards.

May I ask why you eschew the basically free money that comes from credit card rewards as a responsible credit card user?

Is it really free money? Actual cash? I've always seen rewards programs advertised in terms of discounts on specific products or services: consumer electronics, cruise vacations, furniture, gift cards, and other things I rarely spend money on. I expect it to be an overstock clearinghouse, something like the old Columbia House record club, where you would page through a catalog of random stuff looking for anything you could convince yourself to settle for, just because you'd already paid for the subscription. It sounds like a hassle and I'd rather ignore it.

Maybe it once was like what you're thinking, but not anymore.

There are fee-free cards that give cash back as statement credits (AMEX Blue, iirc), with no limitations on what you can spend it on. The Apple Card does 2% cash back, which you can just transfer to your bank account.

The Amazon card requires a Prime membership, but gives 5% back on anything bought at Amazon. I bought my last TV using the 5% back I had received.

Then there are top tier cards like the Chase Sapphire or Cap One Venture X that have yearly fees. But if you take 1+ trips/year they immediately pay for themselves and more (credit for Global Entry, yearly statement credit for travel that almost equals the yearly fee, lounge entry, etc...). I routinely use points from the Venture X to cover travel expenses like tickets, rentals, hotels, eating out, etc...
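To make that concrete, here's the rough Venture X break-even math. The fee and credit figures below are the commonly cited ones, not verified current terms, so treat this as a sketch:

    # Rough Venture X break-even sketch. Fee/credit numbers are the
    # commonly cited ones, not verified current terms.
    annual_fee = 395
    travel_credit = 300              # yearly statement credit for portal travel
    anniversary_miles_value = 100    # 10,000 miles valued at ~1 cent each

    net_cost = annual_fee - travel_credit - anniversary_miles_value
    print(f"Net annual cost if you use the credits: ${net_cost}")
    # Negative means the card pays for itself before even counting lounge access.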


If you hold $100k in Bank of America (or a linked Merrill Edge account), they will give you up to 5.25% cash back for their credit cards in certain categories, and 2.62% unlimited.

https://frugalprofessor.com/bank-of-america-customized-cash-...

To your point, it's not free money at all: the credit card companies are collecting fees, and the merchants are passing them on to you. This is a way to claw a part of that back - if you don't use a rewards card, you're paying _even more_.
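A back-of-the-envelope version of that clawback argument, with purely illustrative numbers (the prices and rates here are assumptions, not any real fee schedule):

    # Sketch of the clawback argument; numbers are illustrative assumptions.
    sticker_price = 100.00   # merchants bake card fees into prices for everyone
    cash_back = 0.02         # a typical no-fee 2% rewards card

    cost_without_rewards = sticker_price
    cost_with_rewards = sticker_price * (1 - cash_back)

    print(f"Without a rewards card: ${cost_without_rewards:.2f}")
    print(f"With a 2% rewards card: ${cost_with_rewards:.2f}")
    # Cash payers and no-rewards cardholders pay the full fee-inflated price.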


Yes, there are quite a few that just give you actual money: you can get a check back. You often get a better return if you instead purchase things at a specific retailer or something like that, but it's not all gift cards and discounts.

Yes, on some credit cards it's actual 2% cash - Apple Credit Card, Fidelity.

Amazon gives you 5% back for using their credit card, it's criminal not to use it.

If you buy a lot of equipment, or expensive equipment, the B&H credit card covers sales tax! That's 10% in my area! (I don't use it since I don't buy that much, but it's still an option.)


Yes, literal dollars I can spend anywhere. It can even be deposited into my bank account. For doing nothing at all except paying my normal expenses via my 2% cash-back card, I get $400-800 annually.

I know I could probably min-max this into more by juggling different cards for things like Amazon and Costco, but I'm lazy and don't want to think.
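For anyone checking the math, working backwards from 2% gives the implied spend behind those reward figures:

    # Implied annual card spend behind the $400-800/yr figures, at 2% back.
    cash_back_rate = 0.02
    for annual_reward in (400, 800):
        implied_spend = annual_reward / cash_back_rate
        print(f"${annual_reward}/yr back implies ~${implied_spend:,.0f}/yr in card spend")
    # $400 -> $20,000/yr; $800 -> $40,000/yr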


I appreciate your rabid optimism, but considering that Moore's Law has ceased to be true for multiple years now, I am not sure a handwave about being able to scale to infinity is a reasonable way to look at things. Plenty of things have slowed down in progress in our current age; airplanes, for example.


Someone always crawls out of the woodwork to repeat this supposed "fact" which hasn't been true for the entire half-century it's been repeated. Jim Keller (designer of most of the great CPUs of the last couple decades) gave a convincing presentation several years ago about just how not-true it is: https://www.youtube.com/watch?v=oIG9ztQw2Gc Everything he says in it still applies today.

Intel struggled for a decade, and folks think that means Moore's law died. But TSMC and Samsung just kept iterating. And hopefully Intel's 18A process will see them back in the game.


During the 1990s (and for some years before and after) we got 'Dennard scaling'. The frequency of processors tended to increase exponentially, too, and featured prominently in advertising and branding.

I suspect many people conflated Dennard scaling with Moore's law and the demise of Dennard scaling is what contributes to the popular imagination that Moore's law is dead: frequencies of processors have essentially stagnated.

See https://en.wikipedia.org/wiki/Dennard_scaling
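For reference, the textbook statement of the Dennard relations (not from the thread, just the standard formulation): with a linear scale factor κ > 1 per generation,

    % Dennard scaling with linear scale factor \kappa > 1:
    % dimensions, voltage, and capacitance all shrink, frequency rises,
    % and power density stays constant.
    \begin{aligned}
      \text{dimensions } (L, W),\ V,\ C &\;\propto\; 1/\kappa, \\
      f &\;\propto\; \kappa, \\
      P = C V^2 f &\;\propto\; 1/\kappa^2, \\
      \text{power density} = P/\text{area} &\;\propto\; \frac{1/\kappa^2}{1/\kappa^2} = \text{const.}
    \end{aligned}

Once voltage could no longer scale down with feature size, power density started climbing with frequency, which is why clocks stalled.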


Yup. Since then we've seen scaling primarily in transistor count, though clock speed has increased slowly as well. Increased transistor count has led to increasingly complex and capable instruction decode, branch prediction, out-of-order execution, larger caches, and wider execution pipelines in an attempt to increase single-threaded performance. We've also seen the rise of embarrassingly parallel architectures like GPUs, which more effectively make use of additional transistors despite lower clock speeds. But Moore's been with us the whole time.

Chiplets and advanced packaging are the latest techniques improving scaling and yield, keeping Moore alive, along with continued innovation in transistor design, light sources, computational inverse lithography, and wafer-scale designs like Cerebras.


Yes. Increase in transistor count is what the original Moore's law was about. But during the golden age of Dennard scaling it was easy to get confused.


Agreed. And specifically, Moore's law is about transistors per constant dollar, because even in his time, spending enough could get you scaling beyond what was readily commercially available. Even if transistor count had stagnated, there is still a massive improvement from the $4,000 386SX Dad somehow convinced Mom to greenlight in the late '80s compared to a $45 Raspberry Pi today. And that factors into the equation as well.

Of course, feature size (and thus chip size) and cost are intimately related (wafers are a relatively fixed cost). And related as well to production quantity and yield (equipment and labor costs divide across all chips produced). That the whole thing continues scaling is non-obvious, a real insight, and tantamount to a modern miracle. Thanks to the hard work and effort of many talented people.
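A quick order-of-magnitude version of that comparison. The 386 transistor count is the commonly cited figure; the Pi's SoC count isn't published, so that number is a loose guess and the result should only be read as order-of-magnitude:

    # Order-of-magnitude transistors-per-dollar comparison.
    transistors_386 = 275_000         # commonly cited count for the Intel 386
    price_386_system = 4_000          # the late-80s system price from above

    transistors_pi = 2_000_000_000    # ASSUMPTION: Pi-class SoC, unpublished, rough guess
    price_pi = 45

    then = transistors_386 / price_386_system   # ~69 transistors per dollar
    now = transistors_pi / price_pi             # ~44 million per dollar
    print(f"~{now / then:,.0f}x more transistors per dollar (before inflation adjustment)")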


The way I remember it, it was about the transistor count in the commercially available chip with the lowest per-transistor cost. Not transistor count per constant dollar.

Wikipedia quotes it as:

> The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.

But I'm fairly sure, if you graph how many transistors you can buy per inflation adjusted dollar, you get a very similar graph.


Yes. I think you're probably right about phrasing. And transistor count per inflation adjusted dollar is the unit most commonly used to graph it. Similar ways to say the same thing.


The Law of Accelerating Returns is a better formulation, not tied to any particular substrate; it's just not as widely known.

https://imgur.com/a/UOUGYzZ - had ChatGPT whip up an updated chart.

LoAR shows remarkably steady improvement. It's not about space or power efficiency, just ops per $1000, so transistor counts served as a very good proxy for a long time.

There's been sufficiently predictable progress that 80-100 TFLOPS in your pocket by 2035 is probably a solid bet, especially if a fully generative AI OS and platform catches on as a product. The LoAR frontier for compute in 2035 is going to be more advanced than the limits of prosumer/flagship handheld products like phones, so there's a bit of lag and variability.
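A quick sanity check on that extrapolation; both inputs are assumptions (roughly 2 TFLOPS for a flagship phone GPU today, and a ~2-year doubling period for mobile compute), so it's only meant to show the bet is in the right ballpark:

    # Sanity check on "80-100 TFLOPS in your pocket by 2035".
    start_tflops = 2.0      # ASSUMPTION: rough flagship phone GPU throughput today
    doubling_years = 2.0    # ASSUMPTION: mobile compute doubling period
    years = 2035 - 2025

    projected = start_tflops * 2 ** (years / doubling_years)
    print(f"~{projected:.0f} TFLOPS by 2035 under these assumptions")   # ~64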


You could put 64TB of storage into your pocket with current technology: 4TB microSD cards are available, so sixteen of them gets you there.

Not sure about the stated TFLOPS, but I suspect we'll find that AI doesn't need that much compute to begin with.


You can run models locally on high end smartphones today with apps like PocketPal or Local LLM.


Is Detroit the first Waymo city that sees more than a negligible amount of snow? It will be interesting to see Waymo's snowy-road debut.


Waymo was doing tests in NYC earlier this fall, but it looks like there are legal difficulties in getting actual approval in New York: https://www.thecity.nyc/2025/09/09/waymo-driverless-cars-nyc...


Not that I'm complaining, but you'd think the capital of the Big 3 automakers would have put up a bigger fight than NYC.


I think banning or severely limiting advertising, similar to what was done for cigarettes, would be a good start. Stop having sports broadcasts be so intertwined with gambling; seeing odds on the screen when watching sports is gross.


Sam has claimed that they are profitable on inference. Maybe he is lying, but I don't think you can state so matter-of-factly that they lose money on it. They lose money because they pour an enormous amount of money into R&D.


I don't think you can confidently say how it will pan out. Maybe OpenAI is only unprofitable at the $200/month tier because those users are using 20x more compute than the $20/month users. OpenAI claims that they would be profitable if they weren't spending on R&D [1], so they clearly can't be hemorrhaging money that badly on the service side if you take that statement as truthful.

[1] https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat...


"OpenAI claims that they would be profitable if they weren't spending on R&D "

Ermmm, dude, they are competing with Google. They have to keep reinvesting, otherwise Google captures the users OAI currently has.

Free cash flows matter, not accounting earnings. On an FCFF basis they are largely in the red, which means they have to keep raising money; at some point somebody will turn around and ask the difficult questions. This cannot go on forever.

And before someone mentions Amazon... Amazon raised enough money to sustain their reinvestment before they eventually got to the place where their EBIT(1-t) was greater than reinvestment.

This is not at all what's going on with OAI.
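For reference, the standard FCFF formulation being invoked here (the textbook version, nothing OAI-specific):

    % Free cash flow to the firm, textbook formulation:
    FCFF = EBIT(1 - t)
           - \underbrace{\bigl(\text{CapEx} - \text{Depreciation} + \Delta NWC\bigr)}_{\text{Reinvestment}}
    % FCFF turns positive once after-tax operating income exceeds reinvestment,
    % which is the threshold the comment says Amazon eventually crossed.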


>OpenAI claims [...]

If you're gonna buy at face value whatever Scam Altman claims, then I have some Theranos shares you might be interested in.


Stealing a catalytic converter to sell for money cannot be equated to shoplifting. Plenty of shoplifters are doing it for the thrill or to obtain things they wouldn't pay for; no one is doing that with cats, they are doing it to try to survive.


There was serious money in catalytic converter theft and an organized ring behind it raking in millions of dollars (up to $545 million) [0]. That’s not trying to survive. Since the arrest of the organizers of the ring, catalytic converter theft has fallen off significantly: without that criminal enterprise, catalytic converter theft ceased to be wildly lucrative. People who steal to survive steal essentials like food, not catalytic converters.

[0] https://en.m.wikipedia.org/wiki/2020%E2%80%932022_catalytic_...


Bitcoin was in the thousands for most of 2017; as a booster to a tech-heavy portfolio it would have done extremely well.


That’s true… BTC returned 100% per year since 2017 (thanks to ChatGPT for the math).

But I would presume a regular Joe didn’t YOLO $100k into that, either.

I guess my original point was: there is no realistic way a retail investor walked away from a $100k investment a multimillionaire after 8 years unless they took an enormous, absurd punt.
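If you want to redo the ChatGPT math yourself, it's a one-line CAGR calculation. The prices below are rough placeholders, not exact historical quotes, so the result is indicative only:

    # CAGR = (end/start)**(1/years) - 1; prices are rough placeholders.
    start_price = 1_000      # ASSUMPTION: BTC, early 2017, order of magnitude
    end_price = 100_000      # ASSUMPTION: BTC roughly 8 years later
    years = 8

    cagr = (end_price / start_price) ** (1 / years) - 1
    print(f"CAGR: {cagr:.0%}")   # ~78%/yr with these placeholder numbers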


If you invested $100K in Microsoft (one of the safest, most valuable, and most boring companies in the world at the time) in 2017 and held, you'd not be a multimillionaire, but you'd be a millionaire. People really underestimate how crazy the asset bubble has been over the last decade.


USD 100k is an absurd amount for most people to invest all at once in anything whatsoever, not to mention the risk level involved. USD 10k is a lot but more reasonable. And that nets way less than USD 100k in this situation, which isn't life changing.


As a (former) huge Tesla fan/shareholder, and current Model 3 enjoyer, the Cybertruck really makes me upset. The millions of dollars and engineering hours devoted to that thing could have been devoted to a new product that people actually want and use (or even a halo product like the forgotten new Roadster); it was and is incredibly wasteful.


The cybertruck is the full vehicle version of the mistake Musk made by pushing for the falcon-wing doors on Model X. He said that was a huge mistake, and nothing like that would ever happen again.

