The title here is misleading (there's no 2022 in the linked page).
The joke is that this was released in 2016 when the new MBP with the touch bar came out, and was reviewing the _previous_ version as an improvement over the new one.
I feel like people are missing the joke here for lots of reasons, but the addition of 2022 to the title here contributes.
"There's even speculation that Apple may refresh its antiquated Mac Pro and desktop macs, neither of which have been updated since their release in 2022."
So, the fictional review is set in 2022 or later.
Edit: I should watch The Curious Case of Benjamin Button :)
No, this is Benjamin Button: the gimmick is that he is traveling backwards in time. So when he says "since 2022" in 2016, he is living in 2016, having just lived through the period 2022->2016 in that order. So this was a prediction, in 2016, that the Mac Pro and desktop Macs would not be touched by Apple between 2022 and 2016.
Pity he didn't mention the pandemic. Could have been a useful warning to everyone.
No, it's meant to be moving backwards in time, so 2022 is in the past.
I realize that this isn't how Benjamin Button works. It's not a perfect conceit. I can't believe I'm actually spending time explaining how this kinda cheap joke someone else wrote works, and acknowledging its weaknesses, but here we are.
Everyone derides Apple for not listening to the demands of users and then, when Apple finally does give in they... continue to deride them. I'm just happy with +magsafe -touchbar. Even if the price is Apple saving face and pretending this was their idea the whole time, so be it.
Apple wanted to be "bold" and forced a lot of their users into a pipeline that didn't last. And now they are listening to the demands of their users and backtracking.
Users that stuck with them are now being punished for their brand loyalty.
Apple: "If you want to connect to HDMI, you have to buy a dongle now. Also TOUCHBAR! Woot."
User: "Ugh...fine"
Apple: "So... now that you own that dongle that you didn't need, but we decided to make you need. Well, we decided not to make you need it anymore! It's built in. Also, no more touchbar! See! We listen!"
User: "Ugh......."
I'm not here to harp on Apple, but I understand why some people are a little put off by their decisions.
I like the MagSafe, and the card reader could be useful. But I think Apple should have stuck with USB-C. Someone has to push the standard. USB-C can handle anything (mouse, keyboard, printer, screen, camera, network, etc.). There was a time when the printer, mouse, keyboard and network all had different cables: it was a mess, and none of those cables were interchangeable. USB-C changes that, and it is also smaller and fancier than the USB-A type.
Just the other day, I had to buy an HDMI cable for my screen. That could have been a USB-C cable. But screen makers and graphics card makers are not going to change overnight, and hardly anyone is strong enough to push the standard. If anyone can do it, it's Apple. This is, in my opinion, a step backward.
I agree and would love to have everything USB-C. But I'm skeptical that will happen, for one practical reason: apparently not every USB-C cable is created equal, and not every USB-C port is created equal either. (Remember that Nintendo device with a USB-C port that turned out not to comply with the spec?)
If every cable looks the same but one is suited for 240W of juice but doesn’t do data and the other vice versa, maybe we would be worse off…
> (Remember that Nintendo device with a USB-C port that turned out not to comply with the spec?)
Someone needs to be the pedant and say this every time it comes up: It wasn't the Nintendo device that didn't respect the spec, it was knock-off docks and chargers that didn't respect the spec and damaged the Nintendo device [0]. I happen to think that Nintendo's engineers should have designed against some shenanigans by third parties (I do with my USB-C devices), but they weren't strictly wrong.
I spent more time than I'd like to admit trying to find the perfect off-brand USB-C dongle that would work with my Switch (so I can plug it in to TVs when I travel, without needing to bring the whole dock). I am about 80% confident that the dongle I eventually got won't brick my Switch, as long as I remember to plug in the power first and then the Switch (or was it the other way around?).
Oh yeah, and also, unrelated to the above, I now inspect the tiny grey-on-black voltage information whenever I'm about to plug in a USB-C cable for providing power.
Oh yeah, and also, each USB-C cable has its own special purpose, since they aren't actually interchangeable, as it turns out.
USB-C somehow managed to hit the worst intersection: universally pluggable without being a universal standard.
I really don't understand the anger over USB-C, even now. I got a single-cable dock that handles all my peripherals. It's not like they made that useless; it would still work, and it's still nice to have a single cable that drives everything. Even if I got a new MBP I'd continue to use USB-C, because it's a better UX.
Hell, you can now buy monitors that work only with USB-C. It’s clearly the way everything is going, Apple just moved a bit too fast.
That dongle is $69, but you can try buying the off-brand one, where you're not sure whether it works, not sure it doesn't upload your data to China, and not sure it doesn't have malware. If you're hit, good luck explaining your choice of brand to your insurance. So basically, it's $69.
So? Next to a few thousand dollars of laptop, monitor, and peripherals what's another $70? Especially since that hub will be compatible with every laptop I will own for the next few decades, regardless of the brand.
> I got a single cable dock that handles all my peripherals.
That you have to drag with you if you want those ports away from your desk. Plug into a TV or projector and you need a dongle. No thanks. Ports are fine by me and I don't care if that adds 0.5mm or some stupid aesthetics nonsense.
I don't understand this dongle stuff. You just buy a USB-C to HDMI cable and plug that in. I don't see how it's any different than buying an HDMI cable.
I'm a photographer, so the SD card slot is nice for me personally, but it really doesn't make sense to build it into the laptop. It must be used by a tiny percentage of people.
Most new TVs/displays can do Thunderbolt, so there's your USB-C connector replacing HDMI. HDMI can be useful for longer runs because of signal integrity, and there you'd want an HDMI->C cable/adapter that's an active part.
For SD cards, the majority of people don't use SD cards (even customers for the pro line). It's a concession to an important niche, but forcing that niche to have a USB-C adapter built into the SD card (maybe not possible?), having a separate SD card reader dongle, or having the camera itself be the SD card reader all seem like better choices than forcing all laptops to have an SD reader.
What’s the compelling story for why SD card or HDMI strictly need to be built in for all customers? The story for HDMI is kind of the strongest because the HDMI market is much more common day-to-day and USB-C as a display technology hasn’t yet permeated standalone TVs and projectors.
Generally I agree, but projectors specifically have almost always required dongles w/ Macs and the solution was usually to keep one attached to the projector cable or tethered nearby. Mini-DVI, Mini-Displayport, and even HDMI wasn't standard on many office projectors. It's a hassle but not specific to usb-c or a recent phenomenon.
My LG 5k monitor acts as my hub and power source, so I'm pretty close to "plug 1 thing". And from there I have a wired apple keyboard that connects to the monitor (yes with a dongle). But the keyboard has 2 old USB connectors which double as thumb drive connectors.
As it stands, I still have an analog audio out because I've procrastinated on getting a USB digital in/out, and I also have an Ethernet <-> Thunderbolt adapter because I only got about 400Mb/s when going via the monitor instead of 950+ when going direct.
3 small connectors all on 1 side isn't that bad.
That being said, I welcome the HDMI addition mostly for older conference rooms which aren't equipped with Apple TV / Webex Proximity / etc.
Yea, having "one cable to rule them all" is great. I can see why having an HDMI port would be nice when you're traveling, or using shared projectors, but at this point, if you don't have a USB-C to HDML adaptor...
100% this. I think it's a good lesson that sometimes there are multiple decisions/outcomes which are good enough, and there is no one perfect choice. Some people will be happy if you go one way and others prefer the other way. But waffling back and forth between them instead of sticking to one coherent strategy can be worse than committing to either of the options.
My problem with USB-C is it's too dainty. Since it became the de facto charger interface for laptops, I can't tell you how many chargers and laptops we've had to either replace or get repaired under warranty because the physical connector isn't up to the task.
USB-C seems to be fine perfectly straight-on with no strain at all, but use it in anything less than ideal conditions and it wears out surprisingly fast. The connectors tend to be longer than they are wide, which is always bad for robustness, and they have zero built-in strain relief.
I think it performs well, and they got the reversible design right, but it's just a little too wimpy for the way people really use their devices.
I'm not saying we need to go back to screws. But compare USB-C to the previous standard power connector Lenovo used, which was shaped like a USB Type-A port but a bit chunkier: we never had issues with those ports wearing out or getting damaged. They lasted the life of the device just fine. But I have T14s with USB-C charging that need new chargers or charge ports after a few months.
Obviously YMMV but when you manage a device fleet you start to see more clearly how "average users" treat things and how designs hold up.
I haven't felt this way until recently when using a USB-C port plugged in while in bed. Felt like I had to be careful moving around or I'll hit the cable and damage my port.
Nah, this absolutely looks like the result of Jony Ive leaving. The head of design quits and a long-term trend under his leadership suddenly reverses? Makes more sense than any sinister explanation.
The two are not mutually exclusive. His leaving may be why they changed track. However, that is not why they are hyping these reverts as new features/innovation.
Marketing releases talking about “innovation” are equivalent to protestations of innocence before a criminal court; you should ignore it because everyone is motivated to claim the same thing regardless of whether it’s true or not. Doesn’t mean it’s always true or always false, sometimes real innovation occurs, but taken alone it provides no useful information.
Yeah, I think Jony Ive jumped the shark with the touchbar and eliminating ports. I dunno why they ditched MagSafe. The Worse parody of latter day Ive [1] was pretty spot on.
I really like this MacBook Pro but at $2000 I may wait for the next Air. I really don't need the Pro features, HDMI, SD card, that many GPUs, ..., and my 2017 Air is just fine for now.
> Evans Hankey, vice president of Industrial Design, and Alan Dye, vice president of Human Interface Design, will report to Jeff Williams, Apple’s chief operating officer
When is the last time you saw Apple make everyone happy?
If you think this is bad, you should have been around in the '90s after Jobs had left, with the icky product line after the Mac IIfx. No one liked the PowerMacs, and PCs were so incredibly cheap that only art departments had Macs.
But even when Jobs returned, remember when everyone lost their shit when Jobs dropped the floppy drive from the iMac?
Maybe the iPhone 3G was the last universally celebrated Apple product.
(But that notch tho. It really tweaks my obsessive nature. I'm still on an iPhone 6 so I haven't encountered the notch anywhere yet.)
I think it's good for people to not think in such absolute terms. People are happy this product exists, but it's also annoying that it took so long. I'd rather have that type of response than the standard "fanboys" vs "haters"
Who is this "everyone"? Taking a look at the announcement thread here on HN, the response to these machines has been extremely positive.
I can't really find any criticism of the machines themselves. Negative comments are directed at Apple for pre-existing customer-hostile behavior like their repair policies.
Well they never gave us our money back for our keyboards where the keys fell off or the dGPU that overheats and throttles the CPU, when the CPU isn't even doing any useful work
> Everyone derides Apple for not listening to the demands of users and then, when Apple finally does give in they... continue to deride them.
I explain such behavior to myself by keeping in mind that the internet is many people with many opinions. If one sees an inconsistent response to a topic, one is likely seeing disjoint groups of people that separately have consistent opinions, rather than a cohesive group that can't make up its mind. I find it easy to miss that when I'm not paying attention to who said what.
They switched whole-hog to USB-C on their computers to push industry adoption and were derided for it, with people bitching and moaning about how they needed dongles for "everything." What was left out of the conversation: they dropped the proprietary magsafe connector, opened the door to charging their laptops off third party chargers, 12v/airline adapters, etc. They also opened the door to having one cable connecting your laptop to power, display, input devices, storage, and networking by means of a dock. But the teenage masses on reddit screamed "DONGLEEEEESSSS" and spewed shitty memes. People who evoke Nazi language to describe their smug choice in computer hardware, that's a crowd we should listen to...
Apple stopped including USB power bricks because everyone on the planet is drowning in them by now, and there are many choices like multi-port AC adapters, ones with different combinations of USB A and C meaning you probably want to pick the adapter you want anyway. But people screamed blue-bloody-murder. And then not long after Apple does it, Samsung quietly follows suit and nobody says a fucking word.
Apple dropped floppy drives and the world screamed about how stupid they were. At the time software had long since stopped coming on floppy, USB thumb drives were out, ethernet was everywhere and wifi was going mainstream. Apple helped put the nail in the coffin, pushing many people to USB thumb drives, network file shares, wifi, etc.
Apple dropped optical drives, same thing. Software was coming digitally; I worked for a company that offered a digital download in 1999, and we were a dinky little niche company. By the time Apple stopped including optical drives on their laptops, CDs were largely dead for software, everyone was using MP3s, etc. That freed up room for stuff like bigger batteries.
Apple pushed hard on digital display interconnects when much of the PC world was still chugging along on VGA, which worked poorly with the fixed-pixel raster displays that were rapidly becoming the norm. PC hardware companies just couldn't seem to figure out how the fuck to communicate "here are the resolutions and refresh rates I can do", something Macs had been doing for a decade. Kids these days don't remember the pain of doing geometry adjustments on a digital display fed a supposedly raster signal.
Dropping headphone connectors on their phones, people screamed blue bloody murder. A pair of pretty decent wireless headphones with 8 hours of battery life were not hard to find for $30ish, now they're commonplace for cheaper, and damn near every car comes with a bluetooth audio function in its stereo, even work vans. And if you want hardwired headphones, a dongle is a $5-20 affair, with Apple's audio adapter scoring very good marks in audio DAC tests...
> Despite the many improvements, Apple is actually dropping the price on its flagship 15" MacBook Pro by $400
I would have been happy if the price had stayed the same or even gone up a bit. But the jump to $2k (which is $2,500 including AppleCare and tax) is a bridge too far.
I'll keep suffering with my butterfly keyboard until amazing futuristic features like HDMI and SD trickle down into the more affordable machines. /s
$2K for the cheapest possible build. Add in a couple of upgrades (1TB SSD, 32GB RAM) and it isn't hard to hit $3K+. These are great machines, but a bit too pricey for my liking.
I think the price point sets the stage for a lower cost offering in the $699-$1299 range that will not be branded as "pro" and will likely have more of the eco cores and incredible battery life.
It’s a fine price for the quality of machine. In my opinion, people simply haven’t reckoned with the fact that they only need an M1 Macbook Air, which is competitive with Windows machines above its price point.
I agree that most people (myself included) only need an M1 MBA. Problem is, that machine has two ports, one of which is for charging. Even the M1 MBP only comes with this limitation.
You shouldn't have to upgrade past the base Pro machine — and shell out an extra $800 — to get more than one spare port (and 'fancy' stuff like HDMI and SD).
Waiting for reviews. Price should be comparable to performance. If Apple's machines are expensive but perform comparably to $3.5k over-configured laptops, then it's a good deal.
On such machines, I wonder where the sweet spot is. If you work seriously, you'll be at your desk with a large screen and a separate keyboard. If you are on the train, you'll prefer the smaller screen.
The real problem is, it became normal to have “padding: 300px” in webapps.
I haven't used a 14", but that may be the sweet spot. That's why it's taken the $2k price point. I like the portability of the 13", while the 16" can be cumbersome on an airplane.
The 13" I have now is adequate, but I miss the extra screen real-estate I had with the 15/16". The 16", especially, was really a treat.
I think that the new laptops are a huge deal for Apple. Some folks argue that we're celebrating them putting back ports that should have been there from the beginning. And that the keyboards should have never gotten so bad.
I see it as a celebration. It's essentially an acknowledgement that they were wrong about all that, and they've built a heavier and "uglier" machine that people actually want/need.
As SJ said, design is how it works. Dongletown + bad keyboards was a bad design, and now they're back to prioritizing usability.
An acknowledgment of what? Apple does not owe you anything. You chose to buy their products, with or without the need for dongles. They have allowed feedback and the market to correct this by offering revised products.
Apple doesn't allow any leeway though. It's "my way or the highway" with displays, keyboards, ports, headphone jacks, charging cables, programming languages, API entitlements, app distribution, content integration, graphics API, coreutil versions, package managers and even your goddamn operating system.
“We are very, very sorry. We were so wrong. We didn’t listen to all your posts on Reddit and HN, but we should have. To make it up we will be releasing a $500 M1 Max that is fully upgradable, repairable, and runs Arch Linux” - Tim Cook
A bit off topic but as much as we can praise the return of some ports on the new MBP, it's amazing to compare with products like https://us.vaio.com/collections/vaio-sx12
I think Benjamin Button needs to review: web browsers and JS stack, graphic and brand design, product design, domestic manufacturing industry, social media, subscription model of product pricing, political party extremism, car UX/UI, smart appliances and IoT gadgets.
I'd love to see social media, especially the 'upgrade' when FB gets rid of the timeline, leaving an uncluttered interface where you can interact with your friends without distractions.
> Rumors are also swirling that the company will add a headphone jack to its already popular iPhone. The announcement could come as early as this month.
I ordered a 16" M1 Max 64GB and am excited for it, but I'm sad that Apple dropped the touch bar. I do use it all the time, scrubbing through slides in Keynote, audio controls in Logic, hanging up a Facetime call...
It’s a high end computer with high end specs. Apple makes lower end computers that are cheaper. They’re not going for the bottom of the market, so you’ll always be able to find a cheaper PC. What’s the complaint? Options are good.
I bought a mac mini with 8gb ram for $650. It significantly outperforms my company macbook pro which cost almost $3k. I run logic, docker, intellij, and it never complains about not enough memory.
You are comparing a portable device with a desktop one. If you don't need the portability/mobility, then a portable from any manufacturer is going to be costlier for comparable performance, even when they can reach that level at all.
Even with the jump in M1 performance at lower power envelopes, top-of-the-line desktop chips will easily beat it. Apple has only narrowed the performance gap between high-power-draw and lower-power-draw machines; it has not surpassed the former.
The Mac Pro is still an Intel-based machine. It comes with relatively beefy GPU configurations as well (the PC world has even better options) and with a lot of expansion slots that you cannot get in a Mini; the top-of-the-line model has a 28-core 2.9 GHz processor. It will definitely beat the M1 Mini[1] in performance and cost (~5-10x).
You (I) may not see the value of that increased performance[2] and will be unwilling to pay the premium (assuming there is no use for other features like expansion cards); however, there is a decent market that values that ~2.5x performance more than the 5-10x price difference it costs.
[2] The power draw is also considerably higher for the Mac Pro, ~1180W compared to 150W for the Mini, so it's not technically a 1:1 comparison; however, if you are only looking at non-portable compute as the only value prop, then it shouldn't matter.
The new USB-Cs allow you to transmit power and data. The fingerprint reader gives you an extra layer of security and saves you from constantly typing in new passwords. The one complaint is the [some peripheral that goes bad]
The joke is they go around in a cycle and everyone notes with a mix of praise and nostalgia how things have changed.
I feel we should take a moment to appreciate the USB-C end on the magsafe charger so we can replace the cable when it breaks instead of paying for the entire brick.
Yeah, I think the rub is that much of the internet world uses x86 containers/VMs because so much backend infra is still x86, and that will probably stay the case for a long time.
My understanding is that when you're on Intel, VMs are accelerated with VT-x hardware support that essentially runs the machine code in the VM on bare metal. Before VT-x, VMs were dog slow: they ran the code in a full software emulation of the target CPU, which is slow because there's a whole layer of indirection that adds loads of instructions for every target instruction executed and breaks all the caching and branch prediction and such.
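(A minimal, x86-only C sketch of where that hardware assist is advertised; the feature-bit locations are from the Intel/AMD manuals, and everything else here is just illustrative:)

    #include <stdio.h>
    #include <cpuid.h>  /* GCC/Clang wrapper around the x86 CPUID instruction */

    /* Intel advertises VT-x (VMX) in CPUID leaf 1, ECX bit 5.
       AMD advertises its equivalent (SVM) in leaf 0x80000001, ECX bit 2. */
    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            puts("CPUID leaf 1 not supported");
            return 1;
        }
        printf("VT-x (VMX) advertised: %s\n", (ecx & (1u << 5)) ? "yes" : "no");
        return 0;
    }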
The Rosetta thing seems cool in that they essentially JIT and cache translations of Intel binaries that are executed under Darwin, translating them once and caching the result, and it turns out they tweaked their chips to be Intel-like to make this translation lightweight... so you can run Intel binaries from the Darwin shell, and after the first run, they're quite fast...
But... the question is, are there effectively two layers of emulation that break each other's acceleration? The VM emulation and then the x86 processor emulation? Can you run x86 VMs in a VT-x-like way while taking advantage of Rosetta's JIT translation (and the caching of those results)? It seems really hard, as the host machine would not have any knowledge of when a child VM is even launching a process, let alone its identity, and if it doesn't know these things, how would it know to JIT that executable and save the result?
It seems you would need some sort of special hardware emulation support, but I'm just speculating...
Edit: it seems you could also do something by adding support in the guest OS, but I have doubts as to how the maintainers would feel about such a thing!
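For what it's worth, the Darwin-side half of this is at least observable: Apple documents a sysctl flag, sysctl.proc_translated, that tells a process whether it is currently running under Rosetta translation. A minimal C sketch (the flag name is Apple's; the wrapper function is mine):

    #include <stdio.h>
    #include <sys/sysctl.h>

    /* Returns 1 if this process is running translated under Rosetta 2,
       0 if it is running natively, and -1 if the sysctl doesn't exist
       (e.g. on Intel Macs or older macOS versions). */
    static int running_under_rosetta(void) {
        int translated = 0;
        size_t size = sizeof(translated);
        if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
            return -1;
        return translated;
    }

    int main(void) {
        printf("proc_translated = %d\n", running_under_rosetta());
        return 0;
    }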
I feel like the only complaints about the 2016 MBP were the keyboard and ports. The M chips are literally a lottery win at this point. Apple making their own chips is probably a rare thing in any business: the maker of a product eventually making the most complicated part of that product themselves from scratch, something only a handful of others in the industry have managed in two decades. Yes, iPhone chips were designed by Apple for quite some time, but a general-purpose computer is a lot different from a heavily controlled, App Store-only device.
The first M1 laptop was released in 2020. It's been met with more or less universal praise: performance is impressive, performance per watt is mind-boggling, and the overwhelming majority of popular programs and applications have either been ported to Apple Silicon by now or work adequately in Rosetta 2.
It's almost as if you meant to post this comment a year ago.
Lottery? The M1 MacBook is the best computer I've ever had, and I've been around. It is limited only by RAM and GPU, and now with the M1 Pro and Max that limitation is lifted too.
Apple knocked it out of the park, that much is clear by now.
They aren't making a chip from scratch though--they've licensed an ISA and built a derivative design customized to their needs. This isn't to knock the work Apple has done--it's not trivial and clearly not everyone else is able or willing to do it. But going from absolute zero and inventing an entire new ISA, all of the tooling for it, etc. is an order of magnitude more work for both Apple and all of its developers. They made a smart move to use ARM's ISA and liberal licensing that allows them to build on top of it and make exactly the hardware they need.
This isn't a direct response per se, but your comments made me think of some relevant background.
Apple has been deeply involved with ARM almost since the beginning. Allegedly, the acronym "ARM" was changed from "Acorn RISC Machine" to "Advanced RISC Machine" at the behest of Apple. Their engineers seem to have gotten involved soon after the first ARM chip was created for internal use in Acorn's computers, modifying the chip and ISA to make it suitable for the Newton; their combined efforts created the first commercially released ARM chip, the ARM6.
More recently, Apple has done a lot of work with LLVM. They weren't the original authors, but they've effectively created a lot of their own tooling.
All this to say, while they did license ARM, and they did start with someone else's tooling, they were so deeply involved in the origin/growth of both I think you may be underselling their involvement/work. If they didn't already have such deep historical ties to ARM, I suspect they would have seriously considered making their own architecture.
For all intents and purposes, they are. None of Apple's SoCs since the A6 (2012) have been based on ARM's Cortex-A cores; the CPU design is fully in-house at this point.
Yes, but the M1 is nothing like a new product. As you indicated, Apple's first custom SoC was released in 2010, 11 years ago. They have >10 years of experience shipping SoCs for Apple products. The M1 family can be viewed, to some extent, as an extension of the work on the Ax chips, which likely builds on their experience customizing chips for the iPod family.
The ISA is roughly equivalent to an API. It specifies how to talk to the chip but does not define how it is implemented. Apple has done a lot of custom design of their chips to optimize them for use with Mac OS and Mac software. This is not just Apple copying a chip design.
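To make the analogy concrete, here's a minimal sketch (AArch64-only, GCC/Clang inline asm; the example is mine, not anything Apple ships). The add instruction is the contract: an Apple core and an ARM-designed Cortex core must both produce the same result, while pipelines, caches, and issue width are left to the implementer, exactly like an API versus its implementation.

    #include <stdio.h>
    #include <stdint.h>

    /* The AArch64 "add" below is defined by the ISA, the shared contract.
       How a given core executes it (microarchitecture) is not specified,
       only that the result must be the same on any conforming chip. */
    static uint64_t add_via_isa(uint64_t a, uint64_t b) {
        uint64_t result;
        __asm__("add %0, %1, %2" : "=r"(result) : "r"(a), "r"(b));
        return result;
    }

    int main(void) {
        /* Prints 42 on an Apple M1 and on a Cortex-based SoC alike. */
        printf("%llu\n", (unsigned long long)add_via_isa(40, 2));
        return 0;
    }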
Okay, so let me be clear about what I meant in the first line: I meant that since 2016, MBP users only wanted the ports and keyboard fixed. Getting M chips that are this good was a lottery win for macOS users. I am saying this in a positive tone, not a negative one (though I could have worded it wrong).
The second part is where I say that even though Apple has worked on ARM chips for their phones for quite some time, tuning them, Macs are still a significant product, and even if the chips are in-house, it's still a gamble to completely shift a product away from what was basically an x86 monopoly. What I meant is that the way Apple pulled it off is rare in business.