Seeing an M-series chip launch first in an iPad must be the result of some mad supply-chain and manufacturing hangovers from COVID.

If the iPad had better software and could be considered a first-class productivity machine, it would be less surprising. But the one thing no one says about the iPads is “I wish this chip were faster.”

Welcome to being old!

Watch a 20-year-old creative work on an iPad and you will quickly change your mind. Watch someone who has "never really used a desktop, [I] just use an iPad" work in Procreate or LumaFusion.

The iPad has amazing software. Better, in many ways, than desktop alternatives if you know how to use it. There are some things it can't do, and the workflow can be less flexible or less full-featured in some cases, but the speed at which some people (not me) can work on an iPad is mindblowing.

I use a "pro" app on an iPad and I find myself looking around for how to do something and end up having to Google it half the time. When I watch someone who really knows how to use an iPad use the same app they know exactly what gesture to do or where to long tap. I'm like, "How did you know that clicking on that part of the timeline would trigger that selection," and they just look back at you like, "What do you mean? How else would you do it?"

There is a bizarre and almost undocumented design language to iPadOS that some people simply seem to know. It often pops up in those little "tap-torials" that appear when a new feature rolls out, which I either ignore or forget… but other people internalize them.

To be honest, I wish my iPad's chip were slower! I can't do anything other than watch videos and use drawing programs on an iPad, so why does it need a big, expensive, power-hungry, and environmentally impactful CPU when one a tenth the speed would do?

If I could actually do something with an iPad there would be a different discussion, but the operating system is so incredibly gimped that the most demanding task it's really suited for is… decoding video.

My guess is that the market size fits current yields.
Maybe they're clocking it way down. Same performance, double the battery life.
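A rough sketch of why that trade can work, assuming the textbook CMOS dynamic-power model (the 20% figures below are illustrative, not Apple's actual numbers):

    % Dynamic power grows with voltage squared times frequency
    P_{\mathrm{dyn}} \approx C V^2 f
    % A ~20% lower clock typically permits ~20% lower supply voltage,
    % so power falls with the cube of the scaling factor:
    \frac{P'}{P} = \frac{(0.8V)^2 (0.8f)}{V^2 f} = 0.8^3 \approx 0.51

So running the same silicon modestly slower can roughly halve power draw, which would go a long way toward the battery-life gains described above.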
My current assumption is that this has to do with whatever "AI" Apple is planning to launch at WWDC. If they launched a new iPad with an M3 that couldn't sufficiently run on-device LLMs or whatever new models they are going to announce in a month, it would be a bad move. The iPhones in the fall will certainly run some new chip capable of on-device models, but the iPads (being announced in the spring just before WWDC) are awkwardly timed, since Apple has to announce the hardware before the software.
Well, it also affects the battery life, so it's not entirely wasted on the iPad.
To me it just feels like a soft launch.

You probably have people (like me) trying to keep up with the latest MacBook Air who get fatigued having to buy a new laptop every year (I upgraded to the M3 not long ago, from the M2, and before that... the M1... is there any reason to? Not really...), so now they are trying to entice people who don't have iPads yet, or who are waiting for a reason to upgrade their iPad.

For $1,300 configured with the keyboard, I have no clue what I'd do with this device. They are very deliberately keeping iPadOS and macOS separate.

I'm wondering if it's because they're hitting the limits of the architecture, and it sounds way better to compare the M4 against the M2 rather than the M3, which they'd have to do if it launched in a MacBook Pro.
My guess is that the M4 and M3 are functionally almost identical, so there's no real reason for them to hold back the M4 iPad launch until they get the chip into the MacBook / Air.
I have the faintest of hope that WWDC will reveal a new hybrid Mac/iPad OS. If it ever happens I won’t hesitate to buy an iPad Pro.
To offer something better to those who already have an M2 iPad Pro, and a more powerful environment for running heavier games.
Meanwhile the Mac mini is still on the M2.
Generally, I feel that telling a company how to handle a product line as successful as the iPad doesn't make much sense (what does my opinion matter vs. their success), but I beg you: please make Xcode available on iPadOS, or provide an optional, separate macOS mode similar to DeX on Samsung tablets. Being totally honest, I don't like macOS that much in comparison to other options, but we have to face the fact that even with the M1, the iPad's raw performance was far beyond the vast majority of laptops and tablets in a wide range of use cases, yet the restrictive software made that all for naught. Consider that the "average" customer is equally happy with, and due to pricing generally steered towards, the iPad Air, a great device that covers the vast majority of use cases essentially identically to the Pro.

Please find a way beyond local transformer models to offer a true use case that differentiates the Pro from the Air (ideally development). The second that gets announced, I'd order the 13-inch model straight away. As it stands, Apple's stance is at least saving me from spending 3.5k, as I've resigned myself to accepting that the best hardware in tablets simply cannot be used in any meaningful way. Xcode would be a start, macOS a bearable compromise (unless they start to address the instability and bugs I deal with on my MBP, which would make macOS more than just a compromise), Asahi a ridiculous yet beautiful pipe dream. Fedora on an iPad: the best of hardware and software, at least in my personal opinion.

Apple was really losing me with the last generation of Intel MacBooks, but these M-class processors are so good they've got me locked in all over again.
Nano-Texture

I really hope this comes to all Apple products soon (iPhones, all iPads, etc.).

It's some of the best anti-reflective tech I've seen: it cuts glare while keeping color deep and brightness high.

> M4 has Apple’s fastest Neural Engine ever, capable of up to 38 trillion operations per second, which is faster than the neural processing unit of any AI PC today.

I didn't even realize there was other PC-level hardware with AI-specific compute. What's the AMD and Intel equivalent of the Neural Engine? (Not that it matters, since it seems the GPU is where most of the AI workload is handled anyway.)

> Not that it matters, since it seems the GPU is where most of the AI workload is handled anyway.

GPUs can also have AI-specific compute (e.g., Nvidia's tensor cores).

AMD/Intel just call it an NPU.
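For what it's worth, on Apple platforms you don't program the Neural Engine directly; you hand Core ML a model plus a compute-units preference and it schedules supported layers onto the ANE. A minimal Swift sketch (the "SomeModel" file name is hypothetical, standing in for any compiled Core ML model in the app bundle):

    import Foundation
    import CoreML

    // Ask Core ML to prefer the Neural Engine over the GPU.
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine  // or .all to let Core ML decide

    // "SomeModel.mlmodelc" is a placeholder for a compiled model in the bundle.
    let url = Bundle.main.url(forResource: "SomeModel", withExtension: "mlmodelc")!
    let model = try MLModel(contentsOf: url, configuration: config)
    // Predictions through `model` can now run on the ANE where the ops are
    // supported; unsupported layers silently fall back to the CPU.

The AMD (XDNA / "Ryzen AI") and Intel (Meteor Lake) NPUs sit behind similar vendor runtimes, typically reached via frameworks like ONNX Runtime or DirectML rather than programmed directly.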
No numbers on battery life improvements?