"They've been relegated to pretty insignificant roles in the PC business."
👁️👄👁️

Not if they still suck ass

Macs are pretty popular

Their market share is up but still less than 10%.

I wonder if Intel is betting on increased centralized cloud computing as the way forward for personal computers. In that case the efficiency benefits of ARM are irrelevant in their minds, since they think the real power will come from big data centers.

chameleon

AWS has a shitton of in-house “Graviton” ARM instances available, and the ARM server chips from Ampere are popping up in more and more places as well. Most server-focused Linux distros have ARM images available now, and most software builds without major changes. It’s a slow transition, but it’s already happening.
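To make “builds without major changes” concrete, here’s a minimal sketch: the same portable C source compiles for either architecture, and the target only shows up through standard predefined compiler macros (GCC/Clang names):

```c
// Minimal sketch: portable C that just reports what it was built for.
// The same source compiles unchanged for x86_64 or ARM; only the
// compiler's predefined macros differ.
#include <stdio.h>

int main(void) {
#if defined(__aarch64__)
    puts("built for 64-bit ARM");
#elif defined(__x86_64__)
    puts("built for x86_64");
#elif defined(__riscv)
    puts("built for RISC-V");
#else
    puts("built for some other architecture");
#endif
    return 0;
}
```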

@catacomb@beehaw.org

Yeah and ARM servers are cheap. You can often get twice the processor cores and memory for the same price.

That doesn’t always map to twice the performance, though some benchmarks would suggest it could for certain applications.

Do they not realize data centers are powered by electricity that costs money? Why not also use ARM there and cut costs?

Gormadt

If they have to emulate x86 stuff, it may pose some efficiency issues.

But that will likely only be a short-term problem.

melroy

Well, they should be afraid. I want an ARM Linux laptop as well. Or even better, RISC-V! Yes plz… THE WORLD NEEDS RISC-V, yesterday.

I wish the Pinebook Pro were updated, because I’d give that a shot. Or better, an ARM-powered Framework laptop.

bedrooms

I love my ARMed Mac because of the battery life. I almost never use the power cable when I’m out.

Veraxus

If only I could get Wi-Fi to work on a Linux partition, it would be the perfect Linux machine.

Wi-Fi worked fine for me on Fedora Asahi, on a MacBook Air M2.

bedrooms

Maybe you can buy a USB-C Wi-Fi adapter that’s small enough, assuming something like that exists.

And it’s really responsive even on battery. It’s actually a little bad because I can have too many windows open and can’t find anything.

macOS doesn’t throttle performance on battery the way many Windows power plans do; that’s why.

HeartyBeast

Well, it can when it needs to. It just rarely needs to.

@Sentau@feddit.de

macOS doesn’t need to throttle performance, because ARM and other RISC architectures are naturally very power efficient.

They didn’t do it on x86 either I believe.

Well, those Intel CPUs used to thermal throttle anyway in their outlandishly inadequate cooling designs, so they didn’t need to throttle power either way. Now they could throttle power, but they don’t have to.


Intel is evidently not paying attention.

Apple already did, though, specifically replacing Intel chips because Intel’s offerings were dogshit whose power draw was destroying Apple’s ability to build the designs they wanted.

The rest of ARM is behind, and Windows has done a shit job of ARM support, but that doesn’t mean that’s forever.

V

Especially when it’s becoming increasingly obvious that Windows isn’t the future. Windows has maintained dominance because it is great at backwards compatibility. ARM erodes that advantage because of architectural differences, coupled with the difficulty and drawbacks of emulating x86 on ARM. Mobile is eating more and more market share, and devs aren’t making enterprise software for Windows like they used to.

No one working on a greenfield project says “let’s develop our systems on Windows Server” unless they were already doing that. Windows as a service is the more likely future, funneled through Azure.

@catacomb@beehaw.org

Even some shops working with Windows Server are asking “wait, why are we paying for these licenses?”

Then it comes down to whether it’s cheaper to rewrite legacy applications or continue to pay for licenses.

V

My former employer made this decision recently. They moved off .NET and onto a web app with a RHEL server. Time will tell if they pull it off.

ripcord

Also, Chromebooks. And the more powerful the CPUs get, the more of them will be purchased.

And low-end Windows laptops.

Maybe not a giant piece of the pie of the current market, but definitely a dent as these more powerful CPUs come online.

@loki@lemmy.ml

“The rest of ARM is behind”

That might change with Snapdragon X. It isn’t out yet, but competition at the top will hopefully start bringing prices down.

https://www.anandtech.com/show/21105/qualcomm-previews-snapdragon-x-elite-soc-oryon-cpu-starts-in-laptops-

Microsoft also seems more concerned with going all in on cloud computing, the whole “you will own nothing and like it” paradigm. So a faster and more efficient mobile platform probably isn’t a high priority for them.

Them trying to force control away from users is bad.

But ARM’s efficiency makes it a damn good option for a thin client.

Them taking control away from me makes me not use them. Not a problem at all.

I was never too deep because I always hated everything about Windows UX, but I was stuck with them for gaming for a bit. Luckily Steam fixed that for pretty much everything I wanted to play but Madden (and after hours of it also not working on a separate Windows install I tried just for that purpose, I threw in the towel on that, too).

The funny thing is I actually kind of like the idea of a thin client as a general rule. Not for gaming or anything else latency sensitive, but offloading heavy lifting is perfectly fine with me. Just not in a way I don’t have control of.

I’m stuck with it because of work. Luckily, “Industry 4.0” is completely fucking fed up with M$ and they’re abandoning Windows in droves. I’m just waiting for my vendor to finish polishing their MacOS and Linux alternatives.

Yah, I’m really not enthused about the idea of having to pay monthly rent for my computer’s ability to function.

I wonder if Intel just values their existing experience with x86 more than any potential efficiency gains, since efficiency matters a lot less when the whole system is just a glorified screen and antenna.

@flashgnash@lemm.ee

I’m really not sure even Microsoft could get away with that

The moment a subscription service comes into play for something people take for granted as free, they will start looking at alternatives. Chromebooks and MacBooks exist, and from what I hear Chromebooks are starting to become serious competition for Windows.

Plus, desktop Linux is obviously getting more user-friendly and is being preinstalled on more laptops.

I think it matters more.

Apple’s battery life is so good in large part because ARM is way better at low end power draw.

I’d say their recent trend toward packing in E(fficiency)-cores alongside their previously standard P(erformance)-core design shows that they’re sensitive to, and reacting to, both the higher core counts of AMD and the greater efficiency of ARM.

Intel planning to abuse its quasi-monopoly to stifle competition and innovation? They wouldn’t dare, would they? /s

AutoTL;DR
bot account

🤖 I’m a bot that provides automatic summaries for articles:


But Intel CEO Pat Gelsinger doesn’t seem worried about it yet, as he said on the company’s most recent earnings call (via Seeking Alpha).

“Arm and Windows client alternatives, generally, they’ve been relegated to pretty insignificant roles in the PC business,” said Gelsinger.

Ideally, Arm-based PCs promise performance on par with x86 chips from Intel and AMD, but with dramatically better power efficiency that allows for long-lasting battery life and fanless PC designs.

Qualcomm’s latest Snapdragon chip for PCs, the 8cx Gen 3 (also called the Microsoft SQ3), appears in two consumer Windows devices.

Even if Gelsinger is wrong, he’s trying to spin the rise of Arm PCs as a potentially positive thing, saying that Intel would be happy to manufacture these chips for its competitors.

Right now, TSMC has an effective monopoly on cutting-edge chip manufacturing, making high-end silicon for Qualcomm, Nvidia, AMD, Apple, and (tellingly) Intel itself.


Saved 71% of original text.

meseek #2982

🥱

This sounds a lot like when Steve Ballmer wasn’t worried about the iPhone at all.

Or when Kodak didn’t worry about digital cameras

The problem with ARM laptops is all the x86 Windows software that will never get ARM support, and all the users who will complain about poor performance if an emulator is used to run that x86 software.

Most Linux software already supports ARM natively. I would love to have an ARM laptop as long as it has a decent GPU with good open source drivers. It would need full OpenGL and Vulkan support and not that OpenGL ES crap though.

Modern ARM GPUs already support OpenGL and Vulkan; that’s not a problem. It’s just that some platforms chose to ship only the mobile APIs because they run Android.
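For the curious, a small sketch of how you might check what a given driver stack actually exposes: EGL can report its client APIs at runtime. This assumes a working EGL implementation is installed (link with -lEGL):

```c
// Sketch: ask EGL which client APIs the installed driver exposes.
// On stacks that only ship GLES you'll typically see "OpenGL_ES" with
// no desktop "OpenGL" in the list.
#include <stdio.h>
#include <EGL/egl.h>

int main(void) {
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL)) {
        fprintf(stderr, "no usable EGL display\n");
        return 1;
    }
    // A space-separated list, e.g. "OpenGL OpenGL_ES"
    printf("EGL client APIs: %s\n", eglQueryString(dpy, EGL_CLIENT_APIS));
    eglTerminate(dpy);
    return 0;
}
```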

The trick Apple used for emulation was adding hardware support to the CPU, notably an x86-style TSO memory-ordering mode, that the emulation layer uses to run x86_64 code efficiently. Nothing is stopping other CPU manufacturers from doing the same; the only issue is that they have to collaborate with the emulation developer.
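For a flavor of why that hardware support matters: x86 guarantees a stronger memory ordering (TSO) than ARM, so translated x86 code can rely on orderings a weakly ordered core won’t provide. A minimal C11 sketch of the gap; this illustrates the ordering problem, not Rosetta’s internals:

```c
// Message-passing litmus test. x86's TSO model keeps these relaxed
// stores and loads in program order in practice, so x86 binaries can
// (and do) rely on it; plain ARM may reorder them. A translator must
// therefore either insert barriers everywhere or, as on Apple silicon,
// flip the core into a TSO mode. Illustration only.
#include <stdatomic.h>
#include <pthread.h>
#include <stdio.h>

static atomic_int data = 0, ready = 0;

static void *producer(void *arg) {
    (void)arg;
    atomic_store_explicit(&data, 42, memory_order_relaxed);
    atomic_store_explicit(&ready, 1, memory_order_relaxed);
    return NULL;
}

static void *consumer(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_relaxed))
        ;
    // Under TSO this reliably prints 42; on weakly ordered ARM the two
    // loads may be reordered, so 0 is a legal outcome. Portable code
    // would use release/acquire ordering instead of relaxed.
    printf("data = %d\n", atomic_load_explicit(&data, memory_order_relaxed));
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&p, NULL, producer, NULL);
    pthread_create(&c, NULL, consumer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```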

The driver situation is less than ideal. Mesa got support for Mali, but that’s not the only GPU that ships with ARM chips, and you get bonkers situations. E.g. with my RK3399-based NanoPC a couple of years ago (haven’t checked in a while, and yes, it’s a Mali), Rockchip’s blob supported Vulkan for Android but only GLES for Linux, as Rockchip never paid ARM the licensing fees for that.

And honestly, ARM is on the way down: chip producers are antsy about the whole Qualcomm affair, and Qualcomm itself is definitely moving away from ARM. As such, my bets for the long and even the mid-term are firmly on RISC-V. It still lacks desktop performance, but with mobile players getting into the game, laptops aren’t far off.

Windows, as always, turns out to be the main villain.

Microsoft is actually pushing Windows on ARM right now, since their exclusivity deal with Qualcomm expired. This is going to get interesting.

w2tpmf

Windows has nothing to do with it. They are talking about software applications that were made for x86. Stuff like Adobe CC, etc.

Windows runs on ARM (and has for a decade) and the apps available in the Windows app store run on ARM.

Apple has shown that the market could be willing to adapt.

But then again, they’ve always had more leverage than the Wintel-crowd.

But what people seem to ignore is that there is another option as well: hardware emulation.

IIRC, old AMD CPUs, notably the K6, were actually a RISC core with a translation layer turning x86 instructions into the necessary chain of RISC operations.

That could also be a potential alternative to switching outright: if 80% of your code runs natively and the other 20% goes through a hardware translation layer where the cost is more energy than performance, you might have a compelling product.

Virtually all modern x86 chips work that way.
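For illustration, a toy sketch of that “cracking” step, with invented micro-op names (nothing like a real decoder): a single memory-destination add becomes a load, an ALU op, and a store:

```c
// Toy sketch of CISC-to-micro-op "cracking", the K6-style technique
// virtually every modern x86 core still uses. The micro-op format and
// the instruction chosen here are invented for illustration; real
// decoders are enormously more complex.
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;
typedef struct { uop_kind kind; int dst, src, addr; } uop;

// One x86-style "add [mem], reg" becomes three RISC-like micro-ops:
// load the memory operand into a temp, add, store the result back.
static int crack_add_mem_reg(int addr, int reg, uop out[]) {
    out[0] = (uop){ UOP_LOAD,  0,  -1, addr }; // tmp0 = [addr]
    out[1] = (uop){ UOP_ADD,   0, reg,   -1 }; // tmp0 += reg
    out[2] = (uop){ UOP_STORE, -1,  0, addr }; // [addr] = tmp0
    return 3; // number of micro-ops emitted
}

int main(void) {
    uop uops[3];
    int n = crack_add_mem_reg(0x1000, 5, uops);
    for (int i = 0; i < n; i++)
        printf("uop %d: kind=%d dst=%d src=%d addr=%d\n",
               i, uops[i].kind, uops[i].dst, uops[i].src, uops[i].addr);
    return 0;
}
```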

@barsoap@lemm.ee

Microcoding has been a thing since the 1950s; it’s the default. Early RISCs tried to do away with it, and for a brief time RISCs weren’t microcoded more or less by definition, but it snuck back in because hard-wiring everything just isn’t worth it. You can maybe get away with it on MIPS, but ARM? Tough luck. RISC-V can be done hard-wired, and that can make microcontroller-scale chips simpler, but you can also implement the full RV32I instruction set in terms of RVC (the compressed subset) and be faster. Not to mention that when you get to things like the vector extensions, you definitely want microcode. The Cray-1 was hardwired, but they, too, dropped that approach for a reason.

I guess these days RISC more or less means “a decent chunk of the instruction set is not microcoded but can instead serve as microcode”, whereas with modern CISC processors the instruction set and the microcode may have no direct correspondence at all.
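As a small concrete instance of that RVC/RV32I relationship, expanding a compressed instruction into its full 32-bit form is a cheap, fixed bit rewiring. A sketch for c.addi alone, with field layouts as given in the RISC-V spec:

```c
// Sketch: expand the 16-bit compressed c.addi (quadrant 01, funct3=000)
// into its 32-bit RV32I equivalent, addi rd, rd, imm. Field layouts
// follow the RISC-V spec; validity checks are omitted for brevity.
#include <stdint.h>
#include <stdio.h>

static uint32_t expand_c_addi(uint16_t c) {
    uint32_t rd  = (c >> 7) & 0x1f;          // rd/rs1 in bits 11:7
    uint32_t imm = ((c >> 2) & 0x1f)         // imm[4:0] in bits 6:2
                 | (((c >> 12) & 0x1) << 5); // imm[5]   in bit  12
    if (imm & 0x20)                          // sign-extend 6 -> 12 bits
        imm |= 0xfc0;
    // I-type: imm[11:0] | rs1 | funct3=000 | rd | opcode=0010011
    return (imm << 20) | (rd << 15) | (rd << 7) | 0x13;
}

int main(void) {
    uint16_t c = 0x1579;  // c.addi x10, -2
    // Should print 0xffe50513, i.e. addi x10, x10, -2
    printf("expanded: 0x%08x\n", expand_c_addi(c));
    return 0;
}
```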

@DJDarren@thelemmy.club

“Apple has shown that the market could be willing to adapt.”

It’s less that they’ll adapt, and more that they don’t really care. And particularly in the case of Apple users: their apps are (mostly) available on their Macs already. The vast majority of people couldn’t tell you what architecture their computer runs on and will just happily use whatever works and doesn’t cost them the earth.

I didn’t mean the customers, but sure.

meseek #2982

Except software applications like Adobe CC have supported ARM for nearly 5 years now. As does most software, because mobile exists (and mobile is almost exclusively ARM) and these days apps need to cover desktop, mobile, and web. ARM has essentially been forced on everyone because of mobile. Whether they like it or not, ARM is here to stay.

But none of this is a technical limitation; it’s a political one. Companies like MS don’t care about the technology. They just care about moving in a way that gives them control, so they can maintain and expand their monopoly through licensing and other restrictions.

And if it weren’t for those meddling GNU followers, it would have gotten away with it, too.

interolivary

Doesn’t Microsoft have something similar to Apple’s Rosetta 2 JIT x86 -> ARM code translation kajigger? I could swear I’ve seen something like that mentioned.

Edit: not sure whether it was WOW64 that I read about; that seems to only work for running 32-bit Intel code on ARM (although I have no idea whether that’s actually a problem when running modern Windows binaries; the last Windows I ran was Vista).

aard

They have, and in my experience it works nicer than Rosetta.

Windows 10 had it limited to 32-bit binaries (but Windows 10 on ARM is generally very broken). Windows 11 can handle both 32-bit and 64-bit emulation.
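(Side note: a process can ask at runtime whether it’s being translated. A minimal macOS sketch using the documented sysctl.proc_translated sysctl; Windows has a rough analogue in IsWow64Process2:)

```c
// macOS sketch: ask the kernel whether this process runs translated.
// Rosetta 2 exposes the documented sysctl "sysctl.proc_translated":
// 1 = translated, 0 = native; the name is absent on older systems.
#include <stdio.h>
#include <sys/sysctl.h>

int main(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
        puts("native (no translation layer present)");
    else
        puts(translated ? "running under Rosetta 2" : "native binary");
    return 0;
}
```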

interolivary

Yeah, I recall it somehow being better designed than Rosetta, but I can’t dig up where I read about it.

aard

I don’t want to go into too much detail - from a high-level perspective, the Windows version integrates better into the overall system. In Rosetta, once you’re in the emulation layer it can be rather complicated to execute native components from there. In Windows - with some exceptions - that’s not a problem.

lol it’s already out there on tens of millions of laptops, but I guess hubris is the way to go

aname

Bollocks! 64k RAM is enough for anything!

StarDreamer

A more recent example:

“Nobody needs more than 4 cores for personal use!”

thingsiplay

I don’t know who said this, but my bet would be Intel. Without AMD, we would probably still be stuck on 4 cores.

StarDreamer

Yep it’s Intel.

They said it up until their competitor started offering more than 4 cores as a standard.

They’re of course exaggerating a little and speaking confidently, because they’re in the business of selling a product, not in the business of trash-talking what they sell or reducing confidence in it.

That said, the M1/M2 battery life gains were a huge leap forward when the chips first launched, but in terms of battery efficiency and power AMD has been nipping at Apple’s heels, and in due time Intel will likely get its act together and join them. You can already get Ryzen laptops efficient and cool-running enough that the fan stays off during most light usage, and some models get battery life into the mid-to-high teens of hours.

Likewise, even Macs will start to drain quite a bit when, say, watching an HD video at 1.75x speed, playing a video game, or encoding something using max CPU power. So while the Macs do have a performance-per-watt advantage, you’ll still need to be plugged in sometimes.

And that’s the BEST of ARM versus Intel and AMD as they catch up. Samsung, Google, and Qualcomm don’t really have anything like the M2 in play, and while Qualcomm is rumored to be close, the Samsung-fabbed chips definitely aren’t.

So as things stand, the death of Intel and AMD has been greatly exaggerated, partly due to a combination of the usual Apple hype and that hype being VERY VERY justified this time around.

@SenorBolsa@beehaw.org

Yeah, I hope so, but as a public company they also can’t just lie about the direction they think they’re headed. With the kind of progress translation has made, it just seems inevitable that the switch will happen, at least for lower-power consumer devices (lower power being relative to a high-end workstation). It’ll be interesting to see whether this means a pivot to commercial-only products.

@abhibeckert@beehaw.org

“Likewise, even Macs will start to drain quite a bit when, say, watching an HD video at 1.75x speed, playing a video game”

That’s not my experience. I can play demanding games (CPU/GPU flat out) for several hours on battery on my Mac, and it only has a 50Wh battery.

With “normal” use I get about 18 hours on a charge.

I generally charge it overnight, like a phone, except I don’t do it every night. I often don’t even have access to a charger for days at a time, a laptop charger isn’t part of my normal travel kit. If I notice the battery “running low” that means I need to find a charger in, like, five hours time.

The high-end MacBook Pros, with a 12-core CPU and 38-core GPU… yeah, those can draw a lot of power. In fact, they can even drain the battery while plugged into a charger if you really push them. But I don’t think of those as “proper” laptops. They’re more like a portable desktop.

A demanding game on a MacBook Air M2 will still draw close to 30 watts, and while that’s actually still good for a laptop relative to the output, and you can probably improve it by tweaking in-game settings, it’s still going to suck power out of a 50 Wh battery.
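Rough arithmetic on the figures in this thread (runtime is just battery energy divided by sustained draw):

```latex
t = \frac{E}{P}:\qquad
\frac{50\,\text{Wh}}{30\,\text{W}} \approx 1.7\,\text{h (demanding game)}
\qquad\text{vs.}\qquad
\frac{50\,\text{Wh}}{18\,\text{h}} \approx 2.8\,\text{W (average draw over the claimed 18-hour day)}
```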

Steam Decks also run an efficient Ryzen APU that lets them play games for 2-8 hours depending on how things are tweaked. Likewise, on my 39 Wh Ryzen ThinkPad (the Intel version got a 59 Wh battery, don’t get me started on that nonsense) I can get 8-12 hours of normal browsing, depending on usage.

This isn’t to take down the M1 and M2. They are definitively more powerful and definitively more efficient; I’m not disputing that. But the gap isn’t as huge as it was when the M1 launched.

@abhibeckert@beehaw.org

I’m on an M1 MacBook Air. AnandTech measured between 11 and 17 watts with an M1 Mac Mini.

However, the Mac Mini has an excessively large cooling system for the chip it runs (before Apple Silicon, they sold the same Mac with an Intel i7 that turbo boosted to 4.6 GHz).

The MacBook Air has basically no cooling at all, and it definitely throttles under high load. It’s still fast enough to get 60fps with good graphics settings while throttled in the games I play - I’d say it’s about on par with my gaming PC with its entry-level Nvidia GPU - but there’s no way it’s drawing as much power as in AnandTech’s testing of an actively cooled chip.

Based on the battery life I’m getting, I’d guess it’s drawing somewhere around 8 watts on average while playing games. It’s a very efficient chip… it draws 0.2 watts while idle, according to AnandTech’s testing. Remember, this family of chips started life in devices with a 10 Wh battery, and the MacBook Air isn’t much faster than an iPhone.

You are absolutely right about efficiency. Even the (less efficient) M2 is way better than, for example, the Ryzen 6800U under single-threaded load: roughly 5 W versus 15 W, making the 6800U around 3 times as power hungry as the M2 while performing slightly worse.

The M1 is around 25% more efficient than the M2.

@abhibeckert@beehaw.org

“The M1 is around 25% more efficient than the M2.”

No, that’s not right. The M2 is far more efficient. Third-party tests report the M2 MacBook Air lasts up to 3 hours longer than the M1.

Yes, it draws more power under peak load… but it more than makes up for that with better performance allowing it to return to an idle state more quickly. Give an M1 and an M2 the same task, and the M2 will draw less power to get the task done.

Your original discussion with @lemillionsocks@beehaw.org was about power usage while gaming and the corresponding worst-case battery life. I was referring to that as efficiency.

I understand now that the term was misleading. The M1 is around 25% more frugal than the M2 under worst-case load.

Dark Arc
link
fedilink
English
910M

It’s possible this is a result of improvements Intel is planning for its x86 chips. They’ve already mirrored the efficiency-core and performance-core split that AFAIK originated with ARM.

In a way, this might be Intel making a prediction based on history: years ago Intel launched an x86 replacement (Itanium) while AMD launched x86-64… and AMD won because people didn’t want to rebuild all their software, or couldn’t get new builds of it.

Yeah, but back then it wasn’t 90% web apps. Also, programming languages are much better now at supporting both platforms. ARM is far from being a little player anymore.

Dark Arc
link
fedilink
English
110M

That’s true, but Windows ARM and Linux desktop ARM are still pretty niche.

The web-apps thing definitely makes it a lot easier for ARM to take off in the PC segment. Though a lot of those devices are pretty well served by Chromebooks… many of which, I think, are already ARM.
