From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: comp.sys.mac.system,comp.sys.mac.hardware.misc,misc.phone.mobile.iphone,comp.mobile.ipad
Subject: Apple Silicon M1 Chip in MacBook Air Outperforms High-End 16-Inch MacBook Pro and All iOS Devices
Date: 12 Nov 2020 17:37:00 GMT
Message-ID: <i15a5sFk7a8U1@mid.individual.net>
Apple's chip design team is *killing* it! Bravo!
Apple Silicon M1 Chip in MacBook Air Outperforms High-End 16-Inch
MacBook Pro and All iOS Devices
---
Apple introduced the first MacBook Air, MacBook Pro, and Mac mini with
M1 Apple Silicon chips yesterday, and as of today, the first benchmark
of the new chip appears to be showing up on the Geekbench site.
<https://jmp.sh/X57oksC>
The M1 chip, which belongs to a MacBook Air with 8GB RAM, features a single-core score of 1687 and a multi-core score of 7433. According to
the benchmark, the M1 has a 3.2GHz base frequency.
When compared to existing devices, the M1 chip in the MacBook Air
outperforms all iOS devices. For comparison's sake, the iPhone 12 Pro
earned a single-core score of 1584 and a multi-core score of 3898, while
the highest ranked iOS device on Geekbench's charts, the A14 iPad Air,
earned a single-core score of 1585 and a multi-core score of 4647.
<https://jmp.sh/E0RKkIv>
Single Core benchmarks
In comparison to Macs, the single-core performance is better than *any*
other available Mac, and the multi-core performance beats out all of the
2019 16-inch MacBook Pro models, including the 9th-generation high-end 2.4GHz Intel Core i9 model. That high-end 16-inch MacBook Pro earned a single-core score of 1096 and a multi-core score of 6870.
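For a rough sense of the gaps, here is a back-of-envelope sketch in Swift using only the scores quoted above (approximate, and it says nothing about GPU or sustained performance):

import Foundation

// Rough percentage gaps between the Geekbench 5 scores quoted in this post.
let m1Air = (single: 1687.0, multi: 7433.0)
let i9Mbp16 = (single: 1096.0, multi: 6870.0)   // 2019 16-inch MacBook Pro, Core i9
let a14IpadAir = (single: 1585.0, multi: 4647.0)

func gap(_ new: Double, _ old: Double) -> String {
    String(format: "%+.0f%%", (new / old - 1) * 100)
}

print("M1 vs i9, single-core:  \(gap(m1Air.single, i9Mbp16.single))")   // about +54%
print("M1 vs i9, multi-core:   \(gap(m1Air.multi, i9Mbp16.multi))")     // about +8%
print("M1 vs A14, multi-core:  \(gap(m1Air.multi, a14IpadAir.multi))")  // about +60%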
Though the M1 chip is outperforming the 16-inch MacBook Pro models when
it comes to raw CPU benchmarks, the 16-inch MacBook Pro likely offers
better performance in other areas such as the GPU, since those models have high-power discrete GPUs.
<https://jmp.sh/qkyu8hL>
Multi Core benchmarks
It's worth noting that there are likely to be some performance
differences between the MacBook Pro and the MacBook Air even though
they're using the same M1 chip because the MacBook Air has a fanless
design and the MacBook Pro has a new Apple-designed cooling system.
There's also a benchmark for the Mac mini, though, and it has about the
same scores.
The Mac mini with M1 chip that was benchmarked earned a single-core
score of 1682 and a multi-core score of 7067.
Update: There's also a benchmark for the 13-inch MacBook Pro with M1
chip and 16GB RAM that has a single-core score of 1714 and a multi-core
score of 6802. Like the MacBook Air, it has a 3.2GHz base frequency. A
few other MacBook Air benchmarks have surfaced too with similar scores,
and the full list is available on Geekbench.
<https://www.macrumors.com/2020/11/11/m1-macbook-air-first-benchmark/>
--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR
On 12 Nov 2020 17:37:00 GMT, Jolly Roger wrote:
Apple's chip design team is *killing* it! Bravo!
See also the actual _facts_ (not utter bullshit from Apple MARKETING)...
o The new ARM technology TSMC Silicon powered MacBook
On 2020-11-12 9:54 a.m., Arlen Holder wrote:
On 12 Nov 2020 17:37:00 GMT, Jolly Roger wrote:
Apple's chip design team is *killing* it! Bravo!
See also the actual _facts_ (not utter bullshit from Apple MARKETING)...
o The new ARM technology TSMC Silicon powered MacBook
Since that is utter bullshit, why should I bother looking further.
Apple designs its own chips, Arlen.
Accept it.
On 2020-11-12, Alan Baker <notonyourlife@no.no.no.no> wrote:
On 2020-11-12 9:54 a.m., Arlen Holder wrote:
On 12 Nov 2020 17:37:00 GMT, Jolly Roger wrote:
Apple's chip design team is *killing* it! Bravo!
See also the actual _facts_ (not utter bullshit from Apple MARKETING)...
o The new ARM technology TSMC Silicon powered MacBook
Since that is utter bullshit, why should I bother looking further.
Apple designs its own chips, Arlen.
Accept it.
Poor little butt-hurt Arleen is a pathetic waste of life.
In message <i15a5sFk7a8U1@mid.individual.net> Jolly Roger <jollyroger@pobox.com> wrote:
The M1 chip, which belongs to a MacBook Air with 8GB RAM,
and:
In comparison to Macs, the single-core performance is better than *any*
other available Mac, and the multi-core performance beats out all of the
2019 16-inch MacBook Pro models, including the 9th-generation high-end
2.4GHz Intel Core i9 model. That high-end 16-inch MacBook Pro earned a
single-core score of 1096 and a multi-core score of 6870.
Suck it, JF.
On 2020-11-13 05:19, Lewis wrote:
In message <i15a5sFk7a8U1@mid.individual.net> Jolly Roger <jollyroger@pobox.com> wrote:
The M1 chip, which belongs to a MacBook Air with 8GB RAM,
and:
In comparison to Macs, the single-core performance is better than *any*
other available Mac, and the multi-core performance beats out all of the 2019 16-inch MacBook Pro models, including the 9th-generation high-end
2.4GHz Intel Core i9 model. That high-end 16-inch MacBook Pro earned a
single-core score of 1096 and a multi-core score of 6870.
Suck it, JF.
Don't be a child about it. The poor bastard just plunked down hard-earned cash to upgrade his Mac Pro (which he neither needed nor could afford at the time)...
On 2020-11-12 12:37, Jolly Roger wrote:
Apple's chip design team is *killing* it! Bravo!
Apple Silicon M1 Chip in MacBook Air Outperforms High-End 16-Inch
MacBook Pro and All iOS Devices...
This is all making me wonder if I shouldn't wait for the next Mx Mini
and get one really nice display and a new sidebar display in lieu of a
new iMac - although I expect that to be a real killer too when it comes out.
On 2020-11-13 15:27, Lewis wrote:
that doesn't excuse his constant stream of pig-ignorance and bullshit
and constant harping about how things worked on the VAX 30 years ago.
Constant? Really? You can't find a better insult?
The VAX example is relevant.
In message <VzDrH.541236$HY4.328729@fx37.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
The VAX example is relevant.
Nope.
Hint: as soon as you string the letters V A and X together, no one
reads any further.
In article <slrnrquabb.1fdt.g.kreme@ProMini.lan>, Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <VzDrH.541236$HY4.328729@fx37.iad> JF Mezei
<jfmezei.spamnot@vaxination.ca> wrote:
The VAX example is relevant.
Nope.
Hint: as soon as you string the letters V A and X together, no one
reads any further.
you're missing out on additional laughs.
The VAX example is relevant because VMS made the memory differences quite visible to the system manager: different areas of the OS were set as parameters in SYSGEN. On OS X, because the memory management is totally opaque to the user, you don't really see the difference. But the difference happens.
So I maintain my argument that reducing memory by half on its laptops is
not going to be good.
On 2020-11-13 19:48, Lewis wrote:
Hint: as soon as you string the letters V A and X together, no one
reads any further.
Since you don't actually read my posts and just find any spot where you can start insulting me, it doesn't matter. But admitting you don't read the post should send a message that your insults are not based on facts; you just automatically insult without reading, and whenever challenged to provide actual facts to show I was wrong, you refuse.
I have experience going from one platform to another where memory requirement differences were very visible to the system manager. Yet you instead choose to discredit and insult me.
When x86 went from 32 to 64 bits, the transition was basically invisible because neither Windows nor OS X exposes you to the intricate memory management now done by the OS automatically.
And more importantly, compatibility meant that existing 32-bit binaries ran unchanged using 32-bit addresses and opcodes. The change was gradual as more and more applications were compiled in 64-bit mode.
In the case of Apple, the progressive move to 64 bits was compensated by the progressive removal of 32-bit frameworks, and eventually the removal of all 32-bit frameworks, which meant only one copy loaded into memory.
But with this all done before the move to ARM, the move to ARM does expose the binary size to how many ARM opcodes are necessary to replace x86 ones.
Again, you or your ilk claimed that binaries would be smaller, to explain why the laptop only has 8GB or 16GB.
If by "basically invisible" you mean "a whole lot of work, and very
much a substantial effort for many developers", sure.
I fail to see the connection from your earlier statements within this posting to your discussion of differing instruction sets and
instruction densities.
On 2020-11-17 17:18, Stephen Hoffman wrote:
If by "basically invisible" you mean "a whole lot of work, and very
much a substantial effort for many developers", sure.
From the user's point of view, the Mac transitions did not expose one to system memory management changes.
This is in the context of the RAM limitations of the new Macs. Someone argues that moving from x86 to ARM would require less memory, a premise I disagree with. (They said an 8GB M1 Mac would be just as capable as a 16GB Intel one.)
I tried to explain that going 64-bit and going RISC ends up with larger binaries, and provided the example of VMS, where we were exposed in SYSGEN to those changes, needing far greater allocation of space for shareable images, larger working sets, etc.
This is all opaque to Mac users, but it doesn't mean that moving from x86 to ARM will end up requiring less RAM.
The transition to 64-bit on x86 was more gradual, and as each 32-bit framework was withdrawn over the years, the increased size of new 64-bit apps was compensated by the reduced footprint from the elimination of 32-bit APIs.
But with the move to ARM, Apple has already completed the pruning of 32-bit APIs, so there is no saving in RAM from removing old APIs; it is just a matter of whether the ARM binary is larger than the x86 binary for the same app (needing more memory to load). The data allocations would be the same.
(With regard to universal binaries, the file containing the binary may be larger, but one assumes that the image loader only loads the appropriate slice into memory.)
I fail to see the connection from your earlier statements within this
posting to your discussion of differing instruction sets and
instruction densities.
This was in relation to someone arguing that the binaries for an ARM platform would be smaller than those for x86, and thus it was OK for the new laptops to have less RAM in them.
You know that instead of drawing analogies you can actually measure it, right?
…and other times is not (same codebase and no optimization, mind you)
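For anyone who wants to try that measurement on a Mac, a minimal sketch (the binary path is hypothetical; point it at any universal app you have, and note that slice size is only a rough proxy for runtime memory use):

import Foundation

// Compare the arm64 and x86_64 slice sizes of a universal (fat) Mach-O
// using the stock `lipo` tool that ships with macOS.
let binary = "/Applications/Some.app/Contents/MacOS/Some"  // example path only

func lipo(_ args: [String]) -> String {
    let p = Process()
    p.executableURL = URL(fileURLWithPath: "/usr/bin/lipo")
    p.arguments = args
    let out = Pipe()
    p.standardOutput = out
    do { try p.run() } catch { return "could not run lipo: \(error)" }
    p.waitUntilExit()
    return String(data: out.fileHandleForReading.readDataToEndOfFile(),
                  encoding: .utf8) ?? ""
}

print(lipo(["-archs", binary]))           // which architectures are present
print(lipo(["-detailed_info", binary]))   // offset and size of each slice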
On 2020-11-18 03:51, Krzysztof Mitko wrote:
You know that instead of drawing analogies you can actually measure it,
right?
Have you seen cases where the ARM binary is half the size of the Intel
one which would make an 8 GB M1 Mac function with same amount of
page/swap files as a 16GB Intel?
This discussion is happening because one of the nospam/lewis/whatever crowd argued the M1 chip didn't need as much memory to function as the Intel one, and that an 8GB Mac in 2020 was perfectly suitable.
Optimizations happen in the compiler and then by the LLVM layer.
The developers were certainly exposed. I had C code that needed
changes, both due to framework deprecations and due to pointer changes.
Apps using entirely OO had rather less work required, outside of
frameworks deprecations.
I don't know what the ratio would be, pending tests.
Having fast main storage
can make less main memory feasible.
Being somewhat familiar with what you were using as an example, I was confused. I'm still confused.
I'd expect a substantial effect from faster I/O buses and faster main storage than from architectural integer sizes—you're seemingly assuming old paging performance characteristics within your memory sizing calculations here,
On 2020-11-18 03:51, Krzysztof Mitko wrote:
You know that instead of drawing analogies you can actually measure it,
right?
Have you seen cases where the ARM binary is half the size of the Intel
one which would make an 8 GB M1 Mac function with same amount of
page/swap files as a 16GB Intel?
Nobody said that. Your "confusion" is that a universal binary is twice
the size as an individual binary.
Your other confusion is that the integrated devices (graphics, I/O)
share the main memory. And while that is so, it is no different than
the lower spec'd Macs that share memory with intel based graphics.
Adding on other devices (USB 3 / Thunderbolt, etc) uses relatively
little of this memory.
Most importantly, with this shared memory comes huge gains in speed.
anything in the real world. And please do not mention VAX again.
Most importantly, with this shared memory comes huge gains in speed.
This is Apple marketing.
VAX.
On 2020-11-20 19:51, Lewis wrote:
Most importantly, with this shared memory comes huge gains in speed.
This is Apple marketing.
No, you are wrong and there is tons of evidence proving you are wrong.
So based on your logic, the Intel GPUs in the Intel Laptop chips should
have outperformed the fancy external GPUs years ago because of shared
memory. Yet, when Apple unleashed its $60,000 Cheese Grater Mac Pro, it
used external GPUs, not the internal Intel ones.
On 2020-11-18 12:54, Stephen Hoffman wrote:
Being somewhat familiar with what you were using as an example, I was confused. I'm still confused.
Don't worry. It's not you.
On 2020-11-20 13:45, Alan Browne wrote:
Nobody said that. Your "confusion" is that a universal binary is twice
the size as an individual binary.
I never said that. The image activator will only load the relevant binary from disk, so the fact that the .app directory structure has a file containing an Intel binary next to the one containing the ARM binary is irrelevant to RAM requirements, since it won't ever be loaded.
Your other confusion is that the integrated devices (graphics, I/O)
share the main memory. And while that is so, it is no different than
the lower spec'd Macs that share memory with intel based graphics.
Intel laptops come with 16GB of RAM. Apple halved the default to 8GB for these laptops, with both having the GPU share RAM with the CPU.
Adding on other devices (USB 3 / Thunderbolt, etc) uses relatively
little of this memory.
I brought that up in the context of upcoming chips, whether Apple will increase the I/O subsystem and how. This is not memory related.
Most importantly, with this shared memory comes huge gains in speed.
This is Apple marketing. An application sends code to the external GPU once, and the GPU then uses its own memory controller accessing its own memory to do all its computations independently.
In the shared memory model, the GPU and CPU compete for access to memory since they use the same memory controller accessing the same memory, and this has the issues of memory allocation (hence the need for a hypervisor, I suspect).
PCI Express 4 is very, very fast. So the difference in performance is not guaranteed.
Now, if you are viewing a movie where you are constantly transferring new frames and using the GPU to display instead of to compute, then the shared memory is good. But for such a task, PCI Express 4 is more than enough anyway.
On 2020-11-20 19:22, JF Mezei wrote:
On 2020-11-20 13:45, Alan Browne wrote:
Nobody said that. Your "confusion" is that a universal binary is twice
the size as an individual binary.
I never said that. The image activator will only load the relevant binary from disk, so the fact that the .app directory structure has a file containing an Intel binary next to the one containing the ARM binary is irrelevant to RAM requirements, since it won't ever be loaded.
Your other confusion is that the integrated devices (graphics, I/O)
share the main memory. And while that is so, it is no different than
the lower spec'd Macs that share memory with intel based graphics.
Intel laptops come with 16GB of RAM. Apple halved the default to 8GB for
Google: "intel laptop 4GB" and plenty pop up. likewise 8 GB.
The low end intel ones have shared memory with graphics and IIRC AMD
have a similar scheme in some CPU's.
Your balloon just popped. (Again!).
My Mac Mini at work is "only" 8 GB and I do all of the above plus a
Fusion VM of Win 10 and all works fine - and it's an i3. Not a heavy
lifter by any stretch - but fine for administration of a small business
(and I likely do much more than most). It's far better than the MBA it replaced, to be sure (and it too, 8 GB, did fine).
I brought that up in the context of upcoming chips, whether Apple will increase the I/O subsystem and how. This is not memory related.
It's most definitely memory related - the unified memory is all about
speed through proximity and tight integration.
Metal will take care of converting OS/app level operations to GPU memory
- and that is blazing fast. The rest is 0 time as far as the CPU is concerned.
Others (nospam, JR, Lewis) have posted many links to factual information about M1 Macs beating many higher end intel Macs in performance (speed, energy). That's really all you need to know.
In effect when running Mac OS on Mx architectures it's not the same Mac
OS at the system level. And that really is the point: moving what used
to be complex OS ops out of the OS and into the chip allowing the CPU to
do higher level things much faster and by default the lower level things blaze.
On Nov 20, 2020, Alan Browne wrote
(in article <90UtH.1025$jL.287@fx13.iad>):
On 2020-11-18 12:54, Stephen Hoffman wrote:
Being somewhat familiar with what you were using as an example, I was
confused. I'm still confused.
Don't worry. It's not you.
"Anyone who isn't confused really doesn't understand the situation" - Edward R. Murrow
In message <RUauH.12386$ZbR7.10525@fx01.iad> Alan Browne <bitbucket@blackhole.com> wrote:
Others (nospam, JR, Lewis) have posted many links to factual information
about M1 Macs beating many higher end intel Macs in performance (speed,
energy). That's really all you need to know.
But he would need to 1) read 2) reread 3) reread 4) understand.
And what are the odds of that?
In effect when running Mac OS on Mx architectures it's not the same Mac
OS at the system level. And that really is the point: moving what used
to be complex OS ops out of the OS and into the chip allowing the CPU to
do higher level things much faster and by default the lower level things
blaze.
The key thing that is most impressive about the M1 is that it can run translated Intel x64 code faster than Intel CPUs can.
On 2020-11-21 11:47, Lewis wrote:
In message <RUauH.12386$ZbR7.10525@fx01.iad> Alan Browne <bitbucket@blackhole.com> wrote:
Others (nospam, JR, Lewis) have posted many links to factual information about M1 Macs beating many higher end intel Macs in performance (speed,
energy). That's really all you need to know.
But he would need to 1) read 2) reread 3) reread 4) understand.
And what are the odds of that?
Best? 3 out of 4. Obviously.
In effect when running Mac OS on Mx architectures it's not the same Mac
OS at the system level. And that really is the point: moving what used
to be complex OS ops out of the OS and into the chip allowing the CPU to do higher level things much faster and by default the lower level things blaze.
The key thing that is most impressive about the M1 is that it can run
translated Intel x64 code faster than Intel CPUs can.
That's a blackbox view so far. We don't know the cycle speed(s) so it's
hard to know what is actually what.
In message <XKbuH.41697$gR8.5257@fx45.iad> Alan Browne <bitbucket@blackhole.com> wrote:
On 2020-11-21 11:47, Lewis wrote:
In message <RUauH.12386$ZbR7.10525@fx01.iad> Alan Browne <bitbucket@blackhole.com> wrote:
Others (nospam, JR, Lewis) have posted many links to factual information about M1 Macs beating many higher end intel Macs in performance (speed, energy). That's really all you need to know.
But he would need to 1) read 2) reread 3) reread 4) understand.
And what are the odds of that?
Best? 3 out of 4. Obviously.
In effect when running Mac OS on Mx architectures it's not the same Mac OS at the system level. And that really is the point: moving what used to be complex OS ops out of the OS and into the chip allowing the CPU to do higher level things much faster and by default the lower level things blaze.
The key thing that is most impressive about the M1 is that it can run
translated Intel x64 code faster than Intel CPUs can.
That's a blackbox view so far. We don't know the cycle speed(s) so it's
hard to know what is actually what.
We do know that Intel x64 apps are running faster on the M1 than on the remaining 13" Mac Book Pro. We also know that some of the APIs execute
twice as fast in translation than on the Intel chips. I forget the exact
call that Apple mentioned, but it was something like NSRelease which
takes 30ns on intel, 8 on the M1, and 14ns on the M1 in Intel mode.
Ah, yes, here it is:
<https://daringfireball.net/2020/11/the_m1_macs>
Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current gen Intel, and ~6.5 nanoseconds on an M1
… and ~14 nanoseconds on an M1 emulating an Intel.
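For anyone curious, a ballpark version of the native measurement is only a few lines of Swift (a sketch, not Apple's methodology; build with -O, and expect the numbers to vary by machine and OS):

import Foundation

// Time a tight retain/release loop on an NSObject.
let obj = NSObject()
let unmanaged = Unmanaged.passUnretained(obj)
let iterations = 10_000_000

let start = DispatchTime.now().uptimeNanoseconds
for _ in 0..<iterations {
    _ = unmanaged.retain()   // objc_retain
    unmanaged.release()      // objc_release
}
let elapsed = DispatchTime.now().uptimeNanoseconds - start

print(String(format: "~%.1f ns per retain/release pair",
             Double(elapsed) / Double(iterations)))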
On 2020-11-18 12:54, Stephen Hoffman wrote:
The developers were certainly exposed. I had C code that needed
changes, both due to framework deprecations and due to pointer changes.
Apps using entirely OO had rather less work required, outside of
frameworks deprecations.
What I remember is that moving from 32 to 64 bits (VAX to Alpha) required a lot of changes. Migrating the UAF didn't work because WSLIMIT and related parameters were grossly insufficient, because processes now needed a lot more memory. (And the same with the corresponding SYSGEN parameters.)
The current migration is from 64-bit OS X on Intel x86 to 64-bit OS X on ARM. So the big change is not pointer size, but rather how many instructions are needed to perform the same task (and the size of those instructions).
I don't know what the ratio would be, pending tests.
Would you agree that the migration from 64-bit OS X on x86 to 64-bit OS X on ARM would NOT result in half the RAM requirement, as some here have argued?
Having fast main storage can make less main memory feasible.
Is there data that shows that an SSD can be as fast as DDR4 memory,
when you consider that all accesses go through the Secure Enclave which encrypts/decrypts data?
Being somewhat familiar with what you were using as an example, I was
confused. I'm still confused.
It is simple. Someone stated that reducing the base model from 16 to 8GB was normal because the M1 chip required much less memory than an x86 running the same OS that has already been pruned of 32-bit legacy APIs (so truly comparing apples to apples, pardon the pun).
I was trying to argue that, if anything, the memory footprint of going from CISC to RISC increases the RAM requirement.
I'd expect a substantial effect from faster I/O buses and faster main
storage than from architectural integer sizes—you're seemingly
assuming old paging performance characteristics within your memory
sizing calculations here,
Are you saying that increased reliance on paging to disk is compensated by having a faster disk and faster I/O?
With that logic, wouldn't it be fair to state that having more RAM would increase performance even further, because it would be faster than paging to disk?
Running the Adobe suite to do videos, I recently upgraded from 24 to 64GB and from 4 to 10 cores. And I am already finding the RAM insufficient. So I am quite surprised Apple finds 8GB sufficient to do any video work.
In message <XKbuH.41697$gR8.5257@fx45.iad> Alan Browne <bitbucket@blackhole.com> wrote:
On 2020-11-21 11:47, Lewis wrote:
In message <RUauH.12386$ZbR7.10525@fx01.iad> Alan Browne
<bitbucket@blackhole.com> wrote:
Others (nospam, JR, Lewis) have posted many links to factual information about M1 Macs beating many higher end intel Macs in performance (speed, energy). That's really all you need to know.
But he would need to 1) read 2) reread 3) reread 4) understand.
And what are the odds of that?
Best? 3 out of 4. Obviously.
In effect when running Mac OS on Mx architectures it's not the same Mac OS at the system level. And that really is the point: moving what used to be complex OS ops out of the OS and into the chip allowing the CPU to do higher level things much faster and by default the lower level things blaze.
The key thing that is most impressive about the M1 is that it can run
translated Intel x64 code faster than Intel CPUs can.
That's a blackbox view so far. We don't know the cycle speed(s) so it's
hard to know what is actually what.
We do know that Intel x64 apps are running faster on the M1 than on the remaining 13" Mac Book Pro. We also know that some of the APIs execute
twice as fast in translation than on the Intel chips. I forget the exact
call that Apple mentioned, but it was something like NSRelease which
takes 30ns on intel, 8 on the M1, and 14ns on the M1 in Intel mode.
Ah, yes, here it is:
<https://daringfireball.net/2020/11/the_m1_macs>
Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current gen Intel, and ~6.5 nanoseconds on an M1
... and ~14 nanoseconds on an M1 emulating an Intel.
This was David Smith, an Apple engineer (not to be confused with the developer _DavidSmith).
On 2020-11-21 17:54:43 +0000, Lewis said:
Ah, yes, here it is:
<https://daringfireball.net/2020/11/the_m1_macs>
Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on
current gen Intel, and ~6.5 nanoseconds on an M1
... and ~14 nanoseconds on an M1 emulating an Intel.
This was David Smith, an Apple Engineer (not the be confused with the
developer _DavidSmith).
Plus, the "cycle speed" is completely irrelevant because they utterly different chip types. Every sane person knows that despite having the
same clock speeds, a "3GHz 680x0" is different to a "3GHz PowerPC 605"
and diffferent to a "3GHz Intel x86" and different to a "3GHz Apple M1". That clock speed measures the chips within the same family, it's not
even remotely useful to compare different chips ... which is why the old "Intel vs PowerPC" MHz debate was always just a ridiculous stupidity.
On 2020-11-21 12:54, Lewis wrote:
In message <XKbuH.41697$gR8.5257@fx45.iad> Alan Browne <bitbucket@blackhole.com> wrote:
On 2020-11-21 11:47, Lewis wrote:
In message <RUauH.12386$ZbR7.10525@fx01.iad> Alan Browne <bitbucket@blackhole.com> wrote:
Others (nospam, JR, Lewis) have posted many links to factual information about M1 Macs beating many higher end intel Macs in performance (speed, energy). That's really all you need to know.
But he would need to 1) read 2) reread 3) reread 4) understand.
And what are the odds of that?
Best? 3 out of 4. Obviously.
In effect when running Mac OS on Mx architectures it's not the same Mac OS at the system level. And that really is the point: moving what used to be complex OS ops out of the OS and into the chip allowing the CPU to do higher level things much faster and by default the lower level things blaze.
The key thing that is most impressive about the M1 is that it can run
translated Intel x64 code faster than Intel CPUs can.
That's a blackbox view so far. We don't know the cycle speed(s) so it's
hard to know what is actually what.
We do know that Intel x64 apps are running faster on the M1 than on the
remaining 13" Mac Book Pro. We also know that some of the APIs execute
twice as fast in translation than on the Intel chips. I forget the exact
call that Apple mentioned, but it was something like NSRelease which
takes 30ns on intel, 8 on the M1, and 14ns on the M1 in Intel mode.
Ah, yes, here it is:
<https://daringfireball.net/2020/11/the_m1_macs>
Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on
current gen Intel, and ~6.5 nanoseconds on an M1
… and ~14 nanoseconds on an M1 emulating an Intel.
Good data points. Thx. Astounding. Did I ever tell you about
migrating from VAX to Alpha? You see, ...
Google: "intel laptop 4GB" and plenty pop up. likewise 8 GB.
Hypervisor? You're fabulating (again). Further the memory controller function is implicit in the ARM h/w -
In short, the memory required by a device is allocated by the OS and
thus unavailable to the OS or apps for other purposes.
You just don't need it to move app graphics data to a GPU.
Higher end (pro) Macs may yet use 3rd party GPU's and will need data transported to them. Or Apple may simply up their game on the GPU side
by another leap and the era of 3rd party GPUs ends for Macs...
In effect when running Mac OS on Mx architectures it's not the same Mac
OS at the system level. And that really is the point: moving what used
to be complex OS ops out of the OS and into the chip allowing the CPU to
do higher level things much faster and by default the lower level things blaze.
Apple have tremendous resources of all kinds with the sales to support bleeding edge RD&E. Can you imagine if Apple got in their bonnet to do
the world's most powerful supercomputer based on the emergent Mx architecture?
Maybe your app mix doesn't fit within the target market for these early M1-based systems,
Plus, the "cycle speed" is completely irrelevant because they utterly different chip types.
I have a 3-minute video in After Effects that causes it to grow to ~30GB of RAM when working on it, and when rendering it takes up the full 64GB with the additional processes.
On 2020-11-21 12:54, Lewis wrote:
<https://daringfireball.net/2020/11/the_m1_macs>
Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on
current gen Intel, and ~6.5 nanoseconds on an M1
… and ~14 nanoseconds on an M1 emulating an Intel.
This is expected, since that call from an Intel image simply has overhead to gather the Intel-format argument list and call the native ARM NSObject routine, which executes with all optimizations for ARM because it was compiled for ARM, and then returns results to the jacket routine, which reformats the return code in Intel format and returns to the calling program.
Suspect you will see the same ~8 nanosecond overhead on all system service calls.
On 2020-11-21 15:43, Your Name wrote:
Plus, the "cycle speed" is completely irrelevant because they utterly
different chip types.
But extremely relevant if an M1 in an Air were clocked at a lower speed than an M1 in a MacBook Pro or Mini.
Apple did not divulge that information, and it is only through benchmarks that one can see that they _appear_ to all have the same clocks.
When I upgraded my CPU, I had a number of choices. My 4-core Xeon was at 3.7GHz, the 10-core one at 3.0GHz, and the 12-core at 2.7GHz. These are the same E5 generation, same socket, same memory controller/speed. So yes, clock speed matters, because you lose significant speed on the 12-core one for single-core tasks.
The clock speed can't be used to compare anything/everything, but there are cases where it is a very important metric when comparing different variants of the same chip generation/architecture.
(It isn't the only one, since the internal caches also matter, though perhaps this is different in the M1; but we really have no details on how Apple implemented its memory other than "unified memory", which is meaningless marketing keynote gobbledygook.)
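To put numbers on that Xeon trade-off (same generation assumed, turbo and IPC differences ignored, so treat this as a rough sketch):

import Foundation

// Single-thread clock vs aggregate throughput for the three options above.
let options = [(cores: 4, ghz: 3.7), (cores: 10, ghz: 3.0), (cores: 12, ghz: 2.7)]
for o in options {
    let aggregate = Double(o.cores) * o.ghz
    print("\(o.cores) cores @ \(o.ghz) GHz: single-thread \(o.ghz) GHz, "
          + String(format: "aggregate ~%.1f core-GHz", aggregate))
}
// 4 x 3.7 = 14.8, 10 x 3.0 = 30.0, 12 x 2.7 = 32.4: more total throughput,
// but roughly 27% less clock for any single-core task on the 12-core part.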
On 2020-11-21 14:58, Stephen Hoffman wrote:
Maybe your app mix doesn't fit within the target market for these early
M1-based systems,
The problem is that Apple didn't market these laptops as great for running browsers or word processors; it specifically targeted video editing, which has been one of the highest consumers of RAM.
I'd have no problem if Apple marketed the laptop for viewing movies, browsing the web, writing emails, composing text documents, and managing your photos. But Apple explicitly and repeatedly mentioned video editing.
Perhaps Final Cut Pro got a magic update where it no longer requires RAM to process 4K movies.
there's a final cut pro benchmark where an m1 macbook pro finished
rendering a video faster and with plenty of battery to spare, while the
intel macbook drained the battery partway through and had to be plugged
in for it to complete.
You need to read better, you lumpoy moron. The M1 mac runs the Intel
code FASTER in translation that the code runs on an Intel CPU.
But extremely relevant if an M1 on an Air were clocked at lower speed
than an M1 on MacBook pro or Mini.
Wrong... ...again.
And yet, reviewers have actually DONE video testing...
...and it works great.
On 2020-11-21 11:05, Alan Browne wrote:
Google: "intel laptop 4GB" and plenty pop up. likewise 8 GB.
Try doing video work with only 4GB of RAM.
Intel laptops come with 16GB of RAM. Apple halved the default to 8GB for these laptops, with both having the GPU share RAM with the CPU.
Hypervisor? You're fabulating (again). Further the memory controller
function is implicit in the ARM h/w -
In a normal CPU with multiple cores, there must be a memory controller to synchronize cache coherence and memory access. Two cores can't access the same memory at the same time, for instance. But when these all run the same instance of the operating system, the operating system at least has control of the mapping of real vs virtual memory.
When you have different instances, you need a hypervisor to manage resources such as disk access, access to RAM, etc.
In this case, you have a separate instance for the GPU which isn't the OS X that you see on your screen. You have a separate instance for the neural engine. The Secure Enclave is more of an I/O device, so it is likely mapped as an I/O device in terms of memory access.
So when the GPU wants to access RAM, it can't just willy-nilly access RAM, because the OS X instance is using RAM and doesn't expect someone else to write/change RAM content that is allocated to a process in OS X.
It might be possible for OS X itself to manage the GPU's memory to ensure it uses memory OS X has authorized it to read/write or read only. But you still need a controller for cache coherence between all users (CPU cores, GPU cores, neural cores, etc.) as well as ensuring two don't access the same location at the same time.
In short, the memory required by a device is allocated by the OS and
thus unavailable to the OS or apps for other purposes.
The issue is that the GPU is an independent processor not controlled by OS X. So if it decides it now needs 4GB of RAM instead of 2GB to render a complex scene, it can't just take that RAM from the 8GB the system has. There needs to be a mechanism for fair allocation to ensure one doesn't step on the other.
You just don't need it to move app graphics data to a GPU.
You are forgetting GPUs being used as co-processors to render scenes and send the data back to the CPU (video rendering of effects to make a movie on disk, as opposed to just playing a movie or game with output going to the screen).
Higher end (pro) Macs may yet use 3rd party GPU's and will need data
transported to them. Or Apple may simply up their game on the GPU side
by another leap and the era of 3rd party GPUs ends for Macs...
Which is why looking at how Apple has done M1 is important because it
likely shows how they intend to scale it to higher end machines.
In effect when running Mac OS on Mx architectures it's not the same Mac
OS at the system level. And that really is the point: moving what used
to be complex OS ops out of the OS and into the chip allowing the CPU to
do higher level things much faster and by default the lower level things
blaze.
Marketing gobbledygook.
The OS exists to manage workloads. It is the individual apps that are written to make use of the neural engine or the GPU. It is the apps that have the Metal code they send to the GPU to be compiled and executed, not the OS that decides to optimize a request and send it to the neural engine or to the GPU.
You are confusing Apple's internal use of those co-processors (such as the neural engine for Face ID, the Secure Enclave for Touch ID, etc.) with widespread use of them. These are explicitly called, not something the OS decides to use on the fly for your process.
Apple have tremendous resources of all kinds with the sales to support
bleeding edge RD&E. Can you imagine if Apple got in their bonnet to do
the world's most powerful supercomputer based on the emergent Mx
architecture?
So far, Apple has successfully scaled up the iPhone to laptops. The performance is good enough to match/beat midrange, so beyond entry-level laptops for raw CPU power, but useless for high workloads due to the 16GB RAM limit.
Let's wait and see if Apple develops a chip that competes with IBM POWER or even Xeon (by the time Apple gets there, Intel might have a modern Xeon on the market). HP has tried to sell ARM-based server farms, so this isn't new. But you need more than 16GB if you want to be taken seriously.
And access to memory by many, many cores is THE holy grail in high-end computing.
So it remains to be seen how Apple managed the memory controller in the M1 and whether that is a scalable architecture to, say, 24 cores and 256GB of RAM.
On 2020-11-22 17:35, Lewis wrote:
You need to read better, you lumpoy moron. The M1 mac runs the Intel
code FASTER in translation that the code runs on an Intel CPU.
I did not deny this. I explained that the reason a translated image calling a system service sees little difference with a native image calling the same system service is that both execute the same code for the system service, except the translated image first goes to a jacket routine that builds the argument list in ARM format to call the native system routine, and that this overhead is small and would likely be fixed for all system service calls.
So it remains to be seen how Apple managed the memory controller in M1
and whether that is a scalable architecture to say 24 cores and 256GB of RAM.
The M1 is a kickoff point. There is nothing I can find limiting the
number of cores or memory size to such a low number. I wouldn't be surprised if the limit exceeds 256 cores and many, many TB of RAM.
On 2020-11-22 17:21, nospam wrote:
there's a final cut pro benchmark where an m1 macbook pro finished
rendering a video faster and with plenty of battery to spare, while the
intel macbook drained the battery partway through and had to be plugged
in for it to complete.
Comparing video rendering against an Intel chip that was never pitched as high performance is useless. Comparing it against a Core i7 or i9 in an iMac Pro is relevant.
So the fact that the M1 is up there with the iMac Pro that has a Xeon is relevant; the fact that it beats an old laptop with a laptop chip isn't. (This is purely in the context of rendering videos, and how scalable the M1 will be in the next few months.)
However, I am very perplexed by the low RAM requirements to render a large movie. I have to wonder if this is just adding a few simple titles and cuts/fades, or whether this editing includes a lot of special effects.
On 2020-11-22 18:00, Alan Baker wrote:
And yet, reviewers have actually DONE video testing...
...and it works great.
Be careful about early reviews, because they are "sponsored" by Apple, who provides the reviewers with free machines.
But the fact that the M1 is performing very well in Cinebench shows it does have much power in it.
Had Apple provided actual numbers during the keynote, it would have impressed, instead of leading people to wonder why Apple was so purposefully opaque with useless graphs with no scale, etc.
On 2020-11-22 17:41, Alan Baker wrote:
But extremely relevant if an M1 on an Air were clocked at lower speed
than an M1 on MacBook pro or Mini.
Wrong... ...again.
I did not claim it was clocked lower. But the fact that Apple did not publish this means that we have to wait for independent benchmarks to see what the M1 in the Air looks like compared to the M1 in the Pro and Mini.
They are coming out now, and we are starting to see that all three computers appear to have the same baseline clock speeds. But no numbers on whether they have a speed boost or not. We do know that the Air does throttle back a bit for long workloads, but not that much.
That is not a hypervisor. A hypervisor is a system that lets a computer operate several OS' concurrently.
These things are implicit in the OS' management and allocation of
resources and dispatching via threads.
That memory is allocated and fixed. The CPU (naturally) can at OS
privilege levels read the o/p of whatever subcontracted operations there are.
Marketing? Really? A key reason these Macs are so blazing fast is in
good part due to the unified memory.
fact separate chip carriers on the M1 SOC. But that is mapped for the
whole M1 to see and write - including GPU and lots of high speed I/O -
and governed by the memory controller - in turn controlled by the OS.
The OS, controls the configuration and access to the devices via the SDK's/API's etc.
There is nothing about the ARM design that limits the memory to 16GB.
I don't think this is such a big challenge for Apple. You do know they dwarf intel right? Hint: intel is 10X bigger than hp and Apple is ___X bigger than intel.
You do not know that. Your assumptions about how Rosetta works are absolutely simple. Apple did not invest "simple" where Rosetta is concerned.
I am watching a video where actual rendering, not just playing of a video, is tested, and the MacBook Pro with 8GB RAM beat the pants off a 2019 Mac Pro (cheese grater) with 192GB.
The tester did find that with 8K video, there were problems playing back. (On the other hand, displaying on a tiny screen isn't an actual 8K display; it is a scaled-down image, whereas on the Mac Pro the guy had a real display.)
He noted plenty of swap file being used.
So if performance is this good with swapping, I have to wonder what it will be like if/when Apple releases models with up to 256GB of memory.
The video I watched:
https://youtu.be/GMzT3bajoPs
Note the software he is using is very GPU intensive, not CPU intensive.
On a test of Premiere Pro, he noticed this was CPU intensive, not GPU intensive, and saw dropped frames playing back. The real horsepower is needed when you export, though, not when playing back.
I am surprised disk I/O would be so fast for swapping/paging since it all goes through the Secure Enclave for encryption. So that part must be super fast too.
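One easy sanity check while such a render runs is to watch the swap totals; on a stock macOS install the standard knob is sysctl's vm.swapusage (sketch only):

import Foundation

// Print current swap usage by shelling out to the stock sysctl tool.
let p = Process()
p.executableURL = URL(fileURLWithPath: "/usr/sbin/sysctl")
p.arguments = ["vm.swapusage"]
let pipe = Pipe()
p.standardOutput = pipe
do {
    try p.run()
    p.waitUntilExit()
    let out = String(data: pipe.fileHandleForReading.readDataToEndOfFile(),
                     encoding: .utf8) ?? ""
    print(out)  // e.g. "vm.swapusage: total = ...M  used = ...M  free = ...M"
} catch {
    print("could not run sysctl: \(error)")
}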
On 2020-11-22 18:37, Alan Browne wrote:
That is not a hypervisor. A hypervisor is a system that lets a computer
operate several OS' concurrently.
What do you think an 8-core GPU does? What do you think the neural network does? Neither runs OS X. Yet they share hardware such as RAM.
These things are implicit in the OS' management and allocation of
resources and dispatching via threads.
A GPU is not a thread running under control of OS X. It is a different, independent computer with its own CPU cores and its own operating system instance.
And consider how Windows runs on laptops with Intel's integrated memory-sharing GPUs. Windows is unaware of these GPUs sharing memory because some form of hypervisor manages this, just as it manages resources when you have different OS instances running on the same machine.
That memory is allocated and fixed. The CPU (naturally) can at OS
privilege levels read the o/p of whatever subcontracted operations there
are.
Where did you find the information on how the GPU and CPU share memory and who controls what?
Marketing? Really? A key reason these Macs are so blazing fast is in
good part due to the unified memory.
How do you know that? Unified memory is marketing gobbledygook. The Intel laptop CPUs also had shared memory with GPUs, yet achieved nowhere near the performance that Apple appears to be achieving with the M1.
There is a lot more to the soup than just the opaque memory subsystem Apple has developed.
On 2020-11-22 18:39, Alan Browne wrote:
You do not know that. Your assumptions about how Rosetta works are
absolutely simple. Apple did not invest "simple" where Rosetta is
concerned.
I had read documents about Rosetta 2 when they were released, and they explained the translation of argument lists via a jacket routine that then called the native ARM system routine.
On 2020-11-20 19:51, Lewis wrote:
Most importantly, with this shared memory comes huge gains in speed.
This is Apple marketing.
No, you are wrong and there is tons of evidence proving you are wrong.
So based on your logic, the Intel GPUs in the Intel Laptop chips should
have outperformed the fancy external GPUs years ago because of shared
memory. Yet, when Apple unleashed its $60,000 Cheese Grater Mac Pro, it
used external GPUs, not the internal Intel ones.
Where you are purposefully keeping your head in the sand is on the scalability issues. The integrated GPU with shared memory has worked in laptops (and for M1 laptops, outperforming by far similar Intel laptops that also share memory).
However, as you scale up, you find computers all have external GPUs because they want more horsepower, and when you grow GPU horsepower, you grow demand on memory access, and this starts to conflict with the CPU accessing memory while it is chugging along.
So again, how Apple deals with its architecture to scale it remains to be seen. I am sure they have plans and prototypes in-house. But we haven't seen how they plan to scale this to iMacs and the Mac Pro.
In message <mnCuH.30291$kM7.20340@fx43.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
On 2020-11-22 17:35, Lewis wrote:
You need to read better, you lumpoy moron. The M1 mac runs the Intel
code FASTER in translation that the code runs on an Intel CPU.
I did not deny this. I explained that the reason a translated image
calling a system service sees little difference with a native image
Which is NOT what happens. The difference is literally the Intel chip is
100% slower than the M1 in translation.
routine that builds the argument list in ARM format to call the native
system routine and that this overhead is small and would be likely fixed
for all system service calls.
This is nonsense. Once again, in small words.
Intel is slower than new Apple chip when running native Intel Code.
The M1 runs native Intel code FASTER than Intel chips.
Which of these words is confusing you?
On 2020-11-22 18:00, Alan Baker wrote:
And yet, reviewers have actually DONE video testing...
...and it works great.
be careful about early reviews because they are "sponsored" by Apple who provides the reviewer with free machines.
On 2020-11-22 8:58 p.m., Lewis wrote:
In message <mnCuH.30291$kM7.20340@fx43.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
On 2020-11-22 17:35, Lewis wrote:
You need to read better, you lumpoy moron. The M1 mac runs the Intel
code FASTER in translation that the code runs on an Intel CPU.
I did not deny this. I explained that the reason a translated image
calling a system service sees little difference with a native image
Which is NOT what happens. The difference is literally the Intel chip is
100% slower than the M1 in translation.
routine that builds the argument list in ARM format to call the native
system routine and that this overhead is small and would be likely fixed for all system service calls.
This is nonsense. Once again, in small words.
Intel is slower than new Apple chip when running native Intel Code.
The M1 runs native Intel code FASTER than Intel chips.
Which of these words is confusing you?
To be more accurate:
"Intel Macs running native Intel code are slower than Macs running the
new Apple chip when running Rosetta-translated Intel code."
In message <rpfftq$ffk$1@dont-email.me> Alan Baker <notonyourlife@no.no.no.no> wrote:
On 2020-11-22 8:58 p.m., Lewis wrote:
In message <mnCuH.30291$kM7.20340@fx43.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
On 2020-11-22 17:35, Lewis wrote:
You need to read better, you lumpoy moron. The M1 mac runs the Intel code FASTER in translation that the code runs on an Intel CPU.
I did not deny this. I explained that the reason a translated image
calling a system service sees little difference with a native image
Which is NOT what happens. The difference is literally the Intel chip is 100% slower than the M1 in translation.
routine that builds the argument list in ARM format to call the native system routine and that this overhead is small and would be likely fixed for all system service calls.
This is nonsense. Once again, in small words.
Intel is slower than new Apple chip when running native Intel Code.
The M1 runs native Intel code FASTER than Intel chips.
Which of these words is confusing you?
To be more accurate:
"Intel Macs running native Intel code are slower than Macs running the
new Apple chip when running Rosetta-translated Intel code."
That's way too complex for the pinhead.
On 2020-11-22 18:00, Alan Baker wrote:
And yet, reviewers have actually DONE video testing...
...and it works great.
be careful about early reviews
Which is NOT what happens. The difference is literally the Intel chip is
100% slower than the M1 in translation.
be careful about early reviews because they are "sponsored" by Apple who
provides the reviewer with free machines.
You are a fucking lying piece of shit.
On 2020-11-22 23:58, Lewis wrote:
Which is NOT what happens. The difference is literally the Intel chip is
100% slower than the M1 in translation.
The context was a comparison of calling a specific system service from a translated image vs the natively compiled image, with a very small
difference.
So my response was in line. The small difference between the two is
because the translated image ends up calling the native ARM system
service after going through a jacket routine which translates the Intel-format argument list into an ARM-format argument list.
This has nothing to do with code actually running on Intel. It was a comparison of a translated image vs natively compiled.
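For what it's worth, here is a minimal toy sketch in C of what a "jacket" (thunk) of that kind does; the struct and function names are invented for illustration and are not Apple's actual Rosetta 2 interfaces:

#include <stdio.h>
#include <stdint.h>

/* Pretend native (arm64) system service. */
static long native_write_service(int fd, const void *buf, size_t len)
{
    printf("native service called: fd=%d len=%zu\n", fd, len);
    return (long)len;
}

/* Emulated x86-64 integer argument registers as the translated Intel call
 * site would have filled them (SysV order: rdi, rsi, rdx, ...). */
struct x86_regs {
    uint64_t rdi, rsi, rdx, rcx, r8, r9;
};

/* The "jacket": unpack the Intel-format argument list and make an ordinary
 * native call.  The only extra cost is this repackaging, which is why a
 * translated image calling a system service sees so little difference
 * versus a natively compiled image. */
static long jacket_write_service(const struct x86_regs *r)
{
    return native_write_service((int)r->rdi,
                                (const void *)(uintptr_t)r->rsi,
                                (size_t)r->rdx);
}

int main(void)
{
    char buf[] = "hello";
    struct x86_regs regs = { .rdi = 1, .rsi = (uint64_t)(uintptr_t)buf, .rdx = 5 };
    printf("returned %ld\n", jacket_write_service(&regs));
    return 0;
}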
On 2020-11-23 11:52, Lewis wrote:
be careful about early reviews because they are "sponsored" by Apple, which provides the reviewers with free machines.
You are a fucking lying piece of shit.
Apple has seeded well-known reviewers with the 2 new computers so
they can run benchmarks and report.
More recent reviews will start to include people who actually purchased
units for their own use.
They still show the M1 with performance that is very very good (despite
paging). But the more recent ones provide more colour on performance.
And the more recent ones are also the ones who provided comparison with
a $60,000 cheese grater Mac pro (since early reviews from seeded units
were standalone without many comparisons).
I am watching a video where actual rendering is done, not just playback, and the MacBook Pro with 8GB RAM beat the pants off a 2019 Mac Pro (cheese grater) with 192GB.
The tester did find that with 8K video there were problems playing
back. (On the other hand, displaying on a tiny screen isn't an actual
8K display, it is a scaled-down image, whereas on the Mac Pro the guy
had a real display.)
He noted plenty of swap file used.
So if performance is this good with swapping, I have to wonder what it
will be like if/when Apple releases models with up to 256GB of memory.
The video I watched:
https://youtu.be/GMzT3bajoPs
Note the software he is using is very GPU intensive, not CPU intensive.
On a test of Premiere Pro, he noticed this was CPU intensive, not GPU intensive, and saw dropped frames playing back.
The real horsepower is when you export though, not when playing back.
I am surprised disk I/O would be so fast for swapping/paging since it all
goes through the Secure Enclave for encryption. So that part must be super
fast too.
On 2020-11-23 11:52, Lewis wrote:
be careful about early reviews because they are "sponsored" by Apple, which provides the reviewers with free machines.
You are a fucking lying piece of shit.
Apple has seeded well-known reviewers with the 2 new computers so
they can run benchmarks and report.
More recent reviews will start to include people who actually purchased
units for their own use.
Here is a difference: early ones didn't report on paging when rendering videos and didn't report on "export" function performance (where the
real horsepower happens), but more recent ones do.
They still show the M1 with performance that is very very good (despite paging). But the more recent ones provide more colour on performance.
And the more recent ones are also the ones who provided comparison with
a $60,000 cheese grater Mac pro (since early reviews from seeded units
were standalone without many comparisons).
On 2020-11-22 18:37, Alan Browne wrote:
That is not a hypervisor. A hypervisor is a system that lets a computer
operate several OS' concurrently.
What do you think an 8 core GPU does?
What do you think the neural network does? Neither runs OS-X. Yet they
share hardware such as RAM.
These things are implicit in the OS' management and allocation of
resources and dispatching via threads.
A GPU is not a thread running under control of OS-X. It is a different independent computer with its own CPU cores and its own operating system instance.
And consider how Windows runs on laptops with Intel's integrated memory-sharing GPUs. Windows is unaware of these GPUs sharing memory because
some form of hypervisor manages this, just as it manages resources when
you have different OS instances running on the same machine.
That memory is allocated and fixed. The CPU (naturally) can at OS
privilege levels read the o/p of whatever subcontracted operations there
are.
Where did you find the information on how the GPU and CPU share memory and
who controls what?
Marketing? Really? A key reason these Macs are so blazing fast is in
good part due to the unified memory.
How do you know that? Unified memory is marketing gobbledygook. The
Intel laptop CPUs also had shared memory with GPUs, yet achieved nowhere
near the performance that Apple appears to be achieving with the M1.
There is a lot more to the soup than just the opaque memory subsystem
Apple has developed.
fact separate chip carriers on the M1 SOC. But that is mapped for the
whole M1 to see and write - including GPU and lots of high speed I/O -
and governed by the memory controller - in turn controlled by the OS.
Are you aware this is a standard function of RAM with a memory controller serving multiple cores and, on larger systems, multiple physical CPUs?
The Mac Pro 2009 and Xserve 2009 could come with multiple physical CPUs and a memory controller architecture to deal with that.
The OS, controls the configuration and access to the devices via the
SDK's/API's etc.
GPUs are far more than devices; they are computers in their own right.
They normally have their own compiler, own RAM, and many cores. Here, they
share memory with the CPU's cores, but the GPU would still have its own OS
instance and its own compiler to take Metal code, compile it and run it.
What has changed is that the application running on the CPU sends the source
code via memory/DMA instead of via the PCI-E bus. But the GPU remains a separate entity.
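To make the point of contention concrete, here is a conceptual toy in C (the worker function and sizes are invented; this is not Apple's memory controller or the Metal API): with a discrete GPU the data has to be copied into separate device memory, while with a shared pool the "GPU" consumes the very same buffer the CPU filled.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N (1 << 20)

/* Stand-in for GPU work: sum the buffer it is handed. */
static long gpu_sum(const float *buf, size_t n)
{
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += (long)buf[i];
    return total;
}

int main(void)
{
    float *host = malloc(N * sizeof *host);   /* CPU fills this buffer */
    for (size_t i = 0; i < N; i++)
        host[i] = 1.0f;

    /* Discrete-GPU model: copy into separate "device memory" first. */
    float *device = malloc(N * sizeof *device);
    memcpy(device, host, N * sizeof *device); /* the extra bus transfer */
    printf("discrete model sum: %ld\n", gpu_sum(device, N));

    /* Unified model: the "GPU" reads the same allocation, no copy step. */
    printf("unified model sum:  %ld\n", gpu_sum(host, N));

    free(device);
    free(host);
    return 0;
}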
There is nothing about the ARM design that limits the memory to 16GB.
Correct. But putting 256GB of RAM on an SoC may not be physically
possible in the near term, so how Apple manages growing from a glorified
iPhone SoC to a full computer CPU with peripherals (expandable RAM, more Thunderbolt buses, etc.) remains to be seen. Or will they insist that,
going forward, paging performance is such that people won't need more
than 16GB of RAM?
Had Apple used external RAM, then the path would be clear. But with
internal in-chip RAM, it remains to be seen whether Apple will insist on keeping all RAM inside or whether it will evolve to external RAM (with
whatever performance issues this might bring).
I don't think this is such a big challenge for Apple. You do know they
dwarf intel right? Hint: intel is 10X bigger than hp and Apple is ___X
bigger than intel.
HP self destructed. Intel is still a very large gorilla in the room,
and while they are currently comatose, they could come back. AMD has
upped the game quite a bit in the x86 arena and NVIDIA upped it big
time in the GPU one.
From a volume point of view, Apple is still small potatoes in the CPU market. And since it is doubtful Apple will start selling their CPUs to
competitors, it will remain a small but very focused operation making purpose-built chips for Apple.
What remains to be seen is whether M1 contains a lot of non-ARM magic
inside for that type of performance, or whether it really was just a
question of getting down to 5nm process and getting that performance.
When HPE did ARM servers, the performance was ho-hum. It didn't attract news headlines.
Now, Apple is beating expectations big time with its performance.
So the question becomes whether other ARM licensees will be able to
match Apple once they get to 5nm etc., or whether Apple truly has special sauce inside that will prevent the others from coming close. Only time
will tell.
On 2020-11-22 18:39, Alan Browne wrote:
You do not know that. Your assumptions about how Rosetta works are
simplistic. Apple did not invest in "simple" where Rosetta is
concerned.
I had read documents about Rosetta 2 when they were released, and they explained the translation of argument lists via a jacket routine
that then called the native ARM system routine.
No, that was not the context at all. Learn to read.
14ms on M1 translated. 30ms on Intel chip.
Nobody said larger memory models would be SOC. Apple can elect to put
some on the SOC and expand that to off SOC (possibly slower at that -
but would reduce or eliminate swaps).
If it isn't clear to you, go look again: the memory on the M1 SOC is in
its own chip carrier (two actually).
wherever. And as I say above I wouldn't be very surprised if we end up
with 'slower' RAM off SOC and 'fastest' RAM on SOC.
I'm sure Apple are beyond caring what AMD, intel and NVIDIA are doing.
20M+ CPU's per year under their own control (cost savings) while making those computers much higher performing than what intel can put out Watt
for Watt, dollar for dollar.
Apple don't care about how many CPU's they make. They care about how
many computers they sell.
And this will gain a lot of converts.
The others do not have anywhere near Apple's engineering.
More
importantly, they are not integrating from the battery to the keyboard
to the screen and everything in between (including the OS) as Apple is.
On 2020-11-23 16:50, Alan Browne wrote:
If it isn't clear to you, go look again: the memory on the M1 SOC is in
its own chip carrier (two actually).
The only question
Wouldn't surprise me to find that all M1s are 16GB, with the lower-end models having 8GB disabled.
wherever. And as I say above I wouldn't be very surprised if we end up
with 'slower' RAM off SOC and 'fastest' RAM on SOC.
Possible, but it complicates the memory controller.
20M+ CPU's per year under their own control (cost savings) while making
those computers much higher performing than what intel can put out Watt
for Watt, dollar for dollar.
The per-watt performance matters for laptops, not really for desktops (though you get into cooling challenges with current higher-end Wintel
rigs).
Apple don't care about how many CPU's they make. They care about how
many computers they sell.
And this will gain a lot of converts.
This remains to be seen. The currently released models aren't computers,
they are appliances. Can't change memory, can't change disk. Can't
repair it. And OS-X has moved closer to iOS and is becoming more and more closed to normal applications (walled garden via the App Store).
Whether this works to gain market share or not remains to be seen.
The others do not have anywhere near Apple's engineering.
Don't underestimate AMD and NVIDIA.
More
importantly, they are not integrating from the battery to the keyboard
to the screen and everything in between (including the OS) as Apple is.
Integration is not so beneficial for desktops and gaming rigs. Ability
to choose and configure your system is a HUGE thing for that market. Apple
will offer one solution with 1 GPU, 1 CPU and 1 or 2 RAM options.
Integration is not so beneficial for desktops and gaming rigs. Ability
to choose and configure your system is a HUGE thing for that market. Apple will offer one solution with 1 GPU, 1 CPU and 1 or 2 RAM options.
Gaming rigs is not Apple's main market. Desktops remain to be seen in
terms of configuration.
On 2020-11-21 14:58, Stephen Hoffman wrote:
Maybe your app mix doesn't fit within the target market for these early
M1-based systems,
The problem is that Apple didn't market these laptops as great for running browsers or word processors; it specifically targeted video editing,
which has been one of the highest consumers of RAM.
I'd have no problem if Apple marketed the laptop to view movies, browse
the web, write emails, compose text documents and manage your photos.
But Apple explicitly and repeatedly mentioned video editing.
Perhaps Final Cut Pro got a magic update where it no longer requires RAM
to process 4K movies.
Alan Browne <Blackhole@entropy.ultimateorg> wrote:
...
Integration is not so beneficial for desktops and gaming rigs. Ability
to choose and configure your system is a HUGE thing for that market. Apple
will offer one solution with 1 GPU, 1 CPU and 1 or 2 RAM options.
Gaming rigs is not Apple's main market. Desktops remain to be seen in
terms of configuration.
Speaking of gaming, why is it bad on the Mac? I remember Steve Jobs showing computer games like Quake back then. I was excited to see some hardcore Mac gaming. It seems like Linux is doing better than Macs. :(
On 2020-11-24 17:16, Ant wrote:
Alan Browne <Blackhole@entropy.ultimateorg> wrote:
...
Integration is not so beneficial for desktops and gaming rigs. Ability to choose and configure your system is a HUGE thing for that market. Apple will offer one solution with 1 GPU, 1 CPU and 1 or 2 RAM options.
Gaming rigs is not Apple's main market. Desktops remain to be seen in
terms of configuration.
Speaking of gaming, why is it bad in Mac?
It's not bad on the Mac.
But.
The gaming world grew where the market was largest, and that is Windows
PC's by far, back 20 years ago when Apple had a very slim slice of the PC market.
Gamers could easily customize and upgrade their rigs every few months if they cared to as new processors, GPU's etc. came out. And relatively cheaply. The "PC" was just a skeleton holding the latest configuration.
Now as GPU's become more and more commoditized, that distinction will
wane and game makers will be able to hop OS' more and more easily.
But Apple explicitly and repeatedly mentioned video editing.
And maybe perhaps possibly the folks at Apple have run some tests and
know about the performance?
And perhaps once you've tried your own tests, you'll know if M1 is
faster or slower.
Apple would have known about performance prior to the keynote, and chose
instead to provide useless graphs that reduced the credibility of its claims.
A while back, I was blasted here for stating these keynotes are large marketing events aimed at the public, and told that no, they are aimed at
the trade press. If they are aimed at the trade press, then Apple
should have provided hard, comparable performance numbers, since they make
the M1 look really good.
The argument made by Apple apologists was that the M1 required less RAM, hence no problem with only 8GB to do video rendering.
Apple would have known about performance prior to the keynote, and chose
instead to provide useless graphs that reduced the credibility of its claims.
Only in the eyes of complete idiots like you who don't understand what
those graphs actually represent.
Some of the more serious tests I have seen pointed out the gigabytes of page/swap file used during rendering on the 8GB M1 and much less on the
16GB.
On 2020-11-24 19:12, Stephen Hoffman wrote:
But Apple explicitly and repeatedly mentioned video editing.
And maybe perhaps possibly the folks at Apple have run some tests and
know about the performance?
The tests that are coming out now show HUGE performance for the M1. (Though
I did spot one with an Intel Mac mini with an external GPU that beat the
M1, but that is an outlier.)
Apple would have known about performance prior to the keynote, and chose
instead to provide useless graphs that reduced the credibility of its claims.
A while back, I was blasted here for stating these keynotes are large marketing events aimed at the public, and told that no, they are aimed at
the trade press. If they are aimed at the trade press, then Apple
should have provided hard, comparable performance numbers, since they make
the M1 look really good.
And perhaps once you've tried your own tests, you'll know if M1 is
faster or slower.
The argument made by Apple apologists was that the M1 required less RAM, hence no problem with only 8GB to do video rendering.
Some of the more serious tests I have seen pointed out the gigabytes of page/swap file used during rendering on the 8GB M1 and much less on the
16GB. Lack of RAM does impact performance negatively, but because the M1 is
so much faster, it still outperforms those without paging.
Someone did tests for read/write on the Mini SSD, and found read was
slightly better, write not as good as on the Intel Mini (so roughly the same). So a machine hindered by paging but that still performs better
than machines that don't page means the CPU is actually faster than it appears to be.
On 2020-11-25 14:03, Jolly Roger wrote:
Only in the eyes of complete idiots like you who don't understand what
those graphs actually represent.
Graphs with pretty lines but no scale are meaningless.
Claiming your machine is faster than 95% of laptops is meaningless
(laptops sold in units? laptops sold in revenue? 95% of laptop models? 95%
of laptops still in use today?
95% of laptops seen during a short survey at a Delta gate at some airport?)
Considering the M1 is faster than many desktops, Apple should have
provided hard numbers to really show it with a huge bang, instead of meaningless "reality distortion field" marketing gobbledygook.
Graphs with pretty lines but no scale are meaningless.
Nope. They show RELATIVE performance.
And even if they ARE aimed at the trade press, Apple then knows that the trade press will be presenting greater depth as they review the systems.
In article <rpmi80$5qa$1@dont-email.me>,
Alan Baker <notonyourlife@no.no.no.no> wrote:
And even if they ARE aimed at the trade press, Apple then knows that the
trade press will be presenting greater depth as the review the systems.
And will you be so vehemently defending Apple in the trade press
as you are on usenet?
In article <rpmid3$5qa$2@dont-email.me>,
Alan Baker <notonyourlife@no.no.no.no> wrote:
Graphs with pretty lines but no scale are meaningless.
Nope. They show RELATIVE performance.
Performance is only part of why people buy computers. Based on
the PPC emulator history I cannot trust Apple to maintain the X86
emulator indefinitely. So the question becomes whether I am okay
with being forced to replace useful software like I did with
Illustrator and Photoshop. Some people like being able to boot
the same hardware to Linux. Et cetera.
Absolute performance has to be enough to justify other costs.
Windoze is semi-around as an ARM version, but Apple have said it's up
to Microsloth whether or not it is ever made available for use on Macs.
On 2020-11-25 23:42, Your Name wrote:
Windoze is semi-around as an ARM version, but Apple have said it's up
to Microsloth whether or not it is ever made available for use on Macs.
Not "natively" on the sense that to boot from the SSD, you are required
to have a signed OS-X version which also works with secure enclave which
acts as disk controller (doing encryptioo/Ndecryption).
So you would have to boot OS-X to disable boot code signatire
verification and enable boot from external drive and then you could boot
from alternate OS.
Howeer, Apple proprietary Secure Boot console would require either an
EFI emulator or that Windows be allowed to support that Secure Boot thing.
The move away form industry standard architcture (EFI, x86,
OPCI-express) and to a closed appliance with whatever tech inside Apple
finds works best may provide grest OS-X/IOS experience, but makes it an applicance instead of a "computer".
Not "natively" on the sense that to boot from the SSD, you are required
to have a signed OS-X version which also works with secure enclave which
acts as disk controller (doing encryptioo/Ndecryption).
Cite.
Rubbish.
On 2020-11-26 14:35, JF Mezei wrote:
The move away from industry-standard architecture (EFI, x86,
PCI-Express) and to a closed appliance with whatever tech Apple
finds works best inside may provide a great OS-X/iOS experience, but makes it an
appliance instead of a "computer".
Rubbish.
On 2020-11-26 14:53, Alan Baker wrote:
Not "natively" on the sense that to boot from the SSD, you are required
to have a signed OS-X version which also works with secure enclave which >>> acts as disk controller (doing encryptioo/Ndecryption).
Cite.
Do you deny that Apple stated that Secure Boot on Macs will
requiresigned OS to boot (unless disabled)?
If you deny the obvious, the onus is on you to provide cite.
On 2020-11-26 18:58, Alan Browne wrote:
Rubbish.
So you still call it a desktop if you can't change/upgrade RAM, can't change/upgrade the disk?
Do you call your iPhone and iPads computers? Apple's desktops are
becoming functionally equivalent fixed-config appliances.
On 2020-11-26 23:58:00 +0000, Alan Browne said:
On 2020-11-26 14:35, JF Mezei wrote:
The move away from industry-standard architecture (EFI, x86,
PCI-Express) and to a closed appliance with whatever tech Apple
finds works best inside may provide a great OS-X/iOS experience, but makes it an
appliance instead of a "computer".
Rubbish.
When the original Mac was released, "expert" fools whinged that it and
its graphical interface were just a toy and a passing fad ... now the
majority either has a Mac or wants one (to escape Windoze Hell) and
every computer comes with a graphical interface.
On 2020-11-26 14:53, Alan Baker wrote:
Not "natively" on the sense that to boot from the SSD, you are required
to have a signed OS-X version which also works with secure enclave which >>> acts as disk controller (doing encryptioo/Ndecryption).
Cite.
Do you deny that Apple stated that Secure Boot on Macs will
requiresigned OS to boot (unless disabled)?
Do you deny that Apple stated that Secure Boot on Macs will
require a signed OS to boot (unless disabled)?
That is not what you said though, is it? No, it is not.
On 2020-11-26 23:05, Lewis wrote:
Do you deny that Apple stated that Secure Boot on Macs will
require a signed OS to boot (unless disabled)?
That is not what you said though, is it? No, it is not.
I stated you require OS-X on the machine because of Secure Boot signing
the boot process. (until disabled and you enable booting from external
disk).
Was told this was false. Asked to confirm, and now you move goalposts
instead of admitting you were wrong in saying I was wrong.
On 2020-11-26 5:14 p.m., JF Mezei wrote:
On 2020-11-26 18:58, Alan Browne wrote:
Rubbish.
So you still call it a desktop if you can't change/upgrade RAM, can't
change/upgrade the disk?
More telling is what you clipped of what YOU have already said:
'but makes it an appliance instead of a "computer".'
Are you seriously contending that a computer is NOT a computer unless
you can upgrade a hardware component?
On 2020-11-27 01:25:38 +0000, Alan Baker said:
On 2020-11-26 5:14 p.m., JF Mezei wrote:
On 2020-11-26 18:58, Alan Browne wrote:
Rubbish.
So you still call it a desktop if you can't change/upgrade RAM, can't
change/upgrade the disk?
More telling is what you clipped of what YOU have already said:
'but makes it an appliance instead of a "computer".'
Are you seriously contending that a computer is NOT a computer unless
you can upgrade a hardware component?
It hasn't been possible to upgrade the internals of most Macs for quite
some time ... that trend started well before the Apple Silicon computers were even a twinkle of a dream.
On 2020-11-27 01:42, Your Name wrote:
It hasn't been possible to upgrade the internals of most Macs for quite
some time ... that trend started well before the Apple Silicon
computers were even a twinkle of a dream.
But you could still install any operating system on it. The introduction
of T2 and now the Secure Enclave makes that much harder and makes the built-in
drive unusable for such an endeavour.
On 2020-11-26 10:42 p.m., Your Name wrote:
On 2020-11-27 01:25:38 +0000, Alan Baker said:
On 2020-11-26 5:14 p.m., JF Mezei wrote:
On 2020-11-26 18:58, Alan Browne wrote:
Rubbish.
So you still call it a desktop if you can't change/upgrade RAM, can't
change/upgrade the disk?
More telling is what you clipped of what YOU have already said:
'but makes it an appliance instead of a "computer".'
Are you seriously contending that a computer is NOT a computer unless
you can upgrade a hardware component?
It hasn't been possible to upgrade the internals of most Macs for quite
some time ... that trend started well before the Apple Silicon
computers were even a twinkle of a dream.
/I/ never said anything to the contrary.
It is obvious to anyone of even very modest intelligence that whether
or not a personal computer IS a personal computer is not in any way dependent upon whether or not you can change a single piece of hardware
in it.
That's all I was saying.
On 2020-11-26 11:40 p.m., JF Mezei wrote:
On 2020-11-27 01:42, Your Name wrote:
It hasn't been possible to upgrade the internals of most Macs for quite
some time ... that trend started well before the Apple Silicon
computers were even a twinkle of a dream.
But you could still install any operating system on it. The introduction
of T2 and now Secure Enclave makes that much harder and makes built-in
drive unusable for such endeavour.
Claims for which you provide NO support... ...again.
On 2020-11-27 07:29:51 +0000, Alan Baker said:
On 2020-11-26 10:42 p.m., Your Name wrote:
On 2020-11-27 01:25:38 +0000, Alan Baker said:
On 2020-11-26 5:14 p.m., JF Mezei wrote:
On 2020-11-26 18:58, Alan Browne wrote:
Rubbish.
So you still call it a desktop if you can't change/upgrade RAM, can't change/upgrade the disk?
More telling is what you clipped of what YOU have already said:
'but makes it an appliance instead of a "computer".'
Are you seriously contending that a computer is NOT a computer
unless you can upgrade a hardware component?
It hasn't been possible to upgrade the internals of most Macs for
quite some time ... that trend started well before the Apple Silicon
computers were even a twinkle of a dream.
/I/ never said anything to the contrary.
It is obvious to anyone of even very modest intelligence that whether
or not a personal computer IS a personal computer is not in any way
dependent upon whether or not you can change a single piece of
hardware in it.
That's all I was saying.
I was agreeing with you. I just forgot to put "Yep" at the start. :-)
Claims for which you provide NO support... ...again.
Yes, iPhones and iPads are definitely, without contest, computers. I
can load whatever s/w I like on them and I can write my own s/w for them
if need be (no need at present).
And of course this doesn't prevent anyone from adding peripherals via
the USB (Thunderbolt) ports.
You're making stuff up because the rest of your statements are falling
to pieces. Or because you're very unsatisfied with your recent CPU
upgrade to your Mac Pro.
I'll ask again.
Yes, iPhones and iPads are definitely, without contest, computers. I
can load whatever s/w I like on them and I can write my own s/w for them if need be (no need at present).
Whatever software you like? Only possible if you jailbreak the iPhone
and side-load via Cydia. Otherwise, you only have access to apps curated
by Apple on the App Store.
And of course this doesn't prevent anyone from adding peripherals via
the USB (Thunderbolt) ports.
Assuming OS-X supports such peripherals. Note: M1 version of OS-X does
not support external GPUs.
You need to go through hoops to allow booting from an external drive, which
is disabled by default.
On 2020-11-27 02:58, Alan Baker wrote:
Claims for which you provide NO support... ...again.
I'll ask again.
Do you deny that Apple uses its Secure Boot console on M1-based Macs?
Do you deny that Secure Boot does code-signature checks to boot only an OS that
is signed by Apple (unless disabled after booting into OS-X)?
Do you deny that the Secure Enclave in the M1 acts as controller for all
access to the one drive inside the machine, and that it cooperates with the OS to authorize access and perform encryption/decryption of data like it does
with iOS?
All these were stated during keynote.
On 2020-11-27 10:23, Alan Browne wrote:
Yes, iPhones and iPads are definitely, without contest, computers. I
can load whatever s/w I like on them and I can write my own s/w for them
if need be (no need at present).
Whatever software you like? Only possible if you jailbreak the iPhone
and side-load via Cydia. Otherwise, you only have access to app curated
by Apple on the App store.
And of course this doesn't prevent anyone from adding peripherals via
the USB (Thunderbold) ports.
Assuming OS-X supports such peripherals. Note: M1 version of OS-X does
not support external GPUs.
You need to go through hoops to allow booting from external drive which
is disable by default.
You're making stuff up because the rest of your statements are falling
to pieces. Or because you're very unsatisfied with your recent CPU
upgrade to your Mac Pro.
Your logic is astounding. Why would I be unsatisfied going from 4 to 10 cores, 10MB to 25MB cache? Render times when either Premiere or After Effects makes full use of cores are hugely shorter.
you do realize that these are the *first* apple silicon macs and in no
way defines what future macs will or will not support, right?
You need to go through hoops to allow booting from external drive which
is disable by default.
a very, very small hoop.
On 2020-11-27 10:23, Alan Browne wrote:
Yes, iPhones and iPads are definitely, without contest, computers. I
can load whatever s/w I like on them and I can write my own s/w for them
if need be (no need at present).
Whatever software you like? Only possible if you jailbreak the iPhone
and side-load via Cydia. Otherwise, you only have access to app curated
by Apple on the App store.
You need to go through hoops to allow booting from external drive which
is disable by default.
You're making stuff up because the rest of your statements are falling
to pieces. Or because you're very unsatisfied with your recent CPU
upgrade to your Mac Pro.
Your logic is astounding. Why would I be unsatisfied going from 4 to 10 cores, 10MB to 25MB cache?
Cite, please.
On 2020-11-27 20:10, nospam wrote:
you do realize that these are the *first* apple silicon macs and in no
way defines what future macs will or will not support, right?
Yes I realize. But I am responding to your ilk's statements that 8GB is plenty because M1 doesn't need as much memory, and that closed systems
that can't get any upgrade are fine, that built-in GPUs are fine and
you'll never need more power etc etc etc.
The issue is that **IF** the M1 chip is the shape of things to come
in terms of having truly everything in one chip, this poses some
challenges and raises questions about how Apple will scale it to span all the
way up to the Mac Pro.
If Apple will break out of the chip for higher end models, then we'll
see what happens.
You need to go through hoops to allow booting from an external drive, which
is disabled by default.
a very, very small hoop.
It still makes the built-in SSD useless because it is accessed via the Secure
Enclave and can't be replaced except by Apple (and they'll likely refuse
because it is soldered in, etc.).
Cite, please.
https://support.apple.com/en-gb/HT208198
https://support.apple.com/en-ca/guide/macbook-pro/apdcf567823b/mac
Secure boot and Startup Security Utility: Support for secure boot is
turned on automatically. It’s designed to verify that the operating
system software loaded on your computer at startup is authorized by
Apple. See the Apple Support article About Secure Boot.
Note the "authorized by Apple".
On 2020-11-27 7:34 p.m., FUDMiester wrote:
The issue is is that **IF** the M1 chip is the shape of things to come
in terms of having truly everything in one chip, this poses some
challenges that ask questions on how Apple will scale it to span all the
way up to Mac Pro.
No. It really doesn't.
In message <xZgwH.11709$Zh7.236@fx04.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
On 2020-11-27 10:23, Alan Browne wrote:
Yes, iPhones and iPads are definitely, without contest, computers. I
can load whatever s/w I like on them and I can write my own s/w for them if need be (no need at present).
Whatever software you like? Only possible if you jailbreak the iPhone
No.
and side-load via Cydia. Otherwise, you only have access to app curated
by Apple on the App store.
No, that is horseshit. The VAST majority of software for iOS is not
created by Apple, how stupid are you?
Apple does veto apps on the App Store under their own set of rules ...
and when you look at the awful mess on the Google Play store (let alone elsewhere for Android apps), you thank your God / lucky stars / whoever
that they do!
In article <rpspue$eqr$1@gioia.aioe.org>, Your Name
<YourName@YourISP.com> wrote:
Apple does veto apps on the App Store under they own set of rules ...
and when you look at the awful mess on the Google Play store (let alone
elsewhere for Android apps), you thank your God / lucky stars / whoever
that they do!
google curates the play store.
In message <rpsi12$7v3$3@dont-email.me> Alan Baker <notonyourlife@no.no.no.no> wrote:
On 2020-11-27 7:34 p.m., FUDMiester wrote:
The issue is is that **IF** the M1 chip is the shape of things to
come in terms of having truly everything in one chip, this poses
some challenges that ask questions on how Apple will scale it to
span all the way up to Mac Pro.
No. It really doesn't.
You have to understand that not only is JF a champion FUD spreader who
frames everything he says so as to project the worst possibilities in
all things, he is also so stupid that he thinks everyone else, and
especially Apple, is even stupider.
So he honestly believes that Apple has switched their platform to a
new chip and has NO IDEA how they will release machines over the next
2 years that will be any faster or better than the $700 Mac mini they
just released.
He has referred to Apple and to hundreds of people who have received
the M1 machines as either liars or stupidly misguided or paid off
accomplices in a conspiracy to make the M1 Mac appear better than they
are.
He has insisted that he knows more about chip design and how chips
work than the best team of chip designers in the world.
He continues to deny that the M1 machines can process 4K video better
than his 7 year old machine, despite many people showing the
performance is equal or nearly equal to current top-end iMac Pros.
Jason Snell recently discussed running a de-noiser tool on his M1 mac.
This tool is the reason he purchased a 10 core iMac Pro, and the tool
maxes out the iMac. The iMac is faster, but only a bit. And the
de-noiser is running in Rosetta.
JF has done this sort of shit before, and he will do it again. On any
topic JF will be sure to have a dogmatic opinion based on ignorance, misunderstanding, or simply outdated information. He will ignore all
evidence that counters his fervently held dogma, and will go to the
extent of lying repeatedly or willfully ignoring explicit proof he is
wrong (like his posting the link to the HT that proved his claim it
was "difficult" to bypass secure boot was literally a single checkbox).
The stupid runs deep.
On 2020-11-27 7:41 p.m., JF Mezei wrote:
Cite, please.
https://support.apple.com/en-gb/HT208198
https://support.apple.com/en-ca/guide/macbook-pro/apdcf567823b/mac
Secure boot and Startup Security Utility: Support for secure boot is
turned on automatically. It’s designed to verify that the operating
system software loaded on your computer at startup is authorized by
Apple. See the Apple Support article About Secure Boot.
Note the "authorized by Apple".
"See the Apple Support article About Secure Boot."
OK. Let's do that!
<https://support.apple.com/en-us/HT208198>
'No Security
No Security doesn't enforce any of the above security requirements for
your startup disk.'
You lose.
In message <TFjwH.14267$vhX.1265@fx17.iad> JF Mezei <jfmezei.spamnot@vaxination.ca> wrote:
Cite, please.
https://support.apple.com/en-gb/HT208198
https://support.apple.com/en-ca/guide/macbook-pro/apdcf567823b/mac
Secure boot and Startup Security Utility: Support for secure boot is
turned on automatically. It’s designed to verify that the operating
system software loaded on your computer at startup is authorized by
Apple. See the Apple Support article About Secure Boot.
And disabling it is literally a single check box, as is shown in that
same exact HT.
In message <281120200151121081%nospam@nospam.invalid> nospam <nospam@nospam.invalid> wrote:
In article <rpspue$eqr$1@gioia.aioe.org>, Your Name
<YourName@YourISP.com> wrote:
Apple does veto apps on the App Store under they own set of rules ...
and when you look at the awful mess on the Google Play store (let alone
elsewhere for Android apps), you thank your God / lucky stars / whoever
that they do!
google curates the play store.
Every store in existence "curates" their offerings.
I ran a 6 hour YouTube stream recorded in 1080p through the beta
handbrake M1 version.
First, in hardware mode it ran at 177fps, not at all surprising.
In software mode converting to 1080p "Fast" preset, the fps of the
conversion dropped at times to as low as 68fps for a few seconds, but
most of the time it was between 85-105fps with occasional peaks as high
as 122fps.
At the same time, I was using the system the entire time and noticed no lagging at all.
On 2020-11-28 16:18, Lewis wrote:
I ran a 6 hour YouTube stream recorded in 1080p through the beta
handbrake M1 version.
First, in hardware mode it ran at 177fps, not at all surprising.
In software mode converting to 1080p "Fast" preset, the fps of the
conversion dropped at times to as low as 68fps for a few seconds, but
most of the time it was between 85-105fps with occasional peaks as high
as 122fps.
At the same time, I was using the system the entire time and noticed no
lagging at all.
MBA or MBP or Mini?
With a more pristine source video, what fps for conversion to H.265 from H.264?
Can you look at the memory config under "about" or "system information"?
Yes I realize. But I am responding to your ilk's statements that 8GB is
plenty because M1 doesn't need as much memory,
they do not. that is a well established fact.
the built in gpu is outperforming many discrete gpus with a *lot* less
power. keep in mind these are low end systems.
it's the shape of things in the very first models.
more nonsense. where did you get the idea the built in ssd is useless
when booting from an external drive?
In message <KmzwH.23780$PP.20307@fx07.iad> Alan Browne <bitbucket@blackhole.com> wrote:
On 2020-11-28 16:18, Lewis wrote:
I ran a 6 hour YouTube stream recorded in 1080p through the beta
handbrake M1 version.
First, in hardware mode it ran at 177fos, not at all surprising.
In software mode converting to 1080p "Fast" preset, the fps of the
conversion dropped at times to as low as 68fps for a few seconds, but
most of the time it was between 85-105fps with occasional peaks as high
as 122fps.
At the same time, I was using the system the entire time and noticed no
lagging at all.
MBA or MBP or Mini?
MBA.
With a more pristine source video what fps for conversion to .265 from .264?
I don't have a more pristine video of any size worth testing.
Can you look at the memory config under "about" or "system information"?
the MBA has 16GB, though I doubt I needed it.
Oh, the 6GB video file was remote, but the converted file was written locally. I doubt this affected the speed, but it might have cost a few
FPS.
Oh, the MBA also got a little warm. A LITTLE warm. Not too warm to set it
on my bare legs with not even the slightest hint of discomfort, but
enough to notice it was warm.
So he honestly believes that Apple has switched their platform to a new
chip and has NO IDEA how they will release machines over the next 2
years that will be any faster or better than the $700 Mac mini they just released.
He has referred to Apple and to hundreds of people who have received the
M1 machines as either liars or stupidly misguided or paid off
accomplices in a conspiracy to make the M1 Mac appear better than they
are.
He continues to deny that the M1 machines can process 4K video better
than his 7 year old machine,
Jason Snell recently discussed running a de-noiser tool on his M1 mac.
This tool is the reason he purchased a 10 core iMac Pro, and the tool
maxes out the iMac. The iMac is faster, but only a bit. And the
de-noiser is running in Rosetta.
That it's a walled garden of sorts does not take away from the plain,
solid, uncontestable fact that they are computers.
That is fine with me. My iPhone and iPad are "appliances" in the sense
that I don't spend a lot of effort protecting them (backups) except for
any music I purchase via them or photos I take. Everything else about
them is backed up intrinsically.
Current Mac Pros go to 1.5 TB of RAM and I doubt Apple will disappoint
that end of the professional market.
On 2020-11-27 23:24, Alan Baker wrote:
'No Security
No Security doesn't enforce any of the above security requirements for
your startup disk.'
You lose.
In order to get to the menu to disable this, you need to have OS-X, in particular the recovery partition (which is now on an APFS GUID
partition which contains the APFS "partition" for recovery).
On the M1 machines, it isn't clear if the built-in SSD even has a GUID partitioning scheme; it could be native APFS, since there is no EFI which required a GUID partition for the boot. Good luck installing Windows on
that SSD if it is partitioned APFS natively with no GUID. And you need
to keep it in order to access that recovery boot menu to disable the
security (in case some glitch causes it to return).
Also, it is still not clear to me whether Linux, even if it supported APFS,
could run from that SSD since it is controlled by the Secure Enclave
(so you would have to boot from external drives).
You are quick to insult, but never provide actual hard information, just marketing gobbledygook.
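As a point of reference, "a GUID partitioning scheme" simply means the drive carries a GPT (GUID Partition Table), whose header lives in LBA 1 and starts with the ASCII signature "EFI PART". A minimal sketch in C, assuming a raw disk image you can read and 512-byte logical sectors (the path and sector size are assumptions, nothing Apple-specific):

#include <stdio.h>
#include <string.h>

#define SECTOR_SIZE 512  /* assumed logical sector size */

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <raw-disk-image>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    /* The GPT header sits in LBA 1, right after the protective MBR. */
    char sig[8] = {0};
    if (fseek(f, SECTOR_SIZE, SEEK_SET) == 0 && fread(sig, 1, 8, f) == 8 &&
        memcmp(sig, "EFI PART", 8) == 0)
        printf("GPT (GUID) partition scheme detected\n");
    else
        printf("no GPT signature at LBA 1\n");

    fclose(f);
    return 0;
}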
On 2020-11-28 09:10, Alan Browne wrote:
That it's a walled garden of sorts does not take away from the plain,
solid, uncontestable fact that they are computers.
That is fine with me. My iPhone and iPad are "appliances" in the sense
that I don't spend a lot of effort protecting them (backups) except for
any music I purchase via them or photos I take. Everything else about
them is backed up intrinsically.
Are modern fridges computers?
They have an OS in them, sometimes there
are software upgrades for them, and many even have a large screen on the
door, Internet connectivity and a browser.
Current Mac Pros go to 1.5 TB of RAM and I doubt Apple will disappoint that end of the professional market.
Well your buddies Lewis, nospam and their ilk all stated 8GB is plenty because M1 requires much less memory than Intel.
Yes I realize. But I am responding to your ilk's statements that 8GB is
plenty because M1 doesn't need as much memory,
they do not. that is a well established fact.
I have yet to see any explanation for why applications require less RAM when
running on ARM/M1 vs x86.
The folks who got seeds from Apple and did the early benchmarks reported
only on performance, and didn't report on paging. Later reports do mention
the paging that happens on the 8GB. The fact that even with paging the machine outperforms the older Apple laptop based on Intel is even more admirable, but it does not make it an "established fact" that the M1 requires
less memory.
the built in gpu is outperforming many discrete gpus with a *lot* less power. keep in mind these are low end systems.
The comparisons so far were with the Intel machines with built-in Intel
GPUs. And yes, the M1 beats the Intel ones easily.
Another thing to consider is that when you work with a native 4K
resolution screen, the GPU moves a lot more data. When you work in a
video editing app where the video is scaled to a much smaller size to fit
the panel in the app (not full screen), then you need to consider that
the app itself can do previews in the background so that when you need to
display video it is very fast.
This is why the real test is when someone exports videos (which the
newer benchmarks from people who didn't get seeds will publish). Those benchmarks still show the M1 ahead. But for such work, the performance
is not as stellar as just showing the video on screen. And this is where
the 16GB has better performance, due in part to less paging and better cooling.
it's the shape of things in the very first models.
Correct. And we don't know what the next Mxx chips will be like. But
based on how Apple presented it, the built-in GPU appears to be a fairly
hard standard and it isn't clear how Apple will deal with the Mac Pro or
iMac Pro.
Same with whether any Macs will have a PCI-E bus for expansion. The architecture and OS integration become a lot simpler if you have fixed,
self-contained configs and need not support peripherals you don't make.
more nonsense. where did you get the idea the built in ssd is useless
when booting from an external drive?
I would have to re-read the iOS security guide's section on Secure Enclave access to the drive (with the knowledge it is now used on computers,
not just iDevices).
Even Apple's public documents tell people they must do a backup to an
external drive because data on an SSD controlled by the Secure Enclave (or
T2) is, by design, non-recoverable (to prevent three-letter organisations
from spying on you).
You cannot format the drive externally and then install it. You need to
keep the APFS partition because you need to use Recovery boot to set
security options to allow remote boot, disable code signing of boot code,
etc.
On 2020-11-28 00:49, Lewis wrote:
So he honestly believes that Apple has switched their platform to a new
chip and has NO IDEA how they will release machines over the next 2
years that will be any faster or better than the $700 Mac mini they just
released.
He has referred to Apple and to hundreds of people who have received the
M1 machines as either liars or stupidly misguided or paid off
accomplices in a conspiracy to make the M1 Mac appear better than they
are.
You should apply to the Trump Organization; your ability to skew facts and
He continues to deny that the M1 machines can process 4K video better
than his 7 year old machine,
I never said that.
Jason Snell recently discussed running a de-noiser tool on his M1 mac.
This tool is the reason he purchased a 10 core iMac Pro, and the tool
maxes out the iMac. The iMac is faster, but only a bit. And the
de-noiser is running in Rosetta.
oh, so you do admit now that some Intel Macs are still faster for some
tasks.
On 2020-11-28 00:47, nospam wrote:
Yes I realize. But I am responding to your ilk's statements that 8GB is
plenty because M1 doesn't need as much memory,
they do not. that is a well established fact.
Yet to see any explanation for why applications require less RAM when
running on ARM/M1 vs x86.
The folks who got seeds from Apple and did the early benchmarks reported
only on performance and didn't report on paging. Later reports do mention
the paging that happens on the 8GB model. The fact that, even with paging, the machine outperforms the older Intel-based Apple laptop is even more admirable, but it does not make it an "established fact" that the M1 requires
less memory.
Another thing to consider is that when you work with a native 4K
resolution screen, the GPU moves a lot more data.
On 2020-11-28 5:38 p.m., JF Mezei wrote:
On 2020-11-28 09:10, Alan Browne wrote:
That it's a walled garden of sorts does not take away from the plain,
solid, uncontestable fact that they are computers.
That is fine with me. My iPhone and iPad are "appliances" in the sense
that I don't spend a lot of effort protecting them (backups) except for
any music I purchase via them or photos I take. Everything else about
them is backed up intrinsically.
Are modern fridges computers?
No. They are refrigerators that have a rudimentary computer within them.
Next.
On 2020-11-28 09:17, Alan Browne wrote:
Current Mac Pros go to 1.5 TB of RAM and I doubt Apple will disappoint
that end of the professional market.
Well your buddies Lewis, nospam and their ilk all stated 8GB is plenty because M1 requires much less memory than Intel.
Of course, you won't challenge them because I am the designated target
for any/all gratuitous insults without ever providing answers/facts.
Apple will not create a low volume variant that is significantly
different from the rest of the Axx/Mxx family in order to support
external memory, external GPUs, PCI-express etc.
Are modern fridges computers?
No. They are refrigerators that have a rudimentary computer within them.
Next.
Some fridges have built-in screens that aren't really much different to
an Android tablet device.
1) The computer comes with macOS already installed.
2) The computer comes with the recovery partition pre-installed.
You finally acknowledge that an 8GB M1 Mac far outperforms an Intel
machine even with paging,
On 2020-11-29 00:08, Lewis wrote:
1) The computer comes with macOS already installed.
2) The computer comes with the recovery partition pre-installed.
On Intel, the disk uses a GUID partitioning scheme with an EFI partition
and an APFS partition, within which are APFS volumes, including
bootable OS-X and bootable recovery.
It is not clear what Secure Boot supports as boot volumes. Consider a
case where the boot volume is expected to be APFS as the primary
partitioning scheme.
You'll have to do fancy footwork to back up the APFS volume integrally,
format the drive with a GUID scheme, create an APFS partition and then restore
the APFS backup into it to have OS-X and recovery in it, and you can then create another partition for Linux/Windows/whatever other OS you want.
But then, the question arises of what type of boot environment "Secure
Boot" supports. Does it have a EFI emulator to boot another system? Does Linux have Secure Boot support ?
That is why you are likely stuck with external drives for any alternate-OS booting. And we'll need documentation on what Secure Boot supports and
which operating systems can boot under Secure Boot.
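To make the "fancy footwork" above concrete, here is a hedged sketch of one
possible sequence with stock tools. disk4 and the volume names are
placeholders, and on Big Sur asr may not reproduce a bootable sealed system
volume, so treat it as an illustration of the steps rather than a supported
procedure:

diskutil list                                                  # identify the external disk (disk4 is assumed below)
diskutil partitionDisk disk4 GPT APFS "Restore Target" 100%    # GUID scheme with a single APFS partition
sudo asr restore --source "/Volumes/Macintosh HD" --target "/Volumes/Restore Target" --erase

Even after that, another partition would be needed for whatever other OS you
want to boot, and none of this answers the Secure Boot questions raised
above.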
On 2020-11-29 00:20, Lewis wrote:
You finally acknowledge that an 8GB M1 Mac far outperforms an Intel
machine even with paging,
I never said otherwise.
My argument is with your claim that M1 requires less memory.
If you agree that the 8GB model pages a lot more than the 16GB model,
then why do you argue that the M1 based Macs require less memory than
their Intel counterpart?
Just because performance is good despite paging does not mean that the M1
chip requires less memory to run the same programs. Consider that if
the machine had a modern amount of memory (say 24GB),
those video renders would be even faster than they are on only 8GB.
You or your ilk said that "exports" are I/O intensive.
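The paging part of this, at least, can be measured rather than asserted;
these are standard macOS tools, run while the workload in question is
active:

sysctl vm.swapusage   # total, used and free swap space
vm_stat 5             # page-ins and page-outs, sampled every 5 seconds
memory_pressure       # current system-wide memory pressure summary

Sustained page-outs under load on the 8GB model versus the 16GB model would
settle the "requires less memory" question one way or the other.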
On 2020-11-28 09:10, Alan Browne wrote:
That it's a walled garden of sorts does not take away from the plain,
solid, uncontestable fact that they are computers.
That is fine with me. My iPhone and iPad are "appliances" in the sense
that I don't spend a lot of effort protecting them (backups) except for
any music I purchase via them or photos I take. Everything else about
them is backed up intrinsically.
Are modern fridges computers? They have an OS in them, sometimes there
are software upgrades for them, and many even have a large screen on the
door, Internet connectivity and a browser.
On 2020-11-28 09:17, Alan Browne wrote:
Current Mac Pros go to 1.5 TB of RAM and I doubt Apple will disappoint
that end of the professional market.
Well your buddies Lewis, nospam and their ilk all stated 8GB is plenty because M1 requires much less memory than Intel.
In the context of an MBA and many uses of MBP or Mac Mini, 8 GB is
indeed plenty for many users whose use is browsing, e-mail,
spreadsheets, word processing, presentations, photo editing, video
editing (smaller projects) and so on. My SO has been using her MBA
thusly for years w/o issue - and it of course has GPU sharing the CPU's RAM. Alas no big excuse to buy an M1 MBA at this time.
M1 MacBook Air:
/dev/disk0 (internal):
   #:                       TYPE NAME               SIZE       IDENTIFIER
   0:      GUID_partition_scheme                    500.3 GB   disk0
   1:             Apple_APFS_ISC                    524.3 MB   disk0s1
   2:                 Apple_APFS Container disk3    494.4 GB   disk0s2
   3:        Apple_APFS_Recovery                    5.4 GB     disk0s3
Backing up and restoring is not an issue and your idiotic FUD is not
fooling anyone.
I have no idea and I also don't fucking care.
If Microsoft gets off their ass and decides to make ARM64 Windows not a steaming pile of shit, then they will get it running on M1 Macs.
You can run a Linux OS on the M1 Macs right now, of course,
That the M1 requires less memory is a FACT that you just acknowledged
above, in the part that you snipped out.
It is not opinion, the M1 Macs do a lot more, a lot faster, and they do
it with LESS RAM.
It might, it might not. I do not have an 8GB model and I am not likely to have one,
You or your ilk said that "exports" are I/O intensive.
No, shit-for-brains, I never said anything like that. CONVERTING is intensive.
They are fridges equipped with a computer with very limited 'scope' of use.
On 2020-11-29 08:55, Alan Browne wrote:
They are fridges equipped with a computer with very limited 'scope' of use.
Is a fridge equipped with a web browser, touch screen, keeping track of inventory in the fridge, etc. just a fridge, but a Chromebook that does the
same a computer?
On 2020-11-29 06:59, Lewis wrote:
M1 MacBook Air:
/dev/disk0 (internal):
   #:                       TYPE NAME               SIZE       IDENTIFIER
   0:      GUID_partition_scheme                    500.3 GB   disk0
   1:             Apple_APFS_ISC                    524.3 MB   disk0s1
   2:                 Apple_APFS Container disk3    494.4 GB   disk0s2
   3:        Apple_APFS_Recovery                    5.4 GB     disk0s3
Thanks for the info. A rare time when you actually provide any.
It's how conversations should proceed when there are questions.
Does this mean that iOS still runs on GUID with APFS on top?
Backing up and restoring is not an issue and your idiotic FUD is not
fooling anyone.
Apple states you have to back up to an external drive because a backup on the
internal SSD is not recoverable due to encryption that is tied to that specific hardware.
I have no idea and I also don't fucking care.
Yet you choose to insult people who do care.
If Microsoft gets off their ass and decided to make ARM64 Windows not a
steaming pile of shit, then they will get it running on M1 Macs.
Heard something about licensing problems between Microsoft and ARM. No
idea if true or not. But the boot console remains the issue, since
Microsoft would have to get the specs for Secure Boot and find a way to
either sign the OS or permanently disable code signing. (If the OS-X
recovery partition is gone, there would be no way to undo a "factory
reset" of NVRAM to allow an unsigned OS to boot.)
On 2020-11-29 07:09, Lewis wrote:
That the M1 requires less memory is a FACT that you just acknowledged
above, in the part that you snipped out.
Funny how you are unable to provide any factual reason why a 64-bit RISC architecture would require less RAM to run the same app than a
64-bit CISC architecture does.
Making a claim often enough doesn't make it a "FACT".
You can run a Linux OS on the M1 Macs right now, of course,
Linus Torvalds disagrees with you.
On 2020-11-29 07:09, Lewis wrote:
It might, it might not. I do not have an 8GB model and I am not likely to
have one,
So, you posted a disk utility listing without having an M1?
Apple states you have to back up to an external drive because a backup on the
internal SSD is not recoverable due to encryption that is tied to that specific hardware.
You do not back up to the same physical drive you are trying to back up.
NO ONE does that, because it is MORONIC. Again, what the fuck is wrong
with you?
In article <slrnrs85p0.v8c.g.kreme@claragold.local>, Lewis <g.kreme@kreme.dont-email.me> wrote:
Apple states you have to back up to an external drive because a backup on the
internal SSD is not recoverable due to encryption that is tied to that specific
hardware.
You do not back up to the same physical drive you are trying to back up.
NO ONE does that, because it is MORONIC. Again, what the fuck is wrong
with you?
there are a lot of morons in this world.
some people do exactly that, backing up to another partition on the
same drive. i wish i was kidding.
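For what it's worth, Time Machine makes the sane choice easy enough;
/Volumes/Backups below is a placeholder for whatever the external volume is
actually called:

sudo tmutil setdestination /Volumes/Backups   # point Time Machine at the external drive
tmutil destinationinfo                        # confirm where backups will actually be written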
On 2020-11-29 12:48 p.m., JF Mezei wrote:
On 2020-11-29 08:55, Alan Browne wrote:
They are fridges equipped with a computer with very limited 'scope' of use.
Is a fridge equipped with a web browser, touch screen, keeping track of
inventory in the fridge, etc. just a fridge, but a Chromebook that does the
same a computer?
A fridge with an integrated computer is still a fridge.
A fridge with an integrated computer is still a fridge.
Don't know. Don't care.
It doesn't change your bullshit into truth.
That certainly would explain why JF thinks being told not to back up to
the internal SSD is a "problem", as there is nothing too stupid for him.
On 2020-11-29 16:30, Alan Baker wrote:
Don't know. Don't care.
It doesn't change your bullshit into truth.
So you admit to not knowing or caring, yet somehow you have
authoritative knowledge to claim I am wrong.
On 2020-11-29 17:31, Lewis wrote:
That certainly would explain why JF things being told not to backup to
the internal SSD is a "problem" as there is nothing too stupid for him.
I stumbled on a note from Apple that for T2-equipped Macs, you need external backups. I merely related that here.
Cite please.
On 2020-11-30 03:10, Alan Baker wrote:
Cite please.
https://support.apple.com/en-us/HT208344
There was a more obvious one I had seen, but since your purpose in life is
just to insult me, there is no reason for me to waste my time on you.
I stumbled on a note from Apple that for T2-equipped Macs, you need
external backups. I merely related that here.
On 2020-11-29 16:28, Alan Baker wrote:
A fridge with an integrated computer is still a fridge.
A phone with an integrated computer is still a phone.
My Siemens M55
But I view it more as an appliance than a computer.
Few people write code or scripts on their iPhone to run on the iPhone.
You write software on a real computer
The definition of "computer" is difficult to pin down.
On 2020-11-30 2:03 a.m., JF Mezei wrote:
On 2020-11-30 03:10, Alan Baker wrote:
Cite please.
https://support.apple.com/en-us/HT208344
There was a more obvious one I had seen, but since your purpose in life is
just to insult me, there is no reason for me to waste my time on you.
I can see why you snipped your own earlier statement:
I stumbled on a note from Apple that for T2-equipped Macs, you need external backups. I merely related that here.
And then we see what the article actually says:
'Always back up your content to a secure external drive or other secure backup location so that you can restore it, if necessary.'
So the point is not the externality. That's a given.
What they're discussing is the need to use a SECURE backup if you want to maintain your security; backing up to an insecure location defeats
the whole purpose.
What they are NOT doing is drawing any relationship between a T2 chip
and some special need to do your backups externally because you have a machine with one.
You lose.
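As a concrete example of a "secure backup location" in the sense of that
support article, the external backup volume can simply be encrypted; disk5s2
below is a placeholder for an APFS backup volume, so check diskutil list for
the real identifier:

diskutil apfs encryptVolume disk5s2 -user disk   # prompts for a passphrase; backups on that volume are then encrypted at rest

Ticking "Encrypt backups" when selecting the destination in Time Machine
preferences should accomplish the same thing.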