Alan Baker wrote:
How????????
'What Apple needed was a chip that took the lessons learned from years
of refining mobile systems-on-a-chip for iPhones, iPads, and other
products then added on all sorts of additional functionality in order
to address the expanded needs of a laptop or desktop computer.
"During the pre-silicon, when we even designed the architecture or
defined the features," Srouji recalled, "Craig and I sit in the same
room and we say, 'OK, here's what we want to design. Here are the
things that matter.'”
When Apple first announced its plans to launch the first Apple Silicon
Mac this year, onlookers speculated that the iPad Pro's A12X or A12Z
chips were a blueprint and that the new Mac chip would be something
like an A14X—a beefed-up variant of the chips that shipped in the
iPhone 12 this year.
Not exactly so, said Federighi:
"The M1 is essentially a superset, if you want to think of it relative
to A14. Because as we set out to build a Mac chip, there were many
differences from what we otherwise would have had in a corresponding,
say, A14X or something.
We had done lots of analysis of Mac application workloads, the kinds
of graphic/GPU capabilities that were required to run a typical Mac
workload, the kinds of texture formats that were required, support for
different kinds of GPU compute and things that were available on the
Mac… just even the number of cores, the ability to drive Mac-sized
displays, support for virtualization and Thunderbolt.
There are many, many capabilities we engineered into M1 that were
requirements for the Mac, but those are all superset capabilities
relative to what an app that was compiled for the iPhone would expect."'
<https://arstechnica.com/gadgets/2020/11/we-are-giddy-interviewing-apple-about-its-mac-silicon-revolution/>
How in the world was ArsTechnica so easily fooled???
'Srouji expanded on the point:
The foundation of many of the IPs that we have built and that became
foundations for M1 to go build on top of it… started over a decade
ago. As you may know, we started with our own CPU, then graphics and
ISP and Neural Engine.
So we've been building these great technologies over a decade, and
then several years back, we said, "Now it's time to use what we call
the scalable architecture." Because we had the foundation of these
great IPs, and the architecture is scalable with UMA.
Then we said, "Now it's time to go build a custom chip for the Mac,"
which is M1. It's not like some iPhone chip that is on steroids. It's
a whole different custom chip, but we do use the foundation of many of
these great IPs.'
That happens not to be how you design a processor.
But when it comes to a billion-dollar marketing campaign,
you expect hyperbole at many levels. If an executive
tries to use the reality distortion field, stuff like
that happens. Doing lines of coke off coffee tables,
that's what executives do. Engineers live in much
more austere environments: a cup of coffee, and smelly
carpeting with too much formaldehyde in it. That's
where engineering is done.
You need to find an article here (subscription-based),
as there isn't as much freelance interest in processor
architecture as there once was:
https://www.linleygroup.com/mpr/about_report.php
A good part of hardware design now comes from software people.
They know the compiler. The compiler is a long lead-time item
of some importance (it takes around ten years to make
a good one from scratch). Preparing a processor architecture
requires spotting patterns where both the compiler and the
hardware could be modified. You're staring at workload traces,
but with an eye to your favorite pet theory. You're not staring
at lines of coke on a glass coffee table.
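
To make that concrete, here is a toy sketch (my own illustration,
not anyone's production tooling) of what "staring at traces" can
look like: feed it an instruction trace or disassembly with one
mnemonic per token, and it counts which adjacent pairs occur most.
When a pair like compare-then-branch dominates the output, that's
the sort of pattern that justifies a hardware trick like macro-op
fusion, plus a compiler tweak to emit the pair back-to-back.

/* pairs.c - count adjacent mnemonic pairs in an instruction trace.
   A toy sketch; a real tool would use a hash table and a much
   bigger table. Reads whitespace-separated mnemonics on stdin. */
#include <stdio.h>
#include <string.h>

#define MAXPAIRS 4096
#define MNEM 16

struct pair { char a[MNEM], b[MNEM]; long count; };
static struct pair tab[MAXPAIRS];
static int ntab;

static void bump(const char *a, const char *b)
{
    for (int i = 0; i < ntab; i++)
        if (!strcmp(tab[i].a, a) && !strcmp(tab[i].b, b)) {
            tab[i].count++;
            return;
        }
    if (ntab < MAXPAIRS) {
        snprintf(tab[ntab].a, MNEM, "%s", a);
        snprintf(tab[ntab].b, MNEM, "%s", b);
        tab[ntab++].count = 1;
    }
}

int main(void)
{
    char prev[MNEM] = "", cur[MNEM];
    while (scanf("%15s", cur) == 1) {
        if (prev[0])
            bump(prev, cur);
        strcpy(prev, cur);
    }
    /* crude selection sort, printing the ten most common pairs */
    for (int i = 0; i < ntab && i < 10; i++) {
        int best = i;
        for (int j = i + 1; j < ntab; j++)
            if (tab[j].count > tab[best].count)
                best = j;
        struct pair t = tab[i]; tab[i] = tab[best]; tab[best] = t;
        printf("%8ld  %s -> %s\n", tab[i].count, tab[i].a, tab[i].b);
    }
    return 0;
}

Pipe it the mnemonic column of "objdump -d" output (the exact awk
incantation depends on your objdump's format) and the usual
suspects float straight to the top.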
As for "where do ideas come from", see Intel. Their innovation
came from hiring a team in Israel. Folsom didn't save them,
Israel did. In the case of Apple, I believe they may have
made a bulk purchase of some CPU people, but I don't sit
around tracking bumpf like that. Knowing which company that
was, would tell you why the processor looks the way it does.
I'm kinda curious how so many functional units can be kept
busy in parallel, because exceptionally high retirement rates
on the same clock tick aren't "normal". Four functional units
would be pushing it: maybe you could arrange for three functional
units to retire on the same tick, but squeezing a fourth into
the model is tough (meaning a fourth functional unit that
actually gets used occasionally). This is a measure of
"how much parallelism exists in normal code", and that
situation has been stagnant for some time. AMD's Zen 3 made
a small move in that direction.
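
You can get a crude feel for that limit on your own machine.
Below is a minimal sketch (mine, and only meaningful if the
compiler doesn't outsmart it - worth checking the generated code):
the same number of multiply-adds run first as one long dependency
chain, then as four independent chains. The ratio of the two times
hints at how much of that parallelism the core can actually extract.

/* ilp.c - crude probe of instruction-level parallelism.
   A sketch, not a proper benchmark: no warmup, no pinning. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define N (1u << 28)
#define K 6364136223846793005ULL    /* arbitrary odd multiplier */

static double now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    /* One dependency chain: every multiply-add waits for the
       previous one, so we run at the multiplier's latency. */
    double t0 = now();
    uint64_t x = 1;
    for (uint32_t i = 0; i < N; i++)
        x = x * K + i;
    double t1 = now();

    /* Same total work as four independent chains: an out-of-order
       core can keep them all in flight at once. */
    uint64_t a = 1, b = 2, c = 3, d = 4;
    for (uint32_t i = 0; i < N; i += 4) {
        a = a * K + i;
        b = b * K + i;
        c = c * K + i;
        d = d * K + i;
    }
    double t2 = now();

    printf("check value   : %llx\n",
           (unsigned long long)(x ^ a ^ b ^ c ^ d));
    printf("1 chain       : %.3f s\n", t1 - t0);
    printf("4 chains      : %.3f s\n", t2 - t1);
    printf("ratio (~ILP)  : %.2fx\n", (t1 - t0) / (t2 - t1));
    return 0;
}

On a typical x86 core you'll see roughly 3x, because there is one
pipelined integer multiplier (latency around 3 ticks, throughput
1 per tick). And that's the catch: real code rarely hands the
scheduler even four clean, independent chains to play with.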
And this is where a journal that specializes in that sort of
analysis would shred whatever PR campaign was ongoing.
As for Anandtech, one of the web pages mentions "we used what we had",
which means they weren't able to apply their normal benchmark suite.
You know, Anand left Anandtech years ago to work at Apple.
Anandtech was bought by a magazine company, and that magazine
company also bought Tomshardware (and probably a few others).
Another good supply of analysis used to come from the Russians
(ixbt.com). They knew some things about the Pentium 4 that nobody
else knew (how hyperthreading worked, and how the first
hyperthreading implementation had a bug in its recirculator).
Some of their analysis was from first principles: they could write
code and demonstrate that an idea they had was real. But they're
not around any more, so scratch another source of analysis potential.
I'm sorry, but if you're expecting any sort of reasoned
(non-NDA) analysis these days, you pretty well have to pay for
it. The kiddies can only regurgitate what they're given.
But there are some people with the chops to do the analysis.
You couldn't even visit comp.arch any more and expect anything
cogent. I think that was spammed out of existence at some point
and the people left.
*******
One thing I've learned over the years is that hardware is useless
without good software. And then the question is whether the
software for a product is what an individual user wants or not.
That's where my interest in the M1 trails right off... The ability
of poorly written software to squander tiny improvements in
hardware is legendary. This is why today you can hardly have
a web browser that isn't railed and non-responding. An M1
won't fix that. Most of the time, my processor components
sit there unused - but that's software for you.
This is also the reason I'm not interested in owning anything
with an NVMe drive in it. Fine, fine hardware, rendered useless
by software.
Just yesterday, I did an experiment with a ramdisk, where
the file copy rate was 1.8MB/sec. Now think about that for
a moment, just how pathetic that is. A "device" with a 5GB/sec
sequential benchmark that, in a real-life test situation,
can only manage 1.8MB/sec. (That's the NTFS FUSE file
system on a recent Linux distro.) This is why "dreams of M1"
should be tempered with reality, and that reality is
the bloated software we use today. That software is... everywhere.
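
If you want to reproduce that kind of number, something as dumb
as the sketch below will do (the file paths are whatever you
choose, and the buffer size is arbitrary). Run it against a file
on the mount you're judging, and compare the printed rate with
the drive's spec-sheet figure.

/* copyrate.c - effective file-copy throughput, the blunt way.
   A rough sketch: one reader, one writer, stdio buffering. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>
#include <unistd.h>

#define BUFSZ (64 * 1024)

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <src> <dst>\n", argv[0]);
        return 1;
    }
    FILE *in  = fopen(argv[1], "rb");
    FILE *out = fopen(argv[2], "wb");
    if (!in || !out) { perror("fopen"); return 1; }

    static char buf[BUFSZ];
    long long total = 0;
    size_t n;
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    while ((n = fread(buf, 1, BUFSZ, in)) > 0) {
        if (fwrite(buf, 1, n, out) != n) { perror("fwrite"); return 1; }
        total += n;
    }
    fflush(out);
    fsync(fileno(out));   /* push it to the device, or the page
                             cache will flatter the result */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec)
                + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%lld bytes in %.3f s = %.2f MB/s\n",
           total, secs, total / secs / 1e6);
    fclose(out);
    fclose(in);
    return 0;
}

The gap between the GB/sec printed on the box and what a
filesystem layer actually delivers is the whole point of the
exercise.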
This is the Year of the Container, as containers and virtualization
threaten to ruin multiple ecosystems at the same time. Try loading
a Snap and see how long it takes before your application is
ready (20 seconds). Does Apple use containers? Does Apple use
virtualization? Of course they do. I don't even need to check,
because there's plenty of copying and "me too" in the software
industry, even if critical analysis would tell them the idea
was wrong.
This is why I no longer get excited about "whizzy hardware".
A software guy will always find a way to ruin it.
On Sun, 22 Nov 2020 23:43:10 -0800, Alan Baker wrote:
Try watching a few videos of how fast software loads on the new M1 Macs...
To Paul,
On 2020-11-23 9:08 a.m., Arlen Holder wrote:
Notice how Arlen doesn't actually address a single point that was made.
On Mon, 23 Nov 2020 10:45:59 -0700, Ken Blake wrote:
Notice that you should never reply to trolls.
And yet, my post was _filled_ to the brim with facts!
o Every fact I claim is backed up by well-cited public reports, in fact!
FACTS:
Facts are what intelligent adults use to form their belief systems
o Not purely bullshit (but verrrry pretttty) MARKETING shills!
Adults don't believe in Santa Claus, because adults comprehend facts
o Children believe in him because they can't separate facts from MARKETING
FACTS:
Apple licensed ARM technology and fabs on TSMC silicon for good business
reasons, but not for the reasons they will endlessly shill in brochures.
These Apple apologists will be completely immune to _why_ Apple needs to
spin this cost-cutting production-control timing-based decision to switch.
Luckily we covered, in gory detail, _why_ Apple has to spin this decision:
o Apple Plans to Announce Move to Its Own Mac Chips at WWDC [ARM, TSMC, 5nm, A14]
<https://groups.google.com/g/comp.sys.mac.system/c/iN5nqHcaZmM>
Remember the Apple spin of "you're holding it wrong" or "it's courageous"
to remove functionality so that the customer has to later buy it back?
Just as Apple had to spin its decision to take away the basic headphone
jack and then tell you to buy that functionality back, a jack which
99.95% of all Android devices have today (simply because it _is_ basic
functionality).
o How many of the existing Android phones lack headphone jack basic hardware functionality?
<https://groups.google.com/g/comp.mobile.android/c/ZjnD2kAf-mI>
Or Apple's spin that removing basic accessories (again, so that the
consumer has to buy them back) is being "green"?
o Which recent 2019 & 2020 Android phones do NOT come with a charger in the box (that are in the price range of an iPhone)?
<https://groups.google.com/g/comp.mobile.android/c/-FSFIHYbs3o>
On Fri, 20 Nov 2020 20:03:01 -0500, Paul wrote:
This is why I no longer get excited about "whizzy hardware".
Keywords: *costs will be reduced*
The only time Apple tells the truth is in court (AFAICT).
o And this question isn't likely to make it into court.
However, rational people do discern between MARKETING bullshit
o And logical business acumen (i.e., directed toward pure profit)
Apple has both in huge quantities:
a. Marketing bullshit (Apple has the lowest R&D in all high tech!)
b. Stellar business acumen (Apple makes ungodly profit off gullibles)
See this thread for clues as to _why_ Apple is now making TSMC-Silicon
using ARM licensed technology (and see this specific post on the topic):
o Explore the new system architecture of Apple Silicon Macs, by JF Mezei <https://groups.google.com/g/comp.sys.mac.system/c/ElvAtPCgr6I/m/LXPrho3lAgAJ>
Reproduced below...
"Apple's introduction to ARM in the Mac was the T1 coprocessor chip
in the 2016 MacBook Pro with Touch Bar."
Apple is...all about... *Much ado about nothing...*
Just so you know, what happened in this thread is two things that
_always_ happen in Apple newsgroups (because of the oddities of
Apple cultists, which Alan Baker clearly is, and Apple apologists,
which nospam always is).
1. Apple makes what is really a minor technical change (in that designing with ARM and fab'ing with TSMC Silicon is no big deal whatsoever technically)...
On Fri, 20 Nov 2020 12:03:45 -0800, Alan Barker wrote:
Ed Pawloski sniffs my ass crack all over the Internet, much like
Alan Baker does & Snit did, and what strikes me is how creepy they are.
Adults on this newsgroup will note Ed's header is the exact same header
that has ruined many a newsgroup, most recently alt.home.repair.