> On Sun, 30 Jun 1996, Rogier Wolff wrote:
>
> > I was reading some NT sources (No, please don't shoot me :-), and
> > found something that Linux should've had for quite a while now.
>
> On a similar note, I was speaking to a friend who is fairly familiar with
> NT. He states that you can install NT 3.5/4.0 on computer A, a 586, and
> also on computer B, a DEC alpha, and also computer C, a PowerPC RS/6000.
>
Not true - there are DIFFERENT binary versions of NT for the different
platforms (iAPX86, MIPS R4x00, PowerPC, DEC AXP).
> Then you can go to a store and buy Word or Office for NT, and install the
> same software on computer A and on computer B and on computer C.
>
There's an EMULATION package you can get from another company (can't
remember the name) to run NT binaries built for one platform on your
choice of NT platform.
> Now I must admit this sounded really unlikely; after all, what kind of
> binary format could be implemented to generate code that is acceptable to
> a DEC Alpha, a Pentium, and a PowerPC?
>
Exactly. Your friend there needs a lesson in NT. My coworkers and I are
working on getting an AlphaServer running NT, and we wanted to see about
running Win95 apps on NT/AXP.
> My friend attributed this to a "Hardware Abstraction Layer", which I
> simply read as being a microkernel based approach to solving processor
> and platform differences, and what came to mind was mklinux running on a
> PowerMac....
The Hardware Abstraction Layer (HAL) is for 1 (ONE) thing - making it so
that all hardware (peripheral devices, not CPUs) can be talked to the
same way by user-level software through a uniform API, which comes from
NT's uniform driver interface. It does nothing to hide differences
between CPU architectures.
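To illustrate the idea (this is just a generic sketch of a uniform driver
interface, NOT NT's actual HAL or driver model - the names below are made
up): every driver fills in the same table of operations, so callers never
care which device they're talking to.

/* Illustrative sketch only: a generic uniform driver interface.
 * Every driver supplies the same set of entry points, so the caller
 * uses one API no matter what the underlying hardware is.
 */
struct device_ops {
    int  (*open)(void *dev);
    int  (*read)(void *dev, void *buf, unsigned long len);
    int  (*write)(void *dev, const void *buf, unsigned long len);
    void (*close)(void *dev);
};

/* A caller sees only the uniform interface: */
int read_block(struct device_ops *ops, void *dev, void *buf,
               unsigned long len)
{
    int n;

    if (ops->open(dev) != 0)
        return -1;
    n = ops->read(dev, buf, len);
    ops->close(dev);
    return n;
}

Note that this abstracts devices, not CPUs - the machine code of the
driver and of its caller is still specific to one architecture.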
>
> > Yet, when I got an mklinux box up, and transferred an x86 binary to it, and
> tried to run it, no go!
>
Exactly. You are still running machine code compiled for the x86, and the
original source must be rebuilt with a native compiler to run on a
different CPU architecture.
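That's exactly what the kernel's ELF loader checks before it will even
try to run a binary - the "i386" in the file(1) output you quote below
comes straight out of the ELF header's e_machine field. A simplified
user-space sketch of that check (the real thing lives in the kernel's
ELF loader and also checks the ELF class and byte order):

/* Simplified sketch: read the ELF header and refuse the binary if its
 * e_machine field doesn't match the CPU we were compiled for.
 */
#include <stdio.h>
#include <string.h>
#include <elf.h>

int can_run_here(const char *path)
{
    Elf32_Ehdr hdr;
    FILE *f = fopen(path, "rb");

    if (!f)
        return 0;
    if (fread(&hdr, sizeof(hdr), 1, f) != 1 ||
        memcmp(hdr.e_ident, ELFMAG, SELFMAG) != 0) {
        fclose(f);
        return 0;
    }
    fclose(f);

#if defined(__i386__)
    return hdr.e_machine == EM_386;   /* i386 binary on an i386 box   */
#elif defined(__powerpc__)
    return hdr.e_machine == EM_PPC;   /* PowerPC binary on a PowerMac */
#else
    return 0;                         /* be conservative elsewhere    */
#endif
}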
> % file foobar
> foobar: ELF 32-bit LSB executable i386 (386 and up) Version 1
>
> What is it that NT has supposedly implemented?
>
Just different builds of the OS for different platforms. I have been
getting the Microsoft Developer Network News for years (they just keep
sending it to me), and it explained that application source MUST BE
REBUILT to run on different CPU hardware.
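To be clear about what "different builds" means: even perfectly portable
C source comes out as a per-architecture binary. All the source can do is
ask which target this particular build was made for, e.g. with gcc's
predefined macros (illustration only):

/* One source file, but every compile of it is a different binary.
 * gcc's predefined macros only report which target this build was
 * compiled for - they can't make one binary run everywhere.
 */
#include <stdio.h>

int main(void)
{
#if defined(__alpha__)
    printf("built for DEC Alpha\n");
#elif defined(__powerpc__)
    printf("built for PowerPC\n");
#elif defined(__i386__)
    printf("built for i386 and up\n");
#else
    printf("built for some other architecture\n");
#endif
    return 0;
}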
> What does it take to do "binary compatibility" across platforms? (i.e.
> what does it take to run the *same* binary on x86, 68k, alpha, ppc?
>
EMULATION!
> It would be nice to be able to say, "hey, if you install Linux on your
> machine, it will run all the code for Linux, whether compiled on PowerMac
> or on a Pentium"...
>
Emulation is not easy, and often slow. The best emulation I've seen is
ARDI's Executor Mac emulation software.
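Just to show where the cost comes from: an emulator has to fetch, decode,
and dispatch every foreign instruction in software. Here's a toy
interpreter for a made-up three-instruction machine (purely illustrative,
no real instruction set looks like this); each emulated instruction costs
dozens of native ones.

/* Toy emulator for a made-up 3-instruction machine.  Every emulated
 * instruction goes through a fetch, a decode (the switch), and some C
 * code - that per-instruction overhead is why emulation is slow.
 */
#include <stdio.h>

enum { OP_LOADI, OP_ADD, OP_HALT };

static int run(const unsigned char *code, int *regs)
{
    int pc = 0;

    for (;;) {
        unsigned char op = code[pc++];          /* fetch   */
        switch (op) {                           /* decode  */
        case OP_LOADI:                          /* execute */
            regs[code[pc]] = code[pc + 1];
            pc += 2;
            break;
        case OP_ADD:
            regs[code[pc]] += regs[code[pc + 1]];
            pc += 2;
            break;
        case OP_HALT:
            return regs[0];
        }
    }
}

int main(void)
{
    unsigned char prog[] = {
        OP_LOADI, 0, 2,    /* r0 = 2       */
        OP_LOADI, 1, 3,    /* r1 = 3       */
        OP_ADD,   0, 1,    /* r0 = r0 + r1 */
        OP_HALT
    };
    int regs[2] = { 0, 0 };

    printf("result = %d\n", run(prog, regs));   /* prints 5 */
    return 0;
}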
> How can a microkernel approach be modified to allow binary compatibility?
>
It's not easy - do you REALLY want to make an even bigger kernel? You'd
have to have an emulator for EVERY CPU architecture that Linux runs on,
and THAT would be big and slow.
> Is this an appropriate thing to have on a wish-list for 2.1?
> Hopefully, NT on a PowerPC or a DEC box isn't *emulating* x86 code;
> otherwise we see the horrendous performance you see on a PowerMac trying
> to execute 68k code!
>
Nope, not appropriate for your 2.1 wishlist. Maybe (and it's a big maybe)
for 3.0, 3.1, or somewhere further down the road.
> It appears to me that NT has a technical advantage if what my friend says
> is true?
>
Nope, not even. Your friend is off his rocker, I have to say.
Derrik Pates
dpates@cavern.nmsu.edu
-- "Some help would be nice... Or a sandwich and a cold beer!!!" --Boston Low, "The Dig"