Re: File systems are semantically impoverished compared to database

Nathan Hand (nathanh@chirp.com.au)
Sat, 26 Jun 1999 14:45:33 +1000


On Fri, Jun 25, 1999 at 10:28:21PM -0400, Arvind Sankar wrote:
> On Fri, Jun 25, 1999 at 10:55:07AM -0400, Lou Grinzo wrote:
> > You can get the performance today by using directories, but that
> > destroys usability. Documents need to appear as files to the user,
>
> Why must they appear as files to the user? I don't see quite what is very wrong
> with having a complex document as a directory.

Not only is there nothing wrong with this approach, but for people who
use LaTeX for largish documents this is the way it *has* been done for
a number of years.

Typically you create a "document" as a directory. Inside the directory
you break the book up into chapters, and I use a main.tex to pull those
chapters together (that's just my convention).
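
For anyone who hasn't seen this layout, a minimal main.tex along those
lines might look like the sketch below (the chapter file names are just
made-up examples, not from any real project):

    \documentclass{book}
    \begin{document}
    \tableofcontents
    \include{chapter1}   % each chapter lives in its own chapterN.tex
    \include{chapter2}
    \include{chapter3}
    \end{document}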

The table of contents is automatically generated for me and kept in an
additional file. Same deal for the glossary, the list of figures, and so
on. It is all automated by the build process (a Makefile).
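
The Makefile doesn't need to be anything clever. A rough sketch (file
names invented, and remember recipe lines must start with a tab) would
be something like:

    # Rebuild main.dvi whenever the master file or a chapter changes.
    # latex runs twice so the .toc and .aux files are picked up.
    main.dvi: main.tex chapter1.tex chapter2.tex chapter3.tex
    	latex main.tex
    	latex main.tex

    clean:
    	rm -f main.dvi main.aux main.toc main.log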

Images, figures, even code snippets, are kept in separate files. There
are obvious advantages to this scheme. I can, for example, check that an
extracted snippet of code compiles, and be sure the final print includes
that debugged snippet. Figures can be opened in their own editors without
any modification to those editors: they already understand files and
directories. The final print always includes the latest graphic.
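
As a concrete (hypothetical) example of the code snippet trick: with the
standard verbatim package, the very file that the Makefile compiles can
be pulled into the print run, so the book can never show a stale listing:

    % in the preamble
    \usepackage{verbatim}

    % in the chapter text: typeset the same hello.c that "make" compiles
    \verbatiminput{snippets/hello.c}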

Looking back on this, it's obvious that each image, chapter, figure and
code snippet was a "fork" of the document. The actual document was never
just the main.dvi that got produced; it was the entire directory.

So the more I look at the problem, the more I realise that Linux doesn't
need a forked filesystem: directories, plus applications that think in a
UNIX way, already deal with the problem nicely.

It wasn't even difficult moving LaTeX "documents" to other machines. I
simply tarred the lot up and transferred the tar file (or, more
recently, I use a recursive get in lftp). This is analogous to
BinHexing a forked file on the Mac.
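
For what it's worth, the transfer is a one-liner in each direction
(directory and file names are only examples):

    tar cf book.tar book/     # pack up the "document" directory
    tar xf book.tar           # unpack it on the other machine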

> If they must indeed be files, what kind of structure is supposed to be provided
> by the OS? It seems that there are very few things that _all_ applications
> would want to be in there, no? All I can think of is an icon and the mime type
> of the file. Even the icon would probably be deducible from the mime type.
> (Storing the name of the executable that is used to open the document is IMHO
> wrong. There are too many different places that an executable might live, and
> there are too many different executables that might be used to open a single
> file type. Eg do you use gimp, xv, or electric eye to open a JPEG file?)

I agree.

Keep forked filesystems out of Linux. Move forks into userspace if they
are needed, so that double-clicking the LaTeX "document directory" starts
up LyX or a binder application; filesystem support isn't needed.

And in all honesty, what is stopping an application from keeping its
document format as a tar file, unpacking it into /tmp when you start
work on a document, and re-tarring it at the end? You get all the
speed, and it stays very simple.
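
A rough sketch of what such an application might do behind the scenes
(the document name, and the use of mktemp, are just illustrative
assumptions):

    # on open: unpack the "document" into a scratch directory
    WORK=`mktemp -d /tmp/report.XXXXXX`
    tar xf report.doc -C "$WORK"

    # ... the application edits the individual files under $WORK ...

    # on save: repack the directory as a single document file
    tar cf report.doc -C "$WORK" .
    rm -rf "$WORK"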

Forks are a smart and useful idea, but you can do it all in userspace.

-- 
Nathan Hand - Chirp Web Design - http://www.chirp.com.au/ - $e^{i\pi}+1 = 0$
Phone: +61 2 6230 1871   Fax: +61 2 6230 4455   E-mail: nathanh@chirp.com.au
