Re: Why recovering from broken configs is too hard

From: Alexander Viro (viro@math.psu.edu)
Date: Thu May 03 2001 - 13:36:47 EST


On Thu, 3 May 2001, Eric S. Raymond wrote:

> Alexander Viro <viro@math.psu.edu>:
> > I'm not talking about connectedness of the thing. However, I suspect that
> > graph has a small subset such that removing it makes it fall apart.
>
> Um. So how does that help?

First of all, it reduces the complexity of finding the best valid
approximation (mind you, I'm not saying that it's the problem you
need to solve, just that you don't need anything close to 3^1976
even for _that_).
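
A toy sketch (made-up config names, nothing from the real constraint set)
of what "falls apart" means here: model each constraint just by the set of
variables it mentions; removing the small "core" set leaves the rest in
independent peripheral groups.

        from collections import defaultdict

        # made-up constraints, each represented only by the variables it touches
        constraints = [
            {"NET", "INET"}, {"NET", "NETFILTER"},         # networking cluster
            {"SCSI", "SD"}, {"SCSI", "SR"},                # SCSI cluster
            {"NET", "EXPERIMENTAL"}, {"SCSI", "EXPERIMENTAL"},
        ]
        core = {"EXPERIMENTAL"}        # the small separator set

        # adjacency among non-core variables only
        adj = defaultdict(set)
        for c in constraints:
            rest = c - core
            for v in rest:
                adj[v] |= rest - {v}

        # peripheral groups are the connected components of what remains
        def components(adj):
            seen, groups = set(), []
            for start in adj:
                if start in seen:
                    continue
                stack, comp = [start], set()
                while stack:
                    v = stack.pop()
                    if v not in seen:
                        seen.add(v)
                        comp.add(v)
                        stack.extend(adj[v] - seen)
                groups.append(comp)
            return groups

        print(components(adj))
        # -> two independent groups: {NET, INET, NETFILTER} and {SCSI, SD, SR}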

And a simple variation of the full thing can be used for "find a
valid approximation within a given distance".

Full (and dumb) variant first:
        for all combinations of values of core variables
                for each peripheral group
                        find the best valid approximation within that group,
                        given the values of core variables.
The cost is 3^(size of core) * 3^(size of maximal peripheral group) *
number of peripheral groups * size of maximal peripheral group * maximal
number of constraints per peripheral group.
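
As a rough sketch of the above (assuming tristate 'y'/'m'/'n' variables,
constraints given as predicates over assignment dicts, each constraint
touching only core variables plus one peripheral group, and `target` being
the user's possibly-invalid configuration; names are illustrative only):

        from itertools import product

        VALUES = ('y', 'm', 'n')    # tristate config symbols

        def distance(assignment, target):
            # how many variables differ from the user's original choices
            return sum(1 for var, val in assignment.items()
                       if target.get(var) != val)

        def best_for_group(group, cons, core_vals, target):
            # exhaustive search over one peripheral group, core already fixed;
            # returns (changes, group assignment) or None if nothing is valid
            best = None
            for combo in product(VALUES, repeat=len(group)):
                part = dict(zip(group, combo))
                full = {**core_vals, **part}
                if all(ok(full) for ok in cons):
                    d = distance(part, target)
                    if best is None or d < best[0]:
                        best = (d, part)
            return best

        def best_valid_approximation(core, groups, constraints, target):
            # the dumb variant: enumerate core assignments, solve each
            # peripheral group on its own, keep the globally closest result
            best = None
            for combo in product(VALUES, repeat=len(core)):
                core_vals = dict(zip(core, combo))
                total = distance(core_vals, target)
                merged = dict(core_vals)
                for group, cons in zip(groups, constraints):
                    res = best_for_group(group, cons, core_vals, target)
                    if res is None:
                        break
                    total += res[0]
                    merged.update(res[1])
                else:
                    if best is None or total < best[0]:
                        best = (total, merged)
            return best    # (number of changes, full assignment) or None

The two nested enumerations are exactly the 3^(size of core) *
3^(size of maximal peripheral group) factors in the estimate above.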

Again, I'm _not_ suggesting to do that. However, a limited variant of the
problem (find a valid approximation within 10 changes of the given one, or
complain) is much easier.
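
A naive sketch of that bounded variant (same assumed representation as
above, and ignoring the core/peripheral split, which is what would make it
cheap in practice): it checks 0 changes first, then 1, and so on, so the
first hit is also the closest one.

        from itertools import combinations, product

        VALUES = ('y', 'm', 'n')

        def approximation_within(target, all_constraints, max_changes=10):
            # try every way of changing at most max_changes variables of the
            # user's config; return the first valid result, or None so the
            # caller can complain
            variables = list(target)
            for k in range(max_changes + 1):
                for changed in combinations(variables, k):
                    choices = [[v for v in VALUES if v != target[var]]
                               for var in changed]
                    for new_vals in product(*choices):
                        candidate = dict(target)
                        candidate.update(zip(changed, new_vals))
                        if all(ok(candidate) for ok in all_constraints):
                            return candidate
            return None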

Eric, could you point me to a place where I could grab the current
set of constraints? In any case, I have a very strong gut feeling that
the separation of the variables into core and peripheral is an essential
part of the data, and that ignoring it makes the problem much harder than
it really is. I'd like to see how large the core actually is and how
large the peripheral groups are.
