2019-04-24 15:24:44 (GMT)
Hi everyone!
I have been fixing various things in Zip and I found something that confused me.
2019-04-24 15:25:14 (GMT)
Namely, Zip had an ext_neg rule that would rewrite literals of functional type by applying skolems to them.
2019-04-24 15:25:21 (GMT)
I have two problems understanding this:
2019-04-24 15:26:23 (GMT)
1. Why would it rewrite the literal? Isn't this incomplete?
2. Why would it supply only the variables appearing in the literal to the skolem? Shouldn't all variables appearing in the clause be supplied?
2019-04-24 15:27:17 (GMT)
1. old (circa 2017) support for HO was incomplete, yes
2. might be a bug, can't remember
2019-04-24 15:27:42 (GMT)
Ok ok thanks :)
I have now implemented a new rule that corresponds to NegExt in the paper :)
2019-04-24 15:28:39 (GMT)
I always assumed that the implemented rule wouldn't delete the old clause. Apparently I didn't look closely enough.
2019-04-24 15:30:13 (GMT)
It's registered as a literal rule, meaning a simplification rule operating at the literal level.
2019-04-24 15:30:19 (GMT)
It is a literal rewrite rule.
As far as I understand it changes the literal in place.
2019-04-24 15:31:16 (GMT)
Ok, I trust you are right about this. I never checked.
2019-04-24 15:33:54 (GMT)
This is a sign that maybe Env.add_lit_rule should be named Env.add_lit_rw_rule, so it's clearer that it's a simplification rule.
2019-04-24 15:34:55 (GMT)
Yes, that might have made it clearer to me
2019-04-24 15:40:03 (GMT)
But for question number two, I think the implementation may have been correct.
Example: f X = c \/ g Y != h Z
An instance of the extensionality axiom is: g Y = h Z \/ g Y (diff (g Y) (h Z)) != h Z (diff (g Y) (h Z))
Then we can conclude: f X = c \/ g Y (diff (g Y) (h Z)) != h Z (diff (g Y) (h Z)).
And we can abbreviate diff (g Y) (h Z) by sk Y Z:
f X = c \/ g Y (sk Y Z) != h Z (sk Y Z).
As you can see, only the variables in the literal need to appear as arguments of the skolem.
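In inference-rule form, the step above could be stated like this (a sketch only; the paper's exact side conditions, such as ordering and eligibility restrictions, are omitted here):

```latex
% NegExt (sketch): s and t of functional type, \bar{x} the free
% variables of the literal, sk a fresh skolem symbol.
\[
  \frac{C \lor s \not\approx t}
       {C \lor s\,(\mathrm{sk}\,\bar{x}) \not\approx t\,(\mathrm{sk}\,\bar{x})}
  \quad \textsc{NegExt}
\]
```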
2019-04-24 15:41:12 (GMT)
I need to write correctness proofs for the "Extensions" section of our paper when I am back. Then I will find these kinds of errors.
2019-04-24 15:43:14 (GMT)
Fwiw, I initially wrote that with skolem constants, then realized it was wrong not to include variables, and picked the ones in the literal. I'm sure I thought about it in more detail but it's hard to remember now.
⇒ proofs are good!
2019-04-24 15:45:32 (GMT)
Now might also be a good time to ask about other things in Zipperposition that are not obvious to y'all, @Alexander Bentkamp @Petar Vukmirovic
2019-04-24 16:46:45 (GMT)
Thanks a lot Alex! I will correct the implementation tomorrow!
2019-04-24 18:29:33 (GMT)
Also, on my home laptop opam does not find the package iter.
Do you know how I can fix this?
2019-04-24 18:35:02 (GMT)
opam update?
2019-04-24 18:35:21 (GMT)
(check that you have opam2, btw)
2019-04-24 18:41:36 (GMT)
I have opam 1.2 and I did opam update. How do I update it to 2?
2019-04-24 18:43:43 (GMT)
You probably need to reinstall opam/update opam via your OS, sorry :-/
opam 1.2 is really old at this point and everyone has moved to opam2 a while ago. On archlinux it's packaged, btw :)
2019-04-24 18:43:57 (GMT)
(I can help in pv/email if you run into troubles)
2019-04-25 08:05:47 (GMT)
no no, thanks anyways, I will find ways to do it :)
Just didn't know the differences between opam1 and opam2 were that great.
2019-04-25 13:38:09 (GMT)
The biggest difference is the opam file format, which is why the opam1 repo has been frozen for a while, and all new packages are on the opam2 repo (and thus you can't see Iter from opam1). Otherwise the CLI is better and external dependencies have been removed.
2019-04-25 17:11:11 (GMT)
Thanks a lot!
2019-04-25 17:21:25 (GMT)
One more thing. Today I nearly lost all my hair trying to implement dupSup.
2019-04-25 17:23:26 (GMT)
the problem is that half of the variables come from one scope, and the other half comes from the other scope
Is it going to be a huge speed penalty if I just rename all the variables apart?
2019-04-25 17:26:14 (GMT)
Is this a problem? Most inferences in superposition.ml already have this particular property that they mix literals from two distinct scopes :slight_smile:
2019-04-25 17:26:37 (GMT)
(the most subtle part is probably demodulation, but it relies on the properties of matching to still not rename variables)
2019-04-25 19:03:59 (GMT)
this rule is very specific because there is a term that is constructed so that it contains parts from two scopes :)
I will see what I can do tomorrow :)
thanks :)
2019-04-25 19:07:04 (GMT)
one more thing
on my home laptop I get build error
Error: Unbound constructor SI.Negated
My work laptop works out of the box. I think that for some reason my build system cannot find
the sat_solver module. Should I change something in some build scripts?
2019-04-25 19:15:04 (GMT)
fixed it :)
2019-04-25 19:38:42 (GMT)
this rule is very specific because there is a term that is constructed so that it contains parts from two scopes :)
I really think there are examples of that in superposition.ml already (even in simultaneous superposition). The trick is to build the mixed term _after_ substitution, in the new scope.
2019-04-25 19:39:8 (GMT)
For the SAT solver, be sure you have msat 0.8
2019-04-25 19:41:50 (GMT)
dune external-lib-deps src/main/zipperposition.exe gives you an overview of the required libs (as do the opam files)
2019-04-26 9:20:49 (GMT)
this rule is very specific because there is a term that is constructed so that it contains parts from two scopes :)
I really think there are examples of that in superposition.ml already (even in simultaneous superposition). The trick is to build the mixed term _after_ substitution, in the new scope.
Yeah, but the issue is that you have created special code that does simultaneous superposition.
I would like to reuse superposition code, which might not be possible, but I will try nevertheless to see what happens.
Thanks for all the help :)
2019-04-26 15:07:59 (GMT)
@Alexander Bentkamp: Hi Alex :)
For the rule prune_arg, you do not want to go under lambdas, right? In the text,
you say no bound variables should be applied to the variable.
I am designing an algorithm that will take care of functional dependency, so
I want to check what was the intention behind the code.
2019-04-26 15:37:04 (GMT)
What do you call "functional dependency" in this context?
2019-04-26 16:45:07 (GMT)
It turns out that in HOL, you can prune some arguments to applied vars
For example, if you have X a (f a) and X b (f b) appearing in the clause
you can remove the f a and f b arguments since they are functionally dependent
2019-04-26 16:47:30 (GMT)
@Simon Cruanes The PruneArg rule removes redundant arguments of applied variables. For example, if a clause contains the subterms X a (f a), X b (f b), and X Y (f Y) and those are all occurrences of X, then we can replace those occurrences by X a, X b, and X Y (and change the type of X accordingly). The ground instances stay the same.
In general, an argument can be removed if it "functionally" depends on the other arguments, i.e. if it can be expressed syntactically in terms of the other arguments. Two interesting special cases are when X has a duplicate argument, e.g. X a a, X b b, and X Y Y, and when X has one constant argument, e.g. X a, X a, and X a.
@Petar Vukmirovic Below lambdas is ok. The only important thing is that the ground instances stay the same. Actually, even bound variables in the arguments would be ok in some cases. It's just that the function describing the functional dependency must not refer to bound variables.
For example: λ u, f (X u u) (X (g u) (g u)) can be reduced to λ u, f (X u) (X (g u)). (The functional dependency is Y ↦ Y)
But: λ u, f (X g (g u)) (X h (h u)) must not be reduced because the functional dependency Y ↦ Y u refers to the bound variable u, which is not allowed. If we performed this reduction, we would remove ground instances.
2019-04-26 16:54:40 (GMT)
@Alexander Bentkamp Regarding the PruneArg rule, did you observe improvements because of it? I was wondering about its applicability after I saw it in your paper
2019-04-26 16:54:48 (GMT)
Ok thanks. I see what you mean here. It is not that easy to differentiate between those cases, so for now I will not include that.
2019-04-26 16:55:18 (GMT)
I created an algorithm that subsumes two things you implemented + does functional dependency
2019-04-26 16:56:10 (GMT)
Very nice!
2019-04-26 16:56:35 (GMT)
For each argument of a variable, the algorithm creates a set of all the ways the argument can be described by the other arguments.
Then, it computes the intersection of these sets across all appearances
and analyzes that set to see what can be pruned
2019-04-26 16:56:56 (GMT)
Actually I really liked your implementation, which was smart + elegant + efficient
and it gave me the idea to create this
2019-04-26 16:57:09 (GMT)
@Alexander Bentkamp Regarding the PruneArg rule, did you observe improvements because of it? I was wondering about its applicability after I saw it in your paper
I don't know how much it helps. We should evaluate it.
2019-04-26 16:57:31 (GMT)
Actually I really liked your implementation which was smart + elegant + efficient
and it gave me idea to create this
Opposed to my implementation of JP unification :D
2019-04-26 16:57:58 (GMT)
JP unification is hopeless... :)
2019-04-26 16:58:37 (GMT)
Alexander Bentkamp Regarding the PruneArg rule, did you observe improvements because of it? I was wondering about its applicability after I saw it in your paper
Interestingly, it was used in 20-30 problems only Zipperposition could solve. Getting rid of arguments is important, because it will make unification less crazy.
2019-04-26 17:01:06 (GMT)
So if you had occurrences X a (f (f a)) b c a, X b (f (f b)) b d b, X Y (f (f Y)) b e Y
2019-04-26 17:01:48 (GMT)
it will create three lists of sets for arguments:
2019-04-26 17:03:42 (GMT)
[{a, 0}, {f (f a), f (f 0), f (f 4)}, {b}, {c}, {a, 4}],
[{b, 0}, {f (f b), f (f 0), f (f 4)}, {b}, {d}, {b, 4}],
[{Y, 0}, {f (f Y), f (f 0), f (f 4)}, {b}, {e}, {Y, 4}],
2019-04-26 17:03:53 (GMT)
When you take intersections you end up with
2019-04-26 17:04:30 (GMT)
[{0}, {f (f 0), f (f 4)}, {b}, {}, {4}]
2019-04-26 17:05:07 (GMT)
from the first one, you see that you need to get rid of the fourth argument (with DB index 0), so you delete (ignore for efficiency) all the occurrences of 4
2019-04-26 17:05:33 (GMT)
Interestingly, it was used in 20-30 problems only Zipperposition could solve.
Hum, that sounds good. Is this comparing with Ehoh / Leo-III / Satallax?
2019-04-26 17:05:49 (GMT)
then you realize that the second argument can be deleted because there is a dependency on the first (that is, the fourth) argument
and you realize that b is always the same
2019-04-26 17:06:41 (GMT)
so it really generalizes what the paper says
and also if some argument appears 2+ times, it is going to find all of the occurrences and remove them in one fell swoop
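A rough standalone OCaml sketch of the intersection step described above (hypothetical code, not the actual implementation: terms are plain strings, "descriptions" are limited to copies of other argument positions, and positions are counted left to right rather than with the de Bruijn-style indices used above):

```ocaml
module IntSet = Set.Make (Int)

(* Description set of position [j] in one occurrence: the set of
   argument positions carrying a term syntactically equal to argument j. *)
let descr args j =
  let aj = List.nth args j in
  List.mapi (fun i a -> (i, a)) args
  |> List.filter (fun (_, a) -> a = aj)
  |> List.map fst
  |> IntSet.of_list

(* Position [j] is prunable if, intersecting its description sets over
   all occurrences, some earlier position i < j always carries the same
   argument. *)
let prunable occurrences j =
  match occurrences with
  | [] -> false
  | o :: os ->
    let inter =
      List.fold_left (fun acc args -> IntSet.inter acc (descr args j))
        (descr o j) os
    in
    IntSet.exists (fun i -> i < j) inter

let () =
  (* X a a, X b b, X Y Y: the second argument always copies the first,
     so it can be pruned; the first argument cannot. *)
  let occs = [ ["a"; "a"]; ["b"; "b"]; ["Y"; "Y"] ] in
  assert (prunable occs 1);
  assert (not (prunable occs 0));
  print_endline "ok"
```

In the real setting the description sets would also contain constants and terms built from other positions (as in the f (f 0) example above); this sketch only covers the duplicate-argument special case.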
2019-04-26 17:07:02 (GMT)
Getting rid of arguments is important, because it will make unification less crazy.
Hum, I don't quite see why. If one argument is a function of another wouldn't it be the case that having solved the unification problem up to the previous argument already provides a solution to that argument?
2019-04-26 17:07:34 (GMT)
Because if you had a unification problem X a a = f a, you can project on the first or the second argument
2019-04-26 17:08:00 (GMT)
If this was on some super deep layer of unification, everything above that would duplicate
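To spell the branching out (a sketch of the Huet-style projections only; imitation bindings are left aside): for the flex-rigid pair X a a ≟ f a, projecting on either argument yields a solution, and pruning the duplicate argument collapses the two branches into one:

```latex
% X a a =? f a: two incomparable projection bindings
\[
  X \mapsto \lambda x\, y.\; f\, x
  \qquad\text{or}\qquad
  X \mapsto \lambda x\, y.\; f\, y
\]
% After pruning the duplicate argument, the problem becomes
% X' a =? f a, and a single projection X' \mapsto \lambda x.\; f\, x remains.
```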
2019-04-26 17:09:05 (GMT)
Hum, I think I was thinking only of FO unification hehe
2019-04-26 17:09:38 (GMT)
You are a lucky man if you have to think only about FO unification :)
2019-04-26 17:10:09 (GMT)
We are gonna start thinking about HO unification soon though....
2019-04-26 17:11:06 (GMT)
Good luck :)
2019-04-26 17:39:45 (GMT)
re: functional dependencies: ok it makes sense, since X a can recreate f a easily. Cool trick.
2019-04-26 18:22:20 (GMT)
We are gonna start thinking about HO unification soon though....
You're mostly thinking about HO matching, in a SMT context, aren't you?
2019-04-26 18:58:07 (GMT)
mostly, yes. But for finding conflicting instances it's actually E-unification rather than E-matching.
2019-05-13 14:36:28 (GMT)
Does anyone know if there is a way to record and print a substitution applied in an inference from the proof object?
2019-05-13 15:33:49 (GMT)
In each Proof.step there's a list of parents, which are either proofs or proofs with substitutions (Subst.P_subst). It's used in the (partial) checker in src/proofs in which there's a decomposition between instantiation steps (pure substitution) and reasoning step (pure ground reasoning with "frozen" universal variables).
You can print low-level proofs with --check --dot-llproof <file> (see attached svg file on pb64.zf)
2019-05-13 17:10:24 (GMT)
(which reminds me that updating the proof checking for HO inferences would be absolutely great to have)
2019-05-14 8:21:10 (GMT)
Simon Cruanes: (which reminds me that updating the proof checking for HO inferences would be absolutely great to have)
I'd like to do that. I'd like to understand how the proof checker works. What is actually checking the proofs?
2019-05-14 08:31:09 (GMT)
Oh, I see. It's a tableau prover?
2019-05-14 8:45:16 (GMT)
The proof checker seems to work fine on the inference rule we added (FluidSup).
2019-05-14 09:03:39 (GMT)
Is there a fundamental problem with checking extensionality and equisatisfiability or is it just that you haven't found the time to do that?
2019-05-14 13:33:23 (GMT)
So, the checker works on proofs where instantiations and actual inferences are separate, because indeed it allows to use a basic ground tableau prover on the reasoning steps (basically use the tableau to split ∧ and ∨ and a congruence closure to deal with equality on ground terms).
For extensionality, it should be relatively easy if it's done with a skolem function (like if f ≠ g becomes f witness(f,g) ≠ g witness(f,g), or using a custom rule of the tableau so it also performs its own extensionality). I think there's at least a todo item, maybe code, to deal with beta-reduction.
2019-05-16 2:31:39 (GMT)
Another remark on something I just saw (cc @Alexander Bentkamp): x == Some (f y) will almost always be false. (==) is physical equality and Some … allocates. What you'd want there is x = Some (f y) or, even better, CCOpt.equal T.equal x (Some (f y)) (if these are terms).
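A minimal self-contained illustration of the pitfall (plain OCaml, independent of the Zipperposition code):

```ocaml
let () =
  let x = Some 42 in
  (* Structural equality compares contents: this holds. *)
  assert (x = Some 42);
  (* Physical equality compares memory addresses; [Some 42] on the
     right-hand side is a fresh allocation, so the test is false. *)
  assert (not (x == Some 42));
  print_endline "ok"
```

So an (==) test against a freshly built boxed value silently degrades into "almost always false", which is exactly the bug pattern flagged above.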
2019-05-16 2:32:10 (GMT)
(seen in sup_at_var_condition on master)
2019-05-16 14:01:52 (GMT)
Hi Simon!!
About the style: I saw the email sent by GitHub. I am sorry we introduced this confusion,
I avoid tabs like the plague, but maybe it was introduced by Visa's Visual Studio Code.
We will check and make sure it does not happen :)
Cheers,
Petar.
2019-05-16 14:42:31 (GMT)
Don't be sorry, it's fine :grinning_face_with_smiling_eyes:. I just think it's a good moment to have a more uniform style, anyway — the OCaml community is moving towards it, as most modern languages have such tools already (reason, JS, TS, rust, Go, etc.)
Will show y'all this summer.
2019-05-29 11:12:21 (GMT)
Hi everyone :)
I think I might have discovered a heuristic that can be helpful with extensionality, but for it
I kind of have to know if the clause is just from the input or if several inferences with it have already been performed.
Is there a way to check this easily in Zipperposition?
Cheers,
Petar.
2019-05-29 13:40:04 (GMT)
The proof contains this kind of information, yes. Afair there's a queue heuristic that looks at the "depth" of the proof _from a goal_ (i.e. a conjecture, to be goal-oriented). So you can reuse similar code to check how many inferences (not simplifications) were made.
2019-05-29 14:22:07 (GMT)
Hi Simon!
Thanks a lot! I managed to write something that does what I wanted :)