Propositional Independence - Formula-Variable Independence and Forgetting

J. Lang, P. Liberatore and P. Marquis

Volume 18, 2003

Abstract:
Independence -- the study of what is relevant to a given problem of reasoning -- has received increasing attention from the AI community. In this paper, we consider two basic forms of independence, namely, a syntactic one and a semantic one. We show features and drawbacks of them. In particular, while the syntactic form of independence is computationally easy to check, there are cases in which things that intuitively are not relevant are not recognized as such. We also consider the problem of forgetting, i.e., distilling from a knowledge base only the part that is relevant to the set of queries constructed from a subset of the alphabet. While such a process is computationally hard, it allows for a simplification of subsequent reasoning, and can thus be viewed as a form of compilation: once the relevant part of a knowledge base has been extracted, all reasoning tasks to be performed can be simplified.

Extracted Text

Journal of Artificial Intelligence Research 18 (2003) 391-443. Submitted 5/02; published 5/03

Propositional Independence:
Formula-Variable Independence and Forgetting

Jerome Lang  lang@irit.fr
IRIT-UPS, 118 route de Narbonne
31062 Toulouse Cedex, France

Paolo Liberatore  liberato@dis.uniroma1.it
Dipartimento di Informatica e Sistemistica
Universita di Roma "La Sapienza", via Salaria 113, 00198 Roma, Italy

Pierre Marquis  marquis@cril.univ-artois.fr
CRIL-CNRS/Universite d'Artois
rue de l'Universite
S.P. 16, 62307 Lens Cedex, France
Abstract

Independence -- the study of what is relevant to a given problem of reasoning -- has received increasing attention from the AI community. In this paper, we consider two basic forms of independence, namely, a syntactic one and a semantic one. We show features and drawbacks of them. In particular, while the syntactic form of independence is computationally easy to check, there are cases in which things that intuitively are not relevant are not recognized as such. We also consider the problem of forgetting, i.e., distilling from a knowledge base only the part that is relevant to the set of queries constructed from a subset of the alphabet. While such a process is computationally hard, it allows for a simplification of subsequent reasoning, and can thus be viewed as a form of compilation: once the relevant part of a knowledge base has been extracted, all reasoning tasks to be performed can be simplified.

1. Introduction

We successively present some motivations, the scope of the paper, and the contribution and the organization of the paper.
1.1 Motivations

In many situations of everyday life, we are confronted with the problem of determining what is relevant. Especially, as a preliminary step to various intelligent tasks (e.g., planning, decision making, reasoning), it is natural and reasonable to discard everything but what is relevant to achieve them efficiently. For instance, before starting to write this paper, we had to consider the relevant literature on the topic, and only it; gathering on our desks all the papers about relevance that we have at hand led us to set away our favourite cook books, because they are of no help to the task of writing this paper. The ability to focus on what is relevant (or dually to discard what is not) can be considered as a central, characteristic feature of intelligence; it is believed that over 90% of the neuronal connections of our brains are inhibitory and serve to give up sensorial inputs (Subramanian, Greiner, & Pearl, 1997). This explains why irrelevance, under various names such as independence, irredundancy, influenceability, novelty, or separability, is nowadays considered an important notion in many fields of artificial intelligence (see a survey for more details, e.g., Greiner & Subramanian, 1995; Subramanian et al., 1997).
In the following, we are concerned with relevance for reasoning. In this framework, the task is typically that of determining whether some piece of knowledge (a query) φ can be derived from a knowledge base Σ. Several reasoning schemes can be taken into account here, from the classical one (inference is classical entailment) to more sophisticated ones (common-sense inference). When dealing with such reasoning tasks, relevance is often exploited so as to make inference more efficient from a computational point of view. In other reasoning problems, the purpose is to explicitly derive some intensionally-characterized pieces of knowledge (e.g., tell me all what you know about Tweety). For such problems (that are not reducible to decision problems), relevance also has a role to play, for instance by allowing us to characterize the pieces of information we are interested in by forgetting the other ones.

To what extent is the goal of improving inference reachable? In order to address this point, a key issue is the computational complexity one. Indeed, assume that we know that the resolution of some reasoning problems can be sped up once relevant information has been elicited. In the situation where it is computationally harder to point out such information from the input than to reason directly from the input, computational benefits can hardly be expected. If so, alternative uses of relevance for reasoning are to be investigated. For instance, searching for relevance information can be limited by considering only pieces of knowledge that can be generated in a tractable way. If such information depends only on the knowledge base, another possible approach is to (tentatively) compensate the computational resources spent in deriving the relevance information through many queries (computing pieces of relevant information can then be viewed as a form of compilation).

1.2 Scope of the Paper

Little is known about the computational complexity of relevance. This paper contributes to fill this gap. The complexity of several logic-based relevance relations is identified in a propositional setting. By logic-based we mean that the notions of relevance we focus on are not extra-logical but built inside the logic: they are defined using the standard logical notions of (classical) formula, model, logical deduction, etc. The stress is laid on notions of relevance that can prove helpful for improving inference and, in particular, the most basic form of it, classical entailment. Relevance is captured by relations in the metalanguage of the logic, that is, we formalize relevance as a relation between objects of the propositional language (formulas or sets of literals/variables), thus expressing the fact that some object is relevant to some other one.
Two notions play a central role in this paper. The first one, (semantical) formula-variable independence (FV-independence for short), tells that a propositional formula Σ is independent from a given set V of variables if and only if it can be rewritten equivalently as a formula in which none of the variables in V appears. The second one is the notion of forgetting a given set V of variables in a formula Σ. It is intimately linked to the notion of formula-variable independence because, as we show, the result of forgetting the set of variables V in a formula Σ can be defined as the strongest consequence of Σ that is independent from V. Both notions have appeared in the literature under various names and with several different (but equivalent) definitions, and are highly useful for many tasks in automated reasoning and many problems in artificial intelligence:
1. automated deduction and consequence finding. Formula-variable independence can be useful for checking (un)satisfiability or for consequence finding. Indeed, structuring the knowledge base by finding useful independencies may be worth doing before running any search algorithm. This principle is at work in (Amir & McIlraith, 2000). In optimal cases, for example, a satisfiability problem will be decomposed into a small number of satisfiability problems on easier knowledge bases (with fewer variables), as shown by Park and Gelder (1996). As to improving inference, this can be particularly helpful in the situation where the set of queries under consideration is limited to formulas φ that are (syntactically or semantically) independent from a set V of variables.
2. query answering (see in particular Amir & McIlraith, 2000), diagnosis (see the work by Darwiche, 1998). Rendering a formula Σ independent from a set V of variables through variable forgetting gives rise to a formula that is query-equivalent to Σ w.r.t. V, in the sense that every logical consequence φ of Σ that is independent from V also is a logical consequence of Σ once made independent from V, and the converse holds as well. Interestingly, the set of all the formulas independent from a set of variables is a stable production field (Siegel, 1987; Inoue, 1992), and focusing on such a production field is valuable for several reasoning schemes. For instance, in the consistency-based framework for diagnosis (Reiter, 1987), the queries we are interested in are the conflicts of the system to be diagnosed, i.e., the clauses that are independent from every variable used to represent the system, except the abnormality propositions used to encode the component failures.

3. knowledge base structuring, topic-based reasoning. Formula-variable independence is a key notion for decomposing a propositional knowledge base (KB for short), i.e., a finite set of propositional formulas, into smaller subbases. Such a decomposition is all the more valuable as the number of variables the subbases depend on is low. Optimally, a knowledge base Σ = {φ1, ..., φn} is fully decomposable if it can be written as Σ = Σ1 ∪ ... ∪ Σn where Σi and Σj depend on disjoint sets of variables for all i ≠ j. Such decompositions were considered in several papers (Parikh, 1996; Amir & McIlraith, 2000; Marquis & Porquet, 2000) with somewhat different motivations. The most intuitive motivation for searching such decompositions is that it gives a better understanding of the knowledge base, by structuring it with respect to (possibly, but not necessarily, disjoint) sets of topics (Marquis & Porquet, 2000).
4. belief revision and inconsistency-tolerant reasoning. Decomposing a propositional knowledge base Σ into subbases {Σ1, ..., Σn} also proves to be relevant for defining inconsistency-tolerant inference relations as well as belief revision operators. The approach proposed by Chopra and Parikh (1999) proceeds as follows: the knowledge base Σ is first partitioned into {Σ1, ..., Σn} such that the intersections of the languages of the subbases (i.e., the sets DepVar(Σi) of variables the subbases depend on) are as small as possible; then φ is inferred from Σ w.r.t. the given partition if and only if the conjunction of all Σi such that DepVar(Σi) ∩ DepVar(φ) ≠ ∅ is consistent and entails φ. In a revision situation, this approach also ensures that the only old beliefs that may be thrown away are about the variables relevant to the input formula. Finally, as recently shown by some of the authors of the present paper, forgetting can be advantageously exploited as a weakening mechanism for recovering consistency from an inconsistent KB (Lang & Marquis, 2002).
5. belief update. A belief update operator maps a knowledge base Σ and an input formula φ expressing some explicit evolution of the world to a new knowledge base Σ ⋄ φ; Σ and Σ ⋄ φ respectively represent the agent's knowledge before and after the evolution of the world expressed by the update. Several authors argued that an update operator should preserve the part of the knowledge base not concerned by the update. This leads to the following three-stage process, proposed independently by Doherty, Lukaszewicz, and Madalinska-Bugaj (1998) and by Herzig and Rifi (1998) (and named mpma and wss#, respectively): (i) determine the variables relevant to the update, namely, DepVar(φ); (ii) forget these variables in Σ to obtain a new formula ForgetVar(Σ, DepVar(φ)); (iii) expand by φ. In more compact terms, this update operator is expressed by Σ ⋄ φ = ForgetVar(Σ, DepVar(φ)) ∧ φ. Thus, both our results on FV-independence and variable forgetting are relevant to the computational issues pertaining to this kind of belief updates.

6. reasoning about action, decision making, planning. Logical languages for reasoning about action express the effects (deterministic or not, conditional or not) of actions by means of propositional formulas (Gelfond & Lifschitz, 1993; Sandewall, 1995; Fargier, Lang, & Marquis, 2000; Herzig, Lang, Marquis, & Polacsek, 2001); finding the variables the effects are dependent on enables one to identify the variables whose truth value may be changed by the action; moreover, formula-literal independence (a refinement of FV-independence that we introduce) also tells us in which direction (from false to true and/or from true to false) the possible change may occur. This is especially useful for filtering out irrelevant actions in a decision making or planning problem.

7. preference representation. Logical languages for representing preferences write elementary goals as propositional formulas, and it is crucial to identify those variables that have no influence on the agent's preferences. Therefore, formula-variable independence has an important role to play; for instance, the framework of so-called ceteris paribus preference statements of Tan and Pearl (1994) interprets a preference item φ : ψ > ¬ψ by: for any pair of worlds (ω, ω') such that (i) ω ⊨ ψ, (ii) ω' ⊨ ¬ψ, and (iii) ω and ω' coincide on all variables φ and ψ are independent from, we have a strict preference of ω over ω'.
1.3 Contribution and Organization of the Paper

The rest of the paper is structured as follows. After some formal preliminaries given in Section 2, the key notion of formula-variable independence is presented in Section 3. Because it captures a very basic form of logic-based independence, this relation has already been introduced in the literature under several names, like influenceability (Boutilier, 1994), relevance to a subject matter (Lakemeyer, 1997), or redundancy (Doherty et al., 1998). Although it is conceptually and technically simple, this notion has not been studied in a systematic way, and our very first contribution aims at filling this gap, first by giving several equivalent characterizations of this notion (this is useful, as several papers introduced and used the same concepts under different names), and second by investigating carefully its computational complexity. In particular, we show that, in the general case, checking whether a formula is independent from a set of variables is coNP-complete. Then, we go beyond this very simple notion by introducing the more fine-grained notion of formula-literal independence, in order to discriminate the situation where a formula Σ conveys some information about a literal but no information about its negation. This refinement is helpful whenever the polarity of information is significant, which is the case in many AI fields (including closed-world reasoning and reasoning about actions). We also study several interesting notions derived from formula-variable and formula-literal independence, such as the notion of simplified formula (Σ is Lit- (Var-) simplified if it depends on every literal (variable) occurring in it) and the corresponding process of simplifying a formula. Despite this complexity, and because the size of a simplified formula can never be larger than the size of the original formula, simplification can prove a valuable relevance-based preprocessing for improving many forms of inference.
In Section 4, we turn to the second key notion, namely forgetting a given set of variables in a formula (Lin & Reiter, 1994). The forgetting process plays an important role in many AI tasks and has been studied in the literature under various other names, such as variable elimination, or marginalization (Kohlas, Moral, & Haenni, 1999). Several semantical characterizations and metatheoretic properties of forgetting are presented. Based on this notion, we introduce an additional notion of dependence between two formulas given a set of variables: we state that Σ is equivalent to Φ w.r.t. V if and only if both formulas are logically equivalent once made independent from every variable except those of V. Here again, it is important to make a distinction between a variable and its negation (for instance, one can be interested in the positive conflicts of a system only). For this purpose, we introduce a notion of literal forgetting that refines the corresponding notion of variable forgetting. We show how closed-world inference can be simply characterized from the corresponding equivalence relation. Finally, we identify the complexity of both notions of equivalence and show them to be hard (Π^p_2-complete). As a consequence, forgetting variables or literals within a formula cannot be achieved in polynomial time in the general case (unless the polynomial hierarchy collapses at the first level). We also show that a polysize propositional representation of forgetting is very unlikely to exist in the general case. We nevertheless present some restricted situations where forgetting is tractable.
Section 5 shows FV-independence closely related to notions of irrelevance already introduced in the literature by various authors. Section 6 discusses other related work and sketches further extensions of some notions and results studied in the paper. Finally, Section 7 concludes the paper. Proofs of the main propositions are reported in an appendix. A glossary of the notations is at the end of this paper, right before the bibliography.
2. Preliminaries

We first recall some basic notions from propositional logic, and from complexity theory.
2.1 Propositional Logic

Let PS be a finite set of propositional variables. PROP_PS is the propositional language built up from PS, the connectives and the Boolean constants true and false in the usual way. For every V ⊆ PS, PROP_V denotes the sublanguage of PROP_PS generated from the variables of V only. A literal of PROP_V is either a variable of V (positive literal) or the negation of a variable of V (negative literal). Their set is denoted L_V, while L⁺_V (resp. L⁻_V) denotes the set of positive (resp. negative) literals built up from V. A clause δ (resp. a term γ) of PROP_V is a (possibly empty) finite disjunction (resp. conjunction) of literals of PROP_V. A CNF (resp. a DNF) formula of PROP_V is a finite conjunction of clauses (resp. disjunction of terms) of PROP_V. As usual, every finite set of formulas from PROP_PS is identified with the formula that is the conjunction of its elements.

From now on, Σ denotes a propositional formula, i.e., a member of PROP_PS. Var(Σ) is the set of propositional variables appearing in Σ. If L ⊆ L_PS, Var(L) is the set of variables from PS upon which the literals of L are built. Elements of PS are denoted v, x, y, etc. Elements of L_PS are denoted l, l1, l2, etc. Subsets of PS are denoted V, X, Y, etc. In order to simplify notations, we assimilate every singleton V = {v} with its unique element v. The size of a formula Σ, denoted |Σ|, is the number of occurrences of variables it contains.

A propositional formula Σ is said to be in Negation Normal Form (NNF) if and only if only propositional symbols are in the scope of an occurrence of ¬ in Σ. It is well known that every propositional formula Σ built up from the connectives ∧, ∨, ¬, → only can be turned in linear time into an equivalent NNF formula by "pushing down" every occurrence of ¬ in it (i.e., exploiting De Morgan's laws) and removing double negations (since ¬ is involutive). Slightly abusing words, we call the formula resulting from this normalization process the NNF of Σ, and we note Lit(Σ) the set of literals from L_PS occurring in the NNF of Σ. For instance, the NNF of Σ = ¬((¬a ∧ b) ∨ c) is (a ∨ ¬b) ∧ ¬c; hence, we have Lit(Σ) = {a, ¬b, ¬c}. Note that the NNF of a formula depends on its syntactical structure, i.e., two formulas that are syntactically different may have different NNFs even if they are equivalent.

Full instantiations of the variables of V ⊆ PS are called V-worlds [1]; they are denoted ω_V and their set is denoted Ω_V. When A and B are two disjoint subsets of PS, ω_A ∧ ω_B denotes the A ∪ B-world that coincides with ω_A on A and with ω_B on B. An interpretation ω over PROP_PS is just a PS-world, and ω is said to be a model of Σ whenever it makes Σ true. We denote Mod(Σ) the set of models of Σ.

For every formula Σ and every variable x, Σ_{x←0} (resp. Σ_{x←1}) is the formula obtained by replacing every occurrence of x in Σ by the constant false (resp. true). Σ_{l←1} (resp. Σ_{l←0}) is an abbreviation for Σ_{x←1} (resp. Σ_{x←0}) when l is a positive literal x, and for Σ_{x←0} (resp. Σ_{x←1}) when l is a negative literal ¬x.

Given an interpretation ω and a literal l, we let Force(ω, l) denote the interpretation that gives the same truth value as ω to all variables except the variable of l, and such that Force(ω, l) ⊨ l. In other words, Force(ω, l) is the interpretation satisfying l that is the closest to ω. For instance, provided that PS = {a, b} and ω(a) = ω(b) = 1, we have Force(ω, ¬b)(a) = 1 and Force(ω, ¬b)(b) = 0. Clearly, if ω ⊨ l then Force(ω, l) = ω. If L = {l1, ..., ln} is a consistent set of literals, then Force(ω, L) is defined as Force(...(Force(ω, l1), ...), ln). Lastly, given an interpretation ω and a variable x, we let Switch(ω, x) denote the interpretation that gives the same truth value as ω to all variables except x, and that gives to x the value opposite to that given by ω.

[1] The V-worlds are also called partial models over V.
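The following sketch (not part of the paper) makes some of these definitions concrete. It assumes a toy representation in which variables are strings, negative literals are pairs ('not', x), formulas are nested tuples over ∧, ∨, ¬ (with → already eliminated), and interpretations are dictionaries mapping variables to truth values; the same representation is reused in the later sketches.

    # Toy representation (illustrative only): ('and', f, g, ...), ('or', f, g, ...),
    # ('not', f); a variable is a plain string such as 'a'.

    def nnf(f):
        """Push negations down to the variables (linear-time NNF)."""
        if isinstance(f, str):
            return f
        if f[0] == 'not':
            g = f[1]
            if isinstance(g, str):                      # ~x is already a literal
                return f
            if g[0] == 'not':                           # ~~phi is phi
                return nnf(g[1])
            dual = 'or' if g[0] == 'and' else 'and'     # De Morgan
            return (dual,) + tuple(nnf(('not', h)) for h in g[1:])
        return (f[0],) + tuple(nnf(h) for h in f[1:])

    def lit(f):
        """Lit(Sigma): the literals occurring in the NNF of Sigma."""
        g = nnf(f)
        if isinstance(g, str) or g[0] == 'not':
            return {g}
        return set().union(*(lit(h) for h in g[1:]))

    def force(omega, l):
        """Force(omega, l): closest interpretation to omega that satisfies l."""
        var, pos = (l[1], False) if isinstance(l, tuple) else (l, True)
        out = dict(omega)
        out[var] = pos
        return out

    def switch(omega, x):
        """Switch(omega, x): flip the truth value of x in omega."""
        out = dict(omega)
        out[x] = not out[x]
        return out

    # Running example: the NNF of ~((~a /\ b) \/ c) is (a \/ ~b) /\ ~c.
    sigma = ('not', ('or', ('and', ('not', 'a'), 'b'), 'c'))
    assert lit(sigma) == {'a', ('not', 'b'), ('not', 'c')}
    omega = {'a': True, 'b': True}
    assert force(omega, ('not', 'b')) == {'a': True, 'b': False}
    assert switch(omega, 'a') == {'a': False, 'b': True}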
Two formulas Ψ and Φ are said to be equivalent modulo a formula Σ if and only if Σ ∧ Ψ ≡ Σ ∧ Φ.
In this paper we use the concepts of prime implicates and prime implicants. The set of prime implicates of a formula Σ, denoted IP(Σ), is defined as:

IP(Σ) = {δ clause | Σ ⊨ δ and there is no clause δ' s.t. Σ ⊨ δ', δ' ⊨ δ and δ ⊭ δ'}.

Among all the implicates of Σ (i.e., the clauses entailed by Σ), the prime implicates of Σ are the minimal ones w.r.t. ⊨ (i.e., the logically strongest ones). The set of prime implicants of a formula Σ, denoted PI(Σ), is defined dually as:

PI(Σ) = {γ term | γ ⊨ Σ and there is no term γ' s.t. γ' ⊨ Σ, γ ⊨ γ' and γ' ⊭ γ}.

Among all the implicants of Σ (i.e., the terms implying Σ), the prime implicants of Σ are the maximal ones w.r.t. ⊨ (i.e., the logically weakest ones).

Of course, the set of prime implicants/implicates may contain equivalent terms/clauses. We can restrict our attention to one term/clause for each set of equivalent terms/clauses. Stated otherwise, in PI(Σ) and IP(Σ), only one representative per equivalence class is kept.

Example 1 Let Σ = {a ∨ b, ¬a ∧ c → e, d ↔ e}. The set of prime implicates of Σ is, by definition,

IP(Σ) = {a ∨ b, a ∨ ¬c ∨ e, ¬d ∨ e, d ∨ ¬e, a ∨ ¬c ∨ d}.
2.2 Computational Complexity

The complexity results we give in this paper refer to some complexity classes which deserve some recalls. More details can be found in Papadimitriou's (1994) textbook. Given a problem A, we denote by co-A the complementary problem of A. We assume that the classes P, NP and coNP are known to the reader. The following classes will also be considered:

* BH2 (also known as DP) is the class of all languages L such that L = L1 ∩ L2, where L1 is in NP and L2 in coNP. The canonical BH2-complete problem is sat-unsat: a pair of formulas ⟨φ, ψ⟩ is in sat-unsat if and only if φ is satisfiable and ψ is not. The complementary class coBH2 is the class of all languages L such that L = L1 ∪ L2, where L1 is in NP and L2 in coNP. The canonical coBH2-complete problem is sat-or-unsat: a pair of formulas ⟨φ, ψ⟩ is in sat-or-unsat if and only if φ is satisfiable or ψ is not.
* Δ^p_2 = P^NP is the class of all languages recognizable in polynomial time by a deterministic Turing machine equipped with an NP oracle, i.e., a device able to solve any instance of an NP or a coNP problem in unit time. FΔ^p_2 is the corresponding class of function problems.

* Σ^p_2 = NP^NP is the class of all languages recognizable in polynomial time by a nondeterministic Turing machine equipped with an NP oracle. The canonical Σ^p_2-complete problem 2-qbf is the set of all triples ⟨A = {a1, ..., am}, B = {b1, ..., bn}, Φ⟩ where A and B are two disjoint sets of propositional variables and Φ is a formula from PROP_{A∪B}. A positive instance of this problem is a triple ⟨A, B, Φ⟩ for which there exists an A-world ω_A such that for every B-world ω_B we have ω_A ∧ ω_B ⊨ Φ.

* Π^p_2 = coΣ^p_2 = coNP^NP. The canonical Π^p_2-complete problem (the ∀∃ counterpart of 2-qbf) is the set of all triples ⟨A = {a1, ..., am}, B = {b1, ..., bn}, Φ⟩ where A and B are two disjoint sets of propositional variables and Φ is a formula from PROP_{A∪B}. A positive instance of this problem is a triple ⟨A, B, Φ⟩ such that for every A-world ω_A there exists a B-world ω_B for which ω_A ∧ ω_B ⊨ Φ.

Σ^p_2 and Π^p_2 are complexity classes located at the so-called second level of the polynomial hierarchy, which plays a prominent role in knowledge representation and reasoning.
3. Formula-Literal and Formula-Variable Independence

Formula-literal and formula-variable independence capture some forms of independence between the truth values of variables and the possible truth values of a formula. Roughly speaking, Σ is dependent on l because it tells one something positive about l: more precisely, there is a context, i.e., a conjunction of literals, that, added to Σ, enables one to infer l. For instance, Σ = (a → b) is dependent on b, since b can be inferred from Σ under the assumption that a is true. Of course, we cannot assume contexts that are inconsistent with Σ, and we cannot assume l itself. A formula Σ will be considered as independent from a variable x if and only if it is both independent from x and independent from ¬x. Dually, we can interpret both forms of dependence as aboutness relations: when Σ is dependent on a literal l, it tells one something about l, and when Σ is dependent on a variable x, it tells something about x or about ¬x.

3.1 Syntactical Independence

The easiest way to define dependence between a formula Σ and a literal l is by assuming that Σ, when put into NNF, contains l. Recalling that Lit(Σ) is the set of literals that occur in the NNF of Σ, this can be formally expressed by the following definition:

Definition 1 (syntactical FL-independence) Let Σ be a formula from PROP_PS, l a literal of L_PS, and L a subset of L_PS.

* Σ is said to be syntactically Lit-dependent on l (resp. syntactically Lit-independent from l) if and only if l ∈ Lit(Σ) (resp. l ∉ Lit(Σ)).

* Σ is said to be syntactically Lit-dependent on L if and only if there is an l ∈ L such that Σ is syntactically Lit-dependent on l. Otherwise, Σ is said to be syntactically Lit-independent from L.
From this definition it follows immediately that Σ is syntactically Lit-independent from L if and only if Lit(Σ) ∩ L = ∅. Thus, in order to determine whether a formula Σ is syntactically Lit-dependent on a set L of literals, it suffices to check whether there exists a literal in L that occurs in the NNF of Σ.

Example 2 Let Φ = ¬(a ∧ b). The NNF of Φ is (¬a ∨ ¬b); therefore, Φ is syntactically Lit-dependent on both ¬a and ¬b, while it is syntactically Lit-independent from a and b.
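In the toy representation introduced in Section 2.1's sketch (illustrative only, not the paper's), the syntactical test is just a set intersection on Lit(Σ):

    def syntactically_lit_independent(sigma, L):
        """True iff Lit(sigma) contains no literal of L."""
        return lit(sigma).isdisjoint(L)

    # Example 2: Phi = ~(a /\ b) is syntactically Lit-independent from {a, b},
    # but not from {~a}.
    phi = ('not', ('and', 'a', 'b'))
    assert syntactically_lit_independent(phi, {'a', 'b'})
    assert not syntactically_lit_independent(phi, {('not', 'a')})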
As this example illustrates, a propositional formula can easily be syntactically Lit-independent from a literal while syntactically Lit-dependent on its negation. This is as expected, since Σ may imply l in some context, while it may be impossible to derive ¬l in any context. In other words, the set of literals Σ is syntactically Lit-independent from is not closed under negation. Interestingly, a notion of syntactical formula-variable independence can be defined from the more basic notion of syntactical FL-independence.
Definition 2 (syntactical FV-independence) Let Σ be a formula from PROP_PS, v a variable of PS, and V a subset of PS.

* Σ is said to be syntactically Var-dependent on v (resp. syntactically Var-independent from v) if and only if v ∈ Var(Σ) (resp. v ∉ Var(Σ)).

* Σ is said to be syntactically Var-dependent on V if and only if there is a variable v in V s.t. Σ is syntactically Var-dependent on it, i.e., if and only if Var(Σ) ∩ V ≠ ∅. Otherwise, Σ is said to be syntactically Var-independent from V.

Example 3 Let Σ = (a ∧ ¬b). Σ is syntactically Var-dependent on a and on b and syntactically Var-independent from c.

Syntactical FV-independence can be easily expressed as syntactical FL-independence: Σ is syntactically Var-independent from V if and only if it is syntactically Lit-independent from V ∪ {¬x | x ∈ V}. Clearly enough, both syntactical formula-literal independence and syntactical formula-variable independence can be checked in linear time.
However, these basic forms of independence suffer from two important drawbacks. First, they do not satisfy the principle of irrelevance of syntax: two equivalent formulas are not always syntactically independent from the same literals/variables. Second, syntactical dependence does not always capture the intuitive meaning of dependence: for instance, Σ = (a ∧ ¬b ∧ (a ∨ b)) is syntactically Lit-dependent on a, ¬b, and b; since ¬b can be derived from Σ, Σ is about ¬b in some sense. Contrastingly, there is no way to derive b from Σ, unless producing an inconsistency.

Handling such a separation requires a more robust notion of independence, to be introduced in the following section.
3.2 Semantical Independence

We now give a semantical definition of independence, which does not suffer from the aforementioned drawbacks, i.e., a definition that does not depend on the syntactical form in which formulas are expressed. We will prove that this semantical definition of independence does not suffer from the second drawback of syntactical independence, i.e., a formula that is semantically dependent on a literal always enables one to derive the literal in some context.
Definition 3 ((semantical) FL-independence) Let Σ be a formula from PROP_PS, l ∈ L_PS, and L a subset of L_PS.

* Σ is said to be Lit-independent [2] from l, denoted l ↛ Σ, if and only if there exists a formula Φ s.t. Φ ≡ Σ and Φ is syntactically Lit-independent from l. Otherwise, Σ is said to be Lit-dependent on l, denoted l ↦ Σ. The set of all literals of L_PS such that l ↦ Σ is denoted by DepLit(Σ).

* Σ is said to be Lit-independent from L, denoted L ↛ Σ, if and only if L ∩ DepLit(Σ) = ∅. Otherwise, Σ is said to be Lit-dependent on L, denoted L ↦ Σ.

[2] In order to avoid any ambiguity, we will refer to the syntactical forms of independence explicitly from now on; in other words, independence must be taken as semantical by default in the rest of the paper.
Simply rewriting the definition, Σ is Lit-independent from L if and only if there exists a formula Φ s.t. Φ ≡ Σ and Φ is syntactically Lit-independent from L. Thus, FL-independence is not affected by the syntactic form in which a formula is expressed, that is, replacing Σ with any of its equivalent formulas does not modify the relation ↦. Since Σ is Lit-independent from L if and only if Σ can be made syntactically Lit-independent from L while preserving logical equivalence, it follows that syntactical Lit-independence implies Lit-independence, but the converse does not hold in the general case.

Example 4 Let Σ = (a ∧ ¬b ∧ (a ∨ b)). We have DepLit(Σ) = {a, ¬b}. Note that Σ is Lit-independent from b because it is equivalent to Φ = (a ∧ ¬b), in which b does not appear positively.

As in the case of syntactical independence, we can formalize the fact that a formula Σ has some effects on the truth value of a variable v. Indeed, we define a notion of (semantical) formula-variable independence, which can also be easily defined from the (semantical) notion of FL-independence.
Definition 4 ((semantical) FV-independence) Let Σ be a formula from PROP_PS, v ∈ PS, and V a subset of PS.

* Σ is said to be Var-independent from v, denoted v ↛± Σ, if and only if there exists a formula Φ s.t. Φ ≡ Σ and Φ is syntactically Var-independent from v. Otherwise, Σ is said to be Var-dependent on v, denoted v ↦± Σ. We denote by DepVar(Σ) the set of all variables v such that v ↦± Σ.

* Σ is said to be Var-independent from V, denoted V ↛± Σ, if and only if V ∩ DepVar(Σ) = ∅. Otherwise, Σ is said to be Var-dependent on V, denoted V ↦± Σ.

Clearly, Σ is Var-independent from V if and only if there exists a formula Φ s.t. Φ ≡ Σ and Φ is syntactically Var-independent from V. Moreover, Var-independence is to Lit-independence as syntactical Var-independence is to syntactical Lit-independence; indeed, Σ is Var-independent from V if and only if Σ is Lit-independent from L_V.

Example 5 Let Σ = (a ∧ ¬b ∧ (a ∨ c)). We have DepVar(Σ) = {a, b}. Note that Σ is Var-independent from c.
The definition of semantical FL-independence is based on the set of literals of formulas equivalent to Σ. Intuitively, this is the easiest way to define a notion of independence that is not dependent on the syntax. However, proving theorems directly from this definition is not so easy. For instance, we will prove that determining whether a formula Σ is Lit-dependent on a literal l is in NP, but this result cannot be directly proved from Definition 3, since checking all possible formulas equivalent to Σ cannot be done with a polynomial non-deterministic guessing. We now give a semantical characterization of FL-independence.

Proposition 1 A formula Σ is Lit-independent from a literal l if and only if, for any interpretation ω ∈ Ω_PS, if ω ⊨ Σ then Force(ω, ¬l) ⊨ Σ.

As a direct corollary, we get that Σ is Lit-independent from L if and only if, for any literal l ∈ L and any interpretation ω ∈ Ω_PS, if ω ⊨ Σ then Force(ω, ¬l) ⊨ Σ.

This property gives an idea of how FL-dependence works. Indeed, if Σ is Lit-dependent on a literal l, then there exists an interpretation ω such that ω ⊨ Σ and Force(ω, ¬l) ⊭ Σ, which means that (a) ω ⊨ l and (b) the literal l in ω is "really needed" to make ω a model of Σ, that is, the partial interpretation obtained by removing l from ω does not satisfy Σ. This property also explains why FL-dependence formalizes the concept of "true in some context", as explained at the beginning of this section. Indeed, Σ is Lit-dependent on l if and only if there is some context (consistent with Σ, and that does not imply l) in which Σ implies l. This can be proved from the above proposition: if ω is an interpretation such that ω ⊨ Σ but Force(ω, ¬l) ⊭ Σ, then the term

γ = ⋀ (({x ∈ PS | ω ⊨ Σ ∧ x} ∪ {¬x | x ∈ PS and ω ⊨ Σ ∧ ¬x}) \ {l})

is consistent with Σ (by construction, γ is equivalent to the disjunction of a term equivalent to ω with a term equivalent to Force(ω, ¬l)); it also holds that γ ∧ Σ ⊨ l, while γ ⊭ l, that is, γ is a context in which Σ implies l.
Moreover, Proposition 1 shows that our notion of Lit-independence coincides with the notion of (anti-)monotonicity (Ryan, 1991, 1992). To be more precise, a (consistent) formula is said to be monotonic (resp. antimonotonic) in a variable v if and only if it is Lit-independent from ¬v (resp. from v). Interestingly, a subclassical [3] inference relation, called natural inference, has been defined on this ground (Ryan, 1991, 1992). Basically, a formula φ is considered as a consequence of a formula Σ if and only if it is a logical consequence of it and Lit-independence of Σ from a literal implies Lit-independence of φ from it. Accordingly, natural inference prevents us from considering p ∨ q as a consequence of p (it is a relevant implication). All our characterization results about Lit-independence, including complexity results, have an immediate impact on such a natural inference relation (especially, they directly show that the complexity of natural inference is in Δ^p_2).

[3] That is, a restriction of classical entailment ⊨.
Proposition 1 is easily extended to formula-variable independence:

Corollary 1 A formula Σ is Var-independent from a variable v if and only if, for any interpretation ω ∈ Ω_PS, we have ω ⊨ Σ if and only if Switch(ω, v) ⊨ Σ.

The following metatheoretic properties of FL-independence are used in the rest of this section.

Proposition 2
(1) DepLit(Σ) ⊆ Lit(Σ);
(2) If Σ ≡ Φ, then DepLit(Σ) = DepLit(Φ);
(3a) DepLit(Σ ∧ Φ) ⊆ DepLit(Σ) ∪ DepLit(Φ);
(3b) DepLit(Σ ∨ Φ) ⊆ DepLit(Σ) ∪ DepLit(Φ);
(4) l ∈ DepLit(Σ) if and only if ¬l ∈ DepLit(¬Σ).

FV-independence exhibits similar properties, plus negation stability (point (4) below), which is not satisfied by FL-independence.

Proposition 3
(1) DepVar(Σ) ⊆ Var(Σ);
(2) If Σ ≡ Φ, then DepVar(Σ) = DepVar(Φ);
(3a) DepVar(Σ ∧ Φ) ⊆ DepVar(Σ) ∪ DepVar(Φ);
(3b) DepVar(Σ ∨ Φ) ⊆ DepVar(Σ) ∪ DepVar(Φ);
(4) DepVar(¬Σ) = DepVar(Σ).
Beyond these properties, FL-independence and FV-independence do not exhibit any particularly interesting structure. In particular, FL-independence and FV-independence are neither monotonic nor anti-monotonic w.r.t. expansion of Σ (strengthening or weakening Σ can easily make it no longer independent from a set of literals or variables).

Recall that L ↦ Σ if and only if L ∩ DepLit(Σ) ≠ ∅. This means that the Lit-dependence of Σ on L only implies the Lit-dependence on some literal of L, not a "full" Lit-dependence on every literal of L. In other words, if we want to check whether a formula Σ is Lit-dependent on every literal of L, we need a notion stronger than FL-dependence, called full FL-dependence. The same can be said for FV-dependence.

Definition 5 (full FL/FV-dependence) Let Σ be a formula from PROP_PS, L be a subset of L_PS and V be a subset of PS.

* Σ is fully Lit-dependent on L if and only if L ⊆ DepLit(Σ).

* Σ is fully Var-dependent on V if and only if V ⊆ DepVar(Σ).

Example 6 Σ = (a ∧ ¬b ∧ (b ∨ ¬b)) is fully Lit-dependent on {a, ¬b} and fully Var-dependent on {a, b}. Contrastingly, Σ is not fully Lit-dependent on {a, b}.
Clearly enough, whenever Σ is fully Lit-dependent on L, it is also fully Var-dependent on Var(L). However, the converse does not hold, as the previous example shows.

While full FL-dependence (resp. full FV-dependence) can be checked in linear time once DepLit(Σ) (resp. DepVar(Σ)) is known, at the end of the section we prove that determining these sets is NP-hard, and that deciding full FL-dependence (as well as full FV-dependence) is NP-complete.

Let us now consider the particular case of full FL-dependence when L = Lit(Σ), or of full FV-dependence when V = Var(Σ). If full dependence holds in these cases, we say that Σ is Lit-simplified, or Var-simplified, respectively. Var-simplification is achieved when Σ contains no occurrence of any variable it is Var-independent from. Lit-simplification corresponds to the more restricted situation where the NNF of Σ does not contain any occurrence of a literal it is Lit-independent from.

If a formula Σ is not Lit-simplified (resp. Var-simplified), then there is some literal (resp. variable) that occurs in the NNF of Σ, but Σ is not Lit-dependent (resp. Var-dependent) on it. This means that the syntactic form in which Σ is expressed contains a literal or variable that is indeed useless.

Definition 6 (simplified formula) Let Σ be a formula from PROP_PS.

* Σ is Lit-simplified if and only if Lit(Σ) = DepLit(Σ).

* Σ is Var-simplified if and only if Var(Σ) = DepVar(Σ).

As the following example illustrates, every formula that is Lit-simplified is also Var-simplified, but the converse does not hold in the general case.

Example 7 Σ = (a ∧ ¬b ∧ (b ∨ ¬b) ∧ (a ∨ c)) is neither Lit-simplified nor Var-simplified. The equivalent formula (a ∧ ¬b ∧ (b ∨ ¬b)) is Var-simplified but it is not Lit-simplified. Finally, the equivalent formula (a ∧ ¬b) is both Lit-simplified and Var-simplified.
Simplified formulas do not incorporate any useless literals or variables. As the next proposition shows, the notion of simplified formula actually is the point where syntactical independence and (semantical) independence coincide.

Proposition 4 Let Σ be a formula from PROP_PS.

* Σ is Lit-simplified if and only if the following equivalence holds: for every L ⊆ L_PS, Σ is syntactically Lit-independent from L if and only if L ↛ Σ holds.

* Σ is Var-simplified if and only if the following equivalence holds: for every V ⊆ PS, Σ is syntactically Var-independent from V if and only if V ↛± Σ holds.

Thus, while a formula can easily be Lit-independent from a set of literals without being syntactically Lit-independent from it, simplification is a way to join Lit-independence with its syntactical restriction (which is easier to grasp and, as we will see soon, easier to check in the general case): Lit-independence and syntactical Lit-independence coincide on Lit-simplified formulas. The same holds for Var-independence and syntactical Var-independence.

The strength of the notion of simplification lies in the fact that every formula can be simplified while preserving its models. This is useful, as simplified formulas can be shorter and easier to understand than formulas containing useless literals.

Proposition 5 For every Σ from PROP_PS, there exists a Lit-simplified formula Φ s.t. Σ ≡ Φ.

Since Lit-simplified KBs are also Var-simplified, this proposition also shows that every KB can be Var-simplified without modifying its set of models.
Interestingly, both FL-independence and FV-independence can be characterized without considering the corresponding syntactic notions of independence, just by comparing formulas obtained by setting the truth value of the literals we want to check the dependence on.

Proposition 6 Let Σ be a formula from PROP_PS and l be a literal of L_PS. The next four statements are equivalent:
(1) l ↛ Σ;
(2) Σ_{l←1} ⊨ Σ_{l←0};
(3) Σ ⊨ Σ_{l←0};
(4) Σ_{l←1} ⊨ Σ.

The above properties can be used to check whether a formula is Lit-dependent on a literal, as the following example shows.

Example 8 Σ = (a ∧ ¬b ∧ (b ∨ ¬b)) is Lit-independent from b since replacing b by true within Σ gives rise to an inconsistent formula. Contrastingly, Σ is Lit-dependent on ¬b since replacing b by false within Σ gives Σ_{b←0} ≡ a, which is not at least as logically strong as the inconsistent formula obtained by replacing b by true within Σ.

A similar proposition holds for FV-independence, characterizing the variables a formula Σ depends on, using the formulas Σ_{x←0} and Σ_{x←1}.

Proposition 7 Let Σ be a formula from PROP_PS and x be a variable of PS. The next four statements are equivalent:
(1) x ↛± Σ;
(2) Σ_{x←0} ≡ Σ_{x←1};
(3) Σ ≡ Σ_{x←0};
(4) Σ ≡ Σ_{x←1}.

As in the case of literal dependence, the above property can be used to find out whether a formula is Var-dependent on a variable.

Example 9 Σ = (a ∧ (b ∨ ¬b)) is Var-independent from b since we have Σ_{b←0} ≡ Σ_{b←1} ≡ a.
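As an illustration of Propositions 6 and 7 (not the paper's code), the tests can be run by conditioning and brute-force entailment checking over all PS-worlds; the enumeration is exponential in |PS|, consistently with the complexity results of Section 3.3. The sketch reuses the tuple representation introduced in Section 2.1.

    from itertools import product

    def worlds(ps):
        """All PS-worlds, as dictionaries."""
        ps = sorted(ps)
        for bits in product([False, True], repeat=len(ps)):
            yield dict(zip(ps, bits))

    def evaluate(f, omega):
        if isinstance(f, bool):
            return f
        if isinstance(f, str):
            return omega[f]
        if f[0] == 'not':
            return not evaluate(f[1], omega)
        sub = (evaluate(g, omega) for g in f[1:])
        return all(sub) if f[0] == 'and' else any(sub)

    def condition(f, x, val):
        """Sigma_{x <- val}: replace every occurrence of variable x by a constant."""
        if isinstance(f, bool):
            return f
        if isinstance(f, str):
            return val if f == x else f
        return (f[0],) + tuple(condition(g, x, val) for g in f[1:])

    def entails(f, g, ps):
        return all(evaluate(g, w) for w in worlds(ps) if evaluate(f, w))

    def lit_independent(sigma, l, ps):
        """Proposition 6(2): l -/-> Sigma iff Sigma_{l<-1} |= Sigma_{l<-0}."""
        x, pos = (l[1], False) if isinstance(l, tuple) else (l, True)
        return entails(condition(sigma, x, pos), condition(sigma, x, not pos), ps)

    def var_independent(sigma, x, ps):
        """Proposition 7(3): x -/-> Sigma iff Sigma is equivalent to Sigma_{x<-0}."""
        s0 = condition(sigma, x, False)
        return entails(sigma, s0, ps) and entails(s0, sigma, ps)

    # Example 8: Sigma = a /\ ~b /\ (b \/ ~b) is Lit-independent from b, not from ~b.
    sigma8 = ('and', 'a', ('not', 'b'), ('or', 'b', ('not', 'b')))
    assert lit_independent(sigma8, 'b', {'a', 'b'})
    assert not lit_independent(sigma8, ('not', 'b'), {'a', 'b'})
    # Example 9: Sigma = a /\ (b \/ ~b) is Var-independent from b.
    assert var_independent(('and', 'a', ('or', 'b', ('not', 'b'))), 'b', {'a', 'b'})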
Interestingly, FL-independence and FV-independence can be determined in an efficient way when Σ is given in some specific normal forms, namely, prime implicate normal form or prime implicant normal form. For such normal forms, Lit-independence and Var-independence come down to their corresponding syntactical forms.

Proposition 8 Let Σ be a formula from PROP_PS and L be a subset of L_PS. The next statements are equivalent:
(1) L ↛ Σ;
(2) PI(Σ) ⊆ {γ | γ is a term that does not contain any literal from L};
(3) IP(Σ) ⊆ {δ | δ is a clause that does not contain any literal from L}.

Proposition 9 Let Σ be a formula from PROP_PS and V be a subset of PS. The next statements are equivalent:
(1) V ↛± Σ;
(2) PI(Σ) ⊆ {γ | γ is a term that does not contain any variable from V};
(3) IP(Σ) ⊆ {δ | δ is a clause that does not contain any variable from V}.

Example 10 Let Σ = (a ∧ ¬b ∧ (b ∨ ¬b)). We have PI(Σ) = {(a ∧ ¬b)} and IP(Σ) = {a, ¬b} (up to logical equivalence). We can easily observe that Σ is Lit-independent from b by looking at PI(Σ) (or IP(Σ)). We can also easily state that Σ is Lit-dependent on ¬b and Var-independent from c by considering any of these normal forms.
Proposition 7 shows that a KB can be easily Var-simplified (i.e., in polynomial time) as soon as the variables it is Var-independent from have been determined. Indeed, we can easily design a greedy algorithm for simplifying KBs. This algorithm consists in considering every variable x of Var(Σ) in a successive way, while replacing Σ by Σ_{x←0} whenever Σ is Var-independent from x. This algorithm runs in time polynomial in the size of Σ once the variables Σ is Var-independent from have been computed. At the end, the resulting KB is Var-simplified.

Lit-simplifying a KB is not so easy if no assumptions are made about its syntax, due to the fact that literal signs are crucial here. Indeed, looking only at the occurrence of a variable inside a formula is not sufficient to state whether or not this is a positive occurrence (or a negative one). Fortunately, turning a formula into its NNF is computationally easy (as long as it does not contain any occurrence of connectives like ↔ or ⊕), and it proves sufficient to design a greedy algorithm for Lit-simplifying a KB once the literals it is Lit-independent from have been identified. Indeed, within an NNF formula, every literal can easily be considered as an atomic object. This algorithm consists in considering every literal l of Lit(Σ) in a successive way, while replacing every occurrence of l in (the NNF of) Σ by false whenever Σ is Lit-independent from l, considering every literal of Lit(Σ) as an atom. Stated otherwise, when l is a positive (resp. negative) literal x (resp. ¬x), replacing l by false does not mean replacing ¬x (resp. x) by true: only the occurrences of l with the right sign are considered. This algorithm runs in time polynomial in the size of Σ once the literals Σ is Lit-independent from have been computed. At the end, the resulting KB is Lit-simplified.
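A sketch of the greedy Var-simplification loop just described (illustrative only; it reuses condition and var_independent from the sketch after Proposition 7, and the independence tests remain the expensive part):

    def var_simplify(sigma, ps):
        """Replace Sigma by Sigma_{x<-0} for every variable it is Var-independent from."""
        for x in sorted(ps):
            if var_independent(sigma, x, ps):
                sigma = condition(sigma, x, False)
        return sigma

    # Example 9 revisited: a /\ (b \/ ~b) no longer mentions b after simplification.
    print(var_simplify(('and', 'a', ('or', 'b', ('not', 'b'))), {'a', 'b'}))
    # -> ('and', 'a', ('or', False, ('not', False))), still equivalent to a;
    #    propagating the constants would tidy it up further.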
3.3 Complexity Results

While syntactical FL- and FV-dependence can easily be checked in linear time in the size of the input, this is far from being expected for (semantical) FL-dependence and FV-dependence in the general case:

Proposition 10 (complexity of FL/FV-dependence) FL-dependence, FV-dependence, full FL-dependence and full FV-dependence are NP-complete.

Thus, although they look simple, the problems of determining whether a formula is independent from a literal or a variable are as hard as checking propositional entailment. Interestingly, the complexity of both decision problems falls down to P whenever checking (in)dependence becomes tractable. Apart from the case of syntactical independence, some other restrictions on Σ make (in)dependence testable in polynomial time. Especially, we get:

Proposition 11 Whenever Σ belongs to a class C of CNF formulas that is tractable for clausal query answering (i.e., there exists a polytime algorithm to determine whether Σ ⊨ γ for any CNF formula γ) and stable for variable instantiation (i.e., replacing in Σ ∈ C any variable by true or by false gives a formula that still belongs to C), then FL-dependence, FV-dependence, full FL-dependence and full FV-dependence are in P.

In particular, when Σ is restricted to a renamable Horn CNF formula or to binary clauses (Krom formula), all four decision problems above belong to P.

We have also investigated the complexity of checking whether a formula is Lit-simplified (and Var-simplified):

Proposition 12 Lit-simplified formula and Var-simplified formula are NP-complete.
All these complexity results have some impact on the approaches that explicitly need computing DepVar(Σ) as a preprocessing task. Namely, we have the following result:

Proposition 13
1. Determining whether DepLit(Σ) = L (where L is a set of literals), and determining whether DepVar(Σ) = X (where X is a set of variables), are BH2-complete.
2. The search problem consisting in computing DepLit(Σ) (respectively DepVar(Σ)) is in FΔ^p_2 and is both NP-hard and coNP-hard.
3.4 Discussion

The previous characterizations and complexity results lead to several questions: when is it worthwhile to preprocess a knowledge base by computing independence relations? How should these independence relations be computed? What is the level of generality of the definitions and results we gave in this section?

Many equivalent characterizations of formula-variable independence have been given in the literature, each of which could serve as a definition (Definition 4, Corollary 1, each of the statements (2), (3) and (4) in Proposition 7 and of the statements (2), (3) in Proposition 9), so one may wonder which one has to be used in practice.

In many papers referring explicitly to formula-variable independence, the prime implicant/implicate characterization (Proposition 9) is used as a definition (Boutilier, 1994; Doherty et al., 1998). Generally speaking, this is not the cheapest way to compute the set of variables a formula depends on, since the size of PI(Σ) is exponential in the size of Σ in the worst case. This characterization is to be used in practice only if the syntactical form of Σ is such that its prime implicants or prime implicates can be computed easily from it (this is the case for instance whenever Σ is a Krom formula). Clearly, the cheapest way to compute formula-variable independence consists in using any of the equivalent formulations of Proposition 7, which all consist of validity tests.

Checking whether a formula Σ is Var-independent from a variable x is coNP-complete, which implies that simplifying a knowledge base by getting rid of redundant variables needs |Var(Σ)| calls to an NP oracle, and is thus in FΔ^p_2 and not below (unless NP = coNP). It may look paradoxical (and sometimes useless) to preliminarily compute several instances of an NP- or coNP-hard independence problem to help solving a (single) instance of an NP- or coNP-complete problem. However, this negative comment has a general scope only, and in many particular cases, this can prove quite efficient (indeed, even when Σ has no particular syntactical form, the satisfiability or the unsatisfiability of Σ_{x←0} ∧ ¬Σ_{x←1} may be particularly easy to check). Furthermore, if the knowledge base Σ is to be queried many times, then the preprocessing phase consisting in Var-simplifying Σ by ignoring useless variables is likely to be worthwhile.
4. Forgetting

In this section, we define what forgetting is, present some of its properties, and finally give some complexity results.

4.1 Definitions and Properties

A basic way to simplify a KB w.r.t. a set of literals or a set of variables consists in forgetting literals/variables in it. Beyond the simplification task, forgetting is a way to make a formula independent from literals/variables. Let us first start with literal forgetting:

Definition 7 (literal forgetting) Let Σ be a formula from PROP_PS and L be a subset of L_PS. ForgetLit(Σ, L) is the formula inductively defined as follows:

1. ForgetLit(Σ, ∅) = Σ,
2. ForgetLit(Σ, {l}) = Σ_{l←1} ∨ (¬l ∧ Σ),
3. ForgetLit(Σ, {l} ∪ L) = ForgetLit(ForgetLit(Σ, L), {l}).

This definition is sound since the ordering in which the literals of L are considered does not matter [4]. We can also prove that the definition above is equivalent to the one in which Point 2 is replaced by ForgetLit(Σ, {l}) = Σ_{l←1} ∨ (¬l ∧ Σ_{l←0}), just because Σ and Σ_{l←0} are equivalent modulo ¬l.
Let us now give a semantical characterization of literal forgetting. If L = {l}, that is, L is composed of a single literal, then forgetting l from a formula Σ amounts to introducing the model Force(ω, ¬l) for each model ω of Σ such that ω ⊨ l.

Proposition 14 The set of models of ForgetLit(Σ, {l}) can be expressed as:

Mod(ForgetLit(Σ, {l})) = Mod(Σ) ∪ {Force(ω, ¬l) | ω ⊨ Σ} = Mod(Σ) ∪ {ω | Force(ω, l) ⊨ Σ}

A similar statement can be given for the case in which L is composed of more than one literal. In this case, for each model ω of Σ, we can force any subset of literals L1 ⊆ L such that ω ⊨ L1 to assume the value false.

Proposition 15 The set of models of ForgetLit(Σ, L) can be expressed as:

Mod(ForgetLit(Σ, L)) = {ω | Force(ω, L1) ⊨ Σ for some L1 ⊆ L}

As a corollary, we obtain the following properties of literal forgetting:

Corollary 2 Let Σ, Φ be formulas from PROP_PS and L1, L2 ⊆ L_PS.

* Σ ⊨ ForgetLit(Σ, L1) holds.

* If Σ ⊨ Φ holds, then ForgetLit(Σ, L1) ⊨ ForgetLit(Φ, L1) holds as well.

* If L1 ⊆ L2 holds, then ForgetLit(Σ, L1) ⊨ ForgetLit(Σ, L2) holds as well.

Let us now consider an example.

Example 11 Let Σ = (¬a ∨ b) ∧ (a ∨ c). We have ForgetLit(Σ, {¬a}) ≡ (a ∨ c) ∧ (b ∨ c).
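A direct transcription of Definition 7 in the same toy representation (illustrative only; it reuses condition and entails from the earlier sketches and does not simplify the result):

    def forget_lit(sigma, lits):
        """ForgetLit(Sigma, L): apply Sigma_{l<-1} \/ (~l /\ Sigma) literal by literal."""
        for l in lits:
            x, pos = (l[1], False) if isinstance(l, tuple) else (l, True)
            not_l = ('not', x) if pos else x
            sigma = ('or', condition(sigma, x, pos), ('and', not_l, sigma))
        return sigma

    # Example 11: forgetting ~a in (~a \/ b) /\ (a \/ c) yields (a \/ c) /\ (b \/ c).
    sigma = ('and', ('or', ('not', 'a'), 'b'), ('or', 'a', 'c'))
    target = ('and', ('or', 'a', 'c'), ('or', 'b', 'c'))
    res, ps = forget_lit(sigma, [('not', 'a')]), {'a', 'b', 'c'}
    assert entails(res, target, ps) and entails(target, res, ps)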
The key proposition for the notion of forgetting is the following one:

[4] The proof of this statement is almost straightforward: let l1, l2 be two literals.
1. If l1 ≠ l2 and l1 ≠ ¬l2, then ForgetLit(ForgetLit(Σ, l1), l2) = Σ_{l1←1, l2←1} ∨ (¬l1 ∧ Σ)_{l2←1} ∨ (¬l2 ∧ Σ_{l1←1}) ∨ (¬l1 ∧ ¬l2 ∧ Σ) ≡ Σ_{l1←1, l2←1} ∨ (¬l1 ∧ Σ_{l2←1}) ∨ (¬l2 ∧ Σ_{l1←1}) ∨ (¬l1 ∧ ¬l2 ∧ Σ), which is symmetric in l1 and l2.
2. If l2 = ¬l1 = ¬l, then ForgetLit(ForgetLit(Σ, l), ¬l) = (Σ_{l←1} ∨ (¬l ∧ Σ_{l←0}))_{l←0} ∨ (l ∧ (Σ_{l←1} ∨ (¬l ∧ Σ_{l←0}))_{l←1}) ≡ Σ_{l←1} ∨ Σ_{l←0}, which is symmetric in l and ¬l.
3. The case l1 = l2 is trivial.
Proposition 16 Let Σ be a formula from PROP_PS and L ⊆ L_PS. ForgetLit(Σ, L) is the logically strongest consequence of Σ that is Lit-independent from L (up to logical equivalence).

The following immediate consequence of Proposition 16 establishes strong relationships between literal forgetting and Lit-independence:

Corollary 3 Let Σ be a formula from PROP_PS and L ⊆ L_PS. Σ is Lit-independent from L if and only if Σ ≡ ForgetLit(Σ, L) holds.

The following one gives an immediate application of literal forgetting:

Corollary 4 If a formula φ is Lit-independent from L, then Σ ⊨ φ holds if and only if ForgetLit(Σ, L) ⊨ φ.

This result proves that forgetting the literals from L does not affect entailment of formulas that are Lit-independent from L. This is in some sense analogous to the concept of filtration in modal logics (Goldblatt, 1987); indeed, if we are interested in knowing whether Σ ⊨ φ only for formulas φ that are Lit-independent from L, then the literals of L can be forgotten in Σ.
Let us now investigate the computation of ForgetLit(Σ, L). Let us first consider DNF formulas Σ. Forgetting literals within DNF formulas is a computationally easy task. On the one hand, forgetting literals within a disjunctive formula comes down to forgetting them in every disjunct:

Proposition 17 Let Σ, Φ be two formulas from PROP_PS and L ⊆ L_PS. ForgetLit(Σ ∨ Φ, L) ≡ ForgetLit(Σ, L) ∨ ForgetLit(Φ, L).

On the other hand, forgetting literals within a consistent term simply consists in removing them from the term:

Proposition 18 Let γ be a consistent term from PROP_PS (viewed as the set of its literals) and L ⊆ L_PS. ForgetLit(γ, L) ≡ ⋀_{l ∈ γ∖L} l.

Combining the two previous propositions shows how literals can be forgotten from a DNF formula Σ in polynomial time: it is sufficient to delete every literal of L from each disjunct of Σ (if one of the disjuncts becomes empty, then ForgetLit(Σ, L) ≡ true).
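The resulting procedure for DNF inputs is simple enough to spell out. Below is a minimal sketch (the representation is ours: a DNF is a list of terms, a term a set of literal strings such as 'a' or '-a'); per Propositions 17 and 18, forgetting just deletes the literals of L in each term, and an emptied term makes the result valid:

def forget_lit_dnf(dnf, L):
    """ForgetLit on a DNF given as a list of terms (sets of literals like 'a', '-a').
    Returns the new DNF, or the string 'true' if some term becomes empty (valid result)."""
    result = []
    for term in dnf:
        reduced = {lit for lit in term if lit not in L}
        if not reduced:            # empty term: this disjunct, hence the result, is valid
            return 'true'
        result.append(reduced)
    return result

# Sigma = (a and not b) or (not a and c), forgetting the literal "not a"
sigma = [{'a', '-b'}, {'-a', 'c'}]
print(forget_lit_dnf(sigma, {'-a'}))   # [{'a', '-b'}, {'c'}]

The sketch assumes, as Proposition 18 does, that each term is consistent.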
Things are more complicated for conjunctive formulas Σ ∧ Φ. Especially, there is no result similar to Proposition 17 for conjunctive formulas. While ForgetLit(Σ, L) ∧ ForgetLit(Φ, L) is a logical consequence of ForgetLit(Σ ∧ Φ, L) (see Corollary 2), the converse does not hold in the general case.

Example 12 Let Σ = a, Φ = ¬a, and L = {a}. Since a ↛ Φ, we have ForgetLit(Φ, L) ≡ Φ. Since ForgetLit(Σ, L) is valid, we have ForgetLit(Σ, L) ∧ ForgetLit(Φ, L) ≡ ¬a. Since Σ ∧ Φ is inconsistent, ForgetLit(Σ ∧ Φ, L) is inconsistent as well.
Clearly enough, any non-valid clause δ is Lit-independent from L if and only if Lit(δ) ∩ L = ∅. Since the conjunction of two formulas that are Lit-independent from L is Lit-independent from L (see Proposition 2), and since every formula is equivalent to a CNF formula, Proposition 16 shows that ForgetLit(Σ, L) is equivalent to the set of all clauses δ that are entailed by Σ and belong to {δ clause | Lit(δ) ∩ L = ∅}. Because {δ clause | Lit(δ) ∩ L = ∅} is closed under subsumption (i.e., it is a stable production field), it is possible to take advantage of consequence-finding algorithms (see Marquis, 2000) to derive a CNF representation of ForgetLit(Σ, L). Especially, in the case where Σ is a CNF formula, resolution-based consequence-finding algorithms like those reported in (Inoue, 1992) or (del Val, 1999) can be used; this is not very surprising since resolution is nothing but variable elimination.

In contrast to the disjunctive situation, there is no guarantee that such consequence-finding algorithms run in time polynomial in the input size when the input is a conjunctive formula (otherwise, as explained in the following, we would have P = NP). Nevertheless, forgetting literals within a conjunctive formula Σ can be easy in some restricted cases, especially when Σ is given by the set of its prime implicates; in this situation, it is sufficient to give up those clauses containing a literal from L.

Proposition 19 Let Σ be a formula from PROP_PS and L ⊆ L_PS.

IP(ForgetLit(Σ, L)) = {δ | δ ∈ IP(Σ) and Lit(δ) ∩ L = ∅}.
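When a knowledge base happens to be given by its prime implicates, Proposition 19 makes forgetting a simple filter. A minimal sketch follows (the representation is ours: clauses as frozensets of literal strings; whether the input really is the set of prime implicates is left to the caller):

def forget_lit_prime_implicates(ip_sigma, L):
    """Prime implicates of ForgetLit(Sigma, L), given the prime implicates of Sigma
    (Proposition 19): keep exactly the clauses that mention no literal from L."""
    return {clause for clause in ip_sigma if not (clause & set(L))}

# Example 11 again: IP(Sigma) = {not a or b, a or c, b or c}; forgetting "not a"
ip_sigma = {frozenset({'-a', 'b'}), frozenset({'a', 'c'}), frozenset({'b', 'c'})}
print(forget_lit_prime_implicates(ip_sigma, {'-a'}))
# {frozenset({'a', 'c'}), frozenset({'b', 'c'})}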
The notion of literal forgetting generalizes the notion of variable elimination from propositional logic (which was already known to Boole as elimination of middle terms and has been generalized to the first-order case more recently by Lin & Reiter, 1994). Indeed, variable elimination is about variable forgetting, i.e., forgetting achieved without considering the signs of literals:

Definition 8 (variable forgetting) Let Σ be a formula from PROP_PS and let V be a subset of PS. ForgetVar(Σ, V) is the formula inductively defined as follows:

* ForgetVar(Σ, ∅) = Σ,

* ForgetVar(Σ, {x}) = Σ_{x←1} ∨ Σ_{x←0},

* ForgetVar(Σ, {x} ∪ V) = ForgetVar(ForgetVar(Σ, V), {x}).

Example 13 Let Σ = (¬a ∨ b) ∧ (a ∨ c). We have ForgetVar(Σ, {a}) ≡ (b ∨ c).

As a direct consequence of the definition, ForgetVar(Σ, {x₁, ..., xₙ}) is equivalent to the quantified Boolean formula (usually with free variables!) denoted ∃x₁ ... ∃xₙ Σ.
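Definition 8 translates directly into code once conditioning is available. Below is a small illustrative sketch; the tiny tuple-based formula representation and the helper names cond and forget_var are ours, and the example reproduces Example 13:

# Formulas as nested tuples: ('var', x), ('not', f), ('and', f, g), ('or', f, g),
# plus the Python constants True and False.
def cond(f, x, val):
    """Condition formula f by assigning variable x the Boolean value val."""
    if isinstance(f, bool):
        return f
    tag = f[0]
    if tag == 'var':
        return val if f[1] == x else f
    if tag == 'not':
        sub = cond(f[1], x, val)
        return (not sub) if isinstance(sub, bool) else ('not', sub)
    left, right = cond(f[1], x, val), cond(f[2], x, val)
    if tag == 'and':
        if left is False or right is False:
            return False
        if left is True:
            return right
        if right is True:
            return left
        return ('and', left, right)
    # tag == 'or'
    if left is True or right is True:
        return True
    if left is False:
        return right
    if right is False:
        return left
    return ('or', left, right)

def forget_var(f, variables):
    """ForgetVar(f, V) following Definition 8: Sigma_{x<-1} v Sigma_{x<-0}, iterated."""
    for x in variables:
        f1, f0 = cond(f, x, True), cond(f, x, False)
        f = True if (f1 is True or f0 is True) else \
            f0 if f1 is False else f1 if f0 is False else ('or', f1, f0)
    return f

# Example 13: Sigma = (not a or b) and (a or c); ForgetVar(Sigma, {a}) == b or c
sigma = ('and', ('or', ('not', ('var', 'a')), ('var', 'b')), ('or', ('var', 'a'), ('var', 'c')))
print(forget_var(sigma, ['a']))   # ('or', ('var', 'b'), ('var', 'c'))

Note the design point made just above: each forgotten variable may double the formula size, which is exactly why the general case is computationally expensive.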
Clearly enough, forgetting a variable x amounts to forgetting both the literals x and ¬x.

Proposition 20 Let Σ be a formula from PROP_PS and V ⊆ PS. We have ForgetVar(Σ, V) ≡ ForgetLit(Σ, L_V).

This result, together with the previous results on literal forgetting, gives us the following corollaries for variable forgetting:

Corollary 5 Mod(ForgetVar(Σ, {x})) = Mod(Σ) ∪ {Switch(ω, x) | ω ⊨ Σ}.

Corollary 6 Let Σ be a formula and V ⊆ PS. ForgetVar(Σ, V) is the logically strongest consequence of Σ that is Var-independent from V (up to logical equivalence).

Corollary 7 If a formula φ is Var-independent from V, then Σ ⊨ φ if and only if ForgetVar(Σ, V) ⊨ φ.

A consequence of the latter result is that forgetting variables is useful when only a subset of the variables is really used in the queries. Thus, if Σ represents some pieces of knowledge about a scenario of interest, and we are interested in knowing whether a fact φ is true in the scenario, the logical operation to perform is to query whether Σ ⊨ φ. Now, if the possible facts φ we are interested in do not involve some variables V, then these variables can be forgotten from Σ, as querying whether φ is implied can be done on ForgetVar(Σ, V) instead of Σ.

Through the previous proposition, some algorithms for forgetting variables can be easily derived from algorithms for forgetting literals (some of them have been sketched before). Specifically, polynomial-time algorithms for forgetting variables within a DNF formula or a formula given by the set of its prime implicates can be obtained. Other tractable classes of propositional formulas for variable forgetting exist. For instance, taking advantage of the fact that ForgetVar(Σ ∧ Φ, V) ≡ ForgetVar(Σ, V) ∧ ForgetVar(Φ, V) whenever Var(Σ) ∩ Var(Φ) = ∅ holds, Darwiche (1999) showed that variable forgetting in a formula Σ can be done in linear time as soon as Σ is in Decomposable Negation Normal Form (DNNF), i.e., a (DAG) NNF formula in which the conjuncts of any conjunctive subformula do not share any variable. Interestingly, the DNNF fragment of propositional logic is strictly more succinct than the DNF one (especially, some DNNF formulas only admit exponentially large equivalent DNF formulas) (Darwiche & Marquis, 1999).

In the general case, just as for the literal situation, there is no way to forget a set of variables within a formula efficiently (i.e., in polynomial time), unless P = NP. Nevertheless, the following decomposition property can be helpful in some situations (actually, it is heavily exploited in Kohlas et al., 1999).
Proposition 21 Let Σ, Φ be two formulas from PROP_PS, and V be a subset of PS. If V ↛± Σ, then ForgetVar(Σ ∧ Φ, V) ≡ Σ ∧ ForgetVar(Φ, V).

Note that the corresponding property for literal forgetting does not hold (as a previous example shows).
Forgetting literals or variables proves helpful in various settings (we already sketched some of them in the introduction). For instance, minimal model inference (or circumscription, McCarthy, 1986) can be expressed using literal forgetting (but not directly using variable forgetting, which shows the interest of the more general form we introduced). Indeed, it is well known that closed-world inference from a knowledge base Σ can be logically characterized as classical entailment from Σ completed with some assumptions. In the circumscription framework (McCarthy, 1986), given a partition ⟨P, Q, Z⟩ of PS, such assumptions are the negations of the formulas α s.t. α does not contain any variable from Z and, for every clause γ containing only positive literals built up from P and literals built up from Q, if Σ ⊭ γ holds, then Σ ⊭ γ ∨ α holds as well. Equivalently, ¬α is considered a reasonable assumption whenever it is (syntactically) Var-independent from Z and expanding Σ with it does not modify what is already known about L_P⁺ ∪ L_Q. Clearly enough, the signs of literals from L_P really matter here. Using our previous notations, ¬α is assumed if and only if Z ↛± ¬α and Σ ≡_{L_P⁺ ∪ L_Q} Σ ∧ ¬α. As a consequence, we derive the following characterization of circumscription:

Proposition 22 Let Σ, Φ be two formulas from PROP_PS, and let P, Q, Z be three disjoint sets of variables from PS (such that Var(Σ) ∪ Var(Φ) ⊆ P ∪ Q ∪ Z). It holds:

* If Φ does not contain any variable from Z:
  CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ
  if and only if
  Σ ⊨ ForgetLit(Σ ∧ Φ, L_P⁻ ∪ L_Z)

* In the general case:
  CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ
  if and only if
  Σ ⊨ ForgetLit(Σ ∧ ¬ForgetLit(Σ ∧ ¬Φ, L_Z ∪ L_P⁻), L_Z ∪ L_P⁻)

where CIRC is circumscription as defined in (McCarthy, 1986).

Similar characterizations can be derived for the other forms of closed-world reasoning pointed out so far.
Forgetting is also a central concept when we are concerned with query answering w.r.t. a restricted target language. Indeed, in many problems there is a set of variables whose truth values we are not interested in (so we can forget them). For instance, in the SATPLAN framework by Kautz, McAllester, and Selman (1996), compiling away fluents or actions amounts to forgetting variables. Since the only variables we are really interested in within a given set of clauses representing a planning problem instance are those representing the plans, we can compile away any other variable, provided this does not increase the size of the resulting formula. Another situation where such forgetting naturally occurs is model-based diagnosis (Reiter, 1987); compiling away every variable except the abnormality ones does not remove any piece of information required to compute the conflicts and the diagnoses of a system. Thus, Darwiche (1998) shows how both the set of conflicts and the set of consistency-based diagnoses of a system are characterized by the formula obtained by forgetting every variable except the abnormality ones in the conjunction of the system description and the available observations. Provided that the system description has first been turned into DNNF, forgetting can be achieved in linear time and diagnoses containing a minimal number of faulty components can be enumerated in (output) polynomial time. Interestingly, this work shows that the diagnosis task does not require the (usually expensive) computation of prime implicates/implicants (actually, computing prime implicates/implicants is just a way to achieve variable forgetting and not a goal in consistency-based diagnosis).

Forgetting every variable from a formula allows for consistency checking, since Σ is consistent if and only if ForgetVar(Σ, Var(Σ)) is consistent. The well-known Davis and Putnam algorithm for satisfiability testing (Davis & Putnam, 1960) (recently revisited by Dechter and Rish (1994) under the name directional resolution) basically consists in computing a clausal representation of ForgetVar(Σ, Var(Σ)) from a CNF Σ using resolution; if the empty clause is not generated, then Σ is consistent, and the converse also holds.
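For CNF inputs, a resolution-based elimination step is the standard way to forget one variable. The following compact sketch (our own clause representation; not an optimized rendering of the Davis-Putnam procedure) eliminates a variable x by replacing all clauses mentioning x with the non-tautological resolvents on x:

def eliminate_variable(clauses, x):
    """Forget variable x from a CNF (clauses as frozensets of literals like 'x', '-x')
    by resolution: keep the clauses not mentioning x, add all non-tautological
    resolvents between clauses containing x and clauses containing -x."""
    pos = [c for c in clauses if x in c]
    neg = [c for c in clauses if '-' + x in c]
    rest = [c for c in clauses if x not in c and '-' + x not in c]
    resolvents = set()
    for cp in pos:
        for cn in neg:
            r = (cp - {x}) | (cn - {'-' + x})
            # discard tautological resolvents (containing some literal and its complement)
            if not any(('-' + lit if not lit.startswith('-') else lit[1:]) in r for lit in r):
                resolvents.add(frozenset(r))
    return set(rest) | resolvents

# Sigma = (not a or b) and (a or c): eliminating a yields (b or c), cf. Example 13
cnf = {frozenset({'-a', 'b'}), frozenset({'a', 'c'})}
print(eliminate_variable(cnf, 'a'))   # {frozenset({'b', 'c'})}

Iterating this step over all variables yields the directional-resolution view of satisfiability testing mentioned above: the input is consistent exactly when the empty clause never appears.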
Forgetting can also be used as a key concept for organizing knowledge so as to replace one global inference by a number of local inferences, as shown (among others) by Kohlas et al. (1999) and Amir and McIlraith (2000), McIlraith and Amir (2001). Loosely speaking, such approaches rely on the idea that exploiting all the pieces of information given in a knowledge base is typically not required for query answering: focusing on what is relevant to the query is sufficient. While such techniques do not lower the complexity of inference from the theoretical side, they can lead to significant practical improvements. For instance, assume that Σ consists of three formulas Φ₁, Φ₂ and Φ₃. For any query Ψ, let V_Ψ = (⋃_{i=1..3} Var(Φᵢ)) ∖ Var(Ψ). We have Σ ⊨ Ψ if and only if ForgetVar(⋀_{i=1..3} Φᵢ, V_Ψ) ⊨ Ψ. If Var(Φ₃) ∩ (⋃_{i=1..2} Var(Φᵢ)) = ∅, this amounts to testing independently whether ForgetVar(⋀_{i=1..2} Φᵢ, V_Ψ) ⊨ Ψ holds or whether ForgetVar(Φ₃, V_Ψ) ⊨ Ψ holds. This way, one global inference is replaced by two local inferences. Now, ForgetVar(Φ₁ ∧ Φ₂, V_Ψ) is equivalent to ForgetVar(Φ₁ ∧ ForgetVar(Φ₂, V_Ψ ∩ (Var(Φ₂) ∖ Var(Φ₁))), V_Ψ). Accordingly, every variable from Φ₂ that is neither a variable of Φ₁ nor a variable of Ψ can be forgotten first within Φ₂, because it gives no information relevant to the query; thus, only a few pieces of knowledge have to be "propagated" from Φ₂ to Φ₁ before answering the query, and forgetting allows for characterizing them exactly.
As evoked in the introduction, another scenario in which forgetting is useful is that of belief update. Indeed, there are many formalizations of belief update that are based on a form of variable forgetting. The basic scenario is the following one: we have a formula Σ that represents our knowledge; there are some changes in the world, and what we know is that after them a formula φ becomes true. The simplest way to deal with the update is to assume that φ represents all we know about the truth values of the variables in φ. As a result, we have to "forget" from Σ the values of the variables in Var(φ). There are different formalizations of this schema, based on whether the formula φ is considered to carry information about the variables it mentions (Winslett, 1990), only about the variables it depends on (Hegner, 1987), or also about variables related to dependent variables via a dependence function (Herzig, 1996). This kind of update schema, while less known than the Possible Models Approach by Winslett (1990), has proved to be well suited for reasoning about actions (Doherty et al., 1998; Herzig & Rifi, 1999). Furthermore, the possibility to forget literals (and not variables) is also valuable in this framework to take account of persistent information, as shown recently by some of us (Herzig et al., 2001), since the polarity of information is often significant. For instance, while forgetting the fluent alive from a knowledge base is not problematic, forgetting the persistent fluent ¬alive would surely be inadequate.
Forgetting can also be used to characterize a dependence relation called definability (Lang & Marquis, 1998b), as well as the strongest necessary (resp. weakest sufficient) conditions of a propositional variable on a set V of variables given a theory Σ (Lin, 2000; Doherty, Lukaszewicz, & Szalas, 2001). As shown in (Lang & Marquis, 1998b; Lin, 2000; Doherty et al., 2001), all these notions have many applications in various AI fields, including hypothesis discrimination, agent communication, theory approximation and abduction.

Finally, based on literal and variable forgetting, valuable equivalence relations over formulas can also be defined:

Definition 9 (Lit-equivalence, Var-equivalence) Let Σ, Φ be two formulas from PROP_PS, L be a subset of L_PS, and V be a subset of PS.

* Σ and Φ are said to be Lit-equivalent given L, denoted Σ ≡_L Φ, if and only if ForgetLit(Σ, Lit(Σ) ∖ L) ≡ ForgetLit(Φ, Lit(Φ) ∖ L).

* Σ and Φ are said to be Var-equivalent given V, denoted Σ ≡_V Φ, if and only if ForgetVar(Σ, Var(Σ) ∖ V) ≡ ForgetVar(Φ, Var(Φ) ∖ V).

Example 14 Let Σ = (a → b) ∧ (b → c) and Φ = (a → d) ∧ (d → c). Let L = {¬a, c}. Σ and Φ are Lit-equivalent given L.
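Example 14 can be verified by brute force. The sketch below (our own helpers and representation: literals as (variable, sign) pairs, formulas as Python predicates) uses the model characterization of Proposition 15 to compute both literal forgettings semantically and compares them world by world:

from itertools import product, combinations

VS = ['a', 'b', 'c', 'd']

def worlds():
    for bits in product([False, True], repeat=len(VS)):
        yield dict(zip(VS, bits))

def sat_forget_lit(sigma, L, w):
    """w |= ForgetLit(sigma, L), via Proposition 15: forcing some subset of L
    to true in w yields a model of sigma."""
    L = list(L)
    return any(sigma({**w, **{v: s for v, s in L1}})
               for r in range(len(L) + 1) for L1 in combinations(L, r))

# Example 14: Sigma = (a -> b) ^ (b -> c), Phi = (a -> d) ^ (d -> c), L = {not a, c}
sigma = lambda w: (not w['a'] or w['b']) and (not w['b'] or w['c'])
phi   = lambda w: (not w['a'] or w['d']) and (not w['d'] or w['c'])
lit_sigma = {('a', False), ('b', True), ('b', False), ('c', True)}   # literals of Sigma (NNF)
lit_phi   = {('a', False), ('d', True), ('d', False), ('c', True)}   # literals of Phi (NNF)
L = {('a', False), ('c', True)}

print(all(sat_forget_lit(sigma, lit_sigma - L, w) ==
          sat_forget_lit(phi, lit_phi - L, w) for w in worlds()))    # True

Here both forgettings reduce to ¬a ∨ c, so the check prints True, as Example 14 states.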
Such equivalence relations capture some forms of dependence between formulas. They are useful in the various situations where it is required to formally characterize the fact that two knowledge bases share some theorems. Thus, two formulas are Lit-equivalent given L whenever every clause containing literals from L only is a logical consequence of the first formula if and only if it is a logical consequence of the second formula. In the same vein, two formulas Var-equivalent given V have the same clausal consequences built up from V. Clearly, Lit-equivalence is more fine-grained than Var-equivalence, in the sense that two formulas Var-equivalent given V are also Lit-equivalent given L = L_V, the set of literals built upon V, but the converse does not hold in the general case. Some applications are related to knowledge approximation (Φ is a correct approximation of Σ over L if and only if Φ and Σ are Lit-equivalent given L), normalization (turning a formula Σ into a CNF Φ by introducing new symbols is acceptable as long as the two formulas are equivalent over the original language, i.e., Σ and Φ are Var-equivalent given V = Var(Σ)), and so on.
4.2 Complexity Results

It is quite easy to prove that forgetting is a computationally expensive operation in the general case. Indeed, since a formula Σ is consistent if and only if ForgetLit(Σ, Lit(Σ)) is consistent, and since the latter formula is Lit-independent from every literal (i.e., it is equivalent to true or equivalent to false), there is no way to compute a formula Φ equivalent to ForgetLit(Σ, L) in polynomial time, unless P = NP. Actually, we can derive the following more constraining result, showing that the size of any formula equivalent to ForgetLit(Σ, L) may be superpolynomially larger than the size of Σ.

Proposition 23 Let Σ be a formula from PROP_PS and let L be a finite subset of L_PS. In the general case, there is no propositional formula Φ equivalent to ForgetLit(Σ, L) s.t. the size of Φ is polynomially bounded in |Σ| + |L|, unless NP ∩ coNP ⊆ P/poly (which is considered unlikely in complexity theory).

This result shows that computing an explicit representation of ForgetLit(Σ, L) under the form of a propositional formula is hard, even in a compilation-based approach where the time needed to derive such a formula is neglected.

Finally, we have also derived:

Proposition 24 (complexity of Lit/Var-equivalence) Lit-equivalence and Var-equivalence are Π^p_2-complete.
5. Influenceability, Relevance, and Strict Relevance

In this section, we show that several notions of dependence introduced in the literature are equivalent to, or can be expressed in terms of, semantical independence. In particular, we show that Boutilier's definition of influenceability (Boutilier, 1994) is in fact equivalent to (semantical) FV-dependence. The definition of relevance given by Lakemeyer (1997) can also be proved equivalent to FV-dependence. One of the two definitions given by Lakemeyer (1997) for strict relevance can also be expressed in terms of FV-dependence. These results allow for finding the complexity of all these forms of dependence as a direct corollary of the complexity results reported in the previous section. For the sake of completeness, we also give the complexity of the original definition of strict relevance (Lakemeyer, 1995) (which is not directly related to FV-dependence), which turns out to be computationally harder than the subsequent definition given by Lakemeyer (1997).

5.1 Influenceability

Boutilier (1994) introduces a notion of influenceability. Roughly speaking, a formula Σ is influenceable from a set of variables V if there exists a scenario in which the truth value of Σ depends on the values of the variables in V. This idea can be formalized as follows.

Definition 10 (influenceability) Let Σ be a formula from PROP_PS and V a subset of PS. Σ is influenceable from V if and only if there exist a PS ∖ V-world ω and two V-worlds ω₁ and ω₂ s.t. ω ∧ ω₁ ⊨ Σ and ω ∧ ω₂ ⊨ ¬Σ hold.
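On small examples, Definition 10 can be checked directly by enumeration; here is a brute-force sketch (our own helper names), which confirms for instance that a ∨ b is influenceable from {a} while the tautology a ∨ ¬a is not:

from itertools import product

def assignments(variables):
    for bits in product([False, True], repeat=len(variables)):
        yield dict(zip(variables, bits))

def influenceable(sigma, ps, V):
    """Definition 10: Sigma is influenceable from V iff, for some fixed PS\\V-world,
    two V-worlds give Sigma different truth values."""
    rest = [v for v in ps if v not in V]
    for w in assignments(rest):
        values = {sigma({**w, **wv}) for wv in assignments(V)}
        if len(values) == 2:
            return True
    return False

ps = ['a', 'b']
print(influenceable(lambda w: w['a'] or w['b'], ps, ['a']))       # True
print(influenceable(lambda w: w['a'] or not w['a'], ps, ['a']))   # False (tautology)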
In other words, there is a scenario ω in which the formula Σ can be true or false, depending on the values of the variables in V. While influenceability looks different from the definitions given in this paper, it can be shown that influenceability in fact coincides with FV-dependence.

Proposition 25 Let Σ be a formula from PROP_PS and V a subset of PS. Σ is influenceable from V if and only if Σ is Var-dependent on V.

As a consequence, a model-theoretic characterization of influenceability can be easily derived from the one for FV-independence. The complexity of influenceability is an easy corollary of this property: influenceability is NP-complete.
5.2 Relevance

Lakemeyer (1995, 1997) introduces several forms of relevance. We show how these forms of relevance are strongly related to FV-independence. We also complete the results given in (Lakemeyer, 1997) by exhibiting the computational complexity of each form of relevance introduced in (Lakemeyer, 1997).

5.2.1 Relevance of a formula to a subject matter

Lakemeyer's notion of relevance of a formula to a subject matter can be defined in terms of the prime implicates of the formula, as follows (see Definition 9 in Lakemeyer, 1997):

Definition 11 (relevance to a subject matter) Let Σ be a formula from PROP_PS and V a subset of PS. Σ is relevant to V if and only if there exists a prime implicate of Σ mentioning a variable from V.

Example 15 Let Σ = (a ∧ c) and V = {a, b}. Σ is relevant to V.
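Definition 11 and Example 15 can be replayed with a naive prime implicate computation; the sketch below (our own helpers; clauses as frozensets of (variable, sign) literals; exhaustive over a tiny clause space, so purely illustrative) keeps the subsumption-minimal implicates and then tests the definition:

from itertools import product, combinations

VS = ['a', 'b', 'c']

def worlds():
    for bits in product([False, True], repeat=len(VS)):
        yield dict(zip(VS, bits))

def clause_sat(clause, w):          # clause: set of (var, sign) literals
    return any(w[v] == s for v, s in clause)

def prime_implicates(sigma):
    """All prime implicates of sigma, by brute force over the (small) clause space."""
    implicates = []
    for r in range(len(VS) + 1):
        for combo in combinations(VS, r):
            for signs in product((True, False), repeat=r):
                clause = frozenset(zip(combo, signs))
                if all(clause_sat(clause, w) for w in worlds() if sigma(w)):
                    implicates.append(clause)
    return [c for c in implicates
            if not any(d < c for d in implicates)]   # keep subsumption-minimal ones

def relevant_to(sigma, V):
    """Definition 11: some prime implicate mentions a variable from V."""
    return any(any(v in V for v, _ in c) for c in prime_implicates(sigma))

# Example 15: Sigma = a ^ c is relevant to V = {a, b}
sigma = lambda w: w['a'] and w['c']
print(relevant_to(sigma, {'a', 'b'}))   # True
print(relevant_to(sigma, {'b'}))        # False: no prime implicate mentions b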
As a consequence, Lakemeyer's notion of irrelevance of a formula to a subject matter coincides with FV-independence.

Proposition 26 Let Σ be a formula from PROP_PS and V a subset of PS. Σ is relevant to V if and only if Σ is Var-dependent on V.

Thus, the model-theoretic characterization of FV-independence also applies to irrelevance of a formula to a subject matter. We also have that the irrelevance of a formula to a subject matter coincides with Boutilier's definition of influenceability. Finally, the above proposition allows for an easy proof of complexity for relevance, namely, relevance of a formula to a subject matter is NP-complete.
5.2.2 Strict relevance of a formula to a subject matter

Lakemeyer has introduced two forms of strict relevance. The (chronologically) first one has been given in (Lakemeyer, 1995), as follows.

Definition 12 (strict relevance to a subject matter, Lakemeyer, 1995) Let Σ be a formula from PROP_PS and V a subset of PS. Σ is strictly relevant to V if and only if every prime implicate of Σ contains a variable from V.

Lakemeyer has also introduced another notion of strict relevance (Lakemeyer, 1997), more demanding than the original one. Here we consider an equivalent definition.

Definition 13 (strict relevance to a subject matter, Lakemeyer, 1997) Let Σ be a formula from PROP_PS and V a subset of PS. Σ is strictly relevant to V if and only if there exists a prime implicate of Σ mentioning a variable from V, and every prime implicate of Σ mentions only variables from V.

Both definitions prevent tautologies and contradictory formulas from being strictly relevant to any set of variables. The basic difference between the two definitions is that in the first one we require that every prime implicate of Σ contains at least one variable from V, while in the second case we impose that every prime implicate of Σ contains only variables from V (see footnote 5). As the following example shows, there are formulas for which the two definitions of strict relevance do not coincide.

Example 16 Let Σ = (a ∨ b) and V = {a}. There is only one prime implicate of Σ, namely a ∨ b. Since it contains at least one variable of V, it follows that Σ is strictly relevant to V w.r.t. (Lakemeyer, 1995). However, since the prime implicate a ∨ b is not composed only of variables of V (because b ∉ V), it follows that Σ is not strictly relevant to V w.r.t. (Lakemeyer, 1997).

Through FV-independence, we can derive an alternative characterization of the notion of strict relevance introduced by Lakemeyer (1997). Indeed, as a straightforward consequence of the definition, we have:

Proposition 27 Let Σ be a formula from PROP_PS and V a subset of PS. Σ is strictly relevant to V if and only if Σ is Var-dependent on V and Var-independent from Var(Σ) ∖ V.

We have identified the complexity of both definitions of strict relevance, and they turn out to be different: the first definition is harder than the second one.

Proposition 28 (complexity of strict relevance)
(1) Strict relevance of a formula to a subject matter (Lakemeyer, 1995) is Π^p_2-complete.
(2) Strict relevance of a formula to a subject matter (Lakemeyer, 1997) is BH₂-complete.

These complexity results improve Theorem 50 from (Lakemeyer, 1997), which only points out the NP-hardness of irrelevance and strict irrelevance as defined in (Lakemeyer, 1997).
6. Other Related Work and Further Extensions

In this section, we first discuss other related work, then some possible extensions of the notions and results we have presented before.

6.1 Other Related Work

As already evoked, independence has been considered under various forms in various AI fields.

5. Strict relevance as in (Lakemeyer, 1997) could also be shown to be strongly related to controllability (Boutilier, 1994; Lang & Marquis, 1998b).

6.1.1 Independence in propositional logic

There are other forms of independence in propositional logic that we have not considered in this article, especially definability and controllability (Lang & Marquis, 1998b), as well as conditional independence (Darwiche, 1997). If X, Y and Z are three disjoint sets of variables and Σ is a knowledge base, then X and Y are conditionally independent w.r.t. Z knowing Σ if and only if, for any Z-world ω_Z, once we know ω_Z (and Σ), learning something about X cannot make us learn anything new about Y (and vice versa). The computational issues pertaining to conditional independence and to stronger notions, as well as related notions such as relevance between subject matters (Lakemeyer, 1997) and novelty (Greiner & Genesereth, 1983), have been extensively studied in a companion paper (Lang, Liberatore, & Marquis, 2002).
6.1.2 Implicit and explicit dependence

Several approaches to belief change make use of an explicit dependence relation, which means that it is part of the input (while ours is implicit, i.e., derived from the input). Thus, computing independence relations from a knowledge base can be seen as an upstream task enabling us to specify the "core" (minimal) independence relation upon which the belief change operator is based; this core independence relation can then be completed by specifying explicitly some additional dependencies using knowledge about the domain. Such an approach has been proposed for belief revision in (Fariñas del Cerro & Herzig, 1996), for belief update in (Marquis, 1994) and (Herzig, 1996), and for reasoning about action in (Herzig & Rifi, 1999).

6.1.3 Independence and contexts

Contextual reasoning (Ghidini & Giunchiglia, 2001) has been introduced for formalizing domains in which knowledge can naturally be divided into parts (contexts). Each context is characterized by its own language and alphabet. The knowledge base of a context contains what is relevant to a part of the domain. However, it is not guaranteed that the different parts of the domain do not interact, so inference in one context may be affected by the knowledge of some other context.

The main difference between contextual reasoning and independence is that the latter is a study of the relevance relation that can be drawn from a "flat" (i.e., not divided into contexts) knowledge base, whereas contextual reasoning is about knowledge concerning specific contexts, that is, knowledge expressed by specifying which context it refers to. In other words, the relevance relation is a result of reasoning about knowledge in studying dependency; on the other hand, it is one of the data that has to be provided for reasoning about contexts.
6.1.4 The definition of irrelevance by Levy et al.

The definition of irrelevance given by Levy, Fikes, and Sagiv (1997) aims at establishing which facts of a knowledge base are irrelevant to the derivation of a query. In particular, they consider a first-order logic with no function symbols and a set of inference rules. A knowledge base is a set of closed formulas (formulas with no free variables). Derivation of a query (another closed formula) is obtained by applying the inference rules to the knowledge base and the logical axioms of the theory.

A formula φ of the knowledge base is irrelevant to the derivation of another formula ψ if φ does not "participate" in the process of inferring ψ from the knowledge base. For example, in the knowledge base {A(1), C(2), A(X) → B(X)}, it is clear that A(1) is relevant to B(1), while C(2) is not.

This definition becomes complicated when more complex scenarios are considered. Namely, Levy et al. consider three different "coordinates": first, whether all derivations are considered or just one; second, whether we consider all derivations or just minimal ones; third, whether we consider membership in the proof or just derivability from the formulas that compose the proof (in this case, we have four possible choices).

Besides the fact that this notion of irrelevance is based on first-order logic, there are other, more substantial, differences between it and the ideas investigated in this paper. First, it is a relation between two formulas, given a background knowledge base. As such, it is more related to other notions of relevance in the literature (Lang & Marquis, 1998a). Second, it is based on the concept of proof, which is something not completely dependent on the semantics. For example, if the usual modus ponens is replaced with the strange inference rule "from α, β and α → γ infer γ", which does not change the semantics of the logic, then the formula C(2) of the knowledge base above becomes magically relevant to B(1). This is perfectly reasonable in the approach by Levy et al., where the aim is improving efficiency using a specific proof theory.
6.2 Extending our Notions and Results

The notions and results presented in this paper can be extended in several directions.

6.2.1 Numerical and ordinal notions of formula-variable independence; forgetting in more general structures

It may be worth wondering about how the notions of FL-independence and FV-independence can be generalized to the case where the knowledge base is not a mere propositional formula but a probability distribution over Ω_PS, or an incomplete probability (or equivalently a set of probability distributions), or a stratified knowledge base, or any other ordinal or numerical structure.

First, let us notice that some of our characterizations lead to quite intuitive and general ideas. To take the case of FV-independence, we have shown that the following three statements are equivalent when Σ is a propositional knowledge base:

(a) Σ does not tell anything about x, in any context;
(b) Σ can be rewritten equivalently into a formula Σ′ in which x does not appear (Definition 4);
(c) for any two interpretations ω and ω′ that differ only in the value given to x, the status of ω with respect to Σ (i.e., model or countermodel) is the same as that of ω′ (Corollary 1).

As to variable forgetting:

(d) ForgetVar(Σ, V) is the most general consequence of Σ that is Var-independent from V (Corollary 6).

Now, these definitions have a sufficient level of generality to be extended to the case where the knowledge base Σ is replaced by another structure. Thus, some of us have extended the notion of Var-independence (characterized using (b)) and variable forgetting to ordinal conditional functions (represented as stratified bases) (Lang, Marquis, & Williams, 2001). The basic purpose was to extend the "forget and expand" approach at work for updating "flat" knowledge bases (as briefly discussed in Section 4) to updating more sophisticated epistemic states.

The case of incomplete probabilities is also interesting, since generalizing (a), (b) and (c) will lead to the same intuitive notion (see footnote 6). As to variable forgetting, it is not hard to notice that it corresponds to the well-known notion of marginalization.
6.2.2 Formula-variable independence in nonclassical (propositional) logics

The definition of FL- and FV-independence (Σ can be rewritten equivalently into a formula Σ′ in which l (resp. x) does not appear) is quite general, in the sense that the logical language and/or the consequence relation (and thus the notion of logical equivalence) may vary, which enables us to draw from this principle notions of FL- and FV-independence (and, from then on, notions of literal and variable forgetting) for nonclassical logics. This is what has been done (at least partly) in (Lakemeyer, 1997). Here we briefly consider two cases:

(i) subclassical logics. These logics are built on a classical language but have a weaker consequence relation than classical logic. For instance, in most multivalued logics, due to the fact that the excluded middle (a ∨ ¬a) is not a theorem, a formula such as a ∨ (¬b ∧ b) will depend both on a and on b (whereas it depends only on a in classical logic), because it cannot be rewritten equivalently into a formula in which b does not appear (in particular, a is not always equivalent to a ∨ (¬b ∧ b) in such logics); on the contrary, the formula a ∨ (a ∧ b) is equivalent to a in usual multivalued logics and thus depends on a only.

(ii) modal logics. Now, the language is obtained by extending a classical propositional language with one or more modalities, while the consequence relation extends that of classical logic (in the sense that a modality-free formula is a theorem if and only if it is a theorem of classical logic). Therefore, results such as Propositions 6 and 7 still make sense and we conjecture that they are still valid. To take an example, in the logic S5, where □Σ → ◊Σ is a theorem, a formula such as □Σ ∧ ◊(Σ ∨ Φ) is independent from Φ while ◊Σ ∧ □(Σ ∨ Φ) is not.
6.2.3 Restricted queries and characteristic models

Forgetting a set of variables modifies a knowledge base while preserving some of the consequences, namely, those built on variables that are not forgotten. Other frameworks are based on restricting the possible queries in other ways, i.e., considering only queries in Horn form (Kautz, Kearns, & Selman, 1995). Reasoning with characteristic models (Kautz, Kearns, & Selman, 1993) is based on using only a subset of the models of the original formula. As for forgetting, this may increase the size of the representation of a knowledge base exponentially. Khardon and Roth (1996) have shown that characteristic models can be used for efficient reasoning on some sets of restricted queries.

6. This is not the case for single probability distributions: when the knowledge base is represented by a probability distribution pr, (a) does not lead to an acceptable definition (because a full probability distribution on Ω_PS always tells something about x); (c) suggests the definition ∀ω ∈ Ω_PS, pr(Switch(ω, x)) = pr(ω), which leads to the decomposability of pr w.r.t. x: pr is a joint probability distribution obtained from a probability distribution pr_{PS∖{x}} on Ω_{PS∖{x}} and the probability distribution pr_x on {x} defined by pr_x(x) = 1/2 (= pr_x(¬x)). (b) would lead to the same definition, noticing that a full probability distribution can be expressed more compactly by a "partial" probability distribution (here, a probability distribution on Ω_{PS∖{x}}) from which pr is induced by the maximum entropy principle.
7. Concluding Remarks

We have investigated several ways of answering the key question of determining what a propositional knowledge base tells about the (in)dependencies between variables and formulas. For each of the notions studied, we have related it to other previously known notions and we have studied it from a computational point of view, giving both complexity results and characterization results to be used for practical computation. In the light of our results, it appears that the various forms of logical independence are closely connected. Especially, several of them had been proposed by different authors without being explicitly related. Boutilier's influenceability and Lakemeyer's relevance of a formula to a subject matter are identical to FV-dependence (Propositions 25 and 26). We also discussed much related work, and suggested some extensions to more general frameworks than mere propositional logic.

The following tables give a synthetic view of many notions addressed in this paper and the corresponding complexity results.

Problems on literals and their complexity:

Problem                 | Notation   | Definition                                              | Complexity
Synt. Lit-independence  |            | Lit(Σ) ∩ L = ∅                                          | P
Lit-independence        | L ↛ Σ      | ∃Φ : Φ ≡ Σ, Lit(Φ) ∩ L = ∅                              | coNP-complete
Dependent literals      | DepLit(Σ)  | {l | {l} ↦ Σ}                                           | BH₂-complete
Full Lit-independence   |            | L ⊆ DepLit(Σ)                                           | coNP-complete
Lit-simplified          |            | ∀l : {l} ↛ Σ ⇔ l ∉ Lit(Σ)                               | NP-complete
Lit-equivalence         | Σ ≡_L Φ    | ForgetLit(Σ, Lit(Σ) ∖ L) ≡ ForgetLit(Φ, Lit(Φ) ∖ L)     | Π^p_2-complete

Problems on variables and their complexity:

Problem                 | Notation   | Definition                                              | Complexity
Synt. Var-independence  |            | Var(Σ) ∩ V = ∅                                          | P
Var-independence        | V ↛± Σ     | ∃Φ : Φ ≡ Σ, Var(Φ) ∩ V = ∅                              | coNP-complete
Dependent variables     | DepVar(Σ)  | {v | v ↦± Σ}                                            | BH₂-complete
Full Var-independence   |            | V ⊆ DepVar(Σ)                                           | coNP-complete
Var-simplified          |            | ∀v : {v} ↛± Σ ⇔ v ∉ Var(Σ)                              | NP-complete
Var-equivalence         | Σ ≡_V Φ    | ForgetVar(Σ, Var(Σ) ∖ V) ≡ ForgetVar(Φ, Var(Φ) ∖ V)     | Π^p_2-complete
 421Lang, Liberatore,& Marquis
 The fact that both notionsof formula-variable independence andforgetting
have been
 usedas key concepts in many AI fields {includingautomated reasoning, belief
revision and update, diagnosis, reasoning about actionsetc.} has been discussed
before {Sections 1 and
 4}, so we will refrain from repeating it here. The gain of generalityoffered
by the corre-
 spondingliteral-oriented notions introduced in this paper has also been
established {e.g., Proposition 20}, and their application to several AI problems
{like closed-world reasoning and belief update} has been sketched. Primarily,
one of the main motivations for the notions of formula-variable independence
 and forgetting wasto improve inference from a computational side, by enabling
to focus on
 relevant pieces of knowledge. Theextent to which this goal can be reachedcan
be discussed
 at the light of ourcomplexity results:
 
*  Most{in}dependenc e relations have a highcomplexity. The notions connected
to FV- dependence {FL-dependence, full FV- and FL-dependence, influenceability,
relevance
 to a sub ject matter and strictrelevance second form} have a complexity
at the first
 level of the polynomialhierarchy, which means that they can be checked by
a satis-
 fiabilityor/and an unsatisfiability solver.They become \tractable" when
syntactical restrictions are made {Proposition 11}.Forgetting {literals or
variables}also is com-
 putationally expensive.The remaining notions are in complexity classeslocated
at
 the second level of the polynomial hierarchy. Worse,under the standard assumptions
 of complexitytheory, the explicit computation of literal orvariable forgetting
cannot
 beachieved in polynomial space in the generalcase {Proposition23}. This
pushes towards the negative conclusion that allthese notions are hard to
be computed {at least in the worst case} except if thesize or the syntactical
form of the input enablesit.
 The fact that these problems fall inthe second level of the polynomial hierarchyare
 not that surprising since this is wherea large part {if not the ma jority}of
important
 problems in knowledgerepresentation
 7
 fall. 
*  But a high worst-casecomplexity does not nec essarilyprevent from practic
alalgo-
 rithms! Thus, Amir andMcIlraith have shown the computational benefits that
can
 beachieved bystructuring a KB {through the forgetting operation}so as to
achieve
 inferenceand consequence findingmore efficiently {Amir &McIlraith, 2000;
McIlraith
 & Amir, 2001}.A similar discrepancy between the worstcase situation and
the practi-
 calones can be observed in other domains; especially, satisfiability-based
checkersfor
 quantified boolean formulas{Biere, Cimatti, Clarke, Fujita,& Zhu, 1999;
Williams,
 Biere,Clarke, & Gupta, 2000} used for formal verification purposes {bounded
model checking} exhibitinteresting computationalbehaviours {actually, they
typically per-
 form better than specializedalgorithms, as shown in Rintanen, 2001}, despitethe
 fact that they are confronted tothe problem of variable forgetting {i.e.,elimination
 of existentially quantified variables}.
 
*  More over,prepr o c essing may play an important role. What we mean with\prepro-
 cessing" refers to the task ofcomputing {in}dependence relations and forgettingbefor
e7.Such as abduction, nonmonotonic inference, belief revision, belief update,
some forms ofplanning and
 decision making.
 422Propositional Independence performing more problem-oriented tasks suchas
consequence finding, diagnosis, ac- tion/update, decision making etc. Thus,Lit-simplifying
{or Var{simplifying}a KB
 during a preliminaryoff-line phasecan prove helpfulfor improving on-lineinference
 since simplification never increasesthe size of a KB. As shown by Proposition23,
a
 similar conclusion cannot be drawn to what concerns forgetting. This seems
to be the
 priceto be paid to benefit fromthe power of forgetting. However,this negative
conclu-
 sion must betempered by the two followingcomments.On the one hand, forgetting
is
 interesting per se; it is not onlya tool that can help improving inference
insome cases
 but also a goal in several AIapplications. On the other hand, our complexityresults
 relate to the worst case situation,only, and, as evoked before, forgettingis
feasible in
 many practical cases.Finally, let us note that there are several complete
propositional
 fragmentsfor which forgetting is easy. Especially, as advocated by Darwiche
{1999}, compiling a KB into a DNNF formula duringan off-line step can prove
practically valuable to achieve forgetting in anefficient way, providedthat
the size of the compiled
 formremains small enough {which cannot be guaranteed in the worst case}.
Since it is not known whether the DNNF fragmentis strictly more succinct
than the prime implicates one {Darwiche & Marquis, 1999},theprime implicates
fragment can also be targeted with profit as a compilation language for some
knowledge bases; especially,
 some recent approaches to theimplicitrepresentation of prime implicates
{Simon & del Val, 2001} exhibit verysignificant empirical performances {they
enablethe com-
 putation of sets of primeimplicates containing up to 10
 70
 clauses}. Accordingly, they
 can prove valuablefor the practical computing of independence andforgetting.
 Acknowledgements The third author has been partly supported by the IUT de
Lens, the Universite d'Artois,
 the RegionNord / Pas-de-Calais under the TACT-TICpro ject, and by the European
Com- munity FEDER Program. Someresults of this paper already appeared inSection
3 of the
 paper {Lang & Marquis,1998a} \Complexity results for independence anddefinability
in
 propositionallogic",Pro c. of the 6
th
 International Conferenc eon Principles of Know ledge
 R epr esentation and Re asoning{KR'98}, pages 356-367, 1998.
Appendix A: Proofs

Proposition 1 A formula Σ is Lit-independent from l if and only if, for any interpretation ω ∈ Ω_PS, if ω ⊨ Σ then Force(ω, ¬l) ⊨ Σ.

Proof: Assume that Σ is Lit-independent from l. Then there exists a formula Φ in NNF that is equivalent to Σ and does not contain l. Then, for any ω ∈ Ω_PS such that ω ⊨ Φ, we have Force(ω, ¬l) ⊨ Φ. Since Φ is equivalent to Σ, we conclude that the same property holds for Σ.

Assume that, for any interpretation ω ∈ Ω_PS, ω ⊨ Σ implies Force(ω, ¬l) ⊨ Σ. We prove that Σ is Lit-independent from l. Indeed, let γ_ω be the term whose only model is ω. The following equivalences hold:

Σ ≡ ⋁ {γ_ω | ω ⊨ Σ}
  ≡ ⋁ ({γ_ω | ω ⊨ Σ and ω ⊭ l} ∪ {γ_ω | ω ⊨ Σ and ω ⊨ l})
  ≡ ⋁ ({γ_ω | ω ⊨ Σ and ω ⊭ l} ∪ {γ_ω ∨ γ_{Force(ω,¬l)} | ω ⊨ Σ and ω ⊨ l})

The latter step can be done because ω ⊨ Σ implies that Force(ω, ¬l) is also a model of Σ. Now, if ω ⊭ l then γ_ω does not contain l. On the other hand, if ω ⊨ l, then γ_ω ∨ γ_{Force(ω,¬l)} can be rewritten as a conjunction of literals containing neither l nor its negation. As a result, the above formula (which is in NNF) does not contain l, which means that Σ is Lit-independent from l. ∎
Proposition 2
(1) DepLit(Σ) ⊆ Lit(Σ);
(2) If Σ ≡ Φ, then DepLit(Σ) = DepLit(Φ);
(3a) DepLit(Σ ∧ Φ) ⊆ DepLit(Σ) ∪ DepLit(Φ);
(3b) DepLit(Σ ∨ Φ) ⊆ DepLit(Σ) ∪ DepLit(Φ);
(4) l ∈ DepLit(Σ) if and only if ¬l ∈ DepLit(¬Σ).

Proof:
1. Trivial.
2. L ↛ Σ if and only if there exists a formula Ψ that is equivalent to Σ and such that Ψ is syntactically Lit-independent from L. Since Σ ≡ Φ, it follows that Φ ≡ Ψ.
3. Let Ψ (resp. Π) be an NNF formula equivalent to Σ (resp. Φ) s.t. no literal of L occurs in it. Then Ψ ∧ Π (resp. Ψ ∨ Π) is a formula equivalent to Σ ∧ Φ (resp. Σ ∨ Φ), which is in NNF, and no literal of L occurs in it.
4. Straightforward from the fact that l appears in an NNF formula Σ if and only if ¬l appears in the NNF form of ¬Σ. ∎
Proposition 3
(1) DepVar(Σ) ⊆ Var(Σ);
(2) If Σ ≡ Φ, then DepVar(Σ) = DepVar(Φ);
(3a) DepVar(Σ ∧ Φ) ⊆ DepVar(Σ) ∪ DepVar(Φ);
(3b) DepVar(Σ ∨ Φ) ⊆ DepVar(Σ) ∪ DepVar(Φ);
(4) DepVar(¬Σ) = DepVar(Σ).

Proof: (1) is trivial; (2) and (3) are similar to the proofs of points (2) and (3) of Proposition 2, replacing "literal" by "variable", l by v, and DepLit by DepVar. As to (4): if x ∉ DepVar(Σ), then there exists a formula Φ equivalent to Σ in which x does not appear; since x does not appear in ¬Φ either, ¬Φ is a formula equivalent to ¬Σ in which x does not appear. ∎
Proposition 4 Let Σ be a formula from PROP_PS.

* Σ is Lit-simplified if and only if the following equivalence holds: for every L ⊆ L_PS, Σ is syntactically Lit-independent from L if and only if L ↛ Σ holds.

* Σ is Var-simplified if and only if the following equivalence holds: for every V ⊆ PS, Σ is syntactically Var-independent from V if and only if V ↛± Σ holds.

Proof:

* Lit-simplification.
  (⇒): Assume Σ is Lit-simplified (thus Lit(Σ) = DepLit(Σ)) and let L ⊆ L_PS. If Σ is syntactically Lit-independent from L, then L ∩ Lit(Σ) = ∅, thus L ∩ DepLit(Σ) = ∅, i.e., L ↛ Σ.
  (⇐): Assume that Σ is not Lit-simplified. Then there exists l ∈ Lit(Σ) s.t. Σ is Lit-independent from l. With L = {l}, it is clear that Σ is syntactically Lit-dependent on L, while being Lit-independent from it; contradiction.

* Var-simplification. The proof is similar to the Lit-simplification case, replacing "Lit-independent" by "Var-independent", L by V, L_PS by PS, l by x, and Lit(Σ) by Var(Σ). ∎
Proposition 5 For every Σ from PROP_PS, there exists a Lit-simplified formula Φ s.t. Σ ≡ Φ.

Proof: Since Σ is Lit-independent from L_PS ∖ DepLit(Σ), we know that there exists an NNF formula Φ equivalent to Σ such that Lit(Φ) ∩ (L_PS ∖ DepLit(Σ)) = ∅, i.e., such that Lit(Φ) ⊆ DepLit(Σ). By point (2) of Proposition 2 we have DepLit(Φ) = DepLit(Σ). Thus, DepLit(Φ) ⊆ Lit(Φ) ⊆ DepLit(Σ) = DepLit(Φ), from which we conclude that DepLit(Φ) = Lit(Φ), i.e., Φ is Lit-simplified. ∎
Proposition 6 Let Σ be a formula from PROP_PS and l be a literal of L_PS. The next four statements are equivalent:
(1) l ↛ Σ;
(2) Σ_{l←1} ⊨ Σ_{l←0};
(3) Σ ⊨ Σ_{l←0};
(4) Σ_{l←1} ⊨ Σ.

Proof:
(1) ⇒ (2): Let v be the variable of l and assume that Σ_{l←1} ⊭ Σ_{l←0}, which means that there is a PS ∖ {v}-world ω such that ω ⊨ Σ_{l←1} ∧ ¬Σ_{l←0}; since Σ is equivalent to (l ∧ Σ_{l←1}) ∨ (¬l ∧ Σ_{l←0}), we have ω ∧ {l} ⊨ Σ_{l←1} ∧ l ⊨ Σ and Force(ω ∧ {l}, ¬l) = ω ∧ {¬l} ⊨ Σ_{l←1} ∧ ¬Σ_{l←0}; hence Force(ω ∧ {l}, ¬l) ⊭ Σ and therefore l ↦ Σ by Proposition 1.

(2) ⇒ (3): Assume Σ_{l←1} ⊨ Σ_{l←0}. We have the following chain of implications:
Σ ≡ (l ∧ Σ_{l←1}) ∨ (¬l ∧ Σ_{l←0})
Σ ⊨ (l ∧ Σ_{l←0}) ∨ (¬l ∧ Σ_{l←0})
Σ ⊨ Σ_{l←0}

(3) ⇒ (1): Let us assume that Σ ⊨ Σ_{l←0}, and prove that l ↛ Σ. Indeed, the assumption can be rewritten as:
∀ω ∈ Ω_PS : (ω ⊨ Σ ⇒ ω ⊨ Σ_{l←0})
Now, ω ⊨ Σ_{l←0} is equivalent to saying that, after changing the truth value of l to false, ω is still a model of Σ. In formulas,
∀ω ∈ Ω_PS : (ω ⊨ Σ ⇒ Force(ω, ¬l) ⊨ Σ)
This is exactly the characterization of l ↛ Σ given by Proposition 1.

(2) ⇒ (4): Same as the proof of (2) ⇒ (3).
(4) ⇒ (1): Similar to the proof of (3) ⇒ (1). ∎
Proposition 7 Let Σ be a formula from PROP_PS and x be a variable of PS. The next four statements are equivalent:
(1) x ↛± Σ;
(2) Σ_{x←0} ≡ Σ_{x←1};
(3) Σ ≡ Σ_{x←0};
(4) Σ ≡ Σ_{x←1}.

Proof: Easy consequence of the definition of FV-independence (Σ is Var-independent from x if and only if it is Lit-independent from {x, ¬x}) and Proposition 6. ∎
Proposition 8 Let Σ be a formula from PROP_PS and L be a subset of L_PS. The next statements are equivalent:
(1) L ↛ Σ;
(2) PI(Σ) ⊆ {γ | γ is a term that does not contain any literal from L};
(3) IP(Σ) ⊆ {δ | δ is a clause that does not contain any literal from L}.

Proof:
* (1) ⇔ (2):
  (⇒): If Σ is Lit-independent from L, then there exists an NNF formula Φ equivalent to Σ s.t. Lit(Φ) ∩ L = ∅. Clearly enough, the property Lit(Φ) ∩ L = ∅ is still satisfied if Φ is turned into DNF, especially when Φ is turned into its prime implicant normal form. Since two equivalent formulas have the same prime implicants, no term of PI(Σ) contains a literal from L.
  (⇐): PI(Σ) is an NNF formula that is equivalent to Σ and syntactically Lit-independent from L. Hence, Σ is Lit-independent from L.
* (1) ⇔ (3): Similar to the prime implicant situation, using CNF instead of DNF. ∎
Proposition 9 Let Σ be a formula from PROP_PS and V be a subset of PS. The next statements are equivalent:
(1) V ↛± Σ;
(2) PI(Σ) ⊆ {γ | γ is a term that does not contain any variable from V};
(3) IP(Σ) ⊆ {δ | δ is a clause that does not contain any variable from V}.

Proof: Easy consequence of the definition of FV-independence, plus Proposition 8 above. ∎
Proposition 10 FL dependence, FV dependence, full FL dependence and full FV dependence are NP-complete.

Proof:
* FL dependence
  - Membership. In order to show that Σ is Lit-dependent on L, it is sufficient to guess a literal l from L and a (Var(Σ) ∖ {Var(l)})-world ω that is a model of Σ_{l←1} but not a model of Σ_{l←0}. These tests can be achieved in time polynomial in |Σ|.
  - Hardness. Let us consider the mapping M s.t. M(Σ) = ⟨Σ ∧ new, {new}⟩, where new is a propositional variable that does not occur in Σ. Clearly enough, M(Σ) can be computed in time polynomial in |Σ|. Moreover, Σ is satisfiable if and only if Σ ∧ new is Lit-dependent on {new}.

* FV dependence
  - Membership. In order to show that Σ is Var-dependent on V, it is sufficient to guess a variable x from V and a (Var(Σ) ∖ {x})-world ω that is not a model of the formula Σ_{x←0} ↔ Σ_{x←1}. This test can be achieved in time polynomial in |Σ|.
  - Hardness. Similar to the hardness proof of FL-dependence, replacing "Lit-dependent" by "Var-dependent".

* full FL dependence
  - Membership. Σ is fully Lit-dependent on L = {l₁, ..., lₙ} if and only if Σ_{l₁←1} ⊭ Σ_{l₁←0} and ... and Σ_{lₙ←1} ⊭ Σ_{lₙ←0} hold. Equivalently, Σ is fully Lit-dependent on L = {l₁, ..., lₙ} if and only if the formula rename₁(Σ_{l₁←1} ∧ ¬Σ_{l₁←0}) ∧ ... ∧ renameₙ(Σ_{lₙ←1} ∧ ¬Σ_{lₙ←0}) is satisfiable, where each renameᵢ is a renaming that maps variables to new symbols (in a uniform way). Since this formula can be computed in time polynomial in |Σ| + |L|, the membership of full FL dependence to NP follows.
  - Hardness. Full FL-dependence and FL-dependence coincide in the case in which L is composed of a single literal. Since the NP-hardness of FL-dependence has been proved using a set L composed of a single literal, the NP-hardness of full FL dependence follows.

* full FV dependence
  - Membership. Σ is fully Var-dependent on V = {x₁, ..., xₙ} if and only if Σ ≢ Σ_{x₁←0} and ... and Σ ≢ Σ_{xₙ←0} hold. Equivalently, Σ is fully Var-dependent on V = {x₁, ..., xₙ} if and only if the formula rename₁(Σ ⊕ Σ_{x₁←0}) ∧ ... ∧ renameₙ(Σ ⊕ Σ_{xₙ←0}) is satisfiable, where each renameᵢ is a renaming that maps variables to new symbols (in a uniform way). Since this formula can be computed in time polynomial in |Σ| + |V|, the membership of full FV dependence to NP follows.
  - Hardness. Full FV-dependence and FV-dependence coincide in the case in which V is composed of a single variable. Since the NP-hardness of FV-dependence has been proved using a set V composed of a single variable, the NP-hardness of full FV-dependence follows. ∎
Proposition 11 Whenever Σ belongs to a class C of CNF formulas that is tractable for clausal query answering (i.e., there exists a polytime algorithm to determine whether Σ ⊨ γ for any CNF formula γ) and stable for variable instantiation (i.e., replacing in Σ ∈ C any variable by true or by false gives a formula that still belongs to C), then FL dependence, FV dependence, full FL dependence and full FV dependence are in P.

Proof: This is a straightforward consequence of Propositions 6 and 7. When Σ belongs to a class C of formulas that is tractable for clausal query answering and stable for variable instantiation, we can easily check whether Σ ⊨ Σ_{x←0} and Σ_{x←0} ⊨ Σ hold. ∎

Proposition 12 Lit-simplified formula and Var-simplified formula are NP-complete.
Proof:
* Lit-simplification
  - Membership. Easy consequence of the fact that Lit-simplified formula is a restriction of full FL dependence, which is in NP (and NP is closed under polynomial reductions).
  - Hardness. We prove that a formula Φ, built over an alphabet X = {x₁, ..., xₙ}, is satisfiable if and only if the formula Σ is Lit-simplified, where
    Σ = (Φ ∨ Φ[X ← ¬X]) ∧ ⋀_{xᵢ ∈ X} (xᵢ ↔ yᵢ)
    and Φ[X ← ¬X] is the formula obtained by replacing in Φ each occurrence of xᵢ (resp. ¬xᵢ) with ¬xᵢ (resp. xᵢ).
    First, if Φ is not satisfiable, then neither is Φ[X ← ¬X], thus (Φ ∨ Φ[X ← ¬X]) is not satisfiable. As a result, Σ is not satisfiable, thus it is not Lit-simplified, because DepLit(Σ) = ∅ but Σ mentions the variables xᵢ and yᵢ.
    Assume Φ satisfiable. Clearly, Σ is satisfiable as well: let ω be a model of Σ. We prove that Σ is Lit-simplified by showing that it is Lit-dependent on each literal it contains. We have Lit(Σ) = {x₁, ..., xₙ, y₁, ..., yₙ, ¬x₁, ..., ¬xₙ, ¬y₁, ..., ¬yₙ}. Let lᵢ ∈ Lit(Σ).
    (1) Assume that ω ⊨ lᵢ. Then Force(ω, ¬lᵢ) ⊭ Σ. Indeed, let Var(lᵢ) = xᵢ (resp. yᵢ): ω satisfies xᵢ ↔ yᵢ, so changing the truth value of xᵢ only (resp. yᵢ only) leads to an interpretation that does not satisfy xᵢ ↔ yᵢ.
    (2) Otherwise, we have ω ⊨ ¬lᵢ. Replacing lᵢ by ¬lᵢ in the proof of case (1) enables deriving the expected conclusion.

* Var-simplification
  - Membership. Easy consequence of the fact that Var-simplified formula is a restriction of full FV dependence, which is in NP (and NP is closed under polynomial reductions).
  - Hardness. The proof is similar to the proof of NP-hardness of Lit-simplified formula, replacing "Lit-simplified" with "Var-simplified". ∎
 Proposition 13
 1. Determiningwhether DepLit{006} = L{where L is a set of literals}, and
determining
 whetherDepV ar{006} = X {where X is a set of variables} isBH
 2
 -complete. 2. The sear ch problemconsisting in computing DepLit{006} {resp
e ctively DepVar{006}} is
 in F001
 p
 2
 and is both NP and coNP-hard. 429Lang, Liberatore,& Marquis
Proof:

* Determining whether DepLit(Σ) = L is BH₂-complete.
  - Membership. DepLit(Σ) = L if and only if (i) l ↦ Σ holds for every l ∈ L and (ii) l ↦ Σ fails for every l ∈ Lit(Σ) \ L; the set of instances ⟨Σ, L⟩ such that (i) holds is the intersection of a linear number of problems in NP, thus it is in NP; similarly, the set of instances ⟨Σ, L⟩ such that (ii) holds is a problem in coNP. This proves the membership to BH₂ of the problem of determining whether DepLit(Σ) = L.
  - Hardness. Let ⟨Φ, Γ⟩ be a pair of propositional formulas. Without loss of generality, we assume that Φ and Γ do not share any variable and that X = Var(Φ) = {x_1, ..., x_n} and Var(Γ) = {z_1, ..., z_p}. Furthermore, we assume that X ≠ ∅ (if it were not the case, it would be sufficient to replace Φ by Φ ∧ (t ∨ ¬t) before performing the reduction). We prove that ⟨Φ, Γ⟩ is a positive instance of sat-unsat, i.e., Φ is satisfiable and Γ is unsatisfiable, if and only if DepLit(Σ(Φ) ∧ ¬Γ) = Lit(Σ(Φ)), where Σ(Φ) = (Φ ∨ Φ[X←¬X]) ∧ ⋀_{x_i ∈ X} (x_i ↔ y_i) as in the previous proof (note that Σ(Φ) and Γ do not share any variables).
    (i) If Φ is satisfiable and Γ is unsatisfiable, then Σ(Φ) is Lit-simplified according to the previous proof, and thus DepLit(Σ(Φ)) = Lit(Σ(Φ)), whereas DepLit(¬Γ) = ∅; together, these entail that DepLit(Σ(Φ) ∧ ¬Γ) = Lit(Σ(Φ)).
    (ii) Assume that Φ is unsatisfiable and DepLit(Σ(Φ) ∧ ¬Γ) = Lit(Σ(Φ)). Then Σ(Φ) is not Lit-simplified (recall that Var(Φ) is not empty, and hence neither is Lit(Σ(Φ))), thus DepLit(Σ(Φ)) ⊂ Lit(Σ(Φ)). Now, DepLit(Σ(Φ) ∧ ¬Γ) = Lit(Σ(Φ)) implies DepLit(Σ(Φ) ∧ ¬Γ) ∩ Lit(Γ) = ∅, which is possible only if Γ is a tautology (because Σ(Φ) and Γ do not share any variables); but in this case, Σ(Φ) ∧ ¬Γ is inconsistent and thus DepLit(Σ(Φ) ∧ ¬Γ) = ∅, hence Lit(Σ(Φ)) = ∅, which is contradictory.
    (iii) Assume that Γ is satisfiable and DepLit(Σ(Φ) ∧ ¬Γ) = Lit(Σ(Φ)). The second condition can hold only if Σ(Φ) is unsatisfiable, hence not Lit-simplified (since Lit(Σ(Φ)) ≠ ∅), so Φ is unsatisfiable as well; this takes us back to case (ii), which leads us to the same contradiction again.

* Computing DepLit(Σ) is in FΔ₂ᵖ since FL dependence is in NP and DepLit(Σ) ⊆ Lit(Σ) (hence, it is sufficient to test, for every l ∈ Lit(Σ), whether or not Σ is Lit-independent from it). It is also both NP-hard and coNP-hard. NP-hardness is shown by the following polynomial reduction from sat: a CNF formula F is satisfiable if and only if F is valid or DepLit(F) ≠ ∅; clearly enough, valid CNF formulas can be recognized in polynomial time. Thus, sat can be solved if we know how to compute DepLit(Σ) for any Σ, which shows that computing DepLit(Σ) is NP-hard. The proof of coNP-hardness is similar (F is unsatisfiable if and only if F is not valid and DepLit(F) = ∅).

* Determining whether DepVar(Σ) = V is BH₂-complete. Membership follows easily from the membership of the corresponding problem for DepLit(Σ). Hardness is similar to the hardness proof for DepLit(Σ) = L.

* Computing DepVar(Σ) is in FΔ₂ᵖ and is both NP-hard and coNP-hard. Similar to the corresponding result for DepLit(Σ).
□
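The FΔ₂ᵖ membership argument above is essentially an algorithm: test each literal of Lit(Σ) with one entailment query. The Python sketch below spells this out, following the conditioning-based test for Lit-dependence used in the proof of Proposition 16 (Σ is Lit-independent from l exactly when Σ entails Σ with l conditioned to false). The tuple encoding of NNF formulas, the helper names and the brute-force entailment check are illustrative choices of ours, not the paper's; on real instances, a SAT solver would play the role of the NP oracle.

  from itertools import product

  # NNF formulas as nested tuples (an illustrative encoding, not the paper's):
  #   ('var', 'x')  positive literal x     ('neg', 'x')  negative literal not-x
  #   ('and', f, g) conjunction            ('or', f, g)  disjunction
  #   ('true',) / ('false',)  constants

  def evaluate(f, w):
      """Truth value of formula f under assignment w (dict: variable -> bool)."""
      op = f[0]
      if op == 'var':  return w[f[1]]
      if op == 'neg':  return not w[f[1]]
      if op == 'and':  return evaluate(f[1], w) and evaluate(f[2], w)
      if op == 'or':   return evaluate(f[1], w) or evaluate(f[2], w)
      return op == 'true'

  def variables(f):
      if f[0] in ('var', 'neg'):  return {f[1]}
      if f[0] in ('and', 'or'):   return variables(f[1]) | variables(f[2])
      return set()

  def literals(f):
      """Literals occurring in the NNF formula f, as ('var', x) / ('neg', x) pairs."""
      if f[0] in ('var', 'neg'):  return {f}
      if f[0] in ('and', 'or'):   return literals(f[1]) | literals(f[2])
      return set()

  def condition(f, var, val):
      """Sigma_{var <- val}: replace the variable var by the constant val in f."""
      if f[0] == 'var' and f[1] == var:  return ('true',) if val else ('false',)
      if f[0] == 'neg' and f[1] == var:  return ('false',) if val else ('true',)
      if f[0] in ('and', 'or'):
          return (f[0], condition(f[1], var, val), condition(f[2], var, val))
      return f

  def entails(f, g):
      """Brute-force test of f |= g; a SAT solver would replace this NP oracle."""
      vs = sorted(variables(f) | variables(g))
      for bits in product([False, True], repeat=len(vs)):
          w = dict(zip(vs, bits))
          if evaluate(f, w) and not evaluate(g, w):
              return False
      return True

  def dep_lit(f):
      """DepLit(f): literals l of f such that f does not entail f with l conditioned to false."""
      deps = set()
      for lit in literals(f):
          falsifying_value = (lit[0] == 'neg')  # setting x True falsifies not-x; False falsifies x
          if not entails(f, condition(f, lit[1], falsifying_value)):
              deps.add(lit)
      return deps

  # Example: Sigma = (a or b) and (not a or c); every literal occurring in it matters here.
  sigma = ('and', ('or', ('var', 'a'), ('var', 'b')),
                  ('or', ('neg', 'a'), ('var', 'c')))
  print(sorted(dep_lit(sigma)))  # [('neg', 'a'), ('var', 'a'), ('var', 'b'), ('var', 'c')]

On Σ = (a ∨ b) ∧ (¬a ∨ c) the sketch reports the four literals a, ¬a, b, c, which agrees with the prime-implicate-based reading used in the proof of Proposition 19, since IP(Σ) = {a ∨ b, ¬a ∨ c, b ∨ c}.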
Proposition 14  The set of models of ForgetLit(Σ, {l}) can be expressed as:

  Mod(ForgetLit(Σ, {l})) = Mod(Σ) ∪ {Force(ω, ¬l) | ω ⊨ Σ} = Mod(Σ) ∪ {ω | Force(ω, l) ⊨ Σ}.

Proof: The proof is obtained immediately from the definition. □

Proposition 15  The set of models of ForgetLit(Σ, L) can be expressed as:

  Mod(ForgetLit(Σ, L)) = {ω | Force(ω, L_1) ⊨ Σ for some L_1 ⊆ L}.

Proof: By induction on |L|. The base case in which L is empty is trivial. Let us now assume that the property holds for any L composed of k elements, and prove that it holds as well for L ∪ {l}.
By definition,
  ω ⊨ ForgetLit(Σ, L ∪ {l})
  if and only if ω ⊨ ForgetLit(ForgetLit(Σ, L), {l})
  if and only if ω ⊨ ForgetLit(Σ, L) or Force(ω, l) ⊨ ForgetLit(Σ, L)
  if and only if Force(ω, L') ⊨ ForgetLit(Σ, L) for some L' ⊆ {l}.
Using the induction hypothesis, we can express the set of models of ForgetLit(Σ, L) as the interpretations ω' such that Force(ω', L_1) ⊨ Σ, where L_1 is any subset of L. As a result,
  ω ⊨ ForgetLit(Σ, L ∪ {l})
  if and only if Force(ω, L') = ω' and Force(ω', L_1) ⊨ Σ, where L' ⊆ {l} and L_1 ⊆ L
  if and only if Force(ω, L'_1) ⊨ Σ for some L'_1 ⊆ L ∪ {l}.
As a result, the models of ForgetLit(Σ, L) are the interpretations that can be mapped into models of Σ by forcing a subset of the literals in L to become true. □
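For instance, take Σ = a ∧ b and l = a: Mod(Σ) contains the single world in which a and b are both true, and forcing ¬a in it yields the world in which a is false and b is true. Hence Mod(ForgetLit(Σ, {a})) consists exactly of the worlds in which b is true, i.e., ForgetLit(Σ, {a}) ≡ b; equivalently, these are the worlds ω such that Force(ω, L_1) ⊨ a ∧ b for some L_1 ⊆ {a}.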
Proposition 16  Let Σ be a formula from PROP_PS and L ⊆ L_PS. ForgetLit(Σ, L) is the logically strongest consequence of Σ that is Lit-independent from L (up to logical equivalence).

Proof: By induction on |L|. The base case |L| = 0 is trivial. Let us now assume that the proposition holds for every |L| ≤ k and show that it remains true when |L| = k + 1.
Let L = L' ∪ {l}. By the induction hypothesis, we can assume that ForgetLit(Σ, L') is the most general consequence of Σ that is Lit-independent from L'. For the sake of simplicity, let Σ' denote this formula. It remains to show that ForgetLit(Σ, L) = ForgetLit(ForgetLit(Σ, L'), {l}) = ForgetLit(Σ', {l}) is the most general consequence of Σ' that is Lit-independent from l. Two cases are to be considered:

* l = x. ForgetLit(Σ', x) is Lit-independent from x if and only if ForgetLit(Σ', x) ⊨ (ForgetLit(Σ', x))_{x←0} holds. We have

    (Σ'_{x←1} ∨ (¬x ∧ Σ'))_{x←0} ≡ Σ'_{x←1} ∨ Σ'_{x←0}.

  This formula clearly is a logical consequence of ForgetLit(Σ', {x}). Hence ForgetLit(Σ', x) is Lit-independent from x. ForgetLit(Σ', x) is a logical consequence of Σ'. Indeed, for every Σ' ∈ PROP_PS and x ∈ PS, we have Σ' ≡ (x ∧ Σ'_{x←1}) ∨ (¬x ∧ Σ'_{x←0}). It remains to show that every logical consequence Φ of Σ' that is Lit-independent from x is a logical consequence of ForgetLit(Σ', x). Let Φ be any formula s.t. Σ' ⊨ Φ and Φ ⊨ Φ_{x←0} hold. Since Σ' ≡ (x ∧ Σ'_{x←1}) ∨ (¬x ∧ Σ'_{x←0}) holds, we have Σ' ⊨ Φ if and only if both (x ∧ Σ'_{x←1}) ⊨ Φ and (¬x ∧ Σ'_{x←0}) ⊨ Φ hold. Thus, in order to show that ForgetLit(Σ', x) ⊨ Φ holds, it is sufficient to prove that Σ'_{x←1} ⊨ Φ holds. Suppose this is not the case. Then there exists an implicant γ of Σ'_{x←1} that is not an implicant of Φ. However, since (x ∧ Σ'_{x←1}) ⊨ Φ holds, we know that x ∧ γ is an implicant of Φ. Since Φ is equivalent to the conjunction of its prime implicates, there necessarily exists a prime implicate π of Φ s.t. (x ∧ γ) ⊨ π and γ ⊭ π hold. This imposes that x belongs to π but no literal of γ belongs to π. By construction, π_{x←0} is an implicate of Φ_{x←0}, and π_{x←0} is strictly stronger than π. Since Φ ⊨ Φ_{x←0} holds, Φ ⊨ π_{x←0} holds as well. This contradicts the fact that π is a prime implicate of Φ.

* l = ¬x. The demonstration is similar, mutatis mutandis.
□
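For instance, with Σ = (a ∨ b) ∧ (¬a ∨ c) and L = {a}: ForgetLit(Σ, {a}) = Σ_{a←1} ∨ (¬a ∧ Σ) ≡ c ∨ (¬a ∧ b). This formula no longer mentions the literal a in its NNF, it is entailed by Σ, and every consequence of Σ that is Lit-independent from a, such as b ∨ c (obtained by resolving the two clauses of Σ on a), is entailed by it.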
Proposition 17  Let Σ, Φ be two formulas from PROP_PS and L ⊆ L_PS.

  ForgetLit(Σ ∨ Φ, L) ≡ ForgetLit(Σ, L) ∨ ForgetLit(Φ, L).

Proof: The claim can be easily proved from Proposition 15. Indeed, the models of ForgetLit(Σ ∨ Φ, L) are the interpretations ω such that Force(ω, L_1) ⊨ Σ ∨ Φ for some L_1 ⊆ L. Now, Force(ω, L_1) ⊨ Σ ∨ Φ holds if and only if Force(ω, L_1) ⊨ Σ holds or Force(ω, L_1) ⊨ Φ holds. On the other hand, the models of ForgetLit(Σ, L) ∨ ForgetLit(Φ, L) are those ω such that Force(ω, L_1) ⊨ Σ or Force(ω, L_1) ⊨ Φ for some L_1 ⊆ L. This is equivalent to the above condition. □
Proposition 18  Let γ be a consistent term from PROP_PS and L ⊆ L_PS.

  ForgetLit(γ, L) ≡ ⋀_{l ∈ γ s.t. l ∉ L} l.

Proof: We first prove the following lemma:

Lemma 1  For any consistent term γ, we have ForgetLit(γ, l) ≡ γ \ {l}.

Proof (lemma):
  case 1: l ∈ γ, i.e., γ = l ∧ γ'. ForgetLit(γ, l) ≡ γ_{l←1} ∨ (¬l ∧ γ) ≡ γ' ≡ γ \ {l}.
  case 2: ¬l ∈ γ, i.e., γ = ¬l ∧ γ'. ForgetLit(γ, l) ≡ ⊥ ∨ γ ≡ γ ≡ γ \ {l}.
  case 3: l ∉ γ and ¬l ∉ γ. ForgetLit(γ, l) ≡ γ ∨ (¬l ∧ γ) ≡ γ ≡ γ \ {l}.
□

A straightforward induction on L completes the proof. □
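For instance, for the consistent term γ = a ∧ ¬b ∧ c and L = {a, b}: the only literal of γ that belongs to L is a (note that ¬b ∉ L, since L contains the positive literal b only), so ForgetLit(γ, L) ≡ ¬b ∧ c.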
Proposition 19  Let Σ be a formula from PROP_PS and L ⊆ L_PS.

  IP(ForgetLit(Σ, L)) = {δ | δ ∈ IP(Σ) and Lit(δ) ∩ L = ∅}.

Proof:

* ⊆: Let δ ∈ IP(ForgetLit(Σ, L)). Since Σ ⊨ ForgetLit(Σ, L) and ForgetLit(Σ, L) ⊨ δ hold, δ is an implicate of Σ. Hence, there exists a prime implicate δ' of Σ s.t. δ' ⊨ δ holds. Since δ does not contain any literal from L, this is also the case for δ'. Thus, δ' is a logical consequence of Σ that is Lit-independent from L. As a consequence of Proposition 16, it must be the case that ForgetLit(Σ, L) ⊨ δ'. Hence, there exists a prime implicate δ'' of ForgetLit(Σ, L) s.t. δ'' ⊨ δ' holds. This implies that δ'' ⊨ δ holds as well, and since both clauses are prime implicates of the same formula, they are equivalent. But this implies that δ ≡ δ' holds, which completes the proof.

* ⊇: Let δ be a prime implicate of Σ that does not contain any literal from L. δ is Lit-independent from L. Since δ is an implicate of Σ, it must also be an implicate of ForgetLit(Σ, L). Subsequently, there exists a prime implicate δ' of ForgetLit(Σ, L) s.t. δ' ⊨ δ holds. Since Σ ⊨ ForgetLit(Σ, L) and ForgetLit(Σ, L) ⊨ δ' both hold, we have Σ ⊨ δ' as well. Hence, there exists a prime implicate δ'' of Σ s.t. δ'' ⊨ δ' holds. Therefore, δ'' ⊨ δ holds, and since both clauses are prime implicates of the same formula, they are equivalent. But this implies that δ ≡ δ' holds, which completes the proof.
□
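Continuing the example given after Proposition 16: for Σ = (a ∨ b) ∧ (¬a ∨ c) we have IP(Σ) = {a ∨ b, ¬a ∨ c, b ∨ c}, and the prime implicates that avoid the literal a are ¬a ∨ c and b ∨ c; their conjunction is indeed equivalent to c ∨ (¬a ∧ b), the formula computed there for ForgetLit(Σ, {a}).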
Proposition 20  Let Σ be a formula from PROP_PS and V ⊆ PS. We have

  ForgetVar(Σ, V) ≡ ForgetLit(Σ, L_V).

Proof: By induction on |V|. The proposition trivially holds for |V| = 0. Let us assume it is true whenever |V| = k. Let V = {v_1, ..., v_{k+1}}. By definition, we have

  ForgetVar(Σ, V) = ForgetVar(ForgetVar(Σ, {v_1, ..., v_k}), {v_{k+1}}).

By the induction hypothesis, ForgetVar(Σ, V) is equivalent to ForgetVar(Φ, {v_{k+1}}), where Φ is defined as:

  Φ = ForgetLit(Σ, {x | x ∈ {v_1, ..., v_k}} ∪ {¬x | x ∈ {v_1, ..., v_k}}).

We have ForgetVar(Σ, V) ≡ ForgetVar(Φ, {v_{k+1}}). By definition,

  ForgetVar(Φ, {v_{k+1}}) = Φ_{v_{k+1}←0} ∨ Φ_{v_{k+1}←1}.

We also have

  ForgetLit(Φ, {v_{k+1}, ¬v_{k+1}})
  = ForgetLit(ForgetLit(Φ, {v_{k+1}}), {¬v_{k+1}})
  = ForgetLit((¬v_{k+1} ∧ Φ_{v_{k+1}←0}) ∨ Φ_{v_{k+1}←1}, {¬v_{k+1}})
  = (v_{k+1} ∧ ((¬v_{k+1} ∧ Φ_{v_{k+1}←0}) ∨ Φ_{v_{k+1}←1})_{v_{k+1}←1}) ∨ ((¬v_{k+1} ∧ Φ_{v_{k+1}←0}) ∨ Φ_{v_{k+1}←1})_{v_{k+1}←0}.

This simplifies to (v_{k+1} ∧ Φ_{v_{k+1}←1}) ∨ Φ_{v_{k+1}←0} ∨ Φ_{v_{k+1}←1}, which is also equivalent to Φ_{v_{k+1}←0} ∨ Φ_{v_{k+1}←1}, hence equivalent to ForgetVar(Φ, {v_{k+1}}). Consequently, we have

  ForgetVar(Σ, V) ≡ ForgetLit(ForgetLit(Σ, {x | x ∈ {v_1, ..., v_k}} ∪ {¬x | x ∈ {v_1, ..., v_k}}), {v_{k+1}, ¬v_{k+1}}) ≡ ForgetLit(Σ, L_V).
□

Proposition 21  Let Σ, Φ be two formulas from PROP_PS, and V be a subset of PS. If Σ is Var-independent from V, then ForgetVar(Σ ∧ Φ, V) ≡ Σ ∧ ForgetVar(Φ, V).

Proof: Let us consider the case where V = {v}. By definition, we have ForgetVar(Σ ∧ Φ, {v}) = (Σ ∧ Φ)_{v←0} ∨ (Σ ∧ Φ)_{v←1}. Equivalently, ForgetVar(Σ ∧ Φ, {v}) ≡ (Σ_{v←0} ∧ Φ_{v←0}) ∨ (Σ_{v←1} ∧ Φ_{v←1}). When Σ is Var-independent from v, we have Σ ≡ Σ_{v←0} ≡ Σ_{v←1}. Accordingly, ForgetVar(Σ ∧ Φ, {v}) ≡ Σ ∧ (Φ_{v←0} ∨ Φ_{v←1}) ≡ Σ ∧ ForgetVar(Φ, {v}). A straightforward induction completes the proof. □
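As an illustration of Proposition 20, forget the single variable a in Σ = (a ∨ b) ∧ (¬a ∨ c): ForgetVar(Σ, {a}) = Σ_{a←0} ∨ Σ_{a←1} ≡ b ∨ c, which mentions neither a nor ¬a. Forgetting the two literals a and ¬a one after the other instead first yields c ∨ (¬a ∧ b) (as in the example after Proposition 16) and then b ∨ c again, in accordance with ForgetVar(Σ, {a}) ≡ ForgetLit(Σ, L_{{a}}).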
Proposition 22  Let Σ, Φ be two formulas from PROP_PS, and P, Q, Z be three disjoint sets of variables from PS (such that Var(Σ) ∪ Var(Φ) ⊆ P ∪ Q ∪ Z). It holds:

1. If Φ does not contain any variable from Z:
   CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ
   if and only if
   Σ ⊨ ForgetLit(Σ ∧ Φ, L⁻_P ∪ L_Z).

2. In the general case:
   CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ
   if and only if
   Σ ⊨ ForgetLit(Σ ∧ ¬ForgetLit(Σ ∧ ¬Φ, L_Z ∪ L⁻_P), L_Z ∪ L⁻_P),

where CIRC is circumscription as defined in (McCarthy, 1986).

Proof:
1. This is a consequence of Theorem 2.5 from (Przymusinski, 1989). This theorem states that if Φ does not contain literals from Z, then CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if there is no clause γ s.t. γ does not contain any literal from L⁻_P ∪ L_Z, Σ ⊨ ¬Φ ∨ γ, but Σ ⊭ γ. This is equivalent to stating that, if Φ does not contain literals from Z, then CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if, for every clause γ containing only literals from L⁺_P ∪ L_Q, we have Σ ∧ Φ ⊨ γ if and only if Σ ⊨ γ. It is easy to see that the equivalence is preserved when any formula γ that is Lit-independent from L⁻_P ∪ L_Z is considered (any formula can be turned into an equivalent CNF formula). Thus, Theorem 2.5 can be rephrased in forgetting terms: if Φ does not contain variables from Z, then CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if Σ ∧ Φ ≡_{L⁺_P ∪ L_Q} Σ, if and only if Σ ⊨ ForgetLit(Σ ∧ Φ, L⁻_P ∪ L_Z).

2. This is a consequence of Theorem 2.6 from (Przymusinski, 1989). This theorem states that CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if Σ ⊨ Φ, or there exists a formula Ψ s.t. Ψ does not contain any literal from L⁻_P ∪ L_Z, Σ ⊨ Φ ∨ Ψ holds, and CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬Ψ. It is easy to see that such a formula Ψ exists if and only if the conjunction α of all the formulas Ψ such that Ψ does not contain any literal from L⁻_P ∪ L_Z and Σ ⊨ Φ ∨ Ψ holds is s.t. CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬α. Since Σ ⊨ Φ ∨ Ψ holds if and only if Σ ∧ ¬Φ ⊨ Ψ holds, α is equivalent to ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z). Thus, CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if Σ ⊨ Φ holds or CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z) holds. In the case where Σ ⊨ Φ holds, Σ ∧ ¬Φ is inconsistent, and so is ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z). Thus, if Σ ⊨ Φ holds, CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z) holds as well. Accordingly, CIRC(Σ, ⟨P, Q, Z⟩) ⊨ Φ holds if and only if CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z) holds. Since ForgetLit(Σ ∧ ¬Φ, L⁻_P ∪ L_Z) does not contain any literal from Z, the point just above enables concluding the proof.
□
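A small sanity check on item 1 (an added illustration, with P = {a, b} and Q = Z = ∅): take Σ = a ∨ b. Its minimal models are those in which exactly one of a, b holds, so CIRC(Σ, ⟨P, Q, Z⟩) ⊨ ¬a ∨ ¬b. On the forgetting side, with Φ = ¬a ∨ ¬b we get Σ ∧ Φ ≡ (a ∨ b) ∧ (¬a ∨ ¬b), and forcing a subset of L⁻_P = {¬a, ¬b} maps every model of Σ onto a model of Σ ∧ Φ, so ForgetLit(Σ ∧ Φ, L⁻_P) ≡ Σ and the entailment Σ ⊨ ForgetLit(Σ ∧ Φ, L⁻_P) holds. Conversely, for Φ = ¬a, ForgetLit(Σ ∧ ¬a, L⁻_P) ≡ b, which is not entailed by Σ, matching the fact that CIRC(Σ, ⟨P, Q, Z⟩) ⊭ ¬a.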
Proposition 23  Let Σ be a formula from PROP_PS and let L be a subset of L_PS. In the general case, there is no propositional formula Φ equivalent to ForgetLit(Σ, L) s.t. the size of Φ is polynomially bounded in |Σ| + |L|, unless NP ∩ coNP ⊆ P/poly.

Proof: The justification is twofold:
(1) ForgetLit(Σ, L) is the logically strongest consequence of Σ that is Lit-independent from L. Consequently, for every formula γ ∈ PROP_PS, we have Σ ⊨ γ if and only if ForgetLit(Σ, L) ⊨ γ, where L = Lit(Σ) \ Lit(γ). Because ForgetLit(Σ, L) depends only on literals of Lit(Σ) ∩ Lit(γ), it is an interpolant of Σ and γ, i.e., a formula φ s.t. Σ ⊨ φ and φ ⊨ γ hold; thus we have Σ ⊨ ForgetLit(Σ, Lit(Σ) \ Lit(γ)) ⊨ γ.
(2) The existence of a propositional formula, interpolant of Σ and γ, of size polynomially bounded in |Σ| + |γ| would imply that NP ∩ coNP ⊆ P/poly (Boppana & Sipser, 1990), which is considered very unlikely in complexity theory. □
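For instance, with Σ = a ∧ b and γ = b ∨ c, we have Lit(Σ) \ Lit(γ) = {a}, and ForgetLit(Σ, {a}) ≡ b is indeed an interpolant: a ∧ b ⊨ b and b ⊨ b ∨ c, and b mentions only literals shared by Σ and γ.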
Proposition 24  Lit-equivalence and Var-equivalence are Π₂ᵖ-complete.

Proof:

* Var-equivalence.
  - Membership: In order to decide the complementary problem, it is sufficient to guess a clause γ built up from V and to check that (Σ ⊨ γ and Φ ⊭ γ) or (Σ ⊭ γ and Φ ⊨ γ). The check step can be easily accomplished in polynomial time when an NP oracle is available. Hence, the complementary problem belongs to Σ₂ᵖ.
  - Hardness: Let M be the mapping that associates the triple ⟨Σ, true, A⟩ to the quantified boolean formula ∀A∃B Σ (where {A, B} is a partition of Var(Σ)). Clearly enough, M is polytime. Moreover, we have:

      ∀A∃B Σ is valid  if and only if  ⊨ ∃B Σ  if and only if  ⊨ ForgetVar(Σ, B)  if and only if  Σ ≡_A true.

    Since the validity problem for 2-qbf formulas is Π₂ᵖ-complete, this proves the Π₂ᵖ-hardness of Var-equivalence.

* Lit-equivalence.
  - Membership: See the membership proof above, replacing "built up from V" by "s.t. Lit(γ) ⊆ L".
  - Hardness: Let M be the mapping that associates ⟨Σ, Φ, V⟩ to ⟨Σ, Φ, L_V⟩. Clearly enough, M(⟨Σ, Φ, V⟩) can be computed in time polynomial in |⟨Σ, Φ, V⟩|. We have shown that Σ and Φ are Var-equivalent given V if and only if Σ and Φ are Lit-equivalent given L_V. Hence, M is a polynomial many-one reduction from Var-equivalence to Lit-equivalence. Since Var-equivalence is Π₂ᵖ-hard, this is also the case for Lit-equivalence.
□

Proposition 25  Let Σ be a formula from PROP_PS and V a subset of PS. Σ is influenceable from V if and only if Σ is Var-dependent on V.

Proof: Proposition 4 from (Boutilier, 1994) states that Σ is influenceable from V if and only if there exists a prime implicant of Σ that contains a variable from V, where V is the set of controllable variables. Proposition 9 completes the proof. □

Proposition 26  Let Σ be a formula from PROP_PS and V a subset of PS. Σ is relevant to V if and only if Σ is Var-dependent on V.

Proof: The proof is trivial from Proposition 9. □

Proposition 27  Let Σ be a formula from PROP_PS and V a subset of PS. Σ is strictly relevant to V if and only if Σ is Var-dependent on V and Var-independent from Var(Σ) \ V.

Proof: Easy from the definition of strict relevance, plus Proposition 9 (Σ is Var-dependent on every variable occurring in a prime implicate of Σ and Var-independent from all the remaining variables). □

Proposition 28
(1) Strict relevance of a formula to a subject matter (Lakemeyer, 1995) is Π₂ᵖ-complete.
(2) Strict relevance of a formula to a subject matter (Lakemeyer, 1997) is BH₂-complete.

Proof:

* Strict relevance (Lakemeyer, 1995).
  - Membership. Let us consider the complementary problem. Guess a clause γ, and check that it does not contain any variable from V (this can be achieved in time polynomial in |γ| + |V|, hence in time polynomial in |Σ| + |V|, since no prime implicate of Σ can include a variable that does not occur in Σ). Then check that it is an implicate of Σ (one call to an NP oracle) and check that every subclause of γ obtained by removing from it one of its k literals is not an implicate of Σ (k calls to an NP oracle). Since only k + 1 calls to such an oracle are required to check that γ is a prime implicate of Σ, the complementary problem of strict relevance belongs to Σ₂ᵖ. Hence, strict relevance belongs to Π₂ᵖ.
  - Hardness. Let {A, B} be a partition of Var(Σ) (for any formula Σ). ∀A∃B Σ is valid if and only if every prime implicate of Σ that contains a variable from A also contains a variable from B, if and only if every prime implicate of Σ contains a variable from B (since Var(Σ) = A ∪ B), if and only if Σ is strictly relevant to B.

* Strict relevance (Lakemeyer, 1997).
  - Membership: Straightforward from Propositions 27 and 10.
  - Hardness: By exhibiting a polynomial reduction from sat-unsat to strict relevance of a formula to a subject matter. To any pair ⟨φ, ψ⟩ of propositional formulas, let rename(ψ) be a formula obtained from ψ by renaming its variables. Obviously, (i) rename(ψ) is satisfiable if and only if ψ is. Now, let new be a new variable, and let

      Σ = φ ∧ new ∧ ¬rename(ψ)

    and V = Var(φ) ∪ {new}. By Proposition 8, Σ is Var-dependent on V if and only if there is a prime implicant of Σ mentioning a variable from V, i.e., if and only if φ ∧ ¬rename(ψ) is satisfiable; thus, using (i): (ii) Σ is Var-dependent on V if and only if both φ and ¬ψ are satisfiable.
    Then, again after Proposition 8, Σ is Var-independent from Var(Σ) \ V = Var(rename(ψ)) if and only if no prime implicant of Σ mentions a variable from Var(rename(ψ)), i.e., if and only if rename(ψ) is unsatisfiable; thus, using (i): (iii) Σ is Var-independent from Var(Σ) \ V if and only if ψ is unsatisfiable.
    Thus, from Proposition 27, (ii) and (iii), we get that Σ is strictly relevant to V if and only if φ is satisfiable and rename(ψ) is not.
□
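For instance (a direct instance of the reduction above), take φ = p (satisfiable) and ψ = q ∧ ¬q (unsatisfiable), with rename(ψ) = q' ∧ ¬q'. Then Σ = p ∧ new ∧ ¬(q' ∧ ¬q') ≡ p ∧ new and V = {p, new}: the prime implicant p ∧ new mentions a variable of V but no variable of rename(ψ), so Σ is strictly relevant to V, as expected from a positive sat-unsat instance.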
Appendix B: Glossary

Here is a small glossary of useful terms, with the place where their definition can be found.

  PROP_V          propositional language generated by V (Section 2.1)
  L_V             set of literals built up from V (Section 2.1)
  L⁺_V            set of positive literals built up from V (Section 2.1)
  L⁻_V            set of negative literals built up from V (Section 2.1)
  Var(Σ)          set of propositional variables appearing in Σ (Section 2.1)
  NNF             negation normal form (Section 2.1)
  Lit(Σ)          set of literals occurring in the NNF of Σ (Section 2.1)
  ω_V             a V-world (instantiation of all variables of V) (Section 2.1)
  Ω_V             set of all V-worlds (Section 2.1)
  ω               world (full instantiation) (Section 2.1)
  Mod(Σ)          set of models of Σ (Section 2.1)
  for(S)          formula such that Mod(for(S)) = S (Section 2.1)
  Σ_{x←0}         Σ conditioned on x set to false (Section 2.1)
  Σ_{x←1}         Σ conditioned on x set to true (Section 2.1)
  Σ_{l←1}         Σ conditioned on the literal l set to true (Section 2.1)
  Force(ω, l)     world obtained from ω by forcing l to be true (Section 2.1)
  IP(Σ)           set of prime implicates of Σ (Section 2.1)
  PI(Σ)           set of prime implicants of Σ (Section 2.1)
  BH₂, coBH₂      (Section 2.2)
  Δ₂ᵖ, Σ₂ᵖ, Π₂ᵖ   (Section 2.2)
  syntactical Lit-dependence (Definition 1)
  syntactical Var-dependence (Definition 2)
  l ↦ Σ           (semantical) Lit-dependence (Definition 3)
  DepLit(Σ)       literals l such that l ↦ Σ (Definition 3)
  v ↦⁺⁻ Σ         (semantical) Var-dependence (Definition 4)
  DepVar(Σ)       variables v such that v ↦⁺⁻ Σ (Definition 4)
  Lit-simplified  (Definition 6)
  Var-simplified  (Definition 6)
  ForgetLit(Σ, L) literal forgetting (Definition 7)
  ForgetVar(Σ, V) variable forgetting (Definition 8)
  Σ ≡_L Φ         Lit-equivalence (Definition 9)
  Σ ≡_V Φ         Var-equivalence (Definition 9)
  influenceability (Definition 10)
  relevance to a subject matter (Definition 11)
  strict relevance to a subject matter (Definition 12)
References

Amir, E., & McIlraith, S. (2000). Partition-based logical reasoning. In Proceedings of the Seventh International Conference on Principles of Knowledge Representation and Reasoning (KR'00), pp. 389-400.

Biere, A., Cimatti, A., Clarke, E. M., Fujita, M., & Zhu, Y. (1999). Symbolic model checking using SAT procedures instead of BDDs. In Proceedings of the Design Automation Conference (DAC'99).

Boppana, R. B., & Sipser, M. (1990). The complexity of finite functions. In van Leeuwen, J. (Ed.), Handbook of Theoretical Computer Science, Vol. A, chap. 14. Elsevier Science Publishers (North-Holland), Amsterdam.

Boutilier, C. (1994). Toward a logic for qualitative decision theory. In Proceedings of the Fourth International Conference on the Principles of Knowledge Representation and Reasoning (KR'94), pp. 75-86.

Chopra, S., & Parikh, R. (1999). An inconsistency tolerant model for belief representation and belief revision. In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (IJCAI'99), pp. 192-197.

Darwiche, A. (1997). A logical notion of conditional independence: properties and applications. Artificial Intelligence, 97(1-2), 45-82.

Darwiche, A. (1998). Model-based diagnosis using structured system descriptions. Journal of Artificial Intelligence Research, 8, 165-222.

Darwiche, A. (1999). Compiling knowledge into decomposable negation normal form. In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (IJCAI'99), pp. 284-289.

Darwiche, A., & Marquis, P. (1999). A perspective on knowledge compilation. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI'01), pp. 175-182.

Davis, M., & Putnam, H. (1960). A computing procedure for quantification theory. Journal of the ACM, 7, 201-215.

Dechter, R., & Rish, I. (1994). Directional resolution: The Davis-Putnam procedure, revisited. In Proceedings of the Fourth International Conference on the Principles of Knowledge Representation and Reasoning (KR'94), pp. 134-145.

del Val, A. (1999). A new method for consequence finding and compilation in restricted languages. In Proceedings of the Sixteenth National Conference on Artificial Intelligence (AAAI'99), pp. 259-264, Orlando (FL).

Doherty, P., Lukaszewicz, W., & Madalinska-Bugaj, E. (1998). The PMA and relativizing change for action update. In Proceedings of the Sixth International Conference on Principles of Knowledge Representation and Reasoning (KR'98), pp. 258-269.

Doherty, P., Lukaszewicz, W., & Szalas, A. (2001). Computing strongest necessary and weakest sufficient conditions of first-order formulas. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI'01), pp. 145-151.

Fargier, H., Lang, J., & Marquis, P. (2000). Propositional logic and one-stage decision making. In Proceedings of the Seventh International Conference on Principles of Knowledge Representation and Reasoning (KR'00), pp. 445-456.

Fariñas del Cerro, L., & Herzig, A. (1996). Belief change and dependence. In Proceedings of the Sixth Conference on Theoretical Aspects of Reasoning about Knowledge (TARK'96), pp. 147-161.

Gelfond, M., & Lifschitz, V. (1993). Representing action and change by logic programs. Journal of Logic Programming, 17, 301-323.

Ghidini, C., & Giunchiglia, F. (2001). Local model semantics, or contextual reasoning = locality + compatibility. Artificial Intelligence, 127, 221-259.

Goldblatt, R. (1987). Logics of Time and Computation, Vol. 7 of CSLI Lecture Notes. Center for the Study of Language and Information, Stanford, CA.

Greiner, R., & Genesereth, M. R. (1983). What's new? A semantic definition of novelty. In Proceedings of the Eighth International Joint Conference on Artificial Intelligence (IJCAI'83), pp. 450-454.

Greiner, R., & Subramanian, D. (1995). Proceedings of the AAAI Fall Symposium on Relevance. Technical Report FS-94-02, AAAI Press.

Hegner, S. (1987). Specification and implementation of programs for updating incomplete information databases. In Proceedings of the Sixth ACM SIGACT-SIGMOD-SIGART Symposium on Principles of Database Systems (PODS'87), pp. 146-158.

Herzig, A. (1996). The PMA revisited. In Proceedings of the Fifth International Conference on the Principles of Knowledge Representation and Reasoning (KR'96), pp. 40-50.

Herzig, A., Lang, J., Marquis, P., & Polacsek, T. (2001). Updates, actions, and planning. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI'01), pp. 119-124.

Herzig, A., & Rifi, O. (1998). Update operations: a review. In Proceedings of the Thirteenth European Conference on Artificial Intelligence (ECAI'98), pp. 13-17.

Herzig, A., & Rifi, O. (1999). Propositional belief base update and minimal change. Artificial Intelligence, 115(1), 107-138.

Inoue, K. (1992). Linear resolution for consequence finding. Artificial Intelligence, 56(2-3), 301-353.

Kautz, H., Kearns, M., & Selman, B. (1993). Reasoning with characteristic models. In Proceedings of the Eleventh National Conference on Artificial Intelligence (AAAI'93), pp. 34-39.

Kautz, H., Kearns, M., & Selman, B. (1995). Horn approximations of empirical data. Artificial Intelligence, 74(1), 129-145.

Kautz, H., McAllester, D., & Selman, B. (1996). Encoding plans in propositional logic. In Proceedings of the Fifth International Conference on the Principles of Knowledge Representation and Reasoning (KR'96), pp. 374-384.

Khardon, R., & Roth, D. (1996). Reasoning with models. Artificial Intelligence, 87(1-2), 187-213.

Kohlas, J., Moral, S., & Haenni, R. (1999). Propositional information systems. Journal of Logic and Computation, 9(5), 651-681.

Lakemeyer, G. (1995). A logical account of relevance. In Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI'95), pp. 853-859.

Lakemeyer, G. (1997). Relevance from an epistemic perspective. Artificial Intelligence, 97(1-2), 137-167.

Lang, J., Liberatore, P., & Marquis, P. (2002). Conditional independence in propositional logic. Artificial Intelligence, 141(1-2), 79-121.

Lang, J., & Marquis, P. (1998a). Complexity results for independence and definability in propositional logic. In Proceedings of the Sixth International Conference on Principles of Knowledge Representation and Reasoning (KR'98), pp. 356-367.

Lang, J., & Marquis, P. (1998b). Two notions of dependence in propositional logic: controllability and definability. In Proceedings of the Fifteenth National Conference on Artificial Intelligence (AAAI'98), pp. 268-273.

Lang, J., & Marquis, P. (2002). Resolving inconsistencies by variable forgetting. In Proceedings of the Eighth International Conference on Principles of Knowledge Representation and Reasoning (KR'02), pp. 239-250.

Lang, J., Marquis, P., & Williams, M.-A. (2001). Updating epistemic states. In Proceedings of the Fourteenth Australian Joint Conference on Artificial Intelligence (AI'01), pp. 297-308.

Levy, A., Fikes, R., & Sagiv, Y. (1997). Speeding up inferences using relevance reasoning: A formalism and algorithms. Artificial Intelligence, 97(1-2), 83-136.

Lin, F. (2000). On strongest necessary and weakest sufficient conditions. In Proceedings of the Seventh International Conference on Principles of Knowledge Representation and Reasoning (KR'00), pp. 167-175.

Lin, F., & Reiter, R. (1994). Forget it! In Proceedings of the AAAI Fall Symposium on Relevance, pp. 154-159.

Marquis, P. (1994). Possible models approach via independency. In Proceedings of the Eleventh European Conference on Artificial Intelligence (ECAI'94), pp. 336-340.

Marquis, P. (2000). Consequence finding algorithms. In Handbook on Defeasible Reasoning and Uncertainty Management Systems, Volume 5: Algorithms for Uncertain and Defeasible Reasoning, chap. 2, pp. 41-145. Kluwer Academic Publishers.

Marquis, P., & Porquet, N. (2000). Decomposing propositional knowledge bases through topics (extended abstract). In Proceedings of the meeting on "Partial knowledge and uncertainty: independence, conditioning, inference".

McCarthy, J. (1986). Applications of circumscription to formalizing common-sense knowledge. Artificial Intelligence, 28, 89-116.

McIlraith, S., & Amir, E. (2001). Theorem proving with structured theories. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI'01), pp. 624-631.

Papadimitriou, C. H. (1994). Computational Complexity. Addison-Wesley.

Parikh, R. (1996). Beliefs, belief revision, and splitting languages, pp. 266-268. Logic, Language and Computation. CSLI Publications.

Park, T. J., & Gelder, A. V. (1996). Partitioning methods for satisfiability testing on large formulas. In Proceedings of the Thirteenth International Conference on Automated Deduction (CADE'96), pp. 748-762.

Przymusinski, T. C. (1989). An algorithm to compute circumscription. Artificial Intelligence, 38, 49-73.

Reiter, R. (1987). A theory of diagnosis from first principles. Artificial Intelligence, 32, 57-95.

Rintanen, J. (2001). Partial implicit unfolding in the Davis-Putnam procedure for quantified Boolean formulae. In Proceedings of the QBF2001 Workshop at IJCAR'01, pp. 84-93.

Ryan, M. D. (1991). Defaults and revision in structured theories. In Proceedings of the Sixth IEEE Symposium on Logic in Computer Science (LICS'91), pp. 362-373.

Ryan, M. D. (1992). Ordered presentations of theories. Ph.D. thesis, Imperial College, London.

Sandewall, E. (1995). Features and Fluents. Oxford University Press.

Siegel, P. (1987). Représentation et utilisation des connaissances en calcul propositionnel. Thèse d'État, Université d'Aix-Marseille 2. (In French.)

Simon, L., & del Val, A. (2001). Efficient consequence finding. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI'01), pp. 359-365.

Subramanian, D., Greiner, R., & Pearl, J. (1997). Artificial Intelligence Journal: Special Issue on Relevance, 97(1-2).

Tan, S., & Pearl, J. (1994). Specification and evaluation of preferences for planning under uncertainty. In Proceedings of the Fourth International Conference on the Principles of Knowledge Representation and Reasoning (KR'94).

Williams, P. F., Biere, A., Clarke, E. M., & Gupta, A. (2000). Combining decision diagrams and SAT procedures for efficient symbolic model checking. In Proceedings of the Twelfth International Conference on Computer Aided Verification (CAV'00).

Winslett, M. (1990). Updating Logical Databases. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press.