State-of-the-art
Contemporary logic meets the challenge of modeling the processing of vague information in a variety of ways. Deductive fuzzy logic, as a degree-based and truth-functional approach, is a prime candidate for a mathematical tool for formalizing reasoning under vagueness. Its adequacy for representing all relevant aspects of vagueness, however, is often disputed. This calls for a broad-minded, interdisciplinary approach that, in particular, also takes into account the lively debate on theories of vagueness in analytic philosophy. However, communication between the relevant communities is rare and mostly confrontational. (The International Prague Colloquium 'Uncertainty: Reasoning about probability and vagueness', organized by some of us in September 2006, has been one of the few events so far that have attempted to bridge the gap by providing a platform for such discussions.) Moreover, the envisaged open and inclusive framework for discussing reasoning under vagueness also calls for references to wider concepts of modeling different types of imperfect information. As an appropriate application scenario we point to aspects of data extraction.
Theories of vagueness
The literature on theories of vagueness, even if restricted to contemporary analytic philosophy, is vast. Besides competing degree based, epistemic, and pragmatic accounts, supervaluationism and contextualism are particularly important theories for our purposes.
- Supervaluationism maintains that vague statements have to be evaluated with respect to all their admissible precisifications. The slogan 'truth is supertruth' expresses the idea that a logically complex statement, built up from vague atomic propositions, is true if and only if it is true in each of its (classical) precisifications. This is often understood as a vindication of classical logic, also in contexts of vagueness. To what extent this position is tenable, how supervaluation is embedded in linguistic practice, and how to model logical inference in this framework are all topics of lively debate.
- Contextualism is related to pragmatic theories and seeks to model the 'open texture' of vague terms as well as conversational scores, which keep track of the semantic decisions made during a conversation.
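The supervaluationist clause described above can be made concrete in a minimal sketch: a statement counts as supertrue if it is classically true under every admissible precisification of its vague atoms. The function names, the single vague atom 'tall', and the trivial admissibility test below are illustrative assumptions, not part of any established library.

```python
from itertools import product

def supertruth(formula, atoms, admissible):
    """Supervaluationist evaluation: a formula is supertrue iff it is
    classically true under every admissible precisification, i.e. every
    admissible total assignment of True/False to the vague atoms."""
    precisifications = [
        dict(zip(atoms, vs))
        for vs in product([True, False], repeat=len(atoms))
        if admissible(dict(zip(atoms, vs)))
    ]
    results = {formula(v) for v in precisifications}
    if results == {True}:
        return "supertrue"
    if results == {False}:
        return "superfalse"
    return "neither"

# One vague atom 'tall' (is Bob tall?); both precisifications admissible.
atoms = ["tall"]
admissible = lambda v: True

# Excluded middle comes out supertrue even though 'tall' itself does not:
print(supertruth(lambda v: v["tall"] or not v["tall"], atoms, admissible))  # supertrue
print(supertruth(lambda v: v["tall"], atoms, admissible))                   # neither
```

This illustrates the penumbral point often credited to supervaluationism: classical tautologies remain (super)true even when their atomic constituents lack a definite truth value.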
Deductive fuzzy logics
By deductive fuzzy logics (or mathematical fuzzy logic) one usually refers to many-valued logical systems related to the formalization of the graded approach to vagueness. The most important systems of deductive fuzzy logic are the so-called t-norm based fuzzy logics. They correspond to [0, 1]-valued calculi in which conjunction and implication are interpreted by a (left-continuous) t-norm and its residuum, respectively. This formerly neglected realm of logic has developed considerably over the past ten years, comprising many different points of view (logical, algebraic, proof-theoretic, model-theoretic, functional representation, and complexity), as witnessed by dozens of widely cited papers and a number of important monographs that have appeared in the literature.
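The interpretation of conjunction by a t-norm and implication by its residuum can be sketched for the three standard continuous t-norms (Łukasiewicz, Gödel, and product). The code below is a numerical illustration only; it checks the residuation law t(x, z) <= y iff z <= r(x, y) on a coarse grid of truth values rather than proving it.

```python
def t_luk(x, y): return max(0.0, x + y - 1.0)    # Łukasiewicz t-norm
def r_luk(x, y): return min(1.0, 1.0 - x + y)    # its residuum

def t_godel(x, y): return min(x, y)              # Gödel (minimum) t-norm
def r_godel(x, y): return 1.0 if x <= y else y   # its residuum

def t_prod(x, y): return x * y                   # product t-norm
def r_prod(x, y): return 1.0 if x <= y else y / x

# Residuation law: t(x, z) <= y  iff  z <= r(x, y), checked on a grid
# (with a small tolerance for floating-point rounding).
grid = [i / 10 for i in range(11)]
for t, r in [(t_luk, r_luk), (t_godel, r_godel), (t_prod, r_prod)]:
    assert all((t(x, z) <= y + 1e-9) == (z <= r(x, y) + 1e-9)
               for x in grid for y in grid for z in grid)
print("residuation law holds on the grid for all three pairs")
```

Note how the residuum encodes graded modus ponens: for Łukasiewicz, for instance, r_luk(0.9, 0.7) = 0.8, so the implication is penalized exactly by the amount the consequent falls short of the antecedent.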
Imperfect information
Vagueness can be viewed as just one type of phenomenon that arises when dealing with imperfect information. Two other phenomena of a different nature that can also pervade imperfect information are uncertainty and truthlikeness. Uncertainty appears when agents have to deal with incomplete information states. It usually refers to the notion of belief regarding the truth of a proposition (usually 'crisp', but not necessarily so) and is typically graded. From a logical point of view, uncertainty formalisms (e.g. probabilistic, possibilistic) are captured by intensional, modal-like logics, which are not truth functional. Truthlikeness, probably the least known of these three notions, can be regarded as a special case of the more general concept of similarity. Its logical counterpart is some form of similarity-based reasoning, a concept often associated with reasoning by analogy, which is an important form of non-demonstrative inference. In the degree-based approach to truthlikeness, following Niiniluoto, the truthlikeness value of a sentence is its degree of proximity to the truth, given by an appropriate distance measure between the models of the sentence and the models of what is known about reality.
Knowledge Extraction
The wide spectrum of existing data mining methods brings with it a large variety of formal representations for the knowledge learned from data, such as decision and association rules, classification hierarchies, clusters, regression functions, and probabilistic networks. Extracting knowledge from vague information remains a major challenge.
Probably the oldest rule extraction method is GUHA (General Unary Hypotheses Automaton), elaborated in the 1960s and 1970s by Hájek and his team. The rules that GUHA obtains from data are sentences of observational logic, a Boolean predicate logic with generalized quantifiers. During the last decade, extensions of GUHA to fuzzy logic have also been investigated.
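A flavour of GUHA's generalized quantifiers can be given with the quantifier usually presented as 'founded implication', which evaluates a candidate rule on the four-fold frequency table of the data. The function name and the toy frequencies below are illustrative assumptions; only the acceptance condition follows the standard textbook presentation.

```python
def founded_implication(a, b, p, base):
    """GUHA-style 'founded implication' quantifier on a four-fold table:
    the rule phi => psi is accepted iff at least `base` objects satisfy
    both phi and psi (a >= base) and the conditional frequency
    a / (a + b) reaches the threshold p, where
    a = #(phi & psi) and b = #(phi & not psi)."""
    return a >= base and a / (a + b) >= p

# Toy data: among the objects satisfying phi, 40 also satisfy psi and 8 do not.
print(founded_implication(a=40, b=8, p=0.8, base=30))   # True: 40 >= 30 and 40/48 >= 0.8
print(founded_implication(a=40, b=15, p=0.8, base=30))  # False: 40/55 < 0.8
```

GUHA then searches systematically through candidate pairs of attribute conjunctions, reporting exactly those rules for which the chosen quantifier holds in the data.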
As another target for possible applications we refer to the Lixto tool, which allows the extraction of web data based on interactively specified sample templates. Here, too, the vagueness of data and of the intended specifications poses problems that need to be tackled in future research.