Van-Nam Huynh; Yoshiteru Nakamori; Hiroakira Ono; Jonathan Lawry; Vladik Kreinovich; Hung T. Nguyen. Springer-Verlag Berlin and Heidelberg GmbH & Co. KG (2008). Paperback.
Most successful applications of modern science and engineering, from discovering the human genome to predicting weather to controlling space missions, involve processing large amounts of data and large knowledge bases. The corresponding large-scale data and knowledge processing requires intensive use of computers.

The ability of computers to perform fast data and knowledge processing is based on hardware support for super-fast elementary computer operations, such as performing arithmetic operations with (exactly known) numbers and performing logical operations with binary ("true"-"false") logical values.

In practical applications, we need to go beyond such operations. Let us first illustrate this need on the example of operations with numbers. Hardware-supported computer operations (implicitly) assume that we know the exact values of the input quantities. In reality, the input data usually comes from measurements, and measurements are never 100% accurate. Due to such factors as the imperfection of measurement instruments and the impossibility of reducing the noise level to 0, the measured value $\tilde{x}$ of each input quantity is, in general, different from the (unknown) actual value $x$ of this quantity. It is therefore necessary to find out how this input uncertainty $\Delta x \overset{\text{def}}{=} \tilde{x} - x \neq 0$ affects the results of data processing.
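To make this concrete, here is a minimal sketch (not from the book) of one standard way to track how the error $\Delta x$ propagates through a computation: interval arithmetic, where each measured value $\tilde{x}$ with a known error bound $\Delta$ is replaced by the interval $[\tilde{x} - \Delta, \tilde{x} + \Delta]$ that is guaranteed to contain the actual value $x$. The names `Interval` and `measured` are illustrative, not from the source.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Interval:
    """A range [lo, hi] guaranteed to contain the (unknown) actual value x."""
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        # Sum of two intervals: add the corresponding endpoints.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        # Product: the extremes are attained at endpoint combinations.
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))


def measured(x_tilde: float, delta: float) -> Interval:
    """Enclose the actual value x, given the measured value x~ and an
    error bound delta with |x~ - x| <= delta."""
    return Interval(x_tilde - delta, x_tilde + delta)


# Example: compute y = a*b + c when each input is known only to within +/- 0.1.
a = measured(2.0, 0.1)
b = measured(3.0, 0.1)
c = measured(1.0, 0.1)
print(a * b + c)  # roughly Interval(lo=6.41, hi=7.61)
```

Note that this sketch uses ordinary floating-point arithmetic on the endpoints; a rigorous interval library would additionally round the lower endpoint down and the upper endpoint up, so that the enclosure remains guaranteed despite rounding errors.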