ALGORITHMS
In mathematics, an algorithm is a procedure for solving a problem by repeatedly applying a simpler computational step. A basic example is long division in arithmetic. The term algorithm is now applied to many kinds of problem solving that employ a mechanical sequence of steps, as in setting up a computer program. The sequence may be displayed in the form of a flowchart to make it easier to follow.
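The long-division example above can be sketched in code. This is an illustrative Python version (not from the original article) that carries out the same digit-by-digit repetition of a simple step:

```python
def long_division(dividend, divisor):
    """Long division: bring down one digit at a time, divide, keep the remainder."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)   # bring down the next digit
        quotient_digits.append(str(remainder // divisor))
        remainder %= divisor                      # carry the remainder forward
    return int("".join(quotient_digits)), remainder

print(long_division(1234, 7))  # → (176, 2), since 176 * 7 + 2 = 1234
```

Each pass through the loop is the same simple computation, which is exactly what makes the procedure an algorithm.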
As with algorithms used in arithmetic, algorithms for computers can range from simple to highly complex. In all cases, however, the task that the algorithm is to accomplish must be definable. That is, the definition may involve mathematical or logical terms, or a body of data or written instructions, but the task itself must be one that can be stated in some way. In terms of ordinary computer usage, this means that algorithms must be programmable, even if the tasks themselves turn out to have no solution.
In computational devices with built-in microprocessor logic, this logic is a form of algorithm. As computers increase in complexity, more and more software algorithms are taking the form of what is called hardware. That is, they are increasingly becoming part of the basic circuitry of computers, or are permanently attached adjuncts, as well as standing alone in special devices such as office payroll machines. Many different applications algorithms are now available, and highly advanced systems such as artificial-intelligence algorithms may become common in the future.
ARTIFICIAL INTELLIGENCE
Artificial Intelligence (AI), a term that in its broadest sense indicates the ability of an artifact to perform the same kinds of functions that characterize human thought. The possibility of developing such an artifact has intrigued human beings since ancient times. With the growth of modern science, the search for AI has taken two major directions: psychological and physiological research into the nature of human thought, and the technological development of increasingly sophisticated computing systems.
In the latter sense, the term AI has been applied to computer systems and programs capable of performing tasks more complex than straightforward programming, although still far from the realm of actual thought. The most important fields of research in this area are information processing, pattern recognition, game-playing computers, and applied fields such as medical diagnosis. Current research in information processing deals with programs that enable a computer to understand written or spoken language and to produce summaries, answer specific questions, or redistribute information to users interested in particular areas of it. Essential to such programs is the ability to generate grammatically correct sentences and to establish linkages between words, ideas, and associated ideas. Research has shown that whereas the logic of language structure, its syntax, submits to programming, the problem of meaning, or semantics, lies far deeper, in the direction of true AI.
In medicine, programs have been developed that analyze the symptoms, medical history, and laboratory test results of a patient, and then suggest a diagnosis to the physician. The diagnostic program is an example of so-called expert systems, programs designed to perform tasks in specialized areas as a human would. Expert systems take computers a step beyond straightforward programming, being based on a technique called rule-based inference, in which pre-established rule systems are used to process the data. Despite their sophistication, such systems still do not approach the complexity of true intelligent thought.
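Rule-based inference can be illustrated with a minimal forward-chaining sketch. The rules and the medical framing below are invented for illustration, not taken from any real diagnostic system:

```python
# Each rule: if all conditions are known facts, its conclusion becomes a fact.
# These hypothetical rules only illustrate the mechanism of rule-based inference.
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "short_of_breath"}, "recommend_chest_xray"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are all satisfied, until no rule adds a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"fever", "cough", "short_of_breath"}, rules))
```

Note that the second rule fires only after the first has added "flu_suspected" to the fact base; this chaining of pre-established rules is the inference step the paragraph describes.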
Many scientists remain doubtful that true AI can ever be developed. The operation of the human mind is still little understood, and computer design may remain fundamentally incapable of duplicating those unknown, complex processes. Various routes are being pursued in the effort to reach the goal of true AI. One approach is to apply the concept of parallel processing, interlinked and concurrent computer operations. Another is to create networks of experimental computer chips, called silicon neurons, that mimic the data-processing functions of brain cells. Using analog technology, the transistors in these chips emulate nerve-cell membranes in order to operate at the speed of neurons.
LINEAR PROGRAMMING
Linear Programming, a mathematical and operations-research technique, used in administrative and economic planning to maximize linear functions of a large number of variables, subject to certain constraints. The development of high-speed electronic computers and data-processing techniques has brought about many recent advances in linear programming, and the technique is now widely used in industrial and military operations.
Linear programming is basically used to find a set of values, chosen from a prescribed set of numbers, that will maximize or minimize a given linear form. This can be illustrated by a manufacturing example in which the manufacturer knows that as many articles as are produced can be sold.
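A small production problem makes the idea concrete. The numbers below are a standard textbook-style example chosen for illustration (profit of 3 and 5 per unit of two products, with three resource limits), not figures from the article. For a linear objective over linear constraints, the optimum lies at a corner of the feasible region, so a two-variable problem can be solved by checking every feasible intersection of constraint boundaries:

```python
from itertools import combinations

# Constraints in the form a*x + b*y <= c (last two encode x >= 0, y >= 0).
# Hypothetical example: maximize profit 3x + 5y.
constraints = [
    (1, 0, 4),    # x <= 4         (limit on product 1)
    (0, 2, 12),   # 2y <= 12       (limit on product 2)
    (3, 2, 18),   # 3x + 2y <= 18  (shared resource limit)
    (-1, 0, 0),   # x >= 0
    (0, -1, 0),   # y >= 0
]

def intersect(c1, c2):
    """Point where both constraint boundaries hold with equality (2x2 system)."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel boundaries, no single intersection
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

# Collect the feasible corner points and pick the one with the largest profit.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best, 3 * best[0] + 5 * best[1])  # optimum at (2, 6), profit 36
```

Enumerating vertices is only practical for tiny problems; real applications with many variables use the simplex method or interior-point solvers, which search the corners far more efficiently.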