June 22, 2010


So what is unified consciousness? As we said, the predominant form of the unity of consciousness is being aware of several things at the same time. Intuitively, this is the notion of several representations being aspects of a single encompassing conscious state. A more informative idea can be gleaned from the way philosophers have written about unified consciousness. As it emerges from what they have said, the central feature of unified consciousness is taken to be something like this: a group of representations so related to one another that to be conscious of any of them is to be conscious of others of them and of the group of them as a single group.


In order for science to be rigorous, Husserl claimed, the mind must ‘intend’ itself as subject and also all of its ‘means’. The task of philosophy, then, is to substantiate that science is in fact rigorous by clearly distinguishing, naming, and taxonomizing phenomena. What William James termed the stream of consciousness was dubbed by Husserl the system of experience. Recognizing, as James did, that consciousness is continuous, Husserl eventually concluded that any single mental phenomenon is a moving horizon receding in all directions at once toward all other phenomena.

Interestingly enough, this created an epistemological dilemma that became pervasive in the history of postmodern philosophy. The dilemma is this: if the mind ‘intends’ itself as subject and the objects within this mind are moving in all directions toward all other objects, how can any two minds objectively agree that they are referring to the same object? The followers of Husserl concluded that this was not possible; the prospect that two minds can objectively or inter-subjectively know the same truth is therefore annihilated.

It is ironic, to say the least, that Husserl’s attempt to establish a rigorous basis for science in human consciousness served to reinforce Nietzsche’s claim that truths are evolving fictions that exist only in the subjective reality of single individuals. It also massively reinforced the stark Cartesian division between mind and world by seeming to legitimate the view that logic and mathematical systems reside only in human subjectivity and, therefore, that there is no real or necessary correspondence between physical theories and physical reality. These views would later be embraced by Ludwig Wittgenstein and Jean-Paul Sartre.

One of Nietzsche’s fundamental contentions was that traditional values (represented primarily by Christianity) had lost their power in the lives of individuals. He expressed this in his proclamation “God is dead.” He was convinced that traditional values represented a “slave morality,” a morality created by weak and resentful individuals who encouraged such behaviour as gentleness and kindness because the behaviour served their interests.

By way of introducing Nietzsche’s writings, a few salient points are worth making about what establishes Nietzsche as the “great critic” of that tradition, about why his critique is potentially so powerful, and about why it remains so provocative in the immediacy of its topic.

Although we can identify Nietzsche with a decisive challenge to the past, from one point of view there is nothing too remarkably new about what Nietzsche is doing, even if his style of doing it is intriguing and distinctive. The methods of analysis and criticism he undertakes should feel quite familiar, since similar approaches have been taken to questions of representation faced before him, and the new possibility for our lives that he encourages is a program with strong and obvious roots in certain forms of Romanticism. This illustrates how great the burden of tradition is: even as its critic, Nietzsche remains deeply connected to the considerations that create tradition.

Irish philosopher and clergyman George Berkeley set out to challenge what he saw as the atheism and skepticism inherent in the prevailing philosophy of the early 18th century. His initial publications, which asserted that no objects or matter existed outside the human mind, were met with disdain by the London intelligentsia of the day. Berkeley aimed to explain his “Immaterialist” theory, part of the school of thought known as idealism.

The German philosopher Immanuel Kant tried to solve the crisis generated by Locke and brought to a climax by Hume; his proposed solution combined elements of rationalism with elements of empiricism. He agreed with the rationalists that one can have exact and certain knowledge, but he followed the empiricists in holding that such knowledge is more informative about the structure of thought than about the world outside thought. He distinguished three kinds of knowledge: analytic a priori, which is exact and certain but uninformative, because it makes clear only what is contained in definitions; synthetic a posteriori, which conveys information about the world learned from experience, but is subject to the errors of the senses; and synthetic a priori, which is discovered by pure intuition and is both exact and certain, for it expresses the necessary conditions that the mind imposes on all objects of experience. Mathematics and philosophy, according to Kant, provide this last kind of knowledge. Since the time of Kant, one of the most frequently argued questions in philosophy has been whether or not synthetic a priori knowledge really exists.

Because of the diversity of positions associated with existentialism, the term is impossible to define precisely. Certain themes common to nearly all existentialist writers can, however, be identified. The term itself suggests one major theme: the stress on concrete individual existence and, consequently, on subjectivity, individual freedom, and choice.

Most philosophers since Plato have held that the highest ethical good is the same for everyone; insofar as one approaches moral perfection, one resembles other morally perfect individuals. The 19th-century Danish philosopher Søren Kierkegaard, who was the first writer to call himself existentialist, reacted against this tradition by insisting that the highest good for the individual is to find his or her own unique vocation. As he wrote in his journal, “I must find a truth that is true for me . . . the idea for which I can live or die.” Other existentialist writers have echoed Kierkegaard's belief that one must choose one's own way without the aid of universal, objective standards. Against the traditional view that moral choice involves an objective judgment of right and wrong, existentialists have argued that no objective, rational basis can be found for moral decisions. The 19th-century German philosopher Friedrich Nietzsche further contended that the individual must decide which situations are to count as moral situations.

All existentialists have followed Kierkegaard in stressing the importance of passionate individual action in deciding questions of both morality and truth. They have insisted, accordingly, that personal experience and acting on one's own convictions are essential in arriving at the truth. Thus, the understanding of a situation by someone involved in that situation is superior to that of a detached, objective observer. This emphasis on the perspective of the individual agent has also made existentialists suspicious of systematic reasoning. Kierkegaard and other existentialist writers have been deliberately unsystematic in the exposition of their philosophies, preferring to express themselves in aphorisms, dialogues, parables, and other literary forms. Despite their antirationalist position, however, most existentialists cannot be said to be irrationalists in the sense of denying all validity to rational thought. They have held that rational clarity is desirable wherever possible, but that the most important questions in life are not accessible to reason or to the supporting structures of scientific understanding; indeed, they have argued that even science is not as rational as is commonly supposed. Nietzsche, for instance, asserted that the scientific assumption of an orderly universe is for the most part a useful fiction.

Perhaps the most prominent theme in existentialist writing is that of choice. Humanity's primary distinction, in the view of most existentialists, is the freedom to choose. Existentialists have held that human beings do not have a fixed nature, or essence, as other animals and plants do; instead, each human being makes choices that create his or her own nature. In the existentialist formula, existence precedes essence. Choice is therefore central to human existence, and it is inescapable; even the refusal to choose is a choice. Freedom of choice entails commitment and responsibility. Because individuals are free to choose their own path, existentialists have argued, they must accept the risk and responsibility of following their commitment wherever it leads.

Kierkegaard held that it is spiritually crucial to recognize that one experiences not only a fear of specific objects but also a feeling of general apprehension, which he called dread. He interpreted it as God's way of calling each individual to commit to a personally valid way of life. The word anxiety (German Angst) has a similarly crucial role in the work of the 20th-century German philosopher Martin Heidegger - anxiety leads to the individual's confrontation with nothingness and with the impossibility of finding ultimate justification for the choices he or she must make. In the philosophy of Sartre, the word nausea is used for the individual's recognition of the pure contingency of the universe, and the word anguish is used for the recognition of the total freedom of choice that confronts the individual at every moment.

Existentialism as a distinct philosophical and literary movement belongs to the 19th and 20th centuries. However, elements of existentialism can be found in the thought (and life) of Socrates, in the Bible, and in the work of many pre-modern philosophers and writers.

The first to anticipate the major concerns of modern existentialism was the 17th-century French philosopher Blaise Pascal. Pascal rejected the rigorous rationalism of his contemporary René Descartes, asserting, in his Pensées (1670), that a systematic philosophy that presumes to explain God and humanity is a form of pride. Like later existentialist writers, he saw human life in terms of paradoxes: The human self, which combines mind and body, is itself a paradox and contradiction.

Nineteenth-century Danish philosopher Søren Kierkegaard played a major role in the development of existentialist thought. Kierkegaard criticized the popular systematic method of rational philosophy advocated by the German philosopher Georg Wilhelm Friedrich Hegel. He emphasized the absurdity inherent in human life and questioned how any systematic philosophy could apply to the ambiguous human condition. In his deliberately unsystematic works, Kierkegaard argued that each individual should attempt an intense examination of his or her own existence.

Kierkegaard, generally regarded as the founder of modern existentialism, reacted against the systematic absolute idealism of the 19th-century German philosopher Georg Wilhelm Friedrich Hegel, who claimed to have worked out a total rational understanding of humanity and history. Kierkegaard, on the contrary, stressed the ambiguity and absurdity of the human situation. The individual's response to this situation must be to live a totally committed life, and this commitment can only be understood by the individual who has made it. The individual, therefore, must always be prepared to defy the norms of society for the sake of the higher authority of a personally valid way of life. Kierkegaard ultimately advocated a “leap of faith” into a Christian way of life, which, although incomprehensible and full of risk, was the only commitment he believed could save the individual from despair.

Danish religious philosopher Søren Kierkegaard rejected the all-encompassing, analytical philosophical systems of such 19th-century thinkers as German philosopher G.W.F. Hegel. Instead, Kierkegaard focussed on the choices the individual must make in all aspects of his or her life, especially the choice to maintain religious faith. In Fear and Trembling (1843; translated 1941), Kierkegaard explored the concept of faith through an examination of the biblical story of Abraham and Isaac, in which God demanded that Abraham prove his faith by sacrificing his son.

One of the most controversial works of 19th-century philosophy, Thus Spake Zarathustra (1883-1885) articulated Friedrich Nietzsche’s theory of the Übermensch, a term translated as “Superman” or “Overman.” The Superman was an individual who overcame what Nietzsche termed the “slave morality” of traditional values and lived according to his own morality. Nietzsche also advanced his idea that “God is dead,” that is, that traditional morality was no longer relevant in people’s lives. In the book, the sage Zarathustra comes down from the mountain where he has spent the last ten years alone to preach to the people.

Nietzsche, who was not acquainted with the work of Kierkegaard, influenced subsequent existentialist thought through his criticism of traditional metaphysical and moral assumptions and through his espousal of tragic pessimism and the life-affirming individual will that opposes itself to the moral conformity of the majority. In contrast to Kierkegaard, whose attack on conventional morality led him to advocate a radically individualistic Christianity, Nietzsche proclaimed the “death of God” and went on to reject the entire Judeo-Christian moral tradition in favour of a heroic pagan ideal.

The “will,” in philosophy and psychology, is the capacity to choose among alternative courses of action and to act on the choice made, particularly when the action is directed toward a specific goal or is governed by definite ideals and principles of conduct. Willed behaviour contrasts with behaviour stemming from instinct, impulse, reflex, or habit, none of which involves conscious choice among alternatives. Willed behaviour also contrasts with the vacillations manifested by alternating choices among conflicting alternatives.

Until the 20th century most philosophers conceived the will as a separate faculty with which every person is born. They differed, however, about the role of this faculty in the personality makeup. For one school of philosophers, most notably represented by the German philosopher Arthur Schopenhauer, universal will is the primary reality, and the individual's will forms part of it. In his view, the will dominates every other aspect of an individual's personality: knowledge, feelings, and direction in life. A contemporary form of Schopenhauer's theory is implicit in some forms of existentialism, such as the view expressed by the French philosopher Jean-Paul Sartre, which regards personality as the drive to action, and actions as manifestations of the will that gives meaning to the universe.

Most other philosophers have regarded the will as coequal with or secondary to other aspects of personality. Plato believed that the psyche is divided into three parts: reason, will, and desire. For rationalist philosophers, such as Aristotle, Thomas Aquinas, and René Descartes, the will is the agent of the rational soul in governing purely animal appetites and passions. Some empirical philosophers, such as David Hume, discount the importance of rational influences upon the will; they think of the will as ruled mainly by emotion. Evolutionary philosophers, such as Herbert Spencer, and pragmatist philosophers, such as John Dewey, conceive of the will not as an innate faculty but as a product of experience, evolving gradually as the mind and personality of the individual develop in social interaction.

Modern psychologists tend to accept the pragmatic theory of the will. They regard the will as an aspect or quality of behaviour rather than as a separate faculty. It is the whole person who wills. This act of willing is manifested by (1) the fixing of attention on distant goals and abstract standards and principles of conduct; (2) the weighing of alternative courses of action and the taking of the deliberate action that seems best calculated to serve specific goals and principles; (3) the inhibition of impulses and habits that might distract attention from, or otherwise conflict with, a goal or principle; and (4) perseverance against deterrents and obstructions in the pursuit of goals or adherence to principles.

The modern philosophy movements of phenomenology and existentialism have been greatly influenced by the thought of German philosopher Martin Heidegger. According to Heidegger, humankind has fallen into a crisis by taking a narrow, technological approach to the world and by ignoring the larger question of existence. People, if they wish to live authentically, must broaden their perspectives. Instead of taking their existence for granted, people should view themselves as part of Being (Heidegger's term for that which underlies all existence).

Heidegger, like Pascal and Kierkegaard, reacted against any attempt to put philosophy on a conclusive rationalistic basis - as did Max Scheler (1874-1928), the German social and religious philosopher, whose work reflected the influence of the phenomenology of his countryman Edmund Husserl. Born in Munich, Scheler taught at the universities of Jena, Munich, and Cologne. In The Nature of Sympathy (1913; translated 1970), he applied Husserl's method of detailed phenomenological description to the social emotions that relate human beings to one another - especially love and hate. This book was followed by his most famous work, Formalism in Ethics and Non-Formal Ethics of Values (1913; translated 1973), a two-volume study of ethics in which he criticized the formal ethical approach of the German philosopher Immanuel Kant and substituted for it a study of specific values as they directly present themselves to consciousness. Scheler converted to Roman Catholicism in 1920 and wrote On the Eternal in Man (1921; translated 1960) to justify his conversion, followed by an important study of the sociology of knowledge, Die Wissensformen und die Gesellschaft (Forms of Knowledge and Society, 1926). Later he rejected Roman Catholicism and developed a philosophy, based on science, in which all abstract knowledge and religious values are considered sublimations of basic human drives. This is presented in his last book, The Place of Man in the Universe (1928; translated 1961).

Influenced by the phenomenology of the 20th-century German philosopher Edmund Husserl, Heidegger argued that humanity finds itself in an incomprehensible and indifferent world. Human beings can never hope to understand why they are here; instead, each individual must choose a goal and follow it with passionate conviction, aware of the certainty of death and the ultimate meaninglessness of one's life. Heidegger contributed to existentialist thought an original emphasis on Being and ontology and on language.

The subjects treated in Aristotle's Metaphysics (substance, causality, the nature of being, and the existence of God) fixed the content of metaphysical speculation for centuries. Among the medieval Scholastic philosophers, metaphysics was known as the “transphysical science” on the assumption that, by means of it, the scholar could make the transition philosophically from the physical world to a world beyond sense perception. The 13th-century Scholastic philosopher and theologian St. Thomas Aquinas declared that the cognition of God, through a causal study of finite sensible beings, was the aim of metaphysics. With the rise of scientific study in the 16th century the reconciliation of science and faith in God became an increasingly important problem.

The Irish-born philosopher and clergyman George Berkeley (1685-1753) argued that everything human beings can conceive of exists as an idea in a mind, a philosophical focus known as idealism. Berkeley reasoned that because one cannot control one’s thoughts, they must come directly from a larger mind: that of God. In his treatise Concerning the Principles of Human Knowledge, written in 1710, Berkeley explained why he believed that it is “impossible . . . that there should be any such thing as an outward object.”

Before the time of the German philosopher Immanuel Kant, metaphysics was characterized by a tendency to construct theories based on deductive knowledge, that is, knowledge derived from reason alone, in contradistinction to empirical knowledge, which is gained by reference to the facts of experience. From such knowledge were deduced general propositions held to be true of all things. The method of inquiry based on deductive principles is known as rationalistic. This method may be subdivided into monism, which holds that the universe is made up of a single fundamental substance; dualism, the belief in two such substances; and pluralism, which proposes the existence of many fundamental substances.

In the 5th and 4th centuries BC, Plato postulated the existence of a realm of Ideas that the varied objects of common experience imperfectly reflect. He maintained that these ideal Forms are not only more clearly intelligible but also more real than the transient and essentially illusory objects themselves.

George Berkeley is considered the founder of idealism, the philosophical view that all physical objects are dependent on the mind for their existence. According to Berkeley's early 18th-century writing, an object such as a table exists only if a mind is perceiving it. Therefore, objects are ideas.

Berkeley speculated that all aspects of everything of which one is conscious are reducible to the ideas present in the mind. The observer does not conjure external objects into existence, however; the ideas of them are caused in the human mind directly by God. Eighteenth-century German philosopher Immanuel Kant greatly refined idealism through his critical inquiry into what he believed to be the limits of possible knowledge. Kant held that all that can be known of things is the way in which they appear in experience; there is no way of knowing what they are substantially in themselves. He also held, however, that the fundamental principles of all science are essentially grounded in the constitution of the mind rather than derived from the external world.

George Berkeley argued that everything the human being conceives of exists as an idea within the mind, a philosophical focus known as idealism.

Trying to develop an all-encompassing philosophical system, German philosopher Georg Wilhelm Friedrich Hegel wrote on topics ranging from logic and history to art and literature. He considered art to be one of the supreme developments of spiritual and absolute knowledge, surpassed only by religion and philosophy. In his excerpt from Introductory Lectures on Aesthetics, which were based on lectures that Hegel delivered between 1820 and 1829, Hegel discussed the relationship of poetry to other arts, particularly music, and explained that poetry was one mode of expressing the “Idea of beauty” that Hegel believed resided in all art forms. For Hegel, poetry was “the universal realization of the art of the mind.”

Nineteenth-century German philosopher Georg Wilhelm Friedrich Hegel disagreed with Kant's theory concerning the inescapable human ignorance of what things are in themselves, instead arguing for the ultimate intelligibility of all existence. Hegel also maintained that the highest achievements of the human spirit (culture, science, religion, and the state) are not the result of naturally determined processes in the mind, but are conceived and sustained by the dialectical activity of the free, reflective intellect.

Hegel applied the term dialectic to his philosophic system. Hegel believed that the evolution of ideas occurs through a dialectical process - that is, a concept gives rise to its opposite, and out of this conflict a third view, the synthesis, arises. The synthesis is at a higher level of truth than the first two views. Hegel's work is based on the idealistic concept of a universal mind that, through evolution, seeks to arrive at the highest level of self-awareness and freedom.

German political philosopher Karl Marx applied the concept of the dialectic to social and economic processes. Marx's so-called dialectical materialism is frequently considered a revision of the Hegelian dialectic of the free, reflective intellect. Additional strains of idealistic thought can be found in the works of 19th-century Germans Johann Gottlieb Fichte and F.W.J. Schelling, 19th-century Englishman F.H. Bradley, 19th-century Americans Charles Sanders Peirce and Josiah Royce, and 20th-century Italian Benedetto Croce.

The monists, agreeing that only one basic substance exists, differ in their descriptions of its principal characteristic. Thus, in idealistic monism the substance is believed to be purely mental; in materialistic monism it is held to be purely physical, and in neutral monism it is considered neither exclusively mental nor solely physical. The idealistic position was held by the Irish philosopher George Berkeley, the materialistic by the English philosopher Thomas Hobbes, and the neutral by the Dutch philosopher Baruch Spinoza. The latter expounded a pantheistic view of reality in which the universe is identical with God and everything contains God's substance.

George Berkeley set out to challenge what he saw as the atheism and skepticism inherent in the prevailing philosophy of the early 18th century. His initial publications, which asserted that no objects or matter existed outside the human mind, were met with disdain by the London intelligentsia of the day. Berkeley aimed to explain his “Immaterialist” theory, part of the school of thought known as idealism, to a more general audience in Three Dialogues between Hylas and Philonous (1713).

The most famous exponent of dualism was the French philosopher René Descartes, who maintained that body and mind are radically different entities and that they are the only fundamental substances in the universe. Dualism, however, does not show how these basic entities are connected.

In the work of the German philosopher Gottfried Wilhelm Leibniz, the universe is held to consist of many distinct substances, or monads. This view is pluralistic in the sense that it proposes the existence of many separate entities, and it is monistic in its assertion that each monad reflects within itself the entire universe.

Other philosophers have held that knowledge of reality is not derived from deductive principles but is obtained only from experience. This type of metaphysics is called empiricism. Still another school of philosophy has maintained that, although an ultimate reality does exist, it is altogether inaccessible to human knowledge, which is necessarily subjective because it is confined to states of mind. Knowledge is therefore not a representation of external reality, but merely a reflection of human perceptions. This view is known as skepticism or, with respect to the soul and the reality of God, agnosticism.

Immanuel Kant published his Critique of Pure Reason in 1781. Three years later he expanded on his study of the modes of thinking with an essay entitled “What is Enlightenment?” In this 1784 essay, Kant challenged readers to “dare to know,” arguing that it was not only a civic but also a moral duty to exercise the fundamental freedoms of thought and expression.

Several major viewpoints were combined in the work of Kant, who developed a distinctive critical philosophy called transcendentalism. His philosophy is agnostic in that it denies the possibility of a strict knowledge of ultimate reality; it is empirical in that it affirms that all knowledge arises from experience and is true of objects of actual and possible experience; and it is rationalistic in that it maintains the deductive character of the structural principles of this empirical knowledge.

These principles are held to be necessary and universal in their application to experience, for in Kant's view the mind furnishes the archetypal forms and categories (space, time, causality, substance, and relation) to its sensations, and these categories are logically anterior to experience, although manifested only in experience. Their logical anteriority to experience makes these categories or structural principles transcendental; they transcend all experience, both actual and possible. Although these principles determine all experience, they do not in any way affect the nature of things in themselves. The knowledge of which these principles are the necessary conditions must not be considered, therefore, as constituting a revelation of things as they are in themselves. This knowledge concerns things only as far as they appear to human perception or as they can be apprehended by the senses. The argument by which Kant sought to fix the limits of human knowledge within the framework of experience and to demonstrate the inability of the human mind to penetrate beyond experience to the realm of ultimate reality makes up the critical feature of his philosophy, giving the key word to the titles of his three leading treatises, Critique of Pure Reason, Critique of Practical Reason, and Critique of Judgment. In the system propounded in these works, Kant sought also to reconcile science and religion in a world of two levels, comprising noumena, objects conceived by reason although not perceived by the senses, and phenomena, things as they appear to the senses and are accessible to material study. He maintained that, because God, freedom, and human immortality are noumenal realities, these concepts are understood through moral faith rather than through scientific knowledge. With the continuous development of science, the expansion of metaphysics to include scientific knowledge and methods became one of the major objectives of metaphysicians.

Some of Kant's most distinguished followers, notably Johann Gottlieb Fichte, Friedrich Schelling, Georg Wilhelm Friedrich Hegel, and Friedrich Schleiermacher, negated Kant's criticism in their elaborations of his transcendental metaphysics by denying the Kantian conception of the thing-in-itself. They thus developed an absolute idealism opposing Kant's critical transcendentalism.

Since the formulation of the hypothesis of absolute idealism, the development of metaphysics has resulted in as many types of metaphysical theory as existed in pre-Kantian philosophy, despite Kant's contention that he had fixed definitely the limits of philosophical speculation. Notable among these later metaphysical theories is radical empiricism, or pragmatism, a native American form of metaphysics expounded by Charles Sanders Peirce, developed by William James, and adapted as instrumentalism by John Dewey; voluntarism, the foremost exponents of which are the German philosopher Arthur Schopenhauer and the American philosopher Josiah Royce; phenomenalism, as exemplified in the writings of the French philosopher Auguste Comte and the British philosopher Herbert Spencer; emergent evolution, or creative evolution, originated by the French philosopher Henri Bergson; and the philosophy of the organism, elaborated by the British mathematician and philosopher Alfred North Whitehead. The salient doctrines of pragmatism are that the chief function of thought is to guide action, that the meaning of concepts is to be sought in their practical applications, and that truth should be tested by the practical effects of belief; according to instrumentalism, ideas are instruments of action, and their truth is determined by their role in human experience. In the theory of voluntarism, the will is postulated as the supreme manifestation of reality. The exponents of phenomenalism, who are sometimes called positivists, contend that everything can be analysed in terms of actual or possible occurrences, or phenomena, and that anything that cannot be analysed in this manner cannot be understood. In emergent or creative evolution, the evolutionary process is characterized as spontaneous and unpredictable rather than mechanistically determined. The philosophy of the organism combines an evolutionary stress on constant process with a metaphysical theory of God, the eternal objects, and creativity.

In the 20th century the validity of metaphysical thinking has been disputed by the logical positivists and by the so-called dialectical materialism of the Marxists. The basic principle maintained by the logical positivists is the verifiability theory of meaning. According to this theory, a sentence has factual meaning only if it meets the test of observation. Logical positivists argue that metaphysical expressions such as “Nothing exists except material particles” and “Everything is part of one all-encompassing spirit” cannot be tested empirically. Therefore, according to the verifiability theory of meaning, these expressions have no factual cognitive meaning, although they can have an emotive meaning about human hopes and feelings.

The dialectical materialists assert that the mind is conditioned by and reflects material reality. Therefore, speculations that conceive of constructs of the mind as having any other than material reality are themselves unreal and can result only in delusion. To these assertions metaphysicians reply by denying the adequacy of the verifiability theory of meaning and of material perception as the standard of reality. Both logical positivism and dialectical materialism, they argue, conceal metaphysical assumptions, for example, that everything is observable or at least connected with something observable and that the mind has no distinctive life of its own. In the philosophical movement known as existentialism, thinkers have contended that the questions of the nature of being and of the individual's relationship to it are extremely important and meaningful concerning human life. The investigation of these questions is therefore considered valid whether or not its results can be verified objectively.

Since the 1950s the problems of systematic analytical metaphysics have been studied in Britain by Stuart Newton Hampshire and Peter Frederick Strawson, the former concerned, in the manner of Spinoza, with the relationship between thought and action, and the latter, in the manner of Kant, with describing the major categories of experience as they are embedded in language. In the United States, metaphysics has been pursued much in the spirit of positivism by Wilfred Stalker Sellars and Willard Van Orman Quine: Sellars has sought to express metaphysical questions in linguistic terms, and Quine has attempted to determine whether the structure of language commits the philosopher to asserting the existence of any entities whatever and, if so, what kind. In these new formulations the issues of metaphysics and ontology remain vital.

Twentieth-century French intellectual Jean-Paul Sartre helped to develop existential philosophy through his writings, novels, and plays. Much of Sartre’s work focuses on the dilemma of choice faced by free individuals and on the challenge of creating meaning by acting responsibly in an indifferent world. In stating that “man is condemned to be free,” Sartre reminds us of the responsibility that accompanies human decisions.

Sartre first gave the term existentialism general currency by using it for his own philosophy and by becoming the leading figure of a distinct movement in France that became internationally influential after World War II. Sartre's philosophy is explicitly atheistic and pessimistic; he declared that human beings require a rational basis for their lives but are unable to achieve one, and thus human life is a “futile passion.” Sartre nevertheless insisted that his existentialism is a form of humanism, and he strongly emphasized human freedom, choice, and responsibility. He eventually tried to reconcile these existentialist concepts with a Marxist analysis of society and history. Because, for Heidegger, one is what one does in the world, a phenomenological reduction to one's own private experience is impossible; and because human action consists of a direct grasp of objects, it is not necessary to posit a special mental entity called a meaning to account for intentionality. For Heidegger, being thrown into the world among things in the act of realizing projects is a more fundamental kind of intentionality than that revealed in merely staring at or thinking about objects, and it is this more fundamental intentionality that makes possible the directedness analysed by Husserl.

In the mid-1900s, French existentialist Jean-Paul Sartre attempted to adapt Heidegger's phenomenology to the philosophy of consciousness, in effect returning to the approach of Husserl. Sartre agreed with Husserl that consciousness is always directed at objects but criticized his claim that such directedness is possible only by means of special mental entities called meanings. The French philosopher Maurice Merleau-Ponty rejected Sartre's view that phenomenological description reveals human beings to be pure, isolated, and free consciousnesses. He stressed the role of the active, involved body in all human knowledge, thus generalizing Heidegger's insights to include the analysis of perception. Like Heidegger and Sartre, Merleau-Ponty is an existential phenomenologist, in that he denies the possibility of bracketing existence.

In the treatise Being and Nothingness, French writer Jean-Paul Sartre presents his existential philosophical framework. He reasons that the essential nothingness of human existence leaves individuals to take sole responsibility for their own actions. Shunning the morality and constraints of society, individuals must embrace personal responsibility to craft a world for themselves. Along with focussing on the importance of exercising individual responsibility, Sartre stresses that freedom of choice is the only means of authenticating human existence. A novelist and playwright as well as a philosopher, Sartre became a leader of the modern existentialist movement.

Although existentialist thought encompasses the uncompromising atheism of Nietzsche and Sartre and the agnosticism of Heidegger, its origin in the intensely religious philosophies of Pascal and Kierkegaard foreshadowed its profound influence on 20th-century theology. The 20th-century German philosopher Karl Jaspers, although he rejected explicit religious doctrines, influenced contemporary theology through his preoccupation with transcendence and the limits of human experience. The German Protestant theologians Paul Tillich and Rudolf Bultmann, the French Roman Catholic theologian Gabriel Marcel, the Russian Orthodox philosopher Nikolay Berdyayev, and the German Jewish philosopher Martin Buber inherited many of Kierkegaard's concerns, especially that a personal sense of authenticity and commitment is essential to religious faith.

Renowned as one of the most important writers in world history, 19th-century Russian author Fyodor Dostoyevsky wrote psychologically intense novels that probed the motivations and moral justifications for his characters’ actions. Dostoyevsky commonly addressed themes such as the struggle between good and evil within the human soul and the idea of salvation through suffering. The Brothers Karamazov (1879-1880), generally considered Dostoyevsky’s best work, interlaces religious exploration with the story of a family’s violent quarrels over a woman and a disputed inheritance.

Twentieth-century writer and philosopher Albert Camus examined what he considered the tragic inability of human beings to understand and transcend their intolerable conditions. In his work Camus presented an absurd and seemingly unreasonable world in which some people futilely struggle to find meaning and rationality while others simply refuse to care. For example, the main character of The Stranger (1942) kills a man on a beach for no reason and accepts his arrest and punishment with dispassion. In contrast, in The Plague (1947), Camus introduces characters who act with courage in the face of absurdity.

Several existentialist philosophers used literary forms to convey their thought, and existentialism has been as vital and as extensive a movement in literature as in philosophy. The 19th-century Russian novelist Fyodor Dostoyevsky is probably the greatest existentialist literary figure. In Notes from the Underground (1864), the alienated antihero rages against the optimistic assumptions of rationalist humanism. The view of human nature that emerges in this and other novels of Dostoyevsky is that it is unpredictable and perversely self-destructive; Only Christian love can save humanity from itself, but such love cannot be understood philosophically. As the character Alyosha says in The Brothers Karamazov (1879-80), “We must love life more than the meaning of it.”

The opening lines of the Russian novelist Fyodor Dostoyevsky’s Notes from Underground (1864) - “I am a sick man . . . I am a spiteful man” - are among the most famous in 19th-century literature. Published five years after his release from prison and involuntary military service in Siberia, Notes from Underground marks Dostoyevsky’s rejection of the radical social thinking he had embraced in his youth. The unnamed narrator is antagonistic in tone, questioning the reader’s sense of morality as well as the foundations of rational thinking.

In the 20th century, the novels of the Austrian Jewish writer Franz Kafka, such as The Trial (1925; translated 1937) and The Castle (1926; translated 1930), present isolated men confronting vast, elusive, menacing bureaucracies; Kafka's themes of anxiety, guilt, and solitude reflect the influence of Kierkegaard, Dostoyevsky, and Nietzsche. The influence of Nietzsche is also discernible in the novels of the French writer André Malraux and in the plays of Sartre. The work of the French writer Albert Camus is usually associated with existentialism because of the prominence of such themes as the apparent absurdity and futility of life, the indifference of the universe, and the necessity of engagement in a just cause. Existentialist themes are also reflected in the theatre of the absurd, notably in the plays of Samuel Beckett and Eugène Ionesco. In the United States, the influence of existentialism on literature has been more indirect and diffuse; traces of Kierkegaard's thought can be found in the novels of Walker Percy and John Updike, and various existentialist themes are apparent in the work of such diverse writers as Norman Mailer, John Barth, and Arthur Miller.

Nietzsche’s concept of the Übermensch has often been interpreted as one that postulates a master-slave society and has been identified with totalitarian philosophies. Many scholars deny the connection and attribute it to a misinterpretation of Nietzsche's work.


Yet, Kant tried to solve the crisis generated by Locke and brought to a climax by Hume; his proposed solution combined elements of rationalism with elements of empiricism. He agreed with the rationalists that one can have exact and certain knowledge, but he followed the empiricists in holding that such knowledge is more informative about the structure of thought than about the world outside thought.

During the 19th century, the German philosopher Georg Wilhelm Friedrich Hegel revived the rationalist claim that absolutely certain knowledge of reality can be obtained by equating the processes of thought, of nature, and of history. Hegel inspired an interest in history and a historical approach to knowledge that was furthered by Herbert Spencer in Britain and by the German school of historicism. Spencer and the French philosopher Auguste Comte brought attention to the importance of sociology as a branch of knowledge, and both extended the principles of empiricism to the study of society.

The American school of pragmatism, founded by the philosophers Charles Sanders Peirce, William James, and John Dewey at the turn of the 20th century, carried empiricism further by maintaining that knowledge is an instrument of action and that all beliefs should be judged by their usefulness as rules for predicting experiences.

In the early 20th century, epistemological problems were discussed thoroughly, and subtle shades of difference grew into rival schools of thought. Special attention was given to the relation between the act of perceiving something, the object directly perceived, and the thing that can be said to be known because of the perception. The phenomenalists contended that the objects of knowledge are the same as the objects perceived. The neorealists argued that one has direct perceptions of physical objects or parts of physical objects, rather than of one's own mental states. The critical realists took a middle position, holding that although one perceives only sensory data such as colours and sounds, these stand for physical objects and provide knowledge of them.

A method for dealing with the problem of clarifying the relation between the act of knowing and the object known was developed by the German philosopher Edmund Husserl. He outlined an elaborate procedure that he called phenomenology, by which one is said to be able to distinguish the way things appear to be from the way one thinks they really are, thus gaining a more precise understanding of the conceptual foundations of knowledge.

During the second quarter of the 20th century, two schools of thought emerged, each indebted to the Austrian philosopher Ludwig Wittgenstein. The first of these schools, logical empiricism, or logical positivism, had its origins in Vienna, Austria, but it soon spread to England and the United States. The logical empiricists insisted that there is only one kind of knowledge: scientific knowledge; that any legitimate knowledge claim must be verifiable in experience; and hence that much that had passed for philosophy was neither true nor false but literally meaningless. Finally, following Hume and Kant, they insisted that a clear distinction be maintained between analytic and synthetic statements. The so-called verifiability criterion of meaning has undergone changes because of discussions among the logical empiricists themselves and their critics, but it has not been discarded. More recently, the sharp distinction between the analytic and the synthetic has been attacked by a number of philosophers, chiefly by the American philosopher W.V.O. Quine, whose overall approach is in the pragmatic tradition.

The second of these schools of thought, generally called linguistic analysis, or ordinary-language philosophy, seems to break with traditional epistemology. The linguistic analysts undertake to examine the actual way key epistemological terms are used - terms such as knowledge, perception, and probability - and to formulate definitive rules for their use in order to avoid verbal confusion.

John Austin (1911-1960), a British philosopher and a prominent figure in 20th-century analytic and linguistic philosophy, was born in Lancaster, England, and educated at the University of Oxford. After serving in British intelligence during World War II (1939-1945), he returned to Oxford and taught philosophy until his death.

Austin viewed the fundamental philosophical task as that of analysing and clarifying ordinary language. He considered attention to the distinctions drawn in ordinary language the most fruitful starting point for philosophical inquiry. Austin's linguistic work led to many influential concepts, such as speech-act theory. This arose from his observation that many utterances do not merely describe reality but also affect reality; they are the performance of some act rather than a report of its performance. Austin came to believe that all language is performative, being made up of speech acts. Seven of his essays were published during his lifetime. Posthumously published works include Philosophical Papers (1961), Sense and Sensibilia (1962), and How to Do Things with Words (1962).

Thomas Hill Green (1836-1882), British philosopher and educator, who led the revolt against empiricism, the dominant philosophy in Britain during the latter part of the 19th century. He was born in Birkin, Yorkshire, England, and educated at Rugby and the University of Oxford. He taught at Oxford from 1860 until his death, initially as a fellow and after 1878 as Whyte Professor of Moral Philosophy.

A disciple of the German philosopher Georg Wilhelm Friedrich Hegel, Green insisted that consciousness provides the necessary basis for both knowledge and morality. He argued that a person's highest good is self-realization and that the individual can achieve self-realization only in society. Society has an obligation, in turn, to provide for the good of all its members. The political implications of his philosophy laid the basis for sweeping social-reform legislation in Britain. Besides being the most influential British philosopher of his time, Green was a vigorous champion of popular education, temperance, and political liberalism. His writings include Prolegomena to Ethics (1883) and Lectures on the Principles of Political Obligation (1895), both published posthumously.

The outcome of this crisis in economic and social thinking was the development of positive liberalism. As noted, certain modern liberals, like the Austrian-born economist Friedrich August von Hayek, consider the positive attitude an essential betrayal of liberal ideals. Others, such as the British philosophers Thomas Hill Green and Bernard Bosanquet, known as the “Oxford Idealists,” devised a so-called organic liberalism designed to “hinder hindrances to the good life.” Green and Bosanquet advocated positive state action to promote self-fulfilment, that is, to prevent economic monopoly, abolish poverty, and secure people against the disabilities of sickness, unemployment, and old age. This new liberalism developed alongside the extension of democracy.

Most of the philosophical discussions of consciousness arose from the mind-body issues posed by René Descartes in the 17th century. Descartes asked: Is the mind, or consciousness, independent of matter? Is consciousness extended (physical) or unextended (nonphysical)? Is consciousness determinative, or is it determined? English philosophers such as John Locke equated consciousness with physical sensations and the information they provide, whereas European philosophers such as Gottfried Wilhelm Leibniz and Immanuel Kant gave consciousness a more central and active role.

The philosopher who most directly influenced subsequent exploration of the subject of consciousness was the 19th-century German educator Johann Friedrich Herbart, who wrote that ideas have quality and intensity and that they may inhibit or facilitate one another. Thus, ideas may pass from “states of reality” (consciousness) to “states of tendency” (unconsciousness), with the dividing line between the two states being described as the threshold of consciousness. This formulation by Herbart clearly presages the development, by the German psychologist and physiologist Gustav Theodor Fechner, of the psycho-physical measurement of sensation thresholds, and the later development by Sigmund Freud of the concept of the unconscious.

No simple, agreed-upon definition of consciousness exists. Attempted definitions tend to be tautological (for example, consciousness defined as awareness) or merely descriptive (for example, consciousness described as sensations, thoughts, or feelings). Despite this problem of definition, the subject of consciousness has had a remarkable history. Once the primary subject matter of psychology, consciousness as an area of study suffered an almost total dissolution, later reemerging to become a topic of current interest.

The experimental analysis of consciousness dates from 1879, when the German psychologist Wilhelm Max Wundt started his research laboratory. For Wundt, the task of psychology was the study of the structure of consciousness, which extended well beyond sensations and included feelings, images, memory, attention, duration, and movement. Because early interest focussed on the content and dynamics of consciousness, it is not surprising that the central methodology of such studies was introspection; that is, subjects reported on the mental contents of their own consciousness. This introspective approach was developed most fully by the American psychologist Edward Bradford Titchener at Cornell University. Setting his task as that of describing the structure of the mind, Titchener attempted to detail, from introspective reports, the dimensions of the elements of consciousness. For example, taste was “dimensionalized” into four basic categories: sweet, sour, salt, and bitter. This approach was known as structuralism.

By the 1920s, however, a remarkable revolution had occurred in psychology that was essentially to remove considerations of consciousness from psychological research for some fifty years: behaviourism captured the field of psychology. The main initiator of this movement was the American psychologist John Broadus Watson. In a 1913 article, Watson stated, ‘I believe that we can write a psychology and never use the terms consciousness, mental states, mind . . . imagery and the like.’ Psychologists then turned almost exclusively to behaviour, described in terms of stimulus and response, and consciousness was totally bypassed as a subject. A survey of eight leading introductory psychology texts published between 1930 and the 1950s found no mention of the topic of consciousness in five texts, and in two it was treated as a historical curiosity.

In the 1950s, however, interest in the subject of consciousness returned, specifically interest in those subjects and techniques relating to altered states of consciousness, such as sleep and dreams, meditation, biofeedback, hypnosis, and drug-induced states. An increase in sleep and dream research was directly fuelled by a discovery relevant to the nature of consciousness. A physiological indicator of the dream state was found: at roughly 90-minute intervals, the eyes of sleepers were observed to move rapidly, while the sleepers' brain waves showed a pattern resembling the waking state. When people were awakened during these periods of rapid eye movement, they usually reported dreams, whereas if awakened at other times they did not. This and other research clearly suggested that sleep, once considered a passive state, was instead an active state of consciousness.

During the 1960s, an increased search for “higher levels” of consciousness through meditation resulted in a growing interest in the practices of Zen Buddhism and Yoga from Eastern cultures. A full flowering of this movement in the United States was seen in the development of training programs, such as Transcendental Meditation, that offered directed procedures of physical relaxation and focussed attention. Biofeedback techniques also were developed to bring body systems involving factors such as blood pressure or temperature under voluntary control by providing feedback from the body, so that subjects could learn to control their responses. For example, researchers found that persons could control their brain-wave patterns to some extent, particularly the so-called alpha rhythms generally associated with a relaxed, meditative state. This finding was of special interest to those interested in consciousness and meditation, and several ‘alpha training’ programs emerged.

Another subject that led to increased interest in altered states of consciousness was hypnosis, which involves a transfer of conscious control from one person to another. Hypnotism has had a long and intricate history in medicine and folklore and has been intensively studied by psychologists. Much has become known about the hypnotic state and its relation to individual suggestibility and personality traits; the subject has now been largely demythologized, and the limitations of the hypnotic state are well known. Despite the increasing use of hypnosis, however, much remains to be learned about this unusual state of focussed attention.

Many people in the 1960s experimented with the psychoactive drugs known as hallucinogens, which produce distortions of conscious awareness. The most prominent of these drugs are lysergic acid diethylamide (LSD), mescaline, and psilocybin; the latter two have long been associated with religious ceremonies in various cultures. LSD, because of its radical thought-modifying properties, was initially explored for its so-called mind-expanding potential and for its psychotomimetic effects (imitating psychoses). Little positive use, however, has been found for these drugs.

In recent decades the model of an orderly but simple linkage between environment and behaviour has become unsatisfactory, and interest in altered states of consciousness may be taken as a visible sign of renewed interest in the topic of consciousness. That persons are active and intervening participants in their behaviour has become increasingly clear. Environments, rewards, and punishments are not simply defined by their physical character. Memories are organized, not simply stored. An entirely new area called cognitive psychology has emerged that centres on these concerns. In the study of children, increased attention is being paid to how they understand, or perceive, the world at different ages. In the field of animal behaviour, researchers increasingly emphasize the inherent characteristics resulting from the way a species has been shaped to respond adaptively to the environment. Humanistic psychologists, with a concern for self-actualization and growth, have emerged after a long period of silence. Throughout the development of clinical and industrial psychology, the conscious states of persons, their current feelings and thoughts, have been regarded as important. The role of consciousness, however, was often de-emphasised in favour of unconscious needs and motivations. Trends can be seen, however, toward a new emphasis on the nature of states of consciousness.

Scientists have long considered the nature of consciousness without producing a fully satisfactory definition. In the early 20th century the American philosopher and psychologist William James suggested that consciousness is a mental process involving both attention to external stimuli and short-term memory. Later scientific explorations of consciousness mostly expanded upon James’s work. In a 1997 special issue of Scientific American, Nobel laureate Francis Crick, who helped determine the structure of DNA, and fellow biophysicist Christof Koch explained how experiments on vision might deepen our understanding of consciousness.

Thirteenth-century Italian philosopher and theologian Saint Thomas Aquinas attempted to synthesize Christian belief with a broad range of human knowledge, embracing diverse sources such as Greek philosopher Aristotle and Islamic and Jewish scholars. His thought exerted lasting influence on the development of Christian theology and Western philosophy. Author Anthony Kenny examines the complexities of Aquinas’s concepts of substance and accident.

In the 5th century BC, the Greek Sophists questioned the possibility of reliable and objective knowledge. Thus, a leading Sophist, Gorgias, argued that nothing really exists, that if anything did exist it could not be known, and that if knowledge were possible, it could not be communicated. Another prominent Sophist, Protagoras, maintained that no person's opinions can be said to be more correct than another's, because each is the sole judge of his or her own experience. Plato, following his illustrious teacher Socrates, tried to answer the Sophists by postulating the existence of a world of unchanging and invisible forms, or ideas, of which exact and certain knowledge is possible. The things one sees and touches, he maintained, are imperfect copies of the pure forms studied in mathematics and philosophy. Accordingly, only the abstract reasoning of these disciplines yields genuine knowledge, whereas reliance on sense perception produces vague and inconsistent opinions. He concluded that philosophical contemplation of the unseen world of forms is the highest goal of human life.

Aristotle followed Plato in regarding abstract knowledge as superior to any other, but disagreed with him as to the proper method of achieving it. Aristotle maintained that most knowledge is derived from experience. Knowledge is gained either directly, by abstracting the defining traits of a species, or indirectly, by deducing new facts from those already known, according to the rules of logic. Careful observation and strict adherence to the rules of logic, which were first set down in systematic form by Aristotle, would help guard against the pitfalls the Sophists had exposed. The Stoic and Epicurean schools agreed with Aristotle that knowledge originates in sense perception, but against both Aristotle and Plato they maintained that philosophy is to be valued as a practical guide to life rather than as an end in itself.

After many centuries of declining interest in rational and scientific knowledge, the Scholastic philosopher Saint Thomas Aquinas and other philosophers of the Middle Ages helped to restore confidence in reason and experience, blending rational methods with faith into a unified system of beliefs. Aquinas followed Aristotle in regarding perception as the starting point and logic as the intellectual procedure for arriving at reliable knowledge of nature, but he considered faith in scriptural authority as the main source of religious belief.

From the 17th to the late 19th century, the main issue in epistemology was reasoning versus sense perception in acquiring knowledge. For the rationalists, of whom the French philosopher René Descartes, the Dutch philosopher Baruch Spinoza, and the German philosopher Gottfried Wilhelm Leibniz were the leaders, the main source and final test of knowledge was deductive reasoning based on evident principles, or axioms. For the empiricists, beginning with the English philosophers Francis Bacon and John Locke, the main source and final test of knowledge was sense perception.

French thinker René Descartes applied rigorous scientific methods of deduction to his exploration of philosophical questions. Descartes is probably best known for his pioneering work in philosophical skepticism. Author Tom Sorell examines the concepts behind Descartes’s work Meditationes de Prima Philosophia (1641, Meditations on First Philosophy), focussing on its distinctive use of logic and the reactions it aroused.

Bacon inaugurated the new era of modern science by criticizing the medieval reliance on tradition and authority and by setting down new rules of scientific method, including the first set of rules of inductive logic ever formulated. Locke attacked the rationalist belief that the principles of knowledge are intuitively evident, arguing that all knowledge is derived from experience, either from experience of the external world, which stamps sensations on the mind, or from internal experience, in which the mind reflects on its own activities. Human knowledge of external physical objects, he claimed, is always subject to the errors of the senses, and he concluded that one cannot have absolutely certain knowledge of the physical world.

George Berkeley agreed with Locke on the importance of examining the various sources of knowledge and, above all, the limits and doubtful capacities of our minds, but he rejected Locke's view that some of our ideas (those of primary qualities) give us an adequate representation of the world around us. It was through this examination of the mind's capacities that Locke connected his epistemology with the defence of religious toleration. Berkeley, furthermore, denied Locke's belief that a distinction can be made between ideas and objects. The British philosopher David Hume continued the empiricist tradition, but he did not accept Berkeley's conclusion that knowledge was of ideas only. He divided all knowledge into two kinds: knowledge of relations of ideas - that is, the knowledge found in mathematics and logic, which is exact and certain but provides no information about the world; and knowledge of matters of fact - that is, the knowledge derived from sense perception. Hume argued that most knowledge of matters of fact depends upon cause and effect, and since no logical connection exists between any given cause and its effect, one cannot hope to know any future matter of fact with certainty. Thus, even the most reliable laws of science might not remain true - a conclusion that had a revolutionary impact on philosophy.

During the 19th century, the German philosopher Georg Wilhelm Friedrich Hegel revived the rationalist claim that absolutely certain knowledge of reality can be obtained by equating the processes of thought, of nature, and of history. Hegel inspired an interest in history and a historical approach to knowledge that was later emphasized by Herbert Spencer in Britain and by the German school of historicism. Spencer and the French philosopher Auguste Comte brought attention to the importance of sociology as a branch of knowledge, and both extended the principles of empiricism to the study of society.

In the early 20th century, epistemological problems were discussed thoroughly, and subtle shades of difference grew into rival schools of thought. Special attention was given to the relation between the act of perceiving something, the object directly perceived, and the thing that can be said to be known because of the perception. The phenomenalists contended that the objects of knowledge are the same as the objects perceived. The neorealists argued that one has direct perceptions of physical objects or parts of physical objects, rather than of one's own mental states. The critical realists took a middle position, holding that although one perceives only sensory data such as colours and sounds, these stand for physical objects and provide knowledge of them.

During the second quarter of the 20th century, two schools of thought emerged, each indebted to the Austrian philosopher Ludwig Wittgenstein. The first of these schools, logical empiricism, or logical positivism, had its origins in Vienna, Austria, but it soon spread to England and the United States. The logical empiricists insisted that there is only one kind of knowledge - scientific knowledge; that any valid knowledge claim must be verifiable in experience; and, consequently, that much that had passed for philosophy was neither true nor false but literally meaningless. Finally, following Hume and Kant, they insisted that a clear distinction must be maintained between analytic and synthetic statements.

The second of these schools, generally called linguistic analysis, or ordinary language philosophy, seems to break with traditional epistemology. The linguistic analysts undertake to examine the way major epistemological terms are used - terms such as knowledge, perception, and probability - and to formulate definitive rules for their use to avoid verbal confusion. British philosopher John Langshaw Austin argued, for example, that to say a statement is true adds nothing to the statement except a promise by the speaker or writer. Austin does not consider truth a quality or property attaching to statements or utterances.

Positivism is a system of philosophy based on experience and empirical knowledge of natural phenomena, in which metaphysics and theology are regarded as inadequate and imperfect systems of knowledge.

The doctrine was first called positivism by the 19th-century French mathematician and philosopher Auguste Comte, but some positivist concepts may be traced to the British philosopher David Hume, the French philosopher Duc de Saint-Simon, and Immanuel Kant.

The keystone of Kant's philosophy, sometimes called critical philosophy, is contained in his Critique of Pure Reason (1781), in which he examined the bases of human knowledge and created an individual epistemology. Like earlier philosophers, Kant differentiated modes of thinking into analytic and synthetic propositions. An analytic proposition is one in which the predicate is contained in the subject, as in the statement “Black houses are houses.” The truth of this type of proposition is evident, because to state the reverse would be to make the proposition self-contradictory. Such propositions are called analytic because truth is discovered by the analysis of the concept itself. Synthetic propositions, on the other hand, are those that cannot be arrived at by pure analysis, as in the statement “The house is black.” All the common propositions that result from experience of the world are synthetic.
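To make the distinction vivid, here is a minimal formal sketch; the notation and the predicate names are illustrative additions, not Kant's own. Rendered in modern predicate logic, the analytic example is a formula whose truth follows from logic alone, whereas the synthetic example is a claim about a particular house that only experience can settle:

\[ \forall x\,\bigl(\mathrm{Black}(x)\wedge\mathrm{House}(x)\rightarrow\mathrm{House}(x)\bigr) \qquad \text{(analytic: denying it is self-contradictory)} \]

\[ \mathrm{Black}(h) \qquad \text{(synthetic: its truth depends on observation of the particular house } h\text{)} \]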

Propositions, according to Kant, can also be divided into two other types, empirical and deductive. Empirical propositions depend entirely on sense perception, but a deductive proposition has a fundamental validity of its own and is not based on such perception. The difference between these two types of propositions may be illustrated by the empirical “The house is black” and the deductive “Two plus two makes four.” Kant's thesis in the Critique is that making synthetic deductive judgments - judgments that are both informative and independent of experience - is possible. This philosophical position is usually known as transcendentalism. In describing how this type of judgment is possible, Kant regarded the objects of the material world as fundamentally unknowable; from the point of view of reason, they serve merely as the raw material from which sensations are formed. Objects of themselves have no existence, and space and time exist only as part of the mind, as “intuitions” by which perceptions are measured and judged.

Besides these intuitions, Kant stated that several deductive concepts, which he called categories, also exist. He divided the categories into four groups: those concerning quantity (unity, plurality, totality); those concerning quality (reality, negation, limitation); those concerning relation (substance-and-accident, cause-and-effect, reciprocity); and those concerning modality (possibility, existence, necessity). The intuitions and the categories can be applied to make judgments about experiences and perceptions, but cannot, according to Kant, be applied to abstract ideas such as freedom and existence without leading to inconsistencies in the form of pairs of contradictory propositions, or “antinomies,” in which both members of each pair can be proven true.

In the Metaphysics of Ethics (1797) Kant described his ethical system, which is based on a belief that reason is the final authority for morality. Actions of any sort, he believed, must be undertaken from a sense of duty dictated by reason, and no action performed for expediency or solely in obedience to law or custom can be regarded as moral. Kant described two types of commands given by reason: the hypothetical imperative, which dictates a given course of action to reach a specific end, and the categorical imperative, which dictates a course of action that must be followed because of its rightness and necessity. The categorical imperative is the basis of morality and was stated by Kant in these words: “Act as if the maxim of your action were to become through your will a general natural law.”

Kant's ethical ideas are a logical outcome of his belief in the fundamental freedom of the individual as stated in his Critique of Practical Reason (1788). This freedom he did not regard as the lawless freedom of anarchy, but as the freedom of self-government, the freedom to obey consciously the laws of the universe as revealed by reason. He believed that the welfare of each individual should properly be regarded as an end, and that the world was progressing toward an ideal society in which reason would “bind every lawgiver to make his laws so that they could have sprung from the united will of an entire people, and to regard every subject, in so far as he wishes to be a citizen, on the basis of whether he has conformed to that will.” In his treatise Perpetual Peace (1795) Kant advocated the establishment of a world federation of republican states.

Kant had a greater influence than any other philosopher of modern times. Kantian philosophy, particularly as developed by the German philosopher Georg Wilhelm Friedrich Hegel, was the basis on which the structure of Marxism was built; Hegel's dialectical method, which was used by Karl Marx, was an outgrowth of the method of reasoning by “antinomies” that Kant used. The German philosopher Johann Fichte, Kant's pupil, rejected his teacher's division of the world into objective and subjective parts and developed an idealistic philosophy that also had great influence on 19th-century socialists. One of Kant's successors at the University of Königsberg, J.F. Herbart, incorporated some of Kant's ideas in his system of pedagogy.

Besides works on philosophy, Kant wrote many treatises on various scientific subjects, many in the field of physical geography. His most important scientific work was General Natural History and Theory of the Heavens (1755), in which he advanced the hypothesis that the universe formed from a spinning nebula, a hypothesis that was later developed independently by Pierre Simon de Laplace.

Among Kant's other writings are Prolegomena to Any Future Metaphysics (1783), Metaphysical Rudiments of Natural Philosophy (1786), Critique of Judgment (1790), and Religion Within the Limits of Reason Alone (1793).

Metaphysics is the branch of philosophy that is concerned with the nature of ultimate reality. Metaphysics is customarily divided into ontology, which deals with the question of how many fundamentally distinct sorts of entities compose the universe, and metaphysics proper, which is concerned with describing the most general traits of reality. These general traits together define reality and would presumably characterize any universe whatever. Because these traits are not peculiar to this universe, but are common to all possible universes, metaphysics may be conducted at the highest level of abstraction. Ontology, by contrast, because it investigates the ultimate divisions within this universe, is more closely related to the physical world of human experience.

The term metaphysics is believed to have originated in Rome about 70 BC, with the Greek Peripatetic philosopher Andronicus of Rhodes (flourished 1st century BC) and his edition of the works of Aristotle. In the arrangement of Aristotle's works by Andronicus, the treatise originally called First Philosophy, or Theology, followed the treatise Physics. Hence, the First Philosophy became known as meta (ta) physica, or “following (the) Physics,” later shortened to Metaphysics. The word took on the connotation, in popular usage, of matters transcending material reality. In the philosophic sense, however, particularly as opposed to the use of the word by occultists, metaphysics applies to all reality and is distinguished from other forms of inquiry by its generality.

The subjects treated in Aristotle's Metaphysics (substance, causality, the nature of being, and the existence of God) fixed the content of metaphysical speculation for centuries. Among the medieval Scholastic philosophers, metaphysics was known as the “transphysical science” on the assumption that, by means of it, the scholar philosophically could make the transition from the physical world to a world beyond sense perception. The 13th-century Scholastic philosopher and theologian St. Thomas Aquinas declared that the cognition of God, through a causal study of finite sensible beings, was the aim of metaphysics. With the rise of scientific study in the 16th century the reconciliation of science and faith in God became an increasingly important problem.

Pre-Kantian metaphysics was characterized by a tendency to construct theories based on deductive knowledge, that is, knowledge derived from reason alone, in contradistinction to empirical knowledge, which is gained by reference to the facts of experience. From deductive knowledge were derived general propositions held to be true of all things. The method of inquiry based on deductive principles is known as rationalistic. This method may be subdivided into monism, which holds that the universe is made up of a single fundamental substance; dualism, the belief in two such substances; and pluralism, which proposes the existence of several fundamental substances.

The monists, agreeing that only one basic substance exists, differ in their descriptions of its principal characteristics. Thus, in idealistic monism the substance is believed to be purely mental; in materialistic monism it is held to be purely physical; and in neutral monism it is considered neither exclusively mental nor solely physical. The idealistic position was held by the Irish philosopher George Berkeley, the materialistic by the English philosopher Thomas Hobbes, and the neutral by the Dutch philosopher Baruch Spinoza. The latter expounded a pantheistic view of reality in which the universe is identical with God and everything contains God's substance.

George Berkeley set out to challenge what he saw as the atheism and skepticism inherent in the prevailing philosophy of the early 18th century. His initial publications, which asserted that no objects or matter existed outside the human mind, were met with disdain by the London intelligentsia of the day. Berkeley aimed to explain his “Immaterialist” theory, part of the school of thought known as idealism, to a more general audience in Three Dialogues between Hylas and Philonous (1713).

The most famous exponent of dualism was the French philosopher René Descartes, who maintained that body and mind are radically different entities and that they are the only fundamental substances in the universe. Dualism, however, does not show how these basic entities are connected.

In the work of Gottfried Wilhelm Leibniz, the universe is held to consist of many distinct substances, or monads. This view is pluralistic in the sense that it proposes the existence of many separate entities, and it is monistic in its assertion that each monad reflects within itself the entire universe.

Other philosophers have held that knowledge of reality is not derived from theoretical principles, but is obtained only from experience. This type of metaphysics is called empiricism. Still another school of philosophy has maintained that, although an ultimate reality does exist, it is altogether inaccessible to human knowledge, which is necessarily subjective because it is confined to states of mind. Knowledge is therefore not a representation of external reality, but merely a reflection of human perceptions. This view is known as skepticism or agnosticism in respect to the soul and the reality of God.

Kant's critical philosophy combines both approaches: it is empirical in that it affirms that all knowledge arises from experience and is true of objects of actual and possible experience; and it is rationalistic in that it maintains the speculative character of the structural principles of this empirical knowledge.

These principles are held to be necessary and universal in their application to experience, for in Kant's view the mind furnishes the archetypal forms and categories under which all experience is organized, though they are manifested only in experience. Because their logic precedes experience, these categories or structural principles are transcendental; they transcend all experience, both actual and possible. Although these principles determine all experience, they do not in any way affect the nature of things in themselves. The knowledge of which these principles are the necessary conditions must not be considered, therefore, as constituting a revelation of things as they are in themselves. This knowledge concerns things only insofar as they appear to human perception or as they can be apprehended by the senses. The argument by which Kant sought to fix the limits of human knowledge within the framework of experience and to demonstrate the inability of the human mind to penetrate beyond experience strictly by knowledge to the realm of ultimate reality constitutes the critical feature of his philosophy, giving the key word to the titles of his three leading treatises, Critique of Pure Reason, Critique of Practical Reason, and Critique of Judgment. He maintained that, because God, freedom, and human immortality are noumenal realities, these concepts are understood through moral faith rather than through scientific knowledge. With the continuous development of science, the expansion of metaphysics to include scientific knowledge and methods became one of the major objectives of metaphysicians.

Since the formation of the hypothesis of absolute idealism, the development of metaphysics has resulted in as many types of metaphysical theory as existed in pre-Kantian philosophy, despite Kant's contention that he had fixed definitely the limits of philosophical speculation. Notable among these later metaphysical theories are radical empiricism, or pragmatism, an American form of metaphysics expounded by Charles Sanders Peirce, developed by William James, and adapted as instrumentalism by John Dewey; voluntarism, the foremost exponents of which are the German philosopher Arthur Schopenhauer and the American philosopher Josiah Royce; phenomenalism, exemplified in the writings of the French philosopher Auguste Comte and the British philosopher Herbert Spencer; emergent evolution, or creative evolution, originated by the French philosopher Henri Bergson; and the philosophy of the organism, elaborated by the British mathematician and philosopher Alfred North Whitehead. The salient doctrines of pragmatism are that the chief function of thought is to guide action, that the meaning of concepts is to be sought in their practical applications, and that truth should be tested by the practical effects of belief; according to instrumentalism, ideas are instruments of action, and their truth is determined by their role in human experience. In the theory of voluntarism, the will is postulated as the supreme manifestation of reality. The exponents of phenomenalism, who are sometimes called positivists, contend that everything can be analysed in terms of actual or possible occurrences, or phenomena, and that anything that cannot be analysed in this manner cannot be understood. In emergent or creative evolution, the evolutionary process is characterized as spontaneous and unpredictable rather than mechanistically determined. The philosophy of the organism combines an evolutionary stress on constant process with a metaphysical theory of God, the eternal objects, and creativity.

What is mysticism but an immediate, direct, intuitive knowledge of God or of ultimate reality attained through personal religious experience? Wide variations are found in both the form and the intensity of mystical experience. The authenticity of any such experience, however, is not dependent on the form, but solely on the quality of life that follows the experience. The mystical life is characterized by enhanced vitality, productivity, serenity, and joy as the inner and outward aspects harmonize in union with God.

Daoism (Taoism) emphasizes the importance of unity with nature and of yielding to the natural flow of the universe. This contrasts greatly with Confucianism, another Chinese philosophy, which focuses on society and ethics. The fundamental text of Daoism is traditionally attributed to Laozi, a legendary Chinese philosopher who supposedly lived in the 500s BC.

Elaborate philosophical theories have been developed in an attempt to explain the phenomena of mysticism. Thus, in Hindu philosophy, and particularly in the metaphysical system known as the Vedanta, the self or atman in man is identified with the supreme self, or Brahman, of the universe. The apparent separateness and individuality of beings and events are held to be an illusion (Sanskrit maya), or convention of thought and feeling. This illusion can be dispelled through the realization of the essential oneness of atman and Brahman. When the religious initiate has overcome the beginningless ignorance (Sanskrit avidya) upon which the apparent separability of subject and object, of self and not-self, depends, a mystical state of liberation, or moksha, is attained. The Hindu philosophy of Yoga incorporates perhaps the most comprehensive and rigorous discipline ever designed to transcend the sense of personal identity and to clear the way for an experience of union with the divine self. In China, Confucianism is formalistic and antimystical, but Daoism, as expounded by its traditional founder, the Chinese philosopher Laozi (Lao-tzu), has a strong mystical emphasis.

The philosophical ideas of the ancient Greeks were predominantly naturalistic and rationalistic, but an element of mysticism found expression in the Orphic and other sacred mysteries. A late Greek movement, Neoplatonism, was based on the philosophy of Plato and shows the influence of the mystery religions. The Muslim Sufi sect embraces a form of theistic mysticism closely resembling that of the Vedanta. The doctrines of Sufism found their most memorable expression in the symbolic works of the Persian poets Mohammed Shams od-Din, better known as Hafiz, and Jalal al-Din Rumi, and in the writings of the Persian al-Ghazali. Mysticism of the pre-Christian period is evidenced in the writings of the Jewish-Hellenistic philosopher Philo Judaeus.

The Imitation of Christ, the major devotional work of the medieval German monk Thomas à Kempis, was written more than 500 years ago to aid fellow members of religious orders. The book, simple in language and style, has become one of the most influential works in Christian literature. It is a thoughtful yet practical treatise that guides the reader toward a spiritual union with God through the teachings of Jesus Christ and the monastic qualities of poverty, chastity, and obedience. In it, Kempis urges Christians to live each day as if it might be their last.

Saint Paul was the first great Christian mystic. The New Testament writings best known for their deeply mystical emphasis are Paul’s letters and the Gospel of John. Christian mysticism as a system, however, arose from Neoplatonism through the writings of Dionysius the Areopagite, or Pseudo-Dionysius. The 9th-century Scholastic philosopher John Scotus Erigena translated the works of Pseudo-Dionysius from Greek into Latin and thus introduced the mystical theology of Eastern Christianity into Western Europe, where it was combined with the mysticism of the early Christian prelate and theologian Saint Augustine.

In the Middle Ages mysticism was often associated with monasticism. Many celebrated mystics are found among the monks of both the Eastern church and the Western church, particularly the 14th-century Hesychasts of Mount Athos in the former, and Saints Bernard of Clairvaux, Francis of Assisi, and John of the Cross in the latter. The French monastery of Saint Victor, near Paris, was an important centre of mystical thought in the 12th century. The renowned mystic and Scholastic philosopher Saint Bonaventure was a disciple of the monks of St. Victor. St. Francis, who derived his mysticism directly from the New Testament, without reference to Neoplatonism, remains a dominant figure in modern mysticism. Among the mystics of Holland were Jan van Ruysbroeck and Gerhard Groote, the latter a religious reformer and founder of the monastic order known as the Brothers of the Common Life. Johannes Eckhart, called Meister Eckhart, was the foremost mystic of Germany.

Written by an anonymous English monk in the late 14th century, ‘The Cloud of Unknowing’ has been deeply influential in Christian mysticism. The author stressed the need for contemplation to understand and know God, with the goal of experiencing the spiritual touch of God, and perhaps even achieving a type of spiritual union with God here on earth. He encourages the faithful to meditate as a way of prayer, putting everything but God out of their minds, even if, at first, all they are aware of is a cloud of unknowing.

Other important German mystics are Johannes Tauler and Heinrich Suso, followers of Eckhart and members of a group called the Friends of God. One of this group wrote the German Theology that influenced Martin Luther. A prominent later figure is Thomas à Kempis, generally regarded as the author of The Imitation of Christ. English mystics of the 14th and 15th centuries include Margery Kempe, Richard Rolle, Walter Hilton, Julian of Norwich, and the anonymous author of The Cloud of Unknowing, an influential treatise on mystic prayer.

Several distinguished Christian mystics have been women, notably Hildegard of Bingen, Saint Catherine of Siena, and Saint Teresa of Ávila. The 17th-century French mystic Jeanne Marie Bouvier de la Motte Guyon propagated the mystical doctrine of quietism in France.

Sixteenth-century Spanish mystic and religious reformer Saint Teresa of Ávila’s books on prayer and contemplation frequently dealt with her intense visions of God. Her autobiography, The Life of Saint Teresa of Ávila, written in the 1560s, is frank and unsophisticated in style, and its vocabulary and theology are accessible to the everyday reader. In it, Teresa described the physical and spiritual sensations that accompanied her religious raptures.

By its pursuit of spiritual freedom, sometimes at the expense of theological formulas and ecclesiastical discipline, mysticism may have contributed to the origin of the Reformation, although it inevitably disagreed with Protestant, as it had with Roman Catholic, religious authorities. The Counter Reformation inspired the Spiritual Exercises of Saint Ignatius of Loyola. The Practice of the Presence of God by Brother Lawrence was a classic French work of a later date. The most notable German Protestant mystics were Jakob Boehme, author of Mysterium Magnum (The Great Mystery), and Kaspar Schwenkfeld. Mysticism finds expression in the theology of many Protestant denominations and is a salient characteristic of such sects as the Anabaptists and the Quakers.

The New England Congregational divine Jonathan Edwards exhibited a strong mystical tendency, and the religious revivals that began in his time and spread throughout the United States during the 19th century derived much of their peculiar power from the assumption of mystical principles, great emphasis being placed on heightened feeling as a direct intuition of the will of God. Mysticism manifested itself in England in the works of the 17th-century Cambridge Platonists; in those of the devotional writer William Law, author of A Serious Call to a Devout and Holy Life; and in the art and poetry of William Blake.

The term religious revival has been widely used among Protestants since the early 18th century to denote periods of marked religious interest. Evangelistic preaching and prayer meetings, frequently accompanied by intense emotionalism, are characteristic of such periods, which are intended to renew the faith of church members and to bring others to profess their faith openly for the first time. By an extension of its meaning, the term is sometimes applied to various important religious movements of the past. Instances are recorded in the Scriptures as occurring both in the history of the Jews and in the early history of the Christian church. In the Middle Ages revivals took place in connection with the Crusades and under the influence of the monastic orders, sometimes with strange adjuncts, as with the Flagellants and the dancing mania. The Reformation of the 16th century was also accompanied by revivals of religion.

It is more accurate, however, to limit the application of the term revival to the history of modern Protestantism, especially in Britain and the United States, where such movements have flourished with unusual vigour. The Methodist churches originated from a widespread evangelical movement in the first half of the 18th century. This was later called the Wesleyan movement or Wesleyan revival. The Great Awakening was the common designation for the revival of 1740-42 that took place in New England and other parts of North America under the Congregational clergyman Joseph Bellamy and the Presbyterian clergymen Gilbert Tennent and William Tennent, Jr., together with their father, the educator William Tennent. Both Princeton University and Dartmouth College had their origin in this movement. Toward the end of the 18th century a fresh series of revivals began in America, lasting intermittently from 1797 to 1859. In New England the beginning of this long period was called the evangelical reawakening.

Churches soon came to depend upon revivals for their growth and even for their existence, and, as time went on, the work was also taken up by itinerant preachers, also called circuit riders. The early years of the 19th century were marked by great missionary zeal, extending even to foreign lands. In Tennessee and Kentucky, camp meetings, great open-air assemblies, began about 1800 to play an important part in the evangelical work of the Methodist Church, now the United Methodist Church. One of the most notable products of the camp meeting idea was the late 19th-century Chautauqua Assembly, a highly successful educational endeavour. An outstanding religious revival of the 19th century was the Oxford movement (1833-45) in the Church of England, which resulted in the modern English High Church movement. Distinctly a revival, it was of a type different from those of the two preceding centuries. The great American revival of 1859-61 began in New England, particularly in Connecticut and Massachusetts, and extended to New York and other states. It is believed that in a single year half a million converts were received into the churches. Another remarkable revival, in 1874-75, originated in the labours of the American evangelists Dwight L. Moody and Ira D. Sankey. Organized evangelistic campaigns have sometimes had great success under the leadership of professional evangelists, among them Billy Sunday, Aimee Semple McPherson, and Billy Graham. The Salvation Army carries on its work largely by revivalistic methods.

American religious writer and poet Thomas Merton joined a monastery in 1941 and was later ordained as a Roman Catholic priest. He is known for his autobiography, The Seven Storey Mountain, which was published in 1948.

The 20th century has experienced a revival of interest in both Christian and non-Christian mysticism. Early commentators of note were Austrian Roman Catholic Baron Friedrich von Hügel, British poet and writer Evelyn Underhill, American Quaker Rufus Jones, the Anglican prelate William Inge, and German theologian Rudolf Otto. A prominent nonclerical commentator was American psychologist and philosopher William James in The Varieties of Religious Experience (1902).

At the turn of the century, American psychologist and philosopher William James gave a series of lectures on religion at Scotland’s University of Edinburgh. In the twenty lectures he delivered between 1901 and 1902, published together as The Varieties of Religious Experience (1902), James discussed such topics as the existence of God, religious conversions, and immortality. In his lectures on mysticism, James defined the characteristics of a mystical experience - a state of consciousness in which God is directly experienced. He also quoted accounts of mystical experiences as given by important religious figures from many different religious traditions.

In non-Christian traditions, the leading commentator on Zen Buddhism was Japanese scholar Daisetz Suzuki; on Hinduism, Indian philosopher Sarvepalli Radhakrishnan; and on Islam, British scholar R. A. Nicholson. The last half of the 20th century saw increased interest in Eastern mysticism. The mystical strain in Judaism, which received particular emphasis in the writings of the Kabbalists of the Middle Ages and in the Hasidic movement of the 18th century, was again pointed up by the modern Austrian philosopher and scholar Martin Buber. Mid-20th-century mystics of note included French social philosopher Simone Weil, French philosopher Pierre Teilhard de Chardin, and American Trappist monk Thomas Merton.

Comte chose the word positivism on the ground that it showed the “reality” and “constructive tendency” that he claimed for the theoretical aspect of the doctrine. He was, in the main, interested in a reorganization of social life for the good of humanity through scientific knowledge, and thus control of natural forces. The two primary components of positivism, the philosophy and the polity (or a program of individual and social conduct), were later welded by Comte into a whole under the conception of a religion, in which humanity was the object of worship. Many of Comte's disciples refused, however, to accept this religious development of his philosophy, because it seemed to contradict the original positivist philosophy. Many of Comte's doctrines were later adapted and developed by the British social philosophers John Stuart Mill and Herbert Spencer and by the Austrian philosopher and physicist Ernst Mach.

In the early 20th century British mathematician and philosopher Bertrand Russell, along with British mathematician and philosopher Alfred North Whitehead, attempted to prove that mathematics and numbers can be understood as groups of concepts, or set classifications. Russell and Whitehead tried to show that mathematics is closely related to logic and, in turn, that ordinary sentences can be logically analysed using mathematical symbols for words and phrases. This idea resulted in a new symbolic language, used by Russell in a field he termed philosophical logic, in which philosophical propositions were reformulated and examined according to his symbolic logic.
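As a minimal illustration of the kind of symbolic analysis involved (the example sentence and the abbreviations P, M, and s are added here for clarity and are not Russell's own), the ordinary argument “All philosophers are mortal; Socrates is a philosopher; therefore Socrates is mortal” can be written in the notation that grew out of this program as

\[ \forall x\,\bigl(P(x)\rightarrow M(x)\bigr),\quad P(s)\ \vdash\ M(s) \]

where P(x) abbreviates “x is a philosopher,” M(x) abbreviates “x is mortal,” and s names Socrates, so that the validity of the argument can be checked by purely formal rules rather than by appeal to the meanings of the English words.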

During the early 20th century a group of philosophers who were concerned with developments in modern science rejected the traditional positivist idea that personal experience is the basis of true knowledge and emphasized instead the importance of scientific verification. This group became known as the logical positivists, and it included the Austrian philosopher Ludwig Wittgenstein and the British philosophers Bertrand Russell and G.E. Moore. It was Wittgenstein's Tractatus Logico-philosophicus (1921; German-English parallel text, 1922) that proved to be of decisive influence in the rejection of metaphysical doctrines as meaningless and the acceptance of empiricism as a matter of logical necessity.

Philosophy, for Moore, was basically a two-fold activity. The first part involves analysis, that is, the attempt to clarify puzzling propositions or concepts by indicating less puzzling propositions or concepts to which the originals are held to be logically equivalent. Moore was perplexed, for example, by the claim of some philosophers that time is unreal. In analysing this assertion, he maintained that the proposition “time is unreal” was logically equivalent to “there are no temporal facts.” (“I read the article yesterday” is an example of a temporal fact.) Once the meaning of an assertion containing the problematic concept is clarified, the second task is to determine whether justifying reasons exist for believing the assertion. Moore's diligent attention to conceptual analysis as a means of achieving clarity established him as one of the founders of the contemporary analytic and linguistic emphasis in philosophy.

Moore's most famous work, Principia Ethica (1903), contains his claim that the concept of good refers to a simple, unanalyzable, indefinable quality of things and situations. It is a nonnatural quality, for it is apprehended not by sense experience but by a kind of moral intuition. The quality of goodness is evident, argued Moore, in such experiences as friendship and aesthetic enjoyment. The moral concepts of right and duty are then analysed in terms of producing whatever possesses goodness.

Several of Moore's essays, including “The Refutation of Idealism” (1903), contributed to developments in modern philosophical realism. An empiricist in his approach to knowledge, he did not identify experience with sense experience, and he avoided the skepticism that often accompanies empiricism. He came to the defence of the common-sense point of view that experience results in knowledge of an external world independent of the mind.

Moore also wrote Ethics (1912), Philosophical Studies (1922), and Philosophical Papers (1959) and edited (1921-47) Mind, a leading British philosophical journal.

Nonetheless, language, Wittgenstein argued in the Tractatus, is composed of complex propositions that can be analysed into less complex propositions until one arrives at simple or elementary propositions. Correspondingly, the world is composed of complex facts that can be analysed into less complex facts until one arrives at simple, or atomic, facts. The world is the totality of these facts. According to Wittgenstein’s picture theory of meaning, it is the nature of elementary propositions logically to picture atomic facts, or ‘states of affairs’. He claimed that the nature of language required elementary propositions, and his theory of meaning required that there be atomic facts pictured by the elementary propositions. On this analysis, only propositions that picture facts - the propositions of science - are considered cognitively meaningful. Metaphysical and ethical statements are not meaningful assertions. The logical positivists associated with the Vienna Circle were greatly influenced by this conclusion.

Wittgenstein came to believe, however, that the narrow view of language reflected in the Tractatus was mistaken. In the Philosophical Investigations he argued that if one looks to see how language is used, the variety of linguistic usage becomes clear. Words are like tools, and just as tools serve different functions, so linguistic expressions serve many functions. Although some propositions are used to picture facts, others are used to command, question, pray, thank, curse, and so on. This recognition of linguistic flexibility and variety led to Wittgenstein’s concept of a language game and to the conclusion that people play different language games. The scientist, for example, is involved in a different language game than the theologian. Moreover, the meaning of a proposition must be understood in terms of its context, that is, in terms of the rules of the game of which that proposition is a part. The key to the resolution of philosophical puzzles is the therapeutic process of examining and describing language in use.

Positivists today, who have rejected this so-called Vienna school of philosophy, prefer to call themselves logical empiricists to dissociate themselves from the emphasis of the earlier thinkers on scientific verification. They maintain that the verification principle itself is philosophically unverifiable.

Several major viewpoints were combined in the work of Kant, to whom, as noted above, some positivist ideas may be traced; he developed a distinctive critical philosophy called transcendentalism. His philosophy is agnostic in that it denies the possibility of a strict knowledge of ultimate reality; it is empirical in that it affirms that all knowledge arises from experience and is true of objects of actual and possible experience; and it is rationalistic in that it maintains the theoretical character of the structural principles of this empirical knowledge.

These structural principles, although they determine all experience, do not in any way affect the nature of things in themselves; as noted earlier, the knowledge they make possible concerns things only insofar as they appear to human perception or as they can be apprehended by the senses. Kant sought also to reconcile science and religion in a world of two levels, comprising noumena, objects conceived by reason although not perceived by the senses, and phenomena, things as they appear to the senses and are accessible to material study. He maintained that, because God, freedom, and human immortality are noumenal realities, these concepts are understood through moral faith rather than through scientific knowledge.

In contrast to such rationalist structural principles stands the work of the philosopher John Locke (1632-1704), who founded the school of empiricism. Locke set out his theory of empiricism, the philosophical doctrine holding that all knowledge is based on experience, in An Essay Concerning Human Understanding (1690). Locke believed the human mind to be a blank slate at birth that gathered all its information from its surroundings - starting with simple ideas and combining these simple ideas into more complex ones. His theory greatly influenced education in Great Britain and the United States. Locke believed that education should begin in early childhood and should proceed gradually as the child learns increasingly complex ideas.

Locke was born in the village of Wrington, Somerset, on August 29, 1632. He was educated at the University of Oxford and lectured on Greek, rhetoric, and moral philosophy at Oxford from 1661 to 1664. In 1667 Locke began his association with the English statesman Anthony Ashley Cooper, 1st Earl of Shaftesbury, to whom Locke was friend, adviser, and physician. Shaftesbury secured for Locke a series of minor government appointments. In 1669, in one of his official capacities, Locke wrote a constitution for the proprietors of the Carolina Colony in North America, but it was never put into effect. In 1675, after the liberal Shaftesbury had fallen from favour, Locke went to France. In 1679 he returned to England, but in view of his opposition to the Roman Catholicism favoured by the English monarchy at that time, he soon found it expedient to return to the Continent. From 1683 to 1688 he lived in Holland, and following the so-called Glorious Revolution of 1688 and the restoration of Protestantism to favour, Locke returned once more to England. The new king, William III, appointed Locke to the Board of Trade in 1696, a position from which he resigned because of ill health in 1700. He died at Oates, in Essex, on October 28, 1704.

The ideas of 17th-century English philosopher and political theorist John Locke greatly influenced modern philosophy and political thought. Locke, who is best known for establishing the philosophical doctrine of empiricism, was criticized for his “atheistic” proposition that morality is not innate within human beings. However, Locke was a religious man, and the influence of his faith was overlooked by his contemporaries and subsequent readers. Author John Dunn explores the influence of Locke’s Anglican beliefs on works such as An Essay Concerning Human Understanding (1690).

Locke's empiricism emphasizes the importance of the experience of the senses in the pursuit of knowledge, rather than intuitive speculation or deduction. The empiricist doctrine was first expounded by the English philosopher and statesman Francis Bacon early in the 17th century, but Locke gave it systematic expression in his Essay Concerning Human Understanding (1690). He regarded the mind of a person at birth as a tabula rasa, a blank slate upon which experience imprinted knowledge, and did not believe in intuition or theories of innate conceptions. Locke also held that all persons are born good, independent, and equal.

English philosopher John Locke anonymously published his Two Treatises of Government (1690) the same year as his famous Essay Concerning Human Understanding. In the Second Treatise, Locke described his concept of a ‘civil government’. Locke excluded absolute monarchy from his definition of civil society, because he believed that the people must consent to be ruled. This argument later influenced the authors of the Declaration of Independence and the Constitution of the United States.

Locke's views, in his Two Treatises of Government (1690), attacked the theory of divine right of kings and the nature of the state as conceived by the English philosopher and political theorist Thomas Hobbes. In brief, Locke argued that sovereignty did not reside in the state but with the people, and that the state is supreme, but only if it is bound by civil and what he called ‘natural’ law. Many of Locke's political ideas, such as that relating to natural rights, property rights, the duty of the government to protect these rights, and the rule of the majority, were later embodied in the U.S. Constitution.

Locke further held that revolution was not only a right but often an obligation, and he advocated a system of checks and balances in government. He also believed in religious freedom and in the separation of church and state.

Locke's influence in modern philosophy has been profound and, with his application of empirical analysis to ethics, politics, and religion, he remains one of the most important and controversial philosophers of all time. Among his other works are Some Thoughts Concerning Education (1693) and The Reasonableness of Christianity (1695).

In accord with this empiricist temper, pragmatism is a philosophical movement that has had a major impact on American culture from the late 19th century to the present. Pragmatism calls for ideas and theories to be tested in practice, by assessing whether acting upon the idea or theory produces desirable or undesirable results. According to pragmatists, all claims about truth, knowledge, morality, and politics must be tested in this way. Pragmatism has been critical of traditional Western philosophy, especially the notion that there are absolute truths and absolute values. Although pragmatism was popular for a time in France, England, and Italy, most observers believe that it encapsulates an American faith in understanding and practicality and an equally American distrust of abstract theories and ideologies.

American psychologist and philosopher William James helped to popularize the philosophy of pragmatism with his book Pragmatism: A New Name for Old Ways of Thinking (1907). Influenced by a theory of meaning and verification developed for scientific hypotheses by American philosopher C.S. Peirce, James held that truth is what works, or has good experimental results. In a related theory, James argued that the existence of God is partly verifiable because many people derive benefits from believing.

The Association for International Conciliation first published William James’s pacifist statement, “The Moral Equivalent of War,” in 1910. James, a highly respected philosopher and psychologist, was one of the founders of pragmatism, a philosophical movement holding that ideas and theories must be tested in practice to assess their worth. James hoped to find a way to convince men with a long-standing history of pride and glory in war to evolve beyond the need for bloodshed and to develop other avenues for conflict resolution.

Pragmatists regarded all theories and institutions as tentative hypotheses and solutions, and for this reason they believed that efforts to improve society, through such means as education or politics, must be geared toward problem solving and must be ongoing. Through their emphasis on connecting theory to practice, pragmatist thinkers attempted to transform all areas of philosophy, from metaphysics to ethics and political philosophy.

Pragmatism sought a middle ground between traditional ideas about the nature of reality and the radical theories of nihilism and irrationalism that had become popular in Europe in the late 19th century. Traditional metaphysics assumed that the world has a fixed, intelligible structure and that human beings can know absolute or objective truths about the world and about what constitutes moral behaviour. Nihilism and irrationalism, on the other hand, denied those very assumptions and their certitude. Pragmatists today still try to steer a middle course between contemporary offshoots of these two extremes.

The ideas of the pragmatists were considered revolutionary when they first appeared. To some critics, pragmatism’s refusal to affirm any absolutes carried negative implications for society. For example, pragmatists do not believe that a single absolute idea of goodness or justice exists, but rather that these concepts are changeable and depend on the context in which they are being discussed. The absence of these absolutes, critics feared, could result in a decline in moral standards. The pragmatists’ denial of absolutes, moreover, challenged the foundations of religion, government, and schools of thought. As a result, pragmatism influenced developments in psychology, sociology, education, semiotics (the study of signs and symbols), and scientific method, as well as philosophy, cultural criticism, and social reform movements. Various political groups have also drawn on the assumptions of pragmatism, from the progressive movements of the early 20th century to later experiments in social reform.

Pragmatism is best understood in its historical and cultural context. It arose during the late 19th century, a period of rapid scientific advancement typified by the work of British biologist Charles Darwin, whose theories suggested to many thinkers that humanity and society are in a perpetual state of progress. During this same period a decline in traditional religious beliefs and values accompanied the industrialization and material progress of the time. In consequence it became necessary to rethink fundamental ideas about values, religion, science, community, and individuality.

The three most important pragmatists are the American philosophers Charles Sanders Peirce, William James, and John Dewey. Peirce was primarily interested in scientific method and mathematics; his objective was to infuse scientific thinking into philosophy and society, and he believed that human comprehension of reality was becoming ever greater and that human communities were becoming increasingly progressive. Peirce developed pragmatism as a theory of meaning - in particular, the meaning of concepts used in science. The meaning of the concept ‘brittle’, for example, is given by the observed consequences or properties that objects called ‘brittle’ exhibit. For Peirce, the only rational way to increase knowledge was to form mental habits that would test ideas through observation, experimentation, or what he called inquiry. The logical positivists, a group of philosophers influenced by Peirce, believed that our evolving species was fated to get ever closer to Truth. Logical positivists emphasize the importance of scientific verification, rejecting the assertion of positivism that personal experience is the basis of true knowledge.

James moved pragmatism in directions that Peirce strongly disliked. He generalized Peirce’s doctrines to encompass all concepts, beliefs, and actions; he also applied pragmatist ideas to truth as well as to meaning. James was primarily interested in showing how systems of morality, religion, and faith could be defended in a scientific civilization. He argued that sentiment, as well as logic, is crucial to rationality and that the great issues of life - morality and religious belief, for example - are leaps of faith. As such, they depend upon what he called ‘the will to believe’ and not merely on scientific evidence, which can never tell us what to do or what is worthwhile. Critics charged James with relativism (the belief that values depend on specific situations) and with crass expediency for proposing that if an idea or action works the way one intends, it must be right. But James can more accurately be described as a pluralist - someone who believes the world to be far too complex for any one philosophy to explain everything.

Dewey’s philosophy can be described as a foundational version of philosophical naturalism, which regards human experience, intelligence, and communities as ever-evolving mechanisms. Using their experience and intelligence, Dewey believed, human beings can solve problems, including social problems, through inquiry. For Dewey, naturalism led to the idea of a democratic society that allows all members to acquire social intelligence and progress both as individuals and as communities. Dewey held that traditional ideas about knowledge, truth, and values, in which absolutes are assumed, are incompatible with a broadly Darwinian world-view in which individuals and societies are progressing. In consequence, he felt that these traditional ideas must be discarded or revised. For pragmatists, everything people know and do depends on a historical context and is thus tentative rather than absolute.

Many followers and critics of Dewey believe he advocated elitism and social engineering in his philosophical stance. Others think of him as a kind of romantic humanist. Both tendencies are evident in Dewey’s writings, although he aspired to synthesize the two realms.

The pragmatist tradition was revitalized in the 1980s by American philosopher Richard Rorty, who has faced similar charges of elitism for his belief in the relativism of values and his emphasis on the role of the individual in attaining knowledge. Interest has renewed in the classic pragmatists - Peirce, James, and Dewey - as an alternative to Rorty’s interpretation of the tradition.

In an ever-changing world, pragmatism has many benefits. It defends social experimentation as a means of improving society, accepts pluralism, and rejects dead dogmas. But a philosophy that offers no final answers or absolutes and that appears vague as a result of trying to harmonize opposites may also be unsatisfactory to some.

It may prove fitting, at this point, to turn to some of Kant's most distinguished followers, notably Johann Gottlieb Fichte, Friedrich Schelling, Georg Wilhelm Friedrich Hegel, and Friedrich Schleiermacher, who negated Kant's criticism in their elaborations of his transcendental metaphysics by denying the Kantian conception of the thing-in-itself. They thus developed an absolute idealism in opposition to Kant's critical transcendentalism.

Since the formulation of absolute idealism, the development of metaphysics has resulted in as many types of metaphysical theory as existed in pre-Kantian philosophy, despite Kant's contention that he had fixed definitively the limits of philosophical speculation. Among them are phenomenalism, as exemplified in the writings of the French philosopher Auguste Comte and the British philosopher Herbert Spencer; emergent evolution, or creative evolution, originated by the French philosopher Henri Bergson; the philosophy of the organism, elaborated by Alfred North Whitehead; pragmatism; instrumentalism; and voluntarism. The salient doctrines of pragmatism are that the chief function of thought is to guide action, that the meaning of concepts is to be sought in their practical applications, and that truth should be tested by the practical effects of belief. According to instrumentalism, ideas are instruments of action, and their truth is determined by their role in human experience. In the teachings of voluntarism, the will is held to be the supreme manifestation of reality. The exponents of phenomenalism, who are sometimes called positivists, contend that everything can be analysed in terms of actual or possible occurrences, or phenomena, and that anything that cannot be analysed in this manner cannot be understood. In emergent or creative evolution, the evolutionary process is characterized as spontaneous and unpredictable rather than mechanistically determined. The philosophy of the organism combines an evolutionary stress on constant process with a metaphysical theory of God, the eternal objects, and intuitive creativity.

Comte chose the word positivism on the ground that it suggested the ‘reality’ and ‘constructive tendency’ that he claimed for the theoretical aspect of the doctrine. He was, in the main, interested in a reorganization of social life for the good of humanity through scientific knowledge, and thus through the control of natural forces. The two primary components of positivism, the philosophy and the polity (or program of individual and social conduct), were later welded by Comte into a whole under the conception of a religion, in which humanity was the object of worship.

In response to the scientific, political, and industrial revolution of his day, Comte was fundamentally concerned with an intellectual, moral, and political reorganization of the social order. Adoption of the scientific attitude was the key, he thought, to such a reconstruction.

Comte also argued that an empirical study of historical processes, particularly of the progress of the various interrelated sciences, reveals a law of three stages that governs human development. He analysed these stages in his major work, the six-volume Course of Positive Philosophy (1830-42; translated 1853). Because of the nature of the human mind, each science or branch of knowledge passes through “three different theoretical states: the theological or fictitious state; the metaphysical or abstract state; and, lastly, the scientific or positive state.” At the theological stage, events are immaturely explained by appealing to the will of the gods or of God. At the metaphysical stage phenomena are explained by appealing to abstract philosophical categories. The final evolutionary stage, the scientific, involves relinquishing any quest for absolute explanations of causes. Attention is focussed altogether on how phenomena are related, with the aim of arriving at generalizations subject to observational verification. Comte's work is considered the classical expression of the positivist attitude - namely, that the empirical sciences are the only adequate source of knowledge.

Although he rejected belief in a transcendent being, Comte recognized the value of religion in contributing to social stability. In his four-volume System of Positive Polity (1851-54; translated 1875-77), he proposed his religion of humanity, aimed at promoting socially beneficial behaviour. Comte's chief significance, however, derives from his role in the historical development of positivism.


During the early 20th century a group of philosophers concerned with developments in modern science rejected the traditional positivist idea that personal experience is the basis of true knowledge and emphasized instead the importance of scientific verification. This group became known as the logical positivists, and it was closely associated with the Austrian Ludwig Wittgenstein and the British philosophers Bertrand Russell and G.E. Moore. It was Wittgenstein's Tractatus Logico-philosophicus (1921; German-English parallel text, 1922) that proved to be of decisive influence in the rejection of metaphysical doctrines as meaningless and in the acceptance of empiricism.

The positivists today, who have rejected this so-called Vienna school of philosophy, prefer to call themselves logical empiricists to dissociate themselves from the emphasis of the earlier thinkers on scientific verification. They maintain that the verification principle itself is philosophically unverifiable.

Edmund Husserl inherited from Brentano the view that the central problem in understanding thought is that of explaining the way in which an intentional direction, or content, can belong to the mental phenomenon that exhibits it. What Husserl discovered when he contemplated the content of his mind were such acts as remembering, desiring, and perceiving, as well as the abstract content of these acts, which Husserl called meanings. These meanings, he claimed, enabled an act to be directed toward an object under a certain aspect. Such directedness, called intentionality, he held to be the essence of consciousness. Transcendental phenomenology, according to Husserl, was the study of the basic components of the meanings that make intentionality possible. Later, in the Méditations Cartésiennes (1931; Cartesian Meditations, 1960), he introduced genetic phenomenology, which he defined as the study of how these meanings are built up in the course of experience.

Edmund Husserl is considered the founder of phenomenology. This 20th-century philosophical movement is dedicated to the description of phenomena as they present themselves through perception to the conscious mind.

Edmund Husserl introduced the term in his book Ideen zu einer reinen Phänomenologie und phänomenologischen Philosophie (1913; Ideas: A General Introduction to Pure Phenomenology, 1931). Early followers of Husserl, such as the German philosopher Max Scheler, who was influenced by his earlier book Logische Untersuchungen (two volumes, 1900 and 1901; Logical Investigations, 1970), claimed that the task of phenomenology is to study essences, such as the essence of the emotions. Although Husserl himself never gave up his early interest in essences, he later held that only the essences of certain special conscious structures are the proper object of phenomenology. As formulated by Husserl after 1910, phenomenology is the study of the structures of consciousness that enable consciousness to refer to objects outside itself. This study requires reflection on the content of the mind to the exclusion of everything else. Husserl called this type of reflection the phenomenological reduction. Because the mind can be directed toward nonexistent as well as real objects, Husserl noted that phenomenological reflection does not presuppose that anything exists, but rather amounts to a ‘bracketing of existence’ - that is, setting aside the question of the real existence of the contemplated object.

Husserl argued against his early position, which he called psychologism, in the Logical Investigations (1900-1901; translated 1970). In this book, regarded as a radical departure in philosophy, he contended that the philosopher's task is to contemplate the essences of things, and that the essence of an object can be arrived at by systematically varying that object in the imagination. Husserl noted that consciousness is always directed toward something. He called this directedness intentionality and argued that consciousness contains ideal, unchanging structures called meanings, which determine what object the mind is directed toward at any given time.

During his tenure (1901-1916) at the University of Göttingen, Husserl attracted many students, who began to form a distinct phenomenological school, and he wrote his most influential work, Ideas: A General Introduction to Pure Phenomenology (1913; translated 1931). In this book Husserl introduced the term phenomenological reduction for his method of reflection on the meanings the mind employs when it contemplates an object. Because this method concentrates on meanings that are in the mind, whether or not the object present to consciousness actually exists, he was able to give detailed analyses of the mental structures involved in perceiving particular types of objects, describing in detail, for instance, his perception of the apple tree in his garden. Thus, although phenomenology does not assume the existence of anything, it is nonetheless a descriptive discipline; according to Husserl, phenomenology is devoted, not to inventing theories, but rather to describing the “things themselves.”

After 1916 Husserl taught at the University of Freiburg. Phenomenology had been criticized as an essentially solipsistic method, confining the philosopher to the contemplation of private meanings, so in the Cartesian Meditations (1931; translated 1960) Husserl attempted to show how the individual consciousness can be directed toward other minds, society, and history. Husserl died in Freiburg on April 26, 1938.

Husserl's phenomenology had a great influence on a younger colleague at Freiburg, Martin Heidegger, who developed existential phenomenology, and on Jean-Paul Sartre and French existentialism. Phenomenology remains one of the most vigorous tendencies in contemporary philosophy, and its impact has also been felt in theology, linguistics, psychology, and the social sciences.


Phenomenology attempts to describe reality as pure experience by suspending all beliefs and assumptions about the world. Though first defined as descriptive psychology, phenomenology developed into a philosophical rather than a psychological investigation of the nature of human beings. Influenced by his colleague Edmund Husserl, the German philosopher Martin Heidegger published Sein und Zeit (Being and Time) in 1927, an effort to describe the phenomenon of being by considering the full scope of existence.

All phenomenologists follow Husserl in attempting to use pure description. Thus, they all subscribe to Husserl's slogan ‘To the things themselves’. They differ among themselves, however, over whether the phenomenological reduction can be carried out and over what is manifest to the philosopher giving a pure description of experience. Martin Heidegger, Husserl's colleague and most brilliant critic, claimed that phenomenology should make manifest what is hidden in ordinary, everyday experience. He therefore endeavoured, in Being and Time, to describe what he called the structure of everydayness, or being-in-the-world, which he found to be an interconnected system of equipment, social roles, and purposes.

Martin Heidegger strongly influenced the development of the 20th-century philosophical school of existential phenomenology, which examines the relationship between phenomena and individual consciousness. His inquiries into the meaning of ‘authentic’ or ‘inauthentic’ existence greatly influenced a broad range of thinkers, including French existentialist Jean-Paul Sartre. Author Michael Inwood explores Heidegger’s key concept of Dasein, or “Being,” which was first expounded in his major work Being and Time.

Because, for Heidegger, one is what one does in the world, a phenomenological reduction to one's own private experience is impossible. Because human action consists of a direct grasp of objects, positing a special mental entity called a meaning to account for intentionality is not necessary. For Heidegger, being thrown into the world among things in the act of realizing projects is a more fundamental kind of intentionality than that revealed in merely staring at or thinking about objects, and it is this more fundamental intentionality that makes possible the directedness analysed by Husserl.

In the mid-1900s, the French existentialist Jean-Paul Sartre attempted to adapt Heidegger's phenomenology to the philosophy of consciousness, in effect returning to the approach of Husserl. Sartre agreed with Husserl that consciousness is always directed at objects but criticized his claim that such directedness is possible only by means of special mental entities called meanings. The French philosopher Maurice Merleau-Ponty rejected Sartre's view that phenomenological description reveals human beings to be pure, isolated, and free consciousnesses. He stressed the role of the active, involved body in all human knowledge, thus generalizing Heidegger's insights to include the analysis of perception. Like Heidegger and Sartre, Merleau-Ponty is an existential phenomenologist, in that he denies the possibility of bracketing existence.

Phenomenology has had a pervasive influence on 20th-century thought. Phenomenological versions of theology, sociology, psychology, psychiatry, and literary criticism have been developed, and phenomenology remains one of the most important schools of contemporary philosophy.


Besides Husserl, Heidegger was especially influenced by the pre-Socratics, by Danish philosopher Søren Kierkegaard, and by German philosopher Friedrich Nietzsche. In developing his theories, Heidegger rejected traditional philosophic terminology in favour of an individual interpretation of the works of past thinkers. He applied original meanings and etymologies to individual words and expressions, and coined hundreds of new, complex words. Heidegger was concerned with what he considered the essential philosophical question: What is it, to be? This led to the question of what kind of ‘Being’ human beings have. They are, he said, thrown into a world that they have not made but that consists of potentially useful things, including cultural and natural objects. Because these objects come to humanity from the past and are used in the present for the sake of future goals, Heidegger posited a fundamental relation between the mode of being of objects, of humanity, and of the structure of time.

The individual is, however, always in danger of being submerged in the world of objects, everyday routine, and the conventional, shallow behaviour of the crowd. The feeling of dread (Angst) brings the individual to a confrontation with death and the ultimate meaninglessness of life, but only in this confrontation can an authentic sense of Being and of freedom be attained.

After 1930, Heidegger turned, in such works as Einführung in die Metaphysik (An Introduction to Metaphysics, 1953), to the interpretation of particular Western conceptions of Being. He felt that, in contrast to the reverent ancient Greek conception of being, modern technological society has fostered a purely manipulative attitude that has deprived Being and human life of meaning - a condition he called nihilism. Humanity has forgotten its true vocation and must recover the deeper understanding of Being (achieved by the early Greeks and lost by subsequent philosophers) to be receptive to new understandings of Being.

Heidegger's original treatment of such themes as human finitude, death, nothingness, and authenticity led many observers to associate him with existentialism, and his work had a crucial influence on French existentialist Jean-Paul Sartre. Heidegger, however, eventually repudiated existentialist interpretations of his work. His thought directly influenced the work of the French philosophers Michel Foucault and Jacques Derrida and of the German sociologist Jürgen Habermas. Since the 1960s his influence has spread beyond continental Europe and has had an increasing impact on philosophy in English-speaking countries worldwide.


Maurice Merleau-Ponty (1908-1961) was a French existentialist philosopher whose phenomenological studies of the role of the body in perception and society opened a new field of philosophical investigation. He taught at the University of Lyon, at the Sorbonne, and, after 1952, at the Collège de France. His first important work was The Structure of Behaviour (1942; translated 1963), an interpretative analysis of behaviourism. His major work, Phenomenology of Perception (1945; translated 1962), is a detailed study of perception, influenced by the German philosopher Edmund Husserl's phenomenology and by Gestalt psychology. In it, he argues that science presupposes an original and unique perceptual relation to the world that cannot be explained or even described in scientific terms. The book can be viewed as a critique of cognitivism - the view that the workings of the human mind can be understood in terms of rules or programs - and as a telling critique of the existentialism of his contemporary Jean-Paul Sartre, showing how human freedom is never total, as Sartre claimed, but is limited by our embodied situation.

Born in Vienna on April 26, 1889, Wittgenstein was raised in a wealthy and cultured family. After attending schools in Linz and Berlin, he went to England to study engineering at the University of Manchester. His interest in pure mathematics led him to Trinity College, University of Cambridge, to study with Bertrand Russell. There he turned his attention to philosophy. By 1918 Wittgenstein had completed his Tractatus Logico-philosophicus (1921; translated 1922), a work he then believed provided the “solution” to philosophical problems. Subsequently, he turned away from philosophy and for several years taught elementary school in an Austrian village. In 1929 he returned to Cambridge to resume his work in philosophy and was appointed to the faculty of Trinity College. Soon he began to reject certain conclusions of the Tractatus and to develop the position reflected in his Philosophical Investigations (published posthumously in 1953). Wittgenstein retired in 1947; he died in Cambridge on April 29, 1951. A sensitive, intense man who often sought solitude and was frequently depressed, Wittgenstein abhorred pretense and was noted for his simple style of life and dress. The philosopher was forceful and confident in personality, however, and he exerted considerable influence on those with whom he came in contact.

Wittgenstein’s philosophical life may be divided into two distinct phases: an early period, represented by the Tractatus, and a later period, represented by the Philosophical Investigations. Throughout most of his life, however, Wittgenstein consistently viewed philosophy as linguistic or conceptual analysis. In the Tractatus he argued that “philosophy aims at the logical clarification of thoughts.” In the Philosophical Investigations, however, he maintained that “philosophy is a battle against the bewitchment of our intelligence by means of language.”

Language, Wittgenstein argued in the Tractatus, is composed of complex propositions that can be analysed into less complex propositions until one arrives at simple or elementary propositions. Correspondingly, the world is composed of complex facts that can be analysed into less complex facts until one arrives at simple, or atomic, facts. The world is the totality of these facts. According to Wittgenstein’s picture theory of meaning, it is the nature of elementary propositions logically to picture atomic facts, or “states of affairs.” He claimed that the nature of language required elementary propositions, and his theory of meaning required that there be atomic facts pictured by the elementary propositions. On this analysis, only propositions that picture facts - the propositions of science - are considered cognitively meaningful. Metaphysical and ethical statements are not meaningful assertions. The logical positivists associated with the Vienna Circle were greatly influenced by this conclusion.

Wittgenstein came to believe, nonetheless, that the narrow view of language reflected in the Tractatus was mistaken. In the Philosophical Investigations he argued that if one looks to see how language is used, the variety of linguistic usage becomes clear. Although some propositions are used to picture facts, others are used to command, question, pray, thank, curse, and so on. This recognition of linguistic flexibility and variety led to Wittgenstein’s concept of a language game and to the conclusion that people play different language games. The scientist, for example, is involved in a different language game than the theologian. Moreover, the meaning of a proposition must be understood in terms of its context, that is, in terms of the rules of the game of which that proposition is a part. The key to the resolution of philosophical puzzles is the therapeutic process of examining and describing language in use.


‘Consciousness’, Nietzsche wrote in The Gay Science (1882), ‘is the latest development of the organic and so what is most unfinished and unstrong.’ The Gay Science also belongs to the period in which Nietzsche several times spoke out against antisemitism. Although it had been easy for him to overlook Wagner’s antisemitism during the period in which he idealized him, once Wagner gained a wider public and his antisemitism became more intense, Nietzsche forcefully condemned him for it. In The Gay Science Nietzsche wrote that ‘Wagner is Schopenhauerian in his hatred of the Jews, to whom he is not able to do justice even when it comes to their greatest deed: after all, the Jews are the inventors of Christianity.’ It is important to recognize that, while Nietzsche frequently attacked the forces that led to the development of Christianity and what he saw as its destructive impact, there is no simple condemnation here. Nietzsche is genuinely castigating Wagner (and Schopenhauer) while acknowledging this greatest deed of the Jews - a deed whose consequences may have produced a deeply neurotic creature, but nevertheless a creature that brought into the world something new and full of promise. We should also bear in mind, in the words of Bernard Williams, ‘Nietzsche’s ever-present sense that his own consciousness would not be possible without the developments that he disliked’.

Scientists have long considered the nature of consciousness without producing a fully satisfactory definition. In the early 20th century the American philosopher and psychologist William James suggested that consciousness is a mental process involving both attention to external stimuli and short-term memory. Later scientific explorations of consciousness mostly expanded upon James’s work. In an article from a 1997 special issue of Scientific American, Nobel laureate Francis Crick, who helped determine the structure of DNA, and fellow biophysicist Christof Koch explain how experiments on vision might deepen our understanding of consciousness.

States of Consciousness. No simple, agreed-upon definition of consciousness exists. Attempted definitions tend to be tautological (for example, consciousness defined as awareness) or merely descriptive (for example, consciousness described as sensations, thoughts, or feelings). Despite this problem of definition, the subject of consciousness has had a remarkable history. At one time the primary subject matter of psychology, consciousness as an area of study suffered an almost total demise, later reemerging to become a topic of current interest.

French thinker René Descartes applied rigorous scientific methods of deduction to his exploration of philosophical questions. Descartes is probably best known for his pioneering work in philosophical skepticism. Author Tom Sorell examines the concepts behind Descartes’s work Meditationes de Prima Philosophia (1641; Meditations on First Philosophy), focussing on its unconventional use of logic and the reactions it aroused.

Most of the philosophical discussions of consciousness arose from the mind-body issues posed by the French philosopher and mathematician René Descartes in the 17th century. Descartes asked: Is the mind, or consciousness, independent of matter? Is consciousness extended (physical) or unextended (nonphysical)? Is consciousness determinative, or is it determined? English philosophers such as John Locke equated consciousness with physical sensations and the information they provide, whereas European philosophers such as Gottfried Wilhelm Leibniz and Immanuel Kant gave a more central and active role to consciousness.

The philosopher who most directly influenced subsequent exploration of the subject of consciousness was the 19th-century German educator Johann Friedrich Herbart, who wrote that ideas had quality and intensity and that they may inhibit or facilitate one another. Thus, ideas may pass from “states of reality” (consciousness) to “states of tendency” (unconsciousness), with the dividing line between the two states being described as the threshold of consciousness. This formulation of Herbart clearly presages the development, by the German psychologist and physiologist Gustav Theodor Fechner, of the psychophysical measurement of sensation thresholds, and the later development by Sigmund Freud of the concept of the unconscious.

The experimental analysis of consciousness dates from 1879, when the German psychologist Wilhelm Max Wundt started his research laboratory. For Wundt, the task of psychology was the study of the structure of consciousness, which extended well beyond sensations and included feelings, images, memory, attention, duration, and movement. Because early interest focussed on the content and dynamics of consciousness, it is not surprising that the central methodology of such studies was introspection; that is, subjects reported on the mental contents of their own consciousness. This introspective approach was developed most fully by the American psychologist Edward Bradford Titchener at Cornell University. Setting his task as that of describing the structure of the mind, Titchener attempted to detail, from introspective self-reports, the dimensions of the elements of consciousness. For example, taste was “dimensionalized” into four basic categories: sweet, sour, salt, and bitter. This approach was known as structuralism.

By the 1920s, however, a remarkable revolution had occurred in psychology that was essentially to remove considerations of consciousness from psychological research for some 50 years: Behaviourism captured the field of psychology. The main initiator of this movement was the American psychologist John Broadus Watson. In a 1913 article, Watson stated, “I believe that we can write a psychology and never use the terms consciousness, mental states, mind . . . imagery and the like.” Psychologists then turned almost exclusively to behaviour, as described in terms of stimulus and response, and consciousness was totally bypassed as a subject. A survey of eight leading introductory psychology texts published between 1930 and the 1950s found no mention of the topic of consciousness in five texts, and in two it was treated as a historical curiosity.

Beginning in the late 1950s, however, interest in the subject of consciousness returned, specifically in those subjects and techniques relating to altered states of consciousness: sleep and dreams, meditation, biofeedback, hypnosis, and drug-induced states. Much of the surge in sleep and dream research was directly fuelled by a discovery relevant to the nature of consciousness. A physiological indicator of the dream state was found: At roughly 90-minute intervals, the eyes of sleepers were observed to move rapidly, and at the same time the sleepers' brain waves would show a pattern resembling the waking state. When people were awakened during these periods of rapid eye movement, they almost always reported dreams, whereas if awakened at other times they did not. This and other research clearly indicated that sleep, once considered a passive state, was instead an active state of consciousness (see Dreaming; Sleep).

During the 1960s, an increased search for “higher levels” of consciousness through meditation resulted in a growing interest in the practices of Zen Buddhism and Yoga from Eastern cultures. A full flowering of this movement in the United States was seen in the development of training programs, such as Transcendental Meditation, that were self-directed procedures of physical relaxation and focussed attention. Biofeedback techniques also were developed to bring body systems involving factors such as blood pressure or temperature under voluntary control by providing feedback from the body, so that subjects could learn to control their responses. For example, researchers found that persons could control their brain-wave patterns to some extent, particularly the so-called alpha rhythms generally associated with a relaxed, meditative state. This finding was especially relevant to those interested in consciousness and meditation, and a number of “alpha training” programs emerged.

Another subject that led to increased interest in altered states of consciousness was hypnosis, which involves a transfer of conscious control from the subject to another person. Hypnotism has had a long and intricate history in medicine and folklore and has been intensively studied by psychologists. Much has become known about the hypnotic state, relative to individual suggestibility and personality traits; the subject has now been largely demythologized, and the limitations of the hypnotic state are fairly well known. Despite the increasing use of hypnosis, however, much remains to be learned about this unusual state of focussed attention.

Finally, many people in the 1960s experimented with the psychoactive drugs known as hallucinogens, which produce disorders of consciousness. The most prominent of these drugs are lysergic acid diethylamide, or LSD; mescaline (see Peyote); and psilocybin; the latter two have long been associated with religious ceremonies in various cultures. LSD, because of its radical thought-modifying properties, was initially explored for its so-called mind-expanding potential and for its psychotomimetic effects (imitating psychoses). Little positive use, however, has been found for these drugs, and their use is highly restricted.


As the concept of a direct, simple linkage between environment and behaviour became unsatisfactory in recent decades, the interest in altered states of consciousness may be taken as a visible sign of renewed interest in the topic of consciousness. That persons are active and intervening participants in their behaviour has become increasingly clear. Environments, rewards, and punishments are not simply defined by their physical character. Memories are organized, not simply stored (see Memory). An entirely new area called cognitive psychology has emerged that centres on these concerns. In the study of children, increased attention is being paid to how they understand, or perceive, the world at different ages. In the field of animal behaviour, researchers increasingly emphasize the inherent characteristics resulting from the way a species has been shaped to respond adaptively to the environment. Humanistic psychologists, with a concern for self-actualization and growth, have emerged after a long period of silence. Throughout the development of clinical and industrial psychology, the conscious states of persons in terms of their current feelings and thoughts were of obvious importance. The role of consciousness, however, was often de-emphasised in favour of unconscious needs and motivations. Trends can be seen, however, toward a new emphasis on the nature of states of consciousness.

The overwhelming question in neurobiology today is the relation between the mind and the brain. Everyone agrees that what we know as mind is closely related to certain aspects of the behaviour of the brain, not to the heart, as Aristotle thought. Its most mysterious aspect is consciousness or awareness, which can take many forms, from the experience of pain to self-consciousness. In the past the mind (or soul) was often regarded, as it was by Descartes, as something immaterial, separate from the brain but interacting with it in some way. A few neuroscientists, such as Sir John Eccles, still assert that the soul is distinct from the body. Most neuroscientists, however, now believe that all aspects of mind, including its most puzzling attribute - consciousness, or awareness - are likely to be explainable in a more materialistic way as the behaviour of large sets of interacting neurons. As William James, the father of American psychology, said a century ago, consciousness is not a thing but a process.

Exactly what that process is has yet to be discovered. For many years after James penned The Principles of Psychology, consciousness was a taboo concept in American psychology because of the dominance of the behaviourist movement. With the advent of cognitive science in the mid-1950s, it became possible again for psychologists to consider mental processes as opposed to merely observing behaviour. In spite of these changes, until recently most cognitive scientists ignored consciousness, as did most neuroscientists. The problem was felt to be either purely ‘philosophical’ or too elusive to study experimentally. Getting a grant just to study consciousness would not have been easy for a neuroscientist.

Such timidity is ridiculous; the sensible response is to think about how best to attack the problem scientifically: how might mental events be explained as caused by the firing of large sets of neurons? Although there are those who believe such an approach is hopeless, it is not productive to worry too much over aspects of the problem that cannot be solved scientifically - or, more precisely, that cannot be solved solely by using existing scientific ideas. Radically new concepts may indeed be needed; recall the modifications of scientific thinking forced upon us by quantum mechanics. The only sensible approach is to press the experimental attack until we are confronted with dilemmas that call for new ways of thinking.

There are many possible approaches to the problem of consciousness. Some psychologists feel that any satisfactory theory should try to explain as many aspects of consciousness as possible, including emotion, imagination, dreams, mystical experiences and so on. Although such an all-embracing theory will eventually be necessary, it is wiser to begin with the particular aspect of consciousness that is likely to yield most easily. What this aspect is may be a matter of personal judgment; the mammalian visual system is a natural choice, because humans are very visual animals and because so much experimental and theoretical work has already been done on it.

Grasping exactly what we need to explain is not easy, and it will take many careful experiments before visual consciousness can be described scientifically. No attempt is made here to define consciousness itself, because of the dangers of premature definition. (If this seems like a cop-out, try defining the word ‘gene’ - you will not find it easy.) Yet the experimental evidence that already exists provides enough of a glimpse of the nature of visual consciousness to guide research.

Visual theorists agree that the problem of visual consciousness is ill-posed. The mathematical term ‘ill-posed’ means that additional constraints are needed to solve the problem. Although the main function of the visual system is to perceive objects and events in the world around us, the information available to our eyes is not sufficient by itself to provide the brain with its unique interpretation of the visual world. The brain must make essential use of experience (either its own or that of our distant ancestors, which is embedded in our genes) to help interpret the information coming into our eyes. An example would be the derivation of the three-dimensional representation of the world from the two-dimensional signals falling onto the retinas of our two eyes or even onto one of them.
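
To make the underdetermination concrete, here is a small illustrative calculation based on an idealized pinhole-camera model; it is added for this discussion and is not taken from the article itself. A scene point with coordinates $(X, Y, Z)$, where $Z$ is depth away from the eye and $f$ is the focal length, projects to the image point

\[
(x, y) = \left(\frac{fX}{Z}, \frac{fY}{Z}\right),
\]

and every scaled point $(\lambda X, \lambda Y, \lambda Z)$ with $\lambda > 0$ projects to exactly the same $(x, y)$. A single two-dimensional image therefore cannot by itself determine depth; additional constraints - prior experience, binocular disparity, assumptions about typical surfaces - have to be supplied, which is the sense in which the problem is ill-posed.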

Visual theorists also would agree that seeing is a constructive process, one in which the brain has to carry out complex activities (sometimes called computations) to decide which interpretation to adopt of the ambiguous visual input. ‘Computation’ implies that the brain acts to form a symbolic representation of the visual world, with a mapping (in the mathematical sense) of certain aspects of that world onto elements in the brain.

Ray Jackendoff of Brandeis University postulates, as do most cognitive scientists, that the computations carried out by the brain are largely unconscious and that what we become aware of is the result of these computations. However, while the customary view is that this awareness occurs at the highest levels of the computational system, Jackendoff has proposed an intermediate-level theory of consciousness.

What we see, Jackendoff suggests, relates to a representation of surfaces that are directly visible to us, with their outline, orientation, colour, texture and movement. (This idea has similarities to what the late David C. Marr of the Massachusetts Institute of Technology called a 2 ½ dimensional sketch. It is more than a two-dimensional sketch because it conveys the orientation of the visible surfaces. It is less than three-dimensional because depth information is not explicitly represented.) In the next stage this sketch is processed by the brain to produce a three-dimensional representation. Jackendoff argues that we are not usually aware of this three-dimensional representation.

An example may make this process clearer. If you look at a person whose back is turned to you, you can see the back of the head but not the face. Nevertheless, your brain infers that the person has a face. We can deduce as much because if that person turned around and had no face, you would be very surprised.

The viewer-centred representation corresponds to the visible surface of the head - here, its back - and it is this that you are vividly aware of. What your brain infers about the front would come from some kind of three-dimensional representation. This does not mean that information flows only from the surface representation to the three-dimensional one; it almost certainly flows in both directions. When you imagine the front of the face, what you are aware of is a surface representation generated by information from the three-dimensional model.

The distinction between an explicit and an implicit representation is important. An explicit representation is something that is symbolized without further processing. An implicit representation contains the same information but requires further processing to make it explicit. The pattern of coloured pixels on a television screen, for example, contains an implicit representation of objects (say, a person's face), but only the dots and their locations are explicit. When you see a face on the screen, there must be neurons in your brain whose firing, in some sense, symbolizes that face.
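
The pixel example can be made concrete with a short toy sketch. The code below is purely illustrative and is not drawn from Crick and Koch; the function names, the threshold, and the simple ‘detector’ are hypothetical conveniences. The grid of numbers contains a bright patch only implicitly; the dictionary returned after further processing names it explicitly.

```python
# Toy contrast between an implicit representation (a raw pixel grid) and an
# explicit one (a symbolic description produced by further processing).
# Illustrative only; names and the threshold detector are hypothetical.

from typing import Dict, List


def make_image() -> List[List[int]]:
    """A 6x6 grid of pixel intensities. A bright 2x2 patch is present only
    implicitly: nothing in the data itself says 'object here'."""
    img = [[0] * 6 for _ in range(6)]
    for r in (2, 3):
        for c in (3, 4):
            img[r][c] = 255
    return img


def make_explicit(img: List[List[int]], threshold: int = 128) -> Dict:
    """Further processing that makes the patch explicit: return a symbolic
    token for the bright region together with its bounding box."""
    rows = [r for r, row in enumerate(img) for v in row if v >= threshold]
    cols = [c for row in img for c, v in enumerate(row) if v >= threshold]
    if not rows:
        return {"object": None}
    return {"object": "bright_patch",
            "bounds": (min(rows), min(cols), max(rows), max(cols))}


if __name__ == "__main__":
    image = make_image()          # implicit: just an array of numbers
    print(make_explicit(image))   # explicit: {'object': 'bright_patch', 'bounds': (2, 3, 3, 4)}
```

The same information is present in both cases; what changes is whether it can be read off without further work, which is the sense of ‘explicit’ at issue here.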

We call this pattern of firing neurons an active representation. A latent representation of a face must also be stored in the brain, probably as a special pattern of synaptic connections between neurons. For example, you probably have a representation of The Sky Dome in your brain, a representation that is usually inactive. If you do think about the Dome, the representation becomes active, with the relevant neurons firing away.

An object, incidentally, may be represented in more than one way - as a visual image, as a set of words and their related sounds, or even as a touch or a smell. These different representations are likely to interact with one another. The representation is likely to be distributed over many neurons, both locally and more globally. Such a representation may not be as simple and straightforward as uncritical introspection might indicate. There is suggestive evidence - partly from studying how neurons fire in various parts of a monkey's brain and partly from examining the effects of certain types of brain damage in humans - that different aspects of a face, and of the implications of a face, may be represented in different parts of the brain.

First, there is the representation of a face as a face, two eyes, a nose, a mouth and so on. The neurons involved are usually not too fussy about the exact size or position of this face in the visual field, nor are they very sensitive to small changes in their orientation. In monkeys, there are neurons that respond best when the face is turning in a particular direction, while others are more concerned with the direction in which the eyes are gazing.

Then there are representations of the parts of a face, as separate from those for the face as a whole. What is more, the implications of seeing a face - such as that person's sex, the facial expression, the familiarity or unfamiliarity of the face, and in particular whose face it is - may each be correlated with neurons firing in other places.

What we are aware of at any moment, in one sense or another, is not a simple matter. It is plausible that there may be a very transient form of fleeting awareness that represents only simple features and does not require an attentional mechanism. From this brief awareness the brain constructs a viewer-centred representation - what we see vividly and clearly - that does require attention. This in turn probably leads to three-dimensional object representations and thence to more cognitive ones.

Representations corresponding to vivid consciousness are likely to have special properties. William James thought that consciousness involves both attention and short-term memory. Most psychologists today would agree with this view. Jackendoff writes that consciousness is ‘enriched’ by attention, implying that whereas attention may not be essential for certain limited types of consciousness, it is necessary for full consciousness. Yet it is not clear exactly which forms of memory are involved. Is long-term memory needed? Some forms of acquired knowledge are so embedded in the machinery of neural processing that they are almost certainly used in becoming aware of something. On the other hand, there is evidence from studies of brain-damaged patients that the ability to lay down new long-term episodic memories is not essential for consciousness to be experienced.

It is difficult to imagine that anyone could be conscious if he or she had no memory whatsoever of what had just happened, even an extremely short one. Visual psychologists talk of iconic memory, which lasts for a fraction of a second, and working memory (such as that used to remember a new telephone number), which lasts for only a few seconds unless it is rehearsed. It is not clear whether both are essential for consciousness. In any case, the division of short-term memory into these two categories may be too crude.

If these complex processes of visual awareness are localized in parts of the brain, which processes are likely to be where? Many regions of the brain may be involved, but it is almost certain that the cerebral neocortex plays a dominant role. Visual information from the retina reaches the neocortex mainly by way of a part of the thalamus (the lateral geniculate nucleus); another significant visual pathway runs from the retina to the superior colliculus, at the top of the brain stem.

The cortex in humans consists of two intricately folded sheets of nerve tissue, one on each side of the head. These sheets are connected by a large tract of about half a billion axons called the corpus callosum. It is well known that if the corpus callosum is cut, as is done for certain cases of intractable epilepsy, one side of the brain is not aware of what the other side is seeing. In particular, the left side of the brain (in a right-handed person) appears not to be aware of visual information received exclusively by the right side. This shows that none of the information required for visual awareness can reach the other side of the brain by travelling down to the brain stem and, from there, back up. In a normal person, such information can get to the other side only by using the axons in the corpus callosum.

A different part of the brain - the hippocampal system - is involved in one-shot, or episodic, memories that, over weeks and months, it passes on to the neocortex. This system is so placed that it receives inputs from, and projects to, many parts of the brain. Thus, one might suspect that the hippocampal system is the essential seat of consciousness. This is not true: Evidence from studies of patients with damaged brains shows that this system is not essential for visual awareness, although naturally a patient lacking one is severely disabled in everyday life because he cannot remember anything that took place more than a minute or so in the past.

In broad terms, the neocortex of alert animals probably acts in two ways. First, building on the crude and somewhat redundant wiring produced by our genes and by embryonic processes, the neocortex draws on visual and other experience to refine its connections and create new categories (or "features") it can respond to. A new category is not fully created in the neocortex after exposure to only one example of it, although some small modifications of the neural connections may be made.

The second function of the neocortex (at least of the visual part of it) is to respond extremely rapidly to incoming signals. To do so, it uses the categories it has learned and tries to find the combinations of active neurons that, on the basis of its past experience, are most likely to represent the relevant objects and events in the visual world at that moment. The formation of such coalitions of active neurons may also be influenced by biases coming from other parts of the brain: for example, signals telling it what is best to attend to, or high-level expectations about the nature of the stimulus.

Consciousness, as James noted, is always changing. These rapidly formed coalitions occur at different levels and interact to form even broader coalitions. They are transient, lasting usually for only a fraction of a second. Because coalitions in the visual system are the basis of what we see, evolution has seen to it that they form as fast as possible; otherwise, no animal could survive. The brain is handicapped in forming neuronal coalitions rapidly because, by computer standards, neurons act very slowly. The brain compensates for this relative slowness partly by using very many neurons, simultaneously and in parallel, and partly by arranging the system in a roughly hierarchical manner.

If visual awareness at any moment corresponds to sets of neurons firing, then the obvious question is: Where are these neurons located in the brain, and in what way are they firing? Visual awareness is highly unlikely to occupy all the neurons in the neocortex that are firing above their background rate at a particular moment. We would expect, theoretically, that at least some of these neurons would be involved in doing computations - trying to arrive at the best coalitions - whereas others would express the results of those computations; in other words, what we see.

Fortunately, some experimental evidence can be found to back up this theoretical conclusion. A phenomenon called binocular rivalry may help identify the neurons whose firing symbolizes awareness. This phenomenon can be seen in dramatic form in an exhibit prepared by Sally Duensing and Bob Miller at the Exploratorium in San Francisco.

Binocular rivalry occurs when each eye has a different visual input relating to the same part of the visual field. The early visual system on the left side of the brain receives an input from both eyes but sees only the part of the visual field to the right of the fixation point. The converse is true for the right side. If these two conflicting inputs are rivalrous, one sees not the two inputs superimposed but first one input and then the other, alternating in turn.

In the exhibit, called "The Cheshire Cat," viewers put their heads in a fixed place and are told to keep their gaze fixed. By means of a suitably placed mirror, one of the eyes can look at another person's face, directly in front, while the other eye sees a blank white screen to the side. If the viewer waves a hand in front of this plain screen at the location in his or her visual field occupied by the face, the face is wiped out. The movement of the hand, being visually very salient, has captured the brain's attention. Without attention the face cannot be seen. If the viewer moves the eyes, the face reappears.

In some cases, only part of the face disappears. Sometimes, for example, one eye, or both eyes, will remain. If the viewer looks at the smile on the person's face, the face may disappear, leaving only the smile. For this reason, the effect has been called the Cheshire Cat effect, after the cat in Lewis Carroll's Alice's Adventures in Wonderland.

Although recording activity in individual neurons in a human brain is very difficult, such studies can be done in monkeys. A simple example of binocular rivalry has been studied in a monkey by Nikos K. Logothetis and Jeffrey D. Schall, both then at M.I.T. They trained a macaque to keep its eyes still and to signal whether it is seeing upward or downward movement of a horizontal grating. To produce rivalry, upward movement is projected into one of the monkey's eyes and downward movement into the other, so that the two images overlap in the visual field. The monkey signals that it sees up and down movements alternately, just as humans would. Even though the motion stimulus coming into the monkey's eyes is always the same, the monkey's percept changes every second or so.

Cortical area MT (which some researchers prefer to label V5) is an area mainly concerned with movement. What do the neurons in MT do when the monkey's percept is sometimes up and sometimes down? (The researchers studied only the monkey's first response.) The simplified answer - the actual data are somewhat messier - is that whereas the firing of some of the neurons correlates with the changes in the percept, for others the average firing rate is unchanged and independent of which direction of movement the monkey is seeing at that moment. Thus, it is unlikely that the firing of all the neurons in the visual neocortex at one particular moment corresponds to the monkey's visual awareness. Exactly which neurons do correspond to awareness remains to be discovered.
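The pattern of results can be caricatured in a short simulation (the firing rates and switching times below are invented for illustration; they are not Logothetis and Schall's data): the stimulus never changes, the reported percept alternates every second or so, and one model cell tracks the percept while another tracks only the stimulus.

```python
# Toy binocular-rivalry simulation with hypothetical firing rates.
import random
random.seed(1)

dt = 0.1                                  # time step in seconds
percept = "up"
percept_cell, stimulus_cell, percepts = [], [], []

t = 0.0
next_switch = random.uniform(0.5, 1.5)    # the percept flips every second or so
while t < 20.0:
    if t >= next_switch:
        percept = "down" if percept == "up" else "up"
        next_switch = t + random.uniform(0.5, 1.5)
    percepts.append(percept)
    # percept-following cell: fires strongly only when the percept is "up"
    percept_cell.append(40 if percept == "up" else 5)
    # stimulus-following cell: constant rate, because the stimulus never changes
    stimulus_cell.append(20)
    t += dt

up = [i for i, p in enumerate(percepts) if p == "up"]
down = [i for i, p in enumerate(percepts) if p == "down"]
mean = lambda xs: sum(xs) / len(xs)
print("percept cell:  up %.0f Hz, down %.0f Hz" %
      (mean([percept_cell[i] for i in up]), mean([percept_cell[i] for i in down])))
print("stimulus cell: up %.0f Hz, down %.0f Hz" %
      (mean([stimulus_cell[i] for i in up]), mean([stimulus_cell[i] for i in down])))
```

Only the first cell's firing covaries with what the animal reports seeing; the second is silent about the percept, which is the distinction the recordings are meant to expose.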

We have postulated that when we clearly see something, there must be neurons actively firing that stand for what we see. This might be called the activity principle. Here, too, there is some experimental evidence. One example is the firing of neurons in a specific cortical visual area in response to illusory contours. Another, and perhaps more striking, case is the filling in of the blind spot. The blind spot in each eye is caused by the lack of photoreceptors in the area of the retina where the optic nerve leaves the retina and projects to the brain. Its location is about 15 degrees from the fovea (the visual centre of the eye). Yet if you close one eye, you do not see a hole in your visual field.

Philosopher Daniel C. Dennett of Tufts University is unusual among philosophers in that he is interested both in psychology and in the brain. This interest is much to be welcomed. In a recent book, Consciousness Explained, he has argued that talking about filling in is wrong. He concludes, correctly, that "an absence of information is not the same as information about an absence." From this general principle he argues that the brain does not fill in the blind spot but ignores it.

Dennett's argument by itself, however, does not establish that filling in does not occur; it only suggests that it might not. Dennett also states that "your brain has no machinery for [filling in] at this location." This statement is incorrect. The primary visual cortex lacks a direct input from one eye, but normal "machinery" is there to deal with the input from the other eye. Ricardo Gattass and his colleagues at the Federal University of Rio de Janeiro have shown that in the macaque some of the neurons in the blind-spot area of the primary visual cortex do respond to input from both eyes, probably assisted by inputs from other parts of the cortex. Moreover, in the case of simple filling in, some of the neurons in that region respond as if they were actively filling in.

Thus, Dennett's claim about blind spots is incorrect. In addition, psychological experiments by Vilayanur S. Ramachandran have shown that what is filled in can be quite complex, depending on the overall context of the visual scene. How, he argues, can your brain be ignoring something that is in fact commanding attention?

Filling in, therefore, is not to be dismissed as nonexistent or unusual. It probably represents a basic interpolation process that can occur at many levels in the neocortex. It is, incidentally, a good example of what is meant by a constructive process.

How can we discover the neurons whose firing symbolizes a particular percept? William T. Newsome and his colleagues at Stanford University have done a series of brilliant experiments on neurons in cortical area MT of the macaque's brain. By studying a neuron in area MT, we may discover that it responds best to very specific visual features having to do with motion. A neuron, for instance, might fire strongly in response to the movement of a bar in a particular place in the visual field, but only when the bar is oriented at a certain angle, moving in one of the two directions perpendicular to its length within a certain range of speed.

Exciting just a single neuron is technically difficult, but it is known that neurons that respond to roughly the same position, orientation and direction of movement of a bar tend to be located near one another in the cortical sheet. The experimenters taught the monkey a simple task in movement discrimination using a mixture of dots, some moving randomly, the rest all in one direction. They showed that electrical stimulation of a small region in the right place in cortical area MT would bias the monkey's motion discrimination, almost always in the expected direction.
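The logic of the stimulation result can be sketched with a simple choice model (a schematic of our own with invented parameters, not Newsome's analysis): the probability of reporting the stimulated neurons' preferred direction rises with stimulus coherence, and microstimulation adds a fixed bias in that direction.

```python
# Schematic psychometric function with a microstimulation bias term.
import math

def p_preferred(coherence, stimulated, slope=8.0, stim_bias=0.4):
    """Probability of choosing the preferred direction; coherence in [-1, 1],
    negative values mean the dots mostly move the opposite way."""
    drive = slope * coherence + (stim_bias * slope if stimulated else 0.0)
    return 1.0 / (1.0 + math.exp(-drive))

for c in (-0.2, 0.0, 0.2):
    print("coherence %+.1f: no stim %.2f, with stim %.2f"
          % (c, p_preferred(c, False), p_preferred(c, True)))
```

The bias shifts every choice toward the preferred direction, which is the qualitative effect the stimulation experiments report.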

Thus, the stimulation of these neurons can influence the monkey's behaviour and probably its visual percept. Such experiments do not, however, show decisively that the firing of such neurons is the exact neural correlate of the percept. The correlate could be only a subset of the neurons being activated. Or perhaps the real correlate is the firing of neurons in another part of the visual hierarchy that is strongly influenced by the neurons activated in area MT.

These same reservations apply also to cases of binocular rivalry. Clearly, the problem of finding the neurons whose firing symbolizes a particular percept is not going to be easy. It will take many careful experiments to track them down even for one kind of percept.

The purpose of vivid visual awareness is obviously to feed into the cortical areas concerned with the implications of what we see; from there the information shuttles on the one hand to the hippocampal system, to be encoded (temporarily) into long-term episodic memory, and on the other to the planning levels of the motor system. Nevertheless, is it possible to go from a visual input to a behavioural output without any relevant visual awareness?

That such a process can happen is demonstrated by the remarkable class of patients with ‘blind-sight’. These patients, all of whom have suffered damage to their visual cortex, can point with fair accuracy at visual targets or track them with their eyes while vigorously denying seeing anything. In fact, these patients are as surprised as their doctors by their abilities. The amount of information that ‘gets through’, however, is limited: Blind-sight patients have some ability to respond to wavelength, orientation and motion, yet they cannot distinguish a triangle from a square.

It is naturally of great interest to know which neural pathways are being used in these patients. Investigators originally suspected that the pathway ran through the superior colliculus. Recent experiments suggest that a direct but weak connection may be involved between the lateral geniculate nucleus and other visual areas in the cortex. It is unclear whether an intact primary visual cortex region is essential for immediate visual awareness. Conceivably the visual signal in blind-sight is so weak that the neural activity cannot produce awareness, although getting through to the motor system remains strong enough.

Normal-seeing people regularly respond to visual signals without being fully aware of them. In automatic actions, such as swimming or driving a car, complex but stereotypical actions occur with little, if any, associated visual awareness. In other cases, the information conveyed is either very limited or very attenuated. Thus, while we can function without visual awareness, our behaviour without it is restricted.

Clearly, it takes a certain amount of time to experience a conscious percept. It is difficult to determine just how much time is needed for an episode of visual awareness, but one aspect of the problem that can be demonstrated experimentally is that signals received close together in time are treated by the brain as simultaneous.

A disk of red light is flashed for, say, 20 milliseconds, followed immediately by a 20-millisecond flash of green light in the same place. The subject reports that he did not see a red light followed by a green light. Instead he saw a yellow light, just as he would have if the red and the green light had been flashed simultaneously. Yet the subject could not have experienced yellow until after the information from the green flash had been processed and integrated with the preceding red one.

Experiments of this type led psychologist Robert Efron, now at the University of California at Davis, to conclude that the processing period for perception is about 60 to 70 milliseconds. Similar periods are found in experiments with tones in the auditory system. It is always possible, however, that the processing times may be different in higher parts of the visual hierarchy and in other parts of the brain. Processing is also more rapid in trained, compared with naive, observers.
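One crude way to capture the temporal-integration idea in code (a toy model assuming a single 70-millisecond window, not Efron's actual method) is to fuse any flashes whose onsets fall within one processing period.

```python
# Toy integration-window model: flashes sharing a ~70 ms window are fused.
WINDOW_MS = 70  # rough processing period suggested by the experiments

def perceived(flashes, window=WINDOW_MS):
    """flashes: list of (onset_ms, colour). Colours whose onsets share one window are fused."""
    reports = []
    current, start = [], None
    for onset, colour in sorted(flashes):
        if start is None or onset - start < window:
            current.append(colour)
            start = onset if start is None else start
        else:
            reports.append("+".join(current))
            current, start = [colour], onset
    reports.append("+".join(current))
    return reports

# red for 20 ms followed immediately by green -> reported fused ("red+green", seen as yellow)
print(perceived([(0, "red"), (20, "green")]))    # ['red+green']
# the same flashes 200 ms apart -> seen in succession
print(perceived([(0, "red"), (200, "green")]))   # ['red', 'green']
```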

Because it appears to be involved in some forms of visual awareness, it would help if we could discover the neural basis of attention. Eye movement is a form of attention, since the area of the visual field in which we see with high resolution is remarkably small, roughly the area of the thumbnail at arm's length (see the rough arithmetic below). Thus, we move our eyes to gaze directly at an object in order to see it more clearly. Our eyes usually move three or four times a second. Psychologists have shown, however, that there appears to be a faster form of attention that moves around, in some sense, when our eyes are stationary.
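The 'thumbnail at arm's length' figure is easy to check with rough arithmetic (the 1.5-centimetre nail width and 60-centimetre arm length are assumed values):

```python
# Rough visual-angle arithmetic behind "the area of the thumbnail at arm's length".
import math

thumbnail_width_cm = 1.5
arm_length_cm = 60.0
angle_deg = math.degrees(2 * math.atan(thumbnail_width_cm / (2 * arm_length_cm)))
print("high-resolution (foveal) field is roughly %.1f degree(s) across" % angle_deg)
# about 1.4 degrees, consistent with the usual 1-2 degree estimate for the fovea
```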

The exact psychological nature of this faster attentional mechanism is still in question. Several neuroscientists, however, including Robert Desimone and his colleagues at the National Institute of Mental Health, have shown that the rate of firing of certain neurons in the macaque's visual system depends on what the monkey is attending to in the visual field. Thus, attention is not solely a psychological concept; it also has neural correlates that can be observed. A number of researchers have found that the pulvinar, a region of the thalamus, appears to be involved in visual attention. We would like to believe that the thalamus deserves to be called ‘the organ of attention’, but this status has yet to be established.

The major problem is to find what activity in the brain corresponds directly to visual awareness. It has been speculated that each cortical area produces awareness of only those visual features that are ‘columnar’, or arranged in a stack, or column, of neurons perpendicular to the cortical surface. Thus, the primary visual cortex could code for orientation and area MT for motion. So far experimentalists have not found one particular region in the brain where all the information needed for visual awareness appears to come together. Dennett has dubbed such a hypothetical place ‘The Cartesian Theatre’. He argues on theoretical grounds that it does not exist.

Awareness seems to be distributed not just on a local scale but more widely over the neocortex. Vivid visual awareness is unlikely to be distributed over every cortical area, because some areas show no response to visual signals. Awareness might, for example, be associated with only those areas that connect back directly to the primary visual cortex, or alternatively with those areas that project into one another's layer four. (The latter areas are always at the same level in the visual hierarchy.)

The key issue, then, is how the brain forms its global representations from visual signals. If attention is crucial for visual awareness, the brain could form representations by attending to just one object at a time, rapidly moving from one object to the next. For example, the neurons representing all the different aspects of the attended object could all fire together very rapidly for a short period, possibly in rapid bursts.

This fast, simultaneous firing might not only excite those neurons that symbolized the implications of that object but also temporarily strengthen the relevant synapses so that this particular pattern of firing could be quickly recalled in the form of short-term memory. If only one representation needs to be held in short-term memory, as in remembering a single task, the neurons involved may continue to fire for a period.

A problem arises if it is necessary to be aware of more than one object at exactly the same time. If all the attributes of two or more objects were represented by neurons firing rapidly, their attributes might be confused. The colour of one might become attached to the shape of another. This happens sometimes in very brief presentations.

Some time ago Christoph von der Malsburg, now at the Ruhr-Universität Bochum, suggested that this difficulty would be circumvented if the neurons associated with any one object all fired in synchrony (that is, if their times of firing were correlated) but out of synchrony with those representing other objects. Recently two groups in Germany reported that there does appear to be correlated firing between neurons in the visual cortex of the cat, often in a rhythmic manner, with a frequency in the 35- to 75-hertz range, sometimes called the 40-hertz, or gamma, oscillation.
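The binding-by-synchrony idea can be caricatured in a few lines (a toy sketch with invented spike trains, not a biophysical model): cells belonging to the same object share spike times on a roughly 40-hertz rhythm, cells belonging to different objects fire on interleaved cycles, and a simple coincidence measure recovers the grouping.

```python
# Toy binding-by-synchrony sketch: grouping features by shared spike timing.
import random
random.seed(0)

period_ms = 25.0                     # ~40 Hz rhythm

def spike_train(phase_ms, jitter=1.0, n_cycles=40):
    return [i * period_ms + phase_ms + random.gauss(0, jitter) for i in range(n_cycles)]

# object A: "red" and "round"; object B: "green" and "square"
cells = {
    "red":    spike_train(0.0),
    "round":  spike_train(0.0),
    "green":  spike_train(12.5),     # half a cycle out of phase with object A
    "square": spike_train(12.5),
}

def synchrony(a, b, window=3.0):
    """Fraction of spikes in train a with a partner in b within +/- window ms."""
    return sum(any(abs(t - u) < window for u in b) for t in a) / len(a)

for pair in (("red", "round"), ("red", "square"), ("green", "square")):
    print(pair, "synchrony = %.2f" % synchrony(cells[pair[0]], cells[pair[1]]))
# red-round and green-square come out near 1.0, red-square near 0.0:
# shared timing, not anatomy, says which features belong together.
```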

Von der Malsburg's proposal prompted the suggestion that this rhythmic and synchronized firing might be the neural correlate of awareness and that it might serve to bind together activity concerning the same object in different cortical areas. The matter is still undecided, but at present the fragmentary experimental evidence does little to support such an idea. Another possibility is that the 40-hertz oscillations may help distinguish figure from ground or assist the mechanism of attention.

Are there some particular types of neurons, distributed over the visual neocortex, whose firing directly symbolizes the content of visual awareness? One very simplistic hypothesis is that the activities in the upper layers of the cortex are largely unconscious ones, whereas the activities in the lower layers (layers five and six) mostly correlate with consciousness. We have wondered whether the pyramidal neurons in layer five of the neocortex, especially the larger ones, might play this latter role.

These are the only cortical neurons that project right out of the cortical system (that is, not to the neocortex, the thalamus or the claustrum). If visual awareness represents the results of neural computations in the cortex, one might expect that what the cortex sends elsewhere would symbolize those results. What is more, the neurons in layer five show an unusual propensity to fire in bursts. The idea that layer five neurons may directly symbolize visual awareness is attractive, but it still is too early to tell whether there is anything in it.

Visual awareness is clearly a difficult problem. More work is needed on the psychological and neural basis of both attention and very short-term memory. Studying the neurons when a percept changes, even though the visual input is constant, should be a powerful experimental paradigm. We need to construct neurobiological theories of visual awareness and test them using a combination of molecular, neurobiological and clinical imaging studies.

It is strongly believed that once we have mastered the secret of this simple form of awareness, we may be close to understanding a central mystery of human life: How the physical events occurring in our brains while we think and act in the world relate to our subjective sensations - that is, how the brain relates to the mind.

As an afterthought, it now seems likely that there are rapid ‘on-line’ systems for stereotyped motor responses such as hand or eye movements. These systems are unconscious and lack memory. Conscious seeing, on the other hand, seems to be slower and more subject to visual illusions. The brain needs to form a conscious representation of the visual scene that it can then use for many different actions or thoughts. Precisely how all these pathways work, and how they interact, is far from clear.

Still, it is probably too early to draw firm conclusions from these results about the exact neural correlates of visual consciousness. We have suggested, on theoretical grounds based on the neuroanatomy of the macaque monkey, that primates are not directly aware of what is happening in the primary visual cortex, even though most of the visual information flows through it. This hypothesis is supported by some experimental evidence, but it is still controversial.

Let us consider once again a simple question: if you mentally rotate the letter 'N' 90 degrees to the right, is a new letter formed? In seeking an answer, scientists say, most people conjure up an image in their mind's eye, mentally ‘look’ at it, add details one at a time and describe what they see. They seem to have a definite picture in their heads, but where in the brain are these images formed? How are they generated? How do people ‘move things around’ in their imaginations?
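For readers who would rather see the rotation done outside the mind's eye, here is a toy version (our own illustration): a small bitmap of the letter N, rotated 90 degrees clockwise, comes out looking roughly like a Z.

```python
# Rotate a 5x5 bitmap of the letter N by 90 degrees clockwise.
N = [
    "X...X",
    "XX..X",
    "X.X.X",
    "X..XX",
    "X...X",
]

def rotate_right(grid):
    """Rotate a list of equal-length strings 90 degrees clockwise."""
    return ["".join(row) for row in zip(*grid[::-1])]

for line in rotate_right(N):
    print(line)
# XXXXX
# ...X.
# ..X..
# .X...
# XXXXX   <- roughly the letter Z
```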

Using clues from brain-damaged patients and advanced brain imaging techniques, neuroscientists have now found that the brain uses virtually identical pathways for seeing objects and for imagining them, only it uses these pathways in reverse.

In the process of human vision, a stimulus in the outside world is passed from the retina to the primary visual cortex and then to higher centres until an object or event is recognized. In mental imaging, a stimulus originates in higher centres and is passed down to the primary visual cortex, where it is recognized.

The implications are beguiling. Scientists say that for the first time they are glimpsing the biological basis for abilities that make some people better at math or art or flying fighter aircraft. They can now explain why imagining oneself shooting baskets like Michael Jordan can improve one’s athletic performance. In a finding that raises troubling questions about the validity of eyewitness testimony, they can show that an imagined object is, to the observer’s brain at least, every bit as real as one that is seen.

“People have always wondered if there are pictures in the brain.” More recently, the debate centred on a specific query: As a form of thought, is mental imagery rooted in the abstract symbols of language or in the biology of the visual system?

The biology arguments are winning converts every day. The new findings are based on the notion that mental capacities like memory, perception, mental imagery, language and thought are rooted in complex underlying structures in the brain. Thus an image held in the mind's eye has physical rather than ethereal properties. Mental imagery research has developed hand in hand with research on the human visual system, each providing clues to the other and helping to force out the details of a highly complex system.

Vision is not a single process but the linking of subsystems that process specific aspects of vision. To understand how this works, consider looking at an apple on a picnic table ten feet away. Light reflects off the apple, hits the retina and is sent through nerve fibres to an early visual way station that we can call the visual buffer; here the apple image is literally mapped onto the surface of brain tissue as it appears in space, with high resolution. You can think of the visual buffer as a screen: a picture can be displayed on it from the camera, which is your eyes, or from a videotape recorder, which is your memory.

In this case, the image of the apple is held on the screen while a variety of its features are examined separately. Still, the brain does not yet know that it is seeing an apple. Next, distinct features of the apple are sent to two higher subsystems for further analysis, often referred to as the ‘what’ system and the ‘where’ system. The brain needs to match the primitive apple pattern with memories and knowledge about apples, and so it seeks knowledge from visual memories that are held like videotapes in the brain.

The ‘what’ system, in the temporal lobe, contains cells that are tuned for specific shapes and colours of objects; some respond to red, round objects in an infinite variety of positions, regardless of location. Thus, the apple could be on a distant tree, on the picnic table or in front of your nose, and it would still stimulate cells tuned for red, round objects, which might be apples, beach balls or tomatoes.

The ‘where’ system, in the parietal lobe, contains cells that are tuned to fire when objects are in different locations. If the apple is far away, one set of cells is activated, while another set fires if the apple is close up. Thus the brain has a way of knowing where objects are in space so the body can navigate accordingly.

When cells in the ‘what’ and ‘where’ systems are stimulated, they may combine their signals in yet a higher subsystem where associative memories are stored. This system is like a card file in which visual memories, as if held on videotapes, can be looked up and activated. If the signals from the ‘what’ and ‘where’ systems find a good match in associative memory, you recognize the object as an apple. You also know what it tastes and smells like, that it has seeds, that it can be made into your favourite pie and everything else stored in your brain about apples.

However, sometimes, recognition does not occur at the level of associative memory. Because it is far away, the red object on the picnic table could be a tomato or an apple. You are not sure of its identity, and so, another level of analysis kicks in.

This highest level, in the frontal lobe, is where decisions are made; to use the same analogy, it is like a catalogue for the videotapes in the brain. You look up features about the image to help you identify it. A tomato has a pointed leaf, while an apple has a slender stem. When the apple stem is found at this higher level, the brain decides that it has an apple in its visual field.

Signals are then fired back down through the system to the visual buffer, and the apple is recognized. Significantly, every visual area that sends information upstream through nerve fibres also receives information back from the areas it projects to. Information flows richly in both directions at all times.
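The staged account just given can be sketched as a toy pipeline (the stage names, features and lookup rules below are shorthand of our own for illustration, not a model taken from the research described):

```python
# Toy staging of the visual-recognition account: buffer -> what/where ->
# associative memory -> frontal "catalogue" lookup when memory is ambiguous.
ASSOCIATIVE_MEMORY = {
    ("red", "round", "slender stem"): "apple",
    ("red", "round", "pointed leaf"): "tomato",
}
DISTINGUISHING_FEATURES = {"apple": "slender stem", "tomato": "pointed leaf"}

def visual_buffer(stimulus):
    # the early, high-resolution map: here we simply pass the raw features through
    return stimulus

def what_system(image):
    # shape and colour, regardless of location
    return {"colour": image["colour"], "shape": image["shape"]}

def where_system(image):
    # location in space, regardless of identity
    return {"distance_ft": image["distance_ft"]}

def recognize(stimulus):
    image = visual_buffer(stimulus)
    what, where = what_system(image), where_system(image)
    # first try a direct match in associative memory
    for (colour, shape, detail), name in ASSOCIATIVE_MEMORY.items():
        if (colour, shape) == (what["colour"], what["shape"]) and detail in image["details"]:
            return name, where
    # still ambiguous: consult the frontal "catalogue" for a distinguishing feature
    for name, detail in DISTINGUISHING_FEATURES.items():
        if detail in image["details"]:
            return name, where
    return "unknown red round thing", where

print(recognize({"colour": "red", "shape": "round",
                 "details": ["slender stem"], "distance_ft": 10}))
# -> ('apple', {'distance_ft': 10})
```

The point of the sketch is only the division of labour: an early buffer, separate 'what' and 'where' analyses, a match in associative memory, and a further catalogue lookup when that match is uncertain.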

Mental imagery is the result of this duality. Instead of a visual stimulus, a mental stimulus activates the system. The stimulus can be anything, including a memory, an odour, a face, a reverie, a song or a question. Images are based on previously encoded representations of shape; to imagine a cat, for example, you look up the videotape for 'cat' in associative memory.

When that subsystem is activated, a general image of a cat is mapped out on the screen, or visual buffer, in the primary visual cortex. It is a stripped-down version of a cat, and everyone's version is different. Suppose this preliminary mapping does not show whether the cat has curved claws. To find out, the mind's eye shifts attention and goes back to the higher subsystems where detailed features are stored. You activate the ‘curved claws’ tape, zoom back down to the front paws of the cat and add the claws to the image. Thus each image is built up, a part at a time.

The more complex the image, the more time it takes to conjure it in the visual buffer. On the basis of brain scans with the technique known as positron emission tomography, estimates of the time required to add each new part range from 75 to 100 thousandths of a second.
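As a back-of-the-envelope use of that estimate (the six-part cat is an invented example):

```python
# Rough arithmetic with the 75-100 ms-per-part estimate.
ms_per_part = (75, 100)
parts_in_image = 6   # say: body, head, ears, tail, front paws, curved claws
low, high = (parts_in_image * t for t in ms_per_part)
print("building the image takes roughly %d-%d ms" % (low, high))   # 450-600 ms
```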

The visual system maps imagined objects and scenes precisely, mimicking the real world; you can scan an imagined scene and study it as if it were there.

How can this be demonstrated? When people are asked to imagine objects at different sizes - "Imagine a tiny honeybee. What colour is its head?" - they have to take time to zoom in on the bee's head before they can answer. Conversely, objects can be imagined so that they overflow the visual field. "Imagine walking toward a car. It looms larger as you get closer to it. There comes a point where you cannot see the whole car at once; it seems to overflow the screen in your mind's eye."

People with brain damage often demonstrate that the visual systems are doing double duty. For example, stroke patients who lose the ability to see colours also cannot imagine colours.

An epilepsy patient experienced a striking change in her ability to imagine objects after her right occipital lobe was removed to reduce seizures. Before surgery, the woman estimated she would stand, in her mind's eye, about 14 feet from a horse before it overflowed her visual field. After surgery, she estimated the overflow distance at 34 feet. Her field of mental imagery had been reduced by roughly half.
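The 'reduced by roughly half' conclusion follows from simple visual-angle arithmetic (the eight-foot horse length is an assumed value for illustration):

```python
# Visual-angle check of the horse-overflow estimates.
import math

def visual_angle_deg(object_size_ft, distance_ft):
    return math.degrees(2 * math.atan(object_size_ft / (2.0 * distance_ft)))

horse_ft = 8.0
before = visual_angle_deg(horse_ft, 14.0)   # overflow point before surgery
after = visual_angle_deg(horse_ft, 34.0)    # overflow point after surgery
print("imagery field: about %.0f degrees before, %.0f degrees after (ratio %.2f)"
      % (before, after, after / before))
# roughly 32 degrees before vs 13 degrees after: the mind's-eye field
# shrank to about half or a little less of its former extent
```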

Another patient had a disability affecting his ‘what’ system while his ‘where’ system remained intact. If you ask him to imagine what colour the inside of a watermelon is, he does not know, and if you press him, he might guess blue. However, if you ask him whether Toronto is closer to London than Winnipeg is, he answers correctly and instantly.

Imaging studies of healthy brains produce similar findings. When a person is asked to look at and then to imagine an object, the same brain areas are activated. When people add detail to images, they use the same circuits used in vision. Interestingly, people who describe themselves as vivid imagers can show stronger activation of the relevant areas in the brain.

People use imagery in their everyday lives to call up information in memory, to reason and to learn new skills, the scientists say. It can lead to creativity. Albert Einstein apparently got his first insight into relativity when he imagined chasing after and matching the speed of a beam of light.

It can improve athletic skills. When you see a gifted athlete move in a particular way, you note how he or she moves, and you can use that information to improve your own technique. The brain uses the same representations in the ‘where’ system to help direct both actual movements and imagined movements. Thus, refining these representations in imagery will transfer to actual performance, provided the motions are also physically practised.

Humans exhibit vast individual differences in the various components of mental imagery, which may help explain certain talents and predilections. Fighter pilots, for example, can imagine the rotation of complex objects in a flash, whereas most people need considerably more time for such tasks.

Studies now in progress are examining the brains of mathematicians and artists with a new imaging machine that reveals individual differences in the way brains are biologically wired. The researchers are looking to see whether people who are good at geometry have different circuitry from people who are good at algebra.

In a philosophical conundrum arising from the new research, it seems that people can confuse what is real and what is imagined, raising questions about witnesses’ testimony and memory itself.

Meanwhile, expectation plays a powerful role in visual perception: it helps you see an object when you have only part of the picture. If you expect to see an apple, its various fragments can drive the system into producing the image of an apple in your visual buffer. In other words, you prime yourself so strongly that you can play the apple tape from your memory banks. People can thus be fooled by their mind's eye. Imagine a man standing before a frightened store clerk, and you quickly assume a robbery is under way. It is dark and he is in shadow. Because you expect to see a gun, your thresholds are lowered and you may run the tape for a gun, even though no gun is there. As far as your brain is concerned, it saw a gun, yet the gun may not have been real.

Luckily, inputs from the eye tend to be much stronger than inputs from imagination, but on a dark night, under certain circumstances, it is easy to be fooled by one’s own brain.

It is amazing that imagination and reality are not confused more often. Images, however, are fuzzier and less coherent than real memories, and humans are able to differentiate them by how plausible they seem.

Our new epistemological situation suggests that questions regarding the character of the whole no longer lie within the domain of science as classically conceived, which assumed that knowledge of all the constituent parts of a mechanistic universe is equal to knowledge of the whole. That paradigm sanctioned the Cartesian division between mind and world that became a pervasive preoccupation in Western philosophy, art and literature beginning in the seventeenth century. This explains in no small part why many humanists and social scientists feel that science concerns itself only with the mechanisms of physical reality and is, therefore, indifferent or hostile to the experience of human subjectivity - the world in which a human being, with all his or her myriad sensations, feelings, thoughts, values and beliefs, lives a life and comes to an end.

Nevertheless, man has come to the threshold of a state of consciousness, regarding his nature and his relationship to the cosmos, in terms that reflect ‘reality’. By using the processes of nature as metaphor, to describe the forces by which it operates upon and within man, we come as close to describing ‘reality’ as we can within the limits of our comprehension. Men will be very uneven in their capacity for such understanding, which, naturally, differs for different ages and cultures, and develops and changes over the course of time. For this reason, metaphor and myth will always be necessary to provide ‘comprehensible’ guides to living. In this way, man's imagination and intellect play vital roles in his survival and evolution.

Notwithstanding, different ethics underlie such different conceptions of human life as those of the classical Greeks, of Christianity and of Judaism - the religion and culture of the Hebrews, the native inhabitants of the ancient kingdom of Judah, whose lunisolar calendar marks the events of the Jewish year and dates the creation of the world at 3761 BC.

Europe owes the Jew no small thanks for making people think more logically and for establishing cleaner intellectual habits - nobody more so than the Germans who are a lamentable déraisonnable [unreasonable] race . . . Wherever Jews have won influence they have taught men to make finer distinctions, more rigorous inferences, and to write in a more luminous and cleanly fashion, their task as ever to bring the people ‘to listen to raison’.

His position is very radical. Nietzsche does not simply deny that knowledge, construed as the adequate representation of the world by the intellect, exists. He also refuses the pragmatist identification of knowledge and truth with usefulness: he writes that we think we know what we think is useful, and that we can be quite wrong about the latter.

Nietzsche’s view, his ‘perspectivism’, depends on his claim that there is no sensible conception of a world independent of human interpretation, a world to which interpretations would have to correspond if they were to constitute knowledge. He sums up this highly controversial position in The Will to Power: facts are precisely what there is not, only interpretations.

Perspectivism does not deny that particular views can be true. Like some versions of contemporary anti-realism, it attributes to specific approaches truth in relation to facts specified internally by those approaches themselves. Still, it refuses to envisage a single independent set of facts to be accounted for by all theories. Thus Nietzsche grants the truth of specific scientific theories; he does, however, deny that a scientific interpretation can possibly be ‘the only justifiable interpretation of the world’: neither the facts science addresses nor the methods it employs are privileged. Scientific theories serve the purposes for which they have been devised, but these have no priority over the many other purposes of human life.

Those curious about the relation between mind and the drives that move it, as conceived by both Freud and Nietzsche, will find some uncanny theories in what follows.

In the late 19th century the Viennese neurologist Sigmund Freud developed a theory of personality and a system of psychotherapy known as psychoanalysis. According to this theory, people are strongly influenced by unconscious forces, including innate sexual and aggressive drives. Freud recounted the early resistance to his ideas and their later acceptance. From the outset of psychoanalysis, Freud attracted followers, many of whom later proposed competing theories. As a group, these neo-Freudians shared the assumption that the unconscious plays an important role in a person’s thoughts and behaviours. Most parted company with Freud, however, over his emphasis on sex as a driving force. For example, the Swiss psychiatrist Carl Jung theorized that all humans inherit a collective unconscious that contains universal symbols and memories from their ancestral past. The Austrian physician Alfred Adler theorized that people are primarily motivated to overcome inherent feelings of inferiority. He wrote about the effects of birth order in the family and coined the term sibling rivalry. Karen Horney, a German-born American psychiatrist, argued that humans have a basic need for love and security, and become anxious when they feel isolated and alone.

Motivated by a desire to uncover unconscious aspects of the psyche, psychoanalytic researchers devised what are known as projective tests. A projective test asks people to respond to an ambiguous stimulus such as a word, an incomplete sentence, an inkblot, or an ambiguous picture. These tests are based on the assumption that if a stimulus is vague enough to accommodate different interpretations, then people will use it to project their unconscious needs, wishes, fears, and conflicts. The most popular of these tests are the Rorschach Inkblot Test, which consists of ten inkblots, and the Thematic Apperception Test, which consists of drawings of people in ambiguous situations.

Psychoanalysis has been criticized on various grounds and is not as popular as in the past. However, Freud’s overall influence on the field has been deep and lasting, particularly his ideas about the unconscious. Today, most psychologists agree that people can be profoundly influenced by unconscious forces, and that people often have a limited awareness of why they think, feel, and behave as they do.

The techniques of psychoanalysis and much of the psychoanalytic theory based on its application were developed by Sigmund Freud. His work concerning the structure and functioning of the human mind had far-reaching significance, both practically and scientifically, and it continues to influence contemporary thought.

The first of Freud's innovations was his recognition of unconscious psychic processes that follow laws different from those that govern conscious experience. Under the influence of the unconscious, thoughts and feelings that belong together may be shifted or displaced out of context; two disparate ideas or images may be condensed into one; thoughts may be dramatized in images rather than expressed as abstract concepts; and certain objects may be represented symbolically by images of other objects, although the resemblance between the symbol and the original object may be vague or farfetched. The laws of logic, indispensable for conscious thinking, do not apply to these unconscious mental productions.

Recognition of these modes of operation in unconscious mental processes made possible the understanding of such previously hard-to-grasp psychological phenomena as dreaming. Through analysis of unconscious processes, Freud saw dreams as serving to protect sleep against disturbing impulses arising from within and related to early life experiences. Thus, unacceptable impulses and thoughts, called the latent dream content, are transformed into a conscious, although no longer immediately comprehensible, experience called the manifest dream. Knowledge of these unconscious mechanisms permits the analyst to reverse the so-called dream work, that is, the process by which the latent dream is transformed into the manifest dream, and through dream interpretation, to recognize its underlying meaning.

A basic assumption of Freudian theory is that the unconscious conflicts involve instinctual impulses, or drives, that originate in childhood. As these unconscious conflicts are recognized by the patient through analysis, his or her adult mind can find solutions that were unattainable to the immature mind of the child. This depiction of the role of instinctual drives in human life is a unique feature of Freudian theory.

According to Freud's doctrine of infantile sexuality, adult sexuality is a product of a complex process of development, beginning in childhood, involving a variety of body functions or areas (oral, anal, and genital zones), and corresponding to various stages in the relation of the child to adults, especially to parents. Of crucial importance is the so-called Oedipal period, occurring at four to six years of age, because at this stage of development the child for the first time becomes capable of an emotional attachment to the parent of the opposite sex that is similar to the adult's relationship to a mate; the child simultaneously reacts as a rival to the parent of the same sex. Physical immaturity dooms the child's desires to frustration and his or her first step toward adulthood to failure. Intellectual immaturity further complicates the situation because it makes children afraid of their own fantasies. The extent to which the child overcomes these emotional upheavals and to which these attachments, fears, and fantasies continue to live on in the unconscious greatly influences later life, especially love relationships.

The conflicts occurring in the earlier developmental stages are no less significant as a formative influence, because these problems represent the earliest prototypes of such basic human situations as dependency on others and relationship to authority. Also basic in moulding the personality of the individual is the behaviour of the parents toward the child during these stages of development. The fact that the child reacts not only to objective reality but also to fantasy distortions of reality greatly complicates even the best-intentioned educational efforts.

The effort to clarify the bewildering number of interrelated observations uncovered by psychoanalytic exploration led to the development of a model of the structure of the psychic system. Three functional systems are distinguished that are conveniently designated as the id, ego, and superego.

The first system refers to the sexual and aggressive tendencies that arise from the body, as distinguished from the mind. Freud called these tendencies ‘Triebe’, which literally means “drives,” but is often inaccurately translated as “instincts” to show their innate character. These inherent drives claim immediate satisfaction, which is experienced as pleasurable; the id thus is dominated by the pleasure principle. In his later writings, Freud tended more toward psychological rather than biological conceptualization of the drives.

How the conditions for satisfaction are to be brought about is the task of the second system, the ego, which is the domain of such functions as perception, thinking, and motor control that can accurately assess environmental conditions. To fulfill its function of adaptation, or reality testing, the ego can enforce the postponement of satisfaction of the instinctual impulses originating in the id. To defend itself against unacceptable impulses, the ego develops specific psychic means, known as defence mechanisms. These include repression, the exclusion of impulses from conscious awareness; projection, the process of ascribing to others one's own unacknowledged desires; and reaction formation, the establishment of a pattern of behaviour directly opposed to a strong unconscious need. Such defence mechanisms are put into operation whenever anxiety signals a danger that the original unacceptable impulses may reemerge.

An id impulse becomes unacceptable, not only from a temporary need for postponing its satisfaction until suitable reality conditions can be found, but more often because of a prohibition imposed on the individual by others, originally the parents. All these demands and prohibitions are the major content of the third system, the superego, the function of which is to control the ego according to the internalized standards of parental figures. If the demands of the superego are not fulfilled, the person may feel shame or guilt. Because the superego, in Freudian theory, originates in the struggle to overcome the Oedipal conflict, it has a power akin to an instinctual drive, is in part unconscious, and can cause feelings of guilt not justified by any conscious transgression. The ego, having to mediate among the demands of the id, the superego, and the outside world, may not be strong enough to reconcile these conflicting forces. The more the ego is impeded in its development because of being enmeshed in its earlier conflicts, called fixations or complexes, or the more it reverts to earlier satisfactions and archaic modes of functioning, known as regression, the greater is the likelihood of succumbing to these pressures. Unable to function normally, it can maintain its limited control and integrity only at the price of symptom formation, in which the tensions are expressed in neurotic symptoms.

A cornerstone of modern psychoanalytic theory and practice is the concept of anxiety, which institutes appropriate mechanisms of defence against certain danger situations. These danger situations, as described by Freud, are the fear of abandonment by or the loss of the loved one (the object), the risk of losing the object's love, the danger of retaliation and punishment, and, finally, the hazard of reproach by the superego. Thus, symptom formation, character and impulse disorders, perversions and sublimations represent compromise formations - different forms of an adaptive integration that the ego tries to achieve by reconciling, more or less successfully, the different conflicting forces in the mind.

Various psychoanalytic schools have adopted other names for their doctrines to show deviations from Freudian theory.

Carl Gustav Jung, one of the earliest pupils of Freud, eventually created a school that he preferred to call analytical psychology. Like Freud, Jung used the concept of the libido; however, to him it meant not only sexual drives but a composite of all creative instincts and impulses and the entire motivating force of human conduct. According to his theories, the unconscious is composed of two parts: the personal unconscious, which contains the residue of the individual's own experience, and the collective unconscious, the reservoir of the experience of the human race. In the collective unconscious exist many primordial images, or archetypes, common to all individuals of a given country or historical era. Archetypes take the form of bits of intuitive knowledge or apprehension and normally exist only in the collective unconscious of the individual. When the conscious mind contains no images, however, as in sleep, or when the consciousness is caught off guard, the archetypes commence to function. Archetypes are primitive modes of thought and tend to personify natural processes in such mythological concepts as good and evil spirits, fairies, and dragons. The mother and the father also serve as prominent archetypes.

An important concept in Jung's theory is the existence of two basically different types of personality, mental attitude, and function. When the libido and the individual's general interest are turned outward toward people and objects of the external world, he or she is said to be extroverted. When the reverse is true, and libido and interest are centred on the individual, he or she is said to be introverted. In a completely normal individual these two tendencies alternate, neither dominating, but usually the libido is directed mainly in one direction or the other; as a result, two personality types are recognizable.

Jung rejected Freud's distinction between the ego and superego and recognized a part of the personality, similar to the superego, that he called the persona. The persona consists of what a person appears to be to others, in contrast to what he or she actually is. The persona is the role the individual chooses to play in life, the total impression he or she wishes to make on the outside world.

Alfred Adler, another of Freud's pupils, differed from both Freud and Jung in stressing that the motivating force in human life is the sense of inferiority, which begins when an infant can comprehend the existence of other people who are better able to care for themselves and cope with their environment. From the moment the feeling of inferiority is established, the child strives to overcome it. Because inferiority is intolerable, the compensatory mechanisms set up by the mind may get out of hand, resulting in self-centred neurotic attitudes, overcompensations, and a retreat from the real world and its problems.

Adler laid particular stress on inferiority feelings arising from what he regarded as the three most important relationships: those between the individual and work, friends, and loved ones. The avoidance of inferiority feelings in these relationships leads the individual to adopt a life goal that is often not realistic and is frequently expressed as an unreasoning will to power and dominance, leading to every type of antisocial behaviour from bullying and boasting to political tyranny. Adler believed that analysis can foster a sane and rational “community feeling” that is constructive rather than destructive.

Another student of Freud, Otto Rank, introduced a new theory of neurosis, attributing all neurotic disturbances to the primary trauma of birth. In his later writings he described individual development as a progression from complete dependence on the mother and family, to a physical independence coupled with intellectual dependence on society, and finally to complete intellectual and psychological emancipation. Rank also laid great importance on the will, defined as a positive guiding organization and integration of the self, which uses the instinctual drives creatively as well as inhibiting and controlling them.

Later noteworthy modifications of psychoanalytic theory include those of the American psychoanalysts Erich Fromm, Karen Horney, and Harry Stack Sullivan. The theories of Fromm lay particular emphasis on the concept that society and the individual are not separate and opposing forces, that the nature of society is determined by its historic background, and that the needs and desires of individuals are largely formed by their society. As a result, Fromm believed, the fundamental problem of psychoanalysis and psychology is not to resolve conflicts between fixed and unchanging instinctive drives in the individual and the fixed demands and laws of society, but to bring about harmony and an understanding of the relationship between the individual and society. Fromm also stressed the importance to the individual of developing the ability to use fully his or her mental, emotional, and sensory powers.

Horney worked primarily in the field of therapy and the nature of neuroses, which she defined as being of two types: situation neuroses and character neuroses. Situation neuroses arise from the anxiety attendant on a single conflict, such as being faced with a difficult decision. Although they may paralyse the individual temporarily, making it impossible to think or act efficiently, such neuroses are not deeply rooted. Character neuroses are characterized by a basic anxiety and a basic hostility resulting from a lack of love and affection in childhood.

Sullivan believed that all development can be described exclusively in terms of interpersonal relations. Character types and neurotic symptoms are explained as results of the struggle against anxiety arising from the individual's relations with others; they are security systems, maintained for the purpose of allaying anxiety.

An important school of thought is based on the teachings of the British psychoanalyst Melanie Klein. Because most of Klein's followers worked with her in England, this has become known as the English school. Its influence, nevertheless, is very strong throughout the European continent and in South America. Its principal theories were derived from observations made in the psychoanalysis of children. Klein posited the existence of complex unconscious fantasies in children under the age of six months. The principal source of anxiety arises from the threat to existence posed by the death instinct. Depending on how concrete representations of the destructive forces are dealt with in the unconscious fantasy life of the child, two basic early mental attitudes result that Klein characterized as a “depressive position” and a “paranoid position.” In the paranoid position, the ego's defence consists of projecting the dangerous internal object onto some external representative, which is treated as a genuine threat emanating from the external world. In the depressive position, the threatening object is introjected and treated in fantasy as concretely retained within the person. Depressive and hypochondriacal symptoms result. Although considerable doubt exists that such complex unconscious fantasies operate in the minds of infants, these observations have been very important to the psychology of unconscious fantasies, paranoid delusions, and theory concerning early object relations.

Freud was born in Freiberg, Moravia (now Příbor, Czech Republic), on May 6, 1856, and educated at Vienna University. When he was three years old his family, fleeing from the anti-Semitic riots then raging in Freiberg, moved to Leipzig. Shortly thereafter, the family settled in Vienna, where Freud remained for most of his life.

Although Freud’s ambition from childhood had been a career in law, he decided to become a medical student shortly before he entered Vienna University in 1873. Inspired by the scientific investigations of the German poet Goethe, Freud was driven by an intense desire to study natural science and to solve some challenging problems confronting contemporary scientists.

In his third year at the university Freud began research work on the central nervous system in the physiological laboratory under the direction of the German physician Ernst Wilhelm von Brücke. Neurological research was so engrossing that Freud neglected the prescribed courses and as a result remained in medical school three years longer than was normally required to qualify as a physician. In 1881, after completing a year of compulsory military service, he received his medical degree. Unwilling to give up his experimental work, however, he remained at the university as a demonstrator in the physiological laboratory. In 1883, at Brücke’s urging, he reluctantly abandoned theoretical research to gain practical experience.

Freud spent three years at the General Hospital of Vienna, devoting himself successively to psychiatry, dermatology, and nervous diseases. In 1885, following his appointment as a lecturer in neuropathology at Vienna University, he left his post at the hospital. Later the same year he was awarded a government grant enabling him to spend 19 weeks in Paris as a student of the French neurologist Jean-Martin Charcot. Charcot, who was the director of the clinic at the mental hospital the Salpêtrière, was then treating nervous disorders by the use of hypnotic suggestion. Freud’s studies under Charcot, which centred largely on hysteria, influenced him greatly in channelling his interests toward psychopathology.

In 1886 Freud established a private practice in Vienna specializing in nervous disease. He met with violent opposition from the Viennese medical profession because of his strong support of Charcot’s unorthodox views on hysteria and hypnotherapy. The resentment he incurred was to delay any acceptance of his subsequent findings on the origin of neurosis.

Freud’s first published work, On Aphasia, appeared in 1891; it was a study of the neurological disorder in which the ability to pronounce words or to name common objects is lost because of organic brain disease. His final work in neurology, an article, “Infantile Cerebral Paralysis,” was written in 1897 for an encyclopedia only at the insistence of the editor, since by this time Freud was occupied largely with psychological rather than physiological explanations for mental illnesses. His subsequent writings were devoted entirely to that field, which he had named psychoanalysis in 1896.

During the early years of the development of psychoanalysis, and even afterwards, Freud regarded himself as the bearer of painful truths that people, at least upon first hearing or reading, did not want to face. Psychoanalytically oriented therapy involves facing great pain in giving up certain deeply held, personally important beliefs. It is understandable that Nietzsche’s words would have touched a sympathetic chord in Freud when he wrote that whatever is truly productive is offensive. Nietzsche insisted, as did Freud, on resisting the temptation toward easy answers and superficiality in the face of painful truths. Nietzsche wrote of his own day that there was more need than ever of what continues to count as untimely, I mean: telling the truth (assuming, that is, that truth can be reached and communicated).

In 1894 The Antichrist and Nietzsche Contra Wagner (both completed in 1888) were first published. Nietzsche refers to himself as a psychologist in both works, referring to his analysis as ‘the psychology of conviction, of faith’. He states that ‘one cannot be a psychologist or physician without at the same time being an anti-Christian,’ that ‘philology and medicine [are] the two great adversaries of superstition,’ and that ‘“faith” as an imperative is the veto against science.’ Nietzsche offers a psychological analysis of the powerful and primitive forces at work in the experience and condition of faith, and a scathing attack on the Apostle Paul. Freud likewise had no affectionate feeling for Paul; he was an atheist and understood religious experience and belief from a psychological perspective related to Nietzsche’s understanding (as well as to that of Feuerbach, to whom both Nietzsche and Freud were indebted). Of particular importance for psychoanalysis (and for understanding Freud) is the idea of inventing a history (including a history of one’s self) to serve particular needs.

From the early years in the development of psychoanalysis up until the present day, there has been substantial discussion and debate regarding the extent to which Nietzsche discovered and elaborated upon ideas generally ascribed to Freud, as well as the extent to which Freud may have been influenced by Nietzsche in his development of a number of fundamental psychoanalytic concepts. In 1929 Thomas Mann, a great admirer of Freud, wrote: “He [Freud] was not acquainted with Nietzsche, in whose work everywhere appear gleams of insight anticipatory of Freud’s later views.” Mann considered Nietzsche to be “the greatest critic and psychologist of morals.” In an early study of the development of Freud’s thought, it was suggested that Freud was not aware of certain philosophical influences on his thought, that Nietzsche “must perhaps be looked upon as the founder of disillusioning psychology,” that “Nietzsche’s division into Dionysian and Apollonian . . . is almost completely identical with that of the primary and secondary function [process],” and that Nietzsche and certain other writers “were aware that the dream had a hidden meaning and significance for our mental life.” Karl Jaspers, who contributed to the fields of psychiatry, depth psychology and philosophy, frequently commented on Nietzsche’s psychological insights and discussed Nietzsche in relation to Freud and psychoanalysis. In his text General Psychopathology, only Freud appears more frequently than Nietzsche. He went so far as to state that Freud and psychoanalysis had used ideas pertaining to the “meaningfulness of psychic deviation . . . in a misleading way and this blocked the direct influence on [the study of] psychopathology of great people such as Kierkegaard and Nietzsche”; he also wrote of Freud popularizing “in crude form” certain ideas related to Nietzsche’s concept of sublimation.

Jones notes “a truly remarkable correspondence between Freud’s conception of the super-ego and Nietzsche’s exposition of the origin of the bad conscience.” Another analyst, Anzieu, offers a summary of Nietzsche’s anticipation of psychoanalytic concepts: It was Nietzsche who invented the term das Es (the id). He had some understanding of the economic point of view, which comprises the discharge and transfer of energy from one drive to another. However, he believed that aggression and self-destruction were stronger than sexuality. On several occasions he used the word sublimation (applying it to both the aggressive and the sexual instincts). He described repression, but called it inhibition; he talked of the super-ego and of guilt feelings, but called them resentment, bad conscience and false morality. Nietzsche also described, without giving them a name, the turning of drives against oneself, the paternal image, the maternal image, and the renunciation imposed by civilization on the gratification of our instincts. The “superman” was the individual who succeeded in transcending the conflict between established values and his instinctual urges, thus achieving inner freedom and establishing his own personal morality and scale of values; in other words, Nietzsche foreshadowed what was to be one of the major aims of psychoanalytic treatment.

While there is a growing body of literature examining the relationship between the writings of Freud and Nietzsche, there has appeared no detailed, comprehensive study on the extent to which Freud may have been influenced by Nietzsche through the course of his life and the complex nature of Freud’s personal and intellectual relationship to Nietzsche. In part this may be attributed to Freud’s assurances that he had never studied Nietzsche, that he had never been able to get beyond the first half page or so of any of his works, due both to the overwhelming wealth of ideas and to the resemblance of Nietzsche’s ideas to the findings of psychoanalysis. In other words, Freud avoided Nietzsche in part to preserve the autonomy of the development of his own ideas.

Nietzsche and Freud were influenced by many of the same currents of nineteenth-century thought. Both were interested in ancient civilization, particularly Greek culture. Both were interested in Greek tragedy (and debates about catharsis), and both were particularly drawn to the figure of Oedipus. Both were interested in and attracted to heroic figures and regarded themselves as such. Both held Goethe in the highest regard. They were influenced by Darwin, evolutionary theory, contemporary theories of energy, anthropology and studies of the origins of civilization. They were influenced by earlier psychological writings, including, possibly, those of Hippolyte Taine (1828-1893). They were also influenced by a basic historical sense, “the sense of development and change that was now permeating thinking in nearly every sphere.” They wanted to understand, so to speak, the animal in the human and, as unmaskers, were concerned with matters pertaining to the relation between instinct and reason, conscious and unconscious, rational and irrational, appearance and reality, surface and depth. Both attempted to understand the origins and power of religion and morality. They were influenced by the Enlightenment and its hopes for reason and science while at the same time being influenced by Romanticism’s preoccupations with the unconscious and irrational. While beginning their careers in other fields, both came to regard themselves, among other things, as depth psychologists.

All the same, one has to keep in mind the extent to which Nietzsche and Freud were both influenced by forces at work in the German-speaking world of the latter part of the nineteenth century, and the extent to which similarities in their thought might be attributed to such factors rather than to Nietzsche having a direct influence upon Freud.

For example, both Nietzsche and Freud were interested in anthropology; both read Sir John Lubbock (1834-1913) and Edward Tylor (1832-1917), and both were influenced by these authors. However, an examination of the similarities between Nietzsche and Freud would seem to indicate that there is also a direct influence of Nietzsche upon Freud, so that Wallace, for instance, writes of Nietzsche’s anticipation of and influence upon Freud. Also, Thatcher, while writing of Nietzsche’s debt to Lubbock, writes specifically of Nietzsche’s, not Lubbock’s, “remarkable” anticipation of an idea central to Freud‘s The Future of an Illusion.

One can also note Nietzsche’s inclination to use medical terminology in relation to psychological observation and “dissection”: at its present state as a specific individual science the awakening of moral observation has become necessary, and humans can no longer be spared the cruel sight of the moral dissection table and its knives and forceps. For here there rules that science which asks after the origin and history of the so-called moral sensations.

Freud wrote of analysts modelling themselves on the surgeon “who puts aside all his feelings, even his human sympathy, and concentrates his mental forces on the single aim of performing the operation as skilfully as possible,” and added: “The most successful cases are those in which one proceeds, as it were, without any purpose in view, allows oneself to be taken by surprise by any new turn in them, and always meets them with an open mind, free from any presuppositions.”

In regard to broad cultural and paradigm changes, Nietzsche was one of the thinkers who heralded such changes. In his book on Freud’s social thought, Berliner writes of the changes in intellectual orientation that occurred around 1885, stating that such changes were “reflected in the work of Friedrich Nietzsche.” Berliner goes on to mention some of Nietzsche’s contributions to understanding the human mind, conscience and the origins of civilization, and his being representative of ‘uncovering’ or ‘unmasking’ psychology. Berliner concludes, as have others, that “the generation of his [Freud’s] young maturity was permeated with the thought of Nietzsche.”

Nevertheless, although Freud expressed admiration for Nietzsche on a number of occasions, acknowledged his “intuitive” grasp of concepts anticipating psychoanalysis, placed him among the few persons he considered great, and stated in 1908 that “the degree of introspection achieved by Nietzsche had never been achieved by anyone, nor is it likely ever to be reached again,” he never acknowledged studying specific works of Nietzsche, nor did he state at any length or in any detail what his own thoughts were in regard to specific works or ideas of Nietzsche.

Whenever an idea of Nietzsche’s that may have influenced Freud is discussed without tracing its influence and development in Nietzsche, it can appear as if it is being suggested that Nietzsche formulated his ideas without the great help of his forerunners. Perhaps, then, taking note of the following words of Stephen Jay Gould regarding our discomfort with evolutionary explanations would be useful at this point: “one reason must reside in our social and psychic attraction to creation myths in preference to evolutionary stories . . . [creation myths] identify heroes and sacred places, while evolutionary stories provide no palpable particularity, no objects as symbols for reverence, worship or patriotism.” Or as Nietzsche put it: “Whenever one can see the act of becoming [in contrast to ‘present completeness and perfection’] one grows comparatively cool.”

It may be that the imbuing of our lives with myth, in this instance the myth of the hero (with implications for our relationship to Nietzsche and Freud, their relationships to themselves as heroes, and Freud’s relationship to Nietzsche), is not so readily relinquished even in the realm of scholarly pursuits, a notion Nietzsche elaborated upon on a number of occasions.

Nietzsche discusses the origins of Greek tragedy in the creative integration of what he refers to as Dionysian and Apollonian forces, named for the gods Apollo and Dionysus. Apollo is associated with law, with beauty and order, with reason, with self-control and self-knowledge, with the sun and light. Dionysus is associated with orgiastic rites, music, dance and later drama. He is the god who is ripped into pieces, dismembered (representing individuation), and whose rebirth is awaited (the end of individuation). Religious rituals associated with him enact and celebrate death and rebirth, and are associated with crops, including the grape (and wine and intoxication), and with sexuality. Frenzied, ecstatic female worshippers (maenads) are central to the rituals and celebrations. Both gods have a home in Delphi, Dionysus reigning in the winter when his dances are performed there.

In a note from The Will to Power Nietzsche defines the Apollonian and the Dionysian: The word “Dionysian” means: an urge to unity, a reaching out beyond personality, the everyday, society, reality, across the abyss of transitoriness; a passionate-painful overflowing into darker, fuller, more floating states, . . . the feeling of the necessary unity of creation and destruction. One contemporary classical scholar writes of “the unity of salvation and destruction . . . [as] a characteristic feature of all that is tragic.”

The word “Apollinian” means: the urge to perfect self-sufficiency, to the typical “individual,” to all that simplifies, distinguishes, makes strong, clear, unambiguous, typical: freedom under the law.

Strauss, Nietzsche notes, announces with admirable frankness that he is no longer a Christian, but he does not wish to disturb anyone’s peace of mind. Nietzsche writes of Strauss’ view of a new scientific man and his “faith”: “the heir of but a few hours, he is ringed around with frightful abysses, and every step he takes ought to make him ask: Whither? Whence? To what end?” However, rather than facing such frightful questions, Strauss’ scientific man seems to be permitted to live such a life on questions whose answers could at bottom be of consequence only to someone assured of eternity. Perhaps this also tended to encourage the belief that, as it has been put, all men dance to the tune of an invisible piper; yet not all things can be known, since the stray consequences of studying them would disturb the status quo, which can never therefore be fully discovered. History is not and cannot be determined; the supposed causes may not produce the consequences we expect.

Perhaps of even greater importance is Human, All Too Human. We have already commented on sublimation; to offer a working definition, to sublimate is to modify the natural expression of a primitive, instinctual impulse in a socially acceptable manner, and thus to divert the energy associated with an unacceptable impulse or drive into a personally and socially acceptable activity. Nonetheless, as Young points out, Nietzsche here heralds a new methodology. He contrasts metaphysical philosophy with his historical [later genealogical] philosophy. His is a methodology for philosophical inquiry into the origins of human psychology, a methodology not to be separated from the natural sciences. This inquiry “can no longer be separated from natural science,” and, as he will do on other occasions, he offers a call to those who might have the ears to hear: “Will there be many who desire to pursue such researches? Mankind likes to put questions of origins and beginnings out of its mind; must one not be almost inhuman to detect in oneself a contrary inclination?”

Nietzsche writes of the anti-nature of the ascetic ideal, how it relates to a disgust with itself, its continuing destructive effect upon the health of Europeans, and how it relates to the realm of “subterranean revenge” and ressentiment. Nietzsche writes of the repression of instincts (though not specifically of impulses toward sexual perversions) and of their being turned inward against the self, the “instinct for freedom forcibly made latent . . . this instinct for freedom pushed back and repressed.” Also, “this hatred of the human, and even more of the animal, and more still of the material.” Zarathustra also speaks of the tyranny of the holy or sacred: he once loved “thou shalt” as most sacred; now he must find illusion and caprice even in the most sacred, that freedom from his love may become his prey: the lion is needed for such prey. It would appear that while Freud’s formulation as it pertains to sexual perversions and incest is most likely not derived from Nietzsche (although along different lines incest was an important factor in Nietzsche’s understanding of Oedipus), the relating of the idea of the holy to the sacrifice or repression of instinctual freedom was very possibly influenced by Nietzsche, particularly in light of Freud’s reference to the ‘holy’ as well as to the ‘overman’. These issues were also explored in The Antichrist, which had been published just two years earlier. In addition, Freud wrote, perhaps for the first time, of sublimation: “I have gained a sure inkling of the structure of hysteria. Everything goes back to the reproduction of scenes. Some can be obtained directly, others always by way of fantasies set up in front of them. The fantasies stem from things that have been heard but understood subsequently, and all their material is of course genuine. They are protective structures, sublimations of the facts, embellishments of them, and at the same time serve for self-relief.”

Nietzsche had written of sublimation, and he specifically wrote of the sublimation of sexual drives in the Genealogy. Freud’s use of the term differs slightly from his later and more Nietzschean usage, such as in Three Essays on the Theory of Sexuality, but as Kaufmann notes, while “the word is older than either Freud or Nietzsche . . . it was Nietzsche who first gave it the specific connotation it has today.” Kaufmann regards the concept of sublimation as one of the most important concepts in Nietzsche’s entire philosophy. Furthermore, Freud wrote that a ‘presentiment’ told him he would ‘very soon uncover the source of morality’; this is the very subject of Nietzsche’s Genealogy.

At a later time in his life Freud claimed he could not read more than a few passages of Nietzsche because he was overwhelmed by the wealth of ideas. This claim might be supported by the fact that Freud demonstrates only a limited understanding of certain of Nietzsche’s concepts. For example, his reference to the “overman” demonstrates a lack of understanding of the overman as a being of the future whose freedom involves creative self-overcoming and sublimation, not simply freely gratified primitive instincts. Later in life, in Group Psychology and the Analysis of the Ego, Freud demonstrates a similar misunderstanding in equating the overman with the tyrannical father of the primal horde. Perhaps Freud confused the overman with the “master” whose morality is contrasted with that of “slave” morality in the Genealogy and Beyond Good and Evil. The conquering master more freely gratifies instinct and affirms himself, his world and his values as good. The conquered slave, unable to express himself freely, creates a negating, resentful, vengeful morality glorifying his own crippled, alienated condition, and he creates a division not between good (noble) and bad (contemptible), but between good (undangerous) and evil (wicked and powerful, hence dangerous); the two moralities at times “occur within a single soul.”

Although Nietzsche never gave dreams anything like the attention and analysis given by Freud, he was definitely not one of “the dark forest of authors who do not see the trees, hopelessly lost on wrong tracks.” Yet, whether he is reviewing the literature on dreams or writing at any other point in his life, Freud will not, in specific and detailed terms, discuss Nietzsche’s ideas as they pertain to psychoanalysis, just as he will never state exactly when he read or did not read Nietzsche or what he did or did not read. We may never know which of Nietzsche’s passages on dreams Freud may have read, or heard of, or read of, as he was working on The Interpretation of Dreams. Freud’s May 31, 1897, letter to Fliess includes a reference to the overman, contrasting this figure with the saintly or holy, which is (as is civilization) connected to instinctual renunciation, particularly of incest and sexual perversion. Freud also writes that he has a presentiment that he shall “soon uncover the source of morality,” the subject of Nietzsche’s Genealogy. Earlier, he made what may have been his first reference to sublimation, a concept explored and developed by Nietzsche. We have also pointed to the possible, perhaps even likely, allusions to Nietzsche in letters of September and November 1897, which refer respectively to Nietzsche’s notion of a revaluation or transvaluation of all values and to Nietzsche’s idea of the relationship of our turning our noses away from what disgusts us, our own filth, to our civilized condition, our becoming “angels.” Freud adds specifically that so too consciousness turns away from memory: “This is repression.” Then there is Nietzsche’s passage on dreams in which he refers to Oedipus and to the exact passage that Freud refers to in The Interpretation of Dreams. One author has referred to Nietzsche’s idea as coming “preternaturally close to Freud.” At a later point we will see that in Freud’s remarks in The Interpretation of Dreams on the distinctiveness of psychoanalysis and his achievements regarding the understanding of the unconscious (his unconscious versus the unconscious of philosophers), Nietzsche is perhaps made present through his very absence.

These ideas of Nietzsche’s on dreams are not merely of interest in regard to the ways in which they anticipate Freud. They are very much related to more recent therapeutic approaches to the understanding of dreams: Nietzsche values dreaming states over waking states regarding the dream’s closeness to the “ground of our being”; the dream “informs” us of feelings and thoughts that “we do not know or feel precisely while awake”; in dreams “there is nothing unimportant or superfluous”; the language of dreams entails ‘chains of symbolical scenes’ and images in place of [and akin to] the language of poetic narration; content, form, duration, performer, spectator - in these comedies you are all of this yourself (and these comedies include the “abominable”). Recent life experiences and tensions, “the absence of nourishment during the day,” give rise to these dream inventions, which “give scope and discharge to our drives.”

The self, in its manifestations in constructing dreams, may be an aspect of our psychic lives that knows things that our waking “I” or ego may not know and may not wish to know, and a relationship may be developed between these aspects of our psychic lives in which the latter opens itself creatively to the communications of the former. Zarathustra states: “Behind your thoughts and feelings, my brother, there stands a mighty ruler, an unknown sage - whose name is self. In your body he dwells; he is your body.” However, Nietzsche’s self cannot be understood as a replacement for an all-knowing God to whom the “I” or ego appeals for its wisdom, commandments, guidance and the like. To open oneself to another aspect of oneself that is wiser (“an unknown sage”), in the sense that new information can be derived from it, does not necessarily entail that this “wiser” component of one’s psychic life has God-like knowledge and commandments which, if one (one’s “I”) interprets and obeys correctly, will set one on the straight path. It is true, though, that when Nietzsche writes of the self as “a mighty ruler and unknown sage” he does open himself to such an interpretation, and even to the possibility that this “ruler” is unreachable, unapproachable for the “I.” However, the context of the passage (Nietzsche/Zarathustra redeeming the body) and the surrounding sections, including “On the Despisers of the Body,” make it clear that there are aspects of our psychic selves that interpret the body and mediate its direction, ideally in ways that do not deny the body but that aid the body in doing “what it would do above all else, to create beyond itself.”

Nietzsche explored the ideas of psychic energy and of drives pressing for discharge. His notion of sublimation typically implies an understanding of drives in just such a sense, as does his idea that dreams provide for the discharge of drives. However, he did not relegate all that is derived from instinct and the body to this realm. While for Nietzsche there is no stable, enduring true self awaiting discovery and liberation, the body and the self (in the broadest sense of the term, including what is unconscious and may be at work in dreams, as Rycroft describes it) may offer up potential communication and direction to the “I” or ego. However, at times Nietzsche describes the “I” or ego as having very little, if any, idea as to how it is being lived by the “it.”

Nietzsche, like Freud, describes two types of mental processes, one of which “binds” [man’s] life to reason and its concepts in order not to be swept away by the current and to lose himself; the other, pertaining to the world of myth, art and the dream, “constantly showing the desire to shape the existing world of the wide-awake person to be variegatedly irregular and disinterestedly incoherent, exciting and eternally new, as is the world of dreams.” Art may function as a “middle sphere” and middle faculty (a transitional sphere and faculty) between a more primitive “metaphor-world” of impressions and the forms of uniform abstract concepts.

All the same, it is difficult to understand what Freud could have meant by not reading Nietzsche in his later years, as well as to determine whether he acknowledged having read Nietzsche in earlier years. Freud never tells us exactly what he read of Nietzsche and never tells us exactly which years were those during which he avoided Nietzsche. We do know, of course, that a few years earlier, in 1908, Freud had read and discussed Nietzsche, including a work of direct relevance to his own anthropological explorations as well as to ideas pertaining to the relationship between the repression of instinct and the development of the inner world and conscience. We have also seen that lectures, articles and discussions on Nietzsche continued around Freud. It does seem, though, that Freud demonstrates a readiness to “forgo all claims to priority” regarding the psychological observations of Nietzsche and others that the science of psychoanalysis has confirmed.

Nevertheless, Nietzsche recognized the aggressive instinct and will to power in various forms and manifestations, including sublimated mastery, all of which are prominent in Freud’s writings.

We can also take note of the power and importance Freud ascribed to rational thinking and scientific laws. Freud writes that the world-view erected upon science involves “submission to the truth and rejection of illusions.” He writes, quoting Goethe, of “Reason and Science, the highest strength possessed by man,” and of “the bright world governed by relentless laws which has been constructed for us by science.” However, he also writes of the strict discipline this requires, and of a resistance that stirs within us against the relentlessness and monotony of the laws of thought and against the demands of reality-testing. Reason becomes the enemy which withholds from us so many possibilities of pleasure.

However bright the world of science is, and however much reason and science represent “the highest strength possessed by man,” this world, these laws, these faculties require from us “submission” to a withholding enemy that imposes “strict discipline” with “relentlessness and monotony.” However much this language pertains to a description of universal problems in human development, one may wonder whether it does not also reflect Freud’s own experience of the call of reason as demanding a relentless and laborious submission.

There is no reason that empirical research cannot be of help in determining what kinds of “self-description” or narratives (as well as, of course, many other aspects of the therapeutic process) may be effective for different kinds of persons with different kinds of difficulties in different kinds of situations. From a Nietzschean perspective, while it is obvious and even desirable that the therapist will influence the patient’s or client’s self-descriptions and narratives, and the converse as well, a high value will be placed, however much it is a joint creation of a shared reality, on encouraging the individual to fashion a self-understanding, self-description or narrative that is to a significant extent his or her own creation. That one has been creative in this way (and, hopefully, can go on creating) will be a very different experience from having the therapist’s narrative simply replace the original narrative brought to therapy. It can be thought of as the individual’s increased capacity for the playful, creative application of a perspectivist approach to his or her life experience and history, though this approach, like any other, would be understood as related to the sublimation of drives as an aspect of the pursuit of truth. This does entail that one no longer searches with the understanding that what one finds is uncovered like an archaeological find.

Both Freud and Nietzsche are engaged in a redefinition of the root of subjectivity, a redefinition that replaces the moral problematic of selfishness with the economic problematic of what Freud would call narcissism . . . [Freud and Nietzsche elaborate upon] the whole field of libidinal economy: the transit of libido through other selves, aggression, the infliction and reception of pain, and something very much like death (the total evacuation of the entire quantum of excitation with which the organism is charged).

The id, ego and superego. The effort to clarify the bewildering number of interrelated observations uncovered by psychoanalytic exploration led to the development of a model of the structure of the psychic system. Three functional systems are distinguished, conveniently designated as the id, ego, and superego.

The first system refers to the sexual and aggressive tendencies that arise from the body, as distinguished from the mind. Freud called these tendencies ‘Triebe’, which literally means “drives,” but which is often inaccurately translated as “instincts” to indicate their innate character. These inherent drives claim immediate satisfaction, which is experienced as pleasurable; the id thus is dominated by the pleasure principle. In his later writings, Freud tended more toward psychological rather than biological conceptualization of the drives.

How the conditions for satisfaction are to be brought about is the task of the second system, the ego, which is the domain of such functions as perception, thinking, and motor control that can accurately assess environmental conditions. In order to fulfill its function of adaptation, or reality testing, the ego must be capable of enforcing the postponement of satisfaction of the instinctual impulses originating in the id. To defend itself against unacceptable impulses, the ego develops specific psychic means, known as defence mechanisms. These include repression, the exclusion of impulses from conscious awareness; projection, the process of ascribing to others one's own unacknowledged desires; and reaction formation, the establishment of a pattern of behaviour directly opposed to a strong unconscious need. Such defence mechanisms are put into operation whenever anxiety signals a danger that the original unacceptable impulses may reemerge.

An id impulse becomes unacceptable, not only as a result of a temporary need for postponing its satisfaction until suitable reality conditions can be found, but more often because of a prohibition imposed on the individual by others, originally the parents. The totality of these demands and prohibitions constitutes the major content of the third system, the superego, the function of which is to control the ego in accordance with the internalized standards of parental figures. If the demands of the superego are not fulfilled, the person may feel shame or guilt. Because the superego, in Freudian theory, originates in the struggle to overcome the Oedipal conflict, it has a power akin to an instinctual drive, is in part unconscious, and can give rise to feelings of guilt not justified by any conscious transgression. The ego, having to mediate among the demands of the id, the superego, and the outside world, may not be strong enough to reconcile these conflicting forces. The more the ego is impeded in its development because of being enmeshed in its earlier conflicts, called fixations or complexes, or the more it reverts to earlier satisfactions and archaic modes of functioning, known as regression, the greater is the likelihood of succumbing to these pressures. Unable to function normally, it can maintain its limited control and integrity only at the price of symptom formation, in which the tensions are expressed in neurotic symptoms.

Nietzsche suggests that in our concern for the other, in our sacrifice for the other, we are concerned with ourselves, one part of ourselves represented by the other. That for which we sacrifice ourselves is unconsciously related to as another part of us. In relating to the other we are in fact relating to a part of ourselves, and we are concerned with our own pleasure and pain and our own expression of the will to power. In one analysis of pity Nietzsche states that, although we are, to be sure, not consciously thinking of ourselves, it is primarily our own pleasure and pain that we are concerned about, and the feelings and reactions that ensue are multi-determined.

Nietzsche holds that we have a divided nature and that we respond to others in part on the basis of projecting onto and identifying with aspects of ourselves in them. In Human, All Too Human, Nietzsche writes of a deception in love: we forget a great deal of our own past and deliberately banish it from our minds . . . we want the image of ourselves that shines upon us out of the past to deceive us and flatter our self-conceit - we are engaged continually on this self-deception. Do you think, you who speak so much of ‘self-forgetfulness in love’, of ‘the merging of the ego in the other person’, and laud it so highly, do you think this is anything essentially different? We shatter the mirror, impose ourselves upon someone we admire, and then enjoy our ego’s new image, even though we may call it by that other person’s name.

It is a commonplace that beauty lies in the eye of the beholder, but all the same, we still talk of the beauty of things and people as if these were identifiable real properties which they possess. Projectivism denotes any view which sees us as similarly projecting upon the world what are in fact modifications of our own minds. According to this view, sensations are displaced from their rightful place in the mind when we think of the world as coloured or noisy. Other examples of the idea involve things other than sensations and do not consist of any literal displacement. One is that all contingency is a projection of our ignorance; another is that the causal order of events is a projection of our mental confidence in the way they follow from one another. However, the most common application of the idea is in ethics and aesthetics, where many writers have held that talk of the value or beauty of things is a projection of the attitudes we take toward them and the pleasure we take in them.

It is natural to associate projectivism with the idea that we make some kind of mistake in talking and thinking as if the world contained the various features we describe it as having, when in reality it does not. However, the view that we make no mistake, but simply adopt efficient linguistic expression for necessary ways of thinking, is also held.

Nonetheless, in the Dawn, Nietzsche describes man, in the person of the ascetic, as ‘split asunder into a sufferer and a spectator’, enduring and enjoying within himself (as a consequence of his drive for ‘distinction’, his will to power) that which the barbarian imposes on others. As Staten points out, Nietzsche asks whether the basic disposition of the ascetic and of the pitying god who creates suffering humans can be held simultaneously, and whether one would do ‘hurt to others in order thereby to hurt oneself, in order then to triumph over oneself and one’s pity and revel in an extremity of power’. Nietzsche appears to be suggesting that in hurting the other I may, through identification, be attempting to hurt one part of myself, so that in my triumph over the other I may be as concerned with one part of myself triumphing over that part of myself I identify within the other, as well as thereby overcoming pity and in consequence ‘revelling in an extremity of power.’ (Or, in a variation of such dynamics, as Michel Hulin has put it, the individual may be ‘tempted to play both roles at once, contriving to torture himself in order to enjoy all the more his own capacity for overcoming suffering’.)

In addition to Nietzsche’s writing specifically of the sublimation of the libidinous drive, the will to power and its vicissitudes are described at times in ways related to sexual as well as aggressive drives, particularly in the form of appropriation and incorporation. As Staten points out, this notion of the primitive will to power is similar to Freud’s idea in Group Psychology and the Analysis of the Ego according to which ‘identification [is] the earliest expression of an emotional tie with another person . . . It behaves like a derivative of the first, oral phase of the organization of the libido, in which the object that we long for and prize is assimilated by eating’. It would appear that Nietzsche goes a step further than Freud in one of his notes when he writes: ‘Nourishment - is only derivative; the original phenomenon is: to desire to incorporate everything’. Staten also concludes that ‘if Freudian libido contains a strong element of aggression and destructiveness, Nietzschean will to power never takes place without a pleasurable excitation that there is no reason not to call erotic’. However, there is that element of ‘enigma and cruelty’, which is only imposed on the beloved object and increases in proportion to the love . . . cruel people being always masochists also, the whole thing being inseparable from bisexuality. One can only imagine how far, and to what extent, Nietzsche would have expanded on insights other than Freud’s.

Freud’s new orientation was preceded by his collaborative work on hysteria with the Viennese physician Josef Breuer. The work was presented in 1893 in a preliminary paper and two years later in an expanded form under the title Studies on Hysteria. In this work the symptoms of hysteria were ascribed to manifestations of undischarged emotional energy associated with forgotten psychic traumas. The therapeutic procedure involved the use of a hypnotic state in which the patient was led to recall and reenact the traumatic experience, thus discharging by catharsis the emotions causing the symptoms. The publication of this work marked the beginning of psychoanalytic theory formulated on the basis of clinical observations.

From 1895 to 1900 Freud developed many concepts that were later incorporated into psychoanalytic practice and doctrine. Soon after publishing the studies on hysteria he abandoned the use of hypnosis as a cathartic procedure and substituted the investigation of the patient’s spontaneous flow of thoughts, called free association, to reveal the unconscious mental processes at the root of the neurotic disturbance.

Nietzsche discusses the origins of Greek tragedy in the creative integration of what he calls Dionysian and Apollonian forces. Apollo is associated with law, with beauty and order, with reason, with self-control and self-knowledge, with the sun and light. Dionysus is associated with orgiastic rites, music, dance and later drama. Religious rituals associated with him enact and celebrate death, rebirth and fertility. He is also associated with crops, including the grape (and the wine of intoxication), and with sexuality. Frenzied, ecstatic female worshippers (maenads) are central to the rituals and celebrations.

In a note from The Will to Power Nietzsche characterizes the Apollonian and the Dionysian: the word ‘Dionysian’ means an urge to unity, a reaching out beyond personality, the everyday, society, reality, across the abyss of transitoriness: a passionate-painful overflowing into darker, fuller, more floating states, . . . the feeling of the necessary unity of creation and destruction. [One contemporary classical scholar writes of ‘the unity of salvation and destruction . . . (as) a characteristic feature of all that is tragic.’]

The word ‘Apollinian’ means, among other things, the urge to perfect self-sufficiency, to the typical ‘individual’, to all that simplifies, distinguishes, makes strong, clear, unambiguous, typical: freedom under the law. Apollo is described as a dream interpreter.

Yet, all the same, we might discern Nietzsche’s influence in an important paper of this period, the 1914 paper ‘On Narcissism: An Introduction’. In this paper, Freud explores, among other things, the effects of his finding of an original libidinal cathexis of the ego, from which some is later given off to objects, but which fundamentally persists and is related to the object-cathexes much as the body of an amoeba is related to the pseudopodia which it puts out.

The development of the ego consists in a departure from primary narcissism and results in a vigorous attempt to recover that state. This departure is brought about by means of the displacement of libido onto an ego-ideal imposed from without, and satisfaction is derived from fulfilling this ideal. Simultaneously, the ego sends out libidinal object-cathexes. It becomes impoverished in favour of these cathexes, just as it does in favour of the ego-ideal, and it enriches itself again from its satisfactions in respect of the object, just as it does by fulfilling its ideal.

Freud considers the implications of these findings for his dual instinct theory, which divides instincts into the duality of ego instincts and libidinal instincts. Freud questions this division but does not definitively abandon it, as he will later do in Beyond the Pleasure Principle.

As indicated, one of Freud’s important points is that the ego tries to recover its state of primary narcissism. This is related to important themes running through Nietzsche’s writings. Nietzsche is aware of how we relate to others based on projections of idealized images of ourselves, and he is consistently looking for the ways in which we are loving ourselves and aggrandizing ourselves in activities that seem to reflect contrary motivations.

Nietzsche attempts to show that Greek culture and drama had accomplished the great achievement of recognising and creatively integrating the substratum of the Dionysian with the Apollonian. As Siegfried Mandel suggests, Nietzsche destroyed widely held aesthetic views, inspired in 1755 by the archaeologist-historian Johann Winckelmann, about the ‘noble simplicity, calm grandeur’, ‘sweetness and light’, harmony and cheerfulness of the ancient Greeks, and posited instead the dark Dionysian forces that had to be harnessed to make possible the birth of tragedy.

It is also important to consider that it is through the dream’s Apollonian images that the Dionysian reality can be manifested and known, just as it is through the individuated actors on stage that the underlying Dionysian reality is manifested in Greek tragedy. At its most creative, the Apollonian can allow an infusion of the harnessed Dionysian, but we should also note that Nietzsche is quite explicit that the splendour of the Apollonian impulse stood before an art that in its frenzies, rapture and excess ‘spoke the truth . . . Excess revealed itself as truth’, and that against this new power, the Dionysian, the Apollonian rose to the austere majesty of Doric art and the Doric view of the world. For Nietzsche, ‘the Dionysian and the Apollonian, in new births ever following and mutually augmenting one another, controlled the Hellenic genius.’

Nietzsche is unchallenged as one of the most insightful and powerful critics of the moral climate of the 19th century (and of what remains of it in ours). His explorations bring forward an acknowledgement of unconscious motivation and of the conflict of opposing forces within the mind, together with the possibility of their creative integration. Nietzsche distinguishes between two types of mental processes and is aware of the conflict between unconscious instinctual impulses and wishes and inhibiting or repressing forces. Both Freud and Nietzsche are engaged in a redefinition of the root of subjectivity, a redefinition that replaces the moral problematic of selfishness with the economic problematic of what Freud would call narcissism; . . . Freud and Nietzsche elaborate upon the whole field of libidinal economy: the transit of the libido through other selves, aggression, the infliction and reception of pain, and something very much like death, the total evacuation of the entire quantum of excitation with which the organism is charged.

The real world is flux and change for Nietzsche, but in his later works there is no “unknowable true world.” The split between a surface, apparent world and an unknowable but true world of things-in-themselves was, as is well known, a view Nietzsche rejected. For one thing, as Mary Warnock points out, Nietzsche was attempting to get across the point that there is only one world, not two. She also suggests that for Nietzsche, if we contribute anything to the world, it is the idea of a “thing,” and in Nietzsche’s words, “the psychological origin of the belief in things forbids us to speak of things-in-themselves.”

Nietzsche holds that there is an extra-mental world to which we are related and with which we have some kind of connection. For him, even as knowledge develops in the service of self-preservation and power, to be effective a conception of reality will have a tendency to grasp only a certain amount of, or aspect of, reality. However much Nietzsche may at times see (the truth of) artistic creation and dissimulation (out of chaos) as paradigmatic for science (which will not recognize it as such), in arriving at this position Nietzsche assumes the truth of scientifically based beliefs as a foundation for many of his arguments, including those regarding the origin, development and nature of perception, consciousness and self-consciousness, and what this entails for our knowledge of and falsification of the external and inner world. In fact, to some extent the form-providing, affirmative, this-worldly healing of art is a response to the terrifying, nausea-inducing truths revealed by science, which by itself has no treatment for the underlying cause of the nausea; Nietzsche also writes of horrifying existential truths against which science can attempt a [falsifying] defence. Nevertheless, while there is a real world to which we are related, there is no sensible way to speak of a nature or constitution or eternal essence of the world apart from description and perspective. Also, the states of affairs to which our interpretations are to fit are established within human perspectives and reflect (but not only) our interests, concerns and needs for calculability. Within such relations (and perhaps as meta-commentary on the grounds of our knowing), Nietzsche is quite willing to write of the truth, the constitution of reality, and the facts of the case. There appears to be no unrestricted will to power, nor the privilege of absolute truth. To expect a pure desire for a pure truth is to expect an impossible desire for an illusory ideal.

What escapes articulation comes to rule supreme in oblivion, either in the individual’s forgetfulness or in those long stretches of the collective past that have never been and will never be called forth into the necessarily incomplete articulations of history, the record of human existence that is profusely interspersed with dark passages. This accounts for the continuous questing of archaeology, palaeontology, anthropology and geology, and accounts, too, for Nietzsche’s warning against the “insomnia” of historicism. As for the individual, the same drive is behind the modern fascination with the unconscious and, thus, with dreams, and it was Nietzsche who, before Freud, spoke of forgetting as an activity of the mind. At the beginning of his Genealogy of Morals, he claims, in defiance of all psychological “shallowness,” that the lacunae of memory are not merely “passive” but the outcome of an active and positive “screening,” preventing us from remembering what would upset our equilibrium. Nietzsche is thus the first discoverer of successful “repression,” the burying of potential experience in what remains, for articulation, enemy territory.

Still, he is notorious for stressing the ‘will to power’ that is the basis of human nature, the ‘resentment’ that arises when it is denied its basis in action, and the corruptions of human nature encouraged by religions, such as Christianity, that feed on such resentment. Yet the powerful human being who escapes all this, the ‘Übermensch’, is not the ‘blond beast’ of later fascism: it is a human being who has mastered passion, risen above the senseless flux, and given creative style to his or her character. Nietzsche’s free spirits recognize themselves by their joyful attitude toward eternal return. He frequently presents the creative artist rather than the warlord as his best exemplar of the type, but the disquieting fact remains that he seems to leave himself no words with which to condemn any uncaged beast of prey who finds his style by exerting repulsive power over others. Nietzsche’s frequently expressed misogyny does not help this problem, although in such matters the interpretation of his many-layered and ironic writing is not always straightforward. Similarly, such anti-Semitism as is found in his work is balanced by equally intense denunciations of anti-Semitism, and by an equal or greater contempt for the German character of his time.

Nietzsche’s current influence derives not only from his celebration of the will, but more deeply from his scepticism about the notions of truth and fact. In particular, he anticipated many central tenets of postmodernism: an aesthetic attitude toward the world that sees it as a ‘text’, the denial of facts, the denial of essences, the celebration of the plurality of interpretations and of the fragmented self, and the politicization of discourse, all of which awaited their rediscovery in the late 20th century. Nietzsche also has the incomparable advantage over his followers of being a wonderful stylist, and his perspectivism is echoed in the shifting array of literary devices (humour, irony, exaggeration, aphorism, verse, dialogue, parody) with which he explores human life and history.

All the same, Nietzsche is openly pessimistic about the possibility of knowledge: ‘We simply lack any organ for knowledge, for ‘truth’: We ‘know’ (or believe or imagine) just as much as may be useful in the interests of the human herd, the species, and perhaps precisely that most calamitous stupidity of which we shall perish some day’ (The Gay Science).

Nonetheless, that refutation assumes that if a view, such as perspectivism itself, is an interpretation, it is by that very fact wrong. This is not so, however; to say a view is an interpretation is to say that it can be wrong, which is true of all views, and that is not a sufficient refutation. To show that perspectivism is actually false, it is necessary to produce another view superior to it on specific epistemological grounds.

Perspectivism does not deny that particular views can be true. Like some versions of contemporary anti-realism, it attributes truth to specific approaches in relation to the facts those approaches themselves specify. Still, it refuses to envisage a single independent set of facts to be accounted for by all theories. Thus, Nietzsche grants the truth of specific scientific theories; he does, however, deny that a scientific interpretation can possibly be ‘the only justifiable interpretation of the world’. Neither the facts science addresses nor the methods it employs are privileged: they serve the purposes for which they have been devised, but these have no priority over the many other purposes of human life.

Every schoolchild learns eventually that Nietzsche was the author of the shocking slogan, "God is dead." However, what makes that statement possible is another claim, even more shocking in its implications: "Only that which has no history can be defined" (Genealogy of Morals). Since Nietzsche was the heir to seventy-five years of German historical scholarship, he knew that there was no such thing as something that has no history. Darwin, as Dewey points out, had effectively shown that the search for a true definition of a species is not only futile but unnecessary (since a species is something temporary, something that changes over time, without any permanent and stable reality). Nietzsche dedicates his philosophical work to doing the same for all cultural values.

It is important to reflect for a moment on the full implications of this claim. Our study of moral philosophy began with a dialectical exchange exploring the question "What is virtue?" The firm position taken there is that until we can settle the issue with a definition that eludes all cultural qualification, until we know what virtue is, we cannot effectively deal with morality except through divine dispensation, unexamined reliance on tradition, skepticism, or relativism (the position of Thrasymachus). The full exploration of what that question of definition might require takes place in the Republic.

Many texts we read subsequently took up Plato's challenge, seeking to discover, through reason, a permanent basis for understanding knowledge claims and moral values. No matter what the method, as Nietzsche points out in his first section, the belief was always that grounding knowledge and morality in truth was possible and valuable, that the activity of seeking to ground morality was conducive to a fuller good life, individually and communally.

To use a favourite metaphor of Nietzsche's, we can say that previous systems of thought had sought to provide a true transcript of the book of nature. They made claims about the authority of one true text. Nietzsche insists repeatedly that there is no single canonical text; there are only interpretations. So there is no appeal to some definitive version of Truth (whether we search in philosophy, religion, or science). Thus the Socratic quest for some way to tie morality down to the ground, so that it does not fly away, is (and has always been) futile, although the long history of attempts to do so has disciplined the European mind so that we, or a few of us, are ready to move into dangerous new territory where we can put the most basic assumptions about the need for conventional morality to the test and move on "Beyond Good and Evil," that is, to a place where we no longer take the universalizing concerns and claims of traditional morality seriously.

Nietzsche begins his critique here by challenging that fundamental assumption: Who says that seeking the truth is better for human beings? How do we know an untruth is not better? What is truth anyway? In doing so, he challenges the sense of purpose basic to the traditional philosophical endeavour. Philosophers, he points out early, may be proud of the way they begin by challenging and doubting received ideas. However, they never challenge or doubt the key notion they all start with, namely, that there is such a thing as the Truth and that it is something valuable for human beings (surely much more valuable than its opposite).

In other words, just as the development of the new science had gradually, and for many painfully and rudely, emptied nature of any certainty about a final purpose or about the ultimate value of scientific knowledge, so Nietzsche, with the aid of the new historical science (and the proto-science of psychology), is emptying all sources of cultural certainty of their traditional purposiveness and claims to permanent truth, and therefore of their value as we have traditionally understood the term. There is thus no antagonism between good and evil, since all such systems of value are equally fictive (although some may be more useful for the purposes of living than others).

At this point I do not want to analyse the various ways Nietzsche deals with this question. Nevertheless, I do want to insist upon the devastating nature of his historical critique of all previous systems that have claimed to ground knowledge and morality in a clearly defined truth of things. For Nietzsche's genius rests not only on his adopting the historical critique and applying it to new areas, but much more on his astonishing perspicuity in seeing just how extensive and flexible the historical method might be.

For example, Nietzsche, like some of those before him, insists that value systems are culturally determined: they arise, as often as not, from or in reaction to conventional folk wisdom. Yet to this he adds something that to us, after Freud, may be well accepted, but that in Nietzsche's hands becomes shocking: understanding a system of value, he claims, requires us more than anything else to see it as the product of a particular individual's psychological history, a uniquely personal confession. The "meaning" of a moral system has nothing to do with its relationship to something called the "Truth"; instead we seek its coherence in the psychology of the philosopher who produced it.

Gradually it has become clear to me, Nietzsche writes, what every great philosophy has so far been: namely, the personal confession of its author, a kind of involuntary and unconscious memoir; and also that the moral (or immoral) intentions in every philosophy formed the real germ of life from which the whole plant had grown.

The historical critique is here trained upon the life of the person proposing a particular "truth," unmasking claims to "truth" as expressions of that life. Systems offering us a route to the Truth are simply psychologically produced fictions that serve the deep (often unconscious) purposes of the individual proposing them. Therefore they are what Nietzsche calls "foreground" truths. They do not penetrate into the deep reality of nature, and to fail to see this is to lack "perspective."

Even more devastating is Nietzsche's extension of the historical critique to language itself. Since philosophical systems deliver themselves to us in language, they are shaped by that language and by its history. Our Western preoccupation with the inner self that perceives, wills, and so forth, Nietzsche can dismiss as, in large part, the product of grammar, the result of a language that builds its statements around a subject and a predicate. Without that historical accident, Nietzsche affirms, we would not have committed the error of mistaking for the truth something that is a by-product of our particular culturally determined language system.

He makes the point, for example, that our faith in consciousness is just an accident. If instead of saying "I think," we were to say "Thinking is going on in my body," then we would not be tempted to give the "I" some independent existence (e.g., in the mind) and make large claims about the ego or the inner self. The reason we do search for such an entity stems from the accidental construction of our language, which encourages us to use a subject (the personal pronoun) and a verb. The same false confidence in language also makes it easy for us to think that we know clearly what key things like "thinking" and "willing" are; whereas, if we were to engage in even a little reflection, we would quickly realize that the inner processes neatly summed up by these apparently clear terms are anything but clear. His emphasis on the importance of psychology as queen of the sciences underscores his sense of how much more fully we need to understand these activities, particularly the emotional appetites, before we talk about them as simplistically as philosophers have customarily done.

This remarkable insight enables Nietzsche, for example, at one blow and with cutting contempt, to dismiss as "trivial" the system Descartes had set up so carefully in the Meditations. Descartes's triviality consists in his failing to recognize how the language he uses as an educated European imprisons and shapes his philosophical system, and in his facile treatment of what thinking is in the first place. The famous Cartesian dualism is not a central philosophical problem but an accidental by-product of grammar pressed into the service of Descartes's own particular psychological needs. Similarly, Kant's discovery of "new faculties" Nietzsche derides as just a trick of language - a way of providing what looks like an explanation but is, in fact, as ridiculous as the old notion that medicines put people to sleep because they have a sleeping virtue.

It should be clear from examples like this (and the others throughout) that very little is capable of surviving Nietzsche's onslaught, for what is there to which we can point that does not have a history, or that is not delivered to us in a historically developing system of language? After all, our scientific enquiries in all areas of human experience teach us that nothing simply is, for everything is always becoming.

Nietzsche had written that with the repression of the instincts and their turn inward, ‘the entire inner world, originally as thin as if it were stretched between two membranes, expanded and extended itself, acquired depth, breadth, and height.’ In the same work he writes of the ‘bad conscience’ . . . [as] ‘the womb of all ideal and imaginative phenomena,’ which brought to light ‘an abundance of strange new beauty and affirmation, and perhaps beauty itself.’

These developments led Freud to posit an original libidinal cathexis of the ego, from which some libido is later given off to objects, but which fundamentally persists and is related to the object-cathexes much as the body of an amoeba is related to the pseudopodia which it puts out.

The development of the ego consists in a departure from primary narcissism and gives rise to a vigorous attempt to recover that state. This departure is brought about by means of the displacement of libido onto an ego-ideal imposed from without, and satisfaction is gained from fulfilling this ideal.

While the ego has sent out the libidinal object-cathexes, it becomes impoverished in favour of these cathexes, and it enriches itself again from its satisfactions in respect of the object, just as it does by fulfilling its ideal.

Freud considers the implications of such findings for his dual-instinct theory, which divides the instincts into ego instincts and libidinal instincts. Freud questions this division, but does not definitively abandon it; that he will do in Beyond the Pleasure Principle.

As indicated, one of Freud’s important points is that the ego attempts to recover its state of primary narcissism. This is related to important themes: we relate to others on the basis of projections of idealized images, we love ourselves in loving others, and we aggrandize ourselves even in activities that appear to reflect contrary motivations.

As a mother gives to her child that of which she deprives herself . . . is it not clear that in [such] instances man loves something of himself . . . more than something else of himself? . . . The inclination for something (a wish, an impulse, a desire) is present in all [such] instances; to give in to it, with all the consequences, is in any event not ‘unegoistic’.

As Freud was entering his study of the destructive instincts - the death instinct and its manifestation outward as aggression, as well as its secondary turn back inward upon the self - one might wonder whether Nietzsche, who had explored the vicissitudes of aggression and was famous for his concept of the will to power, was among the ‘all kinds of things’ Freud was reading. At the least, Freud clearly had the ‘recurrence of the same’ on his mind during this period, along with pessimism and the question of the place of pleasure. While Freud’s account of pleasure as the release or discharge of tension, as the decrease of tension, has strong affinities with Schopenhauer, the ‘pleasure of Eros’ is of a comparatively different kind.

One point to be made is that Nietzsche’s concept of the will to power was an attempt to go beyond the pleasure principle and beyond good and evil. For Nietzsche the will to power is the primary drive, in both its primitive and its more sublimated manifestations. All the same, pain is an essential ingredient, since it is not a state attained at the end of suffering but the process of overcoming (of obstacles and suffering) that is the central factor in the experience of an increase of power and joy.

Freud writes of the binding of instinctual impulses as a kind of mastery, a preparatory act. Although this binding, and the replacement of the primary process by the secondary process, operate before and without necessary regard for ‘the development of unpleasure’, the transformation occurs on behalf of the pleasure principle: ‘the binding is a preparatory act which introduces and assures the dominance of the pleasure principle’ . . . The binding . . . [is] designed to prepare the excitation for its final elimination in the pleasure of discharge.

For the individual who suffers this repeated and frustrated pursuit of pleasure, it is not only the object of the past that cannot be recovered, nor the relation that cannot be restored or reconstructed; it is time itself that resists the human will and proves unyielding. Between pleasure and satisfaction, a prohibition or negation of pleasure is enacted which necessitates the endless repetition and proliferation of thwarted pleasures. The repetition is a vain effort to stay, or to reverse, time; such repetition reveals a rancor against the present that feeds upon itself.

However, at this point we might be tempted, as many have been, to point to the new natural science as a counter-instance: does natural science not typify a progressive realization of the truth of the world, or at least a closer and closer approximation to that truth? In fact, it is interesting to think about just how closely Kuhn and Nietzsche might be linked in their views about the relationship between science and the truth of things, or to what extent modern science might not provide the most promising refutation of Nietzsche's assertion that there is no privileged access to a final truth of things (a hotly disputed topic in the last decade or more). Let us simply say here that for Nietzsche science is just another "foreground" way of interpreting nature. It has no privileged access to the Truth, although he does concede that, compared with other beliefs, it has the advantage of being based on sense experience and is therefore more useful for modern times.

There is one important point to stress in this review of the critical power of Nietzsche's project. It is essential to note that Nietzsche is not taking us to task for having beliefs. We have to have beliefs. Human life must be the affirmation of values; otherwise, it is not life. Nonetheless, Nietzsche is centrally concerned to mock us for believing that our belief systems are True, are fixed, are somehow eternally right by a grounded standard of knowledge. Human life, in its highest forms, must be lived in the full acceptance that the values we create for ourselves are fictions. We, or the best of us, have to have the courage to face the fact that there is no "Truth" upon which to ground anything in which we believe; we must live in full view of that harsh insight and yet affirm ourselves with joy. The Truth is not accessible to our attempts at discovery; what thinking human beings characteristically do, in their pursuit of the Truth, is create their own truths.

Now, this last point, like the others, has profound implications for how we think of ourselves, for our conception of the human, because human individuals, like human cultures, also have a history. Each of us has a personal history, and thus we ourselves cannot be defined; we, too, are in a constant process of becoming, of transcending the person we have been into something new. We may like to think of ourselves as defined by some essential rational quality, but in fact we are not. In stressing this, of course, Nietzsche links himself with certain strains of Romanticism, especially (from the point of view of our curriculum) with William Blake.

This tradition of Romanticism holds up a view of life that is radically individualistic, self-created, self-generated. "I must create my own system or become enslaved by another man's," Blake wrote. It is also thoroughly aristocratic, with little room for traditional altruism, charity, or egalitarianism. Our lives, to realize their highest potential, should be lived apart from others, except perhaps those few we recognize as kindred souls, and our life's efforts must be a spiritually demanding but joyful affirmation of the process by which we maintain the vital development of our imaginative conceptions of ourselves.

It might be appropriate to contrast this view of the self as a constantly developing entity without essential permanence with Marx's view. Marx, too, insists on the process of transformation of ideas, but for him that transformation is controlled by the material forces of production, and these in turn are driven by the logic of history. It is not something that the individual takes charge of by an act of individual will, because individual consciousness, like everything else, emerges from and is dependent upon particular historical and material circumstances, the stage in the development of production, the social environment in which the individual finds himself or herself.

Nietzsche, like Marx, and unlike later Existentialists, de Beauvoir, for example, recognizes that the individual inherits particular things from the historical moment of the culture (e.g., the prevailing ideas and, particularly, the language and ruling metaphors). Thus, for Nietzsche the individual is not totally free of all context. However, the appropriate response to this is not, as in Marx, the development of class consciousness, a solidarity with other citizens and an imperative to help history along by committing oneself to the class war alongside other proletarians, but, in the best and brightest spirits, a call for a heightened sense of individuality, of one's radical separation from the herd, of one's final responsibility to one's own most fecund creativity.

It is vital to see that Nietzsche and the earlier Romantics are not simply saying we should do what we like. They all have a sense that self-creation of the sort they recommend requires immense spiritual and emotional discipline - the discipline of the artist shaping his most important original creation following the stringent demands of his creative imagination. These demands may not be rational, but they are not permissively relativistic in that 1960s sense ("If it feels good, do it"). Permissiveness may often have been attributed to this Romantic tradition, a sort of 1960s "shop till you drop" ethic, but that is not what any of them had in mind. For Nietzsche that would simply be a herd response to a popularized and bastardized version of a much higher call to a solitary life lived with the most intense but personal joy, suffering, insight, courage, and imaginative discipline.

This aspect of Nietzsche's thought represents the fullest nineteenth-century European affirmation of a Romantic vision of the self as radically individualistic (at the opposite end of the spectrum from Marx's view of the self as socially and economically determined). It has had a profound and lasting effect in the twentieth century, as we become ever more uncertain about coherent social identities and thus increasingly inclined to look for some personal way to take full charge of our own identities without answering to anyone but ourselves.

Much of the energy and much of the humour in Nietzsche's prose comes from the urgency with which he sees such creative self-affirmation as essential if the human species is not going to continue to degenerate. For Nietzsche, human beings are, primarily, biological creatures with certain instinctual drives. The best forms of humanity are those who most excellently express the most important of these biological drives, the "will to power," by which he means the individual's will to assert himself or herself and to create what he or she needs in order to live most fully. Such a "will to power" is beyond morality, because it does not answer to anyone's system of what makes up good and bad conduct. The best and strongest human beings are those who create values for themselves, live by them, and refuse to acknowledge their common links with anyone else, other than the other strong people who do the same and are thus their peers.

His surveys of world history have convinced Nietzsche that the development of systems of morality favouring the weak, the suffering, the sick, the criminal, and the incompetent (all of whom he lumps together in that famous phrase "the herd") has turned this basic human drive against human beings themselves. He salutes the genius of those who could accomplish this feat (especially the Jews and Christians), a feat he sees as the revenge of the slaves against their natural masters. From these centuries-long acts of revenge, human beings are now filled with feelings of guilt, inadequacy, jealousy, and mediocrity, a condition alleviated, if at all, by dreams of being helpful to others and of an ever-expanding democracy, an agenda powerfully served by modern science (which serves to bring everything and everyone down to the same level). Fortunately, however, this ordeal has trained our minds splendidly, so that the best and brightest (the new philosophers, the free spirits) can move beyond the traditional boundaries of morality, that is, "beyond good and evil" (his favourite metaphor for this condition is the tensely arched bow ready to shoot off an arrow).

It is important to stress that Nietzsche does not believe that becoming such a "philosopher of the future" is easy or for everyone. It is, by contrast, an extraordinarily demanding call, and those few capable of responding to it might have to live solitary lives without recognition of any sort. He is demanding an intense spiritual and intellectual discipline that will enable the new spirit to move into territory no philosopher has ever roamed before, a displacing medium where there are no comfortable moral resting places and where the individual will probably (almost certainly) have to pursue a profoundly lonely and perhaps dangerous existence (hence the importance of another favourite metaphor of his, the mask). Nevertheless, this is the only way we can counter the increasing degeneration of European man into a practical, democratic, technocratic, altruistic herd animal.

By way of a further introduction to Nietzsche's Beyond Good and Evil, I will offer only an extended analogy, and then extend some remarks in directions that have not yet been explored.

Before placing the analogy on the table, however, I wish to issue a caveat. Analogies can really help to clarify, but they can also mislead by being unduly persuasive. I hope that the analogy I offer will provide such clarity, but not at the price of oversimplifying. So, as you listen to this analogy, you need to address the questions: To what extent does this analogy not hold? To what extent does it reduce the complexity of what Nietzsche is saying to a simpler form?

The analogy I want to put on the table is the comparison of human culture to a huge recreational complex in which several different games are going on. Outside, people are playing soccer on one field, rugby on another, American football on another, Australian football on another, and so on. In the clubhouse different groups of people are playing chess, dominoes, poker, and so on. There are coaches, spectators, trainers, and managers involved in each game. Surrounding the recreational complex is wilderness.

These games we might use to characterize different cultural groups: French Catholics, German Protestants, scientists, Enlightenment rationalists, European socialists, liberal humanitarians, American democrats, free thinkers, or what have you. The variety represents the rich diversity of intellectual, ethnic, political, and other activities.

The situation is not static, of course. Some games have far fewer players and fans than they once did, and their popularity is shrinking; some are gaining popularity rapidly and are increasingly taking over parts of the available territory. Thus, the traditional sport of Aboriginal lacrosse is but a small remnant of what it was before contact, while the democratic capitalist game of baseball is growing exponentially, as is the materialistic science game of archery; these two might even combine their efforts to create a new game or merge their leagues.

When Nietzsche looks at Europe historically, what he sees is that different games have been going on like this for centuries. He further sees that many participants in any one game have been aggressively convinced that their game is the "true" game, that it corresponds to the essence of games or is a close match to the wider game they imagine going on in the natural world, in the wilderness beyond the playing fields. So they have spent much time producing their rule books and coaches' manuals and making claims about how the principles of their game copy or reveal or approximate the laws of nature. This has promoted and still promotes a good deal of bad feeling and fierce argument. In addition, within the group pursuing any one game there have always been all sorts of sub-games debating the nature of the activity, refining the rules, arguing over the correct version of the rule book or about how to educate the referees and coaches, and so on.

Nietzsche's first goal is to attack this dogmatic claim about the truth of the rules of any particular game. He does this, in part, by appealing to the tradition of historical scholarship that shows that these games are not eternally true, but have a history. Rugby began when a soccer player broke the rules and picked up the ball and ran with it. American football developed out of rugby and has changed and is still changing. Basketball had a precise origin that can be historically found.

Rule books are written in languages that have a history by people with a deep psychological point to prove: the games are an unconscious expression of the particular desires of inventive games people at a very particular historical moment; these rule writers are called Plato, Augustine, Socrates, Kant, Schopenhauer, Descartes, Galileo, and so on. For various reasons they believe, or claim to believe, that the rules they come up with reveal something about the world beyond the playing field and are therefore "true" in a way that other rule books are not; they have, as it were, privileged access to reality and thus record, to use a favourite metaphor of Nietzsche's, the text of the wilderness.

In attacking such claims, Nietzsche points out that the wilderness bears no relationship at all to any human invention like a rule book; he points out that nature is "wasteful beyond measure, without purposes and consideration, without mercy and justice, fertile and desolate and uncertain at the same time; imagine indifference itself as a power - how could you live according to this indifference? Living - is that not precisely wanting to be other than this nature?" Because there is no connection with what nature truly is, such rule books are mere "foreground" pictures, fictions dreamed up, reinforced, altered, and discarded for contingent historical reasons. Moreover, the rule manuals often bear a suspicious resemblance to the rules of grammar of a culture; thus, for example, the notion of an ego as a thinking subject, Nietzsche points out, is closely tied to the rules of European languages that insist on a subject and verb construction as an essential part of any statement.

So how do we know that what we have is the truth? Why do we want the truth, anyway? People seem to need to believe that their games are true, but why? Might they not be better off if they accepted that their games were false, were fictions, having nothing to do with the reality of nature beyond the recreational complex? If they understood the fact that everything they believe in has a history and that, as he says in the Genealogy of Morals, "only that which has no history can be defined," they would understand that all this proud history of searching for the truth is something quite different from what the philosophers who have written rule books proclaim.

Furthermore, these historical changes and developments occur accidentally, for contingent reasons, and have nothing to do with the games, or any one game, shaping itself according to some ultimate game or some rule book of games given by the wilderness, which is indifferent to what is going on. There is no basis for the belief that, if we look at the history of the development of these games, we discover some progressive evolution of games toward some higher type. We may be able, like Darwin, to trace historical genealogies, to construct a narrative, but that narrative does not reveal any clear direction or any final goal or any progressive development. The genealogy of games suggests that history is a record of contingent change. The assertion that there is such a thing as progress is simply another game, another rule added by inventive minds (who need to believe in progress); it bears no relationship to nature beyond the sports complex.

While one is playing on a team, one follows the rules and thus has a sense of what constitutes right and wrong, or good and evil, conduct in the game. All those carrying out the same endeavour share this awareness. To pick up the ball in soccer is evil (unless you are the goalie), and to punt the ball while running in American football is permissible but stupid; in Australian football both actions are essential and right. In other words, different cultural communities have different standards of right and wrong conduct. These standards are determined by artificial inventions called rule books, one for each game. The rules in these books have developed historically; thus, they have no permanent status and no claim to privileged access.

Now, at this point you might be thinking of Aristotle, whose Ethics acknowledges that different political systems have different rules of conduct. Still, Aristotle believes that an examination of different political communities will enable one to derive certain principles common to them all, bottom-up generalizations that will then provide the basis for reliable rational judgment about which game is being played better, about what counts as good play in any particular game, and about whether or not a particular game is being conducted well.

In other words, Aristotle maintains that there is a way of discovering and appealing to some authority outside any particular game to adjudicate moral and knowledge claims that arise in particular games or in conflicts between different games. Plato, of course, also believed in the existence of such a standard, but proposed a different route to discovering it.

Now Nietzsche emphatically denies this possibility. Anyone who tries to do what Aristotle recommends is simply inventing another game (we can call it Super-sport) and is not discovering anything true about the real nature of games, because reality (the wilderness surrounding us) is not organized as a game. In fact, he argues, we have created this recreational complex and all the activities that go on in it to protect ourselves from nature (which is indifferent to what we do with our lives), not to copy some recreational rule book that the wilderness reveals. Human culture exists as an affirmation of our opposition to, or contrast with, nature, not as an extension of rules that include both human culture and nature. That is why falsehoods about nature might be a lot more useful than truths, if they enable us to live more fully human lives.

If we think of the wilderness as a text about reality, as the truth about nature, then, Nietzsche claims, we have no access at all to that text. What we do have access to is conflicting interpretations, none of them based on privileged access to a "true" text. Thus, the soccer players may think that they and their game are superior to rugby and the rugby players because soccer more closely represents the surrounding wilderness, but such statements about better and worse are irrelevant. There is nothing rule-bound outside the games themselves. Therefore, all dogmatic claims about the truth of games in general or of any particular game are false.

Now, how did this situation come about? Well, there was a time when all Europeans played almost the same game and had done so for many years. Having little or no historical knowledge, and sharing the same head coach in the Vatican and the same rule book, they believed that the game was the only one possible and had been around for ever. So they naturally believed that their game was true. They shored up that belief with appeals to scripture or to eternal forms, or universal principles, or to rationality or science or whatever. There were many quarrels about the nature of ultimate truth, that is, about just how one should tinker with the rule book, about what provided access to God's rules, but there was agreement that such access must exist.

Take, for example, the offside rule in soccer. Without it the game could not continue in its traditional way. Therefore, soccer players see the offside rule as an essential part of their reality, and since soccer is the only game in town and we have no idea of its history (which might, for example, tell us about the invention of the offside rule), the offside rule is easy to interpret as a universal, a requirement for social activity, and we will find and endorse scriptural texts that reinforce that belief. Our scientists will devote their time to linking the offside rule with the mysterious rumblings that come from the forest. From this, one might be led to conclude that the offside rule is a Law of Nature, something that extends far beyond the realm of our particular game into all possible games and, beyond those, into the realm of the wilderness itself.

Of course, there were powerful social and political forces (the coaches and trainers and owners of the teams) who made sure that people had lots of reasons for believing in the unchanging verity of present arrangements. So it is not surprising that we find plenty of learned books, training manuals, and locker room exhortations urging everyone to remember the offside rule and to castigate as "bad" those who routinely forget that part of the game. We will also worship those who died in defence of the offside rule. Naturally any new game that did not recognize the offside rule would be a bad game, an immoral way to conduct oneself. So if some group tried to start a game with a different offside rule, that group would be attacked because they had violated a rule of nature and were thus immoral.

However, for contingent historical reasons, Nietzsche argues, that situation of only one game in town did not last. The recreational unity of the area broke apart, and developments in historical scholarship demonstrated all too clearly, with an overwhelming weight of evidence, that the various attempts to show that one specific game was the one true game above all others were false, dogmatic, trivial, deceiving, and so on.

For science has revealed that the notion of a necessary connection between the rules of any game and the wider purposes of the wilderness is simply an ungrounded assertion. There is no way in which we can make the connection between the historically derived fictions in the rule book and the mysterious and ultimately unknowable directions of irrational nature. To carry on with science, we have to believe in causes and effects, but there is no way we can prove that this is a true belief, and there is a danger for us if we simply ignore that fact. Therefore, we cannot prove a link between the game and anything outside it. History has shown us, just as Darwin's natural history has shown, that all apparently eternal things have a story, a line of development, a genealogy. Thus, notions like species have no reality; they are temporary fictions imposed for the sake of defending a particular arrangement.

So, God is dead. There is no eternal truth anymore, no rule book in the sky, no ultimate referee or international Olympic committee chair. Nietzsche did not kill God; history and the new science did. Nietzsche is only the most passionate and irritating messenger, announcing over the intercom system to anyone who will listen that anyone like Kant or Descartes or Newton who thinks that what he or she is doing is grounded in the truth of nature has simply been mistaken.

This insight is obvious to Nietzsche, and he is troubled that no one seems worried about it or even to have noticed it. So he is moved to call the matter to our attention as stridently as possible, because he thinks that this realization requires a fundamental shift in how we live our lives.

For Nietzsche Europe is in crisis. It has a growing power to make life comfortable and an enormous energy. However, people seem to want to channel that energy into arguing about what amount to competing fictions and into forcing everyone to follow particular fictions.

Why is this insight so worrying? Well, one point is that dogmatists get aggressive. Soccer players and rugby players who forget what Nietzsche is pointing out can start killing each other over questions that admit of no answer, namely, the question of which group has the true game, which ordering has privileged access to the truth. Nietzsche senses that dogmatism is going to lead to warfare, and he predicts that the twentieth century will see an unparalleled extension of warfare in the name of competing dogmatic truths. Part of his project is to wake up the people who are intelligent enough to respond to what he is talking about, so that they can recognize the stupidity of killing each other for an illusion that they mistake for some "truth."

Besides that, Nietzsche, like Mill (although in a very different way), is seriously concerned about the possibilities for human excellence in a culture where the herd mentality is taking over, where Europe is developing into competing herds - a situation that is either sweeping up the best and the brightest or stifling them entirely. Nietzsche, like Mill and the ancient pre-Socratic Greeks to whom he constantly refers, is an elitist. He wants the potential for individual human excellence to be liberated from the harnesses of conformity, group competition, and conventional morality. Otherwise, human beings are going to become destructive, lazy, conforming herd animals, using technology to divert themselves from the greatest joys in life, which come only from individual striving and creativity, activities that require one to release one's instincts without keeping them eternally subjugated to a controlling historical consciousness or a conventional morality of good and evil.

What makes this particularly a problem for Nietzsche is that he sees that a certain form of game is gaining popularity: democratic volleyball. In this game, the rule book insists that all players be treated equally, that there be no natural authority given to the best players or to those who understand the nature of quality play. Therefore the mass of inferior players is taking over, the quality of the play is deteriorating, and there are fewer and fewer good volleyball players. This process is being encouraged both by the traditional ethic of "help your neighbour," now often in a socialist uniform, and by modern science. As the mass of inferior players takes over the sport, the mindless violence of their desire to attack other players and take over their games increases, as does their hostility to those who are uniquely excellent (who may need a mask to prevent themselves from being recognized).

The hopes for any change in this development are not good. In fact, things might be getting worse. For when Nietzsche looks at all these games going on he notices certain groups of people, and the prospect is not totally reassuring.

First there remains the overwhelming majority of people: the players and the spectators, those caught up in their particular sport. These people are, for the most part, continuing as before without reflecting or caring about what they do. They may be vaguely troubled by rumours they hear that their game is not the best, they may be bored with the endless repetition in the schedule, and they have essentially reconciled themselves to the fact that theirs is not the only game going on, but they would rather not think about it. Or else, stupidly confident that what they are doing is what really matters about human life, is true, they preoccupy themselves with tinkering with the rules, using the new technology to get better balls, more comfortable seats, louder whistles, more brightly painted sidelines, more trendy uniforms, tastier Gatorade - all in the name of progress.

Increasing numbers of people are moving into the stands or participating through the newspaper or the television set. Most people are thus, in increasing numbers, losing touch with themselves and their potential as instinctual human beings. They are the herd, the last men, preoccupied with the trivial, unreflectingly conformist because they think, to the extent they think at all, that what they do will bring them something called "happiness." Yet they are not happy: they are in a permanent state of narcotized anxiety, seeking new ways to entertain themselves with the steady stream of distractions that the forces of the market produce: technological toys, popular entertainment, college education, Wagner's operas, academic jargon.

This group, of course, includes all the experts in the game, the cheerleaders whose job it is to keep us focussed on the seriousness of the activity, the sports commentators and pundits, whose life is bound up with interpreting, reporting, and classifying players and contests. These sportscasters are, in effect, the academics and government experts, the John Maddens and Larry Kings and Mike Wallaces of society, those demigods of the herd, whose authority derives from the false notion that what they are dealing with is something other than a social-fiction.

There is a second group of people, who have accepted the ultimate meaninglessness of the games in which they were engaged. They have moved to the sidelines, not as spectators or fans, but as critics, as cynics or nihilists, dismissing out of hand all the pretensions of the players and fans, but not affirming anything themselves. These are the souls who, having nothing to will (because they have seen through the fiction of the game and therefore have no motive to play any more), prefer to will nothing in a state of paralysed skepticism. Nietzsche has a certain admiration for these people, but maintains that a life like this, the life of the nihilist on the sidelines, is not a human life.

For, Nietzsche insists, to live as a human being is to play a game. Only in playing a game can one affirm one's identity, can one create values, can one truly exist. Games are the expression of our instinctual human energies, our living drives, what Nietzsche calls our "will to power." So the nihilistic stance, though understandable and, in a sense, courageous, is sterile. For we are born to play, and if we do not, then we are not fulfilling a worthy human function. Yet we also have to recognize that all games are equally fictions, invented human constructions without any connection to the reality of things.

So we arrive at the position that we need to affirm a belief (invent a rule book) which we know to have been invented, to be divorced from the truth of things. To play the best game is to live by rules that we invent for ourselves as an assertion of our instinctual drives and to accept that the rules are fictions: they matter, we accept them as binding, we judge ourselves and others by them, and yet we know they are artificial. Just as in real life a normal soccer player derives a sense of meaning during the game, affirms his or her value in the game, without ever once believing that the rules of soccer organize the universe or that those rules have any universal validity, so we must commit ourselves to epistemological and moral rules that enable us to live our lives as players, while simultaneously recognizing that these rules have no universal validity.

The nihilists have discovered half this insight, but, because they cannot live the full awareness, they are very limited human beings.

The third group of people, the small minority that includes Nietzsche himself, are those who accept the games metaphor, see the fictive nature of all systems of knowledge and morality, and accept the challenge that to be most fully human is to create a new game, to live a life governed by rules imposed by the dictates of one's own creative nature. To base one's life on the creative tension of the artist engaged in creating a game that meets most eloquently and uncompromisingly the demands of one's own irrational nature, one's own wishes, is to be most fully free, most fully human.

This call to live the self-created life, affirming oneself in a game of one's own devising, necessarily condemns the highest spirits to loneliness, doubt, insecurity, and emotional suffering, because most people will mock the new game or be actively hostile to it or refuse to notice it, and so on; alternatively, they will accept the challenge but misinterpret what it means and settle for some marketed easy game, like floating down the Mississippi smoking a pipe. Nevertheless, a self-generated game also brings with it the most intense joy, the most playful and creative affirmation of what is most important in our human nature.

It is important to note here that one's freedom to create one's own game is limited. In that sense, Nietzsche is no existentialist maintaining that we have a duty and an unlimited freedom to be whatever we want to be. For the resources at our disposal (the parts of the field still available and the recreational material lying around in the clubhouse) are determined by the present state of our culture. Furthermore, the rules I devise and the language I frame them in will ordinarily owe a good deal to the present state of the rules of other games and the state of the language in which those are expressed. Although I may change the rules of my game, my starting point, the set of rules available for me to change, is given to me by my moment in history. So in moving forward, in creating something that will transcend the past, I am using the materials of the past. Existing games are the materials out of which I fashion my new game.

Thus, the new philosopher will transcend the limitations of the existing games and will extend the catalogue of games with the invention of new ones, but that new creative spirit faces certain historical limitations. If this is relativistic, it is not totally so.

The value of this endeavour is not to be measured by what other people think of the newly created game; nor does its value lie in fame, material rewards, or service to the group. Its value comes from the way it enables the individual to manifest certain human qualities, especially the will to power. Whether or not the game attracts other people and becomes a permanent fixture on the sporting calendar, something later citizens can derive enjoyment from or even remember, is irrelevant. For only the accidents of history determine whether the game I invent proves attractive to other people, that is, becomes a source of value for them.

Nietzsche claims that the time is right for such a radically individualistic endeavour to create new games, new metaphors for my life. For, wrongheaded as many traditional games may have been, like Plato's metaphysical soccer or Kant's version of eight balls, or Marx's materialist chess tournament, or Christianity's stoical snakes and ladders, they have splendidly trained us for the much more difficult work of creating values in a spirit of radical uncertainty. The exertions have trained our imaginations and intelligence in useful ways. So, although those dogmatists were unsound, an immersion in their systems has done much to refine those capacities we most need to rise above the nihilists and the herd.

I have now put this analogy on the table for our consideration, to clarify some central points about Nietzsche. The metaphor is not so arbitrary as it may appear, because this very notion of systems of meaning as invented games is a central metaphor of twentieth-century thought, and those who insist upon it as often as not point to Nietzsche as their authority.

So, for example, when certain postmodernists insist that the major reason for engaging in artistic creativity or literary criticism or any form of cultural life is to awaken the spirit of creative play, which is far more central than any traditional sense of meaning or rationality or even coherence, we can see the spirit of Nietzsche at work.

Earlier in this century, as we will see in the discussions of early modern art, a central concern was the possibility of recovering some sense of meaning or of recreating or discovering a sense of "truth" of the sort we had in earlier centuries. Marxists were determined to assist history in producing the true meaning toward which we were inexorably heading. To the extent that we can characterize postmodernism simply at all, we might say that it marks a turning away from such responses to the modern condition and an embrace, for better or worse, of Nietzsche's joyful self-affirmation in the face of the irrationality of the world and the fictive quality of all that we create to deal with life.

One group we can quickly identify is those who have embraced Nietzsche's critique, who appeal to his writing to endorse their view that the search to ground our knowledge and moral claims in Truth is futile, and that we must therefore recognize the imperative Nietzsche laid before us to create our own lives for ourselves, to come up with new self-descriptions affirming the irrational basis of our individual humanity. This position has been loosely termed Antifoundationalism. Two of its most prominent and popular spokespersons in recent years have been Richard Rorty and Camille Paglia. Within Humanities departments the Deconstructionists (with Derrida as their guru) head the Nietzschean charge.

Antifoundationalists link Nietzsche closely with Kuhn and with Dewey (whose essay on Darwin we read), and sometimes with Wittgenstein, and take central aim at anyone who would claim that some form of enquiry, like science, rational ethics, Marxism, or traditional religion, has any form of privileged access to reality or the truth. The political stance of the Antifoundationalists tends to be radically romantic or pragmatic. Since we cannot ground our faith in any public morality or political creed, politics becomes something far less important than personal development, or else we have to conduct our political life simply on a pragmatic basis, following the rules we can agree on, without according those rules any universal status or grounding in eternal principles. If mechanistic science is something we find useful, for accidental reasons of history, then we will believe it for now. Thus, Galileo's system was adopted, not because it was true or closer to the truth than what it replaced, but simply because the vocabulary he introduced into our descriptions was something we found agreeable and practically helpful. When it ceases to fulfill our pragmatic requirements, we will gradually change to another vocabulary, another metaphor, another version of a game. History shows that such a change will occur, but how and when it will take place, or what the new vocabulary might be, will be determined by the accidents of history.

Similarly, human rights are important, not because there is any rational, non-circular proof that we ought to act according to these principles, but simply because we have agreed, for accidental historical reasons, that these principles are useful. Such pragmatic agreements are all we have for public life, because, as Nietzsche insists, we cannot justify any moral claims by appeals to the truth. So we can agree about a schedule for the various games and about distributing the budget among them, and we can, as a matter of convenience, set certain rules for our discussions, but only as a practical requirement of our historical situation, not because any divine sanction or rational system dictates them.

A second response is to reject the Antifoundationalist and Nietzschean claim that no language has privileged access to the reality of things, to assert, that is, that Nietzsche is wrong in his critique of the Enlightenment. Plato's project is not dead, as Nietzsche claimed, but alive and well, especially in the scientific enterprise. We are discovering ever more about the nature of reality. There may still be a long way to go, and nature might be turning out to be much more complex than the early theories suggested, but we are making progress. By improving the rule book we will modify our games so that they more closely approximate the truth of the wilderness.

To many scientists, for example, the Antifoundationalist position is either irrelevant or just plain wrong, an indication that social scientists and humanities types do not understand the nature of science or are suffering a bad attack of sour grapes because of the prestige the scientific disciplines enjoy in the academy. The failure of the social scientists (after generations of trying) to come up with anything approaching a reliable law (like, say, Newton's laws of motion) has shown the pseudoscientific basis of those disciplines, and unmasks their turn to Nietzschean Antifoundationalism as a feeble attempt to justify their presence in the modern research university.

Similarly, Marxists would reject Antifoundationalism as a remnant of aristocratic bourgeois capitalism, an ideology designed to take intellectuals' minds off the realities of history, the truth of things. There is a truth grounded in a materialist view of history, and Antifoundationalism serves only to divert philosophers away from social injustice. No wonder the most ardent Nietzscheans in the university have no trouble getting support from the big corporate interests and their bureaucratic subordinates: the Ford Foundation and the National Endowment for the Humanities. Within the universities and in many humanities and legal journals, some of the liveliest debates go on between the Antifoundationalists, allied with the Deconstructionists under the banner of Nietzsche, and the historical materialists and many feminists under the banner of Marx.

Meanwhile, there has been a revival of interest in Aristotle. The neo-Aristotelians agree with Nietzsche's critique of the Enlightenment rational project, that we are never going to be able to derive a sense of human purpose from scientific reason, but assert that sources of value and knowledge are not simply contingent; they arise from communities, and what we need to sort out our moral confusion is a reassertion of Aristotle's emphasis on human beings, not as radically individual selves with an identity prior to their political and social environment, but as political animals whose purpose and value are deeply and essentially rooted in their community. A leading representative of this position is Alasdair MacIntyre.

Opposing such a communitarian emphasis, a good deal of the modern Liberal tradition points out that such a revival of traditions simply will not work. The breakdown of traditional communities and the widespread perception of the endemic injustice of inherited ways cannot be reversed (appeals to Hobbes here are common). So we need to place our faith in the rational liberal Enlightenment tradition and look for universal rational principles: human rights, rules of international morality, justice based on an analysis of the social contract, and so on. An important recent example of such a view is Rawls's famous book A Theory of Justice.

Finally, there are those who again agree with Nietzsche's analysis of the Enlightenment and thus reject the optimistic hopes of rational progress, but who deny Nietzsche's proffered solution. To see life as irrational chaos that we must embrace in joyous affirmation as the value-generating activity of our human lives, while recognizing its ultimate meaninglessness to the individual, seems to many people like a prescription for insanity. What we, as human beings, must have in order to live a fulfilled human life is an image of eternal meaning. This we can derive only from religion, which provides for us, as it always has, a transcendent sense of order, something that answers to our essential human nature far more deeply than either the Enlightenment faith in scientific rationality or Nietzsche's call to a life of constant metaphorical self-definition.

To read the modern debates over literary interpretation, legal theory, human rights issues, education curricula, feminist issues, ethnic rights, communitarian politics, or a host of other similar issues is to come repeatedly across the clash of these different positions (and others). To use the analogy I started with, activities on the playing fields are going on more energetically than ever. Right in the middle of most of these debates, and generously scattered throughout the footnotes and bibliographies, Nietzsche's writings are alive and well. To that extent, his ideas are still something to be reckoned with. He may have started by shouting over the loudspeaker system in a way to which no one bothered attending; now, on many playing fields, the participants and fans are considering and reacting to his analysis of their activities. So Nietzsche today is, probably more than ever before in this century, right in the centre of some vital debates over cultural questions.

You may recall how, in Book X of the Republic, Plato talks about the "ancient war between poetry and philosophy." What this seems to mean, from the argument, is an ongoing antagonism between different uses of language: between language that seeks, above all, denotative clarity (the language of exact definitions and precise logical relationships) and language whose major quality is its ambiguous emotional richness; between, that is, the language of geometry and the language of poetry (or, simply put, between Euclid and Homer).

Another way of characterizing this dichotomy is to describe it as the tension between a language appropriate to discovering the truth and one appropriate to creating it; between, that is, a language that presents itself as an exact description of a given order (or of an order at least in principle available to exact description) and a language that offers an ambiguous poetic vision of, or analogy to, a natural or cosmic order.

Plato, in much of what we studied, seems clearly committed to a language of the former sort. Central to the course of studies that will produce his guardian rulers is mathematics, which is based upon the most exact denotative language we know; hence the famous inscription over the door of the Academy: "Let no one enter here who has not studied geometry." Underlying Plato's remarkable suspicion of a great deal of poetry, and particularly of Homer, is this attitude to language: poetic language is suspect because, being based on metaphors (figurative comparisons or word pictures), it stands at a third remove from the truth. In addition, it speaks too strongly to the emotions and thus may unbalance the often tense equilibrium needed to keep the soul in a healthy state.

One needs to remember, however, that Plato's attitude to language is very ambiguous, because, in spite of his obvious endorsement of the language of philosophy and mathematics, in his own style he is often a poet, a creator of metaphor. In other words, there is a conflict between his strictures on metaphor and his adoption of so many metaphors (the dramatic dialogue form itself is only the most obvious). Many famous and influential passages from the Republic, for example, are not arguments but poetic images or fictional narratives: the Allegory of the Cave, the image of the Sun, the Myth of Er.
