Full-text resources of CEJSH and other databases are now available in the new Library of Science.
Visit https://bibliotekanauki.pl

Results found: 13


Search results

Search:
in the keywords:  REDUCTIONISM
EN
Time sensitivity seems to affect our intuitive evaluation of the reasonable risk of fallibility in testimonies. All things being equal, we tend to be less demanding in accepting time-sensitive testimonies than time-insensitive ones. This paper treats this intuitive response to testimonies as a strategy of acceptance. It argues that the intuitive strategy, which takes time sensitivity into account, is epistemically superior to two adjacent strategies that do not: the undemanding strategy adopted by non-reductionists and the cautious strategy adopted by reductionists. The paper demonstrates that in adopting the intuitive strategy of acceptance, one is likely to form more true beliefs and fewer false beliefs. Moreover, in following the intuitive strategy, the listener will fulfil his epistemic duties more efficiently.
Psychologia Społeczna
|
2008
|
vol. 3
|
issue 4(9)
336-339
EN
T. Witkowski’s provocation has revealed three groups of mechanisms (irrational, rational, and socio-economic) behind the building of pseudoscientific “theories” at the junction of scientific psychology and practice. This paper focuses on the rational mechanisms. It is argued that pseudoscientific concepts are often accepted on the basis of an implicit assumption that they are guaranteed by experts and by the research in other domains from which they borrow their terminology. This assumption is a consequence of the division of labor in science. A complementary mechanism is the tendency to build false conceptual dichotomies, illustrated in the commentary by the concept of explicit/implicit memory (Bedford, 1997), as well as by the dichotomized assignment of mental functions to brain hemispheres, inspired by the work of Sperry and his colleagues on the “split brain”. Finally, I try to show how these mechanisms together may lead to a false feeling of understanding and the belief that the functions of the mind can be controlled.
Ekonomista
|
2009
|
issue 6
761-774
EN
The research program of New Keynesianism, born in the early 1970s, was dominated by the search for microeconomic explanations of persistent business fluctuations. For this purpose New Keynesians applied the reductionism characteristic of the model of perfect competition. At the same time, New Keynesian models reveal various systemic imperfections that contradict the idea of the representative agent. The simple aggregation of individual results (incomparable by assumption) appears to be an obvious paradox. New Keynesian theoretical constructions are divided into three groups: 1. imperfect-competition models with heterogeneity of agents, goods and/or transactions; 2. models of heterogeneity between homogeneous groups of agents; 3. models of a continuum of homogeneous agents. Against this background it is shown that reductionism misrepresents the core of Keynesian thought and deprives it of its originality and distinctiveness as compared with the classical approach.
EN
In this essay I present basic information on the complexity of life, starting with a general characterization of complexity. Complexity is explained in terms of the quantity of parts, organized not by their mere summation but by their ability to self-organize and to create novelty on the way to wholeness in the course of evolution. A short review is presented of the evolution of the cell from elementary particles and atoms, through simple chemical molecules and macromolecules, to the self-organized networks and morphological structures of the cell. All living organisms reveal a distinct hierarchical organization, from the basic level (atoms, molecules) to cells and multicellular individuals. Complexity increases with successive hierarchical levels. However, the phenomena of life at a higher level of the hierarchy cannot be explained by, or reduced to, the characteristics and properties of chemical molecules, but only by the interactions between them and the integration of various molecular processes at lower levels of the hierarchy. A more detailed view of the hierarchical organization of the human being, the relations between the various levels of the hierarchy, and the emergence of the highest human abilities, belonging to the broad level of culture, is also discussed.
EN
What is the ontological status of a musical work? This paper enters the debate on this question between Julian Dodd and Michael Morris. Dodd is a proponent of the type-token view, a version of Platonism. Morris has formulated an argument that purports to show that a musical work cannot be a token of a type. If successful, the argument presents a serious challenge for the type-token theorist, with implications for Platonism as a whole. Morris’s argument is, however, problematic in a few respects. The aim of this paper is to identify these problems and thus weaken Morris’s argument, restoring the original appeal of the type-token view.
EN
The well-known critical analyses of the classical Nagelian model of reduction point to some problems related to the explanatory power of bridge laws. The functional model of reduction, proposed by Kim as an alternative, denies the necessity of bridge laws and, using the functional model of explanation, tries to avoid them. This paper analyses the characteristics of both Nagelian and functional reduction and shows that the implicit thesis of functional reduction, that bridge laws are avoidable, is in fact untenable. By recognizing the inevitability of bridge laws, a new model of reduction can be formulated. The new model provides an appropriate framework for treating emergent phenomena, traditionally incompatible with reductive physicalism, together with the classical examples of reduction. This paper introduces the notion of emergent reduction, a new interpretation of emergent phenomena, which lifts emergence out of the standard examples of contemporary non-reductive physicalism. The cornerstone of the new interpretation of emergent phenomena is the recognition of the similarity between emergent laws and fundamental laws. The investigation of this similarity draws attention to the importance of scientific frame theories.
EN
The article consists of four parts, in which the author analyses Paul Horwich’s semantic deflationism as an example of a naturalistic theory of language. In the first, introductory part, she introduces the basic theses of Horwich’s theory of truth and theory of meaning. In the second, she proposes terminological solutions concerning naturalism and reductionism. In the third, substantive part, she presents some objections found in the literature to the postulated reductionist consequences of Horwich’s theory. She claims that his theory is in fact naturalistic, but not reductionist in the way described by Horwich’s opponents. In the last part, she discusses the problem of normativity in the context of the presented theory, pointing out that some of Horwich’s proposals are insufficient and need to be supplemented.
Superweniencja – pytanie o trywialność [Supervenience – a Question of Triviality]
Avant
|
2011
|
vol. 2
|
issue 2
215-224
EN
When it comes to the mind-body problem, various kinds of physicalism have been the most popular approaches among philosophers. The advent of anomalous monism, with its denial of laws concerning mental events, together with multiple realizability, cast doubt on reductionism and prompted a slow movement away from it. It did not, however, weaken the popularity of physicalism. The problem that had to be faced was thus to create a form of physicalism that would reject the reduction of the mental to the physical. “No difference of one sort without a difference of another sort” is a slogan that expresses the idea of supervenience, the idea that, according to many philosophers, was supposed to be the right expression of physicalism of this particular type. The text briefly presents the intuitions hidden behind the notion of supervenience and its main varieties: weak, strong and global. Moreover, the text touches upon the faults of supervenience, located in its symmetry and, most of all, in its triviality. Faults of this type would force philosophers to admit that the relation is metaphysically irrelevant.
EN
From the very outset sociology has waged a fight for its survival against “reductionism”: it resists the interpretation of social phenomena through other levels of reality (demographic, biological, geographic), because allegedly that would deny the specificity of social life and the singularity of humans as a unique biological species. However, the current state of knowledge, especially the dynamic developments in biology (socio-biology, neo-Darwinism, evolutionary biology, evolutionary psychology), has forced sociology to re-think its relationship to reductionism both in general, accepting it as a possible principle of interpretation, and specifically, by stating precisely when it is possible and useful to apply the reductionist principle. Although criticism of radical reductionism (Frankl, Bertalanffy) deserves respect, today it is necessary to search for new ways of cooperating with the natural sciences and probably even to re-design study curricula in the field. By accepting the reductionist principle sociology does not lose its specificity, as long as it takes into account the distinctiveness of the social and natural sciences (some social phenomena really cannot be described or explained in reductionist terms).
Filozofia (Philosophy)
|
2016
|
vol. 71
|
issue 8
669 – 679
EN
The article deals with cognitive strategies in social cognition, considering its two radical patterns, i.e. naturalism and interpretivism. The author’s view is that it is necessary to differentiate between so-called “interpretive philosophy” (G. Abel) and interpretivism as a methodological program in the social sciences. Special attention is paid to those ways of naturalizing social knowledge that apply a modernized evolutionary Darwinian perspective. In particular, the so-called “epidemiological approach” of D. Sperber, who promotes an ontological reductionism that need not be accompanied by a theoretical one, is analysed.
Filozofia (Philosophy)
|
2023
|
vol. 78
|
issue 4
285 – 295
EN
Practical reductionism is a program based on the claim that the sole relevant information in the sphere of practical deliberation (and of its moral evaluation) is how good the envisaged action is, while the other traditional concepts offering practical and moral orientation, especially the virtues, are at best superfluous (if they recommend the same as the inquiry into goodness) and in all other cases unintelligible and harmful (insofar as they pretend to be something good but recommend suboptimal action). Practical reductionism can be utilitarian, if the sole or dominant criterion of goodness is utility, and it can be cognitively optimistic, if it assumes the possibility of achieving perfect knowledge of the good itself and of the situation in which it should be applied. Such utilitarian and cognitively optimistic practical reductionism is a main topic of Plato’s Laches, and it (or some of its relatives) is present in several other dialogues, notably the Charmides and the Protagoras. My aim in this paper is to elaborate the concept of practical reductionism (in close regard to the Laches), to show its presence in some other texts, and finally to consider the philosophical contribution of such a bizarre thought.
|
2008
|
vol. 17
|
issue 4(68)
171-190
EN
With respect to mental entities, W.V.O. Quine seems to have been a reductionist, eliminativist, anti-mentalist and physicalist. He was a reductionist because he held that mental entities are physical phenomena in disguise. He was an eliminativist because he claimed that the scientific image of the world makes no room for entities whose functioning is not connected with the use of energy. He was an anti-mentalist because he argued that mental phenomena are to be studied through behavior. He was a physicalist because in his opinion the sciences should strive to present a uniform description of the world in all its aspects by referring to physical facts. If we take a closer look at his writings, however, all these assumptions are borne out only in part. The reductionist postulate is moderated by the opinion that the theorems of different sciences are not fully intertranslatable. His eliminativism must cope with the argument that unmitigated physicalism obliterates the difference between human beings and zombies. His anti-mentalism is seriously undermined by the role he assigns to the process of language acquisition motivated by empathy. His physicalism is doubtful insofar as he never sufficiently distinguished it from functionalism.
Filozofia (Philosophy)
|
2015
|
vol. 70
|
issue 1
38 – 46
EN
“Algorithms of life” is the term used by François Jacob in his perhaps best-known book, Logic of Life. Algorithms refer to the “program” that allows us to pursue our goal in a very precise manner. We have a finite number of operations that can lead us to understand living beings in the way Jacob does. This does not mean that we can predict everything that will happen in our lives, but we are able to estimate, with a high probability of success, the future trajectory of a living being (which we call the program) as well as its origins (the history and evolution of the organism). To understand an organism means to be able to follow these directions, not merely to analyse its current state. The author attempts to explore some counter-positions (for example that held by Georges Canguilhem) and to find the roots of Jacob’s theory, some of which can be traced, for example, to Claude Bernard’s theory.