Everything posted by Abdul-Aziz

  1. Psychologists have a rough idea of what intelligence is, but they tend to get bogged down when pressed for specifics about what it actually means. Even our definition of a planet keeps changing. Did you know that Pluto was recently demoted from its previous planetary status? It is now a "dwarf planet".
  2. How can we speak of artificial intelligence, when modern psychologists cannot even agree on a definition of human intelligence? First, provide an operational definition of human intelligence; then, one can begin to speak of artificial intelligence.
  3. Well, it does demonstrate that the connection between intelligence and genetics is largely one of speculation and wishful thinking. I do not doubt that genetics plays a role in the development of human intelligence, but that role is a marginal one at best, even if it can exert considerable influence over the lives of some individuals, such as Newton. However, the price of genius is the onset of madness, and the history of Sir Isaac Newton's eccentricities and personal foibles is a perfect illustration of this. In fact, genius is a dimension of human behaviour that involves more than just being "intelligent"; it is generally accompanied by some form of mental illness (particularly psychosis or manic depression), suggesting that it is a kind of psychological condition akin to having a genetic brain disorder or being an idiot savant. Who knows? Maybe being too intelligent might not be good for your health.
  4. It is also well known that twin studies are plagued by numerous methodological difficulties that largely invalidate them as legitimate foundations for further scientific investigation. First, they take a generalization based on a small, unrepresentative subpopulation of individuals and use it to explain phenotypic/behavioural variation within the general population at large. For twin studies to have any validity whatsoever, twins drawn from a representative sample of the genetic variation within a population would need to be adopted into families drawn from a representative sample of the environmental variation within that population. No twin study has ever done this. Secondly, investigators have yet to identify the specific environmental factors and gene-gene interactions most responsible for cognitive development, meaning that whatever estimates of heritability are made, they ultimately rest on the logical fallacy of treating correlation as causation. Thirdly, conventional mathematical calculations for estimating heritability involve minimizing and isolating genotype-environment correlations by randomizing micro-environmental influences. Given the sheer complexity of human social behaviour and the fact that early childhood socialization begins at birth, being central to both infant survival and future integration into adult society, the genotypic-environmental contributions to the distribution of human phenotypic characters can neither be minimized nor relegated to the status of an experimental control, as in the case of animals or plants; because of this, it is virtually impossible to disentangle the relative contributions of genes and environment to the development of phenotypic diversity. Hence, because no twin study has ever been able to minimize, isolate, and control for the presence of genotypic-environmental interaction, estimates of heritability based on the available evidence are both meaningless and unwarranted. The fact that your studies refer to the heritability of intelligence makes your thesis even more ridiculous than it already is. First, there are no specific genes that code for intelligence, and secondly, there is no consensus among modern psychologists on what intelligence is. Thirdly, IQ is not even a phenotypic trait, which makes the notion of its heritability problematic to begin with; it is grounded in no theoretical substructure and can only be measured in one way and one way alone. IQ measurements themselves are determined by convention; the measurements obtained vary wildly from researcher to researcher, demonstrating that IQ is not an objective measure in the first place. Besides, even if there were such a thing as "IQ" and it were supposedly "heritable", that would say nothing about the malleability or plasticity of the trait, because high/low heritability and high/low malleability can combine in every possible way, as logic dictates and as is widely observed throughout nature. Monozygotic twins would be more similar in anatomical structure than dizygotic twins; either way, this adds nothing to your case that IQ is somehow genetically inherited. So? Tsien doctored some genes to improve the memory and learning skills of mice. Interesting. However, this adds nothing to your case that human intelligence is largely genetically heritable or that genes for intelligence have been found in mice, let alone human beings. I think the main point here is: are there genes for genius?
Comparisons of the genes of people of high versus average intelligence have produced slow progress in identifying the many genes that contribute to cognitive ability. Given complete equality of opportunity, the only thing left to act on a person's genetic endowment would be the influence of external variables. Wait! Did you just mention a study from Thomas Bouchard? I hope you know that Thomas Bouchard is a neo-Nazi who has strong ties to the white supremacist Pioneer Fund. He also supports the racist theories of the Bell Curve and is a close personal friend of another white supremacist named Linda Gottfredson. The fact that you mentioned a known racist like Thomas Bouchard as one of your sources almost completely invalidates your argument and reflects very negatively on what kind of person you are. Are you hiding anything we should know about?
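To make the point about twin-study assumptions concrete, here is a minimal sketch (my own illustration, with made-up correlation values) of the classical Falconer decomposition that such heritability figures come from, and of the assumptions it silently requires:

```python
# Minimal sketch: Falconer's classical decomposition from MZ/DZ twin correlations.
# The correlations below are placeholders, not data from any actual study.

def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    """Classical variance decomposition from twin correlations.

    Assumes additive genetics only, equal shared environments for MZ and DZ
    pairs, and zero gene-environment correlation or interaction. If any of
    these assumptions fails, the "heritability" figure is not interpretable
    as a purely genetic effect.
    """
    h2 = 2 * (r_mz - r_dz)   # additive genetic variance (A)
    c2 = r_mz - h2           # shared environment (C) = 2*r_dz - r_mz
    e2 = 1 - r_mz            # non-shared environment plus measurement error (E)
    return {"h2": h2, "c2": c2, "e2": e2}

print(falconer_estimates(r_mz=0.75, r_dz=0.5))
# -> {'h2': 0.5, 'c2': 0.25, 'e2': 0.25}; the whole decomposition collapses
# if the equal-environments or no-GxE assumption is false.
```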
  5. Please feel free to PM me if you have any questions that need to be answered or statements that need to be made.

  6. Well the question you need to ask yourself is: if intelligence is largely determined by genetic factors, then how come no one has been able to discover genes that code for intelligence? No. My argument is that, because the concepts of IQ and g are based on certain falsehoods that have been definitively refuted on scientific/mathematical grounds, they must be seen as concepts that are devoid of empirical merit. This is absolutely in keeping with the scientific method and is known as the principle of falsifiability. Scientific endeavour is socially constructed to begin with; the content of science is culture specific and ultimately shaped by both the perceptual biases and combined socio-cultural interactions of its various participants. Your failure to acknowledge this demonstrates a fundamental misunderstanding of what the very nature of science is, indicating that for you, DNA functions as ideology. Given complete equality of opportunity, the only thing that would affect outcomes would be the freedom to choose and personal preference.
  7. No, that is not my argument. What I did say was that every human being, provided they are neither physically nor mentally disabled in some way, possesses the same equal potential for achievement. I have already pointed this out before. Well, for those born with genetic brain disorders or idiot savants, certainly.
  8. I never said that intelligence was not influenced by genetics, but I have strongly emphasized that environment plays a much greater role in the determination of human intelligence than the relatively insignificant contributions of genetics. Again, correlation is not causation. How can one assume a genetic explanation for a simple correlation when it is almost theoretically impossible to prove the existence of genes that code for intelligence? Multiple studies have also shown that the cultural transmission of socio-economic status and the maternal effects of both womb and infant nurturing can easily account for whatever similarities exist in twin IQ, as opposed to the shortcomings of an explanation derived from human genetics. Besides, haven't twin studies been largely discredited by modern science anyway?
  9. Yes, that is what the tests are designed to do, as originally intended by their inventor Alfred Binet. However, numerous modern psychometricians have decided to stray far away from the original good intentions of M. Binet. Some racist academics (Rushton, Lynn, Brand, Gottfredson etc.) still perpetuate the quaint notion that IQ scores are an accurate reflection of g-factor intelligence.
  10. No. I am saying that variations in cognitive ability are primarily due to modifications within the environment, not genetic heritability. After all, every human being is born with the same equal potential for achievement. The argument for genetic heritability is frequently invoked by racists/fascists as a means of sustaining and perpetuating conditions of structural injustice. This, coming from a person who hasn't even begun to scratch the surface of my most fundamental arguments.
  11. Actually, all IQ tests are pretty meaningless. Whether one is taking an online IQ test to qualify for Mensa or sitting the Wechsler Adult Intelligence Scale, the only thing measured on such a test is the ability to regurgitate culture-specific bits of trivial information.
  12. You're taking a metaphor for unused human potential and interpreting it far too literally. The fact is, because of the mysterious nature and great complexity of human intelligence (the science of which is still in its infancy), combined with the shortness of our life span and the often arbitrary limitations imposed upon us by our surroundings, we can never reach our full potential to begin with. No. It is a logical fallacy to treat correlation as causation. Inferring the existence of a primary agent from such insufficient proof (a simple positive correlation, for example) is not warranted by the evidence at hand, which makes it "obviously chimerical".
  13. There is no scientific basis for intelligence testing. Even the g factor with which IQ scores are supposedly correlated rests on mathematical blunders and outright miscalculations, a fact first pointed out by the French psychologist Alfred Binet, the so-called founder of IQ testing. After a period of roughly thirty years in which IQ testing was generally ignored, the practice was revived by the racist academic Arthur Jensen in 1969, presumably as a means of lashing out against the US Civil Rights Movement. The time has finally come to sweep quaint notions such as IQ and g-factor intelligence into the dustbin of history, where they ultimately belong. Racist, sexist, and classist dogmas such as IQ and Spearman's g have no place within the field of civilized, intelligent discourse, since they are instruments by which the weak and the defenceless are marginalized to the very fringes of society.
  14. Are you having a bad case of déjà vu? If you have nothing intelligent to contribute, then please don't add anything to the thread.
  15. Well, first of all, I never said that everyone has precisely the same intellect. What I did say was that everyone has the same potential to develop their natural abilities to the fullest extent, the actual extent of which is unknown because of the influence of external variables. The gradual, monotonic increase in IQ scores throughout the twentieth century has necessitated successive re-standardizations of intelligence tests from decade to decade. This is known as the Flynn effect, and it clearly demonstrates that such things as IQ are fundamentally social constructs that respond to purely environmental stimuli. Actually, the only twin study that supported the substantial heritability of IQ was that of Sir Cyril Burt in 1966, who assumed that 80% of human intelligence was heritable; this study was later shown to be a forgery. Whatever similarity exists between twins' IQs can easily reflect such things as shared maternal environment and nutrition (especially in the womb), as in the case of monozygotic twins reared apart. The New Zealand psychologist James Flynn actually set out the experimental requirements for conducting a sound twin study in his book Race, IQ, and Jensen (1980), requirements so stringent that such a study is almost impossible to carry out. First, the twins would need to constitute a representative sample of the total genetic variation within a population and would need to be adopted into families embodying a representative sample of the total environmental variation within that population; the study would also need to control for heteroskedasticity, positively correlated gene-environment interactions, and the unintentional behavioural/developmental modifications introduced by maternal effects. To my knowledge, a sound twin study, whether of monozygotic or dizygotic twins, demonstrating the substantial genetic heritability of IQ has never been conducted. Because twin studies do not account for genotype-environment correlations, the notion of IQ having any kind of heritability is essentially undefined. Most of the twin studies you are relying on for your estimates of substantial heritability of intelligence come from obvious frauds and racists like our dear Burt or Jensen. While I do not discount the influence of genetics on the development of the human personality, I believe that influence is quite minimal, with external variables being considerably more important. For example, Eric Turkheimer et al., in a 2003 twin study (http://www3.interscience.wiley.com/journal/118847316/abstract), demonstrated that the so-called broad-based heritability of IQ is massively influenced by socio-economic status and other external variables, varying from zero at low status to .80 at high status. Anyone who believes that the heritability of IQ has any value as a legitimate measure needs to explain away Turkheimer et al.'s findings. According to them: Results demonstrate that the proportions of IQ variance attributable to genes and environment vary nonlinearly with SES. The models suggest that in impoverished families, 60% of the variance in IQ is accounted for by the shared environment, and the contribution of genes is close to zero; in affluent families, the result is almost exactly the reverse. You are reading things too literally. We use less than 10% of our overall intellectual potential (which is enormous), with "brain" being used metaphorically for that very same potential. You're missing the point.
The fact that g can be positively correlated with anything demonstrates its obviously chimerical nature. I appreciate the fact that we can agree on something. However, what I meant to say was that we use only a small percentage (maybe less than 10%) of our intellectual potential, with the word "brain" being used metaphorically as a substitute for mental capacity. I was not referring to the brain's mediation of our physiological processes, which involves the operation of 100% of the brain. Because I believe in the universality of equal rights and treatment and that all human beings possess equal intellectual potential, I am suddenly a fanatic? Equal rights are necessary because people should be treated as individuals, and not boxed in by arbitrary labels, whether government-legislated or otherwise. The spectre of crypto-fascism rears its ugly head once again...
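Since I keep invoking the Flynn effect, here is a minimal sketch (my own illustration, assuming the roughly three-points-per-decade drift usually quoted) of why tests have to be re-standardized and why scores against an aging norm quietly inflate:

```python
# Illustrative only: the drift figure is an assumption (~3 IQ points per decade
# against a fixed norm), not a measurement taken from any particular dataset.

DRIFT_PER_DECADE = 3.0  # assumed average population gain against an old norm

def apparent_iq(current_standing: float, years_since_norming: int) -> float:
    """Score reported against a norm that is `years_since_norming` years old.

    `current_standing` is the IQ the person would receive against an
    up-to-date norm (mean 100, SD 15). Against an aging norm the whole
    population appears to gain, so the reported figure creeps upward.
    """
    return current_standing + DRIFT_PER_DECADE * (years_since_norming / 10)

for age_of_norm in (0, 10, 20, 30):
    print(age_of_norm, apparent_iq(100, age_of_norm))
# 0 100.0, 10 103.0, 20 106.0, 30 109.0 -- hence the periodic re-norming,
# and hence any fixed IQ cut-off changes its meaning from decade to decade.
```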
  16. Of course, some people would argue that IQ tests only measure how good one is at writing IQ tests, if they measure anything at all (which is not likely). I believe that human intelligence is so complex that it cannot be measured by simplistic tools like IQ testing. The modern notion that human intelligence is multidimensional seems much closer to the truth.
  17. What you are forgetting is that an IQ test, regardless of how it is calculated, is only considered valid or reliable insofar as it is seen as an accurate gauge of g-factor intelligence. Theoretically speaking, the tests are constructed in such a way as to be highly correlated with measures of g, or "g-loaded". The more heavily "g-loaded" a test is, meaning the more strongly its measures correlate with the g extracted from a positively correlated data matrix, the more it is taken to measure general cognitive ability. For example, Raven's Progressive Matrices are considered by some authorities to be one of the most heavily "g-loaded" intelligence tests in existence because their items correlate highly with g. In addition, many of the pseudo-intellectual bigots from amongst the IQ crowd, such as J.P. "penis size = brain size" Rushton or Richard "all women are stupid" Lynn, often closely associate g with IQ, using one as a proxy for the other. To sum up, because modern psychometricians view tests of cognitive ability as more or less accurate reflections of g-factor intelligence, it follows that by debunking g one severely undermines the validity of IQ as a reliable measure. As for your second statement, we are all born equal, but you forget that we also all have the same potential to develop our natural abilities to their fullest extent. In a sense, everyone is a potential Einstein, minus the influence of external variables of course. Remember that human nature is essentially malleable and plastic. Because we use less than 10% of our central nervous systems, all human beings (provided they are neither mentally nor physically disabled) have the same innate capacity for high intellectual/creative achievement throughout the life cycle. Unfortunately, whether one develops those capacities to the fullest or not is another story.
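For anyone wondering what a "g-loading" even is in practice, here is a minimal sketch (one common operationalization, with an invented correlation matrix; I am not claiming this is how any particular author computes it): each subtest's loading on the first principal component of the battery's correlation matrix.

```python
# "g-loadings" as loadings on the first principal component of a subtest
# correlation matrix. The matrix below is made up for illustration.
import numpy as np

R = np.array([            # hypothetical correlations among four subtests
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.5, 0.4],
    [0.5, 0.5, 1.0, 0.3],
    [0.4, 0.4, 0.3, 1.0],
])

eigvals, eigvecs = np.linalg.eigh(R)          # symmetric eigendecomposition
v1 = eigvecs[:, -1]                           # dominant eigenvector
loadings = np.abs(v1) * np.sqrt(eigvals[-1])  # correlation of each subtest
                                              # with the first component
print(np.round(loadings, 2))
# The subtests with the largest loadings are the ones called "more g-loaded"
# in this usage; note that the numbers depend entirely on which subtests
# happen to be in the battery.
```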
  18. The Concept Of IQ Is Based On Academic Charlatanism And Scientific/Mathematical Falsehood

The use of IQ as a measure of human intelligence is neither scientific nor properly grounded in the appropriate multivariate statistical analysis.

Our current apprehension of what constitutes the proper scope of psychometric analysis is grounded upon a series of pseudo-scientific formulations without any legitimate basis in multivariate statistical analysis. Consequently, the field of intelligence testing is nothing more than a neo-eugenic programme deliberately obscured from public view; it is a front for modern scientific racism concealed behind the impenetrable veil of innocuous disguise. Not only is mental testing founded on a series of scientific falsehoods and gross distortions of reality, but the so-called general factor of intelligence, or g, is without any discernible physiological/neuro-anatomical correlates, given the spatial distribution of human cognitive functioning across both hemispheric regions of the brain. There is substantial consensus within the scientific community that those biochemical processes within the central nervous system which drive the general operations behind fluid/crystallized intelligence can neither be objectively measured nor accurately gauged. In 2001, an elaborate genome-wide scan of 1842 DNA markers, conducted by the researcher R. Plomin et al., failed to reveal even a single allele responsible for the trait of intelligence or a genetic mechanism mediating its supposed heritability. In a 2008 interview with the journalist Carl Zimmer of Scientific American, even the distinguished R. Plomin was forced to admit: "I'm not willing to say that we have found genes for intelligence because there have been so many false positives. They're such small effects that you're going to have to replicate them in many studies to feel very confident about them." Needless to say, other investigators have found the search for a genetic basis to human intelligence just as elusive and frustrating. In fact, there is strong agreement within academia that intelligence is a multidimensional trait that cannot be subsumed under the field of quantitative genetic analysis because of its amorphous nature and great complexity. This should come as no surprise, given that no consensus has ever been reached concerning the nature of human intelligence itself. Yet, despite the fact that no genetic basis for intelligence has ever been found and that none of the major psychologists of the twentieth century ever provided a coherent definition of g-factor intelligence, a small minority of psychometricians still believe that intelligence is a unitary, highly localized trait that can be measured both quantitatively and with a high degree of scientific objectivity. Given the multidimensionality of human intelligence, it is surprising that so many still believe it can be quantified as a linear variable. Thus, in order to understand the current state of affairs, we must first delve into the history of intelligence testing.

The rise and fall of Spearman's hypothesis: even after Spearman's initial statistical theses were definitively refuted, he is still followed by bigots world-wide

The intelligence quotient is ultimately grounded upon Spearman's hypothesis, which is often falsely spoken of as a unidimensional factor extrapolated from a multidimensional covariance matrix.
Unfortunately, its sole function as an artificial statistical artefact has been the legitimization of institutionalized racist/sexist discrimination against the already disadvantaged. Nevertheless, we must discover what factors ultimately contributed to the ideological formulation of Spearman's nefarious hypothesis from a socio-historical vantage point. The field of psychometric intelligence was pioneered by the French psychologist Alfred Binet, who intended it as a diagnostic tool for identifying children in need of remedial education within the French public school system. It was based on a test item score and was designed to indicate whether a child's mental age corresponded with his chronological age, indicating average scholastic performance; a mental age below the chronological age could indicate mental retardation, and one above it intellectual giftedness. Because previous psychometricians had enormous difficulty in defining what general intelligence was, the British psychologist C. Spearman decided to circumvent this difficulty by operationalizing the definition of human intelligence as a g factor. He reasoned that because all subtest scores for intelligence correlated positively, they must be the sole product of a single variable, namely the general factor of intelligence. g was gradually teased out of a given data set through the factor analysis of variate co-ordinates organized mathematically into correlation matrices. Believing that he had discovered an objective measure for what he termed g, rather than a tautological proposition that reinforced its own claim to objectivity by dispensing with scientific methodological analysis, Spearman and Hart declared in 1912: "Citizens, instead of choosing their careers at almost blind hazard, will undertake just the professions suited to their capacities. One can even conceive the establishment of a minimum index to qualify for parliamentary vote, and above all for the right to have offspring." The publication of Spearman's hypothesis, the intellectual mainspring upon which modern intelligence testing is founded, unleashed a firestorm of criticism across both sides of the Atlantic, accompanied by a wave of the most bitter invective. Through the development of a new statistical technique in 1916, the so-called Thomson theory of sampling, it was argued that many common factors, rather than the single unidimensional variable postulated by Spearman, could be derived from the same correlation matrix. Other investigators of the period argued that the indeterminate nature of the factor analytic model automatically refuted Spearman's general factor of intelligence. Many noticed that his g was the statistical equivalent of a random variate defined by functional constraints which, when applied to the subtest data, produced an infinite multiplicity of latent common factors explaining all manifest variables within the data set itself. This observation alone undermined the unidimensionality of g as an objective measure extracted from linear covariance matrices and operationalized as a proxy for human intelligence. Thus, Spearman's position on intelligence was gradually made untenable, and whatever achievements he had previously made in the field of psychodiagnostics were rendered superfluous.
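The Thomson sampling argument is easy to demonstrate for oneself. Here is a minimal toy simulation (my own construction, not Thomson's original mathematics): each test draws on a random subset of many independent elementary processes, no common cause exists anywhere, and yet a positive manifold and a large first component appear regardless.

```python
# Toy version of Thomson's "sampling of bonds" idea: every test samples ~40%
# of 200 independent elementary processes. All parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_bonds, n_tests = 2000, 200, 8

bonds = rng.standard_normal((n_people, n_bonds))     # independent "bonds"
masks = rng.random((n_tests, n_bonds)) < 0.4         # each test taps a random subset
scores = np.column_stack([bonds[:, m].sum(axis=1) for m in masks])

R = np.corrcoef(scores, rowvar=False)                # all-positive correlations emerge
eigvals = np.linalg.eigvalsh(R)
print("mean off-diagonal r:", round((R.sum() - n_tests) / (n_tests * (n_tests - 1)), 2))
print("variance share of first component:", round(eigvals[-1] / n_tests, 2))
# A sizeable "general factor" falls out of the arithmetic even though, by
# construction, there is no single underlying ability in the model at all.
```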
Mere knowledge of the variables embedded within a correlation matrix, together with all correlations between common factors, is not sufficient mathematical proof to identify the common and unique factors. As a consequence, an infinite range of variables can be mathematically constructed to reproduce the same correlational pattern among the observed variables, as well as the common/unique factors obtained through factor analysis of the same vector spaces. If such is the case, as many of Spearman's critics contended, what is g? How can something so conceptually nebulous be defined with such mathematical rigour? The statistician and polymath E.B. Wilson argued in 1928 that Spearman's factor analytic method, upon which the entire modern pseudo-science of intelligence testing is based, was incapable of defining what g is. Wilson pointed out that in a vector space of (p + m) dimensions containing p observed equations and their corresponding variables, it would not be possible to pin g to a given vector in that space because of the finite number of variables. This made g an indeterminate factor and an undefinable entity, meaning that the nature of human intelligence itself could not be explained. Spearman countered that by transforming p, the observed number of variables, into an infinite set, or by increasing the number of variables indefinitely, g would eventually be rendered a quantitatively determinate factor. Nonetheless, towards the end of his career, Spearman was forced to abandon his belief that a single variable, or even a finite set of variables, could be extracted from a correlation matrix and made to correlate with g. He conceded, unrealistically, that g could only become determinate after constructing a series of matrices with a steadily increasing number of subtests, each characterized by nonzero correlations with g. Yet in order for this to be achieved, Spearman argued disingenuously that the available statistical methods needed to be further refined in terms of accuracy and precision of measurement. As to how many subtests were needed in order to transform g into a determinate factor that tied together the positive correlations of a data set, he was unable to specify. This inability to assign a coherent definition to the g factor of intelligence is a problem that haunts modern psychometric testing to this very day.

Thurstone modifies Spearman's untenable hypothesis

Spearman's failed hypothesis was resurrected by L. Thurstone in the 1920s. Instead of interpreting Spearman's general theory of human cognition as an objective measure of some unitary factor of intelligence (because of unresolved issues such as factor indeterminacy), Thurstone transformed it into a scientific methodological approach based on the multiple factorization of between-variable correlations. Through the application of this statistical technique, he was able to demonstrate the multidimensional nature of human cognition. If the partial correlations did not equal zero after being factored out, then a second, third, fourth, etc. common factor would be partialled out of the data matrix until the partial correlations were eventually reduced to zero, producing a multiplicity of general factors or "g-like intelligences".
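(A side note before continuing with Thurstone: the indeterminacy concession described above can be made concrete with textbook one-factor formulas. This is my own illustration, not Wilson's or Spearman's actual derivation. For p tests all loading lambda on a single factor, the factor's squared multiple correlation with the tests is rho^2 = p*lambda^2 / (1 + (p-1)*lambda^2), and Guttman's bound says two equally admissible sets of factor scores may correlate with each other as little as 2*rho^2 - 1.)

```python
# Factor-score indeterminacy in the equal-loading one-factor case (illustrative).
def indeterminacy(p: int, loading: float) -> tuple[float, float]:
    rho2 = p * loading**2 / (1 + (p - 1) * loading**2)
    return rho2, 2 * rho2 - 1          # (rho^2, worst-case agreement of rival g-scores)

for p in (3, 6, 12, 50):
    rho2, bound = indeterminacy(p, loading=0.6)
    print(f"p={p:3d}  rho^2={rho2:.2f}  minimum correlation between rival g-scores={bound:.2f}")
# With a handful of subtests the bound is low, i.e. two perfectly admissible
# "g" score series can disagree badly; it approaches 1 only as p grows without
# limit -- which is precisely the concession attributed to Spearman above.
```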
Still, this approach produced a number of conceptual difficulties for Thurstone which were never encountered in the work of Spearman, because of Spearman's exclusive dependence upon the extraction of a single common factor (which Thurstone rejected on conceptual grounds). Clearly, then, further work in applied mathematics was needed in order to implement his serendipitous insights as applied statistical reality. Eventually, Thurstone managed to resolve his methodological difficulties by developing the simple structure approach to the problem of geometric rotation. He envisioned the extraction of common/unique factors through multiple factor analysis from a geometrical perspective, as orthogonal axes of a linear transformation system encompassing a distribution of test scores within vector space. Regardless of whether the origin or the corresponding orthogonal axes remained identical, the same set of test points could be explained by a multiplicity of different co-ordinate systems. Through the reduction of all explanatory mechanisms to their most fundamental structure, reflecting the parsimony required by Occam's razor, the problem of selecting the co-ordinate system of maximal utility was resolved. In other words, through the orthogonal rotation of the co-ordinate system, each test point was assigned only a few non-zero factor loadings, reducing the number of common factors needed. However, as the number of common factors steadily increased, it became difficult to envision their position within a two-dimensional system of linear co-ordinates, a serious difficulty that was only ameliorated somewhat by later advances in computing. Although modern psychometrics (as opposed to its Jensenist counterpart) revolves around Thurstone's elaboration of Spearman's hypothesis on the existence of the general intelligence factor, in the form of multiple and hierarchical second-order factor analysis, what Thurstone failed to realize was that he had actually set the field of intelligence testing back almost 50 years. Thurstone, like other psychologists, believed that intelligence was a multidimensional trait whose existence could not be falsified, since it could supposedly be evaluated objectively through the administration of psychodiagnostic testing. In spite of this, a paradoxical question remained for Thurstone and his followers: how does one measure a variable that is multidimensional and does not preserve empirically defined order relations between two subjects (such as A = B or A > B)? Theoretically speaking, only one-dimensional variates can be assigned a single, unitary value. If the psychometricians of the Thurstonian school were going to continue speaking of IQ, then it would be necessary to specify what kind of intelligence was being referred to and the general factor being used to represent it symbolically. In the end, nothing Thurstone did for psychometrics was ever subjected to rigorous methodological analysis, such as the experimental replication of his formulae within a controlled setting (especially his orthogonal solution to the rotation problem).
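To see what an orthogonal rotation to simple structure actually does, here is a small hand-rolled varimax (my own illustration with an invented loading matrix, not any particular package's implementation or Thurstone's original procedure):

```python
# Kaiser's varimax criterion: rotate the axes so each test loads strongly on
# as few factors as possible. The "unrotated" loadings below are fabricated.
import numpy as np

def varimax(L: np.ndarray, n_iter: int = 100, tol: float = 1e-8) -> np.ndarray:
    """Orthogonal rotation maximizing the variance of squared loadings."""
    p, k = L.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / p))
        R, d = u @ vt, s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
        d_old = d
    return L @ R

# Two clean group factors, smeared by an arbitrary 45-degree mix to mimic an
# unrotated extraction in which every test loads on both axes:
clean = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
t = np.pi / 4
mix = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
unrotated = clean @ mix

print(np.round(unrotated, 2))           # loadings spread across both factors
print(np.round(varimax(unrotated), 2))  # rotated back to a simple two-cluster pattern
# Same test points, same overall fit -- only the axes have moved, which is why
# rotation alone cannot settle what "the" factors of intelligence really are.
```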
This series of unresolved conceptual difficulties ultimately had the effect of minimizing his contributions to the modern pseudo-science of psychometric intelligence testing, making him out to be little more than a footnote in the great annals of empirical science.

Jensen revives the long-forgotten and long-debunked Spearman hypothesis

After the work of Spearman and Thurstone had been practically forgotten for a number of decades, A. Jensen arrived in the 1960s with an explosive article published in the Harvard Educational Review. Within the article itself, he controversially advocated the substantial genetic heritability of black-white differences in IQ during the height of the Civil Rights Movement in the USA. Through his work during this period, he single-handedly managed to revive the old IQ controversy of the early twentieth century by adding a biological explanation to the apparent 15-point gap between black and white test scores, a result which had initially been reported by the racist propagandist Audrey Shuey and the white supremacist Pioneer Fund in the late 1950s. All of Jensen's data concerning black/white differences in IQ comes from Shuey's magnum opus, The Testing of Negro Intelligence (1958), a bigoted tract which presupposes white racial superiority from a quasi-biological determinist standpoint. However, the concept of g employed by Jensen, upon which all of modern psychometric analysis rests, is not identical to the concept of g as formulated statistically by Spearman. Whereas Spearman formulated g as a common general factor of unitary intelligence extracted from the positive intercorrelations of a series of subtests, Jensen employed a vector space transform, the first principal component, as an ersatz substitute for Spearman's g. The general factor of Jensen is not identical to that of Spearman and should not be confused with it; it is an artificial statistical construction employed by Jensen to obscure the obvious methodological shortcomings and technical illogicalities of Spearman's g. Spearman's g was based on the unsubstantiated presupposition that intelligence is a reified substance that can be measured and assigned a discrete linear variable derived from the successive intercorrelations of a positive data matrix. Jensen's ideological conceptualization of the g factor, on the other hand, is based on a highly variable first principal component designed to express the total variance within a data set. Nevertheless, because the principal component cannot be reduced to zero through partial correlations, no g can be extracted from it. Instead, it functions as a highly localized description of a positively correlated eigenmatrix that ultimately varies from data set to data set. Before proceeding any further, we must first ask ourselves: what is the difference between principal components analysis and factor analysis? In principal components analysis, the objective is to locate those components most expressive of the total (common plus unique) variance within a correlation matrix of p variables, reducing the data from higher to lower dimensionality by means of eigendecomposition. Factor analysis, on the other hand, is designed to locate those latent variables that can be operationalized as common factors contributing to the covariance amongst the test variables.
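The distinction between the two techniques is easy to show numerically. Here is a minimal sketch (a generic textbook contrast on a made-up correlation matrix, not Jensen's or Spearman's own computation): PCA keeps the 1s on the diagonal and describes total variance, while a one-factor principal-axis solution replaces the diagonal with communality estimates and models shared variance only.

```python
# PCA versus one-factor principal-axis factoring on an invented 4x4 matrix.
import numpy as np

R = np.array([
    [1.0, 0.5, 0.4, 0.3],
    [0.5, 1.0, 0.4, 0.3],
    [0.4, 0.4, 1.0, 0.3],
    [0.3, 0.3, 0.3, 1.0],
])

# PCA: first principal component of R itself (total variance).
w, V = np.linalg.eigh(R)
pc1_loadings = np.abs(V[:, -1]) * np.sqrt(w[-1])

# Principal-axis factoring: iterate with communalities on the diagonal (shared variance).
h2 = 1 - 1 / np.diag(np.linalg.inv(R))     # initial communalities (squared multiple correlations)
for _ in range(50):
    Rr = R.copy()
    np.fill_diagonal(Rr, h2)
    w2, V2 = np.linalg.eigh(Rr)
    fa_loadings = np.abs(V2[:, -1]) * np.sqrt(max(w2[-1], 0.0))
    h2 = fa_loadings**2                    # update communalities and repeat

print("PC1 loadings:        ", np.round(pc1_loadings, 2))
print("one-factor loadings: ", np.round(fa_loadings, 2))
# The component loadings run higher because PC1 also soaks up each test's
# unique variance; the factor loadings reflect only what the tests share.
```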
According to the enlightened principles of Jensenist obscurantism, then, the first principal component is an ersatz g employed to express algebraically the total variance within a multidimensional data set. In the corpus of Jensen's writings on mental testing, g always functions as a means of encompassing the total variance both within and between eigenmatrices. g never functions operationally as the sole common factor extracted from a data set, but is deliberately operationalized as such through a factorial similarity artificially generated by the Jensenist coefficient of congruence. As Jensen dryly observed in a 1979 article: "For example, the g extracted from the six verbal subtests of the Wechsler Adult Intelligence Scale has a correlation of 0.80 with the g extracted from the five performance subtests, in the standardization sample." Because a correlation matrix is symmetric (for standardized variables, corr(X_i, X_j) = corr(X_j, X_i)), its eigenvectors can be arranged in order of decreasing eigenvalue and a dominant eigenvector selected. Used as a weight vector, this dominant eigenvector defines a linear transform of the data which captures the largest possible share of the total variation, otherwise known as the first principal component, or PC1. Let us examine how Jensen uses his version of ersatz g (which is really PC1 in disguise) and deliberately conflates it with Spearman's. Take the idealized case in which all the variables in the data set are positively and equally intercorrelated, with common correlation r > 0. The correlation matrix can then be written R = r·ee' + (1 - r)·I, where r is the common correlation, e is the p x 1 column vector of ones, and I is the identity matrix. Since e'e = p, we have Re = r·e(e'e) + (1 - r)·e = [pr + (1 - r)]·e, so e is the dominant eigenvector, with eigenvalue pr + (1 - r) = 1 + (p - 1)r. Adding (1 - r)·I leaves the eigenvectors unaffected and makes each of the remaining p - 1 eigenvalues equal to 1 - r. Hence PC1 accounts for a fraction r + (1 - r)/p of the total variance, i.e. 100[r + (1 - r)/p] per cent, and the ratio of the dominant eigenvalue to each of the others is pr/(1 - r) + 1, where p is the total number of subtests administered. Note that this ratio grows as more subtests are administered: as p increases, PC1 encompasses more and more of the total variance in any positively correlated data set, far more than PC2, PC3, PC4 and so on, but it can never be reduced to Spearman's common factor of intelligence because it cannot be partialled down to zero. As we can see from this conflation of the g factor with PC1, the factorial technique employed by Jensen is a far cry from the factor analysis of g first introduced by Spearman. Spearman defined g as a common factor that could be extracted from positively correlated subtests and that pointed directly to a reified, unitary entity localized within the nether regions of the central nervous system; the same g extracted from one set of positive correlations is the same g extracted from the next.
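The algebra above is easy to verify numerically. A minimal check (my own, with arbitrary values of p and r):

```python
# Numerical check of the equal-correlation case: R = r*ee' + (1-r)*I should have
# one eigenvalue 1 + (p-1)*r, the other p-1 equal to 1-r, and PC1 should explain
# a fraction r + (1-r)/p of the total variance.
import numpy as np

p, r = 10, 0.5
R = r * np.ones((p, p)) + (1 - r) * np.eye(p)

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
print("largest eigenvalue:", eigvals[0], "  predicted:", 1 + (p - 1) * r)
print("other eigenvalues: ", np.round(eigvals[1:], 6), "  predicted:", 1 - r)
print("PC1 variance share:", eigvals[0] / p, "  predicted:", r + (1 - r) / p)
print("ratio of eigenvalues:", eigvals[0] / eigvals[1], "  predicted:", p * r / (1 - r) + 1)
# Increasing p makes the ratio grow without bound, which is why a large first
# component turns up in any sizeable battery of positively correlated tests.
```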
On the other hand, Jensen employs PC1 as an ersatz proxy for Spearman's g, and it cannot be reduced to zero partial correlations as a g factor is gradually extracted from the data set, because there is no general factor to be extracted within the data collected. What Jensen has is a large number of PC1s that are localized descriptions of data sets, and these vary enormously from data set to data set each time a new eigenmatrix is reduced through dimensional analysis. Because all of Jensen's common factors vary from data set to data set, and therefore cannot represent any kind of common factor whatsoever, he employs a congruence coefficient that supposedly points to similarities between eigenvectors of different subtests. Thus, Jensen's ersatz g is not a common factor, but a highly parasitic g that varies according to which diagonalizable linear transformation or eigenmatrix is analyzed, a problem he tries to rectify through the introduction of congruence coefficients, or factor similarity. In The g Factor: The Science of Mental Ability (1998), Jensen operationalizes the extraction of the g factor through a series of congruence coefficients along linear algebraic lines which "are typically above 0.95, indicating virtually identical factors, usually with highest congruence for the g factor". Unfortunately, this adds virtually nothing to our knowledge of what factor similarity is to begin with, because any data set can have identical within-matrix correlations of 1 while the between-matrix correlations are of null value. Randomly generated eigenvectors drawn from the same uniform parent distribution produced congruence cosines that, when tabulated, oscillated between the limiting values of 0.995 and 0.999; vectors generated from a chi-square distribution yielded values of between approximately 0.949 and 0.998. Hence, as demonstrated by the foregoing, Jensen's common factor only measures the average amongst those test scores included within his correlation matrices and is not to be confused with Spearman's g. For every eigenmatrix a different g is extracted, rendering our current notion of g as a common factor virtually meaningless. Jensen's PC1, masquerading as an ersatz g, is a fluid, relatively unstable measure that is subjectively altered from test to test. What psychometricians of the modern Jensenist school work with is not g, which was always unworkable in practice, but a proxy for g which exhibits tremendous variance between subtests and the dominant eigenvectors they represent.

The uselessness of IQ and the racist bigots who defend it

Jensen is an academic charlatan and a fraud; all those who parrot his nonsensical, eugenicist opinions have little understanding of the scientific method or the actual workings of experimental observation. In actuality, the entire field of intelligence testing is based on fraud and gross distortion, with even less scientific import than academic fields such as phrenology or the criminal anthropology of Cesare Lombroso. The concept of IQ has produced nothing of tangible benefit throughout the twentieth century, other than contributing to the continued social marginalization of groups already excluded from the mainstream of society. What has the concept of IQ contributed to human knowledge? Has it led to any new discoveries or inventions? Have people's lives ever been immeasurably improved by the existence of this arbitrary, yet impossible-to-define g factor construct?
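Since so much weight is placed on those congruence coefficients, here is a minimal sketch (my own illustration with randomly generated loading vectors, not Jensen's procedure) of why high values come cheaply: Tucker's coefficient is just a cosine, and for vectors whose entries are all positive, as factor loadings in these batteries are, it is high almost by construction.

```python
# Tucker's congruence coefficient between two loading vectors, applied to pairs
# of unrelated, randomly generated positive "loadings". Ranges are arbitrary.
import numpy as np

rng = np.random.default_rng(1)

def congruence(x: np.ndarray, y: np.ndarray) -> float:
    return float(x @ y / np.sqrt((x @ x) * (y @ y)))

values = []
for _ in range(1000):
    a = rng.uniform(0.4, 0.8, size=12)   # two unrelated 12-test loading vectors,
    b = rng.uniform(0.4, 0.8, size=12)   # all entries positive as in real batteries
    values.append(congruence(a, b))

print("median congruence of unrelated loading vectors:", round(float(np.median(values)), 3))
# Typically in the mid-0.9s, i.e. close to the range usually read as "virtually
# identical factors" -- so a high congruence coefficient, on its own, is weak
# evidence that the same g has been recovered twice.
```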
The fact that the concept of IQ has been carried into the twenty-first century in the same putrefied, virtually unchanged Spearmanian cage it occupied at the beginning of the twentieth demonstrates quite clearly that the field of intelligence testing contributes nothing to the advancement of human affairs, other than serving as a convenient means of reinforcing the bigotry and prejudices of others. The truth of the matter is that there is no such thing as a general factor of intelligence, because what it is changes from test to test and is subjectively defined by whomever wields it as a statistical instrument. As a matter of fact, not a single psychometrician from the inauspicious days of Spearman up until now has been able to provide a workable definition of what intelligence is. As such, intelligence testing was designed by bigots to spread racial hatred against others under the usual thin veneer of scientific endeavour. After all, what better way to discriminate against others and insulate oneself from personal attack than to appeal to science in defence of one's personal prejudices? The sole purpose of IQ, and of the eugenicist arguments which provide its ideological buttress, is to exercise power over others by silencing individual voices and erasing previously autonomous individual identities. The only effective way of dealing with the field of intelligence testing is to eviscerate it on purely logico-mathematical grounds, ceaselessly attacking the narrow-minded bigots who preach it from the pulpits of academe. There is no scientific basis for IQ; the fact that it has remained virtually unmodified since its inception by Spearman at the beginning of the twentieth century, and has contributed absolutely nothing to the progress of human civilization, should constitute sufficient proof that IQ is not an instrument of science but an ideological weapon used to facilitate the brute domination and control of others, as well as a means by which blind prejudice is rationalized. Smash IQ and resist the racially motivated bigots who preach it! Help put a stop to mental testing and the racial discrimination it entails!