
Computational Cognition: The Future of Psychological Research

The Cognitive Revolution of the 1950s marked a paradigmatic shift in psychology, moving away from the structuralist viewpoint of pioneers such as Wundt and Titchener. The need and desire to study the cognitive processes underlying behavior drove this revolution, reinforced by the introduction of computer simulation into psychology by Newell and Simon (1972). A new era of information-processing theories, in which the mind was likened to a computer, arrived in the field, but research methods did not change completely; computers were still used at only a fraction of their capacity.

Towards the end of the 1990s and the beginning of the 2000s, many cognitive scientists emerged who brought in computers as major tools for understanding humans, studying representations, language processing, and other higher-order processes. This was made possible by artificial intelligence techniques and advances in computing software. Researchers used computational methods to develop their theories and research. For instance, Steven Pinker, in his book “How the Mind Works,” depicted the intersection of simulation, mathematical modelling, and psychology. Daniel Kahneman, awarded the Nobel Prize for his work on prospect theory, discussed risk taking in economic decisions and used structural modelling. Richard Shiffrin, famous for memory research, used computer simulation and Bayesian modelling as pillars for understanding complex behaviors such as optimal and non-optimal decision making and the coevolution of event memory and knowledge.

Psychology as a field is evolving, yet computational techniques are not used to their full potential; the focus still lies on behavioral studies. Unfortunately, the underlying theoretical assumptions are rarely questioned; they are followed blindly. There therefore seems to be a need for a new revolution, a Computational Cognitive Revolution, in which models and formal structures are used to analyse those underlying structures that have not yet been explored. A new way of thinking about how the brain works is important; it can take various forms, such as using big data, AI, mathematical modelling, and Bayesian methods, to name a few.

Advances in computation have led a number of researchers to use computational models to represent neural networks, patterns of connectivity, and the characteristics of neural processes in the field of learning. One such approach is error-correcting learning: instead of a teacher giving instructions and constantly guiding the output variables, a computer provides the corrective signal. For instance, this model can be used to work out how a basketball got into the hoop: instead of a coach correcting and analysing the player's action, a computer could analyse and determine which motor commands were used and what direction, target, and process were followed. This is made possible by supervised learning algorithms and related learning rules.
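To make the idea concrete, here is a minimal sketch of error-correcting (delta-rule) learning in Python. It assumes a toy setup in which a linear model learns the mapping from an input pattern (say, a motor-command vector) to an outcome; the data, dimensions, and learning rate are illustrative assumptions, not drawn from any particular study.

```python
import numpy as np

# Minimal delta-rule (error-correcting) learning sketch.
# A linear model learns weights that map an input (e.g., a motor-command
# vector) to a target outcome. The "teacher" is just the error between
# predicted and desired output, computed automatically.

rng = np.random.default_rng(0)

n_inputs, n_examples = 5, 200
true_weights = rng.normal(size=n_inputs)        # unknown mapping to recover
X = rng.normal(size=(n_examples, n_inputs))     # illustrative input patterns
y = X @ true_weights                            # desired outcomes

w = np.zeros(n_inputs)                          # learned weights start at zero
learning_rate = 0.05

for epoch in range(50):
    for x_i, target in zip(X, y):
        prediction = w @ x_i
        error = target - prediction             # error-correcting signal
        w += learning_rate * error * x_i        # delta-rule update

print("recovered weights:", np.round(w, 2))
print("true weights:     ", np.round(true_weights, 2))
```

After enough passes over the examples, the learned weights approach the true mapping, with no external teacher beyond the automatically computed error.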

In the field of language, research on morphology has been conducted for a long time. With the introduction of connectionist models, it has become easier to capture aspects of morphology and to show how regularization in neural networks can account for major changes in language processing.
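As a toy illustration in the same connectionist spirit (not a reproduction of any published morphology model), the sketch below trains a single error-driven logistic unit to judge whether a verb takes the regular “-ed” past tense from crude letter features; the verb list and features are assumptions made up for illustration.

```python
import numpy as np

# Toy connectionist sketch of morphology: one logistic unit learns to
# predict whether a verb takes the regular "-ed" past tense.
verbs = ["walk", "talk", "jump", "play", "cook", "call",   # regular
         "go", "sing", "run", "eat"]                       # irregular
labels = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])          # 1 = regular

def features(verb):
    # crude features: scaled word length plus a one-hot code for the final letter
    vec = np.zeros(27)
    vec[0] = len(verb) / 10.0
    vec[1 + (ord(verb[-1]) - ord("a"))] = 1.0
    return vec

X = np.array([features(v) for v in verbs])
w = np.zeros(X.shape[1])
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    for x_i, target in zip(X, labels):
        p = sigmoid(w @ x_i + b)
        error = target - p            # error-driven update, as above
        w += 0.1 * error * x_i
        b += 0.1 * error

for verb in ["plan", "bring"]:        # unseen items
    p = sigmoid(w @ features(verb) + b)
    print(f"{verb}: P(regular '-ed') = {p:.2f}")
```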

One study treats co-evolution as both a computational and a cognitive model, highlighting the similarities and differences between the two approaches and how they complement each other, thereby recognizing the need to combine both techniques for effective and accurate results.

The inclusion of computers and AI can shift research techniques from behavioral experiments towards big-data analysis and computational modelling, re-establishing the structure and form of the field. Behavioral studies are complex and unpredictable; to strengthen predictability and validity, the use of new tools, computational methods, and statistical models becomes essential. For instance, to study consumers' choices of clothes, instead of setting up a behavioural study in which participants come in and, in a systematised way, indicate which clothes they like, which brand they prefer and why, how much money they spend, and their colour and material choices, we could use websites and social networking pages to gather their opinions: showing images of clothes, offering suggestions related to items they like, and running polls, surveys, and online experiments.
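As a hypothetical sketch of what such online data analysis might look like, the snippet below summarises a handful of made-up poll responses about clothing preferences; the column names and values are assumptions for illustration, and in practice the responses would come from a poll export or a site's data feed rather than being typed in by hand.

```python
import pandas as pd

# Hypothetical poll responses about clothing preferences; in practice these
# would be loaded from an exported file, e.g. pd.read_csv("poll_responses.csv").
responses = pd.DataFrame({
    "participant": [1, 2, 3, 4, 5, 6],
    "brand":  ["A", "B", "A", "C", "B", "A"],
    "colour": ["blue", "black", "blue", "red", "blue", "black"],
    "spend":  [40, 75, 55, 30, 90, 65],   # reported amount spent
})

# Which brands and colours are most preferred, and how does spend vary by brand?
print(responses["brand"].value_counts())
print(responses["colour"].value_counts())
print(responses.groupby("brand")["spend"].mean())
```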

Taking the behavioral experiment setup first: it is something created by the researcher, so probing elements could be involved; it may be systematic, but it is at the same time constrained to the variety of clothes chosen by the researcher. Second, the researcher could be sampling from a limited participant pool, their presence in the setting itself could affect participants' responses, and anonymity, a part of research that is crucial for unbiased views, would not exist. Third, the setting would be time-bound and may carry age, sex, race, cultural, and religious biases; moreover, it would be a narrowly focused study, applicable only to a small group of people. It would also require a great deal of effort from the researcher and would be costly, often with unsatisfying results.

Looking at the second case, using large databases: an advantage is that the results would be more generalizable, more up to date, and based on modern tools. An important feature is the use of the design model for clothes, which offers a larger variety of clothes and brands whose relation to choices could be studied; the data would be standardized and easy to access under proper terms-of-use agreements with the companies providing the databases. It would be expensive, but it would provide richer and more meaningful data.

However, a few loopholes in bringing technology into experiments have kept people from being comfortable with computer-based experiments. First, database sharing can in some cases threaten individuals' privacy, and such data can be misused. Second, responses may reflect socially desirable views, cultural barriers or orthodox beliefs affecting behavioral choices, and a lack of seriousness. Although these issues are smaller than those of the behavioral technique, crowdsourcing of experiments remains an under-researched venture; researchers are still finding ways to incorporate such databases and techniques. Lack of knowledge about these open sources and crowdsourcing companies is a big barrier that needs to be looked into.

Thus, computational cognition is the need of the hour and can revolutionize experimentation techniques in psychology, improving and enhancing the knowledge structure of the field. It will take time, it will seem complex, and many will be against the interdisciplinary structure that psychology is changing into. But human beings are social animals; no process takes place in isolation, and inter-relationships throw light on problems and help to find smarter solutions. The accuracy of technology would be a catalyst for this new type of research: a new cognitive revolution, where computers and humans interact to better understand the most complex mechanism on earth, the human mind.

 Heena Khudabadi
