eISSN: 2084-9885
ISSN: 1896-6764
Neuropsychiatria i Neuropsychologia/Neuropsychiatry and Neuropsychology
Review article

An overview of the application of artificial intelligence in psychotherapy: A systematic review

Mohammad Tahan¹, Tamkeen Saleem²

  1. Department of Psychology and Education of Exceptional Children, University of Tehran, Tehran, Iran
  2. Department of Clinical Psychology, Shifa Tameer-e-Millat University, Islamabad, Pakistan
Neuropsychiatria i Neuropsychologia 2024; 19, 1–2: 28–39
Online publication date: 2024/08/12

Introduction

Artificial intelligence (AI) is defined as the ability of a machine to carry out tasks that, if performed by a human being, would be considered intelligent; such tasks may involve reasoning, perceiving, learning, decision-making, adapting and controlling (Luxton 2014). The origins of the field can be traced back to the era of computer development in the 1940s, and it was formally given its name by John McCarthy in 1956 (Buchanan 2005). AI technology may be embodied in machines, computer software, networks, or brain-computer interfaces inspired by living biology. AI has been actively involved in medicine since the 1970s, in both the virtual and the physical branches of biomedical research and development. The physical branch is the better known, owing to the role of care robots ("carebots") in executing various tasks in the realm of care, for instance helping elderly people take their medication or assisting surgeons during surgery; there are now many reports describing robots acting as solo surgeons as well (Hamet and Tremblay 2017; Luxton 2014). The virtual branch rests on machine and deep learning, as applied, for instance, to electronic medical records.
Artificial intelligence has been introduced in multiple fields, including games, law, stock trading, remote sensing devices, and diagnostic procedures (Tahan 2018; Shukla and Jaiswal 2013). Much of this advanced AI has trickled into routine applications, to the point that individual AI components can no longer be distinguished: once something is implemented as a general-purpose application of immense use, it tends not to be labelled AI anymore. AI thus has a ubiquitous infrastructural presence in virtually every industry. The late 1990s and the beginning of the 21st century saw AI technological elements woven into larger systems (Shukla and Jaiswal 2013). Attention in AI has principally focused on autonomous systems that might replace humans, including mental health professionals, in their respective fields. While cognitive psychology is expected to uncover new paths of inquiry from algorithmic-instruction theory, that theory may also have more concrete implications, as it measures complicated data related to human behavior. This points to interdisciplinary, synergistic research across cognitive psychology, neuroscience and computer science that may benefit humans in one way or another (Mozera et al. 2019). In this modern age of technology, with no boundaries or limits, there are three sides to the argument over the ethical and moral repercussions of artificial intelligence. One side holds that many people in the world are already stricken by poverty and unemployment, and that further replacement of humans with machines may aggravate these social problems. Another group argues that society cannot progress or derive benefits from its resources without the help of machines that can, at least partially, think for themselves. Meanwhile, the third group is largely unconcerned with these issues, as is typical of human society (Shukla and Jaiswal 2013).
The COVID-19 pandemic influenced all walks of life globally, and many systems were changed and digitally updated. The use of AI software for the prediction of mental health disorders increased greatly (Ćosić et al. 2020). Digital psychiatry, specifically artificial intelligence and machine learning, was employed for the early detection as well as prediction of mental health problems (Hariman et al. 2019; Zhou et al. 2022). It is also expected that AI software will help mental health professionals define and refine mental disorders more objectively than the process currently followed by DSM-5 (Graham et al. 2019). The latest AI software can assist in predicting the risk of psychological pathologies (Schultebraucks et al. 2020), identifying suicidal risk (Just et al. 2017), delivering psychotherapy (Ewbank et al. 2020), and, in the form of virtual human interviewers, may destigmatize the reporting of mental health problems (Lucas et al. 2017). The COVID-19 pandemic has had adverse outcomes for human psychology and behavior even long after initial recovery from the illness, and some COVID-19 recovery programs specifically targeting mental health have been based on AI prediction and treatment strategies. New AI software may thus help expand the resources available to mental health professionals in coping with mental health challenges.
Therefore, the focus of this article is to review the uses of AI technologies that apply to psychotherapeutic practice and research. Finally, there is a discussion of the use of artificial intelligence in psychotherapeutics for patients, mental health professionals, and the field of psychology.

Method

The current work is a scoping review of the extant research on the application of artificial intelligence in the field of psychotherapy. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria were utilized to describe the studies’ methodologies and findings, although no protocol was registered for this review.
Study eligibility
The review focused on studies exploring the application of AI in psychotherapy, whereby all publications on qualitative and quantitative research, including interviews, surveys, observations and experiments, were eligible for inclusion. We excluded secondary reports, short communications and letters to editors.
Search strategy and information sources
All studies published in the English language from January 2000 until the time of the review were considered. Based on the information published, this period is relevant for the investigation into the application of artificial intelligence in psychotherapy. The databases used to search for publications were Web of Science, Scopus, Medline, Embase and PubMed. The first search was conducted in January 2019, and a follow-up search was performed in June 2019 to ensure that all relevant studies were captured. The keywords “Artificial Intelligence”, “Psychology”, “Psychotherapy”, “Computer” and “Technology” were used to search the databases to ensure that the review was sufficiently sensitive, wherein both text words and controlled vocabulary were employed to perform the search. Additional relevant publications were identified through manual scanning of the references of the identified papers.
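The combination of keywords described above can be sketched as a boolean query string. The exact syntax accepted by each database differs, so the snippet below is only an illustrative assumption (the `PUBYEAR` filter, for example, follows Scopus-style syntax and is not taken from the review itself):

```python
# The review's five search keywords, combined into a single boolean query
# with a publication-year filter covering January 2000 onward.
KEYWORDS = ["Artificial Intelligence", "Psychology", "Psychotherapy",
            "Computer", "Technology"]

def build_query(keywords, start_year=2000):
    # Quote each phrase, join with OR, and AND-combine with a year filter.
    quoted = " OR ".join(f'"{k}"' for k in keywords)
    return f"({quoted}) AND PUBYEAR > {start_year - 1}"

print(build_query(KEYWORDS))
```

In practice each of the five databases (Web of Science, Scopus, Medline, Embase, PubMed) would need its own field tags and controlled-vocabulary terms; this sketch only shows how the text-word portion of the search is assembled.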
Selection of sources of evidence
The researchers independently verified the relevance of all studies identified during the search of the databases. First, they examined the publications’ titles and abstracts to ensure that they were suitable. Then, they inspected the articles’ full texts to identify their eligibility for inclusion, in line with the inclusion criteria.
Quality assessment
For the quality assessment of the selected articles, a quality assessment tool was used covering three aspects: study design, characteristics of the sample, and data collection tools. Each aspect was rated 0-3, with a higher score indicating higher methodological quality. Cut-off scores on the 0-9 total were used to classify each article as low, medium or high quality, where 0-3 indicated low quality, 4-6 medium quality and 7-9 high quality.
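The scoring scheme above can be expressed as a minimal sketch; the function name and the order of the three aspects are illustrative and not part of the published tool:

```python
# Three aspects (study design, sample characteristics, data collection
# tools), each rated 0-3, summed to a 0-9 total and mapped onto bands.
def quality_band(design: int, sample: int, tools: int) -> str:
    for score in (design, sample, tools):
        if not 0 <= score <= 3:
            raise ValueError("each aspect is rated on a 0-3 scale")
    total = design + sample + tools
    if total <= 3:
        return "low"
    if total <= 6:
        return "medium"
    return "high"
```

For example, an article rated 2 on each aspect totals 6 and falls in the medium band, while 3-3-1 totals 7 and is classed as high quality.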
Data analysis
The authors of the present study designed a data extraction form to capture details concerning the purpose, design and method of the studies, the analysis, and the main findings. The form also collected information on the use of IT and AI tools for assessment, training and treatment, as well as the psychological issues, techniques and application methods involved. Information on attitudes towards the use of AI in psychology was also extracted. After extraction of the required data, the findings of the eligible studies were synthesized (Fig. 1).

Artificial intelligence and psychology

Many psychologists think that the discipline of artificial intelligence has the capability to assimilate all of the marvels created by the human mind. This functionalist outlook treats the human mind as a symbolic system and psychology as the study of the numerous computational processes by which mental activities are constructed, structured, ordered and deciphered. Although it is still debated how various psychological phenomena can be explained in computational terms, it remains to be seen which AI concepts are applicable to computer modeling methodologies from a psychologist's perspective (Wang et al. 2016).
The role of AI in psychological science is still underestimated among specialists (Tahan 2019), despite the innovative wave that has emerged around the use of AI in psychotherapy. For instance, data mining techniques have been utilized to differentiate groups at suicidal risk from those not at risk (Morales et al. 2017), to provide a self-help program for students with depression and anxiety (Fitzpatrick et al. 2017), to provide a companion for patients with mental health problems during or after ambulatory treatment (Glomann et al. 2019), to establish fully automated diagnosis using fuzzy logic that represents psychiatric reasoning (Kravets et al. 2017), and to provide online web- and social media-based social therapy to people with mental health problems (D’Alfonso et al. 2017).

Artificial intelligence technology and development of psychotherapy tools for clinical assessment, training, and treatment

The first-ever simulation of psychotherapy appeared in 1966, when a human-computer interface named ELIZA was developed. The program was designed to impersonate empathic communication, based on Carl Rogers' style of therapeutic communication. A question-answer format was used to respond to what the user typed on a keyboard: ELIZA relied on syntactic pattern matching to generate formulated answers, a model that could only mimic conversation (Luxton 2014; Servan-Schreiber 1986).
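The pattern-matching approach ELIZA used can be illustrated with a toy sketch. The rules below are invented for illustration and are not the original 1966 script:

```python
import re

# Each rule pairs an input pattern with a Rogerian-style reflection
# template; captured groups are echoed back to the user.
RULES = [
    (r"I need (.*)", "Why do you need {0}?"),
    (r"I am (.*)", "How long have you been {0}?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = re.match(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    return "Please, go on."  # default prompt when no rule matches

print(respond("I need a holiday"))  # → "Why do you need a holiday?"
```

As the sketch makes plain, no understanding is involved: the program merely transforms the surface form of the input, which is why ELIZA could only mimic conversation.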
In the 1970s another computer-assisted program was developed at Stanford University by Dr Kenneth M. Colby. The program, named PARRY, mimicked a person with paranoid schizophrenia and, just like ELIZA, could converse with others (Güzeldere and Franchi 1995). This program passed the Turing Test, a test for judging the intelligence of machines: the requirement is to imitate real-time written human conversation with a human judge in such a manner that the judge cannot tell whether the responses come from a program or from a human being (Turing 1950). Teuscher and Hofstadter (2006) reported that, in PARRY's case, judges could not differentiate between the AI program and a real patient with paranoid schizophrenia.
Woebot, developed and launched as a standalone iOS app in 2017, is a mental health chatbot for depression and anxiety that works as a 24/7 therapy chatbot. Built with natural language processing, it combines well-crafted writing with therapeutic elements, using the framework of cognitive-behavioral therapy (CBT) to provide scripted responses and create therapeutic conversation for those who access it (Zaidi 2018).
Previously, AI was considered effective for mechanical tasks only; there is now evidence, however, that AI can reflect emotional intelligence. Research has been conducted to enable chatbots to perceive, recognize and respond to human emotions, with chatbots responding to the shades and tones of human vocal and facial expressions. A machine with a human-like appearance is called a humanoid. One such humanoid, Pepper, was developed in 2016; its software is said to be emotional, engineered to sense and understand anger, sadness and other feelings, which in turn guides its interaction with a human being (Pardes 2018).
In view of the fact that a significant segment of psychotherapeutic work involves conversation, AI bots can participate in conversations with clients while displaying emotional intelligence, and can use empathy to lessen the clients' worries. This indicates that AI goes beyond mechanical tasks and is effective in tasks dealing with the interpretation and exchange of emotions.
A further aspect is that psychotherapy is not purely emotional and abstract; it also involves features such as reasoning and learning, which makes it possible for AI bots to deliver therapy effectively to people with mental health issues. This means that knowledge which is structural and technical can exist in harmony with knowledge that is conceptual and emotion-based.
At present, advanced technologies offer virtual human avatars, i.e. virtual reality simulations of human beings. Combining virtual reality simulation, language processing and knowledge-based artificial intelligence, such systems can conduct interactive, intelligent dialogue. The Institute for Creative Technologies at the University of Southern California has developed life-like virtual-reality human clients for clinical and supervisory training: these virtual clients impersonate the symptoms of psychological disorders and interact with trainee therapists (Rizzo et al. 2011). The program provides customized training at various levels for mental health professionals, although more research is needed to establish its effectiveness. Such AI virtual-reality patients could also be used for other patient-therapist interactions in mental health care, such as psychological assessment, testing and treatment.
Virtual reality avatars are being used for awareness-raising and support mechanisms for mental health patients. SimCoach, for example, was developed to link military personnel and their family members to mental health care and other well-being resources (DeAngelis 2012; Rizzo et al. 2011). In future, such AI avatars are expected to be able to provide psychological services anywhere, subject only to the availability and accessibility of internet services. Any individual in need of psychological help could then conveniently access AI virtual avatars/consultants at any time and obtain psychological assessments, recommendations, referrals and so on, tailored to their individual needs. This method would be far more interactive and engaging than static, internet-based consultations. Further, clients could conveniently seek help from a virtual program in their own homes, so that issues such as stigmatization would not weigh on them or stop them from taking up psychological services.
Researchers have described some methodological concerns to be considered when designing systems for the automatic detection of voice pathology, so that comparisons can be made with previous or future experiments. The proposed methodology is built around the Massachusetts Eye & Ear Infirmary (MEEI) Voice Disorders Database, to date the only commercially available one, and discusses key points about this database. Any experiment should have a cross-validation strategy, and results should supply, along with the final confusion matrix, confidence intervals for all measures. Detector performance curves such as detection error trade-off (DET) and receiver operating characteristic (ROC) plots are also considered. An example of the methodology is provided, with an experiment based on short-term parameters and multi-layer perceptrons (Saenz-Lechon et al. 2006).
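The reporting practice described by Saenz-Lechon et al., a confusion matrix accompanied by confidence intervals, can be sketched as follows. The helper names and the normal-approximation interval are illustrative assumptions, not the authors' code, and the labels here are made up rather than drawn from the MEEI database:

```python
import math

def confusion_matrix(y_true, y_pred):
    # Counts for a binary detector: 1 = pathological, 0 = normal voice.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def accuracy_ci(y_true, y_pred, z=1.96):
    # Accuracy with a 95% normal-approximation confidence interval.
    tp, tn, fp, fn = confusion_matrix(y_true, y_pred)
    n = tp + tn + fp + fn
    acc = (tp + tn) / n
    half = z * math.sqrt(acc * (1 - acc) / n)
    return acc, (max(0.0, acc - half), min(1.0, acc + half))
```

In a full experiment these measures would be computed on held-out cross-validation folds, and the scores behind the decisions would additionally be swept over thresholds to trace the DET and ROC curves the authors recommend.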
Kiosk-based computerized screening for mental health can be effective where many people must be screened in a short span of time, e.g. during natural calamities, emergencies and military operations. Psychological assessments could be made more easily, efficiently and rigorously thanks to the capacity of AI technology to process complex data, which may also reduce uncertainty and error in screening.
A new field has come into existence in the social sciences: cyberpsychology, or the psychology of cyberspace. Studying people's reactions and behavior in cyberspace is an evolving line of inquiry created by computers and online networks (Shaw and Gaines 2005). Cyberpsychology research addresses two main problems: enhanced applications of IT to counter various psychological issues, and the psychological and psychiatric consequences for users of interacting with the various tools of cyberspace.

Knowledge change: potential of AI in psychotherapy and concerns of psychologists

The claims and conception of knowledge have changed over time, and the advances of AI as a psychotherapeutic tool are proof of this. Some time ago it was presumed that AI would never enter the realm of psychotherapy, because computers were thought to carry out only mechanical knowledge whereas humans possess emotional intelligence. Recent research evidence shows that it is not only humans who possess the capacity for emotional intelligence; the new wave of computers and machines also possesses abilities related to it. This is a reminder that, with development and research, knowledge can be achieved and transformed; even apparently unfeasible and impractical ideas may hatch and transform into a valuable reality (Tahan and Zygoulis 2020).
Despite the evidence available, some theorists hold the view that AI bots do not have emotional intelligence. Emotional intelligence has been described as the capacity to monitor one's own feelings and emotions as well as those of others, to differentiate among them, and to use this information to guide cognitive and behavioral responses (Salovey and Mayer 1990). With reference to this definition, it can be argued that AI bots do possess emotional intelligence. On the other hand, the same definition may be used to support the argument that AI bots do not actually recognize or feel emotions and only generate responses based on the information they receive and their programming. The question thus remains controversial, but it cannot be overlooked that technology has made considerable advances in exploring emotional intelligence through AI.
Simultaneously, another noteworthy and, for humans, alarming aspect is that with these AI bots the field of psychotherapy is no longer confined to professional human psychotherapists: psychotherapists now have to compete with robots to be effective in treating people with mental illness. Formerly, people seeking mental health services had to visit the physical clinics of psychotherapists in order to receive treatment. With AI bots, by contrast, it has become very convenient to receive mental health care at home; additionally, AI software and technology can cater to a broader audience. In this way, AI technology has broadened access to psychotherapeutic knowledge (Carr 2008).

Reluctance among psychologists to use AI in psychotherapy

Despite the applications being introduced into clinical interactions, reluctance to use AI is found among mental health professionals, and several fascinating questions arise in this regard. Generally, patients develop a strong bond with the therapist; from a Freudian perspective, transference and countertransference also develop in the patient-therapist relationship. These often facilitate treatment, but at times appropriate care must be taken, especially when termination of therapy is necessary. There is, however, grave concern about whether patients would be able to develop therapeutic attachment and trust towards machine practitioners equipped with AI. Would patients perceive that they are dealing with a superior machine whose knowledge and capabilities surpass those of human beings?
Schipor et al. (2008) argue that the shortcomings of applying computers in psychotherapy are primarily ethical: the patient-therapist bond is missing, and the absence of a human being from this form of relationship is controversial. In a similar vein, the originator of ELIZA maintained that computers and machines should not be permitted to make crucial decisions, because they lack the compassion and wisdom possessed by humans (Weizenbaum 1976). However, many researchers have argued, as mentioned earlier, that AI machines and computers can experience emotions or, at a minimum, that the detection and manifestation of emotions can be imitated and demonstrated (Bartneck et al. 2008). Relational and social intimacy, empathy, and the therapeutic bond are significant aspects affecting clinical interaction and treatment (Lambert and Barley 2001).
Free, unsupervised access to such software by the patient can be compared to drug prescription and administration, which is conducted only under medical supervision because of predisposing risk factors. Computer models tend to oversimplify, and the odds of eroding the adaptability and ability of the human expert are considerable. Consequently, as the technology advances, there is a growing likelihood that mental health professionals may become unable to comprehend patient issues without the aid of a computer. The benefits of computer-based applications at any level in psychology cannot be ignored, but appropriate care is necessary given the rising expectations of this approach. Given the gravity and complexity of psychological issues, the limitations of such systems cannot be neglected; problems such as nightmares, compulsive gambling, tics, and enuresis, for instance, have not been resolved to date. Despite the utility of these systems, continuous updating is therefore required, and for now substituting them for the specialist is not feasible.
Furthermore, another source of reluctance and concern is the cultural differences and expectations that may influence therapeutic outcomes. This raises the question of whether AI software or practitioners can adapt therapeutic techniques to cultural contexts and expectations; further research is needed in this regard.
There has also been disinclination among psychologists and psychotherapists to use artificial intelligence because of a perceived threat of employment insecurity, of being replaced by computer-led programs and artificial intelligence, and of other associated consequences. Evidence from various job sectors such as banking, customer service, law, and health indicates that automation, information technology and artificial intelligence are replacing or reducing employees (Brynjolfsson and McAfee 2011; Markoff 2011). Despite the worry that virtual AI practitioners or software may replace mental health professionals, many researchers believe that they will assist and facilitate rather than completely replace them. Nevertheless, their introduction is expected to reduce the cost of psychological services in the coming years. Whatever the fears, the discipline of psychology has always evolved and made use of technological innovation to improve practice and research in mental health.
Another primary concern in AI-assisted psychotherapy is the quality of treatment. There is long-standing organizational resistance as well as passive resistance, but in the future human-machine interaction is expected to be far more frequent than it is today. When the first automated diagnostic system, MYCIN, was introduced, it met with much criticism and rejection (Rialle et al. 1994), and to steer clear of possible misdiagnosis it was never put into practice (Frost 2008; Coyle et al. 2005). Diagnostic accuracy of 99% may be achievable with an expert system, but in similar circumstances practitioners might still need a comprehensive record, indicating that such systems are required only as an aid in the process rather than as a replacement.
Justifying the substantial capital investment required to implement these complex systems, and the dilemma of whether they will actually deliver on the real needs of the psychology expert, are inevitable given the intricacy and purpose of distinctive IT applications in real-life situations and problems. The adaptability of the user to system complexity is a further problem in this context; tests of system efficiency using a minimal prototype can perhaps dispel the doubt. The issue of user rejection can be addressed by implementing specific features of the human-computer interface (HCI). Even at the gadget level, a good information structure grounded in a proficient expert system is needed, without which even the electronic documentation known as Help cannot resolve the issue.
A collaborative attitude is expected from the application of intelligent tutoring systems (ITS). Expert system-dependent ITS, available at the touch of a finger, have grown out of the blend of AI and psychology; to boost proficiency in delivering tailored assistance to teachers, new resources must be devised and appropriately applied in the context of an ITS (Rialle et al. 1994). Psychology-based computer applications mainly concern psychotherapy, and here the computer is far more prevalent than in comparable settings. The roots of AI in cognitive models have prepared mental health professionals to employ the computer only as an ancillary support system during treatment. A battery of broad-based applications exists, which can be classified as follows: dedicated internet sites for self-help; computer-administered therapy; web-based applications for identification and evaluation via the internet; advocacy; adjunctive palmtop computer therapy; consultation via the internet; interactive voice messaging programs; therapy based on simulated reality; biofeedback treatment through ambulatory bodily monitoring; and computer-generated support group sites. The basic advantages of using information technology in psychology and psychotherapy are extended and supervised treatment for patients, shorter contact time with mental health professionals, cost-effective treatment modalities, and devices that aid in making management choices. The idea of employing technology and computerization to support mental health professionals is not novel; it is already used in one way or another, for instance in the administration of lengthy interviews (Mozera et al. 2019).
On a positive note, the advancement and integration of AI technology into the practice of psychology and psychotherapy may completely transform mental health services and bring social benefits. Psychologists and mental health practitioners need to play an active role in the development and renaissance of AI technology.

Artificial intelligence applications in psychology

Hypnotherapy may be carried out classically, but excellent outcomes are also achieved by applying electronic techniques either partially or totally; involving the computer fully automates the audio/video flow, reflecting the higher IT involvement. Health care services have also given psychotherapists access to virtual reality; specialists have endorsed the role of VR and think that it may further refine and advance clinical psychology. The outcomes of computer-based treatment of panic and phobia disorders, however, fell short of expectations, even though the treatments were highly economical (Tahan 2019).
Children's age-based educational games have become common and hold great potential that psychiatrists should explore. The concept of employing games in education, with increasing levels of complexity, is being applied, and these games are more often than not based on progressive artificial intelligence models. Mental health professionals need to integrate these strategies. Studies on the applicability of 3D games as a dedicated therapy instrument have been piloted (Kowalski 2011). The first outcomes seem encouraging; nonetheless, using these strategies to reach a treatment solution is an uphill task, so psychiatrist-guided supervision of changes to the behavioral rules of therapeutic games is required from time to time.
The advantage of AI dynamics over psychology can be attributed to its strong mathematical backbone and its essential industrial applications. The 1980s saw expert systems rise as a market asset, following production systems (D’Alfonso et al. 2017). Expert systems and psychology cannot be separated: over time, IT experts have realized that competent rule extraction from people requires new techniques. Here repertory grid elicitation (RepGrid), which yields data for both quantitative and qualitative analysis, was identified and integrated into the field's historical understanding. From the psychological perspective, expert programs may be implemented together with personal construct awareness. One must not forget that these psychological approaches do not come cheap, so a middle path has to be taken for skilled programs with generalized guidelines about humans: a cognitive model may be developed first, followed later by a methodology for self-acquiring guidelines through candid interactions with patients.
These skilled programs are multifaceted, sensitive applications that select a set of guidelines based on human expert experience in given circumstances. Apprehensions about their implementation exist, but generally they are applied to a set of guidelines and efforts are eventually made to bring clarity to this set. On the computing side, high-performance computing such as GRID or CLOUD can achieve great heights. However, human thought is a very complex subject to capture, especially when it cannot be put into words, and the absence of a communication network may additionally restrain the transfer of knowledge. From a theoretical viewpoint the applicability of these systems is vast, because the basic idea behind the development of AI was to imitate human thinking; but to achieve this feat we still have a long way to go. Numerous divisions of AI are trying to replicate parts of human life and behavior, from genetic procedures and neural systems to artificial life and game theory. Each skilled program essentially requires three significant constituents: a knowledge base, an inference engine, and an interface. In this respect, the universal translator seems to be the first application in the pipeline: the knowledge level draws on a vast and varied database like Google, complemented by an equally formidable expert system. The application of expert systems in speech therapy can further broaden their use, and researchers also hold that, in the light of the above, an improved fuzzy expert system could be applied to at-home treatment of the patient (De Mello and De Souza 2019). Different AI techniques are used in general psychiatry; for instance, even low-quality input data suffice for the correct diagnosis of dyslexia when fuzzy and genetic algorithms are combined. The patient's voice itself can also be used as an auxiliary source of information in creating a good anamnesis.
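The fuzzy-logic idea mentioned above can be illustrated with a minimal sketch of triangular membership functions; the break-points and labels below are invented for illustration and do not reproduce the cited dyslexia system:

```python
def triangular(x, a, b, c):
    # Membership rises linearly from a to the peak at b, then falls to c.
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(score):
    # Map a raw indicator in [0, 1] onto graded degrees of three labels,
    # which a fuzzy rule base could then combine with other indicators.
    return {
        "low": triangular(score, -0.5, 0.0, 0.5),
        "medium": triangular(score, 0.25, 0.5, 0.75),
        "high": triangular(score, 0.5, 1.0, 1.5),
    }
```

The tolerance to imprecise input mentioned in the text comes from this graded representation: a noisy score of 0.45 still carries high membership in "medium", so small measurement errors do not flip the downstream diagnosis outright.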
Voice pathology analysis is also being used to find important psychological cues; results from the Massachusetts Eye & Ear Infirmary (MEEI) Voice Disorders Database are one such example (Dwyer et al. 2018). These outcomes cannot be treated in isolation, as multiple factors can be responsible for a change in a patient's voice; used concomitantly with other measurements, however, they can provide valuable information about the patient.

Interpersonal information retrieval program

Experts in psychology and the social sciences need to keep up with the current state of the digital world. Collecting data on people or communities is central, and newer methods must be devised; today, information retrieval from the internet is also possible. Social life is being partially or fully virtualized, to the point that personal details can easily be accessed online. This information can be divided into explicit and implicit data: the former is required by the social network, so the user is well aware of the content uploaded for public access and of the implications of creating content that is partly or completely public; the latter comprises information given through interaction with acquaintances on the social network platform.
The inability to distinguish between the computer-generated world and real-life connections with people is the biggest threat to users, because they may not be discerning about the type and confidentiality of the information they make available. Since the virtual space holds this information, a boundary, technically called an interface, must be built into social networks, so that accessing a user's personal details is not possible without explicit permission. The proposed system comprises two segments: an HCI-based interface using AI agents, and an information retrieval system. Expert systems and HCI techniques make the computer convenient to use. Individuals with higher emotional intelligence obtain direct and indirect benefits without much investment; experts therefore stress the need for a computer capable of emulating these kinds of abilities.

An information retrieval system – IRS

The IRS comprises four stages: data gathering, indexing, searching, and presentation. Data gathering retrieves information from cyberspace according to user-set rules; occasionally it runs in response to a query, using autonomous units that transfer the knowledge to the underlying database. Indexing builds an immediately searchable database; although various indexing approaches exist, the relevance of each depends mostly on the size of the data set, so a traditional database management system (DBMS) is used to store the data. Searching applies a dedicated set of AI-based operations for each implicit DBMS user. Presentation employs a graphics-based interface for visual data representation; clustering procedures are also used. Cyberpsychology is a related research field, still in its infancy, with unlimited potential given the high speed of technological development (Kowalski 2011).
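The stages of the IRS can be sketched as a toy pipeline in Python; the hard-coded document set and the in-memory inverted index are simplified stand-ins for a real crawler and DBMS:

```python
# Toy information-retrieval pipeline: gather -> index -> search -> present.
# The document collection is hypothetical; a real system would crawl
# cyberspace per user-set rules and store results in a DBMS.

# 1. Gathering: documents retrieved according to user-set rules (hard-coded here).
docs = {
    1: "coping strategies for anxiety and stress",
    2: "sleep hygiene and anxiety in young adults",
    3: "habit tracking for productivity",
}

# 2. Indexing: build an inverted index mapping each term to the ids
#    of the documents that contain it.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

# 3. Searching: return the ids of documents containing every query term.
def search(query):
    results = None
    for term in query.split():
        hits = index.get(term, set())
        results = hits if results is None else results & hits
    return sorted(results or [])

# 4. Presentation: a textual listing stands in for a graphical interface.
for doc_id in search("anxiety stress"):
    print(doc_id, docs[doc_id])
```

The inverted index is the standard structure behind the "immediately searchable database" the text mentions; the set intersection in `search` is the simplest AND-style query operation, which AI-based ranking would refine in a full system.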

Results

In order to assess the existence and application of registered computer programs designed for psychotherapy, a review was conducted; it is summarized in Table 1.
Table 1 presents an evidence-based perspective on each application/program of artificial intelligence in the psychotherapeutic world. The table indicates which programs are inactive or active, along with their functions and purpose; the techniques recommended for psychological issues and the methods employed in conjunction with a computer are also represented. This makes the analysis more comprehensive and better illustrates the application of AI in psychotherapy. Four of the AI programs are of high quality for assessing mental health issues. ELIZA and PARRY are no longer active or in use; they were tested to provide evidence that AI software may be helpful in psychotherapy, with ELIZA acting as a therapist and PARRY as a patient with paranoid schizophrenia. This paved the way for the development of further programs. Woebot has been used significantly for depression and anxiety, and it has proven effective for the amelioration of symptoms of depression and schizophrenia. Wysa is a chatbot that helps users reframe and cope with negative feelings and organize their thoughts. Woebot has been effective in assisting with relationships, grief and depression, while Wysa has been effective for people suffering from stress, anxiety and sleep loss. Compared with Woebot, Wysa is better at providing specific exercises, whether physical/behavioral or mental, to overcome difficult situations and depression. ORCHA (Organisation for the Review of Care and Health Applications) recommends Wysa as the best app for managing anxiety and stress during COVID-19.
Fabulous* takes a holistic approach to motivate users to be more productive and have higher energy. The app is more than just a habit tracker, or a way to create new rituals – it is a personal coach and happiness trainer.
Your.MD* is a health tracker and symptom checker powered by AI which has been developed by doctors and data scientists. It provides instant personalized health information and services, whenever and wherever it is needed, for free.
eQuoo* is an evidence-based Emotional Fitness Game, combining the excitement and joy of gaming and the expertise of mental health professionals to provide a new form of mobile prevention and therapy for young adults aged 18-28 years.

Conclusions

The present paper addresses the application of artificial intelligence in psychotherapy. In this digitized world, the use of AI in mental health care is unquestionably a growing area that is likely to have a weighty impact on psychological and psychotherapeutic practice in the years to come. Despite psychologists' reluctance and concerns regarding the use of AI in clinical interactions, diagnosis, and therapy, and regarding its ethical use in psychological practice, AI appears to be a beneficial health care tool. AI may help to promote patients' health and enhance the practice of mental health professionals and society as a whole by improving the care and conditions of human beings. The literature reveals that AI-supported programs and systems designed to assist mental health care professionals can help in the provision of psychological services. In the near future, extensive use of AI may further improve the applications and outcomes of psychological services. Evidence-based research is needed to generate confirmed findings. Further, mental health care professionals need a positive attitude in order to direct others toward AI technologies in practice and to build evidence through research, for the renaissance of psychology and for the benefit of patients, the profession, and society as a whole.

Data availability statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Compliance with ethical standards

As a theoretical article, this study relied only on authentic sources published in well-reputed, peer-reviewed journals and books; the information search involved no distorted material and formed the foundation of the research. Further, data from previously published research were used, for which informed consent had been obtained by the primary investigators. Only reports and articles permissible for reproduction for research purposes have been used, and since composite narrative writing was employed, this has in no manner directly affected the people concerned. No human participants were involved in this study.

Disclosures

This research received no external funding.
Institutional review board statement: Not applicable.
The authors declare no conflict of interest.
References
1. Bartneck C, Lyons MJ, Saerbeck M. The relationship between emotion models and artificial intelligence. Proceedings of the Workshop on the Role of Emotions in Adaptive Behaviour and Cognitive Robotics in affiliation with the 10th International Conference on Simulation of Adaptive Behavior: From animals to animates (SAB 2008). Osaka, Japan 2008.
2. Brynjolfsson E, McAfee A. Race against the machine: How the digital revolution is accelerating innovation, driving productivity, and irreversibly transforming employment and the economy. MIT Sloan School of Management Cambridge, MA 2011. Retrieved from http://ebusiness.mit.edu/research/Briefs/Brynjolfsson_McAfee_Race_Against_the_Machine.pdf
3. Buchanan BG. A (very) brief history of artificial intelligence. AI Magazine 2005; 26: 53-60.
4. Carr N. Is Google making us stupid? Yearbook of the National Society for the Study of Education 2008; 107: 89-94.
5. Ćosić K, Popović S, Šarlija M, et al. Artificial intelligence in prediction of mental health disorders induced by the COVID-19 pandemic among health care workers. Croat Med J 2020; 61: 279-288.
6. Coyle D, Matthews M, Sharry J, et al. Personal investigator: A therapeutic 3D, the game for adolescent psychotherapy. J Interactive Technology Smart Educ 2005; 2: 73-88.
7. D’Alfonso S, Echarri SO, Simon R, et al. Artificial intelligence-assisted online social therapy for youth mental health. Front Psychol 2017; 8: 796.
8. de Mello FL, de Souza SA. Psychotherapy and artificial intelligence: A proposal for alignment. Front Psychol 2019; 10: 263.
9. DeAngelis T. A second life for practice? Monitor Psychol 2012; 43.
10. Dwyer DB, Falkai P, Koutsouleris N. Machine learning approaches for clinical psychology and psychiatry. Annu Rev Clin Psychol 2018; 14: 91-118.
11. Ewbank MP, Cummins R, Tablan V, et al. Quantifying the association between psychotherapy content and clinical outcomes using deep learning. JAMA Psychiatry 2020; 77: 35-43.
12. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Ment Health 2017; 4: e19.
13. Frost B. Computer and technology enhanced hypnotherapy and psychotherapy. A review of current and emerging technologies 2008. Available from: www.neuroinnovations.com/ctep/technology_and_computer_enhanced _psychotherapy.pdf
14. Glomann L, Hager V, Lukas CA, Berking M. Patient-centered design of an e-mental health app. In: Advances in Artificial Intelligence, Software and Systems Engineering. Ahram TZ (Ed.). Springer International Publishing, New York, NY 2019; 264-271.
15. Graham S, Depp C, Lee EE, et al. Artificial intelligence for mental health and mental illnesses: an overview. Curr Psychiatry Rep 2019; 21: 116.
16. Güzeldere G, Franchi S. Dialogues with colorful “personalities” of early AI. Stanford Human Rev 1995; 4: 161-169.
17. Hamet P, Tremblay J. Artificial intelligence in medicine. Metabolism 2017; 69S: S36-S40.
18. Hariman K, Ventriglio A, Bhugra D. The future of digital psychiatry. Curr Psychiatry Rep 2019; 21: 88.
19. Inkster B, Sarda S, Subramanian V. An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: real-world data evaluation mixed-methods study. JMIR Mhealth Uhealth 2018; 6: e12106.
20. Just MA, Pan L, Cherkassky VL, et al. Machine learning of neural representations of suicide and emotion concepts identifies suicidal youth. Nat Hum Behav 2017; 1: 911-919.
21. Kowalski G. Information retrieval architecture and algorithms. Springer Science Business Media, Ashburn, VA, USA 2011.
22. Kravets A, Poplavskaya O, Lempert L, et al. The development of medical diagnostics module for psychotherapeutic practice. In: Creativity in Intelligent Technologies and Data Science. Kravets AG, Shcherbakov M, Parygin D, Groumpos PP (Eds.). Springer International Publishing, New York, NY 2017; 872-883.
23. Lambert MJ, Barley DE. Research summary on the therapeutic relationship and psychotherapy outcome. Psychotherapy: Theory, Research, Practice, Training 2001; 38: 357-361.
24. Lucas GM, Rizzo A, Gratch J, et al. Reporting mental health symptoms: breaking down barriers to care with virtual human interviewers. Front Robot AI 2017; 4: 51.
25. Luxton DD. Artificial intelligence in psychological practice: Current and future applications and implications. Prof Psychol Res Pr 2014; 45: 332-339.
26. Markoff J. Armies of expensive lawyers, replaced by cheaper software. The New York Times 2011. Retrieved from http://www.nytimes.com/2011/03/05/science/05legal.html
27. Morales S, Barros J, Echávarri O, et al. Acute mental discomfort associated with suicide behavior in a clinical sample of patients with affective disorders: ascertaining critical variables using artificial intelligence tools. Front Psychiatry 2017; 8: 7.
28. Mozer MC, Wiseheart M, Novikoff TP. Artificial intelligence to support human instruction. Proc Natl Acad Sci USA 2019; 116: 3953-3955.
29. Pardes A. The Emotional Chatbots Are Here to Probe Our Feelings. 2018. Retrieved from https://www.wired.com/story/replika-open-source/
30. Rialle V, Stip E, O’Connor K. Computer-mediated psychotherapy ethical issues and difficulties in implementation. Humane Med 1994; 10: 185-192.
31. Rizzo AA, Lange B, Buckwalter JG, et al. An intelligent virtual human system for providing healthcare information and support. Stud Health Technol Inform 2011; 163: 503-509.
32. Saenz-Lechon N, Godino-Llorente JI, Gomez P, et al. Methodological issues in the development of automatic systems for voice pathology detection. Biomed Signal Proc Control 2006; 1: 120-128.
33. Salovey P, Mayer JD. Emotional intelligence. Imagin Cogn Pers 1990; 9: 185-211.
34. Schipor OA, Pentiuc St. G, Schipor DM. A Fuzzy Rules Base for Computer Based Speech Therapy. Proceedings of 9th International Conference on Development and Application Systems. Suceava, Romania 2008; 305-308.
35. Schultebraucks K, Shalev AY, Michopoulos V, et al. A validated predictive algorithm of post-traumatic stress course following emergency department admission after a traumatic stressor. Nat Med 2020; 26: 1084-1088.
36. Servan-Schreiber D. Artificial intelligence and psychiatry. J Nerv Ment Dis 1986; 174: 191-202.
37. Shaw MLG, Gaines BR. Expertise and expert systems: emulating psychological processes. Knowledge Science Institute, University of Calgary 2005. Available from http://pages.cpsc.ucalgary.ca/~gaines/reports/PSYCH/Expertise/Expertise.pdf
38. Shukla S, Jaiswal V. Applicability of artificial intelligence in different fields of life. IJSER 2013; 1: 2347-3878.
39. Tahan M. An overview of artificial intelligence applications and psychology. Avicenna J Neuropsychophysiol 2018; 5: 3-10.
40. Tahan M. Artificial intelligence applications and psychology: an overview. Neuropsychopharmacol Hung 2019; 21: 119-126.
41. Tahan M, Zygoulis P. Artificial intelligence and clinical psychology – current trends. J Clin Develop Psychol 2020; 2: 31-48.
42. Teuscher C, Hofstadter DR. Alan Turing: Life and legacy of a great thinker. Springer, New York, NY 2006.
43. Turing AM. Computing machinery and intelligence. Mind 1950; 59: 433-460.
44. Wang Z, Xie L, Lu T. Research progress of artificial psychology and artificial emotion in China. CAAI Trans Intell Technol 2016; 1: 355-365.
45. Weizenbaum J. Computer power and human reason: From judgment to calculation. Freeman & Co., San Francisco, CA 1976.
46. Zaidi D. Woebot - World’s First Mental Health Chatbot gets $8 million in Funding. 2018. Retrieved from https://chatbotsmagazine.com/woebot-worlds-first-mental-health-chatbot-gets-8-million-in-funding-3369a1e7ba9
47. Zhou S, Zhao J, Zhang L. Application of artificial intelligence on psychological interventions and diagnosis: An overview. Front Psychiatry 2022; 13: 811665.
Copyright: © 2024 Termedia Sp. z o. o. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License (http://creativecommons.org/licenses/by-nc-sa/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material, provided the original work is properly cited and states its license.