Decoding Consciousness in Artificial Intelligence
Volume 22, Issue 1 (2024), pp. 1–9
Pub. online: 18 January 2024
Type: Philosophies of Data Science
Open Access
Received: 5 December 2023
Accepted: 16 January 2024
Published: 18 January 2024
Abstract
Whether artificial intelligence (AI) can evolve to possess consciousness is an intensely debated and researched question in philosophy, neuroscience, and AI. Understanding this complex phenomenon hinges on integrating two complementary perspectives on consciousness: the objective and the subjective. Objective perspectives rely on quantifiable measures and observable phenomena, offering a more scientific and empirical approach. This includes the use of neuroimaging technologies such as electrocorticography (ECoG), electroencephalography (EEG), and functional magnetic resonance imaging (fMRI) to study brain activity and patterns. These methods allow researchers to map and understand neural representations of linguistic, visual, acoustic, emotional, and semantic information. However, the objective approach may miss the nuances of personal experience and introspection. Subjective perspectives, on the other hand, focus on personal experiences, thoughts, and feelings. This introspective view provides insight into the individual nature of consciousness, which cannot be directly measured or observed by others. Yet the subjective approach is often criticized for its lack of empirical evidence and its reliance on personal interpretation, which may not be universally applicable or reliable. Integrating these two perspectives is essential for a comprehensive understanding of consciousness. By combining objective measures with subjective reports, we can develop a more holistic understanding of the mind.
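To make the objective, decoding-oriented approach mentioned above more concrete, the sketch below shows one common pattern in brain-decoding studies: training a cross-validated linear classifier to predict which stimulus category a participant perceived from high-dimensional neural features (e.g., fMRI voxels or EEG band power). Everything here is illustrative and hypothetical rather than drawn from any particular study; the data are synthetic stand-ins for real recordings.

```python
# Minimal, hypothetical sketch of stimulus decoding from neural features.
# Synthetic data only; feature counts and model choices are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_features = 200, 500           # e.g., 200 trials x 500 voxels/channels
labels = rng.integers(0, 2, n_trials)     # two stimulus categories (say, visual vs. acoustic)

# Synthetic "neural" data: noise plus a small label-dependent signal
# in a subset of features, mimicking a distributed neural representation.
X = rng.normal(size=(n_trials, n_features))
X[:, :20] += labels[:, None] * 0.5

# Standardize features, then fit a linear decoder with 5-fold cross-validation.
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, X, labels, cv=5)

# Accuracy reliably above chance (0.5) indicates that the recorded features
# carry decodable information about the stimulus category.
print(f"Mean cross-validated decoding accuracy: {scores.mean():.2f}")
```

This kind of decoding result is exactly what the objective perspective can deliver, and also what it cannot: it establishes that information about a stimulus is present in measured brain activity, without speaking to the subjective experience that accompanies it.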