Mohammad Ali Mahmoodi, Ph.D.
Hashem Sadeqi
Abstract
In a stream-of-consciousness novel, the author attempts to give the audience direct access to the characters' mental experience. The characters' mental content, which spans various levels of the mind and reaches even its pre-speech layers, is narrated across this range of levels. Since memories in the pre-speech layers of the mind are connected through association, association is one of the methods writers use to render the mentality of characters. Association thus becomes a device for creating a link between the characters' objective and subjective worlds, for depicting the constant flow of the mind from one memory and mental state to another, and for presenting an image that links in turn to other related images. This article surveys association and its related features within the pre-speech layers of the mind, its quality in stream-of-consciousness fiction, and its correspondence with the mind's mechanism. To this end, association and the rules governing it are first set out; the significance of association in the narration of such fiction and its difference from recall are then defined. Next, the ways in which mental states are presented in different methods of narration are examined through examples drawn from these works. The results of this research show that, among the different methods of stream-of-consciousness narration, interior monologue shapes associations more than the others and develops through a wide range of associations. Furthermore, reliance on the recall of memories and mental states in interior monologue counts as a weakness, because it conflicts with the nature of the pre-speech layers of the mind.
Article Type:
Original Research Article
Subject:
Contemporary Literature / Story / Novel
Received: 2019/10/23 | Accepted: 2019/10/23 | Published: 2019/10/23