
A collaborative situation-aware scheme based on an emergent paradigm for mobile resource recommenders

Castellano, Giovanna; Fanelli, Anna Maria
2012-01-01

Abstract

Today, handheld devices can accommodate a large number of different resources. Mobile users must therefore often expend considerable effort to search for the resources suitable for a specific circumstance, and this effort rarely leads to a satisfactory result. To ease this task, resource recommenders have been proposed in recent years. Typically, the recommendation is based on recognizing the users' current situations and suggesting the resources appropriate for those situations. The recognition task is performed by exploiting contextual information, preferably without any explicit input from the user. To this end, we propose to adopt a collaborative scheme based on an emergent paradigm. The underlying idea is that simple individual actions can lead to an emergent collective behavior that represents an implicit form of contextual information. We show how this behavior can be extracted by using a multi-agent scheme in which agents do not communicate directly with each other but rather through the environment. The multi-agent scheme is structured into three levels of information processing. The first level is based on a stigmergic paradigm, in which marking agents leave marks in the environment corresponding to the user's position. The accumulation of such marks enables the second level, a fuzzy information granulation process, in which relevant events can emerge and are captured by event agents. Finally, in the third level, a fuzzy inference process, managed by situation agents, deduces the user's situations from the underlying events. The proposed scheme is evaluated on a set of representative real scenarios related to meeting events. In all but one of the scenarios, the collaborative situation-aware scheme promptly recognizes the correct situations, demonstrating its effectiveness.
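The first two levels of the scheme can be illustrated with a minimal sketch: marking agents deposit marks at users' positions in a shared environment, marks evaporate over time so only reinforced locations persist, and a fuzzy membership function lets an event agent detect an emerging gathering. All class names, parameters, and the grid model below are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of the stigmergic level and the fuzzy event level described
# in the abstract. Names and parameter values are hypothetical.
from collections import defaultdict

class StigmergicSpace:
    """Shared environment where marking agents deposit marks at user positions."""

    def __init__(self, deposit=1.0, evaporation=0.1):
        self.intensity = defaultdict(float)  # cell -> accumulated mark intensity
        self.deposit = deposit               # intensity added per mark
        self.evaporation = evaporation       # fraction lost per time step

    def mark(self, cell):
        """A marking agent leaves a mark at the cell holding a user's position."""
        self.intensity[cell] += self.deposit

    def step(self):
        """Evaporate all marks, so only repeatedly reinforced cells persist."""
        for cell in list(self.intensity):
            self.intensity[cell] *= (1.0 - self.evaporation)

def crowding_degree(intensity, low=2.0, high=6.0):
    """Fuzzy membership of 'crowded cell' an event agent might use (linear ramp)."""
    if intensity <= low:
        return 0.0
    if intensity >= high:
        return 1.0
    return (intensity - low) / (high - low)

# Several co-located users repeatedly marking the same cell make a
# 'gathering' event emerge despite evaporation.
space = StigmergicSpace()
for _ in range(5):          # five time steps
    for _user in range(3):  # three co-located users each leave a mark
        space.mark((4, 7))
    space.step()

gathering = crowding_degree(space.intensity[(4, 7)])
```

A third level would then feed such fuzzy event degrees into rule-based inference (e.g. "if gathering is high and time is working-hours then situation is meeting"), which situation agents evaluate to pick the user's situation.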
Files for this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/131628
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 9
  • ISI: 7