
A Visual Sonificated Web Search Clustering Engine

Mele, Maria Laura; Liotta, Giuseppe; Di Giacomo, Emilio; Borsci, Simone; Federici, Stefano
2009

Abstract

This work aims to expand the architecture of WhatsOnWeb (WoW; Di Giacomo et al. 2007) according to state-of-the-art accessibility and usability criteria. WoW is a visual Web search engine that conveys the indexed dataset using graph-drawing methods on semantically clustered data. In previous studies (Federici et al., under revision; Federici et al. 2008), we found that the top-down result ranking of the most widespread Web search engines does not take the accessibility of information into account: this ranking highlights the distance between the quantitative order of Web Popularity (WP) and the qualitative level of accessibility of the retrieved information. Conversely, WoW semantically analyzes the search results and automatically links them in a network of concepts and sub-concepts. All of the retrieved information is presented to the user simultaneously in an interactive visual map, closing the gap between the quantitative order and the qualitative level of information. The redesign of WoW was carried out following the user-centered design methodology (Hutchins et al. 1985) and in compliance with WCAG 1.0. On this basis, we implemented a sonification algorithm that converts data relations and their spatial coordinates into non-speech sounds. Moreover, with future developments on brain–computer interfaces (BCI) in mind, a two-state navigation architecture has been adopted. With WhatsOnWeb, we aim to provide an autonomous assistive technology tool that supports independent navigation through both an integrated synthesizer for screen reading and a sonification system that conveys the geometrical–spatial representation of the information. The result is an innovative approach to Web search and result targeting that also reduces the gap between quantitative and qualitative result ranking seen on the major Web search platforms.
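As a rough illustration of the kind of coordinate-to-sound mapping a sonification algorithm performs, the sketch below maps a node's position on a visual map to a stereo pan and a log-scaled pitch. This is a hypothetical example, not the published WoW algorithm: the function name `sonify_position`, the screen dimensions, and the frequency range are all assumptions made for illustration.

```python
def sonify_position(x, y, width=800, height=600,
                    f_min=220.0, f_max=880.0):
    """Map a node's (x, y) screen coordinates to a non-speech audio cue.

    Hypothetical mapping (assumed for illustration):
    - horizontal position -> stereo pan in [-1.0, 1.0]
    - vertical position   -> pitch on a logarithmic scale
                             between f_min and f_max (in Hz)
    """
    # Left edge of the map pans fully left (-1), right edge fully right (+1).
    pan = 2.0 * (x / width) - 1.0
    # Screen y grows downward, so invert it: higher on screen = higher pitch.
    t = 1.0 - (y / height)
    # Logarithmic interpolation keeps equal vertical steps perceptually even.
    freq = f_min * (f_max / f_min) ** t
    return freq, pan
```

With the defaults above, a node at the centre of an 800x600 map yields a centred pan of 0.0 and a pitch of 440 Hz, halfway (logarithmically) between 220 and 880 Hz; the resulting (frequency, pan) pair would then drive an audio synthesis layer.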
Files for this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11391/1170677
Citations
  • PMC: n/a
  • Scopus: n/a
  • Web of Science: 1