Sound collection and visualization system enabled participatory and opportunistic sensing approaches

Sunao Hara, Masanobu Abe, Noboru Sonehara

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper presents a sound collection system to visualize environmental sounds that are collected using a crowdsourcing approach. An analysis of physical features is generally used to characterize sound properties; however, human beings not only analyze sounds but also connect to them emotionally. To visualize sounds according to the characteristics of the listener, we need to collect not only the raw sounds but also the subjective feelings associated with them. For this purpose, we developed a sound collection system that uses a crowdsourcing approach to collect physical sounds, their statistics, and subjective evaluations simultaneously. We then conducted a sound collection experiment with ten participants using the developed system. We collected 6,257 samples of equivalent loudness levels with their locations, and 516 samples of sounds with their locations; subjective evaluations by the participants are also included in the data. Next, we visualized the sounds on a map: the loudness levels are visualized as a color map, and the sounds are visualized as icons that indicate the sound type. Finally, we conducted a discrimination experiment on the sounds to implement automatic conversion from sounds to appropriate icons. The classifier is trained on the basis of the GMM-UBM (Gaussian Mixture Model and Universal Background Model) method. Experimental results show an F-measure of 0.52 and an AUC of 0.79.
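As a rough illustration of the classification step mentioned in the abstract, the sketch below builds a GMM-UBM-style sound classifier with scikit-learn. This is not the authors' implementation: the class names, feature dimensions, and data are invented stand-ins, and the per-class models are simply refit on class data rather than MAP-adapted from the UBM means as in the full GMM-UBM method.

```python
# Hedged sketch of GMM-UBM-style classification (assumed pipeline, not
# the paper's exact system). A Universal Background Model (UBM) is fit
# on features pooled over all sound types; each type then gets its own
# GMM, and a test clip is scored by its log-likelihood ratio vs. the UBM.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-frame acoustic features (e.g. MFCC-like
# vectors) of two hypothetical sound types; real features would be
# extracted from the recorded audio clips.
features = {
    "car":   rng.normal(loc=0.0, scale=1.0, size=(500, 12)),
    "voice": rng.normal(loc=3.0, scale=1.0, size=(500, 12)),
}

# 1) UBM: a single GMM trained on frames pooled from every class.
ubm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
ubm.fit(np.vstack(list(features.values())))

# 2) Per-class models. (True GMM-UBM MAP-adapts the UBM means toward
#    each class; refitting from scratch is a simplification here.)
models = {
    name: GaussianMixture(n_components=4, covariance_type="diag",
                          random_state=0).fit(x)
    for name, x in features.items()
}

def classify(frames: np.ndarray) -> str:
    """Return the class with the largest mean log-likelihood ratio vs. the UBM."""
    ubm_ll = ubm.score(frames)  # mean per-frame log-likelihood under the UBM
    scores = {name: m.score(frames) - ubm_ll for name, m in models.items()}
    return max(scores, key=scores.get)

test_clip = rng.normal(loc=3.0, scale=1.0, size=(100, 12))  # "voice"-like frames
print(classify(test_clip))
```

The likelihood-ratio scores could also be thresholded per class to produce the F-measure and AUC figures reported above, rather than taking a hard argmax.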

Original language: English
Title of host publication: 2015 IEEE International Conference on Pervasive Computing and Communication Workshops, PerCom Workshops 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 390-395
Number of pages: 6
ISBN (Print): 9781479984251
DOI: 10.1109/PERCOMW.2015.7134069
Publication status: Published - Jun 24, 2015
Event: 13th IEEE International Conference on Pervasive Computing and Communication, PerCom Workshops 2015 - St. Louis, United States
Duration: Mar 23, 2015 – Mar 27, 2015

Other

Other: 13th IEEE International Conference on Pervasive Computing and Communication, PerCom Workshops 2015
Country: United States
City: St. Louis
Period: 3/23/15 – 3/27/15

Fingerprint

visualization, acoustic waves, crowdsourcing, human being, listener, emotions, discrimination, statistics, evaluation, experiments, classifiers, color, area under curve

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Human-Computer Interaction
  • Health (social science)

Cite this

Hara, S., Abe, M., & Sonehara, N. (2015). Sound collection and visualization system enabled participatory and opportunistic sensing approaches. In 2015 IEEE International Conference on Pervasive Computing and Communication Workshops, PerCom Workshops 2015 (pp. 390-395). [7134069] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/PERCOMW.2015.7134069

@inproceedings{8aa99bc169034647adeead30bbdd663e,
title = "Sound collection and visualization system enabled participatory and opportunistic sensing approaches",
abstract = "This paper presents a sound collection system to visualize environmental sounds that are collected using a crowd-sourcing approach. An analysis of physical features is generally used to analyze sound properties; however, human beings not only analyze but also emotionally connect to sounds. If we want to visualize the sounds according to the characteristics of the listener, we need to collect not only the raw sound, but also the subjective feelings associated with them. For this purpose, we developed a sound collection system using a crowdsourcing approach to collect physical sounds, their statistics, and subjective evaluations simultaneously. We then conducted a sound collection experiment using the developed system on ten participants. We collected 6,257 samples of equivalent loudness levels and their locations, and 516 samples of sounds and their locations. Subjective evaluations by the participants are also included in the data. Next, we tried to visualize the sound on a map. The loudness levels are visualized as a color map and the sounds are visualized as icons which indicate the sound type. Finally, we conducted a discrimination experiment on the sound to implement a function of automatic conversion from sounds to appropriate icons. The classifier is trained on the basis of the GMM-UBM (Gaussian Mixture Model and Universal Background Model) method. Experimental results show that the F-measure is 0.52 and the AUC is 0.79.",
author = "Sunao Hara and Masanobu Abe and Noboru Sonehara",
year = "2015",
month = "6",
day = "24",
doi = "10.1109/PERCOMW.2015.7134069",
language = "English",
isbn = "9781479984251",
pages = "390--395",
booktitle = "2015 IEEE International Conference on Pervasive Computing and Communication Workshops, PerCom Workshops 2015",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - GEN

T1 - Sound collection and visualization system enabled participatory and opportunistic sensing approaches

AU - Hara, Sunao

AU - Abe, Masanobu

AU - Sonehara, Noboru

PY - 2015/6/24

Y1 - 2015/6/24

AB - This paper presents a sound collection system to visualize environmental sounds that are collected using a crowd-sourcing approach. An analysis of physical features is generally used to analyze sound properties; however, human beings not only analyze but also emotionally connect to sounds. If we want to visualize the sounds according to the characteristics of the listener, we need to collect not only the raw sound, but also the subjective feelings associated with them. For this purpose, we developed a sound collection system using a crowdsourcing approach to collect physical sounds, their statistics, and subjective evaluations simultaneously. We then conducted a sound collection experiment using the developed system on ten participants. We collected 6,257 samples of equivalent loudness levels and their locations, and 516 samples of sounds and their locations. Subjective evaluations by the participants are also included in the data. Next, we tried to visualize the sound on a map. The loudness levels are visualized as a color map and the sounds are visualized as icons which indicate the sound type. Finally, we conducted a discrimination experiment on the sound to implement a function of automatic conversion from sounds to appropriate icons. The classifier is trained on the basis of the GMM-UBM (Gaussian Mixture Model and Universal Background Model) method. Experimental results show that the F-measure is 0.52 and the AUC is 0.79.

UR - http://www.scopus.com/inward/record.url?scp=84946025031&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84946025031&partnerID=8YFLogxK

U2 - 10.1109/PERCOMW.2015.7134069

DO - 10.1109/PERCOMW.2015.7134069

M3 - Conference contribution

AN - SCOPUS:84946025031

SN - 9781479984251

SP - 390

EP - 395

BT - 2015 IEEE International Conference on Pervasive Computing and Communication Workshops, PerCom Workshops 2015

PB - Institute of Electrical and Electronics Engineers Inc.

ER -