

Schedule in November 2020


The 51st Perceptual Frontier Seminar: Analysis of Visual Images and Speech

Date and time: Wednesday, 11 November 2020, 16:40-18:10
Venue: The talks are held online. After the talks, we will get together in Kazuo's office, Room 709 on the 7th floor of Building 3, Ohashi Campus, Kyushu University, Fukuoka, Japan <http://www.design.kyushu-u.ac.jp/kyushu-u/english/access>
Language: English
Organizer: Kazuo UEDA (Kyushu Univ./ReCAPS/Research and Development Center for Five-Sense Devices)

Program

16:40-17:10 An introduction to visualization by p-flow: Gradient- and feature-based optical flow and vector fields extracted from image analysis.
Wataru SUZUKI (1,2), Atsushi HIYAMA (2,3), Noritaka ICHINOHE (1), Wakayo YAMASHITA (4), Takeharu SENO (5), Hiroshige TAKEICHI (6,7,*)
(1) NIN, NCNP, (2) AIP, RIKEN, (3) University of Tokyo, (4) Kagoshima University, (5) Kyushu University, (6) ISC, RIKEN, (7) NIMH, NCNP, (*) presenter

To formulate an algorithm, pseudo flow or p-flow, that extracts the critical information (“cues”) for flexible primate visual motion perception from natural movies, feature vectors were extracted from a local motion vector field, tracked between successive frame pairs, visualized with moving dots, and used as stimuli in demonstrations and psychophysical experiments. The results support the validity of the algorithm as a “definition” of the “cues” for human visual motion perception, which is specifically suited to ecological stimuli such as self-motion and the deformation of nonrigid objects and animate beings, and they clarify precisely where the primate visual system is good or bad, veridical or illusory, and optimized or subsidiary.
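
The p-flow algorithm itself is the subject of the talk and is not reproduced here; the following is only a minimal sketch of the general pipeline the abstract describes, assuming OpenCV's Farneback dense optical flow as a stand-in for the gradient-based stage. The input file name, the magnitude threshold, and the subsampling rate are all illustrative.

    # Minimal sketch (not the authors' p-flow): compute a dense optical-flow
    # field between successive frames and render the strongly moving
    # locations as a moving-dot display.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("natural_movie.mp4")   # hypothetical input movie
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Local motion vector field between the successive frame pair.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

        # Keep only strongly moving locations ("cues") and draw them as dots.
        ys, xs = np.where(mag > 2.0)              # illustrative threshold
        dots = np.zeros_like(frame)
        for x, y in zip(xs[::50], ys[::50]):      # subsample for visibility
            cv2.circle(dots, (int(x), int(y)), 2, (255, 255, 255), -1)

        cv2.imshow("moving-dot visualization", dots)
        if cv2.waitKey(1) == 27:                  # Esc to quit
            break
        prev_gray = gray
    cap.release()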

17:10-17:25 How the acoustic correlates of English obstruents appear in multivariate analysis
Yixin ZHANG*, Yoshitaka NAKAJIMA**, Kazuo UEDA***, and Gerard B. REMIJN****
*Kyushu University, **Sound Corporation, ***Kyushu University/ReCAPS/Research and Development Center for Five-Sense Devices, ****Kyushu University/ReCAPS

To identify the acoustic correlates of obstruents, we performed an origin-shifted factor analysis of critical-band-filtered British English speech. Our results confirm that obstruents delimit syllables rather than constitute syllable nuclei, showing that multivariate analysis of the acoustic properties of speech can provide insight into English phonology.
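
The origin-shifted procedure follows the authors' published factor-analytic method and is not reproduced here; the sketch below only illustrates the general idea of factor-analyzing critical-band power envelopes, using scikit-learn's ordinary (mean-centered) FactorAnalysis instead. The band edges, smoothing window, number of factors, and file name are all illustrative; studies of this kind typically use on the order of 20 critical bands.

    # Minimal sketch of the general idea (not the authors' exact
    # origin-shifted procedure): band-pass speech into a few critical-band
    # channels, take smoothed power envelopes, and factor-analyze the
    # resulting time-by-channel matrix.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import butter, sosfiltfilt
    from sklearn.decomposition import FactorAnalysis

    fs, speech = wavfile.read("english_speech.wav")   # hypothetical mono file
    speech = speech.astype(float)

    # Illustrative critical-band edges in Hz.
    edges = [(100, 510), (510, 1080), (1080, 2700), (2700, 6400)]

    envelopes = []
    for lo, hi in edges:
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)
        power = band ** 2
        # Smooth the power envelope with a ~10 ms moving average.
        win = int(0.01 * fs)
        envelopes.append(np.convolve(power, np.ones(win) / win, mode="same"))

    X = np.stack(envelopes, axis=1)       # time x channels

    fa = FactorAnalysis(n_components=2)   # illustrative number of factors
    scores = fa.fit_transform(X)
    print(fa.components_)                 # factor loadings per channel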

17:25-17:55 Evaluative processing of single-exposed food images
Alexandra WOLF*, Shunsuke TAMURA**, Kazuo UEDA***, and Yoji HIRANO**
*JSPS International Research Fellow/Kyushu University, **Kyushu University, ***Kyushu University/ReCAPS/Research and Development Center for Five-Sense Devices

This study was conducted to clarify how the type of evaluation determines the relationship between viewing time and preference. The stimuli were 160 naturalistic food images drawn from a database for experimental research (FoodPics). Manual responses and gaze metrics were recorded for all 26 Japanese participants with an eye-tracking device (EyeLink 1000), which offers sufficient reliability for the present purpose. The data provided firm evidence against the notion that longer viewing facilitates preference formation.
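
The authors' analysis pipeline is not described in the abstract, so the following is only a minimal sketch of the kind of trial-level analysis such a study implies: a within-participant rank correlation between viewing time and the preference response. The CSV file and column names are hypothetical.

    # Minimal sketch (not the authors' actual analysis): correlate per-image
    # viewing time with the preference response within each participant.
    import pandas as pd
    from scipy.stats import spearmanr

    # One row per trial: participant, image, dwell_ms (total gaze time on
    # the image), and preference (e.g., a rating).
    trials = pd.read_csv("gaze_trials.csv")    # hypothetical data file

    def dwell_preference_rho(d):
        rho, _ = spearmanr(d["dwell_ms"], d["preference"])
        return rho

    # One Spearman rho per participant, summarized across participants.
    rhos = trials.groupby("participant").apply(dwell_preference_rho)
    print(rhos.describe())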

After the talks, we will get together in Kazuo's office, Room 709 on the 7th floor of Building 3.


Copyright (c) 2013-2024 Research Center for Applied Perceptual Science, Kyushu University. All rights reserved.