Xiaoying Wei

I am a Ph.D. student (2022 - present) in Computational Media and Arts (CMA) at HKUST, supervised by Prof. Mingming Fan. My research interest is Human-Computer Interaction (HCI). I received my Master's degree from the Department of Computer Science and Technology (CST) at Tsinghua University in 2020.

I focus on the fields of Human-Computer Interaction (HCI) and Accessibility. Specifically, I apply user-centered design, VR/AR, artificial intelligence, machine learning, visualization, sensing, and qualitative research methods to empower people, investigating 1) human-AI collaboration and teaming, 2) aging and accessibility challenges, and 3) novel VR/AR and sensing techniques.

I am also interested in drawing and video editing, and I am glad to collaborate with researchers in the HCI field.


Publications


CHI 2021 (TH-CPL A)
Auth+Track: Enabling Authentication Free Interaction on Smartphone by Continuous User Tracking
Chen Liang, Chun Yu, Xiaoying Wei, Xuhai Xu, Yongquan Hu, Yuntao Wang, Yuanchun Shi

We propose Auth+Track, a novel authentication model that aims to reduce redundant authentication in everyday smartphone usage. By combining sparse authentication with continuous tracking of the user's status, Auth+Track eliminates the "gap" authentication between fragmented sessions and enables "Authentication Free when User is Around"...
[PDF] [Video]

UbiComp 2021 (TH-CPL A)
QwertyRing: Text Entry on Physical Surfaces Using a Ring
Yizheng Gu, Chun Yu, Zhipeng Li, Zhaoheng Li, Xiaoying Wei, and Yuanchun Shi

The software keyboard is widely used on digital devices such as smartphones, computers, and tablets. It operates via touch, which is efficient, convenient, and familiar to users. However, some emerging devices, such as AR/VR headsets and smart TVs, do not support touch-based text entry. In this paper, we present QwertyRing, a technique that supports text entry on physical surfaces using an IMU ring...
[PDF] [Video]

CHI 2019 (TH-CPL A)
HandSee: Enabling Full Hand Interaction on Smartphones with Front Camera-based Stereo Vision
Chun Yu, Xiaoying Wei, Shubh Vachher, Yue Qin, Chen Liang, Yueting Weng, Yizheng Gu, Yuanchun Shi

We present HandSee, a novel sensing technique that captures the state of the user's hands touching or gripping a smartphone. We place a prism mirror on the front camera to achieve stereo vision of the scene above the touchscreen surface. HandSee enables a variety of novel interaction techniques and expands the design space for full-hand interaction on smartphones...
[PDF] [Video]

UIST 2019 (TH-CPL A)
Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor
Yizheng Gu, Chun Yu, Zhipeng Li, Weiqi Li, Shuchang Xu, Xiaoying Wei, Yuanchun Shi

Head-mounted MR systems enable touch interaction on any physical surface. However, optical methods (i.e., with cameras on the headset) have difficulty determining touch contact accurately. We show that a finger-worn IMU ring can substantially improve the accuracy of contact sensing from 84.74% to 98.61% (F1 score), with a low latency of 10 ms...
[PDF] [Video]



Education


HKUST 02/2022 - Present
Ph.D. student in Computational Media and Arts

Tsinghua University 09/2017 - 06/2020
M.S. in Computer Science and Technology

Xiamen University 09/2013 - 06/2017
B.S., Software Institute


Internship & Work Experience


Product Manager, Meituan 06/2020 - 11/2021
UX Designer, Huawei 06/2018 - 09/2018


Honors & Awards


Outstanding Award of the Tsinghua University Challenge Cup          2018
Second Prize of the Service Outsourcing Entrepreneurship Competition for Chinese College Students          2016
Silver & Best Popularity Award of the Creative Public Service Advertising Competition in Fujian          2016


Skills


Using Maya, Flash, After Effects (AE), Premiere (Pr), Photoshop (PS), and Illustrator (AI) for prototyping, 3D modeling, and visual effects.
Using Unity3D, Android Studio, and Processing for demo development.



Updated in Dec. 2021