The Intelligent Human-Machine Systems (IHMS) Lab was created by Professor Yingzi Lin when
she joined Northeastern University in August 2005, and it has been growing
rapidly ever since. She and her lab members all aim to address the
following research question:
What if machines could
recognize and respond to your physical states and emotions? At the
Intelligent Human-Machine Systems (IHMS) Laboratory, we are working hard to
make this a reality.
Our mission is to
develop intelligent machine systems that assist and interact with human
operators. We apply non-intrusive sensing technology to detect
physiological signals, then use sensor fusion to infer human states
and provide feedback. Finally, we develop advanced human-assistance systems
that function as a mediator between machines and human operators:
intelligent human-machine interaction.
A few example projects
we have been working on include:
Developing intelligent materials that can be used to infer operator
states through sensing of physiological parameters;
Developing mathematical models to quantify the perceived aesthetics of
computer interfaces, and studying how this perception affects usability;
Recognizing facial expressions and how these are linked to emotional
states, and modeling human-like facial expressions in computer interfaces;
Developing alternative input devices to help people with
disabilities overcome the accessibility issues they face when trying to use
conventional computer interfaces.
As you can see, the
types of projects we work on are quite different in nature, but they all
share a common goal: to make human-machine interaction more efficient,
effective, and satisfying. Technology should work for people, not the other
way around.
Please use the tabs
in the navigation bar on the left to visit the different pages of our lab website.
Here you can read more about our people, projects, activities, etc.