Resources

Videos of our research

Humanoid Research

Bidirectional LSTM-based Network for Fall Prediction in a Humanoid
DDP-based Parachute Landing Optimization for a Humanoid
A Rollover Strategy for Wrist Damage Reduction in a Forward Falling Humanoid

COVID-19 (Novel Coronavirus)

Solutions for COVID-19 (Novel Coronavirus)
BiPAP Breathing Test
A Covid-19 Emergency Response for Remote Control of a Dialysis Machine with Mobile HRI

CAESAR

Development
Development of a Robotic Head for HRI
CAESAR Following a Red Ball
A functional prototype of a human-like robot hand using a 3D printer
In this effort, Guilherme Teixeira, a visiting undergraduate student, designed a human-like robot arm and produced its various mechanical parts using a 3D printer. He also completed the mechatronics design and a working prototype of the robot hand. See the following video for details.
CAESAR Following a Student
Preparing for Metrotech Tree-lighting ceremony
Manipulation
Torque Control for pressing heavy buttons
Emotion-based robotic manipulation
Social Interaction
CAESAR - Basic Expressions
CAESAR - Face mimicking
CAESAR - A Socially Expressive Humanoid Robot for Expressing Emotions Using Behavior Modulation

Wearable Interfaces

A Remote Motion Control Interface for the CRS Robotic Manipulator
In this effort, we developed a wearable sensing and computing platform, consisting of a jacket with embedded sensors, a microcontroller, and a Wi-Fi chip, that allows a user to teleoperate the CRS A255 robotic manipulator by moving his or her arms. The designed system enables natural and intuitive human-robot interaction, in contrast to traditional robot command-and-control interfaces such as a teach pendant.
Wearable Interface for Controlling Robotic Arm
Multiple Interfaces for Robotic Arm Control

Mobile Interfaces

With Robots
Qbot control using iPod's sensors and gesture recognition
This video shows a Qbot mobile robot (based on iRobot's Create platform) being commanded by an iPod Touch held in a user's hand. As the hand is tilted forward and turned to the right, the iPod's accelerometer data is used to command the Qbot to move forward and turn to the right, respectively. Alternative iPod interfaces allow the Qbot to be commanded to a specified position on the grid.
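The tilt-to-motion mapping described above can be sketched as follows. This is a minimal illustration only; the function name, axis conventions, and gain values are assumptions for the sketch, not the actual Qbot interface code:

```python
import math

# Hypothetical gains; the real interface's scaling is not documented here.
MAX_SPEED = 0.5   # m/s at full forward tilt (45 degrees)
MAX_TURN = 1.0    # rad/s at full sideways tilt (45 degrees)

def tilt_to_command(ax, ay, az):
    """Map raw accelerometer readings (in g) to (speed, turn_rate).

    Tilting the device forward drives the robot forward; tilting it
    to the right turns it clockwise, as in the video.
    """
    pitch = math.atan2(-ax, math.sqrt(ay**2 + az**2))  # forward/back tilt
    roll = math.atan2(ay, az)                          # left/right tilt
    speed = MAX_SPEED * max(-1.0, min(1.0, pitch / (math.pi / 4)))
    turn = MAX_TURN * max(-1.0, min(1.0, roll / (math.pi / 4)))
    return speed, turn
```

Clamping the normalized tilt to [-1, 1] keeps the commanded velocities bounded even when the device is tilted past 45 degrees.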
iPhone control of an iRobot Create with a 2 DOF robotic manipulator
This video shows an iRobot Create with a 2 DOF robotic manipulator being commanded by an iPhone held in a user's hand. The user can use the touch, sensor, and gesture recognition capabilities of the iPhone to command the mobile robot and the manipulator.
Qbot control using iPod's touch feature
This video shows a Qbot mobile robot (based on iRobot's Create platform) being commanded by an iPod Touch held in a user's hand. As the user touches various locations on the grid of the iPod graphical user interface, the Qbot in the physical world is commanded to the corresponding location.
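The touch-grid interface above amounts to a screen-to-world coordinate mapping; a minimal sketch follows, in which the grid dimensions and cell size are illustrative assumptions rather than the actual interface's values:

```python
def touch_to_world(touch_x, touch_y, screen_w, screen_h,
                   grid_cols=8, grid_rows=8, cell_m=0.3):
    """Map a touch point on the GUI grid to a target position (meters)
    on the physical floor grid, snapping to the touched cell's center."""
    col = int(touch_x / screen_w * grid_cols)
    row = int(touch_y / screen_h * grid_rows)
    # Guard against touches exactly on the far edge of the screen.
    col = min(col, grid_cols - 1)
    row = min(row, grid_rows - 1)
    # Cell centers, with the world origin at the grid's corner.
    x = (col + 0.5) * cell_m
    y = (row + 0.5) * cell_m
    return x, y
```

Snapping to cell centers makes the commanded goal unambiguous even when the user's touch lands near a cell boundary.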
Robot arm control using iPod's sensors, touch feature, and gesture recognition
This video shows a wrist-joint of a 5 degree-of-freedom robotic manipulator being commanded by an iPod Touch held in a user's hand. As the hand is tilted forward and backward, the iPod's accelerometer data is used to command the wrist-joint to tilt forward and backward, respectively. The same iPod interface allows the user to command the other 4 degrees-of-freedom as well.
Speech controlled mobile robot using iPhone
This video demonstrates the use of an iPhone to command the motion of a differentially-driven mobile robot by speaking commands into the iPhone's microphone. The user may command the robot to drive forward or backward, rotate clockwise or counter-clockwise, or open or close the gripper arm. A text-to-speech synthesizer provides the user with voice responses.
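A speech interface like the one above typically maps recognized phrases to robot actions with a simple dispatch table. The sketch below mirrors the spoken vocabulary in the video description, but the table and function are illustrative assumptions, not the actual app:

```python
# Hypothetical command table mirroring the spoken commands in the video.
COMMANDS = {
    "drive forward":            ("drive", +1),
    "drive backward":           ("drive", -1),
    "rotate clockwise":         ("rotate", +1),
    "rotate counter-clockwise": ("rotate", -1),
    "open gripper":             ("gripper", +1),
    "close gripper":            ("gripper", -1),
}

def dispatch(transcript):
    """Look up a recognized utterance and return (action, direction),
    or None if the phrase is not in the vocabulary. A text-to-speech
    response would then confirm the action back to the user."""
    return COMMANDS.get(transcript.strip().lower())
```

Normalizing case and whitespace before the lookup makes the dispatch robust to small variations in the recognizer's output.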
Control of mobile robots using interfaces with vision feedback
An Augmented Reality Approach to Human-Robot Collaboration for Mobile Robot Navigation
Mixed-Reality Environment for Robotic Manipulation
Mobile Mixed-Reality Interactions with Multi-Robot Systems
Decentralized Multi-Robot Coordination and Supervision AEM Strategy using Distributed Ledgers
Mobile Mixed-Reality Interfaces that Enhance HRI in Shared Spaces
Augmented Reality Interface for Random Object Manipulation in a Human-Robot Collaborative Environment
An Augmented Reality Interface for Human-Robot Interaction in Unconstrained Environments
An Augmented Reality Framework for Robotic Tool-path Teaching
An Augmented Reality Spatial Referencing System for Mobile Robots
With Lab Experiments
Magnetic levitation monitoring and control using iPod
This video shows a magnetic levitation experimental testbed being commanded using an iPod Touch. The user commands the position of the ball using a slider at the bottom of the iPod graphical user interface (GUI). The measured position of the ball is communicated to the GUI in real-time, displayed in a textbox, and used to animate the ball position on the GUI. Alternative iPod interfaces allow the user to manipulate gains of an algorithm controlling the ball position.
DC Motor control using iPod
This video shows a DC motor position control experimental testbed being commanded using an iPod Touch. The user commands the position of the DC motor using a slider at the bottom of the iPod graphical user interface (GUI). The measured position and velocity responses of the motor are communicated in real-time to the GUI and plotted in the two graphs shown. Alternative iPod interfaces allow the user to manipulate gains of an algorithm controlling the DC motor position.
iPhone controlled DC Motor Example
This video demonstrates the use of a multimodal interface on an iPhone to monitor and command the position of a DC motor. The user may command the position of the motor by manually inputting the angle into a text box, using buttons and sliders, using the iPhone's on-board accelerometer, or using touch gestures to interact with an animation of the motor. The user may also tune the controller gains to change the dynamical behavior of the motor. A real-time plot of the sensor data is provided at the bottom of the interface.
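The gain tuning mentioned in the DC motor demos above implies a feedback loop whose gains can be changed at run time, e.g. from sliders on the mobile GUI. A minimal sketch of such a loop follows; the PID structure, class name, and default values are illustrative assumptions, not the testbed's actual controller:

```python
class PIDPositionController:
    """Discrete PID position loop whose gains can be updated at run
    time, e.g. when the user moves a gain slider on a mobile GUI."""

    def __init__(self, kp=1.0, ki=0.0, kd=0.0, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self._integral = 0.0
        self._prev_error = 0.0

    def set_gains(self, kp, ki, kd):
        # Called from the GUI; changing gains alters the motor's
        # dynamic response (rise time, overshoot, damping).
        self.kp, self.ki, self.kd = kp, ki, kd

    def update(self, setpoint, measured):
        """One control step: return the actuation command."""
        error = setpoint - measured
        self._integral += error * self.dt
        derivative = (error - self._prev_error) / self.dt
        self._prev_error = error
        return (self.kp * error
                + self.ki * self._integral
                + self.kd * derivative)
```

In a real deployment the `update` output would be saturated to the amplifier's voltage limits and the integral term clamped to avoid windup; those details are omitted here for brevity.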
iPhone controlled laboratory experiments
This video, which was submitted to the Simulink Student Challenge, demonstrates the use of iPhone applications to monitor, command, and control a variety of commercially-available laboratory experimental testbeds. Real-time plots of sensor data as well as animations of the experiments are provided to the user.
Teleoperation based interactive learning of Robot Kinematics
Vision based control of a ball and beam testbed
Wireless networked control of a Motor testbed
Exploring the role of a smartphone as a motion-sensing and control device in the wireless networked control of a Motor testbed
Stabilizing a ball and beam test-bed
Visual servoing of an inverted pendulum
Interactive mobile interface with augmented reality

STEM K-12

Ribbon Cutting Robot
A robot created by Mechatronics lab students cuts the ribbon to mark the merger between NYU-Poly and NYU.
Robots Teach; Brooklyn Kids Learn
STEM: Focusing on our Future
STEM: K-12 Teacher training program on "It Ain't Rocket Science"
This is what young people need

Other

Dynamic Obstacle Avoidance with Size Consideration for Mobile Robotics
The goal of this effort is to develop a vision-based robotic system that allows a mobile robot to avoid collisions with oncoming objects while taking object size into account. A camera-based vision system, consisting of two cameras placed side by side, is mounted on an omniwheel mobile robot. The distance of an oncoming object from the cameras is determined using stereo vision. In addition, the vision system determines the size of the object using an area-based method.
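The stereo depth computation described above reduces to the standard disparity relation for a rectified side-by-side camera pair. The sketch below is a minimal illustration; the focal length, baseline, and pixel values are made-up numbers, not the system's actual calibration:

```python
def depth_from_disparity(focal_px, baseline_m, x_left, x_right):
    """Distance Z = f * B / d for a rectified stereo pair.

    focal_px       : focal length in pixels (from calibration)
    baseline_m     : horizontal distance between the cameras, meters
    x_left/x_right : column of the same object feature in each image
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("object at infinity or cameras not rectified")
    return focal_px * baseline_m / disparity

def object_width(depth_m, pixel_width, focal_px):
    """Back-project the object's pixel extent at the recovered depth
    (pinhole model) to estimate its physical size."""
    return depth_m * pixel_width / focal_px

# Illustrative numbers only: f = 700 px, B = 0.1 m, 35 px disparity.
z = depth_from_disparity(700, 0.1, 420, 385)
```

Combining the recovered depth with the object's apparent pixel area is what lets the size-aware avoidance distinguish a small nearby object from a large distant one.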
Blockly controlled robot puppet
Blockly controlled robotic puppet
Blocks-based visual programming environment for teaching robot programming to K-12 students
Telerehabilitative Solutions for Stroke Patients
Blockly Based Mobile Interface for Robotics