Reshaping urban landscapes through innovative technological solutions.
Under the guidance of Dr. Jorge Ortiz, our team focuses on developing a software framework that integrates and processes data collected from smart cities, all in real time. Its ultimate purpose is to improve urban safety, mobility, and quality of life.
This project will be especially helpful for pedestrians with disabilities or limited mobility, such as individuals who limp. By using the dedicated SASS mobile app, these users can receive tailored walking guidance, avoid unsafe intersections, and navigate the city with greater independence and safety.
Build a robust and synchronized data collection system to gather information from multiple camera views and IMU sensors.
Extract useful features from the collected dataset, such as 2D/3D pose, gait characteristics, and IMU-based features.
Evaluate the effectiveness of different features for REID and tracking tasks, and analyze the correlations among these features.
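To illustrate the correlation analysis mentioned above, here is a minimal sketch of comparing two gait-related feature sequences with a Pearson correlation. The feature names and sample values are hypothetical placeholders, not data from the project:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length feature sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-frame features: stride length estimated from 3D pose
# vs. step amplitude from the IMU, for the same walking segment.
stride_length = [0.61, 0.64, 0.58, 0.70, 0.66]
imu_amplitude = [1.10, 1.18, 1.02, 1.30, 1.21]
r = pearson(stride_length, imu_amplitude)
```

A strong correlation between a camera-derived feature and an IMU-derived feature suggests the two modalities are observing the same underlying gait pattern, which is what makes them useful together for re-identification.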
Currently, we are aligning all sensor data to the same underlying event. For instance, when both a camera and a motion sensor detect a pedestrian limping, SASS syncs the two detections to the same timestamp.
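The alignment step described above can be sketched as nearest-timestamp matching. This is a minimal illustration, not the project's actual pipeline; the 50 ms tolerance is an assumed parameter:

```python
import bisect

def align_events(camera_ts, imu_ts, tolerance=0.05):
    """Pair each camera detection with the nearest IMU sample.

    camera_ts, imu_ts: sorted timestamps in seconds.
    tolerance: maximum gap (s) for two readings to count as the same event
               (0.05 s is an assumed value for illustration).
    Returns a list of (camera_time, imu_time) pairs.
    """
    pairs = []
    for t in camera_ts:
        i = bisect.bisect_left(imu_ts, t)
        # Candidate neighbours: the IMU samples just before and just after t.
        candidates = [imu_ts[j] for j in (i - 1, i) if 0 <= j < len(imu_ts)]
        if not candidates:
            continue
        nearest = min(candidates, key=lambda u: abs(u - t))
        if abs(nearest - t) <= tolerance:
            pairs.append((t, nearest))
    return pairs
```

With `camera_ts = [1.00, 2.00]` and `imu_ts = [0.98, 1.51, 2.03]`, the sketch pairs the limping detection at 1.00 s with the IMU sample at 0.98 s, while the unrelated 1.51 s sample is left unmatched.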
We plan to focus primarily on adding a feature that lets SASS identify a person in a more discreet manner. Previously, the cameras could only identify someone in need when they raised their hand in the middle of the street. We want pedestrians to simply walk around with a phone registered to SASS, which detects their movements and provides help from there.
An undergraduate senior at Rutgers University majoring in Electrical Engineering and conducting research in psychology.
An undergraduate senior at Rutgers University majoring in Electrical Engineering.
A master's student in the Electrical and Computer Engineering Department at Rutgers University, specializing in machine learning.
May 29, 2025
Introductory meeting: the project was discussed with Dr. Ortiz and the team was introduced.
June 5, 2025
Discussed upcoming research plans. We plan to collect sensor data via phone positioning and IMU tracking.
June 10, 2025
Read more papers on multi-modal fusion and the feasibility of the proposed framework. A website was also created to share our journey with everyone.
June 17, 2025
Started creating server clients for the cameras. We plan to use GoPros and a Reolink PoE camera.
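One piece a camera server client needs is a wire format for shipping frames to the collection server. The sketch below shows one possible length-prefixed packet layout (camera id, capture timestamp, payload size, JPEG bytes); the field layout is our own assumption for illustration, not the project's actual protocol:

```python
import struct

# Header layout (assumed): unsigned short camera id, double timestamp,
# signed int payload length, all in network byte order.
HEADER_FMT = "!Hdi"
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def pack_frame(camera_id, timestamp, jpeg_bytes):
    """Serialize one camera frame into a length-prefixed packet."""
    header = struct.pack(HEADER_FMT, camera_id, timestamp, len(jpeg_bytes))
    return header + jpeg_bytes

def unpack_frame(packet):
    """Recover (camera_id, timestamp, jpeg_bytes) from a packet."""
    camera_id, timestamp, size = struct.unpack(HEADER_FMT, packet[:HEADER_SIZE])
    return camera_id, timestamp, packet[HEADER_SIZE:HEADER_SIZE + size]
```

Carrying the capture timestamp in every packet is what later lets the server line camera frames up against IMU samples during synchronization.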
Interested in collaborating or learning more? Reach out to us at Rutgers WINLAB or at mprias216@gmail.com.