I am a computer engineer from Philadelphia. I have worked on engineering teams small, large, and huge.
Let’s get in touch! levsau.engineer@gmail.com
WORK EXPERIENCE
iPhone Hardware Team → Apple
Ball Bonder Process Engineering → Kulicke & Soffa
Health Informatics → Children's Hospital of Philadelphia
Functional Fabrics Research → Center for Functional Fabrics
Read more on my LinkedIn or Resume :^)
MEOWS
“The Motion Enhanced Off-limits Water Spray”
Award Winning Senior Design Project
Born from the frustration of trying to keep my two lovely cats off my kitchen counter, MEOWS is a modern smart home appliance that uses machine learning to detect when your furry friend is on the counter, then aims and sprays them with a small stream of water.
My team and I are absolutely honored to have been awarded second place in the 2024 Drexel Engineering Senior Design Championships.
To read the code & learn more, check out the GitHub repo!
Developed for my engineering senior design project, this prototype of MEOWS is built from a 3D-printed chassis and an electric water gun. It aims with stepper motors, is powered by a Raspberry Pi 4B, and sees through a wide-angle camera and an infrared motion sensor.
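If you're curious how a detect-then-aim loop like this can work, here's a minimal sketch of the aiming math: converting a detection's position in the camera frame into stepper-motor steps. All names and numbers here (frame width, field of view, steps per revolution) are illustrative assumptions, not the actual MEOWS code; see the GitHub repo for the real thing.

```python
# Hypothetical aiming math for a camera-guided sprayer.
# Assumed parameters -- not taken from the MEOWS codebase:
FRAME_WIDTH = 640        # camera frame width in pixels
HORIZONTAL_FOV = 120.0   # wide-angle lens field of view, in degrees
STEPS_PER_REV = 200      # a typical 1.8-degree stepper motor

def pixel_to_steps(cx: float) -> int:
    """Map a detection's horizontal center (pixels) to signed pan steps.

    Positive means rotate right, negative means rotate left.
    """
    offset = cx - FRAME_WIDTH / 2                   # pixels off-center
    angle = offset / FRAME_WIDTH * HORIZONTAL_FOV   # degrees to rotate
    return round(angle / 360.0 * STEPS_PER_REV)

# A cat detected dead-center needs no pan; one at the right edge
# needs a 60-degree turn, or about 33 steps.
print(pixel_to_steps(320))  # 0
print(pixel_to_steps(640))  # 33
```

In the real appliance this would run inside a loop that waits on the motion sensor, runs the detector on a camera frame, and drives the stepper before triggering the spray.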
Me (left) with my friends/teammates Abhishek Dave (center) and Ben Esposito (right). Not shown in this photo are our teammates Jack Pinkstone and Noah Robinson.
See MEOWS in action!
With Ducky:
Technical demonstration:
Publications
“Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems” – 2023 | arXiv page | pdf
Developments in touch-sensitive textiles have enabled many novel interactive techniques and applications. Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention. Their sensitive areas are created from a single conductive yarn, and they require only a few connections to external hardware. This technique increases their robustness and usability, while shifting the complexity of enabling interactivity from the hardware to computational models. This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system. It uses a novel sensor design and a neural network-based recognition model to classify 12 relatively complex, single-touch-point gesture classes with 89.8% accuracy, unfolding many possibilities for future applications. We also demonstrate the system's applicability and robustness to real-world conditions through its performance while being worn and the impact of washing and drying on the sensor's resistance.