{"id":22804,"date":"2026-03-03T14:15:17","date_gmt":"2026-03-03T06:15:17","guid":{"rendered":"https:\/\/cde.nus.edu.sg\/ece\/?page_id=22804"},"modified":"2026-04-15T14:57:29","modified_gmt":"2026-04-15T06:57:29","slug":"nuseceprojectshowcase2026","status":"publish","type":"page","link":"https:\/\/cde.nus.edu.sg\/ece\/nuseceprojectshowcase2026\/","title":{"rendered":"NUS ECE Project Showcase 2026"},"content":{"rendered":"\n\n\t\t\t<img decoding=\"async\" loading=\"eager\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/ECE-Project-Showcase-2026-1920-x-1080-px-1-1.png\" alt=\"ECE Project Showcase 2026\" \/>\n<h3>\n\t\tAbout ExCEllence (ECE Project Showcase)\n\t<\/h3>\n\tThe <strong>ExCEllence (ECE Project Showcase)<\/strong> is an annual flagship event by NUS ECE. The showcase celebrates the ingenuity, technical depth, and real-world impact of student-driven engineering projects across diverse domains &#8211; including artificial intelligence, communications, robotics, microelectronics, energy systems, and emerging interdisciplinary technologies.\nFollowing a successful inaugural run, the 2026 edition returns with an even stronger lineup of projects, broader industry participation, and deeper engagement opportunities.\nAt its core, ExCEllence is about:\n<strong>\ud83c\udfaf Recognising excellence<\/strong> in student innovation and achievement<br \/><strong>\ud83e\udd1d Bridging academia and industry<\/strong> through authentic engagement<br \/><strong>\ud83d\ude80 Inspiring the next generation<\/strong> of engineers to design, build, and lead\n\t\t\t<a href=\"https:\/\/cde.nus.edu.sg\/ece\/nuseceprojectshowcase\/\" target=\"_blank\" rel=\"noopener\">\n\t\t\t\t\t\tExplore Past Project Showcase\n\t\t\t\t\t<\/a>\n\n\n\t<p><img loading=\"lazy\" 
decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2025\/04\/D7K_8615.jpg\" alt=\"\" width=\"2560\" height=\"1695\" \/><\/p>\n<strong>Engineering Intelligence!<\/strong>\n\n<h2>\n\t\tWhat to Expect at ExCEllence 2026\n\t<\/h2>\n<h2>\n\t\t<a href=\"https:\/\/forms.office.com\/r\/JE4KgdfkiE\" title=\"Call for Nominations\" target=\"_self\">\n\t\tCall for Nominations\n\t\t<\/a>\n\t<\/h2>\n\t<strong>Know an outstanding project that deserves recognition?<\/strong>\nWhether it&#8217;s a Capstone project, a course-based innovation, or an award-winning competition entry, we want to spotlight the very best work from our ECE students.\n\ud83d\udccc <strong>Who can nominate?<\/strong> Students and faculty members<br \/>\ud83d\udccc <strong>What are we looking for?<\/strong> Innovative, impactful, and well-executed projects<br \/>\ud83d\udccc <strong>Why nominate?<\/strong> Give deserving projects the spotlight and inspire the ECE community!\n\ud83c\udf1f Selected projects stand a chance to win attractive prizes. Find out more about the <a href=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2025\/03\/Award-Details.pdf\">eligibility and the prizes\/awards details<\/a>.\n\ud83c\udf1f <a href=\"https:\/\/forms.cloud.microsoft\/r\/RQmGmP7EWV\">Submit your nominations<\/a> today and help us showcase the best of ECE!<br \/>Find out more about the <a href=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2025\/03\/eligibility-for-the-competition.pdf\">eligibility for the competition<\/a>.\n\n<h2>\n\t\tExperience ExCEllence!\n\t<\/h2>\n\tDiscover bold ideas. Meet future engineers. 
Celebrate innovation in action.\nExplore cutting-edge ECE projects, engage with industry professionals, and witness students competing for top honours.\n<strong>Be Part of the Excitement!<\/strong>\n\u2705 Explore breakthrough innovations<br \/>\u2705 Vote for the <strong>Most Inspiring<\/strong>, <strong>Most Impactful<\/strong>, and <strong>Most Sustainable<\/strong> projects (note: Student-Run Teams are not eligible to participate)<br \/>\u2705 Support outstanding student achievements<br \/>\u2705 Stand a chance to win attractive prizes as a voter\n\ud83d\udc49 <strong>RSVP by 8 April:<\/strong> <a href=\"https:\/\/forms.cloud.microsoft\/r\/Hm0hSKCmuv\" target=\"_blank\" rel=\"noopener\">https:\/\/forms.cloud.microsoft\/r\/Hm0hSKCmuv<\/a>\n<h2>\n\t\tCategories\n\t<\/h2>\n<h3>\n\t\tCompetitive\n\t<\/h3>\n<h3>\n\t\tNon-Competitive\n\t<\/h3>\n<h2>\n\t\tCapstone Posters\n\t<\/h2>\n\t<strong>This category focuses on the thorough and meticulous analysis and presentation of Capstone findings through an A1-sized poster.<\/strong>\n<ul>\n<li>\nThe evaluation criteria include the clarity and design of your poster, the overall impact of your work, your presentation skills, etc.\n<\/li>\n<\/ul>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/Capstone_Loh-Yin-Heng.jpg\" alt=\"Capstone_Loh Yin Heng\" height=\"784\" width=\"1453\" title=\"Capstone_Loh Yin Heng\" \/>\n<h2>\n\t\tA Low Cost Intuitive Robot Arm Teleoperation Framework\n\t<\/h2>\nTeleoperation systems have gained significant attention in robotics, especially for applications where human operators need to control robots remotely in real time. 
This is important in settings where direct human involvement is difficult, unsafe, or inefficient, and where intuitive robot control is needed for testing, training, or task execution. This project demonstrates a low-cost robotic teleoperation system for a robot arm, with a focus on improving usability and reducing setup complexity. By supporting simulation, real robot deployment, automated calibration, and a custom software launcher, the system makes intuitive robot arm teleoperation practical at low cost.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Lei-Lingkai-.png\" alt=\"Capstone_Lei Lingkai\" height=\"553\" width=\"1168\" title=\"Capstone_Lei Lingkai\" \/>\n<h2>\n\t\tAdvanced LEGO Robotics Platform\n\t<\/h2>\nThis project develops a low-cost robotics platform that integrates LEGO SPIKE Prime with Raspberry Pi using a custom PCB and UART communication. By enabling ROS2 support on Ubuntu, it overcomes hardware and software limitations of existing systems, allowing flexible sensor integration and advanced control. The platform supports motor control via H-bridge, real-time data handling, and modular expansion, lowering the barrier to entry for robotics education while maintaining high-level functionality.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Gordon-Hoon.jpg\" alt=\"Capstone_Gordon Hoon\" height=\"961\" width=\"1280\" title=\"Capstone_Gordon Hoon\" \/>\n<h2>\n\t\tAI\/ML-Driven Fault Detection and Diagnosis in Multi-Level Three-phase Inverter Topologies\n\t<\/h2>\nThis project integrates advanced Machine Learning (ML) techniques to address challenges in fault detection and identification. 
We have achieved 100% accuracy in detecting both two-level and three-level inverter open-circuit faults.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Jie-Hao-Tan.png\" alt=\"Capstone_Jie Hao Tan\" height=\"377\" width=\"1201\" title=\"Capstone_Jie Hao Tan\" \/>\n<h2>\n\t\tAutonomous Vehicle (AV) Perception Systems: Reviewed and Reengineered\n\t<\/h2>\nThe global autonomous car market has been experiencing rapid growth. This project yields an improved traffic sign detection system using YOLOv11. The model is trained on annotated traffic sign images to detect and classify different road signs in real time. Inspired by recent improvements in object detection research, the project enhances the detection pipeline by optimizing feature extraction and training strategies to improve accuracy and robustness. The trained model is intended for efficient deployment on edge or embedded systems. The final system aims to support intelligent transportation applications by enabling reliable detection of traffic signs under various environmental conditions.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Xinyu-Li.png\" alt=\"Capstone_Xinyu Li\" height=\"1080\" width=\"1920\" title=\"Capstone_Xinyu Li\" \/>\n<h2>\n\t\tBiofeedback and Freezing of Gait Detection for Parkinson&#8217;s Disease\n\t<\/h2>\n\t<p>This project developed a wearable Edge-AI system for real-time detection and biofeedback of Freezing of Gait in patients with Parkinson&#8217;s Disease. Motion sensors worn on the ankles monitor a patient&#8217;s gait to detect Freezing of Gait using on-device machine learning, and alert them to stop to prevent falls. Subsequently, left and right cues are provided to assist the patient in resuming walking. 
To improve the reliability of Freezing of Gait detection, the device uses a patient-independent model, and transitions to a patient-dependent model as more Freezing of Gait episodes are detected.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Leong-Deng-Jun.jpg\" alt=\"Capstone_Leong Deng Jun\" height=\"1934\" width=\"2560\" title=\"Capstone_Leong Deng Jun\" \/>\n<h2>\n\t\tDesign of a Redundant Inertial Navigation System with Hardware-Level Voting and Sensor Fusion for Autonomous Platforms\n\t<\/h2>\nThis project presents a novel redundancy architecture for autonomous maritime systems based on hardware-level voting. Supported by ST Engineering, the system integrates multiple GNSS-aided Inertial Navigation Systems (INS) onto a single PCB, producing an aggregated navigation datastream. Unlike conventional software-based failover approaches, the design utilizes a hardware-level arbitration mechanism to evaluate sensor consistency in real time. Faulty inputs are identified and excluded through voting logic, while valid measurements are fused to generate a consolidated output. This approach enhances system robustness, improves stability, and enables rapid fault detection for safety-critical autonomous maritime operations.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Hannah-scaled.png\" alt=\"Capstone_Hannah\" height=\"1751\" width=\"2560\" title=\"Capstone_Hannah\" \/>\n<h2>\n\t\tFederated Domain Adaptation for Video Action Recognition\n\t<\/h2>\nFederated Video Domain Adaptation enables collaborative learning across distributed and non-IID video datasets while preserving privacy, but remains largely unexplored. 
We propose Multi-scalE Temporal domAin aLignment (METAL), a multi-temporal-scale knowledge distillation-based framework that leverages temporal information at multiple resolutions to improve cross-domain video action recognition with only model parameter transfers. METAL trains per-scale transformer encoders on source-clients, then performs independent knowledge voting at each temporal scale to generate robust pseudo-labels on the target-server. A novel L2 variance penalty enforces cross-scale consistency during scale-based knowledge distillation, preventing any single scale from dominating. Late fusion and feature-based knowledge distillation with confidence-weighted ensemble pseudo-labels combine complementary temporal information for final predictions.\n<h2>\n\t\tHigh Speed Image Processing on FPGA Using Convolutional Neural Network (CNN)\n\t<\/h2>\nThis project implements a Convolutional Neural Network (CNN) for object detection on a Field-Programmable Gate Array (FPGA), targeting high frame rates, low latency, and reduced power consumption compared to CPU and GPU based solutions. Conventional CNNs are computationally intensive and memory-demanding, which poses challenges for resource-constrained FPGA platforms. To address this, model optimization techniques such as pruning and quantization are applied to reduce complexity and memory usage. 
These optimizations enable efficient deployment of the CNN onto FPGA hardware while maintaining effective object detection performance.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/Espressif_Kai-Sheng-Bock-scaled.jpg\" alt=\"Espressif_Kai Sheng Bock\" height=\"1707\" width=\"2560\" title=\"Espressif_Kai Sheng Bock\" \/>\n<h2>\n\t\tHuman Detection and Tracking in Indoor Spaces Using Nano Drones\n\t<\/h2>\n<p>In the aftermath of disasters such as earthquakes, rapid and efficient search and rescue operations are critical for saving lives and minimizing casualties. Traditional methods often face significant challenges in navigating dangerous, complex and unstable environments, such as partially collapsed buildings, where visibility and access are severely restricted. The recent and rapid advancement of drone technology has unlocked new possibilities for such applications.<\/p>\n<p>The project aims to develop a thermal-based approach to search for and track humans in indoor environments. This work integrates thermal sensors and develops algorithms that enable palm-sized nano Crazyflie drones, despite their limited computing and sensing capabilities, to autonomously search for, detect, and track humans in indoor environments while avoiding simple obstacles such as walls.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Zakariyyaa-Chachia.png\" alt=\"Capstone_Zakariyyaa Chachia\" height=\"1104\" width=\"1512\" title=\"Capstone_Zakariyyaa Chachia\" \/>\n<h2>\n\t\tInvestigating Ultra-Low Power, Brain-Inspired, Neuromorphic Hardware to Detect Interference Patterns on Board Satellites\n\t<\/h2>\n
Radio frequency interference is an ongoing and evolving issue within the satellite industry, but systems currently in development that rely on deep learning models can be computationally expensive. This project offers a low-power solution to this problem, using cutting-edge hardware modelled after the human brain, Spiking Neural Networks, and deep learning.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Tiara-Mirchandani.png\" alt=\"Capstone_Tiara Mirchandani\" height=\"772\" width=\"877\" title=\"Capstone_Tiara Mirchandani\" \/>\n<h2>\n\t\tMulti-Layered Anatomical Hand Model for Physics-Based Interaction\n\t<\/h2>\nThis project presents a fully anatomical, physics-driven digital human hand model. To overcome the limitations of hollow models like MANO, this framework combines a high-fidelity CT-derived skeletal mesh with a multi-layered physical simulation. By coupling a volumetric Finite Element Method (FEM) muscle layer with mass-spring skin constraints and torque-driven rigid body joint mechanics, the model achieves real-time, volume-preserving soft-tissue deformation and realistic collision responses, creating an anatomically accurate model for hand-object interaction.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Jia-Cheng-Raymand-T.jpeg\" alt=\"Capstone_Jia Cheng, Raymand T\" height=\"406\" width=\"662\" title=\"Capstone_Jia Cheng, Raymand T\" \/>\n<h2>\n\t\tNear-Infrared Transmission-Control Mechanism of Metasurface Arrays on Novel Substrate Materials\n\t<\/h2>\nThis project pioneers a highly efficient near-infrared (NIR) photodetector by integrating Titanium (Ti) plasmonic metasurfaces directly onto 4H-SiC, a premier next-generation semiconductor substrate. 
Through rigorous 3D electromagnetic simulations exploring complex optical phenomena, we engineered an optimized nanoline array that achieves an exceptional ~90% optical absorbance near the target wavelength. By successfully translating these theoretical nanoscale models into comprehensive Electron Beam Lithography (EBL) layouts, this work bridges the gap between computational physics and physical nanofabrication for robust, integrated photonic architectures.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Ricky-Wang.png\" alt=\"Capstone_Ricky Wang\" height=\"410\" width=\"670\" title=\"Capstone_Ricky Wang\" \/>\n<h2>\n\t\tNon-Invasive Monitoring of Implantable Battery Using Ultrasonics and AI\n\t<\/h2>\nReliable monitoring of implantable batteries is critical, but traditional invasive electrical methods drain limited power reserves and pose patient hazards. While ultrasonic sensing provides a non-invasive alternative, varying skin thicknesses severely distort acoustic signals. To resolve this, we propose a robust diagnostic framework utilizing Swept Frequency Ultrasonic Reflection (SFUR) paired with a novel cascaded Orthogonal Spectral Extreme Gradient Boosting (OS-XGBoost) architecture. By employing PCA dimensionality reduction and dynamically coupling battery health with charge estimation, the model successfully filters physiological noise to achieve an R<sup>2<\/sup> of 0.99 for both state-of-health (SOH) and state-of-charge (SOC) estimation. 
Its minimized 6.43 KB memory footprint demonstrates exceptional viability for resource-constrained edge microcontrollers.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Ang-Hui-Jie.png\" alt=\"Capstone_Ang Hui Jie\" height=\"1080\" width=\"1920\" title=\"Capstone_Ang Hui Jie\" \/>\n<h2>\n\t\tPassive Acoustic Monitoring of Melting Glaciers\n\t<\/h2>\nGlaciers are sensitive indicators of climate change, and their accelerated melting is a visible consequence of global warming. This project investigates melting activity using acoustic data captured from tidewater glaciers, which produce distinct sound signatures. Passive Acoustic Monitoring (PAM) is a non-invasive and cost-effective approach that offers high temporal resolution for studying underwater sound propagation at the Hansbreen glacier terminus in Hornsund Fjord, Spitsbergen. Acoustic simulations were conducted in the Julia ecosystem using UnderwaterAcoustics.jl and Bellhop beam-tracing to model sound transmission under varying environmental conditions, including bathymetry, sound-speed profiles, glacier-front geometry, sound directionality, and seabed properties.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Mahir-Faysal.jpg\" alt=\"Capstone_Mahir Faysal\" height=\"840\" width=\"1270\" title=\"Capstone_Mahir Faysal\" \/>\n<h2>\n\t\tReal-Time Underwater Voice Communication\n\t<\/h2>\nThis project presents the design and implementation of a real-time underwater voice streaming system using low-bitrate speech compression and an underwater communication framework. Audio is recorded from one webpage, compressed into encoded chunks, transmitted underwater, and then decoded and played back on a second webpage with low latency. 
To achieve this, the system integrates Lyra with UnetStack and is evaluated through benchmarking, latency analysis, and parameter tuning. The final system demonstrates reliable webpage-to-webpage underwater voice communication, showing that low-latency digital speech transmission can be successfully achieved in an underwater environment.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Mihindukulasooriya-S.jpg\" alt=\"Capstone_Mihindukulasooriya S\" height=\"556\" width=\"634\" title=\"Capstone_Mihindukulasooriya S\" \/>\n<h2>\n\t\tSelf-Balancing Control System for Spacecraft Simulator\n\t<\/h2>\nThe rapid growth of small satellite applications demands highly reliable subsystems, particularly the Attitude Determination and Control System (ADCS), which governs precise satellite orientation for stability, power generation, and payload pointing. However, ADCS failures are often linked to limitations in ground-based testing. Air-bearing simulators, used to emulate near torque-free conditions, are highly sensitive to misalignment between the centre of mass (CM) and the centre of rotation, introducing disturbance torques. This work presents an automatic balancing system using least squares estimation of CM offset from IMU data and piezoelectric actuation, achieving a fourfold improvement in reaction wheel saturation time and significantly enhancing test fidelity.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Gayathri-Thattarupar.png\" alt=\"Capstone_Gayathri Thattarupar\" height=\"1079\" width=\"1884\" title=\"Capstone_Gayathri Thattarupar\" \/>\n<h2>\n\t\tSelf-Sustaining Communication Module for Enhanced Resilience in Miniaturized Satellite Operations\n\t<\/h2>\nThis project presents the design and validation of an in-house developed RF-based ground node for satellite communications. 
A modular RF front-end printed circuit board (PCB) operating in the UHF band was developed to enhance satellite telemetry, tracking, and localisation. The design integrates filtering, low-noise amplification, and power amplification using controlled-impedance RF layout techniques. Experimental testing demonstrated an uplink gain of approximately 16 dB, closely matching theoretical predictions. The system supports resilient, low-cost ground-station infrastructure for emerging small-satellite communication and tracking networks.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Ang-Jing-Neng.png\" alt=\"Capstone_Ang Jing Neng\" height=\"528\" width=\"1078\" title=\"Capstone_Ang Jing Neng\" \/>\n<h2>\n\t\tStudy of a Non-Iterative Method Using Phaseless Data for Solving Inverse Scattering Problems\n\t<\/h2>\nSolving inverse scattering problems using non-iterative methods and phaseless data is increasingly important in optical imaging, radar detection, biomedical diagnostics, and space-based remote sensing, where phase measurements are costly or physically inaccessible. This project aims to develop an improved phase retrieval algorithm to accurately reconstruct phase information from intensity data, alongside a novel non-iterative inversion algorithm that enhances reconstruction accuracy beyond traditional methods such as Born, Rytov, and modified Born approximation inversion methods. 
MATLAB simulations are used to validate the performance of the algorithms.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Zi-Yang-Chai-scaled.jpg\" alt=\"Capstone_Zi Yang Chai\" height=\"1180\" width=\"2560\" title=\"Capstone_Zi Yang Chai\" \/>\n<h2>\n\t\tSystematic RF Hardware Design Methodology for TEMPEST Attacks Based on Display Pixel Harmonics\n\t<\/h2>\nTEMPEST attacks recover video display content from unintentional electromagnetic emissions, where the pixel rate and its harmonics carry the display information. Since different resolutions and refresh rates produce different pixel rates, and different displays amplify different harmonics, reliable carrier capture requires hardware covering a wide frequency range. Existing literature, largely focused on demodulation and image reconstruction algorithms, employs overly wideband hardware, resulting in unnecessarily large and impractical antennas. This work proposes a systematic methodology using a pixel harmonic based design table to identify candidate carriers and select the minimum required bandwidth for a target set of displays.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/Capstone_Leipakshi-Gupta.jpeg\" alt=\"Capstone_Leipakshi Gupta\" height=\"721\" width=\"1280\" title=\"Capstone_Leipakshi Gupta\" \/>\n<h2>\n\t\tTerrain Aware SE(2) Planning for Multifloor Navigation in Indoor Digital Twins\n\t<\/h2>\nThis project studies global path planning for wheeled robots in indoor point cloud based digital twins, with a particular focus on navigation across multiple floors. While existing digital twin systems can reconstruct buildings and support localisation, they do not generally provide a complete method for safe and feasible multifloor navigation for SE(2) constrained wheeled agents such as wheelchairs and service robots. 
This is especially important in real indoor environments, where narrow corridors, clutter, ramps, floor level changes, and elevator transitions affect whether a path is actually traversable.\n<p>The work builds on the SEB Naver framework as a baseline local planner and examines how raw point cloud data can be converted into an SE(2) terrain representation containing elevation, surface normal, risk, and signed distance information. On this representation, local motion planning is performed using Kino A* and trajectory optimisation. To extend this framework beyond a single floor, this report proposes a higher level multifloor planning approach in which each floor is represented by its own terrain map and vertical connections such as elevators or ramps are modelled as transitions in a graph. Global planning is then carried out by combining inter-floor graph search with floor level SE(2) planning.<\/p>\n<h2>\n\t\tEspressif Competition \n\t<\/h2>\n<strong>This category features selected Capstone projects developed in collaboration with Espressif Systems, showcasing innovative applications built using Espressif technologies.<\/strong>\nParticipation in this category is by invitation only and is limited to Capstone teams whose projects are part of the Espressif collaboration programme.\n<ul>\n<li>The showcase features presentations and live demonstrations of the prototypes or the completed systems.<br \/>\n<\/li>\n<li>The evaluation criteria assess the understanding of, and ability to apply, fundamental engineering principles in solving real-world problems, creativity in design and execution, among other things.<\/li>\n<\/ul>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/Capstone-and-Espressif_Yalin.png\" alt=\"Capstone and Espressif_Yalin\" height=\"1434\" width=\"2559\" title=\"Capstone and Espressif_Yalin\" \/>\n<h2>\n\t\tBiofeedback and Freezing of Gait Detection for Parkinson&#8217;s 
Disease\n\t<\/h2>\nThis project developed a wearable Edge-AI system for real-time detection and biofeedback of Freezing of Gait in patients with Parkinson&#8217;s Disease. Motion sensors worn on the ankles monitor a patient&#8217;s gait to detect Freezing of Gait using on-device machine learning, and alert them to stop to prevent falls. Subsequently, left and right cues are provided to assist the patient in resuming walking. To improve the reliability of Freezing of Gait detection, the device uses a patient-independent model, and transitions to a patient-dependent model as more Freezing of Gait episodes are detected.\n<h2>\n\t\tCentralized &amp; Distributed Object Detection using Autonomous Vehicles on any Unknown Terrain using Machine Learning\n\t<\/h2>\nObject detection using Machine Learning is a reasonably well-studied topic with several ML algorithms in place. Recent YOLO-series algorithms detect objects in real time and hence become attractive when mounted on an autonomous vehicle! In this project, we will consider a co-operative multi-vehicle scenario in which vehicles (mBots) participate in a coordinated and cooperative manner to detect objects on an unknown terrain.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/Espressif_Kai-Sheng-Bock-scaled.jpg\" alt=\"Espressif_Kai Sheng Bock\" height=\"1707\" width=\"2560\" title=\"Espressif_Kai Sheng Bock\" \/>\n<h2>\n\t\tHuman Detection and Tracking in Indoor Spaces Using Nano Drones\n\t<\/h2>\nIn the aftermath of disasters such as earthquakes, rapid and efficient search and rescue operations are critical for saving lives and minimizing casualties. Traditional methods often face significant challenges in navigating dangerous, complex and unstable environments, such as partially collapsed buildings, where visibility and access are severely restricted. 
The recent and rapid advancement of drone technology has unlocked new possibilities for such applications.\n<p>The project aims to develop a thermal-based approach to search for and track humans in indoor environments. This work integrates thermal sensors and develops algorithms that enable palm-sized nano Crazyflie drones, despite their limited computing and sensing capabilities, to autonomously search for, detect, and track humans in indoor environments while avoiding simple obstacles such as walls.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/PSC-modules.jpg\" alt=\"PSC modules\" height=\"1280\" width=\"960\" title=\"PSC modules\" \/>\n<h2>\n\t\tPersonal Smart Companion (PSC)\n\t<\/h2>\nPSC is a next-generation, personalized smart alarm clock and desktop or wearable companion designed for university students. Through voice and touch interaction, PSC supports productivity, healthy routines, emotional wellbeing, and focus on habits in a seamless, non-intrusive way. Grounded in behavioral science principles, it uses IoT-enabled hardware and cloud intelligence to deliver timely nudges that help students stay motivated, organized, and emotionally supported throughout their academic journey.\u00a0 As an ambient companion device, PSC passively and continuously monitors aspects of the user&#8217;s routines and wellbeing, responding in subtle and supportive ways. It combines six core features: a Pomodoro productivity timer, habit tracker, multimodal mood logging, motivational audio support, environmental sensing\u00a0for air quality and ambient conditions, and a Telegram bot\u00a0for remote access to wellness insights and data. 
Together, these features transform PSC into a dependable, trustworthy companion that promotes student wellbeing, productivity, and healthier daily habits.\n<h2>\n\t\tSmart Traffic Cone System\n\t<\/h2>\nThe smart traffic cone is an integral part of the smart-city trend. Traffic cones are bright orange markers used for road safety and traffic management: they are placed on roads or footpaths to temporarily redirect traffic in a safe manner, and to create separation or merge lanes during road works or car accidents. A smart traffic cone upgrades the traditional cone with sensors and connectivity to improve road safety, traffic flow management, and communication. If equipped with mobility, the cones can be deployed from a remote location and, thanks to their connectivity, monitored remotely. The smart traffic cone system thus enhances safety for road workers compared with passive traffic cones.\n<h2>\n\t\tEE\/CEG Courses Projects\n\t<\/h2>\n\t<strong>This category highlights the outstanding project executions across various EE- and CEG-coded courses in ECE.<\/strong>\n<ul>\n<li>\nThe showcase should ideally include a live demonstration of the system.\n<\/li>\n<li>\nThe evaluation criteria assess the understanding of, and ability to apply, fundamental engineering principles in solving real-world problems, creativity in design and execution, etc.\n<\/li>\n<li>\nA printed or displayed visual aid is optional. 
If included, it will serve only to aid in the explanation and will not be evaluated.\n<\/li>\n<\/ul>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CDE2605R_Sai.jpg\" alt=\"CDE2605R_Sai\" height=\"659\" width=\"956\" title=\"CDE2605R_Sai\" \/>\n<h2>\n\t\tCDE2605R &#8211; Live Video from Stratospheric Weather Balloon\n\t<\/h2>\nThis project features a weather balloon payload that transmits live video at 1280 MHz using a commercial off-the-shelf (COTS) video transmitter, alongside a student-developed flight telemetry board transmitting Automatic Packet Reporting System (APRS) packets at 144.390 MHz. This is the first such payload in the region involving video transmission from the stratosphere using the Amateur Radio band. The payload was launched via a weather balloon at 0650H, after securing all required approvals, and ascended to an altitude of 31 km, the edge of space. Throughout the mission, the team maintained a continuous real-time video feed alongside telemetry data.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG2028_Yi-Feng-Chan.jpg\" alt=\"CG2028_Yi Feng Chan\" height=\"960\" width=\"1280\" title=\"CG2028_Yi Feng Chan\" \/>\n<h2>\n\t\tCG2028 &#8211; LIFT (Live Intelligent Fall Tracker)\n\t<\/h2>\nThis smart IoT wearable addresses the &#8220;long lie&#8221; problem by providing real-time fall detection for the elderly wearer. Powered by an STM32 Cortex-M4 and an ESP32 microcontroller, the system uses a two-stage state machine to track a free-fall phase and a subsequent impact phase via an accelerometer and gyroscope. A moving-average filter written in assembly processes raw sensor data rapidly. Beyond motion tracking, an integrated heartbeat sensor transmits vital health data. 
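The two-stage free-fall/impact state machine can be sketched in a few lines; the thresholds, window size, and class names below are illustrative assumptions, not the team's actual firmware values:

```python
class FallDetector:
    """Toy two-stage fall detector: a dip in smoothed acceleration
    magnitude (free fall) followed by a spike (impact) signals a fall."""

    def __init__(self, free_fall_g=0.4, impact_g=2.5, window=4):
        self.free_fall_g = free_fall_g   # below this: free-fall phase
        self.impact_g = impact_g         # above this: impact confirmed
        self.window = window             # moving-average length
        self.buf = []
        self.state = "MONITOR"

    def update(self, accel_g):
        """Feed one accelerometer magnitude (in g); True once a fall is confirmed."""
        self.buf.append(accel_g)
        if len(self.buf) > self.window:
            self.buf.pop(0)
        avg = sum(self.buf) / len(self.buf)   # moving-average filter
        if self.state == "MONITOR" and avg < self.free_fall_g:
            self.state = "FALLING"
        elif self.state == "FALLING" and avg > self.impact_g:
            self.state = "MONITOR"
            return True   # free fall followed by impact: report a fall
        return False
```

Feeding a stream of ~1 g readings, then a free-fall dip, then an impact spike drives the machine through both stages and reports a fall; normal activity that never dips below the free-fall threshold is ignored.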
Upon fall detection, a buzzer notifies bystanders, while WhatsApp and Telegram notifications are sent to the wearer&#8217;s emergency point of contact. All sensor data is logged to a Ubidots cloud dashboard.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG2111A_Khushi-Madnani.jpeg\" alt=\"CG2111A_Khushi Madnani\" height=\"1200\" width=\"1600\" title=\"CG2111A_Khushi Madnani\" \/>\n<h2>\n\t\tCG2111A &#8211; Alex Rover\n\t<\/h2>\nAlex is a teleoperated search-and-rescue robot developed for a simulated lunar mission environment. The system features a dual-processor architecture, integrating a Raspberry Pi for high-level sensing and mapping with an Arduino for real-time motor and actuator control. It utilizes LiDAR for environment scanning, a color sensor for zone detection, and a constrained camera for visual feedback. A 4-DoF robotic arm enables precise payload manipulation. The robot supports dual-operator control over a networked interface, enabling efficient navigation, object retrieval, and mission execution in an unknown, obstacle-rich environment.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG2111A_Divyansh-Garg.jpg\" alt=\"CG2111A_Divyansh Garg\" height=\"960\" width=\"1280\" title=\"CG2111A_Divyansh Garg\" \/>\n<h2>\n\t\tCG2111A &#8211; Moonbase SG Rescue Mission (Alex)\n\t<\/h2>\nOur remotely operated rescue robot navigates an obstacle course to locate, retrieve, and deliver medical supplies to injured astronauts. What sets our implementation apart is a full ROS2-based SLAM pipeline running on a Raspberry Pi inside Docker, using an RPLidar A1M8 and slam_toolbox to build a live occupancy grid map &#8211; visualised in real time on Foxglove Studio over Tailscale VPN. With no wheel encoders, we engineered a scan-matching-only odometry solution. 
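Scan-matching odometry estimates motion by aligning consecutive range scans instead of counting wheel rotations. A minimal 1-D illustration of the idea (the real system matches 2-D lidar scans via slam_toolbox; the function name and brute-force search here are illustrative):

```python
def estimate_shift(prev_scan, curr_scan, max_shift=5):
    """Find the integer shift that best aligns curr_scan with prev_scan,
    by brute-force search over the mean squared difference of the overlap."""
    n = len(prev_scan)
    best_shift, best_cost = 0, float("inf")
    for k in range(-max_shift, max_shift + 1):
        # compare prev_scan[i] against curr_scan[i + k] over the overlap
        pairs = [(prev_scan[i], curr_scan[i + k])
                 for i in range(n) if 0 <= i + k < n]
        cost = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = k, cost
    return best_shift
```

Accumulating the recovered shift over successive scans stands in for wheel-encoder odometry.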
Two operators control the robot simultaneously over a TLS-secured TCP link: one for navigation, one for the 4-DoF robotic arm, supported by a colour sensor, camera, and hardware E-stop.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG3207_Teng-Yi-Heng.png\" alt=\"CG3207_Teng Yi Heng\" height=\"1438\" width=\"1605\" title=\"CG3207_Teng Yi Heng\" \/>\n<h2>\n\t\tCG3207 &#8211; Dual-Core Pipelined RISC-V Processor with Atomic Instruction Support\n\t<\/h2>\n<p>This project presents the design and implementation of a dual-core, 5-stage pipelined processor implementing the RV32IM ISA, written in Verilog and tested on the Nexys 4 FPGA. Each core can run its own program, allowing for high thread-level parallelism. To share memory and peripherals between cores while avoiding race conditions, we implemented atomic Load-Reserved\/Store-Conditional instructions, enabling synchronization primitives such as mutexes to be used in programs. To demonstrate its utility, we present a rendering demo (moving ball) where each core draws a different half of the screen, sharing variables and a memory-mapped VGA peripheral.<\/p>\n<p>Other enhancements include hazard detection to automatically stall\/flush pipeline stages when needed, data forwarding to avoid pipeline stalls where possible, a global-history branch prediction scheme for better speculative execution, and a Karatsuba multiplier for fast and efficient 32-bit multiplication.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG3207_The-Trung-Bui.jpg\" alt=\"CG3207_The Trung Bui\" height=\"960\" width=\"1280\" title=\"CG3207_The Trung Bui\" \/>\n<h2>\n\t\tCG3207 &#8211; In-Order Dual-Issue RISC-V32IM Processor on FPGA with CoreMark Benchmarking\n\t<\/h2>\n<p>This project presents the design and implementation of a processor based on the RV32IM ISA. 
The processor is developed using Verilog and prototyped on a Nexys 4 FPGA. It supports execution of compiled assembly programs in binary format.<\/p>\n<p>To improve performance, the processor incorporates a dual-issue in-order execution pipeline with a 5-stage architecture, allowing two instructions to be issued simultaneously. Key enhancements include hazard detection, data forwarding, and branch prediction, all designed to mitigate pipeline stalls and improve throughput.<\/p>\n<p>The performance is evaluated using the CoreMark benchmark and achieves a score of 2.22 CoreMark\/MHz.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/CG3207_Wenbo-Zhu.jpg\" alt=\"CG3207_Wenbo Zhu\" height=\"960\" width=\"1280\" title=\"CG3207_Wenbo Zhu\" \/>\n<h2>\n\t\tCG3207 &#8211; Mach-V: In-Order Superscalar RISC-V CPU Implementation on Nexys 4 FPGA\n\t<\/h2>\nThe open-standard RISC-V ISA is driving increasing demand for high-performance, customizable processors that go beyond traditional in-order academic designs. This project investigates the design and implementation of a superscalar RISC-V processor incorporating advanced microarchitectural techniques, including a 5-stage pipeline, dynamic branch prediction, and a dual-issue superscalar architecture. The processor&#8217;s performance is evaluated using the CoreMark benchmark, achieving a score of 2.51 CoreMark\/MHz and delivering a 2\u00d7 throughput improvement over the baseline single-issue, single-cycle implementation.\nLooking ahead, this work aims to explore further enhancements to improve instruction-level parallelism, overall throughput, and operating frequency. 
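The dynamic branch prediction used in these processors can be illustrated with a toy gshare-style predictor: global branch history XORed with the program counter indexes a table of 2-bit saturating counters (table size and indexing here are illustrative, not the students' actual design):

```python
class GsharePredictor:
    """Toy gshare predictor: (pc XOR global history) indexes a table of
    2-bit saturating counters; 0-1 predict not-taken, 2-3 predict taken."""

    def __init__(self, bits=4):
        self.mask = (1 << bits) - 1
        self.table = [1] * (1 << bits)   # initialise weakly not-taken
        self.history = 0                 # last `bits` branch outcomes

    def predict(self, pc):
        return self.table[(pc ^ self.history) & self.mask] >= 2

    def update(self, pc, taken):
        """Train the counter for this branch, then shift in the outcome."""
        idx = (pc ^ self.history) & self.mask
        if taken:
            self.table[idx] = min(3, self.table[idx] + 1)
        else:
            self.table[idx] = max(0, self.table[idx] - 1)
        self.history = ((self.history << 1) | int(taken)) & self.mask
```

After a few repetitions of a loop branch, the saturating counters lock onto its taken/not-taken bias, which is what lets the pipeline speculate past the branch instead of stalling.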
Potential directions include deeper pipelining and more sophisticated branch prediction mechanisms, among other advanced architectural optimizations.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/6240104512475565666.jpg\" alt=\"6240104512475565666\" height=\"588\" width=\"1280\" title=\"6240104512475565666\" \/>\n<h2>\n\t\tCG4002 &#8211; PokemonAR\n\t<\/h2>\nThis project presents a real-time augmented reality Pok\u00e9mon battle system that transforms physical spaces into interactive arenas. Players command Pok\u00e9mon through physical gestures captured by a wrist-mounted MPU6050 IMU on a FireBeetle ESP32, and voice commands for hands-free Pok\u00e9mon switching. Pretrained neural networks for action classification and voice recognition are deployed on the Ultra96 FPGA using fixed-point HLS-synthesized accelerators. The iOS visualizer leverages LiDAR mesh reconstruction for depth-aware occlusion, rendering world-anchored 3D battle scenes in real time. All components communicate over an MQTT broker hosted on AWS EC2, enabling synchronized turn-based PvP and PvE gameplay across a distributed edge-cloud architecture.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/EE1111A_Zhu-Yuehua-.png\" alt=\"EE1111A_Zhu Yuehua\" height=\"698\" width=\"934\" title=\"EE1111A_Zhu Yuehua\" \/>\n<h2>\n\t\tEE1111A &#8211; Design of Off-Grid Solar Panel Array with Suntracking for Jakarta\n\t<\/h2>\nThis project presents the design of an off-grid floating solar photovoltaic (PV) system with single-axis sun tracking for coastal regions in North Jakarta. By utilizing water surfaces and incorporating tracking mechanisms, the system overcomes land scarcity while enhancing energy yield. A prototype using Arduino, LDR sensors, and servo motors demonstrates real-time tracking capability. 
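The control loop behind such a tracker is simple: compare the two LDR readings and nudge the servo toward the brighter side. A sketch of one control step (the gain, deadband, and ADC scale are illustrative assumptions, not the prototype's tuned values):

```python
def track_step(ldr_left, ldr_right, servo_angle, gain=0.1, deadband=20):
    """One step of a single-axis sun tracker: rotate toward the brighter
    LDR; a deadband suppresses jitter when readings are nearly balanced."""
    error = ldr_right - ldr_left            # positive: light is to the right
    if abs(error) <= deadband:
        return servo_angle                  # balanced enough: hold position
    new_angle = servo_angle + gain * error
    return max(0.0, min(180.0, new_angle))  # clamp to the servo's 0-180 range
```

Called repeatedly from the Arduino loop with fresh ADC readings, the panel angle settles where both sensors read equally, i.e. facing the light source.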
Simulation results show that the tracking system increases annual energy output by approximately 14% compared to fixed-tilt panels. The proposed design offers a scalable, sustainable solution for urban renewable energy generation in tropical environments.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/EE1111A_He-Jiaxie.gif\" alt=\"EE1111A_He Jiaxie\" height=\"1536\" width=\"2048\" title=\"EE1111A_He Jiaxie\" \/>\n<h2>\n\t\tEE1111A &#8211; OrbitGrow: An Automated Rotary Hydroponic System with Adaptive Nutrient Dosing\n\t<\/h2>\nThis project presents an automated rotary hydroponic system that maximizes space efficiency and crop yield through synchronized mechanical rotation, smart sensing, and adaptive control. A rotating drum and side LED panel enhance uniform light exposure. A centralized watering and nutrient system uses a main reservoir and pump to circulate solution through the hydroponic loop, while EC, pH, and temperature sensors monitor reservoir conditions. Nutrient dosing is applied directly to the reservoir with temperature-compensated EC control to maintain stability. An intelligent atomizer-fan system regulates climate, and adaptive lighting control optimizes growth, delivering precise monitoring and control tailored to each crop&#8217;s specific needs.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/EE2026_Du-Derrick.jpg\" alt=\"EE2026_Du Derrick\" height=\"1279\" width=\"1706\" title=\"EE2026_Du Derrick\" \/>\n<h2>\n\t\tFPGA-Based Guitar Multi-Effect Pedal\n\t<\/h2>\nThis project is a real-time digital guitar effects pedal built on a single Basys3 FPGA. It addresses the cost, bulk, and inflexibility of traditional analogue pedalboards by consolidating an entire signal chain into one reconfigurable device. 
Audio from an electric guitar is processed through a Pmod I2S2 codec at studio-quality 48 kHz \/ 24-bit. The pipeline includes a multi-mode preamp, cabinet speaker simulation, tap-tempo delay, reverb, and a chromatic tuner. Users interact through a performance mode for live playing and an edit mode for detailed parameter adjustment, displayed on both an OLED and a VGA monitor with real-time waveform visualisation. By exploiting the FPGA&#8217;s deterministic parallel processing, the entire pipeline runs within a single sample period with zero perceptible latency.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/EE2026__Hsu-Yu-Chen-scaled.png\" alt=\"EE2026__Hsu, Yu-Chen\" height=\"1316\" width=\"2560\" title=\"EE2026__Hsu, Yu-Chen\" \/>\n<h2>\n\t\tEE2026 &#8211; Metacircuit\n\t<\/h2>\nThis project presents the design and implementation of an interactive FPGA-based analog circuit simulator on the Basys3 board using Verilog. Users construct circuits via a VGA interface with mouse input, placing components such as resistors and sources. The system generates a netlist, builds the system matrix using modified nodal analysis, and solves it using single-precision floating-point computation. Initial support focuses on DC analysis. Visualization includes waveform display and current animation. Switches, LEDs, and an OLED provide a compact status display and control for system interaction.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/EE2026_Kenneth-Wong-scaled.jpg\" alt=\"EE2026_Kenneth Wong\" height=\"1920\" width=\"2560\" title=\"EE2026_Kenneth Wong\" \/>\n<h2>\n\t\tEE2026 &#8211; Object Tracker with FPGA\n\t<\/h2>\nThis project is a real-time computer vision system built on a single Basys3 FPGA. It serves as an experiential learning tool on the principles of object tracking. 
A camera feed and graphical user interface overlays are displayed through a VGA monitor. Using a mouse, users can interactively customise the image processing pipeline via intuitive drag &amp; drop controls, dynamic sliders and scroll-wheel. They can also gain insights into our Union-Find Disjoint Set blob detection algorithm through pop-up explanations and modify its parameters. The switches and buttons on the FPGA are used to tune the PD controller of the pan-tilt servo, which will follow the detected object smoothly.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/EE4218_The-Trung-Bui.jpg\" alt=\"EE4218_The Trung Bui\" height=\"960\" width=\"1280\" title=\"EE4218_The Trung Bui\" \/>\n<h2>\n\t\tEE4218 &#8211; FPGA-Based Hardware Accelerator for Ternary Transformer Sentence Sentiment Analysis\n\t<\/h2>\n<p>This project designs and implements hardware co-processors to accelerate a ternary Transformer for sentence sentiment analysis on the KV260 FPGA. Modules include a projection engine for Q\/K\/V\/O layers and a score engine for attention computation.<\/p>\n<p>The accelerators interface with the on-board CPU via AXI and AXI-Stream. Two implementations are explored: Verilog HDL and HLS (C\/C++), both integrated with software. 
The model supports sentiment and topic classification (Sports, Business, Sci\/Tech, World), with a Python software version used as a performance baseline.<\/p>\n<p>This work demonstrates FPGA acceleration of Transformer models and the trade-offs between HDL and HLS.<\/p>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/final_Trijal-Srimal.jpg\" alt=\"final_Trijal Srimal\" height=\"1022\" width=\"1280\" title=\"final_Trijal Srimal\" \/>\n<h2>\n\t\tEE4218 &#8211; Hardware Accelerated Inference of Satellite Imagery on FPGAs\n\t<\/h2>\nThis project presents a hardware-accelerated ship detection system for real-time inference of satellite imagery on the Kria KV260 platform. A sliding-window pipeline extracts seven features from each image patch and feeds them to a trained 7-2-1 MLP classifier. The system compares software, HLS, and custom HDL implementations, using AXI DMA and AXI-Lite to move data and weights between the ARM processor and FPGA fabric. A Tkinter-based GUI enables real-time testing and timing comparison across modes. Results demonstrate how FPGA acceleration reduces inference latency while preserving the accuracy of the trained model for binary ship-versus-background classification on the board and evaluation dataset.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/tinyissimo_demo_screenshot_Hannah-Lee.png\" alt=\"tinyissimo_demo_screenshot_Hannah Lee\" height=\"1510\" width=\"1138\" title=\"tinyissimo_demo_screenshot_Hannah Lee\" \/>\n<h2>\n\t\tEE4218 &#8211; Hardware Acceleration of a Lightweight YOLO Implementation on an FPGA-based SoC\n\t<\/h2>\nThis project deploys TinyissimoYOLO, a quantized lightweight YOLO model, on the Kria KV260 FPGA-based SoC. 
The model is trained on a 3-class COCO subset, quantized via post-training quantization (PTQ), and exported to TFLite. Three inference modes are compared: a TFLite software baseline running on four ARM A53 cores, an HDL accelerator, and an HLS-generated accelerator, with the accelerators achieving ~2\u00d7 and ~4\u00d7 speedups respectively at minimal accuracy loss (&lt;1% mAP) post-quantisation. Built around a single configurable layer, the HDL and HLS designs utilize only 6% of LUTs as logic + 12% of DSPs and 27% of LUTs as logic + 16% of DSPs respectively, leaving significant FPGA logic resource headroom and highlighting the potential of custom hardware accelerators over software solutions for real-world edge machine learning deployment.\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/EE4218_Wenbo-Zhu.jpg\" alt=\"EE4218_Wenbo Zhu\" height=\"720\" width=\"1280\" title=\"EE4218_Wenbo Zhu\" \/>\n<h2>\n\t\tEE4218 &#8211; Verilog Neural Network (VNN): Hardware Acceleration of Basic Image Classifier\n\t<\/h2>\n<p>In this project, we develop a custom hardware accelerator to improve the speed of an image classification system.<\/p>\n<p>Our approach uses a Convolutional Neural Network (CNN) trained on the CIFAR-10 dataset to recognize objects such as animals and vehicles. 
With an innovative and highly efficient RTL microarchitecture optimized using various RTL transformation techniques, our RTL implementation achieves a 2709\u00d7 speedup (8127 images\/sec), while the HLS implementation delivers a 585\u00d7 speedup (1705 images\/sec) compared to CPU-only execution (3 images\/sec) on the KV260 platform.<\/p>\n<p>Through this work, we demonstrate how specialized hardware can accelerate machine learning workloads and offer advantages over traditional software-based solutions.<\/p>\n<h2>\n\t\tMSc Projects\n\t<\/h2>\n<strong>This category showcases the year-long MSc projects and celebrates creativity, technical excellence, and real-world application of knowledge.<\/strong>\n<ul>\n<li>\nThe showcase should ideally include a live demonstration of the system.\n<\/li>\n<li>\nThe evaluation criteria include technical complexity and depth, ingenuity, the broader implication of the project, presentation skills, etc.\n<\/li>\n<li>\nA printed or displayed visual aid is optional. If included, it will serve only to aid in the explanation and will not be evaluated.\n<\/li>\n<\/ul>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/MSC_Jia-Mingxuan-1-scaled.jpg\" alt=\"MSC_Jia Mingxuan\" height=\"1152\" width=\"2560\" title=\"MSC_Jia Mingxuan\" \/>\n<h2>\n\t\tFPGA-Accelerated NAND Flash Controller with Parallel LDPC-Based Error Correction\n\t<\/h2>\nThis project develops an FPGA-accelerated NAND Flash controller integrated with a Quasi-Cyclic Low-Density Parity-Check (QC-LDPC) engine to combat bit-flip errors caused by storage wear-out. A zero-latency parallel encoder and a 64-lane folded Layered Min-Sum decoder were designed for high-throughput data recovery. Architectural optimizations, including Q5.2 fixed-point quantization and combinational early-termination, were applied to minimize hardware overhead. 
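Fixed-point quantization such as the Q5.2 format mentioned above trades precision for hardware cost: two fractional bits give a step of 0.25, and out-of-range values saturate rather than wrap. A minimal sketch (the exact sign/width convention of the project's Q5.2 format is an assumption here):

```python
def quantize_q5_2(x):
    """Quantize to a toy Q5.2 fixed-point value: 2 fractional bits
    (step 0.25), saturated to an assumed signed range of [-16.0, 15.75]."""
    step = 0.25                       # 2 ** -2
    q = round(x / step) * step        # snap to the nearest representable value
    return max(-16.0, min(15.75, q))  # saturate instead of wrapping
```

For example, 1.13 quantizes to 1.25, while 100.0 saturates to 15.75; in the decoder, keeping messages this narrow is what shrinks the datapath and memory width.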
The RTL implementation was rigorously verified against a MATLAB golden model, demonstrating robust error-correction performance.\n<h2>\n\t\tExternal Competition Projects\n\t<\/h2>\n\t<p><strong>Discover ECE students&#8217; outstanding achievements in competitions beyond NUS! <\/strong><\/p>\n<h2>\n\t\tKnowledge Sharing Projects\n\t<\/h2>\n\t\t<nav>\n\t\t\t<ul>\n\t\t\t\t\t\t\t\t<li data-index=\"0\">\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t<a href=\"void(0);\">ELExAI Chatbot\n\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/h5>\n\t\t\t\t<\/li>\n\t\t\t\t\t\t\t<\/ul>\n\t\t<\/nav>\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t\t\t\t\t\t\tELExAI Chatbot\n\t\t\t\t\t\t\t\t\t\t\t<\/h5>\n\t\t\t\t\t<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/04\/ELExAI_Guang-Sheng-Tay.png\" alt=\"\" width=\"458\" height=\"261\" \/><\/p>\nELExAI is an intelligent Telegram-based admissions chatbot designed to support NUS ECE outreach by providing accurate, customised responses to frequently asked questions. Built on AWS, ELExAI operates seamlessly in both private and group chats, including admissions Telegram groups. The system uses a Retrieval\u2011Augmented Generation (RAG) approach to ground AI-generated answers in official sources such as the ECE FAQ website. 
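The RAG step can be sketched in miniature: retrieve the most relevant source passage, then prepend it to the model's prompt so the answer stays grounded. Here simple word overlap stands in for the embedding similarity a production system would use (function names and prompt wording are illustrative, not ELExAI's actual implementation):

```python
def retrieve(query, documents, k=1):
    """Rank documents by word overlap with the query and return the top k."""
    q_words = set(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]


def build_prompt(query, documents):
    """Ground the answer: instruct the model to use only retrieved context."""
    context = "\n".join(retrieve(query, documents, k=1))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

A query about admissions pulls the admissions FAQ passage to the top, and the generated reply is constrained to that official text rather than to whatever the model happens to remember.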
By delivering context-aware and personalised replies, ELExAI reduces repetitive staff workload while improving information accessibility and engagement for prospective students.\n\t\t\t\t\t<!-- \/content -->\n\t<!-- \/tabs -->\n<h2>\n\t\tStudent-Run Team\n\t<\/h2>\n\t<p><strong>Explore existing student-led project teams\u00a0<\/strong><\/p>\n\t\t<nav>\n\t\t\t<ul>\n\t\t\t\t\t\t\t\t<li data-index=\"0\">\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t<a href=\"void(0);\">NUS Bumblebee\n\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/h5>\n\t\t\t\t<\/li>\n\t\t\t\t\t\t\t\t<li data-index=\"1\">\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t<a href=\"void(0);\">NUS Calibur Robotics\n\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/h5>\n\t\t\t\t<\/li>\n\t\t\t\t\t\t\t\t<li data-index=\"2\">\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t<a href=\"void(0);\">NUS Formula SAE Race Car\n\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/h5>\n\t\t\t\t<\/li>\n\t\t\t\t\t\t\t\t<li data-index=\"3\">\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t<a href=\"void(0);\">NUS Mars Rover\n\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/h5>\n\t\t\t\t<\/li>\n\t\t\t\t\t\t\t<\/ul>\n\t\t<\/nav>\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t\t\t\t\t\t\tNUS Bumblebee\n\t\t\t\t\t\t\t\t\t\t\t<\/h5>\n\t\t\t\t\t<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/DSC04493-3-1024x575.jpg\" alt=\"\" width=\"465\" height=\"261\" \/>Bumblebee is a student-run, multi-disciplinary robotics team. 
The Bumblebee team designs and builds Autonomous Underwater Vehicles (AUVs) and Autonomous Surface Vessels (ASVs) that navigate independently, from the shoreline and the water surface to deep waters.<\/p>\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t\t\t\t\t\t\tNUS Calibur Robotics\n\t\t\t\t\t\t\t\t\t\t\t<\/h5>\n\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2025\/04\/photo_2025-04-01_15-27-52.jpg\" alt=\"\" width=\"435\" height=\"261\" \/>NUS Calibur Robotics is a student-led competitive team at the National University of Singapore, bringing together talent from various disciplines to design, build, and program autonomous combat robots. Established in 2019, the team competes in the prestigious DJI RoboMaster University Championships, showcasing advanced robotics and control systems. Calibur Robotics aims to field a full fleet of seven specialized robots for the global championships in China, continually pushing the boundaries of student-led robotics innovation.\n<p>\u00a0<\/p>\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t\t\t\t\t\t\tNUS Formula SAE Race Car\n\t\t\t\t\t\t\t\t\t\t\t<\/h5>\n\t\t\t\t\t<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/R25MIS-scaled.jpg\" alt=\"\" width=\"391\" height=\"261\" \/>The National University of Singapore Formula SAE (NUS FSAE) team comprises passionate and talented undergraduates from the NUS Innovation and Design Program. 
The team designs, builds, tests, and races a formula-style race car every year to compete in FSAE Michigan, one of the toughest inter-varsity competitions.<\/p>\n<p>With a rich history from combustion engines to cutting-edge electric vehicles, their journey has been a testament to relentless dedication, teamwork, and a pursuit of excellence.<\/p>\n\t\t\t\t\t<h5>\n\t\t\t\t\t\t\t\t\t\t\t\tNUS Mars Rover\n\t\t\t\t\t\t\t\t\t\t\t<\/h5>\n\t\t\t\t\t<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2026\/03\/nus-mars-rover.jpg\" alt=\"\" width=\"348\" height=\"261\" \/>The NUS Mars Rover is a multidisciplinary group of undergraduate students from the NUS Innovation &amp; Design Program. Our mission is to design and build an autonomous Mars rover to compete annually in the University Rover Challenge, held at the Mars Desert Research Station in the USA.<\/p>\n\u00a0\nWe are passionate about pushing the boundaries of robotics and engineering, leveraging innovative solutions to tackle the challenges of planetary exploration. 
Through collaboration and hands-on learning, we aim to develop a robust and versatile rover capable of performing complex tasks in a Mars-like environment.\n\t\t\t\t\t<!-- \/content -->\n\t<!-- \/tabs -->\n<h2>\n\t\tJudges\n\t<\/h2>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Tham-Chen-Khong-scaled-circle.jpg\" alt=\"Tham Chen Khong\" title=\"Tham Chen Khong\" \/>\n\t\t\t\t \n\t\t<h3>A\/Prof Tham Chen Khong<\/h3>Associate Professor\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Jun-Wei-1-circle.jpg\" alt=\"Jun Wei\" title=\"Jun Wei\" \/>\n\t\t\t\t \n\t\t<h3>Dr Lee Jun Wei<\/h3>Principal Research Engineer at DSO\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Abhishek-circle.jpg\" alt=\"Abhishek\" title=\"Abhishek\" \/>\n\t\t\t\t \n\t\t<h3>Dr Abhishek Rai<\/h3>Assistant Professor\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Wong-Jit-Chin-circle.png\" alt=\"Mr Wong Jit Chin (Principal Research Engineer  at DSO)\" title=\"Wong Jit Chin\" \/>\n\t\t\t\t \n\t\t<h3>Mr Wong Jit Chin<\/h3>Principal Research Engineer at DSO\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" 
target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n<h2>\n\t\tJudges for Espressif Competition\n\t<\/h2>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/junius-pun-circle.jpg\" alt=\"junius pun\" title=\"junius pun\" \/>\n\t\t\t\t \n\t\t<h3>Mr Junius Pun<\/h3>Software Engineer at Espressif\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Yogesh-Mantri-circle.jpg\" alt=\"Yogesh Mantri\" title=\"Yogesh Mantri\" \/>\n\t\t\t\t \n\t\t<h3>Mr Yogesh Mantri<\/h3>Software Architect at Espressif\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Tang-Kok-Zuea-circle.jpg\" alt=\"Dr Tang Kok Zuea\" title=\"Tang Kok Zuea\" \/>\n\t\t\t\t \n\t\t<h3>Dr Tang Kok Zuea<\/h3>Senior Lecturer\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/bb-plugin\/cache\/Xu-Yuecong-circle.jpg\" alt=\"Xu Yuecong\" title=\"Xu Yuecong\" \/>\n\t\t\t\t \n\t\t<h3>Dr Xu Yuecong<\/h3>Lecturer\t\t\t\t\t\t\t\n\t\t\t\t<a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a><a href=\"\" target=\"_self\">\t\t\t\t\n\t\t<\/a>\t\t\t\t \n<h2>\n\t\tCountdown to ExCEllence 
2026\n\t<\/h2>\n\t\t\t\tDay\n\t\t\t\tHour\n\t\t\t\tMinute\n\t\t\t\tSecond\n\n","protected":false},"excerpt":{"rendered":"<p>About ExCEllence (ECE Project Showcase) The ExCEllence (ECE Project Showcase) is an annual flagship event by NUS ECE. The showcase celebrates the ingenuity, technical depth, and real-world impact of student-driven engineering projects across diverse domains &#8211; including artificial intelligence, communications, robotics, microelectronics, energy systems, and emerging interdisciplinary technologies. Following a successful inaugural run, the 2026 [&hellip;]<\/p>\n","protected":false},"author":8,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"site-sidebar-layout":"no-sidebar","site-content-layout":"page-builder","ast-site-content-layout":"full-width-container","site-content-style":"unboxed","site-sidebar-style":"unboxed","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"disabled","ast-breadcrumbs-content":"","ast-featured-img":"disabled","footer-sml-layout":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-22804","page","type-page","status-publish","hentry"],"acf":[],"_links":{"self":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/pages\/22804","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/users\/8"}],"replies":[{"embeddable":true,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/comments?post=22804"}],"version-history":[{"count":4,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/pages\/22804\/revisions"}],"predecessor-version":[{"id":23086,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/pages\/22804\/revisions\/23086"}],"wp:attachment":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/media?parent=22804"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}