Team Leader, Robotics, IEEE Senior Member, A/Editor IEEE RA-L
Research Leader, Visual-AI, Intelligent Automation (IA) Group
Multi-Sensor Fusion & Machine Vision
AI and Machine Learning
Energy and Machine Learning
HDR Candidate
Data-efficient Machine Learning for Robotic Perception and Manipulation
HDR Candidate
Extended Robotic Reality for Robot Learning
HDR Candidate
Robotic Intelligence for Construction Waste Sorting
HDR Candidate
Vision Systems for Process Control in Metal Additive Manufacturing
HDR Candidate
Developing the Next Generation Materials Science Lab
HDR Candidate
Additive Manufacturing
Human-Centered Automation
Aerospace & Advanced Manufacturing
Manufacturing & Materials
Deploying robots in collaboration with humans is seen as an enabler of major gains in construction productivity across a variety of tasks, such as digital twin creation, quality/compliance inspection, progress monitoring and automated interior finishing.
IEEE Transactions on Industrial Electronics, 2020; IROS 2020; Mechanism and Machine Theory; Robotics and Computer-Integrated Manufacturing; IFToMM Symposium on Robot Design, Dynamics and Control
XR devices offer a range of spatial perception capabilities that can enhance human-robot collaboration. From 6DoF tracking to depth perception, gesture recognition, and environmental mapping, these features enable applications spanning human-robot interaction in manufacturing, healthcare, and beyond. XR devices continue to push the boundaries of spatial perception and redefine how we interact with robots.
Robotic additive manufacturing is a technology that deposits material to fabricate complex parts. Deviations in part geometry during the process can negatively affect the shape and size of the final manufactured object. In-process spatio-temporal 3D reconstruction, also known as 4D reconstruction, allows for early detection of deviations from the design in robotic additive manufacturing, thus providing the opportunity to rectify them at an early stage and making the process more robust, efficient and productive.
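The early-detection idea can be sketched in a few lines: compare in-process measurements against the design and flag layers that drift beyond a tolerance, so correction can happen before the deviation compounds. This is a minimal illustration with hypothetical names (`flag_deviations`, per-layer height values); a real 4D-reconstruction pipeline would compare full point clouds against the CAD model rather than scalar heights.

```python
# Minimal sketch (hypothetical interface): flag layer-height deviations
# by comparing measured heights against the designed layer heights.
def flag_deviations(design_heights, measured_heights, tolerance=0.1):
    """Return indices of layers whose measured height deviates from the
    design by more than `tolerance` (same units as the heights)."""
    flagged = []
    for i, (design, measured) in enumerate(zip(design_heights, measured_heights)):
        if abs(measured - design) > tolerance:
            flagged.append(i)
    return flagged

# Example: the third layer (index 2) over-deposits by 0.3 mm and is
# flagged immediately, before later layers build on the error.
print(flag_deviations([1.0, 1.0, 1.0, 1.0], [1.02, 0.95, 1.3, 1.05]))  # [2]
```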
Our study, conducted in Melbourne's suburban areas, highlights two main types of soiling on photovoltaic (PV) panels: dust and bird droppings. Dust accumulation results from natural wind and climate conditions, while bird droppings, which we specifically distinguished from debris like leaves or shadows, can significantly impact solar efficiency even in small amounts. The size and precise location of bird droppings are especially critical, as they can lead to "hotspots" that reduce power output and, over time, cause permanent damage to the panel. The cleaning methods differ as well: dust can be removed with low-pressure water jets, but bird droppings require specialized cleaning solutions for effective removal. Our dataset was carefully curated to reflect these distinct cleaning needs.

The dataset reveals that soiling on PV panels appears in a ratio of approximately 1:2 for dust to bird droppings, which introduces a class imbalance. Bird droppings also present a unique challenge for detection: they are small, irregular in shape, and typically cover less than 2% of the panel's surface area, in contrast to larger, more uniform dust patches. Our dataset includes polycrystalline panels with a blue background, where bird droppings often appear as white or gray spots. This dataset provides an invaluable resource for developing advanced detection methods for soiling, contributing to optimized cleaning schedules and enhancing solar panel efficiency.