Weather App

Project Description

The Weather App is an Android application designed to deliver accurate, current weather information to users. It uses the Tomorrow.io API to fetch real-time weather data for locations around the world.
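At a high level, the app builds a request against the weather API and pulls a few display fields out of the JSON response. The sketch below illustrates that flow in Python; the endpoint path, parameter names, and response shape are assumptions based on Tomorrow.io's v4 API and should be checked against the official docs, and the function names are mine.

```python
from urllib.parse import urlencode

def build_realtime_url(city: str, api_key: str) -> str:
    """Build a current-conditions request URL (endpoint path is an assumption)."""
    params = {"location": city, "units": "metric", "apikey": api_key}
    return "https://api.tomorrow.io/v4/weather/realtime?" + urlencode(params)

def parse_realtime(payload: dict) -> dict:
    """Extract a few display fields from a realtime response (assumed shape)."""
    values = payload["data"]["values"]
    return {
        "temperature": values["temperature"],
        "humidity": values["humidity"],
        "wind_speed": values["windSpeed"],
    }

# Example response trimmed to just the fields used above.
sample = {"data": {"values": {"temperature": 21.5, "humidity": 48, "windSpeed": 3.2}}}
print(parse_realtime(sample))
```

In the actual app this request/parse step runs in the Node.js backend; the sketch only shows the data flow, not the production code.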

Key Features

  • City Search: Users can swiftly search for cities with a search bar featuring an auto-complete option for quick and easy location selection.

  • Weather Display: Once a city is chosen, the app retrieves and presents relevant weather data in an intuitive format that is easy to read and understand.

  • Visualization: The app employs Highcharts to create visually appealing charts that illustrate weather trends, enhancing user comprehension of the data.

  • Social Integration: Users can share current weather conditions directly to X (formerly Twitter) from within the app, promoting social engagement and information sharing.
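The search bar's auto-complete behavior boils down to a case-insensitive prefix match over a city list. In the app this runs client-side in Angular; the Python below is only an illustration of the matching logic, with hypothetical names.

```python
def autocomplete(query: str, cities: list[str], limit: int = 5) -> list[str]:
    """Return up to `limit` cities whose names start with the query, case-insensitively."""
    q = query.strip().lower()
    if not q:
        return []
    return sorted(c for c in cities if c.lower().startswith(q))[:limit]

cities = ["Montreal", "Madrid", "Melbourne", "Mumbai", "Moscow", "Boston"]
print(autocomplete("mo", cities))
```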

Technology Stack

The Weather App is developed using the MEAN stack, which includes:

  • MongoDB: Used for data storage, ensuring that user and weather data is managed efficiently.

  • Express.js: Serves as the web application framework, facilitating the handling of server-side logic.

  • Angular: Powers the front-end user interface, providing a responsive and interactive experience for users.

  • Node.js: Acts as the server-side environment, enabling scalable network applications capable of handling multiple requests simultaneously.

This robust combination of technologies allows the Weather App to deliver real-time weather updates effectively and efficiently to users.

Demo of the core features of the app

SWEET - Weakly Supervised Person Name Extraction for Fighting Human Trafficking

This paper presents a weak supervision pipeline for extracting person names from noisy human trafficking data, with an emphasis on escort advertisements. The objective is to improve the accuracy and efficiency of information extraction in contexts where traditional methods struggle because of the noise inherent in the data.

A significant contribution of this study is the introduction of HTGen, a new dataset generated synthetically using GPT technology. HTGen consists of a diverse collection of escort advertisements, intentionally crafted to support the advancement of research in this area. By providing a well-structured dataset, the paper aims to facilitate deeper exploration and understanding within both academic and professional circles focused on human trafficking issues.

The weak supervision approach leverages models capable of learning from the noisy data, allowing for robustness in uncovering relevant person names. By addressing the challenges posed by the variability and inconsistencies found in escort advertisements, this pipeline can serve as a valuable tool in efforts to combat human trafficking and improve the overall efficacy of data-driven interventions.
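The core weak-supervision idea can be illustrated with a few noisy labeling functions whose votes are combined per token. This is a toy majority-vote sketch of the general technique, not the SWEET pipeline itself (which aggregates many such sources with a learned label model); the specific functions and lexicon here are invented for illustration.

```python
import re

# Each labeling function votes 1 (person name), 0 (not a name), or None (abstain).
def lf_capitalized(token: str):
    return 1 if re.fullmatch(r"[A-Z][a-z]+", token) else None

def lf_common_word(token: str):
    stop = {"Call", "Visit", "New", "Available"}  # capitalized non-name words
    return 0 if token in stop else None

def lf_name_lexicon(token: str):
    lexicon = {"Anna", "Maria", "Sophie"}  # tiny illustrative name list
    return 1 if token in lexicon else None

LFS = [lf_capitalized, lf_common_word, lf_name_lexicon]

def label(token: str) -> int:
    """Majority vote over non-abstaining labeling functions; ties and no votes -> 0."""
    votes = [v for lf in LFS if (v := lf(token)) is not None]
    return 1 if votes and sum(votes) > len(votes) / 2 else 0

tokens = "Call Anna now".split()
print([(t, label(t)) for t in tokens])
```

A learned label model replaces the naive majority vote by weighting each labeling function by its estimated accuracy, which is what makes the approach robust to individually unreliable sources.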

Code, Poster, Paper

Image Classification from Scratch and Scaling Using Multicore Matrix Multiplication

During my volunteering experience at the Prometheus Lab at McGill, I engaged in a project focused on image classification from scratch. The goal was to develop a Java-based vision system that could effectively classify the type of room a robot was navigating.

The project involved building a custom image classification program entirely from the ground up. To optimize the handling of images captured by the robot, I implemented a batching system, which allowed multiple images to be processed together and reduced overall running time.
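The batching idea is simple: split the captured images into fixed-size chunks and process each chunk in one pass. The original system was written in Java; the sketch below shows the same chunking logic in Python for brevity, with hypothetical names.

```python
def batches(items, batch_size):
    """Yield consecutive fixed-size chunks; the last batch may be smaller."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

images = list(range(10))  # stand-ins for captured frames
print([len(b) for b in batches(images, 4)])
```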

I also integrated serialization into the codebase, enabling both the model and the training data to be saved and loaded. This significantly improved the workflow, making it easier to manage the datasets used during the training and testing phases.

To scale the application for larger datasets, I utilized multicore matrix multiplication packages. This approach leveraged parallel processing capabilities, which enhanced computational efficiency and supported the handling of increased volumes of training and testing data.
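The parallelization strategy is to partition the output rows of the matrix product into blocks and compute each block on a separate worker. The sketch below shows that decomposition in Python; note this is illustrative only (the project used Java multicore matrix libraries, and pure-Python threads will not speed up CPU-bound arithmetic because of the GIL), but the row-block structure is the same.

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_rows(a, b, rows):
    """Compute the given output rows of a @ b with naive triple loops."""
    k, n = len(b), len(b[0])
    return [[sum(a[r][j] * b[j][c] for j in range(k)) for c in range(n)] for r in rows]

def parallel_matmul(a, b, workers=4):
    """Split output rows into contiguous blocks, one block per worker."""
    m = len(a)
    step = max(1, (m + workers - 1) // workers)
    blocks = [range(i, min(i + step, m)) for i in range(0, m, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda rows: matmul_rows(a, b, rows), blocks)
    return [row for part in parts for row in part]  # map preserves block order

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(parallel_matmul(a, b))
```

Row-block partitioning is attractive because each worker reads all of `b` but writes a disjoint slice of the result, so no synchronization is needed on the output.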

These efforts resulted in a robust image classification system that could rapidly and accurately identify the type of room from the robot's visual inputs, meeting the project's objectives and applying advanced programming techniques to a practical machine learning problem.

Poster

Brain Tumor Segmentation with Attention-Based U-Net

This project focused on enhancing brain tumor segmentation through an improved U-Net architecture. The primary modifications involved the integration of Squeeze-and-Excitation Blocks and Convolutional Block Attention Modules (CBAM) into the decoder sections of the original U-Net model.

The Squeeze-and-Excitation Block enhances the representational power of the network by adaptively recalibrating channel-wise feature responses. This approach allows the model to focus on more relevant features while suppressing less informative ones, thus facilitating better segmentation outcomes.
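The squeeze-excite-scale sequence can be sketched in a few lines. The project itself used a deep learning framework; the NumPy version below only demonstrates the mechanism, with random illustrative weights rather than trained ones.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, w2):
    """Squeeze-and-Excitation for x of shape (C, H, W).
    Squeeze: global average pool per channel -> (C,).
    Excite: reduce-then-restore dense layers with a sigmoid gate -> (C,).
    Scale: reweight each channel of x by its gate value."""
    z = x.mean(axis=(1, 2))                     # squeeze: (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))   # excite with ReLU bottleneck: (C,)
    return x * s[:, None, None]                 # scale channels

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.normal(size=(C, H, W))
w1 = rng.normal(size=(C // r, C))   # reduction layer (ratio r)
w2 = rng.normal(size=(C, C // r))   # restoration layer
y = se_block(x, w1, w2)
print(y.shape)
```

Because each gate lies in (0, 1), the block can only attenuate channels, never amplify them, which is what makes the recalibration stable to insert into an existing decoder.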

Similarly, the CBAM attention module introduces a dual attention mechanism, which sequentially infers attention maps along the channel and spatial dimensions. This helps the model to better emphasize critical regions within the input images that contribute significantly to effective segmentation.
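The channel-then-spatial ordering of CBAM can also be sketched in NumPy. This is a heavily simplified illustration: the paper's shared MLP for channel attention is reduced to scalar weights and its 7x7 convolution for spatial attention to a weighted sum, so only the two-stage gating structure matches the real module.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(x, w_avg=1.0, w_max=1.0):
    """Simplified CBAM for x of shape (C, H, W).
    Channel attention: per-channel gate from average- and max-pooled descriptors.
    Spatial attention: per-pixel gate from channel-wise average and max maps."""
    # --- channel attention ---
    avg_c = x.mean(axis=(1, 2))
    max_c = x.max(axis=(1, 2))
    mc = sigmoid(w_avg * avg_c + w_max * max_c)   # (C,)
    x = x * mc[:, None, None]
    # --- spatial attention (applied to the channel-refined features) ---
    avg_s = x.mean(axis=0)
    max_s = x.max(axis=0)
    ms = sigmoid(w_avg * avg_s + w_max * max_s)   # (H, W)
    return x * ms[None, :, :]

x = np.random.default_rng(1).normal(size=(4, 5, 5))
y = cbam(x)
print(y.shape)
```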

The combination of these attention mechanisms within the U-Net decoder has demonstrated noteworthy improvements in performance metrics on the brain tumor segmentation task, showcasing the efficacy of incorporating advanced attention techniques in neural network designs. The research findings from this project were later published as a full-length paper, contributing to the ongoing exploration of deep learning applications in medical imaging.

Paper

Rocket Engine DAQ

During my internship at USC Liquid Propulsion Laboratory, I had the opportunity to contribute to the Data Acquisition (DAQ) team. My primary focus was on implementing Python code that facilitated the retrieval and visualization of engine data. This data was collected from LabJack devices integrated within the DAQ monitoring system.

My responsibilities included developing scripts that would efficiently fetch real-time engine data, ensuring accuracy and reliability. I utilized libraries such as Matplotlib and Pandas for data visualization, enabling our team to analyze performance metrics effectively. This experience not only enhanced my programming skills but also deepened my understanding of the complexities involved in monitoring rocket engine performance.
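A typical step in that pipeline is loading streamed sensor samples into a pandas series and smoothing them before plotting. The sketch below uses canned samples in place of a live LabJack read, and the column names are hypothetical; in practice the smoothed frame would then be handed to Matplotlib for display.

```python
import pandas as pd

def smooth_readings(samples, window=3):
    """Index raw (timestamp, value) samples as a time series and apply a
    rolling mean to suppress sensor noise before plotting."""
    df = pd.DataFrame(samples, columns=["t", "pressure"]).set_index("t")
    df["smoothed"] = df["pressure"].rolling(window, min_periods=1).mean()
    return df

# Canned samples standing in for values streamed from a LabJack channel.
samples = [(0.0, 100.0), (0.1, 104.0), (0.2, 98.0), (0.3, 102.0)]
df = smooth_readings(samples)
print(df["smoothed"].tolist())
```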

Collaboration with team members was crucial as we worked together to troubleshoot any issues that arose within the data collection process. Overall, this internship provided me with a solid foundation in data acquisition systems and their application in aerospace engineering.