MADELEINE WEAVER


ABOUT ME

Candidate for a Research Master's Degree with a Concentration in Robotics at Carnegie Mellon University

My name is Madeleine Weaver, and I am pursuing a Master's degree in Mechanical Engineering with a concentration in Robotics at Carnegie Mellon University. I earned an associate degree in Electrical and Computer Engineering at Massachusetts Bay Community College, during which time I participated in an NSF-funded research program at the University of Southern California's MxR Lab, where I worked on a team building an augmented reality environment to study ethical decision-making.

I earned my bachelor's degree in Electrical and Computer Engineering from Northeastern University, where I worked in Professor Alireza Ramezani's SiliconSynapse Lab as an undergraduate research assistant helping to build a bio-mimetic bat robot. I was also a member of finalist teams in NASA's Big Idea Challenge two years in a row, addressing specific challenges in deploying robotic systems in space. While at Northeastern, I completed co-ops as an Animatronics Engineer at Hasbro and as an Electrical Engineer at FGC Plasma Solutions. While pursuing my Master's degree at Carnegie Mellon University, I joined the AiPEX Lab, where I study artificial intelligence and machine learning applied to robotic control and design. I have presented my research at several global venues, including companies in India and Japan as well as CMU Africa. My interests include robotics and the intersection of art and technology.



PORTFOLIO


MY WORK

Here are some photos of my past projects.

Humanoid Face Robot
AiPEX Lab, Carnegie Mellon University

While many deep learning methods exist in the computer vision domain to generate and retarget lip-synchronization motion to 3D models, there is presently no analogous method for retargeting these 3D animations to physical robots. In this work in progress, we are developing a physically constrained method to co-design hardware and a controller that enable natural animation of a robot from input speech audio.

<b>Reinforcement Learning Network Training</b><p><em>Depicted is an example of our actor-critic RL agent iteratively changing the kinematic design of the robot and comparing the generated 3D mesh to a ground-truth 3D mesh representing our desired actuation.</em></p>
<b>Plot of the Robot Face Mesh with Actuator Placement Indicated</b><p><em>This figure shows a naive placement of our actuators according to the FACS model, which many works cite when motivating the kinematic design of humanoid face robots. These positions, the angle at which force is applied to each point, and the amplitude of the applied force are sent to our simulator. The simulator output is then compared to our ground truth to calculate the network reward, and the parameters are iteratively changed to optimize our simulation's likeness to the ground-truth animation.</em></p>
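The compare-and-update loop described in the caption above can be sketched roughly as follows. This is a minimal illustration, not the lab's actual code: the mesh representation, the reward shape, and the naive random perturbation standing in for the actor-critic update are all assumptions.

```python
import numpy as np

def actuation_reward(simulated_mesh, target_mesh):
    """Negative mean vertex-to-vertex distance between the simulated
    face mesh and the ground-truth mesh (higher is better).
    Both meshes are (N, 3) arrays of corresponding vertex positions."""
    per_vertex_error = np.linalg.norm(simulated_mesh - target_mesh, axis=1)
    return -float(per_vertex_error.mean())

def perturb_actuators(params, step, rng):
    """Illustrative update: jitter the actuator parameters (positions,
    angles, amplitudes flattened into one vector). In the actual work an
    actor-critic agent proposes these updates instead of random noise."""
    return params + step * rng.standard_normal(params.shape)
```

In the real pipeline the simulated mesh would come from the physics simulator given the current actuator parameters, and the agent would keep the parameter set whose simulated mesh earns the highest reward against the ground-truth animation.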
<b>Physical Robot</b><p><em>This illustrates the cable-driven actuation of our physical robot.</em></p>
<b>Research Poster</b><p><em>This research poster describes the full motivation and project pipeline and was presented at CMU Africa to encourage students to form collaborations with the Pittsburgh-based AiPEX Lab.</em></p>

Jellyfish Robot
Bioinspired Robotics Course Project, Carnegie Mellon University

As the effects of climate change begin to alter ecosystems in ways that are not immediately apparent, it is important to observe environments at risk. Researchers typically use robotic underwater vehicles to track and study aquatic life. These robots are robust to incident underwater currents, but the wake from their propellers may disrupt the very ecosystem the robot intends to study. To make aquatic environmental monitoring less obtrusive, we propose using a jellyfish-inspired robot to study marine life. We hypothesize that the wake from a jellyfish-inspired robot is less turbulent and therefore disturbs the ecosystem less. We show that an electromagnet-driven propulsion system produces a less turbulent wake than propeller-based systems.

<b>Results</b><p><em>Our simulated jellyfish bell was too large to be practical to prototype using digital fabrication methods. By laser-cutting acrylic guides defining the cross-sectional shape of our CAD model, we were able to sculpt a model from WED clay and manually create a fiberglass mold.</em></p>
<b>Jellyfish Bell Simulation</b><p><em>To inform the design of our jellyfish bell, we iteratively changed the parameters of our CAD model and simulated its motion given the calculated force applied by the electromagnetic actuator. We selected our design based on how well it displaced water while resisting deformation enough to return to its initial shape after the load was removed.</em></p>
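The selection criterion in the caption above, trading water displacement against the bell's ability to spring back, can be sketched as a simple scoring function. The metric names and the multiplicative score are illustrative assumptions, not the project's actual selection code.

```python
def select_bell_design(candidates):
    """Pick the bell design that best balances water displaced per
    stroke against elastic recovery. Each candidate is a dict of
    hypothetical simulated metrics: 'displaced_volume' (arbitrary
    units) and 'shape_recovery' (fraction of the initial shape
    recovered after the load is removed, in [0, 1])."""
    def score(c):
        # A stiff bell that displaces little water and a floppy bell
        # that never recovers both score poorly.
        return c["displaced_volume"] * c["shape_recovery"]
    return max(candidates, key=score)
```

Each candidate's metrics would come from one run of the simulated bell under the calculated electromagnetic actuator force.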
<b>Small-Scale Prototype Testing</b><p><em>We submerged a small-scale prototype of our jellyfish robot in water to measure the turbulence created by its propulsion mechanism. This measurement was compared against the turbulence created by propeller-propelled robots; the results are quantified in our poster.</em></p>
<b>Demo Poster</b>

Mouth Robot
Northeastern University, Carnegie Mellon University
Humanoid robots are increasingly being developed for healthcare, education, and service applications. One unsolved problem in humanoid robotics is achieving high-fidelity lip synchronization. One potential approach is to use machine learning (ML) methods to mechanically actuate human speech sounds, linking auditory and visual output. The present work gives recommendations for improving humanoid robot hardware and software to better mimic human speech. The method samples audio from the robot and evaluates it with a convolutional neural network (CNN) trained on human audio to determine whether the robot's audio signals are similar to human audio signals. We determined that the limited pitch and tone range of both the variable-pitch pneumatic sound generator and the deformable resonance tube would need to be improved. We also recommended reinforcement learning for future research to explore more of the hardware's capacity to produce a more human-like sound.
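The audio-comparison step described above can be sketched as follows. As a stand-in for the trained CNN (and for a true mel spectrogram), this illustration compares plain magnitude spectrograms with cosine similarity; the frame sizes and function names are assumptions, not the project's actual pipeline.

```python
import numpy as np

def spectrogram(signal, frame=256, hop=128):
    """Magnitude spectrogram via a short-time FFT with a Hann window;
    a crude stand-in for the mel-spectrogram features fed to the CNN."""
    frames = [signal[i:i + frame]
              for i in range(0, len(signal) - frame + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames) * np.hanning(frame), axis=1))

def similarity(robot_audio, human_audio):
    """Cosine similarity between flattened spectrograms; in the actual
    work a CNN classifier trained on human audio plays this role."""
    a = spectrogram(robot_audio).ravel()
    b = spectrogram(human_audio).ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

A score near 1.0 would indicate the robot's output occupies roughly the same spectral shape as the human recording; low scores flag the pitch and tone limitations noted above.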

<b>Mouth Robot</b><p><em>We collected the robot output data using a microphone and a data acquisition script written in MATLAB. To support our goal of controlling the robot to correctly say the four tonal variants of the word 'ma' in Mandarin, each trial comprised a set of three vocal pitches per speech tone.</em></p>
<b>Soft Robotic Tongue</b><p><em>This soft robotic tongue was constructed to add tonal complexity and range to the robotic mouth hardware. It was controlled using a pneumatic controller donated by the Soft Robotics Toolkit at Harvard University.</em></p>
<b>Results</b>
<b>Mel Spectrogram</b>

Bat Bot
SiliconSynapse, Northeastern University

The Bat Bot was designed and fabricated to test the hypothesis that a robot could fly by flapping its wings and, in doing so, interact with people more safely than a propeller-driven aerial robot. My contributions included laser-cutting PCB traces onto the carbon-fiber chassis to meet weight and drag requirements, designing the layout of a flexible PCB that doubled as the bat's wing, soldering surface-mount components to the PCB under a microscope, and writing the skeleton program to access the actuators and sensors.

<b>Bat Bot Chassis</b><p><em>My contribution was laser-cutting copper traces connecting the motors and radio transmitter to the battery, eliminating the need for a PCB and reducing the weight of the robot. To complete this task, I edited the original chassis to allow a path for the traces and arranged the hardware around the target center of mass. I measured the hardware meticulously to bring the traces directly to the pads so I could connect them to the hardware with solder bridges.</em></p>
<b>Bat Bot Wing</b><p><em>I soldered each component onto the flexible membrane PCB, which was designed to double as the wing of the Bat Bot. Having no prior experience soldering components this small, I did extensive online research and watched several training videos. I then programmed the STM32 processor in C to send commands to the sensors and drivers over an SPI interface.</em></p>
<b>Layout Design for Flexible PCB</b>
<b>A Version of the Bat Bot</b><p><em>This is an older version of the bat robot, which we worked to improve.</em></p>

Augmented Reality Environment
Mixed Reality Lab, University of Southern California

To test the effect of the fidelity and immersion of an augmented reality environment on ethical decision-making, we created a virtual environment in which the user played a role in the trolley problem. My individual contribution was designing and fabricating a physical lever that used a motorcycle damper to enable discrete control of the lever's stiffness. The lever was outfitted with motion trackers so that a corresponding virtual lever could move realistically. I also 3D-modeled several scene components, such as hard hats and vests, in Blender.

<b>Our Best Poster Honorable Mention Award</b>
<b>Our Human Model</b><p><em>We used 3D scans of interns for our human models, and I modeled hard hats, ear protection, and safety vests in Blender.</em></p>
<b>Real Lever Juxtaposed with Virtual Lever</b>
<b>Integrated Motorcycle Damper</b><p><em>The damper allowed the study to include discrete values of lever resistance.</em></p>

Horseshoe Crab Robot
Fab Lab, Dassault Systemes

This is a project I worked on independently while interning at the Dassault Systemes FabLab. I was inspired to design a horseshoe crab robot after attending a BattleBots-style event held by Mass Destruction, a robot-fighting league, at the Artisan's Asylum, a makerspace where I used to volunteer. I noticed that the simple wedge-shaped robots were the most effective, despite many having no weapon at all, because they could push the opposing robot against a wall, where its own weapon would destroy itself. Just as horseshoe crabs use their tails in the wild, this robot can turn itself right side up to avoid similar self-destruction.

<b>Horseshoe Crab Robot Shell</b>
<b>Pneumatic Silicone Actuator</b><p><em>I 3D-printed three-part molds to make several iterations of the actuator. The one that worked best is a composite of two different designs: one that curled and one that lengthened when inflated.</em></p>
<b>Actuator Inflated to Lift Tail</b>
<b>Airflow Control Mechanism</b><p><em>Milled to follow the contours of the inside of the shell.</em></p>


PERSONAL PROJECTS

1985 Honda Aero 80 Restoration
Pittsburgh, PA

I purchased this scooter on Craigslist from the neighbor of an elderly woman who used it to get back and forth to her mailbox before she passed away. I intentionally chose a scooter in really rough shape to teach myself to repair: I could not possibly make it worse, and I could learn to fix basically everything. Repairs included de-rusting and lining the fuel tank; replacing the bystarter/choke, ignition coil, front and rear brakes, piston rings, cylinder gaskets, and cracked fuel and air hoses; cleaning the carburetor; and repairing cracked fairings and repainting. Despite all this work, it has only ever run for days at a time. The repairs are still in progress while I wait for it to return from a machine shop, where the cylinder is being bored. I believe the ghost of its previous owner still haunts it.

<b>The day I brought this scooter home</b>
<b>The day I actually got it to run</b>
<b>Part of the painting process</b>
<b>Painted and assembled scooter</b>

Collin's Head
Providence, RI

I used body-safe silicone to make a mold of a friend's head so I could make a plaster cast, create a fiberglass mold of the cast, and re-cast his head in silicone for robotics applications.

<b>Front View</b>
<b>Rear View</b>
<b>Sculpting Mold Step</b>
<b>Close Up</b>

Miscellaneous
Providence, RI

I improvised a screen-printing setup using an overhead lamp from Home Depot and panes of plexiglass and glass taken from picture frames. I also welded a part-lamp, part-planter at The Steel Yard in Providence, RI.

<b>This is a cool back patch for my fake scooter gang.</b>
<b>This is the design for a t-shirt I made for my friend Ryan. He loves Guy Fieri and metric tape measures, both pictured.</b>
<b>Here is the design for a t-shirt I made for my friend Ryan and the screen.</b>
<b>This is both a lamp and a planter I welded.</b>


CONFERENCES

Tesla AI Day
Palo Alto, CA

"Have you considered the utility of the human spirit?"



RESUME





CONTACT


MADELEINE WEAVER

Pittsburgh, PA
Phone: (781) 591-1292
Email: maddygweaver@gmail.com
LinkedIn: linkedin.com/in/madeleine-weaver