Shireen Sharma
Space leaves more questions than answers.
I've always believed that the universe is vast, yet barely understood. This sense of mystery is what draws me to both the science that unravels it and the engineering that makes exploration possible.
I am an undergraduate at UC Berkeley studying astrophysics and aerospace engineering, and I'm most interested in instrumentation and space systems — from exoplanet RV surveys to CubeSat payloads to rocket propulsion.
Education
Internships & Fellowships
Research Projects
Technical Clubs
During my time with SEB, I contributed to the design of a remote quick-disconnect system for high-pressure N₂O fueling, a critical safety feature for high-altitude, high-pressure operating conditions. Using Fusion 360, I helped design and refine the system to minimize the risk of catastrophic failure during fueling, particularly at altitudes where rapid pressure changes can lead to instability. The system prioritizes safety by allowing remote disengagement, reducing human error and the risks associated with handling volatile propellants in extreme conditions.
Community
Skills & Interests
Contact
This summer I will be joining SpaceX as a Hardware Engineering Intern on the Falcon Ground Segment team in Hawthorne, California. My work will focus on the ground infrastructure that supports Falcon launch operations, including telemetry, tracking, and command (TT&C).
Extreme precision radial velocity (EPRV) spectroscopy is a powerful method for detecting and characterizing exoplanets by measuring the tiny Doppler shifts a planet induces in its host star's spectrum. However, achieving Earth-analog detection requires addressing astrophysical noise sources that are comparable in amplitude to the planetary signal itself. As NASA prepares for the Habitable Worlds Observatory (HWO), a mission designed to directly image Earth-like planets around nearby stars, there is an urgent need for tools that can predict how proposed EPRV survey strategies will perform under realistic conditions before telescope time is allocated and the mission is launched.
To address this, I developed a modular, Python-based EPRV survey simulation framework at JPL. The simulator takes a target star list from the HWO candidate catalog (JPL/Caltech/NExScI) and builds a complete observational profile for each star: it queries SIMBAD for stellar parameters, calculates instrument-specific exposure times using the NEID and KPF exposure time calculators, applies a physically motivated minimum exposure floor at the 10 cm/s threshold using the Chaplin et al. (2019) oscillation-averaging filter, and accounts for per-visit overhead with values from the WIYN/NN-EXPLORE call for proposals (300 seconds for NEID and a placeholder 147 seconds for KPF). The scheduler enforces observational constraints such as airmass limits, altitude windows, and per-night visibility checks, and supports both fixed-cadence and evenly spread scheduling modes across user-defined multi-year baselines.
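To give a sense of the scheduler's core logic, here is a minimal sketch of a per-visit time budget and visibility check. The function names and the specific thresholds (30° altitude floor, airmass 2.0) are illustrative defaults, not the simulator's actual code; the per-visit overhead values are the ones quoted above.

```python
import math

def airmass(altitude_deg: float) -> float:
    """Plane-parallel (sec z) airmass approximation, valid well above the horizon."""
    zenith_rad = math.radians(90.0 - altitude_deg)
    return 1.0 / math.cos(zenith_rad)

def visit_time(etc_seconds: float, chaplin_floor_seconds: float,
               overhead_seconds: float) -> float:
    """Total per-visit time: the ETC exposure, raised to the oscillation-averaging
    floor when the ETC value is shorter, plus fixed per-visit overhead."""
    return max(etc_seconds, chaplin_floor_seconds) + overhead_seconds

def is_observable(altitude_deg: float, min_altitude_deg: float = 30.0,
                  max_airmass: float = 2.0) -> bool:
    """Per-night visibility check: altitude window plus airmass limit."""
    if altitude_deg <= min_altitude_deg:
        return False
    return airmass(altitude_deg) <= max_airmass

# e.g. a 120 s ETC exposure with a 300 s Chaplin floor and 300 s NEID overhead
# yields a 600 s visit.
```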
I expanded the simulator to support multi-instrument survey design by implementing a post-scheduling strategy layer. This layer reassigns individual observations between NEID at Kitt Peak and KPF at Mauna Kea using three configurable strategies: a primary/fallback scheme that defaults to NEID and redirects observations to KPF when Kitt Peak's clear-night fraction drops below 50%, a strict alternating scheme that alternates between instruments in time order, and a parallel independent mode that serves as an upper-bound control. The weather model independently samples clear nights from monthly site statistics for each observatory, preserving the uncorrelated nature of conditions at the geographically separated high-altitude sites. I also developed a toggleable Chaplin filter that applies a stellar-property-dependent exposure floor uniformly across instruments, correctly treating oscillation-averaging time as a function of stellar properties rather than as an instrument parameter.
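The primary/fallback strategy and the independent per-site weather sampling can be sketched as follows. This is a simplified stand-in for the strategy layer, with hypothetical function names and per-night (rather than per-observation) granularity.

```python
import random

def sample_clear_nights(n_nights: int, clear_fraction: float,
                        rng: random.Random) -> list:
    """Independently sample clear/cloudy nights at one site from its
    clear-night statistic; sites are drawn separately, so conditions at the
    two observatories stay uncorrelated."""
    return [rng.random() < clear_fraction for _ in range(n_nights)]

def primary_fallback(n_nights: int, kp_clear_frac: float, mk_clear_frac: float,
                     threshold: float = 0.5, seed: int = 0) -> list:
    """Default each observation to NEID (Kitt Peak); redirect to KPF (Mauna Kea)
    when Kitt Peak's clear-night fraction drops below the threshold."""
    rng = random.Random(seed)
    kp_clear = sample_clear_nights(n_nights, kp_clear_frac, rng)
    mk_clear = sample_clear_nights(n_nights, mk_clear_frac, rng)
    schedule = []
    for night in range(n_nights):
        instrument = "KPF" if kp_clear_frac < threshold else "NEID"
        clear = mk_clear[night] if instrument == "KPF" else kp_clear[night]
        schedule.append((night, instrument, clear))
    return schedule
```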
For stellar noise modeling, I implemented synthetic radial velocity time series generation using custom Gaussian Process kernels, including a Matern 5/2 kernel, a quasi-periodic kernel for activity signals, and a granulation and oscillation model based on the ARGO framework's SHO-based kernel architecture. Each simulated observation is time-stamped in both local civil time and barycentric Julian date, with light-travel-time corrections applied. The output is written to a structured CSV file alongside the synthetic RV signal and photon-noise uncertainty.
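As an illustration of the activity-signal component, here is a minimal quasi-periodic GP sketch: an exponential-decay envelope multiplied by a periodic term at the stellar rotation period, sampled via a Cholesky draw. The hyperparameter values are placeholders, not the ones used in the simulator, and the real implementation also includes the Matern 5/2 and SHO-based granulation/oscillation kernels.

```python
import numpy as np

def quasi_periodic_kernel(t1, t2, amp, p_rot, lam_e, lam_p):
    """Quasi-periodic covariance commonly used for stellar activity RVs:
    squared-exponential decay over the spot lifetime times a periodic term
    at the rotation period p_rot."""
    dt = np.subtract.outer(t1, t2)
    decay = np.exp(-dt**2 / (2.0 * lam_e**2))
    periodic = np.exp(-np.sin(np.pi * dt / p_rot)**2 / (2.0 * lam_p**2))
    return amp**2 * decay * periodic

def draw_activity_rv(t, amp=2.0, p_rot=25.0, lam_e=50.0, lam_p=0.5, seed=0):
    """Draw one synthetic activity RV time series from the GP prior."""
    rng = np.random.default_rng(seed)
    K = quasi_periodic_kernel(t, t, amp, p_rot, lam_e, lam_p)
    K += 1e-6 * np.eye(len(t))          # small jitter for numerical stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(t))
```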
I presented this work to a professional audience of interns, scientists, and engineers across JPL, discussing the scientific motivation, the simulator architecture, and the results of multi-instrument strategy comparisons on the HWO target list.
This project's purpose was to examine key technology gaps in small satellites, gathering insights from commercial, academic, and government sectors. Our goal was to identify solutions and create a strategy to track progress in closing these gaps for future NASA missions.
I led a comprehensive gap analysis of Communications and Guidance, Navigation, and Control (GNC) systems for small satellites by benchmarking current radio frequency (RF) and lasercom performance and quantifying technology readiness level (TRL) maturity to identify critical shortfalls.
My research highlighted a significant paradigm shift as small satellite missions move beyond low-Earth orbit (LEO). While traditional RF systems across the S, X, and Ka bands remain the backbone of telemetry, they are rapidly becoming bottlenecks for modern, data-intensive payloads. To address this, I benchmarked emerging free space optical (FSO) communication, which offers gigabit-level data rates that provide an order of magnitude improvement over RF. However, these systems introduce rigorous engineering constraints in precise beam pointing and atmospheric attenuation management. My specific case study on failure analysis involved quantifying data across 528 missions to evaluate how technology gaps correlate with mission outcomes. This analysis revealed that while Communications and GNC dominate the volume of smallsat projects, accounting for over 80% of efforts, the sector is reaching a maturity plateau. This saturation suggests that future breakthroughs will depend on the successful integration of hybrid RF-optical architectures and autonomous, X-ray pulsar-based navigation for deep space independence.
I also helped profile propulsion, power, and ISAM systems to define ROI-driven R&D priorities and investment strategies. For instance, our data showed that power systems represent a high-impact investment opportunity, currently accounting for only 27% of development efforts despite critical gaps in energy density for high-power missions. These synthesized cross-domain findings were used to inform future NASA S3VI small-sat development and strategic planning.
We presented these results to NASA Ames researchers, top students across Berkeley's aerospace, mechanical engineering, and astrophysics departments, and industry professionals from NewSpace companies.
The NASA L'SPACE Mission Concept Academy is a program where students design a space mission, gaining hands-on experience with real-world engineering challenges under NASA standards.
All images and graphics from the final presentation slides prepared by Team 17, NASA L'SPACE MCA, December 2025.
Through this program, I gained experience with industry tools like Siemens NX for CAD modeling and JMARS for Mars landing site selection. I also honed my technical writing and communication skills, producing professional reports including MCR, SRR, MDR, and PDR. The program provided training in systems engineering, heat transfer, and project management, all while ensuring compliance with ITAR/EAR and NASA review procedures.
The Black Widow mission aims to investigate Martian lava tubes in the Arsia Mons region to assess their potential for habitability and future human use. My team developed a hybrid robotic rover design capable of navigating complex surface terrain and performing a tethered descent into deep vertical pit craters. This mission will provide critical data on radiation protection and environmental stability in these caves, supporting NASA's long-term goals for Martian exploration.
As the lead for mechanical design and systems coordination, I led the development of a hybrid robotic system for exploring Martian lava tubes. My primary focus was ensuring the rover could land safely and ingress vertical skylights to reach subsurface targets.
Key aspects of my work included:
Hybrid Mobility Architecture: I designed a Hybrid Leg-Wheel System, chosen for its terrain adaptability. It allows the rover to traverse slopes greater than 45° and navigate rocky lava tube floors. I modeled the system in OnShape and NX to integrate the wheel-chassis and suspension, ensuring stability during transitions from surface to cave operations.
Tether Deployment & Ingress System: I developed a tethered deployment mechanism to support the rover during descent into pit craters, some as deep as 178 meters. I coordinated its integration with the command and data handling and power subsystems, ensuring it could support both mechanical functions and potential data and power transmission.
Systems Integration & Requirements: I coordinated mechanical interfaces with science payloads like TECP and MOMA, ensuring the system met NASA's constraints: total mass under 250 kg and a stored configuration of 3.5m x 3.5m x 3.5m, so the rover would fit within the specified launch vehicle envelope while enabling its expanded mobility system for surface operations.
The project focused on exploring how the mechanical properties of polymer-matrix lattice composites can be improved by embedding a rigid 3D lattice structure inside a softer polymer matrix, rather than relying on the individual materials alone. Lattice structures are known for their strong mechanical performance-to-weight ratio, but they often fall short in terms of absolute performance due to issues like brittleness and high anisotropy. These limitations arise from the internal voids in the lattice design and the directional dependence of the material properties. By introducing a matrix material, it was hypothesized that the composite could provide a more isotropic, stronger material while maintaining the benefits of the lattice structure. This idea is inspired by fiber-reinforced composites, which are widely used in aerospace, automotive, and biomedical industries because they combine a stiff material for load-bearing with a softer one for energy absorption and bonding.
The primary objectives were to investigate whether adding a stiffer or softer matrix around a 3D lattice would result in an overall material that was stiffer or softer, respectively, and to determine whether the lattice geometry (unit lattice versus patterned lattice) would affect these outcomes.
We began by selecting materials with varying stiffnesses: Nylon 12, polycarbonate (PC), and polycarbonate-ABS (PC-ABS). Nylon 12 was chosen for its ductility and orthotropic behavior, with significant variations in modulus depending on orientation. Polycarbonate and PC-ABS were selected for their higher stiffness and isotropic properties. The lattice was originally generated as a Cubic Midpoint Lattice in ANSYS SpaceClaim, but due to computational limitations the model was rebuilt manually in SolidWorks. The unit lattice dimensions were based on ASTM D3039 standards, and the 1 mm-diameter tubes were patterned to form a full cube. A matrix of the same size as the lattice was then created, encasing the lattice in the chosen material. These models were imported into ANSYS Workbench for finite element simulations. A total of twelve simulations were conducted, testing three material configurations and two lattice configurations (unit lattice and 2x2x2 patterned lattice). The simulations were set up in ANSYS Mechanical, with a compressive force of 20 N applied to the top face of each sample and the bottom face fixed in place.
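From a compression run like this, an effective stiffness can be backed out from the applied force and the resulting displacement. The sketch below shows the standard stress/strain arithmetic for a cubic sample; the numbers in the example are illustrative only, not the actual simulation outputs.

```python
def effective_modulus_pa(force_n: float, displacement_m: float,
                         side_length_m: float) -> float:
    """Effective Young's modulus from a uniaxial compression of a cube:
    stress = F / A, strain = delta / L, E = stress / strain."""
    area = side_length_m ** 2
    stress = force_n / area
    strain = displacement_m / side_length_m
    return stress / strain

# Illustrative: 20 N on a 10 mm cube compressing by 0.5 um gives E = 4 GPa.
E = effective_modulus_pa(20.0, 0.5e-6, 0.01)
```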
The results confirmed the hypothesis that adding a matrix to the lattice structure increased the stiffness and strength of the composite material. The patterned lattice configurations showed the greatest improvements, with the 2x2x2 lattice combined with Polycarbonate matrix material exhibiting the highest stiffness and Young's modulus in the X-direction. PC-ABS outperformed Nylon 12 by approximately 5% at 5.58 GN/m. The results also revealed that the choice of matrix material had a more pronounced effect on the mechanical properties of patterned lattices than unit lattices. The analysis also highlighted that the assumptions of perfect bonding between the matrix, lattice, and plates might have overestimated the strength of the material, as real-world bonding is rarely perfect.
Moving forward, the next steps involve validating these findings through physical testing of 3D printed prototypes. Using industrial-grade 3D printers such as the Stratasys Fortus 450mc, prototypes of the lattice structures with different matrix materials will be fabricated. These physical samples will undergo compression testing, and strain measurement techniques like Digital Image Correlation (DIC) will be used to track deformation and validate the simulation results. Future research could also explore a wider range of matrix materials, lattice geometries, and more advanced modeling techniques to optimize the mechanical properties of the composite for specific industrial applications, particularly in fields that require high-performance materials such as aerospace and medicine.
The SPORES group (Stellar/System Properties & Observational Reconnaissance for Exoplanet Studies with HWO) is an interdisciplinary team of astronomers and planetary scientists preparing for NASA's Habitable Worlds Observatory (HWO). Their main goal is to maximize precursor knowledge about potential target stars for HWO's future exoEarth survey, which aims to discover habitable Earth-like planets and analyze their atmospheres. spores-hwo.berkeley.edu
As part of the SPORES-HWO research group, I conducted literature reviews and analyzed exoplanet radial velocity data to help characterize potential target stars for HWO's future survey. My Python-based RV analysis involved processing observational data to extract velocity signatures and assess target suitability for the HWO exoEarth catalog. Through this project I also familiarized myself with astroquery and SIMBAD, tools for programmatic access to astronomical databases that I later used extensively in my JPL work.
Bondi-Hoyle-Lyttleton (BHL) accretion describes how a compact object, such as a star or black hole, gravitationally captures material from a surrounding gas and dust cloud, drawing it inward through a decaying orbit. Named after Hermann Bondi, Fred Hoyle, and Raymond Lyttleton, the framework is foundational to understanding how matter accumulates in protoplanetary disks, where measuring the accretion rate and energy release of disk material is key to understanding how planetary systems form. Our project applied this framework to probe accretion behavior in the disk around Orion Source I, a massive young stellar object in the Orion Molecular Cloud.
Edgar, R. G. (2004). A review of Bondi-Hoyle-Lyttleton accretion. New Astronomy Reviews, 48(10), 843-859. doi:10.1016/j.newar.2004.06.001 · arXiv:astro-ph/0406166
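The standard BHL interpolation formula reviewed by Edgar (2004) combines the accretor's mass, the ambient density, and the relative velocity and sound speed of the gas. A minimal sketch (the input values in any call would be illustrative, not our measured Orion Source I parameters):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg

def bhl_rate_kg_s(mass_kg: float, rho_kg_m3: float,
                  v_rel_m_s: float, c_s_m_s: float) -> float:
    """BHL mass accretion rate (interpolation formula, Edgar 2004):
    Mdot = 4 pi G^2 M^2 rho / (v_rel^2 + c_s^2)^(3/2).
    Note the quadratic dependence on the accretor's mass."""
    return (4.0 * math.pi * G**2 * mass_kg**2 * rho_kg_m3
            / (v_rel_m_s**2 + c_s_m_s**2) ** 1.5)
```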
To model the NaCl molecular outflow, we assumed a torus-shaped (donut) geometry for the outflow region and generated position-velocity (PV) plots to probe the velocity structure of the disk. We created spectral cubes from interferometric data to identify a representative slice along the major axis of the disk, then experimented with different cuts around that axis to generate varying velocity distributions. This allowed us to locate regions where BHL accretion may be occurring by looking for characteristic velocity gradients and asymmetries in the emission profiles.
We plotted over ten PV diagrams and analyzed FITS files using Python tools including Astropy, Matplotlib, NumPy, spectral-cube, GADGET, and CAFE-SLAB. Working with ULAB mentors and Dr. Melvin Wright, we used these PV cuts to determine the rotational direction of Orion Src I and measured the spatial extent of the NaCl molecular outflow, finding an inner radius of 7.2 AU and an outer radius of 24.48 AU. We concluded that NaCl emission traces the inner region of the disk, consistent with it being a temperature-sensitive tracer concentrated near the protostar.
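The core slicing operation behind a PV diagram can be sketched with NumPy alone. In practice we worked from FITS files via Astropy and spectral-cube, which handle world coordinates and arbitrary cut angles; this toy version just averages a few pixel rows of a (velocity, y, x) cube around a cut parallel to the disk major axis.

```python
import numpy as np

def pv_cut(cube: np.ndarray, row: int, half_width: int = 1) -> np.ndarray:
    """Extract a position-velocity slice from a (velocity, y, x) data cube by
    averaging 2*half_width+1 pixel rows centered on the chosen cut."""
    lo, hi = row - half_width, row + half_width + 1
    return cube[:, lo:hi, :].mean(axis=1)   # -> (n_velocity, n_position)
```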
We presented our findings at the ULAB Poster Symposium to peers, faculty, and postdoctoral researchers across Berkeley's Astronomy Department.
The transit method detects exoplanets by measuring the slight dimming of a star's light as a planet passes in front of it, creating a characteristic dip in the star's brightness curve called a light curve. The depth of this dip is proportional to the ratio of the planet's area to the star's area, allowing astronomers to infer planet size and orbital parameters.
For this project, my team and I focused on selecting optimal exoplanet targets for observation, considering key factors such as magnitude, elevation, and transit depth. Target selection begins by defining observation sessions that start 30 minutes before the predicted ingress (beginning of the transit) and continue 30 minutes after egress (end of the transit). The objective is to ensure we observe the full event while establishing baseline data outside the transit. We aimed for targets with a magnitude between 10 and 14.5, as extremely bright or faint stars are difficult to observe. The candidate also needs to be positioned more than 30° above the horizon during the observation window, and the transit depth should exceed 1 ppt to be considered a good candidate.
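The selection criteria above reduce to a few simple cuts, and the transit depth follows directly from the planet-to-star area ratio described earlier. A minimal sketch (function names are my own; the thresholds are the ones stated above):

```python
def transit_depth_ppt(r_planet: float, r_star: float) -> float:
    """Transit depth is the planet-to-star area ratio, here expressed in
    parts per thousand (ppt); a hot Jupiter at Rp/Rs = 0.1 gives ~10 ppt."""
    return 1000.0 * (r_planet / r_star) ** 2

def is_good_candidate(mag: float, min_alt_deg: float, depth_ppt: float) -> bool:
    """Cuts from the text: magnitude between 10 and 14.5, always above 30 deg
    during the observation window, and depth exceeding 1 ppt."""
    return 10.0 <= mag <= 14.5 and min_alt_deg > 30.0 and depth_ppt > 1.0
```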
For data collection, we utilized the transit method, a highly effective technique for detecting hot Jupiters, large exoplanets that orbit close to their stars. By measuring the flux of the target star and comparison stars, we could detect the dimming caused by an exoplanet passing in front of its star. Data from TESS (the Transiting Exoplanet Survey Satellite) and its Target of Interest (TOI) catalog helped us confirm whether our target was a genuine exoplanet or a false positive. Additionally, we applied differential photometry, comparing the target star to nearby comparison stars to isolate true brightness changes from atmospheric and instrumental variations.
After data collection, we calibrated the images using bias frames, which capture the camera's zero-exposure electronic offset so that detector artifacts can be subtracted from the science images. With AstroImageJ, we processed the data to correct for these imperfections, ensuring clean, accurate images. Plate solving was then used to match the observed stars with their Right Ascension (RA) and Declination (Dec) coordinates, helping us pinpoint the exact location of our target stars in the sky. This step was critical for accurate data analysis, ensuring we could track the same stars across multiple observation sessions. Finally, we used AstroImageJ to generate the light curves, which revealed the presence of exoplanet transits, and compared these curves against expectations to distinguish true planetary signals from false positives.
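The differential photometry step amounts to dividing the target's flux by the combined flux of the comparison stars, so that shared atmospheric and instrumental variations cancel. A minimal sketch (AstroImageJ performed this for us; names here are illustrative):

```python
import numpy as np

def differential_light_curve(target_flux, comparison_fluxes):
    """Divide target flux by the summed comparison-star flux to cancel shared
    systematics, then normalize to the median (out-of-transit) level."""
    comp_total = np.sum(np.asarray(comparison_fluxes, dtype=float), axis=0)
    ratio = np.asarray(target_flux, dtype=float) / comp_total
    return ratio / np.median(ratio)
```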
We presented our research in a comprehensive final presentation to faculty, researchers, and program peers at the Harvard-Smithsonian CfA.
In my study, I explored the transition between rocky and volatile-rich planets, particularly focusing on the 1.6 Earth-radius (R⊕) line. Exoplanets are typically categorized into rocky planets like Earth, intermediate-sized planets (super-Earths or sub-Neptunes) that may have rocky cores with gas or ice envelopes, and large gas giants. One of the central questions in exoplanet research is at what point a planet transitions from being rocky to having significant volatile content, which plays a crucial role in determining habitability. By analyzing mass and radius data from the Kepler space telescope, I explored how this transition influences our understanding of planetary formation and the potential for Earth-like conditions in other star systems.
To conduct this study, I used data from L. A. Rogers' 2015 paper, which analyzed a sample of 49 sub-Neptune-sized exoplanets with precise radial velocity measurements. The study integrated transit photometry to measure radii and Doppler velocity to determine planet masses, allowing for accurate estimates of bulk density. Using this information, I compared each planet's mass and radius against theoretical mass-radius curves for different compositions (such as silicate, iron, and Earth-like), which helped determine the probability of a planet being rocky. This analysis provided key insights into the 1.6 R⊕ threshold, a crucial boundary that marks the transition between rocky planets and those with substantial volatile envelopes.
In addition to the observational data, I used statistical models such as Markov-chain Monte Carlo (MCMC) to account for measurement uncertainties and population-level trends. The results highlighted that planets larger than about 1.6 R⊕ are statistically unlikely to be purely rocky, with a significant increase in volatile content as planets approach and surpass this threshold. This transition is essential not only for refining planet formation theories but also for informing the search for potentially habitable planets. My analysis aligned with previous research, confirming the robustness of the 1.6 R⊕ boundary across different models and assumptions about planetary composition.
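The bulk-density calculation underlying this comparison is straightforward, and the 1.6 R⊕ boundary can be written as a toy cut. To be clear, the sketch below is a simplification: the actual analysis in Rogers (2015) is a probabilistic, hierarchical comparison against composition-dependent mass-radius curves, not a hard threshold.

```python
import math

R_EARTH_M = 6.371e6     # Earth radius, m
M_EARTH_KG = 5.972e24   # Earth mass, kg

def bulk_density_g_cm3(mass_earth: float, radius_earth: float) -> float:
    """Bulk density from mass and radius in Earth units; Earth itself
    comes out near 5.5 g/cm^3."""
    mass = mass_earth * M_EARTH_KG
    radius = radius_earth * R_EARTH_M
    rho_kg_m3 = mass / ((4.0 / 3.0) * math.pi * radius ** 3)
    return rho_kg_m3 / 1000.0

def likely_rocky(radius_earth: float, threshold: float = 1.6) -> bool:
    """Toy cut at the 1.6 R_earth boundary discussed above."""
    return radius_earth < threshold
```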
The methodology I employed is built on combining observational data with theoretical models, offering a comprehensive approach to the rocky-to-volatile transition. The findings have broad implications for understanding planetary demographics and the potential for life on exoplanets. Identifying the 1.6 R⊕ cutoff provides a clearer framework for determining which planets might possess conditions similar to Earth, guiding future missions and observatories in selecting targets for atmospheric characterization and habitability studies.
I presented this paper, along with my synthesized findings, to an audience of upper-division astrophysics undergraduates and course staff. Through this presentation, I shared not only the results of my research but also the process and methodologies that led to my conclusions, fostering discussions on the implications of these findings for future exoplanet exploration.