Advanced science.  Applied technology.

Technology Today Podcast

Episode 67: SwRI’s Space Robotics Center

How to Listen

Listen on Google Podcasts, Apple Podcasts, or SoundCloud.


Engineers in SwRI’s new Space Robotics Center are developing software to operate robots in space. Robots are called in when a job is too dangerous for humans, but robots move differently in space. The Space Robotics Center recreates cosmic conditions, like lighting fluctuations and low friction, allowing development of robotics in a space-like environment. The center specializes in simulation, advanced perception, robot manipulation and off-road autonomy, capabilities that allow robots to accomplish assembly, manufacturing and other tasks in space.

Listen now as SwRI engineers Meera Day Towler and Lily Baye-Wallace discuss the advancing field of space robotics, the Institute’s new cutting-edge center and the future of space robotics technology.

Visit Space Robotics Engineering to learn about SwRI’s space robotics and automation solutions.


Below is a transcript of the episode, modified for clarity.

Lisa Peña (LP): Transforming Earth-based robotics into technology that operates in space, that's the goal of SwRI engineers in the Institute's new Space Robotics Center, where they're bringing a little corner of the cosmos to campus. That's next on this episode of Technology Today.


We live with technology, science, engineering, and the results of innovative research every day. Now, let's understand it better. You're listening to the Technology Today Podcast presented by Southwest Research Institute. From deep sea to deep space, we develop solutions to benefit humankind. Transcript and photos for this episode and all episodes are available online. Share the podcast and hit that subscribe button on your favorite podcast platform.

Hello. And welcome to Technology Today. I'm Lisa Peña. A new Space Robotics Center is bringing space-like conditions to SwRI. In this lab, engineers are developing capabilities for robots that can carry out tasks in space like assembly and manufacturing. Robots for space need special capabilities. SwRI engineers Meera Day Towler and Lily Baye-Wallace are leading research and development in the new center. They're here to tell us about the groundbreaking field of space robotics and how they are simulating space conditions earthside. Thank you for being here, Meera and Lily.


SwRI Space Robotics Center

The new SwRI Space Robotics Center offers research and development for robotics simulation, advanced perception, manipulation and off-road autonomy. Equipment in the lab simulates conditions in space.

Lily Baye-Wallace (LB): Thank you for having us.

Meera Day Towler (MT): Thank you.

LP: So let's start with understanding this vast field of space robotics. What is it? What type of technology are we talking about here?

MT: Sure. So space robotics is, put maybe a little too simply, robots in space. It's taking the things we do in terrestrial robotics, like manufacturing robotics in automotive facilities, for example, where lots of robots perform different kinds of automation tasks, and doing those same sorts of things in space. And on the vehicle side, it's taking capabilities that we use for autonomous vehicles, like self-driving cars, and transforming those for use in space as well.

LP: Are robotics currently used in space or is this capability still in development, not quite ready yet for deployment?

LB: Robotics are currently used in space and have been for decades. Many roboticists and engineers were inspired in their youth or their college days by videos of the Canadarm manipulator-- that's the huge robotic arm that's on the ISS-- or the camera arms on the Perseverance and Curiosity rovers.

What differentiates those arms from our work is a focus on software that enables a variety of possible autonomous actions with reduced human input. Instead of being manually operated by a human, either up in space near the robot or down in some sort of command center, the robot makes intelligent decisions about what it thinks it should do and gets confirmation from a human, potentially in delayed time, before performing the action. That's what we call supervised autonomy. And that's what we're working towards.

LP: OK, so we're talking about robots that need less hands-on input from a person. Is it a matter of programming and pushing a few buttons? Is that what we mean when we're talking about autonomous robots?

MT: It can really depend. It's a spectrum, really. There's teleop, which is the traditional mode: there's a joystick and a human is controlling the robot. Because of the time lag in space, it's better if that human is an astronaut on the International Space Station or in the space shuttle. That's kind of the way we've seen this done in the past. As we move towards more autonomous operations, though, we're looking at things like the robot uses its sensors to get a feel for its environment and start to understand what it needs to do for its task.

And then maybe it sends down its plans to a human operator on Earth. And the operator says, yes, that's OK or, no, try again, try to replan that. So that kind of fits into that supervised autonomy model. Eventually, you want to work towards more and more autonomy so the robot needs less and less input from people. So as we move farther and farther away from Earth, we're able to do more capabilities autonomously.
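That approve-or-replan loop can be sketched in a few lines of code. This is a toy illustration only: the Plan type, the propose and human_approves callbacks, and the attempt limit are all hypothetical stand-ins, not SwRI's implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Plan:
    """A candidate trajectory the robot proposes to execute."""
    waypoints: List[str]

def supervised_autonomy_step(
    propose: Callable[[], Plan],
    human_approves: Callable[[Plan], bool],
    max_attempts: int = 3,
) -> Optional[Plan]:
    """Propose plans until the (possibly delayed) operator approves one.

    Returns the approved plan, or None if every attempt was rejected,
    in which case the robot would hold a safe state and replan later.
    """
    for _ in range(max_attempts):
        plan = propose()
        if human_approves(plan):  # the round trip to Earth may take minutes
            return plan           # approved: execute autonomously
    return None

# Toy usage: the robot proposes a long plan, the operator rejects it,
# and the shorter replanned version passes the (length-based) check.
attempts = iter([Plan(["a", "b", "c", "d"]), Plan(["a", "d"])])
approved = supervised_autonomy_step(
    propose=lambda: next(attempts),
    human_approves=lambda p: len(p.waypoints) <= 2,
)
print(approved)  # Plan(waypoints=['a', 'd'])
```

The key property is that execution never starts until the human signs off, which is what distinguishes supervised autonomy from full autonomy.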

LP: All right. So robots in space are not new, but SwRI is bringing these new capabilities. So why are robots needed in space? What are they used for?

LB: When deciding to deploy automation in an application, roboticists often refer to the three Ds: is this application dull, dirty, or dangerous for humans to do? And when it comes to doing just about anything in space, it is incredibly risky and expensive for humans to do it. Humans need to be kept safe and kept fed, and it's incredibly heavy to ship food and water. Robots do not need food or water. What they need is power and insulation from the really harsh environment of space. And we can provide that with the power that's already provided for the satellites or shuttle systems in space. The robotics currently in space perform some servicing tasks, driven by a human either on board or, as we mentioned before, with a remote-set plan. They are often collecting scientific samples and data to help us better understand the universe outside our planet. We've been putting satellites into orbit around the Earth since the '50s, and now Earth orbit is full of satellites, some used regularly and some not. Immediate concerns and prime candidates for automation are the capabilities to service, fix, or perhaps decommission these satellites completely autonomously.

MT: Yeah, you mentioned something that falls into the category of ISAM, in-space servicing, assembly and manufacturing. ISAM generally encompasses things like recycling satellites, refueling, and taking old satellites and sending them back down toward Earth to burn up. That's a huge area of interest right now because certain orbits are becoming really crowded. We think of space as being this enormous place, right?

But within certain orbits that are highly used and with the proliferation of launch, there's more and more going up into certain orbits. And that's causing a lot of problems and crowding in certain places. So a great example of robotics in space is for ISAM for helping kind of clean up that orbit and those orbits that are getting crowded.

LP: So we need to create some more space in space.
Robot arm in a SwRI lab

Engineers test software capabilities on robotic arms like this one. Robotic arms in space carry out assembly, manufacturing and other tasks that would be difficult or dangerous for a human.

MT: Exactly.

LP: Yeah. [LAUGHS] OK, so when we talk about manufacturing tasks, specifically, tell us a little bit about that. What type of manufacturing tasks are conducted in space?

MT: Sure. So manufacturing kind of goes back to that ISAM idea because that's the M in ISAM. And manufacturing can be anything from doing assembly processes, so launching things in smaller pieces and then assembling large structures on orbit. It can also include doing surface finishing. So something was 3D printed. Maybe you need to do some surface finishing on that to make it have the right tolerance.

LP: So part of manufacturing is making parts--

MT: Yes.

LP: --to do quick repairs?

MT: Mm-hmm.

LP: So can you give us an example of what might need to be repaired in space?

MT: So there's a few different approaches for repairing things in space. There's the idea of sending up kind of a warehouse of things, and then you have a robot arm with that warehouse of things that can perform repairs and hot-swap parts as needed as they break. There's also 3D printing new parts and being able to put those on. That's another great use of both 3D printing and robotics.

There's also an area of research called in-situ resource utilization, ISRU. The idea there is to scoop up lunar regolith, that is, lunar soil, separate out its various components, and use those separated components to form materials that can then be used to print things, like building bricks for structures on the moon, for example. So that's another place we see ISRU and robotics for manufacturing, not necessarily in orbit, but on the lunar surface as we prepare for extended-duration stays on the moon and look to move on to Mars.

LP: All right. So much for robots to do. I want to just quickly go back to that idea of clearing out some of the old satellites or getting them back into use. Can you explain how a robot might accomplish that specific task?

MT: Sure. So one of the great use cases for some of our robotics capabilities is if you have something that's been on orbit for a while, maybe it's gotten damaged. Maybe it's really old, you don't have a great model of it. You don't necessarily know what you're looking at, what's up there, how it's moving. Maybe it's been dead for a while. We have some capabilities that allow us to look at that part and tell us what it is.

So we can do a 3D reconstruction to get a feel for what that shape is. And from that, we can also determine how we need to move a robot arm to grab it. Say something is turning slowly. Can we figure out how it's turning and use our robotics to grab it in a way that doesn't damage it? Because if we damage something we're trying to grab, we risk creating more space trash, which kind of defeats the purpose of what we're trying to do.
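One small ingredient of that capability, estimating how a target is turning from a sequence of pose observations so the arm can time its grasp, can be sketched as a least-squares fit. This planar, single-axis version is purely illustrative, with invented sample data; a real system would track full 6-DOF pose from the 3D reconstruction.

```python
import math

def estimate_spin_rate(times, angles):
    """Least-squares fit of a constant spin rate (rad/s) to timestamped
    orientation samples of a slowly tumbling target."""
    n = len(times)
    t_mean = sum(times) / n
    a_mean = sum(angles) / n
    num = sum((t - t_mean) * (a - a_mean) for t, a in zip(times, angles))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

def predict_angle(angle_now, rate, dt):
    """Orientation, wrapped to [0, 2*pi), after dt more seconds of free tumble."""
    return (angle_now + rate * dt) % (2 * math.pi)

# Invented data: a target spinning at 0.05 rad/s, observed once per second.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
angles = [0.05 * t for t in times]
rate = estimate_spin_rate(times, angles)
print(round(rate, 3))                                    # 0.05
print(round(predict_angle(angles[-1], rate, 10.0), 3))   # 0.7
```

Knowing the spin rate lets the planner schedule the grapple for a moment when the grasp point is reachable, rather than chasing it.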

LP: So we know you're working with no gravity in space, zero gravity. What are other special circumstances to consider when designing robots for space?

LB: So one of the basic laws of physics is that every action has an equal and opposite reaction. And that sadly holds true for robotics in space, too. When the Canadarm on the ISS moves, it behaves essentially like a fixed-base arm: because the ISS is so massive, the movements of the robotic arm, despite it being quite big, don't actually change where the ISS is in space. If you were to shrink the ISS down, closer in size to its arm, that would no longer hold true.
SwRI Engineers Meera Day Towler and Lily Baye-Wallace collaborate on a project in the new Space Robotics Center

SwRI engineers Lily Baye-Wallace (left) and Meera Day Towler (right) collaborate on a project in the new Space Robotics Center. They are part of a team adapting existing terrestrial technologies to function in the harsh space environment.

And if the system isn't as large as the ISS, which is true of many of the satellites out there (they range from the size of a microwave to slightly larger than a refrigerator), the motions of any robotic arms on board could actually have a big impact on where the base is in space. What's tricky is that to keep the satellite in its current orbit, you do have options, and that's to use station-keeping thrusters.

But station keeping expends fuel, and fuel is another thing we have to put into space that's incredibly expensive. So if we leverage an arm that has more joints than it needs, extra degrees of freedom, to use the roboticists' term, we can move the arm in a way that minimizes the momentum it imparts on its base. This ties into our supervised autonomy and our plans for even greater autonomy, with the arms moving in an even smarter fashion than we could come up with on our own.
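The disturbance Baye-Wallace describes follows from conservation of angular momentum: with no external torques, the base must counter-rotate when the arm moves. A deliberately oversimplified single-axis sketch, with scalar inertias and made-up numbers:

```python
def base_reaction_rate(arm_inertia, arm_rate, base_inertia):
    """Angular rate induced on a free-floating base when its arm moves.

    Conservation of angular momentum with no external torques:
        I_base * w_base + I_arm * w_arm = 0
    so the base counter-rotates in proportion to the inertia ratio.
    """
    return -(arm_inertia * arm_rate) / base_inertia

# ISS-like case: the base inertia dwarfs the arm's, so the base barely moves.
print(base_reaction_rate(arm_inertia=500.0, arm_rate=0.1, base_inertia=1e8))
# Small free-flyer: the same arm motion noticeably rotates the base,
# which is what the extra degrees of freedom are used to minimize.
print(base_reaction_rate(arm_inertia=500.0, arm_rate=0.1, base_inertia=2e3))
```

The inertia ratio is the whole story in this toy model, which is why arm motion is negligible on the ISS but significant on a microwave-sized satellite.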

LP: Essentially, there are movement or functionality aspects to consider, due to how things move in space. I talked about the zero gravity. But we're also talking about differences in light and other key characteristics of space. So can we touch a little bit on that, too?

MT: Yeah, the lighting one is a particularly interesting thing to think about, because there's two things here. One is that space has extreme lights and extreme darks. That is pretty challenging for classical machine vision algorithms. The other thing that makes the problem even harder is that spacecraft are full of very reflective surfaces. So one of the challenges we looked to address really early on is, can we make sure we're testing these extremes when we're testing our perception algorithms and our machine vision algorithms?

If we have a very bright light on something that's partially in shadow and partially in light, with a lot of reflectivity off several of its surfaces, are we still able to do that 3D reconstruction? That is a significant problem. Even here on Earth, we have to work really hard to control the lighting conditions and compensate for these extremes. So it's a huge area of research for us.

LP: OK, so to get back to what we talked about at the top. At SwRI, we have a new cutting edge area for designing space robotics. So tell us about this lab, its equipment, its capabilities.

LB: The Space Robotics Center is a test bed for robotics, autonomous systems, and machine vision in space. Within it, we have an OptiTrack markered motion capture system, complete with infrared and RGB-- that's red, green, blue-- cameras synchronized to a series of lights, completely enclosed in blackout curtains to help us simulate the harsh lighting conditions of space-- that single bright light source of the sun, or the vast darkness when we're in its shadow.

Within this blackout space, we have a fine-finished granite slab on which we float robots and other samples using air bearings to simulate motion in a low-friction environment. It operates like an air hockey table, except the air comes out of the puck instead of the table. The space also includes one mock satellite, to start, for generating simulated sensor data.

MT: Yeah, so with the OptiTrack motion system that Lily mentioned, there's a few different things we're doing specifically. That type of system is something our human performance group uses for biomechanics, looking at people and doing 3D reconstructions of their skeletons. We have some internal research that's taking that same approach but generalizing it to other types of objects.

So we can use that system to collect generic tracking data of an object moving through the space. So far we've been doing that testing in simulation, but we're in the process of moving it to hardware, and we're really looking forward to seeing the results on that one. But not only does that motion capture system allow us to track things as they move through markerless motion capture, it also allows us to use the markered version.

So you know those people with the balls on their outfits and they move around and that's how you do animations? That same type of thing, those little marker balls, we have some of those too. And we can put them on the robot and the things that we're trying to grapple with the robot so we can track how the robot is doing relative to how we think it should do based on our simulations.
Towler and Wallace stand behind a fine-finish granite slab used to float robots and other samples using air bearings to simulate motion in a low-friction environment

Towler and Wallace stand behind a fine-finish granite slab used to float robots and other samples using air bearings to simulate motion in a low-friction environment. The slab is a centerpiece of the new Space Robotics Center.

LP: OK, so walk us through a day in the Space Robotics Center. What happens there?

LB: It depends on what team you're working on. Are you there today to help us collect data? Then you're going to open the curtain, and you might be moving the satellite into its new position. Maybe you are testing our markerless system, which means you might not be putting markers on the satellite. If you're using it to compare against how well our simulation did, you might be putting markers on it.

You're preparing to run a bright-light-source dataset, so you're going to turn one of our singular light sources on the side up to full brightness and take a series of images where all of the cameras are synced, monitoring the satellite as it moves through the structure. You might then hand this dataset over to a teammate, or you might yourself put it through the latest version of our machine learning algorithm and see if it can accurately track where the satellite was in the environment.

Maybe you're working on our trajectory prediction algorithm, which means you're seeing if you give it the first half of the trajectory, if it can guess where the satellite is going to go. Maybe that's if you're on the software and computer vision side. If you're on the robotics side, you might come ready to prepare the platform for another reduced friction test. So you might be making sure that our air compression cylinder is filled up.
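The trajectory-prediction test described above, feeding the algorithm the first half of a trajectory and checking whether it can guess the rest, can be mimicked with the simplest possible predictor, a constant-velocity extrapolation. The 1 Hz sampling and 0.5 m/s drift speed here are invented for illustration, and a real predictor would be far more sophisticated:

```python
def predict_second_half(first_half, dt=1.0):
    """Extrapolate the rest of an evenly sampled 1-D trajectory using a
    constant-velocity model fit to the last two observed samples."""
    v = (first_half[-1] - first_half[-2]) / dt
    return [first_half[-1] + v * dt * k for k in range(1, len(first_half) + 1)]

# Invented ground truth: a target drifting at a steady 0.5 m/s, sampled at 1 Hz.
truth = [0.5 * t for t in range(10)]
predicted = predict_second_half(truth[:5])
errors = [abs(p - t) for p, t in zip(predicted, truth[5:])]
print(max(errors))  # 0.0 -- constant-velocity motion is predicted exactly
```

Comparing predicted against held-out truth, as the lab does with its motion capture data, is what turns a guess into a measurable accuracy number.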

You're going to boot up the robot and deploy the motion plan that you're going to test out on it. You might be putting those markers on it in order to compare it against the data that we have. You've probably got a friend or two along with you, because our structure is rather large, in order to make sure that things are operating safely and as planned.

LP: All right, that is one busy center.

LB: Yes.

LP: On any given day, a lot going on. So currently, what challenge are you working to solve? Is there a particular challenge you're tackling at the moment?

MT: Sure. So there's several challenges on our plate at any given time. When we first started this space robotics effort, one of the earliest things we did was talk to as many people as we could in the space industry, both at SwRI and outside of SwRI, to get a feel for what the industry needs. What does the space industry need from the robotics industry? And then we thought about what we do on the robotics side that could help meet some of those needs.

So from that we have four technology offerings that have emerged. And those are where most of our internal research projects are focused at the moment. Those four offerings are simulation, advanced perception, robot manipulation, and off-road autonomy. So simulation was one of the ones that came out pretty early on because we realized that there weren't a lot of datasets.

So the datasets we were talking about earlier, where we now have this capability to generate them, that data didn't really exist when we first started this project. So we had to find creative ways of coming up with our own simulated data. We've been exploring using gaming engines to create photorealistic images and to really explore that lighting, those lighting extremes, and that reflectivity, to get ourselves realistic datasets.

Simulation also has the advantage of giving us a place where we can test cheaply. We can test our autonomy algorithms in sim much more thoroughly and inexpensively than we could in hardware, which is another great reason to use simulation. Next is advanced perception. That's taking things we do in industrial robotics, like looking at a part and understanding what it is and what its shape is, and things we do for autonomous vehicles, like looking at the world around us and trying to understand our environment and how we need to interact with it.

And that type of information is highly relevant to the space industry as well, because we've been talking about this idea of space debris and understanding what we're looking at when we're looking at a damaged satellite. That same reconstruction technology is highly valuable there. The next one is robotic manipulation, a fancy term for robot arms. A lot of the work we've been talking about with our dynamics-aware motion planning, using motion planning to figure out how to move a robot to adapt to a dynamic environment, all of that is highly relevant to ISAM, to ISRU, and beyond.

And then finally, off-road autonomy. So taking things we've been doing for the DOD for the last 15 years, looking at an environment and figuring out which places are safe to drive, which are not so safe, and which we should avoid. That type of thing is really useful on the moon, because there are no roads on the moon, right? So until there are roads on the moon, we really need to be applying some of these off-road autonomy technologies.

LP: A vehicle on the moon, it would be helpful if it could drive itself.

MT: Definitely, yes.

LP: OK. So we use a lot of scientific terms and research terms that most of us here at the Institute are familiar with. But for listeners who maybe aren't familiar with the term "datasets," could you tell us what they are and why that's important?

MT: Sure, so "datasets" is kind of a generic term that we use for lots of numbers or lots of images or lots of inputs to a sensor system. And that can take a number of different forms. So for off-road autonomy, maybe it's a video feed from a stereo camera pair moving through a field. Or for some of our advanced perception work, it might look like looking at sequential images of a satellite during a slow rotation.

Or other things might be IMU data, so inertial measurement unit data, so how much something is shaking or moving, discrete time steps of that information. So really datasets is a pretty general term that we use for any inputs to our autonomy systems so that we can understand our environment and what is currently happening to the robot.

LP: OK, great. So when we talk about robotics for space, there are two components, hardware and software. We discussed what the hardware needs to accomplish, things like manufacturing parts for repairs. But explain what the software does.

MT: Sure. So the software that we're looking at for space robotics, there's a couple of different things within our four capabilities of simulation, advanced perception, manipulation, and off-road autonomy. Within each of those, there's software capabilities that we are looking to develop. Within simulation, that dataset generation is really important in being able to create data to be inputs to our perception systems. Within advanced perception, a lot of the software that we're doing focuses on machine vision, both classical and machine learning.

Most of our internal research projects right now for space perception are focusing on more classical types of perception, because we want to be able to run this on a processor that could fly in space. That's a much lower-compute processor than what we're used to here on Earth. We're really spoiled with the desktop computers we work with, even the laptops that we have. Even small computers that seem relatively modest are much nicer than what we can fly in space. So a big part of what we're doing with our software is asking, can we take these complex perception algorithms and port them to something close to what could fly in space?

Moving on to manipulation and off-road autonomy: for manipulation, a lot of our software work comes from being able to dynamically plan a path. So instead of sending the robot up saying, you're going to move from point A to point B, we know exactly where those are, we know exactly what this process is going to be, here are the joint moves you need to do to execute that.

What we're looking at doing is, robot, look around your environment. Figure out where you need to start and where you need to end. And then from there, plan what your trajectory needs to be. So there's a difference there, in that it's not pre-programmed. It has to look around and figure out what it's supposed to do. And that's where we come in, not only on the space robotics side but also in our industrial manipulators work: using that visual input, or other kinds of input, to figure out, robot, what do I need to do next?

LP: That's incredible. OK, so in reading about our SwRI space robotics capabilities, I learned about Scan-and-Plan. So what is it and how does it work in space?

LB: This is that, hey, robot, look at your environment. Look at where you're starting. And I have a process that I need you to perform on something that's near you. Go ahead and plan that motion based on the surface that you're near. That's Scan-and-Plan. It is a term for a suite of tools that enable real-time trajectory planning from 3D scan data. That's the premise. You can put any object in front of a robot, take a scan of the object, and use that scan to plan the path that the robot will take to perform some process on the part's surface. This could be grinding, sanding, painting, welding, just to name a few different possibilities.
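As a caricature of that scan-then-plan idea, the sketch below turns a fake height-map "scan" into a back-and-forth toolpath that hovers a fixed standoff above the measured surface. The grid scan, the standoff value, and the raster strategy are illustrative assumptions, not the actual Scan-and-Plan tooling, which works on full 3D scan data:

```python
from typing import List, Tuple

def plan_raster_path(scan: List[List[float]],
                     standoff: float = 0.5) -> List[Tuple[int, int, float]]:
    """Plan a back-and-forth (boustrophedon) toolpath over a scanned surface.

    scan[row][col] is the measured surface height. The planner needs no
    prior model of the part: the path comes entirely from the scan.
    """
    path = []
    for row, heights in enumerate(scan):
        cols = range(len(heights))
        if row % 2:                # reverse every other row for a raster sweep
            cols = reversed(cols)
        for col in cols:
            # Hover a fixed standoff above the measured surface height.
            path.append((row, col, heights[col] + standoff))
    return path

# Fake 2x3 "scan" of a tilted plate (heights chosen to be float-exact).
scan = [[0.0, 1.0, 2.0],
        [0.0, 1.0, 2.0]]
path = plan_raster_path(scan)
print(path[:4])  # [(0, 0, 0.5), (0, 1, 1.5), (0, 2, 2.5), (1, 2, 2.5)]
```

The point the sketch preserves is the one Baye-Wallace makes: nothing about the part is known in advance, so the same planner works on any surface you put in front of the scanner.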

For Scan-and-Plan to work in space, we need vision systems that can work in the harsh lighting conditions we discussed and in high-radiation operating conditions; algorithms that are robust but can still run on, as we mentioned, low-memory processors, processors that have less storage than your phone did five years ago; and algorithms that can plan the arm's motions to prevent the kind of damage we mentioned earlier, the kind that could come from moving an arm violently on a satellite that's pretty small relative to the size of the arm.

MT: And I think it's really worth emphasizing with Scan-and-Plan that one of the things that makes it really novel is that it doesn't have to know what the part is in advance. So it can look at something and figure out based on this coarse scan, I'm going to do a finer scan to make sure I have all those details of this part. And then from there, I'm going to make my plan.

It doesn't need to know, hey, this is roughly the shape that you're expecting. It can start with no information and look around and figure out what to do from there. So that's one really great aspect of Scan-and-Plan is that it can handle things that it hasn't seen before essentially.

LP: So it's already doing that here on Earth. It's already in use for that capability. And now your team is tweaking it for space?

MT: Exactly.

LP: Really cool. So when we talk about all these capabilities and the things you're doing, has your work been implemented in space already? Are we getting to that point? Or when do you foresee some of the research you're doing now coming to fruition and being used in space?

MT: Hopefully very soon. So we don't have anything flying right now, but we have a number of projects that are looking in the direction of flight, and we're looking at opportunities to get some of our code on orbit. There's kind of a saying within the space industry: you have to have flown to fly. And that makes it a little bit hard to enter the market.

Because if you don't have prior experience flying your software, it might be hard to fly new software, right? That's a bit of a barrier to getting new things up in orbit. However, we have seen a shift over the last couple of years, and I anticipate it will continue, towards taking on things that are novel and trying to fly and test things that are different and a little more experimental.

Particularly as the cost to launch things gets cheaper and cheaper, there's more interest in trying to fly some of these missions that are maybe a little bit higher risk because it's technology that hasn't flown before. But we're seeing a lot of interest in that. And that's something we're actively pursuing. And I think within the next three years, we'll have something on orbit, if not much sooner.

LP: All right. And that kind of takes us to our next question. So what do you envision for the long-term future of space robotics? Ultimately, what large-scale projects are becoming possible with this technology?

MT: So I think we're really at a tipping point with the space robotics industry right now. It's an excellent time to be doing this research and to be working in this industry, because we're starting to see a shift towards the higher risk I mentioned earlier, trying things that are new, a little less tested, that haven't flown before. So we're seeing this new space economy that's moving towards doing more things on orbit and doing things farther and farther away from Earth.

So doing more things on orbit involves things like ISAM, the in-space servicing, assembly and manufacturing: refueling old satellites and repairing satellites, things that we want done and that make financial sense, but that we'd rather have robots do than send people out there. In most of these cases, the risk just doesn't make sense for a person.

So there's a huge push towards more ISAM, more autonomy in space. And as we start to move farther and farther away, there's a communication lag from outer space to Earth. So we have to have that autonomy. We have to have increased levels of capabilities of those robotic systems to be able to perform those science missions that are farther and farther out. And I'm really excited to be working in the industry right now. There's never been a better time to work in space robotics.

LP: The sky is-- not even the sky's the limit, right? [LAUGHS] Beyond.

MT: Maybe the solar system is the limit.

LP: The solar system is the limit. That's awesome. I love that. So part of what we like to do on the podcast is learn a little bit about the person behind our engineers and scientists. So tell us, I'd like for each of you to take some time to tell us about your journey into robotics. What sparked your interest into this field? And do you have any advice for people who would like to start a career in the field?

LB: Yeah. Well, I had always been curious about solving problems as a kid, and I got plenty of inspiration from seeing engineers on NOVA. What really sparked my interest was my controls coursework in college. I was already getting a mechanical engineering degree, but I really loved working at the intersection of so many disciplines. Robotics requires mechanical engineering, electrical design, system architecture, human factors, which is how humans interact with the system, and software design.

Advice for people interested in pursuing a career in robotics would have to be: stay curious and don't be afraid to ask questions. I got my position in my graduate research lab in college by asking the people at a salsa dance class what they did for a living. I happened to meet Dr. Carly Thalman in the last semester of her PhD, while she was seeking a master's student to continue her research. Being genuinely curious about others and about your passions not only opens doors, it starts conversations, which can turn into collaborations at your school or, as in my life now, paid work once you're in industry.

LP: And never underestimate the power of Salsa dancing.


LB: No, not at all.

LP: I think that's the lesson there. No, great story. Thank you for sharing your journey into robotics and the really incredible work you're achieving at this time. And, Meera, if you want to tell us a little bit more about your journey and your advice for someone who wants to get into the field.

MT: Sure. So I've known since I was about six years old that I wanted to be an engineer. I was that kid. So I also actually really loved my controls work in college, that intersection of mechanical and software systems.

LP: What do you say at six years old when you want to be an engineer? Like, did you know the term? Or did you know what you wanted to do specifically?

MT: I had some ideas for what I wanted to do. I thought airplanes and space were really cool. I was always a space nerd growing up, and always thinking that space was the coolest thing was a big part of what drove me into this current role. I'd cut out articles from the newspaper about NASA news and that type of thing. I really was that nerd as a kid.

LP: Yeah.

MT: It was great. [LAUGHS] But yeah, I loved Controls. When I graduated from undergrad, I started working at SwRI in Division 18, our mechanical engineering division, doing controls for turbomachinery. And about five years into that, after getting a master's, I decided that I wanted to try something new. That's when I moved over into robotics, because that's a great application of Controls engineering as well.

And a couple years after that, we started looking at other areas where we should be applying robotics. That's when space robotics came to be. And as a space nerd who does robotics work, I volunteered to take the lead on that. And it was a lot of fun. It's been an incredible thing to get to work on.

It's not a career path I envisioned for myself, because I don't think this career actually existed when I was a kid. And it certainly didn't exist when I was six years old. Space robotics is such a new and evolving field. So my advice would be, don't limit yourself to things that currently exist. Because you never know, you just never know what's going to happen.

LP: Wow, that's profound. You know, think big, think outside the box. And our new saying today is the solar system is the limit. So amazing. Well, Meera and Lily, thank you both for telling us about your work today as you advance this, again, incredible, cutting-edge field of space robotics.

MT: Thank you.

LB: Thank you so much for having us.

LP: And thank you to our listeners for learning along with us today. You can hear all of our Technology Today episodes, and see photos and complete transcripts at Remember to share our podcast and subscribe on your favorite podcast platform.

Want to see what else we're up to? Connect with Southwest Research Institute on Facebook, Instagram, X, LinkedIn, and YouTube. Check out the Technology Today Magazine at

And now is a great time to become an SwRI problem solver. Visit our career page at

Ian McKinney and Bryan Ortiz are the podcast audio engineers and editors. I am producer and host, Lisa Peña.

Thanks for listening.


Southwest Research Institute offers space robotics and automation solutions to transform Earth-based robotics capabilities into space-worthy technology. SwRI leverages its extensive and accomplished history in terrestrial off-road autonomy, perception, robotics and simulation to shift those capabilities into space.