Editor’s Note: The U.S. National Aeronautics and Space Administration (NASA) has long been recognized as a prolific source of technology innovations that were later transferred to the private sector. Below are two articles from NASA—“NASA Space Robotics Dive into Deep-Sea Work” and “NASA Turns to AI to Design Mission Hardware”— that describe NASA-developed designs with relevance for ocean robotics and complex custom parts.
March 28, 2023
NASA Space Robotics Dive into Deep-Sea Work
What’s the difference between deep space and the deep sea? For a robot, the answer is: not much. Both environments are harsh and demanding, and, more importantly, both are far removed from the machine’s operator.
That’s why a team of roboticists from NASA’s Johnson Space Center in Houston decided to apply their expertise to designing a shape-changing submersible robot that will cut costs for maritime industries.
“What NASA taught us is to put together robust software autonomy with a capable hardware morphology and deploy it in a remote setting,” said Nic Radford, founder, chairman, president, and CEO of Houston-based Nauticus Robotics Inc. During his 14 years at Johnson, Radford was, among other roles, deputy project manager and chief engineer for the humanoid robot Robonaut 2. Now more than 20 engineers who worked on that project and other NASA robots have joined the 80-person team he’s put together at Nauticus.
Whether a robot is working in space or on the seabed, the operator is far away, with limited communication and knowledge of the robot’s surroundings, Radford said.
“Even if you’re putting it on the space station and controlling it from the ground, there’s no high-speed data network. Talking to the space station is more akin to using dial-up.” So the robot has to sense and understand its environment, navigating obstacles and manipulating objects with minimal operator input.
For Robonaut 2, this meant Johnson engineers had to develop not just advanced hardware, like tendon-powered hands, elastic joints, and miniaturized load cells, but also vision systems, force sensors, and infrared sensors to gather information, as well as image-recognition software, control algorithms, and ultra-high-speed joint controllers to process and act on that data.
Built under a partnership between NASA and General Motors (GM), Robonaut 2 proved itself as an astronaut assistant aboard the International Space Station. But it was also a testbed for all these advanced robotic systems. NASA wants to develop robots to do dangerous work in space, run “precursor missions” that prepare for the arrival of human astronauts, and maintain facilities like the planned lunar Gateway station when astronauts aren’t around. GM, meanwhile, wanted to explore robotics that could assist factory workers. The project produced about 50 patents, several of which have already been commercialized as a robotic glove that GM and others are using in the workplace.
Cutting the Cord
Unlike robots in space, deep-sea robots can be connected to their operators by a cable, allowing high-speed data transfer and close control. But Radford said this comes at the price of staffing and operating a huge support vessel on the surface, to the tune of about $100,000 and 70 metric tons of greenhouse gas emissions per day.
Nauticus is eliminating that cord by enabling its robots to work with minimal supervision from a control center on a distant shore.
Bright orange, fully electric, and about the size of a sports car, Aquanaut, the company’s signature robot, resembles a propeller-driven torpedo as it motors to its destination. At that point, its shell pops open and the nose flips upward to reveal a suite of cameras and other sensors, now facing front. Two arms swing out, ending in claw hands that can be fitted with different tools.
To test the robot in 2019, the team returned to Johnson and used the giant astronaut training pool in the center’s Neutral Buoyancy Lab, where the robot could try out its systems in full view of operators and cameras.
A Floating Factotum
Aquanaut is designed for versatility, and Radford noted that there are plenty of different jobs for a subsea robot. Offshore oil and gas production is an obvious target because it requires a huge amount of underwater equipment that needs inspection and maintenance. But the fastest-growing ocean industry is wind power. About 25,000 offshore turbines are planned for operation by 2030, Radford said, and they all will require servicing and inspection.
With wild fish populations declining steeply, aquaculture – the farming of fish, shrimp, and other seafood – is growing fast, and the nets and cages in those underwater farms need regular cleaning and inspection, Radford said.
Other potential jobs include port management, maintaining subsea telecommunication cables, offshore mining of rare materials, and defense applications. Radford estimated the total maritime economy at about $2.5 trillion.
By early 2022, Nauticus had produced two Aquanauts and planned to build 20 more in the following three years. The company will primarily use them to provide affordable services, rather than selling them. For operations that require surface support, Nauticus is building a boat called Hydronaut that can be operated remotely or navigate itself.
By applying space solutions to maritime problems, Radford plans to make the Nauticus name synonymous with ocean robotics, he said. “Space is amazing because it feels existential – it’s way out there, and people want to explore it. But there are also many real challenges right here beneath the ocean, and we could stand to do more innovating in the ‘blue economy.’”
NASA has a long history of transferring technology to the private sector. The agency’s Spinoff publication profiles NASA technologies that have transformed into commercial products and services, demonstrating the broader benefits of America’s investment in its space program. Spinoff is a publication of the Technology Transfer program in NASA’s Space Technology Mission Directorate (STMD).
For more information on how NASA brings space technology down to Earth, visit: spinoff.nasa.gov.
This article was originally published by NASA.gov. (Editor: Loura Hall)
NASA Turns to AI to Design Mission Hardware
By Karl B. Hille
NASA’s Goddard Space Flight Center, Greenbelt, Maryland
GREENBELT, Md.—Spacecraft and mission hardware designed by an artificial intelligence may resemble bones left by some alien species, but the parts weigh less, tolerate higher structural loads, and take a fraction of the development time of parts designed by humans.
“They look somewhat alien and weird,” Research Engineer Ryan McClelland said, “but once you see them in function, it really makes sense.”
McClelland pioneered the design of specialized, one-off parts using commercially available AI software at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, producing hardware he has dubbed evolved structures.
To create these parts, a computer-aided design (CAD) specialist starts with the mission’s requirements and draws in the surfaces where the part connects to the instrument or spacecraft – as well as any bolts and fittings for electronics and other hardware. The designer might also need to mark off a path so that the part doesn’t obstruct a laser beam or optical sensor. Finally, more complex builds might require spaces for technicians’ hands to maneuver for assembly and alignment.
Once all off-limits areas are defined, the AI connects the dots, McClelland said, producing complex structure designs in as little as an hour or two. “The algorithms do need a human eye,” he said. “Human intuition knows what looks right, but left to itself, the algorithm can sometimes make structures too thin.”
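The workflow described above – fix the interfaces, carve out the off-limits zones, then let an algorithm fill in material between them – can be illustrated with a toy sketch. Everything here is hypothetical (the function name, the distance-to-interface heuristic, the grid sizes); real generative-design tools like those used at Goddard run full finite-element analysis rather than this crude proxy:

```python
import numpy as np

def evolve_structure(shape, interfaces, keep_out, keep_fraction):
    """Return a boolean material mask over a 2-D design grid.

    shape         -- (rows, cols) of the design domain
    interfaces    -- list of (row, col) cells where the part must attach
    keep_out      -- boolean mask of off-limits cells (e.g. an optical path)
    keep_fraction -- fraction of allowed cells to retain as material
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    # Crude load-path proxy: distance to the nearest interface point.
    dist = np.full(shape, np.inf)
    for iy, ix in interfaces:
        dist = np.minimum(dist, np.hypot(yy - iy, xx - ix))
    dist[keep_out] = np.inf          # never place material in keep-out zones
    n_keep = int(keep_fraction * (~keep_out).sum())
    # Retain the n_keep allowed cells closest to the interfaces.
    order = np.argsort(dist, axis=None)
    material = np.zeros(shape, dtype=bool)
    material[np.unravel_index(order[:n_keep], shape)] = True
    for iy, ix in interfaces:
        material[iy, ix] = True      # interface cells are always retained
    return material

# Example: 10x10 domain, two mounting points, a keep-out strip for a beam path.
keep_out = np.zeros((10, 10), dtype=bool)
keep_out[4:6, :] = True
mask = evolve_structure((10, 10), [(0, 0), (9, 9)], keep_out, 0.3)
```

The point of the sketch is the division of labor the article describes: the human supplies the constraints (interfaces and keep-out regions), and the algorithm decides where material goes – which is also why, as McClelland notes, the result still needs a human eye.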
These evolved structures save up to two-thirds of the weight compared to traditional components, he said, and can be milled by commercial vendors. “You can perform the design, analysis and fabrication of a prototype part, and have it in hand in as little as one week,” McClelland said. “It can be radically fast compared with how we’re used to working.”
Parts are also analyzed using NASA-standard validation software and processes to identify potential points of failure, McClelland said. “We found it actually lowers risk. After these stress analyses, we find the parts generated by the algorithm don’t have the stress concentrations that you have with human designs. The stress factors are almost ten times lower than parts produced by an expert human.”
McClelland’s evolved components have been adopted by NASA missions in different stages of design and construction, including astrophysics balloon observatories, Earth-atmosphere scanners, planetary instruments, space weather monitors, space telescopes, and even the Mars Sample Return mission.
Goddard physicist Peter Nagler turned to evolved structures to help develop the EXoplanet Climate Infrared TElescope (EXCITE) mission, a balloon-borne telescope developed to study hot Jupiter-type exoplanets orbiting other stars. Currently under construction and testing, EXCITE plans to use a near-infrared spectrograph to perform continuous observations of each planet’s orbit about its host star.
“We have a couple of areas with very tricky design requirements,” Nagler said. “There were combinations of specific interfaces and exacting load specifications that were proving to be a challenge for our designers.”
McClelland designed a titanium scaffold for the back of the EXCITE telescope, where the IR receiver housed inside an aluminum cryogenic chamber connects to a carbon fiber plate supporting the primary mirror. “These materials have very different thermal expansion properties,” Nagler said. “We had to have an interface between them that won’t stress either material.”
A long-duration NASA Super-Pressure Balloon will loft the EXCITE mission’s SUV-sized payload, with an engineering test flight planned as early as fall of 2023.
An Ideal Design Solution for NASA’s Custom Parts
AI-assisted design is a growing industry, with everything from equipment parts to entire car and motorcycle chassis being developed by computers.
The use case for NASA is particularly strong, McClelland said.
“If you’re a motorcycle or car company,” McClelland said, “there may be only one chassis design that you’re going to produce, and then you’ll manufacture a bunch of them. Here at NASA, we make thousands of bespoke parts every year.”
3D printing with resins and metals will unlock the future of AI-assisted design, he said, enabling larger components such as structural trusses, complex systems that move or unfold, or advanced precision optics. “These techniques could enable NASA and commercial partners to build larger components in orbit that would not otherwise fit in a standard launch vehicle. They could even facilitate construction on the Moon or Mars using materials found in those locations.”
Merging AI, 3D printing (or additive manufacturing), and in-situ resource utilization will advance In-space Servicing, Assembly, and Manufacturing (ISAM) capabilities. ISAM is a key priority for U.S. space infrastructure development as defined by the White House Office of Science and Technology Policy’s ISAM National Strategy and ISAM Implementation Plan. This work is supported by the Center Innovation Fund in NASA’s Space Technology Mission Directorate as well as Goddard’s Internal Research and Development (IRAD) program.
This article was originally published by NASA.gov.