Mention data, and thoughts may turn immediately to the insatiable data needs of artificial intelligence. But AI and large language models are just one of the forces driving the need for — and generating — large volumes of data. Consider, for instance, the broad field of materials science.

“At the end of the day, it’s the data that matters,” says UC Santa Barbara professor Tresa Pollock. She and McLean Echlin, a research scientist in her lab, know too well that progress in their own area of study and many others depends on instruments and experiments that generate enormous amounts of multimodal data, all of which has to be moved, stored, and retrieved over various time periods. The capability to merge large amounts of data from many different sources is at the heart of new discoveries across many fields.

Facing the recent data explosion ignited by increasingly sophisticated scientific instrumentation across the UCSB campus, Pollock, the Alcoa Distinguished Professor of Materials and then the interim dean of the UCSB College of Engineering (COE), successfully applied for a $250,000 grant from The Hearst Foundation. It enabled the purchase of state-of-the-art hardware to accelerate data acquisition, transfer, and analysis and to dramatically enhance storage capacity. That allowed many more graduate and undergraduate students to engage in research in this important emerging area, serving a Hearst Foundation goal of “preparing students to thrive in a global society.” The grant was also instrumental in UCSB’s ability to secure a $5 million NSF grant in 2025 to further advance the university’s Multimodal Imaging initiative.

“One of the foundational capabilities of the Materials Department is the continuously evolving suite of very high-end materials-characterization instruments, including electron microscopes and computed tomography (CT) machines, which now routinely generate large volumes of 3D and 4D (time-dependent) data,” Pollock reports. Attracting some five hundred total facility users from departments across campus, she adds, “The instruments in the Materials Department facilities alone can generate tens of terabytes [a terabyte is one thousand gigabytes] worth of data in a few hours of operation.”

“The typical Google account on campus might accept gigabytes (GB) of data, so cloud computing can serve many research needs,” Echlin adds. “But we can easily generate that by imaging a single, one-micron-thick laser-sectioned slice of a material [during a 3D tomography measurement]. Such huge amounts of data require a bespoke way of dealing with them. The latencies involved in sending and retrieving so much data to and from the cloud — which can be many days — are too long. The cloud can only do so much. In the system made possible by the Hearst grant, we're talking about much faster ethernet connectivity to large storage pools, where no cloud is necessary.”

Data moves so rapidly that speed might not seem an issue, but when many billions of data bits are being sent or retrieved, latencies lasting milliseconds add up fast. Thus, the closer the site where data is generated is to where it is stored and processed, the better.
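To put the scale into perspective, here is a rough, back-of-the-envelope calculation; the link speeds are illustrative assumptions, not figures from the article, applied to the roughly twenty terabytes the article cites for a few hours of microscope operation.

```python
# Back-of-the-envelope estimate (illustrative assumptions, not from the
# article) of how long it takes to move a large instrument dataset over
# a shared uplink versus fast local ethernet.

def transfer_hours(terabytes: float, gigabits_per_sec: float) -> float:
    """Hours needed to move `terabytes` of data over a link running at
    `gigabits_per_sec`, ignoring protocol overhead and latency."""
    bits = terabytes * 1e12 * 8              # terabytes -> bits
    seconds = bits / (gigabits_per_sec * 1e9)
    return seconds / 3600

# ~20 TB (a few hours of microscope output) over two hypothetical links:
print(transfer_hours(20, 1))    # ~44 hours on a shared 1 Gb/s uplink
print(transfer_hours(20, 100))  # ~0.44 hours (about 27 minutes) locally
```

Under these assumed speeds, the same dataset takes nearly two days on a shared 1 Gb/s path versus about half an hour on a local 100 Gb/s link, which is why placing storage next to the instruments matters.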

Google Says “Bye”

So, how do researchers know when they have too much data and a new approach is needed? One sign, Pollock says, is that Google “kicks you out of the cloud. That’s what they did to us. We had a month to shrink our data or move it all somewhere else.”
 
In giving data-driven researchers the boot in fall 2023, Google identified some “top offenders,” Pollock continues. “We were among them, although I took that as a bit of a point of pride. Echlin was required to reduce his cloud data storage by ninety-five percent.”

Materials professor Daniel Gianola was also flagged as a data “culprit.”

“Our group, in collaboration with several others on campus, performs advanced electron microscopy and diffraction to study structure-property relationships in metallic and ceramic materials used in extreme environments,” Gianola explains. “These instruments are equipped with ultrafast state-of-the-art cameras that are sensitive down to single-electron events and can generate upwards of a terabyte of data per minute, which then must be processed efficiently to decipher the finest details of our materials. The cameras, several of which are located in the core Microscopy and Microanalysis Facility (MMF) on campus, have made the handling of data a central Google-cloud challenge representing a moment of deep data truth.”
 
“The exodus from Google meant that we had to start placing things proximal to each other [to gain the above-mentioned speed advantage],” Echlin says, adding that previous efforts on campus aimed at placing data creation, processing, and storage close to each other had been “mostly serendipitous.” 

Echlin had for some time been seeking to optimize use of the High Performance Computing Center (HPC), which is part of the UCSB Center for Research Computing (CRC) in Elings Hall — also the site of the microscopy suite. “Having the infrastructure for connectivity in one building rather than spread across campus unifies a lot,” Echlin says. “Connecting those entities was an essential step that gave us places to put data archivally and also to literally stream the data directly from an instrument to storage as we're generating it.”

“Prior to receiving the Hearst grant, users were at the mercy of slow data rates to store, process, and archive data from those instruments, and they always struggled to have effective data backup and archiving options,” Gianola adds. 

“The grant enables streamlined workflows across the data ecosystem. I believe that this system will serve as a model for other shared experimental facilities around the country,” Pollock adds. “Faculty like Dan Gianola, who in March received the 2026 Brimacombe Medal from The Minerals, Metals & Materials Society for outstanding mid-career scientists, will need these foundational data capabilities to continue to lead the field.”

As a result of the grant, Echlin says, “At least temporarily, the campus data needs are under control. We have a decently large buffer between how much storage we have and how much we need, but we’ll need more in a couple of years.”

System hardware like that installed in the HPC is only as valuable as the technical support provided to maintain it and ensure the integrity of, and access to, data, which HPC provides. “They lease us some real estate and give us some real estate, where we can put some of the servers, and they help us maintain them,” Echlin says. “Support also comes from GRIT (General Research Information Technology), which has the focus of ensuring that the needs of researchers and scientists are represented on campus.” 

A duplicate data setup has been placed in the Chemistry Building and mainly serves the cryo-electron microscope, which also generates vast volumes of data. Dorit Hanein, a faculty member in the Departments of Bioengineering and Chemistry & Biochemistry, manages the facility.

Campus Archival Resources

A key element of improving long-term data storage for all research efforts on campus derives from a very old technology that has been improved and now occupies a new data-storage niche. Says Pollock, “For archival data, we're moving to high-density tape.”

“Tape is associated with the birth of the computer age, but the densities have gone up, so that, even though it’s slower, a lot of data centers use it to store infrequently accessed data,” Echlin adds. “In our scenario, it will be used for data that was made long enough ago that it probably can go on a shelf, and if somebody really needs it, then they can get it out of that deep storage; they just won’t have immediate access.”
 
The result of a collaboration involving the UCSB Research Office and Hearst resources, an archival tape system has been established as a service to all researchers on campus, ensuring long-term access to their data.

The Future of Data

In terms of data’s future, Pollock says, “I don't see any letup; there are valuable discoveries to be made as we develop algorithms to learn from large, multimodal datasets and understand their complexities. We can only make these discoveries if we have the large datasets at our disposal.” Looking further ahead, she adds, “As a field, we want a large, up-to-date digital library of materials that we can use to discover, design, and manufacture better materials for the future. There will undoubtedly be developments in AI that enable us to operate on the data and accelerate the cycle for designing new materials. We are just at the beginning of what is an exciting time to be a materials scientist. The first step, though, is to ensure that the data lives somewhere near where it is generated, and we are grateful to the Hearst Foundation for making that possible.”

UC Santa Barbara chancellor emeritus and mechanical engineering professor Henry T. Yang is among three UCSB faculty members elected as new members of the American Academy of Arts and Sciences for 2026. Yang is joined by Alice Alldredge, an emeritus professor in the Department of Ecology, Evolution and Marine Biology (EEMB), and professor of astrophysics Lars Bildsten, along with 249 other new members of the 2026 cohort, each a leader in academia, the arts, industry, journalism, philanthropy, policy, research, or science. The Academy’s first cohort, chartered in 1780, included George Washington.

In addition to having occupied the chancellor's office at UCSB for thirty-one years, Yang is a distinguished professor of mechanical engineering with particular research expertise in solid mechanics, materials and structures, dynamic systems and control, and micro and nanotechnology, as well as aerospace structures, structural dynamics, composite materials, finite elements, transonic aeroelasticity, wind and earthquake structural engineering, and intelligent manufacturing systems. He has authored or co-authored more than 190 articles for scientific journals, as well as a widely used textbook on finite element structural analysis.

Yang is a member of the National Academy of Engineering and a fellow of the American Institute of Aeronautics and Astronautics, the American Society for Engineering Education, and the American Society of Mechanical Engineers. Since stepping down as chancellor in 2025, he has returned to teaching full-time; even as chancellor, he continued to teach a mechanical engineering class every year throughout his tenure.

Continuing a strong tradition at UC Santa Barbara’s Robert Mehrabian College of Engineering, Yuheng Bu, an assistant professor in the Computer Science Department, has received a prestigious CAREER Award from the National Science Foundation (NSF).

The ability of generative artificial intelligence (AI) to produce text at scale has created an urgent need for trustworthy ways to identify and trace AI-generated content. The project for which Bu received the CAREER Award, titled “LLM Watermarking and Beyond: Foundations and Algorithms via Distributional Information Embedding,” is aimed at advancing watermarking, a family of methods that embed a hidden signal into generated text so that it can be identified later, while maintaining the text’s usefulness and naturalness.

We caught up with Bu earlier this month.

Q: Can you describe generally your intention to develop an attribution model that is more reliable than current approaches?
Yuheng Bu: The existing practice of watermarking cannot encode more than a single yes-or-no signal, telling us only whether or not a piece of text appears to be watermarked. This binary identifier is useful for basic detection, but is often not sufficient for richer attribution or forensic use, because it cannot reveal which model generated the text, when it was produced, or with whom it was associated.

Q: How does your approach improve on binary watermarking to make it more useful?
YB: In our approach, metadata, such as the model version, generation source, timestamp, or user-level attribution information, can be encoded. This richer information would make watermarking more useful, since it supports not only detection, but also fine-grained tracing and accountability.

Q: Does your approach address the issues of watermark forgery and erasure? 
YB: Yes. Watermarks can often be removed by rewriting the text without changing its meaning. For example, an LLM can paraphrase the text, or it can be translated into another language and then translated back, which may preserve the meaning while disrupting the watermark signal. This means that the watermark may not survive even when the content itself remains essentially unchanged.
     A related concern is watermark forgery or spoofing. In that case, an attacker may generate harmful content, such as hate speech, that falsely appears to carry the watermark of a legitimate system. This can damage the credibility and reputation of the watermarking scheme, because it creates the impression that the protected system produced content that it actually did not.
     To address these challenges, we need watermarking methods that are both more robust to removal and more secure against forgery. 

Q: How might your research support responsible use of generative AI in research, education, and general society?
YB: One way is by improving tools for protecting intellectual property in datasets, increasing trust in automated reviews and other AI-assisted writing, and supporting secure communication among AI systems. These goals are achievable through our proposed research, in which a general framework would be developed to support these applications. At the same time, full real-world impact will also require follow-on work and broader efforts in the same direction.

Q: Speaking of the future of AI research, can you tell us about the educational component of the project?
YB: Educational activities include developing hands-on training via short course modules for high school students, interactive workshops for junior high families, and workshops for high school science teachers. Undergraduate and graduate students will have opportunities for rich learning embedded in competitions around watermarking to increase AI security. These efforts will give students early practice in thinking about reliability, security, and design trade-offs in generative AI, so that they learn to build and evaluate AI systems responsibly from the start.

Q: The proposal mentions embedding multi-bit information into text generated by LLMs. Why is that so challenging?
YB: Embedding multi-bit information means encoding a small amount of metadata into the generated text. It’s challenging because text has limited freedom: the model must still produce fluent, natural, and semantically accurate language. The more information we try to embed, the harder it is to preserve text quality while also maintaining robustness and security.

Q: Can you do it?
YB: We are making progress toward this goal. We have developed a theoretical framework for analyzing text quality, robustness, and security in zero-bit watermarking, and we are exploring different strategies to generalize this framework to multi-bit watermarking. Currently, we can embed a few bits in a sentence, but we believe there is room for improvement.

Q: What is the distributional information embedding problem mentioned in the proposal? 
YB: In simple terms, traditional watermarking often works by inserting identifying information into an existing text passage. In contrast, for generative AI watermarking, the information is embedded during the creation process itself. That means that we gently steer how the model generates content by adjusting the probability distribution over next-token predictions so that the final output carries a hidden signature that can later be detected.
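Bu’s own algorithms are not spelled out here, but the mechanism he describes, gently steering the next-token probability distribution so the output carries a detectable signature, can be illustrated with the widely known “green list” scheme of Kirchenbauer et al., used below purely as a generic stand-in over a toy vocabulary (all names and numbers are invented for illustration):

```python
import hashlib
import random

# Toy vocabulary standing in for an LLM's token set (illustrative only).
VOCAB = [f"tok{i}" for i in range(50)]

def green_set(prev_token: str, fraction: float = 0.5) -> set:
    """Pseudorandom half of the vocabulary, seeded by the previous token.
    Generator and detector can both recompute it from the text alone."""
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16) % (2**32)
    rng = random.Random(seed)
    vocab = list(VOCAB)
    rng.shuffle(vocab)
    return set(vocab[: int(len(vocab) * fraction)])

def generate(n: int, boost: float = 4.0, seed: int = 0) -> list:
    """Sample n tokens, gently tilting the (here: uniform) next-token
    distribution toward the green set -- the hidden signature."""
    rng = random.Random(seed)
    out = ["<start>"]
    for _ in range(n):
        green = green_set(out[-1])
        weights = [boost if t in green else 1.0 for t in VOCAB]
        out.append(rng.choices(VOCAB, weights=weights)[0])
    return out[1:]

def green_fraction(tokens: list) -> float:
    """Detector: fraction of tokens falling in their predecessor's green
    set -- about 0.5 for unmarked text, much higher for watermarked text."""
    prev = "<start>"
    hits = 0
    for t in tokens:
        hits += t in green_set(prev)
        prev = t
    return hits / len(tokens)
```

With a 4x boost, roughly 80 percent of sampled tokens land in the green set, versus the 50 percent expected of unmarked text, so even a few hundred tokens give a statistically clear signal without the detector ever seeing the model.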

Q: What is meant by the sequential generation of LLM–generated text?
YB: Generated text is sequential because each new word depends on the words that came before it. For example, after “The cat sat on the,” the next word is much more likely to be “mat” than something unrelated. That is very different from independent samples, where each draw is made separately and does not depend on previous ones, like repeatedly drawing numbers from — and returning them to — a bag. So, “dependence” here means context dependence across tokens, or small pieces of text that an LLM can generate.
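Bu’s bag-of-numbers analogy can be made concrete with a toy bigram model (the vocabulary and transition probabilities below are invented for illustration): sequential generation conditions each draw on the previous token, while independent sampling ignores context entirely.

```python
import random

# Hypothetical bigram model: each word's next-word distribution
# depends on the word before it (probabilities invented for illustration).
BIGRAM = {
    "the": {"cat": 0.5, "mat": 0.5},
    "cat": {"sat": 1.0},
    "mat": {"the": 1.0},
    "sat": {"on": 1.0},
    "on":  {"the": 1.0},
}

def generate_sequential(start: str, n: int, rng: random.Random) -> list:
    """Each draw is conditioned on the previous token, like an LLM."""
    out = [start]
    for _ in range(n):
        dist = BIGRAM[out[-1]]
        out.append(rng.choices(list(dist), weights=list(dist.values()))[0])
    return out

def generate_independent(n: int, rng: random.Random) -> list:
    """Each draw ignores context, like repeatedly pulling numbers
    from a bag and putting them back."""
    vocab = list(BIGRAM)
    return [rng.choice(vocab) for _ in range(n)]
```

Every adjacent pair the sequential generator emits is a valid bigram, whereas the independent sampler will happily produce pairs like "sat sat" that the context-dependent model never would; that context dependence is what watermarking for LLMs has to work with.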

Q: Can you talk a little about the tradeoff, mentioned in the proposal, between watermark robustness and preserving natural text?
YB: One key trade-off is between information rate and text quality. That is, the more information we try to embed in the text, the harder it is to keep the output fully natural and fluent. Another is between robustness and detectability, because while making the watermark stronger can improve detection and make removal harder, it may also increase distortion or make the pattern easier for an attacker to identify and spoof.

Q: This project proposal has an ambitious list of goals. Can you realize all of them? 
YB: A single project is unlikely to fully solve the problems of watermark forgery and removal, so our contribution is best understood as what we hope will be a meaningful foundation, not a complete solution. The goal is to develop algorithms and authentication mechanisms that substantially improve our ability to distinguish genuine watermarks from forgeries and to identify likely removal attempts, with provable guarantees in well-defined settings.

Q: What makes “in-context” watermarking different from its predecessors?
YB: Unlike most existing watermarking methods, which require access to the model’s decoding process, in-context watermarking embeds the watermark through the user prompt alone, using the LLM’s in-context learning and instruction-following ability.
     This makes it different in two ways. First, it is model-agnostic: the party applying the watermark does not need control over the model internals or the decoding algorithm. Second, it is especially useful in settings such as AI-generated peer reviews, where organizers may suspect LLM use but have no access to the underlying model. In that case, the watermark can be induced through carefully designed prompts, and later detected from the generated text.

Thanks to a new nonprofit — the Electrochemistry Foundry (ECF) — and construction begun under its auspices, UC Santa Barbara is poised to join a group of collaborating partners in a new era of battery prototyping. The effort is aimed at bridging the gap between innovative technology and commercial availability, thus securing the technological foundations of the modern economy.

The public launch of ECF was announced on April 15, with the goal of accelerating the commercialization of advanced energy technologies in California’s first shared-use battery pilot manufacturing line. A $28 million competitive award from the California Energy Commission (CEC) has enabled the organization to move forward with developing the facility, to be located in Hayward, California, on the eastern side of San Francisco Bay, bringing job opportunities to a designated disadvantaged community.
  
The network of collaborating entities includes UC Berkeley, UC Riverside, the Volta Foundation, the Catalyst Innovation Group, and Lawrence Berkeley National Laboratory. Additionally, ECF will onshore world-class manufacturing expertise from South Korea through dedicated operational support from Top Material, an industry leader in operating flexible Li-ion manufacturing lines.  

“The support from the California Energy Commission will help to establish a state-of-the-art battery-component manufacturing pilot line at UCSB’s recently established OASIS research facility, thus bolstering efforts to develop and scale up novel manufacturing processes while training the next generation of battery engineers,” said Jeff Sakamoto, a battery expert in the Materials and Mechanical Engineering Departments at UCSB’s Robert Mehrabian College of Engineering, the Mehrabian Endowed Chancellor's Chair, and director of the U.S. Department of Energy's Mechano-chemical Understanding of Solid Ion Conductors.

ECF’s mission calls for it to provide shared infrastructure and expertise required to address the current high-cost transition from laboratory research to industrial-scale production, considered the “missing link” in the American innovation ecosystem. With access to ECF’s pilot line, a startup developing, say, a new battery cathode can produce the first fifty multi-layer pouch cells needed to show to investors, without building a $10 million facility of their own.

“I’ve seen too many brilliant breakthroughs stall out in the pilot-scale gap,” said ECF CEO Dr. Brenna Teigler, whose background includes roles at Activate, Cyclotron Road, and the U.S. Department of Energy. “Our vision is a world powered by electrochemistry, where the path from scientific discovery to societal impact is open to all innovators. The next great battery breakthrough — whether it comes from a startup or an established company — should not be stopped by the cost of infrastructure they can't justify building alone.” 

“The ECF will fill a great unmet need by bridging the gaps between cutting-edge battery innovation, commercialization, and scale-up in California,” said Sakamoto, adding that UCSB will contribute ceramic-electrolyte R&D to enable advanced electrochemical technologies such as solid-state batteries and membranes for energy-efficient lithium separation.
 
The centerpiece of ECF’s operations is its 20,000-square-foot state-of-the-art facility, strategically located to leverage the Bay Area’s hardware-engineering talent and support economic growth in the region. Scheduled for completion in late 2026, the facility sits in the middle of the highest concentration of electrochemistry startups, world-class academic institutions, national labs, startup accelerators, and venture capitalists in the world. It features a comprehensive manufacturing line capable of producing at least ten thousand cells per year, supporting both pouch and cylindrical formats.

“We are at a historic moment in the evolution of energy technology, where laboratory breakthroughs must rapidly become industrial realities to meet California’s climate and energy goals,” said Anthony Ng, manager of technology innovation and entrepreneurship at the CEC. “Batteries supporting a clean grid and electrifying transportation play a critical role in realizing California’s vision of a one-hundred-percent clean-energy future. The CEC’s $28 million award to ECF supports this vision, thus ensuring that California remains the global hub for the entire lifecycle of electrochemical development.”

As a nonprofit with no commercial stake in the technologies developed within it, ECF operates as a fully IP-neutral resource. Its users retain complete ownership of their intellectual property, eliminating the conflicts of interest that can complicate partnerships with for-profit contract manufacturers or corporate innovation programs. This open-access model is supported by industrial-grade capabilities including precision electrode fabrication and advanced cell assembly in a high-spec dry room, formation- and performance-validation systems, and the data infrastructure to link raw materials directly to electrochemical results. Together, these resources allow startups and researchers to refine processes, produce industry-ready products, and prepare for high-volume production. 

“The future of the battery industry depends on our ability to scale both innovation and the workforce supporting it,” said Yen T. Yeh, executive director of the Volta Foundation. “ECF’s approach brings those together, and Volta Foundation is proud to partner in supporting it.”

ECF is also committed to community-focused workforce development. In partnership with the Bay Area Community College Consortium and United Steelworkers, ECF will provide hands-on training that prepares those in the local workforce for high-skill careers in the energy and advanced-manufacturing sectors.

ECF is hiring and accepting expressions of interest from startups, researchers, and companies seeking pilot-scale electrochemical manufacturing and testing capacity. The facility is expected to welcome its first users in the first quarter of 2027.

Nearly thirty current students, incoming graduate students, and recent alumni from The Robert Mehrabian College of Engineering (COE) at UC Santa Barbara have been awarded prestigious graduate research fellowships from the National Science Foundation (NSF), recognizing their exceptional promise in science and engineering. The NSF awarded more than 2,500 fellowships for the 2026-27 academic year from a pool of nearly 14,000 applicants. Recipients are selected based on intellectual merit and broader impacts, including their potential to advance scientific discovery and contribute to society.

“The NSF GRFP is one of the clearest indicators of future leadership in science and engineering,” said Umesh Mishra, dean of The Robert Mehrabian College of Engineering. “Seeing so many of our students recognized at this level speaks to the culture of innovation, rigor, and collaboration that drives discovery at UC Santa Barbara, both at the graduate and undergraduate levels, and impacts the global economy.”

The NSF Graduate Research Fellowship Program (GRFP), one of the nation’s most competitive honors for graduate students in STEM fields, provides three years of financial support over a five-year period. Each fellowship includes a $37,000 annual stipend and a $16,000 cost-of-education allowance, totaling $159,000. 

This year’s recipients from COE include fifteen current students, at least eight incoming PhD students, and six alumni who are now pursuing graduate degrees at other institutions. The 2026-27 cohort represents a wide range of disciplines and research areas across the college, spanning six departments from COE: bioengineering, chemical engineering, computer science, electrical and computer engineering, materials, and mechanical engineering. 

2026-27 GRFP Recipients from The Robert Mehrabian College of Engineering

Current UCSB students (14)

Ethan Chen, Materials PhD student

A first-year PhD student, Ethan Chen studies the complex supercurrent behavior of hybrid Josephson junctions based on thin-film cadmium arsenide, a two-dimensional topological insulator. His research is aimed at advancing fault-tolerant qubits, a key step toward more efficient quantum computing systems.

Advised by materials professor Susanne Stemmer, Chen says the NSF Graduate Research Fellowship provides both validation and opportunity. “The fellowship affirms the confidence I have in my scientific proficiency and highlights the impact of my research support system,” he said, adding that it enables him to pursue more curiosity-driven research into quantum materials.

Keyes Eames, Materials PhD student 

Advised by Distinguished Professor Steven DenBaars, Keyes Eames is a first-year PhD student broadly focused on gallium nitride (GaN) optoelectronics, with potential applications ranging from energy-efficient data communication to ultraviolet sterilization technologies.

For Eames, the fellowship’s greatest impact is the flexibility it provides. “The most important professional impact of the award will be academic freedom,” he said. “The flexibility enables me to choose the most impactful research and career path. On the personal side, the incredible research opportunity and implied career trajectory provided by this fellowship feel like an incredibly exciting and weighty responsibility.”

Trevor Hagan, Materials PhD student 

First-year PhD student Trevor Hagan, who is co-advised by Rachel Segalman and Craig Hawker, is developing electrostatically crosslinked polymer networks with applications in plastic waste recycling. His research focuses on enabling more efficient reuse and repurposing of polymers, advancing sustainable materials solutions.

Raised in rural southern Indiana, Hagan credits his parents for nurturing his early interest in science, often traveling long distances to support his curiosity. Receiving the fellowship marks both a personal and academic milestone.

“I am the first in my family ever to attain an academic degree, and thus also the first to pursue a PhD,” he said. “Receiving an NSF Fellowship is both personally and academically affirming: multiple blind reviewers found my background, training, and research ideas compelling enough to invest in.”

Anika Jena, Chemical Engineering undergraduate 

Anika Mahajan Jena, a chemical engineering senior, will pursue her PhD at Stanford University, where she will develop and probe advanced functional polymeric materials for health, sustainability, and human advancement. As an undergraduate researcher at UCSB, Jena studied phase-separating membrane-actin network composites and tuned their mechanical properties with then-chemical engineering assistant professor Sho Takatori. Later, she engineered supramolecular architectures by coupling DNA nanotubes to condensates with physics professor Deborah Fygenson.

Jena, a 2024 Congressional Goldwater Scholar, explains that the NSF Fellowship will support her research beyond traditional funding constraints. “This fellowship will enable me to freely conduct independent, cross-disciplinary work on self-driven research projects,” she said. She indicates possible future research in self-healing, stimuli-responsive, conductive, and biocompatible soft materials.

Christopher Koh, Mechanical Engineering PhD student 

A first-year PhD student, Christopher Koh is exploring the intersection of control theory and machine learning to advance safer, more reliable intelligent systems. Advised by Francesco Bullo, his work focuses on integrating the mathematical rigor of control theory with the flexibility of machine learning, an approach aimed at improving performance in safety-critical applications such as robotics, aerospace systems, and power grids.

While machine learning offers powerful new capabilities, its lack of reliability in high-stakes environments remains a key limitation. Koh’s research seeks to address that challenge, developing methods that can meet the demands of real-world deployment where precision and dependability are essential.

Receiving the prestigious NSF Fellowship provides both validation and stability early in his graduate career. “Securing funding in the current environment at the federal level provides a large sense of relief,” Koh said. “This allows me to focus more fully on my research and degree requirements, and to pursue work that aims to make these systems more flexible, safe, and reliable.”

Ben Kunimoto, Bioengineering PhD student; Data-Driven Biology Trainee

Ben Kunimoto, a first-year bioengineering PhD student advised by Siddharth Dey, is developing a new technology to map the subcellular localization of the transcriptome, enabling researchers to determine where mRNA from all genes is situated within individual cells. His work addresses a longstanding challenge in biology: while nonuniform mRNA localization is known to play a critical role in processes ranging from embryogenesis and tissue development to immunology and neuroscience, it has been difficult to study at a genome-wide scale due to technological limitations. Kunimoto aims to overcome this barrier and apply the method to investigate how polarized mRNA localization influences cell differentiation during early mammalian embryogenesis.

“I feel very honored to have received this fellowship, and I’m thankful that it will help me and my lab pursue interesting and valuable research,” he said.

Karlee Macaw, Mechanical Engineering master’s student 

Karlee Macaw, a first-year master’s student advised by Ryan Stowers, studies how cells respond to the mechanical properties of their environment, with implications for diseases such as cancer and fibrosis.

Macaw describes the fellowship as a pivotal milestone in her academic journey. “It validates the work I’ve committed to and makes pursuing a PhD focused on mechanobiology research a real possibility,” she said, noting that it provides the freedom to focus on impactful research. 

Conor Pugsley, Chemistry & Biochemistry undergraduate student

A fourth-year undergraduate triple-majoring in chemistry and biochemistry, pharmacology, and statistics and data science, Conor Pugsley has conducted research in the Materials Department since his first day on campus. Working with materials associate professor Angela Pitenis, he has investigated how polymer architecture influences mechanical performance, with the goal of designing more effective materials for biomedical implants. A Beckman Scholar and active participant in Center for Science and Engineering Partnerships (CSEP), Pugsley has built a strong foundation at the intersection of chemistry, materials science, and data-driven research.
 
He will pursue a PhD in Bioengineering, where he plans to study the chemistry of biomaterials to enable precision drug delivery of complex macromolecular therapeutics. His proposed research focuses on using supramolecular chemistry to control hydrogel mesh structures, allowing for more precise encapsulation and release of therapeutics—an approach with potential to reduce overdose risks and improve automated dosing in medical treatments.

“Receiving this fellowship reflects positively on my drive to pursue research and highlights the excellent mentorship I have received along the way,” Pugsley said. “It enables me to pursue my own work in graduate school and focus on the intellectual curiosity that initially brought me to biomaterials research, while laying the foundation for securing future research support as I work toward becoming a professor.”

Naomi Rehman, Computer Science PhD student 

A first-year PhD student advised by Tim Sherwood and Jonathan Balkind, Naomi Rehman works at the intersection of computer architecture and artificial intelligence, aiming to improve AI efficiency and enable privacy-preserving systems on edge devices.

Rehman says that the fellowship affirms her academic path and opens new possibilities. “When I switched from robotics to computer engineering with the hope of becoming a computer architect, I was nervous about whether it was a wise decision. Receiving this fellowship feels like a confirmation that it was,” she said. “I’m especially proud to represent women in computer architecture, and I hope receiving this encourages other women to pursue careers in the field. Academically, I’m extremely grateful for the freedom that this fellowship gives me to pursue the work that I find most promising and impactful.”

Ava Salami, Chemistry undergraduate 

A fourth-year chemistry and biochemistry major, Ava Salami works as an undergraduate researcher in the lab of bioengineering assistant professor Marley Dewey. She investigates incorporating cell-type-specific extracellular vesicles into biomaterials for tissue-specific repair in the musculoskeletal system.

Rohil Shah, Computer Science undergraduate

A fourth-year undergraduate in the College of Creative Studies, Rohil Shah studies computing with a focus on search systems and artificial intelligence. His research centers on designing efficient, time-aware search engines that can rapidly adapt to new or evolving information, with applications ranging from real-time information retrieval to combating misinformation during breaking events. After graduating, Shah will pursue a master’s degree in computer science at Stanford University.

Shah credits his mentors—especially computer science professor Tao Yang, with whom he works most closely—and the broader UCSB community for shaping his academic path and research direction.

“Receiving the fellowship feels like the product of their mentorship and guidance,” he said. “It motivates me to take full advantage of this opportunity and to explore the intersection of search systems and artificial intelligence in order to develop technologies that can meaningfully improve people’s lives.”

Roenigk Straub, Materials PhD student

Roenigk Straub, a first-year PhD student advised by Daniel Oropeza, studies parameter-induced porosity in additively manufactured metals, with specific applications in aerospace, electronics cooling, and catalysis.

The fellowship has increased Straub’s confidence as he transitions into academic research. “Prior to coming to UCSB, I had been focused on working in industry, so receiving this fellowship has increased my belief in myself as a researcher,” Straub said. “Learning that others see potential in my work further motivates me to make the scientific breakthroughs I outlined in my proposal a reality.”

Straub says the fellowship will allow him to broaden the number of alloys and functional properties that he can examine, expanding the potential impact of his work to a wider range of next-generation technologies.

Andres Torres, Mechanical Engineering PhD student 

A first-year PhD student advised by Elliot Hawkes and Tobia Marcucci, Andres Torres is developing autonomous capabilities for soft robots designed to safely interact with humans, similar, Torres says, to “Baymax from the movie Big Hero 6.”

Torres describes the award as both a personal milestone and a reflection of strong mentorship at UCSB. “Receiving the NSF Fellowship means so much to me,” said Torres. “I definitely would not have won this without the amazing resources and faculty here at UCSB, especially my advisors. This fellowship will enable me to pursue my dream of working on robotics research!”

Christopher Xu, Mechanical Engineering PhD student 

Christopher Xu, a first-year PhD student advised by Elliot Hawkes, develops highly agile robotic systems capable of navigating complex environments, with applications in exploration and environmental monitoring. For example, he is working on a robot that jumps into trees like a squirrel.

Xu says the fellowship allows him to fully immerse himself in research. “I can focus entirely on research, which is what I came here to do,” he said, adding that the support gives him the freedom to explore ambitious ideas that might otherwise be difficult to pursue. “I love the work that I do in the lab like a hobby, and I am grateful to be able to do it with so much flexibility.”

Roland Yin, Electrical Engineering undergraduate student

Roland Yin, a senior undergraduate in electrical engineering, conducts research on quantum and inorganic materials under the guidance of professors Stephen Wilson and Ram Seshadri. His work explores structure-property relationships, beginning with energy-efficient synthesis of sodium battery cathodes and evolving to superconducting materials for quantum applications.

Currently, Yin investigates electronic correlations in kagome superconductors, a quasi-two-dimensional class of materials that exhibit an interplay between unconventional superconductivity and charge-density waves. By selectively alloying and doping these systems, he aims to understand how electron–electron interactions emerge and to leverage intrinsic Josephson effects for quantum sensing applications, including high-sensitivity radio-frequency detection in quantum computers and atomic clocks.

“Receiving the NSF GRFP is both exciting and deeply validating,” Yin said. “It recognizes the research I have pursued over the past few years as a meaningful contribution to quantum technologies, and it affirms that I have the potential to grow into an independent researcher who can frame interesting questions, discover innovative solutions, and communicate noteworthy results. As someone who envisions a future in academia, being selected by NSF gives me a strong sense of support heading into my PhD and academic career.”

In addition to the fourteen current engineering students, at least eight incoming PhD students who will begin their studies at UCSB this fall and six recent alumni now pursuing graduate degrees at other universities also received NSF Graduate Research Fellowships.

Recent UCSB alumni now pursuing graduate degrees at other universities (6)

Stephanie Anujarerat (Chemical Engineering), now at Carnegie Mellon University

Marcus Condarcure (Chemical Engineering), now at Purdue University

August Dolmatch (Chemistry), research assistant in the lab of Carolyn Mills (Bioengineering)

Aaron Huang (Physics), undergraduate researcher in the lab of Omar Saleh (Materials), now at the University of Chicago 

Sofia Rivalta Popescu (Chemical Engineering), now at Stanford University

Anton Semerdjiev (Chemical Engineering), now at the California Institute of Technology

Incoming PhD students who will begin their studies at UCSB in fall 2026 (8)

Three in the Electrical & Computer Engineering Department

Two each in the departments of Bioengineering and Materials 

One in the Chemical Engineering Department

Established in 1952, the GRFP has supported over 70,000 graduate research fellows, many of whom have gone on to become leaders in research and innovation. More than 40 former fellows have received Nobel Prizes, underscoring the program’s long-standing role in advancing scientific innovation and leadership. 

“Basically with the right AI model, if you fit as much data as you want and contribute as much computing power as possible, then it can achieve great things. But the key question is, What is next?” That’s the question Xin (Eric) Wang, an assistant professor of computer science, addressed on April 9 at the most recent installment of the UC Santa Barbara Library AI in Action speaker series.

Wang presented his work alongside Fabian Offert, an assistant professor for the history and theory of digital humanities. The speaker series continues next month, with The Robert Mehrabian College of Engineering distinguished professor Simon Billinge, director of the California NanoSystems Institute, and Nina Miolane, assistant professor of electrical and computer engineering and co-director of REAL AI (Reliable, Efficient, and ALigned AI) for Science, speaking on May 18 at 4 p.m. in the library’s first-floor Instruction & Training Room (1312).

Wang, who directs the UCSB Center for Responsible Machine Learning, discussed his approach to the future of AI — the development of AI agents that can understand and respond to humans and to their environment, and take action on the information they receive.

“I believe the next big step is to build AI agents that can see and understand the multimodal environment, that can chat with humans in natural language, and more importantly, take actions to interact with the environment.” He sees applications for AI agents in always-on-call medical assistants, smart drones that can irrigate crops, and digital agents that people interact with through their phones and apps.

As part of his presentation, Wang described how his startup, Simular, where he is head of research, is developing AI agents that can reason more like humans do.

Many current large language models, he said, use a process called “chain-of-thought reasoning,” a step-by-step reasoning process, similar to the way a person might run through a list of ideas. “But here is the problem: humans don’t actually think this way, at least not all the time,” Wang said. Thought isn’t always verbal and sequential: it can leap from idea to idea, free-associate, and veer off track before coming up with an answer.
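Chain-of-thought prompting can be illustrated with a toy sketch (the question and prompt wording here are hypothetical examples, not code from Wang's talk):

```python
# Toy illustration of chain-of-thought prompting (hypothetical example,
# not code from Wang's work). The same question is posed two ways:
# directly, and with an instruction to write out intermediate steps.

QUESTION = "A train travels 120 miles in 2 hours. How far does it go in 5 hours?"

def direct_prompt(question: str) -> str:
    # Ask for the answer alone.
    return f"Q: {question}\nA:"

def chain_of_thought_prompt(question: str) -> str:
    # Ask the model to reason step by step before answering; this
    # phrasing is the classic zero-shot chain-of-thought trigger.
    return f"Q: {question}\nA: Let's think step by step."

print(chain_of_thought_prompt(QUESTION))
```

The only difference is the instruction appended to the prompt, which nudges the model to emit its intermediate reasoning before the final answer.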

To incorporate more of those looser, human-like thought processes, Wang and his research team are introducing the idea of “soft thinking” to AI, a more associative process in which the AI agent can follow different lines of inquiry simultaneously. Wang is also working on how to better evolve “soft thinking” and other traits in AI over time.
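One way to picture the contrast between committing to a single next step and a softer, mixture-style step is a toy sketch (all numbers and dimensions invented for illustration; this is not Wang's actual method):

```python
import math
import random

# Toy sketch of "hard" vs. "soft" next-step selection. Hard decoding
# commits to the single most likely candidate; a soft step instead
# propagates a probability-weighted blend of all candidate embeddings,
# keeping several lines of inquiry alive at once.

random.seed(0)
EMB_DIM, VOCAB = 8, 5
# a random embedding vector for each of 5 candidate tokens
embeddings = [[random.gauss(0, 1) for _ in range(EMB_DIM)] for _ in range(VOCAB)]
logits = [2.0, 1.5, 0.3, -1.0, -2.0]  # invented model scores for the candidates

# softmax over the candidate tokens
z = [math.exp(l) for l in logits]
probs = [v / sum(z) for v in z]

# hard step: commit to the single most likely token
hard_step = embeddings[probs.index(max(probs))]

# soft step: probability-weighted blend of all candidates
soft_step = [sum(p * e[d] for p, e in zip(probs, embeddings))
             for d in range(EMB_DIM)]

print(len(hard_step), len(soft_step))  # both are 8-dimensional vectors
```

Both steps live in the same embedding space, but the soft step carries information from every candidate rather than discarding all but one.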

At the presentation, Wang discussed how most development of AI agents is modeled on biological processes — finding a successful version, then amplifying that version while discarding the others, a survival-of-the-fittest model. “But if you think about it, AI agents are not biological, they're digital. Then why are we forcing them to evolve like biological individuals?” Wang said. “That’s very inefficient, actually. AI agents are not constrained by genetics, lineage, or reproduction. And they can directly share everything they have, including their trajectories, experience, tools, and workflows.”
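That contrast between biological-style selection and direct digital sharing can be sketched in a toy example (the agent names, fitness scores, and tool sets are all invented for illustration):

```python
# Toy contrast, illustrative only (not Wang's code): biological-style
# evolution amplifies only the fittest agent and discards the rest,
# while digital agents can pool everything they have learned directly.

agents = [
    {"name": "A", "fitness": 0.9, "tools": {"search", "summarize"}},
    {"name": "B", "fitness": 0.4, "tools": {"summarize", "translate"}},
    {"name": "C", "fitness": 0.7, "tools": {"plan"}},
]

# Survival-of-the-fittest: clone the best agent, discard the others,
# losing whatever the discarded agents had learned.
best = max(agents, key=lambda a: a["fitness"])
biological_next_gen = [dict(best) for _ in agents]

# Group evolution: every agent inherits the union of all tools/experience,
# so nothing learned by any member of the population is lost.
shared_tools = set().union(*(a["tools"] for a in agents))
group_next_gen = [{**a, "tools": set(shared_tools)} for a in agents]

print(sorted(shared_tools))
```

In the biological model, agent C's planning skill vanishes with agent C; in the group model, every agent ends up with it.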

In a recent paper posted to arXiv, he and computer science PhD students Zhaotian Weng, Antonis Antoniades, Deepak Nathani, Zhen Zhang, and Xiao Pu report that this approach, using AI agents that evolve as a group, outperforms self-evolving AI agents and matches or improves upon human-designed AI agents.

AI: Embedded in Culture

Before Wang’s presentation, Offert, director of the Center for Humanities and Machine Learning, discussed the emerging discipline of critical AI studies, a field that is the subject of his forthcoming book, Vector Media, written with Leonardo Impett, an assistant professor at the University of Cambridge in the UK.

The book focuses on the idea of embedding, the way that text, images, and graphs are represented so that they can be processed as data by machines. One aspect of embedding is that images and text often need to be compressed, which results in lost data.

Offert showed an example: a blurry video of the movie Shrek, in which the images had been compressed but were still recognizable. “Compression exploits our inattention to certain perceptual aspects of visual data,” Offert said.

Compression also works in only one direction. “We can never retrieve the image from the embedding,” Offert said. “We can, of course, save a kind of artificial connection here and just remember that we embedded this image, but we cannot go back.”
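The one-way nature of such compression can be seen in a toy sketch (the "image" and the averaging scheme are invented for illustration and are not Offert's example):

```python
# Toy illustration of embedding as lossy, one-way compression.
# A 6-number "image" is compressed to 3 numbers by averaging adjacent
# pairs; distinct images can collapse to the same embedding, so there
# is no way back from the embedding to the original.

def embed(image):
    # average adjacent pairs: 6 values -> 3 values (information is discarded)
    return [(image[i] + image[i + 1]) / 2 for i in range(0, len(image), 2)]

image_a = [10, 20, 30, 40, 50, 60]
image_b = [15, 15, 35, 35, 55, 55]  # a different image entirely

print(embed(image_a))  # [15.0, 35.0, 55.0]
print(embed(image_b))  # [15.0, 35.0, 55.0] -- identical embedding
```

Because two different inputs map to one output, no function could recover the original image from its embedding, which is the "we cannot go back" property Offert describes.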

Offert traced the idea of compression through the early days of artificial intelligence, and discussed its connections with both vision science and political and economic history. For example, financial markets are “very abstract, and removed from the natural form of goods and services,” he said. “The price of making everything exchangeable, making everything commensurable through embedding is that we lose the characteristics of the media that we embed.”

In a question-and-answer session following the talks, Offert discussed AI’s role in different fields. In the sciences, Offert said, “Time and again, it has been shown that synthetic data [developed by iterating through AI models] is fine. It doesn’t get you all the way, but it works.” In contrast, multiple recent papers have shown that for some humanities research, “especially in terms of aesthetics, if you look at image generators, there’s a convergence on the worst kind of stock photography,” he said. “It becomes very clear what good and bad applications of AI systems are, where they really shine, which is not in making images for you, but in other domains."

“It’s one of those must-do classes. Honestly, there's not going to be a better opportunity to really dig into a specific project and see it to fruition.” That’s how fourth-year computer science student Cooper Hawley describes the UC Santa Barbara Computer Science (CS) Department’s two-quarter capstone course sequence, which brings together student teams and industry leaders to tackle real-world problems. After spending six months working on challenges from supporting stroke survivors through a smartwatch app to allowing property managers to easily schedule repair and maintenance work, eight student teams presented their projects in March at the annual CS capstone day.

The event draws undergraduate and graduate students, faculty, alumni, and industry partners for the presentations, a poster session, awards, and networking opportunities for the forty students participating in the capstone project. At the event on March 9, the teams presented their projects in front of a judging committee made up of Chris Bunch from Google, Wei-Tsung Lin from Salesforce, and Alexis Cole from Amazon. 

“Over the years, I've seen the capstone projects evolve from isolated technical prototypes into highly sophisticated, production-ready systems. There is a noticeable trend toward solving high-stakes, real-world problems,” said Chandra Krintz, computer science professor and associate dean of the UCSB Graduate Division. “The sophistication has increased because students are no longer just coding,” noted Krintz, who taught this year’s winter-quarter capstone project course. “They are acting as systems architects, integrating complex distributed agents, leveraging recent advances in artificial intelligence and machine learning, and incorporating advanced cloud services.”

This year, three teams took home awards. First-prize winner Team Altera created one of those sophisticated, production-ready designs, Krintz said. “They tackled the notorious bottleneck of healthcare referral coordination by transforming the multiday medical-referral process into a pipeline that allows patients to schedule referral appointments in less than an hour.”

Team Cadense took second place for the smartwatch-based therapy system they developed to support stroke survivors as they retrain their walking cadence. And third-place winner Team RapidRecall, sponsored by Cottage Health, designed and built a system that automates the analysis of recalls from the FDA and industry suppliers, accelerating a hospital’s ability to intercept harmful medical devices and pharmaceuticals before they reach the patient.

“The judges have repeatedly told us, ‘This is the best day of our year.’ And we have senior people in the industry who see a lot of cool things,” said computer science professor Dahlia Malkhi. She said that, while she knew it was a cliché, the capstone students were all winners. “They all worked so hard, they had phenomenal achievements, and they went from being forty individual students in the fall to being eight cohesive teams, each with a mini company or mini startup in a completely different domain. And their presentations were just staggering.”

Students began the capstone project in Malkhi’s fall-quarter course. (A capstone project in either computer science or computer engineering is a graduation requirement for computer engineering majors, and an optional elective for computer science majors.) In the first few weeks, students formed groups, then heard pitches about potential projects from industry partners AppFolio, Artera, Cadense, Cottage Health, Mysten Labs, Planet Labs, SpaceComputer, and Visual Layer. Groups ranked their preferences, and Malkhi and her team matched the groups with projects. 

From there, students met both in class and with a mentor as they conceptualized and developed their approach to the problem. Rithwik Kerur, a second-year computer science PhD student and the capstone teaching assistant, also supported the teams from project selection to their final presentations.

Faster, More Accurate Medical Referrals

Hawley and his teammates — fourth-year computer engineering majors Aden Jo, Benjamin Soo, David Duenas, and John Hagedorn — worked with sponsor Artera, which provides a platform used by healthcare organizations to improve patient access, engagement, and outcomes. With the help of Kerur and Anav Sanghvi, a senior software engineer at Artera, they designed a process that allows doctors and patients to set up and schedule referrals with the help of an AI agent — a process that typically takes several days, but that the team shortened to less than an hour.

Doctors and other health-care providers have access to their patients’ intake interviews and their progress throughout the system, Hawley noted, so that there is human input and oversight at every step.  

Hawley said that the change in his team’s project between the two quarters was unmistakable. “In the first quarter, it felt like it was not coming together. But by the time we entered the second quarter, with the break between the two, we could look back more transparently and see that we’d set a really nice foundation.”

Keeping Movement on Track for Stroke Survivors

Team Cadense — fourth-year computer engineering students Vincent Cheong, Jim Wang, Christopher Lai, Scott Ricardo Figueroa-Weston, and Jeremiah Wong — was inspired by a stroke survivor named Kevin and his efforts to relearn how to walk. Under close supervision in physical therapy clinics, stroke survivors and others with movement disorders practice movements so that the brain relearns through repetition and feedback. But at home, these exercises can be harder to maintain, which can affect a patient’s progress.

To address this problem, Team Cadense developed both a device and an app to help patients continue to train at home. The device attaches to a walking stick and provides a steady beat, allowing patients to sync their movements to visual, audio, and haptic cues. The smartwatch app, which can be used independently or with the device, gives patients feedback as they walk.

“At first, when we were designing the app, we designed with ourselves — students — in mind,” Figueroa-Weston said. “But when we gave the app to the patients, they didn’t know what to do.”  They used this feedback to create a user-friendly interface with a simple design and large buttons that would be easy to use for older patients who have  physical impairments.

“The most rewarding part of this process was being able to talk with the patients,” Figueroa-Weston said. “That was something that I definitely would not have gotten outside of the class. And it has shaped my process of how I would build things in the future. Now, I know it’s important to talk to your users as soon as possible.”

Capstone Projects Kickstart Careers

While some projects wrap up once the course ends, others continue to grow. Three of the eight capstone projects from last year went live in some form, Malkhi said, including one team that formed a startup that was subsequently funded by the City of Santa Barbara’s incubation program. 

In other cases, student projects become the seeds for further academic research. “I’ve already admitted several students who want to continue their projects,” Malkhi said, “so we’re going to incorporate them into the research we have in my lab.”

Malkhi noted that no matter the outcome of the capstone project itself, the experience has long-term ripple effects. “Some of the students get jobs from the companies they’ve interacted with. Some get jobs from the judges or spectators that came to the capstone event,” she said. “And all the students that come back to talk with me say that in their job interviews, they have a story to tell beyond their transcripts. They can talk about the project and the teamwork, the challenges they’ve encountered, and the full-fledged design and product. They tell me that this is the one thing that gets them job offers, and that they also feel good about interviewing, because they have something to show for themselves.”

A team of three UC Santa Barbara students is one of fifty teams worldwide advancing to the World Finals of the 2026-’27 International Collegiate Programming Contest (ICPC). Team UCSB-WA qualified by placing sixteenth out of fifty-two teams at the North America Championship (NAC), held March 22 in Orlando, Florida. The top sixteen teams from Canada and the U.S. earned guaranteed qualification to the World Finals, to be held November 15–29 in Dubai, United Arab Emirates. Two additional wildcard spots were awarded, with the possibility of more to follow.

Coached by alumnus Wesley Hung (’24 Computer Science), the team includes second-year undergraduate students Ezra Furtado-Tiwari (CCS Computing, Mathematics) and Om Mahesh (CCS Mathematics; COE Computer Science), and fourth-year undergraduate David Qiao (L&S Computer Science, Mathematics).

Widely regarded as the premier collegiate programming competition, the ICPC challenges teams to solve complex algorithmic problems under intense time constraints, mirroring the type of problem-solving required in modern software development. Now, the advancing teams will present those skills on the world stage. 

World Finals competitors will represent top problem-solving talent. “To advance, a team needs extreme problem-solving skills,” said event organizer and UCSB computer science professor Daniel Lokshtanov. “Imagine the hardest homework problem you've ever gotten; now, solve ten of those in five hours.” In fact, a defining moment for the team came late in the contest, when a breakthrough on a difficult interactive problem secured, by a hair, their spot at the World Finals.

For Qiao, the victory marks a dream fulfilled after four years of competing in the Southern California Regionals, with just one problem separating him from qualifying in past years.

“This year, by not only making it to NAC but also qualifying for worlds, I feel like I’ve completed every goal I’ve had for my competition career,” he said. “I'm very thankful for my team as well and the group that we practice with; it's been a joy to discuss problems alongside them.”

The team’s finish took everything — and almost every minute — they had. “We finished just on the guaranteed cutoff to qualify for the World Finals, with a margin of only fourteen minutes of penalty time ahead of the next school,” Qiao said.

"I'm incredibly proud of Ezra, Om, and David," said Timothy Sherwood, UCSB distinguished professor of computer science and dean of the College of Creative Studies (CCS). “The ICPC World competition is the biggest possible stage for these creative problem solvers! I also love that all three of our undergraduate-serving colleges (CCS, Engineering, and Letters & Science) are represented on one team. That is representative of the collaborative spirit that makes UCSB so wonderfully unique."

Ezra Furtado-Tiwari is in it for the challenge and the experience. “I’m most excited about trying harder problems and learning new techniques,” he said. “Even when problems feel out of reach, exploring them is part of what makes this process so rewarding.”

Said Coach Wesley Hung, “I'm really proud of them for qualifying for the ICPC World Finals. I've seen them practicing diligently for months leading up to both the regional and national competitions, and I'm very happy to see their hard work pay off.”

In an ongoing effort to bring quantum science out of the tightly controlled lab environment and into the field, researchers from UC Santa Barbara and the University of Massachusetts Amherst have, for the first time, demonstrated a chip-scale, stabilized, visible light laser that drives a trapped ion atomic optical clock and quantum qubit, paving the way toward compact, portable and scalable trapped-ion quantum information systems.
 
“This work is foundational in that we demonstrated that chip-scale integrated photonic stabilized lasers can be used to connect precision light to one of the narrowest atomic optical transitions that people work with, with the trapped ion itself created on a surface trap chip operating at room temperature,” said Daniel Blumenthal, a professor of electrical and computer engineering at UCSB and a senior author of a paper published in Nature Communications.

Miniaturization is the name of the game for Blumenthal’s research group, which is working to shrink normally large lasers, optical components, and often room-sized quantum optical light-matter experiments down to about the size of a deck of cards. The traditional lasers and other components that power these experiments typically occupy 90% of a setup that fills tabletops and equipment racks, requires hand-tuning, and is very susceptible to environmental disturbances. Scaling these components down to the chip and providing room-temperature operation makes it possible to bring the power and precision of quantum measurement, sensing, and computation to more researchers and a wider variety of experiments, while also making these technologies more robust and portable.

“These portable quantum circuits can then be located at many places on the Earth and flown in satellites, to the moon, and into space,” Blumenthal said. “The ability to use these precision portable clocks opens a wealth of applications and fundamental science including search for dark matter and dark energy, the mapping of gravity, and measurement of general relativity and the search for fundamental constants, and possible time varying changes in these constants.” Networks of these clocks can sense and measure gravity on Earth and create gravity maps around other solar objects, he added, or sense shifts in geological conditions.

Visit The Current for the complete story.

A new age of 3D printing is here, even though the initial technology for what is also known as additive manufacturing arrived less than twenty years ago. UC Santa Barbara is stepping into the era thanks to a $1.15 million grant from the National Science Foundation (NSF) to purchase the most cutting-edge 3D printing technology available: a 3D rapid nanoprinting system based on two-photon photolithography. The equipment will enhance the capabilities of the already widely recognized UCSB Nanofabrication Facility (aka the “Nanofab” or “Nanotech”).

“The unique capabilities of this system open the door to new approaches to nano- and micro-manufacturing of complex structures and devices that are no longer constrained by geometry nor confined to two-dimensional planes,” the authors write.

By securing the grant, lead PI Galan Moody (left), UCSB professor of electrical and computer engineering, and four co-PIs — Marley Dewey (Bioengineering), Andrew Jayich (Physics), Sumita Pennathur (Mechanical Engineering), and Andrea Young (Physics) — are ensuring that UCSB can take a leadership role in pushing the boundaries of what the new technology can do. Says Moody: “There are just a few universities in the U.S. that have tools with these capabilities.”

The tools are needed, the proposal reads, “because we are at the limit of what can be achieved with existing nanofabrication tools, which have enabled wafer-scale fabrication of semiconductors, dielectrics, and metals with resolution down to approximately ten nanometers [nm], but only in a planar [essentially two-dimensional] geometry. Additional complex, time-consuming steps are required to create increasingly essential 3D microstructures.”

Advances in the past couple of years have brought 3D printing to the realm of the very small, supporting an array of applications by enabling on-chip 3D printing of microstructures, a capacity that will benefit researchers in many disciplines.

Moody provides an example to illustrate a limitation of the first 3D printing technology. “Normally,” he says, “you start with a semiconductor wafer [typically silicon] and use photolithography to yield a semiconductor with a pattern that has been transferred to it. If you look top down, it's just a 2D pattern. There is depth to it, but it's more like a thin film that’s maybe a few hundred nanometers thick, so you can’t raster-scan it in all three dimensions to make a nice 3D structure. Ten-nm-resolution lithography is available at off-campus commercial foundries, but none is capable of creating complex 3D structures with nanoscale resolution and high speed for high-throughput prototyping, which are required for next-generation devices. Being able to make structures in true three dimensions opens new capabilities.”

Five Researchers, Five Uses

The principal investigators’ research reflects the diverse focuses of potential users — and uses — of the new equipment.
Moody, an expert in integrated quantum photonics, will create new photonic chip designs for ultra-efficient entanglement distribution and networking (example above; photographs courtesy of Nanoscribe): Panel A: A top-down view shows seven tiny lines — waveguides — at the edge of a photonic chip. Light travels from the chip to another component, such as a fiber optic cable. The flaring bright spots (bottom) are 3D-printed lenses that prevent the light from diverging as it leaves each guide. Panel B: Direct-write printing on fiber of a triangular component that creates a smooth transition between the smaller-diameter mode from the waveguide and the larger-diameter fiber optic cable (right). Panel C: A spherical “ball” lens about 30-40 microns wide and a waveguide (entering from bottom right) that was printed onto it. Light enters, hits the sphere, and is then launched upward. Smoothness is key, because jagged edges can trap or scatter light, resulting in loss.

Jayich will apply his expertise in trapped-ion systems to microprint 3D ion traps for optical clocks. Some of what he needs for experiments is not available on campus, and off-campus commercial vendors — except those who sell the new 3D-printing tools — cannot fabricate prototypes quickly enough. The new tool will enable rapid prototyping of 3D-printed ion-trap structures on campus.

Dewey has a newly established research program at UCSB combining biomaterials with extracellular vesicles for skeletal repair and disease treatment, including broader impacts in coral regeneration and repopulation. She plans to use the system to create patterned biomaterials, or scaffolds, like the one shown at left, which can be used for a variety of purposes.  

Pennathur intends to 3D print microsystems to analyze fluids, and also to make microfluidic channels on chips. Researchers in her lab engineer micro- and nanofluidic systems to study how fluids, ions, and biological molecules behave when confined to channels approaching nanometer scales, where surface chemistry, Debye layers, and geometry, rather than bulk fluid properties, govern transport. A central goal is translating the physics of such processes into functional devices, such as biosensors, passive flow-control systems, and implantable therapeutics that regulate themselves without external actuation. 3D printing enables rapid iteration on device architectures that would otherwise require weeks of cleanroom fabrication, accelerating the path from physical insight to working prototype. The photograph above shows a nanofluidic channel with integrated electrodes, with a dime for scale.
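To see why Debye layers dominate at these scales, a quick back-of-the-envelope calculation helps. The figures below are illustrative, not from the proposal: for water at room temperature with a 1:1 electrolyte, the Debye screening length follows the standard rule of thumb λ_D [nm] ≈ 0.304 / √(I [mol/L]), so at millimolar salt concentrations the charged layer spans most of a 10-nanometer channel.

```python
import math

def debye_length_nm(ionic_strength_molar: float) -> float:
    """Debye screening length in water at 25 C for a 1:1 electrolyte.

    Uses the standard approximation lambda_D [nm] ~ 0.304 / sqrt(I [mol/L]),
    which follows from lambda_D = sqrt(eps*kB*T / (2*NA*e^2*I)).
    """
    return 0.304 / math.sqrt(ionic_strength_molar)

# In a ~10 nm channel, dilute salt solutions give a Debye layer
# that fills a large fraction of the channel cross-section:
for conc in (0.1, 0.001):  # ionic strength, mol/L
    print(f"I = {conc} M -> lambda_D ~ {debye_length_nm(conc):.1f} nm")
```

At 0.1 M the screening length is under a nanometer, but at 1 mM it approaches 10 nm, comparable to the channel itself, which is why confinement changes the transport physics.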

In his research, Young combines nanofabrication and electronics to investigate the properties of electronic states in quantum materials. He can use the new 3D-printing tool to create what he calls a nano-SQUID, or superconducting quantum interference device, and attach it to the tip of an atomic-force microscope to enhance its ability to characterize materials. The image below shows a nano-SQUID that Young would be able to print in-house.

All of the PIs will work closely with the Nanofab technical and operational director, Brian Thibeault, to coordinate system installation and qualification, as well as to implement training and the educational outreach activities described below.

Like Jayich, Moody would like to do in-house work that is currently either not being done or has to be sent to vendors for required engineering. “Using the new tool to do it ourselves has multiple benefits,” he says. “We do the work faster and for less cost, my students will get trained on the very best equipment, and we can share the in-house knowledge with the rest of the UCSB photonics community.” 

Fueling the Future Workforce

Any new technology requires a workforce of people who can use it, especially in industry, and Moody sees educating students as a high priority. The proposal spoke convincingly about plans to train not only UCSB students on the equipment, but also local community-college students. That gives those students, many of whom are first-generation college students, access to good jobs without having to earn a PhD or, perhaps, depending on their circumstances, attend university at all. “They will learn to use the equipment if they go through one of our boot camps or participate in one of the many UCSB internship programs operated with the support and leadership of the Center for Science and Engineering Partnerships,” Moody says.

Those programs include the Central Coast Partnership for Regional Industry-focused Micro/Nanotechnology Education (CC-PRIME). Led by Santa Barbara City College and run through the California NanoSystems Institute (CNSI) at UCSB, CC-PRIME partners with local technology companies of all sizes and regional community colleges to train students in nanofabrication skills. It is part of a broad effort to build a regional educational pipeline to cultivate the micro/nanotechnology workforce.

Some students who participate in such programs might then opt to pursue a four-year degree or attend graduate school, but not all companies need someone with an advanced degree to run their fabrication processes. “If students have a certification saying, ‘I've gone through these boot camps,’ then they become valuable assets to companies,” Moody observes. “That might lead some students to think, Hey, I can do this. I've got the skills. Let me go get a job now.”

What Makes the New Tech New

While both the original and the new processes of 3D printing share a name, the earliest technology was actually more of a two-dimensional framework, Moody explains. “Three-dimensional objects could be created, but only by aggregating (the ‘additive’ part of ‘additive manufacturing,’ as the first generation of 3D printing was also called) very thin layers of material, which had length and width but no real depth.”
 
Quantum networks, secure quantum communications, quantum sensors, and optical quantum computers and simulators require high-quality sources of entangled photon pairs, and much research is devoted to improving source quality, especially the pair-generation rate (PGR), the number of photon pairs extracted from the chip per second. The proposal explains that optical extraction efficiency from integrated chips to fiber is limited by the diameters of the various optical modes, which typically differ from each other by ten to fifty percent for quantum networking. To get around that discrepancy and “get the light into the fiber,” Moody says, “we bring the fiber right to the edge of the chip to capture it before it can diverge.”
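The cost of that mode-diameter mismatch can be estimated with the standard power-overlap formula for two aligned Gaussian modes, η = (2·w₁·w₂ / (w₁² + w₂²))². This is a back-of-the-envelope sketch, not a calculation from the proposal:

```python
def gaussian_coupling_efficiency(w1: float, w2: float) -> float:
    """Power coupling between two aligned Gaussian modes with waist radii w1, w2."""
    return (2 * w1 * w2 / (w1**2 + w2**2)) ** 2

# Mode-diameter mismatches in the 10-50% range cited in the proposal:
for mismatch in (0.10, 0.30, 0.50):
    eta = gaussian_coupling_efficiency(1.0, 1.0 + mismatch)
    print(f"{mismatch:.0%} mismatch -> {eta:.1%} coupled, {1 - eta:.1%} lost")
```

Even under these idealized assumptions (perfect alignment, purely Gaussian modes), a 30 percent mismatch alone forfeits roughly 7 percent of the light, and 50 percent forfeits about 15 percent, which is why mode-matching lenses printed directly on the chip edge or fiber tip matter so much.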

The new system makes it possible to print a tiny polymer lens less than fifty micrometers wide onto the edge of a chip, where it guides the optical mode. Fig. 3, Panel B shows an optical fiber and a small 3D-printed cone on the end of it that also acts as a lens, focusing the light so that it enters and propagates down the fiber rather than scattering as “loss,” a big problem in photonics. The lens can be printed either on the edge of a chip or on a fiber, as long as the light coming out of the chip is “matched to collect into the fiber,” Moody says.

“We try to arrange the components such that the light doesn’t ‘see’ the fiber as a discontinuity. That allows the light beam coming out of the photonic chip to couple nicely into the fiber and keep going, resulting in very low loss,” Moody adds. “That's really hard to do, because there is often a mismatch between the shape of the optical beam in our small waveguide and how it looks in the much-larger fiber. Making the transition requires a smooth 3D structure without any jagged edges, which can trap or scatter the light, resulting in loss.”

Smooth nanoscale 3D printing is essential for many structures and devices being made at UCSB. The new 3D printing technology can deliver it.