Five faculty members from the Texas A&M University College of Engineering were selected to receive a 2021 Distinguished Achievement Award from Texas A&M and The Association of Former Students. They were among 24 outstanding members of the university's faculty and staff to be honored.

The College of Engineering recipients recognized for their research were Dr. Mahmoud El-Halwagi, professor and Bryan Research and Engineering Chair in chemical engineering, Artie McFerrin Department of Chemical Engineering, and managing director of the Gas and Fuels Research Center; and Dr. Svetlana Sukhishvili, professor, Department of Materials Science and Engineering, and director of the Soft Matter Facility. Dr. Ibere Alves, professor of practice, Harold Vance Department of Petroleum Engineering, and Dr. Aakash Tyagi, professor of practice, Department of Computer Science and Engineering, were recognized for teaching; and Dr. Gerard L. Coté, Texas A&M Regents Professor, James J. Cain Professor I in biomedical engineering and director of the Center for Remote Health Technologies and Systems, was recognized for graduate mentoring.

The university-level Distinguished Achievement Awards were first presented in 1955 and have since been awarded to those who have exhibited the highest standards of excellence at Texas A&M. In recognition of their achievements, each recipient receives a cash gift, an engraved watch and a commemorative plaque. See the complete list of honorees on Texas A&M Today.
Since the early Industrial Revolution in the mid-1700s, fossil fuels have acquired an ever-growing footprint in energy production. But the environmental concerns of fossil fuel use and their inevitable depletion have led to a global shift toward renewable energy sources. These transitions, however, raise questions about the best choice of renewables and the impact of investing in these resources on consumer cost.

In a recent study published in the journal Nature Communications, researchers at Texas A&M University have devised a metric that reflects the average price of energy in the United States. Much like how the Dow index indicates trends in stock market prices, the researchers' metric reflects the changes in energy prices resulting from the type of energy sources available and their supply chains.

"Energy is affected by all kinds of events, including political developments, technological breakthroughs and other happenings going on at a global scale," said Stefanos Baratsas, a graduate student in the Artie McFerrin Department of Chemical Engineering at Texas A&M and the lead author on the study. "It's crucial to understand the price of energy across the energy landscape, along with its supply and demand. We came up with one number that reflects exactly that. In other words, our metric monitors the price of energy as a whole on a monthly basis."

Today, the energy industry is largely dominated by fossil fuels like coal, natural gas and petroleum. An increase in fossil fuel consumption, particularly in the last few decades, has raised increasing concerns about their environmental impact. Most notably, the Intergovernmental Panel on Climate Change has reported an estimated increase of 0.2 degrees Celsius per decade in global temperature, which is directly linked to burning fossil fuels. But only around an 11% share of the total energy landscape comes from renewable sources. Although many countries, including the United States, have committed to using more renewable energy sources, there isn't a way to quantitatively and accurately measure the price of energy as a whole. For example, an establishment might use a combination of solar and fossil fuels for various purposes, including heating, power and transportation. In this case, it is unclear how the price would change if there is an increased tax on fossil fuels or if subsidies in favor of renewables are introduced.

"Energy transition is a complex process, and there is no magic button that one can press and suddenly transition from almost 80% carbon-based energy to 0%," said Dr. Stratos Pistikopoulos, director of the Texas A&M Energy Institute and senior author on the study. "We need to navigate this energy landscape from where we are now toward the future in steps. For that, we need to know the consolidated price of energy of end users, but we don't have an answer to this fundamental question."
To address this research gap, the researchers first identified different energy feedstocks, such as crude oil, wind, solar and biomass, and their energy products. So, for example, crude oil's energy products include gasoline and diesel. Next, they categorized the energy end users as either residential, commercial, industrial or transportation. Further, they obtained information on which energy product, and how much of it, is consumed by each user from the United States Energy Information Administration. Last, they identified the supply chains that connected the energy products to consumers. All this information was used to calculate the average price of energy, called the energy price index, for a given month, and to forecast energy prices and demands for future months.

As a potential real-world use of this metric, the researchers explored two policy case studies. In the first scenario, they studied how the energy price index would change if a tax on crude oil was imposed. One of their main findings upon tracking the energy price index was that around $148 billion could be generated in four years for every $5-per-barrel increase in crude oil tax. Also, this tax would not significantly increase the monthly cost of energy for U.S. households. In the second case study, which explored the effect of subsidies on the production of electricity from renewable energy sources, they found that these policies can cause a dip in energy prices even with no tax credit.

Baratsas said their approach offers a way to optimize policies at the state, regional and national level for a smooth and efficient transition to clean energy. Further, he noted that their metric could adapt, or self-correct, its forecasting of energy demands and prices in the event of sudden, unforeseen situations, like the COVID-19 pandemic, that may trigger a drastic decrease in the demand for energy products.

"This metric can help guide lawmakers, government or non-government organizations and policymakers on whether, say, a particular tax policy or the impact of a technological advance is good or bad, and by how much," said Pistikopoulos. "We now have a quantitative and accurate predictive metric to navigate the evolving energy landscape, and that's the real value of the index."

This research is funded by the Texas A&M Energy Institute and Mays Business School.
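At its core, the index described above is a consumption-weighted average price across all energy products and end-use sectors. Here is a minimal sketch of that idea in Python; the function name, product names, prices and consumption figures are all invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of a consumption-weighted energy price index,
# in the spirit of the metric described above. All numbers invented.

def energy_price_index(prices, consumption):
    """Average price of energy paid by end users in a given month.

    prices      -- dollars per MMBtu for each energy product
    consumption -- MMBtu of each product consumed across all sectors
    """
    total_energy = sum(consumption[p] for p in prices)
    total_spend = sum(prices[p] * consumption[p] for p in prices)
    return total_spend / total_energy  # dollars per MMBtu, economy-wide

# Invented example: three products for one month.
prices = {"gasoline": 20.0, "natural_gas": 9.0, "renewable_electricity": 25.0}
consumption = {"gasoline": 1500.0, "natural_gas": 2500.0, "renewable_electricity": 500.0}

print(f"Energy price index: ${energy_price_index(prices, consumption):.2f}/MMBtu")
```

Under this toy weighting, a tax that raises the gasoline price propagates into the single index in proportion to how much gasoline end users actually consume, which is the sense in which one number can track the whole landscape.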
In 2020, the world watched as SpaceX launched a two-man crew in a commercially built and operated spacecraft to the International Space Station, then watched as the astronauts safely splashed down two months later. Long before the actual liftoff, however, researchers with NASA worked with staff from Texas A&M University to ensure the crew would land safely under emergency conditions.

Three years ago, at a Texas A&M facility in College Station, a team from NASA worked out the design of the life raft that would have protected the crew in the event of an emergency in the Gulf of Mexico, the site of the first water landing of NASA astronauts in 45 years.

Cody Kelly '10, an aerospace engineering graduate who currently serves as deputy for national affairs with the NASA Search and Rescue Mission Office, worked closely with the Offshore Technology Research Center (OTRC) staff to perform extensive and detailed testing. NASA used this testing to make an initial down-selection for the life raft design, which became the NASA baseline for all human-rated spacecraft across all existing human spaceflight programs.

"The raft ops were the first look at rescue beacon integration into our spacesuits early in the design cycle," Kelly said.

Located in Texas A&M's Research Park, the OTRC is a graduated National Science Foundation (NSF) Engineering Research Center supporting the offshore oil and gas industry. It conducts basic engineering research and develops systems for the economic and reliable recovery of hydrocarbons at ocean depths of 3,000 feet or more. It is jointly operated by Texas A&M, the Texas A&M Engineering Experiment Station and The University of Texas at Austin's Cockrell School of Engineering.

"Additionally, we were able to test rescue beacon designs prior to manufacturing," Kelly said. "Early testing provided the opportunity for efficient data sharing between the Orion program and our commercial crew counterparts."

Dr. Richard Mercier, professor in the Zachry Department of Civil and Environmental Engineering at Texas A&M, has been director of the OTRC since 2001 and manages all wave basin projects and the research program.
"OTRC has conducted numerous projects for NASA and associated contractors (Boeing, United Space Alliance, Jacobs Engineering) since the facility opened in 1991," he said. "Cody's project was the fourth in a series that was executed between 2004 and 2017 having to do with testing of life preserver units, life rafts and miscellaneous vehicle egress equipment."

The OTRC wave basin is capable of large-scale simulations of the effects of wind, waves and currents on fixed, floating and moored floating structures. The wave basin is 150 feet long and 100 feet wide, with a depth of 19 feet. The pit, located in the center of the basin, has a depth of 55 feet. With 48 individually controlled paddles, the wavemaker can generate various wave conditions, including unidirectional and multidirectional regular and irregular (random) waves. Sixteen dynamically controlled fans can generate prescribed gusty wind conditions from any direction, and a modular current generation system consisting of banks of submerged jets can generate sheared current profiles.

NASA provided all equipment and test protocols for these projects, but the facility offered capabilities of producing wind and waves for prescribed sea states, as well as safety divers to assist NASA personnel in the water.

"Our staff and student workers are always eager to support and participate in these projects," Mercier said.

NASA shared videos, photos and facility test data with their partners at SpaceX and Boeing to help engineers certify and fly the design following testing at OTRC. Astronauts Robert Behnken and Douglas Hurley were on the SpaceX Crew Dragon spacecraft that took off into orbit from the Kennedy Space Center in Florida, with their landing fully protected by the life raft design that was decided upon in 2017 at the OTRC.
The Offshore Technology Research Center provides technology, expertise and services needed for the development of drilling, production and transportation systems that enable the safe and economically viable exploitation of hydrocarbon resources in deep and ultra-deep water. It has a deepwater model basin, the only one of its kind in the U.S.
Erica Schabert, a private school teacher, is committed to helping her middle school students gain a deeper understanding of math and science. However, like many educators and campuses around the country, the Pennsylvania teacher has limited financial resources to spend on professional development due to the economic downturn caused by the COVID-19 pandemic.

Fortunately, the Hope Lutheran Christian School teacher learned about the STEM 4 Innovation conference hosted by Texas A&M University's College of Engineering. Thanks to a generous contribution from Chevron, this event, which was held virtually Jan. 11-12, was free for all participants and will continue to be available on-demand through July 13 to any PK-12 teacher.

A total of 576 teachers registered for the conference, a 218% increase from the 2020 face-to-face event held in College Station. By offering this conference virtually, the College of Engineering was able to reach 416 first-time attendees, including educators from 44 U.S. states and 13 countries.

Schabert, who was one of those first-time attendees, came away impressed. "It was the best professional development I've seen in a long time," she said. "Everybody was willing to share, and the conference had such an open and warm environment."

The STEM 4 Innovation conference has a long and rich history that goes back to 2008. Previously offered as a face-to-face conference in College Station, the event brings teachers together to learn about innovative STEM (science, technology, engineering and math) strategies and tools that they can immediately utilize in the classroom. In addition, conference coordinators take advantage of Texas A&M's designation as a tier-one research institution to create a unique attendee experience that includes interactions with world-class researchers.

The emergence of COVID-19 forced conference coordinators to rethink the event, leading to the decision to go virtual. Still, the interactive conference continued to offer expert presentations, resources, chat rooms and exhibits. "The virtual experience was great," said Christina Campos, a first-time attendee who teaches at West Oso Junior High in Corpus Christi. "It felt like we were in person, except without any walking."

The 2021 conference continued to tap expertise from across Texas A&M's campus. Workshops, presentations and exhibits were offered by faculty from the College of Engineering, the College of Science, College of Education and Human Development, College of Geosciences, College of Agriculture and Life Sciences, College of Veterinary Medicine and Biomedical Sciences, Texas A&M Forest Service and Texas A&M University at Galveston.

Four interactive workshops were designed to give teachers a better understanding of the engineering process. For example, one session, taught by experts from the College of Agriculture and Life Sciences, asked participants to brainstorm a new food product, pitch it to consumers and investigate bringing the product to market. Another session, organized by the College of Education and Human Development, encouraged teams of teachers to design a functional and aesthetically pleasing office that was economical while also meeting social distancing requirements.

Breakout sessions also were offered. These sessions, which addressed a variety of topics focused on either elementary or middle/high school level needs, included "Jobs of the Future: Exploring STEM Careers in Natural Resources with Project Learning Tree's Green Jobs Guide," "Bringing the Great Outdoors Indoors Through Virtual Field Trips," "STEM Challenges Using Sphero Robotics" and "Critical Precollege STEM Knowledge."

The virtual conference created an environment where teachers, who may feel especially isolated and stressed during the pandemic, could connect with each other and find help. "This conference has opened my eyes beyond anything I can even imagine," said Campos. "I am the STEM coordinator, and I was struggling to get resources, but this conference enabled me to sit in the comfort of my home and meet some amazing people."

Conference coordinators believe that 2021's virtual experience offers lessons that will be incorporated into future STEM 4 Innovation conferences. "This has helped people realize that you can have some authentic learning experiences virtually," said John Peterson, conference coordinator and associate director of the College of Engineering's Spark! PK-12 Engineering Education Outreach. "Many people still prefer face-to-face, but maybe we can plan some hybrid version of the conference for the people from California, Pennsylvania or Australia who can't travel to Texas A&M for two days."

Participants left the conference feeling very motivated to bring what they learned into the classroom and to continue their relationship with Texas A&M. "I fell in love with everything I saw: wonderful research possibilities, great projects, equipment, friendly staff and outreach programs," said Georgina Grillo, a secondary teacher at Golden Valley School in Heredia, Costa Rica, who already plans to attend the 2022 conference. "The conference made me wish that I could go back to college at Texas A&M's College of Engineering."
Researchers from the Texas A&M University Department of Materials Science and Engineering and Los Alamos National Laboratory's (LANL) Materials Science at Radiation and Dynamics group are improving the metals used to construct nuclear technology. This collaboration is made possible by the Texas A&M University System National Laboratories Office and LANL.

Dr. Michael Demkowicz and Dr. Kelvin Xie from Texas A&M and Dr. Yongqiang Wang from LANL are investigating hydrogen retention in metals that are exposed to nuclear processes, with the intent of improving how these materials perform over time.

Hydrogen retention in metals is a serious concern in nuclear technology. Nuclear reactions and transmutations by reactor neutrons cause changes to chemical elements and isotopes, thus introducing additional hydrogen into materials. Often, hydrogen accumulates to levels that exceed the solubility limit a material can absorb. The excess hydrogen in the material causes brittleness and weakness, limiting its functionality over time.

Another issue with excess hydrogen is that it accumulates at trapping sites, such as in spaces in the material and at grain boundaries. Most metal alloys are actually made up of lots of microscopic crystals packed closely together. These little crystals are called grains, and the planes where adjacent grains are fused together are called grain boundaries. In the reactor environment, the defects created by the irradiation process, such as vacancies and dislocations, also become new traps for hydrogen. When many hydrogen atoms are trapped together, they form gas bubbles in the metal, similar to carbon dioxide bubbles in soda. These bubbles can facilitate the formation of large voids that can cause severe damage to the material.

One way to manage the hydrogen accumulation and subsequent damage is to remove as much of it as possible from the material. Unfortunately, most materials do not contain a good pathway for hydrogen to travel through. In this project, the researchers will create composites where pathways for hydrogen motion are built into the structure of the material.

"The novel materials to be developed and tested as part of this project are expected to exhibit 'self-healing' behavior, whereby damage would be removed from the material even as it is created," Demkowicz said. "This would be an outcome of their unique internal microstructures."

In order to test the materials for hydrogen retention, diffusion and outgassing, the researchers will use the unique collection of ion implantation and ion beam analysis tools available at LANL's Ion Beam Materials Laboratory. These tools will allow the researchers to introduce and track the amount of hydrogen in the materials with designer microstructures, thus providing data on the amount of hydrogen that is outgassed.

The ideal outcome for this project would be to identify composite materials that allow enough hydrogen outgassing to keep the retained hydrogen levels at or below their threshold. Such materials would be able to withstand nuclear reactions better than materials currently in use.

"In addition to advancing basic science, this project will explore technical issues that are crucial to the development of future fusion reactors," Demkowicz said.

This article originally appeared on the Texas A&M University System National Laboratories Office website.
The Texas A&M University System National Laboratories Office (NLO) was formed by the chancellor to be a conduit for expanding engagement with the national laboratories for faculty, staff and students of the A&M System. This office engages with all Department of Energy and National Nuclear Security Administration laboratories and sites. The NLO has developed a multi-element program to help Texas A&M System researchers develop collaborative ties with researchers at Los Alamos National Laboratory (LANL), execute the Texas A&M System and LANL collaborative research projects, and formalize long-term relationships where appropriate, such as through joint appointments.
The Engineering Genesis Award for Multidisciplinary Research was presented to 26 Texas A&M Engineering Experiment Station (TEES) researchers and their teams during a virtual award ceremony on Dec. 15. The award, which is presented to TEES researchers who have secured significant research grants of $1 million or more, was given to the following:

PI: Roozbeh Jafari, biomedical engineering; $12.3 million grant from the Department of Defense-Defense Threat Reduction Agency for "COVID-19: RATE-COVID: Rapid Analysis of Threat Exposure Operationalization."
PI: Ranjana Mehta, industrial and systems engineering; co-PI: Saurabh Biswas, biomedical engineering; $5 million grant from the National Science Foundation for "B2: Learning Environments with Augmentation and Robotics for Next-Gen Emergency Responders."
PI: Ankit Srivastava, materials science and engineering; co-PIs: Patrick Shamberger, Ibrahim Karaman, Svetlana Sukhishvili, Raymundo Arroyave and Yu Xie, materials science and engineering; Alaa Mohamed Elwany, industrial and systems engineering; Mohammad Naraghi, aerospace engineering; $4.7 million grant from the Department of Defense-Army Research Laboratory for "Materials and Manufacturing Processes for the Army of the Future."
PI: Le Xie, electrical and computer engineering; co-PIs: Prasad Enjeti and P.R. Kumar, electrical and computer engineering; $4 million grant from the Department of Energy for "Secure Monitoring and Control of Solar Power Distribution System Through Dynamic Watermarking."
PI: Pavel Tsvetkov, nuclear engineering; co-PIs: Sean McDeavitt and Mark Kimber, nuclear engineering; $3.6 million grant from Natura Resources LLC for "Research and Development Support for Molten Salt Research Reactor Licensure."
PI: Roozbeh Jafari, biomedical engineering; co-PIs: Jack Mortazavi, computer science and engineering; Melissa Grunlan, biomedical engineering; Thomas Ferris, industrial and systems engineering; $3.5 million grant from the National Institutes of Health for "An Unobtrusive, Continuous, Cuff-less Blood Pressure Monitor for Nocturnal Hypertension."
PI: Zachary Grasley, civil and environmental engineering; co-PIs: Jeffrey Bullard, Dallas Little, Junuthula Reddy and Thomas Lacy, civil and environmental engineering; Arthur Schwab, soil and crop sciences; $2.8 million grant from the Department of Defense for "Concrete and Composites Experiments and Modeling for Army Applications."
PI: Zheng O'Neill, mechanical engineering; $2.7 million grant from the Department of Defense-Washington for "Securing Grid-Interactive Efficient Buildings Through Cyber Defense and Resilient System."
PI: Swaminathan Gopalswamy, mechanical engineering; co-PIs: Swaroop Darbha and Sivakumar Rathinam, mechanical engineering; Dylan Shell and Zhangyang Wang, computer science and engineering; John Valasek, aerospace engineering; Gholamreza Langari, George H.W. Bush Combat Development Complex; $2.5 million grant from the Department of Defense-Research Laboratory for "ARL: Air Ground Coordination."
PI: James Wall, Texas A&M Center for Applied Technology (TCAT); co-PIs: Keith Biggers, TCAT; John Walewski, civil and environmental engineering; $1.7 million grant from the Department of Energy for "Facility Data and Technology Integration."
PI: Patrick Shamberger, materials science and engineering; co-PIs: Emily Pentzer and Svetlana Sukhishvili, materials science and engineering; Choongho Yu and Jonathan Felts, mechanical engineering; Charles Culp, College of Architecture; $1.5 million grant from the Department of Energy-Washington for "Salt Hydrate Eutectic Thermal Energy."
PI: Mustafa Akbulut, chemical engineering; co-PI: Joseph Kwon, chemical engineering; $1.5 million grant from the Department of Energy-Office of Fossil Energy for "Dynamic Binary Complexes as Super-Adjustable Viscosity Modifiers for Hydraulic Fracturing Fluids."
PI: Samuel Noynaert, petroleum engineering; co-PI: Fred Dupriest, petroleum engineering; $1.5 million grant from the Department of Energy for "Changing the Way Geothermal Wells Are Drilled: Physics-Based Drilling Parameter Selection Workflow Implementation and Training in Order to Reduce Non-Productive Time and Increase ROP."
PI: Stephen Cambone, CRO; co-PIs: Jeyavijayan Rajendran, electrical and computer engineering; Rainer Fink, Ana Goulart, Byul Hur and Wei Zhan, engineering technology and industrial distribution; Gholamreza Langari, CYBR; $1.5 million grant from the Department of Defense-Air Force-Research Laboratory for "Hardware Integrity Verification Utilizing Scanning Electron Microscopy."
PI: Arum Han, electrical and computer engineering; $1.4 million grant from the National Institutes of Health for "Developing Extracellular Vesicle-Based Therapeutics Against Pre-Term Birth Through the Use of Maternal-Fetal Interface on a Chip."
PI: Jim Morel, nuclear engineering; co-PIs: Raymundo Arroyave, materials science and engineering; Amine Benzerga, aerospace engineering; Jean-Luc Guermond and Bojan Popov, mathematics; $1.4 million grant from the Department of Energy-National Nuclear Security Administration for "Collaborative Research and Development Supporting LLNL Missions."
PI: Jim Morel, nuclear engineering; co-PIs: Marvin Adams, Jean Ragusa and Mauricio Tano Retamales, nuclear engineering; Jean-Luc Guermond, mathematics; $1.4 million grant from the Department of Energy-National Nuclear Security Administration for "Collaborative Research and Development Supporting Stockpile Stewardship."
PI: Mary McDougall, biomedical engineering; co-PIs: Jim Ji, QEMG; Steven Wright, electrical and computer engineering; Peter Nghiem, veterinary integrative biosciences; $1.3 million grant from the National Institutes of Health for "Multi-Coil Multi-Nuclear Add-On System for Clinical Field Strength NMR-Based Biomarker Detection for Duchenne Muscular Dystrophy."
PI: Christopher Limbach, aerospace engineering; co-PIs: Rodney Bowersox and Richard Miles, aerospace engineering; $1.2 million grant from the Department of Defense-Air Force-Office of Scientific Research for "Canonical Validation Experiments for Hypersonic Aerodynamics."
PI: Yu Ding, industrial and systems engineering; co-PIs: Jiang Hu and P.R. Kumar, electrical and computer engineering; Sarbajit Banerjee, chemical engineering; $1.2 million grant from the National Science Foundation for "CPS: Medium: Real-Time Learning and Control of Stochastic Nanostructure Growth Process Through In Situ Dynamic Imaging."
PI: Mahmoud El-Halwagi, chemical engineering; co-PIs: Joseph Kwon, chemical engineering; Lucy Mar Camacho, environmental engineering; $1.2 million grant from the Department of Energy for "Deploying Intensified, Automated, Mobile, Operable and Novel Designs (DIAMOND) for Treating Shale Gas Wastewater."
PI: Amy Martin, civil and environmental engineering; $1.1 million grant from the Texas Department of Transportation for "Balanced Mix Design System for Superpave Hot-Mix Asphalt Mixtures with RAP."
PI: Jack Mortazavi, computer science and engineering; co-PI: Ricardo Gutierrez-Osuna, computer science and engineering; $1.1 million grant from the National Science Foundation for "SCH: INT: Personalized Models of Nutrition Intake from Continuous Glucose Monitors."
PI: Danny Davis, public service and administration; co-PIs: Stephen Cambone, CRO; William Norris, international affairs; $1.1 million grant from the Department of Defense-Office of Net Assessment for "Assessing Warfare in the Digital Age."
PI: Mladen Kezunovic, electrical and computer engineering; $1 million grant from the Department of Energy-Washington for "Big Data Synchrophasor Monitoring and Analytics for Resiliency Tracking."
The process of fabricating materials is complicated, time-consuming and costly. Too much of one material, or too little, can create problems with the product, forcing the design process to begin again. Advancements in the design process are needed to reduce the cost and time it takes to produce materials with targeted properties.

Funded by the National Science Foundation (NSF), researchers at Texas A&M University are using advanced computational and machine-learning techniques to create a framework capable of optimizing the process of developing materials, cutting time and costs.

"Our general focus is working on materials design by considering process-structure-property relationships to produce materials with targeted properties," said Dr. Douglas Allaire, associate professor in the J. Mike Walker '66 Department of Mechanical Engineering. "In our work, we demonstrate a microstructure-sensitive design of alloys with a Bayesian optimization framework capable of exploiting multiple information sources."

Bayesian optimization-based frameworks use prior knowledge as models to predict outcomes. In the past, researchers have used this framework in correlation with a single information source (simulation or experiment). If that method failed, the process started again, with the hope of making the right adjustments based on this model.

The researchers have rejected this notion and instead believe that many information sources can be pulled together using a Bayesian framework to develop a more complete picture of underlying processes. They have combined multiple information sources to create materials with targeted properties more efficiently by looking at data in its entirety, rather than its parts.

"What we think is very different is that you can have many different potential models or information sources," said Dr. Raymundo Arróyave, a professor in the Department of Materials Science and Engineering. "There are many ways to understand/model the behavior of materials, either through experiments or simulations. Our idea is to combine all of these different models into a single 'fused' model that combines the strengths of all the other models while reducing their individual weaknesses."

Their research, titled "Efficiently exploiting process-structure-property relationships in material design by multi-information source fusion," was recently published in Vol. 26 of the Acta Materialia journal.

"These model chains have historically not considered the breadth of available information sources," said Allaire. "They consider single models along the chain from process through structure to property. As a result, they are not as efficient or accurate as they could be."

The researchers are currently testing this framework by developing dual-phase steels, typically used in automobile frames. Dual-phase steels are made of two phases with very different and complementary properties.

"There are two phases; the martensite phase makes this particular steel very strong," said Arróyave. "The ferritic phase is softer and makes the steel more compliant and amenable to deformation. With only martensitic microstructures, these materials are strong, but they break easily. However, if you combine the strength of martensite with the ductility of ferrite, you can make steels that are very strong, can absorb energy during impact and can be fabricated into complex shapes, such as car frames, using the method developed in this work."

The goal is to develop a framework that more precisely and effectively predicts the needed composition and processing (recipe) for a specific design. In turn, this decreases the number of simulations and experiments required, drastically reducing costs.

"The knowledge that we gain about the material design process as a whole using our framework is much greater than the sum of all information extracted from individual models or experimental techniques," said Dr. Ankit Srivastava, assistant professor in the materials science and engineering department. "The framework allows researchers to efficiently learn as they go, as it not only collects and fuses information from multiple models/experiments, but it also tells them which information source, i.e., a particular model or experiment, provides them the best value for their money or time, which really enhances the decision-making process."

In the future, they hope their framework will be widely used when attempting tasks that involve integrated computational materials design.

"Our hope is that by presenting these model fusion-based Bayesian optimization capabilities, we will make the search process for new materials more efficient and accurate," said Allaire. "We want any researcher to use the models that they have available to them without worrying as much about how to integrate the models into their own modeling chain, because our Bayesian optimization framework handles that integration for them."
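One ingredient of multi-information-source methods, fusing several imperfect estimates into one stronger estimate, can be illustrated with classic inverse-variance weighting. The sketch below is a simplified stand-in, not the authors' framework: the two sources, their error variances and the property values are all assumed for illustration, and a real implementation would fuse Gaussian-process surrogates inside a Bayesian optimization loop.

```python
import numpy as np

def fuse(means, variances):
    """Combine independent estimates into one 'fused' estimate.

    Each source is weighted by the inverse of its variance, so more
    trustworthy sources count for more; the fused variance ends up
    smaller than that of any single source.
    """
    w = 1.0 / np.asarray(variances)
    fused_mean = np.sum(w * np.asarray(means)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused_mean, fused_var

# Invented numbers: strength (MPa) predicted for one candidate alloy
# by a cheap-but-noisy simulation and a costly-but-accurate experiment.
cheap_sim = (780.0, 40.0**2)
experiment = (815.0, 15.0**2)

mean, var = fuse([cheap_sim[0], experiment[0]], [cheap_sim[1], experiment[1]])
print(f"Fused estimate: {mean:.1f} MPa (std {var**0.5:.1f} MPa)")
```

The fused estimate leans toward the experiment but still extracts value from the simulation, which is the intuition behind combining "the strengths of all the other models while reducing their individual weaknesses."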
Sickle cell disease (SCD) is the most prevalent inherited blood disorder in the world, affecting between 70,000 and 100,000 Americans. However, it is considered an orphan disease, meaning it impacts fewer than 200,000 people nationally and is therefore underrepresented in therapeutic research. A team led by Dr. Abhishek Jain from the Department of Biomedical Engineering at Texas A&M University is working to address this disease.

"I'm trying to create these new types of disease models that can impact health care, with the long-term goal of emphasizing applying these tools and technologies to lower health care costs," said Jain, assistant professor. "We strategically wanted to pick up those disease systems which fall under the radar in the orphan disease category."

Jain's research is in organ-on-a-chip, where cells from humans can be grown on USB-sized devices to mimic the way the organ would work inside the body. This sort of system is ideal for testing new drug treatments, as drugs cannot be tested on humans, and animal models have not been shown to be a good representation of how a patient and disease would interact with a treatment. For SCD patients, the organ-on-a-chip would also be beneficial because patients can present with mild to severe cases.

Jain works with Tanmay Mathur, a fourth-year doctoral student who trained as a chemical engineer in his undergraduate years. His research focused on microfabrication techniques and simulations, skills he said merge well into the organ-on-a-chip research he now performs in Jain's lab. The team collaborates closely with the Texas Medical Center in Houston.
Their work, recently published in Bioengineering & Translational Medicine, builds off a 2019 publication in the journal Lab on a Chip, where the team demonstrated that endothelial cells (cells that line the blood vessels) could be used to model the disease physiology of a patient without having to stimulate the model to perform differently than a healthy vessel.

"Traditionally, these cells were not used for disease modeling, so in that way our approach is very novel," Mathur said. "We are one of the first to harness these cells and employ them in disease modeling research."

Mathur and Jain demonstrated that these models can be used to differentiate between patients. The first step: build a blood vessel that mimics a patient's vessel. For that, the team would need two components: patient blood and endothelial cells. Collecting the blood involved a simple blood draw. They faced a challenge with the endothelial cells, however. They would need to take a biopsy of the cells or use stem cells to grow their own, neither of which was ideal. Then they found the answer was in the blood.

"What we learned is within blood samples are some endothelial cells also circulating," Jain said. "We call them blood outgrowth endothelial cells, and we can harness them very easily. That's what is new about this work. You can get those cells and grow them so that there's enough in number, and then you can make blood vessels."

Now that they could build the vessels, the next step was to see if these models would show how the disease has various biological impacts in different patients. Again, the goal was to be able to test treatments on these models, so the closer they mimicked their human patient, the better.
"We're able to differentiate a very severe sickle cell patient, in terms of their phenotype, from very mild patients," Mathur said. "Moving forward, we can take a larger population of any sickle cell disease patients and assess them using our organ-chip technology, and then categorize them into different groups based on symptoms."

Their findings indicate that these organs-on-a-chip could lead to patient-centric, personalized treatment, improving how clinicians approach this and other cardiovascular diseases.

"When you take it to the field, now it can become a predictive device," Jain said. "Now you do not have to know whether the patient is mild or severe; you can test for that. You can predict if a patient is serious and can dictate their therapeutic needs."

The next step is to continue to expand the patient cohort to collect more results. A long-term goal would be to use the patient information collected to develop a database to better predict disease progression.

"You take a history of a lot of these patients and their cardiovascular health with this device, and you can predict which patient might have a better chance of having a stroke, and you start treating them early on," Jain said.

Mathur said even with future challenges, he looks forward to continuing their research. "I think even though it may take 10-15 years, we will at least push forward some of the research that we're doing and get it out in the clinical field," he said. "We are one of the only groups in the world that have started this field of personalized treatment. I feel that our impact is pretty high, and I'm sure we will be able to expand the same treatment to other cardiovascular diseases and attract more attention and deeper insights into the biology that we are looking at."

This work is funded by a Trailblazer Award Jain received from the National Institute of Biomedical Imaging and Bioengineering.
During the start and onrush of the COVID-19 pandemic, all hospitals in the state of Texas restricted visitation for intensive care unit (ICU) patients and their families because of the infectious nature of the virus. Family members were unable to visit or even communicate with their loved ones, leaving many individuals with feelings of anxiety, confusion and fear of the unknown.

A group of doctors from Houston Methodist and researchers from Texas A&M University identified a novel approach of adapting a virtual ICU, or vICU, to make family visitations possible. The vICU involves physiological sensors that are monitored 24/7 in the operations center of the hospital. The system also includes two-way audio/video communication technology. In the past, this technology had been reserved for virtual visits by doctors and consultants, but it was instrumental in connecting ICU patients with their family members, especially when dealing with the highly infectious coronavirus.

A new process was designed and rapidly implemented to allow family members to book a visit with their loved ones virtually. After speaking to a virtual registered nurse (vRN), family members receive a link in a text message, which connects them to the audio/video equipment installed in the patient's room. This improvised solution was important since bringing outside technology, such as phones or tablets, into the ICU was no longer possible; it posed sanitation threats and subjected medical staff to much higher risks of infection. Often, patients weren't able to use everyday technology themselves due to sedation and intubation restrictions, and would require assistance from medical staff to use the equipment.
Using vICU technology not only provided more accessible technology for the patient, but also posed less of a risk of infection for health care workers, even though some patient assistance could be needed depending on the severity of the patient's condition.

"We needed to think outside the box and use something that wasn't intended for this purpose at all," said Dr. Farzan Sasangohar, assistant professor of industrial and systems engineering at Texas A&M. "This is an example of effective improvisation and adaptation, which are key characteristics of resilient health systems. We know from the literature that family engagement is extremely important; it has positive impacts on patient outcomes."

To evaluate this improvised solution, the research team interviewed 230 family members after they used this technology to communicate with their loved ones, and overall the responses were very positive, albeit emotional. A majority of users reported feelings of joy and relief at being able to see their family members. Still, there were reports of sadness, with family members seeing their loved one in a difficult situation, particularly when the patient was intubated or unable to speak. There were also some responses recorded from family members who lost their loved ones while in the ICU; they reported that they were grateful to have had this option to see them and get closure. Some users suggested making this available on-demand to enable initiating calls on their own.

The adoption of a vICU for family visitation shows promise during an unusual and trying time for many. However, several areas identified for modification and improvement need to be addressed.

"I believe our family-centered approach provides an opportunity to take us closer to a real open-ICU concept, in which family members are engaged more efficiently and effectively," Sasangohar said. "This will have a significant positive impact on patient outcomes."
The Texas A&M Engineering Experiment Station's health care market segment advances research in the key areas of medicine, health care-related technology and life sciences using a multi-disciplinary approach. Our strengths include bioinformatics, computational biology and systems biology for agricultural, environmental and life sciences; next-generation medical devices and systems; and education, training and outreach programs for pharmaceutical workforce development.
Dr. Matthew Yarnold, assistant professor in the Zachry Department of Civil and Environmental Engineering at Texas A&M University, has been chosen to deliver the 2021 Robert J. Dexter Memorial Award Lecture. The award was given by the Steel Bridge Task Force oversight council of the American Iron and Steel Institute, the National Steel Bridge Alliance and the American Association of State Highway and Transportation Officials T-14 Technical Committee for Structural Steel Design.

Yarnold will present a lecture on his past and current research findings at the next meeting of the Steel Bridge Task Force on Aug. 12, 2021, in Philadelphia, Pennsylvania.

The Robert J. Dexter Memorial Award Lecture program provides an opportunity for individuals early in their structural engineering careers to present a lecture on their steel bridge research activities to the Steel Bridge Task Force and to participate in its semiannual three-day meeting. Recipients become invited guests of the Steel Bridge Task Force, which is comprised of leading steel bridge experts. The program was instituted in 2005 in memory of Robert J. Dexter, an associate professor of civil engineering at the University of Minnesota who was an internationally recognized expert on steel fracture and fatigue problems in bridges.

Yarnold's research includes structural steel behavior, bridge engineering, the experimental assessment of structural systems, novel techniques for structural health monitoring and engineering education. He has extensive experience with the experimental testing of structural systems and has led research projects for the National Science Foundation, state departments of transportation and private engineering firms. He is an active member of several national committees through the American Society of Civil Engineers and the Transportation Research Board.

Yarnold has more than 17 years of structural engineering research and design experience. He began his career at Lehigh University, where he received his bachelor's and master's degrees. Following graduation, he accepted a position with the engineering firm Ammann & Whitney, where he contributed to more than 15 bridge design and rehabilitation projects while also obtaining his professional engineering license. After a successful career as a consultant, he returned to academia and completed his doctoral degree at Drexel University. He joined the civil and environmental engineering department at Texas A&M in 2017. Yarnold is also an affiliated faculty member and researcher at the Structural and Materials Testing Lab at the Center for Infrastructure Renewal.
The Center for Infrastructure Renewal (CIR), a joint center between the Texas A&M Engineering Experiment Station and the Texas A&M Transportation Institute, is a leading global source for the development of transformative infrastructure solutions. Through cross-industry and government agency collaboration, the CIR facilitates the creation of state-of-the-art methods, technologies and solutions that society needs for infrastructure renewal. The CIR houses researchers who are developing advanced and sustainable materials and structural systems that will reduce cost and extend infrastructure life, safety, resiliency and durability.
A new Texas A&M University-developed technique that allows for the creation of building materials using local soils could prove key not only to the success of future space missions to the moon and Mars, but also to establishing a solid and safe foothold on both, a futuristic concept that came one step closer to reality with last week's successful landing of NASA's Perseverance rover on the surface of Mars.

Thanks to a 2021 NASA Innovative Advanced Concepts (NIAC) program grant awarded to a team led by Texas A&M researcher and NASA NIAC Fellow Dr. Sarbajit Banerjee, an innovation that began with boggy, water-logged soils from Canada to Texas may soon be applied to the rocky, razor-sharp regolith that dominates the lunar and Martian landscapes, in order to help solve a three-part problem Banerjee says has plagued the space agency since the Apollo missions: excessive dust, unnecessary damage and untold danger.

"Five of the six Apollo landings had issues with dust blocking the astronauts' view of the surface, forcing them to guess at the final landing location and sometimes landing on slopes dangerously close to the maximum tolerance, one time precariously close to a major crater," said Banerjee, a professor in the Department of Chemistry with an affiliated appointment in the Department of Materials Science and Engineering. "If NASA and its commercial partners are to mount a sustained presence on the moon and Mars and land on outer solar system bodies, we need to find a way to tame surface materials for landing and mobility."

Banerjee, in collaboration with colleagues in the Texas A&M College of Engineering and College of Architecture, proposes to do just that in his team's NIAC effort, Regolith Adaptive Modification System (RAMS) to Support Early Extraterrestrial Planetary Landings and Operations. Their proposal is one of 16 selected by NASA for Phase I awards out of nearly 300 submitted for consideration. Each award provides nine months of seed funding that allows researchers to further develop their ideas in order to compete for up to $500,000 more in Phase II funding that will help them further advance and refine their technology over the course of two additional years.

For more than two decades, the NIAC program has nurtured visionary ideas capable of transforming future NASA missions with the creation of breakthroughs, radically better or entirely new aerospace concepts, while engaging America's innovators and entrepreneurs as partners in the journey. The program seeks innovations from diverse and non-traditional sources, selecting projects that study innovative, technically credible advanced concepts that could one day "change the possible" in aerospace.

In much the same way they previously created an economical, environmentally friendly alternative to concrete using clay-based soil from a backyard in Texas, or a geopolymerized wood fiber prototype suitable for all-weather roads using mucky Canadian muskeg soil, Banerjee and his team are confident their NIAC team can create landing pads and other prepared surfaces on Mars out of regolith, to address what he sees as one of the most critical surface-related developments since the Apollo program.

The Texas A&M team's proposed RAMS approach relies on sequentially delivered microcapsules chemically tuned to react with the components of regolith through a series of exothermic reactions to create geopolymerized subsurface slabs. By employing a process similar to the one that helped them perfect the development of sustainable building materials, which quickly gain strength after being 3D printed, the team will use a sequence of chemical reactions to coat all surfaces and make high-strength vanadium steel skins and anchors, using a de facto nano steel mill powered by locally harvested minerals and highly exothermic reactions. As an added bonus, Banerjee notes, the nanothermite and encapsulating systems necessary to run it are both lightweight and safe to fly.

The team is a subset of the Texas A&M Lunar Surface Experiments Program, comprised of both faculty and student researchers, whose purpose is to design and build fundamental science experiments and technology demonstrations to be delivered to the lunar surface as payloads aboard commercial lunar landers, thereby making the moon a new laboratory for Texas A&M.

View a complete list of NIAC awards for 2021 and previous years. This article originally appeared on the Texas A&M College of Science website.
A Texas A&M University researcher's collaborative study, titled "An Integrated Approach for Managing Microgrids with Uncertain Renewable Sources, Demand Response and Energy Markets," has been chosen for a 2020-21 Los Alamos National Laboratory (LANL) Collaborative Program Award in the category of research projects.

Dr. Natarajan Gautam, professor in the Wm Michael Barnes '64 Department of Industrial and Systems Engineering, has partnered with LANL staff scientist Dr. Harsha Nagarajan for a four-year research project focusing on energy needs, demand and future power supply for small communities and microgrids using solar and wind power.

This four-year research project is part of a collaboration between the Texas A&M University System and LANL. It is designed for A&M System researchers to collaborate with LANL researchers on an identified topic suitable for joint-effort funding from the National Laboratories Office and LANL. One of the main goals of this program is to increase the depth and number of research collaborations between the two institutions and the individuals involved.

While figuring out an efficient way to bring renewable power to a small community, a question that needs to be answered is whether that power gets stored in individual batteries for future household use, or whether it gets sold back into the grid to be used by individuals in real time, all while keeping the power available and at affordable rates.

"We are trying to see what challenges we may face so that while we develop this kind of distributed technology, we will have it in place so that everything runs smoothly," Gautam said. "You want to have quality service; you never want the lights to be off. But you also want to reduce the cost as much as possible and have as little inconvenience as possible."

Having highly distributed energy means that people will create energy locally as an alternative to large-scale power grids, thereby using renewable energy efficiently. Gautam says the research will help mitigate spiking and plummeting currents and ultimately make electricity cheaper for consumers if they only tap into the grid when power is needed.

Gautam and Nagarajan will evaluate two possibilities for homes to manage power supply-demand imbalance: each residence having its own battery for power storage, which may include a device resembling Tesla's Powerwall, or residences having one main line connected to the grid.

Large-scale power operations have the ability to diversify their portfolio when needed; it is more difficult to supply power to a community of people because of the size of a microgrid. On the other hand, if large-scale grids experience an interruption, many people are affected.

In addition, weather poses its own set of uncertainties, particularly with renewable energy, which can be exacerbated when working on a community level. Fossil fuels provide more leeway in deciding how much power is needed on the grid at any given time, and production can be cut or increased from there based on demand. When working with renewables, the weather on any given day is the gatekeeper for how much power, if any, can be produced. If it's a bright, sunny day and there is a considerable amount of solar power being produced, the next question is what to do with all of that extra power. However, on a cloudy day, there will not be much production. Gautam says that with the number of statistical methods included in this research, there may be a way to automatically predict this in advance, ultimately leading to a more efficient grid system.

"You don't want each individual to have to make this decision every day. We want to have something in a software system that will do what is best for users," Gautam said. "This is very data driven, and we have a lot of historical data we can pull from. You want to make an informed decision right now, but also have a good idea of how the future is going to pan out."

This research includes a lot of optimization modeling and uncertainty in the system. "Especially when you talk about wind and solar, it is very difficult to predict tomorrow or the day after, let alone several months down the road," Gautam said. "That is part of the reason why we haven't gone ahead with much of these renewable energy efforts: because of the amount of uncertainty that's there."

The foundation for this research leans heavily into the operations research and systems engineering areas of industrial and systems engineering because of the number of analytical methods needed for forecasting. Creating an integrated system that can handle the fluctuation of providing renewable power to small communities of people is the ultimate goal.
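The store-versus-sell decision Gautam describes can be boiled down, in its simplest form, to an economic comparison made each time step. The sketch below is a deliberately simplified illustration, not the project's method: the battery efficiency, the prices and the greedy one-step rule are all assumed for the example.

```python
# Hedged sketch of a store-vs-sell rule for surplus solar generation.
# All constants and prices are invented; a real system would optimize
# over price forecasts and their uncertainty, not a single point estimate.

BATTERY_EFFICIENCY = 0.90  # assumed fraction of stored energy recoverable

def store_or_sell(surplus_kwh, sell_price_now, expected_price_later):
    """Decide what to do with one hour's surplus generation.

    Store if the discounted value of using the energy later beats
    selling it to the grid right now; otherwise sell immediately.
    """
    value_if_stored = surplus_kwh * BATTERY_EFFICIENCY * expected_price_later
    value_if_sold = surplus_kwh * sell_price_now
    return "store" if value_if_stored > value_if_sold else "sell"

# Invented example: 3 kWh of midday surplus, evening prices expected to spike.
print(store_or_sell(3.0, sell_price_now=0.08, expected_price_later=0.15))  # store
```

Even this toy rule shows why forecasting matters: the decision hinges entirely on `expected_price_later`, which is exactly the quantity the project's statistical methods aim to predict from historical data.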
using machine learning researchers at texas a&m university have developed an algorithm that automates the process of determining key features of earths subterranean environment including bountiful reservoirs of groundwater oil and natural gas specifically the researchers algorithm is designed on the principle of reinforcement or reward learning here the computer algorithm converges on the correct description of the underground environment based on rewards it accrues for making correct predictions of the pressure and flow expected from boreholes subsurface systems that are typically a mile below our feet are completely opaque at that depth we cannot see anything and have to use instruments to measure quantities like pressure and rates of flow said dr siddharth misra associate professor in the harold vance department of petroleum engineering and the department of geology and geophysics although my current study is a first step my goal is to have a completely automated way of using that information to accurately characterize the properties of the subsurface the algorithm is described in the december issue of the journal applied energy simulating the geology of the underground environment can greatly facilitate forecasting of oil and gas reserves predicting groundwater systems and anticipating seismic hazards depending on the intended application boreholes serve as exit sites for oil gas and water or entry sites for excess atmospheric carbon dioxide that need to be trapped underground along the length of the boreholes drilling operators can ascertain the pressures and flow rates of liquids or gas by placing sensors conventionally these sensor measurements are plugged into elaborate mathematical formulations or reservoir models that predict the properties of the subsurface such as the porosity and permeability of rocks for their study misra and his team chose a type of machine-learning algorithm based on the concept of reinforcement learning simply put the software learns to make a series of decisions based on feedback from its computational environment imagine a bird in a cage the bird will interact with the boundaries of the cage where it can sit or swing or where there is food and water it keeps getting feedback from its environment which helps it decide which places in the cage it would rather be at a given time said misra algorithms based on reinforcement learning are based on a similar idea they too interact with an environment but it's a computational environment to reach a decision or a solution to a given problem so these algorithms are rewarded for favorable predictions and are penalized for unfavorable ones over time reinforcement-based algorithms arrive at the correct solution by maximizing their accrued reward another technical advantage of reinforcement-based algorithms is that they do not make any presuppositions about the pattern of data for example misra's algorithm does not assume that the pressure measured at a certain time and a certain depth is related to what the pressure was at the same depth in the past this property makes his algorithm less biased thereby reducing the chances of error at predicting the subterranean environment when initiated misra's algorithm begins by randomly guessing a value for porosity and permeability of the rocks constituting the subsurface based on these values the algorithm calculates a flow rate and pressure that it expects from a borehole if these values do not match the actual values obtained from field measurements also known as historical 
data the algorithm gets penalized consequently it is forced to correct its next guess for the porosity and permeability however if its guesses were somewhat correct the algorithm is rewarded and makes further guesses along that direction the researchers found that within 10 iterations of reinforcement learning the algorithm was able to correctly and very quickly predict the properties of simple subsurface scenarios misra noted that although the subsurface simulated in their study was simplistic their work is still a proof of concept that reinforcement algorithms can be used successfully in automated reservoir-property predictions also referred to as automated history matching a subsurface system can have 10 or 20 boreholes spread over a 2-5-mile radius if we understand the subsurface clearly we can plan and predict a lot of things in advance; for example we would be able to anticipate subsurface environments if we go a bit deeper or the flow rate of gas at that depth said misra in this study we have turned history matching into a sequential decision-making problem which has the potential to reduce engineers efforts mitigate human bias and remove the need for large sets of labeled training data he said future work will focus on simulating more complex reservoirs and improving the computational efficiency of the algorithm this research is funded by the united states department of energy
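the guess-reward-penalize cycle described above is easy to make concrete; the sketch below is a toy in python whose forward model reward shaping and numbers are all invented for illustration and it behaves more like a reward-guided local search than misra's full reinforcement-learning agent

    import random

    def forward_model(porosity, permeability):
        # stand-in for a reservoir simulator: maps guessed rock properties
        # to a predicted borehole flow rate (arbitrary units)
        return 100.0 * porosity * permeability

    TRUE_POROSITY, TRUE_PERMEABILITY = 0.2, 1.5
    observed_flow = forward_model(TRUE_POROSITY, TRUE_PERMEABILITY)  # the "historical data"

    porosity = random.uniform(0.05, 0.4)   # initial random guess
    permeability = random.uniform(0.5, 3.0)
    step = 0.2
    best_error = abs(forward_model(porosity, permeability) - observed_flow)

    for iteration in range(10):
        trial_p = porosity + random.uniform(-step, step)
        trial_k = permeability + random.uniform(-step, step)
        error = abs(forward_model(trial_p, trial_k) - observed_flow)
        if error < best_error:
            # reward: the guess improved the match so keep exploring near it
            porosity, permeability, best_error = trial_p, trial_k, error
        else:
            # penalty: shrink the search around the last good guess
            step *= 0.8
        print(f"iteration {iteration}: porosity={porosity:.3f} "
              f"permeability={permeability:.3f} mismatch={best_error:.3f}")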
the texas a&m university system board of regents today named dr m katherine banks as the sole finalist for the position of president of texas a&m university this is a tremendous honor banks said the core values of texas a&m its rich traditions unique culture and commitment to the greater good are the very foundation of this great university and resonate deeply with me i hope to build upon that framework in our pursuit of preeminence without losing what makes texas a&m so special texas a&m is one of a kind and theres nowhere else id rather be banks is currently director of the texas a&m engineering experiment station dean of the texas a&m college of engineering and vice chancellor of engineering and national laboratories for the a&m system in those roles she spurred unprecedented growth in the college of engineering while also being a pivotal leader in some of the a&m systems greater accomplishments including recruiting the army futures command to the rellis campus and winning a federal contract to help manage los alamos national laboratory on wednesday chancellor john sharp recommended banks as sole finalist and the board approved under state law regents must name a finalist at least 21 days before making the appointment at a subsequent meeting the board was excited to know the search yielded tremendous interest and many qualified candidates said elaine mendoza chairman of the board of regents this speaks to the stellar reputation credibility and positive momentum of texas a&m university the board is confident that dr banks will lead the university to even greater heights while celebrating the traditions and spirit that make texas a&m unique for more information see the full press release about banks being named sole finalist
texas a&m university researchers have recently shown superior performance of a new oxide dispersion strengthened (ods) alloy they developed for use in both fission and fusion reactors dr lin shao professor in the department of nuclear engineering worked alongside research scientists at the los alamos national laboratory and hokkaido university to create the next generation of high-performance ods alloys and so far they are some of the strongest and best-developed metals in the field ods alloys consist of a combination of metals interspersed with small nanometer-sized oxide particles and are known for their high creep resistance this means that as temperatures rise the materials keep their shape instead of deforming many ods alloys can withstand temperatures up to 1 000 c and are typically used in power generation and engines within aerospace engineering as well as cutlery the nuclear community has a high need for reliable and durable materials to make up the core components of nuclear reactors the material must be high strength radiation tolerant and resistant to void swelling (materials develop cavities when subjected to neutron radiation leading to mechanical failures) nuclear researchers like shao are consistently seeking to identify quality creep-resistant and swelling-resistant materials for use in high-temperature reactors in general ods alloys should be resistant to swelling when exposed to extreme neutron irradiation said shao however the majority of commercial ods alloys are problematic from the beginning this is because almost all commercial ods alloys are based on the ferritic phase ferritic alloys classified by their crystalline structure and metallurgical behavior have good ductility and reasonable high-temperature strength however the ferritic phase is the weakest phase when judged by its swelling resistance causing the majority of commercial ods alloys to fail in the first line of defense shao known internationally for his pioneering work in radiation materials science directs the accelerator laboratory for testing alloys under extreme irradiation conditions shao and his research team collaborated with the japanese research group at hokkaido university led by dr shigeharu ukai to develop various new ods alloys we decided to explore a new design principle in which oxide particles are embedded in the martensitic phase which is best to reduce void swelling rather than the ferritic phase said shao the resulting ods alloys are able to survive up to 400 displacements per atom and are some of the most successful alloys developed in the field both in terms of high-temperature strength and superior swelling resistance details of the complete project were published in the journal of nuclear materials along with the most recent study the team has since conducted multiple studies and attracted attention from the us department of energy and the nuclear industry the project resulted in a total of 18 journal papers and two doctoral degree dissertations
dr paul gratz and dr jiang hu in the department of electrical and computer engineering at texas a&m university are utilizing machine-learning models to detect performance bugs within processors and create a more efficient method to combat this real-world problem as consumers we upgrade to a new phone gaming system or smart device for the home because the newer model offers better battery life graphics performance and overall capabilities when these bugs go undetected and are released into ‘the wild' – in our homes and into our everyday lives – the performance we lose out on as a result can have a greater long-term effect than we might realize when it comes to computer bugs there are two main types: functional and performance a functional bug means an error within a processor creates a computing result that is simply wrong for example if a processor is asked to solve three plus two and its result is seven there is clearly an issue with that result a performance bug is not as simple to detect suppose you want to drive from college station to houston said hu professor in the department at one point you somehow make a mistake and drive toward dallas thats a big mistake and its pretty easy to tell but there are different paths to houston some are shorter some are longer – that difference is hard to tell because you still arrived at your desired destination performance bugs can fly under the radar and remain unnoticed forever – ultimately diminishing the progress to be made in all facets of modern technology fortunately gratz and hu are working with collaborators at intel corporation on a promising answer
by utilizing machine-learning models and automating the process gratz and hu are hopeful that the effort currently spent on performance validation can be drastically reduced ultimately leading to technologies that reach their full potential more efficiently and effectively this is the first application of machine learning to this kind of problem said gratz associate professor its the first work we have really found that actually tries to tackle this problem at all gratz and hu explained that their procedure allows them to do in a day with one person what currently takes a team of several engineers months to accomplish the first hurdle in detecting these bugs is defining what they might look like the computing industry faces this challenge during initial performance analyses when a new technology shows a performance somewhat better than the previous generation it is hard to determine whether that processor is running at its full potential or whether a bug is reducing the outcomes when they should expect better results if you have a 20% gain from one generation to the next how do you know that the 20% gain couldnt have been 25% or 30% gratz said this is where the teams machine-learning models come into play the models are able to predict what the performance of the new machine should be based on relationships learned from earlier designs so that the team can see if there is a divergence because chips are more compact and complex than ever before there is a higher chance for such bugs to appear as the complexity of chip design grows the conventional method to detect and eliminate bugs manual processor design validation is increasingly difficult to maintain thus a need for an automated solution became apparent intel contacted hu in june 2019 with hopes of collaboration to solve this critical issue this work has been supported by a grant from the semiconductor research corporation the researchers published their current findings in a paper that was accepted into the 27th institute of electrical and electronics engineers international symposium on high-performance computer architecture a top-tier conference in computer architecture
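the divergence check gratz describes can be sketched in a few lines: fit a performance model on a known-good design then flag new measurements that fall short of the model's prediction; the features numbers and tolerance below are invented and a plain least-squares fit stands in for the team's actual machine-learning models

    import numpy as np

    # rows are microbenchmarks; columns are design features (e.g. cache
    # size and issue width); all numbers invented for illustration
    features = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.0]])
    measured_perf = np.array([3.1, 2.9, 6.2, 6.0])  # from a known-good design

    # fit a linear performance model on the known-good generation
    coef, *_ = np.linalg.lstsq(features, measured_perf, rcond=None)

    def flag_performance_bug(new_features, new_perf, tolerance=0.15):
        # flag runs whose measured performance falls short of the model's
        # prediction by more than the fractional tolerance
        predicted = new_features @ coef
        shortfall = (predicted - new_perf) / predicted
        return shortfall > tolerance

    # a new design can beat the old one and still hide a bug: here the
    # measurement trails its prediction by roughly 20%
    print(flag_performance_bug(np.array([[5.0, 3.0]]), np.array([6.5])))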
as humankind steps into new frontiers in space exploration satellites and space vehicles will need to pack more cargo for the long haul however certain items like dish antennas used for wireless communication pose a challenge since they cannot be very densely packed for flight because of their signature bowl shape now researchers at texas a&m university have used the principles of origami the ancient japanese art of paper folding to create a parabolic structure from a flat surface using a shape-memory polymer the researchers showed that when heated the shape-memory polymer changes its shape in a systematic way that mimics folds this reshaping lifts the material into the shape of a dish they also showed that their origami-engineered dish antennas performed as efficiently as conventional smooth dish antennas initially we were largely focused on self-folding origami structures: how would you make them how would you design them into different shapes what material would you use said dr darren hartl assistant professor in the department of aerospace engineering having answered some of these questions we turned to some real-world applications of origami engineering like adaptive antennas for which there has been very little work done in this study we combine folding behavior and antenna performance and address that gap the researchers have described their antenna design in the journal smart materials and structures antennas come in various designs and their major function is to transmit or receive information in the form of electromagnetic waves some antennas like the ones for communicating between a television and a space-bound satellite are curved in the shape of a parabola this ensures that the electromagnetic waves hitting the bowl-shaped antenna are reflected and converge to a small point of focus by extension when these antennas transmit electromagnetic waves they do so in a narrow direction a feature known as directionality
thus parabolic reflectors are a natural choice for space applications since they either pick up or send information in a specific direction however their shape makes them inconvenient to store in space vehicles where there is limited room this problem is exacerbated when many antennas need to be stored onboard one way to address this hurdle is origami engineering using this technique flat 2d structures can be folded into elaborate 3d shapes if parabolic antennas can be made flat using origami they can be stacked or rolled up inside of a rocket and when ready for deployment be unrolled and folded into a parabolic shape however hartl explained that folding a piece of flat material into a smooth bowl is difficult and nonintuitive conventional origami design entails folding thin sheets of material at sharp creases engineering structures on the other hand have a thickness and the choice of material can make it hard to get these sharp creases he said consequently we need to create folds that exhibit smooth bending to facilitate paper-like folding at the creases the researchers turned to shape-memory composites that change their shape when heated in addition these materials are inexpensive lightweight flexible and capable of being stretched multiple times without being damaged first they built a flat 2d surface using strips of shape-memory composites and cardstock simply put pieces of stiff cardstock which formed flat facets were held together by the shape-memory composites similar to how the ribs of an umbrella hold the fabric in place further at the vertices where the composites meet they cut out tiny holes to serve as corner creases when the assembly folds into a 3d parabola the researchers showed that the composites changed their shape by bending systematically when heated eventually lifting the cardstock pieces into a parabolic bowl-like shape they also tested if their multifaceted parabolic reflector worked as efficiently as a smooth parabolic antenna and found that the two antennas performed comparably hartl said this research is an important step toward using the principles of origami to make highly functional engineering structures that can be stowed compactly and easily deployed when needed in addition to other applications future advances based on this research will likely result in morphing reflector antennas for military and space telecommunication applications he said this research is funded by the national science foundation and the air force office of scientific research
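two textbook antenna formulas give a feel for why the parabolic shape is worth the folding effort: a dish of diameter D and depth d focuses incoming waves at f = D²/(16d) and its gain grows with (πD/λ)²; the dimensions below are invented and are not measurements of the texas a&m prototype

    import math

    D = 0.5             # dish diameter in meters (illustrative)
    depth = 0.05        # dish depth in meters
    wavelength = 0.025  # roughly a 12 ghz satellite-tv signal
    efficiency = 0.6    # typical aperture efficiency for a real reflector

    focal_length = D ** 2 / (16 * depth)  # where the feed must sit
    gain = efficiency * (math.pi * D / wavelength) ** 2

    print(f"focal length: {focal_length:.3f} m")
    print(f"gain: {10 * math.log10(gain):.1f} dBi")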
dr nick duffield the royce e wisenbaker professor i in the department of electrical and computer engineering at texas a&m university was named a 2020 fellow by the association for computing machinery (acm) for his contributions to network measurement and analysis acm is the world's largest and most prestigious society of computing professionals the acm fellows program recognizes the top 1% of acm members for their outstanding accomplishments in computing and information technology and/or outstanding service to acm and the larger computing community fellows are nominated by their peers with nominations reviewed by a distinguished selection committee duffield is director of the texas a&m institute of data science (tamids) at which he and his collaborators pursue new approaches to data science research education operations and partnership the work conducted within tamids spans multiple disciplines and connects researchers from across texas a&m to bring together elements of data science from engineering technology science and the humanities and inform wider social challenges duffield also holds a courtesy appointment in the department of computer science and engineering duffields research focuses on data science and computer networking with current projects concerning algorithms for data streaming and machine learning computer network measurement and resilience and applications of data science to urban science transportation agriculture and hydrology
the texas a&m institute of data science (tamids) serves and fosters collaborations across the university and its affiliated agencies it is a joint undertaking of texas a&m university with the texas a&m engineering experiment station and texas a&m agrilife research tamids is an inclusive umbrella organization for data science and facilitates interactions between researchers in diverse application areas and those with expertise in core methodologies promotes education in data science across the university and pursues outreach to commercial and governmental organizations in the wider data science ecosystem
the national academy of inventors (nai) has named two engineering faculty members from texas a&m university to its 2021 class of nai senior members the two new senior members are from the college of engineering: dr saurabh biswas associate professor of practice department of biomedical engineering and executive director for commercialization and entrepreneurship texas a&m engineering experiment station and dr roozbeh jafari tim and amy leach professor department of biomedical engineering department of computer science and engineering and the department of electrical and computer engineering nai senior members are active faculty scientists and administrators from nai member institutions with success in patents licensing and commercialization they have produced technologies that have brought or aspire to bring real impact on the welfare of society the 2021 class includes 63 accomplished academic inventors who are named on 625 issued us patents and who represent 37 nai member institutions including research universities governmental entities and nonprofit institutes worldwide the title of nai senior member was established in 2019 the selection of biswas and jafari brings the number of current texas a&m faculty members who are nai senior members to 11 in addition 13 current texas a&m faculty members are nai fellows congratulations to dr biswas and dr jafari for earning this distinction said vice president for research dr mark a barteau an nai fellow also thank you to the nai for recognizing the innovations of our outstanding a&m faculty members in the ongoing quest for solutions that better the human condition and address our worlds most challenging problems nai is a member organization comprising us and international universities and governmental and nonprofit research institutes with more than 4 000 individual inventor members and fellows spanning more than 250 institutions worldwide
machine learning is widely used in various applications such as image recognition autonomous vehicles and email filtering despite its success concerns about the integrity and security of a models predictions and accuracy are on the rise to address these issues dr yupeng zhang professor in the department of computer science and engineering at texas a&m university and his team applied cryptographic algorithms called zero-knowledge proof protocols to the domain of machine learning "these protocols will allow the owner of a machine-learning model to prove to others that the model can achieve a high accuracy on public datasets without leaking any information about the machine-learning model itself " said zhang the researchers' findings were published in the proceedings from the association for computing machinery's 2020 conference on computer and communications security machine learning is a form of artificial intelligence that focuses on algorithms that give a computer system the ability to learn from data and improve its accuracy over time these algorithms build models to find patterns within large amounts of data to make decisions and predictions without being explicitly programmed over the years machine-learning models have undergone a great deal of development which has led to significant progress in several research areas such as data mining and natural language processing several companies and research groups claim to have developed machine-learning models that can achieve very high accuracy on public testing samples of data still reproducing the results to verify those claims remains a challenge for researchers it is unknown whether the models can actually achieve that accuracy and the claims aren't easy to verify cryptography the theoretical foundation of cybersecurity is the science of protecting information and communications through a series of codes so that only the sender and the intended recipient have the ability to view and understand them it is most commonly used to develop tools such as encryption ciphertexts digital signatures and hash functions there are approaches outside of cryptography that could be used one of which involves releasing the model to the public however as machine-learning models have become critical intellectual property for many companies they can't be released because they contain sensitive information essential to the business "this approach is also problematic because once the model is out there there is a software tool online anyone could use to verify " said zhang "recent research also shows that the model's information could be used to reconstruct it and use it for whatever they desire" as an application of cryptography zero-knowledge proof protocols are a mathematical method that allows the owner of a machine-learning model to produce a succinct proof that something is true with overwhelming probability without sharing any extra information about it while there has been significant improvement in the use of general-purpose zero-knowledge proof schemes in the last decade constructing efficient machine-learning prediction and accuracy tests remains a challenge because of the time it takes to generate a proof "when we applied these generic techniques to common machine-learning models we found that it would take several days or months for a company to generate a proof to prove to the public that their model can achieve what they claim " said zhang for a more efficient approach zhang and his team designed several new zero-knowledge proof techniques and optimizations
specifically tailored to turn the computations of a decision tree model which is one of the most commonly used machine-learning algorithms into zero-knowledge proof statements using their approach on the computations of a decision tree they found that it would take less than 300 seconds to generate a proof that the model can achieve high accuracy on a dataset as their newly developed approach only addresses generating proofs for decision tree models the researchers want to expand their approach to efficiently support different types of machine-learning models contributors to this project include zhiyong fang doctoral student in the computer science and engineering department; and doctoral student jiaheng zhang and dr dawn song from the university of california berkeley this work is supported by the national science foundation defense advanced research projects agency and the center for long-term cybersecurity
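a full zero-knowledge accuracy proof is far more involved than anything that fits here but one building block a cryptographic commitment is easy to show; in this toy python sketch a model owner fixes a (hypothetical one-node) decision tree up front without revealing it which is the kind of step protocols like zhang's build on

    import hashlib
    import json
    import os

    def commit(model_params):
        # return (commitment, opening); publishing the commitment alone
        # reveals nothing useful about the parameters
        opening = os.urandom(32)  # random blinding value
        payload = json.dumps(model_params, sort_keys=True).encode()
        return hashlib.sha256(opening + payload).digest(), opening

    def verify(commitment, opening, model_params):
        payload = json.dumps(model_params, sort_keys=True).encode()
        return hashlib.sha256(opening + payload).digest() == commitment

    # a hypothetical one-node decision tree standing in for a real model
    params = {"feature": "petal_width", "threshold": 0.8}
    c, r = commit(params)
    # later the owner proves statements about the committed model; here we
    # only check that the opening matches the original commitment
    assert verify(c, r, params)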
the texas a&m engineering experiment station (tees) and lockheed martin corporation signed a master research agreement (mra) on april 29 to further cement their longstanding research collaboration bridget lauderdale vice president and general manager of the f-35 lightning ii program at lockheed martin and dr m katherine banks vice chancellor of engineering and national laboratories for the texas a&m university system and director of tees signed the agreement during a ceremony in the zachry engineering education complex the two research powerhouses have developed a robust working relationship over the years collaborating on many cutting-edge research projects the master research agreement will facilitate collaborations across disciplines including hypersonics advanced networks autonomy and cyber and directed energy lockheed martin is excited to support and partner with texas a&m to solve complex problems that impact the national security of the united states and our allies the research and development we will accomplish with the master research agreement and our ongoing partnership with the university consortium for applied hypersonics will ensure our nation is prepared for the unpredictable world we live in today and far into the future said lauderdale from hypersonics to directed energy autonomous vehicles to advanced networks this partnership will elevate texas a&ms $200 million investment in the george hw bush combat development complex (bcdc) and its mission to support leading-edge national security research and technology development banks said we have developed a true national security innovation ecosystem and this expanded partnership with lockheed martin will provide tremendous opportunities for our faculty researchers and students this close partnership helps facilitate the adoption of these advances while supporting the next generation of hypersonic researchers said dr rodney bowersox associate dean for research at tees and ford i professor of aerospace engineering last year tees was selected by the us department of defense joint hypersonics transition office (jhto) which is led by dr gillian bussey to lead the national university consortium for applied hypersonics (ucah) bowersox who is the tees executive director for ucah commented this mra is perfectly aligned with the jhto vision of an inclusive ecosystem of university industry and laboratory partnerships to provide innovative solutions to applied hypersonics problems while educating the national workforce the bcdc located on the rellis campus in bryan texas will soon be the site of many kinds of research testing including at the innovation proving ground (ipg) a challenging outdoor test site for designing analyzing and validating autonomous aerial ground and subterranean vehicles additional state-of-the-art testing facilities are coming online soon including a one-of-a-kind ballistic aero-optics and materials (bam) test range for directed energy and hypersonics research the bcdc is located on 2 000 acres and includes a full complement of facilities equipment and instrumentation including the research integration center (ric) with laboratories accelerator space and offices for the army futures command and other texas a&m system collaborators construction is currently underway on the bam test range as well as the ipg
finding effective covid-19 vaccine formulas alone is not enough to put the global pandemic behind us thats why the texas a&m university system is collaborating with fujifilm diosynth biotechnologies texas to train the workforce that is mass producing two covid-19 vaccine candidates for the federal government for the past nine months a dedicated team of texas a&m university scientists at the national center for therapeutics manufacturing (nctm) has been training workers on the biomanufacturing basics needed to produce the covid-19 vaccine candidates the nctm is a joint research center of the texas a&m engineering experiment station and texas a&m with just four instructors and a handful of support staff the team has trained more than 200 new employees of fujifilm diosynth biotechnologies texas the company is the texas a&m systems biomanufacturing subcontracting partner in the national emergency manufacturing program texas a&m is doing a great public service said john sharp chancellor of the texas a&m system by collaborating with fujifilm diosynth biotechnologies texas to increase the vaccine supply our team is helping save a bunch of lives chancellor sharp recently visited the nctm to learn more about its success the nctm employee training is arranged through the center for innovation and advanced development and manufacturing (ciadm) in college station a texas a&m system program established in 2012 by the federal government for just this kind of national emergency
to meet fujifilm diosynth biotechnologies texas aggressive hiring pace in recent months our team has delivered nearly nonstop training said dr zivko nikolov director of nctm and a professor of bioprocess engineering at texas a&m im so honored to lead such a dedicated team the training is a customized intensive seven-day hands-on curriculum covering various aspects of cell culture and basic molecular biology aseptic processes and microbiology and upstream and downstream processing of biological materials the team worked quickly to build the program within weeks of the federal request last july the training began almost immediately and has continued ever since the ability to respond rapidly to an emergency was the original goal of the ciadm program said dr william jay treat director of ciadm and chief manufacturing officer for the texas a&m health science center since its creation nctm has been critical to developing training programs to meet the manpower required for an emergency such as this pandemic the nctm has more than 25 000 square feet of dedicated instructional space with several million dollars worth of traditional stainless and single-use systems for upstream and downstream bioprocessing it has contracted with more than 80 subject-matter experts to build a catalog of training programs that serve industry government and academia in the past eight years nctm has trained nearly 1 600 students including new hires and employed professionals undergraduate/graduate students military veterans and others transitioning careers and even high school students interested in stem careers we have never been more proud of our work than in the past nine months said jenny ligon nctm assistant director for workforce development who has been with nctm since its creation in 2012 im so proud of our small but mighty team we are happy to do our part in getting everyone vaccinated as a center of excellence for science manufacturing and engineering we are pleased to closely collaborate with texas a&m to train new hires to support the manufacture of life-impacting medicines and vaccines at our college station facility said dr gerry farrell chief operating officer of fujifilm diosynth biotechnologies texas it is critical that we continue to train local talent to feed this growing and vibrant texas biotech community
the national center for therapeutics manufacturing is a first-of-its-kind multi-disciplinary workforce education institution and biopharmaceutical manufacturing center located at texas a&m university in college station texas the nctms workforce development mission is to provide education training and outreach programs to produce a highly skilled workforce for the vital us and global pharmaceutical industry
dr valerie g segovia has been named associate director of outreach and education for the texas a&m engineering experiment stations nuclear engineering and science center (nesc) and associate director of the nuclear power institute (npi) in her new role segovia will work with the nesc director and staff to manage expand and direct all nesc outreach and education programs as npi associate director she will manage expand and direct the statewide workforce development mission and supervise all npi staff i have such great feelings of satisfaction gratitude and appreciation segovia said in regard to her new roles at nesc and npi every day is challenging inspiring and intellectually stimulating in working with our students educators and partners while meeting our missions segovia began her career with npi over a decade ago and has created novel programs that have been implemented across the state of texas including the npi flagship program the workforce industry training program traditionally only 15%-17% of us high school graduates pursue a stem (science technology engineering and mathematics) college degree however through segovias leadership an average of 75%-80% of graduating high school students who participated in npi programs have pursued a stem degree over the past decade segovia is a gulf coast workforce board member and has worked in the public school system as an elementary teacher high school counselor high school assistant principal and high school principal she was recognized as a texas association of secondary school principals (tassp) region iii principal of the year and tassp texas principal of the year finalist and she also won the joseph b whitehead educator of distinction award for exemplary dedication to the field of education additionally she was named a kavu channel 25 hometown hero under her leadership and management palacios high school was named a top performing school by us news & world report segovia holds a bachelors degree in interdisciplinary studies and masters degrees in counseling and administration and supervision and a doctorate in educational leadership this article was originally posted by the nuclear power institute (npi)
dr chanan singh and doctoral student arun karngala from the department of electrical and computer engineering at texas a&m university are working to develop a reliability framework for the distribution system so that utility companies can be better prepared for uncertainties that may arise singh is a regents professor the irma runyon chair professor and university distinguished professor by developing these models and methods to perform the analysis of the distribution level of the power grid adverse effects of localized weather events or equipment failure can potentially be prevented the researchers framework can be used to test the systemwide impact of installing rooftop solar and energy storage by the customers in the distribution system we found that with 40% of customers installing solar capacity that amounts to 1.5 times the peak demand of the respective households and with sufficient energy storage systems the reliability indices showed significant improvements karngala explained for example the system average interruption frequency index was improved by 50% the system average interruption duration index was improved by 70% and the customer average interruption duration index was improved by 45% karngala said that this framework can also be used to decide the capacity of solar rooftop installation if the installed solar capacity is increased from one time the peak demand to two times the peak demand the reliability indices show steady improvement the improvement in indices tapers off after the installed solar capacity is increased more than 2.5 times the peak demand performing reliability studies can help create business cases for purchasing such storage and ongoing research on storage technologies is helping to provide more affordable and reliable alternatives the research team is focused on the analysis and reliability at the distribution level as it is the most vulnerable of all stages of power allocation and therefore can cause the most trouble for customers further unlike high-level sectors of the power grid – such as power generation and transmission – that have existing methods of analysis and procedures to ensure that the reliability will be maintained in the presence of uncertainties at specified levels the distribution level generally does not have such standards most independent system operators (isos) ensure they have enough power generation reserve so that if an unexpected issue arises (eg transmission line failure generator failure the load being higher than forecasted etc) resulting in the total load not being supplied the load can be adjusted so that it is not lost completely for all customers many isos use criteria that ensure that on average this load curtailment would not occur more than one day in 10 years such standards are not typically used at the distribution level this work was published in ieee transactions on sustainable energy in january the winter storm event that happened recently in texas was of a different nature that spanned the entire state singh said but extreme weather can be in a variety of forms for example you can have tornadoes or hurricanes where the effect is not statewide but instead more limited areas are affected we believe that in those situations these models and the tools that they will provide to us to manage the system will enhance the reliability of the distribution system because you dont have to rely only on the power that is coming from the grid but also from other local sources such as solar and perhaps wind one challenge that the
team is facing is that there are many different kinds of generating systems being integrated into distribution systems that must be accounted for in this framework analysis as karngala explained distribution systems were previously considered only consumers of energy but today there are newer technologies and many more distributed energy resources coming into the distribution system such as solar panels wind generation and storage the exciting part about working on distribution systems is that these are in a phase of change now karngala said these are changing from traditional systems to much more advanced systems and we are in that transition phase where we need to develop models and methodologies ultimately the team is looking to build a comprehensive framework of reliability analysis where approaches such as demand response price strategies and operational strategies can be included and expanded upon as the power grid evolves there is no shortage of projects that can be developed around this framework as many models methods and operational strategies can be included in the reliability evaluation karngala said this work is funded by the department of energy as part of the us-india collaborative for smart distribution system with storage project
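the indices karngala quotes have standard ieee 1366 definitions and are straightforward to compute from outage records; the events and customer count below are invented for illustration

    # each outage event: (customers interrupted, outage duration in hours)
    outages = [(120, 1.5), (40, 4.0), (300, 0.5)]
    total_customers_served = 1000

    customers_interrupted = sum(n for n, _ in outages)
    customer_hours = sum(n * hours for n, hours in outages)

    saifi = customers_interrupted / total_customers_served  # interruptions per customer served
    saidi = customer_hours / total_customers_served         # outage hours per customer served
    caidi = saidi / saifi                                   # average hours per interruption

    print(f"SAIFI={saifi:.2f} SAIDI={saidi:.2f} h CAIDI={caidi:.2f} h")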
the texas a&m engineering experiment station's energy market segment features innovative solutions to obstacles in energy production processing and consumption our strengths include natural gas fossil and non-fossil-based technologies energy economics and multi-scale energy systems engineering; upstream petroleum engineering technology through industry partnerships; energy consumption optimization in commercial and industrial building operations; power system infrastructure integration with transportation to create energy ecosystems; and turbomachinery performance and reliability in rotor dynamics acoustics seals tribology couplings computational and experimental fluid dynamics heat transfer torsional vibrations materials and finite element analysis
before your phone can be a source of endless tiktok videos you must first acquire it through a global supply chain but when these supply chains are disrupted whether due to a global pandemic unexpected winter storms or a massive cargo ship blocking a busy trade canal it can affect everything from the food you eat to the toilet paper on the shelves of local grocery stores dr eleftherios lefteris iakovou the harvey hubbell professor of industrial distribution at texas a&m university and director of manufacturing and logistics innovation initiatives for the texas a&m engineering experiment station is utilizing his years of expertise in supply chain research to bolster current supply chains and develop new resilient supply chain systems he recently published an extensive look into supply chains and how to build a more resilient system in the united states so what exactly is a supply chain at its core it is the network or process involved in the creation and sale of a product to consumers and industrial customers iakovou also the co-director of the global value chains program at the bush school and director of supply chain management for the secureamerica institute of the texas a&m university system explained that it is far more intricate than just that however a supply chain is an extended enterprise over which we do five things: we plan we buy we make we move we sell he said over this extended enterprise we manage four flows we manage flows of products flows of processes flows of information and financial flows and unless we do this synergistically we cannot add value to the supply chain itself defining disruptions: when one part of this delicately balanced network fails and disrupts the process the rest of the supply chain consumers companies and the nation included are directly affected this can be a disruption in supply or production such as when a manufacturing plant in japan is out of commission due to a tsunami; a disruption in demand and supply as seen with the increased need and lower supply of power during the winter storm in texas; or a disruption in transportation due to the suez canal being blocked by one of the largest container ships in the world the ever given vessel he explained that these global supply chains are very brittle and highly susceptible to disruptions particularly surrounding uncertain or unplanned events for example the covid-19 pandemic demonstrated a lack of preparedness by nations and companies alike for black swan (low probability-high impact) events it's one thing to prepare an organization based on chronically occurring disruptions and it's another thing to think ‘how can i create a supply chain that is flexible enough agile enough so when something really bad happens it has the posture to bounce back as quickly as possible' iakovou said the future of resilience: developing a resilient supply chain starts with understanding that one size does not fit all each has its own set of complications and strengths in the modern global economy cost reduction has almost become synonymous with outsourcing and offshoring the manufacturing of components for products however it is important for companies to still have home-based manufacturing plants that operate even at a lower production rate in case there is an issue with the offshored supply iakovou said covid-19 demonstrated the sole supplier model is out of business companies are pushing for low cost low cost itself is not sustainable he said the federal government on the other hand needs resilience and security for
supply chains critical to the nation (eg pharmaceuticals semiconductor chips large capacity batteries rare earth minerals) but resilience costs money as its based on redundancies (diversified sourcing additional inventories) iakovou suggests the government has a key role to play by helping transition companies away from a shareholder model and into a stakeholder model as suggested in 2019 by the business roundtable private and government sectors would have to collaborate and consider not only shareholder value but also workers and associated partners society national competitiveness and security and the environment that's a monumental shift that if it happens correctly would allow for new optimal trade-offs between cost efficiencies resilience and sustainability he said iakovou pointed out that in the past governmental agencies were critical in de-risking innovations related to the internet touch screens and vaccines he believes the government should work with the movers and shakers in society and the private sector to continue to advance similar innovations in supply chains critical to the nation and to further support the nations well-being competitiveness security and global leadership there is talk in my humble opinion deservedly so about the elevated role of the government as a further catalytic entity to spur more innovation in the way we design these global supply chains so that they display cost-competitive resilience in order to address the new realities he said and that's where the stakeholder model would be very handy so i absolutely believe that this is the way to move forward
the texas a&m engineering experiment station and texas a&m university have a demonstrated history of strong leadership and excellence in fields that affect everyday life like supply chain management the thomas and joan read center for distribution research and education applies both advanced research skills and a thorough understanding of distribution to every project their experts who merge long-standing industry experience with the latest developments in academia conduct both primary and secondary research for manufacturers distributors professional associations and publications texas a&ms global supply chain laboratory teaches students the importance of a dynamic end-to-end supply chain vision as well as cutting-edge solutions for wholesale and industrial distribution channels
researchers at texas a&m university are developing novel tests to diagnose preeclampsia earlier in a pregnancy even before symptoms occur allowing hospitals to better treat pregnant patients preeclampsia a pregnancy complication that can lead to organ damage especially in the liver and kidneys is one of the leading causes of maternal and baby deaths during pregnancy current diagnosis typically occurs very late in the pregnancy (around the 20th week) by documenting common symptoms such as high blood pressure protein in the urine and swelling in the legs however an added complication is that these symptoms resemble many common side effects associated with a healthy pregnancy and sometimes there may also be no symptoms even if preeclampsia exists what were looking for are more specific biomarkers that could be monitored and addressed said dr samuel mabbott assistant professor in the department of biomedical engineering the research is being developed in collaboration with dr gerard coté professor in the department of biomedical engineering and director of the texas a&m engineering experiment station center for remote health technologies and systems and dr mahua choudhury associate professor in the texas a&m college of pharmacy to better diagnose the condition we are developing an assay on paper that using a drop of blood and a small hand-held meter can measure a biomarker in the blood much earlier in the pregnancy further by using paper-based systems the technology is better suited for low resource settings where preeclampsia is even more prevalent coté said coté and mabbott have a keen interest in developing diagnostic tests that can be utilized in underserved under-resourced and often remote environments both are also developing cardiac and diabetes diagnostic and monitoring tests for these populations as part of an engineering research center funded by the national science foundation called precise advanced technologies and health systems for underserved populations the current testing relies on micrornas – noncoding rna sequences in the body that are involved in protein creation – mabbott said compared to antibody-based testing microrna may help clinicians detect diseases earlier antibodies or antigen-based diagnostics are understandably well utilized their levels can be used to quantify disease states mabbott said the problem is it takes a long time for those biomolecules to achieve detectable levels levels of disease-related micrornas are perturbed much earlier in the disease cycle along with microrna the team also aims to find histones a specific family of proteins changes in expression of both of these biomarkers can be monitored and measured to better detect patients who are more susceptible to preeclampsia one challenge mabbott said is identifying emerging disease biomarkers and most significantly clinically validating them by combining the detection of micrornas and histones in one multiplexed paper fluidic device it is hoped that the robustness and accessibility of the test will increase the long-term goal is to have a test that can be easily administered to pregnant patients in any circumstance and provide test results within 30 minutes since the test itself is novel and the targeted biomarkers are revolutionary it is incredibly exciting to be working on the project mabbott said were trying to think ahead by using both emerging biomarkers and accessible test formats
the texas a&m engineering experiment station's health care market segment advances research in the key areas of medicine health care-related technology and life sciences using a multi-disciplinary approach our strengths include bioinformatics computational biology and systems biology for agricultural environmental and life sciences next generation medical devices and systems and education training and outreach programs for pharmaceutical workforce development
dr tim davis and dr roozbeh jafari were recently named recipients of the texas a&m engineering experiment station's research impact award the award recognizes research that has had an impact broadly defined as leading to outcomes that extend beyond conventional boundaries including opening new lines of research solving a long existing problem or producing tools or products that have become widely adopted in practice by industry and/or government davis received the award for developing novel methods for solving graph problems using linear algebra and creating widely used algorithms and software for sparse matrix computations jafari was recognized for pioneering context-aware physiological monitoring devices based on wearable computers davis is a professor in the department of computer science and engineering at texas a&m university his research is focused on two main areas: sparse linear algebra on graphics processing units (gpus) and methods for solving graph algorithms in the language of sparse linear algebra over semirings gpus provide the promise of high performance and lower energy use but they work best on very regular problems the challenge is to map the irregular nature of sparse matrix algorithms to perform well on gpu architectures davis many honors include receiving the 2018 walston chubb award for innovation and a dean of engineering excellence award from the college of engineering he was elected as a fellow of the society for industrial and applied mathematics for "contributions to sparse matrix algorithms and software including the university of florida sparse matrix collection" he also is a fellow of the association for computing machinery and the institute of electrical and electronics engineers dr davis work on sparse matrices and graphblas is extremely impactful and found in a variety of products and tools said dr scott schaefer department head and holder of the lynn ‘84 and bill crane ‘83 department head chair in computer science and engineering the interest from industry in his recent work has been extraordinary jafari is a professor with joint appointments in the department of biomedical engineering the department of computer science and engineering and the department of electrical and computer engineering he is also the director of the embedded signal processing lab his research focuses on wearable computer design and signal processing with applications in health care wellness and enhancing productivity and safety of the users he has received numerous honors and awards including several best paper awards and is a presidential impact fellow at texas a&m a recipient of the national science foundation career award and a fellow of the american institute for medical and biological engineering the potential for dr jafaris work on wearable computers to monitor health status and predict infections or other negative events is tremendous said dr mike mcshane department head and holder of the james cain professorship ii in biomedical engineering of course the ongoing coronavirus pandemic has heightened interest in developing such advanced tools for the welfare of all
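the "graph problems in the language of linear algebra" idea behind davis' graphblas work can be shown in miniature: one step of breadth-first search is a matrix-vector product where addition acts like logical or and multiplication like logical and; this is a generic textbook sketch not code from davis' libraries

    import numpy as np

    # adjacency matrix of a small invented directed graph: A[i, j] = 1
    # means an edge from node i to node j
    A = np.array([
        [0, 1, 0, 0],
        [0, 0, 1, 1],
        [0, 0, 0, 1],
        [0, 0, 0, 0],
    ], dtype=np.int8)

    frontier = np.array([True, False, False, False])  # start bfs at node 0
    visited = frontier.copy()
    level = 0
    while frontier.any():
        # one bfs step: nodes reachable from the frontier, minus visited
        reached = (A.T @ frontier.astype(np.int8)) > 0
        frontier = reached & ~visited
        visited |= frontier
        if frontier.any():
            level += 1
            print(f"level {level}: nodes {np.flatnonzero(frontier).tolist()}")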
dr satish bukkapatnam director of the institute for manufacturing systems has been awarded the george l smith international award for excellence in the promotion of industrial engineering this award given by the institute of industrial and systems engineers (iise) is presented to an individual who has made significant contributions to the discipline and exemplifies being a goodwill ambassador for the profession bukkapatnam has traveled to various educational institutions and developed many programs and formal partnerships particularly in france and india after assuming the role of the director of the texas a&m engineering experiment stations (tees) institute for manufacturing systems in 2014 he also worked with contacts from the technocentre henri-fabre on the am2 transatlantic partnership led by texas a&m university and tees and the arts et métiers institute of technology to develop a vigorous research portfolio with a main focus on smart manufacturing and artificial intelligence for the fourth industrial revolution or industry 4.0 it is an honor to be chosen for this award i see this as a recognition of the bonds of friendship and partnership our colleagues from france and india had forged with us bukkapatnam said in particular i wish to thank arts et metiers and professor el mansori for the untiring efforts to build the am2 partnership i also wish to thank professors tiwari ramesh babu and satyanarayana for enabling vigorous interactions with our colleagues from indias premier institutions especially the indian institutes of technology and national institute of industrial engineering over the span of 14 years bukkapatnam offered internships to 14 undergraduate students hosted two bose scholars recruited five graduate students (one of whom is a current faculty member at arizona state university) mentored both students and early-career faculty and co-organized three major workshops to promote emerging topics in smart manufacturing "dr bukkapatnam is the most deserving of this recognition by the iise it is a testimony to his great efforts in promoting research and education in smart manufacturing globally said dr eyad masad executive director of global initiatives for tees his impact encompasses the full spectrum: joint research programs exchange of students workforce development and development of unique infrastructure for manufacturing in collaboration with other institutions"
the institute for manufacturing systems (ims) serves as the focus of manufacturing research outreach and education for the texas a&m engineering experiment station (tees) the ims embodies texas a&m universitys land-grant mission by providing texas industry and the texas engineering community with direct access to the manufacturing and technology expertise of the texas a&m university system
artificial intelligence (ai) continues to be a growing part of our everyday lives as we become more accustomed to seeing autonomy replacing even the most mundane tasks the military is no exception as they continue to prioritize the safety of soldiers in combat dr thomas ferris associate professor in the wm michael barnes department of industrial and systems engineering at texas a&m university is working with the crew optimization and augmentation technologies (coat) program a project under the us army futures command (afc) to support human crew members in future ground vehicle systems in addition to developing and testing advanced driver displays and interfaces coat research seeks to integrate automation with ai properties into the vehicle cockpit in order to help reduce the size of manned crews this means fewer soldiers put in harms way but brings unique challenges to balancing workload across system elements and maintaining necessary performance levels this project looks at sharing jobs between humans and autonomy this requires knowing which jobs are best suited for humans and which for the ai agents and how to ensure mission effectiveness of the human-vehicle system as responsibilities and roles change ferris said if there is just a driver and that is his only job thats easier to understand and study now we need to think about what else does the driver do when hes not driving because the ai agent can take over at times and when and how does the human know to regain responsibility of the driving
the afc focuses on the modernization of the us army and longer-term future technologies and systems one aspect of this project focuses on providing sensor data so onboard soldiers can maintain awareness of their surroundings or situation awareness but in a way that minimizes safety risks in the past soldiers would typically pop their head out of the top of the vehicle to visually survey their surroundings in real time which exposes the soldiers to enemy fire going forward and with the help of this research soldiers will have the ability to see all external activity from inside the vehicle and be protected by its armor thus making their missions substantially safer some technologies being investigated include video cameras that capture surroundings in real time and feed that data back to the soldiers inside the vehicle soldiers inside the vehicle cockpit could be using these feeds to drive the vehicle they are currently passengers in or drive a completely separate remote-controlled vehicle ultimately the soldiers task responsibilities need to be flexible in order to support flexible management of the vehicle and thus safer and more effective mission performance
in a recent evaluation experiment test subjects drove military vehicles through eight different courses at camp grayling a military base in central michigan with each course requiring a different driving function to determine how the subjects perform using display configurations that included helmet-mounted and vehicle-mounted visual displays
the next steps in the research include incorporating more automation into the task flow for soldiers operating the vehicle in order to allow them to focus their efforts on other tasks and ultimately optimize the safety and performance of the system then the question becomes for the human crewmembers how can we optimize performance on the tasks that theyre responsible for ferris said can we bring in artificial intelligence and can we bring in automation to make some tasks easier without sacrificing overall mission objectives so that the humans can essentially handle the responsibilities of an entire vehicle with a reduced crew size ferris role in this research began when he connected with chris mikulski the principal investigator and coat test lead as part of conversations to connect researchers at texas a&m and the afc he has since enjoyed a collaboration with a broad group of distinguished engineers and scientists in the military and academic sectors his long-term goal in working with coat is to develop reliable means of assessing soldier cognitive workload during vehicle operations and using that assessment to inform which and how mission-relevant data are displayed to soldiers what is interesting to me about this is that this is a program where theres all of the depths of resource that the us military can offer to explore the future of ground transportation ferris said i always want to make sure that where my contributions are going are toward safer and better systems for soldiers and i feel this is both an exciting and noble research effort
dr r stanley williams hewlett packard enterprise company chair professor in the department of electrical and computer engineering at texas a&m university and director of the hewlett packard enterprise center for computer architecture research is ranked in the top tier of scientists in the world in the field of computer science and electronics in the guide2research 2021 top scientists ranking out of 1 000 scientists and researchers williams ranked 115th in the world and 80th in the united states williams has performed research in nano-electronics -ionics and -photonics and how to utilize the nonlinear dynamical properties of matter to perform computation efficiently before joining texas a&m in 2018 he was a senior fellow and senior vice president at hewlett-packard labs where he led the group that developed the first intentional solid-state version of leon chua's memristor prior to this he was a member of the technical staff at bell labs before joining the faculty at the university of california los angeles where he served as a chemistry professor for 15 years he was named one of the top 10 visionaries in the field of electronics by ee times and has received awards in chemistry applied physics and nanotechnology williams has been issued more than 230 us patents published more than 450 peer-reviewed papers and presented hundreds of invited plenary keynote and named lectures at international scientific technical and business events the guide2research 2021 top scientists ranking is based on the h-index metric provided by google scholar and dblp and includes only leading scientists with an h-index of at least 40 williams has over 67 000 citations and an h-index of 116 i am gratified that so many researchers have found my work to be sufficiently interesting and useful to be cited in their publications williams said
batteries are a part of everyday modern life powering everything from laptops phones and robot vacuums to hearing aids pacemakers and even electric cars but these batteries potentially pose safety and environmental risks in a study recently published in cell reports physical science researchers at texas a&m university investigated the components of a different kind of battery a metal-free water-based battery which would reduce the flammable nature of standard batteries and decrease the number of metal elements used in their production most batteries are li-ion and contain lithium and cobalt which are globally strategic elements meaning they are located only in certain countries but essential to the global economy and united states battery manufacturing "this work enables the future design of metal-free aqueous batteries" said dr jodie lutkenhaus professor and axalta coating systems chair in the artie mcferrin department of chemical engineering at texas a&m "by going metal-free we can address the pressing global demand for strategic metals used in batteries and by going aqueous we replace the flammable volatile electrolyte with water" using a very sensitive measurement technique called electrochemical quartz crystal microbalance with dissipation monitoring researchers were able to determine how electrons ions and water transfer in the electrode as it is charged and discharged "with this information we showed that enhanced electrode-water interactions lead to improved energy storage performance" she said the energy storage capacity was lower than that of traditional li-ion batteries but this paves the way for a more sustainable and less volatile battery in the future
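as background on the measurement: a quartz crystal microbalance senses mass through the drop in the crystals resonance frequency and the textbook sauerbrey relation (a general property of the technique and not a result reported in this paper) converts the frequency shift at overtone n into an areal mass change

\[ \Delta m = -\,C\,\frac{\Delta f_n}{n} \qquad C \approx 17.7\ \mathrm{ng\,cm^{-2}\,Hz^{-1}}\ \text{for a 5 MHz at-cut crystal} \]

the dissipation channel flags when a film is too soft and water-laden for this rigid-film approximation to hold which is what lets the method track water transfer alongside ions and electrons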
the research is in its initial stages and there's opportunity for various applications in the real world one particular potential is implantable batteries for medical devices lutkenhaus' interest began when she learned about the strain on strategic elements such as lithium and cobalt due to increased battery manufacturing "by using completely different materials such as we do with polymers here we remove metals from the picture completely" she said "my favorite aspect of this work is our ability to deeply characterize the molecular transport processes associated with this redox polymer only in the last few years have we been able to resolve such effects on this time and mass scale" for the future lutkenhaus said they will need to identify more polymers that are compatible with the design "once we have that we can produce a high-performance full-cell for practical use" she said this project is supported by the us department of energy-basic energy sciences program
the introduction of lithium-ion (li-ion) batteries has revolutionized technology as a whole leading to major advances in consumer goods across nearly all sectors battery-powered devices have become ubiquitous across the world while the availability of technology is generally a good thing the rapid growth has led directly to several key ethical and environmental issues surrounding the use of li-ion batteries current li-ion batteries utilize significant amounts of cobalt which in several well-documented international cases is mined using child labor in dangerous working environments additionally only a very small percentage of li-ion batteries are recycled increasing the demand for cobalt and other strategic elements a multidisciplinary team of researchers from texas a&m university has made a breakthrough that could lead to battery production moving away from cobalt in an article published in the may issue of nature dr jodie lutkenhaus axalta coating systems chair and professor in the artie mcferrin department of chemical engineering and dr karen wooley distinguished professor in the department of chemistry and holder of the wt doherty-welch chair in chemistry in the college of science outline their research into a new battery technology platform that is completely metal free this new battery technology platform utilizes a polypeptide organic radical construction "by moving away from lithium and working with these polypeptides which are components of proteins it really takes us into this realm of not only avoiding the need for mining precious metals but opening opportunities to power wearable or implantable electronic devices and also to easily recycle the new batteries" said wooley recently honored as the 2021 sec professor of the year "they [polypeptide batteries] are degradable they are recyclable they are non-toxic and they are safer across the board" the all-polypeptide organic radical battery composed of redox-active amino-acid macromolecules also solves the problem of recyclability the components of the new battery platform can be degraded on demand in acidic conditions to generate amino acids other building blocks and degradation products which according to lutkenhaus is one of the major breakthroughs in this research "the big problem with lithium-ion batteries right now is that they're not recycled to the degree that we are going to need for the future electrified transportation economy" lutkenhaus added "the rate of recycling lithium-ion batteries right now is in the single digits there is valuable material in the lithium-ion battery but it's very difficult and energy intensive to recover" the development of a metal-free all-polypeptide organic radical battery that degrades on demand marks significant progress toward sustainable recyclable batteries that minimize dependence on strategic metals as a next step wooley and lutkenhaus have begun working in collaboration with dr daniel tabor assistant professor in the department of chemistry through a 2020 texas a&m triads for transformation (t3) grant that aims to utilize machine learning to optimize the materials and structure of the battery platform
the lead authors on the paper are tan nguyen a current postdoctoral associate at the university of michigan and former doctoral student from the texas a&m department of chemistry and alexandra easley a doctoral student in the department of materials science and engineering at texas a&m this work was financially supported by the national science foundation the welch foundation and the us department of energy office of science
dr jeyavijayan jv rajendran assistant professor in the department of electrical and computer engineering at texas a&m university is partnering with intel corporation for the defense advanced research projects agency (darpa) structured array hardware for automatically realized applications (sahara) project the three-year partnership enables the design of custom chips that include advanced security countermeasure technologies for widespread applications including government security there are two well-understood processor technologies in the semiconductor industry first are field-programmable gate arrays (fpgas) which provide basic functionality that can be modified post-production while this means better security in the supply chain because the manufacturer has no glimpse into the design being implemented it comes at the cost of lower performance and higher power consumption second are application specific integrated circuits (asics) which provide fixed functionality meaning the design cannot be modified post-manufacturing unlike in fpgas the manufacturers of asics are provided the design that is being implemented while this may pose security risks in the supply chain it allows for superior performance less power consumption and less area overall the sahara project which began in december 2020 is facilitating the automated conversion of fpga designs into secure asics to not only strengthen the security but also improve overall processor performance what intel is doing with this asic technology is they are taking the best of both worlds where you can have the configurability of fpga style but close to asic-like performance rajendran said intels structured asics are called easics an intermediate technology between fpgas and standard-cell asics
the goal of the sahara program is to utilize structured asics to meet the performance and security needs of the electronic components used in diverse department of defense applications said kostas amberiadis asic design engineer at intel corporation to accomplish this goal intel will develop a version of its easic™ technology with added security and ip (intellectual property) protection while significantly automating the whole design flow to drastically reduce its development time especially when converting from fpgas to strengthen chip security the sahara project will also explore reverse engineering countermeasures to prevent potential counterfeiting attacks fpgas are widely used in military applications today but the performance and efficiency that structured asics deliver offer a promising look into the future sahara aims to enable a 60% reduction in design time a 10-times reduction in engineering costs and a 50% reduction in power consumption by automating the fpga-to-structured asics conversion said serge leef a program manager in darpas microsystems technology office in a press release announcing the project because of the nature of the program and the nature of the chips that we are trying to protect this project will not only bolster the semiconductor industry but will also have widespread impact in industries such as the smart grid and other critical infrastructure elements rajendran said rajendrans students and postdoctoral researchers are also working closely with intel on this project and receiving invaluable experience at this stage of their academic and professional careers to bridge the gap between academia and industry rajendran has worked with darpa in the past on partnerships such as the obfuscated manufacturing for gps program and the ongoing automatic implementation of secure silicon program this research was in part funded by the us government the views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies either expressed or implied of the us government
there is nothing glamorous about infrastructure even these days as congress debates the meaning of the word however modern high-quality infrastructure is critical to a vital strategic goal shared by the texas a&m university system and the us army: to build a world-class ecosystem for military technology innovation on the rellis campus its called the george hw bush combat development complex (bcdc) the texas a&m system board of regents took an important step toward this shared goal by appropriating $13.1 million for infrastructure improvements on the west side of rellis the improvements will undergird two testing ranges for next-generation technology: the innovation proving ground (ipg) and the ballistic aero-optics and materials (bam) test range the infrastructure package includes basic improvements water sewer and electrical power to areas around the runways of the former army and air force base it also includes fiber cabling to fully support 5th-generation (5g) internet capabilities 5g is really important to our partners and potential partners said ross guieb a retired army colonel serving as bcdc executive director the intel community dod (department of defense) and defense industry leaders are all watching closely with interest and excitement army commanders and other us military leaders eagerly await the completion of the bcdc over the next several years the $200 million complex is the result of a partnership between the us army futures command and the texas a&m system texas a&m engineering experiment station and the state of texas the bcdc includes the ipg the bam and other facilities that will bring together researchers from us universities the military and the private sector for collaboration demonstrations and high-tech testing of military prototypes the regents also approved three amendments to enhance bams instrumentation for research and testing bam will host enclosed testing of hypersonic vehicles directed-energy beams and the impact that hypersonic blasts have on various materials the changes improve the tubes rail guidance system add blast target tanks and a soft catch assembly that will safely recover flown objects for post-flight analysis and data collection the combined cost of the changes is $3.5 million bringing the total estimated cost to about $42.5 million at one kilometer long and 2.5 meters in diameter bam will be the nations largest enclosed hypersonic test range it will bridge a critical gap in us research capacity between lab-scale experiments and open-range tests which can cost tens of millions of dollars per test
the texas a&m university system board of regents has named dr john hurtado interim vice chancellor and dean of the college of engineering at texas a&m university and interim agency director of the texas a&m engineering experiment station (tees) he will assume the duties of the position june 1 hurtado will oversee administration of the college and agency while a national search is conducted to fill the positions hurtado joined the department of aerospace engineering at texas a&m in 2001 and serves as deputy director and chief technology officer for the bush combat development complex and professor of aerospace engineering his research areas include aerospace systems and robotics and his work is being used at nasa and sandia national laboratories his patented algorithms were developed for unique miniature robots that the smithsonian institution recently obtained for its permanent collection in the national museum of american history he earned his bachelors degree in aerospace engineering from san diego state university and his masters and doctorate in aerospace engineering from texas a&m hurtado replaces outgoing vice chancellor of engineering and national laboratories dean of the texas a&m college of engineering and agency director for tees dr m katherine banks who was named to these positions in 2011 and was appointed the 26th president of texas a&m in march
every day billions of photos and videos are posted to various social media applications the problem with standard images taken by a smartphone or digital camera is that they only capture a scene from a specific point of view but looking at it in reality we can move around and observe it from different viewpoints computer scientists are working to provide an immersive experience for the users that would allow them to observe a scene from different viewpoints but it requires specialized camera equipment that is not readily accessible to the average person to make the process easier dr nima kalantari professor in the department of computer science and engineering at texas a&m university and graduate student qinbo li have developed a machine-learning-based approach that would allow users to take a single photo and use it to generate novel views of the scene the benefit of our approach is that now we are not limited to capturing a scene in a particular way said kalantari we can download and use any image on the internet even ones that are 100 years old and essentially bring it back to life and look at it from different angles further details about their work were published in the journal association for computing machinery transactions on graphics view synthesis is the process of generating novel views of an object or scene using images taken from given points of view to create novel view images information related to the distance between the objects in the scene is used to create a synthetic photo taken from a virtual camera placed at different points within the scene over the past few decades several approaches have been developed to synthesize these novel view images but many of them require the user to manually capture multiple photos of the same scene from different viewpoints simultaneously with specific configurations and hardware which is difficult and time-consuming however these approaches were not designed to generate novel view images from a single input image to simplify the process the researchers have proposed doing the same process but with just one image "when you have multiple images you can estimate the location of objects in the scene through a process called triangulation" said kalantari "that means you can tell for example that there's a person in front of the camera with a house behind them and then mountains in the background this is extremely important for view synthesis but when you have a single image all of that information has to be inferred from that one image which is challenging" with the recent rise of deep learning which is a subfield of machine learning where artificial neural networks learn from large amounts of data to solve complex problems the problem of single image view synthesis has garnered considerable attention despite this approach being more accessible for the user it is a challenging application for the system to handle because there is not enough information to estimate the location of the objects in the scene to train a deep-learning network to generate a novel view based on a single input image they showed it a large set of images and their corresponding novel view images although it is an arduous process the network learns how to handle it over time an essential aspect of this approach is to model the input scene to make the training process more straightforward for the network to run but in their initial experiments kalantari and li did not have a way to do this "we realized that scene representation is critically important to effectively train the network" said kalantari
to make the training process more manageable the researchers converted the input image into a multiplane image which is a type of layered 3d representation first they broke down the image into planes at different depths according to the objects in the scene then to generate a photo of the scene from a new viewpoint they moved the planes in front of each other in a specific way and combined them using this representation the network learns to infer the location of the objects in the scene to effectively train the network kalantari and li introduced it to a dataset of over 2 000 unique scenes that contained various objects they demonstrated that their approach could produce high-quality novel view images of a variety of scenes that are better than previous state-of-the-art methods the researchers are currently working on extending their approach to synthesize videos as videos are essentially a bunch of individual images played rapidly in sequence they can apply their approach to generate novel views of each of those images independently at different times but when the newly created video is played back the picture flickers and is not consistent we are working to improve this aspect of the approach to make it suitable to generate videos from different viewpoints said kalantari the single image view synthesis method can also be used to generate refocused images it could also potentially be used for virtual reality and augmented reality applications such as video games and various software types that allow you to explore a particular visual environment the project was funded in part by a grant awarded by the texas a&m triads for transformation seed-grant program
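the multiplane-image representation described above is simple enough to sketch: below is a minimal back-to-front compositor, assuming each plane is an rgba layer with a fixed disparity and approximating a new viewpoint by a horizontal shift of each plane (the function names the integer-pixel warp and the horizontal-only camera motion are illustrative simplifications and not the authors' implementation)

import numpy as np

def render_novel_view(planes, disparities, dx):
    # planes: list of (H, W, 4) rgba float arrays ordered back to front
    # disparities: per-plane shift in pixels per unit of camera offset
    # dx: horizontal camera offset for the novel view
    h, w, _ = planes[0].shape
    out = np.zeros((h, w, 3))
    for plane, d in zip(planes, disparities):
        shift = int(round(d * dx))              # nearer planes (larger d) move more
        shifted = np.zeros_like(plane)
        if shift >= 0:
            shifted[:, shift:] = plane[:, :w - shift]
        else:
            shifted[:, :shift] = plane[:, -shift:]
        rgb, alpha = shifted[..., :3], shifted[..., 3:]
        out = alpha * rgb + (1 - alpha) * out   # standard "over" compositing
    return out

in the published method the hard part is inferring the layers from a single photo which is what the trained network does; once the planes exist rendering a new view is essentially this cheap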
a team of texas a&m university researchers is analyzing how a network of localized nodes can implement machine-learning applications such as object recognition in a distributed fashion the research team includes dr alfredo garcia professor in the wm michael barnes 64 department of industrial and systems engineering and dr jeff huang associate professor in the department of computer science and engineering this proposed methodology stands as an alternative to the widely acknowledged federated learning approach federated learning is a machine-learning technique used for training models across multiple decentralized edge devices or servers that hold local data samples without exchanging them since its inception the federated learning approach has been a more effective method than traditional centralized machine-learning techniques where all of the local data sets are uploaded to one server whats really exciting in this research is that it shows a robust learning approach for learning models from heterogeneous data streams which are becoming ubiquitous in the real world huang said this research focuses on a more robust alternative to federated learning by considering the approach in which each node periodically updates its own model based upon local data and a network regularization penalty so each node checks in with neighboring nodes every so often to make sure its own model is not too offbeat from that of its neighbors a node is a piece of the network in charge of training a model to put this into perspective there can be millions of nodes processing information at the same time within a matter of seconds nodes will share data with the server but either cant or wont share data with other nodes in a federated learning implementation participating devices need to only periodically communicate parameter updates to a central node where the model parameters are stored however when data streams are heterogeneous both in data rate and quality the model identified by federated learning may not be of the highest quality when the data streams with higher data rates also have lower precision there is a good chance nodes producing parameter updates at the fastest pace do not have the highest quality updates you also run the risk of being exposed to bad data or noise that comes from bad nodes for example photos coming from the latest iphone model with a high-quality camera will have different data quality than photos coming from an iphone 5 federated learning is useful when streaming data across devices is housed in differing geographic locations however there is a downside when there is significant communication overhead and data cannot be transferred to a single location in a timely fashion this is namely the case for high-resolution video in this particular scenario assembling a diverse batch of data points in a central processing location to update a model involves significant latency and may ultimately not be practical in follow-up work with his team garcia is examining the application of the network approach to multitask learning where different nodes do not share the same learning objective or task local model exchange exploits similarities between different tasks to provide better learning outcomes
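a minimal sketch of the per-node update described above, assuming the network regularization penalty is quadratic in the gaps between neighboring models (the names learning rate and penalty form are our assumptions rather than the authors' exact formulation)

import numpy as np

def node_update(w, local_grad, neighbor_models, lr=0.01, lam=0.1):
    # local_grad(w): gradient of this node's loss on its own data stream
    # the penalty gradient comes from (lam / 2) * sum_j ||w - w_j||^2
    # which pulls the node toward its neighbors without sharing raw data
    penalty_grad = sum(w - wj for wj in neighbor_models)
    return w - lr * (local_grad(w) + lam * penalty_grad)

each node runs this periodically with whatever neighbor models it last received so a node fed fast but noisy data is reined in by better-calibrated neighbors which is the robustness argument made above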
scientists from texas a&m university have developed an extension to an ordinary cellphone that turns it into an instrument capable of detecting chemicals drugs biological molecules and pathogens the advance is reported in review of scientific instruments by aip publishing modern cellphones include high-quality cameras capable of detecting low levels of light and eliminating digital noise through software processing of the captured images recent work has taken advantage of this sensitivity to produce cellphone cameras that can be used as portable microscopes and heart rate detectors the current advance is based on two types of spectroscopy one type known as fluorescence spectroscopy measures the fluorescent light emitted by a sample another known as raman spectroscopy is useful for detecting molecules such as dna and rna that do not fluoresce or emit light at very low intensities both types were used to develop this cellphone detector the system includes an inexpensive diode laser as a light source oriented at right angles to the line connecting the sample and the cellphone camera the right-angle arrangement prevents back-reflected light from entering the camera "in addition this right-angle excitation geometry has the advantage of being easier to use for the analysis of samples where a bulk property is to be measured" said author dr peter rentzepis the investigators studied a variety of samples using their constructed cellphone detector including common solvents such as ethanol acetone isopropyl alcohol and methanol they recorded the raman spectra of solid objects including a carrot and a pellet of bacteria carrots were chosen for this study because they contain the pigment carotene the laser light used in their system has a wavelength that is easily absorbed by this orange pigment and by pigments in the bacteria the investigators compared the sensitivity of their system to the most sensitive industrial raman spectrometers available the ratio of signal to noise for the commercial instrument was about 10 times higher than the cellphone system the sensitivity of the cellphone detector could however be doubled by using a single rgb channel for analysis the system has a rather limited dynamic range but the investigators note that this problem can be easily overcome through several hdr or high dynamic range applications that combine images from multiple exposures the additional components including the laser add a cost of only about $50 to the price of a typical cellphone making this system an inexpensive but accurate tool for detecting chemicals and pathogens in the field
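as a rough illustration of the image-processing side here is one way to collapse a stack of camera frames of the dispersed light into a 1d spectrum using a single rgb channel as the article suggests; the axis orientation channel choice and function names describe a generic setup and are assumptions not the authors' code

import numpy as np

def spectrum_from_frames(frames, channel=0):
    # frames: (N, H, W, 3) rgb frames of the dispersed laser light
    # using one color channel avoids mixing the three color filters'
    # different spectral responses
    stack = frames[..., channel].astype(float)
    mean_frame = stack.mean(axis=0)   # averaging N frames suppresses shot noise ~ sqrt(N)
    return mean_frame.sum(axis=0)     # collapse rows: intensity along the wavelength axis

combining frames taken at several exposure times in the same way is the hdr trick mentioned above for stretching the limited dynamic range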
ah that all-too-familiar ache at the back of the neck with roughly 80% of jobs being sedentary and often requiring several hours of sitting stooped in front of a computer screen neck pain is a growing occupational hazard smartphones and other devices have also caused people to bend their necks for prolonged periods but is bad posture solely to blame in a recent study researchers at texas a&m university have found that while poor neck and head postures are indeed the primary determinants of neck pain body mass index age and the time of the day also influence the necks ability to perform sustained or repeated movements neck pain is one of the leading and fastest-growing causes of disability in the world said dr xudong zhang professor in the wm michael barnes '64 department of industrial and systems engineering our study has pointed to a combination of work and personal factors that strongly influence the strength and endurance of the neck over time more importantly since these factors have been identified they can then be modified so that the neck is in better health and pain is avoided or deterred the results of the study are published online in the journal human factors a flagship journal in the field of human factors and ergonomics according to the global burden of disease study by the institute for health metrics and evaluation neck pain is ranked as the fourth leading cause of global disability one of the main reasons for neck pain has been attributed to lifestyle particularly when people spend long durations of time with their necks bent forward however zhang said a systematic quantitative study has been lacking on how personal factors such as sex weight age and work-related habits can affect neck strength and endurance for their experiments zhang and his team recruited 20 adult men and 20 adult women with no previous neck-related issues to perform controlled head-neck exertions in a laboratory setting further instead of asking the participants to hold a specific neck posture for a long time similar to what might happen at a workplace they performed sustained-till-exhaustion head-neck exertions in the laboratory conducting experiments where subjects do long tasks with their neck can take several hours of data collection which is not very practical for the experimenters and of course the participants in our study said zhang to solve this problem our experiments were strategically designed to mimic workplace neck strains but in a shorter period of time in these exercises subjects were seated and asked to put on an augmented helmet that allowed them to exert measurable force with the neck then the researchers asked them to either keep their necks straight or maintain their neck tilted in a forward or backward position in this position a force was applied to their head and neck on an adjustable frame this exertion was either to their maximum capacity or half of it before testing the researchers noted their subjects age body mass index and the time of day when zhang and his team analyzed their data they found that as expected work-related factors like head/neck posture play a very important role in determining both neck strength and endurance but they also observed that while there was no significant difference between male and female subjects in neck endurance body mass index was a significant predictor of neck endurance also to their surprise the time of day affected the necks ability to sustain an exertion without fatigue
it is intuitive to think that over the course of the day our necks get more tired since we use it more said zhang but roughly half of our participants were tested in the morning and the remaining in the afternoon also some of the participants had day jobs and some worked the night shift despite this we consistently found the time-of-day effect on neck endurance the researchers said their database of neck strength and endurance is also necessary for building advanced musculoskeletal biomechanical models of the neck which can then be used to for example tease apart specific neck muscles that are more vulnerable to injury looking ahead we might have the data to begin evaluating if patients recovering from neck injuries are ready to return to work based on whether their neck strength and endurance are within the norm said zhang also engineers and designers could utilize our data to make wearable devices like helmets that are more ergonomic and less stressful on the neck other contributors to this work include dr suman chowdhury from texas tech university and yu zhou bocheng wan and curran reddy from the industrial and systems engineering department this research is funded by the national institute for occupational safety and health part of the centers for disease control and prevention
will it be possible to design materials that are unfazed by extreme temperatures in the near future in a study published in the journal npj computational materials researchers at texas a&m university have described a computational tool to evaluate a materials suitability for high-temperature applications such as gas turbines for jet engines and electrical power generators the computational framework which incorporates artificial intelligence and basic physics can forecast how materials will behave under harsh conditions in a fraction of the time compared to other algorithms we have used an innovative and interdisciplinary approach for screening materials that is a million times faster than traditional techniques said dr raymundo arróyave professor in the department of materials science and engineering at texas a&m university and corresponding author on the study currently these types of calculations even for a small temperature above absolute zero are an enormous challenge because they are computationally expensive since the late 1800s gas turbines have been the workhorse of power generation this drum-shaped machine lined with a series of bent or curved blades converts chemical energy from burning fuel into mechanical energy when the turbines blades rotate this motion is then exploited either to propel an aircraft or generate electricity gas turbines operate in high-temperature corrosive conditions making them prone to damage and progressive deterioration and so designing materials that can withstand extreme temperatures has been an ongoing pursuit among an array of high-temperature tolerant materials ceramics known as max phases are known to have properties that bridge the gap between conventional ceramics and metals in other words they are less brittle than ceramics and have higher temperature tolerance than many metals these materials are ideal candidates for structural components for gas turbines and heat-resistant coatings said dr miladin radovic professor in the materials science and engineering department and a senior author on the study however only a few out of hundreds of possible max phases have been experimentally verified to be high-temperature corrosion and oxidation-resistant
the researchers noted that given the vast number of elements that can be used to make max phases and an even greater number of ways of combining them the task of experimentally verifying how each composite will behave at high temperatures becomes impractical on the other hand computational techniques such as purely machine-learning algorithms have not been as robust at predicting the materials behavior at nonzero temperatures as an alternative to experiments and machine learning physics-based mathematical models offer a rigorous method to evaluate the properties of max phases at different temperatures among these models the most established one called density functional theory can account for the behavior of materials with minimal input data but this theory best applies to materials at their lowest energy state called the ground state to predict their behavior at elevated temperatures more complex and time-consuming calculations are needed these calculations scale very poorly said arróyave for perspective if we want to use density functional theory to calculate the properties of a candidate material at the lowest temperature of zero kelvins that is at the ground state it might take about a day of computational time but now if you want to calculate the same properties at a finite temperature say 1000 kelvins it can take weeks further he noted that predicting the behavior of materials when exposed to oxygen at elevated temperatures is more complicated and may take months or longer even when using thousands of supercomputer processors at a time hence instead of relying solely on just one method arróyave and his team used a three-pronged approach that included a combination of density functional theory machine learning and computational thermodynamics the researchers first calculated some fundamental properties of max phases at zero kelvins with density functional theory next those calculations were used as inputs to a machine-learning model in this way the researchers replaced otherwise computationally expensive calculations from density functional theory with machine-learning models then they used computational thermodynamics to determine the most stable compounds for a given temperature and a certain max phase composition lets consider a max phase made of titanium aluminum and carbon at higher temperatures we could have for example carbon dioxide carbon monoxide and other combinations of carbon and oxygen that might compete to exist said arróyave using our framework one can now determine which phases or combinations we can expect at that temperature how much of it and whether that can be detrimental simply put we can now quickly tell whether the material will decompose at a given temperature the researchers noted that although they tested their computational framework on a few candidate max phases the algorithm can be used for gauging the behavior of other existing or novel materials as well the research will help in rapidly ruling out those elements that might form unstable oxides at the material design phase said arróyave we can then use these materials to build superior gas turbines and other machines that can withstand even the harshest environmental conditions with minimal wear and tear over time these high-performance turbines will benefit not just the aviation and energy industry but also consumers who will see reduced costs this research is funded by the designing materials to revolutionize and engineer our future grant from the national science foundation
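the three-pronged workflow lends itself to a short caricature: ground-state dft data train a fast surrogate and a simplified thermodynamic comparison then decides which phases survive at temperature every name and number below is a placeholder and the stability check is far cruder than a real gibbs energy minimization

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# step 1: descriptors and 0 k dft formation energies (placeholder random data)
X = np.random.rand(500, 8)   # composition/structure descriptors
y = np.random.rand(500)      # dft formation energies in ev/atom

# step 2: a learned surrogate replaces further expensive dft calls
surrogate = RandomForestRegressor(n_estimators=300).fit(X, y)

def survives(candidate, competing_free_energies, temperature_correction):
    # step 3 heavily simplified: the candidate persists only if its
    # estimated free energy undercuts every competing phase at this
    # temperature; production codes minimize gibbs energy over mixtures
    g = surrogate.predict(candidate[None])[0] + temperature_correction
    return g < min(competing_free_energies)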
dr karim ahmed assistant professor in the department of nuclear engineering at texas a&m university will collaborate with dr anders david ragnar andersson at los alamos national laboratory (lanl) to conduct research to better understand the breakdown of nuclear fuel in order to extend the nuclear fuel cycle the heavy metals that compose the fuel must have a targeted high burnup burnup is a measure of how much uranium is burned in the reactor the faster the energy is extracted from a nuclear source the more efficient the reactor runs this reduces the downtime for refueling as well as the number of fresh nuclear fuel elements needed at high burnup values however the fuel sometimes fragments presenting a potential technical challenge that is poorly understood this is where ahmed and his research team come in they were awarded a developmental fellowship as part of the 2019-20 edition of the texas a&m university system national laboratories office collaborative research program with lanl to pursue this issue we are developing a physics-based multiscale modeling approach to understand this fuel fragmentation phenomenon said ahmed while other research groups at texas a&m and lanl have developed independent submodels to look at different aspects of the complicated physical process of fuel fragmentation ahmeds project is the first step in integrating the models the current regulatory limit of fuel peak burnup sits at 62 gigawatt-days per metric ton of uranium but the united states nuclear industry is considering increasing this limit to improve the economy and efficiency of electricity production before this extension can be approved the nuclear regulatory commission will likely require nuclear power plants to analyze a number of potential operational occurrences as well as their consequences a major factor in such analyzed scenarios is the behavior of fuel rods at high burnup we hope that our work will provide the nuclear community with guidelines to assess the most limiting conditions and possible mitigation strategies for safely extending the fuel peak burnup said ahmed ahmeds lanl collaborator andersson is an expert on atomistic modeling of nuclear materials as a long-term goal we plan on investigating the structure-composition-property relationships in nuclear materials through integrating physics-based multiscale models guided by the principles of integrated computational materials science and engineering he said
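to make the burnup numbers concrete: burnup is the thermal energy released per unit mass of heavy metal so the regulatory limit translates directly into time at power; the 3-gigawatt core and 100-metric-ton uranium load below are invented round numbers for illustration

\[ \mathrm{BU} = \frac{P_{\mathrm{th}}\,t}{m_{\mathrm{HM}}} \quad\Longrightarrow\quad t = \frac{62\ \mathrm{GWd/tU} \times 100\ \mathrm{tU}}{3\ \mathrm{GW_{th}}} \approx 2067\ \mathrm{days} \approx 5.7\ \mathrm{years} \]

so raising the allowed peak burnup stretches the interval between refueling outages which is the economic motivation described above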
the texas a&m university system national laboratories office (nlo) was formed by the chancellor to be a conduit for expanding engagement with the national laboratories for faculty staff and students of the a&m system this office engages with all department of energy and national nuclear security administration laboratories and sites the nlo has developed a multi-element program to help texas a&m system researchers develop collaborative ties with researchers at los alamos national laboratory (lanl); execute the texas a&m system and lanl collaborative research projects; and formalize long-term relationships where appropriate such as through joint appointments
dr zachary grasley has been named the new head of the zachry department of civil and environmental engineering at texas a&m university effective sept 1
grasley has been the director of the center for infrastructure renewal (cir) at texas a&m since january 2018
i am really excited for the opportunity to lead this dynamic department with such talented and hard-working staff students and faculty he said
while directing the cir he helped facilitate the initiation and continued development of the research labs and established significant collaborations with key external partners grasleys research spans from fundamental studies on mechanisms and modeling to applied solutions that lead to intellectual property and commercialization
he is a fellow of the american society of civil engineers the american concrete institute and the american ceramic society grasley also has held leadership and service roles in the american concrete institute and the american ceramic society
he originally joined texas a&m in 2006 after completing his bachelors degree in civil engineering at michigan technological university and his masters and doctoral degrees in civil engineering at the university of illinois at urbana-champaign grasley accepted a faculty position at virginia tech in 2012 and returned to texas a&m in 2014
grasley holds the zachry chair for construction integration and is an inaugural presidential impact fellow he is also a professor in the materials science and engineering department
grasley replaces dr robin autenrieth who served in an interim capacity for one year before becoming department head in 2014 and will return to full-time faculty in the fall during her eight-plus-year tenure autenrieth has overseen the growth of the department and academic programs increased endowments and recruited highly talented faculty the environmental engineering degree program was developed under her direction
dr autenrieth has done a great job and set us on the path to success grasley said the department is on an upward trajectory and is poised for major growth in research impact reputation and innovative education i am looking forward to helping shepherd this growth
automated vehicle (av) technology is widely acknowledged as a promising means to prevent crashes increase mobility among drivers and even lower emissions but there is a hesitancy toward autonomy among drivers which texas a&m university researchers are attempting to understand dr anthony mcdonald assistant professor and dr ranjana mehta associate professor in the wm michael barnes 64 department of industrial and systems engineering utilize neuroergonomics to measure and model human driver trust in automated vehicles the team has been awarded a grant for this research by the national science foundation (nsf) one of the unique things about this project is that its focused on dynamic trust between humans and a machine or humans and an automated vehicle in this case mcdonald said some people are more predisposed to trust automation and systems than others our hypothesis is that trust among these people differs over time after interactions with the automation and our goal is to measure those changes objectively neuroergonomics is the study of the brain and behavior at work and is the pillar for driver trust measurements with autonomous vehicles more specifically the research team will use brain imaging and model trust and driver behavior to examine how trust calibration models influence dynamic trust and driving behavior the team will conduct two experiments using the driving simulator in industrial and systems engineering to gather driver behavior data and subjective neural and physiological trust measures to understand how particular regions of the brain communicate with each other or dont when a driver wagers trust the first year of this three-year project will begin with data collection on 60-100 participants we will then be able to see how trust builds over time and capture it using brain signals mehta said in the simulator we can also breach driver trust in the automated vehicle by having the simulators automation fail to overtake another vehicle in the manner expected we are interested to see how similar or different human-automation trust and its neural correlates are in the automated vehicle application space the research team will be able to map communication or lack thereof between different regions of the brain as trust in the av is built broken and repaired over the course of the experiment among drivers who vary in their levels of trust in automation these neural markers can be employed to develop trust detection models that can trigger trust calibration methods to ultimately increase the transparency of the vehicles reliability so that the driver can see and calibrate how much control they want to retain while driving our goal is to make substantial progress in our understanding of dynamic trust laying the groundwork for measuring and modeling it mcdonald said we want to produce fundamental contributions that ultimately lead to safer vehicle technology
the department of industrial and systems engineering at texas a&m university installed a new driving simulator to use in research pertaining to driving and autonomous vehicles it is a one-of-a-kind feature on campus that can be driven manually or autonomously with a 270-degree field of vision due to the many different types of research that might require a driving simulator interdisciplinary teamwork is almost inherent in any project that incorporates this technology
cells sense and respond to the mechanical properties of the cellular microenvironment in the body changes in these properties which occur in a number of human pathologies including cancer can elicit abnormal responses from cells how the cells adapt to such changes in the mechanical microenvironment is not well understood a team of researchers at texas a&m university is working to understand cellular mechanosensing the ability to sense and respond to the mechanical properties of the microenvironment in a unique way dr tanmay lele unocal professor in the department of biomedical engineering the department of chemical engineering and the department of translational medical sciences partnered with dr charles baer an evolutionary biologist at the university of florida together they used methods of experimental cellular evolution as a means to understand cellular adaptation to biomaterials of controlled mechanical properties the experiments were led by doctoral student purboja purkayastha from the artie mcferrin department of chemical engineering and technical laboratory coordinator kavya pendyala from the department of biomedical engineering at texas a&m before our work it was basically unknown if cells would evolve in controlled mechanical environments lele said we set out to test this possibility cells are products from hundreds of millions of years of evolution and their response to environments whether chemical or mechanical has likely evolved through a process of natural selection chemical constraints are well known to exert selection pressure on cell populations but whether the mechanical properties of a cells environment constitutes a significant agent of natural selection has never been investigated before many types of animal cells exhibit "phenotypic plasticity" they look and function differently in different mechanical environments there are two possible explanations for the plasticity of cells in different mechanical environments first the phenotypes may be optimal such that there is no better way for a cell to function in each environment alternatively the plasticity may be a compromise such that the phenotypic trait is optimal for a given mechanical context but suboptimal in other mechanical contexts the teams research demonstrated that cellular mechanosensing is in fact not optimal but a tradeoff using a combination of experimental cellular evolution on biomaterials of controlled stiffness genome sequencing simulations and gene expression analysis the team showed that cells evolve under selection pressure from biomaterials of controlled mechanical stiffness the teams research was recently published in the journal molecular biology and evolution lele said that experimental cell evolution is a good approach to better understand the mechanisms underlying cellular mechanosensing we are currently using experimental cellular evolution to understand how cancer cells which have a great genomic variation respond to the altered mechanical stiffness and other mechanical properties of tumor microenvironments lele said further the fact that cells can be evolved on biomaterials of controlled properties in vitro opens up new ways to generate engineered cells with properties optimized for particular environments
covid-19 caused by the sars-cov-2 virus has plagued our world over the last year in just one year we have lost over half a million americans and an estimated 2.5 million worldwide to this virus the uncertainties about its long-term effects as well as how and where it spreads particularly indoors continue to motivate researchers and scientists to find solutions to contain the virus dr arum han professor in the department of electrical and computer engineering at texas a&m university and his collaborators designed an experimental system to show that exposure of the virus to a very high temperature even if applied for less than a second can be sufficient to neutralize the virus so that it can no longer infect another human host in march 2020 the united states began shutting down when covid-19 cases began to rise over the past year many have dealt with the severity of the virus that has negatively impacted our country in a number of ways with the pandemic still ongoing getting back to a more normal societal environment is important and this research is a step in the right direction applying heat to neutralize covid-19 has been demonstrated before but in previous studies temperatures were applied anywhere from one to 20 minutes this length of time is not a practical solution as applying heat for a long period of time is both difficult and costly han and his team have now demonstrated that heat treatment for less than a second completely inactivates the coronavirus providing a promising and efficient solution to mitigate the ongoing spread of covid-19 particularly through long-range airborne transmission
medistar corporation approached leadership and researchers from the college of engineering at texas a&m in the spring of 2020 to collaborate and explore the possibility of applying heat for a very short amount of time to kill covid-19 soon after han and his team got to work and built a system to investigate the feasibility of such a procedure their process works by heating one section of a stainless-steel tube through which the coronavirus-containing solution is run to a high temperature and then cooling the section immediately afterward this experimental setup allows the coronavirus running through the tube to be heated up only for a very short period of time through this rapid thermal process the team found the virus to be completely neutralized in a significantly shorter time than previously thought possible their initial results were released within two months of proof-of-concept experiments han explained that if the solution is heated to nearly 72 degrees celsius for about half a second it can reduce the virus titer or quantity of the virus in the solution by 100 000 times which is sufficient to neutralize the virus and prevent transmission the potential impact is huge said han i was curious of how high of temperatures we can apply in how short of a time frame and to see whether we can indeed heat-inactivate the coronavirus with only a very short time and whether such a temperature-based coronavirus neutralization strategy would work or not from a practical standpoint the biggest driver was ‘can we do something that can mitigate the situation with the coronavirus’
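the quoted figures imply very fast kinetics: a 100 000-fold drop in titer is a 5-log reduction so under the conventional log-linear inactivation model (our simplifying assumption for illustration not a claim of the study) the decimal reduction time at that temperature works out to a tenth of a second

\[ \log_{10}\frac{N_{0}}{N} = \frac{t}{D} \quad\Longrightarrow\quad D = \frac{0.5\ \mathrm{s}}{\log_{10}(10^{5})} = 0.1\ \mathrm{s} \]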
their research was featured on the cover of the may issue of the journal biotechnology and bioengineering not only is this sub-second heat treatment a more efficient and practical solution to stopping the spread of covid-19 through the air but it also allows for the implementation of this method in existing systems such as heating ventilation and air conditioning systems it also can lead to potential applications with other viruses such as the influenza virus that are also spread through the air han and his collaborators expect that this heat-inactivation method can be broadly applied and have a true global impact influenza is less dangerous but still proves deadly each year so if this can lead to the development of an air purification system that would be a huge deal not just with the coronavirus but for other airborne viruses in general han said
in their future work the investigators will build a microfluidic-scale testing chip that will allow them to heat-treat viruses for much shorter periods of time for example tens of milliseconds with the hope of identifying a temperature that will allow the virus to be inactivated even with such a short exposure time the lead authors of the work are electrical engineering postdoctoral researchers yuqian jiang and han zhang other collaborators on this project are dr julian l leibowitz professor and dr paul de figueiredo associate professor from the college of medicine at texas a&m; biomedical postdoctoral researcher jose a wippold; associate research scientist in microbial pathogenesis and immunology jyotsana gupta; and electrical engineering assistant research scientist jing dai this work has been supported by grants from medistar corporation several research personnel on the project team were also supported by grants from the national institutes of healths national institute of allergy and infectious diseases
with a global impetus toward utilizing more renewable energy sources wind presents a promising increasingly tapped resource despite the many technological advancements made in upgrading wind-powered systems a systematic and reliable way to assess competing technologies has been a challenge in a new case study researchers at texas a&m university in collaboration with international energy industry partners have used advanced data science methods and ideas from the social sciences to compare the performance of different wind turbine designs currently there is no method to validate if a newly created technology will increase wind energy production and efficiency by a certain amount said dr yu ding mike and sugar barnes professor in the wm michael barnes '64 department of industrial and systems engineering in this study we provided a practical solution to a problem that has existed in the wind industry for quite some time the results of their study are published in the journal renewable energy wind turbines convert the energy transferred from air hitting their blades to electrical energy as of 2020 about 8.4% of the total electricity produced in the united states comes from wind energy further over the next decade the department of energy plans to increase the footprint of wind energy in the electricity sector to 20% to meet the nations ambitious climate goals in keeping with this target there has been a surge of novel technologies particularly to the blades that rotate in the wind these upgrades promise an improvement in the performance of wind turbines and consequently power production however testing whether or how much these quantities will go up is arduous one of the many reasons that make performance evaluation difficult is simply the sheer size of wind turbines which are often several hundred feet tall testing the efficiency of these gigantic machines in a controlled environment like a laboratory is not practical on the other hand using scaled-down versions of wind turbines that fit into laboratory-housed wind tunnels yields inaccurate values that do not capture the performance of the actual-size wind turbines also the researchers noted that replicating the multitude of air and weather conditions that occur in the open field is hard in the laboratory hence ding and his team chose to collect data from inland wind farms for their study by collaborating with an industry partner that owned wind farms for their analysis they included 66 wind turbines on a single farm these machines were fitted with sensors to continuously track different variables like the power produced by the turbines wind speeds wind directions and temperature in totality the researchers collected data over four-and-a-half years during which time the turbines received three technological upgrades to measure the change in power production and performance before and after the upgrade ding and his team could not use standard pre-post intervention analyses such as those used in clinical trials briefly in clinical trials the efficacy of a certain medicine is tested using randomized experiments with test groups that get the medication and controls that do not the test and the control groups are carefully chosen to be otherwise comparable so that the effect of the medicine is the only distinguishing factor between the groups however in their study the wind turbines could not be neatly divided into the test and control-like groups as needed for randomized experiments the challenge we have here is that even if we choose ‘test and ‘control turbines similar to what is done in clinical trials we still cannot guarantee that the input conditions like the winds that hit the blades during the recording period were the same for all the turbines said ding in other words we have a set of factors other than the intended upgrades that are also different pre- and post-upgrade hence ding and his team turned to an analytical procedure used by social scientists for natural experiments called causal inference here despite the confounding factors the analysis still allows one to infer how much of the observed outcome is caused by the intended action which in the case of the turbines was the upgrade for their causal inference-inspired analysis the researchers included turbines only after their input conditions were matched that is these machines were subject to similar wind velocities air densities or turbulence conditions during the recording period next using an advanced data comparison methodology that ding jointly developed with dr rui tuo assistant professor in the industrial and systems engineering department the research team reduced the uncertainty in quantifying whether there was an improvement in wind turbine performance although the method used in the study requires many months of data collection ding said that it provides a robust and accurate way of determining the merit of competing technologies he said this information will be beneficial to wind operators who need to decide if a particular turbine technology is worthy of investment wind energy is still subsidized by the federal government but this will not last forever and we need to improve turbine efficiency and boost their cost-effectiveness said ding so our tool is important because it will help wind operators identify best practices for choosing technologies that do work and weed out those that don't ding received a texas a&m engineering experiment station impact award in 2018 for innovations in data and quality science impacting the wind energy industry other contributors to the research include nitesh kumar abhinav prakash and adaiyibo kio from the industrial and systems engineering department and technical staff of the collaborating wind company this research is funded by the national science foundation and industry
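for readers who want the flavor of that matching step here is a minimal python sketch of a matched pre/post comparison; the column names the tolerances and the toy records are illustrative assumptions not the teams actual code or data

```python
# minimal sketch of a matched pre/post upgrade comparison, assuming
# hypothetical 10-minute records with columns: period, wind_speed,
# air_density, power; tolerances are arbitrary illustrative choices
import pandas as pd

def matched_upgrade_effect(df, tol_speed=0.5, tol_density=0.01):
    """average power difference between post-upgrade records and
    pre-upgrade records taken under similar inflow conditions"""
    pre = df[df["period"] == "pre"]
    post = df[df["period"] == "post"]
    diffs = []
    for _, p in post.iterrows():
        # keep only pre-upgrade records whose conditions match this one
        m = pre[(pre["wind_speed"].sub(p["wind_speed"]).abs() <= tol_speed)
                & (pre["air_density"].sub(p["air_density"]).abs() <= tol_density)]
        if len(m):
            diffs.append(p["power"] - m["power"].mean())
    return sum(diffs) / len(diffs)  # unmatched records simply drop out

df = pd.DataFrame({
    "period": ["pre"] * 3 + ["post"] * 3,
    "wind_speed": [7.0, 8.1, 9.0, 7.2, 8.0, 12.5],        # m/s
    "air_density": [1.22, 1.21, 1.23, 1.22, 1.21, 1.18],  # kg/m^3
    "power": [1510, 1890, 2200, 1580, 1945, 3050],        # kw
})
print(matched_upgrade_effect(df))  # only condition-matched records contribute
```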
the texas a&m university system has named dr nancy currie-gregg interim deputy director and chief technology officer of the george hw bush combat development complex (bcdc) she assumed the duties of the position on june 7 currie-gregg will oversee the bcdcs research events and projects while a search is conducted to fill the position she joined the bcdc located on the texas a&m systems rellis campus in 2019 at its inception serving as the systems engineering and research integration lead currie-gregg is a holder of the don lummus ‘58 professorship of practice in engineering with appointments in the texas a&m university department of aerospace engineering and the wm michael barnes 64 department of industrial and systems engineering she develops and teaches graduate and undergraduate courses in aerospace human factors engineering quantitative risk analysis and reliability engineering system safety engineering and resilient systems engineering her research interests include spacecraft occupant protection human-robot interaction and the optimization of human performance and safety in engineered systems prior to joining texas a&m in the fall of 2017 currie-gregg spent the greater part of her career supporting nasas human spaceflight programs and projects selected as an astronaut in 1990 she accrued more than 1 000 hours in space as a mission specialist on four space shuttle missions sts-57 in 1993; sts-70 in 1995; sts-88 the first international space station assembly mission in 1998; and sts-109 the fourth hubble space telescope servicing mission in 2002 a retired us army colonel and master army aviator she logged more than 4 000 flying hours in a variety of rotary-wing and fixed-wing aircraft following the space shuttle columbia tragedy in 2003 she led the space shuttle program safety and mission assurance office directing safety reliability and quality assurance efforts enabling the safe return to flight of the space shuttle in 2005 she was then selected as a senior executive member of the nasa engineering and safety center serving for over a decade as the chief engineer at the johnson space center then as principal engineer currie-gregg earned her bachelors degree in biological sciences (interdisciplinary) from the ohio state university her masters degree in safety engineering from the university of southern california and her doctorate in industrial engineering with a specialization in human factors engineering and artificial intelligence from the university of houston currie-gregg replaces outgoing bcdc deputy director and chief technology officer dr john e hurtado who was named interim vice chancellor and dean of the college of engineering at texas a&m and interim agency director of the texas a&m engineering experiment station (tees) in may
scientists are continuously looking for alternatives to fossil fuel-based power plants to diminish the adverse effects of fossil energy sources on the environment and to also build reliability researchers at texas a&m university are studying the viability of solar photovoltaic (pv) grid-tied systems on rooftops to fill that need dr fadhil al-aboosi a researcher with the texas a&m engineering experiment stations gas and fuels research center is leading a team studying the adoption of solar pv systems on building rooftops in countries that have good solar energy potential even if they are oil or gas producers pv systems are composed of one or more solar panels combined with an inverter and other electrical and mechanical hardware that use energy from the sun to generate electricity we want to mitigate the negative impact of fossil energy sources on the environment and to avoid using lands that can add more cost and may be used for other purposes such as agricultural and urban activities al-aboosi said this will address the energy and environmental challenges of the rapid growth of the building sector al-aboosi said the prospects of implementing a pv system on building rooftops in texas were studied theoretically for the first time to overcome the lack of performance behavior data for this technology at the selected location the importance and accuracy of the results compared with previous research in the same direction come from the comprehensive analysis of the system performance al-aboosi said we considered technical economic and environmental criteria solar irradiance intensity two modes of single-axis tracking the shadow effect and the pv cell temperature impact on system efficiency the results of their study preliminary evaluation of a rooftop grid-connected photovoltaic system installation under the climatic conditions of texas are detailed in the journal energies the evaluated parameters of the proposed system include energy output array yield (the ratio of daily monthly or yearly direct current energy output from a pv array to its rated power) final yield array and system losses capacity factor performance ratio return on investment payback period levelized cost of energy and carbon emissions according to the overall performance results of the pv system which researchers propose to be installed on the eastern buildings of the texas a&m campus al-aboosi said they found this to be a technically economically and environmentally feasible solution for electricity generation that could play a significant role in the future energy mix of texas the comparison of the proposed pv system with other pv systems located at different sites around the world showed that performance does not depend only on solar radiation intensity but also on the operational and climatic conditions of any site selected to install a pv system he said al-aboosi said it was important to look at all these aspects in order to make solar energy a more viable solution the lack of research in this field either published or implemented environmental concerns and supporting variety in energy sources have sparked our curiosity to perform this study he said furthermore this study has been presented to make texas a&m university and the texas a&m engineering experiment station a pioneer in this field as in other scientific fields it is worth noting that this work is the first study that was done hypothetically in texas based on theoretical analysis whereas
all previous studies which were used for comparison with this study were based on experimental results at different locations worldwide he said the results were truly impressive and will pave the way for substantial developments in using rooftop grid-tied pv systems the study can serve as a future vision especially through the economic analysis for estimating the potential of investment incentives subsidies and feed-in tariffs (a policy mechanism designed to accelerate investment in renewable energy) to make implementing solar pv systems more attractive in texas and around the world al-aboosi said in any case the long-term performance of rooftop grid-tied technologies in texas requires further research especially finding proper management strategies for flexibly aggregating distributed energy resources from fossil fuels and renewable energy sources into the grid
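as a rough guide to a few of the evaluated parameters named above here is a back-of-the-envelope python sketch; the formulas are the standard simplified (undiscounted) definitions and the input numbers are placeholders not results from the energies paper

```python
# back-of-the-envelope versions of three rooftop pv metrics; real analyses
# discount cash flows and energy over the system lifetime
def capacity_factor(annual_energy_kwh, rated_kw):
    # fraction of the energy the array would produce at rated power all year
    return annual_energy_kwh / (rated_kw * 8760)

def performance_ratio(final_yield_kwh_per_kw, reference_yield_kwh_per_kw):
    # final yield (ac energy per rated kw) over the reference yield implied
    # by in-plane irradiance; captures all system losses
    return final_yield_kwh_per_kw / reference_yield_kwh_per_kw

def lcoe(total_lifetime_cost, total_lifetime_energy_kwh):
    # levelized cost of energy: lifetime costs divided by lifetime production
    return total_lifetime_cost / total_lifetime_energy_kwh

# illustrative numbers for a hypothetical 100 kw rooftop array
print(capacity_factor(annual_energy_kwh=165_000, rated_kw=100))          # ~0.19
print(performance_ratio(final_yield_kwh_per_kw=1_650,
                        reference_yield_kwh_per_kw=2_050))               # ~0.80
print(lcoe(total_lifetime_cost=190_000,
           total_lifetime_energy_kwh=4_125_000))                         # $/kwh
```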
despite the challenges of the past year the seventh annual texas a&m new ventures competition (tnvc) was held in may to recognize some of the most innovative startups in texas hosted by the texas a&m engineering experiment station (tees) and texas a&m university innovation partners this years competition used a hybrid format due to covid-19; last years event was held completely online houston's starling medical headed by alex arevalos took home the top $50 000 prize the company is a digital health device maker developing an artificial intelligence (ai) and tech-enabled platform as a safer smarter alternative to urinary catheters for individuals with neurogenic bladder dysfunction tnvc was an amazing experience for us at starling and it was such an honor to win the competition especially since there were so many exceptional companies there this year arevalos said this first-place prize and the connections we made during it will be instrumental in accelerating our commercialization efforts to bring our life-transforming solution to our future users based on our accelerated regulatory path we expect to be able to bring our device to market in the next two years he said arevalos said the mission of starling medical is to restore the ability to urinate safely and conveniently to an underserved patient population with a condition called neurogenic bladder dysfunction currently the standard of care for these people is to self-catheterize five to eight times a day on a set schedule because they cant sense their bladders anymore or activate them on their own due to a neurological condition like a spinal cord injury or multiple sclerosis arevalos explained its an archaic process that leaves people with a significant reduction in quality of life and puts them at risk for costly life-threatening infections with our ai and tech-enabled device we call the urincontrol system our users will be able to urinate safely and conveniently again on their own at the push of a button in addition our device communicates with their smartphone to help them track their bladder fullness throughout the day and it uses ai and our connected telehealth portal to warn them if they are developing a urinary tract infection arevalos said after the necessity of pivoting to a completely virtual program last year due to covid-19 we were pleased to have had the vast majority of the competition in person this year said saurabh biswas executive director of the tees commercialization and entrepreneurship division which founded this program tnvc now in its seventh year is a flagship event in the state-wide startup innovation ecosystem we are always excited to learn more about these texas startups and to hear directly from founders and inventors about their technologies that can not only improve the health and well-being of our citizens and the environment but also positively impact our local and state economies innovation partners was proud to co-host the texas a&m new ventures competition in collaboration with many entities across the a&m system a diverse group of sponsors and countless volunteers said andrew morriss vice president of entrepreneurship and economic development with the support of our generous sponsors tnvc had a record-setting 2021 program despite all the challenges of the past year we know that the investment will come back to benefit our state economy many times over and benefit society through the impact of the game-changing technologies the competing ventures commercialize the prize pool for this years
competition was more than $500 000 in cash and in-kind services the full list of winners includes:
1st place $50 000 prize: starling medical
2nd place $35 000 prize: ictero medical
3rd place $25 000 prize: koda health
4th place $15 000 prize: microsilicon
5th place $10 000 prize: vitanova biomedical
6th place $5 000 prize: code walker
elevator pitch competition:
1st $5 000: emgenisys
2nd $4 000: skypaws
3rd $2 000: solenic medical
4th $1 000: tybr health
special prizes:
brazos valley economic development corporation launch prize: heliowave
aggie angel network investment prize: vitanova biomedical
ph partners investment prize(s): koda health and starling medical
southwest national pediatric device consortium prize(s): solenic medical and hero medical
hollinden marketers & strategists services prize: tybr health
amerra visualization services prize: tybr health
paragon innovations prize(s): solenic medical and drill docs
schwegman lundberg & woessner ip legal services prize: tezcat laboratories
biotex investment prize: ictero medical
innovators legal services prize(s): microsilicon and vitanova biomedical
west webb albritton and gentry services prize: riverwalk therapeutics
axle – box services prize(s): skypaws and ava propulsion
the texas a&m engineering experiment station (tees) will collaborate with energy-sector stakeholders several national labs and universities in 10 states as the manager of a new ocean energy safety institute (oesi) the goals of the new oesi include safer workplaces improved environmental stewardship and greater us energy security through advances in technology monitoring equipment and workforce training the oesi will work to mitigate environmental and safety risks for both conventional and renewable energy technologies and prevent geohazards work-process incidents and offshore oil spills the consortium is organized under an agreement announced in may between tees and the us department of the interiors bureau of safety and environmental enforcement and the us department of energy the agreement calls for up to $40 million from the federal government over five years as well as about $12 million in investments from consortium members a smaller-scale oesi had been operated until recently by tees and two other texas universities now the oesi includes 16 universities in 10 states including texas a&m university and prairie view a&m university it also involves several national labs and more than 20 stakeholders representing conventional and renewable energy – including offshore wind and marine and hydrokinetic energy – from every offshore energy producing region tell us how we can help and we'll be right there said john sharp chancellor of the texas a&m university system were delighted to contribute to the energy sector it fuels so many jobs in texas and across the country katherine banks texas a&m president is the principal investigator on the oesi project she applauded her team for pulling together a diverse array of stakeholders from the energy industry and academic institutions the universities involved in the oesi represent massachusetts maryland virginia florida louisiana texas oklahoma california washington and alaska we are glad the federal government selected texas a&m to support the energy industry banks said tees has nationally recognized expertise in shepherding advanced research and development john pappas tees director of center operations and adjunct faculty member in the department of ocean engineering at texas a&m is the program manager for the oesi project he called the new consortium a game-changer we look forward to being part of the next generation of safety and environmental protection technologies for offshore energy production pappas said our team is extraordinarily diverse creative and talented it will offer new solutions and new ways of thinking tees will be responsible for developing a road map of projects in consultation with consortium members once approved by federal officials the road map becomes a guide for individual projects with yearly objectives while the department of the interiors bureau of safety and environmental enforcement and the department of energy will provide expertise direction and oversight through a joint steering committee (jsc) the oesi will operate independently the jsc will include experts in oil and gas offshore wind and marine and hydrokinetic energy which is the method of converting energy from waves tides ocean currents and thermal and dissolved-salt gradients into electricity faisal khan a chemical engineering professor will be the oesi technical director khan is a leading researcher in offshore technology and safety engineering he emphasized that consortium projects will entail researchers from a variety of engineering fields: ocean
industrial chemical civil mechanical and others this is a multidisciplinary holistic approach khan said we will provide technical support and safety and environmental protection technologies for oil gas wind and wave energy production note: the views and conclusions in this release are those of its authors and should not be interpreted as representing the views or policies of the us government mention of trade names or commercial products does not constitute endorsement by the us government
when the covid-19 pandemic began to inundate the united states in early 2020 many researchers looked at how they could adapt their current work to help fight the pandemic dr limei tian assistant professor in the department of biomedical engineering at texas a&m university and her team are developing a novel way to diagnose covid-19 through a mask we are developing a reliable noninvasive point-of-care biosensor that can directly capture and detect sars-cov-2 for rapid detection and surveillance of covid-19 tian said one of tians research focuses is organic and inorganic hybrid materials for physical chemical and biological sensors and multifunctional surfaces and interfaces as opportunities to research ways to detect and fight covid-19 arose tian began to adapt biosensors designed for other disease diagnoses to instead detect covid-19 tian said the goal is to develop a biosensor patch that can be placed into a mask or scarf as a person exhales their breath can be captured the sensor can then be easily removed and placed in a hand-held reader for analysis in addition to covid-19 the platform technology being developed in our lab can be readily adapted for rapidly detecting and monitoring other infectious respiratory diseases tian said the team is in the sensor development process tian said one challenge has been improving the sensitivity of the biosensors to capture low concentrations of biomarkers in the breath tian was recently recognized for her work in biosensors by receiving the trailblazer r21 award from the national institute of biomedical imaging and bioengineering part of the national institutes of health the award is an opportunity for new and early-stage investigators to pursue research programs that integrate engineering and the physical sciences with the life and behavioral sciences
hydrogels are commonly used inside the body to help in tissue regeneration and drug delivery however once inside they can be challenging to control for optimal use a team of researchers in the department of biomedical engineering at texas a&m university is developing a new way to manipulate the gel by using light graduate student patrick lee and dr akhilesh gaharwar associate professor are developing a new class of hydrogels that can leverage light in a multitude of ways light is a particularly attractive source of energy as it can be confined to a predefined area as well as be fine-tuned by the time or intensity of light exposure their work was recently published in the journal advanced materials light-responsive hydrogels are an emerging class of materials used for developing noninvasive noncontact precise and controllable medical devices in a wide range of biomedical applications including photothermal therapy photodynamic therapy drug delivery and regenerative medicine lee said light-responsive biomaterials are often used in biomedical applications; however current light sources such as ultraviolet light and visible light cannot sufficiently penetrate the tissue to interact with the hydrogel instead the team is researching near-infrared (nir) light which has a higher penetration depth the team is using a new class of two-dimensional nanomaterials known as molybdenum disulfide (mos2) which has shown negligible toxicity to cells and superior nir absorption these nanosheets with high photothermal conversion efficiency can absorb and convert nir light to heat which can be used to control thermoresponsive materials in the groups previous study published in advanced materials certain polymers react with mos2 nanosheets to form hydrogels building on this discovery the team further utilizes mos2 nanosheets and thermoresponsive polymers to control the hydrogel under nir light via the photothermal effect this work leverages light to activate the dynamic polymer-nanomaterials interactions gaharwar said upon nir exposure mos2 acts as a crosslink epicenter by connecting with multiple polymeric chains via defect-driven click chemistry which is unique nir light allows internal formation of therapeutic hydrogels in the body for precise drug delivery for cancer therapy most of the drugs can be retained within the tumor which will ease the side effects of chemotherapy moreover nir light can generate heat inside tumors to ablate cancer cells a treatment known as photothermal therapy therefore a synergistic combination of photothermal therapy and chemotherapy has shown a higher efficacy in destroying cancer cells this study is funded by the new innovator award from the national institutes of health as well as the texas a&m president's excellence fund through the x-grant and t3 programs
to help patients manage their mental wellness between appointments researchers at texas a&m university have developed a smart device-based electronic platform that can continuously monitor the state of hyperarousal one of the key signs of psychiatric distress they said this advanced technology could read facial cues analyze voice patterns and integrate readings from built-in vital signs sensors on smartwatches to determine if a patient is under stress furthermore the researchers noted that the technology could provide feedback and alert care teams if there is an abrupt deterioration in the patients mental health mental health can change very rapidly and a lot of these changes remain hidden from providers or counselors said dr farzan sasangohar assistant professor in the wm michael barnes 64 department of industrial and systems engineering our technology will give providers and counselors continuous access to patient variables and patient status and i think it's going to have a lifesaving implication because they can reach out to patients when they need it plus it will empower patients to manage their mental health better the researchers integrated electronic monitoring and feedback platform is described in the journal of psychiatric practice unlike some physical illnesses that can usually be treated with a few doctor visits people with mental health needs can require an extended period of care but between visits to a health care provider information on a patients mental health status has been lacking hence unforeseen deterioration in mental health has a limited chance of being addressed for example a patient with anxiety disorder may experience a stressful life event triggering extreme irritability and restlessness which may need immediate medical attention but this patient may be between appointments meanwhile health care professionals have no way to know about their patients ongoing struggle with mental health which can prevent them from providing the appropriate care hence patient-reported outcomes between visits are critical for designing effective health care interventions for mental health so that there is continued improvement in the patients wellbeing to fill this gap sasangohar and his team worked with clinicians and researchers in the department of psychiatry at houston methodist hospital to develop a smart electronic platform to help assess a patients mental wellbeing the hospital has the largest inpatient psychiatry clinic in the houston area said sasangohar with this collaboration we could include thousands of patients that had given consent for psychiatric monitoring sasangohars collaborators at houston methodist hospital were already using an off-the-shelf patient navigation tool called caresense this software can be used to send reminders and monitoring questions to patients to better assess their wellbeing for instance individuals at risk for self-harm can be prompted to take questionnaires for major depressive disorder periodically rather than solely relying on the patients subjective assessment of their mental health sasangohar and his team also developed a whole suite of software for automated hyperarousal analysis that can be easily installed on smartphones and smartwatches these programs gather input from face and voice recognition applications and from sensors already built into smartwatches such as heart rate sensors and pedometers the data from all of these sources then train machine-learning algorithms to recognize patterns that are aligned with the normal state
of arousal once trained the algorithms can continuously look at readings coming from the sensors and recognition applications to determine if an individual is in an elevated arousal state the key here is triangulation said sasangohar each of these methods on its own say facial sentiment analysis shows promise to detect the mental state albeit with limitations but when you combine that information with the voice sentiment analysis as well as physiological indicators of distress the diagnosis and inference become much more powerful and clearer sasangohar noted that both the subjective evaluation of mental state and the objective evaluation from the machine-learning algorithms are integrated to make a final assessment of the state of arousal for a given individual while their technologys prototype is ready the researchers said they still need to improve the battery life of smartphones carrying their software since the algorithms guzzle a lot of power further they noted that they have to address usability issues that is any issues that prohibit patients from using their technology such as difficulty in navigating their application because of the stigmatization that surrounds mental illness we wanted to build a mental health monitoring device that was very discreet said sasangohar so we chose off-the-shelf products like smartphones and then built sophisticated applications that operate within these devices to make monitoring mental health discreet other contributors to the study include dr christopher fowler and dr alok madan from the university of texas mcgovern school of medicine and baylor college of medicine; courtenay bruce and dr stephen jones from the houston methodist institute for academic medicine; dr christopher frueh from the university of texas mcgovern school of medicine and the university of hawaii; and dr bita kash from the methodist institute for academic medicine and texas a&m this research is funded by the texas a&m university presidents excellence grant (x-grant)
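a minimal python sketch of the triangulation idea follows; the feature names the made-up sensor windows and the simple logistic-regression fusion model are illustrative assumptions not the teams software

```python
# fuse facial, voice and physiological features into one arousal classifier;
# all data here is synthetic and the features are hypothetical stand-ins
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# each row: [facial_distress_score, voice_distress_score, heart_rate, steps]
X_calm = rng.normal([0.2, 0.2, 70, 80], [0.1, 0.1, 5, 20], size=(50, 4))
X_aroused = rng.normal([0.7, 0.6, 95, 30], [0.1, 0.1, 8, 15], size=(50, 4))
X = np.vstack([X_calm, X_aroused])
y = np.array([0] * 50 + [1] * 50)  # 0 = baseline arousal, 1 = elevated

# train on labeled windows, then score a new multi-sensor window
model = LogisticRegression(max_iter=1000).fit(X, y)
new_window = np.array([[0.65, 0.55, 92, 35]])
print(model.predict_proba(new_window)[0, 1])  # probability of elevated arousal
```

combining the three signal families is what the quote above calls triangulation: any one feature alone is noisy but jointly they separate the two states far more cleanly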
researchers at texas a&m university and george mason university are investigating how population-level data can be used to codify quarantine policies for policymakers with the ultimate aim of curbing the spread of diseases during epidemics and pandemics from multiple angles including mass screening quarantining and vaccine distribution dr hrayer aprahamian assistant professor and jiayi lin doctoral student in the wm michael barnes 64 department of industrial and systems engineering at texas a&m and dr hadi el-amine assistant professor of systems engineering and operations research at george mason university have identified quarantine policies that use subject-risk information to mitigate the spread of disease while also accounting for potential negative economic impacts in order to identify the best possible policy we formulate this decision problem within an optimization framework and use a range of tools to be able to solve the resulting problem accurately and efficiently aprahamian said doing so enables us to solve the problem for realistic problem instances their research aims to not only provide practitioners administrators and policymakers with evidence-based insights and recommendations on mitigating the spread of disease in pandemics but to also demonstrate that operations research and mathematical tools can be used to identify better-optimized mitigation policies to effectively combat the spread of diseases this paper provides an efficient solution scheme to a class of challenging optimization problems that arise in numerous real-world applications like crew scheduling vehicle routing inventory management group testing and bin packing lin said from a practical standpoint this paper addresses an important question that many practitioners have continued to struggle with: how can one use the vast amount of covid-19 data to shape informed data-driven policies the researchers attempt to answer this question by providing a mathematical framework to identify quarantine policies that are effective in mitigating the spread of disease while considering both subject-specific risk information and overall economic impact this distinction arises from the fact that there is no one-size-fits-all policy the team conducted a covid-19 case study using real-world risk data for the state of minnesota simulating a realistic community based on census data and then running an optimization model on this particular community the resulting models were solved using texas a&ms high performance research computing facilities to achieve comprehensive results the experiment was repeated for a range of realistic parameter values to measure the benefits of the proposed policies the team compared their solution to more conventional one-dimensional policies targeted policies tailored to the specific needs of the local population are recommended such specific solutions however are often complex and require us to work closely with local leaders in order to successfully implement them aprahamian said this research demonstrates that the identified data-driven policies outperform conventional measures by both reducing the spread of the disease and having less economic impact one observation that is worth highlighting is that the results reveal that taking no action at all is never the best solution for a wide range of realistic parameter values even in the most extreme of cases lin said these results when scaled translate to hundreds of thousands of fewer infections and
millions of dollars of savings aprahamian said such high-level insights are of great value as they can be used by larger national or worldwide agencies to urge local administrators to take action especially at the early stages of the pandemic
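to make the risk-versus-economics tradeoff concrete here is a toy python sketch of a risk-stratified quarantine decision posed as a linear program; the risk groups the per-group numbers and the budget are invented for illustration and this is not the papers model

```python
# choose what fraction of each risk group to quarantine so that expected
# infections averted are maximized under an economic-cost budget
from scipy.optimize import linprog

infections_averted = [50, 20, 5]   # per 1 000 people quarantined, by risk group
econ_cost = [3.0, 2.0, 1.5]        # economic cost per 1 000 people quarantined
budget = 4.0                       # total economic cost we can tolerate

# linprog minimizes, so negate the objective to maximize infections averted
res = linprog(c=[-v for v in infections_averted],
              A_ub=[econ_cost], b_ub=[budget],
              bounds=[(0, 1)] * 3)  # fraction of each group quarantined
print(res.x)  # -> roughly [1.0, 0.5, 0.0]: target the high-risk group first
```

even this cartoon reproduces the qualitative findings quoted above: the optimum is targeted rather than uniform and doing nothing (all zeros) is never optimal while any budget remains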
having a home near a busy airport certainly has its perks it is close to many establishments and alleviates the problem of wading through endless traffic to catch flights but it does come at a cost tolerating the jarring sounds of commercial airplanes during landing and takeoff researchers at texas a&m university have conducted a computational study that validates using a shape-memory alloy to reduce the unpleasant plane noise produced during landing they noted that these materials could be inserted as passive seamless fillers within airplane wings that automatically deploy themselves into the perfect position during descent when landing aircraft engines are throttled way back and so they are very quiet any other source of noise like that from the wings becomes quite noticeable to the people on the ground said dr darren hartl assistant professor in the department of aerospace engineering we want to create structures that will not change anything about the flight characteristics of the plane and yet dramatically reduce the noise problem the researchers have described their findings in the journal of aircraft aircraft noise has been an ongoing public health issue airplanes can generate 75-80 decibels during landing which can be damaging to hearing over the long term for example studies have shown that people exposed to sustained aircraft noise can experience disturbed sleep and an increased risk of stroke and heart disease compared to those who do not live near airports the source of aircraft noise is different during ascent and descent during takeoff the engines are the primary source of noise on the other hand when airplanes slow down to land the engines do not need to generate power and are mostly idling at this time the wings begin to reconfigure themselves to slow down the airplane and prepare for touchdown similar to the opening of venetian blinds the front edge of the wing separates from the main body this change causes air to rush into the space created swirl around quite violently and produce noise the idea is similar to how a sound is generated in a flute said hartl when a flute is played air blown over a hole begins to swirl around the hole and the size the length and how i cover the holes produce a resonant sound of a certain frequency similarly the circulating air in the cove created between the front edge of the wing and the main wing resonates and creates a sharp unpleasant noise earlier work from hartls collaborators at nasa showed that fillers used as a membrane in the shape of an elongated s within this cove could circumvent the noise-causing air circulation and thereby lessen the jarring sound however a systematic analysis of candidate materials that can assume the desired s-shaped geometry during descent and then retract back into the front edge of the wing after landing was lacking to address this gap the researchers performed comprehensive simulations to investigate if a membrane made of a shape-memory alloy could go back and forth changing shape for every landing their analysis considered the geometry the elastic properties of the shape-memory alloy and the aerodynamic flow of air around the material during descent as a comparison the researchers also modeled the motion of a membrane made of a carbon-fiber-reinforced polymer composite under the same airflow conditions hartl said these types of simulations are computationally expensive since the flow of air around the conformal material has to be modeled while analyzing the air-induced motion of the material
every time the air applies some pressure to the material the material moves and every time the material moves the air moves differently around it he said so the behavior of the airflow changes the structure and the motion of the structure changes the airflow consequently the team had to perform calculations hundreds to thousands of times before the motion of the materials was correctly simulated when they analyzed the outcomes of their simulations they found that both the shape-memory alloy and the composite could change their shape to reduce air circulation and thereby reduce noise however the researchers also found that the composite had a very narrow window of designs that would enable noise canceling as a next step hartl and his team plan to validate the results of their simulations with experiments in these tests the researchers will place scaled-down models of aircraft wings with the shape-memory alloy fillers into wind tunnels the goal is to check if the fillers can deploy into the correct shape and reduce noise in near real-world situations we would also like to do better said hartl we might be able to create smaller structures that can reduce noise and do not require the s-shape which are actually quite large and potentially heavy other contributors to this research include dr gaetano arena dr rainer groh and dr alberto pirrera from the university of bristol england; dr travis turner from the nasa langley research center virginia; and william scholten now at ata engineering inc this research is funded by the engineering and physical sciences research council the royal academy of engineering and the nasa langley research center
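the back-and-forth hartl describes is a fixed-point iteration between a fluid model and a structural model; the python sketch below caricatures it with one-degree-of-freedom stand-ins for the real cfd and shape-memory-alloy solvers so the numbers are purely illustrative

```python
# cartoon of a coupled fluid-structure loop: alternate an aerodynamic
# pressure estimate with a structural deflection estimate until neither
# changes; the two toy models below replace the real solvers
def pressure_from_shape(deflection):
    return 100.0 - 40.0 * deflection   # toy aero model: load drops as membrane deflects

def shape_from_pressure(pressure, stiffness=80.0):
    return pressure / stiffness        # toy structural model: linear compliance

deflection = 0.0
for it in range(100):
    pressure = pressure_from_shape(deflection)
    new_deflection = shape_from_pressure(pressure)
    if abs(new_deflection - deflection) < 1e-9:  # converged: air and structure agree
        break
    deflection = new_deflection
print(it, deflection, pressure)  # converges to the self-consistent shape (~0.833)
```

in the real study each of these two function calls is itself an expensive simulation which is why the quoted "hundreds to thousands" of repeated calculations make the problem computationally heavy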
finding new solutions to address the challenges posed by plastic waste can dramatically improve global sustainability practices and help achieve a greener future while many researchers are working to solve this problem on an international scale a new multi-institutional team is seeking to turn that waste into high-performance products the research team is working on upcycling plastic waste into liquid lubricants including oil hydraulic fluids heat transfer fluids and greases led by iowa state university the project team includes argonne national laboratory chevron phillips chemical company chemstations inc american packaging corp the city of ames resource recovery facility and hy-vee alongside texas a&m university dr ali erdemir halliburton chair in engineering professor and professor in the j mike walker '66 department of mechanical engineering and the department of materials science and engineering leads the efforts for texas a&m the project is one of 12 funded by the us department of energys plastics innovation challenge an initiative designed to reduce plastic waste in oceans and landfills as well as help to position the us as a global leader in plastics recycling technologies and in the manufacture of new plastics that are recyclable by design their research was recently published in the journal chemsuschem erdemir said the team is working toward the common goal of demonstrating that plastic wastes can be responsibly and economically upcycled into high-performance lubricants and used to minimize friction and wear if successful the team hopes their research could help reduce both energy consumption and greenhouse gas emissions "this project aims to reduce the adverse impacts made by hundreds of millions of tons of waste plastics through upcycling in order to support a circular economy with minimal environmental impact " erdemir said "these responsibly recycled materials will provide new economic incentives by developing a novel upcycling process to produce innovative value-added products" erdemir said the general public could see day-to-day benefits from this research through a less adverse impact from plastic waste and cheaper and potentially better-functioning lubricants used in cars and other industrial activities "reducing plastic wastes to lubricating oils is quite remarkable and may lead to a greener and more sustainable future " erdemir said "benefits could be huge as the end-products of this project will not only help reduce the adverse environmental impacts of plastic wastes but also put them in use in a very green and continuously recyclable manner" by turning the waste into high-performing lubricants that perform as well as or even better than their traditional counterparts erdemir said the mechanical components that utilize the lubricants for smooth and safe operation could benefit through improved mechanical durability energy efficiency and environmental compatibility moving forward the team will be researching both the cost and technology needed to upcycle the plastic waste into lubricants as well as how well the product ultimately performs "by the end of our project we hope that we turn plastic trash into lubricating treasures in a sound and cost-effective way thus helping alleviate the dire consequences of plastic wastes which are already hurting our planet in so many ways erdemir said if proven commercially viable we expect our research findings to turn into a wide range of lubricating products including engine oils and a wide range of industrial lubricants that could
help reduce energy consumption and the carbon footprint of future transportation and other industrial systems
by analyzing peoples visitation patterns to essential establishments like pharmacies religious centers and grocery stores during hurricane harvey researchers at texas a&m university have developed a framework to assess the recovery of communities after natural disasters in near real time they said the information gleaned from their analysis would help federal agencies allocate resources equitably among communities ailing from a disaster neighboring communities can be impacted very differently after a natural catastrophic event said dr ali mostafavi associate professor in the zachry department of civil and environmental engineering and director of the urban resilienceai lab and so we need to identify which areas can recover faster than others and which areas are impacted more than others so that we can allocate more resources to areas that need them more the researchers have reported their findings in interface a publication of the royal society the metric that is conventionally used to quantify how communities bounce back from nature-caused setbacks is called resilience and is defined as the ability of a community to return to its pre-disaster state and so to measure resilience factors like the accessibility and distribution of resources connection between residents within a community and the level of community preparedness for an unforeseen disaster are critical the standard way of obtaining data needed to estimate resilience is through surveys the questions considered among many others are how and to what extent businesses or households were affected by the natural disaster and the stage of recovery however mostafavi said these survey-based methods although extremely useful take a long time to conduct with the results of the survey becoming available many months after the disaster for federal agencies allocating funds recovery information is actually needed in a faster and more near real-time fashion for communities that are trailing in the recovery process said mostafavi the solution we thought was to look for emerging sources of data other than surveys that could provide more granular insights into community recovery at a scale not previously investigated mostafavi and his collaborators turned to community-level big data particularly the information collected by companies that keep track of visits to locations within a perimeter from anonymized cell phone data in particular the researchers partnered with a company called safegraph to obtain location data for the people in harris county texas around the time of hurricane harvey as a first step they determined points of interest corresponding to the locations of establishments like hospitals gas stations and stores that might experience a change in visitor traffic due to the hurricane next the researchers mined the big data and obtained the number of visits to each point of interest before and during the hurricane for different communities in harris county they calculated the time taken for the visits to return to the pre-disaster level and the general resilience that is the combined resilience of each point of interest based on the percent change in the number of visits due to the hurricane their analysis revealed that communities that had low resilience also experienced more flooding however their results also showed that the level of impact did not necessarily correlate with recovery its intuitive to assume for example that businesses impacted more will have slower recovery which actually wasn't the case said mostafavi there were 
places where visits dropped significantly but they recovered fast but then others that were impacted less but took longer to recover which indicated the importance of both time and general resilience in evaluating a communitys recovery the researchers also noted that another important finding was that the areas that are in close proximity to those that had flooding are also impacted suggesting that the spatial reach of flooding goes beyond flooded areas although we focused on hurricane harvey for this study our framework is applicable for any other natural disaster as well said mostafavi but as a next step wed like to create an intelligent dashboard that would display the rate of recovery and impacts in different areas in near real time and also predict the likelihood of future access disruption and recovery patterns after a heavy downpour other contributors to the research include cristian podesta natalie coleman amir esmalian and dr faxi yuan from the civil and environmental engineering department podesta an undergraduate student is the lead author in this study coleman is a national science foundation graduate fellow this research is funded by a national science foundation faculty early career development award
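a minimal python sketch of the two visit-based measures described above follows; it makes the simplifying assumptions that resilience is the fraction of baseline visits retained at the lowest point and that recovery time is the first return to baseline and the weekly counts are invented so this is an illustration of the idea rather than the studys computation

```python
# compute a simple impact measure and a recovery time from weekly visit
# counts to one point of interest; both definitions are simplified stand-ins
def recovery_metrics(visits, baseline, event_week):
    drop = min(visits[event_week:])                        # lowest count after the event
    resilience = 1 - (baseline - drop) / baseline          # 1 = no impact, 0 = total loss
    recovery_time = None
    for week, v in enumerate(visits[event_week:]):
        if v >= baseline:                                  # first week back at baseline
            recovery_time = week
            break
    return resilience, recovery_time

# weekly visits to a hypothetical store, hurricane lands at week 2
visits = [1000, 980, 310, 450, 700, 940, 1010]
print(recovery_metrics(visits, baseline=1000, event_week=2))  # -> (~0.31, 4)
```

the studys central observation falls straight out of these two numbers being separate: a place can score low on the first (big drop) yet small on the second (fast recovery) or the reverse which is why impact alone does not predict recovery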
with questions about the short- and long-term reliability and sustainability of the power grid set forth by the electric reliability council of texas more commonly referred to as ercot having alternate energy options is more vital than ever and while many forms of renewable energy such as solar and nuclear call land their home other methods such as offshore wind farms wave energy and current/tide energy are taking to the seas to generate electricity dr moo-hyun kim bauer professor ii in the department of ocean engineering and director of the ocean system simulation and control lab and his team of fellow researchers at texas a&m university believe the next generation of offshore energy lies in the development of a synergistic combination of several renewable energy production methods set atop a floating offshore platform offshore renewable energy can directly power remote islands numerous ocean platforms electric boats and underwater drones and vehicles as well as ‘blue economy systems such as marine aquaculture fish or macroalgae farms kim said it can also be combined with desalination plants and hydrogen factories the ocean renewable energy station will feature wind wave current and solar energy elements that could generate electricity for anything from a coastal or island community a research lab or a military unit tethered where the water depth is 60 meters or more the station will be ideal where water depth increases quickly such as along the united states pacific coast or hawaii and will be less obtrusive to the view of coastal residents than a fixed offshore wind farm and it has been proven to have a highly competitive levelized cost of energy (the measurement of lifetime costs divided by energy production) denmark is now building a huge multi-source multi-purpose ocean energy island; wind energy is already competitive against fossil fuels kim said the biggest disadvantage of ocean renewable energy is its variability so some sort of storage method is highly needed to be commercially more useful while offshore wind energy is commercially competitive current wave-energy converters (which sit close to the surface of the water and utilize the natural motion of waves to generate electricity) are less cost-effective and only useful for smaller-scale special purposes to help solve these problems the ocean renewable energy station combines several different methods of renewable energy additionally as kim described larger offshore wind turbines may create better synergy with the other forms of energy production the team also plans to incorporate innovative smart materials into the wave energy converter that will respond to changes in wave height and frequency and allow for more consistent power production now the united states department of energy is the largest funding source for ocean renewable energy and the wind energy industry is growing fast kim said major oil/gas companies are also gradually shifting their business emphasis toward ocean clean energy
occurring faster than the speed of sound the breakdown of plasma discharges in water is one step closer to being understood as researchers apply new diagnostic processes using state-of-the-art x-ray imaging to the challenging subject these diagnostic processes open the door to a better understanding of plasma physics which could lead to advances in green energy production through methods including fusion hydrocarbon reforming and hydrogen generation dr david staack and christopher campbell in the j mike walker '66 department of mechanical engineering at texas a&m university are part of the team pioneering this approach to assessing plasma processes partners on the project include diagnostics experts from los alamos national laboratory and using the facilities at the argonne national laboratory advanced photon source (aps) the team is working with lteoil on patented research into the use of multiphase plasma in carbon-free fuel reforming the research is supported by the dynamic materials properties campaign (c2) and the advanced diagnostics campaign (c3) at los alamos national laboratory through the thermonuclear plasma physics group (p4) principal investigator zhehui (jeph) wang the research which was recently published in physical review research is producing the first-known ultrafast x-ray images of pulsed plasma initiation processes in water staack associate professor and sallie and don davis '61 career development professor said these new images provide valuable insight into how plasma behaves in liquid "our lab is working with industry sponsors on patented research into the use of multiphase plasma in carbon-free fuel reforming " staack said "by understanding this plasma physics we are able to efficiently convert tar and recycled plastics into hydrogen and fuels for automobiles without any greenhouse gas emissions in the future these investigations may lead to improvements in inertial confinement fusion energy sources" inertial confinement fusion in which high temperature high energy density plasmas are generated is a specific focus of the project to better understand the plasma physics involved in this type of fusion staack said the team is developing short timescale high-speed imaging and diagnostic techniques utilizing a simple low-cost plasma discharge system additionally they are seeking to better understand the phenomena that occur when plasma is discharged in liquid causing a rapid release of energy resulting in low-density microfractures in the water that move at over 20 times the speed of sound
campbell a graduate research assistant and phd candidate said the team hopes their discoveries can prove to be a valuable contribution to the collective knowledge of their field as researchers seek to develop robust predictive models for how plasma will react in liquid "our goal is to experimentally probe the regions and timescales of interest surrounding this plasma using ultrafast x-ray and visible imaging techniques thereby contributing new data to the ongoing literature discussion in this area " said campbell "with a complete conceptual model we could more efficiently learn how to apply these plasmas in new ways and also improve existing applications" although they have made progress campbell said current methods are not yet sophisticated enough to collect multiple images of a single plasma event in such a short amount of time less than 100 nanoseconds "even with the state-of-the-art techniques and fast framerates available at the advanced photon source we have only been able to image a single frame during the entire event of interest by the next video frame most of the fastest plasma processes have concluded " campbell said "this work highlights several resourceful techniques we have developed to make the most of what few images we are able to take of these fastest processes" the team is currently working to measure the pressures induced by the rapid phenomena and preparing for a second round of measurements at aps to investigate interacting discharges discharges in different fluids and processes that may limit confinement of higher energy discharges they look forward to the opportunity of using even higher-framerate x-ray imaging methods ranging up to 67 million frames per second compared to 271 thousand frames per second in this study
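the arithmetic behind that framerate comparison is simple enough to show directly; the python below just converts each framerate to an inter-frame interval and counts how many frames land inside a 100-nanosecond event

```python
# how many frames fit inside a sub-100-nanosecond plasma event at each rate
event_ns = 100
for fps in (271_000, 67_000_000):
    frame_interval_ns = 1e9 / fps
    print(fps, round(frame_interval_ns, 1), event_ns / frame_interval_ns)
# at 271 kfps a frame arrives every ~3 690 ns so the whole event fits inside
# one frame; at 67 mfps frames arrive every ~15 ns giving several frames
```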
dr isaac adjei is using his background in drug delivery to engineer new ways to treat late-stage cancer patients adjei assistant professor in the department of biomedical engineering at texas a&m university develops treatments for patients with late-stage cancers to improve their quality of life and help them live longer typically these patients have cancer that has spread to other parts of the body or tumors that do not respond to treatment one of the current treatments for most cancer patients is immunotherapy where antibodies are used against specific receptors on cancer cells to activate the immune system or immune cells are isolated activated and then re-injected into the body to attack the cancer however immunotherapy is less likely to work on patients with late-stage cancer in breast cancer patients only about 5% in the advanced stage will respond to treatment adjeis group takes a different approach to how the immune system interacts with tumors instead of only testing different immunotherapies his team looks to see if they can develop ways to change the environment inside a patients tumor to give the treatment a better chance of working were trying to change the environment within the tumor using nanoparticles that we developed in our lab so that when the new immune cells get there the cells are happy and they can find the cancer cells and kill them adjei said were giving them a fighting chance to be able to do the job you send them there to do one way adjeis team is hoping to accomplish this is by changing the oxygen levels within the tumor as cancer grows it uses a lot of nutrients lowering the amount of oxygen within the tumor once the environment is depleted of nutrients cancer cells are prompted to leave which leads to the cancer spreading throughout the body through metastasis the cancer cells also develop ways that allow them to hide from the immune cells in this hypoxic environment making immunotherapy ineffective
if nanoparticles tiny devices 1 000 times smaller than a strand of human hair can be injected into the tumor and produce oxygen the immune system will have a better chance at destroying the tumor adjeis team develops nanoparticles that take advantage of some of the properties of the tumor cells to produce oxygen in that environment to ensure a sustained effect adjei says one goal is to find ways to mimic nature one of my philosophies is if you take something that nature is already using it makes it easy to translate adjei said i did some of my phd coursework in a hospital so there its put into your head before you develop or design anything you have to think about how youre going to ultimately get it into a patient that leads to another challenge his team is working to solve ensuring nanoparticles end up in the right place in the body once again they are using the bodys own systems as a template the liver for instance produces many substances but the body knows where each should go usually using specific proteins what (my graduate student) sri is doing is shes developing these nanoparticles that if you inject them into the body they can recruit some of these proteins and then use that to tell the nanoparticles where to go adjei said once we have this platform we can apply it for different applications such as treatment of cancer stroke even traumatic brain injuries adjei said one of his goals is to train students with the right skills and mentality to help patients which can have a broader impact on the health of the community if i train two doctoral students who have the same mentality and they go out with the goal to improve outcomes for patients thats three people trying to do the same thing adjei said that is my ultimate goal: help patients and train excellent translation-minded students who when they go out even if they are in industry are going to make impacts
the pfizer and moderna covid-19 vaccines have proven to be incredibly effective at fighting the pandemic. both of these vaccines are made using messenger rna (mrna), the genetic material that contains instructions for cells to build antigen proteins. these mrna vaccines represent a fundamentally different approach from traditional vaccines.

essentially, all vaccines are used to stimulate and train the body's immune system to recognize and destroy pathogens. traditional vaccines contain either killed or weakened forms of a virus or bacterium, or proteins associated with the pathogen, to provoke an immune response. rather than introducing a pathogen or associated protein directly, mrna vaccines introduce genetic information that instructs cells to make proteins associated with the pathogen, triggering an immune system response.

while mrna vaccines have several major advantages over traditional vaccines (precise immune responses, rapid development and production processes, inherent safety), there are a few significant drawbacks. the most critical of these is the overall thermal instability of rna, which begins to break down above freezing temperatures. as a result, mrna vaccines require stringent cold-chain conditions for manufacturing, storage and worldwide distribution (-20°c for moderna, -80°c for pfizer-biontech vaccines), which has hindered their widespread use, particularly in rural areas and developing countries that lack ultracold freezers and cold-chain assurance. to make mrna vaccines much more broadly accessible, it is critical to improve mrna vaccine stability while maintaining efficacy and safety.

a team led by dr qing sun, assistant professor in the artie mcferrin department of chemical engineering, has been awarded a texas a&m university x-grant to examine and find solutions to the problems presented by mrna vaccines. the team is composed of eight faculty members, including sun, dr arum han, dr xiaoning qian and dr yang shen from the college of engineering; dr paul de figueiredo, dr julian leibowitz and dr jim song from the texas a&m university school of medicine; and dr xiuren zhang from the texas a&m university college of agriculture and life sciences.

the overarching goal of the project, titled "a multidisciplinary platform to develop thermally stable and highly efficient mrna vaccines," is to develop an integrated platform that combines high-throughput deep learning and novel experimental systems to predict and produce thermally stable mrna vaccines. the x-grant team will develop a machine-learning platform that uses deep learning to predict the thermal stabilities of various rna from sequence information. the team will then develop a dna/rna synthesis platform that supports the prototyping of mrna vaccines and tests the immunogenicity and efficacy of each prototype vaccine. the research will initially focus on covid-19, but the goal is to make the platform flexible enough to expand into other infectious agents, cancers and other significant human diseases.

the x-grants program, part of the president's excellence fund at texas a&m university, is an interdisciplinary program designed to bring faculty together across disciplines. its goal is to unlock creative and imaginative ideas that address important problems bearing on the greatest challenges facing global society. for round four of the x-grants program, more than 200 proposals were submitted and eight were ultimately funded.
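to picture the sequence-to-stability prediction step, here is a deliberately tiny surrogate, a sketch under stated assumptions rather than the team's actual deep-learning platform: it one-hot encodes an rna sequence, derives a crude gc-content feature (g-c base pairs are more thermally stable than a-u pairs), and applies a made-up linear score where the real system would apply a trained network. every function name and weight below is hypothetical.

```python
# toy stand-in for a sequence-to-thermal-stability predictor; the actual
# x-grant platform uses deep learning, and all weights here are made up.
import numpy as np

BASES = "ACGU"

def one_hot(seq: str) -> np.ndarray:
    """one-hot encode an rna sequence as a (length x 4) matrix."""
    idx = {b: i for i, b in enumerate(BASES)}
    mat = np.zeros((len(seq), len(BASES)))
    for pos, base in enumerate(seq.upper()):
        mat[pos, idx[base]] = 1.0
    return mat

def stability_features(seq: str) -> np.ndarray:
    """crude features plausibly related to thermal stability:
    gc fraction (g-c pairs are more stable) and sequence length."""
    oh = one_hot(seq)
    gc_fraction = oh[:, [1, 2]].sum() / len(seq)   # columns for C and G
    return np.array([gc_fraction, len(seq) / 1000.0])

def predict_stability(seq: str) -> float:
    """linear surrogate for a learned model: returns a notional
    melting-temperature-like score in degrees c (illustrative only)."""
    weights, bias = np.array([40.0, 5.0]), 50.0
    return float(stability_features(seq) @ weights + bias)

if __name__ == "__main__":
    for candidate in ["AUGGCGCGCUAA", "AUGAAAUUUUAA"]:
        print(candidate, "->", round(predict_stability(candidate), 1))
```

a real pipeline would replace the hand-picked features and weights with a network trained on measured stability data, but the overall shape (sequence in, stability score out) is the same.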
can we develop models of the cognitive behavior of human-machine collaboration? while this might seem like the stuff of science fiction, researchers at texas a&m university are currently developing algorithms that interpret situations close to how humans navigate through their daily lives.

for example, say you see something that resembles a rock in the road. do you keep driving or swerve to avoid it? the split-second choice may seem obvious, but it depends on many factors, from whether it is a rock or a stray turtle to how big the rock is and whether there are cars in the lane next to you. although human decision-making relies on many variables, these decisions are made in a split second from among various possible scenarios.

"humans pay attention to context; that's part of our intuition," said dr mark balas, leland jordan professor in the j mike walker '66 department of mechanical engineering and department of aerospace engineering at texas a&m university. "if you're on a high-speed highway and traffic is bunching ahead of you, a machine can slow your car down or speed up, but it's not doing anything intelligent. however, the human operator might make a different decision based on what they can see and their motives."

the researchers said this intuition might resemble how events in the atomic and subatomic realms occur, a process mathematically described by quantum mechanics. unlike other mathematical abstractions of decision-making that consider past events to decide the probability of a future outcome, quantum mechanical probabilities are based on multiple possibilities that are ever-changing and present simultaneously at the moment of action. when a decision is made, all of these fluctuating probabilities converge to a single value, called a quantum collapse. this type of math is used to describe the behavior of electrons, photons and other subatomic particles, and the researchers argued that it can be applied to human decision-making as well.

"at any given moment, humans experience many emotions simultaneously," said dr james hubbard, oscar wyatt professor in the department of mechanical engineering. "but if asked what they are feeling right now, they will generally state one emotion out of multiple possibilities. you could think of it as a quantum collapse into one emotional state."

based on this rationale, the researchers are investigating whether they can create an algorithm for semi-autonomous vehicles that would model how well-trained operators make decisions in conjunction with these vehicles. "we want to use quantum probability-based algorithms to address bigger issues, like the decisions needed when other drivers make unexpected moves," said balas. "in other words, we'd like our algorithms to mimic what experienced operators would decide to do in these instances, and we think that might be a novel way to approach the future of human collaboration with autonomous vehicles."

hubbard and balas are the co-investigators on this project. they host a weekly quantum cognition research seminar with their team, which they refer to as "the center for the hopelessly naïve," to discuss their ideas on quantifying and understanding the human decision-making process.
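to make the "collapse" idea concrete, here is a minimal, hypothetical sketch of a quantum-probability-style decision step, not the researchers' actual algorithm: several driving options coexist as complex amplitudes, context reweights them, and the born rule (probability = |amplitude|^2) resolves them into a single action. the options, the context weighting and the random seed are all illustrative assumptions.

```python
# toy illustration of quantum-style decision collapse; not the team's model.
import numpy as np

rng = np.random.default_rng(0)

def normalize(amplitudes: np.ndarray) -> np.ndarray:
    """scale amplitudes so the probabilities |a|^2 sum to 1."""
    return amplitudes / np.linalg.norm(amplitudes)

def collapse(amplitudes: np.ndarray, options: list) -> str:
    """resolve simultaneous possibilities into one decision (born rule)."""
    probs = np.abs(amplitudes) ** 2
    return str(rng.choice(options, p=probs))

options = ["keep_driving", "swerve", "brake"]

# initial superposition: all three actions are 'live' at once
state = normalize(np.array([1.0, 1.0, 1.0], dtype=complex))

# context (e.g. "the object looks like a turtle") reweights the amplitudes;
# a fuller model would use a unitary/contextual update evolving over time
context_boost = np.array([0.3, 1.5, 1.0])
state = normalize(state * context_boost)

print("probabilities:", np.round(np.abs(state) ** 2, 3))
print("decision:", collapse(state, options))
```

the key contrast with a classical model is that until the moment of action, all options carry live, interfering weight; the sampled "decision" is the collapse the researchers describe.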
commonplace pharmaceuticals such as ibuprofen can carry an inherent flaw in their atomic structure that pairs the active, beneficial ingredient with a potentially ineffective or even toxic counterpart. new research could hold the key to more easily isolating the good while removing the unwanted.

dr shoufeng lan, assistant professor in the j mike walker '66 department of mechanical engineering at texas a&m university, is leading a team investigating the use of electromagnetic control over the synthesis of chiral compounds at an atomic level, a process that could lead to a plethora of practical applications, including in the pharmaceutical industry. the team's research was recently published in the journal nature communications.

"mysteriously, all living organisms on the earth consist of only left-handed amino acids and right-handed sugars, but not their mirrored counterparts," lan said. "the phenomenon is the so-called homochirality of life, and it is the ultimate form of asymmetric synthesis."

lan used the example of a human hand to demonstrate the concept of chirality, noting that if you were to create a mirror image of your hand, it could not be perfectly superimposed over the original.

by identifying a successful method of using asymmetric synthesis to create new versions of structures for items like ibuprofen, lan said, better versions of generic pharmaceuticals with reduced toxicity could be created at a lower cost than is possible with the current purification process. to achieve success, however, the researchers will first need to overcome the practical need to implement this magnetic effect on asymmetric synthesis at room temperature. currently, the effect is relatively weak, even with a strong magnetic field or at a low temperature of minus 450 degrees, lan said.

the topic of addressing chirality was the basis of the 2001 nobel prize in chemistry, which recognized an approach that uses an existing chiral object, a catalyst molecule, to transfer chirality to the desired mirror-image form as the final product. "this nature communications paper demonstrated a giant atomic-scale magneto-chiral effect that is orders of magnitude stronger," lan said. "by applying this effect, it is arguably possible to master an asymmetric synthesis or asymmetric self-assembling."

lan said his team's research could prove revolutionary to the field by creating a new iteration of biomedical, chemical and pharmaceutical applications. for example, by asymmetrically synthesizing only the active component of racemic lexapro, the most common medication in the united states with more than 25 million prescriptions, the research might reduce the drug's side effects.

"we anticipate that our demonstration could lead to the creation of chiral seeds at the atomic scale," lan said. "upon them, we hope to transfer the chirality using cutting-edge technologies, such as a metal-organic framework, to create chiral materials from nanoscales to macroscales."
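the "good half versus unwanted half" framing above has a standard quantitative measure that the article does not spell out: enantiomeric excess (ee). for concentrations [R] and [S] of the two mirror-image forms,

$$ ee = \frac{\lvert [R] - [S] \rvert}{[R] + [S]} \times 100\% $$

a racemic (50/50) mixture has ee = 0%, while a perfectly asymmetric synthesis that produces only the active enantiomer approaches ee = 100%.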
the texas a&m engineering experiment station (tees) recently received a proposed five-year contract worth up to $24 million from the army research laboratory (arl) to conduct basic research in establishing a collaborative, distributed proving ground that will support autonomous vehicle research across various environments and domains at the george hw bush combat development complex (bcdc) on the texas a&m university system rellis campus.

the research will focus on developing virtual proving grounds that enable researchers to develop, test and demonstrate artificial intelligence (ai) and machine-learning (ml) algorithms for autonomous vehicles. visual, thermal, lidar and radar datasets will also be collected, annotated and curated in both real and virtual environments and used to evaluate ai/ml and autonomy algorithms in real and synthetic settings (a sketch of what one such multimodal data record might look like appears at the end of this story).

the distributed autonomous robotic experiments and simulations (dares) research project will be conducted in coordination and collaboration with arl researchers at the robotics research collaboration campus (r2c2) in graces quarters at the aberdeen proving ground in maryland.

"the distributed autonomous robotic experiments and simulations cooperative agreement between arl and the texas a&m system will foster the acceleration of fundamental research in autonomy, artificial intelligence and machine learning to transform the future of human-agent teaming," said arl program manager andrew ladas. "we are excited to partner with the texas a&m system and utilize their state-of-the-art campus, in addition to the lab's facilities and assets, to take this research to the next level and have them involved in the arl distributed virtual proving ground. we look forward to the partnership and enhancing the capabilities of our soldiers in the future operational environment."

dr srikanth saripalli, lead principal investigator for dares at texas a&m, said, "the ability to connect r2c2 with starlab at the rellis campus through the dares program enables us to rapidly test and validate autonomous vehicle capabilities at multiple locations simultaneously, which will accelerate the ability to incorporate research results into synthetic environments. this will improve the quality of virtual simulations and ultimately increase resilience in autonomous vehicle capabilities."

saripalli is a professor in the j mike walker '66 department of mechanical engineering at texas a&m university, and his dares research team consists of 20 faculty members from the mechanical, electrical, aerospace and computer science departments. dr james hubbard jr, oscar s wyatt jr '45 chair i professor and founder of the starlab in the mechanical engineering department, provided the vision for the dares project.

"dares will enable the establishment of a virtual proving ground on the rellis campus and is bolstered by a world-class team of experienced researchers. this is indeed an exciting and unique opportunity for texas a&m to deliver a high-value asset to the army and its stakeholders," said hubbard.

hubbard was a fellow of the hagler institute for advanced study at texas a&m and was brought to the university in 2018 through the governor's university research initiative and the texas a&m university system chancellor's research initiative.

"the combination of the governor's university research initiative alongside the chancellor's research initiative serves as a tremendous tool that places the texas a&m university system at a distinct advantage in identifying, recruiting, selecting and hiring the best of the best faculty and researchers to solve critical problems for our state and nation," said texas a&m system chancellor john sharp.
"we're honored to have guri/cri faculty members like dr hubbard on our team to help solve military modernization challenges."

research in the later years of this proposal will occur on the innovation proving ground (ipg) at the bcdc. the ipg will provide full instrumentation and 5g capabilities, along with the personnel and systems that will prove crucial in capturing and providing the data needed to support the dares project.

"this is a clear case where the combined vision of our faculty, the aggie values of service to our nation, and the facilities and expertise provided by the bcdc and the state of texas combined to produce this valuable research partnership with the army research lab," said ross t guieb, bcdc director. "we appreciate the opportunity to continue serving our current and future soldiers," he said.

as the texas a&m system becomes nationally recognized for leading academia-military modernization efforts, us senator john cornyn (r-tx) said, "our country's military readiness depends on innovation, and the army research lab is on the front line of that fight. this partnership with the texas a&m system will ensure we have the best and the brightest working to address rapidly evolving threats and maintain our strategic advantage around the world."
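as a closing illustration of the data-curation side of dares described above, the sketch below shows one plausible shape for a multimodal, annotated sensor record spanning the visual, thermal, lidar and radar streams the article mentions. all field names and the record layout are assumptions for illustration, not the project's actual schema.

```python
# hypothetical sketch of a multimodal, annotated sensor record of the kind
# a distributed proving-ground dataset might curate; every field name is an
# illustrative assumption, not the dares project's actual schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorFrame:
    timestamp: float                    # seconds since start of run
    environment: str                    # "real" or "synthetic"
    camera_path: Optional[str] = None   # visual imagery file
    thermal_path: Optional[str] = None  # thermal imagery file
    lidar_path: Optional[str] = None    # point-cloud file
    radar_path: Optional[str] = None    # radar-return file
    annotations: list = field(default_factory=list)  # labeled objects

# a synthetic-environment frame with one labeled vehicle
frame = SensorFrame(
    timestamp=0.0,
    environment="synthetic",
    camera_path="cam/000001.png",
    annotations=[{"label": "vehicle", "bbox": [120, 80, 260, 190]}],
)
print(frame.environment, len(frame.annotations))
```

tagging each record with its environment ("real" or "synthetic") is what would let the same evaluation code compare algorithm performance across the physical proving ground and its virtual counterpart.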