https://engineering.wustl.edu/news/Pages/Making-sense-pictures-of-medical-data-Alvitta-Ottley.aspx

Making sense, pictures of medical data

Improved visual communication with patients could lead to more informed health-care choices.

By Brandie Jefferson | Aug. 13, 2018

A woman goes to the doctor for a mammogram. The result comes back positive. "This doesn't necessarily mean you have cancer; false positives are common," her doctor might say. Maybe the patient is also given a pamphlet with some statistics about mortality and survival rates.

But the test did come back "positive," the patient thinks, so maybe the doctor is just trying to make her feel better. Maybe the patient doesn't understand the difference between mortality and survival rates and hasn't thought about statistics since a class in high school.

To help patients better understand their health data and the risks and benefits of treatment options, the National Science Foundation has awarded a $174,254 grant to Alvitta Ottley, assistant professor of computer science and engineering and assistant professor of psychology and brain sciences at Washington University in St. Louis.

"Lots of people are receiving test results and they don't understand them," said Ottley. "They have to understand procedures and their risks, and then there are false positives and false negatives. My job is to take this somewhat complex statistical information and present it in ways people can understand."

Ottley has worked on general visualization problems that ask how our individual psychology affects the way we receive information and make decisions. She has also built tools for facilitating communication between doctors and patients.

In this current project, however, the tools she is building are not for experts; they're for patients with no expertise in medicine or statistics.

To be sure, plenty of formats already exist for visually describing these things to laypeople. The most popular, according to Ottley, is the icon array. A single icon array can be used to indicate, for instance, rates of breast cancer, including people who were accurately tested as well as those who received false-negative results.

[Image: This icon array is intended to clarify the relationship between screenings and breast cancer. (Photo: Cancer Research UK)]

Such a graphic might use symbols representing people. In the example pictured, there are 10 rows by seven columns, with an additional five symbols set somewhat apart from the main grid. All 75 symbols represent people diagnosed with breast cancer.

Purple symbols represent those who died from the disease. Blue symbols, 50 in all, represent those who were treated and survived; 17 of them are outlined in pink to represent overdiagnosis, people whose cancers would not have been harmful if left untreated.

The five pink symbols that are set apart from the main grid represent people who would have died if not for the screening.

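To make that layout concrete, here is a minimal sketch of how such an icon array could be drawn with matplotlib. It is only an illustration, not Ottley's software; the per-color counts are inferred from the description above, and the styling choices are assumptions.

```python
import matplotlib.pyplot as plt

# Counts inferred from the example above: 75 people in total, 20 who died (purple),
# 33 treated and survived (blue), 17 overdiagnosed (blue with a pink outline),
# plus 5 set apart whose deaths were averted by screening (pink).
counts = {"died": 20, "survived": 33, "overdiagnosed": 17}
face = {"died": "purple", "survived": "tab:blue", "overdiagnosed": "tab:blue"}
edge = {"died": "none", "survived": "none", "overdiagnosed": "pink"}

fig, ax = plt.subplots(figsize=(4, 6))
cells = [kind for kind, n in counts.items() for _ in range(n)]
for i, kind in enumerate(cells):            # fill the 10-row by 7-column grid
    row, col = divmod(i, 7)
    ax.scatter(col, -row, s=200, c=face[kind], edgecolors=edge[kind], linewidths=2)
for col in range(5):                        # the five symbols set apart from the grid
    ax.scatter(col, -11.5, s=200, c="pink")
ax.set_axis_off()
ax.set_title("75 people diagnosed with breast cancer")
plt.show()
```
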
"These can be hard to understand," said Ottley of icon arrays. "It is especially confusing to someone with no statistical training or numerical skills. We're trying to figure out the best way to represent this."

Her lab is not just looking at how to render easy-to-understand images, but also at whether images themselves are truly the best way to represent data.

Even if a picture is worth a thousand words, Ottley wondered, would it make things even clearer if text were added to visualizations?

Her past work indicated that it would not. "What we've found is that if I give you text alone, you're not really good at understanding it. If I give you data visualization alone, you're just a little bit better. But if I give you both, it's completely confusing," Ottley said. "You have to read the text and understand that, and then try to figure out the visualization, and then determine how the two relate."

Complicating the issue, Ottley also found that measures of "spatial ability" can determine a person's success in reasoning with medical statistics.

To get to the bottom of the question, members of Ottley's lab will be looking at approaches that have been successful in decreasing cognitive load (the amount of information a person needs to hold in working memory) and using those approaches to design easier-to-understand visualizations.

Using eye tracking, the group will determine the order in which people absorb information, and how well they understand it.

"We will look at what order leads to successful decision making," Ottley said. "Perhaps everyone who makes the right decision (assuming there is a right decision) consumes the information in a specific order. If we can identify successful strategies and pathways, we can redesign visual representations so that people are more likely to use these pathways."

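As a rough illustration of what that kind of analysis could look like (a hypothetical sketch, not the lab's actual pipeline; the area-of-interest names and data are invented), a first pass might reduce each participant's eye-tracking fixation sequence to the order in which chart regions are first viewed, then compare orders between people who did and did not reach the right decision:

```python
from collections import Counter

def first_visit_order(fixations):
    """Reduce a fixation sequence to the order in which each area is first viewed."""
    seen = []
    for area in fixations:
        if area not in seen:
            seen.append(area)
    return tuple(seen)

# Invented example data: which parts of a visualization each participant looked at,
# in order, and whether they ultimately made the "right" decision.
participants = [
    {"fixations": ["text", "icons", "legend", "icons"], "correct": True},
    {"fixations": ["icons", "text", "icons", "legend"], "correct": True},
    {"fixations": ["text", "legend", "text"],           "correct": False},
]

orders = Counter(first_visit_order(p["fixations"]) for p in participants if p["correct"])
print(orders.most_common())   # viewing orders that co-occur with good decisions
```
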
The lab will also look at decision biases: why some people may be prone to taking action because of, or in spite of, the information they've received.

Ottley understands that it's more than statistics that lead a patient to choose or forgo medical treatment. There is a large emotional component as well. "If a doctor says, 'you have cancer,' some people might just say, 'OK, let's do this,' and opt for treatment without a second thought," she said.

Then there are some people who would like to know as much as they can. "Why not give these people the options and the tools to really, truly understand the data?" asked Ottley. The decisions can be difficult, and to some, providing all of this information might seem like placing the burden on an already-anxious patient.

Ottley sees it differently, however, and hopes to make it easier for others to see not just the data but the potential benefits of improved visualization. Instead of burdening patients, she said, "improved data visualization could empower them."

This research is supported by the National Science Foundation, grant #1755734 (https://www.nsf.gov/awardsearch/showAward?AWD_ID=1755734&HistoricalAwards=false).

Alvitta Ottley
- Assistant Professor of Computer Science & Engineering
- Expertise: information visualization, human-computer interaction and visual analytics

https://engineering.wustl.edu/news/Pages/Building-the-backbone-of-a-smarter-smart-home.aspx

Building the backbone of a smarter smart home

The right algorithms can make smart homes smarter, more efficient, and more communicative.

By Brandie Jefferson | Aug. 1, 2018

[Image: Elements of a smart home]

The state of artificial intelligence (AI) in smart homes nowadays might be likened to a smart but moody teenager: It's starting to hit its stride and discover its talents. But it doesn't really feel like answering any questions about what it's up to and would really rather be left alone, OK?

William Yeoh, assistant professor of computer science & engineering in the School of Engineering & Applied Science at Washington University in St. Louis, is working to help smart home AI grow up.

The National Science Foundation awarded Yeoh a $300,000 grant to assist in developing smart home AI algorithms that can determine what a user wants, by both asking questions and making smart guesses, and then plan and schedule accordingly. Beyond being smart, the system needs to be able to communicate: to explain to the user why it is proposing the schedule it proposes.

These challenges rely heavily on communication. And, like that moody teen, AI does not currently count communication among its strengths.

"So far, a lot of AI is successful in isolation," Yeoh said.

Feats such as winning a game of Go or recognizing faces do not require significant user interaction; a computer can do these tasks mostly on its own. "But if AI is going to help people," he said, "interaction with people is pretty important."

The first step in this project is figuring out what the user actually wants: the temperature to be 70 degrees when she gets home, the car fully charged by 8 a.m., and so on. The system must also decipher what the user doesn't want: the air conditioner set at full blast all day long, or the car charging during peak hours.

An AI system could meticulously and continuously ask a user for every one of her preferences. But because that is not possible (or would be, at the least, extremely annoying), the research sets out to determine, "Without bothering users too much, how many questions should we ask them and what should those questions be?" Yeoh noted in his grant application. The reason for asking: to make the smartest possible decisions when the user has not supplied specific information.

The algorithm that determines which questions to ask will be restricted in how many questions it can ask, so it will have to decide which questions have answers that will yield the most useful information.

Once the parameters and constraints are set, and the system has all of the information it's going to have, it devises a schedule that honors the user's preferences and maximizes the user's comfort while minimizing energy use.

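As a toy illustration of that kind of scheduling decision (a minimal sketch under invented prices and deadlines, not Yeoh's algorithm), consider choosing the cheapest off-peak hours to charge the car before the user needs it:

```python
def schedule_charging(prices_by_hour, hours_needed, deadline_hour):
    """Return the cheapest set of hours (before the deadline) to charge the car."""
    candidates = sorted((price, hour) for hour, price in prices_by_hour.items()
                        if hour < deadline_hour)          # cheapest hours first
    if len(candidates) < hours_needed:
        raise ValueError("not enough hours before the deadline")
    return sorted(hour for _, hour in candidates[:hours_needed])

# Example: overnight electricity prices in cents/kWh; the car needs 3 hours of
# charging to be ready by 8 a.m.
prices = {0: 9, 1: 8, 2: 8, 3: 10, 4: 12, 5: 18, 6: 25, 7: 30}
print(schedule_charging(prices, hours_needed=3, deadline_hour=8))  # -> [0, 1, 2]
```

A real system would have to weigh many such devices and comfort terms at once, under preferences it may only be able to guess.
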
The system's job won't be done, however, once the parameters are set and the scheduling is finished; it still needs to work on its communication skills.

Communication is a two-way street. Not only does Yeoh want to develop smart home systems that a user can tell what to do; he also thinks the system should be able to explain itself to the user.

For several reasons, "the system needs to be able to explain to the user 'why,'" Yeoh said. "The field has been working on getting good answers from AI. Now it's time to get good explanations."

Say a user rarely goes into the basement, but she heads down one night to get something out of storage. All of a sudden, the lights go out. Without being able to ask the system why, she may be left wondering: "Is there a power outage? Is someone here? Is my house out to get me?"

If she could simply ask the system what happened, however, she would learn that, in an attempt to save energy, the system has been turning off the basement lights every day after a certain time, because (via sensors, perhaps) it knows she rarely goes downstairs in the evening.

Yeoh's project is developing the tools to give users the ability to ask those questions, vocally or through a visual interface designed to give them access to a host of information, from details about specific devices to a broader view of energy consumption.

Having access to that information will allow the user to tailor her settings more narrowly, working with the system instead of simply having to accept its choices. After all, there are lots of variables for a smart home AI system to take into account if it's monitoring the temperature, locking the doors, charging a vehicle, preheating the oven and monitoring a security system, all while trying to conserve energy.

"Energy prices, weather, the status of the devices," Yeoh said of all the changing conditions that the AI will need to monitor. "And humans, the most fickle variable of all."

This research is supported by the National Science Foundation, grant #1838364.

William Yeoh
- Assistant Professor of Computer Science & Engineering
- Expertise: artificial intelligence, with an emphasis on developing optimization algorithms for agent-based systems

https://engineering.wustl.edu/news/Pages/In-the-media-Ubers-self-driving-cars-are-back.aspx

In the media: Uber's self-driving cars are back. Well, sort of.

By Sasha Lekach, Mashable | July 26, 2018

Uber's self-driving cars are back on the road in Pittsburgh, but with a lot of changes.

In the four months since a self-driving Uber hit and killed a woman walking a bicycle on an Arizona road, the ride-hailing company has made some changes. The first and foremost is keeping the vehicles in manual mode.

After 49-year-old Elaine Herzberg was fatally hit in Tempe, Arizona, all of Uber's self-driving programs were shut down. In Arizona, those operations were shuttered permanently. In San Francisco, Uber let its testing permit lapse. Now, after police and federal investigations found that driver distraction and glitches with Uber's software were factors in the crash, Uber is ready to resume testing, but with a much different program.

In a blog post this week (https://medium.com/@UberATG/self-driving-cars-return-to-pittsburgh-roads-in-manual-mode-f83e506a04b9), Uber's head of advanced technologies, Eric Meyhofer, laid out how the self-driving program was reshaped by a "top-to-bottom review" with a "focus on safety."

The new self-driving program has two "mission specialists" in the front seats. Before the fatal crash there was only one operator, who sat in the driver's seat.

For now the cars are only in manual mode, meaning they're driven by people just like any other car on the road, even if they look like self-driving vehicles. But it's not a total waste of time: The miles in manual mode will give Uber data about real-time situations and what it's like out on the road. It's also a boon for Uber's mapping efforts, which will be a critical tool for autonomous driving.

The test drivers are going back to driving basics, with courses in defensive and distracted driving and a more "rigorous training" program before operating an autonomous car.

A big change is in the screens. Uber said it reviewed the front-facing tablet and changed the interface to have fewer distracting features. Police found that driver Rafaela Vasquez was streaming Hulu on a cellphone up until the crash.

Sanjoy Baruah, an engineering professor at Washington University in St. Louis, said in a call that Uber's revamped program is the right way for the company to proceed.

Before, Uber jumped in too quickly, without first building a safe program. "If you want to introduce a new technology you have to spend some time babysitting the technology, like Uber is proposing now," Baruah said.

Professor Sanjoy Baruah weighs in on Uber's decision to resume self-driving car testing. Read the full article on mashable.com: https://mashable.com/2018/07/25/uber-self-driving-resuming-manual-mode/

https://engineering.wustl.edu/news/Pages/New-faculty-join-School-of-Engineering--Applied-Science-.aspx

New faculty join School of Engineering & Applied Science

By Beth Miller | July 18, 2018

[Image: Green Hall]

A diverse group of new faculty joins the School of Engineering & Applied Science at Washington University in St. Louis, bringing the total number of faculty to 96.5 during the 2018-2019 academic year.

"Each year, we compete with the very best engineering schools to recruit extraordinary faculty members," said Aaron F. Bobick, dean and the James M. McKelvey Professor. "This new cohort is incredibly talented, and we are excited about the new research areas these faculty will bring, as well as the knowledge and experience they bring to our students."

Biomedical Engineering

Princess Imoukhuede, associate professor
- PhD, bioengineering, California Institute of Technology
- SB, chemical engineering, Massachusetts Institute of Technology

Imoukhuede joins BME from the University of Illinois at Urbana-Champaign, where she has been an assistant professor in the Department of Bioengineering. Previously, she was a postdoctoral fellow in biomedical engineering at Johns Hopkins University School of Medicine. She has earned numerous awards, including a 2017 NSF CAREER Award and a 2018 IMSA Distinguished Leadership Award.

Imoukhuede's research examines mechanisms regulating angiogenic signaling, with a focus on the tyrosine kinase receptors VEGFRs and PDGFRs. She pioneers both quantitative biological measurements and computational biological models to delineate ligand-receptor binding, receptor and effector phosphorylation, and hallmarks of sprouting angiogenesis (cell proliferation and migration). This bottom-up systems biology paradigm offers mechanistic insight toward directing vascular signaling, with translational implications for cancers and cardiovascular diseases.

Abhinav Jha, assistant professor
- PhD, optical sciences, University of Arizona
- MS, electrical engineering, University of Arizona
- BTech, electronics engineering, Motilal Nehru National Institute of Technology, Allahabad, India

Jha joins BME and the Department of Radiology at the School of Medicine from Johns Hopkins School of Medicine, where he had been an instructor in the Division of Medical Imaging Physics, Department of Radiology and Radiological Science, since 2015. Previously, he was a research fellow at Johns Hopkins University.

Jha's research interests are in the design, optimization and evaluation of medical imaging systems and algorithms using statistical, task-based quantitative image-science approaches. He has devised novel theoretical and computational methods for objective assessment of image quality (OAIQ), simulation of imaging systems, and image reconstruction and analysis.
His research has had several clinical and preclinical impacts; for example, he was among the first to demonstrate the impact of task-specific imaging in improving diffuse optical imaging and diffusion MRI. A major current focus is improving clinical quantitative imaging using a combination of physics-based and machine-learning-based methods.

Jai Rudra, assistant professor
- PhD, biomedical engineering, Louisiana Tech University
- BTech, electronics and instrumentation engineering, Jawaharlal Nehru Technological University, Hyderabad, India

Rudra joins BME from the University of Texas Medical Branch in Galveston, where he has been an assistant professor in the Department of Pharmacology and Toxicology. Previously, he was a postdoctoral fellow in the Department of Surgery at the University of Chicago.

At the University of Texas, he has been a member of the Sealy Center for Vaccine Development, the Center for Addiction Research and the Human Pathophysiology and Translational Research Graduate Program. His research interests are in the design and synthesis of amyloid-inspired supramolecular biomaterials for applications in vaccine development and immunotherapy.

Computer Science & Engineering

Yevgeniy (Eugene) Vorobeychik, associate professor
- PhD, MSE, computer science & engineering, University of Michigan
- BS, computer engineering, Northwestern University

Vorobeychik joins CSE from Vanderbilt University, where he has been an assistant professor of computer science and computer engineering since 2013 and an assistant professor of biomedical informatics at Vanderbilt's Medical Center since 2016. Previously, he was a principal member of technical staff at Sandia National Laboratories and a postdoctoral researcher at the University of Pennsylvania.

His research interests include algorithmic and behavioral game theory, game-theoretic modeling of security, electronic commerce, simulation analysis, social and economic network analysis, optimization, complex systems, multi-agent systems and machine learning.

Miaomiao Zhang, assistant professor
- PhD, computing, University of Utah
- MS, computer science, East China Normal University, Shanghai
- BS, computer science, Henan Normal University, Henan, China

Zhang joins CSE from Lehigh University, where she has been an assistant professor of computer science and engineering. Previously, she was a postdoctoral associate in electrical engineering and computer science at the Massachusetts Institute of Technology.

Her research interests are in image analysis, machine learning, statistical modeling and computer vision.
Specifically, she is interested in developing fast and robust deformable image registration methods for real-time, image-guided neurosurgery; analyzing anatomical shape changes to study neurodegenerative diseases such as Alzheimer's disease and devising efficient, clinical trial-oriented software packages; and leading deep learning research for effective image segmentation and classification, such as tumor identification.

Ning Zhang, assistant professor
- PhD, computer science and applications, Virginia Polytechnic Institute and State University
- MS, system engineering, Worcester Polytechnic Institute
- BS, MS, computer science, University of Massachusetts, Amherst

Zhang joins CSE from Raytheon, where he has been a principal cyber engineer and technical lead at Cyber Security Innovations since 2007. In addition, he is an adjunct assistant professor in computer science at Virginia Tech.

Zhang's research focus is system security, which lies at the intersection of security, embedded systems, computer architecture and software. At Raytheon, he has worked to protect cyber-physical military systems and critical infrastructure.

Energy, Environmental & Chemical Engineering

Fangqiong Ling, assistant professor
- PhD, MS, environmental engineering, University of Illinois at Urbana-Champaign
- BS, environmental engineering, Tsinghua University, Beijing

Ling joins EECE from the Massachusetts Institute of Technology, where she has been a postdoctoral associate in the Department of Biological Engineering. She received an Alfred P. Sloan Foundation Microbiology of the Built Environment Postdoctoral Fellowship.

Ling's research has employed genomics, machine learning and ecological theory to study microbial diversity and community assembly in aquatic ecosystems at the interface of natural and built environments, such as water infrastructure and aquifers. During her postdoc, she developed new genomic metrics for population census based on human microbiome data. She will lead a computational and experimental lab focused on understanding the principles underlying the biodiversity, functioning and resilience of microbial ecosystems relevant to sustainability and health, and on developing methods to enable ecologically informed engineering designs.

Jian Wang, professor
- PhD, MS, chemical engineering, California Institute of Technology
- BS, physical chemistry, University of Science and Technology of China

Wang joins EECE from Brookhaven National Laboratory, where he has been a scientist with tenure since 2010. He joined Brookhaven in 2002 as the Goldhaber Distinguished Fellow.
He also was an affiliate faculty member in the School of Marine and Atmospheric Sciences at Stony Brook University from 2005 to 2008 and was a visiting scientist at the Max Planck Institute for Chemistry in the summer of 2016. He holds four U.S. patents.

Wang's research focuses on the processes that drive the properties and evolution of atmospheric aerosols and on the interactions between aerosols and clouds. His current research topics include aerosol properties and processes under natural conditions that were prevalent during the pre-industrial era; nucleation and new particle formation; aerosols in the marine environment; effects of aerosols on cloud microphysical properties and macrophysical structure; and development of advanced aerosol instruments focused on aircraft-based deployments.

Mechanical Engineering & Materials Science

Jianjun Guan, professor
- PhD, chemistry, Zhejiang University, Hangzhou, China
- BS, MS, polymer science and engineering, Qingdao University of Science and Technology, China

Guan comes to MEMS from The Ohio State University, where he has been a professor of materials science and engineering. He joined Ohio State in 2007 after serving as a research assistant professor at the McGowan Institute for Regenerative Medicine at the University of Pittsburgh, where he was also a postdoctoral fellow and research associate.

Guan's research interests are in biomimetic biomaterials synthesis and scaffold fabrication; bioinspired modification of biomaterials; injectable and highly flexible hydrogels; bioimageable polymers for MRI and EPR imaging and oxygen sensing; mathematical modeling of scaffold structural and mechanical properties; stem cell differentiation; neural stem cell transplantation for brain tissue regeneration; bone tissue engineering; and cardiovascular tissue engineering.

Faculty by department
- Biomedical Engineering: https://bme.wustl.edu/faculty/Pages/default.aspx
- Computer Science & Engineering: https://cse.wustl.edu/faculty/Pages/default.aspx
- Electrical & Systems Engineering: https://ese.wustl.edu/faculty/Pages/default.aspx
- Energy, Environmental & Chemical Engineering: https://eece.wustl.edu/faculty/Pages/default.aspx
- Mechanical Engineering & Materials Science: https://mems.wustl.edu/faculty/Pages/default.aspx

https://engineering.wustl.edu/news/Pages/Matteucci-and-Neves-awarded-Elite-90.aspx

Matteucci and Neves awarded Elite 90

WashU Bears | July 10, 2018 | http://www.washubears.com/general/2018-19/releases/20180710f87mdg

Sophomore Nick Matteucci of the Washington University in St. Louis men's outdoor track and field team and sophomore Bernardo Neves of the men's tennis team were the recipients of the Elite 90 Award at their respective national championship banquets.

Matteucci has a 4.0 grade point average while studying chemical engineering. He is the first WashU men's track & field student-athlete in school history to earn the award.

Neves has a 3.99 grade point average while majoring in mechanical engineering and computer science. He is the third WashU men's tennis student-athlete to earn the award, joining Isaac Stein (2010) and Tim Noack (2013).

WashU has an NCAA Division III-leading 11 selections since the program's inception in 2009-10.

The Elite 90, an award founded by the NCAA, recognizes the true essence of the student-athlete by honoring the individual who has reached the pinnacle of competition at the national championship level in his or her sport while also achieving the highest academic standard among his or her peers. The Elite 90 is presented to the student-athlete with the highest cumulative grade point average participating at the finals site for each of the NCAA's 90 championships.
