Mathematics & Statistics Seminars
Northern Arizona University

Fall 2021 Department Colloquium

The talks will typically take place on Tuesdays at 4:00-5:00pm in Adel Room 164. Please contact Nandor Sieben with questions about the colloquium.


Tuesday 8/31 at 4:00-4:20

Speaker: None

Title: Organizational meeting

Abstract: Please attend or email Nandor Sieben before the meeting if you or your guest would like to give a talk this semester.


Tuesday 9/7 at 4:00-4:50

Speaker: Mitch Hamidi (Embry-Riddle Aeronautical University)

Title: Topological Dynamics: An Operator Algebraic Approach

Abstract: An operator algebra is an algebra of bounded linear operators acting on a Hilbert space that is closed in a certain norm topology. When that algebra is closed with respect to the adjoint operation (an abstract conjugate transpose), we call it a $C^*$-algebra. The prototypical examples of $C^*$-algebras include the ring of $ n \times n $ matrices over the complex numbers and the ring of complex-valued continuous functions on a compact Hausdorff space. The latter example gives an algebraic perspective for studying topological dynamics. In particular, one can build an operator algebra called a crossed product that encodes the dynamical information of a group of homeomorphisms acting on a topological space.

In the 1960s, W. Arveson determined that the action of a homeomorphism on a topological space is better encoded in a crossed product via the action of a semigroup on that space, rather than a group, which led to many important results in operator algebra theory.

I will discuss how and why operator algebraists have been returning to crossed products in the context of groups acting on non-adjoint-closed operator algebras, and I will discuss a recent partial solution to the question of when dynamics are fully encoded in this crossed product context.


Tuesday 9/14 at 4:00-4:50

Speaker: Ye Chen

Title: Impacts of vaccination, Alpha and Delta variants on COVID-19 transmission dynamics in the 15 most populous metropolitan statistical areas in the United States

Abstract: To predict the future course of the COVID-19 pandemic in each of the 15 most populous metropolitan statistical areas (MSAs) in the US, we extended the SIR model to a comprehensive dynamical system with 42 ODEs accounting for multiple periods of social distancing, the vaccination rate, the transmissibility of the Alpha and Delta variants, and vaccine effectiveness against different variants. For each MSA, we used an adaptive MCMC approach to generate samples of the posterior distribution for the adjustable parameters and performed uncertainty quantification. In the process, we obtained estimates of the relative infectiousness of Alpha and Delta as well as their takeoff times in each MSA. With daily reports of new COVID-19 cases available from January 21, 2020 to August 24, 2021, the region-specific models predict that the current surge will plateau in September 2021 for most cities. Model parameterizations and predictions are being updated daily as new surveillance and vaccination data become available; prediction results can be found on the following GitHub site.
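The talk's model has 42 coupled ODEs; as context, here is a minimal sketch of the SIR core it extends, with a constant vaccination term added. All parameter values and the simple forward-Euler integration are hypothetical choices for illustration, not the talk's actual model.

```python
# Minimal SIR sketch with a constant vaccination rate nu.  Compartments are
# fractions of the population; all parameter values here are illustrative.

def sir_step(s, i, r, beta, gamma, nu, dt):
    """One forward-Euler step of:
       dS/dt = -beta*S*I - nu*S
       dI/dt =  beta*S*I - gamma*I
       dR/dt =  gamma*I + nu*S
    """
    ds = (-beta * s * i - nu * s) * dt
    di = (beta * s * i - gamma * i) * dt
    dr = (gamma * i + nu * s) * dt
    return s + ds, i + di, r + dr

def simulate(days=180, dt=0.1, beta=0.3, gamma=0.1, nu=0.002):
    s, i, r = 0.99, 0.01, 0.0   # initial susceptible, infected, recovered fractions
    peak_i = i
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, nu, dt)
        peak_i = max(peak_i, i)
    return s, i, r, peak_i

if __name__ == "__main__":
    s, i, r, peak = simulate()
    print(f"final S={s:.3f} I={i:.3f} R={r:.3f}, peak infected fraction={peak:.3f}")
```

Note that the three derivative terms cancel in pairs, so S + I + R is conserved at every step, a useful sanity check for any compartmental extension.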


Tuesday 9/21 at 4:00-4:50

Speaker: Jim Swift

Title: BLIS: A Bifurcation Lemma for Invariant Subspaces

Abstract: The Equivariant Branching Lemma (EBL) is an important tool in equivariant (symmetric) bifurcation theory. The study of coupled cell networks is a hot topic in dynamical systems. It has been noticed in recent years that the structure of networks can cause invariant subspaces in the dynamics that are not caused by symmetry. As a result, the EBL does not describe some of the observed bifurcations. We present a generalization of the EBL that predicts the branching of solutions in a wide variety of situations, and lends itself to implementation as a numerical algorithm. We call this a Bifurcation Lemma for Invariant Subspaces (BLIS).


Tuesday 9/28 at 4:00-4:50

Speaker: Rachel Neville

Title: Topological Techniques for Characterizing Patterns

Abstract: Examples of complex spatiotemporal patterns are ubiquitous. Irregular time-varying structures, the complexity of patterns, and sensitivity to initial conditions, among other things, can make quantifying and distinguishing patterns difficult. In recent years, topological data analysis (TDA) has emerged with a promising set of tools for characterizing such systems. These techniques provide a low-dimensional summary of the geometric and topological structure of data, which can be used to quantify order, to learn and study parameters, and to track the evolution of pattern defects.

In this talk, I will give a brief introduction to persistent homology and discuss how persistence can be leveraged to study pattern-forming systems. In particular, I will highlight the utility of these techniques in studying the formation of disordered hexagonal arrays of nanodots and the crystalline structures that emerge on ion-bombarded surfaces.


Tuesday 10/5 at 4:00-4:50

Speaker: Dana Ernst

Title: Isomorphisms in mathematics

Abstract: Loosely speaking, an isomorphism is a structure-preserving mapping between two structures of the same type that can be reversed by an inverse mapping that also preserves the structure. Sometimes we get the inverse mapping for free and sometimes we don’t. Two mathematical structures are isomorphic if an isomorphism exists between them. In this talk, we will tinker with examples of isomorphisms in a variety of contexts and explore unifying themes. This talk will be appropriate for undergraduate mathematics majors and mathematics graduate students.


Tuesday 10/12 at 4:00-4:50

Speaker: Angie Hodge-Zickerman

Title: Using technology to think creatively about engaging mathematics learners

Abstract: Remote learning brought many challenges in engaging learners in the virtual mathematics classroom. What did we learn from using technology in the virtual classroom that can benefit any mathematics classroom? In this session, I will share some technologies that can be used to enhance the classroom, be it virtual or in person. Please bring a laptop or a personal device if possible.


Tuesday 10/19 at 4:00-4:50

Speaker: Ryan Blackburn (student of Robert Buscaglia)

Title: Modern tools require modern statistics: Remote sensing, ridge regression and Boruta feature selection

Abstract: Current and historic land practices (e.g., monoculture, livestock, logging, fire suppression) have drastically altered the conditions of ecosystems within the United States. Climate change exacerbates this by increasing the severity and scale of threats such as drought, beetle outbreaks, wildfires, and species loss. As a result, ecological restoration has been attempting to return ecological communities to a more resilient state. With restoration growing across different environments and scales, it is important to understand the efficacy and impacts of these projects. Different remote sensing techniques provide the opportunity to quantify conditions at scales relevant to each project or ecosystem type. Common issues arise from limited sample sizes, correlated predictors, and the uncertainty of new techniques. Here, we present two studies using different statistical methods to overcome these problems. The first study investigated the use of drone-based multispectral data in predicting 33 ecological characteristics across 19 samples in an Illinois tallgrass prairie. Using ridge regression, we tested for significant reductions in 10x5 cross-validated error between different drone-based models and a null model, which allowed us to account for the low sample size and correlated predictors. In the second study, we compared a newer approach to light detection and ranging (lidar) data analysis with previously established approaches for predicting tree basal area, volume, and aboveground biomass in the Southwest. We used Boruta feature selection and random forests, which allowed us to account for correlated predictors and nonlinear relationships. We counted the variables from each approach retained during variable selection and tested for significant reductions in 10x5 cross-validated error between lidar approaches. Overall, these two studies are examples of how a combination of modern statistical techniques can be used to improve predictions of ecological characteristics from remote sensing data.
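For readers unfamiliar with ridge regression, here is a toy single-predictor version illustrating the shrinkage idea behind the first study. The real analysis used many correlated multispectral predictors and 10x5 cross-validation; this sketch, with made-up data, only shows how the penalty shrinks the slope estimate toward zero.

```python
# Toy single-predictor ridge regression on centered data.  The data and the
# penalty values are hypothetical, chosen only to illustrate shrinkage.

def ridge_slope(x, y, lam):
    """Closed-form ridge slope: b = sum(x*y) / (sum(x^2) + lam)."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-4.1, -1.9, 0.2, 2.1, 3.9]   # roughly y = 2x

print(ridge_slope(x, y, 0.0))   # lam = 0: ordinary least squares slope, near 2
print(ridge_slope(x, y, 10.0))  # lam = 10: penalized slope, shrunk toward zero
```

In practice the penalty lam is chosen by cross-validation, which is where the study's 10x5 cross-validated error comparisons come in.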


Tuesday 10/26 at 4:00-4:50

Speaker: Brent Burch and Jesse Egbert

Title: Zero-inflated beta distribution applied to word frequency and lexical dispersion in corpus linguistics

Abstract: Corpus linguistics is the study of language as expressed in a body of texts or documents. The relative frequency of a word within a text and the dispersion of the word across the collection of texts provide information about the word’s prominence and diffusion, respectively. The zero-inflated beta distribution enables one to model the relative frequency of a word in a text since some texts may not even contain the word under study. We discuss the expected values of word prominence and dispersion under the zero-inflated beta model. In addition, we compute estimates of a word’s prominence and dispersion for words in the British National Corpus, a 100-million-word collection of written and spoken British English.
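One hedged reading of the model in the abstract: a word is absent from a text with probability pi, and its relative frequency in texts that do contain it follows a Beta(a, b) distribution. Under that reading, the overall mean is a simple product; the parameter values below are hypothetical, not estimates from the British National Corpus.

```python
# Mean of a zero-inflated beta distribution: X = 0 with probability pi,
# otherwise X ~ Beta(a, b).  All parameter values below are illustrative.

def zib_mean(pi, a, b):
    """E[X] = (1 - pi) * a / (a + b)."""
    return (1.0 - pi) * a / (a + b)

# A word missing from 40% of texts, with Beta(2, 98) relative frequencies
# (mean 0.02) in the texts that contain it:
print(zib_mean(0.4, 2.0, 98.0))  # overall expected relative frequency, 0.012
```

The zero-inflation probability pi captures dispersion (how many texts lack the word), while the beta component captures prominence within the texts that use it.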


Tuesday 11/2 at 4:00-4:50

Speaker: Gina Nabours and Marietta E Fule

Title: Summer Bridge Programs and the Department of Mathematics and Statistics

Abstract: While many faculty may choose to take some much-needed rest and relaxation over the summer, department staff and administration are busy running or supporting five different summer bridge programs meant to increase student success in mathematics at the university level. Learn about the ins and outs of these programs to better understand the department’s efforts in helping students achieve their academic goals at NAU.


Tuesday 11/9 at 4:00-4:50

Speaker: Toby D Hocking joint work with Jonathan Hillman

Title: Optimizing ROC Curves with a Sort-Based Surrogate Loss Function for Binary Classification and Changepoint Detection

Abstract: Receiver Operating Characteristic (ROC) curves are plots of true positive rate versus false positive rate which are useful for evaluating binary classification models, but difficult to use for learning since the Area Under the Curve (AUC) is non-convex. ROC curves can also be used in other problems that have false positive and true positive rates such as changepoint detection. We show that in this more general context, the ROC curve can have loops, points with highly sub-optimal error rates, and AUC greater than one. This observation motivates a new optimization objective: rather than maximizing the AUC, we would like a monotonic ROC curve with AUC=1 that avoids points with large values for Min(FP,FN). We propose a convex relaxation of this objective that results in a new surrogate loss function called the AUM, short for Area Under Min(FP, FN). Whereas previous loss functions are based on summing over all labeled examples or pairs, the AUM requires a sort and a sum over the sequence of points on the ROC curve. We show that AUM directional derivatives can be efficiently computed and used in a gradient descent learning algorithm. In our empirical study of supervised binary classification and changepoint detection problems, we show that our new AUM minimization learning algorithm results in improved AUC and comparable speed relative to previous baselines.
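The abstract's construction can be sketched for binary classification: sort the predicted scores, sweep the decision threshold across them, and sum min(FP, FN) weighted by the gap between consecutive scores. This follows the description above; the published AUM may differ in details, and the toy data are invented.

```python
# Hedged sketch of an Area Under Min(FP, FN) computation for binary labels.

def aum(scores, labels):
    """labels are 0/1; returns the sum over score gaps of min(FP, FN) * gap."""
    pairs = sorted(zip(scores, labels))   # ascending by predicted score
    total = 0.0
    fp = labels.count(0)   # threshold below all scores: every negative is a FP
    fn = 0                 # ...and no positive is missed yet
    for k in range(len(pairs) - 1):
        score, label = pairs[k]
        # raising the threshold past this score flips its prediction to negative
        if label == 1:
            fn += 1
        else:
            fp -= 1
        gap = pairs[k + 1][0] - score
        total += min(fp, fn) * gap
    return total

print(aum([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1]))  # perfectly ranked: 0.0
print(aum([0.1, 0.9], [1, 0]))                  # misranked pair: positive area
```

Perfectly ranked scores give AUM 0, and each misranking contributes area, which is what makes the quantity attractive as a surrogate objective for gradient descent.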


Tuesday 11/16 at 4:00-4:50

Speaker: Shafiu Jibrin

Title: Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities

Abstract: We study the problem of computing the weighted analytic center of linear matrix inequality constraints. In this talk, we apply conjugate gradient (CG) methods to find the weighted analytic center. Conjugate gradient methods are known to have low memory requirements and strong local and global convergence properties. The methods considered are the classical methods of Hestenes and Stiefel (HS), Fletcher and Reeves (FR), and Polak and Ribiere (PRP), and a relatively new method by Rivaie, Abashar, Mustafa and Ismail (RAMI). We compare the performance of each method on random test problems by observing the number of iterations and the time required to find the weighted analytic center. We use an exact line search based on Newton’s method and an inexact line search based on quadratic interpolation. Our numerical results show that PRP is the best method, followed by HS, then RAMI, and then FR. They also indicate that both line searches work well, but the exact line search handles weights better than the inexact line search when some weight is much larger than the other weights. We find that all the CG methods with the quadratic-interpolation inexact line search failed when some weight is much larger than the remaining weights. [slides]
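As background for the FR update mentioned above, here is a small Fletcher-Reeves sketch on a 2x2 quadratic f(x) = (1/2)x^T A x - b^T x, where the exact line search has a closed form. The talk's setting (weighted analytic centers of linear matrix inequalities) is far more involved; the matrix and vector below are invented test data.

```python
# Fletcher-Reeves nonlinear CG on a quadratic, with exact line search.
# The FR update is d <- -g_new + beta*d, beta = |g_new|^2 / |g_old|^2.

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def fletcher_reeves(A, b, x, iters=10):
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient: A x - b
    d = [-gi for gi in g]
    for _ in range(iters):
        if dot(g, g) < 1e-20:                          # converged
            break
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)                # exact line search on a quadratic
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x), b)]
        beta = dot(g_new, g_new) / dot(g, g)           # Fletcher-Reeves coefficient
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(fletcher_reeves(A, b, [0.0, 0.0]))  # converges to the solution of A x = b
```

On a quadratic with exact line search, all the CG variants in the talk coincide; their differences show up on general nonlinear objectives and with inexact line searches.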


Tuesday 11/23 at 4:00-4:50

Speaker: Zachariah Etienne

Title: Next-Generation Black Hole and Neutron Star Collision Simulations

Abstract: Perhaps the most significant astronomical discovery of our lifetimes, code-named GW170817, involved the collision of two incredibly dense dead stars called neutron stars. Notably, the collision was detected both by gravitational wave observatories (including LIGO) and by traditional telescopes that detect light. These stars are similar in many ways to gigantic atomic nuclei, and this “multimessenger” collision alone has yielded unprecedented insights into matter and gravity at their most extreme, far beyond what we can study in laboratories on Earth. However, our sparse theoretical understanding of such collisions limits the scientific insights gained from such discoveries, both now and in the future. There is a critical need to improve existing theoretical models built upon supercomputer simulations of colliding neutron stars and black holes. Such simulations generate full, non-perturbative solutions of the general relativistic field equations (numerical relativity). After a gentle introduction to multimessenger astrophysics and the challenges associated with multimessenger source modeling, I will outline a new approach aimed at greatly reducing the cost of these simulations. With the reduced cost comes the potential both to perform colliding black hole simulations on a consumer-grade desktop computer and to add unprecedented levels of physical realism to colliding neutron star supercomputer simulations.