Fall 2024

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Bansal at naveen.bansal@marquette.edu


September 6 - Dr. Rodney Sparapani (Medical College of Wisconsin)
Machine learning regression and marginal effects inference

Presentation Slides

Marginal effect estimation is a common desire for nonparametric machine learning regression. Marginal effects focus on a covariate subset of interest, S, while aggregating over the complement, C. In particular, marginal effects are often employed to understand the meaning of complex model fits. With real data, such explainability research is a key consideration before acting upon the model's ramifications. Gleaning internal consistency insights like these is a necessity for ascertaining the trustworthiness of a model fit. There are two popular approaches that are applicable to marginal effects: Friedman's partial dependence function (FPD) and Shapley values (SV). Both of these approaches assume independence between S and C. Ignoring, or being unaware of, this assumption can lead to incorrect marginal inference. We will explore expanding FPD and SV to the dependent case. Furthermore, these approaches can be computationally demanding, so more computationally friendly extensions will also be described.
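The FPD part of this is easy to sketch: a minimal Python illustration of Friedman's partial dependence, computed by averaging a fitted model's predictions over the empirical distribution of the complement covariates C, which is exactly where the independence assumption between S and C enters. A scikit-learn-style model.predict interface is assumed; all names here are illustrative, not from the talk.

import numpy as np

def partial_dependence(model, X, s_cols, grid):
    """Friedman's partial dependence for the covariate subset S (columns s_cols).

    For each grid point x_S, the S-columns of every row of X are overwritten by
    x_S and the model's predictions are averaged over the remaining columns C,
    i.e., over the empirical marginal of C, which implicitly treats S and C as
    independent (the assumption the talk relaxes).
    """
    values = []
    for x_s in grid:
        X_mod = X.copy()
        X_mod[:, s_cols] = x_s                      # hold S fixed at the grid point
        values.append(model.predict(X_mod).mean())  # aggregate over the complement C
    return np.array(values)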

September 13 - MSSC Faculty Speed Talks

A number of faculty members will be giving short (5-7 minutes) intro/summaries of their research areas. This is a great opportunity for students at all levels, as well as faculty, to learn about current MSSC research and to start new collaborations.

Speakers:

September 20 - No colloquium

No colloquium.

September 27 - Dr. Dan Rowe (Marquette University)
FMRI Signal Measurements, Image Reconstruction, and Phase Contrast Information

In fMRI, the signal measured by the scanner depends upon each voxel's physical properties through an approximate Fourier relationship. Images reconstructed using the inverse Fourier transform are complex-valued due to inhomogeneous magnetic fields. This inverse Fourier transform can be represented as a matrix isomorphism to determine processing- and reconstruction-induced correlations. Statistical models for detection of task-related magnitude and/or phase activation within complex-valued data are presented. Potentially valuable physiological information within the almost always discarded phase image time series is described. Statistical utilization of complex-valued images has been shown to lead to increased detection power and to the detection of potential additional biological information, and thus should be considered.
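As a sketch of the matrix-isomorphism idea (with illustrative notation, not necessarily the speaker's): writing the complex-valued reconstruction y = Ωf with stacked real and imaginary parts turns it into a real-valued linear system,

\begin{pmatrix} y_R \\ y_I \end{pmatrix} = \begin{pmatrix} \Omega_R & -\Omega_I \\ \Omega_I & \Omega_R \end{pmatrix} \begin{pmatrix} f_R \\ f_I \end{pmatrix},

so the reconstructed image is a fixed linear operator applied to the measured data, and any covariance in the data propagates through that operator, which is how processing- and reconstruction-induced correlations can be tracked.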

October 4 - Dr. Greg Ongie (Marquette University)
Machine learning for enhanced image reconstruction in medical imaging

In medical imaging, image reconstruction is the process of forming images from scanner measurements, e.g., generating a 3D digital image from an MRI or CT scan. This is a well-understood process when a sufficient number of high-quality measurements are taken. However, in many imaging scenarios, the measurements are incomplete or corrupted by noise. In this case, applying standard reconstruction techniques can result in artifacted or noisy images with limited clinical value. Recently, researchers have proposed to use machine learning to improve image quality in the case of limited/noisy measurements. This talk will highlight two of my recent projects along these lines: (1) a supervised learning technique that restores noisy reconstructions while preserving important clinical features, and (2) an unsupervised learning technique that uses the implicit regularization effect of neural networks to compensate for missing measurements.

October 11 - Graduate Student Poster Session

Regularization in Singular Spectrum Analysis for Functional Time Series
Jesse Adikorley

Simulation and Harmonic Analysis of k-space Readout
John Bodenschatz

Improving 3D Complex Geometrical Optics EIT Reconstructions with a priori Information
Emily Corcoran

Learning-based CT Image Reconstruction with an Improved Signal Detectability Loss
Megan Lantz

Spatially Correlated Sampling from Parallel Partial Emulators for Geophysical Applications
Joey Lyon

Hybrid-PCA for Multivariate Functional and Vector Data
Soroush Mahmoudiandehkordi

Super Resolution Recovery with Implicit Neural Representations
Mahrokh Najaf

Collective Spectral Density Estimation for Multiple Multivariate Time Series
Shirin Nezampour

Smooth and Sparse Multivariate Functional Principal Component Analysis
Mobina Pourmoshir

A Language Theoretic Approach to Enumerating Bounded Permutations
Eric Redmon

Applying Gradient Kernel Dimension Reduction and Parallel Partial Linked Emulator to Postfire Debris Flow Data
Josh Seidman

A CAIPI Approach for Simultaneous Multi-Slice Technique to Increase Activation Detection in FMRI
Ke Xu

October 18 - No colloquium, midterm break.

 

October 25 - Dr. Ravindra Khattree (Oakland University)
Missing Value Imputation for Non-Normal Data

We present approaches to missing value imputation when the distributions are highly skewed. One such approach is based on a copula-based transformation of the data. Copulas, as generic all-purpose transformations that enable one to apply various standard multivariate procedures more efficiently and with better statistical properties and results, have been studied by the author in earlier work, and here we show that this approach can be very effective for missing value imputation as well. A related problem is imputation for multivariate Lomax data, and we show that in that context some nice independent methods for imputation can be derived.
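One common copula-based imputation scheme, sketched below for two variables, uses a Gaussian copula: transform each skewed margin to normal scores with its empirical CDF, impute on the latent normal scale with the conditional-normal mean, and map back through the empirical quantile function. This is a generic illustration under those assumptions, not necessarily the construction used in the talk.

import numpy as np
from scipy import stats

def normal_scores(values, reference):
    """Empirical-CDF transform of `values` against `reference`, then to normal scores."""
    ranks = np.searchsorted(np.sort(reference), values, side="right")
    return stats.norm.ppf((ranks + 0.5) / (len(reference) + 1))

def gaussian_copula_impute(x_obs, y_obs, x_new):
    """Impute y for records where only x is observed, via a Gaussian copula.

    Two-variable sketch: map each skewed margin to normal scores, take the
    conditional-normal mean on the latent scale, then map back through the
    empirical quantile function of y.  (Illustrative only; the talk's copula
    approach may differ in its details.)
    """
    zx = normal_scores(x_obs, x_obs)
    zy = normal_scores(y_obs, y_obs)
    rho = np.corrcoef(zx, zy)[0, 1]                    # latent Gaussian correlation
    zy_hat = rho * normal_scores(np.atleast_1d(x_new), x_obs)
    return np.quantile(y_obs, stats.norm.cdf(zy_hat))  # back to the skewed y scale

# Toy example with a skewed, dependent pair:
rng = np.random.default_rng(0)
x = np.exp(rng.normal(size=500))
y = np.exp(0.8 * np.log(x) + 0.3 * rng.normal(size=500))
print(gaussian_copula_impute(x, y, x_new=[0.5, 2.0, 6.0]))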

November 1 - Applied Statistics Practicum Presentations

Ross Bravo - Stolen Base Success Probabilities

Jeremy Buss - Network Science at Northwestern Mutual

Tanjina Zaman - Early Prediction of Cardiac Diseases in Patients with Quantum Machine Learning and Feature Selection Technique

November 8 - Dr. Steffen Lempp (University of Wisconsin-Madison)

Computability theory is the area of mathematical logic studying the complexity of mathematical objects, usually thought of as coded by sets of natural numbers. A degree structure is now a quotient of the power set of N where sets A and B are identified if they have the "same complexity," and partially ordered by letting A ≤ B if questions about A (such as membership of elements) can be "reduced," i.e., "computed," from questions about (membership in) B. I will present a survey of results about these degree structures viewed as algebraic structures, more precisely as partial orders, with a particular focus on their finite substructures, given the high complexity of these structures.

November 15 -

TBD

November 22 -

TBD

November 29 - No colloquium, Thanksgiving break

 

December 6 -

TBD



Previous Semesters

Spring 2024

January 26 - James Middleton (Arizona State University)
The role of emotion and emotion regulation in sustaining productive student engagement

Over the years motivation and related phenomena such as interest, anxiety, mindset, grit and others have been studied as both precursors to and consequences of mathematics learning experiences. What the research community is finding, slowly and inexorably, is that no single construct either predicts or describes mathematics engagement fully enough to account for all its manifestations. In fact, what we see is that different aspects of one's experience interact in the moment of learning to produce situated behaviors that can be at once joyful and anxious, productive and counterproductive, stable and malleable. With modern computational tools, the relationships among heretofore separate constructs can be studied fruitfully to paint a picture of engaged behavior that is richer and more student-centered than ever before. This talk reviews (briefly) the major findings from over half a century of research on affect and motivation that are converging on this new understanding. In particular, the ways that new accounts of emotion regulation connect disparate lines of research will be used to project some design specs for mathematics classrooms.

February 9 - John Engbers (Marquette University)
Co(mbinat)o(ria)l Results on Reciprocals of Thinned Exponential Series

Don't worry if you haven't taken, or have forgotten, anything about power series; I'll remind you of anything necessary. But, interestingly, the power series of e^(-x) (about 0) has a reciprocal e^x = 1 + x + x^2/2! + x^3/3! + ··· whose series is non-negative (i.e., has all non-negative coefficients). If we truncate the series for e^(-x) after an odd power (like -x^3/3!), then the series of the reciprocal of the resulting polynomial is still non-negative; but if it is truncated after an even power (like x^10/10!), the reciprocal has some negative coefficients. Gessel gave a lovely combinatorial explanation for the non-negativity result.
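The claim is easy to check numerically. A minimal Python sketch (exact rational arithmetic; the degree cutoff of 60 is arbitrary) computes the reciprocal-series coefficients of a truncation of e^(-x) from the convolution recurrence p_0 c_n = -(p_1 c_{n-1} + p_2 c_{n-2} + ...):

from fractions import Fraction
from math import factorial

def reciprocal_coeffs(m, N):
    """Coefficients c_0..c_N of 1/p(x), where p(x) is e^(-x) truncated after x^m."""
    p = [Fraction((-1) ** k, factorial(k)) for k in range(m + 1)]
    c = [Fraction(1)]                      # c_0 = 1 / p_0 = 1
    for n in range(1, N + 1):              # p * c = 1  =>  c_n = -sum_{k>=1} p_k c_{n-k}
        c.append(-sum(p[k] * c[n - k] for k in range(1, min(n, m) + 1)))
    return c

# Per the talk's claim, the odd truncation stays non-negative...
print(all(v >= 0 for v in reciprocal_coeffs(3, 60)))
# ...while negative coefficients show up for the even truncation.
print(all(v >= 0 for v in reciprocal_coeffs(10, 60)))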

I'll talk about some recent work with David Galvin (Notre Dame) and Cliff Smyth (UNC-Greensboro) that extends from truncations of e^(-x) to classes of "thinned" versions of e^(-x) (obtained from the power series of e^(-x) by deleting some set of terms). There will be some fun combinatorial digressions along the way.

The proofs are combinatorial and are hopefully accessible to an undergraduate, so all students and faculty are invited to come. We include some (what I think are) interesting open questions at the end!

February 23 - Josh Seidman (Applied Statistics Practicum)
An R Implementation of Adaptive Marginal Likelihood Sampling for Bayesian Model Selection

In his 2014 paper Adaptive Annealed Importance Sampling for Multimodal Posterior Exploration and Model Selection with Application to Extrasolar Planet Detection, Bin Liu introduces an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. For Bayesian model comparison, one needs to evaluate marginal likelihoods, which is notoriously difficult because models are often highly nonlinear and multimodal. Importance sampling is a powerful technique for estimating integrals, but finding importance functions that mimic the integrand is challenging in high-dimensional problems. In order to capture the multimodal structure of the models, the algorithm employs mixture distributions, whose components are adaptively added, deleted, and merged to approximate the target distributions. Further, the parameters of these mixture components are updated via expectation-maximization. The code accompanying the paper was written in MATLAB. We will present a new R implementation and demonstrate the algorithm on both challenging target functions and Bayesian model selection examples.
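For orientation, plain (non-adaptive) importance sampling of a marginal likelihood looks like the sketch below; the adaptive mixture proposals in the talk address exactly the case where a simple proposal q fails to mimic a multimodal integrand. The function names and the toy conjugate-normal check are illustrative only.

import numpy as np
from scipy import stats

def marginal_likelihood_is(log_lik, log_prior, q_sample, q_logpdf, n_draws=10000, rng=None):
    """Importance-sampling estimate of Z = integral of p(y|theta) p(theta) dtheta.

    Draw theta_i from a proposal density q, then average the weights
    p(y|theta_i) p(theta_i) / q(theta_i).  The estimate is only reliable when
    q resembles the (possibly multimodal) integrand.
    """
    rng = np.random.default_rng(rng)
    theta = q_sample(n_draws, rng)
    log_w = log_lik(theta) + log_prior(theta) - q_logpdf(theta)
    m = log_w.max()
    return np.exp(m) * np.mean(np.exp(log_w - m))   # log-sum-exp trick for stability

# Toy check: y ~ N(theta, 1), theta ~ N(0, 1); the true marginal is N(y; 0, sqrt(2)).
y = 1.3
z_hat = marginal_likelihood_is(
    log_lik=lambda t: stats.norm.logpdf(y, loc=t, scale=1.0),
    log_prior=lambda t: stats.norm.logpdf(t, loc=0.0, scale=1.0),
    q_sample=lambda n, rng: rng.normal(0.0, 2.0, size=n),
    q_logpdf=lambda t: stats.norm.logpdf(t, loc=0.0, scale=2.0),
)
print(z_hat, stats.norm.pdf(y, loc=0.0, scale=np.sqrt(2.0)))  # estimate vs. truth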

March 15 - No colloquium.

No colloquium due to spring break.

March 22 - Iain Bruce (Polarean Imaging)
Characterizing Regional Pulmonary Function Through Hyperpolarized Xenon-129 MRI

In the span of a 10 second breath hold, a hyperpolarized xenon-129 (HP 129Xe) MRI scan can directly visualize the ventilated airspaces of a patient's lungs. In contrast to the single numeric outputs derived from standard pulmonary function tests, the 3-dimensional HP 129Xe images offer a detailed regional assessment of lung function in patients with obstructive lung diseases. Additionally, when dissolved into the interstitial membrane and transferred to the red blood cells, the HP 129Xe signal experiences a chemical shift that can be used to quantify gas exchange dynamics throughout the lungs of patients with restrictive and pulmonary vascular diseases. Sensitive to both functional and structural changes in early lung disease, HP 129Xe MRI can be used for the longitudinal assessment of disease progression and therapy response.

March 29 - No colloquium.

No colloquium due to Easter break.

April 5 - Kit Chan (Bowling Green State University)
Universality of Automorphic Compositions on the Upper Half-Plane

The automorphisms of the upper half-plane ℍ are those functions of the form φ(z) = (az + b)/(cz + d), where a, b, c, d ∈ ℝ with ad − bc = 1. Naturally, an automorphism φ defines a composition map C : H(ℍ) → H(ℍ) on the space H(ℍ) of all analytic functions on ℍ by taking C(f) = f ∘ φ. We show that there is a function f in H(ℍ) such that all of its n-fold compositions C^n(f) = f ∘ φ ∘ φ ∘ ⋯ ∘ φ are dense in H(ℍ) if and only if |a + d| ≥ 2. In this case, we say that the composition map C is hypercyclic.

More generally, we show that for a sequence of automorphisms φ_n(z) = (a_n z + b_n)/(c_n z + d_n), there is a function f in H(ℍ) such that the sequence C_n(f) = f ∘ φ_n is dense in H(ℍ) if and only if lim sup (|a_n| + |b_n| + |c_n| + |d_n|) = ∞. In this case, we say that the sequence of composition maps C_n is universal. Besides the upper half-plane ℍ, we also discuss the phenomena of hypercyclicity and universality for other planar regions.

April 12 - Jay Pantone (Marquette University)
Experimental Methods in Combinatorics

What number comes next in the sequence
  1, 2, 4, 8, 16, 32, ... ?
  
How about
  1, 2, 3, 5, 8, 13, ... ?
  
Or maybe
  1, 3, 14, 84, 592, 4659, ... ?

Many questions in combinatorics have the form "How many objects are there that have size n and satisfy certain properties?" For example, there are n! permutations (rearrangements) of n distinct objects, there are 2^n binary strings of length n, and the number of sequences of n coin flips that never have two tails in a row is the nth Fibonacci number. The "counting sequence" of a set of objects is the sequence a_0, a_1, a_2, ..., where a_n is the number of objects of size n.

As a result of theoretical advances and more powerful computers, it is becoming common to be able to compute a large number of initial terms of the counting sequence of a set of objects that you'd like to study. From these initial terms, can you guess future terms? Can you guess a formula for the nth term in the sequence? Can you guess the asymptotic behavior as n tends to infinity?

Rigorously, you can prove basically nothing from just some known initial terms. But, perhaps surprisingly, there are several empirical techniques that can use these initial terms to shed some light on the nature of a sequence.

As we talk about two such techniques -- automated conjecturing of generating functions, and the method of differential approximation -- we'll exhibit their usefulness through a variety of combinatorial topics, including chord diagrams, permutation classes, and inversion sequences.
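As a taste of the first technique, the simplest form of automated conjecturing looks for a constant-coefficient linear recurrence (equivalently, a rational generating function) that fits the known terms and then checks it against the rest. The sketch below is a minimal illustration of that idea; the methods in the talk handle far more general classes of generating functions.

from fractions import Fraction

def guess_recurrence(terms, order):
    """Try to fit a(n) = c_1 a(n-1) + ... + c_order a(n-order) to the known terms.

    Solves for c_1..c_order from the first equations (exact rational Gaussian
    elimination), then verifies the conjecture on all remaining terms.
    Returns the coefficients, or None if no such recurrence fits.
    """
    a = [Fraction(t) for t in terms]
    rows = [a[i:i + order][::-1] for i in range(len(a) - order)]   # [a(n-1), ..., a(n-order)]
    rhs = a[order:]
    M = [row[:] + [b] for row, b in zip(rows[:order], rhs[:order])]
    n = order
    for col in range(n):                                            # reduced row echelon form
        piv = next((r for r in range(col, n) if M[r][col] != 0), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                M[r] = [vr - M[r][col] * vc for vr, vc in zip(M[r], M[col])]
    coeffs = [M[r][-1] for r in range(n)]
    ok = all(sum(c * x for c, x in zip(coeffs, row)) == b for row, b in zip(rows, rhs))
    return coeffs if ok else None

# Prints [Fraction(1, 1), Fraction(1, 1)], i.e. a(n) = a(n-1) + a(n-2) (Fibonacci-type).
print(guess_recurrence([1, 2, 3, 5, 8, 13, 21, 34], order=2))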

April 19 - Chase Sakitis (Marquette University)
Bayesian Approaches to GRAPPA, SENSE, and Their Fusion in MR Parallel Image Reconstruction

In fMRI, capturing cognitive temporal dynamics depends on how quickly volume brain images are acquired. The sampling time for an array of spatial frequencies to reconstruct an image is the limiting factor in the fMRI process. Parallel imaging techniques Sensitivity Encoding (SENSE), which operates in the image space domain, and GeneRalized Autocalibrating Partial Parallel Acquisition (GRAPPA), which operates in the spatial frequency domain, have been utilized to greatly reduce image acquisition time. In SENSE image reconstruction, coil sensitivities are estimated once from a priori calibration images and used as fixed "known" coil sensitivities for the reconstruction of every subsequent image. In GRAPPA, localized weights, assessed from a priori calibration spatial frequency arrays, are utilized to interpolate the missing lines of the subsampled spatial frequency (k-space) coil arrays. Here, we introduce Bayesian approaches to both SENSE and GRAPPA in which prior distributions for the unobserved parameters are assessed from the a priori calibration information. The unknown parameters are jointly estimated a posteriori via the Iterated Conditional Modes algorithm and Markov chain Monte Carlo using Gibbs sampling. This work also explores fusing the GRAPPA and SENSE reconstruction techniques, along with applying a Bayesian approach to this fused technique. The Bayesian reconstruction techniques utilize prior image information to reconstruct images from the posterior distributions. The traditional image reconstruction techniques and the Bayesian techniques are extensively evaluated using a simulation study and experimental fMRI data.
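For orientation, the classical SENSE step that treats the coil sensitivities as fixed and known reduces, pixel by pixel, to a small least-squares solve; the Bayesian approach described above replaces these fixed quantities with prior distributions. A minimal sketch with illustrative names and simulated numbers:

import numpy as np

def sense_unfold(aliased, sens, positions):
    """Unfold one aliased pixel with SENSE-style least squares.

    aliased   : length-n_coils vector of complex aliased-image values at one pixel
    sens      : (n_coils, n_pix) complex coil-sensitivity maps
    positions : indices of the R true-image pixels folded onto this location
    Solves aliased ~ S rho for the R unknown voxel values rho, treating the
    calibration-derived sensitivities as fixed and known.
    """
    S = sens[:, positions]                              # n_coils x R encoding matrix
    rho, *_ = np.linalg.lstsq(S, aliased, rcond=None)
    return rho

# Tiny illustration: 4 coils, acceleration factor R = 2.
rng = np.random.default_rng(0)
sens = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
truth = np.array([1.0 + 0.5j, -0.3 + 0.2j])
aliased = sens @ truth + 0.01 * (rng.normal(size=4) + 1j * rng.normal(size=4))
print(sense_unfold(aliased, sens, positions=[0, 1]))    # approximately `truth`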

April 26 - Saeed Ghahramani (Western New England University)
On Finite Moment Conditions for the Ladder Epochs of Random Walks

This presentation delves into how queuing theory methodologies and findings can be utilized to prove theorems concerning random walks, a reversal of the conventional method wherein random walk results are typically employed to formulate queuing theorems. Specifically, we consider a series of independent and identically distributed random variables, X, X1, X2, ..., with E(X) < 0, constructing a random walk, {Zn}, from their cumulative sums. We focus on the concept of the first strict descending ladder epoch, K, under a negative drift, which implies E(K) < ∞.

In a twist, we examine a G/G/1 queue, characterized by service times Sn = max(Xn, 0) and interarrival times Tn = −min(Xn, 0). Such a queue is not a standard GI/G/1 since Sn and Tn are not independent. We know that K is the number of customers served during a busy period of the G/G/1 queue associated with the random walk above. Proofs are obtained of finite moment conditions for busy periods of a general G/G/1 queue in which service times and the interarrival times may be dependent, and are used to obtain finite moment conditions for the ladder epochs of the random walk {Zn : n ≥ 0}.
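For reference, the standard objects behind the abstract can be written as (notation may differ from the talk):

Z_0 = 0, \qquad Z_n = X_1 + \cdots + X_n, \qquad K = \min\{\, n \ge 1 : Z_n < 0 \,\},

so the negative drift E(X) < 0 makes K almost surely finite with E(K) < ∞, and the talk concerns conditions under which higher moments of K are finite.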

May 3 - Cheng-Han Yu (Marquette University)
Semiparametric Latent ANOVA Model for Event-Related Potentials

Event-related potentials (ERPs) extracted from electroencephalography (EEG) data in response to stimuli are widely used in psychological and neuroscience experiments. A major goal is to link ERP characteristic components to subject-level covariates. Existing methods typically follow two-step approaches, first identifying ERP components using peak detection methods and then relating them to the covariates. This approach, however, can lead to loss of efficiency due to inaccurate estimates in the initial step, especially considering the low signal-to-noise ratio of EEG data. To address this challenge, we propose a semiparametric latent ANOVA model (SLAM) that unifies inference on ERP components and their association with covariates. SLAM models ERP waveforms via a structured Gaussian process prior that encodes ERP latency in its derivative and links the subject-level latencies to covariates using a latent ANOVA. This unified Bayesian framework provides estimation at both the population and subject levels, improving the efficiency of the inference by leveraging information across subjects. We automate posterior inference and hyperparameter tuning using a Monte Carlo expectation-maximization algorithm. We demonstrate the advantages of SLAM over competing methods via simulations and then apply it to data from an ERP experiment on speech recognition, where we assess the effect of age on two components of interest.

Fall 2023

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Rowe at daniel.rowe@marquette.edu.


September 1 - Daniel Adrian (Grand Valley State University)
Improved activation detection from magnitude and phase fMRI data 

Functional MRI is a popular noninvasive technique for mapping brain regions activated by specific brain functions. However, as fMRI measures brain activity indirectly through blood flow, the so-called "brain or vein" problem refers to the difficulty in determining whether measured activation corresponds to (desired) brain tissue or (undesired) large veins, which may be draining blood from neighboring regions. Now, fMRI data consist of both magnitude and phase components (i.e., they are complex-valued), but in the vast majority of statistical analyses, only the magnitude data are utilized. However, while activation in the magnitude component can come from both "brain or vein," previous work has demonstrated that activation in the phase component "discriminates" between the two: phase activation occurs in voxels with large, oriented vessels but not in voxels with small, randomly oriented vessels immediately adjacent to brain tissue. Following this motivation, we have developed a model that allows for activation in magnitude and phase, one more general than those previously proposed.

 

September 8 - Computational Sciences Summer Research Fellows Program presentations

Jesse Adikorley
Regularized Singular Spectrum Analysis

Functional time series (FTS) are constituted by dependent functions and can be used to model several applied processes. Several machine-learning approaches have been developed in the literature to gain insight into the stochastic processes that generate FTS. In this work, we present regularization techniques for the singular spectrum analysis (SSA) method in the analysis of FTS. Regularized versions of multivariate SSA (MSSA), functional SSA (FSSA), and Hilbert SSA (HSSA), known as reMSSA, reFSSA, and reHSSA respectively, are applied to call center data containing the number of incoming calls to a bank's call center in Israel. We show that the proposed regularization techniques, reMSSA, reFSSA, and reHSSA, outperform MSSA, FSSA, and HSSA, respectively, by effectively smoothing the rough components generated by MSSA, FSSA, and HSSA for the multivariate time series (MTS) and FTS objects.

 

Emily Corcoran
Numerical Implementation of the 3D Faddeev Green's Function for the Full Nonlinear Complex Geometrical Optics Algorithm on EIT Electrode Data

The full nonlinear Complex Geometrical Optics (CGO) algorithm for Electrical Impedance Tomography (EIT) has not previously been implemented for practical EIT electrode setups in three dimensions, though approximations to the method have been utilized. In this work, we numerically implemented the Faddeev Green鈥檚 function for the full nonlinear method and successfully created image reconstructions on both simulated and experimental EIT voltage data and compared the results to those of other CGO methods. Preliminary results show valuable reconstructions and target localization.

 

Soroush Mahmoudiandehkordi
Investigating transgenerational epigenetic inheritance in humans using IBD mapping

While transgenerational epigenetic inheritance has been observed in plants, it remains a complex biological phenomenon in humans. We hypothesize that transgenerational epigenetic inheritance occurs in regions of the genome that are identical by descent (IBD).

 

Mahi Najaf
Comparing Traditional Iterative Reconstruction Methods with Coordinate-based Neural Networks in CT Image Reconstruction

To make X-ray computed tomography (CT) scans safer for patients, one approach is to acquire fewer X-ray measurements. However, this leads to an ill-posed inverse problem, which is traditionally solved by adding explicit regularization terms to an iterative reconstruction method. Currently, there is interest in using machine learning (ML) techniques to replace explicit regularization terms with implicit or learned forms of regularization. We focus on a recently proposed unsupervised ML technique coming from computer vision known as "coordinate-based neural networks" (CBNNs), which we compare with traditional iterative methods in their ability to reconstruct simulated CT data.
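A coordinate-based neural network in its simplest form is just a small MLP mapping spatial coordinates to intensity; fitting its weights so that the forward CT model applied to the rendered image matches the few noisy measurements is what supplies the implicit regularization. A minimal PyTorch sketch with plain ReLU layers (practical variants add Fourier-feature encodings or sinusoidal activations); all names are illustrative.

import torch
import torch.nn as nn

class CoordinateNet(nn.Module):
    """Minimal coordinate-based network: maps (x, y) in [-1, 1]^2 to an intensity.

    Illustrative sketch only.  In the CT setting, the network's weights are fit
    so that a forward projection of the rendered image matches the measured
    projections; the architecture itself then acts as the implicit regularizer.
    """
    def __init__(self, hidden=256, layers=4):
        super().__init__()
        blocks, dim = [], 2
        for _ in range(layers):
            blocks += [nn.Linear(dim, hidden), nn.ReLU()]
            dim = hidden
        blocks.append(nn.Linear(dim, 1))
        self.net = nn.Sequential(*blocks)

    def forward(self, coords):                      # coords: (n_points, 2)
        return self.net(coords)

# Render the current image estimate by querying the network on a pixel grid.
ys, xs = torch.meshgrid(torch.linspace(-1, 1, 64), torch.linspace(-1, 1, 64), indexing="ij")
grid = torch.stack([xs.flatten(), ys.flatten()], dim=1)
image = CoordinateNet()(grid).reshape(64, 64)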

 

Yue Zhao
Regularized Multivariate Functional Principal Component Analysis

Functional principal component analysis (FPCA) has gained considerable attention in recent years due to its ability to handle complex data structures such as functional data. Multivariate functional principal component analysis (MFPCA) is a natural extension of FPCA, where each observation consists of a vector of functions. The main contribution of my work is to introduce a regularized MFPCA (reMFPCA) method that incorporates smoothness constraints on functional principal components. In this talk, I will demonstrate the effectiveness of the proposed method through simulations and real data examples and compare the results of the proposed method with other existing regularization approaches.

 

September 15 - Computational Sciences Summer Research Fellows Program presentations

John Bodenschatz
A more representative simulation of complex-valued fMRI data

Current functional magnetic resonance imaging (fMRI) simulation techniques disregard vast amounts of information regarding the nuclear magnetic resonance (NMR) process and the complex-valued data that is output by the machine. In this talk we present work towards a package for simulating fMRI data that is more representative of the machine used in practice through implementation of the gradient echo MR signal equation. This work is supplemented by an application of reconstructing non-Cartesian sampled data at accelerated rates.


Ke Xu
A CAIPI Approach for mSPECS with Through-Plane and In-Plane Acceleration

Considerable effort in fMRI research has gone into increasing the number of images acquired per unit of time for each volume and decreasing the total scan time. Techniques such as SENSE and GRAPPA measure less data per image slice but are still able to reconstruct an image. Simultaneous multi-slice (SMS) techniques provide an alternative reconstruction approach in which multiple slices are acquired and aliased concurrently. Controlled Aliasing in Parallel Imaging (CAIPI) is a technique in which the field of view is shifted to decrease the influence of the geometry factor. In this project, a novel SMS technique called A CAIPI Approach for mSPECS with Through-Plane and In-Plane Acceleration will be presented. With this approach, an improvement in SNR is accomplished while a decrease in the geometry factor is achieved.

 

September 29 - Todd Ogden (Columbia University)
Functional data analysis of a compartment modeling framework with applications in dynamic PET imaging

Compartment modeling describes the movement of substances or individuals among different states and has application in epidemiology, pharmacokinetics, ecology, and many other areas.  Fitting such a model to data typically involves solving a system of linear differential equations and estimating the parameters upon which the functions depend.  In order for this approach to be valid, it is necessary that a number of fairly strong assumptions hold, assumptions involving various aspects of the kinetic behavior under investigation.  In many situations, such models are  understood to be simplifications of the "true" kinetic process.  While in some circumstances such a simplified model may be a useful (and close) approximation to the truth, in other cases, important aspects of the kinetic behavior cannot be represented.  We present a nonparametric approach, based on principles of functional data analysis, to modeling of pharmacokinetic data.  We illustrate its use through application to data from a dynamic PET imaging study of the human brain.
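For concreteness, a one-tissue compartment model of the kind the parametric approach relies on is a single linear ODE (a sketch; dynamic PET studies often use more compartments):

\frac{dC_T(t)}{dt} = K_1\, C_p(t) - k_2\, C_T(t), \qquad C_T(t) = K_1 \int_0^t e^{-k_2 (t-s)}\, C_p(s)\, ds,

where C_p is the plasma input function and C_T the tissue concentration; fitting means estimating the rate constants (K_1, k_2), and the nonparametric functional-data approach replaces this fixed parametric form.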

 

October 6 - Michelle Guindani (UCLA)
Bayesian methods for studying heterogeneity in the brain

An improved understanding of the heterogeneity of brain mechanisms is considered critical for developing interventions based on observed neuroimaging features. In this talk, we will discuss two examples where models able to capture such heterogeneity appear necessary. First, we will focus on the reorganization of functional connections between brain areas throughout a neuroscience experiment. The transitions between different individual connectivity states can be modulated by changes in underlying physiological mechanisms that drive functional network dynamics, e.g., changes in attention or cognitive effort. We will propose a multi-subject Bayesian framework for estimating dynamic functional networks as a function of time-varying exogenous physiological covariates that are simultaneously recorded in each subject during an fMRI experiment. Another general problem in neuroscience is detecting regional patterns of brain activation associated with a subject's activity or condition, preferably at cellular resolution. Most existing statistical methods solve this problem by partitioning the brain regions into two classes: significantly and non-significantly activated areas. However, we will show that, for highly noisy data like those recorded in novel thin-section microscopy experiments, such binary grouping may provide overly simplistic discoveries by filtering out weak but essential signals. To overcome this limitation, we will propose a new Bayesian approach that allows classifying the brain regions into several tiers with varying degrees of statistical relevance. We will then conclude by outlining extensions and further research directions to study brain connectivity in animal and human experiments.

 

October 27 - Applied Statistics Practicum presentations

Caldwell Gluesing
Data Visualization at Kohler Company

Jennifer Sailor
Using Publicly Available Data to Model Six Species of Lepidoptera in Kruger National Park

Clint Woerishofer
Decoding Customer Insights: A Data-Driven Dive into Chicago's E-commerce and Smart Products Landscape

Qishi Zhan
Data Science Deployment in the Credit Risk Industry

 

November 3 - Jorg Polzehl (Weierstrass Institute)
Smoothing techniques for quantitative MR

Unlike conventional weighted MRI, leading to T1-, T2-, T2*-, or proton density (PD) weighted images in arbitrary units, quantitative MRI (qMRI) aims to estimate absolute physical metrics. qMRI is of increasing interest in neuroscience and clinical research for its greater specificity and its sensitivity to micro-structural properties of brain tissue such as axon, myelin, iron and water concentration. Furthermore, the measurement of quantitative data allows for comparison across sites, time points and participants, and enables longitudinal studies and multi-center trials. Examples are Diffusion-Weighted MRI, Multi-Parameter Mapping, and Inversion Recovery MRI. Within this talk I'll discuss statistical issues in the modeling of qMRI experiments. Special emphasis will be on adaptive, edge-preserving smoothing techniques for parameter maps obtained in qMRI.

 

November 29 - Pei Wang (Miami University)
Sufficient Dimension Reduction and Variable Selection by Feature Filter

The minimum discrepancy approach proves useful in sufficient dimension reduction (SDR). In this study, we propose two novel SDR estimators based on a feature filter technique derived from the characteristic function, employing the minimum discrepancy function. In an ultra-high dimension setting with sparse assumptions, we introduce a regularization method aiming to achieve SDR and SVS (Sufficient Variable Selection) simultaneously. We establish asymptotic results and provide an estimation method for determining the structural dimension. To showcase the efficacy of our method, we conduct extensive simulations and present a real data example.

 

December 1 - Jarek Harezlak (Indiana University)
Novel penalized regression method applied to study the association of brain functional connectivity and alcohol drinking

The intricate associations between brain functional connectivity and clinical outcomes are difficult to estimate. Commonly used approaches do not account for the interrelated connectivity patterns in the functional connectivity (FC) matrix, which can jointly and/or synergistically affect the outcomes. In our application of a novel penalized regression approach called SpINNEr (Sparsity Inducing Nuclear Norm Estimator), we identify brain FC patterns that predict drinking outcomes. Results dynamically summarized in an R Shiny app indicate that this scalar-on-matrix regression framework via the SpINNEr approach uncovers numerous reproducible FC associations with alcohol consumption.
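A scalar-on-matrix regression with combined sparsity and nuclear-norm penalties has the general form (generic notation; the exact SpINNEr formulation may differ in its details):

\hat{B} = \arg\min_{B} \sum_{i=1}^{n} \bigl( y_i - \langle A_i, B \rangle \bigr)^2 + \lambda_1 \lVert B \rVert_1 + \lambda_N \lVert B \rVert_*,

where A_i is subject i's FC matrix, \langle A_i, B \rangle = \mathrm{tr}(A_i^{T} B), the entrywise \ell_1 norm induces sparsity, and the nuclear norm encourages low-rank, interrelated connectivity patterns.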

 

December 8 - Rene Gutierrez (Texas A&M University)
Multi-object Data Integration in the Study of Primary Progressive Aphasia

This talk focuses on a multi-modal imaging data application where structural/anatomical information from grey matter (GM) and brain connectivity information in the form of a brain connectome network from functional magnetic resonance imaging (fMRI) are available for a number of subjects with different degrees of primary progressive aphasia (PPA), a neurodegenerative disorder (ND) measured through a speech rate measure on motor speech loss. The clinical/scientific goal in this study becomes the identification of brain regions of interest significantly related to the speech rate measure to gain insight into ND pathways. Viewing the brain connectome network and GM images as objects, we develop a flexible joint object response regression framework of network and GM images on the speech rate measure. A novel joint prior formulation is proposed on network and structural image coefficients in order to exploit network information of the brain connectome while leveraging the topological linkages among connectome network and anatomical information from GM to do inference on brain regions significantly related to the speech rate measure. The principled Bayesian framework allows precise characterization of the uncertainty in ascertaining a region being actively related to the speech rate measure. Empirical results with simulated data illustrate substantial inferential gains of the proposed framework over its popular competitors. Our framework yields new insights into the relationship of brain regions with PPA, offering a deeper understanding of neuro-degeneration pathways for PPA.

 

December 11 - Jaihee Choi (Rice University)
Inference for Set-Based Effects in Genome-Wide Association Studies with Multiple Interval-Censored Outcomes

Massive genetic compendiums such as the UK Biobank have become an invaluable resource for identifying genetic variants that are associated with complex diseases. Due to the difficulties of massive data collection, a common practice of these compendiums is to collect interval-censored data. One challenge in analyzing such data is the lack of methodology available for genetic association studies with interval-censored data. Genetic effects are difficult to detect because of their rare and weak nature, and often the time-to-event outcomes are transformed to binary phenotypes for access to more powerful signal detection approaches. However transforming the data to binary outcomes can result in loss of valuable information. To alleviate such challenges, this work develops methodology to associate genetic variant sets with multiple interval-censored outcomes. Testing sets of variants such as genes or pathways is a common approach in genetic association settings to lower the multiple testing burden, aggregate small effects, and improve interpretations of results. Instead of performing inference with only a single outcome, utilizing multiple outcomes can increase statistical power by aggregating information across multiple correlated phenotypes. Simulations show that the proposed strategy can offer significant power gains over a single outcome approach. We apply the proposed test to the investigation that motivated this study, a search for the genes that perturb risks of bone fractures and falls in the UK Biobank.

Spring 2023

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Spiller at elaine.spiller@marquette.edu

 

March 3 - MSSC Faculty
"Elevator" Research Updates

Speakers:
Wim Ruitenburg
Anne Clough
Dan Rowe
Jay Pantone
Elaine Spiller
Cheng-Han Yu
Greg Ongie

 

March 10 - Kyle Petersen (DePaul University)
Napkin Problems

Suppose a number of mathematicians sit down at a circular banquet table that has napkins evenly spaced between each place setting. When a particular diner sits, they might encounter two napkins (in which case they choose their preferred napkin), they might encounter one napkin because a neighbor already took one (in which case they take the other napkin), or they might encounter zero napkins because both their napkins were already taken by neighbors. If people sit down in a random order and grab napkins from the left or right side of their place at random, what is the expected proportion of napkinless diners? What is the worst order in which people might sit?

In this talk I will tell you the answers to both these questions, as well as some related open questions. Along the way, I will tell you the human story of my engagement with these questions in two different projects, separated by almost 20 years. Characters in this story include eminences such as John Conway, Rob Pike, Pete Winkler, and Don Knuth, each of whom has made major contributions to mathematics and computer science. And napkins.
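The seating process described above is straightforward to simulate. A minimal Monte Carlo sketch for the random-order, random-preference case (illustrative code, not from the talk):

import random

def napkinless_fraction(n_diners, n_trials=100_000, rng=random):
    """Monte Carlo estimate of the expected fraction of napkinless diners.

    Diners at a circular table sit in a uniformly random order; each first
    reaches for a uniformly random side and takes that napkin if it is still
    there, otherwise the other side's napkin if available, otherwise goes
    without.  Napkin i sits between diner i and diner i+1.
    """
    total = 0
    for _ in range(n_trials):
        taken = [False] * n_diners              # napkin i is to the right of seat i
        napkinless = 0
        order = list(range(n_diners))
        rng.shuffle(order)
        for seat in order:
            left, right = (seat - 1) % n_diners, seat
            first, second = (left, right) if rng.random() < 0.5 else (right, left)
            if not taken[first]:
                taken[first] = True
            elif not taken[second]:
                taken[second] = True
            else:
                napkinless += 1
        total += napkinless / n_diners
    return total / n_trials

print(napkinless_fraction(60))   # estimated fraction, roughly 0.12 for a large table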

 

March 24 - Laurie Cavey (Boise State University)
Student Reasoning Evidence as a Tool for Equitable University Math Instruction

What makes math instruction equitable? What might we consider doing differently to make our math instruction more equitable? To address these questions, I will share an example from a video-based curriculum project (VCAST) designed to engage secondary math teacher candidates in the analysis of student reasoning evidence. While the original purpose of the VCAST project was to support future middle and high school teachers鈥 ability to make math accessible to all of their students, the project resulted in significant outcomes for university instructors as well. Building upon the results of the VCAST project, we will consider the potential for engaging faculty in the analysis of student reasoning evidence as a mechanism for establishing equitable university mathematics instruction.

 

March 31 - Anthony Parolari (Marquette University)
Process-based and data-driven modeling in ecohydrology 

In this data-rich era, hydrologists and other environmental scientists are motivated to measure and model everything, everywhere. Yet, limited time, budgets, and technology constrain the number of variables and the resolution that can be measured and modeled; furthermore, not all variables and spatiotemporal scales in a system provide useful information. Therefore, broad questions in environmental systems modeling include: What variables, times, and locations are most informative of the relevant processes? And what is the minimum sampling required to achieve robust measurement and modeling? In this talk, I will introduce the field of ecohydrology and discuss current modeling trends and challenges, including the major challenge of model complexity and model order reduction. As a first example, we will review models of soil moisture dynamics, a key variable that controls the sensitivity of plant and soil processes to hydroclimatic variability and is amenable to model order reduction strategies. Secondly, we show that environmental signals are generally "sparse" and that this sparsity can be leveraged to reduce temporal sampling requirements and model complexity. Data-driven sparse methods are applied to predict pollutant concentrations and streamflow in ungauged or poorly gauged basins. Further development and application of these methods promises to improve ecohydrological systems sensing and modeling by reducing sample requirements and identifying a minimal set of variables essential to complete characterization of the dynamics.

 

April 14 - Jessica Conway (Penn State)
HIV viral dynamics following treatment interruption

Antiretroviral therapy (ART) effectively controls HIV infection, suppressing HIV viral loads to levels undetectable with commercial testing. Typically, suspension of therapy is followed within weeks by a rebound of viral loads to high, pre-therapy levels. However, recent observations give nuance to that statement: in a small fraction of cases, rebound may be delayed by months, years, or even, possibly, permanently, a phenomenon termed post-treatment control (PTC). We begin with a discussion of mechanisms that may permit PTC, hypothesizing that early treatment induces PTC by restricting the latent reservoir size. Activation of cells latently infected with HIV is thought to drive viral rebound, and early treatment may render the reservoir sufficiently small for immune responses to control infection after treatment cessation. ODE model analysis reveals a range of immune response strengths where a patient may show bistability between viral rebound and PTC. In the case of viral rebound, data reveal significant heterogeneity in timing and ensuing dynamics. We will also discuss a proposed phenomenological model assuming simple heterogeneous dynamics in latent reservoir activation to make predictions on time to rebound following treatment interruption. We rely on time-inhomogeneous branching processes to derive a mechanistically motivated survival function for time-to-rebound. We validate our model with data from Li et al. (2016), specifically a collection of observations of times to viral rebound across 235 study participants following treatment suspension. We show that our model provides good agreement with survival curves generated from study participants.

 

April 21 - Bruce Wade (University of Louisiana at Lafayette)
EPEM: Efficient Parameter Estimation for Multiple Class Monotone Missing Data

The problem of monotone missing data has been broadly studied during the last two decades and has many applications in various fields. Commonly used imputation techniques require multiple iterations through the data before yielding convergence. Moreover, those approaches may introduce noise or biases into the subsequent modeling. We derive exact formulas and propose a novel algorithm to compute the maximum likelihood estimators (MLEs) of a multiple class, monotone missing dataset. Our EPEM algorithm does not require multiple iterations through the data as other imputation approaches do, thus promising less computing time.

 

April 28 - Rajarshi Guhaniyogi (Texas A&M)
Bayesian Single and Multi-object Regressions with Applications in Neuroimaging 

Of late, neuroscience or related applications routinely encounter regression scenarios involving objects (e.g., multi-dimensional array or tensor, networks). While the most common practice in such scenarios is to construct summary measures from these objects as predictors, it makes scientific and statistical sense to exploit the structure of the objects for more meaningful inference. We will discuss the strategy to perform Bayesian regression with a tensor or a network response, the construction of novel prior distributions on object-valued parameters and posterior inference. Applications of the proposed methodology are presented in reference to brain activation and brain connectome studies. We will further discuss a new multi-object response regression framework in the study of multi-modal imaging data integration in the study of primary progressive aphasia (PPA), a neurological disorder sharing a similar neuro-degenerative pathway as Alzheimer's.

Fall 2022

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Spiller at elaine.spiller@marquette.edu

 

September 9 - Mason Porter (UCLA)
Bounded-Confidence Models of Opinion Dynamics on Networks

I will discuss the modeling of opinion dynamics on various types of networks. After introducing some general questions and ideas in the field, I will focus on bounded-confidence models (BCMs), in which nodes have continuous-valued opinions and update those opinions when they interact with nodes with sufficiently similar opinions. I will discuss various generalizations of BCMs and examine how they affect consensus, polarization, and fragmentation of opinions on BCMs.
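A Deffuant-type update is one concrete example of the BCM mechanism: a randomly chosen pair of neighbors compare opinions and compromise only if they already agree to within a confidence bound. The sketch below is a generic illustration, not a specific model from the talk.

import numpy as np

def deffuant_step(opinions, edges, confidence=0.2, mu=0.5, rng=None):
    """One interaction step of a Deffuant-type bounded-confidence model.

    A randomly chosen pair of neighbors compare opinions; if the difference is
    within the confidence bound, each moves a fraction mu toward the other,
    otherwise nothing happens.
    """
    rng = np.random.default_rng(rng)
    i, j = edges[rng.integers(len(edges))]
    if abs(opinions[i] - opinions[j]) < confidence:
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift
    return opinions

# Example on a small cycle network: repeated interactions drive consensus or
# fragmentation depending on the confidence bound.
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=20)
cycle_edges = [(k, (k + 1) % 20) for k in range(20)]
for _ in range(5000):
    x = deffuant_step(x, cycle_edges, confidence=0.2, rng=rng)
print(np.sort(x))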

 

September 30 - Summer Graduate Research Symposium

Megan Lantz
Deep Learning for Dual Energy CT Reconstruction

Dual energy CT imaging offers the potential to generate high resolution, high contrast, low noise diagnostic images that allow for improved material discrimination while minimizing radiation exposure.  Using dual energy CT transmission data to reconstruct material maps (images) of three distinct materials is an ill-posed, non-linear inverse problem.  In the context of distinguishing between adipose tissue, fibroglandular tissue, and calcifications in simulated breast CT transmissions, we worked towards effective ways to solve this inverse problem using both iterative and neural network approaches.


Soroush Mahmoudiandehkordi
Genome Wide Identity by Descent (GWID)

Genome-wide association studies (GWAS) have discovered several genes associated with diseases, but they have disadvantages with respect to inheritance patterns and rare variants. Identity by Descent (IBD) mapping is a powerful tool for unraveling the genetics underlying complex diseases. We developed a software package, Genome Wide Identity by Descent (GWID), an effective visualization tool that enables users to scan the genome and examine patterns of disease association. It highlights specific genomic regions of interest and displays artifacts in the data. GWID uses statistical tests to investigate the significance of IBD regions in case-control setups.


Ke Xu
A Shifted Field of View method for Multi-coil Separation of Parallel Encoded Complex-valued Slices in fMRI

Simultaneous multi-slice (SMS) techniques allow for a reduction in repetition time while maintaining high resolution for fMRI. One challenge with SMS has been the mitigation of inter-slice signal leakage. Here, the Multi-coil Separation of Parallel Encoded Complex-valued Slices with Shifted Field of View (mSPECS-sFOV) model is presented to reconstruct images without leakage. By combining the orthogonal properties of Hadamard encoding with an in-plane image shift, voxels in different locations are aliased in different combinations, which makes the separation process easier. Bootstrap sampling and artificial aliasing for calibration images are also included in mSPECS-sFOV. Least squares estimation is used for the un-aliasing process.


Emily Corcoran
Neural Networks for Classification of Breast Tissue Using Electrical Impedance Tomography Voltage Data

The current best imaging modality for breast cancer detection is mammography, though mammograms offer a relatively high rate of false positives due to their inability to distinguish between cancerous and benign lesions without subsequent biopsies. However, there has been significant evidence that malignant tumors have much higher conductivities than their benign counterparts. Electrical impedance tomography (EIT) is an imaging modality that applies low amplitude current through electrodes placed on the body, and the resulting voltages are used to recover the conductivities within the body. Hence there is great promise in using EIT to detect breast cancer without the high probability of false positives. The research conducted during this 2022 summer fellowship involved the generation of simulated breast phantoms and large EIT voltage datasets and explored the use of EIT voltage data to classify breast tumors as malignant or benign using neural networks. Initial testing is indicating that this is a promising avenue to explore.


Yue Zhao
Regularized Multivariate Functional Principal Component Analysis

In Multivariate Functional Principal Component Analysis, there is an apparent demand for smoothing the functional principal components, and we call this smoothing technique regularized Multivariate Functional Principal Component Analysis (reMFPCA). In our paper, three computational approaches are established: (1) an iterative power algorithm, (2) a half-smoothing approach, and (3) a generalized eigen-decomposition method. Also, a closed-form tuning parameter selection approach is proposed, which dramatically increases computational efficiency compared with the traditional cross-validation approach in Silverman's paper (1996). In addition, a reMFPCA R package is in progress; it will allow users to apply our approaches in R.


Jesse Adikorley
Multivariate Functional Time Series Forecasting: Multivariate Functional Singular Spectrum Analysis Approaches Applied To Images and Curves (Remote Sensing Data)

The functional singular spectrum analysis (FSSA) method, applied to functional time series (FTS), was developed by Haghbin et al., 2019 as a functional extension of the singular spectrum analysis (SSA) method developed by Golyandina et al., 2001. SSA is a non-parametric, exploratory method used in time series analysis to identify and extract components that capture mean, seasonal, trend, and noise behaviors in time-dependent data where observations are scalars. Trinka et al. 2021 developed the multivariate FSSA (MFSSA) as the functional extension of multivariate SSA (MSSA) applied to multivariate FTS (MFTS). They also developed both FSSA recurrent forecasting and FSSA vector forecasting algorithms for the FSSA and MFSSA methods. 
In this work, we present an extension of FSSA recurrent forecasting and FSSA vector forecasting which is applied to MFTS data across different dimensional domains.  


Joey Lyon
Emulating a Soil Hydrology Column Model Using a Gaussian Process

We will review a computationally intensive hydrology model of water flowing into and out of a one-dimensional soil column and discuss constructing cheap surrogates with Gaussian process emulators. We will explore where and when the model response is active for various outputs throughout the soil column over a time series of rainfall forcing. We then will look at emulating the entire soil column at any given instant in the rainfall time series. We will discuss these results as well as the future directions for this work.


Shirin Nezampour
Nonparametric Collective Spectral Density Estimation of Multiple Multivariate Time Series

There are many situations in which more than one aspect of a phenomenon is observed at each time point, giving rise to multivariate time series. In studying multivariate time series, spectral analysis plays an important role in investigating relationships between time series. Analysis of the power spectrum has helped us to understand the dynamics in many serially correlated data in a way that does not require the development of complex parametric models. In this project, we have worked on extending the non-parametric collective spectral density estimation (NCSDE) method introduced by Maadooliat et al. (2018) to the multivariate time series.

October 14 - Muge Karaman (University of Illinois Chicago)
Advanced Diffusion-Weighted MRI for Comprehensive Characterization of Tissue Microstructures

Biological tissues are complex due to the underlying cellular structures and their related functions. In vivo characterization of biological tissues, normal or cancerous, has been a focus of diffusion-weighted MRI (DWI) for the past three decades. Conventional DWI techniques, however, cannot comprehensively characterize biological tissue which contains an array of underlying properties such as cellularity, vascularity, and heterogeneity. This seminar will highlight advanced DWI techniques our group has been developing; describe the theory, implementation, and validation of these novel techniques; and showcase their clinical applications in cancer detection, diagnosis, and treatment assessment. We will also discuss comprehensive approaches to expand the benefits of advanced DWI to many organs. 
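For orientation, the conventional baseline that advanced DWI methods extend is the monoexponential signal decay

S(b) = S_0\, e^{-bD},

where b is the diffusion weighting and D the apparent diffusion coefficient; advanced models replace the single exponential with richer forms (for example, the stretched exponential S(b) = S_0\, e^{-(bD)^{\alpha}}) whose additional parameters are sensitive to tissue heterogeneity. This is a standard sketch, not a description of the specific models in the talk.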

 

October 28 - Applied Statistics Master Practicum Presentation, 1pm - 3pm

Our APST master students will present their practicum work.

 

November 4 - Ahmad P. Tafti (University of Pittsburgh) - Virtual via Teams
Enforcing deep few-shot learning for knee semantic segmentation and measurement

Osteoarthritis (OA) is the most prevalent chronic joint disease worldwide, with the knee accounting for more than 80% of commonly affected joints. The early detection of knee OA has significantly focused on analyzing knee joint space and cartilage degeneration. Segmentation of the knee joint space thus became the very first step to measure the level of joint degeneration quantitatively and qualitatively. From the computational perspective, deep learning computer vision methods have already demonstrated very successful applications in a variety of medical image analysis tasks, including object detection, image registration, segmentation, and classification. However, there are several fundamental challenges that keep deep learning methods from reaching their full potential in healthcare settings. One can see that they often need a large volume of annotated training data to achieve better accuracy than traditional machine learning methods. In this talk, we present a deep few-shot learning strategy to tackle the problem of knee joint space segmentation and measurement in plain radiographs using only a few samples of manually segmented radiographs.

 

November 18 - Zeno Madarasz (Bowling Green State University)
A Strictly Weakly Hypercyclic Subspace

An interesting topic of study for a hypercyclic operator T on a Fréchet space X has been whether X has an infinite-dimensional closed subspace consisting entirely, except for the zero vector, of hypercyclic vectors for T. These subspaces are called hypercyclic subspaces. The existence of a strictly weakly hypercyclic operator T, which is a weakly hypercyclic operator that is not norm hypercyclic, on a Hilbert space H has been shown by Chan and Sanders. However, it is not known whether there exists a strictly weakly hypercyclic subspace of H. We first show that the left multiplication operator LT with the aforementioned strictly weakly hypercyclic operator T is a strictly WOT-hypercyclic operator on the operator algebra B(H). Then we obtain a sufficient condition for an operator T on a Hilbert space H to have a strictly weakly hypercyclic subspace. After that we construct an operator that satisfies these conditions and therefore prove the existence of a strictly weakly hypercyclic subspace.
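For orientation, the standard definition underlying the talk: an operator T on X is hypercyclic when some vector has a dense orbit,

\exists\, x \in X : \overline{\{\, T^{n} x : n \ge 0 \,\}} = X,

"weakly hypercyclic" asks only for density in the weak topology, and a hypercyclic subspace is an infinite-dimensional closed subspace all of whose nonzero vectors are hypercyclic. (This is a sketch of standard terminology, not material from the talk itself.)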

 

December 2 - Pamela Harris (UW Milwaukee) - 4pm - 5pm
Multiplex juggling sequences and Kostant's partition function

Multiplex juggling sequences are generalizations of juggling sequences (describing throws of balls at discrete heights) that specify an initial and terminal configuration of balls and allow for multiple balls at any particular discrete height. Kostant's partition function is a vector function that counts the number of ways one can express a vector as a nonnegative integer linear combination of a fixed set of vectors. What do these two families of combinatorial objects have in common? Attend this talk to find out!


December 9 - Geoff Wodtke (University of Chicago)
Structural Mean Models with Application to Causal Inference in the Social Sciences

Social scientists are often interested in estimating the marginal effects of a time-varying treatment on an end-of-study outcome. With observational data, estimating these effects is complicated by the presence of time-varying confounders affected by prior treatments, which may lead to bias and inconsistency in conventional approaches to estimation (e.g., matching). In this situation, inverse-probability-of-treatment-weighted (IPTW) estimation of a marginal structural mean model (MSM) remains consistent if treatment assignment is sequentially ignorable and the conditional probability of treatment is correctly modeled, but this method is not without limitations. In particular, it is highly sensitive to model misspecification, relatively inefficient, and difficult to use with many-valued or continuous treatments. In this talk, I introduce an alternative method, regression-with-residuals (RWR) estimation of a structural nested mean model (SNMM), that overcomes these limitations. RWR is consistent for the marginal effects of a time-varying treatment if treatment assignment is sequentially ignorable and a model for the conditional mean of the outcome, which nests models for the time-varying confounders, is correctly specified. The performance advantages of RWR-SNMM relative to IPTW-MSM are demonstrated with a series of simulation experiments and with an empirical example based on longitudinal data from the Panel Study of Income Dynamics. I conclude with a discussion of the method's limitations and directions for future research.
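For reference, the unstabilized inverse-probability-of-treatment weight used in IPTW-MSM estimation takes the standard form (a sketch; stabilized weights are common in practice):

W_i = \prod_{t=1}^{T} \frac{1}{\Pr\bigl(A_{it} \mid \bar{A}_{i,t-1}, \bar{L}_{it}\bigr)},

where \bar{A}_{i,t-1} is the treatment history and \bar{L}_{it} the history of time-varying confounders; weighting by W_i creates a pseudo-population in which treatment at each wave is unconfounded, so a weighted regression recovers the marginal structural model's parameters.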

Spring 2022

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Talks will initially be held only in a virtual format. Those affiliated with ɫƵ can join virtually by joining the "MSSC Department Colloquium" Team on Microsoft Teams. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu

 

February 11 - Nazmus Sakib (University at Buffalo, SUNY)
Multi-disciplinary Collaborative Research: Exploring Endeavors in the Intersection of Medical Informatics, mHealth, and Computational Sustainability

The rising incidence of sepsis and septic shock, and the associated mortality, morbidity, and annual treatment costs among ICU admissions, are an increasing concern. SepINav is a medical informatics endeavor that helps ICU practitioners and researchers monitor and intervene on sepsis patients more efficiently and interactively, and conduct retrospective studies to seek explanations for different sepsis scenarios in the ICU. Moreover, Bayesian Online Changepoint Detection will help practitioners understand the structural changes in patients' vital-sign regimes that may precede septic shock. Several additional features are added to this data-driven software tool to support efficient monitoring and intervention and to address confounding medical interventions in the ICU.

 

March 4 - Kaitlyn and Peter Muller (Villanova University)
Modeling COVID-19 Spread on a College Campus

In this talk, we present a model for the spread of the original COVID-19 variant at a medium-sized college. Our model accounts for the effects of various mitigation measures, both individual and institutional, available in a college setting. We explore the effect of these measures on the spread of COVID-19 over the course of a semester. We then fit our model to Villanova University's Fall 2020 dashboard data.
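As a point of reference, the sketch below integrates a textbook SIR compartment model with a crude mitigation knob that scales transmission. It is a generic illustration of this class of model, not the Villanova model from the talk, and every parameter value is an assumption made up for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    """Basic SIR right-hand side."""
    S, I, R = y
    N = S + I + R
    return [-beta * S * I / N, beta * S * I / N - gamma * I, gamma * I]

N0, I0 = 6000, 5                  # campus population and initial cases (assumed)
beta, gamma = 0.30, 0.10          # transmission and recovery rates (assumed)
mitigation = 0.6                  # masking/distancing scales transmission (assumed)

sol = solve_ivp(sir, (0, 112), [N0 - I0, I0, 0],
                args=(beta * mitigation, gamma), dense_output=True)
t = np.linspace(0, 112, 8)
print(np.round(sol.sol(t)[1]))    # infected compartment over a 16-week semester
```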

 

March 25 - Wim Ruitenberg (ɫƵ)
One Hundred Years of Logic for Constructive Mathematics

We clarify what constructive mathematics is, without emotional coloring. There is no need to 'be' a constructivist. Well-known expounders of constructive mathematics include Brouwer, Markov, and Bishop. Common classical mathematics has a formal logic associated with it, known as classical logic. Boolean algebra is associated with this logic. Shortly before 1930 Heyting developed a logic for constructive mathematics. Almost from the beginning critics wondered whether this so-called intuitionistic logic could be justified as the logic of constructive mathematics. Some, including Gödel, were not convinced that it was, or at least believed that it lacked a proper justification. We confirm that intuitionistic logic is not the logic of constructive mathematics. We present a new correct version of constructive logic.

 

April 1 - Sue Minkoff (University of Texas at Dallas)
Modeling of Trace Gas Sensors

Trace gas sensors that are compact and portable are being deployed for use in a variety of applications including disease diagnosis via breath analysis, monitoring of atmospheric pollutants and greenhouse gas emissions, control of industrial processes, and early warning of terrorist threats. One such sensor is based on optothermal detection and uses a modulated laser source and a quartz tuning fork resonator to detect trace gases. If the laser light is tuned to a frequency absorbed by the gas one wishes to detect, then, when the gas is present, the absorbed energy launches a thermal wave that propagates through the air until it reaches a quartz tuning fork. The tuning fork then vibrates due to heating of the tines. We develop the first mathematical model of ROTADE (resonant optothermoacoustic detection) sensors and solve it via the finite element method. I will discuss determining an optimally designed sensor that maximizes the signal as a function of the geometry of the quartz tuning fork (length and width of the tines, etc.).

 

April 8 - Yaser Samadi (Southern Illinois University)
Time Series Analysis for Interval-Valued Data

Many series of data record individual observations as intervals, such as stock market values with daily high-low values, or minimum and maximum monthly temperatures, recorded over time. Moreover, with the advent of supercomputers, datasets can be extremely large, and it is frequently the case that observations are aggregated into intervals (or histograms, or other forms of so-called symbolic data). Taking the average of the intervals results in a loss of information. Therefore, in comparison with classical data, they are more complex and can have internal structures that impose complications that are not evident in classical data. In particular, the time dependency makes it more difficult to deal with and incorporate their complex structures and internal variations. In this talk, we present our proposed autocovariance/autocorrelation functions for interval-valued autoregressive series models. Maximum likelihood estimators are derived by using the ideas of composite likelihood and the pairwise likelihood functions. Asymptotic properties of these estimators are derived. A simulation study shows that the new estimators perform considerably better than those obtained previously.

 

April 22 - Andrew Nencka (Medical College of Wisconsin)
Optimizing traditional and deep-learning based accelerated MR imaging

Magnetic resonance imaging (MRI) offers a rich variety of physiologically relevant contrasts although it is among the slowest medical imaging modalities. The comparatively long duration of MRI acquisitions limits their utility in some diseases and patient populations, while also limiting available resolution and other information content in MRI exams. Work over the last decade has led to the ability to simultaneously acquire images covering multiple cross sections of a patient, thereby shortening acquisition duration by an integer factor equal to the number of simultaneously acquired slices. Such acceleration unlocks new opportunities for enhancing information content in MRI exams. We will discuss the techniques of accelerated MRI and the recent optimization of traditional and deep learning-based image reconstruction algorithms while keeping an eye on the folklore of imaging technology.

 

April 29 - Ben Russo (Oak Ridge National Lab)
System identification techniques

A dynamical system is given as ẋ = f(x), where x : [0, T] → R^n is the system state and f : R^n → R^n is some function. Dynamical systems are prevalent in the sciences, such as engineering, biology, neuroscience, physics, and mathematics. However, in many cases even physically motivated dynamical systems can have unknown parameters (i.e., a gray box), such as the mass and length of mechanical components, or the dynamics may be completely unknown (i.e., a black box). In such cases, system identification methods are leveraged to gain estimates of the dynamics of the system based on data generated by the system itself. In this talk, we'll go over some current techniques in nonlinear system identification that use tools from functional analysis.
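To give a feel for what "estimating dynamics from data" can look like, the sketch below regresses finite-difference derivative estimates onto a small library of candidate terms and thresholds the coefficients, in the spirit of sparse regression methods such as SINDy. This is an illustrative baseline, not necessarily one of the functional-analytic techniques covered in the talk; the simulated system and thresholds are assumptions.

```python
import numpy as np

# simulate a trajectory of the (pretend-unknown) system xdot = x - x^3 by forward Euler
dt, n = 0.01, 1000
x = np.empty(n); x[0] = 0.5
for k in range(n - 1):
    x[k + 1] = x[k] + dt * (x[k] - x[k] ** 3)

xdot = (x[1:] - x[:-1]) / dt           # finite-difference derivative estimates
X = x[:-1]
library = np.column_stack([np.ones_like(X), X, X ** 2, X ** 3])   # candidate terms

coef, *_ = np.linalg.lstsq(library, xdot, rcond=None)
coef[np.abs(coef) < 0.1] = 0.0         # hard-threshold small coefficients
print(dict(zip(["1", "x", "x^2", "x^3"], np.round(coef, 3))))
# should recover roughly {x: 1.0, x^3: -1.0}
```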

 

May 20 - Jordan Trinka (Pacific Northwest National Laboratory)
Rfssa: An R Package for Functional Singular Spectrum Analysis

Functional data analysis is a growing field of statistics that is finding increasing use in applied realms such as finance, medicine, and ecology. With the improvements in computational resources of recent years, functional time series (FTS) have become increasingly prevalent. In this talk, we present the functionality of the Rfssa R package, available on CRAN, which allows the user to perform nonparametric signal extraction and forecasting of FTS. We illustrate the functionality offered by the package by way of FTS examples of curves that describe call center data and remotely sensed images of vegetation. Through the talk and code demonstrations we illustrate the advantage of creating user-friendly, flexible, and accessible software that can be readily used by practitioners both in academia and industry.
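For orientation, here is a bare-bones scalar singular spectrum analysis (SSA) sketch, written in Python rather than R, that conveys the embed / decompose / reconstruct idea underlying the functional version: build a Hankel trajectory matrix, keep the leading singular components, and average anti-diagonals back into a series. It is not the Rfssa API and the window, rank, and toy signal are assumptions.

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    """Reconstruct x from the leading `rank` SSA components."""
    n, k = len(x), len(x) - window + 1
    # trajectory (Hankel) matrix: columns are lagged windows of the series
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_low = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    # Hankelize: average over anti-diagonals to get back a series
    recon, counts = np.zeros(n), np.zeros(n)
    for i in range(window):
        for j in range(k):
            recon[i + j] += X_low[i, j]
            counts[i + j] += 1
    return recon / counts

t = np.arange(400)
signal = np.sin(2 * np.pi * t / 50)
noisy = signal + 0.3 * np.random.default_rng(2).normal(size=t.size)
smooth = ssa_reconstruct(noisy, window=100, rank=2)
print(np.round(np.corrcoef(smooth, signal)[0, 1], 3))   # close to 1
```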

Fall 2021

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Talks will be held in a hybrid in-person / virtual format. The in-person portion will be held in the Katharine Reed Cudahy Building, Room 401, on the ɫƵ campus. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu. Those affiliated with ɫƵ can join virtually by joining the "MSSC Department Colloquium" Team on Microsoft Teams.

 

September 10 - Shane Hesprich (MU Research Computing Services)
High Performance Computing at Marquette

High Performance Computing (HPC) is a powerful tool used in almost every field from economics and engineering, to healthcare and business information. Raj, Marquette's freely available, centralized HPC resource, provides students and faculty the ability to utilize HPC for research and academic purposes. Understanding how to access and properly utilize this resource can be of great benefit to your career here at Marquette.

 

October 15 - Chase Sakitis (ɫƵ)
A Formal Bayesian Approach to SENSE Image Reconstruction

In fMRI, capturing cognitive temporal dynamics is dependent upon the rate at which volume brain images are acquired. The sampling time for an array of spatial frequencies to reconstruct an image is the limiting factor. Multi-coil SENSE image reconstruction is a parallel imaging technique that has greatly reduced image scan time. In SENSE image reconstruction, complex-valued coil sensitivities are estimated once from a priori calibration images and used to form a "known" design matrix to reconstruct every image. However, the SENSE technique is highly inaccurate when the sensitivity design matrix is not positive definite. Here, we propose a formal Bayesian approach where prior distributions for the unaliased images, coil sensitivities, and variances/covariances are assessed from the a priori calibration image information. Images, coil sensitivities, and variances/covariances are estimated a posteriori jointly via the Iterated Conditional Modes maximization algorithm and marginally via MCMC using the Gibbs sampling algorithm. Since the posterior marginal distributions are available, hypothesis testing is possible. This Bayesian SENSE (BSENSE) model to reconstruct images is applied to realistically simulated fMRI data. This BSENSE model accurately reconstructs a single slice image as well as a series of slice images without aliasing artifacts and was used to produce magnitude-only task activation.
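As background, the toy example below illustrates classical SENSE unfolding for one aliased pixel: with C coils and acceleration factor R, each coil's aliased value is a sensitivity-weighted sum of R true pixel values, and the standard point estimate is a complex least-squares solve. The talk's BSENSE approach replaces this point estimate with a full Bayesian model; all numbers below are made up.

```python
import numpy as np

rng = np.random.default_rng(3)
C, R = 8, 2                                   # 8 coils, acceleration factor 2
S = rng.normal(size=(C, R)) + 1j * rng.normal(size=(C, R))   # coil sensitivities
rho = np.array([1.0 + 0.2j, 0.4 - 0.1j])                     # true unaliased pixel values
a = S @ rho + 0.01 * (rng.normal(size=C) + 1j * rng.normal(size=C))  # aliased coil data

rho_hat, *_ = np.linalg.lstsq(S, a, rcond=None)   # classical SENSE point estimate
print(np.round(rho_hat, 3))                       # close to the true rho
```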

 

October 20 (Wednesday, 1:00pm) - Danny Smyl (University of Sheffield)
Some recent advances in inverse problems applied to NDE and SHM

The field of inverse problems, the mathematics of estimating and understanding causalities from effects (data), has taken massive strides in the past 20 years. Since the advent of high performance, probabilistic, and learned computation, inversion-based applications in nondestructive evaluation (NDE) and structural health monitoring (SHM) have become increasingly pervasive. In this seminar, we highlight some key contemporary advances in inverse problems applied to NDE and SHM. In this effort, we evidence recent developments in learned (direct) inversion, multi-state reconstruction, sensor optimization, highly dynamical spatial loading prediction, and finite element model error prediction/compensation.

 

October 29 - John Lipor (Portland State University)
Improving K-Subspaces via Coherence Pursuit

Subspace clustering is a powerful generalization of clustering for high-dimensional data analysis, where low-rank cluster structure is leveraged for accurate inference. K-Subspaces (KSS), an alternating algorithm that mirrors K-means, is a classical approach for clustering with this model. Like K-means, KSS is highly sensitive to initialization, yet KSS has two major handicaps beyond this issue. First, unlike K-means, the KSS objective is NP-hard to approximate within any finite factor for large enough subspace rank. Second, it is known that the subspace estimation step is faulty when an estimated cluster has points from multiple subspaces. In this paper we demonstrate both of these additional drawbacks, provide a proof for the former, and offer a solution to the latter through the use of a robust subspace recovery (RSR) method known as Coherence Pursuit (CoP). While many RSR methods have been developed in recent years, few can handle the case where the outliers are themselves low rank. We prove that CoP can handle low-rank outliers. This and its low computational complexity make it ideal to incorporate into the subspace estimation step of KSS. We demonstrate on synthetic data that CoP successfully rejects low-rank outliers and show that combining Coherence Pursuit with K-Subspaces yields state-of-the-art clustering performance on canonical benchmark datasets.
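To make the alternating structure of KSS concrete, the sketch below assigns each point to the subspace with the smallest projection residual and re-fits each subspace by a plain SVD of its assigned points. The talk's contribution is to replace that estimation step with Coherence Pursuit; this sketch uses ordinary PCA instead, and the toy data and parameters are assumptions.

```python
import numpy as np

def k_subspaces(X, n_clusters, dim, n_iter=30, seed=0):
    """X: (n_samples, n_features). Returns cluster labels."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # initialize with random orthonormal bases
    bases = [np.linalg.qr(rng.normal(size=(d, dim)))[0] for _ in range(n_clusters)]
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # assignment step: residual of projecting each point onto each subspace
        residuals = np.column_stack(
            [np.linalg.norm(X - (X @ U) @ U.T, axis=1) for U in bases])
        labels = residuals.argmin(axis=1)
        # estimation step: leading principal directions of each cluster
        for c in range(n_clusters):
            pts = X[labels == c]
            if len(pts) >= dim:
                _, _, Vt = np.linalg.svd(pts, full_matrices=False)
                bases[c] = Vt[:dim].T
    return labels

# two noisy 1-dimensional subspaces (lines) in R^3
rng = np.random.default_rng(4)
A = np.outer(rng.normal(size=60), [1.0, 0.0, 0.0])
B = np.outer(rng.normal(size=60), np.array([0.0, 1.0, 1.0]) / np.sqrt(2))
X = np.vstack([A, B]) + 0.02 * rng.normal(size=(120, 3))
print(k_subspaces(X, n_clusters=2, dim=1))   # points 0-59 vs 60-119 separate (up to label swap)
```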

 

November 12 - Jessi Cisewski-Kehe (UW Madison)
Topological Data Analysis That's Out of This World

Data exhibiting complicated spatial structures are common in many areas of science (e.g., cosmology, biology), but can be difficult to analyze. Persistent homology is an approach within the area of Topological Data Analysis (TDA) that offers a framework to represent, visualize, and interpret complex data by extracting topological features which may be used to infer properties of the underlying structures. For example, TDA is a beneficial technique for analyzing intricate and spatially complex web-like data such as fibrin or the large-scale structure (LSS) of the Universe. LSS is known as the Cosmic Web due to the spatial distribution of matter resembling a 3D web. The accepted cosmological model presumes cold dark matter, but discriminating LSS under varying cosmological assumptions is of interest. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each object in the 3D data set can represent structures such as galaxies, clusters of galaxies, or dark matter haloes, and topological summaries ("persistence diagrams") can be obtained for these simulated data that summarize the different ordered holes in the data (e.g., connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries provide a way to make more rigorous comparisons of LSS under different theoretical models. We present several possible test statistics for two-sample hypothesis tests using the topological summaries and carry out a simulation study to investigate their performance, using cosmological simulation data to distinguish LSS under a cold dark matter model from a different cosmological model that assumes warm dark matter.
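For readers who want to try persistent homology on a small example, the sketch below uses the ripser package (one common TDA tool, assumed here; the talk's cosmological analyses are far larger in scale) to compute persistence diagrams for points sampled near a circle, which should yield one prominent 1-dimensional feature (a loop).

```python
import numpy as np
from ripser import ripser   # pip install ripser (assumed available)

rng = np.random.default_rng(6)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.05 * rng.normal(size=(200, 2))

dgms = ripser(X, maxdim=1)["dgms"]          # dgms[0]: components, dgms[1]: loops
lifetimes = dgms[1][:, 1] - dgms[1][:, 0]   # persistence of each loop feature
print("most persistent loop lifetime:", lifetimes.max().round(3))
```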

 

November 19 - Robert Krafty (Emory University)
Interpretable PCA for Multilevel Multivariate Functional Data

Many studies collect functional data from multiple subjects that have both multilevel and multivariate structures. An example of such data comes from popular neuroscience experiments where participants' brain activity is recorded using modalities such as EEG  and summarized as power within multiple time-varying frequency bands within multiple electrodes, or brain regions. Summarizing the joint variation across multiple frequency bands for both whole-brain variability between subjects, as well as location-variation within subjects, can help to explain neural reactions to stimuli. This article introduces a novel approach to conducting interpretable principal components analysis on  multilevel multivariate functional data that decomposes total variation into  subject-level and replicate-within-subject-level (i.e. electrode-level) variation, and provides interpretable components that can be both sparse among variates (e.g. frequency bands) and have localized support over time within each frequency band. Smoothness is achieved through a roughness penalty, while sparsity and localization of components are achieved by solving an innovative rank-one based convex optimization problem with block Frobenius and matrix L1-norm based penalties. The method is used to analyze data from a study to better understand reactions to emotional information in individuals with histories of trauma and the symptom of dissociation, revealing new neurophysiological insights into how subject- and electrode-level brain activity are associated with these phenomena.

 

December 3 - Luke McGuire (University of Arizona)
Post-wildfire debris flow hazards

Fire temporarily alters soil and vegetation properties, promoting increases in runoff and erosion that can dramatically increase the likelihood of destructive flash floods and debris flows. Debris flows, or fast-moving landslides that consist of a mixture of water, mud, and rock, initiate after fires when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, soil erosion, and post-fire debris-flow initiation, the study of post-fire debris-flow hazards necessitates an approach that couples these processes within a common modeling framework. Models used to simulate these processes, however, often contain a number of poorly constrained parameters, particularly in post-fire settings where there is limited time to collect data and where parameters related to soil and vegetation properties will change over time as the landscape recovers. Here, we describe physics-based models designed to simulate runoff, erosion, and debris flow processes in burned areas as well as how these models can inform reduced-complexity models used to facilitate rapid hazard assessments. We highlight existing gaps in our ability to assess post-fire debris-flow hazards and motivate the need to expand our ability to use numerical modeling to support post-fire hazard assessment and mitigation efforts.

December 8 (Wednesday, 2:15pm) - Md. Fitrat Hossain (ɫƵ)
Personalized mHealth Monitoring System for Veterans

The word "crisis" refers to an event or events that may lead to a dangerous and unstable situation, adversely affecting personal, social, or community life. It is used to represent negative changes in political, social, economic, or environmental affairs. A mental health crisis is an important social issue in which a person's behavior, feelings, and actions can be harmful to themselves and the people around them. Because of the wars of the last decade, mental health crises have severely affected United States veterans. To take the steps necessary to avoid the greater damage of a crisis, it is important to predict its different levels. However, because of the nature of crises, it is difficult to understand and quantify these levels. This research focuses on defining and predicting different levels of mental health crisis for Milwaukee-based veterans suffering from PTSD in an mHealth setting. As part of this, long-term crisis (a severe stage) is defined and validated from the PCL-5 score using decision trees and statistical tests. To create a mobile-based alert system, acute crisis (an intermediately severe situation) has been defined using cognitive walkthroughs and decision trees based on ecological momentary assessment data.

 

December 10 -

Adikorley (ɫƵ)
Multivariate Functional Time Series Forecasting: Multivariate Functional Singular Spectrum Analysis approaches applied to "images and curves / remote sensing"


Sunil Mathew (ɫƵ)
Model interpretability in terms of dropout in Neural Networks using Bayesian learning

Neural networks tend to have many layers, which enables them to learn complicated patterns via weights that describe the connections between the nodes of each layer. A large number of nodes often causes overfitting and poor generalization. A probabilistic framework enables explainability and better insight into what a node in a model learns. A Bayesian approach can be used to determine connection dropout in each layer, which enables better model interpretability and combats overfitting.
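As a related, simpler illustration of the dropout-as-Bayesian-approximation idea, the sketch below trains a tiny PyTorch regressor and then keeps dropout active at prediction time (Monte Carlo dropout), so that averaging many stochastic forward passes gives an approximate predictive mean and uncertainty. This is a generic technique shown for context, not the per-layer dropout learning scheme described in the talk; the architecture and data are assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2),
                      nn.Linear(64, 1))

# toy regression data: y = sin(x) + noise
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

model.train()                     # keep dropout stochastic at prediction time
x_new = torch.tensor([[0.5]])
with torch.no_grad():
    samples = torch.stack([model(x_new) for _ in range(100)])
print("mean:", samples.mean().item(), "std:", samples.std().item())
```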

Spring 2021

Colloquium talks for Spring 2021 will be held virtually via Microsoft Teams. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu

May 7th (12pm CT) - Prof. Kaitlyn Muller, Department of Mathematics and Statistics, Villanova University

"Clutter Mitigation Techniques in Synthetic-Aperture Radar Imaging"

Abstract: In this talk we will discuss two different methods of addressing the problem of clutter in synthetic-aperture radar (SAR) imaging. SAR images are often used for the purpose of target detection and classification. Therefore it is necessary to produce images that display the target/object of interest clearly and in such a way that they are distinguishable from other objects present in the scene.  These other objects are referred to as clutter and they often scatter just as strongly as targets and can obscure the presence of targets in images. We will begin with the basics of radar/SAR imaging and discuss a common model for volume scattering clutter (i.e. foliage). We will then present two techniques to mitigate the presence of clutter in the images. First we will discuss the more common filtering techniques which require knowledge of the clutter statistics. Second we will discuss correlation imaging, a second order imaging technique, that in certain cases can provide clutter mitigation without a priori knowledge.

April 30th (3:30pm CT) - Prof. Kimia Ghobadi, Department of Civil and Systems Engineering, Johns Hopkins University

"Hospital Resource Optimization for COVID-19"

Abstract: The COVID-19 pandemic has placed significant strain on healthcare systems since its start. As hospitals cope with unknown demand and surges in cases, critical resources like ICU beds have become scarce. Additional beds and field hospitals are considered to meet the increased demand, but simply expanding capacity is not viable for all hospitals. Better utilization of the currently available capacity can improve access to resources, lower the burden on hospitals and staff, and lead to better patient care. To this end, we developed mathematical models that match demand with available resources in a regional system of hospitals. Our robust mixed-integer linear models minimize resource shortages while considering operational constraints and desirable allocation properties such as transfer sparsity, consistency, and locality. Our models can consider primary resources (e.g., beds) in addition to complementary resources (e.g., nurses). We have tested and validated our models on the first wave of the COVID-19 pandemic and the subsequent surges, and they are currently in use at the Johns Hopkins Health System hospitals. We expanded our models to all hospitals in the US and developed an interactive public website () to help decision-makers on various levels to plan and use their bed resources.
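To show the flavor of such allocation models at toy scale, the sketch below solves a small linear program that routes patient demand among three hospitals to minimize unmet demand (with a small penalty on transfers) subject to bed capacities. The talk's models are robust mixed-integer programs with far richer constraints; all numbers and the simplified formulation here are invented.

```python
import numpy as np
from scipy.optimize import linprog

demand = np.array([120, 40, 30])      # patients arising at each hospital (assumed)
capacity = np.array([80, 70, 60])     # beds available at each hospital (assumed)
H = len(demand)

# variables: x[i, j] = patients from hospital i treated at hospital j (flattened),
# followed by s[i] = unmet demand at hospital i
n_x = H * H
cost = np.concatenate([np.where(np.eye(H).ravel() == 1, 0.0, 0.01), np.ones(H)])

A_eq = np.zeros((H, n_x + H))         # hospital i's demand is either treated or unmet
for i in range(H):
    A_eq[i, i * H:(i + 1) * H] = 1.0
    A_eq[i, n_x + i] = 1.0
A_ub = np.zeros((H, n_x + H))         # patients treated at j cannot exceed its beds
for j in range(H):
    A_ub[j, j:n_x:H] = 1.0

res = linprog(cost, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand)
print("unmet demand:", res.x[n_x:].round(1))
print("transfers:\n", res.x[:n_x].reshape(H, H).round(1))
```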

April 9th (1pm CT) - Prof. Peter Hinow, Department of Mathematical Sciences, University of Wisconsin - Milwaukee.

"Automated Feature Extraction from Large Cardiac Electrophysiological Data Sets"

Abstract: A multi-electrode array-based application for the long-term recording of action potentials from electrogenic cells makes possible exciting cardiac electrophysiology studies in health and disease. With hundreds of simultaneous electrode recordings being acquired over a period of days, the main challenge becomes achieving reliable signal identification and quantification. We set out to develop an algorithm capable of automatically extracting regions of high-quality action potentials from terabyte size experimental results and to map the trains of action potentials into a low-dimensional feature space for analysis. Our automatic segmentation algorithm finds regions of acceptable action potentials in large data sets of electrophysiological readings. We use spectral methods and support vector machines to classify our readings and to extract relevant features. We show that action potentials from the same cell site can be recorded over days without detrimental effects to the cell membrane. The variability between measurements 24 h apart is comparable to the natural variability of the features at a single time point. Our work contributes towards a non-invasive approach for cardiomyocyte functional maturation, as well as developmental, pathological, and pharmacological studies.
This is joint work with Viviana Zlochiver, Stacie Kroboth (Advocate Aurora Research Institute), and John Jurkiewicz (graduate student at UWM).

March 26th (1pm CT) - Dr. Ben Freedman, Department of Mathematical and Statistical Sciences, ɫƵ.

"On Weakly Nonlinear Boundary Value Problems on Infinite Intervals."

Abstract: In this talk, we will analyze boundary value problems on infinite intervals subject to weakly nonlinear boundary conditions. For such problems, we provide criteria for the existence of solutions as well as a qualitative description of the behavior of solutions depending on a parameter. We investigate the relationship between solutions to these weakly nonlinear problems and the solutions to a set of corresponding linear problems.

March 19th (1pm CT) - Prof. Alex Konomi, Department of Mathematical Sciences, University of Cincinnati.

"Computer model emulation with high-dimensional functional output in large-scale observing system uncertainty experiments: An application to NASA's Orbiting Carbon Observatory-2 (OCO-2) mission"

Abstract: Observing system uncertainty experiments (OSUEs) have been recently proposed as a cost-effective way to perform probabilistic assessment of retrievals for NASA's Orbiting Carbon Observatory-2 (OCO-2) mission. One important component in the OCO-2 retrieval algorithm is a full-physics forward model that describes the mathematical relationship between atmospheric variables such as carbon dioxide and radiances measured by the remote sensing instrument. This complex forward model is computationally expensive, but large-scale OSUEs require evaluation of this model numerous times, which makes it infeasible for comprehensive experiments. To tackle this issue, we develop a statistical emulator to facilitate large-scale OSUEs in the OCO-2 mission with independent emulation. Within each distinct spectral band, the emulator represents radiances output at irregular wavelengths via a linear combination of basis functions and random coefficients. These random coefficients are then modeled with nearest-neighbor Gaussian processes with built-in input dimension reduction via active subspace and gradient-based kernel dimension reduction. The proposed emulator reduces dimensionality in both input space and output space, so that fast computation is achieved within a fully Bayesian inference framework.

March 12th (1pm CT) - Drs. Sarah Hamilton, Elaine Spiller, Mehdi Maadooliat, Jay Pantone, and Greg Ongie, Department of Mathematical and Statistical Sciences, ɫƵ.

A number of faculty members will be giving short (5-10 min) intro/summaries of their research areas. This is a great opportunity for students at all levels, as well as faculty, to learn about current MSSC research and to start new collaborations.

 

Spring 2020

Colloquium talks will be held in Room 401 of the Katharine Reed Cudahy Building (Cudahy Hall) on the ɫƵ campus. Please address inquiries/suggestions to Dr. Hamilton at sarah.hamilton@marquette.edu

March 6th (2 pm CT) - Dr. Peter Muller, Department of Mathematics and Statistics, Villanova University.

Abstract. Electrical impedance tomography (EIT) is an imaging modality that measures currents and voltages on the surface of a body to image the electrical conductivity within the body. Image reconstruction in EIT is a severely ill-posed, nonlinear inverse problem. In this talk, I will present two direct reconstruction methods based on complex geometrical optics solutions: Calderón's method and Nachman's D-bar method. Both methods provide a point-wise reconstruction of the image. Calderón's method is a linearized approach while the D-bar method solves the fully non-linear inverse problem. I will present both methods and their ability to address clinical application concerns.

February 28th (2 pm CT) - Dr. Elaine Spiller, Department of Mathematical and Statistical Sciences, ɫƵ.

Abstract. Geophysical natural hazards (storm surge, post-fire debris flows, volcanic flows and ash fall, etc.) impact thousands to millions of people annually. Yet the most devastating hazards, those resulting in loss of life and property, are often both geographically and temporally localized. Thus they are effectively rare events to those impacted. We will present methodology to produce probabilistic hazard maps that can rapidly be updated to account for various aleatoric scenarios and epistemic uncertainties. This hazard analysis utilizes statistical emulators to combine computationally expensive simulations of the underlying geophysical processes with probabilistic descriptions of uncertain scenarios and model parameters. The end goal is not a map, but a family of maps that represent how a hazard threat evolves under different assumptions or different potential future scenarios. Further, this approach allows us to rapidly update hazard maps as new data or precursor information arrives.

February 14th (2 pm CT) - Drs. Daniel Rowe, Anne Clough, Sarah Hamilton, Naveen Bansal, Wenhui Sheng, Elaine Spiller, and Mehdi Maadooliat, Department of Mathematical and Statistical Sciences, ɫƵ.

A number of faculty members will be giving short (5 min) intro/summaries of their research areas. This is a great opportunity for students at all levels, as well as faculty, to learn about current MSSC research and to start new collaborations.

February 3rd (1 pm CT) - Swati Patel, Department of Mathematics, Tulane University, On Dynamics for Maintaining Biological Diversity at Various Scales.

Abstract. One of the fundamental questions in ecology and evolutionary genetics is how biological diversity is maintained within and amongst populations. Classical nonlinear differential equations that capture population or genetic interactions have played an important role in developing biological theories on how diversity is maintained. As ongoing empirical investigations uncover the nuances of these interactions, they open the way for more sophisticated models and the need for expanding mathematical methods to analyze them. In this talk, I will develop two sets of multi-scale models, motivated by recent empirical evidence. The first couples differential equations that capture interactions amongst populations with variation within the population. At a finer scale, the second models specific protein-gene interactions that influence population-level traits. For both models, I will discuss new mathematical questions and analysis that provides insight into mechanisms that enable diversity at these various scales.

January 31st (1 pm CT) - Owen Lewis, Department of Mathematics, Florida State University, Electrodiffusion Mediated Maintenance of the Gastric Mucus Layer.

Abstract. Diffusion of charged particles, or electrodiffusion, plays an important role in many physiological systems including the human stomach. The gastric mucus layer is widely recognized to serve a protective function, shielding your stomach wall from the extreme acidity and digestive enzymes present in the stomach. However, there is still much debate regarding the control of electrodiffusive transport through the mucus layer. In this talk, I will discuss a mathematical description of electrodiffusion within a two-phase gel model of gastric mucus and the challenges associated with its analysis and numerical simulation. This model is used to investigate physiological hypotheses regarding gastric layer maintenance that are beyond current experimental techniques.

January 22nd (1 pm CT) Greg Ongie, Department of Statistics, University of Chicago, Rethinking regularization in modern machine learning and computational imaging.

Abstract. Optimization is central to both supervised machine learning and inverse problems in computational imaging. These problems are often ill-posed and some form of regularization is necessary to obtain a useful solution. However, new paradigms in machine learning and computational imaging necessitate rethinking the role of regularization, as I will illustrate with two examples. First, in the context of supervised learning with shallow neural networks, I will show how a commonly used form of regularization has a surprising reinterpretation as a convex regularizer in function space. This yields novel insights into the role of overparameterization and depth in learning with neural networks having ReLU activations. Second, I will discuss a novel network architecture for solving linear inverse problems in computational imaging called a Neumann network. Rather than using a pre-specified regularizer, Neumann networks effectively learn a regularizer from training data, outperforming classical techniques. Beyond these two examples, I will show how many open problems in the mathematical foundations of deep learning and computational imaging relate to understanding regularization in its many forms.

January 21st (1 pm CT) - Scott Hottovy, Department of Mathematics, United States Naval Academy, A simple stochastic model of tropical atmospheric waves.

As tropical storms go, you have probably heard of Hurricanes, Tropical Cyclones, El Niño, and La Niña. But you probably haven't heard of the Madden-Julian Oscillation (MJO). It is the major contributor to rainfall in tropical regions and influences the climate in Wisconsin regularly. Unlike Hurricanes and El Niño, the MJO is still not well understood. In an effort to understand the mechanisms of the MJO, I will describe a model building from a dynamically stationary "background" tropical rainfall model and coupling that to a tropical wave model. These models use Stochastic Differential Equations (SDE) and Stochastic Partial Differential Equations (SPDE) as the building blocks. In the "background" model, an SDE model is used which leads to characteristics of criticality and phase transitions. For the full model with waves, we use a continuous one-dimensional SPDE. Because of the simplicity of the models, we are able to solve many statistics exactly, or run fast numerical experiments.
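For readers new to SDEs, the sketch below runs a basic Euler-Maruyama simulation of a one-dimensional Ornstein-Uhlenbeck process, the simplest numerical building block for models of this kind. It is purely illustrative (not the MJO model from the talk), and the drift, noise level, and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n_steps = 0.01, 5000
theta, mu, sigma = 1.0, 0.0, 0.5      # Ornstein-Uhlenbeck parameters (assumed)

q = np.empty(n_steps); q[0] = 2.0
for k in range(n_steps - 1):
    drift = theta * (mu - q[k])                       # relaxation toward mu
    q[k + 1] = q[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

print(q[-1000:].mean().round(3), q[-1000:].std().round(3))
# long-run standard deviation should be near sigma / sqrt(2*theta) = 0.354...
```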

 

Fall 2019

Colloquium dates and speakers for Fall 2019 - Unless specified, the talks will begin at 2:00pm CT in Room 401 at Cudahy Hall.

  • September 6th - Michael Albert, Department of Computer Science, University of Otago, New Zealand, Wilf-equivalence and Wilf-collapse.
  • September 27th - Guannan Wang, Department of Mathematics, College of William and Mary, Williamsburg, Simultaneous confidence corridors for mean functions in functional data analysis of imaging data
  • October 4th - Billy Herzberg, Department of MSSC, ɫƵ, Improving EIT (Electrical Impedance Tomography) images using deep learning
  • October 11th - ɫƵ Computational Sciences Summer Research Fellowship Talks:
    • Nazmus Sakib, Understanding confounding medical interventions in Sepsis treatment: A step towards multi-parameter intelligent sepsis prediction in ICU.
    • Ziynet Nesibe Kesimoglu, Inferring competing endogenous RNA (ceRNA) interactions in cancer
  • October 25th - Jordan Trinka, Department of MSSC, ɫƵ, Milwaukee, Functional Singular Spectrum Analysis.
  • November 1st - Jacob R. Pichelmeyer, Mathematics Department, Kansas State University, Manhattan, KS.
  • November 8th - Andreas Hauptmann, Department of Mathematical Sciences, University of Oulu, Finland.
  • November 15th - Sunil Mathew, Joseph Coelho, Department of MSSC, ɫƵ, Milwaukee, Computational Sciences Student Research Presentations.
  • November 22nd - Md Manzur Rahman, Paromita Nitu, Department of MSSC, ɫƵ, Milwaukee, Computational Sciences Student Research Presentations.
  • December 6th - Rasha Atshan, Andrew Werra, Youming Wang, Wei Xu, Department of MSSC, ɫƵ, Milwaukee, Applied Statistics Practica Summer Presentations.

Spring 2019

  • January 16th at 1:00 - Xanda Schofield, Department of Computer Science, Cornell University, Preserving Privacy in Statistical Models of Text.
  • January 22nd at 1:00 - Natalia Khuri, Department of Bioengineering, Stanford University, Translational Bioinformatics: How data science and machine learning are linking the molecular world to the clinical world.
  • January 25th at 1:00 - Mark Albert, Department of Computer Science, Loyola University Chicago, Healthcare wearables and computational neuroscience: applications of machine learning.
  • January 31st at 1:00 - Xuetao Wei, School of Information Technology, University of Cincinnati, Follow the Information: Illuminating Emerging Security Attacks and Applications.
  • February 5th at 1:00 - David Koop, Computer and Information Science Department, University of Massachusetts - Dartmouth, Supporting Reproducible Exploratory Data Analysis.
  • February 28th at 2:00 - Mahmoud Zarepour, Mathematical and Statistics Department, University of Ottawa, New Development in Nonparametric Bayesian Inference.
  • March 21st at 2:00 - Benjamin Risk, Rollins School of Public Health, Emory University
  • April 25th at 2:00 - Sedigheh Mirzaei Salehabadi, Biostatistics Department, St Jude Children's Research Hospital, Memphis
  • May 2nd at 2:00 - Han Liu, Department of Electrical Engineering and Computer Science, Northwestern University

Fall 2018

  • September 21st at 2:00 - Michael Zimmer, School of Information Studies, University of Wisconsin - Milwaukee, Pervasive Data Ethics for Computational Research
  • October 5th at 2:00 - Jenna Tague, Department of Mathematics, Fresno State University, Where Do Rate of Change Understandings Begin?
  • October 11th at 2:00 - Daniel Gervini, Department of Mathematical Sciences, University of Wisconsin - Milwaukee, Functional Data Methods for Replicated Point Processes
  • October 26th at 2:00 - Debaleena Chattopadhyay, Department of Computer Science, University of Illinois - Chicago, Designing Beyond the Desktop Technologies for Older Adults
  • November 9th at 2:00 - Noah Schweber, Department of Mathematics, University of Wisconsin - Madison, Cardinal Characteristics and Computability
  • November 16th at 2:00 - Banabithi Bose, Md Kamrul Hasan and Xuyong Yu, Department of Mathematics, Statistics, and Computer Science, ɫƵ, miRNAdriver: Copy Number Derived microRNA-Gene Interactions in Cancer, Develop Novel Feature Extraction Techniques for the Fingertip Videos and Apply Artificial Neural Network to Build Hemoglobin Prediction Model and A Kolmogorov-Smirnov Based Test using FFT to Evaluate Electrode-tissue Contact Force.
  • November 30th at 2:00 - Yichao Wu, Department of Mathematics, Statistics, and Computer Science, University of Illinois - Chicago, Nonparametric estimation of multivariate mixtures
  • December 7th at 2:00 - Andrew Brown, Department of Mathematical Sciences, Clemson University, Bayesian Spatial Binary Regression for Label Fusion in Structural Neuroimaging
  • December 12th at 12:00 - Ziynet Kesimoglu and Md Manzur R. Farazi, Department of Mathematics, Statistics, and Computer Science, ɫƵ, Cancer Subtype Prediction Methodology and Machine Learning Techniques of Analyzing Spectroscopy Data for Age Estimation of Anopheles Arabiensis Mosquito

Spring 2018

  • January 19th at 1:00 - Ahmad Kawam (Special Colloquium), Department of Electrical & Computer Engineering, Texas A&M University, Moving Towards the Next Generation of Healthcare Applications
  • January 22nd at 1:00 - Pamela Harris (Special Colloquium), Department of Mathematics and Statistics, Williams College, Invisible Lattice Points
  • January 26th at 1:00 - Philippe Giabbanelli (Special Colloquium), Department of Computer Science, Northern Illinois University, Mining and predicting complex human systems at scale
  • January 29th at 1:00 - Giovanni Ciampaglia (Special Colloquium) School of Informatics and Computing, Indiana University, Threats to the information ecosystem
  • February 2 at 1:00 - Ping Zhang (Special Colloquium), Center for Computational Health, IBM T.J. Watson Research Center, Predictive Modeling of Drug Effects: Learning from Biomedical Knowledge and Clinical Records
  • February 5 at 1:00 - Michael DiPasquale (Special Colloquium), Department of Mathematics, Oklahoma State University, Piecewise Polynomials and Commutative Algebra
  • February 12 at 1:00 - Jay Pantone (Special Colloquium), Department of Mathematics, Dartmouth College, Sorting Permutations with C-Machines
  • February 15 at 2:00 - Daniel Rowe, Department of Mathematics, Statistics, and Computer Science, ɫƵ, BiLinear, Bicubic and In Between Spline Interpolation
  • March 8 at 2:00 - Yunied Puig de Dios, Department of Mathematics, University of California - Riverside, A mixing operator T for which (T, T^2) is not disjoint transitive
  • March 22 at 2:00 - Stefano Lonardi, Department of Computer Science & Engineering, University of California - Riverside, Improving contiguity and correctness of de novo genome assembly via optical map
  • April 5 at 2:00 - Amit Apte, International Centre for Theoretical Sciences, Tata Institute of Fundamental Research, Data assimilation for high dimensional systems: role of unstable subspace
  • April 12 at 2:00 - Wim Ruitenburg, Department of Mathematics, Statistics, and Computer Science, ɫƵ, The 2017 Putnam Mathematical Competition
  • April 27 at 2:00 - Jody R. Westby, School of Computer Science, Georgia Tech and CEO of Global Cyber Risk LLC, Ethics of Big Data

Fall 2017

  • October 5th at 3:30 - Md. Kamrul Hasan & Md Fitrat Hossain, MSSC Department, ɫƵ, Smartphone-based Hemoglobin Level Measurement Using Chromatic Analysis of Fingertip Videos on Different Color Spaces and Exploration of a Data Science Methodology to Predict High Risk Behavior for Veterans
  • October 12th at 3:30 - Dr. Robert Barry, Martinos Center for Biomedical Imaging, Harvard University, Neuroimaging of Brain and Spinal Cord at Ultra-High Magnetic Fields
  • October 20th at 1:00 - Dr. Hiroyuki Sato (Special Colloquium), Information Technology Center, University of Tokyo, Higher Dimension Topological Data Analysis: Exploration into Flexible Applications to Real Scenarios
  • October 26th at 3:30 - Md. Manzur Farazi & Jiblal Upadhya, MSSC Department, ɫƵ, The False Discovery and False Non-Discovery Rate in Correlated Tests and Spatio-temporal modeling of small patches/regions of the Greenland Ice-sheet (GrIS)
  • November 9th at 3:30 - Dr. Brooke Magnus, Psychology Department, ɫƵ, Modeling Questionnaire Data when the Responses are Counts
  • November 16th at 3:30 - Dr. Andrew D. Hahn, Department of Medical Physics, University of Wisconsin - Madison, Applications in MRI of Pulmonary Structure and Function
  • November 30th at 3:30 - Dr. Elaine Spiller, MSSC Department, ɫƵ, Short-term probabilistic hazard mapping - forecasting catastrophe without stationary assumptions
  • December 7th at 3:30 - Duc Do & Emily White, MSSC Department, ɫƵ, Identification of functional MiRNA-Transcription factor-Target gene modules in cancer and Modeling Protein Aggregation Concentration in Saccharomyces cerevisiae

Spring 2017

  • January 26th at 3:30 - Dr. Ryan Croke, ChefSteps, Mathematics: It's What's for Dinner
  • February 2nd at 3:30 - Dr. Wim Ruitenburg, MSSC, ɫƵ, We and the 2016 Putnam Math Competition
  • February 9th at 3:30 - Pete Sparks, MSSC, ɫƵ, The Double n-Space Property for Contractible n-Manifolds
  • February 23rd at 3:30 - Dr. Dennis Brylow, MSSC, ɫƵ, Priming the CS Pump: Structure of a State-Wide Push for K-12 Computer Science Education
  • March 3rd at 1:00 - Dr. Lori Ziegelmeier, MSSC, Macalester College, Measuring the Shape of Data with Topology
  • March 9th at 3:30 - Dr. Sarah Hamilton, MSSC, ɫƵ, Improving Image Quality for Practical Electrical Impedance Tomography Imaging with Direct D-bar Methods
  • March 20th at 1:00 - Dr. Paul Bankston, Emeritus MSSC, ɫƵ, The hunting of the pseudo-arc
  • March 23rd at 3:30 - Dr. John Kornak, Department of Epidemiology & Biostatistics, University of California - San Francisco, Bayesian image analysis in Fourier space
  • March 30th at 3:30 - Dr. Moo K. Chung, Department of Biostatistics and Medical Informatics, UW-Madison, From Computational Neuroanatomy to Computational Topology
  • April 6th at 3:30 - Dr. Mariya Soskova, Department of Mathematical Logic and Applications, UW-Madison, Logic and Degrees
  • April 20th at 3:30 - Dr. Benjamin Black, Department of Earth & Atmospheric Science, City University of New York, To see a world in a grain of ash: bridging >10 orders of magnitude in space and time to understand the consequences of volcanism
  • April 28th at 1:30 - Dr. Erkki Somersalo, Department of Mathematics, Applied Mathematics and Statistics, Case Western Reserve University